The Data Migration tool is an open-source solution that imports data to Azure Cosmos DB from a variety of sources, including Azure Cosmos DB collections and Azure Table storage.
While the import tool includes a graphical user interface (dtui.exe), it can also be driven from the command-line (dt.exe). In fact, there's an option to output the associated command after setting up an import through the UI. You can transform tabular source data, such as SQL Server or CSV files, to create hierarchical relationships (subdocuments) during import. Keep reading to learn more about source options, sample commands to import from each source, target options, and viewing import results.
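To illustrate the tabular-to-hierarchical transformation mentioned above, here is a sketch of what a dt.exe invocation for a CSV import might look like. The file path, collection name, and placeholder connection string are hypothetical, and the flags shown (/s:CsvFile, /s.NestingSeparator, /t:DocumentDB) are based on the tool's documented options; the command is built as a string here so its shape is easy to see, since dt.exe itself only runs on Windows.

```shell
# Hypothetical dt.exe command: import a CSV and turn dotted column names
# (e.g. "Address.City") into subdocuments via the NestingSeparator option.
DT_CMD='dt.exe /s:CsvFile /s.Files:C:\data\people.csv /s.NestingSeparator:.'
DT_CMD="$DT_CMD /t:DocumentDB /t.ConnectionString:\"<connection-string>\" /t.Collection:people"
echo "$DT_CMD"
```

On Windows you would run the resulting command directly instead of echoing it.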
One task that seems to come up over and over is migrating data from one database or format into another. I recently used Cosmos DB as my database to store every tweet that came out of Ignite. Once I had the data and wouldn’t be using Cosmos DB any more for that exercise, I needed to dump the data out to a local file to preserve it and save money. Here is how I did it.
Download and install the Azure DocumentDB Data Migration Tool from this link: https://github.com/azure/azure-documentdb-datamigrationtool
First and foremost, make sure you have the Cosmos DB database and collection created that you wish to migrate out of.
Go to Keys (inside your Cosmos DB blade in the Azure portal) and copy the Primary Connection String.
You’ll need to append the database name to the end of the string. For example, Database=cosmosdb-ignite is appended to the key copied earlier, giving AccountEndpoint=https://mbcrump.documents.azure.com:443/;AccountKey=BxATcJnkh==;Database=cosmosdb-ignite. Save this for later.
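The string concatenation above can be sketched as follows, using the placeholder endpoint and key from this post:

```shell
# Append the database name to the Primary Connection String copied from
# the portal (the endpoint and key are the example values from this post).
PRIMARY_CONN="AccountEndpoint=https://mbcrump.documents.azure.com:443/;AccountKey=BxATcJnkh==;"
DB_NAME="cosmosdb-ignite"
FULL_CONN="${PRIMARY_CONN}Database=${DB_NAME}"
echo "$FULL_CONN"
```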
Open the Data Migration Tool and, under Source Information, select DocumentDB as shown below.
You’ll need to add the ConnectionString (the one we just created) along with the Collection, which in my case is items. We’ll take the defaults for the rest; press Verify and, if successful, press Next.
In this case, I’ll export to a local JSON file, select Prettify JSON, and press Next.
Then you’ll see View Command, which shows the command that will be used to migrate your data. This is a helpful way to learn the syntax.
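A command roughly like the following is what View Command displays for this export; the connection string and collection are the examples used above, while the output path and the exact dt.exe flags (/s:DocumentDB, /t:JsonFile, /t.Prettify) are my assumptions based on the tool's documented options. It is built as a string here so the shape is visible on any platform.

```shell
# Rough sketch of the generated dt.exe export command: source is the
# DocumentDB/Cosmos DB collection, target is a prettified local JSON file.
CONN="AccountEndpoint=https://mbcrump.documents.azure.com:443/;AccountKey=BxATcJnkh==;Database=cosmosdb-ignite"
EXPORT_CMD="dt.exe /s:DocumentDB /s.ConnectionString:\"$CONN\" /s.Collection:items"
EXPORT_CMD="$EXPORT_CMD /t:JsonFile /t.File:C:\\tweets\\ignite.json /t.Prettify"
echo "$EXPORT_CMD"
```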
Finally, you can see the import has completed! Now we have our local JSON file and can use it however we want! Super duper! 😎