Sometimes we need to move data stored in Dataverse tables from one environment to another. This could be reference data that’s needed to support our solutions, but is static in nature. Perhaps a list of countries or postcodes – data that’s never going to change but always needed.
There are many ways to move our data; in this article I'll highlight a couple of easy ones that should be accessible to most people.
Intro
Going back 10 years or so, I did a lot of work in Microsoft Access. It was very easy to make a copy of a table, where you could choose whether you just copied the schema or the data as well.
We don’t get the same ‘luxuries’ with Dataverse. When we move a solution from one environment to another, it will only copy the schema; the ‘structure only’ option. This makes total sense, as we wouldn’t want to contaminate our production solution with dev or test data.
Personally, I'd love to see something like a table 'type' in Dataverse in future. If it's a 'reference data' table, a solution export/import would take the structure and the data; if it's a 'live data' table, it would only take the structure. Until that magical day (which will probably never happen), we need to work around this and use other methods to take reference data across environments.
Setup
To support this article, I've created a small solution with two Dataverse tables and two Option Sets.
I’ll be capturing some information about Star Wars characters, so first I need some categories.
Let's have some weapons too:
Both of these option sets are consumed in a table; Category will be a SINGLE choice column, while Weapons will be a MULTIPLE choice column as it has this checked:
There's also a reference table for Factions; I'll use this table as the source of a Lookup column:
Wrap all that up in a Characters table with some dummy data. As well as the choice & lookup columns, there’s also an image column:
Migrate solution
Before we can migrate Dataverse data, first we need to migrate the solution. We need the structure of the tables in the destination environment, so we can then add some data to them.
If you’re unfamiliar with exporting & importing solutions, you can do this manually by first exporting the solution from your development environment. Once you’ve downloaded the .zip file, you can then import the solution into your chosen test/production/other downstream environment.
For the purposes of this article, I’ve already imported the solution into my destination environment.
Manual export/import
I’ll cover this option first; it’s a great entry-level point but does have its limitations.
Thankfully, Microsoft have the limitations well documented here; most column types are supported, but timezone, multi-select choice, image & file columns aren't for this method. Our example has some of these, so let's see the impact.
Running order
In my example I have two tables. One is for our main data collection (Characters), the other a reference table that’s the source of a lookup column (Faction). As such, the Characters table is dependent on information from the Faction table. If using the manual export/import method, I’ll migrate data for lookup-supporting tables first, in isolation. In this example, I would export/import Faction data first, then Characters.
If you try to move the main data first, you’ll receive errors on import. It will try and fail to resolve information for lookup columns as the data isn’t there. If you have several tables that support lookup columns, make sure the data for all of them has been moved into your destination tables first.
Exporting
When selecting a Dataverse table, we have the Export option. Clicking this will show an additional option for Export data:
When clicked, a side panel will show progress before presenting the following message. Make sure to click Download exported data to save the .zip file to your usual downloads location:
You can then extract the contents of the downloaded .zip file to find the table data in csv format:
We can take a closer look at the extracted file and see some patterns and what’s missing. I’ve removed some of the system columns so I’m just left with the columns I want:
Firstly, we can spot that our image & multi-select columns aren’t present. This is consistent with the limitations Microsoft have documented.
Secondly, we can see our single choice field doesn't have the text we've chosen. It's got a bunch of numbers instead; what is this skullduggery we see before us?
This is an easy one to explain. When setting up the Option Set originally, we see that each value we enter is accompanied by a unique integer reference. In a way, an Option Set is an object in its own right, so it makes sense that it's structured like this. When we import this csv file into our destination table, it will use these unique IDs to resolve the values for us.
The faction lookup column works in a similar way: each entry maps to the unique identifier (GUID) of the corresponding row in the table being used for the lookup.
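To make that concrete, here's roughly what a row of the exported csv might look like (the column names, option set value and GUID below are made up for illustration):

Name,Category,Faction
Darth Vader,330650001,f9168c5e-ceb2-4faa-b6bf-329bf39fa1e4

The 330650001 is the integer behind the chosen Category value, and the Faction column holds the GUID of the matching row in the Factions table.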
These things might appear obvious to some, but they weren’t to me a few years ago when I first exported table data to csv. Hopefully those explanations will help others not be as confused as I was!
Importing
Navigate to your destination environment. From the solution that you’ve imported, select the relevant table. In the top menu, select Import, then Import data from Excel:
From the resulting side pane, click on Upload then navigate to the csv file you previously exported & extracted:
The schema should match exactly, so the columns from the csv should automatically map to the columns in the destination table. However if that doesn’t happen, you can click on Map columns to do the mappings manually.
When ready, click on Import to begin the import. The import button is tucked away in the top right hand corner, easy to miss (at least, it was for me when I first did this method):
There we have it, data moved from one environment to another:
This is a perfectly fine method for a single table, but what if you have several tables worth of data to move? The manual export/import method is by its very name – manual; you’ll need to do the above process for each table, one by one. We also have some limits around columns we can & can’t export data for.
There are a couple of tools we can use instead that might be better, so let’s take a look at those next.
Configuration Migration Tool (CMT)
As it says in the opening paragraph here, this is used to transport config & test data from one environment to another. At a high level, this Microsoft Learn article explains what the tool does and how it works.
The CMT itself is quite easy to use, but getting to it perhaps isn't – especially if you're not familiar with Visual Studio and/or running command line prompts. However, it does do the things that manual export/import doesn't, namely migrating image and multi-choice columns. For some cases, then, the juice will definitely be worth the squeeze.
If you wish to install the CMT on a work/professional device, you may not have local admin rights and so would need to log a ticket for your IT Support to install instead.
PAC CLI
To get the Configuration Migration Tool, you need to install the Power Platform CLI. To install that, you can choose one of three methods:
1- Install Visual Studio Code,
2- Install with .NET tool, OR
3- Install with Windows MSI.
Each method is linked here. I personally went with the Visual Studio Code route as that’s pretty easy to install. Once that’s on your device, finding and installing the Power Platform CLI extension is easy too.
Follow this article to get set up with both Visual Studio Code and the Power Platform CLI.
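For reference, if you go down the .NET tool route instead, the install is a single command in a terminal (this assumes you already have the .NET SDK installed on your device):

dotnet tool install --global Microsoft.PowerApps.CLI.Tool

Once installed, running pac help should list the available commands and confirm everything is working.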
Launch CMT
With Visual Studio Code open, click on Extensions, then Power Platform Tools:
At the bottom of the main window, you’ll see an option for Terminal. That’s where we want to be:
If you can’t see that, fear not. In the top left, select Terminal, then New Terminal:
Type in pac tool cmt, then hit Enter on your keyboard:
Give it a few seconds; it should return a message saying the CMT has been launched. You will now find the CMT available to use. If you are working on multiple screens, the CMT may have popped up on a different screen to Visual Studio Code:
Create schema / Export data
Select the Create schema option, then click on Continue in the bottom right. In the next screen you can select how you’ll be accessing the data.
You can choose to sign in as the current user, just make sure your account has the right access to both the source and destination environments. Alternatively, you can simply select Display list of available organizations, then Login, to add alternative credentials instead (for example, those of a Service Account).
When prompted, select the source environment. In other words, where the solution and data currently live. Then click on Login:
From the resulting view, select the solution you need from the top left hand menu. System solutions have an orange icon next to them; your custom solutions will have a green icon. With the solution selected, you can select individual tables or Add All:
With the CMT, we don’t have to preload tables used for Lookup columns first. In this example, we can therefore add all tables and do the export/import in a single batch.
With the tables selected and added, click on Save and Export in the bottom right. You'll be asked to specify a location and file name for the XML file; this stores data about the columns in the selected tables. When this is saved, the CMT will ask if we'd like to export the data. Select Yes:
Click on the ellipsis next to Save to data file and specify a location for the data file (I used the same folder as the XML file from before). When done, click on Export Data:
All being well, you should have lots of green ticks at the end to indicate a successful export:
Import data
With the schema & data exported from the source environment, we can now load into the target environment.
With the CMT open, select the Import data option and click Continue:
As with the export step, you can sign in as the current user so long as that account has the relevant access. We need to select the target environment, so select Display list of available organizations and click Login.
Select the environment you want to import the data into and click Login:
In the next screen, click the ellipsis and navigate to where you saved the data export. Select the zip file and it’ll appear at the top:
All being well, the CMT will successfully find the file and inform you of how many tables (entities) are ready to be imported:
Click on the Import Data button at the bottom for the magic to commence. Obviously the more data there is, the longer it will take. We’ll hopefully get a variety of success messages when the data has imported.
Excellent! All data, including the complex columns, is imported successfully into our destination environment:
XRM Toolbox
If you’re not familiar with the XRM Toolbox, now is a good time to learn. It’s a fantastic AND FREE application that has over 300 tools you can add and use. These tools are built and shared by the community to help others, which is pretty awesome.
To get started with the XRM Toolbox, you'll need to visit the website and click 'Download latest version'. A reminder that you may need to log a ticket with your internal IT Support to do this, if you don't have access to install programs on your device.
Tool Library
When you have XRM Toolbox installed, load it up and select Open Tool Library from the home page:
There are endless tools to help with migrating data & settings from one environment to another. I shall focus on one in particular for this article – Data Transporter. This was recently recommended to me by my good friend, Parvez Ghumra. Parvez is a really knowledgeable & respected member of the Power Platform community, so his recommendations always fly well with me.
In the tool library, we can search for any keyword, select the tool we want, then install it:
Once installed, it will be available in both the Start page and Tools tabs:
Add connections
You’ll need to create connections to your environments to use most (if not all) tools in XRM Toolbox. In our example, we’ll need 2 connections – one for the source environment and another for the destination. This is a one-time exercise for each environment you need – connections are saved for later use as well.
We are going to need the environment URLs of both environments.
1- go to the Power Platform Admin Centre.
2- select the source environment.
3- copy the environment URL somewhere (e.g. Notepad).
4- repeat for all environments needed.
Back in the XRM Toolbox, click on Connect in the top left hand corner. In the resulting popup, select New. There are a number of connection methods; I'll go with OAuth/MFA:
Insert the environment URL (including the https prefix), then click on Next:
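For reference, a Dataverse environment URL typically looks something like https://yourorg.crm.dynamics.com (the exact subdomain and regional suffix, such as crm, crm4 or crm11, will vary for your tenant).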
Next, add the email address that's relevant for the tenant/environment. You can choose to use your own Azure App Registration if you want to; more details are available on the link provided. When you click on Next, you'll need to enter the email address & password for that account. Don't forget to approve the MFA request if the account is set up for it.
Once connected, it’s good to give the connection a useful name so you can identify and use it easily in future.
Repeat the process for any other environments you’ll need to connect to.
🐞 BUG 🐞
Sometimes you have to close down the connections popup window, then click on Connect to reload it, to then see the created connections.
Run the tool
We’re now ready to run the Data Transporter tool. In the Tools tab, find the tool and right click. Select Open tool with new connection. Select the environment connection for the source environment – that’s where the data is currently that you want to move:
When the tool loads, the source environment will be shown in the top left hand corner. We need to set the target environment – where we are moving the data to. Click on Select target and choose the relevant environment:
On the left hand side, you can search for the table you need to move data for. Select it so it shows on the right hand side, then click on Transfer Data. Repeat the process for each table where you need to move data. As per the Export/Import process, first move data that’s going to be used for Lookup columns, then move data that references those tables:
In the top right, it's set to just create records; you can change this to suit your needs. I must admit I've only scratched the surface of this tool; you can find out more about other Data Transporter settings here.
It's a really good tool and, as far as I know, moves all the complex column types. There are other great content migration tools in the XRM Toolbox; sometimes the best thing to do is try a few and see what works best for you!
Power Automate
The final method I’ll cover in this article is using Power Automate. That’s right – we can build a flow to move data from one environment to another!
Ever since the days when Power Automate was still called Microsoft Flow, you could only build flows referencing the Dataverse instance in the same environment as the flow itself. As of late 2023, we can connect to other environments using the Dataverse connector. The release documentation can be found here and suggests this is still a preview feature. However, the word 'Preview' has recently disappeared from the relevant connector actions, so it should be fine to use in anger.
As per the other methods:
1- ensure the solution structure is in the destination environment first, and
2- your Flow will need to move reference data that’s used for lookup columns first.
The flow
We can do this by first adding the List rows Dataverse action to a flow, then selecting the table that holds the data we need to move:
Next, we can add the new Add a row to selected environment action. For the properties, select the destination environment and table for the data. Use dynamic content to populate the relevant columns originally captured in the previous List rows step:
When you configure the above, Power Automate will add an Apply to Each container so it can process multiple rows.
Once all reference data is moved, you can repeat the same steps for the master data. These are exactly the same steps as above, in that you can use dynamic content to populate a table in another environment. The only slight caveat is updating a lookup column; that's not so straightforward. Tom Riha's blog post on how to do this is something I've had in my favourites for a while, as it's always a big help here.
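To illustrate the lookup pattern (the table name below is made up for the example): a lookup column in the Add a row to selected environment action expects the plural entity set name of the referenced table, followed by the row's GUID in brackets, something like:

cr123_factions(f9168c5e-ceb2-4faa-b6bf-329bf39fa1e4)

...where the GUID would come from the dynamic content of a List rows step against the Factions table in the destination environment.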
Here’s what the final block looks like in my flow. We previously moved the data in the Factions table, meaning that we can use it to get the correct lookup value. Single and multi-choice, as well as images, are covered by dynamic content:
Extensions
You could take this a lot further too. For example:
1- if the record already exists then update it, else add it (sketched after this list), or
2- if the record no longer exists in the source tables, remove it from the destination tables.
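For the first of those, a minimal upsert sketch (the action and column names here are illustrative, not exact): before adding a row, run a List rows step against the destination environment with an OData filter on a column that uniquely identifies the record, e.g. cr123_name eq 'Darth Vader'. Follow it with a Condition using an expression along the lines of length(outputs('List_rows_destination')?['body/value']) to check whether anything came back; if it's greater than 0, update the existing row (the connector has an equivalent update action for the selected environment), otherwise add a new one.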
You could even have some kind of setup wizard app that calls the flow, once a solution has been moved across. All quite maker friendly and easy to put together.
Conclusion
I've highlighted four good options for moving Dataverse data from one environment to another. There are LOADS more; other tools in the XRM Toolbox, third party tools, dataflows, or even extending your Azure DevOps Pipelines to do it. Some require more setup and knowledge/skills than others, so I've tried to keep this article focused on the lower-hanging, easy-to-use options.
What tools or methods are you using to migrate data from one environment to another? Let me know in the comments below.
If you liked this article and want to receive more helpful tips about Power Platform every week, don’t forget to subscribe or follow me on socials 😊
Nice post Craig,
But let's be honest, it is a pain to move data between environments regardless of which approach we use. It gets even worse when you have multiple tables with lookups between them, and the data needs to be imported in a certain order.
Agree, it’s not the best experience but at least we have multiple options to choose from to help us. There are more advanced options too which I hope to cover in a future post!
Awesome article as always Craig! What would you prefer?
Me personally, the Data Transporter add-in in the XRM Toolbox. It’s my new favourite.
Hey Craig,
Thanks for sharing. Another very informative article. I tend to use the CMT tool and I find it much easier using the VS Code/CLI to install and invoke it. Didn’t realise the XRM Toolbox has this tool as well. Will explore that sometime and the Power Automate one as well. At least, now I know there are other options. Thanks
SQL4CDS in XrmToolBox can also do cross environment data migration.
Great article, Craig. Myself, I prefer the combination of the pac cli and Azure DevOps, but as you wrote, it requires certain skills.
Regarding the Power Automate, it might be worth mentioning that sometimes Choices (Option Sets for those who prefer old names) might have to be handled in a special way. Of course, that only applies when you need to migrate data from data sources other than Dataverse.
Hi! Does the CMT method work for importing into managed solutions as well? The tables are custom and I can manually customise them in the managed solution just fine; however, when I try the import, no data goes through. The export works just fine too.
Hi, thanks for the question. I’ll always deploy into test/prod as a managed solution and CMT works fine when interacting with those downstream tables; it’s not editing config, just adding records so should be fine. I’ve never had any issue anyway.
Hi! Great article, thank you. It definitely helps in seeing the possibilities. Is there a particular reason you don't cover dataflows? It's the recommended method by Microsoft (as stated here: https://learn.microsoft.com/en-us/power-apps/developer/data-platform/dataverse-odata-dataflows-migration)
Hi Niels, it’s one of the options, yes, especially for syncing larger data sets but perhaps a bit overkill for smaller one-off scenarios when PAC CLI or XRM toolbox tools will suffice. Dataflows also have their own considerations & complexities that I’ll be covering in a separate article.
Thanks Craig.
I just want to know if there is a way to import the image in full (&full=true)? I notice the PAC CLI and XRM Toolbox migrate the compressed image. The image column has 'Can store full image' enabled, but it is still migrated without the full-size image.
Good question Gen! I’m not sure on the answer right now but will get back to you