Using Azure Data Factory V2 to Load Microsoft Dynamics 365 CRM from a Text File

Simon Harris
Capgemini Microsoft Blog
5 min read · May 30, 2018

The following provides step-by-step instructions on how to load data into Microsoft Dynamics 365 using Azure Data Factory.

Having used SSIS and KingswaySoft for a while to load CRM, I was interested in trying Azure Data Factory V2 to see whether it is the way forward. Looking around the web I found some demos and examples of using Data Factory, but only a few involving Dynamics CRM, and the CRM examples I did find were reading from CRM rather than writing records into it. So I decided to try it for myself; the results are below.

Azure Data Factory: the recently released Version 2 supports Dynamics 365 as a source or target, allows pipelines to be built for repeating jobs, and suits high data volumes.

The walkthrough below shows, step by step, how to load data into Dynamics 365 CRM from a flat file using Azure Data Factory.

Required:
-A free 30-day trial Dynamics 365 CRM instance
-An Azure subscription (MSDN or paid)

Create Blob Storage for the input file

View the Blob Storage account and create a container

Create a text file with a .csv extension containing the following data:

contactid,firstname,lastname,addressline1,addressline2,city,province,postcode,country,email,telephone,mobilephone
,Mark,Bronner,18 High Street,\N,Lakford,\N,IP23 9SW,\N,MB@bob.com,\N,\N
,Luke,Jones,Magui Lan,\N,Guangzhou,Guangdong,650031,Republic of China,LJ@adatum.com,012-156-8778,135-548-8797
,Kim,Smith,11111 William Nicol Drive,\N,Johannesburg,\N,2100,South Africa,KS@adventure.com,768-555-0156,\N
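
The \N entries represent NULL values; conveniently, "\N" is the default nullValue marker for Data Factory's TextFormat, so the blob dataset will treat those cells as empty. For reference, the relevant format block of the dataset definition looks roughly like this (a sketch; everything outside the format block is omitted, and note the backslash is escaped in JSON):

"format": {
    "type": "TextFormat",
    "columnDelimiter": ",",
    "firstRowAsHeader": true,
    "nullValue": "\\N"
}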

Upload the new text file to Azure

Select the new text file and click Upload

Click on the newly uploaded file to check it

Click Edit Blob to check the file format is as expected

Ensure contents are as expected

Create a new Azure Data Factory resource

Click Author & Monitor

Select Create Pipeline

Drag a Copy activity onto the canvas from the Dataflow section

Click on the Copy tile, then Source, then click +New

Select Azure Blob Storage

In the Azure Blob Storage dataset, select Connection, then +New

Set the parameters for your subscription and storage account name, then click Test connection and Finish
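
Behind the UI, this step creates a linked service definition. A minimal sketch of the JSON it corresponds to, assuming a storage account key connection (the linked service name and the placeholders are mine):

{
    "name": "AzureStorageLinkedService",
    "properties": {
        "type": "AzureStorage",
        "typeProperties": {
            "connectionString": {
                "type": "SecureString",
                "value": "DefaultEndpointsProtocol=https;AccountName=<accountname>;AccountKey=<accountkey>"
            }
        }
    }
}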

Tick the option indicating the source file contains column names in the first row, then click Browse and select the file you uploaded to Azure Blob storage

Click on Schema, then Import Schema, and set the contactid type to Guid
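
The resulting blob dataset definition should then look something like the sketch below: contactid typed as Guid and the remaining columns as String (dataset, container and file names here are placeholders from my setup):

{
    "name": "ContactsCsvDataset",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": {
            "referenceName": "AzureStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "structure": [
            { "name": "contactid", "type": "Guid" },
            { "name": "firstname", "type": "String" },
            { "name": "lastname", "type": "String" },
            { "name": "addressline1", "type": "String" },
            { "name": "addressline2", "type": "String" },
            { "name": "city", "type": "String" },
            { "name": "province", "type": "String" },
            { "name": "postcode", "type": "String" },
            { "name": "country", "type": "String" },
            { "name": "email", "type": "String" },
            { "name": "telephone", "type": "String" },
            { "name": "mobilephone", "type": "String" }
        ],
        "typeProperties": {
            "folderPath": "input",
            "fileName": "contacts.csv",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ",",
                "firstRowAsHeader": true,
                "nullValue": "\\N"
            }
        }
    }
}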

Then click the Publish All button at the top

Click back to the pipeline, highlight the Copy tile, click on Sink and then click +New

Select Dynamics 365 and Finish

Click on Connection and +New

Set the properties for Service Uri, User name and Password.
Note: at the time of writing (May 2018) this didn't work with UK CRM instances; it may be fixed by now
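
For reference, the Dynamics linked service this creates looks roughly as follows in JSON (Office365 is the authentication type used for Dynamics 365 Online; the organization URL, user name and password are placeholders):

{
    "name": "DynamicsLinkedService",
    "properties": {
        "type": "Dynamics",
        "typeProperties": {
            "deploymentType": "Online",
            "serviceUri": "https://<organization>.crm.dynamics.com",
            "authenticationType": "Office365",
            "username": "user@contoso.onmicrosoft.com",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        }
    }
}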

Select the CRM entity to load (contact, in this example)
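
The entity selection becomes a Dynamics dataset pointing at the contact entity; a sketch (the dataset name is mine):

{
    "name": "ContactEntityDataset",
    "properties": {
        "type": "DynamicsEntity",
        "linkedServiceName": {
            "referenceName": "DynamicsLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "entityName": "contact"
        }
    }
}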

Then click the Publish All button at the top

Click back on the pipeline at the top, then the Copy tile, then Mapping, then Import Schema, and check that your mappings are as expected
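
Importing the schema generates a translator on the copy activity. A sketch of what the mapping should end up as, with the CSV headers mapped onto the contact entity's standard attribute names (check these against what the import actually produced for you):

"translator": {
    "type": "TabularTranslator",
    "columnMappings": "contactid: contactid, firstname: firstname, lastname: lastname, addressline1: address1_line1, addressline2: address1_line2, city: address1_city, province: address1_stateorprovince, postcode: address1_postalcode, country: address1_country, email: emailaddress1, telephone: telephone1, mobilephone: mobilephone"
}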

Click Validate and ensure no errors

Click Publish All at the top, then Debug
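
At this point the published pipeline corresponds to roughly the following definition, tying the two datasets together with a Copy activity (the names are from my sketches above; the translator block from the mapping step sits alongside source and sink):

{
    "name": "LoadContactsPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyContactsToCrm",
                "type": "Copy",
                "inputs": [
                    { "referenceName": "ContactsCsvDataset", "type": "DatasetReference" }
                ],
                "outputs": [
                    { "referenceName": "ContactEntityDataset", "type": "DatasetReference" }
                ],
                "typeProperties": {
                    "source": { "type": "BlobSource" },
                    "sink": { "type": "DynamicsSink", "writeBehavior": "Upsert" }
                }
            }
        ]
    }
}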

A new Pipeline Run instance will be created

Refresh the view until it finishes (hopefully successfully); any error messages can be seen by clicking the icon on the pipeline run row

Log into CRM to see the new contacts created :)

Overall, I found Data Factory quite difficult to get used to, and the limited examples and detailed demonstrations didn't help. I hit a problem where the Dynamics 365 connector didn't support UK CRM instances, which Microsoft admitted was their issue. It also took me a while to understand how to get CRM to create the GUIDs: it seems you need to include the primary key field in the source, the target and the mappings, otherwise you get errors. However, if you leave the primary key value NULL in the input file and select Ignore NULL values in the Sink settings, it doesn't error and CRM generates the GUID for you.
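
In JSON terms, the sink settings that made this work translate to something like the block below: ignoreNullValues is the counterpart of the Ignore NULL values option, and Upsert is, as far as I can tell, the only write behaviour the Dynamics sink supports (writeBatchSize is shown at its default of 10):

"sink": {
    "type": "DynamicsSink",
    "writeBehavior": "Upsert",
    "writeBatchSize": 10,
    "ignoreNullValues": true
}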
