Azure Data Factory: Extract (HTTP), Transform, and Load (Azure SQL)

Mohit Gupta
Published in AzureDiary
3 min read · Oct 16, 2019

Overview:

This article gives you an overview of how to create an ETL pipeline in Azure Data Factory.

Create Azure Data Factory in Azure Portal
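Besides the portal, a factory can also be declared as an ARM resource. Below is a minimal sketch of that resource as a Python dict; the factory name and region are hypothetical placeholders, not values from this walkthrough.

```python
import json

# Sketch of an ARM-template resource for a Data Factory instance.
# "adf-etl-demo" and "eastus" are hypothetical placeholders.
factory_resource = {
    "type": "Microsoft.DataFactory/factories",
    "apiVersion": "2018-06-01",
    "name": "adf-etl-demo",
    "location": "eastus",
    "identity": {"type": "SystemAssigned"},  # lets ADF authenticate to other Azure services
    "properties": {},
}

print(json.dumps(factory_resource, indent=2))
```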

Create Connection :

In this article we are using an HTTP source and an Azure SQL sink, so we create two linked services: one for HTTP and one for Azure SQL.

Create new linked service for HTTP API

Create another Linked service for Azure SQL Database
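The two linked services boil down to small JSON definitions. Here is a sketch of both as Python dicts; the service names, URL, and connection string are hypothetical, and in practice the secret belongs in Azure Key Vault rather than inline.

```python
import json

# Hypothetical HTTP linked service: anonymous auth against a placeholder base URL.
http_linked_service = {
    "name": "LS_HttpSource",
    "properties": {
        "type": "HttpServer",
        "typeProperties": {
            "url": "https://example.com/api/",   # hypothetical base URL
            "authenticationType": "Anonymous",
        },
    },
}

# Hypothetical Azure SQL linked service; store the real connection string in Key Vault.
sql_linked_service = {
    "name": "LS_AzureSql",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": "Server=tcp:myserver.database.windows.net,1433;Database=mydb;",
        },
    },
}

print(json.dumps([http_linked_service, sql_linked_service], indent=2))
```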

Create DataSets

We need three datasets

One for the HTTP API service

One for the SQL staging table

One for the SQL main table
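The three datasets above can be sketched as JSON definitions too. The dataset and table names below are illustrative (the main-table name is a hypothetical stand-in), and they assume linked services named LS_HttpSource and LS_AzureSql as in the earlier sketch.

```python
import json

# Hypothetical dataset over the HTTP source, read as delimited text.
http_dataset = {
    "name": "DS_HttpEmployees",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {"referenceName": "LS_HttpSource", "type": "LinkedServiceReference"},
        "typeProperties": {
            "location": {"type": "HttpServerLocation", "relativeUrl": "employees.csv"},
            "columnDelimiter": ",",
            "firstRowAsHeader": True,
        },
    },
}

# Dataset over the staging table from the DDL below.
staging_dataset = {
    "name": "DS_EmployeesExtract",
    "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": {"referenceName": "LS_AzureSql", "type": "LinkedServiceReference"},
        "typeProperties": {"schema": "dbo", "table": "EmployeesExtract"},
    },
}

# Dataset over the main table ("Employees" is a hypothetical name).
main_dataset = {
    "name": "DS_Employees",
    "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": {"referenceName": "LS_AzureSql", "type": "LinkedServiceReference"},
        "typeProperties": {"schema": "dbo", "table": "Employees"},
    },
}

print(json.dumps([http_dataset, staging_dataset, main_dataset], indent=2))
```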

Create tables for the staging data and the transformed data

CREATE TABLE [dbo].[EmployeesExtract] (
    [ID]         [nvarchar](255) NOT NULL,
    [first_name] [nvarchar](255) NULL,
    [last_name]  [nvarchar](255) NULL,
    [email]      [nvarchar](255) NULL,
    [gender]     [nvarchar](255) NULL,
    [ip_address] [nvarchar](255) NULL,
    [isDelete]   [nvarchar](255) NULL,
    [Address]    [nvarchar](255) NULL
)

Create DataSets for SQL Tables

Create a Data Flow to insert/update/delete data in the main table from the staging table

Set properties for Insert condition

Set properties for Upsert condition
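The alter-row logic the data flow applies can be summarized in plain Python: rows flagged isDelete are removed from the main table, IDs that already exist are updated, and new IDs are inserted. The exact data-flow expressions are assumptions; this sketch only mirrors the row-level semantics, with hypothetical sample data.

```python
def merge_staging_into_main(main: dict, staging: list) -> dict:
    """main maps ID -> row dict; staging is a list of staged row dicts."""
    result = dict(main)
    for row in staging:
        key = row["ID"]
        if row.get("isDelete") == "true":
            # Delete condition: drop the row from the main table if present.
            result.pop(key, None)
        else:
            # Upsert condition: update an existing ID or insert a new one.
            result[key] = row
    return result

# Hypothetical sample data illustrating all three paths.
main_table = {"1": {"ID": "1", "first_name": "Ada"}}
staging_rows = [
    {"ID": "1", "first_name": "Ada", "isDelete": "true"},     # deleted
    {"ID": "2", "first_name": "Grace", "isDelete": "false"},  # inserted
]
merged = merge_staging_into_main(main_table, staging_rows)
print(sorted(merged))  # → ['2']
```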


Integration Developer working on Azure, DevOps, and BizTalk. Contact: mohit.e.gupta@gmail.com