How to Create Self Hosted IR and Load Data From On-Prem to Azure Cloud - Azure Data Factory Tutorial

Issue: How to Create Self Hosted IR and Load Data From On-Prem to Azure Cloud - Azure Data Factory Tutorial 2021.


In this blog post, we are going to learn step by step how to create a self-hosted integration runtime (IR) and load data from on-premises to the Azure cloud with Azure Data Factory.

Create & configure a self-hosted integration runtime:

Open Azure Data Factory, click on the Manage tab, click on Integration runtimes, and click New:



Fig - 1 : Create & Configure A Self-Hosted Integration Runtime


Click on Azure, Self-Hosted, enter the required information, and click Create:

Fig-2 : Information column name & description

Click on Download and install integration runtime.

Fig- 3 Download & install integration runtime




Open the integration runtime and copy the keys from Azure Data Factory, as shown in the picture below.


Fig- 4: Copy the keys for self-hosted integration runtime 

Paste the copied key, click on Register, and then click Finish.

Fig- 5: Insert copied keys for registration


Create a pipeline, give it a name, and drag a Copy Data activity onto the canvas:

Fig-6 : perform copy activity 




Click on Source, then + New, select SQL Server, and click Continue.

Fig- 7: Create new source for copy activity

Give it a name, create a new linked service, enter the server name, select the self-hosted IR, enter the database name, the user name and password, and click Test connection.

Fig- 8: Create new linked service


Click on Open.



Click on Preview data.


 

For the sink, click on Azure Blob Storage, select the CSV (DelimitedText) format, set the properties (enter a name and the linked service), create the output connection, click OK, and then click Debug; the output file will be generated in the blob container.


Fig-9 : Create new blob storage step by step
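If you only need a subset of rows rather than the whole table, you can switch the copy activity's source from Table to Query and supply a short T-SQL statement. A minimal sketch is shown below; the table and column names are hypothetical examples, not part of this demo:

-- Hypothetical source query for the copy activity (dbo.Customer and its columns are examples only)
SELECT CustomerID,
       CustomerName,
       ModifiedDate
FROM   dbo.Customer
WHERE  ModifiedDate >= DATEADD(day, -1, GETDATE());  -- copy only rows changed in the last day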

How to Copy a File from a Local Folder to the Cloud
Go to the drive where the folder is located, right-click the folder, go to Properties, click Sharing, then click Share, grant permission to the user, and click Share.



Fig- 10: How To Copy File from local folder to Cloud


Copy the shared link:




Go to Azure Data Factory and create a new pipeline, drag a Copy Data activity onto the canvas, click on Source, click on New, and select File system.

Fig- 11: create new pipeline

Select Binary and click Continue.


Name the dataset and create a new linked service.



In the linked service, enter the server name, select the self-hosted IR, enter the host (the shared folder path), the user name and password, then click on Test connection and click Create.


Click Open.



Browse to the required file and click OK.


Go to the pipeline, click on Sink, click on Blob Storage, then click on Binary, select the Azure Blob Storage linked service, select the output folder, and click OK.

Click OK and then click Debug.


As you can see in the picture below, the task completed successfully:

 


How to Load Multiple JSON Files to SQL Server Table with File Name In SSIS - SSIS Tutorial 2021

Issue: How to Load Multiple JSON Files to SQL Server Table with File Name In SSIS - SSIS Tutorial 2021



In this blog post we are going to learn how to load multiple JSON files to a SQL Server table, along with the file names, in SSIS. I have generated a couple of JSON files online so we can use them for our demo.
Here is a snapshot of one of the JSON files we have used in the demo.

Fig 1- Load JSON files to SQL Server Table in SSIS Package - Sample JSON File

Open SQL Server Management Studio (SSMS) and use the script I have provided below. First of all, we have to declare the variable with "Declare @JSON varchar(max)", as I have shown in this picture.

Declare @JSON varchar(max)
SELECT @JSON=BulkColumn
FROM OPENROWSET (BULK 'C:\MyJsonFile.json', SINGLE_CLOB) import
SELECT * FROM OPENJSON (@JSON)
WITH
(
 [ID] INT,
 [first_name] varchar(100),
 [last_name] varchar(100),
 [email] varchar(100),
 [gender] nvarchar(20),
 [ip_address] varchar(10)
)


Fig 2 - How to Load JSON File in SQL Server by using TSQL

By using the OPENROWSET function, we have imported the JSON file into the @JSON variable.


Fig 3 - How to use OPENROWSET  to import JSON Files in variable in TSQL
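If you want to run just this step on its own, a minimal sketch (using the same sample path as above) is:

Declare @JSON varchar(max)
SELECT @JSON = BulkColumn
FROM OPENROWSET (BULK 'C:\MyJsonFile.json', SINGLE_CLOB) import
SELECT @JSON  -- returns the raw JSON text now held in the variable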

OPENJSON is a table-valued function that converts JSON data into rows and columns. As you can see below, I have used OPENJSON and provided our variable @JSON as input so we can get the data in tabular format.

Fig 4- Get tabular data by using OPENJSON Function in TSQL
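As a side note, if you call OPENJSON without a WITH clause it returns a generic key/value/type result, which is handy for a quick look at unfamiliar JSON (this exploration step is optional and not required for the load):

-- Default schema: one row per property or array element, with key, value, and type columns
SELECT [key], [value], [type]
FROM OPENJSON (@JSON)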

In the next step, we have to describe the list of columns that we would like to get from our JSON data. If you do not want to read all of the columns, you can provide only the required ones.

Fig5 - Get desired column from JSON file by using OPENJSON Table function in TSQL
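For example, if you only care about the name and email fields, a trimmed-down WITH clause (a sketch based on the same sample file) would look like this:

SELECT *
FROM OPENJSON (@JSON)
WITH
(
 [first_name] varchar(100),
 [email] varchar(100)
)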

To load our sample JSON files, we need a table to hold the JSON data.


Create the table by using the command below:

create table dbo.SSISJson (
    [id] int,
    [first_name] varchar(100),
    [last_name] varchar(100),
    [email] varchar(100),
    [gender] nvarchar(20),
    [ip_address] varchar(10),
    [filename] varchar(1000))

Fig 6- Create table by using TSQL for JSON File load

If you want to write the data to a table, you can provide the table name in an INSERT statement. Make sure the table has the same columns as your OPENJSON WITH clause (plus the filename column we added), as shown in the sketch after the figure.



Fig 7 - Insert data into table from JSON files by using TSQL
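Putting the pieces together, a sketch of the full insert (hard-coding the sample file path from earlier as the filename value) looks like this:

Declare @JSON varchar(max)
SELECT @JSON = BulkColumn
FROM OPENROWSET (BULK 'C:\MyJsonFile.json', SINGLE_CLOB) import

INSERT INTO dbo.SSISJson (id, first_name, last_name, email, gender, ip_address, filename)
SELECT ID, first_name, last_name, email, gender, ip_address,
       'C:\MyJsonFile.json' AS filename  -- hard-coded here; the SSIS package supplies this dynamically later
FROM OPENJSON (@JSON)
WITH
(
 [ID] INT,
 [first_name] varchar(100),
 [last_name] varchar(100),
 [email] varchar(100),
 [gender] nvarchar(20),
 [ip_address] varchar(10)
)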


As you can see in the picture below, 1000 rows are inserted.

Fig 8- How to Insert JSON Files into SQL Server Table by using TSQL


After all the scripts have been tested in TSQL to load a JSON file to a SQL Server table, we are going to create an SSIS package to load multiple JSON files into the same table that we created above.

Open Visual Studio and click on Packages > New SSIS Package.

Fig 9- Load JSON Files to SQL Server Table by using SSIS Package - Create New SSIS Package

In the new SSIS package we need a Foreach Loop Container to get the list of JSON files from the folder.



Fig 10 - Get list of JSON Files by using Foreach Loop Container in SSIS Package

In the Foreach Loop Container we have to select the file type and the path of our JSON files.

Fig 11 - Choose only JSON files from folder by using Foreach Loop Container

Next, click on Variable Mappings and enter the details so the file name is saved into a variable.


Fig 12 -Save the file name into variable in Foreach Loop Container in SSIS Package

Bring in an Execute SQL Task and configure the OLE DB connection.

Fig 13- Use Execute SQL Task in SSIS Package to Load JSON files in SQL Server Table

You will be choosing your Server Name and authentication method to connect to your SQL Server.



Choose the server name and database name for the connection manager.


Test the connection and confirm it succeeds.


Next, click on Expressions, select the SqlStatementSource property, click on the Expression, and paste the command below; it uses the file path variable created earlier. The variable is referred to as @[User::FileName] below as an assumption; replace it with whatever name you chose in the Variable Mappings.
 
"Declare @JSON varchar(max)
SELECT @JSON=BulkColumn
FROM OPENROWSET (BULK "++", SINGLE_CLOB) import
insert into dbo.SSISJson
SELECT * FROM '"++"' as filename FROM OPENJSON @JSON
WITH
(
 [ID] INT,
 [first_name] varchar(100),
 [last_name] varchar(100),
 [email] varchar(100),
 [gender] varchar(20),
 [ip_address] varchar(10)
)"


Once the expression evaluates correctly, we are all set to test our SSIS package. Truncate your SQL Server table and then execute the SSIS package to load the multiple JSON files.

Fig 14 - Load multiple JSON files to SQL Server table by using Execute SQL Task in SSIS Package

Go to SSMS and execute a SELECT query on the table to confirm that the data loaded successfully, for example with the query below.
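A simple way to confirm the load, and to see how many rows came from each file, is a query along these lines against the table we created earlier:

-- Row count per source file loaded by the SSIS package
SELECT filename, COUNT(*) AS row_count
FROM dbo.SSISJson
GROUP BY filename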

 Fig 16- Load More than one JSON File to SQL Server table by using SSIS Package


If you still need any help or have missed any steps in creating the SSIS package, feel free to check the video demo, which shows a step-by-step approach to loading multiple JSON files to a SQL Server table.


Video Demo: How to Load Multiple Json Files to SQL Server Table with File Name In SSIS

How to Load JSON File to SQL Server Table by using TSQL in SSMS - TSQL Tutorial

In this blog post, we are going to learn how we can load a JSON file to a SQL Server table by using TSQL.
I have generated the JSON file below online so we can use it for our demo.



Open SQL Server Management Studio (SSMS) and use the script I have provided below. First of all, we have to declare the variable with "Declare @JSON varchar(max)", which I have highlighted in this picture.

Declare @JSON varchar(max)
SELECT @JSON=BulkColumn
FROM OPENROWSET (BULK 'c:\MyJsonFile.json', SINGLE_CLOB) import
SELECT * FROM OPENJSON (@JSON)
WITH
(
 [ID] INT,
 [first_name] varchar(100),
 [last_name] varchar(100),
 [email] varchar(100),
 [gender] varchar(20),
 [ip_address] varchar(10)
)



By using the OPENROWSET function, we have imported the JSON file into the @JSON variable.



OPENJSON is a table-valued function that converts JSON data into rows and columns. As you can see below, I have used OPENJSON and provided our variable @JSON as input so we can get the data in tabular format.



In the next step, we have to describe the list of columns that we would like to get from our JSON data. If you do not want to read all of the columns, you can provide only the required ones.


If you want to insert this data into a table, just add an INTO clause to the SELECT (for example, SELECT * INTO dbo.JsonTestTable), as shown in the sketch below.
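As a complete sketch, the statement below creates dbo.JsonTestTable on the fly and fills it from the same sample JSON file:

Declare @JSON varchar(max)
SELECT @JSON=BulkColumn
FROM OPENROWSET (BULK 'c:\MyJsonFile.json', SINGLE_CLOB) import

SELECT *
INTO dbo.JsonTestTable  -- SELECT ... INTO creates the table automatically
FROM OPENJSON (@JSON)
WITH
(
 [ID] INT,
 [first_name] varchar(100),
 [last_name] varchar(100),
 [email] varchar(100),
 [gender] varchar(20),
 [ip_address] varchar(10)
)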




Once you execute the TSQL statement, all the data from the JSON file will be inserted into the table.





Video Demo: How to load Json file to SQL Server Table by using TSQL