Azure Data Factory Check Row Count of Copied Records in Copy Activity.

Issue: Azure Data Factory Check Row Count of Copied Records in Copy Activity.


In this article, we are going to learn how to check the row count of copied records in the Copy activity in Azure Data Factory. We will also learn how to load multiple CSV files to SQL tables dynamically, and how to save the file name and the number of rows copied in an audit table. Let's start our demonstration.

First of all, open the Azure portal and go to your SQL server. Under Firewalls and virtual networks, enable "Allow Azure services and resources to access this server" and save.


Open SQL Server Management Studio, provide the server name, select the authentication type, provide your user name and password, and then connect.
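Since the last activity of the pipeline will call a stored procedure that writes the file name and the number of rows copied into an audit table, this is a good point to create those objects. The article does not show the exact script, so the table and procedure below are only a minimal sketch with assumed names (dbo.CopyAudit and dbo.InsertCopyAudit); adjust them to your own environment.

-- Audit table that stores one row per copied file (assumed name).
CREATE TABLE dbo.CopyAudit
(
    AuditId    INT IDENTITY(1, 1) PRIMARY KEY,
    FileName   NVARCHAR(255) NOT NULL,
    CopiedRows INT           NOT NULL,
    InsertedOn DATETIME      NOT NULL DEFAULT (GETDATE())
);
GO

-- Stored procedure the pipeline will call after each Copy data run (assumed name).
CREATE PROCEDURE dbo.InsertCopyAudit
    @FileName   NVARCHAR(255),
    @CopiedRows INT
AS
BEGIN
    INSERT INTO dbo.CopyAudit (FileName, CopiedRows)
    VALUES (@FileName, @CopiedRows);
END
GO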


Open Azure Data Factory, go to the Author tab, click on Pipelines, then New pipeline. Find and drag the Get Metadata activity, go to its Settings tab, and click on the + New button next to the dataset field to create a new dataset.


Select Azure Blob Storage, then click on Continue.


Select CSV as the file format, then click on Continue.


Name your dataset, select the linked service if you have already created it (or create a new one), provide the folder path from where we have to read the files, select None for import schema, and click on OK.


Now, back in the Settings tab of the Get Metadata activity, click on the + New button in the Field list, then select Child items.


Once we are done with our Get Metadata activity, find and drag the ForEach activity, connect it with Get Metadata, then go to the Settings tab and provide the dynamic content for the Items field, as shown below.
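The Items field needs to point at the list of files returned by the Get Metadata activity. Assuming the activity kept its default name Get Metadata1, the dynamic content is @activity('Get Metadata1').output.childItems. Each element of this array carries the file's name and type, and inside the loop the current element is available as @item().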



Now, go inside the ForEach activity, then find and drag the Copy data activity, go to the Source tab, then click on the + New button to create a new source dataset.



Select Azure Blob Storage, then click on Continue.


Select DelimitedText (CSV) as the format, then click on Continue.



Name your source dataset, select the linked service, provide the folder path where our files are sitting, select First row as header, select None for import schema, and then click on OK.


Once our source dataset is created, click on the Open button, go inside, and create a parameter, because our file name is dynamic.
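For illustration, assume the parameter is named FileName (the article does not fix a name). In the dataset's connection settings the file name field is then set to the dynamic content @dataset().FileName, and back in the Copy data activity's Source tab the value passed to this parameter is @item().name, i.e. the name of the file the ForEach loop is currently processing.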


Next, go to the Sink tab and click on the + New button to create a new sink dataset.


 
Select Azure SQL Database, then click on Continue.


Name your dataset, select the linked service if you have already created it (or create it now), select None for import schema, and then click on OK.




Next, in the sink dataset, click on the Open button.


Once the dataset is open, create a parameter for the table name, since the destination table is also dynamic.


Now, back in the pipeline, in the Sink tab provide the values for the dataset parameters, then select Auto create table.
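If you want one SQL table per file, a common choice for the table name value is the file name without its extension, for example the dynamic content @replace(item().name, '.csv', ''); this is only one possible convention, not something the article mandates.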



Next, find and drag the Stored Procedure activity and connect it with the Copy data activity; this is the step that takes the file name and the number of rows copied and loads them into the audit table. In the Settings tab, select the linked service and the stored procedure, then click on Import to get the list of its parameters. Provide the values for the parameters as described below, then go back to the pipeline and click Debug.
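Assuming the default activity names, the file name parameter can be set to @item().name and the rows-copied parameter to @activity('Copy data1').output.rowsCopied. The rowsCopied property of the Copy data activity's output holds the number of records written to the sink, which is exactly the row count we want to save in the audit table. After the Debug run completes, query the audit table in SQL Server Management Studio to confirm that one row per file was logged with the expected counts.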



Video Demo: Azure Data Factory Check Row Count of Copied Records in Copy Activity 



