How to Use Stored Procedure with Parameter in Copy Activity in Azure Data Factory

 Issue: How to Use Stored Procedure with Parameter in Copy Activity in Azure Data Factory.


In this article, we are going to learn how to use a stored procedure with parameters in the Copy activity in Azure Data Factory. Let's start our demonstration.

Open Azure Data Factory, go to the Author tab, and click the + sign to create a new pipeline, then find and drag the Lookup activity onto the canvas.


Go to the Settings tab and then click on the + New button to create a new source dataset.


Then select Azure SQL Database and then click on Continue.



Name the dataset, then select the linked service if you have already created one; otherwise, create a new linked service. Select None for Import schema and then click OK.



Then write the query that returns the rows we want to loop over; in this demo, a list of region names.
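
As a minimal sketch, the lookup query can simply return the parameter values we want to iterate over; the dbo.Regions table and RegionName column below are placeholder names for whatever table holds your values:

    SELECT RegionName FROM dbo.Regions;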



Go to the pipeline again, find and bring in a ForEach activity, and connect it to the Lookup activity.


Click on the ForEach activity, then go to the Settings tab and add the Lookup output expression in the Items field.


Click on the Lookup output, then append .value and click on OK.
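
Assuming the Lookup activity kept its default name, Lookup1, the resulting Items expression looks like this:

    @activity('Lookup1').output.value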

Then click on the pencil sign on the ForEach activity to go inside it, where we will add a Copy activity.


Find and bring the Copy data activity inside the ForEach activity.


Click on the Copy data activity, then go to the Source tab, and then click on the + New button to create a new source dataset.


Select Azure SQL Database and then click on Continue.


Name the source dataset, then select the linked service, select None for Import schema, and then click on OK.


Select Stored procedure in the Source tab, and then pick your stored procedure from the dropdown.
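
For reference, a parameterized stored procedure suitable for this kind of demo could look roughly like the sketch below; the procedure, table, and column names are assumptions chosen to match the RegionName parameter used later, not the exact objects from the video:

    CREATE PROCEDURE dbo.GetCustomersByRegion
        @RegionName VARCHAR(100)
    AS
    BEGIN
        SELECT CustomerId, CustomerName, RegionName
        FROM dbo.Customers
        WHERE RegionName = @RegionName;
    END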


Click on Import parameter, and in the Value field add the dynamic content coming from our ForEach activity.


Click on the ForEach iterator, then append .RegionName and click on OK.
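
The parameter value ends up as the expression below, where @item() is the current row from the lookup and RegionName is the column returned by the query:

    @item().RegionName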





Then go to the Sink tab and click on the + New button to create a new sink dataset where we will write the data.




Select Azure SQL Database and click on Continue.


Name the dataset, then select the linked service, select None for Import schema, and then click on OK.



Click on the Open button and then select the table in which we will write the data.


Click on Debug; the pipeline will read the data and write it to the selected destination.



Video Demo: How to Use Stored Procedure with Parameter in Copy Activity in Azure Data Factory


How to Load Multiple CSV Files to Multiple Tables According to File Name in Azure Data Factory

Issue: How to Load Multiple CSV Files to Multiple Tables According to File Name in Azure Data Factory

In this article, we are going to learn how to load multiple .csv files to different tables according to the file name in Azure Data Factory. Each file will be loaded to a separate table based on its file name. If the table already exists, it will be loaded without dropping and recreating it; if a table does not exist, a new table will be created and the data loaded from the CSV file.

How to create a pipeline:

Go to Azure Data Factory, click on the Author tab, then click on the + button to create a new pipeline.



Then find and drag the Get Metadata activity.


Click on the Get Metadata activity, then go to the Dataset field, and click on the + New button to create a new dataset.



Find and select Azure Blob Storage and then click on Continue.


Then select the file format, which is CSV, and then click on Continue.

Name the dataset, then select the linked service if you have already created one; otherwise, create a new linked service. Then select the file path from which you will read your files, select None for Import schema, and then click on OK.




In the Field list, add an argument and select Child items to get the list of files.
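
Child items returns an array of objects, one per file, each with a name and a type; the output looks roughly like this (the file names here are made-up examples):

    [
        { "name": "Asia.csv", "type": "File" },
        { "name": "Europe.csv", "type": "File" }
    ]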



Now find and drag the ForEach activity, then connect it with the Get Metadata activity.


Click on the ForEach activity, then go to the Settings tab, click on the Items field, and then click on Add dynamic content.



Click on the Get Metadata activity output to add the expression, then append .childItems after output and click on Finish.
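
Assuming the Get Metadata activity kept its default name, Get Metadata1, the Items expression looks like this:

    @activity('Get Metadata1').output.childItems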


Click on the pencil sign and it will bring you inside the ForEach activity.



Inside the ForEach activity, find and bring in the Copy data activity, then go to the Source tab and click on the + New button to create a new source dataset.


Select Azure Blob Storage and then click on Continue.


Then select CSV as the file format and then click on Continue.


Name the dataset, then select the linked service we used earlier, select the file path, select None for Import schema, and then click on OK.




Inside the Source tab, click on the Open button to open the dataset, create a dataset parameter, and then reference that parameter in the File name field of the file path.
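
Assuming we name the dataset parameter FileName (the name is our choice), the File name field of the dataset references it with the expression:

    @dataset().FileName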


Now, back in the Source tab of the Copy activity, click on the Value field of that parameter to add the dynamic content.
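
The value passed to the parameter is the name of the current file coming from the ForEach loop:

    @item().name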


Then go to the Sink tab and click on the + New button to create a new sink dataset.


Then select Azure SQL Database and click on Continue.


Name the sink dataset, then select the linked service if you have already created one, or create a new linked service; select None for Import schema and then click on OK.



Click on the Open button, then create a dataset parameter for the table name, which will come from our ForEach loop, and reference it in the table name of the dataset.
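
Assuming the parameter is named TableName (again, the name is our choice), the table name of the sink dataset is set to reference it:

    @dataset().TableName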


Now, back in the Sink tab of the Copy activity, click on the Value field of the table name parameter, click on Add dynamic content, add the expression, and click on Finish.
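
Since each table should be named after its file, one simple approach is to strip the .csv extension from the file name coming from the ForEach loop; this expression is a sketch of that assumption:

    @replace(item().name, '.csv', '')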



Then select None for the stored procedure, as we don't have one, and select Auto create table as the table option.



Then go back to the pipeline and click on Debug.




Video Demo: How to Load Multiple CSV Files to Multiple Tables According to File Name in Azure Data Factory.