How to Create CSV Files Dynamically in Azure Blob Storage from On-Prem SQL Server Tables in Azure Data Factory



In this article, we are going to learn how to create CSV files dynamically in Azure Blob Storage from on-premises SQL Server tables in Azure Data Factory. To read data from an on-premises source, you need an active Self-Hosted Integration Runtime configured in your Azure Data Factory. In this demo, we will copy the data from our on-premises SQL Server to Azure Blob Storage by using Azure Data Factory.

Create a Pipeline:

Open your Azure Data Factory Studio, go to the Author tab, click on Pipelines, then click on New pipeline to create a pipeline.


Find and drag the Lookup activity, go to the Settings tab, and click on the + New button to create a new source dataset.


Select SQL Server and click on continue.


Name your source dataset, then select the linked service if you have already created one; otherwise, create a new linked service. Select None for Import schema and click on OK.


Write the query to get the list of the tables, then uncheck the ''First row only'' box.
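As a sketch, assuming you want one CSV file per user table, a query like the following against the system catalog returns the list of tables (schema and name) for the ForEach loop to iterate over:

```sql
-- Return every user table in the database;
-- each row will drive one copy iteration.
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE';
```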



Find and drag the ForEach activity, then go to the Settings tab; in Items, click ''Add dynamic content'', select the Lookup activity's output, and then add .value at the end of the expression.
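Assuming the Lookup activity is named Lookup1 (adjust this to your activity's actual name), the Items expression would look like this:

```
@activity('Lookup1').output.value
```

The .value property holds the array of rows returned by the query, which is what the ForEach activity needs to iterate over.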



Go inside the ForEach loop by clicking the pencil icon on the activity, find and drag the Copy data activity, then go to the Source tab and click on the + New button to create a new source dataset.


Select SQL server, then click on continue.


Name your dataset, select the linked service, select None for Import schema, and click on OK.


Once our dataset is created, click on the Open button in the Source tab to open the dataset and create some parameters.



In the Parameters tab, click on the + sign and name the parameters.


In the Connection tab, provide the parameters that we have created as the table's schema and name.
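As a sketch, assuming the parameters are named SchemaName and TableName (use whatever names you created in the previous step), the table can be set dynamically in the Connection tab with expressions like:

```
Schema: @dataset().SchemaName
Table:  @dataset().TableName
```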


Back in the Copy activity's Source tab, provide the values for the dataset parameters.
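Assuming the lookup query returns columns named TABLE_SCHEMA and TABLE_NAME, the parameter values come from the current ForEach item:

```
SchemaName: @item().TABLE_SCHEMA
TableName:  @item().TABLE_NAME
```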


Next, go to the Sink tab and click on the + New button to create a new sink dataset.


Select Azure blob storage, then click on continue.


Select CSV as format and click on continue.


Name your sink dataset, select the linked service, provide the folder path where our data will be copied, check First row as header, select None for Import schema, and then click on OK.
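To get a separate CSV file per table rather than overwriting one file, a common approach (a sketch, assuming you add a FileName parameter to the sink dataset, just as we did for the source dataset) is to reference @dataset().FileName in the dataset's file path and pass it a value from the Copy activity's Sink tab such as:

```
@concat(item().TABLE_NAME, '.csv')
```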


Now we are all set; click on Debug to run the pipeline.



Video Demo: How to Create CSV Files Dynamically in Azure Blob Storage from On-Prem SQL Server Tables in Azure Data Factory


















