Parameterize Linked Services in Azure Data Factory | Loop Through Multiple Databases in Azure Data Factory Pipeline

In this article, we will learn how to parameterize linked services in Azure Data Factory and loop through multiple databases in an Azure Data Factory pipeline.

Create Linked Service:

Open Azure Data Factory Studio, go to the Manage tab, click on Linked services, then click the + New button to create a new linked service.


Find and select Azure SQL Database, then click Continue.


Name your linked service and select your Azure subscription. Then scroll all the way down to Parameters, click the + New button, name your parameter, select its type, and optionally provide a default value.


Once the parameter is created, select Enter manually and provide the fully qualified domain name. Then, under Database name, click "Add dynamic content" and select the parameter we just created. Select the authentication type, provide the user name and password, and click Create.
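The resulting parameterized linked service looks roughly like the JSON sketch below. The names here (the linked service name, the `DBName` parameter, and the server address) are illustrative assumptions; substitute your own. The key piece is the `@{linkedService().DBName}` expression, which injects the parameter into the connection string at runtime:

```json
{
  "name": "LS_AzureSqlDb_Param",
  "properties": {
    "type": "AzureSqlDatabase",
    "parameters": {
      "DBName": { "type": "String" }
    },
    "typeProperties": {
      "connectionString": "Server=tcp:myserver.database.windows.net,1433;Database=@{linkedService().DBName};User ID=sqladmin"
    }
  }
}
```

Because the database name is now an expression rather than a fixed value, one linked service can connect to any database on the server.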



Create Pipeline:

Go to the Author tab, click on Pipelines, then click New pipeline to create a new pipeline.


Find and drag the Lookup activity onto the canvas, go to the Settings tab, and click the + New button to create a new source dataset.


Select Azure SQL Database, then click Continue.


Name your dataset, select the linked service we created earlier, choose None for Import schema, then click OK.


Back in the Settings tab, select Query and write a query that returns the database name and file name for each database, then uncheck "First row only" so the Lookup returns all rows.
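As an illustration, the Lookup activity's settings might look like the sketch below, reading the list from a control table. The dataset name, table name, and the `DBName`/`filename` column names are assumptions; the column names just have to match what the later dynamic-content expressions reference:

```json
{
  "name": "LookupDatabases",
  "type": "Lookup",
  "typeProperties": {
    "source": {
      "type": "AzureSqlSource",
      "sqlReaderQuery": "SELECT DBName, filename FROM dbo.DatabaseList"
    },
    "dataset": {
      "referenceName": "DS_Lookup",
      "type": "DatasetReference"
    },
    "firstRowOnly": false
  }
}
```

With "First row only" unchecked, the activity's output exposes every row under `output.value`, which is what the ForEach loop will iterate over.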


Once we are done with the Lookup activity, find and drag the ForEach activity and connect it to the Lookup activity. Go to the Settings tab and, under Items, click "Add dynamic content". Select the Lookup's output and append ".value" at the end.
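In JSON, the ForEach wiring looks roughly like this (activity names are assumptions carried over from the earlier sketch):

```json
{
  "name": "ForEachDatabase",
  "type": "ForEach",
  "dependsOn": [
    { "activity": "LookupDatabases", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "items": {
      "value": "@activity('LookupDatabases').output.value",
      "type": "Expression"
    },
    "activities": [ ]
  }
}
```

Each iteration of the loop then exposes the current row as `@item()`, so the activities inside can reach the row's columns, e.g. `@item().DBName`.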


Go inside the ForEach loop, find and drag the Copy data activity, go to the Source tab, and click the + New button to create a new source dataset.


Select Azure SQL Database, then click Continue.


Name your dataset, select the parameterized linked service, choose None for Import schema, then click OK.


In the Source tab, click the Open button, go to the Parameters tab, and click the + New button to create a new dataset parameter. Then go to the Connection tab, map this parameter to the linked service parameter that already appears there, and provide the table name.


Go back to the Source tab, where the dataset parameter now appears. Click "Add dynamic content" in its value, select the ForEach item, and append ".DBName" at the end.
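The source dataset then passes its own parameter through to the linked service, roughly as in this sketch (dataset name, parameter name, and table name are assumptions):

```json
{
  "name": "DS_Source_Param",
  "properties": {
    "type": "AzureSqlTable",
    "linkedServiceName": {
      "referenceName": "LS_AzureSqlDb_Param",
      "type": "LinkedServiceReference",
      "parameters": {
        "DBName": { "value": "@dataset().DBName", "type": "Expression" }
      }
    },
    "parameters": {
      "DBName": { "type": "String" }
    },
    "typeProperties": {
      "schema": "dbo",
      "table": "Customer"
    }
  }
}
```

Inside the pipeline, the Copy activity's Source tab supplies `@item().DBName` as the value of this dataset parameter, so each loop iteration copies from a different database.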


Once we are done with the source, go to the Sink tab and click the + New button to create a new sink dataset where we will write our files.


Select Azure Blob Storage, then click Continue.


Name your dataset, select the linked service that points to your blob storage, provide the folder path, check "First row as header", select Import schema, then click OK.


In the Sink tab, click the Open button, go to the Parameters tab, and click the + New button to create a parameter for the file name. Then go to the Connection tab and link the parameter to the file name field.


Next, go back to the Sink tab, where our parameter will appear. Provide its value from the ForEach item: click "Add dynamic content", select the ForEach item, and append ".filename" at the end. Finally, go back to the pipeline and click Debug to run it.
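The sink dataset mirrors the source pattern: a `filename` parameter that feeds the blob file name. A rough sketch (dataset name, linked service name, and container are assumptions):

```json
{
  "name": "DS_Sink_Blob",
  "properties": {
    "type": "DelimitedText",
    "linkedServiceName": {
      "referenceName": "LS_BlobStorage",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "filename": { "type": "String" }
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "fileName": { "value": "@dataset().filename", "type": "Expression" },
        "container": "output"
      },
      "firstRowAsHeader": true
    }
  }
}
```

The Copy activity's Sink tab passes `@item().filename` into this parameter, so each iteration writes the current database's data to its own file.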



Video Demo: Parameterize Linked Services in Azure Data Factory - Loop Through Multiple Databases in ADF Pipeline
