How to Create My First Pipeline in Azure Data Factory - Load CSV File to Azure SQL Table - ADF Tutorial

Issue: How to Create My First Pipeline in Azure Data Factory - Load CSV File to Azure SQL Table.

In this article, we are going to learn how to create our first pipeline in Azure Data Factory and load a CSV file into an Azure SQL table. Let's start our demonstration. First of all, we have to create a new Azure Data Factory: go to the Data factories tab and click the + New button to create a new data factory.



Then select the subscription, select the resource group, select the region, give the data factory a name, and click the Review + create button. Skip the Git configuration and click Create.
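If you prefer to script this step rather than click through the portal, here is a minimal sketch using the azure-mgmt-datafactory Python SDK. The subscription ID, resource group, region, and factory name are placeholders you would replace with your own values.

# Minimal sketch: create a data factory with the azure-mgmt-datafactory SDK.
# Subscription ID, resource group, region, and factory name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import Factory

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Equivalent of filling in the Create Data Factory form and clicking Create.
factory = adf_client.factories.create_or_update(
    "my-resource-group",         # existing resource group
    "my-first-data-factory",     # factory name must be globally unique
    Factory(location="eastus"),  # region
)
print(factory.provisioning_state)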



Next, we have to create a new storage account. Go to the Storage accounts tab and click the + New button to create a new storage account.



Select the subscription, select the resource group, name the storage account, select the region, click the Review + create button, and then click Create.
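This step can also be scripted with the azure-mgmt-storage Python SDK. The sketch below assumes a Standard_LRS, StorageV2 account, and all names are placeholders.

# Minimal sketch: create a storage account with azure-mgmt-storage.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Sku, StorageAccountCreateParameters

storage_client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# begin_create returns a poller; result() blocks until provisioning finishes.
poller = storage_client.storage_accounts.begin_create(
    "my-resource-group",
    "myadfdemostorage",  # must be globally unique, lowercase letters/digits only
    StorageAccountCreateParameters(
        sku=Sku(name="Standard_LRS"),
        kind="StorageV2",
        location="eastus",
    ),
)
print(poller.result().name)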



Next, create some containers in the newly created storage account. Go to the storage account, click Containers, then click the + Container button to create a new container.



Then name the container and click Create, and repeat for a second container. In my case, I created two containers and named them "source" and "destination" (container names must be lowercase).
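For reference, the two containers can also be created with the azure-storage-blob Python package. The connection string is a placeholder you can copy from the storage account's Access keys blade.

# Minimal sketch: create the source and destination containers.
from azure.storage.blob import BlobServiceClient

blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")

for name in ("source", "destination"):  # container names must be lowercase
    blob_service.create_container(name)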



Once our containers are created, go to the source container, click the Upload button, and select the file or files you want to upload.
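The upload can likewise be done in code. The file name customers.csv below is only an example, as is the connection string.

# Minimal sketch: upload a local CSV into the source container.
from azure.storage.blob import BlobServiceClient

blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container_client = blob_service.get_container_client("source")

with open("customers.csv", "rb") as data:
    # overwrite=True replaces the blob if it already exists
    container_client.upload_blob(name="customers.csv", data=data, overwrite=True)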



Next, create a linked service that we can reuse across different pipelines. Open Azure Data Factory, go to the Manage tab, click Linked services, and then click the + New button to create a new linked service.



Then find and select Azure Blob Storage and click Continue.



Then name the linked service, select the Azure subscription, select the storage account, and click Create.
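Under the hood this step produces a linked service definition, and here is a sketch of the same thing with the azure-mgmt-datafactory SDK. Note that the portal wizard resolves the storage account from your subscription, while this sketch assumes a connection string instead; the name LS_AzureBlobStorage is my own choice.

# Minimal sketch: the blob storage linked service defined in code.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLinkedService,
    LinkedServiceResource,
    SecureString,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

blob_ls = LinkedServiceResource(
    properties=AzureBlobStorageLinkedService(
        connection_string=SecureString(value="<storage-connection-string>")
    )
)
adf_client.linked_services.create_or_update(
    "my-resource-group", "my-first-data-factory", "LS_AzureBlobStorage", blob_ls
)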



Next, we have to create another linked service for our pipelines, this time for the SQL side. Click + New, find and select Azure SQL Database, and click Continue.



Then give the linked service a name, select the Azure subscription, select the server name and database name, choose the authentication type, provide the user name and password, then test the connection and click Create.
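As with the blob linked service, here is a code sketch of the equivalent definition. The server, database, user, and password in the connection string are placeholders, and LS_AzureSqlDatabase is my own name for it.

# Minimal sketch: the Azure SQL Database linked service defined in code.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureSqlDatabaseLinkedService,
    LinkedServiceResource,
    SecureString,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

sql_ls = LinkedServiceResource(
    properties=AzureSqlDatabaseLinkedService(
        connection_string=SecureString(
            value="Server=tcp:<server>.database.windows.net,1433;"
                  "Database=<database>;User ID=<user>;Password=<password>;"
        )
    )
)
adf_client.linked_services.create_or_update(
    "my-resource-group", "my-first-data-factory", "LS_AzureSqlDatabase", sql_ls
)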



Next, create a new pipeline. Go to the Author tab, click Pipelines, then click the + sign to create a new pipeline.




Then name the pipeline, find the Copy data activity, and drag it onto the design canvas. Click the Copy data activity, go to the Source tab, and click the + New button to create a new source dataset.



Then select Azure Blob Storage and click Continue.



Then select the file format; in my case, I will go with DelimitedText (CSV), then click Continue.



Then name the dataset, select the linked service, provide the path and file name, select None for Import schema, and click OK.
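For comparison, here is a sketch of the same source dataset defined in code. The dataset name DS_SourceCsv and the file customers.csv match the earlier sketches, and I'm assuming a comma delimiter with a header row; adjust these to your file.

# Minimal sketch: the delimited-text (CSV) source dataset defined in code.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLocation,
    DatasetResource,
    DelimitedTextDataset,
    LinkedServiceReference,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

source_ds = DatasetResource(
    properties=DelimitedTextDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="LS_AzureBlobStorage"
        ),
        location=AzureBlobStorageLocation(
            container="source", file_name="customers.csv"
        ),
        column_delimiter=",",
        first_row_as_header=True,
    )
)
adf_client.datasets.create_or_update(
    "my-resource-group", "my-first-data-factory", "DS_SourceCsv", source_ds
)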



Next, go to the Sink tab and click the + New button to create a new sink dataset.



Find and select Azure SQL Database, then click Continue.



Then give the dataset a name, select the linked service we created before, select the table name, select None for Import schema, and click OK.
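And here is a sketch of the sink dataset in code. The table dbo.Customers is a placeholder for your target table, and schema_type_properties_schema is the SDK's generated name for the table schema property.

# Minimal sketch: the Azure SQL sink dataset defined in code.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureSqlTableDataset,
    DatasetResource,
    LinkedServiceReference,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

sink_ds = DatasetResource(
    properties=AzureSqlTableDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="LS_AzureSqlDatabase"
        ),
        schema_type_properties_schema="dbo",  # table schema
        table="Customers",                    # target table name
    )
)
adf_client.datasets.create_or_update(
    "my-resource-group", "my-first-data-factory", "DS_SinkSqlTable", sink_ds
)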



Next, as our pipeline is now built, we have to debug it; the Copy data activity will read our file from the source container and write it to the Azure SQL table. Click the Debug button and it will show you the results.
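To round out the code sketches, here is the pipeline with its Copy activity created and run programmatically. Debug runs are a Data Factory Studio UI feature; create_run below starts a regular (triggered) run instead. All names match the sketches above and are my own placeholders.

# Minimal sketch: the pipeline with a Copy activity, created and run in code.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureSqlSink,
    CopyActivity,
    DatasetReference,
    DelimitedTextSource,
    PipelineResource,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

copy_activity = CopyActivity(
    name="CopyCsvToSql",
    inputs=[DatasetReference(type="DatasetReference", reference_name="DS_SourceCsv")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="DS_SinkSqlTable")],
    source=DelimitedTextSource(),  # reads the CSV source dataset
    sink=AzureSqlSink(),           # writes to the Azure SQL table dataset
)
adf_client.pipelines.create_or_update(
    "my-resource-group", "my-first-data-factory", "MyFirstPipeline",
    PipelineResource(activities=[copy_activity]),
)

# Start a run and print its ID; monitor it under Monitor > Pipeline runs.
run = adf_client.pipelines.create_run(
    "my-resource-group", "my-first-data-factory", "MyFirstPipeline"
)
print(run.run_id)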


Video Demo: How to Create My First Pipeline in Azure Data Factory - Load CSV File to Azure SQL Table






