How to Copy Data Securely from Azure Blob Storage to an Azure SQL Database by Using Private Endpoints



In this article, we are going to learn how to copy data securely from Azure Blob Storage to an Azure SQL database by using private endpoints. Let's start our demonstration.



How to Create an Integration Runtime with a Managed Virtual Network:


To create an Integration Runtime, go to Azure Data Factory Studio, open the Manage tab, click on Integration runtimes, and then click on the + New button.


Select Azure, Self-Hosted, then click on Continue.



Select Azure and then click on Continue.


Name your Integration Runtime, enable the Virtual network configuration option, then select your region and click on Create.
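Under the hood, the studio stores this runtime as a JSON definition. The sketch below builds that JSON as a Python dict, roughly matching what the studio's code view shows; the runtime name "ir-managed-vnet" and the region are placeholder assumptions.

```python
import json

# Hypothetical sketch of an Azure integration runtime definition with a
# managed virtual network enabled. Name and region are placeholders.
integration_runtime = {
    "name": "ir-managed-vnet",
    "properties": {
        "type": "Managed",  # an Azure (managed) integration runtime
        "typeProperties": {
            "computeProperties": {
                "location": "East US",  # the region selected during creation
            }
        },
        # Enabling "Virtual network configuration" in the studio attaches the
        # runtime to the factory's default managed virtual network.
        "managedVirtualNetwork": {
            "type": "ManagedVirtualNetworkReference",
            "referenceName": "default",
        },
    },
}

print(json.dumps(integration_runtime, indent=2))
```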



Once our Integration Runtime is created, let's create a managed private endpoint.

How to Create a Managed Private Endpoint:

Open Azure Data Factory Studio, go to the Manage tab, click on Managed private endpoints, and then click on the + New button to create a new managed private endpoint.



Select Azure Blob Storage and then click on Continue.


Name your managed private endpoint, choose the account selection method, select your Azure subscription and your storage account, and then click on Create.



Once our managed private endpoint is created, go to the storage account, click on Networking, select the private endpoint connection we just created, and then click on Approve.
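The managed private endpoint also has a JSON definition behind it. The sketch below builds it as a Python dict; the endpoint name and the resource ID segments in angle brackets are placeholders, and the connection status stays Pending until it is approved on the storage account's Networking blade.

```python
# Hypothetical sketch of a managed private endpoint definition targeting the
# Blob service of a storage account. All names and IDs are placeholders.
managed_private_endpoint = {
    "name": "mpe-blob-storage",
    "properties": {
        # groupId "blob" means the endpoint targets the Blob sub-resource
        "groupId": "blob",
        "privateLinkResourceId": (
            "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
            "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
        ),
        # Remains "Pending" until the connection is approved on the
        # storage account side.
        "connectionState": {"status": "Pending"},
    },
}
```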



Next, create an empty table in your Azure SQL database where we will write the data. Then go to the Author tab, click on the + sign to create a pipeline, find and drag the Copy data activity onto the canvas, go to the Source tab, and click on + New to create a new source dataset.


Then select Azure Blob Storage and click on Continue.


Select the CSV (DelimitedText) file format and then click on Continue.


Name your dataset, select the linked service, provide the path from which we will read the data, check First row as header, select None for Import schema, and then click on OK.
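The resulting source dataset definition looks roughly like the sketch below, built as a Python dict; the dataset name, linked service name, container, and file name are all placeholder assumptions.

```python
# Hypothetical sketch of a DelimitedText (CSV) source dataset on Azure Blob
# Storage. Dataset, linked service, container, and file names are placeholders.
source_dataset = {
    "name": "SourceBlobCsv",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLS",  # placeholder linked service
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "input",            # placeholder container
                "fileName": "customers.csv",     # placeholder file path
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": True,            # "First row as header" checkbox
        },
        "schema": [],                            # "Import schema: None"
    },
}
```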



Now, go to the sink tab and then click on the + New button to create a new sink dataset.



Select Azure SQL database and then click on continue.



Name your dataset, select the linked service, select the table we created for writing the data, select None for Import schema, and then click on OK.
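The sink dataset definition follows the same pattern, this time pointing at the SQL table. In the sketch below, the dataset name, linked service name, and table name are placeholder assumptions.

```python
# Hypothetical sketch of an Azure SQL table sink dataset. The linked service
# and table names are placeholders; the table is the empty one created earlier.
sink_dataset = {
    "name": "SinkSqlTable",
    "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": {
            "referenceName": "AzureSqlDatabaseLS",  # placeholder linked service
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            "schema": "dbo",
            "table": "Customers",  # placeholder name for the empty target table
        },
        "schema": [],              # "Import schema: None"
    },
}
```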


Now our pipeline is ready. Click on Debug to copy the data from Azure Blob Storage and write it to the Azure SQL database.
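Putting it together, the pipeline we built in the designer reduces to one Copy activity wiring the source dataset to the sink dataset. The sketch below shows that definition as a Python dict; the pipeline, activity, and dataset names are placeholder assumptions.

```python
# Hypothetical sketch of the pipeline: a single Copy activity reading the
# Blob CSV dataset and writing to the SQL table dataset. Names are placeholders.
pipeline = {
    "name": "CopyBlobToSql",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToSql",
                "type": "Copy",
                "inputs": [
                    {"referenceName": "SourceBlobCsv", "type": "DatasetReference"}
                ],
                "outputs": [
                    {"referenceName": "SinkSqlTable", "type": "DatasetReference"}
                ],
                "typeProperties": {
                    "source": {"type": "DelimitedTextSource"},  # CSV reader
                    "sink": {"type": "AzureSqlSink"},           # SQL writer
                },
            }
        ]
    },
}
```

Because the datasets' linked services resolve through the managed-VNet integration runtime and the approved private endpoint, the copy traffic never traverses the public internet.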


 


Video Demo: How to Copy Data Securely from Azure Blob Storage to an Azure SQL Database by Using Private Endpoints




