How to Log Pipeline Audit Data for Success and Failure in Azure Data Factory - ADF Tutorial 2021

 Issue: How to Log Pipeline Audit Data for Success and Failure in Azure Data Factory

In this article, we are going to learn how to log pipeline audit data for success and failure in Azure Data Factory. Let's start our demonstration.

Open Azure Data Factory Studio, go to the Author tab, and click on the + sign to create a new pipeline. Find and bring in the Copy data activity, go to its Source tab, and click on the + New button to create a new source dataset.


Select Azure SQL database, then click on continue.


Name your dataset, then create a new linked service: name the linked service, select your Azure subscription, server name, database name, and authentication type, provide the user name and password, and click on Create. Then select the table from which we will read the data and click on OK.
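
If you want to follow along with a concrete table, here is a minimal sketch of what the source table could look like; the table name dbo.TotalSale and its columns are purely illustrative assumptions, so substitute your own table:

-- Hypothetical source table for this walkthrough; adjust names and types to your own data.
CREATE TABLE dbo.TotalSale (
    SaleId      INT,
    SaleDate    DATE,
    Region      NVARCHAR(50),
    SaleAmount  DECIMAL(18, 2)
);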


Once we are done with our source tab, go to the sink tab, click on the + New button to create a sink dataset.


Select Azure blob storage then click on continue.


Select the file format (DelimitedText/CSV), then click on continue.


Name your dataset, select the Azure Blob storage linked service, select the folder path where we will write our file, check First row as header, set Import schema to From connection/store, and click on OK.



In the Sink tab, click on the Open button, then provide the output file name.


Next, find and bring in another Copy data activity, then connect it to the first copy activity using the On Failure dependency.


Click on the activity, go to the source tab, click on the + New button to create a new Source dataset.


Select Azure SQL database, then click on continue.


Name your dataset, select the linked service, select None for Import schema, then click on OK.


In the Source tab, select Query, then click on Add dynamic content and add the expression below. The expression refers to the first copy activity by name (CopyTotalSale in this example), so make sure it matches the name of your copy activity; the replace() call strips single quotes from the error message so they do not break the generated SQL string.

SELECT '@{pipeline().DataFactory}' as DataFactory_Name, '@{pipeline().Pipeline}' as Pipeline_Name, '@{pipeline().RunId}' as RunId, 'TotalSaleTable' as Source, 'TotalSale CSV File' as Destination, '@{activity('CopyTotalSale').output.executionDetails[0].status}' as Execution_Status, '@{activity('CopyTotalSale').output.errors[0].Code}' as ErrorCode, '@{replace(activity('CopyTotalSale').output.errors[0].Message,'''','')}' as ErrorDescription, '@{utcnow()}' as ErrorLoggedTime
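
To make this concrete, here is a hedged sketch of what the expression might resolve to at run time, assuming the first copy activity is named CopyTotalSale and the run failed with a connection error; every value below is illustrative, not actual output:

-- Illustrative resolution of the dynamic expression for a failed run (all values are hypothetical):
SELECT 'MyDataFactory'                         as DataFactory_Name,
       'PL_LogAuditData'                       as Pipeline_Name,
       'a1b2c3d4-5678-90ab-cdef-112233445566'  as RunId,
       'TotalSaleTable'                        as Source,
       'TotalSale CSV File'                    as Destination,
       'Failed'                                as Execution_Status,
       'UserErrorSqlConnectionFailed'          as ErrorCode,
       'Cannot connect to the SQL database'    as ErrorDescription,
       '2021-05-14T18:32:10Z'                  as ErrorLoggedTime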


Go to the sink tab, then click on the + New button to create a new sink dataset.


Select Azure blob storage then click on continue.


Select the file format, then click on continue.


Name your dataset, select the linked service, select the output folder path, check First row as header, set Import schema to From connection/store, and click on OK.


In the Sink tab, click on the Open button and provide the output file name.


Once we are done with the failure part, bring in another Copy data activity and connect it to the first copy activity using the On Success dependency. Then click on the activity, go to the Source tab, and click on the + New button to create a new source dataset.


Select Azure SQL database, then click on continue.


Name your dataset, select the linked service, select none for import schema, then click on Ok.


In the Source tab, select Query, then click on Add dynamic content and provide the query below.

SELECT '@{pipeline().DataFactory}' as DataFactory_Name, '@{pipeline().Pipeline}' as Pipeline_Name, '@{pipeline().RunId}' as RunId, 'TotalSaleTable' as Source, 'TotalSale CSV File' as Destination, '@{activity('CopyTotalSale').output.rowsRead}' as RowsRead, '@{activity('CopyTotalSale').output.rowsCopied}' as RowsCopied, '@{activity('CopyTotalSale').output.copyDuration}' as copyDuration, '@{utcnow()}' as ErrorLoggedTime
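
As with the failure branch, here is an illustrative sketch of what this expression might resolve to for a successful run (all values are hypothetical):

-- Illustrative resolution of the dynamic expression for a successful run (all values are hypothetical):
SELECT 'MyDataFactory'                         as DataFactory_Name,
       'PL_LogAuditData'                       as Pipeline_Name,
       'a1b2c3d4-5678-90ab-cdef-112233445566'  as RunId,
       'TotalSaleTable'                        as Source,
       'TotalSale CSV File'                    as Destination,
       '1000'                                  as RowsRead,
       '1000'                                  as RowsCopied,
       '12'                                    as copyDuration,
       '2021-05-14T18:32:10Z'                  as ErrorLoggedTime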


Go to the Sink tab, then click on the + New button to create a new sink dataset.


Select Azure Blob storage, then click on continue.


Select output file format, then click on continue.


Name your dataset, select the linked service, select the output folder path, check First row as header, set Import schema to From connection/store, and click on OK. In the Sink tab, click on the Open button and provide the output file name.


Now that we are done with the failure and success parts, click on the Debug button to execute the pipeline. If the pipeline fails, the failure branch will run and create the error log file; if it completes successfully, the success branch will run and write the success log.
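
For reference, a successful run should produce a success log file whose header comes from the column aliases in the query above; the row below is a hypothetical example of its contents:

DataFactory_Name,Pipeline_Name,RunId,Source,Destination,RowsRead,RowsCopied,copyDuration,ErrorLoggedTime
MyDataFactory,PL_LogAuditData,a1b2c3d4-5678-90ab-cdef-112233445566,TotalSaleTable,TotalSale CSV File,1000,1000,12,2021-05-14T18:32:10Z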

Video Demo: How to Log Pipeline Audit Data for Success and Failure in Azure Data Factory