Issue: How to Log Pipeline Audit Data for Success and Failure in Azure Data Factory
In this article, we are going to learn how to log pipeline audit data for success and failure in Azure Data Factory. Let's start our demonstration.
Open Azure Data Factory Studio, go to the Author tab, and click on the + sign to create a new pipeline. Find and bring in the Copy data activity, go to the Source tab, and click on the + New button to create a new source dataset.
Name your dataset, then create a new linked service: name the linked service, select your Azure subscription, server name, database name, and authentication type, provide the user name and password, and click on Create. Select the table from which we will read the data, then click on OK.
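For reference, the resulting linked service in the JSON view looks roughly like the sketch below, assuming SQL authentication; the name, server, database, and credentials are placeholders. In practice the password is better stored as an Azure Key Vault secret rather than inline.
{
    "name": "AzureSqlLinkedService",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": "Server=tcp:<your-server>.database.windows.net,1433;Database=<your-database>;User ID=<user>;Password=<password>;"
        }
    }
}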
Once we are done with the source, go to the Sink tab and click on the + New button to create a sink dataset.
Name your dataset, select the Azure Blob Storage linked service, select the folder path where we will write our file, check First row as header, import the schema from connection/store, and click on OK.
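The sink is a delimited-text dataset on Azure Blob Storage; in the JSON view it looks roughly like this (dataset, linked service, container, and folder names are illustrative):
{
    "name": "TotalSaleCsv",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureBlobStorage1",
            "type": "LinkedServiceReference"
        },
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "output",
                "folderPath": "totalsale"
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}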
In the Sink tab, click on the Open button and provide the output file name.
Next, find and bring in another Copy data activity, then connect it to the first one on Failure.
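If you switch to the pipeline's JSON view, connecting on Failure simply adds a dependsOn entry to this second activity. A minimal sketch, assuming the first activity is named CopyTotalSale (as referenced in the queries below) and the logging activity is named LogCopyFailure:
{
    "name": "LogCopyFailure",
    "type": "Copy",
    "dependsOn": [
        {
            "activity": "CopyTotalSale",
            "dependencyConditions": [ "Failed" ]
        }
    ]
}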
In the Source tab of this new activity, select a source dataset pointing to the same Azure SQL database, choose Query, then click on Add dynamic content and add the expression below.
SELECT '@{pipeline().DataFactory}' as DataFactory_Name,
'@{pipeline().Pipeline}' as Pipeline_Name,
'@{pipeline().RunId}' as RunId,
'TotalSaleTable' as Source,
'TotalSale CSV File' as Destination,
'@{activity('CopyTotalSale').output.executionDetails[0].status}' as Execution_Status,
'@{activity('CopyTotalSale').output.errors[0].Code}' as ErrorCode,
'@{replace(activity('CopyTotalSale').output.errors[0].Message,'''','')}' as ErrorDescription,
'@{utcnow()}' as ErrorLoggedTime
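Data Factory evaluates each @{...} token before the query is sent to SQL, substituting the value as a literal string; the replace() call strips single quotes from the error message so they cannot break the SQL literal. At run time the statement that actually executes looks roughly like the following (all bracketed values are illustrative placeholders):
SELECT '<data factory name>'   as DataFactory_Name,
       '<pipeline name>'       as Pipeline_Name,
       '<run id GUID>'         as RunId,
       'TotalSaleTable'        as Source,
       'TotalSale CSV File'    as Destination,
       'Failed'                as Execution_Status,
       '<error code>'          as ErrorCode,
       '<error message>'       as ErrorDescription,
       '<current UTC time>'    as ErrorLoggedTime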
Once the failure query is in place, go to the Sink tab and click on the + New button to create a sink dataset for the failure log. Name your dataset, select the linked service, select the output file path, check First row as header, import the schema from connection/store, and click on OK. In the Sink tab, click on the Open button and provide the output file name.
Once we are done with the failure part, bring in another Copy data activity and connect it to the first one on Success. Then click on the activity, go to the Source tab, and click on the + New button to create a new source dataset.
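The success branch uses the same dependency mechanism as the failure branch; in the JSON view only the dependency condition changes (the logging activity name is again illustrative):
{
    "name": "LogCopySuccess",
    "type": "Copy",
    "dependsOn": [
        {
            "activity": "CopyTotalSale",
            "dependencyConditions": [ "Succeeded" ]
        }
    ]
}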
In the Source tab, select Query, then click on Add dynamic content and provide the query below.
SELECT '@{pipeline().DataFactory}' as DataFactory_Name,
'@{pipeline().Pipeline}' as Pipeline_Name,
'@{pipeline().RunId}' as RunId,
'TotalSaleTable' as Source,
'TotalSale CSV File' as Destination,
'@{activity('CopyTotalSale').output.rowsRead}' as RowsRead,
'@{activity('CopyTotalSale').output.rowsCopied}' as RowsCopied,
'@{activity('CopyTotalSale').output.copyDuration}' as copyDuration,
'@{utcnow()}' as ErrorLoggedTime
Name your dataset, select the linked service, select the output folder path, check First row as header, import the schema from connection/store, and click on OK. In the Sink tab, click on the Open button and provide the output file name.
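At this point the pipeline contains three Copy data activities: the main copy plus the two logging copies hanging off its Success and Failure outputs. As a rough sketch, the activities section of the pipeline JSON has the following shape (the pipeline and logging-activity names are illustrative, and source, sink, and mapping settings are omitted for brevity):
{
    "name": "AuditLoggingPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyTotalSale",
                "type": "Copy",
                "dependsOn": []
            },
            {
                "name": "LogCopyFailure",
                "type": "Copy",
                "dependsOn": [
                    { "activity": "CopyTotalSale", "dependencyConditions": [ "Failed" ] }
                ]
            },
            {
                "name": "LogCopySuccess",
                "type": "Copy",
                "dependsOn": [
                    { "activity": "CopyTotalSale", "dependencyConditions": [ "Succeeded" ] }
                ]
            }
        ]
    }
}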