How to Log Pipeline Audit Data for Success and Failure in Azure Data Factory - ADF Tutorial 2021

 Issue: How to Log Pipeline Audit Data for Success and Failure in Azure Data Factory

In this article, we are going to learn how to log pipeline audit data for success and failure in Azure Data Factory. Let's start our demonstration.

Open Azure Data Factory Studio, go to the Author tab, and click on the + sign to create a new pipeline. Find and drag in the Copy data activity, go to its source tab, and click on the + New button to create a new source dataset.


Select Azure SQL database, then click on continue.


Name your dataset, then create a new linked service: name the linked service, select your Azure subscription, server name, database name, and authentication type, provide the user name and password, and click on Create. Then select the table from which we will read the data and click on Ok.


Once we are done with our source tab, go to the sink tab, click on the + New button to create a sink dataset.


Select Azure blob storage then click on continue.


Select the file format, then click on continue.


Name your dataset, select the linked service to Azure Blob storage, select the folder path where we will write our file, select the first row as header, import schema from connection/store, and click on Ok.



In the sink tab, click on the Open button, then provide the output file name.


Next, find and drag in another copy data activity, then connect it to the first one on Failure.
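For reference, this Failure connection shows up in the pipeline's JSON definition roughly as below. The activity name CopyTotalSale comes from the expressions used later in this demo; CopyFailureLog is an assumed name for the logging copy activity:

```json
{
  "name": "CopyFailureLog",
  "type": "Copy",
  "dependsOn": [
    {
      "activity": "CopyTotalSale",
      "dependencyConditions": [ "Failed" ]
    }
  ]
}
```

This is the same dependency you create visually by dragging the red Failure connector between the two activities.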


Click on the activity, go to the source tab, click on the + New button to create a new Source dataset.


Select Azure SQL database, then click on continue.


Name your dataset, select the linked service, select None for import schema, then click on Ok.


In the source tab, click on Query, then click on Add dynamic content and add the expression below.

SELECT '@{pipeline().DataFactory}' as DataFactory_Name,
       '@{pipeline().Pipeline}' as Pipeline_Name,
       '@{pipeline().RunId}' as RunId,
       'TotalSaleTable' as Source,
       'TotalSale CSV File' as Destination,
       '@{activity('CopyTotalSale').output.executionDetails[0].status}' as Execution_Status,
       '@{activity('CopyTotalSale').output.errors[0].Code}' as ErrorCode,
       '@{replace(activity('CopyTotalSale').output.errors[0].Message,'''','')}' as ErrorDescription,
       '@{utcnow()}' as ErrorLoggedTime
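When the pipeline runs, ADF resolves the @{...} expressions before the query is sent to the database, so SQL Server receives a plain SELECT of literal values, roughly like this (all values below are illustrative placeholders, not real run output):

```sql
SELECT 'MyDataFactory'         as DataFactory_Name,
       'AuditDemoPipeline'     as Pipeline_Name,
       'f2c5a1b0-example-runid' as RunId,
       'TotalSaleTable'        as Source,
       'TotalSale CSV File'    as Destination,
       'Failed'                as Execution_Status,
       '2200'                  as ErrorCode,
       'Failure happened on Sink side'  as ErrorDescription,
       '2021-06-01T09:15:00Z'  as ErrorLoggedTime
```

This is why the replace() call strips single quotes from the error message: an unescaped quote inside the message would otherwise break the resolved SQL string literal.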


Go to the sink tab, then click on the + New button to create a new sink dataset.


Select Azure blob storage then click on continue.


Select the file format, then click on continue.


Name your dataset, select the linked service, select the output file path, select the first row as header, import schema from connection/store, and click on Ok.


In the sink tab, click on the Open button and provide the output file name.


Once we are done with the failure part, bring in another copy data activity and connect it to the first one on Success. Then click on the activity, go to the source tab, and click on the + New button to create a new source dataset.
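The Success connection mirrors the Failure one in the pipeline JSON; only the dependency condition changes (activity name assumed as before):

```json
"dependsOn": [
  {
    "activity": "CopyTotalSale",
    "dependencyConditions": [ "Succeeded" ]
  }
]
```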


Select Azure SQL database, then click on continue.


Name your dataset, select the linked service, select none for import schema, then click on Ok.


In the source tab, select Query, then click on Add dynamic content and provide the query below.

SELECT '@{pipeline().DataFactory}' as DataFactory_Name,
       '@{pipeline().Pipeline}' as Pipeline_Name,
       '@{pipeline().RunId}' as RunId,
       'TotalSaleTable' as Source,
       'TotalSale CSV File' as Destination,
       '@{activity('CopyTotalSale').output.rowsRead}' as RowsRead,
       '@{activity('CopyTotalSale').output.rowsCopied}' as RowsCopied,
       '@{activity('CopyTotalSale').output.copyDuration}' as copyDuration,
       '@{utcnow()}' as ErrorLoggedTime


Go to the sink tab, click on the + New button to create a new sink dataset.


Select Azure Blob storage, then click on continue.


Select output file format, then click on continue.


Name your dataset, select the linked service, select the output folder path, select the first row as header, import schema from connection/store, and click on Ok. In the sink tab, click on the Open button and provide the output file name.


Now we are done with both the failure and success parts. Click on the Debug button to execute the pipeline: if the pipeline fails, the failure branch will run and create the error log file, and if the pipeline completes successfully, the success branch will run and write the success logs.
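On a successful run, the log file written by the success branch is a one-row CSV whose header comes from the column aliases in the success query; the values below are illustrative only:

```csv
DataFactory_Name,Pipeline_Name,RunId,Source,Destination,RowsRead,RowsCopied,copyDuration,ErrorLoggedTime
MyDataFactory,AuditDemoPipeline,f2c5a1b0-example-runid,TotalSaleTable,TotalSale CSV File,100,100,8,2021-06-01T09:15:00Z
```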

Video Demo: How to Log Pipeline Audit Data for Success and Failure in Azure Data Factory










 










