How to perform CDC from GCP MySQL Instance to BigQuery by using DataStream in Google Cloud Platform

Topic: How to perform CDC from GCP MySQL Instance to BigQuery by using DataStream in Google Cloud Platform


How to perform CDC from GCP MySQL Instance to BigQuery by using DataStream in Google Cloud Platform | GCP Cloud SQL Tutorial 2022. In this article we are going to learn how to perform CDC from a GCP MySQL instance to BigQuery by using DataStream in Google Cloud Platform.



Video Demo: How to perform CDC from GCP MySQL Instance to BigQuery by using DataStream in Google Cloud Platform

Cross database query between Google SQL instances PostgreSQL

Topic: Cross-database query between Google SQL instances PostgreSQL


Cross-database query between Google SQL instances PostgreSQL | GCP SQL Tutorial 2022. In this article we are going to learn how to run a cross-database query between Google Cloud SQL PostgreSQL instances.

Script used in demo:

create database sales_asia;
create database sales_europe;

-- In sales_asia
create table public.AsiaSale(id int, name varchar(100), region varchar(100));
insert into public.AsiaSale values(1,'aamir','Asia');
Select * From public.AsiaSale;

-- In sales_europe
create table public.EuropeSale(id int, name varchar(100), region varchar(100));
insert into public.EuropeSale values(2,'lisa','Europe');
Select * From public.europesale;

-- Goal: execute a union query in the sales_asia database that also reads
-- from the sales_europe public.EuropeSale table.
Select * From public.europesale
union all
select * from public.AsiaSale;

-- 1) Set up a foreign user -- do this on the DB from which you would like to read the tables
CREATE USER fdwuser WITH PASSWORD 'test123$';
GRANT USAGE ON SCHEMA PUBLIC TO fdwuser;
GRANT SELECT ON europesale TO fdwuser;

-- Check the list of tables
select * from information_schema.tables where table_name like '%sale%';

-- 2) Create the extension
CREATE EXTENSION postgres_fdw;
select * from pg_extension;

-- 3) Create the foreign server
CREATE SERVER secondrydb_srv FOREIGN DATA WRAPPER postgres_fdw OPTIONS (host '34.67.244.181', port '5432', dbname 'sales_europe');
Select * From pg_foreign_server;

-- 4) Create the user mapping
CREATE USER MAPPING FOR postgres SERVER secondrydb_srv OPTIONS(user 'fdwuser', password 'test123$');
Select * From pg_user_mappings;

-- 5) Grant the local user access to the foreign data wrapper
GRANT USAGE ON FOREIGN SERVER secondrydb_srv TO postgres;

-- 6) Import the foreign schema or tables
IMPORT FOREIGN SCHEMA public LIMIT TO (europesale) FROM SERVER secondrydb_srv INTO public;

-- Now the union works from sales_asia
Select * From public.europesale
union all
select * from public.AsiaSale;


Video Demo: Cross database query between Google SQL instances PostgreSQL


How to Perform Cross Database Queries in PostgreSQL in GCP Cloud

Topic: How to Perform Cross Database Queries in PostgreSQL in GCP Cloud


How to Perform Cross Database Queries in PostgreSQL in GCP Cloud | GCP Cloud SQL Tutorial 2022. In this video we are going to learn how to perform cross-database queries in PostgreSQL in GCP Cloud.


Video Demo: How to Perform Cross Database Queries in PostgreSQL in GCP Cloud





How to Stop GCP PostgreSQL from Logging Passwords in Clear Text in Logs GCP Cloud SQL Tutorial

Topic:  How to Stop GCP PostgreSQL from Logging Passwords in Clear Text in Logs


How to Stop GCP PostgreSQL from Logging Passwords in Clear Text in Logs | GCP SQL Tutorial 2022, in this video we are going to learn how to stop GCP PostgreSQL from logging passwords in clear text in the logs.


Video Demo: How to Stop GCP PostgreSQL from Logging Passwords in Clear Text in Logs


How to Perform In place Upgrade to GCP SQL Instance | Inplace Upgrade PostgreSQL 12 to PostgreSQL 14

Topic: How to Perform In place Upgrade to GCP SQL Instance | Inplace Upgrade PostgreSQL 12 to PostgreSQL 14


How to Perform In place Upgrade to GCP SQL Instance | Inplace Upgrade PostgreSQL 12 to PostgreSQL 14 | GCP SQL Tutorial 2022, in this video we are going to learn How to Perform In place Upgrade to GCP SQL Instance | Inplace Upgrade PostgreSQL 12 to PostgreSQL 14



Video Demo: How to Perform In place Upgrade to GCP SQL Instance



How to Drop User or Role in PostgreSQL Instance on Google Cloud Platform | GCP Tutorials 2022

Topic: How to Drop User or Role in PostgreSQL Instance on Google Cloud Platform


How to Drop User or Role in PostgreSQL Instance on Google Cloud Platform | GCP SQL Tutorial 2022, in this video we are going to learn How to Drop User or Role in PostgreSQL Instance on Google Cloud Platform | GCP SQL Tutorial 2022

Script:

Error when dropping the user:

ERROR: role "aamir" cannot be dropped because some objects depend on it
DETAIL: owner of database test
1 object in database test
SQL state: 2BP01

Error from the console: Invalid request: failed to delete user aamir: role "aamir" cannot be dropped because some objects depend on it. Details: owner of database test, 1 object in database test.

-- 0) Prepare the scenario: create user aamir, then create some objects
create database Test;
create table public.mytable(id int, name varchar(100));
insert into public.mytable values(1,'aamir');
Select * from public.mytable;

-- 1) Let's think about a user who has left the company and you need to drop the user.
-- If you don't know the password for the user, log in by using the postgres user
-- and change the password of the user:
Alter role aamir LOGIN password 'Test123$';

-- 2) Try to drop the role by using the postgres user session
drop role aamir;
-- you will get an error that objects are owned by user aamir
GRANT postgres to aamir;

-- 3) Log in as the aamir user and run the command below to assign all objects to the postgres user
REASSIGN OWNED BY aamir TO postgres;

-- 4) Log back in as postgres and run the drop
drop role aamir;

-- 5) Check that the objects were not dropped by dropping the user.


 

Video Demo: How to Drop User or Role in PostgreSQL Instance on Google Cloud Platform

How to Schedule Maintenance with PostgreSQL pg cron Extension | Google Cloud Platform SQL Tutorial

Topic:  How to Schedule Maintenance with PostgreSQL pg cron Extension


How to Schedule Maintenance with PostgreSQL pg_cron Extension | GCP SQL Tutorial 2022, in this video we are going to learn how to schedule maintenance tasks with the PostgreSQL pg_cron extension in Google Cloud Platform.

Script:

--Postgres Instance Level Settings

cloudsql.enable_pg_cron 

value= on

cron.database_name 

Value : Provide your Database Name

  

-- check if extensions are installed

select * from pg_extension;

-- use this to create extension on db, it will create two tables job and job_run_detail

CREATE EXTENSION pg_cron;

-- list of jobs to run on schedule

Select * from cron.job;

  

-- Check executed jobs status

SELECT jobid, runid, job_pid, database, username, command, status, return_message, start_time, end_time

FROM cron.job_run_details;

  

-- Create schedules to delete records, insert a record, or create an index

SELECT cron.schedule('27 19 * * *', $$DELETE FROM public.Person$$);

SELECT cron.schedule('31 19 * * *', $$insert into public.Person values (2,'Raza',200)$$);

-- create schedule to create an index on table

SELECT cron.schedule('36 19 * * *', $$CREATE INDEX idx_pid ON public.Person(id)$$);

  

  -- Schedule to run a script on another database
  -- cron.schedule(job_name, schedule, command) runs in the database named by cron.database_name;
  -- to target another database, pg_cron (1.4+) provides cron.schedule_in_database:
  -- SELECT cron.schedule_in_database('<JobName>', '<Schedule>', '<Task>', '<Database>')

  SELECT cron.schedule_in_database('testjob2', '40 19 * * *', 'create table public.crontab(id int)', 'test2');

  

  -- also we can insert into cron.job manually

  INSERT INTO cron.job(

jobid, schedule, command, nodename, nodeport, database, username, active, jobname)

VALUES (7, '55 19 * * *', 'create table public.crontab(id int)','localhost', '5432', 'test2', 'postgres', true,'testjob');

  

  

  

-- Delete schedule

-- SELECT cron.unschedule(<ID of the scheduled task>) -- you will get the schedule id from cron.job

  Select * from cron.job;

  -- Delete the Schedule

  select cron.unschedule(6)
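The schedules above use standard five-field cron syntax (minute, hour, day of month, month, day of week). As an illustration only (not part of pg_cron), a minimal Python sketch that labels the fields of an expression such as '27 19 * * *':

```python
def describe_cron(expr):
    """Split a standard five-field cron expression into named fields."""
    fields = expr.split()
    if len(fields) != 5:
        raise ValueError("expected 5 fields: minute hour day-of-month month day-of-week")
    names = ["minute", "hour", "day_of_month", "month", "day_of_week"]
    return dict(zip(names, fields))

# '27 19 * * *' fires once a day at 19:27 (server time)
print(describe_cron("27 19 * * *"))
```

So the delete job above runs daily at 19:27, the insert at 19:31, and the index build at 19:36.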




Video Demo: How to Schedule Maintenance with PostgreSQL pg cron Extension

How to Stop and Start SQL Instances on Schedule by using Cloud Schedule in GCP | GCP SQL Tutorial

Topic: How to Stop and Start SQL Instances on Schedule by using Cloud Schedule in Google Cloud Platform


How to Stop and Start SQL Instances on Schedule by using Cloud Schedule in GCP | GCP SQL Tutorial 2022, in this video we are going to learn How to Stop and Start SQL Instances on Schedule by using Cloud Schedule in GCP | GCP SQL Tutorial 2022


Link: https://cloud.google.com/sql/docs/mysql/start-stop-restart-instance
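Per the linked page, Cloud Scheduler stops and starts an instance by patching its activation policy through the Cloud SQL Admin API. A hedged Python sketch of the request body such a patch sends (field names follow the public API; verify against the linked docs before relying on them):

```python
import json

def activation_patch_body(stop):
    """Build the instances.patch body that stops (NEVER) or starts (ALWAYS) a Cloud SQL instance."""
    policy = "NEVER" if stop else "ALWAYS"
    return json.dumps({"settings": {"activationPolicy": policy}})

print(activation_patch_body(stop=True))
```

Cloud Scheduler can POST this body to the instance's patch endpoint on a cron schedule, which is what the tutorial automates.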



Video Demo: How to Stop and Start SQL Instances on Schedule by using Cloud Schedule in Google Cloud Platform

How to Connect to GCP SQL Server Instance by using Private IP Configuration | GCP SQL Tutorial 2022

Topic: How to Connect to GCP SQL Server Instance by using Private IP Configuration


How to Connect to GCP SQL Server Instance by using Private IP Configuration | GCP SQL Tutorial 2022, in this video we are going to learn How to Connect to GCP SQL Server Instance by using Private IP Configuration | GCP SQL Tutorial 2022


Video Demo: How to Connect to GCP SQL Server Instance by using Private IP Configuration 


How to Connect to MySQL Instance in Google Cloud Platform from Virtual Machine By Using Private IP

Topic: How to Connect to MySQL Instance in Google Cloud Platform from Virtual Machine By Using Private IP 


How to Connect to MySQL Instance in Google Cloud Platform from Virtual Machine By Using Private IP GCP Tutorial 2022, in this video, we are going to learn How to Connect to MySQL Instance in Google Cloud Platform from Virtual Machine By Using Private IP | GCP Tutorial 2022


Script:

Install MySQL Client

sudo apt-get install default-mysql-client 

Connect to MySQL by using Private IP
mysql -h IP -u root -p


Connect to MySQL with private IP
mysql -h 172.21.160.7 -u root -p


Video Demo: How to Connect to MySQL Instance in Google Cloud Platform from Virtual Machine By Using Private IP 


How to Connect to PostgreSQL Instance with Private IP by using GCP VM GCP SQL Cloud Tutorial 2022

Topic:  How to Connect to PostgreSQL Instance with Private IP by using GCP VM 


How to Connect to PostgreSQL Instance with Private IP by using GCP VM GCP SQL Cloud Tutorial 2022, in this video we are going to learn How to Connect to PostgreSQL Instance with Private IP by using GCP VM GCP SQL Cloud Tutorial 2022

Script:

Install psql client

sudo apt-get update
sudo apt-get install postgresql-client


psql "host=PrivateIP port=5432 sslmode=disable dbname=postgres user=postgres"

psql "host=172.21.160.12 port=5432 sslmode=disable dbname=postgres user=postgres"
  


Video Demo: How to Connect to PostgreSQL Instance with Private IP by using GCP VM


How to Install PostgreSQL 14.5 on Windows Machine | How to Connect to PostgreSQL by using pgAdmin

Topic: How to Install PostgreSQL 14.5 on Windows Machine and How to Connect to PostgreSQL by using pgAdmin 


How to Install PostgreSQL 14.5 on Windows Machine | How to Connect to PostgreSQL by using pgAdmin GCP Tutorial 2022, in this video we are going to learn How to Install PostgreSQL 14.5 on Windows Machine | How to Connect to PostgreSQL by using pgAdmin | GCP Tutorial 2022


Video Demo: How to Install PostgreSQL 14.5 on Windows Machine and How to Connect to PostgreSQL by using pgAdmin 


How to Install psql Command line on Windows Machine and Connect to PostgreSQL Instance on GCP

Topic: How to Install psql Command line on Windows Machine and Connect to PostgreSQL Instance on Google Cloud Platform


How to Install psql Command line on Windows Machine and Connect to PostgreSQL Instance on GCP Tutorial 2022, in this video we are going to learn How to Install psql Command line on Windows Machine and Connect to PostgreSQL Instance on GCP | GCP Tutorial 2022

Script: command line tools

https://www.enterprisedb.com/downloads/postgres-postgresql-downloads




Video Demo: How to Install psql Command line on Windows Machine and Connect to PostgreSQL Instance on GCP
 

How to Enable Slow Query Logging and General Logging for MySQL to Table or File in GCP

 Topic: How to Enable Slow Query Logging and General Logging for MySQL to Table or File in GCP


How to Enable Slow Query Logging and General Logging for MySQL to Table or File in GCP Tutorial 2022, in this video we are going to learn How to Enable Slow Query Logging and General Logging for MySQL to Table or File in GCP | GCP Tutorial 2022



Video Demo: How to Enable Slow Query Logging and General Logging for MySQL to Table or File in Google Cloud Platform

How to Create PostgreSQL 14 Instance on GCP How to Connect pgAdmin From Local Computer to PostgreSQL

Topic: How to Create PostgreSQL 14 Instance on GCP How to Connect pgAdmin From Local Computer to PostgreSQL.


How to Create PostgreSQL 14 Instance on GCP How to Connect pgAdmin From Local Computer to PostgreSQL GCP Tutorial 2022, in this video we are going to learn How to Create PostgreSQL 14 Instance on GCP How to Connect pgAdmin From Local Computer to PostgreSQL


Video Demo: How to Create PostgreSQL 14 Instance on GCP How to Connect pgAdmin From Local Computer to PostgreSQL

 

Understanding Backups of SQL Server, MySQL, PostgreSQL in GCP Limitation of Automated and OnDemand

Topic: Understanding Backups of SQL Server, MySQL, PostgreSQL in GCP Limitation of Automated and OnDemand



Understanding Backups of SQL Server, MySQL, PostgreSQL in GCP Limitation of Automated and OnDemand GCP Tutorial 2022, in this video we are going to learn Understanding Backups of SQL Server, MySQL, PostgreSQL in GCP Limitation of Automated and OnDemand.


Video Demo: Understanding Backups of SQL Server, MySQL, PostgreSQL in GCP Limitation of Automated and OnDemand

How to Upgrade SQL Server Instances from Express to Standard or Enterprise Edition in Google Cloud

Topic:  How to Upgrade SQL Server Instances from Express to Standard or Enterprise Edition in Google Cloud Platform.


How to Upgrade SQL Server Instances from Express to Standard or Enterprise Edition GCP Tutorial 2022, in this video we are going to learn How to Upgrade SQL Server Instances from Express to Standard or Enterprise Edition.


Video Demo: How to Upgrade SQL Server Instances from Express to Standard or Enterprise Edition in Google Cloud Platform

How to Migrate On Prem SQL Server Database to GCP SQL Server Instance by using BAK File & SQL File

Topic: How to Migrate On Prem SQL Server Database to GCP SQL Server Instance by using BAK File & SQL File.

How to Migrate On Prem SQL Server Database to GCP SQL Server Instance by using BAK File & SQL File GCP Tutorial 2022, in this video we are going to learn How to Migrate On Prem SQL Server Database to GCP SQL Server Instance by using BAK File & SQL File | GCP Tutorial 2022.


Video Demo: How to Migrate On Prem SQL Server Database to GCP SQL Server Instance by using BAK File & SQL File


How to Edit SQL Instance Settings on GCP | Does Restart Required if You Edit SQL Instances on GCP

 Topic: How to Edit SQL Instance Settings on Google Cloud Platform

How to Edit SQL Instance Settings on GCP | Does Restart Required if You Edit SQL Instances on GCP Tutorial 2022, in this video we are going to learn How to Edit SQL Instance Settings on GCP | Does Restart Required if You Edit SQL Instances on Google Cloud Platform.



Video Demo: How to Edit SQL Instance Settings on Google Cloud Platform.


How to Clone SQL Instances in Google Cloud Platform | Create Clone of SQL Server, MySQL, PostgreSQL

Topic: How to Clone SQL Instances in Google Cloud Platform  


How to Clone SQL Instances in Google Cloud Platform | Create Clone of SQL Server, MySQL, PostgreSQL GCP Tutorial 2022, in this video we are going to learn How to Clone SQL Instances in Google Cloud Platform | Create Clone of SQL Server, MySQL, PostgreSQL | GCP Tutorial 2022

Link: https://cloud.google.com/sql/docs/sqlserver/clone-instance#gcloud


Video Demo: How to Clone SQL Instances in Google Cloud Platform | Create Clone of SQL Server, MySQL, PostgreSQL

How to Start, Stop, & Restart SQL instances on GCP by using Console and Gcloud Commands GCP Tutorial

Topic: How to Start, Stop, & Restart SQL instances on GCP by using Console and Gcloud Commands GCP Tutorial

How to Start, Stop, & Restart SQL instances on GCP by using Console and Gcloud Commands GCP Tutorial 2022, in this video we are going to learn How to Start, Stop, & Restart SQL instances on GCP by using Console and Gcloud Commands | GCP Tutorial 2022

Link: https://cloud.google.com/sql/docs/mysql/start-stop-restart-instance


Video Demo: How to Start, Stop, & Restart SQL instances on GCP by using Console and Gcloud Commands GCP Tutorial

Google Cloud SQL Tutorial


Instance Management

  1. Create SQL Instance and Connect via SSMS
  2. Start, Stop, Restart SQL Instances in GCP
  3. Edit SQL Instance Settings in GCP

Backup, Recovery & Upgrades

  1. Backup Strategies and Limitations
  2. Enable Point in Time Recovery in MySQL
  3. Upgrade SQL Server Editions
  4. In-place PostgreSQL Upgrade (v12 to v14)

High Availability & Replication

  1. Setup HA and Failover for SQL Server
  2. Create MySQL 8.0 with HA
  3. Create SQL Server Read Replicas
  4. MySQL Read Replicas & Promotion

Connectivity & Access

  1. Clone SQL Instances
  2. Connect PostgreSQL with Private IP
  3. Connect MySQL via Private IP
  4. Connect SQL Server via Private IP
  5. Install PostgreSQL 14.5 on Windows
  6. Install psql CLI for PostgreSQL

Performance & Logs

  1. Enable Slow and General Logging in MySQL
  2. Stop Logging Clear Text Passwords in PostgreSQL

Scheduling & Maintenance

  1. Automate SQL Start/Stop with Cloud Scheduler
  2. Schedule PostgreSQL Maintenance with pg_cron

Migration & Security

  1. Migrate SQL Server to GCP using BAK
  2. Drop User or Role in PostgreSQL
  3. Fix SQL Server Error 53

Advanced Features

  1. Cross DB Queries in PostgreSQL (GCP)
  2. Cross DB Queries between Google SQL PostgreSQL Instances
  3. CDC from MySQL to BigQuery via DataStream

SQL Server Error 53 Could not Open Connection On SQL Server on Google Cloud Platform | GCP Tutorial

 Topic: SQL Server Error 53 Could not Open Connection On SQL Server on Google Cloud Platform 

SQL Server Error 53 Could not Open Connection On SQL Server on Google Cloud Platform GCP Tutorial 2022, in this video we are going to learn about SQL Server Error 53 Could not Open Connection On SQL Server on Google Cloud Platform | GCP Tutorial 2022.



Video Tutorial: SQL Server Error 53 Could not Open Connection On SQL Server on Google Cloud Platform

Create MySQL Instance with Read Replicas in GCP | Promote MySQL Read Replica to Stand Alone Mode

Topic: Create MySQL Instance with Read Replicas in Google Cloud Platform.


Create MySQL Instance with Read Replicas in GCP | Promote MySQL Read Replica to Stand Alone Mode GCP Tutorial 2022, in this video we are going to learn Create MySQL Instance with Read Replicas in GCP | Promote MySQL Read Replica to Stand Alone Mode


Video Demo: Create MySQL Instance with Read Replicas in GCP

 

How to Enable Point in Time Recovery for MySQL Instance in GCP | How to Restore Database with PTR

Topic: How to Enable Point in Time Recovery for MySQL Instance in GCP.


How to Enable Point in Time Recovery for MySQL Instance in GCP | How to Restore Database with PTR GCP Tutorial 2022, in this video we are going to learn How to Enable Point in Time Recovery for MySQL Instance in GCP | How to Restore Database with PTR | GCP Tutorial 2022.


Video Demo: How to Enable Point in Time Recovery for MySQL Instance in Google Cloud Platform 

 

How to Create Read Replicas for SQL Server Instance in GCP Availability Groups for SQL Server in GCP

Topic: How to Create Read Replicas for SQL Server Instance in Google Cloud Platform.

How to Create Read Replicas for SQL Server Instance in GCP Availability Groups for SQL Server in GCP | GCP Tutorial 2022, in this video we are going to learn How to Create Read Replicas for SQL Server Instance in GCP Availability Groups for SQL Server in GCP.


Link: https://cloud.google.com/sql/docs/sqlserver/replication#sql-server-limitations


Video Demo: How to Create Read Replicas for SQL Server Instance in GCP Availability Groups for SQL Server in GCP

How to Create MySQL 8.0 Instance with High Availability in Google Cloud Platform

Topic: How to Create MySQL 8.0 Instance with High Availability in Google Cloud Platform.

How to Create MySQL 8.0 Instance with High Availability in Google Cloud Platform GCP Tutorial 2022, in this video we are going to learn How to Create MySQL 8.0 Instance with High Availability in Google Cloud Platform.


Video Demo: How to Create MySQL 8.0 Instance with High Availability in Google Cloud Platform

 

How to Setup SQL Server Instance with High Availability in Google Cloud Platform | How to Failover SQL Server Instance

Topic: How to Setup SQL Server Instance with High Availability in Google Cloud Platform.

How to Setup SQL Server Instance with High Availability in GCP | How to Failover SQL Server Instance GCP Tutorial 2022, in this video we are going to learn How to Setup SQL Server Instance with High Availability in GCP | How to Failover SQL Server Instance | GCP Tutorial 2022, Google Cloud Platform Step by Step - GCP Tutorial 2022 




Video Demo: How to Setup SQL Server Instance with High Availability in Google Cloud Platform

How to Create SQL Instance on Google Cloud Platform and Connect by using SSMS - GCP Tutorial

 Topic: How to Create SQL Instance on Google Cloud Platform and Connect by using SSMS


How to Create SQL Instance on Google Cloud Platform and Connect by using SSMS | GCP Tutorial 2022, in this video we are going to learn how to create a SQL instance on Google Cloud Platform and connect by using SSMS from a local computer.


Video Tutorial: How to Create SQL Instance on Google Cloud Platform and Connect by using SQL Server Management Studio

Azure Data Factory Interview Questions & Answers

ADF Interview Questions and Answers | Azure Data Factory Q&A


Introduction to Azure Data Factory

  1. What is Azure Data Factory and Why Do We Use Azure Data Factory?

Integration Runtime (IR)

  1. What is Integration Runtime in ADF, and what are the Types of IR?
  2. Which IR will you use if you need to read data from an on-premises network to Azure Cloud?
  3. If your Self-Hosted IR is running slowly, what steps will you take in Azure Data Factory?
  4. How will you check if SHIR is not working?
  5. Can a single Self-Hosted IR be used in more than one Data Factory?
  6. For load balancing and failover scenarios, what is the maximum number of nodes we can add in SHIR?
  7. Is it okay to create multiple Integration Runtimes in a single Data Factory, and why?

Pipelines and Activities

  1. What are Pipelines in Azure Data Factory?
  2. What is an Activity in Azure Data Factory?
  3. How will you control the flow of activities in an Azure Data Factory Pipeline?
  4. Which activity will you use to delete all files in Azure Data Factory?
  5. Explain a few activities in Azure Data Factory Pipeline that can run a stored procedure.
  6. How can you complete a task that cannot be done with built-in activities in Azure Data Factory?
  7. Can the same linked service be used in multiple Pipelines in Azure Data Factory?
  8. To get a list of all files from Azure Blob Storage, which activity will you use?

Triggers

  1. What is a Schedule Trigger in Azure Data Factory?
  2. What are Event-Based Triggers in Azure Data Factory?
  3. What is a Tumbling Window Trigger in Azure Data Factory?

Linked Services and Datasets

  1. What are Linked Services in Azure Data Factory?
  2. What is a Dataset in Azure Data Factory?
  3. What are Parameterized Linked Services, and why do you use them in Azure Data Factory?

SSIS Integration

  1. You need to run an SSIS Package in Azure Data Factory. Where will you save the SSIS package?
  2. What are your thoughts on which is better: SSIS vs. ADF?

Data Operations

  1. Does Copy Activity support UPSERT operations?
  2. How to remove duplicate records in Azure Data Factory?
  3. If you need to add an extra column or derive extra columns from source columns, can you use Copy Activity?
  4. You are given hundreds of files, and you need to load them into separate tables and create the tables on the fly. How would you do this?
  5. Multiple files are loaded to a single table. How will you identify the file names for each record?

Error Handling and Recovery

  1. If your Data Factory is deleted, how will you recover or restore it?
  2. How can you backup and restore Azure Data Factory?

Security and Networking

  1. What are the steps to create Private Endpoints in Azure Data Factory?
  2. Why do you use Azure Key Vault with Data Factory?

Parameters and Variables

  1. What is the difference between a Parameter and a Variable in Azure Data Factory?
  2. What are the three different types of variables available in Azure Data Factory?

Monitoring and Logging

  1. Where will you see the Pipeline Runs and Triggers?
  2. How will you keep the execution log history of ADF Pipelines for more than 3 months?
  3. How will you get the last 2 months' execution details of pipelines if your pipelines have been failing?
  4. How to find out which Pipeline is billed the most?
  5. How will you check if a file exists or does not exist in Blob storage in Azure Data Factory?

Data Flows

  1. Your Data Flow is running slowly in ADF. What steps will you take?
  2. What is the difference between General Purpose and Memory Optimized Mapping Data Flows?
  3. What is the minimum and maximum Vcore for Mapping Data Flow?

CI/CD and Source Control

  1. Which source control repository will you use in Azure Data Factory?
  2. Which tool have you used for CI/CD for Azure Data Factory?
  3. How to copy an Azure Data Factory Pipeline from one Data Factory to another?

Notifications

  1. How will you send an email with a file attachment from Azure Data Factory?
  2. How will you send an email from an Azure Data Factory Pipeline?
  3. How will you send an email from Azure Data Factory?

Miscellaneous

  1. In which modes can an Azure Data Factory Pipeline run?
  2. The Microsoft EventGrid resource provider is not registered in the subscription. How to resolve this?
  3. What steps will you take if your pipelines are stuck in a queue or in progress status?
  4. How often do you upgrade or update your Self-Hosted Integration Runtime?

© 2025 Aamir Shahzad. All rights reserved.

How to use Append Variable activity in Azure Data Factory - Azure Data Factory Tutorial 2022

Topic: How to use Append Variable activity in Azure Data Factory 


In this article we are going to learn how to use the Append Variable activity in Azure Data Factory. We will work through a real-time example: emailing the list of files processed in your Data Factory, so we will concatenate the names of those files into a list and then email it. Let's start our demonstration.

First of all, open your Azure Data Factory studio, go to the author tab, click on the + button to create a new pipeline, then find and drag the Get Metadata activity.


Rename the Get Metadata activity for your convenience, then go to the Dataset tab and click on the + New button to create a new dataset.


 Select Azure Blob Storage and then click on Continue.


Select the File format and then click on continue.


Name your dataset, select the linked service if you have already created one (or create a new one), provide the input file path, set Import schema to None or as per your requirement, then click OK.


Go back to the activity's settings and select Child Items in the Field list, as we are going to read the list of files.

Find and drag the ForEach activity, connect it to the Get Metadata activity, go to the Settings tab, and then click Add dynamic content.


Select the Get Metadata activity; its output expression will be shown in the box above. Append ".childItems" to it and then click OK.



Go to the Variables tab, click on the + button to create a new variable, name your variable, and select the type (Array, since we will append the file names to it).


Go inside the ForEach activity and bring in the Wait activity or Copy Data activity as per your scenario. Then find and bring in the Append Variable activity, connect the two, go to the Variables tab, select the variable name, and click Add dynamic content.


Click on the ForEach iterator; the expression will be shown in the box above. Append ".name" to it and click OK.
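The clicks above correspond to pipeline JSON under the hood. A rough Python sketch of the fragment the authoring UI generates for this ForEach + Append Variable pattern (activity and variable names such as "Get Metadata1" and "FileNames" are placeholder assumptions, not from the demo):

```python
# Illustrative only: approximate JSON the ADF authoring UI produces for this pattern.
foreach_activity = {
    "name": "ForEachFile",
    "type": "ForEach",
    "typeProperties": {
        # The Get Metadata output's childItems array feeds the loop
        "items": {"value": "@activity('Get Metadata1').output.childItems",
                  "type": "Expression"},
        "activities": [{
            "name": "AppendFileName",
            "type": "AppendVariable",
            "typeProperties": {
                "variableName": "FileNames",
                # @item().name is the current file's name on each iteration
                "value": {"value": "@item().name", "type": "Expression"},
            },
        }],
    },
}
print(foreach_activity["typeProperties"]["activities"][0]["type"])
```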


Outside the ForEach activity, find and bring in the Set Variable activity and connect it to the ForEach. Create a string-type variable, click on the Set Variable activity, go to the Variables tab, select the variable, and then click Add dynamic content.



Click on the variable name which we created earlier, wrap the expression in curly braces (ADF string interpolation), and click OK.


Next, go to the Azure portal, find and open Logic Apps, then click on the + Add button to create a new Logic App. In the Basics tab, select your Azure subscription, select the resource group, select the Logic App type, name your Logic App, and select your region; then click Review + create and then Create.



Once the Logic App is created, open the resource. Here we will set up the email part: search for "Request", then select the "When a HTTP request is received" trigger.


In the "When a HTTP request is received" trigger, click "Use sample payload to generate schema".


In this window, enter the sample payload and then click Done.



Click on Method, select POST, and then click Next.

Select the email service (in my case it is Gmail), provide a connection name, select the authentication type, and click Sign in. It will redirect to a new window where you provide the username and password for the email account.




Once you are signed in, it will redirect you to another window where you provide the recipient email address, the subject, and the message body; select the list-of-files variable from the dynamic content window, and then finally click Save.



Once you are done with the email part, click on the HTTP trigger box and copy the HTTP POST URL.


Next, go back to Azure Data Factory Studio, find and drag the Web activity, and connect it to the Set Variable activity. Go to the Settings tab, paste the HTTP POST URL, select the POST method, provide the body, then publish the pipeline and debug it.
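The Web activity simply POSTs a JSON body to the Logic App's HTTP trigger. A hedged Python equivalent of what it sends (the URL and the "FileList" field name are placeholders; the real field must match the schema you generated in the Logic App):

```python
import json
import urllib.request

def build_body(file_list):
    # "FileList" is a placeholder field name; it must match the schema
    # generated in the "When a HTTP request is received" trigger.
    return json.dumps({"FileList": file_list}).encode("utf-8")

def post_file_list(trigger_url, file_list):
    """POST the concatenated file list to the Logic App HTTP trigger."""
    req = urllib.request.Request(
        trigger_url,
        data=build_body(file_list),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

In the pipeline, the body value would be the string variable set from the ForEach loop rather than a hard-coded list.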






Video Demo: How to use Append Variable activity in Azure Data Factory