How to upgrade GCP PostgreSQL from an old version to a new version by using Database Migration Service (DMS) in GCP

Topic: How to upgrade GCP PostgreSQL from an old version to a new version by using Database Migration Service (DMS) in GCP.


Here’s a step-by-step guide to upgrading GCP PostgreSQL from an old version to a new version by using Database Migration Service (DMS) in GCP:
  1. Create a service account in the Google Cloud Console with the permissions needed to access your GCP project, your PostgreSQL instance, and Database Migration Service. You can follow the instructions provided in the official documentation.

  2. Install the Google Cloud SDK on your local machine. You can download it from the official website.

  3. Authenticate the Google Cloud SDK by running the following command in your terminal:

gcloud auth login

  4. Set your project ID by running the following command in your terminal:

gcloud config set project <project-id>

  5. Create source and destination connection profiles for Database Migration Service, either in the Cloud Console or with gcloud database-migration connection-profiles create. The source profile points at your existing (old-version) PostgreSQL instance, and the destination profile points at the new Cloud SQL for PostgreSQL instance you are upgrading to.

  6. Create a migration job by running the following command in your terminal:

gcloud database-migration migration-jobs create <migration-name> --region=<region> --source=<source-connection-profile> --destination=<destination-connection-profile> --type=CONTINUOUS

Replace <migration-name> with a name for your migration job, <region> with the region the job should run in, and <source-connection-profile> and <destination-connection-profile> with the connection profiles you created in the previous step. Depending on your network setup you may also need connectivity flags (for example, for a reverse SSH tunnel or VPC peering); the exact flags vary between SDK versions, so check gcloud database-migration migration-jobs create --help.

  7. Start the migration job by running the following command in your terminal:

gcloud database-migration migration-jobs start <migration-name> --region=<region>

  8. Monitor the migration status by running the following command in your terminal:

gcloud database-migration migration-jobs describe <migration-name> --region=<region>

  9. Once the destination has caught up, promote the migration job (from the DMS console or with gcloud database-migration migration-jobs promote) and verify that all data has been migrated successfully to the new PostgreSQL instance.
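To spot-check the result before you point applications at the new instance, you can compare row counts between the old and the new databases. The following is a minimal Python sketch (not part of DMS itself) using the psycopg2 package; the connection strings and table names are placeholders you would replace with your own.

# pip install psycopg2-binary
import psycopg2

# Placeholder connection details for the old (source) and new (destination) instances.
SOURCE_DSN = "host=<source-host> port=5432 dbname=<db> user=<user> password=<password>"
DEST_DSN = "host=<destination-host> port=5432 dbname=<db> user=<user> password=<password>"
TABLES = ["public.customers", "public.orders"]  # tables you want to spot-check

def row_count(dsn, table):
    # Open a connection, count the rows in one table, and return the count.
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT COUNT(*) FROM " + table)
            return cur.fetchone()[0]

for table in TABLES:
    src = row_count(SOURCE_DSN, table)
    dst = row_count(DEST_DSN, table)
    print(table, "source =", src, "destination =", dst, "OK" if src == dst else "MISMATCH")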
That’s it! You should now have upgraded your GCP PostgreSQL instance from an old version to a new version using DMS.
I hope this helps! Let me know if you have any questions.

Video Demo: How to upgrade GCP PostgreSQL from an old version to a new version by using Database Migration Service (DMS) in GCP

How to Get the List of all Files with Size, Modified and Path from GCS Bucket and Load into BigQuery

Topic: How to Get the List of all Files with Size, Modified and Path from GCS Bucket and Load into BigQuery.

In this post, you will learn how to get the list of all files with their size, modified date, and path from a Google Cloud Storage (GCS) bucket and load them into BigQuery.

We will guide you through the process step by step, starting with setting up the necessary permissions and credentials for accessing GCS and BigQuery.

By the end of this post, you will clearly understand how to extract file metadata from a GCS bucket, load it into BigQuery, and leverage its powerful querying capabilities for further analysis.

If you’re a data engineer, data analyst, or anyone working with large datasets in Google Cloud Platform, this post is for you! 

Script: The Google Apps Script below lists the files in a Google Drive folder (name, link, and size in MB) and writes them to a new Google Sheet.

function listFolderContents() {
  var foldername = 'Final Logos'; // name of the folder whose files you want to list
  var ListOfFiles = 'ListOfFiles_' + foldername;

  // Look up the folder by name and get an iterator over its files.
  var folders = DriveApp.getFoldersByName(foldername);
  var folder = folders.next();
  var contents = folder.getFiles();

  // Create a new spreadsheet and write the header row.
  var ss = SpreadsheetApp.create(ListOfFiles);
  var sheet = ss.getActiveSheet();
  sheet.appendRow(['name', 'link', 'sizeInMB']);

  var var_file;
  var var_name;
  var var_link;
  var var_size;

  // Append one row per file: name, URL, and size converted from bytes to MB.
  while (contents.hasNext()) {
    var_file = contents.next();
    var_name = var_file.getName();
    var_link = var_file.getUrl();
    var_size = var_file.getSize() / 1024.0 / 1024.0;
    sheet.appendRow([var_name, var_link, var_size]);
  }
}
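For the GCS scenario in the title, the same idea can be done with the Google Cloud client libraries: list every object in a bucket with its path, size, and last-modified timestamp, then load those rows into BigQuery. Below is a minimal Python sketch; the bucket name and table ID are placeholders, and it assumes the google-cloud-storage and google-cloud-bigquery packages are installed and your credentials are already configured.

# pip install google-cloud-storage google-cloud-bigquery
from google.cloud import bigquery, storage

BUCKET_NAME = "my-bucket"                     # placeholder: your GCS bucket
TABLE_ID = "my-project.my_dataset.gcs_files"  # placeholder: target BigQuery table

storage_client = storage.Client()
bq_client = bigquery.Client()

# Collect path, size (bytes), and last-modified timestamp for every object in the bucket.
rows = [
    {
        "path": "gs://" + BUCKET_NAME + "/" + blob.name,
        "size_bytes": blob.size,
        "modified": blob.updated.isoformat(),
    }
    for blob in storage_client.list_blobs(BUCKET_NAME)
]

# Load the metadata into BigQuery, letting it detect the schema and replacing any previous run.
job_config = bigquery.LoadJobConfig(autodetect=True, write_disposition="WRITE_TRUNCATE")
bq_client.load_table_from_json(rows, TABLE_ID, job_config=job_config).result()
print("Loaded", len(rows), "file records into", TABLE_ID)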

That’s it! You should now have a BigQuery table with all the files from your GCS bucket along with their size, modified date, and path.

 

Video Demo: How to Get the List of all Files with Size, Modified and Path from GCS Bucket and Load into BigQuery.

How to use Bing AI to write SQL Server Queries as SQL Server Developer or SQL Server DBA

Topic: How to use Bing AI to write SQL Server Queries as SQL Server Developer or SQL Server DBA.

Introduction

Bing AI is a powerful tool that can help you generate SQL queries quickly and easily. Whether you’re a SQL Server Developer or a SQL Server DBA, Bing AI can help you write complex SQL queries in seconds, even if you only know the basics of SQL. In this blog post, we’ll explore how to use Bing AI to generate SQL queries.

Prerequisites

Before we begin, make sure you have the following:

  • A Microsoft account (required to use Bing Chat).
  • A basic understanding of SQL.

Step-by-Step Guide

  1. Sign in to Bing: Go to bing.com and sign in with your Microsoft account.

  2. Open Bing Chat: Open the chat experience by selecting the Chat option next to the Bing search bar.

  3. Set the context: Tell Bing you want a SQL Server query, and mention the relevant tables and columns if you know them.

  4. Enter your request: Describe what you need in plain English. For example, “Write a SQL Server query that shows all customers who have made a purchase in the last 30 days.”

  5. Generate the query: Send the prompt, and Bing will respond with a SQL query.

  6. Review your query: Check that the table and column names match your database and that the logic is correct before you run it.

  7. Copy and paste your query: Copy the query into your SQL editor, such as SQL Server Management Studio.

  8. Execute your query: Run the query to retrieve the data you need.

And that’s it! With these simple steps, you can easily generate SQL queries using Bing AI.

Conclusion

In this blog post, we explored how to use Bing AI to generate SQL queries. We hope this guide helps you get started.


Video Demo: How to use Bing AI to write SQL Server Queries as SQL Server Developer or SQL Server DBA


How to Read Data from a GCS (Google Cloud Storage) Bucket to Azure Blob Storage

Topic: How to Read Data from a GCS (Google Cloud Storage) Bucket to Azure Blob Storage.

Introduction

Google Cloud Storage (GCS) and Azure Blob Storage are two of the most popular cloud storage solutions available today. While both platforms offer similar functionality, there may be times when you need to transfer data from GCS to Azure Blob Storage. In this blog post, we’ll explore how to do just that using Azure Data Factory.

Prerequisites

Before we begin, make sure you have the following:

  • A Google Cloud Storage bucket with data you want to transfer.
  • An Azure Blob Storage account where you want to transfer the data.
  • An Azure Data Factory instance.

Step-by-Step Guide

  1. Create a Linked Service for GCS: In the Azure Data Factory portal, create a new linked service for Google Cloud Storage. You’ll need to provide the access key ID and secret generated for your GCS account (under Settings > Interoperability in the Cloud Storage console), and optionally the service URL (https://storage.googleapis.com).

  2. Create a Linked Service for Azure Blob Storage: Next, create a linked service for Azure Blob Storage. You’ll need to provide your storage account name and access key.

  3. Create a Dataset for GCS: Create a new dataset for GCS by specifying the path to your bucket and any other relevant details.

  4. Create a Dataset for Azure Blob Storage: Similarly, create a new dataset for Azure Blob Storage by specifying the path to your container and any other relevant details.

  5. Create a Pipeline: Create a new pipeline in your Azure Data Factory instance. Add two activities: a Copy activity for copying data from GCS to Azure Blob Storage and, optionally, a Delete activity for removing the data from GCS after it has been copied.

  6. Configure the Copy Activity: In the copy activity, specify the source dataset (GCS) and destination dataset (Azure Blob Storage). You can also configure other settings such as compression and encryption.

  7. Configure the Delete Activity: In the delete activity, specify the dataset (GCS) that contains the data you want to delete.

  8. Run the Pipeline: Finally, run the pipeline to transfer data from GCS to Azure Blob Storage.

And that’s it! With these simple steps, you can easily transfer data from Google Cloud Storage to Azure Blob Storage using Azure Data Factory.
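If you ever need to run the same transfer outside of Data Factory, it can also be scripted. The following is a minimal Python sketch (an alternative illustration, not part of the ADF setup above) using the google-cloud-storage and azure-storage-blob packages; the bucket, container, and connection-string values are placeholders.

# pip install google-cloud-storage azure-storage-blob
from azure.storage.blob import BlobServiceClient
from google.cloud import storage

GCS_BUCKET = "my-gcs-bucket"                                   # placeholder
AZURE_CONNECTION_STRING = "<azure-storage-connection-string>"  # placeholder
AZURE_CONTAINER = "my-container"                               # placeholder

gcs_client = storage.Client()
container_client = BlobServiceClient.from_connection_string(
    AZURE_CONNECTION_STRING
).get_container_client(AZURE_CONTAINER)

# Copy every object from the GCS bucket into the Azure container, keeping the same path.
# Each object is held in memory, which is fine for small files; stream large ones instead.
for gcs_blob in gcs_client.list_blobs(GCS_BUCKET):
    data = gcs_blob.download_as_bytes()
    container_client.upload_blob(name=gcs_blob.name, data=data, overwrite=True)
    print("Copied gs://" + GCS_BUCKET + "/" + gcs_blob.name)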

Conclusion

In this blog post, we explored how to transfer data from Google Cloud Storage to Azure Blob Storage using Azure Data Factory. We hope this guide has been helpful in getting you started with transferring data between these two cloud storage solutions.


Video Demo: How to Read Data from a GCS (Google Cloud Storage) Bucket to Azure Blob Storage


How to Read the Data from BigQuery Table and Write to CSV File in Blob Storage

Topic: How to Read the Data from BigQuery Table and Write to CSV File in Blob Storage.

Google BigQuery is a cloud-based data warehouse that allows you to store and analyze large datasets. In this tutorial, we will learn how to read data from a BigQuery table and write it to a CSV file in Azure Blob Storage.
Prerequisites

  • A Google Cloud Platform (GCP) account with BigQuery enabled.
  • An Azure account with Blob Storage enabled.
  • A dataset and table in BigQuery with data you want to export.
Steps
  1. Open the BigQuery console.
  2. In the navigation pane, select your project and dataset.
  3. Click on the table you want to export.
  4. Click on the Export button at the top of the page.
  5. In the Export to section, select Cloud Storage (BigQuery can only export directly to Google Cloud Storage, so the file is staged there first).
  6. In the Destination URI field, enter the Cloud Storage path for the file. For example: gs://my-bucket/my-folder/my-file.csv.
  7. In the Export format section, select CSV, then click on the Export button.
  8. Copy the exported file from your GCS bucket to your Azure Blob Storage container, for example with an Azure Data Factory copy pipeline (as in the previous post) or with a small script like the one sketched below.
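If you prefer to script the whole flow, here is a minimal Python sketch using the google-cloud-bigquery, google-cloud-storage, and azure-storage-blob packages. The table ID, bucket, container, and connection string are placeholders; for tables larger than about 1 GB, use a wildcard destination URI so BigQuery can split the export into multiple files.

# pip install google-cloud-bigquery google-cloud-storage azure-storage-blob
from azure.storage.blob import BlobServiceClient
from google.cloud import bigquery, storage

TABLE_ID = "my-project.my_dataset.my_table"                    # placeholder BigQuery table
GCS_BUCKET = "my-bucket"                                       # placeholder staging bucket
GCS_OBJECT = "exports/my-table.csv"
AZURE_CONNECTION_STRING = "<azure-storage-connection-string>"  # placeholder
AZURE_CONTAINER = "my-container"

# 1. Export the BigQuery table to a CSV file in the GCS staging bucket (CSV is the default format).
bigquery.Client().extract_table(TABLE_ID, "gs://" + GCS_BUCKET + "/" + GCS_OBJECT).result()

# 2. Download the CSV from GCS and upload it to Azure Blob Storage.
csv_bytes = storage.Client().bucket(GCS_BUCKET).blob(GCS_OBJECT).download_as_bytes()
blob_service = BlobServiceClient.from_connection_string(AZURE_CONNECTION_STRING)
blob_service.get_blob_client(AZURE_CONTAINER, "my-table.csv").upload_blob(csv_bytes, overwrite=True)

print("Exported", TABLE_ID, "to", AZURE_CONTAINER + "/my-table.csv")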
Conclusion
That’s it! Your data is now exported from BigQuery to Azure Blob Storage as a CSV file.
In this tutorial, we learned how to read data from a BigQuery table and write it to a CSV file in Azure Blob Storage. This can be useful when you want to share data with others or use it in other applications.
If you have any questions or comments, please leave them below. Thanks for reading!


Video Tutorial: How to Read the Data from BigQuery Table and Write to CSV File in Blob Storage




How to Activate and Deactivate Activities in Azure Data Factory Step-by-Step

 

Topic: How to Activate and Deactivate Activities in Azure Data Factory Step-by-Step.

Azure Data Factory (ADF) is a cloud-based data integration service that allows you to create, schedule, and manage data pipelines. In ADF, you can activate and deactivate activities in your pipelines to control their behavior during pipeline execution. This feature is useful when you want to temporarily disable an activity without deleting it from the pipeline.

In this tutorial, we will learn how to activate and deactivate activities in ADF step-by-step. We will cover the following topics:

  1. Why activate and deactivate activities?
  2. How to activate and deactivate activities in ADF?
  3. Best practices for using this feature.

Why activate and deactivate activities?

Activating and deactivating activities in ADF can help you achieve the following goals:

  • Efficient pipeline development: You can comment out part of the pipeline without deleting it from the canvas, which significantly improves pipeline developer efficiency.
  • Flexible pipeline execution: You can skip one or more activities during validation and pipeline run, which provides more flexibility in pipeline execution.
  • Easy debugging: You can debug your pipeline by deactivating certain activities and running the pipeline with only the active activities.

How to activate and deactivate activities in ADF?

To activate or deactivate an activity in ADF, follow these steps:

  1. Open your pipeline in the ADF authoring UI.
  2. Select the activity you want to activate or deactivate.
  3. In the General tab of the activity settings, set the Activity state to either Active or Inactive.
  4. If you set the Activity state to Inactive, choose a state for Mark activity as (Succeeded, Failed, or Skipped).
  5. Save your changes.

You can also deactivate multiple activities at once by selecting them with your mouse and choosing Deactivate from the drop-down menu.

Best practices for using this feature

Here are some best practices for using the activate/deactivate feature in ADF:

  • Use this feature when you want to temporarily disable an activity without deleting it from the pipeline.
  • Comment out part of the pipeline that is not yet complete or needs further development.
  • Use this feature when you want to debug your pipeline by running it with only certain activities active.

That’s it! Now you know how to activate and deactivate activities in Azure Data Factory step-by-step. If you have any questions or comments, please leave them below.


Video Tutorial: How to Activate and Deactivate Activities in Azure Data Factory Step-by-Step