How to Read Data from Google Sheets Using BigQuery

 Topic: How to read data from Google Sheets using BigQuery.

In this article, we will learn how to read data from Google Sheets using BigQuery. BigQuery is a powerful tool that allows you to access, analyze, visualize, and share billions of rows of data from your spreadsheet with Connected Sheets, the new BigQuery data connector.

To get started, you need to meet the following requirements: access to the Google Cloud Platform, BigQuery access, and a project with billing set up in BigQuery. Once you have met these requirements, you can create a BigQuery table that uses a Google Drive document (the URL of your Google Sheet) as its data source. After the schema is detected, the current data from the Google Sheet is visible in BigQuery, and you can then run a query to display all of your daily data.

Connected Sheets runs queries on BigQuery on your behalf, either when manually requested or on a defined schedule. The results of those queries are saved in your spreadsheet for analysis and sharing. You can also use Connected Sheets to collaborate with partners, analysts, or other stakeholders in a familiar spreadsheet interface.

Here are the steps to read data from Google Sheets using BigQuery:

1.       Ensure that you have access to the Google Cloud Platform, BigQuery access, and a project with billing set up in BigQuery.

2.       Create a BigQuery table that uses your Google Drive document (the URL of the Google Sheet) as its data source, as shown in the sketch after this list.

3.       Run a query to display all of your daily data.

4.       Use Connected Sheets to collaborate with partners, analysts, or other stakeholders in a familiar spreadsheet interface.
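
If you prefer to set up the Sheets-backed table programmatically instead of through the console, here is a minimal Python sketch using the google-cloud-bigquery client. The project, dataset, table, and sheet URL below are placeholders, and it assumes your credentials carry both the BigQuery and Drive scopes.

from google.auth import default
from google.cloud import bigquery

# Placeholder identifiers -- replace with your own values.
TABLE_ID = "my-project.my_dataset.sheet_data"
SHEET_URL = "https://docs.google.com/spreadsheets/d/<your-sheet-id>"

# The Drive scope is required so BigQuery can read the Google Sheet.
credentials, project = default(
    scopes=[
        "https://www.googleapis.com/auth/bigquery",
        "https://www.googleapis.com/auth/drive",
    ]
)
client = bigquery.Client(credentials=credentials, project=project)

# Define an external table backed by the Google Sheet and let BigQuery
# detect the schema automatically.
external_config = bigquery.ExternalConfig("GOOGLE_SHEETS")
external_config.source_uris = [SHEET_URL]
external_config.autodetect = True
external_config.options.skip_leading_rows = 1  # treat the first row as headers

table = bigquery.Table(TABLE_ID)
table.external_data_configuration = external_config
client.create_table(table)

# Query the sheet-backed table to display the daily data.
for row in client.query(f"SELECT * FROM `{TABLE_ID}`").result():
    print(dict(row))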

If you’re interested in learning more about working with BigQuery data in Google Sheets, you can watch a video tutorial.

I hope this helps!


Video Tutorial: How to read data from Google Sheet Using Big Query 


How to upgrade GCP PostgreSQL from an old version to a new version by using Database Migration Service (DMS) in GCP

Topic: How to upgrade GCP PostgreSQL from an old version to a new version by using Database Migration Service (DMS) in GCP.


Here’s a step-by-step guide to upgrading GCP PostgreSQL from an old version to a new version by using Database Migration Service (DMS) in GCP:
1.       Create a service account in the Google Cloud Console with the necessary permissions to access your GCP project, PostgreSQL instance, and DMS. You can follow the instructions provided in the official documentation.
2.       Install the Google Cloud SDK on your local machine. You can download it from the official website.
3.       Authenticate your Google Cloud SDK by running the following command in your terminal:

gcloud auth login

4.       Set your project ID by running the following command in your terminal:

gcloud config set project <project-id>

5.       Create a migration job by running the following command in your terminal:

gcloud dms create-migration <migration-name> --source_database_host=<source-host> --source_database_port=<source-port> --destination_database_host=<destination-host> --destination_database_port=<destination-port> --reverse_ssh_tunnel=<True/False>
 
Replace <migration-name> with a name for your migration job, <source-host> and <source-port> with the hostname and port number of your source PostgreSQL instance, and <destination-host> and <destination-port> with the hostname and port number of your destination PostgreSQL instance. Set --reverse_ssh_tunnel to True if you need to use a reverse SSH tunnel for connectivity.
 
6.       Start the migration job by running the following command in your terminal:

gcloud dms start-migration <migration-name>

7.       Monitor the migration status by running the following command in your terminal:

gcloud dms describe-migration <migration-name>
 
8.       Verify the migration by checking that all data has been migrated successfully to the new PostgreSQL instance; a quick row-count comparison like the sketch below can help.
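
As a quick sanity check for that last step, here is a small Python sketch that compares row counts for one table between the old and new instances using psycopg2. The hostnames, credentials, and table name are placeholders rather than values from this guide.

import psycopg2

# Placeholder connection details -- replace with your own source and destination instances.
SOURCE = dict(host="<source-host>", dbname="mydb", user="postgres", password="<password>")
TARGET = dict(host="<destination-host>", dbname="mydb", user="postgres", password="<password>")
TABLE = "public.orders"  # hypothetical table to spot-check

def row_count(conn_params):
    # Connect, count the rows in the table, then close the connection.
    conn = psycopg2.connect(**conn_params)
    try:
        with conn.cursor() as cur:
            cur.execute(f"SELECT count(*) FROM {TABLE}")
            return cur.fetchone()[0]
    finally:
        conn.close()

src, dst = row_count(SOURCE), row_count(TARGET)
print(f"source={src} destination={dst} match={src == dst}")
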
That’s it! You should now have upgraded your GCP PostgreSQL instance from an old version to a new version using DMS.
I hope this helps! Let me know if you have any questions.

Video Demo: How to upgrade GCP Postgresql from old version to new version by using Database Migration DMS in GCP

How to Get the List of all Files with Size, Modified and Path from GCS Bucket and Load into BigQuery

Topic: How to Get the List of all Files with Size, Modified and Path from GCS Bucket and Load into BigQuery.

In this post, you will learn how to get the list of all files with their size, modified date, and path from a Google Cloud Storage (GCS) bucket and load them into BigQuery.

We will guide you through the process step by step, starting with setting up the necessary permissions and credentials for accessing GCS and BigQuery.

By the end of this post, you will clearly understand how to extract file metadata from a GCS bucket, load it into BigQuery, and leverage its powerful querying capabilities for further analysis.

If you’re a data engineer, data analyst, or anyone working with large datasets in Google Cloud Platform, this post is for you! 

Script: 

function listFolderContents() {
  // Provide the name of the folder whose files you want to list.
  var foldername = 'Final Logos';
  var ListOfFiles = 'ListOfFiles_' + foldername;

  // Find the folder and get an iterator over its files.
  var folders = DriveApp.getFoldersByName(foldername);
  var folder = folders.next();
  var contents = folder.getFiles();

  // Create a new spreadsheet and write a header row.
  var ss = SpreadsheetApp.create(ListOfFiles);
  var sheet = ss.getActiveSheet();
  sheet.appendRow(['name', 'link', 'sizeInMB']);

  // Append one row per file with its name, URL, and size in MB.
  var var_file;
  var var_name;
  var var_link;
  var var_size;
  while (contents.hasNext()) {
    var_file = contents.next();
    var_name = var_file.getName();
    var_link = var_file.getUrl();
    var_size = var_file.getSize() / 1024.0 / 1024.0;
    sheet.appendRow([var_name, var_link, var_size]);
  }
}
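
The Apps Script above covers the Google Drive variant of this task. For the GCS-to-BigQuery flow the post describes, here is a minimal Python sketch using the google-cloud-storage and google-cloud-bigquery clients; the bucket and table names are placeholders, and it assumes the destination table already exists with matching columns.

from google.cloud import bigquery, storage

# Placeholder names -- replace with your own bucket and table.
BUCKET = "my-bucket"
TABLE_ID = "my-project.my_dataset.gcs_file_inventory"

storage_client = storage.Client()
bq_client = bigquery.Client()

# Collect name, size (MB), last-modified timestamp, and full path for every object.
rows = [
    {
        "name": blob.name,
        "size_mb": blob.size / 1024.0 / 1024.0,
        "modified": blob.updated.isoformat(),
        "path": f"gs://{BUCKET}/{blob.name}",
    }
    for blob in storage_client.list_blobs(BUCKET)
]

# Stream the metadata rows into the existing BigQuery table.
errors = bq_client.insert_rows_json(TABLE_ID, rows)
print(errors if errors else f"Loaded {len(rows)} file records into {TABLE_ID}")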

That’s it! You should now have a BigQuery table with all the files from your GCS bucket along with their size, modified date, and path.

 

Video Demo: How to Get the List of all Files with Size, Modified and Path from GCS Bucket and Load into BigQuery.

How to use Bing AI to write SQL Server Queries as SQL Server Developer or SQL Server DBA

Topic: How to use Bing AI to write SQL Server Queries as SQL Server Developer or SQL Server DBA.

Introduction

Bing AI is a powerful tool that can help you generate SQL queries quickly and easily. Whether you’re a SQL Server Developer or a SQL Server DBA, Bing AI can help you write complex SQL queries in seconds, even if you only have a basic knowledge of SQL. In this blog post, we’ll explore how to use Bing AI to generate SQL queries.

Prerequisites

Before we begin, make sure you have the following:

  • A Bing account.
  • A basic understanding of SQL.

Step-by-Step Guide

  1. Sign in to Bing: Sign in to your Bing account.

  2. Open Bing AI: Open Bing AI by clicking on the “AI” button in the top right corner of the screen.

  3. Select “SQL Query”: Select “SQL Query” from the list of options.

  4. Enter your query: Enter your query in plain English. For example, “Show me all customers who have made a purchase in the last 30 days.”

  5. Click “Generate”: Click the “Generate” button to generate your SQL query.

  6. Review your query: Review your query to make sure it’s correct.

  7. Copy and paste your query: Copy and paste your query into your SQL editor.

  8. Execute your query: Execute your query to retrieve the data you need (one way to run it programmatically is sketched below).
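
For example, for the prompt in step 4 the generated query might look something like the T-SQL below; the table and column names are hypothetical, and the snippet simply shows one way to run such a query against SQL Server from Python with pyodbc.

import pyodbc

# Hypothetical schema: a Customers table and an Orders table with an OrderDate column.
SQL = """
SELECT DISTINCT c.CustomerID, c.CustomerName
FROM dbo.Customers AS c
JOIN dbo.Orders AS o ON o.CustomerID = c.CustomerID
WHERE o.OrderDate >= DATEADD(day, -30, GETDATE());
"""

# Placeholder connection string -- adjust the driver, server, and database for your environment.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=SalesDB;Trusted_Connection=yes;"
)
try:
    cursor = conn.cursor()
    for row in cursor.execute(SQL):
        print(row.CustomerID, row.CustomerName)
finally:
    conn.close()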

And that’s it! With these simple steps, you can easily generate SQL queries using Bing AI.

Conclusion

In this blog post, we explored how to use Bing AI for generating SQL queries. We hope this guide has been helpful in getting you started with using Bing AI for generating SQL queries.


Video Demo: How to use Bing AI to write SQL Server Queries as SQL Server Developer or SQL Server DBA


How to Read Data from GCS Google Cloud Storage Bucket to Azure Blob Storage

Topic: How to Read Data from GCS (Google Cloud Storage) Bucket to Azure Blob Storage.

Introduction

Google Cloud Storage (GCS) and Azure Blob Storage are two of the most popular cloud storage solutions available today. While both platforms offer similar functionality, there may be times when you need to transfer data from GCS to Azure Blob Storage. In this blog post, we’ll explore how to do just that using Azure Data Factory.

Prerequisites

Before we begin, make sure you have the following:

  • A Google Cloud Storage bucket with data you want to transfer.
  • An Azure Blob Storage account where you want to transfer the data.
  • An Azure Data Factory instance.

Step-by-Step Guide

  1. Create a Linked Service for GCS: In the Azure Data Factory portal, create a new linked service for GCS. You’ll need to provide your GCP project ID, service account email, and private key.

  2. Create a Linked Service for Azure Blob Storage: Next, create a linked service for Azure Blob Storage. You’ll need to provide your storage account name and access key.

  3. Create a Dataset for GCS: Create a new dataset for GCS by specifying the path to your bucket and any other relevant details.

  4. Create a Dataset for Azure Blob Storage: Similarly, create a new dataset for Azure Blob Storage by specifying the path to your container and any other relevant details.

  5. Create a Pipeline: Create a new pipeline in your Azure Data Factory instance. Add two activities: one for copying data from GCS to Azure Blob Storage and another for deleting the data from GCS after it has been copied.

  6. Configure the Copy Activity: In the copy activity, specify the source dataset (GCS) and destination dataset (Azure Blob Storage). You can also configure other settings such as compression and encryption.

  7. Configure the Delete Activity: In the delete activity, specify the dataset (GCS) that contains the data you want to delete.

  8. Run the Pipeline: Finally, run the pipeline to transfer data from GCS to Azure Blob Storage.

And that’s it! With these simple steps, you can easily transfer data from Google Cloud Storage to Azure Blob Storage using Azure Data Factory.
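
As a point of comparison only (this is not part of the ADF pipeline described above), the same copy can also be scripted directly with the google-cloud-storage and azure-storage-blob client libraries. This is a minimal sketch, and the bucket, container, and connection-string values are placeholders.

from google.cloud import storage
from azure.storage.blob import BlobServiceClient

# Placeholder values -- replace with your own bucket, container, and connection string.
GCS_BUCKET = "my-gcs-bucket"
AZURE_CONN_STR = "<azure-storage-connection-string>"
AZURE_CONTAINER = "my-container"

gcs_bucket = storage.Client().bucket(GCS_BUCKET)
container = BlobServiceClient.from_connection_string(AZURE_CONN_STR).get_container_client(AZURE_CONTAINER)

# Copy every object from the GCS bucket into the Azure container, keeping the same name.
for blob in gcs_bucket.list_blobs():
    data = blob.download_as_bytes()
    container.upload_blob(name=blob.name, data=data, overwrite=True)
    print(f"Copied {blob.name} ({len(data)} bytes)")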

Conclusion

In this blog post, we explored how to transfer data from Google Cloud Storage to Azure Blob Storage using Azure Data Factory. We hope this guide has been helpful in getting you started with transferring data between these two cloud storage solutions.


Video Demo: How to Read Data from GCS  Google Cloud Storage Bucket to Azure Blob Storage


How to Read the Data from BigQuery Table and Write to CSV File in Blob Storage

Topic: How to Read the Data from BigQuery Table and Write to CSV File in Blob Storage.

Google BigQuery is a cloud-based data warehouse that allows you to store and analyze large datasets. In this tutorial, we will learn how to read data from a BigQuery table and write it to a CSV file in Azure Blob Storage.
Prerequisites

  • A Google Cloud Platform (GCP) account with BigQuery enabled.
  • An Azure account with Blob Storage enabled.
  • A dataset and table in BigQuery with data you want to export.
Steps
  1. Open the BigQuery console.
  2. In the navigation pane, select your project and dataset.
  3. Click on the table you want to export.
  4. Click on the Export button at the top of the page.
  5. In the Export to section, select Cloud Storage.
  6. In the Destination URI field, enter the Cloud Storage path for the exported file, for example: gs://my-bucket/my-folder/my-file.csv. (BigQuery exports to Google Cloud Storage; from there you can copy the CSV to Azure Blob Storage, as sketched below.)
  7. In the Export format section, select CSV.
  8. Click on the Export button.
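
Because the built-in export writes to Google Cloud Storage, the Azure side needs one extra hop. Here is a minimal Python sketch of the whole round trip using the google-cloud-bigquery, google-cloud-storage, and azure-storage-blob clients; the table, bucket, container, and connection-string values are placeholders.

from google.cloud import bigquery, storage
from azure.storage.blob import BlobServiceClient

# Placeholder identifiers -- replace with your own values.
TABLE_ID = "my-project.my_dataset.my_table"
GCS_URI = "gs://my-bucket/my-folder/my-file.csv"
AZURE_CONN_STR = "<azure-storage-connection-string>"
AZURE_CONTAINER = "my-container"

# 1) Export the BigQuery table to a CSV file in Cloud Storage (CSV is the default format).
bq_client = bigquery.Client()
bq_client.extract_table(TABLE_ID, GCS_URI).result()  # wait for the export job to finish

# 2) Download the exported CSV from Cloud Storage.
bucket_name, blob_name = GCS_URI[len("gs://"):].split("/", 1)
csv_bytes = storage.Client().bucket(bucket_name).blob(blob_name).download_as_bytes()

# 3) Upload the CSV to Azure Blob Storage.
blob_client = BlobServiceClient.from_connection_string(AZURE_CONN_STR).get_blob_client(
    container=AZURE_CONTAINER, blob="my-file.csv"
)
blob_client.upload_blob(csv_bytes, overwrite=True)
print("Export and copy complete")
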
Conclusion
That’s it! Your data is now exported from BigQuery to Cloud Storage as a CSV file and copied over to Azure Blob Storage.
In this tutorial, we learned how to read data from a BigQuery table and write it to a CSV file in Azure Blob Storage. This can be useful when you want to share data with others or use it in other applications.
If you have any questions or comments, please leave them below. Thanks for reading!


Video Tutorial: How to Read the Data from BigQuery Table and Write to CSV File in Blob Storage




How to Activate and Deactivate Activities in Azure Data Factory Step-by-Step

 

Topic: How to Activate and Deactivate Activities in Azure Data Factory Step-by-Step.

Azure Data Factory (ADF) is a cloud-based data integration service that allows you to create, schedule, and manage data pipelines. In ADF, you can activate and deactivate activities in your pipelines to control their behavior during pipeline execution. This feature is useful when you want to temporarily disable an activity without deleting it from the pipeline.

In this tutorial, we will learn how to activate and deactivate activities in ADF step-by-step. We will cover the following topics:

  1. Why activate and deactivate activities?
  2. How to activate and deactivate activities in ADF?
  3. Best practices for using this feature.

Why activate and deactivate activities?

Activating and deactivating activities in ADF can help you achieve the following goals:

  • Efficient pipeline development: You can comment out part of the pipeline without deleting it from the canvas, which significantly improves pipeline developer efficiency.
  • Flexible pipeline execution: You can skip one or more activities during validation and pipeline run, which provides more flexibility in pipeline execution.
  • Easy debugging: You can debug your pipeline by deactivating certain activities and running the pipeline with only the active activities.

How to activate and deactivate activities in ADF?

To activate or deactivate an activity in ADF, follow these steps:

  1. Open your pipeline in the ADF authoring UI.
  2. Select the activity you want to activate or deactivate.
  3. In the General tab of the activity settings, set the Activity state to either Active or Inactive.
  4. If you set the Activity state to Inactive, choose a state for Mark activity as (Succeeded, Failed, or Skipped).
  5. Save your changes.

You can also deactivate multiple activities at once by selecting them with your mouse and choosing Deactivate from the drop-down menu.

Best practices for using this feature

Here are some best practices for using the activate/deactivate feature in ADF:

  • Use this feature when you want to temporarily disable an activity without deleting it from the pipeline.
  • Comment out part of the pipeline that is not yet complete or needs further development.
  • Use this feature when you want to debug your pipeline by running it with only certain activities active.

That’s it! Now you know how to activate and deactivate activities in Azure Data Factory step-by-step. If you have any questions or comments, please leave them below.


Video Tutorial: How to Activate and Deactivate Activities in Azure Data Factory Step-by-Step


How to find out Who has created the GCP project in Google Cloud Platform

 Topic: How to find out Who has created the GCP project in the Google Cloud Platform

In this article, you will learn how to find out who has created a GCP project in the Google Cloud Platform.

To find out who has created a GCP project in the Google Cloud Platform, you can review all activity on the Activity tab of your project’s home page. You can also use Cloud Logging (formerly Stackdriver) audit logs and create a sink that exports the logs you want to inspect into BigQuery.
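
If you prefer to query the audit logs programmatically, here is a small Python sketch using the google-cloud-logging client. The filter on protoPayload.methodName="CreateProject" is an assumption based on the Resource Manager Admin Activity audit log format, so adjust it if your entries look different.

from google.cloud import logging

client = logging.Client(project="<project-id>")  # placeholder project ID

# Assumed filter: the Admin Activity audit log entry written when a project is created.
FILTER = (
    'logName:"cloudaudit.googleapis.com%2Factivity" '
    'AND protoPayload.methodName="CreateProject"'
)

for entry in client.list_entries(filter_=FILTER):
    # Audit log payloads are dict-like; guard in case the payload shape differs.
    payload = entry.payload if isinstance(entry.payload, dict) else {}
    creator = payload.get("authenticationInfo", {}).get("principalEmail", "unknown")
    print(entry.timestamp, creator)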

I hope this helps! Let me know if you have any other questions.


Video Demo: How to find out Who has created the GCP project in Google Cloud Platform


Unroll Multiple Arrays from JSON File in a Single Flatten Step in Azure Data Factory | ADF Tutorial

 Topic: Unroll Multiple Arrays from JSON File in a Single Flatten Step in Azure Data Factory.

Here is a tutorial on how to unroll multiple arrays from a JSON file in a single Flatten step in Azure Data Factory.

I hope this helps! Let me know if you have any other questions.


Video Demo: Unroll Multiple Arrays from JSON File in a Single Flatten Step in Azure Data Factory.


SQL Queries and Google BARD AI -Testing Bard for SQL Queries, ADF and Python Artificial Intelligence

Topic: SQL Queries and Google BARD AI -Testing Bard for SQL Queries, ADF and Python Artificial Intelligence.

Google Bard is an experimental Google chatbot that is powered by the LaMDA large language model. It’s a generative AI that accepts prompts and performs text-based tasks like providing answers and summaries and creating various forms of content.

 

I hope this helps! Let me know if you have any other questions.


Video Demo: SQL Queries and Google BARD AI -Testing Bard for SQL Queries, ADF and Python Artificial Intelligence


How to perform CDC from PostgreSQL to BigQuery by using Datastream in Google Cloud Platform

Topic: How to perform CDC from PostgreSQL to BigQuery by using Datastream in Google Cloud Platform.

To perform CDC (change data capture) from PostgreSQL to BigQuery by using Datastream in Google Cloud Platform, you can build a simple, end-to-end, cloud-native solution that replicates your changed data into BigQuery using Datastream.
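
Once the Datastream stream is running, a quick way to confirm that changes are landing is to query the replicated table from Python; this is only a sanity-check sketch, and the table name is a placeholder.

from google.cloud import bigquery

client = bigquery.Client()

# Placeholder: the BigQuery table that Datastream replicates into.
TABLE_ID = "my-project.my_dataset.public_orders"

row = next(iter(client.query(f"SELECT COUNT(*) AS row_count FROM `{TABLE_ID}`").result()))
print(f"{TABLE_ID} currently has {row.row_count} rows")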


Here is a video tutorial on how to perform CDC from PostgreSQL to BigQuery by using Datastream in Google Cloud Platform.

I hope this helps! Let me know if you have any other questions.


Video Demo: How to perform CDC from PostgreSQL to Big Query by using DataStream in Google Cloud Platform



Set Pipeline Return Value in Azure Data Factory-How to Pass Values between Two ADF Pipelines

 Topic: Set Pipeline Return Value in Azure Data Factory-How to Pass Values between Two ADF Pipelines


To set a pipeline return value in Azure Data Factory and pass values between two ADF pipelines, you can use the Set Variable activity to return values from the child pipeline to the calling pipeline.

The pipeline return value is set with the Set Variable activity, which has been enhanced so that you can now choose either a user-defined variable (previously the only option) or the new “Pipeline return value (preview)” option.

To use a Set Variable activity in a pipeline, select the background of the pipeline canvas and use the Variables tab to add a variable. Search for Set Variable in the pipeline Activities pane, and drag a Set Variable activity to the pipeline canvas. In the calling pipeline, the returned keys can then be read from the Execute Pipeline activity’s output (for example, with an expression such as @activity('Execute Child Pipeline').output.pipelineReturnValue.myKey, where the activity and key names are your own).

I hope this helps! Let me know if you have any other questions.


Video Demo: Set Pipeline Return Value in Azure Data Factory-How to Pass Values between Two ADF Pipelines



How to Use White or Dark Theme in Azure Data Factory Studio | Azure Data Factory Tutorial 2023

 Topic: How to Use White or Dark Theme in Azure Data Factory Studio  | Azure Data Factory Tutorial 2023


In this article, we are going to learn how to use a white or dark theme in Azure Data Factory Studio.
To change the look of Azure Data Factory Studio, you can choose between a white or dark theme. Here are the steps to change the theme:
  1. Open Azure Data Factory studio.
  2. Look for the toggle button that allows you to select your preferred theme.
  3. Click on the toggle button to switch between the white and dark themes.
Once you have made your selection, click on the OK button to apply the changes.
Please note that these settings control the look of your data factory. If you encounter any issues or limitations, it’s recommended to refer to official documentation or reach out to Microsoft support for further assistance.



Video Demo: How to Use White or Dark Theme in Azure Data Factory Studio.

How to upgrade GCP MySQL Instance by using Database Migration Service-Reduce GCP MySQL Instance Disk

 Topic: How to upgrade GCP MySQL Instance by using Database Migration Service-Reduce GCP MySQL Instance Disk.


To upgrade the database major version of your Google Cloud Platform MySQL instance by migrating your data, you can use the Database Migration Service (DMS). Here is a quickstart guide that explains how to migrate a database to Cloud SQL for MySQL by using Database Migration Service.

I hope this helps! Let me know if you have any other questions.


Video Demo: How to upgrade GCP MySQL Instance by using Database Migration Service-Reduce GCP MySQL Instance Disk



How to use List or Container Monitoring View in Azure Data Factory Studio

 Topic: How to use List or Container Monitoring View in Azure Data Factory Studio.


The List or Container monitoring view in Azure Data Factory Studio is used to monitor the status of your data factory pipelines and activities. You can monitor all of your pipeline runs natively in the Azure Data Factory user experience. To open the monitoring experience, open Azure Data Factory Studio and select the Monitor tab.

Here is a video tutorial that explains how to use List or Container Monitoring View in Azure Data Factory Studio.

I hope this helps! Let me know if you have any other questions.


Video Demo: How to use List or Container Monitoring View in Azure Data Factory Studio

How to Connect MySQL Workbench or Heidi SQL to Google Cloud SQL Locally Using Cloud SQL Proxy

Topic: How to Connect MySQL Workbench or Heidi SQL to Google Cloud SQL Locally Using Cloud SQL Proxy.

In this article, we are going to learn how to connect MySQL Workbench or HeidiSQL to Google Cloud SQL locally using the Cloud SQL Proxy.
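
Once the Cloud SQL Auth Proxy is running locally and listening on 127.0.0.1:3306, the same endpoint that Workbench or HeidiSQL points at can also be sanity-checked from Python. This sketch uses mysql-connector-python, and the credentials are placeholders.

import mysql.connector

# Assumes the Cloud SQL Auth Proxy is already running locally on port 3306,
# so the instance is reachable at 127.0.0.1 just like from Workbench or HeidiSQL.
conn = mysql.connector.connect(
    host="127.0.0.1",
    port=3306,
    user="<db-user>",          # placeholder credentials
    password="<db-password>",
)
try:
    cursor = conn.cursor()
    cursor.execute("SELECT VERSION()")
    print("Connected through the proxy, MySQL version:", cursor.fetchone()[0])
finally:
    conn.close()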


Video Demo: How to Connect MySQL Workbench or Heidi SQL to Google Cloud SQL Locally Using Cloud SQL Proxy