TechBrothersIT is a blog and YouTube channel for learning and sharing information, scenarios, and real-world examples about SQL Server, Transact-SQL (TSQL), SQL Server Database Administration (SQL DBA), Business Intelligence (BI), SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), Data Warehouse (DWH) concepts, Microsoft Dynamics AX, Microsoft Dynamics Lifecycle Services, and many other Microsoft technologies.
Labels
- Azure Data Factory Interview Question & Answers
- Azure Data Factory Tutorial Step by Step
- C# Scripts
- DWH INTERVIEW QUESTIONS
- Google Cloud SQL Tutorial
- Kusto Query Language (KQL) Tutorial
- MS Dynamics AX 2012 R2 Video Tutorial
- MariaDB Admin & Dev Tutorial
- MySQL / MariaDB Developer Tutorial Beginner to Advance
- MySQL DBA Tutorial Beginner to Advance
- SQL SERVER DBA INTERVIEW QUESTIONS
- SQL SERVER DBA Video Tutorial
- SQL Server / TSQL Tutorial
- SQL Server 2016
- SQL Server High Availability on Azure Tutorial
- SQL Server Scripts
- SQL Server on Linux Tutorial
- SSIS INTERVIEW QUESTIONS
- SSIS Video Tutorial
- SSRS INTERVIEW QUESTIONS
- SSRS Video Tutorial
- TSQL INTERVIEW QUESTIONS
- Team Foundation Server 2013 Video Tutorial
- Team Foundation Server 2015 Video Tutorial
- Windows 10
- Windows Server 2012 R2 Installation Videos
How to Upgrade GCP PostgreSQL from an Old Version to a New Version by Using Database Migration Service (DMS) in GCP
How to Get the List of all Files with Size, Modified and Path from GCS Bucket and Load into BigQuery
Topic: How to Get the List of all Files with Size, Modified and Path from GCS Bucket and Load into BigQuery.
We will guide you through the process step by step, starting with setting up the necessary permissions and credentials for accessing GCS and BigQuery.
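As a reference, here is a minimal Python sketch of the overall approach, using the google-cloud-storage and google-cloud-bigquery client libraries. The bucket, dataset, and table names are placeholders, and this is an illustration of the idea rather than the exact script used in the video.

```python
# Minimal sketch: list objects in a GCS bucket and load their name, size,
# last-modified timestamp, and full path into a BigQuery table.
# Bucket, project, dataset, and table names below are placeholders.
from google.cloud import storage, bigquery

bucket_name = "my-bucket"                               # placeholder
table_id = "my-project.my_dataset.gcs_file_inventory"   # placeholder

# Collect file metadata from the bucket.
storage_client = storage.Client()
rows = [
    {
        "file_name": blob.name,
        "size_bytes": blob.size,
        "last_modified": blob.updated.isoformat(),
        "file_path": f"gs://{bucket_name}/{blob.name}",
    }
    for blob in storage_client.list_blobs(bucket_name)
]

# Load the rows into BigQuery, replacing any previous inventory.
bq_client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    schema=[
        bigquery.SchemaField("file_name", "STRING"),
        bigquery.SchemaField("size_bytes", "INTEGER"),
        bigquery.SchemaField("last_modified", "TIMESTAMP"),
        bigquery.SchemaField("file_path", "STRING"),
    ],
    write_disposition="WRITE_TRUNCATE",
)
job = bq_client.load_table_from_json(rows, table_id, job_config=job_config)
job.result()  # wait for the load job to finish
print(f"Loaded {len(rows)} file records into {table_id}")
```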
That’s it! You should now have a BigQuery table with all the files from your GCS bucket along with their size, modified date, and path.
Video Demo: How to Get the List of all Files with Size, Modified and Path from GCS Bucket and Load into BigQuery.
How to use Bing AI to write SQL Server Queries as SQL Server Developer or SQL Server DBA
Topic: How to use Bing AI to write SQL Server Queries as SQL Server Developer or SQL Server DBA.
Introduction
Bing AI is a powerful tool that can help you generate SQL queries quickly and easily. Whether you’re a SQL Server Developer or a SQL Server DBA, Bing AI can help you draft complex SQL queries in seconds, even if you only have a basic knowledge of SQL. In this blog post, we’ll explore how to use Bing AI to generate SQL queries.
Prerequisites
Before we begin, make sure you have the following:
- A Bing account.
- A basic understanding of SQL.
Step-by-Step Guide
Sign in to Bing: Sign in to your Bing account.
Open Bing AI: Open Bing AI by clicking on the “AI” button in the top right corner of the screen.
Select “SQL Query”: Select “SQL Query” from the list of options.
Enter your query: Enter your query in plain English. For example, “Show me all customers who have made a purchase in the last 30 days.”
Click “Generate”: Click the “Generate” button to generate your SQL query.
Review your query: Review your query to make sure it’s correct.
Copy and paste your query: Copy and paste your query into your SQL editor.
Execute your query: Execute your query to retrieve the data you need.
And that’s it! With these simple steps, you can easily generate SQL queries using Bing AI.
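Once Bing AI has generated a query, running it works the same as running any hand-written T-SQL. As a purely illustrative sketch, here is how the example prompt above might translate into a query and be executed from Python with pyodbc; the server, database, table, and column names are all hypothetical placeholders, not something Bing AI or this post prescribes.

```python
# Hypothetical sketch: run an AI-generated T-SQL query against SQL Server.
# Server, database, table, and column names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=SalesDB;Trusted_Connection=yes;"
)

# Example of the kind of query Bing AI might generate for the prompt:
# "Show me all customers who have made a purchase in the last 30 days."
query = """
SELECT DISTINCT c.CustomerID, c.CustomerName
FROM dbo.Customers AS c
INNER JOIN dbo.Orders AS o
    ON o.CustomerID = c.CustomerID
WHERE o.OrderDate >= DATEADD(DAY, -30, GETDATE());
"""

cursor = conn.cursor()
cursor.execute(query)
for row in cursor.fetchall():
    print(row.CustomerID, row.CustomerName)

conn.close()
```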
Conclusion
In this blog post, we explored how to use Bing AI to generate SQL queries. We hope this guide helps you get started quickly.
Video Demo: How to use Bing AI to write SQL Server Queries as SQL Server Developer or SQL Server DBA
How to Read Data from GCS Google Cloud Storage Bucket to Azure Blob Storage
Topic: How to Read Data from GCS Google Cloud Storage Bucket to Azure Blob Storage.
Introduction
Google Cloud Storage (GCS) and Azure Blob Storage are two of the most popular cloud storage solutions available today. While both platforms offer similar functionality, there may be times when you need to transfer data from GCS to Azure Blob Storage. In this blog post, we’ll explore how to do just that using Azure Data Factory.
Prerequisites
Before we begin, make sure you have the following:
- A Google Cloud Storage bucket with data you want to transfer.
- An Azure Blob Storage account where you want to transfer the data.
- An Azure Data Factory instance.
Step-by-Step Guide
Create a Linked Service for GCS: In the Azure Data Factory portal, create a new linked service for GCS. You’ll need to provide your GCP project ID, service account email, and private key.
Create a Linked Service for Azure Blob Storage: Next, create a linked service for Azure Blob Storage. You’ll need to provide your storage account name and access key.
Create a Dataset for GCS: Create a new dataset for GCS by specifying the path to your bucket and any other relevant details.
Create a Dataset for Azure Blob Storage: Similarly, create a new dataset for Azure Blob Storage by specifying the path to your container and any other relevant details.
Create a Pipeline: Create a new pipeline in your Azure Data Factory instance. Add two activities: one for copying data from GCS to Azure Blob Storage and another for deleting the data from GCS after it has been copied.
Configure the Copy Activity: In the copy activity, specify the source dataset (GCS) and destination dataset (Azure Blob Storage). You can also configure other settings such as compression and encryption.
Configure the Delete Activity: In the delete activity, specify the dataset (GCS) that contains the data you want to delete.
Run the Pipeline: Finally, run the pipeline to transfer data from GCS to Azure Blob Storage.
And that’s it! With these simple steps, you can easily transfer data from Google Cloud Storage to Azure Blob Storage using Azure Data Factory.
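If you want to sanity-check the transfer outside of Data Factory, or just see what the copy activity is doing behind the scenes, here is a rough Python sketch that performs the same GCS-to-Blob copy directly with the google-cloud-storage and azure-storage-blob SDKs. The bucket, container, and connection-string values are placeholders, and this is an alternative illustration, not the ADF pipeline itself.

```python
# Illustrative alternative to the ADF copy activity: copy every object from a
# GCS bucket into an Azure Blob Storage container using the client SDKs.
# Bucket, container, and connection-string values are placeholders.
from google.cloud import storage
from azure.storage.blob import BlobServiceClient

gcs_bucket_name = "my-gcs-bucket"                          # placeholder
azure_conn_str = "<your-azure-storage-connection-string>"  # placeholder
azure_container = "my-container"                           # placeholder

gcs_client = storage.Client()
blob_service = BlobServiceClient.from_connection_string(azure_conn_str)
container_client = blob_service.get_container_client(azure_container)

for gcs_blob in gcs_client.list_blobs(gcs_bucket_name):
    data = gcs_blob.download_as_bytes()   # read the object from GCS
    container_client.upload_blob(
        name=gcs_blob.name,               # keep the same relative path
        data=data,
        overwrite=True,
    )
    print(f"Copied gs://{gcs_bucket_name}/{gcs_blob.name} "
          f"-> {azure_container}/{gcs_blob.name}")
```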
Conclusion
In this blog post, we explored how to transfer data from Google Cloud Storage to Azure Blob Storage using Azure Data Factory. We hope this guide has been helpful in getting you started with transferring data between these two cloud storage solutions.
Video Demo: How to Read Data from GCS Google Cloud Storage Bucket to Azure Blob Storage
How to Read the Data from BigQuery Table and Write to CSV File in Blob Storage
Google BigQuery is a cloud-based data warehouse that allows you to store and analyze large datasets. In this tutorial, we will learn how to read data from a BigQuery table and write it to a CSV file in Azure Blob Storage.
Prerequisites
Before we begin, make sure you have the following:
- A Google Cloud Platform (GCP) account with BigQuery enabled.
- An Azure account with Blob Storage enabled.
- A dataset and table in BigQuery with data you want to export.
Step-by-Step Guide
- Open the BigQuery console.
- In the navigation pane, select your project and dataset.
- Click on the table you want to export.
- Click on the Export button at the top of the page.
- In the Export to section, select Cloud Storage.
- In the Destination URI field, enter the Google Cloud Storage path where BigQuery should write the CSV file, for example gs://my-bucket/my-folder/my-file.csv. (BigQuery exports to a Cloud Storage bucket; from there the file can be copied to Azure Blob Storage, for example with Azure Data Factory or AzCopy.)
- In the Export format section, select CSV.
- Click on the Export button.
That’s it! Your data is now exported from BigQuery as a CSV file and ready to be copied into Azure Blob Storage.
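For a fully scripted version of the same flow, here is a rough Python sketch that extracts the table to a CSV file in Cloud Storage and then copies that file into Azure Blob Storage. The project, dataset, table, bucket, and container names are placeholders, and this is an illustration rather than the exact approach from the video.

```python
# Sketch: export a BigQuery table to CSV in GCS, then copy the CSV to Azure Blob.
# Project, dataset, table, bucket, and container names are placeholders.
from google.cloud import bigquery, storage
from azure.storage.blob import BlobServiceClient

table_id = "my-project.my_dataset.my_table"                # placeholder
gcs_bucket = "my-bucket"                                   # placeholder
gcs_object = "exports/my_table.csv"                        # placeholder
azure_conn_str = "<your-azure-storage-connection-string>"  # placeholder
azure_container = "my-container"                           # placeholder

# 1. Export the BigQuery table to a CSV file in Cloud Storage.
bq_client = bigquery.Client()
destination_uri = f"gs://{gcs_bucket}/{gcs_object}"
extract_job = bq_client.extract_table(table_id, destination_uri)
extract_job.result()  # wait for the export to finish

# 2. Download the CSV from GCS and upload it to Azure Blob Storage.
gcs_client = storage.Client()
csv_bytes = gcs_client.bucket(gcs_bucket).blob(gcs_object).download_as_bytes()

blob_service = BlobServiceClient.from_connection_string(azure_conn_str)
blob_service.get_blob_client(azure_container, "my_table.csv").upload_blob(
    csv_bytes, overwrite=True
)
print(f"Exported {table_id} and copied it to {azure_container}/my_table.csv")
```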
In this tutorial, we learned how to read data from a BigQuery table and write it to a CSV file in Azure Blob Storage. This can be useful when you want to share data with others or use it in other applications.
If you have any questions or comments, please leave them below. Thanks for reading!
How to Activate and Deactivate Activities in Azure Data Factory Step-by-Step
Azure Data Factory (ADF) is a cloud-based data integration service that allows you to create, schedule, and manage data pipelines. In ADF, you can activate and deactivate activities in your pipelines to control their behavior during pipeline execution. This feature is useful when you want to temporarily disable an activity without deleting it from the pipeline.
In this tutorial, we will learn how to activate and deactivate activities in ADF step-by-step. We will cover the following topics:
- Why activate and deactivate activities?
- How to activate and deactivate activities in ADF?
- Best practices for using this feature.
Why activate and deactivate activities?
Activating and deactivating activities in ADF can help you achieve the following goals:
- Efficient pipeline development: You can comment out part of the pipeline without deleting it from the canvas, which significantly improves pipeline developer efficiency.
- Flexible pipeline execution: You can skip one or more activities during validation and pipeline run, which provides more flexibility in pipeline execution.
- Easy debugging: You can debug your pipeline by deactivating certain activities and running the pipeline with only the active activities.
How to activate and deactivate activities in ADF?
To activate or deactivate an activity in ADF, follow these steps:
- Open your pipeline in the ADF authoring UI.
- Select the activity you want to activate or deactivate.
- In the General tab of the activity settings, set the Activity state to either Active or Inactive.
- If you set the Activity state to Inactive, choose a state for Mark activity as (Succeeded, Failed, or Skipped).
- Save your changes.
You can also deactivate multiple activities at once by selecting them with your mouse and choosing Deactivate from the drop-down menu.
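For reference, deactivating an activity is also reflected in the pipeline’s JSON definition. The snippet below is a rough illustration of what that can look like, written as a Python dict with placeholder names; the exact property names should be verified against the JSON view of your own pipeline in the ADF authoring UI.

```python
# Rough illustration (not an official schema): what an activity definition can
# look like in the pipeline JSON after it has been deactivated, expressed here
# as a Python dict with placeholder names. Verify the property names against
# the JSON view of your own pipeline.
deactivated_activity = {
    "name": "Copy_Staging_Data",      # placeholder activity name
    "type": "Copy",
    "state": "Inactive",              # the Active / Inactive toggle from the General tab
    "onInactiveMarkAs": "Succeeded",  # how the skipped activity is reported: Succeeded, Failed, or Skipped
    "typeProperties": {
        # the activity's normal settings remain in place while it is inactive
    },
}
```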
Best practices for using this feature
Here are some best practices for using the activate/deactivate feature in ADF:
- Use this feature when you want to temporarily disable an activity without deleting it from the pipeline.
- Comment out part of the pipeline that is not yet complete or needs further development.
- Use this feature when you want to debug your pipeline by running it with only certain activities active.
That’s it! Now you know how to activate and deactivate activities in Azure Data Factory step-by-step. If you have any questions or comments, please leave them below.