Airflow on GCP: End to End Data Pipeline with Cloud Composer, BigQuery & GCS
Author: CK Data Tech
Uploaded: Premiered Dec 9, 2024
Views: 2,465
In this comprehensive tutorial, we will learn how to build and manage an end-to-end data pipeline with Cloud Composer, BigQuery, and Google Cloud Storage (GCS).
In this video, we'll dive into the world of data engineering and take a closer look at how to design, build, and orchestrate data pipelines using Cloud Composer.
You'll discover how to extract, transform, and load data from various sources into BigQuery, and how to store and process large datasets with GCS. Whether you're a data engineer, a data scientist, or just starting out in data analytics, this video will give you the knowledge and skills you need to master end-to-end data pipelines.
Video Covers:
Overview of how to create a Cloud Composer environment
Overview of the Airflow UI within Cloud Composer
Overview of how to create a BigQuery dataset
Overview of how to create a GCS bucket
How to create and upload a DAG to Airflow in Composer
A DAG that generates data, stores it in GCS, and then loads it into BigQuery (see the sketch after this list)
How to transform BigQuery data after ingestion
How to trigger an Airflow DAG in Composer
How to monitor tasks within Airflow in the Composer environment
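The DAG shown in the video follows this generate -> GCS -> BigQuery -> transform pattern. Below is a minimal sketch of what such a DAG could look like, not the exact code from the linked repo; the project, bucket, dataset, and table names (my-composer-demo-bucket, demo_dataset, raw_orders, clean_orders) are placeholders, and the operators come from the standard Google provider package that ships with Cloud Composer.

# Minimal sketch of a generate -> GCS -> BigQuery -> transform DAG.
# Bucket, dataset, and table names below are hypothetical placeholders.
import csv
import io
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.hooks.gcs import GCSHook
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

BUCKET = "my-composer-demo-bucket"   # hypothetical GCS bucket
DATASET = "demo_dataset"             # hypothetical BigQuery dataset
RAW_TABLE = "raw_orders"
CLEAN_TABLE = "clean_orders"
OBJECT_NAME = "data/orders.csv"


def generate_and_upload_to_gcs(**_):
    """Generate a small CSV in memory and upload it to GCS."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["order_id", "amount"])
    writer.writerows([(1, 10.5), (2, 23.0), (3, 7.25)])
    GCSHook().upload(
        bucket_name=BUCKET,
        object_name=OBJECT_NAME,
        data=buf.getvalue(),
        mime_type="text/csv",
    )


with DAG(
    dag_id="gcs_to_bigquery_elt",
    start_date=datetime(2024, 12, 1),
    schedule_interval=None,  # triggered manually from the Airflow UI
    catchup=False,
) as dag:
    generate_data = PythonOperator(
        task_id="generate_data",
        python_callable=generate_and_upload_to_gcs,
    )

    # Load the raw CSV from GCS into a BigQuery staging table.
    load_to_bq = GCSToBigQueryOperator(
        task_id="load_to_bigquery",
        bucket=BUCKET,
        source_objects=[OBJECT_NAME],
        destination_project_dataset_table=f"{DATASET}.{RAW_TABLE}",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform inside BigQuery after ingestion (the "T" in ELT).
    transform = BigQueryInsertJobOperator(
        task_id="transform_in_bigquery",
        configuration={
            "query": {
                "query": f"""
                    CREATE OR REPLACE TABLE {DATASET}.{CLEAN_TABLE} AS
                    SELECT order_id, ROUND(amount, 2) AS amount
                    FROM {DATASET}.{RAW_TABLE}
                    WHERE amount IS NOT NULL
                """,
                "useLegacySql": False,
            }
        },
    )

    generate_data >> load_to_bq >> transform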
In short, this video covers how to orchestrate an end-to-end ELT (Extract, Load, Transform) pipeline on GCP (Google Cloud Platform) with Airflow running on the managed Cloud Composer service.
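Getting the DAG into Composer and triggering it can be done in several ways; one option is to copy the DAG file into the Composer environment's DAGs bucket with the Cloud Storage client and then start a run from the Airflow UI or gcloud. This is only a sketch: the project ID and bucket name below are placeholders, and the real DAGs bucket name is shown in the Composer environment details.

# Sketch: copy the DAG file into the Composer environment's DAGs bucket.
# Project ID and bucket name are hypothetical placeholders.
from google.cloud import storage

client = storage.Client(project="my-gcp-project")
dags_bucket = client.bucket("us-central1-my-env-bucket")  # Composer-managed bucket
dags_bucket.blob("dags/gcs_to_bigquery_elt.py").upload_from_filename(
    "gcs_to_bigquery_elt.py"
)
# Composer syncs the dags/ folder automatically; once the DAG appears in the
# Airflow UI it can be triggered there, or from the command line, e.g.:
#   gcloud composer environments run <ENV> --location <REGION> dags trigger -- gcs_to_bigquery_elt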
Supporting Videos:
Create Cloud Composer in GCP - • How To CREATE Composer Environment in...
GitHub Source Code:
https://github.com/kalekem/dbt_tutori...
#composer #gcp #airflow #elt #etl #datapipeline #cloudcomputing #cloudengineering #dataengineering #datalake #datawarehousing #python
