
The Complete Hands-on Introduction to Airbyte

Get started with Airbyte and learn how to use it with Apache Airflow, Snowflake, dbt and more

What you'll learn
  • Understand what Airbyte is, its architecture, its core concepts, and its role in the modern data stack (MDS)
  • Install and set up Airbyte locally with Docker
  • Connect Airbyte to different data sources (databases, cloud storage, etc.)
  • Configure Airbyte to send data to various destinations (data warehouses, databases)
  • Develop a data pipeline from scratch with Airbyte, dbt, Soda, Airflow, Postgres, and Snowflake to run your first data syncs
  • Set up monitoring and notifications with Airbyte

Welcome to the Complete Hands-On Introduction to Airbyte!

Airbyte is an open-source data integration engine that helps you consolidate data into your data warehouses, lakes, and databases. It is an alternative to Stitch and Fivetran and provides hundreds of connectors, most of them built by the community.

Airbyte offers more than 300 connectors and is extensible: if the connector you need doesn't exist, you can build your own.

In this course, you will learn everything you need to get started with Airbyte:

What Airbyte is, where it fits in the data stack, and why it is useful for you.

Essential concepts such as sources, destinations, connections, normalization, etc.

How to create a source and a destination to synchronize data with ease.

Airbyte best practices to efficiently move data between endpoints.

How to set up and run Airbyte locally with Docker and Kubernetes.

Build a data pipeline from scratch using Airflow, dbt, Postgres, Snowflake, Airbyte, and Soda (a minimal sketch of this kind of pipeline follows this overview).

And more.
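
To give you a first taste of how these tools fit together, here is a minimal sketch (in Python) of an Airflow DAG that triggers an Airbyte sync from Postgres to Snowflake and then runs dbt. The Airflow connection ID, the Airbyte connection UUID, and the dbt project path are placeholders, not values from the course.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.airbyte.operators.airbyte import AirbyteTriggerSyncOperator

with DAG(
    dag_id="airbyte_dbt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Trigger the Airbyte connection that loads Postgres data into Snowflake.
    # "airbyte_conn" is an Airflow connection pointing at your Airbyte instance,
    # and the UUID is the Airbyte connection to sync (both are placeholders).
    extract_load = AirbyteTriggerSyncOperator(
        task_id="airbyte_postgres_to_snowflake",
        airbyte_conn_id="airbyte_conn",
        connection_id="00000000-0000-0000-0000-000000000000",
        asynchronous=False,
        timeout=3600,
    )

    # Transform the loaded data with dbt (the project path is a placeholder).
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/airflow/dbt",
    )

    extract_load >> transform

The course builds this idea out further with Soda data quality checks and Airbyte monitoring, but the core pattern stays the same: Airbyte handles extract and load, while downstream tools handle transformation and checks.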

At the end of the course, you will fully understand Airbyte and be ready to use it with your data stack!

If you need any help, don't hesitate to ask in the Q&A section on Udemy. I will be more than happy to help!

See you in the course!



Who this course is for:
  • Data Engineers
  • Analytics Engineers
  • Data Architects
