Up and Running with Apache Airflow
.MP4, AVC, 1280x720, 30 fps | English, AAC, 2 Ch | 34m | 86.5 MB
Instructor: Pinal Dave
Orchestrate and automate data workflows with Apache Airflow. This course shows how to configure Airflow, build and schedule DAGs, manage tasks with operators, and monitor workflow execution for scalable, reliable data pipelines.
What you'll learn
Managing complex data workflows without orchestration quickly leads to failures, bottlenecks, and inefficiencies. Manual processes become hard to track, maintain, and scale—especially across growing teams and datasets. In this course, Up and Running with Apache Airflow, you’ll gain the ability to orchestrate workflows and manage task execution using Apache Airflow’s flexible, scalable framework.
First, you’ll explore how to install and configure Apache Airflow in a local development environment, understand its architecture, and launch the web server and scheduler. Next, you’ll learn how to write and run DAGs (directed acyclic graphs, the workflow definitions at the heart of Airflow) using Bash and Python operators. Then, you’ll discover how to connect tasks, monitor execution through the UI, and debug failed runs effectively. Finally, you’ll dive into scheduling concepts such as schedule_interval, start_date, catchup, and manual triggering, giving you full control over when and how workflows execute.
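To give a sense of what these concepts look like in practice, here is a minimal sketch of a DAG of the kind the course covers (not taken from the course itself). It assumes Airflow 2.x; the dag_id, task IDs, and commands are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def transform_data():
    # Placeholder task logic; runs inside the Airflow worker.
    print("transforming data")


with DAG(
    dag_id="example_pipeline",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),  # first logical date
    schedule_interval="@daily",       # run once per day
    catchup=False,                    # don't backfill missed past runs
) as dag:
    extract = BashOperator(
        task_id="extract",
        bash_command="echo 'extracting data'",
    )
    transform = PythonOperator(
        task_id="transform",
        python_callable=transform_data,
    )

    # The >> operator connects tasks: extract runs before transform.
    extract >> transform
```

With catchup=False, Airflow only schedules runs from the current date forward; you can still trigger an individual run manually from the UI or CLI at any time.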
When you’re finished with this course, you’ll have the skills and confidence to set up and manage reliable Airflow environments and build production-ready pipelines that are automated, observable, and maintainable.