Apache Airflow is a system for scheduling and monitoring data engineering workflows. Airflow can be used to schedule ETL jobs, machine learning pipelines, and script execution. Airflow also gives developers a high-level view of the dependency graph underlying their data pipelines.
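Airflow models a workflow as a directed acyclic graph (DAG) of tasks and runs each task only after its dependencies succeed. As a conceptual sketch (using Python's standard library rather than Airflow's own API), here is how such a dependency graph determines execution order; the task names are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL pipeline: each task maps to the set of tasks
# that must complete before it can run.
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# A topological sort yields a valid execution order:
# every task appears after all of its dependencies.
order = list(TopologicalSorter(pipeline).static_order())
print(order)  # → ['extract', 'transform', 'load', 'report']
```

In Airflow itself, each node would be an operator (e.g. a `BashOperator` or `PythonOperator`), dependencies are declared with the `>>` operator, and the scheduler handles ordering, retries, and monitoring.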
Chaim Turkel is a backend data architect at Tikal. He joins the show to discuss a case study of using Airflow to rearchitect the data engineering workflow of a complex financial application, including the problems that Airflow solves and the process of porting existing workflows to Airflow.
The post Airflow in Practice with Chaim Turkel appeared first on Software Engineering Daily.