dbt has become the de facto standard for transforming data in modern analytics stacks. But as projects grow, so does the question: where should dbt run in production, and how can we make it faster? In this talk, we’ll compare the performance trade-offs between running dbt natively and orchestrating it through Airflow using Cosmos, with a focus on workflow efficiency at scale. Using a 200-model dbt project as a case study, we’ll show how workflow execution time in Cosmos was reduced from 15 minutes to just 5 minutes. We’ll also discuss opportunities to push performance further, from better DAG optimization to warehouse-aware scheduling strategies. Whether you’re a data engineer, analytics engineer, or platform owner, you’ll leave with practical strategies to optimize dbt execution and inspiration for what’s next in large-scale orchestration.
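For context, Cosmos turns a dbt project into an Airflow DAG, with each model rendered as its own task, which is the setup the talk's benchmarks refer to. A minimal sketch is shown below; the project path, profile names, and schedule are illustrative assumptions, not details from the talk.

    from datetime import datetime
    from cosmos import DbtDag, ProjectConfig, ProfileConfig

    # Illustrative Cosmos DAG: expands a dbt project into one Airflow task per model.
    # Paths, profile/target names, and the schedule are placeholders for your environment.
    dbt_production = DbtDag(
        dag_id="dbt_production",
        project_config=ProjectConfig("/usr/local/airflow/dags/dbt/my_project"),
        profile_config=ProfileConfig(
            profile_name="my_project",
            target_name="prod",
            profiles_yml_filepath="/usr/local/airflow/dags/dbt/my_project/profiles.yml",
        ),
        schedule="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    )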