talk-data.com

Topic: dbt (data build tool)
Tags: data_transformation, analytics_engineering, sql
758 tagged activities

Activity Trend: 134 peak/qtr (2020-Q1 to 2026-Q1)

Activities

758 activities · Newest first

Cost-effective data operations

Delivering trusted data at scale doesn’t have to mean ballooning costs or endless rework. In this session, we’ll explore how state-aware orchestration, powered by Fusion, drives leaner, smarter pipelines in the dbt platform. We’ll cover advanced configurations for even greater efficiency, practitioner tips that save resources, and testing patterns that cut redundancy. The result: faster runs, lower spend, and more time for impactful work.
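
A concrete illustration of one efficiency pattern this space builds on: a minimal sketch of a dbt incremental model in SQL, which avoids reprocessing rows that are already built. The model and column names (stg_events, event_id, created_at) are hypothetical, and this is illustrative background rather than the session's own Fusion configuration.

  -- Hypothetical incremental model: only rows newer than the existing table
  -- are processed on each run, so repeated runs skip work already done.
  {{ config(materialized='incremental', unique_key='event_id') }}

  select
      event_id,
      user_id,
      event_type,
      created_at
  from {{ ref('stg_events') }}
  {% if is_incremental() %}
  -- On incremental runs, restrict to rows newer than what is already built.
  where created_at > (select max(created_at) from {{ this }})
  {% endif %}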

Freedom through structure: How WHOOP scales analyst autonomy with dbt

AI and dbt unlock the potential for any data analyst to work like a full-stack dbt developer. But without the right guardrails, that freedom can quickly turn into chaos and technical debt. At WHOOP, we embraced analyst autonomy and scaled it responsibly. In this session, you’ll learn how we empowered analysts to build in dbt while protecting data quality, staying aligned with the broader team, and avoiding technical debt. If you’re looking to give analysts more ownership without giving up control, this session will show you how to get there.

Continue the conversation from "Building High-Quality AI Workflows with the dbt MCP Server" in this interactive roundtable. We’ll explore how dbt’s MCP Server connects governed data to AI systems, dive deeper into the practical use cases, and talk through how organizations can adopt AI safely and effectively. This is your chance to:
- Ask detailed technical questions and get answers from product experts.
- Share your team’s challenges and learn from peers who are experimenting with MCP.
- Explore what’s possible today and influence where we go next.
Attendance at the breakout session is encouraged but not required. Come ready to join the discussion and leave with new ideas!

This workshop will cover new analyst-focused user interfaces in dbt. Understand how to develop in dbt Canvas, how to explore within the Insights query page, and how to navigate across your data control plane in the dbt Catalog. What to bring: You will need to bring your own laptop to complete the hands-on exercises. We will provide everything else, including sandbox environments for dbt and a data platform.

Goodbye manual testing & alert fatigue: Meet your AI data SRE

Eliminate 80% of the manual effort spent writing dbt tests, chasing noisy alerts, and fixing data issues. In this session, you'll see how data teams are using an AI data SRE that detects, triages, and resolves issues across the entire data stack. We’ll cover everything from AI architecture to optimized incident management, and even show an agent writing production-ready PRs!
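
For context on the manual tests this session promises to automate, a dbt singular test is simply a SQL file whose returned rows are reported as failures. The sketch below assumes a hypothetical orders model; it is not the presenters' tooling.

  -- tests/assert_no_duplicate_order_ids.sql
  -- Singular dbt test: any rows returned here count as test failures.
  select
      order_id,
      count(*) as occurrences
  from {{ ref('orders') }}
  group by order_id
  having count(*) > 1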

How we are building a federated, AI-augmented data platform that balances autonomy and standardization at scale

Platform engineers from a global pharmaceutical company invite you to explore our journey in creating a cloud-native, federated data platform using dbt Cloud, Snowflake, and Data Mesh. Discover how we established foundational tools and standards, and developed automation and self-service capabilities.

Mamma mia! My data’s in the Iceberg

Iceberg is an open table format for large analytical datasets that is now interoperable with most modern data platforms. But the setup is complicated, and caveats abound. Jeremy Cohen will tour the archipelago of Iceberg integrations across data warehouses, catalogs, and dbt, and demonstrate the promise of cross-platform dbt Mesh to provide flexibility and collaboration for data teams. The more the merrier.
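
For readers new to the setup caveats mentioned here, materializing a dbt model as an Iceberg table is configured per adapter. The sketch below assumes the dbt-snowflake adapter's Iceberg support; the config keys, external volume, and model names are assumptions and vary by adapter and version.

  -- models/marts/fct_orders_iceberg.sql
  -- Hypothetical dbt-snowflake model materialized as an Iceberg table.
  -- Config keys (table_format, external_volume) are adapter-specific;
  -- check your adapter's documentation before relying on them.
  {{ config(
      materialized='table',
      table_format='iceberg',
      external_volume='my_external_volume'
  ) }}

  select * from {{ ref('stg_orders') }}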

Join us for a roundtable discussion where we’ll go deeper into the ideas shared in the "Turn metadata into meaning: Build context, share insights, and make better decisions with dbt Catalog" breakout. Bring your questions, share your take, and connect with peers and presenters. Note: attendance at the breakout session is not required.

Get certified at Coalesce! Choose from two certification exams.

The dbt Analytics Engineering Certification Exam is designed to evaluate your ability to:
- Build, test, and maintain models to make data accessible to others
- Use dbt to apply engineering principles to analytics infrastructure
We recommend that you have at least SQL proficiency and 6+ months of experience working in dbt (self-hosted dbt or the dbt platform) before attempting the exam.

The dbt Architect Certification Exam assesses your ability to design secure, scalable dbt implementations, with a focus on:
- Environment orchestration
- Role-based access control
- Integrations with other tools
- Collaborative development workflows aligned with best practices

What to expect:
- Your purchase includes sitting for one attempt at one of the two in-person exams at Coalesce.
- You will let the proctor know which certification you are sitting for.
- Please arrive on time; this is a closed-door certification, and attendees will not be let in after the doors are closed.

What to bring:
- You will need to bring your own laptop to take the exam.

Duration: 2 hours
Fee: $100

Trainings and certifications are not offered separately and must be purchased with a Coalesce pass. Trainings and certifications are not available for Coalesce Online passes. If you no-show for your certification, you will not be refunded.

Rewriting the data playbook at Virgin Media O2

At Virgin Media O2, we believe that strong processes and culture matter more than any individual tool. In this talk, we’ll share how we’ve applied DevOps and software engineering principles to transform our data capabilities and enable true data modernization at scale. We’ll take you behind the scenes of how these practices shaped the design and delivery of our enterprise Data Mesh, with dbt at its core, empowering our teams to move faster, build trust in data, and fully embrace a modern, decentralized approach.

Towards a more perfect pipeline: CI/CD in the dbt Platform
talk
by Aaiden Witten (United Services Automobile Association), Michael Sturm (United Services Automobile Association), Timothy Shiveley (United Services Automobile Association)

In this session, we’ll show how we integrated CI/CD dbt jobs to validate data and run tests on every merge request. Attendees will walk away with a blueprint for implementing CI/CD for dbt, lessons learned from our journey, and best practices to keep data quality high without slowing down development.
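
As a small illustration of the kind of check a CI dbt job can run on every merge request, here is a custom generic test written in dbt SQL. The test name, column usage, and file path are hypothetical, not USAA's actual setup; once attached to a model column in YAML, dbt test in the CI job executes it.

  -- tests/generic/assert_positive_values.sql
  -- Custom generic test: returns (and therefore fails on) non-positive values.
  {% test assert_positive_values(model, column_name) %}

  select {{ column_name }}
  from {{ model }}
  where {{ column_name }} <= 0

  {% endtest %}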

dbt Mesh allowed monolithic dbt projects to be broken down into smaller, more consumable, and better-governed projects. Now, learn how cross-platform mesh lets you take this one step further with development across data platforms using Iceberg tables.

After this course you will be able to:
- Identify ideal use cases for dbt Mesh
- Configure cross-project references between data platforms (see the sketch after this listing)
- Navigate dbt Catalog

Prerequisites for this course include:
- dbt Fundamentals, specifically data models and building model dependencies
- dbt model governance
- Familiarity with various data platforms

What to bring: You will need to bring your own laptop to complete the hands-on exercises. We will provide the sandbox environments for dbt and a data platform.

Duration: 2 hours
Fee: $200

Trainings and certifications are not offered separately and must be purchased with a Coalesce pass. Trainings and certifications are not available for Coalesce Online passes.
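
To make the cross-project reference idea concrete, dbt Mesh's two-argument ref() lets a downstream project select from a public model owned by another project. The project and model names below are hypothetical, not part of the course materials.

  -- models/staging/stg_finance_orders.sql
  -- Hypothetical downstream model referencing the public fct_orders model
  -- owned by a separate "finance" dbt project via a cross-project ref.
  select
      order_id,
      order_total,
      ordered_at
  from {{ ref('finance', 'fct_orders') }}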