talk-data.com

Topic: dbt (data build tool)

Tags: data_transformation, analytics_engineering, sql

87 tagged activities

Activity Trend: peak of 134 activities per quarter (2020-Q1 to 2026-Q1)

Activities

Filtered by: dbt Coalesce 2025

Get certified at Coalesce! Choose from two certification exams:

The dbt Analytics Engineering Certification Exam is designed to evaluate your ability to:
- Build, test, and maintain models to make data accessible to others
- Use dbt to apply engineering principles to analytics infrastructure

We recommend that you have at least SQL proficiency and 6+ months of experience working in dbt (self-hosted dbt or the dbt platform) before attempting the exam.

The dbt Architect Certification Exam assesses your ability to design secure, scalable dbt implementations, with a focus on:
- Environment orchestration
- Role-based access control
- Integrations with other tools
- Collaborative development workflows aligned with best practices

What to expect:
- Your purchase includes sitting for one attempt at one of the two in-person exams at Coalesce
- You will let the proctor know which certification you are sitting for
- Please arrive on time; this is a closed-door certification, and attendees will not be let in after the doors are closed

What to bring:
- You will need to bring your own laptop to take the exam

Duration: 2 hours
Fee: $100

Trainings and certifications are not offered separately and must be purchased with a Coalesce pass. Trainings and certifications are not available for Coalesce Online passes. If you no-show your certification, you will not be refunded.

Escape the buzz and discover a quiet space to meditate, stretch, and recharge—leaving you centered and refreshed. Please leave your laptops, meetings, and phone calls at the door.

How to scale dbt across independent teams with IaC

How do you deliver a modern, governed analytics stack to dozens of independent and competing companies, each with their own priorities, budgets, and data platforms? SpareBank 1 built a platform-as-a-service using Infrastructure as Code to provision dbt environments on demand. This session shares how SB1 modernized legacy data warehouses and scaled dbt across a network of banks, offering lessons for any organization supporting multiple business units or regions.

In this session, you’ll learn how dbt’s Semantic Layer and MCP Server enable safe, AI-powered access to data through natural language interfaces. As business users start exploring data conversationally, the analyst role is evolving—from writing queries to curating trusted logic, guiding usage, and enabling scalable self-service. We’ll break down what this shift looks like and how dbt helps analysts stay at the center of decision-making.
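
For a concrete sense of the "trusted logic" this session describes, a curated Semantic Layer definition might look roughly like the sketch below. The model, entities, and measures are hypothetical illustrations under assumed names, not anything taken from the talk.

```yaml
# Hypothetical semantic model and metric: all names and columns are illustrative.
semantic_models:
  - name: orders
    description: Order-level facts curated for governed self-service querying.
    model: ref('fct_orders')        # an existing dbt model (assumed)
    defaults:
      agg_time_dimension: ordered_at
    entities:
      - name: order_id
        type: primary
      - name: customer_id
        type: foreign
    dimensions:
      - name: ordered_at
        type: time
        type_params:
          time_granularity: day
      - name: order_status
        type: categorical
    measures:
      - name: order_total
        agg: sum
        expr: amount

metrics:
  - name: revenue
    label: Revenue
    type: simple
    type_params:
      measure: order_total
```

Defined once in YAML like this, the metric can be served consistently to BI tools, APIs, and conversational interfaces, which is what keeps AI-powered access grounded in analyst-approved logic rather than ad hoc queries.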

Unleash the power of dbt on Google Cloud: BigQuery, Iceberg, DataFrames and beyond

The data world has long been divided, with data engineers and data scientists working in silos. This fragmentation creates a long, difficult journey from raw data to machine learning models. We've unified these worlds through the Google Cloud and dbt partnership. In this session, we'll show you an end-to-end workflow that simplifies the data-to-AI journey. The availability of dbt Cloud on Google Cloud Marketplace streamlines getting started, and its integration with BigQuery's new Apache Iceberg tables creates an open foundation. We'll also highlight how BigQuery DataFrames' integration with dbt Python models lets you perform complex data science at scale, all within a single, streamlined process. Join us to learn how to build a unified data and AI platform with dbt on Google Cloud.
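
As a rough sketch of what a dbt Python model running on BigQuery DataFrames can look like: the model names here are made up, and the submission_method config assumes the dbt-bigquery adapter's BigQuery DataFrames (bigframes) support, which may differ by adapter version.

```python
# models/ml/customer_features.py -- hypothetical dbt Python model.
# Sketch only: assumes dbt-bigquery's BigQuery DataFrames (bigframes)
# submission method; config keys and availability may vary by version.

def model(dbt, session):
    dbt.config(
        materialized="table",
        submission_method="bigframes",   # assumption: run via BigQuery DataFrames
    )

    # dbt.ref() returns a DataFrame backed by BigQuery, so the pandas-style
    # operations below are pushed down to the warehouse rather than run locally.
    orders = dbt.ref("fct_orders")       # hypothetical upstream model

    # Aggregate lifetime value per customer.
    features = orders.groupby("customer_id")["amount"].sum().reset_index()
    features = features.rename(columns={"amount": "lifetime_value"})

    return features
```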

talk
by Tristan Handy (dbt Labs), Elias DeFaria (dbt Labs), Jeremy Cohen (dbt Labs), Grace Goheen (dbt Labs), George Fraser (Fivetran), Bolaji Oyejide (dbt Labs)

Let’s bring it back to where it all began—with you, the dbt community—and to where we’re going. We’ll share our perspective on how the two engines, dbt Core and Fusion, are united by one common framework—and how we continue to build in the open. Plus: recognition of our exceptional community contributors who make dbt so special. For our Coalesce Online attendees, join us on Slack in #coalesce-2025 to stay connected during the keynote!

Tuesday afternoon keynote: Rewrite what's possible

The future of AI is here. Join AI and data industry thought leader Ashley Kramer from OpenAI as she shares how AI-powered development and intelligent systems act as force multipliers for organizations—and how to confidently embrace these accelerants at scale. In the second half of the keynote, she'll be joined by a panel of product leaders from across the data stack for a discussion on the future of analytics in an AI-driven world and how dbt and ecosystem partners are innovating to rewrite what’s possible: turning yesterday's science fiction into today's reality. For our Coalesce Online attendees, join us on Slack in #coalesce-2025 to stay connected during the keynote!

Ascending data Everest: Ericsson's climb to scalable analytics

Learn how Ericsson’s Enterprise Wireless Solutions team modernized its analytics stack, replacing legacy tools like Alteryx and ThoughtSpot with scalable, modular data models powered by the dbt platform. With a lean team, we tackled SOX compliance, streamlined operations, and expanded from internal to external analytics. Now we're starting to use the new dbt Fusion engine to accelerate development and more tightly manage costs. Find out how we climbed from legacy limitations to enterprise-grade analytics performance.

dbt development in the age of AI: Improving the developer experience in the dbt platform

Learn why dbt is the best place for analysts to build reliable data products quickly, combining structured workflows with context-aware AI. We will explore how dbt Canvas, dbt Studio, and more give analysts the visibility, control, and flexibility they need to move from exploration to production without relying on engineers. You will also see how AI agents in dbt accelerate development by recommending logic, surfacing relevant models, and helping troubleshoot issues—making self-service both faster and more trusted.

How Aura Minerals used AI to migrate from PySpark to dbt 7x faster

Aura Minerals cut pipeline migration time by 87%, transforming a 45-hour manual process into a 6-hour lift with relatively little manual oversight. In this session, learn how the team used AI tools to accelerate their shift from a complex PySpark environment to a streamlined, modern dbt workflow. You’ll walk away with a clear view of the strategy, the tools they used, and what it looks like to modernize your data stack with AI as a force multiplier.

Move fast with dbt Fusion: Inside EQT’s journey to faster, simpler data

In this talk, EQT walks through their journey to dbt Fusion, focusing on how they re-architected their data workflows for speed, scalability, and maintainability. Using three real-world projects as case studies, they'll show how State Aware Orchestration reduced runtimes (you don’t want to miss this!), simplified job orchestration, and surfaced a few unexpected pitfalls along the way. They'll wrap up by sharing lessons learned and practical tips for teams planning their own Fusion migration.

Solving a $6 million problem with the dbt Semantic Layer

At EMC Insurance, inconsistent data definitions across a rigid legacy system led to an estimated $6 million in analyst hours spent on reconciliation instead of delivering insights. We needed a way to standardize metrics fast. By using the dbt Semantic Layer alongside Analytics8, we created a consistent, queryable layer for metrics like loss ratio, helping teams across the business get on the same page and ensure a single source of truth. This session walks through how we approached the problem, what we built, and what we’d do differently. If you’re trying to clean up inconsistent reporting or move away from fragile legacy logic, this is for you.
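
For illustration only, a governed loss ratio could be expressed as a ratio metric along these lines. The names are hypothetical, not EMC's actual definitions, and the sketch assumes simple metrics for incurred losses and earned premium have already been defined over the relevant measures.

```yaml
# Hypothetical ratio metric; assumes simple metrics named incurred_losses
# and earned_premium already exist in the project.
metrics:
  - name: loss_ratio
    label: Loss Ratio
    description: Incurred losses divided by earned premium.
    type: ratio
    type_params:
      numerator: incurred_losses
      denominator: earned_premium
```

The point of defining it once like this is that every downstream tool queries the same definition instead of re-deriving it, which is what removes the reconciliation work described above.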

Get hands-on with dbt Copilot, the AI assistant built into dbt that helps you move faster and write better code. In this workshop, you’ll learn how to use Copilot to generate SQL, create data tests and documentation, and build semantic models and metrics—all within your existing workflow. What to bring: You must bring your own laptop to complete the hands-on exercises. We will provide the sandbox environments for dbt and the data platform.
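
To give a flavor of the artifacts this workshop works toward, generated data tests and documentation land in an ordinary dbt properties file like the hypothetical one below; the model, columns, and accepted values are invented for illustration.

```yaml
# Hypothetical properties file illustrating generated tests and documentation.
version: 2

models:
  - name: stg_orders
    description: One row per order, cleaned and renamed from the raw source.
    columns:
      - name: order_id
        description: Primary key for the order.
        data_tests:
          - unique
          - not_null
      - name: status
        description: Current fulfillment status of the order.
        data_tests:
          - accepted_values:
              values: ["placed", "shipped", "completed", "returned"]
```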

Bilt's conversational data layer: How we connected data to LLMs with dbt

Bilt Rewards turned their dbt project into a natural language interface. By connecting their semantic layer and underlying data warehouse to an LLM, business users and data analysts can ask real business questions and get trusted, creative insights. This session shows how they modeled their data for AI, how they kept accuracy intact, and how they increased data-driven conversations across the business.

Delve into the core concepts and applications of data quality with dbt. With a focus on practical implementation, you'll learn to deploy custom data tests, unit testing, and linting to ensure the reliability and accuracy of your data operations.

After this course, you will be able to:
- Recognize scenarios that call for testing data quality
- Implement efficient data testing methods to ensure reliability (data tests, unit tests; see the sketch after this listing)
- Navigate other quality checks in dbt (linting, CI, compare)

Prerequisites for this course include: dbt Fundamentals

What to bring: You will need to bring your own laptop to complete the hands-on exercises. We will provide the sandbox environments for dbt and the data platform.

Duration: 2 hours
Fee: $200

Trainings and certifications are not offered separately and must be purchased with a Coalesce pass. Trainings and certifications are not available for Coalesce Online passes.
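
As a minimal sketch of the unit testing covered in this course (dbt 1.8+ YAML syntax; the model, inputs, and columns are hypothetical):

```yaml
# Hypothetical unit test: checks currency-conversion logic in fct_orders
# against fixed inputs, without touching warehouse data.
unit_tests:
  - name: test_amount_is_converted_to_usd
    model: fct_orders                      # model under test (assumed name)
    given:
      - input: ref('stg_orders')
        rows:
          - {order_id: 1, amount: 100, currency: "EUR"}
      - input: ref('stg_exchange_rates')
        rows:
          - {currency: "EUR", usd_rate: 1.1}
    expect:
      rows:
        - {order_id: 1, amount_usd: 110}
```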