talk-data.com

Topic

Analytics

data_analysis insights metrics

4552 tagged

Activity Trend

Peak: 398 activities per quarter (2020-Q1 to 2026-Q1)

Activities

4552 activities · Newest first

Decisions you can count on: AI + dbt at DocuSign

AI is reshaping every stage of the analytics process. And at Docusign, that transformation is already underway. The data team is using AI to boost data quality, save engineers time, and deliver insights business users can actually trust. This session takes you end to end, from automated unit tests to governed metrics, showing how Docusign connects AI-driven development with self-serve analytics powered by the dbt Semantic Layer. The result: faster delivery, fewer surprises, and smarter decisions across the org.

In this course, learn how to manage and monitor data platform costs using dbt’s built-in tools. We’ll cover how to surface warehouse usage data, set up basic monitoring, and apply rule-based recommendations to optimize performance. You’ll also explore how cost insights fit naturally into the developer workflow, equipping you to make smarter decisions without leaving dbt. This course is for analytics engineers, data analysts, and data platform owners who have a foundational understanding of dbt and want to build more cost-effective data pipelines.

Using these cost management and orchestration strategies, the internal dbt Labs Analytics team achieved significant savings: the cloud compute bill was reduced by 9% simply by implementing dbt Fusion and state-aware orchestration, and by understanding the impact of models on platform costs, the team reduced the number of models built in scheduled jobs by 35% and shaved 20% off job execution times.

After this course, you will be able to:
• Articulate how dbt development patterns impact data platform costs
• Configure dbt Cloud to monitor warehouse compute spend
• Use the dbt Cost Management dashboard to identify high-cost models and jobs
• Apply specific optimization techniques, from materializations to advanced data modeling patterns, to reduce warehouse costs (see the sketch below)
• Implement proactive strategies like dbt Fusion and state-aware orchestration to prevent future cost overruns

Prerequisites for this course include: dbt fundamentals

What to bring: You will need to bring your own laptop to complete the hands-on exercises. We will provide the sandbox environments for dbt and the data platform.

Duration: 2 hours
Fee: $200

Trainings and certifications are not offered separately and must be purchased with a Coalesce pass. Trainings and certifications are not available for Coalesce Online passes.
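To make the materialization point concrete, here is a minimal, hypothetical sketch (not course material) of a dbt model switched to an incremental materialization so each run processes only new rows instead of rebuilding the whole table; the model, source, and column names are invented for illustration, and actual savings depend on your data volumes and warehouse.

```sql
-- models/fct_page_events.sql (hypothetical model)
-- Incremental materialization: process only rows that arrived since the last run,
-- which usually consumes far less warehouse compute than a full table rebuild.
{{ config(
    materialized='incremental',
    unique_key='event_id'
) }}

select
    event_id,
    user_id,
    event_type,
    event_timestamp
from {{ ref('stg_page_events') }}

{% if is_incremental() %}
  -- On incremental runs, scan only data newer than what is already built
  where event_timestamp > (select max(event_timestamp) from {{ this }})
{% endif %}
```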

Help us become the #1 Data Podcast by leaving a rating & review! We are 67 reviews away!

I'll walk you through the exact data analyst job hiring pipeline from a hiring manager's perspective & show you how to NOT get rejected.

💌 Join 30k+ aspiring data analysts & get my tips in your inbox weekly 👉 https://www.datacareerjumpstart.com/newsletter
🆘 Feeling stuck in your data journey? Come to my next free "How to Land Your First Data Job" training 👉 https://www.datacareerjumpstart.com/training
👩‍💻 Want to land a data job in less than 90 days? 👉 https://www.datacareerjumpstart.com/daa
👔 Ace The Interview with Confidence 👉 https://www.datacareerjumpstart.com//interviewsimulator

🔗 CONNECT WITH AVERY: 🎥 YouTube Channel · 🤝 LinkedIn · 📸 Instagram · 🎵 TikTok · 💻 Website

Mentioned in this episode: Join the last cohort of 2025! The LAST cohort of The Data Analytics Accelerator for 2025 kicks off on Monday, December 8th and enrollment is officially open!

To celebrate the end of the year, we’re running a special End-of-Year Sale, where you’ll get: ✅ A discount on your enrollment 🎁 6 bonus gifts, including job listings, interview prep, AI tools + more

If your goal is to land a data job in 2026, this is your chance to get ahead of the competition and start strong.

👉 Join the December Cohort & Claim Your Bonuses: https://www.datacareerjumpstart.com/daa

Executives don’t just ask what happened; they want to know why. Answering those questions quickly and consistently is where AI often falls short.

This workshop will show you how to teach AI to answer “why” by grounding it in a semantic layer. You’ll learn:

• What a semantic layer is and how it enables consistent answers
• How to model relationships and metrics for clarity and trust
• How to leverage semantic views on Snowflake
• How to enable AI using Cortex to provide reliable, self-serve analytics
• How to extend AI to generate deeper analyses you can depend on

Prepare for the session by signing up through this link: https://bit.ly/honeydew-swt-2025

By the end, you’ll be equipped to give AI the foundation it needs to explain why, delivering faster, more consistent insights your business can rely on.
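As a rough illustration of the underlying idea (this is plain SQL with invented table, column, and metric names, not the exact Snowflake semantic view or Cortex Analyst syntax), a semantic layer defines a metric once in a governed object so that humans and AI answer questions from the same logic:

```sql
-- Hypothetical governed definition: "net_revenue" is defined once, so every
-- consumer (dashboards, analysts, an AI assistant) computes it the same way.
create or replace view analytics.orders_semantic as
select
    o.order_date,
    c.region,
    sum(o.gross_amount - o.discount_amount) as net_revenue,
    count(distinct o.order_id)              as order_count
from raw.orders o
join raw.customers c
  on c.customer_id = o.customer_id
group by o.order_date, c.region;

-- An AI grounded in this object can help explain "why" by slicing the same
-- governed metric, e.g. last month's revenue by region, instead of re-deriving
-- revenue from raw tables with ad-hoc and potentially inconsistent logic.
select region, sum(net_revenue) as net_revenue
from analytics.orders_semantic
where order_date >= dateadd(month, -1, current_date)
group by region
order by net_revenue desc;
```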

This presentation outlines the transformation from legacy systems to a modern data and AI platform within Toyota Material Handling Europe. It highlights the strategic adoption of Snowflake to unify data architecture, enable real-time analytics, and support external data sharing. The journey includes the foundation and evolution of an internal AI initiative, DataLabs, which matured into a full-scale AI program.

In this episode, I’m exploring the mindset shift data professionals need to make when moving into analytics and AI data product management. From how to ask the right questions to designing for meaningful adoption, I share four key ways to think more like a product manager, and less like a deliverables machine, so your data products earn applause instead of a shoulder shrug.

Highlights / Skip to:

Why shift to analytics and AI data product management (00:34)
From accuracy to impact and redefining success with AI and analytical data products (01:59)
Key Idea 1: Moving from question asker (analyst) to problem seeker (product) (04:31)
Key Idea 2: Designing change management into solutions; planning for adoption starts in the design phase (12:52)
Key Idea 3: Creating tools so useful people can’t imagine working without them (26:23)
Key Idea 4: Solving for unarticulated needs vs. active needs (34:24)

Quotes from Today’s Episode

“Too many analytics teams are rewarded for accuracy instead of impact. Analysts give answers, and product people ask questions. The shift from analytics to product thinking isn’t about tools or frameworks, it’s about curiosity. It’s moving from ‘here’s what the data says’ to ‘what problem are we actually trying to solve, and for whom?’ That’s where the real leverage is, in asking better questions, not just delivering faster answers.”

“We often mistake usage for success. Adoption only matters if it’s meaningful adoption. A dashboard getting opened a hundred times doesn’t mean it’s valuable... it might just mean people can’t find what they need. Real success is when your users say, ‘I can’t imagine doing my job without this.’ That’s the level of usefulness we should be designing for.”

“The most valuable insights aren’t always the ones people ask for. Solving active problems is good, it’s necessary. But the big unlock happens when you start surfacing and solving latent problems, the ones people don’t think to ask for. Those are the moments when users say, ‘Oh wow, that changes everything.’ That’s how data teams evolve from service providers to strategic partners.”

“Here’s a simple but powerful shift for data teams: know who your real customer is. Most data teams think their customer is the stakeholder who requested the work… But the real customer is the end user whose life or decision should get better because of it. When you start designing for that person, not just the requester, everything changes: your priorities, your design, even what you choose to measure.”

Links

Need 1:1 help to navigate these questions and align your data product work to your career? Explore my new Cross-Company Group Coaching at designingforanalytics.com/groupcoaching

For peer support, join the Data Product Leadership Community, where peers are experimenting with these approaches: designingforanalytics.com/community

DNB, Norway’s largest bank, began building a cloud-based self-service Data & AI Platform in 2017, delivering its first capabilities by 2018. Initially focused on ML and analytics, the platform expanded in 2021 to include traditional data warehouses and modern data products. Snowflake was officially launched in 2023 after a successful PoC and pilot.

In this talk, we’ll walk through our journey.

Where We Came From

• Discover how legacy data warehouse bottlenecks sparked a shift toward decentralised, self-service data capabilities.

Where We Are

• Learn how DNB enabled teams to own and operate their data products through:
  • Streamlined domain onboarding
  • “DevOps for data” and “SQL as code” practices
  • Automated services for historisation (PSA), sketched below
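As a simplified, hypothetical sketch of what an automated historisation (PSA) service might generate (table and column names are invented; this is not DNB's actual implementation), a persistent staging area keeps full history by appending a new version of a record only when something changed:

```sql
-- Hypothetical PSA load: append a new version of each customer record only when
-- the incoming row differs from the latest version already stored.
insert into psa.customer_history (customer_id, name, address, loaded_at)
select s.customer_id, s.name, s.address, current_timestamp
from staging.customer s
left join (
    -- latest stored version per customer
    select customer_id, name, address,
           row_number() over (partition by customer_id order by loaded_at desc) as rn
    from psa.customer_history
) h
  on h.customer_id = s.customer_id
 and h.rn = 1
where h.customer_id is null                    -- brand-new customer
   or h.name is distinct from s.name           -- or a tracked attribute changed
   or h.address is distinct from s.address;
```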

Where We’re Going

• Explore how DNB is evolving its data mesh with:
  • A hybrid model of decentralised and centralised data products
  • Generative AI, metadata automation, and development support
  • Enhanced tooling and services for data consumers

EQT, a global investment organization specializing in private capital, infrastructure, and real assets, has transformed its data operations by fully adopting the modern data stack. As a cloud-native company with hundreds of internal and external data sources — from YouTube to Google Cloud Storage — EQT needed a scalable, centralized solution to ingest and transform data for complex financial use cases. Their journey took them from fragmented, Excel-based workflows to a robust, integrated data pipeline powered by Fivetran.

In this session, you’ll learn how:

• EQT streamlined external data ingestion and broke down data silos
• A unified data pipeline supports scalable financial analytics and decision-making
• Fivetran’s ease of use, connector maintenance, and cost-effectiveness made it the clear choice

Get certified at Coalesce! Choose from two certification exams:

The dbt Analytics Engineering Certification Exam is designed to evaluate your ability to:
• Build, test, and maintain models to make data accessible to others
• Use dbt to apply engineering principles to analytics infrastructure
We recommend that you have at least SQL proficiency and 6+ months of experience working in dbt (self-hosted dbt or the dbt platform) before attempting the exam.

The dbt Architect Certification Exam assesses your ability to design secure, scalable dbt implementations, with a focus on:
• Environment orchestration
• Role-based access control
• Integrations with other tools
• Collaborative development workflows aligned with best practices

What to expect: Your purchase includes one attempt at one of the two in-person exams at Coalesce. You will let the proctor know which certification you are sitting for. Please arrive on time; this is a closed-door certification, and attendees will not be let in after the doors are closed.

What to bring: You will need to bring your own laptop to take the exam.

Duration: 2 hours
Fee: $100

Trainings and certifications are not offered separately and must be purchased with a Coalesce pass. Trainings and certifications are not available for Coalesce Online passes. If you no-show your certification, you will not be refunded.

The role of data analysts is evolving, not disappearing. With generative AI transforming the industry, many wonder if their analytical skills will soon become obsolete. But how is the relationship between human expertise and AI tools really changing? While AI excels at coding, debugging, and automating repetitive tasks, it struggles with understanding complex business problems and domain-specific challenges. What skills should today's data professionals focus on to remain relevant? How can you leverage AI as a partner rather than viewing it as a replacement? The balance between technical expertise and business acumen has never been more critical in navigating this changing landscape.

Mo Chen is a Data & Analytics Manager with over seven years of experience in financial and banking data. Currently at NatWest Group, Mo leads initiatives that enhance data management, automate reporting, and improve decision-making across the organization. After earning an MSc in Finance & Economics from the University of St Andrews, Mo launched a career in risk and credit portfolio management before transitioning into analytics. Blending economics, finance, and data engineering, Mo is skilled at turning large-scale financial data into actionable insight that supports efficiency and strategic planning. Beyond corporate life, Mo has become a passionate educator and community-builder. On YouTube, Mo hosts a fast-growing channel (185K+ subscribers, with millions of views) where he breaks down complex analytics concepts into bite-sized, actionable lessons.

In the episode, Richie and Mo explore the evolving role of data analysts, the impact of AI on coding and debugging, the importance of domain knowledge for career switchers, effective communication strategies in data analysis, and much more.

Links Mentioned in the Show:
• Mo’s Website - Build a Data Portfolio Website
• Mo’s YouTube Channel
• Connect with Mo
• Get Certified as a Data Analyst
• Related Episode: Career Skills for Data Professionals with Wes Kao, Co-Founder of Maven
• Rewatch RADAR AI

New to DataCamp?
• Learn on the go using the DataCamp mobile app
• Empower your business with world-class data and AI skills with DataCamp for business

Summary In this episode of the Data Engineering Podcast, host Tobias Macey welcomes back Nick Schrock, CTO and founder of Dagster Labs, to discuss Compass - a Slack-native, agentic analytics system designed to keep data teams connected with business stakeholders. Nick shares his journey from initial skepticism to embracing agentic AI as model and application advancements made it practical for governed workflows, and explores how Compass redefines the relationship between data teams and stakeholders by shifting analysts into steward roles, capturing and governing context, and integrating with Slack where collaboration already happens. The conversation covers organizational observability through Compass's conversational system of record, cost control strategies, and the implications of agentic collaboration on Conway's Law, as well as what's next for Compass and Nick's optimistic views on AI-accelerated software engineering.

Announcements

Hello and welcome to the Data Engineering Podcast, the show about modern data management.

Data teams everywhere face the same problem: they're forcing ML models, streaming data, and real-time processing through orchestration tools built for simple ETL. The result? Inflexible infrastructure that can't adapt to different workloads. That's why Cash App and Cisco rely on Prefect. Cash App's fraud detection team got what they needed - flexible compute options, isolated environments for custom packages, and seamless data exchange between workflows. Each model runs on the right infrastructure, whether that's high-memory machines or distributed compute. Orchestration is the foundation that determines whether your data team ships or struggles. ETL, ML model training, AI Engineering, Streaming - Prefect runs it all from ingestion to activation in one platform. Whoop and 1Password also trust Prefect for their data operations. If these industry leaders use Prefect for critical workflows, see what it can do for you at dataengineeringpodcast.com/prefect.

Data migrations are brutal. They drag on for months, sometimes years, burning through resources and crushing team morale. Datafold's AI-powered Migration Agent changes all that. Their unique combination of AI code translation and automated data validation has helped companies complete migrations up to 10 times faster than manual approaches. And they're so confident in their solution, they'll actually guarantee your timeline in writing. Ready to turn your year-long migration into weeks? Visit dataengineeringpodcast.com/datafold today for the details.

Your host is Tobias Macey and today I'm interviewing Nick Schrock about building an AI analyst that keeps data teams in the loop.

Interview
• Introduction
• How did you get involved in the area of data management?
• Can you describe what Compass is and the story behind it?
• Context repository structure
• How to keep it relevant / avoid sprawl and duplication
• Providing guardrails
• How does a tool like Compass help provide feedback/insights back to the data teams?
• Preparing the data warehouse for effective introspection by the AI
• LLM selection
• Cost management
• Caching/materializing ad-hoc queries
• Why Slack and enterprise chat are important to B2B software
• How AI is changing stakeholder relationships
• How not to overpromise AI capabilities
• How does Compass relate to BI?
• How does Compass relate to Dagster and data infrastructure?
• What are the most interesting, innovative, or unexpected ways that you have seen Compass used?
• What are the most interesting, unexpected, or challenging lessons that you have learned while working on Compass?
• When is Compass the wrong choice?
• What do you have planned for the future of Compass?

Contact Info
• LinkedIn

Parting Question
• From your perspective, what is the biggest gap in the tooling or technology for data management today?

Closing Announcements

Thank you for listening! Don't forget to check out our other shows. Podcast.__init__ covers the Python language, its community, and the innovative ways it is being used. The AI Engineering Podcast is your guide to the fast-moving world of building AI systems. Visit the site to subscribe to the show, sign up for the mailing list, and read the show notes. If you've learned something or tried out a project from the show then tell us about it! Email [email protected] with your story.

Links
• Dagster
• Dagster Labs
• Dagster Plus
• Dagster Compass
• Chris Bergh DataOps Episode
• Rise of Medium Code blog post
• Context Engineering
• Data Steward
• Information Architecture
• Conway's Law
• Temporal durable execution framework

The intro and outro music is from The Hug by The Freak Fandango Orchestra / CC BY-SA