talk-data.com

Topic: Snowflake

Tags: data_warehouse, cloud, analytics, olap

550 tagged

Activity Trend: peak 193 activities/qtr (2020-Q1 to 2026-Q1)

Activities

550 activities · Newest first

AI is only as good as the data it runs on. Yet Gartner predicts that in 2026, over 60% of AI projects will fail to deliver value - because the underlying data isn’t truly AI-ready. “Good enough” data isn’t enough.

In this exclusive BDL launch session, DataOps.live reveals Momentum, the next generation of its DataOps automation platform, designed to operationalize trusted AI at enterprise scale.

Based on experience from building over 9,000 Data Products to date, Momentum introduces breakthrough capabilities including AI-Ready Data Scoring to ensure data is fit for AI use cases, Data Product Lineage for end-to-end visibility, and a Data Engineering Agent that accelerates building reusable data products. Combined with automated CI/CD, continuous observability, and governance enforcement, Momentum closes the AI-readiness gap by embedding collaboration, metadata, and automation across the entire data lifecycle.

Backed by Snowflake Ventures and trusted by leading enterprises, including AstraZeneca, Disney and AT&T, DataOps.live is the proven catalyst for scaling AI-ready data. In this session, you’ll unpack what AI-ready data really means, learn essential practices, and discover a faster, easier, and more impactful way to make your AI initiatives succeed.

Be the first to see Momentum in action - the future of AI-ready data.

This session will provide a Maia demo with roadmap teasers. The demo will showcase Maia's core capabilities: authoring pipelines in business language, multiplying productivity by accelerating tasks, and enabling self-service. It demonstrates how Maia takes natural language prompts and translates them into YAML-based, human-readable Data Pipeline Language (DPL), generating graphical pipelines. Expect to see Maia interacting with Snowflake metadata to sample data and suggest transformations, as well as its ability to troubleshoot and debug pipelines in real time. The session will also cover how Maia can create custom connectors from REST API documentation in seconds, a task that traditionally takes days. Roadmap teasers will likely include the upcoming Semantic Layer, a Pipeline Reviewing Agent, and enhanced file type support for various legacy ETL tools and code conversions.

Join Sami Hero and Tammie Coles as they share how Ellie is reinventing data modeling with AI-native tools that empower both technical and non-technical users. With CData Embedded Cloud, Ellie brings live metadata and data models from systems like Snowflake, Databricks, and Oracle Financials into a unified modeling workspace. Their platform translates legacy structures into human-readable insights, letting users interact with a copilot-style assistant to discover, refine, and maintain data models faster—with less reliance on analysts.

You’ll see how Ellie uses generative AI to recommend new entities, reconcile differences between models and live systems, and continuously document evolving data environments. Learn how corporations are using Ellie and CData together to scale high-quality data modeling across teams, reducing rework, accelerating delivery of analytics-ready models, and making enterprise architecture accessible to the business.

Join Richard as he shares how Runna transformed its data capabilities by harnessing the power of Snowflake and AWS. This session will explore the key challenges the team faced, how they overcame them, and the practical steps they took to build a scalable, future-ready data platform. Richard will walk through what’s been achieved so far, the lessons learned along the way, and how Runna is now able to unlock complex business insights faster and more efficiently than ever before. You'll also get a sneak peek into what’s next as they continue to evolve their data strategy to support rapid growth and innovation.

In this 20-minute session, you'll learn how to build a custom Fivetran connector using the Fivetran Connector SDK and the Anthropic Workbench (AI Assistant) to integrate data from a custom REST API into Snowflake. 
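To make that pattern concrete, here is a minimal sketch of a Connector SDK connector in Python. The /events endpoint, auth scheme, and column names are hypothetical stand-ins, not the session's actual source API:

```python
# Minimal Fivetran Connector SDK sketch. The REST endpoint, auth scheme,
# and column names are hypothetical, not the session's actual API.
import requests
from fivetran_connector_sdk import Connector
from fivetran_connector_sdk import Operations as op


def schema(configuration: dict):
    # Declare the destination table and primary key; Fivetran infers the
    # remaining column types from the rows we upsert.
    return [{"table": "events", "primary_key": ["id"]}]


def update(configuration: dict, state: dict):
    # Incremental sync: resume from the cursor saved at the last checkpoint.
    cursor = state.get("since", "1970-01-01T00:00:00Z")
    resp = requests.get(
        configuration["base_url"] + "/events",  # hypothetical endpoint
        params={"since": cursor},
        headers={"Authorization": f"Bearer {configuration['api_key']}"},
        timeout=30,
    )
    resp.raise_for_status()
    for row in resp.json()["data"]:
        yield op.upsert(table="events", data=row)
        cursor = max(cursor, row["updated_at"])
    # Persist the new cursor so the next sync starts where this one ended.
    yield op.checkpoint(state={"since": cursor})


connector = Connector(update=update, schema=schema)

if __name__ == "__main__":
    # Local test run; `fivetran deploy` ships it to Fivetran afterwards.
    connector.debug()
```

The `update` function yields upserts followed by a checkpoint; Fivetran stores the checkpointed state and passes it back on the next sync, which is what makes the connector incremental.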

You'll then learn how to create a Streamlit in Snowflake data application that powers metrics and Snowflake Cortex AI-driven features.
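As a rough illustration of that pattern (the table, column, and model names below are assumptions, not the session's actual app):

```python
# Streamlit in Snowflake sketch: simple metrics over the synced table plus
# a Snowflake Cortex call. Table and model names are illustrative.
import streamlit as st
from snowflake.snowpark.context import get_active_session

session = get_active_session()  # provided inside Streamlit in Snowflake

st.title("Events dashboard")

# A metric backed by the table the connector loads.
total = session.sql("SELECT COUNT(*) AS N FROM events").collect()[0]["N"]
st.metric("Total events", total)

# Free-form questions answered by a Cortex LLM function.
question = st.text_input("Ask a question about the data")
if question:
    answer = session.sql(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE('llama3.1-8b', ?) AS RESPONSE",
        params=[question],
    ).collect()[0]["RESPONSE"]
    st.write(answer)
```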

Get ready for a customer story that’s as bold as it is eye-opening. In this session, Eutelsat and DataOps.live pull back the curtain on what it really takes to deliver business-changing outcomes, with a specific focus on the use cases addressed with Apache Iceberg at the core. And these use cases are BIG – think about big, big numbers, and you still aren’t even close!

You’ll hear the inside story of how Eutelsat found itself with two “competing” cloud data platforms. What could have been an expensive headache turned out to be an advantage: Iceberg made it not only possible but cheaper and simpler to use both together, unlocking agility and cost savings that no single platform alone could provide.

The impact is already tangible. Telemetry pipelines are live and delivering massive value. Next up: interoperable Data Products seamlessly moving from Snowflake to Cloudera and vice versa, driving cross-platform innovation. And that’s just the start—Eutelsat is also positioning Iceberg as a future-proof standard for data sharing and export.
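For readers wondering what that interoperability looks like in practice, here is a minimal sketch of a Snowflake-managed Iceberg table whose files land in shared object storage, where another engine can read them. Connection details, volume, and table names are hypothetical:

```python
# Sketch: a Snowflake-managed Iceberg table whose data and metadata live in
# open object storage, so another engine (e.g. Cloudera) can read the same
# files. Connection details, volume, and table names are hypothetical.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account>", "user": "<user>", "password": "<password>",
}
session = Session.builder.configs(connection_parameters).create()

session.sql("""
    CREATE ICEBERG TABLE IF NOT EXISTS telemetry_events (
        event_id    STRING,
        satellite   STRING,
        recorded_at TIMESTAMP_NTZ
    )
    CATALOG = 'SNOWFLAKE'            -- Snowflake manages the Iceberg metadata
    EXTERNAL_VOLUME = 'iceberg_vol'  -- object storage both platforms can reach
    BASE_LOCATION = 'telemetry_events/'
""").collect()
```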

This is a story of scale, speed, and simplification—the kind of transformation only possible when a visionary team meets the right technology.

A pioneer of the low-code market since 2001, enterprise software delivery company OutSystems has evolved rapidly alongside the changing landscape of data. With a global presence and a vast community of over 750,000 members, OutSystems continues to leverage innovative tools, including data observability and generative AI, to help its customers succeed.

In this session, Pedro Sá Martins, Head of Data Engineering, will share the evolution of OutSystems’ data landscape, including how OutSystems has partnered with Snowflake, Fivetran and Monte Carlo to address their modern data challenges. He’ll share best practices for implementing scalable data quality programs to drive innovative technologies, as well as what’s on the data horizon for the OutSystems team.

AI agents need seamless access to enterprise data to deliver real value. DataHub's new MCP server creates the universal bridge that connects any AI agent to your entire data infrastructure through a single interface.

This session demonstrates how organizations are breaking down data silos by enabling AI agents to intelligently discover and interact with data across Snowflake, Databricks, BigQuery, and other platforms. See live examples of AI-powered data discovery, real-time incident response, and automated impact analysis.
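As a rough sketch of the plumbing, the snippet below uses the generic MCP Python SDK to launch a DataHub MCP server and enumerate the tools an agent could call. The server command and environment variables are assumptions; check DataHub's documentation for the actual entry point:

```python
# Sketch: connect a generic MCP client to a DataHub MCP server and list the
# tools an AI agent could invoke. Server command and env are assumptions.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    server = StdioServerParameters(
        command="uvx",
        args=["mcp-server-datahub"],  # assumed server package name
        env={"DATAHUB_GMS_URL": "http://localhost:8080"},  # assumed config
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Each tool (e.g. search, lineage lookup) becomes a capability
            # the agent can call against the metadata graph.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)


asyncio.run(main())
```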

Learn how forward-thinking data leaders are positioning their organizations at the center of the AI revolution by implementing universal data access strategies that scale across their entire ecosystem.

Change – social and environmental, rural and urban – disproportionately impacts women and children. The UN reports that 80% of people displaced by climate change are women and girls. Research shows that domestic violence and crime, particularly against women, increases with male job loss. Understanding prevailing patterns by combining location data with social and environmental data can help identify vulnerable populations and mitigate these risks.

Join this session with Snowflake and Ordnance Survey to:

• Explore how integrating different types of geospatial data, such as maps, with open source data can expose the impact of these changes

• Discover how overlaying this information enriches visualization and analysis of vulnerabilities to identify better solutions

• Learn how data from the UK’s Ordnance Survey, available through the Snowflake Marketplace, addresses global challenges and helps mitigate risks

• Hear about Snowflake’s initiative to End Data Disparity

Join Tom Pryor, Principal Data Engineer, as he shares how his team has harnessed the power of Snowflake to transform their data strategy into a robust, scalable foundation for digital innovation and AI enablement. This session will explore how Snowflake has unified data across the enterprise, enabling real-time insights, powering customer-facing digital applications, and laying the groundwork for advanced AI capabilities. Tom will walk through key architectural decisions, data governance practices, and the evolution from legacy systems to a modern data platform.

In the scramble for agentic systems, the question has to be asked: are we ready?

This session highlights the common challenges and complexities we face in the rush toward autonomous orchestration. We'll demonstrate how Snowflake's AI data platform offers a unified, adaptable foundation for creating data agents you can trust.

talk
by Anastasiia Stefanska (Women on Snowflake User Group), Isabella Renzetti (Women on Snowflake User Group)

A guided session by Anastasiia Stefanska and Isabella Renzetti, co-founders of the Women on Snowflake User Group, covering: The most exciting new features and announcements from Snowflake Summit 2025 and why they matter; Practical tips on how to navigate the World Tour agenda; Highlights of must-attend sessions; How to maximize networking and learning opportunities.

Apache Polaris: The Definitive Guide

Revolutionize your understanding of modern data management with Apache Polaris (incubating), the open source catalog designed for the data lakehouse industry standard, Apache Iceberg. This comprehensive guide takes you on a journey through the intricacies of Apache Iceberg data lakehouses, highlighting the pivotal role of Iceberg catalogs. Authors Alex Merced, Andrew Madson, and Tomer Shiran explore Apache Polaris's architecture and features in detail, equipping you with the knowledge needed to leverage its full potential. Data engineers, data architects, data scientists, and data analysts will learn how to seamlessly integrate Apache Polaris with popular data tools like Apache Spark, Snowflake, and Dremio to enhance data management capabilities, optimize workflows, and secure datasets.

• Get a comprehensive introduction to Iceberg data lakehouses

• Understand how catalogs facilitate efficient data management and querying in Iceberg

• Explore Apache Polaris's unique architecture and its powerful features

• Deploy Apache Polaris locally, and deploy managed Apache Polaris from Snowflake and Dremio

• Perform basic table operations on Apache Spark, Snowflake, and Dremio
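As a taste of the Spark integration the book walks through, here is a minimal PySpark sketch that registers a Polaris catalog through Iceberg's REST catalog API. The URI, credentials, runtime version, and catalog name are placeholders; the book covers the real deployment options:

```python
# Sketch: point Spark at an Apache Polaris catalog via Iceberg's REST
# catalog API. URI, credentials, and catalog name are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("polaris-demo")
    .config("spark.jars.packages",
            "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.6.1")
    .config("spark.sql.catalog.polaris",
            "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.polaris.type", "rest")
    .config("spark.sql.catalog.polaris.uri",
            "http://localhost:8181/api/catalog")
    .config("spark.sql.catalog.polaris.credential",
            "<client_id>:<client_secret>")
    .config("spark.sql.catalog.polaris.warehouse", "demo_catalog")
    .getOrCreate()
)

# Basic table operations against the Polaris-managed catalog.
spark.sql("CREATE NAMESPACE IF NOT EXISTS polaris.demo")
spark.sql("""CREATE TABLE IF NOT EXISTS polaris.demo.trips
             (id BIGINT, city STRING) USING iceberg""")
spark.sql("INSERT INTO polaris.demo.trips VALUES (1, 'Paris')")
spark.sql("SELECT * FROM polaris.demo.trips").show()
```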

Tristan talks with Mikkel Dengsøe, co-founder at SYNQ, to break down what agentic coding looks like in analytics engineering. Mikkel walks through a hands-on project using Cursor, the dbt MCP server, Omni's AI assistant, and Snowflake. They cover where agents shine (staging, unit tests, lineage-aware checks), where they're risky (BI chat for non-experts), and how observability is shifting from dashboards to root-cause explanations. For full show notes and to read 6+ years of back issues of the podcast's companion newsletter, head to https://roundup.getdbt.com. The Analytics Engineering Podcast is sponsored by dbt Labs.

Data Modeling with Snowflake - Second Edition

Data Modeling with Snowflake provides a clear and practical guide to mastering data modeling tailored to the Snowflake Data Cloud. By integrating foundational principles of database modeling with Snowflake's unique features and functionality, this book empowers you to create scalable, cost-effective, and high-performing data solutions.

What this book will help me do:

• Apply universal data modeling concepts within the Snowflake platform effectively.

• Leverage Snowflake's features such as Time Travel and Zero-Copy Cloning for optimized data solutions.

• Understand and utilize advanced techniques like Data Vault and Data Mesh for scalable data architecture.

• Master handling semi-structured data in Snowflake using practical recipes and examples.

• Achieve cost efficiency and resource optimization by aligning modeling principles with Snowflake's architecture.

Author(s): Serge Gershkovich is an accomplished data engineer and seasoned professional in data architecture and modeling. With a passion for simplifying complex concepts, Serge's work leverages his years of hands-on experience to guide readers in mastering both foundational and advanced data management practices. His clear and practical approach ensures accessibility for all levels.

Who is it for? This book is ideal for data developers and engineers seeking practical modeling guidance within Snowflake. It's suitable for data analysts looking to broaden their database design expertise, and for database beginners aiming to get a head start in structuring data. Professionals new to Snowflake will also find its clear explanations of key features aligned with modeling techniques invaluable.
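For a flavor of two features the book highlights, here is a minimal Snowpark sketch of Zero-Copy Cloning and Time Travel; connection details and table names are placeholders:

```python
# Sketch of Zero-Copy Cloning and Time Travel, run through Snowpark.
# Connection details and table names are placeholders.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account>", "user": "<user>", "password": "<password>",
}
session = Session.builder.configs(connection_parameters).create()

# Zero-Copy Cloning: an instant copy that shares storage with the original,
# handy for spinning up dev/test copies of a model.
session.sql("CREATE TABLE orders_dev CLONE orders").collect()

# Time Travel: query the table as it looked an hour ago (within retention).
rows = session.sql("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)").collect()
```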