talk-data.com

Topic

Snowflake

Tags: data_warehouse, cloud, analytics, olap

77 activities tagged

Activity Trend

193 peak/qtr (2020-Q1 to 2026-Q1)

Activities

77 activities · Newest first

How we are building a federated, AI-augmented data platform that balances autonomy and standardization at scale

Platform engineers from a global pharmaceutical company invite you to explore our journey in creating a cloud-native, federated data platform using dbt Cloud, Snowflake, and data mesh principles. Discover how we established foundational tools and standards, and developed automation and self-service capabilities.

How CHG Healthcare saved 15 months on their migration to Snowflake + dbt Cloud

CHG Healthcare migrated 2,000+ legacy MySQL jobs to dbt Cloud and Snowflake in record time. We'll share how Datafold's AI-powered Migration Agent migrated and refactored convoluted legacy code into dbt Cloud and Snowflake with fully automatic validation, dramatically accelerating the modernization.

How to promote governed dbt workflows for more collaborators and why you should do it

Learn how to structure your dbt projects to enable more collaborative development without losing control. This session covers best practices for managing GitHub repos, organizing Snowflake schemas, and enabling safe, governed access to dbt models. Leave with actionable workflows your platform team can implement today to balance speed and oversight.

Builders Keynote

The Snowflake AI Data Cloud gives engineers and data scientists a platform that makes development fun by letting them focus on the code, not the infrastructure. But how? Join this keynote for a deep dive into technical demos of Snowflake Cortex, Streamlit, Snowflake Native Apps and other AI features. During the keynote, you will hear from technical experts, including Snowflake customers, who will share best practices to help you design and implement your own AI apps and services.

How Coinbase Developed an End-to-End ML Platform on Snowflake

Join this session to learn how Coinbase builds end-to-end ML workflows on top of Snowflake’s platform for optimal data security, governance and price performance. Using features such as Snowflake Feature Store and Snowflake Model Registry, Coinbase now automates batch and online inference on predictive ML models to quickly and accurately unban users who were initially incorrectly flagged as suspected fraud or bots, resulting in an improved user experience and increased revenue.
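For context on the building blocks named above, here is a hedged sketch of the Model Registry workflow using snowflake-ml-python. The toy model, connection placeholders and the "FLAGGED_ACCOUNTS" table are illustrative assumptions, not Coinbase's actual pipeline.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from snowflake.ml.registry import Registry
from snowflake.snowpark import Session

# Placeholder connection; substitute real account parameters.
session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
}).create()

# Toy stand-in for the real fraud/bot classifier.
X = pd.DataFrame(np.random.rand(100, 3), columns=["f1", "f2", "f3"])
y = (X["f1"] > 0.5).astype(int)
clf = LogisticRegression().fit(X, y)

# Version the model in the registry; the sample input drives signature inference.
reg = Registry(session=session)
mv = reg.log_model(clf, model_name="unban_classifier", version_name="v1",
                   sample_input_data=X.head(10))

# Batch inference over a table of flagged accounts (columns must match f1..f3).
scored = mv.run(session.table("FLAGGED_ACCOUNTS"), function_name="predict")
scored.show()
```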

Legacy to Snowflake: How to Avoid Failing Your Data Warehouse Migration

Migrating a legacy data warehouse to Snowflake should be a predictable task. However, having participated in numerous projects, we've seen common failure patterns emerge. In this session, we'll explore typical pitfalls when moving to the Snowflake AI Data Cloud and offer recommendations for avoiding them. We'll cover mistakes at every stage of the process, from technical details to end-user involvement and everything in between: code conversion (using SnowConvert!), data migration, deployment, optimization, testing and project management.
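One pitfall this kind of session typically calls out is under-investing in validation. As a hedged, generic illustration (not from the talk): the sketch below compares per-table row counts between a legacy warehouse and Snowflake over DB-API connections; `legacy_conn`, `sf_conn` and the table names are placeholders.

```python
# Minimal migration smoke test: compare row counts per table between the
# legacy warehouse and Snowflake. Both connections are assumed to be
# DB-API 2.0 compatible (e.g., snowflake.connector.connect(...) for the target).
TABLES = ["ORDERS", "CUSTOMERS", "LINE_ITEMS"]  # illustrative table names

def row_counts(conn, tables):
    """Return {table: row_count} for the given connection."""
    counts = {}
    cur = conn.cursor()
    for t in tables:
        cur.execute(f"SELECT COUNT(*) FROM {t}")
        counts[t] = cur.fetchone()[0]
    return counts

source = row_counts(legacy_conn, TABLES)  # legacy_conn: source DB connection (placeholder)
target = row_counts(sf_conn, TABLES)      # sf_conn: Snowflake connection (placeholder)

for t in TABLES:
    flag = "OK" if source[t] == target[t] else "MISMATCH"
    print(f"{t}: source={source[t]:,} target={target[t]:,} [{flag}]")
```

Row counts are only a first gate; checksums or column-level diffs would catch value-level drift that counts alone miss.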

Making the Most of Generative AI in Snowflake: A Live Prompt Engineering Demo

Unlock the full potential of generative AI in Snowflake with hands-on prompt engineering! In this live demo, we'll build and refine prompts from scratch, showcasing how to harness LLM-powered features like Cortex Analyst, Document AI and Cortex functions. Attendees will see real-time development, practical use cases and best practices to get the most out of Snowflake’s AI capabilities — whether for extracting insights, automating workflows or enhancing data pipelines with AI-driven intelligence.
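For readers who want to try this outside the demo, here is a minimal sketch of calling a Cortex LLM function from Snowpark Python. The connection values, model name and prompt are placeholders; SNOWFLAKE.CORTEX.COMPLETE is Snowflake's documented entry point for single-prompt completions.

```python
from snowflake.snowpark import Session

# Placeholder credentials; substitute your own account, user and warehouse.
session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
}).create()

prompt = "Summarize the key risks in this contract clause: <clause text here>"

# COMPLETE(model, prompt) runs the prompt against a Snowflake-hosted LLM.
row = session.sql(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE(?, ?) AS answer",
    params=["mistral-large", prompt],
).collect()[0]
print(row["ANSWER"])
```

Iterating on the prompt string and re-running the query is exactly the refine-and-rerun loop a live prompt-engineering demo would walk through.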

Opening Keynote

Hear from Snowflake CEO Sridhar Ramaswamy as he discusses the impact AI has had across every organization and how the Snowflake AI Data Cloud is helping to accelerate enterprise AI. Then, in a CEO fireside conversation, Sridhar and NVIDIA founder and CEO Jensen Huang will discuss what the future holds in this new AI era. Finally, Snowflake CMO Denise Persson will be joined by industry leaders to discuss their organizations' data initiatives and the successes and challenges of driving impact with data and AI.

Platform Keynote

Industry-leading companies leverage the Snowflake AI Data Cloud to transform their businesses through AI innovation. Join Snowflake CEO Sridhar Ramaswamy, Co-Founder and President of Product Benoit Dageville, and EVP of Product Christian Kleinerman as they unveil the latest innovations in Snowflake’s unified platform that make it easy to break down silos, develop and distribute modern apps, and securely empower everyone with AI. You’ll see live demos from Snowflake’s engineering and product teams, and hear from some of the best-known global organizations on how they are shaping industries with the Snowflake AI Data Cloud.

Databricks + Apache Iceberg™: Managed and Foreign Tables in Unity Catalog

Unity Catalog support for Apache Iceberg™ brings open, interoperable table formats to the heart of the Databricks Lakehouse. In this session, we’ll introduce new capabilities that allow you to write Iceberg tables from any REST-compatible engine, apply fine-grained governance across all data, and unify access to external Iceberg catalogs like AWS Glue, Hive Metastore, and Snowflake Horizon. Learn how Databricks is eliminating data silos, simplifying performance with Predictive Optimization, and advancing a truly open lakehouse architecture with Delta and Iceberg side by side.
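As a hedged sketch of the "any REST-compatible engine" claim: below, PyIceberg reads a Unity Catalog-managed Iceberg table over the Iceberg REST protocol. The endpoint path, token-based auth and table name are assumptions about a typical workspace setup; consult the Unity Catalog documentation for the exact URI.

```python
from pyiceberg.catalog import load_catalog

# Connect to Unity Catalog's Iceberg REST endpoint (path is an assumption;
# verify against your workspace docs). The "warehouse" is the UC catalog name.
catalog = load_catalog(
    "unity",
    **{
        "type": "rest",
        "uri": "https://<workspace-host>/api/2.1/unity-catalog/iceberg",
        "token": "<databricks-personal-access-token>",
        "warehouse": "<catalog_name>",
    },
)

# Load a table by <schema>.<table> and pull a small sample locally.
table = catalog.load_table("analytics.daily_orders")
df = table.scan(limit=100).to_pandas()
print(df.head())
```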

How to Migrate From Snowflake to Databricks SQL

Migrating your Snowflake data warehouse to the Databricks Data Intelligence Platform can accelerate your data modernization journey. Though a cloud-to-cloud platform migration should be relatively straightforward, the breadth of the Databricks Platform provides flexibility that in turn requires careful planning and execution. In this session, we present the migration methodology, technical approaches, automation tools, product/feature mapping, a technical demo and best practices, using real-world case studies, for migrating data, ELT pipelines and warehouses from Snowflake to Databricks.
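One representative step, sketched under assumptions: on Databricks, the built-in Spark Snowflake connector can pull a table across, after which it can be landed as a Delta table. All option values and object names below are placeholders; `spark` is the ambient SparkSession in a Databricks notebook.

```python
# Connector options follow the documented sfOptions naming; values are placeholders.
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "MIGRATION_WH",
}

# Read one Snowflake table through the connector.
df = (
    spark.read.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "ORDERS")
    .load()
)

# Land the data in Unity Catalog as a managed Delta table.
df.write.format("delta").mode("overwrite").saveAsTable("main.analytics.orders")
```

At scale you would drive this from a table inventory and validate each load, rather than copying tables one by one.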

Iceberg Table Format Adoption and Unified Metadata Catalog Implementation in Lakehouse Platform

DoorDash's Data organization is actively adopting the lakehouse paradigm. This presentation describes a methodology for migrating classic data warehouse and data lake platforms to a unified lakehouse solution. The objectives of this effort include: eliminating excessive data movement; seamlessly integrating and consolidating the query engine layers, including Snowflake, Databricks, EMR and Trino; optimizing query performance; abstracting away the complexity of underlying storage layers and table formats; and making a strategic, justified decision on the unified metadata catalog used across various compute platforms.

Optimizing Analytics Infrastructure: Lessons from Migrating Snowflake to Databricks

This session explores the strategic migration from Snowflake to Databricks, focusing on the journey of transforming a data lake to leverage Databricks’ advanced capabilities. It outlines the assessment of key architectural differences, performance benchmarks, and cost implications driving the decision. Attendees will gain insights into planning and execution, including data ingestion pipelines, schema conversion and metadata migration. Challenges such as maintaining data quality, optimizing compute resources and minimizing downtime are discussed, alongside solutions implemented to ensure a seamless transition. The session highlights the benefits of unified analytics and enhanced scalability achieved through Databricks, delivering actionable takeaways for similar migrations.

Master Schema Translations in the Era of Open Data Lake

Unity Catalog puts a variety of schemas into a centralized repository; now the developer community wants more productivity and automation for schema inference, translation, evolution and optimization, especially for ingestion and reverse-ETL scenarios with more code generation. Coinbase's Data Platform attempts to pave a path with "Schemaster," which interacts with the data catalog through a proposed metadata model to make schema translation and evolution more manageable across popular systems such as Delta, Iceberg, Snowflake, Kafka, MongoDB, DynamoDB and Postgres. This lightning talk covers four areas: (1) the complexity and caveats of schema differences among these systems; (2) the proposed field-level metadata model and two translation patterns, point-to-point vs. hub-and-spoke; (3) why data profiling should be augmented to enhance schema understanding and translation; and (4) integrating it with ingestion and reverse-ETL in a Databricks-oriented ecosystem. Takeaway: standardize schema lineage and translation.
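Since Schemaster itself is not public, here is a purely hypothetical sketch of the hub-and-spoke pattern the talk contrasts with point-to-point: each system maps to and from one canonical field-level model, so N systems need N mappings instead of N×N pairwise translators. All type names and mappings are illustrative.

```python
from dataclasses import dataclass

@dataclass
class CanonicalField:
    """Hub representation: one field in the canonical, system-neutral model."""
    name: str
    type: str            # canonical logical type, e.g. "string", "int64", "timestamp"
    nullable: bool = True

# Spoke mappings: each target system only needs one dictionary to/from the hub.
TO_SNOWFLAKE = {"string": "VARCHAR", "int64": "NUMBER(19,0)", "timestamp": "TIMESTAMP_NTZ"}
TO_POSTGRES = {"string": "TEXT", "int64": "BIGINT", "timestamp": "TIMESTAMP"}

def translate(fields: list[CanonicalField], mapping: dict[str, str]) -> list[str]:
    """Render canonical fields as target-system column definitions."""
    return [
        f"{f.name} {mapping[f.type]}{'' if f.nullable else ' NOT NULL'}"
        for f in fields
    ]

schema = [
    CanonicalField("user_id", "int64", nullable=False),
    CanonicalField("event_ts", "timestamp"),
]
print(translate(schema, TO_SNOWFLAKE))  # ['user_id NUMBER(19,0) NOT NULL', 'event_ts TIMESTAMP_NTZ']
print(translate(schema, TO_POSTGRES))   # ['user_id BIGINT NOT NULL', 'event_ts TIMESTAMP']
```

Adding a new system means writing one mapping pair against the hub, which is where the productivity claim of the hub-and-spoke pattern comes from.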

How to Build an Open Lakehouse: Best Practices for Interoperability

Building an open data lakehouse? Start with the right blueprint. This session walks through common reference architectures for interoperable lakehouse deployments across AWS, Google Cloud, Azure and tools like Snowflake, BigQuery and Microsoft Fabric. Learn how to design for cross-platform data access, unify governance with Unity Catalog and ensure your stack is future-ready — no matter where your data lives.

Sponsored by: Onehouse | Open By Default, Fast By Design: One Lakehouse That Scales From BI to AI

You already see the value of the lakehouse. But are you truly maximizing its potential across all workloads, from BI to AI? In this session, Onehouse unveils how our open lakehouse architecture unifies your entire stack, enabling true interoperability across formats, catalogs, and engines. From lightning-fast ingestion at scale to cost-efficient processing and multi-catalog sync, Onehouse helps you go beyond trade-offs. Discover how Apache XTable (Incubating) enables cross-table-format compatibility, how OpenEngines puts your data in front of the best engine for the job, and how OneSync keeps data consistent across Snowflake, Athena, Redshift, BigQuery, and more. Meanwhile, our purpose-built lakehouse runtime slashes ingest and ETL costs. Whether you’re delivering BI, scaling AI, or building the next big thing, you need a lakehouse that’s open and powerful. Onehouse opens everything—so your data can power anything.