talk-data.com

Event

Data + AI Summit 2025

2025-06-09 – 2025-06-13 Databricks Summit

Activities tracked: 425

Filtering by: AI/ML

Sessions & talks

Showing 201–225 of 425 · Newest first

Sponsored by: Accenture & Avanade | Reinventing State Services with Databricks: AI-Driven Innovations in Health and Transportation

2025-06-11
lightning_talk
Ajali Sen (Accenture)

One of the largest and most trailblazing U.S. states is setting a new standard for how governments can harness data and AI to drive large-scale impact. In this session, we will explore how we are using the Databricks Data Intelligence Platform to address two of the state's most pressing challenges: public health and transportation. From vaccine tracking powered by intelligent record linkage and a service-oriented analytics architecture, to GenAI-driven insights that reduce traffic fatalities and optimize infrastructure investments, this session reveals how scalable, secure, and real-time data solutions are transforming state operations. Join us to learn how data-driven governance is delivering better outcomes for millions—and paving the way for an AI-enabled, data-driven, and more responsive government.

Sponsored by: Atlan | Domain-driven Data Governance in the AI Era: A Conversation with General Motors and Atlan

2025-06-11
lightning_talk

Now the largest automaker in the United States, selling more than 2.7 million vehicles in 2024, General Motors is setting a bold vision for its future, with software-defined vehicles and AI as a driving force. With data as a crucial asset, a transformation of this scale calls for a modern approach to data governance. Join Sherri Adame, Enterprise Data Governance Leader at General Motors, to learn about GM’s novel governance approach, supported by technologies like Atlan and Databricks. Hear how Sherri and her team are shifting governance to the left with automation, implementing data contracts, and accelerating data product discovery across domains, creating a cultural shift that emphasizes data as a competitive advantage.

Sponsored by: Tiger Analytics | Data-Driven Transformation to Hypercharge Predictive and Diagnostic Supply Chain Intelligence

2025-06-11
lightning_talk
Vishal Puri (Tiger Analytics)

Manufacturers today need efficient, accurate, and flexible integrated planning across supply, demand, and finance. A leading industrial manufacturer is pursuing a competitive edge in Integrated Business Planning through data and AI. Their strategy: a connected, real-time data foundation with democratized access across silos. Using Databricks, we’re building business-centric data products to enable near real-time, collaborative decisions and scaled AI. Unity Catalog ensures data reliability and adoption. Increased data visibility is driving better on-time delivery, inventory optimization, and forecasting, resulting in measurable financial impact. In this session, we’ll share our journey to the north star of “driving from the windshield, not the rearview,” including key data, organization, and process challenges in enabling data democratization; architectural choices for Integrated Business Planning as a data product; and core capabilities delivered with Tiger’s Accelerator.

Achieving AI Success with a Solid Data Foundation

2025-06-11
talk
Santosh Kudva (GE Vernova) , Kevin Tollison (EY)

Join us for an insightful presentation on creating a robust data architecture to drive business outcomes in the age of Generative AI. Santosh Kudva, GE Vernova Chief Data Officer, and Kevin Tollison, EY AI Consulting Partner, will share their expertise on transforming data strategies to unleash the full potential of AI. Learn how GE Vernova, a dynamic enterprise born from the 2024 spin-off of GE, revamped its diverse data landscape. They will provide a look into how they integrated the pre-spin-off Finance Data Platform into the GE Vernova Enterprise Data & Analytics ecosystem utilizing Databricks to enable high-performance AI-led analytics. Key insights include:
• Incorporating Generative AI into your overarching strategy
• Leveraging comprehensive analytics to enhance data quality
• Building a resilient data framework adaptable to continuous evolution
Don't miss this opportunity to hear from industry leaders and gain valuable insights to elevate your data strategy and AI success.

Agent Bricks: Building Multi-Agent Systems for Structured and Unstructured Information

2025-06-11
talk
Elise Gonzales (Databricks)

Learn how to build sophisticated systems that enable natural language interactions with both your structured databases and unstructured document collections. This session explores advanced techniques for creating unified and governed AI systems that can seamlessly interpret questions, retrieve relevant information and generate accurate answers across your entire data ecosystem. Key takeaways include:
• Strategies for combining vector search over unstructured documents with retrieval from structured databases
• Techniques for optimizing unstructured data processing through effective parsing, metadata enrichment and intelligent chunking
• Methods for integrating different retrieval mechanisms while ensuring consistent data governance and security
• Practical approaches for evaluating and improving knowledge base question answering (KBQA) system quality through automated and human feedback
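The "intelligent chunking" step called out above is often the deciding factor in retrieval quality. As a rough illustration of the idea (a generic sketch, not the Agent Bricks API; function name and parameters are invented), overlapping chunks ensure that context spanning a boundary appears intact in at least one chunk:

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks so that content spanning a
    chunk boundary is preserved whole in at least one chunk."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break  # the final chunk already reaches the end of the text
    return chunks
```

Production systems typically chunk on semantic boundaries (sentences, headings) rather than fixed character offsets, but the overlap principle is the same.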

AI Agents Fundamentals Training

2025-06-11
talk

This course will introduce you to AI agents, their transformative impact on organizations, and how Databricks enables the creation of AI agent systems. We’ll begin by exploring what AI agents are, how they differ from traditional AI systems, and why they are becoming essential in today’s data-driven landscape. Next, we’ll examine how AI agents can be used to automate tasks, enhance decision-making, and unlock new efficiencies for businesses of all sizes. Finally, we’ll review real-world examples of AI agent systems on Databricks, showcasing practical applications across industries and sharing key considerations for successful adoption. On completion, you can take a short quiz and earn a badge to validate your learning.

Autonomous AI Agents in AI Infrastructure

2025-06-11
lightning_talk
Apurva Kumar (Walmart Global Tech)

Autonomous AI agents are transforming industries by enabling systems to perform tasks, make decisions and adapt in real time without human intervention. In this talk, I will delve into the architecture and design principles required to build these agents within scalable AI infrastructure. Key topics will include constructing modular, reusable frameworks, optimizing resource allocation and enabling interoperability between agents and data pipelines. I will discuss practical use cases in which attendees will learn how to leverage containerization and orchestration techniques to enhance the flexibility and performance of these agents while ensuring low-latency decision-making. This session will also highlight challenges like ensuring robustness, ethical considerations and strategies for real-time feedback loops. Participants will gain actionable insights into building autonomous AI agents that drive efficiency, scalability and innovation in modern AI ecosystems.

Building a Self-Service Data Platform With a Small Data Team

2025-06-11
talk
Gleb Lesnikov (Dodo Brands) , Evgenii Dobrynin (Dodo Brands)

Discover how Dodo Brands, a global pizza and coffee business with over 1,200 retail locations and 40k employees, revolutionized their analytics infrastructure by creating a self-service data platform. This session explores the approach to empowering analysts, data scientists and ML engineers to independently build analytical pipelines with minimal involvement from data engineers. By leveraging Databricks as the backbone of their platform, the team developed automated tools like a "job-generator" that uses Jinja templates to streamline the creation of data jobs. This approach minimized manual coding and enabled non-data engineers to create over 1,420 data jobs — 90% of which were auto-generated from user configurations, supporting thousands of weekly active users via tools like Apache Superset. This session provides actionable insights for organizations seeking to scale their analytics capabilities efficiently without expanding their data engineering teams.
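The abstract doesn't include the templates themselves, but the config-driven "job-generator" idea can be sketched in a few lines. Dodo's tool renders Jinja templates; this minimal sketch substitutes Python's stdlib `string.Template`, and the job shape, field names, and SQL are hypothetical:

```python
from string import Template

# Hypothetical job template; the real tool renders Jinja templates.
JOB_TEMPLATE = Template("""\
CREATE OR REPLACE TABLE $target AS
SELECT $columns
FROM $source
WHERE event_date >= '$start_date'
""")

def generate_job(config: dict) -> str:
    """Render a full data-job definition from a user-supplied config dict,
    so analysts only declare what they need, not how to build it."""
    return JOB_TEMPLATE.substitute(
        target=config["target"],
        columns=", ".join(config["columns"]),
        source=config["source"],
        start_date=config["start_date"],
    )
```

A user supplies only the configuration (target table, columns, source); the generator emits the complete job definition, which is how the vast majority of jobs can be created without hand-written code.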

Building Intelligent AI Agents With Claude Models and Databricks Mosaic AI Framework

2025-06-11
talk
Sam Flamini (Anthropic)

This session is repeated. Explore how Anthropic's frontier models power AI agents in the Databricks Mosaic AI Agent Framework. Learn to leverage Claude's state-of-the-art capabilities for complex agentic workflows while benefiting from Databricks' unified governance, credential management and evaluation tools. We'll demonstrate how Anthropic's models integrate seamlessly to create production-ready applications that combine Claude's reasoning with Databricks' data intelligence capabilities.

Comprehensive Guide to MLOps on Databricks

2025-06-11
talk
Arpit Jasapara (Databricks) , Eric Golinko (Databricks)

This in-depth session explores advanced MLOps practices for implementing production-grade machine learning workflows on Databricks. We'll examine the complete MLOps journey from foundational principles to sophisticated implementation patterns, covering essential tools including MLflow, Unity Catalog, Feature Stores and version control with Git. Dive into Databricks' latest MLOps capabilities including MLflow 3.0, which enhances the entire ML lifecycle from development to deployment with particular focus on generative AI applications. Key session takeaways include:
• Advanced MLflow 3.0 features for LLM management and deployment
• Enterprise-grade governance with Unity Catalog integration
• Robust promotion patterns across development, staging and production
• CI/CD pipeline automation for continuous deployment
• GenAI application evaluation and streamlined deployment

Databricks on Databricks: Transforming the Sales Experience using GenAI Agents at Scale

2025-06-11
talk
Manjeet Singh Chhabra (Databricks) , Akhil Aggrawal (Databricks)

Databricks is transforming its sales experience with a GenAI agent — built and deployed entirely on Databricks — to automate tasks, streamline data retrieval, summarize content, and enable conversational AI for over 4,000 sellers. This agent leverages the AgentEval framework, AI Bricks, and Model Serving to process both structured and unstructured data within Databricks, unlocking deep sales insights. The agent integrates seamlessly and securely with multiple data sources, including Salesforce, Google Drive, and Glean, via OAuth. This session includes a live demonstration and explores the business impact, architecture, and agent development and evaluation strategies, providing a blueprint for deploying secure, scalable GenAI agents in large enterprises.

From Prediction to Prevention: Transforming Risk Management in Insurance

2025-06-11
talk
Sebastien Gignac (Intact Financial Corp) , Dylani Herath (Thrivent Financial) , Marcela Granados (Databricks) , Michael Ban (Nationwide)

Protecting insurers against emerging threats is critical. This session reveals how leading companies use Databricks’ Data Intelligence Platform to transform risk management, enhance fraud detection, and ensure compliance. Learn how advanced analytics, AI, and machine learning process vast data in real time to identify risks and mitigate threats. Industry leaders will share strategies for building resilient operations that protect against financial losses and reputational harm. Key takeaways:
• AI-powered fraud prevention using anomaly detection and predictive analytics
• Real-time risk assessment models integrating IoT, behavioral, and external data
• Strategies for robust compliance and governance with operational efficiency
Discover how data intelligence is revolutionizing insurance risk management and safeguarding the industry’s future.

Hands-on Learning: AI-Powered Data Engineering with Lakeflow: Techniques for Modern Data Professionals

2025-06-11
talk
Frank Munz (Databricks)

This introductory workshop caters to data engineers seeking hands-on experience and data architects looking to deepen their knowledge. The workshop is structured to provide a solid understanding of the following data engineering and streaming concepts:
• Introduction to Lakeflow and the Data Intelligence Platform
• Getting started with Lakeflow Declarative Pipelines for SQL-based declarative pipelines using Streaming Tables and Materialized Views
• Mastering Databricks Workflows with advanced control flow and triggers
• Understanding serverless compute
• Data governance and lineage with Unity Catalog
• Generative AI for data engineers: Genie and Databricks Assistant
We believe you can only become an expert if you work on real problems and gain hands-on experience. Therefore, we will equip you with your own lab environment in this workshop and guide you through practical exercises like using GitHub, ingesting data from various sources, creating batch and streaming data pipelines, and more.

Hands-On Learning: Build Custom Data Intelligence Apps on Databricks

2025-06-11
talk
Justin DeBrabant (Databricks) , Giran Moodley (Databricks) , Ivan Trusov (Databricks)

Want to learn how to build your own custom data intelligence applications directly in Databricks? In this workshop, we’ll guide you through a hands-on tutorial for building a Streamlit web app that leverages many of the key products at Databricks as building blocks. You’ll integrate a live DB SQL warehouse, use Genie to ask questions in natural language, and embed AI/BI dashboards for interactive visualizations. In addition, we’ll discuss key concepts and best practices for building production-ready apps, including logging and observability, scalability, different authorization models, and deployment. By the end, you'll have a working AI app—and the skills to build more.

How Skyscanner Runs Real-Time AI at Scale with Databricks

2025-06-11
talk
Ahmed Bilal (Databricks) , Michael Ewins (Skyscanner)

Deploying AI in production is getting more complex — with different model types, tighter timelines, and growing infrastructure demands. In this session, we’ll walk through how Mosaic AI Model Serving helps teams deploy and scale both traditional ML and generative AI models efficiently, with built-in monitoring and governance. We’ll also hear from Skyscanner on how they’ve integrated AI into their products, scaled to 100+ production endpoints, and built the processes and team structures to support AI at scale. Key takeaways:
• How Skyscanner ships and operates AI in real-world products
• How to deploy and scale a variety of models with low latency and minimal overhead
• Building compound AI systems using models, feature stores, and vector search
• Monitoring, debugging, and governing production workloads

Lakebase: Fully Managed Postgres for the Lakehouse

2025-06-11
talk
Abbey Russell (Databricks) , Dave Nettleton (Databricks)

Lakebase is a new Postgres-compatible OLTP database designed to support intelligent applications. Lakebase eliminates custom ETL pipelines with built-in lakehouse table synchronization, supports sub-10ms latency for high-throughput workloads, and offers full Postgres compatibility, so you can build applications more quickly. In this session, you’ll learn how Lakebase enables faster development, production-level concurrency, and simpler operations for data engineers and application developers building modern, data-driven applications. We'll walk through key capabilities, example use cases, and how Lakebase simplifies infrastructure while unlocking new possibilities for AI and analytics.

Leveraging GenAI for Synthetic Data Generation to Improve Spark Testing and Performance in Big Data

2025-06-11
lightning_talk
Satej Kumar Sahu (Zalando SE)

Testing Spark jobs in local environments is often difficult due to the lack of suitable datasets, especially under tight timelines. This creates challenges when jobs work in development clusters but fail in production, or when they run locally but encounter issues in staging clusters due to inadequate documentation or checks. In this session, we’ll discuss how these challenges can be overcome by leveraging Generative AI to create custom synthetic datasets for local testing. By incorporating variations and sampling, a testing framework can be introduced to solve some of these challenges, allowing for the generation of realistic data to aid in performance and load testing. We’ll show how this approach helps identify performance bottlenecks early, optimize job performance and recognize scalability issues while keeping costs low. This methodology fosters better deployment practices and enhances the reliability of Spark jobs across environments.
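As a toy illustration of the sampling step described above: in the talk's approach a GenAI model would propose the schema, value distributions, and edge cases, while the generation itself can be plain code. Everything below (names, schema, skew parameters) is invented for the sketch:

```python
import random

def synthesize_orders(n: int, null_rate: float = 0.05, seed: int = 42) -> list[dict]:
    """Generate synthetic order rows with production-like quirks for local
    Spark testing: occasional nulls and a few 'hot' keys dominating volume."""
    rng = random.Random(seed)  # fixed seed keeps test runs reproducible
    countries = ["DE", "FR", "IT", "PL", "ES"]
    customers = [f"c{i}" for i in range(100)]
    # Skewed weights: the first 5 customers generate roughly a third of all
    # rows, mimicking the hot keys that cause skewed shuffles in production.
    weights = [10 if i < 5 else 1 for i in range(100)]
    return [
        {
            "order_id": i,
            "customer_id": rng.choices(customers, weights=weights)[0],
            "country": None if rng.random() < null_rate else rng.choice(countries),
            "amount": round(rng.lognormvariate(3, 1), 2),
        }
        for i in range(n)
    ]
```

Feeding rows like these into `spark.createDataFrame` lets a job hit null handling, key skew, and heavy-tailed aggregations locally, before it ever reaches a staging cluster.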

Modernizing Critical Infrastructure: AI and Data-Driven Solutions in Nuclear and Utility Operations

2025-06-11
talk
Lou Martinez Sancho (Westinghouse Electric Company) , Shane Powell (Alabama Power) , Nick Whatley (Southern Company) , Amar Sethi (Databricks)

This session showcases how both Westinghouse Electric and Alabama Power Company are leveraging cloud-based tools, advanced analytics, and machine learning to transform operational resilience across the energy sector. In the first segment, we'll explore AI's crucial role in enhancing safety, efficiency, and compliance in nuclear operations through technologies like HiVE and Bertha, focusing on the unique reliability and credibility requirements of the regulated nuclear industry. We’ll then highlight how Alabama Power Company has modernized its grid management and storm preparedness by using Databricks to develop SPEAR and RAMP—applications that combine real-time data and predictive analytics to improve reliability, efficiency, and customer service.

Retail Genie: No-Code AI Apps for Empowering BI Users to be Self-Sufficient

2025-06-11
talk
Harish Rajagopalan (Databricks) , Siddhesh Pore (Databricks)

Explore how Databricks AI/BI Genie revolutionizes retail analytics, empowering business users to become self-reliant data explorers. This session highlights no-code AI apps that create a conversational interface for retail data analysis. Genie spaces harness NLP and generative AI to convert business questions into actionable insights, bypassing complex SQL queries. We'll showcase retail teams effortlessly analyzing sales trends, inventory and customer behavior through Genie's intuitive interface. Witness real-world examples of AI/BI Genie's adaptive learning, enhancing accuracy and relevance over time. Learn how this technology democratizes data access while maintaining governance via Unity Catalog integration. Discover Retail Genie's impact on decision-making, accelerating insights and cultivating a data-driven retail culture. Join us to see the future of accessible, intelligent retail analytics in action.

Revolutionizing Banking Data, Analytics and AI: Building an Enterprise Data Hub With Databricks

2025-06-11
talk
Shailender Sidhu (Deloitte) , Mohan Sankararaman (First Horizon Bank) , Jamie Cosgrove (Databricks)

Explore the transformative journey of a regional bank as it modernizes its enterprise data infrastructure amidst the challenges of legacy systems and past mergers and acquisitions. The bank is creating an Enterprise Data Hub using Deloitte's industry experience and the Databricks Data Intelligence Platform to drive growth, efficiency and Large Financial Institution readiness. This session will showcase how the new data hub will be a one-stop shop for LOB and enterprise needs, while unlocking advanced analytics and GenAI possibilities. Discover how this initiative is going to empower the ambitions of a regional bank to realize their “big bank muscle, small bank hustle.”

Scaling Real-Time Fraud Detection With Databricks: Lessons From DraftKings

2025-06-11
talk
Greg Von Pless (DraftKings) , Monika Hristova (DraftKings)

At DraftKings, ensuring secure, fair gaming requires detecting fraud in real time with both speed and precision. In this talk, we’ll share how Databricks powers our fraud detection pipeline, integrating real-time streaming, machine learning and rule-based detection within a PySpark framework. Our system enables rapid model training, real-time inference and seamless feature transformation across historical and live data. We use shadow mode to test models and rules in live environments before deployment. Collaborating with Databricks, we push online feature store performance and enhance real-time PySpark capabilities. We'll cover PySpark-based feature transformations, real-time inference, scaling challenges and our migration from a homegrown system to Databricks. This session is for data engineers and ML practitioners optimizing real-time AI workloads, featuring a deep dive, code snippets and lessons from building and scaling fraud detection.
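The shadow-mode pattern mentioned above is straightforward to sketch. In plain Python (DraftKings' actual pipeline is PySpark-based, and all names here are illustrative), the key property is that the candidate model is scored and logged but never drives the live decision:

```python
import logging

logger = logging.getLogger("fraud.shadow")

def score_transaction(txn: dict, prod_model, shadow_model) -> bool:
    """Return the production model's decision; run the shadow model
    alongside it purely for offline comparison."""
    prod_flag = prod_model(txn)
    try:
        shadow_flag = shadow_model(txn)
        if shadow_flag != prod_flag:
            # Disagreements are logged for later analysis, never acted upon.
            logger.info("shadow disagreement on txn %s", txn.get("id"))
    except Exception:
        # A failing shadow model must never break the production path.
        logger.exception("shadow model error")
    return prod_flag
```

Comparing the logged disagreements against later-confirmed fraud labels is what lets a new model or rule earn its way into production.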

Self-Service Assortment and Space Analytics at Walmart Scale

2025-06-11
talk
Alexandro Arreola-Garcia (Walmart) , Nikit Shah (Databricks)

Assortment and space analytics optimizes product selection and shelf allocation to boost sales, improve inventory management and enhance customer experience. However, challenges like evolving demand, data accuracy and operational alignment hinder success. Older approaches struggled due to siloed tools, slow performance and poor governance. Databricks' unified platform resolved these issues, enabling seamless data integration, high-performance analytics and governed sharing. The innovative AI/BI Genie interface empowered self-service analytics, driving non-technical user adoption. This solution helped Walmart cut time to value by 90% and saved $5.6M annually in FTE hours, leading to increased productivity. Looking ahead, AI agents will let store managers and merchants execute decisions via conversational interfaces, streamlining operations and enhancing accessibility. This transformation positions retailers to thrive in a competitive, customer-centric market.

Sponsored by: AWS | Ripple: Well-Architected Data & AI Platforms - AWS and Databricks in Harmony

2025-06-11
talk
Priyanka Adhia (Ripple) , Hari Rajendran (Ripple) , Rudy Chetty (AWS)

Join us as we explore the well-architected framework for modern data lakehouse architecture, where AWS's comprehensive data, AI, and infrastructure capabilities align with Databricks' unified platform approach. Building upon core principles of Operational Excellence, Security, Reliability, Performance, and Cost Optimization, we'll demonstrate how Data and AI Governance alongside Interoperability and Usability enable organizations to build robust, scalable platforms. Learn how Ripple modernized its data infrastructure by migrating from a legacy Hadoop system to a scalable, real-time analytics platform using Databricks on AWS. This session covers the challenges of high operational costs, latency, and peak-time bottlenecks—and how Ripple achieved 80% cost savings and 55% performance improvements with Photon, Graviton, Delta Lake, and Structured Streaming.

Sponsored by: Capgemini | Unlocking Business Value With SAP Business Data Cloud and Databricks: Real-World Use Cases

2025-06-11
talk
Thorsten Leiduck (SAP) , Frank Gundlich (Capgemini)

Discover how SAP Business Data Cloud and Databricks can transform your business by unifying SAP and non-SAP data for advanced analytics and AI. In this session, we'll highlight optimizing cash flow with AI, integrating diverse data sources and AI algorithms that enable accurate cash flow forecasting to help businesses identify trends, prevent bottlenecks, and improve liquidity. You'll also learn about the importance of high-quality, well-governed data as the foundation for reliable AI models and actionable reporting. Key takeaways:
• How to integrate and leverage SAP and external data in Databricks
• Using AI for predictive analytics and better decision-making
• Building a trusted data foundation to drive business performance
Leave this session with actionable strategies to optimize your data, enhance efficiency, and unlock new growth opportunities.

Sponsored by: Capital One Software | How Capital One Uses Tokenization to Protect Data

2025-06-11
lightning_talk
Pushkar Waghdhare (Capital One Software) , Tejashree Bapat (Capital One Software)

Modern companies are managing more data than ever before, and the need to derive value from that data is becoming more urgent with AI. But AI adoption is often limited due to data security challenges, and adding to this complexity is the need to remain compliant with evolving regulation. At Capital One, we’ve deployed tokenization to further secure our data without compromising performance. In this talk, we’ll discuss lessons learned from our tokenization journey and show how companies can tokenize the data in their Databricks environment.