talk-data.com

Event

Data + AI Summit 2025

2025-06-09 – 2025-06-13 · Databricks Summit

Activities tracked

425

Filtering by: AI/ML

Sessions & talks

Showing 351–375 of 425 · Newest first

Optimize Cost and User Value Through Model Routing AI Agent

2025-06-10 Watch
talk
Aditya Gautam (Meta)

Each LLM has unique strengths and weaknesses, and there is no one-size-fits-all solution. Companies strive to balance cost reduction with maximizing the value of their use cases by considering factors such as latency, multi-modality, API costs, user need, and prompt complexity. Model routing helps optimize performance and cost while enhancing scalability and user satisfaction. This session gives an overview of training cost-effective routing models using AI gateway logs, user feedback, prompts, and model features to design an intelligent model-routing AI agent. It covers different strategies for model routing, deployment in Mosaic AI, re-training, and evaluation through A/B testing and end-to-end Databricks workflows. It will also delve into the details of training data collection, feature engineering, prompt formatting, custom loss functions, architectural modifications, addressing cold-start problems, query embedding generation and clustering through VectorDB, and RL policy-based exploration.
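The routing idea the abstract describes — matching each prompt to the cheapest model that can handle it — can be sketched in a few lines. This is a generic illustration only: the model names, prices, and the complexity heuristic are invented, not Meta's or Databricks' actual routing logic.

```python
# Toy cost-aware model router (all model names and prices are hypothetical).
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float  # USD per 1K tokens (illustrative)
    quality: float             # relative capability score in [0, 1]

CANDIDATES = [
    Model("small-llm", 0.0002, 0.60),
    Model("medium-llm", 0.0020, 0.80),
    Model("large-llm", 0.0200, 0.95),
]

def estimate_complexity(prompt: str) -> float:
    """Crude stand-in for learned prompt features: length plus a code signal."""
    tokens = len(prompt.split())
    has_code = "```" in prompt or "def " in prompt
    return min(1.0, tokens / 200 + (0.3 if has_code else 0.0))

def route(prompt: str, budget_per_1k: float) -> Model:
    """Pick the cheapest model whose quality covers the prompt's complexity;
    fall back to the most capable model when nothing in budget is strong enough."""
    needed = estimate_complexity(prompt)
    affordable = [m for m in CANDIDATES if m.cost_per_1k_tokens <= budget_per_1k]
    pool = affordable or CANDIDATES
    capable = [m for m in pool if m.quality >= needed]
    if capable:
        return min(capable, key=lambda m: m.cost_per_1k_tokens)
    return max(pool, key=lambda m: m.quality)
```

A production router, as the abstract notes, would replace the heuristic with a trained model over gateway logs and user feedback, and close the loop via A/B testing.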

Revolutionizing Cybersecurity: SCB's Journey to a Self-Managed SIEM

2025-06-10 Watch
talk
Lavy Stokhamer (Standard Chartered Bank)

Join us to explore how Standard Chartered Bank's (SCB) groundbreaking strategy is reshaping the future of the cybersecurity landscape by replacing traditional SIEM with a cutting-edge Databricks solution, achieving remarkable business outcomes:
- 80% reduction in time to detect incidents
- 92% faster threat investigation
- 35% cost reduction
- 60% better detection accuracy
- Significant enhancements in threat detection and response metrics
- Substantial increase in ML-driven use cases
This session unveils SCB's journey to a distributed, multi-cloud lakehouse architecture that unlocks unprecedented performance and commercial optimization. Explore why a unified data and AI platform is becoming the cornerstone of next-generation, self-managed SIEM solutions for forward-thinking organizations in this era of AI-powered banking transformation.

Scaling Data Intelligence at NAB: Balancing Innovation with Enterprise-Grade Governance

2025-06-10 Watch
talk
Tom McMeekin (Databricks), Daniel Antoinette (National Australia Bank)

In this session, discover how National Australia Bank (NAB) is reshaping its data and AI strategy by positioning data as a strategic enabler. Driven by a vision to unlock data like electricity—continuous and reliable—NAB has established a scalable foundation for data intelligence that balances agility with enterprise-grade control. We'll delve into the key architectural, security, and governance capabilities underpinning this transformation, including Unity Catalog, Serverless, Lakeflow and GenAI. The session will highlight NAB's adoption of Databricks Serverless, platform security controls like private link, and persona-based data access patterns. Attendees will walk away with practical insights into building secure, scalable, and cost-efficient data platforms that fuel innovation while meeting the demands of compliance in highly regulated environments.

Sponsored by: Accenture & Avanade | Enterprise Data Journey for The Standard Insurance Leveraging Databricks on Azure and AI Innovation

2025-06-10 Watch
lightning_talk
Sumanta Paul (Accenture)

Modern insurers require agile, integrated data systems to harness AI. This framework for a global insurer uses Azure Databricks to unify legacy systems into a governed lakehouse medallion architecture (bronze/silver/gold layers), eliminating silos and enabling real-time analytics. The solution employs:
- Medallion architecture for incremental data quality improvement
- Unity Catalog for centralized governance, row/column security, and audit compliance
- Azure encryption/confidential computing for data mesh security
- Automated ingestion/semantic/DevOps pipelines for scalability
By combining Databricks’ distributed infrastructure with Azure’s security, the insurer achieves regulatory compliance while enabling AI-driven innovation (e.g., underwriting, claims). The framework establishes a future-proof foundation for mergers/acquisitions (M&A) and cross-functional data products, balancing governance with agility.
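The bronze/silver/gold layering described above can be sketched with plain Python structures. This is only an illustration of the idea — on Databricks each layer would typically be a governed Delta table, and the record fields here are invented:

```python
# Minimal medallion-architecture sketch: raw (bronze) -> cleaned (silver)
# -> aggregated business view (gold). Fields are illustrative.
bronze = [
    {"policy_id": "P1", "premium": "1200", "region": "east "},
    {"policy_id": "P1", "premium": "1200", "region": "east "},   # duplicate row
    {"policy_id": "P2", "premium": None,   "region": "WEST"},    # invalid premium
    {"policy_id": "P3", "premium": "800",  "region": "west"},
]

def to_silver(rows):
    """Clean and deduplicate: drop invalid premiums, normalize types and casing."""
    seen, out = set(), []
    for r in rows:
        if r["premium"] is None or r["policy_id"] in seen:
            continue
        seen.add(r["policy_id"])
        out.append({"policy_id": r["policy_id"],
                    "premium": float(r["premium"]),
                    "region": r["region"].strip().lower()})
    return out

def to_gold(rows):
    """Aggregate to a business-level view: total premium per region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["premium"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
```

Each hop improves data quality incrementally, which is exactly what the medallion pattern is for.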

Sponsored by: AVEVA | CONNECT and Databricks IT-OT Convergence for Industrial Intelligence at Scale

2025-06-10
talk
Glenn Moffett (AVEVA), John Baier (AVEVA)

Industrial organizations are unlocking new possibilities through the partnership between AVEVA and Databricks. The seamless, no-code, zero-copy solution—powered by Delta Sharing and CONNECT—enables companies to combine IT and OT data effortlessly. By bridging the gap between operational and enterprise data, businesses can harness the power of AI, data science, and business intelligence at an unprecedented scale to drive innovation. In this session, explore real-world applications of this integration, including how industry leaders are using CONNECT and Databricks to boost efficiency, reduce costs, and advance sustainability—all without fragmented point solutions. You’ll also see a live demo of the integration, showcasing how secure, scalable access to trusted industrial data is enabling new levels of industrial intelligence across sectors like mining, manufacturing, power, and oil and gas.

Sponsored by: Capital One Software | How to Manage High-Quality, Secure Data and Cost Visibility for AI

2025-06-10
talk
Laura Case (Capital One Software), Yudhish Batra (Capital One)

Companies need robust data management capabilities to build and deploy AI. Data needs to be easy to find, understandable, and trustworthy. And it’s even more important to secure data properly from the beginning of its lifecycle, otherwise it can be at risk of exposure during training or inference. Tokenization is a highly efficient method for securing data without compromising performance. In this session, we’ll share tips for managing high-quality, well-protected data at scale that are key for accelerating AI. In addition, we’ll discuss how to integrate visibility and optimization into your compute environment to manage the hidden cost of AI — your data.

Sponsored by: Domo | Behind the Brand: How Sol de Janeiro Powers Amazon Ops with Databricks + DOMO

2025-06-10 Watch
talk
Caio Pimenta (Sol de Janeiro)

How does one of the world’s fastest-growing beauty brands stay ahead of Amazon’s complexity and scale retail with precision? At Sol de Janeiro, we built a real-time Amazon Operations Hub—powered by Databricks and DOMO—that drives decisions across inventory, profitability, and marketing ROI. See how the Databricks Lakehouse and DOMO dashboards work together to simplify workflows, surface actionable insights, and enable smarter decisions across the business—from frontline operators to the executive suite. In this session, you’ll get a behind-the-scenes look at how we unified trillions of rows from NetSuite, Amazon, Shopify, and carrier systems into a single source of truth. We’ll show how this hub streamlined cross-functional workflows, eliminated manual reporting, and laid the foundation for AI-powered forecasting and automation.

Sponsored by: Firebolt | The Power of Low-latency Data for AI Apps

2025-06-10 Watch
lightning_talk
Cole Bowden (Firebolt)

Retrieval-augmented generation (RAG) has transformed AI applications by grounding responses with external data. But it can be better. By pairing RAG with low-latency SQL analytics, you can enrich responses with instant insights, leading to a more interactive and insightful user experience with fresh, data-driven intelligence. In this talk, we’ll demo how low-latency SQL combined with an AI application can deliver speed, accuracy, and trust.
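The pairing the talk describes — retrieved documents plus a fast analytical query at answer time — can be illustrated with an in-memory SQLite table standing in for the low-latency engine. Table names and the prompt format are invented, and the LLM call itself is omitted:

```python
# Sketch: enrich a RAG prompt with a fresh SQL aggregate computed at question
# time (SQLite stands in for a low-latency analytics engine).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("east", 120.0), ("east", 80.0), ("west", 50.0)])

def fresh_context(region: str) -> str:
    """Low-latency analytical lookup executed when the question arrives."""
    (total,) = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE region = ?",
        (region,)).fetchone()
    return f"Current total order amount for {region}: {total}"

def build_prompt(question: str, retrieved_docs: list, region: str) -> str:
    """Combine retrieved documents with live SQL facts, RAG-style."""
    context = "\n".join(retrieved_docs + [fresh_context(region)])
    return f"Context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("How are east-region sales trending?",
                      ["Doc: Q1 east sales grew 10%."], "east")
```

The resulting prompt grounds the model in both static documents and up-to-the-second data.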

Sponsored by: Monte Carlo | The Illusion of Done: Why the Real Work for AI Starts in Production

2025-06-10 Watch
lightning_talk
Shane Murray (Monte Carlo)

Your model is trained. Your pilot is live. Your data looks AI-ready. But for most teams, the toughest part of building successful AI starts after deployment. In this talk, Shane Murray and Ethan Post share lessons from the development of Monte Carlo’s Troubleshooting Agent – an AI assistant that helps users diagnose and fix data issues in production. They’ll unpack what it really takes to build and operate trustworthy AI systems in the real world, including:
- The Illusion of Done – why deployment is just the beginning, and what breaks in production
- Lessons from the Field – a behind-the-scenes look at the architecture, integration, and user experience of Monte Carlo’s agent
- Operationalizing Reliability – how to evaluate AI performance, build the right team, and close the loop between users and model
Whether you're scaling RAG pipelines or running LLMs in production, you’ll leave with a playbook for building data and AI systems you—and your users—can trust.

Transforming Title Insurance With Databricks Batch Inference

2025-06-10 Watch
talk
Madhu Kolli (First American Financial), Prabhaker Narsina (First American Financial)

Join us as we explore how First American Data & Analytics, a leading property-centric information provider, revolutionized its data extraction processes using batch inference on the Databricks Platform. Discover how it overcame the challenges of extracting data from millions of historical title policy images and reduced project timelines by 75%. Learn how First American optimized its data processing capabilities, reduced costs by 70% and enhanced the efficiency of its title insurance processes, ultimately improving the home-buying experience for buyers, sellers and lenders. This session will delve into the strategic integration of AI technologies, highlighting the power of collaboration and innovation in transforming complex data challenges into scalable solutions.
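As a generic illustration of the batch-inference pattern (not First American's actual pipeline), a large document backlog can be processed in fixed-size batches so failures, retries, and cost tracking are scoped per batch rather than per corpus. The extraction call and document names below are stubs:

```python
# Generic batch-inference loop; fake_extract stands in for a model call
# (on Databricks this pattern maps to batch scoring against a served model).
def fake_extract(doc: str) -> dict:
    """Stand-in for a model that extracts fields from a title-policy document."""
    return {"doc": doc, "fields": {"length": len(doc)}}

def batch_inference(docs, batch_size=2):
    """Chunk the corpus and score each batch; real code would add retries
    and checkpointing at the batch boundary."""
    results = []
    for i in range(0, len(docs), batch_size):
        batch = docs[i:i + batch_size]
        results.extend(fake_extract(d) for d in batch)
    return results

out = batch_inference(["deed-001", "deed-002", "deed-003"])
```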

You Mean I Can Talk to My Data? Reimagining How KPMG Engages Data Using AI|BI Genie

2025-06-10 Watch
lightning_talk
Dennis Tally (KPMG)

“I don’t want to spend time filtering through another dashboard — I just need an answer now.” We’ve all experienced the frustration of wading through dashboards, yearning for immediate answers. Traditional reports and visualizations, though essential, often complicate the process for decision-makers. The digital enterprise demands a shift towards conversational, natural language interactions with data. At KPMG, AI|BI Genie is reimagining our approach by allowing users to inquire about data just as they would consult a knowledgeable colleague, obtaining precise and actionable insights instantly. Discover how the KPMG Contract to Cash team leverages AI|BI Genie to enhance data engagement, drive insights and foster business growth. Join us to see AI|BI Genie in action and learn how you can transform your data interaction paradigm.

AI/BI Dashboards and AI/BI Genie: Dashboards and Last-Mile Analytics Made Simple

2025-06-10 Watch
talk
Josue Bogran (JosueBogran.com & zeb.co), Youssef Mrini (Databricks)

Databricks announced two new features in 2024: AI/BI Dashboards and AI/BI Genie. Dashboards is a redesigned dashboarding experience for your regular reporting needs, while Genie provides a natural language experience for your last-mile analytics. In this session, Databricks Solutions Architect and content creator Youssef Mrini will present alongside Databricks MVP and content creator Josue A. Bogran on how you can get the most value from these tools for your organization. Content covered includes:
- Setup necessary, including Unity Catalog, permissions and compute
- Building out a dashboard with AI/BI Dashboards
- Creating and training an AI/BI Genie workspace to reliably deliver answers
- When to use Dashboards, when to use Genie, and when to use other tools such as PBI, Tableau, Sigma, ChatGPT, etc.
Fluff-free, full of practical tips, and geared to help you deliver immediate impact with these new Databricks capabilities.

Best Practices to Mitigate AI Security Risks

2025-06-10 Watch
talk
Arun Pamulapati (Databricks), Samrat Ray (Databricks)

This session is repeated. AI is transforming industries, enhancing customer experiences and automating decisions. As organizations integrate AI into core operations, robust security is essential. The Databricks Security team collaborated with top cybersecurity researchers from OWASP, Gartner, NIST, HITRUST and Fortune 100 companies to evolve the Databricks AI Security Framework (DASF) to version 2.0. In this session, we’ll cover an AI security architecture using Unity Catalog, MLflow, egress controls, and AI gateway. Learn how security teams, AI practitioners and data engineers can secure AI applications on Databricks. Walk away with:
- A reference architecture for securing AI applications
- A worksheet with AI risks and controls mapped to industry standards like MITRE, OWASP, NIST and HITRUST
- A DASF AI assistant tool to test your AI security

Building AI Models In Health Care Using Semi-Synthetic Data

2025-06-10 Watch
talk
Holden Karau (Fight Health Insurance INC)

Regulated or restricted fields like Health Care make collecting training data complicated. We all want to do the right thing, but how? This talk will look at how Fight Health Insurance used de-identified public and proprietary information to create a semi-synthetic training set for use in fine-tuning machine learning models to power Fight Paperwork. We'll explore how to incorporate the latest "reasoning" techniques in fine tuning as well as how to make models that you can afford to serve — think single GPU inference instead of a cluster of A100s. In addition to the talk we have the code used in a public GitHub repo — although it is a little rough, so you might want to use it more as a source of inspiration rather than directly forking it.

Building Knowledge Agents to Automate Document Workflows

2025-06-10 Watch
talk
Jerry Liu (LlamaIndex)

This session is repeated. One of the biggest promises for LLM agents is automating all knowledge work over unstructured data — we call these "knowledge agents". To date, while there are fragmented tools around data connectors, storage and agent orchestration, AI engineers have trouble building and shipping production-grade agents beyond basic chatbots. In this session, we first outline the highest-value knowledge agent use cases we see being built and deployed at various enterprises. These are:
- Multi-step document research
- Automated document extraction
- Report generation
We then define the core architectural components around knowledge management and agent orchestration required to build these use cases. By the end you'll not only have an understanding of the core technical concepts, but also an appreciation of the ROI you can generate for end-users by shipping these use cases to production.

Composing High-Accuracy AI Systems With SLMs and Mini-Agents

2025-06-10 Watch
talk
Sharon Zhou (Lamini)

This session is repeated. For most companies, building compound AI systems remains aspirational. LLMs are powerful, but imperfect, and their non-deterministic nature makes steering them to high accuracy a challenge. In this session, we’ll demonstrate how to build compound AI systems using SLMs and highly accurate mini-agents that can be integrated into agentic workflows. You'll learn about breakthrough techniques, including:
- Memory RAG, an embedding algorithm that reduces hallucinations by using embed-time compute to generate contextual embeddings, improving indexing and retrieval
- Memory tuning, a finetuning algorithm that reduces hallucinations by using a Mixture of Memory Experts (MoME) to specialize models with proprietary data
We’ll also share real-world examples (text-to-SQL, factual reasoning, function calling, code analysis and more) across various industries. With these building blocks, we’ll demonstrate how to create high-accuracy mini-agents that can be composed into larger AI systems.
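As a generic sketch of embedding-based retrieval — not Lamini's memory RAG algorithm, which uses learned contextual embeddings rather than these toy bag-of-words vectors — indexing and lookup reduce to vector similarity:

```python
# Toy embedding retrieval: bag-of-words vectors + cosine similarity.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Trivial 'embedding': term counts. Real systems use learned vectors."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list, k: int = 1) -> list:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = ["invoice totals by month",
        "employee onboarding checklist",
        "monthly invoice reconciliation steps"]
top = retrieve("monthly invoice reconciliation", docs)
```

Techniques like memory RAG improve on this baseline by spending compute at embed time so the vectors capture context, not just token overlap.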

From Metadata to Agents: Building the future of content understanding with Coactive AI + Databricks

2025-06-10
talk
Augusto Moreno (NBC Universal), William Gaviria Rojas (Coactive AI)

Media enterprises generate vast amounts of visual content, but unlocking its full potential requires multimodal AI at scale. Coactive AI and NBCUniversal’s Corporate Decision Sciences team are transforming how enterprises discover and understand visual content. We explore how Coactive AI and Databricks — from Delta Share to Genie — can revolutionize media content search, tagging and enrichment, enabling new levels of collaboration. Attendees will see how this AI-powered approach fuels AI workflows, enhances BI insights and drives new applications — from automating cut sheet generation to improving content compliance and recommendations. By structuring and sharing enriched media metadata, Coactive AI and NBCU are unlocking deeper intelligence and laying the groundwork for agentic AI systems that retrieve, interpret and act on visual content. This session will showcase real-world examples of these AI agents and how they can reshape future content discovery and media workflows.

Orchestration With Lakeflow Jobs

2025-06-10 Watch
talk
Saad Ansari (Databricks), Anthony Podgorsak (Databricks)

This session is repeated. Curious about orchestrating data pipelines on Databricks? Join us for an introduction to Lakeflow Jobs (formerly Databricks Workflows) — an easy-to-use orchestration service built into the Databricks Data Intelligence Platform. Lakeflow Jobs simplifies automating your data and AI workflows, from ETL pipelines to machine learning model training. In this beginner-friendly session, you'll learn how to:
- Build and manage pipelines using a visual approach
- Monitor workflows and rerun failures with repair runs
- Automate tasks like publishing dashboards or ingesting data using Lakeflow Connect
- Add smart triggers that respond to new files or table updates
- Use built-in loops and conditions to reduce manual work and make workflows more dynamic
We’ll walk through common use cases, share demos and offer tips to help you get started quickly. If you're new to orchestration or just getting started with Databricks, this session is for you.
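The pieces the session covers — task dependencies and file-arrival triggers — come together in a job specification. The sketch below mimics the general shape of a Databricks Jobs API payload (field names and paths are illustrative and should be checked against the current API reference), plus a tiny pass showing how depends_on determines execution order:

```python
# Hypothetical multi-task job spec in the style of the Databricks Jobs API.
job_spec = {
    "name": "nightly-etl",
    "tasks": [
        {"task_key": "ingest",
         "notebook_task": {"notebook_path": "/Pipelines/ingest"}},
        {"task_key": "transform",
         "depends_on": [{"task_key": "ingest"}],
         "notebook_task": {"notebook_path": "/Pipelines/transform"}},
        {"task_key": "publish_dashboard",
         "depends_on": [{"task_key": "transform"}],
         "notebook_task": {"notebook_path": "/Pipelines/publish"}},
    ],
    # Smart trigger: run when new files land instead of on a fixed schedule.
    "trigger": {"file_arrival": {"url": "s3://bucket/landing/"}},
}

def execution_order(spec):
    """Order tasks so each runs after its depends_on prerequisites (Kahn's algorithm)."""
    remaining = {t["task_key"]: {d["task_key"] for d in t.get("depends_on", [])}
                 for t in spec["tasks"]}
    order = []
    while remaining:
        ready = sorted(k for k, deps in remaining.items() if not deps)
        if not ready:  # dependency cycle: bail out rather than loop forever
            break
        order.extend(ready)
        for k in ready:
            del remaining[k]
        for deps in remaining.values():
            deps.difference_update(ready)
    return order
```

The orchestrator's job, at its core, is exactly this topological walk plus retries, triggers, and monitoring.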

Revolutionizing Data Insights and the Buyer Experience at GM Financial with Cloud Data Modernization

2025-06-10 Watch
talk
Latha Subramanian (GM Financial), Rick Whitford (Deloitte Consulting, LLP)

Deloitte and GM (General Motors) Financial have collaborated to design and implement a cutting-edge cloud analytics platform, leveraging Databricks. In this session, we will explore how we overcame challenges including dispersed and limited data capabilities, high-cost hardware and outdated software, with a strategic and comprehensive approach. With the help of Deloitte and Databricks, we were able to develop a unified Customer360 view, integrate advanced AI-driven analytics, and establish robust data governance and cyber security measures. Attendees will gain valuable insights into the benefits realized, such as cost savings, enhanced customer experiences, and broad employee upskilling opportunities. Unlock the impact of cloud data modernization and advanced analytics in the automotive finance industry and beyond with Deloitte and Databricks.

Securing Data Collaboration: A Deep Dive Into Security, Frameworks, and Use Cases

2025-06-10 Watch
talk
El Ghali Benchekroun (Databricks), Bilal Obeidat (Databricks), Bhavin Kukadia (Databricks)

This session will focus on the security aspects of Databricks Delta Sharing, Databricks Cleanrooms and Databricks Marketplace, providing an exploration of how these solutions enable secure and scalable data collaboration while prioritizing privacy. Highlights:
- Use cases — understand how Delta Sharing facilitates governed, real-time data exchange across platforms and how Cleanrooms support multi-party analytics without exposing sensitive information
- Security internals — dive into Delta Sharing's security frameworks
- Dynamic views — learn about fine-grained security controls
- Privacy-first Cleanrooms — explore how Cleanrooms enable secure analytics while maintaining strict data privacy standards
- Private exchanges — explore the role of private exchanges using Databricks Marketplace in securely sharing custom datasets and AI models with specific partners or subsidiaries
- Network security & compliance — review best practices for network configurations and compliance measures

Simplifying Training and GenAI Finetuning Using Serverless GPU Compute

2025-06-10 Watch
talk
Tejas Sundaresan (Databricks)

The last year has seen rapid progress in open source GenAI models and frameworks. This talk covers best practices for custom training and OSS GenAI finetuning on Databricks, powered by the newly announced Serverless GPU Compute. We’ll cover how to use Serverless GPU Compute to power AI training and GenAI finetuning workloads, and framework support for libraries like LLM Foundry, Composer, Hugging Face, and more. Lastly, we’ll cover how to leverage MLflow and the Databricks Lakehouse to streamline the end-to-end development of these models. Key takeaways include:
- How Serverless GPU Compute saves customers valuable developer time and overhead when dealing with GPU infrastructure
- Best practices for training custom deep learning models (forecasting, recommendation, personalization) and finetuning OSS GenAI models on GPUs across the Databricks stack
- Leveraging distributed GPU training frameworks (e.g., PyTorch, Hugging Face) on Databricks
- Streamlining the path to production for these models
Join us to learn about the newly announced Serverless GPU Compute and the latest updates to GPU training and finetuning on Databricks!

Sponsored by: Microsoft | Leverage the power of the Microsoft Ecosystem with Azure Databricks

2025-06-10 Watch
talk
Anavi Nahar (Microsoft)

Join us for this insightful session to learn how you can leverage the power of the Microsoft ecosystem along with Azure Databricks to take your business to the next level. Azure Databricks is a fully integrated, native, first-party solution on Microsoft Azure. Databricks and Microsoft continue to actively collaborate on product development, ensuring tight integration, optimized performance, and a streamlined support experience. Azure Databricks offers seamless integrations with Power BI, Azure OpenAI, Microsoft Purview, Azure Data Lake Storage (ADLS) and Foundry. In this session, you’ll learn how you can leverage deep integration between Azure Databricks and Microsoft solutions to empower your organization to do more with your data estate. You’ll also get an exclusive sneak peek into the product roadmap.

Transforming Financial Intelligence with FactSet Structured and Unstructured Data and Delta Sharing

2025-06-10 Watch
talk
Kristen Clark (FactSet), Keon Shahab (Databricks)

Join us to explore the dynamic partnership between FactSet and Databricks, transforming data accessibility and insights. Discover the launch of FactSet’s Structured DataFeeds via Delta Sharing on the Databricks Marketplace, enhancing access to crucial financial data insights. Learn about the advantages of streamlined data delivery and how this integration empowers data ecosystems. Beyond structured data, explore the innovative potential of vectorized data sharing of unstructured content such as news, transcripts, and filings. Gain insights into the importance of seamless vectorized data delivery to support GenAI applications and how FactSet is preparing to simplify client GenAI workflows with AI-ready data. Experience a demo that showcases the complete journey from data delivery to actionable GenAI application responses in a real-world Financial Services scenario. See firsthand how FactSet is simplifying client GenAI workflows with AI-ready data that drives faster, more informed financial decisions.

Transforming HP’s Print ELT Reporting with GenIT: Real-Time Insights Tool Powered by Databricks AI

2025-06-10 Watch
talk
Weiwei Hu (HP)

Timely and actionable insights are critical for staying competitive in today’s fast-paced environment. At HP Print, manual reporting for executive leadership (ELT) has been labor-intensive, hindering agility and productivity. To address this, we developed the Generative Insights Tool (GenIT) using Databricks Genie and Mosaic AI to create a real-time insights engine automating SQL generation, data visualization, and narrative creation. GenIT delivers instant insights, enabling faster decisions, greater productivity, and improved consistency while empowering leaders to respond to printer trends. With automated querying, AI-powered narratives, and a chatbot, GenIT reduces inefficiencies and ensures quality data and insights. Our roadmap integrates multi-modal data, enhances chatbot functionality, and scales globally. This initiative shows how HP Print leverages GenAI to improve decision-making, efficiency, and agility, and we will showcase this transformation at the Databricks AI Summit.

Unify Your Data and Governance With Lakehouse Federation

2025-06-10 Watch
talk
Zeashan Pappa (Databricks), Fuat Can Efeoglu (Databricks)

In today's data landscape, organizations often grapple with fragmented data spread across various databases, data warehouses and catalogs. Lakehouse Federation addresses this challenge by enabling seamless discovery, querying, and governance of distributed data without the need for duplication or migration. This session will explore how Lakehouse Federation integrates external data sources like Hive Metastore, Snowflake, SQL Server and more into a unified interface, providing consistent access controls, lineage tracking and auditing across your entire data estate. Learn how to streamline analytics and AI workloads, enhance compliance and reduce operational complexity by leveraging a single, cohesive platform for all your data needs.
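The core idea — one query interface over many sources, with no copies — can be illustrated with two in-memory SQLite databases standing in for external systems. The catalog names and tables below are invented; real Lakehouse Federation pushes computation down to the remote engine and layers Unity Catalog governance on top:

```python
# Toy federation: a unified catalog name routes queries to separate sources
# without duplicating the data (SQLite stands in for Snowflake, SQL Server, etc.).
import sqlite3

sources = {
    "snowflake_sales": sqlite3.connect(":memory:"),
    "sqlserver_hr": sqlite3.connect(":memory:"),
}
sources["snowflake_sales"].execute("CREATE TABLE revenue (region TEXT, amt REAL)")
sources["snowflake_sales"].execute("INSERT INTO revenue VALUES ('east', 100.0)")
sources["sqlserver_hr"].execute("CREATE TABLE headcount (region TEXT, n INTEGER)")
sources["sqlserver_hr"].execute("INSERT INTO headcount VALUES ('east', 25)")

def federated_query(catalog: str, sql: str):
    """Route a query to whichever source is registered under the catalog name."""
    return sources[catalog].execute(sql).fetchall()

rev = federated_query("snowflake_sales",
                      "SELECT amt FROM revenue WHERE region = 'east'")
hc = federated_query("sqlserver_hr",
                     "SELECT n FROM headcount WHERE region = 'east'")
```

A single access-control and lineage layer over that routing table is, in miniature, what makes federation attractive for governance.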