talk-data.com

Event

Data + AI Summit 2025

2025-06-09 – 2025-06-13 · Databricks Summit

Activities tracked

509

Filtering by: Databricks

Sessions & talks

Showing 451–475 of 509 · Newest first

Best Practices for Building User-Facing AI Systems on Databricks

2025-06-10
talk
Jyotsna Bharadwaj (Databricks), Arthur Dooner (Databricks)

This session is repeated. Integrating AI agents into business systems requires tailored approaches for different maturity levels (crawl-walk-run) that balance scalability, accuracy and usability. This session addresses the critical challenge of making AI agents accessible to business users. We will explore four key integration methods:
- Databricks Apps: the fastest way to build and run applications that leverage your data, with the full security and governance of Databricks
- Genie: a tool enabling non-technical users to gain insights on structured data through natural language queries
- Chatbots: combining real-time data retrieval with generative AI for contextual responses and process automation
- Batch inference: scalable, asynchronous processing for large-scale AI tasks, optimizing efficiency and cost
We'll compare these approaches, discussing their strengths, challenges and ideal use cases to help businesses select the most suitable integration strategy for their specific needs.
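
Of these, batch inference is the most mechanical to reason about: records are grouped into fixed-size batches so per-request overhead is amortized across many predictions. A minimal sketch of that core pattern in plain Python (the scoring function is a stand-in for a model-serving call, not a Databricks API):

```python
def batched(items, size):
    """Yield successive fixed-size batches — the heart of batch inference:
    amortizing per-request overhead across many records."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def score_batch(batch):
    # Stand-in for a real model-serving call; here: length of each text.
    return [len(text) for text in batch]

texts = ["hello", "databricks", "ai"]
predictions = [p for batch in batched(texts, 2) for p in score_batch(batch)]
# predictions == [5, 10, 2]
```

In a real pipeline the batches would be dispatched asynchronously and the results written back to a table, but the batching logic itself is this simple.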

Breaking Silos: Enabling Databricks-Snowflake Interoperability With Iceberg and Unity Catalog

2025-06-10
talk
Mohit Kumar (T-Mobile), Geoffrey Freeman (T-Mobile)

As data ecosystems grow more complex, organizations often struggle with siloed platforms and fragmented governance. In this session, we’ll explore how our team made Databricks the central hub for cross-platform interoperability, enabling seamless Snowflake integration through Unity Catalog and the Iceberg REST API. We’ll cover:
- Why interoperability matters and the business drivers behind our approach
- How Unity Catalog and UniForm simplify interoperability, allowing Databricks to expose an Iceberg REST API for external consumption
- A technical deep dive into data sharing, query performance, and access control across Databricks and Snowflake
- Lessons learned and best practices for building a multi-engine architecture while maintaining governance and efficiency
By leveraging UniForm, Delta, and Iceberg, we created a flexible, vendor-agnostic architecture that bridges Databricks and Snowflake without compromising performance or security.
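
As a rough illustration of the UniForm side of this setup: enabling Iceberg metadata generation on an existing Delta table comes down to table properties (the table name here is hypothetical, and property names should be checked against current Databricks documentation):

```sql
-- Illustrative only: enable UniForm (Iceberg metadata) on a Delta table
-- so external engines can read it through Unity Catalog's Iceberg REST API.
ALTER TABLE main.sales.orders SET TBLPROPERTIES (
  'delta.enableIcebergCompatV2' = 'true',
  'delta.universalFormat.enabledFormats' = 'iceberg'
);
```

Once enabled, Unity Catalog can serve the table's Iceberg metadata to external clients such as Snowflake via its Iceberg REST endpoint.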

Building Responsible and Resilient AI: The Databricks AI Governance Framework

2025-06-10
talk
Abhi Arikapudi (Databricks), David Wells (Databricks)

GenAI & machine learning are reshaping industries, driving innovation and redefining business strategies. As organizations embrace these technologies, they face significant challenges in managing AI initiatives effectively, such as balancing innovation with ethical integrity, operational resilience and regulatory compliance. This presentation introduces the Databricks AI Governance Framework (DAGF), a practical framework designed to empower organizations to navigate the complexities of AI. It provides strategies for building scalable, responsible AI programs that deliver measurable value, foster innovation and achieve long-term success. By examining the framework's five foundational pillars — AI organization, ethics, legal and regulatory compliance, transparency and interpretability, AI operations and infrastructure and AI security — this session highlights how AI governance aligns programs with the organization's strategic goals, mitigates risks and builds trust across stakeholders.

Driving Databricks Platform With Revenue Intelligence ROI

2025-06-10
talk
Joel Fuernsinn (Veeam)

Demonstrating a real ROI is key to driving executive and stakeholder buy-in for major technology changes. At Veeam, we aligned our Databricks Platform change with projects to increase sales pipeline and improve customer retention. By delivering targeted improvements on those critical business metrics, we created positive ROI in short order while at the same time setting the foundation for long term Databricks Platform success. This session targets data and business leaders looking to understand how they can turn their infrastructure change into a business revenue driver.

Empowering Healthcare Insights: A Unified Lakehouse Approach With Databricks

2025-06-10
talk
Bianca Stratulat (BJSS), Mike Dobing (Databricks)

NHS England is revolutionizing healthcare research by enabling secure, seamless access to de-identified patient data through the Federated Data Platform (FDP). Despite vast data resources spread across regional and national systems, analysts struggle with fragmented, inconsistent datasets. Enter Databricks: powering a unified, virtual data lake with Unity Catalog at its core — integrating diverse NHS systems while ensuring compliance and security. By bridging AWS and Azure environments with a private exchange and leveraging the Iceberg connector to interface with Palantir, analysts gain scalable, reliable and governed access to vital healthcare data. This talk explores how this innovative architecture is driving actionable insights, accelerating research and ultimately improving patient outcomes.

How an Open, Scalable and Secure Data Platform is Powering Quick Commerce Swiggy's AI

2025-06-10
talk
Vasan Vembu Srini (Databricks), Akash Agarwal (Swiggy)

Swiggy, India's leading quick commerce platform, serves ~13 million users across 653 cities, with 196,000 restaurant partners and 17,000 SKUs. To handle this scale, Swiggy developed a secure, scalable AI platform processing millions of predictions per second. The tech stack includes Apache Kafka for real-time streaming, Apache Spark on Databricks for analytics and ML, and Apache Flink for stream processing. The Lakehouse architecture on Delta ensures data reliability, while Unity Catalog enables centralized access control and auditing. These technologies power critical AI applications like demand forecasting, route optimization, personalized recommendations, predictive delivery SLAs, and generative AI use cases. Key takeaway: This session explores building a data platform at scale, focusing on cost efficiency, simplicity, and speed, empowering Swiggy to seamlessly support millions of users and AI use cases.

How to Get the Most Out of Your BI Tools on Databricks

2025-06-10
talk
Kyle Hale (Databricks)

Unlock the full potential of your BI tools with Databricks. This session explores how features like Photon, Databricks SQL, Liquid Clustering, AI/BI Genie and Publish to Power BI enhance performance, scalability and user experience. Learn how Databricks accelerates query performance, optimizes data layouts and integrates seamlessly with BI tools. Gain actionable insights and best practices to improve analytics efficiency, reduce latency and drive better decision-making. Whether migrating from a data warehouse or optimizing an existing setup, this talk provides the strategies to elevate your BI capabilities.

Introduction to Databricks SQL

2025-06-10
talk
Himanshu Raja (Databricks), Pearl Ubaru (Databricks)

This session is repeated. If you are brand new to Databricks SQL and want a lightning tour of this intelligent data warehouse, this session is for you. Learn about the architecture of Databricks SQL. Then we’ll show how simple, streamlined interfaces make it easier for analysts, developers, admins and business users to get their jobs done and their questions answered. We’ll show how easy it is to create a warehouse, get data, transform it and build queries and dashboards. By the end of the session, you’ll be able to build a Databricks SQL warehouse in 5 minutes.

Leveraging Databricks Unity Catalog for Enhanced Data Governance in Unipol

2025-06-10
talk
Beniamino Del Pizzo (Unipol S.p.A.), Giovanni Cinquepalmi (Data Reply)

In the contemporary landscape of data management, organizations increasingly face challenges of data segregation, governance and permission management, particularly when operating within complex structures such as holding companies with multiple subsidiaries. Unipol comprises seven subsidiary companies, each with a diverse array of workgroups, resulting in a large number of operational groups in total. This intricate organizational structure necessitates a meticulous approach to data management, particularly regarding the segregation of data and the assignment of precise read-and-write permissions tailored to each workgroup. The challenge lies in ensuring that sensitive data remains protected while enabling seamless access for authorized users. This talk demonstrates how Unity Catalog emerges as a pivotal tool in the daily use of the data platform, offering a unified governance solution that supports data management across diverse AWS environments.
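
Unity Catalog expresses this kind of per-workgroup segregation as ordinary SQL grants. A minimal sketch with hypothetical catalog and group names, assuming one catalog per subsidiary:

```sql
-- Hypothetical names: one catalog per subsidiary, one group per workgroup.
GRANT USE CATALOG ON CATALOG subsidiary_a TO `subsidiary_a_analysts`;
GRANT USE SCHEMA ON SCHEMA subsidiary_a.claims TO `subsidiary_a_analysts`;
GRANT SELECT ON SCHEMA subsidiary_a.claims TO `subsidiary_a_analysts`;
GRANT MODIFY ON SCHEMA subsidiary_a.claims TO `subsidiary_a_engineers`;
```

Because privileges cascade from catalog to schema to table, catalog-per-subsidiary boundaries keep read and write access cleanly segregated by default.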

Marketing Data + AI Leaders Forum

2025-06-10
talk
Dan Morris (Databricks), Calen Holbrooks (Airtable), Elizabeth Dobbs (Databricks), David Geisinger (Deloitte), Kristen Brophy (ThredUp), Joyce Hwang (Dropbox), Zeynep Inanoglu Ozdemir (Atlassian Pty Ltd.), Bryan Saftler (Databricks), Alex Dean (Snowplow), Derek Slager (Amperity), Rick Schultz (Databricks), Bryce Peake (Domino's), Julie Foley Long (Grammarly)

Join us Tuesday, June 10, 9:10 AM–12:10 PM PT. Hosted by Databricks CMO Rick Schultz, this forum features executives and speakers from PetSmart, Valentino, Domino’s, Airtable, Dropbox, ThredUp, Grammarly, Deloitte, and more. Come for actionable strategies and real-world examples: hear from marketing experts on how to build data- and AI-driven marketing organizations, and learn how Databricks Marketing supercharges impact using the Data Intelligence Platform by scaling personalization, building more efficient campaigns, and empowering marketers to self-serve insights.

Responsible AI at Scale: Balancing Democratization and Regulation in the Financial Sector

2025-06-10
talk
Aman Thind (State Street)

We partnered with Databricks to pioneer a new standard in the financial sector's enterprise AI, balancing rapid AI democratization with strict regulatory and security requirements. At the core is our Responsible AI Gateway, enforcing jailbreak prevention and compliance on every LLM query. Real-time observability, powered by Databricks, calculates risk and accuracy metrics, detecting issues before escalation. Leveraging Databricks' model hosting ensures scalable LLM access, fortifying security and efficiency. We built frameworks to democratize AI without compromising guardrails. Operating in a regulated environment, we showcase how Databricks enables democratization and responsible AI at scale, offering best practices for financial organizations to harness AI safely and efficiently.

Sponsored by: dbt Labs | Empowering the Enterprise for the Next Era of AI and BI

2025-06-10
talk
Elias DeFaria (dbt Labs)

The next era of data transformation has arrived. AI is enhancing developer workflows, enabling downstream teams to collaborate effectively through governed self-service. Additionally, SQL comprehension is producing detailed metadata that boosts developer efficiency while ensuring data quality and cost optimization. Experience this firsthand with dbt’s data control plane, a centralized platform that provides organizations with repeatable, scalable, and governed methods to succeed with Databricks in the modern age.

Using Identity Security With Unity Catalog for Faster, Safer Data Access

2025-06-10
talk
Siddharth Bhai (Databricks), Kelly Albano (Databricks)

Managing authentication effectively is key to securing your data platform. In this session, we’ll explore best practices from Databricks for overcoming authentication challenges, including token visibility, MFA/SSO, CI/CD token federation and risk containment. Discover how to map your authentication maturity journey while maximizing security ROI. We'll showcase new capabilities like access token reports for improved visibility, streamlined MFA implementation and secure SSO with token federation. Learn strategies to minimize token risk through TTL limits, scoped tokens and network policies. You'll walk away with actionable insights to enhance your authentication practices and strengthen platform security on Databricks.

Startup Forum

2025-06-10
talk
Dan Tobin (Databricks), Guy Fighel (Hetz Ventures), Steve Sobel (Databricks), Andrew Ferguson (Databricks), Sri Tikkireddy (Databricks), Aaron Jacobson (NEA), George Webster (Zigguratum Inc), Nima Alidoust (Tahoe Therapeutics), Sarah Catanzaro (Amplify Partners), Atindriyo Sanyal (Galileo)

Hear from VC leaders, startup founders and early-stage customers building on Databricks about what they are seeing in the market and how they are scaling their early-stage companies on Databricks. This event is a must-see for VCs, founders and anyone interested in the early-stage company ecosystem.

Accelerating Analytics: Integrating BI and Partner Tools to Databricks SQL

2025-06-10
talk
Fuat Can Efeoglu (Databricks), Toussaint Webb (Databricks)

This session is repeated. Did you know that you can integrate with your favorite BI tools directly from Databricks SQL? You don’t even need to stand up an additional warehouse. This session shows the integrations with Microsoft Power Platform, Power BI, Tableau and dbt so you can have a seamless integration experience. Directly connect your Databricks workspace with Fabric and Power BI workspaces or Tableau to publish and sync data models, with defined primary and foreign keys, between the two platforms.

AI-Powered Marketing Data Management: Solving the Dirty Data Problem with Databricks

2025-06-10
talk
Steven Kostrzewski (Acxiom), Ankur Jain (Acxiom)

Marketing teams struggle with ‘dirty data’ — incomplete, inconsistent, and inaccurate information that limits campaign effectiveness and reduces the accuracy of AI agents. Our AI-powered marketing data management platform, built on Databricks, solves this with anomaly detection, ML-driven transformations and the built-in Acxiom Referential Real ID Graph with Data Hygiene. We’ll showcase how Delta Lake, Unity Catalog and Lakeflow Declarative Pipelines power our multi-tenant architecture, enabling secure governance and 75% faster data processing. Our privacy-first design ensures compliance with GDPR, CCPA and HIPAA through role-based access, encryption key management and fine-grained data controls. Join us for a live demo and Q&A, where we’ll share real-world results and lessons learned in building a scalable, AI-driven marketing data solution with Databricks.

Boosting Data Science and AI Productivity With Databricks Notebooks

2025-06-10
talk
Vijay Raghavan (Thumbtack), Jason Cui (Databricks)

This session is repeated. Want to accelerate your team's data science workflow? This session reveals how Databricks Notebooks can transform your productivity through an optimized environment designed specifically for data science and AI work. Discover how notebooks serve as a central collaboration hub where code, visualizations, documentation and results coexist seamlessly, enabling faster iteration and development. Key takeaways:
- Leveraging interactive coding features including multi-language support, command-mode shortcuts and magic commands
- Implementing version control best practices through Git integration and notebook revision history
- Maximizing collaboration through commenting, sharing and real-time co-editing capabilities
- Streamlining ML workflows with built-in MLflow tracking and experiment management
You'll leave with practical techniques to enhance your notebook-based workflow and deliver AI projects faster with higher-quality results.

CI/CD for Databricks: Advanced Asset Bundles and GitHub Actions

2025-06-10
talk
Dustin Vannoy (Databricks)

This session is repeated. Databricks Asset Bundles (DABs) provide a way to use the command line to deploy and run a set of Databricks assets — like notebooks, Python code, Lakeflow Declarative Pipelines and workflows. To automate deployments, you create a deployment pipeline that uses the power of DABs along with other validation steps to ensure high-quality deployments. In this session you will learn how to automate CI/CD processes for Databricks while following best practices to keep deployments easy to scale and maintain. After a brief explanation of why Databricks Asset Bundles are a good option for CI/CD, we will walk through a working project including advanced variables, target-specific overrides, linting, integration testing and automatic deployment upon code review approval. You will leave the session clear on how to build your first GitHub Action using DABs.
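
To give a feel for the shape of a bundle, here is a minimal illustrative `databricks.yml` with target-specific overrides (the project name, job, paths and hosts are placeholders; consult the DABs documentation for the full schema):

```yaml
# databricks.yml -- minimal illustrative bundle; names and hosts are placeholders
bundle:
  name: my_pipeline

resources:
  jobs:
    nightly_job:
      name: nightly-etl
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/etl.py

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://example-dev.cloud.databricks.com
  prod:
    mode: production
    workspace:
      host: https://example-prod.cloud.databricks.com
```

A CI workflow then typically runs `databricks bundle validate` on pull requests and `databricks bundle deploy -t prod` after review approval.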

Databricks, the Good, the Bad and the Ugly

2025-06-10
talk
Holly Smith (Databricks)

Databricks is the bestest platform ever where everything is perfect and nothing else could ever make it any better, right? …right? You and I both know this is not true. Don’t get me wrong, there are features that I absolutely love, but there are also some that require powering through the papercuts. And then there are those that I pretend don’t exist. I’ll be opening up to give my honest take on three of each category, why I do (or don’t) like them, and then telling you which talks to attend to find out more.

Databricks Without Disruption: A Deep Dive on Catalog Federation with Hive Metastore, Glue, and Snowflake

2025-06-10
talk
John Spencer (Databricks), Milos Stojanovic (Databricks)

You shouldn’t have to sacrifice data governance just to leverage the tools your business needs. In this session, we will give practical tips on how you can cut through the data sprawl and get a unified view of your data estate in Unity Catalog without disrupting existing workloads. We will walk through how to set up federation with Glue, Hive Metastore, and other catalogs like Snowflake, and show you how powerful new tools help you adopt Databricks at your own pace with no downtime and full interoperability.
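
Conceptually, federation comes down to two statements: a connection that holds credentials, then a foreign catalog mounted over it. A hedged sketch with placeholder names and a Snowflake source (option keys differ by source type; check the current Lakehouse Federation documentation):

```sql
-- Placeholder names and credentials; option keys vary by connection type.
CREATE CONNECTION snowflake_conn TYPE snowflake
OPTIONS (
  host 'myaccount.snowflakecomputing.com',
  port '443',
  sfWarehouse 'COMPUTE_WH',
  user 'svc_databricks',
  password secret('federation_scope', 'snowflake_password')
);

CREATE FOREIGN CATALOG snowflake_sales
USING CONNECTION snowflake_conn
OPTIONS (database 'SALES');
```

After this, Snowflake tables appear under `snowflake_sales` in Unity Catalog and can be queried and governed alongside native tables, with no data movement.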

Data Management and Governance With UC

2025-06-10
talk

In this course, you'll learn concepts and perform labs that showcase workflows using Unity Catalog, Databricks' unified and open governance solution for data and AI. We'll start off with a brief introduction to Unity Catalog, discuss fundamental data governance concepts, and then dive into a variety of topics including using Unity Catalog for data access control, managing external storage and tables, data segregation, and more.
Pre-requisites: Beginner familiarity with the Databricks Data Intelligence Platform (selecting clusters, navigating the Workspace, executing notebooks); cloud computing concepts (virtual machines, object storage, etc.); production experience working with data warehouses and data lakes; intermediate experience with basic SQL concepts (select, filter, group by, join, etc.); beginner programming experience with Python (syntax, conditions, loops, functions); beginner programming experience with the Spark DataFrame API (configuring DataFrameReader and DataFrameWriter to read and write data, expressing query transformations using DataFrame methods and Column expressions, etc.)
Labs: Yes
Certification Path: Databricks Certified Data Engineer Associate

Deploy Workloads with Lakeflow Jobs (previously Databricks Workflows)

2025-06-10
talk

In this course, you’ll learn how to orchestrate data pipelines with Lakeflow Jobs (previously Databricks Workflows) and schedule dashboard updates to keep analytics up-to-date. We’ll cover topics like getting started with Lakeflow Jobs, how to use Databricks SQL for on-demand queries, and how to configure and schedule dashboards and alerts to reflect updates to production data pipelines.
Pre-requisites: Beginner familiarity with the Databricks Data Intelligence Platform (selecting clusters, navigating the Workspace, executing notebooks); cloud computing concepts (virtual machines, object storage, etc.); production experience working with data warehouses and data lakes; intermediate experience with basic SQL concepts (select, filter, group by, join, etc.); beginner programming experience with Python (syntax, conditions, loops, functions); beginner programming experience with the Spark DataFrame API (configuring DataFrameReader and DataFrameWriter to read and write data, expressing query transformations using DataFrame methods and Column expressions, etc.)
Labs: No
Certification Path: Databricks Certified Data Engineer Associate

Easy Ways to Optimize Your Databricks Costs

2025-06-10
talk
Youssef Mrini (Databricks), Yassine Essawabi (Databricks)

In this session, we will explore effective strategies for optimizing costs on the Databricks platform, a leading solution for handling large-scale data workloads. Databricks, known for its open and unified approach, offers several tools and methodologies to ensure users can maximize their return on investment (ROI) while managing expenses efficiently. Key points:
- Understanding usage with AI/BI tools
- Organizing costs with tagging
- Setting up budgets
- Leveraging system tables
By the end of this session, you will have a comprehensive understanding of how to leverage Databricks' built-in tools for cost optimization, ensuring that your data and AI projects not only deliver value but do so in a cost-effective manner. This session is ideal for data engineers, financial analysts, and decision-makers looking to enhance their organization’s efficiency and financial performance through strategic cost management on Databricks.
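
On the system-tables point, a typical starting query aggregates DBU usage by workspace, SKU and cost tag from `system.billing.usage` (a sketch; exact column availability varies by release, and the `cost_center` tag key is a placeholder):

```sql
-- Illustrative: last 30 days of DBU usage by workspace, SKU and cost-center tag.
SELECT
  workspace_id,
  sku_name,
  custom_tags['cost_center'] AS cost_center,
  SUM(usage_quantity) AS dbus
FROM system.billing.usage
WHERE usage_date >= current_date() - INTERVAL 30 DAYS
GROUP BY ALL
ORDER BY dbus DESC;
```

Tagging clusters and jobs consistently is what makes the `custom_tags` breakdown meaningful, which is why tagging and system tables are usually adopted together.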

Elevating Data Quality Standards With Databricks DQX

2025-06-10
talk
Marcin Wojtyczka (Databricks), Neha Milak (Databricks)

Join us for an introductory session on Databricks DQX, a Python-based framework designed to validate the quality of PySpark DataFrames. Discover how DQX can empower you to proactively tackle data quality challenges, enhance pipeline reliability and make more informed business decisions with confidence. Traditional data quality tools often fall short by providing limited, actionable insights, relying heavily on post-factum monitoring, and being restricted to batch processing. DQX overcomes these limitations by enabling real-time quality checks at the point of data entry, supporting both batch and streaming data validation and delivering granular insights at the row and column level. If you’re seeking a simple yet powerful data quality framework that integrates seamlessly with Databricks, this session is for you.
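
DQX itself exposes a richer, PySpark-native API, but its core idea — declarative, row-level checks that annotate bad records rather than silently dropping them — can be sketched in plain Python (a conceptual illustration only, not the DQX API):

```python
# Conceptual sketch of row-level data quality checks (not the DQX API):
# each rule names a column and a predicate; failing rows are annotated,
# not dropped, so downstream consumers can quarantine or repair them.

def apply_checks(rows, rules):
    """Return rows with an added '_errors' list naming failed rules."""
    checked = []
    for row in rows:
        errors = [name for name, col, pred in rules
                  if not pred(row.get(col))]
        checked.append({**row, "_errors": errors})
    return checked

rules = [
    ("age_non_negative", "age", lambda v: v is not None and v >= 0),
    ("country_not_null", "country", lambda v: v is not None),
]

rows = [
    {"age": 34, "country": "IT"},
    {"age": -1, "country": None},
]

result = apply_checks(rows, rules)
# result[0]["_errors"] == []
# result[1]["_errors"] == ["age_non_negative", "country_not_null"]
```

The framework applies the same pattern at DataFrame scale, for both batch and streaming sources, with results available per row and per column.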

Gen AI Evaluation and Governance

2025-06-10
talk

This course introduces learners to evaluating and governing GenAI (generative artificial intelligence) systems. First, learners will explore the meaning behind and motivation for building evaluation and governance/security systems. Next, the course will connect evaluation and governance systems to the Databricks Data Intelligence Platform. Third, learners will be introduced to a variety of evaluation techniques for specific components and types of applications. Finally, the course will conclude with an analysis of evaluating entire AI systems with respect to performance and cost.
Pre-requisites: Familiarity with prompt engineering and experience with the Databricks Data Intelligence Platform, plus knowledge of retrieval-augmented generation (RAG) techniques including data preparation, embeddings, vectors, and vector databases
Labs: Yes
Certification Path: Databricks Certified Generative AI Engineer Associate