talk-data.com

Topic: GenAI (Generative AI)
Tags: ai, machine_learning, llm
1517 tagged activities

Activity Trend: peak of 192 activities per quarter, 2020-Q1 to 2026-Q1

Activities

1517 activities · Newest first

AI Meets SQL: Leverage GenAI at Scale to Enrich Your Data

This session is repeated. Integrating AI into existing data workflows can be challenging, often requiring specialized knowledge and complex infrastructure. In this session, we'll share how SQL users can access large language models (LLMs) and traditional machine learning directly from within SQL, simplifying the process of incorporating AI into data workflows. We will demonstrate how to use Databricks SQL for natural language processing, traditional machine learning, retrieval-augmented generation and more. You'll learn about best practices and see examples of solving common use cases such as opinion mining, sentiment analysis and forecasting, along with other AI/ML tasks.
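For a concrete picture of the pattern this session describes, here is a minimal sketch of calling AI Functions from SQL in a Databricks notebook, assuming a `spark` session is available; the table name and model endpoint are illustrative placeholders, not from the talk.

```python
# Hypothetical example: enriching rows with LLM output directly from SQL.
reviews = spark.sql("""
    SELECT
      review_text,
      ai_analyze_sentiment(review_text) AS sentiment,
      ai_query(
        'databricks-meta-llama-3-3-70b-instruct',
        CONCAT('Summarize the main complaint in one sentence: ', review_text)
      ) AS complaint_summary
    FROM main.demo.customer_reviews
""")
reviews.show(truncate=False)
```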

Cross-Region AI Model Deployment for Resiliency and Compliance

AI for enterprises, particularly in the era of GenAI, requires rapid experimentation and the ability to productionize models and agents quickly and at scale. Compliance, resilience and commercial flexibility drive the need to serve models across regions. As cloud providers struggle with rising demand for GPUs, VM shortages have become commonplace, compounding the pressure of general cloud outages. Enterprises that can quickly leverage GPU capacity in other cloud regions will be better equipped to capitalize on the promise of AI, while staying flexible enough to serve distinct user bases and comply with regulations. In this presentation we will show and discuss how to implement AI deployments across cloud regions, deploying a model in multiple regions and using a load balancer to determine where best to route each user request.
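As a rough illustration of the routing idea (not the presenters' implementation), the sketch below fails over between two regional Databricks model serving endpoints from the client side; the hostnames, tokens and endpoint name are hypothetical placeholders.

```python
import requests

# Hypothetical regional deployments of the same serving endpoint.
REGIONS = [
    {"host": "https://adb-111.azuredatabricks.net", "token": "<token-east>"},
    {"host": "https://adb-222.azuredatabricks.net", "token": "<token-west>"},
]
ENDPOINT = "genai-agent"  # same model served in each region

def score(payload: dict) -> dict:
    """Try the preferred region first; fail over to the next on any error."""
    last_error = None
    for region in REGIONS:
        try:
            resp = requests.post(
                f"{region['host']}/serving-endpoints/{ENDPOINT}/invocations",
                headers={"Authorization": f"Bearer {region['token']}"},
                json=payload,
                timeout=10,
            )
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException as err:
            last_error = err  # region unavailable or out of capacity
    raise RuntimeError("All regions failed") from last_error
```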

Databricks on Databricks: Powering Marketing Insights with Lakehouse

This presentation outlines the evolution of our marketing data strategy, focusing on how we’ve built a strong foundation using the Databricks Lakehouse. We will explore key advancements across data ingestion, strategy, and insights, highlighting the transition from legacy systems to a more scalable and intelligent infrastructure. Through real-world applications, we will showcase how unified Customer 360 insights drive personalization, predictive analytics enhance campaign effectiveness, and GenAI optimizes content creation and marketing execution. Looking ahead, we will demonstrate the next phase of our CDP, the shift toward an end-user-first analytics model powered by AIBI, Genie and Matik, and the growing importance of clean rooms for secure data collaboration. This is just the beginning, and we are poised to unlock even greater capabilities in the future.

RecSys, Topic Modeling and Agents: Bridging the GenAI-Traditional ML Divide

The rise of GenAI has led to a complete reinvention of how we conceptualize Data + AI. In this breakout, we will recontextualize the rise of GenAI in traditional ML paradigms, and hopefully unite the pre- and post-LLM eras. We will demonstrate when and where GenAI may prove more effective than traditional ML algorithms, and highlight problems for which the wheel is unnecessarily being reinvented with GenAI. This session will also highlight how MLflow provides a unified means of benchmarking traditional ML against GenAI, and lay out a vision for bridging the divide between Traditional ML and GenAI practitioners.
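As a hedged illustration of the MLflow point above, mlflow.evaluate() can score pre-computed predictions for both a traditional classifier and an LLM task under one API; the toy data and run names below are ours, not the speakers'.

```python
import mlflow
import pandas as pd

# Traditional ML: score a classifier's predictions against labels.
clf_df = pd.DataFrame({"prediction": [0, 1, 1, 0], "label": [0, 1, 0, 0]})
with mlflow.start_run(run_name="traditional_ml"):
    clf_results = mlflow.evaluate(
        data=clf_df, predictions="prediction", targets="label",
        model_type="classifier",
    )

# GenAI: score LLM answers against references with the same harness.
qa_df = pd.DataFrame({
    "inputs": ["What drives churn?"],
    "answer": ["Mostly pricing changes."],
    "ground_truth": ["Pricing changes."],
})
with mlflow.start_run(run_name="genai_baseline"):
    qa_results = mlflow.evaluate(
        data=qa_df, predictions="answer", targets="ground_truth",
        model_type="question-answering",
    )

print(clf_results.metrics, qa_results.metrics)
```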

Revolutionizing Nuclear AI With HiVE and Bertha on Databricks Architecture

In this session we will explore the revolutionary advancements in nuclear AI capabilities with HiVE and Bertha on Databricks architecture. HiVE, developed by Westinghouse, leverages over a century of proprietary data to deliver unparalleled AI capabilities. At its core is Bertha, a generative AI model designed to tackle the unique challenges of the nuclear industry. This session will delve into the technical architecture of HiVE and Bertha, showcasing how Databricks' scalable environment enhances their performance. We will discuss the secure data infrastructure supporting HiVE, ensuring data integrity and compliance. Real-world applications and use cases will demonstrate the impact of HiVE and Bertha on improving efficiency, innovation and safety in nuclear operations. Discover how the fusion of HiVE and Bertha with Databricks architecture is transforming the nuclear AI landscape and driving the future of nuclear technology.

Shifting Left — Setting up Your GenAI Ecosystem to Work for Business Analysts

At Data + AI Summit in 2022, Databricks pioneered the term "shift left" to describe how AI workloads would enable people with less data science expertise to create their own apps. In 2025, we take a look at how Experian is doing on that journey. This session highlights the Databricks services that support the shift-left paradigm for Generative AI, including how AI/BI Genie helps with generative analytics and how Agent Studio helps with synthetic generation of test cases to validate model performance.

Curious to know how Adidas is transforming customer experience and business impact with agentic workflows, powered by Databricks? By leveraging cutting-edge tools like MosaicML’s deployment capabilities, Mosaic AI Gateway and MLflow, Adidas built a scalable GenAI agentic infrastructure that delivers actionable insights from a growing volume of 2 million product reviews annually. The results have been remarkable:
- 60% latency reduction (15.5 seconds to 6 seconds)
- 91.67% cost savings (transitioning to more efficient LLMs)
- 98.5% token efficiency, reducing input tokens from 200k to just 3k
- 20% increase in productivity (faster time to insight)
Empowering over 500 decision-makers across 150+ countries, this infrastructure is set to optimize products and services for Adidas’ 500 million members by 2025 while supporting dozens of upcoming AI-driven solutions. Join us to explore how Adidas turned agentic workflow infrastructure into a strategic advantage using Databricks and learn how you can do the same!

Manufacturing and Transportation Industry Forum | Sponsored by: Deloitte and AWS
Talk by Victor Dsouza (Applied Materials), Richard Masters (Virgin Atlantic Airways), Andy Isenman (Heathrow), Dr. Andrej Levin (Boston Consulting Group), Shiv Trisal (Databricks), Caitlin Gordon (Databricks)

Join us for an inspiring forum showcasing how manufacturers and transportation leaders are turning today's challenges into tomorrow's opportunities. From automotive giants revolutionizing product development with generative AI to logistics providers optimizing routes for both cost and sustainability, discover how industry pioneers are reshaping the future of industrial operations. Highlighting this session is an exciting collaboration between Heathrow Airport and Virgin Atlantic, demonstrating how partnership and innovation are transforming the air travel experience. Learn how these leaders and other companies are using Databricks to tackle their most pressing challenges — from smart factory transformations to autonomous systems development — proving that the path to profitability and sustainability runs through intelligent operations.

Public Sector Industry Forum | Sponsored by: Deloitte and AWS

Join the 60-minute kickoff session at the Public Sector Forum for an opportunity to accelerate innovation in your enterprise through governance, compliance and GenAI. Featuring keynotes from data-driven agency leaders and a forward-looking journey from Databricks, this event offers invaluable insights. Understand the outcomes of Data and AI powering transformation across common areas of government and beyond:
- Improving constituent experience
- Reducing cost and enhancing services
- Identifying fraud, waste and abuse
- Achieving scale and security
You will not want to miss this exclusive opportunity to own your data and eliminate government silos. Discover the Data + AI Company with deep compliance experience and widespread adoption.

How Anthropic Transforms Financial Services Teams With GenAI

Learn how GenAI is being applied to financial services teams using Claude, an acknowledged leader in large language models. We will share how Claude, integrated with the scale and security of the Databricks Data Intelligence Platform, is enabling financial services organizations to streamline operations, maximize productivity for investment and compliance teams and, in some cases, turn traditional cost centers into revenue drivers.

GenAI Observability in Customer Care

Customer support is going through the GenAI revolution, but how can we use AI to foster deeper empathy with our end users? To enable this, Earnin has built its GenAI observability platform on Databricks, leveraging Lakeflow Declarative Pipelines, Kafka and Databricks AI/BI. This session covers how we use Lakeflow Declarative Pipelines to monitor our customer care chatbot in near real-time and how we leverage Databricks to better anticipate our customers' needs.
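To make the pipeline shape concrete, here is a minimal sketch of a Lakeflow Declarative Pipelines (formerly Delta Live Tables) job that ingests chatbot events from Kafka and aggregates observability metrics; the broker, topic and JSON fields are assumptions for illustration, not Earnin's actual schema.

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw chatbot conversation events streamed from Kafka")
def chatbot_events_raw():
    return (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
        .option("subscribe", "customer-care-chatbot")       # placeholder topic
        .load()
        .select("timestamp", F.col("value").cast("string").alias("payload"))
    )

@dlt.table(comment="Per-minute latency and escalation metrics for AI/BI dashboards")
def chatbot_quality_metrics():
    parsed = dlt.read_stream("chatbot_events_raw").select(
        "timestamp",
        F.get_json_object("payload", "$.latency_ms").cast("double").alias("latency_ms"),
        F.get_json_object("payload", "$.escalated").cast("boolean").alias("escalated"),
    )
    return (
        parsed.withWatermark("timestamp", "10 minutes")
        .groupBy(F.window("timestamp", "1 minute"))
        .agg(
            F.avg("latency_ms").alias("avg_latency_ms"),
            F.sum(F.col("escalated").cast("int")).alias("escalations"),
        )
    )
```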

Managing Data and AI Security Risks With DASF 2.0 — and a Customer Story

Since its first release, the Databricks Security team has led a broad working group that evolved the Databricks AI Security Framework (DASF) to version 2.0, collaborating closely with top cybersecurity researchers at industry organizations such as OWASP, Gartner, NIST, HITRUST and the FAIR Institute, as well as several Fortune 100 companies, to address the evolving risks and associated controls of AI systems in enterprises. Join us to learn how the CLEVER GenAI pipeline, an AI-driven innovation in healthcare, processes over 1.5 million clinical notes daily to classify social determinants impacting veteran care while adhering to robust security measures such as NIST 800-53 controls and leveraging the Databricks AI Security Framework. We will discuss robust AI security guidelines to help data and AI teams understand how to deploy their AI applications securely. This session will provide a security framework for security teams, AI practitioners, data engineers and governance teams.

Sponsored by: Accenture & Avanade | Enterprise Scaling and Value of Generative AI and Agentic AI

In this talk, we will explore the transformative potential of Generative AI and Agentic AI in driving enterprise-scale innovation and delivering substantial business value. As organizations increasingly recognize the power of AI to move beyond automation towards true augmentation and intelligent decision-making, understanding the nuances of scaling these advanced AI paradigms becomes critical. We will delve into practical strategies for deploying, managing, and optimizing Agentic AI frameworks, showcasing how autonomous, goal-directed AI systems can unlock new efficiencies, enhance customer experiences, and foster continuous innovation. Through real-world case studies and actionable insights, attendees will gain a comprehensive understanding of the key considerations to architect, implement, and measure the ROI of large-scale Generative and Agentic AI initiatives, positioning their enterprises for sustained growth and competitive advantage in the AI-first era.

Sponsored by: AWS | Deploying a GenAI Agent using Databricks Mosaic AI, Anthropic, LangGraph, and Amazon Bedrock

In this session, you’ll see how to build and deploy a GenAI agent using the Model Context Protocol (MCP) with Databricks, Anthropic, Mosaic External AI Gateway, and Amazon Bedrock. You will learn the architecture and best practices for using Databricks Mosaic AI, Anthropic’s Claude Sonnet 3.7 first-party frontier model, and LangGraph for custom workflow orchestration in the Databricks Data Intelligence Platform. You’ll also see how to use Databricks Mosaic AI for agent evaluation and monitoring, and how an Amazon Bedrock inline agent uses MCP to provide tools and other resources with Amazon Nova models for deep research. This approach gives you the flexibility of LangGraph, the powerful managed agents offered by Amazon Bedrock, and Databricks Mosaic AI’s operational support for evaluation and monitoring.
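As a rough sketch of one piece of this stack (not the presenters' exact code), the example below wires a Claude model served through Databricks Model Serving into a LangGraph ReAct agent; it assumes the databricks-langchain and langgraph packages are installed, and the endpoint name and tool are illustrative.

```python
from databricks_langchain import ChatDatabricks
from langchain_core.tools import tool
from langgraph.prebuilt import create_react_agent

@tool
def lookup_order_status(order_id: str) -> str:
    """Return the status of an order (stubbed for this sketch)."""
    return f"Order {order_id} is in transit."

# Claude served via a Databricks model serving endpoint (name is illustrative).
llm = ChatDatabricks(endpoint="databricks-claude-3-7-sonnet")
agent = create_react_agent(llm, tools=[lookup_order_status])

result = agent.invoke({"messages": [("user", "Where is order 42?")]})
print(result["messages"][-1].content)
```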

This course introduces learners to deploying, operationalizing, and monitoring generative artificial intelligence (AI) applications. First, learners will develop knowledge and skills in deploying generative AI applications using tools like Model Serving. Next, the course will discuss operationalizing generative AI applications following modern LLMOps best practices and recommended architectures. Finally, learners will be introduced to monitoring generative AI applications and their components using Lakehouse Monitoring.

Prerequisites: Familiarity with prompt engineering and retrieval-augmented generation (RAG) techniques, including data preparation, embeddings, vectors, and vector databases; foundational knowledge of Databricks Data Intelligence Platform tools for evaluation and governance (particularly Unity Catalog).
Labs: Yes
Certification Path: Databricks Certified Generative AI Engineer Associate
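For orientation only, here is a hedged sketch of the kind of Model Serving deployment step the course covers, using the Databricks SDK for Python; the endpoint name, Unity Catalog model path and sizing are placeholders.

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import EndpointCoreConfigInput, ServedEntityInput

w = WorkspaceClient()  # reads host/token from the environment or a CLI profile

# Serve version 1 of a Unity Catalog-registered model behind a REST endpoint.
w.serving_endpoints.create(
    name="rag-chatbot",                            # placeholder endpoint name
    config=EndpointCoreConfigInput(
        served_entities=[
            ServedEntityInput(
                entity_name="main.genai.rag_chatbot",  # placeholder UC model
                entity_version="1",
                workload_size="Small",
                scale_to_zero_enabled=True,
            )
        ]
    ),
)
```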

ReguBIM AI – Transforming BIM, Engineering, and Code Compliance with Generative AI

At Exyte, we design, engineer, and deliver ultra-clean and sustainable facilities for high-tech industries. One of the most complex tasks our engineers and designers face is ensuring that their building designs comply with constantly evolving codes and regulations – often a manual, error-prone process. To address this, we developed ReguBIM AI, a generative AI-powered assistant that helps our teams verify code compliance more efficiently and accurately by linking 3D Building Information Modeling (BIM) data with regulatory documents. Built on the Databricks Data Intelligence Platform, ReguBIM AI is part of our broader vision to apply AI meaningfully across engineering and design processes. We are proud to share that ReguBIM AI won the Grand Prize and EMEA Winner titles at the Databricks GenAI World Cup 2024 — a global hackathon that challenged over 1,500 data scientists and AI engineers from 18 countries to create innovative generative AI solutions for real-world problems.

Scaling GenAI Inference From Prototype to Production: Real-World Lessons in Speed & Cost

This lightning talk dives into real-world GenAI projects that scaled from prototype to production using Databricks’ fully managed tools. Facing cost and time constraints, we leveraged four key Databricks features—Workflows, Model Serving, Serverless Compute, and Notebooks—to build an AI inference pipeline processing millions of documents (text and audiobooks). This approach enables rapid experimentation, easy tuning of GenAI prompts and compute settings, seamless data iteration and efficient quality testing—allowing Data Scientists and Engineers to collaborate effectively. Learn how to design modular, parameterized notebooks that run concurrently, manage dependencies and accelerate AI-driven insights. Whether you're optimizing AI inference, automating complex data workflows or architecting next-gen serverless AI systems, this session delivers actionable strategies to maximize performance while keeping costs low.
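One way to picture the "modular, parameterized notebooks that run concurrently" pattern is a driver notebook fanning out batches with dbutils.notebook.run; this is a sketch under our own assumptions (the notebook path, parameters and batch scheme are hypothetical), not the speakers' pipeline.

```python
from concurrent.futures import ThreadPoolExecutor

BATCHES = [f"batch_{i:03d}" for i in range(8)]  # hypothetical document batches

def run_batch(batch_id: str) -> str:
    # Each child run receives its own prompt/compute parameters via widgets.
    return dbutils.notebook.run(
        "/Repos/genai/inference_pipeline",        # hypothetical notebook path
        3600,                                     # timeout in seconds
        {"batch_id": batch_id, "prompt_version": "v3"},
    )

# Run a few batches concurrently from the driver notebook.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_batch, BATCHES))
```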

Sponsored by: Infosys | Agentic AI Governance: Shaping a Responsible Future

Agentic AI represents a quantum leap beyond generative AI—enabling systems to make autonomous decisions and act independently. While this unlocks transformative potential, it also brings complex governance challenges. This session explores novel risks, practical strategies and proven Data & AI governance frameworks for governing agentic AI at scale.

Cracking Complex Documents with Databricks Mosaic AI

In this session, we will share how we are transforming the way organizations process unstructured and non-standard documents using Mosaic AI and agentic patterns within the Databricks ecosystem. We have developed a scalable pipeline that turns complex legal and regulatory content into structured, tabular data. We will walk through the full architecture, which includes Unity Catalog for secure and governed data access, Databricks Vector Search for intelligent indexing and retrieval, and Databricks Apps to deliver clear insights to business users. The solution supports multiple languages and formats, making it suitable for teams working across different regions. We will also discuss some of the key technical challenges we addressed, including handling parsing inconsistencies, grounding model responses and ensuring traceability across the entire process. If you are exploring how to apply GenAI and large language models, this session is for you. Audio for this session is delivered in the conference mobile app; you must bring your own headphones to listen.
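To give a feel for the retrieval layer mentioned above, here is a minimal, hedged sketch of querying a Databricks Vector Search index from Python; the endpoint, index and column names are hypothetical, not the presenters'.

```python
from databricks.vector_search.client import VectorSearchClient

vsc = VectorSearchClient()  # uses workspace credentials from the environment
index = vsc.get_index(
    endpoint_name="doc-intel-endpoint",        # placeholder endpoint
    index_name="main.legal.clauses_index",     # placeholder Unity Catalog index
)

# Retrieve candidate clauses for a compliance question.
hits = index.similarity_search(
    query_text="termination notice period",
    columns=["doc_id", "clause_text"],
    num_results=5,
)
for row in hits["result"]["data_array"]:
    print(row)
```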

Learn to Program Not Write Prompts with DSPy

Writing prompts for our GenAI applications is long, tedious and unmaintainable work. A proper software development lifecycle requires proper testing and maintenance, something incredibly difficult to do on a block of text. Our current prompt engineering best practices have largely been manual trial and error, testing which of our prompts work well in certain situations. This process worsens as our prompts become more complex, adding multiple tasks and functionality within one long, singular prompt. Enter DSPy, your PROGRAMMATIC way of building GenAI applications. Learn how DSPy allows you to modularize your prompt into modules and enforce typing through signatures. Then, utilize state-of-the-art algorithms to optimize the prompts and weights against your evaluation datasets, just like machine learning! We will compare DSPy to a restaurant to help illustrate and demo DSPy’s capabilities. It's time to start programming, rather than prompting, again!
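For readers new to DSPy, here is a minimal sketch of the signature-plus-module idea described above; the model string and example task are our own illustrative choices.

```python
import dspy

# Any LiteLLM-style provider/model string works here; this one is illustrative.
dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))

class ClassifyTicket(dspy.Signature):
    """Classify a support ticket into one of: billing, shipping, other."""
    ticket: str = dspy.InputField()
    category: str = dspy.OutputField()

# A module wraps the signature; the prompt is generated, not hand-written.
classify = dspy.ChainOfThought(ClassifyTicket)
print(classify(ticket="My card was charged twice for order 991.").category)

# An optimizer such as dspy.BootstrapFewShot could then compile `classify`
# against a labeled evaluation set, tuning prompts much like model weights.
```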