talk-data.com

Topic: Databricks

Tags: big_data, analytics, spark

509 tagged activities

Activity Trend: peak of 515 activities/quarter (2020-Q1 to 2026-Q1)

Activities

Filtering by: Data + AI Summit 2025
The Full Stack of Innovation: Building Data and AI Products With Databricks Apps

In this deep-dive technical session, Ivan Trusov (Sr. SSA @ Databricks) and Giran Moodley (SA @ Databricks) will explore the full-stack development of Databricks Apps, covering everything from frameworks to deployment. We’ll walk through essential topics, including:

- Frameworks & tooling: Pythonic (Dash, Streamlit, Gradio) vs. JS + Python stack
- Development lifecycle: debugging, issue resolution and best practices
- Testing: unit, integration and load testing strategies
- CI/CD & deployment: automating with Databricks Asset Bundles
- Monitoring & observability: OpenTelemetry, metrics collection and analysis

Expect a highly practical session with several live demos showcasing the development loop, testing workflows and CI/CD automation. Whether you’re building internal tools or AI-powered products, this talk will equip you with the knowledge to ship robust, scalable Databricks Apps.
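
As a concrete starting point, here is a minimal sketch of the Pythonic route (Streamlit) for a Databricks App. It is not code from the session itself; the environment variables (DATABRICKS_HOST, DATABRICKS_TOKEN, WAREHOUSE_HTTP_PATH) and the table name are illustrative assumptions.

```python
# Minimal sketch of a Pythonic Databricks App UI (Streamlit).
# Assumptions: the env vars below and the main.sales.orders table are
# illustrative, not a documented Databricks Apps contract.
import os

import streamlit as st
from databricks import sql  # pip install databricks-sql-connector

st.title("Orders dashboard")

@st.cache_data(ttl=300)  # cache the query result for 5 minutes
def load_daily_orders():
    with sql.connect(
        server_hostname=os.environ["DATABRICKS_HOST"],
        http_path=os.environ["WAREHOUSE_HTTP_PATH"],
        access_token=os.environ["DATABRICKS_TOKEN"],
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT order_date, COUNT(*) AS n_orders "
                "FROM main.sales.orders GROUP BY order_date ORDER BY order_date"
            )
            rows = cur.fetchall()
    return {
        "order_date": [r.order_date for r in rows],
        "n_orders": [r.n_orders for r in rows],
    }

st.line_chart(load_daily_orders(), x="order_date", y="n_orders")
```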

Use External Models in Databricks: Connecting to Azure, AWS, Google Cloud, Anthropic and More

In this session, you will learn how to leverage a wide set of GenAI models in Databricks, including external connections to cloud vendors and other model providers. We will cover establishing connections to externally served models via the Mosaic AI Gateway, showcasing connections to Azure, AWS and Google Cloud models, as well as model vendors like Anthropic, Cohere, AI21 Labs and more. You will also discover best practices for model comparison, governance and cost control on those model deployments.
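
To give a flavor of what this looks like in practice, the hedged sketch below registers an Anthropic model behind a Databricks serving endpoint using the MLflow Deployments client; the endpoint name, model id and secret path are illustrative assumptions, not the presenters' exact configuration.

```python
# Hedged sketch: expose an external (Anthropic) model as a Databricks serving
# endpoint via the MLflow Deployments client. Names and the secret reference
# are illustrative.
from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")

client.create_endpoint(
    name="anthropic-chat",  # illustrative endpoint name
    config={
        "served_entities": [{
            "name": "claude",
            "external_model": {
                "name": "claude-3-5-sonnet-20240620",  # illustrative model id
                "provider": "anthropic",
                "task": "llm/v1/chat",
                "anthropic_config": {
                    # reference a Databricks secret rather than pasting a key
                    "anthropic_api_key": "{{secrets/genai/anthropic_key}}",
                },
            },
        }],
    },
)

# Query the external model through the unified endpoint interface
response = client.predict(
    endpoint="anthropic-chat",
    inputs={"messages": [{"role": "user", "content": "Summarize Unity Catalog."}]},
)
print(response)
```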

Building AI Models of the Human Cell: Tahoe Therapeutics on Databricks

Discover how Tahoe Therapeutics (formerly Vevo) is generating gigascale single-cell data that map how drugs interact with cells from cancer patients. Tahoe is using that data to find better therapeutics and to build AI models on Databricks that can predict drug-patient interactions. Their technology enabled the landmark Tahoe-100M atlas, the world’s largest dataset of drug responses, profiling 100 million cells across 60,000 conditions. Learn how Tahoe uses Databricks to process this massive dataset, enabling AI models that predict drug efficacy and resistance at the cellular level. Recognized as the Grand Prize Winner of the Databricks Generative AI Startup Challenge, Tahoe sets a new standard for scalable, data-driven drug discovery.

How Serverless Empowered Nationwide to Build Cost-Efficient and World-Class BI

Databricks’ Serverless compute streamlines infrastructure setup and management, delivering unparalleled performance and cost optimization for Data and BI workflows. In this presentation, we will explore how Nationwide is leveraging Databricks’ serverless technology and unified governance through Unity Catalog to build scalable, world-class BI solutions. Key features like AI/BI Dashboards, Genie, Materialized Views, Lakehouse Federation and Lakehouse Apps, all powered by serverless, have empowered business teams to deliver faster, more scalable and smarter insights. We will show how Databricks’ serverless technology is enabling Nationwide to unlock new levels of efficiency and business impact, and how other organizations can adopt serverless technology to realize similar benefits.

Race to Real-Time: Low-Latency Streaming ETL Meets Next-Gen Databricks OLTP-DB

In today’s digital economy, real-time insights and rapid responsiveness are paramount to delivering exceptional user experiences and lowering TCO. In this session, discover a pioneering approach that leverages a low-latency streaming ETL pipeline built with Spark Structured Streaming and Databricks’ new OLTP-DB, a serverless, managed Postgres offering designed for transactional workloads. Validated in a live customer scenario, this architecture achieves sub-2-second end-to-end latency by seamlessly ingesting streaming data from Kinesis and merging it into OLTP-DB. This breakthrough not only enhances performance and scalability but also provides a replicable blueprint for transforming data pipelines across various verticals. Join us as we delve into the advanced optimization techniques and best practices that underpin this innovation, demonstrating how Databricks’ next-generation solutions can revolutionize real-time data processing and unlock a myriad of new use cases in the data landscape.
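
The core of this pattern is a micro-batch upsert from the stream into Postgres. Below is a hedged sketch under illustrative assumptions (stream name, schema, connection details, and the use of psycopg2 for the upsert); it is a simplified stand-in for the architecture described, not the presenters' pipeline.

```python
# Hedged sketch: Kinesis -> Spark Structured Streaming -> upsert into Postgres.
# Stream, schema and connection details are illustrative assumptions.
from pyspark.sql import functions as F

events = (
    spark.readStream.format("kinesis")           # Databricks Kinesis connector
    .option("streamName", "orders-events")       # illustrative stream
    .option("region", "us-west-2")
    .load()
    .select(
        F.from_json(
            F.col("data").cast("string"),
            "order_id STRING, status STRING, ts TIMESTAMP",
        ).alias("e")
    )
    .select("e.*")
)

def upsert_to_postgres(batch_df, batch_id):
    # Runs once per micro-batch; collect() assumes small, low-latency batches.
    import psycopg2  # hypothetical choice of Postgres driver
    rows = [(r.order_id, r.status, r.ts) for r in batch_df.collect()]
    if not rows:
        return
    with psycopg2.connect(
        host="oltp-db.example.com", dbname="orders", user="etl", password="..."
    ) as conn:
        with conn.cursor() as cur:
            cur.executemany(
                """
                INSERT INTO orders (order_id, status, ts) VALUES (%s, %s, %s)
                ON CONFLICT (order_id)
                DO UPDATE SET status = EXCLUDED.status, ts = EXCLUDED.ts
                """,
                rows,
            )

(
    events.writeStream
    .foreachBatch(upsert_to_postgres)
    .option("checkpointLocation", "/chk/orders")  # illustrative path
    .trigger(processingTime="1 second")           # short trigger for low latency
    .start()
)
```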

Sponsored by: Cognizant | How Cognizant Helped RJR Transform Market Intelligence with GenAI

Cognizant developed a GenAI-driven market intelligence chatbot for RJR using a Dash UI. This chatbot leverages Databricks Vector Search for vector embeddings and semantic search, along with the DBRX-Instruct LLM, to provide accurate and contextually relevant responses to user queries. The implementation involved loading prepared metadata into a Databricks vector database using the GTE model to create vector embeddings, indexing these embeddings for efficient semantic search, and integrating the DBRX-Instruct LLM into the chat system with prompts that guide the LLM in understanding and responding to user queries. The chatbot also generated responses containing URL links to dashboards with requested numerical values, enhancing user experience and productivity by reducing report navigation and discovery time by 30%. This project stands out due to its innovative AI application, advanced reasoning techniques, user-friendly interface, and seamless integration with MicroStrategy.
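
The retrieval step of such a chatbot is straightforward to sketch. The example below queries a Databricks Vector Search index (which can compute GTE embeddings server-side) for semantically similar documents; the endpoint, index and column names are illustrative, not the actual RJR deployment.

```python
# Hedged sketch of the retrieval step: semantic search over a Databricks
# Vector Search index. Endpoint, index and column names are illustrative.
from databricks.vector_search.client import VectorSearchClient

vsc = VectorSearchClient()
index = vsc.get_index(
    endpoint_name="market-intel",            # illustrative endpoint
    index_name="main.rag.report_metadata",   # illustrative index
)

# Semantic search over embeddings computed by the index
hits = index.similarity_search(
    query_text="quarterly market share by category",
    columns=["doc_id", "summary", "dashboard_url"],
    num_results=5,
)

# The retrieved context plus the user question would then be passed to the
# chat LLM (e.g. a serving endpoint) to ground its response.
for row in hits["result"]["data_array"]:
    print(row)
```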

Sponsored by: Infosys | Beyond Hype: Scale & Democratize Agentic AI Across the Enterprise to Realize Business Outcomes

Agentic AI and multimodal data are the next frontiers for realizing intelligent and autonomous business systems. Learn how Infosys innovates with Databricks to accelerate the data-to-AI-agent journey at scale across the enterprise. Hear about our pragmatic, capability-driven approach, rather than a use-case-based one, to bring the data universe, AI foundations, agent management, data and AI governance, and collaboration under unified management.

The explosion of AI has made the enterprise data landscape more important, and more complex, than ever before. Join us to learn how Databricks’ and Tableau’s platforms come together to empower users of all kinds to see, understand, and act on their data in a secure, governed, and performant way.

Summit Live: Databricks Apps - Empowering data and AI teams to build and deploy applications with ease

Databricks Apps empowers data and AI teams to build and deploy applications with ease, in just minutes, not months! It’s the fastest and most secure way to deliver impactful solutions, with built-in governance based on Unity Catalog and the Databricks Data Intelligence Platform. See demos of the latest and greatest, and learn how you can get started right away.

Crypto at Scale: Building a High-Performance Platform for Real-Time Blockchain Data

In today’s fast-evolving crypto landscape, organizations require fast, reliable intelligence to manage risk, investigate financial crime, and stay ahead of evolving threats. In this session, discover how Elliptic built a scalable, high-performance Data Intelligence Platform that delivers real-time, actionable blockchain insights to their customers. We’ll walk you through some of the key components of the Elliptic Platform, including the Elliptic Entity Graph and our user-facing analytics. Our focus will be on the evolution of our user-facing analytics capabilities, and specifically how components from the Databricks ecosystem such as Structured Streaming, Delta Lake, and SQL Warehouse have played a vital role. We’ll also share some of the optimizations we’ve made to our streaming jobs to maximize performance and ensure data completeness. Whether you’re looking to enhance your streaming capabilities, expand your knowledge of how crypto analytics works, or simply discover novel approaches to data processing at scale, this session will provide concrete strategies and valuable lessons learned.
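
As an illustration of the kind of streaming-job tuning discussed here, the sketch below bounds per-batch input and late data on a Delta source; paths, columns and limits are illustrative assumptions, not Elliptic's configuration.

```python
# Hedged sketch: tuning a Structured Streaming job over a Delta source by
# capping per-batch input and bounding state with a watermark. All names,
# paths and limits are illustrative.
from pyspark.sql import functions as F

txns = (
    spark.readStream.format("delta")
    .option("maxFilesPerTrigger", 64)           # cap files per micro-batch
    .load("/lake/bronze/chain_txns")            # illustrative path
    .withWatermark("block_time", "10 minutes")  # bound state for late events
)

per_entity = (
    txns.groupBy(F.window("block_time", "1 minute"), "entity_id")
        .agg(F.sum("value_usd").alias("flow_usd"))
)

(
    per_entity.writeStream
    .format("delta")
    .outputMode("append")  # watermark allows append on windowed aggregates
    .option("checkpointLocation", "/chk/entity_flows")
    .start("/lake/silver/entity_flows")
)
```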

Databricks Observability: Using System Tables to Monitor and Manage Your Databricks Instance

The session will cover how to use Unity Catalog-governed system tables to understand what is happening in Databricks. We will touch on key scenarios for FinOps, DevOps and SecOps to ensure you have a well-observed Data Intelligence Platform. Learn about new developments in system tables and other features that will help you observe your Databricks instance.
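
A typical FinOps starting point looks like the query below, which aggregates 30 days of DBU consumption by SKU from the documented system.billing.usage table; the exact grouping you use will depend on your chargeback model.

```python
# Sketch of a FinOps query over Databricks system tables. system.billing.usage
# is a documented system table; the aggregation here is a simple illustration.
daily_dbus = spark.sql("""
    SELECT
      usage_date,
      sku_name,
      SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= date_sub(current_date(), 30)
    GROUP BY usage_date, sku_name
    ORDER BY usage_date, dbus DESC
""")
display(daily_dbus)  # display() is available in Databricks notebooks
```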

Data Intelligence for Marketing Breakout: Agentic Systems for Bayesian MMM and Consumer Testing

This talk dives into leveraging GenAI to scale sophisticated decision intelligence. Learn how an AI copilot interface simplifies running complex Bayesian probabilistic models, accelerating insight generation and enabling accurate decision-making at the enterprise level. We talk through techniques for deploying AI agents at scale to simulate market dynamics or product-feature impacts, providing robust, data-driven foresight for high-stakes innovation and strategy directly within your Databricks environment. For marketing teams, this approach will help you leverage autonomous AI agents to dynamically manage media-channel allocation while simulating real-world consumer behavior through synthetic testing environments.
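
To make "Bayesian MMM" concrete, here is a deliberately minimal illustrative model in PyMC, with synthetic data, geometric adstock and hypothetical priors; it is the kind of model such a copilot might run, not the speakers' actual implementation.

```python
# Illustrative (not the speakers') minimal Bayesian media-mix model in PyMC:
# sales ~ intercept + sum_c beta_c * adstock(spend_c). Data and priors are
# synthetic assumptions.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
weeks, channels = 104, 3
spend = rng.gamma(2.0, 1.0, size=(weeks, channels))        # synthetic spend
sales = 5 + spend @ np.array([0.8, 0.3, 0.5]) + rng.normal(0, 0.5, weeks)

def adstock(x, decay):
    # geometric carryover: effect(t) = x(t) + decay * effect(t-1)
    out = [x[0]]
    for t in range(1, x.shape[0]):
        out.append(x[t] + decay * out[-1])
    return pm.math.stack(out)

with pm.Model() as mmm:
    intercept = pm.Normal("intercept", 0, 5)
    beta = pm.HalfNormal("beta", 1.0, shape=channels)      # channel effects
    decay = pm.Beta("decay", 2, 2, shape=channels)         # carryover rates
    mu = intercept + sum(
        beta[c] * adstock(spend[:, c], decay[c]) for c in range(channels)
    )
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("sales", mu, sigma, observed=sales)
    idata = pm.sample(500, tune=500, chains=2)             # small run for a sketch
```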

Delivering Sub-Second Latency for Operational Workloads on Databricks

As enterprise streaming adoption accelerates, more teams are turning to real-time processing to support operational workloads that require sub-second response times. To address this need, Databricks introduced Project Lightspeed in 2022, which recently delivered Real-Time Mode in Apache Spark™ Structured Streaming. This new mode achieves consistent p99 latencies under 300ms for a wide range of stateless and stateful streaming queries. In this session, we’ll define what constitutes an operational use case, outline typical latency requirements and walk through how to meet those SLAs using Real-Time Mode in Structured Streaming.
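
The kind of workload Real-Time Mode targets looks like the stateless query below: per-record parse, filter and republish between Kafka topics. This is a standard Structured Streaming sketch with illustrative topics and schema; the specific configuration that switches a query into Real-Time Mode is product-specific and deliberately not shown.

```python
# Sketch of a stateless, low-latency pipeline of the sort Real-Time Mode
# targets: parse, filter and republish payment events. Brokers, topics and
# schema are illustrative; enabling Real-Time Mode itself is not shown.
from pyspark.sql import functions as F

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # illustrative
    .option("subscribe", "payments")
    .load()
)

flagged = (
    raw.select(
        F.from_json(
            F.col("value").cast("string"),
            "id STRING, amount DOUBLE, country STRING",
        ).alias("p")
    )
    .select("p.*")
    .where(F.col("amount") > 10000)  # stateless per-record filter
    .select(F.to_json(F.struct("id", "amount", "country")).alias("value"))
)

(
    flagged.writeStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("topic", "payments-flagged")
    .option("checkpointLocation", "/chk/payments")  # illustrative path
    .start()
)
```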

Developing the Dreamers of Data + AI’s Future: How 84.51˚ builds upskilling to accelerate adoption

“Once an idea has taken hold of the brain it's almost impossible to eradicate. An idea that is fully formed — fully understood — that sticks, right in there somewhere.” The Data Scientists and Engineers at 84.51˚ utilize the Databricks Lakehouse for a wide array of tasks, including data exploration, analysis, machine learning operations, orchestration, automated deployments and collaboration. In this talk, 84.51˚’s Data Science Learning Lead, Michael Carrico, will share their approach to upskilling a diverse workforce to support the company’s strategic initiatives. This approach includes creating tailored learning experiences for a variety of personas using content curated in partnership with Databricks’ educational offerings. Then he will demonstrate how he puts his 11 years of data science and engineering experience to work by using the Databricks Lakehouse not just as a subject, but also as a tool to create impactful training experiences and a learning culture at 84.51˚.

Empowering the Warfighter With AI

The new Budget Execution Validation process has transformed how the Navy reviews unspent funds. Powered by Databricks Workflows, MLflow, Delta Lake and Apache Spark™, this data-driven model predicts which financial transactions are most likely to have errors, streamlining reviews and increasing accuracy. In FY24, it helped review $40 billion, freeing $1.1 billion for other priorities, including $260 million from active projects. By reducing reviews by 80%, cutting job runtime by over 50% and lowering costs by 60%, it saved 218,000 work hours and $6.7 million in labor costs. With automated workflows and robust data management, this system exemplifies how advanced tools can improve financial decision-making, save resources and ensure efficient use of taxpayer dollars.
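
For readers curious what such a model might look like mechanically, here is an illustrative sketch (emphatically not the Navy's actual system) that trains a transaction-error classifier with Spark ML and logs it to MLflow; the table, label and feature names are assumptions.

```python
# Illustrative sketch: score transactions for error likelihood with Spark ML
# and track the run in MLflow. Table, label and feature names are assumptions.
import mlflow
from pyspark.ml import Pipeline
from pyspark.ml.classification import GBTClassifier
from pyspark.ml.feature import VectorAssembler

txns = spark.table("finance.bronze.transactions")  # illustrative table

assembler = VectorAssembler(
    inputCols=["amount", "days_since_obligation", "vendor_risk_score"],
    outputCol="features",
)
gbt = GBTClassifier(labelCol="had_error", featuresCol="features")

with mlflow.start_run(run_name="budget-execution-validation"):
    model = Pipeline(stages=[assembler, gbt]).fit(txns)
    mlflow.spark.log_model(model, "model")
    # Predicted error probability ranks transactions for human review
    scored = model.transform(txns).select("transaction_id", "probability")
```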

Healthcare and Life Sciences: Getting Started with AI Agents

Healthcare and life sciences organizations are exploring AI Agents, driving transformation from intelligent supply chains to up-leveling the patient experience via virtual assistants. This session explores how you can get started with AI Agents powered by Databricks and robust data governance, tapping into the full potential of all your data. You’ll learn practical steps for getting started: unifying data with Databricks, ensuring compliance with Unity Catalog, and rapidly deploying AI Agents to drive operational efficiency, improve care, and foster innovation across healthcare and life sciences.

How to Migrate From Oracle to Databricks SQL

Migrating your legacy Oracle data warehouse to the Databricks Data Intelligence Platform can accelerate your data modernization journey. In this session, learn the top strategies for completing this migration. We will cover data type conversion, basic-to-complex code conversion, and validation and reconciliation best practices. Discover the pros and cons of loading CSV files with PySpark versus using pipelines to land data in Databricks tables. See before-and-after architectures of customers who have migrated, and learn about the benefits they realized.
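
As a small example of the CSV-to-PySpark path, the sketch below loads an Oracle export with an explicit schema (so NUMBER, VARCHAR2 and DATE columns convert predictably) and lands it as a table for Databricks SQL; the paths, table name and type mappings are illustrative.

```python
# Hedged sketch of the CSV-to-PySpark load path, with an explicit schema to
# control Oracle-to-Spark type conversion. Paths and names are illustrative.
from pyspark.sql import types as T

schema = T.StructType([
    T.StructField("customer_id", T.DecimalType(38, 0)),  # Oracle NUMBER -> DECIMAL
    T.StructField("name", T.StringType()),               # VARCHAR2 -> STRING
    T.StructField("created_at", T.TimestampType()),      # DATE/TIMESTAMP -> TIMESTAMP
])

df = (
    spark.read.format("csv")
    .option("header", "true")
    .schema(schema)                         # avoid lossy type inference
    .load("/mnt/oracle_export/customers/")  # illustrative export location
)

# Land as a managed table for downstream Databricks SQL use
df.write.mode("overwrite").saveAsTable("main.migration.customers")

# Simple reconciliation check: compare this count to the Oracle source extract
print(spark.table("main.migration.customers").count())
```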