talk-data.com

Topic: AI/ML (Artificial Intelligence/Machine Learning)

Tags: data_science, algorithms, predictive_analytics

9014 activities tagged

Activity Trend: peak of 1532 activities per quarter, 2020-Q1 to 2026-Q1

Activities

9014 activities · Newest first

Measure What Matters: Quality-Focused Monitoring for Production AI Agents

Ensuring the operational excellence of AI agents in production requires robust monitoring capabilities that span both performance metrics and quality evaluation. This session explores Databricks' comprehensive Mosaic Agent Monitoring solution, designed to provide visibility into deployed AI agents through an intuitive dashboard that tracks critical operational metrics and quality indicators. We'll demonstrate how to use the Agent Monitoring solution to iteratively improve a production agent so it delivers a better customer support experience while decreasing the cost of delivering customer support. We will show how to identify and proactively fix a quality problem with the GenAI agent's response before it becomes a major issue, and how to understand users' usage patterns and implement and test a feature improvement to the GenAI agent. Key session takeaways include: techniques for monitoring essential operational metrics, including request volume, latency, errors, and cost efficiency across your AI agent deployments; strategies for implementing continuous quality evaluation using AI judges that assess correctness, guideline adherence, and safety without requiring ground truth labels; best practices for setting up effective monitoring dashboards that enable dimension-based analysis across time periods, user feedback, and topic categories; and methods for collecting and integrating end-user feedback to create a closed-loop system that drives iterative improvement of your AI agents.
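To make the first takeaway concrete, here is a minimal sketch of computing operational metrics such as request volume, latency, error rate, and cost from a request-log table with PySpark. The table and column names are assumptions for illustration, not the Mosaic Agent Monitoring schema.

```python
# Hypothetical sketch: aggregate basic operational metrics for a deployed agent
# from an assumed request-log table. Table and column names are illustrative,
# not the Mosaic Agent Monitoring schema.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

logs = spark.read.table("main.agent_logs.requests")  # assumed table

daily_metrics = (
    logs.groupBy(F.to_date("request_timestamp").alias("day"))
    .agg(
        F.count("*").alias("request_volume"),
        F.avg("latency_ms").alias("avg_latency_ms"),
        F.expr("percentile_approx(latency_ms, 0.95)").alias("p95_latency_ms"),
        F.avg(F.col("is_error").cast("int")).alias("error_rate"),
        F.sum("estimated_cost_usd").alias("total_cost_usd"),
    )
    .orderBy("day")
)

daily_metrics.show()
```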

Optimizing EV Charging Experience: Machine Learning for Accurate Charge Time Estimation

Accurate charge time estimation is key to vehicle performance and user experience. We developed a scalable ML model that enhances real-time charge predictions in vehicle controls. Traditional rule-based methods struggle with dynamic factors like environment, vehicle state, and charging conditions. Our adaptive ML solution improves accuracy by 10%. We use Unity Catalog for data governance, Delta Tables for storage, and Liquid Clustering for data layout. Job schedulers manage data processing, while AutoML accelerates model selection. MLflow streamlines tracking, versioning, and deployment. A dedicated serving endpoint enables A/B testing and real-time insights. As our data ecosystem grew, scalability became critical. Our flexible ML framework was integrated into vehicle control systems within months. With live accuracy tracking and software-driven blending, we support 50,000+ weekly charge sessions, improving energy management and user experience.
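As an illustrative sketch only (not the team's actual pipeline), the snippet below trains a charge-time regressor on synthetic data, tracks it with MLflow, and blends its prediction with a rule-based estimate in the spirit of the software-driven blending described above; the features, blend weight, and model choice are assumptions.

```python
# Illustrative sketch (not the production pipeline): train a charge-time
# regressor, track it with MLflow, and blend its prediction with a legacy
# rule-based estimate.
import mlflow
import mlflow.sklearn
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Assumed features: ambient temperature, state of charge, charger power (normalized)
X = rng.uniform(size=(1000, 3))
# Synthetic charge time in minutes, for demonstration only
y = 30 + 60 * X[:, 1] - 10 * X[:, 2] + rng.normal(0, 2, 1000)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run():
    model = GradientBoostingRegressor().fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    mlflow.log_metric("mae_minutes", mae)
    mlflow.sklearn.log_model(model, "charge_time_model")

def blended_estimate(features, rule_based_minutes, weight=0.7):
    """Blend the ML prediction with the rule-based estimate (weight is an assumption)."""
    ml_minutes = float(model.predict([features])[0])
    return weight * ml_minutes + (1 - weight) * rule_based_minutes
```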

Powering Secure and Scalable Data Governance at PepsiCo With Unity Catalog Open APIs

PepsiCo, given its scale, has numerous teams leveraging different tools and engines to access data and perform analytics and AI. To streamline governance across this diverse ecosystem, PepsiCo unifies its data and AI assets under an open and enterprise-grade governance framework with Unity Catalog. In this session, we'll explore real-world examples of how PepsiCo extends Unity Catalog’s governance to all its data and AI assets, enabling secure collaboration even for teams outside Databricks. Learn how PepsiCo architects permissions using service principals and service accounts to authenticate with Unity Catalog, building a multi-engine architecture with seamless and open governance. Attendees will gain practical insights into designing a scalable, flexible data platform that unifies governance across all teams while embracing openness and interoperability.
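As a hedged sketch of the pattern described, the snippet below authenticates as a Databricks service principal using the OAuth client-credentials flow and lists catalogs through the Unity Catalog REST API; the workspace URL and credentials are placeholders, and your authentication setup may differ.

```python
# Hedged sketch: authenticate as a Databricks service principal and list
# Unity Catalog catalogs over the REST API. URLs and IDs are placeholders.
import requests

WORKSPACE = "https://<workspace-host>"         # placeholder
CLIENT_ID = "<service-principal-client-id>"    # placeholder
CLIENT_SECRET = "<service-principal-secret>"   # placeholder

# Machine-to-machine OAuth token (client credentials flow)
token_resp = requests.post(
    f"{WORKSPACE}/oidc/v1/token",
    data={"grant_type": "client_credentials", "scope": "all-apis"},
    auth=(CLIENT_ID, CLIENT_SECRET),
)
access_token = token_resp.json()["access_token"]

# List the catalogs this service principal is permitted to see
catalogs = requests.get(
    f"{WORKSPACE}/api/2.1/unity-catalog/catalogs",
    headers={"Authorization": f"Bearer {access_token}"},
).json()

for catalog in catalogs.get("catalogs", []):
    print(catalog["name"])
```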

Real-Time Botnet Defense at CVS: AI-Driven Detection and Mitigation on Databricks

Botnet attacks mobilize digital armies of compromised devices that continuously evolve, challenging traditional security frameworks with their high-speed, high-volume nature. In this session, we will reveal our advanced system — developed on the Databricks platform — that leverages cutting-edge AI/ML capabilities to detect and mitigate bot attacks in near-real time. We will dive into the system’s robust architecture, including scalable data ingestion, feature engineering, MLOps strategies & production deployment of the system. We will address the unique challenges of processing bulk HTTP traffic data, time-series anomaly detection and attack signature identification. We will demonstrate key business values through downtime minimization and threat response automation. With sectors like healthcare facing heightened risks, ensuring data integrity and service continuity is vital. Join us to uncover lessons learned while building an enterprise-grade solution that stays ahead of adversaries.
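As a generic baseline for the time-series anomaly detection mentioned above (not CVS's production system), the sketch below flags unusual per-minute request volumes with a rolling z-score; the synthetic traffic and threshold are assumptions.

```python
# Generic sketch: flag request-rate anomalies in HTTP traffic with a rolling
# z-score, a common baseline for time-series anomaly detection on bot traffic.
import numpy as np
import pandas as pd

# Assumed input: per-minute request counts for a client IP or endpoint,
# with a synthetic traffic spike in the last 20 minutes
traffic = pd.DataFrame({
    "minute": pd.date_range("2025-01-01", periods=120, freq="min"),
    "requests": np.r_[np.random.poisson(50, 100), np.random.poisson(400, 20)],
})

window = 30
rolling = traffic["requests"].rolling(window, min_periods=window)
traffic["zscore"] = (traffic["requests"] - rolling.mean()) / rolling.std()
traffic["is_anomaly"] = traffic["zscore"] > 4  # threshold is an assumption

print(traffic[traffic["is_anomaly"]].head())
```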

Sponsored by: Confluent | Turn SAP Data into AI-Powered Insights with Databricks

Learn how Confluent simplifies real-time streaming of your SAP data into AI-ready Delta tables on Databricks. In this session, you'll see how Confluent’s fully managed data streaming platform—with unified Apache Kafka® and Apache Flink®—connects data from SAP S/4HANA, ECC, and 120+ other sources to enable easy development of trusted, real-time data products that fuel highly contextualized AI and analytics. With Tableflow, you can represent Kafka topics as Delta tables in just a few clicks—eliminating brittle batch jobs and custom pipelines. You’ll see a product demo showcasing how Confluent unites your SAP and Databricks environments to unlock ERP-fueled AI, all while reducing the total cost of ownership (TCO) for data streaming by up to 60%.
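For context on what Tableflow replaces, here is a sketch of the kind of hand-built Spark Structured Streaming job that lands a Kafka topic in a Delta table; the broker, topic, and table names are placeholders, and the Kafka connector package must be available on the cluster.

```python
# Sketch of the hand-built pipeline Tableflow is meant to replace: a Spark
# Structured Streaming job landing a Kafka topic in a Delta table.
# Broker, topic, checkpoint, and table names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "<broker:9092>")
    .option("subscribe", "sap.s4hana.orders")      # placeholder topic
    .option("startingOffsets", "earliest")
    .load()
)

orders = raw.select(
    F.col("key").cast("string").alias("order_key"),
    F.col("value").cast("string").alias("payload_json"),
    "timestamp",
)

query = (
    orders.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/sap_orders")  # placeholder
    .toTable("main.sap.orders_raw")                               # placeholder table
)
```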

Sponsored by: Datafold | Breaking Free: How Evri is Modernizing SAP HANA Workflows to Databricks with AI and Datafold

With expensive contracts up for renewal, Evri faced the challenge of migrating 1,000 SAP HANA assets and 200+ Talend jobs to Databricks. This talk will cover how we transformed SAP HANA and Talend workflows into modern Databricks pipelines through AI-powered translation and validation, without months of manual coding. We'll cover: techniques for handling SAP HANA's proprietary formats; approaches for refactoring incremental pipelines while ensuring dashboard stability; the technology enabling automated translation of complex business logic; and validation strategies that guarantee migration accuracy. We'll share real examples of SAP HANA stored procedures transformed into Databricks code and demonstrate how we maintained 100% uptime of critical dashboards during the transition. Join us to discover how AI is revolutionizing what's possible in enterprise migrations from GUI-based legacy systems to modern, code-first data platforms.
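As a generic illustration of the validation idea (not Datafold's tooling), the sketch below compares row counts and simple per-column aggregates between a legacy extract and its migrated Databricks table; the table and column names are hypothetical.

```python
# Generic validation sketch: compare row counts and per-column aggregates
# between a migrated table and its source extract. Names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

source = spark.read.table("legacy_extracts.hana_orders")   # hypothetical
target = spark.read.table("main.migrated.orders")          # hypothetical

checks = {
    "row_count": (source.count(), target.count()),
    "sum_amount": (
        source.agg(F.sum("amount")).first()[0],
        target.agg(F.sum("amount")).first()[0],
    ),
    "distinct_orders": (
        source.select("order_id").distinct().count(),
        target.select("order_id").distinct().count(),
    ),
}

for name, (src_val, tgt_val) in checks.items():
    status = "OK" if src_val == tgt_val else "MISMATCH"
    print(f"{name}: source={src_val} target={tgt_val} -> {status}")
```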

Sponsored by: Dataiku | Agility Meets Governance: How Morgan Stanley Scales ML in a Regulated World

In regulated industries like finance, agility can't come at the cost of compliance. Morgan Stanley found the answer in combining Dataiku and Databricks to create a governed, collaborative ecosystem for machine learning and predictive analytics. This session explores how the firm accelerated model development and decision-making, reducing time-to-insight by 50% while maintaining full audit readiness. Learn how no-code workflows empowered business users, while scalable infrastructure powered Terabyte-scale ML. Discover best practices for unified data governance, risk automation, and cross-functional collaboration that unlock innovation without compromising security. Ideal for data leaders and ML practitioners in regulated industries looking to harmonize speed, control, and value.

Sponsored by: Impetus Technologies | Future-Ready Data at Scale: How Shutterfly Modernized for GenAI-Driven Personalization

As a leading personalized product retailer, Shutterfly needed a modern, secure, and performant data foundation to power GenAI-driven customer experiences. However, their existing stack was creating roadblocks in performance, governance, and machine learning scalability. In partnership with Impetus, Shutterfly embarked on a multi-phase migration to Databricks Unity Catalog. This transformation not only accelerated Shutterfly’s ability to provide AI-driven personalization at scale but also improved governance, reduced operational overhead, and laid a scalable foundation for GenAI innovation. Join experts from Databricks, Impetus, and Shutterfly to discover how this collaboration enabled faster data-driven decision-making, simplified compliance, and unlocked the agility needed to meet evolving customer demands in the GenAI era. Learn from their journey and take away best practices for your own modernization efforts.

Sponsored by: Oxylabs | Web Scraping and AI: A Quiet but Critical Partnership

Behind every powerful AI system lies a critical foundation: fresh, high-quality web data. This session explores the symbiotic relationship between web scraping and artificial intelligence that's transforming how technical teams build data-intensive applications. We'll showcase how this partnership enables crucial use cases: analyzing trends, forecasting behaviors, and enhancing AI models with real-time information. Technical challenges that once made web scraping prohibitively complex are now being solved through the very AI systems they help create. You'll learn how machine learning revolutionizes web data collection, making previously impossible scraping projects both feasible and maintainable, while dramatically reducing engineering overhead and improving data quality. Join us to explore this quiet but critical partnership that's powering the next generation of AI applications.

Sponsored by: Promethium | Delivering Self-Service Data for AI Scale on Databricks

AI initiatives often stall when data teams can't keep up with business demand for ad hoc, self-service data. Whether it's AI agents, BI tools, or business users, everyone needs data immediately, but the pipeline-centric modern data stack is not built for this scale of agility. Promethium enables data teams to generate instant, contextual data products called Data Answers based on rapid, exploratory questions from the business. Data Answers empower data teams for AI-scale collaboration with the business. We will demo Promethium's new agent capability to build Data Answers on Databricks for self-service data. The Promethium agent leverages and extends Genie with context from other enterprise data and applications to ensure accuracy and relevance.

Sponsored by: Salesforce | From Data to Action: A Unified and Trusted Approach

Empower AI and agents with trusted data and metadata from an end-to-end unified system. Discover how Salesforce Data Cloud, Agentforce, and Databricks work together to fuel automation, AI, and analytics through a unified data strategy—driving real-time intelligence, enabling zero-copy data sharing, and unlocking scalable activation across the enterprise.

Sponsored by: Securiti | Safely Curating Data to Enable Enterprise AI with Databricks

This session will explore how developers can easily select, extract, filter, and control data pre-ingestion to accelerate safe AI. Learn how the Securiti and Databricks partnership empowers Databricks users by providing the critical foundation for unlocking scalability and accelerating trustworthy AI development and adoption. Key takeaways: ● Understand how to leverage data intelligence to establish a foundation for frameworks like the OWASP Top 10 for LLMs, NIST AI RMF and Gartner's TRiSM. ● Learn how automated data curation and syncing address specific risks while accelerating AI development in Databricks. ● Discover how leading organizations apply robust access controls across vast swaths of mostly unstructured data. ● Learn how to maintain data provenance and control as data is moved and transformed through complex pipelines in the Databricks platform.
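As a much-simplified illustration of pre-ingestion curation (not Securiti's product), the sketch below drops disallowed columns and masks email addresses before data is ingested; the policy, patterns, and column names are invented for the example.

```python
# Simplified sketch of pre-ingestion curation: drop disallowed columns and
# mask obvious email addresses before the data lands in a table.
# Policy, patterns, and column names are illustrative only.
import re
import pandas as pd

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
DISALLOWED_COLUMNS = {"ssn", "credit_card_number"}   # assumed policy

def curate(records: pd.DataFrame) -> pd.DataFrame:
    # Drop columns the policy forbids from ingestion
    kept = records.drop(columns=[c for c in records.columns if c in DISALLOWED_COLUMNS])
    # Mask email addresses in free-text columns
    for col in kept.select_dtypes(include="object"):
        kept[col] = kept[col].str.replace(EMAIL_RE, "[REDACTED_EMAIL]", regex=True)
    return kept

raw = pd.DataFrame({
    "ticket_text": ["Contact me at jane@example.com about the outage"],
    "ssn": ["123-45-6789"],
})
print(curate(raw))
```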

Sponsored by: Tredence | Getting Your Data Foundation Ready for Agentic AI

In this session, we take you inside a new kind of data foundation—one that moves beyond traditional tables and metrics to create meaning. We’ll explore how turning metadata into a system of understanding, not just record-keeping, can unlock powerful agentic workflows. You’ll see how business terms like "cost variance" or "lead time" become executable, traceable, and reusable assets that guide AI agents, ensure consistency, and restore trust across decentralized teams. Drawing on real challenges from the CPG world, we’ll walk through how companies are moving from governance battles to collaborative ownership, from static reports to living data definitions, and from disconnected data to decisions that act with confidence and speed. The future of AI isn't just about smarter models—it’s about smarter context. This is your roadmap for transforming your data foundation into a shared language between people and machines.
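As a toy sketch of the "executable business term" idea, the snippet below registers governed metric definitions and resolves them to SQL on demand; the terms, expressions, and table names are invented for illustration.

```python
# Toy sketch: register governed metric definitions once and let agents or
# pipelines resolve them to SQL. Names and expressions are invented.
METRIC_REGISTRY = {
    "cost_variance": {
        "sql": "SUM(actual_cost - planned_cost)",
        "grain": "product_line",
        "owner": "finance",
    },
    "lead_time": {
        "sql": "AVG(datediff(delivered_at, ordered_at))",
        "grain": "supplier",
        "owner": "supply_chain",
    },
}

def metric_query(term: str, table: str) -> str:
    """Resolve a business term to a traceable, reusable SQL query."""
    definition = METRIC_REGISTRY[term]
    return (
        f"SELECT {definition['grain']}, {definition['sql']} AS {term} "
        f"FROM {table} GROUP BY {definition['grain']}"
    )

print(metric_query("cost_variance", "main.cpg.supply_costs"))
```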

Techcombank's Multi-Million Dollar Transformation Leveraging Cloud and Databricks

The migration to the Databricks Data Intelligence Platform has enabled Techcombank to more efficiently unify data from over 50 systems, improve governance, streamline daily operational analytics pipelines and use advanced analytics tools and AI to create more meaningful and personalized experiences for customers. With Databricks, Techcombank has also introduced key solutions that are reshaping its digital banking services: an AI-driven lead management system, the internally developed 'Lead Allocation Curated Engine' (LACE), which optimizes lead management and provides relationship managers with enriched insights for smarter lead allocation to drive business growth; and an AI-powered program for digital banking inclusion of small businesses, GeoSense, which assists frontline workers with analytics-driven insights about which small businesses and merchants to engage in the bank's digital ecosystem. Further examples will be presented in the session.

Unlocking Cross-Organizational Collaboration to Protect the Environment With Databricks at DEFRA

Join us to learn how the UK's Department for Environment, Food & Rural Affairs (DEFRA) transformed data use with Databricks' Unity Catalog, enabling nationwide projects through secure, scalable analytics. DEFRA safeguards the UK's natural environment. Historical fragmentation of data, talent and tools across siloed platforms and organizations made it difficult to fully exploit the department's rich data. DEFRA launched its Data Analytics & Science Hub (DASH), powered by the Databricks Data Intelligence Platform, to unify its data ecosystem. DASH enables hundreds of users to access and share datasets securely. A flagship example demonstrates its power: using Databricks to process aerial photography and satellite data to identify peatlands in need of restoration, a complex task made possible through unified data governance, scalable compute and AI. Attendees will hear about DEFRA's journey and learn valuable lessons about building a platform that crosses organizational boundaries.

Pacers Sports and Entertainment and Databricks

The Pacers Sports Group has had an amazing year: the Indiana Pacers reached the NBA Finals for the first time in 25 years, and the Fever are setting attendance and viewership records with WNBA star Caitlin Clark. Hear how they have transformed their data and AI capabilities for marketing, fan behavior insights, season ticket propensity models, and democratization to their non-technical personas, and how they achieved a 12,000x cost reduction, down to just $8 a year, by switching to Databricks.

Scaling Blockchain ML With Databricks: From Graph Analytics to Graph Machine Learning

Coinbase leverages Databricks to scale ML on blockchain data, turning vast transaction networks into actionable insights. This session explores how Databricks’ scalable infrastructure, powered by Delta Lake, enables real-time processing for ML applications like NFT floor price predictions. We’ll show how GraphFrames helps us analyze billion-node transaction graphs (e.g., Bitcoin) for clustering and fraud detection, uncovering structural patterns in blockchain data. But traditional graph analytics has limits. We’ll go further with Graph Neural Networks (GNNs) using Kumo AI, which learn from the transaction network itself rather than relying on hand-engineered features. By encoding relationships directly into the model, GNNs adapt to new fraud tactics, capturing subtle relationships that evolve over time. Join us to see how Coinbase is advancing blockchain ML with Databricks and deep learning on graphs.
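As a minimal illustration of the graph analytics step (not Coinbase's pipeline), the sketch below builds a toy transaction graph with GraphFrames and runs connected components to cluster addresses; it assumes the graphframes package is installed on the cluster.

```python
# Minimal GraphFrames sketch: cluster a toy transaction graph with connected
# components. Requires the graphframes package on the cluster.
from pyspark.sql import SparkSession
from graphframes import GraphFrame

spark = SparkSession.builder.getOrCreate()
spark.sparkContext.setCheckpointDir("/tmp/graphframes-checkpoints")

vertices = spark.createDataFrame(
    [("addr_a",), ("addr_b",), ("addr_c",), ("addr_d",)], ["id"]
)
edges = spark.createDataFrame(
    [("addr_a", "addr_b", 0.5), ("addr_b", "addr_c", 1.2)],
    ["src", "dst", "amount_btc"],
)

graph = GraphFrame(vertices, edges)

# Address clusters: connected components over the transaction graph
clusters = graph.connectedComponents()
clusters.show()

# A simple structural signal often used alongside clustering: in-degree per address
graph.inDegrees.show()
```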

Sponsored by: Acceldata | Agentic Data Management: Trusted Data for Enterprise AI on Databricks

Acceldata's Agentic Data Management (ADM) platform takes an intelligent, action-driven approach to bridging data engineering and AI/ML workflows, delivering continuous data trust through comprehensive monitoring, validation, and remediation across the entire Databricks data lifecycle. Learn how the ADM platform: ensures end-to-end data reliability across Databricks, from ingestion and transformation through feature engineering and model deployment; bridges data engineering and AI teams by providing unified insights across Databricks jobs, notebooks and pipelines with proactive data insights and actions; and accelerates the delivery of trustworthy enterprise AI outcomes by detecting multi-variate anomalies, monitoring feature drift, and maintaining lineage within Databricks-native environments.
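As a generic example of the feature-drift monitoring mentioned above (not Acceldata's ADM platform), the sketch below computes a population stability index between a training baseline and current serving data.

```python
# Generic feature-drift sketch: population stability index (PSI) between a
# training baseline and current serving data.
import numpy as np

def population_stability_index(baseline, current, bins=10):
    """PSI above ~0.2 is a common rule-of-thumb signal of meaningful drift."""
    edges = np.histogram_bin_edges(baseline, bins=bins)
    base_pct = np.histogram(baseline, bins=edges)[0] / len(baseline)
    curr_pct = np.histogram(current, bins=edges)[0] / len(current)
    # Avoid division by zero / log(0) with a small floor
    base_pct = np.clip(base_pct, 1e-6, None)
    curr_pct = np.clip(curr_pct, 1e-6, None)
    return float(np.sum((curr_pct - base_pct) * np.log(curr_pct / base_pct)))

rng = np.random.default_rng(1)
baseline = rng.normal(0, 1, 10_000)
current = rng.normal(0.3, 1.1, 10_000)   # shifted distribution
print(f"PSI: {population_stability_index(baseline, current):.3f}")
```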

Sponsored by: Google Cloud | Powering AI & Analytics: Innovations in Google Cloud Storage for Data Lakes

Enterprise customers need a powerful and adaptable data foundation to navigate the demands of AI and multi-cloud environments. This session dives into how Google Cloud Storage serves as a unified platform for modern analytics data lakes, together with Databricks. Discover key innovations in Google Cloud Storage such as performance optimizations for Apache Iceberg, Anywhere Cache as the easiest way to colocate storage and compute, Rapid Storage for ultra-low-latency object reads and appends, and Storage Intelligence for vital data insights and recommendations. Learn how you can optimize your infrastructure to unlock the full value of your data for AI-driven success.

Sponsored by: Qubika | Agentic AI In Finance: How To Build Agents Using Databricks And LangGraph

Join us for this session on how to build AI finance agents with Databricks and LangChain. The session introduces a modular framework that integrates LangChain, retrieval-augmented generation (RAG), and Databricks' unified data platform to build intelligent, adaptable finance agents. We'll walk through the architecture and key components involved in building a system tailored for complex financial tasks like portfolio analysis, reporting automation, and real-time risk insights, including Databricks Unity Catalog, MLflow, and Mosaic AI. We'll also showcase a demo of one such agent in action: a Financial Analyst Agent. This agent emulates the expertise of a seasoned data analyst, delivering in-depth analysis in seconds and eliminating the need to wait hours or days for manual reports. The solution provides organizations with 24/7 access to advanced data analysis, enabling faster, smarter decision-making.
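As a highly simplified, framework-free sketch of the retrieval-augmented agent pattern described here, the snippet below wires naive keyword retrieval into a prompt for a placeholder LLM call; call_llm, the document store, and the retrieval logic are hypothetical stand-ins for the LangChain/LangGraph, Unity Catalog, and Mosaic AI components covered in the session.

```python
# Highly simplified RAG sketch in plain Python. `call_llm` and the document
# store are hypothetical stand-ins for the components the session covers.
def call_llm(prompt: str) -> str:
    """Hypothetical placeholder for a model-serving call (e.g. a Databricks endpoint)."""
    raise NotImplementedError("wire up your LLM endpoint here")

DOCUMENTS = {
    "q1_portfolio.md": "Q1 portfolio return was 4.2%, led by energy holdings.",
    "risk_policy.md": "Positions above 5% of AUM require weekly risk review.",
}

def retrieve(question: str, k: int = 2) -> list[str]:
    """Naive keyword retrieval standing in for a vector search index."""
    scored = sorted(
        DOCUMENTS.items(),
        key=lambda kv: -sum(word.lower() in kv[1].lower() for word in question.split()),
    )
    return [text for _, text in scored[:k]]

def financial_analyst_agent(question: str) -> str:
    # Ground the model's answer in the retrieved context
    context = "\n".join(retrieve(question))
    prompt = (
        "You are a financial analyst. Answer using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)
```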