talk-data.com

Topic: Business Intelligence (BI)

Tags: data_visualization, reporting, analytics (78 tagged activities)

[Activity trend chart: peak of 111 activities per quarter, 2020-Q1 through 2026-Q1]

Activities

Showing filtered results

Filtering by: Data + AI Summit 2025
Beyond Chatbots: Building Autonomous Insurance Applications With Agentic AI Framework

The insurance industry is at the crossroads of digital transformation, facing challenges from market competition and customer expectations. While conventional ML applications have historically provided capabilities in this domain, the emergence of Agentic AI frameworks presents a revolutionary opportunity to build truly autonomous insurance applications. We will address issues related to data governance and quality while discussing how to monitor and evaluate fine-tuned models. We'll demonstrate the application of the agentic framework in the insurance context and how these autonomous agents can work collaboratively to handle complex insurance workflows — from submission intake and risk evaluation to expedited quote generation. This session demonstrates how to architect intelligent insurance solutions using Databricks Mosaic AI agentic core components including Unity Catalog, Playground, model evaluation/guardrails, privacy filters, AI functions and AI/BI Genie.
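The collaborative workflow the abstract describes (intake, risk evaluation, quote generation) can be pictured as a chain of agents passing enriched state along. The sketch below is a toy in plain Python under assumed names (`intake_agent`, `risk_agent`, `quote_agent`, the scoring rule); it is not the Mosaic AI agent API.

```python
from dataclasses import dataclass, field

@dataclass
class Submission:
    applicant: str
    property_value: float
    notes: dict = field(default_factory=dict)

def intake_agent(sub: Submission) -> Submission:
    # Validate and normalize the raw submission.
    assert sub.property_value > 0, "property value must be positive"
    sub.notes["intake"] = "validated"
    return sub

def risk_agent(sub: Submission) -> Submission:
    # Toy risk score in [0, 1]; a real agent would call models/tools.
    sub.notes["risk_score"] = min(sub.property_value / 1_000_000, 1.0)
    return sub

def quote_agent(sub: Submission) -> dict:
    # Premium scales with the risk score produced upstream.
    premium = sub.property_value * 0.01 * (1 + sub.notes["risk_score"])
    return {"applicant": sub.applicant, "premium": round(premium, 2)}

def run_workflow(sub: Submission) -> dict:
    # Agents collaborate by enriching shared state down the chain.
    for agent in (intake_agent, risk_agent):
        sub = agent(sub)
    return quote_agent(sub)

print(run_workflow(Submission("Acme LLC", 500_000.0)))
# risk_score 0.5 -> premium 500000 * 0.01 * 1.5 = 7500.0
```

In a production system each step would be an LLM-backed agent with tools and guardrails; the chain-of-state pattern stays the same.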

Got Metrics? Build a Metric Store — A Tour of Developing Metrics Through UC Metric Views

I have metrics, you have metrics — we all have metrics. But the real problem isn’t having metrics, it’s that the numbers never line up, leading to endless cycles of reconciliation and confusion. Join us as we share how our Data Team at Databricks tackled this fundamental challenge in Business Intelligence by building an internal Metric Store — creating a single source of truth for all business metrics using the newly-launched UC Metric Views. Imagine a world where numbers always align, metric definitions are consistently applied across the organization and every metric comes with built-in ML-based forecasting, AI-powered anomaly detection and automatic explainability. That’s the future we’ve built — and we’ll show you how you can get started today.
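The core idea of a metric store is that each metric is defined exactly once and every consumer computes through that one definition, so numbers cannot drift apart. Here is a minimal pure-Python sketch of that principle; the registry, function names, and sample data are illustrative, not the UC Metric Views API.

```python
# Single registry of metric definitions: the "single source of truth".
METRICS = {}

def define_metric(name, fn):
    if name in METRICS:
        raise ValueError(f"metric {name!r} already defined; edit it in one place")
    METRICS[name] = fn

def compute(name, rows):
    # Every dashboard/notebook calls through here, so definitions can't drift.
    return METRICS[name](rows)

orders = [
    {"amount": 120.0, "returned": False},
    {"amount": 80.0, "returned": True},
    {"amount": 50.0, "returned": False},
]

define_metric("net_revenue",
              lambda rows: sum(r["amount"] for r in rows if not r["returned"]))
define_metric("return_rate",
              lambda rows: sum(1 for r in rows if r["returned"]) / len(rows))

print(compute("net_revenue", orders))  # 170.0
```

The duplicate-definition check is the point: reconciliation cycles disappear when a second, conflicting definition simply cannot be registered.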

Latest Innovations in AI/BI Dashboards and Genie

Discover how the latest innovations in Databricks AI/BI Dashboards and Genie are transforming self-service analytics. This session offers a high-level tour of new capabilities that empower business users to ask questions in natural language, generate insights faster and make smarter decisions. Whether you're a long-time Databricks user or just exploring what's possible with AI/BI, you'll walk away with a clear understanding of how these tools are evolving — and how to leverage them for greater business impact.

Securely Deploying AI/BI to All Users in Your Enterprise

Bringing AI/BI to every business user starts with getting security, access and governance right. In this session, we’ll walk through the latest best practices for configuring Databricks accounts, setting up workspaces, and managing authentication protocols to enable secure and scalable onboarding. Whether you're supporting a small team or an entire enterprise, you'll gain practical insights to protect your data while ensuring seamless and governed access to AI/BI tools.

What’s New in Databricks SQL: Latest Features and Live Demos

Databricks SQL has added significant features at a rapid pace over the last year. This session will share the most impactful features and the customer use cases that inspired them. We will highlight the new SQL editor, SQL coding features, streaming tables and materialized views, BI integrations, cost management features, system tables and observability features, and more. We will also share AI-powered performance optimizations.

Summit Live: AI/BI Genie & Dashboards - Talk With Your Data With GenAI-Powered Business Intelligence

AI/BI Genie lets anyone simply talk with their own data using natural language, fully secured through Unity Catalog, providing accurate answers within the context of your organization. AI/BI Dashboards go beyond traditional BI tools, empowering everyone to self-serve immediate, interactive visuals on their own secured data. Hear from a customer and Databricks experts on the latest developments.

Databricks Lakeflow: the Foundation of Data + AI Innovation for Your Industry

Every analytics, BI and AI project relies on high-quality data. This is why data engineering, the practice of building reliable data pipelines that ingest and transform data, is consequential to the success of these projects. In this session, we'll show how you can use Lakeflow to accelerate innovation in multiple parts of the organization. We'll review real-world examples of Databricks customers using Lakeflow in different industries such as automotive, healthcare and retail. We'll touch on how the foundational data engineering capabilities Lakeflow provides help power initiatives that improve customer experiences, make real-time decisions and drive business results.

Want to learn how to build your own custom data intelligence applications directly in Databricks? In this workshop, we’ll guide you through a hands-on tutorial for building a Streamlit web app that leverages many of the key products at Databricks as building blocks. You’ll integrate a live DB SQL warehouse, use Genie to ask questions in natural language, and embed AI/BI dashboards for interactive visualizations. In addition, we’ll discuss key concepts and best practices for building production-ready apps, including logging and observability, scalability, different authorization models, and deployment. By the end, you'll have a working AI app—and the skills to build more.

Introduction to Unity Catalog Metrics: Define Your Business Metrics Once, Trust Everywhere

Today’s organizations need faster, more reliable insights — but metric sprawl and inconsistent KPIs make that difficult. In this session, you’ll learn how Unity Catalog Metrics helps unify business semantics across your organization. Define your KPIs once, apply enterprise-grade governance with fine-grained access controls, auditing and lineage, and use them across any Databricks tool — from AI/BI Dashboards and Genie to notebooks and Lakeflow. You’ll learn how to eliminate metric chaos by centrally defining and governing metrics with Unity Catalog. You’ll walk away with strategies to boost trust through built-in governance and empower every team — regardless of technical skill — to work from the same certified metrics.
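"Define once, govern centrally, use everywhere" can be illustrated with a toy registry of certified metrics that layers an access check over each definition. Everything here (metric names, roles, the rule) is hypothetical and only sketches the governance idea, not Unity Catalog's actual access model.

```python
# One certified definition per metric, plus who may read it.
CERTIFIED_METRICS = {
    "arr": {
        "expr": lambda rows: sum(r["amount"] for r in rows),
        "allowed_roles": {"finance", "exec"},
    },
}

def query_metric(name, rows, role):
    m = CERTIFIED_METRICS[name]
    if role not in m["allowed_roles"]:
        # Fine-grained control: the definition is shared, access is not.
        raise PermissionError(f"role {role!r} may not read metric {name!r}")
    return m["expr"](rows)

rows = [{"amount": 100}, {"amount": 250}]
print(query_metric("arr", rows, role="finance"))  # 350
```

A real deployment would also record lineage and audit each read; the key property shown is that every authorized team computes the same certified number.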

Scaling AI/BI Genie: Best Practices for Curating and Managing Production Spaces

Unlock Genie's full potential with best practices for curating, deploying and monitoring Genie spaces at scale. This session offers a deep dive into the latest enhancements and provides practical guidance on designing high-quality spaces, streamlining deployment workflows and implementing robust monitoring to ensure accuracy and performance in production. Ideal for teams aiming to scale conversational analytics, you’ll leave with actionable strategies to keep your Genie spaces efficient, reliable and aligned with business outcomes.

What's New and What's Next: Building Impactful AI/BI Dashboards

Ready to take your AI/BI dashboards to the next level? This session dives into the latest capabilities in Databricks AI/BI Dashboards and how to maximize impact across your organization. Learn how data authors can tailor visualizations for different audiences, optimize performance and seamlessly integrate with Genie for a unified analytics experience. We’ll also share practical tips on how business users and data teams can better collaborate — ensuring insights are accessible, actionable and aligned to business goals.

AI-Assisted BI: Everything You Need to Know

Explore how AI is transforming business intelligence and data analytics across the Databricks platform. This session offers a comprehensive overview of AI-assisted capabilities, from generating dashboards and visualizations to integrating Genie on dashboards for conversational analytics. Whether you’re a data engineer, analyst or BI developer, this session will equip you to leverage AI with BI for better, smarter decisions.

AI/BI Genie: A Look Under the Hood of Everyone's Friendly, Neighborhood GenAI Product

Go beyond the user interface and explore the cutting-edge technology driving AI/BI Genie. This session breaks down the AI/BI Genie architecture, showcasing how LLMs, retrieval-augmented generation (RAG) and finely tuned knowledge bases work together to deliver fast, accurate responses. We’ll also explore how AI agents orchestrate workflows, optimize query performance and continuously refine their understanding. Ideal for those who want to geek out about the tech stack behind Genie, this session offers a rare look at the magic under the hood.
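The retrieval step in a RAG pipeline like the one described above can be sketched in miniature: given a natural-language question, pull the most relevant schema snippets from a small knowledge base and assemble them into a grounded prompt. Word-overlap scoring here stands in for real embedding similarity, and the table names are invented for illustration.

```python
# Tiny "knowledge base" of table schemas the model may query.
KNOWLEDGE_BASE = {
    "sales": "TABLE sales(order_id INT, amount DOUBLE, region STRING)",
    "customers": "TABLE customers(customer_id INT, name STRING, segment STRING)",
    "inventory": "TABLE inventory(sku STRING, on_hand INT, warehouse STRING)",
}

def retrieve(question: str, k: int = 1) -> list[str]:
    # Score each snippet by word overlap with the question (a stand-in
    # for vector similarity) and return the top k.
    q_words = set(question.lower().split())
    def score(snippet: str) -> int:
        tokens = snippet.lower().replace("(", " ").replace(",", " ").split()
        return len(q_words & set(tokens))
    ranked = sorted(KNOWLEDGE_BASE.values(), key=score, reverse=True)
    return ranked[:k]

def build_prompt(question: str) -> str:
    # Retrieved schemas ground the LLM so it writes SQL against real tables.
    context = "\n".join(retrieve(question, k=2))
    return f"Schemas:\n{context}\n\nQuestion: {question}\nWrite a SQL query."

print(build_prompt("total sales amount by region"))
```

The production system adds the pieces the abstract names on top of this skeleton: an LLM to generate the SQL, agents to orchestrate and validate it, and feedback loops that refine the knowledge base.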

Bridging BI Tools: Deep Dive Into AI/BI Dashboards for Power BI Practitioners

In the rapidly evolving field of data analytics, AI/BI Dashboards and Power BI stand out as two formidable approaches, each offering unique strengths and catering to specific use cases. Power BI has earned its reputation for delivering user-friendly, highly customizable visualizations and reports for data analysis. On the other hand, AI/BI Dashboards have gained significant traction due to their seamless integration with the Databricks platform, making them an attractive option for data practitioners. This session will provide a comparison of these two tools, highlighting their respective features, strengths and potential limitations. Understanding the nuances between these tools is crucial for organizations aiming to make informed decisions about their data analytics strategy. This session will equip participants with the knowledge needed to select the most appropriate tool or combination of tools to meet their data analysis requirements and drive data-informed decision-making processes.

Building Responsible AI Agents on Databricks

This presentation explores how Databricks' Data Intelligence Platform supports the development and deployment of responsible AI in credit decisioning, ensuring fairness, transparency and regulatory compliance. Key areas include bias and fairness monitoring using Lakehouse Monitoring to track demographic metrics and automated alerts for fairness thresholds. Transparency and explainability are enhanced through the Mosaic AI Agent Framework, SHAP values and LIME for feature importance auditing. Regulatory alignment is achieved via Unity Catalog for data lineage and AI/BI dashboards for compliance monitoring. Additionally, LLM reliability and security are ensured through AI guardrails and synthetic datasets to validate model outputs and prevent discriminatory patterns. The platform integrates real-time SME and user feedback via Databricks Apps and AI/BI Genie Space.
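The "demographic metrics with automated fairness alerts" idea reduces to a small computation: measure the approval rate per group and alert when the gap (the demographic parity difference) exceeds a threshold. This is a hedged pure-Python sketch with made-up data and an illustrative threshold, not the Lakehouse Monitoring API.

```python
def approval_rates(decisions):
    # decisions: list of (group, approved) pairs.
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}

def parity_gap(decisions):
    # Demographic parity difference: spread between best- and
    # worst-treated groups' approval rates.
    rates = approval_rates(decisions)
    return max(rates.values()) - min(rates.values())

def fairness_alert(decisions, threshold=0.2):
    gap = parity_gap(decisions)
    return {"gap": round(gap, 3), "alert": gap > threshold}

decisions = [("A", True), ("A", True), ("A", False),
             ("B", True), ("B", False), ("B", False)]
print(fairness_alert(decisions))
# Group A approves 2/3, group B 1/3: gap 0.333, so the alert fires.
```

In a monitored deployment this check would run continuously over decision tables, with the alert wired to the team's incident channel.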

How do you transform a data pipeline from sluggish 10-hour batch processing into a real-time powerhouse that delivers insights in just 10 minutes? This was the challenge we tackled at one of France's largest manufacturing companies, where data integration and analytics were mission-critical for supply chain optimization. Power BI dashboards needed to refresh every 15 minutes, but our team struggled with legacy Azure Data Factory batch pipelines. These outdated processes couldn't keep up, delaying insights and generating up to three daily incident tickets. We identified Lakeflow Declarative Pipelines and Databricks SQL as the game-changing solution to modernize our workflow, implement quality checks, and reduce processing times. In this session, we'll dive into the key factors behind our success:

- Pipeline modernization with Lakeflow Declarative Pipelines: improving scalability
- Data quality enforcement: clean, reliable datasets
- Seamless BI integration: using Databricks SQL to power fast, efficient queries in Power BI
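The data-quality-enforcement step can be pictured as declaring expectations that every record must pass before it reaches the serving table, with failures quarantined rather than silently propagated. This toy loosely mirrors the expectations concept in declarative pipelines; the rule names and sample records are invented for illustration.

```python
# Declared expectations: each is a named predicate a record must satisfy.
EXPECTATIONS = {
    "has_order_id": lambda r: r.get("order_id") is not None,
    "positive_qty": lambda r: r.get("qty", 0) > 0,
}

def apply_expectations(records):
    # Route each record to the clean set or the quarantine, tagged with
    # the names of the rules it failed.
    clean, quarantined = [], []
    for r in records:
        failed = [name for name, rule in EXPECTATIONS.items() if not rule(r)]
        if failed:
            quarantined.append((r, failed))
        else:
            clean.append(r)
    return clean, quarantined

raw = [
    {"order_id": 1, "qty": 3},
    {"order_id": None, "qty": 2},   # fails has_order_id
    {"order_id": 2, "qty": 0},      # fails positive_qty
]

clean, bad = apply_expectations(raw)
print(len(clean), len(bad))  # 1 clean record, 2 quarantined
```

Quarantining with failure reasons is what turns three daily incident tickets into a reviewable queue: bad records never reach the dashboards, and each one carries the rule it broke.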

Healthcare Interoperability: End-to-End Streaming FHIR Pipelines With Databricks & Redox

The direct integration between Redox and Databricks can streamline your interoperability workflows, from responding to preauthorization requests in record time to alerting attending physicians in near real time to changes in sepsis and readmission risk from ADT feeds. Data engineers will learn how to create fully-streaming ETL pipelines for ingesting, parsing and acting on insights from Redox FHIR bundles delivered directly to Unity Catalog volumes. Once available in the Lakehouse, AI/BI Dashboards and Agentic Frameworks help write FHIR messages back to Redox for direct pushdown to EMR systems. Parsing FHIR bundle resources has never been easier with SQL combined with the new VARIANT data type in Delta and streaming table creation against Serverless DBSQL Warehouses. We'll also use the Databricks accelerators dbignite and redoxwrite for writing and posting FHIR bundles back to Redox-integrated EMRs, and we'll extend AI/BI with Unity Catalog SQL UDFs and the Redox API for use in Genie.
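The shred-the-bundle step the abstract does in SQL over VARIANT columns looks like this in plain Python: a FHIR Bundle is JSON with an `entry` array, and each entry wraps a `resource` with a `resourceType`. The miniature bundle below is hand-made for illustration, not real Redox output.

```python
import json

# Minimal hand-made FHIR Bundle: one Patient and one Observation.
bundle_json = json.dumps({
    "resourceType": "Bundle",
    "entry": [
        {"resource": {"resourceType": "Patient", "id": "p1",
                      "name": [{"family": "Doe", "given": ["Jane"]}]}},
        {"resource": {"resourceType": "Observation", "id": "o1",
                      "code": {"text": "heart rate"},
                      "valueQuantity": {"value": 72, "unit": "/min"}}},
    ],
})

def resources_of_type(bundle: dict, resource_type: str) -> list[dict]:
    # Equivalent of filtering bundle entries by resourceType in SQL.
    return [e["resource"] for e in bundle.get("entry", [])
            if e.get("resource", {}).get("resourceType") == resource_type]

bundle = json.loads(bundle_json)
patients = resources_of_type(bundle, "Patient")
obs = resources_of_type(bundle, "Observation")
print(patients[0]["name"][0]["family"])   # Doe
print(obs[0]["valueQuantity"]["value"])   # 72
```

The advantage of VARIANT in the SQL version is that this same traversal happens lazily over semi-structured columns, without predefining a schema for every resource shape.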

Sponsored by: Pantomath | The Shift from 3,000 to 500 BI Reports: A Data Leader’s Guide to Leaner, Smarter Data Operations

Join Sandy Steiger, Head of Advanced Analytics & Automation (formerly at TQL), as she walks through how her team tackled one of the most common and least talked about problems in data teams: report bloat, data blind spots, and broken trust with the business. You’ll learn how TQL went from 3,000 reports to fewer than 500 while gaining better visibility, faster data issue resolution, and cloud agility through practical use of lineage, automated detection, and surprising outcomes from implementing Pantomath (an automated data operations platform). Sandy will share how her team identified upstream issues (before Microsoft did), avoided major downstream breakages, and built the credibility every data team needs to earn trust from the business. Walk away with a playbook for using automation to drive smarter, faster decisions across your organization.

Sponsored by: Promethium | Delivering Self-Service Data for AI Scale on Databricks

AI initiatives often stall when data teams can’t keep up with business demand for ad hoc, self-service data. Whether it’s AI agents, BI tools, or business users — everyone needs data immediately, but the pipeline-centric modern data stack is not built for this scale of agility. Promethium enables data teams to generate instant, contextual data products called Data Answers based on rapid, exploratory questions from the business. Data Answers empower data teams for AI-scale collaboration with the business. We will demo Promethium’s new agent capability to build Data Answers on Databricks for self-service data. The Promethium agent leverages and extends Genie with context from other enterprise data and applications to ensure accuracy and relevance.

Evolving Data Insights With Privacy at Mastercard

Mastercard is a global technology company whose role is anchored in trust. It supports 3.4 billion cards and over 143 billion transactions annually. To address customers’ increasing data volume and complex privacy needs, Mastercard has developed a novel service atop Databricks’ Clean Rooms and broader Data Intelligence Platform. This service combines several Databricks components with Mastercard’s IP, providing an evolved method for data-driven insights and value-added services while ensuring a unique standalone turnkey service. The result is a secure environment where multiple parties can collaborate on sensitive data without directly accessing each other’s information. After this session, attendees will understand how Mastercard used its expertise in privacy-enhancing technologies to create collaboration tools powered by Databricks’ Clean Rooms, AI/BI, Apps, Unity Catalog, Workflows and DatabricksIQ — as well as how to take advantage of this new privacy-enhancing service directly.