talk-data.com

Topic: Analytics (tags: data_analysis, insights, metrics; 178 tagged activities)

Activity trend: peak of 398 activities/quarter, 2020-Q1 to 2026-Q1

Activities

Showing filtered results

Filtering by: Data + AI Summit 2025
Sponsored by: ThoughtSpot | How Chevron Fuels Cloud Data Modernization

Learn how Chevron transitioned their central finance and procurement analytics into the cloud using Databricks and ThoughtSpot’s Agentic Analytics Platform. Explore how Chevron leverages ThoughtSpot to unlock actionable insights, enhance their semantic layer with user-driven understanding, and ultimately drive more impactful strategies for customer engagement and business growth. In this session, Chevron explains the dos, don’ts, and best practices of migrating from outdated legacy business intelligence to real-time, AI-powered insights.

Streaming Meets Governance: Building AI-Ready Tables With Confluent Tableflow and Unity Catalog

Learn how Databricks and Confluent are simplifying the path from real-time data to governed, analytics- and AI-ready tables. This session will cover how Confluent Tableflow automatically materializes Kafka topics into Delta tables and registers them with Unity Catalog — eliminating the need for custom streaming pipelines. We’ll walk through how this integration helps data engineers reduce ingestion complexity, enforce data governance and make real-time data immediately usable for analytics and AI.

Unified Advanced Analytics: Integrating Power BI and Databricks Genie for Real-time Insights

In today’s data-driven landscape, business users expect seamless, interactive analytics without having to switch between different environments. This presentation explores our web application that unifies a Power BI dashboard with Databricks Genie, allowing users to query and visualize insights from the same dataset within a single, cohesive interface. We will compare two integration strategies: one that leverages a traditional webpage enhanced by an Azure bot to incorporate Genie’s capabilities, and another that utilizes Databricks Apps to deliver a smoother, native experience. We use the Genie API to build this solution. Attendees will learn the architecture behind these solutions, key design considerations and challenges encountered during implementation. Join us to see live demos of both approaches, and discover best practices for delivering an all-in-one, interactive analytics experience.
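The abstract mentions building the web application on the Genie API. As a hedged sketch only (the workspace URL, space ID, token, and request shape below are illustrative assumptions, not details from the session), a custom front end might open a Genie conversation with an HTTP call like this:

```python
# Hedged sketch: opening a Databricks Genie conversation from a custom web app.
# WORKSPACE, the space ID, and the token are placeholders, not real values.
import json
from urllib import request

WORKSPACE = "https://example-workspace.cloud.databricks.com"  # hypothetical

def start_conversation_request(space_id: str, question: str, token: str):
    """Build the POST request that asks Genie a natural-language question."""
    url = f"{WORKSPACE}/api/2.0/genie/spaces/{space_id}/start-conversation"
    body = json.dumps({"content": question}).encode()
    req = request.Request(url, data=body, method="POST")
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Content-Type", "application/json")
    return req

req = start_conversation_request("demo-space", "Total sales by region?", "dapi-XXX")
print(req.full_url)
```

In a real app the request would be sent with `request.urlopen(req)` and the response polled for Genie's answer; here only the request construction is shown.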

FinOps: Automated Unity Catalog Cost Observability, Data Isolation and Governance Framework

Westat, a leader in data-driven research for more than 60 years, has implemented a centralized Databricks platform to support hundreds of research projects for government, foundations, and private clients. This initiative modernizes Westat’s technical infrastructure while maintaining rigorous statistical standards and streamlining data science. The platform enables isolated project environments with strict data boundaries, centralized oversight, and regulatory compliance. It allows project-specific customization of compute and analytics, and delivers scalable computing for complex analyses. Key features include config-driven Infrastructure as Code (IaC) with Terragrunt, custom tagging and AWS cost integration for ROI tracking, budget policies with alerts for proactive cost management, and a centralized dashboard with row-level security for self-service cost analytics. This unified approach provides full financial visibility and governance while empowering data teams to deliver value. Audio for this session is delivered in the conference mobile app; bring your own headphones to listen.
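The budget-policy-with-alerts idea above can be sketched in miniature. This is an illustrative toy, not Westat's implementation; the record fields and threshold are assumptions:

```python
# Illustrative sketch (not Westat's actual code): aggregate tagged cost
# records per project and flag any project at or above a budget threshold.
from collections import defaultdict

def budget_alerts(cost_records, budgets, threshold=0.8):
    """Return {project: spend} for projects at >= `threshold` of their budget."""
    spend = defaultdict(float)
    for rec in cost_records:
        spend[rec["project_tag"]] += rec["cost_usd"]
    return {p: s for p, s in spend.items()
            if p in budgets and s >= threshold * budgets[p]}

records = [
    {"project_tag": "survey-a", "cost_usd": 900.0},
    {"project_tag": "survey-b", "cost_usd": 100.0},
]
alerts = budget_alerts(records, {"survey-a": 1000.0, "survey-b": 1000.0})
print(alerts)  # survey-a is at 90% of its budget, above the 80% threshold
```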

From Code to Insights: Leveraging Advanced Infrastructure and AI Capabilities

In this talk, we will explore how AI and advanced infrastructure are transforming Insulet's development and operations. We'll highlight how our innovations have reduced scrap part costs through manufacturing analytics, showcasing efficiency and cost savings. On the productivity side, Databricks AI tooling not only identifies errors but also fixes code and assists in writing complex queries, going beyond suggestions to provide working solutions. On the infrastructure side, integrating Spark with Databricks simplifies setup and reduces costs. Additionally, Databricks Lakeflow Connect enables real-time updates from our Salesforce integration with minimal coding. We'll also discuss real-time processing of patient data, demonstrating how Databricks drives efficiency and productivity. Join us to learn how these innovations enhance efficiency, cost savings and performance.

Sponsored by: Capital One Software | How Capital One Balances Lower Cost and Peak Performance in Databricks

Companies need a lot of data to build and deploy AI models—and they want it quickly. To meet this demand, platform teams are quickly scaling their Databricks usage, resulting in excess cost driven by inefficiencies and performance anomalies. Capital One has over 4,000 users leveraging Databricks to power advanced analytics and machine learning capabilities at scale. In this talk, we’ll share lessons learned from optimizing our own Databricks usage while balancing lower cost with peak performance. Attendees will learn how to identify top sources of waste, best practices for cluster management, tips for user governance and methods to keep costs in check.
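One way to make "identify top sources of waste" concrete is to rank clusters by spend during low utilization. This is a hedged illustration with assumed fields and cutoffs, not Capital One's method:

```python
# Hedged sketch: rank clusters by DBUs billed while average CPU utilization
# was below a cutoff — one simple proxy for "wasted" spend.
def top_waste(usage, k=3, idle_cutoff=0.2):
    """Top-k clusters by DBUs spent under the utilization cutoff."""
    wasted = [(u["cluster"], u["dbus"]) for u in usage
              if u["avg_cpu"] < idle_cutoff]
    return sorted(wasted, key=lambda t: -t[1])[:k]

usage = [
    {"cluster": "etl-prod", "dbus": 120.0, "avg_cpu": 0.65},
    {"cluster": "adhoc-dev", "dbus": 300.0, "avg_cpu": 0.05},
    {"cluster": "ml-train", "dbus": 80.0, "avg_cpu": 0.10},
]
print(top_waste(usage))
```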

Sponsored by: Lovelytics | From SAP Silos to Supply Chain Superpower: How AI Is Reinventing Planning

Today’s supply chains demand more than historical insights; they need real-time intelligence. In this actionable session, discover how leading enterprises are unlocking the full potential of their SAP data by integrating it with Databricks and AI. See how CPG companies are transforming supply chain planning by combining SAP ERP data with external signals like weather and transportation data, enabling them to predict disruptions, optimize inventory, and make faster, smarter decisions. Powered by Databricks, this solution delivers true agility and resilience through a unified data architecture. Join us to learn how:
- You can eliminate SAP data silos and make them ML- and AI-ready at scale
- External data sources amplify SAP use cases like forecasting and scenario planning
- AI-driven insights accelerate time-to-action across supply chain operations
Whether you're just starting your data modernization journey or seeking ROI from SAP analytics, this session will show you what’s possible.

As first-party data becomes increasingly valuable to organizations, Walmart Data Ventures is dedicated to bringing to life new applications of Walmart’s first-party data to better serve its customers. Through Scintilla, its integrated insights ecosystem, Walmart Data Ventures continues to expand its offerings to deliver insights and analytics that drive collaboration between our merchants, suppliers, and operators. Scintilla users can now access Walmart data using Cloud Feeds, based on Databricks Delta Sharing technologies. In the past, Walmart used API-based data sharing models, which required users to possess technical skills that weren’t always available. Now, with Cloud Feeds, Scintilla users can more easily access data without a dedicated technical team behind the scenes making it happen. Attendees will gain valuable insights into how Walmart built its robust data sharing architecture, and into strategies for designing scalable, collaborative data sharing architectures in their own organizations.

Dusting off the Cobwebs — Moving off a 26-year-old Heritage Platform to Databricks [Teradata]

Join us to hear about how National Australia Bank (NAB) successfully completed a significant milestone in its data strategy by decommissioning its 26-year-old Teradata environment and migrating to a new strategic data platform called 'Ada'. This transition marks a pivotal shift from legacy systems to a modern, cloud-based data and AI platform powered by Databricks. The migration process, which spanned two years, involved ingesting 16 data sources, transferring 456 use cases, and collaborating with hundreds of users across 12 business units. This strategic move positions NAB to leverage the full potential of cloud-native data analytics, enabling more agile and data-driven decision-making across the organization. The successful migration to Ada represents a significant step forward in NAB's ongoing efforts to modernize its data infrastructure and capitalize on emerging technologies in the rapidly evolving financial services landscape.

How HP Is Optimizing the 3D Printing Supply Chain Using Delta Sharing

HP’s 3D Print division empowers manufacturers with telemetry data to optimize operations and streamline maintenance. Using Delta Sharing, Unity Catalog and AI/BI dashboards, HP provides a secure, scalable solution for data sharing and analytics. Delta Sharing D2O enables seamless data access, even for customers not on Databricks. Apigee masks private URLs, and Unity Catalog enhances security by managing data assets. Predictive maintenance with Mosaic AI boosts uptime by identifying issues early and alerting support teams. Custom dashboards and sample code let customers run analytics using any supported client, while Apigee simplifies access by abstracting complexity. Insights from AI/BI dashboards help HP refine its data strategy, aligning solutions with customer needs despite the complexity of diverse technologies, fragmented systems and customer-specific requirements. This fosters trust, drives innovation, and strengthens HP as a trusted partner for scalable, secure data solutions.
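For readers unfamiliar with Delta Sharing D2O, a consumer off Databricks addresses a shared table as `<profile>#<share>.<schema>.<table>`. The sketch below only builds that coordinate string; the profile path and table names are made up for illustration, and the actual load call is shown as a comment:

```python
# Hedged sketch of open (D2O) access via the delta-sharing protocol.
# The profile file and share/schema/table names below are placeholders.
def table_url(profile_path: str, share: str, schema: str, table: str) -> str:
    """Delta Sharing table coordinate: '<profile>#<share>.<schema>.<table>'."""
    return f"{profile_path}#{share}.{schema}.{table}"

url = table_url("config.share", "hp_3dprint", "telemetry", "printer_events")
print(url)

# With the `delta-sharing` package installed and a valid profile file, a
# non-Databricks consumer could then load the shared table as pandas:
#   import delta_sharing
#   df = delta_sharing.load_as_pandas(url)
```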

IQVIA's Analytics for Patient Support Services: Transforming Scalability, Performance and Governance

This presentation will explore the transformation of IQVIA's decade-old patient support platform through the implementation of Databricks Data Intelligence Platform. Facing scalability challenges, performance bottlenecks and rising costs, the existing platform required significant redesign to handle growing data volumes and complex analytics. Key issues included static metrics limiting workflow optimization, fragmented data governance and heightened compliance and security demands. By partnering with Customertimes (a Databricks Partner) and adopting Databricks' centralized, scalable analytics solution with enhanced self-service capabilities, IQVIA achieved improved query performance, cost efficiency and robust governance, ensuring operational effectiveness and regulatory compliance in an increasingly complex environment.

Sponsored by: Deloitte | Advancing AI in Cybersecurity with Databricks & Deloitte: Data Management & Analytics

Deloitte is observing a growing trend among cybersecurity organizations to develop big data management and analytics solutions beyond traditional Security Information and Event Management (SIEM) systems. Leveraging Databricks to extend these SIEM capabilities, Deloitte can help clients lower the cost of cyber data management while enabling scalable, cloud-native architectures. Deloitte helps clients design and implement cybersecurity data meshes, using Databricks as a foundational data lake platform to unify and govern security data at scale. Additionally, Deloitte extends clients’ cybersecurity capabilities by integrating advanced AI and machine learning solutions on Databricks, driving more proactive and automated cybersecurity solutions. Attendees will gain insight into how Deloitte is utilizing Databricks to manage enterprise cyber risks and deliver performant and innovative analytics and AI insights that traditional security tools and data platforms aren’t able to deliver.

Unifying Data Delivery: Using Databricks as Your Enterprise Serving Layer

This session will take you on our journey of integrating Databricks as the core serving layer in a large enterprise, demonstrating how you can build a unified data platform that meets diverse business needs. We will walk through the steps for constructing a central serving layer by leveraging Databricks’ SQL Warehouse to efficiently deliver data to analytics tools and downstream applications. To tackle low latency requirements, we’ll show you how to incorporate an interim scalable relational database layer that delivers sub-second performance for hot data scenarios. Additionally, we’ll explore how Delta Sharing enables secure and cost-effective data distribution beyond your organization, eliminating silos and unnecessary duplication for a truly end-to-end centralized solution. This session is perfect for data architects, engineers and decision-makers looking to unlock the full potential of Databricks as a centralized serving hub.
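The interim low-latency layer described above is essentially a cache-aside pattern. As a toy sketch (the data shapes and stand-in warehouse lookup are assumptions, not the speakers' design):

```python
# Illustrative cache-aside read path: serve "hot" keys from a low-latency
# store, fall back to the SQL Warehouse serving layer, and promote the result.
def read(key, hot_store, warehouse_lookup):
    """Return (value, source) for a key, preferring the hot store."""
    if key in hot_store:
        return hot_store[key], "hot-store"
    value = warehouse_lookup(key)       # e.g. a SQL Warehouse query
    hot_store[key] = value              # promote for sub-second re-reads
    return value, "warehouse"

hot = {"order:1": {"status": "shipped"}}
fetch = lambda k: {"status": "pending"}  # stand-in for a warehouse query
print(read("order:1", hot, fetch))
print(read("order:2", hot, fetch))
```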

Sponsored by: Coalesce | Bringing Order to Chaos: How to Succeed in a Data & Analytics World

Priorities shift, requirements change, resources fluctuate, and the demands on data teams are only continuing to grow. Join this session, led by Coalesce Sales Engineering Director Michael Tantrum, to hear about the most efficient way to deliver high-quality data to your organization at the speed it needs to consume it. Learn how to sidestep the common pitfalls of data development for maximum data team productivity.

Tech Industry Forum: Tip of the Spear With Data and AI | Sponsored by: Aimpoint Digital and AWS

Join us for the Tech Industry Forum, formerly known as the Tech Innovators Summit, now part of Databricks Industry Experience. This session will feature keynotes, panels and expert talks led by top customer speakers and Databricks experts. Tech companies are pushing the boundaries of data and AI to accelerate innovation, optimize operations and build collaborative ecosystems. In this session, we’ll explore how unified data platforms empower organizations to scale their impact, democratize analytics across teams and foster openness for building tomorrow’s products. Key topics include:
- Scaling data platforms to support real-time analytics and AI-driven decision-making
- Democratizing access to data while maintaining robust governance and security
- Harnessing openness and portability to enable seamless collaboration with partners and customers
After the session, connect with your peers during the exclusive Industry Forum Happy Hour. Reserve your seat today!

De-Risking Investment Decisions: QCG's Smarter Deal Evaluation Process Leveraging Databricks

Quantum Capital Group (QCG) screens hundreds of deals across the global Sustainable Energy Ecosystem, requiring deep technical due diligence. With over 1.5 billion records sourced from public, premium and proprietary datasets, their challenge was how to efficiently curate, analyze and share this data to drive smarter investment decisions. QCG partnered with Databricks & Tiger Analytics to modernize its data landscape. Using Delta tables, Spark SQL, and Unity Catalog, the team built a golden dataset that powers proprietary evaluation models and automates complex workflows. Data is now seamlessly curated, enriched and distributed — both internally and to external stakeholders — in a secure, governed and scalable way. This session explores how QCG’s investment in data intelligence has turned an overwhelming volume of information into a competitive advantage, transforming deal evaluation into a faster, more strategic process.

Site to Insight: Powering Construction Analytics Through Delta Sharing

At Procore, we're transforming the construction industry through innovative data solutions. This session unveils how we've supercharged our analytics offerings using a unified lakehouse architecture and Delta Sharing, delivering game-changing results for our customers and our business, and showing how data professionals can unlock the full potential of their data assets and drive meaningful business outcomes. Key highlights:
- Learn how we've implemented seamless, secure sharing of large datasets across various BI tools and programming languages, dramatically accelerating time-to-insights for our customers
- Discover our approach to sharing dynamically filtered subsets of data across our numerous customers with cross-platform view sharing
- We'll demonstrate how our architecture has eliminated the need for data replication, fostering a more efficient, collaborative data ecosystem
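Sharing "dynamically filtered subsets" per customer amounts to applying a view-style predicate over one dataset. A minimal illustration (not Procore's implementation; the row shape is invented):

```python
# Illustrative sketch: each customer sees only its own slice of a single
# shared dataset, mimicking a dynamically filtered shared view.
def customer_view(rows, customer_id):
    """Rows visible to `customer_id` under a per-customer filter predicate."""
    return [r for r in rows if r["customer_id"] == customer_id]

rows = [
    {"customer_id": "acme", "project": "Tower A", "spend": 1.2e6},
    {"customer_id": "acme", "project": "Bridge", "spend": 0.8e6},
    {"customer_id": "zenith", "project": "Mall", "spend": 2.0e6},
]
print(customer_view(rows, "acme"))
```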

Sponsored by: Impetus | Supercharge AI With Automated Migration to Databricks With Impetus

Migrating legacy workloads to a modern, scalable platform like Databricks can be complex and resource-intensive. Impetus, an Elite Databricks Partner and the Databricks Migration Partner of the Year 2024, simplifies this journey with LeapLogic, an automated solution for data platform modernization and migration services. LeapLogic intelligently discovers, transforms, and optimizes workloads for Databricks, ensuring minimal risk and faster time-to-value. In this session, we’ll showcase real-world success stories of enterprises that have leveraged Impetus’ LeapLogic to modernize their data ecosystems efficiently. Join us to explore how you can accelerate your migration journey, unlock actionable insights, and future-proof your analytics with a seamless transition to Databricks.

Sponsored by: Informatica | Extending Unity Catalog to Govern the Data Estate With Informatica Cloud Data Governance & Catalog

Join this 20-minute session to learn how Informatica CDGC integrates with and leverages Unity Catalog metadata to provide end-to-end governance and security across an enterprise data landscape. Topics covered will include:
- Comprehensive data lineage that provides complete data transformation visibility across multicloud and hybrid environments
- Broad data source support to facilitate holistic cataloging and a centralized governance framework
- Centralized access policy management and data stewardship to enable compliance with regulatory standards
- Rich data quality to ensure data is cleansed, validated and trusted for analytics and AI

From Datavault to Delta Lake: Streamlining Data Sync with Lakeflow Connect

In this session, we will explore the Australian Red Cross Lifeblood's approach to synchronizing an Azure SQL Datavault 2.0 (DV2.0) implementation with Unity Catalog (UC) using Lakeflow Connect. Lifeblood's DV2.0 data warehouse, which includes raw vault (RV) and business vault (BV) tables, as well as information marts defined as views, required a multi-step process to achieve data/business logic sync with UC. This involved using Lakeflow Connect to ingest RV and BV data, followed by a custom process utilizing JDBC to ingest view definitions, and the automated/manual conversion of T-SQL to Databricks SQL views, with Lakehouse Monitoring for validation. In this talk, we will share our journey, the design decisions we made, and how the resulting solution now supports analytics workloads, analysts, and data scientists at Lifeblood.
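The automated T-SQL to Databricks SQL conversion mentioned above can be sketched as a small rule-based rewriter. The rules shown are a tiny illustrative subset (function renames and identifier quoting), not Lifeblood's full conversion logic:

```python
# Hedged sketch of rule-based T-SQL -> Databricks SQL view conversion.
import re

RULES = [
    (re.compile(r"\bGETDATE\(\)", re.I), "current_timestamp()"),
    (re.compile(r"\bISNULL\(", re.I), "coalesce("),
    (re.compile(r"\[([^\]]+)\]"), r"`\1`"),      # [ident] -> `ident`
    (re.compile(r"\bTOP\s+\d+\b", re.I), ""),    # TOP n handled via LIMIT
]

def convert(tsql: str) -> str:
    """Apply each rewrite rule, then normalize whitespace."""
    out = tsql
    for pattern, repl in RULES:
        out = pattern.sub(repl, out)
    return re.sub(r"\s+", " ", out).strip()

view = "SELECT [Id], ISNULL([Name], 'n/a') AS Name, GETDATE() AS loaded_at FROM [bv].[Donor]"
print(convert(view))
```

In practice the automated pass handles the mechanical rewrites and the remainder is converted manually, with Lakehouse Monitoring used for validation as the session describes.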