talk-data.com

Topic: Databricks

Tags: big_data, analytics, spark

1286 tagged activities

Activity Trend (2020-Q1 to 2026-Q1): peak of 515 activities per quarter

Activities

1286 activities · Newest first

Sponsored by: DataNimbus | Building an AI Platform in 30 Days and Shaping the Future with Databricks

Join us as we dive into how Turnpoint Services, in collaboration with DataNimbus, built an Intelligence Platform on Databricks in just 30 days. We'll explore features like MLflow, LLMs, MLOps, Model Registry, Unity Catalog & Dashboard Alerts that powered AI applications such as Demand Forecasting, Customer 360 & Review Automation. Turnpoint's transformation enabled data-driven decisions, operational efficiency & a better customer experience. Building a modern data foundation on Databricks optimizes resource allocation & drives engagement. We'll also introduce innovations in DataNimbus Designer, including AI Blocks: modular, prompt-driven smart transformers for text data, built visually & deployed directly within Databricks. These capabilities push the boundaries of what's possible on the Databricks platform. Attendees will gain practical insights, whether you're beginning your AI journey or looking to accelerate it.

Sponsored by: Genpact | Powering Change at GE Vernova: Inside One of the World’s Largest Databricks Migrations

How do you transform legacy data into a launchpad for next-gen innovation? GE Vernova is tackling it by rapidly migrating from outdated platforms to Databricks, building one of the world's largest cloud data implementations. This overhaul wasn't optional. Scaling AI, cutting technical debt, and slashing license costs demanded a bold, accelerated approach. Led by strategic decisions from the CDO and powered by Genpact's AI Gigafactory, the migration spans 35+ business domains and sub-domains, 60,000+ data objects, 15,000+ jobs and 3,000+ reports from 120+ diverse data sources to deliver a multi-tenant platform with unified governance. The anticipated results? Faster insights, seamless data sharing, and a standardized platform built for AI at scale. This session explores how Genpact and Databricks are fueling GE Vernova's mission to deliver The Energy to Change the World—and what it takes to get there when speed, scale, and complexity are non-negotiable.

Sponsored by: Google Cloud | Building Powerful Agentic Ecosystems with Google Cloud's A2A

This session unveils Google Cloud's Agent2Agent (A2A) protocol, ushering in a new era of AI interoperability where diverse agents collaborate seamlessly to solve complex enterprise challenges. Join our panel of experts to discover how A2A empowers you to deeply integrate these collaborative AI systems with your existing enterprise data, custom APIs, and critical workflows. Ultimately, learn to build more powerful, versatile, and securely managed agentic ecosystems by combining specialized Google-built agents with your own custom solutions (Vertex AI or no-code). Extend this ecosystem further by serving these agents with Databricks Model Serving and governing them with Unity Catalog for consistent security and management across your enterprise.

Sponsored by: Informatica | Modernize analytics and empower AI in Databricks with trusted data using Informatica

As enterprises continue their journey to the cloud, data warehouse and data management modernization is essential to optimize analytics and drive business outcomes. Minimizing modernization timelines is important for reducing risk and shortening time to value – and ensuring enterprise data is clean, curated and governed is imperative to enable analytics and AI initiatives. In this session, learn how Informatica's Intelligent Data Management Cloud (IDMC) empowers analytics and AI on Databricks by helping data teams:
· Develop no-code/low-code data pipelines that ingest, transform and clean data at enterprise scale
· Improve data quality and extend enterprise governance with Informatica Cloud Data Governance and Catalog (CDGC) and Unity Catalog
· Accelerate pilot-to-production with Mosaic AI

Tech Industry Session: Optimizing Costs and Controls to Democratize Data and AI

Join us for this session focused on how leading tech companies are enabling data intelligence across their organizations while maintaining cost efficiency and governance. Hear about the successes and the challenges when Databricks empowers thousands of users—from engineers to business teams—by providing scalable tools for AI, BI and analytics. Topics include:
· Combining AI/BI and Lakehouse Apps to streamline workflows and accelerate insights
· Implementing system tables, tagging and governance frameworks for granular control
· Democratizing data access while optimizing costs for large-scale analytical workloads
Hear from customers and Databricks experts, followed by a customer panel featuring industry leaders. Gain insights into how Databricks helps tech innovators scale their platforms while maintaining operational excellence.
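To make the system-tables point concrete, here is a minimal sketch of the kind of cost-attribution query such governance frameworks rely on. It assumes the Databricks billing system tables (system.billing.usage) are enabled in your workspace and that workloads carry a custom "team" tag; the tag key and the 30-day window are illustrative, not part of the session.

```python
# Hypothetical cost-attribution query against Databricks system tables.
# Assumes system.billing.usage is enabled and custom_tags carries a "team" tag.
# Run in a Databricks notebook, where `spark` and `display` are provided.
from pyspark.sql import functions as F

usage = spark.table("system.billing.usage")

dbus_by_team = (
    usage
    .where(F.col("usage_date") >= F.date_sub(F.current_date(), 30))
    .withColumn("team", F.col("custom_tags").getItem("team"))
    .groupBy("team", "sku_name")
    .agg(F.sum("usage_quantity").alias("dbus_last_30_days"))
    .orderBy(F.desc("dbus_last_30_days"))
)

display(dbus_by_team)
```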

Telco Reimagined: Real-World Journeys in Data and AI for Customer Experience Transformation

How are today's leading telecom operators transforming customer experience at scale with data and AI? Join us for an inspiring fireside chat with senior leaders from Optus, Plume and AT&T as they share their transformation stories — from the first steps to major milestones and the tangible business impact achieved with Databricks' Data Intelligence Platform. You'll hear firsthand how these forward-thinking CSPs are driving measurable outcomes through unified data, machine learning and AI. Discover the high-impact use cases they're prioritizing — like proactive care and hyper-personalization — and gain insight into their bold vision for the future of customer experience in telecom. Whether you're just beginning your AI journey or scaling to new heights, this session offers an authentic look at what's working, what's next and how data and AI are helping telecoms lead in a competitive landscape.

Unity Catalog Lakeguard: Secure and Efficient Compute for Your Enterprise

Modern data workloads span multiple sources — data lakes, databases, apps like Salesforce and services like cloud functions. But as teams scale, secure data access and governance across shared compute become critical. In this session, learn how to confidently integrate external data and services into your workloads using Spark and Unity Catalog on Databricks. We'll explore compute options like serverless, clusters, workflows and SQL warehouses, and show how Unity Catalog's Lakeguard enforces fine-grained governance — even when compute is shared concurrently by multiple users. Walk away ready to choose the right compute model for your team's needs — without sacrificing security or efficiency.
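As an illustration of the per-user governance Lakeguard enforces on shared compute, below is a minimal sketch of Unity Catalog grants; the catalog, schema, table and group names are hypothetical, not part of the session.

```python
# A minimal sketch of Unity Catalog grants that Lakeguard enforces per user
# on shared compute. Names below (main, sales, orders, data_analysts) are
# hypothetical. Run from a Databricks notebook, where `spark` is provided.
grants = [
    "GRANT USE CATALOG ON CATALOG main TO `data_analysts`",
    "GRANT USE SCHEMA ON SCHEMA main.sales TO `data_analysts`",
    "GRANT SELECT ON TABLE main.sales.orders TO `data_analysts`",
]

for stmt in grants:
    spark.sql(stmt)

# On a shared (multi-user) cluster or serverless compute, user code is
# isolated so these permissions are evaluated per user, not per cluster.
```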

What’s New with Databricks Assistant: From Exploration to Production

Databricks Assistant helps you get from initial exploration all the way to production faster and easier than ever. In this session, we'll show you how Assistant simplifies and accelerates common workflows, boosting your productivity across notebooks and the SQL editor. You'll get practical tips, see end-to-end examples in action, and hear about the latest capabilities we're excited about. We'll also discuss how we're continually improving Assistant to make your development experience faster, more contextual and more customizable. Join us to discover how to get the most out of Databricks Assistant and empower your team to build better and faster.

Accelerating Data Transformation: Best Practices for Governance, Agility and Innovation

In this session, we will share NCS’s approach to implementing a Databricks Lakehouse architecture, focusing on key lessons learned and best practices from our recent implementations. By integrating Databricks SQL Warehouse, the DBT Transform framework and our innovative test automation framework, we’ve optimized performance and scalability, while ensuring data quality. We’ll dive into how Unity Catalog enabled robust data governance, empowering business units with self-serve analytical workspaces to create insights while maintaining control. Through the use of solution accelerators, rapid environment deployment and pattern-driven ELT frameworks, we’ve fast-tracked time-to-value and fostered a culture of innovation. Attendees will gain valuable insights into accelerating data transformation, governance and scaling analytics with Databricks.

Inscape Smart TV Data: Unlocking Consumption and Competitive Intelligence

With VIZIO's Inscape viewership data now available in the Databricks Marketplace, our expansive dataset has never been easier to access. With real-time availability, flexible integrations, and secure, governed sharing, it's built for action. Join our team as we explore the full depth of this comprehensive data across both linear and streaming TV - showcasing real-world use cases like measuring the incremental reach of streaming or matching to 1st/3rd-party data for ROI analyses. We will also walk through a share-of-voice competitive intelligence analysis and outline the steps to put it into practice. This session will show you how to turn Inscape data into a strategic advantage.

Reducing Transaction Conflicts in Databricks—Fundamentals and Applications at Asana

When using ACID-guaranteed transactions on Databricks concurrently, we can run into transaction conflicts. This talk discusses the basics of concurrent transaction functionality in Databricks—what happens when various combinations of INSERT, UPDATE and MERGE INTO happen concurrently. We discuss how table isolation level, partitioning and deletion vectors affect this. We also mention how Asana used an intermediate blind append stage to support several hundred concurrent transaction updates into the same table.
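For readers who want to see the pattern in code, below is a minimal sketch of the blind-append staging approach described above, assuming Delta tables on Databricks; the table names, schema and merge key are hypothetical, not Asana's actual pipeline.

```python
# A minimal sketch of the "blind append" staging pattern, assuming Delta
# tables on Databricks. Table names and the merge key are hypothetical.
# Run from a Databricks notebook, where `spark` is provided.

# Writers that only append (no reads of existing data, no MERGE condition)
# rarely conflict with each other, so many concurrent jobs can safely land
# rows in an intermediate append-only staging table.
updates_df = spark.table("staging.incoming_batch")
(updates_df.write
    .format("delta")
    .mode("append")          # blind append: compatible with concurrent appends
    .saveAsTable("staging.events_append_only"))

# A single downstream job then periodically MERGEs staged rows into the
# final table, so only one writer takes the conflict-prone path.
spark.sql("""
    MERGE INTO prod.events AS t
    USING staging.events_append_only AS s
    ON t.event_id = s.event_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")

# Isolation level is a Delta table property; WriteSerializable (the default)
# lets blind appends reorder relative to other writes, reducing conflicts.
spark.sql("""
    ALTER TABLE prod.events
    SET TBLPROPERTIES ('delta.isolationLevel' = 'WriteSerializable')
""")
```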

Sponsored by: Accenture & Avanade | Reinventing State Services with Databricks: AI-Driven Innovations in Health and Transportation

One of the largest and most trailblazing U.S. states is setting a new standard for how governments can harness data and AI to drive large-scale impact. In this session, we will explore how we are using the Databricks Data Intelligence Platform to address two of the state's most pressing challenges: public health and transportation. From vaccine tracking powered by intelligent record linkage and a service-oriented analytics architecture, to Gen AI-driven insights that reduce traffic fatalities and optimize infrastructure investments, this session reveals how scalable, secure, and real-time data solutions are transforming state operations. Join us to learn how data-driven governance is delivering better outcomes for millions—and paving the way for an AI-enabled, data-driven and more responsive government.

Sponsored by: Atlan | Domain-driven Data Governance in the AI Era: A Conversation with General Motors and Atlan

Now the largest automaker in the United States, selling more than 2.7 million vehicles in 2024, General Motors is setting a bold vision for its future, with Software-defined vehicles and AI as a driving force. With data as a crucial asset, a transformation of this scale calls for a modern approach to Data Governance. Join Sherri Adame, Enterprise Data Governance Leader at General Motors, to learn about GM’s novel governance approach, supported by technologies like Atlan and Databricks. Hear how Sherri and her team are shifting governance to the left with automation, implementing data contracts, and accelerating data product discovery across domains, creating a cultural shift that emphasizes data as a competitive advantage.

Sponsored by: Hexaware | Global Data at Scale: Powering Front Office Transformation with Databricks

Join KPMG for an engaging session on how we transformed our data platform and built a cutting-edge Global Data Store (GDS)—a game-changing data hub for our Front Office Transformation (FOT). Discover how we seamlessly unified data from various member firms, turning it into a dynamic engine that enables our business to leverage the Front Office ecosystem for smarter analytics and decision-making. Learn about our unique approach that rapidly integrates diverse datasets into the GDS, and about our hub-and-spoke model, which connects member firms' data lakes and enables secure, high-speed collaboration via Delta Sharing. Hear how we are leveraging Unity Catalog to help ensure data governance, compliance, and straightforward data lineage. We'll share strategies for risk management, security (fine-grained access, encryption), and scaling a cloud-based data ecosystem.

Sponsored by: Tiger Analytics | Data-Driven Transformation to Hypercharge Predictive and Diagnostic Supply Chain Intelligence

Manufacturers today need efficient, accurate, and flexible integrated planning across supply, demand, and finance. A leading industrial manufacturer is pursuing a competitive edge in Integrated Business Planning through data and AI. Their strategy: a connected, real-time data foundation with democratized access across silos. Using Databricks, we're building business-centric data products to enable near real-time, collaborative decisions and scaled AI. Unity Catalog ensures data reliability and adoption. Increased data visibility is driving better on-time delivery, inventory optimization, and forecasting, resulting in measurable financial impact. In this session, we'll share our journey to the north star of "driving from the windshield, not the rearview," including key data, organization, and process challenges in enabling data democratization; architectural choices for Integrated Business Planning as a data product; and core capabilities delivered with Tiger's Accelerator.

Achieving AI Success with a Solid Data Foundation

Join us for an insightful presentation on creating a robust data architecture to drive business outcomes in the age of Generative AI. Santosh Kudva, GE Vernova Chief Data Officer, and Kevin Tollison, EY AI Consulting Partner, will share their expertise on transforming data strategies to unleash the full potential of AI. Learn how GE Vernova, a dynamic enterprise born from the 2024 spin-off of GE, revamped its diverse data landscape. They will provide a look into how they integrated the pre-spin-off Finance Data Platform into the GE Vernova Enterprise Data & Analytics ecosystem utilizing Databricks to enable high-performance, AI-led analytics. Key insights include:
· Incorporating Generative AI into your overarching strategy
· Leveraging comprehensive analytics to enhance data quality
· Building a resilient data framework adaptable to continuous evolution
Don't miss this opportunity to hear from industry leaders and gain valuable insights to elevate your data strategy and AI success.

This course will introduce you to AI agents, their transformative impact on organizations, and how Databricks enables the creation of AI agent systems. We'll begin by exploring what AI agents are, how they differ from traditional AI systems, and why they are becoming essential in today's data-driven landscape. Next, we'll examine how AI agents can be used to automate tasks, enhance decision-making, and unlock new efficiencies for businesses of all sizes. Finally, we'll review real-world examples of AI agent systems on Databricks, showcasing practical applications across industries and sharing key considerations for successful adoption. On completion, you can pass a short quiz and earn a badge to validate your learning.

Building a Self-Service Data Platform With a Small Data Team

Discover how Dodo Brands, a global pizza and coffee business with over 1,200 retail locations and 40k employees, revolutionized their analytics infrastructure by creating a self-service data platform. This session explores their approach to empowering analysts, data scientists and ML engineers to independently build analytical pipelines with minimal involvement from data engineers. By leveraging Databricks as the backbone of their platform, the team developed automated tools like a "job-generator" that uses Jinja templates to streamline the creation of data jobs. This approach minimized manual coding and enabled non-data engineers to create over 1,420 data jobs, 90% of which were auto-generated from user configurations, while supporting thousands of weekly active users via tools like Apache Superset. This session provides actionable insights for organizations seeking to scale their analytics capabilities efficiently without expanding their data engineering teams.
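To illustrate the idea, here is a hypothetical sketch of a Jinja-based job-generator that renders a Databricks job definition from a small user-supplied config; the template fields and config keys are illustrative, not Dodo Brands' actual implementation.

```python
# Hypothetical sketch of a Jinja-based "job-generator": it renders a
# Databricks job spec from a small user config. Field names and the config
# shown here are illustrative only.
import json
from jinja2 import Template

JOB_TEMPLATE = Template("""
{
  "name": "{{ job_name }}",
  "tasks": [
    {
      "task_key": "run_{{ job_name }}",
      "notebook_task": {
        "notebook_path": "{{ notebook_path }}",
        "base_parameters": {"target_table": "{{ target_table }}"}
      }
    }
  ],
  "schedule": {
    "quartz_cron_expression": "{{ cron }}",
    "timezone_id": "UTC"
  }
}
""")

# An analyst supplies only this config; the generator produces the job spec.
user_config = {
    "job_name": "daily_sales_rollup",
    "notebook_path": "/Shared/pipelines/sales_rollup",
    "target_table": "analytics.daily_sales",
    "cron": "0 0 3 * * ?",
}

job_spec = json.loads(JOB_TEMPLATE.render(**user_config))
print(json.dumps(job_spec, indent=2))
# The rendered spec could then be submitted via the Databricks Jobs API.
```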