talk-data.com

Topic: Cloud Computing

Tags: infrastructure, saas, iaas

86 tagged activities

Activity Trend: 471 peak/qtr (2020-Q1 to 2026-Q1)

Activities

Showing filtered results (filtering by: Data + AI Summit 2025)
Streamlining AI Application Development With Databricks Apps

Think Databricks is just for data and models? Think again. In this session, you’ll see how to build and scale a full-stack AI app capable of handling thousands of queries per second entirely on Databricks. No extra cloud platforms, no patchwork infrastructure. Just one unified platform with native hosting, LLM integration, secure access, and built-in CI/CD. Learn how Databricks Apps, along with services like Model Serving, Jobs, and Gateways, streamline your architecture, eliminate boilerplate, and accelerate development, from prototype to production.
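
For readers curious what the serving-layer piece of such an app can look like, here is a minimal hedged sketch that calls a Databricks Model Serving endpoint over its REST invocations path; the workspace URL, the endpoint name my-chat-endpoint, and the environment-variable token handling are illustrative assumptions rather than details from the session.

```python
import os
import requests

# Assumed workspace host and endpoint name -- replace with your own.
DATABRICKS_HOST = "https://my-workspace.cloud.databricks.com"
ENDPOINT_NAME = "my-chat-endpoint"

def query_endpoint(prompt: str) -> dict:
    """Send a single chat-style request to a Model Serving endpoint."""
    resp = requests.post(
        f"{DATABRICKS_HOST}/serving-endpoints/{ENDPOINT_NAME}/invocations",
        headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
        json={"messages": [{"role": "user", "content": prompt}]},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(query_endpoint("Summarize last week's support tickets."))
```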

SAP and Databricks: Building Your Lakehouse Reference Architecture

SAP is the world's 3rd-largest publicly traded software company by revenue, and recently launched the joint SAP-Databricks "Business Data Cloud". See how it all works from a practitioner's perspective, including reference architecture, demo, and example customers. See firsthand how the powerful suite of SAP applications benefits from a joint Databricks solution - with data being more easily governed, discovered, shared, and used for AI/ML.

Deploying Unity Catalog OSS on Kubernetes: Simplifying Infrastructure Management

In modern data infrastructure, efficient and scalable data governance is essential for ensuring security, compliance, and accessibility. This session explores how to deploy Unity Catalog OSS on Kubernetes, leveraging its cloud-agnostic nature and efficient resource management. Helm streamlines Unity Catalog deployment with a simplified installation process and straightforward configuration and credentials management. The session will cover why Kubernetes is the ideal platform, provide a technical breakdown of Unity Catalog on Kubernetes, and include a live showcase of its seamless deployment process. By the end, participants will confidently configure and deploy Unity Catalog OSS in their preferred Kubernetes environment and integrate it into their existing infrastructure.
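
As a hedged companion to the deployment walkthrough, the sketch below assumes the Unity Catalog OSS service has been port-forwarded locally (for example with kubectl port-forward) and lists registered catalogs through the server's REST API to verify the install; the service name, port, and API path follow the project's defaults at the time of writing and should be checked against your Helm chart values.

```python
import requests

# Assumes the Unity Catalog OSS service is reachable locally, e.g. after:
#   kubectl port-forward svc/unitycatalog 8080:8080
UC_BASE_URL = "http://localhost:8080/api/2.1/unity-catalog"

def list_catalogs() -> list[dict]:
    """Return the catalogs registered in the Unity Catalog OSS server."""
    resp = requests.get(f"{UC_BASE_URL}/catalogs", timeout=10)
    resp.raise_for_status()
    return resp.json().get("catalogs", [])

if __name__ == "__main__":
    for catalog in list_catalogs():
        print(catalog["name"])
```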

This hands-on lab guides participants through the complete customer data analytics journey on Databricks, leveraging leading partner solutions - Fivetran, dbt Cloud, and Sigma. Attendees will learn how to:
- Seamlessly connect to Fivetran, dbt Cloud, and Sigma using Databricks Partner Connect
- Ingest data using Fivetran, transform and model data with dbt Cloud, and create interactive dashboards in Sigma, all on top of the Databricks Data Intelligence Platform
- Empower teams to make faster, data-driven decisions by streamlining the entire analytics workflow using an integrated, scalable, and user-friendly platform

Most organizations run complex cloud data architectures that silo applications, users and data. Join this interactive hands-on workshop to learn how Databricks SQL allows you to operate a multi-cloud lakehouse architecture that delivers data warehouse performance at data lake economics, with up to 12x better price/performance than traditional cloud data warehouses. Here's what we'll cover:
- How Databricks SQL fits in the Data Intelligence Platform, enabling you to operate a multicloud lakehouse architecture that delivers data warehouse performance at data lake economics
- How to manage and monitor compute resources, data access and users across your lakehouse infrastructure
- How to query directly on your data lake using your tools of choice or the built-in SQL editor and visualizations
- How to use AI to increase productivity when querying, completing code or building dashboards

Ask your questions during this hands-on lab, and the Databricks experts will guide you.
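
To give a feel for the "query directly on your data lake" step outside the workshop, here is a minimal sketch using the databricks-sql-connector Python package against a SQL warehouse; the hostname, HTTP path, and sample table are placeholders you would replace with your own warehouse's connection details.

```python
import os
from databricks import sql  # pip install databricks-sql-connector

# Connection details come from the SQL warehouse's "Connection details" tab.
with sql.connect(
    server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
    http_path=os.environ["DATABRICKS_HTTP_PATH"],
    access_token=os.environ["DATABRICKS_TOKEN"],
) as connection:
    with connection.cursor() as cursor:
        # Placeholder table -- substitute a table you can read.
        cursor.execute("SELECT * FROM samples.nyctaxi.trips LIMIT 5")
        for row in cursor.fetchall():
            print(row)
```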

Lessons Learned: Building a Scalable Game Analytics Platform at Netflix

Over the past three years, Netflix has built a catalog of 100+ mobile and cloud games across TV, mobile and web platforms. With both internal and external studios contributing to this diverse ecosystem, building a robust game analytics platform became crucial for gaining insights into player behavior, optimizing game performance and driving member engagement. In this talk, we'll share our journey of building Netflix's Game Analytics platform from the ground up. We'll highlight key decisions around data strategy, such as whether to develop an in-house solution or adopt an external service. We'll discuss the challenges of balancing developer autonomy with data integrity and the complexities of managing data contracts for custom game telemetry, with an emphasis on self-service analytics. Attendees will learn how the Games Data team navigated these challenges, the lessons learned and the trade-offs involved in building a multi-tenant data ecosystem that supports diverse stakeholders.

Sponsored by: Google Cloud | Unlock price-performance and efficiency on Google Cloud: Databricks & Axion in Action

Maximize the performance of your Databricks Platform with innovations on Google Cloud. Discover how Google's Arm-based Axion C4A virtual machines (VMs) deliver breakthrough price-performance and efficiency for Databricks, supercharging the Databricks Photon engine. Gain actionable strategies to optimize your Databricks deployments on Google Cloud.

Unity Catalog Deep Dive: Practitioner's Guide to Best Practices and Patterns

Join this deep dive session for practitioners on Unity Catalog, Databricks' unified data governance solution, to explore its capabilities for managing data and AI assets across workflows. Unity Catalog provides fine-grained access control, automated lineage tracking, quality monitoring, policy enforcement, and observability at scale. Whether your focus is data pipelines, analytics or machine learning and generative AI workflows, this session offers actionable insights on leveraging Unity Catalog's open interoperability across tools and platforms to boost productivity and drive innovation. Learn governance best practices, including catalog configurations, access strategies for collaboration and controls for securing sensitive data. Additionally, discover how to design effective multi-cloud and multi-region deployments to ensure global compliance.
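
To make the access-strategy discussion concrete, the hedged sketch below shows the general shape of Unity Catalog privilege grants issued from a notebook; the catalog, schema, table, and group names are invented for illustration and are not drawn from the session.

```python
# Illustrative Unity Catalog grants, run from a Databricks notebook where
# `spark` is already defined. Object and principal names are placeholders.
statements = [
    "GRANT USE CATALOG ON CATALOG main TO `data-analysts`",
    "GRANT USE SCHEMA ON SCHEMA main.sales TO `data-analysts`",
    "GRANT SELECT ON TABLE main.sales.orders TO `data-analysts`",
]

for stmt in statements:
    spark.sql(stmt)  # `spark` exists only inside Databricks notebooks/jobs
    print(f"Applied: {stmt}")
```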

Sponsored by: Domo, Inc | Enabling AI-Powered Business Solutions w/Databricks & Domo

Domo's Databricks integration seamlessly connects business users to both Delta Lake data and AI/ML models, eliminating technical barriers while maximizing performance. Domo's Cloud Amplifier optimizes data processing through pushdown SQL, while the Domo AI Services layer enables anyone to leverage both traditional ML and large language models directly from Domo. During this session, we’ll explore an AI solution around fraud detection to demonstrate the power of leveraging Domo on Databricks.

Unity Catalog Implementation & Evolution at Edward Jones

This presentation outlines the evolution of Databricks and its integration with cloud analytics at Edward Jones. It focuses on the transition from Cloud V1.x to Cloud V2.0, highlighting the challenges faced with the initial setup and Unity Catalog implementation, and the improvements planned for the future, particularly in terms of data cataloging, architecture and disaster recovery. Highlights:
- Cloud analytics journey
- Current setup (Cloud V1.x): Medallion architecture customized to Edward Jones' needs; challenges and limitations identified with integration, limited catalogs, disaster recovery, etc.
- Cloud V2.0 enhancements: modifications to storage and compute in the Medallion layers, next-level integration with enterprise suites, and disaster recovery readiness
- Future outlook

Breaking Silos: Using SAP Business Data Cloud and Delta Sharing for Seamless Access to SAP Data in Databricks

We're excited to share with you how SAP Business Data Cloud supports Delta Sharing to share SAP data securely and seamlessly with Databricks—no complex ETL or data duplication required. This enables organizations to securely share SAP data for analytics and AI in Databricks while also supporting bidirectional data sharing back to SAP. In this session, we'll demonstrate the integration in action, followed by a discussion of how the global beauty group, Natura, will leverage this solution. Whether you're looking to bring SAP data into Databricks for advanced analytics or build AI models on top of trusted SAP datasets, this session will show you how to get started — securely and efficiently.
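
As a hedged illustration of what the consuming side can look like, this sketch reads a shared table with the open-source delta-sharing Python client; the profile file and the share, schema, and table names are placeholders rather than actual SAP Business Data Cloud objects.

```python
import delta_sharing  # pip install delta-sharing

# The .share profile file comes from the data provider and contains the
# sharing server endpoint plus a bearer token.
profile_file = "config.share"

# Placeholder coordinates: <share>.<schema>.<table>
table_url = f"{profile_file}#sap_share.finance.gl_line_items"

# List everything the provider has shared with us.
client = delta_sharing.SharingClient(profile_file)
for table in client.list_all_tables():
    print(table)

# Load one shared table into a pandas DataFrame -- no ETL or copies.
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```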

Cross-Cloud Data Mesh with Delta Sharing and UniForm in Mercedes-Benz

In this presentation, we'll show how we achieved a unified development experience for teams working on Mercedes-Benz Data Platforms in AWS and Azure. We will demonstrate how we implemented Azure to AWS and AWS to Azure data product sharing (using Delta Sharing and Cloud Tokens), integration with AWS Glue Iceberg tables through UniForm and automation to drive everything using Azure DevOps Pipelines and DABs. We will also show how to monitor and track cloud egress costs and how we present a consolidated view of all the data products and relevant cost information. The end goal is to show how customers can offer the same user experience to their engineers and not have to worry about which cloud or region the Data Product lives in. Instead, they can enroll in the data product through self-service and have it available to them in minutes, regardless of where it originates.
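
For context on the UniForm piece, here is a hedged sketch of enabling Iceberg reads on an existing Delta table; the table name is a placeholder and the property names reflect current UniForm documentation, so verify them against your Databricks Runtime.

```python
# Run in a Databricks notebook (where `spark` is predefined).
# Table name is a placeholder; properties follow current UniForm docs.
spark.sql("""
    ALTER TABLE main.vehicles.telemetry
    SET TBLPROPERTIES (
        'delta.enableIcebergCompatV2' = 'true',
        'delta.universalFormat.enabledFormats' = 'iceberg'
    )
""")

# After this, engines that speak Iceberg (e.g. AWS Glue) can register and
# read the same underlying table without copying data.
```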

Smashing Silos, Shaping the Future: Data for All in the Next-Gen Ecosystem

A successful data strategy requires the right platform and the ability to empower the broader user community by creating simple, scalable and secure patterns that lower the barrier to entry while ensuring robust data practices. Guided by the belief that everyone is a data person, we focus on breaking down silos, democratizing access and enabling distributed teams to contribute through a federated "data-as-a-product" model. We’ll share the impact and lessons learned in creating a single source of truth on Unity Catalog, consolidated from diverse sources and cloud platforms. We’ll discuss how we streamlined governance with Databricks Apps, Workflows and native capabilities, ensuring compliance without hindering innovation. We’ll also cover how we maximize the value of that catalog by leveraging semantics to enable trustworthy, AI-driven self-service in AI/BI dashboards and downstream apps. Come learn how we built a next-gen data ecosystem that empowers everyone to be a data person.

Sponsored by: Genpact | Powering Change at GE Vernova: Inside One of the World’s Largest Databricks Migrations

How do you transform legacy data into a launchpad for next-gen innovation? GE Vernova is tackling it by rapidly migrating from outdated platforms to Databricks, building one of the world's largest cloud data implementations. This overhaul wasn't optional. Scaling AI, cutting technical debt, and slashing license costs demanded a bold, accelerated approach. Led by strategic decisions from the CDO and powered by Genpact's AI Gigafactory, the migration is tackling 35+ business domains and subdomains, 60,000+ data objects, 15,000+ jobs, and 3,000+ reports from 120+ diverse data sources to deliver a multi-tenant platform with unified governance. The anticipated results? Faster insights, seamless data sharing, and a standardized platform built for AI at scale. This session explores how Genpact and Databricks are fueling GE Vernova's mission to deliver The Energy to Change the World—and what it takes to get there when speed, scale, and complexity are non-negotiable.

Sponsored by: Google Cloud | Building Powerful Agentic Ecosystems with Google Cloud's A2A

This session unveils Google Cloud's Agent2Agent (A2A) protocol, ushering in a new era of AI interoperability where diverse agents collaborate seamlessly to solve complex enterprise challenges. Join our panel of experts to discover how A2A empowers you to deeply integrate these collaborative AI systems with your existing enterprise data, custom APIs, and critical workflows. Ultimately, learn to build more powerful, versatile, and securely managed agentic ecosystems by combining specialized Google-built agents with your own custom solutions (Vertex AI or no-code). Extend this ecosystem further by serving these agents with Databricks Model Serving and governing them with Unity Catalog for consistent security and management across your enterprise.

Sponsored by: Informatica | Modernize analytics and empower AI in Databricks with trusted data using Informatica

As enterprises continue their journey to the cloud, data warehouse and data management modernization is essential to optimize analytics and drive business outcomes. Minimizing modernization timelines is important for reducing risk and shortening time to value – and ensuring enterprise data is clean, curated and governed is imperative to enable analytics and AI initiatives. In this session, learn how Informatica's Intelligent Data Management Cloud (IDMC) empowers analytics and AI on Databricks by helping data teams:
- Develop no-code/low-code data pipelines that ingest, transform and clean data at enterprise scale
- Improve data quality and extend enterprise governance with Informatica Cloud Data Governance and Catalog (CDGC) and Unity Catalog
- Accelerate pilot-to-production with Mosaic AI

Unity Catalog Lakeguard: Secure and Efficient Compute for Your Enterprise

Modern data workloads span multiple sources — data lakes, databases, apps like Salesforce and services like cloud functions. But as teams scale, secure data access and governance across shared compute becomes critical. In this session, learn how to confidently integrate external data and services into your workloads using Spark and Unity Catalog on Databricks. We'll explore compute options like serverless, clusters, workflows and SQL warehouses, and show how Unity Catalog's Lakeguard enforces fine-grained governance — even when compute is shared concurrently by multiple users. Walk away ready to choose the right compute model for your team's needs — without sacrificing security or efficiency.
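
As a rough, non-authoritative sketch of picking a Lakeguard-governed compute model programmatically, the example below creates a shared (user-isolation) cluster with the Databricks Python SDK; the node type, runtime version, and parameter and enum names are assumptions based on a recent databricks-sdk release.

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute

w = WorkspaceClient()  # reads host/token from the environment or CLI profile

# Assumed node type and runtime version -- pick values valid in your workspace.
cluster = w.clusters.create_and_wait(
    cluster_name="shared-analytics",
    spark_version="15.4.x-scala2.12",
    node_type_id="i3.xlarge",
    num_workers=2,
    autotermination_minutes=30,
    # USER_ISOLATION is the shared access mode governed by Lakeguard.
    data_security_mode=compute.DataSecurityMode.USER_ISOLATION,
)
print(cluster.cluster_id)
```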

Sponsored by: Hexaware | Global Data at Scale: Powering Front Office Transformation with Databricks

Join KPMG for an engaging session on how we transformed our data platform and built a cutting-edge Global Data Store (GDS)—a game-changing data hub for our Front Office Transformation (FOT). Discover how we seamlessly unified data from various member firms, turning it into a dynamic engine that enables our business to leverage the Front Office ecosystem for smarter analytics and decision-making. Learn about our unique approach that rapidly integrates diverse datasets into the GDS and our hub-and-spoke model, connecting member firms' data lakes and enabling secure, high-speed collaboration via Delta Sharing. Hear how we are leveraging Unity Catalog to help ensure data governance, compliance, and straightforward data lineage. We'll share strategies for risk management, security (fine-grained access, encryption), and scaling a cloud-based data ecosystem.