
Topic: Cloud Computing

Tags: infrastructure, saas, iaas

86 tagged activities

Activity Trend: 471 peak/qtr (2020-Q1 to 2026-Q1)

Activities

Showing filtered results

Filtering by: Data + AI Summit 2025
Kill Bill-ing? Revenge is a Dish Best Served Optimized with GenAI

In an era where cloud costs can spiral out of control, Sportsbet achieved a remarkable 49% reduction in Total Cost of Ownership (TCO) through an innovative AI-powered solution called 'Kill Bill.' This presentation reveals how we transformed Databricks' consumption-based pricing model from a challenge into a strategic advantage through intelligent automation and optimization. In this session you will: understand how to use GenAI to reduce Databricks TCO; see how leveraging generative AI within Databricks solutions enables automated analysis of cluster logs, resource consumption, configurations and codebases to produce Spark optimization suggestions; create AI agentic workflows by integrating Databricks' AI tools with its data engineering tools; and review a case study demonstrating how TCO was reduced in practice. Attendees will leave with a clear understanding of how to implement AI within Databricks solutions to address similar cost challenges in their environments.
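By way of illustration only, here is a minimal sketch of the pattern the talk describes: pull a cluster's configuration with the Databricks Python SDK and ask a served LLM for tuning suggestions. The endpoint name, cluster ID and response shape are assumptions, not Sportsbet's actual 'Kill Bill' tooling.

```python
# Hypothetical sketch, not Sportsbet's actual tooling: fetch a cluster
# config via the Databricks SDK and ask a served LLM for cost advice.
from databricks.sdk import WorkspaceClient
from mlflow.deployments import get_deploy_client

w = WorkspaceClient()
cluster = w.clusters.get(cluster_id="<cluster-id>")  # placeholder ID

prompt = (
    "Suggest Spark cost optimizations for this cluster configuration:\n"
    f"node type: {cluster.node_type_id}, workers: {cluster.num_workers}, "
    f"spark conf: {cluster.spark_conf}"
)

llm = get_deploy_client("databricks")
response = llm.predict(
    endpoint="llm-cost-advisor",  # illustrative endpoint name
    inputs={"messages": [{"role": "user", "content": prompt}]},
)
# Response shape assumed to follow the OpenAI-style chat format.
print(response["choices"][0]["message"]["content"])
```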

Sponsored by: Dagster Labs | The Age of AI is Changing Data Engineering for Good

The last major shift in data engineering came during the rise of the cloud, transforming how we store, manage, and analyze data. Today, we stand at the cusp of the next revolution: AI-driven data engineering. This shift promises not just faster pipelines, but a fundamental change in the way data systems are designed and maintained. AI will redefine who builds data infrastructure, automating routine tasks, enabling more teams to contribute to data platforms, and (if done right) freeing up engineers to focus on higher-value work. However, this transformation also brings heightened pressure around governance, risk, and data security, requiring new approaches to control and oversight. For those prepared, this is a moment of immense opportunity – a chance to embrace a future of smarter, faster, and more responsive data systems.

Sponsored by: Redpanda | IoT for Fun & Prophet: Scaling IoT and predicting the future with Redpanda, Iceberg & Prophet

In this talk, we’ll walk through a complete real-time IoT architecture—from an economical, high-powered ESP32 microcontroller publishing environmental sensor data to AWS IoT, through Redpanda Connect into a Redpanda BYOC cluster, and finally into Apache Iceberg for long-term analytical storage. Once the data lands, we’ll query it using Python and perform linear regression with Prophet to forecast future trends. Along the way, we’ll explore the design of a scalable, cloud-native pipeline for streaming IoT data. Whether you're tracking the weather or building the future, this session will help you architect with confidence—and maybe even predict it.
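For a flavor of the forecasting step, here is a minimal Prophet sketch, assuming sensor readings have already been queried out of Iceberg into a DataFrame with the 'ds'/'y' columns Prophet expects; the data below is synthetic.

```python
# Sketch: forecast an IoT metric with Prophet. In the talk's pipeline
# the input would be queried from Iceberg; synthetic data stands in here.
import pandas as pd
from prophet import Prophet

readings = pd.DataFrame({
    "ds": pd.date_range("2025-01-01", periods=96, freq="h"),
    "y": [20 + 3 * abs(12 - (i % 24)) / 12 for i in range(96)],  # fake temps
})

model = Prophet()
model.fit(readings)

# Predict the next 24 hours beyond the observed window.
future = model.make_future_dataframe(periods=24, freq="h")
forecast = model.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```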

Igniting Innovation at Gilead: Convergence of Cloud, Data, AI and Agents

The convergence of cloud, data and AI is revolutionizing the pharmaceutical industry, creating a powerful ecosystem that drives innovation at scale across the entire value chain. At Gilead, teams harness these technologies on a unified cloud, data and AI platform, accelerating business processes in the pre-clinical and clinical stages, enabling smarter manufacturing and commercial processes, and delivering AI initiatives by reusing data products. Gilead will discuss how they have leveraged AWS, Databricks and Data Mesh to manage vast amounts of heterogeneous data. They will also showcase use cases of traditional AI/ML and generative AI, along with a marketplace approach to driving adoption of AI agents, demonstrating how this cloud-based, AI-powered platform is transforming the entire value chain. Gilead will also discuss how they are exploring the future of pharmaceutical innovation through agentic AI, where the synergy of cloud, data and AI is unlocking new possibilities for a healthier world. In the second part, Muddu Sudhakar, founder and investor, will discuss how organizations can build and buy solutions for AI and agents on data platforms. AWS and Databricks provide industry-leading platforms for building agentic AI solutions. We will also cover agentic AI platforms, agent orchestration, agent interoperability, agent guardrails and agentic workflows, along with the challenges in deploying and managing agentic AI platforms. Enterprises need impactful AI initiatives and agents to realize the promise and vision of AI and drive significant ROI.

AI-Powered Profits: Smarter Order and Inventory Management

Join this session to hear from two incredible companies, Xylem and Joby Aviation. Xylem shares their successful journey from fragmented legacy systems to a unified Enterprise Data Platform, demonstrating how they integrated complex ERP data across four business segments to achieve breakthrough improvements in parts management and operational efficiency. Following Xylem's story, learn how Joby Aviation leveraged Databricks to automate and accelerate flight test data checks, cutting processing times from over two hours to under thirty minutes. This session highlights how advanced cloud tools empower engineers to quickly build and run custom data checks, improving both speed and safety in flight test operations.

Databricks in Action: Azure’s Blueprint for Secure and Cost-Effective Operations

Erste Group's transition to Azure Databricks marked a significant upgrade from a legacy system to a secure, scalable and cost-effective cloud platform. The initial architecture, characterized by a complex hub-spoke design and stringent compliance regulations, was replaced with a more efficient solution. The phased migration addressed high network costs and operational inefficiencies, resulting in a 60% reduction in networking costs and a 30% reduction in compute costs for the central team. This transformation, completed over a year, now supports real-time analytics, advanced machine learning and GenAI while ensuring compliance with European regulations. The new platform features Unity Catalog, separate data catalogs and dedicated workspaces, demonstrating a successful shift to a cloud-based machine learning environment with significant improvements in cost, performance and security.

How Navy Federal's Enterprise Data Ecosystem Leverages Unity Catalog for Data + AI Governance

Navy Federal Credit Union has 200+ enterprise data sources in its enterprise data lake. These data assets are used for training 100+ machine learning models and hydrating a semantic layer that serves an average of 4,000 business users daily across the credit union. Previously, the only option for extracting data from the analytic semantic layer was to let consuming applications access it via an already-overloaded cloud data warehouse. Visualizing data lineage for 1,000+ data pipelines and their associated metadata was impossible, and understanding the granular cost of running data pipelines was a challenge. Implementing Unity Catalog opened an alternate path for accessing analytic semantic data from the lake. It also opened the door to removing duplicate data assets stored across multiple lakes, which will save hundreds of thousands of dollars in data engineering effort, compute and storage costs.
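For context, Unity Catalog exposes lineage through system tables, which is what makes questions like "who consumes this source table?" answerable at this scale. A minimal sketch, assuming system tables are enabled and using an illustrative table name:

```python
# Sketch: find downstream consumers of a source table via Unity
# Catalog's lineage system table. Assumes system tables are enabled
# and runs in a Databricks notebook where `spark` is predefined.
downstream = spark.sql("""
    SELECT DISTINCT target_table_full_name, entity_type
    FROM system.access.table_lineage
    WHERE source_table_full_name = 'main.lake.customer_accounts'  -- illustrative
      AND event_time >= current_date() - INTERVAL 30 DAYS
""")
downstream.show(truncate=False)
```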

How to Migrate From Snowflake to Databricks SQL

Migrating your Snowflake data warehouse to the Databricks Data Intelligence Platform can accelerate your data modernization journey. Though a cloud-to-cloud migration should be relatively straightforward, the breadth of the Databricks Platform provides flexibility and therefore requires careful planning and execution. In this session, we present the migration methodology, technical approaches, automation tools, product/feature mapping, a technical demo and best practices, drawing on real-world case studies, for migrating data, ELT pipelines and warehouses from Snowflake to Databricks.
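As one example of the kind of automation such migrations lean on (not necessarily the tooling shown in the session), the open-source sqlglot library can mechanically transpile Snowflake SQL into Databricks SQL:

```python
# Sketch: transpile Snowflake SQL to Databricks SQL with sqlglot
# (pip install sqlglot). Illustrative only; a real migration still
# needs review of semantics, data types and permissions.
import sqlglot

snowflake_sql = """
    SELECT DATEADD(day, -7, CURRENT_TIMESTAMP()) AS since,
           IFF(amount > 100, 'big', 'small') AS bucket
    FROM orders
"""

databricks_sql = sqlglot.transpile(
    snowflake_sql, read="snowflake", write="databricks", pretty=True
)[0]
print(databricks_sql)
```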

Sponsored by: Pantomath | The Shift from 3,000 to 500 BI Reports: A Data Leader’s Guide to Leaner, Smarter Data Operations

Join Sandy Steiger, Head of Advanced Analytics & Automation (formerly at TQL), as she walks through how her team tackled some of the most common and least-talked-about problems in data teams: report bloat, data blind spots and broken trust with the business. You'll learn how TQL went from 3,000 reports to fewer than 500 while gaining better visibility, faster data-issue resolution and cloud agility through practical use of lineage, automated detection and surprising outcomes from implementing Pantomath (an automated data operations platform). Sandy will share how her team identified upstream issues (before Microsoft did), avoided major downstream breakages, and built the credibility every data team needs to earn trust from the business. Walk away with a playbook for using automation to drive smarter, faster decisions across your organization.

Sponsored by: Salesforce | From Data to Action: A Unified and Trusted Approach

Empower AI and agents with trusted data and metadata from an end-to-end unified system. Discover how Salesforce Data Cloud, Agentforce, and Databricks work together to fuel automation, AI, and analytics through a unified data strategy—driving real-time intelligence, enabling zero-copy data sharing, and unlocking scalable activation across the enterprise.

Techcombank's Multi-Million Dollar Transformation Leveraging Cloud and Databricks

The migration to the Databricks Data Intelligence Platform has enabled Techcombank to more efficiently unify data from over 50 systems, improve governance, streamline daily operational analytics pipelines, and use advanced analytics tools and AI to create more meaningful and personalized experiences for customers. With Databricks, Techcombank has also introduced key solutions that are reshaping its digital banking services. AI-driven lead management: Techcombank's internally developed 'Lead Allocation Curated Engine' (LACE) optimizes lead management and provides relationship managers with enriched insights for smarter lead allocation to drive business growth. AI-powered digital banking inclusion for small businesses: GeoSense, an AI-powered program, assists frontline workers with analytics-driven insights about which small businesses and merchants to engage in the bank's digital ecosystem. Further examples will be presented in the session.

Sponsored by: Google Cloud | Powering AI & Analytics: Innovations in Google Cloud Storage for Data Lakes

Enterprise customers need a powerful and adaptable data foundation to navigate the demands of AI and multi-cloud environments. This session dives into how Google Cloud Storage serves as a unified platform for modern analytics data lakes, together with Databricks. Discover key innovations like performance optimizations for Apache Iceberg, Anywhere Cache as the easiest way to colocate storage and compute, Rapid Storage for ultra-low-latency object reads and appends, and Storage Intelligence for vital data insights and recommendations. Learn how you can optimize your infrastructure to unlock the full value of your data for AI-driven success.

A Japanese Mega-Bank’s Journey to a Modern, GenAI-Powered, Governed Data Platform

SMBC, a major Japanese multinational financial services institution, has embarked on an initiative to build a GenAI-powered, modern and well-governed cloud data platform on Azure Databricks. This initiative aims to build an enterprise data foundation encompassing loans, deposits, securities, derivatives and other data domains. Its primary goals are: to decommission legacy data platforms and reduce data sprawl by migrating 20+ core banking systems to a multi-tenant Azure Databricks architecture; to leverage Databricks' Delta Sharing capabilities to address SMBC's unique global footprint and data sharing needs; to govern data by design using Unity Catalog; and to achieve global adoption of the frameworks, accelerators, architecture and tool stack to support similar implementations across EMEA. Deloitte and SMBC leveraged the Brickbuilder asset "Data as a Service for Banking" to accelerate this highly strategic transformation.

Most organizations run complex cloud data architectures that silo applications, users and data. Join this interactive hands-on workshop to learn how Databricks SQL allows you to operate a multi-cloud lakehouse architecture that delivers data warehouse performance at data lake economics, with up to 12x better price/performance than traditional cloud data warehouses. Here's what we'll cover: how Databricks SQL fits in the Data Intelligence Platform; how to manage and monitor compute resources, data access and users across your lakehouse infrastructure; how to query directly on your data lake using your tools of choice or the built-in SQL editor and visualizations, as in the sketch below; and how to use AI to increase productivity when querying, completing code or building dashboards. Ask your questions during this hands-on lab, and the Databricks experts will guide you.
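For the "tools of your choice" path, a minimal sketch of querying a Databricks SQL warehouse from plain Python with the databricks-sql-connector package; hostname, HTTP path and token are placeholders:

```python
# Sketch: query a Databricks SQL warehouse from Python using the
# databricks-sql-connector package. All connection values are
# placeholders for your workspace's own.
from databricks import sql

with sql.connect(
    server_hostname="<workspace-host>.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_schema()")
        print(cursor.fetchall())
```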

Intuit's Privacy-Safe Lending Marketplace: Leveraging Databricks Clean Rooms

Intuit leverages Databricks Clean Rooms to create a secure, privacy-safe lending marketplace, enabling small business lending partners to perform analytics and deploy ML/AI workflows on sensitive data assets. This session explores the technical foundations of building isolated clean rooms across multiple partners and cloud providers, differentiating Databricks Clean Rooms from market alternatives. We'll demonstrate our automated approach to clean room lifecycle management using APIs, covering creation, collaborator onboarding, data asset sharing, workflow orchestration and activity auditing. The integration with Unity Catalog for managing clean room inputs and outputs will also be discussed. Attendees will gain insights into harnessing collaborative ML/AI potential, supporting various languages and workloads, and enabling complex computations without compromising sensitive information in Clean Rooms.
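A rough sketch of what API-driven clean room creation can look like; the endpoint path and payload fields below are assumptions based on the Clean Rooms REST API naming, so consult the API reference for the exact contract:

```python
# Rough sketch of automated clean room creation over REST. The path
# and payload fields are assumptions for illustration; verify against
# the Databricks Clean Rooms API reference before relying on them.
import requests

HOST = "https://<workspace-host>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                       # placeholder

resp = requests.post(
    f"{HOST}/api/2.0/clean-rooms",  # assumed endpoint path
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "name": "lending_partner_room",  # illustrative name
        "remote_detailed_info": {        # assumed payload shape
            "collaborators": [
                {
                    "collaborator_alias": "partner_bank",
                    "global_metastore_id": "<partner-metastore-id>",
                }
            ]
        },
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```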

Mastering Change Data Capture With Lakeflow Declarative Pipelines

Transactional systems are a common source of data for analytics, and Change Data Capture (CDC) offers an efficient way to extract only what’s changed. However, ingesting CDC data into an analytics system comes with challenges, such as handling out-of-order events or maintaining global order across multiple streams. These issues often require complex, stateful stream processing logic. This session will explore how Lakeflow Declarative Pipelines simplifies CDC ingestion using the Apply Changes function. With Apply Changes, global ordering across multiple change feeds is handled automatically; there is no need to manually manage state or understand advanced streaming concepts like watermarks. It supports both snapshot-based inputs from cloud storage and continuous change feeds from systems like message buses, reducing complexity for common streaming use cases.
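A minimal sketch of the Apply Changes pattern in the declarative pipelines Python API; table, key and column names are illustrative:

```python
# Sketch: CDC ingestion with Apply Changes in the declarative
# pipelines Python API. Names and columns are illustrative.
import dlt
from pyspark.sql.functions import expr

# Target streaming table that Apply Changes will keep up to date.
dlt.create_streaming_table("customers")

dlt.apply_changes(
    target="customers",
    source="customers_cdc_feed",   # upstream stream of change records
    keys=["customer_id"],          # primary key used to match rows
    sequence_by="event_ts",        # orders late/out-of-order events
    apply_as_deletes=expr("op = 'DELETE'"),
    stored_as_scd_type=1,          # keep only the latest version per key
)
```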

Sponsored by: LTIMindtree | 4 Strategies to Maximize SAP Data Value with Databricks and AI

As enterprises strive to become more data-driven, SAP continues to be central to their operational backbone. However, traditional SAP ecosystems often limit the potential of AI and advanced analytics due to fragmented architectures and legacy tools. In this session, we explore four strategic options for unlocking greater value from SAP data by integrating with Databricks and cloud-native platforms. Whether you're on ECC or S/4HANA, or transitioning from BW, learn how to modernize your data landscape, enable real-time insights and power AI/ML at scale. Discover how SAP Business Data Cloud and SAP Databricks can help you build a unified, future-ready data and analytics ecosystem without compromising on scalability, flexibility or cost-efficiency.

Stop Guessing, Spend Where It Counts: Data-Driven Decisions for High-Impact Investments on Databricks

Struggling with runaway cloud costs as your organization grows? Join us for an inside look at how Databricks’ own Data Platform team tackled escalating spend in some of the world’s largest workspaces — saving millions of dollars without sacrificing performance or user experience. We’ll share how we harnessed powerful features like System Tables, Workflows, Unity Catalog, and Photon to monitor and optimize resource usage, all while using data-driven decisions to improve efficiency and ensure we invest in the areas that truly drive business impact. You’ll hear about the real-world challenges we faced balancing governance with velocity and discover the custom tooling and best practices we developed to keep costs in check. By the end of this session, you’ll walk away with a proven roadmap for leveraging Databricks to control cloud spend at scale.
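As a taste of the system-tables approach, a minimal sketch that ranks recent DBU consumption, assuming system tables are enabled on the workspace:

```python
# Sketch: rank the last 30 days of DBU consumption by workspace and
# SKU from the billing system table. Assumes system tables are
# enabled; `spark` is predefined in Databricks notebooks.
top_spend = spark.sql("""
    SELECT workspace_id, sku_name, SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= current_date() - INTERVAL 30 DAYS
    GROUP BY workspace_id, sku_name
    ORDER BY dbus DESC
    LIMIT 20
""")
top_spend.show(truncate=False)
```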

Use External Models in Databricks: Connecting to Azure, AWS, Google Cloud, Anthropic and More

In this session you will learn how to leverage a wide set of GenAI models in Databricks, including external connections to cloud vendors and other model providers. We will cover establishing connections to externally served models via Mosaic AI Gateway, showcasing connections to Azure, AWS and Google Cloud models, as well as model vendors like Anthropic, Cohere, AI21 Labs and more. You will also discover best practices for model comparison, governance and cost control on those model deployments.
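A minimal sketch of the documented external-models pattern using the MLflow deployments client; the endpoint name, model name and secret reference are placeholders:

```python
# Sketch: register an Anthropic model as an external serving endpoint,
# then query it. Endpoint/model names and the secret path are
# placeholders; adjust to your workspace.
from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")

client.create_endpoint(
    name="anthropic-chat",  # illustrative endpoint name
    config={
        "served_entities": [{
            "external_model": {
                "name": "claude-3-5-sonnet-20240620",
                "provider": "anthropic",
                "task": "llm/v1/chat",
                "anthropic_config": {
                    # Reference a Databricks secret, never a raw key.
                    "anthropic_api_key": "{{secrets/genai/anthropic_key}}",
                },
            },
        }]
    },
)

# The endpoint can then be queried like any other served model.
print(client.predict(
    endpoint="anthropic-chat",
    inputs={"messages": [{"role": "user", "content": "Hello!"}]},
))
```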

Unlock the Potential of Your Enterprise Data With Zero-Copy Data Sharing, featuring SAP and Salesforce

Tired of data silos and the constant need to move copies of your data across different systems? Imagine a world where all your enterprise data is readily available in Databricks without the cost and complexity of duplication and ingestion. Our vision is to break down these silos by enabling seamless, zero-copy data sharing across platforms, clouds and regions. This unlocks the true potential of your data for analytics and AI, empowering you to make faster, more informed decisions leveraging your most important enterprise data sets. In this session, you will hear from Databricks, SAP and Salesforce product leaders on how zero-copy data sharing can unlock the value of enterprise data, and explore how Delta Sharing makes this vision a reality by providing secure, zero-copy data access for enterprises. SAP Business Data Cloud: see Delta Sharing in action to unlock operational reporting, supply chain optimization and financial planning. Salesforce Data Cloud: enable customer analytics, churn prediction and personalized marketing.
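For a taste of the open protocol underneath, a minimal sketch of reading a shared table with the open-source delta-sharing client; the profile file and share coordinates are placeholders:

```python
# Sketch: zero-copy read of a Delta Sharing table with the open-source
# client (pip install delta-sharing). Profile file and table
# coordinates are placeholders issued by the data provider.
import delta_sharing

profile = "config.share"  # credentials file from the provider
table_url = f"{profile}#retail_share.sales.orders"

orders = delta_sharing.load_as_pandas(table_url)
print(orders.head())
```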