Event: Big Data LDN 2025
Activities tracked: 15
Sessions & talks
AI is only as good as the data it runs on. Yet Gartner predicts that in 2026, over 60% of AI projects will fail to deliver value because the underlying data isn’t truly AI-ready. “Good enough” data isn’t enough.
In this exclusive BDL launch session, DataOps.live reveals Momentum, the next generation of its DataOps automation platform, designed to operationalize trusted AI at enterprise scale.
Based on experiences from building over 9000 Data Products to date, Momentum introduces breakthrough capabilities including AI-Ready Data Scoring to ensure data is fit for AI use cases, Data Product Lineage for end-to-end visibility, and a Data Engineering Agent that accelerates building reusable data products. Combined with automated CI/CD, continuous observability, and governance enforcement, Momentum closes the AI-readiness gap by embedding collaboration, metadata, and automation across the entire data lifecycle.
Backed by Snowflake Ventures and trusted by leading enterprises including AstraZeneca, Disney, and AT&T, DataOps.live is the proven catalyst for scaling AI-ready data. In this session, you’ll unpack what AI-ready data really means, learn essential practices, and discover a faster, easier, and more impactful way to make your AI initiatives succeed.
Be the first to see Momentum in action: the future of AI-ready data.
Future of Data Engineering in an Agentic World
This session will provide a Maia demo with roadmap teasers. The demo will showcase Maia's core capabilities: authoring pipelines in business language, multiplying productivity by accelerating tasks, and enabling self-service. It demonstrates how Maia takes natural language prompts and translates them into YAML-based, human-readable Data Pipeline Language (DPL), generating graphical pipelines. Expect to see Maia interacting with Snowflake metadata to sample data and suggest transformations, as well as its ability to troubleshoot and debug pipelines in real time. The session will also cover how Maia can create custom connectors from REST API documentation in seconds, a task that traditionally takes days. Roadmap teasers will likely include the upcoming Semantic Layer, a Pipeline Reviewing Agent, and enhanced file type support for various legacy ETL tools and code conversions.
Bringing Data Modeling to the Masses with AI and Embedded Connectivity
Join Sami Hero and Tammie Coles, as they share how Ellie is reinventing data modeling with AI-native tools that empower both technical and non-technical users. With CData Embedded Cloud, Ellie brings live metadata and data models from systems like Snowflake, Databricks, and Oracle Financials into a unified modeling workspace. Their platform translates legacy structures into human-readable insights, letting users interact with a copilot-style assistant to discover, refine, and maintain data models faster—with less reliance on analysts.
You’ll see how Ellie uses generative AI to recommend new entities, reconcile differences between models and live systems, and continuously document evolving data environments. Learn how corporations are using Ellie and CData together to scale high-quality data modeling across teams, reducing rework, accelerating delivery of analytics-ready models, and making enterprise architecture accessible to the business.
How Runna Supercharged Data in Just 6 Months with Snowflake
Join Richard as he shares how Runna transformed its data capabilities by harnessing the power of Snowflake and AWS. This session will explore the key challenges the team faced, how they overcame them, and the practical steps they took to build a scalable, future-ready data platform. Richard will walk through what’s been achieved so far, the lessons learned along the way, and how Runna is now able to unlock complex business insights faster and more efficiently than ever before. You'll also get a sneak peek into what’s next as they continue to evolve their data strategy to support rapid growth and innovation.
Live Demo - Build a custom Fivetran connector in 20 minutes
In this 20-minute session, you'll learn how to build a custom Fivetran connector using the Fivetran Connector SDK and the Anthropic Workbench (AI Assistant) to integrate data from a custom REST API into Snowflake.
You'll then learn how to create a Streamlit in Snowflake data application powering metrics and Snowflake Cortex AI-driven applications.
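The session builds the connector with the Fivetran Connector SDK, which supplies its own `update`/`schema` hooks. As a rough illustration of the pattern such a connector implements, here is a minimal, self-contained Python sketch of an incremental sync: a cursor tracks progress, each run fetches only newer rows, and each row is emitted as an upsert operation. The endpoint shape, field names, and `fetch_page` stub are assumptions for illustration, not the SDK's actual API.

```python
# Sketch of the incremental-sync pattern a custom connector implements.
# fetch_page stands in for a REST API call such as GET /events?since=<cursor>;
# the field names and table name are illustrative assumptions.

def fetch_page(cursor):
    """Stand-in for a REST API call returning rows newer than the cursor."""
    sample = [
        {"id": 1, "updated_at": "2025-09-01T10:00:00Z", "name": "signup"},
        {"id": 2, "updated_at": "2025-09-02T11:30:00Z", "name": "purchase"},
    ]
    return [row for row in sample if row["updated_at"] > cursor]

def update(state):
    """Yield ("upsert", table, row) operations and advance the sync cursor."""
    cursor = state.get("cursor", "")
    for row in fetch_page(cursor):
        yield ("upsert", "events", row)
        cursor = max(cursor, row["updated_at"])
    state["cursor"] = cursor

state = {}
first_sync = list(update(state))    # both rows are new on the first run
second_sync = list(update(state))   # cursor now filters everything out
```

Because the cursor is persisted in `state` between runs, the second sync is a no-op until the source produces newer rows, which is the core of incremental replication.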
How Espresso Uses ML To Cut Your Snowflake Bill in Half
Espresso AI uses two main techniques to run Snowflake workloads faster and cheaper: ML-based job scheduling and LLM-based query optimization. This talk will dive into the details behind both approaches.
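Espresso's actual models and scheduler are not described here, but the intuition behind runtime-aware job scheduling can be sketched with a toy example: if a model can predict each query's runtime, ordering a single warehouse's queue shortest-predicted-first minimizes total queue wait. The predicted runtimes below are hard-coded stand-ins for a real model's output.

```python
# Toy illustration of runtime-aware scheduling (not Espresso's actual system):
# given predicted runtimes, shortest-job-first minimizes cumulative queue wait.

def total_wait(runtimes):
    """Sum of seconds each job waits before starting on one warehouse."""
    wait, elapsed = 0, 0
    for r in runtimes:
        wait += elapsed
        elapsed += r
    return wait

predicted = {"q1": 120, "q2": 5, "q3": 30}   # stand-in model predictions (s)
fifo = list(predicted.values())              # arrival order: 120, 5, 30
sjf = sorted(predicted.values())             # shortest-predicted-first

# FIFO wait: 0 + 120 + 125 = 245 s; SJF wait: 0 + 5 + 35 = 40 s
```

Even this three-query example shows why prediction quality matters: the reordering only helps if the model's runtime estimates rank jobs roughly correctly.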
Iceberg – Tales From a Real Implementor With DataOps.live
Get ready for a customer story that’s as bold as it is eye-opening. In this session, Eutelsat and DataOps.live pull back the curtain on what it really takes to deliver business-changing outcomes, with a specific focus on use cases built with Apache Iceberg at the core. And these use cases are BIG: think big, big numbers, and you still aren’t even close!
You’ll hear the inside story of how Eutelsat found itself with two “competing” cloud data platforms. What could have been an expensive headache turned out to be an advantage: Iceberg made it not only possible but cheaper and simpler to use both together, unlocking agility and cost savings that no single platform alone could provide.
The impact is already tangible. Telemetry pipelines are live and delivering massive value. Next up: interoperable data products moving seamlessly between Snowflake and Cloudera, driving cross-platform innovation. And that’s just the start: Eutelsat is also positioning Iceberg as a future-proof standard for data sharing and export.
This is a story of scale, speed, and simplification: the kind of transformation only possible when a visionary team meets the right technology.
Driving Impact Through Data: The Evolution of Data Quality at OutSystems
As the pioneers of the low-code market since 2001, enterprise software delivery solution OutSystems has evolved rapidly alongside the changing landscape of data. With a global presence and a vast community of over 750,000 members, OutSystems continues to leverage innovative tools, including data observability and generative AI, to help their customers succeed.
In this session, Pedro Sá Martins, Head of Data Engineering, will share the evolution of OutSystems’ data landscape, including how OutSystems has partnered with Snowflake, Fivetran and Monte Carlo to address their modern data challenges. He’ll share best practices for implementing scalable data quality programs to drive innovative technologies, as well as what’s on the data horizon for the OutSystems team.
From Metadata to AI Mastery: DataHub’s MCP-Powered Context Engine
AI agents need seamless access to enterprise data to deliver real value. DataHub's new MCP server creates the universal bridge that connects any AI agent to your entire data infrastructure through a single interface.
This session demonstrates how organizations are breaking down data silos by enabling AI agents to intelligently discover and interact with data across Snowflake, Databricks, BigQuery, and other platforms. See live examples of AI-powered data discovery, real-time incident response, and automated impact analysis.
Learn how forward-thinking data leaders are positioning their organizations at the center of the AI revolution by implementing universal data access strategies that scale across their entire ecosystem.
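MCP (the Model Context Protocol) is built on JSON-RPC 2.0, so the bridge between an agent and a metadata server ultimately comes down to structured requests like the one sketched below. The `tools/call` method is part of the MCP specification, but the tool name (`search_datasets`) and its arguments are hypothetical illustrations, not DataHub's actual tool surface.

```python
import json

# Minimal sketch of the JSON-RPC request an AI agent might send to an
# MCP metadata server. "tools/call" is an MCP method; the tool name and
# arguments below are hypothetical, not DataHub's actual tools.

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_datasets",  # hypothetical discovery tool
        "arguments": {"query": "orders", "platform": "snowflake"},
    },
}

wire = json.dumps(request)  # serialized form sent over the transport
```

The appeal of the single-interface design is visible even here: the agent speaks one request shape regardless of whether the answer ultimately comes from Snowflake, Databricks, or BigQuery metadata.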
Mapping Vulnerabilities: The Disparate Impact of Change
Change – social and environmental, rural and urban – disproportionately impacts women and children. The UN reports that 80% of people displaced by climate change are women and girls. Research shows that domestic violence and crime, particularly against women, increases with male job loss. Understanding prevailing patterns by combining location data with social and environmental data can help identify vulnerable populations and mitigate these risks.
Join this session with Snowflake and Ordnance Survey to:
• Explore how integrating different types of geospatial data, such as maps, with open source data can expose the impact of these changes
• Discover how overlaying this information enriches visualization and analysis of vulnerabilities to identify better solutions
• Learn how data from the UK’s Ordnance Survey available through the Snowflake Marketplace addresses global challenges and helps mitigate risks
• Hear about Snowflake’s initiative to End Data Disparity
From Data to Intelligence: How Snowflake Powers Our Digital and AI Strategy
Join Tom Pryor, Principal Data Engineer, as he shares how his team has harnessed the power of Snowflake to transform their data strategy into a robust, scalable foundation for digital innovation and AI enablement. This session will explore how Snowflake has unified data across the enterprise, enabling real-time insights, powering customer-facing digital applications, and laying the groundwork for advanced AI capabilities. Tom will walk through key architectural decisions, data governance practices, and the evolution from legacy systems to a modern data platform.
How Espresso Uses ML To Cut Your Snowflake Bill in Half
Espresso AI uses two main techniques to run Snowflake workloads faster and cheaper: ML-based job scheduling and LLM-based query optimization. This talk will dive into the details behind both approaches.
Session brought to you by Snowflake
Building Agentic Ready AI-Native Platforms
In the scramble for agentic systems, the question has to be asked: are we ready?
This session highlights the common challenges and complexities we face during the rush for autonomous orchestration. We'll demonstrate how Snowflake's AI data platform offers a unified, adaptable, and trusted foundation for building data agents you can rely on.