Discover how to build a powerful AI Lakehouse and unified data fabric natively on Google Cloud. Leverage BigQuery's serverless scale and robust analytics capabilities as the core, seamlessly integrating open data formats with Apache Iceberg and efficient processing using managed Spark environments like Dataproc. Explore the essential components of this modern data environment, including data architecture best practices, robust integration strategies, high data quality assurance, and efficient metadata management with Google Cloud Data Catalog. Learn how Google Cloud's comprehensive ecosystem accelerates advanced analytics, preparing your data for sophisticated machine learning initiatives and enabling direct connection to services like Vertex AI.
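As a flavor of the Iceberg-on-Dataproc piece of that stack, here is a minimal configuration sketch for a Spark session that reads an Iceberg table from a Cloud Storage warehouse. The catalog name, bucket, and table names are invented for illustration; a real setup depends on how your cluster provides the Iceberg Spark runtime.

```python
# Sketch: a Spark session wired up for Apache Iceberg on Dataproc.
# Assumes the Iceberg Spark runtime JAR is on the cluster classpath;
# catalog name ("lakehouse"), bucket, and table are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-lakehouse-demo")
    # Enable Iceberg's SQL extensions (MERGE, time travel, etc.).
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    # Register an Iceberg catalog backed by a GCS warehouse path.
    .config("spark.sql.catalog.lakehouse",
            "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lakehouse.type", "hadoop")
    .config("spark.sql.catalog.lakehouse.warehouse",
            "gs://your-bucket/warehouse")  # hypothetical bucket
    .getOrCreate()
)

# Query an Iceberg table through the configured catalog.
df = spark.sql("SELECT * FROM lakehouse.sales.orders LIMIT 10")
df.show()
```

The same table data stays in open Iceberg format on Cloud Storage, which is what lets BigQuery and Spark share it without copies.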
talk-data.com
Topic: Google Cloud Platform (GCP) (1,670 items tagged)

Top Events
The most sought-after products don’t just appear on shelves—they arrive at the perfect moment, in perfect condition, thanks to data that works as fast as the business moves.
From premium meats to peak-season produce, Morrisons, one of the UK’s largest retailers, is building a future where shelves are stocked with exactly what customers want, when they want it.
In this session, Peter Laflin, Chief Data Officer at Morrisons, joins Striim to share how real-time data streaming into Google Cloud enables smarter, faster, and more autonomous retail operations. He’ll unpack how Morrisons is moving beyond predictive models to build AI-native, agentic systems that can sense, decide, and act at scale. Topics include:
Live store operations that respond instantly to real-world signals
AI architectures that move from “data-informed” to “data-delegated” decisions
Practical lessons from embedding real-time thinking across teams and tech stacks
This is a session for retail and data leaders who are ready to move beyond dashboards and start building intelligent systems that deliver both customer delight and operational agility.
Are you ready to build the next generation of data-driven applications? This session demystifies the world of Autonomous Agents, explaining what they are and why they are the future of AI. We’ll dive into Google Cloud's comprehensive platform for creating and deploying these agents, from our multimodal data handling to the seamless integration of Gemini models. You will learn the principles behind building your own custom data agents and understand why Google Cloud provides the definitive platform for this innovation. Join us to gain the knowledge and tools needed to architect and deploy intelligent, self-sufficient data solutions.
The growth of connected data has made graph databases essential, yet organisations often face a dilemma: choosing between an operational graph for real-time queries or an analytical engine for large-scale processing. This division leads to data silos and complex ETL pipelines, hindering the seamless integration of real-time insights with deep analytics and the ability to ground AI models in factual, enterprise-specific knowledge. Google Cloud aims to solve this with a unified "Graph Fabric," introducing Spanner Graph, which extends Spanner with native support for the ISO standard Graph Query Language (GQL). This session will cover how Google Cloud has developed a Unified Graph Solution with BigQuery and Spanner graphs to serve a full spectrum of graph needs from operational to analytical.
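To give a flavor of what ISO GQL looks like, here is an illustrative query in the pattern-matching style Spanner Graph supports. The graph, labels, and properties below are invented for this sketch, not taken from the session.

```python
# A hypothetical GQL query against a made-up property graph: find
# people who own accounts with a balance above a threshold.
gql_query = """
GRAPH FinGraph
MATCH (p:Person)-[:Owns]->(a:Account)
WHERE a.balance > 1000
RETURN p.name AS owner, a.id AS account_id
"""

# Against a live Spanner database this would run through the regular
# client library (e.g. snapshot.execute_sql(gql_query)), since Spanner
# serves GQL through the same query path as SQL.
print(gql_query.strip())
```

The point of the "Graph Fabric" framing is that the same pattern-matching queries can serve operational lookups in Spanner and large-scale analytics in BigQuery without an ETL hop between two graph systems.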
Discover how Google Cloud's AI-native platform is transforming data science, moving beyond traditional methods to empower you with an intuitive experience, an open ecosystem, and the ability to build intelligent, data-native AI agents. This shift eliminates integration headaches and scales your impact, enabling you to innovate faster and drive real-world outcomes. Explore how these advancements unify your workflows and unlock unprecedented possibilities for real-time, agent-driven insights.
Discussion of Rightmove's data hive and analytics platform.
Air France KLM partnered with Datashift to tackle data silos, regulatory complexity, and fragmented access by building a central data marketplace. Using Collibra and Google Cloud, they now empower the organization with trusted, governed, and self-service data to support critical operations and decision-making.
Sligro Food Group, the Dutch market leader in food service, needed to centralise data to improve retail decision-making and stay competitive. Moving away from on-premises databases, they simplified data integration into GCP. Join this session to learn why centralised data is key for Sligro, why enterprises choose Fivetran over legacy tools, and how to integrate data into the cloud for real-time analytics.
Discover how to design and deploy powerful multi-agent systems on Google Cloud. Cloud & AI Consultant Timothy van der Werf demonstrates how agents collaborate, share tasks, and autonomously handle complex processes using technologies like Agentspace. This session delivers practical examples and insights for anyone looking to apply AI to data-driven innovation, customer engagement, or operational optimization.
Elliot Foreman and Andrew DeLave from ProsperOps joined Yuliia and Dumky to discuss automated cloud cost optimization through commitment management. As ProsperOps' Google go-to-market director and senior FinOps specialist respectively, they explain how their platform manages over $4 billion in cloud spend by automating reserved instances, committed use discounts, and savings plans across AWS, Azure, and Google Cloud. The conversation covers the psychology behind commitment hesitation, break-even point mathematics for cloud discounts, workload volatility optimization, and why they avoid AI in favor of deterministic algorithms for financial decisions. They share insights on managing complex multi-cloud environments, the human-versus-automation debate in FinOps, and practical strategies for reducing cloud costs while mitigating commitment risks.
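The break-even arithmetic mentioned in the episode can be illustrated with a toy model (this is not ProsperOps' actual algorithm, just the basic idea): a commitment priced at a discount only pays off if utilization stays above the complement of the discount.

```python
def breakeven_utilization(discount: float) -> float:
    """Minimum fraction of committed capacity you must actually use
    for the commitment to beat pure on-demand pricing.

    With a discount d, committed capacity costs (1 - d) per unit whether
    used or not, so the effective unit price at utilization u is
    (1 - d) / u; break-even against an on-demand price of 1 is u = 1 - d.
    """
    return 1.0 - discount


def blended_cost(on_demand_rate: float, discount: float,
                 committed_units: float, used_units: float) -> float:
    """Total cost when usage above the commitment spills to on-demand."""
    committed_cost = committed_units * on_demand_rate * (1.0 - discount)
    overflow = max(0.0, used_units - committed_units) * on_demand_rate
    return committed_cost + overflow


# Illustrative numbers only: a 37% discount breaks even at 63% utilization,
# and under-using a commitment can cost more than staying on-demand.
print(round(breakeven_utilization(0.37), 2))        # 0.63
print(round(blended_cost(1.0, 0.37, 100, 50), 2))   # 63.0 (vs 50 on-demand)
print(round(blended_cost(1.0, 0.37, 100, 120), 2))  # 83.0 (63 committed + 20 overflow)
```

This is why workload volatility matters: the more usage swings below the committed floor, the more often you pay the 63.0-for-50-used penalty above.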
Episode 4 – Deploy AI Agents to Production: guidance on deploying AI agents to production on Google Cloud Platform, with theory and live coding.
Presented by: Rami Rekik, Staff SRE @Algolia. Discover Algolia's strategy for cutting its GCP costs by $150,000 per month ($1.8 million per year!) in just a few days. Rami shares concrete optimizations: well-chosen commitments (CUDs, BigQuery slots), removal of unused resources (logs, materialized views), and storage-format optimization. Learn the exact actions taken, their measured impact, and how to implement them yourself.
Episode 4: Deploying to Vertex AI Agent Engine.
This session will detail the Apache Airflow journey of Allegro, a leading e-commerce company in Poland. It charts our evolution from a custom, on-premises Airflow-as-a-Service solution through a significant expansion to over 300 Cloud Composer instances in Google Cloud, culminating in Airflow becoming the core of our data processing. We orchestrate over 64,000 regular tasks spanning more than 6,000 active DAGs on more than 200 Airflow instances, from feeding business-supporting dashboards to managing main data marts and handling ML pipelines. We will share our practical experiences, lessons learned, and the strategies employed to manage and scale this critical infrastructure. Furthermore, we will introduce our economy-of-share approach for providing ready-to-use Airflow environments, significantly enhancing both user productivity and cost efficiency.
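Each of those thousands of DAGs is ultimately just an ordinary Python file. A minimal sketch of one, using Airflow 2.x's TaskFlow API (the DAG id, schedule, and task logic are invented for illustration, not one of Allegro's pipelines):

```python
# Minimal Airflow 2.x DAG definition using the TaskFlow API. A real
# data-mart pipeline would call out to BigQuery, Spark, etc.; the
# scheduler and workers (e.g. in Cloud Composer) execute files like this.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def sales_mart_refresh():
    @task
    def extract() -> list:
        # Stand-in for pulling rows from a source system.
        return [1, 2, 3]

    @task
    def load(rows: list) -> None:
        print(f"loaded {len(rows)} rows")

    # Declaring the data dependency also declares the task ordering.
    load(extract())


sales_mart_refresh()
```

An economy-of-share model like the one described would multiplex many teams' DAG files such as this onto shared, pre-provisioned Airflow environments rather than giving every team its own cluster.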
In this workshop you will learn about the latest features of Cloud Composer, Google Cloud's managed service for Apache Airflow.
In this session the audience will learn about the newest features of Google Cloud's managed Airflow offering. If you would like to operate Airflow at scale or in regulated environments, this session is for you.