talk-data.com

Topic

Snowflake

data_warehouse cloud analytics olap

550 tagged

Activity Trend: 193 peak/qtr (2020-Q1 to 2026-Q1)

Activities

550 activities · Newest first

Unlock the full potential of your data. Discover how the Zero Copy integration between Salesforce Data Cloud and Snowflake creates a unified, trusted data foundation to power your AI strategy—without moving your data. We'll demonstrate how secure, governed bi-directional Zero Copy access enables real-time customer interactions and provides AI agents with the reliable context they need to succeed. Join an expert from our Munich Engineering team for an inside look at what makes Zero Copy possible: Hyper, Data Cloud's high-performance query engine, envisioned and built right here in Munich.

tesa SE is a global adhesive manufacturing company. In its highly automated tape production process, abnormal quality and efficiency events must be detected with very low latency to avoid high costs.

Using Snowflake's machine learning capabilities, tesa SE monitors various KPIs that indicate a correct production process.

tesa's newest innovative use case aims to reduce waste during the production process using anomaly detection models, which are trained on Snowflake and used for inference on the edge for optimal latency.

The machine learning pipeline components are built and served using Snowflake features such as the Snowflake CLI, Snowpark pandas, and other Snowflake capabilities to streamline the overarching ML process.
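The train-centrally, infer-on-edge split described above can be sketched as follows. This is a hypothetical illustration, not tesa's actual pipeline: control limits are fitted on historical KPI data (the step that would run on Snowflake) and exported as a tiny parameter set that an edge device can evaluate with a single comparison. The z-score method and all names are assumptions.

```python
# Illustrative sketch only: thresholds are "trained" centrally and shipped
# to the edge as plain parameters, so edge inference is a cheap comparison.
from statistics import mean, stdev

def fit_thresholds(history: list[float], z: float = 3.0) -> dict:
    """'Training' step: derive control limits from historical KPI readings."""
    mu, sigma = mean(history), stdev(history)
    return {"lower": mu - z * sigma, "upper": mu + z * sigma}

def is_anomalous(reading: float, params: dict) -> bool:
    """'Edge inference' step: one comparison against the exported limits."""
    return not (params["lower"] <= reading <= params["upper"])

history = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.3, 10.1]
params = fit_thresholds(history)
print(is_anomalous(10.1, params))  # reading inside the fitted band
print(is_anomalous(14.0, params))  # reading far outside the band
```

The design point is that only `params` crosses the network, keeping edge latency independent of the training data volume.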

Snowflake recently launched new capabilities to help organizations build data agents connected to their enterprise data environments. This session explores how Snowflake makes it easy to iterate on, monitor, and observe agentic AI systems. Learn best practices for RBAC integration in agentic workflows and applications. The session will also cover change control, environment management, and the software development lifecycle as applied to AI agent development.

Discover how Boehringer is managing data translation and harmonization, enabling the creation of seamless data integrations worldwide. This session will detail how Boehringer transformed its fragmented data landscape into a centralized data factory on Snowflake, reducing costs while dramatically improving scalability and efficiency. We will explore how leveraging Cortex LLM Functions automates data mapping and consistency, enabling the creation of integrated data products that empower global business steering. Join us to learn how to overcome data harmonization challenges and build a data factory mindset for better decision-making and unprecedented growth.

Discover how Snowflake Dynamic Tables revolutionize the implementation of Slowly Changing Dimensions (SCDs). Learn to create automated, self-maintaining data pipelines that eliminate complex ETL processes and reduce maintenance overhead. This session demonstrates practical examples of building Type 1 and 2 SCDs using Dynamic Tables, showcasing their cost-efficiency and performance benefits while ensuring data accuracy and historical tracking.
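The Type 2 bookkeeping that the session describes Dynamic Tables automating can be illustrated in miniature: when a tracked attribute changes, the current row is closed out and a new versioned row is opened. Plain Python dicts stand in for table rows here; the column names (`valid_from`, `valid_to`, `is_current`) are common conventions, not a specific Snowflake schema.

```python
# Minimal sketch of Slowly Changing Dimension Type 2 versioning logic.
def apply_scd2(dimension: list[dict], key: str, new_value: str, as_of: str) -> list[dict]:
    """Close the current version of `key` and open a new one effective `as_of`."""
    current = next(r for r in dimension if r["key"] == key and r["is_current"])
    if current["value"] == new_value:
        return dimension  # attribute unchanged: no new version needed
    closed = {**current, "valid_to": as_of, "is_current": False}
    opened = {"key": key, "value": new_value,
              "valid_from": as_of, "valid_to": None, "is_current": True}
    return [r for r in dimension if r is not current] + [closed, opened]

dim = [{"key": "C1", "value": "Berlin",
        "valid_from": "2023-01-01", "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, "C1", "Munich", "2024-06-01")
for row in dim:
    print(row["value"], row["valid_from"], row["valid_to"], row["is_current"])
```

A Type 1 SCD is the degenerate case: overwrite `value` in place and keep no closed rows. The appeal of Dynamic Tables is that this merge logic becomes declarative instead of hand-maintained ETL.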

This session highlights Wipro's innovative approach to transforming financial operations using AI agents. By leveraging Wipro's deep industry expertise and Snowflake's data prowess, we have developed a scalable, secure, and intelligent system that revolutionizes the management of supplier-related queries around invoices and payments. Our solution has significantly increased processing efficiency, reduced latency and manual activities, and achieved high accuracy in responses. The implementation has led to substantial cost savings and improved response times, demonstrating the potential of AI-powered solutions in enhancing business operations.

Allianz replaced its legacy campaign system with a cloud-native Campaign Data Hub (CDH) powered by Snowflake, unifying data from three core business lines onto a single, real-time platform. This modern architecture reduces costs and frees up IT resources, while empowering over 8,000 agents with on-demand customer insights and the ability to pivot campaign messaging in minutes. The result is a strategic shift from legacy complexity to data-driven growth, enabling Allianz to launch hyper-personalized campaigns at scale and drive a sharp increase in agent productivity and conversion rates.

FFT, a global leader in innovative manufacturing systems, has developed a solution to the persistent challenge of bridging IT and OT data. In just six months, FFT launched the FFT DataBridge, which resides on shopfloor edge devices and seamlessly integrates production data into the Snowflake AI Data Cloud. This solution unlocks critical shopfloor analytics, AI-based forecasting, and predictive maintenance. By leveraging the power of Snowflake, FFT is helping manufacturing companies transform their operations and is continuing their journey by building a suite of IT/OT-centric applications for the Snowflake Marketplace.

This session puts the spotlight on STADA’s journey in transforming data accessibility for better decision-making. As a trusted Snowflake customer, STADA has prioritized consistent and reliable data by centralizing its commercial and supply information on Snowflake’s platform. With EPAM’s support and innovations enabled by Cortex technology, STADA has empowered its employees with intuitive tools to navigate complex data with ease. Join us to hear how STADA is driving efficiency and unlocking the full potential of its data.

The retail and consumer goods industries are undergoing significant transformation, driven by shifting consumer behaviors, global economic changes, supply chain disruptions and, most importantly, rapid technological innovation. This session is designed for business and technology leaders in RCG, offering them insights and strategies needed to navigate and thrive in this evolving landscape. Learn from the transformational experience of the leading global consumer goods company, Snowflake industry experts and key partners as they explore how data and AI technologies are shaping the industries' future.

In this session, we will support the CDAO's enterprise data strategy by exploring the full lifecycle of data products (packaged, readily consumable data assets), from concept through distribution on Snowflake's internal marketplace.

We will drive home five key points within this data product lifecycle:

1. Simplified data access for business users
2. Improved data discoverability and understanding
3. Increased data utilization and adoption
4. Accelerated time to insight
5. Metrics and considerations

Modern data engineering leverages Python to build robust, scalable, end-to-end workflows. In this talk, we will cover how Snowflake offers a flexible environment for developing Python data pipelines, performing transformations at scale, and orchestrating and deploying pipelines at scale. Topics we'll cover include:

- Ingest: data source APIs, reading files and ingesting data of any format as files arrive from sources outside Snowflake
- Develop: packaging (artifact repository), Python runtimes, IDEs (Notebooks, VS Code)
- Transform: Snowpark pandas, UDFs, UDAFs
- Deploy: Tasks, Notebook scheduling
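The ingest, transform, deploy flow outlined above can be sketched with plain Python stand-ins. Real Snowpark APIs require a live Snowflake connection, so everything below is a self-contained illustration of the shape of such a pipeline, not actual Snowflake code; stage names mirror the talk's outline, the rest is assumed for the example.

```python
# Generic sketch of an ingest -> transform -> deploy pipeline.
import csv, io

def ingest(raw_csv: str) -> list[dict]:
    """Ingest: parse an arriving file's content into records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Transform: cast types and derive columns (the role Snowpark pandas
    plays at scale)."""
    return [{"sku": r["sku"], "revenue": float(r["price"]) * int(r["qty"])}
            for r in records]

def deploy(records: list[dict]) -> dict:
    """Deploy: in production this would run as a scheduled Task; here we
    simply materialize an aggregate."""
    return {r["sku"]: r["revenue"] for r in records}

raw = "sku,price,qty\nA1,2.50,4\nB2,1.00,3\n"
print(deploy(transform(ingest(raw))))  # {'A1': 10.0, 'B2': 3.0}
```

Keeping each stage a pure function makes the pipeline easy to test locally before wiring the stages to managed ingestion, warehouses, and schedulers.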

Connecting machines and structuring industrial data has long been one of the toughest challenges in smart manufacturing. Before unlocking the power of AI, large language models, or advanced analytics, companies must first solve the foundational task of harmonizing and organizing their data—without this, bad data only leads to bad AI.

This session covers the journey from building a Unified Namespace as the data foundation to scaling predictive use cases such as maintenance, quality optimization, and process improvements. Using customer stories from discrete and process manufacturing, we will show how DXC and Snowflake enable enterprises to connect IoT data at scale, establish a harmonized taxonomy across global operations, and drive measurable business outcomes.

By unifying diverse industrial IoT and enterprise data into a governed data layer, the Unified Namespace enables the creation of an operational digital twin—a live, authoritative representation of manufacturing systems and assets that fuels scalable AI use cases like predictive maintenance, autonomous control, and AI-driven shop floor assistance. Attendees will learn how DXC's and Snowflake's IoT best practices power OT/IT convergence, continuous digital twin evolution, and AI-driven operational excellence.
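The "harmonized taxonomy" idea behind a Unified Namespace can be made concrete with a small sketch: every signal is published under a fixed, ISA-95-style topic path, so any consumer can resolve where a reading came from without per-machine mappings. The level names and example path below are illustrative assumptions, not DXC's or Snowflake's actual schema.

```python
# Hypothetical Unified Namespace topic convention: enterprise/site/area/line/metric.
UNS_LEVELS = ["enterprise", "site", "area", "line", "metric"]

def parse_uns_topic(topic: str) -> dict:
    """Map a topic path onto the agreed hierarchy levels."""
    parts = topic.split("/")
    if len(parts) != len(UNS_LEVELS):
        raise ValueError(f"expected {len(UNS_LEVELS)} levels, got {len(parts)}")
    return dict(zip(UNS_LEVELS, parts))

reading = parse_uns_topic("acme/hamburg/assembly/line-3/oee")
print(reading["site"], reading["metric"])  # hamburg oee
```

Once every plant publishes into the same hierarchy, a single parser like this replaces the per-source mapping tables that otherwise make OT data so costly to integrate.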

Collecting and analyzing data is at the heart of every pharmaceutical R&D organization. As a result, we at Merck KGaA store large amounts of very diverse data – from assay results in early research to operational data from clinical trials, human clinical trial data, and data related to regulatory submissions. In the upcoming era of artificial intelligence (AI), making this data available to AI systems at scale is a strategic imperative. For this reason, we have formed a cross-R&D workstream to modernize our compute and storage ecosystem. In this presentation, we will explain the approach taken, the main challenges faced, and how we are addressing them. One important component of our new ecosystem is Snowflake, where we leverage automation and blueprints to enable a consistent technical foundation across the different R&D domains.

In today's fast-paced business world, fast, reliable access to high-quality data is essential. Kuehne+Nagel addresses this by adopting Data Mesh principles, using Snowflake for managed data access. This session introduces the Kuehne+Nagel New IT Ecosystem (KNITE), a suite of platforms that replace legacy systems and implement Data Mesh principles, with Snowflake as a core component. We’ll showcase how Snowflake integrates into KNITE and highlight the features powering our Data Mesh approach.

Siemens has developed two pilots for customer-facing industrial applications on Snowflake that bring the power of OT/IT integration to life: an OEE app that lets Siemens customers talk to their machine data and create customised insights in a self-service manner within seconds, and a logistics insights application that helps customers optimise the continuous reorganisation of warehouses close to their production. Both applications use shop floor data collected through Siemens' Industrial Edge solution, which can now be integrated seamlessly into the customer's Snowflake data cloud with our native connector.

In a context where companies must do more with less, the combination of Denodo and Snowflake stands out as a powerful way to accelerate data delivery while guaranteeing productivity and governance, even with small teams.

This workshop shows how CENTRE FRANCE built a virtual data warehouse capable of aggregating data from multiple sources, whether kept in their source systems or loaded into Snowflake, and making them accessible in real time through tools such as Tableau.

What you will discover:

- Significant time savings in data preparation thanks to virtualization

- Real-time operational monitoring

- A business-oriented Data Catalog enabling exploration, qualification, and user autonomy

- Strengthened governance: security and traceability

- Multi-source views to meet marketing needs (segmentation, targeting, etc.)

- Planned integration with mailing tools for personalized campaigns

- A scalable platform: support for custom developments as needs evolve