talk-data.com

Topic

Analytics

data_analysis insights metrics

4552

tagged

Activity Trend

398 peak/qtr
2020-Q1 2026-Q1

Activities

4552 activities · Newest first

What if you could turn all of your company's data into a single, intelligent conversation? That's the promise of agentic AI. In this session, you'll see how to create and deploy a no-code agentic AI solution directly in Snowflake Intelligence. Get ready for a live demo that proves you can make critical insights instantly accessible to everyone through a simple, conversational interface. No more sifting through dashboards. You will discover how to simply ask your data a question and get an answer you can trust with Snowflake Intelligence.

Brought to You By:

• Statsig: The unified platform for flags, analytics, experiments, and more. Most teams end up in this situation: ship a feature to 10% of users, wait a week, check three different tools, try to correlate the data, and you're still unsure if it worked. The problem is that each tool has its own user identification and segmentation logic. Statsig solved this by building everything within a unified platform. Check out Statsig.

• Linear: The system for modern product development. In the episode, Armin talks about how he uses an army of "AI interns" at his startup. With Linear, you can easily do the same: Linear's Cursor integration lets you add Cursor as an agent to your workspace. This agent then works alongside you and your team to make code changes or answer questions. You've got to try it out: give Linear a spin and see how it integrates with Cursor.

Armin Ronacher is the creator of the Flask framework for Python, was one of the first engineers hired at Sentry, and is now the co-founder of a new startup. He has spent his career thinking deeply about how tools shape the way we build software. In this episode of The Pragmatic Engineer Podcast, he joins me to talk about how programming languages compare, why Rust may not be ideal for early-stage startups, and how AI tools are transforming the way engineers work. Armin shares his view on what continues to make certain languages worth learning, and how agentic coding is driving people to work more, sometimes to their own detriment.
We also discuss:
• Why the Python 2 to 3 migration was more challenging than expected
• How Python, Go, Rust, and TypeScript stack up for different kinds of work
• How AI tools are changing the need for unified codebases
• What Armin learned about error handling from his time at Sentry
• And much more

Jump to interesting parts:
• (06:53) How Python, Go, and Rust stack up and when to use each one
• (30:08) Why Armin has changed his mind about AI tools
• (50:32) How important are language choices from an error-handling perspective?

Timestamps:
(00:00) Intro
(01:34) Why the Python 2 to 3 migration created so many challenges
(06:53) How Python, Go, and Rust stack up and when to use each one
(08:35) The friction points that make Rust a bad fit for startups
(12:28) How Armin thinks about choosing a language for building a startup
(22:33) How AI is impacting the need for unified code bases
(24:19) The use cases where AI coding tools excel
(30:08) Why Armin has changed his mind about AI tools
(38:04) Why different programming languages still matter but may not in an AI-driven future
(42:13) Why agentic coding is driving people to work more and why that's not always good
(47:41) Armin's error-handling takeaways from working at Sentry
(50:32) How important is language choice from an error-handling perspective
(56:02) Why the current SDLC still doesn't prioritize error handling
(1:04:18) The challenges language designers face
(1:05:40) What Armin learned from working in startups and who thrives in that environment
(1:11:39) Rapid fire round

The Pragmatic Engineer deepdives relevant for this episode:

Production and marketing by https://penname.co/. For inquiries about sponsoring the podcast, email [email protected].

Get full access to The Pragmatic Engineer at newsletter.pragmaticengineer.com/subscribe

Thomas in't Veld, founder of Tasman Analytics, joined Yuliia and Dumke to discuss why data projects fail: teams obsess over tooling while ignoring proper data modeling and business alignment. Drawing on experience building analytics for 70-80 companies, Thomas explains why the best data model never changes unless the business changes, and how his team acts as "data therapists," forcing marketing and sales to agree on fundamental definitions. He shares his controversial take that data modeling sits more in analysis than in engineering. Another hot take: analytics engineering is merging back into data engineering. And showing off your DAG at meetups completely misses the point: business understanding is the critical differentiator, not your technology stack.

Summary: In this episode of the Data Engineering Podcast, Vijay Subramanian, founder and CEO of Trace, talks about metric trees: a new approach to data modeling that directly captures a company's business model. Vijay shares insights from his decade-long experience building data practices at Rent the Runway and explains how the modern data stack has led to a proliferation of dashboards without a coherent way for business consumers to reason about cause, effect, and action. He explores how metric trees differ from and interoperate with other data modeling approaches, serve as a backend for analytical workflows, and provide concrete examples like modeling Uber's revenue drivers and customer journeys. Vijay also discusses the potential of AI agents operating on metric trees to execute workflows, organizational patterns for defining inputs and outputs with business teams, and a vision for analytics that becomes invisible infrastructure embedded in everyday decisions.
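The Uber example mentioned above can be sketched as a tiny metric tree in Python. This is purely illustrative (the node structure, metric names, and input values are hypothetical, not Trace's implementation): each metric is either a raw input or a roll-up of its child metrics.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MetricNode:
    """A node in a metric tree: a raw input metric or a derived metric."""
    name: str
    value: Optional[float] = None            # set only for leaf (input) metrics
    children: list = field(default_factory=list)
    combine: str = "multiply"                # how child metrics roll up

    def compute(self) -> float:
        if not self.children:                # leaf: return the raw input
            assert self.value is not None, f"missing input for {self.name}"
            return self.value
        vals = [c.compute() for c in self.children]
        if self.combine == "multiply":
            out = 1.0
            for v in vals:
                out *= v
            return out
        return sum(vals)                     # "add" roll-up

# Hypothetical driver tree for ride revenue:
#   revenue = trips * avg_fare, and trips = active_riders * trips_per_rider
revenue = MetricNode("revenue", children=[
    MetricNode("trips", children=[
        MetricNode("active_riders", value=1000),
        MetricNode("trips_per_rider", value=2.5),
    ]),
    MetricNode("avg_fare", value=12.0),
])

print(revenue.compute())  # 1000 * 2.5 * 12.0 = 30000.0
```

The point of the structure is that a change in a top-level metric can be traced down to the specific input drivers that moved, which is what makes cause-and-effect reasoning possible for business consumers.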

Announcements

Hello and welcome to the Data Engineering Podcast, the show about modern data management.

Data teams everywhere face the same problem: they're forcing ML models, streaming data, and real-time processing through orchestration tools built for simple ETL. The result? Inflexible infrastructure that can't adapt to different workloads. That's why Cash App and Cisco rely on Prefect. Cash App's fraud detection team got what they needed: flexible compute options, isolated environments for custom packages, and seamless data exchange between workflows. Each model runs on the right infrastructure, whether that's high-memory machines or distributed compute. Orchestration is the foundation that determines whether your data team ships or struggles. ETL, ML model training, AI engineering, streaming: Prefect runs it all from ingestion to activation in one platform. Whoop and 1Password also trust Prefect for their data operations. If these industry leaders use Prefect for critical workflows, see what it can do for you at dataengineeringpodcast.com/prefect.

Data migrations are brutal. They drag on for months, sometimes years, burning through resources and crushing team morale. Datafold's AI-powered Migration Agent changes all that. Their unique combination of AI code translation and automated data validation has helped companies complete migrations up to 10 times faster than manual approaches. And they're so confident in their solution, they'll actually guarantee your timeline in writing. Ready to turn your year-long migration into weeks? Visit dataengineeringpodcast.com/datafold today for the details.

Your host is Tobias Macey and today I'm interviewing Vijay Subramanian about metric trees and how they empower more effective and adaptive analytics.

Interview
• Introduction
• How did you get involved in the area of data management?
• Can you describe what metric trees are and their purpose?
• How do metric trees relate to metric/semantic layers?
• What are the shortcomings of existing data modeling frameworks that prevent effective use of those assets?
• How do metric trees build on top of existing investments in dimensional data models?
• What are some strategies for engaging with the business to identify metrics and their relationships?
• What are your recommendations for storage, representation, and retrieval of metric trees?
• How do metric trees fit into the overall lifecycle of organizational data workflows?
• Creating any new data asset introduces overhead of maintenance, monitoring, and evolution. How do metric trees fit into existing testing and validation frameworks that teams rely on for dimensional modeling?
• What are some of the key differences in useful evaluation/testing that teams need to develop for metric trees?
• How do metric trees assist in context engineering for AI-powered self-serve access to organizational data?
• What are the most interesting, innovative, or unexpected ways that you have seen metric trees used?
• What are the most interesting, unexpected, or challenging lessons that you have learned while working on metric trees and operationalizing them at Trace?
• When is a metric tree the wrong abstraction?
• What do you have planned for the future of Trace and applications of metric trees?

Contact Info
• LinkedIn

Parting Question
• From your perspective, what is the biggest gap in the tooling or technology for data management today?

Closing Announcements
• Thank you for listening! Don't forget to check out our other shows. Podcast.init covers the Python language, its community, and the innovative ways it is being used. The AI Engineering Podcast is your guide to the fast-moving world of building AI systems.
• Visit the site to subscribe to the show, sign up for the mailing list, and read the show notes.
• If you've learned something or tried out a project from the show then tell us about it! Email [email protected] with your story.

Links
• Metric Tree
• Trace
• Modern Data Stack
• Hadoop
• Vertica
• Luigi
• dbt
• Ralph Kimball
• Bill Inmon
• Metric Layer
• Dimensional Data Warehouse
• Master Data Management
• Data Governance
• Financial P&L (Profit and Loss)
• EBITDA (Earnings before interest, taxes, depreciation and amortization)

The intro and outro music is from The Hug by The Freak Fandango Orchestra / CC BY-SA

podcast_episode
by Cris deRitis, Lisa Simon (Revelio Labs), Mark Zandi (Moody's Analytics), Marisa DiNatale (Moody's Analytics)

The Inside Economics team welcomes Lisa Simon, Chief Economist at Revelio Labs, for an unusual jobs Friday podcast, as the ongoing government shutdown prevented the release of the September employment report. Lisa details the new public labor statistics data that Revelio Labs recently began publishing in the wake of turmoil at the Bureau of Labor Statistics. The team discusses how private data sources can help fill the gaps left by the temporary absence of government data, and also dissects the current state of the labor market.

Guest: Lisa Simon – Chief Economist, Revelio Labs. For more about Lisa Simon, click here: https://www.reveliolabs.com/author/lisa-k-simon/

Explore the risks and realities shaping the economy in our new webinar, now streaming for free. U.S. Economic Outlook: Under Unprecedented Uncertainty. Watch here: https://events.moodys.com/mc68453-wbn-2025-mau25777-us-macro-outlook-precipice-recession?mkt_tok=OT…

Hosts: Mark Zandi – Chief Economist, Moody's Analytics; Cris deRitis – Deputy Chief Economist, Moody's Analytics; and Marisa DiNatale – Senior Director, Head of Global Forecasting, Moody's Analytics

Questions or comments? Please email us at [email protected]. We would love to hear from you. To stay informed and follow the insights of Moody's Analytics economists, visit Economic View.

Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.

podcast_episode
by Sia Zahedi (Global financial remittance company)

From launching AI products to modernizing legacy data stacks, we're going behind the scenes of data-driven transformation in financial remittance. In this episode, we sit down with Sia Zahedi, former CDO at a global financial remittance company, to get a candid look at the projects, challenges, and decisions that define data leadership in finance. If you've ever wondered what it's like to lead data strategy at a global financial company, this one is for you.

What You'll Learn:
• What the day-to-day of a CDO looks like
• Real-world use cases for AI in financial services
• The difference between launching AI prototypes and real products
• Career advice for aspiring CDOs and senior data leaders

🤝 Follow Sia on LinkedIn! Register for free to be part of the next live session: https://bit.ly/3XB3A8b

Follow us on socials: LinkedIn, YouTube, Instagram (Mavens of Data), Instagram (Maven Analytics), TikTok, Facebook, Medium, X/Twitter

How do you structure, industrialize, and scale a Data, Analytics & AI function within an international industrial company? What are the concrete levers for reconciling operational efficiency with measurable business impact? Matthieu and David will answer these questions by sharing the experience of the Somfy Group.

Over the past three years, Matthieu has supported the ramp-up of the data function at Somfy, moving successively through the Reporting Factory and the Digital Factory before leading a unified reorganization around the Analytics, Artificial Intelligence, and Data domains. This journey toward a "Data, Analytics & AI Factory" made it possible to build a model centered on value creation while guaranteeing operational excellence.

During this session, he will walk through the major stages of this transformation and the structuring decisions made to scale up:

• How do you organize agile governance around data, analytics, and AI expertise?

• What challenges arise on the human, technical, and business fronts?

• How do you improve projects' time to market while ensuring their relevance and industrialization?

At the heart of the talk: concrete examples of multidisciplinary squads deployed on high-impact use cases, the benefits observed, and the key lessons for durably embedding a data culture.

A pragmatic, field-oriented session for any organization that wants to make Data, Analytics & AI a true lever for lasting transformation.

Customer case study: Aésio Mutuelle revolutionizes how it uses its data with Qlik Talend Data Integration and Qlik Cloud Analytics

Data use cases in insurance companies are numerous and sensitive: from routine operational case management to anti-money-laundering efforts, including legal obligations such as AML/CFT. But managing them tends to create silos over time.

For this health and protection insurance specialist, deploying Qlik's modern data platform to manage data end to end transformed the daily work of business teams by giving them full autonomy.

Want to learn more? The entire Qlik team will be at booth D38 for live demos, use cases, and expert advice.

AI is no longer a distant concept; it's here, reshaping the way we live and work. From coding and customer service to creative content, AI is already taking on tasks once thought to be uniquely human. But what does that mean for the future of work, and more importantly, for the role of leaders? In this solo episode of Hub & Spoken, Jason Foster, CEO and Founder of Cynozure, explores the real implications of AI on jobs, leadership, and human value. Drawing lessons from history (automation, shipping containers, even the rise of personal computing), Jason argues that every wave of technology has shifted humans "up a level of abstraction," moving us from doing to designing, directing, and innovating. He sets out four essential human traits for thriving in the age of AI:

• Think bigger: focus on outcomes, strategy, and imagination
• Lead differently: provide clarity, orchestrate teams, and build culture
• Connect deeper: lean into empathy, context, and trust
• Grow and adapt: stay curious, resilient, and open to change

🎧 Tune in to hear Jason's take on how we can design the future we want to be part of.


Cynozure is a leading data, analytics and AI company that helps organisations to reach their data potential. It works with clients on data and AI strategy, data management, data architecture and engineering, analytics and AI, data culture and literacy, and data leadership. The company was named one of The Sunday Times' fastest-growing private companies in both 2022 and 2023 and recognised as The Best Place to Work in Data by DataIQ in 2023 and 2024. Cynozure is a certified B Corporation.   

Advanced Snowflake

As Snowflake's capabilities expand, staying updated with its latest features and functionalities can be overwhelming. The platform's rapid development gave rise to advanced tools like Snowpark and the Native App Framework, which are crucial for optimizing data operations but may seem complex to navigate. In this essential book, author Muhammad Fasih Ullah offers a detailed guide to understanding these sophisticated tools, ensuring you can leverage the full potential of Snowflake for data processing, application development, and deploying machine learning models at scale. You'll gain actionable insights and structured examples to transform your understanding and skills in handling advanced data scenarios within Snowflake. By the end of this book, you will:

• Grasp advanced features such as Snowpark, the Snowflake Native App Framework, and Iceberg tables
• Enhance your projects with geospatial functions for comprehensive geospatial analytics
• Interact with Snowflake using a variety of programming languages through Snowpark
• Implement and manage machine learning models effectively using Snowpark ML
• Develop and deploy applications within the Snowflake environment

This session, presented by Polar Analytics, will dive into the heart of the Big Data challenges facing e-commerce brands. Far from theoretical concepts, we will present a concrete use case demonstrating how to turn fragmented data into a unified, actionable business intelligence platform.

Many organizations talk about creating a Single Source of Truth (SSOT), but few manage to make it a lasting reality. In this session, Vira Douangphouxay, Director of Analytics Engineering at Vestiaire Collective, will share how her team designed and evolved an SSOT initiative from scratch, balancing technical scalability, cross-team alignment, and long-term governance.

You will hear concrete lessons learned: how to prioritize the most critical assets, structure responsibilities across BI, product, and business teams, and embed best practices in tools such as Coalesce, Snowflake, Catalog, and Google Sheets.

Tristan Mayer, General Manager Catalog at Coalesce, will also join to offer a complementary perspective on tool-supported best practices and lessons learned from other companies in the industry.

Whether you are starting your SSOT project or looking to sustain it, this session will give you a pragmatic view of what it really takes to unify your data definitions, reduce reporting inconsistencies, and restore trust in your analyses.

Learn how Trade Republic builds its analytical data stack as a modern, real-time Lakehouse with ACID guarantees. Using Debezium for change data capture, we stream database changes and events into our data lake. We leverage Apache Iceberg to ensure interoperability across our analytics platform, powering operational reporting, data science, and executive dashboards.
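The heart of such a pipeline is applying change-data-capture events as upserts and deletes against a keyed table. A minimal Python sketch of that merge logic follows; it is illustrative only, not Trade Republic's code. The event shape mimics Debezium's envelope (`op`, `before`, `after`), and a plain dict stands in for the Iceberg table that a real streaming job would write to:

```python
# Minimal sketch of applying CDC (change data capture) events to a keyed table.
# Debezium events carry an "op" field: "c" (create), "u" (update),
# "r" (snapshot read), "d" (delete), plus "before"/"after" row images.

def apply_cdc_event(table: dict, event: dict) -> None:
    """Merge one Debezium-style event into `table`, keyed by primary key 'id'."""
    op = event["op"]
    if op in ("c", "u", "r"):          # insert/update/snapshot => upsert
        row = event["after"]
        table[row["id"]] = row
    elif op == "d":                    # delete => drop the key
        table.pop(event["before"]["id"], None)

# Hypothetical event stream for an accounts table.
events = [
    {"op": "c", "before": None, "after": {"id": 1, "balance": 100}},
    {"op": "u", "before": {"id": 1, "balance": 100}, "after": {"id": 1, "balance": 250}},
    {"op": "c", "before": None, "after": {"id": 2, "balance": 40}},
    {"op": "d", "before": {"id": 2, "balance": 40}, "after": None},
]

accounts: dict = {}
for e in events:
    apply_cdc_event(accounts, e)

print(accounts)  # {1: {'id': 1, 'balance': 250}}
```

In production this merge runs as a streaming job committing upserts to Iceberg tables, whose atomic snapshot commits are what give the lakehouse its ACID guarantees.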

Discover, through a customer case study, how global cybersecurity specialist Exclusive Networks automated its revenue forecasts with an Alteryx workflow designed by Prime Analytics.

This live demonstration will show you how to go from data consolidation to multi-model forecasts (ARIMA, ETS), all the way to automatically generated management commentary. A 100% actionable use case: no code, but real ROI.
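For intuition about what an ETS-family forecast does, here is the simplest member, simple exponential smoothing, in pure Python. This is an illustrative sketch (the alpha and the sample series are made up), not the Alteryx workflow from the session:

```python
def ses_forecast(series, alpha=0.5, horizon=3):
    """Simple exponential smoothing: level = alpha*y + (1-alpha)*prev_level.
    SES has no trend or seasonal term, so its forecast is the final level,
    repeated flat for every future step."""
    level = series[0]                      # initialize level with first value
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return [level] * horizon

history = [100, 104, 102, 108, 110]        # e.g., quarterly revenue in k€
print(ses_forecast(history, alpha=0.5, horizon=3))  # [107.5, 107.5, 107.5]
```

A real multi-model setup would also fit ARIMA and richer ETS variants (with trend and seasonality terms) and pick the winner by holdout error before generating commentary.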

In this demo, starting from a blank page and several data sources, we will go all the way to deploying a Data Analytics application augmented by LLMs, using these two products launched by OVHcloud in 2025.

OVHcloud DataPlatform: a unified solution that lets your teams manage your Data & Analytics projects end to end in self-service: collecting all types of data, then exploring, storing, and transforming them, all the way to building shared dashboards via dedicated applications. A pay-as-you-go service to accelerate deployment and simplify the management of data projects.

AI Endpoints: a serverless solution that lets developers easily integrate advanced AI features into their applications. With more than 40 state-of-the-art open-source models, including LLMs and generative AI, for uses such as conversational agents, voice models, code assistants, and more, AI Endpoints democratizes the use of AI regardless of an organization's size or sector.

All of this builds on the best open-source data standards (Apache Iceberg, Spark, SuperSet, Trino, Jupyter Notebooks…) in environments that respect your technological sovereignty.