In this session, we will investigate the options for realizing decentralized AI with zero-knowledge machine learning.
In this episode, Gaelle Helsmoortel joins us to discuss how to make AI truly deliver business impact, not just proof of concept. With over 25 years of experience spanning L’Oréal, startup leadership, and her current role at Dataroots, Gaelle shares her approach to turning business challenges into measurable value. She breaks down her proven 5Ps framework (Purpose, People, Process, Platform, and Performance) and explains how companies can bridge the gap between strategy and execution to generate real results. 🎧 You’ll learn:
Why most AI projects fail (and how to prevent it)
How to move from proof of concept to proof of value
How to align business purpose, data, and people for maximum impact
Why “purpose before platform” is key to successful AI adoption
Whether you’re a business leader, strategist, or data professional, this episode will help you understand how to make AI work for business and deliver tangible results. 🔗 Connect with Gaëlle: 🌐 Website & Newsletter: Generative Booster – The Game Changer List
▶️ YouTube Channel: @GenerativeBooster
Thu, Oct 23, 2025 | 12 PM EST | 60-minute live webinar on AI vs. AI in ethical hacking; how AI is reshaping cybersecurity and defenses.
In this episode, we explore how data science is helping researchers simulate and understand some of the most extreme physical events on Earth, from floods in Texas to hypersonic flight. Our guests are Stephen Baek, a leading expert in geometric deep learning and associate professor of data science at the University of Virginia, and Jack Beerman, a Ph.D. student whose work is already shaping real-world applications.
Together, they discuss how AI is transforming fields like weather forecasting, materials design, sports performance, and military innovation—and why graduate researchers like Jack are essential to moving this work forward.
There's no shortage of technical content for data engineers, but a massive gap exists when it comes to the non-technical skills required to advance beyond a senior role. I sit down with Yordan Ivanov, Head of Data Engineering and writer of "Data Gibberish," to talk about this disconnect. We dive into his personal journey of failing as a manager the first time, learning the crucial "people" skills, and his current mission to help data engineers learn how to speak the language of business. Key areas we explore:
The Senior-Level Content Gap: Yordan explains why his non-technical content on career strategy and stakeholder communication gets "terrible" engagement compared to technical posts, even though it's what's needed to advance.
The Managerial Trap: Yordan's candid story about his first attempt at management, where he failed because he cared only about code and wasn't equipped for the people-centric aspects and politics of the role.
The Danger of AI Over-reliance: A deep discussion on how leaning too heavily on AI can prevent the development of fundamental thinking and problem-solving skills, both in coding and in life.
The Maturing Data Landscape: We reflect on the end of the "modern data stack euphoria" and what the wave of acquisitions means for innovation and the future of data tooling.
AI Adoption in Europe vs. the US: A look at how AI adoption is perceived as massive and mandatory in Europe, while US census data shows surprisingly low enterprise adoption rates.
How do you make a nanoparticle that tells you where it is and helps at the same time? In this episode, we dive into the chemistry behind polydiacetylene (PDA)—a polymer that changes colour when it senses temperature, pH, or stress.
Researchers combined PDA with biodegradable poly(glycerol adipate) to create self-reporting nanoparticles that:
Change colour from blue to red under stress or heat
Track cells and nematodes without any added fluorescent dyes
Degrade naturally via enzymatic action
Carry drugs like usnic acid for therapeutic delivery
It’s a step toward theranostic polymers—materials that diagnose and treat simultaneously, glowing as they go. Even C. elegans joined the test, confirming safe uptake and real-time visibility.
📖 Based on the research article: “Tailoring the Properties of Polydiacetylene Nanosystems for Enhanced Cell Tracking Through Poly(glycerol Adipate) Blending: an In Vitro and In Vivo Investigation” Benedetta Brugnoli, Eleni Axioti, Philippa L. Jacob, Nana A. Berfi, Lei Lei, Benoit Couturaud, Veeren M. Chauhan, Robert J. Cavanagh, Luciano Galantini, Iolanda Francolini & Vincenzo Taresco Published in Macromolecular Chemistry and Physics (2025) 🔗 https://doi.org/10.1002/macp.202500259
🎧 Subscribe to the WOrM Podcast for more bright ideas in molecular sensing, smart polymers, and organism-level science.
This podcast is generated with artificial intelligence and curated by Veeren. If you’d like your publication featured on the show, please get in touch.
📩 More info: 🔗 www.veerenchauhan.com 📧 [email protected]
Help us become the #1 Data Podcast by leaving a rating & review! We are 67 reviews away! Data meets music 🎶 — Avery sits down with Chris Reba, a data analyst who’s studied over 1 million songs, to reveal what the numbers say about how hits are made. From uncovering Billboard chart fraud to exploring how TikTok reshaped music, this episode breaks down the art and science behind every beat. 💌 Join 30k+ aspiring data analysts & get my tips in your inbox weekly 👉 https://www.datacareerjumpstart.com/newsletter 🆘 Feeling stuck in your data journey? Come to my next free "How to Land Your First Data Job" training 👉 https://www.datacareerjumpstart.com/training 👩💻 Want to land a data job in less than 90 days? 👉 https://www.datacareerjumpstart.com/daa 👔 Ace The Interview with Confidence 👉 https://www.datacareerjumpstart.com/interviewsimulator ⌚ TIMESTAMPS 00:00 - Intro: How Chris analyzed 1M+ songs using data 01:10 - What data reveals about hit songs and music trends 03:30 - Combining qualitative and quantitative analysis 07:00 - The 1970s Billboard chart fraud explained 10:45 - Why key changes disappeared from modern pop 13:30 - How hip-hop changed song structure and sound 14:10 - TikTok’s influence on the music industry 16:10 - Inside Chris’s open-source music dataset 22:10 - Best tools for music data analysis (SQL, Python, Datawrapper) 27:45 - Advice for aspiring music data analysts 🔗 CONNECT WITH CHRIS 📕 Order Chris's Book: https://www.bloomsbury.com/us/uncharted-territory-9798765149911 📊 Check out Chris's Music Dataset: https://docs.google.com/spreadsheets/d/1j1AUgtMnjpFTz54UdXgCKZ1i4bNxFjf01ImJ-BqBEt0/edit?gid=1974823090#gid=1974823090 💌 Subscribe to Chris's Newsletter: https://www.cantgetmuchhigher.com 📲 Follow Chris on TikTok: https://www.tiktok.com/@cdallarivamusic 🔗 CONNECT WITH AVERY 🎥 YouTube Channel 🤝 LinkedIn 📸 Instagram 🎵 TikTok 💻 Website Mentioned in this episode: Join the last cohort of 2025!
The LAST cohort of The Data Analytics Accelerator for 2025 kicks off on Monday, December 8th and enrollment is officially open!
To celebrate the end of the year, we’re running a special End-of-Year Sale, where you’ll get: ✅ A discount on your enrollment 🎁 6 bonus gifts, including job listings, interview prep, AI tools + more
If your goal is to land a data job in 2026, this is your chance to get ahead of the competition and start strong.
👉 Join the December Cohort & Claim Your Bonuses: https://www.datacareerjumpstart.com/daa
This week, I’m showing you exactly how I used AI agents to fix my job hunt — no hype, just results. I was juggling dozens of job applications, interviews, and follow-ups until I built three small agents that acted like my personal job search team. In this episode, I do a live demo of:
A Researcher Agent that finds company insights automatically
A Writer Agent that drafts personal outreach messages
A Reviewer Agent that polishes tone and clarity
Together, they turned hours of chaos into minutes of clear progress. You’ll see how these agents plan, collaborate, and improve your workflow — and how you can build your own version tonight using just ChatGPT or any LLM platform. By the end, you’ll understand what makes agents powerful: planning, memory, and feedback.
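The researcher-writer-reviewer setup described above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the host's actual implementation: `call_llm` is a hypothetical placeholder you would replace with any real LLM client, and the agent names and prompts are invented for the example.

```python
def call_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real model call; swap in your provider's SDK.
    return f"[model output for: {prompt[:40]}...]"

def researcher(company: str) -> str:
    """Researcher Agent: gather company insights to ground the outreach."""
    return call_llm(f"Summarize recent news and priorities for {company}.")

def writer(company: str, research: str) -> str:
    """Writer Agent: draft a personal outreach message from the research."""
    return call_llm(f"Write a short outreach note to {company}. Context: {research}")

def reviewer(draft: str) -> str:
    """Reviewer Agent: polish tone and clarity (the feedback step)."""
    return call_llm(f"Improve tone and clarity of this draft: {draft}")

def job_search_pipeline(company: str) -> str:
    research = researcher(company)      # planning: gather context first
    draft = writer(company, research)   # memory: pass context forward
    return reviewer(draft)              # feedback: refine the result

print(job_search_pipeline("ExampleCorp"))
```

Each agent is just a focused prompt plus the output of the previous step, which is the core of the planning/memory/feedback loop the episode describes.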
🔗 Connect with Me: Free Email Newsletter | Website: Data & AI with Mukundan | GitHub: https://github.com/mukund14 | Twitter/X: @sankarmukund475 | LinkedIn: Mukundan Sankar | YouTube: Subscribe
The Data Hackers News is live!! The hottest topics of the week, with the top news in Data, AI, and Technology, which you can also find in our weekly newsletter, now on the Data Hackers podcast!! Press play and listen to this week's Data Hackers News! To keep up with everything happening in the data field, subscribe to the weekly newsletter: https://www.datahackers.news/ Meet the Data Hackers News commentators: Monique Femme. Other Data Hackers channels: Site | LinkedIn | Instagram | TikTok | YouTube
AI-Driven Software Testing explores how Artificial Intelligence (AI) and Machine Learning (ML) are revolutionizing quality engineering (QE), making testing more intelligent, efficient, and adaptive. The book begins by examining the critical role of QE in modern software development and the paradigm shift introduced by AI/ML. It traces the evolution of software testing, from manual approaches to AI-powered automation, highlighting key innovations that enhance accuracy, speed, and scalability. Readers will gain a deep understanding of quality engineering in the age of AI, comparing traditional and AI-driven testing methodologies to uncover their advantages and challenges. Moving into practical applications, the book delves into AI-enhanced test planning, execution, and defect management. It explores AI-driven test case development, intelligent test environments, and real-time monitoring techniques that streamline the testing lifecycle. Additionally, it covers AI’s impact on continuous integration and delivery (CI/CD), predictive analytics for failure prevention, and strategies for scaling AI-driven testing across cloud platforms. Finally, it looks ahead to the future of AI in software testing, discussing emerging trends, ethical considerations, and the evolving role of QE professionals in an AI-first world. With real-world case studies and actionable insights, AI-Driven Software Testing is an essential guide for QE engineers, developers, and tech leaders looking to harness AI for smarter, faster, and more reliable software testing. What you will learn: • What are the key principles of AI/ML-driven quality engineering • What is intelligent test case generation and adaptive test automation • Explore predictive analytics for defect prevention and risk assessment • Understand integration of AI/ML tools in CI/CD pipelines Who this book is for: Quality Engineers looking to enhance software testing with AI-driven techniques. 
Data Scientists exploring AI applications in software quality assurance and engineering. Software developers and engineers seeking to integrate AI/ML into testing and automation workflows.
Unlock the full financial potential of your Snowflake environment. Learn how to cut costs, boost performance, and take control of your cloud data spend with FinOps for Snowflake—your essential guide to implementing a smart, automated, and Snowflake-optimized FinOps strategy. In today’s data-driven world, financial optimization on platforms like Snowflake is more critical than ever. Whether you're just beginning your FinOps journey or refining mature practices, this book provides a practical roadmap to align Snowflake usage with business goals, reduce costs, and improve performance—without compromising agility. Grounded in real-world case studies and packed with actionable strategies, FinOps for Snowflake shows how leading organizations are transforming their environments through automation, governance, and cost intelligence. You'll learn how to apply proven techniques for architecture tuning, workload and storage efficiency, and performance optimization—empowering you to make smarter, data-driven decisions. What You Will Learn Master FinOps principles tailored for Snowflake’s architecture and pricing model Enable collaboration across finance, engineering, and business teams Deliver real-time cost insights for smarter decision-making Optimize compute, storage, and Snowflake AI and ML services for efficiency Leverage Snowflake Cortex AI and Adaptive Warehouse/Compute for intelligent cost governance Apply proven strategies to achieve operational excellence and measurable savings Who this Book is For Data professionals, cloud engineers, FinOps practitioners, and finance teams seeking to improve cost visibility, operational efficiency, and financial accountability in Snowflake environments.
The journey from startup to billion-dollar enterprise requires more than just a great product—it demands strategic alignment between sales and marketing. How do you identify your ideal customer profile when you're just starting out? What data signals help you find the twins of your successful early adopters? With AI now automating everything from competitive analysis to content creation, the traditional boundaries between departments are blurring. But what personality traits should you look for when building teams that can scale with your growth? And how do you ensure your data strategy supports rather than hinders your AI ambitions in this rapidly evolving landscape? Denise Persson is CMO at Snowflake and has 20 years of technology marketing experience at high-growth companies. Prior to joining Snowflake, she served as CMO for Apigee, an API platform company that went public in 2015 and was acquired by Google in 2016. She began her career at collaboration software company Genesys, where she built and led a global marketing organization and helped guide the company through its expansion to a successful IPO and eventual acquisition. Denise holds a BA in Business Administration and Economics from Stockholm University and an MBA from Georgetown University. Chris Degnan is the former CRO at Snowflake and has over 15 years of enterprise technology sales experience. Before working at Snowflake, Chris served as the AVP of the West at EMC, and prior to that as VP Western Region at Aveksa, where he helped grow the business 250% year-over-year. Before Aveksa, Chris spent eight years at EMC and managed a team responsible for 175 select accounts. Prior to EMC, Chris worked in enterprise sales at Informatica and Covalent Technologies (acquired by VMware). He holds a BA from the University of Delaware.
In the episode, Richie, Denise, and Chris explore the journey to a billion-dollar ARR, the importance of customer obsession, aligning sales and marketing, leveraging data for decision-making, the role of AI in scaling operations, and much more. Links Mentioned in the Show: Snowflake | Snowflake BUILD | Connect with Denise and Chris | Snowflake is FREE on DataCamp this week | Related Episode: Adding AI to the Data Warehouse with Sridhar Ramaswamy, CEO at Snowflake | Rewatch RADAR AI | New to DataCamp? Learn on the go using the DataCamp mobile app | Empower your business with world-class data and AI skills with DataCamp for business
Ryan Dolley, VP of Product Strategy at GoodData and co-host of the Super Data Brothers podcast, joined Yuliia and Dumke to discuss the dbt-Fivetran merger and what it signals about the modern data stack's consolidation phase. After 16 years in BI and analytics, Ryan explains why BI adoption has been stuck at 27% for a decade and why simply adding AI chatbots won't solve it. He argues that at large enterprises, purchasing new software is actually the only viable opportunity to change company culture - not because of the features, but because it forces operational pauses and new ways of working. Ryan shares his take that AI will struggle with BI because LLMs are trained to give emotionally satisfying answers rather than accurate ones. Connect with Ryan Dolley on LinkedIn.
Dive into the world of architecting intelligent software with this comprehensive guide. This book explores the principles and practices required to integrate artificial intelligence into existing architectures to deliver scalable and robust AI-driven systems. By the end of this journey, you will be equipped with the knowledge and skills to design and optimize next-generation AI applications. What this Book will help me do Effectively integrate AI-driven components within traditional software systems while maintaining scalability and performance. Understand key architectural risks and how to address them, ensuring resilience and cost-efficiency. Apply architectural principles through hands-on exercises and real-world case studies to solidify your learning. Master AI and ML concepts crucial to modern architectures, such as inference and decision-making mechanisms. Develop actionable architectural strategies for implementing user-centric, high-performance AI systems. Author(s) Richard D Avila and Imran Ahmad bring decades of experience in software architecture and AI technologies. Richard has worked extensively in crafting AI-integrated solutions for enterprise-grade systems, while Imran specializes in making complex AI accessible and manageable for developers. Their combined expertise provides an authoritative and approachable guide to AI systems architecture. Who is it for? This book is ideal for software architects and system designers looking to understand and implement AI within their architectures. It is also a valuable resource for CTOs, VPs of Engineering, and professionals on the cusp of technical leadership who want to keep their systems competitive. Intermediate-level developers aspiring to grow into architectural roles will gain actionable insights into the principles of AI-driven systems design. Beginner architects with a passion for AI technologies will find this book to be a robust starting point.
Summary In this episode Kate Shaw, Senior Product Manager for Data and SLIM at SnapLogic, talks about the hidden and compounding costs of maintaining legacy systems—and practical strategies for modernization. She unpacks how “legacy” is less about age and more about when a system becomes a risk: blocking innovation, consuming excess IT time, and creating opportunity costs. Kate explores technical debt, vendor lock-in, lost context from employee turnover, and the slippery notion of “if it ain’t broke,” especially when data correctness and lineage are unclear. She digs into governance, observability, and data quality as foundations for trustworthy analytics and AI, and why exit strategies for system retirement should be planned from day one. The discussion covers composable architectures to avoid monoliths and big-bang migrations, how to bridge valuable systems into AI initiatives without lock-in, and why clear success criteria matter for AI projects. Kate shares lessons from the field on discovery, documentation gaps, parallel run strategies, and using integration as the connective tissue to unlock data for modern, cloud-native and AI-enabled use cases. She closes with guidance on planning migrations, defining measurable outcomes, ensuring lineage and compliance, and building for swap-ability so teams can evolve systems incrementally instead of living with a “bowl of spaghetti.”
Announcements Hello and welcome to the Data Engineering Podcast, the show about modern data management. Data teams everywhere face the same problem: they're forcing ML models, streaming data, and real-time processing through orchestration tools built for simple ETL. The result? Inflexible infrastructure that can't adapt to different workloads. That's why Cash App and Cisco rely on Prefect. Cash App's fraud detection team got what they needed - flexible compute options, isolated environments for custom packages, and seamless data exchange between workflows. Each model runs on the right infrastructure, whether that's high-memory machines or distributed compute. Orchestration is the foundation that determines whether your data team ships or struggles. ETL, ML model training, AI Engineering, Streaming - Prefect runs it all from ingestion to activation in one platform. Whoop and 1Password also trust Prefect for their data operations. If these industry leaders use Prefect for critical workflows, see what it can do for you at dataengineeringpodcast.com/prefect. Data migrations are brutal. They drag on for months—sometimes years—burning through resources and crushing team morale. Datafold's AI-powered Migration Agent changes all that. Their unique combination of AI code translation and automated data validation has helped companies complete migrations up to 10 times faster than manual approaches. And they're so confident in their solution, they'll actually guarantee your timeline in writing. Ready to turn your year-long migration into weeks?
Visit dataengineeringpodcast.com/datafold today for the details. Your host is Tobias Macey and today I'm interviewing Kate Shaw about the true costs of maintaining legacy systems. Interview:
Introduction
How did you get involved in the area of data management?
What are your criteria for when a given system or service transitions to being "legacy"?
In order for any service to survive long enough to become "legacy" it must be serving its purpose and providing value. What are the common factors that prompt teams to deprecate or migrate systems?
What are the sources of monetary cost related to maintaining legacy systems while they remain operational?
Beyond monetary cost, economics also has the concept of "opportunity cost". What are some of the ways that manifests in data teams who are maintaining or migrating from legacy systems?
How does that loss of productivity impact the broader organization?
How does the process of migration contribute to issues around data accuracy, reliability, etc., as well as contributing to potential compromises of security and compliance?
Once a system has been replaced, it needs to be retired. What are some of the costs associated with removing a system from service?
What are the most interesting, innovative, or unexpected ways that you have seen teams address the costs of legacy systems and their retirement?
What are the most interesting, unexpected, or challenging lessons that you have learned while working on legacy systems migration?
When is deprecation/migration the wrong choice?
How have evolutionary architecture patterns helped to mitigate the costs of system retirement?
Contact Info: LinkedIn
Parting Question: From your perspective, what is the biggest gap in the tooling or technology for data management today?
Closing Announcements: Thank you for listening! Don't forget to check out our other shows. Podcast.__init__ covers the Python language, its community, and the innovative ways it is being used.
The AI Engineering Podcast is your guide to the fast-moving world of building AI systems. Visit the site to subscribe to the show, sign up for the mailing list, and read the show notes. If you've learned something or tried out a project from the show then tell us about it! Email [email protected] with your story. Links: SnapLogic | SLIM (SnapLogic Intelligent Modernizer) | Opportunity Cost | Sunk Cost Fallacy | Data Governance | Evolutionary Architecture. The intro and outro music is from The Hug by The Freak Fandango Orchestra / CC BY-SA
Debate over the growth picture continues, with global expenditure data through September showing resilience while the labor market remains a key area of weakness. Whether wealth effects will cushion the coming purchasing-power squeeze in the US is unclear. But we maintain that there is a tension in risk markets pricing both resilience and a Fed that returns rates to neutral, with inflation looking sticky absent a more material soft patch in growth.
Speakers:
Bruce Kasman
Joseph Lupton
This communication is provided for information purposes only. Institutional clients please visit www.jpmm.com/research/disclosures for important disclosures. © 2025 JPMorgan Chase & Co. All rights reserved. This material or any portion hereof may not be reprinted, sold or redistributed without the written consent of J.P. Morgan. It is strictly prohibited to use or share without prior written consent from J.P. Morgan any research material received from J.P. Morgan or an authorized third-party (“J.P. Morgan Data”) in any third-party artificial intelligence (“AI”) systems or models when such J.P. Morgan Data is accessible by a third-party. It is permissible to use J.P. Morgan Data for internal business purposes only in an AI system or model that protects the confidentiality of J.P. Morgan Data so as to prevent any and all access to or use of such J.P. Morgan Data by any third-party.
Data intelligence is reshaping entire industries, and journalism is one of the most fascinating examples of this transformation! In this episode, we tell how EPTV, one of the largest affiliates of Rede Globo, is reinventing news production with the creation of a Data Journalism Hub in partnership with Snowflake. It is a project that combines technology, artificial intelligence, and data analysis to turn public information into more precise, agile, and relevant reporting. We explore how this structure came about, the challenges of implementing a data-driven culture, and Snowflake's role in automating access to information, integrating sources, and using contextual AI to anticipate trends and support editorial decisions. If you want to understand how data and AI are shaping the future of journalism and inspiring new ways of telling stories, this episode is for you! Remember, you can find all the Data Hackers community podcasts on Spotify, iTunes, Google Podcasts, Castbox, and many other platforms. Guests: Marcelo Manzano, Solutions Engineering team manager at Snowflake Brasil; Bruno Woth, Data and Development Manager at EPTV. Our Data Hackers panel: Monique Femme, Head of Community Management at Data Hackers; Gabriel Lages, co-founder of Data Hackers and Data & Analytics Sr. Director at Hotmart. References: GRUPO EP - Empresas Pioneiras | Snowflake