talk-data.com

Topic: GenAI (Generative AI)

Tags: ai, machine_learning, llm

1517 tagged activities

Activity Trend: peak of 192 activities per quarter (2020-Q1 to 2026-Q1)

Activities

1517 activities · Newest first

Speakers will present how to efficiently manage spare-parts usage and maintenance execution using data from ERP Manufacturing, ERP HR, and PLM combined with GenAI.


Artificial intelligence and machine learning projects require access to centralized, high-quality data. Yet managing data pipelines in-house often leads to added complexity, inefficiencies, and delays that slow innovation. This session will show how automation transforms the flow of data, simplifying ingestion, normalization, and preparation to feed AI and ML applications effectively.

You will discover how automated pipelines, managed data lake services, and rapid deployment models help you move from experimentation to production faster while generating real business value.

On the agenda:

● Strategies for centralizing and normalizing data from diverse sources

● How to obtain reliable, high-quality data through automation

● Levers for moving AI from experimentation to production faster with scalable data flows

● Live demo: watch how to build a GenAI application and create a chatbot that delivers personalized recommendations, a concrete illustration of the value of centralized, reliable data (a minimal sketch of this pattern appears below).
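As a rough illustration of the demo's idea only, the sketch below grounds a chatbot's recommendations in a small centralized table. It is not the session's actual demo: the OpenAI client, the gpt-4o-mini model name, the catalog columns, and the prompt wording are all assumptions.

```python
# Hypothetical sketch: ground a recommendation chatbot in a centralized product table.
# The table columns, the model name, and the prompt wording are assumptions.
import pandas as pd
from openai import OpenAI  # pip install openai; reads OPENAI_API_KEY from the environment

catalog = pd.DataFrame({
    "product": ["Trail shoes", "Road shoes", "Rain jacket"],
    "category": ["running", "running", "outdoor"],
    "rating": [4.6, 4.2, 4.8],
})

def recommend(question: str, interest: str) -> str:
    # Retrieve a few relevant rows from the centralized dataset to ground the answer
    context = catalog[catalog["category"] == interest].to_string(index=False)
    client = OpenAI()
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided catalog data."},
            {"role": "user", "content": f"Catalog:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content

print(recommend("Which shoes should I buy for muddy trails?", "running"))
```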

Discover how collaboration between business and IT teams becomes a powerful lever for innovation while meeting governance and security obligations. Through concrete cases, we will show how business users can integrate GenAI into their data analysis processes without writing code, gaining efficiency and tackling innovative use cases. All of this is complemented by an IT perspective on ensuring responsible use of LLM models.

An inspiring session for your future data & AI projects!

Data leaders around the world are now encouraged to turn to GenAI and AI agents to drive innovation, growth, and productivity in their organizations, but without relevant data, AI projects risk getting stuck at the experimentation stage.

Discover how well-prepared data powers AI that is relevant, responsible, robust, and scalable.

Go behind the scenes of a strategic partnership: the alliance between BPCE Vie, a major player in the French insurance market, and Zaion, a pioneer in voice AI applied to customer relations. In this talk, discover how the GenAI Agent Assist solution is transforming the customer experience, notably through automated, real-time generation of post-call summaries for advisors.

This case study highlights how two French companies concretely rely on a sovereign, ethical, next-generation artificial intelligence technology to shape the future of customer relations, and illustrates the strength of "Made in France" innovation.
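For readers who want a feel for the underlying mechanics, here is a minimal sketch of automated post-call summarization with a generic LLM API. It is not Zaion's Agent Assist implementation; the OpenAI client, the model name, the transcript, and the prompt are assumptions.

```python
# Illustrative sketch only: post-call summarization with a generic LLM API.
# This is not the Agent Assist product described above.
from openai import OpenAI

transcript = """Advisor: Hello, how can I help?
Customer: I would like to update the beneficiary on my life-insurance contract.
Advisor: Of course, I will send you the form and confirm by email."""

client = OpenAI()
resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system",
         "content": "Summarize the call for the advisor: reason, actions taken, next steps."},
        {"role": "user", "content": transcript},
    ],
)
print(resp.choices[0].message.content)  # in production, the summary would feed the CRM
```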

Repetita Non Iuvant: Why Generative AI Models Cannot Feed Themselves

As AI floods the digital landscape with content, what happens when it starts repeating itself? This talk explores model collapse, a progressive erosion where LLMs and image generators loop on their own results, hindering the creation of novel output.

We will show how self-training leads to bias and loss of diversity, examine the causes of this degradation, and quantify its impact on model creativity. Finally, we will also present concrete strategies to safeguard the future of generative AI, emphasizing the critical need to preserve innovation and originality.
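A toy numerical experiment conveys the intuition (it is not taken from the talk): a Gaussian "model" repeatedly refit on its own finite samples steadily loses variance, a minimal stand-in for the loss of diversity described above.

```python
# Toy illustration of model collapse: refitting a Gaussian on its own generated
# samples makes its standard deviation, i.e. its output diversity, shrink over generations.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0   # generation-0 "real data" distribution
n = 20                 # small sample size per generation makes the effect visible

for generation in range(1, 51):
    samples = rng.normal(mu, sigma, n)         # the model generates its own training data
    mu, sigma = samples.mean(), samples.std()  # the next model is fit only on that data
    if generation % 10 == 0:
        print(f"generation {generation:2d}: sigma = {sigma:.3f}")  # sigma drifts toward 0
```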

By the end of this talk, attendees will gain insights into the practical implications of model collapse, understanding its impact on content diversity and the long-term viability of AI.

Documents Meet LLMs: Tales from the Trenches

Processing documents with LLMs comes with unexpected challenges: handling long inputs, enforcing structured outputs, catching hallucinations, and recovering from partial failures. In this talk, we’ll cover why large context windows are not a silver bullet, why chunking is deceptively hard, and how to design inputs and outputs that allow for intelligent retries. We'll also share practical prompting strategies, discuss OCR and parsing tools, compare different LLMs (and their cloud APIs), and highlight real-world insights from our experience developing production GenAI applications across multiple document processing scenarios.
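As a sketch of the pattern described here, the snippet below combines overlapping chunking, a strict JSON output contract, and a retry when validation fails. The `call_llm` function, the field names, and the prompt are hypothetical placeholders, not any specific API.

```python
# Sketch of the document-processing pattern: overlapping chunks, a JSON output
# contract, and a retry when the model's answer fails validation.
# `call_llm` is a hypothetical stand-in for whichever LLM client you use.
import json

def chunk(text: str, size: int = 2000, overlap: int = 200) -> list[str]:
    """Split long input into overlapping windows so no clause is cut off silently."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def extract_invoice_fields(chunk_text: str, call_llm, max_retries: int = 2) -> dict:
    prompt = ("Return ONLY a JSON object with keys 'invoice_number' and 'total_amount'.\n\n"
              + chunk_text)
    for _ in range(max_retries + 1):
        raw = call_llm(prompt)                   # hypothetical LLM call
        try:
            data = json.loads(raw)
            if {"invoice_number", "total_amount"} <= data.keys():
                return data                      # structured output passed validation
        except json.JSONDecodeError:
            pass
        prompt += "\n\nYour previous answer was not valid JSON. Try again."
    raise ValueError("LLM failed to return valid structured output")
```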

The cost allocation process in Alteryx relies on automated workflows to accurately distribute FTE costs across activity centers according to predefined criteria such as headcount or volumes, while respecting specific cost allocation rules and KPI calculations.

1. Data ingestion and preparation

Alteryx connects to multiple sources (for example ERP, CRM, cloud storage) to extract data related to FTEs, costs, and volumes. The process aggregates, prepares, and aligns these disparate datasets to create a unified cost base.

2. Data quality improvement

Dynamic transformation rules are applied to ensure consistency, remove duplicates, handle missing values, and standardize data types. Data profiling tools provide visibility into anomalies and outliers that could affect the allocation logic.

3. Cost allocation logic

This step defines flexible allocation rules and validation steps, from simple ratios to dynamic rules driven by business needs, based on cost drivers such as FTEs and volumes, to guarantee accurate KPI calculations.
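A minimal pandas sketch of the ratio-based driver logic, assuming invented cost pools, activity centers, and headcount figures (an illustration of the allocation idea, not the actual Alteryx workflow):

```python
# Allocate each cost pool to activity centers proportionally to a driver (headcount).
# All figures and column names are invented for illustration.
import pandas as pd

costs = pd.DataFrame({"cost_pool": ["IT support", "Facilities"],
                      "total_cost": [120_000.0, 80_000.0]})
drivers = pd.DataFrame({"cost_pool": ["IT support"] * 2 + ["Facilities"] * 2,
                        "activity_center": ["Sales", "Ops", "Sales", "Ops"],
                        "headcount": [30, 90, 25, 75]})

# Allocation ratio = center headcount / total headcount served by that cost pool
drivers["ratio"] = (drivers["headcount"]
                    / drivers.groupby("cost_pool")["headcount"].transform("sum"))
allocation = drivers.merge(costs, on="cost_pool")
allocation["allocated_cost"] = allocation["total_cost"] * allocation["ratio"]
print(allocation[["cost_pool", "activity_center", "allocated_cost"]])
```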

4. Generative AI integration

Generative AI functions (for example via OpenAI or Alteryx's Gen AI tools) enhance the workflow by enabling:

Automatic generation of data schemas adapted to a target format; a minimal sketch of this idea appears after this list.

Copilot-style assistance for building transformations from natural-language instructions.

Creation of dynamic allocation rules.
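As a hedged illustration of the schema-generation idea above, the sketch below asks an LLM (here via the OpenAI Python SDK, one of the options named in the text) to propose a mapping from raw export columns to a target schema. The column names, model, and prompt are invented.

```python
# Hypothetical sketch: LLM-assisted mapping of source columns onto a target schema.
import json
from openai import OpenAI

source_columns = ["emp_cnt", "cc_code", "tot_eur"]
target_schema = ["headcount", "cost_center", "total_cost"]

client = OpenAI()
resp = client.chat.completions.create(
    model="gpt-4o-mini",
    response_format={"type": "json_object"},   # ask for machine-readable JSON
    messages=[{
        "role": "user",
        "content": ("Map these source columns " + json.dumps(source_columns)
                    + " to this target schema " + json.dumps(target_schema)
                    + ". Answer with a JSON object only."),
    }],
)
mapping = json.loads(resp.choices[0].message.content)  # e.g. {"emp_cnt": "headcount", ...}
print(mapping)
```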

5. Output and visualization

Final allocations can be exported to reporting tools, dashboards, or data lakes. Users can review allocation summaries, variances, and detailed views to support decision-making via custom analytic applications.

How to harness the full potential of GenAI while protecting a sensitive, business-critical document corpus.

With Infogreffe and the Conseil National des Greffiers, we built a Retrieval Augmented Generation (RAG) solution on AI Foundry, specifically architected to meet the requirements of the commercial courts.

In this workshop, we will show how complex legal documents can be indexed and queried securely, so that court clerks receive reliable, contextualized, and verifiable answers.
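To make the pattern concrete, here is a minimal, generic RAG retrieval sketch. It is not the Infogreffe/AI Foundry architecture: the sentence-transformers model, the sample clauses, and the prompt wording are assumptions for illustration only.

```python
# Minimal RAG retrieval sketch: embed clauses offline, retrieve by cosine similarity,
# then build a grounded prompt for the chat model. Not the production architecture.
import numpy as np
from sentence_transformers import SentenceTransformer  # pip install sentence-transformers

clauses = [
    "Article 12: filings must be signed by the company's legal representative.",
    "Article 27: annual accounts must be filed within six months of year end.",
]
model = SentenceTransformer("all-MiniLM-L6-v2")
index = model.encode(clauses, normalize_embeddings=True)   # offline indexing step

def retrieve(question: str, k: int = 1) -> list[str]:
    q = model.encode([question], normalize_embeddings=True)
    scores = (index @ q.T).ravel()                          # cosine similarity
    return [clauses[i] for i in np.argsort(scores)[::-1][:k]]

context = retrieve("When do annual accounts have to be filed?")
prompt = f"Answer only from these clauses, and cite them:\n{context}"
# `prompt` is then sent to the chat model, which returns a sourced, verifiable answer.
```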

The main objectives are to validate a document's compliance, speed up the search for regulatory information, and improve access to critical business references.

We will also share the technology choices and security measures implemented to guarantee data confidentiality, traceability, and sovereignty.

A pragmatic case study showing how GenAI can transform a profession for which data is a strategic asset.

Building Data Science Tools for Sustainable Transformation

The current AI hype, driven by generative AI and particularly large language models, is creating excitement, fear, and inflated expectations. In this keynote, we'll explore geographic & mobility data science tools (such as GeoPandas and MovingPandas) to transform this hype into sustainable and positive development that empowers users.
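As a tiny, self-contained taste of the GeoPandas building blocks mentioned above (the cities and coordinates are arbitrary examples, not material from the keynote):

```python
# Turn plain coordinates into a GeoDataFrame and reproject it for mapping.
import pandas as pd
import geopandas as gpd

df = pd.DataFrame({"city": ["Basel", "Paris"],
                   "lon": [7.59, 2.35], "lat": [47.56, 48.86]})
gdf = gpd.GeoDataFrame(df, geometry=gpd.points_from_xy(df.lon, df.lat), crs="EPSG:4326")
print(gdf.to_crs(epsg=3857).geometry)   # reproject to Web Mercator for web maps
```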

Replay Episode: Python, Anaconda, and the AI Frontier with Peter Wang. Peter Wang — Chief AI & Innovation Officer and Co-founder of Anaconda — is back on Making Data Simple! Known for shaping the open-source ecosystem and making Python a powerhouse, Peter dives into Anaconda’s new AI incubator, the future of GenAI, and why Python isn’t just “still a thing”… it’s the thing. From branding and security to leadership and philosophy, this episode is a wild ride through the biggest opportunities (and risks) shaping AI today. Timestamps:  01:27 Meet Peter Wang 05:10 Python or R? 05:51 Anaconda’s Differentiation 07:08 Why the Name Anaconda 08:24 The AI Incubator 11:40 GenAI 14:39 Enter Python 16:08 Anaconda Commercial Services 18:40 Security 20:57 Common Points of Failure 22:53 Branding 24:50 watsonx Partnership 28:40 AI Risks 34:13 Getting Philosophical 36:13 China 44:52 Leadership Style

Linkedin: linkedin.com/in/pzwang Website: https://www.linkedin.com/company/anacondainc/, https://www.anaconda.com/ Want to be featured as a guest on Making Data Simple? Reach out to us at [email protected] and tell us why you should be next. The Making Data Simple Podcast is hosted by Al Martin, WW VP Technical Sales, IBM, where we explore trending technologies, business innovation, and leadership ... while keeping it simple & fun.

Coding with AI

Practical techniques to accelerate software development using generative AI.

Let’s get real. You’d like to hand off a lot of tedious software development tasks to an assistant—and now you can! AI-powered coding tools like Copilot can accelerate research, design, code creation, testing, troubleshooting, documentation, refactoring and more. Coding with AI shows you how. Written for working developers, this book fast-tracks you to AI-powered productivity with bite-size projects, tested prompts, and techniques for getting the most out of AI.

In Coding with AI you’ll learn how to:

● Incorporate AI tools into your development workflow

● Create pro-quality documentation and tests

● Debug and refactor software efficiently

● Create and organize reusable prompts

Coding with AI takes you through several small Python projects with the help of AI tools, showing you exactly how to use AI to create and refine real software. This book skips the baby steps and goes straight to the techniques you’ll use on the job, every day. You’ll learn to sidestep AI inefficiencies like hallucination and identify the places where AI can save you the most time and effort.

About the Technology: Taking a systematic approach to coding with AI will deliver the clarity, consistency, and scalability you need for production-grade applications. With practice, you can use AI tools to break down complex problems, generate maintainable code, enhance your models, and streamline debugging, testing, and collaboration. As you learn to work with AI’s strengths—and recognize its limitations—you’ll build more reliable software and find that the quality of your generated code improves significantly.

About the Book: Coding with AI shows you how to gain massive benefits from a powerful array of AI-driven development tools and techniques. And it shares the insights and methods you need to use them effectively in professional projects. Following realistic examples, you’ll learn AI coding for database integration, designing a UI, and establishing an automated testing suite. You’ll even vibe code a game—but only after you’ve built a rock-solid foundation.

What's Inside:

● Incorporate AI into your development workflow

● Create pro-quality documentation and tests

● Debug and refactor software efficiently

● Create and organize reusable prompts

About the Reader: For professional software developers. Examples in Python.

About the Author: Jeremy C. Morgan has two decades of experience as an engineer building software for everything from Fortune 100 companies to tiny startups.

Quotes:

● "Delivers exactly what working developers need: practical techniques that actually work." - Scott Hanselman, Microsoft

● "You’ll be writing prompt engineering poetry." - Lars Klint, Atlassian

● "Blends years of software experience with hands-on knowledge of top AI coding techniques. Essential." - Steve Buchanan, Jamf

● "Detailed use of AI in real-world applications. A great job!" - Santosh Yadav, Celonis

ActiveTigger: A Collaborative Text Annotation Research Tool for Computational Social Sciences

The exponential growth of textual data—ranging from social media posts and digital news archives to speech-to-text transcripts—has opened new frontiers for research in the social sciences. Tasks such as stance detection, topic classification, and information extraction have become increasingly common. At the same time, the rapid evolution of Natural Language Processing, especially pretrained language models and generative AI, has largely been led by the computer science community, often leaving a gap in accessibility for social scientists.

To address this, we began developing ActiveTigger in 2023: a lightweight, open-source Python application (with a web frontend in React) designed to accelerate the annotation process and manage large-scale datasets through the integration of fine-tuned models. It aims to support computational social science for a broad audience both within and outside the social sciences. Already used by an active community in the social sciences, the stable version is planned for early June 2025.

From a more technical perspective, the API is designed to manage the complete workflow: project creation, embedding computation, exploration of the text corpus, human annotation with active learning, fine-tuning of pre-trained (BERT-like) models, prediction on a larger corpus, and export. It also integrates LLM-as-a-service capabilities for prompt-based annotation and information extraction, offering a flexible approach to hybrid manual/automatic labeling. Accessible through both a web frontend and a Python client, ActiveTigger encourages customization and adaptation to specific research contexts and practices.
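The snippet below is a schematic sketch of the uncertainty-based active-learning step described here, written with scikit-learn rather than ActiveTigger's own code; the example texts and labels are invented.

```python
# Schematic active-learning step: train a light classifier on the labeled pool,
# then surface the least-certain unlabeled texts for human annotation.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

labeled_texts = ["great policy speech", "terrible decision",
                 "strongly support this", "awful idea"]
labels = [1, 0, 1, 0]
unlabeled_texts = ["not sure what to think", "fantastic announcement", "this is a disaster"]

vec = TfidfVectorizer().fit(labeled_texts + unlabeled_texts)
clf = LogisticRegression().fit(vec.transform(labeled_texts), labels)

proba = clf.predict_proba(vec.transform(unlabeled_texts))
uncertainty = 1 - proba.max(axis=1)              # low top-class probability = uncertain
next_to_annotate = np.argsort(uncertainty)[::-1]
print([unlabeled_texts[i] for i in next_to_annotate])  # items the human should label next
```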

In this talk, we will delve into the motivations behind the creation of ActiveTigger, outline its technical architecture, and walk through its core functionalities. Drawing on several ongoing research projects within the Computational Social Science (CSS) group at CREST, we will illustrate concrete use cases where ActiveTigger has accelerated data annotation, enabled scalable workflows, and fostered collaborations. Beyond the technical demonstration, the talk will also open a broader reflection on the challenges and opportunities brought by generative AI in academic research—especially in terms of reliability, transparency, and methodological adaptation for qualitative and quantitative inquiries.

Project repository: https://github.com/emilienschultz/activetigger/

The development of this software is funded by the DRARI Ile-de-France and supported by Progédo.

Investing for Programmers

Maximize your portfolio, analyze markets, and make data-driven investment decisions using Python and generative AI. Investing for Programmers shows you how you can turn your existing skills as a programmer into a knack for making sharper investment choices. You’ll learn how to use the Python ecosystem, modern analytic methods, and cutting-edge AI tools to make better decisions and improve the odds of long-term financial success. In Investing for Programmers you’ll learn how to: Build stock analysis tools and predictive models Identify market-beating investment opportunities Design and evaluate algorithmic trading strategies Use AI to automate investment research Analyze market sentiments with media data mining In Investing for Programmers you'll learn the basics of financial investment as you conduct real market analysis, connect with trading APIs to automate buy-sell, and develop a systematic approach to risk management. Don’t worry—there’s no dodgy financial advice or flimsy get-rich-quick schemes. Real-life examples help you build your own intuition about financial markets, and make better decisions for retirement, financial independence, and getting more from your hard-earned money. About the Technology A programmer has a unique edge when it comes to investing. Using open-source Python libraries and AI tools, you can perform sophisticated analysis normally reserved for expensive financial professionals. This book guides you step-by-step through building your own stock analysis tools, forecasting models, and more so you can make smart, data-driven investment decisions. About the Book Investing for Programmers shows you how to analyze investment opportunities using Python and machine learning. In this easy-to-read handbook, experienced algorithmic investor Stefan Papp shows you how to use Pandas, NumPy, and Matplotlib to dissect stock market data, uncover patterns, and build your own trading models. You’ll also discover how to use AI agents and LLMs to enhance your financial research and decision-making process. What's Inside Build stock analysis tools and predictive models Design algorithmic trading strategies Use AI to automate investment research Analyze market sentiment with media data mining About the Reader For professional and hobbyist Python programmers with basic personal finance experience. About the Author Stefan Papp combines 20 years of investment experience in stocks, cryptocurrency, and bonds with decades of work as a data engineer, architect, and software consultant. Quotes Especially valuable for anyone looking to improve their investing. - Armen Kherlopian, Covenant Venture Capital A great breadth of topics—from basic finance concepts to cutting-edge technology. - Ilya Kipnis, Quantstrat Trader A top tip for people who want to leverage development skills to improve their investment possibilities. - Michael Zambiasi, Raiffeisen Digital Bank Brilliantly bridges the worlds of coding and finance. - Thomas Wiecki, PyMC Labs