Data-driven working is about facts, but above all about people. However well your technology performs, without buy-in and the right skills within the organization, data-driven working will not become a success. Not even with AI. In my presentation I show how UWV is implementing data-driven working within HRM, with attention to vision, education, communication and agile working. I also share how AI can help us in the near future, and which limitations we will have to deal with.
Topic: AI/ML (Artificial Intelligence/Machine Learning)
ASN Bank used Agentic AI to help unify four brands. This presentation explores two key areas. Data lineage: an agent efficiently traced data lineage across legacy systems, accelerating the integration and providing valuable insights. Policies: an agent analyzed internal policies, greatly simplifying the complex task of creating a unified policy framework. The presentation will discuss the architecture behind these agents, showcasing the different design patterns used.
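As a flavor of one such pattern, here is a minimal sketch of a bounded act-and-observe loop applied to lineage tracing. It is illustrative only: the table names and single-hop lookup are hypothetical stand-ins, not ASN Bank's actual agents, and a stub replaces the LLM planner.

```python
# Minimal act-and-observe loop for lineage tracing (illustrative sketch).
# A real agent would let an LLM choose tools and arguments at each step;
# here a hard-coded lookup stands in for that decision.
def trace_one_hop(table: str) -> str:
    # Hypothetical tool: resolve one upstream source for a table
    upstream = {
        "mart.customers": "staging.customers",
        "staging.customers": "legacy.cust_master",
    }
    return upstream.get(table, "")

def lineage_agent(start_table: str, max_steps: int = 10) -> list[str]:
    """Follow lineage hop by hop until a source system is reached."""
    path, current = [start_table], start_table
    for _ in range(max_steps):        # bounded loop: a common agent safeguard
        nxt = trace_one_hop(current)  # act
        if not nxt:                   # observe: no further upstream source
            break
        path.append(nxt)              # continue from the observation
        current = nxt
    return path

print(" <- ".join(lineage_agent("mart.customers")))
```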
How do organizations move from predictive ML to impactful Generative AI? This session presents a strategic blueprint for this transition. It showcases how Google leverages Gemini and AI Agents to automate complex engineering workflows, achieving an 80% reduction in time spent on issue resolution. Gain a framework for fostering innovation, enabling teams, and driving measurable results with LLMs.
Text-to-SQL promised self-serve analytics, but it failed to deliver trust. Accuracy plateaued, context was lost, and hallucinations eroded confidence. In this talk, we’ll explore what comes after: reliable, proactive AI analytics powered by semantic understanding and agentic systems.
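To make that contrast concrete, below is a minimal sketch of the semantic-layer idea: user questions resolve to vetted, governed SQL rather than free-form generated queries, so accuracy no longer depends on the model guessing schema details. All metric names and SQL here are hypothetical.

```python
# Semantic-layer sketch: questions map to governed metric definitions,
# so only human-approved SQL ever runs. Names and queries are hypothetical.
SEMANTIC_LAYER = {
    "monthly active users": (
        "SELECT COUNT(DISTINCT user_id) FROM events "
        "WHERE event_date >= date_trunc('month', current_date)"
    ),
    "revenue this quarter": (
        "SELECT SUM(amount) FROM orders "
        "WHERE order_date >= date_trunc('quarter', current_date)"
    ),
}

def resolve(question: str) -> str:
    # In practice an LLM or embedding search maps the question to a metric;
    # a simple substring match stands in for that step here.
    q = question.lower()
    for metric, sql in SEMANTIC_LAYER.items():
        if metric in q:
            return sql
    raise ValueError("question does not map to a governed metric")

print(resolve("What is our revenue this quarter?"))
```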
You see them everywhere: the glowing AI success stories. But is the road to AI really that flawless? Or do you quickly become the AI clown who mostly proclaims pretty words? And does all of this truly change the core of your business? At Springbok we know that AI is more than shiny posts. It is slogging, sweating, falling down and getting back up. Yes, AI can change your core business, but only with courage and perseverance. Because behind every success story also lies the raw reality of relentless building.
De term " trustworthy AI" klinkt steeds vaker. Terecht ook: AI-systemen worden een integraal onderdeel van onze samenleving en moeten daarom betrouwbaar zijn. Maar hier ligt paradox: veel zijn juist inherent onbetrouwbaar. Ze maken fouten, hebben vooroordelen, of gedragen zich onvoorspelbaar. Dit roept cruciale vraag op: hoe bouwen we stabiele op technologie die dit fundamentele betrouwbaarheidsprobleem heeft? Dat is de waar ingaan in deze keynote!
At Data Expo, Amsterdam Data Academy presents the results of our own research among learning and data experts on the future of jobs in data & AI. Discover which roles are at risk, which new ones are emerging, and how to prepare. Join our session "The future of work in data & AI: which jobs will disappear and emerge?" (Wed 10 Sept, 14:15-14:45, Hall 8) and visit Stand 87 for a quick scan to see whether AI will take over your job.
While AI gets all the spotlight, the data powering it often goes unnoticed. This talk explores web crawling, how it works, how organizations leverage it, and how recent advances in AI are unlocking entirely new ways for companies with large volumes of structured data to make that information searchable, accessible, and genuinely valuable.
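As a concrete illustration of the basics, here is a minimal polite-crawler sketch: it checks robots.txt and follows same-site links. The start URL is a placeholder; production crawlers add rate limiting, retries, deduplication at scale, and distributed queues.

```python
# Minimal polite crawler: fetch a page, respect robots.txt, queue
# same-site links. The start URL is a placeholder.
from urllib import robotparser
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START = "https://example.com/"
robots = robotparser.RobotFileParser()
robots.set_url(urljoin(START, "/robots.txt"))
robots.read()

seen, queue = set(), [START]
while queue and len(seen) < 20:  # small cap for the sketch
    url = queue.pop(0)
    if url in seen or not robots.can_fetch("*", url):
        continue
    seen.add(url)
    html = requests.get(url, timeout=10).text
    # Extract links and keep only those on the same host
    for link in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        target = urljoin(url, link["href"])
        if urlparse(target).netloc == urlparse(START).netloc:
            queue.append(target)

print(f"crawled {len(seen)} pages")
```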
As Europe’s top B2B used-goods auction platform, TBAuctions is entering the AI era. Roberto Bonilla, Lead Data Engineer, shows how Databricks, Azure, Terraform, MLflow and LangGraph unify to simplify complex AI workflows. Bas Lucieer, Head of Data, details the strategy and change management that bring a sales-driven organization along, ensuring adoption and lasting value. Together they show tech + strategy = marketplace edge.
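As a small taste of one component in that stack, the sketch below logs a run with MLflow's tracking API. The experiment name, parameters, and metric value are hypothetical placeholders, not TBAuctions' actual setup.

```python
# Minimal MLflow tracking sketch: record parameters and a metric for a run.
# Experiment name, parameters, and values are hypothetical.
import mlflow

mlflow.set_experiment("auction-lot-pricing")
with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("model_type", "gradient_boosting")
    mlflow.log_param("max_depth", 6)
    mlflow.log_metric("val_mae", 12.4)  # placeholder validation error
```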
Computer vision is becoming a key enabler of smart manufacturing, from quality inspection to robotic automation. Yet traditional development of AI models often requires large datasets, specialized expertise, and significant time. In this talk, we explore how no-code computer vision platforms are changing this landscape, allowing engineers, operators, and domain experts to build, train, and deploy models without deep AI backgrounds. We’ll look at real examples from manufacturing and robotics to show how faster iteration, simpler data workflows, and scalable deployment can move automation projects from concept to production.
This talk explores how we generate high-performance computer vision datasets from CAD—without real-world images or manual labeling. We’ll walk through our synthetic data pipeline, including CPU-optimized defect simulation, material variation, and lighting workflows that scale to thousands of renders per part. While Blender plays a role, our focus is on how industrial data (like STEP files) and procedural generation unlock fast, flexible training sets for manufacturing QA, even on modest hardware. If you're working at the edge of 3D, automation, and vision AI—this is for you!
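To give a flavor of what such a pipeline can look like, here is a minimal sketch using Blender's Python API (run inside Blender) to randomize lighting across a batch of renders of an already-imported part. The object name, value ranges, and render count are assumptions, not the speakers' actual workflow.

```python
# Illustrative Blender (bpy) sketch: randomize lighting and render a batch.
# Assumes a CAD part has already been imported and a light named "Light"
# exists in the scene (true for Blender's default scene).
import math
import random
import bpy

NUM_RENDERS = 50  # a real pipeline scales this to thousands per part
scene = bpy.context.scene
light = bpy.data.objects["Light"]

for i in range(NUM_RENDERS):
    # Vary light strength and direction to diversify the training distribution
    light.data.energy = random.uniform(200.0, 1500.0)
    light.rotation_euler = tuple(random.uniform(0.0, math.pi) for _ in range(3))
    scene.render.filepath = f"//renders/part_{i:04d}.png"  # // = .blend-relative
    bpy.ops.render.render(write_still=True)
```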
Manufacturing and logistics companies face increasingly complex operational challenges that traditional AI and human planning struggle to solve effectively. Collide Technology harnesses Swarm Intelligence algorithms to transform intractable problems—like scheduling hundreds or thousands of maintenance employees while simultaneously optimizing production capacity, inventory levels, and cross-sector resource allocation—into solutions delivered in seconds rather than weeks. Unlike rigid Operations Research approaches that require specialized expertise and expensive implementations, our platform democratizes industrial optimization by making sophisticated decision-making accessible to any factory or logistics operation. We deliver holistic, data-driven solutions that optimize across multiple business entities and sectors simultaneously, adapting to real-world constraints and evolving operational needs.
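For readers new to the technique, the sketch below shows a classic swarm-intelligence method, particle swarm optimization, minimizing a toy cost function. It illustrates the algorithm family only; it is not Collide Technology's platform, and the cost function is a hypothetical stand-in for a real scheduling objective.

```python
# Toy particle swarm optimization (PSO): particles search for a vector
# that minimizes a cost function, pulled toward their own best position
# and the swarm's best position. Cost function is a hypothetical stand-in.
import random

def cost(x):
    # Stand-in for a scheduling objective (e.g. overtime plus idle time)
    return sum((xi - 3.0) ** 2 for xi in x)

DIM, N_PARTICLES, ITERS = 5, 30, 200
W, C1, C2 = 0.7, 1.5, 1.5  # inertia, cognitive and social weights

pos = [[random.uniform(-10, 10) for _ in range(DIM)] for _ in range(N_PARTICLES)]
vel = [[0.0] * DIM for _ in range(N_PARTICLES)]
pbest = [p[:] for p in pos]       # each particle's best-known position
gbest = min(pbest, key=cost)      # swarm-wide best-known position

for _ in range(ITERS):
    for i in range(N_PARTICLES):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if cost(pos[i]) < cost(pbest[i]):
            pbest[i] = pos[i][:]
    gbest = min(pbest, key=cost)

print("best found:", [round(v, 2) for v in gbest], "cost:", round(cost(gbest), 4))
```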
Brought to You By:
• Statsig — The unified platform for flags, analytics, experiments, and more. Statsig built a complete set of data tools that allow engineering teams to measure the impact of their work. This toolkit is SO valuable to so many teams that OpenAI, who was a huge user of Statsig, decided to acquire the company, the news announced last week. Talk about validation! Check out Statsig.
• Linear — The system for modern product development. Here’s an interesting story: OpenAI switched to Linear as a way to establish a shared vocabulary between teams. Every project now follows the same lifecycle, uses the same labels, and moves through the same states. Try Linear for yourself.

The Pragmatic Engineer Podcast is back with the Fall 2025 season. Expect new episodes to be published on most Wednesdays, looking ahead.

Code Complete is one of the most enduring books on software engineering. Steve McConnell wrote the 900-page handbook just five years into his career, capturing what he wished he’d known when starting out. Decades later, the lessons remain relevant, and Code Complete remains a best-seller. In this episode, we talk about what has aged well, what needed updating in the second edition, and the broader career principles Steve has developed along the way. From his “career pyramid” model to his critique of “lily pad hopping,” and why periods of working in fast-paced, all-in environments can be so rewarding, the emphasis throughout is on taking ownership of your career and making deliberate choices.

We also discuss:
• Top-down vs. bottom-up design and why most engineers default to one approach
• Why rewriting code multiple times makes it better
• How taking a year off to write Code Complete crystallized key lessons
• The 3 areas software designers need to understand, and why focusing only on technology may be the most limiting
• And much more!

Steve rarely gives interviews, so I hope you enjoy this conversation, which we recorded in Seattle.

Timestamps:
(00:00) Intro
(01:31) How and why Steve wrote Code Complete
(08:08) What code construction is and how it differs from software development
(11:12) Top-down vs. bottom-up design approach
(14:46) Why design documents frustrate some engineers
(16:50) The case for rewriting everything three times
(20:15) Steve’s career before and after Code Complete
(27:47) Steve’s career advice
(44:38) Three areas software designers need to understand
(48:07) Advice when becoming a manager, as a developer
(53:02) The importance of managing your energy
(57:07) Early Microsoft and why startups are a culture of intense focus
(1:04:14) What changed in the second edition of Code Complete
(1:10:50) AI’s impact on software development: Steve’s take
(1:17:45) Code reviews and GenAI
(1:19:58) Why engineers are becoming more full-stack
(1:21:40) Could AI be the exception to “no silver bullets?”
(1:26:31) Steve’s advice for engineers on building a meaningful career

The Pragmatic Engineer deepdives relevant for this episode:
• What changed in 50 years of computing
• The past and future of modern backend practices
• The Philosophy of Software Design – with John Ousterhout
• AI tools for software engineers, but without the hype – with Simon Willison (co-creator of Django)
• TDD, AI agents and coding – with Kent Beck

Production and marketing by https://penname.co/. For inquiries about sponsoring the podcast, email [email protected].
To fully unlock the potential of AI within KPN, scaling is key. KPN therefore focuses on four pillars: AI literacy; governance; end-to-end implementation across business, IT, data, and AI; and the expansion of its technical infrastructure. Together, these elements support the democratization of AI capabilities across the organization. With the emergence of Generative AI, and especially Agentic AI, broad enablement has become even more critical. In this session, KPN will share organizational opportunities and challenges related to AI adoption at scale, and how it uses Dataiku as its central data science platform to drive this transformation.