talk-data.com

Topic: Internet of Things (IoT)

Tags: connected_devices, sensors, data_collection

112 tagged activities

Activity trend: peak of 11 activities per quarter, 2020-Q1 to 2026-Q1

Activities

112 activities · Newest first

Managing and Visualizing BIM Data with AI

Unlock the potential of your BIM workflows with artificial intelligence and data visualization tools. This book provides guided instruction on using software like Revit, Dynamo, Python, and Power BI to automate processes, derive insights, and craft tailored dashboards that empower data-driven decisions in AEC projects.

What this book will help me do:
Effectively preprocess and manage BIM data for analysis and visualization.
Design interactive and insightful dashboards in Power BI for project stakeholders.
Integrate real-time IoT data and advanced analytics into BIM projects.
Automate repetitive tasks in Revit using Dynamo and Python scripting.
Understand the ethical considerations and emerging trends in AI for BIM.

Author(s): Bruno Martorelli, a seasoned BIM manager, specializes in integrating technology and data analytics into construction workflows. With a background in architecture and programming, he bridges the gap between traditional methods and modern innovations. Bruno is dedicated to sharing practical strategies for data automation and visualization.

Who is it for? This book is tailored for architects, engineers, and construction managers interested in elevating their BIM practices. If you're familiar with Revit and possess a basic understanding of data management, you'll find this resource invaluable. Beginners in Python or Power BI will also find accessible guidance to start applying advanced techniques in their workflows.
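
As a rough illustration of the kind of BIM data preprocessing the book describes, the sketch below loads a hypothetical room schedule exported from Revit as CSV and aggregates it into a tidy table that a Power BI dashboard could consume; the file and column names are assumptions, not examples from the book.

```python
# Minimal sketch: tidy a Revit schedule export for a Power BI dashboard.
# The CSV path and column names ("Level", "Department", "Area") are hypothetical.
import pandas as pd

def summarize_rooms(csv_path: str) -> pd.DataFrame:
    rooms = pd.read_csv(csv_path)
    # Drop unplaced rooms, which often show up with an Area of 0 in exports.
    rooms = rooms[rooms["Area"] > 0].copy()
    # Aggregate floor area by level and department for the dashboard.
    summary = (
        rooms.groupby(["Level", "Department"], as_index=False)["Area"]
        .sum()
        .rename(columns={"Area": "TotalArea"})
    )
    return summary

if __name__ == "__main__":
    print(summarize_rooms("room_schedule_export.csv").head())
```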

AWS re:Invent 2025 - Building an AI-powered waste classification using Amazon Nova & IoT (AIM256)

AWS Chile revolutionized waste management by developing an intelligent classification system using Amazon Nova, IoT, and serverless architecture. The solution processes waste items in under 3 seconds with 95% accuracy, while keeping implementation costs below $300 USD and operational costs under $3 per 1,000 images. Explore the technical architecture integrating edge computing, computer vision, and AI to create a real-time classification system that improved recycling efficiency by 52%, preventing 644.7 kg of CO2 emissions in six months. Learn how to implement this cost-effective solution in your organization using AWS, and understand key challenges and lessons learned from this sustainability initiative.
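
As a rough sketch of just the classification step (the session's full architecture also spans IoT and edge components not shown here), the snippet below sends an image to an Amazon Nova model through Amazon Bedrock's Converse API; the model ID, prompt, and label set are assumptions rather than details from the talk.

```python
# Minimal sketch: classify a waste image with an Amazon Nova model via Bedrock.
# Model ID, prompt, and labels are assumptions; the talk's edge/IoT pieces are omitted.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def classify_waste(image_path: str) -> str:
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = bedrock.converse(
        modelId="amazon.nova-lite-v1:0",  # hypothetical choice of Nova model
        messages=[{
            "role": "user",
            "content": [
                {"image": {"format": "jpeg", "source": {"bytes": image_bytes}}},
                {"text": "Classify this waste item as: recyclable, organic, or landfill. Answer with one word."},
            ],
        }],
    )
    # The Converse API returns the generated text in the first content block.
    return response["output"]["message"]["content"][0]["text"].strip()

if __name__ == "__main__":
    print(classify_waste("waste_item.jpg"))
```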


Edge Artificial Intelligence

Secure your expertise in the next wave of computing with this essential book, which provides a comprehensive guide to Edge AI, detailing its foundational concepts, deployment strategies, and real-world applications for revolutionizing performance and privacy across various industries.

Edge AI brings the computational power of AI algorithms closer to where data is generated, processed, and utilized. Traditionally, AI models are deployed in centralized cloud environments, leading to latency issues, bandwidth constraints, and privacy concerns. Edge AI addresses these limitations by enabling AI inference and decision-making directly on edge devices, such as smartphones, IoT sensors, and edge servers. Despite its challenges, edge AI presents numerous opportunities across various domains. From real-time health monitoring and predictive maintenance in industrial IoT to personalized recommendations in retail and immersive experiences in augmented reality, edge AI has the potential to revolutionize how we interact with technology.

This book provides a comprehensive exploration of edge AI, covering its foundational concepts, development frameworks, deployment strategies, security considerations, ethical implications, emerging trends, and real-world applications. It is essential for anyone pushing the boundaries to leverage edge computing for enhanced performance and efficiency.

Readers will find this volume:
Dives deep into the world of edge AI with a comprehensive exploration covering foundational concepts, development frameworks, deployment strategies, security considerations, ethical implications, governance frameworks, optimization techniques, and real-world applications;
Offers practical guidance on implementing edge AI solutions effectively in various domains, including architecture design, development frameworks, deployment strategies, and optimization techniques;
Explores concrete examples of edge AI applications across diverse domains such as healthcare, industrial IoT, smart cities, and autonomous systems, providing insights into how edge AI is revolutionizing industries and everyday life;
Provides insights into emerging trends and technologies in the field of edge AI, including convergence with blockchain, augmented reality, virtual reality, autonomous systems, personalized experiences, and cybersecurity.

Audience: Researchers, AI experts, and industry professionals in the fields of computer science, IT, and business management.
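
To make the idea of on-device inference concrete, here is a minimal sketch using TensorFlow Lite, one common runtime for edge devices; the model file and input shape are assumptions, not examples taken from the book.

```python
# Minimal sketch: run inference locally on an edge device with TensorFlow Lite.
# The model path and input tensor contents are hypothetical.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="sensor_model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Fake a single sensor-reading window shaped like the model's input tensor.
sample = np.random.rand(*input_details[0]["shape"]).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()  # inference happens on the device, no cloud round-trip
prediction = interpreter.get_tensor(output_details[0]["index"])
print("local prediction:", prediction)
```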

Join us to learn how to digitize your operations and scale AI workloads across hybrid and multi-cloud environments. You will learn how to secure and scale your IoT operations data and use Microsoft Fabric as your single data platform to reduce downtime, improve asset tracking, and address operational simulation scenarios.

This session will explore why and how Snowflake's unique capabilities are crucial to enable, accelerate, and implement industrial IoT use cases such as root cause analysis of asset failure, predictive maintenance, and quality management. The session will explain the use of specific time-series capabilities (e.g., ASOF joins and the CORR and MATCH functions), built-in Cortex ML functions (such as anomaly detection and forecasting), and LLMs leveraging RAG to accelerate use cases for manufacturing customers.
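
The session itself is about Snowflake SQL, but the as-of join pattern it highlights can be sketched with pandas as well; the example below aligns each sensor reading with the most recent prior maintenance event, using hypothetical column names and data.

```python
# Minimal sketch of an as-of join: attach the latest prior maintenance event
# to each sensor reading. Column names and data are hypothetical.
import pandas as pd

readings = pd.DataFrame({
    "ts": pd.to_datetime(["2024-01-01 00:05", "2024-01-01 00:20", "2024-01-01 01:00"]),
    "asset_id": ["pump_1", "pump_1", "pump_1"],
    "vibration": [0.12, 0.35, 0.80],
})
events = pd.DataFrame({
    "ts": pd.to_datetime(["2024-01-01 00:00", "2024-01-01 00:30"]),
    "asset_id": ["pump_1", "pump_1"],
    "event": ["inspection", "bearing_replaced"],
})

# merge_asof requires both frames to be sorted by the join key.
aligned = pd.merge_asof(
    readings.sort_values("ts"),
    events.sort_values("ts"),
    on="ts",
    by="asset_id",
    direction="backward",  # take the most recent event at or before each reading
)
print(aligned)
```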

The explosion of IoT data in industrial environments calls for robust, scalable data architectures. This presentation explores how data lakes, and more specifically the Lakehouse architecture, address the challenges of storing and processing massive volumes of heterogeneous IoT data.

Through the concrete example of operational monitoring of an offshore wind farm, we will demonstrate how a Lakehouse solution can efficiently handle high-frequency data streams from industrial sensors. We will walk through the complete process: from real-time ingestion of telemetry data to the deployment of predictive maintenance models, including the training of anomaly detection and forecasting algorithms.

This case study will illustrate the key advantages of the data lake for industrial IoT: multi-format storage flexibility, real-time and batch processing capabilities, native integration of machine learning tools, and optimized operational costs. The goal is to share practical, hands-on experience of implementing this architecture in an Asset Integrity Management context, applicable to many industrial sectors.
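
As a minimal sketch of the ingestion step described above, assuming a Spark Structured Streaming environment writing to Delta Lake (the broker address, topic, schema, and storage paths below are hypothetical, not details from the talk):

```python
# Minimal sketch: stream wind-turbine telemetry from Kafka into a Delta table.
# Broker, topic, schema, and paths are hypothetical; assumes Delta Lake is configured.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("turbine-telemetry-ingest").getOrCreate()

schema = StructType([
    StructField("turbine_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("wind_speed", DoubleType()),
    StructField("power_kw", DoubleType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "turbine-telemetry")
    .load()
)

# Parse the JSON payload into typed columns.
telemetry = raw.select(F.from_json(F.col("value").cast("string"), schema).alias("r")).select("r.*")

query = (
    telemetry.writeStream.format("delta")
    .option("checkpointLocation", "/lakehouse/checkpoints/telemetry")
    .outputMode("append")
    .start("/lakehouse/bronze/turbine_telemetry")
)
query.awaitTermination()
```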

Connecting machines and structuring industrial data has long been one of the toughest challenges in smart manufacturing. Before unlocking the power of AI, large language models, or advanced analytics, companies must first solve the foundational task of harmonizing and organizing their data—without this, bad data only leads to bad AI.

This session covers the journey from building a Unified Namespace as the data foundation to scaling predictive use cases such as maintenance, quality optimization, and process improvements. Using customer stories from discrete and process manufacturing, we will show how DXC and Snowflake enable enterprises to connect IoT data at scale, establish a harmonized taxonomy across global operations, and drive measurable business outcomes.

By unifying diverse industrial IoT and enterprise data into a governed data layer, the Unified Namespace enables the creation of an operational digital twin: a live, authoritative representation of manufacturing systems and assets that fuels scalable AI use cases like predictive maintenance, autonomous control, and AI-driven shop floor assistance. Attendees will learn how DXC's and Snowflake's IoT best practices power OT/IT convergence, continuous digital twin evolution, and AI-driven operational excellence.
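
A Unified Namespace is typically realized as an agreed-upon topic hierarchy on a message broker. The sketch below is a minimal illustration assuming an MQTT broker and the paho-mqtt 2.x client; the ISA-95-style topic path and payload are invented for illustration and are not DXC's or Snowflake's actual taxonomy.

```python
# Minimal sketch: publish a machine reading into an ISA-95-style Unified Namespace topic.
# Broker address, topic hierarchy, and payload fields are hypothetical.
import json
import paho.mqtt.client as mqtt

# enterprise/site/area/line/cell/metric: one agreed-upon path per data point
TOPIC = "acme/hamburg/stamping/line-2/press-07/temperature"

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt >= 2.0 API
client.connect("broker.local", 1883)

payload = {"value": 74.2, "unit": "C", "timestamp": "2024-05-01T08:15:00Z"}
# Retained message so late subscribers see the latest value at this node.
client.publish(TOPIC, json.dumps(payload), qos=1, retain=True)
client.disconnect()
```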

This week on Making Data Simple, join Ajay Kulkarni, CEO and co-founder of TigerData, as we dive into the rapidly evolving world of data. Ajay shares his front-row perspective on the challenges and opportunities of building and scaling time-series databases in an era of AI-driven automation. From the mechanics of managing massive data streams to the bold bets shaping the future of IoT, this conversation goes deep into what's breaking, what's working, and what's next. Whether you're a data engineer, tech leader, or simply fascinated by the speed of AI innovation, this episode is packed with insights you won't want to miss.

01:15 Meet AJ Kulkarni · 04:29 TigerData · 07:16 Timeseries · 09:25 Use Cases · 11:03 Why Progress? · 11:58 Why TigerData · 16:05 AI is Everything · 21:06 The Fastest Postgres · 25:45 Advanced Features · 28:53 Future of IoT · 36:48 The Future of TigerData · 38:03 San Francisco · 38:26 A Big Bet · 41:06 Good Books

LinkedIn: https://www.linkedin.com/in/ajaykulkarni/ Website: https://tigerdata.com

Want to be featured as a guest on Making Data Simple? Reach out to us at [email protected] and tell us why you should be next. The Making Data Simple Podcast is hosted by Al Martin, WW VP Technical Sales, IBM, where we explore trending technologies, business innovation, and leadership ... while keeping it simple & fun.

A session on leveraging vCluster to create lightweight virtual Kubernetes clusters for secure multi-tenant deployments in IoT environments, combined with KubeEdge for real-time AI processing. Topics include secure multi-tenant architecture, edge AI and data operations, and performance and cost optimization.

In this episode, I talk with Ilya Preston, co-founder and CEO of PAXAFE, a logistics orchestration and decision intelligence platform for temperature-controlled supply chains (aka “cold chain”). Ilya explains how PAXAFE helps companies shipping sensitive products, like pharmaceuticals, vaccines, food, and produce, by delivering end-to-end visibility and actionable insights powered by analytics and AI that reduce product loss, improve efficiency, and support smarter real-time decisions.

Ilya shares the challenges of building a configurable system that works for transportation, planning, and quality teams across industries. We also discuss their product development philosophy, team structure, and use of AI for document processing, diagnostics, and workflow automation. 

Highlights/ Skip to:  

Intro to PAXAFE (2:13)
How PAXAFE brings tons of cold chain data together in one user experience (2:33)
Innovation in cold chain analytics is up, but so is cold chain product loss (4:42)
The product challenge of getting sufficient telemetry data at the right level of specificity to derive useful analytical insights (7:14)
Why and how PAXAFE pivoted away from providing IoT hardware to collect telemetry (10:23)
How PAXAFE supports complex customer workflows, cold chain logistics, and complex supply chains (13:57)
Who the end users of PAXAFE are, and how the product team designs for these users (20:00)
Lessons learned when Ilya's team fell in love with its own product and didn't listen to the market (23:57)
Pharma loses around $40 billion a year relying on "Bob's intuition" in the warehouse. How PAXAFE balances institutional user knowledge with the cold hard facts of analytics (42:43)

Quotes from Today's Episode

"Our initial vision for what PAXAFE would become was 99.9% spot on. The only thing we misjudged was market readiness—we built a product that was a few years ahead of its time." - Ilya

"As an industry, pharma is losing $40 billion worth of product every year because decisions are still based on warehouse intuition about what works and what doesn't. In production, the problem is even more extreme, with roughly $800 billion lost annually due to temperature issues and excursions." - Ilya

"With our own design, our initial hypothesis and vision for what PAXAFE could be really shaped where we are today. Early on, we had a strong perspective on what our customers needed—and along the way, we fell in love with our own product and design." - Ilya

"We spent months perfecting risk scores… only to hear from customers, 'I don't care about a 71 versus a 62—just tell me what to do.' That single insight changed everything." - Ilya

"If you're not talking to customers or building a product that supports those conversations, you're literally wasting time. In the zero-to-product-market-fit phase, nothing else matters; you need to focus entirely on understanding your customers and iterating your product around their needs." - Ilya

"Don't build anything on day one, probably not on day two, three, or four either. Go out and talk to customers. Focus not on what they think they need, but on their real pain points. Understand their existing workflows and the constraints they face while trying to solve those problems." - Ilya

Links

PAXAFE: https://www.paxafe.com/
LinkedIn for Ilya Preston: https://www.linkedin.com/in/ilyapreston/
LinkedIn for company: https://www.linkedin.com/company/paxafe/

Building Effective Privacy Programs

Presents a structured approach to privacy management, an indispensable resource for safeguarding data in an ever-evolving digital landscape.

In today's data-driven world, protecting personal information has become a critical priority for organizations of all sizes. Building Effective Privacy Programs: Cybersecurity from Principles to Practice equips professionals with the tools and knowledge to design, implement, and sustain robust privacy programs. Seamlessly integrating foundational principles, advanced privacy concepts, and actionable strategies, this practical guide serves as a detailed roadmap for navigating the complex landscape of data privacy. Bridging the gap between theoretical concepts and practical implementation, Building Effective Privacy Programs combines in-depth analysis with practical insights, offering step-by-step instructions on building privacy-by-design frameworks, conducting privacy impact assessments, and managing compliance with global regulations. In-depth chapters feature real-world case studies and examples that illustrate the application of privacy practices in a variety of scenarios, complemented by discussions of emerging trends such as artificial intelligence, blockchain, IoT, and more.

Providing timely and comprehensive coverage of privacy principles, regulatory compliance, and actionable strategies, Building Effective Privacy Programs:
Addresses all essential areas of cyberprivacy, from foundational principles to advanced topics
Presents detailed analysis of major laws, such as GDPR, CCPA, and HIPAA, and their practical implications
Offers strategies to integrate privacy principles into business processes and IT systems
Covers industry-specific applications for healthcare, finance, and technology sectors
Highlights successful privacy program implementations and lessons learned from enforcement actions
Includes glossaries, comparison charts, sample policies, and additional resources for quick reference

Written by seasoned professionals with deep expertise in privacy law, cybersecurity, and data protection, Building Effective Privacy Programs: Cybersecurity from Principles to Practice is a vital reference for privacy officers, legal advisors, IT professionals, and business executives responsible for data governance and regulatory compliance. It is also an excellent textbook for advanced courses in cybersecurity, information systems, business law, and business management.

Abstract: Detecting problems as they happen is essential in today's fast-moving, data-driven world. In this talk, you'll learn how to build a flexible, real-time anomaly detection pipeline using Apache Kafka and Apache Flink, backed by statistical and machine learning models. We'll start by demystifying what "anomaly" really means - exploring the different types (point, contextual, and collective anomalies) and the difference between unintentional issues and intentional outliers like fraud or abuse. Then, we'll look at how anomaly detection is solved in practice: from classical statistical models like ARIMA to deep learning models like LSTM. You'll learn how ARIMA breaks time series into AutoRegressive, Integrated, and Moving Average components, no math degree required (just a Python library). We'll also uncover why forgetting is a feature, not a bug, when it comes to LSTMs, and how these models learn to detect complex patterns over time. Throughout, we'll show how Kafka handles high-throughput streaming data and how Flink enables low-latency, stateful processing to catch issues as they emerge. You'll leave knowing not just how these systems work, but when to use each type of model depending on your data and goals. Whether you're monitoring system health, tracking IoT devices, or looking for fraud in transactions, this talk will give you the foundations and tools to detect the unexpected - before it becomes a problem.
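
As a minimal sketch of the statistical side of this pipeline, the example below fits an ARIMA model with statsmodels and flags points whose residuals exceed a three-sigma band; the model order, threshold, and synthetic data are assumptions, and the Kafka/Flink streaming plumbing is omitted.

```python
# Minimal sketch: ARIMA-based point-anomaly detection on a univariate series.
# Order, threshold, and synthetic data are assumptions; streaming plumbing omitted.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(42)
series = pd.Series(np.sin(np.linspace(0, 20, 300)) + rng.normal(0, 0.1, 300))
series.iloc[150] += 2.5  # inject an obvious point anomaly

model = ARIMA(series, order=(2, 0, 1))  # AR, differencing, and MA terms
fit = model.fit()

# Flag observations the model could not explain within three standard deviations.
residuals = fit.resid
threshold = 3 * residuals.std()
anomalies = series[np.abs(residuals) > threshold]
print("flagged indices:", list(anomalies.index))
```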

Discussion by members of the CIISAp consortium on the design and launch of the CIISAp apprenticeship program, which links rigorous academic classes and educational training with real-world job rotations at leading industrial companies to help students identify and prevent cyber vulnerabilities and attacks on industrial control systems and OT/IT/IoT platforms.

Sponsored by: Redpanda | IoT for Fun & Prophet: Scaling IoT and predicting the future with Redpanda, Iceberg & Prophet

In this talk, we’ll walk through a complete real-time IoT architecture—from an economical, high-powered ESP32 microcontroller publishing environmental sensor data to AWS IoT, through Redpanda Connect into a Redpanda BYOC cluster, and finally into Apache Iceberg for long-term analytical storage. Once the data lands, we’ll query it using Python and perform linear regression with Prophet to forecast future trends. Along the way, we’ll explore the design of a scalable, cloud-native pipeline for streaming IoT data. Whether you're tracking the weather or building the future, this session will help you architect with confidence—and maybe even predict it.
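
As a minimal sketch of the final forecasting step, assuming the sensor data has already been queried out of Iceberg into a pandas DataFrame (column names, horizon, and the synthetic readings below are placeholders rather than the talk's actual pipeline):

```python
# Minimal sketch: forecast sensor readings with Prophet.
# The input data, column names, and horizon are placeholders.
import pandas as pd
from prophet import Prophet

# Prophet expects a DataFrame with columns "ds" (timestamp) and "y" (value).
history = pd.DataFrame({
    "ds": pd.date_range("2024-01-01", periods=168, freq="h"),
    "y": [20 + (i % 24) * 0.3 for i in range(168)],  # stand-in for temperature readings
})

model = Prophet()
model.fit(history)

future = model.make_future_dataframe(periods=24, freq="h")  # forecast one more day
forecast = model.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```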

In this session, we'll introduce Zerobus Direct Write API, part of Lakeflow Connect, which enables you to push data directly to your lakehouse and simplify ingestion for IoT, clickstreams, telemetry, and more. We'll start with an overview of the ingestion landscape to date. Then, we'll cover how you can "shift left" with Zerobus, embedding data ingestion into your operational systems to make analytics and AI a core component of the business, rather than an afterthought. The result is a significantly simpler architecture that scales your operations, using this new paradigm to skip unnecessary hops. We'll also highlight one of our early customers, Joby Aviation, and how they use Zerobus. Finally, we'll provide a framework to help you understand when to use Zerobus versus other ingestion offerings, and we'll wrap up with a live Q&A so that you can hit the ground running with your own use cases.

Sponsored by: Anomalo | Reconciling IoT, Policy, and Insurer Data to Deliver Better Customer Discounts

As insurers increasingly leverage IoT data to personalize policy pricing, reconciling disparate datasets across devices, policies, and insurers becomes mission-critical. In this session, learn how Nationwide transitioned from prototype workflows in Dataiku to a hardened data stack on Databricks, enabling scalable data governance and high-impact analytics. Discover how the team orchestrates data reconciliation across Postgres, Oracle, and Databricks to align customer driving behavior with insurer and policy data—ensuring more accurate, fair discounts for policyholders. With Anomalo’s automated monitoring layered on top, Nationwide ensures data quality at scale while empowering business units to define custom logic for proactive stewardship. We’ll also look ahead to how these foundations are preparing the enterprise for unstructured data and GenAI initiatives.

How Blue Origin Accelerates Innovation With Databricks and AWS GovCloud

Blue Origin is revolutionizing space exploration with a mission-critical data strategy powered by Databricks on AWS GovCloud. Learn how they leverage Databricks to meet ITAR and FedRAMP High compliance, streamline manufacturing and accelerate their vision of a 24/7 factory. Key use cases include predictive maintenance, real-time IoT insights and AI-driven tools that transform CAD designs into factory instructions. Discover how Delta Lake, Structured Streaming and advanced Databricks functionalities like Unity Catalog enable real-time analytics and future-ready infrastructure, helping Blue Origin stay ahead in the race to adopt generative AI and serverless solutions.

From Prediction to Prevention: Transforming Risk Management in Insurance

Protecting insurers against emerging threats is critical. This session reveals how leading companies use Databricks' Data Intelligence Platform to transform risk management, enhance fraud detection, and ensure compliance. Learn how advanced analytics, AI, and machine learning process vast data in real time to identify risks and mitigate threats. Industry leaders will share strategies for building resilient operations that protect against financial losses and reputational harm.

Key takeaways:
AI-powered fraud prevention using anomaly detection and predictive analytics
Real-time risk assessment models integrating IoT, behavioral, and external data
Strategies for robust compliance and governance with operational efficiency

Discover how data intelligence is revolutionizing insurance risk management and safeguarding the industry's future.