talk-data.com

Topic: streaming-messaging (110 tagged)

Activity Trend (chart omitted): 2020-Q1 to 2026-Q1, peak of 1 per quarter

Activities

110 activities · Newest first

Optimize Video Streaming Delivery

Media content today is increasingly streamed video, and this trend will only grow as the speed of consumer internet and video quality improve. Traditional video streaming platforms, such as Netflix and Hulu, now account for only a portion of this content as more and more live events are streamed over the internet. And consumer-generated content on video-based social networks such as Twitch and TikTok is now more accessible and gaining popularity. This report focuses on the current state of video delivery, including the challenges content providers face and the various solutions they're pursuing. The findings in this report are based on a recent survey conducted by Edgecast, a content delivery network (CDN) that helps companies accelerate and deliver static and dynamic content to end users around the world. You'll explore:
- The current state of video streaming, how it works, and how streams are delivered
- Responses from a survey of CDN users that produce video streams
- How content providers are addressing recent video streaming challenges
- How the information in this report can help you identify KPIs

Cloud Native Integration with Apache Camel: Building Agile and Scalable Integrations for Kubernetes Platforms

Address the most common integration challenges by understanding the ins and outs of the available choices, with practical examples showing how to create cloud native applications using Apache Camel. Camel will be our main tool, but we will also see some complementary tools and plugins that can make our development and testing easier, such as Quarkus, and tools for more specific use cases, such as Apache Kafka and Keycloak. You will learn to connect with databases, create REST APIs, transform data, connect with message-oriented middleware (MOM), secure your services, and test using Camel. You will also learn software architecture patterns for integration and how to leverage container platforms, such as Kubernetes. This book is suitable for those who are eager to learn an integration tool that fits the Kubernetes world, and who want to explore the integration challenges that can be solved using containers.
What You Will Learn:
- Focus on how to solve integration challenges
- Understand the basics of Quarkus, the foundation for the applications in this book
- Acquire a comprehensive view of Apache Camel
- Deploy an application in Kubernetes
- Follow good practices
Who This Book Is For: Java developers looking to learn Apache Camel; Apache Camel developers looking to learn more about Kubernetes deployments; software architects looking to study integration patterns for Kubernetes-based systems; and system administrators (operations teams) looking to get a better understanding of how technologies are integrated.
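To give a feel for the Camel Java DSL the book builds on, here is a minimal, self-contained route sketch. It uses the camel-main bootstrap rather than the book's Quarkus setup, and the timer/log endpoint URIs and payload are illustrative placeholders, not examples taken from the book.

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;

// Minimal Camel 3 route: fire a timer, set a message body, and log it.
// The same from(...).to(...) pipeline style extends to the database, REST,
// Kafka, and Keycloak components mentioned in the description.
public class HelloRoute extends RouteBuilder {

    @Override
    public void configure() {
        from("timer:tick?period=5000")                                  // trigger every 5 seconds
            .setBody(simple("order received at ${date:now:HH:mm:ss}"))  // build a simple payload
            .to("log:orders?level=INFO");                               // send it to the log component
    }

    public static void main(String[] args) throws Exception {
        Main main = new Main();                                         // camel-main standalone bootstrap
        main.configure().addRoutesBuilder(new HelloRoute());
        main.run(args);
    }
}
```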

Mastering Kafka Streams and ksqlDB

Working with unbounded and fast-moving data streams has historically been difficult. But with Kafka Streams and ksqlDB, building stream processing applications is easy and fun. This practical guide shows data engineers how to use these tools to build highly scalable stream processing applications for moving, enriching, and transforming large amounts of data in real time. Mitch Seymour, data services engineer at Mailchimp, explains important stream processing concepts against a backdrop of several interesting business problems. You'll learn the strengths of both Kafka Streams and ksqlDB to help you choose the best tool for each unique stream processing project. Non-Java developers will find the ksqlDB path to be an especially gentle introduction to stream processing.
- Learn the basics of Kafka and the pub/sub communication pattern
- Build stateless and stateful stream processing applications using Kafka Streams and ksqlDB
- Perform advanced stateful operations, including windowed joins and aggregations
- Understand how stateful processing works under the hood
- Learn about ksqlDB's data integration features, powered by Kafka Connect
- Work with different types of collections in ksqlDB and perform push and pull queries
- Deploy your Kafka Streams and ksqlDB applications to production
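As a rough sketch of the stateful, windowed processing the book covers, the Kafka Streams topology below counts page views per user in five-minute tumbling windows. The topic names and broker address are placeholder assumptions, and Kafka Streams 3.x is assumed for the windowing API; this is not code from the book.

```java
import java.time.Duration;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.TimeWindows;

public class PageViewCounts {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "pageview-counts");   // names consumer group and state stores
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed local broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> views = builder.stream("pageviews");         // hypothetical topic keyed by user id

        views.groupByKey()                                                   // stateful grouping on the existing key
             .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
             .count()                                                        // materializes a windowed state store
             .toStream()
             .map((windowedKey, count) -> KeyValue.pair(windowedKey.key(), count.toString()))
             .to("pageview-counts-5m");                                      // per-window counts, downstream topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The ksqlDB path expresses the same idea declaratively, roughly CREATE TABLE pageview_counts AS SELECT userid, COUNT(*) FROM pageviews WINDOW TUMBLING (SIZE 5 MINUTES) GROUP BY userid EMIT CHANGES; (exact syntax depends on the ksqlDB version and the stream's declared schema).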

RabbitMQ Essentials - Second Edition

Discover how to power your distributed and scalable applications using RabbitMQ in "RabbitMQ Essentials". This book provides a detailed journey into understanding and implementing message queuing architectures, guiding you from the basics through advanced techniques. Through a realistic case study, you'll gain the skills necessary to succeed with RabbitMQ.
What this Book will help me do:
- Understand the core concepts and architecture of RabbitMQ and message queuing.
- Learn how to configure and use RabbitMQ, including installation and plugin management.
- Master the use of channels, routing strategies, and exchange types for optimized message delivery.
- Apply strategies for ensuring message queue scalability and robust fault tolerance.
- Gain insights and best practices directly from RabbitMQ experts for production-level deployment.
Author(s): Lovisa Johansson and David Dossot bring a wealth of experience managing and deploying systems based on RabbitMQ. As part of CloudAMQP, they oversee the largest RabbitMQ installations globally. This book reflects their dedication to helping developers succeed with message queuing technology.
Who is it for? This book is perfectly suited for developers and software engineers interested in designing scalable and distributed applications. Whether you're new to RabbitMQ or already familiar with microservices and message queuing, "RabbitMQ Essentials" provides clear guidance and real-world insights. Beginners will appreciate its accessible approach, while advanced developers will value its comprehensive coverage and best practices.
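To make the channel, exchange, and routing-key vocabulary concrete, here is a minimal sketch with the RabbitMQ Java client: it declares a direct exchange, binds a queue under a routing key, and publishes one message. The exchange, queue, and broker URI are placeholder assumptions rather than examples from the book.

```java
import java.nio.charset.StandardCharsets;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

public class OrderPublisher {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setUri("amqp://guest:guest@localhost:5672");      // assumed local broker

        try (Connection connection = factory.newConnection();
             Channel channel = connection.createChannel()) {      // channels multiplex one TCP connection

            channel.exchangeDeclare("orders", "direct", true);    // durable direct exchange
            channel.queueDeclare("orders.eu", true, false, false, null);
            channel.queueBind("orders.eu", "orders", "eu");       // the routing key decides delivery

            byte[] body = "order-42:eu".getBytes(StandardCharsets.UTF_8);
            channel.basicPublish("orders", "eu", null, body);     // routed to the orders.eu queue
        }
    }
}
```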

Streaming Integration

Data is being generated at an unrelenting pace, and data storage capacity can't keep up. Enterprises must modernize the way they use and manage data by collecting, processing, and analyzing it in real time; in other words, streaming. This practical report explains everything organizations need to know to begin their streaming integration journey and make the most of their data. Authors Steve Wilkes and Alok Pareek detail the key attributes and components of an enterprise-grade streaming integration platform, along with stream processing and analysis techniques that will help companies reap immediate value from their data and solve their most pressing business challenges.
- Learn how to collect and handle large volumes of data at scale
- See how streams move data between threads, processes, servers, and data centers
- Get your data in the form you need and analyze it in real time
- Dive into the pros and cons of data targets such as databases, Hadoop, and cloud services for specific use cases
- Ensure your streaming integration infrastructure scales, is secure, works 24/7, and can handle failure

The Real-Time Revolution

Time has become a precious commodity, so business leaders who can save their customers' time more effectively than competitors do will win their loyalty. This book shows how it's done. Business survival requires valuing what customers value, and in our overworked and distraction-rich era, customers value their time above all else. Real-time companies beat their rivals by being faster and more responsive in meeting customer needs. To become a real-time company, as top scholars Jerry Power and Tom Ferratt explain, you need a real-time monitoring and response system. They offer detailed advice on how to put procedures in place that will collect data on how well products or services are saving customer time; identify strengths, weaknesses, threats, and opportunities; and specify innovations needed to save even more customer time. Where should leaders look to innovate? Power and Ferratt say to search every step in the life of a product or service, from development to production to usage. And for each step, they identify four possible levers for innovation: the design of the products or services themselves, the process used to produce them, the data that can be gathered on their use, and the people who make or provide the product or service. The book features dozens of examples of companies that are getting it right and the innovations they used to help their customers save time, all while helping themselves to a hefty slice of market share. This is a comprehensive, authoritative guide to thriving in a revolution that is sweeping every industry and sector.

Real-Time Data Analytics for Large Scale Sensor Data

Real-Time Data Analytics for Large-Scale Sensor Data covers the theory and applications of hardware platforms and architectures, the development of software methods, techniques and tools, applications, governance, and adoption strategies for the use of massive sensor data in real-time data analytics. It presents the leading-edge research in the field and identifies future challenges in this fledgling research area. The book captures the essence of real-time IoT-based solutions that require a multidisciplinary approach for catering to on-the-fly processing, including methods for high-performance stream processing, adaptive streaming adjustment, uncertainty handling, latency handling, and more.
- Examines IoT applications, the design of real-time intelligent systems, and how to manage the rapid growth of the large volume of sensor data
- Discusses intelligent management systems for applications such as healthcare, robotics, and environment modeling
- Provides a focused approach to the design and implementation of real-time intelligent systems for the management of sensor data in large-scale environments

Streaming Data

Managers and staff responsible for planning, hiring, and allocating resources need to understand how streaming data can fundamentally change their organizations. Companies everywhere are disrupting business, government, and society by using data and analytics to shape their business. Even if you don't have deep knowledge of programming or digital technology, this high-level introduction brings data streaming into focus. You won't find math or programming details here, or recommendations for particular tools in this rapidly evolving space. But you will explore the decision-making technologies and practices that organizations need to process streaming data and respond to fast-changing events. By describing the principles and activities behind this new phenomenon, author Andy Oram shows you how streaming data provides hidden gems of information that can transform the way your business works.
- Learn where streaming data comes from and how companies put it to work
- Follow a simple data processing project from ingesting and analyzing data to presenting results
- Explore how (and why) big data processing tools have evolved from MapReduce to Kubernetes
- Understand why streaming data is particularly useful for machine learning projects
- Learn how containers, microservices, and cloud computing led to continuous integration and DevOps

Stream Processing with Apache Flink

Get started with Apache Flink, the open source framework that powers some of the world's largest stream processing applications. With this practical book, you'll explore the fundamental concepts of parallel stream processing and discover how this technology differs from traditional batch data processing. Longtime Apache Flink committers Fabian Hueske and Vasia Kalavri show you how to implement scalable streaming applications with Flink's DataStream API and continuously run and maintain these applications in operational environments. Stream processing is ideal for many use cases, including low-latency ETL, streaming analytics, and real-time dashboards as well as fraud detection, anomaly detection, and alerting. You can process continuous data of any kind, including user interactions, financial transactions, and IoT data, as soon as it is generated.
- Learn concepts and challenges of distributed stateful stream processing
- Explore Flink's system architecture, including its event-time processing mode and fault-tolerance model
- Understand the fundamentals and building blocks of the DataStream API, including its time-based and stateful operators
- Read data from and write data to external systems with exactly-once consistency
- Deploy and configure Flink clusters
- Operate continuously running streaming applications
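As a small illustration of the DataStream API the authors teach, the sketch below keeps a running count per word. A production job would read from an external source such as Kafka instead of fromElements, and the element values and job name here are invented for the example.

```java
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;

public class WordCountJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<String> lines = env.fromElements("to be or not to be", "to stream or to batch");

        DataStream<Tuple2<String, Integer>> counts = lines
            .flatMap(new FlatMapFunction<String, Tuple2<String, Integer>>() {
                @Override
                public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
                    for (String word : line.toLowerCase().split("\\s+")) {
                        out.collect(Tuple2.of(word, 1));          // one record per word
                    }
                }
            })
            .keyBy(t -> t.f0)                                     // partition the stream by word
            .sum(1);                                              // stateful running count per key

        counts.print();                                           // print sink for the sketch
        env.execute("streaming word count");                      // build and submit the job graph
    }
}
```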

Apache Kafka Quick Start Guide

Dive into the world of Apache Kafka with this concise guide that focuses on its practical use for real-time data processing in distributed systems. You'll explore Kafka's capabilities, covering essentials like configuration, messaging, serialization, and handling complex data streams using Kafka Streams and KSQL. By the end, you'll be equipped to tackle real-world streaming challenges confidently.
What this Book will help me do:
- Understand how to set up and configure Apache Kafka for real-time processing environments.
- Master key concepts like message validation, enrichment, and serialization.
- Learn to use the Schema Registry for data validation and versioning.
- Gain hands-on experience with data streaming and aggregation using Kafka Streams.
- Develop skills in using KSQL for data manipulation and stream querying.
Author(s): Raúl Estrada is an experienced software engineer with a deep understanding of distributed systems and real-time data processing. With expertise in Apache Kafka and other event-streaming platforms, Estrada approaches technical writing with an emphasis on clarity and practical application. Their passion for helping developers achieve success is reflected in their authoritative yet approachable style.
Who is it for? This book is perfect for software engineers and backend developers interested in mastering real-time data processing using Apache Kafka. It is designed for readers who are eager to solve practical problems in distributed systems, irrespective of whether they have prior Kafka experience. Some familiarity with Java or other JVM languages will be helpful, although not strictly necessary. This is an ideal resource for learners seeking a hands-on, practical approach to Apache Kafka.
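For orientation, a plain Java producer might look like the sketch below. The topic name, key, value, and broker address are placeholders, and the string serialization sidesteps the Schema Registry setup that the book actually walks through.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");                 // assumed local broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                new ProducerRecord<>("customer-events", "customer-1", "{\"event\":\"signup\"}");
            producer.send(record, (metadata, exception) -> {              // asynchronous send with callback
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("wrote to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();                                             // wait for the batch to be delivered
        }
    }
}
```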

BizTalk Server 2016: Performance Tuning and Optimization

Gain an in-depth view of optimizing the performance of BizTalk Server. This book provides best practices and techniques for improving the development of mission-critical solutions. You'll see how the BizTalk Server engine works and how to proactively detect and remedy potential bottlenecks before they occur. The book starts with an overview of the BizTalk Server internal mechanisms that will help you understand the optimizations detailed throughout the book. You'll then see how these mechanisms can be applied to a BizTalk Server environment to improve both low- and high-latency throughput scenarios. A section on testing BizTalk Server solutions will guide you through the most frequently adopted techniques, such as performance and unit testing, as part of the development cycle. With BizTalk Server 2016 you'll see how to apply side-by-side versioning to your solutions to reduce the chances of downtime. You'll also review instrumentation techniques using Event Tracing for Windows (ETW) and Business Activity Monitoring (BAM). While the book is focused on the latest version of BizTalk Server, most of the topics discussed will also work with BizTalk Server 2013 R2.
What You'll Learn:
- Review BizTalk Server internals and how the message engine works
- Understand BizTalk Server architecture
- Gather and analyze BizTalk Server performance data
- Develop BizTalk Server performance solutions
- Use advanced troubleshooting tools to help diagnose your platform
Who This Book Is For: Those who have strong BizTalk and .NET Framework knowledge and want to take their BizTalk Server knowledge to the next level.

Fast Data Architectures for Streaming Applications, 2nd Edition

Why have stream-oriented data systems become so popular, when batch-oriented systems have served big data needs for many years? In the updated edition of this report, Dean Wampler examines the rise of streaming systems for handling time-sensitive problems, such as detecting fraudulent financial activity as it happens. You'll explore the characteristics of fast data architectures, along with several open source tools for implementing them. Batch processing isn't going away, but exclusive use of batch-oriented systems is now a competitive disadvantage. You'll learn that, while fast data architectures using tools such as Kafka, Akka, Spark, and Flink are much harder to build, they represent the state of the art for dealing with mountains of data that require immediate attention.
- Learn how a basic fast data architecture works, step by step
- Examine how Kafka's data backplane combines the best abstractions of log-oriented and message queue systems for integrating components
- Evaluate four streaming engines: Kafka Streams, Akka Streams, Spark, and Flink
- Learn which streaming engines work best for different use cases
- Get recommendations for making real-world streaming systems responsive, resilient, elastic, and message driven
- Explore an example IoT streaming application that includes telemetry ingestion and anomaly detection
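To ground the data backplane idea, here is a hedged sketch of the consumption side: a consumer group reading Kafka's append-only log at its own pace, so the same events can feed several independent components. The topic, group id, and broker address are placeholder assumptions.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class TelemetryReader {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");         // assumed local broker
        props.put("group.id", "anomaly-detector");                // each group keeps its own offsets in the log
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("telemetry"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                        record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```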

Inside the Message Passing Interface

A hands-on guide to writing a Message Passing Interface, this book takes the reader on a tour across major MPI implementations, best optimization techniques, application-relevant usage hints, and a historical retrospective of the MPI world, all based on a quarter of a century spent inside MPI. Readers will learn to write MPI implementations from scratch, and to design and optimize communication mechanisms using pragmatic subsetting as the guiding principle. Inside the Message Passing Interface also covers MPI quirks and tricks to achieve best performance. Dr. Alexander Supalov created the Intel Cluster Tools product line, including the Intel MPI Library that he designed and led between 2003 and 2015. He invented the common MPICH ABI and also guided Intel efforts in the MPI Forum during the development of the MPI-2.1, MPI-2.2, and MPI-3 standards. Before that, Alexander designed new finite-element mesh-generation methods, contributing to the PARMACS and PARASOL interfaces, and developed the first full MPI-2 and IMPI implementations in the world. He graduated from the Moscow Institute of Physics and Technology in 1990, and earned his PhD in applied mathematics at the Institute of Numerical Mathematics of the Russian Academy of Sciences in 1995. Alexander holds 26 patents (more pending worldwide).

Kafka Streams in Action

Kafka Streams in Action teaches you everything you need to know to implement stream processing on data flowing into your Kafka platform, allowing you to focus on getting more from your data without sacrificing time or effort.
About the Technology: Not all stream-based applications require a dedicated processing cluster. The lightweight Kafka Streams library provides exactly the power and simplicity you need for message handling in microservices and real-time event processing. With the Kafka Streams API, you filter and transform data streams with just Kafka and your application.
About the Book: Kafka Streams in Action teaches you to implement stream processing within the Kafka platform. In this easy-to-follow book, you'll explore real-world examples to collect, transform, and aggregate data, work with multiple processors, and handle real-time events. You'll even dive into streaming SQL with KSQL! Practical to the very end, it finishes with testing and operational aspects, such as monitoring and debugging.
What's Inside:
- Using the KStreams API
- Filtering, transforming, and splitting data
- Working with the Processor API
- Integrating with external systems
About the Reader: Assumes some experience with distributed systems. No knowledge of Kafka or streaming applications required.
About the Author: Bill Bejeck is a Kafka Streams contributor and Confluent engineer with over 15 years of software development experience.
Quotes:
- "A great way to learn about Kafka Streams and how it is a key enabler of event-driven applications." - From the Foreword by Neha Narkhede, Cocreator of Apache Kafka
- "A comprehensive guide to Kafka Streams—from introduction to production!" - Bojan Djurkovic, Cvent
- "Bridges the gap between message brokering and real-time streaming analytics." - Jim Mantheiy Jr., Next Century
- "Valuable both as an introduction to streams as well as an ongoing reference." - Robin Coe, TD Bank
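As one possible illustration of "just Kafka and your application", the sketch below filters and transforms a purchase stream with no separate processing cluster, only the Kafka Streams library embedded in a single program. The topic names and the card-masking rule are invented for this example and are not taken from the book.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class PurchaseMasker {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "purchase-masker");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumed local broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> purchases = builder.stream("purchases");      // hypothetical input topic
        purchases
            .filter((key, value) -> value != null && !value.isEmpty())        // drop empty events
            .mapValues(value -> value.replaceAll("\\d(?=\\d{4})", "*"))       // mask all but the last four digits
            .to("purchases-masked");                                          // hypothetical output topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```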

Streaming Systems

Streaming data is a big deal in big data these days. As more and more businesses seek to tame the massive unbounded data sets that pervade our world, streaming systems have finally reached a level of maturity sufficient for mainstream adoption. With this practical guide, data engineers, data scientists, and developers will learn how to work with streaming data in a conceptual and platform-agnostic way. Expanded from Tyler Akidau's popular blog posts "Streaming 101" and "Streaming 102", this book takes you from an introductory level to a nuanced understanding of the what, where, when, and how of processing real-time data streams. You'll also dive deep into watermarks and exactly-once processing with co-authors Slava Chernyak and Reuven Lax. You'll explore:
- How streaming and batch data processing patterns compare
- The core principles and concepts behind robust out-of-order data processing
- How watermarks track progress and completeness in infinite datasets
- How exactly-once data processing techniques ensure correctness
- How the concepts of streams and tables form the foundations of both batch and streaming data processing
- The practical motivations behind a powerful persistent state mechanism, driven by a real-world example
- How time-varying relations provide a link between stream processing and the world of SQL and relational algebra
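The book itself is platform-agnostic, so purely as one hypothetical stand-in, here is an event-time, fixed-window count written with Apache Beam's Java SDK (Beam is not named in the description; it is used here only because its model maps cleanly onto the what/where/when/how questions). The elements, timestamps, and one-minute window size are made up, and a real pipeline would read an unbounded source and rely on watermarks to decide when each window is complete.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TimestampedValue;
import org.joda.time.Duration;
import org.joda.time.Instant;

public class WindowedCounts {
    public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<String> clicks = pipeline.apply(Create.timestamped(
            TimestampedValue.of("user-a", new Instant(0L)),          // event time 00:00
            TimestampedValue.of("user-b", new Instant(30_000L)),     // event time 00:30
            TimestampedValue.of("user-a", new Instant(95_000L))));   // event time 01:35, next window

        PCollection<KV<String, Long>> perWindowCounts = clicks
            .apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(1)))) // "where" in event time
            .apply(Count.perElement());                                               // grouped per key, per window
        // perWindowCounts would normally be written to a sink such as a topic or table.

        pipeline.run().waitUntilFinish();
    }
}
```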

BizTalk

Why do businesses continue to use Microsoft's BizTalk Server as the backbone to integrate line-of-business applications with their trading partners, and how do recent changes make it even more effective? With the advent of Azure, we have a unique opportunity to enhance BizTalk functionality, including reducing the cost of operations and maintenance. This book offers the reader three solutions for leveraging BizTalk to get more from existing deployments or to modernize a deployment via Azure. Microsoft partners are playing a significant role in enhancing the capabilities of BizTalk, and this book includes sections that provide an in-depth review of BizTalk360© and the WPC HIPAA DB Toolkit©. Over the recent past, Web 3.0 has also introduced many new concepts and open source technologies, and this book covers ways to leverage these to enhance your BizTalk deployment. The authors start with a survey of the existing BizTalk Server – its history, patterns, and state of affairs – and go on to provide an in-depth elaboration of three messaging patterns that customers use for BizTalk; the advantages of updating to SQL Server 2016; a review of partner solutions that enhance BizTalk; and BizTalk with Web 3.0 for custom solutions. The book concludes with a comparison of the three viable BizTalk Azure application solutions that will enable you to make the best choice for your business.

Visualizing Streaming Data

While tools for analyzing streaming and real-time data are gaining adoption, the ability to visualize these data types has yet to catch up. Dashboards are good at conveying daily or weekly data trends at a glance, though capturing snapshots when data is transforming from moment to moment is more difficult, but not impossible. With this practical guide, application designers, data scientists, and system administrators will explore ways to create visualizations that bring context and a sense of time to streaming text data. Author Anthony Aragues guides you through the concepts and tools you need to build visualizations for analyzing data as it arrives.
- Determine your company's goals for visualizing streaming data
- Identify key data sources and learn how to stream them
- Learn practical methods for processing streaming data
- Build a client application for interacting with events, logs, and records
- Explore common components for visualizing streaming data
- Consider analysis concepts for developing your visualization
- Define the dashboard's layout, flow direction, and component movement
- Improve visualization quality and productivity through collaboration
- Explore use cases including security, IoT devices, and application data

Designing Event-Driven Systems

Many forces affect software today: larger datasets, geographical disparities, complex company structures, and the growing need to be fast and nimble in the face of change. Proven approaches such as service-oriented and event-driven architectures are joined by newer techniques such as microservices, reactive architectures, DevOps, and stream processing. Many of these patterns are successful by themselves, but as this practical ebook demonstrates, they provide a more holistic and compelling approach when applied together. Author Ben Stopford explains how service-based architectures and stream processing tools such as Apache Kafka can help you build business-critical systems. You'll learn how to apply patterns including Event Sourcing and CQRS, and how to build multi-team systems with microservices and SOA using patterns such as "inside out databases" and "event streams as a source of truth." These approaches provide a unique foundation for how these large, autonomous service ecosystems can communicate and share data.
- Learn why streaming beats request-response based architectures in complex, contemporary use cases
- Understand why replayable logs such as Kafka provide a backbone for both service communication and shared datasets
- Explore how event collaboration and event sourcing patterns increase safety and recoverability with functional, event-driven approaches
- Build service ecosystems that blend event-driven and request-driven interfaces using a replayable log and Kafka's Streams API
- Scale beyond individual teams into larger, department- and company-sized architectures, using event streams as a source of truth
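As a sketch of the "event streams as a source of truth" idea, the code below materializes a compacted Kafka topic into a local table that a service can join against or query, rather than reading a shared database. The topic, store, and application names are placeholder assumptions, not examples from the ebook.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.KeyValueStore;

public class CustomerViewService {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "customer-view");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumed local broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // The latest event per customer id becomes the current state; replaying the
        // log from the beginning rebuilds the store after a crash or redeploy.
        KTable<String, String> customers = builder.table(
            "customer-events",
            Materialized.<String, String, KeyValueStore<Bytes, byte[]>>as("customer-state-store"));
        // The materialized store can back interactive queries or joins against other streams.

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```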