talk-data.com

Topic: data (5765 tagged)

Activity Trend: peak 3/qtr (2020-Q1 to 2026-Q1)

Activities: 5765 activities · Newest first

Tableau For Dummies, 2nd Edition

Discover how visualization turns data into action. Tableau gives you the power to understand your data and put it in a format that is appealing and meaningful for everyone who needs to see it. Tableau For Dummies walks you through the steps to turn your data into a story that inspires action. This easy-to-understand guide offers insights from an enterprise data pro on how to transform data into a clear and memorable visual presentation.
•Navigate the Tableau user interface and connect to data sources
•Use drag-and-drop features to create stunning visualizations
•Work with templates, add graphs, and create clear charts
•Export your visualizations to multiple formats for easy sharing
This is the perfect Dummies software guide for business professionals who need to derive more value from that all-important data.

R Packages, 2nd Edition

Turn your R code into packages that others can easily install and use. With this fully updated edition, developers and data scientists will learn how to bundle reusable R functions, sample data, and documentation together by applying the package development philosophy used by the team that maintains the "tidyverse" suite of packages. In the process, you'll learn how to automate common development tasks using a set of R packages, including devtools, usethis, testthat, and roxygen2. Authors Hadley Wickham and Jennifer Bryan from Posit (formerly known as RStudio) help you create packages quickly, then teach you how to get better over time. You'll be able to focus on what you want your package to do as you progressively develop greater mastery of the structure of a package. With this book, you will:
•Learn the key components of an R package, including code, documentation, and tests
•Streamline your development process with devtools and the RStudio IDE
•Get tips on effective habits such as organizing functions into files
•Get caught up on important new features in the devtools ecosystem
•Learn about the art and science of unit testing, using features in the third edition of testthat
•Turn your existing documentation into a beautiful and user-friendly website with pkgdown
•Gain an appreciation of the benefits of modern code hosting platforms, such as GitHub

R for Data Science, 2nd Edition

Use R to turn data into insight, knowledge, and understanding. With this practical book, aspiring data scientists will learn how to do data science with R and RStudio, along with the tidyverse—a collection of R packages designed to work together to make data science fast, fluent, and fun. Even if you have no programming experience, this updated edition will have you doing data science quickly. You'll learn how to import, transform, and visualize your data and communicate the results. And you'll get a complete, big-picture understanding of the data science cycle and the basic tools you need to manage the details. Updated for the latest tidyverse features and best practices, new chapters show you how to get data from spreadsheets, databases, and websites. Exercises help you practice what you've learned along the way. You'll understand how to:
•Visualize: Create plots for data exploration and communication of results
•Transform: Discover variable types and the tools to work with them
•Import: Get data into R and in a form convenient for analysis
•Program: Learn R tools for solving data problems with greater clarity and ease
•Communicate: Integrate prose, code, and results with Quarto

How I Rob Banks

Follow FC as he steals from the world’s most secure banks and government facilities—without breaking a single law. In How I Rob Banks: And Other Such Places, renowned ethical hacker and social engineer FC delivers a gripping and often hilarious discussion of his work: testing the limits of physical bank security by trying to “steal” money, data, and anything else he can get his hands on. In the book, you’ll explore the secretive world of physical assessments and follow FC as he breaks into banks and secure government locations to identify security flaws and loopholes. The author explains how banks and other secure facilities operate, both digitally and physically, and shows you the tools and techniques he uses to gain access to some of the world’s most locked-down buildings. You’ll also find:
•Strategies you can implement immediately to better secure your own company, home, and data against malicious actors
•Detailed photos, maps, and drawings to bring to life the unbelievable true stories contained inside
•An inside and candid look at a rarely examined industry through the eyes of one of its most respected penetration testers
A can’t-miss account of real-life security exploits, perfect for infosec pros (including red and blue teamers, pentesters, CIOs, CISSPs, and social engineers), How I Rob Banks also belongs in the hands of anyone who loves a great Ocean’s 11-style story pulled straight from the real world.

IBM Storage Fusion HCI System: Metro Sync Disaster Recovery Use Case

Metro sync disaster recovery (DR) provides two-way synchronous data replication between IBM Spectrum Fusion™ HCI clusters installed at two sites. In the event of a site disaster, applications can be failed over to the second site. Because the replication between the sites is synchronous, the Metro sync DR solution is only available for metropolitan-distance data centers with 40 millisecond latency or less. The procedures described in this paper for IBM Spectrum Fusion HCI 2.4 Metro sync DR are the same for IBM Storage Fusion HCI 2.5.2 Metro-DR. This IBM Redpaper publication will help you install and configure the new Metro sync DR function. The use case shows the end-to-end process with the failover and failback of the WordPress application. IBM Spectrum Fusion HCI and IBM Spectrum Fusion have become IBM Storage Fusion HCI System and IBM Storage Fusion. This edition uses the IBM Spectrum® brand names and will be updated with the next edition.

Intelligent Analytics for Industry 4.0 Applications

In Industry 4.0, intelligent analytics has a broad scope spanning descriptive, predictive, and prescriptive sub-domains. Accordingly, this book reviews and highlights the challenges faced by intelligent analytics in Industry 4.0 and presents recent developments that address those challenges.

Data Modeling with Snowflake

This comprehensive guide, "Data Modeling with Snowflake", is your go-to resource for mastering the art of efficient data modeling tailored to the capabilities of the Snowflake Data Cloud. In this book, you will learn how to design agile and scalable data solutions by effectively leveraging Snowflake's unique architecture and advanced features.
What this Book will help me do
•Understand the core principles of data modeling and how they apply to Snowflake's cloud-native environment.
•Learn to use Snowflake's features, such as time travel and zero-copy cloning, to create efficient data solutions.
•Gain hands-on experience with SQL recipes that outline practical approaches to transforming and managing Snowflake data.
•Discover techniques for modeling structured and semi-structured data for real-world business needs.
•Learn to integrate universal modeling frameworks like Star Schema and Data Vault into Snowflake implementations for scalability and maintainability.
Author(s)
The author, Serge Gershkovich, is a seasoned expert in database design and Snowflake architecture. With years of experience in the data management field, Serge has dedicated himself to making complex technical subjects approachable to professionals at all levels. His insights in this book are informed by practical applications and real-world experience.
Who is it for?
This book is targeted at data professionals, ranging from newcomers to database design to seasoned SQL developers seeking to specialize in Snowflake. If you are looking to understand and apply data modeling practices effectively within Snowflake's architecture, this book is for you. Whether you're refining your modeling skills or getting started with Snowflake, it provides the practical knowledge you need to succeed.

Power BI Machine Learning and OpenAI

Microsoft Power BI Machine Learning and OpenAI offers a comprehensive exploration of advanced data analytics and artificial intelligence using Microsoft Power BI. Through hands-on, workshop-style examples, readers will discover the integration of machine learning models and OpenAI features to enhance business intelligence. This book provides practical examples, real-world scenarios, and step-by-step guidance.
What this Book will help me do
•Learn to apply machine learning capabilities within Power BI to create predictive analytics
•Understand how to integrate OpenAI services to build enhanced analytics workflows
•Gain hands-on experience in using R and Python for advanced data visualization in Power BI
•Master the skills needed to build and deploy SaaS auto ML models within Power BI
•Leverage Power BI's AI visuals and features to elevate data storytelling
Author(s)
Greg Beaumont, an expert in data science and business intelligence, brings years of experience in Power BI and analytics to this book. With a focus on practical applications, Greg empowers readers to harness the power of AI and machine learning to elevate their data solutions. As a consultant and trainer, he shares his deep knowledge to help readers unlock the full potential of their tools.
Who is it for?
This book is ideal for data analysts, BI professionals, and data scientists who aim to integrate machine learning and OpenAI into their workflows. If you're familiar with Power BI's fundamentals and are eager to explore its advanced capabilities, this guide is tailored for you. Perfect for professionals looking to elevate their analytics to a new level, combining data science concepts with Power BI's features.

Modernize Applications with Apache Kafka

Application modernization has become increasingly important as older systems struggle to keep up with today's requirements. When you migrate legacy monolithic applications to microservices, easier maintenance and optimized resource utilization generally follow. But new challenges arise around communication within services and between applications. You can overcome many of these issues with the help of modern messaging technologies such as Apache Kafka. In this report, Jennifer Vargas and Richard Stroop from Red Hat explain how IT leaders and enterprise architects can use Kafka for microservices communication and then off-load operational needs through the use of Kubernetes and managed services. You'll also explore application modernization techniques that don't require you to break down your monolithic application. This report helps you:
•Understand the importance of migrating your monolithic applications to microservices
•Examine the various challenges you may face during the modernization process
•Explore application modernization techniques and learn the benefits of using Apache Kafka during the development process
•Learn how Apache Kafka can support business outcomes
•Understand how Kubernetes can help you overcome any difficulties you may encounter when using Kafka for application development

MySQL Crash Course

MySQL Crash Course is a fast-paced, no-nonsense introduction to relational database development. It’s filled with practical examples and expert advice that will have you up and running quickly. You’ll learn the basics of SQL, how to create a database, craft SQL queries to extract data, and work with events, procedures, and functions. You’ll see how to add constraints to tables to enforce rules about permitted data and use indexes to accelerate data retrieval. You’ll even explore how to call MySQL from PHP, Python, and Java. Three final projects will show you how to build a weather database from scratch, use triggers to prevent errors in an election database, and use views to protect sensitive data in a salary database. You’ll also learn how to:
•Query database tables for specific information, order the results, comment SQL code, and deal with null values
•Define table columns to hold strings, integers, and dates, and determine what data types to use
•Join multiple database tables as well as use temporary tables, common table expressions, derived tables, and subqueries
•Add, change, and remove data from tables, create views based on specific queries, write reusable stored routines, and automate and schedule events
The perfect quick-start resource for database developers, MySQL Crash Course will arm you with the tools you need to build and manage fast, powerful, and secure MySQL-based data storage systems.
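The SQL basics listed above can be previewed in a few lines of code. The sketch below uses Python's standard-library sqlite3 module as a stand-in for a MySQL connection (the book itself covers calling MySQL from Python via a connector); the table, column names, and data are illustrative, not from the book:

```python
import sqlite3

# In-memory database as a stand-in for a MySQL server; the core SQL
# shown here (typed columns, constraints, NULL handling, ordering)
# carries over to MySQL with minor dialect differences.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Define a table with typed columns and a NOT NULL constraint
cur.execute("""
    CREATE TABLE city_weather (
        city        TEXT NOT NULL,
        temp_c      REAL,          -- NULL means "no reading yet"
        observed_on TEXT
    )
""")

# Add rows, including one with a NULL temperature
cur.executemany(
    "INSERT INTO city_weather VALUES (?, ?, ?)",
    [("Oslo", -3.5, "2024-01-10"),
     ("Lima", 24.0, "2024-01-10"),
     ("Pune", None, "2024-01-10")],
)

# Query for specific information, deal with NULL values, order results
cur.execute("""
    SELECT city, temp_c
    FROM city_weather
    WHERE temp_c IS NOT NULL     -- exclude missing readings
    ORDER BY temp_c DESC
""")
rows = cur.fetchall()
print(rows)  # [('Lima', 24.0), ('Oslo', -3.5)]
conn.close()
```

Swapping sqlite3 for a MySQL connector changes the connection call and placeholder style, but not the shape of the queries.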

Practical A/B Testing

Whether you're a catalyst for organizational change or have the support you need to create an engineering culture that embraces A/B testing, this book will help you do it right. The step-by-step instructions will demystify the entire process, from constructing an A/B test to breaking down the decision factors to build an engineering platform. When you're ready to run the A/B test of your dreams, you'll have the perfect blueprint. With smart, tactful approaches to orchestrating A/B testing on a product, you'll quickly discover how to reap all the benefits that A/B testing has to offer - benefits that span your users, your product, and your team. Take the reins today, and be the change you want to see in your engineering and product organizations. Develop a hypothesis statement that's backed by metrics that demonstrate whether your prediction for the experiment is correct. Build more inclusive products by leveraging audience segmentation strategies and ad-hoc post analysis to better understand the impact of changes on specific user groups. Determine which path is best for your team when deciding whether to go with a third-party A/B test framework or to build the A/B testing platform in-house. And finally, learn how to cultivate an experimentation-friendly culture within your team. Leverage the A/B testing methodology to demonstrate the impact of changes on a product to your users, your key business metrics, and the way your team works together. After all, if you aren't measuring the impact of the changes you make, how will you know if you're truly making improvements?
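As a rough illustration of pairing a hypothesis with a metric, the sketch below runs a standard two-proportion z-test on made-up conversion counts for two variants; the function name and all numbers are invented for the example, not taken from the book:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical test: does variant B lift conversion over variant A?
# 120/2400 = 5.0% conversion for A, 156/2400 = 6.5% for B.
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at alpha = 0.05 if p < 0.05
```

A hypothesis statement then reads against the result: "variant B increases conversion" is supported here only if the p-value clears the significance threshold chosen before the experiment started.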

Data Fabric Architectures

The immense increase in the size and variety of real-time data generated across various edge computing platforms results in unstructured databases and data silos. This edited book gathers an international set of researchers to investigate the possibilities offered by data-fabric solutions; the volume focuses in particular on data architectures and on semantic changes in future data landscapes.

Uncertainty in Data Envelopment Analysis

Classical data envelopment analysis (DEA) models use crisp data to measure the inputs and outputs of a given system. In settings such as manufacturing systems, production processes, and service systems, the inputs and outputs may be complex and difficult to measure with classical DEA models, which fundamentally depend on crisp input and output data. If DEA models can instead handle complex, uncertain data, they become far more useful and practical for decision makers. Uncertainty in Data Envelopment Analysis introduces methods to investigate uncertain data in DEA models, taking a deeper look at two types of uncertain DEA methods based on uncertain measures: fuzzy DEA and belief degree-based uncertain DEA. These models aim to solve problems encountered by classical data analysis in cases where the inputs and outputs of systems and processes are volatile and complex, making measurement difficult.
•Introduces methods to deal with uncertain data in DEA models, serving as a source of information and a reference for researchers and engineers
•Presents DEA models that can be used for evaluating the outputs of many real-life systems in social and engineering subjects
•Provides fresh DEA models for efficiency evaluation from the perspective of imprecise data
•Applies fuzzy set and uncertainty theories to DEA to produce a new method of dealing with empirical data
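For intuition about what classical, crisp-data DEA computes, the toy sketch below scores the special single-input, single-output case, where the CCR efficiency score reduces to each unit's output/input ratio scaled by the best observed ratio. General DEA with multiple inputs and outputs (and the uncertain variants the book covers) requires solving a linear program per unit; the unit names and figures here are invented:

```python
# Toy DEA illustration with crisp data: one input, one output per
# decision-making unit (DMU).  Hypothetical bank branches.
dmus = {
    "Branch A": {"input": 10.0, "output": 8.0},
    "Branch B": {"input": 12.0, "output": 12.0},
    "Branch C": {"input": 9.0,  "output": 4.5},
}

# Output/input ratio for each DMU
ratios = {name: d["output"] / d["input"] for name, d in dmus.items()}

# Best ratio defines the efficient frontier; scale everything by it
best = max(ratios.values())
scores = {name: r / best for name, r in ratios.items()}

for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: efficiency = {s:.2f}")
```

Uncertain DEA replaces the crisp input/output numbers above with fuzzy numbers or belief degrees, which is exactly where the classical calculation breaks down and the book's methods take over.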

Maturing the Snowflake Data Cloud: A Templated Approach to Delivering and Governing Snowflake in Large Enterprises

This project-oriented book gives you a hands-on approach to designing, developing, and templating your Snowflake platform delivery. Written by seasoned Snowflake practitioners, the book is full of practical guidance and advice to accelerate and mature your Snowflake journey. Working through the examples helps you develop the skill, knowledge, and expertise to expand your organization’s core Snowflake capability and prepare for later incorporation of additional Snowflake features as they become available. Your Snowflake platform will be resilient, fit for purpose, and extensible, and will guarantee rapid, consistent, and repeatable pattern-based deployments ready for application delivery. When a Snowflake account is delivered, there are no controls, guard rails, external monitoring, or governance mechanisms baked in. From a large-organization perspective, this book explains how to deliver your core Snowflake platform in the form of a Landing Zone, a consistent, templated approach that assumes familiarity with Snowflake core concepts and principles. The book also covers Snowflake from a governance perspective and addresses the “who can see what?” question, satisfying requirements to know for certain that your Snowflake accounts properly adhere to your organization’s data usage policies. The book provides a proven pathway to success by equipping you with the skill, knowledge, and expertise to accelerate Snowflake adoption within your organization. The patterns delivered within this book are used for production deployment, and are proven in real-world use. Examples in the book help you succeed in an environment in which governance policies, processes, and procedures oversee and control every aspect of your Snowflake platform development and delivery life cycle. Your environment may not be so exacting, but you’ll still benefit from the rigorous and demanding perspective this book’s authors bring to the table.
The book shows you how to leverage what you already know and adds what you don’t know, all applied to delivering your Snowflake accounts. You will know how to position your organization to deliver consistent Snowflake accounts that are prepared and ready for immediate application development.
What You Will Learn
•Create a common, consistent deployment framework for Snowflake in your organization
•Enable rapid up-skill and adoption of Snowflake, leveraging the benefits of cloud platforms
•Develop a deep understanding of Snowflake administration and configuration
•Implement consistent, approved design patterns that reduce account provisioning times
•Manage data consumption by monitoring and controlling access to datasets
Who This Book Is For
Systems administrators charged with delivering a common implementation pattern for all Snowflake accounts within an organization; senior managers looking to simplify the delivery of complex technology into their existing infrastructure; developers seeking to understand guard rails, monitoring, and controls to ensure that Snowflake meets their organization's requirements; sales executives needing to understand how their data usage can be monitored and gain insights into how their data is being consumed; and governance colleagues wanting to know who can see each data set, identify toxic role combinations, and have confidence that their Snowflake accounts are properly provisioned