talk-data.com

Topic: data — 5765 tagged activities

Activity trend: peaks of about 3 activities per quarter, 2020-Q1 through 2026-Q1

Activities (5765, newest first)

Exam Ref 70-765 Provisioning SQL Databases, First Edition

Prepare for Microsoft Exam 70-765 and help demonstrate your real-world mastery of provisioning SQL Server databases both on-premises and in SQL Azure. Designed for experienced IT professionals ready to advance their status, Exam Ref focuses on the critical thinking and decision-making acumen needed for success at the MCSA level.

Focus on the expertise measured by these objectives:
• Implement SQL in Azure
• Manage databases and instances
• Manage storage

This Microsoft Exam Ref:
• Organizes its coverage by exam objectives
• Features strategic, what-if scenarios to challenge you
• Assumes you have working knowledge of SQL Server administration and maintenance, as well as Azure skills

About the Exam: Exam 70-765 focuses on skills and knowledge for provisioning, upgrading, and configuring SQL Server; managing databases and files; and provisioning, migrating, and managing databases in the Microsoft Azure cloud.

About Microsoft Certification: Passing this exam, as well as Exam 70-764: Administering a SQL Database Infrastructure, earns you the MCSA: SQL 2016 Database Administration certification, qualifying you for a position as a database administrator or infrastructure specialist. See full details at: microsoft.com/learning

Business Research Reporting

Business Research Reporting addresses the essential activities of locating, collecting, evaluating, analyzing, interpreting, and reporting business data. It highlights the value of primary and secondary research in making business decisions and solving business problems, and it aims to help business managers, MBA candidates, and upper-level college students boost their research skills and report research with confidence. The book discusses primary data collection, sampling concepts, and the use of measurement and scales in preparing instruments. It also explores statistical and non-statistical analysis of qualitative and quantitative data, and data interpretation (findings, conclusions, and recommendations). The author shows how to locate, evaluate, and extract secondary data found on the web and in brick-and-mortar libraries, including optimized searching, evaluating, and recording. The book also demonstrates how to avoid copyright infringement and plagiarism, use online citation software, and cite sources when writing and presenting. Two glossaries, one each for primary and secondary research, round out the content. Business Research Reporting can be your go-to guidebook for years to come: read it in a couple of hours to pick up ample information you can apply instantly, then keep it handy and refer to it in your ongoing research activities.

Learning Pentaho Data Integration 8 CE - Third Edition

"Learning Pentaho Data Integration 8 CE" is your comprehensive guide to mastering data manipulation and integration using Pentaho Data Integration (PDI) 8 Community Edition. Through step-by-step instructions and practical examples, you'll learn to explore, transform, validate, and integrate data from multiple sources, equipping you to handle real-world data challenges efficiently.

What this Book will help me do:
• Install and understand the foundational concepts of Pentaho Data Integration 8 Community Edition
• Organize, clean, and transform raw data from various sources into useful formats
• Perform advanced data operations such as metadata injection, managing relational databases, and implementing ETL solutions
• Design, create, and deploy comprehensive data warehouse solutions using modern best practices
• Streamline daily data processing tasks with flexibility and accuracy while handling errors gracefully

Author(s): The author, Carina Roldán, is an experienced professional in the field of data science and ETL (Extract, Transform, Load) development. Her expertise in leveraging tools like Pentaho Data Integration has allowed her to contribute significantly to BI and data management projects, and her approach in this book reflects her commitment to simplifying complex topics for aspiring professionals.

Who is it for? This book is ideal for software developers, data analysts, business intelligence professionals, and IT students aiming to enhance their skills in ETL processes using Pentaho Data Integration. Beginners who wish to learn PDI comprehensively and professionals looking to deepen their expertise will both find value in this resource. It's also suitable for individuals involved in data warehouse design and implementation, and it will equip you with the skills to handle diverse data transformation tasks effectively.
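The extract-transform-load workflow that PDI expresses graphically can be sketched in plain Python to show the idea; the CSV data, field names, and cleaning rules below are hypothetical, not taken from the book.

```python
import csv
import io

# Hypothetical raw extract: inconsistent casing and one unparsable row,
# the kind of input a PDI transformation would clean up.
RAW = """region,amount
north, 120.5
SOUTH,80
east,not_a_number
North,29.5
"""

def extract(text):
    """Extract step: read CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform step: normalize region names, cast amounts, drop bad rows."""
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"].strip())
        except ValueError:
            continue  # in PDI this row would be routed to an error hop
        clean.append({"region": row["region"].strip().lower(), "amount": amount})
    return clean

def load(rows):
    """Load step: aggregate amounts per region (a stand-in for a DB write)."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

totals = load(transform(extract(RAW)))
print(totals)  # the two spellings of "north" are merged; the bad row is dropped
```

In PDI each of these steps would be a node in a transformation graph; the pipeline shape — extract, clean, aggregate, write — is the same.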

Learning PostgreSQL 10 - Second Edition

Dive into the world of PostgreSQL 10, one of the most widely used open-source database systems. This comprehensive guide will teach you the essential features and functionalities of PostgreSQL, enabling you to develop, manage, and optimize database systems with confidence and efficiency.

What this Book will help me do:
• Gain a foundational understanding of relational databases and PostgreSQL
• Learn how to install, set up, and configure a PostgreSQL database system
• Master SQL query writing, data manipulation, and advanced queries with PostgreSQL
• Understand server-side programming with PL/pgSQL and define advanced schema objects
• Optimize database performance, leverage advanced data types, and connect PostgreSQL with Python applications

Author(s): Juba and Volkov are seasoned experts in database management and software development. Their extensive experience with PostgreSQL ensures that each concept is explained practically and effectively. They aim to simplify complex topics for beginners and provide tips that are valuable for practitioners at various levels.

Who is it for? This book is ideal for students, developers, and IT professionals who are new to PostgreSQL or wish to deepen their understanding of database technology. It caters to beginners looking to acquire foundational skills and database enthusiasts aiming to master PostgreSQL functionalities. Whether you're exploring database management for the first time or refining your existing skills, this guide is tailored to your needs.
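The SQL querying skills the book builds (joins, aggregates, common table expressions) are portable across relational databases. The sketch below exercises them through Python's built-in sqlite3 module, since a live PostgreSQL server can't be assumed here; the schema and data are invented for illustration.

```python
import sqlite3

# A minimal relational schema; table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE books (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT,
                    FOREIGN KEY (author_id) REFERENCES authors(id));
INSERT INTO authors VALUES (1, 'Ada'), (2, 'Grace');
INSERT INTO books VALUES (1, 1, 'Queries 101'), (2, 1, 'Joins in Depth'),
                         (3, 2, 'Indexing');
""")

# A CTE plus JOIN and GROUP BY -- the bread and butter of SQL querying.
rows = conn.execute("""
    WITH counts AS (
        SELECT author_id, COUNT(*) AS n FROM books GROUP BY author_id
    )
    SELECT a.name, c.n
    FROM authors a JOIN counts c ON c.author_id = a.id
    ORDER BY c.n DESC
""").fetchall()
print(rows)  # [('Ada', 2), ('Grace', 1)]
conn.close()
```

The same query runs unchanged on PostgreSQL; what PostgreSQL adds on top — PL/pgSQL, richer types, server-side features — is where the book goes further.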

Pro SAP Scripts, Smartforms, and Data Migration: ABAP Programming Simplified

Master SAP scripts, Smartforms, and data migration with hands-on exercises. The information provided in this book will help you decode the complexities and intricacies of SAP ABAP programming. Pro SAP Scripts, Smartforms, and Data Migration begins by describing the components of a SAP script: forms, styles, and standard texts. It then shows you how an ABAP program can invoke a SAP script form and send data to the form to provide output. You will then apply these concepts to hands-on exercises covering real business scenarios, including creating a custom form from scratch to output purchase orders. Smartforms is then introduced as an enhanced tool to output business documents, and the book shows you how to apply its concepts to real-world problems. The data migration material includes details of the Legacy System Migration Workbench (LSMW), introduced as a platform from which every data migration task can be performed, minimizing or eliminating programming.

What You Will Learn:
• Create and deploy SAP script forms and related objects
• Modify a copy of a SAP-supplied SAP script form, configure it, and deploy it according to transaction code ME22N
• Build Smartforms forms and deploy them
• Carry out data migration using the batch input and call transaction methods
• Perform data migration using all four methods available in LSMW
• Modify a copy of a SAP-supplied Smartforms form, configure it, and deploy it according to transaction code NACE

Who This Book Is For: Readers new to SAP ABAP programming (close to three years of experience or less) are the primary target audience for this book. Intermediate users can also utilize it as a reference source.

Learning D3.js 5 Mapping - Second Edition

This book, "Learning D3.js 5 Mapping", guides developers through the process of creating dynamic and interactive data visualizations. With a focus on D3.js, you'll learn to harness the power of JavaScript to create maps and graphical objects that inform and engage.

What this Book will help me do:
• Gain expertise in working with SVG geometric shapes to design compelling graphics
• Learn techniques to manage, process, and use geographic data effectively
• Master adding interactivity to visual maps to provide an immersive user experience
• Understand how to optimize and manipulate GeoJSON files using TopoJSON
• Learn to create varied map types, such as hexbins and globes, using D3.js and Canvas

Author(s): Newton and Oscar Villarreal, among others, collaborated to author this guide. They are experienced in front-end development and data visualization, bringing a practical, hands-on approach to learning. Their backgrounds ensure the book addresses common challenges faced during implementation, offering thoughtful solutions.

Who is it for? "Learning D3.js 5 Mapping" is perfect for web developers familiar with HTML, CSS, and JavaScript who want to expand their expertise in data visualization and mapping. If you're looking to incorporate interactive charts or maps into your web applications, this book provides practical guidance and solid fundamentals. No prior experience with D3.js is necessary.
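Managing and processing geographic data, as the book teaches for D3, usually starts with GeoJSON geometry. Here is a language-neutral sketch in Python (D3 itself is JavaScript) that computes the bounding box of a hypothetical polygon feature — the kind of pre-processing you do before projecting a shape onto a map.

```python
# A tiny GeoJSON-like feature; coordinates are illustrative (lon, lat) pairs.
feature = {
    "type": "Feature",
    "geometry": {
        "type": "Polygon",
        "coordinates": [[[-3.0, 40.0], [2.0, 41.5], [1.0, 43.0], [-3.0, 40.0]]],
    },
    "properties": {"name": "demo-region"},
}

def bounding_box(geometry):
    """Return (min_lon, min_lat, max_lon, max_lat) for a Polygon geometry."""
    ring = geometry["coordinates"][0]  # the outer ring of the polygon
    lons = [lon for lon, lat in ring]
    lats = [lat for lon, lat in ring]
    return (min(lons), min(lats), max(lons), max(lats))

bbox = bounding_box(feature["geometry"])
print(bbox)  # (-3.0, 40.0, 2.0, 43.0)
```

In a D3 application the equivalent work is done by helpers such as the path generator's bounds, but the underlying data shape — nested coordinate rings — is the same.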

R Data Mining

Dive into the world of data mining with 'R Data Mining' and discover how to utilize R's vast tools for uncovering insights in data. This hands-on guide immerses you in real-world cases, teaching both foundational concepts and advanced techniques like regression models and text mining. You'll emerge with a sharp understanding of how to transform raw data into actionable information.

What this Book will help me do:
• Gain proficiency in R packages such as dplyr and ggplot2 for data manipulation and visualization
• Master the CRISP-DM methodology to systematically approach data mining projects
• Develop skills in data cleaning and validation to ensure quality data analysis
• Understand and implement multiple regression and classification techniques effectively
• Learn to use ensemble learning methods and produce reports with R Markdown

Author(s): Andrea Cirillo brings extensive expertise in data science and R programming as the author of 'R Data Mining.' Their practical approach, drawing on professional experience in various industries, makes complex techniques accessible and engaging, and their passion for teaching translates into a meticulously crafted learning journey for aspiring data miners.

Who is it for? This book is ideal for beginner- to intermediate-level data analysts or aspiring data scientists eager to delve into the field of data mining using R. If you're familiar with the basics of programming in R and want to expand into practical applications of data mining methodologies, this is the resource for you. Gain hands-on experience by engaging with real-world datasets and scenarios.
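Regression, one of the modeling techniques the book applies in R, reduces in its simplest form to ordinary least squares. The sketch below fits a line in plain Python on made-up data; the book's own examples use R and real datasets.

```python
# Ordinary least squares for y = a + b*x on a small made-up dataset.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.1, 8.0, 9.9]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Slope = covariance(x, y) / variance(x); the intercept follows from the means.
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
a = mean_y - b * mean_x

print(round(b, 3), round(a, 3))  # slope ~ 1.97, intercept ~ 0.09
```

In R this is `lm(y ~ x)`; the closed-form arithmetic above is what that call computes for a single predictor.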

IBM DS8880 Architecture and Implementation (Release 8.3)

Abstract: This IBM® Redbooks® publication describes the concepts, architecture, and implementation of the IBM DS8880 family. The book provides reference information to assist readers who need to plan for, install, and configure DS8880 systems. The IBM DS8000® family is a high-performance, high-capacity, highly secure, and resilient series of disk storage systems, and the DS8880 family is the latest and most advanced of the DS8000 offerings to date. High availability, multiplatform support (including IBM Z), and simplified management tools help provide a cost-effective path to on-demand, cloud-based infrastructures. The IBM DS8880 family now offers business-critical, all-flash, and hybrid data systems that span a wide range of price points:
• DS8884 -- Business Class
• DS8886 -- Enterprise Class
• DS8888 -- Analytics Class
The DS8884 and DS8886 are available either as hybrid models or configured as all-flash. Each model represents the most recent in this series of high-performance, high-capacity, flexible, and resilient storage systems, intended to address the needs of the most demanding clients. Two powerful IBM POWER8® processor-based servers manage the cache to streamline disk I/O, maximizing performance and throughput. These capabilities are further enhanced by the second generation of high-performance flash enclosures (HPFEs Gen-2) and newer flash drives. Like its predecessors, the DS8880 supports advanced disaster recovery (DR) solutions, business continuity solutions, and thin provisioning. All disk drives in the DS8880 storage system include the Full Disk Encryption (FDE) feature. The DS8880 can automatically optimize the use of each storage tier, particularly flash drives, by using the IBM Easy Tier® feature.

Introducing ArcGIS API 4 for JavaScript: Turn Awesome Maps into Awesome Apps

Learn to use the ArcGIS API 4 for JavaScript to build custom web mapping applications. This book teaches you to easily create interactive displays of geographic information that you can use to tell stories and answer questions. Version 4 of the ArcGIS API for JavaScript introduces new patterns and fundamental concepts, including 3D mapping capabilities, and you will learn the fundamentals of using the API in order to get the most out of it. Covering key concepts and how different components work together, you will also learn how to take advantage of the Widget framework built into the API to build your own reusable widgets for your own ArcGIS JSAPI applications. Including a series of samples you can use to leverage the API for your own applications, Introducing ArcGIS API 4 for JavaScript helps you take your existing knowledge of JavaScript to a new level and add new features to your app libraries.

What You'll Learn:
• Create both 2D and 3D custom web mapping applications
• Work with popups and custom widgets
• Leverage the ArcGIS platform in your applications
• Utilize custom visualizations

Who This Book Is For: Developers who need to learn the ArcGIS JSAPI for work or school, with some JavaScript experience; GIS or mapping experience is not required.

Beginning XML with C# 7: XML Processing and Data Access for C# Developers

Master the basics of XML as well as the namespaces and objects you need to know in order to work efficiently with it. You'll explore the extensive support for XML in everything from data access to configuration, and from raw parsing to code documentation, through clear, practical examples that illustrate best practices for implementing XML APIs and services as part of your C#-based Windows 10 applications. Beginning XML with C# 7 is completely revised to cover the XML features of .NET Framework 4.7 using the C# 7 programming language. In this update, you'll discover the tight integration of XML with ADO.NET and LINQ, as well as additional .NET support for today's RESTful web services and Web API. Written by a Microsoft Most Valuable Professional and developer, this book demystifies everything to do with XML and C# 7.

What You Will Learn:
• Discover how XML works with the .NET Framework
• Read, write, access, validate, and manipulate XML documents
• Transform XML with XSLT
• Use XML serialization and web services
• Combine XML with ADO.NET and SQL Server
• Create services using Windows Communication Foundation
• Work with LINQ
• Use XML with Web API, and more

Who This Book Is For: Developers with experience in C# and .NET who are new to the nuances of using XML. Some XML experience is helpful.
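The parse-query-modify workflow the book covers through C#'s XML APIs looks much the same in any XML library. This sketch uses Python's xml.etree.ElementTree for brevity; the catalog document is invented for illustration.

```python
import xml.etree.ElementTree as ET

# A small hypothetical catalog document.
doc = """<catalog>
  <book id="b1"><title>XML Basics</title><price>25.00</price></book>
  <book id="b2"><title>Advanced XML</title><price>40.00</price></book>
</catalog>"""

root = ET.fromstring(doc)

# Query: collect titles of books above a price threshold.
expensive = [b.findtext("title") for b in root.findall("book")
             if float(b.findtext("price")) > 30]

# Modify: add an attribute to each book, then serialize back to a string.
for b in root.findall("book"):
    b.set("currency", "USD")
out = ET.tostring(root, encoding="unicode")

print(expensive)  # ['Advanced XML']
```

In the book's C# setting the same three phases map onto loading an XDocument or XmlDocument, querying it (with LINQ to XML or XPath), and writing it back out.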

Db2 Skeleton Cloning: Protecting Your Production Environment

IBM Db2® for z/OS® is well known as the gold-standard information steward. Deep synergy with the z/OS operating system and the IBM Z platform provides support for the highest transaction volumes with the ultimate levels of availability. Like any high-performance engine, occasional maintenance or upgrades are needed to maintain peak performance and to incorporate new features. Those who demand the highest standards and protection of their production environments know to test changes outside of production first. It is common to have development or test environments for application development and verification; but what about applying Db2 maintenance or migrating to new version or release levels? You probably perform these activities outside of production first, but are those environments similar enough to production to surface the same results you might encounter in production? Your production Db2 Catalog and Directory often has a different mix and complexity of objects, created at different levels of Db2 that can span decades. The best test of these activities is against your production system, but that is the very system you want to protect. How can you accomplish this? Clone it! Skeleton cloning produces a specific kind of clone: a replica of the portions of your Db2 production environment that are needed to complete your testing. You can use the skeleton clone to find issues before they occur in production. This process allows you to refine maintenance steps in a safe environment and to minimize potential downtime when performing the same steps on a production system. This IBM® Redpaper™ publication gives a high-level overview of the IBM Db2 Cloning Tool and includes specific use cases for the tool. It also details the skeleton cloning process, which you can use to test migration, function levels, and maintenance, and includes demo examples that show a Db2 11 to Db2 12 migration test using skeleton cloning.

Introduction to MATLAB for Engineers and Scientists: Solutions for Numerical Computation and Modeling

Familiarize yourself with MATLAB using this concise, practical tutorial focused on writing code to learn concepts. Starting from the basics, this book covers array-based computing, plotting and working with files, numerical computation formalism, and the primary concepts of approximations. Introduction to MATLAB is useful for industry engineers, researchers, and students who are looking for open-source solutions for numerical computation. You will learn by doing: the book avoids technical jargon, which makes the concepts easy to absorb. First you'll see how to run basic calculations, taking on technical complexities incrementally as you progress toward advanced topics. Throughout, the language is kept simple to ensure that readers at all levels can grasp the concepts.

What You'll Learn:
• Apply sample code to your engineering or science problems
• Work with MATLAB arrays, functions, and loops
• Use MATLAB's plotting functions for data visualization
• Solve numerical computing and computational engineering problems with a MATLAB case study

Who This Book Is For: Engineers, scientists, researchers, and students who are new to MATLAB. Some prior programming experience is helpful but not required.
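The array-based computing style the book teaches — whole-array expressions instead of explicit loops — carries over directly to NumPy, which is the usual open-source counterpart to MATLAB arrays. A small sketch (NumPy assumed available; the formula is arbitrary):

```python
import numpy as np

# MATLAB-style array computing: operate on whole arrays, no explicit loops.
t = np.linspace(0.0, 1.0, 5)   # like t = linspace(0, 1, 5)
y = 3 * t ** 2 + 1             # elementwise, like y = 3*t.^2 + 1

# Boolean masking, like y(y > 2) in MATLAB.
big = y[y > 2]

print(y.tolist())    # [1.0, 1.1875, 1.75, 2.6875, 4.0]
print(big.tolist())  # [2.6875, 4.0]
```

The payoff is the same in both languages: one expression replaces a loop, and the library vectorizes the arithmetic for you.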

Ceph Cookbook - Second Edition

Dive into Ceph Cookbook, the ultimate guide to implementing and managing Ceph storage systems with practical solutions. With this book, you will learn to install, configure, and optimize Ceph storage clusters while mastering integration aspects such as cloud solutions. Discover troubleshooting techniques and best practices for efficient storage operations.

What this Book will help me do:
• Understand and deploy Ceph storage systems effectively
• Perform performance tuning and cluster benchmarking for Ceph
• Integrate Ceph storage seamlessly with cloud platforms and applications
• Operate and troubleshoot Ceph clusters in production environments
• Adopt advanced techniques such as erasure coding and RBD mirroring in Ceph

Author(s): This book is authored by Karan Singh and team, who bring years of professional experience in storage systems design and implementation. Their deep understanding of Ceph deployments across various applications ensures a hands-on approach to the subject, and their intention is to equip readers with practical, actionable knowledge.

Who is it for? This resource caters to storage architects, cloud engineers, and system administrators looking to enhance their expertise in scalable storage solutions. It is ideal for readers who are familiar with Linux and basic storage concepts but want to specialize in the Ceph ecosystem. Readers aiming to deploy cost-efficient and reliable software-defined storage solutions will find it invaluable.

Big Data Analytics with SAS

Discover how to leverage the power of SAS for big data analytics in "Big Data Analytics with SAS." This book helps you unlock key techniques for preparing, analyzing, and reporting on big data effectively using SAS. Whether you're exploring integration with Hadoop and Python or mastering SAS Studio, you'll advance your analytics capabilities.

What this Book will help me do:
• Set up a SAS environment for performing hands-on data analytics tasks efficiently
• Master the fundamentals of SAS programming for data manipulation and analysis
• Use SAS Studio and Jupyter Notebook to interface with SAS efficiently and effectively
• Perform preparatory data workflows and advanced analytics, including predictive modeling and reporting
• Integrate SAS with platforms like Hadoop, SAP HANA, and Cloud Foundry for scaling analytics processes

Author(s): Pope is a seasoned data analytics expert with extensive experience in SAS and big data platforms. With a passion for demystifying complex data workflows, the author teaches SAS techniques in an approachable way, and the book's expert insights and practical examples empower readers to confidently analyze and report on data.

Who is it for? If you're a SAS professional or a data analyst looking to expand your skills in big data analysis, this book is for you. It suits readers aiming to integrate SAS into diverse tech ecosystems or seeking to learn predictive modeling and reporting with SAS. Both beginners and those familiar with SAS can benefit.

R Data Visualization Recipes

"R Data Visualization Recipes" is a valuable resource for data professionals who want to create clear and effective data visualizations using R. Through a series of practical recipes, the book walks you through various techniques, from mastering the basics to creating advanced, interactive dashboards. By following these recipes, you'll be equipped to use R's visualization packages to their full potential.

What this Book will help me do:
• Understand and effectively use R's diverse data visualization libraries
• Create polished and informative graphics with ggplot2, ggvis, and plotly
• Enhance plots with interactive and animated elements to tell a compelling story
• Develop expertise in creating three-dimensional and multivariate visualizations
• Design custom interactive dashboards using the power of Shiny

Author(s): Bianchi Lanzetta is an expert in data visualization and programming, bringing years of experience in using R for applications in data analysis and graphics. With a background in software development, data science, and teaching, the author shares practical insights and clear instructions in an approachable, methodical writing style that makes even complex topics accessible.

Who is it for? This book is perfect for data professionals, analysts, and scientists who know the basics of R and want to enhance their ability to communicate findings visually. Even beginners with some exposure to R's ggplot2 package or similar will find the recipes approachable and methodical. Whether you're looking to augment your reporting abilities or explore advanced data visualization, you'll gain practical, directly applicable skills.

Analyzing Multidimensional Well-Being

“An indispensable reference for all researchers interested in the measurement of social welfare. . .” —François Bourguignon, Emeritus Professor at Paris School of Economics, Former Chief Economist of the World Bank. “. . .a detailed, insightful, and pedagogical presentation of the theoretical grounds of multidimensional well-being, inequality, and poverty measurement. Any student, researcher, and practitioner interested in the multidimensional approach should begin their journey into such a fascinating theme with this wonderful book.” —François Maniquet, Professor, Catholic University of Louvain, Belgium. A Review of the Multidimensional Approaches to the Measurement of Welfare, Inequality, and Poverty Analyzing Multidimensional Well-Being: A Quantitative Approach offers a comprehensive approach to the measurement of well-being that includes characteristics such as income, health, literacy, and housing. The author presents a systematic comparison of the alternative approaches to the measurement of multidimensional welfare, inequality, poverty, and vulnerability. The text contains real-life applications of some multidimensional aggregations (most of which have been designed by international organizations such as the United Nations Development Program and the Organization for Economic Co-operation and Development) that help to judge the performance of a country in the various dimensions of well-being. The text offers an evaluation of how well a society is doing with respect to achievements of all the individuals in the dimensions considered and clearly investigates how achievements in the dimensions can be evaluated from different perspectives. The author includes a detailed scrutiny of alternative techniques for setting weights to individual dimensional metrics and offers an extensive analysis into both the descriptive and welfare theoretical approaches to the concerned multi-attribute measurement and related issues. 
This important resource:
• Contains a synthesis of multidimensional welfare, inequality, poverty, and vulnerability analysis
• Examines aggregations of achievement levels in the concerned dimensions of well-being from various standpoints
• Shows how to measure poverty using panel data instead of restricting attention to a single period, and when we have imprecise information on dimensional achievements
• Argues that multidimensional analysis is intrinsically different from marginal distributions-based analysis

Written for students, teachers, researchers, and scholars, Analyzing Multidimensional Well-Being: A Quantitative Approach puts the focus on the various approaches to the measurement of the many aspects of well-being and quality of life. Satya R. Chakravarty is a Professor of Economics at the Indian Statistical Institute, Kolkata, India. He is an Editor of Social Choice and Welfare and a member of the Editorial Board of the Journal of Economic Inequality.
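The weighting question the book scrutinizes — how to set weights on individual dimensional metrics — is easiest to see in the generic linear composite index; this is a standard textbook form, not a formula taken from the book, which surveys many alternatives to it:

```latex
% Generic linear composite index for individual i over d dimensions,
% with normalized achievements x_{ij} and weights w_j summing to one.
W_i \;=\; \sum_{j=1}^{d} w_j \, x_{ij},
\qquad w_j \ge 0, \quad \sum_{j=1}^{d} w_j = 1 .
```

Changing the $w_j$ changes which dimension (income, health, literacy, housing) dominates the ranking, which is why the choice of weighting technique gets a detailed treatment.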

Machine Learning

Machine Learning: A Constraint-Based Approach provides readers with a refreshing look at the basic models and algorithms of machine learning, with an emphasis on current topics of interest, including neural networks and kernel machines. The book presents the information in a truly unified manner based on the notion of learning from environmental constraints. Regarding symbolic knowledge bases as collections of constraints, the book draws a path toward a deep integration with machine learning that relies on adopting multivalued logic formalisms, as in fuzzy systems. Special attention is reserved for deep learning, which fits the constraint-based approach followed in this book nicely. The book presents a simpler unified notion of regularization, strictly connected with the parsimony principle, and includes many solved exercises classified according to Donald Knuth's ranking of difficulty — a mix of warm-up exercises leading to deeper research problems. A software simulator is also included.

This book:
• Presents fundamental machine learning concepts, such as neural networks and kernel machines, in a unified manner
• Provides in-depth coverage of unsupervised and semi-supervised learning
• Includes a software simulator for kernel machines and learning from constraints, with exercises to facilitate learning
• Contains 250 solved examples and exercises chosen for their progression of difficulty from simple to complex
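The unified view of regularization the description mentions is usually written as a penalized objective — a data-fit term plus a parsimony term. This is the standard generic form, not a formula specific to this book:

```latex
% Penalized empirical risk: data fit plus a parsimony (regularization) term.
\min_{w} \;
\underbrace{\sum_{\ell=1}^{N} L\big(y_\ell, f(x_\ell; w)\big)}_{\text{data fit}}
\;+\; \lambda \, \underbrace{\Omega(w)}_{\text{parsimony}},
\qquad \lambda > 0 .
```

Ridge regression, weight decay in neural networks, and the margin term in kernel machines are all instances of this template with different choices of $L$ and $\Omega$, which is what makes a unified treatment possible.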

Functional Data Structures in R: Advanced Statistical Programming in R

Get an introduction to functional data structures using R, and write more effective code while improving the performance of your programs. Because data in functional languages is not mutable, this book teaches you workarounds: for example, you'll learn how to change variable-value bindings by modifying environments, which can be exploited to emulate pointers and implement traditional data structures. You'll also see how, by abandoning traditional data structures, you can manipulate structures by building new versions rather than modifying them. You'll discover how these so-called functional data structures differ from the traditional data structures you may know, and why they are worth understanding for serious algorithmic programming in a functional language such as R. By the end of Functional Data Structures in R, you'll understand the choices to make in order to work most effectively with data structures when you cannot modify the data itself. These techniques are especially applicable for algorithmic development important in big data, finance, and other data science applications.

What You'll Learn:
• Carry out algorithmic programming in R
• Use abstract data structures
• Work with both immutable and persistent data
• Emulate pointers and implement traditional data structures in R
• Build new versions of known traditional data structures

Who This Book Is For: Experienced or advanced programmers with at least a comfort level with R. Some experience with data structures is recommended.
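The idea of building new versions of a structure rather than modifying it in place can be sketched outside R as well. Here is a minimal persistent cons-list in Python (illustrative, not the book's R code) showing that an "update" shares the old cells instead of copying them:

```python
from typing import NamedTuple, Optional

class Cons(NamedTuple):
    """One immutable cell of a persistent linked list."""
    head: int
    tail: Optional["Cons"]

def prepend(lst, value):
    """Return a NEW list; the old list is untouched (structural sharing)."""
    return Cons(value, lst)

def to_pylist(lst):
    """Walk the cells into an ordinary Python list, for inspection."""
    out = []
    while lst is not None:
        out.append(lst.head)
        lst = lst.tail
    return out

base = prepend(prepend(None, 2), 1)   # the list [1, 2]
extended = prepend(base, 0)           # the list [0, 1, 2], sharing base's cells

print(to_pylist(base))       # [1, 2] -- unchanged by the "update"
print(to_pylist(extended))   # [0, 1, 2]
```

Because `extended.tail` is literally the old `base` object, the "update" costs one new cell regardless of list length — the same structural-sharing argument the book develops for functional data structures in R.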

Mastering MongoDB 3.x

"Mastering MongoDB 3.x" is your comprehensive guide to mastering the world of MongoDB, the leading NoSQL database. This book equips you with both foundational and advanced skills to effectively design, develop, and manage MongoDB-powered applications. Discover how to build fault-tolerant systems and dive deep into database internals, deployment strategies, and much more.

What this Book will help me do:
• Gain expertise in advanced querying using indexing and data expressions for efficient data retrieval
• Master MongoDB administration for both on-premises and cloud-based environments
• Learn data sharding and replication techniques to ensure scalability and fault tolerance
• Understand the intricacies of MongoDB internals, including performance optimization techniques
• Leverage MongoDB for big data processing by integrating with complex data pipelines

Author(s): Alex Giamas is a seasoned database developer and administrator with strong expertise in NoSQL technologies, particularly MongoDB. With years of experience guiding teams on creating and optimizing database structures, Alex ensures clear and practical methods for learning the essential aspects of MongoDB. His writing focuses on actionable knowledge and practical solutions for modern database challenges.

Who is it for? This book is perfect for database developers, system architects, and administrators who are already familiar with database concepts and are looking to deepen their knowledge in NoSQL databases, specifically MongoDB. Whether you're working on building web applications, scaling data systems, or ensuring fault tolerance, this book provides the guidance to optimize your database management skill set.