talk-data.com

Topic: statistics (512 tagged)

Activity trend: 2020-Q1 to 2026-Q1 (1 peak/qtr)

Activities: 512 activities · Newest first

Designing and Conducting Survey Research: A Comprehensive Guide, 4th Edition

The industry standard guide, updated with new ideas and SPSS analysis techniques. Designing and Conducting Survey Research: A Comprehensive Guide, Fourth Edition is the industry standard resource that covers all major components of the survey process, updated to include new data analysis techniques and SPSS procedures with sample data sets online. The book offers practical, actionable guidance on constructing the instrument, administering the process, and analyzing and reporting the results, providing extensive examples and worksheets that demonstrate the appropriate use of survey and data techniques. By clarifying complex statistical concepts and modern analysis methods, this guide enables readers to conduct a survey research project from initial focus concept to the final report. Public and nonprofit managers with survey research responsibilities need to stay up-to-date on the latest methods, techniques, and best practices for optimal data collection, analysis, and reporting. Designing and Conducting Survey Research is a complete resource, answering the "what", "why", and "how" every step of the way, and providing the latest information about technological advancements in data analysis. The updated fourth edition contains step-by-step SPSS data entry and analysis procedures, as well as SPSS examples throughout the text, using real data sets from real-world studies. Other new information includes topics such as:
- Nonresponse error/bias
- Ethical concerns and special populations
- Cell phone samples in telephone surveys
- Subsample screening and complex skip patterns
The fourth edition also contains new information on the growing importance of focus groups, and places a special emphasis on data quality, including size and variability. Those who employ survey research methods will find that Designing and Conducting Survey Research contains all the information needed to better design, conduct, and analyze a more effective survey.

Cognitive Interviewing Methodology

AN INTERDISCIPLINARY PERSPECTIVE ON THE EVOLUTION OF THEORY AND METHODOLOGY WITHIN COGNITIVE INTERVIEW PROCESSES. Providing a comprehensive approach to cognitive interviewing in the field of survey methodology, Cognitive Interviewing Methodology delivers a clear guide that draws upon modern, cutting-edge research from a variety of fields. Each chapter begins by summarizing the prevailing paradigms that currently dominate the field of cognitive interviewing. The underlying theoretical foundations are then presented, which supply readers with the necessary background to understand newly evolving techniques in the field. The theories lead into methods developed and practiced by leading practitioners, researchers, and academics. Finally, the edited guide lays out the limitations of cognitive interviewing studies and explores the benefits of combining cognitive interviewing with other methodological approaches. With a primary focus on question evaluation, Cognitive Interviewing Methodology also includes:
- Step-by-step procedures for conducting cognitive interviewing studies, covering the various aspects of data collection, questionnaire design, and data interpretation
- Newly developed tools to benefit cognitive interviewing studies as well as the field of question evaluation, such as Q-Notes, a data entry and analysis software application, and Q-Bank, an online resource that houses question evaluation studies
- A unique method for questionnaire designers, survey managers, and data users to analyze, present, and document survey data results from a cognitive interviewing study
An excellent reference for survey researchers and practitioners in the social sciences who utilize cognitive interviewing techniques in their everyday work, Cognitive Interviewing Methodology is also a useful supplement for courses on survey methods at the upper-undergraduate and graduate level.

Robust Cluster Analysis and Variable Selection

Clustering remains a vibrant area of research in statistics. Although there are many books on this topic, there are relatively few that are well founded in the theoretical aspects. In Robust Cluster Analysis and Variable Selection, Gunter Ritter presents an overview of the theory and applications of probabilistic clustering and variable selection, synthesizing the key research results of the last 50 years. The author focuses on the robust clustering methods he found to be the most useful on simulated data and real-time applications. The book provides clear guidance for the varying needs of both types of application, describing scenarios in which accuracy and speed are the primary goals. Robust Cluster Analysis and Variable Selection includes all of the important theoretical details, and covers the key probabilistic models, robustness issues, optimization algorithms, validation techniques, and variable selection methods. The book illustrates the different methods with simulated data and applies them to real-world data sets that can be easily downloaded from the web. This provides you with guidance on how to use clustering methods, as well as applicable procedures and algorithms, without having to understand their probabilistic fundamentals.
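
As a loose illustration of the kind of model-based (probabilistic) clustering the book formalizes, and not Ritter's own robust procedures, the sketch below fits Gaussian mixture models to simulated data with scikit-learn and uses BIC to choose the number of clusters. The data, the package choice, and the parameter values are assumptions made for illustration only.

    # Illustrative sketch only: model-based clustering on simulated data, with
    # BIC used to select the number of mixture components. Generic scikit-learn
    # usage, not the robust methods developed in the book.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Simulate three well-separated Gaussian clusters plus a few gross outliers.
    X = np.vstack([
        rng.normal(loc=(0, 0), scale=0.5, size=(100, 2)),
        rng.normal(loc=(5, 5), scale=0.5, size=(100, 2)),
        rng.normal(loc=(0, 5), scale=0.5, size=(100, 2)),
        rng.uniform(low=-10, high=15, size=(10, 2)),   # outliers
    ])

    # Fit mixtures with 1..6 components and keep the one with the lowest BIC.
    models = [GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 7)]
    best = min(models, key=lambda m: m.bic(X))
    print("chosen number of components:", best.n_components)
    print("cluster sizes:", np.bincount(best.predict(X)))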

Statistical Process Control for Managers

If you have been frustrated by very technical statistical process control (SPC) training materials, then this is the book for you. This book focuses on how SPC works and why managers should consider using it in their operations. It provides you with a conceptual understanding of SPC so that appropriate decisions can be made about the benefits of incorporating SPC into the process management and quality improvement processes. Today, there is little need to make the necessary calculations by hand, so the author utilizes Minitab and NWA Quality Analyst—two of the most popular statistical analysis software packages on the market. Links are provided to the home pages of these software packages where trial versions may be downloaded for evaluation and trial use. The book also addresses the question of why SPC should be considered for use, the process of implementing SPC, how to incorporate SPC into problem identification, problem solving, and the management and improvement of processes, products, and services.
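
The book itself works in Minitab and NWA Quality Analyst; as a rough illustration of the arithmetic that sits behind a Shewhart individuals control chart (centre line plus three-sigma limits), here is a minimal Python sketch. The measurements and the moving-range estimate of sigma are assumptions chosen for illustration, not data from the book.

    # Minimal sketch of the arithmetic behind an individuals (X) control chart:
    # centre line at the mean and control limits at +/- 3 standard deviations,
    # with sigma estimated from the average moving range (a common convention).
    import numpy as np

    x = np.array([10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.4, 9.7, 10.0, 10.1])

    centre = x.mean()
    moving_range = np.abs(np.diff(x)).mean()
    sigma_hat = moving_range / 1.128          # d2 constant for subgroups of size 2
    ucl = centre + 3 * sigma_hat
    lcl = centre - 3 * sigma_hat

    print(f"centre line = {centre:.3f}, LCL = {lcl:.3f}, UCL = {ucl:.3f}")
    print("out-of-control points:", np.where((x > ucl) | (x < lcl))[0])

In practice the software packages discussed in the book produce these charts directly; the point of the sketch is only that the limits are simple functions of the process mean and an estimate of its short-term variability.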

Nonparametric Hypothesis Testing: Rank and Permutation Methods with Applications in R

A novel presentation of rank and permutation tests, with accessible guidance to applications in R. Nonparametric testing problems are frequently encountered in many scientific disciplines, such as engineering, medicine and the social sciences. This book summarizes traditional rank techniques and more recent developments in permutation testing as robust tools for dealing with complex data with low sample sizes. Key features:
- Examines the most widely used methodologies of nonparametric testing
- Includes extensive software code in R, featuring worked examples, and uses real case studies from both experimental and observational studies
- Presents and discusses solutions to the most important and frequently encountered real problems in different fields
- Features a supporting website (www.wiley.com/go/hypothesis_testing) containing all of the data sets examined in the book along with ready-to-use R code
Nonparametric Hypothesis Testing combines an up-to-date overview with useful practical guidance to applications in R, and will be a valuable resource for practitioners and researchers working in a wide range of scientific fields, including engineering, biostatistics, psychology and medicine.
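
The book's examples are in R; purely to illustrate the permutation-testing idea it covers, here is a minimal two-sample permutation test of a difference in means written in Python. The data values are made up, and the Monte Carlo approximation with 10,000 random permutations is an illustrative choice rather than anything prescribed by the book.

    # Minimal two-sample permutation test of a difference in means.
    # The group labels are repeatedly shuffled and the test statistic recomputed
    # to build the null reference distribution.
    import numpy as np

    rng = np.random.default_rng(0)
    a = np.array([12.1, 11.4, 13.0, 12.7, 11.9])
    b = np.array([10.2, 10.9, 11.1, 10.5, 11.3])

    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    n_perm = 10_000
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        stat = perm[:len(a)].mean() - perm[len(a):].mean()
        if abs(stat) >= abs(observed):
            count += 1

    p_value = (count + 1) / (n_perm + 1)
    print(f"observed difference = {observed:.3f}, permutation p-value = {p_value:.4f}")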

Age-Period-Cohort Models

This book presents an introduction to the problems and strategies for modeling age, period, and cohort (APC) effects for aggregate-level data. These strategies include constrained estimation, the use of age and/or period and/or cohort characteristics, estimable functions, variance decomposition and a new technique called the s-constraint approach.
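
Constraints are needed in APC models because cohort, period, and age are exactly linearly dependent (cohort = period - age), so the linear effects cannot be separated without extra assumptions. The tiny numeric check below shows this rank deficiency; the ages and periods are hypothetical and the code is only an illustration of the identification problem, not of any estimator from the book.

    # Why APC models need constraints: with cohort = period - age, the three
    # linear terms are exactly collinear, so the design matrix is rank deficient.
    import numpy as np

    age = np.array([20, 30, 40, 20, 30, 40])
    period = np.array([2000, 2000, 2000, 2010, 2010, 2010])
    cohort = period - age                      # birth cohort

    X = np.column_stack([np.ones_like(age), age, period, cohort])
    print("columns:", X.shape[1], "rank:", np.linalg.matrix_rank(X))   # rank 3 < 4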

Applied Bayesian Modelling, 2nd Edition

This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBUGS and OpenBUGS. This feature continues in the new edition along with examples using R to broaden appeal and for completeness of coverage.
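
The book's worked examples use WinBUGS, OpenBUGS, and R; as a loose, language-agnostic illustration of the Bayesian updating that underlies such models, here is a conjugate beta-binomial update in Python. The prior and the data are hypothetical, and a conjugate analysis like this is far simpler than the MCMC-based models the book actually develops.

    # Loose illustration of Bayesian updating (not the book's BUGS code): a
    # conjugate Beta prior updated by binomial data, with a posterior summary.
    from scipy import stats

    a0, b0 = 1.0, 1.0          # Beta(1, 1) prior on a success probability
    successes, trials = 27, 40 # hypothetical observed data

    a_post, b_post = a0 + successes, b0 + trials - successes
    posterior = stats.beta(a_post, b_post)

    print(f"posterior mean = {posterior.mean():.3f}")
    print("95% credible interval =", [round(q, 3) for q in posterior.interval(0.95)])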

Basic Data Analysis for Time Series with R

Written at a readily accessible level, Basic Data Analysis for Time Series with R emphasizes the mathematical importance of collaborative analysis of data used to collect increments of time or space. Balancing a theoretical and practical approach to analyzing data within the context of serial correlation, the book presents a coherent and systematic regression-based approach to model selection. The book illustrates these principles of model selection and model building through the use of information criteria, cross validation, hypothesis tests, and confidence intervals. Focusing on frequency- and time-domain methods and trigonometric regression as the primary themes, the book also includes modern topical coverage of Fourier series and Akaike's Information Criterion (AIC). In addition, Basic Data Analysis for Time Series with R features:
- Real-world examples to provide readers with practical hands-on experience
- Multiple R software subroutines employed with graphical displays
- Numerous exercise sets intended to support readers' understanding of the core concepts
- Specific chapters devoted to the analysis of the Wolf sunspot number data and the Vostok ice core data sets
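
The book works in R; as a rough Python analogue of the AIC-based selection of a trigonometric (harmonic) regression model it describes, the sketch below simulates a monthly series with an annual cycle and compares models with different numbers of harmonics. The series, the period of 12, and the Gaussian least-squares form of AIC (up to an additive constant) are assumptions for illustration.

    # Rough illustration of AIC-based model selection for harmonic regression.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 120
    t = np.arange(n)
    y = 2.0 + np.sin(2 * np.pi * t / 12) + 0.5 * rng.normal(size=n)  # annual cycle + noise

    def harmonic_aic(y, t, n_harmonics):
        # Build the design matrix: intercept plus cosine/sine pairs.
        cols = [np.ones_like(t, dtype=float)]
        for k in range(1, n_harmonics + 1):
            cols += [np.cos(2 * np.pi * k * t / 12), np.sin(2 * np.pi * k * t / 12)]
        X = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        k_params = X.shape[1] + 1              # coefficients plus error variance
        return len(y) * np.log(rss / len(y)) + 2 * k_params

    for h in range(1, 4):
        print(f"{h} harmonic(s): AIC = {harmonic_aic(y, t, h):.2f}")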

Better Business Decisions from Data

" Everyone encounters statistics on a daily basis. They are used in proposals, reports, requests, and advertisements, among others, to support assertions, opinions, and theories. Unless you're a trained statistician, it can be bewildering. What are the numbers really saying or not saying? Better Business Decisions from Data: Statistical Analysis for Professional Success provides the answers to these questions and more. It will show you how to use statistical data to improve small, every-day management judgments as well as major business decisions with potentially serious consequences. Author Peter Kenny-with deep experience in industry-believes that "while the methods of statistics can be complicated, the meaning of statistics is not." He first outlines the ways in which we are frequently misled by statistical results, either because of our lack of understanding or because we are being misled intentionally. Then he offers sound approaches for understanding and assessing statistical data to make excellent decisions. Kenny assumes no prior knowledge of statistical techniques; he explains concepts simply and shows how the tools are used in various business situations. With the arrival of Big Data, statistical processing has taken on a new level of importance. Kenny lays a foundation for understanding the importance and value of Big Data, and then he shows how mined data can help you see your business in a new light and uncover opportunity. Among other things, this book covers: How statistics can help you assess the probability of a successful outcome How data is collected, sampled, and best interpreted How to make effective forecasts based on the data at hand How to spot the misuse or abuse of statistical evidence in advertisements, reports, and proposals How to commission a statistical analysis Arranged in seven parts-Uncertainties, Data, Samples, Comparisons, Relationships, Forecasts, and Big Data-" Better Business Decisions from Data is a guide for busy people in general management, finance, marketing, operations, and other business disciplines who run across statistics on a daily or weekly basis. You'll return to it again and again as new challenges emerge, making better decisions each time that boost your organization's fortunes—as well as your own.

Theory and Application of Statistical Energy Analysis, 2nd Edition

This up-to-date second edition provides a comprehensive examination of the theory and application of Statistical Energy Analysis (SEA) in acoustics and vibration. Complete with examples and data taken from real problems, this unique book also explores the influence of computers on SEA and emphasizes computer-based SEA calculations. In addition to a discussion of the relationship between SEA and other procedures used in response estimation, Theory and Application of Statistical Energy Analysis, Second Edition explores the basic relationships between modal and wave descriptions of systems.

Discrete and Continuous Simulation

When it comes to discovering glitches inherent in complex systems (be it a railway, banking, chemical production, medical, manufacturing, or inventory control system), developing a simulation of the system can identify problems with less time, effort, and disruption than it would take to employ the original. Advantageous to both academic and industrial practitioners, Discrete and Continuous Simulation: Theory and Practice offers a detailed view of simulation that is useful in several fields of study. This text concentrates on the simulation of complex systems, covering the basics in detail and exploring their diverse aspects, including continuous event simulation and optimization with simulation. It explores the connections between discrete and continuous simulation, with a specific focus on simulation in the supply chain and manufacturing field. It discusses Monte Carlo simulation, the basic and traditional form of simulation. It addresses future trends and technologies for simulation, with particular emphasis on .NET technologies and cloud computing, and proposes various simulation optimization algorithms from the existing literature. The book:
- Includes chapters on input modeling and hybrid simulation
- Introduces general probability theory
- Contains a chapter on Microsoft® Excel™ and MATLAB®/Simulink®
- Discusses various probability distributions required for simulation
- Describes essential random number generators
Discrete and Continuous Simulation: Theory and Practice defines the simulation of complex systems. This text benefits academic researchers in industrial, manufacturing, and systems engineering, computer science, and operations research, as well as researchers in transportation, operations management, healthcare systems, and human-machine systems.
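
As a minimal illustration of Monte Carlo simulation in the traditional sense mentioned above, the sketch below estimates pi by sampling points uniformly in the unit square and counting how many fall inside the quarter circle. The sample size is an arbitrary illustrative choice; the book's simulation models are of course far richer than this toy example.

    # Minimal Monte Carlo simulation: estimate pi from the fraction of random
    # points in the unit square that land inside the quarter circle.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    points = rng.random((n, 2))
    inside = np.sum(points[:, 0] ** 2 + points[:, 1] ** 2 <= 1.0)
    print("pi estimate:", 4 * inside / n)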

Advanced Backend Optimization

This book is a summary of more than a decade of research in the area of backend optimization. It contains the latest fundamental research results in this field. While existing books are often more oriented toward Masters students, this book is aimed more towards professors and researchers as it contains more advanced subjects. It is unique in the sense that it contains information that has not previously been covered by other books in the field, with chapters on phase ordering in optimizing compilation; register saturation in instruction level parallelism; code size reduction for software pipelining; memory hierarchy effects and instruction level parallelism. Other chapters provide the latest research results in well-known topics such as register need, and software pipelining and periodic register allocation.

Recursive Identification and Parameter Estimation

Recursive Identification and Parameter Estimation describes a recursive approach to solving system identification and parameter estimation problems arising from diverse areas. Supplying rigorous theoretical analysis, it presents the material and proposed algorithms in a manner that makes them easy to understand, providing readers with the modeling and identification skills required for successful theoretical research and effective application. The book begins by introducing the basic concepts of probability theory, including martingales, martingale difference sequences, Markov chains, mixing processes, and stationary processes. Next, it discusses the root-seeking problem for functions, starting with the classic Robbins-Monro (RM) algorithm, but with attention mainly paid to the stochastic approximation algorithms with expanding truncations (SAAWET), which serves as the basic tool for recursively solving the problems addressed in the book. The book not only presents results on system identification and parameter estimation, but also demonstrates how to apply the proposed approaches for addressing problems in a range of areas, including:
- Identification of ARMAX systems without imposing restrictive conditions
- Identification of typical nonlinear systems
- Optimal adaptive tracking
- Consensus of multi-agent systems
- Principal component analysis
- Distributed randomized PageRank computation
The book recursively identifies autoregressive and moving average with exogenous input (ARMAX) systems and discusses the identification of nonlinear systems. It concludes by addressing the problems arising from different areas that are solved by SAAWET. Demonstrating how to apply the proposed approaches to solve problems across a range of areas, the book is suitable for students, researchers, and engineers working in systems and control, signal processing, communication, and mathematical statistics.
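
To make the "root-seeking problem" concrete, here is a minimal sketch of the classic Robbins-Monro recursion mentioned above: find the root of a function when only noisy evaluations are available. The target function, noise level, and step sizes are made up for illustration, and this is the plain RM scheme, not the expanding-truncation variant (SAAWET) that the book develops.

    # Minimal Robbins-Monro stochastic approximation: seek the root of
    # f(x) = x - 2 when only noisy evaluations of f can be observed.
    import numpy as np

    rng = np.random.default_rng(0)

    def noisy_f(x):
        return (x - 2.0) + rng.normal(scale=0.5)   # unknown root at x = 2

    x = 10.0
    for k in range(1, 5001):
        x = x - (1.0 / k) * noisy_f(x)             # step sizes a_k = 1/k

    print("estimated root:", round(x, 3))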

The Mystery of Market Movements: An Archetypal Approach to Investment Forecasting and Modelling

A quantifiable framework for unlocking the unconscious forces that shape markets. There has long been a notion that subliminal forces play a great part in causing the seemingly irrational financial bubbles which conventional economic theory, again and again, fails to explain. However, these forces, sometimes labeled 'animal spirits' or 'irrational exuberance', have remained elusive - until now. The Mystery of Market Movements provides you with a methodology for timely prediction of, and profit from, changes in human investment behaviour based on the workings of the collective unconscious. Niklas Hageback draws on one of psychology's most influential ideas, archetypes, to explain how they form investors' perceptions and can be predicted and turned into profit. The Mystery of Market Movements provides:
- A review of the collective unconscious and its archetypes based on Carl Jung's theories, and empirical case studies that highlight and assess the influences of the collective unconscious on financial bubbles and zeitgeists
- For the first time, a way to objectively measure the impact of archetypal forces on human thoughts and behaviour, with a view to providing early warning signals on major turns in the markets. This is done through a step-by-step guide on how to develop a measurement methodology based on an analysis of the language of the unconscious: figurative speech such as metaphors and symbolism, drawn out and deciphered from Big Data sources, allowing for quantification into time series
- An online resource that supplements the book, presenting continuously updated bespoke archetypal indexes with predictive capabilities for major financial indexes
Investors are often unaware of the real reasons behind their own financial decisions. This book explains why psychological drivers in the collective unconscious dictate not only investment behaviour but also political, cultural and social trends. Understanding these forces allows you to stay ahead of the curve and profit from market tendencies that more traditional methods completely overlook.

Bayesian Networks

Understand the Foundations of Bayesian Networks—Core Properties and Definitions Explained Bayesian Networks: With Examples in R introduces Bayesian networks using a hands-on approach. Simple yet meaningful examples in R illustrate each step of the modeling process. The examples start from the simplest notions and gradually increase in complexity. The authors also distinguish the probabilistic models from their estimation with data sets. The first three chapters explain the whole process of Bayesian network modeling, from structure learning to parameter learning to inference. These chapters cover discrete Bayesian, Gaussian Bayesian, and hybrid networks, including arbitrary random variables. The book then gives a concise but rigorous treatment of the fundamentals of Bayesian networks and offers an introduction to causal Bayesian networks. It also presents an overview of R and other software packages appropriate for Bayesian networks. The final chapter evaluates two real-world examples: a landmark causal protein signaling network paper and graphical modeling approaches for predicting the composition of different body parts. Suitable for graduate students and non-statisticians, this text provides an introductory overview of Bayesian networks. It gives readers a clear, practical understanding of the general approach and steps involved.
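
The book's examples are in R; purely to illustrate what "inference" means in a discrete Bayesian network, here is a tiny Python sketch that queries a three-node network by brute-force enumeration of the joint distribution. The network structure (Rain and Sprinkler as parents of WetGrass) and all probability tables are made-up textbook-style values, not an example from the book.

    # Exact inference by enumeration in a tiny discrete Bayesian network with
    # made-up conditional probability tables: Rain -> WetGrass <- Sprinkler.
    from itertools import product

    p_rain = {1: 0.2, 0: 0.8}
    p_sprinkler = {1: 0.4, 0: 0.6}
    p_wet_given = {                 # P(WetGrass = 1 | Rain, Sprinkler)
        (1, 1): 0.99, (1, 0): 0.80, (0, 1): 0.90, (0, 0): 0.05,
    }

    def joint(r, s, w):
        pw1 = p_wet_given[(r, s)]
        return p_rain[r] * p_sprinkler[s] * (pw1 if w == 1 else 1 - pw1)

    # P(Rain = 1 | WetGrass = 1): sum the joint over the unobserved node.
    num = sum(joint(1, s, 1) for s in (0, 1))
    den = sum(joint(r, s, 1) for r, s in product((0, 1), repeat=2))
    print("P(Rain = 1 | WetGrass = 1) =", round(num / den, 3))

Real networks quickly become too large for enumeration, which is why the structure-learning, parameter-learning, and approximate-inference machinery the book covers matters in practice.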

Fundamentals of Applied Probability and Random Processes, 2nd Edition

The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. The second edition:
- Demonstrates concepts with more than 100 illustrations, including two dozen new drawings
- Expands readers' understanding of descriptive statistics in a new chapter (chapter 8)
- Provides a new chapter, Introduction to Random Processes, with 14 new illustrations and tables explaining key concepts
- Includes two chapters devoted to the two branches of statistics, namely descriptive statistics (chapter 8) and inferential (or inductive) statistics (chapter 9)

Methods and Applications of Statistics in Clinical Trials, Volume 2: Planning, Analysis, and Inferential Methods

Methods and Applications of Statistics in Clinical Trials, Volume 2: Planning, Analysis, and Inferential Methods includes updates of established literature from the Wiley Encyclopedia of Clinical Trials as well as original material based on the latest developments in clinical trials. Prepared by a leading expert, the second volume includes numerous contributions from current prominent experts in the field of medical research. In addition, the volume features:
- Multiple new articles exploring emerging topics, such as evaluation methods with threshold, empirical likelihood methods, nonparametric ROC analysis, over- and under-dispersed models, and multi-armed bandit problems
- Up-to-date research on the Cox proportional hazard model, frailty models, trial reports, intrarater reliability, conditional power, and the kappa index
- Key qualitative issues, including cost-effectiveness analysis, publication bias, and regulatory issues, which are crucial to the planning and data management of clinical trials

Introduction to Imprecise Probabilities

In recent years, the theory of imprecise probabilities has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including theory and applications reflecting the current state of the art. Each chapter is written by experts on the respective topics, including: sets of desirable gambles; coherent lower (conditional) previsions; special cases and links to literature; decision making; graphical models; classification; reliability and risk assessment; statistical inference; structural judgments; aspects of implementation (including elicitation and computation); models in finance; game-theoretic probability; stochastic processes (including Markov chains); and engineering applications. Essential reading for researchers in academia, research institutes and other organizations, as well as practitioners engaged in areas such as risk analysis and engineering.

Statistical Applications for Environmental Analysis and Risk Assessment

Statistical Applications for Environmental Analysis and Risk Assessment guides readers through real-world situations and the best statistical methods used to determine the nature and extent of the problem, evaluate the potential human health and ecological risks, and design and implement remedial systems as necessary. Featuring numerous worked examples using actual data and "ready-made" software scripts, Statistical Applications for Environmental Analysis and Risk Assessment also includes:
- Descriptions of basic statistical concepts and principles in an informal style that does not presume prior familiarity with the subject
- Detailed illustrations of statistical applications in the environmental and related water resources fields, using real-world data in the contexts that would typically be encountered by practitioners
- Software scripts using the high-powered statistical software system R, supplemented by USEPA's ProUCL and USDOE's VSP software packages, which are all freely available
- Coverage of frequent data sample issues, such as non-detects, outliers, skewness, and sustained and cyclical trends, that habitually plague environmental data samples
- Clear demonstrations of the crucial, but often overlooked, role of statistics in environmental sampling design and subsequent exposure risk assessment
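
As a loose illustration of one routine task in this area, and not a script from the book, the Python sketch below computes a bootstrap 95% upper confidence limit (UCL) on the mean of a small, skewed concentration sample. The values are made up, and non-detects are crudely substituted at half the detection limit purely for illustration; tools such as ProUCL implement more defensible treatments of censored data.

    # Loose illustration: bootstrap 95% UCL on the mean of a skewed sample, with
    # a simplistic half-detection-limit substitution for non-detects.
    import numpy as np

    rng = np.random.default_rng(0)
    detected = np.array([1.2, 0.8, 3.5, 0.6, 7.9, 2.2, 1.1, 0.9, 4.3])
    nondetect_limits = np.array([0.5, 0.5, 1.0])          # reported as "< limit"
    sample = np.concatenate([detected, nondetect_limits / 2])

    boot_means = np.array([
        rng.choice(sample, size=sample.size, replace=True).mean()
        for _ in range(5000)
    ])
    print("sample mean =", round(sample.mean(), 3))
    print("bootstrap 95% UCL =", round(np.percentile(boot_means, 95), 3))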

Nonparametric Statistics: A Step-by-Step Approach, 2nd Edition

"...a very useful resource for courses in nonparametric statistics in which the emphasis is on applications rather than on theory. It also deserves a place in libraries of all institutions where introductory statistics courses are taught." -CHOICE This Second Edition presents a practical and understandable approach that enhances and expands the statistical toolset for readers. This book includes: New coverage of the sign test and the Kolmogorov-Smirnov two-sample test in an effort to offer a logical and natural progression to statistical power SPSS (Version 21) software and updated screen captures to demonstrate how to perform and recognize the steps in the various procedures Data sets and odd-numbered solutions provided in an appendix, and tables of critical values Supplementary material to aid in reader comprehension, which includes: narrated videos and screen animations with step-by-step instructions on how to follow the tests using SPSS; online decision trees to help users determine the needed type of statistical test; and additional solutions not found within the book.