talk-data.com

Topic: statistics (512 tagged activities)

Activity trend: 2020-Q1 to 2026-Q1

Activities: 512 activities · Newest first

Design and Analysis of Experiments, 9th Edition

Design and Analysis of Experiments, 9th Edition continues to help senior and graduate students in engineering, business, and statistics, as well as working practitioners, design and analyze experiments for improving the quality, efficiency, and performance of working systems. This bestselling text maintains its comprehensive coverage by including: new examples, exercises, and problems (including in the areas of biochemistry and biotechnology); new topics and problems in response surface methodology; new topics in nested and split-plot design; and an emphasis on the residual maximum likelihood method throughout the book.

Practical Statistics for Data Scientists

Statistical methods are a key part of data science, yet very few data scientists have any formal statistics training. Courses and books on basic statistics rarely cover the topic from a data science perspective. This practical guide explains how to apply various statistical methods to data science, tells you how to avoid their misuse, and gives you advice on what's important and what's not. Many data science resources incorporate statistical methods but lack a deeper statistical perspective. If you’re familiar with the R programming language and have some exposure to statistics, this quick reference bridges the gap in an accessible, readable format. With this book, you’ll learn:
• Why exploratory data analysis is a key preliminary step in data science
• How random sampling can reduce bias and yield a higher-quality dataset, even with big data
• How the principles of experimental design yield definitive answers to questions
• How to use regression to estimate outcomes and detect anomalies
• Key classification techniques for predicting which categories a record belongs to
• Statistical machine learning methods that “learn” from data
• Unsupervised learning methods for extracting meaning from unlabeled data
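
The book works its examples in R; as a hedged Python sketch of one topic the description names, the snippet below fits a simple regression and flags records with unusually large residuals as potential anomalies. The synthetic data, the 3-standard-deviation cutoff, and the use of numpy's polyfit are illustrative assumptions, not material from the book.

```python
# Sketch: regression used to detect anomalies via large residuals (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 3.0 * x + 5.0 + rng.normal(0, 1.0, 200)
y[:3] += 15                                   # inject a few artificial anomalies

slope, intercept = np.polyfit(x, y, 1)        # ordinary least-squares line
residuals = y - (slope * x + intercept)
threshold = 3 * residuals.std()               # assumed cutoff: 3 standard deviations
anomalies = np.flatnonzero(np.abs(residuals) > threshold)
print(f"fit: y = {slope:.2f}x + {intercept:.2f}; flagged records: {anomalies}")
```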

SPSS Statistics for Data Analysis and Visualization

Dive deeper into SPSS Statistics for more efficient, accurate, and sophisticated data analysis and visualization. SPSS Statistics for Data Analysis and Visualization goes beyond the basics of SPSS Statistics to show you advanced techniques that exploit the full capabilities of SPSS. The authors explain when and why to use each technique, and then walk you through the execution with a pragmatic, nuts-and-bolts example. Coverage includes extensive, in-depth discussion of advanced statistical techniques, data visualization, predictive analytics, and SPSS programming, including automation and integration with other languages like R and Python. You'll learn the best methods to power through an analysis, with more efficient, elegant, and accurate code. IBM SPSS Statistics is complex: true mastery requires a deep understanding of statistical theory, the user interface, and programming. Most users don't encounter all of the methods SPSS offers, leaving many little-known modules undiscovered. This book walks you through tools you may have never noticed, and shows you how they can be used to streamline your workflow and enable you to produce more accurate results.
• Conduct a more efficient and accurate analysis
• Display complex relationships and create better visualizations
• Model complex interactions and master predictive analytics
• Integrate R and Python with SPSS Statistics for more efficient, more powerful code
These "hidden tools" can help you produce charts that simply wouldn't be possible any other way, and the support for other programming languages gives you better options for solving complex problems. If you're ready to take advantage of everything this powerful software package has to offer, SPSS Statistics for Data Analysis and Visualization is the expert-led training you need.

Budgeting, Forecasting and Planning In Uncertain Times

Budgeting, planning and forecasting are critical management tasks that not only impact the future success of an organization, but can threaten its very survival if done badly. Yet in spite of their importance, the speed and complexity of today’s business environment have caused a rapid decrease in the planning time horizon. As a consequence, traditional planning processes have become unsuitable for most organizations’ needs. In this book readers will find new, original insights, including:
• 7 planning models that every organization needs to plan and manage performance
• 6 ways in which performance can be viewed
• A planning framework based on best management practices that can cope with an unpredictable business environment
• The application of technology to planning and the latest developments in systems
• Results of the survey conducted for the book on the state of planning in organizations

Theory of Probability

First issued in translation as a two-volume work in 1975, this classic book provides the first complete development of the theory of probability from a subjectivist viewpoint. It proceeds from a detailed discussion of the philosophical and mathematical aspects to a detailed mathematical treatment of probability and statistics. De Finetti’s theory of probability is one of the foundations of Bayesian theory. De Finetti held that probability is nothing but a subjective analysis of the likelihood that something will happen, and that probability does not exist outside the mind: it is the rate at which a person is willing to bet on something happening. This view is directly opposed to the classicist/frequentist view of the likelihood of a particular outcome of an event, which assumes that the same event could be identically repeated many times over, and that the 'probability' of a particular outcome has to do with the fraction of the time that outcome results from the repeated trials.

Statistical Intervals, 2nd Edition

Describes statistical intervals to quantify sampling uncertainty, focusing on key application needs and recently developed methodology in an easy-to-apply format. Statistical intervals provide invaluable tools for quantifying sampling uncertainty. The widely hailed first edition, published in 1991, described the use and construction of the most important statistical intervals. Particular emphasis was given to intervals—such as prediction intervals, tolerance intervals and confidence intervals on distribution quantiles—frequently needed in practice, but often neglected in introductory courses. Vastly improved computer capabilities over the past 25 years have resulted in an explosion of the tools readily available to analysts. This second edition—more than double the size of the first—adds these new methods in an easy-to-apply format. In addition to extensive updating of the original chapters, the second edition includes new chapters on:
• Likelihood-based statistical intervals
• Nonparametric bootstrap intervals
• Parametric bootstrap and other simulation-based intervals
• An introduction to Bayesian intervals
• Bayesian intervals for the popular binomial, Poisson and normal distributions
• Statistical intervals for Bayesian hierarchical models
• Advanced case studies, further illustrating the use of the newly described methods
New technical appendices provide justification of the methods and pathways to extensions and further applications. A webpage directs readers to current readily accessible computer software and other useful information. Statistical Intervals: A Guide for Practitioners and Researchers, Second Edition is an up-to-date working guide and reference for all who analyze data, allowing them to quantify the uncertainty in their results using statistical intervals. William Q. Meeker is Professor of Statistics and Distinguished Professor of Liberal Arts and Sciences at Iowa State University. He is co-author of Statistical Methods for Reliability Data (Wiley, 1998) and of numerous publications in the engineering and statistical literature and has won many awards for his research. Gerald J. Hahn served for 46 years as applied statistician and manager of an 18-person statistics group supporting General Electric and has co-authored four books. His accomplishments have been recognized by GE’s prestigious Coolidge Fellowship and 19 professional society awards. Luis A. Escobar is Professor of Statistics at Louisiana State University. He is co-author of Statistical Methods for Reliability Data (Wiley, 1998) and several book chapters. His publications have appeared in the engineering and statistical literature and he has won several research and teaching awards.
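
As a quick illustration of why the distinction between interval types matters, here is a hedged Python sketch (not the authors' code or software) contrasting a t-based confidence interval for a mean with a prediction interval for a single future observation, assuming a normal sample; the data are invented.

```python
# Sketch: 95% confidence interval for the mean vs. prediction interval for a new value.
import numpy as np
from scipy import stats

x = np.array([9.8, 10.2, 10.1, 9.7, 10.4, 10.0, 9.9, 10.3])   # hypothetical measurements
n, mean, s = len(x), x.mean(), x.std(ddof=1)
t = stats.t.ppf(0.975, df=n - 1)                               # two-sided 95% quantile

ci = (mean - t * s / np.sqrt(n), mean + t * s / np.sqrt(n))    # interval for the mean
pi = (mean - t * s * np.sqrt(1 + 1 / n),
      mean + t * s * np.sqrt(1 + 1 / n))                       # interval for one new observation
print(f"95% confidence interval for the mean: ({ci[0]:.2f}, {ci[1]:.2f})")
print(f"95% prediction interval for a new value: ({pi[0]:.2f}, {pi[1]:.2f})")
```

Note that the prediction interval is wider, since it must cover the variability of a single future value rather than only the uncertainty in the mean.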

Translating Statistics to Make Decisions: A Guide for the Non-Statistician

Examine and solve the common misconceptions and fallacies that non-statisticians bring to their interpretation of statistical results. Explore the many pitfalls that non-statisticians—and also statisticians who present statistical reports to non-statisticians—must avoid if statistical results are to be correctly used for evidence-based business decision making. Victoria Cox, senior statistician at the United Kingdom's Defence Science and Technology Laboratory (Dstl), distills the lessons of her long experience presenting the actionable results of complex statistical studies to users of widely varying statistical sophistication across many disciplines: from scientists, engineers, analysts, and information technologists to executives, military personnel, project managers, and officials across UK government departments, industry, academia, and international partners. The author shows how faulty statistical reasoning often undermines the utility of statistical results even among those with advanced technical training. Translating Statistics teaches statistically naive readers enough about statistical questions, methods, models, assumptions, and statements that they will be able to extract the practical message from statistical reports and better constrain what conclusions cannot be made from the results. To non-statisticians with some statistical training, this book offers brush-ups, reminders, and tips for the proper use of statistics and solutions to common errors. To fellow statisticians, the author demonstrates how to present statistical output to non-statisticians to ensure that the statistical results are correctly understood and properly applied to real-world tasks and decisions. The book avoids algebra and proofs, but it does supply code written in R for those readers who are motivated to work out examples. Pointing along the way to instructive examples of statistics gone awry, Translating Statistics walks readers through the typical course of a statistical study, progressing from the experimental design stage through the data collection process, exploratory data analysis, descriptive statistics, uncertainty, hypothesis testing, statistical modelling and multivariate methods, to graphs suitable for final presentation. The steady focus throughout the book is on how to turn the mathematical artefacts and specialist jargon that are second nature to statisticians into plain English for corporate customers and stakeholders. The final chapter neatly summarizes the book's lessons and insights for accurately communicating statistical reports to the non-statisticians who commission and act on them. What You'll Learn:
• Recognize and avoid common errors and misconceptions that cause statistical studies to be misinterpreted and misused by non-statisticians in organizational settings
• Gain a practical understanding of the methods, processes, capabilities, and caveats of statistical studies to improve the application of statistical data to business decisions
• See how to code statistical solutions in R
Who This Book Is For: Non-statisticians—including both those with and without an introductory statistics course under their belts—who consume statistical reports in organizational settings, and statisticians who seek guidance for reporting statistical studies to non-statisticians in ways that will be accurately understood and will inform sound business and technical decisions.

Illuminating Statistical Analysis Using Scenarios and Simulations

Features an integrated approach of statistical scenarios and simulations to aid readers in developing key intuitions needed to understand the wide-ranging concepts and methods of statistics and inference. Illuminating Statistical Analysis Using Scenarios and Simulations presents the basic concepts of statistics and statistical inference using the dual mechanisms of scenarios and simulations. This approach helps readers develop key intuitions and deep understandings of statistical analysis. Scenario-specific sampling simulations depict the results that would be obtained by a very large number of individuals investigating the same scenario, each with their own evidence, while graphical depictions of the simulation results present clear and direct pathways to intuitive methods for statistical inference. These intuitive methods can then be easily linked to traditional formulaic methods, and the author does not simply explain the linkages, but rather provides demonstrations throughout for a broad range of statistical phenomena. In addition, induction and deduction are repeatedly interwoven, which fosters a natural "need to know" basis for ordering the topic coverage. Examining computer simulation results is central to the discussion and provides an illustrative way to (re)discover the properties of sample statistics, the role of chance, and to (re)invent corresponding principles of statistical inference. The simulation results also foreshadow the various mathematical formulas that underlie statistical analysis. In addition, this book:
• Features both an intuitive and analytical perspective and includes a broad introduction to the use of Monte Carlo simulation and formulaic methods for statistical analysis
• Presents straightforward coverage of the essentials of basic statistics and ensures proper understanding of key concepts such as sampling distributions, the effects of sample size and variance on uncertainty, analysis of proportion, mean and rank differences, covariance, correlation, and regression
• Introduces advanced topics such as Bayesian statistics, data mining, model cross-validation, robust regression, and resampling
• Contains numerous example problems in each chapter with detailed solutions as well as an appendix that serves as a manual for constructing simulations quickly and easily using Microsoft® Office Excel®
Illuminating Statistical Analysis Using Scenarios and Simulations is an ideal textbook for courses, seminars, and workshops in statistics and statistical inference and is appropriate for self-study as well. The book also serves as a thought-provoking treatise for researchers, scientists, managers, technicians, and others with a keen interest in statistical analysis. Jeffrey E. Kottemann, Ph.D., is Professor in the Perdue School at Salisbury University. Dr. Kottemann has published articles in a wide variety of academic research journals in the fields of business administration, computer science, decision sciences, economics, engineering, information systems, psychology, and public administration. He received his Ph.D. in Systems and Quantitative Methods from the University of Arizona.
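
The book builds its simulations in Microsoft Excel; the hedged Python sketch below mirrors the scenario idea on made-up numbers: many investigators each draw their own sample and compute a proportion, and the spread of those proportions traces out the sampling distribution.

```python
# Sketch: simulating the sampling distribution of a proportion (assumed values).
import numpy as np

rng = np.random.default_rng(42)
true_p, n, investigators = 0.30, 100, 10_000
proportions = rng.binomial(n, true_p, size=investigators) / n   # one sample proportion per investigator

print(f"mean of simulated proportions: {proportions.mean():.3f}")
print(f"simulated standard error:      {proportions.std(ddof=1):.4f}")
print(f"theoretical standard error:    {np.sqrt(true_p * (1 - true_p) / n):.4f}")
```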

Statistical Techniques for Transportation Engineering

Statistical Techniques for Transportation Engineering is written with a systematic approach in mind and covers a full range of data analysis topics, from the introductory level (basic probability, measures of dispersion, random variables, discrete and continuous distributions) through more generally used techniques (common statistical distributions, hypothesis testing), to advanced analysis and statistical modeling techniques (regression, ANOVA, and time series). The book also provides worked-out examples and solved problems for a wide variety of transportation engineering challenges.
• Demonstrates how to effectively interpret, summarize, and report transportation data using appropriate statistical descriptors
• Teaches how to identify and apply appropriate analysis methods for transportation data
• Explains how to evaluate transportation proposals and schemes with statistical rigor
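
As a minimal, hedged illustration of one technique listed above (not an example from the book), the sketch below runs a one-way ANOVA comparing mean travel times, in minutes, across three hypothetical routes.

```python
# Sketch: one-way ANOVA on invented travel-time samples for three routes.
from scipy import stats

route_a = [22, 25, 24, 23, 26, 24]
route_b = [28, 27, 29, 30, 28, 27]
route_c = [23, 24, 22, 25, 23, 24]

f_stat, p_value = stats.f_oneway(route_a, route_b, route_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")   # a small p-value suggests mean travel times differ
```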

Total Survey Error in Practice

Featuring a timely presentation of total survey error (TSE), this edited volume introduces valuable tools for understanding and improving survey data quality in the context of evolving large-scale data sets. This book provides an overview of the TSE framework and current TSE research as related to survey design, data collection, estimation, and analysis. It recognizes that survey data affects many public policy and business decisions and thus focuses on the framework for understanding and improving survey data quality. The book also addresses issues with data quality in official statistics and in social, opinion, and market research as these fields continue to evolve, leading to larger and messier data sets. This perspective challenges survey organizations to find ways to collect and process data more efficiently without sacrificing quality. The volume consists of the most up-to-date research and reporting from over 70 contributors representing the best academics and researchers from a range of fields. The chapters are broken out into five main sections: The Concept of TSE and the TSE Paradigm, Implications for Survey Design, Data Collection and Data Processing Applications, Evaluation and Improvement, and Estimation and Analysis. Each chapter introduces and examines multiple error sources, such as sampling error, measurement error, and nonresponse error, which often offer the greatest risks to data quality, while also encouraging readers not to lose sight of the less commonly studied error sources, such as coverage error, processing error, and specification error. The book also notes the relationships between errors and the ways in which efforts to reduce one type can increase another, resulting in an estimate with larger total error. This book:
• Features various error sources, and the complex relationships between them, in 25 high-quality chapters on the most up-to-date research in the field of TSE
• Provides comprehensive reviews of the literature on error sources as well as data collection approaches and estimation methods to reduce their effects
• Presents examples of recent international events that demonstrate the effects of data error, the importance of survey data quality, and the real-world issues that arise from these errors
• Spans the four pillars of the total survey error paradigm (design, data collection, evaluation and analysis) to address key data quality issues in official statistics and survey research
Total Survey Error in Practice is a reference for survey researchers and data scientists in research areas that include social science, public opinion, public policy, and business. It can also be used as a textbook or supplementary material for a graduate-level course in survey research methods. Paul P. Biemer, PhD, is distinguished fellow at RTI International and associate director of Survey Research and Development at the Odum Institute, University of North Carolina, USA. Edith de Leeuw, PhD, is professor of survey methodology in the Department of Methodology and Statistics at Utrecht University, the Netherlands. Stephanie Eckman, PhD, is fellow at RTI International, USA. Brad Edwards is vice president, director of Field Services, and deputy area director at Westat, USA. Frauke Kreuter, PhD, is professor and director of the Joint Program in Survey Methodology, University of Maryland, USA; professor of statistics and methodology at the University of Mannheim, Germany; and head of the Statistical Methods Research Department at the Institute for Employment Research, Germany. Lars E. Lyberg, PhD, is senior advisor at Inizio, Sweden. N. Clyde Tucker, PhD, is principal survey methodologist at the American Institutes for Research, USA. Brady T. West, PhD, is research associate professor in the Survey Research Center at the University of Michigan, USA.

A Panorama of Statistics

A Panorama of Statistics: Perspectives, Puzzles and Paradoxes in Statistics. Eric Sowey, School of Economics, The University of New South Wales, Sydney, Australia; Peter Petocz, Department of Statistics, Macquarie University, Sydney, Australia. This book is a stimulating panoramic tour – quite different from a textbook journey – of the world of statistics in both its theory and practice, for teachers, students and practitioners. At each stop on the tour, the authors investigate unusual and quirky aspects of statistics, highlighting historical, biographical and philosophical dimensions of this field of knowledge. Each chapter opens with perspectives on its theme, often from several points of view. Five original and thought-provoking questions follow. These aim at widening readers’ knowledge and deepening their insight. Scattered among the questions are entertaining puzzles to solve and tantalising paradoxes to explain. Readers can compare their own statistical discoveries with the authors’ detailed answers to all the questions. The writing is lively and inviting, the ideas are rewarding, and the material is extensively cross-referenced. A Panorama of Statistics:
• Leads readers to discover the fascinations of statistics.
• Is an enjoyable companion to an undergraduate statistics textbook.
• Is an enriching source of knowledge for statistics teachers and practitioners.
• Is unique among statistics books today for its memorable content and engaging style.
Lending itself equally to reading through and to dipping into, A Panorama of Statistics will surprise teachers, students and practitioners by the variety of ways in which statistics can capture and hold their interest.

Statistics for Business: Decision Making and Analysis, 3rd Edition

For one- and two-semester courses in introductory business statistics. Understand Business. Understand Data. The 3rd Edition of Statistics for Business: Decision Making and Analysis emphasizes an application-based approach, in which readers learn how to work with data to make decisions. In this contemporary presentation of business statistics, readers learn how to approach business decisions through a 4M Analytics decision-making strategy—motivation, method, mechanics and message—to better understand how a business context motivates the statistical process and how the results inform a course of action. Each chapter includes hints on using Excel, Minitab Express, and JMP for calculations, pointing the reader in the right direction to get started with analysis of data. Also available with MyLab Statistics. MyLab™ Statistics from Pearson is the world’s leading online resource for teaching and learning statistics; it integrates interactive homework, assessment, and media in a flexible, easy-to-use format. MyLab Statistics is a course management system that helps individual students succeed. It provides engaging experiences that personalize, stimulate, and measure learning for each student. Tools are embedded to make it easy to integrate statistical software into the course. Note: You are purchasing a standalone product; MyLab™ does not come packaged with this content. Students, if interested in purchasing this title with MyLab, ask your instructor for the correct package ISBN and Course ID. Instructors, contact your Pearson representative for more information. If you would like to purchase both the physical text and MyLab, search for: 0134763734 / 9780134763736 Statistics for Business: Decision Making and Analysis, Student Value Edition Plus MyLab Statistics with Pearson eText - Access Card Package, 3/e. Package consists of:
• 0134497260 / 9780134497266 Statistics for Business: Decision Making and Analysis, Student Value Edition
• 0134748646 / 9780134748641 MyLab Statistics for Business Stats with Pearson eText - Standalone Access Card - for Statistics for Business: Decision Making and Analysis

Style and Statistics

A non-technical guide to leveraging retail analytics for personal and competitive advantage. Style & Statistics is a real-world guide to analytics in retail. Written specifically for the non-IT crowd, this book explains analytics in an approachable, understandable way, and provides examples of direct application to retail merchandise management, marketing, and operations. The discussion covers current industry trends and emerging-standard processes, and illustrates how analytics is providing new solutions to perennial retail problems. You'll learn how to leverage the benefits of analytics to boost your personal career, and how to interpret data in a way that's useful to the average end business user or shopper. Key concepts are detailed in easy-to-understand language, and numerous examples highlight the growing importance of understanding analytics in the retail environment. The power of analytics has become apparent across industries, but it's left an especially indelible mark on retail. It's a complex topic, but you don't need to be a data scientist to take advantage of the opportunities it brings. This book shows you what you need to know, and how to put analytics to work with retail-specific applications.
• Learn how analytics can help you be better at your job
• Dig deeper into the customer's needs, wants, and dreams
• Streamline merchandise management, pricing, marketing, and more
• Find solutions for inefficiencies and inaccuracies
As the retail customer evolves, so must the retail industry. The retail landscape not only includes in-store shopping but also websites, mobile sites, mobile apps, and social media. With more and more competition emerging on all sides, retailers need to use every tool at their disposal to create value and gain a competitive advantage. Analytics offers a number of ways to make your company stand out, whether it's through improved operations, customer experience, or any of the other myriad factors that build a great place to shop. Style & Statistics provides an analytics primer with a practical bent, specifically for the retail industry.

Forecasting Fundamentals

This book is for everyone who wants to make better forecasts. It is not about mathematics and statistics. It is about following a well-established forecasting process to create and implement good forecasts. This is true whether you are forecasting global markets, sales of SKUs, competitive strategy, or market disruptions. Today, most forecasts are generated using software. However, no amount of technology and statistics can compensate for a poor forecasting process. Forecasting is not just about generating a number. Forecasters need to understand the problems they are trying to solve. They also need to follow a process that is justifiable to other parties and that can be implemented in practice. This is what the book is about. Accurate forecasts are essential for predicting demand, identifying new market opportunities, and anticipating risks, disruptions, innovation, competition, market growth, and trends. Companies can navigate this daunting landscape and improve their forecasts by following some well-established principles. This book is written to provide the fundamentals business leaders need in order to make good forecasts. These fundamentals hold true regardless of what is being forecast and what technology is being used. It provides the basic foundational principles all companies need to achieve competitive forecast accuracy.

Predictive Analytics For Dummies, 2nd Edition

• Real-world tips for creating business value
• Details on modeling, data clustering, and more
• Enterprise use cases to help you get started
Learn to predict the future! Business today relies on effectively using data to predict trends and sales. Predictive analytics is the tool that can make it happen, and this book eliminates the tricks and shows you how to use it. You'll learn to prepare and process your data, create goals, build a predictive model, get your organization's stakeholders on board, and more. Inside...
• How to start a project
• Identifying data types
• Modeling tips
• Working with algorithms
• How data clustering works
• How data classification works
• How deep learning works
• Advice on presentations
• Step-by-step predictive modeling

Delayed and Network Queues

Presents an introduction to differential equations, probability, and stochastic processes with real-world applications of queues with delay and delayed network queues. Featuring recent advances in queueing theory and modeling, Delayed and Network Queues provides the most up-to-date theories in queueing model applications. Balancing both theoretical and practical applications of queueing theory, the book introduces queueing network models as tools to assist in answering questions on cost and performance that arise throughout the life of a computer system and in signal processing. Written by well-known researchers in the field, the book presents key information for understanding the essential aspects of queues with delay and networks of queues with unreliable nodes and vacationing servers. Beginning with simple analytical fundamentals, the book contains a selection of realistic and advanced queueing models that address current deficiencies. In addition, the book presents the treatment of queues with delay and networks of queues, including possible breakdowns and disruptions that may cause delay. Delayed and Network Queues also features:
• Numerous examples and exercises with applications in various fields of study such as mathematical sciences, biomathematics, engineering, physics, business, health industry, and economics
• A wide array of practical applications of network queues and queueing systems, all of which are related to the appropriate stochastic processes
• Up-to-date topical coverage such as single- and multiserver queues with and without delays, along with the necessary fundamental coverage of probability and difference equations
• Discussions on queueing models such as single- and multiserver Markovian queues with balking, reneging, delay, feedback, splitting, and blocking, as well as their role in the treatment of networks of queues with and without delay and network reliability
Delayed and Network Queues is an excellent textbook for upper-undergraduate and graduate-level courses in applied mathematics, queueing theory, queueing systems, probability, and stochastic processes. The book is also an ideal reference for academics and practitioners in mathematical sciences, biomathematics, operations research, management, engineering, physics, business, economics, health industry, and industrial engineering. Aliakbar Montazer Haghighi, PhD, is Professor and Head of the Department of Mathematics at Prairie View A&M University, USA, as well as founding Editor-in-Chief of Applications and Applied Mathematics: An International Journal (AAM). His research interests include probability, statistics, stochastic processes, and queueing theory. Among his research publications and books, Dr. Haghighi is the coauthor of Difference and Differential Equations with Applications in Queueing Theory (Wiley, 2013). Dimitar P. Mishev, PhD, is Professor in the Department of Mathematics at Prairie View A&M University, USA. His research interests include differential and difference equations and queueing theory. The author of numerous research papers and three books, Dr. Mishev is the coauthor of Difference and Differential Equations with Applications in Queueing Theory (Wiley, 2013).
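
For readers new to the area, a far simpler model than those treated in the book shows the kind of question queueing theory answers. The hedged Python sketch below computes standard steady-state metrics for a plain M/M/1 queue (no delays, breakdowns, or networks); the arrival and service rates are assumed values.

```python
# Sketch: textbook M/M/1 steady-state performance metrics.
def mm1_metrics(lam: float, mu: float) -> dict:
    """Steady-state metrics for an M/M/1 queue; requires arrival rate lam < service rate mu."""
    assert lam < mu, "queue is unstable unless lam < mu"
    rho = lam / mu                                   # server utilization
    return {
        "utilization":                rho,
        "L (mean number in system)":  rho / (1 - rho),
        "Lq (mean number in queue)":  rho ** 2 / (1 - rho),
        "W (mean time in system)":    1 / (mu - lam),
        "Wq (mean wait in queue)":    rho / (mu - lam),
    }

print(mm1_metrics(lam=8.0, mu=10.0))                 # e.g. 8 arrivals/hour, 10 served/hour
```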

Statistical Shape Analysis, 2nd Edition

A thoroughly revised and updated edition of this introduction to modern statistical methods for shape analysis. Shape analysis is an important tool in the many disciplines where objects are compared using geometrical features. Examples include comparing brain shape in schizophrenia; investigating protein molecules in bioinformatics; and describing growth of organisms in biology. This book is a significant update of the highly regarded 'Statistical Shape Analysis' by the same authors. The new edition lays the foundations of landmark shape analysis, including geometrical concepts and statistical techniques, and extends to include analysis of curves, surfaces, images and other types of object data. Key definitions and concepts are discussed throughout, and the relative merits of different approaches are presented. The authors have included substantial new material on recent statistical developments and offer numerous examples throughout the text. Concepts are introduced in an accessible manner, while retaining sufficient detail for more specialist statisticians to appreciate the challenges and opportunities of this new field. Computer code has been included for instructional use, along with exercises to enable readers to implement the applications themselves in R and to follow the key ideas by hands-on analysis. Statistical Shape Analysis: with Applications in R will offer a valuable introduction to this fast-moving research area for statisticians and other applied scientists working in diverse areas, including archaeology, bioinformatics, biology, chemistry, computer science, medicine, morphometrics and image analysis.

A Primer on Nonparametric Analysis, Volume I

Nonparametric statistics provide a scientific methodology for cases where customary statistics are not applicable. Nonparametric statistics are used when the requirements for parametric analysis fail, such as when data are not normally distributed or the sample size is too small. The method provides an alternative for such cases and is often nearly as powerful as parametric statistics. Another advantage of nonparametric statistics is that they offer analytical methods that are not available otherwise. Nonparametric methods are intuitive and simple to comprehend, which helps researchers in the social sciences understand them even without the mathematical rigor needed for the analytical methods customarily used in science. This is a methodology book: it bypasses theoretical proofs while providing comprehensive explanations of the logic behind the methods and ample examples, which are all solved using direct computations as well as by using Stata. It is arranged into two integrated volumes. Although each volume, and for that matter each chapter, can be used separately, it is advisable to read as much of both volumes as possible, because familiarity with what is applicable for different problems will enhance capabilities.
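
The book's examples are solved by direct computation and in Stata; as a rough Python parallel (an assumption, not the book's own material), the sketch below applies the Mann-Whitney U test, a nonparametric alternative to the two-sample t-test, to small made-up samples that need not be normally distributed.

```python
# Sketch: Mann-Whitney U test on two small, invented samples.
from scipy import stats

group_1 = [3, 5, 4, 6, 8, 5, 4]
group_2 = [7, 9, 8, 10, 9, 11, 8]

u_stat, p_value = stats.mannwhitneyu(group_1, group_2, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")   # a small p-value suggests the two groups differ
```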

A Primer on Nonparametric Analysis, Volume II

Nonparametric statistics provide a scientific methodology for cases where customary statistics are not applicable. Nonparametric statistics are used when the requirements for parametric analysis fail, such as when data are not normally distributed or the sample size is too small. The method provides an alternative for such cases and is often nearly as powerful as parametric statistics. Another advantage of nonparametric statistics is that they offer analytical methods that are not available otherwise. Nonparametric methods are intuitive and simple to comprehend, which helps researchers in the social sciences understand them even without the mathematical rigor needed for the analytical methods customarily used in science. This is a methodology book: it bypasses theoretical proofs while providing comprehensive explanations of the logic behind the methods and ample examples, which are all solved using direct computations as well as by using Stata. It is arranged into two integrated volumes. Although each volume, and for that matter each chapter, can be used separately, it is advisable to read as much of both volumes as possible, because familiarity with what is applicable for different problems will enhance capabilities.

Demand Forecasting for Managers

Most decisions and plans in a firm require a forecast. Matching supply with demand can make or break any business, which is why forecasting is invaluable. Forecasting can appear to be a frightening topic with many arcane equations to master. For this reason, the authors start out from the very basics and provide a non-technical overview of common forecasting techniques as well as organizational aspects of creating a robust forecasting process. The book also discusses how to measure forecast accuracy to hold people accountable and guide continuous improvement. This book does not require prior knowledge of higher mathematics, statistics, or operations research. It is designed to serve as a first introduction for the non-expert, such as a manager overseeing a forecasting group, or an MBA student who needs to be familiar with the broad outlines of forecasting without specializing in it.
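
As a hedged sketch of the forecast-accuracy idea mentioned above (the figures and metric choices are illustrative assumptions, not the book's), the snippet below compares forecasts with actual demand using two common error measures, MAE and MAPE.

```python
# Sketch: measuring forecast accuracy with MAE and MAPE on invented demand data.
import numpy as np

actual   = np.array([120, 135, 150, 140, 160])   # observed demand per period
forecast = np.array([110, 140, 145, 150, 155])   # forecasts made for those periods

mae  = np.mean(np.abs(actual - forecast))                  # mean absolute error (units)
mape = np.mean(np.abs(actual - forecast) / actual) * 100   # mean absolute percentage error
print(f"MAE = {mae:.1f} units, MAPE = {mape:.1f}%")
```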