talk-data.com

Topic: statistics (505 tagged)

Activity Trend (chart): 2020-Q1 through 2026-Q1, peak of 1 activity per quarter

Activities

Showing filtered results. Filtering by: O'Reilly Data Science Books

Displaying Time Series, Spatial, and Space-Time Data with R

Code and Methods for Creating High-Quality Data Graphics. A data graphic is not only a static image; it also tells a story about the data. It activates cognitive processes that can detect patterns and discover information not readily available in the raw data. This is particularly true for time series, spatial, and space-time datasets.
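The book builds its graphics in R; purely as a rough, language-neutral illustration of the kind of time-series display it is concerned with, here is a minimal Python/matplotlib sketch. The synthetic monthly series and every value in it are invented for illustration and are not taken from the book.

# Minimal sketch of a time-series graphic; the data are synthetic, not from the book.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
t = np.arange("2015-01", "2020-01", dtype="datetime64[M]")      # monthly index
n = t.size
y = 10 + 0.05 * np.arange(n) \
    + 2 * np.sin(2 * np.pi * np.arange(n) / 12) \
    + rng.normal(0, 0.5, n)                                     # trend + season + noise

fig, ax = plt.subplots(figsize=(8, 3))
ax.plot(t, y, lw=1)
ax.set_xlabel("month")
ax.set_ylabel("value")
ax.set_title("Synthetic monthly series: trend, seasonality, noise")
fig.tight_layout()
plt.show()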

Statistical Analysis: Microsoft® Excel® 2013

Use Excel 2013’s statistical tools to transform your data into knowledge. Conrad Carlberg shows how to use Excel 2013 to perform core statistical tasks every business professional, student, and researcher should master. Using real-world examples, Carlberg helps you choose the right technique for each problem and get the most out of Excel’s statistical features, including recently introduced consistency functions. Along the way, he clarifies confusing statistical terminology and helps you avoid common mistakes. You’ll learn how to use correlation and regression, analyze variance and covariance, and test statistical hypotheses using the normal, binomial, t, and F distributions. To help you make accurate inferences based on samples from a population, this edition adds two more chapters on inferential statistics, covering crucial topics ranging from experimental design to the statistical power of F tests. Becoming an expert with Excel statistics has never been easier! You’ll find crystal-clear instructions, insider insights, and complete step-by-step projects, all complemented by extensive web-based resources.
- Master Excel’s most useful descriptive and inferential statistical tools
- Tell the truth with statistics, and recognize when others don’t
- Accurately summarize sets of values
- Infer a population’s characteristics from a sample’s frequency distribution
- Explore correlation and regression to learn how variables move in tandem
- Use Excel consistency functions such as STDEV.S() and STDEV.P()
- Test differences between two means using z tests, t tests, and Excel’s Data Analysis Add-in
- Use ANOVA to test differences between more than two means
- Explore statistical power by manipulating mean differences, standard errors, directionality, and alpha
- Take advantage of Recommended PivotTables, Quick Analysis, and other Excel 2013 shortcuts
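The "consistency functions" mentioned above are where Excel separates the sample standard deviation, STDEV.S(), from the population standard deviation, STDEV.P(). A minimal Python sketch of the same distinction (illustrative numbers, not from the book; Python's statistics.stdev uses the n-1 divisor, pstdev the n divisor):

# Sample vs. population standard deviation, the distinction behind STDEV.S() and STDEV.P().
import statistics

values = [12.0, 15.5, 9.8, 14.2, 11.1, 13.7]        # illustrative data

print("mean:", round(statistics.mean(values), 4))
print("sample std dev (like STDEV.S):", round(statistics.stdev(values), 4))       # n - 1 divisor
print("population std dev (like STDEV.P):", round(statistics.pstdev(values), 4))  # n divisor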

Economic and Business Forecasting: Analyzing and Interpreting Econometric Results

Discover the secrets to applying simple econometric techniques to improve forecasting. Equipping analysts, practitioners, and graduate students with a statistical framework to make effective decisions based on the application of simple economic and statistical methods, Economic and Business Forecasting offers a comprehensive and practical approach to quantifying and accurately forecasting key variables. Using simple econometric techniques, author John E. Silvia focuses on a select set of major economic and financial variables, revealing how to optimally use statistical software as a template to apply to your own variables of interest. The book:
- Presents the economic and financial variables that offer unique insights into economic performance
- Highlights the econometric techniques that can be used to characterize variables
- Explores the application of SAS software, complete with simple explanations of SAS code and output
- Identifies key econometric issues with practical solutions to those problems
Presenting the "ten commandments" for economic and business forecasting, this book provides you with a practical forecasting framework you can use for important everyday business applications.
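The book's worked examples are in SAS; purely as a hedged sketch of the sort of simple technique it refers to, here is a least-squares linear trend fitted and extrapolated in Python. The quarterly series is invented and stands in for whatever variable the analyst cares about.

# Fit a linear trend by ordinary least squares and extrapolate four periods ahead.
# Synthetic quarterly series; the book itself works through SAS.
import numpy as np

y = np.array([2.1, 2.4, 2.3, 2.8, 3.0, 3.1, 3.5, 3.6])    # quarterly observations
t = np.arange(y.size)

slope, intercept = np.polyfit(t, y, deg=1)                 # least-squares trend line
future_t = np.arange(y.size, y.size + 4)
forecast = intercept + slope * future_t

print("trend slope per quarter:", round(slope, 3))
print("4-quarter-ahead forecasts:", np.round(forecast, 2))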

Statistical Hypothesis Testing with SAS and R

A comprehensive guide to statistical hypothesis testing with examples in SAS and R. When analyzing datasets, the following questions often arise: Is there a shorthand procedure for a statistical test available in SAS or R? If so, how do I use it? If not, how do I program the test myself? This book answers these questions and provides an overview of the most common statistical test problems in a comprehensive way, making it easy to find and perform an appropriate statistical test. A general summary of statistical test theory is presented, along with a basic description for each test, including the necessary prerequisites, assumptions, the formal test problem and the test statistic. Examples in both SAS and R are provided, along with program code to perform the test, the resulting output and remarks explaining the necessary program parameters. Key features:
- Provides examples in both SAS and R for each test presented.
- Looks at the most common statistical tests, displayed in a clear and easy to follow way.
- Supported by a supplementary website http://www.d-taeger.de featuring example program code.
Academics, practitioners and SAS and R programmers will find this book a valuable resource. Students using SAS and R will also find it an excellent choice for reference and data analysis.
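The book's program code is in SAS and R; to show the shape of such a shorthand procedure in neutral terms, here is a two-sample Welch t test in Python with scipy. The data are invented for the sketch and are not taken from the book.

# Illustrative two-sample t test; the book's own examples are written in SAS and R.
from scipy import stats

group_a = [5.1, 4.9, 5.6, 5.2, 4.8, 5.4]      # invented measurements
group_b = [5.8, 6.1, 5.9, 6.3, 5.7, 6.0]

t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)   # Welch's t test
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: the group means differ at the 5% level.")
else:
    print("Fail to reject H0 at the 5% level.")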

Statistics in Action

Commissioned by the Statistical Society of Canada (SSC), this volume helps both general readers and users of statistics better appreciate the scope and importance of statistics. It presents the ways in which statistics is used while highlighting key contributions that Canadian statisticians are making to science, technology, business, government, and other areas. The book emphasizes the role and impact of computing in statistical modeling and analysis, including the issues involved with the huge amounts of data being generated by automated processes.

Examples and Problems in Mathematical Statistics

Provides the necessary skills to solve problems in mathematical statistics through theory, concrete examples, and exercises. With a clear and detailed approach to the fundamentals of statistical theory, Examples and Problems in Mathematical Statistics uniquely bridges the gap between theory and application and presents numerous problem-solving examples that illustrate the related notations and proven results. Written by an established authority in probability and mathematical statistics, each chapter begins with a theoretical presentation to introduce both the topic and the important results in an effort to aid in overall comprehension. Examples are then provided, followed by problems, and finally, solutions to some of the earlier problems. In addition, Examples and Problems in Mathematical Statistics features:
- Over 160 practical and interesting real-world examples from a variety of fields including engineering, mathematics, and statistics to help readers become proficient in theoretical problem solving
- More than 430 unique exercises with select solutions
- Key statistical inference topics, such as probability theory, statistical distributions, sufficient statistics, information in samples, testing statistical hypotheses, statistical estimation, confidence and tolerance intervals, large sample theory, and Bayesian analysis
Recommended for graduate-level courses in probability and statistical inference, Examples and Problems in Mathematical Statistics is also an ideal reference for applied statisticians and researchers.
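As one concrete instance of the kind of result such a course drills (a standard textbook formula, stated here for orientation and not quoted from the book), the (1 - α) confidence interval for a normal mean with known variance:

% (1 - alpha) confidence interval for a normal mean with known variance sigma^2
\[
\bar{X} \pm z_{1-\alpha/2}\,\frac{\sigma}{\sqrt{n}},
\qquad \bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i .
\]
% Worked example: n = 25, sigma = 2, xbar = 10, alpha = 0.05 (z = 1.96):
\[
10 \pm 1.96 \cdot \frac{2}{\sqrt{25}} = (9.216,\; 10.784).
\]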

Information Evaluation

During the reception of a piece of information, we are never passive. Depending on its origin and content, and on our personal beliefs and convictions, we bestow upon this piece of information, spontaneously or after reflection, a certain amount of confidence. Too much confidence shows a degree of naivety, whereas an absolute lack of it condemns us as being paranoid. These two attitudes are symmetrically detrimental, not only to the proper perception of this information but also to its use. Beyond these two extremes, each person generally adopts an intermediate position when faced with the reception of information, depending on its provenance and credibility. We still need to understand and explain how these judgements are formed, in what context and to what end. Spanning the approaches offered by philosophy, military intelligence, algorithmics and information science, this book presents the concepts of information and the confidence placed in it, the methods that militaries, the first to be aware of the need, have or should have adopted, the tools to help them, and the prospects that these have opened up. Beyond the military context, the book reveals ways to evaluate information for the good of other fields such as economic intelligence and, more globally, the informational monitoring by governments and businesses.
Contents:
1. Information: Philosophical Analysis and Strategic Applications, Mouhamadou El Hady Ba and Philippe Capet.
2. Epistemic Trust, Gloria Origgi.
3. The Fundamentals of Intelligence, Philippe Lemercier.
4. Information Evaluation in the Military Domain: Doctrines, Practices and Shortcomings, Philippe Capet and Adrien Revault d'Allonnes.
5. Multidimensional Approach to Reliability Evaluation of Information Sources, Frédéric Pichon, Christophe Labreuche, Bertrand Duqueroie and Thomas Delavallade.
6. Uncertainty of an Event and its Markers in Natural Language Processing, Mouhamadou El Hady Ba, Stéphanie Brizard, Tanneguy Dulong and Bénédicte Goujon.
7. Quantitative Information Evaluation: Modeling and Experimental Evaluation, Marie-Jeanne Lesot, Frédéric Pichon and Thomas Delavallade.
8. When Reported Information Is Second Hand, Laurence Cholvy.
9. An Architecture for the Evolution of Trust: Definition and Impact of the Necessary Dimensions of Opinion Making, Adrien Revault d'Allonnes.
About the Authors: Philippe Capet is a project manager and research engineer at Ektimo, working mainly on information management and control in military contexts. Thomas Delavallade is an advanced studies engineer at Thales Communications & Security, working on social media mining in the context of crisis management, cybersecurity and the fight against cybercrime.

Methods and Applications of Statistics in Clinical Trials, Volume 1: Concepts, Principles, Trials, and Designs

A complete guide to the key statistical concepts essential for the design and construction of clinical trials. As the newest major resource in the field of medical research, Methods and Applications of Statistics in Clinical Trials, Volume 1: Concepts, Principles, Trials, and Designs presents a timely and authoritative review of the central statistical concepts used to build clinical trials that obtain the best results. The reference unveils modern approaches vital to understanding, creating, and evaluating data obtained throughout the various stages of clinical trial design and analysis. Accessible and comprehensive, the first volume in a two-part set includes newly written articles as well as established literature from the Wiley Encyclopedia of Clinical Trials. Illustrating a variety of statistical concepts and principles such as longitudinal data, missing data, covariates, biased-coin randomization, repeated measurements, and simple randomization, the book also provides in-depth coverage of the various trial designs found within phase I-IV trials. Methods and Applications of Statistics in Clinical Trials, Volume 1: Concepts, Principles, Trials, and Designs also features:
- Detailed chapters on the types of trial designs, such as adaptive, crossover, group-randomized, multicenter, non-inferiority, non-randomized, open-labeled, preference, prevention, and superiority trials
- Over 100 contributions from leading academics, researchers, and practitioners
- An exploration of ongoing, cutting-edge clinical trials on early cancer and heart disease, mother-to-child human immunodeficiency virus transmission trials, and the AIDS Clinical Trials Group
Methods and Applications of Statistics in Clinical Trials, Volume 1: Concepts, Principles, Trials, and Designs is an excellent reference for researchers, practitioners, and students in the fields of clinical trials, pharmaceutics, biostatistics, medical research design, biology, biomedicine, epidemiology, and public health.
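Biased-coin randomization, one of the allocation schemes named above, can be sketched in a few lines. The sketch below follows Efron's classic version (assign to the under-represented arm with probability p, commonly 2/3); the arm labels, p, and cohort size are illustrative and not taken from the volume.

# Sketch of Efron-style biased-coin randomization; labels and p are illustrative.
import random

def biased_coin_assign(n_patients, p=2/3, seed=42):
    random.seed(seed)
    counts = {"A": 0, "B": 0}
    assignments = []
    for _ in range(n_patients):
        if counts["A"] == counts["B"]:
            arm = random.choice(["A", "B"])            # balanced: fair coin
        else:
            lagging = min(counts, key=counts.get)      # under-represented arm
            leading = max(counts, key=counts.get)
            arm = lagging if random.random() < p else leading
        counts[arm] += 1
        assignments.append(arm)
    return assignments, counts

sequence, totals = biased_coin_assign(20)
print("".join(sequence), totals)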

Statistical Analysis in Forensic Science: Evidential Values of Multivariate Physicochemical Data

A practical guide for determining the evidential value of physicochemical data. Microtraces of various materials (e.g. glass, paint, fibres, and petroleum products) are routinely subjected to physicochemical examination by forensic experts, whose role is to evaluate such physicochemical data in the context of the prosecution and defence propositions. Such examinations return various kinds of information, including quantitative data. From the forensic point of view, the most suitable way to evaluate evidence is the likelihood ratio. This book provides a collection of recent approaches to the determination of likelihood ratios and describes suitable software, with documentation and examples of their use in practice. The statistical computing and graphics software environment R, pre-computed Bayesian networks using Hugin Researcher and a new package, calcuLatoR, for the computation of likelihood ratios are all explored. Statistical Analysis in Forensic Science will provide an invaluable practical guide for forensic experts and practitioners, forensic statisticians, analytical chemists, and chemometricians. Key features include:
- Description of the physicochemical analysis of forensic trace evidence.
- Detailed description of likelihood ratio models for determining the evidential value of multivariate physicochemical data.
- Detailed description of methods, such as empirical cross-entropy plots, for assessing the performance of likelihood ratio-based methods for evidence evaluation.
- Routines written using the open-source R software, as well as Hugin Researcher and calcuLatoR.
- Practical examples and recommendations for the use of all these methods in practice.
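In its standard textbook form (stated here for orientation, not quoted from the book), the likelihood ratio weighs the evidence E under the prosecution proposition H_p against the defence proposition H_d, and updates prior odds to posterior odds:

% Likelihood ratio and its role in Bayes' theorem in odds form
\[
LR = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)},
\qquad
\frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)}
= LR \times \frac{\Pr(H_p)}{\Pr(H_d)} .
\]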

Forecasting Offertory Revenue at St. Elizabeth Seton Catholic Church

This new business analytics case study challenges readers to forecast donations, plan budgets, and manage cash flow for a religious institution suffering rapidly falling contributions. Crystallizing realistic analytical challenges faced by non-profit and for-profit organizations of all kinds, it exposes readers to the entire decision-making process, providing opportunities to perform analyses, interpret output, and recommend the best course of action. Author: Matthew J. Drake, Duquesne University.

Forecasting Sales at Ska Brewing Company

This new business analytics case study challenges readers to project trends and plan capacity for a fast-growing craft beer operation, so it can make the best possible decisions about expensive investments in brewing capacity. Crystallizing realistic analytical challenges faced by companies in many industries and markets, it exposes readers to the entire decision-making process, providing opportunities to perform analyses, interpret output, and recommend the best course of action. Author: Eric Huggins, Fort Lewis College.

Statistics for Mining Engineering

Many areas of mining engineering gather and use statistical information, obtained by observing the actual operation of equipment and their systems, the development of mining works, the surface subsidence that accompanies underground mining, and the displacement of rocks surrounding surface pits, underground drives and longwalls, amongst others. In addition, modern machines used in surface mining are equipped with diagnostic systems that automatically trace all important machine parameters and send this information to the main producer's computer. Such data not only provide information on the technical properties of the machine; they also have a statistical character. Furthermore, all information gathered during test-stand and laboratory investigations, where parts, assemblies and whole devices are tested in order to prove their usefulness, has a stochastic character. All of this material needs to be analysed statistically and, more importantly, based on the results mining engineers must decide whether to undertake actions connected with the further operation of the machines, the further development of the works, and so on. For these reasons, knowledge of modern statistics is necessary for mining engineers: not only how statistical analysis of data should be conducted and statistical synthesis carried out, but also how to understand the results obtained and how to use them to make appropriate decisions about the mining operation. This book on statistical analysis and synthesis starts with a short review of probability theory and also includes a special section on statistical prediction. The text is illustrated with many examples taken from mining practice; moreover, the tables required to conduct statistical inference are included.

Growth Curve Modeling: Theory and Applications

Features recent trends and advances in the theory and techniques used to accurately measure and model growth. Growth Curve Modeling: Theory and Applications features an accessible introduction to growth curve modeling and addresses how to monitor the change in variables over time since there is no "one size fits all" approach to growth measurement. A review of the requisite mathematics for growth modeling and the statistical techniques needed for estimating growth models are provided, and an overview of popular growth curves, such as linear, logarithmic, reciprocal, logistic, Gompertz, Weibull, negative exponential, and log-logistic, among others, is included. In addition, the book discusses key application areas including economic, plant, population, forest, and firm growth and is suitable as a resource for assessing recent growth modeling trends in the medical field. SAS is utilized throughout to analyze and model growth curves, aiding readers in estimating specialized growth rates and curves. Including derivations of virtually all of the major growth curves and models, Growth Curve Modeling: Theory and Applications also features:
- Statistical distribution analysis as it pertains to growth modeling
- Trend estimations
- Dynamic site equations obtained from growth models
- Nonlinear regression
- Yield-density curves
- Nonlinear mixed effects models for repeated measurements data
Growth Curve Modeling: Theory and Applications is an excellent resource for statisticians, public health analysts, biologists, botanists, economists, and demographers who require a modern review of statistical methods for modeling growth curves and analyzing longitudinal data. The book is also useful for upper-undergraduate and graduate courses on growth modeling.
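The book's estimation is carried out in SAS; as a hedged illustration of fitting one of the curves it lists, here is a logistic growth curve fitted to synthetic data with Python's scipy. All parameter values are invented.

# Fit a logistic growth curve y = K / (1 + exp(-r (t - t0))) to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    return K / (1.0 + np.exp(-r * (t - t0)))

t = np.arange(0, 20)
y = logistic(t, K=100, r=0.6, t0=8) + np.random.default_rng(1).normal(0, 2, t.size)

params, _ = curve_fit(logistic, t, y, p0=[90, 0.5, 10])     # p0: rough initial guesses
K_hat, r_hat, t0_hat = params
print(f"estimated K = {K_hat:.1f}, r = {r_hat:.2f}, t0 = {t0_hat:.1f}")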

Nonlinear Option Pricing

New Tools to Solve Your Option Pricing Problems. For nonlinear PDEs encountered in quantitative finance, advanced probabilistic methods are needed to address dimensionality issues. Written by two leaders in quantitative research, including Risk magazine's 2013 Quant of the Year, Nonlinear Option Pricing compares various numerical methods for solving high-dimensional nonlinear problems arising in option pricing. Designed for practitioners, it is the first authored book to discuss nonlinear Black-Scholes PDEs and compare the efficiency of many different methods. Real-World Solutions for Quantitative Analysts. The book helps quants develop both their analytical and numerical expertise. It focuses on general mathematical tools rather than specific financial questions so that readers can easily use the tools to solve their own nonlinear problems. The authors build intuition through numerous real-world examples of numerical implementation. Although the focus is on ideas and numerical examples, the authors introduce relevant mathematical notions and important results and proofs. The book also covers several original approaches, including regression methods and dual methods for pricing chooser options, Monte Carlo approaches for pricing in the uncertain volatility model and the uncertain lapse and mortality model, the Markovian projection method and the particle method for calibrating local stochastic volatility models to market prices of vanilla options with/without stochastic interest rates, the a + bλ technique for building local correlation models that calibrate to market prices of vanilla options on a basket, and a new stochastic representation of nonlinear PDE solutions based on marked branching diffusions.
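For orientation only, the linear building block underneath these methods is plain Monte Carlo pricing; the nonlinear problems the book treats (uncertain volatility, local stochastic volatility calibration, branching diffusions) require considerably more machinery. A minimal sketch with invented market parameters:

# Plain Monte Carlo pricing of a European call under Black-Scholes dynamics.
# Linear building block only; the book's subject is the harder nonlinear problems.
import numpy as np

S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.25, 1.0     # invented market parameters
n_paths = 200_000

rng = np.random.default_rng(7)
Z = rng.standard_normal(n_paths)
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)   # terminal prices
payoff = np.maximum(ST - K, 0.0)
price = np.exp(-r * T) * payoff.mean()
stderr = np.exp(-r * T) * payoff.std(ddof=1) / np.sqrt(n_paths)

print(f"MC call price: {price:.3f} +/- {1.96 * stderr:.3f} (95% CI)")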

Understanding Uncertainty, Revised Edition

Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." —Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made. Featuring new material, the Revised Edition remains the go-to guide for uncertainty and decision making, providing further applications at an accessible level, including:
- A critical study of transitivity, a basic concept in probability
- A discussion of how the failure of the financial sector to use the proper approach to uncertainty may have contributed to the recent recession
- A consideration of betting, showing that a bookmaker's odds are not expressions of probability
- Applications of the book's thesis to statistics
- A demonstration that some techniques currently popular in statistics, like significance tests, may be unsound, even seriously misleading, because they violate the rules of probability
Understanding Uncertainty, Revised Edition is ideal for students studying probability or statistics and for anyone interested in one of the most fascinating and vibrant fields of study in contemporary science and mathematics.
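The point that a bookmaker's odds are not expressions of probability can be made in a few lines: converting quoted decimal odds to implied figures gives numbers that sum to more than 1, reflecting the bookmaker's margin. The odds below are invented for the sketch.

# Implied "probabilities" from invented decimal odds sum to more than 1 (the overround).
decimal_odds = {"Team A": 2.10, "Draw": 3.40, "Team B": 3.60}

implied = {outcome: 1.0 / odds for outcome, odds in decimal_odds.items()}
total = sum(implied.values())

for outcome, p in implied.items():
    print(f"{outcome}: implied figure {p:.3f}")
print(f"sum = {total:.3f}  ->  margin of about {100 * (total - 1):.1f}%")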

Image Statistics in Visual Computing

To achieve the complex task of interpreting what we see, our brains rely on statistical regularities and patterns in visual data. Knowledge of these regularities can also be considerably useful in visual computing disciplines, such as computer vision, computer graphics, and image processing. The field of natural image statistics studies these regularities to exploit their potential and better understand human vision. Illustrated with numerous color figures throughout, Image Statistics in Visual Computing keeps the material accessible, providing mathematical definitions where appropriate to help readers understand the transforms that highlight statistical regularities present in images. The book also describes patterns that arise once the images are transformed and gives examples of applications that have successfully used statistical regularities. Numerous references enable readers to easily look up more information about a specific concept or application. A supporting website also offers additional information, including descriptions of various image databases suitable for statistics. Collecting state-of-the-art, interdisciplinary knowledge in one source, this book explores the relation of natural image statistics to human vision and shows how natural image statistics can be applied to visual computing. It encourages readers in both academic and industrial settings to develop novel insights and applications in all disciplines that relate to visual computing.

Nonparametric Statistics for Social and Behavioral Sciences

Incorporating a hands-on pedagogical approach, this text presents the concepts, principles, and methods used in performing many nonparametric procedures. It also demonstrates practical applications of the most common nonparametric procedures using IBM's SPSS software. The text is the only current nonparametric book written specifically for students in the behavioral and social sciences. With examples of real-life research problems, it emphasizes sound research designs, appropriate statistical analyses, and accurate interpretations of results.
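The book's demonstrations use SPSS; for comparison, one of the most common nonparametric procedures it covers, the Mann-Whitney U test, takes a single call in Python. The scores below are invented for the sketch.

# Illustrative Mann-Whitney U test; the book itself demonstrates procedures in SPSS.
from scipy import stats

scores_group1 = [14, 18, 12, 20, 16, 15, 19]     # invented scores
scores_group2 = [22, 25, 19, 24, 21, 23, 26]

u_stat, p_value = stats.mannwhitneyu(scores_group1, scores_group2, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")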

Understanding Business Statistics

Written in a conversational tone, this text presents topics in a systematic and organized manner to help students navigate the material. Demonstration problems appear alongside the concepts, which makes the content easier to understand. Because the reasoning behind each exercise is explained, students are more inclined to engage with the material and gain a clear understanding of how to apply statistics to the business world. Freed, Understanding Business Statistics is accompanied by WileyPLUS, a research-based online environment for effective teaching and learning. This online learning system gives students instant feedback on homework assignments, provides video tutorials and a variety of study tools, and offers instructors thousands of reliable, accurate problems (including every problem from the book) to deliver automatically graded assignments or tests. Available in or outside of the Blackboard Learn environment, WileyPLUS resources help reach all types of learners and give instructors the tools they need to enhance course material. WileyPLUS is sold separately from the text.

Early Estimation of Project Determinants

The study starts from the underlying principles of construction production, which make the prediction of project determinants ill-conditioned in the early phases of building projects. To improve the precision of these estimates, solutions relying on statistical evidence are proposed. Two alternative methods of analysis, namely linear regression and artificial neural networks, are employed to recognize the patterns in the sampled projects. The comparison is based on prediction metrics computed on an unseen test sample. The empirical evidence suggests that the proposed solutions provide superior prediction accuracy compared to current practices. Finally, implementation of the solutions is illustrated on a randomly selected office development.
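A minimal sketch of that comparison set-up, linear regression against a small neural network judged on an unseen test sample, follows; the synthetic data, features, and cost model are invented and do not come from the study's sampled projects.

# Linear regression vs. a small neural network, compared on a held-out test sample.
# Synthetic data; features and cost model are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.uniform(1, 10, size=(200, 2))            # e.g. floor area (1000 m2), storeys
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(0, 1.0, 200)   # cost index + noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

lin = LinearRegression().fit(X_tr, y_tr)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                 random_state=0)).fit(X_tr, y_tr)

print("linear regression MAE:", round(mean_absolute_error(y_te, lin.predict(X_te)), 2))
print("neural network MAE:", round(mean_absolute_error(y_te, ann.predict(X_te)), 2))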

Fast Sequential Monte Carlo Methods for Counting and Optimization

A comprehensive account of the theory and application of Monte Carlo methods. Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the field, the book places emphasis on cross-entropy, minimum cross-entropy, splitting, and stochastic enumeration. Focusing on the concepts and application of Monte Carlo techniques, Fast Sequential Monte Carlo Methods for Counting and Optimization includes:
- Detailed algorithms needed to practice solving real-world problems
- Numerous examples in which Monte Carlo methods produce solutions within a 1-2% limit of relative error
- A new generic sequential importance sampling algorithm alongside extensive numerical results
- An appendix focused on review material to provide additional background information
Fast Sequential Monte Carlo Methods for Counting and Optimization is an excellent resource for engineers, computer scientists, mathematicians, statisticians, and readers interested in efficient simulation techniques. The book is also useful for upper-undergraduate and graduate-level courses on Monte Carlo methods.
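As a generic illustration of the importance-sampling idea behind rare-event estimation (not the book's own algorithm), the sketch below estimates P(X > 4) for a standard normal by sampling from a shifted proposal and reweighting, a setting where naive Monte Carlo hardly ever sees the event.

# Importance sampling for a rare-event probability: P(X > 4), X ~ N(0, 1).
# Generic illustration, not the book's own algorithm.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 100_000
threshold = 4.0

naive = (rng.standard_normal(n) > threshold).mean()         # almost always 0 hits

y = rng.normal(loc=threshold, scale=1.0, size=n)             # shifted proposal N(4, 1)
weights = norm.pdf(y) / norm.pdf(y, loc=threshold, scale=1.0)
is_estimate = np.mean((y > threshold) * weights)

print(f"exact tail probability : {norm.sf(threshold):.3e}")
print(f"naive Monte Carlo      : {naive:.3e}")
print(f"importance sampling    : {is_estimate:.3e}")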