talk-data.com

Event

O'Reilly Data Science Books

2013-08-09 – 2026-02-25 O'Reilly

Activities tracked

505

Collection of O'Reilly books on Data Science.

Filtering by: statistics

Sessions & talks

Showing 326–350 of 505 · Newest first

Examples and Problems in Mathematical Statistics

Provides the necessary skills to solve problems in mathematical statistics through theory, concrete examples, and exercises

With a clear and detailed approach to the fundamentals of statistical theory, Examples and Problems in Mathematical Statistics uniquely bridges the gap between theory and application and presents numerous problem-solving examples that illustrate the related notations and proven results. Written by an established authority in probability and mathematical statistics, each chapter begins with a theoretical presentation to introduce both the topic and the important results in an effort to aid in overall comprehension. Examples are then provided, followed by problems, and finally, solutions to some of the earlier problems. In addition, Examples and Problems in Mathematical Statistics features:

- Over 160 practical and interesting real-world examples from a variety of fields, including engineering, mathematics, and statistics, to help readers become proficient in theoretical problem solving
- More than 430 unique exercises with select solutions
- Key statistical inference topics, such as probability theory, statistical distributions, sufficient statistics, information in samples, testing statistical hypotheses, statistical estimation, confidence and tolerance intervals, large sample theory, and Bayesian analysis

Recommended for graduate-level courses in probability and statistical inference, Examples and Problems in Mathematical Statistics is also an ideal reference for applied statisticians and researchers.
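Among the inference topics the blurb lists, confidence intervals are easy to illustrate concretely. The sketch below is not from the book: it computes a large-sample 95% z-interval for a mean using only the Python standard library, with an invented sample whose true mean is 10.

```python
import math
import random
import statistics

random.seed(1)
# Hypothetical sample: 400 draws from a population with true mean 10, sd 2.
sample = [random.gauss(10.0, 2.0) for _ in range(400)]

mean = statistics.fmean(sample)
se = statistics.stdev(sample) / math.sqrt(len(sample))   # standard error of the mean
z = statistics.NormalDist().inv_cdf(0.975)               # two-sided 95% quantile, ~1.96
ci = (mean - z * se, mean + z * se)
# Under repeated sampling, roughly 95% of intervals built this way
# cover the true population mean.
```

With small samples one would replace the normal quantile with a Student t quantile, which the standard library does not provide.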

Information Evaluation

During the reception of a piece of information, we are never passive. Depending on its origin and content, and on our personal beliefs and convictions, we bestow upon this piece of information, spontaneously or after reflection, a certain amount of confidence. Too much confidence shows a degree of naivety, whereas an absolute lack of it marks us as paranoid. These two attitudes are symmetrically detrimental, not only to the proper perception of this information but also to its use. Beyond these two extremes, each person generally adopts an intermediate position when faced with the reception of information, depending on its provenance and credibility. We still need to understand and explain how these judgements are conceived, in what context and to what end. Spanning the approaches offered by philosophy, military intelligence, algorithmics and information science, this book presents the concepts of information and the confidence placed in it, the methods that militaries (the first to be aware of the need) have adopted or should adopt, the tools that help them, and the prospects they open up. Beyond the military context, the book reveals ways to evaluate information for the good of other fields such as economic intelligence and, more globally, informational monitoring by governments and businesses.

Contents:

1. Information: Philosophical Analysis and Strategic Applications, Mouhamadou El Hady Ba and Philippe Capet.
2. Epistemic Trust, Gloria Origgi.
3. The Fundamentals of Intelligence, Philippe Lemercier.
4. Information Evaluation in the Military Domain: Doctrines, Practices and Shortcomings, Philippe Capet and Adrien Revault d'Allonnes.
5. Multidimensional Approach to Reliability Evaluation of Information Sources, Frédéric Pichon, Christophe Labreuche, Bertrand Duqueroie and Thomas Delavallade.
6. Uncertainty of an Event and its Markers in Natural Language Processing, Mouhamadou El Hady Ba, Stéphanie Brizard, Tanneguy Dulong and Bénédicte Goujon.
7. Quantitative Information Evaluation: Modeling and Experimental Evaluation, Marie-Jeanne Lesot, Frédéric Pichon and Thomas Delavallade.
8. When Reported Information Is Second Hand, Laurence Cholvy.
9. An Architecture for the Evolution of Trust: Definition and Impact of the Necessary Dimensions of Opinion Making, Adrien Revault d'Allonnes.

About the Authors: Philippe Capet is a project manager and research engineer at Ektimo, working mainly on information management and control in military contexts. Thomas Delavallade is an advanced studies engineer at Thales Communications & Security, working on social media mining in the context of crisis management, cybersecurity and the fight against cybercrime.

Methods and Applications of Statistics in Clinical Trials, Volume 1: Concepts, Principles, Trials, and Designs

A complete guide to the key statistical concepts essential for the design and construction of clinical trials

As the newest major resource in the field of medical research, Methods and Applications of Statistics in Clinical Trials, Volume 1: Concepts, Principles, Trials, and Designs presents a timely and authoritative review of the central statistical concepts used to build clinical trials that obtain the best results. The reference unveils modern approaches vital to understanding, creating, and evaluating data obtained throughout the various stages of clinical trial design and analysis. Accessible and comprehensive, the first volume in a two-part set includes newly written articles as well as established literature from the Wiley Encyclopedia of Clinical Trials. Illustrating a variety of statistical concepts and principles such as longitudinal data, missing data, covariates, biased-coin randomization, repeated measurements, and simple randomization, the book also provides in-depth coverage of the various trial designs found within phase I-IV trials. Methods and Applications of Statistics in Clinical Trials, Volume 1: Concepts, Principles, Trials, and Designs also features:

- Detailed chapters on the types of trial designs, such as adaptive, crossover, group-randomized, multicenter, non-inferiority, non-randomized, open-labeled, preference, prevention, and superiority trials
- Over 100 contributions from leading academics, researchers, and practitioners
- An exploration of ongoing, cutting-edge clinical trials on early cancer and heart disease, mother-to-child human immunodeficiency virus transmission trials, and the AIDS Clinical Trials Group

Methods and Applications of Statistics in Clinical Trials, Volume 1: Concepts, Principles, Trials, and Designs is an excellent reference for researchers, practitioners, and students in the fields of clinical trials, pharmaceutics, biostatistics, medical research design, biology, biomedicine, epidemiology, and public health.

Statistical Analysis in Forensic Science: Evidential Values of Multivariate Physicochemical Data

A practical guide for determining the evidential value of physicochemical data

Microtraces of various materials (e.g. glass, paint, fibres, and petroleum products) are routinely subjected to physicochemical examination by forensic experts, whose role is to evaluate such physicochemical data in the context of the prosecution and defence propositions. Such examinations return various kinds of information, including quantitative data. From the forensic point of view, the most suitable way to evaluate evidence is the likelihood ratio. This book provides a collection of recent approaches to the determination of likelihood ratios and describes suitable software, with documentation and examples of their use in practice. The statistical computing and graphics software environment R, pre-computed Bayesian networks using Hugin Researcher and a new package, calcuLatoR, for the computation of likelihood ratios are all explored. Statistical Analysis in Forensic Science will provide an invaluable practical guide for forensic experts and practitioners, forensic statisticians, analytical chemists, and chemometricians. Key features include:

- Description of the physicochemical analysis of forensic trace evidence.
- Detailed description of likelihood ratio models for determining the evidential value of multivariate physicochemical data.
- Detailed description of methods, such as empirical cross-entropy plots, for assessing the performance of likelihood ratio-based methods for evidence evaluation.
- Routines written using the open-source R software, as well as Hugin Researcher and calcuLatoR.
- Practical examples and recommendations for the use of all these methods in practice.
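The likelihood ratio the blurb refers to weighs the same evidence under the prosecution and defence propositions. Here is a minimal single-variable sketch (the measurement value and distribution parameters are invented for illustration; the book's multivariate models are far richer):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical refractive-index measurement of a recovered glass fragment.
measurement = 1.5185

# Prosecution proposition: the fragment came from the crime-scene window.
p_prosecution = normal_pdf(measurement, mu=1.5184, sigma=0.0003)
# Defence proposition: the fragment came from the background glass population.
p_defence = normal_pdf(measurement, mu=1.5160, sigma=0.0040)

likelihood_ratio = p_prosecution / p_defence
# LR > 1 supports the prosecution proposition; LR < 1 supports the defence.
```

The strength of the support grows with the magnitude of the ratio, which is why forensic reporting scales (weak, moderate, strong support) are defined on LR bands rather than a single threshold.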

Forecasting Offertory Revenue at St. Elizabeth Seton Catholic Church

This new business analytics case study challenges readers to forecast donations, plan budgets, and manage cash flow for a religious institution suffering rapidly falling contributions. Crystallizing realistic analytical challenges faced by non-profit and for-profit organizations of all kinds, it exposes readers to the entire decision-making process, providing opportunities to perform analyses, interpret output, and recommend the best course of action. Author: Matthew J. Drake, Duquesne University.

Forecasting Sales at Ska Brewing Company

This new business analytics case study challenges readers to project trends and plan capacity for a fast-growing craft beer operation, so it can make the best possible decisions about expensive investments in brewing capacity. Crystallizing realistic analytical challenges faced by companies in many industries and markets, it exposes readers to the entire decision-making process, providing opportunities to perform analyses, interpret output, and recommend the best course of action. Author: Eric Huggins, Fort Lewis College.

Statistics for Mining Engineering

Many areas of mining engineering gather and use statistical information, obtained by observing the actual operation of equipment and its systems, the development of mining works, the surface subsidence that accompanies underground mining, and the displacement of rocks surrounding surface pits, underground drives and longwalls, among others. In addition, modern machines used in surface mining are equipped with diagnostic systems that automatically trace all important machine parameters and send this information to the producer's main computer. Such data not only provide information on the technical properties of the machine; they also have a statistical character. Furthermore, all information gathered during test-stand and laboratory investigations, where parts, assemblies and whole devices are tested to prove their usefulness, has a stochastic character. All of these materials need to be analysed statistically and, more importantly, based on these results mining engineers must decide whether to undertake actions connected with the further operation of the machines, the further development of the works, and so on. For these reasons, knowledge of modern statistics is necessary for mining engineers: not only how statistical analysis and synthesis of data should be conducted, but also how to understand the results obtained and how to use them to make appropriate decisions about the mining operation. This book on statistical analysis and synthesis starts with a short review of probability theory and also includes a special section on statistical prediction. The text is illustrated with many examples taken from mining practice, and the tables required to conduct statistical inference are included.

Growth Curve Modeling: Theory and Applications

Features recent trends and advances in the theory and techniques used to accurately measure and model growth

Growth Curve Modeling: Theory and Applications features an accessible introduction to growth curve modeling and addresses how to monitor the change in variables over time since there is no "one size fits all" approach to growth measurement. A review of the requisite mathematics for growth modeling and the statistical techniques needed for estimating growth models are provided, and an overview of popular growth curves, such as linear, logarithmic, reciprocal, logistic, Gompertz, Weibull, negative exponential, and log-logistic, among others, is included. In addition, the book discusses key application areas including economic, plant, population, forest, and firm growth and is suitable as a resource for assessing recent growth modeling trends in the medical field. SAS is utilized throughout to analyze and model growth curves, aiding readers in estimating specialized growth rates and curves. Including derivations of virtually all of the major growth curves and models, Growth Curve Modeling: Theory and Applications also features:

- Statistical distribution analysis as it pertains to growth modeling
- Trend estimations
- Dynamic site equations obtained from growth models
- Nonlinear regression
- Yield-density curves
- Nonlinear mixed effects models for repeated measurements data

Growth Curve Modeling: Theory and Applications is an excellent resource for statisticians, public health analysts, biologists, botanists, economists, and demographers who require a modern review of statistical methods for modeling growth curves and analyzing longitudinal data. The book is also useful for upper-undergraduate and graduate courses on growth modeling.
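The book's examples use SAS; as a language-neutral illustration of one of the curves it lists, here is the logistic growth curve evaluated with only the Python standard library. The parameter values are arbitrary.

```python
import math

def logistic(t, K, r, t0):
    """Logistic growth: carrying capacity K, growth rate r, inflection time t0."""
    return K / (1 + math.exp(-r * (t - t0)))

K, r, t0 = 100.0, 0.5, 10.0
curve = [logistic(t, K, r, t0) for t in range(21)]

# The curve is S-shaped: growth accelerates up to the inflection point t0,
# where exactly half the carrying capacity is reached, then decelerates
# as the population approaches K.
half = logistic(t0, K, r, t0)
```

Fitting such a curve to data (estimating K, r, and t0 from observations) is where nonlinear regression, one of the book's listed topics, comes in.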

Nonlinear Option Pricing

New Tools to Solve Your Option Pricing Problems

For nonlinear PDEs encountered in quantitative finance, advanced probabilistic methods are needed to address dimensionality issues. Written by two leaders in quantitative research, including Risk magazine's 2013 Quant of the Year, Nonlinear Option Pricing compares various numerical methods for solving high-dimensional nonlinear problems arising in option pricing. Designed for practitioners, it is the first authored book to discuss nonlinear Black-Scholes PDEs and compare the efficiency of many different methods.

Real-World Solutions for Quantitative Analysts

The book helps quants develop both their analytical and numerical expertise. It focuses on general mathematical tools rather than specific financial questions so that readers can easily use the tools to solve their own nonlinear problems. The authors build intuition through numerous real-world examples of numerical implementation. Although the focus is on ideas and numerical examples, the authors introduce relevant mathematical notions and important results and proofs. The book also covers several original approaches, including regression methods and dual methods for pricing chooser options; Monte Carlo approaches for pricing in the uncertain volatility model and the uncertain lapse and mortality model; the Markovian projection method and the particle method for calibrating local stochastic volatility models to market prices of vanilla options, with or without stochastic interest rates; the a + bλ technique for building local correlation models that calibrate to market prices of vanilla options on a basket; and a new stochastic representation of nonlinear PDE solutions based on marked branching diffusions.

Understanding Uncertainty, Revised Edition

Praise for the First Edition

"...a reference for everyone who is interested in knowing and handling uncertainty." —Journal of Applied Statistics

The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made. Featuring new material, the Revised Edition remains the go-to guide for uncertainty and decision making, providing further applications at an accessible level, including:

- A critical study of transitivity, a basic concept in probability
- A discussion of how the failure of the financial sector to use the proper approach to uncertainty may have contributed to the recent recession
- A consideration of betting, showing that a bookmaker's odds are not expressions of probability
- Applications of the book's thesis to statistics
- A demonstration that some techniques currently popular in statistics, like significance tests, may be unsound, even seriously misleading, because they violate the rules of probability

Understanding Uncertainty, Revised Edition is ideal for students studying probability or statistics and for anyone interested in one of the most fascinating and vibrant fields of study in contemporary science and mathematics.
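The claim that a bookmaker's odds are not expressions of probability has a quick numerical demonstration: inverting quoted decimal odds yields "implied probabilities" that sum to more than one. The odds below are invented for illustration.

```python
# Decimal odds quoted by a hypothetical bookmaker for a three-horse race.
decimal_odds = [2.0, 3.0, 5.0]

# Naively treating each inverse as a probability...
implied = [1 / o for o in decimal_odds]
overround = sum(implied)
# ...the total exceeds 1. The excess (the "overround") is the bookmaker's
# built-in margin, so the quoted odds cannot be genuine probabilities,
# which must sum to exactly 1 over mutually exclusive, exhaustive outcomes.
```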

Image Statistics in Visual Computing

To achieve the complex task of interpreting what we see, our brains rely on statistical regularities and patterns in visual data. Knowledge of these regularities can also be considerably useful in visual computing disciplines, such as computer vision, computer graphics, and image processing. The field of natural image statistics studies these regularities to exploit their potential and better understand human vision. With numerous color figures throughout, Image Statistics in Visual Computing keeps the material accessible, providing mathematical definitions where appropriate to help readers understand the transforms that highlight statistical regularities present in images. The book also describes patterns that arise once the images are transformed and gives examples of applications that have successfully used statistical regularities. Numerous references enable readers to easily look up more information about a specific concept or application. A supporting website also offers additional information, including descriptions of various image databases suitable for statistics. Collecting state-of-the-art, interdisciplinary knowledge in one source, this book explores the relation of natural image statistics to human vision and shows how natural image statistics can be applied to visual computing. It encourages readers in both academic and industrial settings to develop novel insights and applications in all disciplines that relate to visual computing.

Nonparametric Statistics for Social and Behavioral Sciences

Incorporating a hands-on pedagogical approach, this text presents the concepts, principles, and methods used in performing many nonparametric procedures. It also demonstrates practical applications of the most common nonparametric procedures using IBM's SPSS software. The text is the only current nonparametric book written specifically for students in the behavioral and social sciences. With examples of real-life research problems, it emphasizes sound research designs, appropriate statistical analyses, and accurate interpretations of results.

Understanding Business Statistics

Written in a conversational tone, Understanding Business Statistics presents topics in a systematic and organized manner to help students navigate the material. Demonstration problems appear alongside the concepts, which makes the content easier to understand. By explaining the reasoning behind each exercise, the text encourages students to engage with the material and gain a clear understanding of how to apply statistics to the business world. Freed, Understanding Business Statistics is accompanied by WileyPLUS, a research-based online environment for effective teaching and learning. This online learning system gives students instant feedback on homework assignments, provides video tutorials and a variety of study tools, and offers instructors thousands of reliable, accurate problems (including every problem from the book) to deliver automatically graded assignments or tests. Available in or outside of the Blackboard Learn environment, WileyPLUS resources help reach all types of learners and give instructors the tools they need to enhance course material. WileyPLUS is sold separately from the text.

Early Estimation of Project Determinants

The study begins with the underlying principles of construction production, which drive the ill-conditioned prediction of project determinants in the early phases of building projects. To improve the precision of these estimates, solutions relying on statistical evidence are proposed. Two alternative methods of analysis, linear regression and artificial neural networks, were employed to recognize patterns in the sampled projects. The methods were compared on the basis of prediction measures computed on an unseen test sample. The evidence from the empirical investigation suggests that the proposed solutions provide superior prediction accuracy compared to current practice. Finally, implementation of the solutions is illustrated on a randomly selected office development.

Fast Sequential Monte Carlo Methods for Counting and Optimization

A comprehensive account of the theory and application of Monte Carlo methods

Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the field, the book places emphasis on cross-entropy, minimum cross-entropy, splitting, and stochastic enumeration. Focusing on the concepts and application of Monte Carlo techniques, Fast Sequential Monte Carlo Methods for Counting and Optimization includes:

- Detailed algorithms needed to practice solving real-world problems
- Numerous examples with Monte Carlo method produced solutions within the 1-2% limit of relative error
- A new generic sequential importance sampling algorithm alongside extensive numerical results
- An appendix focused on review material to provide additional background information

Fast Sequential Monte Carlo Methods for Counting and Optimization is an excellent resource for engineers, computer scientists, mathematicians, statisticians, and readers interested in efficient simulation techniques. The book is also useful for upper-undergraduate and graduate-level courses on Monte Carlo methods.
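The rare-event estimation mentioned in the blurb can be illustrated with the classic importance-sampling trick of shifting the sampling distribution toward the rare region. This is a generic textbook sketch, not one of the book's algorithms:

```python
import math
import random

random.seed(42)

def tail_probability(threshold, n_samples=100_000):
    """Estimate P(X > threshold) for X ~ N(0, 1) by importance sampling:
    draw from N(threshold, 1) and reweight by the density ratio."""
    total = 0.0
    for _ in range(n_samples):
        y = random.gauss(threshold, 1.0)   # proposal centred on the rare region
        if y > threshold:
            # weight = phi(y) / phi(y - threshold) for the standard normal phi
            total += math.exp(-0.5 * y * y + 0.5 * (y - threshold) ** 2)
    return total / n_samples

estimate = tail_probability(4.0)
# The true value P(X > 4) is about 3.17e-5; naive Monte Carlo with the same
# budget would observe only a handful of exceedances, so its relative error
# would be enormous.
```

Shifting the proposal so that the rare event becomes typical, then correcting with likelihood-ratio weights, is the same idea that underlies the cross-entropy method emphasized in the book.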

Nonparametric Statistical Methods, 3rd Edition

Praise for the Second Edition

"This book should be an essential part of the personal library of every practicing statistician." —Technometrics

Thoroughly revised and updated, the new edition of Nonparametric Statistical Methods includes additional modern topics and procedures, more practical data sets, and new problems from real-life situations. The book continues to emphasize the importance of nonparametric methods as a significant branch of modern statistics and equips readers with the conceptual and technical skills necessary to select and apply the appropriate procedures for any given situation. Written by leading statisticians, Nonparametric Statistical Methods, Third Edition provides readers with crucial nonparametric techniques in a variety of settings, emphasizing the assumptions underlying the methods. The book provides an extensive array of examples that clearly illustrate how to use nonparametric approaches for handling one- or two-sample location and dispersion problems, dichotomous data, and one-way and two-way layout problems. In addition, the Third Edition features:

- The use of the freely available R software to aid in computation and simulation, including many new R programs written explicitly for this new edition
- New chapters that address density estimation, wavelets, smoothing, ranked set sampling, and Bayesian nonparametrics
- Problems that illustrate examples from agricultural science, astronomy, biology, criminology, education, engineering, environmental science, geology, home economics, medicine, oceanography, physics, psychology, sociology, and space science

Nonparametric Statistical Methods, Third Edition is an excellent reference for applied statisticians and practitioners who seek a review of nonparametric methods and their relevant applications. The book is also an ideal textbook for upper-undergraduate and first-year graduate courses in applied nonparametric statistics.
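For a flavour of the two-sample location problems these methods address, here is a bare-bones Wilcoxon rank-sum statistic (the book works in R; ties are ignored here for simplicity, and the data are invented):

```python
def rank_sum(sample_a, sample_b):
    """Wilcoxon rank-sum statistic: rank the pooled data (assuming no ties)
    and sum the ranks belonging to sample_a."""
    pooled = sorted(sample_a + sample_b)
    ranks = {value: i + 1 for i, value in enumerate(pooled)}
    return sum(ranks[value] for value in sample_a)

a = [1.1, 2.3, 2.9]          # e.g. a treatment group
b = [3.4, 4.2, 5.0, 6.1]     # e.g. a control group

w = rank_sum(a, b)
# Under the null hypothesis of identical distributions, the expected
# rank sum for sample_a is n_a * (n_a + n_b + 1) / 2.
expected = len(a) * (len(a) + len(b) + 1) / 2
# w well below `expected` hints that sample_a is shifted toward smaller
# values; a proper test compares w against the statistic's null distribution.
```

Because the statistic uses only ranks, it needs no normality assumption, which is the core appeal of the nonparametric methods the book surveys.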

Financial and Actuarial Statistics, 2nd Edition

This work enables readers to obtain the mathematical and statistical background required in the current financial and actuarial industries. It also advances the application and theory of statistics in modern financial and actuarial modeling. This second edition adds a substantial amount of new material, including Excel exercises with solutions; nomenclature and notations standard to the actuarial field; a new chapter on Markov chains and actuarial applications; expanded discussions on simulation, surplus models, and ruin computations; and much more.

Applied Statistics and Probability for Engineers, 6th Edition

This best-selling engineering statistics text provides a practical approach that is more oriented to engineering and the chemical and physical sciences than many similar texts. It is packed with unique problem sets that reflect realistic situations engineers will encounter in their working lives. This text shows how statistics, the science of data, is just as important for engineers as the mechanical, electrical, and materials sciences.

Handbook of Probability

THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY

Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction, historical background, theory and applications, algorithms, and exercises. The Handbook of Probability offers coverage of:

- Probability Space
- Probability Measure
- Random Variables
- Random Vectors in Rn
- Characteristic Function
- Moment Generating Function
- Gaussian Random Vectors
- Convergence Types
- Limit Theorems

The Handbook of Probability is an ideal resource for researchers and practitioners in numerous fields, such as mathematics, statistics, operations research, engineering, medicine, and finance, as well as a useful text for graduate students.
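One of the limit theorems in that coverage, the central limit theorem, is easy to check empirically with the standard library: means of n uniform(0, 1) draws have mean 1/2 and variance (1/12)/n. This is a generic demonstration, not an example from the handbook.

```python
import random
import statistics

random.seed(0)

# Distribution of the mean of n uniform(0, 1) draws, estimated over many trials.
n, trials = 48, 20_000
means = [statistics.fmean(random.random() for _ in range(n)) for _ in range(trials)]

grand_mean = statistics.fmean(means)     # close to the uniform mean, 1/2
sample_var = statistics.variance(means)  # close to (1/12) / n, per CLT scaling
```

A histogram of `means` would show the familiar bell shape, even though each underlying draw is uniform rather than normal.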

Business Statistics: For Contemporary Decision Making, 8th Edition

This text is an unbound, binder-ready edition. Business Statistics: For Contemporary Decision Making, 8th Edition continues the tradition of presenting and explaining the wonders of business statistics through the use of clear, complete, student-friendly pedagogy. Ken Black's text equips readers with the quantitative decision-making skills and analysis techniques they need to make smart decisions based on real-world data.

IBM SPSS Modeler Cookbook

"IBM SPSS Modeler Cookbook" is your practical guide to mastering data mining with IBM SPSS Modeler. This comprehensive book takes you beyond the basics, offering expert insights, time-saving techniques, and powerful workflows to grow your skills and elevate your analytical abilities. You will learn to apply the CRISP-DM methodology, efficiently prepare and explore data, build advanced models, and confidently incorporate analytical results into your business decisions.

What this book will help you do:

- Effectively apply the CRISP-DM standard process to organize your data mining projects.
- Leverage efficient techniques for data extraction, transformation, and preparation.
- Develop and evaluate predictive models for practical applications in your organization.
- Enhance your models by utilizing advanced features and expert tips.
- Automate and streamline your data mining process with scripting for ultimate control.

Authors: Keith McCormick and Dean Abbott are seasoned data mining professionals with deep expertise in IBM SPSS Modeler and predictive analytics. Together, they have extensive experience in consulting, training, and applying advanced analytical techniques across industries. Through their approachable and insightful writing style, they share practical knowledge and expert workflows to empower readers.

Who is it for? This book is designed for individuals who have basic experience with IBM SPSS Modeler and aspire to deepen their expertise. Whether you are a data analyst looking to advance your analytical capabilities or a professional aiming to integrate data-driven solutions into your organization, this book provides the knowledge and practical guidance you need to take the next step in your data mining journey.

Introduction to Statistical Process Control

A major tool for quality control and management, statistical process control (SPC) monitors sequential processes, such as production lines and Internet traffic, to ensure that they work stably and satisfactorily. Along with covering traditional methods, Introduction to Statistical Process Control describes many recent SPC methods that improve upon the more established techniques. The author—a leading researcher on SPC—shows how these methods can handle new applications. After exploring the role of SPC and other statistical methods in quality control and management, the book covers basic statistical concepts and methods useful in SPC. It then systematically describes traditional SPC charts, including the Shewhart, CUSUM, and EWMA charts, as well as recent control charts based on change-point detection and fundamental multivariate SPC charts under the normality assumption. The text also introduces novel univariate and multivariate control charts for cases when the normality assumption is invalid and discusses control charts for profile monitoring. All computations in the examples are solved using R, with R functions and datasets available for download on the author’s website. Offering a systematic description of both traditional and newer SPC methods, this book is ideal as a primary textbook for a one-semester course in disciplines concerned with process quality control, such as statistics, industrial and systems engineering, and management sciences. It can also be used as a supplemental textbook for courses on quality improvement and system management. In addition, the book provides researchers with many useful, recent research results on SPC and gives quality control practitioners helpful guidelines on implementing up-to-date SPC techniques.
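Of the charts named in the description above, the EWMA chart is compact enough to sketch in a few lines. The book's examples use R; this is an illustrative Python version with invented data, using the standard time-varying control limits.

```python
def ewma_chart(observations, mu0, sigma, lam=0.2, L=3.0):
    """EWMA control chart: z_i = lam*x_i + (1-lam)*z_{i-1}, started at mu0.
    Signals when z_i leaves mu0 +/- L*sigma*sqrt(lam/(2-lam)*(1-(1-lam)^(2i)))."""
    z = mu0
    flags = []
    for i, x in enumerate(observations, start=1):
        z = lam * x + (1 - lam) * z
        half_width = L * sigma * ((lam / (2 - lam)) * (1 - (1 - lam) ** (2 * i))) ** 0.5
        flags.append(abs(z - mu0) > half_width)
    return flags

# Five in-control readings followed by a sustained 1.5-sigma upward shift.
data = [0.1, -0.2, 0.0, 0.3, -0.1] + [1.5] * 10
flags = ewma_chart(data, mu0=0.0, sigma=1.0)
# The chart stays quiet on the in-control stretch; the smoothed statistic
# then drifts upward and eventually crosses the control limit.
```

The smoothing parameter `lam` trades off sensitivity: small values detect small sustained shifts sooner, while `lam = 1` reduces the chart to a Shewhart chart on individual observations.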

Risk Scoring for a Loan Application on IBM System z: Running IBM SPSS Real-Time Analytics

When designing a solution that involves analytics, the mainframe might not be the first platform that comes to mind. However, the IBM® System z® group has developed some innovative solutions that include the well-respected mainframe benefits. This book describes a workshop that demonstrates the use of real-time advanced analytics for enhancing core banking decisions using a loan origination example. The workshop is a live hands-on experience of the entire process from analytics modeling to deployment of real-time scoring services for use on IBM z/OS®. In this IBM Redbooks® publication, we include a facilitator guide chapter as well as a participant guide chapter. The facilitator guide includes information about the preparation, such as the needed material, resources, and steps to set up and run this workshop. The participant guide shows step-by-step the tasks for a successful learning experience. The goal of the first hands-on exercise is to learn how to use IBM SPSS® Modeler for analytics modeling. This provides the basis for the next exercise, "Configuring risk assessment in SPSS Decision Management". In the third exercise, the participant experiences how real-time scoring can be implemented on a System z. This publication is written for consultants, IT architects, and IT administrators who want to become familiar with SPSS and analytics solutions on the System z.

Handbook of Economic Forecasting

The highly prized ability to make financial plans with some certainty about the future comes from the core fields of economics. In recent years the availability of more data, analytical tools of greater precision, and ex post studies of business decisions have increased demand for information about economic forecasting. Volumes 2A and 2B, which follow Nobel laureate Clive Granger's Volume 1 (2006), concentrate on two major subjects. Volume 2A covers innovations in methodologies, specifically macroforecasting and forecasting financial variables. Volume 2B investigates commercial applications, with sections on forecasters' objectives and methodologies. Experts provide surveys of a large range of literature scattered across applied and theoretical statistics journals as well as econometrics and empirical economics journals. The Handbook of Economic Forecasting Volumes 2A and 2B provide a unique compilation of chapters giving a coherent overview of forecasting theory and applications in one place and with up-to-date accounts of all major conceptual issues.

- Focuses on innovation in economic forecasting via industry applications
- Presents coherent summaries of subjects in economic forecasting that stretch from methodologies to applications
- Makes details about economic forecasting accessible to scholars in fields outside economics