talk-data.com

Topic: statistics (512 tagged)

Activities

512 activities · Newest first

Handbook in Monte Carlo Simulation: Applications in Financial Engineering, Risk Management, and Economics

An accessible treatment of Monte Carlo methods, techniques, and applications in the field of finance and economics. Providing readers with an in-depth and comprehensive guide, the Handbook in Monte Carlo Simulation: Applications in Financial Engineering, Risk Management, and Economics presents a timely account of the applications of Monte Carlo methods in financial engineering and economics. Written by an international leading expert in the field, the handbook illustrates the challenges confronting present-day financial practitioners and provides various applications of Monte Carlo techniques to answer these issues. The book is organized into five parts: introduction and motivation; input analysis, modeling, and estimation; random variate and sample path generation; output analysis and variance reduction; and applications ranging from option pricing and risk management to optimization. The Handbook in Monte Carlo Simulation features:
- An introductory section for basic material on stochastic modeling and estimation aimed at readers who may need a summary or review of the essentials
- Carefully crafted examples in order to spot potential pitfalls and drawbacks of each approach
- An accessible treatment of advanced topics such as low-discrepancy sequences, stochastic optimization, dynamic programming, risk measures, and Markov chain Monte Carlo methods
- Numerous pieces of R code used to illustrate fundamental ideas in concrete terms and encourage experimentation
The Handbook in Monte Carlo Simulation: Applications in Financial Engineering, Risk Management, and Economics is a complete reference for practitioners in the fields of finance, business, applied statistics, econometrics, and engineering, as well as a supplement for MBA and graduate-level courses on Monte Carlo methods and simulation.
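
As a flavor of the sample-path generation and output-analysis techniques the handbook covers, here is a minimal Monte Carlo estimator for a European call under geometric Brownian motion (a sketch with hypothetical parameters, written in Python rather than the book's R):

```python
import math
import random

def mc_european_call(s0, k, r, sigma, t, n_paths, seed=0):
    """Estimate a European call price by simulating terminal prices
    under geometric Brownian motion and discounting the mean payoff."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)              # standard normal draw
        st = s0 * math.exp(drift + vol * z)  # terminal asset price
        total += max(st - k, 0.0)            # call payoff
    return math.exp(-r * t) * total / n_paths

# With these inputs the Black-Scholes closed form gives about 10.45;
# the Monte Carlo estimate should land close to that.
price = mc_european_call(s0=100, k=100, r=0.05, sigma=0.2, t=1.0, n_paths=200_000)
```

Variance-reduction techniques covered in the handbook (antithetic variates, control variates, low-discrepancy sequences) shrink this estimator's error for the same simulation budget.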

Wavelet Neural Networks: With Applications in Financial Engineering, Chaos, and Classification

A step-by-step introduction to modeling, training, and forecasting using wavelet networks. Wavelet Neural Networks: With Applications in Financial Engineering, Chaos, and Classification presents the statistical model identification framework that is needed to successfully apply wavelet networks, as well as extensive comparisons of alternate methods. Providing a concise and rigorous treatment for constructing optimal wavelet networks, the book links mathematical aspects of wavelet network construction to statistical modeling and forecasting applications in areas such as finance, chaos, and classification. The authors ensure that readers obtain a complete understanding of model identification by providing in-depth coverage of both model selection and variable significance testing. Featuring an accessible approach with introductory coverage of the basic principles of wavelet analysis, Wavelet Neural Networks also includes:
- Methods that can be easily implemented or adapted by researchers, academics, and professionals in identification and modeling for complex nonlinear systems and artificial intelligence
- Multiple examples and thoroughly explained procedures, with numerous applications ranging from financial modeling and financial engineering to time series prediction, construction of confidence and prediction intervals, classification, and chaotic time series prediction
- An extensive introduction to neural networks that begins with regression models and builds to more complex frameworks
- Coverage of both the variable selection algorithm and the model selection algorithm for wavelet networks, in addition to methods for constructing confidence and prediction intervals
Ideal as a textbook for MBA and graduate-level courses in applied neural network modeling, artificial intelligence, advanced data analysis, time series, and forecasting in financial engineering, the book is also useful as a supplement for courses in informatics, identification and modeling for complex nonlinear systems, and computational finance. In addition, the book serves as a valuable reference for researchers and practitioners in the fields of mathematical modeling, engineering, artificial intelligence, decision science, neural networks, and finance and economics.
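
To make the idea concrete: a wavelet network replaces the sigmoid units of a classic neural network with dilated and translated copies of a mother wavelet. The sketch below, with made-up weights and the Mexican-hat wavelet, is illustrative only and is not taken from the book:

```python
import math

def mexican_hat(t):
    """Mexican-hat (Ricker) mother wavelet: psi(t) = (1 - t^2) * exp(-t^2 / 2)."""
    return (1.0 - t * t) * math.exp(-0.5 * t * t)

def wavelet_network(x, weights, translations, dilations, bias=0.0):
    """One-input wavelet network: a weighted sum of dilated/translated wavelets."""
    out = bias
    for w, b, a in zip(weights, translations, dilations):
        out += w * mexican_hat((x - b) / a)
    return out

# A single unit centered at 0 with unit dilation peaks at x = 0, where psi(0) = 1.
y = wavelet_network(0.0, weights=[1.0], translations=[0.0], dilations=[1.0])
```

Training then amounts to choosing the weights, translations, and dilations (and how many units to keep), which is exactly the model-selection problem the book addresses.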

Design, Evaluation, and Analysis of Questionnaires for Survey Research, 2nd Edition

Design, Evaluation, and Analysis of Questionnaires for Survey Research, Second Edition explores updates to the statistical knowledge behind survey questionnaire development, including analysis of the important decisions researchers make throughout the survey design process. The new edition provides coverage of an updated SQP program, which has an expanded question database drawn from the Multitrait-Multimethod (MTMM) experiments. This book aims to give students and survey researchers a state-of-the-art introduction to questionnaire design and to how to construct questionnaires with the highest relevance and accuracy. The pitfalls of questionnaire design are outlined throughout the book, alerting designers to the many prior decisions that will affect the quality of the research outcome. Measuring the quality of questions at the outset is important so that students and researchers can consider the consequences and the methods for achieving reliable and effective questions.

Repeated Measurements and Cross-Over Designs

An introduction to state-of-the-art experimental design approaches to better understand and interpret repeated measurement data in cross-over designs. Repeated Measurements and Cross-Over Designs:
- Features the close tie between the design, analysis, and presentation of results
- Presents principles and rules that apply very generally to most areas of research, such as clinical trials, agricultural investigations, industrial procedures, quality control procedures, and epidemiological studies
- Includes many practical examples, such as PK/PD studies in the pharmaceutical industry, k-sample and one-sample repeated measurement designs for psychological studies, and residual effects of different treatments in controlling conditions such as asthma, blood pressure, and diabetes
- Utilizes SAS(R) software to draw necessary inferences; all SAS output and data sets are available via the book's related website
This book is ideal for a broad audience, including statisticians in pre-clinical research and researchers in psychology, sociology, politics, marketing, and engineering.

Statistics: Principles and Methods, 7th Edition

Johnson/Bhattacharyya is unique in its clarity of exposition while maintaining the mathematical correctness of its explanations. Many other books that claim to be easier to understand often sacrifice mathematical rigor. In contrast, Johnson/Bhattacharyya maintain a focus on accuracy without getting bogged down in unnecessary details.

Displaying Time Series, Spatial, and Space-Time Data with R

Code and Methods for Creating High-Quality Data Graphics. A data graphic is not only a static image; it also tells a story about the data. It activates cognitive processes that are able to detect patterns and discover information not readily available in the raw data. This is particularly true for time series, spatial, and space-time datasets.

Statistical Analysis: Microsoft® Excel® 2013

Use Excel 2013’s statistical tools to transform your data into knowledge. Conrad Carlberg shows how to use Excel 2013 to perform core statistical tasks every business professional, student, and researcher should master. Using real-world examples, Carlberg helps you choose the right technique for each problem and get the most out of Excel’s statistical features, including recently introduced consistency functions. Along the way, he clarifies confusing statistical terminology and helps you avoid common mistakes. You’ll learn how to use correlation and regression, analyze variance and covariance, and test statistical hypotheses using the normal, binomial, t, and F distributions. To help you make accurate inferences based on samples from a population, this edition adds two more chapters on inferential statistics, covering crucial topics ranging from experimental design to the statistical power of F tests. Becoming an expert with Excel statistics has never been easier! You’ll find crystal-clear instructions, insider insights, and complete step-by-step projects, all complemented by extensive web-based resources.
- Master Excel’s most useful descriptive and inferential statistical tools
- Tell the truth with statistics, and recognize when others don’t
- Accurately summarize sets of values
- Infer a population’s characteristics from a sample’s frequency distribution
- Explore correlation and regression to learn how variables move in tandem
- Use Excel consistency functions such as STDEV.S() and STDEV.P()
- Test differences between two means using z tests, t tests, and Excel’s Data Analysis Add-in
- Use ANOVA to test differences between more than two means
- Explore statistical power by manipulating mean differences, standard errors, directionality, and alpha
- Take advantage of Recommended PivotTables, Quick Analysis, and other Excel 2013 shortcuts
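
The distinction behind Excel's STDEV.S() and STDEV.P() consistency functions is the divisor: the sample version divides squared deviations by n - 1, the population version by n. Python's standard library mirrors this, which makes the difference easy to check:

```python
import math
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # sum of squared deviations from the mean (5) is 32

pop_sd = statistics.pstdev(data)    # like STDEV.P(): sqrt(32 / 8) = 2.0
sample_sd = statistics.stdev(data)  # like STDEV.S(): sqrt(32 / 7), about 2.138
```

The sample version is slightly larger because dividing by n - 1 corrects for estimating the mean from the same data.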

Economic and Business Forecasting: Analyzing and Interpreting Econometric Results

Discover the secrets to applying simple econometric techniques to improve forecasting. Equipping analysts, practitioners, and graduate students with a statistical framework to make effective decisions based on the application of simple economic and statistical methods, Economic and Business Forecasting offers a comprehensive and practical approach to quantifying and accurately forecasting key variables. Using simple econometric techniques, author John E. Silvia focuses on a select set of major economic and financial variables, revealing how to optimally use statistical software as a template to apply to your own variables of interest. The book:
- Presents the economic and financial variables that offer unique insights into economic performance
- Highlights the econometric techniques that can be used to characterize variables
- Explores the application of SAS software, complete with simple explanations of SAS code and output
- Identifies key econometric issues with practical solutions to those problems
Presenting the "ten commandments" for economic and business forecasting, this book provides you with a practical forecasting framework you can use for important everyday business applications.
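
As a taste of the kind of simple technique the book has in mind (this sketch is mine, not the author's SAS code), an ordinary least-squares linear trend can be fitted with closed-form formulas and extrapolated one period ahead:

```python
def fit_linear_trend(y):
    """Fit y_t = a + b*t by ordinary least squares over t = 0..n-1."""
    n = len(y)
    t_mean = (n - 1) / 2.0
    y_mean = sum(y) / n
    sxy = sum((t - t_mean) * (yt - y_mean) for t, yt in enumerate(y))
    sxx = sum((t - t_mean) ** 2 for t in range(n))
    b = sxy / sxx
    a = y_mean - b * t_mean
    return a, b

def forecast_next(y):
    """One-step-ahead forecast from the fitted trend line."""
    a, b = fit_linear_trend(y)
    return a + b * len(y)

# Data lying exactly on y = 2 + 3t is forecast perfectly: the next value is 17.
series = [2, 5, 8, 11, 14]
prediction = forecast_next(series)
```

Real series are noisy, of course; the book's point is that even such simple fits, interpreted carefully, carry a long way in business forecasting.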

Statistical Hypothesis Testing with SAS and R

A comprehensive guide to statistical hypothesis testing with examples in SAS and R. When analyzing datasets, the following questions often arise: Is there a shorthand procedure for a statistical test available in SAS or R? If so, how do I use it? If not, how do I program the test myself? This book answers these questions and provides an overview of the most common statistical test problems in a comprehensive way, making it easy to find and perform an appropriate statistical test. A general summary of statistical test theory is presented, along with a basic description of each test, including the necessary prerequisites, assumptions, the formal test problem, and the test statistic. Examples in both SAS and R are provided, along with program code to perform the test, the resulting output, and remarks explaining the necessary program parameters. Key features:
- Provides examples in both SAS and R for each test presented
- Looks at the most common statistical tests, displayed in a clear and easy-to-follow way
- Supported by a supplementary website (http://www.d-taeger.de) featuring example program code
Academics, practitioners, and SAS and R programmers will find this book a valuable resource. Students using SAS and R will also find it an excellent choice for reference and data analysis.
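
The flavor of "programming the test myself" fits in a few lines; the book does this in SAS and R, but a one-sample t statistic is the same arithmetic in any language (Python here, purely for illustration):

```python
import math

def one_sample_t(data, mu0):
    """t = (xbar - mu0) / (s / sqrt(n)), with s the sample standard deviation."""
    n = len(data)
    xbar = sum(data) / n
    s2 = sum((x - xbar) ** 2 for x in data) / (n - 1)
    se = math.sqrt(s2 / n)
    return (xbar - mu0) / se

# Sample mean is 3; testing H0: mu = 2 gives t = 1 / (sqrt(2.5)/sqrt(5)) = sqrt(2).
t_stat = one_sample_t([1, 2, 3, 4, 5], mu0=2)
```

The remaining step, comparing t against the appropriate t distribution quantile, is exactly where the prerequisites and assumptions catalogued in the book matter.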

Statistics in Action

Commissioned by the Statistical Society of Canada (SSC), this volume helps both general readers and users of statistics better appreciate the scope and importance of statistics. It presents the ways in which statistics is used while highlighting key contributions that Canadian statisticians are making to science, technology, business, government, and other areas. The book emphasizes the role and impact of computing in statistical modeling and analysis, including the issues involved with the huge amounts of data being generated by automated processes.

Examples and Problems in Mathematical Statistics

Provides the necessary skills to solve problems in mathematical statistics through theory, concrete examples, and exercises. With a clear and detailed approach to the fundamentals of statistical theory, Examples and Problems in Mathematical Statistics uniquely bridges the gap between theory and application and presents numerous problem-solving examples that illustrate the related notations and proven results. Written by an established authority in probability and mathematical statistics, each chapter begins with a theoretical presentation to introduce both the topic and the important results in an effort to aid in overall comprehension. Examples are then provided, followed by problems, and finally, solutions to some of the earlier problems. In addition, Examples and Problems in Mathematical Statistics features:
- Over 160 practical and interesting real-world examples from a variety of fields, including engineering, mathematics, and statistics, to help readers become proficient in theoretical problem solving
- More than 430 unique exercises with select solutions
- Key statistical inference topics, such as probability theory, statistical distributions, sufficient statistics, information in samples, testing statistical hypotheses, statistical estimation, confidence and tolerance intervals, large sample theory, and Bayesian analysis
Recommended for graduate-level courses in probability and statistical inference, Examples and Problems in Mathematical Statistics is also an ideal reference for applied statisticians and researchers.
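
As one worked example in the spirit of the confidence-interval material listed above (a sketch of mine assuming the large-sample normal approximation with z = 1.96, not the exact t-based interval):

```python
import math

def normal_ci(data, z=1.96):
    """Approximate 95% confidence interval for the mean: xbar +/- z * s / sqrt(n)."""
    n = len(data)
    xbar = sum(data) / n
    s = math.sqrt(sum((x - xbar) ** 2 for x in data) / (n - 1))
    half = z * s / math.sqrt(n)
    return xbar - half, xbar + half

# Mean 6, sample sd sqrt(2.5), n = 5: the interval is roughly (4.61, 7.39).
lo, hi = normal_ci([4, 5, 6, 7, 8])
```

For small n the proper interval uses a t quantile rather than 1.96, which widens it; that correction is the kind of detail the book's problems drill.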

Information Evaluation

During the reception of a piece of information, we are never passive. Depending on its origin and content, and on our personal beliefs and convictions, we bestow upon this piece of information, spontaneously or after reflection, a certain amount of confidence. Too much confidence shows a degree of naivety, whereas an absolute lack of it condemns us as being paranoid. These two attitudes are symmetrically detrimental, not only to the proper perception of this information but also to its use. Beyond these two extremes, each person generally adopts an intermediate position when faced with the reception of information, depending on its provenance and credibility. We still need to understand and explain how these judgements are formed, in what context, and to what end. Spanning the approaches offered by philosophy, military intelligence, algorithmics, and information science, this book presents the concepts of information and the confidence placed in it, the methods that militaries, the first to be aware of the need, have or should have adopted, the tools that help them, and the prospects that these have opened up. Beyond the military context, the book reveals ways to evaluate information for the good of other fields such as economic intelligence and, more globally, the informational monitoring by governments and businesses.
Contents:
1. Information: Philosophical Analysis and Strategic Applications, Mouhamadou El Hady Ba and Philippe Capet
2. Epistemic Trust, Gloria Origgi
3. The Fundamentals of Intelligence, Philippe Lemercier
4. Information Evaluation in the Military Domain: Doctrines, Practices and Shortcomings, Philippe Capet and Adrien Revault d'Allonnes
5. Multidimensional Approach to Reliability Evaluation of Information Sources, Frédéric Pichon, Christophe Labreuche, Bertrand Duqueroie and Thomas Delavallade
6. Uncertainty of an Event and its Markers in Natural Language Processing, Mouhamadou El Hady Ba, Stéphanie Brizard, Tanneguy Dulong and Bénédicte Goujon
7. Quantitative Information Evaluation: Modeling and Experimental Evaluation, Marie-Jeanne Lesot, Frédéric Pichon and Thomas Delavallade
8. When Reported Information Is Second Hand, Laurence Cholvy
9. An Architecture for the Evolution of Trust: Definition and Impact of the Necessary Dimensions of Opinion Making, Adrien Revault d'Allonnes
About the authors: Philippe Capet is a project manager and research engineer at Ektimo, working mainly on information management and control in military contexts. Thomas Delavallade is an advanced studies engineer at Thales Communications & Security, working on social media mining in the context of crisis management, cybersecurity and the fight against cybercrime.

Methods and Applications of Statistics in Clinical Trials, Volume 1: Concepts, Principles, Trials, and Designs

A complete guide to the key statistical concepts essential for the design and construction of clinical trials. As the newest major resource in the field of medical research, Methods and Applications of Statistics in Clinical Trials, Volume 1: Concepts, Principles, Trials, and Designs presents a timely and authoritative review of the central statistical concepts used to build clinical trials that obtain the best results. The reference unveils modern approaches vital to understanding, creating, and evaluating data obtained throughout the various stages of clinical trial design and analysis. Accessible and comprehensive, the first volume in a two-part set includes newly written articles as well as established literature from the Wiley Encyclopedia of Clinical Trials. Illustrating a variety of statistical concepts and principles such as longitudinal data, missing data, covariates, biased-coin randomization, repeated measurements, and simple randomization, the book also provides in-depth coverage of the various trial designs found within phase I-IV trials. The volume also features:
- Detailed chapters on trial designs, such as adaptive, crossover, group-randomized, multicenter, non-inferiority, non-randomized, open-labeled, preference, prevention, and superiority trials
- Over 100 contributions from leading academics, researchers, and practitioners
- An exploration of ongoing, cutting-edge clinical trials on early cancer and heart disease, mother-to-child human immunodeficiency virus transmission trials, and the AIDS Clinical Trials Group
The book is an excellent reference for researchers, practitioners, and students in the fields of clinical trials, pharmaceutics, biostatistics, medical research design, biology, biomedicine, epidemiology, and public health.
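
One of the concepts listed, biased-coin randomization, is easy to illustrate: Efron's design allocates each new subject to whichever arm is currently under-represented with probability 2/3, keeping groups balanced without being deterministic. A minimal sketch (my own, not code from the book):

```python
import random

def efron_biased_coin(n, p=2/3, seed=0):
    """Allocate n subjects to arms 'A'/'B', favoring the lagging arm with prob p."""
    rng = random.Random(seed)
    counts = {"A": 0, "B": 0}
    allocation = []
    for _ in range(n):
        if counts["A"] == counts["B"]:
            arm = rng.choice("AB")  # perfectly balanced: toss a fair coin
        else:
            lagging = min(counts, key=counts.get)
            leading = "B" if lagging == "A" else "A"
            arm = lagging if rng.random() < p else leading
        counts[arm] += 1
        allocation.append(arm)
    return allocation, counts

alloc, counts = efron_biased_coin(100)
```

Unlike simple randomization, whose imbalance grows like the square root of n, the biased coin keeps the two arm sizes within a few subjects of each other.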

Statistical Analysis in Forensic Science: Evidential Values of Multivariate Physicochemical Data

A practical guide for determining the evidential value of physicochemical data. Microtraces of various materials (e.g. glass, paint, fibres, and petroleum products) are routinely subjected to physicochemical examination by forensic experts, whose role is to evaluate such physicochemical data in the context of the prosecution and defence propositions. Such examinations return various kinds of information, including quantitative data. From the forensic point of view, the most suitable way to evaluate evidence is the likelihood ratio. This book provides a collection of recent approaches to the determination of likelihood ratios and describes suitable software, with documentation and examples of their use in practice. The statistical computing and graphics software environment R, pre-computed Bayesian networks using Hugin Researcher, and a new package, calcuLatoR, for the computation of likelihood ratios are all explored. Statistical Analysis in Forensic Science will provide an invaluable practical guide for forensic experts and practitioners, forensic statisticians, analytical chemists, and chemometricians. Key features include:
- Description of the physicochemical analysis of forensic trace evidence
- Detailed description of likelihood ratio models for determining the evidential value of multivariate physicochemical data
- Detailed description of methods, such as empirical cross-entropy plots, for assessing the performance of likelihood ratio-based methods for evidence evaluation
- Routines written using the open-source R software, as well as Hugin Researcher and calcuLatoR
- Practical examples and recommendations for the use of all these methods in practice
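
The likelihood-ratio idea can be sketched numerically: the same measurement x is scored under the prosecution model (same source) and the defence model (random background), and the ratio of the two densities is the evidential value. A toy univariate version with hypothetical normal models (the book treats the realistic multivariate case, in R):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def likelihood_ratio(x, mu_p, sigma_p, mu_d, sigma_d):
    """LR = f(x | prosecution) / f(x | defence); LR > 1 supports the prosecution."""
    return normal_pdf(x, mu_p, sigma_p) / normal_pdf(x, mu_d, sigma_d)

# A measurement near the suspect's mean, against a broader background: LR ≈ 1.37.
lr = likelihood_ratio(1.0, mu_p=0.0, sigma_p=1.0, mu_d=0.0, sigma_d=2.0)
```

An LR modestly above 1, as here, is weak support; forensic practice reports the magnitude rather than a binary verdict, which is why calibration checks such as empirical cross-entropy plots matter.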

Forecasting Offertory Revenue at St. Elizabeth Seton Catholic Church

This new business analytics case study challenges readers to forecast donations, plan budgets, and manage cash flow for a religious institution suffering from rapidly falling contributions. Crystallizing realistic analytical challenges faced by non-profit and for-profit organizations of all kinds, it exposes readers to the entire decision-making process, providing opportunities to perform analyses, interpret output, and recommend the best course of action. Author: Matthew J. Drake, Duquesne University.

Forecasting Sales at Ska Brewing Company

This new business analytics case study challenges readers to project trends and plan capacity for a fast-growing craft beer operation, so it can make the best possible decisions about expensive investments in brewing capacity. Crystallizing realistic analytical challenges faced by companies in many industries and markets, it exposes readers to the entire decision-making process, providing opportunities to perform analyses, interpret output, and recommend the best course of action. Author: Eric Huggins, Fort Lewis College.

Statistics for Mining Engineering

Many areas of mining engineering gather and use statistical information, provided by observing the actual operation of equipment and their systems, the development of mining works, the surface subsidence that accompanies underground mining, and the displacement of rocks surrounding surface pits and underground drives and longwalls, amongst others. In addition, modern machines used in surface mining are equipped with diagnostic systems that automatically trace all important machine parameters and send this information to the producer's main computer. Such data not only provide information on the technical properties of the machine, but they also have a statistical character. Furthermore, all information gathered during test-stand and laboratory investigations, where parts, assemblies and whole devices are tested in order to prove their usefulness, has a stochastic character. All of these materials need to be analyzed statistically and, more importantly, based on these results mining engineers must decide whether to undertake actions connected with the further operation of the machines, the further development of the works, and so on. For these reasons, knowledge of modern statistics is necessary for mining engineers: not only how statistical analysis of data should be conducted and statistical synthesis should be done, but also how to understand the results obtained and how to use them to make appropriate decisions in relation to the mining operation. This book on statistical analysis and synthesis starts with a short review of probability theory and also includes a special section on statistical prediction. The text is illustrated with many examples taken from mining practice; moreover, the tables required to conduct statistical inference are included.

Growth Curve Modeling: Theory and Applications

Features recent trends and advances in the theory and techniques used to accurately measure and model growth. Growth Curve Modeling: Theory and Applications features an accessible introduction to growth curve modeling and addresses how to monitor the change in variables over time, since there is no "one size fits all" approach to growth measurement. A review of the requisite mathematics for growth modeling and the statistical techniques needed for estimating growth models is provided, and an overview of popular growth curves, such as linear, logarithmic, reciprocal, logistic, Gompertz, Weibull, negative exponential, and log-logistic, among others, is included. In addition, the book discusses key application areas, including economic, plant, population, forest, and firm growth, and is suitable as a resource for assessing recent growth modeling trends in the medical field. SAS is utilized throughout to analyze and model growth curves, aiding readers in estimating specialized growth rates and curves. Including derivations of virtually all of the major growth curves and models, Growth Curve Modeling: Theory and Applications also features:
- Statistical distribution analysis as it pertains to growth modeling
- Trend estimations
- Dynamic site equations obtained from growth models
- Nonlinear regression
- Yield-density curves
- Nonlinear mixed effects models for repeated measurements data
The book is an excellent resource for statisticians, public health analysts, biologists, botanists, economists, and demographers who require a modern review of statistical methods for modeling growth curves and analyzing longitudinal data. It is also useful for upper-undergraduate and graduate courses on growth modeling.
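
Of the growth curves listed, the logistic is the workhorse: the curve sits at half its carrying capacity at the inflection time t0 and saturates beyond it. A minimal evaluation sketch (illustrative only, not the book's SAS code, with made-up parameters):

```python
import math

def logistic_growth(t, capacity, rate, t0):
    """Logistic growth curve: y(t) = capacity / (1 + exp(-rate * (t - t0)))."""
    return capacity / (1.0 + math.exp(-rate * (t - t0)))

# At t = t0 the curve is at exactly half of capacity; long after t0 it saturates.
half = logistic_growth(5.0, capacity=100.0, rate=0.8, t0=5.0)
late = logistic_growth(30.0, capacity=100.0, rate=0.8, t0=5.0)
```

Fitting capacity, rate, and t0 to observed data is a nonlinear regression problem, which is exactly where the book's estimation chapters come in.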

Nonlinear Option Pricing

New Tools to Solve Your Option Pricing Problems. For nonlinear PDEs encountered in quantitative finance, advanced probabilistic methods are needed to address dimensionality issues. Written by two leaders in quantitative research, including Risk magazine’s 2013 Quant of the Year, Nonlinear Option Pricing compares various numerical methods for solving high-dimensional nonlinear problems arising in option pricing. Designed for practitioners, it is the first authored book to discuss nonlinear Black-Scholes PDEs and compare the efficiency of many different methods. Real-world solutions for quantitative analysts: the book helps quants develop both their analytical and numerical expertise. It focuses on general mathematical tools rather than specific financial questions, so that readers can easily use the tools to solve their own nonlinear problems. The authors build intuition through numerous real-world examples of numerical implementation. Although the focus is on ideas and numerical examples, the authors introduce relevant mathematical notions and important results and proofs. The book also covers several original approaches, including regression methods and dual methods for pricing chooser options; Monte Carlo approaches for pricing in the uncertain volatility model and the uncertain lapse and mortality model; the Markovian projection method and the particle method for calibrating local stochastic volatility models to market prices of vanilla options with or without stochastic interest rates; the a + bλ technique for building local correlation models that calibrate to market prices of vanilla options on a basket; and a new stochastic representation of nonlinear PDE solutions based on marked branching diffusions.

Understanding Uncertainty, Revised Edition

Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." —Journal of Applied Statistics The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made. Featuring new material, the Revised Edition remains the go-to guide for uncertainty and decision making, providing further applications at an accessible level, including:
- A critical study of transitivity, a basic concept in probability
- A discussion of how the failure of the financial sector to use the proper approach to uncertainty may have contributed to the recent recession
- A consideration of betting, showing that a bookmaker's odds are not expressions of probability
- Applications of the book's thesis to statistics
- A demonstration that some techniques currently popular in statistics, like significance tests, may be unsound, even seriously misleading, because they violate the rules of probability
Understanding Uncertainty, Revised Edition is ideal for students studying probability or statistics and for anyone interested in one of the most fascinating and vibrant fields of study in contemporary science and mathematics.
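
The study of transitivity mentioned above can be made concrete with Efron's nontransitive dice: each die beats the next with probability 2/3, so "is the better die" is not a transitive relation. A short enumeration (an illustrative standard example, not drawn from the book):

```python
from itertools import product

def prob_beats(die1, die2):
    """Probability that a roll of die1 strictly exceeds a roll of die2."""
    wins = sum(1 for a, b in product(die1, die2) if a > b)
    return wins / (len(die1) * len(die2))

# Efron's dice: A beats B, B beats C, C beats D, and yet D beats A, each with
# probability 2/3, so no die is "best" overall.
A = [4, 4, 4, 4, 0, 0]
B = [3, 3, 3, 3, 3, 3]
C = [6, 6, 2, 2, 2, 2]
D = [5, 5, 5, 1, 1, 1]
```

The cycle is exactly the kind of everyday trap the book argues the rules of probability let us reason about sensibly.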