talk-data.com

Topic: data (2093 tagged)

Activity trend: peak 3/qtr, 2020-Q1 to 2026-Q1

Activities

Showing filtered results

Filtering by: O'Reilly Data Science Books

Bayesian Analysis of Stochastic Process Models

Bayesian analysis of complex models based on stochastic processes has in recent years become a growing area. This book provides a unified treatment of Bayesian analysis of models based on stochastic processes, covering the main classes of stochastic processes and addressing modelling, computation, inference, forecasting, and decision making, together with important applied models.

Key features:
- Explores Bayesian analysis of models based on stochastic processes, providing a unified treatment.
- Provides a thorough introduction for research students.
- Illustrates computational tools for dealing with complex problems, along with real-life case studies.
- Looks at inference, prediction and decision making.

Researchers, graduate and advanced undergraduate students interested in stochastic processes in fields such as statistics, operations research (OR), engineering, finance, economics, computer science and Bayesian analysis will benefit from reading this book. With numerous applications included, practitioners of OR, stochastic modelling and applied statistics will also find this book useful.
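The flavour of Bayesian inference for stochastic processes can be conveyed with the simplest conjugate case: estimating the rate of a homogeneous Poisson process. This is a minimal sketch of my own (not an example from the book, and the counts and prior are assumed values), in Python for illustration:

```python
# Conjugate Bayesian update for the rate of a homogeneous Poisson process.
# With a Gamma(a, b) prior on the rate and counts n_i observed over
# exposure times t_i, the posterior is Gamma(a + sum(n_i), b + sum(t_i)).

def poisson_rate_posterior(counts, exposures, a=1.0, b=1.0):
    """Return (shape, rate) of the Gamma posterior for the event rate."""
    shape = a + sum(counts)
    rate = b + sum(exposures)
    return shape, rate

# Event counts observed over intervals of the given lengths (hours).
shape, rate = poisson_rate_posterior(counts=[3, 5, 4], exposures=[2.0, 3.0, 2.5])
posterior_mean = shape / rate  # point estimate of the rate in events/hour
```

The posterior mean shrinks the raw estimate (12 events over 7.5 hours) toward the prior, with the prior's influence fading as exposure accumulates.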

Modelling Under Risk and Uncertainty: An Introduction to Statistical, Phenomenological and Computational Methods

Modelling has permeated virtually all areas of industrial, environmental, economic, bio-medical and civil engineering, yet the use of models for decision-making raises a number of issues to which this book is dedicated:
- How uncertain is my model? Is it truly valuable for supporting decision-making?
- What kind of decision can genuinely be supported, and how can I handle residual uncertainty?
- How refined should the mathematical description be, given the true data limitations?
- Could the uncertainty be reduced through more data, increased modelling investment or computational budget? Should it be reduced now or later?
- How robust are the analysis and the computational methods involved? Should, or could, those methods be more robust?
- Does it make sense to handle uncertainty, risk, lack of knowledge, variability and errors all together?
- How reasonable is the choice of probabilistic modelling for rare events? How rare are the events to be considered?
- How far does it make sense to handle extreme events and elaborate confidence figures?
- Can I take advantage of expert or phenomenological knowledge to tighten the probabilistic figures?
- Are there related domains that could provide models or inspiration for my problem?

Written by a leader at the crossroads of industry, academia and engineering, and based on decades of multi-disciplinary field experience, Modelling Under Risk and Uncertainty gives a self-consistent introduction to the methods involved in any type of modelling development that acknowledges the inevitable uncertainty and associated risks.
It goes beyond the "black-box" view that some analysts, modellers, risk experts or statisticians take of the underlying phenomenology of environmental or industrial processes, without sufficiently valuing their physical properties and inner modelling potential or challenging the practical plausibility of the mathematical hypotheses. Conversely, it also aims to encourage environmental and engineering modellers to handle model-confidence issues better through finer statistical and risk-analysis material that takes advantage of advanced scientific computing, in order to face new regulations departing from deterministic design and to support robust decision-making.

Modelling Under Risk and Uncertainty:
- Addresses a concern of growing interest for large industries, environmentalists and analysts: robust modelling for decision-making in complex systems.
- Gives new insights into the peculiar mathematical and computational challenges generated by recent industrial-safety and environmental-control analysis of rare events.
- Implements decision-theory choices, differentiating or aggregating the dimensions of aleatory risk and epistemic uncertainty through a consistent multi-disciplinary set of statistical estimation, physical modelling, robust computation and risk analysis.
- Provides an original review of advanced inverse probabilistic approaches for model identification, calibration and data assimilation, key to digesting fast-growing multi-physical data acquisition.
- Is illustrated with one favourite pedagogical example crossing natural risk, engineering and economics, developed throughout the book to facilitate reading and understanding.
- Supports Master/PhD-level courses as well as advanced tutorials for professional training.

Analysts and researchers in numerical modelling, applied statistics, scientific computing, reliability, advanced engineering, natural risk or environmental science will benefit from this book.

Fundamentals of Predictive Analytics with JMP

Fundamentals of Predictive Analytics with JMP bridges the gap between courses on basic statistics, which focus on univariate and bivariate analysis, and courses on data mining/predictive analytics. This book provides the technical knowledge and problem-solving skills needed to perform multivariate analysis on real data. Utilizing JMP 10 and JMP Pro, this book offers new and enhanced resources, including an add-in to Microsoft Excel, Graph Builder, and data mining capabilities.

Written for students in undergraduate and graduate statistics courses, this book first teaches students to recognize when it is appropriate to use the tool, to understand what variables and data are required, and to know what the results might be. Second, it teaches them how to interpret the results, followed by step-by-step instructions on how and where to perform and evaluate the analysis in JMP.

With the new emphasis on business intelligence, business analytics and predictive analytics, this book is invaluable to everyone who needs to expand their knowledge of statistics and apply real problem-solving analysis.

This book is part of the SAS Press program.

IBM Cognos TM1 Developer's Certification Guide

The IBM Cognos TM1 Developer's Certification Guide is your hands-on resource for preparing for and passing the COG-310 certification exam. This book offers a practical, example-driven approach to mastering the core concepts and tools within IBM Cognos TM1, including dimension construction, scripting with Turbo Integrator, rules writing, and cube design.

What this book will help me do:
- Master the key components and architecture of Cognos TM1 to build efficient financial models.
- Gain proficiency in Turbo Integrator scripting to automate data integration and transformations.
- Learn to create and use dimensions, cubes, and rules effectively within the TM1 environment.
- Understand advanced concepts like drill-through functionality, virtual cubes, and lookup cubes.
- Enhance your data presentation and reporting skills tailored to TM1 solutions.

Author(s): James D. Miller is an experienced educator and IBM Cognos TM1 professional with a strong background in financial and enterprise planning systems. With years of experience in the field, James brings his practical knowledge into his writing, making complex technical topics approachable and clear. He is committed to helping learners achieve their professional certifications and enhance their skill sets.

Who is it for? This book is ideal for beginner to intermediate IBM Cognos TM1 developers who want to gain expertise in the field and obtain the COG-310 certification. If you are interested in enhancing your financial modeling skills and advancing your career, this guide is designed to meet your needs. It suits individuals who want structured, hands-on learning with practical exercises that build project-ready competence. Anyone aiming to prepare independently for the COG-310 certification exam will benefit greatly from this content.

OHSAS 18001 Step by Step: A Practical Guide

An essential guide to OHSAS 18001. We say 'take care' as we wave our loved ones goodbye in the morning, but how often is this message taken into the workplace? In this easy-to-understand and timely pocket guide, Naeem Sadiq examines the workplace as it gears up to meet OHSAS 18001 standards of occupational health and safety.

Real-world scenarios: Using a wide variety of fictional 'real-world' scenarios, Sadiq demonstrates the hazards that might be present in a workplace, how to assess risk, how to manage OHSAS 18001 implementation and how to communicate that implementation through all levels of management. Sadiq takes the complex, and often impenetrable, concepts that surround health and safety and presents them with absolute precision and clarity.

A sound understanding of OHSAS 18001: OHSAS 18001: Step by Step is more than a primer. Besides giving the reader a sound understanding of OHSAS 18001, the pocket guide can be used as a step-by-step instructional manual for anyone tasked with implementing occupational health and safety standards in the workplace.
This pocket guide gives its readers:
- A comprehensive explanation of OHSAS 18001 and its implications
- An understanding of how OHSAS 18001 can be implemented through the PDCI (Plan-Do-Check-Improve) management principle
- A 'how-to' guide for establishing an Occupational Health and Safety (OH&S) Policy
- A 'how-to' guide for identifying risks and controls within the organisation
- An understanding of the law: the legislative and contractual OH&S requirements to which an organisation subscribes
- An explanation of how OH&S objectives can be determined and established, and how to apportion responsibility and accountability throughout the organisation
- A clear understanding of OH&S accountability and the need for diligent record-keeping
- A 'how-to' guide for setting up a training, competence and awareness regime
- An understanding of how OHSAS 18001 protects not just colleagues, but customers and contractors who enter your workplace
- Expert guidance on how to deal with emergencies

Buy this pocket guide and protect your workforce with OHSAS 18001!

Statistical Thinking: Improving Business Performance, Second Edition

How statistical thinking and methodology can help you make crucial business decisions. Straightforward and insightful, Statistical Thinking: Improving Business Performance, Second Edition, prepares you for business leadership by developing your capacity to apply statistical thinking to improve business processes. Unique and compelling, this book shows you how to derive actionable conclusions from data analysis, solve real problems, and improve real processes. Here, you'll discover how to implement statistical thinking and methodology in your work to improve business performance. This book:
- Explores why statistical thinking is necessary and helpful
- Provides case studies that illustrate how to integrate several statistical tools into the decision-making process
- Facilitates and encourages an experiential learning environment to enable you to apply material to actual problems

With an in-depth discussion of JMP® software, the new edition of this important book focuses on skills to improve business processes, including collecting data appropriate for a specified purpose, recognizing limitations in existing data, and understanding the limitations of statistical analyses.

Taming The Big Data Tidal Wave: Finding Opportunities in Huge Data Streams with Advanced Analytics

You receive an e-mail. It contains an offer for a complete personal computer system. It seems like the retailer read your mind since you were exploring computers on their web site just a few hours prior.... As you drive to the store to buy the computer bundle, you get an offer for a discounted coffee from the coffee shop you are getting ready to drive past. It says that since you're in the area, you can get 10% off if you stop by in the next 20 minutes.... As you drink your coffee, you receive an apology from the manufacturer of a product that you complained about yesterday on your Facebook page, as well as on the company's web site.... Finally, once you get back home, you receive notice of a special armor upgrade available for purchase in your favorite online video game. It is just what is needed to get past some spots you've been struggling with.... Sound crazy? Are these things that can only happen in the distant future? No. All of these scenarios are possible today! Big data. Advanced analytics. Big data analytics. It seems you can't escape such terms today. Everywhere you turn people are discussing, writing about, and promoting big data and advanced analytics. Well, you can now add this book to the discussion. What is real and what is hype? Such attention can lead one to the suspicion that perhaps the analysis of big data is something that is more hype than substance. While there has been a lot of hype over the past few years, the reality is that we are in a transformative era in terms of analytic capabilities and the leveraging of massive amounts of data. If you take the time to cut through the sometimes-over-zealous hype present in the media, you'll find something very real and very powerful underneath it. With big data, the hype is driven by genuine excitement and anticipation of the business and consumer benefits that analyzing it will yield over time. 
Big data is the next wave of new data sources that will drive the next wave of analytic innovation in business, government, and academia. These innovations have the potential to radically change how organizations view their business. The analysis that big data enables will lead to decisions that are more informed and, in some cases, different from what they are today. It will yield insights that many can only dream about today. As you'll see, there are many consistencies with the requirements to tame big data and what has always been needed to tame new data sources. However, the additional scale of big data necessitates utilizing the newest tools, technologies, methods, and processes. The old way of approaching analysis just won't work. It is time to evolve the world of advanced analytics to the next level. That's what this book is about. Taming the Big Data Tidal Wave isn't just the title of this book, but rather an activity that will determine which businesses win and which lose in the next decade. By preparing and taking the initiative, organizations can ride the big data tidal wave to success rather than being pummeled underneath the crushing surf. What do you need to know and how do you prepare in order to start taming big data and generating exciting new analytics from it? Sit back, get comfortable, and prepare to find out!

Introduction to Linear Regression Analysis, 5th Edition

Praise for the Fourth Edition: "As with previous editions, the authors have produced a leading textbook on regression." —Journal of the American Statistical Association

A comprehensive and up-to-date introduction to the fundamentals of regression analysis. Introduction to Linear Regression Analysis, Fifth Edition continues to present both the conventional and less common uses of linear regression in today's cutting-edge scientific research. The authors blend both theory and application to equip readers with an understanding of the basic principles needed to apply regression model-building techniques in various fields of study, including engineering, management, and the health sciences. Following a general introduction to regression modeling, including typical applications, a host of technical tools are outlined, such as basic inference procedures, introductory aspects of model adequacy checking, and polynomial regression models and their variations. The book then discusses how transformations and weighted least squares can be used to resolve problems of model inadequacy, and also how to deal with influential observations.

The Fifth Edition features numerous newly added topics, including:
- A chapter on regression analysis of time series data that presents the Durbin-Watson test and other techniques for detecting autocorrelation, as well as parameter estimation in time series regression models
- Regression models with random effects, in addition to a discussion of subsampling and the importance of the mixed model
- Tests on individual regression coefficients and subsets of coefficients
- Examples of current uses of simple linear regression models and the use of multiple regression models for understanding patient satisfaction data

In addition to Minitab, SAS, and S-PLUS, the authors have incorporated JMP and the freely available R software to illustrate the discussed techniques and procedures in this new edition.
Numerous exercises have been added throughout, allowing readers to test their understanding of the material, and a related FTP site features the presented data sets, extensive problem solutions, software hints, and PowerPoint slides to facilitate instructional use of the book. Introduction to Linear Regression Analysis, Fifth Edition is an excellent book for statistics and engineering courses on regression at the upper-undergraduate and graduate levels. The book also serves as a valuable, robust resource for professionals in the fields of engineering, life and biological sciences, and the social sciences.
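The simple linear regression at the heart of any such course reduces to two closed-form coefficient estimates. A minimal ordinary-least-squares fit, sketched here in Python for illustration (the book's own examples use Minitab, SAS, S-PLUS, JMP, and R, and the data below are made up):

```python
# Ordinary least squares for simple linear regression y = b0 + b1*x:
# slope = Sxy / Sxx, intercept = ybar - slope * xbar.

def ols_fit(x, y):
    """Return (intercept, slope) minimizing the sum of squared residuals."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope

# Data generated from y = 1 + 2x, so OLS recovers those coefficients exactly.
b0, b1 = ols_fit([0, 1, 2, 3], [1, 3, 5, 7])  # → (1.0, 2.0)
```

Everything else in the book (inference, adequacy checking, weighted least squares) builds on diagnostics of the residuals this fit leaves behind.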

Stochastic Modeling and Analysis of Telecoms Networks

This book addresses the stochastic modeling of telecommunication networks, introducing the main mathematical tools for that purpose, such as Markov processes, real and spatial point processes and stochastic recursions, and presenting a wide range of results on the stability, performance and comparison of systems. The authors propose a comprehensive mathematical construction of the foundations of stochastic network theory: Markov chains and continuous-time Markov chains are extensively studied using an original martingale-based approach. A complete presentation of stochastic recursions from an ergodic-theoretical perspective is also provided, as well as spatial point processes. Using these basic tools, stability criteria, performance measures and comparison principles are obtained for a wide class of models, from the canonical M/M/1 and G/G/1 queues to more sophisticated systems, including the current "hot topics" of spatial radio networking, OFDMA and real-time networks.

Contents
1. Introduction.
Part 1: Discrete-time Modeling
2. Stochastic Recursive Sequences.
3. Markov Chains.
4. Stationary Queues.
5. The M/GI/1 Queue.
Part 2: Continuous-time Modeling
6. Poisson Process.
7. Markov Process.
8. Systems with Delay.
9. Loss Systems.
Part 3: Spatial Modeling
10. Spatial Point Processes.
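For a taste of the canonical models covered, the steady-state behaviour of the M/M/1 queue follows from a handful of closed-form expressions. This sketch is my own illustration (not code from the book), with assumed arrival and service rates:

```python
# Steady-state metrics for an M/M/1 queue: Poisson arrivals at rate lam,
# exponential service at rate mu, a single server. Stable only when lam < mu.

def mm1_metrics(lam, mu):
    if lam >= mu:
        raise ValueError("queue is unstable unless lam < mu")
    rho = lam / mu        # server utilization
    L = rho / (1 - rho)   # mean number of customers in the system
    W = 1 / (mu - lam)    # mean time a customer spends in the system
    return rho, L, W

rho, L, W = mm1_metrics(lam=2.0, mu=5.0)
# rho = 0.4; Little's law ties the results together: L = lam * W
```

The same three quantities reappear throughout queueing theory; Little's law (L = lam * W) holds far beyond this simple model.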

Advanced Web Metrics with Google Analytics, 3rd Edition

Get the latest information about using the #1 web analytics tool from this fully updated guide. Google Analytics is the free tool used by millions of web site owners to assess the effectiveness of their efforts. Its revised interface and new features will offer even more ways to increase the value of your web site, and this book will teach you how to use each one to best advantage. Featuring new content based on reader and client requests, the book helps you implement new methods and concepts, track social and mobile visitors, use the new multichannel funnel reporting features, understand which filters to use, and much more. This book:
- Gets you up and running with all the new tools in the revamped Google Analytics, and includes content requested by readers and users, especially for new GA users
- Covers social media analytics features, advanced segmentation displays, multi-dashboard configurations, and using Top 20 reports
- Provides a detailed best-practices implementation guide covering advanced topics, such as how to set up GA to track dynamic web pages, banners, outgoing links, and contact forms
- Includes case studies and demonstrates how to optimize pay-per-click accounts, integrate AdSense, work with new reports and reporting tools, use ad version testing, and more

Make your web site a more effective business tool with the detailed information and advice about Google Analytics in Advanced Web Metrics with Google Analytics, 3rd Edition.

Event History Analysis with R

With an emphasis on social science applications, Event History Analysis with R presents an introduction to survival and event history analysis using real-life examples. Keeping mathematical details to a minimum, the book covers key topics, including both discrete- and continuous-time data, parametric proportional hazards, and accelerated failure times.

Features:
- Introduces parametric proportional hazards models with baseline distributions such as the Weibull, Gompertz, Lognormal, and piecewise constant hazard distributions, in addition to traditional Cox regression
- Presents mathematical details as well as technical material in an appendix
- Includes real examples with applications in demography, econometrics, and epidemiology
- Provides a dedicated R package, eha, containing special treatments, including making cuts in the Lexis diagram, creating communal covariates, and creating period statistics

A much-needed primer, Event History Analysis with R is a didactically excellent resource for students and practitioners of applied event history and survival analysis.
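The survival curves at the core of event history analysis can be illustrated with the Kaplan-Meier product-limit estimator, which handles right-censored observations. This is my own Python sketch with made-up durations (the book itself works in R with the eha package):

```python
# Kaplan-Meier product-limit estimate of the survival function S(t)
# from right-censored data: at each distinct event time, multiply the
# running survival probability by (1 - deaths / number at risk).

def kaplan_meier(times, events):
    """times: observed durations; events: 1 if the event occurred, 0 if censored.
    Returns a list of (t, S(t)) steps at each distinct event time."""
    at_risk = len(times)
    s = 1.0
    curve = []
    data = sorted(zip(times, events))
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        n_at_t = 0
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            n_at_t += 1
            i += 1
        if deaths:
            s *= 1 - deaths / at_risk
            curve.append((t, s))
        at_risk -= n_at_t
    return curve

# Durations 2 and 4 end in events; 3 and 5 are censored (still at risk when lost).
curve = kaplan_meier([2, 3, 4, 5], [1, 0, 1, 0])
# → [(2, 0.75), (4, 0.375)]
```

Censored subjects leave the risk set without dropping the curve, which is exactly what distinguishes survival methods from a naive proportion.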

Logistic Regression Using SAS, 2nd Edition

If you are a researcher or student with experience in multiple linear regression and want to learn about logistic regression, Paul Allison's Logistic Regression Using SAS: Theory and Application, Second Edition, is for you! Informal and nontechnical, this book both explains the theory behind logistic regression, and looks at all the practical details involved in its implementation using SAS. Several real-world examples are included in full detail. This book also explains the differences and similarities among the many generalizations of the logistic regression model. The following topics are covered: binary logistic regression, logit analysis of contingency tables, multinomial logit analysis, ordered logit analysis, discrete-choice analysis, and Poisson regression. Other highlights include discussions on how to use the GENMOD procedure to do loglinear analysis and GEE estimation for longitudinal binary data. Only basic knowledge of the SAS DATA step is assumed. The second edition describes many new features of PROC LOGISTIC, including conditional logistic regression, exact logistic regression, generalized logit models, ROC curves, the ODDSRATIO statement (for analyzing interactions), and the EFFECTPLOT statement (for graphing nonlinear effects). Also new is coverage of PROC SURVEYLOGISTIC (for complex samples), PROC GLIMMIX (for generalized linear mixed models), PROC QLIM (for selection models and heterogeneous logit models), and PROC MDC (for advanced discrete choice models).
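The logit analysis of contingency tables mentioned above rests on the odds ratio and its standard error on the log scale. A minimal sketch with made-up counts (illustrative only, not an example from the book, which works in SAS):

```python
import math

# Odds ratio and approximate 95% confidence interval from a 2x2 table,
# using the standard error of the log odds ratio: sqrt(1/a+1/b+1/c+1/d).

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Cells: a=exposed cases, b=exposed non-cases, c=unexposed cases,
    d=unexposed non-cases. Returns (OR, lower, upper)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    log_or = math.log(or_)
    return or_, math.exp(log_or - z * se), math.exp(log_or + z * se)

or_, lo, hi = odds_ratio_ci(30, 70, 15, 85)
# OR = (30*85)/(70*15) ≈ 2.43; the interval excludes 1, suggesting association
```

Exponentiating a fitted logistic regression coefficient yields exactly this kind of odds ratio, which is why the two analyses agree on 2x2 data.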

This book is part of the SAS Press program.

JMP Start Statistics, 5th Edition

Updated for JMP 10, the book provides hands-on tutorials with just the right amount of conceptual and motivational material to illustrate how to use the intuitive interface for data analysis in JMP. Features concept-specific tutorials, examples, brief reviews of concepts, step-by-step illustrations, and exercises.

Designing Great Data Products

In the past few years, we’ve seen many data products based on predictive modeling. These products range from weather forecasting to recommendation engines like Amazon's. Prediction technology can be interesting and mathematically elegant, but we need to take the next step: going from recommendations to products that can produce optimal strategies for meeting concrete business objectives. We already know how to build these products: they've been in use for the past decade or so, but they're not as common as they should be. This report shows how to take the next step: to go from simple predictions and recommendations to a new generation of data products with the potential to revolutionize entire industries.

Quantifying the User Experience

Quantifying the User Experience: Practical Statistics for User Research offers a practical guide to using statistics to solve quantitative problems in user research. Many designers and researchers view usability and design as qualitative activities that do not require attention to formulas and numbers. However, usability practitioners and user researchers are increasingly expected to quantify the benefits of their efforts. The impact of good and bad designs can be quantified in terms of conversions, completion rates, completion times, perceived satisfaction, recommendations, and sales. The book discusses ways to quantify user research; summarize data and compute margins of error; determine appropriate sample sizes; standardize usability questionnaires; and settle controversies in measurement and statistics. Each chapter concludes with a list of key points and references. Most chapters also include a set of problems and answers that enable readers to test their understanding of the material. This book is a valuable resource for those engaged in measuring the behavior and attitudes of people during their interaction with interfaces. It:
- Provides practical guidance on solving usability testing problems with statistics for any project, including those using Six Sigma practices
- Shows practitioners which test to use and why the tests work, along with best practices in application and easy-to-use Excel formulas and web calculators for analyzing data
- Recommends ways for practitioners to communicate results to stakeholders in plain English

Resources and tools are available at the authors' site: http://www.measuringu.com/
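A typical calculation of this kind is a confidence interval for a task completion rate at the small sample sizes usual in usability tests. This sketch uses the adjusted-Wald approach, which behaves better than the plain Wald interval for small n; the figures are assumed, not taken from the book:

```python
import math

# Adjusted-Wald confidence interval for a proportion: add z^2/2 successes
# and z^2 trials before applying the normal approximation.

def adjusted_wald_ci(successes, n, z=1.96):
    """~95% CI for a completion rate; clipped to [0, 1]."""
    n_adj = n + z ** 2
    p_adj = (successes + z ** 2 / 2) / n_adj
    half = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - half), min(1.0, p_adj + half)

# 9 of 10 participants completed the task.
lo, hi = adjusted_wald_ci(9, 10)
# → roughly (0.57, 1.0): even a 90% observed rate is quite uncertain at n=10
```

The wide interval is the point: small-sample usability data support weaker claims than raw percentages suggest, which is why the book stresses margins of error.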

Adaptive Tests of Significance Using Permutations of Residuals with R and SAS

Provides the tools needed to successfully perform adaptive tests across a broad range of datasets. Adaptive Tests of Significance Using Permutations of Residuals with R and SAS® illustrates the power of adaptive tests and showcases their ability to adjust the testing method to suit a particular set of data. The book utilizes state-of-the-art software to demonstrate the practicality and benefits of adaptive tests for data analysis in various fields of study. Beginning with an introduction, the book moves on to explore the underlying concepts of adaptive tests, including:
- Smoothing methods and normalizing transformations
- Permutation tests with linear methods
- Applications of adaptive tests
- Multicenter and cross-over trials
- Analysis of repeated measures data
- Adaptive confidence intervals and estimates

Throughout the book, numerous figures illustrate the key differences among traditional tests, nonparametric tests, and adaptive tests. R and SAS® software packages are used to perform the discussed techniques, and the accompanying datasets are available on the book's related website. In addition, exercises at the end of most chapters enable readers to analyze the presented datasets by putting new concepts into practice. Adaptive Tests of Significance Using Permutations of Residuals with R and SAS® is an insightful reference for professionals and researchers working with statistical methods across a variety of fields, including the biosciences, pharmacology, and business. The book also serves as a valuable supplement for courses on regression analysis and adaptive analysis at the upper-undergraduate and graduate levels.
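The permutation idea underlying these tests is simple: shuffle the data under the null hypothesis and see how often the shuffled statistic is at least as extreme as the observed one. A basic two-sample permutation test, sketched in Python with made-up data (the book itself works in R and SAS, and permutes residuals within linear models rather than raw observations):

```python
import random

# Two-sample permutation test for a difference in means: pool the
# observations, reassign group labels at random, recompute the statistic,
# and report the fraction of shuffles at least as extreme as observed.

def permutation_test(a, b, n_perm=10000, seed=0):
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        stat = abs(sum(pa) / len(pa) - sum(pb) / len(pb))
        if stat >= observed:
            count += 1
    return count / n_perm  # permutation p-value

# Clearly separated groups yield a small p-value.
p = permutation_test([12, 14, 13, 15], [7, 8, 6, 9])
```

Because the reference distribution is built from the data themselves, no normality assumption is needed, which is what makes the approach attractive for the adaptive tests the book develops.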

Mathematics and Statistics for Financial Risk Management

Mathematics and Statistics for Financial Risk Management is a practical guide to modern financial risk management for both practitioners and academics. The recent financial crisis and its impact on the broader economy underscore the importance of financial risk management in today's world. At the same time, financial products and investment strategies are becoming increasingly complex. Today, it is more important than ever that risk managers possess a sound understanding of mathematics and statistics. In a concise and easy-to-read style, each chapter of this book introduces a different topic in mathematics or statistics. As different techniques are introduced, sample problems and application sections demonstrate how these techniques can be applied to actual risk management problems. Exercises at the end of each chapter and the accompanying solutions at the end of the book allow readers to practice the techniques they are learning and monitor their progress. A companion website includes interactive Excel spreadsheet examples and templates. This comprehensive resource covers basic statistical concepts from volatility and Bayes' Law to regression analysis and hypothesis testing. Widely used risk models, including Value-at-Risk, factor analysis, Monte Carlo simulations, and stress testing are also explored. A chapter on time series analysis introduces interest rate modeling, GARCH, and jump-diffusion models. Bond pricing, portfolio credit risk, optimal hedging, and many other financial risk topics are covered as well. If you're looking for a book that will help you understand the mathematics and statistics of financial risk management, look no further.
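The historical-simulation flavour of Value-at-Risk mentioned above reduces to an empirical quantile of past losses. A minimal sketch of my own with made-up daily returns (the book's companion materials use Excel spreadsheets, not Python):

```python
# Historical simulation Value-at-Risk: the loss threshold exceeded on
# only (1 - confidence) of past days, read off the sorted loss history.

def historical_var(returns, confidence=0.95):
    """Return VaR as a positive loss fraction of portfolio value."""
    losses = sorted(-r for r in returns)  # losses in ascending order
    idx = min(int(confidence * len(losses)), len(losses) - 1)
    return max(0.0, losses[idx])

# 20 assumed daily returns; at 95% confidence with a sample this small,
# the quantile lands on the single worst observed loss.
rets = [0.01, -0.02, 0.005, -0.015, 0.02, -0.03, 0.012, -0.007,
        0.003, -0.012, 0.018, -0.025, 0.009, -0.004, 0.006, -0.018,
        0.011, -0.009, 0.014, -0.022]
var95 = historical_var(rets)  # → 0.03, i.e. a 3% one-day loss
```

With only 20 observations the estimate is crude; in practice longer histories, parametric models, or Monte Carlo simulation (all covered in the book) give smoother tail estimates.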

How 30 Great Ads Were Made

This book takes readers behind the scenes in the world of advertising, showcasing 30 phenomenally successful campaigns from the last decade. It is fascinating not only for industry professionals but also for anyone curious about the creative process. Technical information on how the ads were developed is accompanied by anecdotes from the creatives, directors and clients, with accounts of how the ads were made and the problems encountered along the way. Each campaign is illustrated with imagery showing the stages it went through in development, including sketches and early ideas that may have been abandoned, storyboards, animatics and photos from shoots, as well as shots of the final ads. In addition to offering an insight into the working practices within advertising, the book also demonstrates how the current period of rapid change is influencing the skills now required to work within the industry.

Webbots, Spiders, and Screen Scrapers, 2nd Edition

There's a wealth of data online, but sorting and gathering it by hand can be tedious and time-consuming. Rather than click through page after endless page, why not let bots do the work for you? Webbots, Spiders, and Screen Scrapers will show you how to create simple programs with PHP/CURL to mine, parse, and archive online data to help you make informed decisions.
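A core step in any such bot is parsing fetched HTML to pull out the data of interest, such as every link on a page. The book builds its bots with PHP/cURL; for illustration, here is a comparable sketch in Python using only the standard library, with an inline page standing in for a fetched one:

```python
from html.parser import HTMLParser

# A minimal link-extracting "spider" step: collect every href on a page.

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# In a real bot this string would come from an HTTP fetch of a target URL.
page = '<html><body><a href="/docs">Docs</a> <a href="https://example.com">Ex</a></body></html>'
collector = LinkCollector()
collector.feed(page)
# collector.links → ['/docs', 'https://example.com']
```

A spider then queues those links for fetching in turn; a screen scraper instead matches the parsed structure against the specific fields it wants to archive.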

Carpenter's Guide to Innovative SAS Techniques

Carpenter's Guide to Innovative SAS Techniques offers advanced SAS programmers an all-in-one programming reference that includes advanced topics not easily found outside the depths of SAS documentation or more advanced training classes. Art Carpenter has written fifteen chapters of advanced tips and techniques, including topics on data summary, data analysis, and data reporting. Special emphasis is placed on DATA step techniques that solve complex data problems. There are numerous examples that illustrate advanced techniques that take advantage of formats, interface with the macro language, and utilize the Output Delivery System. Additional topics include operating system interfaces, table lookup techniques, and the creation of customized reports.