Probability, Statistics and Other Frightening Stuff

Author: Alan R. Jones

Publisher: Routledge

Published: 2018-10-09

Total Pages: 485

ISBN-10: 135166137X

Probability, Statistics and Other Frightening Stuff (Volume II of the Working Guides to Estimating & Forecasting series) considers many of the commonly used Descriptive Statistics in the world of estimating and forecasting. It considers values that are representative of the ‘middle ground’ (Measures of Central Tendency), and the degree of data scatter (Measures of Dispersion and Shape) around the ‘middle ground’ values. A number of Probability Distributions and where they might be used are discussed, along with some fascinating and useful ‘rules of thumb’ or short-cut properties that estimators and forecasters can exploit in plying their trade. With the help of a ‘Correlation Chicken’, the concept of partial correlation is explained, including how the estimator or forecaster can exploit this in reflecting varying levels of independence and imperfect dependence between an output or predicted value (such as cost) and an input or predictor variable (such as size). Under the guise of ‘Tails of the unexpected’, the book concludes with two chapters devoted to Hypothesis Testing (or knowing when to accept or reject the validity of an assumed estimating relationship), and a number of statistically based tests to help the estimator decide whether to include or exclude a data point as an ‘outlier’, one that appears not to be representative of that which the estimator is tasked to produce. This is a valuable resource for estimators, engineers, accountants and project risk specialists, as well as students of cost engineering.
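The Measures of Central Tendency and Dispersion mentioned above can be illustrated with a minimal sketch using Python's standard library; the cost figures below are invented for illustration and do not come from the book:

```python
import statistics as st

# Hypothetical normalised cost observations for comparable projects
costs = [4.2, 5.1, 4.8, 5.0, 4.9, 6.3, 5.2]

mean = st.mean(costs)      # Measure of Central Tendency: arithmetic mean
median = st.median(costs)  # a more outlier-robust 'middle ground' value
stdev = st.stdev(costs)    # Measure of Dispersion: sample standard deviation

print(f"mean={mean:.3f}  median={median:.3f}  stdev={stdev:.3f}")
```

Comparing the mean against the median is one quick check for skew or a potential outlier: the 6.3 observation pulls the mean above the median here.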


Risk, Opportunity, Uncertainty and Other Random Models

Author: Alan R. Jones

Publisher: Routledge

Published: 2018-09-13

Total Pages: 292

ISBN-10: 1351661299

Risk, Opportunity, Uncertainty and Other Random Models (Volume V in the Working Guides to Estimating and Forecasting series) goes part way to debunking the myth that research and development costs are somewhat random, as under certain conditions they can be observed to follow a pattern of behaviour referred to as a Norden-Rayleigh Curve, which unfortunately has to be truncated to stop the myth from becoming a reality! There is, however, a practical alternative in relation to a particular form of PERT-Beta Curve. The major emphasis of this volume, though, is the use of Monte Carlo Simulation as a general technique for narrowing down the potential outcomes of multiple interacting variables or cost drivers, perhaps the most common application being the evaluation of Risk, Opportunity and Uncertainty. The trouble is that many Monte Carlo Simulation tools are ‘black boxes’, and too few estimators and forecasters really appreciate what is happening inside the ‘black box’. This volume aims to resolve that, and offers tips on things that might need to be considered to remove some of the uninformed random input that often creates a misinformed misconception of ‘it must be right!’ Monte Carlo Simulation can also be used to model variability and so determine Critical Paths in a schedule, and is key to modelling Waiting Times and queues with random arisings. Supported by a wealth of figures and tables, this is a valuable resource for estimators, engineers, accountants and project risk specialists, as well as students of cost engineering.
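The Monte Carlo idea described above, repeatedly sampling each uncertain cost element and summing, can be sketched in a few lines. This is an illustrative sketch only: the three-point ranges are invented and the triangular distribution is just one common choice, not necessarily the book's:

```python
import random

random.seed(42)  # fixed seed so the sketch is repeatable

# Three hypothetical cost elements, each a (min, most likely, max) triple
elements = [(90, 100, 130), (40, 55, 60), (20, 25, 45)]

def one_trial():
    # Sample every element from a triangular distribution and sum them
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in elements)

trials = sorted(one_trial() for _ in range(10_000))
p50 = trials[len(trials) // 2]        # median outcome
p80 = trials[int(len(trials) * 0.8)]  # 80th percentile outcome

print(f"P50 ~ {p50:.1f}, P80 ~ {p80:.1f}")
```

Reading percentiles off the sorted trial outcomes is what turns the ‘black box’ into an inspectable distribution of total cost rather than a single point estimate.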


Best Fit Lines & Curves

Author: Alan R. Jones

Publisher: Routledge

Published: 2018-10-09

Total Pages: 498

ISBN-10: 1351661442

Best Fit Lines and Curves, and Some Mathe-Magical Transformations (Volume III of the Working Guides to Estimating & Forecasting series) concentrates on techniques for finding the Best Fit Line or Curve to some historical data, allowing us to interpolate or extrapolate the implied relationship that will underpin our prediction. A range of simple ‘Moving Measures’ are suggested to smooth the underlying trend and quantify the degree of noise or scatter around that trend. The advantages and disadvantages are discussed, and a simple way to offset the latent disadvantage of most Moving Measure Techniques is provided. Simple Linear Regression Analysis is then introduced as a more formal numerical technique that calculates the line of best fit subject to defined ‘goodness of fit’ criteria. Microsoft Excel is used to demonstrate how to decide whether the line of best fit is a good fit, or just a solution in search of some data. These principles are then extended to cover multiple cost drivers, and how we can use them to quantify 3-Point Estimates. With a deft sleight of hand, certain commonly occurring families of non-linear relationships can be transformed mathe-magically into linear formats, allowing us to exploit the powers of Regression Analysis to find the Best Fit Curves. The volume concludes with an exploration of the ups and downs of seasonal data (Time Series Analysis). Supported by a wealth of figures and tables, this is a valuable resource for estimators, engineers, accountants and project risk specialists, as well as students of cost engineering.
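The least-squares line of best fit described above can be sketched directly; the data points are invented, and plain Python is used here rather than the Microsoft Excel functions the book works with:

```python
# Hypothetical historical data: size driver x against observed cost y
xs = [10, 20, 30, 40, 50]
ys = [12, 19, 33, 38, 52]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Least-squares slope and intercept for the line of best fit
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

# R-squared as a simple 'goodness of fit' criterion
ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
ss_tot = sum((y - mean_y) ** 2 for y in ys)
r_squared = 1 - ss_res / ss_tot

print(f"y = {intercept:.2f} + {slope:.2f}x, R^2 = {r_squared:.3f}")
```

An R-squared close to 1 suggests the line explains most of the scatter; a low value is the warning sign of ‘a solution in search of some data’.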


Principles, Process and Practice of Professional Number Juggling

Author: Alan R. Jones

Publisher: Routledge

Published: 2018-09-13

Total Pages: 242

ISBN-10: 1351661353

Principles, Process and Practice of Professional Number Juggling (Volume 1 of the Working Guides to Estimating & Forecasting series) sets the scene of TRACEability and good estimate practice that is followed in the other volumes in this series of five working guides. It clarifies the difference between an Estimating Process, Procedure, Approach, Method and Technique. It expands on these definitions of Approach (Top-down, Bottom-up and ‘Ethereal’) and Method (Analogy, Parametric and ‘Trusted Source’) and discusses how these form the basis of all other means of establishing an estimate. This volume also underlines the importance of ‘data normalisation’ in any estimating procedure, and demonstrates that the Estimating by Analogy Method, in essence, is a simple extension of Data Normalisation. The author looks at simple measures of assessing the maturity or health of an estimate, and offers a means of assessing a spreadsheet for any inherent risks or errors that may be introduced by failing to follow good practice in spreadsheet design and build. This book provides a taster of the more numerical techniques covered in the remainder of the series by considering how an estimator can potentially exploit Benford’s Law (traditionally used in Fraud Detection) to identify systematic bias from third-party contributors. It will be a valuable resource for estimators, engineers, accountants and project risk specialists, as well as students of cost engineering.
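Benford's Law, mentioned above, predicts that the leading digit d of naturally occurring numbers appears with frequency log10(1 + 1/d), so 1 leads about 30% of the time. A minimal sketch of comparing a data set against that profile (the sample cost returns are invented):

```python
import math
from collections import Counter

# Benford's Law: expected frequency of leading digit d is log10(1 + 1/d)
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def leading_digit_profile(values):
    # Tally the leading non-zero digit of each value
    digits = [int(str(abs(v)).lstrip("0.")[0]) for v in values]
    counts = Counter(digits)
    n = len(digits)
    return {d: counts.get(d, 0) / n for d in range(1, 10)}

# Hypothetical third-party cost returns; a large deviation from the
# Benford profile can flag systematic bias worth investigating
sample = [812, 1204, 175, 1930, 264, 1410, 98, 1127, 3050, 142]
profile = leading_digit_profile(sample)

print(f"P(lead digit = 1): expected {benford[1]:.3f}, observed {profile[1]:.3f}")
```

In practice the comparison would be run over far more than ten values and backed by a formal test, but the idea is exactly this: genuine, unmanipulated figures tend to follow the logarithmic profile, and rounded or invented ones often do not.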


Applications of Networks, Sensors and Autonomous Systems Analytics

Author: Jyotsna Kumar Mandal

Publisher: Springer Nature

Published: 2021-11-27

Total Pages: 364

ISBN-10: 9811673055

This book presents high-quality research papers presented at the International Conference on Applications of Networks, Sensors and Autonomous Systems Analytics (ICANSAA 2020), held during December 11–12, 2020, at JIS College of Engineering, Kalyani, West Bengal, India. The major topics covered are cyber-physical systems and sensor networks; data analytics and autonomous systems; and MEMS and NEMS with applications in biomedical devices. It includes novel and innovative work from experts, practitioners, scientists, and decision-makers from academia and industry.


Learning, Unlearning and Re-Learning Curves

Author: Alan R. Jones

Publisher: Routledge

Published: 2018-09-13

Total Pages: 304

ISBN-10: 1351661477

Learning, Unlearning and Re-learning Curves (Volume IV of the Working Guides to Estimating & Forecasting series) focuses on Learning Curves, and the various tried and tested models of Wright, Crawford, DeJong, Towill-Bevis and others. It explores the differences and similarities between the various models and examines the key properties that Estimators and Forecasters can exploit. A discussion about Learning Curve Cost Drivers leads to the consideration of a little-used but very powerful technique of Learning Curve modelling called Segmentation, which looks at an organisation’s complex learning curve as the product of multiple shallower learning curves. Perhaps the biggest benefit is that it simplifies the calculations in Microsoft Excel where there is a change in the rate of learning observed or expected. The same technique can be used to model and calibrate discontinuities in the learning process that result in setbacks and uplifts in time or cost. This technique is compared with other, better-known techniques such as Anderlohr’s. Equivalent Unit Learning is another, relatively new, technique that can be used alongside traditional completed unit learning to give an early warning of changes in the rates of learning. Finally, a Learning Curve can be exploited to estimate the penalty of collaborative working across multiple partners. Supported by a wealth of figures and tables, this is a valuable resource for estimators, engineers, accountants and project risk specialists, as well as students of cost engineering.
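Wright's model, the first of those listed above, assumes that each doubling of cumulative output multiplies the cumulative average cost by a fixed learning rate. A sketch of that relationship, with an assumed first-unit cost and an assumed 80% rate (both figures invented for illustration):

```python
import math

T1 = 100.0            # cost of the first unit (hypothetical)
learning_rate = 0.80  # 80% curve: each doubling cuts cumulative average cost by 20%

# Learning exponent: the slope of the curve on log-log axes
b = math.log(learning_rate) / math.log(2)

def cumulative_average_cost(n):
    # Wright: cumulative average cost of the first n units = T1 * n^b
    return T1 * n ** b

for n in (1, 2, 4, 8):
    print(f"units={n}: cumulative average cost = {cumulative_average_cost(n):.1f}")
```

On an 80% curve the cumulative average falls 100, 80, 64, 51.2 at units 1, 2, 4 and 8, which is the doubling property the various models share; Crawford's variant applies the same exponent to unit cost rather than cumulative average cost.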


The Book of Why

Author: Judea Pearl

Publisher: Basic Books

Published: 2018-05-15

Total Pages: 432

ISBN-10: 0465097618

A Turing Award-winning computer scientist and statistician shows how understanding causality has revolutionized science and will revolutionize artificial intelligence. "Correlation is not causation." This mantra, chanted by scientists for more than a century, has led to a virtual prohibition on causal talk. Today, that taboo is dead. The causal revolution, instigated by Judea Pearl and his colleagues, has cut through a century of confusion and established causality, the study of cause and effect, on a firm scientific basis. His work explains how we can know easy things, like whether it was rain or a sprinkler that made a sidewalk wet, and how to answer hard questions, like whether a drug cured an illness. Pearl's work enables us to know not just whether one thing causes another: it lets us explore the world that is and the worlds that could have been. It shows us the essence of human thought and the key to artificial intelligence. Anyone who wants to understand either needs The Book of Why.