Celebrating Statistics

Author: A. C. Davison

Publisher: OUP Oxford

Published: 2005-09-22

Total Pages: 320

ISBN-10: 0191524212


Sir David Cox is among the most important statisticians of the past half-century. He has made pioneering and highly influential contributions to a uniquely wide range of topics in statistics and applied probability. His teaching has inspired generations of students, and many well-known researchers began as his graduate students or worked with him at early stages of their careers. Legions of others have been stimulated and enlightened by the clear, concise, and direct exposition exemplified by his many books, papers, and lectures.

This book presents a collection of chapters by major statistical researchers who attended a conference held at the University of Neuchâtel in July 2004 to celebrate David Cox's 80th birthday. Each chapter is carefully crafted, and collectively they present current developments across a wide range of research areas, from epidemiology, environmental science, and finance to computing and medicine. Edited by Anthony Davison, École Polytechnique Fédérale de Lausanne, Switzerland; Yadolah Dodge, University of Neuchâtel, Switzerland; and N. Wermuth, Göteborg University, Sweden, with chapters by Ole E. Barndorff-Nielsen, Sarah C. Darby, Christina Davies, Peter J. Diggle, David Firth, Peter Hall, Valerie S. Isham, Kung-Yee Liang, Peter McCullagh, Paul McGale, Amilcare Porporato, Nancy Reid, Brian D. Ripley, Ignacio Rodriguez-Iturbe, Andrea Rotnitzky, Neil Shephard, and Scott L. Zeger, and including a brief biography of David Cox, this book is suitable for students of statistics, epidemiology, environmental science, finance, computing, and medicine, and for academic and practising statisticians.


E.T. Jaynes

Author: Edwin T. Jaynes

Publisher: Springer Science & Business Media

Published: 1989-04-30

Total Pages: 468

ISBN-13: 9780792302131


The first six chapters of this volume present the author's 'predictive' or 'information theoretic' approach to statistical mechanics, in which the basic probability distributions over microstates are obtained as distributions of maximum entropy (i.e., as distributions that are most non-committal with regard to missing information among all those satisfying the macroscopically given constraints). There is then no need to make additional assumptions of ergodicity or metric transitivity; the theory proceeds entirely by inference from macroscopic measurements and the underlying dynamical assumptions. Moreover, the method of maximizing the entropy is completely general and applies, in particular, to irreversible processes as well as to reversible ones. The next three chapters provide a broader framework, at once Bayesian and objective, for maximum entropy inference. The basic principles of inference, including the usual axioms of probability, are seen to rest on nothing more than requirements of consistency, above all the requirement that in two problems where we have the same information we must assign the same probabilities. Thus, statistical mechanics is viewed as a branch of a general theory of inference, and the latter as an extension of the ordinary logic of consistency. Those who are familiar with the literature of statistics and statistical mechanics will recognize in both of these steps a genuine 'scientific revolution' (a complete reversal of earlier conceptions) and one of no small significance.


Statistics Done Wrong

Author: Alex Reinhart

Publisher: No Starch Press

Published: 2015-03-01

Total Pages: 177

ISBN-10: 1593276206


Scientific progress depends on good research, and good research needs good statistics. But statistical analysis is tricky to get right, even for the best and brightest of us. You'd be surprised how many scientists are doing it wrong. Statistics Done Wrong is a pithy, essential guide to statistical blunders in modern science that will show you how to keep your research blunder-free. You'll examine embarrassing errors and omissions in recent research, learn about the misconceptions and scientific politics that allow these mistakes to happen, and begin your quest to reform the way you and your peers do statistics. You'll find advice on:

- Asking the right question, designing the right experiment, choosing the right statistical analysis, and sticking to the plan
- How to think about p values, significance, insignificance, confidence intervals, and regression
- Choosing the right sample size and avoiding false positives
- Reporting your analysis and publishing your data and source code
- Procedures to follow, precautions to take, and analytical software that can help

Scientists: read this concise, powerful guide to help you produce statistically sound research. Statisticians: give this book to everyone you know. The first step toward statistics done right is Statistics Done Wrong.


Statistical Inference as Severe Testing

Author: Deborah G. Mayo

Publisher: Cambridge University Press

Published: 2018-09-20

Total Pages: 503

ISBN-10: 1108563309


Mounting failures of replication in the social and biological sciences give new urgency to critically appraising proposed reforms. This book pulls back the cover on disagreements between experts charged with restoring integrity to science. It denies two pervasive views of the role of probability in inference: to assign degrees of belief, and to control error rates in the long run. If statistical consumers are unaware of the assumptions behind rival evidence reforms, they can't scrutinize the consequences that affect them (in personalized medicine, psychology, etc.). The book sets sail with a simple tool: if little has been done to rule out flaws in inferring a claim, then it has not passed a severe test. Many methods advocated by data experts do not stand up to severe scrutiny and are in tension with successful strategies for blocking or accounting for cherry picking and selective reporting. Through a series of excursions and exhibits, the philosophy and history of inductive inference come alive. Philosophical tools are put to work to solve problems about science and pseudoscience, induction and falsification.