In Probability and Evidence, first published in 1972, one of the foremost analytical philosophers of the twentieth century addresses central questions in epistemology and the philosophy of science. Based on Ayer's influential Dewey Lectures of 1970, the book contains revised versions of the lectures together with two additional essays. This new edition adds Graham Macdonald's extensive introduction, which explains the book's importance and influence in contemporary philosophy.
Statistical Evidence: A Likelihood Paradigm focuses on interpreting statistical data as evidence, and on the law of likelihood, a principle fundamental to solving many of the problems that such interpretation raises. Statistics has long neglected this principle, and the resulting methodology is seriously flawed. This book redresses the balance, explaining why science has clung to that methodology despite its well-known defects. After examining the strengths and weaknesses of the Neyman-Pearson and Fisher paradigms, the author proposes an alternative that provides, in the law of likelihood, the explicit concept of evidence the other paradigms lack. At the same time, the new paradigm retains objective measurement and control over the frequency of misleading results, the features that made the older paradigms so important to science. The likelihood paradigm leads to statistical methods with a compelling rationale and an elegant simplicity, no longer forcing the reader to choose between frequentist and Bayesian statistics.
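The law of likelihood itself can be stated briefly: data favor one hypothesis over another exactly when the data are more probable under the first, and the likelihood ratio measures the strength of that evidence. The following minimal Python sketch, not taken from the book, illustrates the idea for a hypothetical binomial experiment; the counts and parameter values are assumptions chosen purely for illustration.

```python
from math import comb

def binomial_likelihood(k, n, p):
    """Probability of observing k successes in n trials when the success rate is p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Hypothetical data: 7 successes in 10 trials (illustrative numbers only).
# By the law of likelihood, the data favor p = 0.7 over p = 0.5 just in case
# this ratio exceeds 1, and its magnitude measures the strength of the evidence.
k, n = 7, 10
lr = binomial_likelihood(k, n, 0.7) / binomial_likelihood(k, n, 0.5)
print(f"Likelihood ratio L(p=0.7) / L(p=0.5) = {lr:.2f}")  # about 2.28
```

On this view the ratio of about 2.3, rather than a p-value or an accept/reject decision, is the evidential report: the observations are a little more than twice as probable if the success rate is 0.7 than if it is 0.5.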
Does God exist? This is probably the most debated question in the history of mankind. Scholars, scientists, and philosophers have spent their lifetimes trying to prove or disprove the existence of God, only to have their theories crucified by other scholars, scientists, and philosophers. Where the debate breaks down is in the ambiguities and colloquialisms of language. But by using a universal, unambiguous language, namely mathematics, can this question finally be answered definitively? That is what Dr. Stephen Unwin attempts to do in this riveting, accessible, and witty book, The Probability of God. At its core, the book reveals how an equation developed more than 200 years ago by the English clergyman and mathematician Thomas Bayes can be used to calculate the probability that God exists. The equation itself is much more complicated than a simple coin toss (heads, He's up there running the show; tails, He's not). Yet Dr. Unwin writes with a clarity that makes his mathematical argument easy for even the nonmathematician to understand, and with a verve that makes his book a delight to read. Leading you carefully through each step in his argument, he concludes in the end that God does indeed exist. Whether you are a devout believer who agrees with Dr. Unwin's conclusion or are unsure about all things divine, you will find this provocative book enlightening and engaging.
"One of the most innovative works [in the science and religion movement] is The Probability of God... An entertaining exercise in thinking." --Michael Shermer, Scientific American
"Unwin's book [is] peppered with wry, self-deprecating humor that makes the scientific discussions more accessible... Spiritually inspiring." --Chicago Sun-Times
"A pleasantly breezy account of some complicated matters well worth learning about." --Philadelphia Inquirer
"One of the best things about the book is its humor." --Cleveland Plain Dealer
"In a book that is surprisingly lighthearted and funny, Unwin manages to pack in a lot of facts about science and philosophy." --Salt Lake Tribune
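The equation at the heart of Unwin's book is Bayes' theorem, which turns a prior probability and the evidential weight of observations into a posterior probability. The sketch below, which is not Unwin's own calculation, shows the general mechanics in Python using the odds form of the theorem; the prior and the likelihood ratios are purely illustrative assumptions.

```python
def bayes_update(prior, likelihood_ratio):
    """Bayes' theorem in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Start from an agnostic 50% prior and update on successive pieces of
# evidence, each summarized by a likelihood ratio: how much more probable
# the evidence is if God exists than if not. All numbers here are
# illustrative assumptions, not Unwin's figures.
p = 0.5
for lr in (2.0, 0.5, 10.0):
    p = bayes_update(p, lr)
print(f"Posterior probability after the updates: {p:.2f}")  # 0.91
```

Unwin's argument proceeds in broadly this spirit, starting from an agnostic prior and updating on successive areas of evidence, though the evidence he scores and the numbers he assigns are of course his own.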
This book explores the nature of factual inference in adjudication. It should be useful to students of law in Continental Europe as well as to students of Anglo-American law. While a good many countries do not use the sorts of rules of evidence found in the Anglo-American legal tradition, their procedural systems nevertheless frequently use a variety of rules and principles to regulate and structure the acquisition, presentation, and evaluation of evidence. In this sense, almost all legal systems have a law of proof. The book should also be useful to scholars in fields other than law. While the papers focus on inference in adjudication, they deal with a wide variety of issues important in disciplines such as the philosophy of science, statistics, and psychology. For example, there is extensive discussion of the role of generalizations and hypotheses in inference, and of the significance of the fact that the actors who evaluate data also, in some sense, constitute the data they evaluate. Furthermore, explanations of the manner in which some legal systems structure fact-finding may highlight features of inferential processes that scholars in other fields have yet to tackle adequately.
David Hume's argument against believing in miracles has attracted nearly continuous attention from philosophers and theologians since it was first published in 1748. Hume's many commentators, however, both pro and con, have often misunderstood key aspects of his account of evidential probability and, as a result, have misrepresented his argument and conclusions regarding miracles in fundamental ways. This book argues that Hume's account of probability descends from a long and laudable tradition that goes back to ancient Roman and medieval law. That account is entirely and deliberately non-mathematical. As a result, any analysis of Hume's argument in terms of the mathematical theory of probability is doomed to failure. Recovering this ancient tradition of probable reasoning leads us to a correct interpretation of Hume's argument against miracles, enables a more accurate understanding of many other episodes in the history of science and philosophy, and may also be useful in contemporary attempts to weigh evidence in epistemically complex situations where confirmation theory and mathematical probability theory have proven less helpful than we had hoped.
How did we make reliable predictions before Pascal and Fermat's discovery of the mathematics of probability in 1654? What methods in law, science, commerce, philosophy, and logic helped us to get at the truth in cases where certainty was not attainable? In The Science of Conjecture, James Franklin examines how judges, witch inquisitors, and juries evaluated evidence; how scientists weighed reasons for and against scientific theories; and how merchants counted shipwrecks to determine insurance rates. The book offers a history of rational methods of dealing with uncertainty and traces how the human understanding of risk gradually came to consciousness.
Martin Smith explores a question central to philosophy: what does it take for a belief to be justified or rational? According to a widespread view, whether one has justification for believing a proposition is determined by how probable that proposition is, given one's evidence. This book rejects that view and replaces it with another: in order for one to have justification for believing a proposition, one's evidence must normically support it. Roughly, one's evidence must make the falsity of that proposition abnormal, in the sense of calling for special, independent explanation. This conception of justification bears upon a range of topics in epistemology and beyond, including the relation between justification and knowledge, the force of statistical evidence, the problem of scepticism, the lottery and preface paradoxes, the viability of multiple premise closure, the internalist/externalist debate, the psychology of human reasoning, and the relation between belief and degrees of belief. Ultimately, this way of looking at justification leads to a new, unfamiliar picture, developed here, of how we should respond to our evidence and manage our own fallibility.