This is a collection of classic research papers on the Dempster-Shafer theory of belief functions. The book is the authoritative reference in the field of evidential reasoning and an important archival reference in a wide range of areas including uncertainty reasoning in artificial intelligence and decision making in economics, engineering, and management. The book includes a foreword reflecting the development of the theory in the last forty years.
The book focuses on applications of belief functions to business decisions. Section I introduces the intuitive, conceptual, and historical development of belief functions. Three different interpretations (the marginally correct approximation, the qualitative model, and the quantitative model) of belief functions are investigated, and rough set theory and structured query language (SQL) are used to express belief-function semantics. Section II presents applications of belief functions in information systems and auditing. Included are discussions of how a belief-function framework provides a more efficient and effective audit methodology, and of the appropriateness of belief functions for representing uncertainties in audit evidence. The third section deals with applications of belief functions to mergers and acquisitions; financial analysis of engineering enterprises; forecasting demand for mobile satellite services; modeling financial portfolios; and economics.
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date content. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including theory and applications reflecting the current state of the art. Each chapter is written by experts on the respective topics, including: Sets of desirable gambles; Coherent lower (conditional) previsions; Special cases and links to literature; Decision making; Graphical models; Classification; Reliability and risk assessment; Statistical inference; Structural judgments; Aspects of implementation (including elicitation and computation); Models in finance; Game-theoretic probability; Stochastic processes (including Markov chains); Engineering applications. Essential reading for researchers in academia, research institutes and other organizations, as well as practitioners engaged in areas such as risk analysis and engineering.
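As a taste of the machinery covered in the opening chapters, the following minimal Python sketch computes the lower and upper prevision (expectation) of a gamble over a finite credal set. The function name, the toy credal set, and the payoff numbers are illustrative assumptions of mine, not material from the book.

```python
def lower_upper_prevision(gamble, credal_set):
    """Lower and upper prevision (expectation) of a gamble over a
    finite credal set of probability mass functions."""
    expectations = [sum(p[s] * gamble[s] for s in gamble) for p in credal_set]
    return min(expectations), max(expectations)

# Illustrative imprecise belief: the probability of rain is only known
# to lie between 0.2 and 0.5, so we list the two extreme distributions
# (for a linear gamble, the bounds are attained at extreme points).
credal_set = [
    {"rain": 0.2, "dry": 0.8},
    {"rain": 0.5, "dry": 0.5},
]
gamble = {"rain": -10.0, "dry": 4.0}  # payoffs of holding an outdoor event
print(lower_upper_prevision(gamble, credal_set))  # -> (-3.0, 1.2)
```

The gap between the two values is the hallmark of imprecise probability: no single expected value summarises the evidence, and decision rules must work with the whole interval.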
For quite some time, philosophers, economists, and statisticians have endorsed a view on rational choice known as Bayesianism. The work on this book has grown out of a feeling that the Bayesian view has come to dominate the academic community to such an extent that alternative, non-Bayesian positions are seldom extensively researched. Needless to say, I think this is a pity. Non-Bayesian positions deserve to be examined with much greater care, and the present work is an attempt to defend what I believe to be a coherent and reasonably detailed non-Bayesian account of decision theory. The main thesis I defend can be summarised as follows. Rational agents maximise subjective expected utility, but contrary to what is claimed by Bayesians, utility and subjective probability should not be defined in terms of preferences over uncertain prospects. On the contrary, rational decision makers need only consider preferences over certain outcomes. It will be shown that utility and probability functions derived in a non-Bayesian manner can be used for generating preferences over uncertain prospects that support the principle of maximising subjective expected utility. To some extent, this non-Bayesian view gives an account of what modern decision theory could have been like, had decision theorists not entered the Bayesian path discovered by Ramsey, de Finetti, Savage, and others. I will not discuss all previous non-Bayesian positions presented in the literature.
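The principle at stake can be stated compactly. In the standard notation below (mine, not necessarily the book's), S is the set of states, A the set of acts, and o(a, s) the certain outcome of act a in state s; the dispute with Bayesians concerns where p and u come from, not the maximisation itself.

```latex
\[
  a^{*} \;\in\; \operatorname*{arg\,max}_{a \in A}
  \sum_{s \in S} p(s)\, u\bigl(o(a, s)\bigr)
\]
```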
Peter F. Drucker argues that what underlies the current malaise of so many large and successful organizations worldwide is that their theory of the business no longer works. The story is a familiar one: a company that was a superstar only yesterday finds itself stagnating and frustrated, in trouble and, often, in a seemingly unmanageable crisis. The root cause of nearly every one of these crises is not that things are being done poorly. It is not even that the wrong things are being done. Indeed, in most cases, the right things are being done—but fruitlessly. What accounts for this apparent paradox? The assumptions on which the organization has been built and is being run no longer fit reality. These are the assumptions that shape any organization's behavior, dictate its decisions about what to do and what not to do, and define what an organization considers meaningful results. These assumptions are what Drucker calls a company's theory of the business. The Harvard Business Review Classics series offers you the opportunity to make seminal Harvard Business Review articles a part of your permanent management library. Each highly readable volume contains a groundbreaking idea that continues to shape best practices and inspire countless managers around the world—and will have a direct impact on you today and for years to come.
Both in science and in practical affairs we reason by combining facts only inconclusively supported by evidence. Building on an abstract understanding of this process of combination, this book constructs a new theory of epistemic probability. The theory draws on the work of A. P. Dempster but diverges from Dempster's viewpoint by identifying his "lower probabilities" as epistemic probabilities and taking his rule for combining "upper and lower probabilities" as fundamental. The book opens with a critique of the well-known Bayesian theory of epistemic probability. It then proceeds to develop an alternative to the additive set functions and the rule of conditioning of the Bayesian theory: set functions that need only be what Choquet called "monotone of order of infinity," and Dempster's rule for combining such set functions. This rule, together with the idea of "weights of evidence," leads to both an extensive new theory and a better understanding of the Bayesian theory. The book concludes with a brief treatment of statistical inference and a discussion of the limitations of epistemic probability. Appendices contain mathematical proofs, which are relatively elementary and seldom depend on mathematics more advanced than the binomial theorem.
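Since Dempster's rule of combination is the fundamental operation here, a small sketch may help fix ideas. The representation below (mass functions as Python dicts keyed by frozensets) and the toy evidence are illustrative choices of mine, not the book's notation.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) on the
    same frame of discernment using Dempster's rule of combination."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence; combination undefined")
    k = 1.0 - conflict  # normalization constant
    return {s: v / k for s, v in combined.items()}

# Two independent bodies of evidence about a coin.
frame = frozenset({"fair", "biased"})
m1 = {frozenset({"fair"}): 0.6, frame: 0.4}    # weakly supports "fair"
m2 = {frozenset({"biased"}): 0.3, frame: 0.7}  # weakly supports "biased"
print(dempster_combine(m1, m2))
```

The division by 1 - conflict discards the mass assigned to contradictory intersections and renormalises the rest; that renormalisation step is what distinguishes Dempster's rule from a naive product of beliefs.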
People often follow intuitive principles of decision making, ranging from group loyalty to the belief that nature is benign. But instead of using these principles as rules of thumb, we often treat them as absolutes and ignore the consequences of following them blindly. In Judgment Misguided, Jonathan Baron explores our well-meant and deeply felt personal intuitions about what is right and wrong, and how they affect the public domain. Baron argues that when these intuitions are valued in their own right, rather than as a means to another end, they often prevent us from achieving the results we want. Focusing on cases where our intuitive principles take over public decision making, the book examines some of our most common intuitions and the ways they can be misused. According to Baron, we can avoid these problems by paying more attention to the effects of our decisions. Written in an accessible style, the book is filled with compelling case studies, such as abortion, nuclear power, immigration, and the decline of the Atlantic fishery, among others, which illustrate a range of intuitions and how they impede the public's best interests. Judgment Misguided will be important reading for those involved in public decision making, and researchers and students in psychology and the social sciences, as well as everyone looking for insight into the decisions that affect us all.
Non-Additive Measure and Integral is the first systematic approach to the subject. Much of the additive theory (convergence theorems, Lebesgue spaces, representation theorems) is generalized, at least for submodular measures, which are characterized by having a subadditive integral. The theory is of interest for applications to economic decision theory (decisions under risk and uncertainty), to statistics (including belief functions and fuzzy measures), to cooperative game theory, artificial intelligence, insurance, etc. Non-Additive Measure and Integral collects the results of scattered and often isolated approaches to non-additive measures and their integrals which originate in pure mathematics, potential theory, statistics, game theory, economic decision theory and other fields of application. It unifies, simplifies and generalizes known results and supplements the theory with new results, thus providing a sound basis for applications and further research in this growing field of increasing interest. It also contains fundamental results of sigma-additive and finitely additive measure and integration theory and sheds new light on additive theory. Non-Additive Measure and Integral employs distribution functions and quantile functions as basic tools, thus remaining close to the familiar language of probability theory. In addition to serving as an important reference, the book can be used as a mathematics textbook for graduate courses or seminars, containing many exercises to support or supplement the text.
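To make the central object concrete, here is a minimal Python sketch of the Choquet integral on a finite space, using the standard decreasing-rearrangement formula; the particular measure and function values are illustrative assumptions.

```python
def choquet_integral(f, mu):
    """Choquet integral of a non-negative function f (dict: point -> value)
    with respect to a monotone set function mu (dict: frozenset -> weight)."""
    points = sorted(f, key=f.get, reverse=True)  # points by decreasing f
    values = [f[p] for p in points] + [0.0]
    total = 0.0
    for i in range(len(points)):
        level_set = frozenset(points[: i + 1])   # the i+1 largest points
        total += (values[i] - values[i + 1]) * mu[level_set]
    return total

# A submodular measure on {a, b}: mu({a}) + mu({b}) >= mu({a, b}).
mu = {
    frozenset({"a"}): 0.6,
    frozenset({"b"}): 0.7,
    frozenset({"a", "b"}): 1.0,
}
f = {"a": 10.0, "b": 4.0}
print(choquet_integral(f, mu))  # -> (10 - 4)*0.6 + 4*1.0 = 7.6
```

When mu is additive this reduces to the ordinary expectation; for the submodular mu above the integral is subadditive, which is precisely the characterization of submodularity mentioned in the blurb.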
The principal aim of this book is to introduce to the widest possible audience an original view of belief calculus and uncertainty theory. In this geometric approach to uncertainty, uncertainty measures can be seen as points of a suitably complex geometric space, and manipulated in that space, for example, combined or conditioned. In the chapters in Part I, Theories of Uncertainty, the author offers an extensive recapitulation of the state of the art in the mathematics of uncertainty. This part of the book contains the most comprehensive summary to date of the whole of belief theory, with Chap. 4 outlining for the first time, and in a logical order, all the steps of the reasoning chain associated with modelling uncertainty using belief functions, in an attempt to provide a self-contained manual for the working scientist. In addition, the book proposes in Chap. 5 what is possibly the most detailed compendium available of all theories of uncertainty. Part II, The Geometry of Uncertainty, is the core of this book, as it introduces the author’s own geometric approach to uncertainty theory, starting with the geometry of belief functions: Chap. 7 studies the geometry of the space of belief functions, or belief space, both in terms of a simplex and in terms of its recursive bundle structure; Chap. 8 extends the analysis to Dempster’s rule of combination, introducing the notion of a conditional subspace and outlining a simple geometric construction for Dempster’s sum; Chap. 9 delves into the combinatorial properties of plausibility and commonality functions, as equivalent representations of the evidence carried by a belief function; then Chap. 10 starts extending the applicability of the geometric approach to other uncertainty measures, focusing in particular on possibility measures (consonant belief functions) and the related notion of a consistent belief function. The chapters in Part III, Geometric Interplays, are concerned with the interplay of uncertainty measures of different kinds, and the geometry of their relationship, with a particular focus on the approximation problem. Part IV, Geometric Reasoning, examines the application of the geometric approach to the various elements of the reasoning chain illustrated in Chap. 4, in particular conditioning and decision making. Part V concludes the book by outlining a future, complete statistical theory of random sets, future extensions of the geometric approach, and identifying high-impact applications to climate change, machine learning and artificial intelligence. The book is suitable for researchers in artificial intelligence, statistics, and applied science engaged with theories of uncertainty. The book is supported with the most comprehensive bibliography on belief and uncertainty theory.
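As a concrete anchor for the geometric language, the sketch below tabulates the belief, plausibility, and commonality functions induced by a mass function on a small frame; the vector of belief values over subsets is, up to convention, the point representing the belief function in the belief space studied in Part II. The Python representation is an illustrative choice of mine, not the book's notation.

```python
from itertools import combinations

def powerset(frame):
    return [frozenset(c) for r in range(len(frame) + 1)
            for c in combinations(sorted(frame), r)]

def bel_pl_q(m, frame):
    """Belief, plausibility and commonality functions induced by a
    mass function m (dict: frozenset -> mass) on a finite frame."""
    bel = {a: sum(v for b, v in m.items() if b <= a) for a in powerset(frame)}
    pl = {a: sum(v for b, v in m.items() if b & a) for a in powerset(frame)}
    q = {a: sum(v for b, v in m.items() if a <= b) for a in powerset(frame)}
    return bel, pl, q

frame = {"x", "y", "z"}
m = {frozenset({"x"}): 0.5, frozenset({"x", "y"}): 0.3, frozenset(frame): 0.2}
bel, pl, q = bel_pl_q(m, frame)
a = frozenset({"x", "y"})
print(bel[a], pl[a], q[a])  # -> 0.8 1.0 0.5
```

Belief sums the mass of subsets of a set, plausibility the mass of sets intersecting it, and commonality the mass of its supersets; as the blurb notes, each is an equivalent encoding of the same evidence.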
The basic tools for analyzing macroeconomic fluctuations and policies, applied to concrete issues and presented within an integrated New Keynesian framework. This textbook presents the basic tools for analyzing macroeconomic fluctuations and policies and applies them to contemporary issues. It employs a unified New Keynesian framework for understanding business cycles, major crises, and macroeconomic policies, introducing students to the approach most often used in academic macroeconomic analysis and by central banks and international institutions. The book addresses such topics as how recessions and crises spread; what instruments central banks and governments have to stimulate activity when private demand is weak; and what “unconventional” macroeconomic policies might work when conventional monetary policy loses its effectiveness (as has happened in many countries in the aftermath of the Great Recession). The text introduces the foundations of modern business cycle theory through the notions of aggregate demand and aggregate supply, and then applies the theory to the study of regular business-cycle fluctuations in output, inflation, and employment. It considers conventional monetary and fiscal policies aimed at stabilizing the business cycle, and examines unconventional macroeconomic policies, including forward guidance and quantitative easing, in situations of “liquidity trap” (deep crises in which conventional policies are either ineffective or have very different effects than in normal times). This book is the first to use the New Keynesian framework at the advanced undergraduate level, connecting undergraduate learning not only with the more advanced tools taught at the graduate level but also with the large body of policy-oriented research in academic journals. End-of-chapter problems help students master the material presented.
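For flavour, a toy backward-looking version of the three-equation New Keynesian model (IS curve, Phillips curve, Taylor rule) can be simulated in a few lines. The static expectations and parameter values here are simplifying assumptions of mine, not the book's calibration; the point is only to show how a Taylor rule with an inflation coefficient above one pulls inflation back to target.

```python
sigma, kappa, phi_pi = 1.0, 0.3, 1.5  # illustrative parameters
r_nat, pi_star = 2.0, 2.0             # natural real rate, inflation target

pi = 4.0  # inflation starts above target
for t in range(8):
    i = r_nat + pi + phi_pi * (pi - pi_star)  # Taylor rule (phi_pi > 1)
    gap = -sigma * (i - pi - r_nat)           # IS curve: gap falls with the real-rate gap
    pi += kappa * gap                         # accelerationist Phillips curve
    print(f"t={t}  i={i:.2f}  gap={gap:.2f}  pi={pi:.2f}")
```

Each period the inflation deviation shrinks by the factor 1 - kappa*sigma*phi_pi (here 0.55), illustrating the stabilizing logic of the Taylor principle in the simplest possible setting.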