The Dempster–Shafer (DS) theory of evidence combines pieces of evidence using a single parameter. The evidential reasoning (ER) approach extends DS theory to combine evidence with two parameters, weights and reliabilities. However, the ER approach exhibits three problematic aspects: reliability dependence, unreliability effectiveness, and intergeneration inconsistency.
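For concreteness, here is a minimal sketch of the classical Dempster combination rule that the ER approach generalizes. The frame, mass values, and function name are illustrative, not taken from the cited work:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    with Dempster's rule: intersect focal elements, multiply masses,
    and renormalize by 1 - K, where K is the mass of total conflict."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence; rule undefined")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Two hypothetical sources reporting on the frame {a, b}
m1 = {frozenset("a"): 0.6, frozenset("ab"): 0.4}
m2 = {frozenset("b"): 0.3, frozenset("ab"): 0.7}
print(dempster_combine(m1, m2))
# {a}: 0.512, {b}: 0.146, {a,b}: 0.341 (approx.)
```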
Both in science and in practical affairs we reason by combining facts only inconclusively supported by evidence. Building on an abstract understanding of this process of combination, this book constructs a new theory of epistemic probability. The theory draws on the work of A. P. Dempster but diverges from Dempster's viewpoint by identifying his "lower probabilities" as epistemic probabilities and taking his rule for combining "upper and lower probabilities" as fundamental. The book opens with a critique of the well-known Bayesian theory of epistemic probability. It then proceeds to develop an alternative to the additive set functions and the rule of conditioning of the Bayesian theory: set functions that need only be what Choquet called "monotone of order infinity," and Dempster's rule for combining such set functions. This rule, together with the idea of "weights of evidence," leads to both an extensive new theory and a better understanding of the Bayesian theory. The book concludes with a brief treatment of statistical inference and a discussion of the limitations of epistemic probability. Appendices contain mathematical proofs, which are relatively elementary and seldom depend on mathematics more advanced than the binomial theorem.
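For reference, the condition Choquet called "monotone of order infinity," which characterizes belief functions, can be stated as follows: a set function $\mathrm{Bel}$ on a frame $\Theta$, with $\mathrm{Bel}(\emptyset)=0$ and $\mathrm{Bel}(\Theta)=1$, must satisfy, for every $n \ge 1$ and all subsets $A_1,\dots,A_n \subseteq \Theta$,

\[
\mathrm{Bel}\Big(\bigcup_{i=1}^{n} A_i\Big) \;\ge\; \sum_{\emptyset \neq I \subseteq \{1,\dots,n\}} (-1)^{|I|+1}\, \mathrm{Bel}\Big(\bigcap_{i \in I} A_i\Big).
\]

For $n=2$ this reduces to the familiar supermodularity condition $\mathrm{Bel}(A \cup B) \ge \mathrm{Bel}(A) + \mathrm{Bel}(B) - \mathrm{Bel}(A \cap B)$; requiring it for all $n$ is strictly stronger.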
In Dempster–Shafer evidence theory, the basic probability assignment (BPA) can effectively represent and process uncertain information. How to transform the BPA of uncertain information into a decision probability remains an open problem. In light of this issue, we develop a novel decision probability transformation method to realize the transition from belief decision to probability decision within the framework of Dempster–Shafer evidence theory. The proposed method considers the transformation of BPAs with multi-subset focal elements from the perspective of the belief interval, and applies the continuous interval argument ordered weighted average operator to quantify the information contained in the belief interval of each singleton. We then present an approach to calculate the support degree of each singleton based on this quantified information, and the BPA of multi-subset focal elements is allocated according to these support degrees. Furthermore, we introduce the concept of probabilistic information content, which is used to evaluate the performance of decision probability transformation methods. Finally, several numerical examples and a practical application demonstrate the rationality and accuracy of the proposed method.
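The paper's operator-based support-degree computation is not reproduced here, but the belief intervals it starts from are straightforward to compute. A small sketch, with hypothetical mass values:

```python
def belief_interval(m, x):
    """Return (Bel({x}), Pl({x})) for singleton x given a mass function m
    (dict mapping frozenset -> mass). Bel sums masses of focal elements
    contained in {x}; Pl sums masses of focal elements intersecting {x}."""
    singleton = frozenset([x])
    bel = sum(mass for focal, mass in m.items() if focal <= singleton)
    pl = sum(mass for focal, mass in m.items() if focal & singleton)
    return bel, pl

m = {frozenset("a"): 0.5, frozenset("bc"): 0.3, frozenset("abc"): 0.2}
for x in "abc":
    print(x, belief_interval(m, x))
# a: (0.5, 0.7)   b: (0.0, 0.5)   c: (0.0, 0.5)
```

Any decision probability transformation must place each singleton's probability inside its interval [Bel, Pl]; methods differ in how they weight the interval's endpoints when redistributing multi-subset mass.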
This is a collection of classic research papers on the Dempster-Shafer theory of belief functions. The book is the authoritative reference in the field of evidential reasoning and an important archival reference in a wide range of areas including uncertainty reasoning in artificial intelligence and decision making in economics, engineering, and management. The book includes a foreword reflecting the development of the theory in the last forty years.
Dempster-Shafer theory offers an alternative to traditional probability theory for the mathematical representation of uncertainty. The significant innovation of this framework is that it allows for the allocation of a probability mass to sets or intervals. Dempster-Shafer theory does not require an assumption regarding the probability of the individual constituents of the set or interval. This is a potentially valuable tool for the evaluation of risk and reliability in engineering applications when it is not possible to obtain a precise measurement from experiments, or when knowledge is obtained from expert elicitation. An important aspect of this theory is the combination of evidence obtained from multiple sources and the modeling of conflict between them. This report surveys a number of possible combination rules for Dempster-Shafer structures and provides examples of the implementation of these rules for discrete and interval-valued data.
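As one example of the alternative rules such surveys cover, Yager's modified rule avoids Dempster's renormalization by moving the conflicting mass to the whole frame. A minimal sketch, with made-up frame and masses:

```python
def yager_combine(m1, m2, frame):
    """Yager's modified combination rule: intersect focal elements and
    multiply masses as in Dempster's rule, but assign the conflicting
    mass to the whole frame (total ignorance) instead of renormalizing."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    theta = frozenset(frame)
    combined[theta] = combined.get(theta, 0.0) + conflict
    return combined

m1 = {frozenset("a"): 0.6, frozenset("ab"): 0.4}
m2 = {frozenset("b"): 0.3, frozenset("ab"): 0.7}
print(yager_combine(m1, m2, "ab"))
# {a}: 0.42, {b}: 0.12, {a,b}: 0.46 (0.28 direct + 0.18 conflict)
```

The design trade-off is that Yager's rule treats conflict as ignorance rather than rescaling the agreeing evidence, so highly conflicting sources yield a more cautious combined result.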
Information is precious. It reduces our uncertainty in making decisions. Knowledge about the outcome of an uncertain event gives the possessor an advantage. It changes the course of lives, nations, and history itself. Information is the food of Maxwell's demon. His power comes from knowing which particles are hot and which particles are cold. His existence was paradoxical to classical physics and only the realization that information too was a source of power led to his taming. Information has recently become a commodity, traded and sold like orange juice or hog bellies. Colleges give degrees in information science and information management. Technology of the computer age has provided access to information in overwhelming quantity. Information has become something worth studying in its own right. The purpose of this volume is to introduce key developments and results in the area of generalized information theory, a theory that deals with uncertainty-based information within mathematical frameworks that are broader than classical set theory and probability theory. The volume is organized as follows.
The emerging technology of multisensor data fusion has a wide range of applications, both in Department of Defense (DoD) areas and in the civilian arena. The techniques of multisensor data fusion draw from an equally broad range of disciplines, including artificial intelligence, pattern recognition, and statistical estimation.
Ever-increasing public demand and the introduction of national and international legislation on the safety assessment of potentially dangerous plants require that regulatory bodies and industrial organisations devote a correspondingly increased effort to collecting reliability data in order to produce safety analyses. Reliability data are also needed to assess the availability of plants and services and to improve the quality of production processes, in particular to meet the needs of plant operators and/or designers regarding maintenance planning, production availability, etc. The need for an educational effort in the field of data acquisition and processing has been stressed within the framework of EuReDatA, an association of organisations operating reliability data banks. The association aims to promote data exchange and the pooling of data between organisations and to encourage the adoption of compatible standards and basic definitions for a consistent exchange of reliability data; such basic definitions are considered essential for improving data quality. To cover issues directly linked to these areas, ample space is devoted to the definition of failure events, common-cause and human-error data, feedback of operational and disturbance data, event data analysis, lifetime distributions, cumulative distribution functions, density functions, Bayesian inference methods, multivariate analysis, fuzzy sets and possibility theory, etc.
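As a tiny illustration of the lifetime-distribution material listed above, the constant-hazard (exponential) model estimates a failure rate from observed failure times and yields a reliability (survivor) function. The failure times below are invented for the example:

```python
import math

# Hypothetical failure times in hours; lam is the maximum-likelihood
# estimate of the constant failure rate under an exponential model.
failure_times = [120.0, 340.0, 95.0, 410.0, 230.0]
lam = len(failure_times) / sum(failure_times)

def reliability(t, lam=lam):
    """Survivor function R(t) = 1 - F(t) = exp(-lam * t)."""
    return math.exp(-lam * t)

print(f"estimated rate: {lam:.5f} per hour")   # about 0.00418 per hour
print(f"R(100 h) = {reliability(100.0):.3f}")  # about 0.658
```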