We present a new methodology for decision-making support based on belief functions, enabled by a recently developed canonical decomposition of dichotomous basic belief assignments (BBAs). This decomposition, based on the proportional conflict redistribution rule no. 5 (PCR5), always exists and is unique. The PCR5-based decomposition circumvents the exponential complexity of fusing BBAs directly with the PCR5 rule, so that many sources of evidence can be fused quickly. The method we propose in this paper provides both a decision and an estimate of the quality of that decision, which is appealing for decision-making support systems.
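For orientation, here is a minimal Python sketch of the direct PCR5 fusion of two dichotomous BBAs, i.e., the operation whose repeated pairwise application becomes costly and which the canonical decomposition is designed to sidestep. The frame labels A, notA, A_or_notA and the dictionary encoding are illustrative assumptions, not the paper's notation.

```python
def pcr5_dichotomous(m1, m2):
    """Fuse two BBAs defined over the dichotomous frame {A, notA} with PCR5."""
    A, NA, T = "A", "notA", "A_or_notA"
    # Conjunctive part: products of focal elements with non-empty intersection.
    f = {
        A:  m1[A] * m2[A] + m1[A] * m2[T] + m1[T] * m2[A],
        NA: m1[NA] * m2[NA] + m1[NA] * m2[T] + m1[T] * m2[NA],
        T:  m1[T] * m2[T],
    }
    # PCR5: each partial conflict m1(X)*m2(Y), with X and Y disjoint, is
    # redistributed back to X and Y proportionally to the masses involved.
    for x, y in ((A, NA), (NA, A)):
        c = m1[x] * m2[y]
        if c > 0.0:
            f[x] += m1[x] * c / (m1[x] + m2[y])
            f[y] += m2[y] * c / (m1[x] + m2[y])
    return f

# Example: the fused masses sum to 1, with no mass left on the empty set.
m1 = {"A": 0.6, "notA": 0.1, "A_or_notA": 0.3}
m2 = {"A": 0.5, "notA": 0.2, "A_or_notA": 0.3}
print(pcr5_dichotomous(m1, m2))
```

Since PCR5 is not associative, fusing many sources directly requires costly combination over all of them at once; that cost is the bottleneck the canonical decomposition addresses.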
This volume constitutes the refereed proceedings of the 7th International Conference on Modelling and Development of Intelligent Systems, MDIS 2020, held in Sibiu, Romania, in October 2020. Due to the COVID-19 pandemic, the conference was held online. The 25 revised full papers presented in the volume were carefully reviewed and selected from 57 submissions. The papers are organized in topical sections on evolutionary computing; intelligent systems for decision support; machine learning; mathematical models for development of intelligent systems; modelling and optimization of dynamic systems; ontology engineering.
This fifth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions from researchers working in different fields of application and in mathematics, and is available in open access. The contributions collected in this volume were either published or presented after the dissemination of the fourth volume in 2015 (available at fs.unm.edu/DSmT-book4.pdf or www.onera.fr/sites/default/files/297/2015-DSmT-Book4.pdf) in international conferences, seminars, workshops and journals, or are new. The contributions in each part of this volume are ordered chronologically. The first part of this book presents theoretical advances on DSmT, dealing mainly with modified Proportional Conflict Redistribution (PCR) rules of combination with degree of intersection, coarsening techniques, interval calculus for PCR thanks to set inversion via interval analysis (SIVIA), rough set classifiers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of the (quasi-)vacuous belief assignment in the fusion of sources of evidence, with their Matlab codes. Because more applications of DSmT have emerged since the fourth book appeared in 2015, the second part of this volume covers selected applications of DSmT, mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender systems, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision, human activity recognition, storm prediction, data association for autonomous vehicles, identification of maritime vessels, fusion of support vector machines (SVM), the Silx-Furtif RUST code library for information fusion including PCR rules, and networks for ship classification. Finally, the third part presents contributions related to belief functions in general, published or presented over the years since 2015. These contributions concern decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, generalization of Bayes' theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, the negator of a belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classification, and hybrid techniques mixing deep learning with belief functions.
This book presents recent advances in computational optimization. Everyday life is unthinkable without optimization: we try to minimize our effort and to maximize the achieved profit. Many real-world and industrial problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks. The book is a comprehensive collection of extended contributions from the Workshops on Computational Optimization 2020. It covers important real problems such as modeling of physical processes, workforce planning, parameter settings for controlling different processes, transportation problems, wireless sensor networks, machine scheduling, air pollution modeling, solving multiple integrals and systems of differential equations that describe real processes, and solving engineering problems, and it shows how to develop algorithms for them based on new intelligent methods such as evolutionary computation, ant colony optimization, constraint programming and others, demonstrating how real-world problems arising in engineering, economics and other domains can be formulated and solved as optimization problems.
This book constitutes the thoroughly refereed proceedings of the Third International Conference on Belief Functions, BELIEF 2014, held in Oxford, UK, in September 2014. The 47 revised full papers presented in this book were carefully reviewed and selected from 56 submissions. The papers are organized in topical sections on belief combination; machine learning; applications; theory; networks; information fusion; data association; and geometry.
The theory of belief functions is widely used to combine data from multiple sources. Different evidence combination rules have been proposed in this framework according to the properties of the sources to be combined. However, most of these combination rules are not efficient when the number of sources is large. This is due either to their computational complexity or to the existence of an absorbing element, such as the total-conflict mass function for conjunctive-based rules applied to unreliable evidence. In this paper, based on the assumption that the majority of sources are reliable, a combination rule for a large number of sources is proposed using a simple idea: the more ideas the sources have in common, the more reliable these sources are supposed to be.
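As a rough illustration of that idea, and not the rule actually proposed in the paper, the sketch below discounts each source by its mean similarity to the other sources before an unnormalized conjunctive combination. The L1-based similarity and the Shafer discounting scheme are placeholder assumptions chosen for brevity.

```python
from itertools import product

def conjunctive(m1, m2):
    """Unnormalized conjunctive rule; frozenset() collects the conflicting mass."""
    out = {}
    for (x, vx), (y, vy) in product(m1.items(), m2.items()):
        z = x & y
        out[z] = out.get(z, 0.0) + vx * vy
    return out

def discount(m, alpha, frame):
    """Shafer discounting: keep a fraction alpha of m, move the rest to the frame."""
    d = {x: alpha * v for x, v in m.items()}
    d[frame] = d.get(frame, 0.0) + 1.0 - alpha
    return d

def similarity(m1, m2):
    """Crude similarity in [0, 1]: one minus half the L1 distance between masses."""
    keys = set(m1) | set(m2)
    return 1.0 - 0.5 * sum(abs(m1.get(k, 0.0) - m2.get(k, 0.0)) for k in keys)

def fuse_majority(bbas, frame):
    """Discount each source by its mean similarity to the others, then combine."""
    n = len(bbas)  # assumes n >= 2
    rel = [sum(similarity(mi, mj) for j, mj in enumerate(bbas) if j != i) / (n - 1)
           for i, mi in enumerate(bbas)]
    fused = discount(bbas[0], rel[0], frame)
    for m, a in zip(bbas[1:], rel[1:]):
        fused = conjunctive(fused, discount(m, a, frame))
    return fused

# Example: an outlier source disagreeing with the majority gets discounted
# toward the vacuous BBA, so it cannot absorb the result into total conflict.
frame = frozenset({"a", "b", "c"})
good = {frozenset({"a"}): 0.7, frame: 0.3}
outlier = {frozenset({"b"}): 1.0}
print(fuse_majority([good, good, outlier], frame))
```

Without the discounting step, a single source assigning all its mass to a hypothesis the others exclude would drive the conjunctive combination to total conflict regardless of how many reliable sources agree, which is the absorbing-element problem the abstract describes.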
An introduction to decision making under uncertainty from a computational perspective, covering both theory and applications ranging from speech recognition to airborne collision avoidance. Many important problems involve decision making under uncertainty, that is, choosing actions based on often imperfect observations, with unknown outcomes. Designers of automated decision support systems must take into account the various sources of uncertainty while balancing the multiple objectives of the system. This book provides an introduction to the challenges of decision making under uncertainty from a computational perspective. It presents both the theory behind decision-making models and algorithms and a collection of example applications ranging from speech recognition to aircraft collision avoidance. Focusing on two methods for designing decision agents, planning and reinforcement learning, the book covers: probabilistic models, introducing Bayesian networks as a graphical model that captures probabilistic relationships between variables; utility theory as a framework for understanding optimal decision making under uncertainty; Markov decision processes as a method for modeling sequential problems; model uncertainty; state uncertainty; and cooperative decision making involving multiple interacting agents. A series of applications shows how the theoretical concepts can be applied to systems for attribute-based person search, speech applications, collision avoidance, and unmanned aircraft persistent surveillance. Decision Making Under Uncertainty unifies research from different communities using consistent notation, and is accessible to students and researchers across engineering disciplines who have some prior exposure to probability theory and calculus. It can be used as a text for advanced undergraduate and graduate students in fields including computer science, aerospace and electrical engineering, and management science. It will also be a valuable professional reference for researchers in a variety of disciplines.
An introduction to the techniques and algorithms of the newest field in robotics. Probabilistic robotics is a new and growing area in robotics, concerned with perception and control in the face of uncertainty. Building on the field of mathematical statistics, probabilistic robotics endows robots with a new level of robustness in real-world situations. This book introduces the reader to a wealth of techniques and algorithms in the field. All algorithms are based on a single overarching mathematical foundation. Each chapter provides example implementations in pseudocode, detailed mathematical derivations, discussions from a practitioner's perspective, and extensive lists of exercises and class projects. The book's Web site, www.probabilistic-robotics.org, has additional material. The book is relevant for anyone involved in robotic software development and scientific research. It will also be of interest to applied statisticians and engineers dealing with real-world sensor data.
This book presents a contemporary view of the role of information quality in information fusion and decision making, and provides a formal foundation and the implementation strategies required for dealing with insufficient information quality in building fusion systems for decision making. Information fusion is the process of gathering, processing, and combining large amounts of information from multiple and diverse sources, ranging from physical sensors to human intelligence reports and social media. These data and information may be unreliable, of low fidelity, of insufficient resolution, contradictory, fake and/or redundant. Sources may provide unverified reports obtained from other sources, resulting in correlations and biases. The success of the fusion process depends on how well the knowledge produced by the processing chain represents reality, which in turn depends on how adequate the data are, how good and adequate the models used are, and how accurate, appropriate or applicable prior and contextual knowledge is. By offering contributions by leading experts, this book provides an unparalleled understanding of the problem of information quality in information fusion and decision making for researchers and professionals in the field.