New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

Author: Leandro Pardo

Publisher: MDPI

Published: 2019-05-20

Total Pages: 344

ISBN-10: 3038979368


This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics and Rao's score statistics, share several optimal asymptotic properties but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. In particular, this book presents a robust version of the classical Wald test, for simple and composite null hypotheses in general parametric models, based on minimum divergence estimators.
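To give a concrete feel for what a minimum divergence estimator does, the sketch below fits a normal location-scale model by minimizing the density power divergence of Basu et al., one widely studied member of the divergence families this line of work considers. The model, the tuning constant ALPHA, and the contaminated sample are illustrative assumptions, not material taken from the book.

```python
# Minimal sketch (assumptions noted above): minimum density power divergence
# (DPD) estimation of a normal mean and standard deviation, one common
# instance of a minimum divergence estimator.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

ALPHA = 0.5  # robustness tuning constant; alpha -> 0 recovers the MLE

def dpd_objective(params, x, alpha=ALPHA):
    """Empirical density power divergence objective for a N(mu, sigma^2) model."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # optimize log(sigma) to keep sigma positive
    f = norm.pdf(x, mu, sigma)
    # Closed form of the integral of f^(1+alpha) for a normal density:
    # (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha).
    integral_term = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return integral_term - (1 + 1 / alpha) * np.mean(f**alpha)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 1.0, 5)])  # 5% outliers

res = minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x,), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"DPD estimate:        mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
print(f"Sample mean/std dev: mu = {x.mean():.3f}, sigma = {x.std():.3f}")
```

With the contaminating points present, the DPD fit should stay close to the bulk of the data, while the sample mean and standard deviation are pulled toward the outliers; this is the robustness behaviour the blurb alludes to.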


New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

Author: Leandro Pardo

Publisher:

Published: 2019

Total Pages: 344

ISBN-13: 9783038979371


This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics and Rao's score statistics, share several optimal asymptotic properties but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. In particular, this book presents a robust version of the classical Wald test, for simple and composite null hypotheses in general parametric models, based on minimum divergence estimators.


Statistical Inference Based on Divergence Measures

Author: Leandro Pardo

Publisher: CRC Press

Published: 2018-11-12

Total Pages: 513

ISBN-10: 1420034812


The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, although divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach.


Concepts and Recent Advances in Generalized Information Measures and Statistics

Author: Andres M. Kowalski, Raul D. Rossignoli and Evaldo M. F. Curado

Publisher: Bentham Science Publishers

Published: 2013-12-13

Total Pages: 432

ISBN-10: 1608057607


Since the introduction of the information measure widely known as Shannon entropy, quantifiers based on information theory and concepts such as entropic forms and statistical complexities have proven to be useful in diverse scientific research fields. This book contains introductory tutorials suitable for the general reader, together with chapters dedicated to the basic concepts of the most frequently employed information measures or quantifiers and their recent applications to different areas, including physics, biology, medicine, economics, communication and social sciences. As these quantifiers are powerful tools for the study of time series and data series independently of their sources, this book will be useful to all those doing research connected with information analysis. The tutorials in this volume are written at a broadly accessible level, and readers will have the opportunity to acquire the knowledge necessary to use information-theoretic tools in their field of interest.
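As a small, self-contained illustration of the "entropic forms" mentioned above, the snippet below evaluates the Shannon entropy together with its Rényi and Tsallis generalizations for a discrete distribution; the probability vector and the order parameters are made-up examples, not drawn from the book.

```python
# Hedged sketch: Shannon entropy and two generalized entropic forms
# (Renyi and Tsallis) for a discrete probability vector.
import numpy as np

def shannon_entropy(p):
    p = p[p > 0]                          # drop zero cells (0 * log 0 = 0)
    return -np.sum(p * np.log(p))

def renyi_entropy(p, alpha):
    if np.isclose(alpha, 1.0):
        return shannon_entropy(p)         # Renyi -> Shannon as alpha -> 1
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    if np.isclose(q, 1.0):
        return shannon_entropy(p)         # Tsallis -> Shannon as q -> 1
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.array([0.5, 0.25, 0.15, 0.10])     # illustrative distribution
print(f"Shannon:            {shannon_entropy(p):.4f} nats")
print(f"Renyi  (alpha = 2): {renyi_entropy(p, 2.0):.4f} nats")
print(f"Tsallis (q = 2):    {tsallis_entropy(p, 2.0):.4f}")
```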


Statistical Inference Based on Divergence Measures

Author: Leandro Pardo

Publisher: Chapman and Hall/CRC

Published: 2005-10-10

Total Pages: 512

ISBN-13: 9781584886006


The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, although divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, presenting the interesting possibility of introducing alternative test statistics to classical ones like Wald, Rao, and likelihood ratio. Each chapter concludes with exercises that clarify the theoretical results and present additional results that complement the main discussions. Clear, comprehensive, and logically developed, this book offers a unique opportunity to gain not only a new perspective on some standard statistics problems, but also the tools to put it into practice.
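Readers who want to try a phi-divergence test statistic immediately can use the Cressie-Read power-divergence family, a standard one-parameter subfamily of the phi-divergence statistics discussed above, which SciPy exposes directly; the cell counts and null probabilities below are hypothetical.

```python
# Hedged sketch: multinomial goodness-of-fit testing with the Cressie-Read
# power-divergence family, one concrete instance of a phi-divergence
# test statistic.  Counts and null model are made up for illustration.
import numpy as np
from scipy.stats import power_divergence

observed = np.array([48, 35, 22, 15])             # observed cell counts
null_probs = np.array([0.40, 0.30, 0.20, 0.10])   # multinomial null hypothesis
expected = observed.sum() * null_probs

# lambda_ selects the member of the family:
#   1.0 -> Pearson chi-square, 0.0 -> likelihood ratio (G^2), 2/3 -> Cressie-Read
for lam, name in [(1.0, "Pearson"), (0.0, "likelihood ratio"), (2 / 3, "Cressie-Read")]:
    stat, pval = power_divergence(observed, f_exp=expected, lambda_=lam)
    print(f"{name:>16}: statistic = {stat:.3f}, p-value = {pval:.3f}")
```

All members of the family share the same chi-square limiting distribution under the null hypothesis, so for well-behaved tables the three p-values typically agree closely.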


Entropy and Information Theory

Author: Robert M. Gray

Publisher: Springer Science & Business Media

Published: 2011-01-27

Total Pages: 430

ISBN-10: 1441979700


This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, as well as information and distortion measures and their properties. New in this edition: expanded treatment of stationary or sliding-block codes and their relations to traditional block codes; expanded discussion of results from ergodic theory relevant to information theory; expanded treatment of B-processes, the processes formed by stationary coding of memoryless sources; new material on trading off information and distortion, including the Marton inequality; new material on the properties of optimal and asymptotically optimal source codes; and new material on the relationships of source coding and rate-constrained simulation or modeling of random processes. Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.


Divergence Measures

Author: Igal Sason

Publisher: MDPI AG

Published: 2022-06

Total Pages: 256

ISBN-13: 9783036543321


Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled "Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems", includes eight original contributions and is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that readers will find this Special Issue of interest and that it will stimulate further research in the study of the mathematical foundations and applications of divergence measures.
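Both generalizations named in the blurb are straightforward to evaluate for discrete distributions; the snippet below computes the Rényi divergence of order alpha and the generic f-divergence D_f(P||Q) = sum_i q_i f(p_i/q_i), instantiated with the convex generators of the Kullback-Leibler and chi-square divergences. The two distributions are assumptions chosen only for illustration.

```python
# Hedged sketch: Renyi divergence of order alpha and the general
# f-divergence recipe D_f(P||Q) = sum_i q_i * f(p_i / q_i).
import numpy as np

def renyi_divergence(p, q, alpha):
    if np.isclose(alpha, 1.0):                   # alpha -> 1 recovers the KL divergence
        return np.sum(p * np.log(p / q))
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def f_divergence(p, q, f):
    return np.sum(q * f(p / q))

p = np.array([0.4, 0.4, 0.2])                    # illustrative distributions
q = np.array([0.5, 0.3, 0.2])

print(f"Renyi (alpha = 0.5): {renyi_divergence(p, q, 0.5):.5f}")
print(f"KL    (f = t log t): {f_divergence(p, q, lambda t: t * np.log(t)):.5f}")
print(f"chi^2 (f = (t-1)^2): {f_divergence(p, q, lambda t: (t - 1.0) ** 2):.5f}")
```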


Information Theory and Statistics

Author: Solomon Kullback

Publisher: Courier Corporation

Published: 2012-09-11

Total Pages: 436

ISBN-10: 0486142043


This highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. It includes numerous worked examples and problems, together with references, a glossary, and an appendix. Reprint of the 1968 second, revised edition.
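As a tiny illustration of how a logarithmic information measure enters hypothesis testing, the simulation below checks that, under the null hypothesis, the average log-likelihood ratio between two simple hypotheses concentrates around the Kullback-Leibler divergence D(P0 || P1); the two Bernoulli hypotheses are hypothetical and not taken from the book.

```python
# Hedged sketch: for two simple hypotheses, the per-sample log-likelihood
# ratio has mean D(P0 || P1) under H0, linking a logarithmic information
# measure to hypothesis testing.
import numpy as np

p0, p1 = 0.3, 0.5                      # H0: Bernoulli(0.3), H1: Bernoulli(0.5)
kl = p0 * np.log(p0 / p1) + (1 - p0) * np.log((1 - p0) / (1 - p1))

rng = np.random.default_rng(1)
x = rng.binomial(1, p0, size=100_000)  # data generated under H0
llr = np.where(x == 1, np.log(p0 / p1), np.log((1 - p0) / (1 - p1)))

print(f"D(P0 || P1)               = {kl:.5f} nats")
print(f"mean log-likelihood ratio = {llr.mean():.5f} nats")
```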


Universal Estimation of Information Measures for Analog Sources

Author: Qing Wang

Publisher: Now Publishers Inc

Published: 2009-05-26

Total Pages: 104

ISBN-10: 1601982305


Entropy, mutual information and divergence measure the randomness, dependence and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications in, among other fields, probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures that does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several nonparametric algorithms have been proposed to estimate information measures. Universal Estimation of Information Measures for Analog Sources presents a comprehensive survey of universal estimation of information measures for memoryless analog (real-valued or real vector-valued) sources, with an emphasis on the estimation of mutual information and divergence and their applications. The book reviews the consistency of the universal algorithms and the corresponding sufficient conditions, as well as their speed of convergence. It provides a comprehensive review of an increasingly important topic in Information Theory and will be of interest to students, practitioners and researchers working in the field.
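For a flavour of the nearest-neighbor algorithms surveyed in monographs like this one, the sketch below implements a simple k = 1 nearest-neighbor estimator of the divergence D(P || Q) between two analog sources, in the spirit of the Wang-Kulkarni-Verdú construction; the Gaussian samples, sample sizes and the exact form used here are illustrative assumptions rather than a statement of the book's algorithms.

```python
# Hedged sketch (assumptions noted above): a 1-nearest-neighbor estimator of
# D(P || Q) from samples of two analog sources, needing no parametric model
# of either density.
import numpy as np
from scipy.spatial import cKDTree

def knn_divergence(x, y):
    """Estimate D(P || Q) in nats from x ~ P (shape n x d) and y ~ Q (shape m x d)."""
    n, d = x.shape
    m = y.shape[0]
    # rho_i: distance from x_i to its nearest neighbor among the other x's
    rho = cKDTree(x).query(x, k=2)[0][:, 1]
    # nu_i: distance from x_i to its nearest neighbor in the y sample
    nu = cKDTree(y).query(x, k=1)[0]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(2000, 1))   # sample from P = N(0, 1)
y = rng.normal(1.0, 1.0, size=(4000, 1))   # sample from Q = N(1, 1)

# For these two Gaussians the true divergence is 0.5 nats.
print(f"k-NN divergence estimate: {knn_divergence(x, y):.3f} nats")
```

Estimators of this kind are consistent under mild conditions on the underlying densities, which is exactly the type of result the monograph reviews.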


Information Theory and Statistical Learning

Author: Frank Emmert-Streib

Publisher: Springer Science & Business Media

Published: 2009

Total Pages: 443

ISBN-10: 0387848150


This interdisciplinary text offers theoretical and practical results of information-theoretic methods used in statistical learning. It presents a comprehensive overview of the many different methods that have been developed in numerous contexts.