Entropy and Information Theory

Author: Robert M. Gray

Publisher: Springer Science & Business Media

Published: 2011-01-27

Total Pages: 430

ISBN-10: 1441979700

This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties. New in this edition:

- Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
- Expanded discussion of results from ergodic theory relevant to information theory
- Expanded treatment of B-processes -- processes formed by stationary coding of memoryless sources
- New material on trading off information and distortion, including the Marton inequality
- New material on the properties of optimal and asymptotically optimal source codes
- New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and for d-bar continuous channels.


Information-Spectrum Methods in Information Theory

Author: Te Sun Han

Publisher: Springer Science & Business Media

Published: 2002-10-08

Total Pages: 568

ISBN-13: 9783540435815

From the reviews: "This book nicely complements the existing literature on information and coding theory by concentrating on arbitrary nonstationary and/or nonergodic sources and channels with arbitrarily large alphabets. Even with such generality the authors have managed to successfully reach a highly unconventional but very fertile exposition rendering new insights into many problems." -- MATHEMATICAL REVIEWS


Elements of Information Theory

Author: Thomas M. Cover, Joy A. Thomas

Publisher: John Wiley & Sons

Published: 2006-07-18

Total Pages: 788

ISBN-10: 0471241954

The latest edition of this classic is updated with new problem sets and material. The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points. The Second Edition features:

- Chapters reorganized to improve teaching
- 200 new problems
- New material on source coding, portfolio theory, and feedback capacity
- Updated references

Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
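The entropy and channel-capacity quantities named in the blurb can be sketched in a few lines of Python; the function names here are illustrative, not taken from the book.

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover
    probability eps: C = 1 - H(eps) bits per channel use."""
    return 1.0 - entropy([eps, 1.0 - eps])

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(bsc_capacity(0.0))     # noiseless channel: 1.0 bit/use
```

The binary symmetric channel is the standard first worked example for channel capacity in introductory texts such as this one.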


Mathematical Analysis

Author: R. V. Gamkrelidze

Publisher: Springer Science & Business Media

Published: 2012-12-06

Total Pages: 223

ISBN-10: 1468433032

This volume contains three articles: "Asymptotic methods in the theory of ordinary differential equations" by V. F. Butuzov, A. B. Vasil'eva, and M. V. Fedoryuk, "The theory of best approximation in normed linear spaces" by A. L. Garkavi, and "Dynamical systems with invariant measure" by A. M. Vershik and S. A. Yuzvinskii. The first article surveys the literature on linear and nonlinear singular asymptotic problems, in particular, differential equations with a small parameter. The period covered by the survey is primarily 1962-1967. The second article is devoted to the problem of existence, characterization, and uniqueness of best approximations in Banach spaces. One of the chapters also deals with the problem of the convergence of positive operators, inasmuch as the ideas and methods of this theory are close to those of the theory of best approximation. The survey covers the literature of the decade 1958-1967. The third article is devoted to a comparatively new and rapidly growing branch of mathematics which is closely related to many classical and modern mathematical disciplines. A survey is given of results in entropy theory, classical dynamical systems, ergodic theorems, etc. The results surveyed were primarily published during the period 1956-1967.


Information Theory and Network Coding

Author: Raymond W. Yeung

Publisher: Springer Science & Business Media

Published: 2008-09-10

Total Pages: 592

ISBN-10: 0387792333

This book is an evolution from my book A First Course in Information Theory, published in 2002 when network coding was still in its infancy. The last few years have witnessed the rapid development of network coding into a research field of its own in information science. With its root in information theory, network coding has not only brought about a paradigm shift in network communications at large, but also had significant influence on such specific research fields as coding theory, networking, switching, wireless communications, distributed data storage, cryptography, and optimization theory. While new applications of network coding keep emerging, the fundamental results that lay the foundation of the subject are more or less mature. One of the main goals of this book therefore is to present these results in a unifying and coherent manner. While the previous book focused only on information theory for discrete random variables, the current book contains two new chapters on information theory for continuous random variables, namely the chapter on differential entropy and the chapter on continuous-valued channels. With these topics included, the book becomes more comprehensive and is more suitable to be used as a textbook for a course in an electrical engineering department.
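The paradigm shift the blurb refers to is visible already in the classic butterfly network, where a relay forwards the XOR of two source bits instead of routing either bit alone, so both sinks recover both bits over unit-capacity links. A minimal sketch (names are illustrative):

```python
def butterfly(a, b):
    """Butterfly network with unit-capacity links: sources emit bits
    a and b; the shared middle link carries a XOR b; each sink combines
    the bit it receives directly with the coded bit to recover both."""
    coded = a ^ b           # the relay codes instead of routing
    sink1 = (a, coded ^ a)  # gets a directly, recovers b = (a^b)^a
    sink2 = (coded ^ b, b)  # gets b directly, recovers a = (a^b)^b
    return sink1, sink2

for a in (0, 1):
    for b in (0, 1):
        assert butterfly(a, b) == ((a, b), (a, b))
```

With plain routing the middle link could carry only one of the two bits per use, so one sink would be starved; coding achieves the multicast capacity.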


Abstract Methods in Information Theory

Author: Yūichirō Kakihara

Publisher: World Scientific

Published: 1999

Total Pages: 272

ISBN-13: 9789810237110

Information theory is studied from the following viewpoints: (1) the theory of entropy as a measure of the amount of information; (2) the mathematical structure of information sources (probability measures); and (3) the theory of information channels. Shannon entropy and Kolmogorov-Sinai entropy are defined and their basic properties are examined, where the latter entropy is extended to be a linear functional on a certain set of measures. Ergodic and mixing properties of stationary sources are studied, as are AMS (asymptotically mean stationary) sources. The main purpose of this book is to present information channels in the setting of real and functional analysis as well as probability theory. Ergodic channels are characterized in various manners. Mixing and AMS channels are also considered in detail, with some illustrations. A few other aspects of information channels, including measurability, approximation, and noncommutative extensions, are also discussed.
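For the simplest stationary source, a Bernoulli shift, the Kolmogorov-Sinai entropy mentioned above coincides with the Shannon entropy of the one-symbol distribution, since the block entropy H(X_1,...,X_n) of an i.i.d. source is exactly n times the per-symbol entropy. A small numerical sketch of this identity (function names are illustrative):

```python
import math
from itertools import product

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def block_entropy_rate(p, n):
    """H(X_1,...,X_n)/n for an i.i.d. Bernoulli(p) source.  For the
    corresponding Bernoulli shift this equals the Kolmogorov-Sinai
    entropy, i.e. the per-symbol Shannon entropy H(p, 1-p)."""
    probs = [p ** sum(w) * (1 - p) ** (n - sum(w))
             for w in product((0, 1), repeat=n)]
    return shannon_entropy(probs) / n

print(block_entropy_rate(0.3, 5))  # equals shannon_entropy([0.3, 0.7])
```

The equality here is exact for every block length n because the source has no memory; for sources with memory the block entropy rate only converges to the entropy rate as n grows.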


Feynman-Kac Formulae

Author: Pierre Del Moral

Publisher: Springer Science & Business Media

Published: 2012-12-06

Total Pages: 567

ISBN-10: 1468493930

This text takes readers in a clear and progressive format from simple to recent and advanced topics in pure and applied probability, such as contraction and annealed properties of nonlinear semigroups, functional entropy inequalities, empirical process convergence, increasing propagations of chaos, central limit and Berry-Esseen type theorems, as well as large deviation principles for strong topologies on path-distribution spaces. Topics also include a body of powerful branching and interacting particle methods.