Transactions of the Seventh Prague Conference on Information Theory, Statistical Decision Functions, Random Processes and of the 1974 European Meeting of Statisticians


Author: J. Kozesnik

Publisher: Springer Science & Business Media

Published: 2012-12-06

Total Pages: 577

ISBN-13: 9401099103


The Prague Conferences on Information Theory, Statistical Decision Functions, and Random Processes have been organized every three years since 1956. Over the eighteen years of their existence, the Prague Conferences grew from a platform for presenting the results of a small group of researchers into a full probabilistic congress, as documented by the increasing numbers of participants and presented papers. The importance of the Seventh Prague Conference was underscored by the fact that it was held jointly with the eighth European Meeting of Statisticians. This joint meeting took place from August 18 to 23, 1974 at the Technical University of Prague. The Conference was organized by the Institute of Information Theory and Automation of the Czechoslovak Academy of Sciences and was sponsored by the Czechoslovak Academy of Sciences, by the Committee for the European Region of the Institute of Mathematical Statistics, and by the International Association for Statistics in Physical Sciences. More than 300 specialists from 25 countries participated in the Conference. In 57 sessions, 164 papers (including 17 invited papers) were read, 128 of which are published in the present two volumes of the Transactions of the Conference. Volume A includes papers related mainly to probability theory and stochastic processes, whereas the papers of Volume B concern mainly statistics and information theory.


A First Course in Information Theory


Author: Raymond W. Yeung

Publisher: Springer Science & Business Media

Published: 2002

Total Pages: 440

ISBN-13: 9780306467912


An introduction to information theory for discrete random variables. Classical topics and fundamental tools are presented along with three selected advanced topics. Yeung (Chinese U. of Hong Kong) presents chapters on information measures, zero-error data compression, weak and strong typicality, the I-measure, Markov structures, channel capacity, rate distortion theory, Blahut-Arimoto algorithms, information inequalities, and Shannon-type inequalities. The advanced topics included are single-source network coding, multi-source network coding, and entropy and groups. Annotation copyrighted by Book News, Inc., Portland, OR.
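As a quick illustration (not taken from the book itself) of the information measures such texts begin with, here is a minimal Python sketch of Shannon entropy for a discrete random variable:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits.

    `probs` is a list of probabilities for a discrete distribution;
    zero-probability outcomes contribute nothing, by convention.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty;
# a deterministic outcome carries none.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([1.0]))       # 0.0
```

The function name and its list-of-probabilities interface are illustrative choices, not the book's notation.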


Topics in Statistical Information Theory


Author: Solomon Kullback

Publisher: Springer Science & Business Media

Published: 2013-12-01

Total Pages: 169

ISBN-13: 1461580803


The relevance of information theory to statistical theory and its applications to stochastic processes is the unifying influence in these TOPICS. The integral representation of discrimination information is presented with a review of the various approaches used in the literature, and is also developed herein using intrinsically information-theoretic methods. Log likelihood ratios associated with various stochastic processes are computed by an application of minimum discrimination information estimates. Linear discriminant functionals are used in the information-theoretic analysis of a variety of stochastic processes. Sections are numbered serially within each chapter, with a decimal notation for subsections. Equations, examples, theorems, and lemmas are numbered serially within each section with a decimal notation: the digits to the left of the decimal point give the section, and the digits to the right give the serial number within the section. When reference is made to a section, equation, example, theorem, or lemma within the same chapter, only the section or equation number, etc., is given. When the reference is to a section, equation, etc., in a different chapter, the chapter number is given as well. References to the bibliography are by the author's name followed by the year of publication in parentheses. The transpose of a matrix is denoted by a prime; thus one-row matrices are denoted by primes as the transposes of one-column matrices (vectors).
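The discrimination information central to Kullback's treatment is, for discrete distributions, the Kullback-Leibler divergence. A minimal sketch (my own illustration, not the book's notation) of this quantity:

```python
import math

def discrimination_information(p, q):
    """Discrimination information I(p:q) = sum p_i * log(p_i / q_i), in nats.

    `p` and `q` are discrete distributions over the same outcomes;
    terms with p_i = 0 contribute nothing, by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# The divergence vanishes when the distributions coincide and is
# positive otherwise (it measures how well a sample discriminates
# between the two hypotheses).
print(discrimination_information([0.5, 0.5], [0.5, 0.5]))  # 0.0
print(discrimination_information([0.9, 0.1], [0.5, 0.5]))
```

Note that, unlike a metric, this quantity is not symmetric in `p` and `q`.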


Information-Spectrum Methods in Information Theory


Author: Te Sun Han

Publisher: Springer Science & Business Media

Published: 2002-10-08

Total Pages: 568

ISBN-13: 9783540435815


From the reviews: "This book nicely complements the existing literature on information and coding theory by concentrating on arbitrary nonstationary and/or nonergodic sources and channels with arbitrarily large alphabets. Even with such generality the authors have managed to successfully reach a highly unconventional but very fertile exposition rendering new insights into many problems." -- MATHEMATICAL REVIEWS


Abstract Methods In Information Theory


Author: Yuichiro Kakihara

Publisher: World Scientific

Published: 1999-10-15

Total Pages: 265

ISBN-13: 9814495417


Information theory is studied from the following viewpoints: (1) the theory of entropy as amount of information; (2) the mathematical structure of information sources (probability measures); and (3) the theory of information channels. Shannon entropy and Kolmogorov-Sinai entropy are defined and their basic properties examined, where the latter entropy is extended to a linear functional on a certain set of measures. Ergodic and mixing properties of stationary sources are studied, as are AMS (asymptotically mean stationary) sources. The main purpose of this book is to present information channels in the setting of real and functional analysis as well as probability theory. Ergodic channels are characterized in various manners. Mixing and AMS channels are also considered in detail with some illustrations. A few other aspects of information channels, including measurability, approximation, and noncommutative extensions, are also discussed.