Information Theory, Inference and Learning Algorithms

Author: David J. C. MacKay

Publisher: Cambridge University Press

Published: 2003-09-25

Total Pages: 694

ISBN-13: 9780521642989

Information theory and inference, taught together in this exciting textbook, lie at the heart of many important areas of modern technology - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics and cryptography. The book introduces theory in tandem with applications. Information theory is taught alongside practical communication systems such as arithmetic coding for data compression and sparse-graph codes for error correction. Inference techniques, including message-passing algorithms, Monte Carlo methods and variational approximations, are developed alongside applications to clustering, convolutional codes, independent component analysis, and neural networks. Uniquely, the book covers state-of-the-art error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes - the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, the book is ideal for self-learning, and for undergraduate or graduate courses. It also provides an unparalleled entry point for professionals in areas as diverse as computational biology, financial engineering and machine learning.
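As a small illustration of the theory the book pairs with compression practice, the sketch below (not taken from the book; the symbol probabilities are hypothetical) computes the Shannon entropy of a source, the per-symbol compression limit that practical schemes such as arithmetic coding approach.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p*log2(p), with 0*log(0) taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical four-symbol source
source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
H = entropy(source.values())
print(H)  # 1.75 bits/symbol: no lossless code can average fewer bits
```

An arithmetic coder driven by these probabilities would approach 1.75 bits per symbol on long sequences from this source.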


Information-Spectrum Methods in Information Theory

Author: Te Sun Han

Publisher: Springer Science & Business Media

Published: 2013-04-18

Total Pages: 552

ISBN-13: 3662120666

From the reviews: "This book nicely complements the existing literature on information and coding theory by concentrating on arbitrary nonstationary and/or nonergodic sources and channels with arbitrarily large alphabets. Even with such generality the authors have managed to successfully reach a highly unconventional but very fertile exposition rendering new insights into many problems." -- MATHEMATICAL REVIEWS


Information Theory Applied To Space-time Physics

Author: Henning F Harmuth

Publisher: World Scientific

Published: 1993-01-31

Total Pages: 320

ISBN-13: 9814504572

The success of Newton's mechanics, Maxwell's electrodynamics, Einstein's theories of relativity, and quantum mechanics is a strong argument for the space-time continuum. Nevertheless, doubts have been expressed about the use of a continuum in a science squarely based on observation and measurement. An exact science requires that qualitative arguments be reduced to quantitative statements. The observability of a continuum can be reduced from qualitative arguments to quantitative statements by means of information theory. Information theory was developed during recent decades within electrical communications, but it is almost unknown in physics. The closest approach to information theory in physics is the calculus of propositions, which has been used in books on the frontier of quantum mechanics and the general theory of relativity. Principles of information theory are discussed in this book. The ability to think readily in terms of a finite number of discrete samples is developed over many years of using information theory and digital computers, just as the ability to think readily in terms of a continuum is developed by long use of differential calculus.


Introduction to Information Theory and Data Compression, Second Edition

Author: D.C. Hankerson

Publisher: CRC Press

Published: 2003-02-26

Total Pages: 394

ISBN-13: 9781584883135

An effective blend of carefully explained theory and practical applications, this text imparts the fundamentals of both information theory and data compression. Although the two topics are related, this unique text allows either topic to be presented independently, and it was specifically designed so that the data compression section requires no prior knowledge of information theory. The treatment of information theory, while theoretical and abstract, is quite elementary, making this text less daunting than many others. After presenting the fundamental definitions and results of the theory, the authors then apply the theory to memoryless, discrete channels with zeroth-order, one-state sources. The chapters on data compression acquaint students with a myriad of lossless compression methods and then introduce two lossy compression methods. Students emerge from this study competent in a wide range of techniques. The authors' presentation is highly practical but includes some important proofs, either in the text or in the exercises, so instructors can, if they choose, place more emphasis on the mathematics. Introduction to Information Theory and Data Compression, Second Edition is ideally suited for an upper-level or graduate course for students in mathematics, engineering, and computer science.

Features:
- Expanded discussion of the historical and theoretical basis of information theory that builds a firm, intuitive grasp of the subject
- Reorganization of theoretical results, along with new exercises ranging from the routine to the more difficult, that reinforce students' ability to apply the definitions and results in specific situations
- Simplified treatment of the algorithm(s) of Gallager and Knuth
- Discussion of the information rate of a code and the trade-off between error correction and information rate
- Treatment of probabilistic finite-state source automata, including basic results, examples, references, and exercises
- Octave and MATLAB image compression codes included in an appendix for use with the exercises and projects involving transform methods
- Supplementary materials, including software, available for download from the authors' Web site at www.dms.auburn.edu/compression
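Huffman coding is a classic instance of the lossless methods such a text surveys. The following sketch (illustrative only, not drawn from the book's own material) builds a Huffman code from symbol frequencies using a priority queue and encodes a short string.

```python
import heapq
from collections import Counter

def huffman_codes(freqs):
    """Build a Huffman code {symbol: bitstring} from symbol frequencies."""
    # Heap entries: (weight, tiebreak, {symbol: code-so-far}); the tiebreak
    # keeps comparisons deterministic and avoids comparing dicts.
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:  # degenerate single-symbol source still needs one bit
        (_, _, codes), = heap
        return {s: "0" for s in codes}
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

text = "abracadabra"
codes = huffman_codes(Counter(text))
encoded = "".join(codes[ch] for ch in text)
```

Because Huffman codes are prefix-free, the encoded bitstring can be decoded greedily bit by bit with no separators between codewords.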


Information Theory

Author: Imre Csiszár

Publisher: Elsevier

Published: 2014-07-10

Total Pages: 465

ISBN-13: 1483281574

Information Theory: Coding Theorems for Discrete Memoryless Systems presents mathematical models that involve independent random variables with finite range. This three-chapter text specifically describes the characteristic phenomena of information theory. Chapter 1 deals with information measures in simple coding problems, with emphasis on some formal properties of Shannon's information and the non-block source coding. Chapter 2 describes the properties and practical aspects of the two-terminal systems. This chapter also examines the noisy channel coding problem, the computation of channel capacity, and the arbitrarily varying channels. Chapter 3 looks into the theory and practicality of multi-terminal systems. This book is intended primarily for graduate students and research workers in mathematics, electrical engineering, and computer science.
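The channel-capacity computation mentioned for Chapter 2 is classically carried out with the Blahut-Arimoto algorithm. The sketch below (illustrative, not taken from the book) estimates the capacity of a discrete memoryless channel from its transition matrix.

```python
import math

def blahut_arimoto(W, iters=200):
    """Estimate the capacity (in bits) of a discrete memoryless channel.

    W[x][y] = P(y|x). Alternates the two classical updates:
    q(x|y) proportional to p(x)*W(y|x), then
    p(x) proportional to exp(sum_y W(y|x) * ln q(x|y)).
    """
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx                      # start from the uniform input
    for _ in range(iters):
        # Posterior q(x|y) under the current input distribution
        q = [[0.0] * nx for _ in range(ny)]
        for y in range(ny):
            z = sum(p[x] * W[x][y] for x in range(nx))
            for x in range(nx):
                q[y][x] = p[x] * W[x][y] / z
        # Re-estimate the input distribution p(x)
        r = []
        for x in range(nx):
            s = sum(W[x][y] * math.log(q[y][x]) for y in range(ny) if W[x][y] > 0)
            r.append(math.exp(s))
        z = sum(r)
        p = [v / z for v in r]
    # Mutual information I(X;Y) at the optimizing input distribution
    cap = 0.0
    for x in range(nx):
        for y in range(ny):
            if W[x][y] > 0:
                py = sum(p[xx] * W[xx][y] for xx in range(nx))
                cap += p[x] * W[x][y] * math.log2(W[x][y] / py)
    return cap

# Binary symmetric channel with crossover 0.1: capacity = 1 - H(0.1) bits
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(blahut_arimoto(bsc))
```

For the symmetric channel above the uniform input is already optimal, so the iteration converges immediately; for asymmetric channels the alternating updates are what do the work.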


Applied Information Theory

Author: I. M. Kogan

Publisher: CRC Press

Published: 1988

Total Pages: 482

ISBN-13: 9782881240645

Since the main principles of applied information theory were formulated in the 1940s, the science has developed greatly, and today its areas of application range from traditional communication engineering problems to the humanities and the arts. Interdisciplinary in scope, this book is a single-source reference for all application areas, including engineering, radar, computing technology, television, the life sciences (including biology, physiology and psychology) and arts criticism. A review of the current state of information theory is provided; the author also presents several generalized and original results and gives a treatment of various problems. This is a reference for both specialists and non-professionals in information theory and general cybernetics.


Introduction to Coding and Information Theory

Author: Steven Roman

Publisher: Springer Science & Business Media

Published: 1996-11-26

Total Pages: 344

ISBN-13: 9780387947044

This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, the simplex codes, and many others.
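Hamming codes, mentioned among the specific codes above, admit a very compact implementation. The following (7,4) encoder and decoder is an illustrative sketch (not material from the book) showing single-error correction via the syndrome, whose value is the 1-based position of the flipped bit.

```python
def hamming74_encode(d):
    """Encode 4 data bits as a (7,4) Hamming codeword.

    Codeword positions 1..7 hold [p1, p2, d1, p3, d2, d3, d4],
    with parity bits at the power-of-two positions 1, 2 and 4.
    """
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity check over positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based index of the error; 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
code = hamming74_encode(word)
code[4] ^= 1                         # simulate a single-bit channel error
assert hamming74_decode(code) == word
```

Three parity checks distinguish eight outcomes: "no error" plus each of the seven bit positions, which is exactly why 3 parity bits suffice to protect 4 data bits against one flip.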


Information Theory in Computer Vision and Pattern Recognition

Author: Francisco Escolano Ruiz

Publisher: Springer Science & Business Media

Published: 2009-07-14

Total Pages: 375

ISBN-13: 1848822979

Information theory has proved to be effective for solving many computer vision and pattern recognition (CVPR) problems (such as image matching, clustering and segmentation, saliency detection, feature selection, optimal classifier design and many others). Nowadays, researchers are widely bringing information theory elements to the CVPR arena. Among these elements are measures (entropy, mutual information...), principles (maximum entropy, minimax entropy...) and theories (rate distortion theory, method of types...). This book explores and introduces these elements through an incremental-complexity approach, while formulating CVPR problems and presenting the most representative algorithms. Interesting connections between information theory principles applied to different problems are highlighted, seeking a comprehensive research roadmap. The result is a novel tool both for CVPR and machine learning researchers, and it contributes to a cross-fertilization of both areas.
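As a small illustration of the measures listed above, the sketch below (illustrative only; the joint distribution is hypothetical) computes the mutual information of two discrete variables from their joint probability table, the quantity used in applications such as image matching and feature selection.

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits, given joint[x][y] = P(X=x, Y=y)."""
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    mi = 0.0
    for x, row in enumerate(joint):
        for y, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[x] * py[y]))
    return mi

# Hypothetical joint distribution of two correlated binary variables
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(mutual_information(joint))  # ~0.278 bits
```

Mutual information is zero exactly when the two variables are independent, which is what makes it a natural similarity score between, say, two images' intensity distributions.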