Books on information theory and coding have proliferated over the last few years, but few succeed in covering the fundamentals without losing students in mathematical abstraction. Even fewer build the essential theoretical framework when presenting algorithms and implementation details of modern coding systems.
This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters deal with coding theory: communication channels, decoding in the presence of errors, the general theory of linear codes, and such specific codes as Hamming codes, simplex codes, and many others.
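As a concrete taste of the entropy of information sources that this book (and several below) develops, here is a minimal Python sketch, not taken from any of these texts; the four-symbol source and its probabilities are made up for illustration:

```python
# Entropy of a finite information source, the quantity at the core of
# Shannon's Noiseless Coding Theorem: H(X) = -sum_i p_i * log2(p_i).
from math import log2

def entropy(probs):
    """Entropy in bits of a finite probability distribution."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * log2(p) for p in probs if p > 0)

# A source emitting four symbols with these (illustrative) probabilities
# needs, on average, at least H(X) bits per symbol in any lossless code.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # -> 1.75
```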
The first comprehensive introduction to information theory, this volume explores the work of Shannon, McMillan, Feinstein, and Khinchin. Topics include the entropy concept in probability theory, fundamental theorems, and other subjects. 1957 edition.
The latest edition of this classic is updated with new problem sets and material. The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers, and the historical notes that follow each chapter recap the main points. The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
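Channel capacity, one of the core topics listed above, has a classic closed form for the binary symmetric channel. The sketch below is not taken from the text; it simply evaluates that standard formula, C = 1 - H(p) bits per channel use, where p is the crossover (bit-flip) probability:

```python
# Capacity of the binary symmetric channel: C = 1 - H(p).
from math import log2

def binary_entropy(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity in bits per channel use of a BSC with crossover probability p."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.11))  # roughly 0.5 bits per channel use
```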
Information theory and inference, taught together in this exciting textbook, lie at the heart of many important areas of modern technology: communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. The book introduces theory in tandem with applications. Information theory is taught alongside practical communication systems such as arithmetic coding for data compression and sparse-graph codes for error correction. Inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, are developed alongside applications to clustering, convolutional codes, independent component analysis, and neural networks. Uniquely, the book covers state-of-the-art error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes: the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, the book is ideal for self-study and for undergraduate or graduate courses. It also provides an unparalleled entry point for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
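The sparse-graph codes MacKay develops are beyond a short snippet, but the toy sketch below, a 3-fold repetition code decoded by majority vote over a simulated binary symmetric channel, shows the basic error-correction loop these books build on; the message length and channel parameter are made up:

```python
# Repetition code over a simulated binary symmetric channel (toy example).
import random

def transmit(bits, p):
    """Flip each bit independently with probability p (binary symmetric channel)."""
    return [b ^ (random.random() < p) for b in bits]

def encode(bits):
    return [b for b in bits for _ in range(3)]   # repeat each bit 3 times

def decode(received):
    return [int(sum(received[i:i + 3]) >= 2)     # majority vote per triple
            for i in range(0, len(received), 3)]

random.seed(0)
message = [random.randint(0, 1) for _ in range(1000)]
decoded = decode(transmit(encode(message), p=0.1))
errors = sum(m != d for m, d in zip(message, decoded))
print(f"residual bit errors: {errors} of {len(message)}")  # ~28 expected (3p^2 - 2p^3 per bit)
```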
This book is an introduction to information and coding theory at the graduate or advanced undergraduate level. It assumes a basic knowledge of probability and modern algebra, but is otherwise self-contained. The intent is to describe as clearly as possible the fundamental issues involved in these subjects, rather than covering all aspects in an encyclopedic fashion. The first quarter of the book is devoted to information theory, including a proof of Shannon's famous Noisy Coding Theorem. The remainder of the book is devoted to coding theory and is independent of the information theory portion of the book. After a brief discussion of general families of codes, the author discusses linear codes (including the Hamming, Golay, and Reed-Muller codes), finite fields, and cyclic codes (including the BCH, Reed-Solomon, Justesen, Goppa, and Quadratic Residue codes). An appendix reviews relevant topics from modern algebra.
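The Hamming codes mentioned above decode by syndrome lookup. The following sketch is not drawn from the book; it implements the [7,4] Hamming code in one standard systematic form as an illustration of linear encoding and single-error correction:

```python
# [7,4] Hamming code: encode 4 data bits with generator matrix G, then
# locate and fix any single bit error via the syndrome computed with H.
import numpy as np

# Generator and parity-check matrices over GF(2), systematic form
# G = [I | P], H = [P^T | I], so that H @ c = 0 for every codeword c.
G = np.array([[1, 0, 0, 0, 0, 1, 1],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[0, 1, 1, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [1, 1, 0, 1, 0, 0, 1]])

def encode(data):
    return data @ G % 2

def decode(word):
    syndrome = H @ word % 2
    if syndrome.any():                       # nonzero syndrome: one bit flipped
        # The syndrome equals the column of H at the error position.
        for i in range(7):
            if (H[:, i] == syndrome).all():
                word = word.copy()
                word[i] ^= 1
                break
    return word[:4]                          # first 4 bits carry the data

data = np.array([1, 0, 1, 1])
codeword = encode(data)
codeword[2] ^= 1                             # inject a single-bit error
print(decode(codeword))                      # recovers [1 0 1 1]
```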
This book is an evolution from my book A First Course in Information Theory, published in 2002 when network coding was still in its infancy. The last few years have witnessed the rapid development of network coding into a research field of its own in information science. With its root in information theory, network coding has not only brought about a paradigm shift in network communications at large, but also had significant influence on such specific research fields as coding theory, networking, switching, wireless communications, distributed data storage, cryptography, and optimization theory. While new applications of network coding keep emerging, the fundamental results that lay the foundation of the subject are more or less mature. One of the main goals of this book therefore is to present these results in a unifying and coherent manner. While the previous book focused only on information theory for discrete random variables, the current book contains two new chapters on information theory for continuous random variables, namely the chapter on differential entropy and the chapter on continuous-valued channels. With these topics included, the book becomes more comprehensive and is more suitable to be used as a textbook for a course in an electrical engineering department.
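The new chapter on differential entropy centers on results like the following standard worked example, stated here from general knowledge rather than quoted from the book: the differential entropy of a Gaussian random variable with variance sigma^2.

```latex
% Differential entropy of X ~ N(mu, sigma^2), in bits (base-2 logarithm):
\[
  h(X) = -\int_{-\infty}^{\infty} f(x)\,\log_2 f(x)\,\mathrm{d}x
       = \frac{1}{2}\log_2\!\left(2\pi e\,\sigma^2\right).
\]
```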