A very active field of research is emerging at the frontier of statistical physics, theoretical computer science/discrete mathematics, and coding/information theory. This book sets up a common language and pool of concepts, accessible to students and researchers from each of these fields.
This monograph is based on the lecture notes of a graduate course on the relations between information theory and statistical physics. The course was first delivered at the Technion in the Spring of 2010, and its target audience consists of EE graduate students in the area of communications and information theory, as well as graduate students in physics who have a basic background in information theory. Strong emphasis is placed on the analogy and parallelism between information theory and statistical physics, as well as on the insights, analysis tools, and techniques that can be borrowed from statistical physics and 'imported' to certain problem areas in information theory. This is a research trend that has been very active in the last few decades, and the hope is that by exposing students to the meeting points between these two disciplines, their background and perspective may be expanded and enhanced. This monograph is substantially revised and expanded relative to an earlier version posted on arXiv (1006.1565v1 [cs.IT]).
This concise and readable book addresses primarily readers with a background in classical statistical physics and introduces quantum mechanical notions as required. Conceived as a primer to bridge the gap between statistical physics and quantum information, it emphasizes concepts and thorough discussion of the fundamental notions, and prepares the reader for deeper studies, not least through a selection of well-chosen exercises.
Statistical physics has its origins in attempts to describe the thermal properties of matter in terms of its constituent particles, and has played a fundamental role in the development of quantum mechanics. Based on lectures taught by Professor Kardar at MIT, this textbook introduces the central concepts and tools of statistical physics. It contains a chapter on probability and related issues such as the central limit theorem and information theory, and covers interacting particles, with an extensive description of the van der Waals equation and its derivation by the mean field approximation. It also contains an integrated set of problems, with solutions to selected problems at the end of the book; a complete set of solutions is available to lecturers on a password-protected website at www.cambridge.org/9780521873420. A companion volume, Statistical Physics of Fields, discusses non-mean-field aspects of scaling and critical phenomena through the perspective of the renormalization group.
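For reference, the van der Waals equation discussed there takes the standard form (conventions vary between texts; this is the $N$-particle version):
\[
\left( P + \frac{a N^2}{V^2} \right)\left( V - N b \right) = N k_B T ,
\]
where the parameter $a$ captures the mean attractive interaction between particles and $b$ their excluded volume; the mean field approximation enters by replacing the pairwise attractions with the average pressure correction $a N^2 / V^2$.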
In this highly readable book, H.S. Green, a former student of Max Born and well known as an author in physics and in the philosophy of science, presents a timely analysis of theoretical physics and related fundamental problems.
The first six chapters of this volume present the author's 'predictive' or 'information-theoretic' approach to statistical mechanics, in which the basic probability distributions over microstates are obtained as distributions of maximum entropy (i.e., as distributions that are most non-committal with regard to missing information among all those satisfying the macroscopically given constraints). There is then no need to make additional assumptions of ergodicity or metric transitivity; the theory proceeds entirely by inference from macroscopic measurements and the underlying dynamical assumptions. Moreover, the method of maximizing the entropy is completely general and applies, in particular, to irreversible processes as well as to reversible ones. The next three chapters provide a broader framework - at once Bayesian and objective - for maximum entropy inference. The basic principles of inference, including the usual axioms of probability, are seen to rest on nothing more than requirements of consistency, above all the requirement that in two problems where we have the same information we must assign the same probabilities. Thus, statistical mechanics is viewed as a branch of a general theory of inference, and the latter as an extension of the ordinary logic of consistency. Those who are familiar with the literature of statistics and statistical mechanics will recognize in both of these steps a genuine 'scientific revolution' - a complete reversal of earlier conceptions - and one of no small significance.
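As a minimal sketch of the maximum entropy prescription (a standard derivation, stated here for concreteness rather than quoted from the book): maximizing the entropy
\[
S = -\sum_i p_i \ln p_i
\]
over the microstate probabilities $p_i$, subject only to normalization $\sum_i p_i = 1$ and a measured mean energy $\sum_i p_i E_i = \langle E \rangle$, yields via Lagrange multipliers the canonical distribution
\[
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},
\]
with $\beta$ fixed by the energy constraint; no ergodic hypothesis is invoked at any point.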
In each generation, scientists must redefine their fields: abstracting, simplifying and distilling the previous standard topics to make room for new advances and methods. Sethna's book takes this step for statistical mechanics - a field rooted in physics and chemistry whose ideas and methods are now central to information theory, complexity, and modern biology. Aimed at advanced undergraduates and early graduate students in all of these fields, Sethna limits his main presentation to the topics that future mathematicians and biologists, as well as physicists and chemists, will find fascinating and central to their work. The amazing breadth of the field is reflected in the author's large supply of carefully crafted exercises, each an introduction to a whole field of study: everything from chaos through information theory to life at the end of the universe.
Physicists, when modelling physical systems with a large number of degrees of freedom, and statisticians, when performing data analysis, have developed their own concepts and methods for making the 'best' inference. But are these methods equivalent, or not? What is the state of the art in making inferences? The physicists want answers. More: neural computation demands a clearer understanding of how neural systems make inferences; the theory of chaotic nonlinear systems as applied to time series analysis could profit from the experience already accumulated by statisticians; and finally, there is a long-standing conjecture that some of the puzzles of quantum mechanics are due to our incomplete understanding of how we make inferences. This is matter enough to stimulate the writing of a book such as the present one. But other considerations also arise, such as the maximum entropy method and Bayesian inference, information theory and the minimum description length. Finally, it is pointed out that an understanding of human inference may require input from psychologists. This lively debate, which is of acute current interest, is well summarized in the present work.
This superb new book is one of the first publications in recent years to provide a broad overview of this interdisciplinary field. Most of the book is written in a self-contained manner, assuming only a general knowledge of statistical mechanics and basic probability theory. It provides the reader with a sound introduction to the field and to the analytical techniques necessary to follow its most recent developments.
Statistical physics and thermodynamics describe the behaviour of systems on the macroscopic scale. Their methods are applicable to a wide range of phenomena, from neutron stars to heat engines and from chemical reactions to phase transitions. The pertinent laws are among the most universal of all laws of physics.