This book has two main topics: large deviations and equilibrium statistical mechanics. I hope to convince the reader that these topics have many points of contact and that in being treated together, they enrich each other. Entropy, in its various guises, is their common core. The large deviation theory which is developed in this book focuses upon convergence properties of certain stochastic systems. An elementary example is the weak law of large numbers. For each positive ε, P{|S_n/n| ≥ ε} converges to zero as n → ∞, where S_n is the nth partial sum of independent identically distributed random variables with zero mean. Large deviation theory shows that if the random variables are exponentially bounded, then the probabilities converge to zero exponentially fast as n → ∞. The exponential decay allows one to prove the stronger property of almost sure convergence (S_n/n → 0 a.s.). This example will be generalized extensively in the book. We will treat a large class of stochastic systems which involve both independent and dependent random variables and which have the following features: probabilities converge to zero exponentially fast as the size of the system increases; the exponential decay leads to strong convergence properties of the system. The most fascinating aspect of the theory is that the exponential decay rates are computable in terms of entropy functions. This identification between entropy and decay rates of large deviation probabilities enhances the theory significantly.
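As a small illustration of the exponential decay just described (a sketch for orientation, not taken from the book), the Python snippet below estimates P{|S_n/n| ≥ ε} by Monte Carlo for fair ±1 coin flips and compares the empirical exponent -(1/n) log P with the classical Cramér rate function for this distribution; the parameter choices are arbitrary.

```python
# Illustrative sketch (not from the book): the tail probability
# P{|S_n/n| >= eps} for fair +/-1 coin flips decays exponentially in n,
# with exponent governed by the Cramér (entropy) rate function.
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1          # deviation size
trials = 200_000   # Monte Carlo sample size per n

def rate(x):
    # Legendre transform of log cosh(t) for symmetric +/-1 variables:
    # I(x) = ((1+x)/2) log(1+x) + ((1-x)/2) log(1-x)
    return 0.5 * (1 + x) * np.log(1 + x) + 0.5 * (1 - x) * np.log(1 - x)

for n in (50, 100, 200, 400):
    # S_n for n fair +/-1 flips, computed via S_n = 2*Binomial(n, 1/2) - n
    s_n = 2 * rng.binomial(n, 0.5, size=trials) - n
    p_hat = np.mean(np.abs(s_n / n) >= eps)
    print(f"n={n:4d}  P_hat={p_hat:.3e}  "
          f"-(1/n) log P_hat = {-np.log(p_hat) / n:.4f}  I(eps) = {rate(eps):.4f}")
```

The estimated probabilities shrink rapidly with n, and the empirical exponents approach the rate function value only up to subexponential corrections, which is precisely the sense in which large deviation theory identifies decay rates with entropy.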
From the reviews: "... Besides the fact that the author's treatment of large deviations is a nice contribution to the literature on the subject, his book has the virtue that it provides a beautifully unified and mathematically appealing account of certain aspects of statistical mechanics. ... Furthermore, he does not make the mistake of assuming that his mathematical audience will be familiar with the physics and has done an admirable job of explaining the necessary physical background. Finally, it is clear that the author's book is the product of many painstaking hours of work; and the reviewer is confident that its readers will benefit from his efforts." D. Stroock in Mathematical Reviews 1985 "... Each chapter of the book is followed by a notes section and by a problems section. There are over 100 problems, many of which have hints. The book may be recommended as a text, it provides a completely self-contained reading ..." S. Pogosian in Zentralblatt für Mathematik 1986
This is an introductory course on the methods of computing asymptotics of probabilities of rare events: the theory of large deviations. The book combines large deviation theory with basic statistical mechanics, namely Gibbs measures with their variational characterization and the phase transition of the Ising model, in a text intended for a one-semester or one-quarter course. The book begins with a straightforward approach to the key ideas and results of large deviation theory in the context of independent identically distributed random variables. This includes Cramér's theorem, relative entropy, Sanov's theorem, process level large deviations, convex duality, and change of measure arguments. Dependence is introduced through the interaction potentials of equilibrium statistical mechanics. The phase transition of the Ising model is proved in two different ways: first in the classical way with the Peierls argument, Dobrushin's uniqueness condition, and correlation inequalities, and then a second time through the percolation approach. Beyond the large deviations of independent variables and Gibbs measures, later parts of the book treat large deviations of Markov chains, the Gärtner-Ellis theorem, and a large deviation theorem of Baxter and Jain that is then applied to a nonstationary process and a random walk in a dynamical random environment. The book has been used with students from mathematics, statistics, engineering, and the sciences and has been written for a broad audience with advanced technical training. Appendixes review basic material from analysis and probability theory and also prove some of the technical results used in the text.
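For orientation, a standard finite-alphabet statement of Sanov's theorem (given here in generic form, not quoted from the text) shows how relative entropy appears as the large deviation rate for empirical measures:

```latex
% Sanov's theorem, finite-alphabet form (generic statement for orientation).
% X_1, X_2, \dots are i.i.d. with common law \rho on a finite set; L_n is the empirical measure.
\[
  L_n \;=\; \frac{1}{n}\sum_{i=1}^{n} \delta_{X_i},
  \qquad
  \lim_{n\to\infty} \frac{1}{n}\,\log P\{L_n \in A\}
  \;=\; -\inf_{\nu \in A} H(\nu \mid \rho),
\]
\[
  \text{where}\quad
  H(\nu \mid \rho) \;=\; \sum_{x} \nu(x)\,\log\frac{\nu(x)}{\rho(x)},
\]
% valid for sets A of probability measures whose interior and closure yield the same infimum.
```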
In each generation, scientists must redefine their fields: abstracting, simplifying and distilling the previous standard topics to make room for new advances and methods. Sethna's book takes this step for statistical mechanics - a field rooted in physics and chemistry whose ideas and methods are now central to information theory, complexity, and modern biology. Aimed at advanced undergraduates and early graduate students in all of these fields, Sethna limits his main presentation to the topics that future mathematicians and biologists, as well as physicists and chemists, will find fascinating and central to their work. The amazing breadth of the field is reflected in the author's large supply of carefully crafted exercises, each an introduction to a whole field of study: everything from chaos through information theory to life at the end of the universe.