The study of the concentration of measure phenomenon was inspired by isoperimetric inequalities. This book offers the basic techniques and examples of the phenomenon: it presents concentration functions and inequalities, isoperimetric and functional examples, spectral and topological applications, and product measures.
Concentration of Measure Inequalities in Information Theory, Communications, and Coding focuses on some of the key modern mathematical tools that are used for the derivation of concentration inequalities, on their links to information theory, and on their various applications to communications and coding.
This book describes the interplay between the probabilistic structure (independence) and a variety of tools, ranging from functional inequalities to transportation arguments and information theory. Applications to the study of empirical processes, random projections, random matrix theory, and threshold phenomena are also presented.
Isoperimetric, measure concentration, and random process techniques lie at the basis of the modern understanding of Probability in Banach spaces. Based on these tools, the book presents a complete treatment of the main aspects of Probability in Banach spaces (integrability and limit theorems for vector-valued random variables, boundedness and continuity of random processes) and of some of their links to the Geometry of Banach spaces (via the type and cotype properties). Its purpose is to present some of the main aspects of this theory, from the foundations to the most important achievements. The main features of the investigation are the systematic use of isoperimetry and concentration of measure, and of abstract random process techniques (entropy and majorizing measures). Applications of these probabilistic tools and ideas to classical Banach space theory are further developed.
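To give a concrete feel for the concentration-of-measure tool this blurb highlights, the following small simulation (not taken from the book) illustrates Gaussian concentration for a 1-Lipschitz function, here the Euclidean norm of a standard Gaussian vector; the dimension, sample size, and variable names are illustrative choices, not the book's.

```python
# A minimal sketch (not from the book) of Gaussian concentration of measure:
# for a standard Gaussian vector X in R^d and a 1-Lipschitz function f
# (here the Euclidean norm), P(|f(X) - E f(X)| >= t) <= 2 exp(-t^2 / 2),
# with a bound that does not depend on the dimension d.
import numpy as np

rng = np.random.default_rng(1)
d, trials, t = 200, 50_000, 2.0

norms = np.linalg.norm(rng.standard_normal((trials, d)), axis=1)
empirical = np.mean(np.abs(norms - norms.mean()) >= t)  # norms.mean() proxies E f(X)
bound = 2 * np.exp(-t ** 2 / 2)

print(f"empirical P(|f(X) - E f(X)| >= {t}): {empirical:.3e}")
print(f"Gaussian concentration bound       : {bound:.3e}")
```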
Randomized algorithms have become a central part of the algorithms curriculum, based on their increasingly widespread use in modern applications. This book presents a coherent and unified treatment of probabilistic techniques for obtaining high-probability estimates on the performance of randomized algorithms. It covers the basic toolkit, from the Chernoff–Hoeffding bounds to more sophisticated techniques like martingales and isoperimetric inequalities, as well as some recent developments such as Talagrand's inequality, transportation cost inequalities, and log-Sobolev inequalities. Along the way, variations on the basic theme are examined, such as Chernoff–Hoeffding bounds in dependent settings. The authors emphasise a comparative study of the different methods, highlighting their respective strengths and weaknesses in concrete example applications. The exposition is tailored to discrete settings sufficient for the analysis of algorithms and avoids unnecessary measure-theoretic details, making the book accessible to computer scientists as well as probabilists and discrete mathematicians.
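As a point of reference for the Chernoff–Hoeffding material the blurb mentions, here is a minimal numerical sketch (not from the book) of Hoeffding's inequality for a sum of independent [0, 1]-valued variables; the parameters and names are illustrative.

```python
# A minimal sketch (not from the book) of the Hoeffding bound
# P(S_n - E[S_n] >= t) <= exp(-2 t^2 / n) for a sum S_n of n independent
# [0, 1]-valued variables, checked against a Monte Carlo tail estimate.
import numpy as np

rng = np.random.default_rng(0)
n, trials, t = 100, 100_000, 10.0

# S_n is a sum of n independent Uniform[0, 1] variables, so E[S_n] = n / 2.
sums = rng.random((trials, n)).sum(axis=1)
empirical_tail = np.mean(sums - n / 2 >= t)
hoeffding_bound = np.exp(-2 * t ** 2 / n)

print(f"empirical tail P(S_n - n/2 >= {t}): {empirical_tail:.2e}")
print(f"Hoeffding bound exp(-2 t^2 / n)   : {hoeffding_bound:.2e}")
```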
Concentration inequalities, which express the fact that certain complicated random variables are almost constant, have proved to be of utmost importance in many areas of probability and statistics. This volume contains refined versions of these inequalities and explores their relationship to many applications, particularly in stochastic analysis. The broad range and high quality of the contributions make this book highly attractive for graduate students, postgraduates, and researchers in these areas.
This book deals with the geometrical structure of finite-dimensional normed spaces as the dimension grows to infinity. This is a part of what came to be known as the Local Theory of Banach Spaces (the name derives from the fact that, in its first stages, this theory dealt mainly with relating the structure of infinite-dimensional Banach spaces to the structure of their lattice of finite-dimensional subspaces). Our purpose in this book is to introduce the reader to some of the results, problems, and mainly the methods developed in the Local Theory in the last few years. This is by no means a complete survey of this wide area. Some of the main topics we do not discuss here are mentioned in the Notes and Remarks section. Several books have appeared recently, or will appear shortly, which cover much of the material not covered in this book. Among these are Pisier's [Pis6], where factorization theorems related to Grothendieck's theorem are extensively discussed, and Tomczak-Jaegermann's [T-Jl], where operator ideals and distances between finite-dimensional normed spaces are studied in detail. Another related book is Pietsch's [Pie].
This is the first book to provide a comprehensive overview of foundational results and recent progress in the study of random matrices from the classical compact groups, drawing on the subject's deep connections to geometry, analysis, algebra, physics, and statistics. The book sets a foundation with an introduction to the groups themselves and six different constructions of Haar measure. Classical and recent results are then presented in a digested, accessible form, including the following: results on the joint distributions of the entries; an extensive treatment of eigenvalue distributions, including the Weyl integration formula, moment formulae, and limit theorems and large deviations for the spectral measures; concentration of measure with applications both within random matrix theory and in high dimensional geometry; and results on characteristic polynomials with connections to the Riemann zeta function. This book will be a useful reference for researchers and an accessible introduction for students in related fields.
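One standard construction of Haar measure on the orthogonal group, which may or may not coincide with any of the six the book presents, can be sketched in a few lines: QR-factorize a matrix of i.i.d. Gaussians and fix the signs of the diagonal of R. The `haar_orthogonal` helper below is an illustrative name, not the book's code.

```python
# A sketch of one standard construction of Haar measure on the orthogonal
# group O(n) (possibly, but not necessarily, one of the book's six):
# QR-factorize a matrix with i.i.d. standard Gaussian entries and fix the
# signs of the diagonal of R so the factorization is unique; the resulting
# Q is exactly Haar-distributed.
import numpy as np

def haar_orthogonal(n, rng):
    z = rng.standard_normal((n, n))   # i.i.d. N(0, 1) entries
    q, r = np.linalg.qr(z)            # QR decomposition
    d = np.sign(np.diag(r))           # +/-1 correction (a zero diagonal entry has probability zero)
    return q * d                      # rescale the columns of Q

rng = np.random.default_rng(2)
u = haar_orthogonal(4, rng)
print(np.allclose(u.T @ u, np.eye(4)))  # True: u is orthogonal
```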
This work is devoted to the study of rates of convergence of the empirical measures $\mu_n = \frac{1}{n}\sum_{k=1}^{n}\delta_{X_k}$, $n \ge 1$, over a sample $(X_k)_{k \ge 1}$ of independent identically distributed real-valued random variables, towards the common distribution $\mu$ in the Kantorovich transport distances $W_p$. The focus is on finite range bounds on the expected Kantorovich distances $\mathbb{E}(W_p(\mu_n,\mu))$ or $[\mathbb{E}(W_p^p(\mu_n,\mu))]^{1/p}$ in terms of moments and analytic conditions on the measure $\mu$ and its distribution function. The study describes a variety of rates, from the standard $\frac{1}{\sqrt{n}}$ to slower rates, and both lower and upper bounds on $\mathbb{E}(W_p(\mu_n,\mu))$ for fixed $n$ in various instances. Order statistics, reduction to uniform samples, analysis of beta distributions, inverse distribution functions, and log-concavity are the main tools of the investigation. Two detailed appendices collect classical and some new facts on inverse distribution functions and on beta distributions and their densities that are necessary to the investigation.
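To make the $\frac{1}{\sqrt{n}}$ rate concrete, here is a minimal simulation (not the monograph's analysis) of $\mathbb{E}(W_1(\mu_n,\mu))$ for $\mu$ the uniform distribution on $[0,1]$, using the one-dimensional identity $W_1(\mu_n,\mu)=\int |F_n - F|$ with $F_n$ the empirical distribution function; the grid size, sample sizes, and helper name are illustrative.

```python
# A minimal simulation (not the monograph's analysis) of E[W_1(mu_n, mu)]
# for mu = Uniform[0, 1], using the one-dimensional identity
# W_1(mu_n, mu) = \int |F_n(x) - F(x)| dx, to exhibit the ~ 1/sqrt(n) rate.
import numpy as np

rng = np.random.default_rng(3)
grid = np.linspace(0.0, 1.0, 2001)    # integration grid on [0, 1]
dx = grid[1] - grid[0]

def w1_uniform(sample):
    # F_n(x) = fraction of sample points <= x; F(x) = x for Uniform[0, 1].
    f_n = np.searchsorted(np.sort(sample), grid, side="right") / len(sample)
    return np.sum(np.abs(f_n - grid)) * dx   # Riemann approximation of the integral

for n in (100, 400, 1600, 6400):
    est = np.mean([w1_uniform(rng.random(n)) for _ in range(200)])
    print(f"n = {n:5d}   E[W_1] ~ {est:.4f}   sqrt(n) * E[W_1] ~ {np.sqrt(n) * est:.3f}")
```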