Geometric flows have many applications in physics and geometry. The mean curvature flow arises in the description of interface evolution in certain physical models. This is related to the fact that such a flow is the gradient flow of the area functional, and it therefore appears naturally in problems where a surface energy is minimized. The mean curvature flow also has many geometric applications, in analogy with the Ricci flow of metrics on abstract Riemannian manifolds. One can use this flow as a tool to obtain classification results for surfaces satisfying certain curvature conditions, as well as to construct minimal surfaces. Geometric flows, obtained from solutions of geometric parabolic equations, can be considered an alternative tool for proving isoperimetric inequalities. Conversely, isoperimetric inequalities can help in treating several aspects of the convergence of these flows. Isoperimetric inequalities also have applications in other fields of geometry, such as the study of hyperbolic manifolds.
This work gives a coherent introduction to isoperimetric inequalities in Riemannian manifolds, featuring many of the results obtained during the last 25 years and discussing different techniques in the area. Written in a clear and appealing style, the book includes sufficient introductory material, making it accessible to graduate students as well. It will be of interest to researchers working on geometric inequalities from either a geometric or an analytic point of view, but also to those interested in applying the techniques described to their own field.
The interactions between concentration, isoperimetry and functional inequalities have led to many significant advances in functional analysis and probability theory. Important progress has also taken place in combinatorics, geometry, harmonic analysis and mathematical physics, with recent new applications in random matrices and information theory. This material will appeal to graduate students and researchers interested in the interplay between analysis, probability, and geometry.
This book deals with the geometric structure of finite-dimensional normed spaces as the dimension grows to infinity. This is a part of what came to be known as the Local Theory of Banach Spaces (the name derives from the fact that, in its first stages, this theory dealt mainly with relating the structure of infinite-dimensional Banach spaces to the structure of their lattices of finite-dimensional subspaces). Our purpose in this book is to introduce the reader to some of the results, problems, and, mainly, methods developed in the Local Theory in the last few years. This is by no means a complete survey of this wide area. Some of the main topics we do not discuss here are mentioned in the Notes and Remarks section. Several books have recently appeared, or will appear shortly, which cover much of the material not covered in this book. Among these are Pisier's [Pis6], where factorization theorems related to Grothendieck's theorem are extensively discussed, and Tomczak-Jaegermann's [T-Jl], where operator ideals and distances between finite-dimensional normed spaces are studied in detail. Another related book is Pietsch's [Pie].
A 1988 classic, covering Two-dimensional Surfaces; Domains on the Plane and on Surfaces; Brunn-Minkowski Inequality and Classical Isoperimetric Inequality; Isoperimetric Inequalities for Various Definitions of Area; and Inequalities Involving Mean Curvature.
High-dimensional probability offers insight into the behavior of random vectors, random matrices, random subspaces, and objects used to quantify uncertainty in high dimensions. Drawing on ideas from probability, analysis, and geometry, it lends itself to applications in mathematics, statistics, theoretical computer science, signal processing, optimization, and more. This book is the first to integrate theory, key tools, and modern applications of high-dimensional probability. Concentration inequalities form the core, and the book covers both classical results, such as Hoeffding's and Chernoff's inequalities, and modern developments, such as the matrix Bernstein inequality. It then introduces powerful methods based on stochastic processes, including tools such as Slepian's, Sudakov's, and Dudley's inequalities, as well as generic chaining and bounds based on VC dimension. A broad range of illustrations is embedded throughout, including classical and modern results for covariance estimation, clustering, networks, semidefinite programming, coding, dimension reduction, matrix completion, machine learning, compressed sensing, and sparse regression.