This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added, and the basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium are the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities, ranging from those of Chebyshev, Cramér–Chernoff, and Bahadur–Rao to Hoeffding, has been added, with illustrative comparisons of their use in practice. The new edition also includes a treatment of the Berry–Esseen error estimate in the central limit theorem. The authors assume mathematical maturity at a graduate level; otherwise the book is suitable for students with varying levels of background in analysis and measure theory. For the reader who needs refreshers, theorems from analysis and measure theory used in the main text are provided in comprehensive appendices, along with their proofs, for ease of reference. Rabi Bhattacharya is Professor of Mathematics at the University of Arizona. Edward Waymire is Professor of Mathematics at Oregon State University. Both authors have co-authored numerous books, including a series of four upcoming graduate textbooks in stochastic processes with applications.
Probability and Measure Theory, Second Edition, is a text for a graduate-level course in probability that includes essential background topics in analysis. It provides extensive coverage of conditional probability and expectation, strong laws of large numbers, martingale theory, the central limit theorem, ergodic theory, and Brownian motion.
* Clear, readable style
* Solutions to many problems presented in the text
* Solutions manual for instructors
* Material new to the second edition on ergodic theory, Brownian motion, and convergence theorems used in statistics
* No knowledge of general topology required, just basic analysis and metric spaces
* Efficient organization
This book grew from a one-semester course offered for many years to a mixed audience of graduate and undergraduate students who have not had the luxury of taking a course in measure theory. The core of the book covers the basic topics of independence, conditioning, martingales, convergence in distribution, and Fourier transforms. In addition, there are numerous sections treating topics traditionally thought of as more advanced, such as coupling and the KMT strong approximation, option pricing via the equivalent martingale measure, and the isoperimetric inequality for Gaussian processes. The book is not just a presentation of mathematical theory; it is also a discussion of why that theory takes its current form. It will be a secure starting point for anyone who needs to invoke rigorous probabilistic arguments and understand what they mean.
Assuming only calculus and linear algebra, Professor Taylor introduces readers to measure theory and probability, discrete martingales, and weak convergence. This is a technically complete, self-contained and rigorous approach that helps the reader to develop basic skills in analysis and probability. Students of pure mathematics and statistics can thus expect to acquire a sound introduction to basic measure theory and probability, while readers with a background in finance, business, or engineering will gain a technical understanding of discrete martingales in the equivalent of one semester. J. C. Taylor is the author of numerous articles on potential theory, both probabilistic and analytic, and is particularly interested in the potential theory of symmetric spaces.
This is a graduate-level textbook on measure theory and probability theory. The book can be used as a text for a two-semester sequence of courses in measure theory and probability theory, with an option to include supplemental material on stochastic processes and special topics. It is intended primarily for first-year Ph.D. students in mathematics and statistics, although mathematically advanced students from engineering and economics would also find the book useful. Prerequisites are kept to the minimal level of an understanding of basic real analysis concepts such as limits, continuity, differentiability, Riemann integration, and convergence of sequences and series; a review of this material is included in the appendix. The book starts with an informal introduction that provides some heuristics for the abstract concepts of measure and integration theory, which are then rigorously developed. The first part of the book can be used for a standard real analysis course for both mathematics and statistics Ph.D. students, as it provides full coverage of topics such as the construction of Lebesgue-Stieltjes measures on the real line and on Euclidean spaces, the basic convergence theorems, L^p spaces, signed measures, the Radon-Nikodym theorem, Lebesgue's decomposition theorem, the fundamental theorem of Lebesgue integration on R, product spaces and product measures, and the Fubini-Tonelli theorems. It also provides an elementary introduction to Banach and Hilbert spaces, convolutions, Fourier series, and the Fourier and Plancherel transforms. Thus Part I would be particularly useful for students in a typical Statistics Ph.D. program if a separate course on real analysis is not a standard requirement. Part II (chapters 6-13) provides full coverage of standard graduate-level probability theory. It starts with Kolmogorov's probability model and Kolmogorov's existence theorem. It then treats thoroughly the laws of large numbers, including renewal theory and ergodic theorems with applications, followed by weak convergence of probability distributions, characteristic functions, the Lévy-Cramér continuity theorem, and the central limit theorem, as well as stable laws. It ends with conditional expectations and conditional probability, and an introduction to the theory of discrete-time martingales. Part III (chapters 14-18) provides a modest coverage of discrete-time Markov chains with countable and general state spaces, MCMC, continuous-time discrete-space jump Markov processes, Brownian motion, mixing sequences, bootstrap methods, and branching processes. It could be used for a topics/seminar course or as an introduction to stochastic processes. Krishna B. Athreya is a professor in the departments of mathematics and statistics and a Distinguished Professor in the College of Liberal Arts and Sciences at Iowa State University. He has been a faculty member at the University of Wisconsin-Madison, the Indian Institute of Science, Bangalore, and Cornell University, and has held visiting appointments in Scandinavia and Australia. He is a fellow of the Institute of Mathematical Statistics, USA; a fellow of the Indian Academy of Sciences, Bangalore; an elected member of the International Statistical Institute; and serves on the editorial board of several journals in probability and statistics. Soumendra N. Lahiri is a professor in the department of statistics at Iowa State University.
He is a fellow of the Institute of Mathematical Statistics, a fellow of the American Statistical Association, and an elected member of the International Statistical Institute.
This very well written and accessible book emphasizes the reasons for studying measure theory, which is the foundation of much of probability. The focus on measure opens up many illustrative examples and applications, including a thorough discussion of standard probability distributions and densities. The book also includes many problems and their fully worked solutions.
Introductory Probability is a pleasure to read and provides a fine answer to the question: How do you construct Brownian motion from scratch, given that you are a competent analyst? There are at least two ways to develop probability theory. The more familiar path is to treat it as its own discipline, and work from intuitive examples such as coin flips and conundrums such as the Monty Hall problem. An alternative is to first develop measure theory and analysis, and then add interpretation. Bhattacharya and Waymire take the second path.
This book provides, in a concise yet detailed way, the bulk of the probabilistic tools that a student working toward an advanced degree in statistics, probability, and other related areas should be equipped with. The approach is classical, avoiding the use of mathematical tools not necessary for carrying out the discussions. All proofs are presented in full detail.
* Excellent exposition marked by a clear, coherent, and logical development of the subject
* Easy-to-understand, detailed discussion of the material
* Complete proofs