Filtering and smoothing methods are used to produce an accurate estimate of the state of a time-varying system based on multiple observational inputs (data). Interest in these methods has exploded in recent years, with numerous applications emerging in fields such as navigation, aerospace engineering, telecommunications and medicine. This compact, informal introduction for graduate students and advanced undergraduates presents the current state-of-the-art filtering and smoothing methods in a unified Bayesian framework. Readers learn what non-linear Kalman filters and particle filters are, how they are related, and their relative advantages and disadvantages. They also discover how state-of-the-art Bayesian parameter estimation methods can be combined with state-of-the-art filtering and smoothing algorithms. The book's practical and algorithmic approach assumes only modest mathematical prerequisites. Examples include MATLAB computations, and the numerous end-of-chapter exercises include computational assignments. MATLAB/GNU Octave source code is available for download at www.cambridge.org/sarkka, promoting hands-on work with the methods.
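As a concrete anchor for the linear Gaussian baseline that the non-linear Kalman filters and particle filters described above generalise, the following is a minimal illustrative sketch of one Kalman filter prediction/update step in Python/NumPy. It is not the book's code (the downloadable material is MATLAB/GNU Octave), and the model matrices A, H, Q, R and the toy numbers are placeholders.

```python
import numpy as np

def kalman_step(m, P, y, A, Q, H, R):
    """One Kalman filter recursion: predict with the dynamic model, then update with measurement y."""
    # Prediction: propagate mean and covariance through the linear dynamics x_k = A x_{k-1} + q, q ~ N(0, Q)
    m_pred = A @ m
    P_pred = A @ P @ A.T + Q
    # Update: condition on the measurement y_k = H x_k + r, r ~ N(0, R)
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    m_new = m_pred + K @ (y - H @ m_pred)
    P_new = P_pred - K @ S @ K.T
    return m_new, P_new

# Toy usage (hypothetical numbers): track a scalar random walk observed in noise.
A = np.array([[1.0]]); Q = np.array([[0.1]])
H = np.array([[1.0]]); R = np.array([[0.5]])
m, P = np.zeros(1), np.eye(1)
for y in [0.3, 0.1, -0.2, 0.4]:
    m, P = kalman_step(m, P, np.array([y]), A, Q, H, R)
```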
The first comprehensive development of Bayesian Bounds for parameter estimation and nonlinear filtering/tracking. Bayesian estimation plays a central role in many signal processing problems encountered in radar, sonar, communications, seismology, and medical diagnosis. These problems are often highly nonlinear, and analytic evaluation of the exact estimator performance is intractable. A widely used technique is to derive bounds on the performance achievable by any estimator and to compare the performance of particular estimators against these bounds. This book provides a comprehensive overview of the state of the art in Bayesian Bounds. It addresses two related problems: the estimation of multiple parameters based on noisy measurements, and the estimation of random processes, either continuous or discrete, based on noisy measurements. An extensive introductory chapter provides an overview of Bayesian estimation and of the interrelationship and applicability of the various Bayesian Bounds for both static parameters and random processes, and provides the context for the collection of papers that are included. This book will serve as a comprehensive reference for engineers and statisticians interested in both theory and application. It is also suitable as a text for a graduate seminar or as a supplementary reference for an estimation theory course.
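For concreteness, the best-known member of this family is the Bayesian Cramér-Rao (Van Trees) bound; the statement below is the standard textbook form, with θ the random parameter vector and x the measurement, rather than a quotation from the book.

```latex
% Bayesian Cramér-Rao (Van Trees) bound: for any estimator \hat{\theta}(x) of the random
% parameter vector \theta, the mean-square error matrix is bounded below by the inverse of
% the Bayesian information matrix J_B, which combines likelihood and prior information.
\[
\mathbb{E}\!\left[\bigl(\hat{\theta}(x)-\theta\bigr)\bigl(\hat{\theta}(x)-\theta\bigr)^{\mathsf T}\right]
\;\succeq\; J_B^{-1},
\qquad
J_B \;=\; \mathbb{E}\!\left[-\,\nabla_\theta \nabla_\theta^{\mathsf T}\, \ln p(x,\theta)\right].
\]
```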
Bayesian Inference of State Space Models: Kalman Filtering and Beyond offers a comprehensive introduction to Bayesian estimation and forecasting for state space models. The celebrated Kalman filter, with its numerous extensions, takes centre stage in the book. Univariate and multivariate models, both linear Gaussian and non-linear, non-Gaussian, are discussed with applications to signal processing, environmetrics, economics and systems engineering. In recent years there has been a growing literature on Bayesian inference of state space models, focusing on multivariate models as well as on non-linear and non-Gaussian models. The availability of time series data in many fields of science and industry on the one hand, and the development of low-cost computational capabilities on the other, have resulted in a wealth of statistical methods aimed at parameter estimation and forecasting. This book brings together many of these methods, presenting an accessible and comprehensive introduction to state space models. A number of data sets from different disciplines are used to illustrate the methods and show how they are applied in practice. The R package BTSA, created for the book, includes many of the algorithms and examples presented. The book is essentially self-contained and includes a chapter summarising the prerequisites in undergraduate linear algebra, probability and statistics. An up-to-date and complete account of state space methods, illustrated by real-life data sets and R code, this textbook will appeal to a wide range of students and scientists, notably in the disciplines of statistics, systems engineering, signal processing, data science, finance and econometrics. With numerous exercises in each chapter, and prerequisite knowledge conveniently recalled, it is suitable for upper undergraduate and graduate courses.
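For reference, the class of models meant here can be written in the standard two-equation state space form below (a generic textbook formulation with illustrative notation F_t, G_t, V_t, W_t, not a quotation from the book).

```latex
% A linear Gaussian state space model: an observation equation linking the data y_t to a
% latent state x_t, and a state (transition) equation describing how the state evolves.
\[
y_t = F_t\, x_t + v_t, \qquad v_t \sim \mathcal{N}(0, V_t), \qquad \text{(observation equation)}
\]
\[
x_t = G_t\, x_{t-1} + w_t, \qquad w_t \sim \mathcal{N}(0, W_t). \qquad \text{(state equation)}
\]
```

In the linear Gaussian case the Kalman filter delivers the exact filtering distribution in closed form; the non-linear and non-Gaussian variants replace these linear Gaussian assumptions with general observation and transition densities, which is where simulation-based inference enters.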
Monte Carlo methods are revolutionizing the on-line analysis of data in many fields. They have made it possible to solve numerically many complex, non-standard problems that were previously intractable. This book presents the first comprehensive treatment of these techniques.
Smoothness Priors Analysis of Time Series addresses some of the problems of modeling stationary and nonstationary time series primarily from a Bayesian stochastic regression "smoothness priors" state space point of view. Prior distributions on model coefficients are parametrized by hyperparameters. Maximizing the likelihood of a small number of hyperparameters permits the robust modeling of a time series with relatively complex structure and a very large number of implicitly inferred parameters. The critical statistical ideas in smoothness priors are the likelihood of the Bayesian model and the use of likelihood as a measure of the goodness of fit of the model. The emphasis is on a general state space approach in which the recursive conditional distributions for prediction, filtering, and smoothing are realized using a variety of nonstandard methods including numerical integration, a Gaussian mixture distribution two-filter smoothing formula, and a Monte Carlo "particle-path tracing" method in which the distributions are approximated by many realizations. The methods are applicable for modeling time series with complex structures.
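A minimal sketch of the smoothness-priors idea in its simplest trend-estimation form (illustrative notation, not taken from the book: y_n = t_n + e_n is the observation model, Δ^k the k-th difference operator, and λ² the smoothness trade-off):

```latex
% Smoothness priors for a trend t_n observed as y_n = t_n + e_n: a Gaussian prior on the
% k-th differences of the trend penalises roughness, so the posterior mode solves a
% penalised least-squares problem whose trade-off parameter \lambda^2 is a hyperparameter.
\[
\min_{t_1,\dots,t_N} \;\; \sum_{n=1}^{N} \bigl(y_n - t_n\bigr)^2 \;+\; \lambda^2 \sum_{n} \bigl(\Delta^k t_n\bigr)^2 .
\]
```

The smoothness trade-off λ² (a ratio of noise variances) and the difference order k play the role of the hyperparameters; choosing them by maximising the marginal likelihood of the data under the Bayesian model is what is meant above by the "likelihood of the Bayesian model".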
By restricting to Gaussian distributions, the optimal Bayesian filtering problem can be transformed into an algebraically simple form, which allows for computationally efficient algorithms. Three problem settings are discussed in this thesis: (1) filtering with Gaussians only, (2) Gaussian mixture filtering for strong nonlinearities, (3) Gaussian process filtering for purely data-driven scenarios. For each setting, efficient algorithms are derived and applied to real-world problems.
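As a hedged sketch of what "restricting to Gaussian distributions" typically amounts to in this setting (standard moment-matching notation, not necessarily the thesis's own): for a dynamic model x_k = f(x_{k-1}) + q_{k-1} with q_{k-1} ~ N(0, Q), the prediction step approximates the predictive density by a Gaussian whose moments are

```latex
% Moment-matching (general Gaussian filter) prediction step: the intractable predictive
% density is replaced by a Gaussian whose mean and covariance are Gaussian expectations,
% evaluated in closed form where possible or otherwise by quadrature / sigma-point rules.
\[
m_k^- = \int f(x)\, \mathcal{N}\!\bigl(x \mid m_{k-1}, P_{k-1}\bigr)\, dx,
\]
\[
P_k^- = \int \bigl(f(x) - m_k^-\bigr)\bigl(f(x) - m_k^-\bigr)^{\mathsf T}\, \mathcal{N}\!\bigl(x \mid m_{k-1}, P_{k-1}\bigr)\, dx \;+\; Q .
\]
```

Gaussian mixture filters run a bank of such Gaussian components in parallel to cope with strong nonlinearities, while Gaussian process filters learn the model functions from data.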
A bottom-up approach that enables readers to master and apply the latest techniques in state estimation. This book offers the best mathematical approaches to estimating the state of a general system. The author presents state estimation theory clearly and rigorously, providing the right amount of advanced material, recent research results, and references to enable the reader to apply state estimation techniques confidently across a variety of fields in science and engineering. While there are other textbooks that treat state estimation, this one offers special features and a unique perspective and pedagogical approach that speed learning: a straightforward, bottom-up approach that begins with basic concepts and then builds step by step to more advanced topics for a clear understanding of state estimation; simple examples and problems that require only paper and pen to solve, leading to an intuitive understanding of how theory works in practice; and MATLAB®-based source code corresponding to the examples in the book, available on the author's Web site, which enables readers to recreate results and experiment with other simulation setups and parameters. Armed with a solid foundation in the basics, readers are presented with a careful treatment of advanced topics, including unscented filtering, high-order nonlinear filtering, particle filtering, constrained state estimation, reduced-order filtering, robust Kalman filtering, and mixed Kalman/H∞ filtering. Problems at the end of each chapter include both written exercises and computer exercises. Written exercises focus on improving the reader's understanding of theory and key concepts, whereas computer exercises help readers apply theory to problems similar to ones they are likely to encounter in industry. With its expert blend of theory and practice, coupled with its presentation of recent research results, Optimal State Estimation is strongly recommended for undergraduate and graduate-level courses in optimal control and state estimation theory. It also serves as a reference for engineers and science professionals across a wide array of industries.
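Among the advanced topics listed, unscented filtering lends itself to a compact illustration: the unscented transform propagates a small set of deterministically chosen sigma points through a nonlinearity and re-estimates the mean and covariance from them. The sketch below is a generic Python/NumPy version using one common choice of scaling parameters; it is not code from the book, whose examples are MATLAB-based.

```python
import numpy as np

def unscented_transform(f, m, P, alpha=1.0, beta=2.0, kappa=0.0):
    """Propagate a Gaussian N(m, P) through a nonlinear function f using sigma points.
    alpha, beta, kappa are one common choice of scaling parameters, not the only one."""
    n = m.size
    lam = alpha**2 * (n + kappa) - n
    # Sigma points: the mean plus/minus scaled columns of a matrix square root of P.
    L = np.linalg.cholesky((n + lam) * P)
    sigma = np.vstack([m, m + L.T, m - L.T])            # shape (2n+1, n)
    # Weights for the mean and covariance estimates.
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    # Transform the sigma points and re-estimate mean and covariance.
    Y = np.array([f(s) for s in sigma])
    mean = wm @ Y
    diff = Y - mean
    cov = (wc[:, None] * diff).T @ diff
    return mean, cov

# Toy usage (hypothetical): push a 2-D Gaussian through a mild nonlinearity.
m = np.array([1.0, 0.5]); P = 0.1 * np.eye(2)
mean, cov = unscented_transform(lambda x: np.array([np.sin(x[0]), x[1] ** 2]), m, P)
```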
This book provides a general introduction to Sequential Monte Carlo (SMC) methods, also known as particle filters. These methods have become a staple for the sequential analysis of data in such diverse fields as signal processing, epidemiology, machine learning, population ecology, quantitative finance, and robotics. The coverage is comprehensive, ranging from the underlying theory to computational implementation, methodology, and diverse applications in various areas of science. This is achieved by describing SMC algorithms as particular cases of a general framework, which involves concepts such as Feynman-Kac distributions, and tools such as importance sampling and resampling. This general framework is used consistently throughout the book. Extensive coverage is provided on sequential learning (filtering, smoothing) of state-space (hidden Markov) models, as this remains an important application of SMC methods. More recent applications, such as parameter estimation of these models (through e.g. particle Markov chain Monte Carlo techniques) and the simulation of challenging probability distributions (in e.g. Bayesian inference or rare-event problems), are also discussed. The book may be used either as a graduate text on Sequential Monte Carlo methods and state-space modeling, or as a general reference work on the area. Each chapter includes a set of exercises for self-study, a comprehensive bibliography, and a “Python corner,” which discusses the practical implementation of the methods covered. In addition, the book comes with an open source Python library, which implements all the algorithms described in the book, and contains all the programs that were used to perform the numerical experiments.
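To make the importance sampling and resampling ingredients concrete, here is a minimal generic bootstrap particle filter in Python/NumPy for a one-dimensional state space model. It is an illustrative sketch with assumed model functions (init, transition, loglik) and toy parameters, not an example taken from the book or its accompanying library.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_filter(ys, n_particles, init, transition, loglik):
    """Generic bootstrap particle filter: propagate, weight by the likelihood, resample."""
    x = init(n_particles)                      # initial particle cloud
    means = []
    for y in ys:
        x = transition(x)                      # proposal = prior transition (bootstrap choice)
        logw = loglik(y, x)                    # log importance weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))            # filtering mean estimate at this time step
        idx = rng.choice(n_particles, size=n_particles, p=w)   # multinomial resampling
        x = x[idx]
    return np.array(means)

# Toy usage (hypothetical model): a Gaussian random walk observed in Gaussian noise.
ys = np.cumsum(rng.normal(0, 0.3, size=50)) + rng.normal(0, 0.5, size=50)
means = bootstrap_filter(
    ys, n_particles=1000,
    init=lambda n: rng.normal(0.0, 1.0, size=n),
    transition=lambda x: x + rng.normal(0.0, 0.3, size=x.size),
    loglik=lambda y, x: -0.5 * ((y - x) / 0.5) ** 2,
)
```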