This book is a comprehensive treatment of inference for hidden Markov models, including both algorithms and statistical theory. Topics range from filtering and smoothing of the hidden Markov chain to parameter estimation, Bayesian methods, and estimation of the number of states. The book covers, in a unified way, both models with finite state spaces and models with continuous state spaces (also called state-space models); the latter require approximate, simulation-based algorithms, which are also described in detail. Many examples illustrate the algorithms and theory, and the book builds on recent developments to present a self-contained view.
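To make the filtering step concrete, here is a minimal sketch of the forward (filtering) recursion for a finite-state HMM in R. It is illustrative code, not taken from the book; the names hmm_filter, delta, Gamma, and probs are assumptions.

```r
# Forward filtering for a finite-state HMM (illustrative sketch).
# delta: initial state distribution; Gamma: transition matrix;
# probs: matrix with probs[t, j] = p(y_t | state j).
hmm_filter <- function(delta, Gamma, probs) {
  n <- nrow(probs); m <- length(delta)
  filt <- matrix(0, n, m)
  phi <- delta * probs[1, ]
  filt[1, ] <- phi / sum(phi)
  for (t in seq_len(n)[-1]) {
    phi <- (filt[t - 1, ] %*% Gamma) * probs[t, ]  # predict, then correct
    filt[t, ] <- phi / sum(phi)                    # normalized p(x_t | y_1:t)
  }
  filt
}
```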
Monte Carlo methods are revolutionizing the on-line analysis of data in many fields. They have made it possible to solve numerically many complex, non-standard problems that were previously intractable. This book presents the first comprehensive treatment of these techniques.
Statistical and mathematical models are defined by parameters that describe different characteristics of those models. Ideally it would be possible to estimate every parameter in a model, but in some cases this is not possible. For example, two parameters that only ever appear in the model as a product cannot be estimated individually; only the product can be estimated. Such a model is said to be parameter redundant, or its parameters are described as non-identifiable. This book explains why parameter redundancy and non-identifiability are a problem and describes the different methods that can be used for detection, including in a Bayesian context. Key features of this book:

- Detailed discussion of the problems caused by parameter redundancy and non-identifiability
- Explanation of the different general methods for detecting parameter redundancy and non-identifiability, including symbolic algebra and numerical methods
- A chapter on Bayesian identifiability
- Illustrative examples throughout that clearly demonstrate each problem and method; Maple and R code are available for these examples
- More in-depth focus on discrete and continuous state-space models and ecological statistics, including methods that have been developed specifically for each of these areas

This book is designed to make parameter redundancy and non-identifiability accessible and understandable to a wide audience, from masters and PhD students to researchers, and from mathematicians and statisticians to practitioners using mathematical or statistical models.
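The product example above can be checked numerically: if two parameters enter the model only through their product, the Jacobian of the model's summary terms is rank-deficient. Below is a minimal R sketch of such a numerical-rank check on a hypothetical toy summary; the names kappa and jacobian, and the summary itself, are illustrative assumptions rather than the book's code.

```r
# Toy 'exhaustive summary' in which a and b appear only through a * b,
# so (a, b) are non-identifiable and the model is parameter redundant.
kappa <- function(theta) {
  a <- theta[1]; b <- theta[2]
  c(a * b, (a * b)^2)
}
# Jacobian by central finite differences at a generic parameter point
jacobian <- function(f, theta, h = 1e-6) {
  sapply(seq_along(theta), function(j) {
    e <- replace(numeric(length(theta)), j, h)
    (f(theta + e) - f(theta - e)) / (2 * h)
  })
}
J <- jacobian(kappa, c(0.4, 0.7))
qr(J)$rank  # rank 1 < 2 parameters: parameter redundancy detected
```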
As more applications are found, interest in hidden Markov models continues to grow. Following comments and feedback from colleagues, students, and others working with hidden Markov models, the corrected 3rd printing of this volume contains clarifications, improvements, and some new material, including results on smoothing for linear Gaussian dynamics. In Chapter 2 the derivations of the basic filters related to the Markov chain are each presented explicitly, rather than as special cases of one general filter. Furthermore, equations for smoothed estimates are given. The dynamics of the Kalman filter are derived as special cases of the authors’ general results, and new expressions for a Kalman smoother are given. The chapters on the control of hidden Markov chains are expanded and clarified. The revised Chapter 4 includes state estimation for discrete-time Markov processes, and Chapter 12 has a new section on robust control.
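For reference, here is a minimal scalar Kalman filter in R for a linear Gaussian model, the kind of dynamics the book treats as a special case of its general filters. This is an illustrative sketch only; the function name kalman_filter and the parameterization (phi, q, r) are assumptions, not the authors' formulation.

```r
# State:       x_t = phi * x_{t-1} + w_t,  w_t ~ N(0, q)
# Observation: y_t = x_t + v_t,            v_t ~ N(0, r)
kalman_filter <- function(y, phi, q, r, m0 = 0, P0 = 1) {
  n <- length(y); m <- numeric(n); P <- numeric(n)
  mp <- m0; Pp <- P0                   # initial predictive mean/variance
  for (t in seq_len(n)) {
    if (t > 1) {                       # prediction step
      mp <- phi * m[t - 1]
      Pp <- phi^2 * P[t - 1] + q
    }
    K <- Pp / (Pp + r)                 # Kalman gain
    m[t] <- mp + K * (y[t] - mp)       # filtered mean
    P[t] <- (1 - K) * Pp               # filtered variance
  }
  list(mean = m, var = P)
}
```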
Machine learning techniques provide cost-effective alternatives to traditional methods for extracting underlying relationships between information and data and for predicting future events by processing existing information to train models. Efficient Learning Machines explores the major topics of machine learning, including knowledge discovery, classification, genetic algorithms, neural networks, kernel methods, and biologically-inspired techniques. Mariette Awad and Rahul Khanna’s synthetic approach weaves together the theoretical exposition, design principles, and practical applications of efficient machine learning. Their experiential emphasis, expressed in their close analysis of sample algorithms throughout the book, aims to equip engineers, students of engineering, and system designers to design and create new and more efficient machine learning systems. Readers of Efficient Learning Machines will learn how to recognize and analyze the problems that machine learning technology can solve for them, how to implement and deploy standard solutions to sample problems, and how to design new systems and solutions. Advances in computing performance, storage, memory, unstructured information retrieval, and cloud computing have coevolved with a new generation of machine learning paradigms and big data analytics, which the authors present in the conceptual context of their traditional precursors. Awad and Khanna explore current developments in the deep learning techniques of deep neural networks, hierarchical temporal memory, and cortical algorithms. Nature suggests sophisticated learning techniques that deploy simple rules to generate highly intelligent and organized behaviors with adaptive, evolutionary, and distributed properties. The authors examine the most popular biologically-inspired algorithms, together with a sample application to distributed datacenter management. They also discuss machine learning techniques for addressing problems of multi-objective optimization in which solutions in real-world systems are constrained and evaluated based on how well they perform with respect to multiple objectives in aggregate. Two chapters on support vector machines and their extensions focus on recent improvements to the classification and regression techniques at the core of machine learning.
Hidden Markov Models for Time Series: An Introduction Using R, Second Edition illustrates the great flexibility of hidden Markov models (HMMs) as general-purpose models for time series data. The book provides a broad understanding of the models and their uses. After presenting the basic model formulation, the book covers estimation, forecasting, decoding, prediction, model selection, and Bayesian inference for HMMs. Through examples and applications, the authors describe how to extend and generalize the basic model so that it can be applied in a rich variety of situations. The book demonstrates how HMMs can be applied to a wide range of types of time series: continuous-valued, circular, multivariate, binary, bounded and unbounded counts, and categorical observations. It also discusses how to employ the freely available computing environment R to carry out the computations; a sketch of such a fit appears after this list.

Features:
- Presents an accessible overview of HMMs
- Explores a variety of applications in ecology, finance, epidemiology, climatology, and sociology
- Includes numerous theoretical and programming exercises
- Provides most of the analysed data sets online

New to the second edition:
- A total of five chapters on extensions, including HMMs for longitudinal data, hidden semi-Markov models, and models with a continuous-valued state process
- New case studies on animal movement, rainfall occurrence, and capture-recapture data
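As an illustration of carrying out such computations in R, here is a minimal sketch of fitting a 2-state Poisson HMM by direct numerical maximization of the likelihood, in the spirit of the book's approach. It is not the authors' code; the function nll, the working-parameter transformations, and the data name my_counts are assumptions.

```r
# Negative log-likelihood of a 2-state Poisson HMM via the scaled
# forward recursion; par holds unconstrained working parameters.
nll <- function(par, x) {
  lambda <- exp(par[1:2])                        # state-dependent Poisson means
  g12 <- plogis(par[3]); g21 <- plogis(par[4])   # off-diagonal transition probs
  Gamma <- matrix(c(1 - g12, g12, g21, 1 - g21), 2, byrow = TRUE)
  delta <- solve(t(diag(2) - Gamma + 1), rep(1, 2))  # stationary distribution
  phi <- delta * dpois(x[1], lambda)
  ll <- log(sum(phi)); phi <- phi / sum(phi)
  for (t in 2:length(x)) {
    phi <- (phi %*% Gamma) * dpois(x[t], lambda)
    ll <- ll + log(sum(phi)); phi <- phi / sum(phi)
  }
  -ll
}
# fit <- optim(c(log(1), log(3), 0, 0), nll, x = my_counts)
```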
The Application of Hidden Markov Models in Speech Recognition presents the core architecture of an HMM-based large vocabulary continuous speech recognition (LVCSR) system and proceeds to describe the various refinements which are needed to achieve state-of-the-art performance.
Drawing on the authors' extensive research in the analysis of categorical longitudinal data, this book focuses on the formulation of latent Markov models and the practical use of these models. It demonstrates how to use the models in three types of analysis, with numerous examples illustrating how latent Markov models are used in economics, education, sociology, and other fields. The R and MATLAB routines used for the examples are available on the authors' website.
Designed for researchers and students, Nonlinear Time Series: Theory, Methods and Applications with R Examples familiarizes readers with the principles behind nonlinear time series models, without overwhelming them with difficult mathematical developments. By focusing on basic principles and theory, the authors give readers the background required to craft their own stochastic models, numerical methods, and software. Readers will also be able to assess the advantages and disadvantages of different approaches, and thus choose the right methods for their purposes. The first part can be seen as a crash course on "classical" time series, with a special emphasis on linear state space models and detailed coverage of random coefficient autoregressions, including both ARCH and GARCH models. The second part introduces Markov chains, discussing stability, the existence of a stationary distribution, ergodicity, limit theorems, and statistical inference. The book concludes with a self-contained account of nonlinear state space and sequential Monte Carlo methods. An elementary introduction to nonlinear state space modeling and sequential Monte Carlo, this section touches on current topics, from the theory of statistical inference to advanced computational methods. The book can be used to support an advanced course on these methods, or as an introduction to this field before studying more specialized texts. Several chapters highlight recent developments such as explicit rates of convergence of Markov chains and sequential Monte Carlo techniques. And while the chapters are organized in a logical progression, the three parts can be studied independently. Statistics is not a spectator sport, so the book contains more than 200 exercises to challenge readers. These problems strengthen intellectual muscles strained by the introduction of new theory and go on to extend the theory in significant ways. The book helps readers hone their skills in nonlinear time series analysis and their applications.
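To give a flavor of the sequential Monte Carlo methods covered in the final part, here is a minimal bootstrap particle filter sketch in R for a generic nonlinear state space model. It is illustrative only, not code from the book; bootstrap_pf, r_init, f_sim, and g_dens are assumed, user-supplied names.

```r
# Bootstrap particle filter: propagate, weight, resample; returns an
# estimate of the log-likelihood of the observation sequence y.
bootstrap_pf <- function(y, N, r_init, f_sim, g_dens) {
  x <- r_init(N)                                 # initial particle cloud
  ll <- 0
  for (t in seq_along(y)) {
    x <- f_sim(x)                                # propagate through state dynamics
    w <- g_dens(y[t], x)                         # weight by observation density
    ll <- ll + log(mean(w))                      # likelihood increment
    x <- sample(x, N, replace = TRUE, prob = w)  # multinomial resampling
  }
  ll
}
# Example with a noisy AR(1) state (all settings illustrative):
# ll <- bootstrap_pf(y, 1000, function(N) rnorm(N),
#                    function(x) 0.95 * x + rnorm(length(x), sd = 0.5),
#                    function(yt, x) dnorm(yt, mean = x, sd = 1))
```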