Comprises the proceedings of the AMS-IMS-SIAM Summer Research Conference on Statistical Inference from Stochastic Processes, held at Cornell University in August 1987. This book provides students and researchers with a familiarity with the foundations of inference from stochastic processes and intends to provide a knowledge of the developments.
Covering both theory and applications, this collection of eleven contributed papers surveys the role of probabilistic models and statistical techniques in image analysis and processing, and develops likelihood methods for inference about parameters that determine the drift and the jump mechanism of a diffusion process.
The first book on inference for stochastic processes from a statistical, rather than a probabilistic, perspective. It provides a systematic exposition of theoretical results from over ten years of mathematical literature and presents, for the first time in book form, many new techniques and approaches.
This is the first book designed to introduce Bayesian inference procedures for stochastic processes. There are clear advantages to the Bayesian approach, including the optimal use of prior information. The book begins with a brief review of Bayesian inference, using many examples relevant to the analysis of stochastic processes, including the four major types, classified by whether time and state space are each discrete or continuous. The elements necessary for understanding stochastic processes are then introduced, followed by chapters devoted to the Bayesian analysis of such processes; importantly, one chapter is devoted to the fundamental concepts of stochastic processes. Bayesian inference (estimation, testing hypotheses, and prediction) is described in detail for discrete-time Markov chains, Markov jump processes, normal processes (e.g. Brownian motion and the Ornstein–Uhlenbeck process), traditional time series, and, lastly, point and spatial processes. Heavy emphasis is placed on many examples taken from biology and other scientific disciplines. R and WinBUGS are used to carry out the analyses of stochastic processes.
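As a small, hedged illustration of the kind of Bayesian analysis for discrete-time Markov chains described here (a concept sketch in Python rather than the book's own R/WinBUGS code; the chain and prior value are arbitrary), the conjugate Dirichlet posterior for transition probabilities can be computed directly:

```python
import numpy as np

def dirichlet_posterior_means(chain, n_states, prior=1.0):
    """Posterior mean transition matrix for a discrete-time Markov chain.

    With an independent Dirichlet(prior, ..., prior) prior on each row,
    the posterior for row i is Dirichlet(prior + counts of observed
    i -> j transitions), so the posterior mean is a smoothed
    transition-frequency estimate.
    """
    counts = np.zeros((n_states, n_states))
    for s, t in zip(chain[:-1], chain[1:]):
        counts[s, t] += 1
    alpha = counts + prior                      # posterior Dirichlet parameters
    return alpha / alpha.sum(axis=1, keepdims=True)

# Toy realization of a 2-state chain
chain = [0, 0, 1, 0, 1, 1, 1, 0]
P_hat = dirichlet_posterior_means(chain, n_states=2, prior=1.0)
```

With a uniform prior (prior=1.0), the posterior mean reduces to (count + 1) / (row total + number of states), the familiar Laplace-smoothed estimate.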
Features:
- Uses the Bayesian approach to make statistical inferences about stochastic processes.
- R is used to simulate realizations from different types of processes.
- Based on realizations from stochastic processes, the WinBUGS package provides the Bayesian analysis (estimation, testing hypotheses, and prediction) for the unknown parameters of stochastic processes.
- Many examples taken from biology, economics, and astronomy illustrate Bayesian inference and reinforce the basic concepts of the subject.
- A practical approach is implemented by considering realistic examples of interest to the scientific community.
- WinBUGS and R code are provided in the text, allowing the reader to easily verify the results of the inferential procedures found in the many examples of the book.

Readers with a good background in two areas, probability theory and statistical inference, should be able to master the essential ideas of this book.
This work is an overview of statistical inference in stationary, discrete-time stochastic processes. Results from the last fifteen years, particularly on non-Gaussian sequences and on semi-parametric and non-parametric analysis, are reviewed. The first chapter gives a background of results on martingales and strong mixing sequences, which enable us to generate various classes of consistent and asymptotically normal (CAN) estimators in the case of dependent observations. Topics discussed include inference in Markov chains and extensions of Markov chains such as Raftery's Mixture Transition Distribution model and hidden Markov chains, as well as extensions of ARMA models with Binomial, Poisson, Geometric, Exponential, Gamma, Weibull, Lognormal, Inverse Gaussian, and Cauchy stationary distributions. It further discusses applications of semi-parametric methods of estimation, such as conditional least squares and estimating functions, in stochastic models. Construction of confidence intervals based on estimating functions is discussed in some detail. Kernel-based estimation of the joint density and of the conditional expectation is also discussed. Bootstrap and other resampling procedures for dependent sequences, such as Markov chains, Markov sequences, and linear autoregressive moving average sequences, including the block-based bootstrap for stationary sequences and other block-based procedures, are also discussed in some detail. This work can be useful for researchers interested in recent developments in inference for discrete-time stochastic processes. It can also serve as material for advanced-level research students.
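The block-based bootstrap mentioned above can be sketched in a few lines (a minimal Python illustration of the general moving-block idea, not code from this work; the block length, the statistic, and the toy series are arbitrary choices):

```python
import numpy as np

def moving_block_bootstrap(x, block_len, n_boot, stat=np.mean, rng=None):
    """Resample a stationary sequence by concatenating random blocks.

    Overlapping blocks of length `block_len` are drawn with replacement
    and concatenated until the resampled series matches len(x); the
    statistic is then evaluated on each resampled series. Keeping whole
    blocks preserves the short-range dependence that an i.i.d. bootstrap
    would destroy.
    """
    rng = np.random.default_rng(rng)
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts = np.arange(n - block_len + 1)       # admissible block start points
    out = np.empty(n_boot)
    for b in range(n_boot):
        picks = rng.choice(starts, size=n_blocks, replace=True)
        series = np.concatenate([x[s:s + block_len] for s in picks])[:n]
        out[b] = stat(series)
    return out

# Bootstrap distribution of the mean of a weakly dependent toy sequence
rng = np.random.default_rng(0)
x = 0.05 * np.cumsum(rng.normal(size=200)) + rng.normal(size=200)
boot_means = moving_block_bootstrap(x, block_len=10, n_boot=500, rng=1)
```

The spread of `boot_means` then serves as an estimate of the sampling variability of the mean under dependence.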
This book was first published in 2004. Many observed phenomena, from the changing health of a patient to values on the stock market, are characterised by quantities that vary over time: stochastic processes are designed to study them. This book introduces practical methods of applying stochastic processes to an audience knowledgeable only in basic statistics. It covers almost all aspects of the subject and presents the theory in an easily accessible form that is highlighted by application to many examples. These examples arise from dozens of areas, from sociology through medicine to engineering. Complementing these are exercise sets, making the book well suited to introductory courses in stochastic processes. Software (available from www.cambridge.org) is provided for the freely available R system for the reader to apply to all the models presented.
This text is an elementary introduction to stochastic processes in discrete and continuous time, with an introduction to statistical inference. The material is standard and classical for a first course in stochastic processes at the senior/graduate level (Lessons 1-12). To provide students with a view of the statistics of stochastic processes, three lessons (13-15) were added. These lessons can be either optional or serve as an introduction to statistical inference with dependent observations. Several points of this text need to be elaborated: (1) The pedagogy is somewhat obvious. Since this text is designed for a one-semester course, each lesson can be covered in one week or so. Having in mind a mixed audience of students from different departments (Mathematics, Statistics, Economics, Engineering, etc.), we have presented the material in each lesson in the most simple way, with emphasis on motivation of concepts, aspects of applications, and computational procedures. Basically, we try to explain to beginners questions such as "What is the topic in this lesson?", "Why this topic?", and "How to study this topic mathematically?". The exercises at the end of each lesson will deepen the students' understanding of the material and test their ability to carry out basic computations. Exercises with an asterisk are optional (difficult) and might not be suitable for homework, but should provide food for thought.
The YUIMA package is the first comprehensive R framework based on S4 classes and methods which allows for the simulation of stochastic differential equations driven by a Wiener process, Lévy processes, or fractional Brownian motion, as well as CARMA, COGARCH, and point processes. The package performs various central statistical analyses such as quasi-maximum likelihood estimation, adaptive Bayes estimation, structural change-point analysis, hypothesis testing, asynchronous covariance estimation, lead-lag estimation, LASSO model selection, and so on. YUIMA also supports stochastic numerical analysis by fast computation of the expected value of functionals of stochastic processes through automatic asymptotic expansion by means of the Malliavin calculus. All models can be multidimensional, multiparametric, or non-parametric. The book briefly explains the underlying theory for simulation and inference of several classes of stochastic processes and then presents both simulation experiments and applications to real data. Although these processes were originally proposed in physics and more recently in finance, they are becoming popular also in biology due to the fact that time-course experimental data are now available. The YUIMA package, available on CRAN, can be freely downloaded, and this companion book will enable the user to start his or her analysis from the first page.
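YUIMA itself is an R package, but the kind of stochastic differential equation it simulates can be illustrated with a short, self-contained Python sketch (an Euler-Maruyama discretization of an Ornstein-Uhlenbeck process; the parameter values and step count are arbitrary, and this is not YUIMA code):

```python
import numpy as np

def euler_maruyama_ou(theta, mu, sigma, x0, T, n_steps, rng=None):
    """Simulate dX_t = theta*(mu - X_t) dt + sigma dW_t on [0, T].

    A plain Euler-Maruyama scheme: each step adds the drift times dt
    and a Gaussian increment with standard deviation sigma*sqrt(dt).
    """
    rng = np.random.default_rng(rng)
    dt = T / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        dW = rng.normal(scale=np.sqrt(dt))
        x[i + 1] = x[i] + theta * (mu - x[i]) * dt + sigma * dW
    return x

# One sample path, started away from the long-run mean mu = 0
path = euler_maruyama_ou(theta=2.0, mu=0.0, sigma=0.3, x0=1.0,
                         T=5.0, n_steps=1000, rng=42)
```

Estimation of theta and sigma from such a discretely observed path is exactly the setting where quasi-maximum likelihood methods, as implemented in YUIMA, apply.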