This text on economic forecasting asks why some practices seem to work empirically despite a lack of formal support from theory. After reviewing the conventional approach to forecasting, it examines the implications for causal modelling, presents the main forecast errors, and delineates the sources of forecast failure.
Co-integration, equilibrium, and equilibrium correction are key concepts in modern applications of econometrics to real-world problems. This book provides direction and guidance through the now vast literature facing students and graduate economists. Econometric theory is linked to practical issues such as how to identify equilibrium relationships, how to deal with structural breaks associated with regime changes, and what to do when variables are of different orders of integration.
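As an illustrative aside, one common way to check for the kind of equilibrium relationship described above is an Engle-Granger cointegration test. The sketch below is not from the book: it uses simulated data, arbitrary parameter values, and the coint function from statsmodels, purely to show the idea.

```python
# Minimal sketch: testing for a cointegrating (equilibrium) relationship between
# two simulated I(1) series with the Engle-Granger procedure. The data-generating
# process and parameter values are illustrative assumptions.
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(0)
n = 500
common_trend = np.cumsum(rng.normal(size=n))             # shared stochastic trend
x = common_trend + rng.normal(scale=0.5, size=n)         # I(1) series driven by the trend
y = 2.0 * common_trend + rng.normal(scale=0.5, size=n)   # cointegrated with x

t_stat, p_value, crit_values = coint(y, x)               # Engle-Granger two-step test
print(f"EG t-statistic = {t_stat:.2f}, p-value = {p_value:.3f}")
# A small p-value suggests y and x share a long-run equilibrium relationship,
# so an equilibrium-correction term belongs in the short-run model.
```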
This book provides a formal analysis of the models, procedures, and measures of economic forecasting with a view to improving forecasting practice. David Hendry and Michael Clements base the analyses on assumptions pertinent to the economies to be forecast, viz. a non-constant, evolving economic system, and econometric models whose form and structure are unknown a priori. The authors find that conclusions which can be established formally for constant-parameter stationary processes and correctly-specified models often do not hold when those unrealistic assumptions are relaxed. Despite the difficulty of proceeding formally when models are mis-specified in unknown ways for non-stationary processes that are subject to structural breaks, Hendry and Clements show that significant insights can be gleaned. For example, a formal taxonomy of forecasting errors can be developed, the role of causal information clarified, intercept corrections re-established as a method for achieving robustness against forms of structural change, and measures of forecast accuracy re-interpreted.
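To make the intercept-correction idea concrete, the sketch below applies the standard device of adding the most recent forecast error to the model's forecasts after an unmodelled location shift. The AR(1) process, break size, and "known" coefficients are assumptions made for the illustration, not the authors' specification.

```python
# Minimal sketch: intercept correction after a location shift the model misses.
# Adding the latest one-step error to the forecasts offsets the unmodelled break.
import numpy as np

rng = np.random.default_rng(5)
n, break_at = 200, 180
mu = np.where(np.arange(n) < break_at, 0.0, 3.0)   # mean shifts from 0 to 3 late in the sample
y = np.zeros(n)
for t in range(1, n):
    y[t] = mu[t] + 0.5 * (y[t - 1] - mu[t - 1]) + rng.normal()

def ar1_forecast(last, steps, intercept=0.0, phi=0.5):
    """Iterated forecasts from the (now mis-specified) pre-break AR(1) model."""
    out, f = [], last
    for _ in range(steps):
        f = intercept + phi * f
        out.append(f)
    return np.array(out)

last_error = y[-1] - ar1_forecast(y[-2], 1)[0]   # most recent one-step-ahead error
plain = ar1_forecast(y[-1], 4)                   # forecasts that ignore the break
corrected = plain + last_error                   # intercept-corrected forecasts
print("plain:", plain.round(2))
print("intercept-corrected:", corrected.round(2))
```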
Forecasting is required in many situations. Stocking an inventory may require forecasts of demand months in advance. Telecommunication routing requires traffic forecasts a few minutes ahead. Whatever the circumstances or time horizons involved, forecasting is an important aid in effective and efficient planning. This textbook provides a comprehensive introduction to forecasting methods and presents enough information about each method for readers to use them sensibly.
Statistics in Volcanology is a comprehensive guide to modern statistical methods applied in volcanology written by today's leading authorities. The volume aims to show how the statistical analysis of complex volcanological data sets, including time series, and numerical models of volcanic processes can improve our ability to forecast volcanic eruptions. Specific topics include the use of expert elicitation and Bayesian methods in eruption forecasting, statistical models of temporal and spatial patterns of volcanic activity, analysis of time series in volcano seismology, probabilistic hazard assessment, and assessment of numerical models using robust statistical methods. Also provided are comprehensive overviews of volcanic phenomena, and a full glossary of both volcanological and statistical terms. Statistics in Volcanology is essential reading for advanced undergraduates, graduate students, and research scientists interested in this multidisciplinary field.
This text presents modern developments in time series analysis and focuses on their application to economic problems. The book first introduces the fundamental concept of a stationary time series and the basic properties of its covariance structure, investigating the structure and estimation of autoregressive moving-average (ARMA) models and their relation to that covariance structure. The book then moves on to non-stationary time series, highlighting the consequences of non-stationarity for modeling and forecasting and presenting standard statistical tests and regressions. Next, the text discusses volatility models and their applications in the analysis of financial market data, focusing on generalized autoregressive conditional heteroskedastic (GARCH) models. The second part of the text is devoted to multivariate processes, such as vector autoregressive (VAR) models and structural vector autoregressive (SVAR) models, which have become the main tools in empirical macroeconomics. The text concludes with a discussion of co-integrated models and the Kalman Filter, which is being used with increasing frequency. Mathematically rigorous, yet application-oriented, this self-contained text will help students develop a deeper understanding of the theory and a better command of the models that are vital to the field. Assuming a basic knowledge of statistics and/or econometrics, this text is best suited for advanced undergraduate and beginning graduate students.
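As a minimal illustration of the ARMA modelling and forecasting workflow described above (not code from the book), the sketch below simulates an ARMA(1,1) series and fits it with statsmodels; the coefficients and sample size are arbitrary choices for the example.

```python
# Minimal sketch: simulate an ARMA(1,1) series, estimate it, and forecast ahead.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

np.random.seed(0)
# ARMA(1,1): y_t = 0.7*y_{t-1} + e_t + 0.3*e_{t-1}
arma = ArmaProcess(ar=[1, -0.7], ma=[1, 0.3])
y = arma.generate_sample(nsample=400)

model = ARIMA(y, order=(1, 0, 1))   # ARMA(1,1) is ARIMA with d = 0
result = model.fit()
print(result.params)                # estimated constant, AR, and MA coefficients
print(result.forecast(steps=8))     # 8-step-ahead point forecasts
```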
Some of the key mathematical results are stated without proof in order to make the underlying theory accessible to a wider audience. The book assumes knowledge only of basic calculus, matrix algebra, and elementary statistics. The emphasis is on methods and the analysis of data sets. The logic and tools of model-building for stationary and non-stationary time series are developed in detail, and numerous exercises, many of which make use of the included computer package, provide the reader with ample opportunity to develop skills in this area. The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space models, with an optional chapter on spectral analysis. Additional topics include harmonic regression, the Burg and Hannan-Rissanen algorithms, unit roots, regression with ARMA errors, structural models, the EM algorithm, generalized state-space models with applications to time series of count data, exponential smoothing, the Holt-Winters and ARAR forecasting algorithms, transfer function models, and intervention analysis. Brief introductions are also given to cointegration and to non-linear, continuous-time, and long-memory models. The time series package included in the back of the book is a slightly modified version of the package ITSM, published separately as ITSM for Windows by Springer-Verlag in 1994. It does not handle data sets as large as ITSM for Windows does, but, like the latter, it runs on IBM-PC-compatible computers under either DOS or Windows (version 3.1 or later). The programs are all menu-driven, so the reader can immediately apply the techniques in the book to time series data with a minimal investment of time in the computational and algorithmic aspects of the analysis.
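For illustration, the Holt-Winters algorithm listed among the forecasting methods above can be run in a few lines with statsmodels (rather than the book's ITSM package); the monthly series below is fabricated for the example.

```python
# Minimal sketch: Holt-Winters (additive trend and seasonality) on made-up monthly data.
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(2)
t = np.arange(36)                                    # three years of monthly observations
data = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=2, size=36)

fit = ExponentialSmoothing(data, trend="add", seasonal="add", seasonal_periods=12).fit()
print(fit.forecast(12))                              # forecast the next 12 months
```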
This book examines conventional time series in the context of stationary data prior to a discussion of cointegration, with a focus on multivariate models. The authors provide a detailed and extensive study of impulse responses and forecasting in both the stationary and the non-stationary context, considering small-sample corrections, volatility, and the impact of different orders of integration. Models with expectations are considered, along with alternative methods such as Singular Spectrum Analysis (SSA), the Kalman Filter, and Structural Time Series models, all in relation to cointegration. Using single-equation methods to develop topics, and as examples of the notion of cointegration, Burke, Hunter, and Canepa provide direction and guidance through the now vast literature facing students and graduate economists.
This book contains an extensive, up-to-date overview of nonlinear time series models and their application to modelling economic relationships. It considers nonlinear models in stationary and nonstationary frameworks, and both parametric and nonparametric models are discussed. The book contains examples of nonlinear models in economic theory and presents the most common nonlinear time series models. Importantly, it shows the reader how to apply these models in practice. For this purpose, the building of various nonlinear models, with its three stages of specification, estimation, and evaluation, is discussed in detail and is illustrated by several examples involving both economic and non-economic data. Since estimation of nonlinear time series models is carried out using numerical algorithms, the book contains a chapter on estimating parametric nonlinear models and another on estimating nonparametric ones. Forecasting is a major reason for building time series models, linear or nonlinear. The book contains a discussion of forecasting with nonlinear models, both parametric and nonparametric, and considers the numerical techniques necessary for computing multi-period forecasts from them. The main focus of the book is on models of the conditional mean, but models of the conditional variance, mainly those of autoregressive conditional heteroskedasticity, receive attention as well. A separate chapter is devoted to state space models. As a whole, the book is an indispensable tool for researchers interested in nonlinear time series and is also suitable for teaching courses in econometrics and time series analysis.
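The need for numerical techniques in multi-period forecasting from nonlinear models can be sketched as follows: because the conditional mean of a nonlinear function is not the function of the conditional mean, future shocks have to be integrated out, for example by simulation. The threshold autoregression and its parameters below are assumptions made purely for the illustration.

```python
# Minimal sketch: Monte Carlo multi-step forecasting for a nonlinear autoregression,
# compared with a naive "plug-in" forecast that ignores future shocks.
import numpy as np

def g(y):
    # simple self-exciting threshold AR(1): different persistence in each regime
    return np.where(y > 0, 0.8 * y, 0.2 * y)

rng = np.random.default_rng(3)
y_last = 1.5                      # last observed value (illustrative)
horizon, n_paths = 5, 10_000

paths = np.full(n_paths, y_last)
for _ in range(horizon):
    paths = g(paths) + rng.normal(scale=1.0, size=n_paths)   # simulate future shocks

naive = y_last
for _ in range(horizon):
    naive = g(naive)              # iterated plug-in forecast, shocks set to zero

print(f"Monte Carlo {horizon}-step forecast: {paths.mean():.3f}")
print(f"Naive plug-in forecast: {float(naive):.3f}")
```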
Forecasting Economic Time Series, Second Edition, part of the Economic Theory, Econometrics, and Mathematical Economics series, presents developments in time series analysis and forecasting theory and practice. This book discusses the application of time series procedures in mainstream economic theory and econometric model building. Organized into 10 chapters, this edition begins with an overview of the problem of dealing with time series possessing a deterministic seasonal component. The text then describes time series in terms of models, known as the time-domain approach. Other chapters consider an alternative approach, known as spectral or frequency-domain analysis, that often provides useful insights into the properties of a series. The book also discusses a unified approach to the fitting of linear models to a given time series. The final chapter deals with the main advantage of having a Gaussian series, namely that the optimal single-series, least-squares forecast is a linear forecast. This book is a valuable resource for economists.
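As a small illustration of the frequency-domain approach mentioned above, the sketch below computes a periodogram of a simulated series with a period-12 seasonal component using scipy; the series and its parameters are fabricated for the example.

```python
# Minimal sketch: locate a deterministic seasonal component via the periodogram.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(4)
t = np.arange(240)
y = 5 * np.sin(2 * np.pi * t / 12) + rng.normal(size=t.size)   # period-12 cycle plus noise

freqs, power = periodogram(y)                                   # frequencies in cycles per observation
peak = freqs[np.argmax(power)]
print(f"Dominant frequency ~ {peak:.3f} cycles/observation (period ~ {1 / peak:.1f})")
```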