Frequency of Observation and the Estimation of Integrated Volatility in Deep and Liquid Financial Markets

Author: Alain Chaboud

Publisher:

Published: 2008

Total Pages: 60

ISBN-13:

Using two newly available ultrahigh-frequency datasets, we investigate empirically how frequently one can sample certain foreign exchange and U.S. Treasury security returns without contaminating estimates of their integrated volatility with market microstructure noise. We find that one can sample FX returns as frequently as once every 15 to 20 seconds without contaminating volatility estimates; bond returns may be sampled as frequently as once every 2 to 3 minutes on days without U.S. macroeconomic announcements, and as frequently as once every 40 seconds on announcement days. With a simple realized kernel estimator, the sampling frequencies can be increased to once every 2 to 5 seconds for FX returns and to about once every 30 to 40 seconds for bond returns. These sampling frequencies, especially in the case of FX returns, are much higher than those often recommended in the empirical literature on realized volatility in equity markets. The higher sampling frequencies for FX and bond returns likely reflect the superior depth and liquidity of these markets.
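The volatility signature logic behind the sampling-frequency question can be illustrated with a small simulation. The sketch below is hypothetical (synthetic one-second prices with i.i.d. microstructure noise, not the paper's FX or Treasury data): realized variance computed from very finely sampled returns is inflated by the noise, while coarser sampling recovers the integrated variance, normalized here to 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# One simulated trading day of one-second log prices: an efficient price
# with integrated variance normalized to 1, plus i.i.d. microstructure noise.
n = 6 * 60 * 60
efficient = np.cumsum(rng.normal(0.0, np.sqrt(1.0 / n), n))
observed = efficient + rng.normal(0.0, 0.005, n)

def realized_variance(log_prices, step):
    """Sum of squared returns sampled every `step` observations."""
    returns = np.diff(log_prices[::step])
    return np.sum(returns ** 2)

# Volatility signature: realized variance against the sampling interval.
# Fine sampling is badly inflated by noise; coarse sampling approaches 1.
for step in (1, 5, 15, 60, 300):
    print(f"every {step:>3d}s: RV = {realized_variance(observed, step):.3f}")
```

With i.i.d. noise of variance ω², each of the n/step sampled returns picks up an extra 2ω² in expectation, which is why the bias shrinks as the sampling interval grows — the tradeoff the paper's decision rules navigate.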


Econophysics Approaches to Large-Scale Business Data and Financial Crisis

Author: Misako Takayasu

Publisher: Springer Science & Business Media

Published: 2010-04-27

Total Pages: 320

ISBN-13: 4431538534

In recent years, as part of the increasing “informationization” of industry and the economy, enterprises have been accumulating vast amounts of detailed data such as high-frequency transaction data in financial markets and point-of-sale information on individual items in the retail sector. Similarly, vast amounts of data are now available on business networks based on interfirm transactions and shareholdings. In the past, these types of information were studied only by economists and management scholars. More recently, however, researchers from other fields, such as physics, mathematics, and information sciences, have become interested in this kind of data and, based on novel empirical approaches to searching for regularities and “laws” akin to those in the natural sciences, have produced intriguing results. This book is the proceedings of the international conference THICCAPFA7, titled “New Approaches to the Analysis of Large-Scale Business and Economic Data,” held in Tokyo, March 1–5, 2009. The letters THIC denote the Tokyo Tech (Tokyo Institute of Technology)–Hitotsubashi Interdisciplinary Conference. The conference series, titled APFA (Applications of Physics in Financial Analysis), focuses on the analysis of large-scale economic data. It has traditionally brought physicists and economists together to exchange viewpoints and experience (APFA1 in Dublin 1999, APFA2 in Liège 2000, APFA3 in London 2001, APFA4 in Warsaw 2003, APFA5 in Torino 2006, and APFA6 in Lisbon 2007). The aim of the conference is to establish fundamental analytical techniques and data collection methods, taking into account the results from a variety of academic disciplines.


Frequency of Observation and the Estimation of Integrated Volatility in Deep and Liquid Financial Markets

Author: Alain Chaboud

Publisher:

Published: 2021

Total Pages: 0

ISBN-13:

Using two newly available ultrahigh-frequency datasets, we investigate empirically how frequently one can sample certain foreign exchange and U.S. Treasury security returns without contaminating estimates of their integrated volatility with market microstructure noise. Using volatility signature plots and a recently proposed formal decision rule to select the sampling frequency, we find that one can sample FX returns as frequently as once every 15 to 20 seconds without contaminating volatility estimates; bond returns may be sampled as frequently as once every 2 to 3 minutes on days without U.S. macroeconomic announcements, and as frequently as once every 40 seconds on announcement days. With a simple realized kernel estimator, the sampling frequencies can be increased to once every 2 to 5 seconds for FX returns and to about once every 30 to 40 seconds for bond returns. These sampling frequencies, especially in the case of FX returns, are much higher than those often recommended in the empirical literature on realized volatility in equity markets. We suggest that the generally superior depth and liquidity of trading in FX and government bond markets contributes importantly to this difference.
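A realized kernel corrects the noise bias by adding weighted return autocovariances to the plain sum of squared returns. The following is a minimal sketch with Bartlett weights on synthetic data — the bandwidth H = 60 and the noise level are illustrative choices, not the paper's estimator or tuning:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic one-second log prices: integrated variance 1 plus i.i.d. noise.
n = 6 * 60 * 60
efficient = np.cumsum(rng.normal(0.0, np.sqrt(1.0 / n), n))
observed = efficient + rng.normal(0.0, 0.005, n)
returns = np.diff(observed)

def realized_kernel(r, H):
    """Realized variance plus Bartlett-weighted autocovariances; the
    negative first-order autocovariance induced by i.i.d. noise offsets
    most of the noise inflation in the squared-return term."""
    rk = np.sum(r ** 2)
    for h in range(1, H + 1):
        gamma_h = np.sum(r[h:] * r[:-h])
        rk += 2.0 * (1.0 - h / (H + 1)) * gamma_h
    return rk

naive = np.sum(returns ** 2)            # inflated by noise at 1s sampling
kernel = realized_kernel(returns, H=60)
print(f"naive 1s RV: {naive:.3f}, realized kernel: {kernel:.3f}")
```

Because the estimator stays close to the true integrated variance even at one-second sampling, it can exploit far more observations than sparse-sampled realized variance — the reason the abstract's kernel-based frequencies are so much higher.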


Testing for Cointegration Using the Johansen Methodology when Variables are Near-integrated

Author: Erik Hjalmarsson

Publisher:

Published: 2007

Total Pages: 28

ISBN-13:

We investigate the properties of Johansen's (1988, 1991) maximum eigenvalue and trace tests for cointegration under the empirically relevant situation of near-integrated variables. Using Monte Carlo techniques, we show that in a system with near-integrated variables, the probability of reaching an erroneous conclusion regarding the cointegrating rank of the system is generally substantially higher than the nominal size. The risk of concluding that completely unrelated series are cointegrated is therefore non-negligible. The spurious rejection rate can be reduced by performing additional tests of restrictions on the cointegrating vector(s), although it is still substantially larger than the nominal size.
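The mechanism behind such spurious findings can be illustrated without the full Johansen machinery. The toy Monte Carlo below (plain NumPy, hypothetical parameter choices, not the paper's experiments) regresses pairs of independent near-integrated AR(1) series on each other and shows that a conventional 5% t-test rejects far more often than its nominal size:

```python
import numpy as np

rng = np.random.default_rng(42)

def near_integrated(n, rho, rng):
    """AR(1) series with autoregressive root rho just below one."""
    x = np.zeros(n)
    shocks = rng.normal(size=n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + shocks[t]
    return x

n_obs, rho, n_sims = 200, 0.95, 500
rejections = 0
for _ in range(n_sims):
    y = near_integrated(n_obs, rho, rng)
    x = near_integrated(n_obs, rho, rng)    # independent of y by construction
    # OLS slope of y on x and its conventional t-statistic
    xc, yc = x - x.mean(), y - y.mean()
    beta = (xc @ yc) / (xc @ xc)
    resid = yc - beta * xc
    se = np.sqrt((resid @ resid) / (n_obs - 2) / (xc @ xc))
    if abs(beta / se) > 1.96:               # nominal 5% two-sided test
        rejections += 1

print(f"nominal 5% test rejects in {100 * rejections / n_sims:.0f}% of samples")
```

The persistence of the series, not any true relationship, drives the inflated rejection rate — the same force that distorts the size of the Johansen trace and maximum eigenvalue tests in the paper's near-integrated systems.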


DSGE Models and Central Banks

Author: Camilo Ernesto Tovar Mora

Publisher:

Published: 2008

Total Pages: 36

ISBN-13:

Over the past 15 years there has been remarkable progress in the specification and estimation of dynamic stochastic general equilibrium (DSGE) models. Central banks in developed and emerging market economies have become increasingly interested in their usefulness for policy analysis and forecasting. This paper reviews some issues and challenges surrounding the use of these models at central banks. It recognises that they offer coherent frameworks for structuring policy discussions. Nonetheless, they are not ready to accomplish all that is being asked of them. First, they still need to incorporate relevant transmission mechanisms or sectors of the economy; second, issues remain on how to empirically validate them; and finally, challenges remain on how to effectively communicate their features and implications to policy makers and to the public. Overall, at their current stage DSGE models have important limitations. How much of a problem this is will depend on their specific use at central banks.


A Residual-based Cointegration Test for Near Unit Root Variables

Author: Erik Hjalmarsson

Publisher:

Published: 2007

Total Pages: 40

ISBN-13:

Methods of inference based on a unit root assumption in the data are typically not robust to even small deviations from this assumption. In this paper, we propose robust procedures for a residual-based test of cointegration when the data are generated by a near unit root process. A Bonferroni method is used to address the uncertainty regarding the exact degree of persistence in the process. We thus provide a method for valid inference in multivariate near unit root processes where standard cointegration tests may be subject to substantial size distortions and standard OLS inference may lead to spurious results. Empirical illustrations are given by: (i) a re-examination of the Fisher hypothesis, and (ii) a test of the validity of the cointegrating relationship between aggregate consumption, asset holdings, and labor income, which has attracted a great deal of attention in the recent finance literature.
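A minimal residual-based (Engle-Granger-style) sketch of the two-step idea, assuming an exact unit root and using a lag-free Dickey-Fuller statistic — the paper's Bonferroni procedure for near unit roots is considerably more involved:

```python
import numpy as np

rng = np.random.default_rng(7)

def df_tstat(u):
    """Dickey-Fuller t-statistic for a unit root in u (no lags, no trend):
    regress diff(u) on lagged u and return the t-ratio on the lag."""
    du, lag = np.diff(u), u[:-1]
    phi = (lag @ du) / (lag @ lag)
    resid = du - phi * lag
    se = np.sqrt((resid @ resid) / (len(du) - 1) / (lag @ lag))
    return phi / se

n = 500
x = np.cumsum(rng.normal(size=n))          # a unit-root regressor
y = 2.0 * x + rng.normal(size=n)           # cointegrated with x

# Step 1: OLS of y on x.  Step 2: unit-root test on the residuals.
beta = (x @ y) / (x @ x)
t_coint = df_tstat(y - beta * x)

# Contrast: an independent random walk is not cointegrated with x,
# so its regression residuals remain highly persistent.
z = np.cumsum(rng.normal(size=n))
gamma = (x @ z) / (x @ x)
t_spurious = df_tstat(z - gamma * x)

print(f"cointegrated pair t = {t_coint:.1f}, independent pair t = {t_spurious:.1f}")
```

When the regressor's root is near but not exactly one, the null distribution of such residual-based statistics shifts, which is the size distortion the paper's Bonferroni bounds are designed to absorb.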


On the Application of Automatic Differentiation to the Likelihood Function for Dynamic General Equilibrium Models

Author: Houtan Bastani

Publisher:

Published: 2008

Total Pages: 28

ISBN-13:

A key application of automatic differentiation (AD) is to facilitate numerical optimization problems. Such problems are at the core of many estimation techniques, including maximum likelihood. As one of the first applications of AD in the field of economics, we used Tapenade to construct derivatives for the likelihood function of any linear or linearized general equilibrium model solved under the assumption of rational expectations. We view our main contribution as providing an important check on finite-difference (FD) numerical derivatives. We also construct Monte Carlo experiments to compare maximum-likelihood estimates obtained with and without the aid of automatic derivatives. We find that the convergence rate of our optimization algorithm can increase substantially when we use AD derivatives.
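Forward-mode AD can be sketched in a few lines with dual numbers, here differentiating a Gaussian log-likelihood in its mean — a toy stand-in for a DSGE likelihood, not the paper's Tapenade-generated code — and comparing the exact derivative with a central finite difference:

```python
import math

class Dual:
    """Minimal forward-mode automatic differentiation: each number carries
    its value and its derivative with respect to the parameter of interest."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __sub__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val - other.val, self.dot - other.dot)
    def __rsub__(self, other):              # other is a plain number here
        return Dual(other) - self
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def loglik(mu, data, sigma=1.0):
    """Gaussian log-likelihood as a function of the mean mu (sigma fixed)."""
    ll = Dual(-0.5 * len(data) * math.log(2.0 * math.pi * sigma ** 2))
    for x in data:
        ll = ll + (-0.5 / sigma ** 2) * (x - mu) * (x - mu)
    return ll

data = [0.8, 1.1, 1.3, 0.9]

# AD: seed the derivative of mu at 1, propagate it through the likelihood.
ad = loglik(Dual(1.0, 1.0), data).dot       # exact score at mu = 1

# FD: central finite difference, the baseline AD is checked against.
h = 1e-6
fd = (loglik(Dual(1.0 + h), data).val
      - loglik(Dual(1.0 - h), data).val) / (2.0 * h)

print(f"AD score: {ad:.10f}, FD score: {fd:.10f}")
```

The AD result is exact up to floating-point rounding, with no step-size choice, which is what makes such derivatives a useful check on FD approximations inside an optimizer.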


Measuring Liquidity in Financial Markets

Author: Abdourahmane Sarr

Publisher: International Monetary Fund

Published: 2002-12

Total Pages: 72

ISBN-13:

This paper provides an overview of indicators that can be used to illustrate and analyze liquidity developments in financial markets. The measures include bid-ask spreads, turnover ratios, and price impact measures. They gauge different aspects of market liquidity, namely tightness (costs), immediacy, depth, breadth, and resiliency. These measures are applied in selected foreign exchange, money, and capital markets to illustrate their operational usefulness. A number of measures must be considered because there is no single theoretically correct and universally accepted measure to determine a market's degree of liquidity and because market-specific factors and peculiarities must be considered.
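Several of these indicators are simple enough to compute directly. The sketch below uses hypothetical quote and volume numbers (not data from the paper): the proportional quoted spread for tightness, a turnover ratio as a depth/breadth proxy, and an Amihud-style price-impact ratio:

```python
import numpy as np

# Hypothetical daily quotes and traded values for one instrument.
bid = np.array([99.90, 99.75, 100.10])
ask = np.array([100.10, 99.95, 100.30])
traded_value = np.array([1_200_000.0, 800_000.0, 1_500_000.0])
outstanding = 50_000_000.0              # amount of the asset outstanding
mid = (bid + ask) / 2.0

# Tightness: proportional quoted bid-ask spread.
spread = (ask - bid) / mid

# Depth/breadth proxy: turnover ratio, traded value over amount outstanding.
turnover = traded_value.sum() / outstanding

# Price impact: Amihud-style ratio of |return| to traded value.
returns = np.abs(np.diff(np.log(mid)))
illiq = np.mean(returns / traded_value[1:])

print(f"mean spread {spread.mean():.4f}, turnover {turnover:.2f}, "
      f"illiq {illiq:.2e}")
```

Each measure captures a different dimension of liquidity, which is why the paper argues that no single indicator suffices on its own.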