Shrinkage Estimation of Large Covariance Matrices

Author: Olivier Ledoit

Publisher:

Published: 2019

Total Pages:

ISBN-13:

Under rotation-equivariant decision theory, sample covariance matrix eigenvalues can be optimally shrunk by recombining sample eigenvectors with a (potentially nonlinear) function of the unobservable population covariance matrix. The optimal shape of this function reflects the loss/risk that is to be minimized. We introduce a broad family of covariance matrix estimators that can handle all regular functional transformations of the population covariance matrix under large-dimensional asymptotics. We solve the problem of optimal covariance matrix estimation under a variety of loss functions motivated by statistical precedent, probability theory, and differential geometry. The key statistical ingredient of our nonlinear shrinkage methodology is a new estimator of the angle between sample and population eigenvectors, without making strong assumptions on the population eigenvalues. We also compare our methodology to two simpler ones from the literature, linear shrinkage and shrinkage based on the spiked covariance model, via both Monte Carlo simulations and an empirical application.
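
The linear shrinkage baseline that the paper compares against can be tried directly: scikit-learn ships a `LedoitWolf` estimator that pulls every sample eigenvalue toward their common mean by a single data-driven intensity. A minimal sketch of that baseline on synthetic data (this is the linear comparison method, not the paper's nonlinear estimator):

```python
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(0)
n, p = 80, 40  # sample size not much larger than dimension: noisy sample covariance
X = rng.standard_normal((n, p))

S = np.cov(X, rowvar=False)   # sample covariance matrix
lw = LedoitWolf().fit(X)      # linear shrinkage: (1 - rho) * S + rho * mu * I

# A single shrinkage intensity in [0, 1] is estimated from the data ...
print("shrinkage intensity:", round(lw.shrinkage_, 3))

# ... and it compresses the spread of the sample eigenvalues, so the
# shrunk estimate is better conditioned than the sample covariance.
spread_sample = np.ptp(np.linalg.eigvalsh(S))
spread_shrunk = np.ptp(np.linalg.eigvalsh(lw.covariance_))
print(spread_shrunk < spread_sample)  # True
```

Nonlinear shrinkage, as in the paper, replaces the single intensity with a different adjustment for each eigenvalue, which is why it can dominate this baseline when the optimal shrinkage function is not linear.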


Shrinkage Estimation for Mean and Covariance Matrices

Author: Hisayuki Tsukuma

Publisher: Springer Nature

Published: 2020-04-16

Total Pages: 119

ISBN-13: 9811515964

This book provides a self-contained introduction to shrinkage estimation for matrix-variate normal distribution models. More specifically, it presents recent techniques and results in the estimation of mean and covariance matrices in a high-dimensional setting, where the sample covariance matrix is singular. Such high-dimensional models can be analyzed using the same arguments as low-dimensional models, yielding a unified approach to both high- and low-dimensional shrinkage estimation. The unified shrinkage approach not only integrates modern and classical shrinkage estimation but is also required for further development of the field. Beginning with the notion of decision-theoretic estimation, the book explains matrix theory, group invariance, and other mathematical tools for finding better estimators. It also includes examples of shrinkage estimators that improve on standard estimators such as least squares, maximum likelihood, and minimum risk invariant estimators, and discusses the historical background of, and related topics in, decision-theoretic estimation of parameter matrices. The book is useful for researchers and graduate students in various fields requiring data-analysis skills, as well as in mathematical statistics.


High-Dimensional Covariance Matrix Estimation

Author: Aygul Zagidullina

Publisher: Springer Nature

Published: 2021-10-29

Total Pages: 123

ISBN-13: 3030800652

This book presents covariance matrix estimation and related aspects of random matrix theory. It focuses on the sample covariance matrix estimator and provides a holistic description of its properties under two asymptotic regimes: the traditional one, and the high-dimensional regime that better fits the big data context. It draws attention to the deficiencies of standard statistical tools when used in the high-dimensional setting, and introduces the basic concepts and major results related to spectral statistics and random matrix theory under high-dimensional asymptotics in an understandable and reader-friendly way. The aim of this book is to inspire applied statisticians, econometricians, and machine learning practitioners who analyze high-dimensional data to apply the recent developments in their work.


High-Dimensional Covariance Matrix Estimation: Shrinkage Toward a Diagonal Target

Author: Mr. Sakai Ando

Publisher: International Monetary Fund

Published: 2023-12-08

Total Pages: 32

ISBN-13:

This paper proposes a novel shrinkage estimator for high-dimensional covariance matrices by extending the Oracle Approximating Shrinkage (OAS) of Chen et al. (2009) to target the diagonal elements of the sample covariance matrix. We derive the closed-form solution of the shrinkage parameter and show by simulation that, when the diagonal elements of the true covariance matrix exhibit substantial variation, our method reduces the Mean Squared Error, compared with the OAS that targets an average variance. The improvement is larger when the true covariance matrix is sparser. Our method also reduces the Mean Squared Error for the inverse of the covariance matrix.
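
The estimator described above is a convex combination of the sample covariance matrix and its own diagonal. A minimal numpy sketch of that combination follows; the paper's closed-form, data-driven shrinkage parameter is not reproduced here, so `rho` is left as a user-supplied input:

```python
import numpy as np

def shrink_to_diagonal(S, rho):
    """Convex combination of the sample covariance S and its diagonal.

    rho in [0, 1] is the shrinkage intensity. The paper derives a
    closed-form, data-driven choice of rho; that formula is not
    reproduced here, so rho is a user-supplied parameter in this sketch.
    """
    return (1.0 - rho) * S + rho * np.diag(np.diag(S))

rng = np.random.default_rng(1)
n, p = 50, 20
X = rng.standard_normal((n, p))
S = np.cov(X, rowvar=False)

Sigma_hat = shrink_to_diagonal(S, rho=0.5)

# The diagonal (the target) is preserved exactly; off-diagonal entries
# are damped toward zero, which stabilizes the estimate.
print(np.allclose(np.diag(Sigma_hat), np.diag(S)))       # True
off = ~np.eye(p, dtype=bool)
print(np.all(np.abs(Sigma_hat[off]) <= np.abs(S[off])))  # True
```

Because the target keeps each variance at its sample value, this variant can adapt when the true variances differ substantially, which is exactly the regime where the paper reports gains over shrinking toward a common average variance.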


Explorations in Harmonic Analysis

Author: Steven G. Krantz

Publisher: Springer Science & Business Media

Published: 2009-05-24

Total Pages: 367

ISBN-13: 0817646698

This self-contained text provides an introduction to modern harmonic analysis in the context in which it is actually applied, in particular, through complex function theory and partial differential equations. It takes the novice mathematical reader from the rudiments of harmonic analysis (Fourier series) to the Fourier transform, pseudodifferential operators, and finally to Heisenberg analysis.


High-Dimensional Covariance Estimation

Author: Mohsen Pourahmadi

Publisher: John Wiley & Sons

Published: 2013-06-24

Total Pages: 204

ISBN-13: 1118034295

Methods for estimating sparse and large covariance matrices.

Covariance and correlation matrices play fundamental roles in every aspect of the analysis of multivariate data collected from a variety of fields including business and economics, health care, engineering, and environmental and physical sciences. High-Dimensional Covariance Estimation provides accessible and comprehensive coverage of the classical and modern approaches for estimating covariance matrices as well as their applications to the rapidly developing areas lying at the intersection of statistics and machine learning. Recently, the classical sample covariance methodologies have been modified and improved upon to meet the needs of statisticians and researchers dealing with large correlated datasets. High-Dimensional Covariance Estimation focuses on the methodologies based on shrinkage, thresholding, and penalized likelihood with applications to Gaussian graphical models, prediction, and mean-variance portfolio management. The book relies heavily on regression-based ideas and interpretations to connect and unify many existing methods and algorithms for the task. High-Dimensional Covariance Estimation features chapters on: Data, Sparsity, and Regularization; Regularizing the Eigenstructure; Banding, Tapering, and Thresholding Covariance Matrices; Sparse Gaussian Graphical Models; and Multivariate Regression. The book is an ideal resource for researchers in statistics, mathematics, business and economics, computer sciences, and engineering, as well as a useful text or supplement for graduate-level courses in multivariate analysis, covariance estimation, statistical learning, and high-dimensional data analysis.
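
Of the regularization routes the book covers, thresholding is the simplest to illustrate: small off-diagonal entries of the sample covariance, which are mostly estimation noise when the true matrix is sparse, are set to zero. A generic numpy sketch of the idea (not a specific estimator from the book; the threshold `tau` is a hypothetical fixed value that would normally be chosen by cross-validation):

```python
import numpy as np

def soft_threshold_cov(S, tau):
    """Soft-threshold the off-diagonal entries of a covariance matrix.

    Off-diagonal entries smaller than tau in absolute value are set to
    zero; larger ones are shrunk toward zero by tau. The diagonal
    (the variances) is left intact.
    """
    T = np.sign(S) * np.maximum(np.abs(S) - tau, 0.0)
    np.fill_diagonal(T, np.diag(S))
    return T

rng = np.random.default_rng(2)
n, p = 60, 30
X = rng.standard_normal((n, p))  # true covariance is the identity: maximally sparse
S = np.cov(X, rowvar=False)

T = soft_threshold_cov(S, tau=0.2)

# Thresholding zeroes out most of the small, noise-driven
# off-diagonal entries of the sample covariance.
off = ~np.eye(p, dtype=bool)
print("fraction of off-diagonal entries set to zero:",
      round(float((T[off] == 0.0).mean()), 2))
```

Unlike shrinkage toward a target, thresholding produces an exactly sparse estimate, which is what the book's Gaussian-graphical-model chapters exploit; the price is that the result is not guaranteed to be positive definite without further correction.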


Quadratic Shrinkage for Large Covariance Matrices

Author: Olivier Ledoit

Publisher:

Published: 2019

Total Pages:

ISBN-13:

This paper constructs a new estimator for large covariance matrices by drawing a bridge between the classic Stein (1975) estimator in finite samples and recent progress under large-dimensional asymptotics. Our formula is quadratic: it has two shrinkage targets weighted by quadratic functions of the concentration ratio (matrix dimension divided by sample size, a standard measure of the curse of dimensionality). The first target dominates at mid-level concentrations and the second at higher levels. This extra degree of freedom enables us to outperform linear shrinkage when optimal shrinkage is not linear (which is the general case). Both of our targets are based on what we term the "Stein shrinker", a local attraction operator that pulls sample covariance matrix eigenvalues towards their nearest neighbors, but whose force diminishes with distance, like gravitation. We prove that no cubic or higher-order nonlinearities beat quadratic with respect to Frobenius loss under large-dimensional asymptotics. Non-normality and the case where the matrix dimension exceeds the sample size are accommodated. Monte Carlo simulations confirm state-of-the-art performance in terms of accuracy, speed, and scalability.


Large Covariance and Autocovariance Matrices

Author: Arup Bose

Publisher: CRC Press

Published: 2018-07-03

Total Pages: 272

ISBN-13: 1351398164

Large Covariance and Autocovariance Matrices brings together a collection of recent results on sample covariance and autocovariance matrices in high-dimensional models and novel ideas on how to use them for statistical inference in one or more high-dimensional time series models. The prerequisites include knowledge of elementary multivariate analysis, basic time series analysis and basic results in stochastic convergence. Part I is on different methods of estimation of large covariance matrices and auto-covariance matrices and properties of these estimators. Part II covers the relevant material on random matrix theory and non-commutative probability. Part III provides results on limit spectra and asymptotic normality of traces of symmetric matrix polynomial functions of sample auto-covariance matrices in high-dimensional linear time series models. These are used to develop graphical and significance tests for different hypotheses involving one or more independent high-dimensional linear time series. The book should be of interest to people in econometrics and statistics (large covariance matrices and high-dimensional time series), mathematics (random matrices and free probability) and computer science (wireless communication). Parts of it can be used in post-graduate courses on high-dimensional statistical inference, high-dimensional random matrices and high-dimensional time series models. It should be particularly attractive to researchers developing statistical methods in high-dimensional time series models. Arup Bose is a professor at the Indian Statistical Institute, Kolkata, India. He is a distinguished researcher in mathematical statistics and has been working in high-dimensional random matrices for the last fifteen years. He has been editor of Sankhyā for several years and has been on the editorial board of several other journals. 
He is a Fellow of the Institute of Mathematical Statistics, USA, and of all three national science academies of India, as well as the recipient of the S.S. Bhatnagar Award and the C.R. Rao Award. His first book, Patterned Random Matrices, was also published by Chapman & Hall. He has a forthcoming graduate text, U-statistics, M-estimates and Resampling (with Snigdhansu Chatterjee), to be published by Hindustan Book Agency. Monika Bhattacharjee is a post-doctoral fellow at the Informatics Institute, University of Florida. After graduating from St. Xavier's College, Kolkata, she obtained her master's degree in 2012 and her PhD in 2016 from the Indian Statistical Institute. Her thesis on high-dimensional covariance and autocovariance matrices, written under the supervision of Dr. Bose, has received high acclaim.