B0458
Title: No-lose converging kernel estimation of long-run variance
Authors: Kin Wai Chan - The Chinese University of Hong Kong (Hong Kong) [presenting]
Xu Liu - The Chinese University of Hong Kong (Hong Kong)
Abstract: Kernel estimators have been popular for decades in long-run variance estimation. To minimize the loss of efficiency, measured by the mean-squared error, in important aspects of kernel estimation, a novel class of converging kernel estimators is proposed with the following no-lose properties: (1) no efficiency loss from estimating the bandwidth, as the optimal choice is universal; (2) no efficiency loss from ensuring positive definiteness, via a principle-driven aggregation technique; and (3) no asymptotic efficiency loss from potentially misspecified prewhitening models and transformations of the time series. A shrinkage prewhitening transformation is proposed for more robust finite-sample performance. The estimator has a positive bias that diminishes with the sample size, so it is more conservative than the typically negatively biased classical estimators. The proposal improves upon all standard kernel functions and generalizes well to the multivariate case. Its performance is discussed through simulation results and two real-data applications: the forecast breakdown test and MCMC convergence diagnostics.
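For context, the classical kernel (HAC-type) long-run variance estimator that such proposals build on can be sketched as below. This is a minimal illustration of the standard Bartlett-kernel estimator only, not the no-lose converging estimator of the paper; the bandwidth rule of thumb and function name are the writer's own illustrative choices.

```python
import numpy as np

def bartlett_lrv(x, bandwidth=None):
    """Classical Bartlett-kernel long-run variance estimate (illustrative sketch).

    Computes gamma(0) + 2 * sum_{k=1}^{b} w(k) * gamma(k), where gamma(k) is the
    lag-k sample autocovariance and w(k) = 1 - k/(b+1) are Bartlett weights,
    which guarantee a nonnegative estimate.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    if bandwidth is None:
        # A common rule-of-thumb bandwidth (illustrative, not the paper's choice)
        bandwidth = int(np.floor(4.0 * (n / 100.0) ** (2.0 / 9.0)))
    xc = x - x.mean()
    gamma0 = np.dot(xc, xc) / n          # lag-0 sample autocovariance
    lrv = gamma0
    for k in range(1, bandwidth + 1):
        gamma_k = np.dot(xc[:-k], xc[k:]) / n  # lag-k sample autocovariance
        lrv += 2.0 * (1.0 - k / (bandwidth + 1.0)) * gamma_k
    return lrv
```

With bandwidth 0 the estimator reduces to the (biased) sample variance; with positive bandwidths it accounts for serial correlation, and its negative finite-sample bias under positive autocorrelation is the behavior the abstract's conservative, positively biased proposal contrasts with.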