A0263
Title: High dimensional generalised penalised least squares
Authors: Ilias Chronopoulos - University of Essex (United Kingdom)
George Kapetanios - King's College, University of London (United Kingdom)
Aikaterini Chrysikou - King's College, University of London (United Kingdom) [presenting]
Abstract: Inference is developed for high dimensional linear models with serially correlated errors. We examine the latter using the Lasso under the assumption of strong mixing in the covariates and error process, allowing for fatter tails in their distributions. Since the Lasso estimator performs poorly under such circumstances, we estimate the parameters of interest via penalised FGLS and extend the asymptotic properties of the Lasso to these more general conditions. Our theoretical results indicate that the non-asymptotic bounds for stationary dependent processes are sharper, while the rate of the Lasso under general conditions appears slower as $T,p\to \infty$. Further, we use the debiased Lasso to perform inference on the parameters of interest. Using simulated data, we find that with the debiased generalised least squares estimator our \emph{t}-tests appear more powerful and correctly sized, while the true parameter value is included in the 95\% confidence interval with satisfactory coverage rates at different levels of autocorrelation and parameter sparsity.
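To illustrate the flavour of a penalised FGLS procedure of the kind described, the sketch below shows a minimal pipeline under an assumed AR(1) error structure: a pilot Lasso fit, estimation of the residual autocorrelation, quasi-differencing (whitening) of the data, and a second Lasso on the whitened data. This is not the authors' implementation; the AR(1) assumption, the fixed penalty level, and the `penalised_fgls` helper are illustrative choices only.

\begin{verbatim}
# Minimal sketch of a penalised FGLS (generalised penalised least squares)
# pipeline, assuming AR(1) errors. Illustrative only, not the authors' code.
import numpy as np
from sklearn.linear_model import Lasso

def penalised_fgls(y, X, alpha=0.1):
    # Step 1: pilot Lasso on the raw data.
    pilot = Lasso(alpha=alpha).fit(X, y)
    resid = y - pilot.predict(X)

    # Step 2: estimate the AR(1) coefficient of the errors from the residuals.
    rho = np.dot(resid[1:], resid[:-1]) / np.dot(resid[:-1], resid[:-1])

    # Step 3: quasi-difference (Cochrane-Orcutt style whitening).
    y_t = y[1:] - rho * y[:-1]
    X_t = X[1:] - rho * X[:-1]

    # Step 4: penalised least squares on the whitened data.
    fgls = Lasso(alpha=alpha).fit(X_t, y_t)
    return fgls.coef_, rho

# Simulated example: sparse coefficients with AR(1) errors.
rng = np.random.default_rng(0)
T, p = 200, 50
X = rng.standard_normal((T, p))
beta = np.zeros(p)
beta[:3] = [1.0, -0.5, 0.75]
u = np.zeros(T)
for t in range(1, T):
    u[t] = 0.6 * u[t - 1] + rng.standard_normal()
y = X @ beta + u

coef, rho_hat = penalised_fgls(y, X)
print("estimated rho:", round(rho_hat, 2))
print("nonzero coefficients:", np.flatnonzero(np.abs(coef) > 1e-8))
\end{verbatim}

In practice the penalty would be tuned (e.g. by cross-validation or a theory-driven rate in $T$ and $p$), and the debiased Lasso step described in the abstract would then be applied to the whitened data to obtain t-tests and confidence intervals.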