EcoSta 2024: Submission A1055
Title: Risk bounds for quantile additive trend filtering
Authors: Zhi Zhang - UCLA (United States) [presenting]
Kyle Ritscher - UCLA (United States)
Oscar Hernan Madrid Padilla - UCLA (United States)
Abstract: Risk bounds are investigated for quantile additive trend filtering, a method of growing significance at the intersection of additive trend filtering and quantile regression. The constrained version of quantile trend filtering within additive models is studied under both fixed and growing input dimension. In the fixed-dimension case, an error rate is derived that mirrors the minimax rate of non-quantile additive trend filtering, with main term $n^{-2r/(2r+1)}$. When the input dimension $d$ grows, quantile additive trend filtering incurs a polynomial factor of $d^{(2r+2)/(2r+1)}$, which closely matches the linear factor $d$ of the non-quantile variant, particularly for larger values of $r$. Additionally, a practical algorithm is proposed that implements quantile trend filtering within additive models via dimension-wise backfitting. Experiments are conducted with inputs that are either evenly spaced or sampled from the uniform distribution on $[0,1]^d$, using distinct component functions and noise drawn from normal and heavy-tailed distributions. The findings confirm the estimator's convergence as $n$ increases and its superior performance, particularly under heavy-tailed noise. These results deepen the understanding of additive trend filtering models in quantile settings, offering valuable insights for practical applications and future research.
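A minimal sketch of the dimension-wise backfitting idea described above is given below. It is not the authors' implementation: the use of cvxpy, the constrained formulation with a bound $V$ on the total variation of the $r$-th discrete derivative, and the helper names (`quantile_tf_1d`, `backfit_qatf`) are illustrative assumptions chosen to make the cycle over coordinates concrete.

```python
# Illustrative sketch (not the authors' code): dimension-wise backfitting for
# quantile additive trend filtering, with cvxpy solving each univariate fit.
import numpy as np
import cvxpy as cp


def pinball(res, tau):
    """Quantile (pinball) loss of a residual vector at level tau."""
    return cp.sum(cp.maximum(tau * res, (tau - 1) * res))


def quantile_tf_1d(x, y, tau=0.5, r=1, V=10.0):
    """Constrained univariate quantile trend filtering (assumed formulation).

    Minimizes the pinball loss subject to a bound V on the l1 norm of the
    (r+1)-th order differences of the fit along the sorted design points.
    Returns fitted values aligned with the original ordering of x.
    """
    order = np.argsort(x)
    n = len(y)
    theta = cp.Variable(n)
    D = np.diff(np.eye(n), n=r + 1, axis=0)  # (r+1)-th order difference operator
    prob = cp.Problem(cp.Minimize(pinball(y[order] - theta, tau)),
                      [cp.norm1(D @ theta) <= V])
    prob.solve()
    fit = np.empty(n)
    fit[order] = theta.value
    return fit


def backfit_qatf(X, y, tau=0.5, r=1, V=10.0, n_iter=20):
    """Dimension-wise backfitting for the additive quantile model.

    Cycles through coordinates, refitting each component on the partial
    residuals while the others are held fixed; components are centered for
    identifiability and the intercept absorbs the tau-quantile level.
    """
    n, d = X.shape
    components = np.zeros((n, d))
    intercept = np.quantile(y, tau)
    for _ in range(n_iter):
        for j in range(d):
            partial = y - intercept - components.sum(axis=1) + components[:, j]
            fit_j = quantile_tf_1d(X[:, j], partial, tau=tau, r=r, V=V)
            components[:, j] = fit_j - fit_j.mean()  # center each component
        intercept = np.quantile(y - components.sum(axis=1), tau)
    return intercept, components


# Toy usage on simulated heavy-tailed data (shapes only; not the paper's study).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 200, 3
    X = rng.uniform(0.0, 1.0, size=(n, d))
    f = np.sin(2 * np.pi * X[:, 0]) + X[:, 1] ** 2 - X[:, 2]
    y = f + rng.standard_t(df=2, size=n)  # heavy-tailed noise
    b0, comps = backfit_qatf(X, y, tau=0.5, r=1, V=5.0)
    print("intercept:", round(b0, 3), "fitted components shape:", comps.shape)
```

In this sketch each univariate subproblem is a small linear program, and the backfitting loop simply alternates these solves; the constrained formulation and the stopping rule (a fixed number of sweeps) are simplifications for illustration.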