A0785
Title: Posterior concentration for Lévy adaptive B-spline regression in Besov spaces
Authors: Jeunghun Oh - Seoul National University (Korea, South) [presenting]
Sewon Park - Samsung SDS Security Algorithm Lab (Korea, South)
Jaeyong Lee - Seoul National University (Korea, South)
Abstract: The Lévy adaptive B-spline (LABS) regression model is studied, an implementation of Lévy adaptive regression kernels (LARK) employing mixtures of B-spline kernels of varying polynomial degrees. Since the model's mean function is a linear combination of B-spline kernels, LABS flexibly captures local spatial features, including jump discontinuities and sharp peaks, and can therefore represent functions lying in Besov spaces. For one-dimensional regression with homoskedastic noise, asymptotic guarantees are established when the true function lies in a Besov space of smoothness $s > 0$. Specifically, it is proven that the LABS posterior contracts around the truth under the $L^2$ loss at a rate that is optimal up to logarithmic factors, while automatically adapting to the unknown smoothness. This fills a gap in the literature, where rigorous posterior contraction rates for fully Bayesian spline-kernel methods on Besov classes have been scarce. Numerical experiments, comprising simulations on Besov-space test functions and an application to real data, corroborate the theory and illustrate the practical utility of LABS.
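The abstract's key structural ingredient is a mean function built as a linear combination of B-spline kernels of possibly different polynomial degrees. As an illustration only (the abstract gives no code, and the function and variable names below are hypothetical), here is a minimal sketch of such a mixed-degree B-spline mean function using the standard Cox-de Boor recursion:

```python
def bspline_basis(i, k, knots, x):
    """Value at x of the degree-k B-spline basis function B_{i,k}
    on the knot sequence `knots`, via the Cox-de Boor recursion.
    Conventions: 0/0 terms are treated as 0."""
    if k == 0:
        return 1.0 if knots[i] <= x < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + k] != knots[i]:
        left = ((x - knots[i]) / (knots[i + k] - knots[i])
                * bspline_basis(i, k - 1, knots, x))
    right = 0.0
    if knots[i + k + 1] != knots[i + 1]:
        right = ((knots[i + k + 1] - x) / (knots[i + k + 1] - knots[i + 1])
                 * bspline_basis(i + 1, k - 1, knots, x))
    return left + right


def mixed_degree_mean(x, atoms, knots):
    """LABS-style mean function: a linear combination of B-spline
    kernels of varying degrees. `atoms` is a list of tuples
    (coefficient, basis_index, degree)."""
    return sum(c * bspline_basis(i, k, knots, x) for c, i, k in atoms)


# Example: a degree-0 kernel (a jump) plus a degree-2 kernel (a smooth bump)
# on a uniform knot grid -- mixing degrees is what lets the mean function
# capture both discontinuities and smooth local features.
knots = [0, 1, 2, 3, 4, 5]
atoms = [(1.0, 0, 0), (0.5, 1, 2)]
value = mixed_degree_mean(0.5, atoms, knots)
```

In the actual model the number of atoms, their coefficients, degrees, and knot locations would be random (driven by a Lévy random measure) and inferred from data; the sketch only shows how a single realization of the mean function is assembled.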