EcoSta 2023
A1055
Title: Scalable and efficient computation of additive Gaussian processes with applications to Bayesian optimization
Authors: Liang Ding - Fudan University (China) [presenting]
Abstract: Additive Gaussian Processes (GPs) are popular as priors in scalable high-dimensional Bayesian optimization. In Bayesian optimization, the next sampling point is determined by optimizing an adaptive acquisition function of the GP posterior. However, after sampling the $n$-th design point, updating the posterior requires $O(n^3)$ time, and computing the value and gradient of the acquisition function requires $O(n^2)$ time, making the procedure time-consuming, particularly for large $n$. While efficient algorithms exist for posterior updates of additive GPs, few studies have focused on the efficient computation of the acquisition function and its gradient. Since searching for the next sampling point requires repeatedly evaluating the acquisition function and its gradient, this step is more time-consuming than updating the posterior. Algorithms are proposed for posterior updates, hyperparameter learning, and computation of the acquisition function and its gradient for additive GPs with Matérn covariances. The algorithms significantly reduce the time complexity of computing the acquisition function and its gradient from $O(n^2)$ to $O(\log n)$ for general learning rates, and even to $O(1)$ for small learning rates, while achieving computational efficiency comparable to the current best algorithms for updating the posterior and learning hyperparameters.
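For context, the sketch below is a minimal illustration of the baseline costs the abstract refers to, not the proposed method: a standard exact-GP Bayesian optimization loop in which each Cholesky factorization of the kernel matrix costs $O(n^3)$ and each acquisition (expected improvement) evaluation costs $O(n^2)$. The Matérn-5/2 kernel, the 1-D toy objective, and all function names are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.stats import norm

def matern52(X1, X2, lengthscale=1.0):
    """Matern-5/2 covariance between two 1-D point sets (illustrative kernel choice)."""
    d = np.abs(X1[:, None] - X2[None, :]) / lengthscale
    s = np.sqrt(5.0) * d
    return (1.0 + s + s**2 / 3.0) * np.exp(-s)

def posterior(X, y, noise=1e-6):
    """Exact GP posterior; the Cholesky factorization costs O(n^3) after each new sample."""
    K = matern52(X, X) + noise * np.eye(len(X))
    L = cho_factor(K, lower=True)
    alpha = cho_solve(L, y)
    return L, alpha

def expected_improvement(x, X, L, alpha, best):
    """EI at candidate x (minimization); the solves below cost O(n^2) per evaluation."""
    k = matern52(np.atleast_1d(x), X).ravel()
    mu = k @ alpha
    v = cho_solve(L, k)
    var = max(matern52(np.atleast_1d(x), np.atleast_1d(x))[0, 0] - k @ v, 1e-12)
    sd = np.sqrt(var)
    z = (best - mu) / sd
    return sd * (z * norm.cdf(z) + norm.pdf(z))

# Toy 1-D loop: each outer iteration pays O(n^3) for the posterior update and
# O(n^2) per acquisition evaluation inside the inner search over candidates.
rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x) + 0.1 * rng.standard_normal()
X = np.array([0.1, 0.9])
y = np.array([f(v) for v in X])
for _ in range(5):
    L, alpha = posterior(X, y)
    grid = np.linspace(0.0, 1.0, 200)
    ei = np.array([expected_improvement(g, X, L, alpha, y.min()) for g in grid])
    x_next = grid[np.argmax(ei)]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
```

The abstract's contribution targets exactly this inner search: for additive GPs with Matérn covariances, the proposed algorithms replace the $O(n^2)$ per-evaluation acquisition cost with $O(\log n)$, or $O(1)$ for small learning rates.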