B0290
Title: Error estimation for random Fourier features
Authors: Junwen Yao - UC Davis (United States)
Benjamin Erichson - ICSI and UC Berkeley (United States)
Miles Lopes - UC Davis (United States) [presenting]
Abstract: Random Fourier features (RFF) are among the most popular and broadly applicable approaches for scaling up kernel methods. In essence, RFF allows the user to avoid costly computations with a large kernel matrix via a fast randomized approximation. However, a pervasive difficulty in applying RFF is that the user does not know the actual error of the approximation, or how this error will propagate into downstream learning tasks. Up to now, the RFF literature has primarily dealt with these uncertainties using theoretical error bounds, but from a user standpoint, such results are typically impractical, either because they are highly conservative or involve unknown quantities. To tackle these general issues in a data-driven way, a bootstrap approach is developed to numerically estimate the errors of RFF approximations. Three key advantages of this approach are: (1) the error estimates are specific to the problem at hand, avoiding the pessimism of worst-case bounds; (2) the approach is flexible with respect to different uses of RFF, and can even estimate errors in downstream learning tasks; (3) the approach enables adaptive computation, in the sense that the user can quickly inspect the error of a rough initial kernel approximation and then predict how much extra work is needed. Furthermore, in exchange for all of these benefits, the error estimates can be obtained at a modest computational cost.
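To make the idea concrete, below is a minimal sketch (not the authors' implementation) of how a feature-resampling bootstrap can estimate the error of an RFF kernel approximation. It assumes a Gaussian (RBF) kernel and uses the element-wise maximum norm of the kernel matrix error as the quantity of interest; the function names rff_features and bootstrap_error_estimate, the bandwidth sigma, and the choice of 50 bootstrap replicates are all illustrative assumptions.

```python
import numpy as np

def rff_features(X, W, b):
    # Random Fourier feature map for the Gaussian kernel:
    # z(x) = sqrt(2/m) * cos(W x + b), so that z(x)^T z(y) approximates k(x, y).
    m = W.shape[0]
    return np.sqrt(2.0 / m) * np.cos(X @ W.T + b)

def bootstrap_error_estimate(Z, n_boot=50, quantile=0.95, rng=None):
    # Estimate the max-norm error of the RFF approximation K_hat = Z Z^T by
    # resampling the m random features with replacement and measuring how far
    # each resampled approximation deviates from K_hat.
    rng = np.random.default_rng(rng)
    n, m = Z.shape
    K_hat = Z @ Z.T
    errs = np.empty(n_boot)
    for t in range(n_boot):
        idx = rng.integers(0, m, size=m)            # resample feature indices
        Z_star = Z[:, idx]
        K_star = Z_star @ Z_star.T
        errs[t] = np.max(np.abs(K_star - K_hat))    # bootstrap replicate of the error
    return np.quantile(errs, quantile)

# Illustrative usage: n points in d dimensions, m random features, bandwidth sigma.
rng = np.random.default_rng(0)
n, d, m, sigma = 200, 5, 500, 1.0
X = rng.standard_normal((n, d))
W = rng.standard_normal((m, d)) / sigma             # frequencies drawn from N(0, sigma^{-2} I)
b = rng.uniform(0.0, 2.0 * np.pi, size=m)           # random phases
Z = rff_features(X, W, b)
print("estimated max-norm approximation error:", bootstrap_error_estimate(Z, rng=rng))
```

Because the bootstrap only reuses the already computed feature matrix Z, the extra cost is a small number of matrix products, which is in line with the abstract's point that the error estimates come at a modest computational cost; this sketch targets the kernel matrix itself rather than a downstream learning task.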