Simon Barthelmé, CNRS, Gipsa-lab
Kernel matrices are ubiquitous in statistics, numerical analysis, and machine learning, appearing in methods for interpolation, regression, inverse problems, etc. One perennial difficulty with kernel methods is the need to choose a spatial length-scale parameter. Fornberg \& Driscoll showed in 2002 that radial basis interpolation, a special case of a kernel method, could be studied in the "flat limit": the goal is to characterise kernel methods as the length-scale of the kernel function tends to infinity, so that the kernel appears flat over the range of the data.
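To make the flat limit concrete, here is a minimal numerical sketch (a toy example of mine, not taken from the talk): as the length-scale l grows, a Gaussian kernel matrix flattens towards the rank-one all-ones matrix, and its eigenvalues spread over increasingly disparate orders of magnitude.

```python
# Toy illustration of the flat limit: eigenvalues of a Gaussian kernel
# matrix as the length-scale l grows (all parameter choices here are mine).
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1, 1, size=8))      # 1D sample locations
D2 = (x[:, None] - x[None, :]) ** 2          # squared pairwise distances

for l in [0.5, 5.0, 50.0]:
    K = np.exp(-D2 / (2 * l**2))             # Gaussian kernel matrix
    ev = np.linalg.eigvalsh(K)[::-1]         # eigenvalues, descending
    print(f"l = {l:5.1f}  largest: {ev[0]:.2e}  smallest: {ev[-1]:.2e}")
```

As l grows, the largest eigenvalue approaches n (the all-ones direction) while the remaining eigenvalues collapse at different rates, roughly in powers of 1/l² for a smooth kernel with distinct nodes; this separation of scales is the structure the flat-limit analysis captures.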
In this talk I will explain how kernel matrices behave in the flat limit. We have been able to show that the eigenvalues and eigenvectors in that regime are tightly related to orthogonal polynomials (for infinitely smooth kernels) or splines (for finitely smooth kernels).
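The polynomial case can be illustrated numerically. The sketch below is my own, with all parameter choices assumed rather than taken from the paper: it compares the eigenvectors of a nearly flat Gaussian kernel matrix in 1D with an orthonormal polynomial basis on the nodes, obtained from the QR decomposition of a Vandermonde matrix.

```python
# Eigenvectors of a nearly flat Gaussian kernel matrix vs. discrete
# orthonormal polynomials on the nodes (a hedged sketch, not the paper's code).
import numpy as np

x = np.linspace(-1, 1, 7)                    # 1D nodes
l = 5.0                                      # moderately large length-scale
K = np.exp(-(x[:, None] - x[None, :])**2 / (2 * l**2))

_, U = np.linalg.eigh(K)                     # ascending eigenvalues
U = U[:, ::-1]                               # reorder: descending

V = np.vander(x, increasing=True)            # columns 1, x, x^2, ...
Q, _ = np.linalg.qr(V)                       # orthonormal polynomials at nodes

# Close to the identity pattern (up to signs); the match becomes exact as
# l -> infinity, although very large l is numerically fragile because the
# trailing eigenvalues shrink below machine precision.
print(np.round(np.abs(Q.T @ U), 2))
```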
I'll describe an application to Gaussian process (GP) / kernel ridge regression. Our results show that, in the flat limit, GP regression tends to (multivariate) polynomial regression or to (polyharmonic) spline regression, depending on the kernel. Eliminating the spatial length-scale parameter thus results in simpler, scale-free methods.
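A classical 1D special case of this phenomenon, going back to Fornberg \& Driscoll's interpolation setting, can be checked directly: Gaussian RBF interpolation approaches polynomial interpolation as the length-scale grows. The sketch below is my own toy check, not the talk's experiments.

```python
# Toy check (assumptions mine): Gaussian RBF interpolation converges to
# polynomial interpolation as the length-scale l grows.
import numpy as np

x = np.linspace(0, 1, 6)                     # training inputs
y = np.sin(2 * np.pi * x)                    # training targets
xt = np.linspace(0.05, 0.95, 5)              # test inputs

def rbf_interp(l):
    """Evaluate the Gaussian RBF interpolant with length-scale l at xt."""
    K = np.exp(-(x[:, None] - x[None, :])**2 / (2 * l**2))
    w = np.linalg.solve(K, y)                # interpolation weights
    Kt = np.exp(-(xt[:, None] - x[None, :])**2 / (2 * l**2))
    return Kt @ w

# Degree-(n-1) interpolating polynomial through the same points.
c = np.polynomial.polynomial.polyfit(x, y, deg=len(x) - 1)
ref = np.polynomial.polynomial.polyval(xt, c)

for l in [0.5, 1.5, 3.0]:
    gap = np.max(np.abs(rbf_interp(l) - ref))
    print(f"l = {l:3.1f}  max |RBF - polynomial| = {gap:.2e}")
# Pushing l much higher makes the direct solve ill-conditioned, which is
# precisely the numerical difficulty that motivated flat-limit analysis.
```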
Joint work with K. Usevich, N. Tremblay, and P.-O. Amblard.