The Exact Sample Complexity Gain from Invariances for Kernel Regression on Manifolds

Wednesday, March 1, 2023 - 4:00pm to 4:30pm

Event Calendar Category: LIDS & Stats Tea

Speaker Name: Behrooz Tahmasebi

Affiliation: CSAIL

Building and Room Number: LIDS Lounge

In practice, encoding invariances into models reduces the sample complexity of learning. In this work, we tighten and generalize existing theoretical results on how invariances improve sample complexity. In particular, we provide minimax optimal rates for kernel ridge regression on any manifold, where the target function is invariant to an arbitrary group action on the manifold. Our results hold for (almost) any group action, even groups of positive dimension. For a finite group, the gain multiplies the "effective" number of samples by the size of the group. For groups of positive dimension, the gain appears as a reduction in the manifold's dimension, together with a factor proportional to the volume of the quotient space. Our proof takes the viewpoint of differential geometry, in contrast to the more common strategy based on invariant polynomials; this new geometric viewpoint on learning with invariances may be of independent interest.
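As a concrete illustration of "encoding invariances into models," the minimal sketch below uses the standard group-averaged kernel construction (a common technique, not necessarily the exact setup of this work): it averages a base kernel over a finite cyclic group acting by rotations on the circle, so that every function in the resulting RKHS is automatically invariant, and then fits kernel ridge regression with it. The manifold (the circle), the group (C_4), and all names (rbf_kernel, invariant_kernel, kernel_ridge_fit) are illustrative assumptions.

import numpy as np

def rbf_kernel(x, y, lengthscale=0.5):
    # Base (non-invariant) RBF kernel between points on the unit circle,
    # parametrized by angles in [0, 2*pi), via the chordal distance in R^2.
    dx = np.cos(x) - np.cos(y)
    dy = np.sin(x) - np.sin(y)
    return np.exp(-(dx**2 + dy**2) / (2 * lengthscale**2))

def invariant_kernel(x, y, group_size=4, lengthscale=0.5):
    # Group-averaged kernel: average the base kernel over the cyclic group
    # C_m acting on the circle by rotations. Functions in the resulting
    # RKHS are C_m-invariant by construction.
    rotations = 2 * np.pi * np.arange(group_size) / group_size
    return np.mean([rbf_kernel(x, y + r, lengthscale) for r in rotations], axis=0)

def kernel_ridge_fit(X, y, kernel, reg=1e-3):
    # Standard kernel ridge regression: alpha = (K + n*reg*I)^{-1} y.
    K = kernel(X[:, None], X[None, :])
    alpha = np.linalg.solve(K + len(X) * reg * np.eye(len(X)), y)
    return lambda Xnew: kernel(Xnew[:, None], X[None, :]) @ alpha

# Toy usage: learn a C_4-invariant target from a small noisy sample.
rng = np.random.default_rng(0)
m = 4                                       # size of the cyclic group
X = rng.uniform(0, 2 * np.pi, size=20)      # n = 20 training angles
f_target = lambda t: np.cos(m * t)          # invariant under rotation by 2*pi/m
y = f_target(X) + 0.1 * rng.normal(size=X.shape)

predict = kernel_ridge_fit(X, y, lambda a, b: invariant_kernel(a, b, group_size=m))
# predict(...) is C_4-invariant by construction; informally, each observed
# point carries information about its whole group orbit, which is the
# "effective number of samples multiplied by the group size" intuition.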

Behrooz Tahmasebi is a Ph.D. student in EECS at MIT, under the supervision of Prof. Stefanie Jegelka. His research interests include deep learning theory, learning with group invariances, and learning with graphs.