Stable Foundations for Learning: a framework for learning theory (in both the classical and modern regime).

Publication Type: CBMM Memos
Year of Publication: 2020
Authors: Poggio, T
Date Published: 03/2020
Abstract

We consider here the class of supervised learning algorithms known as Empirical Risk Minimization (ERM). The classical theory by Vapnik and others characterizes universal consistency of ERM in the classical regime, in which the architecture of the learning network is fixed and n, the number of training examples, goes to infinity. We do not have a similar general theory for the modern regime of interpolating regressors and overparameterized deep networks, in which d > n as n goes to infinity.

In this note I propose the outline of such a theory based on the specific notion of CVloo stability of the learning algorithm with respect to perturbations of the training set. The theory shows that for interpolating regressors and separating classifiers (either kernel machines or deep ReLU networks):

  1. minimizing CVloo stability minimizes the expected error;
  2. minimum norm solutions are the most stable solutions.

The hope is that this approach may lead to a unified theory encompassing both the modern regime and the classical one.
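
As a rough illustration of the quantities discussed above (not code from the memo), the sketch below estimates an empirical leave-one-out (CVloo) stability gap for a minimum-norm interpolating least-squares regressor on synthetic overparameterized data with d > n. The data, the squared-loss stability measure, and all function names are assumptions made for the example.

```python
import numpy as np

# Illustrative sketch only: empirical CVloo stability of the minimum-norm
# interpolating least-squares regressor on synthetic overparameterized data.
# Stability is measured here (an assumed choice) as the average change in the
# squared loss at a point when that point is left out of the training set.

rng = np.random.default_rng(0)
n, d = 20, 50                           # overparameterized regime: d > n
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(n)

def min_norm_fit(X, y):
    # Minimum-norm interpolating solution via the pseudoinverse.
    return np.linalg.pinv(X) @ y

w_full = min_norm_fit(X, y)

# Average leave-one-out perturbation of the loss at the held-out point.
cvloo_gaps = []
for i in range(n):
    mask = np.arange(n) != i
    w_loo = min_norm_fit(X[mask], y[mask])
    loss_full = (X[i] @ w_full - y[i]) ** 2
    loss_loo = (X[i] @ w_loo - y[i]) ** 2
    cvloo_gaps.append(abs(loss_loo - loss_full))

print("mean CVloo gap:", np.mean(cvloo_gaps))
print("norm of minimum-norm solution:", np.linalg.norm(w_full))
```

A smaller mean CVloo gap indicates a more stable algorithm in this empirical sense; comparing the gap for the minimum-norm solution against other interpolating solutions (e.g., pseudoinverse plus a component in the null space of X) is one way to probe the second claim above on toy data.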

DSpace@MIT: https://hdl.handle.net/1721.1/124343

CBMM Memo No: 103


CBMM Relationship: 

  • CBMM Funded