A Precise High-Dimensional Asymptotic Theory for Boosting and Minimum-L1-Norm Interpolated Classifiers

This paper establishes a precise high-dimensional asymptotic theory for boosting on separable data, from both statistical and computational perspectives.

February 2020 · Tengyuan Liang, Pragya Sur
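As a rough illustration of the object in the title (not the paper's analysis), a minimum-L1-norm interpolated classifier on linearly separable data can be computed as a linear program: minimize ||w||_1 subject to every training margin being at least 1. The sketch below uses `scipy.optimize.linprog` with the standard split w = u - v, u, v >= 0; the synthetic data and all parameter choices are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, d = 20, 50  # overparameterized (d > n): Gaussian data is separable w.h.p.
X = rng.standard_normal((n, d))
y = np.sign(X @ rng.standard_normal(d))  # labels from a random linear rule

# min ||w||_1  s.t.  y_i <x_i, w> >= 1 for all i.
# Split w = u - v with u, v >= 0, so ||w||_1 <= sum(u) + sum(v).
c = np.ones(2 * d)
A = -(y[:, None] * X)                 # -y_i x_i^T w <= -1
A_ub = np.hstack([A, -A])             # acts on the stacked variable [u; v]
b_ub = -np.ones(n)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * d))
w = res.x[:d] - res.x[d:]

margins = y * (X @ w)
print(res.status == 0, margins.min() >= 1 - 1e-6)
```

The connection to boosting is that coordinate-descent-style boosting iterates, run to convergence on separable data, are known to approach such max-margin (in the L1/L-infinity duality sense) solutions.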

Just Interpolate: Kernel Ridgeless Regression Can Generalize

In the absence of explicit regularization, an interpolating kernel machine can fit the training data perfectly and yet still generalize well on test data. We isolate a phenomenon of implicit regularization for minimum-norm interpolated solutions.

August 2018 · Tengyuan Liang, Alexander Rakhlin
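A minimal sketch of the ridgeless estimator the abstract refers to: with a positive-definite kernel, solving K alpha = y (here via pseudoinverse) gives the minimum-RKHS-norm function that interpolates the training data. The Gaussian kernel, bandwidth, and toy data below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 40, 5
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

def gauss_kernel(A, B, gamma=0.5):
    # Gaussian (RBF) kernel matrix between row sets A and B
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

K = gauss_kernel(X, X)
alpha = np.linalg.pinv(K) @ y  # ridgeless fit: no explicit regularization

# the fitted function interpolates: training predictions match labels
pred_train = K @ alpha
print(np.allclose(pred_train, y, atol=1e-6))
```

The point of the paper is that, despite the zero training error visible here, such minimum-norm interpolants can still generalize, due to an implicit regularization effect.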