Blessings and Curses of Covariate Shifts: Adversarial Learning Dynamics, Directional Convergence, and Equilibria
Blessings and curses of covariate shifts, directional convergence, and the connection to experimental design.
This paper proposes a computationally efficient method to construct nonparametric, heteroscedastic prediction bands for uncertainty quantification.
This paper provides elementary analyses of the regret and generalization of minimum-norm interpolating classifiers.
This paper establishes a precise high-dimensional asymptotic theory for boosting on separable data, taking statistical and computational perspectives.
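For intuition, boosting can be viewed as coordinate descent on the exponential loss, with each round selecting the weak learner (here, a single coordinate) of steepest descent; on separable data the iterates drive the normalized margin upward. The sketch below runs this procedure on a synthetic separable dataset and tracks the l1-normalized minimum margin; the data-generating model, step size, and number of rounds are illustrative assumptions, not the paper's asymptotic regime.

    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 50, 200                          # more features than samples: separable regime
    X = rng.normal(size=(n, p))
    beta_star = np.zeros(p)
    beta_star[:5] = 1.0
    y = np.sign(X @ beta_star)              # labels from a sparse linear rule, hence separable

    beta = np.zeros(p)
    eta = 0.1                               # illustrative step size
    for t in range(1, 2001):
        margins = y * (X @ beta)
        weights = np.exp(-margins)          # exponential-loss weights on the examples
        grad = -(X * (weights * y)[:, None]).sum(axis=0)   # gradient of sum_i exp(-y_i x_i^T beta)
        j = np.argmax(np.abs(grad))         # weak learner (coordinate) of steepest descent
        beta[j] -= eta * np.sign(grad[j])
        if t % 500 == 0:
            norm_margin = (y * (X @ beta)).min() / np.abs(beta).sum()
            print(t, norm_margin)           # l1-normalized minimum margin over the rounds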
We study the risk of minimum-norm interpolants of data in Reproducing Kernel Hilbert Spaces (RKHS). Our upper bounds on the risk exhibit a multiple-descent shape, and empirical evidence supports the finding that minimum-norm interpolants in an RKHS can display this unusual non-monotonicity in sample size.
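As a point of reference, the minimum-RKHS-norm interpolant of a dataset (X, y) under a kernel k has the closed form f(x) = k(x, X) K^+ y, where K is the kernel Gram matrix and K^+ its pseudo-inverse. The sketch below evaluates this object on synthetic data as the sample size varies; the Gaussian kernel, bandwidth, data-generating model, and sample sizes are illustrative choices, not the setting analyzed in the paper.

    import numpy as np

    def gaussian_kernel(A, B, bandwidth=1.0):
        # Gram matrix with entries exp(-||a_i - b_j||^2 / (2 * bandwidth^2))
        sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

    def min_norm_interpolant(X_train, y_train, bandwidth=1.0):
        # Minimum-RKHS-norm interpolant: f(x) = k(x, X) K^+ y
        K = gaussian_kernel(X_train, X_train, bandwidth)
        alpha = np.linalg.pinv(K) @ y_train
        return lambda X_new: gaussian_kernel(X_new, X_train, bandwidth) @ alpha

    # Test risk of the interpolant as the sample size n varies (illustrative only)
    rng = np.random.default_rng(0)
    d = 10
    X_test = rng.normal(size=(2000, d))
    for n in (20, 50, 100, 200, 400):
        X = rng.normal(size=(n, d))
        y = X[:, 0] + 0.1 * rng.normal(size=n)   # noisy linear target
        f_hat = min_norm_interpolant(X, y)
        risk = np.mean((f_hat(X_test) - X_test[:, 0]) ** 2)
        print(n, float(risk))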
What are the provable benefits of the adaptive representation learned by neural networks over the pre-specified fixed-basis representations of the classical nonparametric literature? We answer this question via a dynamic reproducing kernel Hilbert space (RKHS) approach indexed by the training process of neural networks.
In the absence of explicit regularization, an interpolating kernel machine can fit the training data perfectly while still generalizing well on test data. We isolate a phenomenon of implicit regularization for minimum-norm interpolating solutions.