Mehler’s Formula, Branching Process, and Compositional Kernels of Deep Neural Networks

We utilize a connection between compositional kernels and branching processes via Mehler’s formula to study deep neural networks. This new probabilistic insight provides us with a novel perspective on the mathematical role of activation functions in compositional neural networks. We study the unscaled and rescaled limits of the compositional kernels and explore the different phases of the limiting behavior as the compositional depth increases.
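For context, the classical Mehler formula referred to here expands a bivariate Gaussian density in probabilists' Hermite polynomials $\mathrm{He}_k$; for $|\rho| < 1$ it reads (a standard statement, not taken from the abstract):

$$
\sum_{k=0}^{\infty} \frac{\rho^{k}}{k!}\,\mathrm{He}_k(x)\,\mathrm{He}_k(y)
= \frac{1}{\sqrt{1-\rho^{2}}}\,
\exp\!\left(-\frac{\rho^{2}(x^{2}+y^{2}) - 2\rho x y}{2(1-\rho^{2})}\right).
$$

In the compositional-kernel setting, this identity underlies the fact that for jointly Gaussian $(X, Y)$ with correlation $\rho$ and an activation $\sigma$ with Hermite coefficients $a_k$, the dual kernel $\mathbb{E}[\sigma(X)\sigma(Y)]$ is a power series in $\rho$, which iterates under composition across layers.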

April 2020 · Tengyuan Liang, Hai Tran-Bach

A Precise High-Dimensional Asymptotic Theory for Boosting and Minimum-L1-Norm Interpolated Classifiers

This paper establishes a precise high-dimensional asymptotic theory for boosting on separable data, taking both statistical and computational perspectives.

February 2020 · Tengyuan Liang, Pragya Sur