Manuscripts

Reversible Gromov-Monge Sampler for Simulation-Based Inference. arXiv:2109.14090, 2021.

Interpolating Classifiers Make Few Mistakes. arXiv:2101.11815, 2021.

Deep Learning for Individual Heterogeneity: An Automatic Inference Framework. arXiv:2010.14694, 2020.

Publications

How Well Generative Adversarial Networks Learn Distributions. Journal of Machine Learning Research, 2021.

Mehler’s Formula, Branching Process, and Compositional Kernels of Deep Neural Networks. Journal of the American Statistical Association (Theory and Methods), 2021.

Deep Neural Networks for Estimation and Inference. Econometrica, 2021.

Training Neural Networks as Learning Data-adaptive Kernels: Provable Representation and Approximation Benefits. Journal of the American Statistical Association (Theory and Methods), 2021.

On the Multiple Descent of Minimum-Norm Interpolants and Restricted Lower Isometry of Kernels. Conference on Learning Theory (COLT), 2020.

Just Interpolate: Kernel “Ridgeless” Regression Can Generalize. Annals of Statistics, 2020.

Weighted Message Passing and Minimum Energy Flow for Heterogeneous Stochastic Block Models with Side Information. Journal of Machine Learning Research, 2020.

Statistical Inference for the Population Landscape via Moment Adjusted Stochastic Gradients. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 2019.

Interaction Matters: A Note on Non-asymptotic Local Convergence of Generative Adversarial Networks. International Conference on Artificial Intelligence and Statistics (AISTATS), 2019.

Fisher-Rao Metric, Geometry, and Complexity of Neural Networks. International Conference on Artificial Intelligence and Statistics (AISTATS), 2019.

Local Optimality and Generalization Guarantees for the Langevin Algorithm via Empirical Metastability. Conference on Learning Theory (COLT), 2018.

Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP. International Conference on Machine Learning (ICML), 2017.

Computational and Statistical Boundaries for Submatrix Localization in a Large Noisy Matrix. Annals of Statistics, 2017.

On Detection and Structural Reconstruction of Small-World Random Networks. IEEE Transactions on Network Science and Engineering, 2017.

Geometric Inference for General High-Dimensional Linear Inverse Problems. Annals of Statistics, 2016.

Learning with Square Loss: Localization through Offset Rademacher Complexity. Conference on Learning Theory (COLT), 2015.

Escaping the Local Minima via Simulated Annealing: Optimization of Approximately Convex Functions. Conference on Learning Theory (COLT), 2015.

Teaching

Booth:

Wharton:

  • 622 (MBA): Advanced Quantitative Modeling; Spring 15, Spring 14

Awards

  • NSF CAREER Award, 2021-2026

  • David G. Booth Faculty Fellow, William S. Fishman Faculty Scholar, 2021-2022

  • George C. Tiao Faculty Fellowship, 2017-2021
    for research in computational and data science

  • J. Parker Memorial Bursk Award, 2016
    for excellence in research

  • US Junior Oberwolfach Fellow, 2015

  • Winkelman Fellowship, 2014-2017
    the highest honorific fellowship awarded by the Wharton School

Professional Service

Workshops & Talks

  • UBC, Oct 15, 2021
  • ICML 2021 Workshop, Jul 24, 2021
  • LSE, Jun 3, 2021
  • Durham University Business School, May 27, 2021
  • Rutgers, Mar 10, 2021
  • UMass Amherst, Mar 5, 2021
  • NSF-Simons Collaboration (Mathematics of Deep Learning), Dec 16, 2020
  • JSM 2020, Aug 5, 2020
  • Google Research NYC, Jun 12, 2020
  • Duke, Mar 25, 2020

Contact