Biography (Third-Person Narrative)
Tengyuan Liang is a Professor of Econometrics and Statistics in the Wallman Society of Fellows at the University of Chicago Booth School of Business. His research focuses on the statistical and computational foundations of AI and its reliable application in business and economics. He has published in leading journals in applied mathematics, economics, machine learning, and statistics. He received the CAREER Award from the NSF Division of Mathematical Sciences for his work on modern statistical learning paradigms. He has served as an Associate Editor for the Journal of the American Statistical Association and Operations Research, and on the Editorial Board of the Journal of Machine Learning Research.
What I Study
I use insights and principles from learning theory and statistical theory to understand models and data. On the applied side, I study causal machine learning in business and economic contexts.
In past work I have uncovered the presence and effects of implicit regularization in kernel machines, boosting methods, and neural networks in high-dimensional and over-parametrized regimes. I have developed statistical and computational theories for generative models, including generative adversarial networks, probabilistic diffusion models, and PDE-based stochastic samplers. I have contributed to the rigorous application of machine learning and optimization techniques in causal inference and uncertainty quantification.
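As one concrete illustration of the interpolation phenomenon behind this line of work, here is a minimal numerical sketch (a toy example for exposition, not code from any of the papers below): on an over-parametrized least-squares problem, gradient descent initialized at zero converges to the minimum-norm "ridgeless" interpolant, so the optimization algorithm itself supplies the regularization.

```python
# Toy sketch of implicit regularization in an over-parametrized regime:
# gradient descent on the unregularized squared loss, started at zero,
# recovers the minimum-norm interpolant X^+ y (Moore-Penrose pseudoinverse).
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 200                      # over-parametrized: many more features than samples
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Closed-form minimum-norm ("ridgeless") interpolant.
beta_min_norm = np.linalg.pinv(X) @ y

# Gradient descent on (1/2)||X beta - y||^2, initialized at zero.
beta = np.zeros(d)
lr = 1.0 / np.linalg.norm(X, ord=2) ** 2   # step size below 1 / ||X||_2^2
for _ in range(20_000):
    beta -= lr * X.T @ (X @ beta - y)

print("training error:", np.linalg.norm(X @ beta - y))           # ~0: interpolates
print("gap to min-norm:", np.linalg.norm(beta - beta_min_norm))  # ~0: same solution
```

Because the iterates never leave the row space of X when started at zero, gradient descent cannot converge to any interpolant other than the minimum-norm one; both printed quantities shrink to numerical zero.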
Currently, I am thinking about the following topics:
Selected Peer-Reviewed Work
Some recent publications:
T. Liang, B. Recht. Randomization Inference When N Equals One.
Biometrika, forthcoming, 1-23, 2025.
T. Liang. Blessings and Curses of Covariate Shifts: Adversarial Learning Dynamics, Directional Convergence, and Equilibria.
Journal of Machine Learning Research, 25(140):1-27, 2024.
Y. Hur, W. Guo, T. Liang. Reversible Gromov-Monge Sampler for Simulation-Based Inference.
SIAM Journal on Mathematics of Data Science, 6(2):283-310, 2024.
T. Liang, S. Sen, P. Sur. High-Dimensional Asymptotics of Langevin Dynamics in Spiked Matrix Models.
Information and Inference: A Journal of the IMA, 12(4):2720-2752, 2023.
T. Liang. Universal Prediction Band via Semi-Definite Programming.
Journal of the Royal Statistical Society: Series B (Statistical Methodology), 84(4):1558-1580, 2022.
Some selected publications in mostly chronological order:
T. Liang, A. Rakhlin. Just Interpolate: Kernel “Ridgeless” Regression Can Generalize.
The Annals of Statistics, 48(3):1329-1347, 2020.
M. H. Farrell, T. Liang, S. Misra. Deep Neural Networks for Estimation and Inference.
Econometrica, 89(1):181-213, 2021.
T. Liang. How Well Generative Adversarial Networks Learn Distributions.
Journal of Machine Learning Research, 22(228):1-41, 2021.
T. Liang, A. Rakhlin, X. Zhai. On the Multiple Descent of Minimum-Norm Interpolants and Restricted Lower Isometry of Kernels.
Conference on Learning Theory, 125:2683-2711, 2020.
X. Dou, T. Liang. Training Neural Networks as Learning Data-adaptive Kernels: Provable Representation and Approximation Benefits.
Journal of the American Statistical Association (Theory and Methods), 116(535):1507-1520, 2021.
T. Liang, P. Sur. A Precise High-Dimensional Asymptotic Theory for Boosting and Minimum-L1-Norm Interpolated Classifiers.
The Annals of Statistics, 50(3):1669-1695, 2022.