
BUSN 41918 (PhD): Data, Learning, and Algorithms

This Ph.D.-level course provides an overview of machine learning and its algorithmic paradigms, and explores recent topics in learning, inference, and decision-making with large data sets. Emphasis is placed on theoretical insights and algorithmic principles.

January 2024 · Prof. Tengyuan Liang

Randomization Inference When N Equals One

A statistical theory for N-of-1 experiments, where a unit serves as its own control and treatment in different time windows.

October 2023 · Tengyuan Liang, Benjamin Recht
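The core idea of randomization inference in an N-of-1 design can be illustrated with a standard permutation test: re-randomize the treatment assignment across time windows under the design's null distribution and compare the resulting statistics to the observed one. This is a minimal sketch with synthetic data, not the paper's construction; the alternating assignment, effect size, and difference-in-means statistic are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical N-of-1 data: one unit observed over T time windows,
# alternating between control (0) and treatment (1).
T = 20
assignment = np.tile([0, 1], T // 2)
outcomes = rng.normal(0.0, 1.0, size=T) + 0.8 * assignment  # assumed effect

def effect(assign, y):
    """Difference in mean outcome between treated and control windows."""
    return y[assign == 1].mean() - y[assign == 0].mean()

observed = effect(assignment, outcomes)

# Randomization test: permute the assignment labels (preserving the
# number of treated windows) and recompute the statistic each time.
draws = 5000
null_stats = np.array(
    [effect(rng.permutation(assignment), outcomes) for _ in range(draws)]
)
p_value = np.mean(np.abs(null_stats) >= abs(observed))
```

The p-value is the fraction of re-randomizations whose statistic is at least as extreme as the observed one; its validity rests only on the randomization of the design, not on a sampling model for the outcomes.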

Blessings and Curses of Covariate Shifts: Adversarial Learning Dynamics, Directional Convergence, and Equilibria

Blessings and curses of covariate shifts, directional convergence, and the connection to experimental design.

December 2022 · Tengyuan Liang

Online Learning to Transport via the Minimal Selection Principle

Motivated by robust dynamic resource allocation in operations research, we study the Online Learning to Transport (OLT) problem where the decision variable is a probability measure, an infinite-dimensional object. We draw connections between online learning, optimal transport, and partial differential equations through an insight called the minimal selection principle, originally studied in the Wasserstein gradient flow setting by Ambrosio et al. (2005).

February 2022 · Wenxuan Guo, YoonHaeng Hur, Tengyuan Liang, Christopher Ryan

Universal Prediction Band via Semi-Definite Programming

This paper proposes a computationally efficient method to construct nonparametric, heteroscedastic prediction bands for uncertainty quantification.

March 2021 · Tengyuan Liang

Local Optimality and Generalization Guarantees for the Langevin Algorithm via Empirical Metastability

We study the detailed path-wise behavior of the discrete-time Langevin algorithm for non-convex Empirical Risk Minimization (ERM) through the lens of metastability, adopting some techniques from Berglund and Gentz (2003).

February 2018 · Belinda Tzen, Tengyuan Liang, Maxim Raginsky
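The discrete-time Langevin algorithm referenced above is gradient descent plus injected Gaussian noise. A minimal sketch on a toy non-convex objective (not the paper's ERM setting; the double-well function, step size, and inverse temperature are illustrative assumptions) shows the metastable behavior: the iterate settles into one basin and, at low temperature, rarely escapes over short horizons.

```python
import numpy as np

rng = np.random.default_rng(1)

def langevin(grad, theta0, eta=0.01, beta=10.0, steps=2000):
    """Discrete-time Langevin update:
    theta_{k+1} = theta_k - eta * grad(theta_k) + sqrt(2*eta/beta) * xi_k,
    where xi_k is standard Gaussian noise and beta is the inverse temperature.
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(steps):
        noise = rng.normal(size=theta.shape)
        theta = theta - eta * grad(theta) + np.sqrt(2 * eta / beta) * noise
    return theta

# Toy double-well objective f(x) = (x^2 - 1)^2 with local minima at +/-1.
grad = lambda x: 4 * x * (x**2 - 1)
final = langevin(grad, theta0=np.array([0.1]))
```

With `beta` large (low temperature) the per-step noise is small relative to the well depth, so the iterate typically remains near one local minimum for a long time before any transition, which is the path-wise metastability the paper analyzes.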

Statistical Inference for the Population Landscape via Moment Adjusted Stochastic Gradients

Modern statistical inference tasks often require iterative optimization methods to compute the solution. Convergence analysis from an optimization viewpoint only tells us how well the solution is approximated numerically, but overlooks the sampling nature of the data. We introduce moment-adjusted stochastic gradient descent, a new stochastic optimization method for statistical inference.

December 2017 · Tengyuan Liang, Weijie J. Su