Gaussianized Design Optimization for Covariate Balance in Randomized Experiments

This paper presents Gaussianized Design Optimization, a novel framework for optimally balancing covariates in experimental design.

November 2024 · Wenxuan Guo, Tengyuan Liang, Panos Toulis

A Convexified Matching Approach to Imputation and Individualized Inference

We introduce a new convexified matching method for missing value imputation and individualized inference, inspired by computational optimal transport.

July 2024 · YoonHaeng Hur, Tengyuan Liang

Learning When the Concept Shifts: Confounding, Invariance, and Dimension Reduction

Confounding can obfuscate the definition of the best prediction model (concept shift) and shift covariates to domains yet unseen (covariate shift). Therefore, a model maximizing prediction accuracy in the source environment could suffer a significant accuracy drop in the target environment. We propose a new domain adaptation method for observational data in the presence of confounding, and characterize the tradeoff between stability and predictability by leveraging a structural causal model.

June 2024 · Kulunu Dharmakeerthi, YoonHaeng Hur, Tengyuan Liang

Randomization Inference When N Equals One

A statistical theory for N-of-1 experiments, where a unit serves as its own control and treatment in rapidly interleaved time windows.

October 2023 · Tengyuan Liang, Benjamin Recht

Blessings and Curses of Covariate Shifts: Adversarial Learning Dynamics, Directional Convergence, and Equilibria

Blessings and curses of covariate shifts, directional convergence, and the connection to experimental design.

December 2022 · Tengyuan Liang

Deep Neural Networks for Estimation and Inference

Can deep neural networks with standard architectures estimate treatment effects and perform downstream uncertainty quantification tasks?

September 2018 · Max H. Farrell, Tengyuan Liang, Sanjog Misra