Deep Neural Networks for Estimation and Inference
Can deep neural networks with standard architectures estimate treatment effects and perform downstream uncertainty quantification tasks?
This paper proposes a computationally efficient method to construct nonparametric, heteroscedastic prediction bands for uncertainty quantification.
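To illustrate what a nonparametric, heteroscedastic prediction band looks like (a generic split-calibration sketch in numpy, not the paper's method; the binned mean and scale estimators below are stand-ins for any regressor):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic heteroscedastic data: noise scale grows with x.
n = 2000
x = rng.uniform(0, 4, n)
y = np.sin(x) + (0.1 + 0.3 * x) * rng.normal(size=n)

# Split: half for fitting crude mean/scale estimates, half for calibration.
fit, cal = np.arange(n) < n // 2, np.arange(n) >= n // 2

# Crude nonparametric fits via binned means (stand-in for any regressor).
bins = np.linspace(0, 4, 21)
idx = np.clip(np.digitize(x, bins) - 1, 0, 19)
mu_hat = np.array([y[fit & (idx == b)].mean() for b in range(20)])
sigma_hat = np.array([np.abs(y[fit & (idx == b)] - mu_hat[b]).mean() for b in range(20)])

# Calibrate: a quantile of the normalized residuals yields bands whose
# width tracks the local noise level sigma_hat(x).
scores = np.abs(y[cal] - mu_hat[idx[cal]]) / sigma_hat[idx[cal]]
q = np.quantile(scores, 0.9)
lo = mu_hat[idx] - q * sigma_hat[idx]
hi = mu_hat[idx] + q * sigma_hat[idx]

coverage = np.mean((y[cal] >= lo[cal]) & (y[cal] <= hi[cal]))
print(f"calibration coverage {coverage:.2f}")
```

The band is wider where the noise is larger, which is the defining feature of a heteroscedastic (rather than constant-width) prediction band.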
Blessings and curses of covariate shifts, directional convergence, and the connection to experimental design.
Detecting weak, systematic distribution shifts and quantitatively modeling individual, heterogeneous responses to policies or incentives have found increasing empirical applications in social and economic sciences. We propose a model for weak distribution shifts via displacement interpolation, drawing from the optimal transport theory.
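In one dimension, displacement interpolation has a simple closed form that conveys the idea of a weak shift: the optimal-transport geodesic between two distributions at time t has quantile function (1 - t)·Q0 + t·Q1. A minimal numpy sketch (an illustration of displacement interpolation itself, not the paper's shift-detection model):

```python
import numpy as np

rng = np.random.default_rng(3)

a = rng.normal(0.0, 1.0, 5000)   # source sample
b = rng.normal(2.0, 1.0, 5000)   # target sample

def displacement_interp(a, b, t, grid=np.linspace(0.01, 0.99, 99)):
    # Interpolate the empirical quantile functions on a fixed grid:
    # the 1-D Wasserstein geodesic between the two samples.
    q0, q1 = np.quantile(a, grid), np.quantile(b, grid)
    return (1 - t) * q0 + t * q1

# A "weak" shift: small t moves the whole distribution slightly toward the target.
mid = displacement_interp(a, b, 0.1)
print(round(float(mid.mean()), 2))
```

Here t = 0.1 shifts the mean only about a tenth of the way from the source to the target, the kind of weak, systematic displacement the abstract describes.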
A statistical theory for N-of-1 experiments, where a unit serves as its own control and treatment in rapid interleaving time windows.
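A toy simulation of the N-of-1 setting (a hypothetical illustration of the design, not the paper's theory): one unit alternates between treatment and control in rapidly interleaved windows, and differencing adjacent windows cancels slow within-unit drift.

```python
import numpy as np

rng = np.random.default_rng(2)

# One unit observed over T windows with a rapidly interleaved assignment.
T = 200
treat = np.tile([1, 0], T // 2)          # treatment, control, treatment, ...
trend = 0.01 * np.arange(T)              # slow drift within the unit
y = 2.0 * treat + trend + rng.normal(scale=0.5, size=T)

# Each treated window is paired with the adjacent control window;
# the slow drift nearly cancels in the pairwise differences.
effect_hat = float(np.mean(y[treat == 1] - y[treat == 0]))
print(round(effect_hat, 2))  # near the true effect of 2.0
```

The rapid interleaving is what makes the unit a valid control for itself: the confounding drift changes little between adjacent windows.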
Confounding can obfuscate the definition of the best prediction model (concept shift) and shift covariates to domains yet unseen (covariate shift). A model that maximizes prediction accuracy in the source environment can therefore suffer a significant accuracy drop in the target environment. We propose a new domain adaptation method for observational data in the presence of confounding, and characterize the tradeoff between stability and predictability by leveraging a structural causal model.
We introduce a new convexified matching method for missing value imputation and individualized inference inspired by computational optimal transport.
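As background for the matching idea (a standard one-to-one assignment baseline, assuming scipy; this is not the convexified method the paper proposes): matching each treated unit to a distinct control unit under a squared-distance cost is exactly a balanced optimal-transport problem, solvable by the Hungarian algorithm.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(1)

# Toy setup: impute a missing outcome for each "treated" row by matching it
# to a distinct "control" row with similar covariates.
X_treated = rng.normal(size=(5, 3))
X_control = rng.normal(size=(8, 3))
y_control = X_control.sum(axis=1) + 0.1 * rng.normal(size=8)

# Pairwise squared Euclidean cost matrix (5 x 8).
cost = ((X_treated[:, None, :] - X_control[None, :, :]) ** 2).sum(axis=-1)

# Hungarian algorithm: a minimum-cost one-to-one assignment.
rows, cols = linear_sum_assignment(cost)
y_imputed = y_control[cols]
print(dict(zip(rows.tolist(), cols.tolist())))
```

Each treated row gets a distinct control match, so the imputed outcomes come from a transport plan rather than independent nearest neighbors.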
This paper presents Gaussianized Design Optimization, a novel framework for optimally balancing covariates in experimental design.
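For context on what "balancing covariates in experimental design" means, here is the standard rerandomization baseline (a generic sketch, not the proposed Gaussianized framework): draw many candidate assignments and keep the one minimizing a Mahalanobis imbalance criterion between group means.

```python
import numpy as np

rng = np.random.default_rng(4)

n, p = 100, 5
X = rng.normal(size=(n, p))
# Inverse covariance of the difference in group means (each group has n/2 units).
Sinv = np.linalg.inv(np.cov(X, rowvar=False) * (4 / n))

def imbalance(assign):
    # Mahalanobis distance between treated and control covariate means.
    d = X[assign == 1].mean(axis=0) - X[assign == 0].mean(axis=0)
    return float(d @ Sinv @ d)

# Rerandomization: sample many balanced assignments, keep the best-balanced one.
draws = [rng.permutation(np.repeat([0, 1], n // 2)) for _ in range(500)]
best = min(draws, key=imbalance)
med = float(np.median([imbalance(a) for a in draws]))
print(f"best imbalance {imbalance(best):.3f} vs typical {med:.3f}")
```

The selected assignment is far better balanced than a typical random draw, which is the objective any covariate-balancing design framework optimizes more systematically.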