Randomization Inference When N Equals One
A statistical theory for N-of-1 experiments, in which a single unit serves as its own control by receiving treatment and control conditions in different time windows.
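As a minimal sketch of what randomization inference looks like in this setting, the example below runs a permutation test on hypothetical data from one unit observed over ten time windows: the test statistic is the treated-minus-control mean difference, and the p-value comes from re-randomizing the window labels. All data and parameters here are illustrative assumptions, not from the paper.

```python
import numpy as np

# Hypothetical N-of-1 data: one unit measured over 10 time windows,
# with treatment (1) and control (0) assigned to windows at random.
rng = np.random.default_rng(0)
treat = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1, 0])
y = np.where(treat == 1, 1.0, 0.0) + rng.normal(0, 0.5, size=10)

def mean_diff(y, treat):
    # Test statistic: treated-window mean minus control-window mean.
    return y[treat == 1].mean() - y[treat == 0].mean()

obs = mean_diff(y, treat)

# Randomization p-value: re-randomize the window labels many times and
# count how often the statistic is at least as extreme as observed.
draws = [mean_diff(y, rng.permutation(treat)) for _ in range(5000)]
p_value = (np.sum(np.abs(draws) >= abs(obs)) + 1) / (5000 + 1)
```

The "+1" terms give the standard finite-sample correction so the p-value is never exactly zero.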
Detecting weak, systematic distribution shifts and quantitatively modeling individual, heterogeneous responses to policies or incentives are tasks of growing empirical importance in the social and economic sciences. We propose a model for weak distribution shifts via displacement interpolation, drawing on optimal transport theory.
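In one dimension, displacement interpolation has a simple closed form: the interpolant at time t has quantile function (1 - t)·Q0 + t·Q1, where Q0 and Q1 are the quantile functions of the two endpoint distributions. The sketch below computes this from two hypothetical samples; the samples and grid are illustrative assumptions, not part of the proposed model.

```python
import numpy as np

# Displacement interpolation in 1D via quantile functions:
# the t-interpolant has quantile function (1 - t) * Q0 + t * Q1.
rng = np.random.default_rng(1)
x0 = rng.normal(0.0, 1.0, size=2000)   # reference sample
x1 = rng.normal(3.0, 1.0, size=2000)   # shifted sample

grid = np.linspace(0.01, 0.99, 99)
q0 = np.quantile(x0, grid)
q1 = np.quantile(x1, grid)

def interpolant_quantiles(t):
    # Quantiles of the displacement interpolant at time t in [0, 1].
    return (1 - t) * q0 + t * q1

# At t = 0.5 the median sits roughly halfway between the two medians.
mid = interpolant_quantiles(0.5)
```

Unlike naive mixture interpolation, which at t = 0.5 would put half the mass at each endpoint, the displacement interpolant moves the whole distribution along the transport path.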
Motivated by seminal work on distances and isomorphisms between metric measure spaces, we propose a new notion, the Reversible Gromov-Monge (RGM) distance, and study how it can be used to design new transform samplers for simulation-based inference.
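The generic idea behind a transform sampler is to push a simple reference distribution through a map T: draw z from the reference, output T(z). The sketch below illustrates this with a closed-form inverse-CDF map that turns uniform draws into Exponential(1) draws; this is only an illustration of the pushforward mechanism, since in the RGM framework the map is learned rather than known in closed form.

```python
import numpy as np

# A transform sampler: draw z from a reference distribution and
# return T(z). Here T(z) = -log(1 - z) is the Exponential(1) inverse
# CDF, so Uniform(0, 1) draws are pushed forward to exponential draws.
rng = np.random.default_rng(2)

def transform_sampler(T, n):
    z = rng.uniform(0.0, 1.0, size=n)  # reference draws
    return T(z)                        # pushforward samples

samples = transform_sampler(lambda z: -np.log(1.0 - z), 100_000)
# The sample mean should be close to the Exponential(1) mean of 1.
```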
This course covers fundamental statistical concepts and basic computational tools for data analysis. The goal is to learn how to perform descriptive and predictive data analysis on real datasets. The course also serves as a quantitative foundation for Chicago Booth elective courses in marketing, finance, and economics, as well as more advanced courses in data science.
Modern statistical inference tasks often require iterative optimization methods to compute the solution. Convergence analysis from an optimization viewpoint tells us only how well the solution is approximated numerically; it overlooks the sampling nature of the data. We introduce moment-adjusted stochastic gradient descent, a new stochastic optimization method designed for statistical inference.
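For concreteness, the sketch below shows plain stochastic gradient descent on a least-squares problem, the iterative setting such an analysis targets. The moment adjustment itself is not reproduced here; this is vanilla SGD on simulated data, with all parameters chosen for illustration.

```python
import numpy as np

# Vanilla SGD for least-squares regression on simulated data.
rng = np.random.default_rng(3)
n, d = 5000, 3
X = rng.normal(size=(n, d))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

beta = np.zeros(d)
lr = 0.01
for t in range(n):
    i = rng.integers(n)                 # sample one observation
    grad = (X[i] @ beta - y[i]) * X[i]  # gradient of squared loss
    beta -= lr * grad                   # SGD update
```

With a constant step size, the iterate settles into a noise-driven neighborhood of the true coefficients; quantifying that sampling-induced uncertainty, rather than only the numerical error, is what distinguishes the statistical viewpoint.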