Abstract:

Modern statistical inference tasks often require iterative optimization methods to compute the solution. Convergence analysis from an optimization viewpoint informs us only how well the solution is approximated numerically, but overlooks the sampling nature of the data. In contrast, recognizing the randomness in the data, statisticians are keen to provide uncertainty quantification, or confidence, for the solution obtained using iterative optimization methods. This paper makes progress in this direction by introducing moment-adjusted stochastic gradient descent, a new stochastic optimization method for statistical inference. We establish non-asymptotic theory that characterizes the statistical distribution of certain iterative methods with optimization guarantees. On the statistical front, the theory allows for model mis-specification under very mild conditions on the data. On the optimization front, the theory applies to both convex and non-convex problems. Remarkably, the moment-adjusting idea, motivated by “error standardization” in statistics, achieves an effect similar to acceleration in first-order optimization methods used to fit generalized linear models. We also demonstrate this acceleration effect in the non-convex setting through numerical experiments.
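To make the moment-adjusting idea concrete, here is a minimal NumPy sketch, not the authors' implementation: it preconditions each minibatch gradient by the inverse square root of an estimated second-moment matrix of the per-sample gradients, which is one natural reading of “error standardization.” The function and parameter names (masgd, grad_fn, eps) and the batching and damping choices are illustrative assumptions.

import numpy as np

def masgd(grad_fn, theta0, data, batch_size=32, lr=0.1,
          n_steps=500, eps=1e-6, seed=None):
    # Sketch of moment-adjusted SGD: each step standardizes the
    # minibatch gradient by the inverse square root of an estimated
    # second-moment matrix of the per-sample gradients.
    rng = np.random.default_rng(seed)
    theta = np.array(theta0, dtype=float)
    n, d = len(data), len(theta)
    for _ in range(n_steps):
        idx = rng.choice(n, size=batch_size, replace=False)
        G = np.stack([grad_fn(theta, data[i]) for i in idx])  # (batch, d)
        g = G.mean(axis=0)                    # minibatch gradient
        S = G.T @ G / batch_size              # second-moment estimate
        # Damped inverse square root via eigendecomposition.
        w, U = np.linalg.eigh(S + eps * np.eye(d))
        step = U @ ((U.T @ g) / np.sqrt(w))   # S^{-1/2} g
        theta -= lr * step                    # moment-adjusted update
    return theta

Here grad_fn(theta, z) is assumed to return the per-observation gradient (score). Since the covariance of the adjusted gradient is roughly S^{-1/2} E[g g^T] S^{-1/2} = I when S estimates E[g g^T], the step noise becomes approximately isotropic, which is the standardization the abstract refers to.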


Citation

Tengyuan Liang and Weijie J. Su. 2019. “Statistical Inference for the Population Landscape via Moment-Adjusted Stochastic Gradients.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 81 (2): 431–456.

@article{Liang_2019,
   title={Statistical Inference for the Population Landscape via Moment-Adjusted Stochastic Gradients},
   author={Liang, Tengyuan and Su, Weijie J.},
   journal={Journal of the Royal Statistical Society Series B: Statistical Methodology},
   volume={81},
   number={2},
   pages={431–456},
   year={2019},
   month=feb,
   ISSN={1467-9868},
   DOI={10.1111/rssb.12313},
   url={http://dx.doi.org/10.1111/rssb.12313},
   publisher={Oxford University Press (OUP)}
}