Statistical Inference without Excess Data Using Only Stochastic Gradients: Volume 1

The authors present a novel statistical inference framework for convex empirical risk minimization using approximate stochastic Newton steps. The proposed algorithm is based on finite differences and approximates a Hessian-vector product from first-order information alone. In theory, the method efficiently computes the statistical error covariance in M-estimation, both for unregularized convex learning problems and for high-dimensional LASSO regression, without using exact second-order information or resampling the entire data set. The authors also present a stochastic gradient sampling scheme for statistical inference in non-i.i.d. time series analysis, where contiguous blocks of indices are sampled. In practice, they demonstrate the effectiveness of the framework on large-scale machine learning problems that go beyond convexity: as a highlight, their work can be used to detect certain adversarial attacks on neural networks.
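The core finite-difference idea mentioned above can be sketched in a few lines: a Hessian-vector product is approximated by a central difference of gradients, so only first-order information is required. This is a minimal illustrative sketch, not the authors' code; the function names and the toy quadratic are assumptions for demonstration.

```python
import numpy as np

def hvp_finite_difference(grad_fn, theta, v, delta=1e-5):
    """Approximate the Hessian-vector product H(theta) @ v using a
    central finite difference of the gradient; `grad_fn` returns the
    (possibly stochastic) gradient at a parameter vector."""
    return (grad_fn(theta + delta * v) - grad_fn(theta - delta * v)) / (2 * delta)

# Toy check on a quadratic f(x) = 0.5 * x^T A x, whose Hessian is A
# (hypothetical example, not from the paper).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x
theta = np.array([0.5, -1.0])
v = np.array([1.0, 2.0])
approx = hvp_finite_difference(grad, theta, v)
exact = A @ v
```

For a quadratic objective the central difference is exact up to floating-point error; for general smooth losses the approximation error is O(delta^2).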


  • Language: English

Media Info

  • Media Type: Digital/other
  • Features: Figures; References; Tables
  • Pagination: 22p

Filing Info

  • Accession Number: 01764644
  • Record Type: Publication
  • Report/Paper Numbers: D-STOP/2020/159, Report 159
  • Contract Numbers: DTRT13-G-UTC58
  • Created Date: Jan 27 2021 12:34PM