On stochastic and deterministic quasi-Newton methods for nonstrongly convex optimization: Asymptotic convergence and rate analysis

Farzad Yousefian, Angelia Nedić, Uday V. Shanbhag

Research output: Contribution to journal › Article › peer-review


Abstract

Motivated by applications arising from large-scale optimization and machine learning, we consider stochastic quasi-Newton (SQN) methods for solving unconstrained convex optimization problems. Much of the convergence analysis of SQN methods, in both full and limited-memory regimes, requires the objective function to be strongly convex. However, this assumption is fairly restrictive and does not hold in many applications. To the best of our knowledge, no rate statements currently exist for SQN methods in the absence of such an assumption. Furthermore, among the existing first-order methods for addressing stochastic optimization problems with merely convex objectives, techniques equipped with provable convergence rates employ averaging. However, averaging has a detrimental impact on inducing sparsity. Motivated by these gaps, we consider optimization problems with nonstrongly convex objectives and Lipschitz but possibly unbounded gradients. The main contributions of the paper are as follows: (i) To address large-scale stochastic optimization problems, we develop an iteratively regularized stochastic limited-memory BFGS (IRS-LBFGS) algorithm, where the step size, regularization parameter, and the Hessian inverse approximation are updated iteratively. We establish convergence of the iterates (with no averaging) to an optimal solution of the original problem both in an almost-sure sense and in a mean sense. The convergence rate is derived in terms of the objective function value and is shown to be O(1/k^{1/3-ε}), where ε is an arbitrarily small positive scalar. (ii) In deterministic regimes, we show that the algorithm displays a rate of O(1/k^{1-ε}). We present numerical experiments performed on a large-scale text classification problem and compare IRS-LBFGS with standard SQN methods as well as first-order methods such as SAGA and IAG.
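
The abstract describes the general structure of IRS-LBFGS: a stochastic gradient of an iteratively regularized objective, a limited-memory BFGS approximation of the Hessian inverse, and step-size and regularization sequences that decay with the iteration count. The sketch below illustrates only that general structure; the names (`stoch_grad`, `irs_lbfgs`), the decay exponents, and the curvature-pair safeguard are placeholder assumptions and do not reproduce the paper's exact update rules or parameter choices.

```python
# Minimal sketch of an iteratively regularized stochastic L-BFGS loop.
# Assumptions (not from the paper): a generic stochastic gradient oracle
# `stoch_grad`, simple polynomial decay rules for the step size and
# regularization parameter, and a basic curvature-pair filter.
import numpy as np

def two_loop_recursion(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion applied to the vector g."""
    q = g.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Initial Hessian-inverse scaling (a common heuristic choice).
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / (y @ s)
        b = rho * (y @ q)
        q += (a - b) * s
    return q

def irs_lbfgs(stoch_grad, x0, iters=1000, memory=10):
    """Illustrative iteratively regularized stochastic L-BFGS iteration."""
    x = x0.copy()
    s_list, y_list = [], []
    x_prev, g_prev = None, None
    for k in range(1, iters + 1):
        mu_k = 1.0 / k**0.5       # hypothetical regularization decay
        gamma_k = 1.0 / k**0.75   # hypothetical step-size decay
        g = stoch_grad(x) + mu_k * x   # gradient of regularized objective
        if g_prev is not None:
            s, y = x - x_prev, g - g_prev
            # Keep only curvature pairs with positive s'y; practical SQN
            # schemes typically use more careful pair construction.
            if s @ y > 1e-10:
                s_list.append(s); y_list.append(y)
                if len(s_list) > memory:
                    s_list.pop(0); y_list.pop(0)
        d = two_loop_recursion(g, s_list, y_list)
        x_prev, g_prev = x.copy(), g.copy()
        x -= gamma_k * d
    return x
```

The role of the regularization term mu_k * x is that each regularized subproblem is strongly convex while mu_k is driven to zero, which is what permits convergence of the iterates themselves, without averaging, in the merely convex setting targeted by the paper.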

Original language: English (US)
Pages (from-to): 1144-1172
Number of pages: 29
Journal: SIAM Journal on Optimization
Volume: 30
Issue number: 2
DOIs
State: Published - 2020

Keywords

  • Large scale optimization
  • Quasi-Newton
  • Regularization
  • Stochastic optimization

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Applied Mathematics
