On stochastic gradient and subgradient methods with adaptive steplength sequences

Farzad Yousefian, Angelia Nedić, Uday V. Shanbhag

Research output: Contribution to journal › Article › peer-review

83 Scopus citations

Abstract

Traditionally, stochastic approximation (SA) schemes have been popular choices for solving stochastic optimization problems. However, the performance of standard SA implementations can vary significantly based on the choice of the steplength sequence, and in general, little guidance is provided about good choices. Motivated by this gap, we present two adaptive steplength schemes for strongly convex differentiable stochastic optimization problems, equipped with convergence theory, that aim to reduce the reliance on user-specific parameters. The first scheme, referred to as a recursive steplength stochastic approximation (RSA) scheme, optimizes the error bounds to derive a rule that expresses the steplength at a given iteration as a simple function of the steplength at the previous iteration and certain problem parameters. The second scheme, termed a cascading steplength stochastic approximation (CSA) scheme, maintains the steplength sequence as a piecewise-constant decreasing function, with the reduction in the steplength occurring when a suitable error threshold is met. We then allow for nondifferentiable objectives, provided the subgradients are bounded over a certain domain. In this regime, we propose a local smoothing technique, based on random local perturbations of the objective function, that leads to a differentiable approximation of the function. Assuming a uniform distribution on the local randomness, we establish a Lipschitzian property for the gradient of the approximation and prove that the resulting Lipschitz bound grows at a modest rate with problem size. This facilitates the development of an adaptive steplength stochastic approximation framework, which now requires sampling in the product space of the original measure and the artificially introduced distribution.
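
As an illustration of the recursive steplength idea, the sketch below runs a stochastic gradient iteration on a strongly convex quadratic and updates the steplength with a recursion in the spirit of the RSA scheme: each new steplength is a simple function of the previous one and a strong-convexity-related constant. This is a minimal sketch; the constant c, the initial steplength, and the test problem are illustrative assumptions, not values taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    eta = 1.0                  # assumed strong convexity modulus of f(x) = 0.5 * eta * ||x||^2
    c = 0.5 * eta              # hypothetical recursion constant tied to the strong convexity modulus
    gamma = 0.9 / eta          # initial steplength (assumed small enough that c * gamma < 1)
    x = 10.0 * np.ones(5)      # starting point

    for k in range(1000):
        noise = rng.normal(scale=0.1, size=x.shape)   # stochastic error in the sampled gradient
        grad = eta * x + noise                        # noisy gradient of the quadratic objective
        x = x - gamma * grad                          # stochastic approximation step
        gamma = gamma * (1.0 - c * gamma)             # RSA-style recursive steplength update

    print(np.linalg.norm(x))   # iterates approach the minimizer at the origin

A recursion of this form drives the steplength down roughly like 1/k, the classical SA rate, but with the decay governed by problem-related constants rather than a user-tuned schedule.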

Original language: English (US)
Pages (from-to): 56-67
Number of pages: 12
Journal: Automatica
Volume: 48
Issue number: 1
State: Published - Jan 2012
Externally published: Yes

Keywords

  • Adaptive steplength
  • Convex optimization
  • Randomized smoothing techniques
  • Stochastic approximation
  • Stochastic optimization

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Electrical and Electronic Engineering

