Adaptive Sequential Stochastic Optimization

Craig Wilson, Venugopal V. Veeravalli, Angelia Nedich

Research output: Contribution to journal › Article › peer-review

13 Scopus citations

Abstract

A framework is introduced for sequentially solving convex stochastic minimization problems in which the objective functions change slowly, in the sense that the distance between successive minimizers is bounded. The minimization problems are solved by sequentially applying a selected optimization algorithm, such as stochastic gradient descent (SGD), with a number of samples drawn at each time step to carry out the iterations. Two tracking criteria are introduced to evaluate the quality of an approximate minimizer: one based on being accurate with respect to the mean trajectory, and the other based on being accurate in high probability. An estimate of a bound on the change in minimizers, combined with properties of the chosen optimization algorithm, is used to select the number of samples needed to meet the desired tracking criterion. A technique for estimating the change in minimizers is provided, along with analysis showing that the estimate eventually upper bounds the change in minimizers. This estimate yields sample-size selection rules that guarantee the tracking criterion is met after a sufficiently large number of time steps. Simulations confirm that the estimation approach provides the desired tracking accuracy in practice while remaining efficient in terms of the number of samples used at each time step.
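To make the adaptive loop described in the abstract concrete, the following Python sketch illustrates the general idea under simplifying assumptions. It is not the paper's actual algorithm: the geometric error-contraction model, the drift estimate gamma_hat, and all names (adaptive_tracking, sgd, make_grad) are hypothetical placeholders, assuming a strongly convex objective and SGD as the chosen optimization method.

```python
import numpy as np

rng = np.random.default_rng(0)

def sgd(grad_sample, x, num_samples, step):
    """Plain SGD: one stochastic gradient step per drawn sample."""
    for _ in range(num_samples):
        x = x - step * grad_sample(x)
    return x

def adaptive_tracking(make_grad, dim, time_steps, eps, rho=0.95, step=0.1):
    """Track the slowly moving minimizer of a sequence of stochastic objectives.

    gamma_hat is a running (hypothetical) estimate of the bound on the change
    between successive minimizers; the per-step sample size K is chosen so that
    a geometric error-contraction model keeps the tracking error below eps.
    """
    x = np.zeros(dim)
    prev_x = x.copy()
    gamma_hat = eps  # initial guess for the minimizer drift bound (assumption)
    for t in range(time_steps):
        grad_sample = make_grad(t)
        # Error model (stand-in for the paper's rule): after K samples the error
        # contracts roughly as rho**K * (eps + gamma_hat); choose the smallest K
        # with rho**K * (eps + gamma_hat) <= eps.
        K = max(1, int(np.ceil(np.log(eps / (eps + gamma_hat)) / np.log(rho))))
        x = sgd(grad_sample, x, K, step)
        # Update the drift estimate from the observed movement of the iterates.
        gamma_hat = max(gamma_hat, float(np.linalg.norm(x - prev_x)))
        prev_x = x.copy()
        yield t, K, x.copy()

# Usage on a drifting quadratic: f_t(x) = 0.5 * ||x - theta_t||^2 with noisy gradients.
def make_grad(t):
    theta_t = np.array([np.sin(0.05 * t), np.cos(0.05 * t)])  # slowly moving minimizer
    return lambda x: (x - theta_t) + 0.1 * rng.standard_normal(2)  # unbiased noisy gradient

for t, K, x in adaptive_tracking(make_grad, dim=2, time_steps=50, eps=0.05):
    pass  # x approximately tracks theta_t, using K samples at each time step
```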

Original language: English (US)
Article number: 8316928
Pages (from-to): 496-509
Number of pages: 14
Journal: IEEE Transactions on Automatic Control
Volume: 64
Issue number: 2
DOIs
State: Published - Feb 2019

Keywords

  • Gradient methods
  • stochastic optimization
  • time-varying objective
  • tracking problems

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science Applications
  • Electrical and Electronic Engineering
