On stochastic subgradient mirror-descent algorithm with weighted averaging

Angelia Nedich, Soomin Lee

Research output: Contribution to journal › Article


Abstract

This paper considers a stochastic subgradient mirror-descent method for solving constrained convex minimization problems. In particular, a stochastic subgradient mirror-descent method with weighted iterate-averaging is investigated and its per-iterate convergence rate is analyzed. The novel part of the approach is in the choice of weights that are used to construct the averages. Through the use of these weighted averages, we show that the known optimal rates can be obtained with simpler algorithms than those currently existing in the literature. Specifically, by suitably choosing the stepsize values, one can obtain the rate of the order 1/k for strongly convex functions, and the rate 1/√k for general convex functions (not necessarily differentiable). Furthermore, for the latter case, it is shown that a stochastic subgradient mirror-descent with iterate averaging converges (along a subsequence) to an optimal solution, almost surely, even with the stepsize of the form 1/√(k+1), which was not previously known. The stepsize choices that achieve the best rates are those proposed by Tseng for acceleration of proximal gradient methods [P. Tseng, SIAM J. Optim., submitted].
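The scheme described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's exact algorithm: it assumes the Euclidean setup (distance-generating function ½‖x‖², so the mirror update reduces to a projected subgradient step), stepsizes a_k = 1/√(k+1) as in the general convex case, and averaging weights proportional to the stepsizes. The quadratic test objective, the noisy gradient oracle, and the box constraint are hypothetical stand-ins chosen only to make the sketch runnable.

```python
import numpy as np

def smd_weighted_average(subgrad, project, x0, iters, rng):
    """Stochastic subgradient mirror descent with weighted iterate averaging.

    Euclidean distance-generating function, so the mirror update is a plain
    projected subgradient step.  Stepsize a_k = 1/sqrt(k+1); the returned
    point is the a_k-weighted average of the iterates.
    """
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    total = 0.0
    for k in range(iters):
        a = 1.0 / np.sqrt(k + 1)               # stepsize a_k = 1/sqrt(k+1)
        x = project(x - a * subgrad(x, rng))   # mirror (here: projected) step
        avg += a * x                           # weight a_k on the new iterate
        total += a
    return avg / total                         # weighted average of iterates

# Hypothetical test problem: minimize E[(x - c)^2] over the box [-1, 1]^2
# with a noisy gradient oracle; the constrained minimizer is c itself.
c = np.array([0.3, -0.5])
noisy_grad = lambda x, rng: 2.0 * (x - c) + rng.normal(0.0, 0.1, x.shape)
box = lambda x: np.clip(x, -1.0, 1.0)

x_avg = smd_weighted_average(noisy_grad, box, np.zeros(2), 2000,
                             np.random.default_rng(0))
```

Weighting each iterate by its own stepsize is what distinguishes this averaging from the uniform average: late iterates, taken with small steps, are discounted at the same 1/√(k+1) rate, which is how the weighted average attains the optimal-order bound in the general convex case.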

Original language: English (US)
Pages (from-to): 84-107
Number of pages: 24
Journal: SIAM Journal on Optimization
Volume: 24
Issue number: 1
DOIs: https://doi.org/10.1137/120894464
State: Published - 2014
Externally published: Yes


Keywords

  • Convex optimization
  • Mirror-descent algorithm
  • Stochastic subgradient method
  • Weighted averaging

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science

Cite this

On stochastic subgradient mirror-descent algorithm with weighted averaging. / Nedich, Angelia; Lee, Soomin.

In: SIAM Journal on Optimization, Vol. 24, No. 1, 2014, p. 84-107.


@article{33a2aa2f214f4d4ba265513d47e96109,
title = "On stochastic subgradient mirror-descent algorithm with weighted averaging",
abstract = "This paper considers a stochastic subgradient mirror-descent method for solving constrained convex minimization problems. In particular, a stochastic subgradient mirror-descent method with weighted iterate-averaging is investigated and its per-iterate convergence rate is analyzed. The novel part of the approach is in the choice of weights that are used to construct the averages. Through the use of these weighted averages, we show that the known optimal rates can be obtained with simpler algorithms than those currently existing in the literature. Specifically, by suitably choosing the stepsize values, one can obtain the rate of the order 1/k for strongly convex functions, and the rate 1/√k for general convex functions (not necessarily differentiable). Furthermore, for the latter case, it is shown that a stochastic subgradient mirror-descent with iterate averaging converges (along a subsequence) to an optimal solution, almost surely, even with the stepsize of the form 1/√(k+1), which was not previously known. The stepsize choices that achieve the best rates are those proposed by Tseng for acceleration of proximal gradient methods [P. Tseng, SIAM J. Optim., submitted].",
keywords = "Convex optimization, Mirror-descent algorithm, Stochastic subgradient method, Weighted averaging",
author = "Angelia Nedich and Soomin Lee",
year = "2014",
doi = "10.1137/120894464",
language = "English (US)",
volume = "24",
pages = "84--107",
journal = "SIAM Journal on Optimization",
issn = "1052-6234",
publisher = "Society for Industrial and Applied Mathematics Publications",
number = "1",

}

TY - JOUR

T1 - On stochastic subgradient mirror-descent algorithm with weighted averaging

AU - Nedich, Angelia

AU - Lee, Soomin

PY - 2014

Y1 - 2014

N2 - This paper considers a stochastic subgradient mirror-descent method for solving constrained convex minimization problems. In particular, a stochastic subgradient mirror-descent method with weighted iterate-averaging is investigated and its per-iterate convergence rate is analyzed. The novel part of the approach is in the choice of weights that are used to construct the averages. Through the use of these weighted averages, we show that the known optimal rates can be obtained with simpler algorithms than those currently existing in the literature. Specifically, by suitably choosing the stepsize values, one can obtain the rate of the order 1/k for strongly convex functions, and the rate 1/√k for general convex functions (not necessarily differentiable). Furthermore, for the latter case, it is shown that a stochastic subgradient mirror-descent with iterate averaging converges (along a subsequence) to an optimal solution, almost surely, even with the stepsize of the form 1/√(k+1), which was not previously known. The stepsize choices that achieve the best rates are those proposed by Tseng for acceleration of proximal gradient methods [P. Tseng, SIAM J. Optim., submitted].

AB - This paper considers a stochastic subgradient mirror-descent method for solving constrained convex minimization problems. In particular, a stochastic subgradient mirror-descent method with weighted iterate-averaging is investigated and its per-iterate convergence rate is analyzed. The novel part of the approach is in the choice of weights that are used to construct the averages. Through the use of these weighted averages, we show that the known optimal rates can be obtained with simpler algorithms than those currently existing in the literature. Specifically, by suitably choosing the stepsize values, one can obtain the rate of the order 1/k for strongly convex functions, and the rate 1/√k for general convex functions (not necessarily differentiable). Furthermore, for the latter case, it is shown that a stochastic subgradient mirror-descent with iterate averaging converges (along a subsequence) to an optimal solution, almost surely, even with the stepsize of the form 1/√(k+1), which was not previously known. The stepsize choices that achieve the best rates are those proposed by Tseng for acceleration of proximal gradient methods [P. Tseng, SIAM J. Optim., submitted].

KW - Convex optimization

KW - Mirror-descent algorithm

KW - Stochastic subgradient method

KW - Weighted averaging

UR - http://www.scopus.com/inward/record.url?scp=84897542293&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84897542293&partnerID=8YFLogxK

U2 - 10.1137/120894464

DO - 10.1137/120894464

M3 - Article

AN - SCOPUS:84897542293

VL - 24

SP - 84

EP - 107

JO - SIAM Journal on Optimization

JF - SIAM Journal on Optimization

SN - 1052-6234

IS - 1

ER -