The effect of deterministic noise in subgradient methods

Angelia Nedich, Dimitri P. Bertsekas

Research output: Contribution to journal › Article

28 Citations (Scopus)

Abstract

In this paper, we study the influence of noise on subgradient methods for convex constrained optimization. The noise may be due to various sources, and is manifested in inexact computation of the subgradients and function values. Assuming that the noise is deterministic and bounded, we discuss the convergence properties for two cases: the case where the constraint set is compact, and the case where this set need not be compact but the objective function has a sharp set of minima (for example the function is polyhedral). In both cases, using several different stepsize rules, we prove convergence to the optimal value within some tolerance that is given explicitly in terms of the errors. In the first case, the tolerance is nonzero, but in the second case, the optimal value can be obtained exactly, provided the size of the error in the subgradient computation is below some threshold. We then extend these results to objective functions that are the sum of a large number of convex functions, in which case an incremental subgradient method can be used.
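
The abstract describes running a subgradient method when only approximate subgradients and function values are available. As a rough illustration of that setting (a minimal sketch, not the paper's algorithm or analysis), the Python snippet below applies a projected subgradient method with a constant stepsize to a polyhedral test problem over a box constraint, perturbing each subgradient by a deterministic error of norm at most eps; the iterates then approach the optimal value only to within a tolerance that depends on the error bound and the stepsize. The test problem, the error model, and all names (noisy_projected_subgradient, A, b, eps, step) are illustrative assumptions, not taken from the paper.

import numpy as np

def noisy_projected_subgradient(A, b, x0, step=0.01, eps=1e-3, iters=5000, lo=-1.0, hi=1.0):
    """Constant-stepsize projected subgradient method for
    f(x) = max_i (a_i^T x - b_i) over the box [lo, hi]^n, where each
    subgradient is perturbed by a deterministic error of norm at most eps."""
    x = np.array(x0, dtype=float)
    best = np.inf
    for _ in range(iters):
        residuals = A @ x - b
        i = int(np.argmax(residuals))                  # index of an active piece of the max
        g = A[i]                                       # exact subgradient of f at x
        e = eps * np.ones_like(g) / np.sqrt(g.size)    # fixed deterministic perturbation, ||e|| = eps
        x = np.clip(x - step * (g + e), lo, hi)        # noisy subgradient step, then projection onto the box
        best = min(best, float(np.max(A @ x - b)))     # best objective value found so far
    return x, best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)
    x, best = noisy_projected_subgradient(A, b, x0=np.zeros(5))
    print("best objective value found:", best)

Shrinking eps toward zero (or switching to a diminishing stepsize) tightens the final tolerance, which is in line with the qualitative behavior the abstract describes.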

Original language: English (US)
Pages (from-to): 75-99
Number of pages: 25
Journal: Mathematical Programming
Volume: 125
Issue number: 1
DOIs: 10.1007/s10107-008-0262-5
State: Published - Sep 2010
Externally published: Yes

Fingerprint

  • Subgradient Method
  • Subgradient
  • Tolerance
  • Objective function
  • Constrained Optimization
  • Convex Optimization
  • Value Function
  • Convergence Properties
  • Convex function

Keywords

  • Mathematics Subject Classification (2000): 90C25

ASJC Scopus subject areas

  • Software
  • Mathematics (all)

Cite this

Nedich, Angelia; Bertsekas, Dimitri P. The effect of deterministic noise in subgradient methods. In: Mathematical Programming, Vol. 125, No. 1, 09.2010, pp. 75-99.

@article{f4d692916d1049c28ecee8403f97c806,
title = "The effect of deterministic noise in subgradient methods",
abstract = "In this paper, we study the influence of noise on subgradient methods for convex constrained optimization. The noise may be due to various sources, and is manifested in inexact computation of the subgradients and function values. Assuming that the noise is deterministic and bounded, we discuss the convergence properties for two cases: the case where the constraint set is compact, and the case where this set need not be compact but the objective function has a sharp set of minima (for example the function is polyhedral). In both cases, using several different stepsize rules, we prove convergence to the optimal value within some tolerance that is given explicitly in terms of the errors. In the first case, the tolerance is nonzero, but in the second case, the optimal value can be obtained exactly, provided the size of the error in the subgradient computation is below some threshold. We then extend these results to objective functions that are the sum of a large number of convex functions, in which case an incremental subgradient method can be used.",
keywords = "Mathematics Subject Classification (2000): 90C25",
author = "Angelia Nedich and Bertsekas, {Dimitri P.}",
year = "2010",
month = "9",
doi = "10.1007/s10107-008-0262-5",
language = "English (US)",
volume = "125",
pages = "75--99",
journal = "Mathematical Programming",
issn = "0025-5610",
publisher = "Springer-Verlag GmbH and Co. KG",
number = "1",

}

TY  - JOUR
T1  - The effect of deterministic noise in subgradient methods
AU  - Nedich, Angelia
AU  - Bertsekas, Dimitri P.
PY  - 2010/9
Y1  - 2010/9
N2  - In this paper, we study the influence of noise on subgradient methods for convex constrained optimization. The noise may be due to various sources, and is manifested in inexact computation of the subgradients and function values. Assuming that the noise is deterministic and bounded, we discuss the convergence properties for two cases: the case where the constraint set is compact, and the case where this set need not be compact but the objective function has a sharp set of minima (for example the function is polyhedral). In both cases, using several different stepsize rules, we prove convergence to the optimal value within some tolerance that is given explicitly in terms of the errors. In the first case, the tolerance is nonzero, but in the second case, the optimal value can be obtained exactly, provided the size of the error in the subgradient computation is below some threshold. We then extend these results to objective functions that are the sum of a large number of convex functions, in which case an incremental subgradient method can be used.
AB  - In this paper, we study the influence of noise on subgradient methods for convex constrained optimization. The noise may be due to various sources, and is manifested in inexact computation of the subgradients and function values. Assuming that the noise is deterministic and bounded, we discuss the convergence properties for two cases: the case where the constraint set is compact, and the case where this set need not be compact but the objective function has a sharp set of minima (for example the function is polyhedral). In both cases, using several different stepsize rules, we prove convergence to the optimal value within some tolerance that is given explicitly in terms of the errors. In the first case, the tolerance is nonzero, but in the second case, the optimal value can be obtained exactly, provided the size of the error in the subgradient computation is below some threshold. We then extend these results to objective functions that are the sum of a large number of convex functions, in which case an incremental subgradient method can be used.
KW  - Mathematics Subject Classification (2000): 90C25
UR  - http://www.scopus.com/inward/record.url?scp=77956393281&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=77956393281&partnerID=8YFLogxK
U2  - 10.1007/s10107-008-0262-5
DO  - 10.1007/s10107-008-0262-5
M3  - Article
AN  - SCOPUS:77956393281
VL  - 125
SP  - 75
EP  - 99
JO  - Mathematical Programming
JF  - Mathematical Programming
SN  - 0025-5610
IS  - 1
ER  -