Random block-coordinate gradient projection algorithms

Chandramani Singh, Angelia Nedich, R. Srikant

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

In this paper, we study gradient projection algorithms based on random partial updates of decision variables. These algorithms generalize random coordinate descent methods. We analyze these algorithms with and without assuming strong convexity of the objective functions. We also present an accelerated version of the algorithm based on Nesterov's two-step gradient method [1]. In each case, we prove convergence and provide a bound on the rate of convergence. We see that the randomized algorithms exhibit similar rates of convergence as their full gradient based deterministic counterparts.
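As a rough illustration of the setting the abstract describes, the following sketch applies a gradient projection step to one randomly chosen block of coordinates per iteration (this is not the authors' code; the function names, the box constraint, and all parameter values are invented for the example, and the accelerated Nesterov variant is omitted):

```python
import numpy as np

def random_block_gradient_projection(grad, project, x0, n_blocks, step, iters, seed=None):
    """Gradient projection with random partial (block) updates.

    Each iteration draws one coordinate block uniformly at random,
    takes a gradient step in that block only, and projects the block
    back onto its constraint set.
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    blocks = np.array_split(np.arange(x.size), n_blocks)
    for _ in range(iters):
        b = blocks[rng.integers(n_blocks)]   # random block of coordinates
        g = grad(x)                          # only the entries in block b are used
        x[b] = project(x[b] - step * g[b])   # partial update, then projection
    return x

# Toy instance: minimize ||x - c||^2 subject to x in the box [0, 1]^4.
c = np.array([0.2, 1.5, -0.3, 0.7])
x_star = random_block_gradient_projection(
    grad=lambda x: 2.0 * (x - c),
    project=lambda z: np.clip(z, 0.0, 1.0),  # Euclidean projection onto [0, 1]
    x0=np.zeros(4), n_blocks=2, step=0.25, iters=500, seed=0)
# x_star converges to the projection of c onto the box: [0.2, 1.0, 0.0, 0.7]
```

With a strongly convex objective such as this one, each block update is a contraction toward the constrained minimizer, which is the kind of behavior the paper's rate bounds quantify.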

Original language: English (US)
Article number: 7039379
Pages (from-to): 185-190
Number of pages: 6
Journal: Proceedings of the IEEE Conference on Decision and Control
Volume: 2015-February
Issue number: February
DOI: 10.1109/CDC.2014.7039379
State: Published - 2014
Externally published: Yes

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Modeling and Simulation
  • Control and Optimization

Cite this

Singh, C., Nedich, A., & Srikant, R. (2014). Random block-coordinate gradient projection algorithms. Proceedings of the IEEE Conference on Decision and Control, 2015-February(February), 185-190. [7039379]. https://doi.org/10.1109/CDC.2014.7039379

Random block-coordinate gradient projection algorithms. / Singh, Chandramani; Nedich, Angelia; Srikant, R.

In: Proceedings of the IEEE Conference on Decision and Control, Vol. 2015-February, No. February, 7039379, 2014, p. 185-190.

Research output: Contribution to journal › Article

Singh, C, Nedich, A & Srikant, R 2014, 'Random block-coordinate gradient projection algorithms', Proceedings of the IEEE Conference on Decision and Control, vol. 2015-February, no. February, 7039379, pp. 185-190. https://doi.org/10.1109/CDC.2014.7039379
Singh, Chandramani ; Nedich, Angelia ; Srikant, R. / Random block-coordinate gradient projection algorithms. In: Proceedings of the IEEE Conference on Decision and Control. 2014 ; Vol. 2015-February, No. February. pp. 185-190.
@article{51573e276bb64085a6b700067bbe4c20,
title = "Random block-coordinate gradient projection algorithms",
abstract = "In this paper, we study gradient projection algorithms based on random partial updates of decision variables. These algorithms generalize random coordinate descent methods. We analyze these algorithms with and without assuming strong convexity of the objective functions. We also present an accelerated version of the algorithm based on Nesterov's two-step gradient method [1]. In each case, we prove convergence and provide a bound on the rate of convergence. We see that the randomized algorithms exhibit similar rates of convergence as their full gradient based deterministic counterparts.",
author = "Chandramani Singh and Angelia Nedich and R. Srikant",
year = "2014",
doi = "10.1109/CDC.2014.7039379",
language = "English (US)",
volume = "2015-February",
pages = "185--190",
journal = "Scanning Electron Microscopy",
issn = "0586-5581",
publisher = "Scanning Microscopy International",
number = "February",

}

TY - JOUR

T1 - Random block-coordinate gradient projection algorithms

AU - Singh, Chandramani

AU - Nedich, Angelia

AU - Srikant, R.

PY - 2014

Y1 - 2014

N2 - In this paper, we study gradient projection algorithms based on random partial updates of decision variables. These algorithms generalize random coordinate descent methods. We analyze these algorithms with and without assuming strong convexity of the objective functions. We also present an accelerated version of the algorithm based on Nesterov's two-step gradient method [1]. In each case, we prove convergence and provide a bound on the rate of convergence. We see that the randomized algorithms exhibit similar rates of convergence as their full gradient based deterministic counterparts.

AB - In this paper, we study gradient projection algorithms based on random partial updates of decision variables. These algorithms generalize random coordinate descent methods. We analyze these algorithms with and without assuming strong convexity of the objective functions. We also present an accelerated version of the algorithm based on Nesterov's two-step gradient method [1]. In each case, we prove convergence and provide a bound on the rate of convergence. We see that the randomized algorithms exhibit similar rates of convergence as their full gradient based deterministic counterparts.

UR - http://www.scopus.com/inward/record.url?scp=84988227804&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84988227804&partnerID=8YFLogxK

U2 - 10.1109/CDC.2014.7039379

DO - 10.1109/CDC.2014.7039379

M3 - Article

VL - 2015-February

SP - 185

EP - 190

JO - Proceedings of the IEEE Conference on Decision and Control

JF - Proceedings of the IEEE Conference on Decision and Control

IS - February

M1 - 7039379

ER -