Stochastic Gradient-Push for Strongly Convex Functions on Time-Varying Directed Graphs

Angelia Nedich, Alex Olshevsky

Research output: Contribution to journal › Article

70 Scopus citations

Abstract

We investigate the convergence rate of the recently proposed subgradient-push method for distributed optimization over time-varying directed graphs. The subgradient-push method can be implemented in a distributed way without requiring knowledge of either the number of agents or the graph sequence; each node is only required to know its out-degree at each time. Our main result is a convergence rate of O((ln t)/t) for strongly convex functions with Lipschitz gradients, even if only stochastic gradient samples are available; this is asymptotically faster than the O((ln t)/√t) rate previously known for (general) convex functions.
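The iteration described in the abstract can be sketched in a few lines. The following is a minimal illustrative simulation, not the paper's reference implementation: variable names, the quadratic local objectives, the two alternating directed graphs, and the exact (noise-free) gradients are all assumptions made for the example. Each node mixes with a column-stochastic weight built only from the sender's out-degree, maintains a push-sum weight to de-bias the directed averaging, and takes an O(1/t) step, matching the strongly convex setting.

```python
import numpy as np

# Local objectives f_i(x) = 0.5 * (x - a_i)^2 (illustrative choice);
# the minimizer of sum_i f_i is the mean of a, i.e. 3.0.
a = np.array([0.0, 3.0, 6.0])
grad = lambda i, z: z - a[i]  # exact gradient; the paper allows noisy samples

# Two directed graphs, used alternately (time-varying sequence).
# Column j spreads x_j / out_degree(j) to j's out-neighbors (self-loop
# included), so each matrix is column-stochastic but not row-stochastic.
A1 = np.array([[1/3, 0.0, 0.5],
               [1/3, 0.5, 0.0],
               [1/3, 0.5, 0.5]])
A2 = np.array([[0.5, 0.5, 0.0],
               [0.0, 0.5, 0.5],
               [0.5, 0.0, 0.5]])

def subgradient_push(T, graphs=(A1, A2)):
    n = len(a)
    x = np.zeros(n)   # perturbed push-sum numerators
    y = np.ones(n)    # push-sum weights (correct the directed-graph bias)
    z = np.zeros(n)   # de-biased local estimates x / y
    for t in range(1, T + 1):
        A = graphs[t % len(graphs)]   # current graph in the sequence
        w = A @ x                     # mix numerators
        y = A @ y                     # mix weights
        z = w / y                     # de-biased estimate at each node
        g = np.array([grad(i, z[i]) for i in range(n)])
        x = w - (1.0 / t) * g         # O(1/t) step for strong convexity
    return z

z = subgradient_push(2000)  # all entries approach the optimum 3.0
```

Note that no node uses the number of agents or the full graph sequence; each column of the mixing matrices depends only on that sender's own out-degree, which is exactly the information requirement stated in the abstract.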

Original language: English (US)
Article number: 7405263
Pages (from-to): 3936-3947
Number of pages: 12
Journal: IEEE Transactions on Automatic Control
Volume: 61
Issue number: 12
State: Published - Dec 1 2016
Externally published: Yes

Keywords

  • Distributed algorithms
  • gradient methods
  • parameter estimation

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science Applications
  • Electrical and Electronic Engineering

