Linearly convergent decentralized consensus optimization over directed networks

Angelia Nedich, Alex Olshevsky, Wei Shi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Recently, there has been growing interest in solving distributed consensus optimization problems over directed networks consisting of multiple agents. In this paper, we develop a first-order (gradient-based) algorithm, referred to as Push-DIGing, for this class of problems. To run Push-DIGing, each agent in the network needs to know only its own out-degree and employs a fixed step-size. Under a strong convexity assumption, we prove that the algorithm converges to the global minimizer at an R-linear (geometric) rate as long as the nonnegative step-size is no greater than an explicit bound.
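The abstract's ingredients (a column-stochastic mixing step built from each agent's own out-degree, a push-sum weight correction, gradient tracking, and a fixed step-size) can be illustrated with a minimal sketch. The graph, objective, step-size, and iteration count below are illustrative assumptions, not values from the paper; the update rules follow the standard Push-DIGing form from the DIGing line of work.

```python
import numpy as np

# Hedged sketch of a Push-DIGing-style iteration on a toy problem:
# n agents jointly minimize f(x) = sum_i 0.5 * (a_i * x - b_i)^2,
# a strongly convex scalar problem with a closed-form minimizer.
rng = np.random.default_rng(0)
n = 4
a = rng.uniform(1.0, 2.0, n)
b = rng.uniform(-1.0, 1.0, n)
x_star = (a @ b) / (a @ a)  # global minimizer of the summed objective

# Directed ring plus one extra edge (assumed topology, strongly connected).
# Column j splits its mass equally over itself and its out-neighbors,
# so agent j needs only its own out-degree, as the abstract states.
out_neighbors = {0: [1], 1: [2], 2: [3], 3: [0, 1]}
C = np.zeros((n, n))
for j, outs in out_neighbors.items():
    share = 1.0 / (len(outs) + 1)
    C[j, j] = share
    for i in outs:
        C[i, j] = share  # C is column-stochastic by construction

grad = lambda x: a * (a * x - b)  # each agent's local gradient at its own x

alpha = 0.05        # fixed step-size (must be below the paper's bound)
u = np.zeros(n)     # push-sum numerators
w = np.ones(n)      # push-sum weights
x = u / w           # de-biased estimates
y = grad(x)         # gradient-tracking variable

for _ in range(2000):
    u = C @ (u - alpha * y)           # mix and descend
    w = C @ w                         # propagate push-sum weights
    x_new = u / w                     # correct the push-sum bias
    y = C @ y + grad(x_new) - grad(x) # track the average gradient
    x = x_new
```

After the loop, all entries of `x` agree and sit near `x_star`, consistent with the geometric convergence the paper proves for sufficiently small step-sizes.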

Original language: English (US)
Title of host publication: 2016 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2016 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 485-489
Number of pages: 5
ISBN (Electronic): 9781509045457
DOI: 10.1109/GlobalSIP.2016.7905889
State: Published - Apr 19 2017
Externally published: Yes
Event: 2016 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2016 - Washington, United States
Duration: Dec 7 2016 - Dec 9 2016


Keywords

  • Directed network
  • Distributed optimization
  • Linear convergence
  • Small-gain theorem

ASJC Scopus subject areas

  • Signal Processing
  • Computer Networks and Communications

Cite this

Nedich, A., Olshevsky, A., & Shi, W. (2017). Linearly convergent decentralized consensus optimization over directed networks. In 2016 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2016 - Proceedings (pp. 485-489). [7905889] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/GlobalSIP.2016.7905889
