Decentralize and randomize: Faster algorithm for Wasserstein barycenters

Pavel Dvurechensky, Darina Dvinskikh, Alexander Gasnikov, César A. Uribe, Angelia Nedich

Research output: Contribution to journal › Conference article

Abstract

We study the decentralized distributed computation of discrete approximations for the regularized Wasserstein barycenter of a finite set of continuous probability measures distributedly stored over a network. We assume there is a network of agents/machines/computers, and each agent holds a private continuous probability measure and seeks to compute the barycenter of all the measures in the network by getting samples from its local measure and exchanging information with its neighbors. Motivated by this problem, we develop and analyze a novel accelerated primal-dual stochastic gradient method for general stochastic convex optimization problems with linear equality constraints. Then, we apply this method to the decentralized distributed optimization setting to obtain a new algorithm for the distributed semi-discrete regularized Wasserstein barycenter problem. Moreover, we show explicit non-asymptotic complexity for the proposed algorithm. Finally, we show the effectiveness of our method on the distributed computation of the regularized Wasserstein barycenter of univariate Gaussian and von Mises distributions, as well as some applications to image aggregation.
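The entropic-regularized barycenter objective at the core of this abstract can be illustrated on a single machine. The sketch below is not the paper's decentralized accelerated primal-dual method; it is a minimal centralized baseline that computes the entropic-regularized Wasserstein barycenter of discrete measures by iterative Bregman projections (a standard Sinkhorn-style scheme). The grid, the two input measures, and the regularization parameter are all chosen purely for illustration.

```python
import numpy as np

def entropic_barycenter(P, C, eps=0.02, n_iter=500):
    """Entropic-regularized Wasserstein barycenter of discrete measures
    via iterative Bregman projections, with uniform weights.

    P: (k, n) array, each row a probability vector on a shared grid.
    C: (n, n) ground-cost matrix on that grid.
    """
    K = np.exp(-C / eps)                         # Gibbs kernel
    V = np.ones_like(P)                          # one scaling vector per measure
    for _ in range(n_iter):
        U = P / (V @ K.T)                        # scale to match each input marginal
        KtU = U @ K                              # row s is K^T u_s
        b = np.exp(np.log(KtU).mean(axis=0))     # geometric mean = barycenter iterate
        V = b[None, :] / KtU                     # scale toward the common barycenter
    return b / b.sum()

# Two discrete "bump" measures on [0, 1] with squared-distance ground cost.
x = np.linspace(0.0, 1.0, 50)
p1 = np.exp(-(x - 0.3) ** 2 / (2 * 0.05 ** 2)); p1 /= p1.sum()
p2 = np.exp(-(x - 0.7) ** 2 / (2 * 0.05 ** 2)); p2 /= p2.sum()
C = (x[:, None] - x[None, :]) ** 2

bar = entropic_barycenter(np.stack([p1, p2]), C)
```

By symmetry, the barycenter of these two equal-width bumps is a bump centered midway between them (near 0.5), unlike the simple average of the densities, which stays bimodal. The paper's contribution is to compute such barycenters without the central step above: each agent holds one measure, sees only stochastic gradients from samples, and communicates only with its network neighbors.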

Original language: English (US)
Pages (from-to): 10760-10770
Number of pages: 11
Journal: Advances in Neural Information Processing Systems
Volume: 2018-December
State: Published - Jan 1 2018
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: Dec 2 2018 - Dec 8 2018

Fingerprint

Gradient methods
Convex optimization
Agglomeration

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Dvurechensky, P., Dvinskikh, D., Gasnikov, A., Uribe, C. A., & Nedich, A. (2018). Decentralize and randomize: Faster algorithm for Wasserstein barycenters. Advances in Neural Information Processing Systems, 2018-December, 10760-10770.

Decentralize and randomize: Faster algorithm for Wasserstein barycenters. / Dvurechensky, Pavel; Dvinskikh, Darina; Gasnikov, Alexander; Uribe, César A.; Nedich, Angelia.

In: Advances in Neural Information Processing Systems, Vol. 2018-December, 01.01.2018, p. 10760-10770.

Research output: Contribution to journal › Conference article

Dvurechensky, P, Dvinskikh, D, Gasnikov, A, Uribe, CA & Nedich, A 2018, 'Decentralize and randomize: Faster algorithm for Wasserstein barycenters', Advances in Neural Information Processing Systems, vol. 2018-December, pp. 10760-10770.
Dvurechensky P, Dvinskikh D, Gasnikov A, Uribe CA, Nedich A. Decentralize and randomize: Faster algorithm for Wasserstein barycenters. Advances in Neural Information Processing Systems. 2018 Jan 1;2018-December:10760-10770.
Dvurechensky, Pavel; Dvinskikh, Darina; Gasnikov, Alexander; Uribe, César A.; Nedich, Angelia. / Decentralize and randomize: Faster algorithm for Wasserstein barycenters. In: Advances in Neural Information Processing Systems. 2018; Vol. 2018-December. pp. 10760-10770.
@article{8520a52d592f4af09dfe48da2222b78a,
title = "Decentralize and randomize: Faster algorithm for Wasserstein barycenters",
abstract = "We study the decentralized distributed computation of discrete approximations for the regularized Wasserstein barycenter of a finite set of continuous probability measures distributedly stored over a network. We assume there is a network of agents/machines/computers, and each agent holds a private continuous probability measure and seeks to compute the barycenter of all the measures in the network by getting samples from its local measure and exchanging information with its neighbors. Motivated by this problem, we develop and analyze a novel accelerated primal-dual stochastic gradient method for general stochastic convex optimization problems with linear equality constraints. Then, we apply this method to the decentralized distributed optimization setting to obtain a new algorithm for the distributed semi-discrete regularized Wasserstein barycenter problem. Moreover, we show explicit non-asymptotic complexity for the proposed algorithm. Finally, we show the effectiveness of our method on the distributed computation of the regularized Wasserstein barycenter of univariate Gaussian and von Mises distributions, as well as some applications to image aggregation.",
author = "Pavel Dvurechensky and Darina Dvinskikh and Alexander Gasnikov and Uribe, {C{\'e}sar A.} and Angelia Nedich",
year = "2018",
month = "1",
day = "1",
language = "English (US)",
volume = "2018-December",
pages = "10760--10770",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",

}

TY - JOUR

T1 - Decentralize and randomize

T2 - Faster algorithm for Wasserstein barycenters

AU - Dvurechensky, Pavel

AU - Dvinskikh, Darina

AU - Gasnikov, Alexander

AU - Uribe, César A.

AU - Nedich, Angelia

PY - 2018/1/1

Y1 - 2018/1/1

N2 - We study the decentralized distributed computation of discrete approximations for the regularized Wasserstein barycenter of a finite set of continuous probability measures distributedly stored over a network. We assume there is a network of agents/machines/computers, and each agent holds a private continuous probability measure and seeks to compute the barycenter of all the measures in the network by getting samples from its local measure and exchanging information with its neighbors. Motivated by this problem, we develop and analyze a novel accelerated primal-dual stochastic gradient method for general stochastic convex optimization problems with linear equality constraints. Then, we apply this method to the decentralized distributed optimization setting to obtain a new algorithm for the distributed semi-discrete regularized Wasserstein barycenter problem. Moreover, we show explicit non-asymptotic complexity for the proposed algorithm. Finally, we show the effectiveness of our method on the distributed computation of the regularized Wasserstein barycenter of univariate Gaussian and von Mises distributions, as well as some applications to image aggregation.

AB - We study the decentralized distributed computation of discrete approximations for the regularized Wasserstein barycenter of a finite set of continuous probability measures distributedly stored over a network. We assume there is a network of agents/machines/computers, and each agent holds a private continuous probability measure and seeks to compute the barycenter of all the measures in the network by getting samples from its local measure and exchanging information with its neighbors. Motivated by this problem, we develop and analyze a novel accelerated primal-dual stochastic gradient method for general stochastic convex optimization problems with linear equality constraints. Then, we apply this method to the decentralized distributed optimization setting to obtain a new algorithm for the distributed semi-discrete regularized Wasserstein barycenter problem. Moreover, we show explicit non-asymptotic complexity for the proposed algorithm. Finally, we show the effectiveness of our method on the distributed computation of the regularized Wasserstein barycenter of univariate Gaussian and von Mises distributions, as well as some applications to image aggregation.

UR - http://www.scopus.com/inward/record.url?scp=85064808215&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85064808215&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:85064808215

VL - 2018-December

SP - 10760

EP - 10770

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -