Fast and privacy preserving distributed low-rank regression

Hoi To Wai, Anna Scaglione, Jean Lafond, Eric Moulines

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

This paper proposes a fast, privacy-preserving distributed algorithm for low-rank regression problems with a nuclear-norm constraint. Traditional projected gradient algorithms incur high computation costs on these problems because of their projection steps. Our gossip-based algorithm, called the fast DeFW algorithm, overcomes this issue because it is projection-free. In particular, the algorithm incorporates a carefully designed decentralized power method step that reduces the complexity through distributed computation over the network. Meanwhile, privacy is preserved since the agents exchange only random projections of their private data, never the data itself. We show that the fast DeFW algorithm converges for both convex and non-convex losses. As an application example, we consider the low-rank matrix completion problem and provide numerical results to support our findings.
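As a centralized sketch of the core idea (not the distributed DeFW algorithm from the paper itself): a Frank-Wolfe step over the nuclear-norm ball needs only the leading singular pair of the gradient, which a power method supplies, so the costly projection (a full SVD) is never performed. All function names and parameters below are illustrative.

```python
import numpy as np

def top_singular_pair(G, iters=50, seed=0):
    """Approximate the leading singular vectors of G by power iteration.
    (This is the step the paper decentralizes over a network; here it is
    the plain centralized version.)"""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(G.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        u = G @ v
        u /= np.linalg.norm(u) + 1e-12
        v = G.T @ u
        v /= np.linalg.norm(v) + 1e-12
    return u, v

def frank_wolfe_nuclear(grad_f, shape, radius, n_iters=500):
    """Projection-free (Frank-Wolfe) minimization of a smooth loss over
    the nuclear-norm ball {theta : ||theta||_* <= radius}."""
    theta = np.zeros(shape)
    for t in range(n_iters):
        G = grad_f(theta)
        u, v = top_singular_pair(G)
        s = -radius * np.outer(u, v)   # rank-1 atom from the linear oracle
        gamma = 2.0 / (t + 2.0)        # standard Frank-Wolfe step size
        theta = (1.0 - gamma) * theta + gamma * s
    return theta
```

For the matrix completion application mentioned in the abstract, one would pass `grad_f = lambda th: mask * (th - M_obs)`, the gradient of the squared error on the observed entries. Every iterate is a convex combination of rank-1 atoms of nuclear norm `radius`, so it stays feasible without any projection.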

Original language: English (US)
Title of host publication: 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 4451-4455
Number of pages: 5
ISBN (Electronic): 9781509041176
DOI: 10.1109/ICASSP.2017.7952998
State: Published - Jun 16 2017
Event: 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017 - New Orleans, United States
Duration: Mar 5 2017 - Mar 9 2017



Keywords

  • distributed optimization
  • Frank-Wolfe algorithm
  • gossip algorithms
  • low-rank regression
  • power method

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering

Cite this

Wai, H. T., Scaglione, A., Lafond, J., & Moulines, E. (2017). Fast and privacy preserving distributed low-rank regression. In 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017 - Proceedings (pp. 4451-4455). [7952998] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICASSP.2017.7952998

@inproceedings{7f102d66a4954cad95c2d281c82d4ea7,
title = "Fast and privacy preserving distributed low-rank regression",
abstract = "This paper proposes a fast and privacy preserving distributed algorithm for handling low-rank regression problems with nuclear norm constraint. Traditional projected gradient algorithms have high computation costs due to their projection steps when they are used to solve these problems. Our gossip-based algorithm, called the fast DeFW algorithm, overcomes this issue since it is projection-free. In particular, the algorithm incorporates a carefully designed decentralized power method step to reduce the complexity by distributed computation over network. Meanwhile, privacy is preserved as the agents do not exchange the private data, but only a random projection of them. We show that the fast DeFW algorithm converges for both convex and non-convex losses. As an application example, we consider the low-rank matrix completion problem and provide numerical results to support our findings.",
keywords = "distributed optimization, Frank-Wolfe algorithm, gossip algorithms, low-rank regression, power method",
author = "Wai, {Hoi To} and Anna Scaglione and Jean Lafond and Eric Moulines",
year = "2017",
month = "6",
day = "16",
doi = "10.1109/ICASSP.2017.7952998",
language = "English (US)",
pages = "4451--4455",
booktitle = "2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017 - Proceedings",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
address = "United States",
}
