TY - GEN
T1 - Fast and privacy preserving distributed low-rank regression
AU - Wai, Hoi To
AU - Scaglione, Anna
AU - Lafond, Jean
AU - Moulines, Eric
N1 - Funding Information:
This work is supported by NSF CCF-1011811.
Publisher Copyright:
© 2017 IEEE.
PY - 2017/6/16
Y1 - 2017/6/16
N2 - This paper proposes a fast and privacy-preserving distributed algorithm for handling low-rank regression problems with a nuclear norm constraint. Traditional projected gradient algorithms incur high computation costs due to their projection steps when applied to these problems. Our gossip-based algorithm, called the fast DeFW algorithm, overcomes this issue since it is projection-free. In particular, the algorithm incorporates a carefully designed decentralized power method step to reduce the complexity through distributed computation over the network. Meanwhile, privacy is preserved as the agents do not exchange their private data, but only a random projection of it. We show that the fast DeFW algorithm converges for both convex and non-convex losses. As an application example, we consider the low-rank matrix completion problem and provide numerical results to support our findings.
AB - This paper proposes a fast and privacy-preserving distributed algorithm for handling low-rank regression problems with a nuclear norm constraint. Traditional projected gradient algorithms incur high computation costs due to their projection steps when applied to these problems. Our gossip-based algorithm, called the fast DeFW algorithm, overcomes this issue since it is projection-free. In particular, the algorithm incorporates a carefully designed decentralized power method step to reduce the complexity through distributed computation over the network. Meanwhile, privacy is preserved as the agents do not exchange their private data, but only a random projection of it. We show that the fast DeFW algorithm converges for both convex and non-convex losses. As an application example, we consider the low-rank matrix completion problem and provide numerical results to support our findings.
KW - Frank-Wolfe algorithm
KW - distributed optimization
KW - gossip algorithms
KW - low-rank regression
KW - power method
UR - http://www.scopus.com/inward/record.url?scp=85023771468&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85023771468&partnerID=8YFLogxK
U2 - 10.1109/ICASSP.2017.7952998
DO - 10.1109/ICASSP.2017.7952998
M3 - Conference contribution
AN - SCOPUS:85023771468
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 4451
EP - 4455
BT - 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2017 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2017
Y2 - 5 March 2017 through 9 March 2017
ER -