TY - GEN
T1 - Distributed sparse regression by consensus-based primal-dual perturbation optimization
AU - Chang, Tsung Hui
AU - Nedic, Angelia
AU - Scaglione, Anna
PY - 2013
Y1 - 2013
N2 - This paper studies the decentralized solution of a multi-agent sparse regression problem in the form of a globally coupled objective function with a non-smooth sparsity-promoting constraint. In particular, we propose a distributed primal-dual perturbation (PDP) method which combines the average consensus technique and the primal-dual perturbed subgradient method. Compared to the conventional primal-dual (PD) subgradient method without perturbation, the PDP subgradient method exhibits a faster convergence behavior. In order to handle the non-smooth constraints, we propose a novel proximal gradient type perturbation point. The proposed distributed optimization algorithm can be implemented as a fully decentralized protocol, with each agent using its local information and exchanging messages between neighbors only. We show that the proposed method converges to the global optimum of the considered problem under standard convex problem and network assumptions.
AB - This paper studies the decentralized solution of a multi-agent sparse regression problem in the form of a globally coupled objective function with a non-smooth sparsity-promoting constraint. In particular, we propose a distributed primal-dual perturbation (PDP) method which combines the average consensus technique and the primal-dual perturbed subgradient method. Compared to the conventional primal-dual (PD) subgradient method without perturbation, the PDP subgradient method exhibits a faster convergence behavior. In order to handle the non-smooth constraints, we propose a novel proximal gradient type perturbation point. The proposed distributed optimization algorithm can be implemented as a fully decentralized protocol, with each agent using its local information and exchanging messages between neighbors only. We show that the proposed method converges to the global optimum of the considered problem under standard convex problem and network assumptions.
KW - Average consensus
KW - Distributed optimization
KW - Primal-dual subgradient method
KW - Sparse regression
UR - http://www.scopus.com/inward/record.url?scp=84897744342&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84897744342&partnerID=8YFLogxK
U2 - 10.1109/GlobalSIP.2013.6736872
DO - 10.1109/GlobalSIP.2013.6736872
M3 - Conference contribution
AN - SCOPUS:84897744342
SN - 9781479902484
T3 - 2013 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2013 - Proceedings
SP - 289
EP - 292
BT - 2013 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2013 - Proceedings
T2 - 2013 1st IEEE Global Conference on Signal and Information Processing, GlobalSIP 2013
Y2 - 3 December 2013 through 5 December 2013
ER -