Guest Editorial Communication-Efficient Distributed Learning Over Networks

Xuanyu Cao, Tamer Basar, Suhas Diggavi, Yonina C. Eldar, Khaled B. Letaief, H. Vincent Poor, Junshan Zhang

Research output: Contribution to journal › Review article › Peer-reviewed


Abstract

Distributed machine learning is envisioned as the bedrock of future intelligent networks, in which agents exchange information with one another to train models collaboratively without uploading data to a central processor. Despite its broad applicability, a downside of distributed learning is the need for iterative information exchange between agents, which may lead to communication overhead that is unaffordable in many practical systems with limited communication resources. To resolve this communication bottleneck, we need to devise communication-efficient distributed learning algorithms and protocols that reduce communication cost while still achieving satisfactory learning and optimization performance. Accomplishing this goal necessitates synergistic techniques from a diverse set of fields, including optimization, machine learning, wireless communications, game theory, and network/graph theory. This Special Issue is dedicated to communication-efficient distributed learning from multiple perspectives, including fundamental theories, algorithm design and analysis, and practical considerations.
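The editorial itself contains no code. As a hedged illustration of one common technique in this space, the sketch below implements top-k gradient sparsification with error feedback on a toy distributed least-squares problem: each agent transmits only the k largest-magnitude gradient entries per round, cutting per-round communication from d floats to roughly 2k numbers. All names and parameters here (`top_k_sparsify`, the data setup, the step size, the sparsity level `k`) are illustrative assumptions, not anything specified by the editorial or by the Special Issue papers.

```python
import numpy as np

def top_k_sparsify(grad, k):
    """Keep only the k largest-magnitude entries of a gradient vector.

    Sending (index, value) pairs for k entries instead of all d entries
    shrinks per-round communication at the cost of a biased gradient.
    """
    idx = np.argpartition(np.abs(grad), -k)[-k:]  # indices of the top-k magnitudes
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse

# Toy distributed least-squares: each agent holds local data (A_i, b_i)
# and sends only a sparsified gradient to a parameter server each round.
rng = np.random.default_rng(0)
d, n_agents, m, k = 50, 4, 20, 5
x_true = rng.normal(size=d)
data = []
for _ in range(n_agents):
    A = rng.normal(size=(m, d))
    b = A @ x_true + 0.01 * rng.normal(size=m)
    data.append((A, b))

x = np.zeros(d)
residual = [np.zeros(d) for _ in range(n_agents)]  # error-feedback memory per agent
lr = 0.01
for t in range(500):
    agg = np.zeros(d)
    for i, (A, b) in enumerate(data):
        g = A.T @ (A @ x - b) / m + residual[i]  # local gradient plus leftover error
        s = top_k_sparsify(g, k)                 # compress before "transmitting"
        residual[i] = g - s                      # remember the dropped mass
        agg += s
    x -= lr * agg / n_agents                     # server averages and updates the model

print("estimation error:", np.linalg.norm(x - x_true))
```

The error-feedback accumulator is the standard remedy for the bias introduced by aggressive compression: gradient mass dropped in one round is carried over and eventually transmitted, so the compressed iteration tracks the uncompressed one over time.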

Original language: English (US)
Pages (from-to): 845-850
Number of pages: 6
Journal: IEEE Journal on Selected Areas in Communications
Volume: 41
Issue number: 4
DOIs
State: Published - Apr 1 2023
Externally published: Yes

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Electrical and Electronic Engineering

