Average consensus algorithms are protocols that compute the average of all sensor measurements via near-neighbor communication. They offer a natural tradeoff between the number of messages exchanged among terminals and the accuracy of the computation. Most models adopted for the message exchange in the literature, however, neither include explicit rate constraints nor explore the rate-distortion tradeoff associated with the algorithm. The contribution of our work is in examining the impact of such constraints and in finding strategies that minimize the communication cost in terms of rate. The main motivation behind the proposed coding strategies is the observation that consensus algorithms offer a prime example of a network communication problem in which the correlation between the data exchanged increases as the algorithm iterates. Hence, previously exchanged data and current side information can be exploited to significantly reduce the quantization bit rate required for a given precision. We analyze the case of a network whose links are assumed to be reliable at a constant bit rate. We derive conditions on the quantization noise under which the consensus value has bounded mean squared distance from the initial average. In the case of infinite-length vector coding with Gaussian states, we show that our proposed schemes achieve bounded convergence with rates that vanish as the iteration index tends to infinity.
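To make the setting concrete, the following is a minimal sketch (not the paper's actual coding scheme) of average consensus on a hypothetical ring network, where every transmitted state is uniformly quantized to model a rate-constrained link. The network size, step size, quantizer range, and bit budget are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
x = rng.normal(size=n)        # initial sensor measurements (illustrative Gaussian states)
target = x.mean()             # the value the network should agree on

def quantize(v, bits=10, lo=-4.0, hi=4.0):
    # Uniform quantizer standing in for a fixed-rate link (assumed parameters).
    levels = 2 ** bits
    step = (hi - lo) / levels
    idx = np.clip((v - lo) / step, 0, levels - 1).astype(int)
    return lo + (idx + 0.5) * step

eps = 0.3                      # consensus step size (assumed)
for _ in range(200):
    q = quantize(x)            # each node broadcasts its quantized state
    # Each node moves toward the average of its two ring neighbors,
    # computed from the quantized values it received.
    x = x + eps * (0.5 * (np.roll(q, 1) + np.roll(q, -1)) - q)

# On this symmetric topology the updates sum to zero, so the network-wide
# average is preserved; the residual disagreement is set by the quantizer step.
print(abs(x.mean() - target))  # ~0: average preserved
print(x.std())                 # small: states have nearly reached consensus
```

The sketch illustrates the tension the abstract describes: coarser quantization (fewer bits per message) leaves a larger disagreement floor, while the growing correlation between successive states is exactly what side-information coding can exploit to lower the rate over iterations.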
- Bounded convergence
- Coding with side information
- Distributed average consensus
- Sensor networks
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering