TY - GEN
T1 - Differential nested lattice encoding for consensus problems
AU - Yildiz, Mehmet E.
AU - Scaglione, Anna
PY - 2007/10/22
Y1 - 2007/10/22
AB - In this paper we consider the problem of transmitting quantized data while performing an average consensus algorithm. Average consensus algorithms are protocols that compute the average of all sensor measurements via near-neighbor communications. The main motivation for our work is the observation that consensus algorithms offer a perfect example of network communications in which the correlation between the exchanged data increases as the system updates its computations. Hence, it is possible to exploit previously exchanged data and current side information to significantly reduce the quantization bit rate required for a given precision. We analyze the case of a network whose topology is that of a random geometric graph, with links assumed to be reliable at a constant bit rate. Numerically, we show that increasing the number of consensus iterations does not increase the error variance. Thus, we conclude that the noisy recursions lead to a consensus if the data correlation is exploited in the message source encoders and decoders. We briefly state theoretical results that parallel our numerical experiments.
KW - Average consensus
KW - Coding with side information
KW - Consensus
KW - Nested lattice coding
KW - Predictive coding
UR - http://www.scopus.com/inward/record.url?scp=35348858886&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=35348858886&partnerID=8YFLogxK
U2 - 10.1145/1236360.1236373
DO - 10.1145/1236360.1236373
M3 - Conference contribution
AN - SCOPUS:35348858886
SN - 1595936386
SN - 9781595936387
T3 - IPSN 2007: Proceedings of the Sixth International Symposium on Information Processing in Sensor Networks
SP - 89
EP - 98
BT - IPSN 2007
T2 - IPSN 2007: 6th International Symposium on Information Processing in Sensor Networks
Y2 - 25 April 2007 through 27 April 2007
ER -