Frame replenishment coding over noisy channels

L. Zhang, S. Panchanathan, M. Goldberg

Research output: Contribution to journal › Article › peer-review

Abstract

The authors propose a combined source-channel coding scheme that provides error protection to improve performance over noisy channels. Recently, two new frame replenishment coding techniques based on vector quantisation (VQ) have been proposed, namely, label replenishment coding using vector quantisation (LRVQ) and codeword replenishment coding using vector quantisation (CWRVQ). These techniques provide good coding performance over a noiseless channel; however, their performance degrades considerably in the presence of channel noise. The authors first compare the performances of LRVQ and CWRVQ in a noiseless channel with that of the basic frame replenishment (BFR) technique, which serves as a baseline. The simulation results demonstrate that, for a noiseless channel, the LRVQ and CWRVQ techniques provide better performance and greater tolerance to motion in the picture than the BFR technique. In the presence of channel noise, however, the LRVQ and CWRVQ techniques exhibit serious error propagation and noise effects, resulting in poor performance. The authors propose a combined source-channel coding technique with three different error protection schemes, in which the bit rates for the source code and channel code are adjusted so as to minimise the mean square error. Simulation results demonstrate that the three error protection schemes provide a significant improvement in performance without sacrificing transmission bandwidth.
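The core idea of the bit-rate adjustment described above can be illustrated with a minimal sketch. The distortion models and numbers below are illustrative stand-ins, not the authors' formulation: the sketch only shows how a fixed bit budget might be split between source coding and channel-code redundancy so that the estimated total mean square error is minimised while the transmission bandwidth stays fixed.

```python
# Hypothetical sketch of combined source-channel bit allocation.
# The exponential distortion models are assumptions for illustration,
# not the models used in the paper.

def source_distortion(source_bits):
    # Toy model: source MSE falls exponentially with the source bit rate.
    return 100.0 * 2.0 ** (-source_bits / 1000.0)

def channel_distortion(channel_bits, ber=1e-3):
    # Toy model: residual channel-induced MSE shrinks as the number of
    # bits spent on error-protection redundancy grows.
    return 500.0 * ber * 2.0 ** (-channel_bits / 500.0)

def best_allocation(total_bits, step=100):
    # Search every split of the fixed budget and keep the one with the
    # least combined estimated MSE; total bandwidth never changes.
    best = None
    for src in range(step, total_bits, step):
        chan = total_bits - src
        mse = source_distortion(src) + channel_distortion(chan)
        if best is None or mse < best[2]:
            best = (src, chan, mse)
    return best

src_bits, chan_bits, est_mse = best_allocation(8000)
```

The point of the search is the one made in the abstract: spending some of the budget on channel protection can lower the overall error even though fewer bits remain for the source coder, so the improvement comes without any extra bandwidth.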

Original language: English (US)
Pages (from-to): 144-151
Number of pages: 8
Journal: IEE Proceedings, Part I: Communications, Speech and Vision
Volume: 140
Issue number: 2
State: Published - 1993
Externally published: Yes

ASJC Scopus subject areas

  • Engineering(all)

