Abstract
This paper presents a locally-adaptive perceptual quantization scheme for visual data compression. The strategy is to exploit human visual masking properties by deriving masking thresholds in a locally-adaptive fashion based on a sub-band decomposition. The derived masking thresholds are used to control the quantization stage by adapting the quantizer reconstruction levels to the local amount of masking present at the level of each sub-band transform coefficient. Compared to existing non-locally-adaptive perceptual quantization methods, the new locally-adaptive algorithm exhibits superior performance and does not require additional side information. This is accomplished by estimating the amount of available masking from the already quantized data and a linear prediction of the coefficient under consideration. By virtue of the local adaptation, the proposed quantization scheme is able to remove a large amount of perceptually redundant information. Since the algorithm does not require additional side information, it yields a low-entropy representation of the image and is well suited for perceptually-lossless image compression.
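The core idea of the abstract can be illustrated with a minimal sketch: a quantizer whose step size grows with a masking estimate computed from already-quantized (causal) data, so the decoder can reproduce the adaptation without side information. The masking model, window size, and scaling rule below are simplified stand-ins, not the paper's actual formulas.

```python
import numpy as np

def adaptive_quantize(coeffs, base_step=16.0, alpha=0.5):
    """Quantize sub-band coefficients with a locally-adapted step size.

    Illustrative sketch only: the masking estimate here is a simple
    mean-magnitude of causal reconstructed neighbours, standing in for
    the paper's masking-threshold derivation.
    """
    flat = np.asarray(coeffs, dtype=float).ravel()
    quantized = np.zeros_like(flat)
    recon = np.zeros_like(flat)
    for i, c in enumerate(flat):
        # Estimate masking from already reconstructed coefficients.
        # The decoder sees the same reconstructed values, so no side
        # information needs to be transmitted.
        window = recon[max(0, i - 4):i]
        masking = np.mean(np.abs(window)) if window.size else 0.0
        # More local activity -> more masking -> coarser quantization.
        step = base_step * (1.0 + alpha * masking / base_step)
        quantized[i] = np.round(c / step)
        recon[i] = quantized[i] * step
    return quantized, recon
```

In smooth regions the causal window is near zero, so the step stays fine; in busy regions the step coarsens, discarding perceptually redundant detail and lowering the entropy of the quantized indices.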
Original language | English (US) |
---|---|
Title of host publication | Conference Record / IEEE Global Telecommunications Conference |
Editors | Anon |
Place of Publication | Piscataway, NJ, United States |
Publisher | IEEE |
Pages | 1042-1046 |
Number of pages | 5 |
Volume | 2 |
State | Published - 1997 |
Event | Proceedings of the 1997 IEEE Global Telecommunications Conference. Part 2 (of 3) - Phoenix, AZ, USA |
Duration | Nov 3 1997 → Nov 8 1997 |
Other
Other | Proceedings of the 1997 IEEE Global Telecommunications Conference. Part 2 (of 3) |
---|---|
City | Phoenix, AZ, USA |
Period | 11/3/97 → 11/8/97 |
ASJC Scopus subject areas
- Electrical and Electronic Engineering
- Global and Planetary Change