Locally adaptive perceptual quantization without side information for compression of visual data

Ingo Hontsch, Lina Karam

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

This paper presents a locally-adaptive perceptual quantization scheme for visual data compression. The strategy is to exploit human visual masking properties by deriving masking thresholds in a locally-adaptive fashion based on a sub-band decomposition. The derived masking thresholds are used to control the quantization stage by adapting the quantizer reconstruction levels to the local amount of masking present at the level of each sub-band transform coefficient. Compared to existing non-locally-adaptive perceptual quantization methods, the new locally-adaptive algorithm exhibits superior performance and does not require additional side information. This is accomplished by estimating the amount of available masking from the already-quantized data and from a linear prediction of the coefficient under consideration. By virtue of the local adaptation, the proposed quantization scheme is able to remove a large amount of perceptually redundant information. Since the algorithm does not require additional side information, it yields a low-entropy representation of the image and is well suited for perceptually-lossless image compression.
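
To make the side-information-free adaptation concrete, the following minimal sketch illustrates the general idea described in the abstract: the quantization step for each sub-band coefficient is driven by a masking estimate computed only from data a decoder can reproduce, namely previously quantized causal neighbors and a linear prediction of the current coefficient. The three-tap predictor weights, the parameters base_step and masking_gain, and the logarithmic masking model are illustrative assumptions, not the authors' exact formulation.

# A minimal sketch (assumed parameters, not the authors' exact method) of
# locally adaptive quantization whose step size is derived only from data
# the decoder also has: already-quantized causal neighbors and a linear
# prediction of the current coefficient.
import numpy as np

def predict(coeffs_q, i, j):
    """Linear prediction of coefficient (i, j) from already-quantized
    causal neighbors (west, north, north-west). Weights are assumptions."""
    w  = coeffs_q[i, j - 1] if j > 0 else 0.0
    n  = coeffs_q[i - 1, j] if i > 0 else 0.0
    nw = coeffs_q[i - 1, j - 1] if (i > 0 and j > 0) else 0.0
    return 0.5 * w + 0.4 * n + 0.1 * nw

def masking_threshold(coeffs_q, i, j, base_step=4.0, masking_gain=0.5):
    """Raise the quantization step with the local activity estimated from
    quantized causal neighbors and the predicted coefficient; no side
    information is needed because only decodable values are used."""
    causal = [coeffs_q[i, j - 1] if j > 0 else 0.0,
              coeffs_q[i - 1, j] if i > 0 else 0.0,
              coeffs_q[i - 1, j - 1] if (i > 0 and j > 0) else 0.0]
    activity = abs(predict(coeffs_q, i, j)) + np.mean(np.abs(causal))
    return base_step * (1.0 + masking_gain * np.log1p(activity))

def quantize_subband(subband):
    """Quantize one sub-band in raster order; the step at each position
    depends only on previously quantized values."""
    q = np.zeros_like(subband, dtype=float)
    for i in range(subband.shape[0]):
        for j in range(subband.shape[1]):
            step = masking_threshold(q, i, j)
            q[i, j] = step * np.round(subband[i, j] / step)
    return q

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    band = rng.normal(scale=20.0, size=(8, 8))   # stand-in for one sub-band
    print(quantize_subband(band))

Because the step size at each position is computed only from coefficients that have already been quantized, a decoder running the same raster-order loop recomputes identical step sizes, which is why no additional side information has to be transmitted.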

Original language: English (US)
Title of host publication: Conference Record / IEEE Global Telecommunications Conference
Editors: Anon
Place of publication: Piscataway, NJ, United States
Publisher: IEEE
Pages: 1042-1046
Number of pages: 5
Volume: 2
State: Published - 1997
Event: Proceedings of the 1997 IEEE Global Telecommunications Conference. Part 2 (of 3) - Phoenix, AZ, USA
Duration: Nov 3, 1997 - Nov 8, 1997

Other

Other: Proceedings of the 1997 IEEE Global Telecommunications Conference. Part 2 (of 3)
City: Phoenix, AZ, USA
Period: 11/3/97 - 11/8/97

Fingerprint

  • Quantization (signal)
  • Data compression
  • Image compression
  • Adaptive algorithms
  • Entropy
  • Decomposition
  • Local adaptation
  • Transform
  • Prediction

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Global and Planetary Change

Cite this

Hontsch, I., & Karam, L. (1997). Locally adaptive perceptual quantization without side information for compression of visual data. In Anon (Ed.), Conference Record / IEEE Global Telecommunications Conference (Vol. 2, pp. 1042-1046). Piscataway, NJ, United States: IEEE.

