Staggered Quantizers for Perfect Perceptual Quality: A Connection between Quantizers with Common Randomness and Without

Published: 15 Apr 2024, Last Modified: 24 Apr 2024 · Learn to Compress @ ISIT 2024 · Spotlight · CC BY 4.0
Keywords: Rate-distortion-perception, quantization
TL;DR: A quantization perspective on the rate-distortion-perception tradeoff.
Abstract: The rate-distortion-perception (RDP) framework has attracted significant recent attention due to its application in neural compression. Within this framework, it is important to understand the underlying mechanism connecting coding procedures with common randomness and those without. In contrast to previous efforts, we study this problem from a quantizer design perspective. By analyzing an idealized setting, we provide an interpretation of the advantage of dithered quantization in the RDP setting, which further allows us to draw a conceptual connection between randomized (dithered) quantizers and quantizers without common randomness. This new understanding leads to a new procedure for RDP coding based on staggered quantizers.
Submission Number: 6