HyperCool: Reducing Encoding Cost in Overfitted Codecs with Hypernetworks

Published: 26 Jan 2026, Last Modified: 26 Jan 2026. AAAI 2026 Workshop on ML4Wireless (Poster). License: CC BY 4.0
Keywords: image compression, learned compression, lightweight models, per-image overfitting
Abstract: Overfitted image codecs like Cool-chic achieve strong compression by tailoring lightweight models to individual images, but encoding is slow and costly. Non-Overfitted (N-O) Cool-chic accelerates encoding with a learned inference model, trading compression performance for speed. We introduce HyperCool, a hypernetwork that generates content-adaptive parameters for a Cool-chic decoder in a single forward pass, avoiding per-image fine-tuning. Our method reduces bitrate by 4.9% over N-O Cool-chic with minimal computational overhead and also provides a strong initialization for further optimization, reducing the number of fine-tuning steps needed to approach fully overfitted performance. With fine-tuning, HyperCool reaches HEVC-level compression at 60.4% of the encoding cost of fully overfitted Cool-chic. This approach offers a practical way to accelerate overfitted image codecs under tight compute budgets.
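The core mechanism described in the abstract is a hypernetwork that maps an image to the weights of a lightweight decoder in one forward pass, which can then optionally be fine-tuned per image. The sketch below illustrates that idea in PyTorch; it is not the authors' implementation, and all module names, layer sizes, and the tiny stand-in synthesis MLP are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class HyperNet(nn.Module):
    """Maps an image to a flat parameter vector for a small decoder (illustrative only)."""

    def __init__(self, n_decoder_params: int, feat_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(                       # lightweight image analysis
            nn.Conv2d(3, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(feat_dim, n_decoder_params)   # emits decoder weights

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        # Single forward pass: image -> content-adaptive decoder parameters.
        return self.head(self.encoder(image))


def build_decoder(params: torch.Tensor, in_dim: int = 8, hidden: int = 16, out_dim: int = 3):
    """Slices the flat parameter vector into a tiny 2-layer synthesis MLP (a stand-in
    for a Cool-chic-style decoder; dimensions are arbitrary)."""
    shapes = [(hidden, in_dim), (hidden,), (out_dim, hidden), (out_dim,)]
    chunks, offset = [], 0
    for shape in shapes:
        n = int(torch.tensor(shape).prod())
        chunks.append(params[offset:offset + n].view(*shape))
        offset += n
    w1, b1, w2, b2 = chunks

    def decode(latents: torch.Tensor) -> torch.Tensor:
        h = torch.relu(nn.functional.linear(latents, w1, b1))
        return nn.functional.linear(h, w2, b2)

    return decode


if __name__ == "__main__":
    in_dim, hidden, out_dim = 8, 16, 3
    n_params = hidden * in_dim + hidden + out_dim * hidden + out_dim
    hyper = HyperNet(n_params)

    image = torch.rand(1, 3, 64, 64)
    decoder_params = hyper(image)[0]          # content-adaptive weights, no per-image training
    decode = build_decoder(decoder_params, in_dim, hidden, out_dim)

    latents = torch.rand(64 * 64, in_dim)     # stand-in for per-pixel latent features
    print(decode(latents).shape)              # torch.Size([4096, 3])
```

In the full method, these generated parameters would also serve as the starting point for optional per-image fine-tuning, trading extra encoding steps for compression performance.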
Submission Number: 3