Keywords: Implicit Neural Representations, Image Quality Assessment
Abstract: Measuring the perceptual quality of a degraded signal relative to its pristine reference has long posed a fundamental challenge in signal and image processing. Traditional metrics such as PSNR and SSIM, as well as more recent learning-based approaches like LPIPS and DreamSim, have provided valuable insights, yet each carries inherent shortcomings. Pixel-domain measures like PSNR, for instance, can assign identical scores to two distortions that are visually very different, failing to reflect human perception. Moreover, as the severity of degradation increases or decreases, existing similarity measures often fail to respond proportionally, revealing their limited sensitivity to the true perceptual progression of signal quality. To address these limitations, we introduce Implicit Neural Representation for Image Quality Assessment (INRIQ). The central idea is to first overfit an INR to the reference signal, thereby encoding its structural and frequency content directly in the weight space of the network. We then quantify how the trained INR must adapt in order to approximate the degraded signal, analyzing this process through a Fisher-based sensitivity framework. By shifting the comparison from image or feature space into the parameter space of INRs, our approach captures subtle yet meaningful differences in distortion, offering a more principled and perceptually aligned measure of quality. Our experimental results on the entire KADID-10k dataset show that INRIQ is the most sensitive similarity measure for images.
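A minimal sketch of the pipeline the abstract describes, assuming a small coordinate-MLP INR fitted with Adam, a diagonal Fisher approximation under a Gaussian observation model, and a Fisher-weighted parameter shift as the quality score. The architecture, step counts, and function names (`fit`, `diag_fisher`, `inriq_score`) are illustrative assumptions, not the authors' implementation:

```python
import torch
import torch.nn as nn

class INR(nn.Module):
    """Coordinate MLP mapping a 2D location (x, y) to a grayscale intensity."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords):
        return self.net(coords)

def fit(model, coords, target, steps=2000, lr=1e-3):
    """Overfit the INR to one signal (the reference, or later the degraded one)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        ((model(coords) - target) ** 2).mean().backward()
        opt.step()
    return model

def diag_fisher(model, coords, n_samples=256):
    """Diagonal Fisher at the reference optimum under a Gaussian observation
    model: average of squared output gradients over sampled coordinates."""
    fisher = [torch.zeros_like(p) for p in model.parameters()]
    idx = torch.randperm(coords.shape[0])[:n_samples].tolist()
    for i in idx:
        model.zero_grad()
        model(coords[i:i + 1]).sum().backward()
        for f, p in zip(fisher, model.parameters()):
            f += p.grad.detach() ** 2
    return [f / len(idx) for f in fisher]

def inriq_score(ref, deg, side):
    """Fisher-weighted parameter shift needed to adapt the reference-fitted
    INR toward the degraded signal; larger values mean stronger distortion."""
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, side),
                            torch.linspace(-1, 1, side), indexing="ij")
    coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)
    model = fit(INR(), coords, ref.reshape(-1, 1))     # encode the reference
    theta_ref = [p.detach().clone() for p in model.parameters()]
    fisher = diag_fisher(model, coords)                # parameter sensitivity
    fit(model, coords, deg.reshape(-1, 1), steps=500)  # adapt to degraded signal
    return sum(((p.detach() - p0) ** 2 * f).sum()
               for p, p0, f in zip(model.parameters(), theta_ref, fisher)).item()

if __name__ == "__main__":
    side = 32
    ref = torch.rand(side, side)                           # stand-in reference
    deg = (ref + 0.1 * torch.randn_like(ref)).clamp(0, 1)  # noisy version of it
    print(f"INRIQ-style score: {inriq_score(ref, deg, side):.4e}")
```

Weighting the parameter shift by the Fisher diagonal, rather than taking a raw Euclidean distance in weight space, reflects that different INR parameters affect the reconstructed signal unequally, which is the sensitivity idea the abstract attributes to the Fisher-based framework.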
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 14700