Perceptual Context and Sensitivity in Image Quality Assessment: A Human-Centric Approach

23 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Human Visual System; Quality Context Contrastive Learning; Quality-Aware Mask Attention; Global and Local
Abstract: Blind Image Quality Assessment (BIQA) mirrors the subjective judgments made by human observers. The Human Visual System (HVS) assesses image quality by combining a global perspective on the contrasting relationships among samples of varying quality with a local analysis of individual images. However, current BIQA methodologies tend to emphasize local evaluations while overlooking the contrasting relationships inherent in global perception, leading to an incomplete representation of human subjective assessment. Consequently, the representation learning of BIQA models remains suboptimal. To address this, we present Perceptual Context and Sensitivity in BIQA (CSIQA), a novel metric learning paradigm that seamlessly integrates efficient human-centric global and local evaluations into the BIQA methodology. Specifically, CSIQA comprises two primary components: 1) a Quality Context Contrastive Learning module, equipped with different contrastive learning strategies to effectively capture potential quality correlations in the \textbf{global context} of the dataset; and 2) a Quality-Aware Mask Attention module, which employs a random masking mechanism to ensure consistency with \textbf{local sensitivity} in vision, thereby improving the model's perception of local distortions. Extensive experiments on eight standard BIQA datasets demonstrate performance superior to state-of-the-art BIQA methods, \emph{i.e.,} achieving PLCC values of 0.941 ($\uparrow 3.3\%$ vs. 0.908 on TID2013) and 0.920 ($\uparrow 2.6\%$ vs. 0.894 on LIVEC).
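The quality-context contrastive idea described above can be sketched in code. The snippet below is a minimal illustrative assumption, not the paper's actual formulation: it treats images whose mean opinion scores (MOS) fall within a threshold of each other as positive pairs, and applies a supervised-contrastive-style loss over their embeddings. The function name, the MOS-threshold rule, and all parameters are hypothetical.

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def quality_context_contrastive_loss(embs, mos, tau=0.1, temp=0.5):
    """Hypothetical sketch: pull together embeddings of images with
    similar MOS (|mos_i - mos_j| < tau), push apart the rest."""
    n = len(embs)
    total, count = 0.0, 0
    for i in range(n):
        # Denominator: similarities of anchor i to all other samples.
        denom = sum(math.exp(cosine(embs[i], embs[j]) / temp)
                    for j in range(n) if j != i)
        # Positives: samples whose quality score is close to the anchor's.
        positives = [j for j in range(n)
                     if j != i and abs(mos[i] - mos[j]) < tau]
        for j in positives:
            sim_ij = math.exp(cosine(embs[i], embs[j]) / temp)
            total += -math.log(sim_ij / denom)
            count += 1
    return total / max(count, 1)
```

A loss of this shape rewards the encoder for ordering the embedding space by perceived quality across the whole batch, which is one plausible way to realize the "global context" objective the abstract names.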
Supplementary Material: zip
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6891