Comprehensive Artistic Style Representation for Quantitative Evaluation

27 Sept 2024 (modified: 26 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Visual Representation, Representation Decoupling, Artistic Style, Vision-Language Models
TL;DR: We analyze differences in artistic style representation between multimodal and unimodal models, propose a new style decoupling method, and introduce a higher-quality evaluation dataset, enhancing quantitative representation of artistic styles.
Abstract: Artistic style, a unique medium through which artists express creativity via elements such as form, color, and composition, poses a challenge for computer vision due to its intricate patterns and nuanced aesthetics. Contemporary models, often reliant on specific datasets, are limited in both generalizability and precision when identifying individual artists' styles. From an information-theoretic perspective, we examine the limitations of fine-tuning and investigate techniques for disentangling content from style information. We observe differences in artistic style representation between unimodal and multimodal models. Based on these observations, we propose a plug-and-play approach that efficiently separates content information within Vision-Language Models (VLMs) while preserving stylistic details. Furthermore, we present WeART, a large-scale art dataset with high-quality annotations, to evaluate the artistic style representation capabilities of models. Experimental results show that our method improves the performance of VLMs on style retrieval tasks across several datasets. We will publicly release the proposed dataset and code.
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8954