Supervised Pre-training for Unsupervised Product-Patent Image Retrieval

17 Sept 2024 (modified: 13 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: image retrieval, supervised pre-training, domain gap
Abstract: Detecting infringing products is essential for protecting intellectual property rights and is often formulated as a product-patent image retrieval task. Manual infringement detection is extremely time-consuming, so artificial intelligence plays an increasingly important role. However, owing to the domain discrepancy between patent images and product images, most existing methods fall back on natural language-based retrieval. Given the lack of sufficient annotated data, this work addresses these issues in an unsupervised setting by answering two questions: 1) How can the domain gap between patent images and product images be bridged with existing technologies? 2) How can a powerful backbone be built to jointly extract features from patent and product images? We first construct a dataset for patent-product image retrieval that includes product-patent pairs as well as unlabeled data. To address the first question, we systematically evaluate three unsupervised approaches for mitigating the domain gap between patent and product images; the results demonstrate that jointly mapping both image types into a new feature space is effective. To answer the second question, we propose a novel supervised pre-training paradigm that achieves domain-aligned feature extraction for product and patent edge images. Extensive experiments with various backbones and training pipelines demonstrate the superiority of our supervised pre-training method. The dataset and code will be made publicly available upon acceptance.
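To make the retrieval setup described in the abstract concrete, below is a minimal sketch of one plausible pipeline: both product photos and patent images are mapped into a shared edge-image domain, embedded with a pre-trained backbone, and matched by cosine similarity. This is an illustrative assumption, not the authors' implementation; the choices of Canny edge extraction, an ImageNet ResNet-50, the thresholds, and the function names are all hypothetical placeholders for the paper's (unspecified) domain-alignment method and supervised pre-trained backbone.

```python
# Hypothetical sketch: edge-space product-patent retrieval.
# Canny + ImageNet ResNet-50 stand in for the paper's actual
# domain-alignment step and supervised pre-trained backbone.

import cv2
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T

# Frozen feature extractor (assumption: the paper pre-trains its own
# backbone on edge images; ImageNet weights are only a placeholder).
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the classifier, keep 2048-d features
backbone.eval()

preprocess = T.Compose([
    T.ToTensor(),
    T.Resize((224, 224), antialias=True),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def to_edge_map(image_bgr: np.ndarray) -> np.ndarray:
    """Map a product photo or patent image into a shared edge domain."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)               # hypothetical thresholds
    return cv2.cvtColor(edges, cv2.COLOR_GRAY2RGB)  # 3 channels for the CNN

@torch.no_grad()
def embed(image_bgr: np.ndarray) -> torch.Tensor:
    """L2-normalized edge-image feature, for cosine-similarity retrieval."""
    x = preprocess(to_edge_map(image_bgr)).unsqueeze(0)
    f = backbone(x).squeeze(0)
    return f / f.norm()

@torch.no_grad()
def retrieve(query_bgr: np.ndarray, gallery: list[np.ndarray]) -> int:
    """Return the index of the gallery image most similar to the query."""
    q = embed(query_bgr)
    sims = torch.stack([embed(g) for g in gallery]) @ q
    return int(sims.argmax())
```

The design point this sketch illustrates is the one the abstract argues for: rather than comparing raw photos against patent images directly, both domains are first projected into a common representation (here, crudely, edge maps) so that a single backbone can extract comparable features from each.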
Primary Area: datasets and benchmarks
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1302