Homeomorphic Model Transformation for Boosting Performance and Efficiency in Object Detection Networks

23 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Desk Rejected Submission
Keywords: Object Detection, Model Transfer, Transfer Learning, Homeomorphic Model Transformation
Abstract: The field of computer vision has witnessed significant advances in recent years with the development of deep learning networks. However, the fixed architectures of these networks limit their capabilities. For object detection, existing methods typically rely on a fixed architecture; while they achieve promising performance, there is room for further improvement with only minimal modifications. In this study, we show that minimal modifications to existing networks can further boost performance. However, modifying layers causes a mismatch with the pre-trained weights, and fine-tuning the modified network from scratch is time-consuming and resource-inefficient. To address this issue, we propose a novel technique called Homeomorphic Model Transformation (HMT), which adapts the initial weights of the modified network from the pretrained weights. This approach preserves the original model's performance when layers are modified. Additionally, HMT significantly reduces the total training time required to reach optimal results while further enhancing network performance. Extensive experiments across various object detection tasks validate the effectiveness and efficiency of the proposed HMT.
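This page does not specify HMT's concrete transformation. The sketch below only illustrates the general idea the abstract describes: initializing a modified layer from the pretrained weights so that the network's function, and hence the original performance, is preserved at the start of fine-tuning. It assumes a PyTorch setting and uses a Net2Net-style function-preserving widening as a stand-in; the function name `widen_linear` and all details are illustrative, not the authors' method.

```python
import torch
import torch.nn as nn

def widen_linear(layer: nn.Linear, next_layer: nn.Linear, new_out: int):
    """Widen `layer` to `new_out` units without changing the network's output.

    Extra units replicate pretrained units; the next layer's incoming
    weights are rescaled so duplicated units contribute the same total.
    """
    old_out, in_features = layer.weight.shape
    assert new_out >= old_out
    # Map each new unit to a pretrained unit (originals first, then copies).
    extra = torch.randint(0, old_out, (new_out - old_out,))
    mapping = torch.cat([torch.arange(old_out), extra])

    wide = nn.Linear(in_features, new_out)
    wide.weight.data = layer.weight.data[mapping].clone()
    wide.bias.data = layer.bias.data[mapping].clone()

    # counts[i] = how many widened units replicate pretrained unit i;
    # dividing by it keeps the downstream pre-activations identical.
    counts = torch.bincount(mapping, minlength=old_out).float()
    nxt = nn.Linear(new_out, next_layer.out_features)
    nxt.weight.data = next_layer.weight.data[:, mapping] / counts[mapping]
    nxt.bias.data = next_layer.bias.data.clone()
    return wide, nxt
```

Because the widened pair computes exactly the same function as the pretrained pair (even with a ReLU in between, since duplicated units produce identical activations), fine-tuning starts from the original model's accuracy rather than from a random re-initialization, which is the efficiency benefit the abstract claims for HMT.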
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7104