Compatibility-aware Single-cell Continual Annotation

26 Sept 2024 (modified: 12 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: continual compatible learning; single-cell RNA-seq data
Abstract:

As massive, well-labeled single-cell RNA-seq (scRNA-seq) datasets become available sequentially, automatic cell type annotation systems must update their models continually to expand the internal cell type library. However, such updating can suffer from catastrophic forgetting, in which the model's performance on old tasks degrades significantly after it learns a new task. To enable smooth upgrading of the system, the model must both maintain performance on old tasks (stability) and adapt to learn new tasks (plasticity). We call such an updating process continual compatible learning. For this setting, we propose a simple yet effective method, termed scROD, based on sample replay and objective decomposition. Specifically, we first maintain a memory buffer that stores a subset of cells from previous tasks and replays them alongside each incoming task. We then decompose the training objective of continual compatible learning into two parts, distinguishing new cell types from old ones and distinguishing among the new types themselves, since the two contribute to forgetting to different degrees. Lastly, we assign distinct weights to the two objectives, obtaining a better trade-off between stability and plasticity than the coupled objective. Comprehensive experiments on various benchmarks show that scROD outperforms existing scRNA-seq annotation methods and can learn many cell types continually over a long period.
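To make the two ingredients concrete, below is a minimal PyTorch-style sketch of replay plus objective decomposition as the abstract describes them. Everything here is an illustrative assumption rather than the authors' implementation: the names (MemoryBuffer, decomposed_loss, alpha, beta, n_old) and the convention that classifier outputs are ordered [old cell types | new cell types] are all hypothetical.

```python
# Sketch of scROD-style replay + objective decomposition (assumptions noted above).
import random
import torch
import torch.nn.functional as F

class MemoryBuffer:
    """Reservoir-style buffer holding a few labeled cells from previous tasks."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.n_seen = 0
        self.cells, self.labels = [], []

    def add(self, xs, ys):
        for x, y in zip(xs, ys):
            self.n_seen += 1
            if len(self.cells) < self.capacity:
                self.cells.append(x); self.labels.append(y)
            else:  # reservoir sampling keeps a uniform subsample of all cells seen
                j = random.randrange(self.n_seen)
                if j < self.capacity:
                    self.cells[j], self.labels[j] = x, y

    def sample(self, n):
        assert self.cells, "buffer is empty"
        idx = random.sample(range(len(self.cells)), min(n, len(self.cells)))
        return (torch.stack([self.cells[i] for i in idx]),
                torch.stack([self.labels[i] for i in idx]))

def decomposed_loss(logits, y, n_old, alpha=1.0, beta=1.0):
    """Split the joint softmax into two weighted objectives:
    (1) old-vs-new discrimination, (2) discrimination among new cell types."""
    is_new = y >= n_old
    # Objective 1: binary old-vs-new, using the summed probability mass of each
    # group (logsumexp over group logits gives exactly the grouped softmax).
    group_logits = torch.stack([
        torch.logsumexp(logits[:, :n_old], dim=1),  # "old" super-class
        torch.logsumexp(logits[:, n_old:], dim=1),  # "new" super-class
    ], dim=1)
    loss_old_new = F.cross_entropy(group_logits, is_new.long())
    # Objective 2: softmax restricted to new types only, so its gradients
    # never push old-type logits down (less forgetting).
    if is_new.any():
        loss_new = F.cross_entropy(logits[is_new][:, n_old:], y[is_new] - n_old)
    else:
        loss_new = logits.new_zeros(())
    return alpha * loss_old_new + beta * loss_new
```

In this sketch, the among-new softmax cannot depress the logits of old cell types, which limits forgetting, while the weights alpha and beta let one tune stability (old-vs-new separation, fed by replayed cells) against plasticity (learning the new types); during training, each incoming batch would be concatenated with a draw from MemoryBuffer.sample before computing the loss.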

Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5465