How far can we go without finetuning?

22 Sept 2023 (modified: 25 Mar 2024), ICLR 2024 Conference Withdrawn Submission
Keywords: lifelong learning, transformers
TL;DR: Better-generalising pre-trained architectures can solve a number of problems without finetuning, providing a basis for lifelong learning without catastrophic forgetting and a means of interpretation.
Abstract: Many existing deep learning methods are trained for scenarios that: (1) require (costly) finetuning of the latent spaces on the target dataset for the "downstream task"; (2) do not account for continual and open-set learning; and (3) do not provide interpretability. Instead of trying to solve the problem of semi- and unsupervised learning through representation learning, we propose recasting it as the problem of analysing the feature spaces of existing foundation models. We show that a simple baseline, based on non-parametric clustering analysis of the latent feature spaces and on classifiers pre-trained on large-scale datasets, can help solve a set of such problems even without finetuning. It can also be seen as a set of metrics for assessing generalisation within the latent feature spaces. We argue that better-generalising pre-trained architectures can solve a number of problems without finetuning, providing a basis for lifelong learning without catastrophic forgetting and a means of interpretation.
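As a concrete illustration of the baseline the abstract describes, the sketch below extracts latent features from a frozen pre-trained classifier, clusters them non-parametrically, and scores cluster-label agreement as a generalisation metric. The specific choices here (an ImageNet-pretrained ResNet-50, CIFAR-10 as the target dataset, scikit-learn's mean-shift, and adjusted Rand index) are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch of the no-finetuning baseline: frozen pre-trained
# features + non-parametric clustering + a cluster/label agreement
# metric. All model and dataset choices are illustrative assumptions.
import torch
import torchvision.models as models
import torchvision.transforms as T
from torchvision.datasets import CIFAR10
from torch.utils.data import DataLoader, Subset
from sklearn.cluster import MeanShift
from sklearn.metrics import adjusted_rand_score

# 1. Frozen pre-trained feature extractor (no finetuning anywhere).
backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
backbone.fc = torch.nn.Identity()  # expose the latent feature space
backbone.eval()

transform = T.Compose([
    T.Resize(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
data = CIFAR10(root="data", train=False, download=True, transform=transform)
# A small subset keeps mean-shift tractable in this sketch.
loader = DataLoader(Subset(data, range(1000)), batch_size=256)

features, labels = [], []
with torch.no_grad():
    for x, y in loader:
        features.append(backbone(x))
        labels.extend(y.tolist())
features = torch.cat(features).numpy()

# 2. Non-parametric clustering: the number of clusters is inferred
#    from the feature space, not fixed in advance.
clusters = MeanShift().fit_predict(features)

# 3. Agreement between discovered clusters and ground-truth labels
#    acts as a proxy metric for how well the latent space generalises.
print("Adjusted Rand index:", adjusted_rand_score(labels, clusters))
```

A higher agreement score under this kind of probe would indicate a latent space that separates unseen classes without any gradient updates, which is the property the abstract's argument rests on.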
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5812