Attention-based Dynamic Subspace Learners

22 Apr 2022, 01:21 (edited 04 Jun 2022) · MIDL 2022 Short Papers
  • Keywords: Deep Metric Learning, Clustering, Image Retrieval
  • TL;DR: A dynamic learning strategy for deep metric learning that learns a consistent embedding without an empirical search for the optimal number of learners.
  • Abstract: Deep metric learning methods are widely used to learn similarities in data. Most methods use a single metric learner, which is inadequate for handling the variety of object attributes in images, such as color, shape, or artifacts. Multiple metric learners can each focus on different object attributes, but the number of learners must be found empirically for each new dataset. This work presents Dynamic Subspace Learners, which exploit multiple learners dynamically by aggregating new subspace learners during training, removing the need to know the number of learners a priori. Furthermore, the interpretability of this subspace learning is enforced by integrating an attention module into our method, providing a visual explanation of the embedding features. Our method achieves results competitive with multiple-learner baselines and significantly improves over a classification network on clustering and retrieval tasks.
  • Registration: I acknowledge that acceptance of this work at MIDL requires at least one of the authors to register and present the work during the conference.
  • Authorship: I confirm that I am the author of this work and that it has not been submitted to another publication before.
  • Paper Type: recently published or submitted journal contributions
  • Primary Subject Area: Unsupervised Learning and Representation Learning
  • Secondary Subject Area: Interpretability and Explainable AI
  • Confidentiality And Author Instructions: I read the call for papers and author instructions. I acknowledge that exceeding the page limit and/or altering the latex template can result in desk rejection.
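The abstract's core idea of growing subspace learners during training instead of fixing their count up front can be sketched in a toy numpy example. Everything here is an assumption for illustration: the function names, the uniform attention weights, and the growth trigger are placeholders, not the paper's actual attention module or training criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def subspace_embed(x, projections, attention_weights):
    """Project a feature vector into each learner's subspace and
    concatenate the attention-weighted sub-embeddings.
    x: (d,) feature vector; projections: list of (k, d) matrices;
    attention_weights: per-learner softmax weights (hypothetical names)."""
    subs = [w * (P @ x) for P, w in zip(projections, attention_weights)]
    return np.concatenate(subs)

d, k = 16, 4
projections = [rng.normal(size=(k, d))]  # start with a single subspace learner
x = rng.normal(size=d)

# Dynamic growth: append a new subspace learner during "training".
# The real method aggregates learners based on a training signal; here a
# toy trigger grows the pool twice, purely to show the mechanism.
for step in range(3):
    attn = softmax(np.ones(len(projections)))  # uniform attention for the sketch
    emb = subspace_embed(x, projections, attn)
    if step < 2:
        projections.append(rng.normal(size=(k, d)))

print(len(projections), emb.shape)  # 3 learners, final embedding of shape (12,)
```

The embedding dimension grows with the number of learners (here 3 × k = 12), so no per-dataset search over learner counts is needed; the attention weights would, in the paper's method, also provide the visual explanation of which subspace attends to which image attribute.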