Prototypical Representation Learning for Low-resource Knowledge Extraction: Summary and Perspective

Anonymous

17 Jan 2022 (modified: 05 May 2023) · Submitted to BT@ICLR2022
Keywords: Prototype, Low-resource, Knowledge Extraction
Abstract: Recent years have witnessed the success of prototypical representations across a wide range of low-resource tasks, ever since "Prototypical Networks for Few-shot Learning (NeurIPS 2017)" proposed to represent each class as a prototype, computed as the mean of its instance embeddings, and to learn a metric space in which classification is performed by computing distances to the prototypes. A recent paper, "*Prototypical Representation Learning for Relation Extraction*", accepted at ICLR 2021 as a member of the growing zoo of prototypical networks, addresses **prototypical representation learning for low-resource knowledge extraction**. In this post, we briefly survey this topic, highlighting the ICLR paper. Unlike vanilla prototypical networks, the ICLR paper tackles low-resource knowledge extraction by (1) considering both *intra-prototype compactness* and *inter-prototype separability*, and (2) leveraging *contrastive learning* and projecting prototypes into a *geometric space*. Furthermore, we point out some shortcomings of the paper and put forward several promising directions.
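To make the prototypical-network idea concrete, here is a minimal sketch of the classification rule described above: each class prototype is the mean of that class's instance embeddings, and a query is assigned to the class with the nearest prototype. The function name, toy relation labels, and 2-D embeddings below are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np

def prototypical_classify(support, query):
    """Toy prototypical-network classifier (illustrative sketch).

    support: dict mapping class label -> array of instance embeddings, shape (n_i, d)
    query:   array of shape (d,), the embedding of the instance to classify
    Returns the label whose prototype (mean embedding) is nearest in Euclidean distance.
    """
    # Each prototype is the mean of its class's support embeddings.
    prototypes = {label: emb.mean(axis=0) for label, emb in support.items()}
    # Classify by distance to each prototype in the embedding space.
    distances = {label: np.linalg.norm(query - p) for label, p in prototypes.items()}
    return min(distances, key=distances.get)

# Hypothetical 2-D embeddings for two relation classes
support = {
    "born_in":  np.array([[0.9, 0.1], [1.1, -0.1]]),
    "works_at": np.array([[-1.0, 0.2], [-0.8, 0.0]]),
}
query = np.array([1.0, 0.0])
print(prototypical_classify(support, query))  # → born_in
```

The ICLR paper goes beyond this vanilla rule by additionally shaping the embedding space with a contrastive objective, so that instances cluster tightly around their own prototype while prototypes of different classes stay well separated.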
ICLR Paper: https://openreview.net/forum?id=aCgLmfhIy_f