Structure Representation Learning by Jointly Learning to Pool and Represent

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission · Readers: Everyone
Abstract: Structure representation learning is the task of producing a single overall representation for a given structure (e.g., sequential text or a non-sequential graph) that characterizes the properties of that structure. Previous methods decompose the task into an element representation learning phase and a pooling phase that aggregates the element representations. Their pooling phase considers only the final representation of each element, discarding the relationships between elements, which are used solely to construct those representations. In this paper, we conjecture that classification performance suffers from this lack of relation exploitation during pooling, and we propose Self-Attention Pooling, which dynamically derives centrality scores for pooling from the self-attention scores computed during element representation learning. Simply applying Self-Attention Pooling improves model performance on $3$ sentence classification tasks ({$\boldsymbol{\uparrow 2.9}$}) and $5$ graph classification tasks ({$\boldsymbol{\uparrow 2.1}$}) on average.
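The core idea — reusing self-attention scores as per-element centrality weights at pooling time — can be sketched as follows. This is an illustrative reconstruction, not the paper's exact method: the function name `self_attention_pool`, the use of unparameterized dot-product attention, and averaging attention received as the centrality score are all assumptions made for the sketch; the paper would use the attention scores of its trained encoder.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_pool(H):
    """Pool element representations H (shape (n, d)) into one
    structure representation (shape (d,)) using self-attention
    scores as centrality weights (illustrative sketch)."""
    n, d = H.shape
    # Scaled dot-product self-attention weights, (n, n);
    # row i holds the attention element i pays to every element.
    A = softmax(H @ H.T / np.sqrt(d), axis=-1)
    # Centrality of element j: average attention it receives.
    centrality = A.mean(axis=0)
    # Normalize centrality scores into pooling weights.
    w = centrality / centrality.sum()
    # Weighted sum of element representations.
    return w @ H
```

In contrast, a conventional pooling phase (e.g., mean pooling, `H.mean(axis=0)`) weights every element equally and ignores the attention structure entirely, which is the gap the abstract points at.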