HEAT: Hyperedge Attention Networks

30 May 2022, 15:03 (modified: 14 Sept 2022, 00:30)
Accepted by TMLR
Readers: Everyone
Abstract: Learning from structured data is a core machine learning task. Commonly, such data is represented as graphs, which normally consider only (typed) binary relationships between pairs of nodes. This is a substantial limitation for many domains with highly structured data. One important such domain is source code, where hypergraph-based representations can better capture the semantically rich and structured nature of code. In this work, we present HEAT, a neural model capable of representing typed and qualified hypergraphs, where each hyperedge explicitly qualifies how participating nodes contribute. It can be viewed as a generalization of both message passing neural networks and Transformers. We evaluate HEAT on knowledge base completion and on bug detection and repair using a novel hypergraph representation of programs. In both settings, it outperforms strong baselines, indicating its power and generality.
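To make the core idea concrete, here is a minimal, hypothetical sketch (not HEAT's actual architecture) of attention over a single typed hyperedge, where each participating node's contribution is qualified by a role embedding before scaled dot-product attention aggregates the participants. All names, roles, and dimensions below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

# One hyperedge with three participants; each node plays a named role
# (the "qualifier"), e.g. in a call hyperedge nodes might act as
# callee / argument / result. Roles and weights here are random stand-ins.
node_states = rng.normal(size=(3, d))   # current node representations
role_ids = np.array([0, 1, 2])          # one qualifier per participant
role_emb = rng.normal(size=(3, d))      # would be learned in practice


def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()


def hyperedge_attention(query, nodes, roles):
    """Attend over a hyperedge's participants, letting each node's
    role qualify its key and value (a simple additive conditioning)."""
    keys = nodes + role_emb[roles]       # role-conditioned keys
    values = nodes + role_emb[roles]     # role-conditioned values
    scores = keys @ query / np.sqrt(d)   # scaled dot-product scores
    weights = softmax(scores)            # attention over participants
    return weights @ values, weights     # aggregated message, weights


query = rng.normal(size=d)  # e.g. the state of the node being updated
message, weights = hyperedge_attention(query, node_states, role_ids)
print(weights.shape, message.shape)  # (3,) (8,)
```

With a single binary edge and trivial roles this reduces to ordinary message passing; with all nodes in one hyperedge and learned projections it resembles a Transformer attention step, which is the sense in which the abstract calls HEAT a generalization of both.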
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Length: Regular submission (no more than 12 pages of main content)
Video: https://youtu.be/q0oFjmxz_60
Code: https://github.com/microsoft/neurips21-self-supervised-bug-detection-and-repair/tree/heat
Assigned Action Editor: ~Swarat_Chaudhuri1