Highlights
• We propose SARL, a structure-aware graph transformer for rule learning on knowledge graphs (KGs).
• We propose a generalized attention mechanism that can be swapped out for flexibility.
• SARL outperforms baselines and is readily applicable to other KG reasoning tasks.