Abstract: Signed networks are commonly used to represent positive and negative relationships in the real world, and link sign prediction in such networks is a significant research topic. Over the past decades, various link sign prediction methods have been proposed. However, most of them follow the message-passing paradigm and face two challenges: i) capturing the relationships between nodes and their high-order neighborhoods, and ii) alleviating the over-smoothing problem. To address these challenges, we propose a method called Signed Graph Transformer (SGTrans), which encodes node sequences with a Transformer. For the first challenge, we introduce three types of positional encoding, guided by path-level balance theory, while adding more network layers to effectively capture the relationships between nodes and their high-order neighborhoods. For the second challenge, SGTrans applies self-attention to a sampled sequence of relevant nodes rather than relying on message passing. This limits the introduction of irrelevant node information and alleviates the over-smoothing problem. Extensive experiments on four real-world datasets demonstrate the effectiveness of SGTrans.
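The core mechanism described above can be illustrated with a minimal sketch: single-head self-attention applied to a sampled node sequence, with positional encodings added to the node features before attention. This is only an illustration of the general idea, not SGTrans itself; the positional encodings here are random placeholders, whereas SGTrans derives them from path-level balance theory, and all weight matrices and dimensions are arbitrary assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend_node_sequence(H, P, Wq, Wk, Wv):
    """Single-head self-attention over a sampled node sequence.

    H: (L, d) features of a target node and its sampled relevant
       neighbors (including high-order ones).
    P: (L, d) positional encodings; in SGTrans these would come from
       path-level balance theory -- here they are placeholders.
    """
    X = H + P                                   # inject positional information
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # query / key / value projections
    A = softmax(Q @ K.T / np.sqrt(K.shape[1]))  # (L, L) attention weights
    return A @ V                                # attention-weighted node representations

rng = np.random.default_rng(0)
L, d = 8, 16  # 1 target node + 7 sampled neighbors, feature dimension 16
H = rng.normal(size=(L, d))
P = rng.normal(size=(L, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Z = attend_node_sequence(H, P, Wq, Wk, Wv)
print(Z.shape)  # (8, 16)
```

Because attention operates only on the sampled sequence, each node aggregates from a fixed, relevant set rather than from an ever-growing receptive field, which is how this style of encoder sidesteps the over-smoothing that deep message passing incurs.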
External IDs: dblp:conf/ijcnn/LinCCHC25