ADAPTIVE HETEROGENEOUS GRAPH REPRESENTATION LEARNING USING KNN-AUGMENTED GRAPH MAMBA NETWORKS (KA-GMN)

Published: 06 Mar 2025, Last Modified: 15 Apr 2025 · ICLR 2025 DeLTa Workshop Poster · CC BY 4.0
Track: tiny / short paper (up to 4 pages)
Keywords: Graph Mamba Networks, Graph neural networks, State Space Models
TL;DR: KA-GMN integrates KNN-based topology with state space models to enable efficient and structure-preserving representation learning on heterogeneous graphs.
Abstract: Graph representation learning for heterogeneous networks poses challenges in structural preservation and computational tractability. We present KA-GMN (KNN-Augmented Graph Mamba Networks), which integrates k-nearest neighbor selection with state space models for heterogeneous graph representation learning. The architecture comprises: (1) KNN-based state transitions for type-specific node representation, (2) compatibility functions for structural graph adaptation, and (3) type-aware feature transformations that prevent representation degradation. KA-GMN processes multi-typed relationships through selective message passing and state space modeling, preserving graph structure through learned neighborhood functions. The theoretical framework lays a foundation for heterogeneous graph representation through the synthesis of KNN-based topology and state space dynamics.
Submission Number: 130
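
Since the abstract gives only a high-level description, the following is a minimal PyTorch-style sketch of how the three stated components (type-aware feature transformations, KNN-based topology, and a selective state-space update with a compatibility-style gate) could fit together in one layer. All names, dimensions, and design choices here (KAGMNLayer, knn_graph, d_model, k, the cosine-similarity KNN, the sigmoid gates) are illustrative assumptions, not the authors' implementation.

    # Minimal sketch of a KNN-augmented, type-aware selective state-space layer.
    # Hypothetical names and hyperparameters; not the paper's reference code.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    def knn_graph(x: torch.Tensor, k: int) -> torch.Tensor:
        """Indices of the k nearest neighbors per node (cosine similarity)."""
        xn = F.normalize(x, dim=-1)
        sim = xn @ xn.T                            # (N, N) similarity matrix
        sim.fill_diagonal_(float("-inf"))          # exclude self-loops
        return sim.topk(k, dim=-1).indices         # (N, k)


    class KAGMNLayer(nn.Module):
        """One KA-GMN-style layer: type projection, KNN topology, gated state scan."""

        def __init__(self, d_model: int, num_types: int, k: int = 8):
            super().__init__()
            self.k = k
            # (3) type-aware feature transformations, one projection per node type
            self.type_proj = nn.ModuleList(
                [nn.Linear(d_model, d_model) for _ in range(num_types)]
            )
            # input-dependent (selective) gates in the spirit of Mamba-style SSMs;
            # the gate plays the role of a compatibility function between states
            self.delta = nn.Linear(2 * d_model, d_model)
            self.gate = nn.Linear(2 * d_model, d_model)
            self.out = nn.Linear(d_model, d_model)

        def forward(self, x: torch.Tensor, node_types: torch.Tensor) -> torch.Tensor:
            # (3) project each node with its type-specific transformation
            h = torch.stack(
                [self.type_proj[t](xi) for xi, t in zip(x, node_types.tolist())]
            )

            # (2) learned neighborhood: KNN topology over the projected features
            nbr_idx = knn_graph(h, self.k)         # (N, k)

            # (1) selective state transition: scan over the k neighbors, with
            # gates conditioned on the current state and the incoming neighbor
            state = h
            for j in range(self.k):
                nbr = h[nbr_idx[:, j]]             # (N, d) j-th neighbor features
                ctx = torch.cat([state, nbr], dim=-1)
                a = torch.sigmoid(self.delta(ctx)) # how much old state to keep
                b = torch.sigmoid(self.gate(ctx))  # how much neighbor to admit
                state = a * state + b * torch.tanh(nbr)
            return x + self.out(state)             # residual connection


    if __name__ == "__main__":
        N, d, num_types = 32, 16, 3
        x = torch.randn(N, d)
        types = torch.randint(0, num_types, (N,))
        layer = KAGMNLayer(d_model=d, num_types=num_types, k=4)
        print(layer(x, types).shape)               # torch.Size([32, 16])

Under these assumptions, the per-layer cost is dominated by the KNN search and a scan of fixed length k, rather than attention over all node pairs, which is consistent with the efficiency claim in the TL;DR.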