GCNFT: Graph Convolutional Networks Aware Generative Feature Transformation

ICLR 2025 Conference Submission 12695 Authors

28 Sept 2024 (modified: 13 Oct 2024) · ICLR 2025 Conference Submission · CC BY 4.0
Keywords: Feature Transformation, Data-centric AI, Representation Learning
Abstract: Feature transformation for attributed graphs converts raw node attributes into augmented features that preserve both node and structure information. Relevant literature either fails to capture graph structures (e.g., manual handcrafting, discrete search) or produces latent representations that are hard to interpret (e.g., GCNs). How can we automatically reconstruct explicit features of an attributed graph while effectively integrating graph structures and attributes? We generalize the learning task under this setting as a GCN-aware Feature Transformation (GCNFT) problem. GCNFT poses two under-addressed challenges: 1) quantifying GCN awareness and 2) bridging GCN awareness and feature transformation. To tackle these challenges, we propose a graph convolution structure score guided generative learning framework to solve GCNFT. To quantify GCN awareness, we interpret GCN as a gap-minimization process between ideal and current node representations under iterative Laplacian smoothing, and develop a task-agnostic structure score to approximate GCN awareness. To incorporate GCN awareness, we model feature transformation as sequential generative learning, which paves the way to use the structure score to guide generation and encourage alignment with the graph structure. Extensive experiments demonstrate that the proposed GCN-aware approach outperforms feature transformation baselines by 3% to 20% on node, link, and graph prediction tasks.
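To make the Laplacian-smoothing view concrete, the sketch below is a minimal, hypothetical illustration (not the authors' released code or their exact score): it measures a task-agnostic "structure score" as the Frobenius-norm gap between current node features and their symmetrically normalized neighborhood averages, i.e., how far the features are from one step of GCN-style Laplacian smoothing. The function name and the specific gap formula are assumptions made for illustration only.

    # Hypothetical sketch: structure score as the gap to one Laplacian-smoothing step.
    import numpy as np

    def structure_score(adj: np.ndarray, feats: np.ndarray) -> float:
        """Gap between features and their GCN-style smoothed counterparts.

        adj: (n, n) symmetric adjacency matrix of the attributed graph.
        feats: (n, d) node feature matrix (raw or transformed features).
        Returns the Frobenius norm of (X - D^{-1/2} A_hat D^{-1/2} X).
        """
        adj_hat = adj + np.eye(adj.shape[0])            # add self-loops
        deg = adj_hat.sum(axis=1)
        d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
        prop = d_inv_sqrt @ adj_hat @ d_inv_sqrt        # normalized propagation matrix
        smoothed = prop @ feats                          # one smoothing (GCN aggregation) step
        return float(np.linalg.norm(feats - smoothed, ord="fro"))

    # Usage: a 3-node path graph with 2-dimensional node attributes.
    A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
    X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    print(structure_score(A, X))  # smaller value = features better aligned with the graph structure

Under this reading, a score of this kind could serve as a structure-alignment signal for guiding the sequential generative feature transformation described in the abstract.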
Primary Area: other topics in machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 12695