Scalable Diffusion for Bio-topological Representation Learning on Brain Graphs

26 Sept 2024 (modified: 16 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Diffusion; Representation Learning; Graph Learning; Brain; Scalability; Topological Analysis
Abstract: The topological structure of a brain graph is critical for discovering the bio-topological properties that underlie brain function and pathology, and faithful brain-graph representations in many clinical applications rely heavily on these properties. While existing studies have made strides in analyzing brain graph topology, they are often constrained to single-scale structural analysis and hence fail to extract these properties across multiple scales, potentially leading to incomplete and distorted representations. To address this limitation, we propose a novel Scalable diffusion model for bio-TOpological REpresentation learning on Brain graphs (BrainSTORE). BrainSTORE constructs multiscale topological structures within brain graphs, enabling a deep exploration of bio-topological properties. By embedding these features into the training process and prioritizing bio-topological feature reconstruction, BrainSTORE learns representations that better reflect the underlying brain organization. Furthermore, BrainSTORE uses a unified architecture to integrate these features effectively, yielding bio-topological representations that are more robust and biologically meaningful. To the best of our knowledge, this is the first study to investigate bio-topological properties in brain graph representation learning. Extensive experiments demonstrate that BrainSTORE outperforms state-of-the-art methods in brain disease detection.
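The abstract does not detail how multiscale topological structures are built; a common way to obtain multiscale topology from a weighted brain connectivity matrix is an edge-weight filtration, where the graph is thresholded at several scales and topological invariants are recorded at each. The sketch below is purely illustrative of that general idea (it is not the authors' method, and the function names `betti_numbers` and `multiscale_profile` are hypothetical): at each threshold it keeps edges whose weight meets the threshold and records Betti-0 (number of connected components) and the cycle rank (number of independent loops).

```python
# Illustrative sketch only (not BrainSTORE itself): multiscale topological
# features of a weighted connectivity matrix via an edge-weight filtration.

def betti_numbers(n_nodes, edges):
    """Return (beta0, cycle_rank) for an undirected graph.

    beta0 is the number of connected components (via union-find);
    cycle_rank = |E| - |V| + beta0 counts independent loops.
    """
    parent = list(range(n_nodes))

    def find(x):
        # Path-halving union-find lookup.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    n_edges = 0
    for u, v in edges:
        n_edges += 1
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    beta0 = len({find(i) for i in range(n_nodes)})
    return beta0, n_edges - n_nodes + beta0

def multiscale_profile(weights, thresholds):
    """For each threshold t, keep edges with weight >= t and
    record (beta0, cycle_rank) — one topological snapshot per scale."""
    n = len(weights)
    profile = []
    for t in thresholds:
        edges = [(i, j) for i in range(n) for j in range(i + 1, n)
                 if weights[i][j] >= t]
        profile.append(betti_numbers(n, edges))
    return profile

# Toy 4-region connectivity matrix: at a strict threshold the graph is
# fully disconnected; at a loose one it is complete with 3 independent loops.
W = [[0.0, 0.9, 0.1, 0.2],
     [0.9, 0.0, 0.8, 0.1],
     [0.1, 0.8, 0.0, 0.7],
     [0.2, 0.1, 0.7, 0.0]]
print(multiscale_profile(W, [0.95, 0.75, 0.05]))  # → [(4, 0), (2, 0), (1, 3)]
```

How the components and loops evolve across thresholds is exactly the kind of single-scale-invisible information the abstract argues is lost when only one structural scale is analyzed.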
Primary Area: applications to neuroscience & cognitive science
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 8365
