R²-CoD: Understanding Text-Graph Complementarity in Relational Reasoning via Knowledge Co-Distillation

Published: 23 Sept 2025, Last Modified: 17 Nov 2025
Venue: UniReps 2025
License: CC BY 4.0
Track: Extended Abstract Track
Keywords: text-graph complementarity, relational reasoning, knowledge co-distillation, representation analysis
TL;DR: We analyze how text and graph representations complement each other across relational reasoning tasks using knowledge co-distillation.
Abstract: Relational reasoning lies at the core of many NLP tasks, drawing on complementary signals from text and graphs. While prior research has investigated this dual complementarity, a detailed and systematic understanding of the text-graph interplay and its effect on hybrid models is still lacking. We take an analysis-driven approach, using a unified architecture with knowledge co-distillation (CoD) across five diverse relational reasoning tasks. By tracking how text and graph representations evolve during training, we uncover interpretable patterns of alignment and divergence, and provide insights into when and why their integration is beneficial.
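To make the co-distillation setup concrete, below is a minimal sketch of a bidirectional knowledge co-distillation objective between a text branch and a graph branch. The loss form, temperature, and weighting coefficient are illustrative assumptions in the spirit of mutual-distillation objectives, not the paper's actual architecture or hyperparameters.

```python
# Sketch of bidirectional knowledge co-distillation (CoD) between a text
# branch and a graph branch. Assumption: each branch outputs class logits
# for the same relational reasoning task; the temperature and alpha values
# are hypothetical, not taken from the paper.
import torch
import torch.nn.functional as F


def cod_loss(text_logits: torch.Tensor,
             graph_logits: torch.Tensor,
             labels: torch.Tensor,
             temperature: float = 2.0,
             alpha: float = 0.5) -> torch.Tensor:
    # Supervised task loss for each branch.
    task = (F.cross_entropy(text_logits, labels)
            + F.cross_entropy(graph_logits, labels))

    # Temperature-softened log-distributions for distillation.
    t_log_p = F.log_softmax(text_logits / temperature, dim=-1)
    g_log_p = F.log_softmax(graph_logits / temperature, dim=-1)

    # Symmetric KL: each branch acts as both teacher and student,
    # so knowledge flows in both directions ("co-distillation").
    kl_tg = F.kl_div(t_log_p, g_log_p, reduction="batchmean", log_target=True)
    kl_gt = F.kl_div(g_log_p, t_log_p, reduction="batchmean", log_target=True)
    distill = (temperature ** 2) * (kl_tg + kl_gt)

    return task + alpha * distill
```

Under this sketch, tracking the symmetric KL term over training gives one simple signal of how closely the text and graph representations align or diverge on a given task.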
Submission Number: 27