HyperDG: Hyperbolic Representation Alignment for Robust Domain Generalization via Curvature Refinement

TMLR Paper7459 Authors

11 Feb 2026 (modified: 18 Feb 2026) · Under review for TMLR · CC BY 4.0
Abstract: Domain generalization often suffers from geometric inconsistencies in the representations learned across multiple source domains. Although recent approaches pursue flat minima or invariant features, they remain restricted to Euclidean space, overlooking the inherently curved nature of real data manifolds. We introduce HyperDG, a hyperbolic representation learning framework that models each domain as a Lorentz manifold with learnable negative curvature and enforces cross-domain consistency through a self-feedback mechanism alternating between local adaptation, tangent-space mapping, and global manifold adjustment, effectively unifying flat-minima consistency with non-Euclidean representation learning in a single optimization process. By jointly optimizing model parameters and manifold curvature, the framework learns a shared meta-manifold that preserves invariance across domains while maintaining hierarchical structure within each. Extensive experiments on standard domain generalization benchmarks show consistent improvements in accuracy, robustness, and out-of-distribution performance, demonstrating that embracing hyperbolic representation spaces, rather than flattening them, leads to geometry-consistent and domain-resilient generalization.
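The abstract alternates between points on a Lorentz manifold of learnable curvature and vectors in the tangent space at its origin. As an illustrative sketch only (not the authors' implementation; the function names and the NumPy setting are assumptions), the basic Lorentz-model operations for a curvature parameter c > 0 (manifold curvature -c) look like:

```python
import numpy as np

def minkowski_dot(u, v):
    """Lorentzian inner product <u, v>_L = -u0*v0 + sum_i ui*vi."""
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def origin(dim, c):
    """Origin of the Lorentz model of curvature -c: (1/sqrt(c), 0, ..., 0)."""
    o = np.zeros(dim + 1)
    o[0] = 1.0 / np.sqrt(c)
    return o

def expmap0(v, c):
    """Map a tangent vector v at the origin (with v[0] == 0) onto the manifold."""
    sc = np.sqrt(c)
    n = np.linalg.norm(v[1:])
    x = np.zeros_like(v)
    x[0] = np.cosh(sc * n) / sc
    if n > 0:
        x[1:] = np.sinh(sc * n) * v[1:] / (sc * n)
    return x

def logmap0(x, c):
    """Inverse of expmap0: pull a manifold point back to the origin's tangent space."""
    sc = np.sqrt(c)
    d = np.arccosh(np.clip(sc * x[0], 1.0, None)) / sc  # geodesic distance to origin
    v = np.zeros_like(x)
    n = np.linalg.norm(x[1:])
    if n > 0:
        v[1:] = d * x[1:] / n
    return v

def lorentz_distance(x, y, c):
    """Geodesic distance d(x, y) = arccosh(-c * <x, y>_L) / sqrt(c)."""
    return np.arccosh(np.clip(-c * minkowski_dot(x, y), 1.0, None)) / np.sqrt(c)
```

Every point satisfies the constraint <x, x>_L = -1/c, so shrinking or growing c reshapes the manifold without changing the embedding dimension; this is what makes curvature a differentiable quantity that can be optimized jointly with the model parameters, as the abstract describes.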
Submission Type: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Rémi_Flamary1
Submission Number: 7459