Sparse hyperbolic representation learning

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: metric learning, kernel learning, and sparse coding
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: sparse learning, hyperbolic space, Cartan-Hadamard norm
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose regularization and optimization methods for sparse learning in hyperbolic space.
Abstract: Minimizing the space complexity of entity representations without loss of information makes data science procedures computationally efficient and effective. For entities with tree structure, hyperbolic-space-based representation learning (HSBRL) has successfully reduced the space complexity of representations by using low-dimensional space. Nevertheless, it has not minimized the space complexity of each individual representation, since it uses the same dimension for all representations rather than selecting the best dimension for each one. This paper, for the first time, constructs a sparse learning scheme that minimizes the dimension of each representation in HSBRL. The most significant difficulty is that we cannot construct a well-defined sparse learning scheme for HSBRL based on a coordinate system since, unlike in linear space, hyperbolic space has no canonical coordinate system that reflects its geometric structure perfectly; forcibly applying a linear sparse learning method on a coordinate system of hyperbolic space causes non-uniform sparsity. Another difficulty is that existing Riemannian gradient descent cannot reach a sparse solution, since the algorithm oscillates on the non-smooth objective functions that are essential in sparse learning. To overcome these issues, we geometrically define sparseness and sparse regularization in hyperbolic space, for the first time, achieving geometrically uniform sparsity. We also propose the first optimization algorithm that avoids the oscillation problem and obtains sparse representations in hyperbolic space, based on a geometric shrinkage-thresholding idea.
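The abstract does not give the algorithm's details. As an illustration only, here is a minimal sketch of one natural geometric shrinkage-thresholding step in the Poincaré ball model: soft-thresholding a point's hyperbolic distance to the origin along the geodesic through the origin, by analogy with the linear soft-thresholding operator in ISTA. The choice of model, the origin-centred shrinkage, and all function names are our assumptions, not the paper's method.

```python
import numpy as np

def poincare_dist_to_origin(x):
    # Hyperbolic distance from the origin in the Poincaré ball:
    # d(0, x) = 2 * artanh(||x||)  (curvature -1)
    return 2.0 * np.arctanh(np.linalg.norm(x))

def geodesic_shrink(x, lam):
    """Soft-threshold the hyperbolic distance of x to the origin by lam,
    moving x along the geodesic through the origin (a Euclidean ray in
    the Poincaré ball). Points closer than lam collapse exactly to the
    origin, which is what makes exact sparsity attainable."""
    r = np.linalg.norm(x)
    if r == 0.0:
        return x
    d = 2.0 * np.arctanh(r)          # current hyperbolic distance
    d_new = max(d - lam, 0.0)        # soft-thresholded distance
    r_new = np.tanh(d_new / 2.0)     # back to a Euclidean norm in the ball
    return x * (r_new / r)
```

A shrinkage step like this, interleaved with a Riemannian gradient step, would mimic proximal gradient descent; the paper's actual operator may differ, e.g., by thresholding per-coordinate geometric components rather than the whole radial distance.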
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: pdf
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7200