Keywords: Generative Recommendation, Semantic ID, Hyperbolic Representation
TL;DR: Our paper presents Hyperbolic RQ-VAE, a hyperbolic-space indexing framework that aligns LLM-based generative recommendation with long-tailed item catalogs, significantly boosting performance over Euclidean baselines, especially for tail items.
Abstract: Sequential recommender systems model user behavior as item‑ID sequences, while recent generative methods cast recommendation as a language modeling task using large language models (LLMs).
While this paradigm incorporates rich textual semantics, it creates a fundamental mismatch:
LLMs operate on text tokens, whereas recommender systems depend on discrete item indices. This misalignment often leads to hallucinations in generative recommendations.
Existing methods attempt to bridge this gap by learning item vocabularies in Euclidean space, but they struggle to model the inherent long-tail distribution of real-world catalogs, where a small number of head items dominate and a vast number of tail items reflect users' niche preferences.
To address this issue, we introduce Hyperbolic Residual-Quantized Variational AutoEncoder (HypRQ-VAE), the first framework to learn item indexing in hyperbolic space.
HypRQ-VAE leverages the unique properties of hyperbolic geometry, whose exponential volume expansion naturally accommodates the power-law structure of user-item interactions. This allows the model to encode rich textual semantics while preserving the representational fidelity of sparse, long-tail items.
Experiments on three benchmark datasets show that HypRQ-VAE significantly improves recommendation performance, particularly for tail items.
Our analysis attributes these gains to the superior capacity of hyperbolic space to model item hierarchies and sparsity in generative recommendation. Our code and data are available at: \url{https://anonymous.4open.science/r/HypRQ-VAE-6C5B}.
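To make the indexing idea concrete, below is a minimal, illustrative sketch of residual quantization on the Poincaré ball; it is not the HypRQ-VAE implementation linked above. The curvature value, the choice to quantize in the tangent space at the origin, and the helper names (expmap0, logmap0, residual_quantize) are all assumptions made purely for illustration.

```python
import torch

# Illustrative sketch only: one plausible way to perform residual quantization
# in hyperbolic space (Poincare ball); NOT the authors' HypRQ-VAE implementation.
# Assumptions: curvature c = 1.0, quantization done in the tangent space at the
# origin via logarithmic/exponential maps, plain tensor codebooks.

def expmap0(v, c=1.0, eps=1e-6):
    """Exponential map at the origin of the Poincare ball."""
    sqrt_c = c ** 0.5
    norm = v.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def logmap0(x, c=1.0, eps=1e-6):
    """Logarithmic map at the origin of the Poincare ball."""
    sqrt_c = c ** 0.5
    norm = x.norm(dim=-1, keepdim=True).clamp_min(eps)
    return torch.atanh((sqrt_c * norm).clamp(max=1 - eps)) * x / (sqrt_c * norm)

def residual_quantize(z_hyp, codebooks, c=1.0):
    """Residually quantize a hyperbolic embedding into a tuple of code indices.

    z_hyp:     (batch, dim) points on the Poincare ball (encoder output).
    codebooks: list of (K, dim) tensors, one codebook per quantization level.
    Returns the per-level code indices and the reconstructed hyperbolic point.
    """
    residual = logmap0(z_hyp, c)                     # work in the tangent space at the origin
    codes, quantized = [], torch.zeros_like(residual)
    for codebook in codebooks:
        dists = torch.cdist(residual, codebook)      # (batch, K) distances to codewords
        idx = dists.argmin(dim=-1)                   # nearest codeword per item
        selected = codebook[idx]                     # (batch, dim)
        codes.append(idx)
        quantized = quantized + selected
        residual = residual - selected               # pass the residual to the next level
    return codes, expmap0(quantized, c)              # map the reconstruction back to the ball

# Toy usage: 3-level semantic IDs for 8 items, 32-dim embeddings, 256 codes per level.
z = expmap0(0.1 * torch.randn(8, 32))
books = [0.05 * torch.randn(256, 32) for _ in range(3)]
codes, z_hat = residual_quantize(z, books)
print([c.shape for c in codes], z_hat.shape)
```

In this sketch, each item is mapped to a short tuple of code indices (its semantic ID), and the exponential map pushes the reconstructed embedding back onto the Poincaré ball, where volume grows exponentially with radius and can better accommodate long-tail structure.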
Primary Area: generative models
Submission Number: 13967