A Universal Framework for Compressing Embeddings in CTR Prediction

Published: 11 Feb 2025 · Last Modified: 22 Jan 2026 · OpenReview Archive Direct Upload · CC BY 4.0
Abstract: Accurate click-through rate (CTR) prediction is vital for online advertising and recommendation systems. Recent deep learning advancements have improved the ability to capture feature interactions and understand user interests. However, optimizing the embedding layer often remains overlooked. Embedding tables, which represent categorical and sequential features, can become excessively large, surpassing GPU memory limits and necessitating storage in CPU memory. This results in high memory consumption and increased latency due to frequent GPU-CPU data transfers. To tackle these challenges, we introduce a Model-agnostic Embedding Compression (MEC) framework that compresses embedding tables by quantizing pre-trained embeddings, without sacrificing recommendation quality. Our approach consists of two stages: first, we apply popularity-weighted regularization to balance code distribution between high- and low-frequency features. Then, we integrate a contrastive learning mechanism to ensure a uniform distribution of quantized codes, enhancing the distinctiveness of embeddings. Experiments on three datasets reveal that our method reduces memory usage by over 50x while maintaining or improving recommendation performance compared to existing models. The implementation code is accessible in our project repository https://github.com/USTC-StarTeam/MEC.
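To make the memory argument concrete, the following is a minimal sketch of the general idea behind embedding quantization: each pre-trained embedding vector is replaced by the index of its nearest entry in a small shared codebook, so the table stores cheap integer codes instead of full float vectors. This is an illustrative toy, not the authors' MEC implementation; all sizes, names, and the centroid-sampling step are hypothetical assumptions.

```python
import random

random.seed(0)
NUM_FEATURES, DIM, NUM_CODES = 2000, 16, 64   # hypothetical table dimensions

# Hypothetical "pre-trained" embedding table: one DIM-dim float vector per feature.
table = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(NUM_FEATURES)]

# Toy codebook: sample NUM_CODES rows as centroids (a real system would learn
# these, e.g. with k-means or a trained quantizer as in vector quantization).
centroids = random.sample(table, NUM_CODES)

def nearest(v):
    """Index of the centroid closest to v in squared Euclidean distance."""
    return min(range(NUM_CODES),
               key=lambda j: sum((a - b) ** 2 for a, b in zip(v, centroids[j])))

# Compressed table: one small integer code per feature instead of DIM floats.
codes = [nearest(v) for v in table]

# Rough memory estimate: 4-byte floats vs. 1-byte codes plus the shared codebook.
orig_bytes = NUM_FEATURES * DIM * 4
compressed_bytes = NUM_FEATURES * 1 + NUM_CODES * DIM * 4
ratio = orig_bytes / compressed_bytes
```

With these toy sizes the codes-plus-codebook representation is already about 20x smaller than the float table, and the ratio grows with table size because the codebook cost is amortized across all features; the paper's reported 50x reduction additionally depends on its regularization and contrastive-learning stages, which this sketch omits.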