Integrating Global Features into Neural Collaborative Filtering

Published: 01 Jan 2022, Last Modified: 15 May 2025 · KSEM (2) 2022 · CC BY-SA 4.0
Abstract: Recently, deep learning has been widely applied in recommender systems with great success, the most representative approach being deep neural networks for collaborative filtering. However, the input to such models is usually a very sparse one-hot encoded vector of user and item IDs. This makes it difficult for the model to effectively capture global feature interactions between users and items. Moreover, it increases training difficulty, making the model prone to falling into local optima. Therefore, this paper proposes a two-stage model, Integrating Global Features into Neural Collaborative Filtering (GFNCF). First, an AutoEncoder with a sparsity constraint is used to extract the global features of users and items. Then, the extracted global features are integrated into the neural collaborative filtering framework as auxiliary information. This alleviates the sparse-input problem and incorporates additional auxiliary features to improve model learning. Extensive experiments on several publicly available datasets demonstrate the effectiveness of the proposed GFNCF model.
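The two-stage pipeline described in the abstract can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the weights are random placeholders rather than trained parameters, the layer sizes and the concatenation of the autoencoder's hidden code into the NCF MLP are assumptions about the architecture, and the sparsity penalty on the autoencoder (which the paper applies during training) is only noted in a comment.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, n_hidden, n_embed = 100, 50, 16, 8

# Stage 1 (assumed form): a sparse autoencoder compresses each user's
# sparse interaction row into a dense "global feature" vector. Weights
# here are random placeholders; in the paper they would be trained with
# a sparsity constraint on the hidden activations.
W_enc = rng.normal(scale=0.1, size=(n_items, n_hidden))

def encode_user(rating_row):
    # Sigmoid hidden activations serve as the user's global features.
    return 1.0 / (1.0 + np.exp(-(rating_row @ W_enc)))

# Stage 2 (assumed form): an NCF-style branch embeds user/item IDs, and
# the global features are concatenated as auxiliary MLP input.
user_emb = rng.normal(size=(n_users, n_embed))
item_emb = rng.normal(size=(n_items, n_embed))
W1 = rng.normal(scale=0.1, size=(2 * n_embed + n_hidden, 8))
w2 = rng.normal(scale=0.1, size=8)

def predict(u, i, user_global):
    # Concatenate ID embeddings with the auxiliary global features.
    x = np.concatenate([user_emb[u], item_emb[i], user_global])
    h = np.maximum(x @ W1, 0.0)            # one ReLU MLP layer
    return 1.0 / (1.0 + np.exp(-(h @ w2)))  # interaction score in (0, 1)

# A toy sparse interaction matrix standing in for real rating data.
ratings = (rng.random((n_users, n_items)) > 0.95).astype(float)
g = encode_user(ratings[0])
score = predict(0, 3, g)
```

The key design point the abstract argues for is visible in `predict`: the dense autoencoder code `g` supplements the sparse ID-based input, so the MLP no longer relies on one-hot vectors alone.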
