Automatic Feature Selection By One-Shot Neural Architecture Search In Recommendation Systems

Published: 01 Jan 2023, Last Modified: 29 Aug 2024 · WWW 2023 · CC BY-SA 4.0
Abstract: Feature selection is crucial in large-scale recommendation systems, as it not only reduces the computational cost but also improves recommendation efficiency. Most existing works rank the features and then select the top-k ones as the final feature subset. However, they assess feature importance individually and ignore the interrelationships between features. Consequently, multiple highly correlated features may be selected simultaneously, resulting in sub-optimal results. In this work, we address this problem by proposing an AutoML-based feature selection framework that automatically searches for the optimal feature subset. Specifically, we first embed the search space into a weight-sharing Supernet. Then, a two-stage neural architecture search method is employed to evaluate feature quality. In the first stage, a carefully designed sampling method that accounts for feature convergence fairness is applied to train the Supernet. In the second stage, a reinforcement learning method is used to search for the optimal feature subset efficiently. Experimental results on two real-world datasets demonstrate the superior performance of the new framework over other solutions. Our proposed method obtains a significant improvement with a 20% reduction in the number of features on the Criteo dataset. Further validation experiments demonstrate the effectiveness and robustness of the framework.
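
The abstract outlines a two-stage pipeline: a weight-sharing Supernet trained under fairly sampled feature subsets, followed by a search over subsets using the shared weights. The sketch below is a minimal illustration of that general idea, not the authors' implementation; the class and function names (`FeatureSelectionSupernet`, `sample_fair_subset`), the model sizes, and the uniform sampling rule are all assumptions, and the second-stage reinforcement-learning controller is replaced here by simple random scoring.

```python
# Hypothetical sketch of one-shot feature selection with a weight-sharing Supernet.
# Each feature field's embedding is gated by a 0/1 mask, so every candidate
# feature subset reuses the same underlying parameters.
import torch
import torch.nn as nn

class FeatureSelectionSupernet(nn.Module):
    def __init__(self, field_dims, embed_dim=16):
        super().__init__()
        # One embedding table per feature field; shared across all subsets.
        self.embeddings = nn.ModuleList(
            [nn.Embedding(d, embed_dim) for d in field_dims]
        )
        self.mlp = nn.Sequential(
            nn.Linear(len(field_dims) * embed_dim, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x, mask):
        # x: (batch, num_fields) integer ids; mask: (num_fields,) 0/1 gates.
        embs = [emb(x[:, i]) * mask[i] for i, emb in enumerate(self.embeddings)]
        return torch.sigmoid(self.mlp(torch.cat(embs, dim=1))).squeeze(1)

def sample_fair_subset(num_fields, keep):
    # Stage 1 stand-in: sample subsets uniformly so every field is trained a
    # comparable number of times (the paper's exact fairness rule is not given here).
    mask = torch.zeros(num_fields)
    mask[torch.randperm(num_fields)[:keep]] = 1.0
    return mask

if __name__ == "__main__":
    field_dims = [100, 50, 30, 20, 10]            # hypothetical vocabulary sizes
    model = FeatureSelectionSupernet(field_dims)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    bce = nn.BCELoss()
    # Stage 1: train the Supernet under randomly sampled feature masks.
    for _ in range(200):
        x = torch.stack([torch.randint(d, (256,)) for d in field_dims], dim=1)
        y = torch.randint(0, 2, (256,)).float()   # synthetic click labels
        mask = sample_fair_subset(len(field_dims), keep=4)
        loss = bce(model(x, mask), y)
        opt.zero_grad(); loss.backward(); opt.step()
    # Stage 2 (simplified): score a candidate subset with the frozen shared
    # weights; the paper instead uses a reinforcement-learning search here.
    with torch.no_grad():
        cand = sample_fair_subset(len(field_dims), keep=4)
        x = torch.stack([torch.randint(d, (256,)) for d in field_dims], dim=1)
        y = torch.randint(0, 2, (256,)).float()
        print("candidate mask:", cand.tolist(), "val loss:", bce(model(x, cand), y).item())
```

The key property this sketch tries to convey is weight sharing: once the Supernet is trained, any feature subset can be evaluated by applying its mask, without retraining a model per candidate, which is what makes the second-stage search tractable.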