MUSE: A Trustworthy Vertical Federated Feature Selection Framework

Published: 2025 · Last Modified: 08 Jan 2026 · IEEE Trans. Comput. Soc. Syst. 2025 · CC BY-SA 4.0
Abstract: Vertical federated feature selection can select effective features and avoid overfitting in vertical federated learning. However, existing privacy-preserving techniques for vertical federated feature selection are limited to selecting task-related features and cannot reduce redundant features among clients, resulting in performance loss. This article introduces a mutual information-based federated feature selection (MUSE) framework to address these issues. In the MUSE framework, cross-device feature–feature and feature–class correlations are estimated by our privacy-preserving variant of mutual information, called federated mutual information (FMI). To compute FMI, we propose the anonymous bin matching (ABM) algorithm, which uses only the intersection sizes of bins rather than the bin elements themselves, thereby avoiding sample-ID leakage. With FMI, MUSE supports the minimized-dependency feature selection criterion for removing redundant features. Additionally, we propose local feature preselection to reduce the computation cost of FMI; we prove theoretically and verify experimentally that it closely approximates the global optimum under certain constraints. We evaluate the effectiveness of the MUSE framework on a variety of datasets. The experimental results demonstrate that our methods consistently outperform state-of-the-art federated feature selection methods on most datasets. Moreover, our method shows potential on multimodal data as well.
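The core observation behind FMI and ABM, that the joint counts needed for a mutual-information estimate are exactly the intersection sizes of the per-bin sample sets, not the sample IDs themselves, can be illustrated with a plaintext sketch. This is my own toy version under assumed naming (`mutual_information_from_bins` is not from the paper), and it omits the privacy-preserving protocol the paper uses to obtain those intersection cardinalities:

```python
from collections import Counter
from math import log2

def mutual_information_from_bins(bins_a, bins_b):
    """Estimate I(A; B) in bits from two sample-aligned lists of bin labels.

    The joint count n(a, b) equals the size of the intersection between
    the sample set in bin a on one party and the sample set in bin b on
    the other party, so only those sizes are needed to compute MI.
    """
    n = len(bins_a)
    joint = Counter(zip(bins_a, bins_b))   # n(a, b): per-pair intersection size
    marg_a = Counter(bins_a)               # n(a): bin sizes on party A
    marg_b = Counter(bins_b)               # n(b): bin sizes on party B
    mi = 0.0
    for (a, b), nab in joint.items():
        p_ab = nab / n
        # p_ab * log2( p_ab / (p_a * p_b) ), rearranged to use raw counts
        mi += p_ab * log2(p_ab * n * n / (marg_a[a] * marg_b[b]))
    return mi

# Toy example: a binned feature on party A and binned labels on party B.
print(mutual_information_from_bins([0, 0, 1, 1], [0, 0, 1, 1]))  # → 1.0 (fully dependent)
print(mutual_information_from_bins([0, 1, 0, 1], [0, 0, 1, 1]))  # → 0.0 (independent)
```

In the actual framework, the two count vectors would live on different parties, and only the intersection cardinalities would be exchanged; the estimator itself is unchanged.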