Abstract: We consider a novel concept-learning and merging task, motivated by two use cases. The first is merging and compressing music playlists; the second is federated learning under data-privacy constraints. Both settings involve multiple learned concepts that must be merged and compressed into a single interpretable and accurate concept description. Our concept descriptions are logical formulae in conjunctive normal form (CNF), for which merging, i.e., disjoining, multiple CNFs may lead to very large concept descriptions. To keep the concepts interpretable, we compress them relative to a dataset. We propose a new method named CoWC (Compression Of Weighted Cnf) that approximates a CNF by exploiting techniques from itemset mining and inverse resolution. CoWC reduces the CNF size while also accounting for the F1-score w.r.t. the dataset. Our empirical evaluation shows that CoWC outperforms alternative compression approaches.
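To see why disjoining CNFs can blow up the concept description, recall the standard distributive identity (a well-known fact, stated here for clarity rather than taken from the paper): disjoining a CNF with $m$ clauses and a CNF with $n$ clauses yields up to $m \cdot n$ clauses,

$$\Big(\bigwedge_{i=1}^{m} C_i\Big) \vee \Big(\bigwedge_{j=1}^{n} D_j\Big) \;\equiv\; \bigwedge_{i=1}^{m} \bigwedge_{j=1}^{n} \big(C_i \vee D_j\big),$$

so merging $k$ concept descriptions multiplies clause counts, which motivates compressing the result.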