Towards Network Implementation of CBR: Case Study of a Neural Network K-NN Algorithm

Published: 01 Jan 2024, Last Modified: 04 Nov 2024 · ICCBR 2024 · CC BY-SA 4.0
Abstract: Recent research brings the strengths of neural networks to bear on CBR tasks such as similarity assessment and case adaptation. This paper advances that direction by implementing both retrieval and adaptation as a single neural network. Such an approach has multiple goals: from the perspective of CBR, it harmonizes the interaction between feature extraction, retrieval/similarity assessment, and case adaptation through end-to-end training; from the perspective of neural networks, a network implementing CBR processes ceases to be a black box and gains the natural interpretability of CBR. As a first step towards this goal, this paper presents the neural-network-based k-nearest neighbor (NN-kNN), a network architecture that can be interpreted as a k-NN method. Unlike other network architectures, NN-kNN's decisions can be fully explained in terms of surface features, feature/case weights, and nearest neighbors. It can be trained or fine-tuned using existing neural network methods. This study illustrates its feasibility and examines its strengths and limitations. The approach is evaluated on classification and regression tasks, comparing NN-kNN against a standard neural network and against k-NN models using state-of-the-art distance metric learning algorithms. In these tests, NN-kNN achieves equal or lower error than the other models while remaining fully interpretable as a k-NN method. The study also considers the limitations of NN-kNN and future directions for alleviating them.
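To make the core idea concrete, the following is a minimal, illustrative sketch of a differentiable k-NN prediction step in the spirit described by the abstract; it is not the paper's actual NN-kNN architecture. All names and the specific formulation (learnable feature weights scaling a squared distance, and a softmax over negative distances producing soft neighbor activations) are assumptions chosen for illustration:

```python
# Illustrative sketch only, NOT the paper's NN-kNN architecture.
# Idea: feature weights (which would be trainable by gradient descent)
# scale a squared distance; a softmax over negative distances yields
# soft neighbor activations that weight the stored case labels, so the
# prediction is explainable via features, weights, and nearest cases.
import numpy as np

def nn_knn_predict(x, cases, labels, feature_w, temperature=1.0):
    """Predict for query x as a soft-nearest-neighbor combination.

    x: (d,) query; cases: (n, d) case base; labels: (n,) targets;
    feature_w: (d,) nonnegative feature weights (learnable in practice).
    """
    diffs = cases - x                                # (n, d)
    dists = (feature_w * diffs ** 2).sum(axis=1)     # weighted sq. distances
    act = np.exp(-dists / temperature)               # closer case -> higher activation
    act /= act.sum()                                 # normalize to a distribution
    return float(act @ labels)                       # activation-weighted prediction

# Toy regression: the query sits inside the first cluster of cases,
# so its prediction is dominated by those cases' labels.
cases = np.array([[0.0], [0.1], [5.0], [5.1]])
labels = np.array([1.0, 1.0, 10.0, 10.0])
pred = nn_knn_predict(np.array([0.05]), cases, labels, np.ones(1))
```

Because every quantity in the forward pass (feature weights, per-case activations) is inspectable, the prediction can be traced back to specific neighbors, which is the kind of k-NN-style interpretability the abstract claims for NN-kNN.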