Abstract: Continual visual search has attracted increasing attention due to its practicality. It aims to continuously extract embeddings for newly emerging gallery data while keeping previously generated embeddings unchanged. The primary challenge lies in preserving embedding compatibility between new and past data. Existing methods concentrate mainly on enforcing backward embedding consistency. However, they often overlook semantic drift, which arises from overlapping semantics between old-class and new-class data and hinders further progress. To address this problem, we propose novel feature refinement and calibration modules for continual visual search. Specifically, the feature refinement module enhances features so that they become more discriminative across classes. The feature calibration module then further eliminates overlapping semantics, avoiding confusion between previously and newly learned knowledge. Extensive experiments demonstrate the effectiveness of both the feature refinement and feature calibration modules, and our approach consistently outperforms existing state-of-the-art methods.
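To make the backward-compatible retrieval setting concrete, the sketch below (PyTorch, with illustrative class and function names that are not taken from this work) shows the scenario the abstract describes: old gallery embeddings remain frozen, while features from the new model pass through a hypothetical refinement block and a calibration projection before being compared against them. It is a minimal illustration of the problem setup under these assumptions, not the proposed implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class RefineCalibrateHead(nn.Module):
    """Illustrative head: a refinement block sharpens class-discriminative
    structure, and a calibration projection keeps new features comparable
    to the frozen old-gallery embedding space (hypothetical design)."""

    def __init__(self, dim: int):
        super().__init__()
        # Assumed refinement block: a small residual MLP.
        self.refine = nn.Sequential(
            nn.Linear(dim, dim), nn.ReLU(inplace=True), nn.Linear(dim, dim)
        )
        # Assumed calibration block: a linear map toward the old space.
        self.calibrate = nn.Linear(dim, dim)

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        refined = feat + self.refine(feat)                    # refinement step
        return F.normalize(self.calibrate(refined), dim=-1)   # calibration + L2 norm


def retrieve(query_feat: torch.Tensor, gallery: torch.Tensor) -> torch.Tensor:
    """Cosine-similarity search of new queries against a frozen old gallery."""
    query_feat = F.normalize(query_feat, dim=-1)
    return (query_feat @ gallery.t()).argmax(dim=-1)


if __name__ == "__main__":
    dim, n_gallery, n_query = 128, 1000, 8
    # Previously generated gallery embeddings are kept unchanged.
    old_gallery = F.normalize(torch.randn(n_gallery, dim), dim=-1)
    head = RefineCalibrateHead(dim)
    # New-model features must remain searchable against the old gallery.
    new_queries = head(torch.randn(n_query, dim))
    print(retrieve(new_queries, old_gallery).shape)  # torch.Size([8])
```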