Neural Embeddings for Nearest Neighbor Search Under Edit Distance

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission
TL;DR: We propose a learning-based edit distance embedding method, which improves over prior data-independent approaches.
Abstract: The edit distance between two sequences is an important metric with many applications. Its drawback, however, is the high computational cost of many basic problems involving it, such as nearest neighbor search. A natural approach to overcoming this issue is to embed the sequences into a vector space such that the geometric distance in the target space approximates the edit distance in the original space. However, the known edit distance embedding algorithms, such as Chakraborty et al. (2016), construct embeddings that are data-independent, i.e., they do not exploit any structure of the embedded set of strings. In this paper, we propose an alternative approach, which learns the embedding function according to the data distribution. Our experiments show that the new algorithm has much better empirical performance than prior data-independent methods.
Keywords: Embedding, Edit Distance, Nearest Neighbor Search, Learning-Augmented Algorithm
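The sketch below illustrates the general idea described in the abstract: train an embedding function so that Euclidean distance between embedded strings approximates their edit distance. The architecture, loss, alphabet, and training data are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch: learn f(.) such that ||f(x) - f(y)|| ~ edit_distance(x, y).
# All design choices here (1D CNN, MSE regression loss, random DNA-like strings)
# are assumptions for illustration only.
import random
import torch
import torch.nn as nn

ALPHABET = "ACGT"   # assumed alphabet
SEQ_LEN = 20        # assumed fixed sequence length
EMB_DIM = 16        # assumed embedding dimension

def edit_distance(a: str, b: str) -> int:
    """Standard dynamic-programming Levenshtein distance."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
    return dp[len(b)]

def one_hot(s: str) -> torch.Tensor:
    """One-hot encode a string as a (|ALPHABET|, SEQ_LEN) tensor."""
    t = torch.zeros(len(ALPHABET), SEQ_LEN)
    for i, c in enumerate(s):
        t[ALPHABET.index(c), i] = 1.0
    return t

class EditEmbedder(nn.Module):
    """A small 1D CNN mapping a one-hot string to an embedding vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(len(ALPHABET), 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * SEQ_LEN, EMB_DIM),
        )

    def forward(self, x):
        return self.net(x)

def random_string() -> str:
    return "".join(random.choice(ALPHABET) for _ in range(SEQ_LEN))

model = EditEmbedder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(200):
    # Sample random string pairs and regress embedded distances onto edit distances.
    xs = [random_string() for _ in range(32)]
    ys = [random_string() for _ in range(32)]
    target = torch.tensor([float(edit_distance(x, y)) for x, y in zip(xs, ys)])
    ex = model(torch.stack([one_hot(x) for x in xs]))
    ey = model(torch.stack([one_hot(y) for y in ys]))
    pred = (ex - ey).norm(dim=1)   # Euclidean distance in embedding space
    loss = nn.functional.mse_loss(pred, target)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Once trained, such an embedding lets nearest neighbor queries under edit distance be answered approximately with standard vector-space indexes, which is the motivation stated in the abstract.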