Do Sentence Transformers Learn Quasi-Geospatial Concepts from General Text?

Published: 01 Jan 2024 · Last Modified: 12 Aug 2025 · CoRR 2024 · License: CC BY-SA 4.0
Abstract: Sentence transformers are language models designed for semantic search. This study investigates the capacity of sentence transformers, fine-tuned on general question-answering datasets for asymmetric semantic search, to associate descriptions of human-generated routes across Great Britain with queries commonly used to describe hiking experiences. We find that these models show some zero-shot ability to understand quasi-geospatial concepts, such as route type and difficulty, suggesting their potential utility in route recommendation systems.
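The paper itself does not include code, but a rough sketch of the asymmetric semantic search setup it describes might look like the following. The checkpoint (multi-qa-mpnet-base-dot-v1), the example route descriptions, and the hiking-style query are illustrative assumptions rather than the models or data evaluated in the study; the snippet simply ranks passages against a short query with a QA-fine-tuned sentence transformer from the open-source sentence-transformers library.

```python
# Minimal sketch (not the paper's code) of asymmetric semantic search over
# route descriptions using the sentence-transformers library.
from sentence_transformers import SentenceTransformer, util

# QA-tuned checkpoint intended for asymmetric search (short query vs. longer passage).
# Assumed for illustration; not necessarily the model used in the study.
model = SentenceTransformer("multi-qa-mpnet-base-dot-v1")

# Hypothetical route descriptions standing in for the Great Britain route corpus.
routes = [
    "A gentle 5 km circular walk along the coast path with sea views and no steep climbs.",
    "A strenuous 20 km out-and-back mountain route with 1,200 m of ascent and some scrambling.",
    "A flat linear canal-side stroll through the city, suitable for pushchairs.",
]

# A hiking-style query of the kind the study associates with route descriptions.
query = "an easy circular hike with nice views"

# Encode query and passages, then rank routes by dot-product similarity.
route_embs = model.encode(routes, convert_to_tensor=True)
query_emb = model.encode(query, convert_to_tensor=True)
scores = util.dot_score(query_emb, route_embs)[0]

for score, route in sorted(zip(scores.tolist(), routes), reverse=True):
    print(f"{score:.2f}  {route}")
```

Dot-product scoring is used here because checkpoints in the multi-qa-*-dot family are trained with that similarity; a cosine-trained checkpoint would use util.cos_sim instead.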