Structured Lexical Similarity via Convolution Kernels on Dependency Trees

EMNLP 2011 (modified: 10 Nov 2022)
Abstract: A central topic in natural language processing is the design of lexical and syntactic features suitable for the target application. In this paper, we study convolution dependency tree kernels for the automatic engineering of syntactic and semantic patterns that exploit lexical similarities. We define efficient and powerful kernels for measuring the similarity between dependency structures whose lexical nodes have partly or completely different surface forms. Experiments with such kernels on question classification show unprecedented results, e.g., a 41% error reduction over the former state of the art. Additionally, semantic role classification confirms the benefit of semantic smoothing for dependency kernels.
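The core idea described in the abstract, replacing exact lexical matching in a convolution tree kernel with a graded word-similarity score, can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the `Node` class, the toy synonym table, and the decay factor `lam` are all assumptions introduced here; a real system would use a resource such as WordNet or embedding similarity in place of `sim`.

```python
# Hedged sketch of a "smoothed" convolution kernel over dependency trees:
# the match between two lexical nodes contributes a similarity score in
# [0, 1] rather than a hard 0/1 identity check, so trees with different
# surface forms (e.g. "buy" vs. "purchase") can still match softly.

from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    children: list = field(default_factory=list)

# Toy lexical similarity: exact match -> 1.0, a listed synonym pair -> 0.6,
# anything else -> 0.0. (The table and values are illustrative.)
SYNONYMS = {frozenset({"buy", "purchase"})}

def sim(a: str, b: str) -> float:
    if a == b:
        return 1.0
    if frozenset({a, b}) in SYNONYMS:
        return 0.6
    return 0.0

def delta(n1: Node, n2: Node, lam: float = 0.4) -> float:
    """Contribution of a pair of rooted subtrees, subset-tree style:
    the (decayed) lexical similarity of the roots times the product of
    matches over positionally aligned children."""
    s = sim(n1.label, n2.label)
    if s == 0.0:
        return 0.0
    score = lam * s
    if len(n1.children) == len(n2.children):
        for c1, c2 in zip(n1.children, n2.children):
            score *= 1.0 + delta(c1, c2, lam)
    return score

def nodes(t: Node):
    yield t
    for c in t.children:
        yield from nodes(c)

def kernel(t1: Node, t2: Node) -> float:
    """Convolution kernel: sum delta over all node pairs of the two trees."""
    return sum(delta(n1, n2) for n1 in nodes(t1) for n2 in nodes(t2))
```

A quick check of the intended behavior: a tree headed by "purchase" scores higher against a tree headed by "buy" than one headed by an unrelated verb does, because the smoothed root match propagates the shared argument structure.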