Abstract: Abstract Meaning Representation (AMR) is a graph-based semantic representation for natural language. It relies on concepts and their relations to transcend individual words and distill the meaning of English sentences. In this work, we propose a solution for identifying both the concepts and their associated relations as a post-processing step of a transition-based parsing system, and we incorporate these contributions into an existing system. Furthermore, we enhance the LSTM transition learning component by optimizing the input features and extending the predicted output.