Contextualized word embeddings enhanced event temporal relation extraction for story understanding
Abstract: Learning causal and temporal relationships between events is an important step towards deeper story and commonsense understanding. Although there are abundant datasets annotated with event relations for story comprehension, many have no empirical results associated with them. In this work, we establish strong baselines for event temporal relation extraction on two under-explored story narrative datasets: Richer Event Description (RED) and Causal and Temporal Relation Scheme (CaTeRS). To the best of our knowledge, these are the first results reported on these two datasets. We demonstrate that neural network-based models can outperform some strong traditional models based on linguistic features. We also conduct comparative studies to show the contribution of adopting contextualized word embeddings (BERT) for event temporal relation extraction from stories, and we offer detailed analyses to better understand the results.
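Below is a minimal sketch of how contextualized BERT embeddings can feed a pairwise event temporal relation classifier. It is an illustration, not the paper's actual architecture: the label set (LABELS), the EventPairClassifier head, and the hand-picked event-trigger token positions are all assumptions introduced here.

```python
# Sketch: classify the temporal relation between two events using the
# contextual BERT embeddings of their trigger tokens. Labels, model head,
# and trigger positions are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

LABELS = ["BEFORE", "AFTER", "OVERLAP", "NONE"]  # hypothetical label set

class EventPairClassifier(nn.Module):
    def __init__(self, num_labels=len(LABELS)):
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.encoder.config.hidden_size
        # Classify the concatenated contextual embeddings of the two triggers.
        self.classifier = nn.Linear(2 * hidden, num_labels)

    def forward(self, input_ids, attention_mask, e1_idx, e2_idx):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        tokens = out.last_hidden_state            # (batch, seq_len, hidden)
        rows = torch.arange(tokens.size(0))
        e1 = tokens[rows, e1_idx]                 # embedding of event 1 trigger
        e2 = tokens[rows, e2_idx]                 # embedding of event 2 trigger
        return self.classifier(torch.cat([e1, e2], dim=-1))

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = EventPairClassifier()

text = "She finished dinner before she left for the station."
enc = tokenizer(text, return_tensors="pt")
# Positions of the triggers "finished" and "left" after tokenization
# ([CLS] she finished dinner before she left ...), found by inspection here.
e1_idx = torch.tensor([2])
e2_idx = torch.tensor([6])
logits = model(enc["input_ids"], enc["attention_mask"], e1_idx, e2_idx)
print(LABELS[logits.argmax(dim=-1).item()])  # untrained head, so output is arbitrary
```

In practice the classifier head would be trained on annotated event pairs (e.g., from RED or CaTeRS); the point of the sketch is that each event trigger receives a context-dependent vector from BERT rather than a static word embedding.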