Unsupervised Domain Adaptation for Event Detection via Meta Self-Paced Learning

Anonymous

16 Jan 2022 (modified: 05 May 2023) · ACL ARR 2022 January Blind Submission · Readers: Everyone
Abstract: A shift in data distribution can significantly degrade the performance of a model that detects important events in text. Recent methods addressing unsupervised domain adaptation for the event detection task typically extract domain-invariant representations by balancing multiple objectives that align the feature spaces of the source and target domains. While effective, these methods become impractical as large-scale language models keep growing in size to achieve optimal performance. To this end, we propose to leverage a meta-learning framework to train a neural network-based self-paced learning procedure in an end-to-end manner. Our method, called Meta Self-Paced Domain Adaptation (MSP-DA), effectively tunes domain-specific hyperparameters, including learning schedules, sample weights, and objective-balancing coefficients, simultaneously throughout the learning process by imitating the train-test split based on the difficulty of the source domain's samples. Extensive experiments demonstrate that our framework substantially improves performance on target domains, surpassing state-of-the-art approaches. Detailed analyses validate our method and provide insight into how each domain affects the learned hyperparameters.
Paper Type: long
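Since only the abstract is available here, the sketch below illustrates the general idea behind meta self-paced reweighting: learning per-sample weights by differentiating through a virtual model update and evaluating it on a held-out split that imitates the train-test shift. It is not the authors' MSP-DA implementation; the names meta_reweight_step, weight_net, inner_lr, and outer_lr are hypothetical, and it assumes PyTorch >= 2.0 for torch.func.functional_call.

```python
import torch
import torch.nn.functional as F


def meta_reweight_step(model, weight_net, train_batch, meta_batch,
                       inner_lr=1e-3, outer_lr=1e-4):
    """One meta step that learns per-sample weights on a held-out 'meta' split.

    Illustrative sketch only, not the paper's MSP-DA code.
    """
    x, y = train_batch            # source-domain training examples
    x_meta, y_meta = meta_batch   # held-out split imitating the train-test shift

    # 1) Per-sample losses on the training batch.
    losses = F.cross_entropy(model(x), y, reduction="none")

    # 2) A small network maps each loss value to a weight in (0, 1),
    #    acting as a learned self-paced curriculum over sample difficulty.
    weights = torch.sigmoid(weight_net(losses.detach().unsqueeze(1))).squeeze(1)

    # 3) Virtual (inner) SGD update of the model with the weighted loss,
    #    keeping the graph so we can differentiate through the update.
    weighted_loss = (weights * losses).mean()
    grads = torch.autograd.grad(weighted_loss, list(model.parameters()),
                                create_graph=True)
    names = [n for n, _ in model.named_parameters()]
    updated = {n: p - inner_lr * g
               for n, p, g in zip(names, model.parameters(), grads)}

    # 4) Evaluate the virtually updated model on the meta split and
    #    backpropagate through the inner step to update the weighting network.
    meta_logits = torch.func.functional_call(model, updated, (x_meta,))
    meta_loss = F.cross_entropy(meta_logits, y_meta)

    weight_net.zero_grad()
    meta_loss.backward()
    with torch.no_grad():
        for p in weight_net.parameters():
            if p.grad is not None:
                p -= outer_lr * p.grad
    model.zero_grad()  # discard gradients that leaked onto the base model
    return meta_loss.item()
```

In this sketch, weight_net could be as small as a two-layer MLP mapping a scalar loss to a weight; the full method described in the abstract additionally learns learning schedules and objective-balancing coefficients, which are omitted here.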