Example-based Hypernetworks for Multi-source Adaptation to Unseen Domains

Anonymous

16 Oct 2022 (modified: 05 May 2023) · ACL ARR 2022 October Blind Submission
Keywords: hypernetworks, multi-source adaptation, domain adaptation, out-of-distribution generalization, prompt generation, sentiment classification, natural language inference
Abstract: While Natural Language Processing (NLP) algorithms keep reaching unprecedented milestones, out-of-distribution generalization remains challenging. In this paper, we address the problem of multi-source adaptation to unknown domains: given labeled data from multiple source domains, we aim to generalize to data drawn from target domains that are unknown to the algorithm at training time. We present an algorithmic framework based on example-based Hypernetwork adaptation: given an input example, a T5 encoder-decoder first generates a unique signature that embeds this example in the semantic space of the source domains, and this signature is then fed into a Hypernetwork that generates the weights of the task classifier. In an advanced version of our model, the learned signature also serves to improve the representation of the input example. In experiments on two tasks, sentiment classification and natural language inference, across 29 adaptation settings, our algorithms substantially outperform existing algorithms for this adaptation setup. To the best of our knowledge, this is the first time Hypernetworks are applied to adaptation to unknown domains.
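To make the core mechanism described in the abstract concrete, below is a minimal PyTorch sketch of a hypernetwork that maps a per-example signature embedding to the weights of a linear task classifier. This is an illustrative sketch only, not the paper's implementation: the class name `HyperClassifier`, the layer sizes, and all dimensions are assumptions for exposition.

```python
import torch
import torch.nn as nn


class HyperClassifier(nn.Module):
    """Illustrative hypernetwork: given a signature embedding for an
    input example, emit the weights of a per-example linear classifier.
    All dimensions below are hypothetical, not taken from the paper."""

    def __init__(self, sig_dim=768, feat_dim=768, num_labels=2, hidden=512):
        super().__init__()
        self.feat_dim = feat_dim
        self.num_labels = num_labels
        # The hypernetwork itself: a small MLP whose output is reshaped
        # into the classifier's weight matrix and bias vector.
        self.hyper = nn.Sequential(
            nn.Linear(sig_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, feat_dim * num_labels + num_labels),
        )

    def forward(self, signature, features):
        # signature: (batch, sig_dim)  -- per-example signature embedding
        # features:  (batch, feat_dim) -- representation of the input example
        params = self.hyper(signature)
        w = params[:, : self.feat_dim * self.num_labels]
        b = params[:, self.feat_dim * self.num_labels:]
        w = w.view(-1, self.num_labels, self.feat_dim)
        # Apply the generated per-example classifier: logits_i = W_i x_i + b_i
        logits = torch.bmm(w, features.unsqueeze(-1)).squeeze(-1) + b
        return logits


# Usage sketch: in the paper's setup the signature comes from a T5
# encoder-decoder and the features from the input encoder; here both
# are random tensors purely to show the shapes involved.
model = HyperClassifier()
signature = torch.randn(4, 768)
features = torch.randn(4, 768)
print(model(signature, features).shape)  # torch.Size([4, 2])
```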
Paper Type: long
Research Area: Machine Learning for NLP