OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework

ICML 2022
Abstract: In this work, we pursue a unified paradigm for multimodal pretraining to break the shackles of complex task/modality-specific customization. We propose OFA, a Task-Agnostic and Modality-Agnostic framework…
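The core idea behind the title and abstract, casting heterogeneous tasks as instruction-conditioned sequence-to-sequence pairs, can be illustrated with a small sketch. The instruction templates, helper functions, and placeholder image tokens below are hypothetical and only convey the unification idea; they are not OFA's actual preprocessing code.

```python
# Hypothetical sketch: framing different tasks as (source, target) text pairs,
# the way a task- and modality-agnostic seq2seq framework might do.
# Images are represented here by placeholder tokens; a real pipeline would
# discretize images into patch or codebook tokens.

from typing import NamedTuple


class Seq2SeqExample(NamedTuple):
    source: str  # instruction plus (tokenized) inputs fed to the encoder
    target: str  # text the decoder must generate


def make_caption_example(image_tokens: str, caption: str) -> Seq2SeqExample:
    return Seq2SeqExample(f"what does the image describe? {image_tokens}", caption)


def make_vqa_example(image_tokens: str, question: str, answer: str) -> Seq2SeqExample:
    return Seq2SeqExample(f"{question} {image_tokens}", answer)


def make_classification_example(image_tokens: str, label: str) -> Seq2SeqExample:
    return Seq2SeqExample(f"what does the image describe? {image_tokens}", label)


if __name__ == "__main__":
    img = "<img_001> ... <img_196>"  # placeholder for discretized image tokens
    batch = [
        make_caption_example(img, "a dog catching a frisbee in the park"),
        make_vqa_example(img, "what color is the frisbee?", "red"),
        make_classification_example(img, "golden retriever"),
    ]
    # Every task reduces to the same (source, target) interface, so a single
    # encoder-decoder model and one generation loss can cover all of them.
    for ex in batch:
        print(ex.source, "->", ex.target)
```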