HUBERT Untangles BERT to Improve Transfer across NLP Tasks

Anonymous

Sep 25, 2019 Blind Submission
  • TL;DR: We introduce HUBERT, which combines the power of Tensor-Product Representations and the BERT language model.
  • Abstract: We introduce HUBERT, which combines the structured-representational power of Tensor-Product Representations (TPRs) with BERT, a pre-trained bidirectional transformer language model. We validate the effectiveness of our model on the GLUE benchmark and the HANS dataset. We also show that there is shared structure between different NLP datasets which HUBERT, but not BERT, is able to learn and leverage. Extensive transfer-learning experiments are conducted to confirm this proposition.
  • Keywords: Tensor Product Representation, BERT, Transfer Learning, Neuro-Symbolic Learning
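The core mechanism named in the abstract, binding content to structure via Tensor-Product Representations, can be sketched in a few lines. This is the classic Smolensky TPR formulation (a structure is encoded as the sum of outer products of filler vectors, the content, and role vectors, the positions), not HUBERT's exact layers; all vector values and function names below are illustrative.

```python
def outer(f, r):
    """Outer product of two vectors, returned as a nested list (matrix)."""
    return [[fi * rj for rj in r] for fi in f]

def add(a, b):
    """Element-wise sum of two matrices of the same shape."""
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def bind(fillers, roles):
    """Encode a structure as T = sum_i (f_i outer r_i)."""
    t = outer(fillers[0], roles[0])
    for f, r in zip(fillers[1:], roles[1:]):
        t = add(t, outer(f, r))
    return t

def unbind(t, role):
    """Recover the filler bound to `role` via f = T . r
    (exact only when the role vectors are orthonormal)."""
    return [sum(tij * rj for tij, rj in zip(row, role)) for row in t]

# Toy example: two fillers bound to orthonormal role (position) vectors.
f_a, f_b = [1.0, 2.0], [3.0, -1.0]
r_0, r_1 = [1.0, 0.0], [0.0, 1.0]
T = bind([f_a, f_b], [r_0, r_1])
print(unbind(T, r_0))  # recovers f_a: [1.0, 2.0]
```

Because binding is a sum of outer products, the same tensor holds every filler-role pair simultaneously, and unbinding with an orthonormal role vector recovers its filler exactly; this separation of content from structure is what the abstract credits for the shared structure HUBERT can transfer across tasks.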