Highlights
• A detailed review of the pre-training process for each of the selected BERT variants.
• A comparison of diverse transformers within an explainable recommendation setting.
• Dyadic data modeled for multilabel classification, using reviews as explanations.
• A comparison considering the environmental impact and performance of the models.