Attention De-sparsification Matters: Inducing diversity in digital pathology representation learning
Highlights:
- Presented attention sparsification in self-supervised pretrained vision transformers.
- Proposed DiRL: a domain-aware dense pretext framework to de-sparsify attention maps.
- Demonstrated the efficacy of DiRL on three WSI-level datasets and two WSI-crop-level datasets.