Highlights

• A parallel attention structure merges Transformers and CNNs (sketched below).
• A contrastive augmentation module distinguishes anomalous regions.
• A dual-decoding branch enhances copy-move tampering features.
• An improved loss function alleviates class imbalance.
• The method outperforms state-of-the-art methods.
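As a rough illustration of the first highlight only, here is a minimal sketch of one common way to run a CNN branch and a Transformer self-attention branch in parallel and fuse them. The class name `ParallelAttentionBlock`, the layer choices, and all sizes are illustrative assumptions, not the paper's actual architecture.

```python
# Hypothetical sketch: parallel CNN + Transformer attention fusion.
# All design details below are assumptions, not the paper's method.
import torch
import torch.nn as nn

class ParallelAttentionBlock(nn.Module):
    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        # CNN branch: local features via a 3x3 convolution
        self.conv_branch = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        # Transformer branch: global context via multi-head self-attention
        self.norm = nn.LayerNorm(channels)
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        # 1x1 convolution to fuse the concatenated branch outputs
        self.fuse = nn.Conv2d(2 * channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        local = self.conv_branch(x)
        # Flatten the spatial grid into a token sequence for attention
        tokens = self.norm(x.flatten(2).transpose(1, 2))   # (B, H*W, C)
        global_feat, _ = self.attn(tokens, tokens, tokens)  # (B, H*W, C)
        global_feat = global_feat.transpose(1, 2).reshape(b, c, h, w)
        return self.fuse(torch.cat([local, global_feat], dim=1))

# Usage: fuse local and global features on a small feature map
feat = torch.randn(2, 64, 32, 32)
out = ParallelAttentionBlock(64)(feat)
print(out.shape)  # torch.Size([2, 64, 32, 32])
```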