Abstract: Highlights
• Multi-branch network model, cross residual fusion block, global features.
• Pyramidal pooling module, contextual information.
• Gated axial-attention, gated axial transformer, high-level and low-level features.
• Cross residual fusion block, faster convergence.
External IDs: dblp:journals/cbm/JiangL22