Abstract: This study proposes a new activation function, the S-type rectified linear unit (SReLU), to alleviate gradient dispersion in neural network models and improve the segmentation precision of high-resolution remote sensing images (HRIs).