Abstract: Although recurrent optical flow estimation methods have achieved great success in recent years, most of them struggle with large displacements and occlusions: existing recurrent networks are usually restricted to single-scale, coarse-resolution models and ignore the multiscale features provided by the hierarchical designs of earlier coarse-to-fine approaches. In this paper, we propose an adaptive-aware correlation recurrent network for optical flow estimation, named ACR-Net, which preserves fine motion features within a single-scale-resolution recurrent framework and adaptively incorporates multiscale features at different stages to achieve high-accuracy optical flow estimation. First, our self-adaptive scale-aware correlation module fuses adaptive correlations of multiscale inter- and intra-motion features, making the features more discriminative for capturing long-range dependencies between pixels. Second, our adaptive-aware motion module effectively extracts the features required by different kinds of motion from multilevel correspondences. Third, our cross-guide motion and fusion modules accurately guide the propagation of reliable pixels toward unreliable ones and dynamically select the most suitable representation to address occlusions. Comprehensive experiments demonstrate that ACR-Net outperforms existing two-view models, striking a good balance between speed and accuracy and achieving the best performance on the MPI-Sintel final-pass and KITTI-2015 test datasets. Source code is available at https://github.com/PCwenyue/ACR-Net-TCSVT.
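For readers unfamiliar with correlation (cost) volumes, the multiscale correlation features the abstract refers to can be illustrated with a generic all-pairs cost-volume pyramid, the standard construction in recurrent flow estimators. The NumPy sketch below shows that general technique only; it is not ACR-Net's actual correlation module, and all function names are illustrative.

```python
import numpy as np

def correlation_volume(f1, f2):
    """All-pairs correlation between two feature maps.

    f1, f2: (H, W, D) feature maps from the two frames. Returns an
    (H, W, H, W) volume whose entry [i, j, k, l] is the dot product of
    f1[i, j] with f2[k, l], normalized by sqrt(D).
    """
    _, _, D = f1.shape
    return np.einsum('ijd,kld->ijkl', f1, f2) / np.sqrt(D)

def correlation_pyramid(corr, levels=3):
    """Build a multiscale pyramid by 2x average-pooling the last two
    (target-frame) dimensions, so each query pixel sees coarser and
    coarser views of its matching costs."""
    pyramid = [corr]
    for _ in range(levels - 1):
        c = pyramid[-1]
        H, W, h, w = c.shape
        c = c.reshape(H, W, h // 2, 2, w // 2, 2).mean(axis=(3, 5))
        pyramid.append(c)
    return pyramid
```

A recurrent update operator would then look up a local window from every pyramid level around each pixel's current flow estimate, giving it both fine and coarse matching evidence at full output resolution.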