Abstract: In this paper, we propose a novel space-time video super-resolution
method, which aims to recover a high-frame-rate and high-resolution
video from its low-frame-rate and low-resolution observation. Existing solutions seldom consider spatial-temporal correlation
and long-term temporal context simultaneously and are thus
limited in restoration performance. Inspired by the epipolar-plane image used in multi-view computer vision tasks, we first
propose the concept of temporal-profile super-resolution to directly
exploit the spatial-temporal correlation in the long-term temporal
context. Then, we specifically design a feature shuffling module for
spatial retargeting and spatial-temporal information fusion, which
is followed by a refining module for artifact alleviation and detail
enhancement. Unlike existing solutions, our method does
not require any explicit or implicit motion estimation, making it
lightweight and flexible enough to handle any number of input frames. Comprehensive experimental results demonstrate that our method not
only generates superior space-time video super-resolution results
but also retains competitive implementation efficiency.
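The spatial-retargeting role of the feature shuffling module resembles the standard sub-pixel (pixel-shuffle) rearrangement, which trades channel depth for spatial resolution. A minimal NumPy sketch of that generic rearrangement is shown below; the function name, shapes, and upscaling factor are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def pixel_shuffle(x, r):
    """Rearrange a (C*r^2, H, W) feature map into (C, H*r, W*r).

    This is the generic sub-pixel shuffle used for spatial upsampling;
    the paper's feature shuffling module is assumed to build on a
    channel-to-space rearrangement of this kind.
    """
    c_r2, h, w = x.shape
    r2 = r * r
    assert c_r2 % r2 == 0, "channel count must be divisible by r^2"
    c = c_r2 // r2
    # (C, r, r, H, W) -> (C, H, r, W, r) -> (C, H*r, W*r)
    x = x.reshape(c, r, r, h, w)
    x = x.transpose(0, 3, 1, 4, 2)
    return x.reshape(c, h * r, w * r)

# Toy example: a 4-channel 2x2 map becomes a 1-channel 4x4 map (factor 2).
feat = np.arange(16, dtype=np.float32).reshape(4, 2, 2)
out = pixel_shuffle(feat, 2)
print(out.shape)  # (1, 4, 4)
```

Because the operation is a pure reshape/transpose, it adds no learned parameters, which is consistent with the abstract's claim of a lightweight design.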