Visualwind: a Novel Video Dataset for Cameras to Sense the Wind

Published: 01 Jan 2022, Last Modified: 05 Mar 2025 · IGARSS 2022 · CC BY-SA 4.0
Abstract: The goal of this paper is to empower cameras to sense the wind from videos by capturing motion information with optical flow and machine learning models, potentially revolutionising the spatiotemporal resolution of existing professional wind records, which are often limited to the city scale. To this end, we build a novel video dataset of over 6,000 labeled video clips covering eleven wind classes of the Beaufort scale. The videos are collected from social media, public cameras, and self-recording. Every clip has a fixed length of 10 seconds with varying frame rates and contains scenes of trees swaying in different scales of wind. We describe the key statistics of the dataset and how it was collected and annotated, and we evaluate both one-stage and two-stage models trained and tested for wind scale estimation on this dataset to provide baseline performance figures. The dataset is publicly accessible at https://sme.uds.exeter.ac.uk/folders/48caf5102d6196b9645fab1f46e494ec; please contact the authors for the access key due to the server protection policy.
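As a rough illustration of the kind of two-stage baseline the abstract describes (motion extraction followed by classification), the sketch below computes dense optical flow over a clip with OpenCV and feeds simple per-clip motion statistics to an off-the-shelf classifier over eleven Beaufort classes. The feature choice, classifier, and function names are illustrative assumptions, not the authors' exact pipeline.

```python
# Hedged sketch of a two-stage wind-scale estimator:
#   stage 1 - extract dense optical flow from a 10-second clip,
#   stage 2 - classify aggregated motion features into eleven Beaufort classes.
# Paths, features, and the classifier are assumptions, not the paper's method.
import cv2
import numpy as np
from sklearn.ensemble import RandomForestClassifier

NUM_CLASSES = 11  # Beaufort classes covered by the dataset (per the abstract)

def clip_flow_features(video_path: str, max_frames: int = 100) -> np.ndarray:
    """Stage 1: summarise dense optical-flow magnitude over one clip."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError(f"Cannot read {video_path}")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    magnitudes = []
    for _ in range(max_frames):
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(
            prev_gray, gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        magnitudes.append(mag)
        prev_gray = gray
    cap.release()
    mags = np.stack(magnitudes)
    # Simple per-clip motion statistics used as features.
    return np.array([mags.mean(), mags.std(), mags.max(),
                     np.percentile(mags, 90)])

def train_baseline(train_paths, train_labels):
    """Stage 2: fit a generic classifier on per-clip flow features.

    `train_paths` / `train_labels` are assumed to come from the dataset's
    annotation files, with labels 0-10 on the Beaufort scale.
    """
    X = np.stack([clip_flow_features(p) for p in train_paths])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, train_labels)
    return clf
```

A one-stage counterpart would instead feed the raw frames (or stacked flow fields) directly to an end-to-end video classifier; the sketch above only shows the decoupled flow-then-classify variant.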