Abstract: Dynamic Time Warping (DTW) is widely used as a similarity measure across various domains. Owing to its invariance to warping along the time axis, DTW provides more meaningful discrepancy measurements between two signals than other distance measures. In this paper, a learning framework based on DTW is proposed. In contrast to the previous successful use of DTW as a loss function, we propose to apply the DTW kernel as a new type of component in a neural network. The proposed framework leverages DTW to obtain better feature extraction and generates interpretable representations. For the first time, the DTW loss is theoretically analyzed, and a stochastic backpropagation scheme is proposed to improve the accuracy and efficiency of DTW learning. We also demonstrate that the proposed framework can be used as a data analysis tool to perform data decomposition.
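For readers unfamiliar with the discrepancy the framework builds on, the sketch below shows the classic DTW distance computed by dynamic programming. It is not the authors' implementation (see the repository linked below for that); the function name, shapes, and the idea of matching an input against a learnable template are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): classic DTW distance via dynamic programming.
# In a DTW-based layer, such a discrepancy would be computed between an input signal
# and a learnable template; here both sequences are plain 1-D NumPy arrays.
import numpy as np

def dtw_distance(x, y):
    """Return the DTW discrepancy between 1-D sequences x and y."""
    n, m = len(x), len(y)
    # D[i, j] = cost of the best warping path aligning x[:i] with y[:j]
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            # allowed moves: match, insertion, deletion (warping in time)
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    return D[n, m]

# Example: a time-shifted copy of a signal remains close under DTW
t = np.linspace(0, 2 * np.pi, 50)
print(dtw_distance(np.sin(t), np.sin(t + 0.3)))
```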
Code Link: https://github.com/TideDancer/DTWNet
CMT Num: 6218