Abstract: Deep learning to binary coding improves multivariate time series retrieval by learning representations and binary codes end-to-end from training data. However, existing deep learning retrieval methods, e.g., encoder-decoder models based on recurrent or convolutional neural networks, fail to capture the latent dependencies between pairs of variables in multivariate time series, which results in a substantial loss of retrieval quality. Furthermore, supervised deep learning to binary coding often fails to meet practical requirements due to the scarcity of labeled multivariate time series datasets. To address these issues, this paper presents Unsupervised Transformer-based Binary Coding Networks (UTBCNs), a novel architecture for deep learning to binary coding, which consists of four key components: a Transformer-based encoder (T-encoder), a Transformer-based decoder (T-decoder), an adversarial loss, and a hashing loss. We employ the Transformer encoder-decoder to capture temporal dependencies and inter-sensor correlations within multivariate time series using attention mechanisms alone. Meanwhile, to enhance the generalization capability of the deep network, we add an adversarial loss based on the improved Wasserstein GAN (WGAN-GP) for real multivariate time series segments. To further improve the quality of the binary codes, a hashing loss based on a convolutional encoder (C-encoder) is applied to the output of the T-encoder. Extensive empirical experiments demonstrate that the proposed UTBCNs generate high-quality binary codes and outperform state-of-the-art binary coding retrieval models on the Twitter, Air Quality, and Quick Access Recorder (QAR) datasets.