Entropy Coding Compression of Tree Tensor Networks

AAAI 2025 Workshop CoLoRAI Submission 21 Authors (anonymous)

Published: 23 Nov 2024 (modified: 03 Feb 2025) · License: CC BY 4.0
Keywords: entropy coding, tensor networks, tensor compression
TL;DR: We contribute an entropy coding compression algorithm for tree tensor networks (prior related methods only supported the Tucker decomposition).
Abstract: Low-rank decompositions have been successfully used to compactly represent tensors of many kinds, be they neural network layers, activation tensors, or raw datasets arising from tomography scans, sensor data, physical simulations, and the like. These methods typically stop at the low-rank factorization itself and do not post-process the resulting coefficients (except, sometimes, by quantization). That choice makes compressed tensors easier to use in learning pipelines, but it is not necessarily optimal for data storage or transmission. Focusing on these tasks, we propose to prioritize data reduction rates by applying entropy coding and successive core orthogonalization to the SVD-learned coefficients of a given tensor. Our scheme extends earlier Tucker-based compressors to general acyclic (tree) tensor networks, and is thus promising for a wider class of target tensors.
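To make the pipeline concrete, here is a minimal sketch (not the authors' implementation) of the overall idea, using a tensor train, the simplest tree tensor network, built by a sweep of successive truncated SVDs; the sweep leaves every core but the last orthogonal, so the tensor's energy concentrates in a single coefficient core, which is then uniformly quantized and entropy-coded. `zlib` stands in for a proper entropy coder, and all function names, ranks, and step sizes below are illustrative assumptions.

```python
# Minimal sketch: tensor-train (a simple tree tensor network) compression
# via successive truncated SVDs, uniform quantization, and entropy coding.
# zlib is a stand-in entropy coder; names, ranks, and steps are illustrative.
import numpy as np
import zlib

def tt_svd(x, max_rank):
    """Left-to-right sweep of truncated SVDs. Each U factor becomes a
    left-orthogonal core; the residual S @ Vt is carried forward, so the
    tensor's energy accumulates in the last (coefficient) core."""
    dims, cores, r_prev = x.shape, [], 1
    c = x
    for n in dims[:-1]:
        u, s, vt = np.linalg.svd(c.reshape(r_prev * n, -1), full_matrices=False)
        r = min(max_rank, s.size)
        cores.append(u[:, :r].reshape(r_prev, n, r))  # orthogonal core
        c = s[:r, None] * vt[:r]                      # carried coefficients
        r_prev = r
    cores.append(c.reshape(r_prev, dims[-1], 1))      # coefficient core
    return cores

def encode_core(core, step=1e-3):
    """Uniform quantization followed by entropy coding (zlib as stand-in)."""
    q = np.round(core / step).astype(np.int32)
    return zlib.compress(q.tobytes(), level=9), q.shape, step

def decode_core(blob, shape, step):
    q = np.frombuffer(zlib.decompress(blob), dtype=np.int32)
    return q.reshape(shape).astype(np.float64) * step

def tt_reconstruct(cores):
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])

# Usage on a smooth (hence compressible) synthetic tensor.
g = np.linspace(0.0, 1.0, 16)
x = np.exp(-(g[:, None, None] + g[None, :, None] + g[None, None, :]))
packed = [encode_core(c) for c in tt_svd(x, max_rank=4)]
rec = tt_reconstruct([decode_core(*p) for p in packed])
ratio = x.nbytes / sum(len(p[0]) for p in packed)
err = np.linalg.norm(rec - x) / np.linalg.norm(x)
print(f"compression ratio {ratio:.1f}x, relative error {err:.2e}")
```

A production coder in the spirit of the abstract would presumably replace `zlib` with an arithmetic or range coder, pick ranks and quantization steps to meet a target error, and handle general tree topologies rather than the linear train used here.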
Submission Number: 21