Keywords: tensor decomposition, tensor learning, t-SVD, tensor approximation
TL;DR: We develop a geometric framework for t-product-based tensor learning, define smooth t-manifolds, establish theory for manifold testing and function approximation, and illustrate the approach with a conceptual image modeling application.
Abstract: Despite the growing success of transform-based tensor models such as the t-product, their underlying geometric principles remain poorly understood. Classical differential geometry, built on real-valued function spaces, is not well suited to capture the algebraic and spectral structure induced by transform-based tensor operations. In this work, we take an initial step toward a geometric framework for tensors equipped with tube-wise multiplication via orthogonal transforms. We introduce the notion of smooth t-manifolds, defined as topological spaces locally modeled on structured tensor modules over a commutative t-scalar ring. This formulation enables transform-consistent definitions of geometric objects, including metrics, gradients, Laplacians, and geodesics, thereby bridging discrete and continuous tensor settings within a unified algebraic-geometric perspective.
On this basis, we develop a statistical procedure for testing whether tensor data lie near a low-dimensional t-manifold, and provide nonasymptotic guarantees for manifold fitting under noise. We further establish approximation bounds for tensor neural networks that learn smooth functions over t-manifolds, with generalization rates determined by intrinsic geometric complexity. This framework offers a theoretical foundation for geometry-aware learning in structured tensor spaces and supports the development of models that align with transform-based tensor representations.
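To make the transform-based tensor product concrete, here is a minimal sketch of the classical t-product (Kilmer–Martin style), where tube-wise circular convolution is computed via the DFT along the third mode. The DFT is one choice of orthogonal (unitary) transform; the abstract's framework allows other transforms. The function name `t_product` is illustrative, not from the paper.

```python
import numpy as np

def t_product(A, B):
    """t-product of third-order tensors A (n1 x n2 x n3), B (n2 x n4 x n3).

    Tube-wise circular convolution, evaluated in the Fourier domain:
    FFT along mode 3, frontal-slice matrix products per frequency,
    then inverse FFT. Other orthogonal transforms induce other
    t-products; the DFT is used here only as a familiar example.
    """
    Ah = np.fft.fft(A, axis=2)
    Bh = np.fft.fft(B, axis=2)
    # Multiply corresponding frontal slices at each frequency k.
    Ch = np.einsum('ijk,jlk->ilk', Ah, Bh)
    return np.fft.ifft(Ch, axis=2).real
```

Under this product, the tensor whose first frontal slice is the identity matrix (and whose remaining slices are zero) acts as the multiplicative identity, reflecting the commutative t-scalar ring structure the abstract refers to.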
Supplementary Material: zip
Primary Area: General machine learning (supervised, unsupervised, online, active, etc.)
Submission Number: 12297