Keywords: Representation Learning, Tensor Functions, Equivariance, Covariant Algebra, Invariant Theory, Closure Modeling
Abstract: Representing tensor-valued functions of tensor arguments is fundamental in many modeling problems. Tensor functions play a central role in constructing reduced-order approximations and are particularly useful for nonlinear anisotropic constitutive modeling of physical phenomena such as fluid turbulence and material deformation, among others. By imposing equivariance under the orthogonal group, tensor functions can be finitely and minimally generated via the isomorphism between binary forms and symmetric trace-free tensors. After the minimal generators are determined, their coefficients can be learned as functions of the invariants of the tensor arguments by training on data, which promotes the generality of the models. The algebraic nature of the learned models makes them interpretable by revealing the underlying dynamics, and it keeps the models economical, as they contain the theoretical minimum number of terms. Determining minimal representations of higher-order tensor functions has remained computationally intractable in many cases of interest until now. The current work overcomes this limitation. Numerically efficient algorithms for generating tensor functions and reducing them to minimal sets are presented. A few classical tensor function representations and an approach to a bottleneck in modeling turbulence are worked out to showcase the practical applicability of our framework.
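To make the abstract's recipe concrete, here is a minimal sketch, not the paper's algorithm, of the general pattern it describes: expand an equivariant tensor function in a finite generator basis and make the scalar coefficients functions of the invariants. The sketch uses the classical case of a symmetric second-order tensor function of one symmetric tensor argument, whose generators are {I, A, A^2} with coefficients depending on the principal invariants (the Rivlin-Ericksen representation); the coefficient function below is a hypothetical stand-in for a regressor that would be trained on data.

```python
import numpy as np

def invariants(A):
    # Principal invariants of a symmetric 3x3 tensor A.
    I1 = np.trace(A)
    I2 = 0.5 * (np.trace(A) ** 2 - np.trace(A @ A))
    I3 = np.linalg.det(A)
    return np.array([I1, I2, I3])

def tensor_basis(A):
    # Minimal generator basis for an isotropic symmetric tensor
    # function of one symmetric tensor: {I, A, A^2}.
    return [np.eye(3), A, A @ A]

def equivariant_model(A, coeff_fn):
    # H(A) = sum_k phi_k(I1, I2, I3) * G_k(A). Equivariance holds
    # because the invariants are unchanged under Q while each
    # generator transforms as G_k(Q A Q^T) = Q G_k(A) Q^T.
    phis = coeff_fn(invariants(A))
    return sum(phi * G for phi, G in zip(phis, tensor_basis(A)))

def coeff_fn(invs):
    # Toy coefficients; in practice these would be learned from data.
    I1, I2, I3 = invs
    return np.array([1.0 + I1, 0.5 * I2, 0.1 * I3])

# Equivariance check: H(Q A Q^T) == Q H(A) Q^T for a random orthogonal Q.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)); A = 0.5 * (A + A.T)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
lhs = equivariant_model(Q @ A @ Q.T, coeff_fn)
rhs = Q @ equivariant_model(A, coeff_fn) @ Q.T
assert np.allclose(lhs, rhs)
```

The paper's contribution, as stated, is the efficient construction and reduction of such generator sets for higher-order tensor functions, where the analogue of {I, A, A^2} has been intractable to determine; the learned-coefficient structure stays the same.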
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 20061