Unveiling the Hidden Structure: Tight Bounds for Matrix Multiplication Approximation via Convex Optimization

ICLR 2026 Conference Submission 14352 Authors

18 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Matrix Multiplication Approximation, Sparse Approximation, Randomized Algorithms, Error Bounds, Upper Bounds, Sparsity Constrained Quadratic Programming, Gram Matrix Methods, Convex Optimization
TL;DR: This paper introduces a novel, computable, structure-aware upper bound that more accurately estimates the true optimal k-term matrix approximation error, especially when complex data interactions make standard bounds uninformative.
Abstract: Matrix multiplication lies at the heart of machine learning, yet standard approaches to approximating the product often ignore the interactions that truly govern the error. In this work, we introduce a structure-aware upper bound on the best achievable approximation error when using only a linear combination of $k$ column-product terms of the matrices. Our bounds, formulated via convex optimization over an interaction matrix, reveal hidden challenges and opportunities in matrix multiplication. Through comprehensive numerical experiments, we demonstrate that our bounds not only outperform existing alternatives but also shed new light on the inherent complexity of structured matrix products. This framework paves the way for the development of structure-exploiting algorithms and principled performance guarantees in large-scale machine learning.
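The $k$-term setting in the abstract can be illustrated with a minimal NumPy sketch. It uses the identity $AB = \sum_{i=1}^{n} A_{:,i} B_{i,:}$ and keeps only $k$ of the $n$ rank-one column-row terms; the norm-based selection rule below is an illustrative heuristic assumed for this example, not the paper's optimization-based method:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, p, k = 20, 50, 15, 10  # illustrative sizes; k-term budget k < n

A = rng.standard_normal((m, n))
B = rng.standard_normal((n, p))

# Exact product as a sum of n rank-one terms: AB = sum_i A[:, i] @ B[i, :]
exact = A @ B

# k-term approximation: keep the k indices with the largest contribution
# norms ||A[:, i]|| * ||B[i, :]|| (a simple hypothetical selection rule).
scores = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
top_k = np.argsort(scores)[-k:]
approx = A[:, top_k] @ B[top_k, :]

# Relative Frobenius error of the k-term approximation.
err = np.linalg.norm(exact - approx, "fro") / np.linalg.norm(exact, "fro")
print(f"relative Frobenius error with k={k} of n={n} terms: {err:.3f}")
```

The paper's bound concerns how small this error can be made over all choices of $k$ terms and combination weights; the sketch only shows one feasible point of that problem.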
Supplementary Material: zip
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 14352