Quantum Algorithm for Deep Neural Networks with Efficient I/O

ICLR 2026 Conference Submission 15201 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: quantum deep neural network, quantum algorithm, image classification
TL;DR: Our framework enables scalable quantum deep learning by strategically decomposing tasks and introducing a novel DCD protocol that achieves sublinear measurement costs, ensuring quantum speedup for large-dimensional inputs.
Abstract: A primary aim of research in quantum computing is the realization of quantum advantage within deep neural networks. Progress, however, is hindered by known challenges in constructing deep architectures and by the prohibitive overhead of quantum data I/O. We introduce a framework to overcome these barriers, designed to achieve an asymptotic speedup in the large input dimension of modern DNNs. The framework rests on the observation that a deep learning model can achieve comparable performance when only "rough" copies of the data are available, which we call the good-enough principle in this paper. Our framework enables the design of multi-layer Quantum ResNet and Transformer models by strategically decomposing the task into subroutines and assigning each to quantum linear algebra (QLA) or quantum arithmetic modules (QAM). This modularity is enabled by a novel data transfer protocol, Discrete Chebyshev Decomposition (DCD). Numerical validation reveals a pivotal insight: the measurement cost required to maintain a target accuracy scales sublinearly with the input dimension, verifying the good-enough principle. This sublinear scaling is key to preserving the quantum advantage, ensuring that I/O overhead does not nullify the computational gains. A rigorous resource analysis further corroborates the efficiency and flexibility of our models. Our research provides strong evidence that quantum neural networks can be more scalable than their classical counterparts on a fault-tolerant quantum computer.
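The abstract does not spell out the DCD protocol, but its core idea, keeping only enough basis coefficients for a "good-enough" reconstruction, can be illustrated with a small classical sketch. The snippet below is not the paper's quantum protocol: it uses plain NumPy Chebyshev fitting, and the names `good_enough_rank` and `target_error` are hypothetical. It counts how many discrete Chebyshev coefficients are needed to reconstruct a smooth input to a fixed relative error, a toy analogue of the claimed sublinear measurement cost.

```python
# Toy classical analogue of the good-enough principle: how many Chebyshev
# coefficients are needed to reconstruct an input vector to a target error?
# (Illustrative sketch only; not the paper's DCD quantum protocol.)
import numpy as np
from numpy.polynomial import chebyshev as C

def good_enough_rank(signal, x, target_error=1e-2):
    """Smallest number of Chebyshev coefficients reaching target_error."""
    n = len(signal)
    for deg in range(1, n - 1):
        coeffs = C.chebfit(x, signal, deg)   # least-squares Chebyshev fit
        approx = C.chebval(x, coeffs)
        rel_err = np.linalg.norm(approx - signal) / np.linalg.norm(signal)
        if rel_err <= target_error:
            return deg + 1                   # number of coefficients kept
    return n

for n in (64, 256, 1024):
    x = np.linspace(-1.0, 1.0, n)
    # smooth synthetic input standing in for a high-dimensional feature vector
    sig = np.exp(-3.0 * x**2) * np.sin(4.0 * x)
    print(n, good_enough_rank(sig, x))
```

For a smooth input, the retained coefficient count stays nearly flat as the dimension grows, which is the kind of sublinear scaling the abstract attributes to the measurement cost of DCD.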
Primary Area: foundation or frontier models, including LLMs
Submission Number: 15201