Measurement information multiple-reuse allows deeper quantum transformer

27 Sept 2024 (modified: 08 Dec 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: quantum machine learning, quantum transformer, measurement information multiple reuse
TL;DR: Measurement information multiple-reuse is one of the keys to achieving a deep quantum Transformer.
Abstract:

The current era has witnessed the success of the transformer in the field of classical deep neural networks (DNNs) and the potential of quantum computing. One naturally expects that quantum computing can offer significant speedup for the transformer. Recent quantum transformer models, however, face challenges including the expensive cost of non-linear operations and the information loss caused by measurements. To address these issues, this paper proposes a scheme called measurement information multiple-reuse (MIMR). MIMR enables the repeated use of intermediate measurement data from earlier layers, thus improving information-transfer efficiency. This scheme enables our quantum vision transformer (QViT) to achieve exponential speedup over classical counterparts while supporting many parameters and large depth. Our QViT model is further examined on an instance with 86 million parameters, which halves the tomography-error requirement compared to the same model without MIMR, demonstrating the advantage of MIMR over existing schemes. Our findings underscore the importance of exploiting the value of information from each measurement, offering a key strategy towards scalable quantum deep neural networks.

Primary Area: foundation or frontier models, including LLMs
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 10169