Keywords: QAS, Unsupervised Representation Learning, Predictor-free
Abstract: Quantum architecture search (QAS) is a scheme for designing quantum circuits for variational quantum algorithms (VQAs). Utilizing unsupervised representation learning for QAS is a promising approach toward realizing potential quantum advantage on Noisy Intermediate-Scale Quantum (NISQ) devices. Most QAS algorithms couple their search space and search algorithm, and thus generally require evaluating a large number of quantum circuits during the search process, which results in formidable computational demands and limits their applicability to large-scale quantum circuits. Predictor-based QAS algorithms can alleviate this problem by directly estimating the performance of circuits from their structures. However, training a high-performance predictor requires a very time-consuming labeling process to obtain a large number of labeled quantum circuits, because the gate parameters of each circuit must be optimized until convergence to obtain its ground-truth performance. Recently, the classical neural architecture search algorithm Arch2vec has inspired us by showing that architecture search can benefit from decoupling unsupervised representation learning from the search process. Whether unsupervised representation learning can help QAS without any predictor remains an open question. In this work, we propose a framework for QAS with unsupervised representation learning and visualize how unsupervised architecture representation learning encourages quantum circuit architectures with similar connections and operators to cluster together. Specifically, our framework decouples QAS from unsupervised architecture representation learning, so that the learned representation can be directly applied to different downstream applications. Furthermore, our framework is predictor-free, eliminating the need for a large number of labeled quantum circuits. During the search process, we use two algorithms, REINFORCE and Bayesian Optimization, to search directly over the latent representation, and compare them with Random Search. The results show that our framework can obtain well-performing candidate circuits more efficiently within a limited number of searches.
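For illustration, the following is a minimal sketch (not the authors' implementation) of searching directly over a learned latent representation with REINFORCE, one of the two search algorithms named above. The latent dimensionality, the Gaussian search policy, and the `score` function, which stands in for decoding a latent vector into a circuit and evaluating it as a VQA, are all assumptions made to keep the example self-contained and runnable.

```python
# Minimal illustrative sketch (assumptions throughout, not the authors' code):
# REINFORCE over a learned latent space of quantum circuit architectures.
import numpy as np

rng = np.random.default_rng(0)
DIM = 16                       # assumed latent dimensionality
target = rng.normal(size=DIM)  # toy optimum standing in for a well-performing circuit

def score(z):
    # Placeholder for the real pipeline: decode z into a circuit with the
    # pretrained decoder, optimize its gate parameters, and return the
    # resulting VQA performance.
    return -float(np.linalg.norm(z - target))

mu, sigma, lr, baseline = np.zeros(DIM), 1.0, 0.05, 0.0
for step in range(500):
    z = mu + sigma * rng.normal(size=DIM)  # sample a candidate latent vector
    r = score(z)
    adv = r - baseline                     # baseline reduces gradient variance
    baseline += 0.1 * (r - baseline)       # running-average reward baseline
    mu += lr * adv * (z - mu) / sigma**2   # REINFORCE update of the policy mean

print("final score at policy mean:", score(mu))
```

Bayesian Optimization would instead fit a surrogate model over the latent vectors evaluated so far and choose the next candidate by maximizing an acquisition function, while Random Search simply draws latent vectors at random.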
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 5458