Practical Hybrid Quantum Language Models with Observable Readout on Real Hardware

ICLR 2026 Conference Submission 21073 Authors

19 Sept 2025 (modified: 08 Oct 2025), ICLR 2026 Conference Submission, CC BY 4.0
Keywords: Quantum ML, Quantum NLP, Next Token Prediction, NISQ Algorithms
TL;DR: We propose practical hybrid quantum language models (QRNNs and QCNNs) with observable readout, and show for the first time that they can be trained and evaluated on real NISQ hardware for both classification and generative language modeling.
Abstract: Hybrid quantum-classical models are emerging as a key approach for leveraging near-term quantum devices. We present quantum recurrent neural networks (QRNNs) and quantum convolutional neural networks (QCNNs) as hybrid quantum language models, and demonstrate for the first time generative language modeling trained and evaluated on real quantum hardware. Our models combine parametric quantum circuits with a lightweight classical projection layer, using hardware-friendly multi-sample simultaneous perturbation stochastic approximation (SPSA) to train the quantum parameters efficiently, and standard gradient-based updates for the classical weights. To support evaluation, we construct and release a synthetic dataset for next-word prediction. Experiments on both sentence classification and language modeling tasks show that QRNNs and QCNNs can be trained end-to-end on NISQ devices and achieve competitive performance in low-resource regimes. These results establish quantum sequence models as a promising foundation for quantum natural language processing.
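The abstract's training recipe — SPSA for the quantum parameters, averaged over multiple random perturbation directions — can be illustrated with a minimal sketch. This is not the authors' implementation: `circuit_loss` is a hypothetical stand-in for the loss evaluated by running the parametric quantum circuit on hardware, and the step sizes are illustrative.

```python
import numpy as np

def spsa_step(theta, circuit_loss, lr=0.1, eps=0.05, n_samples=4, rng=None):
    """One multi-sample SPSA update: average the two-point gradient
    estimate over n_samples random Rademacher perturbation directions.
    Each direction costs only two loss evaluations, regardless of the
    number of parameters -- the hardware-friendly property."""
    rng = rng or np.random.default_rng()
    grad = np.zeros_like(theta)
    for _ in range(n_samples):
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        loss_plus = circuit_loss(theta + eps * delta)
        loss_minus = circuit_loss(theta - eps * delta)
        grad += (loss_plus - loss_minus) / (2.0 * eps) * delta
    return theta - lr * grad / n_samples

# Toy usage: a quadratic loss stands in for the device-evaluated loss.
rng = np.random.default_rng(0)
theta = np.array([1.0, -2.0, 0.5])
for _ in range(200):
    theta = spsa_step(theta, lambda t: float(np.sum(t ** 2)), rng=rng)
```

In the hybrid setup described above, updates like this would drive only the quantum circuit parameters, while the classical projection layer is trained with ordinary backpropagation.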
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Submission Number: 21073