Quantum-Enhanced Transformer: A Variational Quantum Circuit Approach to Attention Mechanisms

Published: 29 Jul 2025, Last Modified: 29 Jul 2025 · PQAI 2025 Poster · CC BY 4.0
Keywords: Quantum machine learning, Quantum transformers, Variational quantum circuits, Quantum attention mechanisms, Hybrid quantum-classical models, Quantum feature maps, Quantum natural language processing, Near-term quantum algorithms, Quantum computing, Transformer architectures
TL;DR: We build quantum transformers by inserting variational quantum circuits into attention layers, matching classical performance on text classification tasks while retaining exponential theoretical potential.
Abstract: We present a quantum-enhanced transformer architecture that integrates variational quantum circuits into the attention mechanism of transformer networks. Our approach leverages quantum feature maps to encode classical attention queries and keys into quantum states, processes them through parameterized quantum circuits, and extracts attention scores via expectation value measurements. We demonstrate the viability of this hybrid approach on text classification tasks using the 20newsgroups dataset, achieving performance comparable to classical transformers while establishing theoretical foundations for future quantum language models. The architecture maintains full compatibility with existing transformer frameworks while introducing quantum computational elements that could provide significant advantages as quantum hardware continues to mature. Our theoretical analysis reveals that quantum attention mechanisms can represent exponentially more complex relationships than their classical counterparts, though current near-term quantum device limitations prevent immediate practical quantum advantage.
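The abstract's pipeline — angle-encode a query/key pair into a quantum state, pass it through a parameterized circuit, and read off an attention score as an expectation value — can be sketched with a small statevector simulation. The circuit below (RY feature map, one trainable RY layer plus a CNOT, ⟨Z⟩ measurement on the first qubit) is an illustrative assumption, not the paper's exact ansatz; it is written in plain NumPy so no quantum SDK is required.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix (real-valued)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)
I2 = np.eye(2)

def quantum_attention_score(q, k, params):
    """Encode one (query, key) pair on two qubits and measure <Z> on qubit 0.

    Feature map: RY(q) on qubit 0, RY(k) on qubit 1.
    Variational layer: trainable RY rotations followed by a CNOT.
    The expectation value in [-1, 1] plays the role of a raw attention score.
    """
    state = np.zeros(4)
    state[0] = 1.0                                  # |00>
    state = np.kron(ry(q), ry(k)) @ state           # quantum feature map
    state = np.kron(ry(params[0]), ry(params[1])) @ state  # trainable layer
    state = CNOT @ state                            # entangling gate
    return state @ np.kron(Z, I2) @ state           # <Z0> expectation

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# One query attending over three keys (toy scalar features).
rng = np.random.default_rng(0)
params = rng.uniform(0, 2 * np.pi, size=2)          # variational parameters
query = 0.3
keys = np.array([0.1, 1.2, 2.5])
scores = np.array([quantum_attention_score(query, k, params) for k in keys])
weights = softmax(scores)                           # classical normalization
print(weights)
```

In a full hybrid model these parameters would be trained jointly with the classical transformer weights, with gradients obtained via the parameter-shift rule or by backpropagating through the simulator; on hardware, the expectation value would be estimated from repeated shot measurements rather than computed exactly.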
Submission Number: 2