Keywords: Quantum computing, quantum error correction, graph neural networks, uncertainty quantification, machine learning
Abstract: Quantum error correction (QEC) is essential for scalable quantum
computing, yet decoding errors via conventional algorithms results
in limited accuracy (i.e., suppression of logical errors) and high
overheads, both of which can be alleviated by inference-based
decoders. To date, such machine-learning (ML) decoders lack two key
properties crucial for practical fault tolerance: reliable
uncertainty quantification and robust generalization to previously
unseen codes. To address this gap, we propose \textbf{QuBA}, a
Bayesian graph neural decoder that integrates both dot-product
and multi-head attention, enabling expressive error-pattern
recognition alongside calibrated uncertainty estimates. Building on
QuBA, we further develop \textbf{SAGU} \textbf{(Sequential Aggregate
Generalization under Uncertainty)}, a multi-code training
framework with enhanced cross-domain robustness that enables decoding
beyond the training set. Experiments on bivariate bicycle (BB)
codes and their coprime variants demonstrate that
(i) both QuBA and SAGU consistently outperform the classical
baseline belief propagation (BP), achieving a reduction of on
average \emph{one order of magnitude} in logical error rate (LER),
and up to \emph{two orders of magnitude} under confident-decision
bounds on the coprime BB code $[[154, 6, 16]]$;
(ii) QuBA also surpasses state-of-the-art neural decoders, providing
an advantage of roughly \emph{one order of magnitude} (e.g., for the
larger BB code $[[756, 16, \leq34]]$) even when considering
conservative (safe) decision bounds;
(iii) SAGU achieves decoding performance comparable to, or even
exceeding, that of QuBA's domain-specific training approach. Our code implementation is available at \url{https://anonymous.4open.science/r/QuBA-SAGU-5FCD/}.
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 20103