Keywords: input-sensitivity, Transformer, QR code decoding
TL;DR: This study investigates the behavior of a Transformer trained to decode QR codes, a task of moderate input-sensitivity.
Abstract: The hardness of learning a function for a target task relates to the function's input-sensitivity. For example, image classification is input-insensitive, as minor corruptions should not affect the classification result, whereas arithmetic and symbolic computation, which has recently attracted interest, is highly input-sensitive, as every input variable directly affects the computed result.
This study presents the first learning-based Quick Response (QR) code decoder and uses it to investigate the learning of functions of moderate input-sensitivity.
Our experiments reveal that Transformers can successfully decode QR codes, even beyond the theoretical error-correction limit, by learning the underlying structure of embedded texts. They generalize from English-rich training data to other languages and even random strings. Moreover, we observe that the Transformer-based QR decoder focuses on data bits while ignoring error-correction bits, suggesting a decoding mechanism distinct from standard QR code readers.
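To make the task framing concrete, here is a minimal sketch of how QR decoding can be cast as sequence-to-sequence learning: the flattened binary module grid is the source sequence and the embedded text is the target. This is an illustrative assumption, not the paper's implementation; the model sizes, byte-level vocabulary, and toy data below are hypothetical.

```python
# Minimal sketch (assumed setup, not the authors' code): a Transformer that
# maps a flattened QR module grid to the embedded text, byte by byte.
import torch
import torch.nn as nn

MODULES = 21 * 21          # a Version 1 QR code is a 21x21 module grid
VOCAB = 256                # byte-level target vocabulary (assumption)
MAX_LEN = 64               # maximum target length (assumption)

class QRDecoder(nn.Module):
    def __init__(self, d_model=128, nhead=8, layers=4):
        super().__init__()
        self.src_embed = nn.Embedding(2, d_model)         # modules are 0/1 bits
        self.tgt_embed = nn.Embedding(VOCAB, d_model)
        self.src_pos = nn.Parameter(torch.zeros(MODULES, d_model))
        self.tgt_pos = nn.Parameter(torch.zeros(MAX_LEN, d_model))
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=layers, num_decoder_layers=layers,
            batch_first=True,
        )
        self.out = nn.Linear(d_model, VOCAB)

    def forward(self, bits, text_ids):
        # bits: (B, MODULES) in {0, 1}; text_ids: (B, T) byte ids
        src = self.src_embed(bits) + self.src_pos[: bits.size(1)]
        tgt = self.tgt_embed(text_ids) + self.tgt_pos[: text_ids.size(1)]
        # Causal mask so each output position sees only earlier text tokens
        mask = nn.Transformer.generate_square_subsequent_mask(text_ids.size(1))
        h = self.transformer(src, tgt, tgt_mask=mask)
        return self.out(h)

# Toy usage: one random "QR matrix" and a target string, with teacher forcing.
model = QRDecoder()
bits = torch.randint(0, 2, (1, MODULES))
target = torch.tensor([[ord(c) for c in "HELLO"]])
logits = model(bits, target[:, :-1])               # predict next byte
loss = nn.functional.cross_entropy(
    logits.reshape(-1, VOCAB), target[:, 1:].reshape(-1)
)
loss.backward()
```

Under this framing, generalization beyond the error-correction limit and the reported preference for data bits over error-correction bits would show up in how the encoder attends to different positions of the source sequence.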
Submission Number: 85