Disentangling Quantum and Classical Contributions in Hybrid Quantum Machine Learning Architectures

Published: 01 Jan 2024 · Last Modified: 09 Aug 2024 · ICAART (3) 2024 · CC BY-SA 4.0
Abstract: Quantum computing offers the potential for superior computational capabilities, particularly for data-intensive tasks. However, the current state of quantum hardware places heavy restrictions on input size. To address this, hybrid transfer learning solutions have been developed, merging pre-trained classical models, capable of handling extensive inputs, with variational quantum circuits. Yet it remains unclear how much each component, classical and quantum, contributes to the model's results. We propose a novel hybrid architecture: instead of utilizing a pre-trained network for compression, we employ an autoencoder to derive a compressed version of the input data. The input is passed through the encoder part of the autoencoder, and the resulting compressed representation is fed to the quantum component. We assess our model's classification capabilities against two state-of-the-art hybrid transfer learning architectures, two purely classical architectures, and one quantum architecture. Their accuracy is compared across
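
A minimal sketch of the hybrid pipeline described in the abstract, assuming PyTorch and PennyLane; the latent dimension, qubit count, circuit depth, layer sizes, and embedding/ansatz choices below are illustrative assumptions, not values taken from the paper.

```python
import torch
import torch.nn as nn
import pennylane as qml

N_QUBITS = 4   # assumed latent dimension = number of qubits
N_LAYERS = 2   # assumed depth of the variational circuit

dev = qml.device("default.qubit", wires=N_QUBITS)

@qml.qnode(dev, interface="torch")
def quantum_circuit(inputs, weights):
    # Encode the compressed (latent) features as rotation angles.
    qml.AngleEmbedding(inputs, wires=range(N_QUBITS))
    # Trainable variational block (assumed ansatz).
    qml.StronglyEntanglingLayers(weights, wires=range(N_QUBITS))
    return [qml.expval(qml.PauliZ(w)) for w in range(N_QUBITS)]

weight_shapes = {"weights": (N_LAYERS, N_QUBITS, 3)}

class HybridClassifier(nn.Module):
    """Autoencoder-compressed input -> variational quantum circuit -> class logits."""

    def __init__(self, input_dim: int, n_classes: int):
        super().__init__()
        # Classical autoencoder; only the encoder feeds the quantum component.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 64), nn.ReLU(),
            nn.Linear(64, N_QUBITS),
        )
        self.decoder = nn.Sequential(
            nn.Linear(N_QUBITS, 64), nn.ReLU(),
            nn.Linear(64, input_dim),
        )
        self.quantum = qml.qnn.TorchLayer(quantum_circuit, weight_shapes)
        self.head = nn.Linear(N_QUBITS, n_classes)

    def forward(self, x):
        z = self.encoder(x)   # compressed representation of the input
        q = self.quantum(z)   # quantum component operating on the latent code
        return self.head(q)

    def reconstruct(self, x):
        # Used when pre-training the autoencoder with a reconstruction loss.
        return self.decoder(self.encoder(x))
```

In this sketch the autoencoder could be pre-trained on a reconstruction objective before the encoder output is routed to the variational circuit, which keeps the classical compression stage and the quantum classification stage cleanly separated.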