Scaling-up Memristor Monte Carlo with magnetic domain-wall physics

Published: 01 Nov 2023, Last Modified: 22 Dec 2023, MLNCP Oral
Keywords: Memristor, MCMC, low-precision
TL;DR: We scaled up Memristor Monte Carlo by five orders of magnitude by proposing a better pairing between the algorithm and Memristor physics
Abstract: By exploiting the intrinsic random nature of nanoscale devices, Memristor Monte Carlo (MMC) is a promising enabler of edge learning systems. However, due to multiple algorithmic and device-level limitations, existing demonstrations have been restricted to very small neural network models and datasets. We discuss these limitations and describe how they can be overcome by mapping the stochastic gradient Langevin dynamics (SGLD) algorithm onto the physics of magnetic domain-wall Memristors, scaling up MMC models by five orders of magnitude. We propose the push-pull pulse programming method that realises SGLD in-physics, and use it to train a domain-wall-based ResNet18 on the CIFAR-10 dataset. On this task, we observe no performance degradation relative to a floating-point model down to an update precision of between 6 and 7 bits, indicating we have made a step towards a large-scale edge learning system leveraging noisy analogue devices.
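For context, the SGLD update that the abstract maps onto domain-wall device physics is, in its standard software form, a gradient step plus injected Gaussian noise. The sketch below is a minimal NumPy illustration of that update on a toy 1-D target; the function name and the toy example are illustrative only and are not taken from the paper, which instead realises the noise term with intrinsic device stochasticity.

```python
import numpy as np

def sgld_step(theta, grad, step_size, rng):
    """One stochastic gradient Langevin dynamics (SGLD) update.

    theta     -- current parameter vector
    grad      -- (stochastic) gradient of the negative log-posterior at theta
    step_size -- learning rate epsilon
    rng       -- NumPy random generator supplying the injected Gaussian noise
    """
    noise = rng.normal(0.0, np.sqrt(step_size), size=theta.shape)
    return theta - 0.5 * step_size * grad + noise

# Toy example: sample from a 1-D standard Gaussian target N(0, 1),
# whose negative log-density theta**2 / 2 has gradient theta.
rng = np.random.default_rng(0)
theta = np.array([5.0])
samples = []
for _ in range(5000):
    theta = sgld_step(theta, grad=theta, step_size=0.1, rng=rng)
    samples.append(theta[0])
```

After a burn-in period, the chain's samples approximate the N(0, 1) target; in the paper's setting, the gradient comes from a minibatch loss and the noise term is supplied in-physics by the push-pull pulse programming of the Memristors.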
Submission Number: 20