Beyond Digital: Harnessing Analog Hardware for Machine Learning

Published: 01 Nov 2023, Last Modified: 22 Dec 2023 (MLNCP Poster)
Keywords: analog hardware, physics-inspired computing, deep learning, energy efficiency
TL;DR: Given the rising computational demands of large deep-learning models, we explore the requirements for designing ML models that run efficiently on analog hardware.
Abstract: A remarkable surge in the use of large deep-learning models has yielded state-of-the-art results across a variety of tasks. Recent model sizes often exceed billions of parameters, underscoring the importance of fast and energy-efficient processing. The significant costs associated with training and inference stem primarily from the constrained memory bandwidth of current hardware and the computationally intensive nature of these models. Historically, the design of machine learning models has been guided predominantly by the operational parameters of classical digital devices. In contrast, analog computation has the potential to offer vastly improved power efficiency for both inference and training. This work details several machine-learning methodologies that could leverage existing analog hardware infrastructures. To foster the development of analog hardware-aware machine learning techniques, we explore both optical and electronic hardware configurations suitable for executing the fundamental mathematical operations inherent to these models. Integrating analog hardware with innovative machine learning approaches may pave the way for cost-effective AI systems at scale.
Submission Number: 31