Nonlinear Classification Without a Processor

Published: 01 Nov 2023, Last Modified: 22 Dec 2023, MLNCP Poster
Keywords: Neuromorphic Computing, Analog Computing, Learning Materials, Classification, Power Efficient
TL;DR: We demonstrate a self-adjusting electronic meta-material that can learn nonlinear classification tasks without a processor.
Abstract: Computers, as well as most neuromorphic hardware systems, use central processing and top-down algorithmic control to train for machine learning tasks. In contrast, brains are ensembles of 100 billion neurons working in tandem, giving them tremendous advantages in power efficiency and speed. Many physical systems `learn' through history dependence, but training a physical system to perform arbitrary nonlinear tasks without a processor has not been possible. Here we demonstrate the successful implementation of such a system: a learning meta-material. This nonlinear analog circuit comprises identical copies of a single simple element, each following the same local update rule. Inputs are applied to the system as voltages, and inference is performed by physics in microseconds. When labels are properly enforced (also via voltages), the system's internal state evolves in time, approximating gradient descent. Our system $\textit{learns on its own}$; it requires no processor. Once trained, it performs inference passively, dissipating approximately 100~$\mu$W of total power across its edges. We demonstrate the flexibility and power efficiency of our system by solving nonlinear 2D classification tasks. Learning meta-materials have immense potential as fast, efficient, robust learning systems for edge computing, from smart sensors to medical devices to robotic control.
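
The abstract does not spell out the circuit element or its update rule, so the sketch below is only a loose numerical analogue of the idea it describes: a network whose edges adjust themselves using purely local measurements taken in two electrical states (a free state and a label-nudged state), so that the collective dynamics approximate gradient descent. It uses a linear resistor network with a coupled-learning-style contrastive rule; the node layout, nudge amplitude `eta`, learning rate `lr`, and the toy linearly separable task are all assumptions for illustration, not the nonlinear transistor-based meta-material of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Node layout (illustrative assumptions, not from the paper):
#   0, 1  -> input nodes (the 2D data point, applied as voltages)
#   2, 3  -> reference nodes held at 1 V and 0 V
#   4     -> output node (class read out by thresholding its voltage at 0.5 V)
#   5, 6  -> internal nodes
n_nodes, out = 7, 4
edges = [(i, j) for i in range(n_nodes) for j in range(i + 1, n_nodes)]
g = rng.uniform(0.5, 1.5, len(edges))  # edge conductances: the learning degrees of freedom


def solve(fixed):
    """Node voltages from Kirchhoff's current law with the nodes in `fixed`
    held at the given voltages (the 'inference performed by physics' step)."""
    L = np.zeros((n_nodes, n_nodes))
    for (i, j), gij in zip(edges, g):
        L[i, i] += gij
        L[j, j] += gij
        L[i, j] -= gij
        L[j, i] -= gij
    fixed_idx = sorted(fixed)
    free_idx = [k for k in range(n_nodes) if k not in fixed]
    vf = np.array([fixed[k] for k in fixed_idx])
    v = np.zeros(n_nodes)
    v[fixed_idx] = vf
    v[free_idx] = np.linalg.solve(L[np.ix_(free_idx, free_idx)],
                                  -L[np.ix_(free_idx, fixed_idx)] @ vf)
    return v


# Toy dataset: class 1 if x0 + x1 > 1 (deliberately linearly separable; a linear
# resistor network cannot express the nonlinear tasks solved in the paper).
X = rng.uniform(0.0, 1.0, (200, 2))
y = (X.sum(axis=1) > 1.0).astype(float)

eta, lr = 0.2, 0.05  # nudge amplitude and learning rate (assumed values)
for epoch in range(30):
    for x, target in zip(X, y):
        inputs = {0: x[0], 1: x[1], 2: 1.0, 3: 0.0}
        v_free = solve(inputs)                              # free state
        nudge = v_free[out] + eta * (target - v_free[out])  # pull output toward the label
        v_clamp = solve({**inputs, out: nudge})             # clamped (nudged) state
        # Local contrastive rule: each edge updates from its own two voltage drops only.
        for k, (i, j) in enumerate(edges):
            df = v_free[i] - v_free[j]
            dc = v_clamp[i] - v_clamp[j]
            g[k] = max(g[k] + (lr / eta) * (df ** 2 - dc ** 2), 1e-3)

pred = np.array([solve({0: a, 1: b, 2: 1.0, 3: 0.0})[out] > 0.5 for a, b in X])
print(f"training accuracy: {(pred == y).mean():.2f}")
```

In the physical system described by the abstract, the two solves would be replaced by the circuit simply settling to equilibrium, and the per-edge update would be carried out by each element's own local circuitry rather than by a processor.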
Submission Number: 12