Abstract: Analog hardware neural networks (NNs) that use a crossbar array of synapses to store the network weights provide an extremely fast and energy-efficient platform for implementing NN algorithms. Here, we design a crossbar network with a single conventional silicon-based MOSFET as the synapse. We model the synapse characteristics in SPICE, benchmarked against experimentally obtained data. We also design analog peripheral circuits for the neuron and for the synaptic weight-update calculation. Next, using circuit simulations, we demonstrate "on-chip" learning (training in hardware) in the designed network and obtain high classification accuracy on a standard machine learning dataset, Fisher's Iris dataset. A linear and symmetric conductance response and an easy, well-developed fabrication method are the two main advantages of our proposed transistor synapse over most synapses currently used for NN implementation.
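The abstract describes a crossbar array computing a vector-matrix product in the analog domain (Ohm's law and Kirchhoff's current law) and on-chip learning that adjusts each synapse conductance. The sketch below is an illustrative software analogue only, not the paper's circuit: all function names are hypothetical, and the update rule is a generic delta rule assumed to stand in for the paper's weight-update circuit.

```python
# Hypothetical software analogue of a crossbar NN. G[i][j] plays the role of
# a synapse conductance; input voltages v drive column currents I_j.
def crossbar_mvm(G, v):
    """Column output currents: I_j = sum_i G[i][j] * v[i]."""
    rows, cols = len(G), len(G[0])
    return [sum(G[i][j] * v[i] for i in range(rows)) for j in range(cols)]

def train_step(G, v, target, lr=0.1):
    """One delta-rule update; relies on a linear, symmetric conductance
    response, i.e. equal-sized increments and decrements are possible."""
    out = crossbar_mvm(G, v)
    for j, (o, t) in enumerate(zip(out, target)):
        for i in range(len(G)):
            G[i][j] += lr * (t - o) * v[i]  # symmetric +/- updates
    return G

# Toy usage: a 2x1 crossbar learns to map inputs (1, 1) to output 1.
G = [[0.0], [0.0]]
for _ in range(100):
    train_step(G, [1.0, 1.0], [1.0])
print(crossbar_mvm(G, [1.0, 1.0]))  # converges toward [1.0]
```

The appeal of the hardware version is that `crossbar_mvm` happens in a single analog step across the whole array, rather than as the nested loops shown here.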