Highlights

• The HaH deep neural network (DNN) toolbox incorporates neuro-inspired Hebbian/anti-Hebbian learning to produce sparse, strong activations intended to enhance robustness and interpretability.

• HaH is a software extension to PyTorch that enables adding costs that depend on layer-wise activations to conventional end-to-end DNN training, thereby providing finer control over features at intermediate layers.

• HaH is designed to be easily incorporated into conventional DNNs, with HaH layers replacing standard CNN/ReLU/batchnorm layers.
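To make the integration pattern concrete, the sketch below shows one plausible way such a drop-in layer and its activation-dependent cost could look in PyTorch. This is an illustrative assumption, not the actual HaH API: the class name HaHConv2d, the sparsity_weight parameter, and the specific top-k cost are hypothetical stand-ins for the Hebbian/anti-Hebbian costs described in the highlights.

```python
# Hypothetical sketch (not the HaH toolbox API): a layer that replaces a
# conv/ReLU/batchnorm stack and exposes a layer-wise activation cost that
# can be added to the usual end-to-end training loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HaHConv2d(nn.Module):
    """Conv block whose activations contribute an auxiliary cost intended to
    encourage sparse, strong activations (illustrative only)."""

    def __init__(self, in_ch, out_ch, k=3, sparsity_weight=1e-3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, k, padding=k // 2, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.sparsity_weight = sparsity_weight
        self.last_cost = torch.tensor(0.0)  # updated on every forward pass

    def forward(self, x):
        a = F.relu(self.bn(self.conv(x)))
        # Assumed example cost: penalize average activity while rewarding a few
        # strong activations per channel, i.e. "sparse, strong" responses.
        topk = a.flatten(2).topk(k=5, dim=-1).values.mean()
        self.last_cost = self.sparsity_weight * (a.mean() - topk)
        return a


def total_loss(task_loss, model):
    """Add the layer-wise activation costs to the conventional task loss."""
    aux = sum(m.last_cost for m in model.modules() if isinstance(m, HaHConv2d))
    return task_loss + aux
```

In this reading, the layer-wise costs are collected from each HaH-style block after the forward pass and summed into the standard loss, so existing end-to-end training loops need only swap the layer class and wrap the loss computation.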