-- Dependencies:
######################################################################################################
Python 3
PyTorch
NumPy
pip install adversarial-robustness-toolbox
######################################################################################################





-- Contents and instructions
######################################################################################################
-softhebb.py
Contains all the classes and functions for training a SoftHebb network,
testing its accuracy, and evaluating it against adversarial examples generated by the PGD attack.
Its code is imported by loss_plot.py (see below).

------------------------------------------------------------------------------------------------------

-adversarial attack example script
Run softhebb.py directly for the following actions:
it loads a pretrained SoftHebb model, generates MNIST adversarial examples of various
perturbation sizes epsilon targeting the loaded network, and displays them on screen.

------------------------------------------------------------------------------------------------------

-loss_plot.py
Exercises most of the functionality of softhebb.py.
Run loss_plot.py directly for the following sequence of actions:
1. Creates a SoftHebb model
2. Trains it on MNIST for one epoch
3. Tests the accuracy of the single SoftHebb layer
4. Shows its learned weights
------------------------------------------------------------------------------------------------------
5. Trains a supervised classifier on top with backprop
6. Tests the accuracy of this 2-layer model
---------------------------------------------------
7. Using the trained classifier, goes through the training of the unsupervised layer again and
   measures its post-hoc cross-entropy as it is minimized.
   The post-hoc cross-entropy method is described in the manuscript.
8. Trains a same-sized MLP with backpropagation
9. Logs the MLP's cross-entropy history
10. Plots the cross-entropy histories of SoftHebb and of the MLP
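The unsupervised update at the heart of steps 1-2 can be sketched as a soft winner-take-all Hebbian rule of the following form. This is only an illustration of the idea, not the script's implementation: the exact rule, learning rate, and layer sizes in softhebb.py may differ.

```python
import torch

torch.manual_seed(0)
n_in, n_hidden, lr = 784, 100, 0.01    # illustrative sizes, not the script's
W = 0.1 * torch.randn(n_hidden, n_in)  # stand-in for the layer's weights

def softhebb_step(W, x, lr):
    """One plasticity step: soft competition, then a Hebbian weight move."""
    u = W @ x                    # pre-activations, shape (n_hidden,)
    y = torch.softmax(u, dim=0)  # soft winner-take-all activations
    # each neuron i moves its weight vector toward x in proportion to y_i,
    # with a u_i-scaled decay term that keeps the weight norms bounded
    dW = lr * y.unsqueeze(1) * (x.unsqueeze(0) - u.unsqueeze(1) * W)
    return W + dW

x = torch.rand(n_in)  # stand-in for a flattened MNIST digit
W_new = softhebb_step(W, x, lr)
```

No labels or backpropagated error appear in the update, which is what makes the post-hoc cross-entropy comparison against the backprop-trained MLP (steps 7-10) meaningful.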
######################################################################################################