Code based on https://github.com/SayeedChowdhury/IIR-SNN

Modified for multi-level LIF integration and training

The ImageNet dataset is assumed to be prepared in ./data.

First train an ANN to initialize SNNs or LQ-ANNs. Use ann.py or ann_ddp.py for training, and ann.py for testing.
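A minimal sketch of this first step is below. The script names come from this README, but every flag shown (dataset, architecture, epochs) is an assumption about the CLI; check the scripts' argument parsers for the actual names.

```shell
# Hypothetical invocation -- flag names are assumptions, not the repo's
# documented interface. Verify with: python ann.py --help
python ann.py --dataset IMAGENET --architecture VGG16 --epochs 120

# Multi-GPU training would use the DDP variant instead (launch command
# and flags are likewise assumptions):
# torchrun --nproc_per_node=4 ann_ddp.py --dataset IMAGENET --architecture VGG16
```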

Then train LQ-ANNs using ann.py or ann_ddp.py by including the -quant_act parameter and providing the pre-trained ANN
checkpoint path via the -pretrained_ann parameter.
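For illustration, an LQ-ANN training run might look like the following. Only -quant_act and -pretrained_ann are named in this README; the checkpoint path is a placeholder, and whether -quant_act takes a value (e.g. a bit-width) is an assumption to verify against the script.

```shell
# Sketch only: the checkpoint path is a placeholder, and -quant_act may
# require an argument (e.g. activation bit-width) -- check ann.py.
python ann.py -quant_act -pretrained_ann ./trained_models/ann_checkpoint.pth
```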

To train SNNs, see either snn.py or snn_ddp.py and provide the pre-trained ANN checkpoint path via the -pretrained_ann
parameter. If batch norm was used when training the ANN, first run absorb_bn.py to fuse the batch-norm parameters
into the corresponding layer's parameters prior to SNN training, then use the newly generated checkpoint (with absorbed
parameters) as the -pretrained_ann argument. Otherwise, running absorb_bn.py is not needed.
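The two-step flow above might look as follows. The -pretrained_ann flag is from this README; the flag accepted by absorb_bn.py, the output checkpoint naming, and all paths are assumptions.

```shell
# Step 1 (only if the ANN was trained with batch norm): fuse BN parameters
# into the adjacent layers. Input flag and output path are assumptions.
python absorb_bn.py -pretrained_ann ./trained_models/ann_bn.pth

# Step 2: train the SNN from the (BN-absorbed) checkpoint.
python snn.py -pretrained_ann ./trained_models/ann_bn_absorbed.pth
```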

For training M-LIF SNNs on CIFAR10 or CIFAR100, see snn.py or snn_ddp.py
For training M-LIF SNNs on ImageNet, see snn_ddp.py
For testing M-LIF SNNs, see snn.py
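As a rough sketch, an ImageNet M-LIF run with snn_ddp.py might be launched as below. The distributed launcher and process count are assumptions inferred from the _ddp suffix (suggesting PyTorch DistributedDataParallel); only -pretrained_ann is documented here, and the checkpoint path is a placeholder.

```shell
# Hypothetical multi-GPU launch; verify the launcher and flags against
# snn_ddp.py before use.
torchrun --nproc_per_node=4 snn_ddp.py -pretrained_ann ./trained_models/ann_absorbed.pth

# Single-process testing would use snn.py (flags are assumptions):
# python snn.py -pretrained_ann ./trained_models/snn_checkpoint.pth --test_only
```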

For spike-driven transformer experiments, see ./spike_driven_transformer/README.txt
For dynamic image classification experiments, see ./neuromorphic_tasks/README.txt