Code folder structure

code
	|------- instructions.txt
	|------- fairseq
	|------- lln_attention.py
	|------- LLN_attention_moment_matching.ipynb

The code of LLN Attention is located in the lln_attention.py file.
To run pre-training of the RoBERTa model, we attach a copy of the fairseq code with lln_attention.py placed inside the fairseq/modules/ folder.
For detailed instructions on dataset preparation and model configuration, see: https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/README.md

Command line to run pre-training of the roberta-base model with LLN Attention:

export USE_LLN_ATTENTION=True
export USE_LLN_PLUS=True
python fairseq_cli/hydra_train.py -m --config-dir examples/roberta/config/pretraining --config-name base task.data=PATH_TO_WIKITEXT-103
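The two environment variables above toggle LLN Attention and its LLN+ variant at training time. The exact flag parsing inside lln_attention.py is not shown here; the sketch below illustrates one common way such string-valued environment flags are read in Python (the function name is ours, not from the repository):

```python
import os

def env_flag(name, default="False"):
    # Treat the usual truthy spellings ("True", "true", "1") as enabled.
    # Matches the convention of setting e.g. USE_LLN_ATTENTION=True in the shell.
    return os.environ.get(name, default).strip().lower() in ("true", "1", "yes")

# Example usage mirroring the flags above:
use_lln_attention = env_flag("USE_LLN_ATTENTION")
use_lln_plus = env_flag("USE_LLN_PLUS")
```

Note that child processes launched from the same shell inherit these variables, so exporting them once before hydra_train.py is sufficient.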


Moment matching:
The Jupyter notebook LLN_attention_moment_matching.ipynb contains the implementation and visualization of the moment matching procedure.
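The notebook itself holds the authoritative procedure; as a self-contained illustration of the underlying idea, matching a log-normal distribution to a target mean and variance has a closed-form solution obtained by inverting the log-normal moment formulas (this generic sketch is ours and does not reproduce the notebook's attention-specific steps):

```python
import math

def lognormal_moment_match(mean, var):
    """Return (mu, sigma) of a log-normal whose first two moments match the targets.

    Inverts the standard log-normal moment identities:
      mean = exp(mu + sigma^2 / 2)
      var  = (exp(sigma^2) - 1) * exp(2*mu + sigma^2)
    """
    sigma2 = math.log(1.0 + var / mean**2)
    mu = math.log(mean) - 0.5 * sigma2
    return mu, math.sqrt(sigma2)

# Example: find log-normal parameters with mean 2.0 and variance 3.0.
mu, sigma = lognormal_moment_match(2.0, 3.0)
```

Plugging the returned (mu, sigma) back into the moment identities recovers the target mean and variance exactly, which is a quick sanity check when experimenting with the notebook.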
