(i) For gpt2 or sgpt experiments, use the following command template:

python run_icl.py --model {sgpt OR gpt2}  --width 256 --n_heads 1 --n_layers 1  --T 1_000 --max_cl 100 --dim 20 --train_on 2 3 4 5 6 7 8 9 10 11 --test_on 2 3 4 5 6 7 8 9 10 11 




1. Linear regression: add --task Linear_regression --sigma 0.5 (or, for multiple noise levels, --sigma 0.1 0.5)
2. ReLU network: add --task NN_Relu --hid_dim_model 100
3. Decision tree: add --task tree
4. Sparse linear regression: add --task SparseLinear --sparsity 3

--T is the number of pre-training tasks.
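Several of the flags above (--sigma, --train_on, --test_on) take a space-separated list of values. A minimal sketch of how such multi-value flags are typically declared with argparse; this parser is illustrative only, the actual argument definitions in run_icl.py may differ:

```python
import argparse

# Hypothetical parser mirroring the documented flags (not the repo's code).
parser = argparse.ArgumentParser()
parser.add_argument("--task", type=str, default="Linear_regression")
parser.add_argument("--sigma", type=float, nargs="+", default=[0.5])  # one or more noise levels
parser.add_argument("--train_on", type=int, nargs="+")                # context lengths for pre-training
parser.add_argument("--test_on", type=int, nargs="+")                 # context lengths for evaluation

args = parser.parse_args(
    "--task Linear_regression --sigma 0.1 0.5 --train_on 2 3 4 --test_on 2 3 4".split()
)
print(args.sigma)     # [0.1, 0.5]
print(args.train_on)  # [2, 3, 4]
```

With nargs="+", passing a single value (--sigma 0.5) and passing several (--sigma 0.1 0.5) both work, which matches the two usages shown in the task list.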


(ii) For the MLP experiments, first change into the MLP_experiments folder, then use the following command template:

python mlp_run.py --T 100_000 --dim 8 --max_cl 200 --feats_method hilbert --contexts 10 20 30 40 50 60 70 80

1. Linear regression: add --task Linear_regression --sigma 0.5 (or, for multiple noise levels, --sigma 0.1 0.5)
2. ReLU network: add --task NN_Relu --hid_dim_model 100
3. Decision tree: add --task tree
4. Sparse linear regression: add --task SparseLinear --sparsity 3

--feats_method accepts the following options: {hilbert, gd, just_flatten, gd_feats_plus_flatten, hilbert_feats_plus_flatten}
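Since --feats_method only accepts a fixed set of strings, it is the kind of flag that argparse can validate with choices. A hedged sketch of that pattern, using the options listed above; the real parser in mlp_run.py may be written differently:

```python
import argparse

# The documented feature-extraction methods for the MLP experiments.
FEATS_METHODS = [
    "hilbert",
    "gd",
    "just_flatten",
    "gd_feats_plus_flatten",
    "hilbert_feats_plus_flatten",
]

# Hypothetical parser for illustration only.
parser = argparse.ArgumentParser()
parser.add_argument("--feats_method", choices=FEATS_METHODS, default="hilbert")
parser.add_argument("--contexts", type=int, nargs="+")  # context lengths to train on

args = parser.parse_args("--feats_method gd --contexts 10 20 30".split())
print(args.feats_method)  # gd
```

With choices set, passing an unlisted method (e.g. --feats_method foo) makes argparse exit with an error message listing the valid options.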



For better visualization, we suggest setting up your own wandb logging in "MLP_experiments/mlp_run.py" and "run_icl.py".
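A minimal sketch of what that wandb setup might look like; the project name, entity, and logged keys below are placeholders you would replace with your own (this is a config fragment, not code taken from the repo):

```python
import wandb

# Hypothetical setup: substitute your own project/entity names.
run = wandb.init(
    project="icl-experiments",  # assumed project name
    config={"task": "Linear_regression", "sigma": 0.5, "T": 1_000},
)

# Then, inside the training/evaluation loop, log metrics, e.g.:
# wandb.log({"test_loss": loss, "context_length": cl})

run.finish()
```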