Model save path: /content/drive/MyDrive/Neural Collapse/New_Models/bn_False_dataset_mlp3_epochs_100_lr_0.001_model_depth_MLP_6_model_type_MLP_rand_seed_314159_weight_decay_0.03.pth.tar
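The layer shapes reported below (16 -> 200, four 200 -> 200 hidden layers, then 200 -> 4) pin down the architecture that the bn_False / MLP_6 tags in the path describe. A minimal sketch of a consistent model, assuming ReLU activations and the class name (both are guesses, not taken from the original training code):

```python
import torch
import torch.nn as nn

class MLP6(nn.Module):
    """Hypothetical reconstruction of the logged MLP_6 model:
    six Linear layers matching the shapes in the log below,
    with no BatchNorm (bn_False) and an assumed ReLU nonlinearity."""
    def __init__(self, in_features=16, width=200, num_classes=4):
        super().__init__()
        layers = [nn.Linear(in_features, width), nn.ReLU()]   # Layer 0
        for _ in range(4):                                    # Layers 1-4
            layers += [nn.Linear(width, width), nn.ReLU()]
        layers += [nn.Linear(width, num_classes)]             # Layer 5
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

torch.manual_seed(314159)  # mirrors the rand_seed tag in the path
model = MLP6()
```

The seed, learning rate (0.001), weight decay (0.03), and epoch count (100) come straight from the filename tags; how the original script wires them into the optimizer is not recorded here.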
Training Set:
Layer 0: Linear(in_features=16, out_features=200, bias=True)
Linear Weight Norm: 1.930027723312378
Linear Weight Rank: 16
Intra Cos: 0.03299568220973015
Inter Cos: -0.006222447846084833
Norm Quadratic Average: 3.9938881397247314
Nearest Class Center Accuracy: 0.583375
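Each layer block reports the same four feature statistics. A sketch of plausible definitions, assuming Intra/Inter Cos are mean pairwise cosine similarities within and across classes, Norm Quadratic Average is the mean squared feature norm, and Nearest Class Center Accuracy classifies each sample by its closest class mean (all of these are assumptions; the original script may define them differently):

```python
import torch

def layer_stats(feats, labels, num_classes):
    """Hypothetical per-layer feature statistics matching the log fields.
    feats: (N, d) activations at one layer; labels: (N,) class indices."""
    f = torch.nn.functional.normalize(feats, dim=1)
    cos = f @ f.t()                                   # pairwise cosine matrix
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    off_diag = ~torch.eye(len(labels), dtype=torch.bool)
    intra = cos[same & off_diag].mean()               # same class, i != j
    inter = cos[~same].mean()                         # different classes
    norm_sq = feats.pow(2).sum(dim=1).mean()          # mean squared norm
    means = torch.stack([feats[labels == c].mean(0) for c in range(num_classes)])
    ncc_pred = torch.cdist(feats, means).argmin(dim=1)
    ncc_acc = (ncc_pred == labels).float().mean()
    return intra.item(), inter.item(), norm_sq.item(), ncc_acc.item()
```

Under these definitions the pattern in the log reads naturally: intra-class cosines stay slightly above inter-class cosines at every depth, while the mean squared norm shrinks by roughly an order of magnitude per layer.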

Layer 1: Linear(in_features=200, out_features=200, bias=True)
Linear Weight Norm: 1.9334341287612915
Linear Weight Rank: 179
Intra Cos: 0.023692836984992027
Inter Cos: -0.0025398952420800924
Norm Quadratic Average: 1.1475287675857544
Nearest Class Center Accuracy: 0.568

Layer 2: Linear(in_features=200, out_features=200, bias=True)
Linear Weight Norm: 1.9380524158477783
Linear Weight Rank: 179
Intra Cos: 0.020327052101492882
Inter Cos: -0.001298776245675981
Norm Quadratic Average: 0.09819193929433823
Nearest Class Center Accuracy: 0.542125

Layer 3: Linear(in_features=200, out_features=200, bias=True)
Linear Weight Norm: 1.9241931438446045
Linear Weight Rank: 179
Intra Cos: 0.02055853232741356
Inter Cos: 0.0021337547805160284
Norm Quadratic Average: 0.009225013665854931
Nearest Class Center Accuracy: 0.510375

Layer 4: Linear(in_features=200, out_features=200, bias=True)
Linear Weight Norm: 1.9265350103378296
Linear Weight Rank: 180
Intra Cos: 0.026467351242899895
Inter Cos: 0.003835814306512475
Norm Quadratic Average: 0.0009291875758208334
Nearest Class Center Accuracy: 0.505625

Layer 5: Linear(in_features=200, out_features=4, bias=True)
Linear Weight Norm: 0.28440478444099426
Linear Weight Rank: 4
Intra Cos: 0.026933826506137848
Inter Cos: 0.004777452442795038
Norm Quadratic Average: 9.106389916269109e-05
Nearest Class Center Accuracy: 0.498

Output Layer:
Intra Cos: 0.030416835099458694
Inter Cos: 0.013462486676871777
Norm Quadratic Average: 2.700761342566693e-06
Nearest Class Center Accuracy: 0.452875

Test Set:
Average Loss: 1.3862221336364746
Accuracy: 0.255
NC1 Within Class Collapse: 17.947032928466797
NC2 Equinorm: Features: 0.275180846452713, Weights: 0.033164624124765396
NC2 Equiangle: Features: 0.24507546424865723, Weights: 0.3132059574127197
NC3 Self-Duality: 1.5025302171707153
NC4 NCC Mismatch: 0.7995
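Note that the test loss 1.3862 equals ln 4 to four decimals and the accuracy 0.255 is near 1/4, i.e. chance level for four classes. The NC1-NC4 lines follow the neural-collapse metrics of Papyan, Han and Donoho (2020); a sketch of common formulations, with the exact formulas used by this script being assumptions:

```python
import torch

def nc_metrics(feats, labels, W, logits, num_classes):
    """Hypothetical NC1-NC4 metrics in the spirit of Papyan, Han & Donoho.
    feats: (N, d) penultimate features, W: (C, d) classifier weights,
    logits: (N, C) network outputs."""
    mu_g = feats.mean(0)
    means = torch.stack([feats[labels == c].mean(0) for c in range(num_classes)])
    M = means - mu_g                                  # centered class means (C, d)
    # NC1: within-class scatter relative to between-class scatter
    Sw = sum((feats[labels == c] - means[c]).t()
             @ (feats[labels == c] - means[c]) for c in range(num_classes)) / len(feats)
    Sb = M.t() @ M / num_classes
    nc1 = torch.trace(Sw @ torch.linalg.pinv(Sb)) / num_classes
    # NC2 equinorm: coefficient of variation of class-mean / weight-row norms
    m_norms, w_norms = M.norm(dim=1), W.norm(dim=1)
    nc2_feat = m_norms.std() / m_norms.mean()
    nc2_w = w_norms.std() / w_norms.mean()
    # NC3 self-duality: distance between normalized weights and class means
    nc3 = (W / W.norm() - M / M.norm()).norm() ** 2
    # NC4: fraction of samples where the network prediction disagrees
    # with the nearest-class-center rule
    ncc = torch.cdist(feats, means).argmin(dim=1)
    nc4 = (logits.argmax(dim=1) != ncc).float().mean()
    return nc1.item(), nc2_feat.item(), nc2_w.item(), nc3.item(), nc4.item()
```

Read this way, the logged values (NC1 around 18, NC3 around 1.5, NC4 near 0.8) indicate that this run is far from collapse on the test set, consistent with its chance-level accuracy.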

Layer 0: Linear(in_features=16, out_features=200, bias=True)
Linear Weight Norm: 1.930027723312378
Linear Weight Rank: 16
Intra Cos: 0.03043147176504135
Inter Cos: -0.004549722652882338
Norm Quadratic Average: 3.9951601028442383
Nearest Class Center Accuracy: 0.583

Layer 1: Linear(in_features=200, out_features=200, bias=True)
Linear Weight Norm: 1.9334341287612915
Linear Weight Rank: 179
Intra Cos: 0.023440534248948097
Inter Cos: -0.0015735283959656954
Norm Quadratic Average: 1.1506266593933105
Nearest Class Center Accuracy: 0.5565

Layer 2: Linear(in_features=200, out_features=200, bias=True)
Linear Weight Norm: 1.9380524158477783
Linear Weight Rank: 179
Intra Cos: 0.01901908591389656
Inter Cos: -0.0006810967461206019
Norm Quadratic Average: 0.0982675775885582
Nearest Class Center Accuracy: 0.547

Layer 3: Linear(in_features=200, out_features=200, bias=True)
Linear Weight Norm: 1.9241931438446045
Linear Weight Rank: 179
Intra Cos: 0.019103536382317543
Inter Cos: 0.002928722184151411
Norm Quadratic Average: 0.009228420443832874
Nearest Class Center Accuracy: 0.5225

Layer 4: Linear(in_features=200, out_features=200, bias=True)
Linear Weight Norm: 1.9265350103378296
Linear Weight Rank: 180
Intra Cos: 0.02706456556916237
Inter Cos: 0.0073187085799872875
Norm Quadratic Average: 0.0009282169048674405
Nearest Class Center Accuracy: 0.5185

Layer 5: Linear(in_features=200, out_features=4, bias=True)
Linear Weight Norm: 0.28440478444099426
Linear Weight Rank: 4
Intra Cos: 0.02829360030591488
Inter Cos: 0.00799677986651659
Norm Quadratic Average: 9.09338632482104e-05
Nearest Class Center Accuracy: 0.522

Output Layer:
Intra Cos: 0.02730461210012436
Inter Cos: 0.023537348955869675
Norm Quadratic Average: 2.681178330021794e-06
Nearest Class Center Accuracy: 0.442

