Checkpoint name: cifar100_resnet_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100_resnet_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
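The checkpoint name above encodes the run's hyperparameters: dataset, architecture, forget classes, forget-set size, learning rate, batch size, loss, weight decay, and seed. A minimal sketch of how such a name could be assembled — the config keys and the `run_name` helper are hypothetical, not the actual code:

```python
# Hypothetical sketch: assemble a run name like the one logged above from a
# config dict. Dots in floats become underscores, as in lr_0_0001 / wd_0_1.
def run_name(cfg):
    forget = str(cfg["forget_classes"])              # "[0, 1]"
    lr = str(cfg["lr"]).replace(".", "_")
    wd = str(cfg["wd"]).replace(".", "_")
    return (f'{cfg["dataset"]}_{cfg["arch"]}_forget_{forget}'
            f'_num_{cfg["num_forget"]}_lr_{lr}_bs_{cfg["bs"]}'
            f'_ls_{cfg["loss"]}_wd_{wd}_seed_{cfg["seed"]}')

cfg = dict(dataset="cifar100", arch="resnet_1_0", forget_classes=[0, 1],
           num_forget=400, lr=0.0001, bs=256, loss="ce", wd=0.1, seed=10)
print(run_name(cfg))
# cifar100_resnet_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
```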
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [15338 29579  4126  8315   903 37224 39736 16028 12397  4436   579 32852
 37528 38428  7530 26956 27023 32189  3241 22377 32457 32450 34417 27974
  2028 13724 13451 30857  7528 18501 38266  5010  2765  4733  8988 35537
 22720 36827 15273 16037 27110 11636 15418 34399  9609 16979 33780  5725
 31276  4066 33456 28677 27027   221 20110 21811  4686 26283 35600 11872
 15123  4765 24989  4979 28846 10733 21020 16489 39138 33602 17168 13944
 17806  8249 18486  4196 22661 10689 27228 31635 28761  9804 27730  9690
  1700 22913 10051 16165  6758 31988  3811  9138 35595  7883 39157  4881
  1595   280 36256 22893 21137 24784 10832  5715 20926 33999 21846 20947
 11596 11053  6350  7108 32428 16422  5213 35542 20818 17981 21791  1813
 34398 39742  8784  5650 19565 29415 36350 23161 10394 18137  6454 29019
 28113  9492 29261 16752 35880 19934 28809  6047 11761 27537  2972 18786
 36805 29986  9779  1462 21054  1585 23654 17526 13134 33315  7279 38803
 21105 13914  2033 15860 13659 22489 14458 11165 25628 24069 12811 24688
 12700   625 35879 38480 34610 28239 24471 13046 13100 29306 30595 11697
 35844  7911 34867 36566 20941 24676 32943  3401 11185 35311 32729 26997
 27298 34216 13323 10370 19145  7412 17223   353   251 19825 30759 29729
 31082 19591 20216 17928 27252 33141 27811  7482 12094 24208 38794 28166
 21188 11946 10287 12588  8013 19241 21333 20387 16137  8855 26872 24630
 35947 39010 34150 26599 25140 17774  1073 18866 30668 21119 15839 13886
  5896 33982 33563   810 15502 17508  9307  2644 30617 13699 39935  3338
 39546  6711 19081 11622 26924 13996  5139 25054  3760 30280 39636  6155
 30403 37127 22528  5444  5987 28612 35535 24245  7002  3820 11696 13097
 27068 12492 28743  9787 36354 37138 19503 26390  7557 26786 17683 20007
  8897  1702 33681 15850 23501  4390 31614  5417 19364 25932 31229 31030
 15037 21767  6472 32967  5066 20610  4655 16161  2538 24956 36133 27596
 30603 15081  7592 17892 23084  4479 24553 16920  3973 35392 29965 10463
 36118 13356 32034  5818 28389 23575  2629 23411  2884 12223 16361 34368
 35896 21114 26212 17385 23008 11582  2853 36094 34174 35283 33580  9752
 33044  8862 10230 36194  3010 27820 33297 29436 29513  2120 22027 12754
  1112 39738  3517 37591 10548 22759 11977 36602  1999 34618 25504 29196
 18571 13224  2782 31575 16108 34337 18030  2985 31530 28037 20599 32061
 32702 15947 31109 39064  7615 28852 33504 13252  1328 33488 25706  8032
  4627 24803  3333 32556]
forget Class: [0, 1]
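The 400 "Replacing indexes" listed above are the training-set positions assigned to the forget set. A sketch of how such a draw might look — the pool size of 40000 is only inferred from the fact that every logged index stays below it, and the RNG call is an assumption, not the actual selection code:

```python
import numpy as np

# Hypothetical sketch: draw num=400 distinct forget indexes with a fixed
# seed. Pool size 40_000 is an assumption inferred from the logged values.
rng = np.random.default_rng(seed=10)
forget_idx = rng.choice(40_000, size=400, replace=False)
print(len(forget_idx), int(forget_idx.min()) >= 0, int(forget_idx.max()) < 40_000)
```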
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [same 400 indexes as listed above]

Number of Classes: 100
logs/cifar100_resnet_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_resume_cp1304
==> unlearning ...
Computing current moments on test set
Computed moments: 10.932028585815429,8.130016621589661,-1.7629506289070531
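The three "Computed moments" values are consistent with a mean, a variance, and a third central moment of some per-sample statistic on the test set: across the log the middle value stays non-negative while the third value changes sign. That reading is an assumption; a purely illustrative sketch of central moments under it:

```python
import numpy as np

# Illustrative only: first three central moments of a sample, matching the
# "mean, variance, third central moment" reading of the log line above.
def central_moments(x):
    m1 = x.mean()
    m2 = ((x - m1) ** 2).mean()   # variance, always >= 0
    m3 = ((x - m1) ** 3).mean()   # can be negative, as in the log
    return m1, m2, m3

print(central_moments(np.array([1.0, 2.0, 3.0, 10.0])))  # (4.0, 12.5, 45.0)
```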
The MIA_loss has an accuracy of 0.912 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.550 on forgotten vs unseen images
Accuracy on test set: 1.3 , Racc: 1.2 , Uacc: 6.5
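MIA_loss and MIA_entropy are membership-inference attacks that try to separate forgotten from unseen images; 0.5 means the two sets are indistinguishable, so the 0.912 above says the model still clearly recognizes its forget samples before unlearning. A sketch of a single-threshold loss attack — the function name and the thresholding rule are assumptions, not the attack actually used here:

```python
import numpy as np

# Hedged sketch of a loss-based membership attack: find the single loss
# threshold that best separates forgotten (label 0) from unseen (label 1)
# samples, and report its balanced-style accuracy.
def mia_accuracy(forget_losses, unseen_losses):
    scores = np.concatenate([forget_losses, unseen_losses])
    labels = np.concatenate([np.zeros(len(forget_losses)),
                             np.ones(len(unseen_losses))])
    best = 0.0
    for t in np.unique(scores):
        acc = np.mean((scores >= t) == labels)
        best = max(best, acc, 1 - acc)  # either side of the threshold
    return best

# toy usage: well-separated losses give a perfect attack
print(mia_accuracy(np.array([0.1, 0.2, 0.15]),
                   np.array([2.0, 1.8, 2.2])))  # 1.0
```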
Forgetting epoch 0
Resetting retain iterator...
using alpha: 0.1
delta_val_loss: 6.816622257232666
delta_first_moment: 8.130016326904297
delta_second_moment: nan
The MIA_loss has an accuracy of 0.667 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.639 on forgotten vs unseen images
Accuracy on test set: 1.4 , Racc: 1.4 , Uacc: 0.0
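The "using alpha" values decay linearly from 0.1 by 0.004 per forgetting epoch; the noisy tails later in the log (e.g. 0.07999999999999999, 0.015999999999999945) are ordinary float64 round-off accumulating under repeated subtraction. A sketch reproducing that schedule (the decrement rule is inferred from the printed sequence):

```python
# Reproduce the per-epoch alpha values printed in the log: start at 0.1 and
# subtract 0.004 each forgetting epoch (inferred from the logged sequence).
alpha = 0.1
schedule = []
for epoch in range(25):
    schedule.append(alpha)
    alpha -= 0.004

print(schedule[1])   # close to 0.096, as in epoch 1 of the log
print(schedule[5])   # close to 0.08; the log shows 0.07999999999999999
```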
Forgetting epoch 1
using alpha: 0.096
delta_val_loss: 3.115950584411621
delta_first_moment: 8.130016326904297
delta_second_moment: nan
The MIA_loss has an accuracy of 0.684 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.701 on forgotten vs unseen images
Accuracy on test set: 1.3 , Racc: 1.3 , Uacc: 0.0
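Each evaluation reports three numbers: overall test accuracy, Racc on the retained classes, and Uacc on the unlearned classes — a reading consistent with Uacc dropping to 0.0 once forgetting starts while Racc tracks the test accuracy. A small sketch of such split metrics (the helper names and the subset rule are assumptions):

```python
import numpy as np

# Top-1 accuracy in percent, plus the same metric restricted to a class
# subset -- one plausible way the test / Racc / Uacc split is computed.
def top1(logits, labels):
    return 100.0 * np.mean(logits.argmax(axis=1) == labels)

def subset_acc(logits, labels, classes):
    mask = np.isin(labels, classes)
    return top1(logits[mask], labels[mask])

logits = np.eye(4)                    # sample i predicts class i
labels = np.array([0, 1, 2, 2])       # last sample mismatched on purpose
print(top1(logits, labels))           # 75.0
print(subset_acc(logits, labels, [0, 1]))  # Uacc/Racc-style subset: 100.0
```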
Forgetting epoch 2
using alpha: 0.092
delta_val_loss: 2.017470359802246
delta_first_moment: 8.130016326904297
delta_second_moment: nan
The MIA_loss has an accuracy of 0.704 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.726 on forgotten vs unseen images
Accuracy on test set: 0.9 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 3
using alpha: 0.088
delta_val_loss: 0.1394176483154297
delta_first_moment: 8.130016326904297
delta_second_moment: nan
The MIA_loss has an accuracy of 0.686 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.685 on forgotten vs unseen images
Accuracy on test set: 0.9 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 4
using alpha: 0.08399999999999999
delta_val_loss: 0.29930686950683594
delta_first_moment: 8.130016326904297
delta_second_moment: nan
Computing current moments on test set
Computed moments: 5.419968313598633,9.706025382995605,6.786920284967564
The MIA_loss has an accuracy of 0.688 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.691 on forgotten vs unseen images
Accuracy on test set: 0.9 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 5
using alpha: 0.07999999999999999
delta_val_loss: -5.327971935272217
delta_first_moment: 9.706025123596191
delta_second_moment: nan
The MIA_loss has an accuracy of 0.907 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.647 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 6
using alpha: 0.07599999999999998
delta_val_loss: -3.553849697113037
delta_first_moment: 9.706025123596191
delta_second_moment: nan
The MIA_loss has an accuracy of 0.929 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.633 on forgotten vs unseen images
Accuracy on test set: 1.3 , Racc: 1.3 , Uacc: 0.0
Forgetting epoch 7
using alpha: 0.07199999999999998
delta_val_loss: -0.7157993316650391
delta_first_moment: 9.706025123596191
delta_second_moment: nan
The MIA_loss has an accuracy of 0.844 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.619 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.6 , Uacc: 0.0
Forgetting epoch 8
using alpha: 0.06799999999999998
delta_val_loss: 0.05083942413330078
delta_first_moment: 9.706025123596191
delta_second_moment: nan
The MIA_loss has an accuracy of 0.871 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.619 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.6 , Uacc: 0.0
Forgetting epoch 9
using alpha: 0.06399999999999997
delta_val_loss: 0.15534019470214844
delta_first_moment: 9.706025123596191
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.633515043640137,0.07793186523914337,0.1919078693057584
The MIA_loss has an accuracy of 0.926 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.648 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.5 , Uacc: 0.0
Forgetting epoch 10
using alpha: 0.05999999999999997
delta_val_loss: -0.7971100807189941
delta_first_moment: 0.07793186604976654
delta_second_moment: nan
The MIA_loss has an accuracy of 0.954 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.637 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.5 , Uacc: 0.0
Forgetting epoch 11
using alpha: 0.055999999999999966
delta_val_loss: -0.7352538108825684
delta_first_moment: 0.07793186604976654
delta_second_moment: nan
The MIA_loss has an accuracy of 0.951 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.655 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 1.3 , Uacc: 0.0
Forgetting epoch 12
using alpha: 0.05199999999999996
delta_val_loss: -0.6368870735168457
delta_first_moment: 0.07793186604976654
delta_second_moment: nan
The MIA_loss has an accuracy of 0.873 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.631 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 1.8 , Uacc: 0.0
Forgetting epoch 13
using alpha: 0.04799999999999996
delta_val_loss: -0.4320240020751953
delta_first_moment: 0.07793186604976654
delta_second_moment: nan
The MIA_loss has an accuracy of 0.743 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.540 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.7 , Uacc: 0.0
Forgetting epoch 14
using alpha: 0.043999999999999956
delta_val_loss: -0.2201523780822754
delta_first_moment: 0.07793186604976654
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.607262271118164,0.038187719544768337,-0.2949838016734567
The MIA_loss has an accuracy of 0.725 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.489 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.5 , Uacc: 0.0
Forgetting epoch 15
using alpha: 0.03999999999999995
delta_val_loss: -0.2078251838684082
delta_first_moment: 0.038187719881534576
delta_second_moment: nan
The MIA_loss has an accuracy of 0.838 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.551 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 1.9 , Uacc: 0.0
Forgetting epoch 16
using alpha: 0.03599999999999995
delta_val_loss: -0.3248424530029297
delta_first_moment: 0.038187719881534576
delta_second_moment: nan
The MIA_loss has an accuracy of 0.872 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.554 on forgotten vs unseen images
Accuracy on test set: 1.9 , Racc: 2.0 , Uacc: 0.0
Forgetting epoch 17
using alpha: 0.031999999999999945
delta_val_loss: -0.43019533157348633
delta_first_moment: 0.038187719881534576
delta_second_moment: nan
The MIA_loss has an accuracy of 0.880 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.506 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 2.0 , Uacc: 0.0
Forgetting epoch 18
using alpha: 0.027999999999999945
delta_val_loss: -0.43033695220947266
delta_first_moment: 0.038187719881534576
delta_second_moment: nan
The MIA_loss has an accuracy of 0.909 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.626 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 19
using alpha: 0.023999999999999945
delta_val_loss: -0.46896791458129883
delta_first_moment: 0.038187719881534576
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.57854976348877,0.06069792111814022,-0.8811111592315596
The MIA_loss has an accuracy of 0.872 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.457 on forgotten vs unseen images
Accuracy on test set: 1.4 , Racc: 1.5 , Uacc: 0.0
Forgetting epoch 20
using alpha: 0.019999999999999945
delta_val_loss: -0.4481315612792969
delta_first_moment: 0.06069792062044144
delta_second_moment: nan
The MIA_loss has an accuracy of 0.879 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.498 on forgotten vs unseen images
Accuracy on test set: 1.9 , Racc: 1.9 , Uacc: 0.0
Forgetting epoch 21
using alpha: 0.015999999999999945
delta_val_loss: -0.3529658317565918
delta_first_moment: 0.06069792062044144
delta_second_moment: nan
The MIA_loss has an accuracy of 0.956 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.481 on forgotten vs unseen images
Accuracy on test set: 2.5 , Racc: 2.5 , Uacc: 0.0
Forgetting epoch 22
using alpha: 0.011999999999999945
delta_val_loss: -0.5614824295043945
delta_first_moment: 0.06069792062044144
delta_second_moment: nan
The MIA_loss has an accuracy of 0.961 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.480 on forgotten vs unseen images
Accuracy on test set: 2.6 , Racc: 2.7 , Uacc: 0.0
Forgetting epoch 23
using alpha: 0.007999999999999945
delta_val_loss: -0.5933423042297363
delta_first_moment: 0.06069792062044144
delta_second_moment: nan
The MIA_loss has an accuracy of 0.954 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.526 on forgotten vs unseen images
Accuracy on test set: 2.7 , Racc: 2.7 , Uacc: 0.0
Forgetting epoch 24
using alpha: 0.003999999999999945
delta_val_loss: -0.5744600296020508
delta_first_moment: 0.06069792062044144
delta_second_moment: nan
The MIA_loss has an accuracy of 0.951 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.476 on forgotten vs unseen images
x is [debug dump of image batches: normalized pixel tensors with 0/1 label tensors for the forget classes; log truncated mid-dump]
          [ 1.6851e-01,  1.3434e-02,  1.8790e-01,  ...,  8.0822e-01,
            5.9498e-01,  5.7560e-01]],

         [[-9.0391e-01, -8.2524e-01, -8.0557e-01,  ..., -7.4657e-01,
           -7.6624e-01, -8.4491e-01],
          [-8.0557e-01, -7.0724e-01, -6.6791e-01,  ..., -6.2857e-01,
           -6.4824e-01, -7.2691e-01],
          [-7.4657e-01, -6.4824e-01, -6.4824e-01,  ..., -6.0891e-01,
           -6.2857e-01, -6.8757e-01],
          ...,
          [-1.1723e-01,  7.6703e-04,  1.3844e-01,  ...,  4.9244e-01,
            5.1211e-01,  4.9244e-01],
          [ 7.6703e-04,  1.5810e-01,  2.3677e-01,  ...,  6.8911e-01,
            6.4977e-01,  5.7111e-01],
          [ 9.9101e-02,  2.1710e-01,  2.7610e-01,  ...,  8.2677e-01,
            7.8744e-01,  6.3011e-01]],

         [[-3.0938e-01, -1.5330e-01, -2.7036e-01,  ..., -2.3134e-01,
           -1.1428e-01, -1.7281e-01],
          [-1.5330e-01,  2.7802e-03, -9.4771e-02,  ..., -1.6730e-02,
            8.0821e-02,  6.1311e-02],
          [-2.3134e-01, -7.5261e-02, -1.9232e-01,  ..., -9.4771e-02,
            4.1801e-02,  2.7802e-03],
          ...,
          [ 6.2711e-01,  8.0270e-01,  8.0270e-01,  ...,  9.0025e-01,
            1.0368e+00,  1.0368e+00],
          [ 6.4662e-01,  8.6123e-01,  8.2221e-01,  ...,  1.1734e+00,
            1.2514e+00,  1.1929e+00],
          [ 6.4662e-01,  8.4172e-01,  7.6368e-01,  ...,  1.3100e+00,
            1.3880e+00,  1.2709e+00]]],


        [[[ 1.8790e-01, -4.4721e-02, -4.4721e-02,  ..., -9.5581e-01,
           -7.8135e-01, -8.3950e-01],
          [ 4.5929e-01,  2.0728e-01, -5.9512e-03,  ..., -8.0073e-01,
           -4.1303e-01, -3.9365e-01],
          [ 4.7867e-01,  3.2359e-01,  3.2819e-02,  ..., -5.6811e-01,
           -1.2226e-01, -4.4721e-02],
          ...,
          [-9.9458e-01, -1.1303e+00, -1.6103e-01,  ..., -8.0073e-01,
           -1.0527e+00, -1.0140e+00],
          [-9.3643e-01, -1.2272e+00, -1.2078e+00,  ..., -3.3549e-01,
           -6.8442e-01, -9.5581e-01],
          [-9.3643e-01, -1.0721e+00, -1.2078e+00,  ..., -1.4165e-01,
           -3.3549e-01, -6.8442e-01]],

         [[ 3.5477e-01,  1.3844e-01,  1.5810e-01,  ..., -6.6791e-01,
           -6.8757e-01, -1.0219e+00],
          [ 6.3011e-01,  4.1377e-01,  2.1710e-01,  ..., -5.6957e-01,
           -6.8757e-01, -9.8258e-01],
          [ 6.4977e-01,  5.3177e-01,  2.3677e-01,  ..., -4.9090e-01,
           -6.8757e-01, -8.8424e-01],
          ...,
          [-4.9090e-01, -6.4824e-01, -9.7567e-02,  ..., -5.4990e-01,
           -7.0724e-01, -5.4990e-01],
          [-4.7124e-01, -7.2691e-01, -1.0809e+00,  ..., -1.1723e-01,
           -3.5324e-01, -4.7124e-01],
          [-5.1057e-01, -5.1057e-01, -9.4324e-01,  ...,  4.0101e-02,
           -3.8567e-02, -2.3524e-01]],

         [[-1.2654e+00, -1.2849e+00, -1.1678e+00,  ..., -1.3044e+00,
           -1.1873e+00, -1.4215e+00],
          [-1.0898e+00, -1.1288e+00, -1.1678e+00,  ..., -1.2654e+00,
           -1.3825e+00, -1.5580e+00],
          [-1.0898e+00, -1.0703e+00, -1.1288e+00,  ..., -1.2459e+00,
           -1.4995e+00, -1.5580e+00],
          ...,
          [-1.9678e+00, -1.4020e+00, -5.5751e-02,  ..., -9.7273e-01,
           -1.0508e+00, -8.3616e-01],
          [-1.9873e+00, -1.7922e+00, -1.5385e+00,  ..., -7.1910e-01,
           -8.7518e-01, -9.3371e-01],
          [-1.9873e+00, -1.8507e+00, -1.8702e+00,  ..., -6.9959e-01,
           -7.3861e-01, -8.5567e-01]]],


        ...,


        [[[ 2.6544e-01,  2.2667e-01,  2.8482e-01,  ...,  1.1571e+00,
            1.1378e+00,  8.4699e-01],
          [ 2.2667e-01,  2.6544e-01,  2.8482e-01,  ...,  9.6329e-01,
            1.0990e+00,  8.6637e-01],
          [ 5.1744e-01,  3.6236e-01,  2.0728e-01,  ...,  7.1129e-01,
            7.3068e-01,  7.1129e-01],
          ...,
          [-6.6504e-01, -1.2226e-01,  1.2974e-01,  ...,  9.0514e-01,
            2.0489e+00,  1.8744e+00],
          [-6.2627e-01,  1.8790e-01,  1.4913e-01,  ...,  5.1744e-01,
            1.5836e+00,  1.4479e+00],
          [-1.0140e+00, -1.6103e-01, -1.9980e-01,  ...,  4.3990e-01,
            1.0021e+00,  6.7252e-01]],

         [[ 9.4478e-01,  8.8578e-01,  1.0431e+00,  ...,  1.7708e+00,
            1.6331e+00,  1.4561e+00],
          [ 8.6611e-01,  8.6611e-01,  1.0038e+00,  ...,  1.5938e+00,
            1.6331e+00,  1.4561e+00],
          [ 1.1218e+00,  9.0544e-01,  8.8578e-01,  ...,  1.4364e+00,
            1.4168e+00,  1.4168e+00],
          ...,
          [-4.9090e-01,  2.1710e-01,  6.1044e-01,  ...,  1.2398e+00,
            2.1641e+00,  2.0855e+00],
          [-3.7290e-01,  6.6944e-01,  7.4811e-01,  ...,  9.0544e-01,
            1.8495e+00,  1.7511e+00],
          [-9.0391e-01,  1.5810e-01,  2.5644e-01,  ...,  8.6611e-01,
            1.3774e+00,  1.0038e+00]],

         [[ 2.2291e-02, -1.6730e-02, -1.6730e-02,  ...,  3.5397e-01,
            2.1739e-01,  6.1311e-02],
          [-7.5261e-02,  2.7802e-03,  6.1311e-02,  ...,  1.0033e-01,
            1.5886e-01,  6.1311e-02],
          [ 4.1801e-02, -5.5751e-02, -5.5751e-02,  ..., -5.5751e-02,
           -5.5751e-02,  4.1801e-02],
          ...,
          [-8.7518e-01, -3.4841e-01, -1.3379e-01,  ...,  8.0821e-02,
            4.9054e-01,  4.1250e-01],
          [-8.5567e-01, -9.4771e-02, -5.5751e-02,  ..., -1.7281e-01,
            2.9543e-01,  2.1739e-01],
          [-1.1678e+00, -4.0694e-01, -3.0938e-01,  ..., -1.1428e-01,
            1.5886e-01, -3.6240e-02]]],


        [[[-1.9638e+00, -2.0026e+00, -2.0026e+00,  ...,  2.0682e+00,
            2.3396e+00,  2.4753e+00],
          [-1.9638e+00, -2.0026e+00, -2.0220e+00,  ...,  1.7775e+00,
            1.8938e+00,  2.2233e+00],
          [-2.0026e+00, -2.0220e+00, -2.0414e+00,  ...,  1.7775e+00,
            1.6612e+00,  1.6805e+00],
          ...,
          [-2.1771e+00, -2.2158e+00, -2.2158e+00,  ..., -9.9458e-01,
           -6.6504e-01, -6.6504e-01],
          [-1.0334e+00, -1.7312e+00, -1.9444e+00,  ..., -1.1303e+00,
           -5.2934e-01, -5.4873e-01],
          [-4.7119e-01, -3.1611e-01, -2.9672e-01,  ..., -3.3549e-01,
           -5.9512e-03,  2.2667e-01]],

         [[-1.9463e+00, -1.9856e+00, -1.9856e+00,  ...,  2.1641e+00,
            2.4395e+00,  2.5378e+00],
          [-1.9463e+00, -1.9856e+00, -2.0053e+00,  ...,  1.8888e+00,
            1.9871e+00,  2.3018e+00],
          [-1.9856e+00, -2.0053e+00, -2.0249e+00,  ...,  1.7904e+00,
            1.6921e+00,  1.7118e+00],
          ...,
          [-2.1626e+00, -2.2019e+00, -2.2019e+00,  ..., -2.5490e-01,
            5.5144e-01,  3.5477e-01],
          [-1.0022e+00, -1.7102e+00, -1.9266e+00,  ..., -4.3190e-01,
            6.3011e-01,  3.9410e-01],
          [-4.3190e-01, -2.7457e-01, -2.5490e-01,  ...,  4.9244e-01,
            1.2791e+00,  1.2594e+00]],

         [[-1.7531e+00, -1.7922e+00, -1.7922e+00,  ...,  1.5051e+00,
            1.8367e+00,  2.1879e+00],
          [-1.7531e+00, -1.7922e+00, -1.8117e+00,  ...,  1.2319e+00,
            1.3490e+00,  1.8172e+00],
          [-1.7922e+00, -1.8117e+00, -1.8312e+00,  ...,  1.2905e+00,
            1.1344e+00,  1.2124e+00],
          ...,
          [-1.9678e+00, -2.0068e+00, -2.0068e+00,  ..., -1.5776e+00,
           -1.6946e+00, -1.9678e+00],
          [-8.1665e-01, -1.5190e+00, -1.7336e+00,  ..., -1.7336e+00,
           -1.5190e+00, -1.6751e+00],
          [-2.5085e-01, -9.4771e-02, -7.5261e-02,  ..., -7.1910e-01,
           -5.0449e-01, -2.5085e-01]]],


        [[[-1.5374e+00, -8.7827e-01, -7.6196e-01,  ..., -1.4017e+00,
           -1.5567e+00, -1.6343e+00],
          [-1.0140e+00, -7.8135e-01, -6.8442e-01,  ..., -1.4211e+00,
           -1.5761e+00, -1.6343e+00],
          [-1.1109e+00, -8.0073e-01, -5.8750e-01,  ..., -1.6537e+00,
           -1.5180e+00, -1.4986e+00],
          ...,
          [-1.4986e+00, -1.7506e+00, -1.3823e+00,  ..., -1.2466e+00,
           -1.4017e+00, -1.7700e+00],
          [-1.6537e+00, -1.7312e+00, -1.4211e+00,  ..., -1.0721e+00,
           -1.2660e+00, -1.6343e+00],
          [-1.7894e+00, -1.6149e+00, -1.4017e+00,  ..., -4.9057e-01,
           -6.2627e-01, -7.0381e-01]],

         [[-3.8567e-02,  9.4478e-01,  1.3184e+00,  ...,  4.9244e-01,
            3.5477e-01,  2.1710e-01],
          [ 7.2844e-01,  1.1414e+00,  1.3774e+00,  ...,  3.9410e-01,
            3.5477e-01,  1.9744e-01],
          [ 6.8911e-01,  1.1414e+00,  1.4168e+00,  ..., -9.7567e-02,
            9.9101e-02, -1.8900e-02],
          ...,
          [-1.2776e+00, -1.5332e+00, -1.1399e+00,  ..., -1.0022e+00,
           -1.1596e+00, -1.5332e+00],
          [-1.4349e+00, -1.5332e+00, -1.1792e+00,  ..., -8.6457e-01,
           -1.0219e+00, -1.3956e+00],
          [-1.5726e+00, -1.4152e+00, -1.1596e+00,  ..., -2.9424e-01,
           -3.7290e-01, -4.5157e-01]],

         [[ 8.8074e-01,  2.2855e+00,  2.6562e+00,  ...,  2.0904e+00,
            1.9538e+00,  1.7782e+00],
          [ 2.0123e+00,  2.6367e+00,  2.6952e+00,  ...,  1.8563e+00,
            1.9343e+00,  1.7002e+00],
          [ 2.1879e+00,  2.6952e+00,  2.6757e+00,  ...,  1.0758e+00,
            1.3490e+00,  1.1344e+00],
          ...,
          [-1.2459e+00, -1.4995e+00, -1.2264e+00,  ..., -9.5322e-01,
           -1.0898e+00, -1.4605e+00],
          [-1.4020e+00, -1.4800e+00, -1.2654e+00,  ..., -7.7763e-01,
           -9.3371e-01, -1.3239e+00],
          [-1.5190e+00, -1.3629e+00, -1.2459e+00,  ..., -1.7281e-01,
           -3.0938e-01, -3.8743e-01]]]]), tensor([0, 0, 1, 0, 1, 0, 1, 1, 1, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0,
        1, 1, 1, 1, 1, 1, 0, 0, 1, 1, 1, 1, 1, 0, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0,
        0, 1, 1, 1, 1, 1, 0, 0, 1, 0, 0, 0, 1, 1, 1, 1])]
x is [tensor([[[[ 2.3978,  2.2039,  2.4947,  ...,  1.9907,  1.7193,  1.6030],
          [ 2.1652,  2.1264,  2.4365,  ...,  2.2815,  1.9713,  1.6612],
          [ 1.9132,  2.0489,  2.3784,  ...,  1.5255,  0.8470,  0.0716],
          ...,
          [-1.2466, -1.3629, -0.5875,  ...,  1.0602,  0.9051,  0.8858],
          [-1.3241, -0.6069, -0.2580,  ...,  0.8664,  0.7307,  0.7113],
          [-0.8977, -0.6650, -0.9752,  ...,  0.6531,  0.5174,  0.4593]],

         [[ 1.8495,  1.6528,  2.2428,  ...,  0.7678,  0.5711,  0.3941],
          [ 1.6528,  1.6134,  2.0658,  ...,  1.2791,  0.9054,  0.5514],
          [ 1.3578,  1.5544,  2.0068,  ...,  0.9054,  0.2564, -0.4319],
          ...,
          [-1.3366, -1.4939, -1.1399,  ...,  0.7284,  0.6104,  0.5711],
          [-1.5529, -1.1202, -0.7466,  ...,  0.5318,  0.4138,  0.3744],
          [-1.1596, -1.1202, -1.2972,  ...,  0.3548,  0.2171,  0.1581]],

         [[ 1.1344,  0.9393,  1.5441,  ..., -0.7776, -0.8557, -0.9532],
          [ 0.8807,  0.8612,  1.3685,  ..., -0.0558, -0.4264, -0.7776],
          [ 0.5491,  0.7247,  1.2514,  ..., -0.0753, -0.6215, -1.1288],
          ...,
          [-1.2459, -1.3044, -1.3044,  ...,  0.2369,  0.1198,  0.0808],
          [-1.4215, -1.2654, -1.0703,  ...,  0.0808, -0.0362, -0.0753],
          [-1.1873, -1.2264, -1.2849,  ..., -0.0948, -0.2118, -0.2899]]],


        [[[-0.9558, -1.0140, -1.1303,  ..., -1.6343, -1.8475, -2.0220],
          [-0.8977, -0.8589, -1.1497,  ..., -1.6731, -1.8669, -2.0608],
          [-0.9558, -1.0527, -1.3435,  ..., -1.5955, -1.7700, -1.8863],
          ...,
          [-1.7700, -1.8281, -1.8669,  ..., -0.5875, -1.5955, -1.9638],
          [-1.7118, -1.7118, -1.7700,  ..., -0.2192, -1.4017, -1.8669],
          [-1.7506, -1.7312, -1.7700,  ..., -0.0641, -1.2660, -1.7700]],

         [[-0.3139, -0.2942, -0.3139,  ..., -0.7269, -1.0416, -1.2579],
          [-0.2549, -0.1369, -0.3139,  ..., -0.7269, -0.9629, -1.2579],
          [-0.3139, -0.3336, -0.5106,  ..., -0.6482, -0.8056, -1.0416],
          ...,
          [-1.5922, -1.6512, -1.6709,  ..., -0.6089, -1.3366, -1.5136],
          [-1.4939, -1.5136, -1.5529,  ..., -0.1959, -1.2186, -1.5726],
          [-1.5332, -1.4939, -1.5136,  ...,  0.0204, -1.1596, -1.6119]],

         [[-0.0558, -0.0167, -0.0167,  ..., -0.9922, -0.8752, -0.8947],
          [-0.0362,  0.1198, -0.0362,  ..., -1.0898, -0.9922, -1.0313],
          [-0.1338, -0.1143, -0.2899,  ..., -0.9922, -0.8947, -0.9337],
          ...,
          [-1.3239, -1.3825, -1.4410,  ..., -1.3239, -1.7531, -1.8312],
          [-1.1093, -1.1483, -1.2654,  ..., -1.1483, -1.8117, -1.8117],
          [-1.0313, -1.0703, -1.1873,  ..., -1.1288, -1.8897, -1.7727]]],


        [[[ 2.3978,  2.3590,  2.3590,  ...,  2.4753,  2.5141,  2.5141],
          [ 2.3396,  1.9132,  1.4479,  ...,  2.3978,  2.4947,  2.4947],
          [ 2.3590,  1.5642,  1.2153,  ...,  2.4753,  2.5141,  2.5141],
          ...,
          [ 2.4753,  1.3122, -0.1029,  ...,  2.3784,  2.5141,  2.5141],
          [ 2.3978,  1.8550,  1.3122,  ...,  2.4172,  2.5141,  2.5141],
          [ 2.3396,  2.3202,  2.4172,  ...,  2.4753,  2.5141,  2.5141]],

         [[ 2.5575,  2.5181,  2.5181,  ...,  2.5575,  2.5968,  2.5968],
          [ 2.4985,  2.0658,  1.6331,  ...,  2.4788,  2.5771,  2.5771],
          [ 2.5181,  1.7511,  1.5348,  ...,  2.5575,  2.5968,  2.5968],
          ...,
          [ 2.5575,  1.1611, -1.0219,  ...,  2.4591,  2.5968,  2.5968],
          [ 2.5378,  1.9281,  1.0234,  ...,  2.4985,  2.5968,  2.5968],
          [ 2.4985,  2.4788,  2.4985,  ...,  2.5575,  2.5968,  2.5968]],

         [[ 2.7537,  2.6952,  2.6562,  ...,  2.7147,  2.7537,  2.7537],
          [ 2.6952,  2.2270,  1.5246,  ...,  2.6367,  2.7342,  2.7342],
          [ 2.6757,  1.6612,  0.7637,  ...,  2.7147,  2.7537,  2.7537],
          ...,
          [ 2.6952,  1.2319, -0.8362,  ...,  2.6172,  2.7537,  2.7537],
          [ 2.6562,  1.9733,  1.0954,  ...,  2.6562,  2.7537,  2.7537],
          [ 2.6367,  2.5781,  2.5976,  ...,  2.7147,  2.7537,  2.7537]]],


        ...,


        [[[-0.7232, -0.6069, -0.4518,  ..., -0.1029, -0.1029, -0.6650],
          [-0.7620, -0.6650, -0.5100,  ..., -0.0253,  0.2461, -0.0060],
          [-0.8783, -0.7038, -0.5487,  ..., -0.1029,  0.2267, -0.0447],
          ...,
          [ 1.2735,  1.0408,  1.1765,  ...,  1.2541,  1.0990,  0.9439],
          [ 1.3316,  1.1765,  1.0602,  ...,  1.1765,  1.2347,  1.0990],
          [ 0.9439,  0.7888,  0.7888,  ...,  0.8858,  1.3510,  1.2928]],

         [[-0.6876, -0.6482, -0.4909,  ..., -0.1566, -0.0779, -0.5499],
          [-0.7662, -0.7072, -0.5696,  ..., -0.0582,  0.2564,  0.0598],
          [-0.8842, -0.7466, -0.6089,  ..., -0.1369,  0.2564, -0.0386],
          ...,
          [ 0.9251,  0.7678,  0.9054,  ...,  1.0431,  0.9054,  0.7678],
          [ 0.9054,  0.9054,  0.7678,  ...,  1.0431,  0.9644,  0.8858],
          [ 0.5711,  0.5711,  0.4531,  ...,  0.6104,  1.2201,  1.0628]],

         [[-0.4655, -0.5045, -0.5630,  ..., -0.4264, -0.3679, -0.7581],
          [-0.4460, -0.5240, -0.5630,  ..., -0.4069, -0.0948, -0.4069],
          [-0.6411, -0.6215, -0.5435,  ..., -0.4069, -0.0167, -0.2899],
          ...,
          [ 0.8027,  0.6856,  0.8222,  ...,  0.9003,  0.8417,  0.6661],
          [ 0.7052,  0.6856,  0.5686,  ...,  0.9198,  0.7052,  0.7832],
          [ 0.2564,  0.2954,  0.2759,  ...,  0.5296,  1.2319,  1.0368]]],


        [[[-1.6537, -1.6731, -1.6537,  ..., -1.6731, -1.7894, -1.8863],
          [-1.6343, -1.6537, -1.6149,  ..., -1.6149, -1.7700, -1.8669],
          [-1.5955, -1.6149, -1.5761,  ..., -1.5955, -1.7700, -1.8669],
          ...,
          [-1.1690, -1.1690, -1.1109,  ..., -1.3241, -1.6149, -1.6924],
          [-1.2272, -1.2078, -1.1690,  ..., -1.3629, -1.6537, -1.8669],
          [-1.2660, -1.2466, -1.2078,  ..., -1.4404, -1.7118, -1.9057]],

         [[-1.3759, -1.2776, -1.1792,  ..., -1.3169, -1.6316, -1.8479],
          [-1.3562, -1.2579, -1.1399,  ..., -1.2579, -1.5922, -1.8282],
          [-1.3169, -1.2186, -1.1006,  ..., -1.2382, -1.6119, -1.8479],
          ...,
          [-0.7466, -0.6876, -0.5696,  ..., -0.9236, -1.3956, -1.6906],
          [-0.8449, -0.7859, -0.6679,  ..., -1.0219, -1.4546, -1.8282],
          [-0.9236, -0.8646, -0.7466,  ..., -1.1596, -1.5332, -1.8676]],

         [[-1.4215, -1.3629, -1.3044,  ..., -1.2069, -1.4215, -1.5385],
          [-1.4020, -1.3434, -1.2654,  ..., -1.1483, -1.3825, -1.5190],
          [-1.3825, -1.3044, -1.2264,  ..., -1.1288, -1.4020, -1.5190],
          ...,
          [-0.8557, -0.8557, -0.8167,  ..., -0.8752, -1.2459, -1.4020],
          [-0.9337, -0.9337, -0.8947,  ..., -0.9532, -1.3044, -1.5580],
          [-1.0118, -1.0118, -0.9727,  ..., -1.0703, -1.3629, -1.5971]]],


        [[[-1.9638, -1.8475, -1.3241,  ..., -2.3128, -1.9444, -2.3515],
          [-1.2660, -1.4404, -0.4130,  ..., -2.0414, -1.4211, -1.8669],
          [-1.1690, -0.8395, -0.4712,  ..., -2.2546, -1.9057, -1.9251],
          ...,
          [-2.4097, -2.2934, -1.2854,  ..., -0.8201, -1.7312, -1.2466],
          [-2.3709, -2.2546, -2.1383,  ..., -0.4518, -1.3823, -0.9752],
          [-2.4291, -2.2158, -2.4291,  ..., -1.3435, -1.6343, -1.2272]],

         [[-0.9826, -1.1792, -0.8449,  ..., -1.8676, -1.4349, -1.6316],
          [-0.4516, -0.9236, -0.1369,  ..., -1.7889, -1.1792, -1.2579],
          [-0.5302, -0.5106, -0.3926,  ..., -2.2413, -1.8873, -1.4546],
          ...,
          [-1.9463, -1.9463, -0.9432,  ..., -0.4516, -1.3562, -0.5499],
          [-1.9069, -1.8479, -1.8282,  ..., -0.1172, -1.0416, -0.2942],
          [-1.9069, -1.6906, -2.1233,  ..., -0.8056, -1.1006, -0.4122]],

         [[-2.0653, -1.9092, -1.3629,  ..., -2.2214, -2.0458, -2.2214],
          [-1.2459, -1.3239, -0.3094,  ..., -1.9678, -1.4410, -1.9678],
          [-1.0118, -0.6801, -0.4069,  ..., -2.1043, -1.8312, -2.0848],
          ...,
          [-2.2214, -2.0848, -1.0118,  ..., -0.8362, -1.7336, -1.4800],
          [-2.2214, -2.0653, -1.8702,  ..., -0.3679, -1.3044, -1.1873],
          [-2.2214, -2.2019, -2.2214,  ..., -1.4215, -1.7922, -1.6361]]]]), tensor([0, 1, 0, 0, 0, 0, 1, 1, 1, 0, 1, 1, 1, 0, 0, 0, 0, 1, 1, 0, 0, 1, 0, 1,
        1, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1, 1, 1, 0,
        0, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 0, 0, 1, 1, 0])]
x is [tensor([[[[ 2.4753,  2.4365,  2.4753,  ...,  2.4947,  2.4753,  2.4947],
          [ 2.4365,  2.3590,  2.4365,  ...,  2.4753,  2.4559,  2.4753],
          [ 2.3590,  2.4365,  2.4559,  ...,  2.4753,  2.4559,  2.4947],
          ...,
          [ 2.4753,  2.4172,  2.3978,  ...,  2.4753,  2.4559,  2.4753],
          [ 2.4947,  2.4559,  2.4559,  ...,  2.4947,  2.4753,  2.4947],
          [ 2.4947,  2.4753,  2.4947,  ...,  2.4947,  2.4753,  2.4947]],

         [[ 2.5378,  2.5378,  2.5575,  ...,  2.5771,  2.5575,  2.5771],
          [ 2.5181,  2.5378,  2.4788,  ...,  2.5575,  2.5378,  2.5575],
          [ 2.5968,  2.4985,  1.8691,  ...,  2.5575,  2.5575,  2.5771],
          ...,
          [ 2.5575,  2.4985,  2.4788,  ...,  2.5575,  2.5378,  2.5575],
          [ 2.5771,  2.5378,  2.5378,  ...,  2.5771,  2.5575,  2.5771],
          [ 2.5771,  2.5575,  2.5771,  ...,  2.5771,  2.5575,  2.5771]],

         [[ 2.7537,  2.7147,  2.6757,  ...,  2.7342,  2.7147,  2.7342],
          [ 2.7147,  2.7147,  2.5001,  ...,  2.7147,  2.6952,  2.7147],
          [ 2.6562,  2.5196,  1.7587,  ...,  2.7147,  2.7147,  2.7342],
          ...,
          [ 2.7147,  2.6562,  2.6367,  ...,  2.7147,  2.6952,  2.7147],
          [ 2.7342,  2.6952,  2.6952,  ...,  2.7342,  2.7147,  2.7342],
          [ 2.7342,  2.7147,  2.7342,  ...,  2.7342,  2.7147,  2.7342]]],


        [[[-0.8201, -0.8395, -0.7426,  ..., -1.2078, -0.9946, -1.1690],
          [-1.2466, -1.1303, -0.8201,  ..., -1.3629, -1.3047, -1.4404],
          [-1.6731, -1.0334, -0.6263,  ..., -1.5374, -1.4986, -1.4986],
          ...,
          [ 0.5562,  0.4787,  0.4399,  ..., -0.1804, -0.5875, -0.2773],
          [ 0.4981,  0.4011,  0.4787,  ..., -0.2967, -0.6069, -0.3743],
          [ 0.4399,  0.7888,  0.6338,  ...,  0.3042,  0.0134, -0.1610]],

         [[-0.4319, -0.5302, -0.5892,  ..., -1.0022, -0.8056, -0.9629],
          [-0.9236, -0.9432, -0.8449,  ..., -1.1989, -1.1399, -1.2579],
          [-1.4349, -1.0219, -0.8449,  ..., -1.3956, -1.3366, -1.2972],
          ...,
          [ 0.7678,  0.6498,  0.6301,  ..., -0.0582, -0.3729,  0.0204],
          [ 0.6498,  0.5318,  0.6104,  ..., -0.1566, -0.3926, -0.0976],
          [ 0.5908,  0.9251,  0.7678,  ...,  0.4924,  0.2171,  0.0794]],

         [[-0.6215, -0.6996, -0.8557,  ..., -1.3044, -1.0118, -1.2459],
          [-1.2264, -1.3044, -1.3434,  ..., -1.3434, -1.3044, -1.5776],
          [-1.6751, -1.3434, -1.3434,  ..., -1.4215, -1.4995, -1.6946],
          ...,
          [ 0.5100,  0.4710,  0.4710,  ..., -0.4655, -0.7386, -0.2509],
          [ 0.4125,  0.3735,  0.4905,  ..., -0.5435, -0.7386, -0.3874],
          [ 0.3540,  0.7637,  0.6271,  ...,  0.1589, -0.0948, -0.2313]]],


        [[[-1.0915, -0.6069, -0.5681,  ..., -0.3355, -1.4017, -1.3823],
          [-1.1109, -0.5293, -0.4712,  ..., -0.2580, -0.9558, -0.7813],
          [-1.0721, -0.4906, -0.3936,  ..., -0.0060, -0.2773, -0.3936],
          ...,
          [ 0.5756, -0.0253, -0.7232,  ..., -0.5100, -0.2773,  0.4593],
          [ 0.5368, -0.0447, -0.7426,  ..., -0.9558, -0.6457,  0.1685],
          [ 0.3624, -0.2386, -0.8589,  ..., -0.8007, -0.5487, -0.0060]],

         [[-1.0416, -0.4122, -0.3336,  ..., -0.4516, -1.7299, -1.7102],
          [-0.9826, -0.2942, -0.2156,  ..., -0.4122, -1.2382, -1.0809],
          [-0.9236, -0.2746, -0.2156,  ..., -0.1566, -0.5106, -0.6089],
          ...,
          [ 0.4924, -0.1172, -0.8449,  ..., -1.2382, -0.7269,  0.3154],
          [ 0.4531, -0.1369, -0.8842,  ..., -1.6512, -1.0416,  0.1384],
          [ 0.2564, -0.3336, -0.9826,  ..., -1.6119, -1.0416, -0.0779]],

         [[-2.1238, -2.1238, -2.2019,  ..., -2.1043, -2.2019, -2.1434],
          [-2.1629, -2.1043, -2.2019,  ..., -2.0653, -2.1434, -1.9678],
          [-2.1629, -2.0848, -2.1238,  ..., -2.0068, -1.9678, -2.1043],
          ...,
          [-1.8897, -2.0068, -2.1238,  ..., -2.1629, -2.0458, -2.0068],
          [-1.9287, -2.0068, -2.1238,  ..., -2.1629, -2.1238, -2.1238],
          [-2.0263, -2.1238, -2.1824,  ..., -2.1824, -2.1629, -2.1629]]],


        ...,


        [[[-0.6263, -0.6650, -0.6457,  ..., -0.2192, -0.2386, -0.2967],
          [-0.6069, -0.6457, -0.5875,  ..., -0.2386, -0.2192, -0.1223],
          [-0.5487, -0.5293, -0.5100,  ...,  0.0134, -0.0835, -0.0641],
          ...,
          [-0.4906, -0.1223,  0.0328,  ..., -0.7620, -0.5293, -0.2580],
          [-0.9170, -0.2773, -0.0060,  ..., -0.7038, -0.4518, -0.3936],
          [-0.3549, -0.1610, -0.4518,  ..., -0.5100, -0.5293, -0.4518]],

         [[ 1.3774,  1.3774,  1.3971,  ...,  1.7314,  1.7314,  1.6134],
          [ 1.3578,  1.3381,  1.3971,  ...,  1.6921,  1.7118,  1.8101],
          [ 1.4364,  1.4758,  1.4954,  ...,  1.9675,  1.8691,  1.8691],
          ...,
          [ 1.2988,  1.5938,  1.6921,  ...,  0.9251,  1.1414,  1.3971],
          [ 0.8661,  1.4561,  1.6528,  ...,  0.9644,  1.2201,  1.2594],
          [ 1.4364,  1.5741,  1.2004,  ...,  1.1611,  1.1414,  1.2004]],

         [[ 2.5586,  2.5391,  2.5001,  ...,  2.6562,  2.6952,  2.6172],
          [ 2.5976,  2.5976,  2.5781,  ...,  2.5781,  2.5781,  2.5586],
          [ 2.6172,  2.6367,  2.6172,  ...,  2.7342,  2.6562,  2.5976],
          ...,
          [ 2.3830,  2.5781,  2.5001,  ...,  1.9148,  2.1489,  2.3830],
          [ 1.8953,  2.3830,  2.5001,  ...,  1.9538,  2.2074,  2.2465],
          [ 2.3050,  2.4416,  2.0904,  ...,  2.1489,  2.1294,  2.1879]]],


        [[[ 2.5141,  2.5141,  2.5141,  ...,  2.5141,  2.5141,  2.5141],
          [ 2.5141,  2.4753,  2.4947,  ...,  2.3978,  2.3784,  2.4947],
          [ 2.5141,  2.4947,  2.4947,  ...,  2.3590,  2.4559,  2.5141],
          ...,
          [ 2.5141,  2.4947,  2.5141,  ...,  2.0682,  2.3009,  2.4559],
          [ 2.5141,  2.4947,  2.5141,  ...,  2.4172,  2.3590,  2.4365],
          [ 2.5141,  2.4947,  2.5141,  ...,  2.3784,  2.3590,  2.4559]],

         [[ 2.5968,  2.5968,  2.5968,  ...,  2.5771,  2.5968,  2.5968],
          [ 2.5968,  2.5575,  2.5771,  ...,  2.5575,  2.5771,  2.5968],
          [ 2.5968,  2.5771,  2.5771,  ...,  2.5771,  2.5575,  2.5968],
          ...,
          [ 2.5968,  2.5771,  2.5968,  ...,  2.0461,  2.3608,  2.5181],
          [ 2.5968,  2.5771,  2.5968,  ...,  2.5378,  2.5181,  2.5378],
          [ 2.5968,  2.5771,  2.5968,  ...,  2.5181,  2.4788,  2.5378]],

         [[ 2.7537,  2.7537,  2.7537,  ...,  2.7537,  2.7537,  2.7537],
          [ 2.7537,  2.7147,  2.7342,  ...,  2.6952,  2.6952,  2.7537],
          [ 2.7537,  2.7342,  2.7342,  ...,  2.6952,  2.7342,  2.7537],
          ...,
          [ 2.7537,  2.7342,  2.7537,  ...,  2.3440,  2.6172,  2.7537],
          [ 2.7537,  2.7342,  2.7537,  ...,  2.6562,  2.6172,  2.7147],
          [ 2.7537,  2.7342,  2.7537,  ...,  2.5781,  2.5586,  2.6757]]],


        [[[-0.2386, -0.1998, -0.1610,  ...,  1.0990,  1.0990,  1.2153],
          [-0.1998, -0.1610, -0.1416,  ...,  1.1765,  1.1378,  1.2541],
          [-0.1029, -0.0835,  0.0134,  ...,  1.2347,  1.2928,  1.3510],
          ...,
          [-0.9946, -0.8977, -0.4906,  ...,  0.6144,  0.4787,  0.3624],
          [-1.0527, -0.9752, -0.3549,  ...,  0.6919,  0.3817,  0.1879],
          [-1.1690, -1.0527, -0.8977,  ...,  0.7113,  0.3430,  0.2461]],

         [[-0.5499, -0.4122, -0.3336,  ...,  1.0038,  1.0038,  1.1218],
          [-0.4712, -0.3926, -0.3532,  ...,  1.0824,  1.0431,  1.1611],
          [-0.3336, -0.3336, -0.2549,  ...,  1.1414,  1.2004,  1.2594],
          ...,
          [-1.3956, -1.3562, -1.0219,  ...,  0.3154,  0.0991, -0.1369],
          [-1.4152, -1.3759, -0.8252,  ...,  0.2958, -0.0976, -0.3926],
          [-1.4742, -1.4152, -1.3169,  ...,  0.1581, -0.2746, -0.4122]],

         [[-0.7191, -0.5435, -0.4069,  ...,  1.1344,  1.1344,  1.2514],
          [-0.6215, -0.5240, -0.4655,  ...,  1.2124,  1.1734,  1.2905],
          [-0.4460, -0.4655, -0.4069,  ...,  1.2709,  1.3295,  1.3880],
          ...,
          [-1.5776, -1.5580, -1.2459,  ...,  0.2174, -0.0558, -0.3874],
          [-1.6166, -1.5971, -1.0703,  ...,  0.0613, -0.4069, -0.7386],
          [-1.6751, -1.6556, -1.5971,  ..., -0.2313, -0.7386, -0.8947]]]]), tensor([0, 1, 0, 1, 1, 0, 0, 0, 1, 1, 1, 0, 1, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1,
        0, 1, 1, 0, 1, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 1, 1, 0, 0,
        1, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1])]
x is [tensor([[[[-1.1109,  0.3042,  0.7307,  ...,  1.5448,  1.4479,  1.4285],
          [-1.0915,  0.1297,  0.7307,  ...,  1.5448,  1.4867,  1.4479],
          [ 0.0910,  0.5562,  0.7694,  ...,  1.5642,  1.4673,  1.4285],
          ...,
          [ 0.2654,  0.2461,  0.2461,  ...,  0.8082,  0.2848,  0.0134],
          [-0.0835,  0.1879, -0.0253,  ...,  0.4205, -0.1610,  0.1685],
          [-0.3549, -0.1223, -0.6844,  ...,  0.2267, -0.1610,  0.1491]],

         [[-1.7102, -0.6089, -0.3139,  ...,  0.4334,  0.3744,  0.3941],
          [-1.7299, -0.7662, -0.2942,  ...,  0.4334,  0.4138,  0.4138],
          [-0.7859, -0.4516, -0.2746,  ...,  0.4728,  0.3941,  0.3744],
          ...,
          [-0.5696, -0.5302, -0.6482,  ..., -0.2156, -1.0022, -1.5529],
          [-0.8646, -0.6679, -0.9826,  ..., -0.5696, -1.5922, -1.5136],
          [-1.2972, -1.0809, -1.3759,  ..., -0.9629, -1.6316, -1.4349]],

         [[-1.7141, -0.8557, -0.6606,  ...,  0.1394,  0.1198,  0.1003],
          [-1.7141, -0.9727, -0.6801,  ...,  0.1394,  0.1589,  0.1394],
          [-0.9922, -0.7581, -0.6801,  ...,  0.1784,  0.1394,  0.1198],
          ...,
          [-0.9532, -0.8752, -1.2264,  ..., -0.4850, -1.1483, -1.4995],
          [-1.3239, -1.1678, -1.5971,  ..., -0.7776, -1.5580, -1.5190],
          [-1.7922, -1.6751, -1.7727,  ..., -1.0508, -1.5776, -1.5385]]],


        [[[ 2.5141,  2.5141,  2.5141,  ...,  2.5141,  2.5141,  2.5141],
          [ 2.5141,  2.4947,  2.4947,  ...,  2.4947,  2.4947,  2.5141],
          [ 2.5141,  2.4947,  2.5141,  ...,  2.5141,  2.4947,  2.5141],
          ...,
          [ 2.5141,  2.4947,  2.5141,  ...,  2.4947,  2.4753,  2.5141],
          [ 2.5141,  2.4947,  2.5141,  ...,  2.4753,  2.4753,  2.5141],
         [... remainder of the 64-sample normalized image batch tensor elided ...]), tensor([... 64 binary labels elided ...])]
x is [tensor([... 16-sample normalized image batch elided ...]), tensor([0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1])]
nabla_unlearning_main.py:158: RuntimeWarning: invalid value encountered in scalar divide
  ic_err = counts / (np.sum(cm[class_to_forget[0]]) + np.sum(cm[class_to_forget[1]]))
  ic_err = counts / (np.sum(cm[class_to_forget[0]]) + np.sum(cm[class_to_forget[1]]))
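The RuntimeWarning above comes from dividing by a confusion-matrix row sum that is zero when the two forgotten classes receive no predictions at all. A minimal sketch of a guarded version, reusing the names from the warning's source line (`cm`, `class_to_forget`, `counts`); the rest of the script is not shown, so treat this as an assumption about the intended computation:

```python
import numpy as np

def interclass_error(cm, class_to_forget, counts):
    """Guarded version of the `ic_err` expression from the warning.

    The original line divides `counts` by the summed confusion-matrix
    rows of the two forgotten classes; that denominator is zero once
    those rows are empty, which triggers the
    'invalid value encountered in scalar divide' warning and yields NaN.
    """
    denom = np.sum(cm[class_to_forget[0]]) + np.sum(cm[class_to_forget[1]])
    if denom == 0:  # no samples of the forgotten classes were scored
        return 0.0
    return counts / denom
```

Returning 0.0 for the empty case is one choice; returning `np.nan` explicitly would also silence the warning while keeping the "undefined" semantics.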

Accuracy on test set: 2.5 , Racc: 2.6 , Uacc: 0.0
Folders created.
Checkpoint name: cifar100_resnet_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100_resnet_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [15338 29579  4126  8315   903 37224 39736 16028 12397  4436   579 32852
 37528 38428  7530 26956 27023 32189  3241 22377 32457 32450 34417 27974
  2028 13724 13451 30857  7528 18501 38266  5010  2765  4733  8988 35537
 22720 36827 15273 16037 27110 11636 15418 34399  9609 16979 33780  5725
 31276  4066 33456 28677 27027   221 20110 21811  4686 26283 35600 11872
 15123  4765 24989  4979 28846 10733 21020 16489 39138 33602 17168 13944
 17806  8249 18486  4196 22661 10689 27228 31635 28761  9804 27730  9690
  1700 22913 10051 16165  6758 31988  3811  9138 35595  7883 39157  4881
  1595   280 36256 22893 21137 24784 10832  5715 20926 33999 21846 20947
 11596 11053  6350  7108 32428 16422  5213 35542 20818 17981 21791  1813
 34398 39742  8784  5650 19565 29415 36350 23161 10394 18137  6454 29019
 28113  9492 29261 16752 35880 19934 28809  6047 11761 27537  2972 18786
 36805 29986  9779  1462 21054  1585 23654 17526 13134 33315  7279 38803
 21105 13914  2033 15860 13659 22489 14458 11165 25628 24069 12811 24688
 12700   625 35879 38480 34610 28239 24471 13046 13100 29306 30595 11697
 35844  7911 34867 36566 20941 24676 32943  3401 11185 35311 32729 26997
 27298 34216 13323 10370 19145  7412 17223   353   251 19825 30759 29729
 31082 19591 20216 17928 27252 33141 27811  7482 12094 24208 38794 28166
 21188 11946 10287 12588  8013 19241 21333 20387 16137  8855 26872 24630
 35947 39010 34150 26599 25140 17774  1073 18866 30668 21119 15839 13886
  5896 33982 33563   810 15502 17508  9307  2644 30617 13699 39935  3338
 39546  6711 19081 11622 26924 13996  5139 25054  3760 30280 39636  6155
 30403 37127 22528  5444  5987 28612 35535 24245  7002  3820 11696 13097
 27068 12492 28743  9787 36354 37138 19503 26390  7557 26786 17683 20007
  8897  1702 33681 15850 23501  4390 31614  5417 19364 25932 31229 31030
 15037 21767  6472 32967  5066 20610  4655 16161  2538 24956 36133 27596
 30603 15081  7592 17892 23084  4479 24553 16920  3973 35392 29965 10463
 36118 13356 32034  5818 28389 23575  2629 23411  2884 12223 16361 34368
 35896 21114 26212 17385 23008 11582  2853 36094 34174 35283 33580  9752
 33044  8862 10230 36194  3010 27820 33297 29436 29513  2120 22027 12754
  1112 39738  3517 37591 10548 22759 11977 36602  1999 34618 25504 29196
 18571 13224  2782 31575 16108 34337 18030  2985 31530 28037 20599 32061
 32702 15947 31109 39064  7615 28852 33504 13252  1328 33488 25706  8032
  4627 24803  3333 32556]
forget Class: [0, 1]
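`split mode: forget` together with `forget Class: [0, 1]` implies a class-based partition of the training set into forget and retain subsets. A minimal sketch of such a split; `split_forget_retain` is a hypothetical helper, as the script's actual dataset wrapper is not shown in this log:

```python
import numpy as np

def split_forget_retain(labels, classes_to_forget):
    """Partition sample indices by class label.

    Returns (forget_indices, retain_indices): indices whose label is in
    `classes_to_forget` go to the forget set, all others to retain.
    """
    labels = np.asarray(labels)
    mask = np.isin(labels, classes_to_forget)
    return np.where(mask)[0], np.where(~mask)[0]
```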
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [... same 400 indexes as listed above; duplicate printout elided ...]
Number of Classes: 100
ResNet18(
  (conv1): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
  (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (layer1): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (layer2): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (shortcut): Sequential(
        (0): Conv2d(64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (layer3): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (shortcut): Sequential(
        (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (layer4): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (shortcut): Sequential(
        (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (linear): Linear(in_features=512, out_features=100, bias=True)
)
==> unlearning ...
Computing current moments on test set
Computed moments: 10.932028585815429,8.130016621589661,-1.7629506289070531
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True]  (800-element all-True mask; full printout elided)
The MIA_loss has an accuracy of 0.912 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.550 on forgotten vs unseen images
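The two MIA scores above measure how well forgotten images can be distinguished from unseen ones using per-sample statistics such as loss or prediction entropy. A simplified, hypothetical sketch of a single-threshold, loss-based variant (the actual script may instead fit a classifier on these features):

```python
import numpy as np

def mia_accuracy(forget_losses, unseen_losses):
    """Best single-threshold accuracy at separating forgotten vs unseen
    samples by per-sample loss. Low loss is treated as evidence of
    membership ("the model still remembers this sample")."""
    scores = np.concatenate([forget_losses, unseen_losses])
    labels = np.concatenate([np.ones_like(forget_losses),
                             np.zeros_like(unseen_losses)])
    best = 0.0
    for t in np.unique(scores):
        pred = (scores <= t).astype(labels.dtype)  # below threshold => member
        best = max(best, np.mean(pred == labels))
    return best
```

An accuracy near 0.5 indicates the attacker cannot tell the two groups apart, which is the desired outcome after unlearning; the 0.912 logged for MIA_loss suggests the forgotten samples are still distinguishable at this point.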
Accuracy on test set: 1.3 , Racc: 1.2 , Uacc: 6.5
Forgetting epoch 0
Resetting retain iterator...
NaN in ft_samples_mia: False
[ True  True  ...  True  True] (all-True boolean mask, 800 entries; repeats truncated)
The MIA_loss has an accuracy of 0.981 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.465 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 0.6 , Uacc: 59.5
Forgetting epoch 1
NaN in ft_samples_mia: False
[ True  True  ...  True  True] (all-True boolean mask, 800 entries; repeats truncated)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.481 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 0.8 , Uacc: 80.2
Forgetting epoch 2
NaN in ft_samples_mia: False
[ True  True  ...  True  True] (all-True boolean mask, 800 entries; repeats truncated)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.545 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 0.7 , Uacc: 77.8
Forgetting epoch 3
NaN in ft_samples_mia: False
[ True  True  ...  True  True] (all-True boolean mask, 800 entries; repeats truncated)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.557 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 0.8 , Uacc: 84.5
Forgetting epoch 4
Computing current moments on test set
Computed moments: 28.82908302001953,81.07030419921875,-0.14071102539059485
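The three "computed moments" plausibly correspond to the mean, variance, and skewness of some per-sample statistic on the test set (the sign and magnitude pattern fits mean / variance / Fisher skewness, but this mapping is an assumption). A sketch of that computation:

```python
import numpy as np

def first_three_moments(x):
    """Mean, variance, and Fisher skewness of a 1-D sample, printed in the
    same 'Computed moments: m1,m2,m3' format as the log (assumed meaning)."""
    x = np.asarray(x, dtype=float)
    mean = x.mean()
    var = x.var()
    skew = ((x - mean) ** 3).mean() / var ** 1.5
    return mean, var, skew

# Hypothetical per-sample statistics.
x = np.array([1.0, 2.0, 3.0, 4.0, 10.0])
m1, m2, m3 = first_three_moments(x)
print(f"Computed moments: {m1},{m2},{m3}")
```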
NaN in ft_samples_mia: False
[ True  True  ...  True  True] (all-True boolean mask, 800 entries; repeats truncated)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.507 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 0.8 , Uacc: 86.8
Forgetting epoch 5
NaN in ft_samples_mia: False
[ True  True  ...  True  True] (all-True boolean mask, 800 entries; repeats truncated)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.519 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 0.9 , Uacc: 88.8
Forgetting epoch 6
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True]  (800 values, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.560 on forgotten vs unseen images
Accuracy on test set: 1.7, Racc: 0.9, Uacc: 89.5
Forgetting epoch 7
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True]  (800 values, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.579 on forgotten vs unseen images
Accuracy on test set: 1.8, Racc: 0.9, Uacc: 89.5
Forgetting epoch 8
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True]  (800 values, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.545 on forgotten vs unseen images
Accuracy on test set: 1.7, Racc: 0.9, Uacc: 89.5
Forgetting epoch 9
Computing current moments on test set
Computed moments: 24.28757048034668, 47.349576153564456, -0.5021283617129066
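The three "Computed moments" values are consistent with the mean, variance, and skewness of a score distribution on the test set. A hedged sketch of how such a line could be produced (the function name and the choice of standardized third moment for skewness are assumptions):

```python
import numpy as np

def compute_moments(scores):
    """First three moments of a 1-D score array:
    mean, variance, and skewness (standardized third central moment)."""
    scores = np.asarray(scores, dtype=np.float64)
    mean = scores.mean()
    var = scores.var()
    skew = ((scores - mean) ** 3).mean() / var ** 1.5
    return mean, var, skew

m, v, s = compute_moments([1.0, 2.0, 3.0, 4.0])
print(f"Computed moments: {m},{v},{s}")
```

A negative third moment, as in the logged `-0.50`, would indicate a left-skewed score distribution on the test set.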
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True]  (800 values, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.588 on forgotten vs unseen images
Accuracy on test set: 1.8, Racc: 0.9, Uacc: 90.8
Forgetting epoch 10
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True]  (800 values, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.586 on forgotten vs unseen images
Accuracy on test set: 1.8, Racc: 0.9, Uacc: 92.2
Forgetting epoch 11
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True]  (800 values, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.581 on forgotten vs unseen images
Accuracy on test set: 1.8, Racc: 0.9, Uacc: 92.2
Forgetting epoch 12
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True]  (800 values, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.602 on forgotten vs unseen images
Accuracy on test set: 1.9, Racc: 0.9, Uacc: 93.5
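The MIA_loss / MIA_entropy numbers above can be reproduced, under assumptions, by a single-threshold membership inference attack: take a per-sample score (cross-entropy loss for MIA_loss, prediction entropy for MIA_entropy) for the forgotten and unseen sets and sweep a decision threshold. This is a hypothetical sketch; the function name and exact attack details are illustrative, not taken from the training code.

```python
import numpy as np

def threshold_mia_accuracy(member_scores, nonmember_scores):
    """Best balanced accuracy of a single-threshold membership attack.

    Sketch (assumed setup): pass per-sample losses or entropies for the
    forgotten set as `member_scores` and for unseen images as
    `nonmember_scores`. An accuracy near 0.5 means the two sets are
    indistinguishable; the 0.986 in the log means forgotten images are
    still clearly separable from unseen ones by their loss.
    """
    scores = np.concatenate([member_scores, nonmember_scores])
    labels = np.concatenate([np.ones(len(member_scores)),
                             np.zeros(len(nonmember_scores))])
    best = 0.5
    for t in np.unique(scores):
        pred = scores <= t  # members typically have lower loss/entropy
        acc = (pred == labels).mean()
        best = max(best, acc, 1.0 - acc)  # allow either threshold direction
    return best
```

A perfectly separated pair of score distributions yields 1.0; identical distributions yield 0.5.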
Forgetting epoch 13
NaN in ft_samples_mia: False
[ True  True  ...  True]  (800 entries, all True)
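The `NaN in ft_samples_mia` line followed by the long boolean mask looks like a per-sample sanity check on the MIA feature matrix. A minimal sketch of that pattern; the function name and the exact meaning of the mask are assumptions, not taken from the training code:

```python
import numpy as np

def sanity_check_features(ft_samples_mia):
    """Report whether any NaN is present and which rows are fully finite.

    Assumed reconstruction of the log's check: `has_nan` would produce
    the "NaN in ft_samples_mia: False" line, and `row_ok` would be the
    boolean mask printed after it (all True when every feature value
    is finite).
    """
    ft = np.asarray(ft_samples_mia, dtype=np.float64)
    has_nan = bool(np.isnan(ft).any())
    row_ok = np.isfinite(ft).all(axis=-1)  # one flag per sample
    return has_nan, row_ok
```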
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.616 on forgotten vs unseen images
Accuracy on test set: 1.8, Racc: 0.9, Uacc: 93.8
Forgetting epoch 14
Computing current moments on test set
Computed moments: 25.326576794433592, 58.110169073486325, -0.10549260588295888
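The three values on the "Computed moments" line plausibly correspond to the first three moments (mean, variance, skewness) of some per-sample statistic over the test set; that interpretation is an assumption, and the training code may define them differently. A minimal sketch:

```python
import numpy as np

def first_three_moments(x):
    """Mean, variance, and skewness of a 1-D sample.

    Assumption: the log's "Computed moments: 25.33..., 58.11..., -0.11..."
    reports exactly these three statistics for some per-sample value
    (e.g. a logit or loss) on the test set.
    """
    x = np.asarray(x, dtype=np.float64)
    mean = x.mean()
    var = x.var()
    # Standardized third central moment (sample skewness).
    skew = ((x - mean) ** 3).mean() / var ** 1.5
    return mean, var, skew
```

For a symmetric sample the third value comes out zero, matching the near-zero skewness values in the log.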
NaN in ft_samples_mia: False
[ True  True  ...  True]  (800 entries, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.584 on forgotten vs unseen images
Accuracy on test set: 1.8, Racc: 0.9, Uacc: 92.5
Forgetting epoch 15
NaN in ft_samples_mia: False
[ True  True  ...  True]  (800 entries, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.640 on forgotten vs unseen images
Accuracy on test set: 1.8, Racc: 0.9, Uacc: 94.0
Forgetting epoch 16
NaN in ft_samples_mia: False
[ True  True  ...  True]  (800 entries, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.581 on forgotten vs unseen images
Accuracy on test set: 1.8, Racc: 0.9, Uacc: 95.8
Forgetting epoch 17
NaN in ft_samples_mia: False
[ True  True  ...  True]  (800 entries, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.589 on forgotten vs unseen images
Accuracy on test set: 1.8, Racc: 0.9, Uacc: 95.8
Forgetting epoch 18
NaN in ft_samples_mia: False
[ True  True  ...  True]  (800 entries, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.619 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 0.9 , Uacc: 95.5
Forgetting epoch 19
Computing current moments on test set
Computed moments: 25.098951797485352,60.56064638061523,0.11507702438510933
NaN in ft_samples_mia: False
[ True  True  True ... (800 values, all True)]
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.577 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 0.9 , Uacc: 93.8
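The per-epoch "NaN in ft_samples_mia" line followed by a boolean mask suggests a finiteness check on the MIA feature samples before the attack accuracy is computed. A minimal sketch of such a check (hypothetical helper; the actual logic in nabla_unlearning_main.py is not shown in this log):

```python
import numpy as np

def check_mia_samples(ft_samples_mia):
    """Report NaNs and a per-sample finiteness mask, mirroring the log lines."""
    arr = np.asarray(ft_samples_mia, dtype=float)
    has_nan = bool(np.isnan(arr).any())
    print(f"NaN in ft_samples_mia: {has_nan}")
    # True where every feature of the sample is finite (no NaN/Inf).
    finite_mask = np.isfinite(arr).all(axis=-1) if arr.ndim > 1 else np.isfinite(arr)
    print(finite_mask)
    return has_nan, finite_mask
```

An all-True mask, as in every epoch above, means no sample was dropped from the MIA evaluation.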
Forgetting epoch 20
NaN in ft_samples_mia: False
[ True  True  True ... (800 values, all True)]
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.623 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 0.9 , Uacc: 96.2
Forgetting epoch 21
NaN in ft_samples_mia: False
[ True  True  True ... (800 values, all True)]
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.636 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 0.9 , Uacc: 97.0
Forgetting epoch 22
NaN in ft_samples_mia: False
[ True  True  True ... (800 values, all True)]
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.641 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 0.9 , Uacc: 97.5
Forgetting epoch 23
NaN in ft_samples_mia: False
[ True  True  True ... (800 values, all True)]
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.573 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 0.9 , Uacc: 94.2
Forgetting epoch 24
Accuracy on test set: 1.8 , Racc: 0.9 , Uacc: 97.2
Checkpoint name: cifar100_resnet_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100_resnet_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [15338 29579  4126 ...  4627 24803  3333 32556] (400 indexes)
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [15338 29579  4126 ...  4627 24803  3333 32556] (same 400 indexes as above)
Number of Classes: 100
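The run forgets 400 samples drawn from classes [0, 1] ("forget Class: [0, 1]", num_400, seed 10 in the checkpoint name). A hedged reconstruction of how such a forget index set might be sampled from the training labels (hypothetical helper; the real selection code is not shown in this log):

```python
import numpy as np

def pick_forget_indexes(targets, forget_classes=(0, 1), num=400, seed=10):
    """Sample `num` training indexes whose labels fall in `forget_classes`."""
    rng = np.random.default_rng(seed)
    targets = np.asarray(targets)
    # Candidate pool: positions of all samples belonging to the forget classes.
    pool = np.flatnonzero(np.isin(targets, forget_classes))
    return rng.choice(pool, size=num, replace=False)
```

With CIFAR-100's 500 training images per class, classes 0 and 1 give a pool of 1000 candidates from which 400 are drawn without replacement.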
Traceback (most recent call last):
  File "nabla_unlearning_main.py", line 552, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/Resnet-CIFAR100_3407_800000/checkpoint_s_500000.pt'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "nabla_unlearning_main.py", line 555, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/Resnet-CIFAR100_3407_800000/checkpoint-s:500000.pt'
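Both attempts fail because neither filename variant (`checkpoint_s_500000.pt`, then `checkpoint-s:500000.pt`) exists under the resume directory. A minimal sketch of a loader that probes the candidate patterns and reports every path it tried — the function name and `loader` parameter are my own, not from this codebase; the two patterns are taken from the log:

```python
import os


def load_checkpoint(resume_dir, step,
                    patterns=("checkpoint_s_{}.pt", "checkpoint-s:{}.pt"),
                    loader=None):
    """Probe candidate checkpoint filenames and load the first that exists.

    `loader` defaults to torch.load; it is injectable so the path-probing
    logic can be exercised without a real serialized checkpoint.
    """
    if loader is None:
        import torch  # deferred so the path logic has no hard torch dependency
        loader = torch.load
    tried = []
    for pattern in patterns:
        path = os.path.join(resume_dir, pattern.format(step))
        tried.append(path)
        if os.path.isfile(path):
            return loader(path)
    # one error listing every candidate beats two nested tracebacks
    raise FileNotFoundError(f"no checkpoint for step {step}; tried: {tried}")
```

Listing all attempted paths in a single error avoids the chained double traceback seen above.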
Checkpoint name: cifar100_resnet_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100_resnet_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [15338 29579  4126 ...  4627 24803  3333 32556]  (same 400-index list as above)
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [15338 29579  4126 ...  4627 24803  3333 32556]  (same 400-index list as above)
Number of Classes: 100
ResNet18(
  (conv1): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
  (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (layer1): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (layer2): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (shortcut): Sequential(
        (0): Conv2d(64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (layer3): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (shortcut): Sequential(
        (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (layer4): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (shortcut): Sequential(
        (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (linear): Linear(in_features=512, out_features=100, bias=True)
)
Traceback (most recent call last):
  File "main_forget_sparse.py", line 556, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/Resnet-CIFAR100_3407_800000/checkpoint_s_500000.pt'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "main_forget_sparse.py", line 559, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/Resnet-CIFAR100_3407_800000/checkpoint-s:500000.pt'
Checkpoint name: cifar100_resnet_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100_resnet_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [15338 29579  4126  8315   903 37224 39736 16028 12397  4436   579 32852
 37528 38428  7530 26956 27023 32189  3241 22377 32457 32450 34417 27974
  2028 13724 13451 30857  7528 18501 38266  5010  2765  4733  8988 35537
 22720 36827 15273 16037 27110 11636 15418 34399  9609 16979 33780  5725
 31276  4066 33456 28677 27027   221 20110 21811  4686 26283 35600 11872
 15123  4765 24989  4979 28846 10733 21020 16489 39138 33602 17168 13944
 17806  8249 18486  4196 22661 10689 27228 31635 28761  9804 27730  9690
  1700 22913 10051 16165  6758 31988  3811  9138 35595  7883 39157  4881
  1595   280 36256 22893 21137 24784 10832  5715 20926 33999 21846 20947
 11596 11053  6350  7108 32428 16422  5213 35542 20818 17981 21791  1813
 34398 39742  8784  5650 19565 29415 36350 23161 10394 18137  6454 29019
 28113  9492 29261 16752 35880 19934 28809  6047 11761 27537  2972 18786
 36805 29986  9779  1462 21054  1585 23654 17526 13134 33315  7279 38803
 21105 13914  2033 15860 13659 22489 14458 11165 25628 24069 12811 24688
 12700   625 35879 38480 34610 28239 24471 13046 13100 29306 30595 11697
 35844  7911 34867 36566 20941 24676 32943  3401 11185 35311 32729 26997
 27298 34216 13323 10370 19145  7412 17223   353]
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [15338 29579  4126 ...  7412 17223   353]  (same 200-index list as above)
Number of Classes: 100
logs/cifar100_resnet_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_resume_cp1304
==> unlearning ...
Computing current moments on test set
Computed moments: 10.932028585815429,8.130016621589661,-1.7629506289070531
The MIA_loss has an accuracy of 0.927 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.585 on forgotten vs unseen images
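The log does not show how `MIA_loss` is computed; a common recipe for a loss-based membership-inference score (which would be consistent with the 0.927 above, i.e. forgotten and unseen examples are still easily separable before unlearning) is the best single-threshold accuracy over per-sample losses. A hypothetical sketch, not the repository's actual attack code:

```python
import numpy as np


def mia_threshold_accuracy(forget_losses, unseen_losses):
    """Best single-threshold accuracy at separating forgotten (label 1)
    from unseen (label 0) examples by per-sample loss."""
    losses = np.concatenate([forget_losses, unseen_losses])
    labels = np.concatenate([np.ones(len(forget_losses)),
                             np.zeros(len(unseen_losses))])
    best = 0.0
    for t in np.unique(losses):
        # examples the model trained on typically have *lower* loss,
        # so try the threshold in both directions
        pred = (losses <= t).astype(float)
        acc = max((pred == labels).mean(), (1 - pred == labels).mean())
        best = max(best, acc)
    return best
```

An accuracy near 0.5 means the attacker cannot tell forgotten from unseen examples, which is the target state after unlearning.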
Accuracy on test set: 1.3 , Racc: 1.3 , Uacc: 6.5
Forgetting epoch 0
Resetting retain iterator...
using alpha: 0.1
delta_val_loss: 6.555340766906738
delta_first_moment: 8.130016326904297
delta_second_moment: nan
The MIA_loss has an accuracy of 0.825 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.577 on forgotten vs unseen images
Accuracy on test set: 1.4 , Racc: 1.3 , Uacc: 0.0
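`delta_second_moment` prints `nan` on this and every following epoch and never recovers. The moment computation itself is not in the log, but in standardized-moment code a frequent NaN source is dividing by `var ** 1.5` when the variance is (numerically) zero or negative. A minimal guarded sketch under that assumption; the function name, `eps`, and the skewness-style normalization are mine, not taken from this codebase:

```python
import math


def central_moments(values, eps=1e-12):
    """Mean, variance, and standardized third moment with a guard against
    the NaN that unguarded division by var**1.5 can produce."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    third = sum((v - mean) ** 3 for v in values) / n
    # guard: clamp the variance away from zero before the fractional power
    skew = third / max(var, eps) ** 1.5
    assert all(math.isfinite(m) for m in (mean, var, skew))
    return mean, var, skew
```

Logging a warning when the clamp fires would make a degenerate batch visible instead of silently propagating NaNs through every epoch.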
Forgetting epoch 1
using alpha: 0.096
delta_val_loss: 1.892441749572754
delta_first_moment: 8.130016326904297
delta_second_moment: nan
The MIA_loss has an accuracy of 0.660 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.627 on forgotten vs unseen images
Accuracy on test set: 1.3 , Racc: 1.3 , Uacc: 0.0
Forgetting epoch 2
using alpha: 0.092
delta_val_loss: 3.2246971130371094
delta_first_moment: 8.130016326904297
delta_second_moment: nan
The MIA_loss has an accuracy of 0.600 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.672 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 1.2 , Uacc: 0.0
Forgetting epoch 3
using alpha: 0.088
delta_val_loss: 2.9656944274902344
delta_first_moment: 8.130016326904297
delta_second_moment: nan
The MIA_loss has an accuracy of 0.688 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.695 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 1.2 , Uacc: 0.0
Forgetting epoch 4
using alpha: 0.08399999999999999
delta_val_loss: 1.6955928802490234
delta_first_moment: 8.130016326904297
delta_second_moment: nan
Computing current moments on test set
Computed moments: 6.227260523223877,3.5385731857299803,1.585853809672241
The MIA_loss has an accuracy of 0.755 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.720 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 0.0
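The `using alpha` values decay by 0.004 per epoch but drift (`0.08399999999999999`, later `0.055999999999999966`), the signature of repeated in-place binary float subtraction. A sketch of a drift-free schedule that recomputes alpha from the epoch index — function name and rounding precision are my own choices, not from this codebase:

```python
def alpha_schedule(alpha0, decay, epoch, ndigits=6):
    """Linear alpha decay computed from the epoch index each call, with
    rounding, instead of subtracting `decay` in place every epoch (which
    accumulates binary floating-point error)."""
    return round(alpha0 - decay * epoch, ndigits)
```

With `alpha0=0.1` and `decay=0.004` this yields clean values such as 0.084 at epoch 4 rather than the drifting ones in the log.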
Forgetting epoch 5
using alpha: 0.07999999999999999
delta_val_loss: -3.227161407470703
delta_first_moment: 3.5385732650756836
delta_second_moment: nan
The MIA_loss has an accuracy of 0.690 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.663 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.3 , Uacc: 0.0
Forgetting epoch 6
using alpha: 0.07599999999999998
delta_val_loss: -0.647099494934082
delta_first_moment: 3.5385732650756836
delta_second_moment: nan
The MIA_loss has an accuracy of 0.532 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.595 on forgotten vs unseen images
Accuracy on test set: 1.3 , Racc: 1.2 , Uacc: 0.0
Forgetting epoch 7
using alpha: 0.07199999999999998
delta_val_loss: 0.8069796562194824
delta_first_moment: 3.5385732650756836
delta_second_moment: nan
The MIA_loss has an accuracy of 0.557 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.610 on forgotten vs unseen images
Accuracy on test set: 1.3 , Racc: 1.4 , Uacc: 0.0
Forgetting epoch 8
using alpha: 0.06799999999999998
delta_val_loss: 1.0536518096923828
delta_first_moment: 3.5385732650756836
delta_second_moment: nan
The MIA_loss has an accuracy of 0.790 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.620 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.2 , Uacc: 0.0
Forgetting epoch 9
using alpha: 0.06399999999999997
delta_val_loss: 0.8781137466430664
delta_first_moment: 3.5385732650756836
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.753047283935547,0.24847532873153685,-1.4328888448099557
The MIA_loss has an accuracy of 0.910 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.642 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 10
using alpha: 0.05999999999999997
delta_val_loss: -1.0301661491394043
delta_first_moment: 0.2484753280878067
delta_second_moment: nan
The MIA_loss has an accuracy of 0.960 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.668 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 11
using alpha: 0.055999999999999966
delta_val_loss: -1.145444393157959
delta_first_moment: 0.2484753280878067
delta_second_moment: nan
The MIA_loss has an accuracy of 0.967 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.643 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 12
using alpha: 0.05199999999999996
delta_val_loss: -1.1117830276489258
delta_first_moment: 0.2484753280878067
delta_second_moment: nan
The MIA_loss has an accuracy of 0.963 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.670 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.2 , Uacc: 0.0
Forgetting epoch 13
using alpha: 0.04799999999999996
delta_val_loss: -0.9423351287841797
delta_first_moment: 0.2484753280878067
delta_second_moment: nan
The MIA_loss has an accuracy of 0.932 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.673 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 1.3 , Uacc: 0.0
Forgetting epoch 14
using alpha: 0.043999999999999956
delta_val_loss: -0.7858672142028809
delta_first_moment: 0.2484753280878067
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.637051392364502,0.07660467001199722,-0.7136149051667959
The MIA_loss has an accuracy of 0.915 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.662 on forgotten vs unseen images
Accuracy on test set: 1.4 , Racc: 1.4 , Uacc: 0.0
Forgetting epoch 15
using alpha: 0.03999999999999995
delta_val_loss: -0.6336507797241211
delta_first_moment: 0.07660467177629471
delta_second_moment: nan
The MIA_loss has an accuracy of 0.907 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.660 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.5 , Uacc: 0.0
Forgetting epoch 16
using alpha: 0.03599999999999995
delta_val_loss: -0.5663943290710449
delta_first_moment: 0.07660467177629471
delta_second_moment: nan
The MIA_loss has an accuracy of 0.890 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.640 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.5 , Uacc: 0.0
Forgetting epoch 17
using alpha: 0.031999999999999945
delta_val_loss: -0.4997081756591797
delta_first_moment: 0.07660467177629471
delta_second_moment: nan
The MIA_loss has an accuracy of 0.910 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.672 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.5 , Uacc: 0.0
Forgetting epoch 18
using alpha: 0.027999999999999945
delta_val_loss: -0.4445919990539551
delta_first_moment: 0.07660467177629471
delta_second_moment: nan
The MIA_loss has an accuracy of 0.915 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.698 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.4 , Uacc: 0.0
Forgetting epoch 19
using alpha: 0.023999999999999945
delta_val_loss: -0.4742603302001953
delta_first_moment: 0.07660467177629471
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.6159721343994145,0.04382634440660477,-0.8251164217358306
The MIA_loss has an accuracy of 0.890 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.660 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.4 , Uacc: 0.0
Forgetting epoch 20
using alpha: 0.019999999999999945
delta_val_loss: -0.3998103141784668
delta_first_moment: 0.04382634535431862
delta_second_moment: nan
The MIA_loss has an accuracy of 0.867 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.603 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.5 , Uacc: 0.0
Forgetting epoch 21
using alpha: 0.015999999999999945
delta_val_loss: -0.32797670364379883
delta_first_moment: 0.04382634535431862
delta_second_moment: nan
The MIA_loss has an accuracy of 0.843 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.610 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.5 , Uacc: 0.0
Forgetting epoch 22
using alpha: 0.011999999999999945
delta_val_loss: -0.32398271560668945
delta_first_moment: 0.04382634535431862
delta_second_moment: nan
The MIA_loss has an accuracy of 0.795 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.540 on forgotten vs unseen images
Accuracy on test set: 1.4 , Racc: 1.5 , Uacc: 0.0
Forgetting epoch 23
using alpha: 0.007999999999999945
delta_val_loss: -0.19657564163208008
delta_first_moment: 0.04382634535431862
delta_second_moment: nan
The MIA_loss has an accuracy of 0.700 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.495 on forgotten vs unseen images
Accuracy on test set: 1.4 , Racc: 1.4 , Uacc: 0.0
Forgetting epoch 24
using alpha: 0.003999999999999945
delta_val_loss: -0.1443943977355957
delta_first_moment: 0.04382634535431862
delta_second_moment: nan
The MIA_loss has an accuracy of 0.667 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.485 on forgotten vs unseen images
x is [tensor(...), tensor(...)]  (debug printout of a batch: a normalized image tensor and 64 labels in {0, 1}; full dump elided)
x is [tensor([[[[-4.1303e-01, -5.9512e-03,  7.1129e-01,  ...,  1.9325e+00,
            1.8550e+00,  1.3898e+00],
          [-2.9672e-01,  1.4913e-01,  9.4391e-01,  ...,  1.8356e+00,
            1.6805e+00,  1.3898e+00],
          [-2.1919e-01,  2.2667e-01,  1.0408e+00,  ...,  1.6999e+00,
            1.3122e+00,  1.0021e+00],
          ...,
          [ 9.6329e-01,  1.3122e+00,  2.3590e+00,  ...,  5.9498e-01,
            9.0974e-02,  1.3434e-02],
          [ 1.1571e+00,  1.5061e+00,  2.3009e+00,  ...,  8.2760e-01,
            4.0113e-01,  3.4298e-01],
          [ 1.3704e+00,  1.4479e+00,  2.0295e+00,  ...,  7.5006e-01,
            5.3683e-01,  8.4699e-01]],

         [[-5.8924e-01, -2.7457e-01,  3.9410e-01,  ...,  1.4561e+00,
            1.4954e+00,  1.2201e+00],
          [-5.1057e-01, -1.3690e-01,  5.9077e-01,  ...,  1.3578e+00,
            1.3578e+00,  1.2791e+00],
          [-4.7124e-01, -7.7900e-02,  6.4977e-01,  ...,  1.2791e+00,
            1.0431e+00,  9.2511e-01],
          ...,
          [ 7.0877e-01,  8.8578e-01,  2.0265e+00,  ...,  2.1710e-01,
           -2.3524e-01, -2.1557e-01],
          [ 9.2511e-01,  1.0824e+00,  1.9675e+00,  ...,  4.5310e-01,
            2.0434e-02,  4.0101e-02],
          [ 1.0628e+00,  9.4478e-01,  1.6134e+00,  ...,  4.1377e-01,
            4.0101e-02,  3.3510e-01]],

         [[-5.5751e-02,  2.2291e-02,  4.7103e-01,  ...,  1.4661e+00,
            1.3880e+00,  1.1929e+00],
          [ 2.7802e-03,  1.3935e-01,  6.4662e-01,  ...,  1.3295e+00,
            1.2319e+00,  1.1929e+00],
          [ 2.2291e-02,  1.5886e-01,  6.8564e-01,  ...,  1.1734e+00,
            8.6123e-01,  8.0270e-01],
          ...,
          [ 7.4417e-01,  9.3927e-01,  1.9928e+00,  ...,  5.1005e-01,
            3.3446e-01,  1.0033e-01],
          [ 9.5878e-01,  1.1344e+00,  1.9538e+00,  ...,  7.2466e-01,
            5.2956e-01,  3.7348e-01],
          [ 1.1149e+00,  1.0368e+00,  1.6221e+00,  ...,  6.8564e-01,
            4.9054e-01,  5.8809e-01]]],


        [[[ 1.4285e+00,  1.3898e+00,  1.4091e+00,  ...,  1.4867e+00,
            1.4673e+00,  1.5061e+00],
          [ 1.3704e+00,  1.3704e+00,  1.3704e+00,  ...,  1.4673e+00,
            1.4479e+00,  1.5061e+00],
          [ 1.3316e+00,  1.3704e+00,  1.3898e+00,  ...,  1.4673e+00,
            1.4867e+00,  1.5255e+00],
          ...,
          [ 8.4699e-01,  8.2760e-01,  8.2760e-01,  ...,  1.0990e+00,
            1.1571e+00,  1.2153e+00],
          [ 8.0822e-01,  8.0822e-01,  8.2760e-01,  ...,  1.0796e+00,
            1.1184e+00,  1.1765e+00],
          [ 8.0822e-01,  8.2760e-01,  8.4699e-01,  ...,  1.0990e+00,
            1.1184e+00,  1.1765e+00]],

         [[ 1.8101e+00,  1.7708e+00,  1.7904e+00,  ...,  1.8691e+00,
            1.8495e+00,  1.8888e+00],
          [ 1.7314e+00,  1.7511e+00,  1.7511e+00,  ...,  1.8495e+00,
            1.8298e+00,  1.8691e+00],
          [ 1.7118e+00,  1.7511e+00,  1.7708e+00,  ...,  1.8691e+00,
            1.8691e+00,  1.9085e+00],
          ...,
          [ 1.1218e+00,  1.0824e+00,  1.0431e+00,  ...,  1.4954e+00,
            1.5544e+00,  1.5938e+00],
          [ 1.1218e+00,  1.0824e+00,  1.0628e+00,  ...,  1.4561e+00,
            1.4954e+00,  1.5544e+00],
          [ 1.1414e+00,  1.1218e+00,  1.1218e+00,  ...,  1.4758e+00,
            1.4954e+00,  1.5544e+00]],

         [[ 2.2855e+00,  2.2465e+00,  2.2660e+00,  ...,  2.5196e+00,
            2.4806e+00,  2.5196e+00],
          [ 2.2074e+00,  2.2270e+00,  2.2270e+00,  ...,  2.4806e+00,
            2.4806e+00,  2.5391e+00],
          [ 2.1879e+00,  2.2270e+00,  2.2465e+00,  ...,  2.3830e+00,
            2.4416e+00,  2.5196e+00],
          ...,
          [ 1.7197e+00,  1.6416e+00,  1.6026e+00,  ...,  2.0123e+00,
            2.0904e+00,  2.2074e+00],
          [ 1.7392e+00,  1.6807e+00,  1.6416e+00,  ...,  2.0123e+00,
            2.0904e+00,  2.1879e+00],
          [ 1.7977e+00,  1.7587e+00,  1.7002e+00,  ...,  2.0709e+00,
            2.1294e+00,  2.2074e+00]]],


        [[[ 1.8550e+00,  1.7968e+00,  1.7968e+00,  ...,  1.7968e+00,
            1.7968e+00,  1.8550e+00],
          [ 1.8162e+00,  1.7581e+00,  1.7775e+00,  ...,  1.7775e+00,
            1.7581e+00,  1.8162e+00],
          [ 1.8162e+00,  1.7775e+00,  1.7775e+00,  ...,  1.7775e+00,
            1.7775e+00,  1.8162e+00],
          ...,
          [ 1.7968e+00,  1.6612e+00,  1.4479e+00,  ...,  1.7387e+00,
            1.7775e+00,  1.8162e+00],
          [ 1.6999e+00,  1.7193e+00,  1.8744e+00,  ...,  1.6612e+00,
            1.7387e+00,  1.8356e+00],
          [ 1.7775e+00,  1.7581e+00,  1.8550e+00,  ...,  1.7387e+00,
            1.7581e+00,  1.8162e+00]],

         [[-1.4546e+00, -1.4742e+00, -1.4742e+00,  ..., -1.4742e+00,
           -1.4742e+00, -1.4546e+00],
          [-1.4742e+00, -1.4742e+00, -1.4742e+00,  ..., -1.4742e+00,
           -1.4742e+00, -1.4742e+00],
          [-1.4546e+00, -1.4742e+00, -1.4742e+00,  ..., -1.4742e+00,
           -1.4742e+00, -1.4546e+00],
          ...,
          [-1.4349e+00, -1.4742e+00, -1.2579e+00,  ..., -1.4546e+00,
           -1.4742e+00, -1.4546e+00],
          [-1.3759e+00, -1.4349e+00, -1.5726e+00,  ..., -1.4349e+00,
           -1.4546e+00, -1.4742e+00],
          [-1.4349e+00, -1.4546e+00, -1.5136e+00,  ..., -1.4546e+00,
           -1.4742e+00, -1.4742e+00]],

         [[-1.7336e+00, -1.7336e+00, -1.7336e+00,  ..., -1.7336e+00,
           -1.7336e+00, -1.7336e+00],
          [-1.7336e+00, -1.7336e+00, -1.7336e+00,  ..., -1.7336e+00,
           -1.7336e+00, -1.7336e+00],
          [-1.7336e+00, -1.7336e+00, -1.7336e+00,  ..., -1.7336e+00,
           -1.7336e+00, -1.7336e+00],
          ...,
          [-1.7531e+00, -1.8507e+00, -1.6946e+00,  ..., -1.6556e+00,
           -1.7141e+00, -1.7336e+00],
          [-1.7727e+00, -1.8702e+00, -1.8897e+00,  ..., -1.6361e+00,
           -1.7141e+00, -1.7336e+00],
          [-1.7531e+00, -1.7727e+00, -1.7922e+00,  ..., -1.6946e+00,
           -1.7336e+00, -1.7336e+00]]],


        ...,


        [[[-2.1771e+00, -2.1964e+00, -2.1964e+00,  ..., -2.0608e+00,
           -2.0608e+00, -2.0801e+00],
          [-2.1771e+00, -2.1771e+00, -2.1577e+00,  ..., -2.0608e+00,
           -2.0608e+00, -2.0801e+00],
          [-2.1771e+00, -2.1771e+00, -2.1771e+00,  ..., -2.0608e+00,
           -2.0608e+00, -2.0801e+00],
          ...,
          [-2.1383e+00, -2.1383e+00, -2.1383e+00,  ..., -2.0995e+00,
           -2.1383e+00, -2.1189e+00],
          [-2.1189e+00, -2.1383e+00, -2.1383e+00,  ..., -1.4404e+00,
           -1.4986e+00, -1.6537e+00],
          [-2.1189e+00, -2.1189e+00, -2.1189e+00,  ..., -1.0140e+00,
           -8.5889e-01, -9.5581e-01]],

         [[-2.1429e+00, -2.1626e+00, -2.1626e+00,  ..., -2.0249e+00,
           -2.0249e+00, -2.0446e+00],
          [-2.1429e+00, -2.1429e+00, -2.1233e+00,  ..., -2.0249e+00,
           -2.0249e+00, -2.0446e+00],
          [-2.1429e+00, -2.1429e+00, -2.1429e+00,  ..., -2.0249e+00,
           -2.0249e+00, -2.0446e+00],
          ...,
          [-2.0643e+00, -2.0643e+00, -2.0643e+00,  ..., -2.0643e+00,
           -2.1036e+00, -2.0839e+00],
          [-2.0446e+00, -2.0643e+00, -2.0643e+00,  ..., -1.3956e+00,
           -1.4546e+00, -1.6119e+00],
          [-2.0446e+00, -2.0446e+00, -2.0446e+00,  ..., -9.6291e-01,
           -8.0557e-01, -9.0391e-01]],

         [[-2.0458e+00, -2.0653e+00, -2.0653e+00,  ..., -1.9482e+00,
           -1.9482e+00, -1.9678e+00],
          [-2.0458e+00, -2.0458e+00, -2.0263e+00,  ..., -1.9482e+00,
           -1.9482e+00, -1.9678e+00],
          [-2.0458e+00, -2.0458e+00, -2.0458e+00,  ..., -1.9482e+00,
           -1.9482e+00, -1.9678e+00],
          ...,
          [-2.0068e+00, -2.0068e+00, -2.0068e+00,  ..., -1.9873e+00,
           -2.0263e+00, -2.0068e+00],
          [-1.9873e+00, -2.0068e+00, -2.0068e+00,  ..., -1.3239e+00,
           -1.3825e+00, -1.5580e+00],
          [-1.9873e+00, -1.9873e+00, -1.9873e+00,  ..., -8.9469e-01,
           -7.3861e-01, -8.3616e-01]]],


        [[[ 2.5141e+00,  2.4753e+00,  2.4753e+00,  ...,  2.4753e+00,
            2.4753e+00,  2.4753e+00],
          [ 2.5141e+00,  2.4947e+00,  2.5141e+00,  ...,  2.5141e+00,
            2.5141e+00,  2.5141e+00],
          [ 2.5141e+00,  2.5141e+00,  2.5141e+00,  ...,  2.5141e+00,
            2.5141e+00,  2.5141e+00],
          ...,
          [ 2.5141e+00,  2.4947e+00,  2.4947e+00,  ...,  2.5141e+00,
            2.5141e+00,  2.5141e+00],
          [ 2.5141e+00,  2.4947e+00,  2.5141e+00,  ...,  2.5141e+00,
            2.5141e+00,  2.5141e+00],
          [ 2.5141e+00,  2.4753e+00,  2.4753e+00,  ...,  2.4753e+00,
            2.4753e+00,  2.4753e+00]],

         [[ 2.5968e+00,  2.5575e+00,  2.5575e+00,  ...,  2.5575e+00,
            2.5575e+00,  2.5575e+00],
          [ 2.5968e+00,  2.5771e+00,  2.5968e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5968e+00],
          [ 2.5968e+00,  2.5968e+00,  2.5968e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5968e+00],
          ...,
          [ 2.5968e+00,  2.5771e+00,  2.5771e+00,  ...,  2.5771e+00,
            2.5968e+00,  2.5968e+00],
          [ 2.5968e+00,  2.5771e+00,  2.5968e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5968e+00],
          [ 2.5968e+00,  2.5575e+00,  2.5575e+00,  ...,  2.5575e+00,
            2.5575e+00,  2.5575e+00]],

         [[ 2.7537e+00,  2.7147e+00,  2.7147e+00,  ...,  2.7147e+00,
            2.7147e+00,  2.7147e+00],
          [ 2.7537e+00,  2.7342e+00,  2.7537e+00,  ...,  2.7537e+00,
            2.7537e+00,  2.7537e+00],
          [ 2.7537e+00,  2.7537e+00,  2.7537e+00,  ...,  2.7537e+00,
            2.7537e+00,  2.7537e+00],
          ...,
          [ 2.7537e+00,  2.7342e+00,  2.7342e+00,  ...,  2.7537e+00,
            2.7537e+00,  2.7537e+00],
          [ 2.7537e+00,  2.7342e+00,  2.7537e+00,  ...,  2.7537e+00,
            2.7537e+00,  2.7537e+00],
          [ 2.7537e+00,  2.7147e+00,  2.7147e+00,  ...,  2.7147e+00,
            2.7147e+00,  2.7147e+00]]],


        [[[-1.5374e+00, -6.0688e-01, -1.5374e+00,  ..., -1.8087e+00,
           -2.1383e+00, -2.2158e+00],
          [-1.5955e+00, -6.2627e-01, -1.5374e+00,  ..., -1.8087e+00,
           -2.1383e+00, -2.1964e+00],
          [-1.6537e+00, -6.4565e-01, -1.5567e+00,  ..., -1.8087e+00,
           -2.0995e+00, -2.1771e+00],
          ...,
          [-1.9832e+00, -1.5180e+00, -2.0026e+00,  ..., -2.0608e+00,
           -2.2158e+00, -2.2740e+00],
          [-1.9638e+00, -1.3047e+00, -1.8087e+00,  ..., -2.0608e+00,
           -2.1383e+00, -2.3128e+00],
          [-1.9638e+00, -1.2854e+00, -1.7118e+00,  ..., -2.0995e+00,
           -2.1189e+00, -2.3515e+00]],

         [[-4.3190e-01,  4.7277e-01, -7.6624e-01,  ..., -8.8424e-01,
           -1.6709e+00, -1.7496e+00],
          [-4.7124e-01,  4.7277e-01, -7.4657e-01,  ..., -8.4491e-01,
           -1.6316e+00, -1.7496e+00],
          [-5.3024e-01,  4.3344e-01, -7.8591e-01,  ..., -8.4491e-01,
           -1.5922e+00, -1.7496e+00],
          ...,
          [-1.0022e+00, -4.3190e-01, -1.3169e+00,  ..., -1.5529e+00,
           -1.7889e+00, -1.4546e+00],
          [-8.6457e-01, -9.7567e-02, -1.0219e+00,  ..., -1.5529e+00,
           -1.7299e+00, -1.4939e+00],
          [-8.0557e-01,  7.6703e-04, -8.2524e-01,  ..., -1.5922e+00,
           -1.7102e+00, -1.5332e+00]],

         [[-9.5322e-01, -3.2889e-01, -1.4410e+00,  ..., -1.6361e+00,
           -2.0458e+00, -1.9092e+00],
          [-1.0313e+00, -3.4841e-01, -1.4410e+00,  ..., -1.6166e+00,
           -2.0263e+00, -1.8897e+00],
          [-1.1093e+00, -3.8743e-01, -1.4995e+00,  ..., -1.5971e+00,
           -1.9873e+00, -1.8897e+00],
          ...,
          [-1.5580e+00, -1.2069e+00, -1.9092e+00,  ..., -1.7336e+00,
           -1.9287e+00, -1.9873e+00],
          [-1.5385e+00, -1.0118e+00, -1.7531e+00,  ..., -1.7336e+00,
           -1.8507e+00, -2.0068e+00],
          [-1.5776e+00, -9.9224e-01, -1.6556e+00,  ..., -1.7727e+00,
           -1.8312e+00, -2.0458e+00]]]]), tensor([1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1,
        1, 0, 1, 0, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 0, 0, 0, 1,
        0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 1])]
x is [tensor([[[[ 1.7968e+00,  2.2039e+00,  2.5141e+00,  ...,  1.0796e+00,
            1.0796e+00,  1.5255e+00],
          [ 5.7560e-01,  1.8356e+00,  2.4947e+00,  ...,  1.5255e+00,
            1.6805e+00,  2.0489e+00],
          [ 2.6544e-01,  2.0682e+00,  2.5141e+00,  ...,  1.8938e+00,
            2.2233e+00,  2.4753e+00],
          ...,
          [-1.4404e+00, -2.1189e+00, -2.0220e+00,  ...,  4.7867e-01,
            3.8175e-01,  2.6544e-01],
          [-1.3047e+00, -1.8669e+00, -1.8087e+00,  ...,  2.6544e-01,
            2.2667e-01,  2.0728e-01],
          [-1.5567e+00, -1.8669e+00, -1.5955e+00,  ...,  1.2974e-01,
            1.2974e-01,  1.2974e-01]],

         [[ 1.2398e+00,  1.8495e+00,  2.5575e+00,  ...,  7.6703e-04,
           -5.8233e-02,  2.5644e-01],
          [ 1.7777e-01,  1.5151e+00,  2.4788e+00,  ...,  3.7444e-01,
            4.1377e-01,  7.6777e-01],
          [-3.8567e-02,  1.8691e+00,  2.5771e+00,  ...,  6.3011e-01,
            8.6611e-01,  1.2004e+00],
          ...,
          [-1.6119e+00, -2.1823e+00, -2.0249e+00,  ...,  3.3510e-01,
            2.3677e-01,  1.1877e-01],
          [-1.4742e+00, -1.8676e+00, -1.8282e+00,  ...,  1.3844e-01,
            1.1877e-01,  9.9101e-02],
          [-1.6512e+00, -1.8873e+00, -1.6906e+00,  ...,  7.6703e-04,
            2.0434e-02,  2.0434e-02]],

         [[ 5.1005e-01,  9.7829e-01,  2.0123e+00,  ..., -1.0898e+00,
           -1.1483e+00, -1.0898e+00],
          [-5.2400e-01,  5.8809e-01,  1.8563e+00,  ..., -7.7763e-01,
           -6.9959e-01, -6.2155e-01],
          [-6.6057e-01,  1.0173e+00,  2.0904e+00,  ..., -6.6057e-01,
           -3.6792e-01, -2.5085e-01],
          ...,
          [-1.5971e+00, -1.6946e+00, -1.7727e+00,  ...,  1.7837e-01,
            6.1311e-02, -7.5261e-02],
          [-1.6556e+00, -1.6751e+00, -1.6166e+00,  ..., -1.6730e-02,
           -7.5261e-02, -9.4771e-02],
          [-1.6361e+00, -1.7141e+00, -1.5971e+00,  ..., -1.5330e-01,
           -1.7281e-01, -1.7281e-01]]],


        [[[-2.1383e+00, -2.0801e+00, -2.0220e+00,  ..., -2.2934e+00,
           -2.2352e+00, -2.1771e+00],
          [-2.1577e+00, -2.0995e+00, -2.0608e+00,  ..., -2.2158e+00,
           -2.1964e+00, -2.1964e+00],
          [-2.1383e+00, -2.1189e+00, -2.0801e+00,  ..., -2.1771e+00,
           -2.1771e+00, -2.1964e+00],
          ...,
          [-2.0995e+00, -2.1383e+00, -2.1189e+00,  ..., -2.1771e+00,
           -2.1964e+00, -2.1771e+00],
          [-2.1383e+00, -2.1383e+00, -2.1189e+00,  ..., -2.1964e+00,
           -2.2158e+00, -2.1964e+00],
          [-2.3515e+00, -2.1771e+00, -2.1189e+00,  ..., -2.1964e+00,
           -2.2158e+00, -2.1964e+00]],

         [[-1.8676e+00, -1.8086e+00, -1.7496e+00,  ..., -2.1233e+00,
           -2.0643e+00, -1.9659e+00],
          [-1.9266e+00, -1.8676e+00, -1.8282e+00,  ..., -2.0643e+00,
           -2.0446e+00, -2.0249e+00],
          [-1.9463e+00, -1.9266e+00, -1.8873e+00,  ..., -2.0249e+00,
           -2.0446e+00, -2.0643e+00],
          ...,
          [-2.0643e+00, -2.1429e+00, -2.1823e+00,  ..., -2.0249e+00,
           -2.0446e+00, -2.0249e+00],
          [-2.1036e+00, -2.1429e+00, -2.1823e+00,  ..., -2.0446e+00,
           -2.0643e+00, -2.0446e+00],
          [-2.3396e+00, -2.1823e+00, -2.1823e+00,  ..., -2.0446e+00,
           -2.0643e+00, -2.0446e+00]],

         [[-1.4020e+00, -1.3434e+00, -1.2849e+00,  ..., -1.7727e+00,
           -1.6946e+00, -1.4995e+00],
          [-1.5190e+00, -1.4605e+00, -1.4215e+00,  ..., -1.7141e+00,
           -1.6751e+00, -1.5971e+00],
          [-1.5776e+00, -1.5580e+00, -1.5190e+00,  ..., -1.6751e+00,
           -1.6751e+00, -1.6556e+00],
          ...,
          [-1.7531e+00, -1.8312e+00, -1.8507e+00,  ..., -1.6361e+00,
           -1.6556e+00, -1.6361e+00],
          [-1.7922e+00, -1.8312e+00, -1.8507e+00,  ..., -1.6556e+00,
           -1.6751e+00, -1.6556e+00],
          [-2.0263e+00, -1.8702e+00, -1.8507e+00,  ..., -1.6361e+00,
           -1.6556e+00, -1.6556e+00]]],


        [[[ 2.4365e+00,  2.4172e+00,  2.4172e+00,  ...,  2.4559e+00,
            2.4753e+00,  2.4753e+00],
          [ 2.5141e+00,  2.5141e+00,  2.4947e+00,  ...,  2.4172e+00,
            2.4365e+00,  2.4365e+00],
          [ 2.5141e+00,  2.5141e+00,  2.4947e+00,  ...,  2.4172e+00,
            2.4172e+00,  2.4172e+00],
          ...,
          [ 2.4365e+00,  2.4753e+00,  2.4947e+00,  ...,  2.4559e+00,
            2.4559e+00,  2.4559e+00],
          [ 2.4947e+00,  2.4172e+00,  2.4365e+00,  ...,  2.4172e+00,
            2.4753e+00,  2.4559e+00],
          [ 2.4947e+00,  2.4365e+00,  2.4753e+00,  ...,  2.4753e+00,
            2.4559e+00,  2.4559e+00]],

         [[ 2.5771e+00,  2.5771e+00,  2.5771e+00,  ...,  2.5771e+00,
            2.5575e+00,  2.5575e+00],
          [ 2.4985e+00,  2.5378e+00,  2.5575e+00,  ...,  2.5771e+00,
            2.5968e+00,  2.5771e+00],
          [ 2.5181e+00,  2.5378e+00,  2.5771e+00,  ...,  2.5771e+00,
            2.5968e+00,  2.5968e+00],
          ...,
          [ 2.5181e+00,  2.5378e+00,  2.5378e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5771e+00],
          [ 2.5575e+00,  2.4788e+00,  2.5181e+00,  ...,  2.5575e+00,
            2.5968e+00,  2.5771e+00],
          [ 2.5575e+00,  2.5181e+00,  2.5575e+00,  ...,  2.5968e+00,
            2.5771e+00,  2.5771e+00]],

         [[ 2.6757e+00,  2.6952e+00,  2.7147e+00,  ...,  2.6757e+00,
            2.6757e+00,  2.6757e+00],
          [ 2.6562e+00,  2.6562e+00,  2.6562e+00,  ...,  2.6367e+00,
            2.6562e+00,  2.6562e+00],
          [ 2.6172e+00,  2.6172e+00,  2.6367e+00,  ...,  2.6172e+00,
            2.6367e+00,  2.6172e+00],
          ...,
          [ 2.5781e+00,  2.6952e+00,  2.7537e+00,  ...,  2.5976e+00,
            2.5781e+00,  2.6367e+00],
          [ 2.6562e+00,  2.6367e+00,  2.7147e+00,  ...,  2.5391e+00,
            2.6172e+00,  2.6367e+00],
          [ 2.6757e+00,  2.6367e+00,  2.7147e+00,  ...,  2.6172e+00,
            2.6172e+00,  2.6367e+00]]],


        ...,


        [[[ 2.5141e+00,  2.5141e+00,  2.5141e+00,  ...,  2.5141e+00,
            2.5141e+00,  2.5141e+00],
          [ 2.5141e+00,  2.4753e+00,  2.4947e+00,  ...,  2.4947e+00,
            2.4947e+00,  2.4947e+00],
          [ 2.5141e+00,  2.4947e+00,  2.4947e+00,  ...,  2.4947e+00,
            2.4947e+00,  2.4947e+00],
          ...,
          [ 2.5141e+00,  2.4947e+00,  2.4947e+00,  ...,  2.4753e+00,
            2.4753e+00,  2.4947e+00],
          [ 2.5141e+00,  2.4947e+00,  2.4947e+00,  ...,  2.4753e+00,
            2.4753e+00,  2.4947e+00],
          [ 2.5141e+00,  2.4947e+00,  2.4947e+00,  ...,  2.4947e+00,
            2.4947e+00,  2.4947e+00]],

         [[ 2.5968e+00,  2.5968e+00,  2.5968e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5968e+00],
          [ 2.5968e+00,  2.5575e+00,  2.5771e+00,  ...,  2.5771e+00,
            2.5771e+00,  2.5771e+00],
          [ 2.5968e+00,  2.5771e+00,  2.5771e+00,  ...,  2.5771e+00,
            2.5771e+00,  2.5771e+00],
          ...,
          [ 2.5968e+00,  2.5771e+00,  2.5771e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5771e+00],
          [ 2.5968e+00,  2.5771e+00,  2.5771e+00,  ...,  2.5771e+00,
            2.5968e+00,  2.5771e+00],
          [ 2.5968e+00,  2.5771e+00,  2.5771e+00,  ...,  2.5771e+00,
            2.5771e+00,  2.5771e+00]],

         [[ 2.7537e+00,  2.7537e+00,  2.7537e+00,  ...,  2.7537e+00,
            2.7537e+00,  2.7537e+00],
          [ 2.7537e+00,  2.7147e+00,  2.7342e+00,  ...,  2.7342e+00,
            2.7342e+00,  2.7342e+00],
          [ 2.7537e+00,  2.7342e+00,  2.7342e+00,  ...,  2.7342e+00,
            2.7342e+00,  2.7342e+00],
          ...,
          [ 2.7537e+00,  2.7342e+00,  2.7342e+00,  ...,  2.7147e+00,
            2.7147e+00,  2.7342e+00],
          [ 2.7537e+00,  2.7342e+00,  2.7342e+00,  ...,  2.6952e+00,
            2.6952e+00,  2.7342e+00],
          [ 2.7537e+00,  2.7342e+00,  2.7342e+00,  ...,  2.7342e+00,
            2.7342e+00,  2.7342e+00]]],


        [[[ 2.3784e+00,  2.4365e+00,  2.5141e+00,  ...,  2.5141e+00,
            2.3978e+00,  2.5141e+00],
          [ 2.3978e+00,  2.4172e+00,  2.4753e+00,  ...,  2.1845e+00,
            2.4365e+00,  2.4947e+00],
          [ 2.4559e+00,  2.4365e+00,  2.4753e+00,  ...,  8.8576e-01,
            1.8744e+00,  2.5141e+00],
          ...,
          [ 2.5141e+00,  2.4753e+00,  2.4947e+00,  ...,  2.4947e+00,
            2.4947e+00,  2.5141e+00],
          [ 2.4559e+00,  2.4753e+00,  2.4947e+00,  ...,  2.4559e+00,
            2.4559e+00,  2.5141e+00],
          [ 2.4365e+00,  2.4559e+00,  2.5141e+00,  ...,  2.4947e+00,
            2.4947e+00,  2.5141e+00]],

         [[ 2.1445e+00,  2.1445e+00,  2.1838e+00,  ...,  2.4788e+00,
            2.3608e+00,  2.4198e+00],
          [ 2.1641e+00,  2.1641e+00,  2.2231e+00,  ...,  1.7708e+00,
            2.4001e+00,  2.4788e+00],
          [ 2.2231e+00,  2.2231e+00,  2.2821e+00,  ..., -6.6791e-01,
            1.2594e+00,  2.5181e+00],
          ...,
          [ 2.2625e+00,  2.2625e+00,  2.3018e+00,  ...,  2.5771e+00,
            2.5771e+00,  2.5968e+00],
          [ 2.1838e+00,  2.2035e+00,  2.2231e+00,  ...,  2.5575e+00,
            2.5575e+00,  2.5968e+00],
          [ 2.1445e+00,  2.1641e+00,  2.2231e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5968e+00]],

         [[ 1.1929e+00,  1.2514e+00,  1.2905e+00,  ...,  1.6221e+00,
            1.5051e+00,  1.6026e+00],
          [ 1.2709e+00,  1.2905e+00,  1.3490e+00,  ...,  1.1734e+00,
            1.6221e+00,  1.6612e+00],
          [ 1.3685e+00,  1.3685e+00,  1.4270e+00,  ..., -7.7763e-01,
            7.4417e-01,  1.7392e+00],
          ...,
          [ 1.3100e+00,  1.3100e+00,  1.3490e+00,  ...,  1.9928e+00,
            1.9928e+00,  2.0514e+00],
          [ 1.2514e+00,  1.2709e+00,  1.2905e+00,  ...,  1.8953e+00,
            1.8953e+00,  1.9538e+00],
          [ 1.2124e+00,  1.2319e+00,  1.2905e+00,  ...,  1.8172e+00,
            1.8172e+00,  1.8563e+00]]],


        [[[ 2.5141e+00,  2.5141e+00,  2.5141e+00,  ...,  2.5141e+00,
            2.5141e+00,  2.5141e+00],
          [ 2.5141e+00,  2.4947e+00,  2.4947e+00,  ...,  2.4947e+00,
            2.4947e+00,  2.4947e+00],
          [ 2.5141e+00,  2.4947e+00,  2.5141e+00,  ...,  2.5141e+00,
            2.5141e+00,  2.5141e+00],
          ...,
          [ 2.5141e+00,  2.4947e+00,  2.4947e+00,  ...,  2.4947e+00,
            2.4947e+00,  2.4947e+00],
          [ 2.5141e+00,  2.4947e+00,  2.4947e+00,  ...,  2.5141e+00,
            2.5141e+00,  2.5141e+00],
          [ 2.5141e+00,  2.4947e+00,  2.4947e+00,  ...,  2.5141e+00,
            2.5141e+00,  2.5141e+00]],

         [[ 2.5968e+00,  2.5968e+00,  2.5968e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5968e+00],
          [ 2.5968e+00,  2.5771e+00,  2.5771e+00,  ...,  2.5771e+00,
            2.5771e+00,  2.5771e+00],
          [ 2.5968e+00,  2.5771e+00,  2.5968e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5968e+00],
          ...,
          [ 2.5968e+00,  2.5771e+00,  2.5771e+00,  ...,  2.5771e+00,
            2.5771e+00,  2.5771e+00],
          [ 2.5968e+00,  2.5771e+00,  2.5771e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5968e+00],
          [ 2.5968e+00,  2.5771e+00,  2.5771e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5968e+00]],

         [[ 2.7537e+00,  2.7537e+00,  2.7537e+00,  ...,  2.7537e+00,
            2.7537e+00,  2.7537e+00],
          [ 2.7537e+00,  2.7342e+00,  2.7342e+00,  ...,  2.7342e+00,
            2.7342e+00,  2.7342e+00],
          [ 2.7537e+00,  2.7342e+00,  2.7537e+00,  ...,  2.7537e+00,
            2.7537e+00,  2.7537e+00],
          ...,
          [ 2.7537e+00,  2.7342e+00,  2.7342e+00,  ...,  2.7342e+00,
            2.7342e+00,  2.7342e+00],
          [ 2.7537e+00,  2.7342e+00,  2.7342e+00,  ...,  2.7537e+00,
            2.7537e+00,  2.7537e+00],
          [ 2.7537e+00,  2.7342e+00,  2.7342e+00,  ...,  2.7537e+00,
            2.7537e+00,  2.7537e+00]]]]), tensor([0, 0, 0, 0, 1, 1, 1, 1, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0,
        1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0,
        1, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0])]
x is [tensor([[[[-1.7506e+00, -1.8281e+00, -1.8087e+00,  ..., -7.0381e-01,
           -9.9458e-01, -1.1497e+00],
          [-1.8281e+00, -1.7700e+00, -1.8087e+00,  ..., -8.3950e-01,
           -1.2466e+00, -1.1303e+00],
          [-1.7700e+00, -1.7506e+00, -1.8281e+00,  ..., -7.8135e-01,
           -1.5374e+00, -1.3435e+00],
          ...,
          [-1.9638e+00, -1.9057e+00, -9.1704e-01,  ...,  1.8790e-01,
            8.4699e-01,  6.3375e-01],
          [-1.9444e+00, -1.9638e+00, -1.8087e+00,  ...,  6.7252e-01,
            5.1744e-01,  1.0602e+00],
          [-1.6149e+00, -1.7700e+00, -1.7894e+00,  ...,  9.0514e-01,
            1.4913e-01,  8.2760e-01]],

         [[-1.4742e+00, -1.8282e+00, -1.7889e+00,  ..., -6.4824e-01,
           -9.6291e-01, -1.1399e+00],
          [-1.6709e+00, -1.8086e+00, -1.7692e+00,  ..., -7.8591e-01,
           -1.2382e+00, -1.1399e+00],
          [-1.7299e+00, -1.8086e+00, -1.7692e+00,  ..., -7.2691e-01,
           -1.5136e+00, -1.3366e+00],
          ...,
          [-9.2357e-01, -1.0809e+00, -2.9424e-01,  ...,  5.1211e-01,
            1.1021e+00,  6.8911e-01],
          [-1.1399e+00, -1.4349e+00, -1.4152e+00,  ...,  7.0877e-01,
            4.3344e-01,  8.2677e-01],
          [-1.1989e+00, -1.6709e+00, -1.7299e+00,  ...,  1.0038e+00,
            7.6703e-04,  5.7111e-01]],

         [[-1.5385e+00, -1.4995e+00, -1.6361e+00,  ..., -1.0508e+00,
           -1.2654e+00, -1.3044e+00],
          [-1.6166e+00, -1.5190e+00, -1.6166e+00,  ..., -1.2069e+00,
           -1.4995e+00, -1.2849e+00],
          [-1.5776e+00, -1.5580e+00, -1.5971e+00,  ..., -1.1483e+00,
           -1.7922e+00, -1.4800e+00],
          ...,
          [-1.5971e+00, -1.6166e+00, -7.1910e-01,  ...,  4.5152e-01,
            5.6858e-01,  8.0821e-02],
          [-1.6361e+00, -1.7141e+00, -1.5580e+00,  ...,  4.1250e-01,
           -4.0694e-01, -3.6240e-02],
          [-1.4020e+00, -1.5971e+00, -1.5190e+00,  ...,  4.9054e-01,
           -6.8008e-01, -1.3379e-01]]],


        [[[ 2.4172e+00,  2.4947e+00,  2.4559e+00,  ...,  2.4559e+00,
            2.4947e+00,  2.4172e+00],
          [ 2.4947e+00,  2.5141e+00,  2.5141e+00,  ...,  2.5141e+00,
            2.5141e+00,  2.4947e+00],
          [ 2.4559e+00,  2.5141e+00,  2.4753e+00,  ...,  2.4753e+00,
            2.5141e+00,  2.4559e+00],
          ...,
          [ 2.4559e+00,  2.5141e+00,  2.4753e+00,  ...,  2.4753e+00,
            2.5141e+00,  2.4559e+00],
          [ 2.4947e+00,  2.5141e+00,  2.5141e+00,  ...,  2.5141e+00,
            2.5141e+00,  2.4947e+00],
          [ 2.4172e+00,  2.4947e+00,  2.4559e+00,  ...,  2.4559e+00,
            2.4947e+00,  2.4172e+00]],

         [[ 2.4985e+00,  2.5771e+00,  2.5378e+00,  ...,  2.5378e+00,
            2.5771e+00,  2.4985e+00],
          [ 2.5771e+00,  2.5968e+00,  2.5968e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5771e+00],
          [ 2.5378e+00,  2.5968e+00,  2.5575e+00,  ...,  2.5575e+00,
            2.5968e+00,  2.5378e+00],
          ...,
          [ 2.5378e+00,  2.5968e+00,  2.5575e+00,  ...,  2.5575e+00,
            2.5968e+00,  2.5378e+00],
          [ 2.5771e+00,  2.5968e+00,  2.5968e+00,  ...,  2.5968e+00,
            2.5968e+00,  2.5771e+00],
          [ 2.4985e+00,  2.5771e+00,  2.5378e+00,  ...,  2.5378e+00,
            2.5771e+00,  2.4985e+00]],

         [[ 2.6562e+00,  2.7342e+00,  2.6952e+00,  ...,  2.6952e+00,
            2.7342e+00,  2.6562e+00],
          [ 2.7342e+00,  2.7537e+00,  2.7537e+00,  ...,  2.7537e+00,
            2.7537e+00,  2.7342e+00],
          [ 2.6952e+00,  2.7537e+00,  2.7147e+00,  ...,  2.7147e+00,
            2.7537e+00,  2.6952e+00],
          ...,
          [ 2.6952e+00,  2.7537e+00,  2.7147e+00,  ...,  2.7147e+00,
            2.7537e+00,  2.6952e+00],
          [ 2.7342e+00,  2.7537e+00,  2.7537e+00,  ...,  2.7537e+00,
            2.7537e+00,  2.7342e+00],
          [ 2.6562e+00,  2.7342e+00,  2.6952e+00,  ...,  2.6952e+00,
            2.7342e+00,  2.6562e+00]]],


        [[[-2.1383e+00, -2.0801e+00, -1.4404e+00,  ..., -2.4291e+00,
           -2.4291e+00, -2.4097e+00],
          [-2.3515e+00, -2.2352e+00, -1.6149e+00,  ..., -2.4291e+00,
           -2.4291e+00, -2.4291e+00],
          [-2.3903e+00, -2.3709e+00, -1.8281e+00,  ..., -2.4097e+00,
           -2.4097e+00, -2.4097e+00],
          ...,
[truncated debug print of a normalized image batch tensor (values elided), labels: tensor([1, 0, 1, 1, 0, 1, 1, 1])]
Accuracy on test set: 1.4 , Racc: 1.4 , Uacc: 0.0
Folders created.
Checkpoint name: cifar100_resnet_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100_resnet_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [15338 29579  4126  8315   903 37224 39736 16028 12397  4436   579 32852
 37528 38428  7530 26956 27023 32189  3241 22377 32457 32450 34417 27974
  2028 13724 13451 30857  7528 18501 38266  5010  2765  4733  8988 35537
 22720 36827 15273 16037 27110 11636 15418 34399  9609 16979 33780  5725
 31276  4066 33456 28677 27027   221 20110 21811  4686 26283 35600 11872
 15123  4765 24989  4979 28846 10733 21020 16489 39138 33602 17168 13944
 17806  8249 18486  4196 22661 10689 27228 31635 28761  9804 27730  9690
  1700 22913 10051 16165  6758 31988  3811  9138 35595  7883 39157  4881
  1595   280 36256 22893 21137 24784 10832  5715 20926 33999 21846 20947
 11596 11053  6350  7108 32428 16422  5213 35542 20818 17981 21791  1813
 34398 39742  8784  5650 19565 29415 36350 23161 10394 18137  6454 29019
 28113  9492 29261 16752 35880 19934 28809  6047 11761 27537  2972 18786
 36805 29986  9779  1462 21054  1585 23654 17526 13134 33315  7279 38803
 21105 13914  2033 15860 13659 22489 14458 11165 25628 24069 12811 24688
 12700   625 35879 38480 34610 28239 24471 13046 13100 29306 30595 11697
 35844  7911 34867 36566 20941 24676 32943  3401 11185 35311 32729 26997
 27298 34216 13323 10370 19145  7412 17223   353]
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [same 200 indexes as listed above]
Number of Classes: 100
ResNet18(
  (conv1): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
  (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (layer1): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (layer2): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (shortcut): Sequential(
        (0): Conv2d(64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (layer3): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (shortcut): Sequential(
        (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (layer4): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (shortcut): Sequential(
        (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (linear): Linear(in_features=512, out_features=100, bias=True)
)
==> unlearning ...
Computing current moments on test set
Computed moments: 10.932028585815429,8.130016621589661,-1.7629506289070531
NaN in ft_samples_mia: False
[array of 400 entries, all True]
The MIA_loss has an accuracy of 0.927 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.585 on forgotten vs unseen images
Accuracy on test set: 1.3 , Racc: 1.3 , Uacc: 6.5
Forgetting epoch 0
Resetting retain iterator...
NaN in ft_samples_mia: False
[array of 400 entries, all True]
The MIA_loss has an accuracy of 0.980 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.540 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 0.9 , Uacc: 58.5
Forgetting epoch 1
NaN in ft_samples_mia: False
[array of 400 entries, all True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.537 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.2 , Uacc: 82.0
Forgetting epoch 2
NaN in ft_samples_mia: False
[array of 400 entries, all True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.512 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.1 , Uacc: 83.0
Forgetting epoch 3
NaN in ft_samples_mia: False
[array of 400 entries, all True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.582 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.1 , Uacc: 78.5
Forgetting epoch 4
Computing current moments on test set
Computed moments: 31.13483080444336,90.64256333007812,-0.04528286564828707
NaN in ft_samples_mia: False
[array of 400 entries, all True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.552 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.2 , Uacc: 86.0
Forgetting epoch 5
NaN in ft_samples_mia: False
[array of 400 entries, all True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.545 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.2 , Uacc: 86.5
Forgetting epoch 6
NaN in ft_samples_mia: False
[array of 400 entries, all True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.557 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.2 , Uacc: 88.0
Forgetting epoch 7
NaN in ft_samples_mia: False
[ True  True ... True  True]  (400-element boolean mask, all True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.607 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 1.2 , Uacc: 88.0
Forgetting epoch 8
NaN in ft_samples_mia: False
[ True  True ... True  True]  (400-element boolean mask, all True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.495 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.2 , Uacc: 90.0
Forgetting epoch 9
Computing current moments on test set
Computed moments: 30.701072647094726,125.72290086669922,0.9273635942574564
NaN in ft_samples_mia: False
[ True  True ... True  True]  (400-element boolean mask, all True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.443 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.2 , Uacc: 91.5
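The "Computed moments" lines print three numbers per periodic check. Assuming these are the first three moments (mean, variance, skewness) of some per-sample statistic on the test set — an assumption, since the quantity being summarized is not visible in this log — they could be computed as follows; `first_three_moments` is a hypothetical helper, not from the codebase:

```python
import numpy as np

# Hypothetical sketch: first three moments of a 1-D array of per-sample
# statistics (mean, variance as the second central moment, and skewness
# as the standardized third central moment).

def first_three_moments(x):
    x = np.asarray(x, dtype=float)
    mean = x.mean()
    var = x.var()                                # second central moment
    std = np.sqrt(var)
    skew = ((x - mean) ** 3).mean() / std ** 3   # standardized third moment
    return mean, var, skew

vals = np.array([1.0, 2.0, 3.0, 4.0, 10.0])
m, v, s = first_three_moments(vals)
```

The same three-tuple shape (location, spread, asymmetry) is consistent with the logged triples such as `30.70, 125.72, 0.93`.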
Forgetting epoch 10
NaN in ft_samples_mia: False
[ True  True ... True  True]  (400-element boolean mask, all True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.568 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 91.0
Forgetting epoch 11
NaN in ft_samples_mia: False
[ True  True ... True  True]  (400-element boolean mask, all True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.545 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.2 , Uacc: 92.0
Forgetting epoch 12
NaN in ft_samples_mia: False
[ True  True ... True  True]  (400-element boolean mask, all True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.470 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 94.5
Forgetting epoch 13
NaN in ft_samples_mia: False
[ True  True ... True  True]  (400-element boolean mask, all True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.512 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 93.5
Forgetting epoch 14
Computing current moments on test set
Computed moments: 30.18359422607422,151.28567282714843,1.421568840389997
NaN in ft_samples_mia: False
[ True  True ... True  True]  (400-element boolean mask, all True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.495 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 94.5
Forgetting epoch 15
NaN in ft_samples_mia: False
[ True  True ... True  True]  (400-element boolean mask, all True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.470 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 1.3 , Uacc: 95.0
Forgetting epoch 16
NaN in ft_samples_mia: False
[ True  True ... True  True]  (400-element boolean mask, all True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.568 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 95.5
Forgetting epoch 17
NaN in ft_samples_mia: False
[ True  True ... True  True]  (400-element boolean mask, all True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.443 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 95.0
Forgetting epoch 18
NaN in ft_samples_mia: False
[boolean mask over ft_samples_mia: all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.560 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 97.0
Forgetting epoch 19
Computing current moments on test set
Computed moments: 29.842274017333985,153.19609770507813,1.583537907573554
NaN in ft_samples_mia: False
[boolean mask over ft_samples_mia: all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.568 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 97.5
Forgetting epoch 20
NaN in ft_samples_mia: False
[boolean mask over ft_samples_mia: all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.535 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 1.3 , Uacc: 97.0
Forgetting epoch 21
NaN in ft_samples_mia: False
[boolean mask over ft_samples_mia: all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.575 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 96.5
Forgetting epoch 22
NaN in ft_samples_mia: False
[boolean mask over ft_samples_mia: all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.510 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 1.3 , Uacc: 98.0
Forgetting epoch 23
NaN in ft_samples_mia: False
[boolean mask over ft_samples_mia: all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.545 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 1.3 , Uacc: 98.0
Forgetting epoch 24
Accuracy on test set: 1.8 , Racc: 1.3 , Uacc: 99.0
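For reference, the `MIA_loss` accuracy reported throughout this log can be read as the best single-threshold accuracy at separating forgotten from unseen samples by their per-sample loss. The sketch below is an illustrative reconstruction, not the repository's actual code: the function name `mia_loss_accuracy` and the synthetic loss distributions are assumptions.

```python
# Illustrative sketch of a loss-threshold membership-inference attack (MIA).
# All names and data here are assumptions for demonstration only.
import numpy as np

def mia_loss_accuracy(forget_losses, unseen_losses):
    """Best single-threshold accuracy separating forgotten (label 1)
    from unseen (label 0) samples by loss value."""
    losses = np.concatenate([forget_losses, unseen_losses])
    labels = np.concatenate([np.ones_like(forget_losses),
                             np.zeros_like(unseen_losses)])
    best = 0.0
    for t in np.unique(losses):
        # predict "forgotten" when the loss exceeds the threshold,
        # and also try the flipped decision rule
        pred = (losses > t).astype(float)
        acc = max((pred == labels).mean(), ((1 - pred) == labels).mean())
        best = max(best, acc)
    return best

# Toy example: after unlearning, forgotten samples tend to have high loss.
rng = np.random.default_rng(0)
forget = rng.normal(5.0, 0.5, size=200)   # high-loss forgotten samples
unseen = rng.normal(1.0, 0.5, size=200)   # test-set-like unseen samples
print(f"MIA_loss accuracy: {mia_loss_accuracy(forget, unseen):.3f}")
```

With well-separated loss distributions the attack reaches near-perfect accuracy, in the spirit of the 0.990 figures above; an accuracy near 0.5 would mean the attacker cannot distinguish forgotten from unseen data.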
Checkpoint name: cifar100_resnet_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100_resnet_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [15338 29579  4126  8315   903 37224 39736 16028 12397  4436   579 32852
 37528 38428  7530 26956 27023 32189  3241 22377 32457 32450 34417 27974
  2028 13724 13451 30857  7528 18501 38266  5010  2765  4733  8988 35537
 22720 36827 15273 16037 27110 11636 15418 34399  9609 16979 33780  5725
 31276  4066 33456 28677 27027   221 20110 21811  4686 26283 35600 11872
 15123  4765 24989  4979 28846 10733 21020 16489 39138 33602 17168 13944
 17806  8249 18486  4196 22661 10689 27228 31635 28761  9804 27730  9690
  1700 22913 10051 16165  6758 31988  3811  9138 35595  7883 39157  4881
  1595   280 36256 22893 21137 24784 10832  5715 20926 33999 21846 20947
 11596 11053  6350  7108 32428 16422  5213 35542 20818 17981 21791  1813
 34398 39742  8784  5650 19565 29415 36350 23161 10394 18137  6454 29019
 28113  9492 29261 16752 35880 19934 28809  6047 11761 27537  2972 18786
 36805 29986  9779  1462 21054  1585 23654 17526 13134 33315  7279 38803
 21105 13914  2033 15860 13659 22489 14458 11165 25628 24069 12811 24688
 12700   625 35879 38480 34610 28239 24471 13046 13100 29306 30595 11697
 35844  7911 34867 36566 20941 24676 32943  3401 11185 35311 32729 26997
 27298 34216 13323 10370 19145  7412 17223   353]
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [same 200 indexes as listed above]
Number of Classes: 100
Traceback (most recent call last):
  File "nabla_unlearning_main.py", line 552, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/Resnet-CIFAR100_3407_800000/checkpoint_s_500000.pt'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "nabla_unlearning_main.py", line 555, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/Resnet-CIFAR100_3407_800000/checkpoint-s:500000.pt'
Checkpoint name: cifar100_resnet_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100_resnet_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [same 200 indexes as listed above]
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [same 200 indexes as listed above]
Number of Classes: 100
ResNet18(
  (conv1): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
  (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (layer1): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (layer2): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (shortcut): Sequential(
        (0): Conv2d(64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (layer3): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (shortcut): Sequential(
        (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (layer4): Sequential(
    (0): _ResBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1))
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (shortcut): Sequential(
        (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): _ResBlock(
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    )
  )
  (linear): Linear(in_features=512, out_features=100, bias=True)
)
Traceback (most recent call last):
  File "main_forget_sparse.py", line 556, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/Resnet-CIFAR100_3407_800000/checkpoint_s_500000.pt'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "main_forget_sparse.py", line 559, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/Resnet-CIFAR100_3407_800000/checkpoint-s:500000.pt'
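The paired tracebacks above suggest the scripts try two checkpoint filename conventions in turn (`checkpoint_s_<step>.pt`, then `checkpoint-s:<step>.pt`) and fail only when both are missing. A minimal sketch of that fallback pattern follows; `pickle` stands in for `torch.load` so the example stays dependency-free, and the function name and error message are illustrative assumptions.

```python
# Illustrative sketch of a checkpoint-loading fallback across two filename
# conventions, as implied by the paired tracebacks in the log.
import os
import pickle

def load_checkpoint(resume_dir, step):
    """Try both checkpoint filename conventions seen in the log."""
    candidates = [
        f"checkpoint_s_{step}.pt",    # underscore convention (tried first)
        f"checkpoint-s:{step}.pt",    # colon convention (fallback)
    ]
    tried = []
    for name in candidates:
        path = os.path.join(resume_dir, name)
        tried.append(path)
        try:
            with open(path, "rb") as f:
                return pickle.load(f)  # stand-in for torch.load(path)
        except FileNotFoundError:
            continue
    raise FileNotFoundError(f"no checkpoint found; tried {tried}")
```

Collecting every attempted path into a single final error makes the chained double-traceback in the log unnecessary: one exception reports both missing candidates.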
