Checkpoint name: cifar100__CNN_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100__CNN_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [15338 29579  4126  8315   903 37224 39736 16028 12397  4436   579 32852
 37528 38428  7530 26956 27023 32189  3241 22377 32457 32450 34417 27974
  2028 13724 13451 30857  7528 18501 38266  5010  2765  4733  8988 35537
 22720 36827 15273 16037 27110 11636 15418 34399  9609 16979 33780  5725
 31276  4066 33456 28677 27027   221 20110 21811  4686 26283 35600 11872
 15123  4765 24989  4979 28846 10733 21020 16489 39138 33602 17168 13944
 17806  8249 18486  4196 22661 10689 27228 31635 28761  9804 27730  9690
  1700 22913 10051 16165  6758 31988  3811  9138 35595  7883 39157  4881
  1595   280 36256 22893 21137 24784 10832  5715 20926 33999 21846 20947
 11596 11053  6350  7108 32428 16422  5213 35542 20818 17981 21791  1813
 34398 39742  8784  5650 19565 29415 36350 23161 10394 18137  6454 29019
 28113  9492 29261 16752 35880 19934 28809  6047 11761 27537  2972 18786
 36805 29986  9779  1462 21054  1585 23654 17526 13134 33315  7279 38803
 21105 13914  2033 15860 13659 22489 14458 11165 25628 24069 12811 24688
 12700   625 35879 38480 34610 28239 24471 13046 13100 29306 30595 11697
 35844  7911 34867 36566 20941 24676 32943  3401 11185 35311 32729 26997
 27298 34216 13323 10370 19145  7412 17223   353   251 19825 30759 29729
 31082 19591 20216 17928 27252 33141 27811  7482 12094 24208 38794 28166
 21188 11946 10287 12588  8013 19241 21333 20387 16137  8855 26872 24630
 35947 39010 34150 26599 25140 17774  1073 18866 30668 21119 15839 13886
  5896 33982 33563   810 15502 17508  9307  2644 30617 13699 39935  3338
 39546  6711 19081 11622 26924 13996  5139 25054  3760 30280 39636  6155
 30403 37127 22528  5444  5987 28612 35535 24245  7002  3820 11696 13097
 27068 12492 28743  9787 36354 37138 19503 26390  7557 26786 17683 20007
  8897  1702 33681 15850 23501  4390 31614  5417 19364 25932 31229 31030
 15037 21767  6472 32967  5066 20610  4655 16161  2538 24956 36133 27596
 30603 15081  7592 17892 23084  4479 24553 16920  3973 35392 29965 10463
 36118 13356 32034  5818 28389 23575  2629 23411  2884 12223 16361 34368
 35896 21114 26212 17385 23008 11582  2853 36094 34174 35283 33580  9752
 33044  8862 10230 36194  3010 27820 33297 29436 29513  2120 22027 12754
  1112 39738  3517 37591 10548 22759 11977 36602  1999 34618 25504 29196
 18571 13224  2782 31575 16108 34337 18030  2985 31530 28037 20599 32061
 32702 15947 31109 39064  7615 28852 33504 13252  1328 33488 25706  8032
  4627 24803  3333 32556]
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Number of Classes: 100
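The setup above selects 400 indices to forget from classes [0, 1] (the checkpoint name's `num_400` and `forget_[0, 1]`). A minimal sketch of how such a split might be built, assuming the indices are sampled from the pool of samples whose label is in the forget classes (the helper name `split_forget_retain` and the exact sampling scheme are assumptions, not the run's actual code):

```python
import numpy as np

def split_forget_retain(labels, forget_classes, num, seed=10):
    # Hypothetical sketch: draw `num` random indices whose label falls in
    # `forget_classes`; all remaining indices form the retain set.
    rng = np.random.default_rng(seed)
    forget_pool = np.flatnonzero(np.isin(labels, forget_classes))
    forget_idx = rng.choice(forget_pool, size=num, replace=False)
    retain_idx = np.setdiff1d(np.arange(len(labels)), forget_idx)
    return forget_idx, retain_idx
```

With a 40,000-sample train split of CIFAR-100 this would yield 400 forget indices in roughly the 0–39,999 range seen in the log.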
logs/cifar100__CNN_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_resume_cp1304
==> unlearning ...
Computing current moments on test set
Computed moments: 10.383838467407227,8.01676732711792,-1.3475479697511006
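The three "computed moments" plausibly summarize the distribution of per-sample test losses; a guess consistent with the shape of the numbers (a large positive first value, a non-negative second, a small signed third) is mean, variance, and skewness. A hedged sketch under that assumption (the function name `loss_moments` is hypothetical):

```python
import numpy as np

def loss_moments(losses):
    # Assumption: the logged triple is (mean, variance, skewness) of the
    # per-sample losses on the evaluation set.
    losses = np.asarray(losses, dtype=np.float64)
    mean = losses.mean()
    var = losses.var()
    # Skewness: third central moment normalized by var^(3/2); the small
    # epsilon guards against a zero-variance batch.
    skew = ((losses - mean) ** 3).mean() / (var ** 1.5 + 1e-12)
    return mean, var, skew
```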
The MIA_loss has an accuracy of 0.924 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.550 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.0 , Uacc: 7.8
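The MIA_loss score above measures how well a membership-inference attack separates forgotten from unseen images using per-sample losses. The log does not show the attack's implementation; as an illustrative stand-in, a single optimal loss threshold already captures the idea (the name `mia_threshold_accuracy` and the threshold-based attack are assumptions, not necessarily the logged classifier):

```python
import numpy as np

def mia_threshold_accuracy(forget_losses, unseen_losses):
    # Sketch of a loss-based membership-inference attack: find the single
    # threshold on per-sample loss that best separates the two groups.
    losses = np.concatenate([forget_losses, unseen_losses])
    labels = np.concatenate(
        [np.zeros(len(forget_losses)), np.ones(len(unseen_losses))]
    )
    best = 0.0
    for t in np.unique(losses):
        pred = (losses >= t).astype(float)
        # Take the better of the two orientations, since either group may
        # have the higher losses.
        acc = max((pred == labels).mean(), ((1 - pred) == labels).mean())
        best = max(best, acc)
    return best
```

An accuracy near 0.5 means the attacker cannot tell the forgotten samples apart from never-seen ones, which is the goal of unlearning; the 0.924 logged before unlearning indicates the forget set is still clearly distinguishable.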
Forgetting epoch 0
Resetting retain iterator...
using alpha: 0.1
delta_val_loss: 6.33651876449585
delta_first_moment: 8.016767501831055
delta_second_moment: nan
The MIA_loss has an accuracy of 0.725 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.631 on forgotten vs unseen images
Accuracy on test set: 1.3 , Racc: 1.2 , Uacc: 0.0
Forgetting epoch 1
using alpha: 0.096
delta_val_loss: 5.443652153015137
delta_first_moment: 8.016767501831055
delta_second_moment: nan
The MIA_loss has an accuracy of 0.704 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.686 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 1.3 , Uacc: 0.0
Forgetting epoch 2
using alpha: 0.092
delta_val_loss: 2.066470146179199
delta_first_moment: 8.016767501831055
delta_second_moment: nan
The MIA_loss has an accuracy of 0.699 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.704 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.4 , Uacc: 0.2
Forgetting epoch 3
using alpha: 0.088
delta_val_loss: -0.6012096405029297
delta_first_moment: 8.016767501831055
delta_second_moment: nan
The MIA_loss has an accuracy of 0.705 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.694 on forgotten vs unseen images
Accuracy on test set: 1.3 , Racc: 1.3 , Uacc: 0.0
Forgetting epoch 4
using alpha: 0.08399999999999999
delta_val_loss: -1.3961868286132812
delta_first_moment: 8.016767501831055
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.845665470886231,1.9988329613685607,7.759664995584609
The MIA_loss has an accuracy of 0.766 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.639 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 5
using alpha: 0.07999999999999999
delta_val_loss: -1.8757872581481934
delta_first_moment: 1.9988329410552979
delta_second_moment: nan
The MIA_loss has an accuracy of 0.895 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.619 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 6
using alpha: 0.07599999999999998
delta_val_loss: -0.14453458786010742
delta_first_moment: 1.9988329410552979
delta_second_moment: nan
The MIA_loss has an accuracy of 0.831 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.625 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 0.2
Forgetting epoch 7
using alpha: 0.07199999999999998
delta_val_loss: 0.03755378723144531
delta_first_moment: 1.9988329410552979
delta_second_moment: nan
The MIA_loss has an accuracy of 0.643 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.500 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.0 , Uacc: 8.5
Forgetting epoch 8
using alpha: 0.06799999999999998
delta_val_loss: 0.1575632095336914
delta_first_moment: 1.9988329410552979
delta_second_moment: nan
The MIA_loss has an accuracy of 0.754 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.589 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 1.1 , Uacc: 0.5
Forgetting epoch 9
using alpha: 0.06399999999999997
delta_val_loss: 0.1226816177368164
delta_first_moment: 1.9988329410552979
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.608031422424316,0.00596838749051094,-0.683987524649701
The MIA_loss has an accuracy of 0.829 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.666 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 10
using alpha: 0.05999999999999997
delta_val_loss: -0.1476612091064453
delta_first_moment: 0.005968387704342604
delta_second_moment: nan
The MIA_loss has an accuracy of 0.890 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.656 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 11
using alpha: 0.055999999999999966
delta_val_loss: -0.17519330978393555
delta_first_moment: 0.005968387704342604
delta_second_moment: nan
The MIA_loss has an accuracy of 0.925 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.662 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 12
using alpha: 0.05199999999999996
delta_val_loss: -0.20284223556518555
delta_first_moment: 0.005968387704342604
delta_second_moment: nan
The MIA_loss has an accuracy of 0.932 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.670 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 13
using alpha: 0.04799999999999996
delta_val_loss: -0.19900846481323242
delta_first_moment: 0.005968387704342604
delta_second_moment: nan
The MIA_loss has an accuracy of 0.914 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.604 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 14
using alpha: 0.043999999999999956
delta_val_loss: -0.30548620223999023
delta_first_moment: 0.005968387704342604
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.6150267166137695,0.03498991470038891,-1.9012700107603684
The MIA_loss has an accuracy of 0.939 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.580 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 15
using alpha: 0.03999999999999995
delta_val_loss: -0.5234346389770508
delta_first_moment: 0.034989915788173676
delta_second_moment: nan
The MIA_loss has an accuracy of 0.733 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.574 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 1.3 , Uacc: 0.0
Forgetting epoch 16
using alpha: 0.03599999999999995
delta_val_loss: -0.12833881378173828
delta_first_moment: 0.034989915788173676
delta_second_moment: nan
The MIA_loss has an accuracy of 0.507 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.500 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 1.1 , Uacc: 4.2
Forgetting epoch 17
using alpha: 0.031999999999999945
delta_val_loss: 0.027743816375732422
delta_first_moment: 0.034989915788173676
delta_second_moment: nan
The MIA_loss has an accuracy of 0.576 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.574 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 0.9 , Uacc: 7.0
Forgetting epoch 18
using alpha: 0.027999999999999945
delta_val_loss: 0.018527507781982422
delta_first_moment: 0.034989915788173676
delta_second_moment: nan
The MIA_loss has an accuracy of 0.561 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.546 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 0.9 , Uacc: 9.5
Forgetting epoch 19
using alpha: 0.023999999999999945
delta_val_loss: -0.001857757568359375
delta_first_moment: 0.034989915788173676
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.60128441619873,0.016389711928367613,-1.1402445669087327
The MIA_loss has an accuracy of 0.866 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.588 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 2.2
Forgetting epoch 20
using alpha: 0.019999999999999945
delta_val_loss: -0.33408355712890625
delta_first_moment: 0.016389712691307068
delta_second_moment: nan
The MIA_loss has an accuracy of 0.922 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.576 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 0.5
Forgetting epoch 21
using alpha: 0.015999999999999945
delta_val_loss: -0.6052093505859375
delta_first_moment: 0.016389712691307068
delta_second_moment: nan
The MIA_loss has an accuracy of 0.957 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.597 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 22
using alpha: 0.011999999999999945
delta_val_loss: -0.7670326232910156
delta_first_moment: 0.016389712691307068
delta_second_moment: nan
The MIA_loss has an accuracy of 0.801 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.528 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 3.0
Forgetting epoch 23
using alpha: 0.007999999999999945
delta_val_loss: -0.24444913864135742
delta_first_moment: 0.016389712691307068
delta_second_moment: nan
The MIA_loss has an accuracy of 0.874 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.576 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 3.8
Forgetting epoch 24
using alpha: 0.003999999999999945
delta_val_loss: -0.3055715560913086
delta_first_moment: 0.016389712691307068
delta_second_moment: nan
The MIA_loss has an accuracy of 0.774 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.506 on forgotten vs unseen images
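The alpha values printed across the 25 epochs (0.1, 0.096, 0.092, …, 0.003999999999999945) are consistent with a linear decay of 0.004 per epoch implemented by repeated in-place subtraction, which accumulates floating-point drift (hence 0.07999999999999999 instead of 0.08). This is an inference from the log, not the run's actual code; both variants are sketched below:

```python
def alpha_schedule_iterative(epochs, start=0.1, step=0.004):
    # Repeated subtraction, matching the drifting values in the log.
    alphas, a = [], start
    for _ in range(epochs):
        alphas.append(a)
        a -= step
    return alphas

def alpha_schedule_closed(epochs, start=0.1, step=0.004):
    # Closed form: recomputing from the epoch index keeps the drift from
    # compounding across epochs.
    return [start - step * e for e in range(epochs)]
```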
[debug dump elided: `x is [tensor(...)]` printed two raw input batches (normalized 3-channel 32x32 image tensors with their 0/1 class labels); the log is truncated mid-dump]
            1.7118e+00,  1.7904e+00],
          [ 1.5544e+00,  1.4758e+00,  1.5348e+00,  ...,  1.9281e+00,
            1.9675e+00,  2.0461e+00]],

         [[-2.0653e+00, -2.1629e+00, -2.1824e+00,  ..., -2.1824e+00,
           -2.1629e+00, -2.1824e+00],
          [-2.1043e+00, -2.2019e+00, -2.2214e+00,  ..., -2.2019e+00,
           -2.1629e+00, -2.1824e+00],
          [-2.1238e+00, -2.1824e+00, -2.2019e+00,  ..., -2.1824e+00,
           -2.1238e+00, -2.1238e+00],
          ...,
          [-1.9092e+00, -1.9092e+00, -2.0458e+00,  ..., -1.7336e+00,
           -5.4351e-01,  7.2466e-01],
          [-1.7281e-01, -4.8498e-01, -7.7763e-01,  ...,  1.3935e-01,
            9.7829e-01,  1.1929e+00],
          [ 1.0563e+00,  1.1149e+00,  9.9781e-01,  ...,  1.6612e+00,
            1.6221e+00,  1.3880e+00]]],


        [[[-1.7700e+00, -1.9251e+00, -1.9444e+00,  ..., -1.9444e+00,
           -1.9832e+00, -1.7700e+00],
          [-2.2546e+00, -2.4291e+00, -2.4291e+00,  ..., -2.4097e+00,
           -2.4291e+00, -2.2740e+00],
          [-2.2158e+00, -2.3709e+00, -2.3903e+00,  ..., -2.3515e+00,
           -2.3709e+00, -2.2352e+00],
          ...,
          [ 1.5448e+00,  1.3316e+00,  1.3510e+00,  ...,  3.8175e-01,
            2.8482e-01,  3.4298e-01],
          [ 1.2541e+00,  1.0796e+00,  1.1378e+00,  ...,  3.4298e-01,
            4.0113e-01,  4.0113e-01],
          [ 1.1959e+00,  1.0408e+00,  1.2735e+00,  ...,  4.9806e-01,
            4.9806e-01,  4.3990e-01]],

         [[-1.5529e+00, -1.7889e+00, -1.7496e+00,  ..., -1.7299e+00,
           -1.7692e+00, -1.5922e+00],
          [-2.2609e+00, -2.4183e+00, -2.4183e+00,  ..., -2.4183e+00,
           -2.4183e+00, -2.3003e+00],
          [-2.1626e+00, -2.3593e+00, -2.3199e+00,  ..., -2.3396e+00,
           -2.3986e+00, -2.2019e+00],
          ...,
          [ 6.6944e-01,  3.9410e-01,  3.5477e-01,  ..., -6.0891e-01,
           -8.6457e-01, -5.8924e-01],
          [ 5.5144e-01,  2.7610e-01,  2.3677e-01,  ..., -5.8924e-01,
           -5.8924e-01, -4.1224e-01],
          [ 6.8911e-01,  4.3344e-01,  5.5144e-01,  ..., -1.3690e-01,
           -9.7567e-02, -3.8567e-02]],

         [[-1.4800e+00, -1.6751e+00, -1.6556e+00,  ..., -1.6556e+00,
           -1.6751e+00, -1.4605e+00],
          [-2.0848e+00, -2.2214e+00, -2.2214e+00,  ..., -2.2214e+00,
           -2.2214e+00, -2.1043e+00],
          [-2.0263e+00, -2.2019e+00, -2.1824e+00,  ..., -2.2019e+00,
           -2.2019e+00, -2.0263e+00],
          ...,
          [-6.0204e-01, -1.0118e+00, -8.9469e-01,  ..., -1.6361e+00,
           -1.8312e+00, -1.4410e+00],
          [-8.9469e-01, -1.3044e+00, -1.2654e+00,  ..., -1.6166e+00,
           -1.6361e+00, -1.3629e+00],
          [-3.2889e-01, -6.9959e-01, -5.2400e-01,  ..., -8.5567e-01,
           -8.9469e-01, -7.5812e-01]]],


        [[[-2.9672e-01,  2.8482e-01,  4.5929e-01,  ..., -1.9638e+00,
           -2.0220e+00, -1.7312e+00],
          [ 8.2760e-01,  9.2452e-01,  8.4699e-01,  ..., -1.2660e+00,
           -6.2627e-01, -5.9512e-03],
          [ 1.5448e+00,  1.3704e+00,  1.3122e+00,  ..., -3.9365e-01,
           -5.9512e-03,  1.0214e+00],
          ...,
          [ 8.8576e-01,  5.9498e-01,  3.2359e-01,  ..., -2.3857e-01,
           -2.7734e-01, -2.7734e-01],
          [ 1.1378e+00,  8.4699e-01,  5.7560e-01,  ..., -2.1919e-01,
           -2.3857e-01, -2.7734e-01],
          [ 8.4699e-01,  1.0796e+00,  8.0822e-01,  ..., -2.5796e-01,
           -2.7734e-01, -2.7734e-01]],

         [[-1.5657e-01,  5.1211e-01,  6.6944e-01,  ..., -1.5922e+00,
           -1.8676e+00, -1.7102e+00],
          [ 8.2677e-01,  9.8411e-01,  8.2677e-01,  ..., -9.8258e-01,
           -3.3357e-01,  4.0101e-02],
          [ 1.2791e+00,  1.0824e+00,  8.6611e-01,  ..., -1.8900e-02,
            2.3677e-01,  9.6444e-01],
          ...,
          [ 6.8911e-01,  5.3177e-01,  5.5144e-01,  ...,  9.9101e-02,
            4.0101e-02,  4.0101e-02],
          [ 9.0544e-01,  6.8911e-01,  5.9077e-01,  ...,  1.1877e-01,
            7.9434e-02,  4.0101e-02],
          [ 6.8911e-01,  8.8578e-01,  6.4977e-01,  ...,  1.1877e-01,
            5.9768e-02,  5.9768e-02]],

         [[ 2.7802e-03,  6.2711e-01,  5.2956e-01,  ..., -1.1093e+00,
           -1.1483e+00, -9.9224e-01],
          [ 7.8319e-01,  8.0270e-01,  4.9054e-01,  ..., -4.8498e-01,
           -3.6240e-02,  3.1495e-01],
          [ 1.1149e+00,  9.0025e-01,  7.2466e-01,  ...,  1.7837e-01,
            2.7592e-01,  9.1976e-01],
          ...,
          [ 1.0033e-01, -1.6730e-02, -9.4771e-02,  ...,  1.1984e-01,
            2.7802e-03,  2.2291e-02],
          [ 2.5641e-01,  6.1311e-02,  2.7802e-03,  ...,  2.7802e-03,
           -3.6240e-02, -7.5261e-02],
          [ 1.0033e-01,  3.1495e-01,  2.3690e-01,  ..., -9.4771e-02,
           -1.3379e-01, -1.5330e-01]]]]), tensor([0, 0, 1, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 1, 0, 1, 1, 1,
        0, 0, 0, 0, 1, 1, 1, 0, 0, 1, 0, 0, 1, 1, 1, 1, 0, 1, 1, 0, 0, 0, 1, 1,
        1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0])]
x is [tensor([[[[-0.8201, -0.0447,  0.0134,  ..., -1.8281, -2.2546, -2.3128],
          [-0.8783, -0.3743,  0.1491,  ..., -2.2352, -2.3321, -2.2158],
          [-0.4518, -0.5293,  0.1104,  ..., -2.0026, -2.1383, -2.2158],
          ...,
          [-0.0835, -0.2773, -0.4712,  ..., -0.6650, -0.5293, -0.9364],
          [-1.4211, -1.2660, -1.2466,  ..., -0.7620, -0.3743, -0.8007],
          [-1.8669, -2.0414, -1.9638,  ..., -0.7232, -0.2580, -0.5487]],

         [[-0.9432, -0.0976,  0.0401,  ..., -1.7692, -2.1823, -2.2413],
          [-1.0416, -0.4712,  0.1384,  ..., -2.1626, -2.2216, -2.1036],
          [-0.6679, -0.6679,  0.0401,  ..., -2.0053, -2.1233, -2.1823],
          ...,
          [-0.1762, -0.3532, -0.7269,  ..., -0.7072, -0.4909, -0.9039],
          [-1.4546, -1.2579, -1.3956,  ..., -0.8056, -0.3336, -0.7466],
          [-1.8873, -1.9856, -2.0053,  ..., -0.7859, -0.2352, -0.5106]],

         [[-0.9727, -0.1533, -0.0362,  ..., -1.4995, -1.7141, -1.6751],
          [-1.0508, -0.5045,  0.0613,  ..., -1.8117, -1.7336, -1.5580],
          [-0.6606, -0.6801, -0.0167,  ..., -1.6556, -1.6751, -1.7141],
          ...,
          [-0.2118, -0.3874, -0.6801,  ..., -0.9532, -0.7776, -1.0313],
          [-1.4410, -1.2459, -1.3239,  ..., -1.0118, -0.5825, -0.8752],
          [-1.7922, -1.9092, -1.8897,  ..., -0.9532, -0.4264, -0.6215]]],


        [[[-1.1303, -1.1497, -0.8783,  ..., -1.2660, -1.3629, -1.1690],
          [-1.0915, -1.2272, -1.1690,  ..., -1.2272, -1.1884, -1.1303],
          [-1.0721, -1.1303, -1.1303,  ..., -1.1109, -1.0527, -1.0334],
          ...,
          [-1.0721, -1.3435, -1.3047,  ..., -1.2078, -1.2854, -1.6537],
          [-0.9946, -1.5567, -1.5180,  ..., -1.2660, -1.4986, -1.5180],
          [-1.0721, -1.6343, -1.6149,  ..., -1.2854, -1.7118, -1.5180]],

         [[-0.8056, -0.7859, -0.4516,  ..., -0.7662, -0.9826, -0.8646],
          [-0.7859, -0.8842, -0.7859,  ..., -0.7269, -0.7859, -0.8056],
          [-0.7662, -0.8056, -0.7466,  ..., -0.6286, -0.6482, -0.6876],
          ...,
          [-0.7269, -1.1399, -1.1989,  ..., -0.9826, -1.0612, -1.4742],
          [-0.7466, -1.3366, -1.2776,  ..., -1.0809, -1.4742, -1.5529],
          [-0.8842, -1.4546, -1.3956,  ..., -1.1006, -1.6709, -1.4152]],

         [[-0.9532, -0.9337, -0.6215,  ..., -0.7971, -1.0898, -1.0508],
          [-0.8557, -0.9532, -0.8752,  ..., -0.7386, -0.9922, -1.1093],
          [-0.7776, -0.8167, -0.7776,  ..., -0.6411, -0.8362, -0.9727],
          ...,
          [-1.0118, -1.2264, -1.2849,  ..., -1.0118, -1.0898, -1.4800],
          [-1.0118, -1.4410, -1.4215,  ..., -1.0703, -1.4215, -1.4800],
          [-0.9532, -1.4800, -1.4410,  ..., -1.1678, -1.6946, -1.5190]]],


        [[[-2.3903, -2.3903, -2.3903,  ..., -2.3903, -2.3903, -2.3903],
          [-2.3903, -2.3903, -2.3903,  ..., -2.3903, -2.3903, -2.3903],
          [-2.3903, -2.3903, -2.3903,  ..., -2.3903, -2.3903, -2.3903],
          ...,
          [-2.3903, -2.3903, -2.3903,  ..., -2.3903, -2.3903, -2.3903],
          [-2.3903, -2.3903, -2.3903,  ..., -2.3903, -2.3903, -2.3903],
          [-2.3903, -2.3903, -2.3903,  ..., -2.3903, -2.3903, -2.3903]],

         [[-1.8086, -1.8086, -1.8086,  ..., -1.8086, -1.8086, -1.8086],
          [-1.8086, -1.8086, -1.8086,  ..., -1.8086, -1.8086, -1.8086],
          [-1.8086, -1.8086, -1.8086,  ..., -1.8086, -1.8086, -1.8086],
          ...,
          [-1.8086, -1.8086, -1.8086,  ..., -1.8086, -1.8086, -1.8086],
          [-1.8086, -1.8086, -1.8086,  ..., -1.8086, -1.8086, -1.8086],
          [-1.8086, -1.8086, -1.8086,  ..., -1.8086, -1.8086, -1.8086]],

         [[-2.2019, -2.2019, -2.2019,  ..., -2.2019, -2.2019, -2.2019],
          [-2.2019, -2.2019, -2.2019,  ..., -2.2019, -2.2019, -2.2019],
          [-2.2019, -2.2019, -2.2019,  ..., -2.2019, -2.2019, -2.2019],
          ...,
          [-2.2019, -2.2019, -2.2019,  ..., -2.2019, -2.2019, -2.2019],
          [-2.2019, -2.2019, -2.2019,  ..., -2.2019, -2.2019, -2.2019],
          [-2.2019, -2.2019, -2.2019,  ..., -2.2019, -2.2019, -2.2019]]],


        ...,


        [[[-1.9832, -1.9832, -1.9832,  ..., -2.0995, -2.1189, -2.0995],
          [-1.9832, -1.9832, -1.9832,  ..., -2.0801, -2.1189, -2.0995],
          [-1.9832, -1.9832, -1.9638,  ..., -2.0801, -2.1189, -2.1189],
          ...,
          [-1.9832, -1.9444, -1.9251,  ..., -2.0995, -2.0995, -2.0995],
          [-1.9832, -1.9444, -1.9444,  ..., -2.0995, -2.0995, -2.0995],
          [-1.9832, -1.9444, -1.9444,  ..., -2.0995, -2.0995, -2.0995]],

         [[-2.1233, -2.1233, -2.1233,  ..., -2.2216, -2.2609, -2.2413],
          [-2.1233, -2.1233, -2.1233,  ..., -2.2216, -2.2609, -2.2413],
          [-2.1233, -2.1233, -2.1036,  ..., -2.2216, -2.2609, -2.2609],
          ...,
          [-2.1429, -2.0839, -2.0643,  ..., -2.2609, -2.2609, -2.2609],
          [-2.1429, -2.0839, -2.0839,  ..., -2.2609, -2.2609, -2.2609],
          [-2.1429, -2.0839, -2.0839,  ..., -2.2609, -2.2609, -2.2609]],

         [[-1.6751, -1.6751, -1.6751,  ..., -1.7922, -1.8117, -1.7922],
          [-1.6751, -1.6751, -1.6751,  ..., -1.7727, -1.8117, -1.7922],
          [-1.6751, -1.6751, -1.6556,  ..., -1.7727, -1.8117, -1.8117],
          ...,
          [-1.5971, -1.5971, -1.6166,  ..., -1.6946, -1.6946, -1.6946],
          [-1.5971, -1.5971, -1.6361,  ..., -1.6946, -1.6946, -1.6946],
          [-1.5971, -1.5971, -1.6361,  ..., -1.6946, -1.6946, -1.6946]]],


        [[[-1.9444, -1.8669, -1.8087,  ..., -1.9638, -1.8669, -1.9251],
          [-1.8281, -1.7506, -1.6537,  ..., -1.9251, -1.7700, -1.8475],
          [-1.6924, -1.6343, -1.5374,  ..., -1.8281, -1.7506, -1.8281],
          ...,
          [-0.8589, -0.9558, -0.9558,  ...,  1.6224,  1.5836,  1.5836],
          [-1.2078, -1.2466, -1.1690,  ...,  1.5255,  1.5642,  1.5255],
          [-1.1884, -1.2272, -1.2660,  ...,  1.4091,  1.4479,  1.5448]],

         [[-1.7692, -1.6906, -1.6316,  ..., -1.8479, -1.7496, -1.8086],
          [-1.6512, -1.5726, -1.4742,  ..., -1.8086, -1.6512, -1.7299],
          [-1.4939, -1.4546, -1.3562,  ..., -1.7102, -1.6316, -1.7102],
          ...,
          [-1.1006, -1.1792, -1.1792,  ...,  1.8101,  1.7708,  1.7708],
          [-1.4546, -1.4939, -1.4152,  ...,  1.6921,  1.7314,  1.6921],
          [-1.4546, -1.4939, -1.5332,  ...,  1.5741,  1.6134,  1.7118]],

         [[-1.8312, -1.7531, -1.7141,  ..., -1.5385, -1.4410, -1.4995],
          [-1.7141, -1.6361, -1.5385,  ..., -1.4995, -1.3434, -1.4215],
          [-1.5776, -1.5190, -1.4215,  ..., -1.4020, -1.3239, -1.4020],
          ...,
          [-1.5971, -1.6751, -1.6751,  ...,  1.3295,  1.2514,  1.2514],
          [-1.7922, -1.8117, -1.7531,  ...,  1.2514,  1.2709,  1.2514],
          [-1.6556, -1.6946, -1.7141,  ...,  1.1734,  1.2124,  1.2905]]],


        [[[-1.8669, -1.9057, -2.0995,  ..., -1.8281, -2.0801, -1.7506],
          [-2.0608, -2.0801, -2.1771,  ..., -1.9251, -2.1964, -1.8669],
          [-2.1577, -2.1577, -2.1577,  ..., -1.8863, -2.1964, -1.9057],
          ...,
          [-2.2352, -2.2158, -2.1964,  ..., -2.3128, -2.3128, -2.2740],
          [-2.2352, -2.2158, -2.1577,  ..., -2.3321, -2.3321, -2.3128],
          [-2.2546, -2.2158, -2.1771,  ..., -2.3515, -2.3128, -2.2740]],

         [[-1.6512, -1.6906, -1.8873,  ..., -1.3956, -1.7102, -1.3169],
          [-1.8676, -1.9069, -1.9856,  ..., -1.5136, -1.8676, -1.4349],
          [-2.0249, -2.0249, -2.0249,  ..., -1.4742, -1.8676, -1.5136],
          ...,
          [-2.1823, -2.1626, -2.1429,  ..., -2.3003, -2.3003, -2.2609],
          [-2.1823, -2.1626, -2.1233,  ..., -2.3199, -2.3396, -2.3003],
          [-2.2413, -2.2019, -2.1626,  ..., -2.3593, -2.3003, -2.2806]],

         [[-1.6166, -1.6556, -1.8507,  ..., -1.9482, -1.9678, -2.0068],
          [-1.8312, -1.8507, -1.9482,  ..., -1.9678, -1.9482, -2.0068],
          [-1.9678, -1.9678, -1.9678,  ..., -1.9482, -1.8702, -1.9482],
          ...,
          [-2.0458, -2.0263, -2.0068,  ..., -2.0653, -2.1043, -2.1238],
          [-2.0458, -2.0263, -1.9873,  ..., -2.1043, -2.1434, -2.1629],
          [-2.0848, -2.0458, -2.0068,  ..., -2.1824, -2.1434, -2.1238]]]]), tensor([0, 0, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 0, 1, 1, 0, 0, 0, 0, 1, 0,
        0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1, 0, 0, 1, 0, 0, 0, 1,
        0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 1, 1, 1])]
x is [tensor([[[[-1.7312, -1.5374, -1.4404,  ..., -0.1610,  0.6725,  1.8550],
          [-1.4986, -0.8201, -1.6731,  ...,  0.2073,  1.5836,  2.1652],
          [-1.2854, -1.3629, -1.7312,  ...,  1.0796,  1.8938,  2.0876],
          ...,
          [ 0.8664,  0.1491,  0.1297,  ...,  1.7968,  1.6612,  1.8744],
          [ 0.8276, -0.0835,  0.3236,  ...,  1.6224,  1.7581,  2.0489],
          [ 1.0990,  0.3042,  0.5950,  ...,  1.6999,  1.8744,  1.9907]],

         [[-0.2549, -0.3336, -0.1369,  ...,  0.6301,  1.2791,  2.2231],
          [ 0.0991,  0.3548, -0.5892,  ...,  0.9054,  1.9871,  2.4001],
          [ 0.2958, -0.3926, -0.9236,  ...,  1.7314,  2.1445,  2.2821],
          ...,
          [ 1.5741,  0.9448,  0.8858,  ...,  1.3971,  1.2594,  1.4758],
          [ 1.4168,  0.6498,  1.0824,  ...,  1.2398,  1.3774,  1.6724],
          [ 1.5938,  1.0038,  1.3381,  ...,  1.3381,  1.4954,  1.6331]],

         [[-0.5630, -0.5435, -0.3094,  ...,  0.3149,  0.9978,  1.8563],
          [-0.4655, -0.0362, -0.5240,  ...,  0.6076,  1.7392,  2.0904],
          [-0.3874, -0.7581, -0.6411,  ...,  1.4075,  1.9148,  1.9928],
          ...,
          [ 0.9978,  0.6661,  0.5296,  ...,  0.8222,  0.7247,  0.9588],
          [ 0.8807,  0.4710,  0.7442,  ...,  0.5881,  0.7832,  1.1149],
          [ 1.1344,  0.8807,  1.0563,  ...,  0.6466,  0.8417,  1.0173]]],


        [[[ 2.5141,  2.5141,  2.5141,  ...,  2.5141,  2.5141,  2.5141],
          [ 2.5141,  2.4753,  2.4947,  ...,  2.4947,  2.4947,  2.5141],
          [ 2.5141,  2.4753,  2.4753,  ...,  2.4947,  2.4947,  2.5141],
          ...,
          [ 2.5141,  2.4947,  2.4559,  ...,  2.3202,  2.4753,  2.5141],
          [ 2.5141,  2.4947,  2.4947,  ...,  2.4365,  2.4559,  2.5141],
          [ 2.5141,  2.5141,  2.5141,  ...,  2.5141,  2.5141,  2.5141]],

         [[ 2.5968,  2.5968,  2.5968,  ...,  2.5968,  2.5968,  2.5968],
          [ 2.5968,  2.5575,  2.5771,  ...,  2.5771,  2.5771,  2.5968],
          [ 2.5968,  2.5575,  2.5575,  ...,  2.5968,  2.5771,  2.5968],
          ...,
          [ 2.5968,  2.5771,  2.5575,  ...,  2.2821,  2.5575,  2.5968],
          [ 2.5968,  2.5771,  2.5771,  ...,  2.5771,  2.5378,  2.5968],
          [ 2.5968,  2.5968,  2.5968,  ...,  2.5968,  2.5968,  2.5968]],

         [[ 2.7537,  2.7537,  2.7537,  ...,  2.7537,  2.7537,  2.7537],
          [ 2.7537,  2.7147,  2.7147,  ...,  2.7147,  2.7342,  2.7537],
          [ 2.7537,  2.7147,  2.7147,  ...,  2.6367,  2.7342,  2.7537],
          ...,
          [ 2.7537,  2.6952,  2.6952,  ...,  2.4416,  2.7342,  2.7537],
          [ 2.7537,  2.7342,  2.7342,  ...,  2.7342,  2.6952,  2.7537],
          [ 2.7537,  2.7537,  2.7537,  ...,  2.7147,  2.7537,  2.7537]]],


        [[[ 0.4011,  0.4205,  0.4593,  ...,  0.5756,  0.6725,  0.5368],
          [ 0.3817,  0.4205,  0.4981,  ...,  0.6531,  0.6531,  0.6144],
          [ 0.3430,  0.3817,  0.4593,  ...,  0.5368,  0.3817,  0.2073],
          ...,
          [-0.1223, -0.2192, -0.2386,  ...,  0.5950,  0.3236, -0.0060],
          [-0.4130, -0.4712, -0.9946,  ..., -0.0253, -0.0447, -0.0447],
          [-0.4906, -0.5681, -0.8589,  ..., -0.0060,  0.0134,  0.0134]],

         [[ 0.0794,  0.1188,  0.1581,  ...,  0.3548,  0.4531,  0.3154],
          [ 0.0598,  0.0991,  0.1778,  ...,  0.4334,  0.4334,  0.3941],
          [ 0.0204,  0.0598,  0.1384,  ...,  0.3154,  0.1581, -0.0189],
          ...,
          [-0.3139, -0.4122, -0.4516,  ...,  0.4728,  0.2368, -0.0189],
          [-0.6089, -0.6876, -1.2382,  ..., -0.1172, -0.1369, -0.0976],
          [-0.7072, -0.8056, -1.1202,  ..., -0.0779, -0.0779, -0.0976]],

         [[-1.7922, -1.7727, -1.7336,  ..., -1.2069, -1.1093, -1.2459],
          [-1.8117, -1.7727, -1.6946,  ..., -1.1288, -1.1288, -1.1678],
          [-1.8117, -1.7922, -1.6946,  ..., -1.2654, -1.4020, -1.5776],
          ...,
          [-1.5971, -1.6361, -1.6361,  ..., -0.7386, -1.0703, -1.4605],
          [-1.8507, -1.8117, -2.1043,  ..., -1.4410, -1.4995, -1.5385],
          [-1.9287, -1.8897, -1.9482,  ..., -1.5971, -1.5580, -1.5776]]],


        ...,


        [[[-1.9638, -1.8475, -1.3241,  ..., -2.3128, -1.9444, -2.3515],
          [-1.2660, -1.4404, -0.4130,  ..., -2.0414, -1.4211, -1.8669],
          [-1.1690, -0.8395, -0.4712,  ..., -2.2546, -1.9057, -1.9251],
          ...,
          [-2.4097, -2.2934, -1.2854,  ..., -0.8201, -1.7312, -1.2466],
          [-2.3709, -2.2546, -2.1383,  ..., -0.4518, -1.3823, -0.9752],
          [-2.4291, -2.2158, -2.4291,  ..., -1.3435, -1.6343, -1.2272]],

         [[-0.9826, -1.1792, -0.8449,  ..., -1.8676, -1.4349, -1.6316],
          [-0.4516, -0.9236, -0.1369,  ..., -1.7889, -1.1792, -1.2579],
          [-0.5302, -0.5106, -0.3926,  ..., -2.2413, -1.8873, -1.4546],
          ...,
          [-1.9463, -1.9463, -0.9432,  ..., -0.4516, -1.3562, -0.5499],
          [-1.9069, -1.8479, -1.8282,  ..., -0.1172, -1.0416, -0.2942],
          [-1.9069, -1.6906, -2.1233,  ..., -0.8056, -1.1006, -0.4122]],

         [[-2.0653, -1.9092, -1.3629,  ..., -2.2214, -2.0458, -2.2214],
          [-1.2459, -1.3239, -0.3094,  ..., -1.9678, -1.4410, -1.9678],
          [-1.0118, -0.6801, -0.4069,  ..., -2.1043, -1.8312, -2.0848],
          ...,
          [-2.2214, -2.0848, -1.0118,  ..., -0.8362, -1.7336, -1.4800],
          [-2.2214, -2.0653, -1.8702,  ..., -0.3679, -1.3044, -1.1873],
          [-2.2214, -2.2019, -2.2214,  ..., -1.4215, -1.7922, -1.6361]]],


        [[[-0.7038, -1.0527, -1.0334,  ..., -1.3241, -1.1690, -1.0334],
          [-0.8589, -1.1690, -1.2272,  ..., -1.0721, -1.1109, -1.1884],
          [-0.9752, -1.2660, -1.1884,  ..., -1.0527, -1.1884, -1.2660],
          ...,
          [-0.7232, -0.8783, -1.1497,  ..., -1.3823, -1.3047, -0.8589],
          [-0.7813, -0.9752, -0.7426,  ..., -1.3435, -1.4017, -1.1497],
          [-1.1497, -0.9946, -0.6457,  ..., -1.5180, -1.4017, -1.4211]],

         [[-0.8842, -1.2382, -1.1989,  ..., -1.4152, -1.2972, -1.2579],
          [-0.9629, -1.2776, -1.3169,  ..., -1.2972, -1.3366, -1.4546],
          [-1.0219, -1.3562, -1.2579,  ..., -1.3562, -1.4742, -1.5332],
          ...,
          [-1.0416, -1.1792, -1.5529,  ..., -1.5136, -1.4152, -0.9629],
          [-1.1596, -1.3169, -1.2579,  ..., -1.4939, -1.5529, -1.2382],
          [-1.5726, -1.4152, -1.1792,  ..., -1.6316, -1.5332, -1.5529]],

         [[-0.7776, -1.0313, -1.0508,  ..., -1.3629, -1.2459, -1.1483],
          [-0.8557, -1.0703, -1.1873,  ..., -1.2849, -1.3239, -1.4215],
          [-1.0118, -1.2264, -1.1873,  ..., -1.3239, -1.4410, -1.5190],
          ...,
          [-1.4995, -1.5385, -1.8117,  ..., -1.3825, -1.3044, -0.8947],
          [-1.6556, -1.7727, -1.7141,  ..., -1.2849, -1.3629, -1.1483],
          [-1.8507, -1.7922, -1.6361,  ..., -1.4410, -1.3434, -1.4410]]],


        [[[ 2.5141,  2.4559,  2.4559,  ...,  2.4559,  2.4559,  2.4559],
          [ 2.5141,  2.5141,  2.5141,  ...,  2.5141,  2.5141,  2.5141],
          [ 2.5141,  2.4947,  2.5141,  ...,  2.5141,  2.5141,  2.5141],
          ...,
          [ 2.5141,  2.4947,  2.5141,  ...,  2.5141,  2.5141,  2.5141],
          [ 2.5141,  2.4947,  2.5141,  ...,  2.5141,  2.5141,  2.5141],
          [ 2.5141,  2.4947,  2.5141,  ...,  2.5141,  2.5141,  2.5141]],

         [[ 2.5968,  2.5378,  2.5378,  ...,  2.5378,  2.5378,  2.5378],
          [ 2.5968,  2.5968,  2.5968,  ...,  2.5968,  2.5968,  2.5968],
          [ 2.5968,  2.5771,  2.5968,  ...,  2.5968,  2.5968,  2.5968],
          ...,
          [ 2.5968,  2.5771,  2.5968,  ...,  2.5968,  2.5968,  2.5968],
          [ 2.5968,  2.5771,  2.5968,  ...,  2.5968,  2.5968,  2.5968],
          [ 2.5968,  2.5771,  2.5968,  ...,  2.5968,  2.5968,  2.5968]],

         [[ 2.7537,  2.6952,  2.6952,  ...,  2.6952,  2.6952,  2.6952],
          [ 2.7537,  2.7537,  2.7537,  ...,  2.7537,  2.7537,  2.7537],
          [ 2.7537,  2.7342,  2.7537,  ...,  2.7537,  2.7537,  2.7537],
          ...,
          [ 2.7537,  2.7342,  2.7537,  ...,  2.7537,  2.7537,  2.7537],
          [ 2.7537,  2.7342,  2.7537,  ...,  2.7537,  2.7537,  2.7537],
          [ 2.7537,  2.7342,  2.7537,  ...,  2.7537,  2.7537,  2.7537]]]]), tensor([1, 0, 1, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1,
        1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 1, 1, 0, 1, 0, 0, 1, 0,
        1, 0, 0, 0, 1, 1, 0, 1, 1, 0, 1, 0, 0, 0, 1, 1])]
x is [tensor([[[[ 0.5950,  0.5368,  0.5562,  ..., -0.1804, -0.6069, -1.5180],
          [-1.2466, -1.4017, -1.3435,  ..., -0.4324, -0.7426, -1.6731],
          [-1.5374, -1.9057, -1.8475,  ...,  1.3316, -0.0060, -1.8281],
          ...,
          [ 0.4399, -1.3241, -1.7700,  ..., -1.0140, -2.0026, -1.8669],
          [ 0.4399, -0.8783, -1.0527,  ..., -1.1497, -2.0414, -1.8669],
          [ 0.5756, -0.4130, -0.5293,  ..., -1.2466, -2.0026, -1.8475]],

         [[ 0.4924,  0.4138,  0.4138,  ..., -0.3532, -0.9826, -1.9463],
          [-1.5726, -1.7496, -1.7102,  ..., -0.4516, -1.0219, -2.0643],
          [-1.6512, -2.0249, -1.9659,  ...,  1.5151, -0.1566, -2.1429],
          ...,
          [ 0.0598, -1.3956, -1.7889,  ..., -1.0022, -2.2019, -2.1823],
          [ 0.1188, -0.8842, -1.0022,  ..., -1.1399, -2.2413, -2.1823],
          [ 0.2564, -0.4712, -0.4909,  ..., -1.2776, -2.2413, -2.2019]],

         [[ 0.9393,  0.9393,  1.0173,  ..., -0.1143, -0.6996, -1.6946],
          [-0.8167, -0.9727, -0.8947,  ..., -0.0948, -0.6411, -1.7727],
          [-0.8362, -1.2264, -1.2069,  ...,  1.9733,  0.3149, -1.7922],
          ...,
          [ 1.0758, -0.6996, -1.3044,  ..., -0.5825, -1.9482, -2.0263],
          [ 1.0563, -0.2704, -0.5630,  ..., -0.6801, -2.0068, -2.0263],
          [ 1.1734,  0.1198, -0.0948,  ..., -0.8557, -2.0263, -2.0263]]],


        [[[-2.2352, -2.2740, -2.2546,  ..., -1.0915, -1.2078, -0.5681],
          [-2.2740, -2.2352, -2.2158,  ..., -1.2078, -1.2272, -0.3936],
          [-2.2352, -2.2158, -2.1577,  ..., -1.3241, -1.2272, -0.2967],
          ...,
          [ 2.1845,  2.4172,  2.5141,  ...,  0.6338,  0.4981,  0.1685],
          [ 1.0796,  1.0796,  1.0602,  ..., -1.9638, -2.0220, -2.0608],
          [-2.1964, -2.2158, -2.2546,  ..., -2.1383, -2.1964, -2.1964]],

         [[-2.2413, -2.3003, -2.2609,  ..., -1.1596, -1.2382, -0.4122],
          [-2.2806, -2.2609, -2.2413,  ..., -1.2579, -1.2579, -0.2746],
          [-2.2609, -2.2413, -2.1823,  ..., -1.2382, -1.1399, -0.1369],
          ...,
          [ 2.5378,  2.5968,  2.5968,  ...,  0.7284,  0.5908,  0.2958],
          [ 1.4758,  1.2988,  1.1808,  ..., -2.0446, -2.1429, -2.1233],
          [-2.1233, -2.1823, -2.2413,  ..., -2.2806, -2.2806, -2.2609]],

         [[-1.9092, -1.8897, -1.7922,  ..., -0.5435, -0.5630, -0.0167],
          [-1.9092, -1.7922, -1.7141,  ..., -0.6215, -0.5630,  0.0613],
          [-1.8117, -1.7141, -1.5776,  ..., -0.6606, -0.5825,  0.2954],
          ...,
          [ 2.7537,  2.7537,  2.7537,  ...,  1.2124,  1.0954,  0.7442],
          [ 1.6221,  1.4661,  1.4270,  ..., -1.5190, -1.5971, -1.6361],
          [-1.9482, -1.9873, -2.0068,  ..., -1.7727, -1.8312, -1.7531]]],


        [[[ 2.5141,  2.5141,  2.5141,  ...,  2.5141,  2.5141,  2.5141],
          [ 2.5141,  2.4947,  2.4947,  ...,  2.4947,  2.4947,  2.4947],
          [ 2.5141,  2.4947,  2.5141,  ...,  2.5141,  2.5141,  2.5141],
          ...,
          [ 2.5141,  2.4947,  2.5141,  ...,  0.9439,  1.0408,  1.1571],
          [ 2.5141,  2.4947,  2.5141,  ...,  1.3510,  1.4673,  1.5836],
          [ 2.5141,  2.4947,  2.5141,  ...,  1.8744,  2.0295,  2.0876]],

         [[ 2.5968,  2.5968,  2.5968,  ...,  2.5968,  2.5968,  2.5968],
          [ 2.5968,  2.5771,  2.5771,  ...,  2.5771,  2.5771,  2.5771],
          [ 2.5968,  2.5771,  2.5968,  ...,  2.5968,  2.5968,  2.5968],
          ...,
          [ 2.5968,  2.5771,  2.5968,  ...,  1.0234,  1.1218,  1.2398],
          [ 2.5968,  2.5771,  2.5968,  ...,  1.4364,  1.5544,  1.6724],
          [ 2.5968,  2.5771,  2.5968,  ...,  1.9478,  2.1051,  2.1641]],

         [[ 2.7537,  2.7537,  2.7537,  ...,  2.7537,  2.7537,  2.7537],
          [ 2.7537,  2.7342,  2.7342,  ...,  2.7342,  2.7342,  2.7342],
          [ 2.7537,  2.7342,  2.7537,  ...,  2.7537,  2.7537,  2.7537],
          ...,
          [ 2.7537,  2.7342,  2.7537,  ...,  1.2905,  1.3880,  1.5051],
          [ 2.7537,  2.7342,  2.7537,  ...,  1.6807,  1.7977,  1.9148],
          [ 2.7537,  2.7342,  2.7537,  ...,  2.1294,  2.2660,  2.3440]]],


        ...,


        [[[ 1.6418,  1.5642,  1.5642,  ...,  1.5642,  1.5642,  1.6418],
          [ 1.5836,  1.5061,  1.5255,  ...,  1.5255,  1.5061,  1.5836],
          [ 1.6030,  1.5255,  1.5255,  ...,  1.5061,  1.5255,  1.6030],
          ...,
          [ 1.6030,  1.5255,  1.5448,  ...,  1.5255,  1.5448,  1.6030],
          [ 1.5836,  1.5061,  1.5061,  ...,  1.5255,  1.5061,  1.5836],
          [ 1.6418,  1.5642,  1.5642,  ...,  1.5642,  1.5642,  1.6418]],

         [[ 2.5968,  2.5968,  2.5968,  ...,  2.5968,  2.5968,  2.5968],
          [ 2.5968,  2.5771,  2.5771,  ...,  2.5771,  2.5771,  2.5968],
          [ 2.5968,  2.5968,  2.5968,  ...,  2.5968,  2.5771,  2.5968],
          ...,
          [ 2.5968,  2.5771,  2.5771,  ...,  2.5968,  2.5771,  2.5968],
          [ 2.5968,  2.5771,  2.5771,  ...,  2.5771,  2.5771,  2.5968],
          [ 2.5968,  2.5968,  2.5968,  ...,  2.5968,  2.5968,  2.5968]],

         [[ 0.8807,  0.8222,  0.8222,  ...,  0.8222,  0.8222,  0.8807],
          [ 0.8222,  0.7637,  0.7832,  ...,  0.7832,  0.7832,  0.8222],
          [ 0.8417,  0.7832,  0.7832,  ...,  0.7637,  0.7832,  0.8417],
          ...,
          [ 0.8417,  0.8027,  0.8027,  ...,  0.7832,  0.7832,  0.8222],
          [ 0.8222,  0.7637,  0.7637,  ...,  0.7832,  0.7637,  0.8417],
          [ 0.8807,  0.8222,  0.8222,  ...,  0.8222,  0.8222,  0.8807]]],


        [[[-1.4211, -0.9946,  0.2073,  ..., -0.0253, -0.1998, -0.7813],
          [-1.4986, -1.1303, -0.1998,  ...,  0.1104, -0.0253, -0.1029],
          [-1.5955, -1.1690, -0.1416,  ..., -0.0060,  0.3042,  0.8470],
          ...,
          [-1.9832, -1.9832, -1.9638,  ...,  1.7775,  1.9325,  2.0295],
          [-1.9832, -1.9638, -1.9444,  ...,  1.9713,  1.9325,  2.0295],
          [-1.9832, -1.9638, -1.9638,  ...,  1.9325,  1.8550,  2.0101]],

         [[-1.3562, -1.0612,  0.0794,  ..., -0.0779, -0.2942, -0.8252],
          [-1.4349, -1.2579, -0.4319,  ..., -0.0189, -0.1959, -0.2156],
          [-1.5136, -1.2776, -0.3729,  ..., -0.1959,  0.0794,  0.7284],
          ...,
          [-1.8676, -1.8676, -1.8479,  ...,  1.9281,  2.0658,  2.1641],
          [-1.8676, -1.8479, -1.8282,  ...,  2.1248,  2.0855,  2.1641],
          [-1.8676, -1.8479, -1.8479,  ...,  2.0855,  1.9675,  2.1445]],

[verbose debug output elided: repeated `x is [tensor(...), tensor(...)]` printouts of normalized image batches (float tensors, shape [N, 3, H, W]) paired with binary 0/1 label tensors]
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 7.5
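The evaluation line above reports three numbers: overall test accuracy, `Racc` (presumably accuracy on retained classes), and `Uacc` (presumably accuracy on the unlearned forget classes). A minimal sketch of how such split accuracies could be computed; the function name, the percentage scaling, and the exact meaning of `Racc`/`Uacc` are assumptions, since the log does not show the evaluation code:

```python
import numpy as np

def split_accuracies(preds, targets, forget_classes=(0, 1)):
    """Overall, retain-class, and forget-class accuracy (as percentages)."""
    preds, targets = np.asarray(preds), np.asarray(targets)
    correct = preds == targets
    # Mask selecting examples whose true label is a forget class
    forget_mask = np.isin(targets, forget_classes)
    acc = 100 * correct.mean()
    racc = 100 * correct[~forget_mask].mean()   # retained classes
    uacc = 100 * correct[forget_mask].mean()    # unlearned classes
    return acc, racc, uacc

preds = np.array([0, 1, 2, 3, 2, 2])
targets = np.array([0, 1, 2, 3, 0, 1])
acc, racc, uacc = split_accuracies(preds, targets)
```

Under this reading, a successful unlearning run drives `Uacc` toward chance while keeping `Racc` high.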
Folders created.
Checkpoint name: cifar100__CNN_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100__CNN_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [15338 29579  4126  8315   903 37224 39736 16028 12397  4436   579 32852
 37528 38428  7530 26956 27023 32189  3241 22377 32457 32450 34417 27974
  2028 13724 13451 30857  7528 18501 38266  5010  2765  4733  8988 35537
 22720 36827 15273 16037 27110 11636 15418 34399  9609 16979 33780  5725
 31276  4066 33456 28677 27027   221 20110 21811  4686 26283 35600 11872
 15123  4765 24989  4979 28846 10733 21020 16489 39138 33602 17168 13944
 17806  8249 18486  4196 22661 10689 27228 31635 28761  9804 27730  9690
  1700 22913 10051 16165  6758 31988  3811  9138 35595  7883 39157  4881
  1595   280 36256 22893 21137 24784 10832  5715 20926 33999 21846 20947
 11596 11053  6350  7108 32428 16422  5213 35542 20818 17981 21791  1813
 34398 39742  8784  5650 19565 29415 36350 23161 10394 18137  6454 29019
 28113  9492 29261 16752 35880 19934 28809  6047 11761 27537  2972 18786
 36805 29986  9779  1462 21054  1585 23654 17526 13134 33315  7279 38803
 21105 13914  2033 15860 13659 22489 14458 11165 25628 24069 12811 24688
 12700   625 35879 38480 34610 28239 24471 13046 13100 29306 30595 11697
 35844  7911 34867 36566 20941 24676 32943  3401 11185 35311 32729 26997
 27298 34216 13323 10370 19145  7412 17223   353   251 19825 30759 29729
 31082 19591 20216 17928 27252 33141 27811  7482 12094 24208 38794 28166
 21188 11946 10287 12588  8013 19241 21333 20387 16137  8855 26872 24630
 35947 39010 34150 26599 25140 17774  1073 18866 30668 21119 15839 13886
  5896 33982 33563   810 15502 17508  9307  2644 30617 13699 39935  3338
 39546  6711 19081 11622 26924 13996  5139 25054  3760 30280 39636  6155
 30403 37127 22528  5444  5987 28612 35535 24245  7002  3820 11696 13097
 27068 12492 28743  9787 36354 37138 19503 26390  7557 26786 17683 20007
  8897  1702 33681 15850 23501  4390 31614  5417 19364 25932 31229 31030
 15037 21767  6472 32967  5066 20610  4655 16161  2538 24956 36133 27596
 30603 15081  7592 17892 23084  4479 24553 16920  3973 35392 29965 10463
 36118 13356 32034  5818 28389 23575  2629 23411  2884 12223 16361 34368
 35896 21114 26212 17385 23008 11582  2853 36094 34174 35283 33580  9752
 33044  8862 10230 36194  3010 27820 33297 29436 29513  2120 22027 12754
  1112 39738  3517 37591 10548 22759 11977 36602  1999 34618 25504 29196
 18571 13224  2782 31575 16108 34337 18030  2985 31530 28037 20599 32061
 32702 15947 31109 39064  7615 28852 33504 13252  1328 33488 25706  8032
  4627 24803  3333 32556]
forget Class: [0, 1]
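The `Replacing indexes [...]` block above lists 400 training-set positions being swapped out, and `forget Class: [0, 1]` names the classes being unlearned. A minimal sketch of how such a forget index set might be drawn, with a fixed seed as in the run name (`seed_10`); the labels here are synthetic and the selection logic is an assumption, not the repository's code:

```python
import numpy as np

rng = np.random.default_rng(10)  # seed_10 in the checkpoint name

# Hypothetical labels for a 40k-example split (the logged indices are all < 40000)
labels = rng.integers(0, 100, size=40000)  # 100 classes, as in CIFAR-100
forget_classes = [0, 1]
num_forget = 400  # matches num_400 in the checkpoint name

# Candidate pool: every index whose label belongs to a forget class
candidates = np.flatnonzero(np.isin(labels, forget_classes))
replace_idx = rng.choice(candidates, size=num_forget, replace=False)
```

Sampling without replacement guarantees 400 distinct indices, all from the forget classes.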
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [same 400 indices as listed above]
Number of Classes: 100
PreActNet(
  (conv1): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
  (layer1): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (layer2): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (shortcut): Sequential(
        (0): Conv2d(64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (layer3): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (shortcut): Sequential(
        (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (layer4): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (shortcut): Sequential(
        (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (linear): Linear(in_features=512, out_features=100, bias=True)
)
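The printed `PreActNet` follows the pre-activation ResNet-18 layout: BatchNorm and ReLU come before each convolution, and a 1x1 strided convolution forms the shortcut whenever the channel count or spatial stride changes. A sketch of one such block in PyTorch, reconstructed from the module print above (a hypothetical reimplementation, not the repository's source):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PreActCNNBlock(nn.Module):
    """Pre-activation residual block: (BN -> ReLU -> conv) twice, plus shortcut."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.bn1 = nn.BatchNorm2d(in_ch)
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, stride=1, padding=1, bias=False)
        self.shortcut = None
        if stride != 1 or in_ch != out_ch:
            # Matches the printed 1x1 strided shortcut (e.g. Conv2d(64, 128, 1, stride=2))
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False))

    def forward(self, x):
        out = F.relu(self.bn1(x))
        # In pre-activation blocks the shortcut branches off the activated input
        sc = self.shortcut(out) if self.shortcut is not None else x
        out = self.conv2(F.relu(self.bn2(self.conv1(out))))
        return out + sc

block = PreActCNNBlock(64, 128, stride=2)
y = block(torch.randn(2, 64, 32, 32))  # downsamples 32x32 -> 16x16
```

With stride 2 the spatial size halves and the channel count doubles, consistent with `layer2` in the print.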
==> unlearning ...
Computing current moments on test set
Computed moments: 10.383838467407227,8.01676732711792,-1.3475479697511006
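The three "moments" logged above plausibly correspond to the mean, spread, and skew of some per-sample statistic on the test set; the log does not show which statistic is summarized. A generic sketch of the first three standardized moments with NumPy (the synthetic data below is only for illustration):

```python
import numpy as np

def first_three_moments(x):
    """Mean, standard deviation, and Fisher skewness of a 1-D sample."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    sigma = x.std()
    # Third standardized central moment (0 for a symmetric distribution)
    skew = np.mean(((x - mu) / sigma) ** 3)
    return mu, sigma, skew

vals = np.random.default_rng(0).normal(loc=10.0, scale=8.0, size=10000)
mu, sigma, skew = first_three_moments(vals)
```

For a Gaussian sample this returns roughly (10, 8, 0); the logged negative third value would indicate a left-skewed distribution under this interpretation.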
NaN in ft_samples_mia: False
[boolean mask printout elided: 800 values, all True]
The MIA_loss has an accuracy of 0.924 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.550 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.0 , Uacc: 7.8
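The `MIA_loss` score above measures how well a membership-inference attack separates forgotten from unseen images using their loss values (members of the training set tend to have lower loss). A minimal sketch of a single-threshold loss-based MIA on hypothetical data; the run's actual attack classifier is not shown in the log:

```python
import numpy as np

def loss_mia_accuracy(forgotten_losses, unseen_losses):
    """Best single-threshold accuracy for separating the two loss populations."""
    losses = np.concatenate([forgotten_losses, unseen_losses])
    # Label 1 = was in the forget set, 0 = never seen by the model
    labels = np.concatenate([np.ones_like(forgotten_losses),
                             np.zeros_like(unseen_losses)])
    best = 0.0
    for t in np.unique(losses):
        pred = (losses <= t).astype(labels.dtype)  # members: lower loss
        # Try both orientations of the threshold
        acc = max(((pred == labels).mean()), (((1 - pred) == labels).mean()))
        best = max(best, acc)
    return best

rng = np.random.default_rng(0)
members = rng.normal(0.5, 0.3, 400)      # lower loss: model still "remembers" them
nonmembers = rng.normal(2.0, 0.6, 400)   # higher loss: genuinely unseen
mia_acc = loss_mia_accuracy(members, nonmembers)
```

An accuracy near 0.5 would indicate the forgotten images are statistically indistinguishable from unseen ones, which is the goal of unlearning; 0.924 says they are still easily identified.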
Forgetting epoch 0
Resetting retain iterator...
NaN in ft_samples_mia: False
[ True  True  True  ...  True]   (800 entries, all True)
The MIA_loss has an accuracy of 0.965 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.590 on forgotten vs unseen images
Accuracy on test set: 1.4, Racc: 0.7, Uacc: 71.0
Forgetting epoch 1
NaN in ft_samples_mia: False
[ True  True  True  ...  True]   (800 entries, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.511 on forgotten vs unseen images
Accuracy on test set: 1.6, Racc: 0.8, Uacc: 81.5
Forgetting epoch 2
NaN in ft_samples_mia: False
[ True  True  True  ...  True]   (800 entries, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.537 on forgotten vs unseen images
Accuracy on test set: 1.2, Racc: 0.6, Uacc: 60.0
Forgetting epoch 3
NaN in ft_samples_mia: False
[ True  True  True  ...  True]   (800 entries, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.565 on forgotten vs unseen images
Accuracy on test set: 1.7, Racc: 0.8, Uacc: 80.2
Forgetting epoch 4
Computing current moments on test set
Computed moments: 22.508987408447265, 94.48721342773437, 2.5592654959641123
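The "Computed moments" line reports three statistics of some per-sample quantity on the test set; the log does not show which quantity or which moment convention the code uses. Purely as an illustration, here is a generic sketch of computing the first three moments (mean, variance, and skewness as the standardized third central moment) of a list of values; the function name and convention are assumptions, not the run's code:

```python
import math

def first_three_moments(values):
    """Mean, population variance, and skewness (standardized third
    central moment) of an iterable of per-sample statistics."""
    values = list(values)
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    std = math.sqrt(var)
    # a symmetric distribution has zero skewness
    skew = sum(((v - mean) / std) ** 3 for v in values) / n if std > 0 else 0.0
    return mean, var, skew

print(first_three_moments([1.0, 2.0, 3.0]))
```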
NaN in ft_samples_mia: False
[ True  True  True  ...  True]   (800 entries, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.530 on forgotten vs unseen images
Accuracy on test set: 1.6, Racc: 0.8, Uacc: 83.0
Forgetting epoch 5
NaN in ft_samples_mia: False
[ True  True  True  ...  True]   (800 entries, all True)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.539 on forgotten vs unseen images
Accuracy on test set: 1.6, Racc: 0.9, Uacc: 85.2
Forgetting epoch 6
NaN in ft_samples_mia: False
[ True  True  True  ...   (all entries shown are True)
  True  True  True  True  True  True  True  True  True  True  True  True
  True  True  True  True  True  True  True  True  True  True  True  True
  True  True  True  True  True  True  True  True  True  True  True  True
  True  True  True  True  True  True  True  True  True  True  True  True
  True  True  True  True  True  True  True  True  True  True  True  True
  True  True  True  True  True  True  True  True  True  True  True  True
  True  True  True  True  True  True  True  True  True  True  True  True
  True  True  True  True  True  True  True  True  True  True  True  True
  True  True  True  True  True  True  True  True  True  True  True  True
  True  True  True  True  True  True  True  True]
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.569 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 0.8 , Uacc: 91.5
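The `MIA_loss` accuracy reported above comes from a loss-based membership inference attack: score each sample by its loss and find the threshold that best separates forgotten (member) from unseen (non-member) images. A minimal sketch of that idea, with synthetic losses and all names hypothetical (this is not the logged experiment's actual code):

```python
import numpy as np

# Synthetic per-sample losses: after unlearning, forgotten samples tend to
# have higher loss than unseen test samples (illustrative distributions).
rng = np.random.default_rng(0)
forgot_losses = rng.normal(4.0, 1.0, 400)  # "member" (forgotten) samples
unseen_losses = rng.normal(2.0, 1.0, 400)  # "non-member" (unseen) samples

losses = np.concatenate([forgot_losses, unseen_losses])
labels = np.concatenate([np.ones(400), np.zeros(400)])  # 1 = forgotten

def mia_accuracy(losses, labels):
    """Best 0/1 accuracy over all loss thresholds, trying both directions."""
    best = 0.0
    for t in np.unique(losses):
        pred = (losses >= t).astype(float)  # predict "forgotten" above threshold
        acc = max(np.mean(pred == labels), np.mean((1 - pred) == labels))
        best = max(best, acc)
    return best

acc = mia_accuracy(losses, labels)
print(f"The MIA_loss has an accuracy of {acc:.3f} on forgotten vs unseen images")
```

An accuracy near 0.5 would mean the attacker cannot distinguish forgotten from unseen samples, which is the usual goal of unlearning; the 0.986 in the log indicates the two loss distributions are still highly separable.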
Forgetting epoch 7
NaN in ft_samples_mia: False
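The `NaN in ft_samples_mia` line is a sanity check that the MIA feature array contains no NaNs before the attack is evaluated. It can be reproduced with a one-line NumPy test (the array name and shape are assumptions taken from the log):

```python
import numpy as np

# Placeholder stand-in for the logged MIA feature array.
ft_samples_mia = np.random.default_rng(2).normal(size=(800, 3))

# any() over an elementwise isnan mask flags a single NaN anywhere.
has_nan = bool(np.isnan(ft_samples_mia).any())
print(f"NaN in ft_samples_mia: {has_nan}")
```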
[ True  True  ...  True  True]  (800-element all-True boolean mask; repeated lines elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.511 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 0.9 , Uacc: 91.0
Forgetting epoch 8
NaN in ft_samples_mia: False
[ True  True  ...  True  True]  (800-element all-True boolean mask; repeated lines elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.574 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 0.9 , Uacc: 94.0
Forgetting epoch 9
Computing current moments on test set
Computed moments: 25.49767059020996,162.37069885253905,3.242034474916404
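The three comma-separated values after "Computed moments" are plausibly the first three moments of some per-sample statistic on the test set; the exact definition is not shown in the log. A hedged sketch computing mean, variance, and skewness (third standardized moment) over placeholder scores:

```python
import numpy as np

# Placeholder score distribution; the real statistic and data are not
# visible in the log, so everything here is illustrative.
rng = np.random.default_rng(1)
scores = rng.gamma(shape=2.0, scale=10.0, size=10_000)

mean = scores.mean()                               # first moment
var = scores.var()                                 # second central moment
skew = np.mean(((scores - mean) / scores.std()) ** 3)  # third standardized moment

print(f"Computed moments: {mean},{var},{skew}")
```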
NaN in ft_samples_mia: False
[ True  True  ...  True  True]  (800-element all-True boolean mask; repeated lines elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.549 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 0.9 , Uacc: 94.0
Forgetting epoch 10
NaN in ft_samples_mia: False
[ True  True  ...  True  True]  (800-element all-True boolean mask; repeated lines elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.591 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 0.9 , Uacc: 95.2
Forgetting epoch 11
NaN in ft_samples_mia: False
[ True  True  ...  True  True]  (800-element all-True boolean mask; repeated lines elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.537 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 0.9 , Uacc: 94.5
Forgetting epoch 12
NaN in ft_samples_mia: False
[ True  True  ...  True  True]  (800-element all-True boolean mask; repeated lines elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.591 on forgotten vs unseen images
Accuracy on test set: 1.7, Racc: 0.9, Uacc: 96.5
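The MIA_loss accuracy reported above is consistent with a threshold attack on per-sample losses: forgotten (member) examples tend to have lower loss than unseen examples, and the attack picks the cutoff that best separates the two sets. The sketch below is a hypothetical reconstruction, not the code that produced this log; the function name and the best-threshold search are assumptions.

```python
import numpy as np

def loss_mia_accuracy(forget_losses, unseen_losses):
    """Best-threshold membership attack: find the loss cutoff that best
    separates forgotten-set losses from unseen-set losses, and report the
    accuracy of that split. (Hypothetical helper; the actual attack used
    in this run may differ, e.g. a learned classifier on loss values.)"""
    losses = np.concatenate([forget_losses, unseen_losses])
    # label 1 = forgotten (member), 0 = unseen (non-member)
    labels = np.concatenate([np.ones(len(forget_losses)),
                             np.zeros(len(unseen_losses))])
    best = 0.0
    for t in np.unique(losses):
        # members typically have lower loss; also try the flipped rule
        preds = (losses <= t).astype(float)
        acc = max((preds == labels).mean(), ((1 - preds) == labels).mean())
        best = max(best, acc)
    return best
```

With well-separated loss distributions this returns an accuracy near 1.0, matching the ~0.986 figures logged here; the MIA_entropy variant would apply the same idea to prediction-entropy values instead of losses.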
Forgetting epoch 13
NaN in ft_samples_mia: False
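The "NaN in ft_samples_mia" line is a numerical sanity check before the mask dump. A minimal sketch of such a check, assuming `ft_samples_mia` is a numeric array of per-sample MIA scores (the name is taken from the log; the helper itself is hypothetical):

```python
import numpy as np

def has_nan(arr):
    """Return True if any entry of a numeric array is NaN.
    Mirrors the 'NaN in ft_samples_mia' sanity check in the log."""
    return bool(np.isnan(np.asarray(arr, dtype=float)).any())
```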
[ True  True ...  True  True]  (800-element boolean mask: all entries True; full dump elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.583 on forgotten vs unseen images
Accuracy on test set: 1.7, Racc: 0.9, Uacc: 98.0
Forgetting epoch 14
Computing current moments on test set
Computed moments: 26.74697229309082, 152.70558900756836, 2.9386725314377093
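The three logged values are consistent with the first three sample moments of some per-example statistic over the test set (for instance, the mean, variance, and standardized third moment of a logit-derived quantity). The sketch below is an assumption about what "Computed moments" logs, not the actual implementation:

```python
import numpy as np

def output_moments(values):
    """First three sample moments of a per-example statistic:
    mean, variance, and skewness (standardized third moment).
    Hypothetical reconstruction of the 'Computed moments' log line."""
    values = np.asarray(values, dtype=float)
    mean = values.mean()
    var = values.var()
    std = np.sqrt(var)
    skew = ((values - mean) ** 3).mean() / std ** 3 if std > 0 else 0.0
    return mean, var, skew
```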
NaN in ft_samples_mia: False
[ True  True ...  True  True]  (800-element boolean mask: all entries True; full dump elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.599 on forgotten vs unseen images
Accuracy on test set: 1.7, Racc: 0.9, Uacc: 96.2
Forgetting epoch 15
NaN in ft_samples_mia: False
[ True  True ...  True  True]  (800-element boolean mask: all entries True; full dump elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.546 on forgotten vs unseen images
Accuracy on test set: 1.7, Racc: 0.9, Uacc: 97.8
Forgetting epoch 16
NaN in ft_samples_mia: False
[ True  True ...  True  True]  (800-element boolean mask: all entries True; full dump elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.570 on forgotten vs unseen images
Accuracy on test set: 1.8, Racc: 0.9, Uacc: 99.2
Forgetting epoch 17
NaN in ft_samples_mia: False
[ True  True ...  True  True]  (800-element boolean mask: all entries True; full dump elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.615 on forgotten vs unseen images
Accuracy on test set: 1.7, Racc: 0.8, Uacc: 96.8
Forgetting epoch 18
NaN in ft_samples_mia: False
[ True  True ...  True  True]  (800-element boolean mask: all entries True; full dump elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.605 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 0.9 , Uacc: 98.5
Forgetting epoch 19
Computing current moments on test set
Computed moments: 25.98800549926758,100.93938464355469,2.4058109597999286
NaN in ft_samples_mia: False
[ True  True ...  True  True] (800 entries, all True; repeats elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.647 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 0.9 , Uacc: 98.8
Forgetting epoch 20
NaN in ft_samples_mia: False
[ True  True ...  True  True] (800 entries, all True; repeats elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.606 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 0.9 , Uacc: 100.0
Forgetting epoch 21
NaN in ft_samples_mia: False
[ True  True ...  True  True] (800 entries, all True; repeats elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.585 on forgotten vs unseen images
Accuracy on test set: 1.8 , Racc: 0.9 , Uacc: 99.8
Forgetting epoch 22
NaN in ft_samples_mia: False
[ True  True ...  True  True] (800 entries, all True; repeats elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.611 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 0.9 , Uacc: 99.5
Forgetting epoch 23
NaN in ft_samples_mia: False
[ True  True ...  True  True] (800 entries, all True; repeats elided)
The MIA_loss has an accuracy of 0.986 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.613 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 0.9 , Uacc: 100.0
Forgetting epoch 24
Accuracy on test set: 1.8 , Racc: 0.9 , Uacc: 100.0
Checkpoint name: cifar100__CNN_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100__CNN_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [15338 29579  4126  8315   903 37224 39736 16028 12397  4436   579 32852
 37528 38428  7530 26956 27023 32189  3241 22377 32457 32450 34417 27974
  2028 13724 13451 30857  7528 18501 38266  5010  2765  4733  8988 35537
 22720 36827 15273 16037 27110 11636 15418 34399  9609 16979 33780  5725
 31276  4066 33456 28677 27027   221 20110 21811  4686 26283 35600 11872
 15123  4765 24989  4979 28846 10733 21020 16489 39138 33602 17168 13944
 17806  8249 18486  4196 22661 10689 27228 31635 28761  9804 27730  9690
  1700 22913 10051 16165  6758 31988  3811  9138 35595  7883 39157  4881
  1595   280 36256 22893 21137 24784 10832  5715 20926 33999 21846 20947
 11596 11053  6350  7108 32428 16422  5213 35542 20818 17981 21791  1813
 34398 39742  8784  5650 19565 29415 36350 23161 10394 18137  6454 29019
 28113  9492 29261 16752 35880 19934 28809  6047 11761 27537  2972 18786
 36805 29986  9779  1462 21054  1585 23654 17526 13134 33315  7279 38803
 21105 13914  2033 15860 13659 22489 14458 11165 25628 24069 12811 24688
 12700   625 35879 38480 34610 28239 24471 13046 13100 29306 30595 11697
 35844  7911 34867 36566 20941 24676 32943  3401 11185 35311 32729 26997
 27298 34216 13323 10370 19145  7412 17223   353   251 19825 30759 29729
 31082 19591 20216 17928 27252 33141 27811  7482 12094 24208 38794 28166
 21188 11946 10287 12588  8013 19241 21333 20387 16137  8855 26872 24630
 35947 39010 34150 26599 25140 17774  1073 18866 30668 21119 15839 13886
  5896 33982 33563   810 15502 17508  9307  2644 30617 13699 39935  3338
 39546  6711 19081 11622 26924 13996  5139 25054  3760 30280 39636  6155
 30403 37127 22528  5444  5987 28612 35535 24245  7002  3820 11696 13097
 27068 12492 28743  9787 36354 37138 19503 26390  7557 26786 17683 20007
  8897  1702 33681 15850 23501  4390 31614  5417 19364 25932 31229 31030
 15037 21767  6472 32967  5066 20610  4655 16161  2538 24956 36133 27596
 30603 15081  7592 17892 23084  4479 24553 16920  3973 35392 29965 10463
 36118 13356 32034  5818 28389 23575  2629 23411  2884 12223 16361 34368
 35896 21114 26212 17385 23008 11582  2853 36094 34174 35283 33580  9752
 33044  8862 10230 36194  3010 27820 33297 29436 29513  2120 22027 12754
  1112 39738  3517 37591 10548 22759 11977 36602  1999 34618 25504 29196
 18571 13224  2782 31575 16108 34337 18030  2985 31530 28037 20599 32061
 32702 15947 31109 39064  7615 28852 33504 13252  1328 33488 25706  8032
  4627 24803  3333 32556]
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [same 400 indexes as listed above; duplicate array elided]
Number of Classes: 100
Traceback (most recent call last):
  File "nabla_unlearning_main.py", line 552, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/CNN-CIFAR100_3407_800000/checkpoint_s_500000.pt'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "nabla_unlearning_main.py", line 555, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/CNN-CIFAR100_3407_800000/checkpoint-s:500000.pt'
Checkpoint name: cifar100__CNN_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100__CNN_1_0_forget_[0, 1]_num_400_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [ ...same 400 indexes as printed above... ]
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [ ...same 400 indexes as printed above... ]
Number of Classes: 100
PreActNet(
  (conv1): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
  (layer1): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (layer2): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (shortcut): Sequential(
        (0): Conv2d(64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (layer3): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (shortcut): Sequential(
        (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (layer4): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (shortcut): Sequential(
        (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (linear): Linear(in_features=512, out_features=100, bias=True)
)
Traceback (most recent call last):
  File "main_forget_sparse.py", line 556, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/CNN-CIFAR100_3407_800000/checkpoint_s_500000.pt'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "main_forget_sparse.py", line 559, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/CNN-CIFAR100_3407_800000/checkpoint-s:500000.pt'
Checkpoint name: cifar100__CNN_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100__CNN_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [ ...first 200 of the indexes printed above... ]
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [ ...first 200 of the indexes printed above... ]
Number of Classes: 100
logs/cifar100__CNN_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_resume_cp1304
==> unlearning ...
Computing current moments on test set
Computed moments: 10.383838467407227,8.01676732711792,-1.3475479697511006
The MIA_loss has an accuracy of 0.942 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.550 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 6.0
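The `MIA_loss` and `MIA_entropy` lines report how well a membership-inference attack separates forgotten from unseen images; 0.5 is chance level (indistinguishable, i.e. good forgetting), while the 0.942 printed above means forgotten examples are still easy to detect before unlearning. As a simplified stand-in for the script's attack (which may train a classifier rather than sweep a threshold), a best-single-threshold accuracy on per-example losses can be sketched as follows; the function name and inputs are illustrative, not from the original code.

```python
def mia_threshold_accuracy(forget_losses, unseen_losses):
    """Best accuracy of a single loss threshold at separating forgotten
    from unseen examples. 0.5 means the two loss distributions are
    indistinguishable by any threshold (chance level)."""
    n_f, n_u = len(forget_losses), len(unseen_losses)
    best = 0.5
    for t in sorted(set(forget_losses) | set(unseen_losses)):
        # Predict "forgotten" when loss >= t; also score the flipped rule,
        # since either class may sit on the high-loss side.
        tp = sum(1 for l in forget_losses if l >= t)
        tn = sum(1 for l in unseen_losses if l < t)
        acc = (tp + tn) / (n_f + n_u)
        best = max(best, acc, 1 - acc)
    return best
```

Perfectly separated losses give 1.0; identical loss lists give 0.5, matching the chance-level readings later in the log.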
Forgetting epoch 0
Resetting retain iterator...
using alpha: 0.1
delta_val_loss: 6.4539690017700195
delta_first_moment: 8.016767501831055
delta_second_moment: nan
The MIA_loss has an accuracy of 0.815 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.580 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.5 , Uacc: 0.0
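Note that `delta_second_moment` prints `nan` on this and every subsequent epoch, which suggests the second-moment statistic itself is non-finite and the NaN is propagating silently through the subtraction. A defensive guard (hypothetical, not in the original script) would make the skipped quantity explicit instead of logging `nan`:

```python
import math

def safe_delta(current, reference):
    """Return current - reference, or None when the result is non-finite.

    Wrapping the moment deltas like this would surface the persistent
    `delta_second_moment: nan` as an explicit missing value rather than
    letting NaN flow into downstream logic.
    """
    delta = current - reference
    return delta if math.isfinite(delta) else None
```

The caller can then log `"n/a"` (and skip any alpha adjustment that depends on the delta) whenever `None` is returned.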
Forgetting epoch 1
using alpha: 0.096
delta_val_loss: 4.448558807373047
delta_first_moment: 8.016767501831055
delta_second_moment: nan
The MIA_loss has an accuracy of 0.560 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.552 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 2
using alpha: 0.092
delta_val_loss: 5.082057952880859
delta_first_moment: 8.016767501831055
delta_second_moment: nan
The MIA_loss has an accuracy of 0.635 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.630 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 3
using alpha: 0.088
delta_val_loss: 3.510681629180908
delta_first_moment: 8.016767501831055
delta_second_moment: nan
The MIA_loss has an accuracy of 0.697 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.655 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 4
using alpha: 0.08399999999999999
delta_val_loss: 2.612901210784912
delta_first_moment: 8.016767501831055
delta_second_moment: nan
Computing current moments on test set
Computed moments: 5.488032284545898,3.7728775260925294,2.4373030098306137
The MIA_loss has an accuracy of 0.718 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.690 on forgotten vs unseen images
Accuracy on test set: 0.8 , Racc: 0.9 , Uacc: 1.5
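The `using alpha` values follow a linear decay inferred from the log: starting at 0.1 and decreasing by 0.004 each epoch. Trailing digits like `0.07999999999999999` are ordinary floating-point drift from repeated in-place subtraction; computing alpha from the epoch index (and rounding) avoids it. This schedule is reconstructed from the printed values, not taken from the script's source.

```python
def alpha_schedule(epoch, alpha0=0.1, step=0.004):
    """Linear alpha decay matching the logged values.

    Deriving alpha from the epoch index, instead of repeatedly subtracting
    `step` from a running value, prevents the accumulated float error seen
    in lines like `using alpha: 0.07999999999999999`.
    """
    return round(alpha0 - step * epoch, 12)
```

For example, epoch 4 yields 0.084 and epoch 20 yields 0.02, with no drift in the trailing digits.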
Forgetting epoch 5
using alpha: 0.07999999999999999
delta_val_loss: -2.9134740829467773
delta_first_moment: 3.7728774547576904
delta_second_moment: nan
The MIA_loss has an accuracy of 0.657 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.635 on forgotten vs unseen images
Accuracy on test set: 0.7 , Racc: 0.8 , Uacc: 4.0
Forgetting epoch 6
using alpha: 0.07599999999999998
delta_val_loss: 0.47130680084228516
delta_first_moment: 3.7728774547576904
delta_second_moment: nan
The MIA_loss has an accuracy of 0.812 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.495 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.5
Forgetting epoch 7
using alpha: 0.07199999999999998
delta_val_loss: 1.065678596496582
delta_first_moment: 3.7728774547576904
delta_second_moment: nan
The MIA_loss has an accuracy of 0.487 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.450 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 8
using alpha: 0.06799999999999998
delta_val_loss: 0.8394031524658203
delta_first_moment: 3.7728774547576904
delta_second_moment: nan
The MIA_loss has an accuracy of 0.770 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.593 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 9
using alpha: 0.06399999999999997
delta_val_loss: 0.6386675834655762
delta_first_moment: 3.7728774547576904
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.644382039642334,0.05639285371564329,-1.924642699255756
The MIA_loss has an accuracy of 0.927 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.452 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 10
using alpha: 0.05999999999999997
delta_val_loss: -0.39765501022338867
delta_first_moment: 0.056392852216959
delta_second_moment: nan
The MIA_loss has an accuracy of 0.965 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.605 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 11
using alpha: 0.055999999999999966
delta_val_loss: -0.40121030807495117
delta_first_moment: 0.056392852216959
delta_second_moment: nan
The MIA_loss has an accuracy of 0.975 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.642 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 12
using alpha: 0.05199999999999996
delta_val_loss: -0.31352853775024414
delta_first_moment: 0.056392852216959
delta_second_moment: nan
The MIA_loss has an accuracy of 0.972 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.628 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 13
using alpha: 0.04799999999999996
delta_val_loss: -0.2090296745300293
delta_first_moment: 0.056392852216959
delta_second_moment: nan
The MIA_loss has an accuracy of 0.885 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.637 on forgotten vs unseen images
Accuracy on test set: 1.4 , Racc: 1.2 , Uacc: 0.0
Forgetting epoch 14
using alpha: 0.043999999999999956
delta_val_loss: -0.10235214233398438
delta_first_moment: 0.056392852216959
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.607868075561523,0.0049677876077592375,-0.6838768123308528
The MIA_loss has an accuracy of 0.827 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.695 on forgotten vs unseen images
Accuracy on test set: 1.4 , Racc: 1.3 , Uacc: 0.0
Forgetting epoch 15
using alpha: 0.03999999999999995
delta_val_loss: -0.09277820587158203
delta_first_moment: 0.004967787768691778
delta_second_moment: nan
The MIA_loss has an accuracy of 0.782 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.500 on forgotten vs unseen images
Accuracy on test set: 1.4 , Racc: 1.3 , Uacc: 0.0
Forgetting epoch 16
using alpha: 0.03599999999999995
delta_val_loss: -0.08380126953125
delta_first_moment: 0.004967787768691778
delta_second_moment: nan
The MIA_loss has an accuracy of 0.782 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.500 on forgotten vs unseen images
Accuracy on test set: 1.3 , Racc: 1.2 , Uacc: 0.0
Forgetting epoch 17
using alpha: 0.031999999999999945
delta_val_loss: -0.07203340530395508
delta_first_moment: 0.004967787768691778
delta_second_moment: nan
The MIA_loss has an accuracy of 0.815 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.500 on forgotten vs unseen images
Accuracy on test set: 1.3 , Racc: 1.2 , Uacc: 0.0
Forgetting epoch 18
using alpha: 0.027999999999999945
delta_val_loss: -0.08674001693725586
delta_first_moment: 0.004967787768691778
delta_second_moment: nan
The MIA_loss has an accuracy of 0.787 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.500 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 1.1 , Uacc: 0.0
Forgetting epoch 19
using alpha: 0.023999999999999945
delta_val_loss: -0.07645368576049805
delta_first_moment: 0.004967787768691778
delta_second_moment: nan
Computing current moments on test set
Computed moments: 4.606234670257568,0.0046550245180726054,-0.7757064234376584
The MIA_loss has an accuracy of 0.790 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.500 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 20
using alpha: 0.019999999999999945
delta_val_loss: -0.08687162399291992
delta_first_moment: 0.004655024502426386
delta_second_moment: nan
The MIA_loss has an accuracy of 0.785 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.500 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 21
using alpha: 0.015999999999999945
delta_val_loss: -0.10161352157592773
delta_first_moment: 0.004655024502426386
delta_second_moment: nan
The MIA_loss has an accuracy of 0.713 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.500 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 22
using alpha: 0.011999999999999945
delta_val_loss: -0.09042549133300781
delta_first_moment: 0.004655024502426386
delta_second_moment: nan
The MIA_loss has an accuracy of 0.745 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.500 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 23
using alpha: 0.007999999999999945
delta_val_loss: -0.10917139053344727
delta_first_moment: 0.004655024502426386
delta_second_moment: nan
The MIA_loss has an accuracy of 0.820 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.500 on forgotten vs unseen images
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
Forgetting epoch 24
using alpha: 0.003999999999999945
delta_val_loss: -0.1660904884338379
delta_first_moment: 0.004655024502426386
delta_second_moment: nan
The MIA_loss has an accuracy of 0.857 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.490 on forgotten vs unseen images
x is [tensor(...)] (debug print of the normalized input image batch; the multi-page, truncated tensor dump is omitted)

         [[ 2.7342e+00,  2.6757e+00,  2.4416e+00,  ...,  2.3050e+00,
            2.3050e+00,  2.3245e+00],
          [ 2.6952e+00,  2.6562e+00,  2.4221e+00,  ...,  2.2855e+00,
            2.2855e+00,  2.2855e+00],
          [ 2.6952e+00,  2.6367e+00,  2.4806e+00,  ...,  2.3050e+00,
            2.3050e+00,  2.2855e+00],
          ...,
          [-1.2069e+00, -1.7531e+00, -1.9092e+00,  ..., -2.1824e+00,
           -2.2019e+00, -2.1629e+00],
          [ 3.9299e-01, -7.9714e-01, -8.5567e-01,  ..., -1.9482e+00,
           -1.5190e+00, -1.4800e+00],
          [ 2.9543e-01,  2.5641e-01,  2.7592e-01,  ..., -9.7273e-01,
           -1.3379e-01,  2.7802e-03]]],


        [[[ 1.8356e+00,  1.8162e+00,  1.8938e+00,  ...,  5.1744e-01,
            3.0421e-01,  3.4298e-01],
          [ 1.4673e+00,  1.2735e+00,  1.6224e+00,  ...,  3.6236e-01,
            3.0421e-01,  2.6544e-01],
          [ 1.5255e+00,  1.2735e+00,  1.5255e+00,  ...,  2.6544e-01,
            3.8175e-01,  2.2667e-01],
          ...,
          [-1.0527e+00, -1.0140e+00, -9.7520e-01,  ..., -1.3435e+00,
           -1.3629e+00, -1.3241e+00],
          [-3.5488e-01, -3.9365e-01, -4.5180e-01,  ..., -1.4211e+00,
           -1.2854e+00, -1.3047e+00],
          [ 7.1129e-01,  4.0113e-01,  4.5929e-01,  ..., -1.9980e-01,
           -1.2226e-01, -8.9766e-01]],

         [[ 1.5810e-01,  3.3510e-01,  7.6703e-04,  ..., -1.2186e+00,
           -1.2579e+00, -1.2579e+00],
          [-6.2857e-01, -7.8591e-01, -6.0891e-01,  ..., -1.2776e+00,
           -1.3169e+00, -1.2776e+00],
          [-9.0391e-01, -1.1399e+00, -8.6457e-01,  ..., -1.2776e+00,
           -1.0809e+00, -1.1989e+00],
          ...,
          [-1.0219e+00, -1.1202e+00, -9.6291e-01,  ..., -1.4742e+00,
           -1.4939e+00, -1.4939e+00],
          [-2.5490e-01, -4.3190e-01, -4.1224e-01,  ..., -1.5332e+00,
           -1.4152e+00, -1.4546e+00],
          [ 8.2677e-01,  4.3344e-01,  5.5144e-01,  ..., -2.3524e-01,
           -1.7623e-01, -1.0416e+00]],

         [[ 3.5397e-01,  4.5152e-01,  1.3935e-01,  ..., -1.2264e+00,
           -1.2264e+00, -1.2069e+00],
          [-5.6302e-01, -6.4106e-01, -4.2645e-01,  ..., -1.3239e+00,
           -1.2654e+00, -1.1483e+00],
          [-9.5322e-01, -9.5322e-01, -6.4106e-01,  ..., -1.3239e+00,
           -9.3371e-01, -1.0313e+00],
          ...,
          [-1.1288e+00, -1.1483e+00, -7.7763e-01,  ..., -1.3434e+00,
           -1.3629e+00, -1.3434e+00],
          [-3.6792e-01, -4.8498e-01, -9.4771e-02,  ..., -1.4215e+00,
           -1.2849e+00, -1.3044e+00],
          [ 7.0515e-01,  3.9299e-01,  7.0515e-01,  ..., -1.3379e-01,
           -7.5261e-02, -9.7273e-01]]]]), tensor([0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 1,
        1, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 1, 1, 1, 1, 0, 0, 1, 1, 0, 1, 1,
        0, 0, 0, 1, 0, 0, 0, 0, 1, 1, 1, 0, 1, 1, 1, 1])]
x is [tensor(<normalized image batch, values elided>), tensor([1, 1, 0, 1, 0, 0, 1, 1, 1, 1, 0, 1, 1, 1, 0, 0, 0, 0, 0, 1, 1, 1, 1, 0,
        0, 1, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 1, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0,
        1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0])]
x is [tensor(<normalized image batch, values elided>), tensor([0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1,
        0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 1, 0, 0, 1, 1, 0, 1, 1,
        0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 1, 1, 1, 1])]
x is [tensor(<normalized image batch, values elided>

         [[ 2.6757,  2.6952,  2.7147,  ...,  2.6757,  2.6757,  2.6757],
          [ 2.6562,  2.6562,  2.6562,  ...,  2.6367,  2.6562,  2.6562],
          [ 2.6172,  2.6172,  2.6367,  ...,  2.6172,  2.6367,  2.6172],
          ...,
          [ 2.5781,  2.6952,  2.7537,  ...,  2.5976,  2.5781,  2.6367],
          [ 2.6562,  2.6367,  2.7147,  ...,  2.5391,  2.6172,  2.6367],
          [ 2.6757,  2.6367,  2.7147,  ...,  2.6172,  2.6172,  2.6367]]],


        [[[-2.3515, -2.3903, -2.3903,  ..., -2.4097, -2.4097, -2.4097],
          [-2.3903, -2.3903, -2.3903,  ..., -2.4097, -2.4097, -2.4097],
          [-1.2272, -2.4097, -2.3709,  ..., -2.4097, -2.4097, -2.4097],
          ...,
          [ 2.4172,  2.3784,  2.3784,  ..., -2.4097, -2.4097, -2.4097],
          [ 2.3590,  2.3009,  2.3009,  ..., -2.4097, -2.4097, -2.4097],
          [ 2.3009,  2.2233,  2.2233,  ..., -2.4097, -2.4097, -2.4097]],

         [[-2.3199, -2.3396, -2.3396,  ..., -2.3986, -2.3986, -2.3986],
          [-2.3396, -2.3396, -2.3396,  ..., -2.3986, -2.3986, -2.3986],
          [-1.1596, -2.3593, -2.3199,  ..., -2.3986, -2.3986, -2.3986],
          ...,
          [ 2.4985,  2.4591,  2.4591,  ..., -2.3986, -2.3986, -2.3986],
          [ 2.4395,  2.3805,  2.3805,  ..., -2.3986, -2.3986, -2.3986],
          [ 2.3805,  2.3018,  2.3018,  ..., -2.3986, -2.3986, -2.3986]],

         [[-2.1629, -2.2019, -2.2019,  ..., -2.2019, -2.2019, -2.2019],
          [-2.2019, -2.2019, -2.2019,  ..., -2.2019, -2.2019, -2.2019],
          [-1.0118, -2.2214, -2.1824,  ..., -2.2019, -2.2019, -2.2019],
          ...,
          [ 2.6952,  2.6562,  2.6562,  ..., -2.2214, -2.2019, -2.2019],
          [ 2.6367,  2.5781,  2.5781,  ..., -2.2214, -2.2019, -2.2019],
          [ 2.5781,  2.5001,  2.5001,  ..., -2.2214, -2.2019, -2.2019]]],


        [[[-1.3629, -1.1884, -1.1884,  ..., -1.9638, -1.9832, -2.0026],
          [-1.3823, -1.2078, -1.1497,  ..., -1.5180, -1.4017, -1.5567],
          [-1.2466, -1.1497, -0.9752,  ..., -0.0641, -0.7232, -1.5567],
          ...,
          [-0.1223, -0.6069, -1.0140,  ..., -0.9364, -1.0140, -1.0721],
          [-0.3161, -0.8201, -0.9558,  ..., -0.9752, -1.0334, -1.0721],
          [-0.4518, -0.6650, -0.7038,  ..., -1.0334, -1.0721, -1.1303]],

         [[-1.2972, -1.0612, -1.0022,  ..., -1.9659, -2.0053, -2.0053],
          [-1.2972, -1.0612, -0.9629,  ..., -1.6709, -1.5136, -1.5922],
          [-1.1202, -0.9826, -0.7859,  ..., -0.1762, -0.7269, -1.5529],
          ...,
          [ 0.1384, -0.3139, -0.6679,  ..., -0.9432, -1.0416, -1.1399],
          [-0.0189, -0.5499, -0.5892,  ..., -0.9826, -1.0612, -1.1989],
          [-0.1959, -0.3926, -0.4516,  ..., -1.0416, -1.1006, -1.2186]],

         [[-0.1533,  0.0418,  0.0808,  ..., -1.4020, -1.4410, -1.2654],
          [-0.2509, -0.0167,  0.1198,  ..., -0.6996, -0.5435, -0.6215],
          [-0.3094, -0.0948,  0.1979,  ...,  1.0954,  0.3930, -0.6996],
          ...,
          [ 0.7247,  0.0223,  0.2174,  ..., -0.3289, -0.4069, -0.4264],
          [ 0.3930,  0.1394,  0.6661,  ..., -0.3679, -0.4264, -0.4460],
          [ 0.4515,  0.5100,  0.7442,  ..., -0.4264, -0.4655, -0.5435]]]]), tensor([0, 0, 1, 0, 0, 0, 0, 1])]
Accuracy on test set: 1.0 , Racc: 1.0 , Uacc: 0.0
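The three metrics reported in the line above (overall test accuracy, retain-class accuracy Racc, and forgotten-class accuracy Uacc) can be reproduced from per-sample predictions along these lines; the function and variable names here are illustrative sketches, not the training script's actual implementation.

```python
# Sketch: split test accuracy into retain vs. forgotten ("unlearned") classes.
# All names are illustrative; the logging script's own code may differ.

def split_accuracy(preds, labels, forget_classes):
    """Return (overall, retain, unlearn) accuracy percentages."""
    forget = set(forget_classes)
    total = correct = 0
    r_total = r_correct = 0
    u_total = u_correct = 0
    for p, y in zip(preds, labels):
        total += 1
        hit = int(p == y)
        correct += hit
        if y in forget:          # sample belongs to a forgotten class
            u_total += 1
            u_correct += hit
        else:                    # sample belongs to a retained class
            r_total += 1
            r_correct += hit
    pct = lambda c, t: 100.0 * c / t if t else 0.0
    return pct(correct, total), pct(r_correct, r_total), pct(u_correct, u_total)
```

With `forget_classes=[0, 1]` as in this run, Uacc covers only the two forgotten classes and Racc the remaining 98.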
Folders created.
Checkpoint name: cifar100__CNN_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100__CNN_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [15338 29579  4126  8315   903 37224 39736 16028 12397  4436   579 32852
 37528 38428  7530 26956 27023 32189  3241 22377 32457 32450 34417 27974
  2028 13724 13451 30857  7528 18501 38266  5010  2765  4733  8988 35537
 22720 36827 15273 16037 27110 11636 15418 34399  9609 16979 33780  5725
 31276  4066 33456 28677 27027   221 20110 21811  4686 26283 35600 11872
 15123  4765 24989  4979 28846 10733 21020 16489 39138 33602 17168 13944
 17806  8249 18486  4196 22661 10689 27228 31635 28761  9804 27730  9690
  1700 22913 10051 16165  6758 31988  3811  9138 35595  7883 39157  4881
  1595   280 36256 22893 21137 24784 10832  5715 20926 33999 21846 20947
 11596 11053  6350  7108 32428 16422  5213 35542 20818 17981 21791  1813
 34398 39742  8784  5650 19565 29415 36350 23161 10394 18137  6454 29019
 28113  9492 29261 16752 35880 19934 28809  6047 11761 27537  2972 18786
 36805 29986  9779  1462 21054  1585 23654 17526 13134 33315  7279 38803
 21105 13914  2033 15860 13659 22489 14458 11165 25628 24069 12811 24688
 12700   625 35879 38480 34610 28239 24471 13046 13100 29306 30595 11697
 35844  7911 34867 36566 20941 24676 32943  3401 11185 35311 32729 26997
 27298 34216 13323 10370 19145  7412 17223   353]
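The index list above is presumably a fixed, seed-dependent random subset of training indexes selected for the forget set (`num_200` in the checkpoint name, drawn from a 40000-sample pool). A minimal sketch of such reproducible sampling; the sampling scheme and names here are assumptions, not the script's API:

```python
import numpy as np

# Sketch: draw a reproducible set of training indexes to "forget".
# num_forget=200 and the ~40000-sample pool match the log above;
# the seeded-generator sampling scheme itself is an assumption.
rng = np.random.default_rng(seed=10)
forget_idx = rng.choice(40000, size=200, replace=False)
```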
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [15338 29579  4126 ... 19145  7412 17223   353] (identical to the list above)
Number of Classes: 100
PreActNet(
  (conv1): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
  (layer1): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (layer2): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (shortcut): Sequential(
        (0): Conv2d(64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (layer3): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (shortcut): Sequential(
        (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (layer4): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (shortcut): Sequential(
        (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (linear): Linear(in_features=512, out_features=100, bias=True)
)
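The `PreActCNNBlock` printed above follows the standard pre-activation ResNet block layout (BN → ReLU → conv, twice, with a strided 1×1 projection shortcut when the shape changes, as in layers 2-4). A PyTorch sketch consistent with the module dump; the forward logic is the usual pre-activation pattern, assumed rather than taken from the script:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PreActCNNBlock(nn.Module):
    """Pre-activation residual block: (BN -> ReLU -> Conv) x2 plus shortcut."""

    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.bn1 = nn.BatchNorm2d(in_ch)
        self.conv1 = nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.conv2 = nn.Conv2d(out_ch, out_ch, 3, stride=1, padding=1, bias=False)
        # 1x1 strided projection only when the shape changes,
        # matching the (shortcut) submodules printed above.
        if stride != 1 or in_ch != out_ch:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False))
        else:
            self.shortcut = nn.Identity()

    def forward(self, x):
        out = self.conv1(F.relu(self.bn1(x)))
        out = self.conv2(F.relu(self.bn2(out)))
        return out + self.shortcut(x)
```

For example, the first block of `layer2` corresponds to `PreActCNNBlock(64, 128, stride=2)`, halving spatial resolution while doubling channels.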
==> unlearning ...
Computing current moments on test set
Computed moments: 10.383838467407227,8.01676732711792,-1.3475479697511006
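The "Computed moments" line above reports three distributional statistics over some per-sample quantity on the test set. A generic numpy sketch of the first three moments (mean, population variance, skewness); exactly which quantity the script summarizes is not shown in this log, so this is an illustrative interpretation:

```python
import numpy as np

def moments(x):
    """Mean, population variance, and skewness of a 1-D array."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    var = ((x - mu) ** 2).mean()                      # second central moment
    skew = ((x - mu) ** 3).mean() / var ** 1.5 if var > 0 else 0.0
    return mu, var, skew
```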
NaN in ft_samples_mia: False
[ True  True  True ...  True  True  True]
The MIA_loss has an accuracy of 0.942 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.550 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 1.1 , Uacc: 6.0
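The MIA_loss / MIA_entropy figures above measure how well a simple membership-inference attacker separates forgotten from unseen samples using a per-sample statistic (loss or prediction entropy). A minimal single-threshold version of such an attack; the script's actual attack may instead train a learned classifier, so this is only a sketch:

```python
import numpy as np

def threshold_mia_accuracy(forgotten, unseen):
    """Best 0/1 accuracy of a single threshold separating two 1-D score
    populations (e.g. per-sample losses), trying both directions."""
    scores = np.concatenate([forgotten, unseen])
    labels = np.concatenate([np.ones(len(forgotten)), np.zeros(len(unseen))])
    best = 0.0
    for t in scores:                       # candidate thresholds
        for sign in (1, -1):               # "above" or "below" predicts member
            pred = (sign * scores >= sign * t).astype(float)
            best = max(best, float((pred == labels).mean()))
    return best
```

An accuracy near 0.5 means the attacker cannot tell forgotten samples from never-seen ones, which is the desired outcome of unlearning; the 0.99 MIA_loss values in this log indicate the loss distributions remain clearly distinguishable.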
Forgetting epoch 0
Resetting retain iterator...
NaN in ft_samples_mia: False
[ True  True  True ...  True  True  True]
The MIA_loss has an accuracy of 0.988 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.507 on forgotten vs unseen images
Accuracy on test set: 1.1 , Racc: 0.9 , Uacc: 57.5
Forgetting epoch 1
NaN in ft_samples_mia: False
[ True  True  True ...  True  True  True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.588 on forgotten vs unseen images
Accuracy on test set: 1.2 , Racc: 0.9 , Uacc: 72.5
Forgetting epoch 2
NaN in ft_samples_mia: False
[ True  True  True ...  True  True  True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.620 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.0 , Uacc: 77.5
Forgetting epoch 3
NaN in ft_samples_mia: False
[ True  True  True ...  True  True  True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.463 on forgotten vs unseen images
Accuracy on test set: 1.4 , Racc: 1.1 , Uacc: 78.5
Forgetting epoch 4
Computing current moments on test set
Computed moments: 22.763965158081053,78.18318979797364,0.8998613946167487
NaN in ft_samples_mia: False
[ True  True  True ...  True  True  True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.475 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.2 , Uacc: 82.0
Forgetting epoch 5
NaN in ft_samples_mia: False
[ True  True  True ...  True  True  True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.630 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.2 , Uacc: 87.0
Forgetting epoch 6
NaN in ft_samples_mia: False
[ True  True  True ...  True  True  True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.677 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.2 , Uacc: 88.0
Forgetting epoch 7
NaN in ft_samples_mia: False
[all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.560 on forgotten vs unseen images
Accuracy on test set: 1.5 , Racc: 1.1 , Uacc: 81.0
Forgetting epoch 8
NaN in ft_samples_mia: False
[all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.637 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.3 , Uacc: 91.0
Forgetting epoch 9
Computing current moments on test set
Computed moments: 24.500759112548828,180.42727314453126,2.8422848136504686
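The "Computed moments" line prints three statistics per periodic evaluation. A plausible reading, assumed here, is the mean, variance, and skewness of some per-sample statistic on the test set (the log does not say which quantity; names below are illustrative):

```python
import numpy as np

def first_three_moments(values):
    """Mean, variance, and skewness of a 1-D array of per-sample
    statistics.

    Minimal sketch under the assumption that the log's three
    'Computed moments' numbers are mean, variance, and skewness;
    the actual quantity being summarized is not stated in the log.
    """
    values = np.asarray(values, dtype=np.float64)
    mean = values.mean()
    var = values.var()
    # skewness: third central moment normalized by sigma^3
    skew = ((values - mean) ** 3).mean() / var ** 1.5
    return mean, var, skew
```

For symmetric data the third value is 0; the positive skew logged here (≈2.84) would indicate a right-heavy tail in whatever statistic is being tracked.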
NaN in ft_samples_mia: False
[all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.603 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.2 , Uacc: 90.5
Forgetting epoch 10
NaN in ft_samples_mia: False
[all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.523 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.2 , Uacc: 91.0
Forgetting epoch 11
NaN in ft_samples_mia: False
[all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.528 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 92.5
Forgetting epoch 12
NaN in ft_samples_mia: False
[all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.458 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.3 , Uacc: 94.5
Forgetting epoch 13
NaN in ft_samples_mia: False
[all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.465 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.3 , Uacc: 97.5
Forgetting epoch 14
Computing current moments on test set
Computed moments: 24.65703311767578,160.48135764160156,3.1416791437045517
NaN in ft_samples_mia: False
[all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.585 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 96.5
Forgetting epoch 15
NaN in ft_samples_mia: False
[all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.510 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.3 , Uacc: 94.5
Forgetting epoch 16
NaN in ft_samples_mia: False
[all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.585 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 98.5
Forgetting epoch 17
NaN in ft_samples_mia: False
[all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.568 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 97.5
Forgetting epoch 18
NaN in ft_samples_mia: False
[all 400 entries True]
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.552 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.3 , Uacc: 97.5
Forgetting epoch 19
Computing current moments on test set
Computed moments: 28.52691082763672,298.9161729370117,3.35379894337407
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True] (all 400 entries True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.548 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.3 , Uacc: 98.0
Forgetting epoch 20
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True] (all 400 entries True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.590 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.2 , Uacc: 97.5
Forgetting epoch 21
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True] (all 400 entries True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.562 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 100.0
Forgetting epoch 22
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True] (all 400 entries True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.505 on forgotten vs unseen images
Accuracy on test set: 1.6 , Racc: 1.2 , Uacc: 95.5
Forgetting epoch 23
NaN in ft_samples_mia: False
[ True  True  True  ...  True  True  True] (all 400 entries True)
The MIA_loss has an accuracy of 0.990 on forgotten vs unseen images
The MIA_entropy has an accuracy of 0.583 on forgotten vs unseen images
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 99.5
Forgetting epoch 24
Accuracy on test set: 1.7 , Racc: 1.3 , Uacc: 98.0
Checkpoint name: cifar100__CNN_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100__CNN_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [15338 29579  4126  8315   903 37224 39736 16028 12397  4436   579 32852
 37528 38428  7530 26956 27023 32189  3241 22377 32457 32450 34417 27974
  2028 13724 13451 30857  7528 18501 38266  5010  2765  4733  8988 35537
 22720 36827 15273 16037 27110 11636 15418 34399  9609 16979 33780  5725
 31276  4066 33456 28677 27027   221 20110 21811  4686 26283 35600 11872
 15123  4765 24989  4979 28846 10733 21020 16489 39138 33602 17168 13944
 17806  8249 18486  4196 22661 10689 27228 31635 28761  9804 27730  9690
  1700 22913 10051 16165  6758 31988  3811  9138 35595  7883 39157  4881
  1595   280 36256 22893 21137 24784 10832  5715 20926 33999 21846 20947
 11596 11053  6350  7108 32428 16422  5213 35542 20818 17981 21791  1813
 34398 39742  8784  5650 19565 29415 36350 23161 10394 18137  6454 29019
 28113  9492 29261 16752 35880 19934 28809  6047 11761 27537  2972 18786
 36805 29986  9779  1462 21054  1585 23654 17526 13134 33315  7279 38803
 21105 13914  2033 15860 13659 22489 14458 11165 25628 24069 12811 24688
 12700   625 35879 38480 34610 28239 24471 13046 13100 29306 30595 11697
 35844  7911 34867 36566 20941 24676 32943  3401 11185 35311 32729 26997
 27298 34216 13323 10370 19145  7412 17223   353]
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [same 200 indexes as above]
Number of Classes: 100
Traceback (most recent call last):
  File "nabla_unlearning_main.py", line 552, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/CNN-CIFAR100_3407_800000/checkpoint_s_500000.pt'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "nabla_unlearning_main.py", line 555, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/CNN-CIFAR100_3407_800000/checkpoint-s:500000.pt'
Checkpoint name: cifar100__CNN_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10
[Logging in cifar100__CNN_1_0_forget_[0, 1]_num_200_lr_0_0001_bs_256_ls_ce_wd_0_1_seed_10_training]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: train
Replacing indexes [same 200 indexes as above]
forget Class: [0, 1]
Files already downloaded and verified
Files already downloaded and verified
confuse mode: False
split mode: forget
Replacing indexes [same 200 indexes as above]
Number of Classes: 100
PreActNet(
  (conv1): Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
  (layer1): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (layer2): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (shortcut): Sequential(
        (0): Conv2d(64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (layer3): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(128, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (shortcut): Sequential(
        (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (layer4): Sequential(
    (0): PreActCNNBlock(
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (shortcut): Sequential(
        (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
      )
    )
    (1): PreActCNNBlock(
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv1): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
    )
  )
  (linear): Linear(in_features=512, out_features=100, bias=True)
)
Traceback (most recent call last):
  File "main_forget_sparse.py", line 556, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/CNN-CIFAR100_3407_800000/checkpoint_s_500000.pt'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "main_forget_sparse.py", line 559, in <module>
    state_chkpt = torch.load(os.path.join(args.resume,
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/home/yuanbangliang/anaconda3/envs/grokkingUnlearning/lib/python3.8/site-packages/torch/serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/storage/4T_add/grok-adversarial/models/CNN-CIFAR100_3407_800000/checkpoint-s:500000.pt'
