Running 1 script...
 
------------------------------------------------------------------
 
Running 'trainModels' ...
GPU available: 1
Running evaluation for "cnn3-bn (relu)" on "mnist" [2025-Mar-31-11-59-33]
Saving results in ./results/mnist-short/evaluation
Loading training configuration "mnist"... done
Loading dataset "mnist"... done
Started training run (1/1)...
--- Started training config "point" (1/5)...
____________________________________
 Neural Network Training Parameters 
------------------------------------
 - Training method: point
 - Optimizer: AdamOptimizer, Learning Rate: 5.00e-04, Beta1: 9.00e-01, Beta2: 9.99e-01, Epsilon: 1.00e-08, Lambda: 1.00e-06, L2-Grad norm threshold: 1.00e+01
 - Training epochs: 10
 - Mini-batch size: 256
 - Shuffle data: every_epoch
 - Early stopping after non-decr. validation steps: Inf
____________________________________
 
‖===============================================================‖
‖ Epoch | Iteration | Training Time | Loss (train) | Loss (val) ‖
‖===============================================================‖
‖     1 |         1 | 00:00:00      |   2.3511e+00 | 1.1789e+00 ‖
‖     1 |        50 | 00:00:01      |   1.1360e+00 | 1.1192e+00 ‖
‖     1 |       100 | 00:00:01      |   8.5068e-01 | 8.3064e-01 ‖
‖     1 |       150 | 00:00:02      |   5.7631e-01 | 5.3122e-01 ‖
‖     1 |       200 | 00:00:02      |   5.6068e-01 | 3.7504e-01 ‖
‖     2 |       250 | 00:00:03      |   4.5977e-01 | 2.9530e-01 ‖
‖     2 |       300 | 00:00:03      |   3.4860e-01 | 2.5401e-01 ‖
‖     2 |       350 | 00:00:04      |   3.2225e-01 | 2.2144e-01 ‖
‖     2 |       400 | 00:00:04      |   2.9801e-01 | 2.0487e-01 ‖
‖     2 |       450 | 00:00:05      |   3.2564e-01 | 1.9293e-01 ‖
‖     3 |       500 | 00:00:05      |   2.3463e-01 | 1.8077e-01 ‖
‖     3 |       550 | 00:00:05      |   2.2850e-01 | 1.6863e-01 ‖
‖     3 |       600 | 00:00:06      |   2.9812e-01 | 1.6257e-01 ‖
‖     3 |       650 | 00:00:06      |   2.3739e-01 | 1.5506e-01 ‖
‖     3 |       700 | 00:00:07      |   2.3598e-01 | 1.4843e-01 ‖
‖     4 |       750 | 00:00:07      |   2.3589e-01 | 1.3960e-01 ‖
‖     4 |       800 | 00:00:07      |   1.4426e-01 | 1.3591e-01 ‖
‖     4 |       850 | 00:00:08      |   2.0146e-01 | 1.2951e-01 ‖
‖     4 |       900 | 00:00:08      |   1.5546e-01 | 1.2303e-01 ‖
‖     5 |       950 | 00:00:09      |   1.7922e-01 | 1.2183e-01 ‖
‖     5 |      1000 | 00:00:09      |   1.6300e-01 | 1.1461e-01 ‖
‖     5 |      1050 | 00:00:10      |   1.5543e-01 | 1.1226e-01 ‖
‖     5 |      1100 | 00:00:10      |   1.3104e-01 | 1.0837e-01 ‖
‖     5 |      1150 | 00:00:10      |   1.7724e-01 | 1.0571e-01 ‖
‖     6 |      1200 | 00:00:11      |   2.0197e-01 | 1.0267e-01 ‖
‖     6 |      1250 | 00:00:11      |   2.3567e-01 | 9.6416e-02 ‖
‖     6 |      1300 | 00:00:12      |   1.2028e-01 | 9.5445e-02 ‖
‖     6 |      1350 | 00:00:12      |   1.3425e-01 | 9.3656e-02 ‖
‖     6 |      1400 | 00:00:12      |   1.4051e-01 | 8.7897e-02 ‖
‖     7 |      1450 | 00:00:13      |   1.5701e-01 | 8.8758e-02 ‖
‖     7 |      1500 | 00:00:13      |   1.2562e-01 | 8.8412e-02 ‖
‖     7 |      1550 | 00:00:14      |   1.2216e-01 | 8.8052e-02 ‖
‖     7 |      1600 | 00:00:14      |   1.6141e-01 | 8.6893e-02 ‖
‖     8 |      1650 | 00:00:15      |   1.2203e-01 | 8.6812e-02 ‖
‖     8 |      1700 | 00:00:15      |   1.0758e-01 | 8.6159e-02 ‖
‖     8 |      1750 | 00:00:15      |   1.0514e-01 | 8.5761e-02 ‖
‖     8 |      1800 | 00:00:16      |   1.0125e-01 | 8.6281e-02 ‖
‖     8 |      1850 | 00:00:16      |   1.0374e-01 | 8.4998e-02 ‖
‖     9 |      1900 | 00:00:17      |   1.4733e-01 | 8.5100e-02 ‖
‖     9 |      1950 | 00:00:17      |   1.1169e-01 | 8.5330e-02 ‖
‖     9 |      2000 | 00:00:17      |   9.8524e-02 | 8.5195e-02 ‖
‖     9 |      2050 | 00:00:18      |   1.3446e-01 | 8.5550e-02 ‖
‖     9 |      2100 | 00:00:18      |   1.1864e-01 | 8.5320e-02 ‖
‖    10 |      2150 | 00:00:19      |   1.2525e-01 | 8.5732e-02 ‖
‖    10 |      2200 | 00:00:19      |   1.0317e-01 | 8.5350e-02 ‖
‖    10 |      2250 | 00:00:19      |   1.1641e-01 | 8.5164e-02 ‖
‖    10 |      2300 | 00:00:20      |   1.2513e-01 | 8.4948e-02 ‖
‖    10 |      2340 | 00:00:20      |   2.0047e-01 | 8.4729e-02 ‖
‖===============================================================‖
 
 done (training time: 20.51 [s])
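For reference, the optimizer line above lists standard Adam hyperparameters. A minimal numpy sketch of one update step, assuming "Lambda" denotes an L2 weight-decay coefficient and "L2-Grad norm threshold" a global gradient-clipping bound (both readings are assumptions, not confirmed by the log):

```python
import numpy as np

def adam_step(w, g, m, v, t, lr=5e-4, beta1=0.9, beta2=0.999,
              eps=1e-8, lam=1e-6, clip=10.0):
    """One Adam update with the logged hyperparameters. `lam` and
    `clip` are hypothetical readings of 'Lambda' and the
    'L2-Grad norm threshold' from the parameter block."""
    g = g + lam * w                        # assumed L2 regularisation
    norm = np.linalg.norm(g)
    if norm > clip:                        # assumed gradient clipping
        g = g * (clip / norm)
    m = beta1 * m + (1 - beta1) * g        # 1st-moment estimate
    v = beta2 * v + (1 - beta2) * g**2     # 2nd-moment estimate
    m_hat = m / (1 - beta1**t)             # bias correction
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```

On the first step the bias-corrected update has magnitude close to the learning rate, which is the usual Adam behaviour.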
--- Started training config "Gowal (IBP; eps=0.1, kappa=0.5)" (2/5)...
____________________________________
 Neural Network Training Parameters 
------------------------------------
 - Training method: gowal
 - Optimizer: AdamOptimizer, Learning Rate: 5.00e-04, Beta1: 9.00e-01, Beta2: 9.99e-01, Epsilon: 1.00e-08, Lambda: 1.00e-06, L2-Grad norm threshold: 1.00e+01
 - Training epochs: 10
 - Mini-batch size: 256
 - Shuffle data: every_epoch
 - (max) Training noise: 1.00e-01
 - kappa (gowal): 5.00e-01
 - Early stopping after non-decr. validation steps: Inf
____________________________________
 
‖===============================================================‖
‖ Epoch | Iteration | Training Time | Loss (train) | Loss (val) ‖
‖===============================================================‖
‖     1 |         1 | 00:00:00      |   2.3511e+00 | 1.1789e+00 ‖
‖     1 |        50 | 00:00:00      |   1.1360e+00 | 1.1192e+00 ‖
‖     1 |       100 | 00:00:00      |   8.5068e-01 | 8.3064e-01 ‖
‖     1 |       150 | 00:00:01      |   5.7631e-01 | 5.3122e-01 ‖
‖     1 |       200 | 00:00:01      |   5.6068e-01 | 3.7504e-01 ‖
‖     2 |       250 | 00:00:02      |   4.8088e-01 | 2.8821e-01 ‖
‖     2 |       300 | 00:00:03      |   3.1769e-01 | 2.4095e-01 ‖
‖     2 |       350 | 00:00:04      |   3.1676e-01 | 2.1215e-01 ‖
‖     2 |       400 | 00:00:05      |   3.5355e-01 | 1.9222e-01 ‖
‖     2 |       450 | 00:00:06      |   4.5073e-01 | 1.8288e-01 ‖
‖     3 |       500 | 00:00:07      |   4.5387e-01 | 1.7664e-01 ‖
‖     3 |       550 | 00:00:08      |   5.4186e-01 | 1.8318e-01 ‖
‖     3 |       600 | 00:00:09      |   6.4630e-01 | 1.9182e-01 ‖
‖     3 |       650 | 00:00:10      |   6.8743e-01 | 2.0534e-01 ‖
‖     3 |       700 | 00:00:11      |   6.5894e-01 | 2.1491e-01 ‖
‖     4 |       750 | 00:00:12      |   7.4458e-01 | 2.1772e-01 ‖
‖     4 |       800 | 00:00:13      |   6.8512e-01 | 2.1676e-01 ‖
‖     4 |       850 | 00:00:15      |   7.6730e-01 | 2.2202e-01 ‖
‖     4 |       900 | 00:00:16      |   6.9254e-01 | 2.2335e-01 ‖
‖     5 |       950 | 00:00:17      |   7.4638e-01 | 2.2014e-01 ‖
‖     5 |      1000 | 00:00:18      |   7.0976e-01 | 2.1870e-01 ‖
‖     5 |      1050 | 00:00:19      |   7.1883e-01 | 2.1274e-01 ‖
‖     5 |      1100 | 00:00:20      |   6.5940e-01 | 2.1139e-01 ‖
‖     5 |      1150 | 00:00:21      |   7.9230e-01 | 2.1006e-01 ‖
‖     6 |      1200 | 00:00:22      |   7.4330e-01 | 2.1043e-01 ‖
‖     6 |      1250 | 00:00:23      |   8.0162e-01 | 2.0255e-01 ‖
‖     6 |      1300 | 00:00:24      |   5.6981e-01 | 1.9869e-01 ‖
‖     6 |      1350 | 00:00:25      |   6.6129e-01 | 1.9304e-01 ‖
‖     6 |      1400 | 00:00:26      |   4.7597e-01 | 1.8913e-01 ‖
‖     7 |      1450 | 00:00:27      |   6.8587e-01 | 1.9120e-01 ‖
‖     7 |      1500 | 00:00:28      |   5.7421e-01 | 1.9117e-01 ‖
‖     7 |      1550 | 00:00:29      |   6.6687e-01 | 1.9036e-01 ‖
‖     7 |      1600 | 00:00:30      |   5.8794e-01 | 1.8941e-01 ‖
‖     8 |      1650 | 00:00:31      |   5.9153e-01 | 1.8842e-01 ‖
‖     8 |      1700 | 00:00:33      |   5.5839e-01 | 1.8785e-01 ‖
‖     8 |      1750 | 00:00:34      |   5.5980e-01 | 1.8677e-01 ‖
‖     8 |      1800 | 00:00:35      |   5.2030e-01 | 1.8556e-01 ‖
‖     8 |      1850 | 00:00:36      |   5.2318e-01 | 1.8458e-01 ‖
‖     9 |      1900 | 00:00:37      |   5.8010e-01 | 1.8409e-01 ‖
‖     9 |      1950 | 00:00:38      |   5.6448e-01 | 1.8451e-01 ‖
‖     9 |      2000 | 00:00:39      |   4.7871e-01 | 1.8432e-01 ‖
‖     9 |      2050 | 00:00:40      |   5.1315e-01 | 1.8445e-01 ‖
‖     9 |      2100 | 00:00:42      |   6.2743e-01 | 1.8448e-01 ‖
‖    10 |      2150 | 00:00:43      |   5.1537e-01 | 1.8445e-01 ‖
‖    10 |      2200 | 00:00:44      |   5.0272e-01 | 1.8383e-01 ‖
‖    10 |      2250 | 00:00:45      |   4.9525e-01 | 1.8359e-01 ‖
‖    10 |      2300 | 00:00:46      |   6.3977e-01 | 1.8328e-01 ‖
‖    10 |      2340 | 00:00:47      |   7.0435e-01 | 1.8300e-01 ‖
‖===============================================================‖
 
 done (training time: 47.68 [s])
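The "gowal" method refers to interval bound propagation (IBP) training in the style of Gowal et al., where a box of radius eps around each input is pushed through the network and the logged kappa = 0.5 mixes the natural and worst-case losses as kappa * L_nat + (1 - kappa) * L_robust. A minimal numpy sketch of the bound propagation itself (the exact loss-mixing convention of this code base is an assumption):

```python
import numpy as np

def ibp_affine(l, u, W, b):
    """Propagate the box [l, u] through an affine layer x -> W @ x + b."""
    c, r = (u + l) / 2.0, (u - l) / 2.0   # centre and radius
    c2 = W @ c + b
    r2 = np.abs(W) @ r                     # radius scales with |W|
    return c2 - r2, c2 + r2

def ibp_relu(l, u):
    """ReLU is monotone, so bounds pass through elementwise."""
    return np.maximum(l, 0.0), np.maximum(u, 0.0)
```

By construction, the output of the true network on any point inside the input box stays inside the propagated bounds; the bounds are sound but generally loose, which is why the IBP training loss in the table above sits well above the point-training loss.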
--- Started training config "SABR" (3/5)...
____________________________________
 Neural Network Training Parameters 
------------------------------------
 - Training method: sabr
 - Optimizer: AdamOptimizer, Learning Rate: 5.00e-04, Beta1: 9.00e-01, Beta2: 9.99e-01, Epsilon: 1.00e-08, Lambda: 1.00e-06, L2-Grad norm threshold: 1.00e+01
 - Training epochs: 10
 - Mini-batch size: 256
 - Shuffle data: every_epoch
 - (max) Training noise: 1.00e-01
 - Early stopping after non-decr. validation steps: Inf
____________________________________
 
‖===============================================================‖
‖ Epoch | Iteration | Training Time | Loss (train) | Loss (val) ‖
‖===============================================================‖
‖     1 |         1 | 00:00:00      |   2.3511e+00 | 1.1789e+00 ‖
‖     1 |        50 | 00:00:00      |   1.1360e+00 | 1.1192e+00 ‖
‖     1 |       100 | 00:00:00      |   8.5068e-01 | 8.3064e-01 ‖
‖     1 |       150 | 00:00:01      |   5.7631e-01 | 5.3122e-01 ‖
‖     1 |       200 | 00:00:01      |   5.6068e-01 | 3.7504e-01 ‖
‖     2 |       250 | 00:00:03      |   5.0937e-01 | 2.8092e-01 ‖
‖     2 |       300 | 00:00:06      |   4.2954e-01 | 2.2731e-01 ‖
‖     2 |       350 | 00:00:09      |   4.9376e-01 | 2.0028e-01 ‖
‖     2 |       400 | 00:00:11      |   5.5519e-01 | 1.8945e-01 ‖
‖     2 |       450 | 00:00:14      |   6.7812e-01 | 1.8376e-01 ‖
‖     3 |       500 | 00:00:16      |   6.3220e-01 | 1.8173e-01 ‖
‖     3 |       550 | 00:00:19      |   7.1923e-01 | 1.8455e-01 ‖
‖     3 |       600 | 00:00:21      |   8.1562e-01 | 1.7971e-01 ‖
‖     3 |       650 | 00:00:24      |   8.2538e-01 | 1.8189e-01 ‖
‖     3 |       700 | 00:00:26      |   7.7385e-01 | 1.8518e-01 ‖
‖     4 |       750 | 00:00:29      |   8.4349e-01 | 1.8379e-01 ‖
‖     4 |       800 | 00:00:31      |   7.7520e-01 | 1.8076e-01 ‖
‖     4 |       850 | 00:00:34      |   8.8628e-01 | 1.8418e-01 ‖
‖     4 |       900 | 00:00:37      |   7.6767e-01 | 1.8566e-01 ‖
‖     5 |       950 | 00:00:40      |   7.9960e-01 | 1.8214e-01 ‖
‖     5 |      1000 | 00:00:42      |   7.6359e-01 | 1.8216e-01 ‖
‖     5 |      1050 | 00:00:45      |   7.5510e-01 | 1.7606e-01 ‖
‖     5 |      1100 | 00:00:48      |   7.1442e-01 | 1.7588e-01 ‖
‖     5 |      1150 | 00:00:51      |   8.3454e-01 | 1.7510e-01 ‖
‖     6 |      1200 | 00:00:54      |   7.9177e-01 | 1.7457e-01 ‖
‖     6 |      1250 | 00:00:57      |   8.5013e-01 | 1.6720e-01 ‖
‖     6 |      1300 | 00:00:59      |   6.0243e-01 | 1.6480e-01 ‖
‖     6 |      1350 | 00:01:02      |   7.1156e-01 | 1.5918e-01 ‖
‖     6 |      1400 | 00:01:05      |   5.0021e-01 | 1.5504e-01 ‖
‖     7 |      1450 | 00:01:08      |   7.2496e-01 | 1.5622e-01 ‖
‖     7 |      1500 | 00:01:11      |   6.1637e-01 | 1.5546e-01 ‖
‖     7 |      1550 | 00:01:13      |   7.1616e-01 | 1.5437e-01 ‖
‖     7 |      1600 | 00:01:16      |   6.0292e-01 | 1.5300e-01 ‖
‖     8 |      1650 | 00:01:19      |   6.0762e-01 | 1.5229e-01 ‖
‖     8 |      1700 | 00:01:21      |   5.9881e-01 | 1.5245e-01 ‖
‖     8 |      1750 | 00:01:24      |   5.9903e-01 | 1.5170e-01 ‖
‖     8 |      1800 | 00:01:27      |   5.4303e-01 | 1.5061e-01 ‖
‖     8 |      1850 | 00:01:29      |   5.6073e-01 | 1.4968e-01 ‖
‖     9 |      1900 | 00:01:32      |   5.9705e-01 | 1.4936e-01 ‖
‖     9 |      1950 | 00:01:35      |   5.8610e-01 | 1.4971e-01 ‖
‖     9 |      2000 | 00:01:37      |   5.0074e-01 | 1.4964e-01 ‖
‖     9 |      2050 | 00:01:40      |   5.3316e-01 | 1.4984e-01 ‖
‖     9 |      2100 | 00:01:43      |   6.6254e-01 | 1.4982e-01 ‖
‖    10 |      2150 | 00:01:46      |   5.5169e-01 | 1.4978e-01 ‖
‖    10 |      2200 | 00:01:48      |   5.4508e-01 | 1.4931e-01 ‖
‖    10 |      2250 | 00:01:51      |   5.1993e-01 | 1.4913e-01 ‖
‖    10 |      2300 | 00:01:53      |   6.6652e-01 | 1.4887e-01 ‖
‖    10 |      2340 | 00:01:56      |   7.5062e-01 | 1.4872e-01 ‖
‖===============================================================‖
 
 done (training time: 116.11 [s])
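SABR (Müller et al.) certifies a small box around an adversarially chosen centre instead of the full eps-ball, which tightens the propagated bounds. A hedged sketch of the box construction only; the PGD search for `x_adv` and the shrinking factor `tau_box` are assumptions for illustration and do not appear in the log:

```python
import numpy as np

def sabr_box(x, x_adv, eps, tau_box):
    """SABR-style small box: centred near the adversarial point
    x_adv, shrunk to a fraction tau_box of the eps-ball, and clipped
    so the small box stays inside [x - eps, x + eps]."""
    r = tau_box * eps                              # shrunken radius
    c = np.clip(x_adv, x - eps + r, x + eps - r)   # keep box in ball
    return c - r, c + r
```

The shrunken box is then propagated with IBP as usual; because it is much smaller than the full ball, the resulting bounds (and hence the training loss) are tighter, consistent with SABR's lower validation loss compared to plain IBP in the tables above.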
--- Started training config "TRADES" (4/5)...
____________________________________
 Neural Network Training Parameters 
------------------------------------
 - Training method: trades
 - Optimizer: AdamOptimizer, Learning Rate: 5.00e-04, Beta1: 9.00e-01, Beta2: 9.99e-01, Epsilon: 1.00e-08, Lambda: 1.00e-06, L2-Grad norm threshold: 1.00e+01
 - Training epochs: 10
 - Mini-batch size: 256
 - Shuffle data: every_epoch
 - (max) Training noise: 1.00e-01
 - Early stopping after non-decr. validation steps: Inf
____________________________________
 
‖===============================================================‖
‖ Epoch | Iteration | Training Time | Loss (train) | Loss (val) ‖
‖===============================================================‖
‖     1 |         1 | 00:00:00      |   2.3511e+00 | 1.1789e+00 ‖
‖     1 |        50 | 00:00:00      |   1.1360e+00 | 1.1192e+00 ‖
‖     1 |       100 | 00:00:00      |   8.5068e-01 | 8.3064e-01 ‖
‖     1 |       150 | 00:00:01      |   5.7631e-01 | 5.3122e-01 ‖
‖     1 |       200 | 00:00:01      |   5.6068e-01 | 3.7504e-01 ‖
‖     2 |       250 | 00:00:02      |   5.3040e+00 | 2.8791e-01 ‖
‖     2 |       300 | 00:00:05      |   3.4438e+00 | 2.3653e-01 ‖
‖     2 |       350 | 00:00:07      |   2.5433e+00 | 2.0213e-01 ‖
‖     2 |       400 | 00:00:10      |   2.1914e+00 | 1.7591e-01 ‖
‖     2 |       450 | 00:00:13      |   2.0534e+00 | 1.5631e-01 ‖
‖     3 |       500 | 00:00:15      |   1.4535e+00 | 1.4013e-01 ‖
‖     3 |       550 | 00:00:18      |   1.7450e+00 | 1.3474e-01 ‖
‖     3 |       600 | 00:00:21      |   1.7110e+00 | 1.2018e-01 ‖
‖     3 |       650 | 00:00:23      |   1.5473e+00 | 1.0791e-01 ‖
‖     3 |       700 | 00:00:26      |   1.3637e+00 | 1.0441e-01 ‖
‖     4 |       750 | 00:00:29      |   1.3518e+00 | 9.4860e-02 ‖
‖     4 |       800 | 00:00:31      |   1.1562e+00 | 8.6368e-02 ‖
‖     4 |       850 | 00:00:34      |   1.3062e+00 | 8.8619e-02 ‖
‖     4 |       900 | 00:00:36      |   1.2286e+00 | 7.9713e-02 ‖
‖     5 |       950 | 00:00:38      |   1.1964e+00 | 8.1848e-02 ‖
‖     5 |      1000 | 00:00:41      |   1.1567e+00 | 7.5325e-02 ‖
‖     5 |      1050 | 00:00:44      |   1.1871e+00 | 7.0830e-02 ‖
‖     5 |      1100 | 00:00:46      |   1.1360e+00 | 7.2347e-02 ‖
‖     5 |      1150 | 00:00:49      |   1.2155e+00 | 7.4190e-02 ‖
‖     6 |      1200 | 00:00:52      |   1.4242e+00 | 6.7654e-02 ‖
‖     6 |      1250 | 00:00:55      |   1.5014e+00 | 6.5581e-02 ‖
‖     6 |      1300 | 00:00:57      |   8.8916e-01 | 6.7848e-02 ‖
‖     6 |      1350 | 00:01:00      |   1.2669e+00 | 6.1911e-02 ‖
‖     6 |      1400 | 00:01:03      |   8.8601e-01 | 5.8178e-02 ‖
‖     7 |      1450 | 00:01:06      |   1.1815e+00 | 6.1220e-02 ‖
‖     7 |      1500 | 00:01:09      |   1.1169e+00 | 6.1200e-02 ‖
‖     7 |      1550 | 00:01:12      |   1.1890e+00 | 5.9993e-02 ‖
‖     7 |      1600 | 00:01:15      |   1.0496e+00 | 5.8787e-02 ‖
‖     8 |      1650 | 00:01:18      |   1.0492e+00 | 6.0023e-02 ‖
‖     8 |      1700 | 00:01:21      |   9.6379e-01 | 5.8917e-02 ‖
‖     8 |      1750 | 00:01:24      |   9.2947e-01 | 5.8009e-02 ‖
‖     8 |      1800 | 00:01:26      |   7.8703e-01 | 5.8238e-02 ‖
‖     8 |      1850 | 00:01:29      |   9.5685e-01 | 5.6785e-02 ‖
‖     9 |      1900 | 00:01:32      |   1.0114e+00 | 5.7576e-02 ‖
‖     9 |      1950 | 00:01:35      |   9.3606e-01 | 5.7914e-02 ‖
‖     9 |      2000 | 00:01:37      |   8.3683e-01 | 5.7944e-02 ‖
‖     9 |      2050 | 00:01:40      |   1.0433e+00 | 5.7881e-02 ‖
‖     9 |      2100 | 00:01:43      |   1.1054e+00 | 5.7697e-02 ‖
‖    10 |      2150 | 00:01:45      |   9.6605e-01 | 5.8030e-02 ‖
‖    10 |      2200 | 00:01:48      |   9.6037e-01 | 5.7641e-02 ‖
‖    10 |      2250 | 00:01:51      |   9.4238e-01 | 5.7493e-02 ‖
‖    10 |      2300 | 00:01:54      |   9.0832e-01 | 5.7357e-02 ‖
‖    10 |      2340 | 00:01:56      |   1.4901e+00 | 5.7114e-02 ‖
‖===============================================================‖
 
 done (training time: 116.25 [s])
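TRADES (Zhang et al.) trades natural accuracy against robustness by adding a KL-divergence term between the predictions on clean and adversarial inputs; this explains the much larger training-loss values in the table above relative to the validation loss, which is evaluated on clean data. A self-contained numpy sketch; the trade-off weight `beta` is not printed in the log, so its value here is a placeholder:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # numerically stable
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def trades_loss(logits_nat, logits_adv, y, beta=1.0):
    """TRADES objective: natural cross-entropy plus beta times the
    KL divergence between natural and adversarial predictions."""
    p_nat = softmax(logits_nat)
    p_adv = softmax(logits_adv)
    ce = -np.log(p_nat[np.arange(len(y)), y] + 1e-12).mean()
    kl = (p_nat * (np.log(p_nat + 1e-12)
                   - np.log(p_adv + 1e-12))).sum(-1).mean()
    return ce + beta * kl
```

When the adversarial logits equal the natural logits the KL term vanishes and the loss reduces to plain cross-entropy.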
--- Started training config "Set (fgsm2-10-10; eps=0.1; tau=0.1)" (5/5)...
____________________________________
 Neural Network Training Parameters 
------------------------------------
 - Training method: set
 - Optimizer: AdamOptimizer, Learning Rate: 5.00e-04, Beta1: 9.00e-01, Beta2: 9.99e-01, Epsilon: 1.00e-08, Lambda: 1.00e-06, L2-Grad norm threshold: 1.00e+01
 - Training epochs: 10
 - Mini-batch size: 256
 - Shuffle data: every_epoch
 - (max) Training noise: 1.00e-01
 - Volume heuristic for training: f-radius
 - Weight for volume heuristic in loss (tau): 1.00e-01
 - Type of input generators: fgsm2
 - Number of generators: 10
 - Early stopping after non-decr. validation steps: Inf
____________________________________
 
‖=============================================================================================================================‖
‖ Epoch | Iteration | Training Time | Loss (softmax+log; train) | Loss (Vol; train) | Loss (Total; train) | Loss (Total; val) ‖
‖=============================================================================================================================‖
‖     1 |         1 | 00:00:00      |                2.3511e+00 |        0.0000e+00 |          2.3511e+00 |        1.1789e+00 ‖
‖     1 |        50 | 00:00:00      |                1.1360e+00 |        0.0000e+00 |          1.1360e+00 |        1.1192e+00 ‖
‖     1 |       100 | 00:00:01      |                8.5068e-01 |        0.0000e+00 |          8.5068e-01 |        8.3064e-01 ‖
‖     1 |       150 | 00:00:01      |                5.7631e-01 |        0.0000e+00 |          5.7631e-01 |        5.3122e-01 ‖
‖     1 |       200 | 00:00:01      |                5.6068e-01 |        0.0000e+00 |          5.6068e-01 |        3.7504e-01 ‖
‖     2 |       250 | 00:00:03      |                4.6917e-01 |        6.4399e-01 |          4.8665e-01 |        2.9032e-01 ‖
‖     2 |       300 | 00:00:06      |                3.8587e-01 |        6.4215e-01 |          4.1150e-01 |        2.6518e-01 ‖
‖     2 |       350 | 00:00:09      |                3.8066e-01 |        6.0944e-01 |          4.0354e-01 |        2.4244e-01 ‖
‖     2 |       400 | 00:00:12      |                3.7863e-01 |        6.0499e-01 |          4.0126e-01 |        2.2742e-01 ‖
‖     2 |       450 | 00:00:15      |                4.3353e-01 |        6.1639e-01 |          4.5181e-01 |        2.0861e-01 ‖
‖     3 |       500 | 00:00:18      |                3.4822e-01 |        6.5346e-01 |          3.7875e-01 |        1.9394e-01 ‖
‖     3 |       550 | 00:00:22      |                3.7252e-01 |        6.7694e-01 |          4.0296e-01 |        1.8752e-01 ‖
‖     3 |       600 | 00:00:25      |                4.5479e-01 |        6.5124e-01 |          4.7443e-01 |        1.7576e-01 ‖
‖     3 |       650 | 00:00:28      |                4.1681e-01 |        6.8422e-01 |          4.4355e-01 |        1.6707e-01 ‖
‖     3 |       700 | 00:00:31      |                3.9979e-01 |        6.8990e-01 |          4.2880e-01 |        1.6060e-01 ‖
‖     4 |       750 | 00:00:34      |                4.1099e-01 |        7.3386e-01 |          4.4328e-01 |        1.5218e-01 ‖
‖     4 |       800 | 00:00:38      |                3.1706e-01 |        7.5678e-01 |          3.6103e-01 |        1.4405e-01 ‖
‖     4 |       850 | 00:00:41      |                4.0449e-01 |        7.3608e-01 |          4.3765e-01 |        1.4389e-01 ‖
‖     4 |       900 | 00:00:44      |                3.4003e-01 |        7.7873e-01 |          3.8390e-01 |        1.3658e-01 ‖
‖     5 |       950 | 00:00:48      |                3.4811e-01 |        8.1473e-01 |          3.9477e-01 |        1.3389e-01 ‖
‖     5 |      1000 | 00:00:51      |                3.2873e-01 |        8.3174e-01 |          3.7903e-01 |        1.3141e-01 ‖
‖     5 |      1050 | 00:00:55      |                3.4373e-01 |        7.8190e-01 |          3.8754e-01 |        1.2677e-01 ‖
‖     5 |      1100 | 00:00:59      |                3.0351e-01 |        8.2250e-01 |          3.5541e-01 |        1.2244e-01 ‖
‖     5 |      1150 | 00:01:02      |                3.7822e-01 |        7.8229e-01 |          4.1863e-01 |        1.2222e-01 ‖
‖     6 |      1200 | 00:01:06      |                3.7881e-01 |        7.8400e-01 |          4.1933e-01 |        1.1719e-01 ‖
‖     6 |      1250 | 00:01:10      |                4.5746e-01 |        7.5575e-01 |          4.8729e-01 |        1.1279e-01 ‖
‖     6 |      1300 | 00:01:13      |                2.5663e-01 |        7.4035e-01 |          3.0500e-01 |        1.1207e-01 ‖
‖     6 |      1350 | 00:01:16      |                3.1708e-01 |        7.5313e-01 |          3.6069e-01 |        1.1057e-01 ‖
‖     6 |      1400 | 00:01:20      |                2.4401e-01 |        7.0071e-01 |          2.8968e-01 |        1.0458e-01 ‖
‖     7 |      1450 | 00:01:24      |                3.2480e-01 |        7.3798e-01 |          3.6612e-01 |        1.0527e-01 ‖
‖     7 |      1500 | 00:01:27      |                3.0393e-01 |        6.8845e-01 |          3.4238e-01 |        1.0429e-01 ‖
‖     7 |      1550 | 00:01:31      |                3.1868e-01 |        7.3972e-01 |          3.6078e-01 |        1.0304e-01 ‖
‖     7 |      1600 | 00:01:34      |                2.9110e-01 |        7.1060e-01 |          3.3305e-01 |        1.0328e-01 ‖
‖     8 |      1650 | 00:01:38      |                2.8034e-01 |        6.7156e-01 |          3.1946e-01 |        1.0304e-01 ‖
‖     8 |      1700 | 00:01:41      |                2.6383e-01 |        7.3319e-01 |          3.1077e-01 |        1.0231e-01 ‖
‖     8 |      1750 | 00:01:44      |                2.7618e-01 |        7.0695e-01 |          3.1926e-01 |        1.0172e-01 ‖
‖     8 |      1800 | 00:01:48      |                2.1682e-01 |        6.9855e-01 |          2.6500e-01 |        1.0148e-01 ‖
‖     8 |      1850 | 00:01:51      |                2.5675e-01 |        6.9890e-01 |          3.0096e-01 |        1.0031e-01 ‖
‖     9 |      1900 | 00:01:55      |                2.9664e-01 |        6.9458e-01 |          3.3643e-01 |        1.0111e-01 ‖
‖     9 |      1950 | 00:01:58      |                2.5906e-01 |        6.9972e-01 |          3.0312e-01 |        1.0077e-01 ‖
‖     9 |      2000 | 00:02:02      |                2.1887e-01 |        7.2545e-01 |          2.6953e-01 |        1.0072e-01 ‖
‖     9 |      2050 | 00:02:06      |                2.6379e-01 |        6.5644e-01 |          3.0306e-01 |        1.0042e-01 ‖
‖     9 |      2100 | 00:02:10      |                2.9011e-01 |        6.9819e-01 |          3.3092e-01 |        1.0026e-01 ‖
‖    10 |      2150 | 00:02:15      |                2.6268e-01 |        6.9007e-01 |          3.0542e-01 |        9.9989e-02 ‖
‖    10 |      2200 | 00:02:19      |                2.4286e-01 |        6.9801e-01 |          2.8837e-01 |        1.0003e-01 ‖
‖    10 |      2250 | 00:02:23      |                2.5594e-01 |        6.5554e-01 |          2.9590e-01 |        1.0026e-01 ‖
‖    10 |      2300 | 00:02:26      |                2.7648e-01 |        6.8196e-01 |          3.1703e-01 |        1.0002e-01 ‖
‖    10 |      2340 | 00:02:29      |                4.0643e-01 |        6.4514e-01 |          4.3030e-01 |        9.9607e-02 ‖
‖=============================================================================================================================‖
 
 done (training time: 149.59 [s])
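The three training-loss columns in the "set" table are consistent with a convex combination of the cross-entropy and volume terms using the logged tau = 0.1, e.g. 0.9 * 4.6917e-01 + 0.1 * 6.4399e-01 ≈ 4.8665e-01 matches the first non-warm-up "Loss (Total; train)" entry. A one-line sketch of that relation (inferred from the table, not documented in the log):

```python
def set_total_loss(loss_ce, loss_vol, tau=0.1):
    """Convex combination reproducing the 'Loss (Total)' column:
    (1 - tau) * CE + tau * Vol, with the logged tau = 0.1."""
    return (1.0 - tau) * loss_ce + tau * loss_vol
```

Note the volume term is zero during epoch 1, so the warm-up rows reduce to the plain cross-entropy loss, which is why they coincide across all five configs.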
Training run (1/1) done. Saving results... done
Computing accuracies, run (1/1)...
--- config "point" (1/5)...
!!! Warning: reducing propagation batch size: 256 -> 128
 --- [11.54,73.45] -> nn.verify(timeout [s]: 0) -> [11.54,73.45]...  done
--- config "Gowal (IBP; eps=0.1, kappa=0.5)" (2/5)... --- [83.60,88.03] -> nn.verify(timeout [s]: 0) -> [83.60,88.03]...  done
--- config "SABR" (3/5)... --- [87.91,91.42] -> nn.verify(timeout [s]: 0) -> [87.91,91.42]...  done
--- config "TRADES" (4/5)...
!!! Warning: reducing propagation batch size: 256 -> 128
 --- [66.58,93.31] -> nn.verify(timeout [s]: 0) -> [66.58,93.31]...  done
--- config "Set (fgsm2-10-10; eps=0.1; tau=0.1)" (5/5)...
!!! Warning: reducing propagation batch size: 256 -> 128
 --- [74.68,92.88] -> nn.verify(timeout [s]: 0) -> [74.68,92.88]...  done
Saving accuracies... done
Extracting training times, run (1/1)...
--- config "point" (1/5)... done
--- config "Gowal (IBP; eps=0.1, kappa=0.5)" (2/5)... done
--- config "SABR" (3/5)... done
--- config "TRADES" (4/5)... done
--- config "Set (fgsm2-10-10; eps=0.1; tau=0.1)" (5/5)... done
Generating results table...
____________________________________________________________________________________________________________________
                Method               |   Accuracy   | falsified Acc. (eps=0.1) | fast-verif. Acc. (eps=0.1) | [max] 
--------------------------------------------------------------------------------------------------------------------
 point                               | 96.67 ± 0.00 |       73.45 ± 0.00       |        11.54 ± 0.00        | 11.54 
 Gowal (IBP; eps=0.1, kappa=0.5)     | 93.40 ± 0.00 |       88.03 ± 0.00       |        83.60 ± 0.00        | 83.60 
 SABR                                | 95.02 ± 0.00 |       91.42 ± 0.00       |        87.91 ± 0.00        | 87.91 
 TRADES                              | 97.88 ± 0.00 |       93.31 ± 0.00       |        66.58 ± 0.00        | 66.58 
 Set (fgsm2-10-10; eps=0.1; tau=0.1) | 96.98 ± 0.00 |       92.88 ± 0.00       |        74.68 ± 0.00        | 74.68 
____________________________________________________________________________________________________________________
 
 done
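The bracketed pairs printed during verification (e.g. [11.54,73.45]) and the two accuracy columns above bound the true robust accuracy: the fast-verified accuracy is a certified lower bound, and the non-falsified accuracy (samples the attack failed to break) is an upper bound. A small sketch of that bracket, assuming per-sample boolean outcomes as inputs (the names are illustrative, not from the log):

```python
import numpy as np

def robust_accuracy_bracket(certified, attack_survived):
    """Bracket the true robust accuracy in percent: certified
    samples give a lower bound, attack survivors an upper bound."""
    return 100.0 * np.mean(certified), 100.0 * np.mean(attack_survived)
```

For the "point" row, the bracket [11.54, 73.45] is wide because an undefended network is easy to attack but hard to certify; IBP-style training narrows it (e.g. [83.60, 88.03] for Gowal) at some cost in clean accuracy.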
Generating training time table...
____________________________________________________________________
                Method               | (min) Train Time [s / Epoch] 
--------------------------------------------------------------------
 point                               |                          2.1 
 Gowal (IBP; eps=0.1, kappa=0.5)     |                          4.8 
 SABR                                |                         11.6 
 TRADES                              |                         11.6 
 Set (fgsm2-10-10; eps=0.1; tau=0.1) |                         15.0 
____________________________________________________________________
 
 done
 
'trainModels' was run successfully!
Saving plots to './results/mnist-short/plots'...
 
------------------------------------------------------------------
 
Completed!
Date: 31-Mar-2025 12:09:02
