Epoch: 0001 train_loss= 2.09610 train_acc= 0.10243 val_loss= 2.07314 val_acc= 0.20690 time= 0.79742
Epoch: 0002 train_loss= 2.08920 train_acc= 0.09973 val_loss= 2.06895 val_acc= 0.20690 time= 0.00000
Epoch: 0003 train_loss= 2.08676 train_acc= 0.10243 val_loss= 2.06537 val_acc= 0.20690 time= 0.01563
Epoch: 0004 train_loss= 2.08087 train_acc= 0.09973 val_loss= 2.06234 val_acc= 0.20690 time= 0.00000
Epoch: 0005 train_loss= 2.07770 train_acc= 0.09704 val_loss= 2.05985 val_acc= 0.20690 time= 0.00000
Epoch: 0006 train_loss= 2.07026 train_acc= 0.15364 val_loss= 2.05760 val_acc= 0.13793 time= 0.01563
Epoch: 0007 train_loss= 2.06983 train_acc= 0.16173 val_loss= 2.05536 val_acc= 0.13793 time= 0.00000
Epoch: 0008 train_loss= 2.06803 train_acc= 0.16712 val_loss= 2.05318 val_acc= 0.13793 time= 0.00000
Epoch: 0009 train_loss= 2.06499 train_acc= 0.16173 val_loss= 2.05102 val_acc= 0.13793 time= 0.01563
Epoch: 0010 train_loss= 2.06252 train_acc= 0.15903 val_loss= 2.04893 val_acc= 0.13793 time= 0.00000
Epoch: 0011 train_loss= 2.05960 train_acc= 0.16981 val_loss= 2.04698 val_acc= 0.13793 time= 0.00000
Epoch: 0012 train_loss= 2.05550 train_acc= 0.16712 val_loss= 2.04520 val_acc= 0.13793 time= 0.01563
Epoch: 0013 train_loss= 2.05463 train_acc= 0.16173 val_loss= 2.04344 val_acc= 0.13793 time= 0.00000
Epoch: 0014 train_loss= 2.05380 train_acc= 0.16981 val_loss= 2.04174 val_acc= 0.13793 time= 0.00000
Epoch: 0015 train_loss= 2.05315 train_acc= 0.18059 val_loss= 2.04024 val_acc= 0.13793 time= 0.01563
Epoch: 0016 train_loss= 2.05054 train_acc= 0.17251 val_loss= 2.03882 val_acc= 0.13793 time= 0.00000
Epoch: 0017 train_loss= 2.05121 train_acc= 0.17251 val_loss= 2.03760 val_acc= 0.13793 time= 0.00000
Epoch: 0018 train_loss= 2.05257 train_acc= 0.16712 val_loss= 2.03657 val_acc= 0.13793 time= 0.01563
Epoch: 0019 train_loss= 2.04979 train_acc= 0.16981 val_loss= 2.03575 val_acc= 0.13793 time= 0.00000
Epoch: 0020 train_loss= 2.05109 train_acc= 0.16442 val_loss= 2.03511 val_acc= 0.13793 time= 0.00000
Epoch: 0021 train_loss= 2.04786 train_acc= 0.17251 val_loss= 2.03480 val_acc= 0.13793 time= 0.01562
Epoch: 0022 train_loss= 2.05183 train_acc= 0.15094 val_loss= 2.03454 val_acc= 0.13793 time= 0.00000
Epoch: 0023 train_loss= 2.05157 train_acc= 0.16712 val_loss= 2.03425 val_acc= 0.13793 time= 0.00000
Epoch: 0024 train_loss= 2.05003 train_acc= 0.16442 val_loss= 2.03371 val_acc= 0.13793 time= 0.01563
Epoch: 0025 train_loss= 2.05461 train_acc= 0.16981 val_loss= 2.03287 val_acc= 0.13793 time= 0.00000
Epoch: 0026 train_loss= 2.05071 train_acc= 0.15633 val_loss= 2.03224 val_acc= 0.13793 time= 0.00000
Epoch: 0027 train_loss= 2.04729 train_acc= 0.18598 val_loss= 2.03205 val_acc= 0.13793 time= 0.01563
Epoch: 0028 train_loss= 2.04841 train_acc= 0.16173 val_loss= 2.03210 val_acc= 0.13793 time= 0.00000
Epoch: 0029 train_loss= 2.04931 train_acc= 0.13747 val_loss= 2.03238 val_acc= 0.13793 time= 0.00000
Epoch: 0030 train_loss= 2.05058 train_acc= 0.17251 val_loss= 2.03273 val_acc= 0.13793 time= 0.01563
Epoch: 0031 train_loss= 2.04868 train_acc= 0.14825 val_loss= 2.03302 val_acc= 0.13793 time= 0.00000
Epoch: 0032 train_loss= 2.04693 train_acc= 0.17520 val_loss= 2.03322 val_acc= 0.13793 time= 0.00000
Early stopping...
Optimization Finished!
Test set results: cost= 2.07704 accuracy= 0.16949 time= 0.00000 
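The log above (per-epoch `train_loss`/`val_loss`, an "Early stopping..." message, then a final test-set summary) is the typical output of a training loop that monitors validation loss over a sliding window. The alternating `time= 0.00000` and `time= 0.01563` readings suggest a low-resolution wall clock (about 15.6 ms granularity, as with `time.time()` on Windows). The sketch below shows one plausible shape of such a loop; the model and losses are stubbed with a toy curve, and the names (`early_stopping`, `cost_val`) and the mean-of-last-k stopping rule are assumptions modeled on common GCN reference scripts, not the actual code that produced this log.

```python
import random
import time

def run_training(max_epochs=200, early_stopping=10, seed=0):
    """Sketch of a loop that emits a log in the format above.

    The real forward/backward pass is replaced by a toy loss curve
    whose validation loss bottoms out and then rises, so the early
    stopping branch actually triggers. Accuracy columns are omitted
    for brevity.
    """
    rng = random.Random(seed)
    cost_val = []  # history of validation losses for the stopping check
    for epoch in range(max_epochs):
        t = time.time()
        # --- placeholder for: forward pass, loss, backward pass, update ---
        train_loss = 2.1 - 0.002 * epoch + rng.uniform(-0.01, 0.01)
        val_loss = 2.07 - 0.0015 * epoch + 0.00005 * epoch ** 2

        print("Epoch: {:04d}".format(epoch + 1),
              "train_loss= {:.5f}".format(train_loss),
              "val_loss= {:.5f}".format(val_loss),
              "time= {:.5f}".format(time.time() - t))

        # Stop once validation loss exceeds the mean of the last
        # `early_stopping` validation losses (assumed stopping rule).
        cost_val.append(val_loss)
        if epoch > early_stopping and cost_val[-1] > sum(
                cost_val[-(early_stopping + 1):-1]) / early_stopping:
            print("Early stopping...")
            break

    print("Optimization Finished!")
    return epoch + 1  # number of epochs actually run
```

With the toy loss curve, validation loss starts rising after its minimum and the window-mean rule halts training well before `max_epochs`, mirroring the 32-epoch stop in the log above.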
