Epoch: 0001 train_loss= 2.09687 train_acc= 0.12075 val_loss= 2.07432 val_acc= 0.20690 time= 0.18751
Epoch: 0002 train_loss= 2.11309 train_acc= 0.11321 val_loss= 2.07344 val_acc= 0.20690 time= 0.01563
Epoch: 0003 train_loss= 2.09095 train_acc= 0.12453 val_loss= 2.07263 val_acc= 0.17241 time= 0.01563
Epoch: 0004 train_loss= 2.08472 train_acc= 0.13585 val_loss= 2.07199 val_acc= 0.17241 time= 0.00000
Epoch: 0005 train_loss= 2.08563 train_acc= 0.12453 val_loss= 2.07182 val_acc= 0.17241 time= 0.01563
Epoch: 0006 train_loss= 2.08044 train_acc= 0.15849 val_loss= 2.07118 val_acc= 0.06897 time= 0.00000
Epoch: 0007 train_loss= 2.07597 train_acc= 0.19623 val_loss= 2.07043 val_acc= 0.06897 time= 0.01563
Epoch: 0008 train_loss= 2.07506 train_acc= 0.18113 val_loss= 2.06963 val_acc= 0.06897 time= 0.01563
Epoch: 0009 train_loss= 2.07675 train_acc= 0.12830 val_loss= 2.06873 val_acc= 0.10345 time= 0.00000
Epoch: 0010 train_loss= 2.07273 train_acc= 0.20377 val_loss= 2.06744 val_acc= 0.10345 time= 0.01563
Epoch: 0011 train_loss= 2.07324 train_acc= 0.16226 val_loss= 2.06590 val_acc= 0.13793 time= 0.00000
Epoch: 0012 train_loss= 2.06975 train_acc= 0.18491 val_loss= 2.06399 val_acc= 0.17241 time= 0.01563
Epoch: 0013 train_loss= 2.06360 train_acc= 0.19245 val_loss= 2.06166 val_acc= 0.17241 time= 0.01563
Epoch: 0014 train_loss= 2.06016 train_acc= 0.17358 val_loss= 2.05922 val_acc= 0.17241 time= 0.00000
Epoch: 0015 train_loss= 2.06689 train_acc= 0.16981 val_loss= 2.05669 val_acc= 0.17241 time= 0.01562
Epoch: 0016 train_loss= 2.06195 train_acc= 0.17358 val_loss= 2.05411 val_acc= 0.17241 time= 0.00000
Epoch: 0017 train_loss= 2.06246 train_acc= 0.20000 val_loss= 2.05175 val_acc= 0.17241 time= 0.01563
Epoch: 0018 train_loss= 2.06630 train_acc= 0.20755 val_loss= 2.04985 val_acc= 0.17241 time= 0.01563
Epoch: 0019 train_loss= 2.05337 train_acc= 0.18491 val_loss= 2.04780 val_acc= 0.17241 time= 0.00000
Epoch: 0020 train_loss= 2.06584 train_acc= 0.17358 val_loss= 2.04568 val_acc= 0.17241 time= 0.01563
Epoch: 0021 train_loss= 2.05511 train_acc= 0.18868 val_loss= 2.04371 val_acc= 0.17241 time= 0.00000
Epoch: 0022 train_loss= 2.04885 train_acc= 0.21132 val_loss= 2.04169 val_acc= 0.17241 time= 0.01563
Epoch: 0023 train_loss= 2.04712 train_acc= 0.18113 val_loss= 2.03974 val_acc= 0.17241 time= 0.01563
Epoch: 0024 train_loss= 2.05556 train_acc= 0.21132 val_loss= 2.03802 val_acc= 0.24138 time= 0.00000
Epoch: 0025 train_loss= 2.06295 train_acc= 0.20755 val_loss= 2.03657 val_acc= 0.24138 time= 0.01563
Epoch: 0026 train_loss= 2.04403 train_acc= 0.20377 val_loss= 2.03489 val_acc= 0.20690 time= 0.00000
Epoch: 0027 train_loss= 2.05171 train_acc= 0.21887 val_loss= 2.03348 val_acc= 0.20690 time= 0.01563
Epoch: 0028 train_loss= 2.03630 train_acc= 0.20377 val_loss= 2.03181 val_acc= 0.20690 time= 0.00000
Epoch: 0029 train_loss= 2.03918 train_acc= 0.22264 val_loss= 2.03018 val_acc= 0.27586 time= 0.01563
Epoch: 0030 train_loss= 2.04826 train_acc= 0.21132 val_loss= 2.02909 val_acc= 0.27586 time= 0.01563
Epoch: 0031 train_loss= 2.03496 train_acc= 0.20377 val_loss= 2.02820 val_acc= 0.27586 time= 0.00000
Epoch: 0032 train_loss= 2.03490 train_acc= 0.20755 val_loss= 2.02732 val_acc= 0.27586 time= 0.01562
Epoch: 0033 train_loss= 2.03739 train_acc= 0.20377 val_loss= 2.02740 val_acc= 0.27586 time= 0.00000
Epoch: 0034 train_loss= 2.05383 train_acc= 0.20377 val_loss= 2.02906 val_acc= 0.27586 time= 0.01563
Epoch: 0035 train_loss= 2.03313 train_acc= 0.21132 val_loss= 2.03119 val_acc= 0.20690 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 2.08958 accuracy= 0.11864 time= 0.00000 
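The log above was evidently produced by a training loop that prints per-epoch metrics and halts when validation loss stops improving. A minimal sketch of such a loop is below; the function names, the `patience` counter, and the stopping criterion are assumptions for illustration (the original script may use a different rule, e.g. comparing against a moving average of recent validation losses):

```python
import time

def train_with_early_stopping(step_fn, max_epochs=200, patience=10):
    """Run step_fn once per epoch, logging metrics in the format seen
    above, and stop when val_loss has not improved for `patience`
    consecutive epochs.

    step_fn(epoch) -> (train_loss, train_acc, val_loss, val_acc)
    All names and the patience default are illustrative assumptions.
    """
    best_val = float("inf")
    bad_epochs = 0
    for epoch in range(1, max_epochs + 1):
        t = time.time()
        train_loss, train_acc, val_loss, val_acc = step_fn(epoch)
        # Same printf-style line format as the log above.
        print("Epoch: %04d train_loss= %.5f train_acc= %.5f "
              "val_loss= %.5f val_acc= %.5f time= %.5f"
              % (epoch, train_loss, train_acc,
                 val_loss, val_acc, time.time() - t))
        if val_loss < best_val:
            best_val = val_loss
            bad_epochs = 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                print("Early stopping...")
                break
    print("Optimization Finished!")
    return best_val

# Usage with a synthetic step whose val_loss bottoms out, then rises:
losses = [2.1, 2.0, 1.9, 1.95, 1.97, 1.99, 2.0]
best = train_with_early_stopping(
    lambda e: (losses[e - 1], 0.2, losses[e - 1], 0.2),
    max_epochs=len(losses), patience=3)
```

With `patience=3`, the synthetic run stops after three consecutive non-improving epochs and returns the best validation loss seen (1.9), mirroring how the run above halted at epoch 35 once `val_loss` began climbing from its epoch-32 minimum.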
