Epoch: 0001 train_loss= 2.08702 train_acc= 0.11698 val_loss= 2.08517 val_acc= 0.00000 time= 0.31542
Epoch: 0002 train_loss= 2.08547 train_acc= 0.10943 val_loss= 2.08278 val_acc= 0.10345 time= 0.00000
Epoch: 0003 train_loss= 2.08419 train_acc= 0.15849 val_loss= 2.08083 val_acc= 0.34483 time= 0.01563
Epoch: 0004 train_loss= 2.08314 train_acc= 0.13962 val_loss= 2.07966 val_acc= 0.34483 time= 0.00000
Epoch: 0005 train_loss= 2.08260 train_acc= 0.17358 val_loss= 2.07845 val_acc= 0.10345 time= 0.01563
Epoch: 0006 train_loss= 2.08150 train_acc= 0.17358 val_loss= 2.07717 val_acc= 0.10345 time= 0.00000
Epoch: 0007 train_loss= 2.08068 train_acc= 0.17358 val_loss= 2.07580 val_acc= 0.10345 time= 0.00000
Epoch: 0008 train_loss= 2.07913 train_acc= 0.16226 val_loss= 2.07437 val_acc= 0.10345 time= 0.01563
Epoch: 0009 train_loss= 2.07838 train_acc= 0.16604 val_loss= 2.07289 val_acc= 0.10345 time= 0.00000
Epoch: 0010 train_loss= 2.07743 train_acc= 0.17358 val_loss= 2.07145 val_acc= 0.10345 time= 0.01563
Epoch: 0011 train_loss= 2.07610 train_acc= 0.15849 val_loss= 2.07000 val_acc= 0.10345 time= 0.00000
Epoch: 0012 train_loss= 2.07521 train_acc= 0.18113 val_loss= 2.06852 val_acc= 0.10345 time= 0.01562
Epoch: 0013 train_loss= 2.07302 train_acc= 0.14340 val_loss= 2.06707 val_acc= 0.10345 time= 0.00000
Epoch: 0014 train_loss= 2.07249 train_acc= 0.16604 val_loss= 2.06532 val_acc= 0.10345 time= 0.01563
Epoch: 0015 train_loss= 2.07046 train_acc= 0.17736 val_loss= 2.06352 val_acc= 0.10345 time= 0.00000
Epoch: 0016 train_loss= 2.06943 train_acc= 0.17358 val_loss= 2.06178 val_acc= 0.10345 time= 0.01563
Epoch: 0017 train_loss= 2.06764 train_acc= 0.17358 val_loss= 2.06012 val_acc= 0.10345 time= 0.00000
Epoch: 0018 train_loss= 2.06621 train_acc= 0.17358 val_loss= 2.05856 val_acc= 0.10345 time= 0.01563
Epoch: 0019 train_loss= 2.06611 train_acc= 0.17736 val_loss= 2.05712 val_acc= 0.10345 time= 0.00000
Epoch: 0020 train_loss= 2.06407 train_acc= 0.17358 val_loss= 2.05583 val_acc= 0.10345 time= 0.01563
Epoch: 0021 train_loss= 2.06407 train_acc= 0.17358 val_loss= 2.05468 val_acc= 0.10345 time= 0.00000
Epoch: 0022 train_loss= 2.06364 train_acc= 0.17358 val_loss= 2.05376 val_acc= 0.10345 time= 0.01563
Epoch: 0023 train_loss= 2.06306 train_acc= 0.17358 val_loss= 2.05299 val_acc= 0.10345 time= 0.00000
Epoch: 0024 train_loss= 2.05952 train_acc= 0.17736 val_loss= 2.05237 val_acc= 0.10345 time= 0.01563
Epoch: 0025 train_loss= 2.05864 train_acc= 0.17358 val_loss= 2.05190 val_acc= 0.10345 time= 0.00000
Epoch: 0026 train_loss= 2.05873 train_acc= 0.17358 val_loss= 2.05146 val_acc= 0.10345 time= 0.01563
Epoch: 0027 train_loss= 2.05784 train_acc= 0.17736 val_loss= 2.05124 val_acc= 0.10345 time= 0.00000
Epoch: 0028 train_loss= 2.05974 train_acc= 0.17358 val_loss= 2.05126 val_acc= 0.10345 time= 0.01563
Epoch: 0029 train_loss= 2.05701 train_acc= 0.17358 val_loss= 2.05134 val_acc= 0.10345 time= 0.00000
Epoch: 0030 train_loss= 2.05964 train_acc= 0.17358 val_loss= 2.05153 val_acc= 0.10345 time= 0.01563
Epoch: 0031 train_loss= 2.05661 train_acc= 0.17736 val_loss= 2.05179 val_acc= 0.10345 time= 0.00000
Epoch: 0032 train_loss= 2.05888 train_acc= 0.17358 val_loss= 2.05231 val_acc= 0.10345 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 2.07908 accuracy= 0.10169 time= 0.00000 
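In the run above, validation loss reaches its minimum at epoch 27 (2.05124) and then rises for five consecutive epochs before the run halts at epoch 32. The exact stopping criterion used is not shown in the log, but a standard patience-based check (a hypothetical sketch, not the actual training code) that would reproduce this behavior with a patience of 5 looks like:

```python
def should_stop(val_losses, patience=5):
    """Patience-based early stopping (assumed criterion, not taken from the log).

    Returns True once the validation loss has failed to improve on the
    best earlier value for `patience` consecutive epochs.
    """
    if len(val_losses) <= patience:
        return False
    best_so_far = min(val_losses[:-patience])          # best loss before the window
    recent = val_losses[-patience:]                    # the last `patience` epochs
    return all(v >= best_so_far for v in recent)       # no improvement in the window

# Validation losses from epochs 23-32 of the log above:
history = [2.05299, 2.05237, 2.05190, 2.05146, 2.05124,
           2.05126, 2.05134, 2.05153, 2.05179, 2.05231]

print(should_stop(history[:9]))  # at epoch 31: best (2.05124) is inside the window
print(should_stop(history))      # at epoch 32: five epochs without improvement
```

With patience=5 the check first fires at epoch 32, matching the "Early stopping..." line in the log; other criteria (e.g. comparing against a running mean of recent losses) could stop at the same point, so the patience value here is an assumption.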
