Epoch: 0001 train_loss= 2.08140 train_acc= 0.13208 val_loss= 2.08678 val_acc= 0.03448 time= 0.84401
Epoch: 0002 train_loss= 2.07968 train_acc= 0.13747 val_loss= 2.08500 val_acc= 0.03448 time= 0.00000
Epoch: 0003 train_loss= 2.07938 train_acc= 0.12129 val_loss= 2.08314 val_acc= 0.03448 time= 0.01563
Epoch: 0004 train_loss= 2.07824 train_acc= 0.11590 val_loss= 2.08105 val_acc= 0.06897 time= 0.00000
Epoch: 0005 train_loss= 2.07680 train_acc= 0.12129 val_loss= 2.07874 val_acc= 0.06897 time= 0.00000
Epoch: 0006 train_loss= 2.07493 train_acc= 0.12938 val_loss= 2.07630 val_acc= 0.06897 time= 0.01563
Epoch: 0007 train_loss= 2.07542 train_acc= 0.12668 val_loss= 2.07375 val_acc= 0.06897 time= 0.00000
Epoch: 0008 train_loss= 2.07421 train_acc= 0.14016 val_loss= 2.07108 val_acc= 0.06897 time= 0.00000
Epoch: 0009 train_loss= 2.07396 train_acc= 0.14825 val_loss= 2.06833 val_acc= 0.06897 time= 0.01563
Epoch: 0010 train_loss= 2.07246 train_acc= 0.14555 val_loss= 2.06550 val_acc= 0.06897 time= 0.00000
Epoch: 0011 train_loss= 2.06916 train_acc= 0.14555 val_loss= 2.06244 val_acc= 0.06897 time= 0.00000
Epoch: 0012 train_loss= 2.06828 train_acc= 0.12129 val_loss= 2.05922 val_acc= 0.10345 time= 0.01562
Epoch: 0013 train_loss= 2.06754 train_acc= 0.13208 val_loss= 2.05592 val_acc= 0.34483 time= 0.00000
Epoch: 0014 train_loss= 2.06709 train_acc= 0.16981 val_loss= 2.05238 val_acc= 0.34483 time= 0.00000
Epoch: 0015 train_loss= 2.06672 train_acc= 0.14016 val_loss= 2.04869 val_acc= 0.34483 time= 0.01563
Epoch: 0016 train_loss= 2.06331 train_acc= 0.16442 val_loss= 2.04478 val_acc= 0.34483 time= 0.00000
Epoch: 0017 train_loss= 2.06454 train_acc= 0.15633 val_loss= 2.04067 val_acc= 0.34483 time= 0.01563
Epoch: 0018 train_loss= 2.06094 train_acc= 0.15903 val_loss= 2.03650 val_acc= 0.34483 time= 0.00000
Epoch: 0019 train_loss= 2.06056 train_acc= 0.15903 val_loss= 2.03226 val_acc= 0.34483 time= 0.00000
Epoch: 0020 train_loss= 2.05955 train_acc= 0.15903 val_loss= 2.02819 val_acc= 0.34483 time= 0.01563
Epoch: 0021 train_loss= 2.05948 train_acc= 0.15903 val_loss= 2.02446 val_acc= 0.34483 time= 0.00000
Epoch: 0022 train_loss= 2.05970 train_acc= 0.15903 val_loss= 2.02124 val_acc= 0.34483 time= 0.00000
Epoch: 0023 train_loss= 2.06099 train_acc= 0.16173 val_loss= 2.01856 val_acc= 0.34483 time= 0.01563
Epoch: 0024 train_loss= 2.05970 train_acc= 0.16442 val_loss= 2.01663 val_acc= 0.34483 time= 0.00000
Epoch: 0025 train_loss= 2.05871 train_acc= 0.16173 val_loss= 2.01530 val_acc= 0.34483 time= 0.00000
Epoch: 0026 train_loss= 2.05942 train_acc= 0.16442 val_loss= 2.01494 val_acc= 0.34483 time= 0.01563
Epoch: 0027 train_loss= 2.05869 train_acc= 0.15903 val_loss= 2.01494 val_acc= 0.34483 time= 0.00000
Epoch: 0028 train_loss= 2.05713 train_acc= 0.16442 val_loss= 2.01568 val_acc= 0.34483 time= 0.00000
Epoch: 0029 train_loss= 2.05954 train_acc= 0.15633 val_loss= 2.01704 val_acc= 0.34483 time= 0.01563
Epoch: 0030 train_loss= 2.05753 train_acc= 0.15094 val_loss= 2.01855 val_acc= 0.31034 time= 0.00000
Epoch: 0031 train_loss= 2.05758 train_acc= 0.15364 val_loss= 2.02015 val_acc= 0.27586 time= 0.00000
Early stopping...
Optimization Finished!
Test set results: cost= 2.06831 accuracy= 0.16949 time= 0.01563
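The training script that produced this log is not shown. For reference, a minimal sketch of a loop that emits lines in this exact format, with a generic patience-based early-stopping rule (an assumption; the original script's stopping criterion may differ, e.g. comparing against a moving average of recent validation losses). The `run_epoch` callback and `patience` value are hypothetical placeholders:

```python
import time

def format_epoch(epoch, train_loss, train_acc, val_loss, val_acc, elapsed):
    # Reproduces the log format above: zero-padded epoch, 5-decimal metrics.
    return ("Epoch: %04d train_loss= %.5f train_acc= %.5f "
            "val_loss= %.5f val_acc= %.5f time= %.5f"
            % (epoch, train_loss, train_acc, val_loss, val_acc, elapsed))

def train(run_epoch, max_epochs=200, patience=10):
    """Run `run_epoch` each epoch; stop early when val_loss has not
    improved on the best value seen for `patience` consecutive epochs."""
    history = []
    best_val, wait = float("inf"), 0
    for epoch in range(1, max_epochs + 1):
        t0 = time.time()
        # run_epoch is assumed to do one optimization step over the
        # training set and return (train_loss, train_acc, val_loss, val_acc).
        train_loss, train_acc, val_loss, val_acc = run_epoch(epoch)
        print(format_epoch(epoch, train_loss, train_acc,
                           val_loss, val_acc, time.time() - t0))
        history.append(val_loss)
        if val_loss < best_val:
            best_val, wait = val_loss, 0   # new best: reset the counter
        else:
            wait += 1                       # no improvement this epoch
            if wait >= patience:
                print("Early stopping...")
                break
    print("Optimization Finished!")
    return history
```

Note that `time= 0.00000` entries in the log above are consistent with a coarse wall-clock timer (roughly 15.6 ms granularity, typical of `time.time()` on Windows), not with zero-cost epochs.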
