Epoch: 0001 train_loss= 2.08739 train_acc= 0.16352 val_loss= 2.08392 val_acc= 0.13793 time= 0.10955
Epoch: 0002 train_loss= 2.08542 train_acc= 0.14465 val_loss= 2.08211 val_acc= 0.13793 time= 0.01563
Epoch: 0003 train_loss= 2.08430 train_acc= 0.14465 val_loss= 2.08042 val_acc= 0.13793 time= 0.00000
Epoch: 0004 train_loss= 2.08269 train_acc= 0.14465 val_loss= 2.07877 val_acc= 0.13793 time= 0.01563
Epoch: 0005 train_loss= 2.08233 train_acc= 0.14465 val_loss= 2.07693 val_acc= 0.13793 time= 0.00000
Epoch: 0006 train_loss= 2.08089 train_acc= 0.15094 val_loss= 2.07505 val_acc= 0.13793 time= 0.00000
Epoch: 0007 train_loss= 2.07969 train_acc= 0.14465 val_loss= 2.07340 val_acc= 0.13793 time= 0.01563
Epoch: 0008 train_loss= 2.07829 train_acc= 0.15094 val_loss= 2.07170 val_acc= 0.13793 time= 0.00000
Epoch: 0009 train_loss= 2.07743 train_acc= 0.10063 val_loss= 2.06997 val_acc= 0.13793 time= 0.01563
Epoch: 0010 train_loss= 2.07695 train_acc= 0.11950 val_loss= 2.06824 val_acc= 0.13793 time= 0.00000
Epoch: 0011 train_loss= 2.07414 train_acc= 0.15723 val_loss= 2.06650 val_acc= 0.13793 time= 0.01563
Epoch: 0012 train_loss= 2.07342 train_acc= 0.12579 val_loss= 2.06475 val_acc= 0.24138 time= 0.00000
Epoch: 0013 train_loss= 2.07176 train_acc= 0.19497 val_loss= 2.06301 val_acc= 0.24138 time= 0.01563
Epoch: 0014 train_loss= 2.07033 train_acc= 0.17610 val_loss= 2.06130 val_acc= 0.24138 time= 0.00000
Epoch: 0015 train_loss= 2.06857 train_acc= 0.16352 val_loss= 2.05960 val_acc= 0.24138 time= 0.01563
Epoch: 0016 train_loss= 2.06846 train_acc= 0.18868 val_loss= 2.05797 val_acc= 0.24138 time= 0.00000
Epoch: 0017 train_loss= 2.06633 train_acc= 0.18239 val_loss= 2.05645 val_acc= 0.24138 time= 0.01563
Epoch: 0018 train_loss= 2.06475 train_acc= 0.19497 val_loss= 2.05500 val_acc= 0.24138 time= 0.01563
Epoch: 0019 train_loss= 2.06614 train_acc= 0.17610 val_loss= 2.05369 val_acc= 0.24138 time= 0.00000
Epoch: 0020 train_loss= 2.06360 train_acc= 0.17610 val_loss= 2.05250 val_acc= 0.24138 time= 0.01563
Epoch: 0021 train_loss= 2.06052 train_acc= 0.17610 val_loss= 2.05142 val_acc= 0.24138 time= 0.00000
Epoch: 0022 train_loss= 2.05917 train_acc= 0.17610 val_loss= 2.05049 val_acc= 0.24138 time= 0.01563
Epoch: 0023 train_loss= 2.05939 train_acc= 0.17610 val_loss= 2.04973 val_acc= 0.24138 time= 0.00000
Epoch: 0024 train_loss= 2.05930 train_acc= 0.16981 val_loss= 2.04915 val_acc= 0.24138 time= 0.01563
Epoch: 0025 train_loss= 2.05791 train_acc= 0.17610 val_loss= 2.04887 val_acc= 0.24138 time= 0.00000
Epoch: 0026 train_loss= 2.06063 train_acc= 0.16981 val_loss= 2.04876 val_acc= 0.24138 time= 0.01563
Epoch: 0027 train_loss= 2.05680 train_acc= 0.17610 val_loss= 2.04884 val_acc= 0.24138 time= 0.00000
Epoch: 0028 train_loss= 2.05574 train_acc= 0.17610 val_loss= 2.04907 val_acc= 0.24138 time= 0.01563
Epoch: 0029 train_loss= 2.05387 train_acc= 0.17610 val_loss= 2.04942 val_acc= 0.24138 time= 0.00000
Epoch: 0030 train_loss= 2.05728 train_acc= 0.17610 val_loss= 2.04981 val_acc= 0.24138 time= 0.01562
Epoch: 0031 train_loss= 2.05715 train_acc= 0.16981 val_loss= 2.05007 val_acc= 0.24138 time= 0.00000
Early stopping...
Optimization Finished!
Test set results: cost= 2.10666 accuracy= 0.08475 time= 0.01563 
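The "Early stopping..." line above is consistent with a patience-based check on validation loss: `val_loss` reaches its minimum at epoch 26 and training halts after five consecutive non-improving epochs. The actual trainer that produced this log is not shown; below is a minimal hypothetical sketch of that stopping rule, assuming a patience of 5 epochs (the function name and the use of a precomputed loss list are illustrative, not the original code).

```python
def train_with_early_stopping(val_losses, patience=5):
    """Return the 1-based epoch at which training stops.

    `val_losses` stands in for the per-epoch validation losses a real
    training loop would compute; here we simply iterate over them.
    Training stops once the validation loss has failed to improve on
    its best value for `patience` consecutive epochs.
    """
    best = float("inf")   # best validation loss seen so far
    bad_epochs = 0        # consecutive epochs without improvement
    for epoch, val_loss in enumerate(val_losses, start=1):
        if val_loss < best:
            best = val_loss
            bad_epochs = 0
        else:
            bad_epochs += 1
        if bad_epochs >= patience:
            print("Early stopping...")
            return epoch
    return len(val_losses)
```

Applied to the validation losses in the log, this rule would fire exactly at epoch 31: epochs 27-31 each fail to beat the epoch-26 minimum of 2.04876.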
