Epoch: 0001 train_loss= 2.08716 train_acc= 0.13208 val_loss= 2.08376 val_acc= 0.06897 time= 0.35940
Epoch: 0002 train_loss= 2.08467 train_acc= 0.13208 val_loss= 2.08073 val_acc= 0.06897 time= 0.00000
Epoch: 0003 train_loss= 2.08253 train_acc= 0.13208 val_loss= 2.07791 val_acc= 0.06897 time= 0.01562
Epoch: 0004 train_loss= 2.08083 train_acc= 0.13208 val_loss= 2.07536 val_acc= 0.06897 time= 0.01563
Epoch: 0005 train_loss= 2.07922 train_acc= 0.13208 val_loss= 2.07305 val_acc= 0.06897 time= 0.00000
Epoch: 0006 train_loss= 2.07782 train_acc= 0.13208 val_loss= 2.07099 val_acc= 0.06897 time= 0.01563
Epoch: 0007 train_loss= 2.07702 train_acc= 0.13208 val_loss= 2.06916 val_acc= 0.06897 time= 0.01563
Epoch: 0008 train_loss= 2.07612 train_acc= 0.12938 val_loss= 2.06748 val_acc= 0.06897 time= 0.00000
Epoch: 0009 train_loss= 2.07523 train_acc= 0.12938 val_loss= 2.06584 val_acc= 0.06897 time= 0.01563
Epoch: 0010 train_loss= 2.07417 train_acc= 0.12938 val_loss= 2.06414 val_acc= 0.06897 time= 0.00000
Epoch: 0011 train_loss= 2.07334 train_acc= 0.15094 val_loss= 2.06241 val_acc= 0.17241 time= 0.01562
Epoch: 0012 train_loss= 2.07277 train_acc= 0.16173 val_loss= 2.06063 val_acc= 0.17241 time= 0.01563
Epoch: 0013 train_loss= 2.07256 train_acc= 0.16981 val_loss= 2.05865 val_acc= 0.17241 time= 0.00000
Epoch: 0014 train_loss= 2.07117 train_acc= 0.17520 val_loss= 2.05637 val_acc= 0.17241 time= 0.01563
Epoch: 0015 train_loss= 2.07150 train_acc= 0.17520 val_loss= 2.05383 val_acc= 0.17241 time= 0.01563
Epoch: 0016 train_loss= 2.07018 train_acc= 0.17520 val_loss= 2.05121 val_acc= 0.17241 time= 0.00000
Epoch: 0017 train_loss= 2.06948 train_acc= 0.17520 val_loss= 2.04850 val_acc= 0.17241 time= 0.01563
Epoch: 0018 train_loss= 2.06874 train_acc= 0.17520 val_loss= 2.04590 val_acc= 0.17241 time= 0.01563
Epoch: 0019 train_loss= 2.06918 train_acc= 0.17520 val_loss= 2.04350 val_acc= 0.17241 time= 0.00000
Epoch: 0020 train_loss= 2.06860 train_acc= 0.17520 val_loss= 2.04121 val_acc= 0.17241 time= 0.01563
Epoch: 0021 train_loss= 2.06806 train_acc= 0.17520 val_loss= 2.03918 val_acc= 0.17241 time= 0.01562
Epoch: 0022 train_loss= 2.06751 train_acc= 0.17520 val_loss= 2.03739 val_acc= 0.17241 time= 0.00000
Epoch: 0023 train_loss= 2.06687 train_acc= 0.17520 val_loss= 2.03599 val_acc= 0.17241 time= 0.01563
Epoch: 0024 train_loss= 2.06698 train_acc= 0.17520 val_loss= 2.03492 val_acc= 0.17241 time= 0.00000
Epoch: 0025 train_loss= 2.06590 train_acc= 0.17520 val_loss= 2.03408 val_acc= 0.17241 time= 0.01563
Epoch: 0026 train_loss= 2.06560 train_acc= 0.17520 val_loss= 2.03348 val_acc= 0.17241 time= 0.01563
Epoch: 0027 train_loss= 2.06570 train_acc= 0.17520 val_loss= 2.03322 val_acc= 0.17241 time= 0.00000
Epoch: 0028 train_loss= 2.06495 train_acc= 0.17520 val_loss= 2.03299 val_acc= 0.17241 time= 0.01563
Epoch: 0029 train_loss= 2.06516 train_acc= 0.17520 val_loss= 2.03286 val_acc= 0.17241 time= 0.01563
Epoch: 0030 train_loss= 2.06401 train_acc= 0.17520 val_loss= 2.03272 val_acc= 0.17241 time= 0.00000
Epoch: 0031 train_loss= 2.06389 train_acc= 0.17520 val_loss= 2.03275 val_acc= 0.17241 time= 0.01563
Epoch: 0032 train_loss= 2.06317 train_acc= 0.17520 val_loss= 2.03287 val_acc= 0.17241 time= 0.01563
Epoch: 0033 train_loss= 2.06320 train_acc= 0.17520 val_loss= 2.03294 val_acc= 0.17241 time= 0.00000
Epoch: 0034 train_loss= 2.06402 train_acc= 0.17520 val_loss= 2.03296 val_acc= 0.17241 time= 0.01563
Epoch: 0035 train_loss= 2.06397 train_acc= 0.17520 val_loss= 2.03311 val_acc= 0.17241 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 2.04787 accuracy= 0.11864 time= 0.00000 
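For reference, a log in this exact shape can be produced by a training loop that stops once the validation loss is no longer improving over a sliding window. The sketch below is a hypothetical reconstruction, not the actual script that generated this log: the stopping criterion (current `val_loss` above the mean of the last `patience` epochs), the `patience` value, and the function names are all assumptions.

```python
# Hypothetical early-stopping check and epoch logger matching the log format
# above. Criterion assumed: stop when the newest validation loss exceeds the
# mean of the previous `patience` validation losses.

def should_stop(val_losses, patience=10):
    """Return True once the newest val loss stops improving.

    val_losses: list of per-epoch validation losses, oldest first.
    Requires more than `patience` recorded epochs before it can trigger.
    """
    if len(val_losses) <= patience:
        return False
    recent = val_losses[-(patience + 1):-1]      # the `patience` epochs before the newest
    return val_losses[-1] > sum(recent) / len(recent)


def log_epoch(epoch, train_loss, train_acc, val_loss, val_acc, elapsed):
    """Print one epoch line in the same format as the log above."""
    print(f"Epoch: {epoch:04d} train_loss= {train_loss:.5f} "
          f"train_acc= {train_acc:.5f} val_loss= {val_loss:.5f} "
          f"val_acc= {val_acc:.5f} time= {elapsed:.5f}")
```

With this criterion, the run above would stop at epoch 35 because `val_loss` had been flat to slightly rising since its minimum around epoch 30, so the newest value finally exceeded the trailing-window mean.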
