Epoch: 0001 train_loss= 1.39781 train_acc= 0.23453 val_loss= 1.41595 val_acc= 0.10714 time= 0.26564
Epoch: 0002 train_loss= 1.39485 train_acc= 0.23453 val_loss= 1.41221 val_acc= 0.10714 time= 0.00000
Epoch: 0003 train_loss= 1.39210 train_acc= 0.23453 val_loss= 1.40869 val_acc= 0.10714 time= 0.01563
Epoch: 0004 train_loss= 1.39294 train_acc= 0.23779 val_loss= 1.40537 val_acc= 0.10714 time= 0.00000
Epoch: 0005 train_loss= 1.39047 train_acc= 0.23127 val_loss= 1.40221 val_acc= 0.10714 time= 0.01563
Epoch: 0006 train_loss= 1.38857 train_acc= 0.23779 val_loss= 1.39921 val_acc= 0.10714 time= 0.00000
Epoch: 0007 train_loss= 1.38846 train_acc= 0.23127 val_loss= 1.39634 val_acc= 0.10714 time= 0.00000
Epoch: 0008 train_loss= 1.38516 train_acc= 0.28013 val_loss= 1.39362 val_acc= 0.12500 time= 0.01563
Epoch: 0009 train_loss= 1.38464 train_acc= 0.28664 val_loss= 1.39103 val_acc= 0.30357 time= 0.00000
Epoch: 0010 train_loss= 1.38231 train_acc= 0.30293 val_loss= 1.38856 val_acc= 0.33929 time= 0.01563
Epoch: 0011 train_loss= 1.38144 train_acc= 0.32899 val_loss= 1.38615 val_acc= 0.33929 time= 0.00000
Epoch: 0012 train_loss= 1.38084 train_acc= 0.32573 val_loss= 1.38379 val_acc= 0.33929 time= 0.01562
Epoch: 0013 train_loss= 1.37973 train_acc= 0.32573 val_loss= 1.38148 val_acc= 0.33929 time= 0.00000
Epoch: 0014 train_loss= 1.37730 train_acc= 0.32573 val_loss= 1.37920 val_acc= 0.33929 time= 0.01770
Epoch: 0015 train_loss= 1.37700 train_acc= 0.32573 val_loss= 1.37695 val_acc= 0.33929 time= 0.00303
Epoch: 0016 train_loss= 1.37694 train_acc= 0.32573 val_loss= 1.37476 val_acc= 0.33929 time= 0.00000
Epoch: 0017 train_loss= 1.37524 train_acc= 0.32573 val_loss= 1.37263 val_acc= 0.33929 time= 0.01100
Epoch: 0018 train_loss= 1.37652 train_acc= 0.32573 val_loss= 1.37059 val_acc= 0.33929 time= 0.00000
Epoch: 0019 train_loss= 1.37228 train_acc= 0.32573 val_loss= 1.36864 val_acc= 0.33929 time= 0.01563
Epoch: 0020 train_loss= 1.37406 train_acc= 0.32573 val_loss= 1.36686 val_acc= 0.33929 time= 0.00000
Epoch: 0021 train_loss= 1.37261 train_acc= 0.32573 val_loss= 1.36523 val_acc= 0.33929 time= 0.01563
Epoch: 0022 train_loss= 1.37219 train_acc= 0.32573 val_loss= 1.36380 val_acc= 0.33929 time= 0.00000
Epoch: 0023 train_loss= 1.37057 train_acc= 0.32573 val_loss= 1.36260 val_acc= 0.33929 time= 0.01563
Epoch: 0024 train_loss= 1.37188 train_acc= 0.32573 val_loss= 1.36168 val_acc= 0.33929 time= 0.00000
Epoch: 0025 train_loss= 1.37173 train_acc= 0.32573 val_loss= 1.36097 val_acc= 0.33929 time= 0.00000
Epoch: 0026 train_loss= 1.37089 train_acc= 0.32573 val_loss= 1.36045 val_acc= 0.33929 time= 0.01563
Epoch: 0027 train_loss= 1.37202 train_acc= 0.32573 val_loss= 1.36019 val_acc= 0.33929 time= 0.00000
Epoch: 0028 train_loss= 1.37168 train_acc= 0.32573 val_loss= 1.36021 val_acc= 0.33929 time= 0.01563
Epoch: 0029 train_loss= 1.37270 train_acc= 0.32573 val_loss= 1.36036 val_acc= 0.33929 time= 0.00000
Epoch: 0030 train_loss= 1.37173 train_acc= 0.32573 val_loss= 1.36064 val_acc= 0.33929 time= 0.01563
Epoch: 0031 train_loss= 1.37097 train_acc= 0.32573 val_loss= 1.36094 val_acc= 0.33929 time= 0.00000
Epoch: 0032 train_loss= 1.37141 train_acc= 0.32573 val_loss= 1.36126 val_acc= 0.33929 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 1.39366 accuracy= 0.31858 time= 0.00000 
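The run above halts at epoch 32 with "Early stopping...", a few epochs after val_loss bottoms out near epoch 27 and begins to climb. A minimal sketch of a patience-based early-stopping loop consistent with that behavior is below; the exact stopping criterion used by the original script is an assumption, and `step`, `max_epochs`, and `patience` are hypothetical names.

```python
def train_with_early_stopping(step, max_epochs=200, patience=10):
    """Call step(epoch) -> val_loss each epoch; stop when val_loss
    has not improved for `patience` consecutive epochs.

    Returns the epoch at which training stopped.
    """
    best_val = float("inf")
    epochs_without_improvement = 0
    for epoch in range(1, max_epochs + 1):
        val_loss = step(epoch)
        if val_loss < best_val:
            # New best validation loss: reset the patience counter.
            best_val = val_loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                print("Early stopping...")
                return epoch
    return max_epochs
```

With this criterion, a validation loss that plateaus or rises for `patience` epochs in a row ends training, which matches the log stopping five epochs after the val_loss minimum if `patience` were 5; some scripts instead compare the current val_loss against the mean of the last few epochs, which produces similar behavior.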