Epoch: 0001 train_loss= 1.02186 train_acc= 0.50130 val_loss= 0.67210 val_acc= 0.57377 time= 1.01997
Epoch: 0002 train_loss= 1.32871 train_acc= 0.52597 val_loss= 0.75570 val_acc= 0.59016 time= 0.03125
Epoch: 0003 train_loss= 0.83575 train_acc= 0.51558 val_loss= 0.86227 val_acc= 0.57377 time= 0.01563
Epoch: 0004 train_loss= 1.14892 train_acc= 0.51169 val_loss= 0.85531 val_acc= 0.60656 time= 0.03125
Epoch: 0005 train_loss= 1.31109 train_acc= 0.48961 val_loss= 0.75180 val_acc= 0.65574 time= 0.01563
Epoch: 0006 train_loss= 0.77342 train_acc= 0.50130 val_loss= 0.71851 val_acc= 0.62295 time= 0.03125
Epoch: 0007 train_loss= 1.01167 train_acc= 0.49870 val_loss= 0.70080 val_acc= 0.54098 time= 0.01563
Epoch: 0008 train_loss= 0.74652 train_acc= 0.48182 val_loss= 0.70679 val_acc= 0.47541 time= 0.03125
Epoch: 0009 train_loss= 0.73447 train_acc= 0.50390 val_loss= 0.71152 val_acc= 0.45902 time= 0.01563
Epoch: 0010 train_loss= 0.80570 train_acc= 0.51299 val_loss= 0.71740 val_acc= 0.40984 time= 0.03125
Epoch: 0011 train_loss= 0.79216 train_acc= 0.49221 val_loss= 0.71931 val_acc= 0.44262 time= 0.01563
Epoch: 0012 train_loss= 0.73003 train_acc= 0.50649 val_loss= 0.71923 val_acc= 0.39344 time= 0.03125
Epoch: 0013 train_loss= 0.74997 train_acc= 0.50000 val_loss= 0.71944 val_acc= 0.44262 time= 0.03125
Epoch: 0014 train_loss= 0.72551 train_acc= 0.51688 val_loss= 0.71813 val_acc= 0.44262 time= 0.01563
Epoch: 0015 train_loss= 0.74209 train_acc= 0.48831 val_loss= 0.71557 val_acc= 0.42623 time= 0.01563
Epoch: 0016 train_loss= 0.75790 train_acc= 0.51429 val_loss= 0.71232 val_acc= 0.42623 time= 0.03125
Epoch: 0017 train_loss= 0.70578 train_acc= 0.50779 val_loss= 0.70917 val_acc= 0.44262 time= 0.01563
Epoch: 0018 train_loss= 0.74799 train_acc= 0.48442 val_loss= 0.70602 val_acc= 0.45902 time= 0.03125
Epoch: 0019 train_loss= 0.80016 train_acc= 0.49481 val_loss= 0.70301 val_acc= 0.44262 time= 0.01563
Epoch: 0020 train_loss= 0.70303 train_acc= 0.51039 val_loss= 0.70090 val_acc= 0.40984 time= 0.03125
Epoch: 0021 train_loss= 0.71818 train_acc= 0.51169 val_loss= 0.69934 val_acc= 0.50820 time= 0.01563
Epoch: 0022 train_loss= 0.75044 train_acc= 0.54026 val_loss= 0.69813 val_acc= 0.55738 time= 0.03125
Epoch: 0023 train_loss= 0.71409 train_acc= 0.51558 val_loss= 0.69722 val_acc= 0.57377 time= 0.01563
Epoch: 0024 train_loss= 0.70268 train_acc= 0.52338 val_loss= 0.69657 val_acc= 0.60656 time= 0.03125
Epoch: 0025 train_loss= 0.70615 train_acc= 0.52987 val_loss= 0.69618 val_acc= 0.60656 time= 0.01563
Epoch: 0026 train_loss= 0.73468 train_acc= 0.52468 val_loss= 0.69542 val_acc= 0.57377 time= 0.03125
Epoch: 0027 train_loss= 0.71568 train_acc= 0.52597 val_loss= 0.69476 val_acc= 0.57377 time= 0.01563
Epoch: 0028 train_loss= 0.70926 train_acc= 0.51039 val_loss= 0.69433 val_acc= 0.59016 time= 0.03125
Epoch: 0029 train_loss= 0.70767 train_acc= 0.51039 val_loss= 0.69400 val_acc= 0.59016 time= 0.01563
Epoch: 0030 train_loss= 0.70244 train_acc= 0.51688 val_loss= 0.69391 val_acc= 0.59016 time= 0.03125
Epoch: 0031 train_loss= 0.70189 train_acc= 0.51688 val_loss= 0.69410 val_acc= 0.54098 time= 0.01563
Epoch: 0032 train_loss= 0.69412 train_acc= 0.53117 val_loss= 0.69437 val_acc= 0.54098 time= 0.03125
Epoch: 0033 train_loss= 0.70491 train_acc= 0.51558 val_loss= 0.69492 val_acc= 0.55738 time= 0.01563
Epoch: 0034 train_loss= 0.69950 train_acc= 0.50130 val_loss= 0.69547 val_acc= 0.55738 time= 0.03125
Early stopping...
Optimization Finished!
Test set results: cost= 0.69747 accuracy= 0.47541 time= 0.01563 
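The log does not show which stopping criterion triggered "Early stopping..." at epoch 34, so the sketch below is an assumption: one common rule (used, for example, in Kipf & Welling's GCN reference training script) stops when the current validation loss exceeds the mean of the previous `patience` epochs' validation losses. The function name `should_stop` and the `patience` value are illustrative, not taken from the log.

```python
def should_stop(val_losses, patience=10):
    """Return True when the latest validation loss is worse than the
    mean of the preceding `patience` validation losses."""
    if len(val_losses) <= patience:
        return False  # not enough history yet
    recent = val_losses[-(patience + 1):-1]  # the `patience` epochs before the current one
    return val_losses[-1] > sum(recent) / len(recent)

# Validation losses transcribed from the log above (epochs 1-34):
val_losses = [0.67210, 0.75570, 0.86227, 0.85531, 0.75180, 0.71851,
              0.70080, 0.70679, 0.71152, 0.71740, 0.71931, 0.71923,
              0.71944, 0.71813, 0.71557, 0.71232, 0.70917, 0.70602,
              0.70301, 0.70090, 0.69934, 0.69813, 0.69722, 0.69657,
              0.69618, 0.69542, 0.69476, 0.69433, 0.69400, 0.69391,
              0.69410, 0.69437, 0.69492, 0.69547]
```

With `patience=10`, this rule first fires at epoch 34: the validation loss bottoms out at epoch 30 (0.69391) and then rises for four consecutive epochs, which is consistent with where the log stops.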
