Epoch: 0001 train_loss= 2.45052 train_acc= 0.45152 val_loss= 1.29846 val_acc= 0.42623 time= 0.12501
Epoch: 0002 train_loss= 1.45010 train_acc= 0.45455 val_loss= 0.98200 val_acc= 0.44262 time= 0.01563
Epoch: 0003 train_loss= 1.35432 train_acc= 0.45758 val_loss= 0.82341 val_acc= 0.49180 time= 0.01562
Epoch: 0004 train_loss= 1.04874 train_acc= 0.46364 val_loss= 0.78574 val_acc= 0.47541 time= 0.01563
Epoch: 0005 train_loss= 1.29692 train_acc= 0.50909 val_loss= 0.81718 val_acc= 0.50820 time= 0.00000
Epoch: 0006 train_loss= 1.14559 train_acc= 0.52727 val_loss= 0.83560 val_acc= 0.52459 time= 0.01563
Epoch: 0007 train_loss= 1.11484 train_acc= 0.46061 val_loss= 0.84526 val_acc= 0.54098 time= 0.01562
Epoch: 0008 train_loss= 0.90491 train_acc= 0.51212 val_loss= 0.85445 val_acc= 0.57377 time= 0.00000
Epoch: 0009 train_loss= 1.16390 train_acc= 0.50303 val_loss= 0.84395 val_acc= 0.57377 time= 0.01563
Epoch: 0010 train_loss= 1.00171 train_acc= 0.49091 val_loss= 0.82670 val_acc= 0.55738 time= 0.01563
Epoch: 0011 train_loss= 1.02508 train_acc= 0.51212 val_loss= 0.81142 val_acc= 0.55738 time= 0.00000
Epoch: 0012 train_loss= 0.81477 train_acc= 0.52424 val_loss= 0.79385 val_acc= 0.57377 time= 0.01563
Epoch: 0013 train_loss= 0.98758 train_acc= 0.50909 val_loss= 0.78444 val_acc= 0.57377 time= 0.01563
Epoch: 0014 train_loss= 1.14168 train_acc= 0.52121 val_loss= 0.77432 val_acc= 0.55738 time= 0.01563
Epoch: 0015 train_loss= 0.87575 train_acc= 0.48182 val_loss= 0.76723 val_acc= 0.52459 time= 0.00000
Epoch: 0016 train_loss= 0.84890 train_acc= 0.51212 val_loss= 0.76072 val_acc= 0.55738 time= 0.01563
Epoch: 0017 train_loss= 0.86303 train_acc= 0.48182 val_loss= 0.75347 val_acc= 0.57377 time= 0.01563
Epoch: 0018 train_loss= 0.97000 train_acc= 0.52727 val_loss= 0.74359 val_acc= 0.57377 time= 0.00000
Epoch: 0019 train_loss= 1.13295 train_acc= 0.49394 val_loss= 0.73096 val_acc= 0.57377 time= 0.01562
Epoch: 0020 train_loss= 0.82583 train_acc= 0.50606 val_loss= 0.72028 val_acc= 0.57377 time= 0.01563
Epoch: 0021 train_loss= 0.86230 train_acc= 0.52727 val_loss= 0.71170 val_acc= 0.54098 time= 0.01563
Epoch: 0022 train_loss= 0.78742 train_acc= 0.50000 val_loss= 0.70674 val_acc= 0.55738 time= 0.00000
Epoch: 0023 train_loss= 0.82984 train_acc= 0.50000 val_loss= 0.70433 val_acc= 0.55738 time= 0.01563
Epoch: 0024 train_loss= 0.78307 train_acc= 0.49394 val_loss= 0.70298 val_acc= 0.57377 time= 0.01563
Epoch: 0025 train_loss= 0.81262 train_acc= 0.46667 val_loss= 0.70352 val_acc= 0.57377 time= 0.01562
Epoch: 0026 train_loss= 0.97859 train_acc= 0.46061 val_loss= 0.70673 val_acc= 0.59016 time= 0.00000
Epoch: 0027 train_loss= 0.87631 train_acc= 0.50303 val_loss= 0.70864 val_acc= 0.55738 time= 0.01563
Epoch: 0028 train_loss= 0.80919 train_acc= 0.48788 val_loss= 0.71040 val_acc= 0.54098 time= 0.01563
Epoch: 0029 train_loss= 0.88031 train_acc= 0.51212 val_loss= 0.71245 val_acc= 0.55738 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 0.72803 accuracy= 0.50820 time= 0.00000 
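The log above shows patience-style early stopping: validation loss reaches its minimum around epoch 24 (0.70298), creeps upward for a few epochs, and training halts at epoch 29. A minimal sketch of a loop that produces this log format is below; the model, optimizer, and `run_epoch` metrics are hypothetical stand-ins (a synthetic loss curve replaces real training), and only the stopping logic and print format are meant to mirror the log.

```python
def run_epoch(epoch):
    # Stand-in for one pass of training + validation; returns synthetic
    # metrics whose validation loss dips near epoch 24 and then rises,
    # roughly as in the log above. Real code would train a model here.
    val_loss = 0.70 + 0.01 * abs(epoch - 24) / 5
    train_loss = 1.0 / epoch + 0.8
    return train_loss, 0.50, val_loss, 0.55, 0.0156


def train(max_epochs=200, patience=5):
    """Patience-based early stopping: halt once val_loss has failed to
    improve for `patience` consecutive epochs. Returns the last epoch run."""
    best_val = float("inf")
    epochs_without_improvement = 0
    epoch = 0
    for epoch in range(1, max_epochs + 1):
        train_loss, train_acc, val_loss, val_acc, elapsed = run_epoch(epoch)
        print(f"Epoch: {epoch:04d} train_loss= {train_loss:.5f} "
              f"train_acc= {train_acc:.5f} val_loss= {val_loss:.5f} "
              f"val_acc= {val_acc:.5f} time= {elapsed:.5f}")
        if val_loss < best_val:
            best_val = val_loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                print("Early stopping...")
                break
    print("Optimization Finished!")
    return epoch
```

With this synthetic curve and `patience=5`, the loop stops at epoch 29, matching the log; the zero `time=` entries in the original output are likely timer-granularity artifacts rather than truly instantaneous epochs.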
