Epoch: 0001 train_loss= 0.70112 train_acc= 0.52182 val_loss= 0.69029 val_acc= 0.55738 time= 0.53066
Epoch: 0002 train_loss= 0.69610 train_acc= 0.52545 val_loss= 0.69062 val_acc= 0.55738 time= 0.01100
Epoch: 0003 train_loss= 0.69600 train_acc= 0.52364 val_loss= 0.69099 val_acc= 0.55738 time= 0.00000
Epoch: 0004 train_loss= 0.69555 train_acc= 0.52364 val_loss= 0.69160 val_acc= 0.55738 time= 0.00000
Epoch: 0005 train_loss= 0.69685 train_acc= 0.52545 val_loss= 0.69209 val_acc= 0.55738 time= 0.01563
Epoch: 0006 train_loss= 0.69620 train_acc= 0.54000 val_loss= 0.69214 val_acc= 0.55738 time= 0.00000
Epoch: 0007 train_loss= 0.69580 train_acc= 0.50909 val_loss= 0.69197 val_acc= 0.55738 time= 0.00000
Epoch: 0008 train_loss= 0.69405 train_acc= 0.54182 val_loss= 0.69143 val_acc= 0.55738 time= 0.01563
Epoch: 0009 train_loss= 0.69453 train_acc= 0.53818 val_loss= 0.69085 val_acc= 0.55738 time= 0.00000
Epoch: 0010 train_loss= 0.69370 train_acc= 0.52364 val_loss= 0.69011 val_acc= 0.55738 time= 0.00000
Epoch: 0011 train_loss= 0.69534 train_acc= 0.51091 val_loss= 0.68942 val_acc= 0.55738 time= 0.01563
Epoch: 0012 train_loss= 0.69307 train_acc= 0.53455 val_loss= 0.68874 val_acc= 0.55738 time= 0.00000
Epoch: 0013 train_loss= 0.69380 train_acc= 0.52364 val_loss= 0.68809 val_acc= 0.55738 time= 0.00000
Epoch: 0014 train_loss= 0.69295 train_acc= 0.52909 val_loss= 0.68754 val_acc= 0.55738 time= 0.01563
Epoch: 0015 train_loss= 0.69279 train_acc= 0.52364 val_loss= 0.68715 val_acc= 0.55738 time= 0.01810
Epoch: 0016 train_loss= 0.69332 train_acc= 0.51636 val_loss= 0.68689 val_acc= 0.55738 time= 0.00706
Epoch: 0017 train_loss= 0.69229 train_acc= 0.53091 val_loss= 0.68665 val_acc= 0.55738 time= 0.00000
Epoch: 0018 train_loss= 0.69250 train_acc= 0.52727 val_loss= 0.68642 val_acc= 0.55738 time= 0.01563
Epoch: 0019 train_loss= 0.69059 train_acc= 0.52364 val_loss= 0.68623 val_acc= 0.55738 time= 0.00000
Epoch: 0020 train_loss= 0.69192 train_acc= 0.53818 val_loss= 0.68609 val_acc= 0.55738 time= 0.01563
Epoch: 0021 train_loss= 0.69290 train_acc= 0.52727 val_loss= 0.68596 val_acc= 0.55738 time= 0.00000
Epoch: 0022 train_loss= 0.69302 train_acc= 0.52909 val_loss= 0.68588 val_acc= 0.55738 time= 0.01563
Epoch: 0023 train_loss= 0.69061 train_acc= 0.52909 val_loss= 0.68583 val_acc= 0.55738 time= 0.00000
Epoch: 0024 train_loss= 0.69118 train_acc= 0.53455 val_loss= 0.68576 val_acc= 0.55738 time= 0.01563
Epoch: 0025 train_loss= 0.69120 train_acc= 0.52909 val_loss= 0.68568 val_acc= 0.55738 time= 0.00000
Epoch: 0026 train_loss= 0.69196 train_acc= 0.52545 val_loss= 0.68568 val_acc= 0.55738 time= 0.01563
Epoch: 0027 train_loss= 0.69070 train_acc= 0.53455 val_loss= 0.68571 val_acc= 0.55738 time= 0.00000
Epoch: 0028 train_loss= 0.69204 train_acc= 0.52364 val_loss= 0.68577 val_acc= 0.55738 time= 0.00000
Epoch: 0029 train_loss= 0.69155 train_acc= 0.52000 val_loss= 0.68583 val_acc= 0.55738 time= 0.01563
Epoch: 0030 train_loss= 0.69121 train_acc= 0.53273 val_loss= 0.68590 val_acc= 0.55738 time= 0.00000
Early stopping...
Optimization Finished!
Test set results: cost= 0.69842 accuracy= 0.46721 time= 0.00000 
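The log above stops at epoch 30 after val_loss bottoms out around epoch 25–26 and then drifts upward. A minimal sketch of patience-based early stopping — one common criterion that produces this kind of behavior; the function name, signature, and `patience` value are illustrative assumptions, not taken from the actual training script:

```python
# Hypothetical patience-based early stopping: halt once val_loss has not
# improved for `patience` consecutive epochs. All names are illustrative.

def early_stop_epoch(val_losses, patience=5):
    """Return the 1-indexed epoch at which training would stop,
    or len(val_losses) if the criterion never triggers."""
    best = float("inf")
    bad_epochs = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best = loss        # new best validation loss: reset the counter
            bad_epochs = 0
        else:
            bad_epochs += 1    # no improvement this epoch
        if bad_epochs >= patience:
            return epoch       # patience exhausted: stop here
    return len(val_losses)
```

Other variants compare the current val_loss against the mean of the last `patience` epochs rather than the best-so-far value; which one a given script uses changes exactly when training halts.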
