Epoch: 0001 train_loss= 0.70108 train_acc= 0.46061 val_loss= 0.69816 val_acc= 0.55738 time= 0.18751
Epoch: 0002 train_loss= 0.69775 train_acc= 0.56667 val_loss= 0.69552 val_acc= 0.52459 time= 0.00000
Epoch: 0003 train_loss= 0.69539 train_acc= 0.53333 val_loss= 0.69352 val_acc= 0.52459 time= 0.01563
Epoch: 0004 train_loss= 0.69360 train_acc= 0.53333 val_loss= 0.69190 val_acc= 0.54098 time= 0.00000
Epoch: 0005 train_loss= 0.69159 train_acc= 0.53939 val_loss= 0.69067 val_acc= 0.55738 time= 0.01563
Epoch: 0006 train_loss= 0.69016 train_acc= 0.55758 val_loss= 0.68985 val_acc= 0.55738 time= 0.00000
Epoch: 0007 train_loss= 0.68927 train_acc= 0.55455 val_loss= 0.68927 val_acc= 0.57377 time= 0.01563
Epoch: 0008 train_loss= 0.68861 train_acc= 0.55455 val_loss= 0.68887 val_acc= 0.57377 time= 0.00000
Epoch: 0009 train_loss= 0.68735 train_acc= 0.57273 val_loss= 0.68856 val_acc= 0.59016 time= 0.01563
Epoch: 0010 train_loss= 0.68703 train_acc= 0.59394 val_loss= 0.68825 val_acc= 0.59016 time= 0.01563
Epoch: 0011 train_loss= 0.68688 train_acc= 0.58788 val_loss= 0.68792 val_acc= 0.59016 time= 0.00000
Epoch: 0012 train_loss= 0.68620 train_acc= 0.56667 val_loss= 0.68763 val_acc= 0.59016 time= 0.01563
Epoch: 0013 train_loss= 0.68564 train_acc= 0.57273 val_loss= 0.68734 val_acc= 0.59016 time= 0.00000
Epoch: 0014 train_loss= 0.68344 train_acc= 0.58182 val_loss= 0.68708 val_acc= 0.59016 time= 0.01563
Epoch: 0015 train_loss= 0.68262 train_acc= 0.60606 val_loss= 0.68676 val_acc= 0.60656 time= 0.00000
Epoch: 0016 train_loss= 0.68193 train_acc= 0.62727 val_loss= 0.68635 val_acc= 0.60656 time= 0.01563
Epoch: 0017 train_loss= 0.68104 train_acc= 0.62424 val_loss= 0.68597 val_acc= 0.60656 time= 0.00000
Epoch: 0018 train_loss= 0.68162 train_acc= 0.60000 val_loss= 0.68573 val_acc= 0.60656 time= 0.01563
Epoch: 0019 train_loss= 0.67879 train_acc= 0.61212 val_loss= 0.68553 val_acc= 0.60656 time= 0.00000
Epoch: 0020 train_loss= 0.67881 train_acc= 0.60000 val_loss= 0.68543 val_acc= 0.59016 time= 0.01563
Epoch: 0021 train_loss= 0.67619 train_acc= 0.66970 val_loss= 0.68523 val_acc= 0.59016 time= 0.01563
Epoch: 0022 train_loss= 0.67745 train_acc= 0.61818 val_loss= 0.68518 val_acc= 0.60656 time= 0.00000
Epoch: 0023 train_loss= 0.67473 train_acc= 0.64545 val_loss= 0.68528 val_acc= 0.57377 time= 0.01563
Epoch: 0024 train_loss= 0.67265 train_acc= 0.65455 val_loss= 0.68537 val_acc= 0.59016 time= 0.00000
Epoch: 0025 train_loss= 0.67397 train_acc= 0.70000 val_loss= 0.68529 val_acc= 0.57377 time= 0.01562
Epoch: 0026 train_loss= 0.67434 train_acc= 0.70000 val_loss= 0.68495 val_acc= 0.57377 time= 0.00000
Epoch: 0027 train_loss= 0.66931 train_acc= 0.67576 val_loss= 0.68477 val_acc= 0.57377 time= 0.01563
Epoch: 0028 train_loss= 0.67128 train_acc= 0.69091 val_loss= 0.68442 val_acc= 0.57377 time= 0.00000
Epoch: 0029 train_loss= 0.66786 train_acc= 0.70606 val_loss= 0.68376 val_acc= 0.54098 time= 0.01563
Epoch: 0030 train_loss= 0.66619 train_acc= 0.66667 val_loss= 0.68357 val_acc= 0.55738 time= 0.00000
Epoch: 0031 train_loss= 0.67007 train_acc= 0.66061 val_loss= 0.68369 val_acc= 0.57377 time= 0.01563
Epoch: 0032 train_loss= 0.66763 train_acc= 0.68788 val_loss= 0.68362 val_acc= 0.63934 time= 0.00000
Epoch: 0033 train_loss= 0.66410 train_acc= 0.71212 val_loss= 0.68337 val_acc= 0.62295 time= 0.01563
Epoch: 0034 train_loss= 0.66106 train_acc= 0.71818 val_loss= 0.68326 val_acc= 0.62295 time= 0.00000
Epoch: 0035 train_loss= 0.66100 train_acc= 0.73636 val_loss= 0.68308 val_acc= 0.62295 time= 0.01563
Epoch: 0036 train_loss= 0.65779 train_acc= 0.68182 val_loss= 0.68361 val_acc= 0.63934 time= 0.00000
Epoch: 0037 train_loss= 0.65715 train_acc= 0.72727 val_loss= 0.68423 val_acc= 0.62295 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 0.68512 accuracy= 0.62295 time= 0.00000 
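
The "Early stopping..." trigger at epoch 37 is consistent with a window-mean criterion: stop once the newest validation loss exceeds the mean of the previous 10 validation losses (a rule used in some GCN training scripts). This is a minimal sketch under that assumption; `should_stop` and `window` are illustrative names, not taken from the code that produced this log.

```python
def should_stop(val_losses, window=10):
    """Window-mean early stopping: return True once the newest validation
    loss exceeds the mean of the previous `window` losses.

    Requires more than `window` epochs of history before it can fire.
    """
    if len(val_losses) <= window:
        return False
    recent = val_losses[-(window + 1):-1]  # the `window` losses before the newest
    return val_losses[-1] > sum(recent) / window
```

On the validation-loss trace above, this rule stays silent through the small bumps at epochs 23–25 and 31–32 (the newest loss is still below the trailing mean) and fires for the first time at epoch 37, matching the log.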
