Epoch: 0001 train_loss= 1.53582 train_acc= 0.54545 val_loss= 1.27896 val_acc= 0.34426 time= 0.07813
Epoch: 0002 train_loss= 1.78462 train_acc= 0.53636 val_loss= 2.08211 val_acc= 0.36066 time= 0.01563
Epoch: 0003 train_loss= 1.91210 train_acc= 0.43636 val_loss= 2.11650 val_acc= 0.36066 time= 0.00000
Epoch: 0004 train_loss= 1.28595 train_acc= 0.51818 val_loss= 2.24620 val_acc= 0.36066 time= 0.01563
Epoch: 0005 train_loss= 1.11015 train_acc= 0.48788 val_loss= 2.17045 val_acc= 0.36066 time= 0.01563
Epoch: 0006 train_loss= 0.84309 train_acc= 0.46061 val_loss= 1.96879 val_acc= 0.34426 time= 0.01563
Epoch: 0007 train_loss= 2.81680 train_acc= 0.45758 val_loss= 1.60744 val_acc= 0.34426 time= 0.01563
Epoch: 0008 train_loss= 1.00634 train_acc= 0.50000 val_loss= 1.42043 val_acc= 0.34426 time= 0.00000
Epoch: 0009 train_loss= 1.74568 train_acc= 0.46667 val_loss= 1.05846 val_acc= 0.34426 time= 0.01563
Epoch: 0010 train_loss= 1.53407 train_acc= 0.46667 val_loss= 0.79720 val_acc= 0.44262 time= 0.01563
Epoch: 0011 train_loss= 0.76332 train_acc= 0.53636 val_loss= 0.79367 val_acc= 0.62295 time= 0.01562
Epoch: 0012 train_loss= 1.02516 train_acc= 0.51515 val_loss= 0.86833 val_acc= 0.63934 time= 0.00000
Epoch: 0013 train_loss= 1.26327 train_acc= 0.53030 val_loss= 0.93066 val_acc= 0.63934 time= 0.01563
Epoch: 0014 train_loss= 1.34422 train_acc= 0.55758 val_loss= 0.93092 val_acc= 0.63934 time= 0.01563
Epoch: 0015 train_loss= 1.65250 train_acc= 0.53939 val_loss= 0.89931 val_acc= 0.63934 time= 0.01563
Epoch: 0016 train_loss= 0.78668 train_acc= 0.52121 val_loss= 0.85904 val_acc= 0.63934 time= 0.00000
Epoch: 0017 train_loss= 1.02598 train_acc= 0.55152 val_loss= 0.80605 val_acc= 0.63934 time= 0.01563
Epoch: 0018 train_loss= 0.77785 train_acc= 0.56061 val_loss= 0.76953 val_acc= 0.62295 time= 0.01563
Epoch: 0019 train_loss= 0.79214 train_acc= 0.50303 val_loss= 0.76436 val_acc= 0.60656 time= 0.01563
Epoch: 0020 train_loss= 0.76884 train_acc= 0.50303 val_loss= 0.78345 val_acc= 0.44262 time= 0.01562
Epoch: 0021 train_loss= 1.14776 train_acc= 0.48788 val_loss= 0.79508 val_acc= 0.44262 time= 0.00000
Epoch: 0022 train_loss= 0.79328 train_acc= 0.51212 val_loss= 0.79645 val_acc= 0.44262 time= 0.01563
Epoch: 0023 train_loss= 0.82376 train_acc= 0.54848 val_loss= 0.81094 val_acc= 0.45902 time= 0.01563
Epoch: 0024 train_loss= 1.32987 train_acc= 0.50000 val_loss= 0.78503 val_acc= 0.42623 time= 0.01563
Epoch: 0025 train_loss= 1.53420 train_acc= 0.54545 val_loss= 0.77534 val_acc= 0.44262 time= 0.01563
Epoch: 0026 train_loss= 0.86151 train_acc= 0.49394 val_loss= 0.77282 val_acc= 0.47541 time= 0.00000
Epoch: 0027 train_loss= 1.11568 train_acc= 0.45152 val_loss= 0.75043 val_acc= 0.47541 time= 0.01563
Epoch: 0028 train_loss= 0.80889 train_acc= 0.52727 val_loss= 0.74598 val_acc= 0.44262 time= 0.01563
Epoch: 0029 train_loss= 0.90343 train_acc= 0.48485 val_loss= 0.72545 val_acc= 0.42623 time= 0.01563
Epoch: 0030 train_loss= 1.11252 train_acc= 0.44848 val_loss= 0.69275 val_acc= 0.59016 time= 0.01563
Epoch: 0031 train_loss= 0.81447 train_acc= 0.50000 val_loss= 0.67924 val_acc= 0.63934 time= 0.00000
Epoch: 0032 train_loss= 0.88907 train_acc= 0.49394 val_loss= 0.67991 val_acc= 0.62295 time= 0.01562
Epoch: 0033 train_loss= 1.43752 train_acc= 0.53333 val_loss= 0.68445 val_acc= 0.62295 time= 0.01563
Epoch: 0034 train_loss= 0.84444 train_acc= 0.52121 val_loss= 0.69324 val_acc= 0.62295 time= 0.01562
Epoch: 0035 train_loss= 0.73878 train_acc= 0.47879 val_loss= 0.70298 val_acc= 0.63934 time= 0.00000
Epoch: 0036 train_loss= 0.70802 train_acc= 0.50606 val_loss= 0.71314 val_acc= 0.65574 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 0.71550 accuracy= 0.50820 time= 0.01563 
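
The run above stops at epoch 36 even though max epochs was presumably higher, because validation loss stopped improving (best val_loss 0.67924 at epoch 31, then five non-improving epochs). A minimal sketch of a training loop that produces this kind of log, using patience-based early stopping — one common variant; the actual script may use a different criterion, and `train_step` / `evaluate` are hypothetical stand-ins for the real model code:

```python
import time


def train_with_early_stopping(train_step, evaluate, max_epochs=200, patience=10):
    """Run `train_step`/`evaluate` each epoch; stop when val_loss has not
    improved for `patience` consecutive epochs.

    train_step() -> (train_loss, train_acc)
    evaluate()   -> (val_loss, val_acc)
    Both callables are assumptions standing in for the actual model code.
    """
    best_val = float("inf")
    epochs_since_improve = 0
    history = []
    for epoch in range(1, max_epochs + 1):
        t0 = time.time()
        train_loss, train_acc = train_step()
        val_loss, val_acc = evaluate()
        history.append((train_loss, train_acc, val_loss, val_acc))
        # Mirror the log format above.
        print(f"Epoch: {epoch:04d} train_loss= {train_loss:.5f} "
              f"train_acc= {train_acc:.5f} val_loss= {val_loss:.5f} "
              f"val_acc= {val_acc:.5f} time= {time.time() - t0:.5f}")
        if val_loss < best_val:
            best_val = val_loss
            epochs_since_improve = 0
        else:
            epochs_since_improve += 1
        if epochs_since_improve >= patience:
            print("Early stopping...")
            break
    print("Optimization Finished!")
    return history
```

Note that the final "Test set results" line should be computed only once, after training ends, on a held-out split never used for the stopping decision; stopping on validation loss while reporting test accuracy keeps the test set unbiased.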
