Epoch: 0001 train_loss= 1.23686 train_acc= 0.49697 val_loss= 0.76657 val_acc= 0.55738 time= 0.32904
Epoch: 0002 train_loss= 1.05385 train_acc= 0.51818 val_loss= 0.76344 val_acc= 0.54098 time= 0.01563
Epoch: 0003 train_loss= 0.89217 train_acc= 0.49697 val_loss= 0.78936 val_acc= 0.54098 time= 0.01562
Epoch: 0004 train_loss= 1.03781 train_acc= 0.56667 val_loss= 0.74185 val_acc= 0.50820 time= 0.03125
Epoch: 0005 train_loss= 2.01701 train_acc= 0.46364 val_loss= 0.70222 val_acc= 0.52459 time= 0.01563
Epoch: 0006 train_loss= 1.14523 train_acc= 0.53030 val_loss= 0.78302 val_acc= 0.42623 time= 0.01563
Epoch: 0007 train_loss= 0.85572 train_acc= 0.54545 val_loss= 0.88405 val_acc= 0.42623 time= 0.03125
Epoch: 0008 train_loss= 1.29825 train_acc= 0.53030 val_loss= 0.92556 val_acc= 0.42623 time= 0.01563
Epoch: 0009 train_loss= 1.23062 train_acc= 0.53333 val_loss= 0.87691 val_acc= 0.42623 time= 0.01563
Epoch: 0010 train_loss= 1.27344 train_acc= 0.46061 val_loss= 0.86495 val_acc= 0.42623 time= 0.03125
Epoch: 0011 train_loss= 1.03926 train_acc= 0.56970 val_loss= 0.83330 val_acc= 0.42623 time= 0.01562
Epoch: 0012 train_loss= 0.71085 train_acc= 0.54545 val_loss= 0.80079 val_acc= 0.42623 time= 0.03125
Epoch: 0013 train_loss= 0.80982 train_acc= 0.56667 val_loss= 0.76216 val_acc= 0.42623 time= 0.01563
Epoch: 0014 train_loss= 0.73856 train_acc= 0.55455 val_loss= 0.74545 val_acc= 0.42623 time= 0.03125
Epoch: 0015 train_loss= 0.85784 train_acc= 0.54848 val_loss= 0.73740 val_acc= 0.39344 time= 0.01563
Epoch: 0016 train_loss= 0.69877 train_acc= 0.52727 val_loss= 0.72998 val_acc= 0.42623 time= 0.03125
Epoch: 0017 train_loss= 0.82734 train_acc= 0.52727 val_loss= 0.72643 val_acc= 0.44262 time= 0.01563
Epoch: 0018 train_loss= 0.68672 train_acc= 0.60000 val_loss= 0.72526 val_acc= 0.44262 time= 0.03125
Epoch: 0019 train_loss= 0.94310 train_acc= 0.56970 val_loss= 0.72336 val_acc= 0.42623 time= 0.01563
Epoch: 0020 train_loss= 0.75511 train_acc= 0.55455 val_loss= 0.72089 val_acc= 0.44262 time= 0.01563
Epoch: 0021 train_loss= 0.97775 train_acc= 0.53939 val_loss= 0.71854 val_acc= 0.44262 time= 0.03125
Epoch: 0022 train_loss= 0.71278 train_acc= 0.53939 val_loss= 0.71698 val_acc= 0.44262 time= 0.01563
Epoch: 0023 train_loss= 0.68069 train_acc= 0.58485 val_loss= 0.71567 val_acc= 0.44262 time= 0.01563
Epoch: 0024 train_loss= 0.69302 train_acc= 0.56667 val_loss= 0.71327 val_acc= 0.44262 time= 0.01563
Epoch: 0025 train_loss= 1.13621 train_acc= 0.55455 val_loss= 0.71211 val_acc= 0.44262 time= 0.03125
Epoch: 0026 train_loss= 0.68710 train_acc= 0.57879 val_loss= 0.71150 val_acc= 0.44262 time= 0.01563
Epoch: 0027 train_loss= 0.94033 train_acc= 0.57576 val_loss= 0.71058 val_acc= 0.44262 time= 0.01563
Epoch: 0028 train_loss= 0.68330 train_acc= 0.55455 val_loss= 0.71016 val_acc= 0.44262 time= 0.01563
Epoch: 0029 train_loss= 0.68628 train_acc= 0.55455 val_loss= 0.71023 val_acc= 0.44262 time= 0.03125
Epoch: 0030 train_loss= 0.68708 train_acc= 0.56061 val_loss= 0.71085 val_acc= 0.44262 time= 0.01562
Epoch: 0031 train_loss= 0.68286 train_acc= 0.53636 val_loss= 0.71169 val_acc= 0.44262 time= 0.01563
Epoch: 0032 train_loss= 0.68002 train_acc= 0.56364 val_loss= 0.71242 val_acc= 0.44262 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 0.69375 accuracy= 0.60656 time= 0.01563 
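The log above ends with "Early stopping..." because the validation loss stopped improving (it plateaus around 0.71 from epoch 25 onward while val_acc is stuck at 0.44262). A loop producing output in this exact format typically uses patience-based early stopping: track the best validation loss seen so far and abort once it has not improved for a fixed number of consecutive epochs. The sketch below is hypothetical, not the original training script; `run_epoch`, the `patience` value, and the synthetic `fake_epoch` metrics are all assumptions for illustration.

```python
import time

def train(run_epoch, max_epochs=200, patience=10):
    """Generic training driver with patience-based early stopping.

    `run_epoch(epoch)` must return a dict with keys 'train_loss',
    'train_acc', 'val_loss', 'val_acc'. Training stops once val_loss
    has not improved for `patience` consecutive epochs. This is a
    hypothetical sketch of the loop behind the log, not the original.
    """
    best_val, wait, history = float("inf"), 0, []
    for epoch in range(1, max_epochs + 1):
        t0 = time.time()
        m = run_epoch(epoch)          # one pass of train + validation
        history.append(m)
        print("Epoch: %04d train_loss= %.5f train_acc= %.5f "
              "val_loss= %.5f val_acc= %.5f time= %.5f"
              % (epoch, m["train_loss"], m["train_acc"],
                 m["val_loss"], m["val_acc"], time.time() - t0))
        if m["val_loss"] < best_val:  # strict improvement resets patience
            best_val, wait = m["val_loss"], 0
        else:
            wait += 1
            if wait >= patience:
                print("Early stopping...")
                break
    print("Optimization Finished!")
    return history

# Synthetic metrics: val_loss improves until epoch 20, then plateaus,
# so the stop triggers `patience` epochs later (here, at epoch 30).
def fake_epoch(epoch):
    val = 0.9 - 0.01 * min(epoch, 20)
    return {"train_loss": 1.0 / epoch, "train_acc": 0.5,
            "val_loss": val, "val_acc": 0.44}

history = train(fake_epoch, max_epochs=100, patience=10)
```

With these synthetic metrics the driver runs 30 epochs: the last improvement is at epoch 20, and 10 non-improving epochs exhaust the patience budget. A real run would restore the best checkpoint before evaluating on the test set.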
