Epoch: 0001 train_loss= 0.82550 train_acc= 0.48788 val_loss= 0.71115 val_acc= 0.47541 time= 0.09375
Epoch: 0002 train_loss= 0.80747 train_acc= 0.49697 val_loss= 0.70694 val_acc= 0.54098 time= 0.00000
Epoch: 0003 train_loss= 1.07331 train_acc= 0.47879 val_loss= 0.72894 val_acc= 0.49180 time= 0.01563
Epoch: 0004 train_loss= 0.93788 train_acc= 0.45152 val_loss= 0.81882 val_acc= 0.52459 time= 0.01563
Epoch: 0005 train_loss= 0.77828 train_acc= 0.52727 val_loss= 0.91346 val_acc= 0.49180 time= 0.01563
Epoch: 0006 train_loss= 1.06530 train_acc= 0.51515 val_loss= 1.05603 val_acc= 0.50820 time= 0.01563
Epoch: 0007 train_loss= 1.12463 train_acc= 0.50303 val_loss= 1.10693 val_acc= 0.50820 time= 0.01563
Epoch: 0008 train_loss= 0.78844 train_acc= 0.51212 val_loss= 1.12432 val_acc= 0.50820 time= 0.00000
Epoch: 0009 train_loss= 1.21996 train_acc= 0.53636 val_loss= 1.08320 val_acc= 0.50820 time= 0.01563
Epoch: 0010 train_loss= 1.01127 train_acc= 0.50606 val_loss= 1.01746 val_acc= 0.50820 time= 0.01563
Epoch: 0011 train_loss= 0.93542 train_acc= 0.52424 val_loss= 0.94403 val_acc= 0.50820 time= 0.01563
Epoch: 0012 train_loss= 1.13655 train_acc= 0.51515 val_loss= 0.86163 val_acc= 0.49180 time= 0.01563
Epoch: 0013 train_loss= 0.74030 train_acc= 0.52121 val_loss= 0.79972 val_acc= 0.52459 time= 0.00000
Epoch: 0014 train_loss= 0.90345 train_acc= 0.46364 val_loss= 0.76525 val_acc= 0.52459 time= 0.01563
Epoch: 0015 train_loss= 0.82216 train_acc= 0.50303 val_loss= 0.73562 val_acc= 0.50820 time= 0.01563
Epoch: 0016 train_loss= 0.71488 train_acc= 0.49697 val_loss= 0.71802 val_acc= 0.50820 time= 0.01563
Epoch: 0017 train_loss= 0.77265 train_acc= 0.50909 val_loss= 0.70763 val_acc= 0.52459 time= 0.01563
Epoch: 0018 train_loss= 0.76717 train_acc= 0.49697 val_loss= 0.70378 val_acc= 0.54098 time= 0.00000
Epoch: 0019 train_loss= 0.79043 train_acc= 0.49091 val_loss= 0.70540 val_acc= 0.49180 time= 0.01563
Epoch: 0020 train_loss= 0.75173 train_acc= 0.52121 val_loss= 0.70962 val_acc= 0.52459 time= 0.01563
Epoch: 0021 train_loss= 0.78419 train_acc= 0.48788 val_loss= 0.71236 val_acc= 0.55738 time= 0.01562
Epoch: 0022 train_loss= 0.71046 train_acc= 0.52121 val_loss= 0.71467 val_acc= 0.55738 time= 0.01563
Epoch: 0023 train_loss= 0.75353 train_acc= 0.50909 val_loss= 0.71600 val_acc= 0.55738 time= 0.01563
Epoch: 0024 train_loss= 1.08587 train_acc= 0.49394 val_loss= 0.71243 val_acc= 0.57377 time= 0.00000
Epoch: 0025 train_loss= 0.70739 train_acc= 0.54545 val_loss= 0.70882 val_acc= 0.55738 time= 0.01563
Epoch: 0026 train_loss= 0.85587 train_acc= 0.47879 val_loss= 0.70490 val_acc= 0.52459 time= 0.01562
Epoch: 0027 train_loss= 0.79302 train_acc= 0.47273 val_loss= 0.70121 val_acc= 0.50820 time= 0.01563
Epoch: 0028 train_loss= 0.82230 train_acc= 0.50606 val_loss= 0.69787 val_acc= 0.49180 time= 0.01563
Epoch: 0029 train_loss= 0.72810 train_acc= 0.45758 val_loss= 0.69657 val_acc= 0.54098 time= 0.00000
Epoch: 0030 train_loss= 0.78810 train_acc= 0.50909 val_loss= 0.69688 val_acc= 0.52459 time= 0.01563
Epoch: 0031 train_loss= 0.75703 train_acc= 0.50000 val_loss= 0.69869 val_acc= 0.52459 time= 0.01563
Epoch: 0032 train_loss= 0.80694 train_acc= 0.49394 val_loss= 0.70217 val_acc= 0.49180 time= 0.01563
Epoch: 0033 train_loss= 0.69896 train_acc= 0.51818 val_loss= 0.70636 val_acc= 0.50820 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 0.74309 accuracy= 0.54918 time= 0.00000 
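A loop producing a log like the one above typically implements early stopping on validation loss: training halts once `val_loss` has failed to improve for a fixed number of consecutive epochs. A minimal sketch of that logic follows; the patience value, function names, and the synthetic loss sequence are illustrative assumptions, not values recovered from this run.

```python
class EarlyStopping:
    """Stop training when val_loss has not improved for `patience` epochs."""

    def __init__(self, patience=4):
        self.patience = patience
        self.best = float("inf")   # best validation loss seen so far
        self.bad_epochs = 0        # consecutive epochs without improvement

    def step(self, val_loss):
        """Record one epoch's validation loss; return True when training should stop."""
        if val_loss < self.best:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience


def train(val_losses, patience=4):
    """Run through per-epoch validation losses, stopping early if triggered.

    Returns the epoch number at which training stopped.
    """
    stopper = EarlyStopping(patience)
    for epoch, val_loss in enumerate(val_losses, start=1):
        print(f"Epoch: {epoch:04d} val_loss= {val_loss:.5f}")
        if stopper.step(val_loss):
            print("Early stopping...")
            return epoch
    return len(val_losses)


if __name__ == "__main__":
    # Synthetic curve: improves, then worsens for `patience` epochs -> stops early.
    train([0.9, 0.8, 0.7, 0.69, 0.70, 0.71, 0.72, 0.73, 0.60])
    print("Optimization Finished!")
```

In the log above, `val_loss` bottoms out at 0.69657 (epoch 29) and then rises through epoch 33 before "Early stopping..." fires, which is consistent with a small patience window of this kind, though the actual criterion used by the script is not visible from the log.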
