Epoch: 0001 train_loss= 0.69889 train_acc= 0.51455 val_loss= 0.69882 val_acc= 0.45902 time= 0.34906
Epoch: 0002 train_loss= 0.69871 train_acc= 0.49091 val_loss= 0.69852 val_acc= 0.45902 time= 0.02592
Epoch: 0003 train_loss= 0.69811 train_acc= 0.53091 val_loss= 0.69814 val_acc= 0.45902 time= 0.01400
Epoch: 0004 train_loss= 0.69741 train_acc= 0.53273 val_loss= 0.69773 val_acc= 0.45902 time= 0.01400
Epoch: 0005 train_loss= 0.69706 train_acc= 0.52909 val_loss= 0.69739 val_acc= 0.45902 time= 0.01107
Epoch: 0006 train_loss= 0.69684 train_acc= 0.52000 val_loss= 0.69705 val_acc= 0.45902 time= 0.01563
Epoch: 0007 train_loss= 0.69615 train_acc= 0.52909 val_loss= 0.69676 val_acc= 0.45902 time= 0.01563
Epoch: 0008 train_loss= 0.69571 train_acc= 0.52727 val_loss= 0.69652 val_acc= 0.45902 time= 0.00000
Epoch: 0009 train_loss= 0.69552 train_acc= 0.53091 val_loss= 0.69633 val_acc= 0.45902 time= 0.01563
Epoch: 0010 train_loss= 0.69528 train_acc= 0.53455 val_loss= 0.69616 val_acc= 0.45902 time= 0.01563
Epoch: 0011 train_loss= 0.69507 train_acc= 0.53091 val_loss= 0.69597 val_acc= 0.45902 time= 0.01563
Epoch: 0012 train_loss= 0.69463 train_acc= 0.52909 val_loss= 0.69580 val_acc= 0.45902 time= 0.00000
Epoch: 0013 train_loss= 0.69443 train_acc= 0.53091 val_loss= 0.69563 val_acc= 0.45902 time= 0.01563
Epoch: 0014 train_loss= 0.69441 train_acc= 0.53091 val_loss= 0.69545 val_acc= 0.45902 time= 0.01563
Epoch: 0015 train_loss= 0.69401 train_acc= 0.53091 val_loss= 0.69533 val_acc= 0.45902 time= 0.00000
Epoch: 0016 train_loss= 0.69404 train_acc= 0.52909 val_loss= 0.69522 val_acc= 0.45902 time= 0.01563
Epoch: 0017 train_loss= 0.69364 train_acc= 0.53273 val_loss= 0.69514 val_acc= 0.45902 time= 0.01563
Epoch: 0018 train_loss= 0.69388 train_acc= 0.52909 val_loss= 0.69514 val_acc= 0.45902 time= 0.00000
Epoch: 0019 train_loss= 0.69386 train_acc= 0.53091 val_loss= 0.69497 val_acc= 0.45902 time= 0.00000
Epoch: 0020 train_loss= 0.69311 train_acc= 0.53273 val_loss= 0.69493 val_acc= 0.45902 time= 0.01563
Epoch: 0021 train_loss= 0.69373 train_acc= 0.53091 val_loss= 0.69482 val_acc= 0.45902 time= 0.00000
Epoch: 0022 train_loss= 0.69357 train_acc= 0.52909 val_loss= 0.69463 val_acc= 0.45902 time= 0.01563
Epoch: 0023 train_loss= 0.69340 train_acc= 0.53091 val_loss= 0.69456 val_acc= 0.45902 time= 0.01563
Epoch: 0024 train_loss= 0.69342 train_acc= 0.53091 val_loss= 0.69439 val_acc= 0.45902 time= 0.00000
Epoch: 0025 train_loss= 0.69318 train_acc= 0.53091 val_loss= 0.69427 val_acc= 0.45902 time= 0.01563
Epoch: 0026 train_loss= 0.69328 train_acc= 0.52909 val_loss= 0.69418 val_acc= 0.45902 time= 0.01562
Epoch: 0027 train_loss= 0.69322 train_acc= 0.53091 val_loss= 0.69412 val_acc= 0.45902 time= 0.01563
Epoch: 0028 train_loss= 0.69326 train_acc= 0.53273 val_loss= 0.69407 val_acc= 0.45902 time= 0.00000
Epoch: 0029 train_loss= 0.69305 train_acc= 0.53091 val_loss= 0.69404 val_acc= 0.45902 time= 0.01563
Epoch: 0030 train_loss= 0.69291 train_acc= 0.53091 val_loss= 0.69406 val_acc= 0.45902 time= 0.00000
Epoch: 0031 train_loss= 0.69320 train_acc= 0.53091 val_loss= 0.69407 val_acc= 0.45902 time= 0.01563
Epoch: 0032 train_loss= 0.69300 train_acc= 0.53273 val_loss= 0.69411 val_acc= 0.45902 time= 0.01563
Epoch: 0033 train_loss= 0.69330 train_acc= 0.53091 val_loss= 0.69415 val_acc= 0.45902 time= 0.00000
Epoch: 0034 train_loss= 0.69282 train_acc= 0.53273 val_loss= 0.69427 val_acc= 0.45902 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 0.69409 accuracy= 0.48361 time= 0.01563 
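The run above halts once validation loss stops improving: val_loss bottoms out around epoch 29 (0.69404) and creeps back up through epoch 34, triggering "Early stopping...". A minimal sketch of that kind of patience-based check is below — the `patience` value and the `step` callback are illustrative assumptions, not the original training script:

```python
def train_with_early_stopping(step, max_epochs=200, patience=5):
    """Run step(epoch) -> (train_loss, val_loss) once per epoch.

    Stop when val_loss has not improved for `patience` consecutive
    epochs (an assumed criterion; the original script may instead
    compare against a moving average). Returns the last epoch run.
    """
    best_val = float("inf")
    wait = 0  # epochs since the last improvement
    for epoch in range(1, max_epochs + 1):
        train_loss, val_loss = step(epoch)
        if val_loss < best_val:
            best_val = val_loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                print("Early stopping...")
                break
    print("Optimization Finished!")
    return epoch
```

For example, feeding it a validation curve that bottoms out and then rises makes it stop `patience` epochs past the minimum rather than running to `max_epochs`.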
