Epoch: 0001 train_loss= 0.70095 train_acc= 0.52182 val_loss= 0.69685 val_acc= 0.55738 time= 0.25003
Epoch: 0002 train_loss= 0.69791 train_acc= 0.51091 val_loss= 0.69414 val_acc= 0.55738 time= 0.01562
Epoch: 0003 train_loss= 0.69580 train_acc= 0.51273 val_loss= 0.69222 val_acc= 0.55738 time= 0.01563
Epoch: 0004 train_loss= 0.69423 train_acc= 0.51091 val_loss= 0.69096 val_acc= 0.55738 time= 0.01563
Epoch: 0005 train_loss= 0.69338 train_acc= 0.51455 val_loss= 0.69045 val_acc= 0.55738 time= 0.00000
Epoch: 0006 train_loss= 0.69303 train_acc= 0.51091 val_loss= 0.69031 val_acc= 0.55738 time= 0.01563
Epoch: 0007 train_loss= 0.69259 train_acc= 0.52000 val_loss= 0.69025 val_acc= 0.55738 time= 0.01563
Epoch: 0008 train_loss= 0.69272 train_acc= 0.52000 val_loss= 0.69026 val_acc= 0.55738 time= 0.00000
Epoch: 0009 train_loss= 0.69272 train_acc= 0.52545 val_loss= 0.69012 val_acc= 0.55738 time= 0.01563
Epoch: 0010 train_loss= 0.69265 train_acc= 0.51455 val_loss= 0.68991 val_acc= 0.55738 time= 0.01563
Epoch: 0011 train_loss= 0.69260 train_acc= 0.52545 val_loss= 0.68978 val_acc= 0.55738 time= 0.00000
Epoch: 0012 train_loss= 0.69246 train_acc= 0.52545 val_loss= 0.68966 val_acc= 0.55738 time= 0.01563
Epoch: 0013 train_loss= 0.69228 train_acc= 0.52364 val_loss= 0.68960 val_acc= 0.55738 time= 0.01563
Epoch: 0014 train_loss= 0.69216 train_acc= 0.52364 val_loss= 0.68967 val_acc= 0.55738 time= 0.00000
Epoch: 0015 train_loss= 0.69154 train_acc= 0.55091 val_loss= 0.68956 val_acc= 0.55738 time= 0.01563
Epoch: 0016 train_loss= 0.69196 train_acc= 0.52909 val_loss= 0.68966 val_acc= 0.55738 time= 0.01563
Epoch: 0017 train_loss= 0.69155 train_acc= 0.55273 val_loss= 0.68974 val_acc= 0.55738 time= 0.01563
Epoch: 0018 train_loss= 0.69103 train_acc= 0.53636 val_loss= 0.68965 val_acc= 0.55738 time= 0.00000
Epoch: 0019 train_loss= 0.69157 train_acc= 0.59273 val_loss= 0.68912 val_acc= 0.55738 time= 0.01563
Epoch: 0020 train_loss= 0.69094 train_acc= 0.57455 val_loss= 0.68856 val_acc= 0.55738 time= 0.01563
Epoch: 0021 train_loss= 0.69098 train_acc= 0.53818 val_loss= 0.68836 val_acc= 0.55738 time= 0.01563
Epoch: 0022 train_loss= 0.69013 train_acc= 0.57818 val_loss= 0.68795 val_acc= 0.55738 time= 0.00000
Epoch: 0023 train_loss= 0.69072 train_acc= 0.53455 val_loss= 0.68799 val_acc= 0.55738 time= 0.01563
Epoch: 0024 train_loss= 0.69082 train_acc= 0.54909 val_loss= 0.68840 val_acc= 0.55738 time= 0.01563
Epoch: 0025 train_loss= 0.68970 train_acc= 0.58545 val_loss= 0.68854 val_acc= 0.55738 time= 0.01563
Epoch: 0026 train_loss= 0.68962 train_acc= 0.57273 val_loss= 0.68839 val_acc= 0.55738 time= 0.00000
Epoch: 0027 train_loss= 0.68976 train_acc= 0.56364 val_loss= 0.68842 val_acc= 0.55738 time= 0.01563
Epoch: 0028 train_loss= 0.68920 train_acc= 0.56909 val_loss= 0.68819 val_acc= 0.55738 time= 0.01563
Epoch: 0029 train_loss= 0.68958 train_acc= 0.62364 val_loss= 0.68689 val_acc= 0.55738 time= 0.00000
Epoch: 0030 train_loss= 0.68940 train_acc= 0.55636 val_loss= 0.68665 val_acc= 0.55738 time= 0.01563
Epoch: 0031 train_loss= 0.68975 train_acc= 0.54545 val_loss= 0.68730 val_acc= 0.55738 time= 0.01563
Epoch: 0032 train_loss= 0.68869 train_acc= 0.59273 val_loss= 0.68821 val_acc= 0.63934 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 0.68897 accuracy= 0.62295 time= 0.00000 
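The run above halts with "Early stopping..." once validation loss stops improving. A common form of this rule (used, for example, in the GCN reference implementation) stops when the current validation loss exceeds the mean of the previous `patience` epochs. The sketch below illustrates that rule only; the loss schedule, `patience` value, and print format are illustrative assumptions, not taken from the run logged above.

```python
import time

def synthetic_val_loss(epoch):
    """Placeholder validation loss: improves for 16 epochs, then degrades.
    Stands in for a real model evaluation purely to exercise the stop rule."""
    return 0.69 - 0.001 * epoch if epoch <= 15 else 0.70

def train_with_early_stopping(num_epochs=200, patience=10):
    """Dummy training loop printing one log line per epoch, stopping early
    once the current validation loss exceeds the mean of the previous
    `patience` validation losses."""
    val_losses = []
    for epoch in range(num_epochs):
        t = time.time()
        val_loss = synthetic_val_loss(epoch)  # real code: evaluate the model
        val_losses.append(val_loss)
        print(f"Epoch: {epoch + 1:04d} val_loss= {val_loss:.5f} "
              f"time= {time.time() - t:.5f}")
        # Stop if val loss is worse than the mean of the last `patience` epochs.
        if epoch > patience and val_losses[-1] > (
                sum(val_losses[-(patience + 1):-1]) / patience):
            print("Early stopping...")
            break
    print("Optimization Finished!")
    return epoch + 1  # number of epochs actually run

train_with_early_stopping()
```

With the synthetic schedule here the criterion fires at epoch 17 (the first epoch after the loss turns upward that clears the trailing-window mean); in the run logged above, whatever criterion the script used fired after epoch 32.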
