Epoch: 0001 train_loss= 2.08722 train_acc= 0.13208 val_loss= 2.08474 val_acc= 0.17241 time= 0.35939
Epoch: 0002 train_loss= 2.08497 train_acc= 0.15633 val_loss= 2.08249 val_acc= 0.17241 time= 0.01563
Epoch: 0003 train_loss= 2.08299 train_acc= 0.15903 val_loss= 2.08021 val_acc= 0.17241 time= 0.01563
Epoch: 0004 train_loss= 2.08130 train_acc= 0.15903 val_loss= 2.07798 val_acc= 0.17241 time= 0.00000
Epoch: 0005 train_loss= 2.07989 train_acc= 0.16712 val_loss= 2.07577 val_acc= 0.17241 time= 0.01563
Epoch: 0006 train_loss= 2.07878 train_acc= 0.13208 val_loss= 2.07357 val_acc= 0.17241 time= 0.01563
Epoch: 0007 train_loss= 2.07786 train_acc= 0.14016 val_loss= 2.07123 val_acc= 0.17241 time= 0.00000
Epoch: 0008 train_loss= 2.07691 train_acc= 0.14286 val_loss= 2.06880 val_acc= 0.17241 time= 0.01563
Epoch: 0009 train_loss= 2.07592 train_acc= 0.14016 val_loss= 2.06601 val_acc= 0.17241 time= 0.02009
Epoch: 0010 train_loss= 2.07515 train_acc= 0.17251 val_loss= 2.06293 val_acc= 0.17241 time= 0.01000
Epoch: 0011 train_loss= 2.07381 train_acc= 0.15094 val_loss= 2.05947 val_acc= 0.17241 time= 0.00000
Epoch: 0012 train_loss= 2.07237 train_acc= 0.16173 val_loss= 2.05563 val_acc= 0.17241 time= 0.01568
Epoch: 0013 train_loss= 2.07207 train_acc= 0.15364 val_loss= 2.05148 val_acc= 0.17241 time= 0.01563
Epoch: 0014 train_loss= 2.07044 train_acc= 0.15633 val_loss= 2.04708 val_acc= 0.17241 time= 0.00000
Epoch: 0015 train_loss= 2.06953 train_acc= 0.17251 val_loss= 2.04244 val_acc= 0.17241 time= 0.01563
Epoch: 0016 train_loss= 2.06850 train_acc= 0.15364 val_loss= 2.03770 val_acc= 0.17241 time= 0.01563
Epoch: 0017 train_loss= 2.06712 train_acc= 0.15903 val_loss= 2.03310 val_acc= 0.17241 time= 0.01563
Epoch: 0018 train_loss= 2.06601 train_acc= 0.15633 val_loss= 2.02871 val_acc= 0.17241 time= 0.00000
Epoch: 0019 train_loss= 2.06573 train_acc= 0.14555 val_loss= 2.02470 val_acc= 0.17241 time= 0.01563
Epoch: 0020 train_loss= 2.06527 train_acc= 0.16981 val_loss= 2.02127 val_acc= 0.17241 time= 0.01563
Epoch: 0021 train_loss= 2.06397 train_acc= 0.17520 val_loss= 2.01852 val_acc= 0.17241 time= 0.00000
Epoch: 0022 train_loss= 2.06436 train_acc= 0.16442 val_loss= 2.01649 val_acc= 0.17241 time= 0.01563
Epoch: 0023 train_loss= 2.06307 train_acc= 0.16442 val_loss= 2.01527 val_acc= 0.17241 time= 0.01563
Epoch: 0024 train_loss= 2.06284 train_acc= 0.16712 val_loss= 2.01472 val_acc= 0.17241 time= 0.00000
Epoch: 0025 train_loss= 2.06235 train_acc= 0.18059 val_loss= 2.01481 val_acc= 0.17241 time= 0.01563
Epoch: 0026 train_loss= 2.06142 train_acc= 0.15364 val_loss= 2.01529 val_acc= 0.17241 time= 0.01563
Epoch: 0027 train_loss= 2.06158 train_acc= 0.15364 val_loss= 2.01594 val_acc= 0.17241 time= 0.01563
Epoch: 0028 train_loss= 2.06127 train_acc= 0.16173 val_loss= 2.01669 val_acc= 0.17241 time= 0.00000
Epoch: 0029 train_loss= 2.06167 train_acc= 0.15094 val_loss= 2.01708 val_acc= 0.17241 time= 0.01563
Epoch: 0030 train_loss= 2.05971 train_acc= 0.16173 val_loss= 2.01740 val_acc= 0.17241 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 2.08194 accuracy= 0.18644 time= 0.00000 
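The validation loss above bottoms out at epoch 0024 (2.01472), creeps upward for the next six epochs, and the run halts at epoch 0030. One common rule consistent with that behavior is to stop once the newest validation loss exceeds the mean of the previous N validation losses. The sketch below is an assumption, not the script that produced this log: the function name `should_stop` and the `window=10` patience value are hypothetical, and the loss list is simply transcribed from the `val_loss` column above. With these logged values, the rule fires at epoch 30, matching the log.

```python
import statistics

def should_stop(val_losses, window=10):
    """Hypothetical early-stopping rule: stop when the newest validation
    loss exceeds the mean of the `window` losses immediately before it.
    Both the name and window size are assumptions, not from the log."""
    if len(val_losses) <= window:
        return False
    recent = val_losses[-(window + 1):-1]  # the `window` losses before the newest
    return val_losses[-1] > statistics.mean(recent)

# Validation losses transcribed from the log above (epochs 0001-0030).
val_losses = [
    2.08474, 2.08249, 2.08021, 2.07798, 2.07577, 2.07357, 2.07123, 2.06880,
    2.06601, 2.06293, 2.05947, 2.05563, 2.05148, 2.04708, 2.04244, 2.03770,
    2.03310, 2.02871, 2.02470, 2.02127, 2.01852, 2.01649, 2.01527, 2.01472,
    2.01481, 2.01529, 2.01594, 2.01669, 2.01708, 2.01740,
]

for epoch in range(1, len(val_losses) + 1):
    if should_stop(val_losses[:epoch]):
        print(f"Early stopping at epoch {epoch:04d}")  # fires at epoch 0030
        break
```

Note the flat `val_acc= 0.17241` throughout: accuracy alone would give no stopping signal here, which is why a loss-based criterion like the one sketched above is the natural fit for this run.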
