Epoch: 0001 train_loss= 2.07996 train_acc= 0.14016 val_loss= 2.06779 val_acc= 0.17241 time= 0.81040
Epoch: 0002 train_loss= 2.07823 train_acc= 0.13747 val_loss= 2.06248 val_acc= 0.17241 time= 0.00000
Epoch: 0003 train_loss= 2.07126 train_acc= 0.15903 val_loss= 2.05753 val_acc= 0.17241 time= 0.01563
Epoch: 0004 train_loss= 2.07185 train_acc= 0.15633 val_loss= 2.05289 val_acc= 0.17241 time= 0.00000
Epoch: 0005 train_loss= 2.07069 train_acc= 0.15633 val_loss= 2.04837 val_acc= 0.17241 time= 0.00000
Epoch: 0006 train_loss= 2.06768 train_acc= 0.15903 val_loss= 2.04391 val_acc= 0.17241 time= 0.01563
Epoch: 0007 train_loss= 2.06654 train_acc= 0.16173 val_loss= 2.03950 val_acc= 0.17241 time= 0.00000
Epoch: 0008 train_loss= 2.06499 train_acc= 0.16173 val_loss= 2.03520 val_acc= 0.17241 time= 0.01563
Epoch: 0009 train_loss= 2.06139 train_acc= 0.16173 val_loss= 2.03097 val_acc= 0.17241 time= 0.00000
Epoch: 0010 train_loss= 2.06249 train_acc= 0.16173 val_loss= 2.02697 val_acc= 0.17241 time= 0.00000
Epoch: 0011 train_loss= 2.05892 train_acc= 0.16173 val_loss= 2.02308 val_acc= 0.17241 time= 0.01563
Epoch: 0012 train_loss= 2.05980 train_acc= 0.16173 val_loss= 2.01942 val_acc= 0.17241 time= 0.00000
Epoch: 0013 train_loss= 2.05478 train_acc= 0.16173 val_loss= 2.01556 val_acc= 0.17241 time= 0.00000
Epoch: 0014 train_loss= 2.05494 train_acc= 0.16173 val_loss= 2.01132 val_acc= 0.17241 time= 0.01563
Epoch: 0015 train_loss= 2.05727 train_acc= 0.16173 val_loss= 2.00709 val_acc= 0.17241 time= 0.00000
Epoch: 0016 train_loss= 2.05294 train_acc= 0.16173 val_loss= 2.00305 val_acc= 0.17241 time= 0.00000
Epoch: 0017 train_loss= 2.05379 train_acc= 0.16173 val_loss= 1.99900 val_acc= 0.17241 time= 0.01562
Epoch: 0018 train_loss= 2.05321 train_acc= 0.15903 val_loss= 1.99505 val_acc= 0.17241 time= 0.00000
Epoch: 0019 train_loss= 2.04774 train_acc= 0.16173 val_loss= 1.99134 val_acc= 0.17241 time= 0.01563
Epoch: 0020 train_loss= 2.05101 train_acc= 0.16712 val_loss= 1.98784 val_acc= 0.17241 time= 0.00000
Epoch: 0021 train_loss= 2.05059 train_acc= 0.15633 val_loss= 1.98487 val_acc= 0.17241 time= 0.00000
Epoch: 0022 train_loss= 2.04863 train_acc= 0.15633 val_loss= 1.98216 val_acc= 0.17241 time= 0.01563
Epoch: 0023 train_loss= 2.04949 train_acc= 0.16981 val_loss= 1.97995 val_acc= 0.17241 time= 0.00000
Epoch: 0024 train_loss= 2.04937 train_acc= 0.15903 val_loss= 1.97802 val_acc= 0.17241 time= 0.00000
Epoch: 0025 train_loss= 2.04963 train_acc= 0.15094 val_loss= 1.97617 val_acc= 0.17241 time= 0.01563
Epoch: 0026 train_loss= 2.04838 train_acc= 0.17520 val_loss= 1.97461 val_acc= 0.17241 time= 0.00000
Epoch: 0027 train_loss= 2.05075 train_acc= 0.15364 val_loss= 1.97374 val_acc= 0.17241 time= 0.00000
Epoch: 0028 train_loss= 2.05186 train_acc= 0.14555 val_loss= 1.97341 val_acc= 0.17241 time= 0.01563
Epoch: 0029 train_loss= 2.05090 train_acc= 0.15903 val_loss= 1.97377 val_acc= 0.17241 time= 0.00000
Epoch: 0030 train_loss= 2.04443 train_acc= 0.17790 val_loss= 1.97428 val_acc= 0.17241 time= 0.00000
Epoch: 0031 train_loss= 2.05247 train_acc= 0.15364 val_loss= 1.97536 val_acc= 0.17241 time= 0.01563
Epoch: 0032 train_loss= 2.05246 train_acc= 0.16712 val_loss= 1.97683 val_acc= 0.17241 time= 0.00000
Early stopping...
Optimization Finished!
Test set results: cost= 2.04409 accuracy= 0.11864 time= 0.00000 
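The log above is consistent with a patience-based early-stopping criterion: the validation loss bottoms out around epoch 28 (1.97341) and then rises for several consecutive epochs before training halts at epoch 32. A minimal sketch of that kind of loop is below; the `patience=4` value, the `step` callback, and the function name are illustrative assumptions, not taken from the actual training code.

```python
def train_with_early_stopping(step, max_epochs=200, patience=4):
    """Run `step(epoch)` until val_loss stops improving.

    `step` is a placeholder for one epoch of training + validation;
    it must return the epoch's validation loss. Returns the epoch
    at which training stopped.
    """
    best_val = float("inf")
    epochs_without_improvement = 0
    epoch = 0
    for epoch in range(1, max_epochs + 1):
        val_loss = step(epoch)
        if val_loss < best_val:
            # New best validation loss: reset the patience counter.
            best_val = val_loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            print("Early stopping...")
            break
    print("Optimization Finished!")
    return epoch
```

With this criterion, a validation loss that improves through epoch N and then worsens for `patience` epochs stops training at epoch N + `patience`, matching the minimum-at-28, stop-at-32 pattern in the log.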
