Epoch: 0001 train_loss= 2.07590 train_acc= 0.19497 val_loss= 2.10528 val_acc= 0.10345 time= 0.23439
Epoch: 0002 train_loss= 2.07038 train_acc= 0.19497 val_loss= 2.10124 val_acc= 0.10345 time= 0.01915
Epoch: 0003 train_loss= 2.06886 train_acc= 0.19497 val_loss= 2.09793 val_acc= 0.10345 time= 0.00101
Epoch: 0004 train_loss= 2.06508 train_acc= 0.19497 val_loss= 2.09529 val_acc= 0.10345 time= 0.01150
Epoch: 0005 train_loss= 2.06472 train_acc= 0.19497 val_loss= 2.09316 val_acc= 0.10345 time= 0.00000
Epoch: 0006 train_loss= 2.05769 train_acc= 0.19497 val_loss= 2.09157 val_acc= 0.10345 time= 0.00000
Epoch: 0007 train_loss= 2.05819 train_acc= 0.19497 val_loss= 2.09011 val_acc= 0.10345 time= 0.01563
Epoch: 0008 train_loss= 2.05617 train_acc= 0.19497 val_loss= 2.08947 val_acc= 0.10345 time= 0.00000
Epoch: 0009 train_loss= 2.05532 train_acc= 0.19497 val_loss= 2.08927 val_acc= 0.10345 time= 0.01563
Epoch: 0010 train_loss= 2.05302 train_acc= 0.19497 val_loss= 2.08942 val_acc= 0.10345 time= 0.00000
Epoch: 0011 train_loss= 2.04958 train_acc= 0.19497 val_loss= 2.08996 val_acc= 0.10345 time= 0.00000
Epoch: 0012 train_loss= 2.05023 train_acc= 0.19497 val_loss= 2.09035 val_acc= 0.10345 time= 0.01563
Epoch: 0013 train_loss= 2.05450 train_acc= 0.19497 val_loss= 2.09035 val_acc= 0.10345 time= 0.00000
Epoch: 0014 train_loss= 2.05221 train_acc= 0.19497 val_loss= 2.08978 val_acc= 0.10345 time= 0.01563
Epoch: 0015 train_loss= 2.05136 train_acc= 0.19497 val_loss= 2.08926 val_acc= 0.10345 time= 0.00000
Epoch: 0016 train_loss= 2.05025 train_acc= 0.19497 val_loss= 2.08886 val_acc= 0.10345 time= 0.00000
Epoch: 0017 train_loss= 2.05260 train_acc= 0.19497 val_loss= 2.08801 val_acc= 0.10345 time= 0.01563
Epoch: 0018 train_loss= 2.04873 train_acc= 0.19497 val_loss= 2.08691 val_acc= 0.10345 time= 0.00000
Epoch: 0019 train_loss= 2.04713 train_acc= 0.19497 val_loss= 2.08601 val_acc= 0.10345 time= 0.00000
Epoch: 0020 train_loss= 2.04913 train_acc= 0.19497 val_loss= 2.08520 val_acc= 0.10345 time= 0.01563
Epoch: 0021 train_loss= 2.05128 train_acc= 0.19497 val_loss= 2.08420 val_acc= 0.10345 time= 0.00000
Epoch: 0022 train_loss= 2.05052 train_acc= 0.19497 val_loss= 2.08280 val_acc= 0.10345 time= 0.00000
Epoch: 0023 train_loss= 2.04614 train_acc= 0.20126 val_loss= 2.08159 val_acc= 0.10345 time= 0.01563
Epoch: 0024 train_loss= 2.04788 train_acc= 0.19497 val_loss= 2.08028 val_acc= 0.10345 time= 0.00000
Epoch: 0025 train_loss= 2.04793 train_acc= 0.18239 val_loss= 2.07934 val_acc= 0.10345 time= 0.00000
Epoch: 0026 train_loss= 2.04875 train_acc= 0.18868 val_loss= 2.07887 val_acc= 0.10345 time= 0.01563
Epoch: 0027 train_loss= 2.04871 train_acc= 0.19497 val_loss= 2.07863 val_acc= 0.10345 time= 0.00000
Epoch: 0028 train_loss= 2.04859 train_acc= 0.19497 val_loss= 2.07871 val_acc= 0.10345 time= 0.00000
Epoch: 0029 train_loss= 2.05079 train_acc= 0.17610 val_loss= 2.07905 val_acc= 0.10345 time= 0.01563
Epoch: 0030 train_loss= 2.04665 train_acc= 0.19497 val_loss= 2.07969 val_acc= 0.10345 time= 0.00000
Epoch: 0031 train_loss= 2.04795 train_acc= 0.19497 val_loss= 2.08044 val_acc= 0.10345 time= 0.00000
Early stopping...
Optimization Finished!
Test set results: cost= 2.03015 accuracy= 0.22034 time= 0.00000 
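The run above prints "Early stopping..." at epoch 31 even though the train loss is still drifting down, because the validation loss bottoms out around epoch 27 and then rises. A minimal sketch of one stopping rule consistent with this log, assuming a moving-average criterion with an assumed window of 10 (the actual training script's rule and hyperparameters are not shown here): halt once the current epoch's validation loss exceeds the mean of the previous `window` epochs' losses.

```python
# Hedged sketch of an early-stopping rule consistent with the log above
# (an assumption, not necessarily the script that produced it): stop once
# the current validation loss exceeds the mean of the previous `window`
# epochs' validation losses. `window=10` is an assumed hyperparameter.

def early_stop_epoch(val_losses, window=10):
    """Return the 1-based epoch at which training would halt."""
    # Only start checking once a full window of history exists.
    for epoch in range(window + 1, len(val_losses) + 1):
        prev = val_losses[epoch - 1 - window : epoch - 1]
        if val_losses[epoch - 1] > sum(prev) / window:
            return epoch  # corresponds to the "Early stopping..." line
    return len(val_losses)  # ran to completion without triggering

# Validation losses copied from the log (epochs 1-31):
VAL_LOSSES = [
    2.10528, 2.10124, 2.09793, 2.09529, 2.09316, 2.09157, 2.09011,
    2.08947, 2.08927, 2.08942, 2.08996, 2.09035, 2.09035, 2.08978,
    2.08926, 2.08886, 2.08801, 2.08691, 2.08601, 2.08520, 2.08420,
    2.08280, 2.08159, 2.08028, 2.07934, 2.07887, 2.07863, 2.07871,
    2.07905, 2.07969, 2.08044,
]
```

With the logged values, this rule fires at epoch 31 (2.08044 is above the mean of epochs 21-30), matching where the run stopped; a simple patience-style rule (stop after N epochs without a new best) would have fired much earlier, at the local plateau around epoch 13, so it does not fit this log.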
