Epoch: 0001 train_loss= 2.08348 train_acc= 0.14340 val_loss= 2.08214 val_acc= 0.20690 time= 0.52999
Epoch: 0002 train_loss= 2.08204 train_acc= 0.14340 val_loss= 2.08117 val_acc= 0.20690 time= 0.01562
Epoch: 0003 train_loss= 2.08165 train_acc= 0.14340 val_loss= 2.08026 val_acc= 0.20690 time= 0.00000
Epoch: 0004 train_loss= 2.08059 train_acc= 0.14340 val_loss= 2.07936 val_acc= 0.20690 time= 0.00000
Epoch: 0005 train_loss= 2.07847 train_acc= 0.14340 val_loss= 2.07844 val_acc= 0.20690 time= 0.01563
Epoch: 0006 train_loss= 2.07792 train_acc= 0.14340 val_loss= 2.07753 val_acc= 0.20690 time= 0.00000
Epoch: 0007 train_loss= 2.07694 train_acc= 0.14340 val_loss= 2.07676 val_acc= 0.20690 time= 0.00000
Epoch: 0008 train_loss= 2.07695 train_acc= 0.13962 val_loss= 2.07601 val_acc= 0.20690 time= 0.01563
Epoch: 0009 train_loss= 2.07538 train_acc= 0.15094 val_loss= 2.07529 val_acc= 0.20690 time= 0.00000
Epoch: 0010 train_loss= 2.07499 train_acc= 0.13962 val_loss= 2.07460 val_acc= 0.20690 time= 0.00000
Epoch: 0011 train_loss= 2.07354 train_acc= 0.13208 val_loss= 2.07391 val_acc= 0.20690 time= 0.01563
Epoch: 0012 train_loss= 2.07244 train_acc= 0.15094 val_loss= 2.07319 val_acc= 0.20690 time= 0.00000
Epoch: 0013 train_loss= 2.07116 train_acc= 0.14717 val_loss= 2.07250 val_acc= 0.20690 time= 0.01563
Epoch: 0014 train_loss= 2.07027 train_acc= 0.14717 val_loss= 2.07179 val_acc= 0.20690 time= 0.00000
Epoch: 0015 train_loss= 2.07061 train_acc= 0.13962 val_loss= 2.07110 val_acc= 0.20690 time= 0.00000
Epoch: 0016 train_loss= 2.06812 train_acc= 0.14340 val_loss= 2.07042 val_acc= 0.20690 time= 0.01563
Epoch: 0017 train_loss= 2.06891 train_acc= 0.14340 val_loss= 2.06979 val_acc= 0.20690 time= 0.00000
Epoch: 0018 train_loss= 2.06692 train_acc= 0.13585 val_loss= 2.06914 val_acc= 0.17241 time= 0.00000
Epoch: 0019 train_loss= 2.06729 train_acc= 0.13962 val_loss= 2.06848 val_acc= 0.20690 time= 0.01563
Epoch: 0020 train_loss= 2.06499 train_acc= 0.16226 val_loss= 2.06786 val_acc= 0.17241 time= 0.00000
Epoch: 0021 train_loss= 2.06447 train_acc= 0.16226 val_loss= 2.06736 val_acc= 0.17241 time= 0.00000
Epoch: 0022 train_loss= 2.06501 train_acc= 0.17736 val_loss= 2.06695 val_acc= 0.17241 time= 0.01563
Epoch: 0023 train_loss= 2.06536 train_acc= 0.17736 val_loss= 2.06656 val_acc= 0.17241 time= 0.00000
Epoch: 0024 train_loss= 2.06172 train_acc= 0.17736 val_loss= 2.06629 val_acc= 0.17241 time= 0.00000
Epoch: 0025 train_loss= 2.06324 train_acc= 0.18113 val_loss= 2.06612 val_acc= 0.17241 time= 0.01563
Epoch: 0026 train_loss= 2.06227 train_acc= 0.18113 val_loss= 2.06606 val_acc= 0.17241 time= 0.00000
Epoch: 0027 train_loss= 2.06248 train_acc= 0.18113 val_loss= 2.06610 val_acc= 0.17241 time= 0.00000
Epoch: 0028 train_loss= 2.05857 train_acc= 0.18113 val_loss= 2.06629 val_acc= 0.17241 time= 0.00000
Epoch: 0029 train_loss= 2.05993 train_acc= 0.18113 val_loss= 2.06655 val_acc= 0.17241 time= 0.00000
Epoch: 0030 train_loss= 2.06021 train_acc= 0.17736 val_loss= 2.06688 val_acc= 0.17241 time= 0.00000
Early stopping...
Optimization Finished!
Test set results: cost= 2.04606 accuracy= 0.11864 time= 0.01563 
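Reading the log above: the validation loss falls monotonically until it bottoms out at epoch 26 (2.06606), then creeps back up through epoch 30, at which point "Early stopping..." fires. A minimal sketch of one stopping rule consistent with this trajectory is below; the 10-epoch window and the mean-of-window criterion are assumptions for illustration, not settings stated anywhere in the log.

```python
from statistics import mean

def should_stop(cost_val, patience=10):
    """Return True once the newest validation loss exceeds the mean of
    the preceding `patience` epochs' validation losses.

    `cost_val` is the list of per-epoch validation losses so far.
    `patience=10` is an assumed hyperparameter; the log alone does not
    reveal the exact value or criterion used.
    """
    if len(cost_val) <= patience:
        # Not enough history yet to form a comparison window.
        return False
    window = cost_val[-(patience + 1):-1]  # the `patience` epochs before the latest
    return cost_val[-1] > mean(window)

# Usage: call once per epoch with the accumulated history, e.g.
#   if should_stop(cost_val):
#       print("Early stopping...")
#       break
```

With the 30 val_loss values printed above, this rule stays False while the loss is still declining and first returns True at epoch 30, matching where the log actually stops; that agreement is what motivates the sketch, though other patience values or min-based criteria could produce the same behavior.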
