Epoch: 0001 train_loss= 2.08470 train_acc= 0.09434 val_loss= 2.08537 val_acc= 0.13793 time= 0.28127
Epoch: 0002 train_loss= 2.08085 train_acc= 0.15094 val_loss= 2.08382 val_acc= 0.10345 time= 0.00000
Epoch: 0003 train_loss= 2.08154 train_acc= 0.16352 val_loss= 2.08225 val_acc= 0.10345 time= 0.00000
Epoch: 0004 train_loss= 2.08105 train_acc= 0.13208 val_loss= 2.08094 val_acc= 0.10345 time= 0.01562
Epoch: 0005 train_loss= 2.08105 train_acc= 0.13836 val_loss= 2.07971 val_acc= 0.10345 time= 0.00000
Epoch: 0006 train_loss= 2.07942 train_acc= 0.13836 val_loss= 2.07850 val_acc= 0.10345 time= 0.00000
Epoch: 0007 train_loss= 2.07818 train_acc= 0.14465 val_loss= 2.07718 val_acc= 0.10345 time= 0.01563
Epoch: 0008 train_loss= 2.07844 train_acc= 0.15723 val_loss= 2.07583 val_acc= 0.10345 time= 0.00000
Epoch: 0009 train_loss= 2.07799 train_acc= 0.13208 val_loss= 2.07449 val_acc= 0.10345 time= 0.00000
Epoch: 0010 train_loss= 2.07629 train_acc= 0.15094 val_loss= 2.07316 val_acc= 0.10345 time= 0.01563
Epoch: 0011 train_loss= 2.07763 train_acc= 0.13836 val_loss= 2.07189 val_acc= 0.10345 time= 0.00000
Epoch: 0012 train_loss= 2.07588 train_acc= 0.14465 val_loss= 2.07074 val_acc= 0.10345 time= 0.00000
Epoch: 0013 train_loss= 2.07342 train_acc= 0.15094 val_loss= 2.06961 val_acc= 0.10345 time= 0.01563
Epoch: 0014 train_loss= 2.07531 train_acc= 0.12579 val_loss= 2.06851 val_acc= 0.10345 time= 0.00000
Epoch: 0015 train_loss= 2.07449 train_acc= 0.11950 val_loss= 2.06743 val_acc= 0.10345 time= 0.00000
Epoch: 0016 train_loss= 2.07342 train_acc= 0.13208 val_loss= 2.06628 val_acc= 0.20690 time= 0.00000
Epoch: 0017 train_loss= 2.07271 train_acc= 0.13836 val_loss= 2.06521 val_acc= 0.20690 time= 0.01563
Epoch: 0018 train_loss= 2.07178 train_acc= 0.13836 val_loss= 2.06416 val_acc= 0.17241 time= 0.00000
Epoch: 0019 train_loss= 2.06967 train_acc= 0.18239 val_loss= 2.06320 val_acc= 0.13793 time= 0.00000
Epoch: 0020 train_loss= 2.07247 train_acc= 0.13208 val_loss= 2.06224 val_acc= 0.13793 time= 0.01563
Epoch: 0021 train_loss= 2.06982 train_acc= 0.15723 val_loss= 2.06144 val_acc= 0.13793 time= 0.00000
Epoch: 0022 train_loss= 2.06993 train_acc= 0.18239 val_loss= 2.06083 val_acc= 0.13793 time= 0.00000
Epoch: 0023 train_loss= 2.07092 train_acc= 0.15094 val_loss= 2.06040 val_acc= 0.13793 time= 0.01563
Epoch: 0024 train_loss= 2.06976 train_acc= 0.16352 val_loss= 2.05992 val_acc= 0.13793 time= 0.00000
Epoch: 0025 train_loss= 2.07044 train_acc= 0.14465 val_loss= 2.05947 val_acc= 0.13793 time= 0.00000
Epoch: 0026 train_loss= 2.06914 train_acc= 0.15723 val_loss= 2.05921 val_acc= 0.13793 time= 0.00000
Epoch: 0027 train_loss= 2.06914 train_acc= 0.15723 val_loss= 2.05923 val_acc= 0.13793 time= 0.01562
Epoch: 0028 train_loss= 2.06995 train_acc= 0.14465 val_loss= 2.05943 val_acc= 0.13793 time= 0.00000
Epoch: 0029 train_loss= 2.07182 train_acc= 0.15723 val_loss= 2.05972 val_acc= 0.13793 time= 0.00000
Epoch: 0030 train_loss= 2.07025 train_acc= 0.15723 val_loss= 2.06006 val_acc= 0.13793 time= 0.01563
Epoch: 0031 train_loss= 2.06795 train_acc= 0.15723 val_loss= 2.06055 val_acc= 0.13793 time= 0.00000
Early stopping...
Optimization Finished!
Test set results: cost= 2.06931 accuracy= 0.11864 time= 0.00000 
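
The run above halts at epoch 31 after validation loss has risen for five consecutive epochs (it bottoms out at 2.05921 on epoch 26, then climbs through epochs 27-31). The training script itself is not shown, so the following is only a minimal sketch of patience-based early stopping consistent with that behavior; the function name `train_with_early_stopping` and the `patience=5` setting are assumptions, not taken from the source.

```python
def train_with_early_stopping(val_losses, patience=5):
    """Return the 1-based epoch at which training stops.

    Hypothetical sketch (patience=5 assumed): stop once val_loss has
    failed to improve for `patience` consecutive epochs, mirroring the
    "Early stopping..." message in the log above.
    """
    best = float("inf")
    bad_epochs = 0
    for epoch, val_loss in enumerate(val_losses, start=1):
        if val_loss < best:
            best = val_loss      # new best: reset the patience counter
            bad_epochs = 0
        else:
            bad_epochs += 1      # no improvement this epoch
        if bad_epochs >= patience:
            print("Early stopping...")
            return epoch
    return len(val_losses)       # patience never exhausted

# Validation losses copied from the log: they fall monotonically until
# epoch 26, then rise for five straight epochs, triggering the stop.
losses = [2.08537, 2.08382, 2.08225, 2.08094, 2.07971, 2.07850, 2.07718,
          2.07583, 2.07449, 2.07316, 2.07189, 2.07074, 2.06961, 2.06851,
          2.06743, 2.06628, 2.06521, 2.06416, 2.06320, 2.06224, 2.06144,
          2.06083, 2.06040, 2.05992, 2.05947, 2.05921, 2.05923, 2.05943,
          2.05972, 2.06006, 2.06055]
print(train_with_early_stopping(losses))  # → 31
```

Note that early stopping monitors validation loss only; validation accuracy here is nearly flat (a single batch of 29 samples would explain the coarse 1/29 ≈ 0.03448 steps in val_acc), so it would be a poor stopping signal. The many `time= 0.00000` entries likely reflect a low-resolution wall clock rather than truly instantaneous epochs.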
