Epoch: 0001 train_loss= 2.27046 train_acc= 0.26384 val_loss= 1.84582 val_acc= 0.26786 time= 0.34379
Epoch: 0002 train_loss= 3.50811 train_acc= 0.20521 val_loss= 2.17856 val_acc= 0.33929 time= 0.02000
Epoch: 0003 train_loss= 2.65950 train_acc= 0.25407 val_loss= 2.16934 val_acc= 0.33929 time= 0.02414
Epoch: 0004 train_loss= 2.69929 train_acc= 0.24756 val_loss= 1.96254 val_acc= 0.21429 time= 0.02119
Epoch: 0005 train_loss= 1.67362 train_acc= 0.25733 val_loss= 1.87302 val_acc= 0.23214 time= 0.02201
Epoch: 0006 train_loss= 2.15883 train_acc= 0.26059 val_loss= 1.72672 val_acc= 0.23214 time= 0.02081
Epoch: 0007 train_loss= 2.65411 train_acc= 0.24104 val_loss= 1.56728 val_acc= 0.25000 time= 0.02290
Epoch: 0008 train_loss= 1.99844 train_acc= 0.28339 val_loss= 1.56664 val_acc= 0.32143 time= 0.02025
Epoch: 0009 train_loss= 2.01388 train_acc= 0.26059 val_loss= 1.58880 val_acc= 0.33929 time= 0.02069
Epoch: 0010 train_loss= 1.68495 train_acc= 0.23779 val_loss= 1.62248 val_acc= 0.33929 time= 0.01563
Epoch: 0011 train_loss= 2.24974 train_acc= 0.26384 val_loss= 1.58858 val_acc= 0.33929 time= 0.01563
Epoch: 0012 train_loss= 1.82221 train_acc= 0.23453 val_loss= 1.57940 val_acc= 0.33929 time= 0.02913
Epoch: 0013 train_loss= 1.56411 train_acc= 0.20847 val_loss= 1.55288 val_acc= 0.33929 time= 0.02200
Epoch: 0014 train_loss= 1.44273 train_acc= 0.28664 val_loss= 1.53483 val_acc= 0.33929 time= 0.02025
Epoch: 0015 train_loss= 1.46321 train_acc= 0.29967 val_loss= 1.52724 val_acc= 0.33929 time= 0.02132
Epoch: 0016 train_loss= 1.43907 train_acc= 0.27362 val_loss= 1.50453 val_acc= 0.33929 time= 0.01932
Epoch: 0017 train_loss= 1.64203 train_acc= 0.27362 val_loss= 1.45918 val_acc= 0.33929 time= 0.01994
Epoch: 0018 train_loss= 1.38693 train_acc= 0.29642 val_loss= 1.42691 val_acc= 0.33929 time= 0.00700
Epoch: 0019 train_loss= 1.90289 train_acc= 0.24104 val_loss= 1.39317 val_acc= 0.33929 time= 0.03129
Epoch: 0020 train_loss= 1.38952 train_acc= 0.30293 val_loss= 1.37693 val_acc= 0.35714 time= 0.02178
Epoch: 0021 train_loss= 1.38129 train_acc= 0.28990 val_loss= 1.37218 val_acc= 0.37500 time= 0.00607
Epoch: 0022 train_loss= 1.38034 train_acc= 0.25733 val_loss= 1.37442 val_acc= 0.35714 time= 0.03451
Epoch: 0023 train_loss= 1.95107 train_acc= 0.26384 val_loss= 1.37453 val_acc= 0.39286 time= 0.02401
Epoch: 0024 train_loss= 1.36998 train_acc= 0.29316 val_loss= 1.37476 val_acc= 0.41071 time= 0.01833
Epoch: 0025 train_loss= 1.41959 train_acc= 0.28339 val_loss= 1.37494 val_acc= 0.42857 time= 0.01563
Epoch: 0026 train_loss= 1.39302 train_acc= 0.27687 val_loss= 1.37508 val_acc= 0.42857 time= 0.02825
Epoch: 0027 train_loss= 1.38366 train_acc= 0.28664 val_loss= 1.37523 val_acc= 0.42857 time= 0.01500
Epoch: 0028 train_loss= 1.38485 train_acc= 0.28013 val_loss= 1.37541 val_acc= 0.39286 time= 0.01567
Epoch: 0029 train_loss= 1.38047 train_acc= 0.28664 val_loss= 1.37553 val_acc= 0.39286 time= 0.01563
Epoch: 0030 train_loss= 1.38005 train_acc= 0.28013 val_loss= 1.37569 val_acc= 0.39286 time= 0.03125
Early stopping...
Optimization Finished!
Test set results: cost= 1.38074 accuracy= 0.30973 time= 0.00000 
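For reference, a log in this shape is typically produced by a training loop with validation-based early stopping. The sketch below is a minimal, self-contained illustration, not the actual training script: the synthetic loss curves, the `window` size, and the stopping rule (halt once the latest validation loss exceeds the mean of the previous `window` validation losses) are all assumptions for demonstration.

```python
import time
import random


def train_with_early_stopping(num_epochs=200, window=10, seed=0):
    """Minimal training loop sketch that emits log lines in the format above.

    Early-stopping rule (an assumption, not taken from the source log):
    stop when the latest validation loss is worse than the mean of the
    previous `window` validation losses.
    """
    random.seed(seed)
    val_losses = []
    for epoch in range(num_epochs):
        t = time.time()

        # Placeholder "training step": synthetic, noisy, slowly improving
        # metrics standing in for a real forward/backward pass.
        train_loss = 2.0 / (epoch + 1) + random.uniform(0.0, 0.5)
        train_acc = min(0.25 + 0.001 * epoch, 1.0)
        val_loss = 1.8 / (epoch + 1) ** 0.5 + random.uniform(0.0, 0.05)
        val_acc = min(0.25 + 0.002 * epoch, 1.0)
        val_losses.append(val_loss)

        print("Epoch: %04d train_loss= %.5f train_acc= %.5f "
              "val_loss= %.5f val_acc= %.5f time= %.5f"
              % (epoch + 1, train_loss, train_acc,
                 val_loss, val_acc, time.time() - t))

        # Early stopping: latest val loss no longer beats the recent average.
        if epoch > window and \
                val_losses[-1] > sum(val_losses[-(window + 1):-1]) / window:
            print("Early stopping...")
            break

    print("Optimization Finished!")
    return epoch + 1  # number of epochs actually run


stopped_at = train_with_early_stopping()
```

Because the stopping criterion compares against a trailing window rather than requiring monotone improvement, a single noisy epoch late in training (like the plateau around epoch 24-30 in the log above) is enough to trigger the halt.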
