Epoch: 0001 train_loss= 2.08612 train_acc= 0.12830 val_loss= 2.09017 val_acc= 0.03448 time= 0.18751
Epoch: 0002 train_loss= 2.08431 train_acc= 0.12830 val_loss= 2.08762 val_acc= 0.03448 time= 0.01563
Epoch: 0003 train_loss= 2.08240 train_acc= 0.12830 val_loss= 2.08518 val_acc= 0.03448 time= 0.01562
Epoch: 0004 train_loss= 2.08046 train_acc= 0.12830 val_loss= 2.08342 val_acc= 0.03448 time= 0.00000
Epoch: 0005 train_loss= 2.07902 train_acc= 0.12830 val_loss= 2.08242 val_acc= 0.03448 time= 0.01563
Epoch: 0006 train_loss= 2.07783 train_acc= 0.12830 val_loss= 2.08136 val_acc= 0.03448 time= 0.01563
Epoch: 0007 train_loss= 2.07610 train_acc= 0.12830 val_loss= 2.08007 val_acc= 0.03448 time= 0.00000
Epoch: 0008 train_loss= 2.07513 train_acc= 0.12830 val_loss= 2.07852 val_acc= 0.03448 time= 0.01563
Epoch: 0009 train_loss= 2.07437 train_acc= 0.12830 val_loss= 2.07653 val_acc= 0.03448 time= 0.01563
Epoch: 0010 train_loss= 2.07275 train_acc= 0.12830 val_loss= 2.07412 val_acc= 0.20690 time= 0.00000
Epoch: 0011 train_loss= 2.07193 train_acc= 0.16981 val_loss= 2.07129 val_acc= 0.20690 time= 0.01563
Epoch: 0012 train_loss= 2.07102 train_acc= 0.16604 val_loss= 2.06808 val_acc= 0.20690 time= 0.00000
Epoch: 0013 train_loss= 2.07051 train_acc= 0.16604 val_loss= 2.06442 val_acc= 0.20690 time= 0.01562
Epoch: 0014 train_loss= 2.06969 train_acc= 0.16604 val_loss= 2.06037 val_acc= 0.20690 time= 0.01563
Epoch: 0015 train_loss= 2.06763 train_acc= 0.16604 val_loss= 2.05595 val_acc= 0.20690 time= 0.01563
Epoch: 0016 train_loss= 2.06662 train_acc= 0.16604 val_loss= 2.05130 val_acc= 0.20690 time= 0.00000
Epoch: 0017 train_loss= 2.06789 train_acc= 0.16604 val_loss= 2.04647 val_acc= 0.20690 time= 0.01563
Epoch: 0018 train_loss= 2.06690 train_acc= 0.16604 val_loss= 2.04163 val_acc= 0.20690 time= 0.01563
Epoch: 0019 train_loss= 2.06574 train_acc= 0.16604 val_loss= 2.03696 val_acc= 0.20690 time= 0.00000
Epoch: 0020 train_loss= 2.06648 train_acc= 0.16604 val_loss= 2.03266 val_acc= 0.20690 time= 0.01563
Epoch: 0021 train_loss= 2.06601 train_acc= 0.16604 val_loss= 2.02891 val_acc= 0.20690 time= 0.01563
Epoch: 0022 train_loss= 2.06605 train_acc= 0.16604 val_loss= 2.02589 val_acc= 0.20690 time= 0.00000
Epoch: 0023 train_loss= 2.06785 train_acc= 0.16604 val_loss= 2.02365 val_acc= 0.20690 time= 0.01563
Epoch: 0024 train_loss= 2.06630 train_acc= 0.16604 val_loss= 2.02237 val_acc= 0.20690 time= 0.01563
Epoch: 0025 train_loss= 2.06661 train_acc= 0.16604 val_loss= 2.02190 val_acc= 0.20690 time= 0.00000
Epoch: 0026 train_loss= 2.06451 train_acc= 0.16604 val_loss= 2.02223 val_acc= 0.20690 time= 0.01563
Epoch: 0027 train_loss= 2.06632 train_acc= 0.16604 val_loss= 2.02329 val_acc= 0.20690 time= 0.01563
Epoch: 0028 train_loss= 2.06647 train_acc= 0.16604 val_loss= 2.02472 val_acc= 0.20690 time= 0.00000
Epoch: 0029 train_loss= 2.06612 train_acc= 0.16604 val_loss= 2.02655 val_acc= 0.20690 time= 0.01562
Early stopping...
Optimization Finished!
Test set results: cost= 2.03576 accuracy= 0.18644 time= 0.00000
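
For context, log lines in this format are typically emitted by a training loop with patience-based early stopping, where training halts once the latest validation loss exceeds the mean of the last `patience` validation losses. The sketch below is an assumption about how such a log is produced, not the actual script behind this run; `train_step` and `evaluate` are hypothetical callbacks standing in for the real model updates.

```python
import time


def train(train_step, evaluate, max_epochs=200, patience=10):
    """Run training with patience-based early stopping; return the log lines.

    train_step() -> (train_loss, train_acc)   # hypothetical: one optimizer step
    evaluate()   -> (val_loss, val_acc)       # hypothetical: validation pass
    """
    val_losses, logs = [], []
    for epoch in range(max_epochs):
        t0 = time.time()
        train_loss, train_acc = train_step()
        val_loss, val_acc = evaluate()
        val_losses.append(val_loss)
        logs.append(
            "Epoch: %04d train_loss= %.5f train_acc= %.5f "
            "val_loss= %.5f val_acc= %.5f time= %.5f"
            % (epoch + 1, train_loss, train_acc, val_loss, val_acc,
               time.time() - t0)
        )
        # Stop once val_loss rises above the mean of the previous `patience` values.
        if epoch > patience and val_losses[-1] > (
            sum(val_losses[-(patience + 1):-1]) / patience
        ):
            logs.append("Early stopping...")
            break
    logs.append("Optimization Finished!")
    return logs
```

Note that the `time=` values of exactly `0.00000` or `0.01563` in the log above are consistent with the coarse (~15.6 ms) resolution of `time.time()` on Windows; `time.perf_counter()` would give finer-grained timings.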
