Epoch: 0001 train_loss= 2.07954 train_acc= 0.18868 val_loss= 2.07824 val_acc= 0.13793 time= 0.06250
Epoch: 0002 train_loss= 2.07551 train_acc= 0.15723 val_loss= 2.07543 val_acc= 0.13793 time= 0.01563
Epoch: 0003 train_loss= 2.06991 train_acc= 0.16352 val_loss= 2.07324 val_acc= 0.13793 time= 0.00000
Epoch: 0004 train_loss= 2.06396 train_acc= 0.16981 val_loss= 2.07129 val_acc= 0.13793 time= 0.01563
Epoch: 0005 train_loss= 2.06332 train_acc= 0.18239 val_loss= 2.06971 val_acc= 0.13793 time= 0.01563
Epoch: 0006 train_loss= 2.05929 train_acc= 0.16981 val_loss= 2.06866 val_acc= 0.13793 time= 0.00000
Epoch: 0007 train_loss= 2.05518 train_acc= 0.16981 val_loss= 2.06760 val_acc= 0.13793 time= 0.01563
Epoch: 0008 train_loss= 2.06878 train_acc= 0.16352 val_loss= 2.06683 val_acc= 0.13793 time= 0.00000
Epoch: 0009 train_loss= 2.05915 train_acc= 0.20755 val_loss= 2.06604 val_acc= 0.13793 time= 0.01563
Epoch: 0010 train_loss= 2.05252 train_acc= 0.17610 val_loss= 2.06503 val_acc= 0.10345 time= 0.01563
Epoch: 0011 train_loss= 2.04635 train_acc= 0.19497 val_loss= 2.06408 val_acc= 0.10345 time= 0.00000
Epoch: 0012 train_loss= 2.04705 train_acc= 0.16352 val_loss= 2.06294 val_acc= 0.10345 time= 0.01563
Epoch: 0013 train_loss= 2.04760 train_acc= 0.18868 val_loss= 2.06181 val_acc= 0.10345 time= 0.00000
Epoch: 0014 train_loss= 2.04524 train_acc= 0.16352 val_loss= 2.06057 val_acc= 0.10345 time= 0.01563
Epoch: 0015 train_loss= 2.04719 train_acc= 0.18239 val_loss= 2.05959 val_acc= 0.10345 time= 0.00000
Epoch: 0016 train_loss= 2.03996 train_acc= 0.17610 val_loss= 2.05846 val_acc= 0.13793 time= 0.01563
Epoch: 0017 train_loss= 2.04783 train_acc= 0.16981 val_loss= 2.05704 val_acc= 0.13793 time= 0.01563
Epoch: 0018 train_loss= 2.05001 train_acc= 0.15094 val_loss= 2.05574 val_acc= 0.13793 time= 0.00000
Epoch: 0019 train_loss= 2.03794 train_acc= 0.17610 val_loss= 2.05457 val_acc= 0.13793 time= 0.01563
Epoch: 0020 train_loss= 2.03367 train_acc= 0.18239 val_loss= 2.05343 val_acc= 0.13793 time= 0.00000
Epoch: 0021 train_loss= 2.03838 train_acc= 0.17610 val_loss= 2.05248 val_acc= 0.13793 time= 0.01563
Epoch: 0022 train_loss= 2.03330 train_acc= 0.16981 val_loss= 2.05180 val_acc= 0.13793 time= 0.00000
Epoch: 0023 train_loss= 2.03528 train_acc= 0.18868 val_loss= 2.05131 val_acc= 0.13793 time= 0.01563
Epoch: 0024 train_loss= 2.02965 train_acc= 0.16352 val_loss= 2.05090 val_acc= 0.10345 time= 0.01563
Epoch: 0025 train_loss= 2.02234 train_acc= 0.16352 val_loss= 2.05103 val_acc= 0.10345 time= 0.00000
Epoch: 0026 train_loss= 2.03085 train_acc= 0.16981 val_loss= 2.05043 val_acc= 0.10345 time= 0.01562
Epoch: 0027 train_loss= 2.02563 train_acc= 0.22013 val_loss= 2.04997 val_acc= 0.13793 time= 0.00000
Epoch: 0028 train_loss= 2.03005 train_acc= 0.20755 val_loss= 2.04988 val_acc= 0.13793 time= 0.01563
Epoch: 0029 train_loss= 2.02041 train_acc= 0.19497 val_loss= 2.04992 val_acc= 0.13793 time= 0.00000
Epoch: 0030 train_loss= 2.04002 train_acc= 0.18868 val_loss= 2.05107 val_acc= 0.13793 time= 0.01563
Epoch: 0031 train_loss= 2.02534 train_acc= 0.22013 val_loss= 2.05266 val_acc= 0.13793 time= 0.01563
Early stopping...
Optimization Finished!
Test set results: cost= 2.14749 accuracy= 0.10169 time= 0.00000 
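The "Early stopping..." line above fires once the validation loss stops improving (it bottoms out near epoch 28 at 2.04988 and climbs back to 2.05266 by epoch 31). The exact stopping criterion used by the training script is not shown in the log; the sketch below assumes a common patience-based variant that halts after `patience` consecutive epochs without a new best `val_loss`. The `EarlyStopping` class name and its parameters are hypothetical, chosen for illustration.

```python
# Hypothetical patience-based early stopping on validation loss.
# The script that produced the log above may use a different rule
# (e.g. comparing the latest val_loss to a moving average).

class EarlyStopping:
    def __init__(self, patience=10, min_delta=0.0):
        self.patience = patience    # epochs to tolerate without improvement
        self.min_delta = min_delta  # minimum decrease that counts as improvement
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        """Record one epoch's val_loss; return True when training should stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience

# Usage with the val_loss trace from epochs 24-31 of the log above:
stopper = EarlyStopping(patience=3)
for loss in [2.05090, 2.05103, 2.05043, 2.04997, 2.04988, 2.04992, 2.05107, 2.05266]:
    if stopper.step(loss):
        print("Early stopping...")
        break
```

With `patience=3`, the monitor stops after epochs 29-31 fail to beat the epoch-28 best of 2.04988, matching where the log halts. The large gap between test accuracy (0.10169) and the near-chance train/val accuracies suggests the model never learned a useful signal, so the early stop is cutting off a run that had already plateaued rather than one still improving.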
