
=== Start adding workers ===
=> Add worker SGDMWorker(index=0, momentum=0.9)
=> Add worker SGDMWorker(index=1, momentum=0.9)
=> Add worker SGDMWorker(index=2, momentum=0.9)
=> Add worker SGDMWorker(index=3, momentum=0.9)
=> Add worker SGDMWorker(index=4, momentum=0.9)
=> Add worker BitFlippingWorker
=> Add worker BitFlippingWorker
=> Add worker BitFlippingWorker
=> Add worker BitFlippingWorker
=> Add worker BitFlippingWorker
=> Add worker BitFlippingWorker
=> Add worker BitFlippingWorker
=> Add worker BitFlippingWorker
=> Add worker BitFlippingWorker
=> Add worker BitFlippingWorker
=> Add worker BitFlippingWorker

=== Start adding graph ===
<__main__.MaliciousRing object at 0x7fd463245b20>
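
The roster above mixes 5 honest SGDMWorkers with 11 Byzantine BitFlippingWorkers on a MaliciousRing topology. A minimal sketch of the two behaviours, assuming the bit-flipping attack negates the sign of the honest update before sharing it — the class and method signatures here are illustrative, not the script's actual API:

```python
class SGDMWorker:
    """Honest worker: plain SGD with momentum (scalar toy model)."""
    def __init__(self, index, momentum=0.9):
        self.index = index
        self.momentum = momentum
        self.buf = 0.0  # momentum buffer

    def local_update(self, grad, lr=0.1):
        # v <- momentum * v + g; the shared update is -lr * v
        self.buf = self.momentum * self.buf + grad
        return -lr * self.buf


class BitFlippingWorker(SGDMWorker):
    """Byzantine worker: computes the honest update, then flips its sign."""
    def local_update(self, grad, lr=0.1):
        return -super().local_update(grad, lr)
```

On identical data, one bit-flipped update exactly cancels one honest update, which is why a majority of such workers (11 of 16 here) can stall or invert the averaged model's progress.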

Train epoch 1
[E 1B0  |    512/60000 (  1%) ] Loss: 2.3075 top1=  5.0000

=== Peeking data label distribution E1B0 ===
Worker 0 has targets: tensor([0, 0, 0, 0, 0], device='cuda:0')
Worker 1 has targets: tensor([1, 1, 0, 0, 1], device='cuda:0')
Worker 2 has targets: tensor([1, 1, 1, 1, 1], device='cuda:0')
Worker 3 has targets: tensor([2, 2, 1, 2, 2], device='cuda:0')
Worker 4 has targets: tensor([2, 2, 2, 2, 2], device='cuda:0')
Worker 5 has targets: tensor([3, 3, 3, 3, 3], device='cuda:0')
Worker 6 has targets: tensor([4, 4, 3, 3, 4], device='cuda:0')
Worker 7 has targets: tensor([4, 4, 4, 4, 4], device='cuda:0')
Worker 8 has targets: tensor([5, 5, 4, 5, 5], device='cuda:0')
Worker 9 has targets: tensor([6, 6, 5, 5, 6], device='cuda:0')
Worker 10 has targets: tensor([6, 6, 6, 6, 6], device='cuda:0')
Worker 11 has targets: tensor([7, 7, 6, 7, 7], device='cuda:0')
Worker 12 has targets: tensor([7, 7, 7, 7, 8], device='cuda:0')
Worker 13 has targets: tensor([8, 8, 8, 8, 8], device='cuda:0')
Worker 14 has targets: tensor([9, 9, 8, 9, 9], device='cuda:0')
Worker 15 has targets: tensor([9, 9, 9, 9, 9], device='cuda:0')
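
The peek above shows a label-sorted (non-IID) split: each of the 16 workers holds a contiguous slice of the label-sorted dataset, so most workers see only one or two classes. A minimal sketch of that partitioning scheme — the script's actual sampler may differ:

```python
def label_sorted_partition(labels, n_workers):
    """Sort sample indices by label, then cut into equal contiguous shards."""
    order = sorted(range(len(labels)), key=lambda i: labels[i])
    shard = len(order) // n_workers
    return [order[w * shard:(w + 1) * shard] for w in range(n_workers)]
```

With 60000 MNIST labels over 16 workers each shard holds 3750 samples, so worker 0 sees only zeros, worker 15 only nines, and workers that straddle a class boundary (e.g. worker 1 above) see a mix of two adjacent labels.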

=== Log mixing matrix @ E1B0 ===
[[0.545 0.091 0.    0.    0.091 0.091 0.    0.    0.    0.    0.091 0.    0.    0.    0.    0.091]
 [0.091 0.582 0.109 0.    0.    0.    0.109 0.    0.    0.    0.    0.109 0.    0.    0.    0.   ]
 [0.    0.109 0.564 0.109 0.    0.    0.    0.109 0.    0.    0.    0.    0.109 0.    0.    0.   ]
 [0.    0.    0.109 0.564 0.109 0.    0.    0.    0.109 0.    0.    0.    0.    0.109 0.    0.   ]
 [0.091 0.    0.    0.109 0.582 0.    0.    0.    0.    0.109 0.    0.    0.    0.    0.109 0.   ]
 [0.091 0.    0.    0.    0.    0.909 0.    0.    0.    0.    0.    0.    0.    0.    0.    0.   ]
 [0.    0.109 0.    0.    0.    0.    0.891 0.    0.    0.    0.    0.    0.    0.    0.    0.   ]
 [0.    0.    0.109 0.    0.    0.    0.    0.891 0.    0.    0.    0.    0.    0.    0.    0.   ]
 [0.    0.    0.    0.109 0.    0.    0.    0.    0.891 0.    0.    0.    0.    0.    0.    0.   ]
 [0.    0.    0.    0.    0.109 0.    0.    0.    0.    0.891 0.    0.    0.    0.    0.    0.   ]
 [0.091 0.    0.    0.    0.    0.    0.    0.    0.    0.    0.909 0.    0.    0.    0.    0.   ]
 [0.    0.109 0.    0.    0.    0.    0.    0.    0.    0.    0.    0.891 0.    0.    0.    0.   ]
 [0.    0.    0.109 0.    0.    0.    0.    0.    0.    0.    0.    0.    0.891 0.    0.    0.   ]
 [0.    0.    0.    0.109 0.    0.    0.    0.    0.    0.    0.    0.    0.    0.891 0.    0.   ]
 [0.    0.    0.    0.    0.109 0.    0.    0.    0.    0.    0.    0.    0.    0.    0.891 0.   ]
 [0.091 0.    0.    0.    0.    0.    0.    0.    0.    0.    0.    0.    0.    0.    0.    0.909]]
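
As printed (rounded to three decimals), every row of the mixing matrix sums to 1 and the matrix is symmetric, i.e. doubly stochastic: each node's gossip step is a convex combination of its own and its neighbours' models. A quick sanity check on the first two rows as logged:

```python
# First two rows of the logged mixing matrix, values as printed
# (rounded to three decimals in the log).
row0 = [0.545, 0.091, 0.0, 0.0, 0.091, 0.091, 0.0, 0.0,
        0.0, 0.0, 0.091, 0.0, 0.0, 0.0, 0.0, 0.091]
row1 = [0.091, 0.582, 0.109, 0.0, 0.0, 0.0, 0.109, 0.0,
        0.0, 0.0, 0.0, 0.109, 0.0, 0.0, 0.0, 0.0]

for row in (row0, row1):
    # Row-stochastic: self-weight + neighbour weights form a convex combination.
    assert abs(sum(row) - 1.0) < 1e-9
```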

[E 1B10 |   5632/60000 (  9%) ] Loss: 0.1111 top1= 98.1250
[E 1B20 |  10752/60000 ( 18%) ] Loss: 0.1937 top1= 95.6250

=> Averaged model (Global Average Validation Accuracy) | Eval Loss=8.2413 top1= 22.3157
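
The "Averaged model" line evaluates the element-wise mean of all 16 workers' parameters on the validation set. A sketch of that averaging step, with an illustrative function name and flattened parameter lists standing in for the real model tensors:

```python
def average_params(param_lists):
    """Element-wise mean over the workers' flattened parameter lists."""
    n = len(param_lists)
    return [sum(ps) / n for ps in zip(*param_lists)]
```

The ~22% accuracy here next to ~98% local batch top-1 is consistent with the setup: each honest worker fits its own one-or-two-class shard well, but with 11 of 16 neighbours flipping their updates, the global average is badly corrupted.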

Train epoch 2
[E 2B0  |    512/60000 (  1%) ] Loss: 0.1959 top1= 96.8750
[E 2B10 |   5632/60000 (  9%) ] Loss: 0.0240 top1= 99.3750
[E 2B20 |  10752/60000 ( 18%) ] Loss: 0.0295 top1= 99.3750

=> Averaged model (Global Average Validation Accuracy) | Eval Loss=5.2520 top1= 24.6995

Train epoch 3
[E 3B0  |    512/60000 (  1%) ] Loss: 0.0507 top1= 99.3750
[E 3B10 |   5632/60000 (  9%) ] Loss: 0.0160 top1=100.0000
[E 3B20 |  10752/60000 ( 18%) ] Loss: 0.0234 top1= 99.3750
