IMPORTANT NOTE:
This code chains a linear LR schedule with ReduceLROnPlateau inside a SequentialLR scheduler. For this combination to work, a small modification to the PyTorch library is required:

in the file pytorch/torch/optim/lr_scheduler.py of the PyTorch library,
lines 882 to 887 must be changed from:

if isinstance(scheduler, ReduceLROnPlateau):
    raise ValueError(
        f"{self.__class__.__name__} does not support `ReduceLROnPlateau` scheduler as it "
        "requires additional kwargs to be specified when calling `step`, "
        f"but got one at index {scheduler_idx} in the given schedulers sequence."
    )

to:

if isinstance(scheduler, ReduceLROnPlateau):
    pass

since the ReduceLROnPlateau LR update is stepped manually within our framework, with the required metric passed in.
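The guard being patched out exists because ReduceLROnPlateau's step() requires a metric argument, while SequentialLR steps all chained schedulers with a uniform, argument-free step() call. The following torch-free sketch illustrates that mismatch; all class names here are hypothetical stand-ins for the PyTorch schedulers, not real APIs:

```python
class LinearWarmup:
    """Stand-in for LinearLR: step() takes no arguments."""
    def __init__(self, lr):
        self.lr = lr

    def step(self):
        self.lr += 0.1  # toy linear increase


class PlateauScheduler:
    """Stand-in for ReduceLROnPlateau: step() REQUIRES a metric."""
    def __init__(self, lr):
        self.lr = lr
        self.best = float("inf")

    def step(self, metric):
        if metric >= self.best:  # no improvement -> decay LR
            self.lr *= 0.5
        else:
            self.best = metric


class SequentialChain:
    """Stand-in for SequentialLR: steps the active scheduler with NO arguments."""
    def __init__(self, schedulers, milestone):
        self.schedulers = schedulers
        self.milestone = milestone
        self.t = 0

    def step(self):
        self.t += 1
        active = self.schedulers[0] if self.t < self.milestone else self.schedulers[1]
        active.step()  # raises TypeError once the plateau scheduler becomes active


chain = SequentialChain([LinearWarmup(0.1), PlateauScheduler(0.1)], milestone=2)
chain.step()  # fine: LinearWarmup is active
try:
    chain.step()  # plateau scheduler is now active, but no metric is passed
except TypeError:
    print("step() failed: a ReduceLROnPlateau-style scheduler needs a metric")
```

After the patch, SequentialLR merely accepts the plateau scheduler at construction time; it remains the framework's responsibility to call step(metrics) on it directly, as described above.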
