Fix a bug where the inference batch size (512) could exceed the number of samples in the client dataset.
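A minimal sketch of one way to apply this fix, assuming the inference batch size should be clamped to the client dataset size; the function and constant names here are hypothetical, not from the original code:

```python
# Hypothetical default used for illustration; the note mentions 512.
DEFAULT_INFER_BATCH_SIZE = 512


def effective_batch_size(num_samples: int,
                         batch_size: int = DEFAULT_INFER_BATCH_SIZE) -> int:
    """Clamp the inference batch size so it never exceeds the
    number of samples in the client dataset."""
    return min(batch_size, num_samples)
```

With this guard, a client holding fewer than 512 samples runs inference in a single batch of its actual dataset size instead of requesting more samples than exist.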
