Covering numbers are central to estimating sample complexity. Alas, standard techniques for bounding covering numbers fail to estimate the covering numbers of many classes of neural networks. We introduce a generalization of covers, called {\em multicovers}, which are covers with respect to many metrics simultaneously.
Unlike standard covering numbers, multicovering numbers interact well with the layer-wise structure of neural networks. We use this property to recover a recent result of \citet{ADL2019}, who introduced the notion of Approximate Description Length (ADL) to establish tight bounds on the sample complexity of networks with weights of bounded Frobenius norm. We also show that ADL and multicovering numbers are closely related.