Improved Sample Complexity for Full Coverage in Compact and Continuous Spaces via Probabilistic Analysis
Abstract: Verifying uniform conditions over continuous spaces through random sampling is fundamental in machine learning and control theory, yet classical coverage analyses often yield conservative bounds, particularly at small failure probabilities. We study uniform random sampling on the $d$-dimensional unit hypercube and analyze the number of uncovered subcubes after discretization. By applying a concentration inequality to the uncovered-count statistic, we derive a sample complexity bound with a logarithmic dependence on the failure probability $\delta$, i.e., $M = O\!\left(\tilde{C}\ln\frac{2\tilde{C}}{\delta}\right)$, which contrasts sharply with the classical linear $1/\delta$ dependence. Under standard Lipschitz and uniformity assumptions, we present a self-contained derivation and compare our result with classical coupon-collector rates. Numerical studies across dimensions, precision levels, and confidence targets indicate that our bound tracks practical coverage requirements more tightly and scales favorably as $\delta \to 0$. Our findings offer a sharper theoretical tool for algorithms that rely on grid-based coverage guarantees, enabling more efficient sampling, especially in high-confidence regimes.
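The following is a minimal Monte Carlo sketch (not the authors' code) of the coverage setup described in the abstract: the unit hypercube is discretized into $\tilde{C} = n^d$ subcubes, $M$ uniform samples are drawn, and the empirical probability of missing at least one subcube is compared against the target $\delta$. The sample size uses the stated rate $M = \tilde{C}\ln(2\tilde{C}/\delta)$ with the hidden constant taken as 1, and the settings `d`, `n`, `delta`, and the trial count are hypothetical choices for illustration.

```python
import numpy as np


def coverage_trial(d, n, M, rng):
    """Draw M uniform points in [0,1]^d and report whether every one of the
    n**d grid subcubes (side length 1/n) contains at least one sample."""
    pts = rng.random((M, d))
    # Map each point to the index of the subcube it falls in (0..n-1 per axis).
    cells = np.minimum((pts * n).astype(int), n - 1)
    flat = np.ravel_multi_index(tuple(cells.T), (n,) * d)
    return np.unique(flat).size == n ** d


def bound_M(C_tilde, delta):
    """Sample size from M = C_tilde * ln(2 * C_tilde / delta); the O(.) constant
    is assumed to be 1 here, which the paper's bound does not necessarily imply."""
    return int(np.ceil(C_tilde * np.log(2 * C_tilde / delta)))


if __name__ == "__main__":
    d, n, delta = 2, 8, 0.01          # hypothetical settings: 8**2 = 64 subcubes
    C_tilde = n ** d
    M = bound_M(C_tilde, delta)
    rng = np.random.default_rng(0)
    trials = 2000
    failures = sum(not coverage_trial(d, n, M, rng) for _ in range(trials))
    print(f"M = {M}, empirical failure rate = {failures / trials:.4f} "
          f"(target delta = {delta})")
```

Under these assumptions the empirical failure rate should fall at or below the target $\delta$; how tight the bound is in practice is exactly what the paper's numerical studies examine across dimensions, precision levels, and confidence targets.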
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Yutian_Chen1
Submission Number: 5769