How many randomly drawn examples suffice to
$\epsilon$-exhaust $VS_{H,D}$ with probability at least $(1 -
\delta)$?
\[
m \geq \frac{1}{\epsilon}(4\log_2(\frac{2}{\delta}) + 8 VC(H) \log_2 (\frac{13}{\epsilon}))
\]
This has the same form as the earlier bound, but with $\ln |H|$ replaced by
$VC(H)$ (recall that $VC(H) \leq \log_2|H|$).
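For concreteness, here is a worked instance of this bound; the values $\epsilon = 0.1$, $\delta = 0.05$, and $VC(H) = 3$ are illustrative choices, not part of the statement itself:
\[
m \geq \frac{1}{0.1}\left(4\log_2\left(\frac{2}{0.05}\right) + 8 \cdot 3 \cdot \log_2\left(\frac{13}{0.1}\right)\right) \approx 10\,(21.3 + 168.5) \approx 1899,
\]
so roughly 1900 examples suffice in this case.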
We can also give a lower bound on sample
complexity. For any concept class $C$ with $VC(C) \geq 2$, any learner $L$,
and any $0 < \epsilon < \frac{1}{8}$ and $0 < \delta <
\frac{1}{100}$, there exists a distribution $\mathcal{D}$ and a target
concept in $C$ such that if $L$ observes fewer examples than
\[
\max \left( \frac{1}{\epsilon}\log(\frac{1}{\delta}), \frac{VC(C)-1}{32\epsilon} \right)
\]
then with probability at least $\delta$, $L$ outputs a hypothesis $h$
with $error_{\mathcal{D}}(h) > \epsilon$.
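To see which term of the maximum dominates, consider an illustrative choice of parameters satisfying the stated conditions (these values are assumptions for the example, and the unqualified $\log$ is read here as the natural logarithm): $\epsilon = 0.05$, $\delta = 0.005$, and $VC(C) = 10$. Then
\[
\frac{1}{\epsilon}\log\left(\frac{1}{\delta}\right) = 20\ln(200) \approx 106,
\qquad
\frac{VC(C)-1}{32\epsilon} = \frac{9}{1.6} \approx 5.6,
\]
so any learner that sees fewer than about $106$ examples can be forced, with probability at least $\delta$, to output a hypothesis with $error_{\mathcal{D}}(h) > \epsilon$.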