Conjunctions of Boolean Literals are PAC-Learnable
Consider the class $C$ of target concepts described by
conjunctions of Boolean literals, e.g., \[Old \wedge \neg
Tall\]
Is it PAC-learnable? We need to show that any consistent
learner will require only a polynomial number of training
examples to learn any $c \in C$.
How many examples are sufficient to assure with
probability at least $(1 - \delta)$ that every $h$ in
$VS_{H,D}$ satisfies $error_{\cal{D}}(h) \leq \epsilon$?
Use our theorem:
\[m \geq \frac{1}{\epsilon}(\ln|H| + \ln(\frac{1}{\delta})) \]
Suppose $H$ contains conjunctions of constraints on up to $n$ Boolean
attributes (i.e., $n$ Boolean literals). Each attribute can appear as a
positive literal, as a negated literal, or not at all, so $|H| = 3^n$, and
\[m \geq \frac{1}{\epsilon}(\ln 3^n + \ln(\frac{1}{\delta})) \]
or
\[m \geq \frac{1}{\epsilon}(n \ln 3 + \ln(\frac{1}{\delta})) \]
So $m$ grows linearly in $n$ and in $1/\epsilon$, and only
logarithmically in $1/\delta$ (i.e., polynomially).
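To make the bound concrete, here is a small sketch (the function name `sample_bound` is my own) that evaluates $m \geq \frac{1}{\epsilon}(n \ln 3 + \ln(1/\delta))$ for given $n$, $\epsilon$, $\delta$:

```python
import math

def sample_bound(n, epsilon, delta):
    """Sufficient number of training examples for a consistent learner
    over conjunctions of n Boolean literals (|H| = 3^n):
        m >= (1/epsilon) * (n * ln 3 + ln(1/delta))
    """
    return math.ceil((1 / epsilon) * (n * math.log(3) + math.log(1 / delta)))

# e.g., n = 10 attributes, epsilon = 0.1, delta = 0.05
print(sample_bound(10, 0.1, 0.05))  # -> 140
```

Note how doubling $n$ roughly doubles $m$, while halving $\delta$ adds only an additive $\ln 2 / \epsilon$ term.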
For example, Find-S is a consistent learner that requires time
linear in $n$ for each training example. Therefore, the class of
conjunctions of Boolean literals is PAC-learnable by the Find-S
algorithm using $H = C$.
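As a sketch of why the per-example cost is linear in $n$, here is a minimal Find-S for this hypothesis space (representation is my own: each example is a tuple of $n$ bits, and the hypothesis records a required value per attribute or `None` for "no constraint"):

```python
def find_s(examples, n):
    """Find-S for conjunctions of Boolean literals.

    examples: iterable of (x, label) where x is an n-tuple of 0/1.
    Returns h: list of length n; h[i] is the required value of
    attribute i, or None if attribute i is unconstrained.
    """
    h = None  # most specific hypothesis: classifies nothing as positive
    for x, label in examples:
        if not label:
            continue  # Find-S ignores negative examples
        if h is None:
            h = list(x)  # first positive example: require exactly these values
        else:
            # drop any literal contradicted by this positive example: O(n)
            h = [hi if hi == xi else None for hi, xi in zip(h, x)]
    return h

# e.g., the two positives agree only on attributes 0 and 2
print(find_s([((1, 0, 1), True), ((1, 1, 1), True), ((0, 0, 0), False)], 3))
# -> [1, None, 1]
```

Each positive example triggers one pass over the $n$ attributes, matching the claim that Find-S uses time linear in $n$ per example.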