Combining Inductive and Analytical Learning

This talk is based on Tom Mitchell's Machine Learning book [1] and his accompanying slides [2].

1 Motivation

              | Inductive Learning              | Analytical Learning
Goal          | Hypothesis fits data            | Hypothesis fits domain theory
Justification | Statistical inference           | Deductive inference
Advantages    | Requires little prior knowledge | Learns from scarce data
Pitfalls      | Scarce data, incorrect bias     | Imperfect domain theory

1.1 What We Want

We want a learning method such that:
  1. Given no domain theory, it should be as good as purely inductive methods.
  2. Given a perfect domain theory, it should be as good as purely analytical methods.
  3. Given an imperfect domain theory and imperfect data, it should combine the two and do better than either purely inductive or purely analytical methods.
  4. Accommodate an unknown level of error in the training data.
  5. Accommodate an unknown level of error in the domain theory.

2 The Learning Problem

The learning problem: given a hypothesis space H, a set of training examples D, and a domain theory B, determine a hypothesis h in H that best fits both the training examples and the domain theory.

2.1 Best Fits?

So, what does "best fits" mean?
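One way to make this precise, following the formulation in Mitchell's book, is to pick the hypothesis that minimizes a weighted combination of two errors:

```latex
h^{*} = \operatorname*{argmin}_{h \in H} \; k_D \, error_D(h) + k_B \, error_B(h)
```

Here error_D(h) is the proportion of training examples misclassified by h, error_B(h) measures the degree to which h disagrees with the domain theory B, and the constants k_D and k_B trade off their relative importance: with plentiful data and an unreliable theory, weight k_D heavily, and vice versa.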

2.2 Hypothesis Space Search

As always, we view the problem as a search over the hypothesis space H. The prior knowledge can alter this search in several ways: use it to create the initial hypothesis (KBANN), to alter the search objective (TangentProp, EBNN), or to alter the available search steps (FOCL).

3 KBANN

3.1 KBANN Algorithm

KBANN(domainTheory, trainingExamples)
domainTheory: a set of propositional, non-recursive Horn clauses.
  1. For each instance attribute, create a network input.
  2. For each Horn clause in domainTheory, create a network unit:
    1. Connect its inputs to the attributes tested by the clause's antecedents.
    2. Each non-negated antecedent gets a weight W.
    3. Each negated antecedent gets a weight -W.
    4. The threshold weight is -(n - .5)W, where n is the number of non-negated antecedents.
  3. Make all other connections between layers, giving these very low weights.
  4. Apply Backpropagation using trainingExamples.
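The clause-to-unit construction in step 2 can be sketched as follows. This is a minimal illustration, not the original KBANN code; the clause, the attribute names, and the value W = 4.0 are invented for the example.

```python
import math

W = 4.0  # assumed large positive weight for "true" connections

def kbann_unit(antecedents, negated, inputs):
    """Activation of the sigmoid unit built from one Horn clause.

    antecedents: attribute names tested by the clause
    negated:     subset of antecedents that appear negated
    inputs:      dict mapping attribute name -> 0 or 1
    """
    n = sum(1 for a in antecedents if a not in negated)
    net = -(n - 0.5) * W  # threshold weight from step 2.4
    for a in antecedents:
        net += (-W if a in negated else W) * inputs[a]
    return 1.0 / (1.0 + math.exp(-net))  # sigmoid activation

# Invented clause: Cup <- Stable, Liftable, not Heavy
ex = {"Stable": 1, "Liftable": 1, "Heavy": 0}
print(kbann_unit(["Stable", "Liftable", "Heavy"], {"Heavy"}, ex))
```

With all non-negated antecedents true and the negated one false, the net input is -(2 - .5)W + 2W = 0.5W = 2, giving an activation of about 0.88; flipping any antecedent drops the net input to -2 and the activation below 0.5, so the unit initially behaves like the clause.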

3.2 KBANN Example

3.3 KBANN Example Network

[Figure: KBANN network]

3.4 After Training

[Figure: KBANN network after training]

3.5 KBANN Results

3.6 Hypothesis Space Search

[Figure: hypothesis space search]

4 TangentProp
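TangentProp incorporates prior knowledge in the form of known derivatives of the target function: the usual squared error is augmented with a penalty on the model's derivative, pulling it toward the known slopes. The toy sketch below (not the paper's code; the linear model, data, and constants are made up) shows the effect on a one-parameter slope.

```python
def fit_with_tangent(points, slope_target, mu, steps=2000, lr=0.01):
    """Gradient descent on  sum (f(x)-y)^2 + mu * sum (df/dx - slope_target)^2."""
    a, b = 0.0, 0.0  # linear model f(x) = a*x + b, so df/dx = a everywhere
    n = len(points)
    for _ in range(steps):
        ga = gb = 0.0
        for x, y in points:
            err = a * x + b - y
            ga += 2 * err * x   # gradient of the fit term w.r.t. a
            gb += 2 * err       # gradient of the fit term w.r.t. b
        ga += mu * 2 * (a - slope_target) * n  # tangent-penalty gradient
        a -= lr * ga / n
        b -= lr * gb / n
    return a, b

# Flat data alone gives slope ~0; the tangent term pulls the slope toward 1.
flat = [(0.0, 0.0), (1.0, 0.0)]
print(fit_with_tangent(flat, 1.0, 0.0))   # slope stays near 0
print(fit_with_tangent(flat, 1.0, 10.0))  # slope pulled near 1
```

With mu = 0 the fit is purely inductive; as mu grows, the prior knowledge about slopes dominates, which mirrors how TangentProp trades off data against known invariances.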

4.1 TangentProp Example

4.2 TangentProp Search

5 EBNN
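One key ingredient of EBNN is that the domain theory's prediction is differentiated at each training point to obtain target slopes, which then constrain the inductive learner much as in TangentProp, weighted by how accurately the theory predicted that example. A hedged sketch of the slope-extraction step (not the paper's implementation; the theory function is invented and the derivative is taken numerically):

```python
def theory(x):
    # Stand-in domain theory, invented for illustration: predicts 0.5 * x^2.
    return 0.5 * x * x

def target_slope(x, h=1e-4):
    # Central finite difference of the theory's prediction at x; EBNN would
    # then weight this slope constraint by the theory's observed accuracy.
    return (theory(x + h) - theory(x - h)) / (2 * h)

print(target_slope(2.0))  # ~2.0, the theory's slope at x = 2
```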

5.1 EBNN Example

[Figure: EBNN example]

5.2 EBNN Summary

6 FOCL
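FOCL extends FOIL's greedy rule specialization: besides the usual single-literal candidates, it also considers the operationalized body of a domain-theory clause as one candidate specialization. The propositional toy sketch below (not the original FOCL; the dataset, attributes, and theory are invented) shows the idea.

```python
def covers(rule, ex):
    """A rule is a list of (attribute, value) tests; empty rule covers all."""
    return all(ex.get(a) == v for a, v in rule)

def accuracy(rule, data):
    """Fraction of covered examples that are positive (0.0 if none covered)."""
    cov = [label for ex, label in data if covers(rule, ex)]
    return (sum(cov) / len(cov)) if cov else 0.0

def focl_specialize(data, attributes, theory_body):
    rule = []  # start with the most general rule
    while accuracy(rule, data) < 1.0:
        # FOIL-style candidates: add one attribute test at a time
        candidates = [rule + [(a, v)] for a in attributes for v in (0, 1)
                      if (a, v) not in rule]
        # FOCL's addition: the whole operationalized theory body at once
        candidates.append(rule + [t for t in theory_body if t not in rule])
        best = max(candidates, key=lambda r: accuracy(r, data))
        if accuracy(best, data) <= accuracy(rule, data):
            break  # no candidate improves; stop
        rule = best
    return rule
```

On data where the concept is "a and b", the single-step theory macro reaches perfect accuracy immediately, while FOIL-style single tests would need two greedy steps, illustrating how a good theory shortens the search.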

6.1 FOCL Example

[Figure: FOCL example]

6.2 FOCL Search

[Figure: FOCL search]

URLs

  1. Machine Learning book at Amazon, http://www.amazon.com/exec/obidos/ASIN/0070428077/multiagentcom/
  2. Slides by Tom Mitchell on Machine Learning, http://www-2.cs.cmu.edu/~tom/mlbook-chapter-slides.html
  3. KBANN paper, http://citeseer.nj.nec.com/towell94knowledgebased.html
  4. TangentProp paper, http://research.microsoft.com/~patrice/PS/tang_prop.ps
  5. EBNN paper, http://citeseer.nj.nec.com/mitchell92explanationbased.html
  6. FOCL paper (Pazzani and Kibler), http://citeseer.nj.nec.com/pazzani92utility.html

This talk available at http://jmvidal.cse.sc.edu/talks/indanalytical/
Copyright © 2009 José M. Vidal . All rights reserved.

17 April 2003, 12:26PM