Candidate Elimination Algorithm in Machine Learning
The Candidate Elimination Algorithm is used to find the set of all hypotheses consistent with the training examples, that is, the version space.
Algorithm:
For each training example d, do:
    If d is a positive example:
        Remove from G any hypothesis inconsistent with d
        For each hypothesis s in S that is not consistent with d:
            Remove s from S
            Add to S all minimal generalizations of s that are consistent with d and have a generalization in G
        Remove from S any hypothesis that is more general than another hypothesis in S
    If d is a negative example:
        Remove from S any hypothesis inconsistent with d
        For each hypothesis g in G that is not consistent with d:
            Remove g from G
            Add to G all minimal specializations of g that are consistent with d and have a specialization in S
        Remove from G any hypothesis that is more specific than another hypothesis in G
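A minimal Python sketch of this procedure is given below, assuming nominal attributes and keeping S as a single maximally specific hypothesis (which suffices for conjunctive hypothesis spaces such as the example that follows). The helper names (covers, more_general, min_generalize, min_specialize, candidate_elimination) are illustrative, and handling of an inconsistent training set is omitted.

```python
# Sketch of Candidate Elimination for nominal attributes. Hypotheses are
# tuples in which '?' matches any value and '0' matches nothing.
# Assumption: S is kept as a single maximally specific hypothesis.

def covers(h, x):
    """True if hypothesis h matches instance x (classifies it positive)."""
    return all(a in ('?', v) for a, v in zip(h, x))

def more_general(h1, h2):
    """True if h1 is more general than or equal to h2."""
    return all(a == '?' or a == b or b == '0' for a, b in zip(h1, h2))

def min_generalize(s, x):
    """Minimally generalize s so that it covers the positive instance x."""
    return tuple(v if a == '0' else (a if a == v else '?')
                 for a, v in zip(s, x))

def min_specialize(g, x, domains):
    """All minimal specializations of g that exclude the negative instance x."""
    return [g[:i] + (v,) + g[i + 1:]
            for i, a in enumerate(g) if a == '?'
            for v in domains[i] if v != x[i]]

def candidate_elimination(examples):
    n = len(examples[0][0])
    # Read each attribute's domain off the training data.
    domains = [sorted({x[i] for x, _ in examples}) for i in range(n)]
    S = ('0',) * n        # most specific boundary
    G = [('?',) * n]      # most generic boundary
    for x, positive in examples:
        if positive:
            # Remove from G hypotheses that reject the positive example,
            # then minimally generalize S until it covers the example.
            G = [g for g in G if covers(g, x)]
            if not covers(S, x):
                S = min_generalize(S, x)
        else:
            # Replace each hypothesis in G that wrongly covers the negative
            # example by its minimal specializations lying above S.
            G = [s for g in G
                 for s in ([g] if not covers(g, x) else
                           min_specialize(g, x, domains))
                 if more_general(s, S)]
            # Remove hypotheses that are strictly less general than another.
            G = [g for g in G
                 if not any(h != g and more_general(h, g) for h in G)]
    return S, G
```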
Solved Numerical Example – 3 (Candidate Elimination Algorithm):
| Example | Citations | Size | InLibrary | Price | Editions | Buy |
|---------|-----------|------|-----------|-------|----------|-----|
| 1 | Some | Small | No | Affordable | One | No |
| 2 | Many | Big | No | Expensive | Many | Yes |
| 3 | Many | Medium | No | Expensive | Few | Yes |
| 4 | Many | Small | No | Affordable | Many | Yes |
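For use with the sketch above, the training data can be encoded as follows (attribute order as in the table, with the Buy column as a Boolean label):

```python
# Training data from the table above: (Citations, Size, InLibrary,
# Price, Editions), with Buy as the target label.
examples = [
    (('Some', 'Small',  'No', 'Affordable', 'One'),  False),  # 1
    (('Many', 'Big',    'No', 'Expensive',  'Many'), True),   # 2
    (('Many', 'Medium', 'No', 'Expensive',  'Few'),  True),   # 3
    (('Many', 'Small',  'No', 'Affordable', 'Many'), True),   # 4
]
```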
Solution:
S0: (0, 0, 0, 0, 0) Most Specific Boundary
G0: (?, ?, ?, ?, ?) Most Generic Boundary
The first example is negative. The hypothesis at the specific boundary is consistent with it, so we retain it; the hypothesis at the generic boundary is inconsistent, so we replace it with all minimal specializations, obtained by replacing one "?" at a time with an attribute value that excludes the negative example.
S1: (0, 0, 0, 0, 0)
G1: (Many, ?, ?, ?, ?) (?, Big, ?, ?, ?) (?, Medium, ?, ?, ?) (?, ?, ?, Expensive, ?) (?, ?, ?, ?, Many) (?, ?, ?, ?, Few)
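This step can be checked with the min_specialize helper from the sketch above. The domains are read off the table; note that InLibrary takes only the value No in this data, so no specialization on that attribute can exclude the example:

```python
# Reproduce G1: minimally specialize the most generic hypothesis so
# that it excludes the first (negative) training example.
negative = ('Some', 'Small', 'No', 'Affordable', 'One')
domains = [['Many', 'Some'], ['Big', 'Medium', 'Small'], ['No'],
           ['Affordable', 'Expensive'], ['Few', 'Many', 'One']]
for h in min_specialize(('?',) * 5, negative, domains):
    print(h)
# Six hypotheses: Citations=Many, Size=Big, Size=Medium,
# Price=Expensive, Editions=Few, Editions=Many
```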
The second example is positive. The hypothesis at the specific boundary is inconsistent, so we minimally generalize it to cover the example; at the generic boundary, consistent hypotheses are retained and inconsistent ones are removed.
S2: (Many, Big, No, Expensive, Many)
G2: (Many, ?, ?, ?, ?) (?, Big, ?, ?, ?) (?, ?, ?, Expensive, ?) (?, ?, ?, ?, Many)
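The same step can be checked with the helpers sketched earlier:

```python
# Step 2: a positive example minimally generalizes S and filters
# inconsistent hypotheses out of G.
positive = ('Many', 'Big', 'No', 'Expensive', 'Many')
G1 = [('Many', '?', '?', '?', '?'), ('?', 'Big', '?', '?', '?'),
      ('?', 'Medium', '?', '?', '?'), ('?', '?', '?', 'Expensive', '?'),
      ('?', '?', '?', '?', 'Many'), ('?', '?', '?', '?', 'Few')]
print(min_generalize(('0',) * 5, positive))
# ('Many', 'Big', 'No', 'Expensive', 'Many')
print([g for g in G1 if covers(g, positive)])
# Size=Medium and Editions=Few are removed
```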
The third example is also positive and is handled the same way: the specific boundary is minimally generalized to cover it, and inconsistent hypotheses are removed from the generic boundary.
S3: (Many, ?, No, Expensive, ?)
G3: (Many, ?, ?, ?, ?) (?, ?, ?, Expensive, ?)
The fourth example is positive and is handled likewise: the specific boundary is minimally generalized, and inconsistent hypotheses are removed from the generic boundary.
S4: (Many, ?, No, ?, ?)
G4: (Many, ?, ?, ?, ?)
The version space learned by the Candidate Elimination Algorithm for the given data set is:
(Many, ?, No, ?, ?) (Many, ?, ?, ?, ?)
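Running the sketch end to end on the encoded training data reproduces this version space:

```python
# Full run of the hypothetical candidate_elimination sketch on the
# encoded book-buying examples.
S, G = candidate_elimination(examples)
print(S)  # ('Many', '?', 'No', '?', '?')
print(G)  # [('Many', '?', '?', '?', '?')]
```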
Summary
This tutorial discussed the Candidate Elimination Algorithm for finding the set of hypotheses consistent with the training data, that is, the version space. If you like the tutorial, share it with your friends.