Candidate Elimination Algorithm Solved Example – 3

 

Candidate Elimination Algorithm in Machine Learning

The Candidate Elimination Algorithm is used to find the set of all hypotheses consistent with the training examples, that is, the version space.


Algorithm:

For each training example d, do:
     If d is a positive example:
          Remove from G any hypothesis inconsistent with d
          For each hypothesis s in S that is not consistent with d:
               Remove s from S
               Add to S all minimal generalizations h of s such that h is consistent with d and some member of G is more general than h
               Remove from S any hypothesis that is more general than another hypothesis in S
     If d is a negative example:
          Remove from S any hypothesis inconsistent with d
          For each hypothesis g in G that is not consistent with d:
               Remove g from G
               Add to G all minimal specializations h of g such that h is consistent with d and some member of S is more specific than h
               Remove from G any hypothesis that is less general than another hypothesis in G
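The steps above can be sketched in Python. This is an illustrative implementation, not the site's companion program: hypotheses are tuples of attribute values, '?' means "any value", '0' marks the null (maximally specific) attribute, and, as in the worked example that follows, attribute domains are taken only from the values observed in the training data.

```python
def more_general(h1, h2):
    """True if hypothesis h1 is more general than or equal to h2.
    '0' (the null attribute) is more specific than any value."""
    return all(a == '?' or a == b or b == '0' for a, b in zip(h1, h2))

def covers(h, x):
    """A conjunctive hypothesis covers an instance if every attribute matches or is '?'."""
    return all(a == '?' or a == v for a, v in zip(h, x))

def candidate_elimination(examples):
    """examples: list of (attribute-tuple, bool) pairs; returns the (S, G) boundaries."""
    n = len(examples[0][0])
    S = [('0',) * n]          # most specific boundary
    G = [('?',) * n]          # most general boundary
    # Attribute domains inferred from the data, as in the solved example.
    domains = [sorted({x[i] for x, _ in examples}) for i in range(n)]
    for x, positive in examples:
        if positive:
            G = [g for g in G if covers(g, x)]
            new_S = []
            for s in S:
                if covers(s, x):
                    new_S.append(s)
                    continue
                # Minimal generalization: fill '0' slots, widen mismatches to '?'.
                h = tuple(v if a == '0' else (a if a == v else '?')
                          for a, v in zip(s, x))
                if any(more_general(g, h) for g in G):
                    new_S.append(h)
            S = new_S
        else:
            S = [s for s in S if not covers(s, x)]
            new_G = []
            for g in G:
                if not covers(g, x):
                    new_G.append(g)
                    continue
                # Minimal specializations: fix one '?' to a value the example lacks.
                for i, a in enumerate(g):
                    if a != '?':
                        continue
                    for v in domains[i]:
                        if v != x[i]:
                            h = g[:i] + (v,) + g[i + 1:]
                            if any(more_general(h, s) for s in S):
                                new_G.append(h)
            # Drop any member less general than another member of G.
            G = [g for g in new_G
                 if not any(g2 != g and more_general(g2, g) for g2 in new_G)]
    return S, G

# The data set from the solved example below.
books = [
    (('Some', 'Small', 'No', 'Affordable', 'One'),  False),
    (('Many', 'Big', 'No', 'Expensive', 'Many'),    True),
    (('Many', 'Medium', 'No', 'Expensive', 'Few'),  True),
    (('Many', 'Small', 'No', 'Affordable', 'Many'), True),
]
S, G = candidate_elimination(books)
print(S)  # [('Many', '?', 'No', '?', '?')]
print(G)  # [('Many', '?', '?', '?', '?')]
```

Running it on the data set below reproduces the S4 and G4 boundaries derived by hand in the solution.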

Solved Numerical Example – 3 (Candidate Elimination Algorithm):

Example   Citations   Size     InLibrary   Price        Editions   Buy
1         Some        Small    No          Affordable   One        No
2         Many        Big      No          Expensive    Many       Yes
3         Many        Medium   No          Expensive    Few        Yes
4         Many        Small    No          Affordable   Many       Yes

Solution:

S0: (0, 0, 0, 0, 0) Most Specific Boundary

G0: (?,  ?,  ?, ?, ?) Most Generic Boundary

The first example is negative. The hypothesis at the specific boundary is consistent with it, so we retain it. The hypothesis at the general boundary is inconsistent, so we replace it with all minimal specializations: for each attribute, we substitute each observed value that differs from the value in the negative example.

S1: (0, 0, 0, 0, 0)

G1: (Many, ?, ?, ?, ?), (?, Big, ?, ?, ?), (?, Medium, ?, ?, ?), (?, ?, ?, Expensive, ?), (?, ?, ?, ?, Many), (?, ?, ?, ?, Few)

The second example is positive. The hypothesis at the specific boundary is inconsistent with it, so we minimally generalize it to cover the example. At the general boundary, consistent hypotheses are retained, while the inconsistent ones, (?, Medium, ?, ?, ?) and (?, ?, ?, ?, Few), are removed.

S2: (Many, Big, No, Expensive, Many)

G2: (Many, ?, ?, ?, ?), (?, Big, ?, ?, ?), (?, ?, ?, Expensive, ?), (?, ?, ?, ?, Many)

The third example is positive. Again we minimally generalize the specific boundary to cover it; at the general boundary, the inconsistent hypotheses (?, Big, ?, ?, ?) and (?, ?, ?, ?, Many) are removed.

S3: (Many, ?, No, Expensive, ?)

G3: (Many, ?, ?, ?, ?), (?, ?, ?, Expensive, ?)

The fourth example is also positive. The specific boundary is minimally generalized once more; at the general boundary, the inconsistent hypothesis (?, ?, ?, Expensive, ?) is removed.

S4: (Many, ?, No, ?, ?)

G4: (Many, ?, ?, ?, ?)

The version space learned by the Candidate Elimination Algorithm for the given data set is:

(Many, ?, No, ?, ?), (Many, ?, ?, ?, ?)
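The learned version space can classify new books by voting its boundary hypotheses: if every boundary hypothesis covers the instance it is a confident Yes, if none does it is a No, and a split vote means the training data cannot decide. A minimal sketch, where the test instances are hypothetical and not taken from the data set:

```python
def covers(h, x):
    """A conjunctive hypothesis covers an instance if every attribute matches or is '?'."""
    return all(a == '?' or a == v for a, v in zip(h, x))

def classify(x, S, G):
    """Unanimous vote of the boundary hypotheses -> definite label; split -> Unknown."""
    votes = [covers(h, x) for h in S + G]
    if all(votes):
        return 'Yes'
    if not any(votes):
        return 'No'
    return 'Unknown'

# The boundaries learned above for the book data set.
S = [('Many', '?', 'No', '?', '?')]
G = [('Many', '?', '?', '?', '?')]

print(classify(('Many', 'Small', 'No', 'Affordable', 'Few'), S, G))  # Yes
print(classify(('Some', 'Big', 'No', 'Expensive', 'Many'), S, G))    # No
print(classify(('Many', 'Big', 'Yes', 'Expensive', 'One'), S, G))    # Unknown
```

The third instance is covered by G4 but not by S4, so the two remaining hypotheses disagree and more training examples would be needed to resolve it.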

Summary

This tutorial discussed how the Candidate Elimination Algorithm finds the set of hypotheses consistent with the training examples in Machine Learning. If you found the tutorial helpful, share it with your friends, like the Facebook page for regular updates, and subscribe to the YouTube channel for video tutorials.
