Candidate Elimination Algorithm Solved Example – 1

 

Candidate Elimination Algorithm in Machine Learning

The Candidate Elimination Algorithm is used to find the set of all hypotheses consistent with the training examples, that is, the version space.

Click Here for Python Program to Implement Candidate Elimination Algorithm to get Consistent Version Space

Video Tutorial of Candidate Elimination Algorithm Solved Example – 1

Algorithm:

Initialize G to the set of maximally general hypotheses in H
Initialize S to the set of maximally specific hypotheses in H

For each training example d, do:
     If d is a positive example:
          Remove from G any hypothesis h that is inconsistent with d
          For each hypothesis s in S that is not consistent with d:
               Remove s from S
               Add to S all minimal generalizations h of s such that h is consistent with d and some member of G is more general than h
               Remove from S any hypothesis that is more general than another hypothesis in S
     If d is a negative example:
          Remove from S any hypothesis h that is inconsistent with d
          For each hypothesis g in G that is not consistent with d:
               Remove g from G
               Add to G all minimal specializations h of g such that h is consistent with d and some member of S is more specific than h
               Remove from G any hypothesis that is less general than another hypothesis in G
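Below is a minimal Python sketch of this procedure for conjunctive hypotheses over discrete attributes (this is an illustrative sketch, not the downloadable program linked above). It keeps S as a single hypothesis, uses '0' for the empty value ø, and skips the final pruning of redundant members of G, which is sufficient for the example worked out in this tutorial.

def covers(hypothesis, example):
    # A hypothesis covers an example if every attribute value matches or is '?'.
    return all(h in ('?', x) for h, x in zip(hypothesis, example))

def candidate_elimination(examples, labels):
    n = len(examples[0])
    S = ['0'] * n            # most specific boundary S0 ('0' stands for ø)
    G = [['?'] * n]          # most general boundary G0

    for x, label in zip(examples, labels):
        if label == 'Yes':
            # Positive example: drop members of G that do not cover it,
            # then minimally generalize S so that it covers the example.
            G = [g for g in G if covers(g, x)]
            for i in range(n):
                if S[i] == '0':
                    S[i] = x[i]
                elif S[i] != x[i]:
                    S[i] = '?'
        else:
            # Negative example: every member of G that still covers it is
            # replaced by its minimal specializations, built by copying one
            # attribute value at a time from S (each stays more general than S
            # and no longer covers the negative example).
            new_G = []
            for g in G:
                if not covers(g, x):
                    new_G.append(g)
                    continue
                for i in range(n):
                    if g[i] == '?' and S[i] not in ('?', '0') and S[i] != x[i]:
                        h = list(g)
                        h[i] = S[i]
                        new_G.append(h)
            G = new_G
    return S, G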

Solved Numerical Example – 1 (Candidate Elimination Algorithm):

Example   Sky     AirTemp   Humidity   Wind     Water   Forecast   EnjoySport
1         Sunny   Warm      Normal     Strong   Warm    Same       Yes
2         Sunny   Warm      High       Strong   Warm    Same       Yes
3         Rain    Cold      High       Strong   Warm    Change     No
4         Sunny   Warm      High       Strong   Cool    Change     Yes
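Assuming the candidate_elimination sketch given above, the table can be encoded as plain Python lists (attribute order: Sky, AirTemp, Humidity, Wind, Water, Forecast) and the final boundaries checked against the solution derived below:

examples = [
    ['Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'],
    ['Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'],
    ['Rain',  'Cold', 'High',   'Strong', 'Warm', 'Change'],
    ['Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'],
]
labels = ['Yes', 'Yes', 'No', 'Yes']

S, G = candidate_elimination(examples, labels)
print('S:', S)   # ['Sunny', 'Warm', '?', 'Strong', '?', '?']
print('G:', G)   # [['Sunny', '?', '?', '?', '?', '?'], ['?', 'Warm', '?', '?', '?', '?']]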

Solution:

S0: (ø, ø, ø, ø, ø, ø) Most Specific Boundary

G0: (?, ?, ?, ?, ?, ?) Most General Boundary

The first example is positive. The hypothesis at the specific boundary is inconsistent with it, so we minimally generalize the specific boundary; the hypothesis at the general boundary is consistent, so we retain it.

S1: (Sunny, Warm, Normal, Strong, Warm, Same)

G1: (?, ?, ?, ?, ?, ?)

The second example is also positive. Again the hypothesis at the specific boundary is inconsistent with it, so we minimally generalize the specific boundary attribute by attribute (see the short sketch below); the hypothesis at the general boundary is consistent, so we retain it.

S2: (Sunny, Warm, ?, Strong, Warm, Same)

G2: (?, ?, ?, ?, ?, ?)
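A one-line standalone sketch of this minimal generalization step (illustrative, not part of the tutorial's program): keep an attribute value where S1 and the new positive example agree, and replace it with '?' where they differ.

S1 = ['Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same']
x2 = ['Sunny', 'Warm', 'High', 'Strong', 'Warm', 'Same']
S2 = [a if a == b else '?' for a, b in zip(S1, x2)]
print(S2)   # ['Sunny', 'Warm', '?', 'Strong', 'Warm', 'Same']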

The third example is negative. The hypothesis at the specific boundary does not cover it and is therefore consistent, so we retain it; the hypothesis at the general boundary covers the negative example, so we specialize it by replacing one "?" at a time with the corresponding value from S3, keeping only those specializations that exclude the negative example (a small sketch of this step follows G3 below).

S3: (Sunny, Warm, ?, Strong, Warm, Same)

G3: (Sunny, ?, ?, ?, ?, ?)   (?, Warm, ?, ?, ?, ?)   (?, ?, ?, ?, ?, Same)
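A small standalone sketch of this specialization step (illustrative only), showing why only Sky, AirTemp and Forecast yield useful specializations: Wind and Water have the same value in S3 and in the negative example, so fixing them would not exclude it.

negative = ['Rain', 'Cold', 'High', 'Strong', 'Warm', 'Change']
S3 = ['Sunny', 'Warm', '?', 'Strong', 'Warm', 'Same']

G3 = []
for i, value in enumerate(S3):
    if value != '?' and value != negative[i]:
        h = ['?'] * 6
        h[i] = value          # copy one concrete value from S3 into (?, ?, ?, ?, ?, ?)
        G3.append(h)
print(G3)   # specializations on Sky=Sunny, AirTemp=Warm and Forecast=Same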

The fourth example is positive. The hypothesis at the specific boundary is inconsistent with it, so we minimally generalize the specific boundary; at the general boundary, (?, ?, ?, ?, ?, Same) is inconsistent with the example (its Forecast is Change, yet EnjoySport is Yes), so it is removed, and the remaining consistent hypotheses are retained.

S4: (Sunny, Warm, ?, Strong, ?, ?)

G4: (Sunny, ?, ?, ?, ?, ?)   (?, Warm, ?, ?, ?, ?)

The version space learned by the Candidate Elimination Algorithm for the given data set consists of all hypotheses lying between S4 and G4:

(Sunny, Warm, ?, Strong, ?, ?)

(Sunny, ?, ?, Strong, ?, ?)   (Sunny, Warm, ?, ?, ?, ?)   (?, Warm, ?, Strong, ?, ?)

(Sunny, ?, ?, ?, ?, ?)   (?, Warm, ?, ?, ?, ?)
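As a quick verification sketch (assuming covers, examples and labels from the earlier snippets are in scope; this helper is illustrative, not part of the original tutorial), every hypothesis between S4 and G4 can be generated by turning some of S4's concrete values into '?', keeping only those that classify all four training examples correctly:

from itertools import product

S4 = ['Sunny', 'Warm', '?', 'Strong', '?', '?']
choices = [(v, '?') if v != '?' else ('?',) for v in S4]

version_space = [
    list(h) for h in product(*choices)
    if all(covers(h, x) == (y == 'Yes') for x, y in zip(examples, labels))
]
for h in version_space:
    print(h)   # prints the six hypotheses listed above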

Summary

This tutorial discussed the Candidate Elimination Algorithm and worked through a solved example to find the set of consistent hypotheses (the version space) in Machine Learning. If you found the tutorial helpful, share it with your friends, like the Facebook page for regular updates, and subscribe to the YouTube channel for video tutorials.
