How to find the Entropy – Decision Tree Learning – Machine Learning

In this tutorial, we will work through how to compute the entropy for the four probabilities p1 = 0.1, p2 = 0.2, p3 = 0.3, and p4 = 0.4 in decision tree learning.

Solution:

We know that the equation for entropy is:

Entropy = -∑ pi × log2(pi)
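For readers who want to compute this programmatically, here is a minimal Python sketch of the formula (the function name entropy is our own choice for illustration, not part of any particular library):

import math

def entropy(probabilities):
    """Shannon entropy in bits: -sum(p * log2(p)) over the probabilities."""
    # Terms with p = 0 contribute nothing, by the convention 0 * log2(0) = 0.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)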

We are given four probabilities, so expanding the sum gives:

Entropy = -p1 × log2(p1) - p2 × log2(p2) - p3 × log2(p3) - p4 × log2(p4)

Now we substitute the values of p1, p2, p3, and p4 into the equation.

Entropy = -0.1 × log2(0.1) - 0.2 × log2(0.2) - 0.3 × log2(0.3) - 0.4 × log2(0.4)

Entropy = -0.1 × (-3.32193) - 0.2 × (-2.32193) - 0.3 × (-1.73697) - 0.4 × (-1.32193)

Entropy = 0.33219 + 0.46439 + 0.52109 + 0.52877

Entropy ≈ 1.8464

The entropy is approximately 1.8464 bits for the given four probabilities p1 = 0.1, p2 = 0.2, p3 = 0.3, and p4 = 0.4. (The result is in bits because the logarithm is taken to base 2.)
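As a quick check, evaluating the sketch above on the same probabilities reproduces the hand calculation:

print(entropy([0.1, 0.2, 0.3, 0.4]))  # prints approximately 1.8464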

Summary:

In this tutorial, we worked through how to compute the entropy of a given set of probabilities in decision tree learning.

If you like the tutorial, share it with your friends. Like the Facebook page for regular updates and the YouTube channel for video tutorials.

