Entropy, information gain, and Gini impurity


Decision trees are supervised machine-learning models used to solve classification and regression problems. They make predictions by breaking a problem down into a series of if-then-else evaluations, which together form a tree-like structure.

To make good decisions, a decision tree is built by repeatedly splitting on the features that provide the most information about the target variable, until the resulting nodes are as pure as possible.

This article will discuss three common splitting criteria used in decision tree building (a small numeric sketch follows the list):

  • Entropy
  • Information gain
  • Gini impurity
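
As a rough illustration (not part of the original article), here is a minimal NumPy sketch of how these three quantities can be computed from a node's class labels. The function names and the toy split are assumptions made purely for demonstration.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(labels):
    """Gini impurity of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def information_gain(parent, children):
    """Parent entropy minus the size-weighted entropy of the child nodes."""
    n = len(parent)
    weighted_child_entropy = sum(len(c) / n * entropy(c) for c in children)
    return entropy(parent) - weighted_child_entropy

# Toy example: a parent node with 4 negatives and 6 positives,
# split into two child nodes.
parent = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1, 1])
left   = np.array([0, 0, 0, 0, 1])
right  = np.array([1, 1, 1, 1, 1])

print(f"Parent entropy: {entropy(parent):.3f}")   # ~0.971 bits
print(f"Parent Gini:    {gini(parent):.3f}")      # 0.480
print(f"Information gain of the split: {information_gain(parent, [left, right]):.3f}")
```

In this toy split, the right child is perfectly pure (entropy 0), so most of the parent's uncertainty is removed and the information gain is relatively high; a real decision tree would compare such gains across candidate splits and pick the best one.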


Written on December 6, 2022