Entropy, information gain, and Gini impurity
Decision trees are supervised machine-learning models used to solve classification and regression problems. They make decisions by breaking a problem down into a sequence of if-else evaluations that form a tree-like structure.
To make sound decisions, a decision tree builds itself by repeatedly splitting on the features that give the most information about the target feature until a pure final decision is achieved.
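As a rough illustration of that idea, here is a minimal sketch in Python. It is not code from this article: the toy dataset, the candidate tests, and the helper names (`entropy`, `information_gain`) are assumptions made purely to show how a node might compare if-else splits by how much purer they make the children.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a 1-D array of class labels."""
    _, counts = np.unique(labels, return_counts=True)
    probs = counts / counts.sum()
    return -np.sum(probs * np.log2(probs))

def information_gain(labels, mask):
    """Reduction in entropy from splitting the labels by a boolean mask."""
    left, right = labels[mask], labels[~mask]
    weighted = (len(left) * entropy(left) + len(right) * entropy(right)) / len(labels)
    return entropy(labels) - weighted

# Hypothetical toy data: two features, binary target.
X = np.array([[2.0, 1], [1.0, 0], [3.5, 1], [0.5, 0], [2.5, 0]])
y = np.array([1, 0, 1, 0, 1])

# Evaluate one candidate "if-else" test per feature and keep the best one.
candidates = {"feature_0 <= 1.5": X[:, 0] <= 1.5, "feature_1 == 0": X[:, 1] == 0}
best = max(candidates, key=lambda name: information_gain(y, candidates[name]))
print(best, information_gain(y, candidates[best]))
```

On this toy data the test `feature_0 <= 1.5` separates the classes perfectly, so it has the highest information gain and would be chosen as the split.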
This article will discuss three common splitting criteria used in decision tree building:
- Entropy
- Information gain
- Gini impurity
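For a first taste of the third criterion, here is a quick sketch of Gini impurity, which plays the same role as entropy above (it is not the article's code, and it assumes the labels arrive as a 1-D NumPy array):

```python
import numpy as np

def gini_impurity(labels):
    """Chance of mislabelling a sample drawn at random from this node."""
    _, counts = np.unique(labels, return_counts=True)
    probs = counts / counts.sum()
    return 1.0 - np.sum(probs ** 2)

print(gini_impurity(np.array([1, 0, 1, 0, 1])))  # mixed node -> 0.48
print(gini_impurity(np.array([1, 1, 1])))        # pure node  -> 0.0
```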
…
Written on December 6, 2022