DECISION TREES - 3
Decision trees measure feature importance via impurity reduction (e.g., Gini). Overfitting occurs when a tree fits noise in the training data rather than genuine patterns. Pruning reduces complexity in two ways: pre-pruning limits growth up front with constraints such as maximum depth or a minimum number of samples, while post-pruning, such as cost complexity pruning, grows the full tree and then trims nodes back. Both approaches improve generalization, so the pruned tree performs better on unseen data.
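
As a minimal sketch of these ideas, assuming scikit-learn (not named in the post), the snippet below builds a pre-pruned tree with max_depth and min_samples_leaf, a post-pruned tree via cost complexity pruning (ccp_alpha), and prints the impurity-based feature importances; the choice of a mid-range alpha is arbitrary and only for illustration.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Pre-pruning: cap depth and require a minimum number of samples per leaf
# so the tree stops growing before it can memorise noise.
pre_pruned = DecisionTreeClassifier(max_depth=3, min_samples_leaf=5, random_state=0)
pre_pruned.fit(X_train, y_train)

# Post-pruning (cost complexity pruning): compute the candidate ccp_alpha
# values, then refit with a chosen alpha; larger alphas remove more nodes.
# Picking the mid-range alpha here is purely illustrative.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X_train, y_train)
alpha = path.ccp_alphas[len(path.ccp_alphas) // 2]
post_pruned = DecisionTreeClassifier(ccp_alpha=alpha, random_state=0)
post_pruned.fit(X_train, y_train)

print("pre-pruned test accuracy: ", pre_pruned.score(X_test, y_test))
print("post-pruned test accuracy:", post_pruned.score(X_test, y_test))

# Impurity-based (Gini) feature importances: the total impurity reduction
# contributed by each feature, normalised to sum to 1.
print("feature importances:", pre_pruned.feature_importances_)

In practice the alpha would be selected by cross-validation rather than taken from the middle of the path, but the structure is the same: constrain growth up front, or grow fully and prune back.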

Aryan
May 17