

DECISION TREES - 2
Dive into Decision Trees for Regression (CART) and understand its core mechanics for continuous target variables. This post covers how CART evaluates splits using Mean Squared Error (MSE), its geometric interpretation as axis-aligned regions of feature space, and the step-by-step process of making predictions for both regression and classification tasks. Discover its advantages in handling non-linear data and key disadvantages like overfitting, which motivate the need for regularization. A small sketch of the MSE split criterion follows this entry.

Aryan
May 17
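
To make the split criterion concrete, here is a minimal sketch (not the post's code, just an illustration under assumed names like `split_mse` and toy data) of how a regression CART scores one candidate axis-aligned split: compute the weighted MSE of the two child nodes and prefer the split that minimizes it.

```python
import numpy as np

def mse(y):
    # MSE of a node is the variance of its targets around their mean.
    return float(np.mean((y - y.mean()) ** 2)) if len(y) else 0.0

def split_mse(X, y, feature, threshold):
    # Axis-aligned split: the left region holds rows with X[:, feature] <= threshold.
    left = X[:, feature] <= threshold
    right = ~left
    n = len(y)
    # Weighted average of the children's MSEs; CART searches over
    # (feature, threshold) pairs and keeps the one minimizing this value.
    return (left.sum() / n) * mse(y[left]) + (right.sum() / n) * mse(y[right])

# Score one hypothetical candidate split on toy data.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([1.2, 1.9, 7.8, 8.1])
print(split_mse(X, y, feature=0, threshold=2.5))  # low weighted MSE -> good split
```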


DECISION TREES - 1
Discover the power of decision trees in machine learning. This post dives into their intuitive approach, their versatility for classification and regression, and the CART algorithm. Learn how Gini impurity and splitting criteria partition data for accurate predictions. Perfect for data science enthusiasts! A small sketch of Gini impurity follows this entry.

Aryan
May 16
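
As a quick illustration of the Gini impurity mentioned above (a sketch under my own assumptions, not code from the post): Gini = 1 minus the sum of squared class proportions, so a pure node scores 0 and a perfectly mixed two-class node scores 0.5.

```python
from collections import Counter

def gini(labels):
    # Gini impurity: 1 - sum_k p_k^2, where p_k is the fraction of samples in class k.
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

print(gini(["a", "a", "a", "a"]))  # 0.0  (pure node)
print(gini(["a", "a", "b", "b"]))  # 0.5  (maximally mixed binary node)
```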