

XGBoost For Regression
Dive into a step-by-step explanation of how XGBoost handles regression problems using a CGPA vs. salary dataset. Understand residual learning, tree construction, similarity scores, gain calculations, and how each stage progressively refines model accuracy. Ideal for beginners and intermediate learners mastering XGBoost.

Aryan
Aug 11
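As a quick preview of the arithmetic the post walks through, here is a minimal sketch of XGBoost's similarity score and gain for a single split on residuals under squared-error loss. The residual values and the lambda setting are illustrative placeholders, not numbers from the post's CGPA-vs-salary example.

```python
# Minimal sketch: similarity score and gain for one split in XGBoost regression
# (squared-error loss). Residuals and lambda below are illustrative placeholders.

def similarity_score(residuals, lam=1.0):
    """Similarity = (sum of residuals)^2 / (number of residuals + lambda)."""
    return sum(residuals) ** 2 / (len(residuals) + lam)

def gain(left, right, lam=1.0):
    """Gain of a split = similarity(left) + similarity(right) - similarity(parent)."""
    return (similarity_score(left, lam)
            + similarity_score(right, lam)
            - similarity_score(left + right, lam))

# Residuals of four training points after the initial (base) prediction:
residuals = [-10.0, -5.0, 7.0, 12.0]
print(gain(residuals[:2], residuals[2:]))  # a higher gain means a better split
```

In the full algorithm, this gain is compared against the pruning parameter gamma to decide whether the split is kept.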


Introduction to XGBoost
XGBoost is one of the most powerful tools for structured/tabular data, known for its speed, scalability, and strong predictive performance. In this post, I’ve shared a detailed explanation of what makes XGBoost so effective, along with its history, key features, and real-world uses. A great resource for anyone learning ML!

Aryan
Jul 26


KNN (K-Nearest Neighbors)
Understand K-Nearest Neighbors (KNN), a lazy learning algorithm that predicts by finding the closest training data points. Explore how it works, its classification and regression modes, key hyperparameters, overfitting/underfitting issues, and optimized search structures like KD-Tree and Ball Tree for efficient computation.

Aryan
Feb 22
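For readers who want the core idea in a few lines, here is a minimal brute-force KNN classifier sketch. The toy points, labels, and k value are made up for illustration; the post also covers the KD-Tree and Ball Tree structures that avoid this all-pairs distance scan.

```python
# Minimal brute-force KNN classifier sketch; the toy data below is illustrative.
from collections import Counter
import math

def knn_predict(X_train, y_train, query, k=3):
    """Classify the query point by majority vote among its k nearest training points."""
    distances = [(math.dist(x, query), label) for x, label in zip(X_train, y_train)]
    k_nearest = sorted(distances)[:k]  # "lazy" learning: all the work happens at prediction time
    return Counter(label for _, label in k_nearest).most_common(1)[0][0]

X_train = [(1.0, 2.0), (1.5, 1.8), (5.0, 8.0), (6.0, 9.0)]
y_train = ["A", "A", "B", "B"]
print(knn_predict(X_train, y_train, query=(1.2, 1.9), k=3))  # -> "A"
```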


Gradient Descent
What if your machine learning model was a hiker, blindfolded, stumbling its way to the lowest point in a vast valley? That’s Gradient Descent in action. Dive into this blog to understand not just the algorithm, but the soul of how models learn — with crystal-clear math, visuals, and real-world analogies.

Aryan
Jan 7
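For a concrete version of that blindfolded hiker, here is a minimal gradient-descent loop on a one-dimensional quadratic. The function, starting point, and learning rate are arbitrary choices for illustration, not anything specific from the post.

```python
# Minimal gradient-descent sketch on f(x) = (x - 3)^2; all values are illustrative.

def grad(x):
    """Derivative of f(x) = (x - 3)^2."""
    return 2 * (x - 3)

x = 10.0    # starting point: where the "hiker" begins
lr = 0.1    # learning rate: the size of each downhill step
for _ in range(50):
    x -= lr * grad(x)   # step in the direction of steepest descent

print(round(x, 4))  # approaches the minimum at x = 3
```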