All Posts


Ensemble Learning
Ensemble Learning combines multiple machine learning models to improve accuracy, stability, and generalization. Inspired by the “Wisdom of the Crowd,” it relies on the idea that diverse models can correct each other’s errors. Popular methods include Voting, Bagging, Boosting, and Stacking. These approaches reduce overfitting, handle variance or bias, and enhance performance, making ensemble learning a key technique in modern machine learning.
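As a taste of the simplest ensemble method, here is a minimal pure-Python sketch of hard (majority) voting; the three model outputs are made-up toy values, not from any trained model:

```python
from collections import Counter

def hard_vote(predictions):
    """Majority ('hard') vote across model predictions.
    predictions: list of per-model prediction lists, all the same length.
    Returns the most common label at each position."""
    return [Counter(labels).most_common(1)[0][0]
            for labels in zip(*predictions)]

# Three hypothetical models disagree on individual samples;
# the ensemble vote corrects each model's isolated mistakes.
model_a = [1, 0, 1, 1]
model_b = [1, 1, 1, 0]
model_c = [0, 0, 1, 1]
print(hard_vote([model_a, model_b, model_c]))  # [1, 0, 1, 1]
```

Note that no single model above got every sample right, yet the vote did, which is the "Wisdom of the Crowd" idea in miniature.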

Aryan
May 17


DECISION TREES - 2
Dive into Decision Trees for Regression (CART) and understand its core mechanics for continuous target variables. This post covers how CART evaluates splits using Mean Squared Error (MSE), its geometric interpretation of creating axis-aligned regions, and the step-by-step process of making predictions for both regression and classification tasks. Discover its advantages in handling non-linear data and key disadvantages like overfitting, which underscore the need for regularization.
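A minimal sketch of the split search the post describes, for a single feature: each candidate threshold is scored by the weighted MSE of the two children it creates (the data below is illustrative):

```python
def mse(values):
    """Mean squared error of values around their mean."""
    if not values:
        return 0.0
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

def best_split(xs, ys):
    """Scan candidate thresholds on one feature and return the
    (weighted child MSE, threshold) pair that minimizes impurity."""
    best = (float("inf"), None)
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue  # a split must put data on both sides
        score = (len(left) * mse(left) + len(right) * mse(right)) / len(ys)
        best = min(best, (score, t))
    return best

# Targets cluster around 1 for small x and around 5 for large x,
# so the best threshold separates the two groups.
print(best_split([1, 2, 10, 11], [1.0, 1.1, 5.0, 5.2]))  # (0.00625, 2)
```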

Aryan
May 17


DECISION TREES - 1
Discover the power of decision trees in machine learning. This post dives into their intuitive approach, versatility for classification and regression, and the CART algorithm. Learn how Gini impurity and splitting criteria partition data for accurate predictions. Perfect for data science enthusiasts!
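Gini impurity, the splitting criterion the post centers on, fits in a few lines; the labels below are illustrative:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum(p_k^2) over class proportions p_k.
    0 means a pure node; higher means more mixed."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

print(gini(["yes", "yes", "yes"]))       # 0.0  (pure node)
print(gini(["yes", "no", "yes", "no"]))  # 0.5  (maximally mixed, 2 classes)
```

CART greedily picks the split whose children have the lowest weighted Gini impurity.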

Aryan
May 16


Support Vector Machine (SVM) – Part 7
This blog demystifies how the RBF kernel in SVM creates highly adaptive, local decision boundaries by emphasizing similarity between nearby points. You'll understand the role of gamma, how the kernel's geometry defines regions of influence, and why RBF enables powerful non-linear classification by implicitly mapping data into infinite-dimensional space.
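The kernel itself is one line of math; this small sketch (with toy points) shows how gamma controls each point's region of influence:

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    """K(x, z) = exp(-gamma * ||x - z||^2): a similarity that decays
    with squared distance; larger gamma shrinks the region of influence."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

# Nearby points look very similar; distant points barely interact.
print(rbf_kernel([0, 0], [0.1, 0.1], gamma=1.0))    # ~0.98
print(rbf_kernel([0, 0], [3, 3], gamma=1.0))        # ~1.5e-08
# The same nearby pair looks far less similar under a large gamma,
# which is why big gamma values produce tight, local boundaries.
print(rbf_kernel([0, 0], [0.1, 0.1], gamma=100.0))  # ~0.135
```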

Aryan
May 8


Support Vector Machine (SVM) – Part 6
Dual Formulation in SVMs isn’t just a rewrite—it’s a revolution. It shifts the optimization game from weight space to α-space, focusing only on support vectors. We unpack how dual SVMs compute smarter, enable kernel tricks, and efficiently solve non-linear problems. This post breaks it all down—from dot products to RBFs—with clarity, code, and geometric insight.
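The dual decision function the post unpacks can be sketched directly; the support vectors, multipliers, and labels below are hand-picked toy values, not the output of a real solver:

```python
def dual_decision(x, support_vectors, alphas, labels, b, kernel):
    """SVM dual-form decision value:
    f(x) = sum_i alpha_i * y_i * K(x_i, x) + b.
    Only the support vectors (alpha_i > 0) contribute to the sum."""
    return sum(a * y * kernel(sv, x)
               for a, y, sv in zip(alphas, labels, support_vectors)) + b

# A linear kernel is just a dot product; swapping in an RBF here is
# the entire "kernel trick" from the model's point of view.
dot = lambda u, v: sum(a * b for a, b in zip(u, v))

f = dual_decision([2, 0], [[1, 0], [-1, 0]], [0.5, 0.5], [1, -1], 0.0, dot)
print(f)  # 2.0 -> positive side of the boundary
```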

Aryan
May 5


Support Vector Machine (SVM) – Part 5
Step beyond 2‑D lines into n‑D hyperplanes. This post walks you through soft‑margin SVMs, inequality‑constrained optimisation, KKT conditions, primal‑to‑dual conversion, and why only a handful of support vectors end up steering the whole classifier—your cheat‑sheet to scaling SVMs without losing your cool.
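As a cheat-sheet companion, the KKT system for the soft-margin primal that the post walks through can be summarized as:

```latex
\begin{aligned}
&\text{Stationarity:} && w = \sum_i \alpha_i y_i x_i, \qquad \sum_i \alpha_i y_i = 0, \qquad C - \alpha_i - \mu_i = 0 \\
&\text{Primal feasibility:} && y_i (w^\top x_i + b) \ge 1 - \xi_i, \qquad \xi_i \ge 0 \\
&\text{Dual feasibility:} && \alpha_i \ge 0, \qquad \mu_i \ge 0 \\
&\text{Compl.\ slackness:} && \alpha_i \big[ y_i (w^\top x_i + b) - 1 + \xi_i \big] = 0, \qquad \mu_i \xi_i = 0
\end{aligned}
```

Any point with α_i = 0 drops out of the stationarity sum for w entirely, which is exactly why only the handful of support vectors (α_i > 0) steer the whole classifier.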

Aryan
May 4


Support Vector Machine (SVM) – Part 4
From toy circles to cutting-edge classifiers, this post shows how Support Vector Machines harness constrained optimization: we chart contours, trace gradient vectors, and align them with Lagrange multipliers to see exactly how SVMs carve out the widest possible margin. Ready to bridge raw calculus and real-world margin magic?
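The geometric picture described here, gradients of the objective and the constraint aligning at the optimum, is the Lagrange condition:

```latex
\nabla f(x^{*}) = \lambda \, \nabla g(x^{*}), \qquad g(x^{*}) = 0,
\qquad \text{i.e. } \nabla_x \, \mathcal{L}(x, \lambda) = 0
\text{ for } \mathcal{L}(x, \lambda) = f(x) - \lambda \, g(x).
```

At a constrained optimum the level curve of f is tangent to the constraint surface, so the two gradients must be parallel, with λ as the scaling factor.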

Aryan
May 2


Support Vector Machine (SVM) – Part 3
Support Vector Classifiers (SVCs) struggle when data isn't linearly separable. The real world isn't clean, and straight-line boundaries fail. That's where constrained optimization and the kernel trick step in—transforming SVC into full-blown SVMs capable of tackling nonlinear patterns with elegance and efficiency.
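The kernel trick can be verified in a few lines: for the degree-2 polynomial kernel in 2-D, the kernel value equals a dot product in an explicit (never materialized in practice) feature space. The feature map below is the standard one for this kernel, shown purely for illustration:

```python
import math

def phi(x):
    """Explicit feature map for the degree-2 polynomial kernel in 2-D:
    phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)."""
    return (x[0] ** 2, math.sqrt(2) * x[0] * x[1], x[1] ** 2)

def poly2_kernel(x, z):
    """Kernel trick: (x . z)^2, computed without ever building phi."""
    return (x[0] * z[0] + x[1] * z[1]) ** 2

x, z = (1.0, 2.0), (3.0, 0.5)
explicit = sum(a * b for a, b in zip(phi(x), phi(z)))
print(poly2_kernel(x, z), explicit)  # both evaluate to 16 (up to float round-off)
```

The same identity is what lets an SVM work in infinite-dimensional spaces (as with the RBF kernel) while only ever computing kernel values between pairs of points.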

Aryan
Apr 30


Support Vector Machine (SVM) – Part 2
Hard‑margin SVMs look clean on the whiteboard—huge margin, zero errors—but real‑world data laughs at that rigidity. Noise, overlap, and outliers wreck the ‘perfectly separable’ dream, leaving the model unsolvable. Cue slack variables: a pragmatic detour that births the soft‑margin SVM and keeps classification sane.
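For reference, the slack variables ξ_i turn the unsolvable hard-margin problem into the soft-margin objective, with C trading margin width against violations:

```latex
\min_{w,\,b,\,\xi} \;\; \tfrac{1}{2}\lVert w \rVert^2 + C \sum_i \xi_i
\quad \text{s.t.} \quad y_i (w^\top x_i + b) \ge 1 - \xi_i, \qquad \xi_i \ge 0.
```

A point with 0 < ξ_i ≤ 1 sits inside the margin; ξ_i > 1 means it is outright misclassified, yet the problem stays solvable.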

Aryan
Apr 28


Support Vector Machine (SVM) – Part 1
Discover the core idea of Hard Margin SVM — finding the hyperplane that perfectly separates two classes with the widest margin. With student placement data as an example, this blog explains support vectors, margin equations, and the math behind maximal margin classification. Learn how SVM makes decisions and why hard margin isn't always practical in real-world data.
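The maximal-margin setup the post derives can be summarized as:

```latex
\min_{w,\,b} \;\; \tfrac{1}{2}\lVert w \rVert^2
\quad \text{s.t.} \quad y_i (w^\top x_i + b) \ge 1 \;\; \forall i,
\qquad \text{margin width} = \frac{2}{\lVert w \rVert}.
```

Maximizing the margin 2/‖w‖ is equivalent to minimizing ½‖w‖², and the points that satisfy the constraint with equality are the support vectors.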

Aryan
Apr 26