How LSTMs Work: A Deep Dive into Gates and Information Flow
Long Short-Term Memory (LSTM) networks address the vanishing-gradient limitations of traditional RNNs through a powerful gating mechanism. This article explains how the Forget, Input, and Output gates work internally, breaking down the math, vector dimensions, and intuition behind cell states and hidden states. A deep, implementation-level guide for serious deep learning practitioners.
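The gating mechanism the article covers can be sketched in a few lines of NumPy. This is an illustrative single time step, not the article's own code; the parameter layout (dicts `W`, `U`, `b` keyed by gate name) is an assumption made for readability.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b each map a gate name
    ('f', 'i', 'o', 'g') to its weight matrix / bias (hypothetical layout)."""
    f = sigmoid(W['f'] @ x_t + U['f'] @ h_prev + b['f'])  # forget gate: what to erase from the cell
    i = sigmoid(W['i'] @ x_t + U['i'] @ h_prev + b['i'])  # input gate: how much new info to write
    o = sigmoid(W['o'] @ x_t + U['o'] @ h_prev + b['o'])  # output gate: what to expose as hidden state
    g = np.tanh(W['g'] @ x_t + U['g'] @ h_prev + b['g'])  # candidate cell values
    c_t = f * c_prev + i * g       # new cell state: keep some old, add some new
    h_t = o * np.tanh(c_t)         # new hidden state
    return h_t, c_t
```

Note how the cell state `c_t` is updated additively, which is the key to carrying gradients across long sequences.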

Aryan
Feb 4


The Complete Intuition Behind CNNs: How the Human Visual Cortex Inspired Convolutional Neural Networks
Convolutional Neural Networks (CNNs) are inspired by how our visual cortex understands shapes, edges, and patterns. This blog explains CNNs with simple intuition, real experiments like the Hubel & Wiesel cat study, the evolution from the Neocognitron to modern deep learning models, and practical applications in computer vision.
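The core operation behind the pattern detection described above can be illustrated with a minimal 2D convolution. This is a toy sketch, not code from the article; the function name and the Sobel-style kernel are chosen here for illustration.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 2D 'valid' convolution (strictly, cross-correlation,
    as in most deep learning libraries): slide the kernel over the image."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for r in range(oh):
        for c in range(ow):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

# A vertical-edge filter: it responds strongly where intensity changes
# left-to-right, echoing the orientation-selective cells that
# Hubel & Wiesel observed in the cat visual cortex.
edge_kernel = np.array([[1., 0., -1.],
                        [2., 0., -2.],
                        [1., 0., -1.]])
```

In a CNN these kernels are not hand-designed as above but learned from data; early layers typically converge to edge- and blob-like detectors.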

Aryan
Dec 31, 2025