Why Is Self-Attention Called “Self”? Understanding Attention Mechanisms from Encoder–Decoder to Transformers
This post explains why self-attention qualifies as an attention mechanism and why the term “self” is used. By revisiting encoder–decoder attention, Luong attention, and alignment scores, we build a clear intuition for how self-attention works within a single sequence.

Aryan
Feb 28
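
To make the “self” concrete before diving in, here is a minimal NumPy sketch of scaled dot-product self-attention, in which the queries, keys, and values are all projections of the same input sequence. This is an illustrative assumption of the setup the post discusses, not code from the post; the names (`X`, `W_q`, `W_k`, `W_v`, `d_model`) and dimensions are hypothetical.

```python
import numpy as np

# Minimal sketch (illustrative, not from the post): scaled dot-product
# self-attention over a single sequence.
rng = np.random.default_rng(0)

seq_len, d_model = 4, 8                       # 4 tokens, 8-dim embeddings
X = rng.standard_normal((seq_len, d_model))   # one input sequence

# In self-attention, queries, keys, and values are all projections
# of the SAME sequence X -- hence "self".
W_q = rng.standard_normal((d_model, d_model))
W_k = rng.standard_normal((d_model, d_model))
W_v = rng.standard_normal((d_model, d_model))

Q, K, V = X @ W_q, X @ W_k, X @ W_v

# Alignment scores: every token of X attends to every token of X.
scores = Q @ K.T / np.sqrt(d_model)

# Row-wise softmax turns alignment scores into attention weights.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

output = weights @ V              # weighted sum of values
print(output.shape)               # (4, 8): one context vector per token
```

In encoder–decoder attention, by contrast, the queries would come from the decoder states while the keys and values come from the encoder states; collapsing all three onto one sequence is exactly what earns the name “self”-attention.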