The bias-variance decomposition explains where a model's prediction error comes from. Bias is error from a model that is too simple to capture real patterns in the data (underfitting); variance is error from a model that is too complex, fitting noise specific to its particular training set and therefore generalizing poorly (overfitting). For squared loss, expected error decomposes into bias squared plus variance plus irreducible noise, so minimizing total error means balancing the first two terms. Because reducing bias tends to increase variance and vice versa, this trade-off is managed through deliberate adjustments such as changing model complexity or applying regularization.
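The trade-off can be estimated empirically. The sketch below (an illustrative setup, not a canonical recipe: the target function sin(2πx), noise level, and polynomial degrees are all arbitrary choices) fits polynomials of varying degree to many resampled training sets and measures squared bias (how far the average prediction is from the truth) and variance (how much predictions scatter across training sets):

```python
import numpy as np

rng = np.random.default_rng(0)
x_test = np.linspace(0, 1, 100)
f_true = np.sin(2 * np.pi * x_test)  # true target function (illustrative choice)

def predictions(degree, n_datasets=200, n_points=30, noise=0.3):
    """Fit a polynomial of `degree` to many independently sampled
    training sets; return predictions on the fixed test grid."""
    preds = np.empty((n_datasets, x_test.size))
    for i in range(n_datasets):
        x = rng.uniform(0, 1, n_points)
        y = np.sin(2 * np.pi * x) + rng.normal(0, noise, n_points)
        coefs = np.polyfit(x, y, degree)
        preds[i] = np.polyval(coefs, x_test)
    return preds

def bias_variance(degree):
    preds = predictions(degree)
    avg = preds.mean(axis=0)                # the "average model"
    bias_sq = np.mean((avg - f_true) ** 2)  # squared bias: average model vs truth
    variance = np.mean(preds.var(axis=0))   # spread of predictions across datasets
    return bias_sq, variance

for d in (1, 3, 9):
    b, v = bias_variance(d)
    print(f"degree {d}: bias^2 = {b:.3f}, variance = {v:.3f}")
```

Running this shows the expected pattern: the degree-1 fit has high bias and low variance, the degree-9 fit has low bias but much higher variance, and an intermediate degree balances the two.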