Understanding Overfitting and Underfitting in Layman's Terms
Friends, we have all heard the terms underfitting and overfitting while building classification ML models. In this blog, I will try to explain them in simpler terms. Before understanding these terms, we need to know about bias and variance. Let's define each in one simple sentence:
Bias:
Put simply, it is the error the model makes on the training data.
Variance:
Put simply, it is the error the model makes on the testing (unseen) data.
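To make this concrete, here is a minimal sketch (assuming scikit-learn and a synthetic dataset, both chosen purely for illustration) that trains a classifier and measures the training error and testing error, the two quantities this blog informally maps to bias and variance:

```python
# Minimal sketch: measure training error vs. testing error for one model.
# The dataset and model choice here are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Synthetic classification data, used only for demonstration
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

model = DecisionTreeClassifier(random_state=42)
model.fit(X_train, y_train)

# Training error -- the "bias" in this blog's informal sense
train_error = 1 - accuracy_score(y_train, model.predict(X_train))
# Testing error -- the "variance" in this blog's informal sense
test_error = 1 - accuracy_score(y_test, model.predict(X_test))

print(f"Training error: {train_error:.3f}")
print(f"Testing error:  {test_error:.3f}")
```

A large gap between the two numbers is the warning sign we will look at next.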
Underfit vs. Best Model vs. Overfit:
"One picture is worth a thousand words" — Proverb
An underfit model has a high training error (high bias) because it is too simple to capture the underlying pattern. An overfit model has a very low training error but a high testing error (high variance) because it memorizes the training data instead of learning the pattern. The best model sits in between, keeping both errors reasonably low and close to each other.
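The sketch below (again assuming scikit-learn; the depth values are illustrative, not a recommendation) compares three decision trees of different complexity on the same data to show the underfit, balanced, and overfit patterns described above:

```python
# Minimal sketch: vary model complexity (tree depth) to show the
# underfit / balanced / overfit pattern in train vs. test error.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data for illustration only
X, y = make_classification(n_samples=1000, n_features=20, n_informative=5,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

for label, depth in [("Underfit", 1), ("Balanced", 5), ("Overfit", None)]:
    model = DecisionTreeClassifier(max_depth=depth, random_state=0)
    model.fit(X_train, y_train)
    train_err = 1 - model.score(X_train, y_train)  # error on training data
    test_err = 1 - model.score(X_test, y_test)     # error on testing data
    print(f"{label:9s} (max_depth={depth}): "
          f"train error={train_err:.3f}, test error={test_err:.3f}")
```

Typically, the depth-1 tree shows high error on both sets (underfitting), the fully grown tree shows near-zero training error but a noticeably higher testing error (overfitting), and the mid-depth tree keeps both errors low and close together.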
There are different methods to handle both overfitting and underfitting, which we will look at in the next blog.
I hope you enjoyed reading this blog and now have a clear understanding of overfitting and underfitting in machine learning models.