Underfitting and Overfitting in Deep Learning

Not sure if good model… or just overfitting?

Artem Oppermann
10 min read · Jul 18, 2021
Source: Author's own image.

In applied Deep Learning, we very often face the problems of overfitting and underfitting. This detailed guide answers the questions of what overfitting and underfitting are in Deep Learning and how to prevent these phenomena.

In Short: Overfitting means that the neural network performs very well on training data, but fails as soon as it sees new data from the problem domain. Underfitting, on the other hand, means that the model performs poorly on both datasets.

Table of Contents

  1. Generalization in Deep Learning
  2. Overfitting
  3. Underfitting
  4. Variance Bias Tradeoff
  5. Identifying Overfitting and Underfitting during Training
  6. How to avoid Overfitting?
  7. How to avoid Underfitting?
  8. Take-Home-Message

1. Generalization in Deep Learning

When training a neural network, we optimize its weights and biases so that the network learns a mathematical mapping from input values to output values, as defined by the given objective.
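To make this concrete, here is a minimal sketch of that optimization loop: a tiny one-hidden-layer network whose weights and biases are adjusted by plain gradient descent so that it maps inputs to targets. The data, layer sizes, learning rate, and step count are all illustrative choices, not values from the article.

```python
import numpy as np

# Illustrative data: learn the mapping y = x^2 on [-1, 1].
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=(64, 1))
y = x ** 2

# Weights and biases of a one-hidden-layer network (8 hidden units).
W1 = rng.normal(0.0, 0.5, size=(1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1  # learning rate (an assumed hyperparameter)

for _ in range(2000):
    # Forward pass: map inputs to outputs.
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                      # gradient of MSE w.r.t. pred (up to a constant)

    # Backward pass: gradients of the loss w.r.t. weights and biases.
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)  # tanh derivative
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)

    # Gradient-descent update: this is the "optimizing the weights
    # and biases" step the text describes.
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Final training error of the learned mapping.
pred = np.tanh(x @ W1 + b1) @ W2 + b2
mse = float(((pred - y) ** 2).mean())
```

Whether this learned mapping also holds on unseen inputs, rather than only on these 64 training points, is exactly the question of generalization that the following sections address.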
