Lecture

Relationship Between Number of Layers and Model Performance

The number of layers in a neural network significantly affects its performance.

If there are too few layers, learning may be insufficient, while too many layers can lead to overfitting.

In this lesson, we will examine how the number of layers affects the performance of a neural network.


1. When There Are Too Few Layers: Underfitting

Too few layers may prevent the network from capturing complex patterns, leading to underfitting and poor performance.

In such cases, the model may handle simple problems but struggles to learn complex data patterns.

Both training and test performance may be low, indicating the model cannot generalize well.

For example, consider the following neural network structure.

Neural Network Structure with Few Layers
Input Layer → Hidden Layer (10 Neurons) → Output Layer

This structure has few neurons and is shallow, limiting its ability to learn complex data patterns.
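As a concrete illustration, the shallow structure above can be sketched as a single-hidden-layer forward pass in NumPy. The input size (4 features), batch size, and ReLU activation are assumptions for the sketch, not part of the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 4 input features, 10 hidden neurons, 1 output.
W1 = rng.normal(size=(4, 10))
b1 = np.zeros(10)
W2 = rng.normal(size=(10, 1))
b2 = np.zeros(1)

def shallow_forward(x):
    """Input -> Hidden (10 neurons, ReLU) -> Output."""
    h = np.maximum(0, x @ W1 + b1)  # single hidden layer with ReLU
    return h @ W2 + b2

x = rng.normal(size=(5, 4))  # a batch of 5 samples
y = shallow_forward(x)
print(y.shape)               # (5, 1)
```

With only one small hidden layer, the function this network can represent is limited, which is why such a model tends to underfit complex data.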


2. When There Are Too Many Layers: Overfitting

Too many layers may cause the network to overfit the training data, reducing its ability to generalize to new inputs.

In this scenario, the model may perform well on training data but poorly on new data.

Additionally, increased computational cost may slow down the training process, introducing unnecessary complexity.

For instance, the structure below with an excessive number of hidden layers may lead to unnecessary complexity.

Neural Network Structure with Many Layers
Input Layer → Hidden Layer (256 Neurons) → Hidden Layer (128 Neurons) → Hidden Layer (64 Neurons) → Output Layer

Such a model may fit the training data very closely yet still generalize poorly to unseen data.
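One way to see why the deeper structure is costlier and more prone to overfitting is to count its parameters. The helper below computes the total number of weights and biases for a fully connected stack; the 4-feature input and single output are assumptions chosen to match the earlier sketch.

```python
def mlp_param_count(layer_sizes):
    """Total weights + biases for a fully connected layer stack."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# Hypothetical 4-feature input and 1 output, matching the structures above.
shallow = mlp_param_count([4, 10, 1])             # Input -> 10 -> Output
deep = mlp_param_count([4, 256, 128, 64, 1])      # Input -> 256 -> 128 -> 64 -> Output
print(shallow, deep)                              # 61 42497
```

The deeper network has hundreds of times more parameters than the shallow one, giving it far more capacity to memorize training data, which is exactly what makes overfitting likely on small datasets.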


A well-balanced network maintains good performance on both training and test data by avoiding underfitting and overfitting.

To determine the optimal depth, start with a simple model and add layers incrementally, tracking performance on a held-out validation set. If validation performance stops improving, or begins to degrade, as depth grows, the extra layers are likely adding overfitting risk rather than useful capacity.
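The incremental procedure above can be sketched with scikit-learn. The dataset (`make_moons`), the width of 16 neurons per layer, and the range of depths tried are all assumptions made for this sketch; the point is the loop structure: train at each depth, score on validation data, keep the best.

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hypothetical toy dataset and 70/30 train/validation split.
X, y = make_moons(n_samples=500, noise=0.25, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.3, random_state=0)

best_depth, best_score = None, -1.0
for depth in range(1, 5):                  # try 1 to 4 hidden layers
    hidden = (16,) * depth                 # 16 neurons per layer (assumption)
    model = MLPClassifier(hidden_layer_sizes=hidden,
                          max_iter=2000, random_state=0)
    model.fit(X_train, y_train)
    score = model.score(X_val, y_val)      # validation accuracy
    if score > best_score:
        best_depth, best_score = depth, score

print(best_depth, round(best_score, 3))
```

Because the choice is made on validation accuracy rather than training accuracy, this loop selects the depth that generalizes best rather than the depth that memorizes best.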

Quiz

What is the most suitable word to fill in the blank?

If there are too many layers, the neural network may be ____ to the training data, and the generalization performance may decline.
transfer
underfit
overfit
perturb
