Dropout means dropping some nodes?

Tiya Vaj
Oct 3, 2024


Yes. In the context of neural networks, “dropout” refers to randomly “dropping” (i.e., turning off) a certain fraction of the neurons, or nodes, in a layer during training. On each training iteration, a fresh random subset of nodes is ignored, so their contributions to the network’s predictions are temporarily removed. This helps prevent overfitting by keeping the model from relying too heavily on any specific neurons, forcing it to learn more redundant, generalizable representations.
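To make this concrete, here is a minimal sketch of “inverted” dropout applied to a vector of activations. The function name and the NumPy setup are illustrative, not from the original paper:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=np.random.default_rng(0)):
    """Inverted dropout: zero each activation with probability p during training."""
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p   # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)       # rescale so the expected activation is unchanged

activations = np.ones(8)
print(dropout(activations, p=0.5))
# roughly half the entries are zeroed; the survivors are scaled up to 2.0
```

Because a different random mask is drawn on every call, each training step effectively trains a different “thinned” sub-network.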

Once training is complete, dropout is no longer applied, and all neurons are used during the prediction phase. (In the paper’s original formulation, the weights are scaled down by the keep probability at test time; most modern implementations instead use inverted dropout, which rescales the surviving activations during training so that nothing needs to change at inference.)
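In a framework such as PyTorch, this train/test distinction is handled by the module’s mode. A brief illustration (the printed training-mode values are representative, since the mask is random):

```python
import torch
import torch.nn as nn

layer = nn.Dropout(p=0.5)
x = torch.ones(6)

layer.train()    # training mode: units are randomly zeroed, survivors rescaled by 1/(1-p)
print(layer(x))  # e.g. tensor([2., 0., 2., 2., 0., 0.])

layer.eval()     # evaluation mode: dropout is a no-op; all units pass through
print(layer(x))  # tensor([1., 1., 1., 1., 1., 1.])
```

Calling `model.eval()` on a full network toggles every dropout layer inside it at once, which is why forgetting it is a common source of noisy predictions.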

[Image: figure from Srivastava et al. (2014), “Dropout: A Simple Way to Prevent Neural Networks from Overfitting”]

Reference: Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. (2014). Dropout: A Simple Way to Prevent Neural Networks from Overfitting. Journal of Machine Learning Research, 15(56), 1929–1958.


Written by Tiya Vaj

Ph.D. research scholar in NLP, passionate about data-driven work for social good. Let's connect here: https://www.linkedin.com/in/tiya-v-076648128/
