Yes, in the context of neural networks, “dropout” refers to randomly “dropping” (i.e., turning off) a certain fraction of neurons or nodes in a layer during training. On each training iteration a different random subset of nodes is ignored, so their contributions to the network’s predictions are temporarily removed. This technique helps prevent “overfitting” by making the model less reliant on any specific neurons and forcing it to generalize better to unseen data.
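As a rough illustration of the mechanism, here is a minimal sketch of “inverted dropout” in plain Python/NumPy. The function name, drop rate, and sample values are just for the example and are not from any particular library:

```python
import numpy as np

def dropout_forward(activations, drop_rate=0.5, training=True):
    """Inverted dropout: zero a random fraction of units during training,
    then scale the survivors so the expected activation stays the same."""
    if not training or drop_rate == 0.0:
        return activations  # dropout is a no-op outside of training
    # Bernoulli mask: each unit is kept with probability (1 - drop_rate)
    mask = np.random.rand(*activations.shape) >= drop_rate
    return activations * mask / (1.0 - drop_rate)

# Example: roughly half of the hidden units are silenced on this training pass
hidden = np.array([0.8, 1.2, -0.5, 0.3, 2.0, -1.1])
print(dropout_forward(hidden, drop_rate=0.5, training=True))
```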
Once training is complete, dropout is no longer applied, and all neurons are used during the prediction phase.
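For instance, in PyTorch (assuming that framework is being used), dropout is active only while the model is in training mode; switching to evaluation mode disables it automatically, so all neurons contribute at prediction time. The layer sizes and drop probability below are purely illustrative:

```python
import torch
import torch.nn as nn

# A toy model with a dropout layer between two linear layers.
model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # drop 50% of the hidden units during training
    nn.Linear(32, 1),
)

x = torch.randn(4, 10)

model.train()            # dropout is active: random units are zeroed
train_out = model(x)

model.eval()             # dropout is disabled: all units contribute
with torch.no_grad():
    eval_out = model(x)
```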