Dropout

A regularization technique in which neurons are randomly “turned off” during training: with a given probability, each neuron’s output is set to 0 for that forward pass. At inference time, all neurons remain active.

Dropout helps most with:

  • Large, fully connected layers
  • Deep networks / many parameters
  • Limited training data
  • Combating overfitting when it is observed
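
Since the technique is simple to state, a short sketch may help. This is a minimal NumPy illustration of “inverted” dropout, the formulation most frameworks use, in which surviving activations are rescaled by 1/(1−p) so that no adjustment is needed at inference time. The function name and signature are illustrative, not taken from any particular library.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each element with probability p during training,
    scaling the survivors by 1/(1-p) so the expected activation is unchanged."""
    if not training or p == 0.0:
        return x  # no-op at inference time or when p is 0
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p          # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)              # rescale so E[output] matches E[input]

# Usage: roughly half the entries are zeroed; survivors are scaled up to 2.0.
x = np.ones((2, 4))
y = dropout(x, p=0.5)
```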