Dropout regularization

Dropout regularization behaves quite differently from other regularization techniques. Instead of penalizing large weights in the loss function, it adds a layer that randomly ignores a subset of neurons on every forward pass during training.
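
A minimal NumPy sketch of the mechanism (the function name and the inverted-dropout scaling are illustrative assumptions, not something specified in this article):

```python
import numpy as np

def dropout_forward(activations, rate, training=True):
    """Randomly zero out a fraction `rate` of activations during training.

    Uses "inverted dropout" scaling: surviving activations are divided by
    (1 - rate) so their expected value matches inference-time behavior.
    """
    if not training or rate == 0.0:
        return activations  # dropout is disabled at inference time
    # Each neuron is kept independently with probability 1 - rate
    mask = np.random.rand(*activations.shape) >= rate
    return activations * mask / (1.0 - rate)
```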

Dropout regularization is controlled by the hyperparameter dropout rate. For example, a dropout rate of 0.2 means that 20% of input neurons will be ignored on each pass.
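
In PyTorch, for instance, a dropout rate of 0.2 is set with `nn.Dropout(p=0.2)`; the surrounding layer sizes here are illustrative, not prescribed by the article:

```python
import torch.nn as nn

# Hypothetical network with a dropout layer between two linear layers
model = nn.Sequential(
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Dropout(p=0.2),  # ignores 20% of incoming activations during training
    nn.Linear(128, 10),
)

model.train()  # dropout active: each forward pass drops a fresh random 20%
model.eval()   # dropout disabled: all neurons participate at inference
```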