Regularization

From Rice Wiki

Latest revision as of 20:20, 18 May 2024


Regularization is a technique that prevents overfitting in machine learning.

= Techniques =

There are primarily two regularization techniques:

* L1 (lasso) regularization
* L2 (ridge) regularization

The two differ slightly in degree of regularization but have similar principles: both apply a penalty term to the loss function to discourage many large weights.
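As a rough sketch (the function names here are hypothetical, not from any particular library), the penalty term is simply added to the base loss:

```python
def mse_loss(y_true, y_pred):
    # Base loss: mean squared error
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def l1_penalty(weights, lam):
    # L1 (lasso) penalty: sum of absolute weights, scaled by lam
    return lam * sum(abs(w) for w in weights)

def l2_penalty(weights, lam):
    # L2 (ridge) penalty: sum of squared weights, scaled by lam
    return lam * sum(w * w for w in weights)

# The regularized loss is the base loss plus the penalty term
weights = [0.5, -2.0, 3.0]
base = mse_loss([1.0, 2.0], [0.9, 2.1])
loss_l1 = base + l1_penalty(weights, lam=0.1)
loss_l2 = base + l2_penalty(weights, lam=0.1)
```

Large weights inflate the penalty, so minimizing the regularized loss trades goodness of fit against weight magnitude.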

There are other regularization techniques:

* [[Dropout regularization]]

= Regularization parameter =

The regularization parameter controls the degree to which the model is regularized; a higher value penalizes large weights more.

When the regularization parameter is too high, weights are pushed toward zero and the model can underfit. The appropriate range of the regularization parameter depends on the following:

* the scale of the weights of the model
* the magnitude of the noise
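As an illustration of this shrinkage, one-dimensional ridge regression (no intercept) has the closed-form solution w = Σxy / (Σx² + λ). The sketch below (hypothetical helper, not from this wiki) shows the fitted weight moving toward zero as λ grows:

```python
def ridge_weight_1d(xs, ys, lam):
    # Closed-form ridge solution for a single weight with no intercept:
    # w = sum(x*y) / (sum(x^2) + lam)
    num = sum(x * y for x, y in zip(xs, ys))
    den = sum(x * x for x in xs) + lam
    return num / den

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]  # exact relationship y = 2x
w0 = ridge_weight_1d(xs, ys, 0.0)    # no regularization: recovers w = 2.0
w1 = ridge_weight_1d(xs, ys, 1.0)    # mild shrinkage
w2 = ridge_weight_1d(xs, ys, 100.0)  # strong shrinkage, close to 0
```

Since λ is added to Σx² in the denominator, the value of λ needed for a given amount of shrinkage scales with the data and the true weight, which is why the useful range of the parameter depends on weight scale and noise level.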