Curve fitting


Curve fitting is the process of determining (fitting) a function (a curve) that best approximates the relationship between the dependent and independent variables.

  • Underfitting occurs when a model is too simple to capture the underlying relationship in the data.
  • Overfitting occurs when a model is too complex, so it fits noise in the training data and may make incorrect predictions for values outside of the training data (a brief sketch below illustrates both cases).
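
A minimal sketch of both failure modes, assuming NumPy and a synthetic dataset (noisy samples of a sine curve; the data and polynomial degrees are illustrative choices, not from the original article):

  import numpy as np

  # Fit polynomials of increasing degree to noisy samples of a sine curve
  # and compare their error on noise-free held-out points.
  rng = np.random.default_rng(0)
  x_train = np.sort(rng.uniform(0, 1, 15))
  y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, 15)
  x_test = np.linspace(0, 1, 100)
  y_test = np.sin(2 * np.pi * x_test)

  for degree in (1, 3, 9):
      coeffs = np.polyfit(x_train, y_train, degree)    # fit the curve
      y_pred = np.polyval(coeffs, x_test)              # evaluate on held-out inputs
      test_mse = np.mean((y_test - y_pred) ** 2)
      print(f"degree {degree}: test MSE = {test_mse:.3f}")

Comparing the held-out errors indicates which degree underfits (too basic) and which overfits (too complex).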

Underfitting

To diagnose underfitting, visualize the fit of the model on the test data and consider the bias-variance tradeoff.

If a model has high bias and low variance, the model underfits the data.

Error of the model

If a model has high bias and low variance, the model underfits the data. If the opposite occurs (low bias and high variance), it overfits.
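
One way to see this in practice is to compare training error and test error as model complexity grows. The sketch below reuses the assumed synthetic sine data and polynomial models from the earlier example (not part of the original article):

  import numpy as np

  # Compare training error and test error as polynomial degree grows.
  rng = np.random.default_rng(1)
  x_train = np.sort(rng.uniform(0, 1, 20))
  y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, 20)
  x_test = np.sort(rng.uniform(0, 1, 200))
  y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.2, 200)

  for degree in range(1, 10):
      coeffs = np.polyfit(x_train, y_train, degree)
      train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
      test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
      print(f"degree {degree}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")

In general, low degrees leave both errors high (high bias, underfitting), while high degrees keep driving the training error down even as the test error rises (high variance, overfitting).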

Bias Variance Tradeoff

Figure: Bias Variance Tradeoff (source: https://scott.fortmann-roe.com/docs/BiasVariance.html)

Bias is the error between the average model prediction and the ground truth.

Variance is the spread of individual model predictions around the average model prediction, i.e., how much the predictions change when the model is trained on different training sets.

Bias and variance have an inverse relationship: as model complexity increases, bias typically decreases while variance increases.
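
Both quantities can be estimated empirically by refitting the same model on many independently drawn training sets and examining its predictions at a fixed query point. The setup below (sine ground truth, polynomial models, query point x0 = 0.3) is an illustrative assumption, not part of the original article:

  import numpy as np

  # Estimate bias and variance of a model's prediction at a fixed point x0
  # by refitting on many independent training sets.
  rng = np.random.default_rng(2)
  x0 = 0.3
  truth = np.sin(2 * np.pi * x0)

  def predictions(degree, n_datasets=500, n_points=20):
      preds = []
      for _ in range(n_datasets):
          x = np.sort(rng.uniform(0, 1, n_points))
          y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, n_points)
          coeffs = np.polyfit(x, y, degree)
          preds.append(np.polyval(coeffs, x0))
      return np.array(preds)

  for degree in (1, 9):
      p = predictions(degree)
      bias = p.mean() - truth    # average prediction vs ground truth
      variance = p.var()         # spread of predictions around their mean
      print(f"degree {degree}: bias^2 {bias**2:.4f}, variance {variance:.4f}")

The simple model tends to show larger bias and smaller variance, while the flexible model shows the reverse, illustrating the tradeoff.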

Regression Models

Mean squared error (MSE) measures fit as the mean of the squared differences between the predictions and the ground truth: MSE = (1/n) * Σ (y_i − ŷ_i)², where y_i is the true value and ŷ_i is the prediction.

It can be used to measure the fit of the model on both the training data and the test data.
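
A minimal sketch of the computation (the helper name mean_squared_error is illustrative, assuming NumPy):

  import numpy as np

  # Mean squared error between ground-truth values and predictions.
  def mean_squared_error(y_true, y_pred):
      y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
      return np.mean((y_true - y_pred) ** 2)

  print(mean_squared_error([1.0, 2.0, 3.0], [1.1, 1.9, 3.2]))  # ≈ 0.02

Evaluating it on the training set measures how well the model fits the data it was trained on; evaluating it on the test set measures how well the model generalizes.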