Linear regression
From Rice Wiki
Revision as of 05:20, 26 April 2024
Linear regression is one of the simplest and most widely used techniques for predictive modeling. It estimates a linear relationship between a continuous dependent variable $y$ and attributes (a.k.a. independent variables) $X$.
There are different types:
- Simple linear regression: one attribute
- Multiple linear regression: multiple attributes
Let the following function model the true relationship between $y$ and $X$:

$y = w_0 + \sum_{i=1}^{p} w_i x_i + \epsilon$

where $w_i$ is the weight coefficient of the attribute $x_i$ to be learned, and $\epsilon$ is the residual error.
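As a concrete illustration of fitting such a model, the sketch below learns the weights (including the intercept $w_0$) by ordinary least squares with NumPy. The data values and the choice of `np.linalg.lstsq` are illustrative assumptions, not part of the original article.

```python
import numpy as np

# Made-up data: 5 samples, 2 attributes (multiple linear regression),
# generated from y = 1 + 2*x1 + 3*x2 with no noise
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = np.array([9.0, 8.0, 19.0, 18.0, 26.0])

# Prepend a column of ones so the intercept w_0 is learned as well
X1 = np.hstack([np.ones((X.shape[0], 1)), X])

# Least-squares fit: the weights that minimize the residual sum of squares
w, *_ = np.linalg.lstsq(X1, y, rcond=None)
print(w)  # [w_0, w_1, w_2], here recovering [1, 2, 3]
```

Dropping the second column of `X` would reduce this to simple linear regression with a single attribute.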
To train a linear regression model is to learn the weight coefficients that minimize error. Error is quantified by a cost function, usually the residual sum of squares (RSS).
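The RSS cost function is just the sum of squared differences between the observed and predicted values. A minimal sketch (the data here is made up for illustration):

```python
import numpy as np

def rss(y_true, y_pred):
    """Residual sum of squares: sum of squared prediction errors."""
    residuals = y_true - y_pred
    return float(np.sum(residuals ** 2))

y_true = np.array([3.0, 5.0, 7.0])
y_pred = np.array([2.5, 5.0, 8.0])
print(rss(y_true, y_pred))  # 0.25 + 0.0 + 1.0 = 1.25
```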
Minimizing RSS
There are several ways to minimize the RSS:
# [[Maximum likelihood estimation]]
# [[Gradient Descent]]
# [[Newton's method]]
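Of these, gradient descent is the easiest to sketch directly: repeatedly step the weights opposite the gradient of the RSS. The data, learning rate, and iteration count below are illustrative assumptions.

```python
import numpy as np

# Made-up data generated from y = 2x + 1 with no noise
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * X + 1.0

w, b = 0.0, 0.0   # initial weight and intercept
lr = 0.01         # learning rate (assumed small constant step)
for _ in range(5000):
    y_pred = w * X + b
    # Gradients of RSS = sum((y - y_pred)^2) with respect to w and b
    grad_w = -2.0 * np.sum((y - y_pred) * X)
    grad_b = -2.0 * np.sum(y - y_pred)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 3), round(b, 3))  # approaches w = 2, b = 1
```

Newton's method would instead use second-derivative information to take larger, curvature-aware steps; for the quadratic RSS it reaches the minimum in a single step.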