Linear regression

From Rice Wiki
Latest revision as of 19:34, 17 May 2024

Linear regression is one of the simplest techniques used for predictive modeling. It estimates a linear relationship between a continuous dependent variable $y$ and attributes (also known as independent variables) $X$.

There are different types of linear regression, such as simple linear regression (a single attribute) and multiple linear regression (several attributes).

Let the following function model the true relationship between $y$ and $X$:

$y = w_0 + \sum_i w_i x_i + \epsilon$

where $w_i$ is the weight coefficient of the attribute $x_i$ to be learned, and $\epsilon$ is the residual error.
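As a concrete sketch of the model above (the weight and attribute values here are illustrative assumptions, not from the article), a prediction is an intercept plus a dot product of weights and attributes:

```python
import numpy as np

# Illustrative weights: intercept w0 and coefficients w1, w2 (assumed values).
w0 = 1.0
w = np.array([2.0, -0.5])

# One example with two attributes x1, x2.
x = np.array([3.0, 4.0])

# Prediction from the linear model (the residual error epsilon is omitted).
y_hat = w0 + np.dot(w, x)
print(y_hat)  # 1.0 + 2.0*3.0 + (-0.5)*4.0 = 5.0
```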

To train a linear regression model is to learn the weight coefficients that minimize error. Error is quantified by a cost function, usually the residual sum of squares (RSS).
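RSS sums the squared differences between the observed and predicted values, $\mathrm{RSS} = \sum_i (y_i - \hat{y}_i)^2$. A minimal sketch (function name and sample values are illustrative):

```python
import numpy as np

def rss(y_true, y_pred):
    """Residual sum of squares: the sum of squared prediction errors."""
    residuals = np.asarray(y_true) - np.asarray(y_pred)
    return float(np.sum(residuals ** 2))

print(rss([1.0, 2.0, 3.0], [1.0, 2.5, 2.0]))  # 0.0 + 0.25 + 1.0 = 1.25
```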

== Minimizing RSS ==

There are several ways to minimize the RSS:

# Ordinary least squares
# Maximum likelihood estimation
# [[Gradient Descent]]
# [[Newton's method]]
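For example, ordinary least squares minimizes the RSS in closed form, $w = (X^T X)^{-1} X^T y$. A minimal sketch using NumPy's least-squares solver on synthetic data (the true weights here are assumptions made up for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 1.5 + 2.0*x1 - 3.0*x2, with no noise so the fit is exact.
X = rng.normal(size=(100, 2))
true_w = np.array([2.0, -3.0])
y = 1.5 + X @ true_w

# Prepend a column of ones so the intercept is learned as a weight.
X1 = np.hstack([np.ones((X.shape[0], 1)), X])

# Least-squares solution minimizing the RSS.
w, *_ = np.linalg.lstsq(X1, y, rcond=None)
print(w)  # recovers approximately [1.5, 2.0, -3.0]
```

With noisy data the recovered weights would only approximate the true ones; here the fit is exact because $\epsilon = 0$.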