Newton's method

'''Newton's method''' is an alternative to Gradient Descent (GD). Whereas GD uses only the first derivative of the loss to step toward the optimal model, Newton's method also uses the ''second derivative'', so it converges in fewer iterations (quadratically rather than linearly near the optimum). For a single weight <math>w_j</math> and loss <math>l</math>, the update is

<math> w_j = w_j - a\frac{\frac{\partial l}{\partial w_j}}{\frac{\partial^2 l}{\partial w_j^2}} </math>

where <math>a</math> is a step size (the pure Newton step uses <math>a = 1</math>).

Newton's method has the drawback of being more computationally expensive, since each update also requires the second derivative. For a model with <math>d</math> weights, the full method needs the matrix of all second derivatives (the Hessian), which costs <math>O(d^2)</math> memory to form and <math>O(d^3)</math> time to invert per step, versus <math>O(d)</math> for a GD step.
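
The sketch below illustrates the two update rules side by side on a one-dimensional loss. The loss <math>l(w) = w^4/4 + w^2/2</math> (minimum at <math>w = 0</math>), the step sizes, and the function names are assumptions chosen for this example, not part of the method itself.

<syntaxhighlight lang="python">
# Minimal sketch: gradient descent vs. Newton's method on an
# illustrative 1-D loss l(w) = w^4/4 + w^2/2 (minimum at w = 0).

def grad(w):
    """First derivative: l'(w) = w^3 + w."""
    return w ** 3 + w

def hess(w):
    """Second derivative: l''(w) = 3w^2 + 1 (always positive here)."""
    return 3 * w ** 2 + 1

def gradient_descent(w, a=0.1, steps=50):
    for _ in range(steps):
        w -= a * grad(w)               # step against the gradient
    return w

def newtons_method(w, a=1.0, steps=50):
    for _ in range(steps):
        w -= a * grad(w) / hess(w)     # rescale the step by the curvature
    return w

print(gradient_descent(2.0))   # crawls toward 0; still ~1e-3 after 50 steps
print(newtons_method(2.0))     # reaches 0 to machine precision in a few steps
</syntaxhighlight>

On this loss, GD with a small fixed step size inches toward the minimum, while the curvature-scaled Newton step takes large steps where the loss is flat and small steps where it is sharply curved, reaching the minimum almost immediately.

[[Category:Machine Learning]]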