Newton's method: Revision history

From Rice Wiki

Legend: (cur) = difference with latest revision, (prev) = difference with preceding revision, m = minor edit.

26 April 2024

  • curprev 05:24, 26 April 2024 Rice 474 bytes +474 Created page with "'''Newton's method''' is an alternative to Gradient Descent. In contrast to GD, which uses the first derivative to approach the optimal model, Newton's method adds the ''second derivative'' to converge ''faster''. Newton's method has the drawback of being more computationally expensive due to the need to find the second derivative. <math> w_j = w_j - a\frac{\frac{\partial l}{\partial w_j}}{\frac{\partial^2 l}{\partial w_j^2}} </math> Category:Machine Learning"
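The update rule quoted in the revision above can be sketched as a short Python example. This is a minimal illustration, not code from the wiki page: the 1-D quadratic loss l(w) = (w - 3)^2 and the helper names (`newton_step`, `loss_grad`, `loss_hess`) are assumptions chosen for demonstration.

```python
def newton_step(w, grad, hess, a=1.0):
    """One Newton update: w_j = w_j - a * (dl/dw_j) / (d^2 l/dw_j^2).

    Scaling the gradient by the inverse second derivative adapts the
    step size to the local curvature, which is why Newton's method can
    converge faster than plain gradient descent.
    """
    return w - a * grad / hess

def loss_grad(w):
    # dl/dw for the illustrative loss l(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

def loss_hess(w):
    # d^2 l/dw^2 is a constant 2 for this quadratic loss
    return 2.0

w = 0.0
for _ in range(5):
    w = newton_step(w, loss_grad(w), loss_hess(w))

print(w)
```

For a quadratic loss the curvature information is exact, so a single step with a = 1 already lands on the minimum w = 3; gradient descent with a fixed learning rate would only approach it gradually. The computational cost noted in the revision comes from evaluating (and, in higher dimensions, inverting) the second-derivative term.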