Back propagation: Revision history

From Rice Wiki

Legend: (cur) = difference with latest revision, (prev) = difference with preceding revision, m = minor edit.

1 May 2024

  • (cur | prev) 03:18, 1 May 2024 Rice (talk | contribs) . . (902 bytes) (+902) . . (Created page with "'''Back propagation''' is an error-calculation technique. It consists of passing a loss function ''backwards'' through a neural network layer by layer to update its weights. = Procedure = Consider the following RSS loss function (halved for an easier derivative). <math> E = \frac{1}{2}\sum (y_i - \mathbf{w}\mathbf{x})^2 </math> After each feed-forward pass, where one data point is passed through the neural network, the gradient of the loss function is computed. We compute gra...")
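The halved-RSS loss quoted in the creation summary can be sketched in code. This is a minimal illustration (the function and variable names are our own, not from the page): it computes E = ½ Σ (y_i − w·x_i)² and its gradient with respect to the weight vector, then performs one gradient-descent update.

```python
import numpy as np

def loss(w, X, y):
    # Halved residual sum of squares: E = 1/2 * sum((y_i - w . x_i)^2)
    residuals = y - X @ w
    return 0.5 * np.sum(residuals ** 2)

def grad(w, X, y):
    # Gradient of E with respect to w: dE/dw = -sum((y_i - w . x_i) * x_i)
    return -(y - X @ w) @ X

# One weight update after a forward pass (illustrative data).
X = np.array([[1.0, 2.0], [3.0, 4.0]])
y = np.array([1.0, 2.0])
w = np.zeros(2)
lr = 0.01
w = w - lr * grad(w, X, y)
```

A single step in the negative gradient direction lowers the loss; repeating the pass-then-update cycle over the data is the gradient-descent procedure the summary alludes to.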