Computing (FOLDOC) dictionary
(Or "backpropagation") A learning algorithm for modifying a
feed-forward neural network which minimises a continuous
"error function" or "objective function".
Back-propagation is a "gradient descent" method of training
in that it uses gradient information to modify the network
weights to decrease the value of the error function on
subsequent tests of the inputs. Other gradient-based methods
from numerical analysis can be used to train networks more
efficiently.
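The gradient-descent update described above can be sketched in a few lines of Python. This is a minimal illustration, not a complete trainer; the learning rate and the gradient values are assumptions chosen for the example:

```python
def gradient_step(weights, grads, learning_rate=0.1):
    """Return updated weights: each weight moves a small step
    against the error gradient, w <- w - lr * dE/dw."""
    return [w - learning_rate * g for w, g in zip(weights, grads)]

weights = [0.5, -0.3]
grads = [0.2, -0.4]   # assumed gradients dE/dw, for illustration only
new_weights = gradient_step(weights, grads)
```

Repeating this step over many presentations of the training inputs is what drives the error function downward.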
Back-propagation makes use of a mathematical trick when the
network is simulated on a digital computer, yielding in just
two traversals of the network (once forward, and once back)
both the difference between the desired and actual output, and
the derivatives of this difference with respect to the
connection weights.
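Those two traversals can be sketched for the smallest possible case: a single sigmoid unit with a squared-error function. The network shape, weights, and inputs here are illustrative assumptions; a real network chains the same chain-rule step backward through every layer:

```python
import math

def forward_backward(w, x, target):
    """One forward and one backward traversal of a single sigmoid
    unit: returns both the output error and dE/dw for each weight."""
    # Forward pass: weighted sum of inputs, then sigmoid activation.
    s = sum(wi * xi for wi, xi in zip(w, x))
    y = 1.0 / (1.0 + math.exp(-s))
    # Error: half the squared difference between desired and actual output.
    error = 0.5 * (target - y) ** 2
    # Backward pass: the chain rule gives
    # dE/dw_i = (y - target) * y * (1 - y) * x_i
    delta = (y - target) * y * (1.0 - y)
    grads = [delta * xi for xi in x]
    return error, grads

w = [0.5, -0.5]        # assumed connection weights
x = [1.0, 2.0]         # assumed input pattern
error, grads = forward_backward(w, x, target=1.0)
```

One forward pass yields the actual output and hence the error; one backward pass reuses the intermediate values to obtain all the weight derivatives, which is the "mathematical trick" the entry refers to.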