Backpropagation


Topic | v1 | created by janarez
Description

In machine learning, backpropagation (backprop, BP) is a widely used algorithm for training feedforward neural networks. Generalizations of backpropagation exist for other artificial neural networks (ANNs), and for functions generally. These classes of algorithms are all referred to generically as "backpropagation". In fitting a neural network, backpropagation computes the gradient of the loss function with respect to the weights of the network for a single input–output example, and does so efficiently, unlike a naive direct computation of the gradient with respect to each weight individually. This efficiency makes it feasible to use gradient methods for training multilayer networks, updating weights to minimize loss; gradient descent, or variants such as stochastic gradient descent, are commonly used.
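The gradient computation described above can be sketched for a one-hidden-layer network trained on a single input–output example. This is a minimal illustrative sketch (all variable names, the sigmoid activation, and the squared-error loss are assumptions, not taken from a specific library):

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.normal(size=(3, 1))         # single input example
y = np.array([[1.0]])               # target output

W1 = rng.normal(size=(4, 3)) * 0.5  # hidden-layer weights (illustrative sizes)
W2 = rng.normal(size=(1, 4)) * 0.5  # output-layer weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for _ in range(100):
    # Forward pass: compute activations layer by layer.
    z1 = W1 @ x
    a1 = sigmoid(z1)
    z2 = W2 @ a1
    a2 = sigmoid(z2)
    loss = 0.5 * ((a2 - y) ** 2).item()
    losses.append(loss)

    # Backward pass: apply the chain rule from the output layer inward,
    # reusing the forward activations instead of recomputing each
    # partial derivative from scratch (this reuse is the efficiency gain).
    delta2 = (a2 - y) * a2 * (1 - a2)         # dL/dz2
    grad_W2 = delta2 @ a1.T                   # dL/dW2
    delta1 = (W2.T @ delta2) * a1 * (1 - a1)  # dL/dz1
    grad_W1 = delta1 @ x.T                    # dL/dW1

    # Gradient descent update on the weights.
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1
```

Each backward step costs about as much as a forward pass, which is what makes gradient-based training of multilayer networks practical; the loss should shrink steadily over the iterations.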


Relations

used by Deep learning

Deep learning (also known as deep structured learning) is part of a broader family of machine learnin...


Resources

treated in Backpropagation — ML Glossary documentation

Rating 7.0 · Level 3.0 · Clarity 7.0 · Background 8.0 (1 rating)

The goals of backpropagation are straightforward: adjust each weight in the network in proportion to...