Add topic "Backpropagation" (Accepted)
Changes: 4
Add "Backpropagation — ML Glossary documentation"
- Title: Backpropagation — ML Glossary documentation (Unchanged)
- Type: Cheat sheet (Unchanged)
- Created: 2017 (Unchanged)
- Description: The goals of backpropagation are straightforward: adjust each weight in the network in proportion to how much it contributes to overall error. If we iteratively reduce each weight’s error, eventually we’ll have a set of weights that produce good predictions. (Unchanged; this update rule is sketched in code after the record.)
- Link: https://ml-cheatsheet.readthedocs.io/en/latest/backpropagation.html (Unchanged)
- Identifier: no value (Unchanged)
Resource | v1 | current (v1)
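
To make the description above concrete, here is a minimal Python sketch (an assumption of this note, not code from the glossary) of the update rule it describes: a single sigmoid neuron trained with squared-error loss, where each weight moves in proportion to its contribution to the error. The names `gradient_step` and `learning_rate` are illustrative, not from the source.

    # Minimal sketch (assumed details, not glossary code): one gradient step
    # on a single sigmoid neuron with squared-error loss. Each weight is
    # adjusted in proportion to its contribution to the error.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def gradient_step(weights, x, target, learning_rate=0.1):
        prediction = sigmoid(np.dot(weights, x))
        # dE/dprediction for E = 0.5 * (prediction - target)**2
        error = prediction - target
        # Chain rule: dE/dw_i = error * sigmoid'(z) * x_i
        grad = error * prediction * (1.0 - prediction) * x
        # Move each weight against its share of the error.
        return weights - learning_rate * grad

    weights = np.zeros(3)
    x = np.array([1.0, 0.5, -1.0])
    for _ in range(100):
        weights = gradient_step(weights, x, target=1.0)

Repeating this step is the iterative error reduction the description refers to: the prediction is driven toward the target one proportional adjustment at a time.
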
Add "Backpropagation"
- Title: Backpropagation (Unchanged)
- Description: In machine learning, backpropagation (backprop, BP) is a widely used algorithm for training feedforward neural networks. Generalizations of backpropagation exist for other artificial neural networks (ANNs), and for functions generally. These classes of algorithms are all referred to generically as "backpropagation". In fitting a neural network, backpropagation computes the gradient of the loss function with respect to the weights of the network for a single input–output example, and does so efficiently, unlike a naive direct computation of the gradient with respect to each weight individually. This efficiency makes it feasible to use gradient methods for training multilayer networks, updating weights to minimize loss; gradient descent, or variants such as stochastic gradient descent, are commonly used. (Unchanged; see the sketch after this record.)
- Link: https://en.wikipedia.org/?curid=1360091 (Unchanged)
Topic | v1 | current (v1)
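
As a concrete illustration of the gradient computation this description refers to, here is a hedged Python sketch (assumed details, not Wikipedia's pseudocode): backpropagation on a two-layer feedforward network for a single input–output example, with sigmoid hidden units, a linear output, and squared-error loss. One forward pass caches activations and one backward pass reuses them, which is the efficiency gain over differentiating the loss once per weight.

    # Illustrative sketch (assumed architecture and names): backpropagation
    # for a two-layer network on a single input-output example.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def backprop(W1, W2, x, y):
        # Forward pass: cache the activations the backward pass will reuse.
        h = sigmoid(W1 @ x)      # hidden activations
        y_hat = W2 @ h           # linear output
        # Backward pass: propagate the error from the output layer inward,
        # computing every gradient in one sweep instead of once per weight.
        delta2 = y_hat - y                        # dL/dy_hat for L = 0.5*||y_hat - y||^2
        grad_W2 = np.outer(delta2, h)
        delta1 = (W2.T @ delta2) * h * (1.0 - h)  # chain rule through the sigmoid
        grad_W1 = np.outer(delta1, x)
        return grad_W1, grad_W2

    # One stochastic-gradient-descent update from a single example.
    rng = np.random.default_rng(0)
    W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))
    x, y = rng.normal(size=3), rng.normal(size=2)
    g1, g2 = backprop(W1, W2, x, y)
    W1 -= 0.01 * g1
    W2 -= 0.01 * g2

The final lines show the plain stochastic-gradient-descent update the description mentions; in practice the same gradients feed any gradient-based optimizer.
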
Add "Backpropagation" treated in "Backpropagation — ML Glossary documentation"
- Current: treated in
Topic to resource relation | v1
Add "Deep learning" uses "Backpropagation"
- Current: uses
Topic to topic relation | v1