**Backpropagation**

Backpropagation is an algorithm for efficiently calculating the gradients in a Neural Network or, more generally, a feedforward computational graph. (1) It boils down to applying the chain rule of differentiation starting from the network output and propagating the gradients backward. The ideas behind backpropagation go back to control theory in the 1960s, but *Learning representations by back-propagating errors* (Rumelhart, Hinton, and Williams, 1986) is often cited as the source.
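As a minimal sketch (not from the glossary), the chain rule can be applied backward through a one-neuron network by hand. The function `forward_backward` and its parameter names are illustrative assumptions:

```python
import math

def forward_backward(x, w, b, y):
    # Forward pass: y_hat = sigmoid(w*x + b), loss = (y_hat - y)^2 / 2
    z = w * x + b
    y_hat = 1.0 / (1.0 + math.exp(-z))
    loss = 0.5 * (y_hat - y) ** 2

    # Backward pass: chain rule, starting from the loss and moving backward
    dloss_dyhat = y_hat - y            # d(loss)/d(y_hat)
    dyhat_dz = y_hat * (1.0 - y_hat)   # sigmoid'(z) expressed via its output
    dloss_dz = dloss_dyhat * dyhat_dz  # chain rule: combine the two factors
    dloss_dw = dloss_dz * x            # z = w*x + b, so dz/dw = x
    dloss_db = dloss_dz                # dz/db = 1
    return loss, dloss_dw, dloss_db
```

The analytic gradients can be sanity-checked against central finite differences, a standard way to verify a backward pass.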

**Further Reading**

**Sources**

“Deep Learning Glossary.” *WildML*, 8 Sept. 2017, www.wildml.com/deep-learning-glossary/ (1)