Calculus on Computational Graphs: Backpropagation (2015)

Backpropagation is the key algorithm that makes training deep models computationally tractable; it has been reinvented several times in different fields, where it is known more generally as "reverse-mode differentiation." Computational graphs, the formalism at the heart of frameworks like Theano, are a useful way to think about mathematical expressions: each node is an operation, and derivatives can be computed on the graph by tracking how each node affects the ones that depend on it. Reverse-mode differentiation is especially valuable because a single backward pass yields the derivative of one output with respect to every input, which is exactly what gradient-based training needs when a model has one cost and millions of parameters. The resulting speedup is dramatic: computing all of these derivatives costs only a small constant factor more than evaluating the function itself, so derivatives are much cheaper than one might expect. Despite looking obvious in hindsight, the algorithm is not trivial, and understanding it in this general form helps both in optimizing models and in other computational problems where derivatives are needed.
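As a concrete illustration of the backward pass, here is a minimal sketch of reverse-mode differentiation on the tiny expression e = (a + b) * (b + 1) that the post works through; the Node class, helper functions, and variable names are illustrative assumptions, not code from the post or from any framework.

```python
# Minimal sketch of reverse-mode differentiation on e = (a + b) * (b + 1).
# Node, add, mul, and backprop are hypothetical names for illustration.

class Node:
    def __init__(self, value, parents=()):
        self.value = value      # result of the forward pass
        self.parents = parents  # list of (parent_node, local_derivative)
        self.grad = 0.0         # accumulated d(output)/d(this node)

def add(x, y):
    # Local derivatives: d(x + y)/dx = 1 and d(x + y)/dy = 1.
    return Node(x.value + y.value, [(x, 1.0), (y, 1.0)])

def mul(x, y):
    # Local derivatives: d(x * y)/dx = y and d(x * y)/dy = x.
    return Node(x.value * y.value, [(x, y.value), (y, x.value)])

def backprop(output):
    # Order nodes so each is visited only after everything that uses it,
    # then apply the chain rule once per edge, summing over paths.
    order, seen = [], set()
    def toposort(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                toposort(parent)
            order.append(node)
    toposort(output)

    output.grad = 1.0  # seed: d(output)/d(output) = 1
    for node in reversed(order):
        for parent, local_grad in node.parents:
            parent.grad += node.grad * local_grad

a, b = Node(2.0), Node(1.0)
e = mul(add(a, b), add(b, Node(1.0)))  # e = (a + b) * (b + 1)
backprop(e)
print(e.value, a.grad, b.grad)  # 6.0, de/da = 2.0, de/db = 5.0
```

A single call to backprop fills in both a.grad and b.grad; forward-mode differentiation would instead need one full pass through the graph per input, which is the speedup the post emphasizes.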

https://colah.github.io/posts/2015-08-Backprop/
