RMSProp is a gradient-based optimization algorithm. (1) It is similar to Adagrad, but keeps an exponentially decaying average of squared gradients rather than a running sum, which counteracts Adagrad's monotonically shrinking learning rate.
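A minimal sketch of the update rule described above, for a scalar parameter (the function name, learning rate, and decay value are illustrative; 0.9 for the decay and 1e-8 for the stabilizing epsilon are common defaults, not prescribed by the source):

```python
import math

def rmsprop_update(w, grad, cache, lr=0.01, decay=0.9, eps=1e-8):
    """One RMSProp step: blend the squared gradient into a decaying
    average, then scale the step by its inverse square root."""
    cache = decay * cache + (1 - decay) * grad * grad
    w = w - lr * grad / (math.sqrt(cache) + eps)
    return w, cache

# Toy usage: minimize f(w) = w^2 (gradient 2w) starting from w = 5.
w, cache = 5.0, 0.0
for _ in range(500):
    w, cache = rmsprop_update(w, 2 * w, cache)
```

Because the cache is a decaying average rather than Adagrad's cumulative sum, the effective step size does not vanish over long runs.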
References:
- Hinton, "Neural Networks for Machine Learning," Lecture 6a
- Stanford CS231n: Optimization Algorithms
- Ruder, "An overview of gradient descent optimization algorithms"
- (1) "Deep Learning Glossary." WildML, 8 Sept. 2017, www.wildml.com/deep-learning-glossary/
RMSProp (last edited 2018-03-12 00:03:27 by notAndrey)