Deep Learning Seminar  /  18 June 2020

Natural Gradient Descent


[only available in English]

Natural gradient descent is an optimization method that takes steps in distribution space rather than in parameter space, which makes it an alternative to stochastic gradient descent. This talk gives an overview of the method and its properties, and shows how it can be viewed as a type of second-order optimization method.
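To make the idea concrete, here is a minimal sketch (not from the talk itself, and with all names and parameters chosen for illustration): fitting a Bernoulli parameter p by maximum likelihood, where the Fisher information has the closed form F(p) = 1/(p(1-p)), so the natural gradient is the ordinary gradient rescaled by the inverse Fisher information.

```python
import numpy as np

# Illustrative sketch: natural gradient descent for a Bernoulli(p) model.
# The Fisher information of Bernoulli(p) is F(p) = 1 / (p * (1 - p)),
# so the natural gradient is F(p)^{-1} times the ordinary gradient.

rng = np.random.default_rng(0)
data = rng.random(1000) < 0.8   # samples from Bernoulli(0.8)
mean = data.mean()

p = 0.1    # initial parameter (arbitrary starting point)
lr = 0.5   # step size (chosen for illustration)
for _ in range(100):
    # Gradient of the average negative log-likelihood w.r.t. p
    grad = -(mean / p) + (1.0 - mean) / (1.0 - p)
    fisher = 1.0 / (p * (1.0 - p))          # Fisher information F(p)
    p -= lr * grad / fisher                 # natural gradient step: F^{-1} * grad
    p = min(max(p, 1e-6), 1.0 - 1e-6)       # keep p inside (0, 1)

print(round(p, 3))  # converges to the empirical mean of the data
```

Because the step is measured in distribution space, the update is invariant to how the model is parameterized; here it reduces to a simple rescaling of the gradient, and p converges geometrically to the maximum-likelihood estimate.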