We propose proximal backpropagation (ProxProp), a novel algorithm that takes implicit rather than explicit gradient steps to update the network parameters during neural network training. ProxProp is developed from a general viewpoint on the backpropagation algorithm: specifically, we show that backpropagating a prediction error is equivalent to taking sequential gradient descent steps on a quadratic penalty energy. We conclude by analyzing theoretical properties of ProxProp and demonstrating promising numerical results.
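The distinction between explicit and implicit gradient steps can be illustrated on a one-dimensional quadratic. This is a hypothetical sketch for intuition only, not the authors' ProxProp implementation: for f(x) = (a/2)x², the explicit gradient step multiplies the iterate by (1 - τa) and diverges once τa > 2, while the implicit (proximal) step, defined as the minimizer of f(y) + (1/2τ)(y - x)², divides the iterate by (1 + τa) and is stable for any step size τ > 0.

```python
# Hypothetical illustration (not the ProxProp implementation): explicit vs.
# implicit (proximal) gradient steps on the quadratic f(x) = (a/2) * x**2.
a = 10.0    # curvature of the quadratic
tau = 0.5   # step size with tau * a = 5 > 2, so the explicit step diverges
x_explicit = 1.0
x_implicit = 1.0

for _ in range(20):
    # Explicit step: x <- x - tau * f'(x) = (1 - tau * a) * x
    x_explicit = x_explicit - tau * a * x_explicit
    # Implicit (proximal) step: x <- argmin_y f(y) + (y - x)**2 / (2 * tau),
    # which for this quadratic has the closed form x / (1 + tau * a)
    x_implicit = x_implicit / (1.0 + tau * a)

print(abs(x_explicit))  # blows up: explicit step is unstable at this step size
print(abs(x_implicit))  # decays toward 0: implicit step is unconditionally stable
```

The same contrast motivates taking implicit steps on the penalty energy: stability no longer hinges on a restrictive step-size condition.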
This presentation is part of Minisymposium “MS50 - Analysis, Optimization, and Applications of Machine Learning in Imaging (3 parts)”
organized by: Michael Moeller (University of Siegen), Gitta Kutyniok (Technische Universität Berlin).