Stochastic Approximation Method with Second Order Search Direction

We discuss the application of second-order-like search directions in the Stochastic Approximation (SA) framework, together with convergence conditions. We consider strictly convex problems in a noisy environment and assume that only noisy values of the objective function and the gradient are available, as well as some approximate Hessian. Under the zero-mean assumption on the noise, we present a convergence analysis for methods that use an approximate second-order direction. We prove that there exists a level of inexactness, governed by the usual gain sequence in SA methods, that does not interfere with convergence, and hence derive a set of convergence conditions applicable to a number of search directions. These include the so-called mini-batch subsampled Hessian used in statistical learning and similar directions.
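
To make the iteration concrete, here is a minimal sketch (not the authors' implementation) of an SA update with a second-order-like direction, x_{k+1} = x_k - a_k B_k^{-1} g_k, where g_k is a noisy mini-batch gradient, B_k a mini-batch subsampled Hessian, and a_k the usual SA gain sequence with sum a_k = infinity and sum a_k^2 < infinity. The strictly convex test problem, batch size, gain constants, and regularization are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Strictly convex test problem (illustrative assumption):
# least squares f(x) = (1/2n) * ||A x - b||^2 over n samples.
n, d = 1000, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

def minibatch_grad_and_hessian(x, batch_size=32):
    """Noisy gradient and mini-batch subsampled Hessian on a random batch."""
    idx = rng.choice(n, size=batch_size, replace=False)
    Ab, bb = A[idx], b[idx]
    r = Ab @ x - bb
    g = Ab.T @ r / batch_size   # subsampled (noisy, zero-mean error) gradient
    H = Ab.T @ Ab / batch_size  # subsampled (approximate) Hessian
    return g, H

x = np.zeros(d)
for k in range(2000):
    a_k = 1.0 / (k + 10)        # SA gain: sum a_k diverges, sum a_k^2 converges
    g, H = minibatch_grad_and_hessian(x)
    # Small regularization keeps the direction well defined despite
    # subsampling noise in the approximate Hessian.
    direction = np.linalg.solve(H + 1e-4 * np.eye(d), g)
    x -= a_k * direction

print("distance to x_true:", np.linalg.norm(x - x_true))
```

The gain sequence a_k both damps the gradient noise and, as the abstract indicates, governs the level of inexactness tolerated in the second-order direction without destroying convergence.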

This presentation is part of the minisymposium “MS13 - Optimization for Imaging and Big Data (2 parts)”,
organized by Margherita Porcelli (University of Firenze) and Francesco Rinaldi (University of Padova).

Authors:
Nataša Krklec Jerinkić (University of Novi Sad)
Nataša Krejić (University of Novi Sad)
Zoran Ovcin (Faculty of Technical Sciences, University of Novi Sad)
Keywords:
machine learning, nonlinear optimization, stochastic approximation