Fast $\ell_0$ sparse approximation via a Scaled Alternating Optimization

Inspired by techniques used for alternating optimization of nonconvex functions, we propose a simple yet effective algorithm with a better trade-off between accuracy and computation time than the state of the art for the nonconvex $\ell_0$-regularized optimization ($\ell_0$-RO) problem. Given an initial solution, we first compute a vanilla solution to $\ell_0$-RO via a descent method (Nesterov's AGP), and then estimate a refined one by scaling the dictionary involved in $\ell_0$-RO, restricted to a reduced number of its atoms.
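The sketch below illustrates one plausible reading of this two-stage scheme, assuming a least-squares data term $\tfrac{1}{2}\|Dx-b\|_2^2 + \lambda\|x\|_0$: an accelerated gradient step with hard thresholding (standing in for Nesterov's AGP applied to $\ell_0$-RO), followed by a hypothetical support-restricted re-fit standing in for the scaled-dictionary refinement. Function names and the refinement step are illustrative assumptions, not the author's implementation.

```python
import numpy as np

def hard_threshold(x, thr):
    # Prox of lam*||x||_0 with step 1/L keeps entries with |x_i| >= sqrt(2*lam/L).
    y = x.copy()
    y[np.abs(y) < thr] = 0.0
    return y

def agp_l0(D, b, lam, n_iter=200):
    # Accelerated gradient + hard thresholding: a "vanilla" l0-RO solution
    # (heuristic stand-in for Nesterov's AGP on the nonconvex problem).
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1]); z = x.copy(); t = 1.0
    for _ in range(n_iter):
        grad = D.T @ (D @ z - b)
        x_new = hard_threshold(z - grad / L, np.sqrt(2.0 * lam / L))
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

def refine_on_support(D, b, x):
    # Hypothetical second stage: re-estimate coefficients over the reduced
    # set of active atoms (a simple proxy for the scaled-dictionary step).
    S = np.flatnonzero(x)
    if S.size == 0:
        return x
    coeffs, *_ = np.linalg.lstsq(D[:, S], b, rcond=None)
    x_ref = np.zeros_like(x)
    x_ref[S] = coeffs
    return x_ref

# Small synthetic usage example.
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256)); D /= np.linalg.norm(D, axis=0)
x_true = np.zeros(256); x_true[rng.choice(256, 8, replace=False)] = rng.standard_normal(8)
b = D @ x_true
x0 = agp_l0(D, b, lam=0.05)      # stage 1: vanilla l0-RO solution
x1 = refine_on_support(D, b, x0)  # stage 2: refinement on the reduced atom set
```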

This presentation is part of Contributed Presentation "CP1 - Contributed Session 1".

Authors:
Paul Rodriguez (Pontificia Universidad Catolica del Peru)
Keywords:
alternating optimization, inverse problems, nesterov's agp, nonconvex optimization