Joseph Salmon (Télécom ParisTech)

From safe screening rules to working sets for faster Lasso-type solvers

Friday, April 21, 2017, 9:30 - 10:30 am

Salle du conseil, espace Turing

Convex sparsity-promoting regularizations are now ubiquitous for regularizing inverse problems in statistics, signal processing, and machine learning.
By construction, they yield solutions with few non-zero coefficients, which correspond to saturated constraints in the dual optimization formulation.
Working set (WS) strategies are generic optimization techniques that solve a sequence of simpler problems restricted to a subset of variables, whose indices form the WS. Working set methods therefore involve two nested iterations: the outer loop defines the WS and the inner loop calls a solver on the resulting subproblem. For the Lasso estimator a WS is a set of features, while for a Group Lasso with block sparsity it is a set of groups. In practice the WS is generally small in this context, so the associated feature Gram matrix can fit in memory.
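To make the two nested loops concrete, here is a minimal sketch of a working set strategy for the Lasso. This is an illustration, not the speaker's implementation: the function names (`lasso_working_set`, `lasso_cd`), the growth rule for the WS (adding the features most correlated with the residual), and all parameter choices are assumptions made for the example.

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding operator, the proximal map of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_cd(X, y, lam, w=None, n_iter=100):
    """Inner solver: cyclic coordinate descent on the Lasso subproblem
    min_w 0.5 * ||y - X w||^2 + lam * ||w||_1."""
    n, p = X.shape
    if w is None:
        w = np.zeros(p)
    col_norm2 = (X ** 2).sum(axis=0)
    r = y - X @ w  # residual, kept up to date across coordinate updates
    for _ in range(n_iter):
        for j in range(p):
            if col_norm2[j] == 0.0:
                continue
            w_old = w[j]
            w[j] = soft_threshold(w_old + X[:, j] @ r / col_norm2[j],
                                  lam / col_norm2[j])
            if w[j] != w_old:
                r -= (w[j] - w_old) * X[:, j]
    return w

def lasso_working_set(X, y, lam, n_outer=10, ws_size=10):
    """Outer loop (hypothetical growth rule): enlarge the WS with the
    features most correlated with the current residual, then solve the
    Lasso restricted to the WS, and repeat."""
    n, p = X.shape
    w = np.zeros(p)
    ws = np.array([], dtype=int)
    for _ in range(n_outer):
        r = y - X @ w
        scores = np.abs(X.T @ r)
        scores[ws] = -np.inf  # features already in the WS
        ws = np.union1d(ws, np.argsort(scores)[-ws_size:])
        w_ws = lasso_cd(X[:, ws], y, lam, w=w[ws].copy())
        w = np.zeros(p)
        w[ws] = w_ws
    return w
```

Since only the columns indexed by the WS enter the inner solver, the subproblem's Gram matrix has size |WS| x |WS|, which is what keeps it in memory when the WS stays small.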
We show that the Gauss-Southwell rule (a greedy strategy for block coordinate descent techniques) leads to fast solvers in this case. By combining it with a working set strategy based on an aggressive use of so-called Gap Safe screening rules, we obtain a solver achieving state-of-the-art performance on sparse learning problems.
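The two ingredients above can be sketched in a few lines. The Gauss-Southwell rule greedily picks the coordinate whose partial gradient of the smooth part is largest in magnitude, and a Gap Safe test uses the duality gap to build a ball that is certified to contain the dual optimum, discarding any feature whose dual constraint cannot be saturated inside that ball. The sketch below follows the standard Lasso formulation; the function names and the exact screening variant are assumptions, not the speaker's code.

```python
import numpy as np

def gauss_southwell_pick(X, r):
    """Gauss-Southwell rule (sketch): select the coordinate with the
    largest partial gradient of the smooth term 0.5 * ||y - X w||^2,
    i.e. the largest |x_j^T r| where r is the current residual."""
    return int(np.argmax(np.abs(X.T @ r)))

def gap_safe_screen(X, y, lam, w):
    """Gap Safe screening test for the Lasso (sketch). Returns a boolean
    mask of features that can be safely discarded at the current iterate w."""
    r = y - X @ w
    # Dual-feasible point obtained by rescaling the residual so that
    # ||X^T theta||_inf <= 1.
    theta = r / max(lam, np.max(np.abs(X.T @ r)))
    primal = 0.5 * r @ r + lam * np.abs(w).sum()
    dual = 0.5 * (y @ y) - 0.5 * lam ** 2 * np.sum((theta - y / lam) ** 2)
    gap = max(primal - dual, 0.0)
    # Radius of the safe ball around theta containing the dual optimum.
    radius = np.sqrt(2.0 * gap) / lam
    # Feature j is provably inactive if its constraint |x_j^T .| <= 1
    # cannot be saturated anywhere in the safe ball.
    col_norms = np.linalg.norm(X, axis=0)
    return np.abs(X.T @ theta) + radius * col_norms < 1.0
```

Note how the test becomes more aggressive as the iterate improves: the duality gap shrinks, the safe ball tightens, and more features are eliminated, which is what makes such rules natural candidates for defining a working set.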