Authors
Rémi Leblond, Fabian Pedregosa, Simon Lacoste-Julien
Publication date
2017
Conference
Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS)
Description
We describe ASAGA, an asynchronous parallel version of the incremental gradient algorithm SAGA that enjoys fast linear convergence rates. Through a novel perspective, we revisit and clarify a subtle but important technical issue present in a large fraction of the recent convergence rate proofs for asynchronous parallel optimization algorithms, and propose a simplification of the recently introduced “perturbed iterate” framework that resolves it. We thereby prove that ASAGA can obtain a theoretical linear speedup on multi-core systems even without sparsity assumptions. We present results of an implementation on a 40-core architecture illustrating the practical speedup as well as the hardware overhead.
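For orientation, the abstract's starting point is the sequential SAGA update that ASAGA runs asynchronously across cores. The sketch below is a minimal illustration of that base update only, not the paper's implementation; names such as `grad_i`, `memory`, and `gamma` are illustrative assumptions.

```python
import numpy as np

def saga(grad_i, x0, n, gamma, n_steps, rng=None):
    """Minimal sequential SAGA sketch (the algorithm ASAGA parallelizes).

    grad_i(x, i) should return the gradient of the i-th component function at x.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    memory = np.zeros((n, x.size))   # table of last-seen per-sample gradients
    avg = memory.mean(axis=0)        # running average of the memory table

    for _ in range(n_steps):
        i = rng.integers(n)                  # sample a component uniformly
        g = grad_i(x, i)
        x -= gamma * (g - memory[i] + avg)   # variance-reduced SAGA step
        avg += (g - memory[i]) / n           # keep the average consistent
        memory[i] = g                        # overwrite the stored gradient
    return x
```

In ASAGA, as described in the abstract, multiple cores apply this kind of update without locks, so each reads a possibly inconsistent iterate; the paper's simplified "perturbed iterate" analysis is what handles that inconsistency.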
Total citations
Per-year citation counts reported for 2016–2024.
Scholar articles
R Leblond, F Pedregosa, S Lacoste-Julien - Artificial Intelligence and Statistics, 2017