Distributed Proximal Splitting Algorithms with Rates and Acceleration

Published in Frontiers in Signal Processing, 2022

Also presented at the 12th Annual Workshop on Optimization for Machine Learning (NeurIPS Workshop OPT2020), Spotlight.

Abstract:

We propose new generic distributed proximal splitting algorithms, well suited for large-scale convex nonsmooth optimization. We derive sublinear and linear convergence results with new nonergodic rates, and we obtain new accelerated versions of the algorithms by means of varying stepsizes.
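To make the terminology concrete, here is a minimal sketch of a generic (non-distributed) proximal splitting iteration: forward-backward splitting on a lasso problem, where the nonsmooth l1 term is handled through its proximal operator (soft-thresholding). This is only an illustration of the kind of proximal step the abstract refers to, not the distributed or accelerated algorithms proposed in the paper; the problem, function names, and parameter values are placeholders.

```python
# Generic forward-backward (proximal gradient) splitting on the lasso problem
#   min_x (1/2) ||A x - b||^2 + lam * ||x||_1
# Illustrative only: not the paper's distributed algorithms.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, n_iters=500):
    """Forward (gradient) step on the smooth quadratic part,
    backward (prox) step on the nonsmooth l1 term."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    step = 1.0 / L                          # constant stepsize for this sketch
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                         # forward step
        x = soft_threshold(x - step * grad, step * lam)  # backward (prox) step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100))
    x_true = np.zeros(100)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    x_hat = proximal_gradient_lasso(A, b, lam=0.1)
    print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```

The paper's contribution concerns distributed variants of such splitting schemes, with nonergodic convergence rates and acceleration via varying stepsizes, rather than the fixed-stepsize, single-machine iteration sketched above.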