A nonconvex proximal splitting algorithm under Moreau-Yosida regularization

Laude E, Wu T, Cremers D (2018)


Publication Type: Conference contribution

Publication year: 2018

Publisher: PMLR

Pages Range: 491-499

Conference Proceedings Title: International Conference on Artificial Intelligence and Statistics, AISTATS 2018

Event location: Playa Blanca, Lanzarote, Canary Islands, ESP

Abstract

We tackle highly nonconvex, nonsmooth composite optimization problems whose objectives comprise a Moreau-Yosida regularized term. Classical nonconvex proximal splitting algorithms, such as nonconvex ADMM, suffer from lack of convergence for such a problem class. To overcome this difficulty, in this work we consider a lifted variant of the Moreau-Yosida regularized model and propose a novel multiblock primal-dual algorithm that intrinsically stabilizes the dual block. We provide a complete convergence analysis of our algorithm and identify respective optimality qualifications under which stationarity of the original model is retrieved at convergence. Numerically, we demonstrate the relevance of Moreau-Yosida regularized models and the efficiency of our algorithm on robust regression as well as joint feature selection and semi-supervised learning.
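
For context, a Moreau-Yosida regularized term replaces a possibly nonsmooth function by its Moreau envelope. The following are the standard textbook definitions, stated in generic notation rather than the paper's specific lifted formulation (the symbols f, mu, and prox are not taken from the abstract):

\[
  f_{\mu}(x) \;=\; \inf_{z} \Big\{ f(z) + \tfrac{1}{2\mu}\,\lVert z - x \rVert^{2} \Big\},
  \qquad
  \operatorname{prox}_{\mu f}(x) \;\in\; \operatorname*{arg\,min}_{z} \Big\{ f(z) + \tfrac{1}{2\mu}\,\lVert z - x \rVert^{2} \Big\}.
\]

For proper, lower semicontinuous, prox-bounded f and sufficiently small mu > 0, the envelope f_mu shares its infimum and minimizers with f; when f is additionally convex, f_mu is continuously differentiable with gradient (x - prox_{mu f}(x)) / mu. This is why Moreau-Yosida regularized terms serve as smooth surrogates inside composite objectives of the kind addressed in the paper.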

How to cite

APA:

Laude, E., Wu, T., & Cremers, D. (2018). A nonconvex proximal splitting algorithm under Moreau-Yosida regularization. In Amos J. Storkey, Fernando Perez-Cruz (Eds.), International Conference on Artificial Intelligence and Statistics, AISTATS 2018 (pp. 491-499). Playa Blanca, Lanzarote, Canary Islands, ESP: PMLR.

MLA:

Laude, Emanuel, Tao Wu, and Daniel Cremers. "A nonconvex proximal splitting algorithm under Moreau-Yosida regularization." Proceedings of the 21st International Conference on Artificial Intelligence and Statistics, AISTATS 2018, Playa Blanca, Lanzarote, Canary Islands, ESP Ed. Amos J. Storkey, Fernando Perez-Cruz, PMLR, 2018. 491-499.
