Nonconvex Quadratic Composite Minimization: Theory and Algorithms
Final Report Abstract
It was anticipated in the original proposal that the findings and discoveries resulting from this project would establish theoretical and practical performance guarantees for nonconvex algorithms with non-Euclidean metrics. We expected the results and potential impact to include: (i) deeper understanding, locally and globally, of the various mechanisms behind the convergence of algorithms, both qualitative and quantitative; (ii) novel approaches for the complexity and global convergence analysis of non-descent schemes; (iii) new methods for problems lacking standard geometries and analytical properties; (iv) preparing the next generation of researchers to make contributions, both theoretical and practical, to the field of nonconvex and nonsmooth optimization. Item (i) was achieved fully through the collective achievements of the team. We are satisfied with our progress on the global convergence analysis of non-descent schemes in item (ii). Progress on item (iii) was much greater than anticipated, with extensions of the theory to uniformly convex metric spaces, where we obtained, for the first time, convergence results for proximal methods in positively curved spaces. Furthermore, the convergence analysis we derived for multiplier-based methods lacking descent properties, under only local Lipschitz smoothness, further extends the scope of the results we expected in items (ii) and (iii); this went far beyond the non-Euclidean (but still vector-space) analysis anticipated for the project. Item (iv) was achieved through the support of PhD students, as well as four master's projects that contributed to the research. We disseminated our results through 7 publications in leading international journals and through presentations at major conferences on optimization and imaging. Prominent among the minisymposia we organized were the minisymposium “Advances in optimisation for image processing: new applications, algorithms and theory” at the SIAM Conference on Imaging Science on March 22, 2022, and a follow-up minisymposium, “Structured Optimization: Trends and Frontiers”, at the SIAM Conference on Optimization in Seattle, May 31–June 3, 2023. We are satisfied with the attention that this work is receiving and will continue to promote our results and push beyond the progress made here.
Publications
- Luke, D. Russell; Teboulle, Marc & Thao, Nguyen H. Necessary conditions for linear convergence of iterated expansive, set-valued mappings. Mathematical Programming, 180(1–2), 1–31.
- Cohen, Eyal; Hallak, Nadav & Teboulle, Marc. A Dynamic Alternating Direction of Multipliers for Nonconvex Minimization with Nonlinear Functional Equality Constraints. Journal of Optimization Theory and Applications, 193(1–3), 324–353.
- Lauster, Florian & Luke, D. Russell. Convergence of proximal splitting algorithms in CAT(κ) spaces and beyond. Fixed Point Theory and Algorithms for Sciences and Engineering, 2021(1).
- Cohen, Eyal; Sabach, Shoham & Teboulle, Marc. Non-Euclidean proximal methods for convex-concave saddle-point problems. Journal of Applied and Numerical Optimization, 3(1).
- Bërdëllima, Arian; Lauster, Florian & Luke, D. Russell. α-Firmly nonexpansive operators on metric spaces. Journal of Fixed Point Theory and Applications, 24(1).
- Cohen, Eyal; Luke, D. Russell; Pinta, Titus; Sabach, Shoham & Teboulle, Marc. A semi-Bregman proximal alternating method for a class of nonconvex problems: local and global convergence analysis. Journal of Global Optimization, 89(1), 33–55.
- Gur, Eyal; Sabach, Shoham & Shtern, Shimrit. Convergent Nested Alternating Minimization Algorithms for Nonconvex Optimization Problems. Mathematics of Operations Research, 48(1), 53–77.
