
Application of Recent Results of Proximal Theory Beyond the Comfort Zone

Applicant Professor Dr. Gert Wanka
Subject Area Mathematics
Funding from 2016 to 2022
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 281129760

Year of creation 2022

Summary of project results

We developed two stochastic incremental mirror descent algorithms with Nesterov smoothing. For this, we approximated the component functions of the objective using Nesterov's smoothing technique, so that gradients become available for performing gradient descent steps, and used a mirror map to "mirror" the iterates onto the feasible set. Since the Moreau envelope is a special case of this smoothing technique, these algorithms can be formulated with proximal steps and can thus be classified as proximal algorithms. We proved a convergence rate of O(1/√k) in expectation for the k-th best objective function value, using a Bregman-distance-like function and the Bregman distance, respectively, both associated with the mirror map. These results may have applications in various areas such as logistics, medical imaging and machine learning; a sketch of the basic algorithmic template is given below.

Furthermore, we developed the Proximal Alternating Minimization Algorithm (Proximal AMA) as a combination of the Alternating Minimization Algorithm (AMA) with the concept of proximality induced by variable metrics. As long as the sequence of variable metrics is chosen appropriately, Proximal AMA has the advantage over AMA that it performs proximal steps when computing new iterates instead of solving minimization subproblems in each iteration, which may lead to better numerical performance. We proved that the generated sequence converges weakly to a saddle point of the Lagrangian associated with the investigated optimization problem, and in numerical experiments on image reconstruction and support vector machine classification we showed that Proximal AMA converges faster than AMA.

Moreover, we introduced and investigated a dynamical system that generates three trajectories in order to approach the set of saddle points of the Lagrangian associated with the same optimization problem we investigated in our work on Proximal AMA. Under appropriate conditions we showed the existence and uniqueness of strong global solutions of the dynamical system. We proved that a discretization of the considered dynamical system is related to the Proximal AMA and AMA numerical schemes, so it can provide insights into the convergence behaviour of these algorithms and into possible improvements in further research.

Working on the research project, we encountered both positive and negative surprises. The unpleasant experiences concern research ideas that turned out to lead to dead ends, and sudden difficulties in proofs that otherwise ran rather smoothly; we will not dwell on them, as they are inherent to any research endeavour. On the positive side, we were able to improve, in the context of splitting algorithms, some well-known numerical algorithms with applications in many different areas; we proved the convergence of these methods and demonstrated their advantages in several numerical applications. A pleasant surprise regarding our work on Objective 4 was that the additional parameter c(t) in our proposed dynamical system could be chosen time-varying. This was not clear from the beginning and introduced some difficulties into the proofs, which we were able to overcome.
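To make the first line of results more concrete, the following is a minimal sketch of a stochastic incremental mirror descent step with Moreau-envelope smoothing, not the project's exact algorithms: it minimizes an average of absolute-value components |a_i^T x - b_i| over the probability simplex, uses the negative entropy as mirror map (so the mirror step becomes an exponentiated-gradient update), and smooths each component via its Moreau envelope (the Huber function). The problem data, the smoothing parameter mu and the step-size rule are illustrative assumptions.

```python
# Sketch: stochastic incremental mirror descent with Moreau-envelope smoothing
# on  min_{x in simplex} (1/m) * sum_i |a_i^T x - b_i|  (illustrative data only).
import numpy as np

rng = np.random.default_rng(0)
m, n = 200, 50
A = rng.normal(size=(m, n))
x_true = rng.dirichlet(np.ones(n))            # ground truth in the simplex
b = A @ x_true + 0.01 * rng.normal(size=m)    # noisy measurements

mu = 0.1                     # smoothing parameter of the Moreau envelope
x = np.full(n, 1.0 / n)      # start at the barycenter of the simplex
best_val = np.inf

def objective(x):
    return np.mean(np.abs(A @ x - b))

for k in range(1, 5001):
    i = rng.integers(m)      # sample one component function
    r = A[i] @ x - b[i]
    # Gradient of the Moreau envelope of |.| (the Huber function) at r,
    # composed with the linear map a_i: a smoothed component gradient.
    g = A[i] * np.clip(r / mu, -1.0, 1.0)
    t = 1.0 / np.sqrt(k)     # step size matching the O(1/sqrt(k)) regime
    # Mirror step for the negative-entropy mirror map: exponentiated gradient
    # followed by renormalization (the Bregman projection onto the simplex).
    x = x * np.exp(-t * g)
    x /= x.sum()
    best_val = min(best_val, objective(x))

print(f"best objective value found: {best_val:.4f}")
print(f"objective at x_true:        {objective(x_true):.4f}")
```

With the negative-entropy mirror map the Bregman projection onto the simplex reduces to a renormalization, which is why no explicit projection appears; the 1/√k step size reflects the O(1/√k) rate reported above.

For the second line of results, the sketch below shows the classical AMA scheme that Proximal AMA builds on, applied to a small one-dimensional total-variation denoising problem written as min f(x) + g(z) subject to Dx - z = 0 with f strongly convex. In this toy setting both subproblems already have closed-form solutions (the z-step is a soft-thresholding proximal step); Proximal AMA, as summarised above, adds variable-metric proximal terms so that proximal steps replace subproblem solves in more general settings. The operator D, the data and the step size c are illustrative assumptions.

```python
# Sketch: classical AMA for  min_x (1/2)||x - d||^2 + lam*||D x||_1,
# reformulated as  min f(x) + g(z)  s.t.  D x - z = 0  (illustrative data only).
import numpy as np

rng = np.random.default_rng(1)
n, lam, c = 100, 0.5, 0.4       # convergence needs c < 2*sigma_f/||D||^2 = 0.5 here
signal = np.repeat([0.0, 1.0, 0.3, 0.8], n // 4)
d = signal + 0.1 * rng.normal(size=n)

# First-difference operator D, mapping R^n to R^(n-1).
D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]

def soft(v, t):
    """Proximal map of t*||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = d.copy()
p = np.zeros(n - 1)             # Lagrange multiplier for the constraint D x - z = 0
for _ in range(500):
    x = d - D.T @ p                         # minimize the Lagrangian in x (closed form)
    z = soft(D @ x + p / c, lam / c)        # proximal (soft-thresholding) step in z
    p = p + c * (D @ x - z)                 # multiplier update

print("constraint residual ||Dx - z||:", np.linalg.norm(D @ x - z))
```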
