Survey of sequential convex programming and generalized Gauss-Newton methods*

We provide an overview of a class of iterative convex approximation methods for nonlinear optimization problems with convex-over-nonlinear substructure. These problems are characterized by outer convexities on the one hand, and nonlinear, generally nonconvex, but differentiable functions on the other hand. All methods from this class use only first order derivatives of the nonlinear functions and sequentially solve convex optimization problems. All of them are different generalizations of the classical Gauss-Newton (GN) method. We focus on the smooth constrained case and on three methods to address it: Sequential Convex Programming (SCP), Sequential Convex Quadratic Programming (SCQP), and Sequential Quadratically Constrained Quadratic Programming (SQCQP). While the first two methods were previously known, the last is newly proposed and investigated in this paper. We show under mild assumptions that SCP, SCQP and SQCQP have exactly the same local linear convergence – or divergence – rate. We then discuss the special case in which the solution is fully determined by the active constraints, and show that for this case the KKT conditions are sufficient for local optimality and that SCP, SCQP and SQCQP even converge quadratically. In the context of parameter estimation with symmetric convex loss functions, the possible divergence of the methods can in fact be an advantage that helps them to avoid some undesirable local minima: generalizing existing results, we show that the presented methods converge to a local minimum if and only if this local minimum is stable against a mirroring operation applied to the measurement data of the estimation problem. All results are illustrated by numerical experiments on a tutorial example.
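
For orientation, a minimal sketch of the problem class and iteration described in the abstract; the notation is chosen here for illustration and is not quoted from the paper:

    % Convex-over-nonlinear substructure: \phi convex, F nonlinear but differentiable.
    \min_{x \in \mathbb{R}^n} \; \phi(F(x))
    % An SCP-type iteration keeps the convex outer function \phi and linearizes
    % only the inner nonlinear function F at the current iterate x_k:
    x_{k+1} \in \arg\min_{x} \; \phi\big( F(x_k) + F'(x_k)\,(x - x_k) \big)
    % For \phi(r) = \tfrac{1}{2}\|r\|_2^2 each subproblem is a linear
    % least-squares problem and the iteration is the classical Gauss-Newton method.

And a runnable sketch of that classical Gauss-Newton special case, which the surveyed methods generalize (illustrative only; the exponential-fit example and all names below are chosen here, not taken from the paper):

    import numpy as np

    def gauss_newton(residual, jacobian, x0, tol=1e-10, max_iter=50):
        # Minimize phi(F(x)) with phi(r) = 0.5*||r||^2, using only first-order
        # derivatives of the nonlinear residual F, as described in the abstract.
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            r = residual(x)
            J = jacobian(x)
            # Convex subproblem: linearize F, keep the convex outer loss.
            step, *_ = np.linalg.lstsq(J, -r, rcond=None)
            x = x + step
            if np.linalg.norm(step) < tol:
                break
        return x

    # Hypothetical usage: estimate the rate a in y(t) = exp(a*t) from exact data.
    t = np.linspace(0.0, 1.0, 20)
    y = np.exp(-1.5 * t)
    residual = lambda x: np.exp(x[0] * t) - y
    jacobian = lambda x: (t * np.exp(x[0] * t)).reshape(-1, 1)
    print(gauss_newton(residual, jacobian, x0=[0.0]))  # expected: approx. [-1.5]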

Bibliographic Details
Main Authors: Messerer, Florian; Baumgärtner, Katrin; Diehl, Moritz
Format: Article
Language: English
Published: EDP Sciences, 2021-08-01
Series: ESAIM: Proceedings and Surveys
Online Access: https://www.esaim-proc.org/articles/proc/pdf/2021/02/proc2107107.pdf
Affiliations: Department of Microsystems Engineering (IMTEK), University of Freiburg
Volume: 71, pp. 64–88
ISSN: 2267-3059
DOI: 10.1051/proc/202171107