
Second-Derivative SQP Methods for Large-Scale Nonconvex Optimization

Jeb Runnoe
UCSD

Abstract:

Sequential quadratic programming (SQP) methods solve a nonlinearly constrained optimization problem by solving a sequence of related quadratic programming (QP) subproblems, each of which minimizes a quadratic model of the Lagrangian function subject to the linearized constraints. In contrast to the quasi-Newton approach, which maintains a positive-definite approximation to the Hessian of the Lagrangian, second-derivative SQP methods use the exact Hessian, which may be indefinite when the problem is nonconvex. In this context, we discuss a dynamic convexification strategy with two main features. First, the method makes minimal matrix modifications while ensuring that the iterates of the QP subproblem remain bounded. Second, the solution of the convexified QP is a descent direction for a merit function, which is used to force convergence from any starting point. This talk will focus on the dynamic convexification of a class of primal-dual SQP methods. Extensive numerical results will be presented.
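For reference, the QP subproblem mentioned above can be stated concretely in the equality-constrained setting. This is a standard textbook formulation, included for orientation rather than taken from the talk: at an iterate (x_k, y_k) for the problem of minimizing f(x) subject to c(x) = 0, the subproblem is

$$
\begin{aligned}
\min_{p \in \mathbb{R}^n} \quad & g_k^T p + \tfrac{1}{2}\, p^T H_k p \\
\text{subject to} \quad & J_k p + c_k = 0,
\end{aligned}
$$

where $g_k = \nabla f(x_k)$, $c_k = c(x_k)$, $J_k$ is the constraint Jacobian, and $H_k = \nabla^2_{xx} L(x_k, y_k)$ is the exact (possibly indefinite) Hessian of the Lagrangian.

The sketch below shows one simple way a convexification step can fit into solving such a subproblem. It is a minimal illustration, not the dynamic convexification strategy of the talk: the names convexify and qp_step are hypothetical, and the modification used here is a plain diagonal shift chosen by trial Cholesky factorizations.

```python
import numpy as np

def convexify(H, beta=1e-3, max_tries=60):
    """Return (H + delta*I, delta) with the smallest trial shift
    delta >= 0 (from a doubling sequence) making H + delta*I
    positive definite; delta = 0 if H already is."""
    n = H.shape[0]
    delta = 0.0
    for _ in range(max_tries):
        try:
            np.linalg.cholesky(H + delta * np.eye(n))  # succeeds iff positive definite
            return H + delta * np.eye(n), delta
        except np.linalg.LinAlgError:
            delta = max(2.0 * delta, beta)  # grow the shift and retry
    raise RuntimeError("could not convexify Hessian")

def qp_step(H, g, J, c):
    """Solve  min_p  g'p + 0.5 p'Hp  s.t.  Jp + c = 0  via its KKT
    system, convexifying H first so that the computed step p is a
    descent direction for standard merit functions."""
    Hbar, delta = convexify(H)
    n, m = H.shape[0], J.shape[0]
    K = np.block([[Hbar, J.T],
                  [J, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-g, -c]))
    return sol[:n], sol[n:], delta  # primal step, QP multipliers, shift used

# Tiny usage example with an indefinite exact Hessian:
H = np.array([[1.0, 0.0], [0.0, -2.0]])
g = np.array([1.0, 1.0])
J = np.array([[1.0, 1.0]])
c = np.array([0.5])
p, y, delta = qp_step(H, g, J, c)
```

A uniform diagonal shift is the bluntest possible modification; the appeal of a dynamic strategy of the kind described in the abstract is that it modifies the matrix only as much as the QP iterates actually require.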

Tuesday, March 12, 2024
11:00 AM, Zoom only (Meeting ID 990 3560 4352)