Article ID Journal Published Year Pages File Type
714566 IFAC Proceedings Volumes 2012 8 Pages PDF
Abstract

In this paper, some recent results and ideas on a class of smooth optimization algorithms for convex optimization problems are presented. These algorithms are formulated as ordinary differential equations whose solutions converge to saddle points of the Lagrangian function associated with the convex optimization problem. Specifically, the global stability behavior of these dynamics is discussed for general convex programs as well as for linear and quadratic programs. Furthermore, a continuous-time Nesterov-like fast gradient variant and an interior-point variant of these saddle-point algorithms are proposed.
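To illustrate the kind of dynamics the abstract describes, the following sketch simulates classical (Arrow-Hurwicz-type) saddle-point dynamics for a small equality-constrained quadratic program. The problem data, step size, and forward-Euler discretization are illustrative assumptions, not taken from the paper; the paper's actual algorithms, stability analysis, and fast-gradient/interior-point variants may differ.

```python
import numpy as np

# Illustrative (hypothetical) equality-constrained QP:
#   minimize (1/2) x'Qx - c'x   subject to   Ax = b
# Saddle-point dynamics on the Lagrangian
#   L(x, lam) = (1/2) x'Qx - c'x + lam'(Ax - b):
#   xdot   = -grad_x L   = -(Qx - c + A'lam)   (primal descent)
#   lamdot = +grad_lam L = Ax - b              (dual ascent)
# integrated here with a simple forward-Euler scheme.

Q = np.array([[3.0, 0.5], [0.5, 2.0]])   # positive definite Hessian
c = np.array([1.0, 1.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

x = np.zeros(2)        # primal variable
lam = np.zeros(1)      # dual variable (Lagrange multiplier)
h = 0.01               # Euler step size

for _ in range(20000):
    xdot = -(Q @ x - c + A.T @ lam)   # descend in x
    lamdot = A @ x - b                # ascend in lam
    x = x + h * xdot
    lam = lam + h * lamdot

# At the saddle point, the KKT conditions hold: the Lagrangian
# gradient in x vanishes and the equality constraint is satisfied.
stationarity = np.linalg.norm(Q @ x - c + A.T @ lam)
feasibility = np.linalg.norm(A @ x - b)
print(stationarity, feasibility)
```

The fixed point of these coupled ODEs is exactly a saddle point of the Lagrangian, so both residuals shrink toward zero as the trajectory settles; this is the convergence behavior whose global stability the paper analyzes.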

Related Topics
Physical Sciences and Engineering Engineering Computational Mechanics