James Renegar:
Greater Conic Structure Provides the Basis for Faster Algorithms
Abstract:
The framework presented in the previous lecture applies to optimization problems defined over arbitrary convex cones. The framework yields a convex objective function that is Lipschitz continuous but not smooth. To apply accelerated gradient methods, however, smoothness of the objective function is essential.
In this lecture we present the basics of hyperbolic programming, which generalizes semidefinite programming. Many optimization problems arising in practice, including linear and semidefinite programs, are hyperbolic programs. Hyperbolic programming was introduced by Güler in the mid-1990s, and has direct ties to many research topics, such as the recent, celebrated work of Marcus, Spielman and Srivastava, in which they solved the long-open Kadison-Singer problem.
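As a standard illustration (not part of the abstract itself) of how semidefinite programming arises as a special case, one can take the hyperbolic polynomial to be the determinant on symmetric matrices:

```latex
% A polynomial $p$ is hyperbolic in direction $e$ if $p(e) > 0$ and,
% for every $x$, the univariate polynomial $t \mapsto p(x - te)$ has
% only real roots.
%
% Example: on the space of $n \times n$ symmetric matrices, the
% polynomial $p(X) = \det X$ is hyperbolic in the direction $e = I$:
\[
  \det(X - tI) \;=\; \prod_{i=1}^{n} \bigl(\lambda_i(X) - t\bigr),
\]
% whose roots are the (real) eigenvalues $\lambda_i(X)$ of $X$.
% The associated hyperbolicity cone,
\[
  \{\, X \;:\; \lambda_i(X) > 0 \ \text{for all } i \,\},
\]
% is the cone of positive definite matrices, so semidefinite
% programming is a hyperbolic program. Taking instead
% $p(x) = x_1 x_2 \cdots x_n$ on $\mathbb{R}^n$ with $e = (1,\dots,1)$
% yields the nonnegative orthant, recovering linear programming.
```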
Hyperbolicity cones - the convex cones underlying hyperbolic programs - have pronounced algebraic/geometric structure that can be used to design faster algorithms. We show this for both first-order and second-order methods. Regarding first-order methods in particular, we show the structure provides a natural smoothing of the objective function in the framework of the previous lecture, thereby allowing accelerated gradient methods to be applied.