
Large-Scale Semi-definite Programs and Applications to Polynomial Optimization and Machine Learning

Dr. Georgina Hall
Princeton University
Gordon Wu Fellow
Friday, February 16, 2018, 2:30–3:30pm
JDT 410

Abstract:

Many intractable problems that appear in nonconvex optimization, statistics and machine learning, economics, combinatorics, and network theory can be approximately solved using semidefinite programs (SDPs). Although SDPs provide good-quality bounds on the original problem, it is well known that, as a class, they scale poorly. In the first part of the talk, we present new algorithms that solve SDPs in a more scalable fashion. This is done by generating a sequence of linear or second-order cone programs that adaptively approximates the SDP with increasing accuracy. In the second part of the talk, we focus on a specific but fundamental intractable problem that has a powerful SDP-based approximation: the problem of optimizing over nonnegative polynomials. We present two applications of this problem in machine learning. The first is multivariate monotone regression, which is motivated by applications in pricing. The second concerns a specific subclass of optimization problems called difference-of-convex (DC) programs, which appear naturally in machine learning. We show how our techniques can be used to optimally reformulate DC programs in order to speed up the best-known algorithms for solving them.
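As a minimal sketch of the flavor of approximation mentioned above (the abstract does not spell out the specific construction, so this is an illustrative assumption rather than the speaker's exact algorithm), a semidefinite constraint can be replaced by a stronger condition that is representable with linear constraints, namely diagonal dominance of the matrix variable:

\[
X_{ii} \;\ge\; \sum_{j \ne i} |X_{ij}| \quad \text{for all } i
\qquad \Longrightarrow \qquad X \succeq 0 ,
\]

since any symmetric matrix that is diagonally dominant with nonnegative diagonal entries is positive semidefinite, and the absolute values can be modeled with auxiliary variables and linear inequalities. Relaxing diagonal dominance to scaled diagonal dominance (X = DMD for some positive diagonal matrix D and diagonally dominant M) yields constraints expressible with second-order cones, and successively changing the basis in which dominance is imposed gives a sequence of progressively tighter approximations of the original SDP.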

Bio:

Georgina Hall is a final-year graduate student and a Gordon Wu Fellow in the Department of Operations Research and Financial Engineering at Princeton University, where she is advised by Professor Amir Ali Ahmadi. She was the valedictorian of École Centrale Paris, where she obtained a B.S. and an M.S. in 2011 and 2013, respectively. Her interests lie in convex relaxations of NP-hard problems, particularly those arising in polynomial optimization. Georgina is the recipient of the Médaille de l’École Centrale from the French Académie des Sciences and the Princeton School of Engineering and Applied Sciences Award for Excellence. She was also chosen for the 2017 Rising Stars in EECS workshop at Stanford and the 2017 Young Researchers Workshop at Cornell University. Her paper “DC decomposition of nonconvex polynomials using algebraic techniques” recently received the 2016 INFORMS Computing Society Prize for Best Student Paper. She has also received a number of teaching awards, including Princeton University’s Engineering Council Teaching Award, the university-wide Excellence in Teaching Award of the Princeton Graduate School, and the 2017 Excellence in Teaching of Operations Research Award of the Institute for Industrial and Systems Engineers.