Efficient non-convex polynomial optimization and the sum-of-squares hierarchy

Members' Seminar
Topic: Efficient non-convex polynomial optimization and the sum-of-squares hierarchy
Speaker: David Steurer
Affiliation: Cornell University; Member, School of Mathematics
Date: Monday, March 20, 2017
Time/Room: 2:00pm - 3:00pm/S-101
Video Link: https://video.ias.edu/membsem/2017/0320-DavidSteurer

The sum-of-squares (SOS) hierarchy (due to Shor'85, Parrilo'00, and Lasserre'00) is a widely studied meta-algorithm for (non-convex) polynomial optimization that has its roots in Hilbert's 17th problem about non-negative polynomials. SOS plays an increasingly important role in theoretical computer science because it affords a new and unifying perspective on the field's most basic question: What is the best possible polynomial-time algorithm for a given computational problem? For a wide range of problems, SOS generalizes the best-known polynomial-time algorithms and achieves plausibly optimal guarantees.
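The core idea can be illustrated on a toy univariate instance (a hypothetical example for this announcement, not taken from the talk): to certify a lower bound λ on a polynomial p, one exhibits p - λ as a sum of squares of polynomials, which is a short, checkable proof that p ≥ λ everywhere. The sketch below verifies such a certificate numerically with NumPy.

```python
# Toy sum-of-squares certificate (hypothetical illustration).
# Claim: min_x p(x) = 1 for p(x) = x^2 - 2x + 2.
# Certificate: p(x) - 1 = (x - 1)^2, a single square, so p(x) >= 1 for all x;
# evaluating p at x = 1 shows the bound is attained, hence tight.
import numpy as np

p = np.array([1.0, -2.0, 2.0])   # coefficients of x^2 - 2x + 2, highest degree first
lam = 1.0                        # candidate lower bound to be certified
s = np.array([1.0, -1.0])        # s(x) = x - 1

cert = np.polymul(s, s)          # s(x)^2 = x^2 - 2x + 1
residual = np.polysub(p, cert)   # p - s^2 should equal the constant lam

assert np.allclose(residual, np.array([0.0, 0.0, lam]))
print("certified lower bound:", lam)
print("p(1) =", np.polyval(p, 1.0))
```

In the multivariate case, nonnegative polynomials need not be sums of squares, and finding an SOS certificate of bounded degree becomes a semidefinite program; the hierarchy is obtained by raising that degree bound.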

I will explain key concepts behind SOS that turn out to be constructive versions of the notions of proof and probability.
Then, I will show that SOS gives significantly better guarantees than previous algorithms for two basic non-convex optimization problems: the sparse coding problem from machine learning and the best-separable-state problem from quantum information theory.

Based on joint works with Boaz Barak, Pravesh Kothari, Tengyu Ma, and Jonathan Shi.