COMPUTER SCIENCE/DISCRETE MATH I
Topic:     Efficient Algorithms Using the Multiplicative Weights Update Method
Date:      Monday, March 5
Time/Room: 11:15am - 12:15pm/S-101
Algorithms based on convex optimization, especially linear and semidefinite programming, are ubiquitous in Computer Science. While polynomial-time algorithms are known for such problems, their running times are often quite high, and designing simpler and more efficient algorithms is important for practical impact. In my talk, I will describe applications of a Lagrangian relaxation technique, the Multiplicative Weights Update method, to the design of efficient algorithms for various optimization problems. We generalize the method from real numbers to symmetric matrices. The new algorithm yields the first truly general, combinatorial, primal-dual method for designing efficient algorithms for semidefinite programming. Using these techniques, we obtain significantly faster algorithms for approximating the Sparsest Cut and Balanced Separator problems in both directed and undirected weighted graphs, as well as the Min UnCut problem. In addition, we obtain an efficient derandomization of the Alon-Roichman theorem, a deterministic O(log n) approximation for the Quantum Hypergraph Covering problem, and an alternative proof of Aaronson's result on the learnability of quantum states.
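In its classical (scalar) form, the Multiplicative Weights Update method maintains a weight per "expert" and repeatedly scales each weight down in proportion to the loss that expert incurs, so the resulting distribution concentrates on experts that perform well. A minimal sketch, assuming the standard update rule w_i ← w_i(1 − η·ℓ_i) with losses in [0, 1] (the learning rate and toy loss sequence below are illustrative, not from the talk):

```python
def mwu(losses, eta=0.1):
    """Multiplicative Weights Update over n experts.

    losses: list of rounds, each a list of per-expert losses in [0, 1].
    Returns the final normalized weight vector (a distribution over experts).
    """
    n = len(losses[0])
    w = [1.0] * n
    for round_losses in losses:
        # Scale each expert's weight down in proportion to its loss this round.
        w = [wi * (1.0 - eta * li) for wi, li in zip(w, round_losses)]
    total = sum(w)
    return [wi / total for wi in w]

# Toy run: expert 0 is usually right (loss 0), expert 1 usually wrong (loss 1).
rounds = [[0.0, 1.0]] * 50 + [[1.0, 0.0]] * 5
p = mwu(rounds)
print(p)  # the distribution concentrates on expert 0
```

The matrix generalization mentioned in the abstract replaces the weight vector with a positive semidefinite matrix and the per-expert losses with symmetric loss matrices, which is what enables the primal-dual treatment of semidefinite programs.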