Computer Science Special Talk

— 3:00pm

Location:
7101 - Gates Hillman Centers

Speaker:
Richard Peng, Assistant Professor
https://www.cc.gatech.edu/~rpeng/

In the current convex optimization literature, there are significant gaps between algorithms that produce approximate answers and algorithms that converge to polynomially small errors. This gap is reflected in many important open problems in efficient algorithms, including the distinctions between (accelerated) gradient descent and interior point methods, and between approximate and exact max-flow.

In this talk, I will discuss recent and ongoing work that generalizes a fundamental building block of numerical analysis, preconditioned iterative methods, to a wide range of convex functions. This leads to algorithms that use low-accuracy solutions of residual problems to reduce distances to the optimum, in ways analogous to linear system solvers.
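The linear-system case the abstract alludes to can be illustrated by classical preconditioned iterative refinement: each step solves the residual problem only crudely, yet the error to the true solution shrinks geometrically. The sketch below is illustrative only (it is not the speaker's method); the matrix, the Jacobi preconditioner, and all names are assumptions chosen for a simple convergent example.

```python
import numpy as np

# Illustrative sketch: iterative refinement for Ax = b, where each step
# applies only a crude "low-accuracy solver" (a diagonal/Jacobi
# preconditioner M ~ A) to the current residual.

rng = np.random.default_rng(0)
n = 50
# Strictly diagonally dominant matrix, so Jacobi-style refinement converges.
A = rng.standard_normal((n, n)) + 2 * n * np.eye(n)
b = rng.standard_normal(n)
x_true = np.linalg.solve(A, b)

M_inv = np.diag(1.0 / np.diag(A))  # crude preconditioner: invert the diagonal

x = np.zeros(n)
for _ in range(30):
    r = b - A @ x        # residual of the current iterate
    x = x + M_inv @ r    # low-accuracy residual solve; error contracts each step

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

Even though each residual solve is very inaccurate, the distance to the optimum contracts by a constant factor per iteration, which is the high-accuracy behavior the talk aims to extend beyond linear systems.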

This method applies to a wide range of convex objectives, including p-norms with p bounded away from 1. Here, follow-up works have led to high-accuracy p-norm regression routines that converge in about m^(1/3) linear system solves, and to high-accuracy p-norm flow algorithms that take almost-linear time on unit-capacity graphs.

Faculty Host: Gary Miller

For More Information:
thuwjx@gmail.com
