Uniformly Faster Gradient Descent with Varying Step Sizes for Smooth Convex Functions

Program:

Applied Mathematics and Statistics

Project Description:

When using gradient descent to minimize a smooth convex function, the conventional approach fixes a constant step size below 2/L at every iteration, where L is the Lipschitz constant of the gradient (equivalently, below 2 in units of 1/L). Recent work [1] has shown that step sizes exceeding this threshold can yield better guarantees at the final iterate, but at the cost of intermediate iterates performing poorly. We seek long-step patterns that improve performance uniformly across all iterations, not just at the last one.
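A minimal sketch of this setup follows, assuming an L-smooth convex quadratic test function. The alternating short/long pattern shown is purely illustrative and is not the certified step size pattern from [1]; the function names and pattern values are hypothetical choices for demonstration.

    import numpy as np

    def gradient_descent(grad_f, x0, L, pattern, n_iters):
        # Cycle through normalized step sizes h; each update is
        # x <- x - (h / L) * grad_f(x), so h = 1 is the textbook 1/L step
        # and h > 2 is a "long" step beyond the classical stability bound.
        x = x0.astype(float).copy()
        iterates = [x.copy()]
        for t in range(n_iters):
            h = pattern[t % len(pattern)]
            x = x - (h / L) * grad_f(x)
            iterates.append(x.copy())
        return iterates

    # Example: f(x) = 0.5 * x^T A x, so grad f(x) = A x and L is the
    # largest eigenvalue of A.
    A = np.diag([1.0, 10.0])
    L = 10.0
    f = lambda x: 0.5 * x @ A @ x
    grad_f = lambda x: A @ x
    x0 = np.ones(2)

    # Conventional constant step size (h < 2).
    constant = gradient_descent(grad_f, x0, L, pattern=[1.5], n_iters=21)
    # Hypothetical short/long pattern: individual long steps can overshoot,
    # yet the cycle as a whole may still contract.
    varying = gradient_descent(grad_f, x0, L, pattern=[1.5, 1.5, 4.0], n_iters=21)

    print("constant h = 1.5:     f =", f(constant[-1]))
    print("short/long pattern:   f =", f(varying[-1]))

Tracking f along all iterates of both runs, rather than only the final values printed here, is exactly the kind of uniform (every-iteration) comparison this project targets.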

Project Mentors, Sponsors, and Partners

  • Benjamin Grimmer

Project Photo: presentation poster of project
