Uniformly Faster Gradient Descent with Varying Step Sizes for Smooth Convex Functions

Program:

Applied Mathematics and Statistics

Project Description:

When applying gradient descent to minimize a smooth convex function, the conventional approach uses a constant step size smaller than two (normalized by the smoothness constant) at every iteration. Recent work has shown that step sizes larger than two yield better guarantees at the final iterate, but at the cost of poor performance at intermediate iterates. We seek long-step patterns that improve performance uniformly, not just at the last iteration. Using recent computer-assisted analysis software implementing Performance Estimation Problems (PEP) in Python, we can construct the worst-case smooth convex function for any given step-size sequence. A numerical search over all short step-size sequences (e.g., three steps) via grid search, combined with local refinement of the step sizes by gradient descent, produced new uniformly faster step-size sequences. Continued work will formalize a proof of our small pattern's performance and produce uniformly good step-size patterns beyond the small settings we have explored numerically.
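
To illustrate the approach, the following is a minimal sketch (not the team's actual code) using the open-source PEPit package, a Python implementation of Performance Estimation Problems. It computes the worst-case final objective gap of gradient descent on 1-smooth convex functions for a given step-size sequence, then grid-searches over three-step patterns. The helper name worst_case_gap and the particular grid are illustrative assumptions, and PEPit API details may differ slightly between versions.

import itertools

from PEPit import PEP
from PEPit.functions import SmoothConvexFunction

def worst_case_gap(step_sizes):
    """Worst-case f(x_N) - f(x*) over all 1-smooth convex f with
    ||x_0 - x*|| <= 1, for gradient descent with the given step sizes."""
    problem = PEP()
    func = problem.declare_function(SmoothConvexFunction, L=1)
    xs = func.stationary_point()            # the (unknown) minimizer x*
    x0 = problem.set_initial_point()
    problem.set_initial_condition((x0 - xs) ** 2 <= 1)
    x = x0
    for h in step_sizes:                    # run gradient descent
        x = x - h * func.gradient(x)
    problem.set_performance_metric(func(x) - func(xs))
    return problem.solve(verbose=0)         # solves a small SDP

# Coarse grid search over three-step patterns (illustrative grid only),
# keeping the pattern whose worst-case final gap is smallest.
grid = [0.5, 1.0, 1.5, 2.0, 3.0, 4.0]
best = min(itertools.product(grid, repeat=3), key=worst_case_gap)
print("Best 3-step pattern on this grid:", best)

A uniform comparison additionally evaluates the same worst-case gap at every intermediate iterate, not only the final one, by setting the performance metric to func(x_k) - func(xs) for each k in turn.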

Project Mentors, Sponsors, and Partners

  • Benjamin Grimmer

Project Photo Caption:

Gradient Descent
