Material Detail

Fast first-order methods for convex optimization with line search

This video was recorded at the NIPS Workshops, Sierra Nevada, 2011. We propose accelerated first-order methods with a non-monotone choice of the prox parameter, which essentially controls the step size. This is in contrast with most accelerated schemes, where the prox parameter is assumed to be either constant or non-increasing. In particular, we show that a backtracking strategy can be used within the FISTA [2] and FALM [5] algorithms, starting from an arbitrary parameter value, while preserving their worst-case iteration complexities of O(1/√ε). We also derive complexity estimates that depend on the "average" step size rather than on the global Lipschitz constant of the gradient, which gives a better theoretical justification for these methods; the main contribution of the paper is thus theoretical.
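To make the backtracking idea concrete, here is a minimal Python sketch of FISTA with a line search on the prox parameter 1/L, applied to a small lasso instance. All names (`fista_backtracking`, `prox_g`, the test problem) are illustrative assumptions, not the authors' code. In particular, the sketch simply shrinks L at the start of each iteration to mimic the non-monotone choice; the paper's analysis also adjusts the momentum sequence so the worst-case complexity is preserved when L decreases, a refinement omitted here.

```python
import numpy as np

def fista_backtracking(f, grad_f, prox_g, x0, L0=1.0, eta=2.0, max_iter=500):
    """Minimize F(x) = f(x) + g(x) with FISTA plus backtracking.

    f is smooth convex; g is convex with an inexpensive prox operator.
    The prox parameter is 1/L; the inner loop increases L until the
    quadratic upper model of f holds at the prox-gradient point.
    """
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    L = L0
    for _ in range(max_iter):
        fy = f(y)
        gy = grad_f(y)
        # Backtracking line search on L (equivalently, on the step 1/L).
        while True:
            x_new = prox_g(y - gy / L, 1.0 / L)
            d = x_new - y
            if f(x_new) <= fy + gy @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        # Standard FISTA momentum update.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
        # Non-monotone choice: tentatively shrink L before the next
        # iteration instead of keeping it non-decreasing.
        L /= eta
    return x

# Usage on a small lasso problem: f(x) = 0.5*||Ax - b||^2, g = lam*||.||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = rng.standard_normal(40)
lam = 0.1
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)
x_star = fista_backtracking(f, grad_f, prox_g, np.zeros(100))
print("objective:", f(x_star) + lam * np.abs(x_star).sum())
```

The point of the non-monotone choice is that, when the local curvature of f is much smaller than its global Lipschitz constant, the method can take larger steps on average, which is what the "average" step size complexity estimates capture.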
