Material Detail
Convergence rates of nested accelerated inexact proximal methods
This video was recorded at the NIPS Workshops, Lake Tahoe 2012. Proximal gradient methods are popular first-order algorithms currently used to solve several machine learning and inverse problems. We consider the case where the proximity operator is not available in closed form and is therefore approximated via an iterative procedure, leading to a nested algorithm. We show, for the first time, that by relying on an appropriate notion of approximation, which yields an explicit stopping rule for the inner loop, convergence rates can be proved for the two-loop accelerated algorithm for a large class of approximation procedures. An experimental comparison with a benchmark primal-dual algorithm is reported and confirms good empirical performance.
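To make the nested structure concrete, below is a minimal sketch (not the authors' code) of an accelerated proximal gradient method whose proximity operator is computed inexactly by an inner loop. It assumes a problem of the form min_x f(x) + g(Dx) with f(x) = 0.5||Ax - b||^2 and g = lam * ||.||_1, where prox of g∘D has no closed form; the inner stopping rule and the tolerance schedule eps_k are illustrative stand-ins for the paper's approximation criterion, and all names (A, b, D, lam) are hypothetical.

```python
import numpy as np

def inexact_prox_l1_analysis(z, D, lam, step, eps, max_inner=500):
    """Approximate prox_{step * lam * ||D.||_1}(z) by projected gradient
    on the dual; stop when successive dual iterates change by less than
    eps (a simple, explicit inner stopping rule -- illustrative only)."""
    u = np.zeros(D.shape[0])                    # dual variable
    L_dual = np.linalg.norm(D, 2) ** 2          # ||D||^2, for the dual step size
    for _ in range(max_inner):
        grad = D @ (z - step * (D.T @ u))       # (scaled) dual gradient
        u_new = np.clip(u + grad / (step * L_dual), -lam, lam)
        if np.linalg.norm(u_new - u) <= eps:
            u = u_new
            break
        u = u_new
    return z - step * (D.T @ u)                 # primal point from dual variable

def nested_fista(A, b, D, lam, n_outer=200):
    """Outer accelerated (FISTA-type) loop with inexact prox steps.
    The inner tolerance eps_k shrinks with the outer iteration counter;
    the exact schedule required depends on the notion of approximation
    used in the convergence analysis."""
    Lf = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of grad f
    step = 1.0 / Lf
    x = np.zeros(A.shape[1]); y = x.copy(); t = 1.0
    for k in range(1, n_outer + 1):
        grad_f = A.T @ (A @ y - b)
        eps_k = 1.0 / k ** 3                    # decaying inner tolerance (assumed)
        x_new = inexact_prox_l1_analysis(y - step * grad_f, D, lam, step, eps_k)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # Nesterov extrapolation
        x, t = x_new, t_new
    return x
```

The key design point the abstract highlights is visible here: the inner loop runs until an explicit, computable tolerance is met, and that tolerance decays with the outer iteration counter, which is what lets a convergence rate be proved for the two-loop scheme rather than only for exact proximal steps.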