
Gradient Descent with Sparsification: An Iterative Algorithm for Sparse Recovery with Restricted Isometry Property

This video was recorded at the 26th International Conference on Machine Learning (ICML), Montreal 2009. In this paper, we present an algorithm for finding an s-sparse vector x that minimizes the squared error ∥y − Φx∥², where Φ satisfies the restricted isometry property (RIP). Our algorithm, called GraDeS (Gradient Descent with Sparsification), starts from an arbitrary s-sparse x and iteratively updates it as x ← Hₛ(x + (1/γ)·Φᵀ(y − Φx)), where γ > 1 is a constant and Hₛ sets all but the s largest coordinates in absolute value to zero. We show that GraDeS computes, in a constant number of iterations, the correct s-sparse solution to the system y = Φx whenever Φ satisfies δ₂ₛ < 1/3, where δ₂ₛ is the restricted isometry constant. This is the most general condition under which a near-linear-time algorithm is known. In comparison, the best condition under which any polynomial-time algorithm is known is δ₂ₛ < √2 − 1. An important contribution of the paper is an analysis of how the hard-thresholding function Hₛ acts with respect to the potential ∥y − Φx∥². A special case of GraDeS with γ = 1, called Iterative Hard Thresholding (IHT), was previously shown to converge when δ₃ₛ < 1/√32. Our Matlab implementation of GraDeS outperforms previously proposed algorithms such as Subspace Pursuit, StOMP, OMP, and Lasso by an order of magnitude. Curiously, our experiments also uncovered several cases where L1-regularized regression (Lasso) fails but GraDeS finds the correct solution.
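
To make the update rule concrete, here is a minimal NumPy sketch of one way the GraDeS iteration could be implemented. It is not the authors' Matlab code; the function name grades, the zero starting vector, the default γ = 4/3, the iteration cap, and the stopping tolerance are all illustrative assumptions.

    import numpy as np

    def grades(Phi, y, s, gamma=4.0/3.0, max_iter=50, tol=1e-10):
        """Sketch of GraDeS: gradient step on ||y - Phi x||^2, then hard-threshold to s terms.

        Assumes gamma > 1, as the paper requires; the defaults here are illustrative.
        """
        n = Phi.shape[1]
        x = np.zeros(n)  # the zero vector is trivially s-sparse, so it is a valid start
        for _ in range(max_iter):
            # Gradient step: z = x + (1/gamma) * Phi^T (y - Phi x)
            z = x + (1.0 / gamma) * (Phi.T @ (y - Phi @ x))
            # H_s: keep the s largest-magnitude coordinates of z, zero out the rest
            keep = np.argpartition(np.abs(z), -s)[-s:]
            x = np.zeros(n)
            x[keep] = z[keep]
            # Stop once the residual is (near) zero
            if np.linalg.norm(y - Phi @ x) <= tol:
                break
        return x

    # Tiny usage example with a random Gaussian Phi, which satisfies RIP with
    # high probability; the dimensions and sparsity level are arbitrary choices.
    rng = np.random.default_rng(0)
    Phi = rng.standard_normal((80, 200)) / np.sqrt(80)
    x_true = np.zeros(200)
    x_true[rng.choice(200, size=5, replace=False)] = rng.standard_normal(5)
    x_hat = grades(Phi, Phi @ x_true, s=5)

Each iteration costs two matrix-vector products with Φ plus a selection of the s largest entries, which is consistent with the near-linear running time claimed for the algorithm.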
