# Gradient Descent with Sparsification: An Iterative Algorithm for Sparse Recovery with Restricted Isometry Property

This video was recorded at the 26th International Conference on Machine Learning (ICML), Montreal 2009. In this paper, we present an algorithm for finding an s-sparse vector x that minimizes the squared error ∥y − Φx∥² where Φ satisfies the restricted isometry property (RIP). Our algorithm, called GraDeS (Gradient Descent with Sparsification), starts from an arbitrary s-sparse x and iteratively updates it as x ← H_s(x + (1/γ) · Φᵀ(y − Φx)), where γ > 1 is a constant and H_s sets all but the s largest coordinates in absolute value to zero. We show that GraDeS, in a constant number of iterations, computes the correct s-sparse solution to the system y = Φx provided Φ satisfies the condition that the isometry constant δ_2s < 1/3. This is the most general condition under which a near-linear-time algorithm is known. In comparison, the best condition under which any polynomial-time algorithm is known is δ_2s < √2 − 1. An important contribution of the paper is an analysis of how the hard-thresholding function H_s acts with respect to the potential ∥y − Φx∥². A special case of GraDeS, corresponding to γ = 1, called Iterative Hard Thresholding (IHT), was previously shown to converge when δ_3s < 1/√32. Our Matlab implementation of GraDeS outperforms previously proposed algorithms like Subspace Pursuit, StOMP, OMP, and Lasso by an order of magnitude. Curiously, our experiments also uncovered several cases where L1-regularized regression (Lasso) fails but GraDeS finds the correct solution.
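The update rule above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' Matlab implementation: the step-size choice γ, the iteration count, and the all-zero starting point (which is trivially s-sparse) are assumptions made here for the example.

```python
import numpy as np

def grades(y, Phi, s, gamma=1.5, n_iter=200):
    """Sketch of GraDeS: a gradient step on ||y - Phi x||^2 followed by
    hard thresholding to the s largest-magnitude coordinates.

    gamma and n_iter are illustrative defaults, not values from the paper.
    """
    n = Phi.shape[1]
    x = np.zeros(n)  # any s-sparse start works; the zero vector is s-sparse
    for _ in range(n_iter):
        # gradient step: x + (1/gamma) * Phi^T (y - Phi x)
        x = x + (1.0 / gamma) * (Phi.T @ (y - Phi @ x))
        # H_s: zero out all but the s largest coordinates in absolute value
        small = np.argpartition(np.abs(x), n - s)[: n - s]
        x[small] = 0.0
    return x
```

With a random Gaussian Φ (whose restricted isometry constants are small with high probability when the number of rows is large relative to s), the iteration typically recovers an s-sparse x exactly from y = Φx.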