Fast methods for sparse recovery: alternatives to L1
This video was recorded at the Workshop on Sparsity in Machine Learning and Statistics, Cumberland Lodge, 2009.

Finding sparse solutions to underdetermined inverse problems is a fundamental challenge encountered in a wide range of signal processing applications, from signal acquisition to source separation. Recent theoretical advances in our understanding of this problem have further increased interest in its application to various domains. In many areas, such as medical imaging or geophysical data acquisition, it is necessary to find sparse solutions to very large underdetermined inverse problems, so fast methods have to be developed. In this talk, we will present two classes of fast algorithms that are competitive with the more classical L1 minimization (Basis Pursuit). The first technique is based on a greedy selection approach in which several new elements are selected in each iteration; the selected coefficients are then updated along a conjugate update direction. This is an extension of the previously suggested Gradient Pursuit framework that allows an even greedier selection strategy. It also has the unique property of allowing a smooth trade-off between recovery performance and computational complexity. The second technique we discuss is an extremely simple strategy called Iterative Hard Thresholding (IHT). Despite its simplicity, it can be shown that IHT gives near-optimal performance guarantees, is robust to observation noise, has low, bounded computational complexity per iteration, and requires only a fixed number of iterations that depends on the signal-to-noise ratio of the signal. Unfortunately, a naive application of IHT yields empirical performance substantially below that of other state-of-the-art recovery algorithms. We therefore discuss how the algorithm can be modified to obtain performance comparable with the state of the art while retaining its strong theoretical properties.
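The stagewise greedy strategy with a conjugate coefficient update can be sketched roughly as follows. This is a minimal NumPy illustration under assumptions made here for concreteness, not the speaker's exact Gradient Pursuit variant: the selection threshold `alpha`, the use of a single previous direction for conjugation, and the stopping rule are all choices of this sketch.

```python
import numpy as np

def stagewise_conjugate_pursuit(y, Phi, alpha=0.7, n_iter=30, tol=1e-6):
    """Greedy pursuit sketch: in each iteration, select every atom whose
    correlation with the residual is within a factor alpha of the largest
    one, then update the selected coefficients along a direction that is
    conjugated against the previous update direction."""
    m, n = Phi.shape
    x = np.zeros(n)
    r = y.astype(float).copy()
    support = np.zeros(n, dtype=bool)
    d_prev = np.zeros(n)       # previous update direction (coefficient space)
    Phi_d_prev = np.zeros(m)   # its image under Phi
    for _ in range(n_iter):
        if np.linalg.norm(r) < tol:
            break
        g = Phi.T @ r                                  # correlations with residual
        # Stagewise weak selection: several atoms can enter at once.
        # The threshold alpha is an illustrative choice, not from the talk.
        support |= np.abs(g) >= alpha * np.abs(g).max()
        g_T = np.where(support, g, 0.0)                # gradient restricted to support
        # Conjugate the new direction against the previous one.
        d = g_T.copy()
        if Phi_d_prev.any():
            b = ((Phi @ g_T) @ Phi_d_prev) / (Phi_d_prev @ Phi_d_prev)
            d -= b * d_prev
        Phi_d = Phi @ d
        # Exact line search along the chosen direction.
        a = (r @ Phi_d) / (Phi_d @ Phi_d)
        x += a * d
        r -= a * Phi_d
        d_prev, Phi_d_prev = d, Phi_d
    return x
```

Selecting several atoms per iteration and only conjugating against one previous direction is what keeps the per-iteration cost low; tightening `alpha` towards 1 selects more aggressively, which is one way to trade recovery performance against computational cost.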
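The basic IHT iteration mentioned above alternates a gradient step on the residual error with hard thresholding to the K largest coefficients. Below is a minimal sketch, not the speaker's implementation; the step size, iteration count, and the demo parameters are assumptions made here.

```python
import numpy as np

def iht(y, Phi, K, n_iter=500, step=1.0):
    """Iterative Hard Thresholding sketch: gradient step on ||y - Phi x||^2
    followed by keeping only the K largest-magnitude coefficients."""
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        g = x + step * (Phi.T @ (y - Phi @ x))   # gradient (Landweber) step
        idx = np.argsort(np.abs(g))[-K:]         # indices of the K largest entries
        x = np.zeros_like(x)
        x[idx] = g[idx]                          # hard thresholding to K terms
    return x

if __name__ == "__main__":
    # Toy demo (illustrative parameters only): recover a K-sparse vector
    # from m < n Gaussian measurements.
    rng = np.random.default_rng(0)
    n, m, K = 256, 128, 8
    Phi = rng.standard_normal((m, n))
    Phi /= np.linalg.norm(Phi, axis=0)           # unit-norm columns
    x_true = np.zeros(n)
    x_true[rng.choice(n, K, replace=False)] = rng.standard_normal(K)
    # A conservative step size (1 / ||Phi||_2^2) keeps the iteration stable.
    mu = 1.0 / np.linalg.norm(Phi, 2) ** 2
    x_hat = iht(Phi @ x_true, Phi, K, step=mu)
    print("recovery error:", np.linalg.norm(x_hat - x_true))
```

The fixed step size used here is the main practical weakness of such a naive implementation; the modification discussed in the talk is aimed at closing the empirical gap with state-of-the-art methods while keeping the guarantees listed above.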