Connections between the Lasso and Support Vector Machines
This video was recorded at the International Workshop on Advances in Regularization, Optimization, Kernel Methods and Support Vector Machines (ROKS): theory and applications, Leuven 2013.

We investigate the relation between two fundamental tools in machine learning and signal processing: the support vector machine (SVM) for classification and the Lasso technique for regression. We show [7] that the resulting optimization problems are equivalent in the following sense: given any instance of one of the two problems, we construct an instance of the other having the same optimal solution. As a consequence, many existing optimization algorithms for SVMs and for the Lasso can be applied to instances of the respective other problem. The equivalence also allows many known theoretical insights for the SVM and the Lasso to be translated between the two settings. One such implication is a simple kernelized version of the Lasso, analogous to the kernels used in the SVM setting. Another consequence is that the sparsity of a Lasso solution equals the number of support vectors of the corresponding SVM instance, and that screening rules can be used to prune the set of support vectors. Furthermore, we relate sublinear-time algorithms for the two problems and give a new such algorithm variant for the Lasso.
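The following is a minimal sketch (not from the talk) of the kind of reduction the abstract describes, under common simplifying assumptions: the Lasso is taken in its constrained form min_{||w||_1 ≤ 1} ||Aw − b||², and the SVM is taken in a bias-free hard-margin form, i.e. finding the minimum-norm point ||Zx||² over the unit simplex. The function names and the plain Frank-Wolfe solver are illustrative choices, not part of the original material.

```python
# Illustrative sketch: reduce a constrained Lasso instance
#   min_{||w||_1 <= 1} ||A w - b||^2
# to a simplex-constrained minimum-norm problem
#   min_{x in unit simplex} ||Z x||^2
# and recover the Lasso solution from the simplex solution.
import numpy as np

def lasso_to_svm(A, b):
    """Build the data matrix Z from a Lasso instance (A, b).

    Columns of Z are  a_j - b  and  -a_j - b  (one pair per Lasso feature),
    plus one extra column -b that absorbs the slack when ||w||_1 < 1.
    """
    B = b[:, None]
    return np.hstack([A - B, -A - B, -B])

def solve_simplex_qp(Z, iters=5000):
    """Minimize ||Z x||^2 over the unit simplex with plain Frank-Wolfe."""
    d = Z.shape[1]
    x = np.full(d, 1.0 / d)
    for t in range(iters):
        grad = 2.0 * Z.T @ (Z @ x)      # gradient of ||Z x||^2
        s = np.argmin(grad)             # best simplex vertex (coordinate)
        gamma = 2.0 / (t + 2.0)         # standard Frank-Wolfe step size
        x *= (1.0 - gamma)
        x[s] += gamma
    return x

def simplex_solution_to_lasso(x, n_features):
    """Map the simplex point x back to a Lasso weight vector w = x_plus - x_minus."""
    return x[:n_features] - x[n_features:2 * n_features]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 10))
    b = rng.standard_normal(50)

    Z = lasso_to_svm(A, b)
    x = solve_simplex_qp(Z)
    w = simplex_solution_to_lasso(x, A.shape[1])

    print("||w||_1 =", np.abs(w).sum())                  # feasible: <= 1
    print("Lasso objective =", np.sum((A @ w - b) ** 2))
```

Since every column of Z sums its simplex weight into x, we have Zx = Aw − b whenever x lies on the simplex, so minimizing ||Zx||² reproduces the Lasso objective; the nonzero coordinates of x play the role of support vectors, which is one way to see the sparsity correspondence mentioned in the abstract.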