Theory of Matching Pursuit in Kernel Defined Feature Spaces

This video was recorded at the NIPS Workshop on New Challenges in Theoretical Machine Learning: Learning with Data-dependent Concept Spaces, Whistler, 2008. We analyse matching pursuit for kernel principal components analysis (KPCA) by proving that the sparse subspace it produces is a sample compression scheme. We show that the resulting bound is tighter than the KPCA bound of Shawe-Taylor et al. and highly predictive of the size of the subspace needed to capture most of the variance in the data. We then analyse a second matching pursuit algorithm, kernel matching pursuit (KMP), which does not itself correspond to a sample compression scheme. Nevertheless, we give a novel bound that views KMP's choice of subspace as a compression scheme, and hence derive a VC bound on its future loss. Finally, we describe how the same bound can be applied to other matching-pursuit-related algorithms.
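
As a concrete illustration of the greedy selection step these algorithms share, here is a minimal sketch of a pre-fitting variant of kernel matching pursuit in Python. It is not the authors' exact procedure: the RBF kernel, the least-squares back-fitting step, and all function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Y (assumed kernel)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def kernel_matching_pursuit(K, y, n_basis):
    """Greedy KMP sketch: repeatedly pick the training point whose kernel
    column is most correlated with the current residual, then back-fit all
    coefficients by least squares (a pre-fitting variant, not necessarily
    the one analysed in the talk).

    K: (n, n) Gram matrix over the training set; y: (n,) targets.
    Returns the selected indices (the sparse "basis set") and coefficients.
    """
    y = y.astype(float)
    residual = y.copy()
    selected = []
    coef = np.zeros(0)
    for _ in range(n_basis):
        # Normalised squared correlation of each kernel column with the residual.
        scores = (K.T @ residual) ** 2 / np.maximum((K ** 2).sum(axis=0), 1e-12)
        scores[selected] = -np.inf          # never reselect a basis point
        selected.append(int(np.argmax(scores)))
        # Back-fit: least-squares solution on the selected columns only.
        A = K[:, selected]
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        residual = y - A @ coef
    return selected, coef

# Hypothetical usage on toy regression data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)
idx, coef = kernel_matching_pursuit(rbf_kernel(X, X, gamma=0.5), y, n_basis=10)
```

Note that the fitted function depends only on the training points indexed by `idx`; it is this dependence on a small subset of the sample that compression-style bounds of the kind described above exploit.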
