This video was recorded at the International Workshop on Advances in Regularization, Optimization, Kernel Methods and Support Vector Machines (ROKS): Theory and Applications, Leuven, 2013. This work addresses the problem of linear subspace estimation in a general Hilbert space setting. We provide bounds that are considerably sharper than existing ones under the same assumptions. These bounds are also competitive with bounds that are permitted to make strong additional assumptions (on the fourth-order moments), even though we make none. Finally, we generalize these results to a family of metrics, allowing for a more general definition of performance.
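As a concrete, finite-dimensional illustration of the subspace estimation problem the abstract refers to, the sketch below estimates a k-dimensional subspace by standard empirical PCA and measures the error with a projection-based metric. Note that the specific estimator, the dimensions, the noise level, and the choice of the Frobenius projector distance are all assumptions made for this example; the talk's results concern general Hilbert spaces and a family of such metrics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: d-dimensional samples concentrated near a k-dimensional subspace.
d, k, n = 20, 3, 5000
U_true = np.linalg.qr(rng.standard_normal((d, k)))[0]    # orthonormal basis of the true subspace
X = rng.standard_normal((n, k)) @ U_true.T + 0.1 * rng.standard_normal((n, d))

# Empirical PCA: the top-k eigenvectors of the sample covariance estimate the subspace.
C = X.T @ X / n
eigvals, eigvecs = np.linalg.eigh(C)                     # eigenvalues in ascending order
U_hat = eigvecs[:, -k:]                                  # eigenvectors of the k largest eigenvalues

# One possible performance metric: Frobenius distance between orthogonal projectors.
P_true = U_true @ U_true.T
P_hat = U_hat @ U_hat.T
err = np.linalg.norm(P_true - P_hat, "fro")
print(err)
```

With enough samples relative to the noise level, the estimated projector is close to the true one, and the bounds discussed in the talk quantify how fast this error shrinks.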
Title: Subspace Learning