Gradient Weights help Nonparametric Regressors
This video was recorded at the 26th Annual Conference on Neural Information Processing Systems (NIPS), Lake Tahoe, 2012. In regression problems over ℝ^d, the unknown function f often varies more in some coordinates than in others. We show that weighting each coordinate i by the estimated norm of the i-th partial derivative of f is an efficient way to significantly improve the performance of distance-based regressors, e.g., kernel and k-NN regressors. We propose a simple estimator of these derivative norms and prove its consistency. Moreover, the proposed estimator can be learned efficiently online.
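The talk does not come with code, but the idea lends itself to a short sketch. The Python below is an illustration of the gradient-weighting scheme the abstract describes, not the authors' implementation: it estimates each coordinate weight W_i as the average magnitude of the i-th partial derivative, obtained by central differences of a pilot Nadaraya-Watson kernel estimate, then rescales coordinates by W before running an ordinary k-NN regressor. The function names, bandwidth h, step size t, and the toy data are all illustrative choices.

```python
import numpy as np

def kernel_estimate(X_train, y_train, X_query, h):
    """Nadaraya-Watson estimate with a Gaussian kernel of bandwidth h."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * h ** 2))
    return K @ y_train / np.maximum(K.sum(1), 1e-12)

def gradient_weights(X, y, h=0.3, t=0.1):
    """Estimate W_i ~ average |df/dx_i| via central differences of the
    pilot kernel estimate, averaged over the sample points."""
    n, d = X.shape
    W = np.empty(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = t
        f_plus = kernel_estimate(X, y, X + e, h)
        f_minus = kernel_estimate(X, y, X - e, h)
        W[i] = np.mean(np.abs(f_plus - f_minus) / (2 * t))
    return W

def weighted_knn_predict(X_train, y_train, X_query, W, k=5):
    """k-NN regression under the weighted metric ||W * (x - x')||."""
    d2 = (((X_query * W)[:, None, :] - (X_train * W)[None, :, :]) ** 2).sum(-1)
    nn = np.argsort(d2, axis=1)[:, :k]
    return y_train[nn].mean(axis=1)

# Toy problem: f varies strongly in coordinate 0, weakly elsewhere.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 5))
f = lambda X: np.sin(4 * X[:, 0]) + 0.1 * X[:, 1]
y = f(X) + 0.05 * rng.standard_normal(len(X))

W = gradient_weights(X, y)  # large weight on coordinate 0
X_test = rng.uniform(-1, 1, size=(200, 5))
mse_plain = np.mean((weighted_knn_predict(X, y, X_test, np.ones(5)) - f(X_test)) ** 2)
mse_weighted = np.mean((weighted_knn_predict(X, y, X_test, W) - f(X_test)) ** 2)
print(W, mse_plain, mse_weighted)
```

In this sketch the weighted metric shrinks the coordinates along which f barely varies, so nearest neighbors are selected mostly along the informative coordinate; this is the mechanism by which gradient weights improve distance-based regressors.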