On-line learning competitive with reproducing kernel Hilbert spaces
This video was recorded at the Workshop on Modelling in Classification and Statistical Learning, Eindhoven, 2004.

In this talk I will describe a new technique for designing competitive on-line prediction algorithms and proving loss bounds for them. The goal of such algorithms is to perform almost as well as the best decision rule in a wide benchmark class, with no assumptions made about the way the observations are generated. However, standard algorithms in this area can only deal with finite-dimensional (often countable) benchmark classes. The new technique gives similar results for decision rules ranging over infinite-dimensional function spaces. It is based on a recent game-theoretic approach to the foundations of probability and, more specifically, on recent results about defensive forecasting. Given the probabilities produced by a defensive forecasting algorithm, which are known to be well calibrated and to have good resolution in the long run, the expected loss minimization principle is used to find a suitable prediction.
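To illustrate the last step only, here is a minimal sketch (not the talk's actual algorithm) of the expected loss minimization principle: given a forecast probability p for the next binary outcome, such as one produced by a defensive forecasting algorithm, choose the prediction minimizing the expected loss under p. The loss function, candidate grid, and function names below are illustrative assumptions.

```python
def expected_loss_minimizer(p, loss, candidates):
    """Return the candidate prediction g that minimizes the expected loss
    p * loss(1, g) + (1 - p) * loss(0, g) for a binary outcome with
    forecast probability p of outcome 1."""
    return min(candidates, key=lambda g: p * loss(1, g) + (1 - p) * loss(0, g))


# Square loss as an example; for this loss the expected-loss minimizer
# is the forecast probability itself.
def square_loss(y, g):
    return (y - g) ** 2


candidates = [i / 100 for i in range(101)]  # grid of predictions in [0, 1]
g = expected_loss_minimizer(0.7, square_loss, candidates)
```

For the square loss the minimizer recovers g = p = 0.7; other loss functions (e.g. absolute loss) lead to different optimal predictions, which is why the principle is stated for a general loss.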