Material Detail

Some Aspects of Learning Rates for SVMs

This video was recorded at the Machine Learning Summer School (MLSS), Chicago 2005. We present some learning rates for support vector machine classification. In particular, we discuss a recently proposed geometric noise assumption that makes it possible to bound the approximation error for Gaussian RKHSs. Furthermore, we show how a noise assumption proposed by Tsybakov can be used to obtain learning rates between 1/sqrt(n) and 1/n. Finally, we describe the influence of the approximation error on the overall learning rate.
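As a brief illustration of the rate interpolation mentioned above, the standard formulation of Tsybakov's noise condition (assumed here; the lecture's exact parametrization and constants may differ) posits an exponent q >= 0 such that

\[
P_X\bigl(\{x : |2\eta(x) - 1| \le t\}\bigr) \le C\, t^{q} \qquad \text{for all } t > 0,
\]

where \(\eta(x) = P(Y = 1 \mid X = x)\) is the regression function. Under such a condition, excess-risk bounds for classification typically take the form

\[
\mathcal{R}(f_n) - \mathcal{R}^{*} = O\!\left( n^{-\frac{q+1}{q+2}} \right),
\]

which recovers the 1/sqrt(n) rate at q = 0 and approaches the 1/n rate as q tends to infinity.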
