A metric notion of dimension and its applications to learning

This video was recorded at the 27th International Conference on Machine Learning (ICML), Haifa, 2010.

Define the dimension of a metric space as the minimum k > 0 such that every ball in the space can be covered by 2^k balls of half the radius. Besides applying to every metric space, this definition has several attractive features: it coincides with the standard notion of dimension in Euclidean spaces, yet it also captures nonlinear structures such as manifolds. Metric spaces of low dimension (under this definition) occur naturally in many contexts. I will discuss recent theoretical results about such metric spaces, including embeddability, dimension reduction, nearest neighbor search, and large-margin classification, the common thread being that low dimension implies algorithmic efficiency.
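The definition above (often called the doubling dimension) can be estimated for a finite point set: take the maximum, over balls B(c, r), of the number of half-radius balls needed to cover B(c, r), and return its base-2 logarithm. A minimal sketch, using a greedy cover (which gives an upper bound on the true covering number); the function names and the Euclidean example are illustrative assumptions, not from the talk:

```python
import math

def doubling_constant(points, dist):
    """Greedy upper bound on the doubling constant of a finite metric:
    the max, over balls B(c, r), of how many radius-r/2 balls a greedy
    cover uses. The (doubling) dimension is log2 of this quantity."""
    best = 1
    for c in points:
        # Only radii equal to interpoint distances matter for a finite set.
        for r in sorted({dist(c, p) for p in points if dist(c, p) > 0}):
            ball = [p for p in points if dist(c, p) <= r]
            # Greedily cover `ball` with balls of radius r/2 centered in `ball`.
            uncovered, count = list(ball), 0
            while uncovered:
                center = uncovered[0]
                uncovered = [p for p in uncovered if dist(center, p) > r / 2]
                count += 1
            best = max(best, count)
    return best

def doubling_dimension(points, dist):
    return math.log2(doubling_constant(points, dist))

# Illustrative example: evenly spaced points on a line in the plane.
line = [(float(i), 0.0) for i in range(16)]
euclid = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
print(doubling_dimension(line, euclid))
```

For points on a line the estimate stays small (bounded by a constant independent of the number of points), matching the intuition that the set is intrinsically one-dimensional even though it sits in the plane; the greedy cover only upper-bounds the optimal one, so the result may slightly overshoot.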
