Unifying Subspace and Distance Metric Learning with Bhattacharyya Coefficient for Image Classification

This video was recorded at Emerging Trends in Visual Computing. In this talk, we propose a unified scheme for subspace and distance metric learning under the Bayesian framework for image classification. Based on the local distribution of the data, we divide the k-nearest neighbors of each sample into an intra-class set and an inter-class set, and we aim to learn a distance metric in the embedding subspace that makes the distances between a sample and its intra-class set smaller than the distances between the sample and its inter-class set. To reach this goal, we treat the intra-class distances and the inter-class distances as samples from two different probability distributions, and we cast the goal as minimizing the overlap between the two distributions. Inspired by Bayesian classification error estimation, we formulate the objective function as minimizing the Bhattacharyya coefficient between the two distributions. We further extend the method with the kernel trick to learn a nonlinear distance metric. The power and generality of the proposed approach are demonstrated by a series of experiments on the CMU-PIE face database, the Extended Yale face database, and the COREL-5000 nature image database.
