
Nonlinear Mappings for Generative Kernels on Latent Variable Models


This video was recorded at the Joint IAPR International Workshops on Structural and Syntactic Pattern Recognition (SSPR) and Statistical Techniques in Pattern Recognition (SPR), Cesme, 2010. Generative kernels have emerged in recent years as an effective way of combining discriminative and generative approaches. In this talk we focus on kernels defined on generative models with latent variables (e.g. the states of a Hidden Markov Model). The basic idea underlying these kernels is to compare objects, via an inner product, in a feature space whose dimensions are related to the latent variables of the model. We show how to enhance these kernels via a nonlinear normalization of the space, namely a nonlinear mapping of the space dimensions that exploits their discriminative characteristics. We investigate three possible nonlinear mappings for two HMM-based generative kernels, testing them on different sequence classification problems, with promising results.
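
The core idea lends itself to a compact sketch. The following minimal Python example, written under assumptions not specified in the abstract, uses a discrete-emission HMM, takes the expected per-state occupancy of a sequence as the feature space (one dimension per hidden state), and applies an element-wise square root as the nonlinear mapping; the talk's actual two kernels and three mappings are not detailed here, so these choices are purely illustrative.

import numpy as np

def forward_backward(obs, pi, A, B):
    # Posterior state occupancies gamma[t, i] = P(z_t = i | obs) for a
    # discrete-emission HMM with initial distribution pi, transition
    # matrix A, and emission matrix B (states x symbols).
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    beta = np.zeros((T, N))
    # Forward pass, normalized at each step for numerical stability.
    alpha[0] = pi * B[:, obs[0]]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        alpha[t] /= alpha[t].sum()
    # Backward pass, normalized likewise.
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        beta[t] /= beta[t].sum()
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    return gamma

def latent_feature(obs, pi, A, B):
    # Map a sequence to a feature vector indexed by the hidden states:
    # the expected fraction of time spent in each state.
    return forward_backward(obs, pi, A, B).mean(axis=0)

def generative_kernel(x, y, pi, A, B, mapping=np.sqrt):
    # Inner product of the latent-variable features after an element-wise
    # nonlinear mapping of the space dimensions (here: square root).
    fx = latent_feature(x, pi, A, B)
    fy = latent_feature(y, pi, A, B)
    return float(mapping(fx) @ mapping(fy))

# Toy 2-state, 3-symbol HMM; parameters are invented for illustration.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.2, 0.8]])
B = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
print(generative_kernel([0, 1, 0, 2], [2, 2, 1, 0], pi, A, B))

In this sketch the nonlinear mapping is swapped in via the mapping argument, so alternatives (e.g. a logarithmic or power mapping) can be compared on the same feature space; the resulting kernel matrix can then be passed to any kernel-based classifier.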
