An Asymptotic Analysis of Generative, Discriminative, and Pseudolikelihood Estimators

This video was recorded at the 25th International Conference on Machine Learning (ICML), Helsinki, 2008. Statistical and computational concerns have motivated parameter estimators based on various forms of likelihood, e.g., joint, conditional, and pseudolikelihood. In this paper, we present a unified framework for studying these estimators, which allows us to compare their relative (statistical) efficiencies. Our asymptotic analysis suggests that modeling more of the data tends to reduce variance, but at the cost of being more sensitive to model misspecification. We present experiments validating our analysis.
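For orientation, the estimators contrasted in the talk maximize different likelihood objectives. A rough sketch in generic notation (illustrative only, not necessarily the authors' exact formulation), for i.i.d. examples with input x_i, output y_i, and full observation z_i = (x_i, y_i) with components z_{ij}:

  Joint (generative):           \ell_{joint}(\theta)  = \sum_i \log p_\theta(x_i, y_i)
  Conditional (discriminative): \ell_{cond}(\theta)   = \sum_i \log p_\theta(y_i \mid x_i)
  Pseudolikelihood:             \ell_{pseudo}(\theta) = \sum_i \sum_j \log p_\theta(z_{ij} \mid z_{i,-j})

The joint objective models all of the data, the conditional objective models only the outputs given the inputs, and pseudolikelihood replaces the full (often intractable) likelihood with a product of per-variable conditionals; the variance/misspecification trade-off stated in the abstract is analyzed across this spectrum.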

