Material Detail
Multitask learning: the Bayesian way
This video was recorded at the Open House on Multi-Task and Complex Outputs Learning, London, 2006. Multi-task learning lends itself particularly well to a Bayesian approach. Cross-inference between tasks can be implemented by sharing parameters of the likelihood model and of the prior over the task-specific model parameters. By choosing different priors, one can implement task clustering and task gating. Throughout my presentation, predicting single-copy newspaper sales will serve as a running example.
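The following is a minimal sketch, not the speaker's actual model, of how a shared prior enables cross-inference between tasks. It assumes Gaussian likelihoods, task-specific linear weights drawn from a common Gaussian prior, and an empirical-Bayes loop that re-estimates the shared prior mean from all tasks; the framing of tasks as individual sales outlets, and all variable names, are illustrative assumptions.

```python
# Hedged sketch: hierarchical Bayesian multi-task linear regression where
# cross-inference happens through a shared prior mean (empirical Bayes).
import numpy as np

rng = np.random.default_rng(0)

n_tasks, n_obs, n_feat = 5, 30, 3      # e.g. tasks = sales outlets (assumed framing)
sigma2, tau2 = 0.5, 1.0                # noise and prior variances (fixed here for simplicity)

# Synthetic data: each task's weights are drawn around a common mean.
true_mu = np.array([2.0, -1.0, 0.5])
Xs, ys = [], []
for _ in range(n_tasks):
    w_t = true_mu + np.sqrt(tau2) * rng.standard_normal(n_feat)
    X = rng.standard_normal((n_obs, n_feat))
    y = X @ w_t + np.sqrt(sigma2) * rng.standard_normal(n_obs)
    Xs.append(X)
    ys.append(y)

# Alternate between per-task posteriors (given the shared prior mean)
# and re-estimating the shared prior mean from those posteriors.
mu = np.zeros(n_feat)
for _ in range(20):
    post_means = []
    for X, y in zip(Xs, ys):
        S_inv = X.T @ X / sigma2 + np.eye(n_feat) / tau2          # posterior precision
        m = np.linalg.solve(S_inv, X.T @ y / sigma2 + mu / tau2)  # posterior mean
        post_means.append(m)
    mu = np.mean(post_means, axis=0)                              # cross-task sharing step

print("estimated shared prior mean:", np.round(mu, 2))
print("true shared mean:           ", true_mu)
```

In this sketch the single shared Gaussian prior pools information across all tasks; as the abstract notes, swapping in a different prior (for example, a mixture of Gaussians, or a gated prior driven by task features) would instead yield task clustering or task gating.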