No-Free-Lunch Theorems for Transfer Learning

This video was recorded at the NIPS Workshops, Whistler 2009. I will present a formal framework for transfer learning and investigate the conditions under which it is possible to provide performance guarantees in such scenarios. I will address two key issues: 1) Which notions of task similarity suffice to provide meaningful error bounds on a target task for a predictor trained on a (different) source task? 2) Can we do better than simply training a hypothesis on the source task and analyzing its performance on the target task? In particular, can the use of unlabeled target samples reduce the target prediction error?
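The baseline the second question refers to, training on the source task alone and then measuring the error on the target task, can be illustrated with a toy sketch. This is illustrative only, not from the talk: the Gaussian class distributions, the size of the shift, and the nearest-centroid rule are all assumptions chosen to make the transfer gap visible.

```python
# Toy illustration (not from the talk): fit on a source task, evaluate on a
# shifted target task, and observe the resulting transfer gap.
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Source task: two well-separated 1-D classes.
src_x0 = rng.normal(-2.0, 0.5, n)
src_x1 = rng.normal(+2.0, 0.5, n)

# Target task: the same labels, but both class means shifted.
tgt_x0 = rng.normal(-0.5, 0.5, n)
tgt_x1 = rng.normal(+3.5, 0.5, n)

# Fit a nearest-centroid classifier on the source task only.
c0, c1 = src_x0.mean(), src_x1.mean()
predict = lambda x: (np.abs(x - c1) < np.abs(x - c0)).astype(int)

def error(x0, x1):
    # Fraction of points assigned to the wrong class (balanced classes).
    return 0.5 * (predict(x0).mean() + (1 - predict(x1)).mean())

src_err = error(src_x0, src_x1)
tgt_err = error(tgt_x0, tgt_x1)
print(f"source error: {src_err:.3f}, target error: {tgt_err:.3f}")
```

The predictor is near-perfect on the source task but degrades on the target task because its decision boundary was placed using source data alone; quantifying (and shrinking) this gap, e.g. with unlabeled target samples, is exactly the question the talk poses.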

