Material Detail

Robust Near-Separable Nonnegative Matrix Factorization Using Linear Optimization

This video was recorded at the International Workshop on Advances in Regularization, Optimization, Kernel Methods and Support Vector Machines (ROKS): Theory and Applications, Leuven, 2013. Nonnegative matrix factorization (NMF) has recently been shown to be tractable under the separability assumption, which requires the columns of the input data matrix to belong to the convex cone generated by a small number of its columns. Bittorf, Recht, Ré and Tropp ('Factoring nonnegative matrices with linear programs', NIPS 2012) proposed a linear programming (LP) model, referred to as HottTopixx, which is robust under any small perturbation of the input matrix. However, HottTopixx has two important drawbacks: (i) the input matrix has to be normalized, and (ii) the factorization rank has to be known in advance. In this talk, we generalize HottTopixx to resolve these two drawbacks: we propose a new LP model which does not require normalization and detects the factorization rank automatically. Moreover, the new LP model is more flexible, significantly more tolerant to noise, and can easily be adapted to handle outliers and other noise models. We show on several synthetic datasets that it outperforms HottTopixx while competing favorably with two state-of-the-art methods.
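
To make the separability assumption concrete, here is a minimal notational sketch (the symbols $M$, $\mathcal{K}$, $H$, $N$ and $r$ are illustrative choices, not taken from the talk): separability means the data matrix $M$ can be written in terms of an index set $\mathcal{K}$ of $r$ of its own columns,

$$ M = M(:,\mathcal{K})\, H, \qquad H \ge 0, \qquad |\mathcal{K}| = r, $$

so that every column of $M$ lies in the convex cone generated by the columns indexed by $\mathcal{K}$. In the near-separable (robust) setting, the observed matrix is a perturbation

$$ \widetilde{M} = M + N, \qquad \|N\| \text{ small}, $$

and the LP models discussed in the talk aim to identify $\mathcal{K}$ from $\widetilde{M}$ alone.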
