Sparsity analysis of term weighting schemes and application to text classification

This video was recorded at the Workshop on Subspace, Latent Structure and Feature Selection Techniques: Statistical and Optimisation Perspectives, Bohinj 2005. We revisit the common practice of feature selection for dimensionality and noise reduction, which typically involves scoring and ranking features under some weighting scheme and selecting the top-ranked features for further processing. Experiments show that the performance of text classification methods is sensitive to the characteristics of the feature set used. For example, the feature set sizes that yield the same performance level for a given classification method can differ greatly depending on the feature scoring method. We expand this exploration by considering the representations of individual document vectors that result from a particular feature set. In particular, we observe the average number of features per document vector, i.e., the vector density (or, conversely, sparsity), and introduce sparsity curves to illustrate how vector density grows with feature set size under different weighting schemes. We show that selecting features by specifying a target vector density, instead of a feature set size, yields results comparable to the commonly adopted practice. It has the added benefit of making explicit the effect of feature selection on the document vector representation and on system parameters such as the memory consumption of classification operations. Furthermore, the corresponding classification performance curves link the sparsity and performance measures, and provide further insight into how feature specificity, i.e., the distribution of a feature across the documents in the corpus, is accounted for by the classification method.
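To make the density-based selection concrete, the sketch below ranks features by a given weighting score and picks the smallest top-k feature set whose average number of retained features per document vector reaches a target density. This is a minimal illustration of the idea described in the abstract, not the authors' implementation: the function names (density_curve, select_by_density), the binary document-term matrix, and the toy scores are all hypothetical.

```python
import numpy as np

def density_curve(doc_term, scores):
    """Average number of retained features per document (vector density)
    for each feature-set size k, with features ranked best-first by score.
    doc_term: (n_docs, n_feats) binary occurrence matrix."""
    order = np.argsort(scores)[::-1]           # feature indices, best first
    # cum[d, k-1] = number of doc d's features among the top-k ranked features
    cum = np.cumsum(doc_term[:, order], axis=1)
    return cum.mean(axis=0)                    # densities for k = 1..n_feats

def select_by_density(doc_term, scores, target_density):
    """Smallest top-k feature set whose average vector density reaches
    target_density; returns the indices of the selected features."""
    densities = density_curve(doc_term, scores)
    # densities is non-decreasing (cumulative sums of non-negative counts),
    # so a binary search finds the smallest qualifying k.
    k = int(np.searchsorted(densities, target_density) + 1)
    return np.argsort(scores)[::-1][:k]

# Toy corpus: 4 documents, 6 candidate features (binary occurrence).
X = np.array([[1, 0, 1, 0, 1, 0],
              [1, 1, 0, 0, 0, 1],
              [0, 1, 1, 1, 0, 0],
              [1, 0, 0, 1, 1, 0]])
idf_like = np.array([0.3, 0.5, 0.4, 0.5, 0.6, 1.1])  # hypothetical scores
print(density_curve(X, idf_like))                     # the sparsity curve
print(select_by_density(X, idf_like, target_density=2.0))
```

The controlled quantity here is the average nonzero entries per document vector rather than the vocabulary size, which is why this parameter translates directly into per-document memory and computation cost for the classifier.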
