Stationary Subspace Analysis
This video was recorded at the NIPS Workshop on Algebraic and Combinatorial Methods in Machine Learning, Whistler 2008. Non-stationarities are a ubiquitous phenomenon in real-world data, yet they challenge standard Machine Learning methods: if training and test distributions differ, we cannot, in principle, generalise from the observed training sample to the test distribution. This affects both supervised and unsupervised learning algorithms. In a classification problem, for instance, we may infer spurious dependencies between data and label from the training sample that are mere artefacts of the non-stationarities. Conversely, identifying the sources of non-stationary behaviour in order to better understand the analyzed system often lies at the heart of a scientific question. To this end, we propose a novel unsupervised paradigm: Stationary Subspace Analysis (SSA). SSA decomposes a multivariate time series into a stationary and a non-stationary subspace. We derive an efficient algorithm that hinges on an optimization procedure in the Special Orthogonal Group. By exploiting the Lie group structure of the optimization manifold, we can explicitly factor out the inherent symmetries of the problem and thereby reduce the number of parameters to the exact degrees of freedom. The practical utility of our approach is demonstrated in an application to Brain-Computer Interfacing (BCI).
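The core idea of searching the Special Orthogonal Group for a rotation that exposes a stationary projection can be illustrated in miniature. The following sketch is not the authors' algorithm: it is a toy two-dimensional analogue in which SO(2) has a single free angle, and non-stationarity is measured by an assumed simple proxy (variation of per-epoch mean and variance) rather than the divergence-based objective an SSA implementation would use.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: source 0 is stationary noise, source 1 has a drifting mean.
n_epochs, epoch_len = 20, 200
T = n_epochs * epoch_len
s_stat = rng.normal(0.0, 1.0, T)                     # stationary source
s_nonstat = rng.normal(np.linspace(-3, 3, T), 1.0)   # non-stationary mean drift
S = np.vstack([s_stat, s_nonstat])

# Mix the sources with an element of SO(2) (a rotation by true_angle).
true_angle = 0.7
c, s = np.cos(true_angle), np.sin(true_angle)
A = np.array([[c, -s], [s, c]])
X = A @ S

def nonstationarity(x):
    """Proxy measure (an assumption of this sketch): how much the
    per-epoch mean and variance of a 1-D signal vary across epochs."""
    epochs = x.reshape(n_epochs, epoch_len)
    return epochs.mean(axis=1).var() + epochs.var(axis=1).var()

# Grid search over the one degree of freedom of SO(2) for the rotation
# whose first row yields the most stationary one-dimensional projection.
angles = np.linspace(0.0, np.pi, 1000)
costs = [nonstationarity(np.array([np.cos(a), np.sin(a)]) @ X) for a in angles]
best = angles[np.argmin(costs)]
print(f"recovered angle: {best:.3f} (true mixing angle: {true_angle})")
```

In higher dimensions the search space is all of SO(d), and, as the abstract notes, the Lie group structure lets the inherent symmetries (rotations within each subspace) be factored out so that only the exact degrees of freedom remain as parameters.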
