
Learning attractor landscapes for learning motor primitives


Conference Paper


If globally high-dimensional data has only locally low-dimensional distributions, it is advantageous to perform a local dimensionality reduction before further processing the data. In this paper we examine several techniques for local dimensionality reduction in the context of locally weighted linear regression. As possible candidates, we derive local versions of factor analysis regression, principal component regression, principal component regression on joint distributions, and partial least squares regression. After outlining the statistical bases of these methods, we perform Monte Carlo simulations to evaluate their robustness with respect to violations of their statistical assumptions. One surprising outcome is that locally weighted partial least squares regression offers the best average results, thus outperforming even factor analysis, the theoretically most appealing of our candidate techniques.
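The abstract's common baseline, locally weighted linear regression, can be illustrated with a minimal sketch: weight the training samples by a Gaussian kernel centred on the query point, then solve the resulting weighted least-squares problem for a local linear model. The function name, the bandwidth value, and the toy data below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def lwr_predict(X, y, x_query, bandwidth=0.3):
    """Locally weighted linear regression prediction at one query point.

    Illustrative sketch: Gaussian kernel weights around the query,
    followed by a weighted least-squares fit of an affine local model.
    The bandwidth is a hypothetical hyperparameter choice.
    """
    # Augment inputs with a bias column for the affine local model.
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    xq = np.append(x_query, 1.0)
    # Gaussian weights from squared distance to the query point.
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    # Weighted least squares: beta = (Xa' W Xa)^{-1} Xa' W y.
    W = np.diag(w)
    beta = np.linalg.solve(Xa.T @ W @ Xa, Xa.T @ W @ y)
    return xq @ beta

# Toy usage: noisy samples of a sine curve, queried at pi/2.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)
pred = lwr_predict(X, y, np.array([np.pi / 2]))
```

The local dimensionality-reduction variants the paper compares (factor analysis, principal components, partial least squares) would replace the plain weighted least-squares solve above with a regression on a locally reduced input representation.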

Author(s): Ijspeert, A. and Nakanishi, J. and Schaal, S.
Book Title: Advances in Neural Information Processing Systems 15
Pages: 1547-1554
Year: 2003
Editors: Becker, S.; Thrun, S.; Obermayer, K.
Publisher: Cambridge, MA: MIT Press

Department(s): Autonomous Motion
Bibtex Type: Conference Paper (inproceedings)

Cross Ref: p1642
Note: clmc
URL: http://www-clmc.usc.edu/publications/I/ijspeert-NIPS2002.pdf


@inproceedings{ijspeert-NIPS2002,
  title = {Learning attractor landscapes for learning motor primitives},
  author = {Ijspeert, A. and Nakanishi, J. and Schaal, S.},
  booktitle = {Advances in Neural Information Processing Systems 15},
  pages = {1547--1554},
  editor = {Becker, S. and Thrun, S. and Obermayer, K.},
  address = {Cambridge, MA},
  publisher = {MIT Press},
  year = {2003},
  note = {clmc},
  crossref = {p1642},
  url = {http://www-clmc.usc.edu/publications/I/ijspeert-NIPS2002.pdf}
}