Autonomous Motion

Fusing visual and tactile sensing for 3-D object reconstruction while grasping

2013

Conference Paper

In this work, we propose to reconstruct a complete 3-D model of an unknown object by fusing visual and tactile information while the object is grasped. Assuming the object is symmetric, a first hypothesis of its complete 3-D shape is generated from a single view. This initial model is used to plan a grasp on the object, which is then executed with a robotic manipulator equipped with tactile sensors. Given the detected contacts between the fingers and the object, the full object model, including the symmetry parameters, can be refined. This refined model then allows the planning of more complex manipulation tasks. The main contribution of this work is an optimal estimation approach for fusing visual and tactile data under the constraint of object symmetry. The fusion is formulated as a state estimation problem and solved with an iterative extended Kalman filter. The approach is validated experimentally using both artificial and real data from two different robotic platforms.
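To make the fusion step concrete, the sketch below shows an iterated extended Kalman filter measurement update on a deliberately reduced 2-D version of the problem; it is an illustration under stated assumptions, not the authors' implementation. The state is a hypothesized symmetry line (normal angle theta, offset d), the measurement model mirrors observed visual surface points across that line, and tactile contact points act as noisy measurements of the mirrored positions. All function names, the toy geometry, and the use of a numerical Jacobian are choices made for this example.

import numpy as np

def mirror(p, theta, d):
    # Reflect point p across the line n . x = d with unit normal n = (cos theta, sin theta).
    n = np.array([np.cos(theta), np.sin(theta)])
    return p - 2.0 * (n @ p - d) * n

def h(x, visual_pts):
    # Measurement model: predicted mirror images of the visual surface points.
    theta, d = x
    return np.concatenate([mirror(p, theta, d) for p in visual_pts])

def num_jacobian(f, x, eps=1e-6):
    # Central-difference Jacobian of f at x (keeps the example short).
    fx = f(x)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (f(x + dx) - f(x - dx)) / (2.0 * eps)
    return J

def iekf_update(x0, P0, z, f, R, n_iter=10):
    # Iterated EKF measurement update: relinearize around the current iterate.
    x = x0.copy()
    for _ in range(n_iter):
        H = num_jacobian(f, x)
        K = P0 @ H.T @ np.linalg.inv(H @ P0 @ H.T + R)
        x = x0 + K @ (z - f(x) - H @ (x0 - x))
    P = (np.eye(x0.size) - K @ H) @ P0
    return x, P

# Toy scene: the true symmetry line is x = 1 (theta = 0, d = 1).
rng = np.random.default_rng(0)
visual_pts = [np.array([0.2, 0.5]), np.array([0.4, 1.5]), np.array([-0.1, 2.0])]
z = np.concatenate([mirror(p, 0.0, 1.0) for p in visual_pts])  # ideal contacts
z += 0.01 * rng.standard_normal(z.size)                        # tactile noise

x0 = np.array([0.3, 0.5])       # rough prior from the single-view hypothesis
P0 = np.diag([0.5, 0.5])        # prior covariance
R = 0.01**2 * np.eye(z.size)    # contact measurement noise

x_hat, P_hat = iekf_update(x0, P0, z, lambda x: h(x, visual_pts), R)
print("estimated [theta, d]:", x_hat)

In the paper the state additionally comprises the object surface itself in 3-D; the reduction here only isolates the defining feature of the iterated EKF, namely relinearizing the measurement model around the current iterate rather than updating once at the prior mean.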

Author(s): Ilonen, J. and Bohg, J. and Kyrki, V.
Book Title: IEEE International Conference on Robotics and Automation (ICRA)
Pages: 3547-3554
Year: 2013

Department(s): Autonomous Motion
Research Project(s): Interactive Perception
Bibtex Type: Conference Paper (inproceedings)
Paper Type: Conference

DOI: 10.1109/ICRA.2013.6631074

BibTeX

@inproceedings{6631074,
  title = {Fusing visual and tactile sensing for 3-D object reconstruction while grasping},
  author = {Ilonen, J. and Bohg, J. and Kyrki, V.},
  booktitle = {IEEE International Conference on Robotics and Automation (ICRA)},
  pages = {3547--3554},
  year = {2013},
  doi = {10.1109/ICRA.2013.6631074}
}