
Jeannette Bohg (Project leader)
Research Group Leader
Ville Kyrki
Aalto University
Jarmo Ilonen
Lappeenranta University of Technology

2013


Fusing visual and tactile sensing for 3-D object reconstruction while grasping

Ilonen, J., Bohg, J., Kyrki, V.

In IEEE International Conference on Robotics and Automation (ICRA), pages: 3547-3554, 2013 (inproceedings)

Abstract
In this work, we propose to reconstruct a complete 3-D model of an unknown object by fusion of visual and tactile information while the object is grasped. Assuming the object is symmetric, a first hypothesis of its complete 3-D shape is generated from a single view. This initial model is used to plan a grasp on the object which is then executed with a robotic manipulator equipped with tactile sensors. Given the detected contacts between the fingers and the object, the full object model including the symmetry parameters can be refined. This refined model will then allow the planning of more complex manipulation tasks. The main contribution of this work is an optimal estimation approach for the fusion of visual and tactile data applying the constraint of object symmetry. The fusion is formulated as a state estimation problem and solved with an iterative extended Kalman filter. The approach is validated experimentally using both artificial and real data from two different robotic platforms.

DOI Project Page [BibTex]
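The fusion step described in the abstract is formulated as a state estimation problem solved with an iterated extended Kalman filter, which relinearizes the measurement model around the current estimate. A minimal sketch of one such measurement update, assuming a generic state/measurement setup (the function names and the Bell–Cathey relinearized form are illustrative, not the paper's exact implementation):

```python
import numpy as np

def iekf_update(x0, P, z, h, H_jac, R, n_iters=5):
    """One measurement update of an iterated extended Kalman filter.

    x0    : prior state mean
    P     : prior state covariance
    z     : measurement vector
    h     : nonlinear measurement function h(x)
    H_jac : Jacobian of h evaluated at x
    R     : measurement noise covariance

    Relinearizing h about the current iterate (rather than only the
    prior) helps with strongly nonlinear measurements such as
    finger-contact constraints on a symmetric surface.
    """
    x = x0.copy()
    for _ in range(n_iters):
        H = H_jac(x)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        # Iterated update: innovation is corrected for the shift
        # between the linearization point x and the prior mean x0.
        x = x0 + K @ (z - h(x) - H @ (x0 - x))
    H = H_jac(x)
    P_new = (np.eye(len(x0)) - K @ H) @ P
    return x, P_new
```

For a linear measurement model this reduces to the standard Kalman update after the first iteration, which is a convenient sanity check.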



3-D Object Reconstruction of Symmetric Objects by Fusing Visual and Tactile Sensing

Ilonen, J., Bohg, J., Kyrki, V.

The International Journal of Robotics Research, 33(2):321-341, Sage, October 2013 (article)

Abstract
In this work, we propose to reconstruct a complete 3-D model of an unknown object by fusion of visual and tactile information while the object is grasped. Assuming the object is symmetric, a first hypothesis of its complete 3-D shape is generated. A grasp is executed on the object with a robotic manipulator equipped with tactile sensors. Given the detected contacts between the fingers and the object, the initial full object model including the symmetry parameters can be refined. This refined model will then allow the planning of more complex manipulation tasks. The main contribution of this work is an optimal estimation approach for the fusion of visual and tactile data applying the constraint of object symmetry. The fusion is formulated as a state estimation problem and solved with an iterative extended Kalman filter. The approach is validated experimentally using both artificial and real data from two different robotic platforms.

Web DOI Project Page [BibTex]


2011


Mind the gap - robotic grasping under incomplete observation

Bohg, J., Johnson-Roberson, M., Leon, B., Felip, J., Gratal, X., Bergstrom, N., Kragic, D., Morales, A.

In Robotics and Automation (ICRA), 2011 IEEE International Conference on, pages: 686-693, May 2011 (inproceedings)

Abstract
We consider the problem of grasp and manipulation planning when the state of the world is only partially observable. Specifically, we address the task of picking up unknown objects from a table top. The proposed approach to object shape prediction aims at closing the knowledge gaps in the robot's understanding of the world. A completed state estimate of the environment can then be provided to a simulator in which stable grasps and collision-free movements are planned. The proposed approach is based on the observation that many objects commonly in use in a service robotic scenario possess symmetries. We search for the optimal parameters of these symmetries given visibility constraints. Once found, the point cloud is completed and a surface mesh reconstructed. Quantitative experiments show that the predictions are valid approximations of the real object shape. By demonstrating the approach on two very different robotic platforms its generality is emphasized.

pdf video code data DOI Project Page [BibTex]
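The core idea of the shape prediction above is that, once a symmetry is hypothesized, the observed partial point cloud can be completed by reflecting it across the symmetry plane. A minimal sketch of that reflection step (the function name and interface are illustrative; the paper additionally searches over candidate symmetry parameters under visibility constraints and reconstructs a surface mesh):

```python
import numpy as np

def mirror_points(points, plane_point, plane_normal):
    """Complete a partial point cloud by reflecting it across a
    hypothesized symmetry plane.

    points       : (N, 3) array of observed 3-D points
    plane_point  : a point on the symmetry plane
    plane_normal : normal of the symmetry plane (need not be unit length)

    Returns the observed points stacked with their mirrored copies.
    """
    n = plane_normal / np.linalg.norm(plane_normal)
    # Signed distance of every point from the plane along the normal.
    d = (points - plane_point) @ n
    mirrored = points - 2.0 * d[:, None] * n
    return np.vstack([points, mirrored])
```

Scoring how plausibly each candidate plane explains the data (e.g. penalizing mirrored points that land in observed free space) is what selects among symmetry hypotheses.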
