While many solutions and criteria exist for selecting the best grasp for an object of known shape, grasping an object whose observed shape is uncertain and noisy remains a challenge. In this project, we consider the problem of estimating the shape of an object that is only partially observable. Once a complete shape prediction is available, we can apply grasp synthesis criteria that require knowledge of the full object shape.
The proposed approach to object shape prediction aims at closing the knowledge gaps in the robot’s understanding of the world. Psychological studies suggest that humans are able to predict the portions of a scene that are not visible to them through controlled scene continuation. The expected structure of unobserved object parts is governed by two classes of knowledge: i) visual evidence and ii) completion rules gained through prior visual experience. A very strong prior, especially for man-made objects, is symmetry. In [ ], we showed that by exploiting visibility constraints, we could estimate the pose of the symmetry axis from a single view of the object. However, the quality of this estimate depends on having a sufficiently good viewpoint; otherwise, the object’s width in the viewing direction is overestimated.
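The core geometric operation behind symmetry-based completion is mirroring observed surface points across a hypothesized symmetry plane. As a minimal sketch (the function name and plane parameterization are our own, not from the papers), a plane can be described by a unit normal `n` and an offset `d` with equation n·x = d:

```python
def reflect(point, normal, offset):
    """Reflect a 3-D point across the plane n . x = offset,
    where `normal` is a unit vector. The mirrored point predicts
    an unobserved surface point on the far side of the object."""
    # Signed distance from the point to the plane
    dist = sum(p * n for p, n in zip(point, normal)) - offset
    # Move the point twice its signed distance along the normal
    return tuple(p - 2.0 * dist * n for p, n in zip(point, normal))

# Example: mirror a visible point across the plane x = 1
mirrored = reflect((3.0, 0.0, 0.5), (1.0, 0.0, 0.0), 1.0)
```

Applying this reflection to every observed point yields the predicted hidden half of a symmetric object, given an estimate of the plane.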
In [ ], we propose to include tactile measurements for estimating the complete object shape under the assumption of symmetry. Specifically, we treat the locations of the contacts between hand and object as additional constraints in the estimation process. The problem is formulated as state estimation, where the state contains all observed object points, the parameters of the symmetry plane, and the bias error between camera and arm. Given the contact points as measurements, we can correct an initial guess of the symmetry plane such that the original points and the points mirrored across the symmetry plane comply with the contact points. This optimization is solved with an Iterated Extended Kalman Filter.
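To illustrate the correction step, the following is a deliberately simplified sketch, not the paper's filter: it refines only the plane offset (keeping the normal fixed) by iteratively driving the residual between each contact point and the mirror of its matched observed point to zero. The data association and the Gauss-Newton-style update stand in for the full Iterated EKF over all state variables:

```python
def reflect(point, normal, offset):
    """Reflect a 3-D point across the plane n . x = offset (n unit length)."""
    dist = sum(p * n for p, n in zip(point, normal)) - offset
    return tuple(p - 2.0 * dist * n for p, n in zip(point, normal))

def refine_offset(observed, contacts, normal, offset, iters=10):
    """Correct the symmetry-plane offset so that observed points,
    once mirrored, agree with the tactile contacts. Assumes each
    contact is already matched to one observed point (hypothetical
    simplification; the paper estimates the full plane pose)."""
    for _ in range(iters):
        residuals = []
        for obs, con in zip(observed, contacts):
            mirrored = reflect(obs, normal, offset)
            # Error between contact and mirrored point, along the plane normal
            residuals.append(sum((c - m) * n
                                 for c, m, n in zip(con, mirrored, normal)))
        # The mirrored point moves by 2 units per unit of offset, so the
        # Gauss-Newton step is half the mean residual
        offset += sum(residuals) / (2.0 * len(residuals))
    return offset

# Example: a point seen at x = 3 and a contact felt at x = -1
# are consistent with a symmetry plane at x = 1
d = refine_offset([(3.0, 0.0, 0.0)], [(-1.0, 0.0, 0.0)],
                  (1.0, 0.0, 0.0), offset=0.0)
```

Because the residual is linear in the offset, this toy version converges in a single iteration; the full problem is nonlinear in the plane orientation and camera-arm bias, which motivates the iterated filter.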