When developing controllers for legged robots, one typically assumes that important quantities such as the robot's Center of Mass (CoM), its position and orientation in space, and its joint positions and velocities are known accurately enough to be used in feedback laws. While estimating such quantities is trivial in simulation, it becomes a serious problem on real robots, especially legged robots that can fall and are subject to unknown external disturbances. Sensor noise and biases can significantly degrade the performance of controllers. Moreover, several of these quantities cannot be measured directly (e.g. the base position and orientation in space, the CoM position, or the angular momentum).
In this project, we explore the problem of fusing inertial, position, and force measurements to recover quantities that are fundamental for our feedback controllers. Our final goal is to find a systematic way of fusing multiple sensor modalities to improve the control of legged robots. In the theoretical part, we perform observability analyses on nonlinear estimation models to understand the fundamental limitations of our sensor-fusion approaches. In our experimental work, we demonstrate that our estimators, by providing accurate estimates of these quantities, can significantly increase the effective control bandwidth and therefore the performance of our controllers.
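To illustrate the kind of observability analysis involved, the sketch below checks the rank of the Kalman observability matrix for a hypothetical linearized model (not the project's actual models): a 1-D state with position, velocity, and an accelerometer bias. It shows a typical fundamental limitation, namely that absolute position is not observable from inertial and velocity information alone.

```python
import numpy as np

# Hypothetical linearized model (illustrative assumption, not the project's model):
# state x = [position, velocity, accelerometer bias]
# dynamics: p' = v, v' = a_meas - b, b' = 0
A = np.array([[0.0, 1.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 0.0,  0.0]])

def observability_rank(A, C):
    """Rank of the Kalman observability matrix [C; CA; CA^2; ...]."""
    n = A.shape[0]
    blocks = [C @ np.linalg.matrix_power(A, k) for k in range(n)]
    return np.linalg.matrix_rank(np.vstack(blocks))

# Measuring velocity only (e.g. from leg odometry): rank 2 < 3,
# so absolute position remains unobservable.
C_vel = np.array([[0.0, 1.0, 0.0]])
print(observability_rank(A, C_vel))      # -> 2

# Adding a position measurement renders the full state observable.
C_pos_vel = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
print(observability_rank(A, C_pos_vel))  # -> 3
```

For the nonlinear models used in practice, the same question is posed with Lie-derivative-based observability matrices rather than the linear Kalman test shown here.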
We developed state estimation methods that recover the state of the floating base of a robot [ ] using information about its contact situation, with improved observability characteristics. We also developed methods to estimate the CoM of the robot, its angular momentum, and the external wrenches applied to it [ ] by fusing kinematic, inertial, and force sensing information. Finally, we explored the problem of improving velocity estimation accuracy by fusing position and inertial measurements.
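The idea behind fusing position and inertial measurements for velocity estimation can be sketched with a toy complementary filter (an illustrative stand-in, not the project's estimator): integrated acceleration is accurate at high frequency but drifts, while differentiated position is noisy but drift-free, so the two are blended with a tunable weight.

```python
import numpy as np

def fuse_velocity(pos_meas, accel_meas, dt, alpha=0.98):
    """Toy complementary filter: blend integrated acceleration
    (high-frequency path) with differentiated position
    (low-frequency, drift-free path) to estimate velocity.
    `alpha` trades IMU smoothness against position-based drift correction."""
    v_est = 0.0
    prev_pos = pos_meas[0]
    estimates = []
    for p, a in zip(pos_meas, accel_meas):
        v_imu = v_est + a * dt        # propagate with the accelerometer
        v_pos = (p - prev_pos) / dt   # finite-difference the position sensor
        v_est = alpha * v_imu + (1.0 - alpha) * v_pos
        prev_pos = p
        estimates.append(v_est)
    return estimates

# Usage sketch: constant-velocity motion at 0.5 m/s, ideal sensors.
dt = 0.01
t = np.arange(500) * dt
est = fuse_velocity(0.5 * t, np.zeros_like(t), dt)
print(est[-1])  # converges toward 0.5
```

In practice a Kalman filter plays this role, weighting each modality by its noise model instead of a fixed `alpha`, but the blending intuition is the same.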