Title: Articulated Tracking for Humanoid Manipulation Tasks
Robots that use man-made tools often require complex manipulators with many degrees of freedom. Controlling versatile manipulators such as hands, and maintaining an estimate of their state, mostly relies on the joint states reported by the arm and fingers. Because of calibration inaccuracies and other factors such as linkage elasticity, these reported joint states may not reflect the true state of the manipulator.
Our aim is to provide additional visual feedback during manipulation tasks. We propose to use discriminative learning methods to classify parts of the manipulator in depth-camera images and to predict the manipulator's pose directly on a per-frame basis. In this way, manipulator tracking becomes independent of errors in the reported joint states and can distinguish parts of similar shape.
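The per-pixel part-classification idea can be sketched as follows. This is an illustrative toy, not the authors' implementation: the depth image, part labels ("palm" vs. "finger"), feature offsets, and the choice of a scikit-learn random forest are all assumptions made for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Synthetic 32x32 depth image (metres): a "finger" region (label 1) lies
# closer to the camera than the "palm" region (label 0), plus sensor noise.
depth = np.full((32, 32), 1.0)
depth[:, 16:] = 0.6
depth += rng.normal(0.0, 0.01, depth.shape)
labels = np.zeros((32, 32), dtype=int)
labels[:, 16:] = 1

# Per-pixel features: the pixel's own depth plus depth differences to a
# few fixed pixel offsets (a simple stand-in for learned depth features).
offsets = [(-3, 0), (3, 0), (0, -3), (0, 3), (-5, 5), (5, -5)]

def pixel_features(depth, ys, xs):
    h, w = depth.shape
    feats = [depth[ys, xs]]
    for dy, dx in offsets:
        oy = np.clip(ys + dy, 0, h - 1)
        ox = np.clip(xs + dx, 0, w - 1)
        feats.append(depth[ys, xs] - depth[oy, ox])
    return np.stack(feats, axis=1)

ys, xs = np.mgrid[0:32, 0:32]
ys, xs = ys.ravel(), xs.ravel()
X = pixel_features(depth, ys, xs)
y = labels[ys, xs]

# Discriminative per-pixel classifier; a real system would train on many
# rendered or annotated frames, not on the frame it predicts.
clf = RandomForestClassifier(n_estimators=20, random_state=0).fit(X, y)
pred = clf.predict(X).reshape(32, 32)

# Crude single-frame pose readout: 2-D centroid of the "finger" pixels.
fy, fx = np.nonzero(pred == 1)
print("finger centroid (row, col):", fy.mean(), fx.mean())
```

Because every frame is classified independently, an estimate like this centroid does not accumulate drift from erroneous reported joint states.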
IPAB workshop - 25/05/2017