IPAB Workshop-18/07/2024
Speaker: Zhaole Sun
Title: Dexterous Manipulation of Cables
Abstract: Human hands are dexterous enough to manipulate cables, ropes, hoses, and other deformable linear objects to achieve certain goals, such as inserting a USB cable for phone charging. However, there is limited robotics research on dexterous manipulation of cables. To address this problem, we proposed an integrated system comprising (1) cable perception, (2) a new dexterous hand design, and (3) a dexterous manipulation algorithm. In this workshop, we will focus on (2) and (3). Our newly designed hand is a multi-fingered hand with 25 degrees of freedom, based on an open-source hand design, and can achieve many dexterous manipulation tasks that are difficult even for humans. We also implemented a dexterous cable manipulation algorithm based on reinforcement learning in simulation. We will further discuss the sim-to-real gap in deploying the algorithm in the real world and show some existing real-world demos.
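As a rough illustration of the kind of reinforcement-learning setup the abstract describes, the sketch below shows a rollout interface for a 25-DoF hand acting on a cable represented by tracked keypoints. It is a hypothetical, minimal sketch, not the speaker's implementation: the environment dynamics, keypoint count, and reward are placeholder assumptions, and real training would run inside a physics simulator.

```python
import numpy as np

NUM_JOINTS = 25        # degrees of freedom of the multi-fingered hand
NUM_KEYPOINTS = 16     # hypothetical number of tracked cable keypoints

class ToyCableEnv:
    """Placeholder environment; a real setup would use a physics simulator."""

    def reset(self):
        self.joints = np.zeros(NUM_JOINTS)
        self.cable = np.random.uniform(-0.1, 0.1, size=(NUM_KEYPOINTS, 3))
        return self._obs()

    def step(self, action):
        # Apply joint-position deltas; the cable response is faked with noise here.
        self.joints = np.clip(self.joints + action, -1.0, 1.0)
        self.cable += np.random.normal(0.0, 1e-3, size=self.cable.shape)
        reward = -np.linalg.norm(self.cable[-1])  # e.g. bring the cable tip to a target
        return self._obs(), reward, False

    def _obs(self):
        # Observation = hand joint angles + flattened cable keypoint positions.
        return np.concatenate([self.joints, self.cable.ravel()])

def random_policy(obs):
    # Stand-in for a learned policy network.
    return np.random.uniform(-0.05, 0.05, size=NUM_JOINTS)

env = ToyCableEnv()
obs = env.reset()
for t in range(100):
    obs, reward, done = env.step(random_policy(obs))
```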
Speaker: Thomas Corberes
Title: Planning and control for real-time quadruped manoeuvres in multi-contact environments.
Abstract: Dynamic quadrupedal locomotion in a complex environment requires efficient environmental perception, precise footstep planning, collision avoidance, and a robust locomotion controller. We propose a complete pipeline, from detection to motion generation in a cluttered environment, that builds on several prior works. This synthesis aims to strike a trade-off between different methods, generating complex motion while remaining relatively simple and computationally efficient enough for real-time application. The proposed control architecture was initially developed and tested on a lightweight robot (Solo) and subsequently extended to a larger robot (ANYmal-B) to further showcase the capabilities of our approach. Finally, we will present an exploration of various methods for enhancing planning capabilities through learning.
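To give a flavour of the footstep-planning stage such a pipeline includes, the sketch below scores candidate footholds on a heightmap by combining distance to a nominal footstep with local terrain roughness. The cost weights, map, and search radius are illustrative assumptions, not the presented method.

```python
import numpy as np

def select_foothold(heightmap, nominal_xy, cell_size=0.02,
                    w_dist=1.0, w_rough=5.0, search_radius=0.15):
    """Pick the grid cell minimising distance-to-nominal plus terrain roughness.

    heightmap:  2D array of terrain heights (metres), origin at the map corner.
    nominal_xy: desired footstep location from the gait planner (metres).
    """
    h, w = heightmap.shape
    best_cost, best_xy = np.inf, nominal_xy
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            xy = np.array([i, j]) * cell_size
            dist = np.linalg.norm(xy - nominal_xy)
            if dist > search_radius:
                continue
            # Roughness = height variation in the 3x3 neighbourhood.
            patch = heightmap[i - 1:i + 2, j - 1:j + 2]
            rough = patch.max() - patch.min()
            cost = w_dist * dist + w_rough * rough
            if cost < best_cost:
                best_cost, best_xy = cost, xy
    return best_xy

terrain = np.random.uniform(0.0, 0.05, size=(50, 50))   # toy 1 m x 1 m map
print(select_foothold(terrain, nominal_xy=np.array([0.5, 0.5])))
```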
Speaker: Marina Aoyama
Title: Few-Shot Learning of Force-Based Motions From Demonstration Through Pre-training of Haptic Representation
Abstract: In many contact-rich tasks, force sensing plays an essential role in adapting the motion to the physical properties of the manipulated object. To enable robots to capture the underlying distribution of object properties necessary for generalizing learned manipulation tasks to unseen objects, existing Learning from Demonstration (LfD) approaches require a large number of costly human demonstrations. In our recent work, we proposed a semi-supervised LfD approach that decouples the learnt model into a haptic representation encoder and a motion generation decoder. This allows us to pre-train the former on a large amount of easily accessible unsupervised data, while training the latter with few-shot LfD, leveraging the benefits of learning skills from humans. I will talk about our approach, which was validated on a wiping task using sponges with different stiffness and surface friction on a physical robot. I will also talk about our analysis.
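The two-stage training the abstract outlines could be sketched roughly as follows, assuming PyTorch, toy dimensions, and an autoencoding pre-training objective; these choices, and the random tensors standing in for force data and demonstrations, are illustrative assumptions rather than the authors' actual architecture or losses.

```python
import torch
import torch.nn as nn

FORCE_DIM, LATENT_DIM, MOTION_DIM = 64, 8, 16   # illustrative sizes

encoder = nn.Sequential(nn.Linear(FORCE_DIM, 32), nn.ReLU(), nn.Linear(32, LATENT_DIM))
recon   = nn.Sequential(nn.Linear(LATENT_DIM, 32), nn.ReLU(), nn.Linear(32, FORCE_DIM))
decoder = nn.Sequential(nn.Linear(LATENT_DIM, 32), nn.ReLU(), nn.Linear(32, MOTION_DIM))

# Stage 1: pre-train the haptic representation encoder on plentiful unlabeled
# force data (random tensors here stand in for recorded force signals).
unlabeled_forces = torch.randn(1000, FORCE_DIM)
opt = torch.optim.Adam(list(encoder.parameters()) + list(recon.parameters()), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(recon(encoder(unlabeled_forces)), unlabeled_forces)
    loss.backward()
    opt.step()

# Stage 2: few-shot LfD -- freeze the encoder and fit the motion generation
# decoder on a handful of (force, demonstrated motion) pairs.
for p in encoder.parameters():
    p.requires_grad_(False)
demo_forces, demo_motions = torch.randn(5, FORCE_DIM), torch.randn(5, MOTION_DIM)
opt = torch.optim.Adam(decoder.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(decoder(encoder(demo_forces)), demo_motions)
    loss.backward()
    opt.step()
```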
Location: G.03