IEEE International Conference on Robotics and Automation (ICRA) 2024

SLMC and the Alan Turing Institute present four papers and a workshop at ICRA 2024

Paper 1

Russell Buchanan*, Adrian Röfer, Joao Moura, Abhinav Valada and Sethu Vijayakumar, Online Estimation of Articulated Objects with Factor Graphs using Vision and Proprioceptive Sensing, Proc. IEEE International Conference on Robotics and Automation (ICRA 2024), Yokohama, Japan (2024). [pdf] [video] [citation]

Abstract

From dishwashers to cabinets, humans interact with articulated objects every day, and for a robot to assist in common manipulation tasks, it must learn a representation of articulation. Recent deep learning methods can provide powerful vision-based priors on the affordance of articulated objects from previous, possibly simulated, experiences. In contrast, many works estimate articulation by observing the object in motion, requiring the robot to already be interacting with the object. In this work, we propose to use the best of both worlds by introducing an online estimation method that merges vision-based affordance predictions from a neural network with interactive kinematic sensing in an analytical model. Our work has the benefit of using vision to predict an articulation model before touching the object, while also being able to update the model quickly from kinematic sensing during the interaction. In this paper, we implement a full system using shared autonomy for robotic opening of articulated objects, in particular objects in which the articulation is not apparent from vision alone. We deployed our system on a real robot and performed several autonomous closed-loop experiments in which the robot had to open a door with an unknown joint while estimating the articulation online. Our system achieved an 80% success rate for autonomous opening of unknown articulated objects.
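The core idea of fusing a vision-based prior with interactive kinematic sensing can be illustrated in a heavily simplified form. The sketch below is a hypothetical 1-D reduction of the paper's factor-graph formulation: the vision network's prediction of an articulation parameter (say, a hinge angle) acts as one Gaussian factor, and each kinematic observation during interaction adds another; fusing them in information (inverse-variance) form gives the posterior estimate. All variable names and noise values here are illustrative assumptions, not from the paper.

```python
import numpy as np

def fuse_gaussian(mu_prior, var_prior, measurements, var_meas):
    """Fuse a vision-based prior on an articulation parameter with
    kinematic measurements, in information (inverse-variance) form.

    Each factor contributes precision lam = 1/var and weighted mean
    eta = mu/var; the posterior is the normalised sum of all factors.
    """
    lam = 1.0 / var_prior          # precision of the vision prior
    eta = mu_prior / var_prior     # information vector of the prior
    for z in measurements:
        lam += 1.0 / var_meas      # each kinematic factor adds precision
        eta += z / var_meas
    var_post = 1.0 / lam
    mu_post = eta * var_post
    return mu_post, var_post

# Vision predicts a joint parameter of 0.0 rad with high uncertainty;
# kinematic sensing during interaction repeatedly observes ~0.5 rad,
# so the posterior is pulled towards the interactive measurements.
mu, var = fuse_gaussian(0.0, 1.0, [0.52, 0.49, 0.51], 0.01)
```

The posterior mean lands near 0.5 with much lower variance than either source alone, mirroring the paper's point that vision gives a useful pre-touch estimate which kinematic sensing quickly refines.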

Supported by: EU project HARMONY and The Alan Turing Institute

Paper 2

Andreas Christou, Antonio J. del-Ama, Juan C. Moreno and Sethu Vijayakumar, Adaptive Control for Triadic Human-Robot-FES Collaboration in Gait Rehabilitation: A Pilot Study, Proc. IEEE International Conference on Robotics and Automation (ICRA 2024), Yokohama, Japan (2024). [pdf] [video] [citation]

Abstract

The hybridisation of robot-assisted gait training and functional electrical stimulation (FES) can provide numerous physiological benefits to neurological patients. However, the design of an effective hybrid controller poses significant challenges. In this over-actuated system, it is extremely difficult to find the right balance between robotic assistance and FES that will provide personalised assistance, prevent muscle fatigue and encourage the patient's active participation in order to accelerate recovery.

In this paper, we present an adaptive hybrid robot-FES controller that addresses these challenges and enables triadic collaboration between the patient, the robot and FES. A patient-driven controller is designed where the voluntary movement of the patient is prioritised and assistance is provided using FES and the robot in a hierarchical order depending on the patient's performance and their muscles' fitness. The performance of this hybrid adaptive controller is tested in simulation and on one healthy subject. Our results indicate an increase in tracking performance with lower overall assistance, and less muscle fatigue when the adaptive hybrid controller is used, compared to its non-adaptive counterpart. This suggests that our hybrid adaptive controller may be able to adapt to the behaviour of the user to provide assistance as needed and prevent the early termination of physical therapy due to muscle fatigue.
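The hierarchical ordering described above can be sketched as a simple allocation rule: the patient's voluntary effort is counted first, FES covers the remaining demand up to a capacity that shrinks with fatigue, and the robot supplies whatever is left. This is a hypothetical simplification for illustration; the function name, torque units, and fatigue model are assumptions, not the paper's controller.

```python
def allocate_assistance(required_torque, voluntary_torque, fes_capacity, fatigue):
    """Hypothetical sketch of hierarchical assistance allocation.

    required_torque:  torque needed to track the gait reference (Nm)
    voluntary_torque: torque the patient produces themselves (Nm)
    fes_capacity:     maximum torque FES can elicit from fresh muscle (Nm)
    fatigue:          muscle fatigue level in [0, 1]; 1 = fully fatigued

    Voluntary effort is prioritised; FES fills the deficit up to a
    fatigue-scaled capacity; the robot covers the remainder.
    """
    deficit = max(required_torque - voluntary_torque, 0.0)
    fes = min(deficit, fes_capacity * (1.0 - fatigue))
    robot = deficit - fes
    return fes, robot

# 10 Nm needed, patient provides 4 Nm, FES can give up to 5 Nm but the
# muscle is 20% fatigued: FES contributes 4 Nm, the robot tops up 2 Nm.
fes, robot = allocate_assistance(10.0, 4.0, 5.0, 0.2)
```

As fatigue grows, the FES share shrinks automatically and the robot takes over, which is the qualitative behaviour the adaptive controller is designed to produce.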

Supported by: HONDA Research Institute, Engineering and Physical Sciences Research Council (EPSRC, grant reference EP/L016834/1) and The Alan Turing Institute

Paper 3

Marina Y. Aoyama, Joao Moura, Namiko Saito and Sethu Vijayakumar, Few-Shot Learning of Force-Based Motions From Demonstration Through Pre-training of Haptic Representation, Proc. IEEE International Conference on Robotics and Automation (ICRA 2024), Yokohama, Japan (2024). [pdf] [video] [citation]

Abstract

Learning to adapt motion to various physical properties of manipulated objects in contact-rich tasks requires extensive human demonstrations. Our semi-supervised Learning from Demonstration approach decouples the learned model into a haptic representation encoder and a motion generation decoder. This enables us to pre-train the first using a large amount of unsupervised data while using few-shot LfD to train the second, leveraging the benefits of learning skills from humans. We validate our approach in a wiping task with sponges of varying stiffness and surface friction, showcasing improved physical property recognition and motion generation for unseen objects on the KUKA iiwa robot arm.
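The decoupling described in the abstract, pre-training a haptic encoder on abundant unlabelled data and then fitting a motion decoder from only a few demonstrations, can be illustrated with a toy linear stand-in. The paper uses neural networks; the sketch below swaps in PCA-style components for the encoder and least squares for the decoder purely to make the two-stage structure concrete. All dimensions and data are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1: "pre-train" a haptic representation encoder on abundant
# unlabelled force/torque data (here: top principal directions).
unlabelled = rng.normal(size=(500, 12))       # hypothetical haptic signals
mean = unlabelled.mean(axis=0)
_, _, Vt = np.linalg.svd(unlabelled - mean, full_matrices=False)

def encode(x):
    """Frozen encoder: project haptics onto the top 3 components."""
    return (x - mean) @ Vt[:3].T

# Stage 2: few-shot LfD - fit a small motion decoder on a handful of
# demonstrations mapping encoded haptics to motion parameters.
demos_x = rng.normal(size=(5, 12))            # 5 demonstrated haptic inputs
demos_y = rng.normal(size=(5, 2))             # e.g. wiping speed and force
W, *_ = np.linalg.lstsq(encode(demos_x), demos_y, rcond=None)

# Generate motion parameters for a new (here: first demo) haptic reading.
motion = encode(demos_x[:1]) @ W
```

The point of the structure is that only the tiny decoder `W` needs labelled demonstrations; the representation it consumes was learned for free from unlabelled interaction data.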

Supported by: EU project HARMONY, Japan Science and Technology (JST) Moonshot and The Alan Turing Institute

Paper 4

Keyhan Kouhkiloui Babarahmati, Mohammadreza Kasaei, Michael Mistry and Sethu Vijayakumar, Robust and Dexterous Dual-arm Tele-Cooperation using Adaptable Impedance Control, Proc. IEEE International Conference on Robotics and Automation (ICRA 2024), Yokohama, Japan (2024). [pdf] [video] [citation]

Abstract

In recent years, the need for robots to transition from isolated industrial tasks to shared environments, including human-robot collaboration and teleoperation, has become increasingly evident. Building on the foundation of Fractal Impedance Control (FIC) introduced in our previous work, this paper presents a novel extension to dual-arm tele-cooperation, leveraging the non-linear stiffness and passivity of FIC to adapt to diverse cooperative scenarios. Unlike traditional impedance controllers, our approach ensures stability without relying on energy tanks, as demonstrated in our prior research. In this paper, we further extend the FIC framework to bimanual operations, allowing for stable and smooth switching between different dynamic tasks without gain tuning. We also introduce a telemanipulation architecture that offers higher transparency and dexterity, addressing the challenges of signal latency and low-bandwidth communication. Through extensive experiments, we validate the robustness of our method, and the results confirm the advantages of the FIC approach over traditional impedance controllers, showcasing its potential for applications in planetary exploration and other scenarios requiring dexterous telemanipulation. This paper's contributions include the seamless integration of FIC into multi-arm systems, the ability to perform robust interactions in highly variable environments, and the provision of a comprehensive comparison with competing approaches, thereby significantly enhancing the robustness and adaptability of robotic systems.
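A defining property the abstract leans on is that FIC's nonlinear stiffness keeps commanded effort bounded, which is how stability is maintained without energy tanks. The snippet below is a generic saturating-spring illustration of that bounded-effort idea only; it is loosely in the spirit of nonlinear bounded impedance and is not the published FIC control law.

```python
import numpy as np

def saturating_force(x, k=200.0, f_max=30.0):
    """Generic saturating nonlinear spring (illustrative only).

    Behaves like a linear stiffness k for small displacements x (m),
    but the commanded force smoothly saturates at f_max (N), so large
    tracking errors - e.g. under teleoperation latency - never produce
    unbounded effort.
    """
    return f_max * np.tanh(k * x / f_max)

f_small = saturating_force(0.001)   # ~ k * x in the linear region
f_large = saturating_force(1.0)     # capped near f_max
```

A conventional linear impedance controller would command 200 N at a 1 m error; here the command stays below 30 N regardless of displacement, which is the qualitative behaviour that makes bounded-stiffness controllers attractive for latency-prone telemanipulation.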

Supported by: EU project HARMONY and The Alan Turing Institute

Workshop  - Cooking Robotics: Perception and motion planning

The realisation of cooking robots presents numerous real-world challenges. This workshop explores the latest advancements in 'robots in cooking,' addressing hardware considerations, multimodal perception, motion planning and control, experimental methodologies, and benchmarking approaches. 

Amid labour shortages and increased hygiene concerns in the post-COVID world, automation in the food industry and personalised cooking assistants have become essential. This workshop will bridge the gap between academia, industry, and professionals by inviting leading researchers, collaborating with a competition, the Food Topping Challenge, and showcasing robot demonstrations.

Invited speakers (tentative)

Fumiya Iida (University of Cambridge, UK)
Tamim Asfour (Karlsruhe Institute of Technology, Germany)
Oliver Kroemer/Kevin Zhang (Carnegie Mellon University, United States)
Akihiko Yamaguchi (Tohoku University, Japan)
Dorsa Sadigh/Priya Sundaresan (Stanford University, United States)
Mako Miyatake (freelance chef, Japan)
Josie Hughes (EPFL, Switzerland)

Supported by: Sony AI, EU project HARMONY, Japan Science and Technology (JST) Moonshot and The Alan Turing Institute


Further Information

ICRA2024
Cooking Robots Workshop