Moonshot: Assistive Collaborative Manipulation

The University of Edinburgh and Japan Science and Technology Agency (JST) Collaborative Moonshot Project

Moonshot stacking boxes
The AIREC robot, from the Future Robotics Organisation at Waseda University, helping to organise and carefully stack packages without knowing their weight or contents.

Hierarchical Motion Planning Framework for Realising Long-horizon Tasks

Having robots collaborate alongside humans and assist them in daily-life activities, such as tidying up, is extremely challenging.

Robots need to reason about how to split such complex tasks into simpler subtasks, while being able to safely and robustly accomplish each individual subtask: for instance, picking up the target object, moving it to the front of the shelf, opening the shelf door, and finally placing the object inside the shelf. Robots also need to adapt their movements to unpredictable and dynamic changes, such as humans moving around, and to unexpected object properties, such as the object turning out to be heavier or softer than anticipated.

In this project, in collaboration with Waseda University in Japan and under the auspices of The Alan Turing Institute, we develop learning and motion planning methods for realising long-horizon tasks that are adaptable and generalisable to objects with different properties.

We use the example of stacking several boxes with different hidden properties, such as mass or even contents, which require the robot to change both its high-level (long-horizon) plan and the motion itself, for instance how fast it should move.

This project directly contributes to Goal 3 of the Moonshot programme: the realisation of AI robots that autonomously learn, adapt to their environment, evolve in intelligence, and act alongside human beings, by 2050.

For more information on Goal 3 of the Moonshot programme, see: Moonshot Programme Goal 3

Project Collaboration Timeline: First Phase (April 1, 2024 to November 30, 2025)
Moonshot: Assistive Collaborative Manipulation Project Team. Front: Marina Aoyama, Sachiya Fujita, Elle Miller, Sethu Vijayakumar, Joao Moura; back: Victor Leve, Suzanne Perry, Namiko Saito.

 

Please find below our results and the events organised as part of the Moonshot programme (in reverse chronological order):

Paper Presentation @ Humanoids 2024, Nancy

November 23, 2024
Contribution Type: Conference Paper Publication
Talos humanoids paper

The number of ways to climb a staircase depends on the perspective: infinite in a continuous context and finite in a discrete one, determined by step placement and sequence. We introduce NAS, an algorithm that integrates these aspects to compute all solutions for contact planning problems under standard assumptions. NAS is the first to deliver a globally optimal policy, enabling real-time humanoid footstep planning. Results on the Talos robot (in simulation and hardware) demonstrate that NAS reduces theoretical exponential complexity to a manageable bilinear form, supporting efficient GPU parallelisation while maintaining completeness.
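To illustrate the staircase analogy on the discrete side, here is a toy sketch (not the NAS algorithm itself; step sizes of 1 or 2 stairs are an assumption purely for illustration) that enumerates every climb of an n-stair staircase with memoised recursion:

```python
# Toy illustration of "all solutions" in a discrete setting (not the NAS algorithm itself):
# enumerate every way to climb an n-stair staircase taking 1 or 2 stairs per step.

from functools import lru_cache

def all_climbs(n: int) -> list[list[int]]:
    """Return every sequence of step sizes (1 or 2) that sums exactly to n."""
    @lru_cache(maxsize=None)
    def solutions(remaining: int) -> tuple[tuple[int, ...], ...]:
        if remaining == 0:
            return ((),)                 # one solution: take no more steps
        seqs = []
        for step in (1, 2):
            if step <= remaining:
                for tail in solutions(remaining - step):
                    seqs.append((step,) + tail)
        return tuple(seqs)
    return [list(s) for s in solutions(n)]

if __name__ == "__main__":
    print(len(all_climbs(10)))   # 89 distinct climbs for a 10-stair staircase
    print(all_climbs(4))         # [[1,1,1,1], [1,1,2], [1,2,1], [2,1,1], [2,2]]
```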

Jiayi Wang, Saeid Samadi, Hefan Wang, Pierre Fernbach, Olivier Stasse, Sethu Vijayakumar and Steve Tonneau, NAS: N-step computation of All Solutions to the footstep planning problem, Proc. IEEE-RAS 23rd International Conference on Humanoid Robots (Humanoids 2024), Nancy, France (2024). [pdf] [video] [citation]

 

Paper Presentation @ Humanoids 2024, Nancy

November 23, 2024
Poster presentation at Humanoids by Victor
Contribution Type: Conference Paper Publication

Humans can use their entire body surface to manipulate heavy, distant, or multiple objects simultaneously, but replicating this ability in robots—known as Whole-Body Contact-Rich Manipulation (WBCRM)—is highly complex. This is due to the challenge of managing numerous contact modes and enabling contact anywhere on the robot's surface, which complicates planning within a reasonable time. To address this, we reformulate planar WBCRM as hierarchical continuous optimization problems, using a novel explicit representation of the robot surface. This approach significantly improves convergence, planning efficiency, and feasibility.
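As a flavour of what an explicit surface representation enables, the sketch below (a simplified illustration with a hypothetical rectangular link and a single contact point, not the paper's formulation) parameterises the boundary of a planar link by one continuous variable and optimises the contact location along it, rather than enumerating discrete contact modes:

```python
# Simplified sketch: an explicit, continuous parameterisation of a planar link's surface
# lets a solver optimise *where* to make contact as a smooth variable instead of searching
# over discrete contact modes. (Hypothetical geometry; not the paper's model.)

import numpy as np
from scipy.optimize import minimize_scalar

LINK_LENGTH, LINK_WIDTH = 0.4, 0.06    # assumed rectangular link (metres)

def surface_point(s: float) -> np.ndarray:
    """Map s in [0, 1) to a point on the rectangle's boundary (counter-clockwise)."""
    perimeter = 2 * (LINK_LENGTH + LINK_WIDTH)
    d = (s % 1.0) * perimeter                       # arc length travelled
    if d < LINK_LENGTH:                             # bottom edge
        return np.array([d, 0.0])
    d -= LINK_LENGTH
    if d < LINK_WIDTH:                              # right edge
        return np.array([LINK_LENGTH, d])
    d -= LINK_WIDTH
    if d < LINK_LENGTH:                             # top edge
        return np.array([LINK_LENGTH - d, LINK_WIDTH])
    d -= LINK_LENGTH
    return np.array([0.0, LINK_WIDTH - d])          # left edge

def contact_cost(s: float, object_point: np.ndarray) -> float:
    """Distance between a candidate contact point on the link and a point on the object."""
    return float(np.linalg.norm(surface_point(s) - object_point))

# Assumed point on the object we want to touch; a real planner would also guard against
# local minima (e.g. multiple starts or a hierarchical scheme).
object_point = np.array([0.55, 0.03])
result = minimize_scalar(contact_cost, bounds=(0.0, 1.0), args=(object_point,),
                         method="bounded")
print("optimal surface parameter:", result.x, "contact point:", surface_point(result.x))
```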

 Victor Leve, Joao Moura, Namiko Saito, Steve Tonneau and Sethu Vijayakumar, Explicit Contact Optimization in Whole-Body Contact-Rich Manipulation, Proc. IEEE-RAS 23rd International Conference on Humanoid Robots (Humanoids 2024), Nancy, France (2024). [pdf] [video] [citation]

 

Paper Presentation @ CoRL 2024, Munich

Poster session at CoRL in Munich
November 8, 2024
Contribution Type: Conference Paper Publication

Manipulation without grasping, known as non-prehensile manipulation, is essential for dexterous robots in contact-rich environments, but presents many challenges relating to underactuation, hybrid dynamics, and frictional uncertainty. Additionally, object occlusions under contact uncertainty become a critical problem, which previous literature fails to address. We present a method for learning visuotactile state estimators and uncertainty-aware control policies for non-prehensile manipulation under occlusions, by leveraging diverse interaction data from privileged policies trained in simulation. Unlike prior non-prehensile research that relies on complex external perception set-ups, our method handles occlusions after sim-to-real transfer to robotic hardware with an onboard camera.
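The sketch below illustrates the general privileged teacher-student recipe that such an approach builds on (in PyTorch, with invented network sizes and variable names; it is not the paper's architecture): a teacher policy sees the full simulated state, while a student policy receiving only onboard camera and tactile features learns to imitate it:

```python
# Minimal privileged teacher-student sketch (assumed shapes/names; not the paper's code):
# a teacher trained with full simulator state supervises a student that only sees
# onboard (image + tactile) features, so the student copes with occlusions at deployment.

import torch
import torch.nn as nn

PRIV_STATE_DIM, OBS_DIM, ACTION_DIM = 32, 128, 3     # assumed dimensions

teacher = nn.Sequential(                  # has access to privileged simulator state
    nn.Linear(PRIV_STATE_DIM, 256), nn.ReLU(), nn.Linear(256, ACTION_DIM))

student = nn.Sequential(                  # only onboard camera + tactile features
    nn.Linear(OBS_DIM, 256), nn.ReLU(), nn.Linear(256, ACTION_DIM))

optimiser = torch.optim.Adam(student.parameters(), lr=1e-4)

def distillation_step(priv_state: torch.Tensor, onboard_obs: torch.Tensor) -> float:
    """One behaviour-cloning step: the student imitates the (frozen) teacher's action."""
    with torch.no_grad():
        target_action = teacher(priv_state)
    loss = nn.functional.mse_loss(student(onboard_obs), target_action)
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
    return loss.item()

# Example with random stand-in data for one batch of simulated interactions:
loss = distillation_step(torch.randn(64, PRIV_STATE_DIM), torch.randn(64, OBS_DIM))
print(f"distillation loss: {loss:.4f}")
```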

Juan Del Aguila Ferrandis, Joao Moura, and Sethu Vijayakumar, Learning Visuotactile Estimation and Control for Non-prehensile Manipulation under Occlusions, Proc. The Conference on Robot Learning (CoRL), Munich, Germany (2024). [pdf] [video] [citation]

 

Data Curation Workshop @ IROS 2024, Abu Dhabi

October 14, 2024
IROS workshop speakers
Contribution Type: Interactive Workshop and Seminar

We invited world-leading research scientists to a workshop at the International Conference on Intelligent Robots and Systems (IROS), in Abu Dhabi, UAE, to discuss collecting, managing, and utilising data through embodied robots.

In recent years, embodied robots have been playing an increasingly significant role in real-world scenarios such as daily tasks, healthcare, caregiving, and agriculture, interacting with humans and their environments. Nevertheless, the collection and application of sensorimotor data in real-world settings pose challenges, given the associated high costs, inherent noise, the need for tailored processing, and the potential inclusion of personal data. This workshop provided a platform to discuss efficient data collection, high-quality data assurance, sophisticated processing and utilisation, and ethical considerations.

Check our workshop website for more information: Data curation workshop

 

Paper Presentation @ IROS 2024, Abu Dhabi

IROS latent object characteristics recognition
October 2024
Contribution Type: Conference Paper Publication

Recognising the characteristics of objects while a robot handles them is crucial for adjusting its motions to ensure stable and efficient interaction with containers. Ahead of realising stable and efficient robot motions for handling and transferring containers, this work aims to recognise unobservable latent object characteristics. We propose a cross-modal transfer learning approach from vision to haptic-audio, which facilitates the representation of object characteristics from indirect sensor data, thereby improving recognition accuracy. For evaluation, we predict the shape, position, and orientation of an object inside a closed box.
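A rough sketch of the cross-modal transfer idea (in PyTorch, with invented encoders and dimensions; not the paper's actual model): a vision encoder is trained first, and a haptic-audio encoder is then trained to reproduce the vision embedding, so that the characteristics can be estimated from haptic-audio alone once the box is closed:

```python
# Sketch of vision -> haptic-audio cross-modal transfer (assumed sizes; not the paper's model):
# stage 1 trains a vision encoder on the recognition task; stage 2 aligns a haptic-audio
# encoder to the (frozen) vision embedding so recognition works without vision at test time.

import torch
import torch.nn as nn

EMBED_DIM, VISION_DIM, HAPTIC_AUDIO_DIM, CHAR_DIM = 64, 512, 96, 7    # assumed sizes

vision_encoder = nn.Sequential(nn.Linear(VISION_DIM, 256), nn.ReLU(), nn.Linear(256, EMBED_DIM))
ha_encoder = nn.Sequential(nn.Linear(HAPTIC_AUDIO_DIM, 256), nn.ReLU(), nn.Linear(256, EMBED_DIM))
char_head = nn.Linear(EMBED_DIM, CHAR_DIM)    # outputs characteristic estimates (e.g. shape/pose)

# --- Stage 1 (assumed): train vision_encoder + char_head on data where vision is available. ---

# --- Stage 2: freeze the vision encoder and align haptic-audio embeddings to it. ---
optimiser = torch.optim.Adam(ha_encoder.parameters(), lr=1e-4)

def alignment_step(vision_batch: torch.Tensor, haptic_audio_batch: torch.Tensor) -> float:
    with torch.no_grad():
        target_embedding = vision_encoder(vision_batch)        # frozen target
    loss = nn.functional.mse_loss(ha_encoder(haptic_audio_batch), target_embedding)
    optimiser.zero_grad(); loss.backward(); optimiser.step()
    return loss.item()

# At test time the box is closed, so only haptic-audio data is available:
with torch.no_grad():
    characteristics = char_head(ha_encoder(torch.randn(1, HAPTIC_AUDIO_DIM)))
print("estimated latent characteristics:", characteristics.squeeze(0).tolist())
```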

Namiko Saito, Joao Moura, Hiroki Uchida and Sethu Vijayakumar, Latent Object Characteristics Recognition with Visual to Haptic-Audio Cross-modal Transfer Learning, Proc. 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2024), Abu Dhabi, UAE (2024). [pdf] [video] [citation]

 

Paper Presentation @ ICRA40 2024, Rotterdam

concept framework of hierarchical motion planning
September 2024
Contribution Type: Conference Paper Publication

Research on long-horizon manipulation in environments with numerous objects and subtasks falls under the framework of task and motion planning (TAMP). We propose a hierarchical framework that combines deep neural networks (DNNs) for higher-level subgoal decisions with optimisation for lower-level motion control, to enhance robustness, scalability, and generalisability. We evaluate this on a latent-state box transport and stacking task, where the robot needs to change the order of its actions and the speed of control during motion execution.
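The skeleton below sketches the two levels of such a hierarchy (all networks, shapes, and dynamics are placeholders, not the paper's implementation): a learned model proposes the next subgoal, and a low-level trajectory optimisation computes controls that drive a toy system towards it:

```python
# Skeleton of a two-level hierarchy (placeholder models; not the paper's implementation):
# a learned high-level model proposes the next subgoal, and a low-level optimiser
# computes a short control sequence that drives a simple system towards that subgoal.

import numpy as np
import torch
import torch.nn as nn
from scipy.optimize import minimize

OBS_DIM, SUBGOAL_DIM, HORIZON, CONTROL_DIM = 32, 2, 10, 2    # assumed

subgoal_net = nn.Sequential(   # high level: observation -> next subgoal (e.g. a box target pose)
    nn.Linear(OBS_DIM, 128), nn.ReLU(), nn.Linear(128, SUBGOAL_DIM))

def rollout(state: np.ndarray, controls: np.ndarray) -> np.ndarray:
    """Toy single-integrator dynamics standing in for the real robot/object model."""
    for u in controls.reshape(HORIZON, CONTROL_DIM):
        state = state + 0.1 * u
    return state

def low_level_plan(state: np.ndarray, subgoal: np.ndarray) -> np.ndarray:
    """Low level: optimise a control sequence that reaches the subgoal with little effort."""
    def cost(u_flat: np.ndarray) -> float:
        terminal = rollout(state, u_flat)
        return float(np.sum((terminal - subgoal) ** 2) + 1e-3 * np.sum(u_flat ** 2))
    result = minimize(cost, np.zeros(HORIZON * CONTROL_DIM), method="L-BFGS-B")
    return result.x.reshape(HORIZON, CONTROL_DIM)

# One planning cycle with stand-in data:
observation = torch.randn(1, OBS_DIM)
with torch.no_grad():
    subgoal = subgoal_net(observation).squeeze(0).numpy()
controls = low_level_plan(np.zeros(SUBGOAL_DIM), subgoal)
print("first control of the optimised sequence:", controls[0])
```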

Namiko Saito, Joao Moura, and Sethu Vijayakumar, Long-horizon Manipulation through Hierarchical Motion Planning with Subgoal Prediction, 40th Anniversary of the IEEE Conference on Robotics and Automation (ICRA@40), Rotterdam, Netherlands (2024). [pdf] [video] [citation]

 

Paper Presentation @ ICRA40 2024, Rotterdam

Diagram of a Shared-control framework for contact-rich manipulation
September 2024
Contribution Type: Conference Paper Publication

Shared autonomy offers a paradigm for solving the problem of tele-operating complex robotic systems. We propose a framework for combining human operator input with autonomous reasoning for the remote handling of contact-rich manipulation tasks. It consists of an optimal control (OC) formalism that incorporates models of hybrid contact dynamics, compliant interaction, and operator intention, as a means of expanding current robotic manipulation capabilities whilst ensuring safe and stable task execution.
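As a very rough illustration of arbitrating between operator and autonomous commands (a hedged sketch with made-up gains and limits; the paper instead solves a full optimal control problem over hybrid contact dynamics and operator intention):

```python
# Hedged illustration of shared-autonomy blending (made-up quantities; the paper's method
# is an optimal control formulation, not this heuristic): arbitrate between operator and
# autonomous commands, then attenuate motion that would exceed a contact-force limit.

import numpy as np

FORCE_LIMIT = 15.0     # assumed maximum allowed contact force (N)
STIFFNESS = 400.0      # assumed contact stiffness used to predict force (N/m)

def blend_commands(u_operator: np.ndarray, u_autonomous: np.ndarray,
                   operator_confidence: float, penetration: float) -> np.ndarray:
    """Convex blend weighted by operator confidence, scaled to keep the predicted force safe."""
    alpha = np.clip(operator_confidence, 0.0, 1.0)
    u = alpha * u_operator + (1.0 - alpha) * u_autonomous
    predicted_force = STIFFNESS * max(penetration, 0.0)
    if predicted_force > FORCE_LIMIT:              # attenuate motion into the contact
        u *= FORCE_LIMIT / predicted_force
    return u

print(blend_commands(np.array([0.05, 0.0]), np.array([0.02, 0.01]),
                     operator_confidence=0.7, penetration=0.05))
```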

Joao Moura, Theodoros Stouraitis, Namiko Saito, and Sethu Vijayakumar, Optimal Shared Autonomy for Contact-rich Robotic Manipulation, 40th Anniversary of the IEEE Conference on Robotics and Automation (ICRA@40), Rotterdam, Netherlands (2024). [pdf] [citation]

 

AI in Healthcare Workshop, Edinburgh

Triadic collaboration between human, robot, and exoskeleton
May 30, 2024
Contribution Type: Seminar and Live Demonstrations

AI and embodied robotics solutions are playing an increasingly significant role in addressing some of the most pressing grand challenges in healthcare and assisted living in our society. For successful, safe and effective deployment of these fast-changing technologies, it is important that we work closely and collaboratively with different stakeholders.

In this workshop, we brought together end-users from care homes, hospitals and rehabilitation centres; world-leading researchers from academia, including European and international consortia; startup companies and established industry players in this space; as well as policymakers and researchers who look at the ethics and governance structures that inform deployment strategies.

Check our workshop website for more information: AI in Healthcare Workshop

Please check out the detailed news item for some pictures of the event.

 

ATR International Mini-Symposium on Robot Learning, Kyoto

Visit to ATR in Kyoto
May 20, 2024
Contribution Type: Symposium and Presentations

As robot learning approaches become ever more complex and sophisticated, we co-organised a mini-symposium in collaboration with the Advanced Telecommunications Research Institute International (ATR), in Kyoto, to discuss challenges and the latest developments in robot learning. Our host was one of the leading human motor control researchers in the world, Prof. Mitsuo Kawato of ATR.

In this symposium, we had the chance to hear from many robot learning researchers at ATR as well as renowned international researchers, such as Prof. Jan Peters (TUD, Germany) and Prof. Ales Ude (Jožef Stefan Institute, Slovenia). We also presented some of the work we have been developing in the context of the Moonshot programme, with presentations from our UoE SLMC group, including Prof. Sethu Vijayakumar, Dr. Namiko Saito, Dr. Joao Moura, and Marina Aoyama, among others.

Please check out the detailed news item.

 

Cooking Robotics Workshop @ ICRA 2024, Yokohama

AIREC robot cooking scrambled eggs
May 17, 2024
Contribution Type: Interactive Workshop and Seminar

We invited world-leading research scientists and practitioners to a workshop at the International Conference on Robotics and Automation (ICRA), in Yokohama, Japan, to explore the new frontiers of “robots in cooking”! The workshop addressed various scientific research questions, including hardware, multimodal perception, motion planning and control, experimental methodologies, and benchmarking approaches. It also offered networking opportunities through a poster session from young researchers and exciting live robotic demos from Waseda University and Kawada Robotics Corporation!

Check our workshop website for more information: Cooking Robotics Workshop

 

Paper Presentation @ ICRA 2024, Yokohama

Human demonstrating motion to robot
May 16, 2024
Contribution Type: Conference Paper Publication

Existing Learning from Demonstration (LfD) approaches require a large number of human demonstrations to enable robots to generalise to unseen objects. In this work, we propose a semi-supervised LfD approach that decouples the learnt model into a haptic representation encoder and a motion generation decoder. This enables us to pre-train the former on large amounts of easily accessible unlabelled data, while using few-shot LfD to train the latter, leveraging the benefits of learning skills from humans. We validate our approach on a wiping task, demonstrating improved performance for sponges with unseen stiffness and surface friction.
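A minimal sketch of this decoupling (in PyTorch, with invented shapes; not the paper's exact networks): a haptic autoencoder is pre-trained on unlabelled interaction data, and its frozen encoder then feeds a motion decoder trained from only a few demonstrations:

```python
# Minimal sketch of the decoupled LfD idea (assumed shapes; not the paper's exact networks):
# stage 1 pre-trains a haptic representation on plentiful unlabelled data,
# stage 2 trains a motion decoder on a handful of human demonstrations.

import torch
import torch.nn as nn

HAPTIC_DIM, LATENT_DIM, MOTION_DIM = 60, 16, 7     # assumed dimensions

haptic_encoder = nn.Sequential(nn.Linear(HAPTIC_DIM, 64), nn.ReLU(), nn.Linear(64, LATENT_DIM))
haptic_decoder = nn.Sequential(nn.Linear(LATENT_DIM, 64), nn.ReLU(), nn.Linear(64, HAPTIC_DIM))
motion_decoder = nn.Sequential(nn.Linear(LATENT_DIM, 64), nn.ReLU(), nn.Linear(64, MOTION_DIM))

# --- Stage 1: unsupervised pre-training of the haptic representation (autoencoding). ---
pretrain_opt = torch.optim.Adam(list(haptic_encoder.parameters())
                                + list(haptic_decoder.parameters()), lr=1e-3)
unlabelled_haptics = torch.randn(1024, HAPTIC_DIM)            # stand-in unlabelled data
recon = haptic_decoder(haptic_encoder(unlabelled_haptics))
loss = nn.functional.mse_loss(recon, unlabelled_haptics)
pretrain_opt.zero_grad(); loss.backward(); pretrain_opt.step()

# --- Stage 2: few-shot LfD, training only the motion decoder on a few demonstrations. ---
for p in haptic_encoder.parameters():
    p.requires_grad_(False)                                    # freeze the pre-trained encoder
lfd_opt = torch.optim.Adam(motion_decoder.parameters(), lr=1e-3)
demo_haptics, demo_motions = torch.randn(10, HAPTIC_DIM), torch.randn(10, MOTION_DIM)
pred_motion = motion_decoder(haptic_encoder(demo_haptics))
lfd_loss = nn.functional.mse_loss(pred_motion, demo_motions)
lfd_opt.zero_grad(); lfd_loss.backward(); lfd_opt.step()
print(f"pre-training loss {loss.item():.3f}, few-shot LfD loss {lfd_loss.item():.3f}")
```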

Marina Y. Aoyama, Joao Moura, Namiko Saito and Sethu Vijayakumar, Few-Shot Learning of Force-Based Motions From Demonstration Through Pre-training of Haptic Representation, Proc. IEEE International Conference on Robotics and Automation (ICRA 2024), Yokohama, Japan (2024). [pdf] [video] [citation]

 

Meet the Robots @ Edinburgh International Science Festival (EISF)

Children tele-operating a robot to push a box
April 12, 2024
Contribution Type: Outreach Event

We opened our laboratory to the public to showcase our cutting-edge research platforms to inform and engage visitors about the best ways to deliver human-centric assistance and effective human-robot collaborations.

Visitors, both adults and children, had the opportunity to learn more about the world of robotics through an interactive tour that included demonstrations of humanoid robots, exoskeletons, a showcase of the factory of the future, and finally a chance to directly teleoperate the robots themselves.

Please check out the detailed news item.