Nextage Interactive Collision-Free Bi-Manual Manipulation
Joint Industry Project (JIP) between Kawada Robotics and the University of Edinburgh
- Video: Reconfigurable Smart Factory project with Nextage Open from Kawada Robotics
The SLMC group at the University of Edinburgh and Kawada Robotics are committed to enhancing human-robot interaction (HRI) through cutting-edge research in hardware, perception, motion planning and control. This new scientific collaboration, established through a Kawada Joint Industry Project (JIP) and associated externally funded projects (e.g. EU H2020), aims to develop enhanced, robust capabilities on the Nextage Research Platform. We focus on realising an obstacle-aware, safe bimanual robot system that can react to dynamic, unseen environments and perform tasks accurately and robustly with compliant real-time control.
The project is coordinated by Professor Sethu Vijayakumar and Dr. Vladimir Ivan on the Edinburgh side, and by Yuichiro Kawasumi and Victor Leve at Kawada Robotics in Japan. Matt Timmons-Brown and Dr. Christian Rauch are the researchers contributing to the project at UoE.
I am delighted to collaborate with one of the leading robotics companies in Japan to advance cutting-edge, yet extremely practical capabilities on the Nextage platform. We will leverage the world-leading research at the Edinburgh Centre for Robotics in dynamic motion planning and real-time control to realise real-world translation. I expect this work to expand the domain of application of such bimanual platforms beyond restrictive, machine-only environments to allow human-centric applications in complex social environments such as shopping malls, hospital and healthcare settings, and in-home assistive technologies.
We are honoured to be collaborating with Professor Sethu Vijayakumar from the University of Edinburgh, a leading authority in robotics research.
We, Kawada Robotics, have been providing humanoid robotics research platforms to leading research institutions for nearly 20 years. We look forward to continuing to work with The University of Edinburgh, one of the world's premier research and education institutions in the domain of intelligent robotics, to make humanoid robots useful to people around the world.
Within the Kawada JIP, we develop and integrate software and hardware functionality that allows the robot to interact with people safely and robustly. The aim is to endow bi-manual robots with complex behaviours that make them interactive, while performing safe, collision-free motion in previously unseen environments using real-time perception and reactive re-planning. To achieve these objectives, our group is addressing the following two main challenges:
- Tracking of Moving Objects in Dense Depth Data, with a specific focus on fusing dense point cloud data of moving objects. The challenge is to accurately distinguish moving objects from stationary ones without knowing beforehand how these objects look or how they move (a minimal sketch of this idea follows the list).
- Safe Collision-free Motion Planning, which then uses the static environment as well as the moving objects to calculate a safe motion around them. The challenge is in using the dense data without assuming any specific shape for the objects, and in avoiding collisions as the objects move through the robot's workspace.
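To make the first challenge concrete, the sketch below separates moving points from static ones by aligning two consecutive depth frames with ICP and thresholding the per-point residuals. It is a minimal baseline using the open-source Open3D library, not the project's actual segmentation method; the file names and the 2 cm threshold are placeholders for illustration.

```python
import numpy as np
import open3d as o3d

# Two consecutive depth-camera point clouds (file names are placeholders).
prev_cloud = o3d.io.read_point_cloud("frame_000.pcd")
curr_cloud = o3d.io.read_point_cloud("frame_001.pcd")

# Align the clouds with point-to-point ICP; in a mostly static scene the
# estimated transform captures sensor ego-motion, so static structure lines up.
icp = o3d.pipelines.registration.registration_icp(
    curr_cloud, prev_cloud, max_correspondence_distance=0.05,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
curr_cloud.transform(icp.transformation)

# Points that still sit far from the previous frame after alignment are
# candidates for having moved; the fixed 2 cm threshold is an assumption.
residuals = np.asarray(curr_cloud.compute_point_cloud_distance(prev_cloud))
moving_mask = residuals > 0.02

moving = curr_cloud.select_by_index(np.where(moving_mask)[0])
static = curr_cloud.select_by_index(np.where(~moving_mask)[0])
print(f"{len(moving.points)} moving points, {len(static.points)} static points")
```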
In this project, we use multiple Kawada humanoid upper-torso robots called Nextage. Each platform weighs 150 kg and features 15 degrees of freedom in the upper body.
Hardware and Interface
We have modified the hardware of our Nextage robots to suit our research needs. Our work has involved custom mounting, electrical interfacing and programming drivers to add a variety of RGBD cameras, grippers and sensors to the robots. We evaluated the Intel RealSense and Microsoft Azure Kinect depth cameras. These cameras provide rich depth information about the Nextage's environment. We have also mounted an Ossur Robolimb prosthetic hand and a Schunk electric gripper on the arms of the robot to enable different object-grasping techniques.
In close collaboration with Kawada Robotics, we developed a lean real-time control interface to expose the Nextage robot to the ROS ecosystem. This enables us to exploit traditional control methods to track trajectories, as well as to develop compliant controllers, whole-body controllers, and model predictive controllers. These advanced control methods will allow us to implement safer and more interactive robot behaviours.
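To illustrate how such a ROS interface is typically exercised, the sketch below sends a joint-space trajectory to a standard `FollowJointTrajectory` action server. The action namespace, joint names (taken from the publicly available Nextage ROS packages) and waypoint values are assumptions; the actual interface developed in the project may differ.

```python
#!/usr/bin/env python
import rospy
import actionlib
from control_msgs.msg import FollowJointTrajectoryAction, FollowJointTrajectoryGoal
from trajectory_msgs.msg import JointTrajectoryPoint

rospy.init_node("nextage_trajectory_demo")

# The action namespace is an assumption; it depends on the controller config.
client = actionlib.SimpleActionClient(
    "/larm_controller/follow_joint_trajectory", FollowJointTrajectoryAction)
client.wait_for_server()

goal = FollowJointTrajectoryGoal()
# Left-arm joint names as used in the open-source Nextage ROS packages.
goal.trajectory.joint_names = [
    "LARM_JOINT0", "LARM_JOINT1", "LARM_JOINT2",
    "LARM_JOINT3", "LARM_JOINT4", "LARM_JOINT5"]

# Two illustrative waypoints: a neutral pose, then a slightly raised elbow.
for t, positions in [(2.0, [0.0, 0.0, -1.7, 0.0, 0.0, 0.0]),
                     (4.0, [0.3, -0.2, -1.5, 0.2, 0.1, 0.0])]:
    point = JointTrajectoryPoint()
    point.positions = positions
    point.time_from_start = rospy.Duration(t)
    goal.trajectory.points.append(point)

client.send_goal(goal)
client.wait_for_result()
```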
We have also integrated the robot model into a ROS simulation pipeline using the Gazebo simulator and tuned the simulator to reduce the sim-to-real gap. This allows rapid development on the platform. Additionally, we use the EXOTica framework and other in-house software to implement collision avoidance, both for planning and for real-time control.
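EXOTica problems are typically specified in XML and driven from Python via the `pyexotica` bindings. The minimal sketch below follows the pattern from the EXOTica documentation; the configuration file name is illustrative (a Nextage-specific setup would supply its own XML), and the trajectory publishing step assumes a time-indexed problem and a running ROS master.

```python
import pyexotica as exo
from pyexotica.publish_trajectory import publish_trajectory

# Initialise ROS so the resulting trajectory can be visualised in RViz.
exo.Setup.init_ros()

# Load a solver and its planning problem from an XML configuration.
# The file name here is illustrative; a Nextage setup would replace it.
solver = exo.Setup.load_solver(
    '{exotica_examples}/resources/configs/example_manipulate.xml')
problem = solver.get_problem()

# Solving returns a trajectory: one row of joint positions per time step.
solution = solver.solve()
print('Trajectory shape:', solution.shape)

# Replay the trajectory in RViz (assumes a time-indexed problem).
publish_trajectory(solution, problem.T * problem.tau, problem)
```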
Project Demonstration Plans
Demonstrations of one of the project's outputs (Milestone 1) will be shown at the Edinburgh International Science Festival 2021 [in the event of postponement due to COVID, we will participate with a virtual, video-based contribution]. This interactive demo will showcase the robot performing a pick-and-place task in an environment that visitors set up for the robot. The robot will use its sensors to find the object to pick, identify the dynamic target location, and avoid obstacles placed in its way while executing the motion.
Milestone 1: Collision-free Pick-and-Place in Unseen Static Environments
In this part of the project, the Nextage robot uses its Schunk electric gripper to move a target box from one location to another in an unseen environment. Before the demo, visitors can rearrange the environment in front of the robot, including moving the target box and any obstacles around it. The robot then uses its RGBD perception pipeline to recognise the targets and build an occupancy map of the obstacles. This occupancy map is then used in the motion planning of the grasping sequence to ensure collision-free trajectories as the target box is picked up and moved to its desired location. This demo showcases depth-cloud sensing with OctoMap, and safe planning and control using EXOTica and MoveIt.
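As a rough illustration of this pipeline from the MoveIt side, the sketch below registers a perceived obstacle with the planning scene and plans a collision-free motion to a pre-grasp pose using `moveit_commander`. The move group name, frame and poses are placeholders; the real demo builds its obstacle representation from the OctoMap produced by the perception pipeline rather than from hand-placed boxes.

```python
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import PoseStamped

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("collision_free_pick_demo")

scene = moveit_commander.PlanningSceneInterface()
# "left_arm" is a placeholder; the actual move group name depends on the
# robot's MoveIt configuration.
group = moveit_commander.MoveGroupCommander("left_arm")

# Register a perceived obstacle so the planner routes trajectories around it.
obstacle = PoseStamped()
obstacle.header.frame_id = group.get_planning_frame()
obstacle.pose.position.x = 0.5
obstacle.pose.position.z = 0.15
obstacle.pose.orientation.w = 1.0
scene.add_box("obstacle", obstacle, size=(0.2, 0.2, 0.3))
rospy.sleep(1.0)  # give the planning scene time to update

# Plan and execute a collision-free motion to a (placeholder) pre-grasp pose.
target = PoseStamped()
target.header.frame_id = group.get_planning_frame()
target.pose.position.x = 0.4
target.pose.position.y = 0.2
target.pose.position.z = 0.3
target.pose.orientation.w = 1.0
group.set_pose_target(target)
group.go(wait=True)
group.stop()
group.clear_pose_targets()
```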
Milestone 2: Collision-free Pick-and-Place in Unseen Moving Environments
In the next phase of the project, we are using the NextageA robot with a Schunk electric gripper to pick a moving box from a conveyor belt and place it inside a moving shelf. The robot will use its RGB-D perception pipeline to recognise the moving box and use movement-based object segmentation methods (such as RigidFusion [REF]) to identify objects of interest. Detecting and tracking the box and shelf does not require a model in advance. It will then use the EXOTica motion planning library, along with HDRM methods [REF], to calculate a collision-free trajectory and grasping sequence for the moving target, based on its predicted future location. This demo showcases our latest computer vision research with RigidFusion and the robustness of our collision-avoidance motion planning work with HDRM.
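Grasping a moving target requires predicting where it will be when the gripper arrives. A simple baseline, sketched below under the assumption of roughly constant conveyor velocity, fits a velocity to recent tracked positions and extrapolates to the expected grasp time; the project's actual predictor may be more sophisticated, and all values here are illustrative.

```python
import numpy as np

def predict_target_position(times, positions, grasp_time):
    """Extrapolate a tracked object's position to grasp_time, assuming
    approximately constant velocity (reasonable for a conveyor belt).

    times:     (N,)   timestamps of recent track observations [s]
    positions: (N, 3) tracked xyz positions [m]
    """
    times = np.asarray(times)
    positions = np.asarray(positions)
    # Least-squares linear fit per axis: p(t) = p0 + v * t
    A = np.stack([np.ones_like(times), times], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, positions, rcond=None)
    p0, v = coeffs[0], coeffs[1]
    return p0 + v * grasp_time

# Example: a box tracked at 10 Hz, moving along the belt at ~0.1 m/s.
t = np.arange(0.0, 1.0, 0.1)
p = np.stack([0.1 * t + 0.4, np.full_like(t, 0.2), np.full_like(t, 0.1)], axis=1)
print(predict_target_position(t, p, grasp_time=2.5))  # ~[0.65, 0.2, 0.1]
```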
The capability will also be developed further in the context of the EU H2020 project HARMONY: Enhancing Healthcare with Assistive Robotic Mobile Manipulation (2021-2024). The University of Edinburgh's role is to realise robust, flexible and adaptive dual-arm manipulation and loco-manipulation on platforms such as the Nextage robot, with a focus on compliant, real-time control in challenging healthcare-centric environments such as hospitals.
Further Information
- EXOTica motion planning framework