Insight is investigating the challenging problem of dexterous robotic manipulation, aiming to develop a model-based reinforcement learning framework that fuses vision and touch sensing to improve the rate and reliability of robotic grasping and manipulation. The project proceeds along two initially parallel paths that will later merge.
The first extends existing model-based frameworks in simulation to robotic manipulation tasks, while the second develops the robotic hardware, including vision and touch sensors, on which the developed algorithms will later be deployed.
On the first path, several avenues have been explored, including adapting the World Models concept by replacing its computer-game tasks (car racing and Doom) with a robotic simulator (CoppeliaSim). This early work showed that the variational autoencoder (VAE) used in the World Models solution cannot capture the relevant objects in the visual scene well enough to be useful for a manipulation task.

In associated work applying model-free reinforcement learning to automatic robotic manipulation, the group has twice participated in the Real Robot Challenge competition (https://real-robot-challenge.com/), winning Phase 1 in 2021 and winning the competition outright in 2022. The novel method used in the 2022 competition, a teacher-student approach that transfers skills from a less skilled agent to accelerate the learning of a second agent on a more difficult task, has been extended and published in Expert Systems.
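To illustrate the general idea of teacher-student skill transfer, the sketch below shows one minimal way such a scheme could be set up: a frozen teacher policy, assumed to be already trained on a simpler task, supervises a student policy via an auxiliary imitation loss that would be combined with the student's own reinforcement learning objective on the harder task. The network sizes, state/action dimensions, loss function, and variable names here are illustrative assumptions and do not reproduce the published method.

```python
# Generic teacher-student transfer sketch (illustrative assumptions only).
import torch
import torch.nn as nn
import torch.nn.functional as F

STATE_DIM, ACTION_DIM = 24, 9   # hypothetical state/action sizes

def make_policy():
    # Small MLP mapping a state to an action (simplified deterministic policy).
    return nn.Sequential(nn.Linear(STATE_DIM, 128), nn.ReLU(),
                         nn.Linear(128, ACTION_DIM))

teacher = make_policy()           # assumed already trained on the easier task
teacher.requires_grad_(False)     # frozen: only provides target actions
student = make_policy()           # to be trained on the more difficult task
optimiser = torch.optim.Adam(student.parameters(), lr=3e-4)

def distillation_loss(states):
    # The student imitates the teacher's actions on the given states.
    with torch.no_grad():
        target_actions = teacher(states)
    return F.mse_loss(student(states), target_actions)

# In a full training loop this auxiliary loss would be added to the student's
# reinforcement learning loss on the harder task, e.g.
#   total_loss = rl_loss + beta * distillation_loss(batch_states)
states = torch.randn(64, STATE_DIM)   # placeholder batch of observed states
loss = distillation_loss(states)
loss.backward()
optimiser.step()
print(f"distillation loss: {loss.item():.4f}")
```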
On the second path, a Universal Robots UR5e six-axis robotic arm has been acquired and installed at UCD. When both the simulation work of the first path and the hardware capabilities of the second reach sufficient maturity, the developed algorithms will be deployed on the robotic hardware for testing.
Investigators: Stephen Redmond (UCD), Noel O’Connor (DCU), Kevin McGuinness (DCU)