In October 2025, we had the pleasure of visiting the Human Robotics Group led by Prof. Etienne Burdet at Imperial College London, part of the Department of Bioengineering. This visit brought together early-career researchers from the R4N network, who had the unique opportunity to explore a cutting-edge, integrative approach that combines neuroscience and robotics to study human sensorimotor control.
The lab’s innovative work aims to develop efficient assistive devices and training systems that advance the field of neurorehabilitation. Their work ranges from the development of supplementary robotic limbs and assistive technologies in the clinic to their most recent studies on haptic communication and sensorimotor control. 
Figure 1
During our visit, Prof. Etienne Burdet, Dr. Ivanova and their amazing team gave us a warm welcome and an inspiring tour through their ongoing projects at the Human Robotics Lab.
The lab’s goal is to pioneer the science and technology of sensorimotor and cognitive interfaces, with a strong focus on translating this research into technologies that can enhance daily living. One particularly fascinating example came from the work on rehabilitation in stroke patients.
In this stroke project, the researchers aim to combine robotics and Functional Electrical Stimulation (FES) to assist upper-limb movement recovery. FES activates the peripheral motor nerves that control specific muscle groups, and each stimulated contraction sends sensory and proprioceptive signals back to the brain, helping reorganize neural pathways through neuroplasticity (Figure 2). In contrast, robotic devices generate the movement externally, allowing the limb to move even when the patient has little or no voluntary control.

Figure 2
The team also shared insights into the advantages and challenges of using robotics and FES in stroke neurorehabilitation. On the one hand, robotics can lead to improved motor function, even in chronic conditions, but may sometimes reduce the patient’s own motor effort. On the other hand, FES increases neural feedback and promotes motor improvements, though it is less effective in chronic stages.
Another device the lab developed for rehabilitation is the GripAble (https://gripable.co), one of the very few stroke-neurorehabilitation devices that can be used anywhere from the bedside to the home. It connects a digital hand-grip interface to a tablet, measuring the force and acceleration of flexion and extension movements. Using this force, the patient can control on-screen actions – for example, moving a bird up and down in a game where the aim is to catch objects. This system turns rehabilitation sessions into engaging and interactive experiences, allowing patients and therapists to work and play together. See Figure 3.
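To give a flavour of how such a force-to-screen mapping can work, here is a minimal sketch in Python. The GripAble software itself is proprietary and not described in detail here, so the function name, maximum-force value, and screen size below are all invented for illustration:

```python
def grip_to_height(force_n: float, max_force_n: float = 50.0,
                   screen_height_px: int = 600) -> int:
    """Map a measured grip force (newtons) to a vertical screen
    position in pixels: squeezing harder moves the bird higher.

    The force is clamped into [0, max_force_n] so that noisy
    sensor readings cannot push the bird off-screen.
    """
    clamped = max(0.0, min(force_n, max_force_n))
    # 0 N maps to the bottom of the screen, full force to the top
    return int(screen_height_px * (1.0 - clamped / max_force_n))

# Half of the maximum grip force puts the bird at mid-screen
print(grip_to_height(25.0))  # 300
```

In a real game loop this mapping would run many times per second, with the sensor reading replacing the hard-coded force value.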

Figure 3
In addition to the GripAble system, the team also introduced us to two more innovative technologies designed to advance neurorehabilitation:
1. The Spinal Reflex Robot for Pain Management, a fascinating tool that explores how robotic assistance can help modulate spinal reflexes to relieve pain (left in Figure 4).
2. The force-sensitive Myro workbench for post-stroke rehabilitation, where patients engage in interactive games on a touch screen that mimic everyday movements. One exercise involves a key-turning motion, while another focuses on grasping movements. To complete these tasks, patients use specially designed tools that help strengthen fine motor skills in a fun and functional way (right in Figure 4). The system was initially developed at HRG and commercialised by Tyromotion.
These demonstrations offered a vivid look into how robotics technology and neuroscience come together to support recovery and improve quality of life for people undergoing neurorehabilitation.

Figure 4
After a well-deserved break and some friendly conversations, we split into two groups to continue the lab tour and demonstrations.
Our next stop was a new lab space, where some of the PhD students introduced us to their fascinating project on human–human interaction. This study brings together robotics, eye-tracking technology and electromyography (EMG) to explore how people understand each other’s movements.
In the experiment, two participants sit opposite each other, each with a screen in front of them (Figure 5). Their task is to follow a ball moving horizontally across the screen using a robotic arm attached to their hands. Sensors attached to the participants’ arms detect the wrist movement. In different experimental conditions, the robotic arms either haptically support the participants’ movements or provide a haptic link between the participants, so that they can feel each other’s motion. By measuring how the participants respond both to the robot arm’s forces and to each other’s movements, the team is investigating communication and movement prediction.
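One common way to realise such a haptic link between two robotic handles is a virtual spring: each robot pushes its handle toward the partner’s position with a force proportional to the difference between the two positions. The sketch below illustrates the idea only; the lab’s actual controller, gains, and signal processing are not described here, and the stiffness value is an arbitrary assumption:

```python
def haptic_link_forces(pos_a: float, pos_b: float,
                       stiffness: float = 100.0) -> tuple[float, float]:
    """Virtual spring connecting two handles at positions pos_a and
    pos_b (metres). Each partner feels a force (newtons) pulling
    them toward the other; the two forces are equal and opposite.
    """
    force_on_a = stiffness * (pos_b - pos_a)  # A is pulled toward B
    force_on_b = -force_on_a                  # and B toward A
    return force_on_a, force_on_b

# If A lags 2 cm behind B, A is pulled forward and B pulled back
fa, fb = haptic_link_forces(0.00, 0.02)
print(fa, fb)
```

Setting the stiffness to zero decouples the partners, which is how a “solo tracking” condition could be produced with the same hardware.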
One of the team’s long-term goals is to apply this research to neurodivergent populations, investigating communication and movement prediction in dyads where, for example, one partner is neurodivergent and the other is not.

Figure 5
To wrap up our visit, we had the chance to experience one of the most captivating demonstrations of the day: the Dr. Octopus project by The MUVE (right in Figure 6, see https://www.imperial.ac.uk/human-robotics/dr-octopus-/).
True to its name, Dr. Octopus takes inspiration from the iconic Spider-Man character (left in Figure 6), featuring robotic limbs that extend and move in coordination with the user. This remarkable robot can track the motion of a person’s arms and body in a 4D space, creating a seamless interaction between human and machine in real time.
We watched two impressive demonstrations in action:
1. Human–Robot Synchrony, where the robotic limbs react and adapt to a person’s movements in real time. Participants would lean to the left or right while wearing the Dr. Octopus device, and after a few moments the two robotic arms would move accordingly to help maintain balance.
2. The Reaching Task, exploring whether the robot’s arm can compensate for the human arm during a goal-directed movement, effectively supporting and enhancing natural motion. In this task, participants stood in front of a pillar with a lower and a higher target. The robot arm would reach out to the target while adjusting its position based on a camera mounted in the arm.
Trying out Dr. Octopus was a truly unforgettable experience. It was like a step into a future where robotic assistance could extend our capabilities and redefine the boundaries of human movement. The visit wrapped up on a high note, with a group of enthusiastic early-career researchers keen to try out the incredible Dr. Octopus themselves!

Figure 6
This lab visit was a perfect mix of fun and curiosity, the kind of experience that reminds us why exploring science together is so inspiring. We left the Human Robotics Group not only with new insights into the fascinating intersection of neuroscience and robotics, but also with a sense of motivation and inspiration as future researchers.
Thank you to everyone who joined us for this visit, and to Prof. Etienne Burdet, Dr. Ekaterina Ivanova, and their team in the Human Robotics Group for their warm welcome and passion for innovation.
~ written by Ottavia Ollari