Could a robotic scrub nurse be assisting our surgical procedures in the future?

Assistive robotic devices (ARD), machines controlled by a person to help carry out a task, are increasingly being explored for their potential to help deliver healthcare.

In 2019, the UK government launched a five-year research programme dedicated to making autonomous systems (such as robots to support older people at home) safe for public use. The appetite for advancing healthcare with robotics is driven by the multiple benefits these devices can offer, including freeing up healthcare staff for other tasks and minimising human error.

One area of healthcare leading the use of ARDs is surgery. With its promise of better precision, control and flexibility in operations, robotics seems certain to transform the way we undergo procedures over the next few decades.

The Institute of Global Health Innovation’s Hamlyn Centre is leading some of the cutting-edge research in this area. Honorary clinical research fellow at the centre and general surgery registrar, Dr Ahmed Ezzat, recently explored testing an early-stage robotic scrub nurse (RSN) which responds to eye-tracking. The project was co-led with Dr Alexandros Kogkas, Engineer and Research Associate at the HARMS Lab: Human-centred Automation, Robotics and Monitoring for Surgery, part of the Hamlyn Centre. The project was supervised by Dr George Mylonas, Director of the HARMS Lab.

We spoke to Ahmed to find out more about the research he’s conducted with colleagues at Imperial College London.

Why are we turning to assistive robotic devices (ARD) in surgery?

Since the 1980s and 90s, there's been a movement to introduce robotic devices to advance healthcare. Robotic devices offer practical benefits, but they can also improve performance and patient-related outcomes such as length of hospital stay and duration of surgery.

Robotics allows us to perform medical procedures in a wider range of settings. We can use robots to control procedures from a distance, for example by operating remotely. Robotics also improves safety for staff by reducing instrument-based sharps injuries, which mostly occur when surgical tools are passed between staff members.

Are there any examples of ARD currently being used in healthcare settings?

There are several robotic devices in use in surgery, but the da Vinci surgical system is the most widely used and successful one. Some assistant devices can respond to your voice to execute a command such as dimming the operating lights or controlling gas flow during keyhole surgery.

Currently, there are no examples of RSNs that provide assistance using eye gaze. There have been papers and engineering-based designs investigating robotic assistive nurses that respond to commands, and there is one scrub nurse design, named Penelope, which can pick up an instrument and take it back using a combination of voice and gesture control; however, it does not use gaze. Using gaze avoids the problem of picking out the surgeon’s voice in a noisy operating theatre. It also gives the surgeon “an extra hand” whilst retaining the freedom to move around the theatre.

Are there any disadvantages of using ARDs?

Robots are generally costly and bulky, and that goes for the da Vinci as well. There are also worries from surgeons and nurses that these devices could replace their roles. We don’t see ARDs as a way of replacing staff; we see them as a way of supporting them.

Your recent research relates to testing your robotic scrub nurse. How does the device work?

The operating theatre set up for the experiment

Our device uses a robotic arm with a magnetic interface at the end which works as an instrument gripper. We use sensors to create a smart operating room and have the surgeon wear an eye tracker, which looks like sunglasses.

The surgeon uses a screen presenting a range of instruments and fixates on the one they’d like to use. That gaze is then translated into an instruction for the robot to pick up an instrument, which then moves and delivers it to the surgeon. The robot is programmed to know where instruments are positioned on a tray and to make a specific movement. The human scrub nurse will then return the instrument once the surgeon has used it.
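To make the gaze-selection step above more concrete, here is a minimal sketch of how a dwell-time gaze interface could map a stream of eye-tracker samples to an instrument request. Everything here is an illustrative assumption — the region layout, the dwell threshold and the function names are invented for this example, not taken from the team's actual system.

```python
# Hypothetical dwell-time gaze selection loop. All names, screen regions
# and thresholds are illustrative assumptions, not the HARMS Lab's code.

DWELL_SECONDS = 1.0      # how long the surgeon must fixate to "select" (assumed)
SAMPLE_INTERVAL = 0.05   # eye-tracker sample period, i.e. 20 Hz (assumed)

# Screen regions mapped to instruments on the tray (assumed layout):
# each region is (x_min, y_min, x_max, y_max) in screen pixels.
INSTRUMENT_REGIONS = {
    "scalpel":  (0, 0, 200, 200),
    "forceps":  (200, 0, 400, 200),
    "scissors": (400, 0, 600, 200),
}

def region_at(x, y):
    """Return the instrument whose on-screen region contains the gaze point."""
    for name, (x0, y0, x1, y1) in INSTRUMENT_REGIONS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def select_instrument(gaze_samples, dwell=DWELL_SECONDS, interval=SAMPLE_INTERVAL):
    """Scan (x, y) gaze samples in order; return the first instrument the
    surgeon fixates continuously for `dwell` seconds, or None."""
    needed = int(dwell / interval)   # consecutive samples required
    current, count = None, 0
    for x, y in gaze_samples:
        name = region_at(x, y)
        if name is not None and name == current:
            count += 1
            if count >= needed:
                return name          # fixation held long enough: issue the command
        else:
            current, count = name, (1 if name else 0)
    return None

# Example: one second of samples (20 at 20 Hz) inside the "forceps" region.
samples = [(250, 100)] * 20
print(select_instrument(samples))    # prints: forceps
```

In a real system the returned instrument name would be sent to the robot arm's motion planner, which already knows each instrument's position on the tray; the dwell threshold is the usual way such interfaces distinguish a deliberate selection from a passing glance.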

How did you test the device?

We set up an operating theatre environment where we recruited 10 surgeons to perform a mock surgical procedure. We had a trained, qualified scrub nurse from the Trust, a trained assistant surgeon and surgical trainees. Each surgeon performed the procedure twice – once with the help of a human scrub nurse, and once with both an RSN and a human scrub nurse.

What were your findings?

Healthcare professionals take part in the trial with the robotic scrub nurse

The RSN was 100% accurate and did not pick up any wrong instruments during the experiment, demonstrating that it was safe to use. The RSN completed its movements in 2-3 seconds or less. Looking at task completion, the robotic nurse was not inferior to the human scrub nurse.

Surgeons reacted positively towards the RSN, though less positively than our scrub nurses did, which was surprising. Yet both groups felt that the addition of the RSN did not negatively affect task performance.

One significant finding was that surgeons found it frustrating to look at a screen and away from the operating field. However, the device’s most obvious application would be in laparoscopic (keyhole) surgery, where surgeons naturally look at a screen. Surgeons expressed that if the RSN responded to more than one modality – such as gaze, voice and gestures – it would feel more like normal communication with a human scrub nurse. That’s something we would aim to improve.

Nurses felt that working with an RSN could empower them in their roles. They believed they could be freed up to do other, more complex tasks, which I think is very powerful.

What does the future look like for using RSNs in surgical settings?

In the future, there’s no doubt ARDs will be more widely used in surgery as the technology gets lighter, cheaper and smaller. Think of mobile phones! You’ll be able to study a surgeon’s brain wave activity and relate it to fatigue and performance in real time. Using gaze and combining it with other modalities is the way forward, whether it’s pursued by our research team or by others.

Read the paper, ‘An eyetracking based robotic scrub nurse: proof of concept’ here.
