5 innovative pieces of healthcare tech we learned about this Christmas

Every Christmas, researchers from IGHI’s Hamlyn Centre gather to show off their latest innovations in robotics and healthcare technology.

We caught up with some of the team to find out more about their research, and how they hope it could make a difference to people’s lives.

1. GLOW camera for breast-conserving surgery

Cancer surgeons aim to remove as much of a tumour as possible, while sparing the surrounding healthy tissue to reduce side effects. To do this successfully, surgeons need sophisticated and accurate methods of determining the precise location of the tumour. Second-year PhD student Maria Leiloglou is in the early stages of developing a camera system that she hopes will help to guide the surgeon’s knife. Called GLOW, her device could equip surgeons with better information on the location of a tumour, using optics, or light, in real time.

Maria’s camera system uses a beam splitter to divide incoming light into two paths: one camera captures visible light, while the other captures fluorescence. By injecting the patient with a dye called indocyanine green, which binds to blood plasma proteins, the system can reveal the tissue’s vasculature. Tumours’ blood vessels are thought to have different characteristics from those of surrounding healthy tissue, so by overlaying the two images, Maria hopes to provide information that helps discriminate between healthy and cancerous tissue.
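To give a flavour of the overlay idea, here is a minimal sketch (not Maria’s actual software) of how a fluorescence map might be blended onto a visible-light frame, assuming the two images are already aligned pixel-for-pixel:

```python
import numpy as np

def overlay_fluorescence(visible_rgb, fluorescence, alpha=0.5):
    """Blend a fluorescence intensity map (values 0-1) onto an RGB frame.

    Areas with strong fluorescence are tinted green, so vasculature
    stands out on top of the ordinary camera view.
    """
    overlay = visible_rgb.astype(float).copy()
    # Push the green channel towards full brightness in proportion
    # to the local fluorescence signal.
    overlay[..., 1] = (1 - alpha * fluorescence) * overlay[..., 1] \
        + alpha * fluorescence * 255
    return overlay.astype(np.uint8)

# Toy example: a uniform grey 2x2 frame and a small fluorescence map.
visible = np.full((2, 2, 3), 100, dtype=np.uint8)
fluo = np.array([[0.0, 1.0], [0.5, 0.0]])
print(overlay_fluorescence(visible, fluo))
```

In a real system the two cameras would first need careful calibration so that corresponding pixels line up, which is a substantial problem in its own right.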

GLOW is still in the early stages of development. Maria recently launched a clinical trial to discover whether fluorescent signals can be picked up by the camera system. She is now processing this data to learn precisely what it can tell us about the location of tumours.

2. A clearer picture for endoscopists

Endoscopy is a minimally invasive procedure that uses a camera to look inside the body to investigate unusual symptoms, or help perform certain types of surgery.

Despite being a common procedure, doctors often experience difficulties getting a clear image, due to something called ‘specular reflection’. This is where body fluid reflects light from the endoscope and disturbs what the physician can see.

To deal with this issue, researchers Henry Taysom and Ben Jackson are hoping to improve the endoscopy procedure. They’ve created a system that sends a laser through a diffuser to produce a pattern on tissue, which is then analysed by a computer. By increasing the depth of field, the team hopes this can help provide surgeons with better 3D images, eventually in real-time.

While the project is still in its experimental phase, the pair believe that, in the long term, it could also help reduce training time for surgeons.

3. Wearable sensors for measuring food intake 

Around the world, many families continue to go hungry. According to the WHO, 45% of deaths among children under five are linked to undernutrition, mostly in low- and middle-income countries. Yet to help distribute food where people need it the most, we need to better understand what people are – and aren’t – eating.

Senior Lecturer Dr Benny Lo has designed a set of sensors to help determine individuals’ nutritional intake, thus giving an accurate picture of people’s diets. Impressively, his devices can identify different types of food using custom-built, AI-driven surface recognition software.

Many of Dr Lo’s technologies are wearable too. One of the designs is a tiny camera which turns on when a person’s jaw moves in a specific way, while another is an earpiece. These sensors are also lightweight, making it easier for people to eat their food while wearing them.
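The jaw-triggered camera can be pictured as a simple threshold rule: switch on only when a motion signal looks like sustained chewing. The sketch below is a hypothetical illustration of that idea, not Dr Lo’s actual firmware:

```python
def camera_trigger(jaw_signal, threshold=0.6, min_samples=3):
    """Decide whether to switch the camera on.

    jaw_signal: a sequence of jaw-motion readings (0-1). The camera
    triggers only when the signal stays above `threshold` for
    `min_samples` consecutive readings, i.e. something that looks
    like chewing rather than a single twitch.
    """
    run = 0
    for sample in jaw_signal:
        run = run + 1 if sample > threshold else 0
        if run >= min_samples:
            return True
    return False

print(camera_trigger([0.1, 0.7, 0.8, 0.9]))  # sustained chewing
print(camera_trigger([0.7, 0.1, 0.8, 0.2]))  # isolated movements
```

Requiring several consecutive samples above the threshold is a common way to filter out brief spikes, which matters for a battery-powered wearable that should not record constantly.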

In the long term, his team is hopeful that using these devices to track people’s eating habits will help inform public health policies and reduce undernutrition.

4. A ‘Geiger counter’ for prostate cancer

To the naked eye, cancerous tissue is virtually indistinguishable from healthy tissue. Research led by Prof Dan Elson is hoping to give surgeons a clearer picture of cancer in the operating theatre. This project is developing an imaging system that would enable surgeons to “see” prostate tumours in real time.

They’re augmenting an existing probe developed by their partners, Lightpoint Medical. This detects a radioactive molecule that sticks to prostate cancer cells when administered to a patient. It clicks like a Geiger counter when it picks up the radiation, indicating to the surgeon that the tissue is cancerous.

Elson’s project will work to translate this information into a visual map that will be overlaid onto real-time camera footage, guiding the surgeon’s knife and making the operation more precise. This could spare healthy tissue, and thus reduce side effects, while also lowering the risk that cancerous tissue will be left behind.
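The mapping step can be thought of as painting probe readings onto the camera image. Here is a minimal sketch of that idea, with hypothetical names and thresholds rather than anything from the team’s actual system:

```python
import numpy as np

def counts_to_heat_overlay(frame, probe_samples, threshold=50):
    """Paint probe readings onto camera footage as a red 'heat' layer.

    probe_samples: a list of (row, col, counts_per_second) readings
    taken as the probe sweeps across the tissue. Readings above
    `threshold` are drawn in red, brighter for higher count rates.
    """
    out = frame.copy()
    max_counts = max(c for _, _, c in probe_samples)
    for r, c, counts in probe_samples:
        if counts >= threshold:
            intensity = int(255 * counts / max_counts)
            out[r, c] = [intensity, 0, 0]  # red marks likely cancerous tissue
    return out

frame = np.zeros((3, 3, 3), dtype=np.uint8)
samples = [(0, 0, 100), (1, 1, 20), (2, 2, 50)]
heat = counts_to_heat_overlay(frame, samples)
print(heat[0, 0])  # the strongest reading is painted brightest
```

The real challenge, of course, is knowing where the probe tip is relative to each camera pixel, which is why the project pairs the probe with an imaging and tracking system.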

5. Steps towards more accurate diagnosis

Last, but not least, we learnt about technology that’s making its way from movies to medicine. PhD student Xiao Gu is investigating whether motion capture technology could be used to analyse a person’s gait, or the way that they walk, to aid diagnostics and recovery.

The system uses 12 cameras combined with reflective beads placed at various points on a person’s body. By picking up the light reflected from the beads, the cameras work together to measure depth, or distance from the cameras.

This means that the system can produce a 3D estimate of a person’s pose and information on their walking patterns. The aim is for this biometric data to help diagnose certain conditions that can affect a person’s mobility or movement, or to aid rehabilitation, for example after traumatic injury or an event such as a stroke.
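A stripped-down, two-camera version of the depth idea is stereo triangulation: the same marker appears at slightly different pixel positions in each view, and the shift tells you how far away it is. The real system uses 12 calibrated cameras and far more sophisticated maths, so treat this only as an illustration:

```python
def depth_from_disparity(x_left, x_right, focal_length_px, baseline_m):
    """Estimate the distance to a reflective marker seen by two cameras.

    x_left / x_right: horizontal pixel position of the marker in the
    left and right camera images. The closer the marker, the larger
    the shift (disparity) between the two views.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("marker must appear shifted between the views")
    return focal_length_px * baseline_m / disparity

# A marker at pixel 640 in the left image and 600 in the right,
# with an 800 px focal length and cameras 0.1 m apart:
print(depth_from_disparity(640, 600, 800, 0.1))  # 2.0 metres
```

Repeating this for every bead, from many camera pairs, is what lets the system build up the 3D pose estimate described above.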