The Operating Theatre of the Future at New Scientist Live
Come visit The School of Biomedical Engineering & Imaging Sciences at this year's New Scientist Live, October 10-13
01 October 2019
Dr Christos Bergeles and his team are exploring ways to develop image-guided micro-precise instruments and multi-sensory guidance algorithms to achieve currently impossible interventions and regenerative therapies deep inside the human body.
A senior lecturer in the School of Biomedical Engineering and Imaging Sciences, Dr Christos Bergeles is on a mission.
His team is exploring ways to develop image-guided micro-precise instruments and multi-sensory guidance algorithms to achieve currently impossible interventions and regenerative therapies deep inside the human body.
Earlier this year, the groundbreaking project of Professor Pete Coffey and Professor Lyndon Da Cruz from University College London and Moorfields Eye Hospital succeeded for the first time, in a small trial, in implanting therapeutic cells for sight restoration in Age-Related Macular Degeneration (AMD).
Dr Bergeles's robotics team has since linked with them, creating an interdisciplinary team that is developing flexible robots able to perform sub-millimetre manipulation of delicate tissues during retinal surgery.
These robots reach the back of the eye, where they can, for instance, transplant retinal cells to replace damaged ones, ultimately extending the surgeon's dexterity.
“Robotics can assist expert surgeons in pushing their limits and imagining how they could perform currently impossible interventions,” Dr Bergeles said.
“The multi-disciplinary team we have put together builds on the recent international successes of Moorfields Eye Hospital to spearhead a new era in micro-precise retinal therapy delivery.”
The team is also working on haptic devices to enable surgeries that were not previously possible and to make existing surgery safer and more precise.
Surgical robots cannot currently be fully automated, and one of the main challenges researchers and engineers are navigating is how to give the surgeon more control over the medical instrument during the operation, using both force feedback and visual feedback from these tools.
That’s where haptics come in. Both kinds of feedback are achievable with a haptic interface the team has developed: the surgeon controls the position of the surgical robot remotely while simultaneously feeling the forces between the robotic tool and the tissue.
The models also implement various algorithms to simulate what surgeons experience when they encounter anomalies in the tissue, such as lesions.
“The sense of human touch has guided surgical interventions for decades. With the advent of robotic minimally invasive surgery, however, the sense of touch has been lost,” Dr Bergeles explained.
“Even more, in vitreoretinal surgery, there was never a mechanism to sense potentially harmful forces, as the applied forces are at the limits of human perception. We are tackling this problem through new force sensing approaches and novel haptic devices that deliver the sense of touch to the surgeon.”
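In outline, such a haptic loop can work as follows. The Python sketch below is a minimal illustration, with every function name and parameter value assumed for the example rather than taken from the team's system: a simple spring model renders the tool-tissue contact force, stiffens over a lesion, and scales milli-Newton interaction forces up to a range the surgeon's hand can perceive.

```python
# Illustrative haptic feedback step; all values are hypothetical,
# not the team's parameters.
HEALTHY_STIFFNESS = 40.0   # N/m, assumed stiffness of healthy retinal tissue
LESION_STIFFNESS = 120.0   # N/m, a lesion is assumed to feel stiffer
FORCE_SCALE = 50.0         # amplifies milli-Newton forces to a perceptible range
MOTION_SCALE = 0.1         # 1 mm at the surgeon's hand -> 0.1 mm at the tool

def rendered_force(tool_depth, surface_depth, over_lesion):
    """Spring (impedance) contact model: force grows with penetration depth,
    and anomalies such as lesions are rendered with a higher stiffness."""
    penetration = tool_depth - surface_depth  # how far the tip is past the surface
    if penetration <= 0.0:
        return 0.0                            # no contact, no force
    k = LESION_STIFFNESS if over_lesion else HEALTHY_STIFFNESS
    return FORCE_SCALE * k * penetration      # scaled up for the operator's hand

# One cycle of the loop: read the haptic handle, command the robot at a
# finer scale, and send the scaled contact force back to the handle.
handle_depth = 0.0015                         # m, read from the haptic interface
tool_depth = MOTION_SCALE * handle_depth      # motion scaling for precision
force = rendered_force(tool_depth, surface_depth=0.0001, over_lesion=False)
print(f"force displayed to the surgeon: {force:.3f} N")
```

Note how the two scale factors pull in opposite directions: hand motion is scaled down so the tool moves with micro-precision, while the sensed forces, at the limits of human perception, are scaled up so the surgeon can feel them.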
Advancing the robotic technology comes with its own set of challenges. As Dr Bergeles explains, engineers cannot simply repurpose a system designed for another job.
“If you take a human hair, which is around a quarter of a millimetre, and you cut it horizontally, and you need to go into one of these little cuts and do something there - these are very tiny motions, and in order to achieve them you have to be very careful about how you design your systems and how you study them,” Dr Bergeles said.
“We’re advancing the state of the art by considering in more detail how every single robot component interacts with the robotic mechanism through advanced modelling and simulations.”
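One standard simplification used to reason about flexible robots is the constant-curvature approximation, in which each segment bends into a circular arc. The Python sketch below, using assumed example values rather than the team's models, shows how sensitive the tip position is to tiny curvature errors at this scale.

```python
import numpy as np

def constant_curvature_tip(kappa, phi, length):
    """Tip position of one flexible segment under the common constant-curvature
    approximation: the segment bends into a circular arc of curvature `kappa`
    (1/m), rotated by angle `phi` about the base axis."""
    if abs(kappa) < 1e-9:
        return np.array([0.0, 0.0, length])   # straight segment
    r = 1.0 / kappa                           # radius of the arc
    theta = kappa * length                    # total bending angle
    x = r * (1.0 - np.cos(theta))             # in-plane deflection
    z = r * np.sin(theta)                     # distance along the base axis
    return np.array([x * np.cos(phi), x * np.sin(phi), z])

# A 20 mm segment bent to a 50 mm radius of curvature (kappa = 20 /m):
print(constant_curvature_tip(kappa=20.0, phi=0.0, length=0.020))
# A 0.5% curvature error already moves the tip by roughly 20 microns,
# which is why component-level modelling matters at this scale.
print(constant_curvature_tip(kappa=20.1, phi=0.0, length=0.020))
```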
When designing the systems, the team deliberates with their clinical co-lead, observing them at work to identify what needs to be improved. The team gathers requirements, but they don’t just want to make a new robot; they must ensure it fits within current practice.
“We can’t have something that’s extremely disruptive and will require retraining surgeons. We need to ease robotics into the operating theatre, implying that we have to consider phased approaches,” Dr Bergeles said.
“We think about how we could introduce a new tool that respects the spatial arrangement of the current operating theatre, because the surgeon is just one of many people there, and the robots should fit in with them.”
“We plan to cover all these requirements and then make devices that meet the surgical requirements while also fitting the operating workflow.”
Their designs are then implemented and tested through a virtual reality framework in which the clinician, research fellows and engineers wear a VR headset and are immersed in a simulated operating theatre where they can see and interact with the robots. This allows the team to catch design flaws early on.
“It’s one thing seeing it on a computer screen, and another thing being immersed, when you can see what these robots will finally look like,” Dr Bergeles said.
The progress the team has made in the last five years has been substantial. They have created computational tools that allow them to design robots better, and medical image analysis algorithms that help the robots become a little more aware of the environment in which they operate. For instance, an algorithm can recognise the retinal vessels the robot should target, helping the robot stabilise an injection despite potential tissue deformations during the intervention.
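As a generic sketch of how vessel-aware stabilisation can work, rather than the team's own algorithm, the Python fragment below enhances vessels with scikit-image's standard Frangi filter and uses phase correlation to move a hypothetical injection target as the retina drifts between frames.

```python
import numpy as np
from skimage.filters import frangi
from skimage.registration import phase_cross_correlation

def vessel_map(frame):
    """Enhance tube-like structures (vessels) in a grayscale retinal frame.
    The Frangi vesselness filter is a standard choice; set black_ridges=True
    for real fundus images, where vessels are darker than the background."""
    return frangi(frame, sigmas=range(1, 8, 2), black_ridges=False)

def stabilised_target(reference_frame, current_frame, target_px):
    """Estimate retinal drift between two frames and move the injection
    target with it, so the robot keeps pointing at the same vessel."""
    shift, _, _ = phase_cross_correlation(vessel_map(reference_frame),
                                          vessel_map(current_frame))
    # `shift` is the translation that registers the current frame onto the
    # reference, so the target in the current frame moves by -shift.
    return np.asarray(target_px, dtype=float) - shift

# Synthetic example: a bright 'vessel' line that drifts 3 pixels downward.
ref = np.zeros((64, 64)); ref[30, :] = 1.0
cur = np.zeros((64, 64)); cur[33, :] = 1.0
print(stabilised_target(ref, cur, target_px=(30.0, 20.0)))  # ~[33. 20.]
```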
But while the benefits of the technology are clear from an engineer’s perspective, the public continues to grapple with the concept of a robot performing surgery.
Earlier in the year, in collaboration with the Macular Society, a patient advocacy group, Dr Bergeles and Prof Lyndon Da Cruz organised Public and Patient Involvement (PPI) sessions to explore public sentiment towards the work his team is developing: what people's most fundamental worries about a robot operating would be, whether they would be willing to go on a waiting list if the robot were one of a kind, and how they would want to interact with their surgeons.
“This gave us some very early insights: people would support robotic technology if it had an obvious benefit to them,” Dr Bergeles said.
At this PPI event, the team tried to uncover people's fears and hopes for robotic and retinal surgery, and how patients would feel depending on whether a robot was autonomous or not.
“One person was concerned about whether a robot will or will not show empathy,” Dr Bergeles said.
“Maybe there was a misconception there that this will be a human-like robot, but these are just machines; they don’t exhibit any anthropomorphic characteristics. Maybe in the future the patient can come and see what the robot is like and understand what’s going to happen to them.”
Dr Bergeles said engineers tend to think that what society wants is for surgeons to always be in control of the robots, so that they are never entirely autonomous and any mistakes can be corrected.
However, participants revealed that if a robot is designed to be able to perform an operation, they would trust it to operate without surgeon supervision.
“Currently, a robot is not on its own an intelligent entity – it's fully operated by the clinician. But if we were to opt for a follow-up event, we would use this to understand the directions we could go in terms of capability and start to investigate the possibilities of autonomy in surgery.”
At this year’s New Scientist Live, the team will be demonstrating the possibilities their technology opens up.
“I’d like the public to understand the engineering process behind the technology and the fundamental mechanisms that will allow interventions that are not currently possible, so audiences can get a sense of the future of retinal surgery and where we’re going,” Dr Bergeles said.
But what does that future look like, according to Dr Bergeles? Part of it is about transparency for the surgeon, so that there is minimal friction with the robot, allowing surgeons super-human capabilities.
“It will allow you to do something that’s very fine, like accessing the small parts of the human eye,” Dr Bergeles explains.
“Image guidance and computer vision algorithms will collect information from the eyes, from the microscopes and from the robots in order to help with clinical decisions, so they could help stabilise the clinician’s motions and could record patient anxiety levels.”
“It will be a much richer environment with much more information, much more precision and will be much more personalised.”
The long-term goal of this research, Dr Bergeles says, is to enable novel therapies with the help of medical robots, assisting surgeons in delicate procedures that require precision not achievable manually.
The real importance of their system will be in enabling forms of surgery that are at present impossible: the precise, localised delivery of gene or cell therapies to any individual retinal layer.
“We hope that in the coming years these devices will be in a position to deliver these advanced therapies and help patients regain their sight.”