At the start of lockdown in the UK back in March (2020, if anyone needs reminding…), doctors' surgeries closed their doors to walk-in appointments and everything went as online as possible. Whilst many would have been sceptical of a virtual consultation, I think many have also been pleasantly surprised. GP surgeries have had fewer missed appointments and shorter waiting times to see a doctor, and going virtual meant many concerns could be addressed without the patient needing to come to the surgery. But what if the doctors themselves were computerised? What if we had robot doctors?
Computers and algorithms have already started to infiltrate our health: they count our steps, monitor our heart rate, track calorie intake, and even give us cues when we've been sitting about for too long. But could robots, computers and algorithms replace a human doctor? There are already examples of robots and computers used in healthcare. During the pandemic, disinfecting robots have been used in hospitals to kill any microorganisms in a patient's room or surgery after they've left. The robot (called UVD) uses high-powered UV rays which destroy microorganisms' DNA, stopping them from multiplying and causing harm.
The da Vinci Surgical System is a robot-assisted surgery tool, controlled by a human, that has been in use since 2000. It allows for more precision than a human surgeon alone, leading to less bleeding, faster healing and a reduced risk of infection. More recently, however, a robot controlled by a computer performed intestinal surgery on a pig. Using an advanced imaging system, the computer was able to guide the robot to positions accurate to the millimetre, reconnecting the intestine of a living animal. Researchers compared the computer-controlled robot's work to that of a human surgeon and the previously mentioned da Vinci system: it outperformed both. Capsule endoscopies have also begun to be used in the NHS, whereby the patient swallows a pill-sized camera robot which travels along the digestive tract to gather data and take pictures for diagnostics. This could reduce the often unpleasant preparation and would certainly be a more comfortable procedure than current endoscopy methods. But what about when you take away human control and give the machine the power?
Artificial Intelligence (AI) and Machine Learning (ML) have become buzz phrases commonly used to talk about how machines learn from experience, adjust to new inputs and perform human-like tasks. How? Huge amounts of data are fed into a system which uses fast processing and algorithms to find patterns and thus learn. For example, if you show an AI system 100 photos of wolves and 100 photos of huskies, and ask it to spot the difference, the system can find patterns that let it do so. But whilst we know what a dog looks like (four legs, furry, etc.), researchers found one such system had learnt to tell the difference based primarily on whether there was snow in the picture it was presented with. A more successful example was tested in 2017, when AI was used to analyse breast cancer biopsies. Researchers at MIT implemented an AI system which could predict whether a high-risk lesion detected in a biopsy would lead to malignant cancer. The system was presented with 335 cases; it correctly diagnosed 97% of the malignant cases and reduced the number of unnecessary surgeries on benign lesions by more than 30%, compared to current methods of detection. This is just one of many ongoing AI studies into single, specific-use diagnostic tools. What about multi-purpose robotic GPs?
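The wolf/husky mix-up above is easy to reproduce in miniature. Below is a toy sketch (not the researchers' actual model) using made-up, hypothetical features: if "snow in the photo" lines up with the wolf label more reliably than the genuine animal features do, a simple classifier will lean on the snow.

```python
# Toy illustration of a model learning a shortcut: wolves mostly
# photographed on snow, huskies mostly not. Feature names are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 100

# Labels: 1 = wolf, 0 = husky (100 "photos" of each)
y = np.concatenate([np.ones(n), np.zeros(n)])

# "Snow" appears in 90% of wolf photos but only 10% of husky photos
snow = np.concatenate([rng.random(n) < 0.9, rng.random(n) < 0.1]).astype(float)

# Genuine animal features exist, but are noisy (weak signal)
fur = y + rng.normal(0, 2.0, 2 * n)
ears = y + rng.normal(0, 2.0, 2 * n)

X = np.column_stack([snow, fur, ears])
model = LogisticRegression().fit(X, y)

weights = dict(zip(["snow", "fur", "ears"], model.coef_[0]))
print(weights)  # the "snow" weight dominates: the model learned the background
```

The classifier scores well on this data, yet a husky photographed on a ski slope would likely be called a wolf. That is exactly the worry with medical AI: high accuracy on the training data does not guarantee the system has learnt the clinically relevant pattern.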
Technology and consulting company IBM wants to do just that. In 2011, IBM entered its question-answering AI computer system, Watson, into the American TV show Jeopardy!, where it beat the reigning champions. After the win, IBM turned its efforts to developing Watson into a diagnostic robot. Watson's broader goal was to be the beginning of a new generation of technology that could find answers in masses of data more effectively than standard search technology. However, researchers have found this is much more difficult than expected. IBM envisaged an AI doctor, able to take the symptoms presented by any patient and quickly produce a diagnosis and treatment plan. That would mean Watson would need to be able to differentiate between all diseases and conditions and their symptoms. In theory, this shouldn't have been impossible: input all the data on all diseases, and Watson could learn which symptoms went with which disease and start to differentiate. And in the closed environment of the lab, Watson performed well. Researchers inputted bizarre combinations of symptoms and Watson could successfully diagnose them. In the messy real world, however, Watson didn't do so well. In cancer studies, researchers found Watson struggled to compute diagnoses from doctors' notes: if they used shorthand or wrote out of chronological order, Watson couldn't use the data to make an informed decision. And what about our complicated language? How is Watson meant to learn that when a patient says their stomach is "killing them" they may not literally mean that?
Many other concerns arise with AI in medicine. One is overdiagnosis: an overly sensitive system could lead to unnecessary treatments. Another is who the system would be working for, the individual or the wider population? An individual may benefit from a course of antibiotics for their continuous cough, but what about antibiotic resistance? Would AI systems know to account for these instances? No one is denying the importance of human doctors and nurses, especially after a pandemic which has certainly opened everyone's eyes to the incredibly challenging job they face every day. Quick diagnoses and treatment plans from AI could, however, free up healthcare professionals' time for administering treatments and for more personalised patient care and compassion. But the robot doctor is still a way off, and won't be seeing you just yet…