The NHS could massively benefit from new technologies – just not the new technologies you think.
The NHS turned 70 this year. Much of the coverage of the anniversary highlighted how the health service is facing unprecedented challenges: a shortage of clinicians, an ageing population prone to ever more ailments, and ongoing public health challenges from poor nutrition to air pollution.
And all of this is set against a background of the vagaries of politicians, including pledges to deliver a seven-day NHS, attacks on junior doctors, and various health service ‘reforms’.
When an organisation is stressed in this way, it’s not unusual to hear calls for greater efficiencies to be made, for staff to start working smarter, not harder — and technology is often seen as an enabler of such productivity gains.
The NHS has had a difficult relationship with technology in the past: indeed, some of the worst failures in public sector IT have been associated with the health service, like the costly problems of the NHS National Programme for IT. The more recent WannaCry ransomware attack, which took swathes of NHS services offline, shows that the organisation is still lagging when it comes to basic IT hygiene.
That said, it’s hard not to come to the conclusion that there are areas where technology could help the NHS deal with its challenges: experiments with artificial intelligence are showing early promise, indicating that there are areas of routine work, such as reviewing various types of medical scans, that could one day be handed over to AIs.
Similarly, it’s easy to see how Internet of Things deployments could be used to improve patient safety and resource management. Other technologies, such as robotics or virtual reality, could have smaller, but potentially equally interesting, roles within the health service in future. Connected devices, both for use in hospitals and in the home, will also help drive insight into public health at a population level. Some of the pressure here will come from patients themselves, who have bought the latest gadget, such as the new Apple Watch.
More clouds hang over other emerging technologies, however. For a permanently cash-strapped health service, any technology that requires a significant financial outlay, such as a new surgical robot, is likely to make its way into the NHS only slowly, particularly if it’s a more esoteric piece of kit. One example of the honest realities of funding: while charities may be willing to fundraise for a new surgical robot for a children’s ward, will they feel the same about IoT kit for tracking (expensive and easy to misplace) hospital beds? That’s always going to be a harder sell.
Still, the trend for technology to become dramatically cheaper and more powerful over time, particularly when there’s increased competition, will likely see technologies like AI and robotics spread relatively quickly once the right price point is reached.
But questions over responsibility will also need to be resolved before some emerging tech can be used to its full potential. Take AI, for example, or any other automated system: if something goes wrong, who should take the blame?
If an AI reads a scan and makes the wrong call, sending a patient home with the all-clear when their cancer has in fact recurred, working out how to apportion blame will be tricky. Will we blame the NHS for using a system that can make such mistakes, the hospital for not reviewing the AI’s decisions, or any of the technology providers involved in the health service’s AI stack? How does healthcare ensure that the algorithms behind these services are tested and fair to the whole community, not just a subset?
These are harder problems to resolve than questions of cost, bringing with them issues of patient safety and litigation. The NHS will have to decide what is an acceptable error rate for autonomous tech systems, just as it has to with human medical professionals.
A related issue that must also be dealt with is trust. From consumer wearables to AI systems, the health service’s users need to feel that any data they hand over to emerging tech will be treated with the same level of confidentiality as if they had shared it with their family GP.
The NHS has already made some missteps in this area: the tin-eared attitude to data sharing with Care.data, and the clumsy handling of patient information with the pilot of DeepMind’s Streams app, have shown the health service still has a way to go before it reaches the necessary level of trust for wider data sharing.
That’s not only a shame because it could hold back the rollout of useful technology, but also because it prevents the NHS from gathering vital data that could be used for public health research to improve healthcare across the country.
Of course, NHS-wide data gathering is something of a pipe dream right now. Gathering and sharing data between organisations within the NHS would require interoperability between the disparate IT systems the health service uses. Because IT procurement is done piecemeal, NHS IT is not standardised, meaning it’s far too hard to piece together data across the health service as a whole — a real shame given how useful such data would be in training those AI systems which could do so much good.
What’s more, a significant chunk of the NHS, particularly in secondary care, is firmly attached to good old undigitised paper. For many patients admitted to hospital, every interaction with a doctor will be recorded in a bundle of paper notes, in the doctor’s famously terrible handwriting.
Pages of notes that fall out with age, which are often kept separately to the patient’s drug chart, and can only be viewed by one healthcare professional at a time. Pages that more junior members of the ward team will spend a not insignificant amount of time chasing around the ward on a daily basis.
Those drug charts, too, are paper — a dreadful anachronism in a world where e-prescribing schemes would cut out the mistakes caused by misread handwriting.
Similarly, fax machines and the transfer of images on physical CDs are still in use in some modern hospitals. While there’s a certain security argument for such systems, equally secure all-digital alternatives exist, and would greatly cut the time it takes to send, say, a discharge letter detailing a new medication regime from a hospital ward to the patient’s GP.
Even basic mobile tech is something of a rarity, although that is more understandable given security concerns. You can’t help but think of the simple productivity benefits mobile would bring — the ability for a consultant to check a patient’s notes when they get a 2am call about an emergency, for example — and wonder how much consumer tech is already taking the strain: when WannaCry took email systems offline, it was WhatsApp that let health professionals carry on communicating with each other.
While emerging tech from AI to VR, robotics to IoT could bring clear improvements to the NHS, it seems like there are greater, systemic benefits to be had from just getting the basics right: a paperless NHS, electronic prescribing, data sharing between primary and secondary care, a central body to guide NHS procurement to ensure health service-wide interoperability.
Such a wish list may sound ridiculously simple, but it would be both costly and hugely time-consuming, and unlikely to capture either the public imagination or political enthusiasm in the same way as more cutting-edge tech. And yet, it’s these ‘simple’ changes that could not only bring greater efficiency and better productivity for the NHS, but also lay the foundations for the next generation of emerging technologies.