When AI is used in medicine, patients will need new protections

The NHS has a unique store of millions of medical records, providing an unparalleled resource from which, with the use of digital techniques, we may speed progress to the next breakthroughs – so why was a tobacco company allowed to access it for its own gains?

Ara Darzi
Wednesday 07 February 2018 11:58 GMT

For Elon Musk, the term artificial intelligence conjures apocalyptic scenarios of autonomous robots wreaking destruction in a world dominated by hyper-intelligent machines. Stephen Hawking foresees a future in which smart machines replace sluggish humans across a range of activities, driving millions out of work. Last month Bill Gates, speaking at the World Economic Forum in Davos, imagined a gentler future – one with longer holidays and more free time.

“The purpose of humanity is not just to sit behind a counter and sell things,” he said.

We can all speculate about the future. Will.i.am, the Black Eyed Peas singer, has been promoting his first novel this week, an action adventure called WaR: Wizards and Robots. Despite its title, he told an audience in Davos last month that artificial intelligence would be a force for good, narrowing the wealth gap between rich and poor countries.

We must hope it can also narrow the health gap. Healthcare provides especially fertile territory for these advances because of the sheer volume of medical knowledge: no clinician, however smart, can hope to master it all. McKinsey has estimated potential savings of up to $100bn (£72bn) in the US healthcare sector alone from developments in artificial intelligence. The aim is not to replace doctors (yet, at least) but to enhance their medical expertise.

At the same time, treatment can be democratised and spread equally to all. Why rely on one doctor’s opinion when you can draw on thousands, culled from databases of their knowledge and the key studies they rely on? Rural dwellers, living far from medical facilities, may be able to enjoy the same level of expertise as their urban counterparts and, ultimately, those in low-income countries may benefit from the same expert input as those in the industrialised world.

Supporting doctors to diagnose disease is a key area of research. Mobile apps to help patients track changes in their health and respond appropriately are bringing quicker treatment and lower costs. Employing machine learning to identify new chemical agents is speeding up drug development and shaping clinical research.

To reap these benefits, however, scientists need access to data. Data is as vital to machine learning as coal was to the railways and oil to the motorcar. But the potential for abuse of that data is real.

As a surgeon and researcher I was dismayed by the revelations last month that William E Wecker Associates, a company working for the tobacco industry, obtained the lung cancer records of almost 180,000 patients from Public Health England.

The NHS has a unique store of millions of medical records providing an unparalleled resource from which, with the use of digital techniques, we may speed progress to the next breakthroughs in medical science and transform care. That such a uniquely valuable resource should now be plundered on behalf of a tobacco manufacturer seeking to defend its cancer-causing products is simply shameful.

It remains unclear whether any rules were broken by the company in question, which has testified on behalf of tobacco giants in dozens of lawsuits, or indeed by Public Health England, which maintains it was under a legal duty to release the information when it was requested under the Freedom of Information Act.

But our failure to protect our medical data from misuse is symptomatic of a wider malaise – our failure to value it. Incidents such as these undermine patients’ trust and set back the cause of research.

The challenge, then, is to devise a system of data governance that protects the interests of patients, provides access for researchers, distributes the fruits of success fairly and wins the confidence of the public. If we are to generate the growth that these innovations could deliver, we need to demonstrate why data sharing is a social benefit, as necessary to the public good as taxes.

The Government published its industrial strategy in November 2017, in which it set out a plan to create an Artificial Intelligence Council and a Centre for Data Ethics and Innovation, demonstrating its commitment to an ethical approach.

This is welcome, but we need to go further. Public trust demands more transparency. We need a health-specific Data Charter, with clear rules, norms and standards, setting out what can be done, what should be done and what may not be done.

There are huge opportunities in these technologies to advance healthcare, benefit health systems and improve the outlook for millions of patients. But unless we establish clear rules from the outset we risk sacrificing public trust, surrendering vital clinical gains and squandering the potential in the vast quantities of medical data we have spent decades accumulating.

Lord Darzi of Denham is a surgeon and director of the Institute of Global Health Innovation at Imperial College London. He was a Labour health minister from 2007 to 2009.
