The findings of the Independent Review of Equity in Medical Devices were published on Monday 11 March 2024 in a report on the Government website. The report examines the performance of devices commonly used within the NHS and the biases they can exhibit, covering three main types of device: optical devices such as pulse oximeters, devices enabled by Artificial Intelligence (AI), and certain genomics applications. The experts behind the report call for urgent action on the development, testing and deployment of these devices to prevent patient harm and to stop health inequalities widening further.

The review panel was made up of multiple experts and led by Professor Dame Margaret Whitehead. Alongside Professor Chris Holmes (The Alan Turing Institute and Department of Statistics, University of Oxford), the panel included Professors Raghib Ali (University of Cambridge), Enitan Carrol (University of Liverpool and North West Clinical Research Network), and Frank Kee (Queen's University Belfast).

During the Covid-19 pandemic, pulse oximeters were widely used to monitor blood oxygen levels, and the expert panel found evidence that these devices can over-estimate blood oxygen levels in people with darker skin tones. This can lead to dangerously low oxygen levels being missed and, in turn, to delays in treatment. The review recommends mitigating actions for the pulse oximeters already in widespread use across the NHS, and sets out how adverse impacts can be prevented in new devices as they are developed and brought into use.

For AI-enabled devices, the review also found evidence of potential biases against women, ethnic minorities, and people from disadvantaged socioeconomic groups. These biases affect the clinical decision-making tools used to identify higher-risk patients who need more intensive treatment. One example in the report is the under-diagnosis of skin cancers in people with darker skin when using AI-enabled devices, because those devices are 'trained' predominantly on images of lighter skin tones, with far fewer images of medium to dark skin tones.

Long-standing gaps in medical knowledge about the health of women and people born female could also introduce bias into AI-enabled devices. One such case examined by the report is the under-diagnosis of cardiac conditions in women and people born female, an under-diagnosis that the AI algorithms used in these devices have the potential to make worse.

As a member of the review panel, Professor Holmes called on the Government to appoint an expert panel to address the AI revolution happening in healthcare:

“We are calling on the government to appoint an expert panel including clinical, technology and healthcare leaders, patient and public representatives and industry to assess the potential unintended consequences arising from the AI revolution in healthcare.

Now is the time to seize the opportunity to incorporate action on equity in medical devices into the overarching global strategies on AI safety,”

Professor Chris Holmes

The review also recommends that the Government start preparing now for the disruption to healthcare from the next generation of AI-enabled machines, to reduce the risk of patient harm.

Further Information

Find out more about the report and read it in full here.

Click here to read the Government response to the Report.