Leveraging red dot® to tackle radiology backlogs

In this blog, we show the benefits of using Behold.ai’s red dot® Chest X-ray platform to tackle radiology reporting backlogs. Our red dot® CXR platform was developed in conjunction with NHS Trusts and FRCR Consultant Radiologists, and it has already demonstrated high performance in live clinical settings [1]. Red dot® CXR is CE-marked and UKCA certified.

By Dr Nicole Tay, Tom Dyer, Gaëtan Dissez, Liliana Garcia Mondragon, George Pearse and Jordan Smith



Pressures on Radiology Departments

Chest X-ray (CXR) was the most common imaging procedure conducted in 2019/20, with 8.3 million performed [2], and is the main tool for detecting lung cancer. However, NHS radiology departments are severely short-staffed: the Royal College of Radiologists (RCR) reported that in 2021 there was a 29% shortfall of radiologists in the UK, which is expected to rise to 39% by 2026 [3]. These shortfalls have heightened concerns about existing backlogs and delays to patient diagnoses and treatments.

Moreover, a 2021 report published by the Parliamentary and Health Service Ombudsman highlighted recurrent failings in X-ray and other radiology reporting for cancer diagnosis, with nearly a third of cases experiencing delays in the review or reporting of their images [4]. In the absence of an effective prioritisation mechanism, radiologists and reporting radiographers must report all CXRs in chronological order to identify urgent cases. It is no wonder patients continue to experience delays, which often result in missed opportunities to diagnose and treat a condition sooner.


The impact of Covid on Lung Cancer

Lung cancer is the third most common cancer in the UK, and it is the most common cause of cancer death [5]. The NHS Long Term Plan (LTP) aims for 55,000 more people each year to survive cancer for five or more years and for 75% of cancers to be diagnosed at an early stage [6]. However, the reality of radiology staff shortages, increasing imaging demand and backlogs makes this target seem unachievable. In 2019, a mere 30% of lung cancers in England were diagnosed at stage 1 or 2 [7], which is a long way off the ambitious LTP target.

The Covid-19 pandemic has only exacerbated this problem: it is estimated that, between March 2020 and September 2021, 740,000 GP referrals for suspected cancer were “missed”, and as a result up to 60,000 cancer patients have not started treatment [8]. This is likely to lead to up to 7.7% more deaths within a year of lung cancer diagnosis compared with pre-pandemic levels [9].

Furthermore, lung cancer outcomes are a huge area of concern for healthcare inequalities in the UK.

Patients in the most deprived areas are around 170% more likely to die from lung cancer than patients in the least deprived areas [10].


About Red dot® CXR

Our state-of-the-art red dot® CXR platform is a Class IIa CE-marked and UKCA-certified software as a medical device (SaMD). It can process all frontal chest X-ray examinations of adults (≥18 years) to either rule out or identify the presence of an abnormality in a clinical care setting.

There are two key AI-derived outputs produced by red dot®:

  1. Suspected Lung Cancer (SLC) – a subset of abnormal examinations in which a finding suggestive of lung cancer has been identified. Red dot® has been shown to reduce the rate of cancers missed by human readers [11].
  2. High Confidence Normal (HCN) – a subset of normal examinations (~15% of all CXRs) identified as normal with a very high degree of confidence. Red dot® can safely auto-report these 15% of CXRs with an error rate of 0.34% [12], thereby reducing workloads for radiology departments.
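
As an illustration, the routing implied by these two outputs can be sketched in a few lines of Python. This is a hypothetical sketch: the class, field names and worklist labels below are ours for illustration, not the actual red dot® API.

```python
from dataclasses import dataclass

# Illustrative sketch of triage routing based on the two AI outputs
# described above; field names and labels are hypothetical.

@dataclass
class CxrResult:
    exam_id: str
    slc_flag: bool   # Suspected Lung Cancer
    hcn_flag: bool   # High Confidence Normal

def triage(result: CxrResult) -> str:
    """Route an exam to a reporting worklist based on the AI outputs."""
    if result.slc_flag:
        return "urgent"       # prioritised for rapid human reporting
    if result.hcn_flag:
        return "auto-report"  # ~15% of CXRs removed from the worklist
    return "routine"          # reported in due course

# Example:
print(triage(CxrResult("CXR-001", slc_flag=True, hcn_flag=False)))  # urgent
```

In practice the remaining ~85% of exams still pass through human reporting; the two flags simply reorder and shrink the worklist.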


Tackling Radiology Backlogs – A Case Study

Behold.ai have worked closely with Mid and South Essex (MSE) NHS Trust to help prioritise their patient backlog of 3,497 cases. The red dot® CXR solution was implemented with the goal of identifying urgent cases by triaging patients flagged as positive for Suspected Lung Cancer by the AI solution.


Two patient backlogs were collected from the Trust, consisting of 3,497 unreported adult frontal CXRs. Red dot® CXR was used to triage these exams and speed up reporting. When an exam was flagged as SLC, the reader was provided with a heatmap highlighting the regions the algorithm identified as abnormal. SLC-flagged CXRs were also prioritised for next-day reporting, while the remaining exams were reported in due course. Any CXR flagged by red dot® as suspicious for malignancy and confirmed as such by the Trust radiologist triggered an alert for follow-up CT in line with the Trust’s protocols.

469 cases (13% of the total) were flagged as SLC by the red dot® algorithm, and Trust reporters agreed with 60% of these flagged cases. Disagreements with the algorithm mainly consisted of safe overcalls, such as granulomas and nipple shadows.

AI output: SLC
Number of CXRs flagged (% of total CXRs): 469 (13%)
Agreement with AI output by Trust reporters (% agreement): 280 (60%)
Overview of disagreements: Safe overcalls including granulomas, nipple shadows and pleural plaques.
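
For readers checking the arithmetic, the percentages above follow directly from the raw counts (the variable names below are ours for illustration):

```python
# Sanity check of the case-study figures.
total_cxrs = 3497   # unreported CXRs in the backlog
slc_flagged = 469   # exams flagged as SLC by the algorithm
agreed = 280        # flags the Trust reporters agreed with

flag_rate = slc_flagged / total_cxrs   # fraction of backlog flagged as SLC
agreement = agreed / slc_flagged       # reporter agreement with the flags

print(f"{flag_rate:.0%} flagged, {agreement:.0%} agreement")
# 13% flagged, 60% agreement
```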

While the majority of the significant pathologies within the backlog were already known to the Trust, this analysis raised 10 new CT alerts, amongst which 3 new lung cancer patients were identified. These patients experienced an average delay of 17 days between CXR acquisition and CXR reporting as a result of the backlog. Without the implementation of red dot® to triage and prioritise exams, further delays would have been highly likely. Surprisingly, a further review of these patients’ electronic medical records revealed that two patients were originally documented as having a ‘normal CXR’ following acquisition:

  • Patient A’s CXR was reported as “normal” and the patient was discharged on the same day as CXR acquisition. A CT performed following backlog triage showed multiple masses suggestive of advanced-stage tumours. Patient A experienced a delay of 38 days between CXR acquisition and CT reporting to confirm the diagnosis.
  • Patient B’s CXR was also reported as “normal” and the patient was discharged on the same day as CXR acquisition. A newly acquired CT showed a right lower lobe lesion, also potentially at an advanced stage. Patient B experienced a delay of 35 days between CXR acquisition and CT reporting to confirm the diagnosis.
  • Patient C was admitted to hospital with suspected Covid-19 and a CXR was acquired; the patient was discharged 6 days later. When the backlog was finally reported, an urgent CT was promptly requested, but the department struggled to get the patient to attend the booked appointments. The CT was finally conducted 47 days after the initial CXR and indicated a recurrence of cancer. Patient C experienced a delay of 41 days between CXR acquisition and the CT appointment date; the patient did not attend 2 previous appointments, requiring Trust admin staff to follow up repeatedly.


Conclusion

This case study showed that the red dot® CXR platform can effectively triage the most urgent patients and thus reduce delays to lung cancer treatment. It also demonstrated red dot®’s ability to catch expert reader errors, which in turn highlights the value of a second reader for the evaluation of potential lung cancer in a climate of enormous pressure on clinical staff.

The platform can therefore play an important role in alleviating pressures on radiology departments and can lead to improved lung cancer outcomes. The use of red dot® is also likely to have a positive impact on health inequalities, for example by promoting same-day CT exams, which benefit patients who cannot afford to make multiple hospital visits. Indeed, NHS Trusts in deprived areas are more likely to have high DNA (did not attend) rates for follow-up CT after a CXR is flagged as suspicious for malignancy.


References

1. Somerset NHS Trust tests AI in cancer diagnosis. Available from https://www.ukauthority.com/articles/somerset-nhs-trust-tests-ai-in-cancer-diagnosis/.

2. NHS England and NHS Improvement. Diagnostic Imaging Dataset Annual Statistical Release 2019/20; 2020. Available from https://www.england.nhs.uk/statistics/statistical-work-areas/diagnostic-imaging-dataset/diagnostic-imaging-dataset-2019-20-data/.

3. The Royal College of Radiologists. UK workforce census 2021 report; 2021. Available from https://www.rcr.ac.uk/clinical-radiology/rcr-clinical-radiology-census-report-2021.

4. Parliamentary and Health Service Ombudsman. Unlocking solutions in imaging: working together to learn from failings in the NHS; 2021. Available from https://www.ombudsman.org.uk/publications/unlocking-solutions-imaging-working-together-learn-failings-nhs.

5. Cancer Research UK. Lung cancer incidence statistics. Available from https://www.cancerresearchuk.org/health-professional/cancer-statistics/statistics-by-cancer-type/lung-cancer/incidence.

6. NHS. The NHS Long Term Plan; 2019. Available from https://www.longtermplan.nhs.uk/publication/nhs-long-term-plan/

7. Cancer Research UK. Staging data in England. Available from https://www.cancerdata.nhs.uk/stage_at_diagnosis.

8. National Audit Office (NAO). NHS backlogs and waiting times in England; 2021. https://www.nao.org.uk/wp-content/uploads/2021/07/NHS-backlogs-and-waiting-times-in-England.pdf.

9. Maringe, Camille et al. “The impact of the COVID-19 pandemic on cancer deaths due to delays in diagnosis in England, UK: a national, population-based, modelling study.” The Lancet. Oncology vol. 21,8 (2020): 1023-1034.

10. Cancer Research UK and National Cancer Intelligence Network. Cancer by deprivation in England:  Incidence, 1996-2010, Mortality, 1997-2011. London: NCIN; 2014.

11. Tam, M D B S et al. “Augmenting lung cancer diagnosis on chest radiographs: positioning artificial intelligence to improve radiologist performance.” Clinical radiology vol. 76,8 (2021): 607-614. doi:10.1016/j.crad.2021.03.021

12. Dyer, T et al. “Diagnosis of normal chest radiographs using an autonomous deep-learning algorithm.” Clinical radiology vol. 76,6 (2021): 473.e9-473.e15. doi:10.1016/j.crad.2021.01.015.
