Pre-registration UK diagnostic radiography student ability and confidence in interpretation of chest X-rays

Radiography student ability and confidence in interpretation of chest X-rays

Authors

  • Abbaas Khan
  • Paul Lockwood, Canterbury Christ Church University

DOI:

https://doi.org/10.7577/radopen.4529

Keywords:

Radiography students, Chest X-rays, Abnormality detection, Training

Abstract

Introduction Chest X-rays are the most frequently requested X-ray examination in English hospitals. This study aimed to assess final-year UK radiography students' confidence and ability in image interpretation of chest X-rays.

Methods Thirty-three diagnostic radiography students were invited to assess their confidence and ability in interpreting chest X-rays from a bank of n=10 cases using multiple-choice answers. Data analysis included 2x2 contingency tables, Cohen's kappa for inter-rater reliability, a Likert scale of confidence for each case, and questions assessing individual interpretation skills and ways to increase learning of the subject.
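The analysis described above (2x2 contingency tables with accuracy and Cohen's kappa) can be sketched as follows. This is a minimal illustration, not the study's code, and the counts below are hypothetical placeholders, not the study's data:

```python
# Hypothetical 2x2 contingency table (illustrative counts only):
# rows = student decision, columns = reference standard.
tp, fp = 40, 12   # student called "abnormal": true positives, false positives
fn, tn = 17, 31   # student called "normal":   false negatives, true negatives

n = tp + fp + fn + tn
accuracy = (tp + tn) / n                      # observed agreement (p_o)

# Expected agreement by chance (p_e), from the marginal totals
p_abnormal = ((tp + fp) / n) * ((tp + fn) / n)
p_normal = ((fn + tn) / n) * ((fp + tn) / n)
p_e = p_abnormal + p_normal

# Cohen's kappa corrects observed agreement for chance agreement
kappa = (accuracy - p_e) / (1 - p_e)

sensitivity = tp / (tp + fn)                  # abnormality detection rate
specificity = tn / (tn + fp)                  # normal-image recognition rate
print(f"accuracy={accuracy:.2f} kappa={kappa:.2f} "
      f"sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
```

With pooled data across students and cases, the same arithmetic yields the pooled accuracy and kappa reported in the results; kappa values near 0.2 fall in the "fair agreement" band of the Landis and Koch scale.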

Results Twenty-three students participated in the study. The pooled accuracy achieved was 61% (95% CI 38.4-77.7; k=0.22). The degree of confidence and ability varied depending on the student and the conditions observed. High confidence was noted with COVID-19 (n=12/23; 52%), lung metastasis (n=14/23; 61%), and pneumothorax (n=13/23; 57%). Low confidence was noted with consolidation (n=8/23; 35%), haemothorax (n=8/23; 35%), and surgical emphysema (n=8/23; 35%). From the sample, n=11 (48%) participants stated they felt they had the knowledge to interpret chest X-rays required of a newly qualified radiographer.

Conclusion The results demonstrated final-year radiography students' confidence and ability in image interpretation of chest X-rays. Student feedback indicated a preference for learning support through university lectures, online study resources, and time spent with reporting radiographers in clinical practice to improve ability and confidence in interpreting chest X-rays.



Published

2021-12-31

How to Cite

Khan, A., & Lockwood, P. (2021). Pre-registration UK diagnostic radiography student ability and confidence in interpretation of chest X-rays: Radiography student ability and confidence in interpretation of chest X-rays. Radiography Open, 7(1), 1–13. https://doi.org/10.7577/radopen.4529

Section

Articles