Concordance and Discordance Between Radiology Residents and Consultant Radiologist Interpretation Of CT Brain

Authors

  • Madiha Pervaiz, Rehman Medical Institute, Peshawar
  • Ummara Siddique Umer, Rehman Medical Institute, Peshawar
  • Muhammad Abdullah, Rehman Medical Institute, Peshawar
  • Ghulam Ghaus, Rehman Medical Institute, Peshawar
  • Muhammad Kamran Khan, Rehman Medical Institute, Peshawar
  • Muhammad Sohail, Rehman Medical Institute, Peshawar
  • Hammad Ur Rehman, Rehman Medical Institute, Peshawar

DOI:

https://doi.org/10.37762/jgmds.11-4.567

Keywords:

Concordance, Radiology, Diagnosis, Consultant

Abstract

OBJECTIVES

The primary objective of this study was to assess the degree of concordance and discordance between resident and consultant radiologists' interpretations of computed tomography (CT) brain images, emphasizing the critical importance of accurate image interpretation for informed clinical decision-making.

METHODOLOGY

Radiology reports for CT brain interpretation were evaluated through a prospective analysis at the Radiology Department of Rehman Medical Institute over two years, from 1st October 2020 to 31st October 2022. Cranial CT scans of 198 patients were initially interpreted by residents (R1, R2, R3, R4); the consultant radiologists then reviewed the same images and issued their final reports. The residents' reports were compared with the consultants' reports, and concordance was recorded when a resident's report was consistent with the consultant's final report. The collected data were recorded in Microsoft Excel, and statistical analysis was performed using SPSS version 22 (IBM Corp., Armonk, NY); Cohen's kappa coefficient was used to determine the level of agreement between residents and consultants.
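For reference, Cohen's kappa compares the observed proportion of agreement p_o with the proportion of agreement expected by chance p_e (a standard definition, not specific to this study):

κ = (p_o − p_e) / (1 − p_e)

A value of 1 indicates perfect agreement and 0 indicates agreement no better than chance; values around 0.8 or higher are conventionally interpreted as strong agreement.
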
RESULTS

Of the 198 CT brain reports evaluated, 186 were in agreement with the consultant radiologist's final report. Among the correctly diagnosed cases, R1 correctly diagnosed 46, R2 80, R3 54, and R4 6. The study achieved a percentage agreement of 93.93%, with a Cohen's kappa coefficient of 0.8.

CONCLUSION

The overall concordance rate between residents and consultant radiologists was 93.93%, with a kappa coefficient of 0.8. This high kappa coefficient indicates strong agreement between the two groups.
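The agreement statistics were computed in SPSS; purely as an illustrative sketch (not the authors' code), the short Python snippet below reproduces the arithmetic behind the two headline figures. The counts (186 concordant reports out of 198) come from the results above, while the chance-agreement term p_e is an assumed placeholder, since the underlying contingency table is not reported.

# Illustration only: the study's analysis was performed in SPSS v22.
# This snippet recomputes the headline agreement figures reported above.

concordant, total = 186, 198   # concordant reads / total CT brain reports

# Observed (raw) proportion of agreement between residents and consultants.
p_o = concordant / total
print(f"Percentage agreement: {100 * p_o:.2f}%")            # ~93.94%

# Cohen's kappa corrects observed agreement for chance agreement p_e.
# p_e depends on the category mix in the contingency table, which is not
# reported in the paper; 0.70 here is an assumed value for illustration.
p_e = 0.70
kappa = (p_o - p_e) / (1 - p_e)
print(f"Cohen's kappa (assumed p_e = {p_e}): {kappa:.2f}")  # ~0.80

In practice the kappa would be computed from the full resident-versus-consultant contingency table (for example via SPSS's crosstabs procedure or scikit-learn's cohen_kappa_score), rather than from an assumed p_e.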

Author Biographies

Madiha Pervaiz, Rehman Medical Institute, Peshawar

Resident, Department of Radiology,
Rehman Medical Institute, Peshawar

Ummara Siddique Umer, Rehman Medical Institute, Peshawar

Associate Professor of Radiology,
Rehman Medical Institute, Peshawar

Muhammad Abdullah, Rehman Medical Institute, Peshawar

Research Fellow, Department of Radiology,
Rehman Medical Institute, Peshawar

Ghulam Ghaus, Rehman Medical Institute, Peshawar

Consultant Radiologist,
Head of Department,
Rehman Medical Institute, Peshawar

Muhammad Kamran Khan, Rehman Medical Institute, Peshawar

Fellow,
Department of Radiology,
Rehman Medical Institute, Peshawar

Muhammad Sohail, Rehman Medical Institute, Peshawar

Assistant Professor,
Department of Neurology,
Rehman Medical Institute, Peshawar

Hammad Ur Rehman, Rehman Medical Institute, Peshawar

Resident,
Department of Radiology,
Rehman Medical Institute, Peshawar

Published

2024-09-30

How to Cite

Pervaiz, M., Umer, U. S., Abdullah, M., Ghaus, G., Khan, M. K., Sohail, M., & Rehman, H. U. (2024). Concordance and Discordance Between Radiology Residents and Consultant Radiologist Interpretation Of CT Brain. Journal of Gandhara Medical and Dental Science, 11(4), 7–11. https://doi.org/10.37762/jgmds.11-4.567
