Reliability of Soft-Copy Versus Hard-Copy Interpretation of Emergency Department Radiographs
- 1 September 2001
- Research article (journal article)
- Published by American Roentgen Ray Society in American Journal of Roentgenology
- Vol. 177 (3) , 525-528
- https://doi.org/10.2214/ajr.177.3.1770525
Abstract
OBJECTIVE. The purpose of this study was to compare the diagnostic reliability of hard-copy and soft-copy interpretation of radiographs obtained in the emergency department, using a methodology for evaluating imaging systems when independent proof of the diagnosis is not available.

MATERIALS AND METHODS. We collected radiographs from a stratified sample of 100 patients seen in the emergency department. The images were obtained using computed radiography; the digital images were printed on film and also stored for display on a workstation. A group of seven experienced radiologists reported the cases using both film and the workstation display. The results were analyzed using mixture distribution analysis (MDA).

RESULTS. Reliability, expressed as the percentage agreement of a typical observer relative to the majority, was computed from the MDA. The result was 90% for both hard copy and soft copy, with bootstrap confidence intervals of 86-94%.

CONCLUSION. We conclude that, in the emergency department, soft-copy...
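The core statistic reported above, agreement of a typical observer with the majority, together with a bootstrap confidence interval, can be illustrated with a small sketch. This is not the authors' MDA procedure (mixture distribution analysis is a model-based method); it is only a hedged, minimal illustration of majority-agreement reliability with a percentile bootstrap over cases, using invented toy ratings (1 = abnormal, 0 = normal) for seven hypothetical observers.

```python
import random

def agreement_with_majority(ratings):
    """Mean fraction of observers agreeing with the per-case majority label.

    ratings: list of cases, each a list containing one label per observer.
    """
    total = 0.0
    for case in ratings:
        majority = max(set(case), key=case.count)  # most frequent label
        total += case.count(majority) / len(case)
    return total / len(ratings)

def bootstrap_ci(ratings, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI, resampling cases with replacement."""
    rng = random.Random(seed)
    n = len(ratings)
    stats = sorted(
        agreement_with_majority([ratings[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    )
    return stats[int((alpha / 2) * n_boot)], stats[int((1 - alpha / 2) * n_boot)]

# Toy data: 10 cases, each read by 7 observers (purely illustrative values).
ratings = [
    [1, 1, 1, 1, 1, 1, 0],
    [0, 0, 0, 0, 1, 0, 0],
    [1, 1, 0, 1, 1, 1, 1],
    [0, 0, 0, 0, 0, 0, 0],
    [1, 1, 1, 0, 1, 1, 1],
    [0, 0, 1, 0, 0, 0, 0],
    [1, 1, 1, 1, 1, 1, 1],
    [0, 1, 0, 0, 0, 0, 0],
    [1, 1, 1, 1, 0, 1, 1],
    [0, 0, 0, 0, 0, 1, 0],
]

point = agreement_with_majority(ratings)
lo, hi = bootstrap_ci(ratings)
print(f"agreement = {point:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

The percentile bootstrap here follows the general idea of the study's interval estimation; the paper cites the refined "better bootstrap" (BCa) intervals, which adjust the percentiles for bias and skewness rather than taking them directly as above.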