Article ID: 3983212
Journal: Clinical Radiology
Published Year: 2012
Pages: 5
File Type: PDF
Abstract

Aim
To compare levels of agreement amongst paediatric clinicians with those amongst consultant paediatric radiologists when interpreting chest radiographs (CXRs).

Materials and methods
Four paediatric radiologists used picture archiving and communication system (PACS) workstations to evaluate independently, in each of 30 CXRs, the presence of five radiological features of infection. The radiographs were obtained over 1 year (2008) from children aged 6 months to <16 years with fever and signs of respiratory distress. The same CXRs were interpreted a second time by the paediatric radiologists, and by 21 clinicians of varying experience, using the Web 1000 viewing system and a projector. Intra- and interobserver agreement within groups, split by grade and specialty, were analysed using free-marginal multi-rater kappa.

Results
Normal CXRs were identified consistently amongst all 25 participants. The four paediatric radiologists showed high levels of intraobserver agreement between methods (kappa scores between 0.53 and 1.00) and interobserver agreement for each method (kappa scores between 0.67 and 0.96 for PACS assessment). The 21 clinicians showed varying levels of agreement, from 0.21 to 0.89.

Conclusion
Paediatric radiologists showed high levels of agreement for all features. In general, the clinicians had lower levels of agreement than the radiologists. This study highlights the need for improved training in CXR interpretation for clinicians, and for timely reporting of CXRs by radiologists to allow appropriate patient management.
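The agreement statistic used here, free-marginal multi-rater kappa (Randolph's kappa), has a simple closed form: kappa = (P_o - 1/k) / (1 - 1/k), where P_o is the proportion of agreeing rater pairs averaged over cases and k is the number of rating categories. The following is a minimal Python sketch of that computation, not code from the study; the function name free_marginal_kappa and the toy data are illustrative assumptions, with a binary present/absent rating standing in for one of the five radiological features.

import numpy as np

def free_marginal_kappa(ratings: np.ndarray, k: int) -> float:
    """Randolph's free-marginal multi-rater kappa.

    ratings : (N cases, n raters) array of category labels in {0, ..., k-1}
    k       : number of rating categories
    """
    N, n = ratings.shape
    # Per case, count how many raters chose each category.
    counts = np.stack([(ratings == c).sum(axis=1) for c in range(k)], axis=1)
    # Observed proportion of agreeing rater pairs, averaged over cases.
    p_o = (counts * (counts - 1)).sum() / (N * n * (n - 1))
    # With free marginals, chance agreement is 1/k.
    p_e = 1.0 / k
    return (p_o - p_e) / (1.0 - p_e)

# Toy example: 30 CXRs, 4 raters, binary feature (present/absent).
rng = np.random.default_rng(0)
toy = rng.integers(0, 2, size=(30, 4))
print(round(free_marginal_kappa(toy, k=2), 2))

With k = 2 the chance-agreement term is 0.5, so randomly generated ratings like the toy data above yield kappa near 0, while perfectly concordant raters yield 1.0, matching the 0.53 to 1.00 range reported for the radiologists.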

Related Topics
Health Sciences; Medicine and Dentistry; Oncology