| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 2735724 | Radiography | 2015 | 6 | |
• Retrospective analysis of three years of CR/DR rejects using a customised QA program.
• Data reviewed by individual workstation and department.
• Trends highlighted between workstations and departments.
• Overall annual reject rates shown to be within WHO recommendations.
• Recommendations for future improvements to reject analysis within the department and continued review of JPEG images.
Aims
Reject analysis continues to play an integral part in a Quality Assurance (QA) program. This study aims to show how Computed Radiography (CR) and Digital Radiography (DR) reject analysis data can be customised by the user to aid the interpretation of exported data and to identify trends and issues relating to technique and training.

Materials and methods
Reject analysis was reviewed for the period 2011–2014 using data exported from the CR and DR systems in the Accident and Emergency (A&E) and General radiology departments of a district general hospital. Reject criteria were customised to departmental needs and standardised across all workstations, with monthly data collection amalgamated onto a central spreadsheet.

Results
Analysis by workstation and department was performed with regard to the total number of exposure events, rejects and reject ratios (%), and the reasons for image rejection (positional and exposure) were reviewed. Annual overall reject ratios (%) were on average within the levels considered acceptable by the World Health Organisation (WHO)1, with some variability on a monthly basis according to the workloads experienced.

Conclusions
A number of improvements have been suggested to improve data reliability for future analysis, and continued review of the physical rejected images is recommended, as this can highlight problematic areas and help to reveal trends which the data alone cannot show.
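The reject ratio reported above is simply the number of rejects divided by the total number of exposure events, expressed as a percentage. The sketch below shows one way monthly per-workstation reject ratios could be derived from exported data amalgamated into a central spreadsheet; the file name, column names and the 10% flag threshold are illustrative assumptions, not details taken from the study or its QA software.

```python
import pandas as pd

def reject_ratios(csv_path: str) -> pd.DataFrame:
    """Compute monthly reject ratios (%) per department and workstation.

    Assumes a CSV export with columns: department, workstation, month,
    exposures, rejects (hypothetical schema for illustration only).
    """
    df = pd.read_csv(csv_path, parse_dates=["month"])
    grouped = df.groupby(
        ["department", "workstation", "month"], as_index=False
    )[["exposures", "rejects"]].sum()
    # Reject ratio (%) = rejects / total exposure events * 100
    grouped["reject_ratio_pct"] = 100 * grouped["rejects"] / grouped["exposures"]
    return grouped

if __name__ == "__main__":
    summary = reject_ratios("monthly_rejects.csv")
    # Flag months above a nominal threshold; 10% is a placeholder value,
    # not the WHO recommendation cited in the study.
    print(summary[summary["reject_ratio_pct"] > 10])
```

A calculation of this kind reproduces the workstation- and department-level breakdown described in the Results, but it cannot by itself distinguish positional from exposure errors, which is why the authors also recommend continued review of the rejected images.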