Article ID: 5884663
Journal: Journal of Clinical Anesthesia
Published Year: 2016
Pages: 6 Pages
File Type: PDF
Abstract

• Evaluation of anesthesiology resident applicants is challenging.
• We describe our departmental review process for the application file and interview.
• Objective scoring is done by 2 randomly assigned faculty.
• Inter-rater reproducibility was excellent between faculty.
• Reliability and reproducibility of our scoring system are reassuring.

Study Objective: To assess the reliability and reproducibility of a recently instituted anesthesiology resident applicant interview scoring system at our institution.
Design: Retrospective evaluation of 2 years of interview data collected with a newly implemented scoring system using randomly assigned interviewing faculty.
Setting: Interview scoring evaluations were completed as standard practice in a large academic anesthesiology department.
Subjects: All anesthesiology resident applicants interviewed during the 2013/14 and 2014/15 seasons by a stable cohort of faculty interviewers. Data collection was blinded for both interviewers and interviewees.
Interventions: None for the purposes of the study; collation of blinded data was already standard practice during the interview process and analysis.
Measurements: None specific to the study.
Main Results: Good inter-rater faculty reliability for day-of-interview scoring and excellent inter-faculty reliability for pre-interview application review.
Conclusions: A department-specific interview scoring system incorporating many elements beyond traditional standardized tests showed good to excellent reliability of faculty scoring for both the interview itself (including non-technical skills) and the application resume.
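The abstract reports inter-rater reliability between the 2 randomly assigned faculty raters but does not state which agreement statistic was used. As a purely illustrative sketch, the Python snippet below computes Cohen's kappa, one common measure of agreement between two raters on categorical scores; the rater names and scores are hypothetical and not taken from the study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical scores to the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of applicants scored identically by both raters.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal score frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical interview scores (1-5 scale) from two randomly assigned faculty raters.
faculty_1 = [4, 3, 5, 4, 2, 4, 3, 5]
faculty_2 = [4, 3, 4, 4, 2, 4, 3, 5]
print(f"kappa = {cohens_kappa(faculty_1, faculty_2):.2f}")
```

Other statistics, such as the intraclass correlation coefficient, are also commonly used when the scores are treated as continuous rather than categorical.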

Related Topics
Health Sciences › Medicine and Dentistry › Anesthesiology and Pain Medicine