Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
6154182 | Patient Education and Counseling | 2012 | 7 Pages |
Objective: The aim of our study was to present the structure, process and results of the objective structured video exam and the One-Station standardized patient exam that have been used to assess second-year medical students' communication skills.

Methods: Scores of 1137 students between 2007 and 2010 were analyzed. Means and standard deviations were calculated for scores and ratings. Internal consistency was assessed using Cronbach's alpha coefficient. Reliability and generalizability were analyzed using multivariate generalizability theory.

Results: Students' total and item scores on the objective structured video exam (60.5-68.8) were lower than on the One-Station standardized patient exam (90.4-96.6). Internal consistencies of both exams were moderate. Generalizability analysis and D-study results showed that both the objective structured video exam and the One-Station standardized patient exam need improvement.

Conclusion: Both exams require improvement measures, such as increasing the number of video cases or stations and further standardizing raters.

Practice Implications: This study may encourage medical teachers to assess the validity and reliability of written and performance exams on the basis of generalizability theory, and to identify feasible actions to improve assessment procedures by conducting a D-study.
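The abstract reports internal consistency via Cronbach's alpha and projects exam improvement via a D-study. As a minimal sketch of both computations, assuming hypothetical score data and variance components (not the study's actual figures): Cronbach's alpha from an examinee-by-item score matrix, and a D-study-style projection showing how a generalizability coefficient rises as the number of video cases or stations increases.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_examinees, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def g_coefficient(var_person, var_residual, n_conditions):
    """D-study projection of a generalizability coefficient:
    relative error shrinks as the number of cases/stations grows."""
    return var_person / (var_person + var_residual / n_conditions)

# Hypothetical 5-examinee x 4-item matrix (illustrative, not study data)
demo = [[3, 4, 3, 4],
        [2, 2, 3, 2],
        [4, 4, 4, 5],
        [1, 2, 1, 2],
        [3, 3, 4, 3]]
print(round(cronbach_alpha(demo), 3))        # internal consistency

# Assumed variance components: doubling the conditions raises G
print(round(g_coefficient(1.0, 2.0, 6), 3))  # with 6 cases/stations
print(round(g_coefficient(1.0, 2.0, 12), 3)) # with 12 cases/stations
```

The projection illustrates the abstract's conclusion: with the person variance held fixed, adding cases or stations dilutes measurement error and raises the coefficient, which is the rationale a D-study formalizes.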