Article ID | Journal | Published Year | Pages | File Type |
---|---|---|---|---|
6849183 | Studies in Educational Evaluation | 2014 | 16 | |
Abstract
Supporting users in interpreting assessment results is an important but underexposed aspect of validity. This study investigated how the score reports from the pupil-monitoring Computer Program LOVS can be redesigned to support users in interpreting pupils' test results. In several rounds of consultation and design with users and experts, alternative report designs were created using design principles from the literature and then field-tested. No clear differences in users' interpretation accuracy were found between the original and the redesigned reports. However, users' perceptions of the redesigned reports were predominantly positive. The authors emphasise the need to involve experts and users in the design process to ensure the validity of reports.
Related Topics
Social Sciences and Humanities
Social Sciences
Education
Authors
Fabienne M. Van der Kleij, Theo J.H.M. Eggen, Ronald J.H. Engelen