Article ID: 4935750
Journal: Assessing Writing
Published Year: 2017
Pages: 15
File Type: PDF
Abstract
Numerous researchers have explored the degree to which specific textual characteristics of student compositions are associated with high and low ratings, as well as differences in these relationships across subgroups of students (e.g., English language learners). These studies provide insight into rater judgments and the development of writing proficiency. However, the degree to which textual characteristics are associated with the psychometric quality of ratings is relatively unexplored. This study illustrates a procedure for exploring the influence of textual characteristics of essays on rating quality in the context of rater-mediated writing performance assessments, in order to gain a more complete understanding of rating quality. Two illustrative datasets are used that reflect writing assessments for native English speakers and English language learners. The Coh-Metrix software program was used to obtain measures of textual characteristics, and the Partial Credit model was used to obtain indicators of rating quality. The relationship between essay features and rating quality was explored using correlation and profile analyses. Results suggested that rating quality varies across essays with different features, and that the relationship between rating quality and essay features is unique to individual writing assessments. Implications are discussed as they relate to research and practice for rater-mediated writing assessments.
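The correlational step the abstract describes can be sketched in outline: each essay receives a set of textual-feature measures (of the kind Coh-Metrix produces) and an essay-level rating-quality indicator (e.g., a model-data fit statistic from a Partial Credit model analysis), and the two are then correlated. The sketch below uses randomly generated stand-in data; all feature names, the fit indicator, and the data are hypothetical assumptions for illustration, not the study's actual variables or results.

```python
# Hedged sketch of correlating essay-level textual features with an
# essay-level rating-quality indicator. All names and data are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_essays = 50

# Hypothetical textual features per essay (stand-ins for Coh-Metrix indices).
features = {
    "word_count": rng.normal(300, 50, n_essays),
    "lexical_diversity": rng.uniform(0.4, 0.9, n_essays),
    "referential_cohesion": rng.normal(0.0, 1.0, n_essays),
}

# Hypothetical rating-quality indicator per essay, e.g., a standardized
# fit statistic obtained from a Partial Credit model analysis.
rating_quality = rng.normal(0.0, 1.0, n_essays)

# Pearson correlation of each feature with the rating-quality indicator.
correlations = {
    name: float(np.corrcoef(values, rating_quality)[0, 1])
    for name, values in features.items()
}

for name, r in sorted(correlations.items()):
    print(f"{name}: r = {r:+.3f}")
```

A profile analysis, as mentioned in the abstract, would extend this by comparing the pattern of feature means across groups of essays classified by rating quality, rather than examining one feature at a time.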
Related Topics
Social Sciences and Humanities Arts and Humanities Language and Linguistics