Article ID: 4935751
Journal: Assessing Writing
Published Year: 2017
Pages: 21
File Type: PDF
Abstract
This study investigates a novel approach to formative writing assessment that evaluates students' writing skills at three levels of language (word, sentence, and discourse) using automated measures of word choice, syntax, and cohesion. Writing from students in Grades 6 and 8 (n = 240 each) was analyzed with Coh-Metrix. Multigroup confirmatory factor analysis evaluated a hypothesized three-factor levels-of-language model, and multigroup structural equation modeling determined whether these factors predicted performance on a state writing achievement test comprising a Direct Assessment of Writing (DAW) and an Editing and Revising test (ER). Results indicated that a subset of 9 Coh-Metrix measures successfully modeled the three latent levels-of-language factors at each grade level. Results also indicated that the DAW test was predicted by the latent Discourse factor and the ER test was predicted by the latent Discourse and Sentence factors. Findings provide a proof of concept for automated formative assessment grounded in a levels-of-language framework. Furthermore, although not the primary goal of the study, the results may lay the groundwork for new levels-of-language detection algorithms that could be incorporated into automated writing evaluation software to expand combined automated and teacher assessment and feedback approaches.
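To make the analytic design concrete, the sketch below shows how a three-factor measurement model with the reported structural paths (DAW regressed on Discourse; ER regressed on Discourse and Sentence) could be specified in Python using the semopy library's lavaan-style syntax. The indicator names (w1-w3, s1-s3, d1-d3), the data file, and the single-group setup are placeholders for illustration; they are not the study's actual Coh-Metrix variables, and the multigroup (Grade 6 vs. Grade 8) comparison is omitted.

import pandas as pd
from semopy import Model

# Hypothetical lavaan-style specification mirroring the abstract's design:
# three latent levels-of-language factors measured by 9 observed indicators,
# with DAW and ER scores regressed on the latent factors.
MODEL_DESC = """
# measurement model (placeholder indicator names)
Word      =~ w1 + w2 + w3
Sentence  =~ s1 + s2 + s3
Discourse =~ d1 + d2 + d3

# structural model (paths reported as significant in the abstract)
DAW ~ Discourse
ER  ~ Discourse + Sentence
"""

# Hypothetical data file containing the 9 indicators plus DAW and ER scores.
data = pd.read_csv("levels_of_language_scores.csv")

model = Model(MODEL_DESC)
model.fit(data)            # estimate loadings and regression paths
print(model.inspect())     # parameter estimates, standard errors, p-values

A multigroup version of this analysis would fit the same specification separately by grade (or with equality constraints across grades) to test whether the factor structure holds for both Grade 6 and Grade 8 writers.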
Related Topics
Social Sciences and Humanities; Arts and Humanities; Language and Linguistics