Article ID: 344278
Journal: Assessing Writing
Published Year: 2013
Pages: 22
File Type: PDF
Abstract

In this paper, I describe the design and evaluation of automated essay scoring (AES) models for an institution's writing placement program. Information was gathered on admitted students' writing performance at a science and technology research university in the northeastern United States. At the beginning of the semester, first-year students (N = 879) wrote timed essays on two persuasive prompts within the Criterion® Online Writing Evaluation Service. AES models were built and evaluated for a total of four prompts. AES models meeting recommended performance criteria were then compared to standardized admissions measures and locally developed writing measures. Results provide evidence to support the use of Criterion as part of the placement process at the institution.
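The abstract does not name the performance criteria applied to the AES models; in AES evaluation, human–machine agreement is commonly summarized with quadratic weighted kappa alongside exact-agreement rates. A minimal sketch of that statistic (the function name, the 1–6 score scale, and the sample scores are illustrative assumptions, not data from this study):

```python
from collections import Counter

def quadratic_weighted_kappa(human, machine, min_rating, max_rating):
    """Quadratic weighted kappa between two raters on an ordinal score scale."""
    n = max_rating - min_rating + 1
    # Observed agreement matrix: counts of (human score, machine score) pairs
    observed = [[0] * n for _ in range(n)]
    for h, m in zip(human, machine):
        observed[h - min_rating][m - min_rating] += 1
    total = len(human)
    # Marginal score histograms for the chance-expected matrix
    hist_h = Counter(h - min_rating for h in human)
    hist_m = Counter(m - min_rating for m in machine)
    num = 0.0  # weighted observed disagreement
    den = 0.0  # weighted chance-expected disagreement
    for i in range(n):
        for j in range(n):
            weight = ((i - j) ** 2) / ((n - 1) ** 2)
            expected = hist_h[i] * hist_m[j] / total
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

# Illustrative human vs. machine scores on a hypothetical 1-6 scale
human_scores = [1, 2, 3, 4, 5, 6, 2, 3]
machine_scores = [1, 2, 3, 4, 5, 6, 3, 3]
qwk = quadratic_weighted_kappa(human_scores, machine_scores, 1, 6)
```

Because the weights grow quadratically with the distance between scores, large human–machine discrepancies are penalized far more heavily than adjacent-score disagreements, which is why this statistic is preferred over simple percent agreement for ordinal essay scores.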

► Design and evaluation of automated essay scoring models for an institution's writing placement program are described.
► Performance of customized prompt-specific and preexisting generic models is compared.
► Association between AES placement test scores and locally developed measures (portfolio scores) is examined.

Related Topics
Social Sciences and Humanities Arts and Humanities Language and Linguistics
Authors