Article ID: 344378 | Journal: Assessing Writing | Published Year: 2009 | Pages: 16 | File Type: PDF
Abstract

Establishing the score or the placement as the first priority in a writing assessment leads to more reductive forms of assessment. However, if the prompts used in a direct test of writing were generative – that is, if they asked test-takers to analyze their own experiences as writers or learners, for example – the resulting texts would yield useful data beyond the act of producing a ranking or a judgment. Washington State University developed and trialled such a prompt, one that asks students to reflect on their curricular and extra-curricular learning opportunities in relation to the university's Six Learning Goals for the Baccalaureate. The results were texts that demonstrate, among other things, which goals are (and are not) effectively distributed across the curriculum. Using these texts to address outcomes assessment on a university-wide level makes the assessment more valuable than it would be if it merely produced a set of placements. In addition, the richness of the student texts has provided a valuable resource for graduate-level research that is broader and more meaningful than simply training future raters of writing. Further, the raw data have proved accessible to researchers with wide-ranging theoretical lenses, meaning that the data yielded by an assessment can become a significant resource for research beyond the needs of the assessment program alone. Given the need for university assessment programs to compete for ever-scarcer resources, exploring the potential of the generative prompt seems in our enlightened self-interest.

Related Topics
Social Sciences and Humanities Arts and Humanities Language and Linguistics