Article ID Journal Published Year Pages File Type
360176 Journal of English for Academic Purposes 2015 15 Pages PDF
Abstract

•Textual features of test essays were compared to those of successful student disciplinary writing.
•Test essays differed systematically from most disciplinary writing on four dimensions.
•Topic had a greater effect than native language status on dimension scores.
•Second language proficiency had a systematic effect on dimension scores.

One important validity question in writing assessment is the degree to which performance on a timed writing test can predict performance on future academic writing. Recent developments in corpus linguistics allow scholars to describe in detail the linguistic features of a variety of academic texts, including genres of disciplinary writing and writing on essay tests, and these descriptions can help answer this question. The purpose of this paper is to compare the linguistic features of test essays written by native and non-native speakers with a comparison corpus of successful student writing across a range of disciplines, using Biber's (1988) multidimensional analysis framework. Essays written on two different test prompts were analyzed along dimensions of successful student writing identified in an analysis of the Michigan Corpus of Upper-level Student Writing (MICUSP) conducted by Hardy and Römer (2013). Results demonstrated that test essays differed in significant ways from disciplinary writing, particularly writing in the natural and health sciences. Furthermore, language background (native vs. non-native), prompt, and language proficiency (i.e., essay scores) were systematically related to scores on all four dimensions. Implications for pedagogy and language assessment are discussed.
