Article code: 4937462
Journal code: 1434615
Publication year: 2017
English article: 11 pages, PDF
Full-text version: free download
English title of the ISI article
Comparative evaluation of automated scoring of syntactic competence of non-native speakers
Persian translation of the title
ارزیابی تطبیقی امتیازدهی خودکار به صلاحیت نحوی گویشوران غیربومی
Keywords
Automated scoring; Automatic speech recognition; English language assessment
Related subjects
Engineering and Basic Sciences; Computer Engineering; Computer Science (Software)
English abstract
Syntactic competence, especially the ability to use a wide range of sophisticated grammatical expressions, represents an important aspect of communicative acumen. This paper explores how best to evaluate the syntactic competence of non-native speakers in an automated way. Using spoken responses of test takers participating in an English practice assessment, three classes of grammatical features are compared in an end-to-end assessment system: features based on n-grams of part-of-speech (POS) tags, features based on various clause types, and features based on various phrases. Feature correlations with human proficiency scores show that the POS features and the phrase features correlate most strongly with human scores. When these three classes of grammar features are included in a baseline scoring model that measures various aspects of spoken proficiency excluding grammar, we find substantial increases in agreement between machine and human scores. Finally, we discuss the broader implications of our results for the design of automatic scoring systems for spoken language.
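The abstract describes the POS n-gram features only at a high level and does not publish any code. As a rough illustration of the idea, the sketch below (not from the paper; NLTK and SciPy are assumed to be available, and the transcripts, scores, and chosen trigram are hypothetical) extracts part-of-speech trigram frequencies from short transcripts and correlates one such feature with human proficiency scores.

```python
# Illustrative sketch only: the paper does not release its feature-extraction
# code, and the transcripts, scores, and chosen POS trigram below are invented
# for demonstration. Assumes the NLTK tokenizer and POS tagger models have
# already been downloaded.
from collections import Counter

import nltk
from scipy.stats import pearsonr


def pos_ngram_counts(transcript: str, n: int = 3) -> Counter:
    """Count n-grams of part-of-speech tags in a transcript."""
    tags = [tag for _, tag in nltk.pos_tag(nltk.word_tokenize(transcript))]
    return Counter(nltk.ngrams(tags, n))


def pos_ngram_frequency(transcript: str, target: tuple, n: int = 3) -> float:
    """Relative frequency of one POS n-gram; a toy stand-in for one feature."""
    counts = pos_ngram_counts(transcript, n)
    total = sum(counts.values())
    return counts[target] / total if total else 0.0


# Hypothetical (ASR transcript, human holistic score) pairs.
responses = [
    ("I think that the lecture explains why the plan failed", 4.0),
    ("the man go to store and buy apple", 2.0),
    ("although it was raining she decided to walk because she enjoys it", 4.5),
    ("he like music very much", 2.5),
]

# Example feature: frequency of the trigram IN DT NN (e.g. "that the lecture"),
# a rough proxy for subordinate-clause use.
values = [pos_ngram_frequency(text, ("IN", "DT", "NN")) for text, _ in responses]
scores = [score for _, score in responses]
r, _ = pearsonr(values, scores)
print(f"Pearson r between the feature and human scores: {r:.2f}")
```

In the study itself, many such grammar features would be computed over ASR output and combined with non-grammar features in the scoring model; this sketch only illustrates the feature-to-human-score correlation check at the smallest possible scale.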
Publisher
Database: Elsevier - ScienceDirect
Journal: Computers in Human Behavior - Volume 76, November 2017, Pages 672-682
Authors