Bedömarvariation: Balansen mellan teknisk och hermeneutisk rationalitet vid bedömning av skrivprov [Rater variation: The balance between technical and hermeneutic rationality in the assessment of writing tests]
Örebro universitet.
Örebro universitet. ORCID iD: 0000-0002-3176-7226
2014 (Swedish). In: Språk & Stil, ISSN 1101-1165, no. 24, pp. 133–165. Article in journal (Refereed). Published.
Abstract [en]

It is well known from studies of inter-rater reliability that assessments of writing tests vary. To discuss this rater variation, we start from two research questions: 1. How can rater variation be understood from a professional, i.e. teacher, perspective? 2. What characterises Swedish (mother-tongue) teachers' assessments of writing tests? The first question is addressed in a meta-study of previous research; the second in a study of 14 Swedish teachers' ratings of texts from a national written composition test in upper secondary school. The results show that teachers of the same subject assess more consistently, i.e. show less rater variation, than other groups. It is also clear that writing tests are notoriously difficult to rate: the correlation coefficients very rarely reach the desirable 0.7, a level at which roughly 50 % of the variance can be explained by shared norms. Another main result concerns criteria and tools for assessment. Such tools should be grounded in teachers' professional expertise, in their expectations for different levels of performance; our study reveals several situations where teachers' professional expertise clashes with assessment criteria. The article concludes that valid assessment of high-stakes tests must combine a technical rationality, i.e. grading should be predictable from rater to rater, with a hermeneutic rationality, i.e. grading must be based on teachers' professional judgment.

Place, publisher, year, edition, pages
Uppsala: Adolf Noreen-Sällskapet för Svensk Språk- och Stilforskning, 2014. No. 24, pp. 133–165.
Keywords [en]
inter-rater reliability, interpretative community, writing assessment, Swedish national writing tests, assessment criteria, true score
National Category
Specific Languages
URN: urn:nbn:se:sh:diva-28584. Scopus ID: 2-s2.0-84922876854. OAI: diva2:861733.
Available from: 2015-10-19. Created: 2015-10-16. Last updated: 2015-10-19. Bibliographically approved.

Open Access in DiVA

No full text

Author/editor: Ledin, Per