Volume 13 Issues 1-2 (2024-06-30)
Volume 11 Issues 1-4 (2022-12-31)
Volume 10 Issues 1-4 (2021-12-31)
Volume 9 Issues 1-3 (2020-09-30)
Volume 8 Issues 3&4 (2019-12-31)
Volume 7 Issues 1&2 (2018-06-30)
Volume 6 Issues 3&4 (2017-12-31)
Volume 6 Issues 1&2 (2017-06-30)
Volume 5 Issues 3&4 (2016-12-31)
Volume 5 Issues 1&2 (2016-06-30)
Volume 4 Issues 3&4 (2015-12-31)
The world responds to international assessments such as PISA with an obsession over rankings while ignoring the scores on which those rankings are based. The assumption that a better ranking indicates better performance is doubtful, because a country's rank depends on how other countries perform. A re-analysis of the PISA 2009 and 2012 data shows that a sizable number of countries suffer from score-rank inconsistency: their scores increased while their rankings fell. Countries wishing to improve their education systems need to study changes in scores rather than changes in ranking.
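A minimal sketch of the score-rank inconsistency the abstract describes, using invented numbers rather than actual PISA results: a country's mean score rises between two cycles, yet its rank falls because other countries improve by more.

```python
# Hypothetical scores for three countries across two assessment cycles.
scores_2009 = {"A": 500, "B": 495, "C": 490}
scores_2012 = {"A": 505, "B": 512, "C": 511}  # A improves by 5, but less than B and C

def rank(scores):
    """Map each country to its rank (1 = highest score)."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {country: i + 1 for i, country in enumerate(ordered)}

r09, r12 = rank(scores_2009), rank(scores_2012)

for country in scores_2009:
    score_up = scores_2012[country] > scores_2009[country]
    rank_down = r12[country] > r09[country]  # larger rank number = worse position
    if score_up and rank_down:
        print(f"{country}: score rose {scores_2009[country]} -> {scores_2012[country]}, "
              f"but rank fell from {r09[country]} to {r12[country]}")
```

Running this prints country A as score-rank inconsistent, which is why the abstract argues that changes in score, not changes in rank, are the meaningful signal.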
Using generalizability theory as a theoretical framework, this study investigated the impact of raters’ educational background on the assessment of K-12 ESL students’ writing. Twenty teacher candidates (ten TESOL majors and ten non-TESOL majors) from universities in western New York and southern Ontario participated in the study. The participants were asked to rate three ESL essays holistically on a 1 to 10 point scale (1 being the lowest and 10 the highest, with half points permitted). The results indicate that raters’ TESOL-related educational background did affect their rating of ESL essays: the TESOL teacher candidates marked the three essays more consistently and reliably than their non-TESOL counterparts. Important implications for policy makers are discussed.
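As a rough illustration of what "rating more consistently" means, the sketch below compares how tightly each rater group's scores cluster on each essay. The study itself uses generalizability theory (variance-component analysis); this simpler spread-based proxy and all of the ratings shown are assumptions for illustration only, not the study's data or method.

```python
from statistics import pstdev, mean

# rows = raters, columns = the three essays; holistic scores on a 1-10 scale
# (all values invented for illustration)
tesol_ratings = [
    [6.0, 7.5, 4.0],
    [6.5, 7.0, 4.5],
    [6.0, 7.0, 4.0],
]
non_tesol_ratings = [
    [5.0, 8.5, 3.0],
    [7.5, 6.0, 5.5],
    [6.0, 9.0, 2.5],
]

def mean_within_essay_spread(ratings):
    """Average standard deviation of ratings per essay; lower = more consistent."""
    n_essays = len(ratings[0])
    spreads = [pstdev([rater[e] for rater in ratings]) for e in range(n_essays)]
    return mean(spreads)

print("TESOL group spread:    ", round(mean_within_essay_spread(tesol_ratings), 2))
print("non-TESOL group spread:", round(mean_within_essay_spread(non_tesol_ratings), 2))
```

A smaller spread for the TESOL group would correspond, loosely, to the greater consistency the abstract reports; generalizability theory makes this precise by partitioning score variance into essay, rater, and interaction components.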