Test Scaling and Value-Added Measurement
As currently practiced, value-added assessment relies on a strong assumption about the scales used to measure student achievement, namely that these are interval scales, with equal-sized gains at all points on the scale representing the same increment of learning. Many of the metrics in which test results are expressed do not have this property (e.g., percentile ranks, normal curve equivalents). However, this property is claimed for the scales obtained when tests are scored according to Item Response Theory (IRT).
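To see why percentile-based metrics fail the interval-scale requirement, consider the following minimal sketch (illustrative only; the normality assumption and the specific numbers are not from the paper). It converts equal percentile gains at two points of the distribution into latent-trait units:

```python
# Sketch: why percentile ranks are not an interval scale.
# Assumes latent achievement is standard normal; numbers are illustrative.
from scipy.stats import norm

# A 10-point percentile gain near the median...
z_mid = norm.ppf(0.60) - norm.ppf(0.50)   # ~0.25 SD of latent achievement

# ...versus the same 10-point percentile gain in the upper tail.
z_tail = norm.ppf(0.95) - norm.ppf(0.85)  # ~0.61 SD of latent achievement

print(f"10-percentile gain at the median: {z_mid:.2f} SD")
print(f"10-percentile gain in the tail:   {z_tail:.2f} SD")
```

The same nominal gain corresponds to very different amounts of latent growth, so treating such metrics as interval-scaled is unwarranted.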
This claim requires that examinees and test items constitute, in the terminology of representational measurement theory, a conjoint structure. Unfortunately, it is difficult to confirm that this condition is met. Moreover, end users typically lack access to the item-level data needed to test these assumptions themselves. The best they can do is check the plausibility of the resulting scales. On this count, IRT scales often do poorly: reasonable rescalings have a substantial impact on students' measured growth.
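A minimal sketch, with invented scores and an arbitrary order-preserving transform, of how such a rescaling can change which student appears to have grown more:

```python
# Sketch: an order-preserving rescaling can reverse which student "grew more".
# Scores and the exponential transform are made up for illustration.
import numpy as np

pre  = np.array([0.0, 1.5])   # students A, B: fall IRT scale scores
post = np.array([1.0, 2.3])   # spring IRT scale scores

gain_original = post - pre                  # A gains 1.0, B gains 0.8
gain_rescaled = np.exp(post) - np.exp(pre)  # A gains ~1.7, B gains ~5.5

print("original scale:", gain_original)    # A shows more growth
print("rescaled (exp):", gain_rescaled)    # B shows more growth
```

Because the transform preserves every student's rank, nothing in the ordinal information distinguishes the two scalings, yet the growth comparison flips.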
Methods of ordinal data analysis can be employed instead, on the weaker assumption that IRT scales permit us to rank students. Instead of comparing the mean achievement of a teacher's students with that of students taught by a (hypothetical) average teacher, ordinal analysis asks what fraction of the former outperform the latter. The feasibility of such methods for value-added analysis is demonstrated, and value-added estimates are shown to be sensitive to the choice between ordinal methods and conventional techniques.
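A sketch of this ordinal statistic, the fraction of student pairs in which the teacher's student outscores a comparison student (equivalent to a Mann-Whitney probability of superiority); the scores below are invented:

```python
# Sketch of the ordinal comparison described above.  A value of 0.5 means
# parity with the comparison group; larger values favor the teacher.
import numpy as np

def prob_superiority(treated, comparison):
    """P(random treated student outscores random comparison student),
    counting ties as half.  Uses only the rank order of the scores."""
    t = np.asarray(treated)[:, None]     # column vector of treated scores
    c = np.asarray(comparison)[None, :]  # row vector of comparison scores
    return np.mean((t > c) + 0.5 * (t == c))  # average over all pairs

teacher_scores    = [512, 498, 530, 521, 507]
comparison_scores = [505, 500, 515, 495, 510]

print(prob_superiority(teacher_scores, comparison_scores))  # 0.72
```

Because the statistic depends only on rankings, it is invariant to any order-preserving rescaling of the test metric.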
Clearly, if IRT scores constitute an interval scale, ordinal methods throw away valuable information. Practitioners of value-added measurement should ask themselves, however, whether they are confident enough in the metric properties of these scales to attribute differences between regression-based estimates of value added and estimates based on ordinal analysis to the superiority of the former.