[From Rick Marken (2010.09.02.1230)]
The reason I asked Richard Kennaway for his correlation data is that there is a bit of a controversy going on in LA these days regarding teacher performance measures. The LA Times has been running a big story over the last couple of weeks on how the LAUSD (LA School District) has measures of each teacher’s “value added” (VA) but doesn’t use them. The VA measure for a teacher is the difference in the average Math and English test scores, from the prior year to the present year, of the students in that teacher’s class. So, for example, if the 30 students in a teacher’s class had an average Math score of 60 from the prior year (going into the teacher’s class) and a score of 70 at the end of the class year with the teacher, then the teacher’s VA score is +10.
The LA Times reporters got hold of this data and decided that it should be made public so that parents and administrators could know how effective a teacher is. The teachers’ union, of course, went ballistic and tried to prevent the Times from making the data public. But the Times went ahead and published it a few days ago, along with a helpful list of the 100 (out of 6,000) best teachers in terms of VA scores.
I got into this because my racquetball partner is a retired teacher and a right-wing libertarian who hates unions (I guess I’m liberal enough to put up with him; actually, we rarely discuss anything other than the score – mine is typically higher). So he’s all for using the VA scores. Some of the research on VA scores was done by people at RAND, and he asked if I knew one of the main guys and, sure enough, I did. So I got in touch with my RAND friend because I was interested in seeing what the reliability of the VA scores was. It seemed to me that it would be ridiculous to use the VA scores as a measure of individual teacher effectiveness unless the reliability of the scores (in terms of the scores for year t being correlated with the scores for year t+1) was quite high. He did manage to lead me to some reliability measures for VA scores and, as I suspected, the average reliability of the scores (over 28 different studies) is .35.
According to Kennaway’s tables, with a correlation of around .35 your probability of correctly guessing the sign of a VA score at time t+1 using the person’s score at time t is about .6, just a tad better than flipping a coin and saying “heads” = “+” and “tails” = “-”. So what the Times did was publish numbers for 6,000 teachers that purport to be measures of each teacher’s “effectiveness”, and they would have done little worse had they instead published the last two digits of each teacher’s Social Security number.
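For anyone who wants to check that figure rather than read it off Kennaway’s tables: if the year-t and year-t+1 scores are bivariate normal with correlation r (an assumption on my part, though it is the standard one behind such tables), the probability that they share the same sign has the closed form 1/2 + arcsin(r)/pi. Here is a minimal Python sketch of the closed form plus a Monte Carlo check:

```python
import math
import random

def sign_agreement_prob(r):
    """P(two bivariate-normal variables with correlation r share a sign)."""
    return 0.5 + math.asin(r) / math.pi

# Closed-form value for the reported reliability of .35
p = sign_agreement_prob(0.35)  # roughly .61 -- "just a tad better than a coin"

# Monte Carlo check: simulate correlated year-t and year-t+1 scores
random.seed(1)
r, n, agree = 0.35, 200_000, 0
for _ in range(n):
    x = random.gauss(0, 1)                                # year-t score
    y = r * x + math.sqrt(1 - r * r) * random.gauss(0, 1) # year-t+1 score
    if (x > 0) == (y > 0):
        agree += 1

print(round(p, 3), round(agree / n, 3))
```

Both numbers come out near .61, which matches the “about .6” above; with r = 0 (the Social Security digits), the same formula gives exactly .5.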
I think the whole discussion of VA scores, which has been mainly about their validity (are they really measures of teacher effectiveness?), should just stop now. These measures are useless as measures of anything about the teacher. But maybe “useless” is not quite the right word. What would you call this? “Criminal”? “Stupid”?
I ask because I am planning to write a letter to the LA Times explaining that what they did by publishing the VA scores for 6,000 4th- and 5th-grade teachers as an indication of their “effectiveness” (to help parents pick the ‘good’ teachers, I suppose) was ___________. You fill in the blank.
Best
Rick
--
Richard S. Marken PhD
rsmarken@gmail.com
www.mindreadings.com