In fact, the only people to "benefit" from this system are private test designers like Pearson, who are being handed not just lucrative contracts but also terabytes of data to mine for new products, and advocates of firing as many teachers as possible based upon student test scores.
The legislation "evolved," through hard work by the AEA, from an original mandate that 50 percent of a teacher's evaluation be based upon student test scores. Instead, the key features of the new bill include what the AEA called "a statewide uniform system of teacher evaluation (beginning in the 2014-15 school year) that emphasizes quality assurance and teacher growth"....
For example, it is impossible to evaluate a music teacher or an art teacher based upon student test scores in music and art.
In New York State, for example, 40 percent of teachers' yearly evaluations will be based upon student test scores (New York Governor's Press Office 2012).
Attempts to address teacher evaluation have led to a dependence upon student test scores.
The NEA analysis of the proposed legislation claimed it favored "1) establishing a teacher evaluation system using gains in student test scores; 2) allowing 'community stakeholders' to have a role in designing teacher evaluation systems; and 3) providing merit pay for teachers based upon gains in student test scores."
Since we can predict, before the school year even begins, students' expected test scores based upon their previous scores and teacher effectiveness, we can simply weed out all the students who would fail even if assigned to a class with an effective teacher.
It is based upon the idea of being able to predict test scores for students.
But since then, the high-stakes testing movement has blown up: with increasing frequency, student scores on standardized exams are tied to teacher, school, and district evaluations, upon which rewards and punishments are meted out.
Because student achievement levels vary upon entry across schools, student growth measures are better measures of the impact of a school on student learning than a proficiency rate or average scale test score.
Given the growing understanding that value-added measures (VAMs) of teacher effectiveness rely upon tests not designed to detect teacher input, are highly unstable, and cannot account for teacher impact on variability among student scores, it is quite apt that Dr. Audrey Amrein-Beardsley of Arizona State University, a leading researcher on value-added measures, described the proposal as going from "bad to idiotic."
While the NAEP tracks socioeconomic data, it does not weight test scores based upon a student's socioeconomic background.
"From the [New York] study, simply saying we're going to pay people based upon kids' test scores does not work to move student achievement."
Although the student's detailed score will be mailed to them, they will receive a preliminary notification as to whether they have passed or failed the test upon completing the examination.
"Third, test scores are currently important factors in law school ranking calculations, which are heavily relied upon by students in deciding where to attend."