Justin, a middle school math teacher, shared his concerns about district-determined measures (DDMs) with me this week. I asked why he supports his district’s purchase of an off-the-shelf product to measure student growth.
Me: Why not create a DDM to measure what you care about, like the math practice standards from the Common Core?
Justin: I worry about the consequences of creating a DDM without a high level of validity and reliability.
Me: If your students’ growth ends up being low due to a flawed measure, won’t you be able to explain that to your evaluator?
Justin: Yes, with my current evaluator, but what if that person goes elsewhere? How can I be sure that his replacement will be equally understanding and trustworthy? And if I am given a low impact rating, might it become difficult to get a job in another district?
And therein lies the nature of the minefield. Yes, the consequences of a low impact rating are not huge (moving from a two-year self-directed educator plan to a one-year plan), but what hard-working, caring teacher wants to be unfairly tagged with a low rating? And the long-range impact on a teacher’s ability to find a new job may be immense.
What’s to be done? In my next post I’ll propose some of the long-range benefits to be gained from implementing DDMs.
As we build measures of student growth with teachers, we develop tools to guide the process. This DDM Organizer-1 leads educators in selecting the topic, format, and audience they’ll address with their DDM. Teachers and administrators can use this DDM Checklist to verify that an assessment is complete.
In the “amusing comparisons” category, I present this post that compares a new administrator’s approach at Hogwarts with the ed reform agenda.
Craig Waterman, DESE’s Assessment Coordinator for district determined measures, explained to me new guidance on the use of student growth percentiles (SGPs) as measures of student growth.
- Educators required to use SGP data as a growth measure (about 14% of all teachers):
  - Teachers with 20 or more students who take the MCAS and have MCAS data from the previous year (Ex: a fourth grade classroom teacher)
  - Teachers with 30 or more students teaching one MCAS-tested subject across multiple grades (Ex: a special educator)
- Educators not required to use SGP, but who may be required to by the district (about 6%):
  - Teachers with between 8 and 19 students (“SGP is more reliable than a DDM.”)
- Every teacher with the same job description uses the same measure. (Ex: one math teacher has 21 students, another has 19; they should both use SGP or both use a DDM.)
- The spread defining low, moderate, and high growth will differ from the one currently used at the school and district level:
  - Instead of a range of 40 to 60 defining moderate growth, the range will be 35 to 65. This will yield a teacher distribution comparable to the distribution currently found in SGP ratings of schools and districts.
- Districts will wait to file their impact ratings until October, when SGP data become available, though performance ratings will still be due in June.
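For readers who like to see the arithmetic spelled out, the widened cutoffs above can be expressed as a small sketch. The 35 and 65 thresholds come from the guidance described above; the function name and the choice to treat the boundary values as moderate are my own assumptions.

```python
def classify_sgp(median_sgp):
    """Classify a median student growth percentile under the widened
    35-65 "moderate" band described in the guidance above.

    Boundary handling (35 and 65 counting as moderate) is an
    assumption, not something the guidance specifies.
    """
    if median_sgp < 35:
        return "low"
    elif median_sgp <= 65:
        return "moderate"
    else:
        return "high"

# A median SGP of 38 would have fallen below the old school/district
# cutoff of 40; under the new band it counts as moderate growth.
print(classify_sgp(38))  # moderate
print(classify_sgp(70))  # high
```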
Questions? You can post them as comments here, or email them to Damon Douglas using the box at left!
As part of its evaluation system, Tennessee is encouraging art teachers to submit portfolios of selected students’ work. Trained evaluators (drawn from the teacher ranks) then analyze the portfolios for evidence of student growth. This evidence becomes a piece of the art teacher’s impact rating.