Craig Waterman, DESE’s expert on measuring student growth, will hold Office Hours at CES in Northampton from 2:00 to 4:00 p.m. on April 28. He’ll answer all your questions about District-Determined Measures. If you’re interested, please register by April 23 with Jill Robinson: firstname.lastname@example.org.
Sophia, a speech and language pathologist, wrote seeking help creating a measure of student growth. I forwarded her DESE’s latest Implementation Brief for SISP, designed to support specialized instructional support personnel in building DDMs.
In her response, Sophia questioned how she could follow the Implementation Brief’s advice to conduct a peer review or administer checklists to measure the quality of collaboration. She wrote, “When would [I] have the time to collect the data outside the workday? [I am] not given time in the schedule for this.” Note: Sophia has a caseload of 39 students.
As an alternative, I sent Sophia this plan to build a DDM based on assessments and data she is already collecting.
1) Assemble a group of students working on the same sort of challenge, say articulation. Use your most populous service area to create as large a cohort as possible.
2) Identify a screening tool that measures how these students are functioning (preferably one you are already using).
3) Select a time period (one school year, a half-year, or whatever best aligns with your work). Administer the screen at the beginning and end of that time period (as you are probably already doing).
4) Determine one year’s expected growth for a student as measured by this screen (you may already define this in the IEP). Students who achieve that expected growth show moderate growth; those who achieve significantly less show low growth; and those who achieve significantly more show high growth.
Even though you already expect different amounts of growth from each of your students (based on the nature of their identified disability), track growth for each student and label it low, moderate, or high based on your individual expectations for them. Essentially, you’d be taking a moment at the end of the year to step back from your work, look at a cohort of your students, and make a general statement about their rates of growth. Sticking with data you already gather helps avoid an undue extra burden.
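For readers who keep their screening data in a spreadsheet or a simple script, the rating step above can be sketched in a few lines of code. This is only an illustration: the student names, scores, expected-growth values, and the 25% tolerance band are all hypothetical assumptions, not DESE-prescribed cutoffs. Each district would set its own definition of “significantly less” and “significantly greater.”

```python
# A minimal sketch of the growth-rating step. Assumes each student has a
# beginning-of-period score, an end-of-period score, and an individually
# expected amount of growth (e.g., drawn from the IEP). The 25% band that
# defines "significantly" more or less is an illustrative assumption.

def rate_growth(start, end, expected, band=0.25):
    """Label one student's growth low/moderate/high relative to expectation."""
    actual = end - start
    if actual < expected * (1 - band):   # significantly less than expected
        return "low"
    if actual > expected * (1 + band):   # significantly more than expected
        return "high"
    return "moderate"                    # within the tolerance band

# Hypothetical articulation-screen scores for a small cohort
cohort = [
    {"name": "A", "start": 40, "end": 52, "expected": 10},
    {"name": "B", "start": 35, "end": 38, "expected": 10},
    {"name": "C", "start": 50, "end": 68, "expected": 12},
]

ratings = {s["name"]: rate_growth(s["start"], s["end"], s["expected"])
           for s in cohort}
print(ratings)  # {'A': 'moderate', 'B': 'low', 'C': 'high'}
```

Note that the comparison is made against each student’s own expected growth, which is what lets one cohort-level statement respect the different expectations you hold for different students.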
On Red Sox Opening Day, we look to baseball for guidance. In calling balls and strikes, umpires address calibration challenges similar to those facing school principals charged with evaluating teachers. To gain greater inter-rater reliability, umpires’ performances are compared to camera-based compu-scores, while school evaluators collaboratively score video clips of teachers in action.
This New York Times article analyzes patterns in umpires’ strike- and ball-calling errors: “Baseball insiders have long suspected what our research confirms: that umpires tend to make errors in ways that favor players who have established themselves at the top of the game’s status hierarchy. But our findings are also suggestive of the way that people in any sort of evaluative role — not just umpires — are unconsciously biased by simple ‘status characteristics.’ Even constant monitoring and incentives can fail to train such biases out of us.”
If we apply this theory to teacher evaluation, are school evaluators more likely to give favorable ratings to teachers with greater status in the building?
With a lot of help from my friends, I’ve built a checklist to guide districts and schools implementing measures of student growth.
The checklist, attached here as Checklist for Effective DDM Implementation, tracks tasks to accomplish now, a culture to grow, and long-term tasks to attend to. Please feel free to suggest revisions and additions by commenting here.
Districts seeking support with DDM implementation are working with CES in the following formats:
In February, a team of five CES presenters introduced DDMs to Gill-Montague Regional School teachers grouped by content area. Working with teachers grouped by subject allows the presenter to gear the material to the audience and present DDM examples in teachers’ disciplines. As the work has progressed, the district has invited presenters back to support specific teacher groups. CES has done similar work for Frontier Regional Schools and will be working with Mahar Regional Schools.
At The Collaborative
Many local districts sent art and music teachers to a series of trainings that led educators through the process of building and vetting DDMs.
A DESE grant brings six districts together with CES to develop DDMs in two areas: tenth grade writing to text and seventh grade technology. This work includes a focus on the needs of struggling learners.
Developing Example DDMs
In response to requests from districts, we’re developing DDM examples in subject areas that present assessment challenges. For example, we’ve built an assessment* that measures a student’s capacity for self-regulation. This example DDM can serve as a model for school counselors interested in measuring other components of social-emotional learning.
If you’re interested in exploring any of these supports for your district, or you’d like to see the DDM for measuring a student’s growth in self-regulation, please post a comment or use the box at right to email me.