I recommend taking a look (if you haven’t already) at the Massachusetts Teachers Association’s DDM Guidance. It explains many complex topics clearly and explores new ground as well: see the crosswalk between assessment type and Bloom’s taxonomy on page 9!
Also worth a visit is DESE’s May report on Rating Educator Impact, which includes scenarios illustrating how professional judgment can inform the rating decision.
I met with eight districts earlier this week to discuss their plans for using student feedback. A meeting summary appears below.
Student Feedback Meeting Summary
How might CES support districts in gathering and using student and staff feedback?
July 8, 2014; Attending: Easthampton, Frontier, Hatfield, Hadley, Northampton, Pioneer Valley, South Hadley, Union 28
Sense of the Meeting: participating districts (8) intend to pursue one of these approaches:
- Use the state student survey short form (3 districts)
- Roll out a personalized approach to gathering and using student feedback (3)
- Wait to see what develops while concentrating on other ed eval components (2)
Student Feedback Next Steps:
- Once districts have studied the state’s student surveys (available here), convene another inter-district meeting to discuss and assess interest in using them. (August or September)
- Maintain communication with Panorama Education in order to:
- book student feedback PD for teachers (Fall ‘14) and district leaders (through the county superintendent steering committees, Fall ‘14)
- pursue cost-saving offer to serve aggregations of small districts (July)
- Check in with districts rolling out local solutions — some districts expressed desire for analysis support (ongoing, FY15)
CES Support for Districts (proposed)
- implement surveys via Google Forms or SurveyMonkey, using the state instrument or an adaptation of it
- share effective practices for using student feedback to evaluate teachers
- identify sample contract bargaining language about how student feedback data will be used in the evaluation process (available here)
Staff Feedback Next Step:
- Discuss district interest in using the state’s staff survey (available here). (Fall ‘14)
One page vs. hundreds of pages.
All of the state’s regulations for DDMs can be viewed on this single page: State DDM Regs. The recommendations, by contrast, run to hundreds of pages of text, webinar minutes, and PowerPoint slides.
When building measures of student growth, it’s important for educators to be clear about the difference between regs and recs.
The adjective ‘district-determined’ was chosen deliberately: districts have been granted a great deal of control over the creation and implementation of DDMs.
Brockton High School, building on work done at Plymouth High School, has created sets of pre- and post-performance assessments for most of its science courses. The tasks focus on experimental design. Here are the grade ten performance tasks: Brockton Grade 10 Experimental Design. I hope to gain permission to share the accompanying scoring rubric that Plymouth designed as well.
Email me (firstname.lastname@example.org) if you would like to get in touch with the hard-working folks behind all of this work.
At DESE’s Spring Convening last week, I heard a comment from a district leader that has stuck with me. To back up the belief that conversations between evaluators and educators are a crucial component of the evaluation process, the district directed TeachPoint to add a check box to the online tool through which teachers can indicate whether or not their evaluator involved them in a conversation.
As the data flowed in, it was easy to see that some evaluators were conducting many more conversations with teachers than others. That realization has led all of the district's evaluators to engage in more conversations.