Because we’re committed to improving instructional practice and supporting the work of our member districts, we’re offering trainings designed to enhance the quality of implementation of the educator evaluation system as a whole and of student growth measures (DDMs) in particular.
Click here to read more about educator evaluation trainings, such as our administrator calibration seminar. Click here for the principal observation training.
Click here for information about DDM trainings addressing key aspects like revising tasks and rubrics for better alignment with standards and instruction.
Here is a good opportunity for district leaders to step back and reflect on the effectiveness of the new educator evaluation system. Register here.
Implementation of Educator Evaluation Systems: Examining Problems of Practice
Co-hosted by the Northeast Educator Effectiveness Research Alliance at REL Northeast & Islands and the National Center for Teacher Effectiveness at the Harvard Graduate School of Education, this daylong event will examine the successes and challenges related to the implementation of teacher evaluation systems. NCTE Principal Investigator Dr. Tom Kane will present research on factors related to educator evaluation systems, including capacity, measurement, impacts on practice, and school climate and culture. Through structured Q&A and breakout sessions, the event is designed to build meaningful connections between educators and researchers around the use of research to inform decision-making about teacher evaluation systems.
Thursday, October 23, 2014
9:30 a.m. – 3:30 p.m., ET
I recommend taking a look (if you haven’t already) at the Massachusetts Teachers Association’s DDM Guidance. It explains many complex topics clearly and explores new ground as well: see the crosswalk between assessment type and Bloom’s taxonomy on page 9!
Also worth a visit is DESE’s May report on Rating Educator Impact, which includes scenarios illustrating how professional judgment can inform the rating decision.
I met with eight districts earlier this week to discuss their plans for using student feedback. A meeting summary appears below.
Student Feedback Meeting Summary
How might CES support districts in gathering and using student and staff feedback?
July 8, 2014; Attending: Easthampton, Frontier, Hatfield, Hadley, Northampton, Pioneer Valley, South Hadley, Union 28
Sense of the Meeting: participating districts (8) intend to pursue one of these approaches:
- Use the state student survey short form (3 districts)
- Roll out a personalized approach to gathering and using student feedback (3)
- Wait to see what develops while concentrating on other ed eval components (2)
Student Feedback Next Steps:
- Once districts have studied the state’s student surveys (available here), convene another inter-district meeting to discuss and assess interest in using them. (August or September)
- Maintain communication with Panorama Education in order to:
- book student feedback PD for teachers (Fall ‘14) and district leaders (through the county superintendent steering committees, Fall ‘14)
- pursue cost-saving offer to serve aggregations of small districts (July)
- Check in with districts rolling out local solutions — some districts expressed desire for analysis support (ongoing, FY15)
CES Support for Districts (proposed)
- implement surveys via Google Forms or SurveyMonkey, using the state instrument or an adaptation of it
- share effective practices for using student feedback to evaluate teachers
- identify sample contract bargaining language about how student feedback data will be used in the evaluation process (available here)
Staff Feedback Next Step:
- Discuss district interest in using the state’s staff survey (available here). (Fall ‘14)
One page vs. hundreds of pages.
All of the state’s regulations for DDMs can be viewed on this single page: State DDM Regs. The recommendations, by contrast, amount to hundreds of pages of text, webinar minutes, and PowerPoint slides.
When building measures of student growth, it’s important for educators to be clear about the difference between regs and recs.
The adjective ‘district-determined’ was chosen deliberately: districts have been granted a great deal of control over the creation and implementation of DDMs.