Managing DDM Data

As teachers begin gathering data from performance assessments and pre-tests that are components of growth measures (DDMs), district leaders are asking timely questions about managing the new data.

It’s a challenge because the data come in many shapes and sizes: from rubric-scored performances to tallies of laps run around the gym to zero-to-one-hundred test scores.

Craig Waterman, ESE project lead for growth measures, suggests narrowing what districts collect to the percentage of students with high, moderate, or low growth in any given class. Organizing the data in this way would allow educators to look for answers to questions like:

  • Which teachers contributed to particularly high student growth?
  • Which grade level or department contributed to high student growth?
  • Which instructional practices in the high growth classrooms and grade levels ought to be more widely shared?
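The kind of tally Waterman describes is simple to compute once each student has been assigned a growth category. As a minimal sketch (the function name and the example roster are illustrative, not from any district system), assuming growth categories have already been determined per student:

```python
from collections import Counter

def growth_percentages(categories):
    """Given a list of per-student growth categories ('high', 'moderate',
    'low') for one class, return the percentage of students at each level."""
    counts = Counter(categories)
    total = len(categories)
    return {level: round(100 * counts[level] / total, 1)
            for level in ("high", "moderate", "low")}

# Hypothetical class of 20 students
classroom = ["high"] * 6 + ["moderate"] * 11 + ["low"] * 3
print(growth_percentages(classroom))
# {'high': 30.0, 'moderate': 55.0, 'low': 15.0}
```

Rolling these class-level percentages up by teacher, grade level, or department would yield the comparisons the questions above call for.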

How is your district planning to manage DDM data? How do you intend to use the data?

Beautiful Art DDMs

Looking for creative and inspiring measures of growth in art and design? Check out the Burlington Public Schools website.

Included are DDMs for every grade level as well as tools that measure Creativity, Habits of Mind, and Elements of Art. The rubric in the Creativity DDM measures skills like originality, risk-taking, and flexibility. It’s well worth a look! (Thanks to Gwynne Morrissey for locating this treasure!)

Do you have any resources to recommend?

Pros and Cons of Merging Student Learning Goals and DDMs

The similarities between student learning goals and measures of student growth (DDMs) have led to confusion in some districts, and to an opportunity for streamlining in others.

Before exploring the pros and cons of using the same set of data to assess progress toward student learning goals and student growth, it’s important to review the place of both in the educator evaluation system. A teacher’s progress with student learning goals contributes to Performance Rating (Exemplary, Proficient, Needs Improvement, Unsatisfactory), while measures of student growth are used to arrive at Impact Rating (High, Moderate, or Low).

Pros: By collecting a single data set that both tracks progress on student learning goals and measures student growth (DDMs), teachers should be able to decrease the time allocated to assessment and the time needed to score student work.

Cons: The objectives of the student learning goal may be very different from those of the DDM. The first may be short-term, whereas a DDM ought to be long-term, usually a school year or semester. And the grain size may be different; the student learning goal can address a narrower set of learning standards than a DDM. Finally, there is the possibility that one set of data might ‘count against’ a teacher twice. If that single set of data is disappointing, it can drag down both a teacher’s Performance Rating and Impact Rating.

What do you think? How is your district approaching this issue?

Resources for Writing to Text

In preparation for PARCC testing, many districts are administering writing-to-text DDMs across the grade levels. These tasks ask students to read and analyze several documents and then draft an original response to a prompt about those documents. Now that districts have student work to score, a host of questions arise:

  • Who will score the work?
  • When will they score it?
  • What scoring method will be used to increase the validity and reliability of the results?

For a guide that addresses all of these questions, I recommend the RI Calibration Protocol for Scoring Student Work.

It’s important to remember the value of this work for teachers: when provided ample time and strong facilitation, teachers scoring student work engage in important discussions of standards, instruction, and assessment.

School Counselors Measure Student Growth in Worcester

Many thanks to the Worcester County Guidance and Personnel Association, who invited me to speak to their monthly gathering. Thanks to preparation support from Gwynne Morrissey, I was able to clarify misapprehensions for the 45 counselors in attendance and share completed DDMs written by peers in the Pioneer Valley. Viewing these examples helped counselors envision what is possible and see that the end product need not be long or overly complex. Shared examples: Guidance Successful Transitions DDM; Common App DDM; SelfControl SelfRegulation DDM.

Apparently, many of their misunderstandings spring from trainings that presented conflicting information. They’ve heard from DESE, outside vendors, and district leaders. These misunderstandings concern not only DDMs, but core components of the educator evaluation system as well.

The counselors in attendance are doing important work with their students: one team is teaching the signs of suicide and how to address the topic. They measure student growth meaningfully when they seek to discover how much students have learned from their lessons about suicide prevention.

I emphasized the importance of measuring what counselors and their schools care most about while capturing data that is useful for improving their instruction.

Collaborative for Educational Services