Helping School Nurses Measure Student Growth

Please welcome guest blogger Gwynne Morrissey, who has been working in the schools with teachers to measure student growth.

Three secondary school nurses came to a workshop wanting to measure students’ learning related to a one-time video and discussion about signs of suicidal thinking. How could they measure student growth in understanding this very important topic?

Well . . . I had reservations. It is a very important topic, but this video/discussion combination is a small portion of these nurses’ work, and the content wasn’t delivered by the nurses themselves. We could measure students’ knowledge before the video and just after the post-video discussion, and count the number of visits students made to the nurse’s office related to having suicidal thoughts. But what would that tell nurses about how they could improve their work? How many visits would count as “moderate” growth in students’ learning? It seemed like a non-starter.

Instead, I suggested they come up with a topic more representative of the scope of their work, one that would provide important information about how that work supports the school’s academics. Here is what they identified as their most important tasks:

  • getting ‘frequent fliers’ back to class more quickly (or keeping them in class in the first place)
  • improving the frequency with which students with special dietary or medical needs meet those needs themselves, rather than relying on the nurse

In response to these topics, we drafted several DDM ideas that align nicely with DESE’s suggestions in the Implementation Brief SISP (page 6). Those ideas and a few others can be found in this collection of Five DDM Options for School Nurses. (Better to download the document itself, as it features terrific illustrations.)

Five DDM Options for School Nurses

1. Parent survey

Purpose: to determine parents’ perceptions of the effectiveness of nurse visits for their children

Timing: collect throughout year, aggregate at end of each year to compare to previous year

Notes: could use surveys for particular groups of students, like those who must receive medications

Examples: http://survey.constantcontact.com/survey/a07e8oh6q2khpa52zzq/a019qhsm4geks/questions

http://www.surveymonkey.com/s/RQ33W52

2. Seat time/unnecessary visits

Purpose: to measure the degree to which unnecessary visits to the nurse are reduced or students’ time spent in class is increased

Timing: collect monthly or quarterly, compare beginning of year to end of year

Notes: there may be groups for whom an effort to increase class time is particularly important (‘frequent fliers’); measurement could be limited to those students. Alternatively, this could be a measure of how much time certain students spend in the nurse’s office (which may depend on tracking systems already in place).
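For nurses whose schools already log visits electronically, here is a minimal sketch (in Python) of how monthly out-of-class minutes might be tallied per student and compared across the year. The log format, the field names, and the September-versus-May comparison are illustrative assumptions about whatever tracking system is already in place, not a prescribed approach.

```python
# Hypothetical sketch: total the minutes each student spends in the nurse's
# office per month, so early-year and late-year totals can be compared.
# The (student, month, minutes) log format is an assumption about the
# tracking system already in place.

from collections import defaultdict

visit_log = [
    ("Student A", "September", 35),
    ("Student A", "September", 20),
    ("Student A", "May", 10),
    ("Student B", "September", 50),
    ("Student B", "May", 45),
]

# Sum minutes per (student, month); missing months default to 0.
minutes = defaultdict(int)
for student, month, mins in visit_log:
    minutes[(student, month)] += mins

for student in sorted({s for s, _, _ in visit_log}):
    start = minutes[(student, "September")]
    end = minutes[(student, "May")]
    print(f"{student}: {start} minutes in September vs. {end} minutes in May")
```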

3. Effects of health lesson or educational/prevention campaign

Purpose: to measure how effective the lesson/campaign was in increasing student knowledge or changing behavior

Timing: before and after lesson/campaign, maybe during (depends!)

Notes: One health lesson is probably too small to examine on its own; a full unit is more reasonable, though you could make a case for a single lesson in some instances. A campaign would present challenges in measuring effects, but could be hugely consequential for the school environment.

4. Completed vision exam cycles

Purpose: to determine how many students—or, more realistically, parents—follow through on results of school-based vision screenings by seeing an ophthalmologist or optometrist, returning the form to the school nurse, and getting glasses (and wearing them!)

Timing: yearly (post-test only format)

Notes: This may work best in elementary and possibly middle schools.

5. Independent monitoring of health needs or accessories

Purpose: to measure how many eligible students increase the frequency with which they bring their own snacks and monitor their own health needs, rather than relying on the nurse to provide necessities

Timing: beginning and end of year, or repeated measures including middle of the year

Notes: This may work best in middle and high schools, and for certain groups of students, such as those with diabetes.


DDM Road Show to Visit Northampton

Craig Waterman, DESE’s expert on measuring student growth, will hold Office Hours at CES in Northampton from 2 to 4 on April 28. He’ll answer all your questions about District-Determined Measures. If you’re interested, please register by April 23 with Jill Robinson: jrobinson@collaborative.org.

How do Specialized Personnel Measure Student Growth?

Sophia, a speech and language pathologist, wrote seeking help creating a measure of student growth. I forwarded her DESE’s latest Implementation Brief SISP, designed to support specialized instructional support personnel in building DDMs.

When Sophia responded, she questioned how she could follow the Implementation Brief’s advice to conduct a peer review or administer checklists to measure the quality of collaboration. She writes, “When would [I] have the time to collect the data outside the workday? [I am] not given time in the schedule for this.” Note: Sophia has a caseload of 39 students.

As an alternative, I sent Sophia this plan to build a DDM based on assessments and data she is already collecting.

1) Assemble a group of students working on the same sort of challenge, say articulation. Use your most populous service area to create as large a cohort as possible.

2) Identify a screening tool that measures how these students are functioning (preferably one you are already using).

3) Select a time period (one school year, a half-year, or whatever best aligns with your work). Administer the screen at the beginning and end of that time period (as you are probably already doing).

4) Determine one year’s expected growth for a student as measured by this screen (you may already define this in the IEP). Students who achieve that expected growth show moderate growth; those who grow significantly less show low growth, and those who grow significantly more show high growth.

Even though you already expect different amounts of growth from each of your students (based on the nature of their identified disabilities), track growth for each student and label it low, moderate, or high based on your individual expectations for them. Essentially, you’d be taking a moment at the end of the year to step back from your work, look at a cohort of your students, and make a general statement about their rates of growth. Sticking with data you already gather helps avoid an undue extra burden.
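To make that end-of-year tallying concrete, here is a minimal sketch (in Python). It labels each student’s measured growth on the screen against that student’s own expected growth and summarizes the cohort. The student names, scores, and the 20-percent band used to decide what counts as “significantly” more or less than expected are all illustrative assumptions, not DESE guidance.

```python
# Hypothetical sketch: label each student's growth low/moderate/high against
# that student's own expected growth (e.g., as already defined in the IEP),
# then tally the cohort. Names, scores, and the 20% band are assumptions.

students = [
    # (name, baseline score, end-of-period score, expected growth for this student)
    ("Student A", 42, 55, 10),
    ("Student B", 30, 33, 8),
    ("Student C", 51, 68, 12),
]

TOLERANCE = 0.20  # growth within +/-20% of expectation counts as "moderate" (assumed cutoff)

def growth_label(baseline, final, expected):
    """Label growth relative to this student's own expectation."""
    actual = final - baseline
    if actual < expected * (1 - TOLERANCE):
        return "low"
    if actual > expected * (1 + TOLERANCE):
        return "high"
    return "moderate"

tally = {"low": 0, "moderate": 0, "high": 0}
for name, baseline, final, expected in students:
    label = growth_label(baseline, final, expected)
    tally[label] += 1
    print(f"{name}: grew {final - baseline} vs. expected {expected} -> {label}")

print("Cohort summary:", tally)
```

In practice, the band (or whatever cutoff you choose) would be set to fit the particular screen you use and the expectations already written into students’ IEPs.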


Baseball Informs Educator Evaluation

On Red Sox Opening Day, we look to baseball for guidance. In calling balls and strikes, umpires address calibration challenges similar to those facing school principals charged with evaluating teachers. To gain greater inter-rater reliability, umpires’ performances are compared to camera-based compu-scores, while school evaluators collaboratively score video clips of teachers in action.

This New York Times article analyzes patterns in umpires’ strike- and ball-calling errors: “Baseball insiders have long suspected what our research confirms: that umpires tend to make errors in ways that favor players who have established themselves at the top of the game’s status hierarchy. But our findings are also suggestive of the way that people in any sort of evaluative role — not just umpires — are unconsciously biased by simple ‘status characteristics.’ Even constant monitoring and incentives can fail to train such biases out of us.”

If we apply this theory to teacher evaluation, are school evaluators more likely to give favorable ratings to teachers with greater status in the building?

Checklist for Implementing DDMs

With a lot of help from my friends, I’ve built a checklist to guide districts and schools implementing measures of student growth.

The checklist, attached here as Checklist for Effective DDM Implementation, tracks tasks to accomplish now, a culture to grow, and long-term tasks to attend to. Please feel free to suggest revisions and additions by commenting here.
