Since I asked the question not many answers appeared. Time to put on my Cape of Hard Research and Thinking, TO THE INTERNET! AWAY!
Just how do we constructively analyze, evaluate, and make meaning out of student data? The fundamental questions of a PLC frame the discussion: What do we want students to learn? What do we do if they don't? And what do we do next if they do? From the data on display, it would appear that many students stalled: the more capable ones have nowhere to go next, and the struggling ones never made the connection from routines and scaffolding to independent work. Since I am not an ELA teacher by title this year, I could say, well, my "name" isn't associated with students' scores. But that is the opposite of how I feel and act, and I know it's the opposite for many of my colleagues, too. They want access to the data, and they understand to their core that we are all teachers of literacy in every shape and form. That would be my first step: all teachers in the building working together in cross-content teams to share student information, data, and insights. (I wonder where I put that student form we used a few years ago when we had that team?) Teams are coming back, so that's positive.
Here are some articles about different ways to look at data. The data carousel is, paradoxically, both one of the most powerful approaches and one of the weakest: it generates good comments and discussion in the moment, but then the data is never discussed again.
Get Curious About Contradictions and Take Action: How about that ace student who didn't do so well on the standardized test? Possibly a nervous test-taker? Or it could simply be low motivation, since many students never hear about their standardized test results from previous years. Prior to a test, a brief pep talk or a quick review of strategies for lowering test anxiety could be all they need. There is also much to be gained from individual conversations with students whose standardized test scores contradict their classroom grades and performance.
From The Teaching Channel:
As noted above, data carousels create a burst of powerful discussion, but the conversation is not sustained over time.
This one may be the best, from Larry Ferlazzo:
Below are suggestions to assist collaborative inquiry teams in examining student work.
- Begin with anonymous student work samples – perhaps from a colleague’s class in another school (both the colleague and the students should remain anonymous). Initially examining work that does not ‘belong’ to anyone in the group helps build confidence and eases the transition to the riskier activity of sharing their own students’ work.
- Use protocols for examining student work. Protocols provide structures and guidelines for looking at and talking about student work. They are designed to help team members reflect on their practice as it relates to student learning and development.
- Select 3-5 students of interest and monitor their progress over time. There is no need to bring evidence from an entire class. Teachers might select 3-5 students who are performing at different levels of achievement; collaborative inquiry teams will find it more manageable (and equally informative) to monitor the progress of a few students.
The anonymity piece matters: making it safe for teachers to share and discuss takes away the judgmental framing of ‘bad’ versus ‘good’ teachers. And the “progress over time” piece matters, too: showing growth rather than a single proficiency snapshot is the miracle of teaching and learning. That is why we are here and why we do what we do. Larry Ferlazzo’s tips are doable and smart, and as we create norms and structures for our PLCs, I am hoping my colleagues see the value of adding these protocols.