Educators: If you could get rid of one thing next school year to make things less traumatic and more manageable during the pandemic, what is the one thing you'd get rid of and why would that one thing be state testing?
I’ve been trying to find what my Twitter friend Jennifer Binis describes, from a historical perspective, as Black parents asking for standardized testing. I will ask her in good faith, because I know she’s an educational historian and scholar. When she and I have exchanged ideas about state standardized testing, created by Pearson and presumably based on the Common Core Standards, all I have to offer are my own observations and test data: the test sucks. It’s biased, racist, and does not achieve the educational equity that may have been its original intent. If we go by intent versus impact, its impact is overwhelmingly damaging. During the COVID-19 shutdown, the testing was either cancelled outright or put on hold. It costs districts millions. It takes up weeks of the 180 days of instructional time, and, speaking for ELA (reading/writing), it uses dissected, random autopsies of texts for students to show their “skills” but never strategies, background, or contextual knowledge. Teachers struggle to teach strategies, background, or contextual knowledge because of this cursed assessment.
What would many of us like to see instead? Project-based learning based on cross-content disciplines, portfolios, etc. There are many other more authentic assessments, both formative and summative, that we can look toward.
Just in time, during our ELL Endorsement class yesterday, we reviewed various assessment protocols and terms. They weren’t unfamiliar, but it was a timely reminder. Funny, that.
Two, at the top:
Reliable.
Valid.
As the man said, “Houston, we may have a problem.”
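Since “reliable” and “valid” carry a lot of weight in that conversation, a quick illustration of the first term may help. In measurement, a reliable test yields consistent scores when the same students take it again; one rough way to quantify that is a test-retest correlation. Here is a minimal sketch using made-up numbers, not anything drawn from actual SBA data:

```python
# Illustrative sketch only: "reliable" in measurement terms roughly means
# the test produces consistent scores across administrations. A common
# rough gauge is the correlation between two sittings of the same test.
# All scores below are invented for demonstration.

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical scores for the same five students on two administrations.
first = [62, 75, 58, 81, 70]
second = [60, 78, 55, 83, 72]

# A value near 1.0 suggests consistency; a low value suggests the test
# is not measuring anything stably.
print(pearson_r(first, second))
```

Validity is the harder question, and the one a correlation can’t answer: a test can be perfectly consistent and still measure the wrong thing.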
Is the SBA test, or Smarter Balanced Assessment, actually Smart, Balanced or a valid/reliable Assessment? Here are some resources for you to draw your own conclusion, with a dash of my own observations.
Ensures that vital information is provided to educators, families, students, and communities through annual statewide assessments that measure students’ progress toward those high standards.
Welp. About that. I was told I was ‘teaching to the test’ when I offered my resources about the brief-write rubrics so we could help students grow as writers; the resources I wanted to share did not find a forum in my building last year (or this year, either). But okay–it’s still good stuff, and the Common Core and SBA do have some challenging and rigorous questions. I just wish it weren’t all at once and in secret, but instead small assessments over the course of the year (i-Ready doesn’t count) that show growth.
Think about it: what if, instead of a minimum of four days with three-hour testing blocks during the month of May, there were smaller assessments, open, transparent, and available for PLC discussions, to show student growth? And what if (WHAT IF) federal dollars weren’t tied to the scores? If accountability is the order of the day, I would wager most teachers have no issue with it, as long as their professional judgment and expertise drive the assessment and subsequent instructional decisions.
Plus sides (with caveats)
It is essentially a reading comprehension test, even for many of the math questions. If we continue to build strong literacy programs with this focus, that is a plus. However, literacy does not need to mean solely text: it should include multiple pathways and access points for all learners and abilities.
There has been a huge increase in the resources available for educators. Many of the questions are aligned with Common Core.
Minus sides
It’s expensive in terms of instructional time, money, and stress for students, parents, and teachers. And administrators.
It may not be reliable or valid. It’s chock full of inaccuracies.
Educators may not be accurately helping students prepare for it. (And this could lead to further undermining of teachers as professionals, implementing more canned or scripted programs.)
It is not transparent. This goes against everything Hattie et al. prescribe for student and teaching effectiveness: know the targets and criteria for success, and allow students to monitor and reflect. This style of summative, opaque assessment flies in the face of that research and best practice.
My own conclusions? More of a wish list, I suppose.
I wish…there were more collaboration instead of isolation around assessment.
I wish…there were alternative pathways for all students with varying abilities and needs to have access to assessment and instruction that challenges, celebrates and is based on growth.
I wish…much, much more time and money were spent on this.
A colleague who’s new to the building is, above all, hilarious. I’m so glad he’s at our school and has the experience he does. We don’t get to chat as often as I’d like, as I’m teaching six out of six classes this year, and we don’t have common planning time. We do get to share ideas via social media, so before testing, I shared this image:
Hoarding #FTW!
Those purple things are repurposed CD vinyl holders; in my cupboards, nearly a full box of various colors waited patiently for CDs/DVDs that would never be made again. Our new laptops don’t have DVD drives, and now the staff scrambles for the external CD player. (Technology is weird that way.) During testing, we’re no longer allowed to collect cell phones in a shoebox with sticky notes on them, as we’ve done in years past, due to the new admin’s rules. They don’t want to be liable for any cell phones that may be lost or damaged during testing. Makes sense, and just because losing or damaging a cell phone has never happened in our building doesn’t mean it can’t or won’t. Not a ‘hill I want to die on,’ so to speak. But if students don’t put their cell phones in their lockers, they must turn them off and place them face down on the desks/tables in front of them. Seeing how students’ addiction to their phones leads to subconscious touching and flipping of them, I needed a means to thwart them just a bit more, hence the plastic sleeves.
He asked me if I felt some sense of grand vindication that my teacher-hoarding had paid off!?
All year I felt like I was standing on a treasure trove of accessible and important curriculum/instructional ideas that were just out of reach, this nagging feeling that the time and year were too fragmented or…something. I could never put my finger on it. (Maybe because I have mountains of data in Google Drives, OneDrives, Dropboxes, and external hard drives.) Where did the time go? Nailing down multiple approaches to student learning is like hanging an octopus on the wall: no one agrees where it should go, and it doesn’t make the room look any better.
PowerPoints from the Past…
This year, because of other instructional directions, I didn’t spend as much time on thesis and argumentative writing as I should have. I needed help and support: not help in terms of how to teach it, but in terms of our whole PLC working on it.
But here are some links and goodies. They are based on information that is open to the public. Use them if you want; change, alter, etc.; feel free. Email me with questions. Right now I just need to focus on getting through the end of the year with students and making them feel like they’re ready for high school, because I can tell many do not feel that way. I’m going to listen to my instincts on that one. I know how they’re feeling now: scared, anxious, excited, and ready to move on. What is in the past will only inform the future, but the present needs my focus.
This is what I’m trying to do with my Reading Road Trip, but we haven’t left the driveway yet:
I decided the best way to approach the quarter would be with the question of where stories come from, how authors craft stories for readers, and why we read and listen to stories. Framing our independent reading this way meant that my job was to ask these questions every day and to set up experiences for students to explore them. Standards related to author’s craft, words in context, public speaking, and reading diverse, complex texts informed my instruction as I developed essentially five parts to our class, which I will explain below (independent reading, close reading, text structure, language lessons, and read-alouds). Essentially, however, I organized our daily time this way:
Monday through Thursday: 7 minutes of language study, 34 minutes of reading and conferring individually and in small groups; during the 34 minutes, I would either take a small group to a table to do a mini-lesson or I would meet with students individually to book-talk options and confer about what they were noticing about their books.
Fridays: read-alouds. These days, students would read short pieces while their classmates/audience would listen for and document textual evidence of sensory language, figurative language, tone, mood, word choice, character interaction, etc.
Daily Portfolio Development: Every day, students were responsible for documenting their reading experiences. They’d take pictures of books and sticky notes; they’d document reading responses on Google forms. We used these artifacts for an end-of-quarter portfolio to demonstrate learning and negotiate final grades.
Can I get this moving with a student teacher, other demands, PLC work, district demands, and general tomfoolery of a Title I school (everyone has an agenda, including me, on what works best)?
Someone mentioned the other day that our scores have never been good. I guess compared to the district’s numbers, that’s true. I’m not sure if I paraphrased that correctly, but that was the conclusion she drew.
But have they shown growth? Resoundingly yes.
So, for a few moments, I thought I would look over the scores on the OSPI School Report site, and they are a mixed bag of growth and famine. Consider that in the past ten years we’ve had three state tests, and during the first (pilot) year of the SBA(C) we had no results or data to review. My continuing concern is that the SBA is not a transparent assessment, and we’ve been working mostly blind for years. The fact that my students scored 65-75% passing last year is a testament to them, not me. Overall, last year the 7th-grade ELA PLC (of which I was a part) showed significant gains:
41% in 2015-2016
35% in 2014-2015
But herein lies the rub: when we make decisions about what’s best, how do we accurately gauge anything based on only two years’ worth of data? In other years my 7th/8th-grade students grew from about 40% passing in reading to 60-65% on the WASL/MSP. Those years included as many best practices as I could muster, collaborative work as the Curriculum Leader, and cross-content teams. It’s false to paint a picture that our students have never grown.
Here’s what I do know: if you focus on teaching it, students learn it. Clear and concise instruction that includes skills and strategies, with a hefty dose of student self-reflection, independence, and choice, makes the most sense.
What data battles have you had to fight? Is it providing students with reading ideas based solely on their Lexile? Is the collective or group/team misinterpreting the data?
My biggest obstacle to instruction, however, isn’t from others; it’s from my students themselves: those steeped in learned helplessness and confusion. Somewhere along the way they lost track of what’s happening, so they don’t try at all and turn nothing in. And it’s different for every student, and unpredictable (thank goodness; if it were predictable, that sameness would limit my ability to truly get to know them).
So: November. Nine weeks or so till the end of first semester. With my student teacher’s help, I think we can do this. Getting students to read, and to transfer those skills and strategies to bolster their reading confidence, can only help. We’ve got this, right?