Monday, January 21, 2013

Tests and dumb tests

"I am not opposed to all wars. I'm opposed to dumb wars."
-- Barack Obama, 2002

Borrowing Obama's formulation, I don't think most teachers oppose testing. I think many oppose dumb testing. 

Some teachers in Seattle are rebelling against testing -- sort of. They're refusing to administer a second test to their students. (The students in question already take a state-mandated end-of-course exam in some classes. As best I can tell, this isn't controversial.) Jesse Hagopian, a teacher at Garfield High School (and a Teach for America alumnus**), pinpoints the chief issue:
Students don’t take the MAP seriously because they know their scores don’t factor into their grades or graduation status. They approach it less seriously each time they take it, so their scores decline. Our district uses MAP scores in teacher evaluations, even though the MAP company recommends against using it to evaluate teacher effectiveness and it’s not mandated in our union contract.
That is dumb testing.

The more I read, the less this story seems to be about whether testing is worthwhile and the more it seems a tale of how an inane bureaucracy requires teachers to give a useless test. 

A letter from teachers at a Seattle elementary school demonstrates why the test in question is useless:
 ...the current version of the MAP test is aligned with the old state standards and it is clearly an unsuitable vehicle for evaluating students currently being taught the new required Common Core Standards. So not only are the results of little instructional value, but this discrepancy between what is taught and what is measured will yield falsely low scores making the MAP test invalid for the purpose of measuring student growth/teacher effectiveness.
(Emphasis mine.)

The test that's the focus of the boycott is the MAP (Measures of Academic Progress). It's a national test, though I don't know if the testmakers adjust the questions to suit the standards of different states. 

As it happens, I can somewhat empathize with the Seattle teachers. My ninth graders have a similar testing regimen -- a state-mandated end of course test (EOC) and the MAP. The reason I'm OK with the kids also taking the MAP is because the data from it helps me do a better job.

The MAP is something we chose to do as a school. We use it to track student progress throughout the year and make sure my students are on track to reach their desired ACT score. Students take the MAP on a computer three times a year; a subject-specific test takes most students about an hour. Unlike the state-mandated tests, it offers a sophisticated breakdown of student growth and skill level.

It's very useful for knowing where a student is, how quickly they're making progress, and tweaking instruction to meet students' needs. As we switch to Common Core standards -- which are more rigorous -- the data from the MAP will be even more relevant. 

It is, of course, worthless if teachers don't use the data. 

The information I get from it actually saves quite a bit of class and planning time because I don't need to design my own diagnostic test, nor do I have to crunch the raw data. It's helped me teach because we're more accurately diagnosing skill deficits. It's one thing to know a student isn't reading on grade level. It's much more useful to know that, say, a student struggles with differentiating between certain vowel sounds. With that sort of precision, we're able to conduct targeted reading interventions. When students take the MAP again, we track progress. 

In other words, this isn't testing for testing's sake. We use this stuff to help our kids become better readers. It's a simple goal, but not an easy one. Helping kids make strides is complicated. That's why so many kids fall behind in the first place.

The idea of education is also simple, but not easy: everything a school does should help students learn. All tests, whether it's a ten-question vocabulary quiz or a state-mandated exam, are supposed to serve that goal. If the data from a test isn't meaningful, then what's the point in having students take it? 

As with so many things in education, poor execution sabotages a policy's intent. It seems district leadership kept adding testing systems without thinking through how they would play out in classrooms. The Seattle debate isn't about testing; it's about dumb testing.

****

** Read Hagopian's entire article. There's a persistent idea that TFA alumni don't have many different opinions about what ed reform should be. There's another one that many TFA alumni don't stay in the classroom. Both are false. 
