Measuring the Wrong Stuff

There’s a great deal of discussion out there about security metrics. There’s a belief that better measurement will improve things. And while I don’t disagree, there are substantial risks from measuring the wrong things:

Because the grades are based largely on improvement, not simply meeting state standards, some high-performing schools received low grades. The Clove Valley School in Staten Island, for instance, received an F, although 86.5 percent of the students at the school met state standards in reading on the 2007 tests.

On the opposite end of the spectrum, some schools that had a small number of students reaching state standards on tests received grades that any child would be thrilled to take home. At the East Village Community School, for example, 60 percent of the students met state standards in reading, but the school received an A, largely because of the improvement it showed over 2006, when 46.3 percent of its students met state standards. (The New York Times, “50 Public Schools Fail Under New Rating System”)

Get that? The school that flunked has more students meeting state standards than the school that got an A.
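To make the inversion concrete, here’s a toy improvement-weighted score using the numbers from the article. The city’s actual formula is more complicated; the 80/20 split between improvement and absolute performance here is an assumption, purely for illustration.

```python
# Toy improvement-weighted score (NOT the city's actual formula).
# Pass rates come from the article; the 0.8/0.2 weights are assumed.

def score(current, previous, w_improvement=0.8, w_level=0.2):
    """Blend year-over-year improvement (in points) with the absolute pass rate."""
    return w_improvement * (current - previous) + w_level * current

clove_valley = score(current=86.5, previous=86.5)  # high, but flat
east_village = score(current=60.0, previous=46.3)  # lower, but rising

print(f"Clove Valley: {clove_valley:.1f}")  # 17.3
print(f"East Village: {east_village:.1f}")  # 23.0
```

Once improvement dominates the formula, the flat high performer loses to the rising low performer, which is exactly the F-versus-A result above.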

There are two important takeaways. First, if you’re reading “scorecards” from somewhere, make sure you understand the nitty-gritty details. Second, if you’re designing metrics, consider the perverse incentives and results you may be creating. For example, if I were a school principal today, every other year I’d forbid teachers from mentioning the test. That year’s students would do awfully, and then I’d have an easy time showing improvement the next year.
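Here’s a minimal sketch of that sandbagging strategy, assuming a hypothetical school whose students would normally pass at 80 percent and drop to 50 percent in the years the test goes unmentioned. The numbers are invented; the incentive is not.

```python
# Sandbagging under an improvement-graded scorecard.
# The 80% "true" pass rate and the 30-point drop in off years
# are invented for illustration.

def yearly_improvement(pass_rates):
    """Year-over-year change in pass rate -- the metric being gamed."""
    return [cur - prev for prev, cur in zip(pass_rates, pass_rates[1:])]

honest  = [80, 80, 80, 80, 80]   # teach consistently
sandbag = [80, 50, 80, 50, 80]   # tank the test every other year

print(yearly_improvement(honest))   # [0, 0, 0, 0]
print(yearly_improvement(sandbag))  # [-30, 30, -30, 30]
```

The honest school never shows improvement; the sandbagger posts a banner 30-point gain every other year while serving its students worse overall.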