By the numbers


The thing about numbers and statistics is that they often carry an illusion of scientific rigor. You can’t listen to a political speech without hearing all kinds of statistics, some cited and some not. Statistics get repeated for years after people have forgotten the original source, and even after the underlying numbers have changed. For example, as Jay Gabler has pointed out, there’s the widely believed “statistic” that Minneapolis has more theaters “per capita” than any other city, even though there seems to be no readily findable study that actually shows this to be true.

In my education reporting, I get a LOT of numbers thrown at me. People tell me statistics when they are being interviewed. And of course there are millions of numbers on the Minnesota Department of Education website, and on the district websites. 

You can’t find statistics for everything, though. For example, yesterday I posted on Facebook Mary Turck’s excellent blog post about Peter Bell’s comment on MPR that some Minnesota districts have a 50 percent dropout rate. As she pointed out, that isn’t true, because additional students graduate in five or six years or earn a GED. But MDE doesn’t list the number of students who graduate in five or six years (I’m planning to look into this further).

At TC Daily Planet, we’re also looking at creating a page that explains how to find more information about schools, to help people navigate what can be a very frustrating process of tracking down the numbers.

Then there’s the problem of numbers that are deceptive, or that don’t show the whole picture (as in Mary’s example about the dropout rate). Diane Ravitch, who spoke at the Education Minnesota conference on Thursday (see Alleen Brown’s report here), gave a very compelling argument against using standardized tests, both within the context of No Child Left Behind (and Race to the Top) and for performance evaluations. Among Ravitch’s arguments are that the tests unfairly penalize schools with high poverty rates, and that, if you look at the broader picture, student performance in the United States has actually improved, not gotten worse.

My problem with Ravitch’s lecture was that in making some of her points, such as the widening achievement gap in the post-integrated school era, she herself cited statistics based on testing.

I was discussing this with my colleague Alleen Brown after the talk. Alleen has been doing some fabulous reporting on how public schools serve higher percentages of students with disabilities than charter schools do. Even with tests that take outside factors into account, though, we still have questions about the reliability and validity of comparisons that are basically apples to oranges.

At the same time, as “members of the media,” we use the testing data and statistics available to us all the time, even as we write about other problems with testing. (For example, see my article about the Children’s Theater’s Bridges program, which aims to teach critical thinking and emotional skills that standardized tests don’t measure.) It’s tricky to know which numbers to use and when to keep a skeptical distance.