Saturday, February 4, 2012

Un-Scientific American (How Bad Standards Lead to Good Outcomes)




My sloppy journalist of the month award goes to Anna Kuchment at Scientific American for parroting a bizarrely twisted report evaluating state science standards, published by the Thomas B. Fordham Institute, an education policy think tank that I've never heard of.

What jumped out of Sci Am's chart was that the Dakotas got F's while California got A's.  Now that's odd, because I seemed to remember that North Dakota scored near the top when it comes to educational OUTCOMES.

So naturally, I spent 2 minutes doing the background research that Scientific American didn't feel was required, and it turns out that the very best NAEP outcomes do indeed come from the states with horrible science standards, as reported by Sci Am:

NAEP Rank   State           Standards Grade
1st         Montana         F
2nd         North Dakota    F
3rd         South Dakota    F
4th         Massachusetts   A-
5th         New Hampshire   D

(my own charts of outcomes vs. standards below the fold)
Conversely, it seems that decent-to-outstanding standards can still result in the very worst outcomes.  Here are the bottom five states by NAEP-assessed outcomes:

NAEP Rank   State           Standards Grade
50th        Mississippi     C
49th        California      A
48th        Alabama         D
47th        Louisiana       B
46th        Hawaii          D

In fact, if we create a map of the states based on OUTCOMES, we get a result that looks like a mirror image of the map shown in Sci Am.

Compare the chart above with an identically formatted chart of the STANDARDS grades:

Apparently high standards drag down the quality of science education in California.
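For anyone who wants to redraw the two maps themselves, a quick sketch along these lines will do.  Caveats: the charting tool used for the originals isn't stated, so plotly is my assumption; only the ten states quoted above are filled in rather than all fifty; and the letter-grade-to-number mapping is my own rough choice.

import plotly.express as px

# two-letter state code -> (NAEP outcome rank, 1 = best; Fordham standards grade)
data = {
    "MT": (1, "F"),  "ND": (2, "F"),  "SD": (3, "F"),
    "MA": (4, "A-"), "NH": (5, "D"),  "HI": (46, "D"),
    "LA": (47, "B"), "AL": (48, "D"), "CA": (49, "A"),
    "MS": (50, "C"),
}
# Rough letter-grade-to-number mapping (my assumption)
grade_points = {"A": 4.0, "A-": 3.7, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

states = list(data)
ranks = [rank for rank, _ in data.values()]
grades = [grade_points[g] for _, g in data.values()]

# Map of OUTCOMES (NAEP rank, 1 = best)
px.choropleth(locations=states, locationmode="USA-states", scope="usa",
              color=ranks, title="NAEP science outcomes (rank)").show()

# Identically formatted map of STANDARDS grades
px.choropleth(locations=states, locationmode="USA-states", scope="usa",
              color=grades, title="Fordham standards grades").show()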

A quick correlation and scatter plot show the same effect: "good" standards are correlated with bad outcomes.  This time, I used the 2005 science standards grades to eliminate reverse causation.
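Here's roughly what that check looks like in code.  Again, this is only a sketch: it uses just the ten states quoted above (the real version uses all the states and the 2005 grades), it assumes scipy and matplotlib, and the GPA-style conversion of letter grades to numbers is mine.

import matplotlib.pyplot as plt
from scipy.stats import spearmanr

# state -> (NAEP outcome rank, 1 = best; Fordham standards grade)
data = {
    "Montana": (1, "F"),       "North Dakota": (2, "F"),
    "South Dakota": (3, "F"),  "Massachusetts": (4, "A-"),
    "New Hampshire": (5, "D"), "Hawaii": (46, "D"),
    "Louisiana": (47, "B"),    "Alabama": (48, "D"),
    "California": (49, "A"),   "Mississippi": (50, "C"),
}
# GPA-style mapping (my assumption; any monotone mapping gives the same
# Spearman rank correlation)
grade_points = {"A": 4.0, "A-": 3.7, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

ranks = [rank for rank, _ in data.values()]
grades = [grade_points[g] for _, g in data.values()]

# Positive rho = better grades go with larger (i.e. worse) outcome ranks
rho, p = spearmanr(grades, ranks)
print(f"Spearman rho (standards grade vs. NAEP rank): {rho:.2f} (p = {p:.3f})")

plt.scatter(grades, ranks)
plt.gca().invert_yaxis()  # put rank 1 (best outcomes) at the top
plt.xlabel("Standards grade (GPA-style points)")
plt.ylabel("NAEP outcome rank")
plt.title('"Good" standards vs. actual outcomes (quoted states only)')
plt.show()

With only ten states this is just an illustration of the calculation, not the result; the full 50-state comparison is what the scatter plot above summarizes.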


So bad science standards PREDICT good science education outcomes!

Why harp on actual science OUTCOMES when the article was about STANDARDS?  Well, apart from the obvious point that outcomes are what we care about, Kuchment herself states that "numerous studies have found that high standards are a first step on the road to high student achievement".

Hmm, I'd love to hear more about these "numerous studies" finding "first steps" and so on, but alas no specific studies were cited.  Frankly, that sounds like the babbling of a 10th-grade term paper; I'm surprised she didn't write "a plethora of studies."  In the real world, studies find correlations and relationships and so on, but "first steps"?  Eh, not so much.  That's more the province of anecdotes, case studies and professional craft.

So why didn't Sci Am mention the actual outcomes?   After all, in addition to the first-step nonsense, the report itself references the NAEP as its baseline.  Don't Sci Am readers want to see the colorful chart showing the actual results?

I quit reading the pop-sci magazines years ago, when they started pushing blatantly political propaganda with increasingly sloppy thinking.  Occasionally I still come across their articles on the blogs I read, and those articles usually validate my decision to avoid them.

Reading a little bit farther, the problem becomes apparent.  Those Bible-thumping rubes in Montana and North Dakota inconveniently scored highest in the nation on their NAEP science tests despite holding suspiciously circumspect views on evolution, while the consummately PC young pioneers in California scored the worst despite presumably excellent denunciations of intelligent design in the standards.

2 comments:

Anonymous said...

So that was actually pretty interesting, but the Fordham Institute is a conservative outfit, so why should I be surprised that their standards are bogus?

Mercy Vetsel said...

Like I said, I've never heard of the Fordham Institute and don't really care if they publish garbage.

Scientific American, on the other hand, has a legacy of good reporting, and for those of us who grew up reading magazines and newspapers, it's distressing to see how sloppy and politicized the popular science magazines have become.

Mercy