For those who like simpler formats: Yesterday, we discussed what follows in e-mail form. The whole thing may make more sense in that format. We'll post the e-mails below.
NOTHING GLOD CAN STAY: Wonderful news appeared to reside on the front page of Thursday's Washington Post. "Needy Students Closing Test Gap Under No Child," the headline said. Maria Glod summarized the claim which would drive this pleasing report:
GLOD (10/2/08): Since enactment of the No Child Left Behind law, students from poor families in the Washington area have made major gains on reading and math tests and are starting to catch up with those from middle-class and affluent backgrounds, a Washington Post analysis shows.
According to Glod, kids from poor families were catching up with kids from middle-class and affluent backgrounds. She soon offered a taste of the evidence:
GLOD: In [Maryland's] Montgomery County, for instance, students in poverty have earned better scores on Maryland's reading test in each of the past five years, slicing in half the 28 percentage-point gulf that separated their pass rate from the county average.
In Montgomery County, a higher percentage of poor kids have been passing the Maryland state reading test. The gap in passing rates (poor kids versus all kids) has been cut in half since 2003.
This sounded like extremely good news, and Glod's report included this graphic, showing passing rates on state tests for various DC-area counties. At a glance, the evidence looked sound and solid. As big newspapers have done for decades, the Post was giving its readers good news. Low-income kids were catching up. That achievement gap was dramatically narrowing.
But in fact, there's no way to tell from those passing rates whether that achievement gap has narrowed; there's a groaning methodological blunder at the heart of the Post's statistical reasoning. Has the achievement gap narrowed since 2003 in these DC-area counties? It's possible, but it's also possible that the gap has grown wider. There's no way to tell from Glod's work.
We humans love the stories we like, and big newspapers like the Post have always loved this type of story. For decades, they have found ways to tell the public that low-income students are catching up, that the problems of low-income education are now being solved. But as they've typed these pleasing tales, they've left more than a few statistical bungles behind. Glod's report is the latest example of the Post's statistical bungling.
What's wrong with Glod's statistical reasoning? First, let's look at a minor oddity in the way she presents her data. Then we'll consider the basic error at the heart of her reasoning. For the sake of simplicity, we'll consider Montgomery County alone, although the points we make below apply to all these DC-area counties. And we'll talk about the reading tests, as Glod does in the passage we've quoted. Her charts show results on state math tests too. Her bungle applies there also.
A minor oddity: For the record, there is a minor oddity in the way the Post has presented its data. We don't know why the Post chose to present the data this way. But this minor oddity tends to shrink the size of the achievement gap we're hoping to eliminate. And it tends to disguise the groaning problem with Glod's basic type of analysis.
What is that minor oddity? The Post could have made a direct comparison between low-income kids in Montgomery County and kids of higher incomes. (Those from middle-class and affluent backgrounds, to use Glod's language.) After all, the question we're asking is fairly simple: Are low-income kids catching up to kids from more affluent backgrounds? But that's not what the Post chose to do in the data found in its graphic. Instead, the Post compares the passing rates for low-income kids to the passing rates for all Montgomery County students. In that latter measure, the lower passing rate of the low-income kids drags down the overall passing rate, thus obscuring the size of the gap between low-income kids and kids of higher incomes.
Here's the problem caused by this minor oddity:
For 2008, the Post's graphic seems to show that roughly 80 percent of Montgomery's low-income kids passed the state reading test. (That's a huge jump from 2003, when roughly 40 percent passed.) Since the graphic says that the gap in passing rates was 14 percentage points, that would mean that roughly 94 percent of all county kids passed the test in 2008. But uh-oh! The low-income kids are included in that latter statistic; this means that, on their own, the higher-income kids must have had an extremely high passing rate. Their passing rate must have approached (or equaled) 100 percent. And that's where the problem is lurking.
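For readers who like to check the arithmetic, here's a rough sketch in Python. The passing rates are the approximate figures read off the Post's graphic; the share of low-income test-takers is a hypothetical number of our own, since the Post's presentation doesn't say what it actually is.

```python
# Rough check of the arithmetic above, using approximate figures read off
# the Post's graphic. The low-income share of test-takers is hypothetical.
low_income_pass = 0.80    # roughly 80 percent of low-income kids passed (2008)
gap = 0.14                # the graphic's gap between low-income kids and all kids
all_kids_pass = low_income_pass + gap   # so roughly 94 percent of all kids passed

# "All kids" includes the low-income kids, so the higher-income rate must satisfy:
#   all = share * low_income + (1 - share) * higher_income
share = 0.25              # hypothetical share of low-income test-takers
higher_income_pass = (all_kids_pass - share * low_income_pass) / (1 - share)
print(round(higher_income_pass * 100, 1))   # about 98.7 percent: near the ceiling
```

Whatever share you plug in, the higher-income passing rate comes out pressed against 100 percent, which is the ceiling problem discussed below.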
The bungle: Did low-income kids do better in 2008 than in 2003? Their passing rate on the state reading test doubled, from 40 percent to 80 percent. If the state reading tests have remained the same, that is, if the state hasn't made its reading tests easier (more below), that would suggest that Montgomery County's low-income kids are reading much better today.
But of course, that isn't Glod's claim. Glod doesn't simply claim that low-income kids are doing better. According to Glod, Montgomery's low-income kids are closing the achievement gap with their higher-income peers. And we're sorry, but there's absolutely no way to judge that claim based on these inadequate data. In reality, the gap may have widened in the past five years. Here's why:
For the sake of argument, let's assume that the tests were just as hard in 2008 as they were in 2003. That would suggest that Montgomery's low-income kids are reading much better than five years ago. But it may be that the higher-income kids have improved their reading even more, thereby widening the gap. It's possible that this is the case. But if it is, you simply can't tell from passing rates on that state reading test.
Duh. In 2003, the test was so easy that the vast majority of higher-income kids were already passing it. For this reason, their passing rate could, at best, increase by just a few points. (Gruesome: Due to the Post's inept presentation of data, there is no way to tell how high that passing rate was.) It may be that Montgomery's higher-income kids are reading much better than their peers in the past. But their peers were already passing this test. Thus, there would be no way to demonstrate that improvement from current passing rates on this test.
Has the achievement gap narrowed in reading, in math? It's possible, but there's no way to tell from these data. It's like weighing kids with a scale that only goes up to 100 pounds. If kids gain a lot more weight after that, you have no way to measure it.
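To make the scale analogy concrete, here's a toy illustration in Python. Every number in it is invented; nothing comes from the Post's data. It simply shows how a measure with a ceiling can report a shrinking gap even while the real gap widens:

```python
# Toy illustration of a ceiling effect (all numbers invented).
# The "scale" tops out at 100, like a test almost all higher-income kids pass.
CAP = 100

def measured(true_score):
    """What the capped scale reports for a given true score."""
    return min(true_score, CAP)

# Hypothetical true average scores on an uncapped scale.
low_2003, high_2003 = 60, 110    # true gap in 2003: 50 points
low_2008, high_2008 = 90, 160    # true gap in 2008: 70 points (it widened)

measured_gap_2003 = measured(high_2003) - measured(low_2003)   # 100 - 60 = 40
measured_gap_2008 = measured(high_2008) - measured(low_2008)   # 100 - 90 = 10
print(measured_gap_2003, measured_gap_2008)   # prints 40 10: the gap "narrows"
```

On the capped measure, the gap falls from 40 to 10, even though the true gap grew from 50 to 70. That, in a nutshell, is why passing rates on an easy test can't settle Glod's claim.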
What Glod should have done: This is a groaning statistical bungle. Are low-income kids closing the gap? There is no way to tell from these data. Glod has bungled ginormously.
No, you can't tell from passing rates if the gap is closing. So what should Glod have done with this piece? Can we get serious here?
The basic question here is obvious: Have the state tests gotten easier? We'd be inclined to bet that they have. But the Post has been ducking this question in the past year, since the question surfaced locally, behaving in the way the press corps always has when the issue is low-income children. (And yes, we mean for decades.)
Have the state tests gotten easier? If not, amazing gains have been recorded in passing rates among low-income students. You'd think the Post would want to know if those gains in passing rates are real, or if the gains are just a mirage, induced by an easier test. In recent years, questions like this have been asked on the national level; the problem came home in the past few months, when evidence suggested that Maryland's tests may have gotten easier. But on Thursday, the Post just stumbled and bumbled forward, putting its latest feel-good, bungled story out there on page one.
Post readers got a nice warm feeling from the report on Thursday's front page. But they got it from hopelessly bungled work, from the kind of feel-good tripe about low-income kids your big papers love to hand you.
How low will they go: How low are papers willing to go to peddle these feel-good tales? In 2006, the Post praised a low-income school at the top of page one, calling it (in its headline) "A study in pride, progress." But as it turned out, the school in question had the second-lowest third-grade reading score in the whole state of Virginia! (At the time, the state only tested third and fifth grades.) That's right! The second-lowest school in the state was hailed at the top of the Post's page one! That too was produced by a groaning statistical bungle, the kind this great paper loves. See THE DAILY HOWLER, 3/16/06, with links to previous work.
This has now gone on for decades. For reasons only they can explain, big newspapers love to pretend that the problems of low-income education are on the verge of elimination. Darlings! It makes the swells feel delicious! Suitably thrilled by their bogus tales, they return to their fine cribbage games.
In e-mail format: For a simpler version of this tale, here's that e-mail exchange:
E-MAIL (10/3/08): Can't resist your challenge. The Post reported that results improved using the same test. I'll guess that the problem you focused on was teaching to the test. In fact, the article you cited says of one of the schools, "The school focused on material covered in the test."
With lightning speed, we replied:
REPLY: No stuffed animal for you today, my friend!
Lesser oddity: The Post graphs compare the low-income kids to all kids (instead of comparing them to the higher-income kids). There's nothing "wrong" with that, although it's an odd way to organize the data. But it helps disguise the real problem with what the Post did.
The real problem (I'll focus on the reading test):
In a county like Montgomery, almost all the higher-income kids were already passing the reading test in 2003. (This fact is slightly disguised by the way the Post lumps the lower-income kids in with the higher-income kids.) There was little way for this group to improve its passing rate. On the other hand, only about 40 percent of the low-income kids were passing the test in 2003. There was a lot of room for their passing rate to go up.
Maybe the state made the tests easier. Maybe the schools began cheating, or teaching to the test. Or who knows: Maybe they actually improved their instruction, in a very significant way. At any rate, the passing rate has gone way up for the low-income kids, but that doesn't mean they've actually gained on the higher-income kids. Who knows? The higher-income kids may all be working at grad-school level by now. But there's no way to show that on this test, which the county's higher-income kids were already passing in 2003.
It's like weighing children with a scale that stops at 100 pounds. If the heavier kids keep gaining weight, there's no way that scale can record it. It can show you that the lighter kids are gaining weight, and that may make it seem like they're closing the gap on the heavier kids. But who knows? The heavier kids may all weigh 600 pounds by now. There's no way to tell when the scale has an upper limit, as the state reading test does.
People who haven't pored over these kinds of data wouldn't be likely to notice this statistical bungle. But it should have occurred to people doing education work for a big paper like the Post.
Once again, the crucial point: Those gains in passing rates don't mean squat if the tests have gotten easier. But let's face it. The Post will examine that seminal question when fine wine freezes in hell.