THERE’S SOMETHING ABOUT MAURY! The Washington Post must insist: Will the real Maury School please stand up?
THURSDAY, FEBRUARY 9, 2006

LIBERALS HAVE A MAJOR PROBLEM: “Tweety Tries to Make Up With Us,” Atrios writes—extending the groaning downward spiral of the activist liberal web.

Atrios links to a post at Crooks and Liars—a post which notes the way Chris Matthews challenged the Admin’s rationale for Iraq last night. “If this is what [Matthews] really thinks, why do we so rarely hear it?” he asks. But Matthews has taken this approach for years—over and over and over and over. He turned on the war in the fall of 2002; from that point on, he has done cable’s most skeptical work on this subject. As we’ve long written, in massive detail, Matthews has long been a cable train wreck—and he’s done huge damage to Major Dem Pols (Kerry being the major exception). But he’s done this kind of work on Iraq for years. It’s hard to believe that Atrios—or Crooks and Liars—doesn’t know it.

But liberals have a major problem. For years, career liberal writers at the major journals refused to discuss the mainstream press corps. As part of this general failure, they have refused to discuss the astonishing work of Matthews, the loud cable talker. In the case of both Clintons—and in the case of Candidate Gore—Matthews has been an utter disgrace; for example, he was the most significant (and disingenuous) cable player in the two-year War Against Gore. But career liberal writers have disgraced themselves by their endless refusal to deal with this topic. Presumably, it’s an age-old story: They want to go on Hardball too, and reap the massive career advantages. Young career liberal writers look ahead to their future—and to the riches and fame they will gain.

For whatever reason, career liberal writers have refused to discuss Chris Matthews—which has given rise to the often-laughable work now driving the activist web.

Surely, Atrios knows that Matthews has challenged the Bush Iraq policy for years. We’ll assume Crooks and Liars knows this too. But today, these sites seem to gambol and play, as activist sites on the liberal web have increasingly done in the matter of Matthews. Incredibly, Crooks and Liars asks if the Open Letter site has been “having an effect” on Matthews—and Atrios admiringly links you to this ludicrous musing. This is simply self-pimping twaddle. Matthews has expressed these views on Iraq for years. Do you think that these writers don’t know it?

So liberals have a major problem. Career liberals avoid discussing the mainstream press; they have largely refused to do so for years. Meanwhile, the activist web is increasingly driven by a gang of juvenile screamers. Matthews has challenged the Iraq policy for years. It’s appalling to see our new class of “leaders” pretending that they don’t know it.

Liberals have a major problem. We’ll discuss these matters in greater detail in the days ahead. By the way: We read the olde Atrios, with admiration, for years. We’d love to hear his explanation for this utterly silly, absurd post.

LIKE US, HE STILL READS ATRIOS: Most Atrios readers seem involved in deciding who is and isn’t a “f*cker.” Despite that, one of his readers knows the obvious facts about Matthews’ long-standing approach to Iraq. “I know there is more perceptive ability here than is being shown,” he begins. And yes—he’s referring to Atrios, not to Matthews. You know what to do—just click here. Does Atrios really not know this?

P.S. We know—we know. You don’t like this.

Special report: There’s something about Maury!

PART 4—WILL THE REAL MAURY PLEASE STAND UP: There seem to be two Maury Elementary Schools in Alexandria, Virginia.

Last week, one Maury was found at the top of the Post’s front page, described in a major headline as “A Study in Pride, Progress.” Inside this low-income school, 92 percent of third- and fifth-graders passed the Virginia state “English” test last spring. This test (also called “Reading/Language Arts”) has two parts—reading and writing. It’s a major part of Virginia’s “Standards of Learning” tests—tests the state uses to meet its obligations under No Child Left Behind.

That first Maury School is “a study in progress.” But there seems to be a second Maury—a school described in an official state of Virginia “school report card” (click here). Inside that low-income school, only 27 percent of third-grade students passed that Reading/Language Arts test last year. And no one could really be “proud” of that score—unless the kids at Maury don’t count. Across the state of Virginia last spring, 77 percent of third-graders passed that very same test.

One Maury School is a study in progress. The other Maury seems to be floundering. It seems to be a low-income school whose children need tons of help, not a front-page free ride from the Post, with a photo of a gorgeous child smiling.

But which school is the real Maury School? The “study in progress” described in the Post? Or the school whose scores are a study in failure? At this point, we simply can’t tell you—although we’d be likely to bet the ranch that the low score is more on the mark. (For Part 1 of our four-part series, see THE DAILY HOWLER, 2/6/06.)

Some HOWLER readers have actually tried to work their way through this puzzling matter. They’ve tried to decipher the material we were sent when we asked the Alexandria schools to explain how that troubling 27 percent became a more pleasing 92 (see THE DAILY HOWLER, 2/8/06). Down below, we’ll post three readers’ efforts—along with a further explanation from the Alexandria testing director. But in fact, the material we were sent is so hard to decipher—and the system described sounds so bizarre—that we simply can’t answer that basic question. We can’t tell you how 27 became 92—how a school with that very low passing rate became “a study in progress.”

We can say several things about this episode, which is both absurd and familiar.

First, the state of Virginia should be embarrassed—ashamed—to publish that “school report card.” As we noted on Monday, that report is a study in incoherence. There are too many problems to summarize here—we’ll go into more detail tomorrow—but the most basic problem is obvious. In one chart, near the top of the card, we seem to see that 92 percent of Maury’s third- and fifth-graders passed the state “English” test last year. In another chart, though, we see something different; we see that only 27 percent of the school’s third-graders passed the very same test. Obviously, no—those numbers don’t jibe. And there is absolutely no way—no way at all—for a citizen to reconcile them. There is no way for a Maury parent to comprehend those contradictory scores.

The problem here is obvious. The one great part of No Child Left Behind is its testing-and-reporting requirement. Every school must test its kids every year—and it must report the results. Theoretically, this requirement makes it hard for a school (or school system) to sell the public a bill of goods about the wonderful progress it’s making. But this basic theory collapses in the face of this absurd school report card.

Second, the Washington Post should get to work explaining this puzzling incident. Perhaps some good explanation does exist, although we’ll vote with the skeptics. By e-mail, reporter Jay Mathews tells us that he never saw the chart with the 27 percent passing figure. (“I have not seen the 27 percent figure you are referring to. If it was in the packet of stuff I got from [testing director] Monte Dawson, my eye passed over it.”) But the chart is a basic part of Maury’s official report card—and it also shows an extremely low passing rate for Maury third-graders in math (see THE DAILY HOWLER, 2/8/06). And no, this isn’t some sort of typo. In his e-mails, Dawson hasn’t tried to disown those numbers; he has simply tried to explain how 27 became 92. Dawson has been prompt and responsive, but as we’ve noted, the official state material he sent us is almost completely incomprehensible. As we’ll see below, math professors are struggling to figure it out. But the basic fact remains: Maury’s third-graders had very low passing rates (in reading and math) in the regular administration of last year’s tests, in which only third- and fifth-graders were tested. The paper which called Maury “a study in progress” at the top of page one needs to explain what that means.

Finally, a bit of perspective:

As a society, we began to worry about low-income, minority schools at some point in the 1960s. Soon, standardized test scores were being published in big-city newspapers—and journalists began to note, with dismay, that very low scores were being recorded in most low-income urban schools. The result was almost instantaneous. When the public began to “demand” better scores, schools and school systems began cheating to attain them. Over and over—and over and over—cheating scandals were alleged and documented. And yes, we’re talking about real “cheating” here—we’re not discussing “teaching to the test.” Journalists have never shown much of a taste for this topic. But schools and school systems have cheated in every way you can think of—and in a few ways that you can’t. In fact, the latest strange New York City incident was reported in yesterday’s Times! You know what to do—just click here. Comments will follow tomorrow.

In short, ever since school systems were forced to publish their scores, they have looked for ways to get around the bad scores. We’d all be fools if we didn’t suspect that this is just one more example.

So will the real Maury please stand up? We can’t tell you which Maury is the real deal—but the state of Virginia and the Post need to do so. Understand this again: At Maury Elementary, a low-income school, 27 percent of third-graders passed the state reading test last year. Across the state, 77 percent of third-graders passed the same test—nearly three times the rate at Maury. But atop page one of the Washington Post, we’re told that this same Maury Elementary is “a study in pride and progress.” Is failure good enough for Maury’s kids? Can you say SBOLE, boys and girls? Soft bigotry of low expectations?

TOMORROW—27 IS ENOUGH: We return to that high-minded Post editorial, which says how “successful” Maury is. And we take a look at the state of Virginia’s completely inept “school report card.”

PROFESSORS RISE TO THE CHALLENGE: How did 27 become 92? Intrepid readers rose to the challenge of sorting that out. They’ve tried to decipher the material we were sent in reply to our initial query. What follows is the basic “explanation” we received—and then, the decryption attempts of three of our readers, two of whom are college professors. Finally, we post a second e-mail from Alexandria’s testing director—a prompt reply to our request for clarification. Through it all, the obvious question: How can parents understand this stuff when professors can’t puzzle it out?

First, the original explanation, from an unnamed Virginia state document:

VIRGINIA STATE DOCUMENT: Remediation Recovery, which started with the 2001 SOLs, is a third reason for apparent score disparities. Students in grades 4, 6, or 9 may retake failed English: Reading/Literature and Research or mathematics tests for grades 3, 5, or 8, respectively, following a Remediation Recovery program. Additionally, students who failed Algebra I, Geometry, or Algebra II and who are enrolled in a Remediation Recovery program may retake a given EOC mathematics test. Tables 6, 7, 15, 17, and 20 display the number of students who retook the failed SOL, the percentage who passed, the number who passed (Bonus number), and the potential benefit to the school (Recovery Bonus or Unadjusted + Recovery score). In the State's calculations to determine accreditation, the number of students who pass the targeted test following a Remediation Recovery program will be added to the number of students who passed the SOLs in the same content area. For example, a fourth grader’s passing grade 3 mathematics score will be added to that school’s grade 3 mathematics passing scores. At other grade levels, the passing mathematics score will be added to the school’s “collapsed” SOL mathematics scores (for accreditation calculations, all mathematics scores are collapsed or averaged together to create one passing percentage). Remediation Recovery students will be included in the unadjusted number of students who passed, but not in the number of students tested, hence the term Recovery Bonus. Said another way, passing Remediation Recovery students are added to the numerator, but not to the denominator. What this means is that a passing percentage exceeding 100 percent is possible (Note: while this overview reports percentages more than 100 percent, the State caps pass rates at 100 percent).
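
For readers who want the arithmetic spelled out, here is a minimal sketch, in Python, of the rule this passage seems to describe: passing Remediation Recovery students go into the numerator but not the denominator, and the result is capped at 100 percent. The figures in the example are invented for illustration; they are not Maury's actual counts, which the state material does not let us reconstruct.

    def recovery_bonus_pass_rate(regular_passers, regular_takers, recovery_passers):
        # Pass rate as the state document seems to define it: Remediation
        # Recovery passers are added to the numerator only, and the result
        # is capped at 100 percent.
        rate = 100.0 * (regular_passers + recovery_passers) / regular_takers
        return min(rate, 100.0)

    # Invented figures: 10 third-graders tested, 3 pass; 4 older students
    # pass the retake after a Remediation Recovery program.
    print(recovery_bonus_pass_rate(3, 10, 4))  # 70.0, where a straight pass rate would be 30.0
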
Hmm. On to the first attempt at decryption:

E-MAIL THE FIRST: Yes, that passage is pretty unclear and convoluted. But if you ignore all the rules and mumbo-jumbo outlined at the beginning of it, the most important part of the paragraph becomes clear:

"At other grade levels, the passing mathematics score will be added to the school’s “collapsed” SOL mathematics scores (for accreditation calculations, all mathematics scores are collapsed or averaged together to create one passing percentage). Remediation Recovery students will be included in the unadjusted number of students who passed, but not in the number of students tested, hence the term Recovery Bonus. Said another way, passing Remediation Recovery students are added to the numerator, but not to the denominator. What this means is that a passing percentage exceeding 100 percent is possible (Note: while this overview reports percentages more than 100 percent, the State caps pass rates at 100 percent)."

What this appears to be saying is that they don't take a straight-out average of the scores of all students that took the test. An example seems to be the best way to explain:

Students A, B, and C get scores of 90, 60, and 50, respectively (assume these are all "passing scores"). Their average score is then (90+60+50)/3 = 66.67. Now, say student D is one grade up from them but takes the remediated test. She gets a 100. The average should now be (90+60+50+100)/4 = 75.

What the schools seem to be doing is to take student D's score, add it to the other students scores, but then not increment the number in the denominator. So, instead of getting a 75 like in the example above, the combined scores of the students come out to (90+60+50+100)/3 = 100.

To the best of my knowledge, this is bad math no matter how you cut it. And the more students that take the remediated tests, the higher the calculated score will be. Adding any number to the numerator without incrementing the denominator will always give a higher score. A kid in 4th grade could get a 10 on the test and that would result in the school's overall average going up! A 10 probably isn't a passing grade, but I think you see what I'm getting at here.

Actually, we don’t see what our excellent mailer is getting at—because we reject the notion that something like this can be part of a public accountability program. It would be intriguing to sort this out, and the Post should speedily do so. But it defeats the purpose of No Child Left Behind to report data which result from such a bewildering statistical process—a process which even a highly skilled and experienced education writer seems to know nothing about. Meanwhile, as we struggled with such imponderables, the attempts at decryption continued:

E-MAIL THE SECOND: What the testing director seems to be saying in the passage you quote in Wednesday's Howler (and forgive me if you caught this and just didn't include the interpretation in your write-up) is that 4th graders who failed the test last year, but passed their re-take this year are counted as test-passers but not as test-takers!!!

For example: The 3rd grade class consists of 20 students. 5 out of 20 pass the test (a passing rate of 25 percent).

16 3rd graders failed the test last year and are now 4th graders. They retake the 3rd grade test and 12 out of the 16 pass.

The reported 3rd grade passing rate is now (5 3rd graders + 12 retakers) / 20 3rd graders. That is 17/20 (an 85 percent pass rate). This is an inexplicable definition of passing rate—the number of passing students is adjusted by the number of retakers but the number of test takers is not!!!

In this manner it would be easy to achieve a passing rate of greater than 100 percent (if, say, in our example all 16 4th graders had passed the retake). However, the state helpfully caps the passing rate for any school at 100 percent (presumably so that no one has to think too hard about what a passing rate of 105 percent means).
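
The mailer's arithmetic is easy to check. In this sketch, the figures are the mailer's own hypotheticals, not Maury's actual counts, and the rule is the one the state document seems to describe:

    # Second mailer's hypothetical: 20 third-graders tested, 5 pass;
    # 12 of last year's failers pass the retake of the third-grade test.
    print(100.0 * (5 + 12) / 20)              # 85.0 under the numerator-only rule

    # If all 16 retakers had passed, the raw figure would top 100 percent...
    print(100.0 * (5 + 16) / 20)              # 105.0
    # ...which the state would then cap, and report as 100.
    print(min(100.0 * (5 + 16) / 20, 100.0))  # 100.0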

Yes: As the state material says, a school’s “passing rate” can go over 100 percent with this procedure—in which case, the state modestly scales it back to “100 percent.” At any rate, a third e-mailer kicked things off with a chuckle, laughing to keep from crying:

E-MAIL THE THIRD: Can't resist a puzzle! Which is the "real" number? It all depends on what the meaning of “real” is.

Suppose you have 100 third-graders, and only 27 pass in third grade. The other 73 get to retake the test the next year following a remediation program. If 65 of those pass, the third grade pass rate rises to 92. It seems to me that there's one way this might make sense and two ways it clearly doesn't.

First, if your goal is to measure whether students are making progress, cumulative pass rates aren't insane. If the kids who failed the first time through were held back, their improved results would be part of the overall score. Measuring that kind of progress isn't necessarily ridiculous. Is this the best way? Almost certainly not, but I've seen stupider things.

On the other hand, the cumulative score is now measuring fourth-graders according to third-grade standards and reporting them as third-grade scores. The resulting score of 92 does indicate progress, but in doing so it institutionalizes a one-year delay. There is no difference between a school where 92 percent of third-graders pass the third grade test and one where 92 percent of fourth-graders pass the third grade test. Again, I've seen stupider things, but the bar is getting higher (or lower).

The truly insane thing is to compare 27 to 92 as if they measured the same thing and claim that the year-to-year improvement from adding the remediation capability means anything.

Where I suspect we differ on these issues (like we do in some of the press issues you discuss) is that I'm not sure the people involved are smart enough for this to be conscious fraud. I work with a bunch of smart people who demonstrate statistical incompetence on a daily basis. The school administrators I've met are not smarter or more mathematically inclined than the people I work with. It seems just as likely that the administrators think they know something the obvious statistics don't capture, and then keep looking until they find a "statistic" that shows what they believe they know.

I don't think that people who find such approaches valid should be allowed to infect our kids with their dangerous ideas, but I'm not sure they're intentionally malicious.

We don't live in a meritocracy. The politicians ultimately in charge, whether they succeed or fail, don't have to pass policy competence exams in order to get their jobs.

Actually, we semi-agree with this reader. We have no idea what motive explains this procedure. In our experience, we should always assume that such Rube Goldberg schemes may result from incompetence. However, “mistakes” like this have persisted for thirty-five years—and the “mistakes” always seem to cut one way. Ever since the start of test-and-report, schools and school systems have come up with ways to fool the public with gimmicked-up test scores. A serious person has to assume this may just be one more case.

Again—our excellent readers have struggled hard with this bewildering problem. But such complex calculations cannot be part of a public accountability plan. At any rate, for those who want to struggle onward, Alexandria testing director Monte Dawson did send us a clarification of that puzzling paragraph. We publish this to establish a record. We’re not suggesting this is Dawson’s fault—Dawson doesn’t run the Virginia state system. But again, this simply can’t be the way a state does its public reporting:

DAWSON E-MAIL: You are correct that the adjusted passing percentage for third grade English: Reading and Writing was 27 percent (33 percent unadjusted). Please note that Maury was more effective (71 percent) than the division (61 percent) in remediating fourth grade students. Thus, in the end, the State added in 12 passing fourth grade students at Maury when making their accreditation calculations.

Background
Remediation Recovery, which has been around since 2001, means that fourth grade students who failed the third grade test in 2004 got to retake the third grade test in 2005. Up until this year (2005), if they passed the third grade test, then they were included in the numerator only of the calculation to determine the third grade passing score. As illustration, if 4 out of 5 third grade students passed and 1 out of 5 fourth grade Remediation Recovery students passed, the passing percentage would be 100 percent. However, beginning July 12, 2005, passing Remediation Recovery students were added to the numerator and the denominator.
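
For what it's worth, Dawson's illustration checks out, and the rule change he describes matters. This sketch uses his hypothetical figures (4 of 5 third-graders pass; 1 of 5 Remediation Recovery students passes the retake) and simply contrasts the two rules as we read his description; it is our arithmetic, not an official calculation:

    third_grade_takers, third_grade_passers = 5, 4   # Dawson's hypothetical figures
    recovery_takers, recovery_passers = 5, 1

    # Rule described for the 2001-2005 reports: passing recovery students
    # counted in the numerator only.
    print(100.0 * (third_grade_passers + recovery_passers) / third_grade_takers)
    # 100.0, matching Dawson's illustration

    # Rule described as of July 12, 2005: passing recovery students counted
    # in the numerator and the denominator.
    print(100.0 * (third_grade_passers + recovery_passers) / (third_grade_takers + recovery_passers))
    # 83.33..., a rather different picture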

More tomorrow on the responsibilities—and endless failures—of our big urban papers in these matters.