Daily Howler: Jay misstates Virginia's procedure--a procedure which adds up to fraud
WILL THE REAL MAURY SCHOOL PLEASE STAND UP! Jay misstates Virginia’s procedure—a procedure which adds up to fraud
WEDNESDAY, MARCH 8, 2006

OUR SERIES ON MAURY: Our series on Virginia test scores continues. For Part 2, scroll down a tad.

MORE NOTES ON WHO CARES ABOUT ED: Over the weekend, we enjoyed an e-mail exchange with Michelle Pilecki, who disagrees with one of last week’s posts in ways we’ll let her express for herself. Check out her piece at the Huffington Post, “Who Cares About Education for Low-Income Kids?”

We admire Michelle, because she lives in Pittsburgh (which apparently makes her a mensch and a prole), while we live in Baltimore, which apparently makes us an Agent To Power. Mordant jesting to the side, we had a perfectly good exchange, and we recommend her post—a post which does in fact wander a bit away from our original topic.

Michelle says she found it “odd” when we restated our point in an e-mail. (“But here's what I said: Liberal journals and liberal bloggers don't discuss low-income education. As a general matter, I think that's completely obvious. These topics are almost never discussed.”) Indeed, we do think that statement is perfectly obvious—although we told Michelle, as we said last week, that we don’t mean that as an insult or slander, just as a statement of fact and a challenge. Do liberals discuss low-income ed in the journals and blogs where we help shape the discourse? Once again, as a general matter, we think that the answer is clear.

“Mr. Somerby has a somewhat valid point in criticizing the blogosphere for the lack of attention sparked by an excellent investigative Los Angeles Times series...on the large number of dropouts in LA's Class of 2005,” Michelle writes. She may have said that because we asked her a question. The Huffington Post is based in L.A., as are many of its bloggers. But when the Los Angeles Times did that superlative series, how many times was it mentioned in the Huffington Post? We didn’t bother to check before asking—but we’ll guess the answer is small or non-existent, and we note that Michelle doesn’t speak to this point. This wasn’t meant as a slam at the HuffPo or at its bloggers, just as a reflection of basic reality. For most liberal journals and opinion-makers, low-income ed has been off the play-list for these past many years.

We think Michelle is dreaming a bit—and we’re inclined to think that she’s settling. At one point, she claims that preschool for low-income kids “has been a hot topic:”

PILECKI: Disclosure time: In my off-web life as a freelance writer, I recently wrapped up a magazine story, to be published this spring, on one aspect of the pre-K movement, so I've been awash in research. Despite Mr. Somerby's comment, pre-K for at-risk kids has been a hot topic, e.g. in the next day's Washington Post, E. J. Dionne wrote about California's Proposition 82, which would raise taxes to guarantee preschool for every 4-year-old in the state. Further disclosure: The proposition is the brainchild of fellow HuffPoster Rob Reiner, but I know neither him nor the particulars of this pre-K proposal.

Is it true? Has “pre-K for at-risk kids been a hot topic?” Note the way Michelle closes this passage. She herself has been writing an article about some part of the pre-K movement, and even she doesn’t know the particulars of Rob Reiner’s pre-K proposal! Why is even Michelle uninformed? Because Rob’s plan is almost never discussed! For ourselves, we greatly admire Reiner for his long-standing commitment and effort. (Rob’s a big dude—he does not have to do this.) But preschool just isn’t a very “hot topic” in the venues we were discussing.

In last year’s The Shame of the Nation, for example, Jonathan Kozol discussed the declining number of low-income kids who have access to Head Start, the nation’s best-known (and largest) pre-K program. “In spite of the generally high level of approval Head Start has received over the years...40 percent of three- and four-year-olds were denied this opportunity in 2001, a percentage of exclusion that has risen steeply in the subsequent four years,” he writes (page 52). After citing some places where preschool is common, he writes this: “More commonly in urban neighborhoods, large numbers of children have received no preschool education and they come into their kindergarten year without the minimal social skills that children need in order to participate in class activities and without even such very modest early-learning skills as knowing how to hold a pencil, identify perhaps a couple of shapes or colors, or recognize that printed pages go from left to right.”

We don’t know enough about this topic to evaluate everything Jonathan said. But what was the public reaction to this part of his book? Of course! There was no reaction at all! This book was not discussed in liberal journals or blogs—including at the Huffington Post. This isn’t meant as an insult to the world’s tribal souls. But the general situation is perfectly clear.

“There's a growing national movement for universal access to pre-K, whether liberal bloggers and journals know it or not,” Michelle writes. That may or may not be true, but it misses the point we were making. In the fifth installment of last week’s six-day series (five parts of which were numbered), we were asking why the nation’s low-income education policies are often framed by a vacuous discourse—by the kind of vacuous discourse which turned a kid like Gabriela Ocampo (Los Angeles, California) into a high school drop-out. In part, we said, the discussion is empty because liberal journals and liberal bloggers almost never take part (along with many other groups, by the way). Result? The discussion is left to second-tier entities, with outcomes which are often unfortunate. As we told Michelle yesterday, our basic reaction to her post is this: Man, are you ever willing to settle! At present, there’s little discussion of low-income ed; low-income children, like Gabriela, just don’t seem to count for much. “Sir, may we have just a tiny bit more?” Michelle’s post seems to say.

Special report: Will the real Maury School please stand up!

PART 2—IN A WORD, FRAUD: Will the real Maury School please stand up? The Post’s Jay Mathews—and we at THE HOWLER—have been trying to sort out the puzzling data which appeared on the school’s state “report card” last fall (see THE DAILY HOWLER, 3/7/06; Maury is a low-income school in Alexandria, Virginia). According to one chart—a chart Jay used for a front-page report in the Post last month—92 percent of Maury’s kids passed Virginia’s “Reading/Language Arts” tests in the spring of 2005. Needless to say, that passing rate sounded extremely good. (Third and fifth grades were the only grades tested.) But uh-oh! A second set of charts on the Maury “report card” showed some very different data; in these charts, only 27 percent of the school’s third-graders passed this same state test. (Statewide, 77 percent of third-graders passed.) So which one is the real Maury School? The school where 92 percent passed the state reading test? Or the school whose third-graders performed so poorly? Which school is the real Maury School? Will the real Maury School please stand up?

Sadly, we’re still forced to think that the real Maury School is the one with those low third-grade reading scores. (Instead of saying “Reading/Language Arts,” we’ll just call it the reading test from here on.) We base that judgment on the explanation we got from Alexandria’s director of testing, Monte Dawson. It was a cosmically bad explanation—one which Mathews clearly misstated in the follow-up report he posted on-line just last week.

What is the state of Virginia’s explanation for those dueling data? How did a school in which 27 percent of third-graders passed the state test get credited with an overall passing rate of 92 percent? Last Wednesday, Jay tried to answer that question. Here’s his account of the way that miserable 27 became an impressive 92:

MATHEWS (2/28/06): I reconstructed what happened with the help of Virginia state officials. When Maury's 19 third-graders took the English test the first time last spring, five passed and 14 did not. Of the 24 fifth-graders who took the English test, 22 passed. The school worked with the third-graders who did not pass it and gave them a retest, and 12 passed on that second try.

Quick note: Jay’s fifth-grade data seem to be slightly wrong. Tedious details below.

Back to the basics: In this passage, Jay describes a testing procedure which seems to be perfectly sensible. The third-grade kids were given the state reading test last spring (the spring of 2005). Five out of 19 passed the first time; the other 14 kids were retested, and 12 of them passed on that second attempt. Hence, 17 of the 19 third-graders passed the reading test last spring. For the record, there may be a “retest bias” problem with such a procedure; if kids take the very same test two times in short order, it may skew results. But leaving aside that possible problem, Jay describes a testing procedure which makes perfectly good sense. Unfortunately, this isn’t the procedure Dawson described when we inquired about those Maury test scores. And it isn’t the procedure described by the state in a passage quoted by Jay himself in his report last week.

Jay has described a sensible process. But it seems fairly clear that this is not the process which produced Maury’s high passing rate—the passing rate which put the school atop the Post’s front page, described as a “study in progress.”

What actually produced Maury’s high passing rate? In Jay’s own report last week, he quotes some of the language sent to us by Dawson, Alexandria’s director of testing. In the following passage, Jay misstates what Dawson told us, then quotes Dawson’s actual language. As you can see from reading the Dawson quotation, Dawson doesn’t describe the sensible procedure which Jay describes in the passage above. He describes a totally different process—a process which adds up to fraud:

MATHEWS (2/28/06): Somerby dug further, and soon got an explanation from Alexandria schools testing and assessment director Monte Dawson, my prime source on the story. The big jump, Dawson explained, came mostly from the fact that 12 of those third-graders retook the test and passed it, upping the passing rate considerably, but only because of a odd mathematical rule approved by the state school board in 2000. Somerby quoted Dawson's explanation:

"Remediation Recovery, which has been around since 2001, means that fourth grade students who failed the third grade test in 2004, got to retake the third grade test in 2005. Up until this year (2005), if they passed the third grade test, then they were included in the numerator only of the calculation to determine the third grade passing score. As illustration, if 4 out of 5 third grade students passed and 1 out of 5 fourth grade Remediation Recovery students passed, the passing percentage would be 100 percent."

Dawson clearly didn’t tell us “that 12 of those third-graders retook the test.” Indeed, that clearly is not what his quotation says. That quoted passage doesn’t describe the sensible process from Jay’s report. It doesn’t say that third grade students were allowed to take the test a second time last spring. In fact, it says something vastly different; it says that “fourth grade students who failed the third grade test in 2004, got to retake the third grade test in 2005.” Jay describes a sensible process; that statement (and others—see below) describes something vastly different.

What actually happened at Maury last year? What actually got the school’s passing rate up? Believe it or not, this is what happened, according to everything we have been told: Only 5 of Maury’s third-grade kids passed the third-grade reading test. But an unspecified number of Maury’s fourth-grade kids also were given the third-grade test; when 12 of them passed the third-grade test, they were added to the total of third-graders (five) who had passed. No, none of this makes any sense—but it did get Maury’s “passing rate” up. Only 5 of 19 actually passed—till the state added in those 12 ringers.
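
For readers who like to see the arithmetic spelled out, here is a minimal sketch, in Python, of the numerator-only rule as Dawson describes it. The figures come from the passages quoted in this report; the little function is our illustration, not the state’s actual formula, and the 92 percent on the report card also folds in the fifth grade (and, per Dawson, fifth-grade writing scores).

```python
# A sketch of the "Remediation Recovery" arithmetic as Dawson describes it.
# Recovery passers go into the numerator but not the denominator; the state
# caps the result at 100 percent. (Illustration only, not official code.)

def recovery_adjusted_rate(passed, tested, recovery_passers):
    return min(100.0, 100.0 * (passed + recovery_passers) / tested)

# Maury's third grade, spring 2005: 5 of 19 passed the reading test, and 12
# fourth-graders who passed it on a Remediation Recovery retake were added in.
print(round(100.0 * 5 / 19, 1))                     # 26.3 -- roughly the rate the third-graders earned
print(round(recovery_adjusted_rate(5, 19, 12), 1))  # 89.5 -- the rate once the 12 "ringers" are added
```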

Can this possibly be the way this state is computing its “passing rates?” We know, we know—it sounds impossible. It’s hard to believe that any state would engage in such total nonsense. But that is clearly what Dawson told us when we corresponded last month. Here, for example, is the actual language he sent when we e-mailed him the first time. We posted this language in real time (see THE DAILY HOWLER, 2/8/06). Hang on for a wild ride:

DAWSON’S FIRST E-MAIL: Remediation Recovery, which started with the 2001 [state tests], is a third reason for apparent score disparities. Students in grades 4, 6, or 9 may retake failed English: Reading/Literature and Research or mathematics tests for grades 3, 5, or 8, respectively, following a Remediation Recovery program. Additionally, students who failed Algebra I, Geometry, or Algebra II and who are enrolled in a Remediation Recovery program may retake a given EOC mathematics test. Tables 6, 7, 15, 17, and 20 display the number of students who retook the failed SOL, the percentage who passed, the number who passed (Bonus number), and the potential benefit to the school (Recovery Bonus or Unadjusted + Recovery score). In the State's calculations to determine accreditation, the number of students who pass the targeted test following a Remediation Recovery program will be added to the number of students who passed the SOLs in the same content area. For example, a fourth grader’s passing grade 3 mathematics score will be added to that school’s grade 3 mathematics passing scores. At other grade levels, the passing mathematics score will be added to the school’s “collapsed” SOL mathematics scores (for accreditation calculations, all mathematics scores are collapsed or averaged together to create one passing percentage). Remediation Recovery students will be included in the unadjusted number of students who passed, but not in the number of students tested, hence the term Recovery Bonus. Said another way, passing Remediation Recovery students are added to the numerator, but not to the denominator. What this means is that a passing percentage exceeding 100 percent is possible (Note: while this overview reports percentages more than 100 percent, the State caps pass rates at 100 percent).
Yes, it’s true. Believe it or not, that’s the language Dawson sent in response to our first queries. (We’ll post his two complete e-mails below.) And no, that passage doesn’t describe the sensible process from Jay’s report. According to that passage from Dawson, fourth-graders who failed the third-grade test in the previous year take the third-grade test again, although they now are in the fourth grade—and if they pass, they get added onto the total number of third-graders who passed. This is an utterly ludicrous process, but Dawson described it again, quite clearly, when we asked for a clarification. We posted the following language last month; see THE DAILY HOWLER, 2/9/06:

DAWSON’S SECOND E-MAIL: You are correct that the adjusted passing percentage for third grade English: Reading and Writing was 27 percent (33 percent unadjusted). Please note that Maury was more effective (71 percent) than the division (61 percent) in remediating fourth grade students. Thus, in the end, the State added in 12 passing fourth grade students at Maury when making their accreditation calculations.

Background
Remediation Recovery, which has been around since 2001, means that fourth grade students who failed the third grade test in 2004, got to retake the third grade test in 2005. Up until this year (2005), if they passed the third grade test, then they were included in the numerator only of the calculation to determine the third grade passing score. As illustration, if 4 out of 5 third grade students passed and 1 out of 5 fourth grade Remediation Recovery students passed, the passing percentage would be 100 percent. However, beginning July 12, 2005, passing Remediation Recovery students were added to the numerator and the denominator.

We’re still not sure we understand all of that, for reasons which must be obvious. But in this second e-mail, Dawson made part of this process quite clear: “in the end, the State added in 12 passing fourth grade students at Maury when making their accreditation calculations.” Fourth-grade kids took the third-grade test; when they passed, they were added onto the third-grade totals. This is not the sensible process Jay described in last week’s report. Indeed, this is something quite different—an apparent fraud on the public conducted by Virginia’s state Department of Ed.
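
Dawson’s own illustration makes the arithmetic easy to check, along with the revised rule he says took effect on July 12, 2005. A brief sketch, reading his two e-mails literally (our reading, not the state’s code):

```python
# Dawson's illustration: 4 of 5 third-graders pass; 1 of 5 fourth-grade
# Remediation Recovery students passes the third-grade test on a retake.
passed_3rd, tested_3rd = 4, 5
recovery_passed = 1

# Pre-July-2005 rule: recovery passers added to the numerator only.
old_rule = 100 * (passed_3rd + recovery_passed) / tested_3rd
# Rule since July 12, 2005: passing recovery students added to both.
new_rule = 100 * (passed_3rd + recovery_passed) / (tested_3rd + recovery_passed)

print(old_rule)            # 100.0 -- matches Dawson's "the passing percentage would be 100 percent"
print(round(new_rule, 1))  # 83.3 -- once those students count in the denominator too
```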

Amazing, isn’t it? In third grade, only 5 of 19 Maury students actually passed the state reading test. This was a miserable passing rate; it was far below the 77 percent statewide passing rate. So how did Maury end up on the Post’s page one, described as “a study in pride, progress?” Simple! The state added twelve fourth graders onto that total and pretended that 17 third-graders passed! This is, plain and simple, a fraud—an act of deception against the state’s citizens and against the Maury School’s parents.

Drink it in: Five out of 19 third-graders passed. The state found a way to change 5 to 17. And the Post put the story at the top of page one, where readers were told about Maury’s great progress. So yes, we’ll say it one last time: Unless there’s something here which is still unexplained, this is an act of outright fraud—the latest in the endless drive to gimmick up public school test scores.

TOMORROW—PART 3: Why did we check Maury’s scores?

MORE INDICATIONS FROM JAY’S REPORT: We know, we know—the rational mind prefers not to think that any entity, let alone a state government, would execute so absurd a procedure. And let’s be fair; this may be why Jay misdescribed the plain language of Dawson’s two e-mails. In each of his e-mails, Dawson said it; the Maury School had “remediated” some fourth-grade kids and added them to the number of third-grade kids who passed last year’s reading test. Yes—when these fourth-graders passed the third-grade test, they were added to the puny number of third-graders who actually passed. Five out of 19 third-graders passed; by the time the state got through, we were told it was really 17.

In his report, Jay says he was told something different, something more sensible—but part of his report suggests that he may simply have misunderstood the absurd procedure the state described. We’ll now post a chunk of Jay’s report in which he describes a confusing conversation with a state Department of Education spokesman. In this passage, you see Jay struggling to explain the odd mathematics of what is allegedly a sensible process—a process which wouldn’t be hard to explain if it happened as Jay thinks it did:

MATHEWS: I reconstructed what happened with the help of Virginia state officials. When Maury's 19 third-graders took the English test the first time last spring, five passed and 14 did not. Of the 24 fifth-graders who took the English test, 22 passed. The school worked with the third-graders who did not pass it and gave them a retest, and 12 passed on that second try.

Counting third- and fifth-graders together, 62 percent passed the English test the first time. So how did we get to 92 percent passing rate for those two grades in the final tally?

Charles Pyle, spokesman for the Virginia Department of Education, explained that in 2000, the state school board changed the counting procedure to encourage more schools to do what Maury did—give the students who failed some extra help and let them try again. Often the second-test passing rates of students who flunk a test initially are lower than their class's overall passing rate, since they are the class's weakest students. So if those second-test results were combined with the first test results in the usual way, it would likely lower the overall percentage and make the school look worse than otherwise. School districts in Virginia figured this out and resisted the urge to work with their lowest-performing students and test them again.

To give schools an incentive to make that effort, the school board ordered an unorthodox change in the way the school percentage would be calculated after the retesting. If a school had 100 students, with 30 failing the test the first time and 10 of those passing the test the second time, they could add 10 to the 70 who passed the first time, divide those 80 passing students by 100, and get a nice boost from 70 to 80 percent in their passing rate. Done the conventional way, they would have had to add 30 to the denominator as they added 10 to the numerator, and gotten a passing rate of only 62 percent, lower than the 70 percent rate they had before.

Wasn't that fun? This is another good example why I am glad I paid attention in fifth-grade arithmetic.

Jay describes an absurdly complex mathematical procedure, and tries to explain its rationale. But if last spring’s failing third-grade kids were simply retested a few weeks later, that wouldn’t be hard to explain at all. If you reported results “in the usual way,” the explanation would be very simple. You’d simply give the public the facts: Third-graders could take the test two times, and 17 out of 19 eventually passed. You wouldn’t have to jump through mathematical hoops—unless the real procedure is the one Dawson described in his e-mails, the one which seems to involve the state in an act of ridiculous fraud.
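
For what it’s worth, the numbers in Pyle’s hypothetical do check out; the point is that a simple same-cohort retest would need no unorthodox rule at all. A quick sketch of both readings, using the figures quoted above:

```python
# Pyle's hypothetical, as quoted by Mathews: 100 students, 70 pass the first
# time, 30 fail, and 10 of those 30 pass a retest.
print(100 * (70 + 10) / 100)                  # 80.0 -- the "unorthodox" figure Jay reports
print(round(100 * (70 + 10) / (100 + 30), 1)) # 61.5 -- his "conventional" 62 percent, which counts test attempts

# But if the same cohort simply retests, "80 of 100 students eventually passed"
# requires no special rule. At Maury, that straightforward reading would be:
print(round(100 * 17 / 19, 1))                # 89.5 -- 17 of 19 third-graders passing, no hoops required
```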

Did Jay misunderstand what he was told? Frankly, we wouldn’t be all that surprised; the rational mind does tend to recoil against the ludicrous thing this state did. Then there’s this: If Charles Pyle is just a press spokesman (and not a test specialist), he may have misunderstood this matter, assuming that the state’s procedures couldn’t possibly make so little sense. He may have mis-explained the matter to Jay, assuming the state’s conduct made sense. But at any rate, in last week’s report, Jay does quote Dawson’s clear language (see above): “Remediation Recovery, which has been around since 2001, means that fourth grade students who failed the third grade test in 2004, got to retake the third grade test in 2005.” Fourth-grade kids took the third-grade test—and when some of the fourth-graders passed, they were added to the tiny number of third-graders who actually passed. But then, scams like this have gone on for decades, since the dawn of “accountability.” As we proceed, we’ll discuss this history in a bit more detail—and we’ll tell you why a scam like this matters.

DAWSON’S COMPLETE E-MAILS: For those who want to cross every T, here are Dawson’s two complete e-mails. By the way, Dawson was very prompt and straightforward. This policy comes from the state, not from the straightforward Dawson:

DAWSON’S FIRST E-MAIL (2/6/06):
Mr. Somerby:
Congratulations on your close reading of the State's charts. If I understand your question properly, the answers are not readily discernible from viewing the State Report Card. For State accreditation calculations, remediation recovery scores and the fifth grade English:Writing scores are included in the mix, but not shown on their tables. I've pasted an explanation of remediation recovery below.
Monte Dawson

Remediation Recovery, which started with the 2001 SOLs, is a third reason for apparent score disparities. Students in grades 4, 6, or 9 may retake failed English: Reading/Literature and Research or mathematics tests for grades 3, 5, or 8, respectively, following a Remediation Recovery program. Additionally, students who failed Algebra I, Geometry, or Algebra II and who are enrolled in a Remediation Recovery program may retake a given EOC mathematics test. Tables 6, 7, 15, 17, and 20 display the number of students who retook the failed SOL, the percentage who passed, the number who passed (Bonus number), and the potential benefit to the school (Recovery Bonus or Unadjusted + Recovery score). In the State's calculations to determine accreditation, the number of students who pass the targeted test following a Remediation Recovery program will be added to the number of students who passed the SOLs in the same content area. For example, a fourth grader’s passing grade 3 mathematics score will be added to that school’s grade 3 mathematics passing scores. At other grade levels, the passing mathematics score will be added to the school’s “collapsed” SOL mathematics scores (for accreditation calculations, all mathematics scores are collapsed or averaged together to create one passing percentage). Remediation Recovery students will be included in the unadjusted number of students who passed, but not in the number of students tested, hence the term Recovery Bonus. Said another way, passing Remediation Recovery students are added to the numerator, but not to the denominator. What this means is that a passing percentage exceeding 100% is possible (Note: while this overview reports percentages more than 100%, the State caps pass rates at 100 percent).

DAWSON’S SECOND E-MAIL (2/7/06):
You are correct that the adjusted passing percentage for third grade English: Reading and Writing was 27% (33% unadjusted). Please note that Maury was more effective (71%) than the division (61%) in remediating fourth grade students. Thus, in the end, the State added in 12 passing fourth grade students at Maury when making their accreditation calculations.

Background
Remediation Recovery, which has been around since 2001, means that fourth grade students who failed the third grade test in 2004, got to retake the third grade test in 2005. Up until this year (2005), if they passed the third grade test, then they were included in the numerator only of the calculation to determine the third grade passing score. As illustration, if 4 out of 5 third grade students passed and 1 out of 5 fourth grade Remediation Recovery students passed, the passing percentage would be 100%. However, beginning July 12, 2005, passing Remediation Recovery students were added to the numerator and the denominator.

FOR FIGURE FILBERTS ONLY: We mentioned that Jay’s fifth-grade data seem to be slightly wrong too. “When Maury's 19 third-graders took the English test the first time last spring, five passed and 14 did not,” he wrote. “Of the 24 fifth-graders who took the English test, 22 passed.” But according to the state’s grade-level charts, 83 percent of fifth-graders passed; presumably, that would be 20 out of 24 (83.3 percent, rounded down). At this grade level, it would seem that two ringers (sorry: remediated students) were added to the actual “N.” When a state is busy misleading the public, every little bit helps.
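
A quick sanity check of that back-of-the-envelope figure, for anyone who wants to follow along:

```python
# Which count of Maury's 24 fifth-grade test-takers rounds to the 83 percent
# shown on the state's grade-level chart?
print([n for n in range(25) if round(100 * n / 24) == 83])   # [20]
# So 20 of 24 fifth-graders passed; Jay's figure of 22 implies two
# Remediation Recovery students were added to the count.
```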

One last point: Virginia officials employ a vast array of names for the test in question. Sometimes they call it the “English” test (that’s the name Jay tends to use). Sometimes, they call it “Reading/Language Arts.” In his e-mails, Dawson calls it “English: Reading and Writing.” The confusion which stemmed from this Babel helped make those Virginia “school report cards” an utterly incomprehensible mess. (Yes, as we’ve noted, there was much more confusion.) But now the report cards have been taken down, for the entire state of Virginia. At present, if you live in Virginia, there’s no way to check your school’s scores.