We'd like to return to other topics—but testing-and-reporting does matter. And this morning, the Post runs another large story about a school system's preparations for this spring's high-stakes tests. Indeed, here's the headline atop Nick Anderson's piece: "What Do We Want? High Test Scores!" In the sub-headline, the excitement continues: "Prince George's Schools on Watch List Pile on Drills, Cheers as State Tests Loom."
Yes, this report takes us back to low-income, majority-black Prince George's County, just outside Washington, D.C.—and it describes the way the county is getting kids ready to take this spring's high-stakes tests. Anderson's report is quite informative, but we do have a standard objection. Although Anderson describes the county-wide frenzy which is building as the test date nears, he never mentions a salient fact. He never notes that pressures like these have produced many cases, all over the country, in which teachers, principals, and school systems have outright cheated on tests. Almost surely, the desire to get those test scores up produced the odd procedure in the state of Virginia which we've discussed for these several weeks (see below). No, no one is saying that Prince George's County is doing something wrong as it preps for these tests. But we did start a bit at this statement:
ANDERSON (3/14/06): Prince George's officials have tried nearly everything in recent years to raise scores.

Ouch! The history here is clear, nationwide; all too often, when school officials have "tried nearly everything," they have ended up slipping over the line. Let's stress again—no one is saying that Prince George's County has engaged in any misconduct at all. But this has been an important part of the national testing story for the past forty years—and it's a part of this story which mainstream reporters almost never mention. In a story of this length, we think this omission is unfortunate. But then, reporters rarely display awareness of this general problem. (Note Anderson's unblinking report about one school which has recorded "phenomenal gains." In our experience, schools which record phenomenal gains are schools which should be double-checked.)
That said, we especially started at one part of this report. Yes, Prince George's schools are holding pep rallies to motivate students for the big test. But at one point, Anderson described an educational practice which struck us as a bit odd:
ANDERSON: Like other schools, James Madison [Middle School] broadcasts MSA vocabulary words every morning over the loudspeaker. One recent word: "innocuous." Definition: "harmless, producing no injury." An announcer gave an example of usage tailored to adolescents: "No gossip is innocuous. Gossip always hurts somebody."

Really? The school broadcasts specific vocabulary words from the MSA (Maryland School Assessment) every day on the intercom? Let's get technical: On a nationally norm-referenced, standardized test, this practice would plainly be inappropriate; you'd be preparing your students for specific test items in ways the norm group had not been prepared. This makes such a test invalid. By contrast, on a test like the MSA, the practice may be completely appropriate—but it does strike us as slightly odd. Is this really being done in all the state's schools? In a uniform way? Does it involve specific vocabulary words which will appear in specific test items? If so, what's the theory behind this sort of preparation? Don't kids need to know lots of words—not just the handful which appear on one test? And by the way: If kids are being prepped in item-specific ways, are we really surprised when it turns out that they do fairly well on these tests?
Let's stress this again: This practice may be wholly appropriate. Who knows—it may even make sense! But it did strike us as somewhat odd—and we're accustomed to a world in which education reporters fail to note irregular practices. At the Maryland web site, we can't find examples of vocabulary items from past tests. This procedure may be completely A-OK. But we're going to make a couple of calls to double-check on this slightly odd practice.
By the way: If kids really learn from school-wide intercom broadcasts, why do we bother having teachers? Wouldn't we save a lot of dough if we just let the principal do the work?
Continuing story: Yes, Virginia!
HOW LOW DID THEY GO: Should the Washington Post have presented Maury Elementary (Alexandria, Virginia) as a study in "pride, progress"? That's how the paper described the small school in a top-of-the-front-page report last month (full links below). But how low did the Post really go in its search for a heart-warming story? With Virginia's school report cards accessible again, we've checked through the state's school systems, trying to find other schools which scored as low in third-grade reading as Maury seems to have done last year. As we have noted, only 27 percent of Maury's third-graders seem to have passed the state Reading/Language Arts test—a test which was passed by 77 percent of third-graders statewide. (From now on, we'll just call it "reading.") Did any school in the state do worse? We have found only one such school: Annie B. Jackson Elementary School of tiny, rural Sussex County, where only 22 percent of third-graders passed the state reading test last spring. (Assiduous reader RC reports the same tentative finding. We're disregarding the Richmond Alternative School, for reasons explained below.) In short, when the Post hailed Maury as a study in progress, it was hailing a school with one of the lowest reading performances in the entire state of Virginia! As we've noted, only two grade levels were tested last spring—third grade and fifth. And in third-grade reading, only one school scored lower—one school in the whole state!
(Maury did score fairly well at the fifth-grade level. According to its school report card, 83 percent of Maury fifth-graders passed the state reading test, compared to 85 percent of fifth-graders statewide.)
Readers, what ever gave the Post the idea that Maury was a study in progress? As we've noted, it was the school's combined "Grade 3 and 5" passing rate in reading—a passing rate which appears at the top of the Maury report card. According to that pleasing statistic, 92 percent of Maury students (Grades 3 and 5 combined) passed the English test last year. (Inexcusably, the state uses a confusing array of names for its Reading/Language Arts test.) We've described the absurd statistical procedure which seems to have yielded that pleasing statistic. For today, let's just note that schools all across Virginia seem to have gained from this procedure. Did Christmas come early at Maury last year, transforming a 27 percent into a pleasing 92? If so, then Christmas came early at many schools—although none of them seem to have gained as much from this rate-shifting process as Maury.
Yes, all around the state of Virginia, schools boast a passing rate for "Grade 3 and 5" English which can't be derived from the passing rates of the two grades individually. Consider J. L. Francis Elementary in Richmond, for example. According to its school report card, 60 percent of the school's third-graders passed the reading test last spring, along with 68 percent of fifth-graders. But the combined passing rate, at the top of its school report card, is much more pleasing—85 percent! This pattern obtains in schools throughout the state, including award-winning Norfolk City—although nowhere to the extent seen at Maury. As such, it does seem that this odd statistical procedure—a procedure in place since 2001—has been systematically misleading citizens in every part of Virginia. We think it's time that the state's big newspapers investigate and report this odd pattern.
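The arithmetic behind that claim is worth spelling out. A genuine combined passing rate is just a weighted average of the two grade-level rates, so it must land somewhere between them—it can never exceed both. A minimal sketch (the 60 and 68 percent figures are from the J. L. Francis report card discussed above; the enrollment splits are hypothetical, since the actual counts aren't published in the story):

```python
def combined_rate(rate3, n3, rate5, n5):
    """Pooled passing rate: total passers divided by total students tested."""
    passers = rate3 * n3 + rate5 * n5
    return passers / (n3 + n5)

# J. L. Francis Elementary: 60 percent of third-graders and 68 percent
# of fifth-graders passed. The (n3, n5) enrollment splits below are
# hypothetical. Whatever the split, the pooled rate stays between
# 60 and 68 percent -- it cannot reach the reported 85.
for n3, n5 in [(50, 50), (30, 70), (70, 30)]:
    r = combined_rate(0.60, n3, 0.68, n5)
    assert 0.60 <= r <= 0.68
    print(f"n3={n3}, n5={n5}: pooled rate = {r:.1%}")
```

However the state weights the two grades, no honest pooling of 60 and 68 yields 85—which is why the reported combined rates point to some other procedure entirely.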
For the record, no district seems to have gained as much from this procedure as Alexandria. Last year, the system reported results for thirteen elementary schools—and in a good number, combined "Grade 3 and 5" passing rates were substantially inflated. Example: James K. Polk Elementary School. Deep down in its school report card, we see that 67 percent of third-graders passed the reading test, along with 65 percent of fifth-graders. But at the top of the card, we get the passing rate for the two grades combined—84 percent! Ditto Cora Kelly Magnet Elementary. Deep down in the data, we see that 60 percent of third-graders passed, along with 80 percent of fifth-graders. But what do we see at the top of the card? Happy days are here again! The passing rate for the two grades combined is presented as 90 percent! But then, Christmas came early at Jefferson-Houston Elementary too, where 44 percent of third-graders passed, along with 71 percent of fifth-graders. What does it say at the top of the school's report card? Combined passing rate, 75 percent! In these schools, as in schools all over the state, combined passing rates are substantially higher than the passing rates of the two grades at issue. No, nobody's passing rate was jacked up more than the passing rate at Maury. But if these passing rates derive from a bogus procedure, then bogus data have been peddled all across the state of Virginia—bogus data which persistently overstate school passing rates, of course.
As we've seen, two explanations have been offered for this odd phenomenon. When we asked about Maury's contradictory data, Alexandria testing director Monte Dawson described an absurd statistical procedure, even sending us detailed material (apparently from a technical manual) which explained how this bizarre procedure works. Later, in an on-line reply in the Post, Jay Mathews gave a different explanation; he described a more sensible process, but he completely misstated what Dawson had told us, and many of the statistical complexities he described simply made no sense if his explanation was accurate. Beyond that, he didn't seem to have asked Dawson about what we had been told, although Dawson had been his original source for the story about Maury's high passing rates. In short, as a piece of basic journalism, his report didn't seem to add up.
What explains the statewide pattern of apparently inflated passing rates? Will the real explanation please stand up? Last week, we thought we'd turn this puzzling story over to real news orgs. But with news orgs seeming to drag their heels, we'll now go back to Jay's basic sources to try to resolve this puzzle.
But understand: Last spring, all third-graders in the state of Virginia were given the third-grade Reading/Language Arts test. Statewide, 77 percent of third-graders passed on that standard first testing. But at Maury, only 27 percent of third-graders passed—the second-lowest result in the state. How low was the Post prepared to go in its search for a heart-warming story? In a rational world, Maury's performance would have been cause for major concern. In our world—a world which loves a feel-good tale—the school was a study in progress.
By the way—Mathews quoted Maury parents who were thrilled by their school's progress. Did they know about the school's third-grade passing rates? (The rate was also quite low in math.) Would these parents have been so pleased if they knew that their third-graders had the second-lowest score in the state? Did they know how low the Post would go to hand them a heart-warming story?
DISREGARDING ONE ALTERNATIVE: One other school scored lower than Maury in third-grade reading—the Richmond Alternative School, where zero percent of third-graders passed last spring's reading test. But Richmond Alternative is a very small K-12 school which had no female third-graders last year. Published info is hard to come by, but it seems to have been described in the Richmond Times-Dispatch as a school for disruptive students.
WE'D LOVE TO SEE THE TIMES-DISPATCH REPORT: We'd love to see state or national orgs report on Virginia's school report cards. Do citizens understand the contradictory data routinely found in these reports? We doubt it; in fact, we're certain they don't. We've been working on this topic for a month, and we're still not sure that we know just how these dueling data are derived. How did Maury's 27 percent become a pleasing 92? And do Richmond parents understand these matters? We'll promise you: No, they do not.
We'd love to see the Times-Dispatch report this story. No one understands those school report cards—and if Dawson gave us the straight dope, the state has been using an absurd procedure which is churning out embellished data. Or is it OK to report false scores if we're talking about low-income kids?

Richmond Times-Dispatch:
Pam Stallsmith, State Government & Politics/Special Assignments
Louise Seals, Managing Editor
Tom Kapsidelis, Virginia editor
Jeff Schapiro, State Government
Tyler Whitley, Politics
Holly Prestridge, Education reporter
Lindsay Kastner, Education reporter
Juan Lizama, Education reporter
Olympia Meola, Education reporter
BASIC LINKS: On February 2, Maury hit the top of the Post's front page. You know what to do—just click here.
We questioned this story the following week. See THE DAILY HOWLER, 2/6/06, then click forward from there.
Post reporter Jay Mathews followed up on February 28. Click here and you can read every word.
We responded all last week—and the state of Virginia has hidden its data. See THE DAILY HOWLER, 3/7/06, for our first installment.
For a list of Alexandria schools, just click here. Keep clicking to see each school's report card.