BIG HARM, NO FOUL! Kristof assumes grave harm occurred. But he still says “no foul.”
TUESDAY, OCTOBER 25, 2005

BIG HARM, NO FOUL: We’re often puzzled by Nicholas Kristof, and today is no exception. Should Bush Admin leakers be charged with crimes? Throughout his column, Kristof seems to say “no”—even though he assumes that major harm may have resulted from their actions:
KRISTOF (10/25/05): My guess is that the participants in a White House senior staff meeting discussed Mr. Wilson's trip and the charges that the administration had knowingly broadcast false information about uranium in Niger—and then decided to take the offensive. The leak of Mrs. Wilson's identity resulted from that offensive, but it may well have been negligence rather than vengeance. I question whether the White House knew that she was a noc (nonofficial cover), and I wonder whether some official spread the word of Mrs. Wilson's work at the C.I.A. to make her husband's trip look like a nepotistic junket.

That was appalling. It meant that any person ever linked to Mrs. Wilson or to her front company was at grave risk. And we in journalism have extended too much professional courtesy to Robert Novak, who was absolutely wrong to print the disclosure.

Kristof says that the leakers’ conduct may have put many folk “at grave risk.” But he still seems to argue against criminal sanctions. We’re inclined to a “no harm, no foul” standard ourselves. But Kristof doesn’t even seem to care if “grave” harm did result.

But sic semper New York Times journalists! The following passage in today’s Post made us think of Kristof:

MILBANK AND PINCUS (10/25/05): Wilson has also armed his critics by misstating some aspects of the Niger affair. For example, Wilson told The Washington Post anonymously in June 2003 that he had concluded that the intelligence about the Niger uranium was based on forged documents because "the dates were wrong and the names were wrong." The Senate intelligence committee, which examined pre-Iraq war intelligence, reported that Wilson "had never seen the CIA reports and had no knowledge of what names and dates were in the reports." Wilson had to admit he had misspoken.

That inaccuracy was not central to Wilson's claims about Niger, but his critics have used it to cast doubt on his veracity about more important questions...

As far as we know, this is the first time Pincus has acknowledged the way he was misled by Wilson in the June 12, 2003 report referenced here. But Kristof was similarly misled by Wilson—and he hasn’t explained or corrected his work to this very day. That’s right: On May 6, 2003, and then again on June 13 of that year, Kristof wrote columns, based on info from Wilson, which included the groaning bunkum described by Milbank and Pincus above (see THE DAILY HOWLER, 10/21/05). But so what? To this day, Kristof has never explained how that happened—and his work has never been corrected. His columns sit on Nexis today, driven by glaring misstatements—misstatements which stand uncorrected. In fleeting fashion, Pincus addresses his own misstatements today. At the Times, Kristof still hasn’t bothered.

Excited pseudo-liberals don’t care about this; they enjoy misstatements made by their own team. But this kind of who-gives-a-sh*t work is typical of the grinding dysfunction that has defined the Times over the past fifteen years. Judy Miller is getting slammed now—but Kristof’s misstatements sit uncorrected. Today, the Post addresses this matter, quite late. The Times still doesn’t give a sh*t.

WE LEAD, THE POST FOLLOWS: Today, the inevitable correction:

WASHINGTON POST (10/25/05): An Oct. 24 Metro article about educational testing should have stated that consensus is building among officials that "proficient" on state tests more closely resembles "basic" on the National Assessment of Educational Progress, rather than the reverse.

Huh? We’ll explain at the end of this piece. But this corrects a minor mistake in Daniel de Vise’s report from yesterday morning, in which the scribe discussed a point we incomparably looked at last week (see THE DAILY HOWLER, 10/20/05). All too often, exciting score gains in state-run test programs have not been reflected in the National Assessment of Educational Progress—the major, nationwide federal test program described as “The Nation’s Report Card.” Where state-run programs show thrilling score gains, NAEP results often tend to be flat—thus raising questions about the gains on those ballyhooed “high stakes” state programs.

We raised this point last week—and the Post got around to it yesterday. But before we get to de Vise’s errors, let’s quote the part of his report that made the most sense. That’s the passage where he cites a Fordham Foundation semi-study:

DE VISE (10/24/05): The Thomas B. Fordham Foundation, a champion of high-stakes tests, looked at eighth-grade reading scores on 29 state tests and found that two-thirds—19 states, including Virginia and Maryland—reported gains in the past two years. None of those 19 states showed progress this year in eighth-grade reading proficiency on the [NAEP].

"As we look at those numbers, we wonder whether or not the progress being reported at the state level is for real," said Michael Petrilli, vice president of the Fordham Foundation. "Are states subtly making their tests easier in order to make their scores look better?"

Over the past two years, nineteen states showed gains on their state-run tests—and none showed gains on the NAEP! Of course, there’s a weakness with this analysis; it relies heavily on the results of a single NAEP test session. If there was some anomaly with that session, then Fordham’s comparison loses validity. It would be better to compare statewide score gains with NAEP score gains over a longer period of time. We did so in THE HOWLER last week (albeit for only two states).

Might statewide score gains be full of hot air? You’d think the public would want to know. But major papers like the Post rarely do quality work on such topics. Public ed sometimes seems like the junkyard beat, with little real effort being invested. We saw this again as we watched de Vise fumble his brief just this week.

Understandably, de Vise focused on the Post’s “home town” states—Maryland and Virginia. Indeed, in his opening, de Vise said that score gains on those states’ high-stakes tests have not been matched on the NAEP:

DE VISE: In Maryland and Virginia public schools, statewide exams are a cause for perpetual celebration. Scores go up almost every year in virtually every grade level and subject tested. On the Maryland School Assessment this year, scores rose in all 24 school systems.

But on another test, the only one given by the federal government to public students nationwide, scores tell a different story. According to the National Assessment of Educational Progress, Maryland students have improved their proficiency since 2003 in just one area, fourth-grade math. Virginia scores are up, but not by much, and eighth-grade reading performance has stalled.

“The anemic results from the nationwide test, released Wednesday, provide a sharp contrast to the dramatic gains reported by Maryland and Virginia on their statewide exams,” de Vise says. But as a general matter, this is plainly not true in the case of Virginia, as the chart which accompanies de Vise’s report makes abundantly clear.

How inept is public ed reporting? According to de Vise, statewide gains in Maryland and Virginia have not been matched on the NAEP. But in his chart, he posts Virginia scores from 1998 to 2005—and gains on Virginia’s statewide tests are, in fact, fairly closely matched by gains on the NAEP in those years. In grades 4 and 8, in both reading and math, Virginia students have made clear gains on the NAEP during this period. In grade 4 math, for example, Virginia students went from 24 percent “at or above proficient” to 39 percent during the period de Vise examined. (The state went from 71 percent “at or above basic” to 83 percent during that span.) But then, in both reading and math, Virginia kids have consistently improved their passing rates on the NAEP tests. As a general matter, Virginia’s scores seem to contradict the Fordham hypothesis. But de Vise doesn’t seem to have noticed.

(Please note: The passing rates on the NAEP are lower than on the Virginia test—but that is separate from the question of overall progress. Over the past seven years, Virginia kids have improved their scores on their statewide tests—and they’ve improved their scores on the NAEP as well. If every state showed this pattern, there would be no story here.)

Are American kids doing better in reading? As we noted last week, NAEP passing rates have been rather flat for the past thirteen years—but on many state tests, passing rates have soared. But this pattern has not obtained in Virginia. Alas! With only two states to check, de Vise doesn’t seem to have noticed. But then again, so it often goes when elite papers slum on the public ed beat.

BACK TO THAT CORRECTION: As noted, passing rates tend to be lower on the NAEP than on “high stakes” state tests. Near the end of his report, de Vise correctly discussed this:

DE VISE: The lack of clear progress on the national test isn't all that concerns education leaders. The new scores also paint a far bleaker picture of overall student abilities than most statewide exams, including those in the Washington region.

Roughly one-third of Maryland and Virginia students rated "proficient" on most sections of the national assessment. But in the latest rounds of statewide testing, most categories of Virginia students scored proficiency rates between 70 and 90 percent. Proficiency rates in Maryland surpassed 50 percent across the board.

In short, it’s easier to test “proficient” on these state-run tests than on the NAEP. But uh-oh! Ed reporting can be hard work! In his next paragraph, de Vise tried to sum this up—but he got his terms turned around:

DE VISE: Consensus is building among officials that "proficient" on the national assessment more closely resembles "basic" on the state tests. Virginia's SOL exams "have always been meant to be a floor and not a ceiling," said Charles Pyle, spokesman for the state Education Department.

Actually, if you score “proficient” on the state tests, that’s roughly equivalent to “basic” on the NAEP. This is a minor error—the one the Post corrects today. De Vise’s other mistake was more striking—and it points to the way public ed is covered at many big papers.

DATA DUMP: To see Virginia and Maryland’s NAEP scores through the years, click here—then click on each state.