BIG HARM, NO FOUL! Kristof assumes grave harm occurred. But he still says no foul:
TUESDAY, OCTOBER 25, 2005
BIG HARM, NO FOUL: We're often puzzled by Nicholas Kristof, and today is no exception. Should Bush Admin leakers be charged with crimes? Throughout his column, Kristof seems to say no—even though he assumes that major harm may have resulted from their actions:
KRISTOF (10/25/05): My guess is that the participants in a White House senior staff meeting discussed Mr. Wilson's trip and the charges that the administration had knowingly broadcast false information about uranium in Niger—and then decided to take the offensive. The leak of Mrs. Wilson's identity resulted from that offensive, but it may well have been negligence rather than vengeance. I question whether the White House knew that she was a noc (nonofficial cover), and I wonder whether some official spread the word of Mrs. Wilson's work at the C.I.A. to make her husband's trip look like a nepotistic junket.

Kristof says that the leakers' conduct may have put many folk at grave risk. But he still seems to argue against criminal sanctions. We're inclined to a no harm, no foul standard ourselves. But Kristof doesn't even seem to care if grave harm did result.
But sic semper New York Times journalists! The following passage in today's Post made us think of Kristof:
MILBANK AND PINCUS (10/25/05): Wilson has also armed his critics by misstating some aspects of the Niger affair. For example, Wilson told The Washington Post anonymously in June 2003 that he had concluded that the intelligence about the Niger uranium was based on forged documents because "the dates were wrong and the names were wrong." The Senate intelligence committee, which examined pre-Iraq war intelligence, reported that Wilson "had never seen the CIA reports and had no knowledge of what names and dates were in the reports." Wilson had to admit he had misspoken.

As far as we know, this is the first time Pincus has acknowledged the way he was misled by Wilson in the June 12, 2003, report referenced here. But Kristof was similarly misled by Wilson—and he hasn't explained or corrected his work to this very day. That's right: On May 6, 2003, and then again on June 13 of that year, Kristof wrote columns, based on info from Wilson, which included the groaning bunkum described by Milbank and Pincus above (see THE DAILY HOWLER, 10/21/05). But so what? To this day, Kristof has never explained how that happened—and his work has never been corrected. His columns sit on Nexis today, driven by glaring misstatements—misstatements which stand uncorrected. In fleeting fashion, Pincus addresses his own misstatements today. At the Times, Kristof still hasn't bothered.
Excited pseudo-liberals don't care about this; they enjoy misstatements made by their own team. But this kind of who-gives-a-sh*t work is typical of the grinding dysfunction defining the Times in the past fifteen years. Judy Miller is getting slammed now—but Kristof's misstatements sit uncorrected. Today, the Post addresses this matter, quite late. The Times doesn't give a sh*t still.
WE LEAD, THE POST FOLLOWS: Today, the inevitable correction:
WASHINGTON POST (10/25/05): An Oct. 24 Metro article about educational testing should have stated that consensus is building among officials that "proficient" on state tests more closely resembles "basic" on the National Assessment of Educational Progress, rather than the reverse.

Huh? We'll explain at the end of this piece. But this corrects a minor mistake in Daniel de Vise's report from yesterday morning, in which the scribe discussed a point we incomparably looked at last week (see THE DAILY HOWLER, 10/20/05). All too often, exciting score gains in state-run test programs have not been reflected in the National Assessment of Educational Progress—the major, nationwide federal test program described as "The Nation's Report Card." Where state-run programs show thrilling score gains, NAEP results often tend to be flat—thus raising questions about the gains on those ballyhooed high-stakes state programs.
We raised this point last week—and the Post got around to it yesterday. But before we get to de Vise's errors, let's quote the part of his report that made most sense. That's the passage where he cites a Fordham Foundation semi-study:
DE VISE (10/24/05): The Thomas B. Fordham Foundation, a champion of high-stakes tests, looked at eighth-grade reading scores on 29 state tests and found that two-thirds—19 states, including Virginia and Maryland—reported gains in the past two years. None of those 19 states showed progress this year in eighth-grade reading proficiency on the [NAEP].

Over the past two years, nineteen states showed gains on their state-run tests—and none showed gains on the NAEP! Of course, there's a weakness with this analysis; it puts a high reliance on the results of a single NAEP test session. If there was some anomaly with that session, then Fordham's comparison loses validity. It would be better to compare statewide score gains vs. NAEP score gains over a longer period of time. We did so in THE HOWLER last week (albeit for only two states).
Might statewide score gains be full of hot air? You'd think the public would want to know. But major papers like the Post rarely do quality work on such topics. Public ed sometimes seems like the junkyard beat, with little real effort being invested. We saw this again as we watched de Vise fumble his brief just this week.
Understandably, de Vise focused on the Post's home-town states—Maryland and Virginia. Indeed, in his opening, de Vise said that score gains on those states' high-stakes tests have not been matched on the NAEP:
DE VISE: In Maryland and Virginia public schools, statewide exams are a cause for perpetual celebration. Scores go up almost every year in virtually every grade level and subject tested. On the Maryland School Assessment this year, scores rose in all 24 school systems.

The anemic results from the nationwide test, released Wednesday, provide a sharp contrast to the dramatic gains reported by Maryland and Virginia on their statewide exams, de Vise says. But as a general matter, this is plainly not true in the case of Virginia, as the chart which accompanies de Vise's report makes abundantly clear.
How inept is public ed reporting? According to de Vise, statewide gains in Maryland and Virginia have not been matched on the NAEP. But in his chart, he posts Virginia scores from 1998 to 2005—and gains on Virginia's statewide tests are, in fact, fairly closely matched by gains on the NAEP in those years. In grades 4 and 8, in both reading and math, Virginia students have made clear gains on the NAEP during this period. In grade 4 math, for example, Virginia students went from 24 percent at or above proficient to 39 percent during the five-year span de Vise examined. (The state went from 71 percent at or above basic to 83 percent during that span.) But then, in both reading and math, Virginia kids have consistently improved their passing rates on the NAEP tests. As a general matter, Virginia's scores seem to contradict the Fordham hypothesis. But de Vise doesn't seem to have noticed.
(Please note: The passing rates on the NAEP are lower than on the Virginia test—but that is a separate question from the question of overall progress. Over the past seven years, Virginia kids have improved their scores on their statewide tests—and they've improved their scores on the NAEP as well. If every state showed this pattern, there would be no story here.)
Are American kids doing better in reading? As we noted last week, NAEP passing rates have been rather flat for the past thirteen years—but on many state tests, passing rates have soared. But this pattern has not obtained in Virginia. Alas! With only two states to check, de Vise doesn't seem to have noticed. But then again, so it often goes when elite papers slum on the public ed beat.
BACK TO THAT CORRECTION: As noted, passing rates tend to be lower on the NAEP than on high-stakes state tests. Near the end of his report, de Vise correctly discussed this:
DE VISE: The lack of clear progress on the national test isn't all that concerns education leaders. The new scores also paint a far bleaker picture of overall student abilities than most statewide exams, including those in the Washington region.

In short, it's easier to test "proficient" on these state-run tests than on the NAEP. But uh-oh! Ed reporting can be hard work! In his next paragraph, de Vise tried to sum this up—but he got his terms turned around:
DE VISE: Consensus is building among officials that "proficient" on the national assessment more closely resembles "basic" on the state tests. Virginia's SOL exams "have always been meant to be a floor and not a ceiling," said Charles Pyle, spokesman for the state Education Department.

Actually, if you score "proficient" on the state tests, that's roughly equivalent to "basic" on the NAEP. This is a minor error—the one the Post corrects today. De Vise's other mistake was more striking—and it points to the way public ed is covered at many big papers.
DATA DUMP: To see Virginia's and Maryland's NAEP scores through the years, click here—then click on each state.