HOW TO READ LITERACY (PART 1)! The Post reports on adult literacy—and makes instant reading mistakes:
WEDNESDAY, JANUARY 11, 2006

THE WEEKLY WINERIP: In this morning’s New York Times, Michael Winerip profiles bureaucratic shortcomings of the No Child Left Behind program. This doesn’t necessarily mean that NCLB is a blunder on balance. But the Rube Goldberg-style aspects of the act displayed in this tale are worth pondering.

For ourselves, we were struck by a statement by a New York City principal—John Hughes, of P.S. 48 in the Bronx, a school our man Winerip praises:

WINERIP (1/11/06): The principal, John Hughes, has mixed feelings about all the testing that goes on these days, but professionally, he has put that all aside. "The profit margin in this business is test scores," he said. "That's all they measure you by now."

Test prep? "Are you kidding?" he said. "We start in September and we don't stop until the tests are over," in March.

"I can't afford not to do test prep," he said. "Otherwise my kids don't have a chance. It's all by the test numbers. If they score 3's or 4's, they have marketability for getting into one of the city's good middle schools." With low scores of 2 or 1, out of a maximum of 4, they are stuck in a bad neighborhood school.

Winerip sees Hughes as an outstanding principal—and we see no reason to doubt him. But we were puzzled. What kind of “test prep” can a school conduct from September all the way through March? Can the format of these tests be so confusing that it takes six months to prepare students for them? (If so, the tests should end up in the Bronx River.) And is there any evidence that this kind of “test prep” really does affect results? We constantly read about endless “test prep”—activities conducted by principals, like Hughes, who say they would very much like to avoid them. But what goes on in these six-month-long sessions? Truly, we would like to know.

We plan to ask Hughes, who sounds like a great leader. But if you’re a teacher, what goes on in your school? And how about a matter of judgment: Do you think your “test prep” really works?

Special report—How to read literacy!

PART 1—INSTANT EXAMPLES: How well do American adults read? How timely that you should ask! On December 15, the National Center for Education Statistics (NCES) released the results of a massive 2003 study—the sequel to an earlier study conducted in 1992. Ten days later, on December 25, the Washington Post reported on the study, which is called the National Assessment of Adult Literacy (NAAL). And uh-oh! Early in her Christmas morning report, Lois Romano said this:

ROMANO (pgh 7): The test measures how well adults comprehend basic instructions and tasks through reading—such as computing costs per ounce of food items, comparing viewpoints on two editorials and reading prescription labels. Only 41 percent of graduate students tested in 2003 could be classified as "proficient" in prose—reading and understanding information in short texts—down 10 percentage points since 1992.

But uh-oh! In fact, there is no part of the NAAL study which reports how “graduate students” performed on the test (details to follow this week). Ironically, Romano had slightly misread the report—had failed at precisely the kind of task the test is designed to measure.

And that was hardly the only mistake in the early parts of this Post report. In her gloomy presentation, Romano focused on some (apparent) bad news from the National Assessment. And just like that—in paragraph 2!—misstatements began to appear:

ROMANO (pgh 1): Literacy experts and educators say they are stunned by the results of a recent adult literacy assessment, which shows that the reading proficiency of college graduates has declined in the past decade, with no obvious explanation.

(2) "It's appalling—it's really astounding," said Michael Gorman, president of the American Library Association and a librarian at California State University at Fresno. "Only 31 percent of college graduates can read a complex book and extrapolate from it. That's not saying much for the remainder."

Gorman was appalled—and was slightly misstating. In fact, no part of the NAAL directly measures the ability to “read a complex book and extrapolate from it.” (Note Romano’s reference to “short texts” in paragraph 7.) But then, when Romano quoted Mark Schneider, NCES commissioner, she had him oddly misstating too:

ROMANO (pgh 5): "The declining impact of education on our adult population was the biggest surprise for us, and we just don't have a good explanation," said Mark S. Schneider, commissioner of education statistics. "It may be that institutions have not yet figured out how to teach a whole generation of students who learned to read on the computer and who watch more TV. It's a different kind of literacy."

(6) "What's disturbing is that the assessment is not designed to test your understanding of Proust, but to test your ability to read labels," he added.

It’s true—despite Gorman’s statement in paragraph 2, the NAAL doesn’t measure the ability to read Proust (or other long texts). But uh-oh! While one part of the three-part test is “designed to test your ability to read labels,” that plainly isn’t the part of the test which Romano is discussing here. Since Schneider presumably understands the structure of his own three-part test, we’d have to assume that Romano quoted him out of context. (Or who knows? Maybe not.)

Meanwhile, back to Gorman, in paragraph 2. Uh-oh! In fact, well more than 31 percent of “college graduates” were able to pass the part of the test to which he is plainly referring here. Gorman was slightly misstating the test’s results—and Romano had apparently failed to notice.

So let’s see. No part of the NAAL reports the performance of “graduate students.” More than 31 percent of “college graduates” scored “proficient” on the test. And the NAAL simply isn’t a measure of the ability to “read a complex book.” Nor is the part of the test which Romano discusses a measure of how well we read labels. All these misstatements, major and minor, appear in the Post’s first seven paragraphs.

How well do adult Americans read? Not too well, to judge from this report—a report which helps us see how easy it is to misread surveys of this type. (We think Romano’s a perfectly competent and fair-minded scribe.) But in the present day, many important educational judgments are being made on the basis of such reports. For that reason, the NAAL is a good text to study—if we want to polish the skills we employ when we make such decisions.

TOMORROW—PART 2: Good news! Literacy rising!