HOW TO READ LITERACY (PART 3)! Why did college grad scores drop? For one thing, there were more college graduates.
THURSDAY, JANUARY 19, 2006

BROKEN-DOWN ANALYSTS: With sadness, we thought you ought to see what occurred when Chris Matthews discussed Brokeback Mountain with Don Imus on Wednesday morning. Matthews phoned in to the Imus show; soon, the banter began to break down as Imus insisted he wasn’t seeking a “Brokeback Mountain moment” just because he’d complimented Matthews on his highly attractive recent weight loss. In what follows, we see the world of the broken-down people who steward our deeply devolved public discourse. Note Chris Matthews’ astonishing reference to “the wonderful Michael Savage”:
MATTHEWS (1/18/06): Have you gone to see it yet? I’ve seen everything else but that. I just—

IMUS: No, I haven’t seen it. Why would I want to see that?

MATTHEWS: I don’t know. No opinion on that. I haven’t seen it either, so—

IMUS: So they were—it was out when I was in New Mexico and—it doesn’t resonate with real cowboys who I know.

MATTHEWS: Yeah—

IMUS: But then, maybe there’s stuff going on on the ranch that I don’t know about. Not on my ranch, but you know—

MATTHEWS: Well, the wonderful Michael Savage, who’s on 570 in DC, who shares a station with you at least, he calls it [laughter]—what’s he call it?—he calls it Bare-back Mount-ing. That’s his name for the movie.

IMUS: Of course, Bernard calls it Fudgepack Mountain...

Of course! And so we see the broken-down souls who conduct our national discourse.

For ourselves, we saw Brokeback Mountain twice, drawn back by its superlative ending. (Overall, we were slightly disappointed, although it’s much better than most modern movies. It’s about real people’s real lives, for example.) The film closes with a brief conversation between Heath Ledger and his daughter, who by then is 19. When they speak, the daughter’s clear-eyed thoughts on the future throw her father’s tragic life into relief. What a deft, unexpected conclusion! We thought of shining Hector on the walls of Troy, sharing a moment with his baby son before going off to die at the hands of Achilles. The luminous moment between generations! Homer provides one in that famous scene. For our dime, Brokeback Mountain does too.

The young actress in that scene is named Kate Mara. We rarely can tell who the good actors are. In that scene, though, we thought Mara nailed it. For us, she conveyed a mountain of meaning in just one or two simple lines.

Special report—How to read literacy!

PART 3—THE MORE, THE MURKIER: What’s the state of literacy among college grads? It’s timely that you should ask. As we noted last week, the National Center for Education Statistics recently released a massive survey of adult literacy, the National Assessment of Adult Literacy (NAAL). Conducted in 2003, it was the sequel to a 1992 survey—and for the most part, results were encouraging (see THE DAILY HOWLER, 1/12/06). On average, adult literacy was unchanged from 1992 to 2003, despite a significant increase in the number of adults who didn’t grow up speaking English. (The survey measures adult literacy in English.) Adult literacy improved among blacks and among Asian-Americans, and was unchanged among whites. Scores dropped only among Hispanics—and the study noted that this drop was tied to significant changes in demographics. Among adults who grew up speaking English, literacy was on the rise.

But one result of the study did dismay our “literacy experts”—at least the experts who were quoted in Lois Romano’s report in the Post. Uh-oh! According to Romano, literacy was down among college graduates—and experts were perplexed and appalled. Let’s recall the flavor of her report:

ROMANO (12/25/05): Literacy experts and educators say they are stunned by the results of a recent adult literacy assessment, which shows that the reading proficiency of college graduates has declined in the past decade, with no obvious explanation.

"It's appalling—it's really astounding," said Michael Gorman, president of the American Library Association and a librarian at California State University at Fresno. "Only 31 percent of college graduates can read a complex book and extrapolate from it. That's not saying much for the remainder."

“Experts could not definitively explain the drop,” Romano reported. As she continued, she quoted a few other experts who seemed stunned and appalled by the drop in the scores. And several sounded a familiar old theme—the unrelenting and troubling dumbness of those kids today:
ROMANO: Dolores Perin, a reading expert at Columbia University Teachers College, said that her work has indicated that the issue may start at the high school level. "There is a tremendous literacy problem among high school graduates that is not talked about," said Perin, who has been sitting in on high school classes as part of a teaching project. "It's a little bit depressing. The colleges are left holding the bag, trying to teach students who have challenges."
We’re not sure what planet it is where the “literacy problem among high school graduates” isn’t being “talked about.” But wherever that planet may happen to be, this expert spends a good deal of time there.

At any rate, Romano stressed that our “literacy experts” couldn’t explain this drop in scores—and that they were stunned and appalled by the lower average score for college graduates. For ourselves, we thought this “expert reaction” was strange and instructive—especially after we reviewed the materials the NCES provided when this new study was released.

Why did literacy drop among grads? “Experts could not definitively explain the drop,” Romano said; there was “no obvious explanation.” But then, there is rarely an “obvious” or “definitive” explanation for changes in numbers on surveys like this. For example, why did scores go up among African-Americans? Is there some “obvious” or “definitive” explanation for that? And is there only one explanation—or might it be that several factors were involved in this improvement? To our taste, Romano and her experts were strangely ham-handed in their reaction to this change in the numbers—especially since one possible factor was discussed right in the “commissioner’s report” by NCES head-man Mark Schneider.

First, let’s get a few facts straight. As we noted last week, Romano’s report contained several factual errors—the very kinds of “reading mistakes” the NAAL is designed to measure. For example, Gorman is factually wrong when he seems to say that “only 31 percent of college graduates” passed the “prose literacy” part of this test. And Romano is factually wrong a bit later when she gives this summation:

ROMANO: The test measures how well adults comprehend basic instructions and tasks through reading—such as computing costs per ounce of food items, comparing viewpoints on two editorials and reading prescription labels. Only 41 percent of graduate students tested in 2003 could be classified as "proficient" in prose [literacy]—reading and understanding information in short texts—down 10 percentage points since 1992. Of college graduates, only 31 percent were classified as proficient—compared with 40 percent in 1992.
In fact, no part of the NAAL reports the performance of “graduate students,” and Romano’s numbers for “college graduates” are factually inaccurate too. These are precisely the kinds of mistakes that cause adults to “fail” at literacy tests—resulting in screams of dismay from the very “experts” who seem to be misinforming Romano. With that in mind, let’s make sure we understand what the results of the NAAL really were.

In its reporting, the NAAL presents scores for eight different subgroups based on “educational attainment”; taken together, these groups cover the entire adult population. One of these groups is college graduates with no further study; another is college graduates with graduate credits or advanced degrees. Please note: Both these groups are college graduates. At any rate, here’s how these two groups performed in the two NAAL surveys:

College graduates with graduate credits or advanced degrees: 51 percent scored proficient in 1992. 41 percent scored proficient in 2003.

College graduates with no further study: 40 percent scored proficient in 1992. 31 percent scored proficient in 2003.

All these people are college graduates. When Romano wrote about “graduate students,” she was (inaccurately) referring to that first group. When Gorman spoke about “college graduates,” he was (inaccurately) failing to lump the top group in with the other. In fact, how did all “college graduates,” as a group, do on these surveys? In 1992, about 45 percent of college grads scored “proficient” on the test. In 2003, the figure had dropped to something like 36 percent. (The number of people in the two groups was fairly even in each survey, allowing us to make this approximation.) None of this changes the general drift of Romano’s report; the percentage of college grads scoring “proficient” dropped by roughly 20 percent.
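For readers who want to check that arithmetic, here is a quick back-of-the-envelope sketch (a Python illustration of our own, not anything taken from the NAAL report). It uses the subgroup proficiency rates above and the population shares given in the postscript below (9 and 10 percent of adults in 1992; 11 and 12 percent in 2003):

```python
# Population-weighted average of the two "college graduate" subgroups.
# Shares and proficiency rates are the NAAL figures cited in this piece.

def combined_rate(groups):
    """Weighted average rate across (population share, percent proficient) pairs."""
    total_share = sum(share for share, _ in groups)
    return sum(share * rate for share, rate in groups) / total_share

grads_1992 = [(9, 51), (10, 40)]   # (share of adults, percent proficient)
grads_2003 = [(11, 41), (12, 31)]

print(round(combined_rate(grads_1992)))  # 45
print(round(combined_rate(grads_2003)))  # 36
```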

The proficiency rate dropped among college grads—but why? Romano’s “experts” were perplexed. But we would suggest that it’s fairly easy to offer some likely explanations.

First, let’s note something that Commissioner Schneider explained in his pithy, straightforward report. Here it is, and it may seem puzzling: The “proficiency” rate actually dropped in all eight “educational attainment” categories. That is, the proficiency rate didn’t just drop among those who have a college degree. It also dropped among those with a high school diploma only; among those with a GED only; and among those who have a two-year associate’s degree but no further study. Yep! When we break the population up by “educational attainment,” the proficiency rate drops in every single group—and yet, the overall rate was unchanged! That may seem like a contradiction. But Schneider explained it, rather clearly:

SCHNEIDER (12/15/05):
Educational Attainment: 1992-2003
I will now present the results on change in scores between 1992 and 2003 for selected educational attainment levels. There were no increases in literacy in any of the educational attainment levels. Prose literacy decreased among adults at every level of education....With scores dropping in prose literacy for every level of education, you might wonder why there was no overall decline in the average score for this type of literacy. This is because adults with higher educational levels tend to outperform those with lower educational levels, and the percentage of adults with high educational levels—those with "some college" or more—has been increasing, while the percentage with low levels of education has been declining. We have more higher-scoring adults with high levels of education, and fewer lower-scoring adults with low levels of education, which offsets the fact that average scores for highly educated adults are declining.
Read that again: “Prose literacy decreased among adults at every level of education”—not just among college grads. But the overall average score stayed the same because there are now more people in the higher-scoring groups. To cite one example: The average score dropped among people with advanced degrees—but there are now substantially more of these people, and this drives up the overall score. And that suggests one place to start in explaining the drop in average score among those college graduates.
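Here, for what it’s worth, is a toy illustration of how every group’s score can fall while the overall average holds steady. The numbers below are invented for the purpose; they are not actual NAAL figures:

```python
# Toy example (invented numbers, not NAAL data): both groups' scores drop,
# yet the overall average is unchanged because the population shifts
# toward the higher-scoring group.

def overall(groups):
    """Weighted average score across (population share, average score) pairs."""
    return sum(share * score for share, score in groups) / sum(s for s, _ in groups)

year_one = [(0.60, 260), (0.40, 320)]  # (share, score): less educated, more educated
year_two = [(0.50, 255), (0.50, 313)]  # both scores fall; shares shift upward

print(overall(year_one))  # 284.0
print(overall(year_two))  # 284.0 -- unchanged, despite two declines
```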

What did the NAAL find in 2003? Among other things, the NAAL found this: We have more college graduates now! In 1992, 19 percent of the surveyed population fell in the two “college graduate” groups. By 2003, that number had climbed to 23 percent. Simplifying for the sake of brevity, this means that a somewhat less “selective” slice of the population made up these two groups in 2003—and this may well begin to explain that drop in average scores. Simplifying again for the sake of brevity, the top 19 percent of a population will always tend to score better, on average, than the top 23 percent. Do you want to see the average score for college graduates soar, for example? Simple! Close every college but the Ivy League schools. Over time, you’ll end up with a much smaller number of “college graduates”—but their average score on these tests will be higher! That would be absurd social policy—but you’d get a much higher “average score.” And a higher percentage of college grads would now display “proficiency.”
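A crude way to see the point: for any fixed distribution of scores, the average of the top 19 percent will beat the average of the top 23 percent, because the wider slice reaches further down the scale. The sketch below uses a made-up bell curve of scores, purely for illustration; real college graduates are not, of course, literally the top X percent of the literacy distribution:

```python
# Selectivity sketch with made-up scores: widen the "college graduate"
# slice from 19 to 23 percent of the population and its average falls,
# even though no individual's score has changed.
import random

random.seed(0)
scores = sorted((random.gauss(275, 50) for _ in range(100_000)), reverse=True)

def top_mean(sorted_scores, fraction):
    """Mean of the top `fraction` of a descending-sorted score list."""
    k = int(len(sorted_scores) * fraction)
    return sum(sorted_scores[:k]) / k

print(round(top_mean(scores, 0.19)))  # higher average
print(round(top_mean(scores, 0.23)))  # somewhat lower average
```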

So in part, the average score of college grads may have dropped because there are more college graduates. But there’s a second obvious possibility here—obvious except to our “literacy experts.” The average score of college grads may have dropped because of the various policies and efforts we tend to describe as “affirmative action.” In that PBS program Country Boys, for example, we see heroic, hard-working people—at the high school and college levels—struggling to get (white) Appalachian kids from low-literacy backgrounds through high school and into college. We think that’s superlative social policy, and we thought the efforts portrayed there were often inspiring. But to the extent that such efforts move relatively low-scoring kids on to college, the “average score” for future college graduates may well tend to drop. In Country Boys, we see heroic efforts at a Kentucky alternative high school—and we see and hear about low-profile Kentucky colleges that are dedicated to serving challenged Appalachian kids. (These colleges give these kids massive financial aid, thanks to the generosity of donors.) If you decided to shut those schools down, the average score for future “college grads” might well rise—and as a nation, we’d be poorer for it.

Why did the average score of college graduates drop? Like Romano’s literacy experts, we can’t give a “definitive” answer. (Nor would we expect there to be one.) Believe it or not, the relative aging of the population may also contribute to this decline. (Don’t even ask.) But we did e-mail Commissioner Schneider, asking him if the growth in the number of college graduates—and the growth in “affirmative action” efforts—may have played a role in this matter. Here is his e-mailed reply:

SCHNEIDER E-MAIL: I think both of these factors are involved, and further research is needed to identify and measure the effects of these types of "compositional" changes in student populations on the performance of colleges and universities. We will need to explore how well the programs that colleges and universities offer perform in light of the new demands put on post-secondary institutions driven by changes in student populations as well as by changes in the American economy and society.
No, that isn’t a “definitive” explanation. But why would journalists or “literacy experts” expect to find such a critter?

Should Americans be concerned by that drop in “literacy” among college grads? For ourselves, we’re more concerned by the work of our “literacy experts.” Romano’s experts made instant factual errors—the kind that help you flunk a literacy test—and they couldn’t wait to rattle off old saws to “explain” the data they couldn’t describe. Last Friday night, we saw how clownish our educational discourse can be when ABC News aired that John Stossel burlesque (see THE DAILY HOWLER, 1/17/06). But to our taste, Romano’s report leaned that way, too. A few final thoughts on the morrow.

TOMORROW: As always, the kids must be stupid.

WATCHING THEM GROW: For the record, the numbers grew among both groups of college graduates. In 1992, 9 percent of the surveyed population had a graduate degree or advanced study. By 2003, that had grown to 11 percent. The slice of the population in this group was now (roughly) 22 percent bigger.

Ditto for college grads without further study. In 1992, 10 percent of the population was in this group; it was 12 percent in 2003.
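For readers who like to check such things, the relative growth implied by those figures works out as follows (simple arithmetic, shown here only for convenience):

```python
# Relative growth of each group's population share, from the figures above.
for before, after in [(9, 11), (10, 12)]:
    print(f"{before}% -> {after}%: up {100 * (after - before) / before:.0f}%")
# 9% -> 11%: up 22%
# 10% -> 12%: up 20%
```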