HOW TO READ LITERACY (PART 2)! Romano's experts were appalled. But the NAAL showed literacy rising:
THURSDAY, JANUARY 12, 2006
DOES TEST PREP WORK: Yesterday, we asked two questions about test prep. What kinds of test prep sessions go on in schools from September to March? And do these sessions actually work? Regarding that second point, a reader came out swinging—and made a good point:
E-MAIL: Bob, how much recent experience do you have with standardized tests? I admit it has been a long time since I took the SATs and their brethren, but I have taken a lot of professional examinations in the recent past, and I know a little bit about exam strategies. Believe it or not, a pretty successful method for taking these kinds of exams is to do as many old problems as possible in conjunction with using these problems and the syllabus material to construct possible future questions. As a former educator (math professor), I am in complete agreement with you that this indicates a significant problem with the test construction, but this is the reality of many standardized exams today.

We're not entirely sure what the highlighted passage means—and of course, we have no way to know if this is the sort of test prep going on at the schools in question. (No doubt, different schools do different things.)
Regarding what the mailer says, we will only say this in reply. In a perfect world, educators would try to limit the degree of gimmicky test preparation—unless these sessions provided general academic value to students. (That excellent principal at P.S. 48 didn't seem to think this was the case.) One problem with these sessions is obvious; to the extent that schools engage in gimmicky prep—and to the extent that such sessions work—we're no longer measuring academic achievement on the tests, we're now measuring skill at test-taking. As a general matter, this defeats the purpose of such tests. For example, how do we compare test results school-to-school, when kids at one school are taught extensive test-taking techniques and kids at another school are not? In the days before accountability began to build pressure on these tests, test prep tended to be short—and standardized. For example, all kids taking the Iowa Tests of Basic Skills would take the same, short practice tests. These short sessions were scripted by the test's publisher. The theory? All across the United States, kids were test-prepped the same way.
What goes on in today's test prep sessions? And do they work, as the mailer suggests? To the extent that they do, we're defeating the purpose of the tests. We're making it hard to compare results from the wide range of schools which administer them.
WHAT GOES ON: Here's another e-mail we got. Of course, we can't confirm that this is accurate. But when one considers the test administration problems which have been documented in the past three decades, this e-mail is completely believable:
E-MAIL: In the inner-city middle school where my daughter teaches, this is what has happened.

We can't confirm. But over the years, practices which are worse—much worse—have been widely documented.
Special report—How to read literacy!
PART 2—LITERACY RISING: Good grief! Things were very gloomy-and-doomy when Lois Romano reported the newly-released results of the 2003 NAAL—the National Assessment of Adult Literacy. She wrote in the Post on Christmas morning (see THE DAILY HOWLER, 1/11/06). Here were her opening paragraphs:
ROMANO (12/25/05): Literacy experts and educators say they are stunned by the results of a recent adult literacy assessment, which shows that the reading proficiency of college graduates has declined in the past decade, with no obvious explanation.

Indeed, Romano quoted several literacy experts who, like Gorman, were stunned and appalled by the astounding results. "Literacy of College Graduates Is on Decline," read the headline on the Post report, above the subhead "Survey's Finding of a Drop in Reading Proficiency Is Inexplicable, Experts Say."
But how much expertise do our experts show when they review such basic reports? And just how sharp are mainstream scribes when they write about such matters? As we noted in yesterday's HOWLER, there were at least four errors, major and minor, in the opening paragraphs of Romano's report, starting with Gorman's high-octane statement. In fact, the NAAL doesn't directly measure the ability to read complex books—and more than 31 percent of college graduates scored "proficient" on the part of the test to which Gorman was plainly referring. Lesson? Our experts sometimes make reading mistakes when they slam the reading of others. And it's easy to be appalled by survey results which, in fact, just aren't that bad.
Yep! For whatever reason, experts tend to be gloomy-and-doomy in reaction to surveys of this type. Today, we note a fact which Romano (and others) downplayed: The NAAL showed that English language literacy is rising among most adult groups.
Say what? Yes, this survey contained a fair amount of good news—although Romano buried it at the end of her piece (text below), and gloom-and-doomed her handling of it. As a general matter, what did the NAAL reveal about adult literacy? Here's the basic overview statement from the Department of Ed's news release:
DEPARTMENT OF EDUCATION (12/15/05): American adults can read a newspaper or magazine about as well as they could a decade ago, but have made significant strides in performing literacy tasks that involve computation, according to the first national study of adult literacy since 1992.

According to the press release, the NAAL found little change between 1992 and 2003 in adults' ability to read and understand sentences and paragraphs. This is the skill the NAAL describes as "prose literacy"—the skill which Romano and her expert commentators were discussing in her report.
"Little change between 1992 and 2003?" That may not sound like fabulous news. (Nor does it sound like disaster.) But as one reads the basic NAAL reports, the picture gets somewhat brighter. For example, how did different subgroups score in this basic prose literacy skill? As the press release explains, the average score of white adults was unchanged from 1992 to 2003. But prose literacy was significantly higher among both African-Americans and Asian-Americans. Prose literacy only dropped among Hispanics—and NAAL reports tied that to substantial demographic changes. For example, in his Commissioner's Statement, Mark Schneider (National Center for Education Statistics) offered a bit of perspective on the drop in scores by Hispanics:
SCHNEIDER (12/15/05): Hispanics were the only group whose prose and document literacy scores decreased. This finding should be considered in light of demographic changes. The Hispanic population today is larger and different from the Hispanic population a decade ago. For example, there has been an increase in the number of Hispanic adults who are not native English speakers, as well as an increase in the percentage of Hispanics who were not born in the United States and who arrived here as teenagers or adults.

The NAAL is a test of literacy in English—and there has been an increase "[since 1992] in the number of Hispanic adults who are not native English speakers," Schneider noted. Other reports offered the relevant numbers. Here's a passage from the Department of Education press release, for example:
DEPARTMENT OF EDUCATION (12/15/05): To put its findings in perspective, NAAL also reported on U.S. population changes between 1992 and 2003...The percentage of adults who spoke only English before starting school decreased from 86 to 81 percent.

Since the NAAL measures adult literacy in English, many of those who speak English as a second language are at a disadvantage. Schneider discussed this factor in his statement. In the population as a whole, "13 percent of adults did not speak any English before starting school," he noted, describing the 2003 survey. (Only 9 percent of the population fit this category in 1992.) And how did such English learners perform on this test of English literacy? These adults "were over three times more likely to have below basic literacy skills than we would expect by chance alone," Schneider notes. In short, the national population had substantially more non-English speakers in 2003 than in 1992—a jump from 9 percent to 13 percent. But despite this challenge, the overall rate of English literacy stayed the same. This means that literacy has been advancing among American adults who grew up speaking English. One can always imagine a greater advance. But the overall picture here is gain—except among the (growing) segment of the population which did not grow up speaking English.
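The arithmetic behind that composition argument can be sketched in a few lines. This is illustrative only: the 9 percent and 13 percent population shares come from the NAAL reports quoted above, but the subgroup scores are hypothetical numbers we chose to show how a flat overall average can coexist with a rising score among those who grew up speaking English.

```python
# Illustrative sketch: only the 9%/13% non-native-English shares come from
# the NAAL reports; the subgroup scores below are hypothetical.

def overall(shares, scores):
    """Population-weighted average score across subgroups."""
    return sum(w * s for w, s in zip(shares, scores))

# 1992: 91% of adults grew up speaking English, 9% did not
avg_1992 = overall([0.91, 0.09], [280, 220])

# 2003: the non-English-speaking share grows to 13%; suppose the
# English-speaking group's score rises while the other group holds steady
avg_2003 = overall([0.87, 0.13], [283, 220])

print(round(avg_1992, 1))  # 274.6
print(round(avg_2003, 1))  # 274.8
```

The overall average barely moves, even though the hypothetical English-speaking group gained three points—the gain is offset by the growing weight of the lower-scoring group.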
But in the current cultural context, gloom-and-doom tends to prevail when such studies get reported. Here's how Romano finally treated these facts, in her two closing paragraphs:
ROMANO (12/25/05): On average, adult literacy is virtually unchanged since 1992, with 30 million people struggling with basic reading tasks. While adults made some progress in quantitative literacy, such as the ability to calculate taxes, the study showed that from 1992 to 2003 adults made no improvement in their ability [to] read newspapers or books, or comprehend basic forms.

"With 30 million people struggling with basic reading tasks." By instinct, Romano reported the glass one-seventh empty. Yes, the things she said here were perfectly accurate—but gloom-and-doom reporting tends to shape the way Americans understand these topics. More on this as we proceed.
Meanwhile, how about those college grads, the ones who were featured in Romano's report? What was up with their sorry scores? As we've seen, Romano stressed the drop in scores by college graduates—and her experts were suitably stunned and appalled by their inexplicable floundering. But for our money, those scores weren't really quite so stunning—and we're not sure that they're so inexplicable. According to Romano's experts, there was no obvious explanation for the drop in average score among college grads. (Experts "could not definitively explain" the drop in these scores, she writes.) But then, explanations don't always drop from the skies—and although the average score did drop, it seemed to us that there was an obvious place to start our search for an explanation.
Literacy was unchanged among whites. It was up among blacks and Asian-Americans. But why was it dropping among college grads? Tomorrow, we'll try to find out—and we'll see that it isn't all that easy to report the results of such surveys.
FAILURES OF LITERACY: As we said yesterday, Romano is a perfectly capable and fair-minded journalist. But uh-oh! The following sentence, from her closing paragraph (see above), just doesn't seem to make sense:
ROMANO: For instance, the report showed that the average rate of prose literacy, or reading, among blacks rose six percentage points since 1992.

The average "rate" of prose literacy? We don't really know what that means. Among blacks, the average prose literacy score (on a 500-point scale) rose by six points in 2003. And the share of blacks scoring at the "intermediate" level in prose literacy rose by six percentage points. (34 percent scored there in 1992; it was up to 40 percent in the new survey.) But Romano's construction doesn't seem to make sense. The problem: Many journalists assigned to these topics don't have real expertise. This can create an awkward situation. They highlight the nation's literacy problems, while exhibiting such problems themselves.
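The two facts Romano's sentence conflates really are different in kind, and the difference is easy to show with the figures above. In this sketch, the six-point score rise and the 34-to-40-percent shift come from the NAAL reports as quoted in this piece; the 1992 base score of 237 is an illustrative assumption, not a figure from the report.

```python
# The 6-point rise and the 34% -> 40% shift are from the NAAL figures
# quoted above; the 1992 base score of 237 is an illustrative assumption.

SCALE_MAX = 500

score_1992 = 237                 # illustrative base score on the 0-500 scale
score_2003 = score_1992 + 6      # average score "rose by six points"

share_intermediate_1992 = 0.34   # 34 percent at "intermediate" in 1992
share_intermediate_2003 = 0.40   # 40 percent in 2003

# A six-POINT score change is tiny relative to the scale...
print(f"score change: {score_2003 - score_1992} points "
      f"({(score_2003 - score_1992) / SCALE_MAX:.1%} of the scale)")

# ...while six PERCENTAGE POINTS at "intermediate" is a sizable relative jump
rel = (share_intermediate_2003 - share_intermediate_1992) / share_intermediate_1992
print(f"relative growth at intermediate: {rel:.0%}")  # 18%
```

Writing "the average rate rose six percentage points" mashes these together: a score measured in points on a 500-point scale and a population share measured in percentage points are not interchangeable quantities.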
LITERACY LINKS: For links to all basic reports on the NAAL, you know what to do—just click here.