A SUBSTANDARD DEBATE: Why does the country need a web site devoted to public education—especially, to low-income public education? In part, because of the endlessly low caliber of existing discussions about our public schools. For one outstanding example, fight your way through this piece from the Post's Outlook section—an article which even informs us at one point that student lunch is now widely accepted as a public school program! This article, found in a prominent place, is a total waste of time. But so it goes in the snoresome regions we devote to public ed.
On Saturday, the New York Times ran a more useful report—a front-page report about state-run, high-stakes testing programs. In this piece, the Times gets around to the matter we noted weeks back; the paper reports that passing rates are often higher on state-run testing programs than on the National Assessment of Educational Progress (the NAEP), the exam known as The Nation's Report Card. As he starts, Sam Dillon gives an example:
DILLON (11/26/05): After Tennessee tested its eighth-grade students in math this year, state officials at a jubilant news conference called the results a ''cause for celebration.'' Eighty-seven percent of students performed at or above the proficiency level.

But when the federal government made public the findings of its own tests [the NAEP] last month, the results were startlingly different: only 21 percent of Tennessee's eighth graders were considered proficient in math.

Uh-oh. In Tennessee, 87 percent scored proficient on the state test—but only 21 percent scored proficient on the NAEP. It's good to be aware of this phenomenon. But Dillon's piece also shows us how much confusion the educational establishment can churn from so simple a point.
Yes, proficiency rates are often higher on those state-devised tests. In this sense, the state-devised tests are easier than the NAEP. But does that mean that the NAEP is a better or more accurate test? More specifically, does that mean that people are somehow being misled by passing rates on the state-run tests? This claim is suggested throughout Dillon's piece—in this early passage, for example:
DILLON: ''Under No Child Left Behind, the states get to set the proficiency bar wherever they like, and unfortunately most are setting it quite low,'' said Michael J. Petrilli, a vice president of the Thomas B. Fordham Foundation, which generally supports the federal law.

Groan! Petrilli implies that the NAEP results are somehow correct, and that results on the state tests are therefore wrong. But proficiency (at a given grade level) is, inherently, a subjective measure. The NAEP may set the bar somewhat higher, while the states set the bar somewhat lower—but that doesn't mean that either is right. Alas! Neither Petrilli nor Dillon seems to grasp this elementary point of logic. Petrilli prefers the higher standard. But Dillon doesn't ask him why—and the discourse founders accordingly. Other experts make similar statements all through Dillon's report:

''They're telling the public in their states that huge numbers of students are proficient, but the NAEP results show that's not the case,'' Mr. Petrilli said.
Indeed, we soon move to another unfounded claim—the (implied) claim that a uniform, higher national standard will lead to greater student achievement. As usual, the Times turns to Diane Ravitch to make this unargued claim:
DILLON: Because of the discrepancies, several prominent educators are now calling for a system of national testing that counts, like those at the heart of educational systems in England, France and Japan.

''We need national standards and national tests,'' said Diane Ravitch, a professor at New York University who is a former member of the National Assessment's board. ''I conclude that states are just looking to make everybody feel good.''

Ms. Tenenbaum too says the differences among states have convinced her of the need for a national test. ''I think we should all just take the NAEP,'' she said. ''Get it out of the states' hands.''

Ravitch and Tenenbaum would prefer national standards and tests. (For ourselves, we don't hold a firm view on the matter.) But would a uniform, tougher, national test lead in some way to higher achievement? On November 7, Ravitch implied that it would, in this Times op-ed column. But she made no argument for this notion. Nor did she support a second, explicit claim—her claim that the current, state-run tests don't provide accurate reporting. But so what? Two weeks later, Brent Staples also lobbied for a national test in a Times piece, plainly implying that the lack of such a standard has been depressing achievement rates. Staples even claimed that the current state-run tests are fraudulent because the proficiency rates achieved on these tests don't match those achieved on the NAEP.
Would a single, higher, national standard actually lead to greater achievement? We'd love to see somebody argue for this—but silly things like arguments rarely cloud our education discussions. With that in mind, let's note two questions which are absent from Dillon's article—questions which actually matter.
Question One: Are states making their tests easier over time? Dillon notes that passing rates are often higher on the state tests, and that these tests are therefore easier than the NAEP. But he fails to ask an attendant question: Are the tests in individual states getting easier over time? Yes, Tennessee's test is easier than the NAEP—but that doesn't mean that it's somehow too easy, or that it's fraudulent or wrong. But if Tennessee's test gets easier over time, and no one is told, an element of fraud really is introduced. The state's passing rate goes up, producing the appearance of gains in achievement—but it's really the test which has changed, not the achievement of Tennessee children. Is this going on in some states? The amazing jumps in some passing rates make it seem that this must be occurring. But Dillon spends his time on murky side points—and fails to raise this actual question.
Question Two: What about cheating? Throughout his report, Dillon notes an important point: High-stakes, state-run tests are major political events. Passing rates on these tests get a good deal of attention. This produces tremendous pressure (on the local and state levels) to achieve high passing rates.
With that in mind, here's a fairly obvious question: To what extent are teachers or principals cheating on these tests? (And yes, we mean cheating, not teaching to the tests. And yes, there's a major difference.) Over the past thirty-five years, endless cheating scandals have been documented in the nation's public schools. As the stakes keep rising on state-run tests, the incentive to cheat keeps growing too. But education writers never seem to have heard about this long-running story. Dillon is the latest in a long line to ignore this obvious question.
Would a set of higher, national standards lead to greater student achievement? We'd love to see someone argue this point. Instead, we tend to get ham-handed debates in which the notion is simply assumed. No, there's nothing automatically wrong with the way Tennessee set the bar on its tests—and there's nothing automatically right about the NAEP standards. Nor is it clear that higher standards would actually lead to higher achievement. It's time that someone argued that claim—and clarified this murky discussion. But then, murk and the gloaming seem to rule whenever we hold these substandard discussions—the substandard discussions which seem good enough when we debate public ed.
LUNCH BUNCH: Is student lunch now widely accepted? Why, the Post even tells us what's being served! Care to know how the youngsters are dining? You know what to do—just click here.