
Those high test scores may not be what they seem: A hard look at Baltimore school test results

Robert Somerby, The Baltimore Evening Sun, 2/5/81

 

STANDARDIZED TEST SCORES have risen substantially in the city schools over the past few years, in some cases to levels that are the highest in a decade.

Local response has been understandably enthusiastic.

But interpretation of those rising test scores is nowhere near as simple as observers may believe. For analysis of school-by-school test results in Baltimore over the past several years reveals a disturbing number of individual elementary schools whose achievement patterns are highly irregular—whose sudden, unexpected score gains are almost impossibly extreme.

And the city school system has instituted a number of highly questionable techniques by which students are now prepared to take these tests—techniques which should, in and of themselves, raise serious questions as to the validity of the systemwide gains they may be producing.

Consider the sixth graders who graduated from the school which I will call City School A in the spring of 1979. (School-by-school test results for the past school year are not being made publicly available.) The scoring patterns this school displayed over the last two years of grade school appear in an increasing number of schools in recent years—and are almost impossible to accept at face value.

In the fall of 1977, this grade group, as fifth graders, took the fall session of the Iowa Tests of Basic Skills. They recorded, among other scores, a 3.7 grade equivalent score in vocabulary. This score ranks in the bottom 3 percent nationally (in the second percentile) among fifth-grade groups tested in the fall, according to ITBS manuals. It is typical of the extremely low achievement levels recorded at City School A prior to the spring of 1978, and of those recorded at city schools generally before the school system began altering its test preparation procedures in the 1976-77 school year.

By the spring of 1978, a near miracle seems to have occurred. This same grade group now recorded a grade equivalent of 6.8—placing School A's fifth graders in the 98th percentile nationally. Showing 3.1 years of growth in six months' time, these children had seemingly done the impossible—progressed, after four years of uniformly low achievement, to the very top of the nation.

But when the grade group entered sixth grade in the fall of 1978, it encountered a new test battery, the California Achievement Tests. Disturbingly, the grade group recorded a miserable 4.6 vocabulary score—more than a year and a half below the norm for entering sixth graders, and right back at the low achievement levels the grade group had always shown before its sudden success on the Iowas in the spring of fifth grade.

When the same grade group goes on to record an astonishing 9.0 sixth grade vocabulary score in the spring of 1979—a score reflecting an incredible 4.4 years of growth in six months' time—I think questions must inevitably be asked about how these students' outstanding spring vocabulary scores were obtained.
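
The growth figures in this account follow directly from the reported grade-equivalent scores: growth is simply the later score minus the earlier one. The short Python sketch below recomputes them from the four vocabulary scores reported above. The score list, labels, and variable names are mine, added only for illustration, and the rule of thumb that a typical student gains roughly one grade-equivalent year per school year is an approximation, not a figure from the article.

# Grade-equivalent (GE) vocabulary scores reported above for the City School A
# grade group. Dates and values come from the article; this script only
# recomputes the growth figures by subtraction.
scores = [
    ("Fall 1977, grade 5 (Iowa Tests of Basic Skills)", 3.7),
    ("Spring 1978, grade 5 (Iowa Tests of Basic Skills)", 6.8),
    ("Fall 1978, grade 6 (California Achievement Tests)", 4.6),
    ("Spring 1979, grade 6 (California Achievement Tests)", 9.0),
]

# On the GE scale a typical student gains roughly 1.0 per school year, so only
# about half a year of growth would be expected between fall and spring testing.
for (earlier_label, earlier), (later_label, later) in zip(scores, scores[1:]):
    change = later - earlier
    print(f"{earlier_label} -> {later_label}: {change:+.1f} GE years")

Run as written, the sketch prints the swings described above: a gain of 3.1 years between fall and spring of fifth grade, a drop of 2.2 years when the new battery was introduced that fall, and a gain of 4.4 years between fall and spring of sixth grade.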

Did the amazing score gains reflect true, general vocabulary development—the kind that would presumably be reflected by any standard measure of vocabulary achievement?

Or did they reflect a situation in which children were systematically taught the test items on which they would be tested in the spring—producing illusory achievement levels that could not be maintained when a new, unfamiliar test battery was encountered?

Inevitably, the latter explanation must seem likely for City School A and for other city schools showing sudden, extraordinary gains in subjects like vocabulary and math. And this is particularly true when one considers the highly questionable test preparation techniques the city system has recommended to its elementary school teachers over the period of time in which these score gains have appeared.

Over the past three or four years, the city school system, like other urban systems eager to improve their public image, began encouraging schools to institute extensive programs in which students were given practice answering the types of multiple-choice questions found on standardized tests.

By the 1977-78 school year, some city elementary schools were holding such "test awareness" sessions once or twice a week throughout the course of the school year.

Such programs are difficult to reconcile with a fundamental principle of standardized testing: the principle that standardized tests should be administered everywhere in a uniform, standardized manner if scores are to be comparable from one school district to another.

More important, such sessions can build an atmosphere of pressure around a school system's testing program—an atmosphere which can lead teachers and administrators, well-meaning and otherwise, to compromise the most basic principles of test security.

Most disturbingly, the city system has recommended specific student preparation techniques that are completely inconsistent with traditional standards of test security—techniques which could easily result in illusory score gains of the type that seem to have occurred at City School A.

In a well-publicized series of workshops, for example, teachers were told to drill students on practice math items which directly parallel specific Iowa and California test items, practice items in which only the numbers have been changed from the actual test items on which the students would later be tested.

Faculties which systematically follow such advice may well raise their schools' math scores—on the one particular version of the test for which students have been selectively prepared.

But the procedure is clearly invalidating. Neither the Iowa nor the California manuals suggest anything remotely resembling this procedure, for example, and when teachers are invited to subject test items in one subject to this kind of scrutiny, test security in other areas of the testing program is likely to be compromised as well.

Can we place confidence in the city system's rising test scores? Unfortunately, present evidence suggests we cannot. And it suggests that the school board should take a careful look at the operation of this important program.