NEXT YEAR’S EXPERT! This year, she’s a former ed writer. This time next year, she’ll be more:
WEDNESDAY, SEPTEMBER 6, 2006

GO JUMP IN THE LAKE: At The Lake, they’ve done a superlative job reviewing ABC’s upcoming pseudo-history. (For one example, just click here. And scroll back—there are others.) And hurrah! Their efforts, and the efforts of others, have produced this important, helpful report in today’s New York Times. Meanwhile, what can you say about ABC? Plainly, this upcoming show is a form of Stosselism, in which the network provides an alternate menu for its dumbest pseudo-con viewers. This used to be called a “broadcast” network. Now, they narrow-cast to the intellectually challenged—and make a sick joke of our public discourse. Sorry, but the American system—indeed, the western experiment—simply can’t function this way.

Special report: They ain’t no experts!


PART 2—NEXT YEAR’S EXPERT: You’re right. We are a bit jaundiced about “educational experts,” who seem to be safely back in the saddle as the school year begins. Over the years, we’ve watched them interact with education writers and education bureaucracies, and we’ve often been underwhelmed by the outcomes. And uh-oh! Jay Mathews featured these experts again on the front page of Sunday’s Post, with results that were underwhelming again (see THE DAILY HOWLER, 9/5/06). And we had a somewhat similar reaction that day when we read this Post op-ed, by Linda Perlstein, about merit pay for teachers.

Perlstein, a former Post ed writer, has a book coming out next year—one we look forward to reading. (Title: Tested: One School, and America, Struggle to Make the Grade.) That “one school” seems to be Tyler Heights Elementary, a “high-poverty” Annapolis grade school which Perlstein discusses in detail in her column. Should Maryland institute merit pay, as its governor has now proposed? Perlstein seems to argue yes and no as she debates this in her column. She says she favors “some form of merit pay” and suggests how Maryland should proceed with the project.

But she raises so many objections to the practice that she seems to be arguing against her own view. Indeed, she finds so much that’s unfair about giving teachers “credit, or blame, for students’ [test] scores” that we weren’t quite sure, by the time we were done, why she favors merit pay at all. But this kind of semi-muddled debate typifies much public ed writing.

As we’ve said, we look forward to Perlstein’s book; in 300 pages, she may be able to clarify things that were left muddy in 800 words. But on Sunday, our analysts were groaning loudly after reading Mathews’ report. And Perlstein’s column troubled them a minor tad too. Readers, here’s why they were bothered.

First, a note about technical sophistication. Midway through her column, Perlstein complains about the way Tyler Heights prepares for Maryland’s annual test program, the MSA. As she continues, it becomes clear that she feels the school is putting too much stress on test prep. This is a familiar complaint. But yes, our analysts were somewhat surprised when Perlstein offered this analysis:
PERLSTEIN (9/4/06): An impressive 90 percent of Tyler Heights' third- and fourth-graders passed the reading assessment this year [in the spring of 2006]. Only 82 percent of the fifth-graders passed, and they were considered the smartest kids in school. I wonder if the scores in the fifth-grade class I observed were jeopardized by the teacher's frequent rejection of the school's "laser-sharp focus" on the MSA [Maryland’s annual tests], as the principal put it, to detour into the real-world discussions that so engaged her students.
We’ll admit it—we were surprised by that passage. 90 percent versus 82? Does Perlstein really think that’s a striking difference, especially given the small numbers of children involved in these tests? (In that fifth grade, 33 children were tested.) And that gap isn’t even as big as it seems; only naturally, our analysts checked the statewide numbers, and found that last year’s fifth-grade reading test was slightly “harder” than the third- and fourth-grade counterparts. Here are the percentages of Maryland kids who passed last year’s reading tests:
Statewide passing rates in reading, MSA, spring 2006
Third grade: 78.3 percent
Fourth grade: 81.8 percent
Fifth grade: 76.6 percent
At Tyler Heights, fourth-graders outperformed the state by 8 points—but fifth-graders outperformed it by 5. The difference is barely worth mentioning. By the way, were those fifth-graders “the smartest kids in the school,” as Perlstein says was the general belief? We don’t have the slightest idea—but if teacher judgment weren’t sometimes faulty, there’d be no reason for this type of testing at all! It just may be that those fifth-grade kids were not superior to their lower-grade counterparts. But it’s hard to base judgments on such modest numbers. An education expert should know that.
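For readers who like the arithmetic spelled out: under a simple binomial sampling model (our assumption—the column gives only the fifth grade’s class size of 33), the random fluctuation in a passing rate measured on 33 kids swamps an 8-point gap. A rough sketch:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95 percent margin of error for a passing rate p
    measured on n students, assuming simple binomial sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# Fifth grade at Tyler Heights: 82 percent passing, 33 students tested.
moe = margin_of_error(0.82, 33)
print(round(moe * 100, 1))  # → 13.1 (about a 13-point margin of error)
```

On that back-of-the-envelope reckoning, the 82 percent figure carries roughly a 13-point margin of error—far larger than the 8-point gap Perlstein found so suggestive.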

(For what it’s worth, those fifth-graders underperformed their lower-grade peers by a somewhat wider margin on last spring’s math tests. This could reflect their teacher’s performance in some way. Or it could mean almost nothing. To review the data from Tyler Heights, just click here, then noodle around.)

Yes, our analysts were somewhat surprised when Perlstein gobbled valuable space speculating about such meager numbers. But aarrgh! We were even more frustrated by some of the things she skipped past in her column. Perhaps these points will be fleshed out in her book. But our analysts were tearing their hair by the time the scribe wrote this:
PERLSTEIN: And [state officials] shouldn't assume, either, that tests tell you everything. Upping the pressure gives teachers an incentive to narrow the curriculum to just what's on the test. That's fine if the assessment tests everything. But any Tyler Heights teacher would tell you that the third-graders' 90 percent proficiency on a reading test comes at the expense of glossed-over science, social studies and writing curricula, and even of many of the state reading standards that teachers know won't be tested.
Really? Teachers are “glossing over” science and social studies? And they know which skills will be skipped by the tests? Both those statements are quite striking, but they’re passed by in this piece without comment. For ourselves, we were most struck—most struck by far—when we read this part of the column:
PERLSTEIN: In one of last year's third-grade classes, most of the kids were on grade level and had little trouble understanding new lessons. In another class, the kids came with few skills—many used fingers to subtract three from five—learned slowly and had an awful time subduing their anger. Yes, "all children can learn," as politicians put it, but many of these 14 had had quite a hard time of it ever since kindergarten.

So, because more of those children failed the MSA in that class than in others, should their teacher be penalized?
Perlstein is concerned that teachers might get shortchanged—but what about those 14 struggling children? Again, analytical weakness seems to raise its head here. Those grade-level kids “had little trouble understanding new lessons,” Perlstein says. But presumably, that’s because they were given grade-level lessons—lessons they were prepared for. (Those same kids would have had lots of trouble if they’d been given sixth-grade lessons—or lessons from an MIT course.)

Here’s our question: What sort of lessons were those 14 strugglers getting? Earlier, Perlstein says that Tyler Heights teachers use “structured reading and math curricula...along with Anne Arundel County pacing guides that tell them what to teach each day.” (“Once a week teachers follow ‘explicit lessons’ that are completely scripted,” she also says, without further explanation.) Could this mean that those 14 struggling kids are getting the same structured, grade-level lessons that their more advanced counterparts are getting? If so, we pretty much know why these kids have “trouble understanding” their lessons and “an awful time subduing their anger.”

What were those struggling kids being taught last year? And oh yeah—since they’ve had “a hard time of it ever since kindergarten,” what were they being asked to do in first grade? These questions are massively more important than the matter of who gets paid what. We hope Perlstein’s book dives into these points. But bitter experience has killed off our hope. Indeed, sometimes we have an awful time subduing our own frustration.

Perlstein is a former ed writer. By this time next year, she’ll be a full expert. But uh-oh! We’ve often noticed that educational experts are expert at missing the most crucial questions! We hope that, when we read her book, we’ll jump up, sweetly surprised.