How a new math program rose to the top

TUESDAY, MAY 23, 2000

Critics say the process of giving ‘Core-Plus’ a top rating lacked rigor and evidence of long-term positive impact

Mark Clayton
Staff writer of The Christian Science Monitor


A plaque in Andover High School’s main office announces that this Bloomfield Hills, Mich., school is no ordinary place – it is ranked one of America’s “100 best” high schools.

Mathematics is a serious matter here. Andover students are drawn from a community of auto-industry engineers and business elites who expect their children to use high-level math skills in a variety of high-tech careers. More than 95 percent of students go on to college.

Andover’s math test scores soar above those of most other schools in the state. Despite that, the prestigious school stopped offering its traditional math curriculum to new students in 1994 and began an experimental program known as Core-Plus Mathematics, based on the 1989 standards of the National Council of Teachers of Mathematics (NCTM).

Several such programs have come online at all grade levels during the past decade. All have had an important goal: to boost US students’ performance and interest in math. But their method, which some say favors a broad conceptual approach that doesn’t adequately teach skills, has come under attack from mathematicians, scientists, and parents.

The story of Core-Plus is a tale of how such programs got high scores from the US Department of Education that encouraged their adoption in schools – a process that critics say is deeply flawed.

Core-Plus materials were not festooned with arcane math notation. The texts focused on architectural design, manufacturing, and air-pollution problems. Instead of teaching algebra one year, geometry the next, then advanced algebra, trigonometry, and precalculus, Core-Plus wove together strands of each.

Guided by teachers, Andover students began working in small groups using powerful calculators, writing paragraphs to justify the mathematical rationale for their answers. It sounded promising.


High rating


Developing Core-Plus was a team effort, but it was Christopher Hirsch’s baby. So it was a happy moment for the professor of mathematics and math education at Western Michigan University at Kalamazoo when he got the good news last fall: Education Secretary Richard Riley had named Core-Plus an “exemplary” program. Riley’s 16-member expert panel had sifted through a crop of 61 new programs. Core-Plus had popped to the top with nine others.

“We wanted to teach math as a whole – not this layer-cake approach,” says Professor Hirsch, a leading name in math education who helped write the 1989 NCTM standards the program followed. “Students advance along these strands so they develop a sense that mathematics is very connected.”

Andover became a Core-Plus pilot school in 1993, and the next year it became one of 36 “field test” high schools. The Education Department “exemplary” rating was vindication for that choice.

“Of course we were pleased,” says John Toma, Andover’s principal. “We really believed in what we were doing and had a strong belief in making the study of math applicable to daily life.”


A student’s view


But Melissa Lynn felt differently. A freshman at Andover in 1993, she grew worried that her Core-Plus class wasn’t very challenging. She asked to switch into a traditional class, but that would have meant traveling by bus to another school.

So she threw herself into the Core-Plus program and received straight A’s. She graduated in 1997 at the top of her class, with a 3.97 grade-point average.

Then she took the math placement test at the University of Michigan at Ann Arbor and bombed it. She found herself in “remedial math.” She’s still upset.

“I wasted precious time and money,” she wrote in a 1998 letter to a university professor. “I did receive an ‘A’ in my [university] precalculus class, but I did so in spite of Core-Plus.”

“Core-Plus has strong points,” she acknowledges now. She liked the real-life problem solving. Yet the emphasis on expanding math into other aspects of life was done “at the cost of teaching the basic algebraic manipulations,” she says.

Bloomfield Hills officials point out that Ms. Lynn did well in precalculus and later college math – showing that Core-Plus worked. She disagrees. “My eighth-grade math helped me out more in college than Core-Plus did,” she says.


A panel’s mandate


In summer 1996, Manuel Berriozabal, a mathematician at the University of Texas at San Antonio, got a surprise phone call.

A Department of Education official wanted him to join a panel of experts whose congressional mandate was to identify top math programs.

“I was quite flattered,” he recalls. “I thought this was a first step that would help straighten out the math and science education system at the precollege, elementary and middle-school levels.”

Today, he is disappointed.

“The panel was a good idea,” Dr. Berriozabal says, “but we made some bad judgments. From the best I could tell, none of the programs we selected as ‘promising’ or ‘exemplary’ had any kind of long-term track record of achievement.”

After Berriozabal arrived in Washington, the panel began debating the criteria to determine a successful program. Berriozabal thought that long-term proof of achievement should top the list.

Most others on the panel wanted to require programs to conform to NCTM standards – then gauge achievement.

“These programs were just too new to require long-term impact studies,” says Steven Leinwand, co-chair of the expert panel on math. “To require that would have postponed any designation for years…. If we had built that criterion in, it would have been an uneven playing field, since many programs just haven’t been around long enough to have that kind of impact data.”

In 1997, after nearly a year of debate, the Education Department’s expert panel decided on eight criteria – one of which required evidence of “a measurable difference in student learning” in order for a program to be rated “exemplary” or “promising.” But long-term evidence was not a factor.

Berriozabal abstained or voted against all 10 programs designated “exemplary” or “promising.”


On a mission for better math


Efforts to develop better ways to teach math emerged from concern about American students’ math performance. Especially after the results in 1997 of the Third International Math and Science Study, which showed US seniors lagging well behind their international counterparts, fears grew that America could one day lose its technological and economic lead. To help reverse the nation’s dwindling ranks of technical majors, the White House and Congress ramped up spending on K-12 math education in the early 1990s. Even the National Security Agency today sends speakers to schools to talk about the importance of math achievement, the basis for code breaking.

But one of the most influential powers in math education today is the National Science Foundation. And even before the report of the Education Department’s panel, the NSF had decided Core-Plus and others were winners.

Between 1990 and 1997, the Education and Human Resources (EHR) Division of the NSF put out calls for research proposals to explore new ways of teaching math. The division spent about $86 million in the past decade to fund 13 multiple-grade level math-curriculum projects and build four “implementation centers,” says John Bradley, EHR’s mathematics program officer.

At the elementary school level, approximately 2.5 million students are using NSF-funded math programs today, Bradley says, and some 5,000 middle schools use them as well. No figures were available for high schools, but at least 500 use Core-Plus.

Still, the process troubled critics: Where was the independent evidence that they worked? For Connected Math, a middle school program, the NSF “outside evaluation” was done by a team that included Mark Hoover, now at the University of Michigan. For Core-Plus it was a team led by Harold Schoen, a University of Iowa professor.


Opening the floodgates of criticism


Such questions were on Norman Lynn’s mind. In fall 1997, four years after Core-Plus started at Andover, Dr. Lynn told his daughter’s story at a school board meeting, calling her and other Andover students “academic guinea pigs.” Many parents were shocked.

One of those was Gregory Bachelis, a mathematician at Wayne State University in Detroit. He decided to find out if Melissa’s math experience was unique. So he and a colleague put together a questionnaire to survey Melissa Lynn’s graduating class. Nearly half responded. Dr. Bachelis also surveyed graduates of nearby Lahser High School, the Andover rival that stuck with traditional math.

The results: 96 percent of Andover graduates who took Core-Plus and responded to the survey had taken remedial math in college, they said. Among Lahser High graduates who responded, 62 percent took remedial math.

What surprised Bachelis most, he says, were bitter comments from dozens of Andover graduates, including: “I have very few math skills, and none of them helped me with [math] in college.” Also, “I am … not the least bit confident with my math ability. I am upset that I was ever placed in a Core class.”

Not everyone was up in arms, though. Lisa Robinson, a freshman at the University of Michigan, liked the new math program at Andover.

“I liked Core-Plus,” she says. “The math I’m doing here at U of M is the same kind of program. There’s a lot of calculator stuff. It’s the same thing.”

Several mathematicians reviewed Bachelis’s work and found it solid. Still, Bachelis and his survey were castigated by Bloomfield Hills and Core-Plus officials. A lawsuit was threatened.

“The Bachelis study is flawed,” wrote Bloomfield Hills superintendent Gary Doyle in a letter earlier this year to the American School Board Journal. “He repeatedly contacted students … especially if he believed that they were negative about the program.” Bachelis agrees he persisted to get a complete sample, but denies selecting negative views.

A study rebutting Bachelis was soon unveiled. It said Andover graduates’ grades at the University of Michigan, Ann Arbor, were “stronger” than in the years before the program was adopted. But that study, too, is debated.

“What’s passing for educational studies these days is really embarrassing,” says David Symington, principal at Lahser High. “We tried Core-Plus. And I’ve been watching it for four years. I would not go to it and my math department wouldn’t either.”

Of the rebuttal study, he says, “They didn’t even have the correct year. They didn’t account for 100 students from Andover taking regular math classes here at Lahser, which made Core-Plus look better.”

Core-Plus director Hirsch dismisses the Bachelis survey as a manifestation of national “math wars.” “Some of our critics will go to almost any length to marginalize the good that’s coming from these projects,” he says.

Yet from 1994 to 1998, the years when Core-Plus was Andover students’ only choice, math ACT scores at the school remained flat. Meanwhile, math ACT scores of rival Lahser High, and those of schools across Michigan and the US, rose about 6 percent, according to a study by R. James Milgram, professor of mathematics at Stanford University and a critic of some new NCTM programs.

Exactly what held Andover students back is not known. But the ACT scores, which Andover and Lahser provided to Milgram, are not in dispute, he says.

Colleen Zematis, mother of a student at Andover who went through three years of Core-Plus, decided action was needed. She and others rallied and circulated petitions until, last fall, Andover returned to offering a traditional math option.

Andover Principal Toma says the school likes Core-Plus and was never dissatisfied with it. “Parents’ wishes must be respected,” he says. Half of students now take traditional math, he says.


Opposition to Top 10 math programs


Meanwhile, pressure has been growing elsewhere. Last fall, the Education Department released its Top 10 list of math programs. Reaction was swift.

Within weeks, a full-page open letter to Secretary Riley protesting the department’s choices appeared in The Washington Post, signed by more than 200 mathematicians, physicists, and four Nobel laureates. Few math researchers were involved in the federal review, and there were many mistakes in the new textbooks, they charged.

Some also wondered whether the Education Department criteria were unduly biased toward NCTM standards. Others wondered whether the panel had relied on unbiased studies of student achievement.

There were other concerns as well. In congressional testimony last month, David Klein, a mathematician at California State University at Northridge, said conflicting interests on the expert panel were a key problem.

“This [expert panel’s] list includes some of the worst math programs you can find anywhere,” said Klein, who signed the open letter to Riley. “The minutes of the [Education Department’s] expert panel show that the panel was aware of the problem of conflicts of interest,” he continued. “They raised the issue – and then dismissed it.”

Education officials and panel members say conflict-of-interest guidelines were followed scrupulously. But they concede the appearance of vested interests.

Luther Williams, for instance, was appointed to the panel in 1996. By most accounts, Dr. Williams, director of the Education and Human Resources division of the NSF that funded math-curriculum development, played a minor role on the panel. He did not attend meetings and left the panel entirely in 1998 before it voted.

But some panel members were mystified and wondered whether having NSF officials on the expert panel opened the door to charges of vested interests.

“Not enough thought had gone into the makeup of the panel,” says James Rutherford, an adviser to the American Association for the Advancement of Science, who was on the panel, too. “I really wondered if Luther should have been there at all. After all, at the NSF he was directly involved in funding the very programs we were evaluating.”

Even after Williams left the panel, there was another NSF official on board. In the end, six of the 10 programs selected by the panel were NSF funded – a striking success rate, since the NSF funded only 13 such projects in the 1990s.

“We were just trying to get people with experience,” an Education Department official says. “The NSF had experience.” Both that official and Janice Earle, a program director in the EHR division of NSF, also on the expert panel, deny NSF programs were favored. “They weren’t my programs,” Ms. Earle says.

But critics say other issues raise questions about whether the process was as objective and thorough as it should have been:


  • Programs did not have to show long-term evidence of achievement. But Leinwand says the programs are “too new” for that – and congressional pressure was building for action.
  • Studies showing evidence of higher achievement did not have to be independently reviewed by being published in peer-reviewed journals before being submitted. Rutherford says the panel did a good job but relied on “very soft evidence.”


The dearth of solid research tended to show up most with programs tagged as “promising.” Take, for instance, Middle-school Mathematics through Applications. One of two impact reviews says: “Because the outside evaluation was not complete … the program did not submit sufficient data to substantiate its effect on student achievement…”

A second reviewer rated it “marginal for promising.” The panel rated it “promising.”

Another reviewer wrote of Everyday Math, a K-6 curriculum: “In reviewing all the evidence provided … it does not provide meaningful evidence for program effects….”


  • Only three of the 96 reviewers had published a mathematical paper, says Richard Askey, a math professor at the University of Wisconsin at Madison and a reviewer. Most reviewers were not professional mathematicians, though many were math educators, he concludes. 


  • Several panel members were affiliated with programs being judged. Most notable was the co-chairman of the panel, Mr. Leinwand, who also sat on advisory boards of three programs being judged, two of which were later selected as “exemplary” programs – Connected Mathematics Project and Interactive Mathematics Program. Mr. Leinwand and others say that he reported his affiliations and left the room for all discussions and voting on them. 


  • Critics say the panel favored NCTM standards, which the panel mandated as “a filter,” according to meeting minutes. All 10 programs were based on the standards. But Leinwand denies his position on the NCTM board influenced that. Too, “those were the only national standards out there,” says panel member Jack Price, a math professor from California State Polytechnic University at Pomona, and former NCTM president. Forty-three states had NCTM-based standards.


Linda Rosen, adviser to Secretary Riley on math, says the recommendations are “just one tool.” “What [school districts] do with this tool … is up to them.”

But that’s small consolation for Robert Daitch, now a junior at the University of Michigan. He took four years of Core-Plus before graduating from Andover in 1998.

“Since I was a kid, I loved auto engineering,” he says. “So I attended a session for those who wanted to go into engineering. When they asked how many of us took calculus, all the others raised their hands. I began to realize my dream was not going to happen.” So Mr. Daitch took remedial math at the university – and became a communications major.