Easy as pi?
According to the press release, there are two problems: the questions are too easy, and many of the students still can't answer them. To me, measuring students' mastery of basic skills seems an entirely sensible and appropriate goal for this kind of test. It would be a problem if the questions had gotten easier over time, making it difficult to measure improvements, but this doesn't seem to be the case.
Quoting from the study:
Recall that NAEP scores offer a ballpark estimate that today's eighth graders know about as much mathematics as tenth graders in 1990. If this were true—and if the scores represent real gains in knowledge of mathematics—then a positive impact would be expected in enrollment figures for higher level math courses. That is, the courses that eighth graders are taking today should be relatively similar to the courses taken by tenth graders in 1990.

I don't think this follows at all. Even if students are taking the same courses (or at least courses with the same names) as before, but learning more, that is real progress. It is likely that poorly prepared students are often passed along from class to class without having mastered the material. Furthermore, even with better prepared students, schools will not always offer more advanced classes because of a lack of qualified teachers. When I was in 8th grade, I had to travel to the high school every day to take Algebra I. In 12th grade, my large high school had no math teachers on staff who could teach calculus, so someone had to be brought in from outside. (You can find a sample of NAEP questions here.)
The two cohorts are not even close. There have been gains in Algebra I enrollment among eighth graders, from 16% in 1990 to 28% in 2003. Still, twice as many tenth graders had completed Algebra I in 1990 as there were eighth graders enrolled in the course in 2003. Almost half of all tenth graders (45%) had completed geometry in 1990, compared to a paltry 3% of eighth graders enrolled in geometry in 2003.