In the first year of a new test, scores on the 2006 Scholastic Aptitude Test fell by an average of seven points nationally, the sharpest drop in 31 years. For the class of 2006, the average critical reading score decreased by 5 points to 503. Critics and supporters alike are trying to determine why scores fell far more than they have in any year in decades (see chart) and what comes next.
The biggest change for the first cohort of students who took the SAT last March was a new, hour-long writing section, which includes both multiple-choice questions and an essay. The essay asked students to respond to a point of view on an issue in an original, first-draft format, supporting a position with reasoning and examples, for a maximum score of 12. The math section added topics from third-year college-preparatory math, such as exponential growth and absolute value, while eliminating quantitative comparisons. In the critical reading section, analogies have been eliminated. All told, the test has gone from 2 hours and 30 minutes to 3 hours and 45 minutes.
"I'm not surprised at all that they've had a big drop in scores. They rolled out this test too quickly. It's too long and I don't care what anyone says, it's fatiguing," says Brad MacGowan, a college counselor at Newton North High School in Newton, Mass. MacGowan wrote a letter to the College Board back in May that included the signatures of more than 250 counselors from around the country, requesting that students have the option to take the new longer test over multiple days. "The critical reading section is too complex-not in a good educational way, but it's just hard to understand what they want you to do."
Changes in Approach
The College Board insists that the new test itself isn't the problem, however, pointing instead to changes in how students approached the test: primarily, that 41,000 fewer students retook it this year, with two-thirds of that decline among students who would have been taking it a third time. Typically, students who take the test a second time see a 30-point increase on their combined score, and students who take it three times see a 54-point increase. "The most straightforward way to determine the cause is to look at the direct evidence, look at the first score for each student, which is consistent. This year, without any repeating scores, we saw a 3-point decline in critical reading and a 1-point increase in math," says Caren L. Scoropanos, a spokesperson for the College Board.
Scoropanos also notes that 6 percent of the class of 2006 showed up early in their junior year to take the last administrations of the old SAT, presumably to avoid the new test. But while early testers are usually well-prepared and score well, last year's early-tester group scored more than 30 points lower than in prior years. As for fatigue, College Board analysis showed no difference in either the number of items correct or omitted between sections early or late in the test.
Still, even if the test is as well-constructed as it's ever been, the changes in test-taking behavior show that students may not be entirely comfortable with the new format. The College Board doesn't have numbers on why fewer students took the test a second time or why the early testers didn't do as well. "It's all speculation-it could be the test was too long, or they didn't like the writing, or they felt like the first time they did good enough," Scoropanos says.
The impact of the drop in scores isn't immediately apparent, but SAT critics like Bob Schaeffer, the public education director for the policy group FairTest, insist it's there. "It matters because colleges heavily rely on test scores. They have an explicit cut-off for scholarships, for example. One point can mean the difference in getting financial support or between being admitted or not," he says.
One result of the new format may be a decline in the popularity of the college-entrance test. Already under fire for tests that had been misgraded earlier in the year, the College Board saw a very slight decline in the number of students who took the test in 2006, from 1,475,623 in 2005 to 1,465,744. At the same time, the ACT has increased in popularity, from 1,186,251 students taking the test in 2005 to 1,206,455 in 2006, which is in keeping with the rate of increase over the last few years. "Ten years ago, you couldn't find ten students in our school in Massachusetts who had even heard of the ACT. Now a significant number of my kids are taking it," MacGowan says.
Depending on where a student wants to go to college, ignoring the SAT may not be an option, but nobody seems to have a good answer for how to prepare students to tackle a longer and in some ways tougher test, besides packing up a few energy bars. "There's no one silver bullet that's the reason students didn't do as well," argues Cathy Schroeder, the press secretary for the Department of Education in Florida, which saw its scores drop 3 points in 2006, a year in which the state's 10th-grade assessment stayed flat after several years of decline.
The drop in scores gives critics of the SAT more ammunition for their argument that the test shouldn't be a mandatory admission requirement for universities, and gives late-night comedians fodder for more jokes about how dumb today's students have become. But it's too early to say for sure what the dip really means. "Typically, when we introduce a new test, student behavior does change," Scoropanos says. "In 1994, when we eliminated antonyms and allowed students to use a calculator on portions of the math section, we saw a 7-point increase. No one complained about the test then."
Carl Vogel is a Chicago-based freelance writer.