Assessing whether students learn what they are supposed to has long been part of the educational process, encouraged even more in recent years by mandates to meet federal No Child Left Behind requirements and state achievement standards.
But unlike summative assessments, which rely mostly on end-of-year test scores to gauge student achievement, formative assessments help shape instruction through repeated measures as students are learning. As a result, formative assessment is capturing new attention among teachers and administrators.
As district leaders know, formative assessment is not new. "It's been here a long time. It's just that we are paying more attention to it now," says Raymond Yeagley, vice president of product and business development for the Northwest Evaluation Association (NWEA), a national nonprofit based in Lake Oswego, Ore., that provides computer-adapted assessment services.
Formative assessment is "the name of the game these days" in schools, adds Ray Wilson, executive director of assessment and accountability in the Poway (Calif.) Unified School District. "If you really want to improve student performance, you need to focus on formative assessment primarily and on summative assessment secondarily," he says.
More than just whether students are passing or failing, formative assessment gives teachers and administrators information they need to determine suitable learning programs for students who need help in specific instructional areas. The information comes from benchmark and diagnostic tests that measure not just whether students are meeting required standards but that also identify learning difficulties they may be experiencing. It also is called adaptive assessment, because tests can be adapted to individual students' skill and knowledge levels while they are being given.
Wilson, whose district uses NWEA assessments, describes the benefits with a barnyard analogy: "Summative assessment just weighs a pig. Formative assessment provides the information you need to feed the pig so that it grows."
Personal and Group Targets
Poway tests all its students three times a year: in the fall, winter and spring. After the fall and winter tests, teachers set personal growth targets for individual students, telling them what they have to learn to get higher test scores next time, Wilson says.
Teachers also set classroom goals, such as working more on vocabulary, to improve students' collective performance. "We have learned that setting group targets, in addition to their own targets, is very motivating for kids," Wilson says.
Clayton Collins, director of instruction in the Beech Grove (Ind.) City Schools, which also uses NWEA's assessments, says pressures to satisfy NCLB and state requirements cause districts to focus on students' test results. "When you have a big body like the federal government which gives you money, you want to please them," he says.
But whether students pass or fail a test does not accurately indicate whether they really are learning, Collins maintains. "If a teacher can get good growth out of her students, then they are successful. It has nothing to do with whether or not they pass a state test," he says.
Beech Grove uses NWEA's Measures of Academic Progress (MAP) for the primary grades. MAP is a series of state-aligned tests that reflect students' knowledge and growth over time. The tests adapt to each student's achievement level, giving teachers information about what each student has learned and is ready to learn next. MAP also provides reports of student achievement for school administrators and parents.
Aware of NCLB pressures, NWEA has enhanced some MAP features, including adding pictures and sound to questions designed specifically for K-2 students. For example, students might be shown cartoon pictures of a dog, a kite and a bug and be instructed to match the letters D, K and B to sounds that correlate to the pictures. Collins says the measures help prepare Beech Grove's second-graders for third grade, which is when the district starts testing them for NCLB proficiency standards.
Formative assessment supports differentiated learning for struggling pupils. "Believe it or not, kids are different, and we need a way to get more information about what makes each of them unique," says Richard McCallum, a former classroom teacher and co-founder of Let's Go Learn, which provides diagnostic testing, data reporting and instruction to boost student performance in reading. The company offers Diagnostic Online Reading Assessment (DORA) and Diagnostic Online Math Assessment (DOMA) and will start a program that enables teachers to find appropriate materials and link them to Curriculum Associates' print instruction products.
"The more you know about kids and their abilities and behaviors, the better you will be able to diff erentiate or individualize their instruction and make adjustments or adaptations to ensure that they are moving on the right course to achieve standards, or developing confidence, critical thinking, and other skills and abilities they need," he says.
Formative assessment is becoming more popular because educators are realizing that testing students just to see how they score, prompted by accountability requirements of NCLB, is insufficient, McCallum continues. "Teachers already know which students are going to do poorly on those tests," he says. "The data really don't help."
What does help, McCallum asserts, is diagnostic testing that provides "the kind of fine-grained information you need about kids' performance to make appropriate decisions for their instruction." Thus, formative assessment helps teachers define how a student learns best, what help he or she needs in learning, and what resources will be helpful.
The computerized formats of formative assessments provide objective results within 24 hours, enabling educators to quickly change what and how they are instructing students who need help in certain curriculum areas. "Our assessment would not be possible without the multimedia adaptive environment of computers that we have today," McCallum acknowledges.
Some other assessment tools, such as Dynamic Indicators of Basic Early Literacy Skills (DIBELS), require testers to mark by hand how students respond to test questions within a certain period of time. Then teachers and administrators analyze the results and determine how to proceed with each student. But such a review takes time, and educators sometimes subjectively give students "the benefit of the doubt," says Sue Ann Highland, director of curriculum and instruction in the Highland Weld RE-9 School District in Colorado. She adds that training in assessment may not have been the same for all teachers, which could lead to inconsistent testing practices.
By contrast, students assessed through Let's Go Learn's system respond to questions on computers. "It takes the subjectivity out of the assessment and provides instant feedback," says Highland. She says testing a class of kindergarten students in a computer lab takes an hour, and testing all students in a school takes a week.
Teachers use the information to shift students from one classroom reading group to another, home in on specific skills that need attention, and make other changes.
As formative assessments continue to develop, their designers and users alike are excited about assessments' impact on helping students learn and subsequently boosting their test scores.
"Just taking a test doesn't make a kid learn more. It's how you use the data in your classroom instruction that will make a difference," declares Yeagley.
"No assessment is good in and of itself. It has to be coupled with proper instruction," Highland agrees. Done properly, she says the bottom-line result usually is improved student learning that translates into higher test scores as well.
Alan Dessoff is a contributing writer.