USDE Study on Educational Technology Effectiveness
In April the U.S. Department of Education released the study "Effectiveness of Reading and Mathematics Software Products," which found that test scores are not significantly higher in classrooms that use reading and mathematics software products. Students' test scores in classrooms randomly assigned to use certain products did not differ by statistically significant margins from test scores in control classrooms.
With computers becoming more common in American classrooms and districts facing increased costs of hardware and software, the No Child Left Behind law called for the USDE to conduct the national study using "scientifically based research methods and control groups."
For the yearlong study, the USDE contracted with Mathematica Policy Research Inc. and SRI International and selected 16 products based on public submissions and ratings by the study team and expert review panels. Thirty-three districts participated in the study and were recruited on the basis that they did not already use technology products similar or identical to the study products. Within each school, teachers were randomly assigned either to use the study product (the treatment group) or not (the control group).
The study specifically focused on whether students had higher reading and math test scores when teachers had access to the software products, as opposed to "assessing the effectiveness of educational technology across its entire spectrum of uses," the report states.
Teachers received appropriate training on implementing the products, and vendors were responsible for providing professional development and technical assistance. The report maintains that because the study implemented real products in real schools with teachers who had not already used the products, the findings do indeed "provide a sense of product effectiveness" under real-world conditions.
Following the release of the study, the Software & Information Industry Association, the principal association of the software and digital content industry, released its own statement on the findings, saying, "As this study recognizes, proper implementation of education software is essential for success." SIIA added that the USDE study may not have accounted for that key factor, which could have led to results that "do not accurately represent the role and impact of technology in education."
Karen Billings, vice president of SIIA's education division, said she was not surprised that the study did not show a large difference in student achievement.
"Year one of any study such as this isn't likely to show positive achievement gain," she said. "There are so many implementation issues that need to be sorted out."
In keeping with its commitment to proper implementation of technologies, SIIA has prepared an extensive software implementation toolkit for K-12 educators that provides specific guidelines to help facilitate the process, available at www.siia.net/education/foreducators.
A second report for the study will examine whether the software products are more effective when teachers have more experience using them.