Though states are making progress in supporting effective school data use, they must do more to ensure that stakeholders like teachers and parents can easily access information, according to the annual state analysis report, “Data for Action 2012,” released by the Data Quality Campaign, a nonprofit that advocates school data access for all stakeholders.
There are plenty of lessons in predictive analysis models, according to a 2011 white paper, “Worst Practices in Predictive Analysis,” by Information Builders, a company that focuses on enterprise business intelligence and Web reporting software solutions. Here is how to avoid them:
• Determine the ROI. When planning to implement predictive analysis, consider the total cost and the anticipated return to ensure the maximum value is achieved.
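As a rough illustration of the bullet above, ROI is conventionally framed as anticipated return net of total cost, divided by total cost. The sketch below is purely hypothetical; the dollar figures are invented for illustration and are not drawn from the Information Builders white paper.

```python
# Minimal sketch of the ROI check described above, using the standard
# formula: (return - cost) / cost. All dollar figures are hypothetical,
# invented purely for illustration.

def predictive_analysis_roi(total_cost, anticipated_return):
    """Return ROI as a fraction of total cost (0.5 == 50 percent)."""
    return (anticipated_return - total_cost) / total_cost

# Hypothetical project: $120,000 total cost (software, training, staff
# time) against $180,000 in anticipated savings and gains.
roi = predictive_analysis_roi(120_000, 180_000)
print(f"ROI: {roi:.0%}")  # 50%
```

A district would plug in its own cost and return estimates; the point of the bullet is simply to run this arithmetic before committing, not after.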
We at DA keep our ears to the ground and our noses to the grindstone, always looking for news to keep you, our readers, well informed. Much of what we’re hearing these days points toward the growing use of predictive analysis: examining student data to see where students are headed, rather than where they’ve been, as traditional data-driven decision making does. Sophisticated modeling software is beginning to move from the corporate world and higher education admissions into K12, and the potential is huge.
“The purpose is to bring out the formative nature of summative tests… to get teachers to also look forward—not just backward.” —Uve Dahmen, Twin Rivers USD

As school districts strive to meet Adequate Yearly Progress targets, they struggle with two key issues: how to identify students who may not achieve “Proficiency” on state tests, and then how to improve their learning and outcomes.
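To make the forward-looking idea concrete, here is a minimal sketch of the kind of early-warning calculation a predictive model performs when flagging students at risk of missing proficiency. Every field name, weight, cut score, and threshold below is hypothetical, invented for illustration; the commercial modeling tools the article describes fit their weights from historical district data rather than hard-coding them.

```python
# Illustrative sketch only: a toy early-warning score for flagging
# students at risk of missing "Proficient" on a state test. Weights,
# the 350 cut score on a 200-500 scale, and the 0.2 flag threshold are
# all hypothetical; a real predictive model learns these from data.

def risk_score(prior_scale_score, attendance_rate, course_failures):
    """Combine leading indicators into a single 0-1 risk estimate."""
    # Distance below a hypothetical proficiency cut of 350, normalized.
    score_gap = max(0.0, (350 - prior_scale_score) / 150)
    absence = 1.0 - attendance_rate          # share of days missed
    failures = min(course_failures, 3) / 3   # capped and normalized
    # Hypothetical fixed weights; a fitted model would learn these.
    return 0.5 * score_gap + 0.3 * absence + 0.2 * failures

students = [
    {"name": "A", "prior": 410, "attend": 0.97, "fails": 0},
    {"name": "B", "prior": 330, "attend": 0.88, "fails": 2},
]
flagged = [s["name"] for s in students
           if risk_score(s["prior"], s["attend"], s["fails"]) >= 0.2]
print(flagged)  # ['B']
```

The shift the quote describes is visible here: last year’s summative score is used not as a verdict on the past but as one input, alongside attendance and course grades, to a forecast that teachers can still act on.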
Every state in the country now has a longitudinal data system extending beyond test scores, according to the Data Quality Campaign’s seventh annual Data for Action analysis. Thirty-six states—a giant leap from zero in 2005—have implemented the organization’s 10 Essential Elements of Statewide Longitudinal Data Systems. While the results are promising, Aimee Guidera, executive director of DQC, warns that building the data system isn’t enough.
In 2004, Deborah Verstegen, professor of education finance, policy and leadership in the College of Education at the University of Nevada, Reno, set out to create a vast library of data that did not yet exist: state-by-state school finance formula figures. “The search for the best model to use in funding education is a perennial concern and interest,” she says.
With over 60 percent of school districts considering staff reductions to balance their budgets (Kober & Rentner, 2011), class size is likely on many educators' minds. With money tight, schools are seeking to focus available funds on the policies and programs most likely to have a positive impact on student learning. Although the effects of class size have been debated for decades, Tennessee's STAR project in the late 1980s seemed to settle the argument.
Failure Is Not an Option is not just the title of a best-selling book; it's a mantra for many high-performing districts. The Mansfield (Texas) Independent School District adopted this motto in 2007 and hasn't looked back.
The district—the second-largest in Texas with over 35,000 students—was far from low-achieving, although it was experiencing rapid change with the addition of over 2,000 students each year. Located outside Dallas, Mansfield has had to add a new school each year for the last 13 years to keep up with enrollment. It currently has 40 schools.
When Education Secretary Arne Duncan announced the Race to the Top program in 2009, he added two success measures to the plates of school districts, which traditionally have been judged by students’ high school performance in math, reading and science: college enrollment rates and credit accumulation. The American Recovery and Reinvestment Act of 2009, which launched Race to the Top, asks states to set up longitudinal data systems to report on students’ progress after they receive their diplomas.