'Stuck Schools' Methods Simplistic and Misleading
April 7, 2010

New review finds Education Trust report too flawed to be of much value

Contact: Teri Battaglieri (517) 203-2940;
Jaekyung Lee (716) 645-1132;

EAST LANSING, Mich. (April 7, 2010) – A research report released by the Education Trust offers a framework for identifying the low-performing schools it purports to be most in need of targeted education reform strategies. A Think Twice review finds that the report's reliance on misleading data and unreliable methodology renders it of little value to policymakers.

The report, Stuck Schools, uses schools in Maryland and Indiana as examples to demonstrate its methods. The report sorts the schools using two variables – performance and improvement. First, schools are sorted by how well students perform, on average, over the first three years of the five-year study period. Then schools are sorted by how much they improve their proficiency over the five-year study period. The report's authors then identify those that are both low-performing and low-improving—the so-called "Stuck Schools" of the report's title.
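The two-variable sorting described above can be sketched in a few lines of code. This is purely illustrative: the school names, numbers, and the bottom-half cutoff are hypothetical, not the Education Trust's actual data or thresholds.

```python
# Illustrative sketch of a two-variable sort like the one the report describes.
# All data and the 50% cutoff are hypothetical, for demonstration only.

schools = [
    # (name, avg. proficiency % over first 3 years, 5-year change in proficiency)
    ("School A", 35.0, 1.5),
    ("School B", 72.0, 4.0),
    ("School C", 33.0, 0.5),
    ("School D", 60.0, -2.0),
]

# Rank schools on each variable separately.
by_performance = sorted(schools, key=lambda s: s[1])
by_improvement = sorted(schools, key=lambda s: s[2])

# Flag schools in the bottom half of BOTH rankings as "stuck"
# (the report applies its own cutoffs, not necessarily the bottom half).
half = len(schools) // 2
low_performing = {s[0] for s in by_performance[:half]}
low_improving = {s[0] for s in by_improvement[:half]}

stuck = sorted(low_performing & low_improving)
print(stuck)  # only schools low on BOTH dimensions
```

Note that any school's "stuck" status here depends entirely on where the cutoff is drawn relative to the other schools in the sample, which previews the norm-referencing concern Lee raises below.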

The report comes as the Obama Administration is shifting away from a school accountability system focused solely on school performance toward one focused on both performance and improvement.

In his review of the report, Professor Jaekyung Lee of SUNY Buffalo finds numerous flaws. Although the report's authors never say so explicitly, they assume that schools improve in a straight line over time. That assumption, however, misses other patterns of growth, as Lee demonstrates by presenting data from a Maryland school whose actual growth pattern is clearly not linear and that therefore doesn't necessarily warrant its "stuck school" label.
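Lee's point about nonlinear growth can be illustrated numerically. The five-year score series below is hypothetical (not the Maryland school's actual data): a school that dips and then rebounds sharply. A straight-line trend fit over all five years understates the school's recent trajectory.

```python
# Hypothetical 5-year proficiency series: a dip followed by a strong recovery.
# Demonstrates how a straight-line trend can misrepresent nonlinear growth.
years = [0, 1, 2, 3, 4]
scores = [40.0, 36.0, 34.0, 42.0, 50.0]  # dip, then rebound

# Ordinary least-squares slope over the full five years.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(scores) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, scores)) / \
        sum((x - mean_x) ** 2 for x in years)

recent_gain = scores[-1] - scores[-3]  # change over the last two years

print(round(slope, 1))   # modest per-year trend over 5 years
print(recent_gain)       # much steeper gain in the final two years
```

The linear fit (2.6 points per year) implies roughly 10 points of growth over five years, while the school actually gained 16 points in its final two years alone. A ranking built on the linear trend would miss that turnaround.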

Lee also points out that some of the report's key results may stem from a norm-referenced methodology that guarantees "failed" schools regardless of any school's true performance or improvement, as well as from a well-known statistical artifact called "regression to the mean," whereby apparent school improvement or decline may reflect nothing more than measurement error. He also questions the assumption underlying the report's title and its labeling of schools as "stuck." That label, Lee observes, "suggests that the schools themselves—rather than the structural and resource issues within which those schools carry on—should be blamed and 'turned around.'"
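Regression to the mean is easy to see in a small simulation. Here, every simulated school has the same true performance; scores differ only through random measurement noise. The schools that score lowest in year 1 nonetheless show a positive average "gain" by year 2, purely as an artifact. (This simulation is our illustration of the statistical concept Lee invokes, not an analysis from the report or the review.)

```python
# Minimal regression-to-the-mean simulation: identical true performance,
# two noisy measurements. The lowest year-1 scorers "improve" by artifact.
import random

random.seed(0)
TRUE_SCORE = 50.0   # every school's true performance is identical
NOISE = 5.0         # standard deviation of measurement error
N = 1000

year1 = [TRUE_SCORE + random.gauss(0, NOISE) for _ in range(N)]
year2 = [TRUE_SCORE + random.gauss(0, NOISE) for _ in range(N)]

# Select the bottom 10% of schools by their year-1 score.
cutoff = sorted(year1)[N // 10]
low_idx = [i for i in range(N) if year1[i] <= cutoff]

# Average year-over-year "gain" for that low-scoring group.
gain = sum(year2[i] - year1[i] for i in low_idx) / len(low_idx)
print(round(gain, 1))  # positive, despite no real change in any school
```

Because the low group was selected partly for bad luck in year 1, its year-2 scores drift back toward the common mean, producing apparent "improvement" where none exists.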

Summing up his review, Lee writes: "The overall conceptual framework of the report helps bring more attention to the issues of validating and using school-level performance trend data for accountability." But he warns that the "report's methods are so simplistic, arbitrary and poorly fitting to the report's own assumptions that it is more harmful to sound policymaking than helpful."

Find Jaekyung Lee's review as well as a link to Stuck Schools on the web at:

About The Think Twice Project
The Think Twice project provides the public, policymakers and the press with timely, academically sound reviews of selected think tank publications. It is a collaboration of the Education Policy Studies Laboratory at Arizona State University and the Education and the Public Interest Center at the University of Colorado at Boulder, and is funded by the Great Lakes Center for Education Research and Practice.


The mission of the Great Lakes Center is to improve public education for all students in the Great Lakes region through the support and dissemination of high quality, academically sound research on education policy and practices.

Visit the Great Lakes Center Web Site at: