March 1, 2021 5:16 pm

The Fordham Institute Misleads the Public With False Claims About New Report

Published by

Carol Corbett Burris, Ed.D.

Executive Director of the Network for Public Education

A recent study published by the Thomas B. Fordham Institute, entitled Robbers or Victims: Charter Schools and District Finances, was rolled out with fanfare and sent to policymakers across the country. When the Fordham Institute sent out its mass email trumpeting the report, the subject line read: “New report finds charter schools pose no fiscal threat to local districts.” That subject line is blatantly false and unsupported by the Institute’s own deeply flawed study.

In the report and its public relations campaign, Fordham cynically attempts to razzle-dazzle the reader with misleading conclusions based on questionable data in hopes of convincing the public that charter schools do no financial harm to public schools. The Walton Foundation and The Fordham Foundation, the Fordham Institute’s related organization, funded the study. It is worth noting that The Fordham Foundation sponsors eleven charter schools in Ohio, for which it receives administrative fees.

The origin of the study, unacknowledged in the report, is author Mark Weber’s 2019 doctoral dissertation. Advocacy organizations are often accused of cherry-picking examples. With Robbers or Victims, Fordham cherry-picked a study on which to base its puffery. In a Fordham podcast and on his blog, Weber, an elementary school music teacher who completed his doctoral studies at Rutgers University, reports that Fordham approached him to author the report after reading his dissertation. The dissertation is composed of three papers, the first of which is the basis of the Fordham report.

There are differences between the dissertation and the report, but none substantive enough to change the results. Robbers or Victims adds two more years of data (2016 and 2017), the research questions are rephrased, some states are excluded, and several of Weber’s original cautions regarding the interpretation and limitations of his findings are either downplayed or dropped. Glossy bar graphs replace Weber’s tables.

In both the dissertation and the report, Weber attempts to show the association between charter growth and districts’ finances: how revenue and spending change as charter schools expand. He found that in most cases in the states whose data he analyzed, revenue and expenditures either increased or stayed the same when the number of students attending charters located in the district went up. In all cases, there is no evidence of causation, just correlation.

For those not familiar with the distinction, a correlation occurs when two measures follow the same trend line. It is not evidence that one causes the other. The classic example is the correlation between ice cream sales and murder rates: both are higher during summer months in big cities and then drop as the weather gets cooler. The website Spurious Correlations collects hilarious examples of such associations, including one between the age of Miss America and murders by steam, hot vapors, and hot objects.
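To see how easily a shared trend masquerades as a relationship, here is a minimal sketch in Python with entirely made-up numbers: two series that do nothing but drift upward over the same years show a near-perfect correlation.

```python
# Two made-up series that merely share an upward trend over time.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2009, 2018)

# Hypothetical district spending and charter enrollment, both trending up.
district_spending = 10_000 + 300 * (years - years[0]) + rng.normal(0, 80, years.size)
charter_enrollment = 500 + 40 * (years - years[0]) + rng.normal(0, 15, years.size)

# The Pearson correlation is close to 1.0 because both follow the same
# trend line, not because either causes the other.
r = np.corrcoef(district_spending, charter_enrollment)[0, 1]
print(f"correlation: r = {r:.2f}")
```

The shared background trend, here nothing more than the passage of time, does all the work; neither series explains the other.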

Fordham’s Petrilli latches onto the correlation and concludes that it appears charters do no financial harm to districts. In its news brief about the report, the National Alliance of Charter School Authorizers takes Fordham’s deliberate attempt to deceive one step further, saying, “Their findings show that if anything, increasing charter school enrollment has a positive fiscal impact on local districts.” That is blatantly false and deliberately misleading. “Impact” implies that the study can support a causal inference. It clearly cannot. But that is not the end of this study’s problems.

The Critical Question Not Posed

There is an obvious question that is neither posed nor answered: How do increases and decreases in revenue and spending in districts with charters compare to those in districts without charters? Are the comparative rates higher, lower, or the same?

I read the Fordham report and Weber’s dissertation three times in search of that answer or at least a discussion of the limitation. The author never addresses it.

To ensure I was not misinterpreting the analysis, I emailed Professor Bruce Baker of Rutgers University, a national expert on school finance. He is familiar with Weber’s dissertation, having served as his advisor. I noted in my email that even if increases in revenue and spending are associated with charter growth, it is meaningless unless you can compare those increases to those of other districts with no charter schools.

Baker acknowledged the absence of comparative data and then went one step further (quoted with his permission).

“Comparing districts experiencing charter growth with otherwise similar districts (under the same state policy umbrella) not experiencing charter growth is the direction I’ve been trying to push this with a more complicated statistical technique (synthetic control method).

“But even with that, I’m not sure the narrow question applied to the available imprecise data is most important for informing policy. The point is that the entire endeavor of trying to use these types of data – on these narrowly framed questions – is simply a fraught endeavor and one that added complexity can’t really solve.”
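For readers curious about the approach Baker mentions, here is a bare-bones sketch of the synthetic control idea, written in Python with invented data and no pretense of matching his actual specification: choose non-negative weights on comparison districts so that their weighted average tracks the charter district’s spending before charter growth, then read the post-period gap as the estimated effect.

```python
# A toy synthetic control: invented data, not Baker's actual specification.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
T_pre, T_post, n_controls = 8, 4, 20

# Hypothetical per-pupil spending paths: a pool of districts without
# charter growth, and one "treated" district observed over the same years.
controls = 10_000 + rng.normal(0, 500, (n_controls, T_pre + T_post)).cumsum(axis=1) / 4
treated = controls[:5].mean(axis=0) + rng.normal(0, 50, T_pre + T_post)

def pre_period_gap(w):
    # Squared distance between the treated district and the weighted
    # average of controls over the pre-charter-growth years only.
    synthetic = w @ controls
    return ((treated[:T_pre] - synthetic[:T_pre]) ** 2).sum()

# Weights form a convex combination: non-negative and summing to one.
res = minimize(
    pre_period_gap,
    x0=np.full(n_controls, 1 / n_controls),
    bounds=[(0, 1)] * n_controls,
    constraints={"type": "eq", "fun": lambda w: w.sum() - 1},
)

# The post-period gap between the real and synthetic district is the
# estimated effect of charter growth on spending.
synthetic = res.x @ controls
print("post-period gaps:", np.round(treated[T_pre:] - synthetic[T_pre:], 1))
```

Even this toy version makes Baker’s caution concrete: the answer is only as good as the comparison pool and the underlying data.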

Consider the following oversimplification of the problem. Between 2013 and 2018, national spending on K-12 education increased 17.6% as states recovered from the Great Recession. That is the average. Changes in state spending ranged from a 2% decrease in Alaska to a 35.5% increase in California. Both states have charter schools. Vermont, with no charter schools at all, had an 18% increase in spending. Florida, which has an ever-expanding charter school sector, increased spending by 11%. Only Alaska (which Weber does not include) did not see an increase. From a national perspective, it is a safe guess that revenue followed an upward slope similar to spending. So did the proliferation of charter schools. And so, frankly, did my age.

When examined at the state and local level, school revenue and spending become even more complicated. That is because both depend on who is in power, how much they value education, the perception and reality of need, and the locality’s economic well-being. In states with local funding, all of these decisions play out in each district. Although Weber attempted to control for various factors in his district-based modeling, it is impossible to account for all the drivers of revenue and spending in a meaningful way. Again, I quote Baker: “the entire endeavor of trying to use these types of data – on these narrowly framed questions – is simply a fraught endeavor.”

Bad Data and Big Limitations

The study rests on the measurement of what Weber refers to in his dissertation as “charter penetration,” which attempts to capture growth in the percentage of students attending “independent charter schools” located within a district’s boundaries.

The Fordham study defines “independent charter schools” as “schools that are not part of their host school districts.” “Independent charter schools” is a construct created by Weber to deal with NCES coding and state funding complexities. Under this definition, Florida, which has a high share of charter schools, was excluded from both the dissertation and the Fordham analysis. About half of the states with charter schools during those years were excluded, while states with few students in such schools were among the 21 in the Fordham report. Six of the 21 had fewer than 3% of the state’s students in “independent charter schools.” From these small “penetrations” (I wince whenever I use Weber’s term), big generalizations were made.

But there are more serious problems. Weber’s computations count all students who attend independent charter schools physically located in the district. However, students from outside the district can attend those charter schools, too. Weber acknowledges this problem in his dissertation and on page 21 of the Robbers or Victims report but claims its effects are limited. I believe, however, that it is more problematic than Weber admits. Here are a few examples.

On pages 55-57 of his dissertation, Weber lists the 100 school districts with the largest charter penetration rates. Some of the California rates are absurd due to that state’s policy of allowing districts to authorize charters in other districts. Let’s set those aside.

Midland Borough School District in Pennsylvania is listed as having a total population of 10,312 students in its public and charter schools. This is surprising because the Midland Borough School District has only 272 students in its K-8 schools. It has no high school. Could it be that over 10,000 charter students live in this small K-8 district? Of course not. Within the boundaries of this K-8 district are the physical address of a virtual school, the PA Cyber Charter, which pulls students from all over the state, and a brick-and-mortar specialty charter high school, the Lincoln Park Performing Arts High School.
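The arithmetic shows how badly a location-based count distorts the picture. Here is a quick sketch using Midland’s figures from above; the formula is my approximation of the penetration metric, not necessarily Weber’s exact specification, and the resident count is a hypothetical figure for illustration.

```python
# Back-of-the-envelope arithmetic for Midland Borough. The formula below is
# an approximation of the penetration metric, not Weber's exact specification.
total_listed = 10_312        # public + charter students, per Weber's list
district_enrollment = 272    # Midland Borough's actual K-8 enrollment
charters_in_borough = total_listed - district_enrollment

# Counting everyone enrolled at charters physically inside the borough:
location_based_rate = charters_in_borough / total_listed
print(f"location-based 'penetration': {location_based_rate:.0%}")  # ~97%

# Nearly all of those students live elsewhere in Pennsylvania. Suppose,
# purely hypothetically, that ten Midland residents attend those charters:
resident_charter_students = 10  # hypothetical figure
resident_based_rate = resident_charter_students / (district_enrollment + resident_charter_students)
print(f"resident-based rate: {resident_based_rate:.0%}")  # ~4%
```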

The Simi Valley Unified School District is one of California’s finest. According to Weber’s calculations, 45.5% of the district’s students attend a charter school. I could find no brick-and-mortar charter school located within its boundaries. However, California Virtual Academy has a physical address in the City of Simi Valley. A handful of students from the district likely attend the online school, but certainly not 45.5%.

Wainscott Common School District in New York is a one-room schoolhouse that serves 26 students in grades K-3. According to Weber’s list, in 2015 over 76% of its students attended a charter school located in the district. What charter school would that be? For a time, the Child Development Center of the Hamptons Charter School was physically located in the district. It was a K-5 school, primarily for special needs students (44% were students with disabilities), designed to serve children who lived within a 50-mile radius of the school. Might one or two of this tiny district’s students have attended it? Perhaps. But not 76%. The school, which opened in 2001, at its height served about 90 students. That was enough to register Wainscott in Weber’s model as a district with substantial charter penetration, which it never was.

The problem of identifying districts as having “high charter penetration” when they do not goes well beyond the issue of including virtual charter schools (which in 2017 enrolled 300,000 students), regional charter schools, and special needs charter schools. It also goes beyond charter high schools located in K-8 districts. Charter schools, especially those located outside of cities, pull students from other districts.

Let’s take, for example, Weber’s home state of New Jersey. Weber placed the Frelinghuysen Township School District on his top 100 list. It is a small elementary district that, in the 2017-2018 school year, had only 142 students. Within its boundaries is the Ridge and Valley Charter School. Weber reports that 46% of the district’s students attend the charter. That is likely not true. Two years later, only nine district students were attending the charter school; the rest of the school’s students came from numerous surrounding districts[1]. That same year, only 12 of the Sparta Technology Charter School’s 213 students came from its home district of over 3,000 students. And 96 of the 242 students at Unity Charter School came from the Morris School District of over 5,000 students, in which the school is located.

The assumption that nearly all students in a charter school come from the district where the school is located is fallacious in states with many small, local districts and porous charter attendance zones. For virtual charter schools, it is almost always wrong. No doubt similar examples exist in every state, and given the limited number of districts included in some states, the cumulative effect is meaningful. Districts are included that should not be, error is introduced into district calculations, and that error in turn affects state-level results.

An additional limitation results from the variety of ways that states fund their charter schools. In some states, charter funding goes directly to the school. In others, it passes through the district. Because it was unclear whether district costs and revenues associated with charter schools were included in source databases, Weber estimated proportional shares for some districts, but these estimations raise questions. For example, he writes that “some of the states with the largest increases in state funding per pupil, such as … New York also have policies that compensate districts specifically for charter-driven enrollment losses.”

However, this funding does not cover all of the revenue districts lose to charter school expansion, and the New York State program specifically excludes NYC from any reimbursement, even as the district now spends over $2.4 billion a year on charter schools, money that comes straight out of the city’s education budget.

Other questions are unanswered. Did Weber include state-by-state quirks such as the New York State requirement that NYC public schools provide or pay for charter school facilities, at a cost of $100 million annually? Did he deduct the partial compensation received on the revenue side for this expenditure? Were the fees received by authorizing districts in Michigan and California, and the transportation costs that districts in some states pay for students who attend charters, included? From every indication, the answer to these questions is no. “The available imprecise data” compounds error throughout, making even modest findings more questionable.

What About the Taxpayers and How Should We Determine Cost?

Weber rightly concludes that some of the observed increases in per-pupil spending are due to the inefficiency of running dual systems, charter and public. When 25 or 30 students leave an elementary school for a charter, you cannot get rid of the principal or lower the heat. You may not be able to decrease staff if the students who leave come from various grade levels. These are known as stranded costs.

Weber refers to these as inefficiencies. Superintendent Joe Roy of Bethlehem, Pennsylvania, calls them costs he is forced to pass on to taxpayers. Back in 2017, I asked Roy how much charters cost his taxpayers. Roy told me that the district budgeted $26 million (about 10 percent of its annual budget) that year to pay tuition and associated costs to charter schools. According to Roy, “We estimate that if all of the students in charters returned, even with hiring the additional needed staff, we would save $20 million. This is the cost of school choice.”

What Superintendent Roy and his business staff did, estimating how much district costs would rise if charter students returned and deducting that from what the district pays to send them to charters, is the only true way to determine the effects of charter schools on district finances.
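Roy’s numbers make that accounting easy to check. A minimal sketch, using the Bethlehem figures quoted above; the marginal staffing cost is implied by those figures rather than separately reported.

```python
# The counterfactual costing Roy describes, using the Bethlehem figures
# quoted above. The absorption cost is implied, not separately documented.
charter_payments = 26_000_000         # annual tuition and associated charter costs
net_savings_if_returned = 20_000_000  # Roy's estimate if all charter students came back

# Implied cost of educating the returning students in-district (new staff, etc.):
marginal_absorption_cost = charter_payments - net_savings_if_returned
print(f"implied cost of absorbing students: ${marginal_absorption_cost:,}")  # $6,000,000

# Net cost of charters to taxpayers: payments out, minus what the district
# would spend to educate the same students itself.
net_taxpayer_cost = charter_payments - marginal_absorption_cost
print(f"net taxpayer cost of school choice: ${net_taxpayer_cost:,}")  # $20,000,000
```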

In 2018, In the Public Interest, a national nonprofit research and policy organization, did precisely that kind of analysis to determine the extra costs of charter schools in three large California school districts. The analysis found that the studied districts lost tens of millions of dollars each year that would be recouped if students attended public schools instead of charters. You can read that report, authored by University of Oregon professor Gordon Lafer, here.

Such reports are tedious, difficult work best done at a small scale. They do not lend themselves to models cranked out of statistical software packages. They require state-by-state sampling, and results will vary depending on how each state finances its charter schools. Taxpayers, however, deserve an accurate picture of charter costs, not a guesstimate.

Despite Weber’s sincere efforts, Fordham’s Robbers or Victims makes no substantive contribution toward determining those costs, other than demonstrating what a fraught endeavor it is to answer such an important question with big, unexamined datasets and incomplete federal reports. Even Weber, on his blog, has made clear what the study does and does not say. Fordham’s distribution of the report to State Boards of Education, Commissioners, and other policymakers under a deliberately misleading headline, however, does a disservice to public schools and taxpayers alike.

[1] Based on data obtained from an Open Public Records Request from the New Jersey Department of Education.