September 19, 2021

“What opportunity costs were factored into your study?”

“None.”

This exchange recently took place during a virtual presentation of an economic impact study arguing that failure to approve a $20 billion transportation plan for the Charlotte, North Carolina metropolitan area would cost “up to 126,100 unrealized jobs, $10.1 billion in lost wages, $28.0 billion in economic outputs and $3.0 billion in tax revenue by 2050.”

With that question, Huntersville resident Eric Rowell had raised the very consideration usually overlooked in these exercises. As founder of the Charlotte chapter of the Bastiat Society, Rowell had insight many listeners didn't. He understood Bastiat's lesson of "what is not seen": productive capital isn't forcibly moved from Peter to Paul without harm to Peter's plans and to the people Peter would have done business with. He also recognized that these discussions usually amount to telling Peter what great things Paul is going to do for himself, for Peter, and for Peter's pals, and leaving it at that.

Around the country, local city councils and county boards of commissioners look for ways to revitalize their areas. Not often enough, however, do they consider making marginal changes of proven effect: reducing taxes, eliminating restrictive zoning and other regulations, and ridding themselves of protectionism, crony back-scratchery, and the like. Such policies would allow local entrepreneurs to thrive and would serve as warm invitations to others. Those entrepreneurs would build up the local economy based on their own talents and perceptions of market needs.

This growth would be widespread and gradual, however, rather than concentrated in a single conspicuous project, and therefore not visible enough to win politicians any credit. There are no ribbon-cutting ceremonies when the local grocer adds a shift, no photo ops when a local dressmaker opens a new strip-mall boutique, and no proclamations when a local restaurateur opts for a second location.

So their dreams run big. If only we had a big sports stadium, or a civic center, or a public festival, or a new transportation system, then we would reap big tourism and grow faster. People will flock to us because of this new thing. They cannot be faulted for believing their communities are winsome. It’s a trait you’d prefer in a community leader.

Trouble comes when developers of the new thing show up with a hefty "economic impact" study of the project. Often it will be based on proprietary input/output modeling software, billed as so easy to use that no economist is needed, and it will certainly look comprehensive. It will purport to consider not only the project's direct economic impact but also its indirect and induced impacts: not only how much money will be spent directly on the project, but how much of that spending will be respent in the area and how much new, complementary economic activity will accrue to the area.
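As a rough sketch of how those three layers stack up in such a study, consider the following illustration. All figures are invented for the example; the simple additive tally mimics the genre rather than any particular model:

```python
# Hypothetical illustration of how an input/output impact study tallies
# "economic impact." All figures are invented; no real model is consulted.
direct = 100.0    # $100M spent on the project itself
indirect = 60.0   # $60M of supplier purchases attributed to the project
induced = 80.0    # $80M of household re-spending attributed to project wages

total_reported = direct + indirect + induced
implied_multiplier = total_reported / direct

print(f"Reported total impact ($M): {total_reported:.0f}")  # 240
print(f"Implied multiplier: {implied_multiplier:.1f}")      # 2.4
```

Note that every line adds to the tally; nothing in it subtracts what the same dollars would have produced in the uses from which they were taken.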

This study will feature pages and pages of numbers no one wants to read, but the executive summary will suffice: the thing will boost the community beyond council members' and commissioners' wildest dreams. They wouldn't want to stand in the way of dream growth, would they?

But if the developers have their hands on such a sure-fire money maker, why are they taking it to local elected officials? After all, in the parable of the hidden treasure, Jesus didn’t say that the man who found the treasure covered it up and alerted the local council to share in the bounty. Instead, he bought the field it was in and was glad to do so.

Getting significant tax funding for the project is a sure-fire way for developers to protect themselves from big losses. Purchasing an economic impact study is rational rent-seeking behavior.

Economist Roy Cordato warned about these off-the-rack studies in his March 2017 report entitled “Economic Impact Studies: The Missing Ingredient Is Economics.” Cordato wrote:

These studies all have several things in common. First, they typically use proprietary, off-the-shelf models with acronym names like IMPLAN (Impact Analysis for Planning), CUM (Capacity Utilization Model), or REMI (Regional Economic Model, Inc.). Rights to use the models are purchased by professional consulting firms who are hired by the interest groups to do the studies. Furthermore, seldom do those who actually perform the studies have formal training in economics. Instead their expertise is in using one or more of the aforementioned proprietary models. And finally, all of these studies ignore basic principles of economics and, as a result, do not meaningfully measure what they claim to be measuring—the economic impact of the public policies and projects that they are assessing.

Citing a discussion of one such model, Cordato pointed out that the methodology rules out even the “possibility that a new project could cause a net reduction in income, output, or employment … . The ‘unseen’ of opportunity costs go unexamined and therefore unaccounted for.”
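To see what that omission conceals, here is a minimal sketch of netting an assumed opportunity cost against the gross figure. Both multipliers below are assumptions for illustration, not outputs of any actual study:

```python
# Minimal sketch: netting an assumed opportunity cost against the
# gross "impact" an input/output model would report. Illustrative only.
initial_spending = 100.0    # $100M taken from taxpayers for the project
gross_multiplier = 2.4      # what the impact study reports
foregone_multiplier = 2.6   # assumed yield of the displaced private uses

gross_impact = initial_spending * gross_multiplier
net_impact = initial_spending * (gross_multiplier - foregone_multiplier)

print(f"Gross reported impact ($M): {gross_impact:.0f}")           # 240
print(f"Net impact after opportunity cost ($M): {net_impact:.0f}")  # -20
```

On those assumptions, the headline $240 million becomes a $20 million net loss, which is precisely the possibility the methodology rules out in advance.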

Cordato also critiqued the idea of the “multiplier” intrinsic to economic impact modeling:

The idea is that, as the initial spending works its way through all the interconnections among and between industries included in the I-O tables, its impact is multiplied. The multiplier is a number by which the initial spending is multiplied to generate the final “economic” impact. … It is through this multiplier process that a dollar spent on a project may, at least within the context of the model, end up “contributing” many more times to “output, earnings, and employment.”

A safer assumption is that the multiplier is less than one and closer to zero. A multiplier greater than one would mean that politicians and special interests have better ideas for the use of your resources than you do, a reliably fatuous, self-serving proposition.

A further problem with the multiplier is, as Cordato pointed out, that “these studies ignore the first principle of production theory, taught in every first-year microeconomics class — the law of diminishing returns.”

For Charlotte, the plan with no admitted opportunity costs featured impressive multipliers. Every aspect would generate positive impacts of up to 2.0 to 2.7 times the original investment. Even spending $100 million on sidewalks would result in up to $240 million in economic impact, an implied multiplier of 2.4.

An amusing anecdote along these lines can be found in a 2001 Journal of Travel Research paper by John L. Crompton, Seokho Lee, and Thomas J. Shuster. Asked by city officials to project the economic impact of a 10-day festival boasting over 60 different events, one of the authors returned with an estimated impact of $16 million.

The officials “vigorously contested the results, arguing they were much too low.” Two weeks earlier, they had been told that a three-day professional rodeo event would result in nearly $30 million of economic impact.

“How can we possibly accept that this festival lasting for 10 days and embracing over 60 events had a smaller economic impact than a single 3-day rodeo event?” they asked.

The author requested and received a copy of the rodeo presentation, then took its erroneous assumptions and applied them to his own analysis of the 10-day festival. On that basis he projected an economic impact not of $16 million but of "more than $321 million." So by embracing rather than exposing what he called "abuses in economic impact analyses," he was able to present city officials with an "economic impact" about twenty times greater than before.

The final irony of the Charlotte transportation plan is that investigative reporting revealed the plan's cost was actually going to be 1.67 to 2.5 times what had been projected. Stated cost projections given to Charlotte city council members were between $8 billion and $12 billion. By obtaining internal emails among city officials, WBTV reporter David Hodges uncovered on June 22 that they were actually projecting it at $20 billion: 2.5 times the $8 billion low estimate and roughly 1.67 times the $12 billion high estimate.

The fact that discovering this higher expense took investigative reporting, instead of its being shouted from the rooftops, shows that the project's "everybody wins" multiplier effects aren't credible. It is being treated as bad news that the project will cost much more than advertised. If that spending were truly expected to return far more than the amount the public was made to spend, then higher-than-expected spending would be good news. Good reporting helps people because they intuitively understand that cost overruns are bad. What would also help people are more council members and commissioners who know to ask about what is not seen: opportunity costs.

Jon Sanders

Jon Sanders is an economist and the director of the Center for Food, Power, and Life at the John Locke Foundation in Raleigh, North Carolina, where he also serves as research editor. The center focuses on protecting and expanding freedom in the vital areas of agriculture, energy, and the environment.

Follow him on Twitter @jonpsanders
