A fascinating and alarming scandal rocked American academia last week when a group of three scholars revealed they had successfully tricked several peer-reviewed journals into publishing fabricated articles that catered to the biases of identity politics. The hoax intentionally echoed the 1996 Sokal Affair, in which a prominent physicist tricked a humanities journal into publishing a nonsensical article that purported to show the progressive political implications of quantum gravity.
In the updated version, since dubbed “Sokal Squared,” the pranksters composed 20 test articles with similar intentionally absurd premises, each cloaked in the language of Critical Theory, “cultural studies,” and academically fashionable causes of the far left.
The results of the experiment should raise alarm. Seven hoax papers were accepted for publication, two more were invited to “revise and resubmit,” and another five were in various stages of review or revision when the authors decided to halt their experiment and reveal the results. While some of the accepted papers landed at journals that appear to have been intentionally targeted for their doubtful quality (e.g. a journal specializing in “Poetry Therapy”), at least three passed screening at outlets that are considered to be top-tier field journals. The pranked philosophy journal Hypatia, the social work journal Affilia, and the gender studies journal Sex Roles all have strong journal impact metrics and would be considered a favorable mark for faculty tenure and promotion decisions in their respective fields.
With a few modest exceptions, fallout from the hoax has predictably coalesced along political lines. The political right sees vindication of a long-running complaint that the targeted fields are hopelessly politicized exercises in “grievance studies.” The academic left has generally responded by downplaying the significance of the hoax, questioning its lack of a control in other disciplines for comparison, and by attacking its authors for exhibiting disingenuous motives.
Each side has a point to some degree. The accepted articles, which included a litany of identity politics tropes and even an intentional rewriting of a passage of Mein Kampf in critical theory jargon, are absurd on their face. The authors also clearly employed deception as charged, although they never suggested otherwise and have been forthright in conceding this feature of their hoax.
But allow me to suggest that each of these arguments misses a more important takeaway from Sokal Squared. Political biases certainly made the accepted papers fashionable to these journals, and the pranksters selected their targets based on expectations that their editors and referees would take the bait.
What the hoax actually reveals, though, is a pervasive problem of declining scholarly rigor afflicting certain parts of the academy.
The Low Bar of Academic Research
Set aside the political dimensions of the hoax for a moment, and consider what its perpetrators were able to do. Despite having almost zero expertise or training in the associated fields of “cultural studies,” the pranksters succeeded in their ploy and did so with alarming ease. They spent no more than a few weeks on each article – sometimes less – and convinced at least seven journal editors and twice as many anonymous “expert” reviewers of their intentionally deficient products’ scholarly “quality.”
They did so by little more than adopting critical theory jargon and appealing to the intellectual biases of each venue, using both to dress up largely specious theses as scholarly products. With exceedingly minimal effort, they made themselves appear to be credible practitioners of scholarly subjects that ostensibly require a decade or more of time-intensive study and graduate-level credentialing to master. In short, they gave us a glimpse of the alarmingly low standards of rigor employed in these fields – so low, in fact, that mastery can be convincingly feigned by a complete novice.
Sokal Squared provides a glimpse of a larger and more complex problem in our university system, although the ease with which the hoaxers produced 20 papers and published seven of them is revealing. For context, consider that the majority of college professors in the United States average less than one scholarly publication of any type in a given year – articles, books, book chapters, or even review essays. Almost a third of all professors have not published anything scholarly in the last two years.
It is true that some academic appointments carry heavy teaching loads and correspondingly lighter research expectations, although the aforementioned low publication rates improve only slightly as the survey moves up the ranks of tenured associate and full professors. Others spend their energies on time-intensive products such as books, or focus on placing work in higher-quality journals. But I raise the point to illustrate just how quickly the pranksters were able to generate not just one or two but an entire career’s worth of credible-looking yet patently fake articles.
Indeed, if the products of their year-long experiment were compiled together as a single academic Curriculum Vitae with a common theme in “cultural studies,” it would stand a reasonable chance of securing tenure at almost any college or university in America with the possible exception of a few elite departments.
And that is one of the most paradoxical features of the modern academy: publishing is easy, and in certain fields the bar of peer review is both exceedingly low and easily manipulated. And even then, most professors only clear it with sparing frequency.
Research Problems in Other Fields
The intentional targeting of “cultural studies” by the pranksters has prompted the familiar refrain of ‘tu quoque!’ amongst their detractors. One English professor offered this retort in the Chronicle of Higher Education:
“[He said] the trio’s work is “simply not rigorous research” and described three objections to it. It is too narrow in disciplinary scope, he said. It focuses on exposing weaknesses in gender and ethnic studies, conspicuously ideological fields, when that effort would be better spent looking at more-substantive problems like the replication crisis in psychology, or unfounded scholarly claims in cold fusion or laissez-faire economics. The trio could have reached out to colleagues in physics and other fields, but instead opted for “poor experimental design.” And they targeted groups that are “likely to be laughed at anyway,” showing not intellectual bravery but cowardice.”
Sadly, problems with the reliability and rigor of scholarly research do in fact plague other areas of the academy, including quantitative fields. These problems differ from those of the hoaxed qualitative journals in that they are rarely outright fabrications. Quantitative errors usually stem from sloppiness in experiment design, calculation mistakes, and misinterpreted results. But both types of error evince widespread shortcomings of quality and rigor in published academic work.
Both the physical sciences and quantitative social sciences have experienced a “replication crisis” in the last decade, in which researchers struggle to reproduce the empirical results of published articles using the same data and methods described therein. In 2010 the highly regarded medical journal The Lancet retracted an infamous paper purporting to link vaccines to autism after its results could not be replicated and an investigation revealed problems with its research design. Just this week, another peer-reviewed journal came under fire for publishing an article touting the medical benefits of homeopathy – widely regarded as a pseudoscience.
Economics is not immune to the problem either, although the particular grievance against “laissez-faire economics” in the foregoing claim is oddly made without any citation or examples. Rather, problems of rigor and replication in economics appear to concentrate around macroeconomic models – an area of the discipline that is anything but laissez-faire. Other reliability crises in economics may be found in ideologically progressive research areas such as the measurement of inequality. The work of Thomas Piketty, for example, has performed poorly under the scrutiny of replication (see my own research on this subject here, but also an independent verification of the same problem by another scholar). Curiously, it would appear that some of the most problem-plagued areas of economic research are also the most politically favored for their prescriptive policy implications.
In any case, simply shouting “what about ____?” and pointing to other academic disciplines remains an untenable defense of the successfully hoaxed journals in “cultural studies.” It only highlights the severity of the problem, even as that problem plays out in different ways across different fields. And it does nothing to address or correct the underlying dearth of rigor that permitted the hoaxes to succeed in the first place.