Social psychology falls from grace

It is not only scientists in social psychology who indulge in fraud. Anthropology, for example, has had its share of frauds. While corporations – such as GlaxoSmithKline – can be held liable and sanctioned for fraud, it is very rare for individual academics who fake data in pursuit of their own agendas to be held liable. Why can a concept of tort or “product liability” not apply to scientists? The members of the medical profession who aided and abetted GSK are unlikely to face any sanctions. But the recent scandals of social psychologists faking data to show statistical correlations between sets of propositions, and then inferring causal relationships, have demonstrated two things which I think apply in many more so-called “scientific” disciplines than just social psychology:

  1. the ease with which sampled data can be faked or cherry-picked by workers at reputed institutions to show apparent correlations, which are then given a stamp of authority through the publication of “peer-reviewed” papers, and
  2. the need to return to the scientific method of focusing on propositions that are falsifiable, and to resist the temptation of concluding that a positive statistical correlation proves a causal relationship.

Christopher Shea writes in the Wall Street Journal: 

Scandals in Social Psychology Spreading

Last fall, when I wrote in the Chronicle of Higher Education about a Dutch psychologist who was found to have been committing research fraud, and noted that a growing number of psychologists were raising questions about some of the statistical techniques used by their peers (techniques that, to be clear, allegedly led to findings driven by wishful thinking, not outright fraud), one response from commenters was: Why pick on psychology?

Indeed, statistical problems bedevil many fields. One noted biostatistician has suggested that as many as half of all published findings in biomedicine are false. Given large, complex datasets, and freedom to choose the parameters of a given study, there is a lot of room for error to creep in when you really want to find a result.

But now, as BPS Research Digest reports, in a useful summary of the case, yet another Dutch social psychologist, Dirk Smeesters, has been found guilty by his university of dubious “data selection” and failing to maintain adequate research records. And so the focus is on social psychology once again. ….. 

Such fraud can be detected, but it takes a great deal of effort. In the case of Smeesters, the investigative work was done by Uri Simonsohn:


The most startling thing about the latest scandal to hit social psychology isn’t the alleged violation of scientific ethics itself, scientists say, or the fact that it happened in the Netherlands, the home of fallen research star and serial fraudster Diederik Stapel, whose case shook the field to its core less than a year ago. Instead, what fascinates them most is how the new case, which led to the resignation of psychologist Dirk Smeesters of Erasmus University Rotterdam and the requested retraction of two of his papers by his school, came to light: through an unpublished statistical method to detect data fraud.

The technique was developed by Uri Simonsohn, a social psychologist at the Wharton School of the University of Pennsylvania, who tells Science that he has also notified a U.S. university of a psychology paper his method flagged. That paper’s main author, too, has been investigated and has resigned, he says. As Science went to press, Simonsohn said he planned to reveal details about his method, and both cases, as early as this week. ….. 

… Simonsohn already created a stir last year with a paper in Psychological Science showing that it’s “unacceptably easy” to prove almost anything using common ways to massage data and suggesting that a large proportion of papers in the field may be false positives. He first contacted Smeesters on 29 August 2011 about a paper on the psychological effects of color, published earlier that year. The two corresponded for months, and Smeesters sent Simonsohn the underlying data file on 30 November. Smeesters also informed a university official about the exchange. ……  
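Simonsohn’s “unacceptably easy” point is simple to reproduce. The sketch below is my own illustration, not his actual method: it simulates one common researcher degree of freedom – peeking at the data and stopping as soon as p < .05 – for studies in which both groups are drawn from the same distribution, so every “significant” result is a false positive.

```python
import math
import random

def pvalue(a, b):
    """Two-sided p-value for a difference in means (normal approximation)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    z = (ma - mb) / math.sqrt(va / na + vb / nb)
    return math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - Phi(|z|))

def run_study(rng, peeking):
    """One null-effect study: add 10 subjects per group at a time, up to 100."""
    a, b = [], []
    for _ in range(10):
        a += [rng.gauss(0, 1) for _ in range(10)]
        b += [rng.gauss(0, 1) for _ in range(10)]
        if peeking and pvalue(a, b) < 0.05:
            return True           # stop early and report "success"
    return pvalue(a, b) < 0.05    # the honest test looks only once, at the end

rng = random.Random(0)
trials = 2000
honest = sum(run_study(rng, False) for _ in range(trials)) / trials
peek = sum(run_study(rng, True) for _ in range(trials)) / trials
print(f"false-positive rate – fixed n: {honest:.1%}, with peeking: {peek:.1%}")
```

With a fixed sample size the false-positive rate stays near the nominal 5%; with repeated peeking it roughly triples – and peeking is only one of several degrees of freedom Simonsohn’s paper describes.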

…. At the moment, scientists don’t know the proportion of papers Simonsohn’s tool might be applicable to or how sensitive it is. Even if it is very good and wrongly indicts only one in 10,000 papers, it would misfire when applied widely, says psychologist Eric-Jan Wagenmakers of the University of Amsterdam. And some worry it could lead to witch hunts or score settling between rivals, Nosek says. Harvard University psychologist Daniel Gilbert says AAAS (the publisher of Science) should have top statisticians study Simonsohn’s method and recommend how to use it. “If we really do have a new tool to uncover fraud, then we should be grateful,” Gilbert says. “But the only difference between a tool and a weapon is in how judiciously they are wielded, and we need to be sure that this tool is used properly, fairly, and wisely.”
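Wagenmakers’ caution is base-rate arithmetic. A quick sketch with made-up numbers – a detector that wrongly indicts only 1 paper in 10,000, applied to a large literature in which actual fraud is rare – shows how even a very accurate tool accumulates false accusations when used widely:

```python
def flagged_counts(n_papers, fraud_rate, sensitivity, false_positive_rate):
    """Return (true positives, false positives) when screening n_papers."""
    fraudulent = n_papers * fraud_rate
    honest = n_papers - fraudulent
    true_pos = fraudulent * sensitivity        # frauds correctly flagged
    false_pos = honest * false_positive_rate   # honest papers wrongly flagged
    return true_pos, false_pos

# Hypothetical numbers, purely for illustration: 1,000,000 papers,
# 0.1% fraudulent, a detector catching 80% of frauds with a
# 1-in-10,000 false-positive rate.
tp, fp = flagged_counts(1_000_000, 0.001, 0.80, 1 / 10_000)
ppv = tp / (tp + fp)
print(f"frauds caught: {tp:.0f}, honest papers wrongly flagged: {fp:.0f}")
print(f"chance a flagged paper is actually fraudulent: {ppv:.1%}")
```

Under these assumed numbers, screening the whole literature wrongly flags about a hundred honest papers – which is exactly why Gilbert wants the method vetted before it is wielded widely.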

But it is not only social psychology and the other “soft” sciences that are susceptible to scientists who make up data to “prove” whatever they wish to prove. The traditionally “hard” sciences of physics, chemistry and medicine are not immune to such chicanery either. And when it comes to “climate science” – which is just as soft a science as social psychology – a political agenda has replaced the scientific method. The data are massaged, and apparent correlations are then taken to be “proof” of a pre-assumed causal relationship. Data which might falsify the assumed hypothesis are never admitted.

It is time to return to the old-fashioned scientific method, where statistical correlations can only suggest relationships but – in themselves – prove nothing.
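The point is easy to demonstrate. Two independent random walks share no causal link by construction, yet a naive significance test on their correlation will “confirm” a relationship most of the time. A minimal sketch, with all numbers chosen purely for illustration:

```python
import math
import random

def walk(n, rng):
    """An n-step random walk: no trend, no connection to anything else."""
    x, out = 0.0, []
    for _ in range(n):
        x += rng.gauss(0, 1)
        out.append(x)
    return out

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

rng = random.Random(0)
n, trials = 200, 500
# |r| threshold a naive i.i.d. test would call "significant" at p < .05
threshold = 1.96 / math.sqrt(n)
spurious = sum(abs(corr(walk(n, rng), walk(n, rng))) > threshold
               for _ in range(trials)) / trials
print(f"unrelated walk pairs declared 'significantly' correlated: {spurious:.0%}")
```

Well over half of these entirely unrelated pairs pass the naive test – a correlation dutifully “found” where no causal relationship exists at all.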

