Retraction Watch has now obtained and translated the report of the investigation by the Dutch National Board for Scientific Integrity (LOWI) into the suspicions about Jens Förster’s research. The conclusion is unavoidable: data manipulation must have taken place and could not have been the result of “sloppy science”.
Here are some of the highlights from the document, which we’ve had translated by a Dutch speaker:
“According to the LOWI, the conclusion that manipulation of the research data has taken place is unavoidable […] intervention must have taken place in the presentation of the results of the … experiment”
“The analyses of the expert … did add to [the Complainant’s report] that these linear patterns were specific for the primary analyses in the … article and did not show in subsets of the data, such as gender groups. [The expert stated that] only goal-oriented intervention in the entire dataset could have led to this result.”
“There is an “absence of any form of accountability of the raw data files, as these were collected with questionnaires, and [there is an] absence of a convincing account by the Accused of the way in which the data of the experiments in the previous period were collected and processed.”
“[T]he assistants were not aware of the goal and hypotheses of the experiments [and] the LOWI excludes the possibility that the many research assistants, who were involved in the data collection in these experiments, could have created such exceptional statistical relations.”
What is particularly intriguing is the method of statistical investigation that was applied. Suspicions arose not only because the data showed a remarkable linearity, but also because sub-sets of the data did not. The first suggests confirmation bias (cherry picking), but the second brings data manipulation into play: non-linearity in sub-sets of the data cannot just neatly cancel itself out to give – fortuitously for the hypothesis – linearity in the complete data set. The investigation methods are of more value than the Förster paper that is to be retracted.
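For readers curious what “too linear” means in practice, here is a minimal sketch in Python. It is not the investigators’ actual procedure, and the numbers (the group means, the sample size, the within-group standard deviation) are made up for illustration; the point is only to show the logic of asking how often ordinary sampling noise would produce three condition means as close to a straight line as the ones observed.

```python
# Illustrative sketch only: NOT the LOWI experts' actual method.
# Idea: if three condition means lie almost exactly on a straight line,
# ask how often pure sampling noise would produce a result that linear.
import numpy as np

rng = np.random.default_rng(0)

def nonlinearity(means):
    """Absolute deviation of the middle mean from the line through the outer means."""
    low, mid, high = means
    return abs(mid - (low + high) / 2.0)

# Hypothetical observed group means (low / control / high condition) -- made-up numbers
observed_means = np.array([3.00, 3.50, 4.00])   # perfectly linear by construction
n_per_group, sd = 30, 1.0                       # assumed sample size and within-group SD

# Simulate the non-linearity that sampling noise alone would typically produce,
# taking the observed means as the true population means.
sims = 10_000
sim_nonlin = np.empty(sims)
for i in range(sims):
    sample_means = [rng.normal(m, sd / np.sqrt(n_per_group)) for m in observed_means]
    sim_nonlin[i] = nonlinearity(sample_means)

obs_nonlin = nonlinearity(observed_means)
p_too_linear = np.mean(sim_nonlin <= obs_nonlin)
print(f"observed non-linearity: {obs_nonlin:.3f}")
print(f"proportion of simulations at least this linear: {p_too_linear:.4f}")
```

A single experiment that comes out “too linear” proves little on its own; the force of the argument in the Förster case comes from the same improbability compounding across many independent experiments, combined with the finding that the sub-sets of the data did not share the linearity of the whole.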
I have an aversion to “science” based on questionnaires and “primed” subjects. They are almost as bad as the social psychology studies carried out based on Facebook or Twitter responses. They give results which can rarely be replicated. (I have an inherent suspicion of questionnaires due to my own “nasty” habit of “messing around” with my responses to questionnaires – especially when I am bored or if the questionnaire is a marketing or a political study).
Of course priming works – it couldn’t not work. But the lack of control over the information contained in social priming experiments guarantees unreliable outcomes for specific examples. …
This gets worse because social priming studies are typically between-subject designs, and (shock!) different people are even more different from each other than the same people at different times!
Then there’s also the issue of whether the social primes used across replications are, in fact, the same. It is currently impossible to be sure, because there is no strong theory of what the information is for these primes. In more straightforward perceptual priming (see below), if I present the same stimulus twice I know I’ve presented the same stimulus twice. But the meaning of social information depends not only on what the stimulus is but also on who is giving it and their relationship to the person receiving it, not to mention the state that person is in.
… In social priming, therefore, replicating the design and the stimuli doesn’t actually mean you’ve run the same study. The people are different and there’s just no way to make sure they are all experiencing the same social stimulus, the same information.
And results from such studies, even when they are the honest results of the study, have no applicability to anything wider than that study if they cannot be replicated.
Diederik Stapel markets himself (anonymously) on Retraction Watch
October 13, 2014

In June last year it disturbed me that the New York Times was complicit in helping Diederik Stapel market his “diary” about his transgressions. There is something very unsatisfactory and distasteful about allowing wrong-doers to cash in on their wrong-doing or their notoriety. I had a similar sense of distaste when I read that the Fontys Academy for Creative Industries had offered him a job teaching social psychology – almost as a reward for being a failed, but notorious, social psychologist.
Retraction Watch carried a post about the new job. And Diederik Stapel was shameless enough to show up in the comments – at first anonymously, and finally under his own name once he was exposed by Retraction Watch. The comments were all gratuitously self-serving. Perhaps he was carrying out a social experiment?
But this was also noticed by Professor Janet Stemwedel, writing in Scientific American:
Stapel will surely become a case study for future social psychologists. If he truly wishes rehabilitation, he needs to move into a different field. Self-serving, anonymous comments in his own favour will not rebuild the trust with his peers and his surroundings that he needs. Just as his diary is “tainted goods”, anything he now does in the field of social psychology starts out tainted, with the onus of proof on him to show that it is not.