Double dipping in science
For better or worse, our minds play tricks on us: we are prone to cognitive biases and logical fallacies.
These biases don’t affect how I do the laundry or prepare supper, although I may convince myself that I always intended to make a galette despite having the cookbook open at the soufflé page.
Unfortunately, these same biases and fallacies can wreak havoc on the decisions I make in science.
Going in circles
In a recent letter in Nature Neuroscience, Katherine Button writes about circularity in analysis. In this logical fallacy, the same data are used more than once in the same analysis. For example, we might analyse our data to identify a subset of data that is particularly interesting, and subsequently, analyse the same data to determine just how interesting it is.
The problem with this type of circular analysis is that it violates the assumption of independence. Why is this bad? Well, as Button explains, it undermines statistical inferences, inflates effect size estimates, and increases the chance of finding a statistically significant result when there is in fact no effect.
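The inflation is easy to demonstrate with a small simulation (my own sketch, not an analysis from Button’s letter): generate pure noise for many “voxels”, pick the ones that look most active, and then estimate the effect size in that subset from the very same data.

```python
import numpy as np

rng = np.random.default_rng(0)

n_subjects, n_voxels = 20, 1000
# Pure noise: the true effect in every voxel is exactly zero.
data = rng.standard_normal((n_subjects, n_voxels))

# Circular step 1: use the data to pick the 10 "most active" voxels.
voxel_means = data.mean(axis=0)
selected = np.argsort(voxel_means)[-10:]

# Circular step 2: estimate the effect size in those voxels
# from the SAME data that selected them.
circular_effect = data[:, selected].mean()

# Independent check: fresh data from the same null process,
# measured in the same pre-selected voxels.
new_data = rng.standard_normal((n_subjects, n_voxels))
independent_effect = new_data[:, selected].mean()

print(f"circular estimate:    {circular_effect:.3f}")
print(f"independent estimate: {independent_effect:.3f}")
```

Because the voxels were chosen precisely for having high values in this sample, the circular estimate comes out well above zero even though no real effect exists anywhere; the independent estimate on fresh data hovers around zero, as it should.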
Circularity in MRI research
In her letter, Button highlights one of the most damning and widespread examples of circular analysis: MRI data analysis. A decade ago, it was shown that, in a sample of 134 MRI and fMRI articles published in leading journals (Nature, Nature Neuroscience, Science, Journal of Neuroscience and Neuron), 42% contained circular analyses, with the analyses in an additional 14% of papers being unclear. Those are huge numbers. Numbers that bring into question the importance (or even correctness) of many research findings.
A straighter path forward
As pointed out by Button, there is no simple solution to the problem. Some researchers have opted to split their dataset in half, using one half to explore and the other to confirm, while others have used the entire dataset in a single confirmatory study. Button’s suggestions include better data-sharing, collaboration and data reporting, as well as the formal registration of study protocols (and analysis plans) before any data are collected. Unfortunately, based on an analysis of published studies and studies registered on the Open Science Framework, Button demonstrates that, compared to other types of neuroscience research, few MRI and fMRI studies pre-register their protocols.
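The split-half approach can be sketched in the same spirit (again, my own illustration of the general idea): select candidates on one half of the sample, then estimate the effect only on the held-out half, which breaks the circularity.

```python
import numpy as np

rng = np.random.default_rng(1)

n_subjects, n_voxels = 40, 1000
# Null data again: no true effect in any voxel.
data = rng.standard_normal((n_subjects, n_voxels))

# Split the sample: one half to explore, the other to confirm.
explore = data[: n_subjects // 2]
confirm = data[n_subjects // 2 :]

# Explore: pick candidate voxels using the first half only.
selected = np.argsort(explore.mean(axis=0))[-10:]

# Confirm: estimate the effect on the held-out half,
# which played no part in the selection.
confirmed_effect = confirm[:, selected].mean()

print(f"held-out estimate: {confirmed_effect:.3f}")
```

Because the confirmation half never influenced which voxels were chosen, its estimate stays near zero under the null, at the cost of halving the sample available for each step.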
Change is slow…
The letter by Button highlights that change is often slow in science. We are slow to understand the faults in our logic. We are even slower at putting in place practices that protect us against these faults. Nevertheless, the letter highlights that change is possible. Humans can learn to work with their biases and logical fallacies…
Although I am still convinced I set out to make a galette!