Why we need to report more than “Data were analysed by t-tests or ANOVA”

T-tests and analysis of variance (ANOVA) are common statistical tests in physiology and biomedical science. While the SAMPL guidelines for reporting statistical analyses and methods in published literature state that authors should "describe statistical methods with enough detail to enable a knowledgeable reader with access to the original data to verify the reported results", such recommendations are rarely implemented. Simply stating that "data were analysed by t-tests or ANOVA" is problematic because (1) there is nothing to indicate what type of test was used or whether it was appropriate, and (2) the statement assumes that knowledgeable readers would choose the same test, an assumption that may not hold. Thus, the lack of detail in statistical reporting precludes replication and transparency, and may hamper efforts to encourage research reproducibility.

In a recent study, Weissgerber and colleagues conducted a systematic review to determine the quality of statistical reporting in biomedical science. Investigators reviewed original research articles published in June 2017 in the top 25% of physiology journals. What did they find?

85% of articles included a t-test or ANOVA, and 39% included both; ANOVA was more common than t-tests. Here are some key points:

ANOVA

  • Many papers lacked the information needed to determine what type of ANOVA was performed (Figure 1): 38/225 (16.9%) papers did not report how many factors were included for any ANOVA.
  • Among papers that used one-way ANOVA, 67/110 (60.9%) did so even though the study design included two or more factors.
  • Of the 41/225 (18.2%) papers that used repeated-measures ANOVA, 26/41 (63.4%) did not report whether each factor was between- or within-subjects.
  • The remaining 81.8% of papers that used ANOVA did not report whether repeated measures were used.
  • Few papers reported the information needed to confirm the ANOVA results (Figure 2): more than 95% of papers did not report the F-statistic or degrees of freedom, and 77.8% of these papers reported only ranges of p-values (e.g. p > 0.05, p < 0.05, p < 0.01) rather than exact values (see the sketch after this list).
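
To make this concrete, here is a minimal sketch (in Python with statsmodels, using simulated data and hypothetical variable names such as dose, sex and response; none of this is from the paper) of how a two-way ANOVA can be run and then reported with the details discussed above: the factors included, and the F-statistic, degrees of freedom and exact p-value for each effect.

```python
# Minimal sketch (simulated data, hypothetical variable names): running a
# two-way ANOVA and pulling out the details that should be reported:
# the factors included, F-statistics, degrees of freedom and exact p-values.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
data = pd.DataFrame({
    "dose": np.repeat(["low", "high"], 20),        # between-subjects factor 1
    "sex": np.tile(np.repeat(["F", "M"], 10), 2),  # between-subjects factor 2
    "response": rng.normal(10, 2, 40),             # simulated outcome
})

# Two-way ANOVA with interaction; the model formula makes the factors explicit.
model = smf.ols("response ~ C(dose) * C(sex)", data=data).fit()
table = anova_lm(model, typ=2)  # ANOVA table with type II sums of squares

# Report each effect as F(df_effect, df_residual) = ..., p = ... (exact p-value).
df_resid = table.loc["Residual", "df"]
for effect in ["C(dose)", "C(sex)", "C(dose):C(sex)"]:
    row = table.loc[effect]
    print(f"{effect}: F({row['df']:.0f}, {df_resid:.0f}) = {row['F']:.2f}, "
          f"p = {row['PR(>F)']:.3f}")
```

A statement along these lines ("two-way ANOVA with dose and sex as between-subjects factors; F, degrees of freedom and exact p-values reported for each effect") would let a reader verify both the choice of test and the results.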

t-tests

  • Over half of the papers (95/179, 53%) did not report whether all t-tests were paired or unpaired.
  • Of the 155 papers that used unpaired t-tests, 100/155 (64.5%) did not report whether equal variance was assumed for any unpaired t-test, and 10/155 (6.5%) reported this information for only some unpaired t-tests (see the sketch below).
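
As an illustration (again in Python, with simulated data and hypothetical group names; not taken from the paper), the sketch below shows how the paired/unpaired distinction and the equal-variance assumption translate into different t-tests, and how the t-statistic, degrees of freedom and exact p-value can be reported for each.

```python
# Minimal sketch (simulated data): how the paired/unpaired distinction and the
# equal-variance assumption map onto different t-tests, each reported with the
# t-statistic, degrees of freedom and exact p-value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
group_a = rng.normal(10.0, 2.0, 15)
group_b = rng.normal(11.0, 3.0, 15)

# Unpaired t-test, equal variances assumed (Student's t-test).
t_student, p_student = stats.ttest_ind(group_a, group_b, equal_var=True)

# Unpaired t-test, equal variances NOT assumed (Welch's t-test; its degrees of
# freedom come from the Welch-Satterthwaite approximation).
t_welch, p_welch = stats.ttest_ind(group_a, group_b, equal_var=False)

# Paired t-test: appropriate only when the same subjects are measured twice.
t_paired, p_paired = stats.ttest_rel(group_a, group_b)

print(f"Student's t-test (unpaired, equal variances):   t(28) = {t_student:.2f}, p = {p_student:.3f}")
print(f"Welch's t-test (unpaired, unequal variances):   t = {t_welch:.2f}, p = {p_welch:.3f}")
print(f"Paired t-test (same subjects measured twice):   t(14) = {t_paired:.2f}, p = {p_paired:.3f}")
```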


 

Figure 1: Proportion of papers reporting the information needed to determine what type of ANOVA was performed (Figure 2 of the original paper).

Figure 2: Proportion of papers with ANOVAs that reported the F-statistic, degrees of freedom and exact p-values (Figure 8 of the original paper).

Summary

In general, statistical reporting in physiology lacks the detail needed to support research reproducibility and leaves much to be desired. Unfortunately, Weissgerber et al.'s findings agree with our own recent research. Investigators, journals and reviewers need to work together to improve statistical reporting practices. Strategies include implementing changes through reporting guidelines and journal policies, and up-skilling investigators and peer reviewers through training packages and courses. We hope that improvements in statistical reporting will allow readers and the scientific community to critically evaluate published research.

Reference

Weissgerber TL, Garcia-Valencia O, Garovic VD, Milic NM, Winham SJ. Meta-Research: Why we need to report more than ‘Data were Analyzed by t-tests or ANOVA’. eLife 2018;7:e36163.

 
