Reviewers need to be retrained to spot bad science

Mar 16th, 2011 by blog

From Esteban Moro Egido's blog “publish and perish”

Russ Altman from Stanford University has recently reviewed and presented the most interesting bioinformatics papers of 2010. While such a selection is undoubtedly subjective (obviously, because he did not mention any of my papers), it is an excellent practice and one that every research institute should adopt for every topic its community cares about. Staying informed about where the cutting edge actually lies is paramount for real progress; the alternative is to keep recycling the same old methods and be frustrated that the outcome never looks promising.

Speaking of promising, I particularly like a paper in this “highlights of 2010” list titled “Over-optimism in bioinformatics research”, which summarizes some of the things that are seriously wrong with the research community as a whole (not just bioinformatics): negative results are not easily publishable, which seduces “authors to find something positive in their study by performing numerous analyses until one of them yields positive results by chance, i.e. to fish for significance”. Positive results are all too willingly accepted, while negative results are interrogated until they crack and pretend to be positive, just for the publication. We all know that being a significance-fisher does not require much interrogation skill, because the evil sidekick “noise” is abundantly present in the kinds of data most of us work with. Contributing to all this is the fact that most published results are not reproducible. That I blame on unavailable code, software version changes and reference data updates, but most of all on the mad rush applied during data production and analysis because one might get scooped.
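To see just how little skill significance-fishing actually takes, here is a minimal sketch in Python. The sample sizes, the number of analyses and the choice of a t-test are my own illustrative assumptions, not taken from the paper; the point is only that running enough tests on pure noise will almost always hand you a “significant” result.

```python
# A minimal sketch of "fishing for significance": run many tests on
# pure noise and report the best p-value. All numbers are illustrative
# assumptions, not taken from the paper.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(seed=42)
n_analyses = 20   # e.g. 20 different outcomes or subgroups to try
alpha = 0.05

p_values = []
for _ in range(n_analyses):
    # Both groups are drawn from the SAME distribution: no real effect.
    group_a = rng.normal(loc=0.0, scale=1.0, size=30)
    group_b = rng.normal(loc=0.0, scale=1.0, size=30)
    p_values.append(ttest_ind(group_a, group_b).pvalue)

print(f"smallest p-value out of {n_analyses} analyses: {min(p_values):.3f}")

# With alpha = 0.05, the chance that at least one of 20 independent
# null tests looks "significant" is 1 - 0.95**20, roughly 64%.
print(f"P(at least one false positive) = {1 - (1 - alpha) ** n_analyses:.2f}")
```

With a 5% threshold and twenty independent looks at noise, the odds of at least one “significant” hit are about 64%, no real effect required.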

This brings us full circle to why a review of current research activity, a “highlights of 20XX”, is so valuable: knowing what others do gives us a better chance to position our research in an area that is truly new and unique, which in turn means more time to form a solid hypothesis, do good science and then publish only high-confidence results (positive or negative). However, the current “publish or perish” attitude makes this approach career suicide. There could be a shift towards “more quality than quantity”, though, if reviewers were briefed by their institute's “highlights of 20XX” presentations to scrutinize bad science. So, since we are all reviewers at some stage, we can take the first step towards good science ourselves, at our very next review.