Has the communication of scientific research reached a crisis point?

Traditional scientific communication directly threatens the quality of scientific research. Today's system is unreliable – or worse. Scholarly publishing regularly awards the highest status to research that is most likely to be wrong. This system determines the trajectory of a scientific career, and the longer we stick with it, the more likely it is to deteriorate.

Think these are strong claims? They, and the problems described below, are grounded in research recently presented by Björn Brembs of the University of Regensburg and Marcus Munafò of the University of Bristol in Deep impact: unintended consequences of journal rank.

Retraction rates

Retraction is one possible response to discovering that something is wrong with a published scientific article. When the system works well, journals publish a statement identifying the cause of the retraction.

Retraction rates have increased tenfold in the past decade, after decades of stability. According to a recent paper in the Proceedings of the National Academy of Sciences, two-thirds of all retractions follow from scientific misconduct: fraud, duplicate publication and plagiarism.

More disturbing is the finding that the most prestigious journals have the highest rates of retraction, and that fraud and misconduct are greater sources of retraction in these journals than in less prestigious ones.

Among articles that are not retracted, there is evidence that the most visible journals publish less reliable (in other words, less replicable) research results than lower-ranking journals. This may result from a preference among prestigious journals for results with more spectacular or novel findings, a phenomenon called publication bias.

The decline effect

One cornerstone of the quality-control system in science is replicability – research results should be described carefully enough that they can be obtained by others who follow the same procedure. Yet journals are not interested in publishing mere replications, giving this particular quality-control measure rather low status, however important it is – for instance, in studying potential new medicines.

When studies are replicated, the resulting evidence is usually weaker than in the original study. Brembs and Munafò review research leading them to conclude that "the strength of evidence for a particular finding often declines over time."

In a fascinating piece entitled The truth wears off, the New Yorker offers the following interpretation of the decline effect: the most likely explanation for the decline is an obvious one – regression to the mean. As an experiment is repeated, that is, an early statistical fluke gets cancelled out. Yet it may be precisely the spectacular nature of statistical flukes that increases the odds of getting published in a high-prestige journal.
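Regression to the mean is easy to demonstrate. The following simulation (a hypothetical illustration, not from the article – all numbers are invented) lets many "labs" measure the same fixed effect with noise, "publishes" only the most spectacular 1% of results, and then replicates exactly those studies:

```python
import random

random.seed(42)

TRUE_EFFECT = 0.5   # the real underlying effect size (hypothetical)
NOISE = 1.0         # measurement noise in each study

# First round: many labs each measure the effect once, with noise.
first = [TRUE_EFFECT + random.gauss(0, NOISE) for _ in range(10_000)]

# "Publication bias": only the most spectacular 1% of results get noticed.
threshold = sorted(first)[-100]
published = [x for x in first if x >= threshold]

# Replication: re-measure the same, unchanged, true effect.
replicated = [TRUE_EFFECT + random.gauss(0, NOISE) for _ in published]

pub_mean = sum(published) / len(published)
rep_mean = sum(replicated) / len(replicated)
print(f"mean of published originals: {pub_mean:.2f}")
print(f"mean of their replications:  {rep_mean:.2f}")
# The replications fall back toward the true effect (about 0.5): the
# "decline" is pure regression to the mean, with no change in reality.
```

Nothing about the effect itself changes between rounds; only the selection step does, yet the replications come out markedly weaker than the originals.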

The politics of prestige

One approach to measuring the importance of a journal is to count how many times scientists cite its articles; this is the intuition behind the impact factor. Publishing in journals with high impact factors feeds job offers, grants, awards and promotions. A high impact factor also enhances the reputation – and profitability – of a journal, and journal editors and publishers work hard to increase them, primarily by trying to publish what they think will be the most significant papers.

However, impact factors can also be illegitimately manipulated. For example, the actual calculation of an impact factor involves dividing the total number of recent citations by the number of articles published in the journal over the same period. But what is an article? Do editorials count? What about reviews, replies or comments?

By negotiating to exclude some pieces from the denominator of this calculation, publishers can increase the impact factors of their journals. In 'The impact factor game', the editors of the peer-reviewed open access journal PLoS Medicine describe the negotiations determining their impact factor. Their impact factor could have been anywhere from 4 to 11; an impact factor in the 30s is very high, while most journals are under 1. In other words, 4 to 11 is an enormous range. This process led the editors to "conclude that science is currently rated by a process which is itself unscientific, subjective, and secretive".
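The arithmetic behind that negotiation is simple. A minimal sketch – the citation and item counts below are invented purely for illustration – shows how shrinking the denominator moves the result across the whole 4-to-11 range:

```python
def impact_factor(citations: int, citable_items: int) -> float:
    """Impact factor: citations this year to the journal's content from
    the previous period, divided by the number of 'citable items'
    published in that period. What counts as 'citable' is negotiable."""
    return citations / citable_items

# Hypothetical journal: 1,000 citations to its recent content.
citations = 1_000

# If every published piece counts as citable (articles, editorials,
# reviews, replies, comments...):
broad = impact_factor(citations, 250)    # 1000 / 250 = 4.0

# If editorials, replies and comments are negotiated out of the
# denominator, leaving only "real" research articles:
narrow = impact_factor(citations, 91)    # 1000 / 91 ≈ 11.0

print(f"everything counted:    {broad:.1f}")
print(f"after negotiation:     {narrow:.1f}")
```

The numerator never changes; the journal's score nearly triples purely on the basis of what the publisher persuades the index to exclude.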

A crisis for science

I believe the problems discussed here amount to a crisis for science and for the institutions that fund and perform research. We have a system for communicating results in which the need for retraction is exploding, the replicability of research is diminishing, and the most standard measure of journal quality is becoming a farce. Indeed, the ranking of journals by impact factor lies at the heart of all three of these problems. Brembs and Munafò conclude that the system is so broken it should be abandoned.

Getting past this crisis will require both systemic and cultural changes. Citations of individual articles can be a good indicator of quality, but the excellence of individual articles does not correlate with the impact factor of the journals in which they are published. Once we have convinced ourselves of that, we must recognise the consequences it has for the evaluation processes central to the advancement of careers in science, and we must push nascent alternatives such as Google Scholar and others forward.

Politicians have a valid need to impose accountability, and while the ease of counting – something, anything – makes it tempting for them to infer quality from quantity, it does not take much reflection to see that this is a stillborn strategy. As long as we believe that research represents one of the few true hopes for moving society forward, we have to face this crisis. It will be challenging, but there is no other choice.

Curt Rice is vice-president for research and development at the University of Tromsø – this is an edited version of an article first published on his blog. Follow him on Twitter @curtrice

This content is brought to you by Guardian Professional. To get more articles like this direct to your inbox, join the Higher Education Network.