Friday, 7 July 2017

Are you being bullied into bad science?

It was good news for Minerals Engineering to hear that the Impact Factor had risen 40% in a year to a record high (posting of 30th June). Coincidentally, the number of papers submitted in 2016 was also a record high, a 40% increase on the number submitted in 2015.
However, there is a darker side to this. Of the 1325 articles dealt with in 2016, 65% (roughly 860 papers) were rejected by me without being sent for review, and the final rejection rate was 79%, also an all-time record.
Clearly there is still a lot of bad science around, and early last year there was vigorous debate on the need to take bad science seriously (posting of 18 January 2016). So why is the message not getting through?
Apparently one of the reasons is that young researchers are being pressurised by their universities into publishing as many papers as possible in prestigious journals, even if this means cutting corners in a rush to publish. Bullied into Bad Science is a campaign led by a group of junior researchers at the University of Cambridge who are fed up with the relentless pressure to produce reams of 'jazzed-up' findings. The campaign has set its sights on the senior academics and university officials who choose which researchers to hire or fund on the basis of how many papers they have published and in which journals, sometimes without actually reading the articles in question! The movement has now spread to universities around the world, including Bristol, Oxford, University College London and the University of California, Los Angeles, and more than 50 academics have signed up so far.
The campaign notes that early career researchers (ECRs) are often pressured into publishing against their ethics through threats that they will not get a job or grant unless they publish in particular journals; usually these journals are well established, have a print version and a high impact factor, and are not 100% open access. They feel that these "out-of-date practices and ideas" hinder rather than help ECRs: evidence shows that publishing open access results in increased citations, media attention, and job and funding opportunities. Open dissemination of all research outputs is also a fundamental principle on which ECRs rely to fight the ongoing reproducibility crisis in science and thus improve the quality of their research.
Although I am fully aware that Minerals Engineering is one of those well-established journals, with a high impact factor in its field, and that there is increasing pressure to submit to it and have articles published in it, I do think that the aims of the ECRs are laudable, and I would like to open this for discussion. I would particularly like to hear of any of your experiences of being bullied into cutting corners in the hope of early publication (you may, if you wish, submit your comments anonymously).
Twitter @barrywills

8 comments:

  1. This is a great initiative. An interesting related read is Smaldino and McElreath (2016), 'The natural selection of bad science':

    http://rsos.royalsocietypublishing.org/content/3/9/160384

    From the abstract: Poor research design and data analysis encourage false-positive findings. Such poor methods persist despite perennial calls for improvement, suggesting that they result from something more than just misunderstanding. The persistence of poor methods results partly from incentives that favour them, leading to the natural selection of bad science. This dynamic requires no conscious strategizing—no deliberate cheating nor loafing—by scientists, only that publication is a principal factor for career advancement. Some normative methods of analysis have almost certainly been selected to further publication instead of discovery. In order to improve the culture of science, a shift must be made away from correcting misunderstandings and towards rewarding understanding. We support this argument with empirical evidence and computational modelling. We first present a 60-year meta-analysis of statistical power in the behavioural sciences and show that power has not improved despite repeated demonstrations of the necessity of increasing power. To demonstrate the logical consequences of structural incentives, we then present a dynamic model of scientific communities in which competing laboratories investigate novel or previously published hypotheses using culturally transmitted research methods. As in the real world, successful labs produce more ‘progeny,’ such that their methods are more often copied and their students are more likely to start labs of their own. Selection for high output leads to poorer methods and increasingly high false discovery rates. We additionally show that replication slows but does not stop the process of methodological deterioration. Improving the quality of research requires change at the institutional level.
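
    To make the dynamic concrete, here is a minimal Python sketch of the kind of selection process the paper describes. This is a toy illustration only, not the authors' actual model; the lab count, the "effort" variable, the error rates and the copy-the-most-published selection rule are all invented for this example.

        # Toy sketch (illustrative parameters, not the published model):
        # labs differ in methodological "effort"; low effort yields more
        # positive, publishable results, but more of them are false.
        # Each generation, new labs copy the methods of the most-published
        # labs, with a little mutation.
        import random

        N_LABS, GENERATIONS, TESTS_PER_GEN = 100, 201, 20
        BASE_RATE = 0.1   # fraction of tested hypotheses that are actually true

        efforts = [random.uniform(0.1, 1.0) for _ in range(N_LABS)]
        for gen in range(GENERATIONS):
            pubs, false_pos, true_pos = [], 0, 0
            for e in efforts:
                power = 0.8 * e                 # rigorous labs detect true effects
                alpha = 0.05 + 0.5 * (1 - e)    # sloppy labs admit more false positives
                n_pub = 0
                for _ in range(TESTS_PER_GEN):
                    if random.random() < BASE_RATE:      # hypothesis is true
                        if random.random() < power:
                            n_pub += 1
                            true_pos += 1
                    elif random.random() < alpha:        # hypothesis is false
                        n_pub += 1
                        false_pos += 1
                pubs.append(n_pub)
            # selection: the next generation copies the top fifth of labs by output
            ranked = sorted(range(N_LABS), key=lambda i: pubs[i], reverse=True)
            top = [efforts[i] for i in ranked[:N_LABS // 5]]
            efforts = [min(1.0, max(0.05, random.choice(top) + random.gauss(0, 0.02)))
                       for _ in range(N_LABS)]
            if gen % 50 == 0:
                fdr = false_pos / max(1, false_pos + true_pos)
                print(f"gen {gen:3d}: mean effort {sum(efforts) / N_LABS:.2f}, "
                      f"false discovery rate {fdr:.2f}")

    Run it and mean effort drifts down while the false discovery rate climbs, even though no individual lab cheats; selection on publication count alone does the work, which is exactly the paper's point.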

    Bob

  2. This is also evident in the comments on technical reports (NI 43-100, JORC, and SAMREC), where the rush to publish data is more important than accuracy and reliability. Probably a result of the digital age, where instant information is prized.

    "I want it now!"

    Mike Albrecht, P.E.

  3. I am extremely happy that you and the others who commented have put the real "malice" on the table. It is about time that the administrators of academic institutes and R&D organisations understand that "apples are apples and oranges are oranges": each discipline should have its own yardstick for assessing an individual's work, and the same yardstick should not be applied across all disciplines. The current systems are driving researchers to "manufacture/recycle" results in a hurry and in quantity, in the hope that some will hit the goalpost and help their career advancement. The system at the decision-making points of institutes has to change. I hope some of them read your blog.
    Rao, T.C.

  4. A worthwhile initiative, though the buck does not stop at the institutional level. Granting bodies (and their reviewers), and the way institutions are evaluated and assessed by government, are a major driver, and that pressure is transferred to the researcher. This is the case in both the UK and Australia. I recall the fates of departments around the UK being determined in part by the RAE (Research Assessment Exercise). We have the same sort of quality review exercise in Australia, called the ERA (Excellence in Research for Australia). It is based on assessing the quality of research output (above a certain threshold of numbers). Though volume is not supposed to come into it, some universities argue that it should, inasmuch as it distinguishes between two or more institutions of "equal quality". So, starting from the grass-roots level is to be applauded, but there must also be leadership from the top.

  5. I'm so glad that this topic is finally being addressed within the wider scientific community, and I would first like to state that this is my own personal experience and, hopefully, representative of the minority rather than the majority. As an ECR within a university, and only a year into my first research grant, I can completely empathise and relate to the issues highlighted within this article.

    Since starting I have felt constant pressure to publish, with threats relating to the security of my position, and with my reputation and ability constantly being held over me; the phrase 'publish or perish' is often quoted to me when I discuss this with my line manager (a senior professor!).

    This attitude has not only been detrimental to my confidence and development as a researcher but has also threatened to cut my career short, because in this situation there are no winners. If I submit work that is below standard, not scientifically sound, or, as rightly said, 'jazzed up', it not only puts me in an uncomfortable position but may reflect badly on my professional reputation and undermine my integrity. It also sets us up to fail: as most ECRs know, these types of papers probably won't even be accepted for review, and the almost certain rejection again instils a lack of self-confidence and conviction in our own work and, putting it bluntly, wastes time.

    If, however, I choose not to follow this route and instead focus on conducting good research, presenting it only once I have something that is robust and worth sharing, I face heavy criticism: I am branded 'lazy', my work is called inadequate, and I am told I will not survive without a good publication record. But this is the problem: 'good' does not mean extensive or large. As we all know, good science takes time, and we are being pressurised into unrealistic timescales and expectations.

    I would like to note, however, that this is certainly not the experience of every ECR, and of course there has always been an emphasis on publication within the research sector (academic or industrial), but this particular recent change in attitude may be the effect of a larger problem.

    My experience is limited to that of a university environment, but from what I can gather, with universities now judging professors and holding their positions to ransom against their 'outputs', defined by a set of metrics such as papers (REF 2020), number of grants, etc., and not on their individual merits and contributions to a department, it was only a matter of time before these pressures were passed down the line. As an ECR I am not privy to the complete ins and outs of the university, so it may be that there are other factors at work of which I am not aware, but I am sure this has contributed in some way.

    This may sound like a damning review, but it couldn't be further from the truth: my line manager is an amazing researcher and professor in a difficult position, and my department is at the top of its game with regard to research and high-impact publications. I am only sharing my experiences because this is an issue that needs to be addressed across all research.

    At the end of all this (and possibly this is my naivety), surely it is better to take and invest the time to produce and present research that really has an impact and that you are proud of; otherwise all we will do is devalue ourselves and our research?

    Thank you MEI for providing a forum to discuss this.

  6. Good to see someone opening his heart. From my point of view and experience, there is no real conflict between good research and relevant research. The institutes themselves, in many cases, are not clear on what they are supposed to do, so everyone starts working on whatever he feels will give him papers at the earliest, and perpetuates the same.
    There is severe pressure on youngsters, leading to all these issues.
    In fact, this mania for publishing is also moving the profession away from industry.
    Barry, I am glad your journal has a high impact factor, so those working in mineral engineering are able to hold their heads high in a total system that spans many disciplines.
    The problem is that both the teaching and research institutes have become closed systems, and fresh thinking on how to identify relevant research is missing.
    Rao, T.C.

  7. Dear Barry,
    What is 'bad science'? There is science, and if there is 'bad science' then it is not science! It is good science as long as the experimental results are reproducible and are neither copied from elsewhere nor based on fabricated data. I think a researcher should have an internal urge to do research, and there should be external pressure on a career researcher to do the research for which he is paid. In the absence of these two factors a researcher will become lazy. A system spontaneously reaches its most stable state; this is one way of stating the second law of thermodynamics. Thus, without 'internal urge' and in the absence of 'external pressure', a researcher (early career or professional) is bound to become lazy, a stable state.

    As far as publishing in peer-reviewed journals is concerned, it may be required for early career researchers, and the importance of peer-reviewed journals with high impact factors will remain. However, in these days of internet communication I think 'post-publication open review' is better. The problem is that peers fear to review openly!

    I posted an article, "Principles of Phosphate Fertilization and PROM – Progress Review 2012", on researchgate.net, which questions the idea that the P in phosphate fertilizers should be in water-soluble form for use in alkaline soils. It has had 500 reads. A few people have started making Phosphate Rich Organic Manure (PROM) in India, and it works!

    Thanks,
    DMR Sekhar


If you have difficulty posting a comment, please email it to bwills@min-eng.com and I will submit it on your behalf.