January 31, 2016
Publish or Perish
“Publish or Perish” seems to be the axiom of science these days. Production and time pressure are part of a scientist’s daily routine. The interplay between indexed journals and scientific production has left ever less room for confirmatory work. For that reason, Open Science Collaboration initiatives, tested in the field of Psychology, set out to estimate the reproducibility of published research; their results lead us to new implications, questions, and key areas of responsibility in the scientific and academic world.
In addition to indexed journals, which are considered a mark of higher scientific quality, we find the Impact Factor (IF), briefly defined as a measure reflecting the average number of citations to recent articles published in a given journal. Both have been widely recognized as standard quality measures, but both have also been strongly criticized. Along the same lines, new paradigms of the relationship between science and innovation can lead to a ‘mechanization’ of science, in the sense of understanding innovation exclusively as the expected outcome. In this regard, it is important to consider the role of science in society, its relation to industry, and the way funding is allocated across this spectrum. If we follow the perspective in which innovation is deeply linked to production, we should take into account that the rules of the market and of production might also play a leading role.
One essential feature of production is time, and maximizing it. Production has to be fast, and incentives (not only economic ones) are linked to valuable results, given the extremely scarce funding sources usually available for scientific research. In this regard, the paper ‘An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science’ presents an initiative under the umbrella of the Open Science movement. It is described as a collaborative project integrating numerous volunteers and institutions. The project, called “The Reproducibility Project”, is tested in the field of Psychology. It starts from the premise that “… [due to] strong incentives for innovation and weak incentives for confirmation, direct replication is rarely practiced or published” (p. 657). Underlining the authors’ words: innovation is the motor of production and determines the speed at which science has to be produced.
If the pressure to present results is so great, and funding limitations and tight schedules already constrain original research, what should motivate scientists to reproduce their colleagues’ work? Reproducibility for the sake of science alone would be the most favorable answer; however, some kind of incentive has to mediate. Perhaps ‘the desire to help science advance’, ‘improving one’s own career’, or simply ‘contributing’ and ‘learning new things’ would be among the answers. The task also faces further challenges, for instance the fear, which the authors also mention, of damaging the image of science and thereby reducing future funding. Unfortunately, in practical terms this might well happen, but I also share the view that projects like the one described in the paper can increase transparency and accessibility within the scientific community.
Visibility is another concept inherent to science, traditionally understood as appearance in indexed journals, IF, renowned institutional sponsorships, frequency of publication, relevance of the topic, and so on; but with this initiative we can also explore another perspective on visibility. What about offering my research to a reproducibility project? Some of the fruitful outcomes would be contributing to the field, strengthening the spirit of collaborative work, and eventually improving my results in further publications: a win-win, also in terms of visibility.
On the other hand, peer review is currently one of the most accurate and successful means of confirming the veracity of research in science. However, peer review also has its own limitations: for instance, the need for specialized equipment that is unavailable, or lab samples so specific that reviewers cannot access them and lack the budget to acquire them, to mention only the most common ones. These are challenges that might instead be brought under control in a reproducibility project.
How to deal with the results of a ‘Reproducibility Project’ is also a big question. In my opinion, the authors themselves struggled with evaluating replication-study results, conceding that “…As yet, there is no single general, standard answer to the question ‘what is replication?’ so we employ multiple criteria” (p. 658). At this point I will take the risk of saying that there is a big difference between research in Psychology and research in the natural sciences, particularly in my field, Microbiology. To replicate an experiment in Microbiology, most scientists working in the same field already know the procedure to follow (as a general rule, it must be specified in the methodology section of the paper), and most of them also have the proper equipment, or can at least gain access to it. So the concept of replication does not have as wide a range of definitions in Microbiology as it might in Psychology.
Open Science Collaboration. (2012). An open, large-scale, collaborative effort to estimate the reproducibility of psychological science. Perspectives on Psychological Science, 7(6), 657-660.