The Centre for Open Science has launched the TOP Guidelines.
The success of a journal is difficult to measure, and various methods have been adopted to find a balanced way of doing so. Journals are ranked to promote high-quality publications and to reflect the relative difficulty of being published in a given journal, and therefore the prestige that comes with it. Arguably the most commonly used ranking method is the Impact Factor.
The Impact Factor is a rough measure of how ‘important’ a journal is: the average number of citations received this year by the articles the journal published in the previous two years. Citations are used as a proxy for quality, originality and, in the health sciences, clinical relevance. It is quick and easy to use, and over 11,500 science journals are currently indexed.
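The two-year calculation described above can be sketched in a few lines. This is an illustrative simplification with made-up numbers, not the exact method used by any indexing service:

```python
def impact_factor(citations: int, citable_items: int) -> float:
    """Two-year impact factor: citations received this year to articles
    published in the previous two years, divided by the number of
    citable items published in those two years."""
    return citations / citable_items

# Hypothetical example: a journal published 150 citable articles over
# the previous two years, and those articles were cited 600 times this
# year, giving an Impact Factor of 4.0.
print(impact_factor(600, 150))  # 4.0
```

This also makes the manipulation discussed below easy to see: anything that inflates the numerator (citation-heavy review articles, early-year publication) raises the score without the underlying research changing.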
The Impact Factor has been widely criticised as an invalid measure that can be manipulated by editors and authors. For example, journals can publish more review articles, which naturally contain more citations, or publish their most ‘citable’ articles early in the year, giving them more time to accumulate citations. Because of these and other ways of gaming the numbers, many alternatives have tried to take the Impact Factor's place, but few have gained traction.
This lack of traction could be about to change with the introduction of the Journal Transparency Index, also known as the TOP Factor. TOP stands for the Transparency and Openness Promotion guidelines, a framework of 8 standards that indicate high levels of transparency and reproducibility in research. The 8 standards are: Data Citation, Data Transparency, Analysis Code Transparency, Materials Transparency, Design and Analysis Guidelines, Study Preregistration, Analysis Plan Preregistration and Replication, with additional points available for Registered Reports and Open Science Badges. Each standard is scored from 0 to 3 and the scores are summed, the idea being that a journal is now ranked on actual quality instead of on citations as a proxy.
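The summing scheme above can be sketched as follows. The standard names come from the list in the text; the per-standard scores are invented for illustration and do not describe any real journal:

```python
# The 8 TOP standards named in the text.
TOP_STANDARDS = [
    "Data Citation",
    "Data Transparency",
    "Analysis Code Transparency",
    "Materials Transparency",
    "Design and Analysis Guidelines",
    "Study Preregistration",
    "Analysis Plan Preregistration",
    "Replication",
]

def top_factor(scores: dict) -> int:
    """Sum per-standard scores, checking each is in the 0-3 range."""
    for name, score in scores.items():
        if not 0 <= score <= 3:
            raise ValueError(f"{name}: score {score} is outside 0-3")
    return sum(scores.values())

# Hypothetical journal scoring 2 on every standard: 8 x 2 = 16.
example = {name: 2 for name in TOP_STANDARDS}
print(top_factor(example))  # 16
```

Unlike a citation average, each component here corresponds to a concrete editorial policy a journal either has or lacks, which is what makes the score harder to game.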
One of the key aims of the index is not necessarily to rank journals but to increase the visibility of good, transparent research, thereby raising the quality of all journals. Promoting transparency and reproducibility makes it easier to evaluate research quality. Clearly the TOP scoring system isn't infallible, as research can be transparent and still terrible, but the general idea is that transparency is a prerequisite for good research and can be easily evaluated by the reader.
The Centre for Open Science is a not-for-profit organisation which set up the TOP guidelines and manages the index. More can be read about them here.