With the advent of Open Access – a new publishing model that asks researchers to pay to make their research publicly accessible – unscrupulous predators gained easy access to a house full of naïve scientists and doctors, and today they thrive at their expense, and at the expense of science, including cancer research.
“The email appeared legitimate. It spelled my name correctly, referenced some of my previous work, and used correct grammar. The journal wasn’t on Beall’s List of Predatory Journals and Publishers. I thought I had done my due diligence. I submitted my manuscript. Shortly after, I celebrated the first round of favorable reviews. Things were going great – or so I thought”. More and more scientists – like Alan Chambers, who recently told his dramatic story in Science magazine (How I became easy prey to a predatory publisher) – are learning the hard way about a nasty side-effect of the revolution that shook the world of scientific publishing and conferences in order to increase access to published research.
Black sheep in disguise
Several animal metaphors – from black sheep to sharks – have started to pop up, because new, unusual and dangerous “beasts” have been making their appearance on the desks and in the mailboxes of scientists of all disciplines: publishers who lure academics and researchers into publishing in journals with grand-sounding names, and into presenting their research at conferences abroad. More and more often, academics and researchers are ceremoniously invited to sit on editorial boards, and even to chair them, regardless of their qualifications and scientific output.
Those invitations are sent by the new actors in the publishing arena. A journal editor called them “black sheep” back in 2008 (Black sheep among Open Access Journals and Publishers), but the expression that stuck with the international scientific community to describe unscrupulous publishers and conference organisers was “predators”.
It was Jeffrey Beall – then a librarian and researcher at the University of Colorado Denver – who coined the expression “predatory journals and publishers”, back in 2010. Wanting to warn scientists about the risks, he had started in 2008 to publish a blacklist, freely available online.
At first, one might assume that fraudsters are easy to distinguish from honest enterprises, but research shows that the opposite is true: as many as 5 per cent of academics in developed countries such as Germany, the UK and Italy continue to fall prey to these shady operations, just as Chambers did.
A fast-growing and changing scenario
According to the most recent report by the International Association of Scientific, Technical and Medical Publishers, “there were about 33,100 active scholarly peer-reviewed English-language journals in mid-2018 (plus a further 9,400 non-English-language journals), collectively publishing over 3 million articles a year”. These numbers keep growing, at an accelerating pace: “The number of articles published each year and the number of journals have both grown steadily for over two centuries, by about 3% and 3.5% per year respectively” the report continues. “However, growth has accelerated to 4% per year for articles and over 5% for journals in recent years”.
The number of “predatory” journals – which generally go hand in hand with conferences built on the same lack of scientific scrutiny – has also grown very rapidly: according to estimates, there were 8,000 predatory journals in 2014, draining some 75 million dollars from legitimate operations, essentially through the publication fees that legitimate journals use to replace income lost from subscriptions. As of August 2018, the most recent figure from Cabell’s – a publisher that started offering a blacklist at a hefty price – was 9,179 journals verified as predatory. Cabell’s list was launched after legal threats and pressures of all kinds convinced Jeffrey Beall to take down his own.
In this scenario, researchers looking for an outlet to present their research to the world in a legitimate way have a really hard time.
The hunter, the hunted and the exploiter
According to a recent analysis of the scientific production of all the Italian academics who applied for the National Scientific Qualification required for career advancement (Research Policy, Volume 48, Issue 2, March 2019), more than 2,200 authors, about 5 per cent of the total, published at least one article in a predatory journal on Beall’s list, and about 30 per cent of those did so more than once.
After identifying the articles published in those questionable journals, the researchers invited the authors to answer a few questions anonymously: “Some of them said that they were duped, but some admitted to having been lured by the idea of easily publishing an article, because in the short term that would increase their chances of getting the qualification,” co-author Mauro Sylos-Labini, a political economics researcher at the Università di Pisa, in Italy, told CancerWorld. “They told us that, in retrospect, they regretted having done so”.
The explanation for this apparent nonsense lies in the fact that almost one in four of the journals classified in the study as predatory on the basis of Beall’s list were also present in Scopus, one of the leading international databases of journals, used by many research institutions – including the Italian agency for the evaluation of research – as a proxy for quality.
The trouble is that nobody has yet found a clear way to determine whether those journals were wrongly included in Beall’s blacklist or in Scopus’s whitelist: “We can say that in most cases – I would say 98% of them – it is easy to say whether a journal or a publisher is predatory, but there are borderline cases for which it can be very difficult,” Jeffrey Beall told CancerWorld. “Bad science and questionable publication practices existed long before Open Access, but the advent of Open Access and the Internet offered them more opportunities to thrive”.
Last April, one of Beall’s sworn enemies, the Indian publisher and conference organiser OMICS, was finally brought to book by the US Federal Trade Commission and ordered to pay over 50 million dollars for “unfair and deceptive practices”. Needless to say, the company is still in business.
My name is Fraud, Dr. Anna Fraud
After Science magazine published, in 2013, the first shocking investigation by John Bohannon, which showed that many open access journals claiming to use peer review were more than willing to publish a fake scientific article full of obvious mistakes, just to collect a fee (Who’s Afraid of Peer Review?, Science, 4 Oct 2013), other sting operations demonstrated that the situation is not improving, and might be worsening.
In today’s world, to become editor-in-chief of a scholarly journal or scientific director of a scientific conference, all one needs is an online presence and some pocket money. Of course, chutzpah helps.
The case of the fictitious scientist Anna O. Szust (oszust is the Polish word for a fraud, or fraudster) is exemplary. A group of scientists created a plausible, but totally fake, online resume and applied on her behalf to the editorial boards of 360 journals: “The profile was dismally inadequate for a role as editor. Szust’s ‘work’ had never been indexed in the Web of Science or Scopus databases, nor did she have a single citation in any literature database. Her CV listed no articles in academic journals or any experience as a reviewer, much less an editor. The books and chapters on her CV did not exist and could not be found through any search engine. Even the publishing houses were fake,” they wrote in Nature (Predatory journals recruit fake editor, Nature 543, 481–483, 23 March 2017).
“The aim of our study was to help academics to understand how bogus versus legitimate journals operate, not to trick journals into accepting our editor. For this reason, Szust was not a persistent applicant” they continued. “In many cases, we received a positive response within days of application, and often within hours. Four titles immediately appointed Szust editor-in-chief”.
Apart from egregious cases like this one, there are more and more operations – often based in the developing world – that are not up to the highest standards but are doing their best to learn: “It’s not binary, but rather a complex scenario,” summarises Beall, who has retired after being cleared by his university of the allegations of research misconduct sparked by a complaint from OMICS.
Oncotarget: friend or foe?
Then there are highly unusual cases, such as the Open Access journal Oncotarget, owned by the US-based publisher Impact Journals: “One day an outstanding researcher at my Institute who had published in Oncotarget called me in shock, because he had just realised that one of his research articles had abruptly lost much of its value,” recalls Vanna Pistotti, former librarian of the Mario Negri Institute of Pharmacological Research in Milan, who has since moved to a position as a researcher in oncology. “Oncotarget was in the middle of a storm, and we were unable to understand the reason for that”.
The journal, described as “the most proliferative journal of oncology and cancer research of the past decade” (Scientometrics, December 2018, “The story behind Oncotarget? A bibliometric analysis”), was dropped from Medline – the database of the US National Library of Medicine – and later delisted from the Journal Citation Reports published by Clarivate Analytics, which assigns the highly valued “impact factor” to listed journals.
Oddly enough, a few months earlier Clarivate Analytics had listed the journal as a “Rising Star” in its Essential Science Indicators, effectively recommending that scientists submit their research there.
The lack of scientific consensus on predators
“If you look at the scientific literature on predatory publishers, you find a significant lack of consistency,” Kelly Cobey, a social psychologist and publications officer at the Centre for Journalology, Ottawa Hospital Research Institute, Canada, told CancerWorld. Cobey conducted a scoping review of the literature (What is a predatory journal? A scoping review, F1000Research 2018, 7:1001), identifying many more opinions than empirical research. The analysis included 38 empirical studies, which overall proposed 109 different criteria: “One of the great challenges is finding and validating criteria, and agreeing on the relevance of each of them,” she explains.
Every approach seems to have its limitations: “For sure, the idea of making a profit from this, like Cabell does, is bizarre,” says Cobey, who has still not found a way to access that list, for lack of sufficient funding.
Cabell’s blacklist doesn’t seem to offer more than freely available services do. A recent study by a group of researchers at the Swiss National Science Foundation (Blacklists and Whitelists To Tackle Predatory Publishing: a Cross-Sectional Comparison and Thematic Analysis) compared two blacklists (Cabell’s and the updated Beall’s list) and two whitelists (the Directory of Open Access Journals, DOAJ, and the Committee on Publication Ethics, COPE). The authors concluded that “the lists tend to emphasize easily verifiable criteria, which are easier for journals to meet, whereas dimensions that are more difficult to assess, such as peer review, are less well covered”. Furthermore, disagreements between the lists suggest that some journals are misclassified and others operate “in a gray zone between fraud and legitimacy”.
It takes two to tango
Researchers publishing or presenting in the wrong place are mostly victims – Alan Chambers, for example, and the many who chose Oncotarget before its fall from grace. Still, they may soon be asked to justify their behaviour. “Publishing in a predatory journal, or likewise presenting an abstract at a questionable conference, is not considered misconduct, but it still damages research integrity,” explains Cinzia Caporale, a bioethicist and research integrity expert at the Italian National Research Council in Rome. Caporale’s group has worked on a list of recommendations, to be published soon. “We came up with a main list of 8 items, with 5 additional items worth checking. We know that none of the criteria is decisive, but the overall picture is certainly useful,” she says.
This careful screening requires a remarkable investment: “In a way, the introduction of Open Access removed the subscription cost, but imposed an additional, hidden cost, which is still hard to quantify” concludes Beall.
Warning signs of a predatory conference
Scientific conferences are today’s golden goose for predatory publishers. As with journals, there is no simple way of distinguishing them from legitimate ones. If you have received an invitation, you may ask yourself the following questions (adapted from Academic Positions):
- Is the conference in your field?
- Does the conference appear to be a first? Look for information about the previous meetings.
- Who is organising the conference? Is it a for-profit enterprise? If so, does it have a connection to a legitimate research organisation/society/institute?
- What sort of fees are associated with attending the conference? Beware if organisers try to bundle registration fees with accommodation, meals, and travel.
- Does the conference claim that abstracts and papers will be peer-reviewed?
- Does the conference advertise a fast review time or high acceptance rate?
- Does the conference guarantee your work will be published in the conference proceedings? Have you ever read any papers from these conference proceedings before?
- Does the theme seem overly broad? Do the organisers seem to be trying to combine multiple fields into one event?
- Did the email invitation come from a free email provider such as Gmail?
- Is the conference organiser responsible for other conferences this year on the same topic?
If one or more of these questions raises a red flag, caution is advised.
Ask around, and use Google.