From 2009 until early this year, University of Colorado librarian Jeffrey Beall shed unprecedented light on questionable publishing practices with his “blacklist” of hundreds of publishers he considered predatory. The now-defunct list included journals that he deemed unethical for a number of reasons, including their excessive article-processing charges, atypical copyright policies, and shoddy—or nonexistent—peer review. Although Beall took down his list in January, a few months later the academic publishing consultancy Cabell’s International announced its own blacklist, which, like Beall’s, identifies journals that the Beaumont, Texas–based company considers questionable. (Unlike Beall’s list, Cabell’s blacklist is only available for a fee.)
But this new resource fails to address a lingering criticism of such blacklists. “One of the objections that people sometimes had to Beall’s list was, ‘We don’t need to identify and call out the scammers; we just need to identify and certify legitimate publishers,’” says Rick Anderson of the Society for Scholarly Publishing (SSP).
To this end, Anderson and others advocate for the use of “whitelists” in addition to, or in place of, blacklists. Even before Beall’s list came online, for example, the Directory of Open Access Journals (DOAJ) provided researchers with a freely available list of open-access publishers that the organization had vetted in a process outlined on its website; the directory is widely considered the most comprehensive compendium of reputable open-access journals. “Blacklists are very, very difficult to provide and require a lot of curation—you have to be careful,” says Lars Bjørnshauge, managing director of DOAJ. For one, inadvertently blacklisting a legitimate journal could unnecessarily harm its publisher’s reputation. And, at least in theory, it’s more feasible to objectively evaluate journals based on what they do rather than on what they don’t.
Of the 600 or so journals that submit applications for DOAJ evaluations each month, most do not make the organization’s whitelist. “When we list journals, we are fairly confident that [they have] respectable procedures,” says Bjørnshauge, adding that, following reevaluations, his organization removes 600 to 800 titles from its list each year, of which around 350 are life science journals.
Following this approach, urological surgeon Henry Woo of the University of Sydney and colleagues launched a list of open-access and subscription-based urology journals in March. This Urology Green List is meant to be “a positive way to outwit predatory publishers and with minimal resource allocation,” Woo tells The Scientist in an email.
The group originally considered developing a field-specific blacklist, but found the task intractable, given the proliferation of publications and the time constraints of a volunteer-run operation. So the team shifted its focus, concentrating on existing knowledge of reputable urology journals. A whitelist of approved journals, identified by an advisory board of established urologists, would be easier to maintain, the team concluded. Additionally, whitelists appear less likely to attract legal threats than their finger-pointing counterparts, Woo noted in a letter to Nature this May.
The University of California, Irvine’s Mark Langdorf wasn’t intimidated by the challenge, however. Teaming up with Bhakti Hansoti, an assistant professor of emergency medicine at Johns Hopkins, and UC Irvine librarian Linda Murphy, he generated both a blacklist and a whitelist in the field of emergency medicine, published last September in the Western Journal of Emergency Medicine. Several months later, the team did a second search to update the lists, adding titles to both and moving one journal, Clinical and Experimental Emergency Medicine, from the blacklist to the whitelist. “[It] was brand-new and yet legitimate—it just had not yet been indexed anyplace,” Langdorf tells The Scientist, adding that he and his colleagues have no plans to update the lists again.
“Ideally, these are the real experts,” Bjørnshauge says of the urology and emergency medicine researchers curating field-specific lists, “whereas we [at DOAJ] are the more general experts.” DOAJ focuses primarily on journal policies and practices, he notes; subject-area specialists are better equipped to judge the research that appears in a given publication. “Also, because you’re dealing with a much smaller number of journals, the sheer workload in most disciplines is going to be quite a bit more manageable,” adds SSP’s Anderson.
Langdorf encourages other researchers to investigate the publications within their fields. “It doesn’t need to be as exhaustive as our search,” he says, particularly when generating a specialty-specific whitelist. “If the field is relatively narrow, it shouldn’t take long to develop a list of legitimate journals.”
Of course, no single list, “black” or “white,” will ever be complete. “New publishers are coming all the time,” says Anderson. “So that’s one challenge: simply keeping up with the new actors that are entering the ecosystem.”