The Journalology newsletter helps editors and publishing professionals keep up to date with scholarly publishing, and guides them on how to build influential scholarly journals.
Up until now Journalology has primarily been a newsletter, but I’ve always wanted it to be much more than that. This week I’ve launched a website — hosted on journalology.com — that’s the new digital home for the work I do for the scholarly publishing community.
For the next two weeks I’ll be sponsoring my own newsletter. You won’t see messages from other companies, but you will see short summaries of the services that I offer, which fall into three categories: speaking, coaching, and consulting. Click on the icons to find out more.
My friend and former colleague Nicky Borowiec designed the Journalology logo, colour scheme and icons. Nicky was the brains behind the Nature Portfolio branding strategy, so I was in safe hands.
Since the start of this year I’ve been working independently of Clarke & Esposito, so journalology.com will serve as a shop window to showcase the services that I offer. Over the next few weeks this newsletter should give you a better idea of the ways that I can help you to create influential and impactful academic journals.
This is just the start of my plans for Journalology. There’s a lot more to come. For now, let’s move on to this week’s news. But not until we’ve heard the first message from this week’s sponsor, Journalology!
Most business coaches work across multiple industries and are unable to provide useful insight into scholarly publishing. The Journalology coaching programme is different. Get support from an expert who can help you to get better at your craft.
- The JCR has been enhanced to provide an easier and more complete user experience. It includes the integration of journals from the Emerging Sources Citation Index (ESCI) in the new unified category rankings.
- Coverage of more than 21,800 journals – including ~5,800 journals which publish all their content via open access.
- Scholarly journals from 113 countries, across 254 categories, are recognized and receive a JIF. This includes 14,090 science journals, 7,321 social science journals and 3,304 arts & humanities journals.
- 544 journals receive a Journal Impact Factor for the first time.
Clarivate (press release)
JB: Yes, it’s that time of year again. The time when publishers pretend that they don’t care about impact factors and remind everyone of their flaws, while also quietly promoting their new impact factors.
Nandita Quaderi, the Senior Vice President & Editor-in-Chief, Web of Science, wrote an accompanying blog post: Quality and trust over impact. Here’s an excerpt:
In recent years, we have introduced a series of policy changes that reflect our position that the most important differentiator between journals is whether the content they publish can be trusted, rather than whether they are highly cited.
This includes extending the JIF from the most impactful journals in the sciences and social sciences to all journals that have passed our rigorous quality criteria and are indexed in the Web of Science Core Collection.
In doing so, we have made the JIF a marker of quality, not just of scholarly impact. This emphasis on inclusivity and trustworthiness – instead of impact – helps level the global playing field and protects the integrity of the scholarly record.
This year, eLife is going to be an interesting one to watch because it’s the poster child, in many ways, for the PRC (publish, review, curate) model. Will it be delisted by Clarivate? I’ve looked at the small print on the website and, to me, it looks as though it might, but we’ll see what happens. That’s because eLife is effectively publishing papers that have failed peer review: the authors can choose whether to publish a paper or walk away with it after the peer review. Even if the reviewers come back and say it’s rubbish, the authors can still choose to publish.
I was obviously interested in knowing how Nature performed against Science and Cell (still ahead), and whether The Lancet maintained the clinical top spot above NEJM (they did). However, the first thing I looked for was whether eLife still had an impact factor (they do).
I like the experiment that eLife is doing. It’s good to try out new publishing models. After all, the current publishing system has many, many flaws.
My biggest criticism of the model, though, is that it puts the needs of authors ahead of readers. eLife uses an elegant assessment process that allows its editors to publicly explain, using a defined framework, the strengths and limitations of the published study. So, if a paper is “inadequate: methods, data and analyses do not support the primary claims” the editors publicly say as much. It’s clear and transparent — I wish more journals adopted something similar.
However, there’s no filter on the website to allow readers to avoid reading the bad papers or to identify the ones that the editors have rated highly.
So eLife is publishing a decent proportion of papers where the strength of evidence is inadequate or incomplete.
I’ve searched for information about indexing policies for eLife, but have not come up with anything concrete. Are all papers indexed, or only some of them? If the latter, which papers are excluded from the likes of PubMed, Scopus, Web of Science, Dimensions, etc.? What criteria are used for exclusion?
This is important because many researchers cite papers after only reading the abstract. Even if they do click on the full text of the paper, they may not see or read the editors’ assessment. Problematic papers shouldn’t be indexed or, if they are, there needs to be some kind of flag against them.
Our long-standing policy of cover-to-cover coverage applies to journals that only publish research articles that have been validated by peer review.
The cover-to-cover policy does not apply to journals/platforms with a policy that states that they will publish research articles that have received peer review comments that support the validity of the published research alongside:
- research articles that have not been peer-reviewed
- and/or research articles where peer review reports indicate shortcomings in methodology or other reasons to doubt the validity of the published results or conclusions
In these cases, evaluation and coverage will be limited to the subset of published articles where the content has been validated by peer review.
To be considered for inclusion – or continued coverage – in the Web of Science, publishers must be able to provide a feed of content that is limited to articles that have been validated by peer review.
[Bold text is Clarivate’s emphasis]
Presumably the fact that eLife still has an impact factor means that eLife has provided a filtered feed to Web of Science, but it’s not clear to me which articles have been excluded. Unfortunately eLife readers don’t have access to the same filtering functionality. Surely there needs to be some clarity here?
This is important because of the precedent it sets; it’s generally best to avoid unintended consequences if you can.
Let’s consider the hypothetical case of a commercial publisher that currently accepts 60% of the papers it publishes. If it moved to a publish, review, curate (PRC) model it could generate revenue from closer to 100% of the papers it publishes; papers that otherwise would have been rejected can now be monetised. Even better, the publisher can generate good publicity by saying that it’s embracing open science principles by moving to a PRC editorial model, even though the primary goal is to generate more cash.
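The arithmetic behind that hypothetical is easy to sketch. Here’s a toy calculation (all figures invented for illustration, not drawn from any publisher’s accounts) comparing revenue under a selective model with a PRC model that monetises nearly every reviewed submission:

```python
# Hypothetical arithmetic (all numbers invented): revenue for a selective
# APC journal vs the same journal under a publish-review-curate (PRC)
# model that monetises almost every submission sent for review.

submissions = 10_000
apc = 2_000  # assumed charge per published paper (USD)

selective_revenue = int(submissions * 0.60) * apc   # 60% acceptance rate
prc_revenue = int(submissions * 0.95) * apc         # ~all reviewed papers published

print(f"Selective model: ${selective_revenue:,}")   # $12,000,000
print(f"PRC model:       ${prc_revenue:,}")         # $19,000,000
```

Even with these made-up numbers, the incentive is clear: the papers that would otherwise have been rejected become a new revenue stream.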
Of course, the PRC model is a lot less appealing if the journal loses its impact factor. Alternatively, it would be harder to monetise the papers that have fundamental flaws if those papers were not formally indexed or were flagged in the index as being potentially problematic.
To be clear, I’m not against the PRC model in principle, but in my opinion there hasn’t been enough discussion of what “curate” means in practice. For Web of Science to live up to “Quality and trust over impact”, we need much better transparency about which papers from PRC journals are included in its database.
If representatives from eLife or Web of Science would like to provide a response, I'd be happy to include it in the next newsletter. I’m not trying to attack either party here, but to raise an issue that needs more debate and transparency.
We are thrilled to announce that 72% of our ranked MDPI journals (171 of 237) are above average, in Q1 or Q2. Twenty-nine of our journals received their first Impact Factor this year, accounting for more than 5% of the journals accepted into the Web of Science last year.
MDPI (announcement)
JB: MDPI were quick off the blocks with this announcement, which was published within hours of the new impact factors coming out. MDPI also published an associated blog post: Why Journal Metrics are Important and How to Use Them.
Two big scholarly publishers of open-access research—meaning free to read outside a paywall—have been accused of excessively citing papers previously published in their own journals.
A study published in May before peer review on the preprint server SocArXiv suggests that high self-citation rates may have inflated the journal impact factor (JIF) of titles published by MDPI and Frontiers in recent years.
The study authors investigated the citation patterns of 8,360 journals published by 20 of the largest for-profit academic publishers from 1997 to 2021. They found that papers published in Frontiers and MDPI titles contain more self-citations than other publishers’ journals do.
C&EN (Dalmeet Singh Chawla)
JB: I haven’t read the preprint yet and so I make no comment on whether these claims are reasonable. Matt Hodgkinson, who wrote the original Journalology blog, is quoted in the news story as saying that the study is “well conceived and sophisticated.” A Frontiers spokesperson says the study uses “unnecessarily convoluted methodology”.
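For readers who want some intuition for how self-citation can move the metric, here’s a toy calculation (all numbers invented). The JIF for a census year is the number of citations received that year to items published in the two prior years, divided by the number of citable items published in those two years:

```python
# Toy illustration (all numbers invented): how journal self-citations
# can inflate a Journal Impact Factor (JIF). The JIF is citations
# received in year Y to items published in Y-1 and Y-2, divided by the
# number of citable items published in Y-1 and Y-2.

def jif(total_citations: int, citable_items: int) -> float:
    """Journal Impact Factor for a given census year."""
    return total_citations / citable_items

citable_items = 1000        # papers published in the two prior years
external_citations = 2500   # citations from other journals
self_citations = 1500       # citations from the journal's own papers

with_self = jif(external_citations + self_citations, citable_items)
without_self = jif(external_citations, citable_items)

print(f"JIF including self-citations: {with_self:.1f}")    # 4.0
print(f"JIF excluding self-citations: {without_self:.1f}") # 2.5
```

In this made-up case, self-citations account for the difference between a JIF of 2.5 and a JIF of 4.0, which is why citation-pattern studies like the one above attract so much attention.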
The agreement is the first of its kind between ACS Publications and an academic consortium. In addition to granting participating institutions within the Couperin consortium full reading access to ACS journals, upon commencement the agreement will also provide authors with support for zero-embargo green OA (ZEGOA). This means that eligible corresponding authors at these institutions will be able to immediately self-archive their accepted manuscript in the repository of their choice for their articles published within ACS’ hybrid journals.
American Chemical Society (announcement)
JB: When I saw the headline, I assumed that this was the latest in a series of Couperin transformative agreements; Elsevier and Springer Nature recently announced deals, which I covered in previous issues of this newsletter.
But what’s different here is ZEGOA (A new acronym to learn. Yay!). A blog post in ACS Axial provides more information:
More than 1,500 institutions are covered by one of ACS Publications’ current read and publish agreements, which combine subscription access with gold open access publishing, but this arrangement with Couperin is different. ACS’ agreement in France is the first to incorporate zero-embargo green open access with an institutional subscription, which we’ve been referring to as “read and green.” As far as we can tell, this is also a first in publishing to truly combine subscription access with zero-embargo green OA publishing.
You may remember that last September ACS announced that it would start charging for green open access. The announcement said:
To ensure a sustainable model of delivering services from submission to final editorial decision, ACS Publications is introducing an article development charge (ADC) as part of this new zero-embargo green open access option. The ADC covers the cost of ACS’ publishing services through the final editorial decision. This includes organizing, developing and maintaining the high-quality, scholarly peer review process and multiple other publishing services provided by a vast global network of editors and reviewers.
cOAlition S-funded academics can’t use their grants to pay for green OA, as this is prohibited, but they can fulfil the funder mandates by choosing this option and using another source of money to pay for it (i.e. the funder gets to have its cake and eat it).
Now, researchers working at institutions that are part of the Couperin consortium will be able to deposit their author-accepted manuscripts without charge. Presumably, Couperin are compensating the ACS for this service financially.
Publishers generally dislike green OA because it could result in subscription cancellations further down the line — indeed, this was the whole point of Stevan Harnad’s 1994 subversive proposal — leaving the publishers with lots of cost and no revenue.
The ACS took a logical (if controversial) approach by charging for the costs they incur up until the point of manuscript acceptance. The latest agreement with Couperin is financially possible because of this policy.
It’s worth remembering that in October 2022 the Royal Society of Chemistry, the ACS’ major competitor for papers, said that it “aims to make all fully RSC-owned journals Open Access within five years”.
IOP Publishing (IOPP) has published the first original research papers in its flagship journal, Reports on Progress in Physics, adding to its 90-year legacy as one of the world’s most authoritative sources of physics review content.
Reports on Progress in Physics sits at the pinnacle of IOPP’s extensive physics portfolio of journals and covers all areas of physics and related interdisciplinary areas extending across condensed matter, atomic and molecular physics, quantum science, computational physics, chemical physics, biophysics, photonics, nuclear and particle physics and astrophysics.
The journal offers an open access (OA) option, with free OA publishing for researchers from low and lower middle-income countries. In addition, Reports on Progress in Physics is included in IOPP’s expanding programme of Transformative Agreements that now cover more than 1,000 universities and research institutions around the world.
IOP Publishing (press release)
JB: Expanding the scope of a review journal, with a good impact factor, to start publishing primary research papers is a good strategy. So is hiring an experienced Nature journals editor to help run it.
Overall, OA prices are increasing below inflation. Our market sizing suggests a real-terms growth in OA market value, so it seems the value of the OA market is driven by demand.
As authors use more and more open access, it’s not surprising that the total spend on it is increasing. However, the data suggest that they are getting (modestly) increasing value for money as they do so, as price rises are falling in real terms. (And to be fair, the same applies for the scholarly journals market overall, which is also seeing volume grow faster than value.)
The prices of fully OA journals are growing faster than inflation on average, but, again, not as fast as demand is growing. Remember, too, that fully OA prices are cheaper than those for hybrid, so they are growing from a lower base.
Delta Think (Dan Pollock and Heather Staines)
JB: This is useful and important work. It’s easy to see the headline APC price increases and assume that publishers are price gouging. This analysis suggests that, on aggregate, this isn’t happening.
We are pleased to announce that Elsevier has been ranked No. 1 globally, winning the prestigious 2024 Best Leadership Teams award!
This achievement reflects the commitment of our Executive Leadership Team and other leaders within our organization for their dedication to exemplary leadership. They consistently strive to foster an environment where diversity, innovation, collaboration, and excellence can flourish.
At Elsevier, we'll continue to build a future where leadership and culture are integral.
Elsevier (announcement)
JB: For many years Elsevier had a bad reputation, but that seems to have changed under the new leadership. According to this survey, Elsevier staff enjoy working there. By contrast...
Over 200 editors, journalists and production staff working across the Nature family of journals are members of the National Union of Journalists (NUJ) and are expected to participate in today’s strike. A further 180 members of staff affected by the pay negotiations are also eligible to take part in the industrial action.
The Nature editors are well connected and have been using their networks to garner external academic support for their industrial action. They wrote an open letter to the Springer Nature CEO, which they asked researchers to sign. As I write this, over 800 people have signed the letter, including 11 Nobel Prize winners; many have left messages of support (the signatories and their messages can be read here). Garnering support from Nobel Prize winners is one thing, but they also have Stephen Fry as an ally:
Stephen Fry co-authored an article in Nature Reviews Urology in 2019: Both sides of the scalpel: the patient and the surgeon view. It was likely one of the most lightly edited articles the Nature journals have published. After all, how can you improve the work of a genius?
The editors are striking again tomorrow (Monday). For everyone’s sake, I hope this is resolved soon.
You may have noticed that we recently gave our brand a makeover! We are now officially known as BMJ Group. Along with the name change, we have introduced updates to our brand identity, including a new BMJ Group logo and website.
The rebrand reflects our evolving organisation and ambitious strategy for our core lines of business: Publishing and Events, Careers and Learning, and Digital health. It also helps to create a clearer distinction between our organisation and our flagship journal, The BMJ, and strengthens BMJ Group's identity as our organisation’s umbrella brand.
JB: This is a sensible move. When the flagship journal is also the brand, there needs to be clear brand architecture in place. The flagship journal, The BMJ, is hosted on bmj.com and the wider group has a different home on bmjgroup.com.
The Nature journals adopted a different approach. The homepage for nature.com used to be a hodge-podge of links to different Nature journals. A few years ago the homepage started only showcasing the flagship’s content, with the sister journals hosted deeper within the domain name.
In a branded house strategy, a company uses a single, strong brand identity across all its products and services. The brand is the primary focus, and all offerings are closely tied to the same brand name and image. This approach aims to create a consistent and unified brand experience for customers. Think Apple, FedEx, GE.
In a house of brands strategy, a company operates multiple distinct brands, each with its own unique identity and positioning. Each brand operates independently, targeting different market segments or offering diverse product lines. The emphasis is on creating strong individual brands, and there might be little or no overt connection between them at the corporate level. Unilever (Dove, Flora, Magnum) and P&G (Pampers, Old Spice, Gillette) are great models for this approach.
Deciding on a name for a publishing house isn’t easy either. Nature Publishing Group became Nature Research and then morphed into Nature Portfolio within the space of a few years. Including “publishing” is problematic when you also have businesses under a brand that have very little to do with publishing (for example, a jobs board). Most people still call it Nature Publishing Group, though, and assume that nature.com hosts Nature Scientific Reports.
The journal will be 175 years old next year. During that time it has had four names: the Provincial Medical and Surgical Journal (1840-52), the Association Medical Journal (1853-56), the British Medical Journal (1857-1988), and BMJ (1988-2014). Now it gets a fifth, with the inclusion in its name of the definite article. The journal formerly known as BMJ will now be formally known as The BMJ.
I still hear scholarly publishing professionals and academics calling it The British Medical Journal. Customers ultimately decide what they want to call your journals and your brand; old habits die hard.
The Problematic Paper Screener, an academic project led by Guillaume Cabanac and collaborators, combines automatic machine detection and human evaluation to identify issues in publications, including the presence of tortured phrases.
We’re excited to announce the integration of articles flagged by the Problematic Paper Screener into the Signals Graph as a new signal. This includes over 12,000 problematic publications that have not been retracted. These problematic publications are cited more than 160,000 times by articles. This is now visible through Signals.
Signals (announcement)
JB: In other Signals news, Sven Fund announced on LinkedIn this week that he’s joining their upcoming funding round as an investor. My immediate thoughts turned to nominative determinism, but then I remembered that Paul Peters, who made his fortune by growing and then selling Hindawi, is also an investor.
Recent efforts to roll back access to abortion have relied in part on certain scientific studies that have been disputed, disavowed or subsequently retracted.
These studies, which raise questions about abortion safety, were published in peer-reviewed journals but were later faulted for having irreproducible results, methodological flaws or misleading presentations of data, among other issues.
The authors of the studies contend that efforts to discredit their work are politically motivated.
Professional associations and medical groups including the American College of Obstetricians and Gynecologists, the American Medical Association and the American Psychological Association, say the overwhelming evidence indicates abortions are safe.
WSJ (Nidhi Subbaraman)
JB: This story is a good case study of how science can never be completely disassociated from politics.
GetFTR has extended its Retraction and Errata service to all integrator partner organizations. This rollout enables partners, such as discovery resources, Scholarly Collaboration Networks (SCNs) and publishers that have integrated GetFTR on their reference lists, to display a notification button at the point of discovery, indicating if an article has been updated or retracted.
This crucial information, sourced from Crossref and Retraction Watch, will be available for all articles where retraction or errata information is found. This now includes content from publishers who do not participate in GetFTR, and it will be displayed even if the researcher does not have access to the full content.
Nature has retracted a 2002 paper from the lab of Catherine Verfaillie purporting to show a type of adult stem cell could, under certain circumstances, “contribute to most, if not all, somatic cell types.”
The retracted article, “Pluripotency of mesenchymal stem cells derived from adult marrow,” has been controversial since its publication. Still, it has been cited nearly 4,500 times, according to Clarivate’s Web of Science – making it by far the most-cited retracted paper ever.
The scholarly publishing environment is changing fast. Even the most seasoned publisher can benefit from independent advice. I can help you to build a successful portfolio strategy and thrive in an open access world.
Second, this situation raises an issue that has not, to my knowledge, yet been seriously addressed in discussions of mandatory OA policies: even if (as many argue) the cost to the system as a whole of universal OA publishing would be no greater than the cost of toll-access publishing, the cost of implementing and managing mandatory OA policies – in other words, the administrative cost of removing choice from authors – is an entirely new one, conjured by the requirements of ideological orthodoxy. On the other hand, to the degree that authors’ agency is preserved, these costs don’t arise because no enforcement is required.
And all of this, of course, leads logically to a truly simple (and, I think, admirably effective) way of actually avoiding these costs altogether: leave authors free to decide for themselves where and how they will publish their research results. Effective coercion is expensive. No compliance requirement means no need for enforcement – and therefore real cost savings.
The Scholarly Kitchen (Rick Anderson)
JB: This relates to the article I covered in last week’s issue: Open excess: remove open access burden from REF. We need more opinion pieces that are brave enough to challenge the orthodoxy that’s been created by a small number of influential voices.
One of the biggest changes that we’ve seen over the past decade is the answer to the question of “who owns research?”. In the past, many researchers believed that they had full intellectual ownership of their academic output.
But of course institutions and funders argue, not unreasonably, that since they pay the bills they get to call the shots. If researchers want to receive funding, then there are rules they need to follow, including open-access mandates.
Open access has created considerable administrative burdens for all parties: funders, institutions, researchers, and publishers. That’s unsurprising, since the number of financial transactions has increased by around two orders of magnitude. Under an institutional subscription model, publishers handle a relatively small number (in the thousands) of high-value transactions. Under an author-pays APC model, that figure rises to the hundreds of thousands for the largest publishers. Transformative agreements reduce that bureaucracy to some degree, but are themselves complex to negotiate.
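A back-of-envelope sketch makes the point (all figures assumed for illustration, not taken from any publisher’s reporting): roughly similar total value, but vastly more transactions to administer.

```python
# Back-of-envelope sketch (all figures assumed): the shift from a few
# thousand high-value subscription deals to hundreds of thousands of
# per-article APC payments, at roughly similar total market value.

subscription_deals = 3_000   # institutional agreements per year
avg_deal_value = 100_000     # assumed average value per agreement (USD)

apc_articles = 300_000       # author-paid articles per year
avg_apc = 1_000              # assumed average APC (USD)

sub_revenue = subscription_deals * avg_deal_value
apc_revenue = apc_articles * avg_apc

print(f"Subscription: {subscription_deals:,} transactions, ${sub_revenue:,}")
print(f"APC model:    {apc_articles:,} transactions, ${apc_revenue:,}")
print(f"Transaction multiplier: {apc_articles // subscription_deals}x")
```

On these assumed numbers the revenue is identical, but the APC model involves a hundred times as many transactions, each of which has to be invoiced, checked for eligibility, and reconciled.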
Rick Anderson finishes his Scholarly Kitchen article with the following:
I suggest that what needs to be rethought is not the locus of enforcement burden, but the fundamental idea of a national (or global) “obligation to OA.” Maybe, just maybe, OA is not the One True Religion of scholarly communication, but rather a model of publishing that – like any other model, including toll access – presents a mix of costs and benefits, of upsides and downsides, and should therefore be treated as one in a panoply of useful but imperfect publishing models, all of which can and should be permitted to flourish in a diverse ecosystem of scholarship.
My biggest criticism of the open access movement is that it’s largely been a top-down initiative, not a grassroots movement. Most academics want to be able to read journal articles and book chapters without the hassle of navigating paywalls, but don’t really understand the difference between open and free. There wouldn’t need to be enforcement mechanisms if researchers voluntarily chose open access because they saw clear benefits for themselves, and for society, in doing so.
Publishers respond to demand with supply. Researchers (like all human beings) respond better to a carrot than a stick.
A pertinent question is what COPE recommendations are being followed: there are a whole range of flowcharts provided by COPE for different circumstances. The most relevant in this case would appear to be this one on "Concerns about risk in published data".
Early on in the flowchart, the decision tree asks "Based on the initial assessment, should the dataset be removed or restricted during the investigation to mitigate potential risk?" If the answer is YES, then the appropriate action is "Remove public access to the dataset while following up on the concerns."
Given that the article is still freely available on the web, we have to ask ourselves, is Elsevier being negligent here? Have they decided that there is no risk, despite the evidence that we and others have provided that public health is endangered by publicising misleading data on an ineffective drug? Or do they agree that there is risk, but deem it inconvenient for them to take action?
Senior managers need to re-energise and inspire their teams in hybrid-working environments. You can hire me to give a talk at one of your events, for example at an away day or at an editorial board meeting.
Create an image of a mascot for Journalology, the newsletter written by James Butcher. The image should include a mortar board, a cake and his bald head. Make sure there are academic journals in the picture, please.
This is what it came up with:
I think I may be entering the AI trough of disillusionment. My beak isn’t that big!
Until next time,
James
P.S. Please let me know if you experience problems using the Journalology website. I built it myself and it’s quite possible that I’ve made mistakes along the way. All feedback is gratefully received.