Tag Archives: Yves Gingras

Scientific fraud: widespread and organized according to Northwestern University research + math fraud scandal

I have three stories about fraud in science and mathematics: research into organized fraud, a fraud scandal in mathematics, and a look at how fraud slows down legitimate research.

Northwestern University and widespread scientific fraud

An August 4, 2025 article by Cathleen O’Grady for science.org describes a study into global networks instigating scientific fraud, Note: A link has been removed,

For years, sleuths who study scientific fraud have been sounding the alarm about the sheer size and sophistication of the industry that churns out fake publications. Now, an extensive investigation finds evidence of a range of bad actors profiting from fraud. The study, based on an analysis of thousands of publications and their authors and editors, shows paper mills are just part of a complex, interconnected system that includes publishers, journals, and brokers.

The paper, published today in the Proceedings of the National Academy of Sciences, paints an alarming picture. Northwestern University metascientist Reese Richardson and his colleagues identify networks of editors and authors colluding to publish shoddy or fraudulent papers, report that large organizations are placing batches of fake papers in journals, suggest brokers may serve as intermediaries between paper mills and infiltrated journals, and find that the number of fake papers—though still relatively small—seems to be increasing at a rate far greater than the scientific literature generally.

The paper shows that misconduct “has become an industry,” says Anna Abalkina of the Free University of Berlin, who studies corruption in science and was not involved with the research. Richardson and colleagues hope their sweeping case will attract attention and spur change.

O’Grady’s August 4, 2025 article provides some fascinating detail, Note: Links have been removed,

They began their analysis by pinpointing corrupt editors. They focused their investigation on PLOS ONE, because the megajournal allows easy access to bulk metadata and publishes the names of the editors who have handled the thousands of papers it publishes each year, making it possible to detect anomalies without behind-the-scenes information. The researchers identified all the papers from the journal that had been retracted or received comments on PubPeer—a website that allows researchers to critique published work—and then identified each paper’s editors.

All told, 33 editors stood out as more frequently handling work that was later retracted or criticized than would be expected by chance. “Some of these were immense outliers,” Richardson says. For instance, of the 79 papers that one editor had handled at PLOS ONE, 49 have been retracted. Flagged editors handled 1.3% of papers published in the journal by 2024, but nearly one-third of all retracted papers.
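The approach described above, flagging editors who handle retracted or criticized papers "more frequently than would be expected by chance," can be illustrated with a simple statistical sketch. The code below is my own toy reconstruction, not the study's actual method or data: it treats each editor's retraction count as a draw from a binomial distribution at the journal-wide base rate and flags editors whose counts are wildly improbable. The editor names, base rate, and threshold are all hypothetical.

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing k or more
    retractions among n handled papers if retractions occur at base rate p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def flag_outlier_editors(editors, base_rate, alpha=1e-6):
    """editors maps editor name -> (papers_handled, papers_retracted).
    Returns names whose retraction counts are far beyond what the
    journal-wide base rate could plausibly produce (a crude stand-in
    for the paper's actual anomaly detection)."""
    flagged = []
    for name, (handled, retracted) in editors.items():
        if binom_sf(retracted, handled, base_rate) < alpha:
            flagged.append(name)
    return flagged

# Toy numbers echoing the article: one editor with 49 of 79 handled papers
# retracted, against a journal-wide retraction rate well under 1%.
editors = {
    "editor_A": (79, 49),   # extreme outlier
    "editor_B": (120, 1),   # unremarkable
}
print(flag_outlier_editors(editors, base_rate=0.005))  # → ['editor_A']
```

The real study had to work from bulk metadata (which is why PLOS ONE, with its published editor names, was chosen), but the statistical intuition is the same: under any plausible base rate, 49 retractions out of 79 papers is essentially impossible by chance.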

The team also spotted that these editors worked on certain authors’ papers at a suspiciously high rate. These authors were often editors at PLOS [Public Library of Science] ONE themselves, and they often handled each other’s papers. It’s possible that some editors are being paid bribes, Richardson says, but “also possible that these are informal arrangements that are being made among colleagues.” The researchers detected similarly questionable editor behavior in 10 journals published by Hindawi, an open-access publisher that was shuttered because of rampant paper mill activity after Wiley acquired it. A spokesperson for Wiley told Science the publisher has made “significant investments to address research integrity issues.”

Renee Hoch, head of publication ethics at PLOS, said in an email to Science that the publisher has long been aware of networks like these, and will assess whether any of the editors implicated are still on the journal’s editorial board, opening investigations if they are. She emphasizes that the study focused on PLOS because of its readily accessible data: “Paper mills are truly an industry-wide problem.”

Researchers working on paper mills have long assumed that editors and authors have been colluding. The new findings are “killer evidence” for these suspicions, says Domingo Docampo, a bibliometrician at the University of Vigo (Spain). He adds that although the findings only show collusion at a limited number of journals, others are probably affected. Just last week, Retraction Watch reported that the publisher Frontiers had begun to retract 122 papers after discovering a network of editors and authors “who conducted peer review with undisclosed conflicts of interest,” according to a company statement. The network of 35 individuals has also published more than 4000 papers in journals from seven other publishers, the company said, which require further scrutiny. A Frontiers spokesperson said they planned to share information with the other affected publishers.

Richardson and his colleagues found that the problem goes far beyond networks of unscrupulous editors and authors scratching each other’s backs. They identified what appear to be coordinated efforts to arrange the publication of batches of dubious papers in multiple journals.

For the curious, there’s more in O’Grady’s August 4, 2025 article. An August 4, 2025 Northwestern University news release by Amanda Morris (received via email and available on EurekAlert) focuses on other aspects of the research,

From fabricated research to paid authorships and citations, organized scientific fraud is on the rise, according to a new Northwestern University study.

By combining large-scale data analysis of scientific literature with case studies, the researchers led a deep investigation into scientific fraud. Although concerns around scientific misconduct typically focus on lone individuals, the Northwestern study instead uncovered sophisticated global networks of individuals and entities, which systematically work together to undermine the integrity of academic publishing.

The problem is so widespread that the publication of fraudulent science is outpacing the growth rate of legitimate scientific publications. The authors argue these findings should serve as a wake-up call to the scientific community, which needs to act before the public loses confidence in the scientific process.

The study will be published during the week of August 4 in the Proceedings of the National Academy of Sciences.

“Science must police itself better in order to preserve its integrity,” said Northwestern’s Luís A. N. Amaral, the study’s senior author. “If we do not create awareness around this problem, worse and worse behavior will become normalized. At some point, it will be too late, and scientific literature will become completely poisoned. Some people worry that talking about this issue is attacking science. But I strongly believe we are defending science from bad actors. We need to be aware of the seriousness of this problem and take measures to address it.”

An expert in complex social systems, Amaral is the Erastus Otis Haven Professor and professor of engineering sciences and applied mathematics at Northwestern’s McCormick School of Engineering. Reese Richardson, a postdoctoral fellow in Amaral’s laboratory, is the paper’s first author.

Extensive analysis

When people think about scientific fraud, they might remember news reports of retracted papers, falsified data or plagiarism. These reports typically center around the isolated actions of one individual, who takes shortcuts to get ahead in an increasingly competitive industry. But Amaral and his team uncovered a widespread underground network operating within the shadows and outside of the public’s awareness.

“These networks are essentially criminal organizations, acting together to fake the process of science,” Amaral said. “Millions of dollars are involved in these processes.”

To conduct the study, the researchers analyzed extensive datasets of retracted publications, editorial records and instances of image duplication. Most of the data came from major aggregators of scientific literature, including Web of Science (WoS), Elsevier’s Scopus, National Library of Medicine’s PubMed/MEDLINE and OpenAlex, which includes data from Microsoft Academic Graph, Crossref, ORCID, Unpaywall and other institutional repositories.

Richardson and his colleagues also collected lists of de-indexed journals, which are scholarly journals that have been removed from databases for failing to meet certain quality or ethical standards. The researchers also included data on retracted articles from Retraction Watch, article comments from PubPeer and metadata — such as editor names, submission dates and acceptance dates — from articles published in specific journals.

Buying a reputation

After analyzing the data, the team uncovered coordinated efforts involving “paper mills,” brokers and infiltrated journals. Functioning much like factories, paper mills churn out large numbers of manuscripts, which they then sell to academics who want to quickly publish new work. These manuscripts are mostly low quality — featuring fabricated data, manipulated or even stolen images, plagiarized content and sometimes nonsensical or physically impossible claims.

“More and more scientists are being caught up in paper mills,” Amaral said. “Not only can they buy papers, but they can buy citations. Then, they can appear like well-reputed scientists when they have barely conducted their own research at all.”

“Paper mills operate by a variety of different models,” Richardson added. “So, we have only just been able to scratch the surface of how they operate. But they sell basically anything that can be used to launder a reputation. They often sell authorship slots for hundreds or even thousands of dollars. A person might pay more money for the first author position or less money for a fourth author position. People also can pay to get papers they have written automatically accepted in a journal through a sham peer-review process.”

To identify more articles originating from paper mills, the Amaral group launched a parallel project that automatically scans published materials science and engineering papers. The team specifically looked for authors who misidentified instruments they used in their research. A paper with those results was accepted by the journal PLOS ONE.

Brokers, hijacking and collusion

Amaral, Richardson and their collaborators found fraudulent networks use several key strategies: (1) Groups of researchers collude to publish papers across multiple journals. When their activities are discovered, the papers are subsequently retracted; (2) brokers serve as intermediaries to enable mass publication of fraudulent papers in compromised journals; (3) fraudulent activities are concentrated in specific, vulnerable subfields; and (4) organized entities evade quality-control measures, such as journal de-indexing.

“Brokers connect all the different people behind the scenes,” Amaral said. “You need to find someone to write the paper. You need to find people willing to pay to be the authors. You need to find a journal where you can get it all published. And you need editors in that journal who will accept that paper.”

Sometimes these organizations go around established journals altogether, searching instead for defunct journals to hijack. When a legitimate journal stops publishing, for example, bad actors can take over its name or website. These actors surreptitiously assume the journal’s identity, lending credibility to its fraudulent publications, despite the actual publication being defunct.

“This happened to the journal HIV Nursing,” Richardson said. “It was formerly the journal of a professional nursing organization in the U.K., then it stopped publishing, and its online domain lapsed. An organization bought the domain name and started publishing thousands of papers on subjects completely unrelated to nursing, all indexed in Scopus.”

Fighting for science

To combat this growing threat to legitimate scientific publishing, Amaral and Richardson emphasize the need for a multi-pronged approach. This approach includes enhanced scrutiny of editorial processes, improved methods for detecting fabricated research, a greater understanding of the networks facilitating this misconduct and a radical restructuring of the system of incentives in science.

Amaral and Richardson also underscore the importance of addressing these issues before artificial intelligence (AI) infiltrates scientific literature more than it already has.

“If we’re not prepared to deal with the fraud that’s already occurring, then we’re certainly not prepared to deal with what generative AI can do to scientific literature,” Richardson said. “We have no clue what’s going to end up in the literature, what’s going to be regarded as scientific fact and what’s going to be used to train future AI models, which then will be used to write more papers.”

“This study is probably the most depressing project I’ve been involved with in my entire life,” Amaral said. “Since I was a kid, I was excited about science. It’s distressing to see others engage in fraud and in misleading others. But if you believe that science is useful and important for humanity, then you have to fight for it.”

Here’s a link to and a citation for the paper,

The entities enabling scientific fraud at scale are large, resilient, and growing rapidly by Reese A. K. Richardson, Spencer S. Hong, Jennifer A. Byrne, Thomas Stoeger, and Luís A. Nunes Amaral. Proceedings of the National Academy of Sciences August 4, 2025 122 (32) e2420092122 DOI: https://doi.org/10.1073/pnas.2420092122

This paper is open access.

And now—math fraud

A September 19, 2025 news item on ScienceDaily features an investigation into fraudulent math research, Note: A link has been removed,

An international team of authors led by Ilka Agricola, professor of mathematics at the University of Marburg, Germany, has investigated fraudulent practices in the publication of research results in mathematics on behalf of the German Mathematical Society (DMV) and the International Mathematical Union (IMU), documenting systematic fraud over many years. The results of the study were recently published on the preprint server arxiv.org and in the Notices of the American Mathematical Society (AMS) and have since caused a stir among mathematicians.

Sanjana Gajbhiye’s September ??, 2025 article for earth.com delves further into the topic, Note: Links have been removed,

Quality lost to quantity

The findings show how the definition of research quality has shifted. Instead of focusing on content, originality, and insight, institutions and individuals are increasingly evaluated by commercial metrics. These include the number of publications, total citations, and the so-called impact factor of journals.

Such measures, calculated by private companies with little transparency, have gained outsized influence. Providers promote their databases globally, and universities use them to enhance prestige and compete internationally.

This environment rewards quantity over quality, pushing academics to publish more, even when contributions are marginal or flawed.

Fraudulent companies have seized this opportunity. They sell services that manipulate rankings, offering ghostwritten articles, fake peer reviews, and even bundles of citations. For individuals, this can mean better career prospects.

For universities, it can result in higher rankings, increased funding, and greater appeal to international students. The collateral damage is a growing pool of unread publications that add nothing to scientific understanding.

Fake mathematics success

The report documents striking examples that reveal how metrics can produce absurd outcomes. In 2019, Clarivate Inc., the market leader for citation data, ranked a Taiwanese university as having the most world-class mathematicians. The catch was startling: mathematics was not even offered at the institution.

Mathematical trust under threat

“‘Fake science’ is not only annoying, it is a danger to science and society,” said IMU Secretary General Professor Christoph Sorger.

“Because you don’t know what is valid and what is not. Targeted disinformation undermines trust in science and also makes it difficult for us mathematicians to decide which results can be used as a basis for further research.”

This erosion of trust strikes at the heart of mathematics. Proofs rely on certainty, yet when fraudulent or hollow work appears in respected outlets, that certainty weakens.

Fixing trust in mathematics publishing

The commission’s work does not end with exposing the problem. It also outlines possible solutions for a healthier publication system. These recommendations emphasize the need to strengthen peer review, encourage collaboration among journals, and recenter the evaluation of research on quality rather than raw numbers.

Metrics are deeply tied to funding and prestige, so the shift won’t be simple, but it could reshape the landscape for future generations.

A September 20, 2025 Castle Journal blog posting provides more information,

The “Culture of Numbers” and its Consequences

The study, led by Professor Ilka Agricola of the University of Marburg, argues that the root cause of the problem is a “culture of numbers” that prioritizes commercial metrics over scientific content. Universities and research institutions have become increasingly reliant on commercial databases like Clarivate’s Journal Citation Reports (JCR) to evaluate researchers. These metrics, which are not transparent and are not vetted by the scientific community, have become the main currency for career progression, grants, and prestige.

 * “Megajournals”: The study highlights the rise of “megajournals,” which publish anything as long as the authors pay a fee. These journals now publish more articles per year than all reputable mathematics journals combined. The report cites a shocking example where a commercial database ranked a university in Taiwan as having the most world-class researchers in mathematics, despite the fact that the university does not even offer mathematics as a subject.

 * Paper Mills and Citation Cartels: The investigation found evidence of “paper mills,” which sell fabricated papers to researchers, and “citation cartels,” where academics agree to cite each other’s work to artificially inflate their metrics. These services are offered anonymously online, with prices for articles and citations ranging from hundreds to thousands of dollars. The report describes these networks as “criminal organizations” that have invaded the “ecosystem” of scientific publishing.

Ilka Agricola gave an interview to Retraction Watch. From the undated article, Note: Links have been removed,

A pair of papers posted to the arXiv addresses the issue of fraudulent publishing in math, particularly metrics gaming, and offers a list of recommendations to help detect and deal with that problem and other fraudulent activities. (The former was also published in the October AMS Notices; the latter will appear in the November issue.) “Fraudulent publishing undermines trust in science and scientific results and therefore fuels antiscience movements,” mathematician Ilka Agricola, lead author of both papers, told Retraction Watch. 

A professor of mathematics at Marburg University in Germany, Agricola was president of the German Mathematical Society in 2021-2022 and is chair of the Committee on Publishing of the International Mathematical Union. The new articles are the products of a working group of the IMU and the International Council for Industrial and Applied Mathematics. 

Retraction Watch: As you note in the new papers, Clarivate announced in 2023 it had excluded the entire field of math from its list of “Highly Cited Researchers,” or HCRs. What’s going on?

Agricola: The publication culture in math differs a bit from, say, experimental and life sciences. On average, mathematicians publish fewer papers with fewer authors than scientists in other fields. So, with the same absolute number of papers and citations, one can become a “highly-cited researcher” in math, but not in other fields. Thus, gaming the system is easier. 

The list of HCRs for mathematics became so screwed that Clarivate couldn’t pretend anymore that it had any value. This being said, Clarivate announced that they would look into new measuring tools, but didn’t come up with any alternative ideas in the meantime, nor did they contact any representatives of the international mathematical community. 

Retraction Watch:  Few people talk about fraudulent publishing in math. Why is that?

Agricola: For a long time, mathematicians thought that as long as they keep away from predatory journals or paper mills, the problem does not affect them. This turned out to be wrong. 

Retraction Watch: If you look at the number of papers that tripped Clear Skies’ Papermill Alarm in 2022 (we included a histogram in this article we wrote for The Conversation [link and excerpts follow]), math is pretty far down the list. Are there a lot of fake papers in math?

Agricola: It is probably fair to say that the problem is not as severe as in other fields like cancer research, but the community is smaller and the number of fake papers is growing at alarming speed. Predatory and low-quality mega-journals are trying hard to lure respected scientists into their parallel universe of fake science, thus trying to give themselves the impression of respectability. Thus, one of our goals is to raise awareness for the issue in the mathematical community!

Retraction Watch: You and your coauthors are mathematicians, and yet you argue against focusing on numbers like journal impact factors and publication and citation counts. Is that what’s driving all of this bad behavior?

Agricola: “When a measure becomes a target, it ceases to be a good measure.” This quote is from the British economist Charles Goodhart, and it also applies to bibliometrics measures. Of course, gaming these metrics has always existed, but some of us liked to believe that they would be roughly OK, with some error bar due to some cheating. Now, we realize the error bar is larger than the number one wants to measure. Perhaps one advantage of mathematicians is that they are not easily impressed by numbers, and we have the means to understand and analyze them — this is our job. And so, the conclusion is very clear: The correlation between bibliometrics and research quality is so low that we should not use bibliometrics. And I urge all colleagues to say so openly!

Retraction Watch: So how do we judge research quality if we shouldn’t use publication metrics?

Agricola: Read the actual publications instead of relying on bibliometrics! Plus, in mathematics, we are lucky to have two extremely well curated databases for math papers and journals, zbMath Open and MathReviews. If a journal is not included there, it’s either very interdisciplinary or one should get suspicious.

Retraction Watch: Is it possible for individual researchers to jump off the bibliometrics bandwagon without jeopardizing their careers?

Agricola: We need to fight for a change in culture, that’s for sure, and the path will be rash and hard. To young researchers, we should give the warning that being involved in predatory publishing can also just as well put their scientific integrity at risk. Remember the people who had to resign because of data falsification? 

I am providing citations (of a sort) for both papers, along with links to the three sites where each paper can be found and to the PDFs. Everything is open access.

Fraudulent Publishing in the Mathematical Sciences by Ilka Agricola, Lynn Heller, Wil Schilders, Moritz Schubotz, Peter Taylor, Luis Vega.

arXiv: https://arxiv.org/abs/2509.07257

AMS (American Mathematical Society) Notices October 2025: https://www.ams.org/journals/notices/202509/noti3217/noti3217.html?adat=October%202025&trk=3217&pdfissue=202509&pdffile=rnoti-p1038.pdf&cat=none&type=.html

Ilka Agricola’s ResearchGate page: https://www.researchgate.net/profile/Ilka-Agricola (scroll down to see the listed papers)

PDF: https://www.ams.org/journals/notices/202509/rnoti-p1038.pdf

How to Fight Fraudulent Publishing in the Mathematical Sciences: Joint Recommendations of the IMU [International Mathematical Union] and the ICIAM [International Council for Industrial and Applied Mathematics] by Ilka Agricola, Lynn Heller, Wil Schilders, Moritz Schubotz, Peter Taylor, Luis Vega.

arXiv: https://arxiv.org/abs/2509.09877

AMS (American Mathematical Society) Notices November 2025: https://www.ams.org/journals/notices/202510/noti3266/noti3266.html?adat=November%202025&trk=3266&pdfissue=202510&pdffile=rnoti-p1179.pdf&cat=none&type=.html

Ilka Agricola’s ResearchGate page: https://www.researchgate.net/profile/Ilka-Agricola (scroll down to see the listed papers)

PDF: https://www.ams.org/journals/notices/202510/rnoti-p1179.pdf

Fraud slows down research

Mentioned in the Retraction Watch/Agricola interview, this January 29, 2025 article by Frederik Joelving (contributing editor, Retraction Watch), Cyril Labbé (professor of computer science, Université Grenoble Alpes [UGA]) and Guillaume Cabanac (professor of computer science, Institut de Recherche en Informatique de Toulouse) is chilling, Note: Links have been removed,

Over the past decade, furtive commercial entities around the world have industrialized the production, sale and dissemination of bogus scholarly research, undermining the literature that everyone from doctors to engineers relies on to make decisions about human lives.

It is exceedingly difficult to get a handle on exactly how big the problem is. Around 55,000 scholarly papers have been retracted to date, for a variety of reasons, but scientists and companies who screen the scientific literature for telltale signs of fraud estimate that there are many more fake papers circulating – possibly as many as several hundred thousand. This fake research can confound legitimate researchers who must wade through dense equations, evidence, images and methodologies only to find that they were made up.

Even when the bogus papers are spotted – usually by amateur sleuths on their own time – academic journals are often slow to retract the papers, allowing the articles to taint what many consider sacrosanct: the vast global library of scholarly work that introduces new ideas, reviews other research and discusses findings.

These fake papers are slowing down research that has helped millions of people with lifesaving medicine and therapies from cancer to COVID-19. Analysts’ data shows that fields related to cancer and medicine are particularly hard hit, while areas like philosophy and art are less affected. Some scientists have abandoned their life’s work because they cannot keep pace given the number of fake papers they must bat down.

The problem reflects a worldwide commodification of science. Universities, and their research funders, have long used regular publication in academic journals as requirements for promotions and job security, spawning the mantra “publish or perish.”

But now, fraudsters have infiltrated the academic publishing industry to prioritize profits over scholarship. Equipped with technological prowess, agility and vast networks of corrupt researchers, they are churning out papers on everything from obscure genes to artificial intelligence in medicine.

These papers are absorbed into the worldwide library of research faster than they can be weeded out. About 119,000 scholarly journal articles and conference papers are published globally every week, or more than 6 million a year. Publishers estimate that, at most journals, about 2% of the papers submitted – but not necessarily published – are likely fake, although this number can be much higher at some publications.

… there is a bustling online underground economy for all things scholarly publishing. Authorship, citations, even academic journal editors, are up for sale. This fraud is so prevalent that it has its own name: paper mills, a phrase that harks back to “term-paper mills,” where students cheat by getting someone else to write a class paper for them.

The impact on publishers is profound. In high-profile cases, fake articles can hurt a journal’s bottom line. Important scientific indexes – databases of academic publications that many researchers rely on to do their work – may delist journals that publish too many compromised papers. There is growing criticism that legitimate publishers could do more to track and blacklist journals and authors who regularly publish fake papers that are sometimes little more than artificial intelligence-generated phrases strung together.

To better understand the scope, ramifications and potential solutions of this metastasizing assault on science, we – a contributing editor at Retraction Watch, a website that reports on retractions of scientific papers and related topics, and two computer scientists at France’s Université Toulouse III–Paul Sabatier and Université Grenoble Alpes who specialize in detecting bogus publications – spent six months investigating paper mills.

This included, by some of us at different times, trawling websites and social media posts, interviewing publishers, editors, research-integrity experts, scientists, doctors, sociologists and scientific sleuths engaged in the Sisyphean task of cleaning up the literature. It also involved, by some of us, screening scientific articles looking for signs of fakery.

What emerged is a deep-rooted crisis that has many researchers and policymakers calling for a new way for universities and many governments to evaluate and reward academics and health professionals across the globe.

Just as highly biased websites dressed up to look like objective reporting are gnawing away at evidence-based journalism and threatening elections, fake science is grinding down the knowledge base on which modern society rests.

The January 29, 2025 article highlights a number of problems including these,

To expedite the publication of one another’s work, some corrupt scientists form peer review rings. Paper mills may even create fake peer reviewers impersonating real scientists to ensure their manuscripts make it through to publication. Others bribe editors or plant agents on journal editorial boards.

María de los Ángeles Oviedo-García, a professor of marketing at the University of Seville in Spain, spends her spare time hunting for suspect peer reviews from all areas of science, hundreds of which she has flagged on PubPeer. …

“One of the demanding fights for me is to keep faith in science,” says Oviedo-García, who tells her students to look up papers on PubPeer before relying on them too heavily. Her research has been slowed down, she adds, because she now feels compelled to look for peer review reports for studies she uses in her work. Often there aren’t any, because “very few journals publish those review reports,” Oviedo-García says.

An ‘absolutely huge’ problem

It is unclear when paper mills began to operate at scale. The earliest article retracted due to suspected involvement of such agencies was published in 2004, according to the Retraction Watch Database, which contains details about tens of thousands of retractions. (The database is operated by The Center for Scientific Integrity, the parent nonprofit of Retraction Watch.) Nor is it clear exactly how many low-quality, plagiarized or made-up articles paper mills have spawned.

“The threat of paper mills to scientific publishing and integrity has no parallel over my 30-year scientific career …. In the field of human gene science alone, the number of potentially fraudulent articles could exceed 100,000 original papers,” she [Jennifer Byrne, an Australian scientist] wrote to lawmakers, adding, “This estimate may seem shocking but is likely to be conservative.”

In one area of genetics research – the study of noncoding RNA in different types of cancer – “We’re talking about more than 50% of papers published are from mills,” Byrne said. “It’s like swimming in garbage.”

… in the global south, the publish-or-perish edict runs up against underdeveloped research infrastructures and education systems, leaving scientists in a bind. For a Ph.D., the Cairo physician who requested anonymity conducted an entire clinical trial single-handedly – from purchasing study medication to randomizing patients, collecting and analyzing data and paying article-processing fees. In wealthier nations, entire teams work on such studies, with the tab easily running into the hundreds of thousands of dollars.

“Research is quite challenging here,” the physician said. That’s why scientists “try to manipulate and find easier ways so they get the job done.”

Institutions, too, have gamed the system with an eye to international rankings. In 2011, the journal Science described how prolific researchers in the United States and Europe were offered hefty payments for listing Saudi universities as secondary affiliations on papers. And in 2023, the magazine, in collaboration with Retraction Watch, uncovered a massive self-citation ploy by a top-ranked dental school in India that forced undergraduate students to publish papers referencing faculty work.

According to the January 29, 2025 article, there is a root cause, Note: Links have been removed,

… unsavory schemes can be traced back to the introduction of performance-based metrics in academia, a development driven by the New Public Management movement that swept across the Western world in the 1980s, according to Canadian sociologist of science Yves Gingras of the Université du Québec à Montréal. When universities and public institutions adopted corporate management, scientific papers became “accounting units” used to evaluate and reward scientific productivity rather than “knowledge units” advancing our insight into the world around us, Gingras wrote.

This transformation led many researchers to compete on numbers instead of content, which made publication metrics poor measures of academic prowess. As Gingras has shown, the controversial French microbiologist Didier Raoult, who now has more than a dozen retractions to his name, has an h-index – a measure combining publication and citation numbers – that is twice as high as that of Albert Einstein – “proof that the index is absurd,” Gingras said.

Worse, a sort of scientific inflation, or “scientometric bubble,” has ensued, with each new publication representing an increasingly small increment in knowledge. “We publish more and more superficial papers, we publish papers that have to be corrected, and we push people to do fraud,” said Gingras.

In terms of career prospects of individual academics, too, the average value of a publication has plummeted, triggering a rise in the number of hyperprolific authors. One of the most notorious cases is Spanish chemist Rafael Luque, who in 2023 reportedly published a study every 37 hours.
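As an aside, part of the h-index's appeal is how mechanical it is to compute, which is also why Gingras can show it rewarding volume over influence. Here is a minimal sketch (the function name and the sample citation counts are illustrative, not drawn from the article): an author's h-index is the largest h such that they have at least h papers each cited at least h times.

```python
def h_index(citations):
    """Return the largest h such that at least h papers
    have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Many modestly cited papers beat a few landmark ones:
print(h_index([20] * 20))    # 20 papers, 20 citations each -> h = 20
print(h_index([1000] * 10))  # 10 heavily cited papers -> h capped at 10
print(h_index([]))           # no publications -> 0
```

The second example shows why the metric is "absurd" as a quality measure: ten papers with a thousand citations apiece can never score higher than ten, while steady mid-tier output climbs without limit.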

There is some hope according to the January 29, 2025 article, Note: Links have been removed,

Stern [Bodo Stern, a former editor of the journal Cell and chief of Strategic Initiatives at Howard Hughes Medical Institute] isn’t the first scientist to bemoan the excessive focus on bibliometrics. “We need less research, better research, and research done for the right reasons,” wrote the late statistician Douglas G. Altman in a much-cited editorial from 1994. “Abandoning using the number of publications as a measure of ability would be a start.”

Nearly two decades later, a group of some 150 scientists and 75 science organizations released the San Francisco Declaration on Research Assessment, or DORA, discouraging the use of the journal impact factor and other measures as proxies for quality. The 2013 declaration has since been signed by more than 25,000 individuals and organizations in 165 countries.

Despite the declaration, metrics remain in wide use today, and scientists say there is a new sense of urgency.

Stern and his colleagues have tried to make improvements at their institution. Researchers who wish to renew their seven-year contract have long been required to write a short paragraph describing the importance of their major results. Since the end of 2023, they also have been asked to remove journal names from their applications.

That way, “you can never do what all reviewers do – I’ve done it – look at the bibliography and in just one second decide, ‘Oh, this person has been productive because they have published many papers and they’re published in the right journals,’” says Stern. “What matters is, did it really make a difference?”

Shifting the focus away from convenient performance metrics seems possible not just for wealthy private institutions like Howard Hughes Medical Institute, but also for large government funders. In Australia, for example, the National Health and Medical Research Council in 2022 launched the “top 10 in 10” policy, aiming, in part, to “value research quality rather than quantity of publications.”

Rather than providing their entire bibliography, the agency, which assesses thousands of grant applications every year, asked researchers to list no more than 10 publications from the past decade and explain the contribution each had made to science. …

Gingras, the Canadian sociologist, advocates giving scientists the time they need to produce work that matters, rather than a gushing stream of publications. He is a signatory to the Slow Science Manifesto: “Once you get slow science, I can predict that the number of corrigenda, the number of retractions, will go down,” he says.

At one point, Gingras was involved in evaluating a research organization whose mission was to improve workplace security. An employee presented his work. “He had a sentence I will never forget,” Gingras recalls. The employee began by saying, “‘You know, I’m proud of one thing: My h-index is zero.’ And it was brilliant.” The scientist had developed a technology that prevented fatal falls among construction workers. “He said, ‘That’s useful, and that’s my job.’ I said, ‘Bravo!’”

Sometimes, there’s a science reporting problem

A September 3, 2025 Universiteit van Amsterdam press release (also on EurekAlert) highlights a problem with science reporting and overconfidence,

Science journalists aren’t particularly concerned about so-called “predatory journals”, confident that they have the skills and intuition needed to avoid reporting on problematic research. For many, a journal’s reputation and name-recognition are decisive factors in assessing the quality of scientific research – but this could be exacerbating existing imbalances in science and journalism. This perspective emerges from a new study, led by Dr Alice Fleerackers of the University of Amsterdam (UvA), and published on 2 September [2025] in Journalism Practice.

Predatory journals prioritise profit over editorial and publication standards. They often charge researchers publication fees but offer little to no real quality control, such as peer review. As a result, some journals publish almost everything submitted. ‘Predatory journals are not a harmless side effect of the academic publishing industry,’ says Fleerackers. ‘They are becoming increasingly common, raising concerns about the integrity of scientific publishing. They not only undermine the reliability of science but also jeopardise science journalism, as journalists can unknowingly report on weak or even flawed research.’

In the new study, Fleerackers – along with colleagues from Simon Fraser University (Canada) and San Francisco State University (US) – investigated how science journalists view predatory journals and what strategies they employ to ensure the reliability of the journals they report on. The researchers present a qualitative analysis of interviews with 23 health, science, and environmental journalists in Europe and North America.

Problematic, but only in theory

Some of the journalists interviewed were familiar with the phenomenon of predatory journals and acknowledged that they are theoretically problematic. However, most weren’t concerned that they might be using them in their own work. They acknowledged that these journals might be a problem for colleagues, but not for them.

Well-known, therefore reliable

Journalists in the study were confident they wouldn’t fall for a predatory journal because of their strong intuition, which they said allowed them to immediately distinguish high-quality from problematic research. Besides their intuition, they also relied on strategies for verifying the reliability of research that they had developed through years of experience. These strategies often centred on trust proxies – like the journal’s prestige, impact factor, and selectivity – as well as whether the journal claimed to conduct peer review.

Proofreading also played a role for some journalists: if an article contained grammatical or spelling errors, it could be a sign of low-quality research. Open access journals were also considered less reliable by several journalists. ‘But by far the most commonly used benchmark for reliability was the journal’s reputation,’ Fleerackers explains. ‘Some journalists avoid all journals they’re not familiar with and report only on research published in top journals like Science and Nature.’

Distortion in science news

According to Fleerackers, journalists’ focus on the reputation and prestige of journals has major consequences for the diversity of research in the news media. ‘Research from newer, lesser-known journals, and from journals in the Global South, for example, remains hidden from the public. Most journalists in our study didn’t realise that their selection strategies could perpetuate the existing imbalance in science news. I hope that our study can raise awareness of this among journalists.’


Here’s a link to and a citation for the paper.

“I’d Like to Think I’d Be Able to Spot One”: How Journalists Navigate Predatory Journals by Alice Fleerackers, Laura L. Moorhead & Juan Pablo Alperin. Journalism Practice 1–19. DOI: https://doi.org/10.1080/17512786.2025.2551984 Published online: 02 Sep 2025

Final comments

This has been a good wake-up call for me. Bad apples, yes, but criminal networks? I had no idea. I will probably write more about this in my year-end 2025 post. In the meantime, this is a good reminder to exercise caution.

Are science cities London, Paris, New York and Tokyo losing prominence?

I am more accustomed to thinking about great art cities than great science cities but it appears I lack imagination if a Dec. 13, 2013 news item on Nanowerk is to be believed (Note: A link has been removed),

The world’s largest scientific centers are losing some of their prominence due to geographical decentralization at the global scale, according to a team of researchers from the LISST (Laboratoire Interdisciplinaire Solidarités, Sociétés, Territoires, CNRS / Université de Toulouse II-Le Mirail / EHESS) who conducted a systematic statistical analysis of millions of articles and papers published in thousands of scientific reviews between 1987 and 2007. Their project, whose results were recently published on the Urban Studies website (“Cities and the geographical deconcentration of scientific activity: A multilevel analysis of publications (1987–2007)”), was the first to focus on the geography of science in all the world’s cities.

Here’s an image illustrating the researchers’ work,

Courtesy of CNRS [downloaded from http://www2.cnrs.fr/presse/communique/3353.htm]

The Dec. 10, 2013 CNRS (France’s Centre national de la recherche scientifique) news release ([English language version] [en français]), which originated the news item, provides more details,

Geographic encoding, city by city, of all of the articles listed in the Science Citation Index (SCI) (1) between 1987 and 2007 shows that traditional scientific centers are not as prominent as they used to be: the combined share of the world’s top 10 science cities dropped from 20% in 1987 to 13% in 2007. Researchers at the LISST (Laboratoire Interdisciplinaire Solidarités, Sociétés, Territoires, CNRS /Université de Toulouse II-Le Mirail / EHESS), aided by two collaborators at the CIRST (Centre Interuniversitaire de Recherche sur la Science et la Technologie) in Montreal, concluded that this phenomenon is accompanied by a general trend toward decentralization worldwide, especially in emerging nations.

China offers a good illustration: the main provincial capitals are playing a much stronger role than they did in the past, and the skyrocketing development of science in China goes alongside with a geographical realignment. Whereas Beijing and Shanghai together accounted for 52.8% of the articles published by Chinese researchers in the Science Citation Index in 1987, this percentage dropped to 31.9% in 2007.

Turkey is another striking example of an emerging nation whose scientific system has seen very rapid growth. In terms of the number of articles published, the country rose from 44th to 16th place worldwide between 1987 and 2007. Over the same period, its two main scientific hubs, Ankara and Istanbul, lost some of their pre-eminence within the country. While these two cities represented more than 60% of Turkey’s scientific production in 1987, they now produce slightly less than half of the articles published by Turkish researchers. And, as in China, growth in scientific activity is accompanied by geographical decentralization: Turkey has more science hubs now than it did two decades ago, and its two traditional scientific capitals play a lesser role.

The US, which remains the world leader in terms of scientific production, is an exceptional case: the number of articles published by American researchers continues to rise steadily, but at a slower pace than in the emerging nations. Consequently, the country’s share of worldwide scientific production is lower than it used to be: in 1987, the US represented 34% of the SCI, but by 2007 this figure had fallen to 25%. Nonetheless, the American scientific scene remains quite stable geographically: the role of its main research centers has not evolved significantly because the US scientific establishment has always been one of the least centralized in the world, with research activities scattered across hundreds of cities of all sizes.

Does this development herald the decline of the great scientific centers? The fact that scientific activity is becoming more geographically decentralized on a worldwide scale does not imply that it is declining in large cities with a strong research tradition. The number of articles published in London, Paris, New York and Tokyo continues to rise every year. But the pace of growth in those traditional centers is slower than in others in the global scientific system. As more research is conducted in an increasing number of cities, the main scientific centers contribute a lesser share to the total.

The findings of this project, funded as part of an ANR program (2010-2013), challenge the assumption that scientific production inevitably tends to be concentrated in a few large urban areas, which therefore should be given priority in the allocation of resources.

(1) The Science Citation Index (or SCI) is a bibliographical database created in the US in 1964 for the purpose of documenting all scientific production worldwide. In its current version (SCI-Expanded), which is part of the Thomson Reuters Web of Science database (WoS), it registers more than one million scientific articles every year, encompassing the experimental sciences and sciences of the universe, medicine, the engineering sciences, etc., but not the humanities and social sciences, which are included in the SSCI. The SCI-Expanded records contain information on the content of each article (title, name of publication, summary, keywords), its author or authors (name, institution, city, country), and the list of references cited in the article.

This is especially fascinating in light of a recently published book claiming that the major city centres for art in the 21st century will be shifting to some unexpected places. From Phaidon Press’ Art Cities of the Future webpage,

The volume profiles 12 global cities to watch for exciting contemporary art: Beirut, Bogotá, Cluj, Delhi, Istanbul, Johannesburg, Lagos, San Juan, São Paulo, Seoul, Singapore and Vancouver.

Thankfully, in both the old world and the new, commentators appear to agree. “It’s great to have a look around and discover truly interesting new work,” said Simon Armstrong, book buyer at Tate Modern and Tate Britain, in The Bookseller, “and there are some great examples of emergent artists here in this huge presentation of contemporary art from 12 cities on the fringes of the art map.”

Hannah Clugston, writing in Aesthetica concurred, describing the title as “brilliantly executed” with “stunning images,” and possessing an awareness “of the wider concerns behind the work.”

It appears that the geography of creative endeavours in the arts and the sciences is shifting. For those curious about the science end of things, here’s a link to and a citation for the paper about geography and scientific activity,

Cities and the geographical deconcentration of scientific activity: A multilevel analysis of publications (1987–2007) by Michel Grossetti, Denis Eckert, Yves Gingras, Laurent Jégou, Vincent Larivière, and Béatrice Milard. Urban Studies, 0042098013506047, November 20, 2013, doi:10.1177/0042098013506047

This paper is behind a paywall.

Science and Society 2013 opens with a bang: a sponsor releases results of science muzzle survey on opening day (while S&S 2013 offers live streaming of some events)

Today (Oct. 21, 2013), during opening day of the Science and Society 2013 symposium (most recently mentioned in my Oct. 8, 2013 posting), one of the symposium sponsors, the Professional Institute of the Public Service of Canada (PIPSC), released a survey of thousands of Canadian federal scientists answering questions about government science communication policy, or the ‘government muzzle’ as it’s sometimes called (from the Oct. 21, 2013 Canadian Broadcasting Corporation [CBC] news item),

Hundreds of federal scientists responding to a survey said they had been asked to exclude or alter information for non-scientific reasons and thousands said they had been prevented from speaking to the media. [emphasis mine]

The Professional Institute of the Public Service of Canada (PIPSC), which commissioned the survey from Environics Research “to gauge the scale and impact of ‘muzzling’ and political interference among federal scientists,” released the results Monday at a news conference.

The union sent invitations to 15,398 federal scientists in June, asking them to participate in the survey. More than 4,000 took part. [emphasis mine]

PIPSC represents 60,000 public servants across the country, including 20,000 scientists, in federal departments and agencies, including scientists involved in food and consumer product safety and environmental monitoring.

Weirdly, the news item opens by announcing that hundreds of scientists responded, only to state later that more than 4,000 took part.

The Oct. 21, 2013 PIPSC news release about the survey, whose findings are included in a report (The Big Chill), can be found on the Live-PR website,

The survey, the findings of which are included in a new report titled The Big Chill, is the first extensive effort to gauge the scale and impact of “muzzling” and political interference among federal scientists since the Harper government introduced communications policies requiring them to seek approval before being interviewed by journalists. Information Commissioner Suzanne Legault is currently conducting her own investigation of the policies, which have been widely criticized for silencing scientists, suppressing information critical or contradictory of government policy, and delaying timely, vital information to the media and public.

In particular, the survey also found that nearly one-quarter (24%) of respondents had been directly asked to exclude or alter information for non-scientific reasons and that over one-third (37%) had been prevented in the past five years from responding to questions from the public and media.

In addition, the survey found that nearly three out of every four federal scientists (74%) believe the sharing of scientific findings has become too restricted in the past five years and that nearly the same number (71%) believe political interference has compromised Canada’s ability to develop policy, law and programs based on scientific evidence. According to the survey, nearly half (48%) are aware of actual cases in which their department or agency suppressed information, leading to incomplete, inaccurate, or misleading impressions by the public, industry and/or other government officials.

“Federal scientists are facing a climate of fear,” says PIPSC president Gary Corbett, “- a chill brought on by government policies that serve no one’s interests, least of all those of the Canadian public. The safety of our food, air, water, of hundreds of consumer and industrial products, and our environment depends on the ability of federal scientists to provide complete, unbiased, timely and accurate information to Canadians. Current policies must change to ensure these objectives are met.”

For anyone interested in seeing the survey and report, you can download it from PIPSC’s The Big Chill webpage.

In this context, the Science and Society 2013 symposium (S&S 2013) being held in Ottawa (site of PIPSC’s [an S&S 2013 sponsor] Oct. 21, 2013 news conference) is livestreaming a few events: those open to the public (at 7 pm) and some intended for symposium attendees only. From an Oct. 18, 2013 announcement about the S&S 2013 live events,

WATCH THESE LIVE ONLINE!

MONDAY OCT. 21, 7PM ET
Transformations in the Relations between Science, Policy and Citizens
Yves Gingras, Canada Research Chair in History and Sociology of Science, UQAM

TUESDAY OCT. 22, 9:15AM ET
Science and Its Publics: Dependence, Disenchantment, and Deliverance
Sheila Jasanoff, Pforzheimer Professor of Science and Technology Studies, Harvard Kennedy School

TUESDAY OCT. 22, 1PM ET
Science, Values and Democracy
Heather Douglas, Waterloo Chair in Science and Society, Waterloo
Carla Fehr, Wolfe Chair in Scientific and Technological Literacy, Waterloo

WEDNESDAY Oct. 23, 1PM ET
Science Communication
Mary Anne Moser, Banff Centre

WEDNESDAY Oct. 23, 5:30PM ET
Influencers Panel
Key decision-makers discuss the symposium results
Scott Findlay, Evidence for Democracy
Pat Mooney, ETC Group
Louise Vandelac, UQAM
Denise Amyot, Association of Canadian Community Colleges

Apparently, you can go here to click through to the events being livestreamed. (It looks like I grumbled too soon about the public not being allowed to attend any of the symposium talks outside the evening events specifically designated for the public. Thank you!)

Canadian science and society symposium in Ottawa (Oct. 21 – 23, 2013)

The Science and Society 2013: Emerging Agendas for Citizens and the Sciences symposium (featured previously in my Aug. 16, 2013 posting) is being held in Ottawa, Ontario from Oct. 21-23, 2013 according to the symposium homepage,

Co-organized by the Situating Science SSHRC Strategic Knowledge Cluster (www.situsci.ca) and the University of Ottawa’s Institute for Science, Society and Policy (www.issp.uottawa.ca), the Science and Society 2013 symposium aims to understand and address the key issues at the interface of science, technology, society and policy.

The event will connect disparate themes and bring different groups with shared interests together to brainstorm solutions to common challenges. It will demonstrate that collaboration among academics, students, policy makers, stakeholders and the public at large can lead to new insights and a deeper understanding of the social and cultural contexts of science and technology.

The symposium aims to make the discussion of science and technology and their place in society more prominent in the national dialogue, notably through the publication of a symposium report containing recommendations on how to understand and improve the science-society interface and improve science policy.  This document will be distributed among media and key decision makers.

There are three events for the public:

The Transformations in the Relations Between Science, Policy and Citizens

Date: Mon. Oct. 21, 2013
Time: 19:00 – 20:30
Location: Desmarais Building, Rm. 12-102 (12th floor), University of Ottawa, 55 Laurier Avenue East, Ottawa
Price: Free (registration required)
Reception and Student Poster Display to follow
Out of town? Watch live online (link TBD)

The traditional relations between scientists, policy makers and citizens have been transformed over the last fifteen years. Scientists were used to providing science for policy makers who were eager to listen, while citizens were relatively confident in the judgments of scientists. Using recent cases of scientific and public controversies, we will show that citizens have more power now than ever before to influence policies in matters relating to scientific research. This raises the pressing issue for us as citizens: How do we give a central place to a scientific culture that is adapted to the 21st century?

Yves Gingras
Canada Research Chair in the History and Sociology of Science
Université du Québec à Montréal

UNCERTAIN SCIENCE, UNCERTAIN TIMES
Selections and discussion of Michael Frayn’s Tony Award-winning play, Copenhagen
Moderated by Jay Ingram
Directed by Kevin Orr
Tuesday Oct. 22, 2013, 7:30 pm
Alumni Auditorium, Jock-Turcot University Centre, 85 University, University of Ottawa
Free
Donations accepted at the door
Reception to follow
“Join” our Facebook event page:
https://www.facebook.com/events/455270781259464/?ref=22
Limited seating! Register online by Sunday Oct. 20:
www.ScienceAndSociety2013.ca

The Situating Science national Strategic Knowledge Cluster with the University of Ottawa Institute for Science, Society and Policy invite you to join us for a professionally staged reading of selections from Michael Frayn’s acclaimed play Copenhagen, which will be interwoven with expert panel discussions moderated by science broadcaster and author, Jay Ingram.

Copenhagen is based on the final meeting of Nobel Prize-winning physicists Niels Bohr and Werner Heisenberg in the midst of the 1940s war effort. The issues it raises concerning science, ethics and politics are as pressing as ever.

Stage readings by: Tibor Egervari, Peter Haworth, and Beverly Wolfe

Panelists:
Dr. Ted Hsu, Member of Parliament for Kingston and the Islands, Science and Technology Critic for the Liberal Party of Canada

Dr. Shohini Ghose, Associate Professor, Department of Physics & Computer Science; Affiliate member, Perimeter Institute for Theoretical Physics and Director, Centre for Women in Science, Wilfrid Laurier University

Dr. Robert Smith, Professor, Department of History and Classics, University of Alberta

Influencers Panel
Panel of influential decision-makers discussing results of the symposium

Date: Wed. Oct. 23, 2013
Time: 17:30 – 19:00
Location: Desmarais Building, Rm. 12-102 (12th floor), University of Ottawa, 55 Laurier Avenue East, Ottawa
Price: Free (registration required)
Reception to follow.
Out of town? Watch live online! (link TBD)

Yves St-Onge
Vice-President, Public Affairs and Marketing, Canada Science and Technology Museums Corporation

Scott Findlay
Associate Professor, Department of Biology, University of Ottawa
Evidence for Democracy

Pat Mooney
Executive Director, ETC Group

Louise Vandelac
Professor, Department of Sociology, Université du Québec à Montréal

Denise Amyot
President and CEO, Association of Canadian Community Colleges

Register today to attend the 3 public evening events …
Not in Ottawa? Some select symposium events will be available to watch online live (no registration needed). Stay tuned to the event website for more.

This symposium, save for the three public evening events, appears to be for invitees only (there’s no symposium registration page). Presumably nobody wants any members of the public or strangers present when the invitees discuss such topics as these (from the symposium programme):

Science and Its Publics: Dependence, Disenchantment, and Deliverance [emphasis mine]

Desmarais Building Rm. 12-102
Chair: Dr. Gordon McOuat, Situating Science
Speaker: Dr. Sheila Jasanoff, Harvard Kennedy School

Session 1a: Science and Democracy [emphasis mine]

Desmarais Building Rm. 12-102
Chair/Speaker: Dr. Heather Douglas, Waterloo
Speakers:
Dr. Frédéric Bouchard, U. de Montréal
Dr. Patrick Feng, U. Calgary

Science, Policy and Citizens: How to improve the Science/Society interface [emphasis mine]

Desmarais Building Rm. 12-102
Chairs: Dr. Marc Saner, ISSP and Dr. Gordon McOuat, Situating Science
Speakers: Rapporteurs from previous sessions

It seems odd to be discussing democracy, citizenship, and science without allowing the public to attend any of the sessions. Meanwhile, the symposium’s one and only science and media session features two speakers, Penny Park of the Science Media Centre of Canada and Ivan Semeniuk of the Globe and Mail, who are firmly ensconced members of the mainstream media, with no mention of anything else (science blogs?). Arguably, science bloggers could be considered relevant to these discussions since research suggests that interested members of the public are searching for science information online (in blogs and elsewhere) in increasing numbers. I hope to get a look at the documentation once it’s been published, assuming there will be public access.