Tag Archives: University of Seville

Scientific fraud: widespread and organized according to Northwestern University research + math fraud scandal

I have three stories about issues with science and mathematics: the research, the reporting, and the fraud.

Northwestern University and widespread scientific fraud

An August 4, 2025 article by Cathleen O’Grady for science.org describes a study into global networks instigating scientific fraud, Note: A link has been removed,

For years, sleuths who study scientific fraud have been sounding the alarm about the sheer size and sophistication of the industry that churns out fake publications. Now, an extensive investigation finds evidence of a range of bad actors profiting from fraud. The study, based on an analysis of thousands of publications and their authors and editors, shows paper mills are just part of a complex, interconnected system that includes publishers, journals, and brokers.

The paper, published today in the Proceedings of the National Academy of Sciences, paints an alarming picture. Northwestern University metascientist Reese Richardson and his colleagues identify networks of editors and authors colluding to publish shoddy or fraudulent papers, report that large organizations are placing batches of fake papers in journals, suggest brokers may serve as intermediaries between paper mills and intercepted journals, and find that the number of fake papers—though still relatively small—seems to be increasing at a rate far greater than the scientific literature generally.

The paper shows that misconduct “has become an industry,” says Anna Abalkina of the Free University of Berlin, who studies corruption in science and was not involved with the research. Richardson and colleagues hope their sweeping case will attract attention and spur change.

O’Grady’s August 4, 2025 article provides some fascinating detail, Note: Links have been removed,

They began their analysis by pinpointing corrupt editors. They focused their investigation on PLOS ONE, because the megajournal allows easy access to bulk metadata and publishes the names of the editors who have handled the thousands of papers it publishes each year, making it possible to detect anomalies without behind-the-scenes information. The researchers identified all the papers from the journal that had been retracted or received comments on PubPeer—a website that allows researchers to critique published work—and then identified each paper’s editors.

All told, 33 editors stood out as more frequently handling work that was later retracted or criticized than would be expected by chance. “Some of these were immense outliers,” Richardson says. For instance, of the 79 papers that one editor had handled at PLOS ONE, 49 have been retracted. Flagged editors handled 1.3% of papers published in the journal by 2024, but nearly one-third of all retracted papers.

The team also spotted that these editors worked on certain authors’ papers at a suspiciously high rate. These authors were often editors at PLOS [Public Library of Science] ONE themselves, and they often handled each other’s papers. It’s possible that some editors are being paid bribes, Richardson says, but “also possible that these are informal arrangements that are being made among colleagues.” The researchers detected similarly questionable editor behavior in 10 journals published by Hindawi, an open-access publisher that was shuttered because of rampant paper mill activity after Wiley acquired it. A spokesperson for Wiley told Science the publisher has made “significant investments to address research integrity issues.”

Renee Hoch, head of publication ethics at PLOS, said in an email to Science that the publisher has long been aware of networks like these, and will assess whether any of the editors implicated are still on the journal’s editorial board, opening investigations if they are. She emphasizes that the study focused on PLOS because of its readily accessible data: “Paper mills are truly an industry-wide problem.”

Researchers working on paper mills have long assumed that editors and authors have been colluding. The new findings are “killer evidence” for these suspicions, says Domingo Docampo, a bibliometrician at the University of Vigo (Spain). He adds that although the findings only show collusion at a limited number of journals, others are probably affected. Just last week, Retraction Watch reported that the publisher Frontiers had begun to retract 122 papers after discovering a network of editors and authors “who conducted peer review with undisclosed conflicts of interest,” according to a company statement. The network of 35 individuals has also published more than 4000 papers in journals from seven other publishers, the company said, which require further scrutiny. A Frontiers spokesperson said they planned to share information with the other affected publishers.

Richardson and his colleagues found that the problem goes far beyond networks of unscrupulous editors and authors scratching each other’s backs. They identified what appear to be coordinated efforts to arrange the publication of batches of dubious papers in multiple journals.
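The editor-screening step described in the excerpt above (flagging editors who handle later-retracted work "more frequently ... than would be expected by chance") can be sketched as a one-sided binomial test. The baseline retraction rate, significance level, and multiple-testing correction below are illustrative assumptions for the sketch, not parameters from the PNAS paper.

```python
# Sketch of the outlier-editor screen: flag editors whose handled papers
# are retracted far more often than a journal-wide baseline rate would
# predict. Baseline rate, alpha, and editor count are assumed values.
from math import comb

def binomial_tail(n: int, k: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing at least
    k retractions among n papers if retractions occur at rate p."""
    return sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i))
               for i in range(k, n + 1))

def flag_editor(handled: int, retracted: int,
                baseline_rate: float = 0.003,  # assumed journal-wide rate
                alpha: float = 0.01,
                n_editors: int = 1000) -> bool:
    """Flag an editor whose retraction count is implausible under the
    baseline rate, with a Bonferroni correction because many editors
    are screened at once."""
    p_value = binomial_tail(handled, retracted, baseline_rate)
    return p_value < alpha / n_editors

# The extreme case quoted above (49 of 79 papers retracted) is flagged;
# a single retraction among 79 papers is not.
print(flag_editor(79, 49))  # True
print(flag_editor(79, 1))   # False
```

The actual study had access to richer signals (PubPeer comments, editor-author pairings); this only illustrates the "more than expected by chance" logic.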

For the curious, there’s more in O’Grady’s August 4, 2025 article. An August 4, 2025 Northwestern University news release by Amanda Morris (received via email and available on EurekAlert) focuses on other aspects of the research,

From fabricated research to paid authorships and citations, organized scientific fraud is on the rise, according to a new Northwestern University study.

By combining large-scale data analysis of scientific literature with case studies, the researchers led a deep investigation into scientific fraud. Although concerns around scientific misconduct typically focus on lone individuals, the Northwestern study instead uncovered sophisticated global networks of individuals and entities, which systematically work together to undermine the integrity of academic publishing.

The problem is so widespread that the publication of fraudulent science is outpacing the growth rate of legitimate scientific publications. The authors argue these findings should serve as a wake-up call to the scientific community, which needs to act before the public loses confidence in the scientific process.

The study will be published during the week of August 4 in the Proceedings of the National Academy of Sciences.

“Science must police itself better in order to preserve its integrity,” said Northwestern’s Luís A. N. Amaral, the study’s senior author. “If we do not create awareness around this problem, worse and worse behavior will become normalized. At some point, it will be too late, and scientific literature will become completely poisoned. Some people worry that talking about this issue is attacking science. But I strongly believe we are defending science from bad actors. We need to be aware of the seriousness of this problem and take measures to address it.”

An expert in complex social systems, Amaral is the Erastus Otis Haven Professor and professor of engineering sciences and applied mathematics at Northwestern’s McCormick School of Engineering. Reese Richardson, a postdoctoral fellow in Amaral’s laboratory, is the paper’s first author.

Extensive analysis

When people think about scientific fraud, they might remember news reports of retracted papers, falsified data or plagiarism. These reports typically center around the isolated actions of one individual, who takes shortcuts to get ahead in an increasingly competitive industry. But Amaral and his team uncovered a widespread underground network operating within the shadows and outside of the public’s awareness.

“These networks are essentially criminal organizations, acting together to fake the process of science,” Amaral said. “Millions of dollars are involved in these processes.”

To conduct the study, the researchers analyzed extensive datasets of retracted publications, editorial records and instances of image duplication. Most of the data came from major aggregators of scientific literature, including Web of Science (WoS), Elsevier’s Scopus, National Library of Medicine’s PubMed/MEDLINE and OpenAlex, which includes data from Microsoft Academic Graph, Crossref, ORCID, Unpaywall and other institutional repositories.

Richardson and his colleagues also collected lists of de-indexed journals, which are scholarly journals that have been removed from databases for failing to meet certain quality or ethical standards. The researchers also included data on retracted articles from Retraction Watch, article comments from PubPeer and metadata — such as editor names, submission dates and acceptance dates — from articles published in specific journals.

Buying a reputation

After analyzing the data, the team uncovered coordinated efforts involving “paper mills,” brokers and infiltrated journals. Functioning much like factories, paper mills churn out large numbers of manuscripts, which they then sell to academics who want to quickly publish new work. These manuscripts are mostly low quality — featuring fabricated data, manipulated or even stolen images, plagiarized content and sometimes nonsensical or physically impossible claims.

“More and more scientists are being caught up in paper mills,” Amaral said. “Not only can they buy papers, but they can buy citations. Then, they can appear like well-reputed scientists when they have barely conducted their own research at all.”

“Paper mills operate by a variety of different models,” Richardson added. “So, we have only just been able to scratch the surface of how they operate. But they sell basically anything that can be used to launder a reputation. They often sell authorship slots for hundreds or even thousands of dollars. A person might pay more money for the first author position or less money for a fourth author position. People also can pay to get papers they have written automatically accepted in a journal through a sham peer-review process.”

To identify more articles originating from paper mills, the Amaral group launched a parallel project that automatically scans published materials science and engineering papers. The team specifically looked for authors who misidentified instruments they used in their research. A paper with those results was accepted by the journal PLOS ONE.

Brokers, hijacking and collusion

Amaral, Richardson and their collaborators found fraudulent networks use several key strategies: (1) Groups of researchers collude to publish papers across multiple journals. When their activities are discovered, the papers are subsequently retracted; (2) brokers serve as intermediaries to enable mass publication of fraudulent papers in compromised journals; (3) fraudulent activities are concentrated in specific, vulnerable subfields; and (4) organized entities evade quality-control measures, such as journal de-indexing.

“Brokers connect all the different people behind the scenes,” Amaral said. “You need to find someone to write the paper. You need to find people willing to pay to be the authors. You need to find a journal where you can get it all published. And you need editors in that journal who will accept that paper.”

Sometimes these organizations go around established journals altogether, searching instead for defunct journals to hijack. When a legitimate journal stops publishing, for example, bad actors can take over its name or website. These actors surreptitiously assume the journal’s identity, lending credibility to its fraudulent publications, despite the actual publication being defunct.

“This happened to the journal HIV Nursing,” Richardson said. “It was formerly the journal of a professional nursing organization in the U.K., then it stopped publishing, and its online domain lapsed. An organization bought the domain name and started publishing thousands of papers on subjects completely unrelated to nursing, all indexed in Scopus.”

Fighting for science

To combat this growing threat to legitimate scientific publishing, Amaral and Richardson emphasize the need for a multi-prong approach. This approach includes enhanced scrutiny of editorial processes, improved methods for detecting fabricated research, a greater understanding of the networks facilitating this misconduct and a radical restructuring of the system of incentives in science.

Amaral and Richardson also underscore the importance of addressing these issues before artificial intelligence (AI) infiltrates scientific literature more than it already has.

“If we’re not prepared to deal with the fraud that’s already occurring, then we’re certainly not prepared to deal with what generative AI can do to scientific literature,” Richardson said. “We have no clue what’s going to end up in the literature, what’s going to be regarded as scientific fact and what’s going to be used to train future AI models, which then will be used to write more papers.”

“This study is probably the most depressing project I’ve been involved with in my entire life,” Amaral said. “Since I was a kid, I was excited about science. It’s distressing to see others engage in fraud and in misleading others. But if you believe that science is useful and important for humanity, then you have to fight for it.”

Here’s a link to and a citation for the paper,

The entities enabling scientific fraud at scale are large, resilient, and growing rapidly by Reese A. K. Richardson, Spencer S. Hong, Jennifer A. Byrne, Thomas Stoeger, and Luís A. Nunes Amaral. Proceedings of the National Academy of Sciences August 4, 2025 122 (32) e2420092122 DOI: https://doi.org/10.1073/pnas.2420092122

This paper is open access.

And now—math fraud

A September 19, 2025 news item on ScienceDaily features an investigation into fraudulent math research, Note: A link has been removed,

An international team of authors led by Ilka Agricola, professor of mathematics at the University of Marburg, Germany, has investigated fraudulent practices in the publication of research results in mathematics on behalf of the German Mathematical Society (DMV) and the International Mathematical Union (IMU), documenting systematic fraud over many years. The results of the study were recently published on the preprint server arxiv.org and in the Notices of the American Mathematical Society (AMS) and have since caused a stir among mathematicians.

Sanjana Gajbhiye’s September ??, 2025 article for earth.com delves further into the topic, Note: Links have been removed,

Quality lost to quantity

The findings show how the definition of research quality has shifted. Instead of focusing on content, originality, and insight, institutions and individuals are increasingly evaluated by commercial metrics. These include the number of publications, total citations, and the so-called impact factor of journals.

Such measures, calculated by private companies with little transparency, have gained outsized influence. Providers promote their databases globally, and universities use them to enhance prestige and compete internationally.

This environment rewards quantity over quality, pushing academics to publish more, even when contributions are marginal or flawed.

Fraudulent companies have seized this opportunity. They sell services that manipulate rankings, offering ghostwritten articles, fake peer reviews, and even bundles of citations. For individuals, this can mean better career prospects.

For universities, it can result in higher rankings, increased funding, and greater appeal to international students. The collateral damage is a growing pool of unread publications that add nothing to scientific understanding.

Fake mathematics success

The report documents striking examples that reveal how metrics can produce absurd outcomes. In 2019, Clarivate Inc., the market leader for citation data, ranked a Taiwanese university as having the most world-class mathematicians. The catch was startling: mathematics was not even offered at the institution.

Mathematical trust under threat

“‘Fake science’ is not only annoying, it is a danger to science and society,” said IMU Secretary General Professor Christoph Sorger.

“Because you don’t know what is valid and what is not. Targeted disinformation undermines trust in science and also makes it difficult for us mathematicians to decide which results can be used as a basis for further research.”

This erosion of trust strikes at the heart of mathematics. Proofs rely on certainty, yet when fraudulent or hollow work appears in respected outlets, that certainty weakens.

Fixing trust in mathematics publishing

The commission’s work does not end with exposing the problem. It also outlines possible solutions for a healthier publication system. These recommendations emphasize the need to strengthen peer review, encourage collaboration among journals, and recenter the evaluation of research on quality rather than raw numbers.

Metrics are deeply tied to funding and prestige, so the shift won’t be simple, but it could reshape the landscape for future generations.

A September 20, 2025 Castle Journal blog posting provides more information,

The “Culture of Numbers” and its Consequences

The study, led by Professor Ilka Agricola of the University of Marburg, argues that the root cause of the problem is a “culture of numbers” that prioritizes commercial metrics over scientific content. Universities and research institutions have become increasingly reliant on commercial databases like Clarivate’s Journal Citation Reports (JCR) to evaluate researchers. These metrics, which are not transparent and are not vetted by the scientific community, have become the main currency for career progression, grants, and prestige.

 * “Megajournals”: The study highlights the rise of “megajournals,” which publish anything as long as the authors pay a fee. These journals now publish more articles per year than all reputable mathematics journals combined. The report cites a shocking example where a commercial database ranked a university in Taiwan as having the most world-class researchers in mathematics, despite the fact that the university does not even offer mathematics as a subject.

 * Paper Mills and Citation Cartels: The investigation found evidence of “paper mills,” which sell fabricated papers to researchers, and “citation cartels,” where academics agree to cite each other’s work to artificially inflate their metrics. These services are offered anonymously online, with prices for articles and citations ranging from hundreds to thousands of dollars. The report describes these networks as “criminal organizations” that have invaded the “ecosystem” of scientific publishing.
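One crude signal of the "citation cartels" described above is a pair or group of authors whose citations of each other far exceed what outside readers give them. The following sketch is illustrative only; the threshold and the pairwise (rather than group-level) screen are assumptions, not methods from the report.

```python
# Illustrative screen for reciprocal-citation pairs in a citation graph.
# The min_each_way threshold is a hypothetical screening parameter.
from collections import defaultdict
from itertools import combinations

def mutual_citation_pairs(citations, min_each_way=5):
    """citations: iterable of (citing_author, cited_author) pairs.
    Returns author pairs who cite each other at least `min_each_way`
    times in each direction."""
    counts = defaultdict(int)
    for citing, cited in citations:
        counts[(citing, cited)] += 1
    authors = {a for pair in counts for a in pair}
    flagged = []
    for a, b in combinations(sorted(authors), 2):
        if counts[(a, b)] >= min_each_way and counts[(b, a)] >= min_each_way:
            flagged.append((a, b))
    return flagged

# Two authors trading citations are flagged; a one-way citer is not.
data = [("A", "B")] * 6 + [("B", "A")] * 6 + [("C", "A")] * 9
print(mutual_citation_pairs(data))  # [('A', 'B')]
```

Real cartels span larger rings and multiple journals, so group-level methods (e.g., dense-subgraph detection) would be needed in practice; the pairwise check only shows the underlying idea.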

Ilka Agricola gave an interview to Retraction Watch. From the undated article, Note: Links have been removed,

A pair of papers posted to the arXiv addresses the issue of fraudulent publishing in math, particularly metrics gaming, and offers a list of recommendations to help detect and deal with that problem and other fraudulent activities. (The former was also published in the October AMS Notices; the latter will appear in the November issue.) “Fraudulent publishing undermines trust in science and scientific results and therefore fuels antiscience movements,” mathematician Ilka Agricola, lead author of both papers, told Retraction Watch. 

A professor of mathematics at Marburg University in Germany, Agricola was president of the German Mathematical Society in 2021-2022 and is chair of the Committee on Publishing of the International Mathematical Union. The new articles are the products of a working group of the IMU and the International Council of Industrial and Applied Mathematics. 

Retraction Watch: As you note in the new papers, Clarivate announced in 2023 it had excluded the entire field of math from its list of “Highly Cited Researchers,” or HCRs. What’s going on?

Agricola: The publication culture in math differs a bit from, say, experimental and life sciences. On average, mathematicians publish fewer papers with fewer authors than scientists in other fields. So, with the same absolute number of papers and citations, one can become a “highly-cited researcher” in math, but not in other fields. Thus, gaming the system is easier. 

The list of HCRs for mathematics became so screwed that Clarivate couldn’t pretend anymore that it had any value. This being said, Clarivate announced that they would look into new measuring tools, but didn’t come up with any alternative ideas in the meantime, nor did they contact any representatives of the international mathematical community. 

Retraction Watch:  Few people talk about fraudulent publishing in math. Why is that?

Agricola: For a long time, mathematicians thought that as long as they keep away from predatory journals or paper mills, the problem does not affect them. This turned out to be wrong. 

Retraction Watch: If you look at the number of papers that tripped Clear Skies’ Papermill Alarm in 2022 (we included a histogram in this article we wrote for The Conversation [link and excerpts follow]), math is pretty far down the list. Are there a lot of fake papers in math?

Agricola: It is probably fair to say that the problem is not as severe as in other fields like cancer research, but the community is smaller and the number of fake papers is growing at alarming speed. Predatory and low-quality mega-journals are trying hard to lure respected scientists into their parallel universe of fake science, thus trying to give themselves the impression of respectability. Thus, one of our goals is to raise awareness for the issue in the mathematical community!

Retraction Watch: You and your coauthors are mathematicians, and yet you argue against focusing on numbers like journal impact factors and publication and citation counts. Is that what’s driving all of this bad behavior?

Agricola: “When a measure becomes a target, it ceases to be a good measure.” This quote is from the British economist Charles Goodhart, and it also applies to bibliometrics measures. Of course, gaming these metrics has always existed, but some of us liked to believe that they would be roughly OK, with some error bar due to some cheating. Now, we realize the error bar is larger than the number one wants to measure. Perhaps one advantage of mathematicians is that they are not easily impressed by numbers, and we have the means to understand and analyze them — this is our job. And so, the conclusion is very clear: The correlation between bibliometrics and research quality is so low that we should not use bibliometrics. And I urge all colleagues to say so openly!

Retraction Watch: So how do we judge research quality if we shouldn’t use publication metrics?

Agricola: Read the actual publications instead of relying on bibliometrics! Plus, in mathematics, we are lucky to have two extremely well curated databases for math papers and journals, zbMath Open and MathReviews. If a journal is not included there, it’s either very interdisciplinary or one should get suspicious.

Retraction Watch: Is it possible for individual researchers to jump off the bibliometrics bandwagon without jeopardizing their careers?

Agricola: We need to fight for a change in culture, that’s for sure, and the path will be rash and hard. To young researchers, we should give the warning that being involved in predatory publishing can also just as well put their scientific integrity at risk. Remember the people who had to resign because of data falsification? 

I am providing citations (of a sort) for both papers, along with links to the three sites where each can be found and PDFs for both. Everything is open access.

Fraudulent Publishing in the Mathematical Sciences by Ilka Agricola, Lynn Heller, Wil Schilders, Moritz Schubotz, Peter Taylor, Luis Vega.

arXiv: https://arxiv.org/abs/2509.07257

AMS (American Mathematical Society) Notices October 2025: https://www.ams.org/journals/notices/202509/noti3217/noti3217.html?adat=October%202025&trk=3217&pdfissue=202509&pdffile=rnoti-p1038.pdf&cat=none&type=.html

Ilka Agricola’s ResearchGate website: https://www.researchgate.net/profile/Ilka-Agricola (scroll down to see the listed papers)

PDF: https://www.ams.org/journals/notices/202509/rnoti-p1038.pdf

How to Fight Fraudulent Publishing in the Mathematical Sciences: Joint Recommendations of the IMU [International Mathematical Union] and the ICIAM [International Council for Industrial and Applied Mathematics] by Ilka Agricola, Lynn Heller, Wil Schilders, Moritz Schubotz, Peter Taylor, Luis Vega.

arXiv: https://arxiv.org/abs/2509.09877

AMS (American Mathematical Society) Notices November 2025: https://www.ams.org/journals/notices/202510/noti3266/noti3266.html?adat=November%202025&trk=3266&pdfissue=202510&pdffile=rnoti-p1179.pdf&cat=none&type=.html

Ilka Agricola’s ResearchGate website: https://www.researchgate.net/profile/Ilka-Agricola (scroll down to see the listed papers)

PDF: https://www.ams.org/journals/notices/202510/rnoti-p1179.pdf

Fraud slows down research

Mentioned in the Retraction Watch/Agricola interview, this January 29, 2025 article by Frederik Joelving (contributing editor, Retraction Watch), Cyril Labbé (professor of computer science, Université Grenoble Alpes [UGA]), and Guillaume Cabanac (professor of computer science, Institut de Recherche en Informatique de Toulouse) is chilling, Note: Links have been removed,

Over the past decade, furtive commercial entities around the world have industrialized the production, sale and dissemination of bogus scholarly research, undermining the literature that everyone from doctors to engineers rely on to make decisions about human lives.

It is exceedingly difficult to get a handle on exactly how big the problem is. Around 55,000 scholarly papers have been retracted to date, for a variety of reasons, but scientists and companies who screen the scientific literature for telltale signs of fraud estimate that there are many more fake papers circulating – possibly as many as several hundred thousand. This fake research can confound legitimate researchers who must wade through dense equations, evidence, images and methodologies only to find that they were made up.

Even when the bogus papers are spotted – usually by amateur sleuths on their own time – academic journals are often slow to retract the papers, allowing the articles to taint what many consider sacrosanct: the vast global library of scholarly work that introduces new ideas, reviews other research and discusses findings.

These fake papers are slowing down research that has helped millions of people with lifesaving medicine and therapies from cancer to COVID-19. Analysts’ data shows that fields related to cancer and medicine are particularly hard hit, while areas like philosophy and art are less affected. Some scientists have abandoned their life’s work because they cannot keep pace given the number of fake papers they must bat down.

The problem reflects a worldwide commodification of science. Universities, and their research funders, have long used regular publication in academic journals as requirements for promotions and job security, spawning the mantra “publish or perish.”

But now, fraudsters have infiltrated the academic publishing industry to prioritize profits over scholarship. Equipped with technological prowess, agility and vast networks of corrupt researchers, they are churning out papers on everything from obscure genes to artificial intelligence in medicine.

These papers are absorbed into the worldwide library of research faster than they can be weeded out. About 119,000 scholarly journal articles and conference papers are published globally every week, or more than 6 million a year. Publishers estimate that, at most journals, about 2% of the papers submitted – but not necessarily published – are likely fake, although this number can be much higher at some publications.

… there is a bustling online underground economy for all things scholarly publishing. Authorship, citations, even academic journal editors, are up for sale. This fraud is so prevalent that it has its own name: paper mills, a phrase that harks back to “term-paper mills,” where students cheat by getting someone else to write a class paper for them.

The impact on publishers is profound. In high-profile cases, fake articles can hurt a journal’s bottom line. Important scientific indexes – databases of academic publications that many researchers rely on to do their work – may delist journals that publish too many compromised papers. There is growing criticism that legitimate publishers could do more to track and blacklist journals and authors who regularly publish fake papers that are sometimes little more than artificial intelligence-generated phrases strung together.

To better understand the scope, ramifications and potential solutions of this metastasizing assault on science, we – a contributing editor at Retraction Watch, a website that reports on retractions of scientific papers and related topics, and two computer scientists at France’s Université Toulouse III–Paul Sabatier and Université Grenoble Alpes who specialize in detecting bogus publications – spent six months investigating paper mills.

This included, by some of us at different times, trawling websites and social media posts, interviewing publishers, editors, research-integrity experts, scientists, doctors, sociologists and scientific sleuths engaged in the Sisyphean task of cleaning up the literature. It also involved, by some of us, screening scientific articles looking for signs of fakery.

What emerged is a deep-rooted crisis that has many researchers and policymakers calling for a new way for universities and many governments to evaluate and reward academics and health professionals across the globe.

Just as highly biased websites dressed up to look like objective reporting are gnawing away at evidence-based journalism and threatening elections, fake science is grinding down the knowledge base on which modern society rests.

The January 29, 2025 article highlights a number of problems including these,

To expedite the publication of one another’s work, some corrupt scientists form peer review rings. Paper mills may even create fake peer reviewers impersonating real scientists to ensure their manuscripts make it through to publication. Others bribe editors or plant agents on journal editorial boards.

María de los Ángeles Oviedo-García, a professor of marketing at the University of Seville in Spain, spends her spare time hunting for suspect peer reviews from all areas of science, hundreds of which she has flagged on PubPeer. …

“One of the demanding fights for me is to keep faith in science,” says Oviedo-García, who tells her students to look up papers on PubPeer before relying on them too heavily. Her research has been slowed down, she adds, because she now feels compelled to look for peer review reports for studies she uses in her work. Often there aren’t any, because “very few journals publish those review reports,” Oviedo-García says.

An ‘absolutely huge’ problem

It is unclear when paper mills began to operate at scale. The earliest article retracted due to suspected involvement of such agencies was published in 2004, according to the Retraction Watch Database, which contains details about tens of thousands of retractions. (The database is operated by The Center for Scientific Integrity, the parent nonprofit of Retraction Watch.) Nor is it clear exactly how many low-quality, plagiarized or made-up articles paper mills have spawned.

“The threat of paper mills to scientific publishing and integrity has no parallel over my 30-year scientific career …. In the field of human gene science alone, the number of potentially fraudulent articles could exceed 100,000 original papers,” she [Jennifer Byrne, an Australian scientist] wrote to lawmakers, adding, “This estimate may seem shocking but is likely to be conservative.”

In one area of genetics research – the study of noncoding RNA in different types of cancer – “We’re talking about more than 50% of papers published are from mills,” Byrne said. “It’s like swimming in garbage.”

… in the global south, the publish-or-perish edict runs up against underdeveloped research infrastructures and education systems, leaving scientists in a bind. For a Ph.D., the Cairo physician who requested anonymity conducted an entire clinical trial single-handedly – from purchasing study medication to randomizing patients, collecting and analyzing data and paying article-processing fees. In wealthier nations, entire teams work on such studies, with the tab easily running into the hundreds of thousands of dollars.

“Research is quite challenging here,” the physician said. That’s why scientists “try to manipulate and find easier ways so they get the job done.”

Institutions, too, have gamed the system with an eye to international rankings. In 2011, the journal Science described how prolific researchers in the United States and Europe were offered hefty payments for listing Saudi universities as secondary affiliations on papers. And in 2023, the magazine, in collaboration with Retraction Watch, uncovered a massive self-citation ploy by a top-ranked dental school in India that forced undergraduate students to publish papers referencing faculty work.

According to the January 29, 2025 article, there is a root cause, Note: Links have been removed,

… unsavory schemes can be traced back to the introduction of performance-based metrics in academia, a development driven by the New Public Management movement that swept across the Western world in the 1980s, according to Canadian sociologist of science Yves Gingras of the Université du Québec à Montréal. When universities and public institutions adopted corporate management, scientific papers became “accounting units” used to evaluate and reward scientific productivity rather than “knowledge units” advancing our insight into the world around us, Gingras wrote.

This transformation led many researchers to compete on numbers instead of content, which made publication metrics poor measures of academic prowess. As Gingras has shown, the controversial French microbiologist Didier Raoult, who now has more than a dozen retractions to his name, has an h-index – a measure combining publication and citation numbers – that is twice as high as that of Albert Einstein – “proof that the index is absurd,” Gingras said.
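For readers unfamiliar with the h-index Gingras mentions, here is a minimal Python sketch (my own illustration, not from any of the articles quoted) of how it is computed: the h-index is the largest h such that an author has at least h papers with at least h citations each.

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers with at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank  # still have `rank` papers with >= `rank` citations
        else:
            break
    return h

# Nine modestly cited papers outrank a single landmark paper:
print(h_index([9, 9, 9, 9, 9, 9, 9, 9, 9]))  # → 9
print(h_index([10000]))                       # → 1
```

The toy values show the distortion Gingras is pointing at: the metric rewards publication volume, so a steady stream of unremarkable papers can produce a higher h-index than one field-defining result.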

Worse, a sort of scientific inflation, or “scientometric bubble,” has ensued, with each new publication representing an increasingly small increment in knowledge. “We publish more and more superficial papers, we publish papers that have to be corrected, and we push people to do fraud,” said Gingras.

In terms of career prospects of individual academics, too, the average value of a publication has plummeted, triggering a rise in the number of hyperprolific authors. One of the most notorious cases is Spanish chemist Rafael Luque, who in 2023 reportedly published a study every 37 hours.

There is some hope according to the January 29, 2025 article, Note: Links have been removed,

Stern [Bodo Stern, a former editor of the journal Cell and chief of Strategic Initiatives at Howard Hughes Medical Institute] isn’t the first scientist to bemoan the excessive focus on bibliometrics. “We need less research, better research, and research done for the right reasons,” wrote the late statistician Douglas G. Altman in a much-cited editorial from 1994. “Abandoning using the number of publications as a measure of ability would be a start.”

Nearly two decades later, a group of some 150 scientists and 75 science organizations released the San Francisco Declaration on Research Assessment, or DORA, discouraging the use of the journal impact factor and other measures as proxies for quality. The 2013 declaration has since been signed by more than 25,000 individuals and organizations in 165 countries.

Despite the declaration, metrics remain in wide use today, and scientists say there is a new sense of urgency.

Stern and his colleagues have tried to make improvements at their institution. Researchers who wish to renew their seven-year contract have long been required to write a short paragraph describing the importance of their major results. Since the end of 2023, they also have been asked to remove journal names from their applications.

That way, “you can never do what all reviewers do – I’ve done it – look at the bibliography and in just one second decide, ‘Oh, this person has been productive because they have published many papers and they’re published in the right journals,’” says Stern. “What matters is, did it really make a difference?”

Shifting the focus away from convenient performance metrics seems possible not just for wealthy private institutions like Howard Hughes Medical Institute, but also for large government funders. In Australia, for example, the National Health and Medical Research Council in 2022 launched the “top 10 in 10” policy, aiming, in part, to “value research quality rather than quantity of publications.”

Rather than providing their entire bibliography, the agency, which assesses thousands of grant applications every year, asked researchers to list no more than 10 publications from the past decade and explain the contribution each had made to science. …

Gingras, the Canadian sociologist, advocates giving scientists the time they need to produce work that matters, rather than a gushing stream of publications. He is a signatory to the Slow Science Manifesto: “Once you get slow science, I can predict that the number of corrigenda, the number of retractions, will go down,” he says.

At one point, Gingras was involved in evaluating a research organization whose mission was to improve workplace security. An employee presented his work. “He had a sentence I will never forget,” Gingras recalls. The employee began by saying, “‘You know, I’m proud of one thing: My h-index is zero.’ And it was brilliant.” The scientist had developed a technology that prevented fatal falls among construction workers. “He said, ‘That’s useful, and that’s my job.’ I said, ‘Bravo!’”

Sometimes, there’s a science reporting problem

A September 3, 2025 Universiteit van Amsterdam press release (also on EurekAlert) highlights a problem with science reporting and overconfidence,

Science journalists aren’t particularly concerned about so-called “predatory journals”, confident that they have the skills and intuition needed to avoid reporting on problematic research. For many, a journal’s reputation and name-recognition are decisive factors in assessing the quality of scientific research – but this could be exacerbating existing imbalances in science and journalism. This perspective emerges from a new study, led by Dr Alice Fleerackers of the University of Amsterdam (UvA), and published on 2 September [2025] in Journalism Practice.

Predatory journals prioritise profit over editorial and publication standards. They often charge researchers publication fees but offer little to no real quality control, such as peer review. As a result, some journals publish almost everything submitted. ‘Predatory journals are not a harmless side effect of the academic publishing industry,’ says Fleerackers. ‘They are becoming increasingly common, raising concerns about the integrity of scientific publishing. They not only undermine the reliability of science but also jeopardise science journalism, as journalists can unknowingly report on weak or even flawed research.’

In the new study, Fleerackers – along with colleagues from Simon Fraser University (Canada) and San Francisco State University (US) – investigated how science journalists view predatory journals and what strategies they employ to ensure the reliability of the journals they report on. The researchers present a qualitative analysis of interviews with 23 health, science, and environmental journalists in Europe and North America.

Problematic, but only in theory

Some of the journalists interviewed were familiar with the phenomenon of predatory journals and acknowledged that they are theoretically problematic. However, most weren’t concerned that they might be using them in their own work. They acknowledged that these journals might be a problem for colleagues, but not for them.

Well-known, therefore reliable

Journalists in the study were confident they wouldn’t fall for a predatory journal because of their strong intuition, which they said allowed them to immediately distinguish high-quality from problematic research. Besides their intuition, they also relied on strategies for verifying the reliability of research that they had developed through years of experience. These strategies often centred on trust proxies – like the journal’s prestige, impact factor, and selectivity – as well as whether the journal claimed to conduct peer review.

Proofreading also played a role for some journalists: if an article contained grammatical or spelling errors, it could be a sign of low-quality research. Open access journals were also considered less reliable by several journalists. ‘But by far the most commonly used benchmark for reliability was the journal’s reputation,’ Fleerackers explains. ‘Some journalists avoid all journals they’re not familiar with and report only on research published in top journals like Science and Nature.’

Distortion in science news

According to Fleerackers, journalists’ focus on the reputation and prestige of journals has major consequences for the diversity of research in the news media. ‘Research from newer, lesser-known journals, and from journals in the Global South, for example, remains hidden from the public. Most journalists in our study didn’t realise that their selection strategies could perpetuate the existing imbalance in science news. I hope that our study can raise awareness of this among journalists.’


Here’s a link to and a citation for the paper.

“I’d Like to Think I’d Be Able to Spot One”: How Journalists Navigate Predatory Journals by Alice Fleerackers, Laura L. Moorhead & Juan Pablo Alperin. Journalism Practice 1–19. DOI: https://doi.org/10.1080/17512786.2025.2551984 Published online: 02 Sep 2025

Final comments

This has been a good wake-up call for me. Bad apples, yes, but criminal networks? I had no idea. I will probably write more about this in my 2025 year-end post. In the meantime, this is a good reminder to exercise caution.

Nanomaterials used to measure nuclear reaction in radioactive nuclei produced in neutron star collisions

There seems to be renewed interest in nuclear science as measured by the frequency of the research I’m stumbling across and as evidenced by this March 18, 2025 news item on phys.org,

Physicists have measured a nuclear reaction that can occur in neutron star collisions, providing direct experimental data for a process that had previously only been theorized. The study, led by the University of Surrey, provides new insight into how the universe’s heaviest elements are forged—and could even drive advancements in nuclear reactor physics.

Working in collaboration with the University of York, the University of Seville, and TRIUMF, Canada’s national particle accelerator centre, the breakthrough marks the first-ever measurement of a weak r-process reaction cross-section using a radioactive ion beam, in this case studying the 94Sr(α,n)97Zr reaction. This is where a radioactive form of strontium (strontium-94) absorbs an alpha particle (a helium nucleus), then emits a neutron and transforms into zirconium-97.
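As a quick sanity check on the reaction notation, mass number (A) and charge (Z) must balance on both sides of 94Sr(α,n)97Zr. This little Python sketch (my own illustration, not part of the published analysis) makes the bookkeeping explicit:

```python
# 94Sr(α,n)97Zr: strontium-94 absorbs an alpha particle (helium-4)
# and emits a neutron, leaving zirconium-97.
sr94    = {"A": 94, "Z": 38}  # strontium-94
alpha   = {"A": 4,  "Z": 2}   # helium-4 nucleus
zr97    = {"A": 97, "Z": 40}  # zirconium-97
neutron = {"A": 1,  "Z": 0}

# Mass number and charge are conserved: 94 + 4 = 97 + 1 and 38 + 2 = 40 + 0.
assert sr94["A"] + alpha["A"] == zr97["A"] + neutron["A"]
assert sr94["Z"] + alpha["Z"] == zr97["Z"] + neutron["Z"]
print("94Sr(α,n)97Zr balances in mass number and charge")
```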

I’ve highlighted the mention of nanomaterials in the March 18, 2025 University of Surrey press release (also on EurekAlert), which originated the news item,

Dr Matthew Williams, lead author of the study from the University of Surrey, said: 

“The weak r-process plays a crucial role in the formation of heavy elements, which astronomers have observed in ancient stars – celestial fossils that carry the chemical fingerprints of perhaps only one prior cataclysmic event, like a supernova or neutron star merger. Until now, our understanding of how these elements form has relied on theoretical predictions, but this experiment provides the first real-world data to test those models that involve radioactive nuclei.” 

The experiment was enabled by the use of novel helium targets. Since helium is a noble gas, meaning it is neither reactive nor solid, researchers at the University of Seville developed an innovative nano-material target, embedding helium inside ultra-thin silicon films to form billions of microscopic helium bubbles, each only a few tens of nanometres across.

Using TRIUMF’s advanced radioactive ion beam technology, the team accelerated short-lived strontium-94 isotopes into these targets, allowing them to measure the nuclear reaction under conditions similar to those found in extreme cosmic environments.  

Dr Williams said: 

“This is a major achievement for astrophysics and nuclear physics, and the first time nanomaterials have been used in this way, opening exciting new possibilities for nuclear research.  

“Beyond astrophysics, understanding how radioactive nuclei behave is crucial for improving nuclear reactor design. These types of nuclei are constantly produced in nuclear reactors, but until recently, studying their reactions has been extremely difficult. Reactor physics depends on this kind of data to predict how often components need replacing, how long they’ll last and how to design more efficient, modern systems.” 

The next phase of research will apply the findings to astrophysical models, helping scientists to better understand the origins of the heaviest known elements. As researchers continue to explore these processes, their work could deepen our understanding of both the extreme physics of neutron star collisions and practical applications in nuclear technology. 

Here’s a citation and a link to the paper,

First Measurement of a Weak 𝑟-Process Reaction on a Radioactive Nucleus by M. Williams, C. Angus, A. M. Laird, B. Davids, C. Aa. Diget, A. Fernandez, E. J. Williams, A. N. Andreyev, H. Asch, A. A. Avaa, G. Bartram, S. Chakraborty, I. Dillmann, K. Directo, D. T. Doherty, E. Geerlof, C. J. Griffin, A. Grimes, G. Hackman, J. Henderson, K. Hudson, D. Hufschmidt, J. Jeong, M. C. Jiménez de Haro, V. Karayonchev, A. Katrusiak, A. Lennarz, G. Lotay, B. Marlow, M. S. Martin, S. Molló, F. Montes, J. R. Murias, J. O’Neill, K. Pak, C. Paxman, L. Pedro-Botet, A. Psaltis, E. Raleigh-Smith, D. Rhodes, J. S. Rojo, M. Satrazani, T. Sauvage, C. Shenton, C. E. Svensson, D. Tam, L. Wagner, and D. Yates. Phys. Rev. Lett. 134, 112701. Published 17 March 2025, Vol. 134, Iss. 11. DOI: https://doi.org/10.1103/PhysRevLett.134.112701

This paper is behind a paywall.

The nuclear fusion energy race

In addition to the competition to develop commercial quantum computing, there's the competition to develop commercial nuclear fusion energy. I have four stories about nuclear fusion: one from Spain, one from China, one from the US, and one from Vancouver. There are also a couple of segues into history and the recently (April 2, 2025) announced US tariffs (chaos has since ensued as these have become ‘on again/off again’ tariffs), but the bulk of this posting is focused on the latest (January to early April 2025) in fusion energy.

Fission nuclear energy, where atoms are split, is better known; fusion energy is released when atomic nuclei merge, the process that powers stars. For anyone unfamiliar with the word tokamak as applied to nuclear fusion (which is mentioned in all the stories), you can find out more in the Tokamak Wikipedia entry.

Spain

A January 21, 2025 news item on phys.org announces the first plasma generated by a tokamak,

In a pioneering approach to achieve fusion energy, the SMART device has successfully generated its first tokamak plasma. This step brings the international fusion community closer to achieving sustainable, clean, and virtually limitless energy through controlled fusion reactions.

A January 21, 2025 University of Seville press release on EurekAlert, which originated the news item, provides some explanations and more detail about the work, Note: Links have been removed,

The SMART tokamak, a state-of-the-art experimental fusion device designed, constructed and operated by the Plasma Science and Fusion Technology Laboratory of the University of Seville, is a worldwide unique spherical tokamak due to its flexible shaping capabilities. SMART has been designed to demonstrate the unique physics and engineering properties of Negative Triangularity shaped plasmas towards compact fusion power plants based on Spherical Tokamaks.

Prof. Manuel García Muñoz, Principal Investigator of the SMART tokamak, stated: “This is an important achievement for the entire team as we are now entering the operational phase of SMART. The SMART approach is a potential game changer with attractive fusion performance and power handling for future compact fusion reactors. We have exciting times ahead!”

Prof. Eleonora Viezzer, co-PI of the SMART project, adds: “We were all very excited to see the first magnetically confined plasma and are looking forward to exploiting the capabilities of the SMART device together with the international scientific community. SMART has awoken great interest worldwide.”

When negative becomes positive and compact

The triangularity describes the shape of the plasma. Most tokamaks operate with positive triangularity, meaning that the plasma shape looks like a D. When the D is mirrored (as shown in the figure on the right), the plasma has negative triangularity.

Negative triangularity plasma shapes offer enhanced performance because they suppress instabilities that expel particles and energy from the plasma, preventing severe damage to the tokamak wall. Besides offering high fusion performance, negative triangularity also features attractive power handling solutions, given that it covers a larger divertor area for distributing the heat exhaust. This also facilitates the engineering design for future compact fusion power plants.
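To make the "mirrored D" concrete: tokamak plasma cross-sections are commonly described with a Miller-style parameterization in which a triangularity parameter δ skews the shape, and flipping its sign mirrors the D. The Python sketch below is my own illustration with made-up values for major radius R0, minor radius a, elongation κ, and δ; it is not the actual SMART geometry.

```python
import math

def plasma_boundary(theta, R0=1.0, a=0.3, kappa=1.8, delta=0.4):
    """Miller-style parameterization of a tokamak plasma cross-section.
    delta > 0 gives the usual D shape; delta < 0 mirrors it
    (negative triangularity, as in SMART)."""
    R = R0 + a * math.cos(theta + math.asin(delta) * math.sin(theta))
    Z = kappa * a * math.sin(theta)
    return R, Z

# The outboard midplane point (theta = 0) is the same either way; the
# difference shows up at the top of the plasma (theta = pi/2), where
# positive delta pulls the peak inward (R < R0) and negative delta
# pushes it outward (R > R0) -- the mirrored D.
for d in (0.4, -0.4):
    R_top, _ = plasma_boundary(math.pi / 2, delta=d)
    print(f"delta = {d:+.1f}: R at top of plasma = {R_top:.3f}")
```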

Fusion2Grid aimed at developing the foundation for the most compact fusion power plant

SMART is the first step in the Fusion2Grid strategy led by the PSFT team and, in collaboration with the international fusion community, is aimed at the most compact and most efficient magnetically confined fusion power plant based on Negative Triangularity shaped Spherical Tokamaks.

SMART will be the first compact spherical tokamak operating at fusion temperatures with negative triangularity shaped plasmas.

The objective of SMART is to provide the physics and engineering basis for the most compact design of a fusion power plant based on high-field Spherical Tokamaks combined with Negative Triangularity. The solenoid-driven plasma represents a major achievement in the timeline of getting SMART online and advancing towards the most compact fusion device.

The Plasma Science and Fusion Technology Lab of the University of Seville hosts the SMall Aspect Ratio Tokamak (SMART) and leads several worldwide efforts on energetic particles and plasma transport and stability towards the development of magnetically confined fusion energy.

Here’s a link to and a citation for the paper,

Performance prediction applying different reduced turbulence models to the SMART tokamak by D.J. Cruz-Zabala, M. Podestà, F. Poli, S.M. Kaye, M. Garcia-Munoz, E. Viezzer and J.W. Berkery. Nuclear Fusion, Volume 64, Number 12. DOI: 10.1088/1741-4326/ad8a70. Published 7 November 2024 © 2024 The Author(s). Published by IOP Publishing Ltd on behalf of the IAEA (International Atomic Energy Agency)

This paper is open access.

China

Caption: The Experimental Advanced Superconducting Tokamak achieved a remarkable scientific milestone by maintaining steady-state high-confinement plasma operation for an impressive 1,066 seconds. Credit: Image by HFIPS (Hefei Institutes of Physical Science at the Chinese Academy of Sciences)

China has made a business announcement and there is no academic paper mentioned in their January 21, 2025 press release on EurekAlert (also available on phys.org as a January 21, 2025 news item), Note: A link has been removed,

The Experimental Advanced Superconducting Tokamak (EAST), commonly known as China’s “artificial sun,” has achieved a remarkable scientific milestone by maintaining steady-state high-confinement plasma operation for an impressive 1,066 seconds. This accomplishment, reached on Monday, sets a new world record and marks a significant breakthrough in the pursuit of fusion power generation.

The duration of 1,066 seconds is a critical advancement in fusion research. This milestone, achieved by the Institute of Plasma Physics (ASIPP) at Hefei Institutes of Physical Scienece [sic] (HFIPS) of the Chinese Academy of Sciences, far surpasses the previous world record of 403 seconds, also set by EAST in 2023.

The ultimate goal of developing an artificial sun is to replicate the nuclear fusion processes that occurr [sic] in the sun, providing humanity with a limitless and clean energy source, and enabling exploration beyond our solar system.

Scientists worldwide have dedicated over 70 years to this ambitious goal. However, generating electricity from a nuclear fusion device involves overcoming key challenges, including reaching temperatures exceeding 100 million degrees Celsius, maintaining stable long-term operation, and ensuring precise control of the fusion process.

“A fusion device must achieve stable operation at high efficiency for thousands of seconds to enable the self-sustaining circulation of plasma, which is essential for the continuous power generation of future fusion plants,” said SONG Yuntao, ASIPP director and also vice president of HFIPS. He said that the recent record is monumental, marking a critical step toward realizing a functional fusion reactor.

According to GONG Xianzu, head of the EAST Physics and Experimental Operations division, several systems of the EAST device have been upgraded since the last round of experiments. For example, the heating system, which previously operated at the equivalent power of nearly 70,000 household microwave ovens, has now doubled its power output while maintaining stability and continuity.
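The "70,000 household microwave ovens" comparison can be turned into a rough number. Assuming a typical microwave oven draws about 1 kW (my assumption; the press release gives no figure), the implied heating power works out as follows:

```python
# Rough arithmetic on the EAST heating-system comparison quoted above.
# Assumption (mine, not the press release's): one household microwave
# oven draws about 1 kW.
microwave_kw = 1.0
n_ovens = 70_000

heating_mw = n_ovens * microwave_kw / 1000  # kW -> MW
print(f"Implied heating power: about {heating_mw:.0f} MW")  # → about 70 MW
```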

Since its inception in 2006, EAST has served as an open testing platform for both Chinese and international scientists to conduct fusion-related experiments and research.

China officially joined the International Thermonuclear Experimental Reactor (ITER) program in 2006 as its seventh member. Under the agreement, China is responsible for approximately 9 percent of the project’s construction and operation, with ASIPP serving as the primary institution for the Chinese mission.

ITER, currently under construction in southern France, is set to become the world’s largest magnetic confinement plasma physics experiment and the largest experimental tokamak nuclear fusion reactor upon completion.

In recent years, EAST has consistently achieved groundbreaking advancements in high-confinement mode, a fundamental operational mode for experimental fusion reactors like ITER and the future China Fusion Engineering Test Reactor (CFETR). These accomplishments provide invaluable insights and references for the global development of fusion reactors.

“We hope to expand international collaboration via EAST and bring fusion energy into practical use for humanity,” said SONG.

In Hefei, Anhui Province, China, where EAST is loacated [sic], a new generation of experimental fusion research facilities is currently under construction. These facilities aim to further accelerate the development and application of fusion energy.

I always feel a little less confident about the information when there are mistakes. Three typos in the same press release? Maybe someone forgot to give it a final once-over?

US

Despite the Cambridge University Press mention, this March 27, 2025 Cambridge University Press press release (also on EurekAlert) is about a US development,

Successfully harnessing the power of fusion energy could lead to cleaner and safer energy for all – and contribute substantially to combatting [UK spelling] the climate crisis. Towards this goal, Type One Energy has published a comprehensive, self-consistent, and robust physics basis for a practical fusion pilot power plant.  

This groundbreaking research is presented in a series of six peer-reviewed scientific papers in a special issue of the prestigious Journal of Plasma Physics (JPP), published by Cambridge University Press. 

The articles serve as the foundation for the company’s first fusion power plant project, which Type One Energy is developing with the Tennessee Valley Authority utility in the United States.  

Alex Schekochihin, Professor of Theoretical Physics at the University of Oxford and Editor of the JPP, spoke with enthusiasm about this development: 

“JPP is very proud to provide a platform for rigorous peer review and publication of the papers presenting the physics basis of the Infinity Two stellarator — an innovative and ground-breaking addition to the expanding family of proposed fusion power plant designs.  

“Fusion science and technology are experiencing a period of very rapid development, driven by both public and private enthusiasm for fusion power. In this environment of creative and entrepreneurial ferment, it is crucial that new ideas and designs are both publicly shared and thoroughly scrutinised by the scientific community — Type One Energy and JPP are setting the gold standard for how this is done (as we did with Commonwealth Fusion Systems 5 years ago for their SPARC physics basis).” 

The new physics design basis for the pilot power plant is a robust effort to consider realistically the complex relationship between challenging, competing requirements that all need to function together for fusion energy to be possible.  

This new physics solution also builds on the operating characteristics of high-performing stellarator fusion technology – a stellarator being a machine that uses complex, helical magnetic fields to confine the plasma, thereby enabling scientists to control it and create suitable conditions for fusion. This technology is already being used with success on the world’s largest research stellarator, the Wendelstein 7-X, located in Germany, but the challenge embraced by Type One Energy’s new design is how to scale it up to a pilot plant. 

Building the future of energy 

Functional fusion technology could offer limitless clean energy. As global energy demands increase and energy security is front of mind, too, this new physics design basis comes at an excellent time.  

Christofer Mowry, CEO of Type One Energy, is cognisant of the landmark nature of his company’s achievement and proud of its strong, real-world foundations. 

“The physics basis for our new fusion power plant is grounded in Type One Energy’s expert knowledge about reliable, economic, electrical generation for the power grid. We have an organisation that understands this isn’t only about designing a science project.” 

This research was developed collaboratively between Type One Energy and a broad coalition of scientists from national laboratories and universities around the world. Collaborating organisations included the US Department of Energy, for using their supercomputers, such as the exascale Frontier machine at Oak Ridge National Laboratory, to perform its physics simulations. 

While commercial fusion energy has yet to move from theory into practice, this new research marks an important and promising milestone. Clean and abundant energy may yet become reality.  

You can read the six papers and the accompanying Editorial (all of which are open access) in this special issue, Physics Basis of the Infinity Two Fusion Power Plant, of the Journal of Plasma Physics.

Bull Run, eh?

This is not directly related to fusion energy, so, you might want to skip this section.

Caption: Type One Energy employees at the Bull Run [emphasis mine] Fossil Plant, soon to be home to the prototype Infinity One. Credit: Type One Energy

I wonder if anyone argued for a change of name given how charged the US history associated with ‘Bull Run’ is, from the First Battle of Bull Run Wikipedia entry, Note: Links have been removed,

The First Battle of Bull Run, called the Battle of First Manassas[1] by Confederate forces, was the first major battle of the American Civil War. The battle was fought on July 21, 1861, in Prince William County, Virginia, just north of what is now the city of Manassas and about thirty miles west-southwest of Washington, D.C. The Union Army was slow in positioning themselves, allowing Confederate reinforcements time to arrive by rail. Each side had about 18,000 poorly trained and poorly led troops. The battle was a Confederate victory and was followed by a disorganized post-battle retreat of the Union forces.

A Confederate victory the first time and the second time (Second Battle of Bull Run Wikipedia entry)? For anyone unfamiliar with the history, the US Civil War was fought from 1861 to 1865 between Union and Confederate forces. The Confederate states had seceded from the union (US) and were fighting to retain their slavery-based economy and they lost the war.

Had anyone consulted me, I would have advised changing the name from Bull Run to something less charged (pun noted) before hosting a prototype fusion energy pilot plant there.

Back to the usual programme.

Type One Energy

Type One Energy issued a March 27, 2025 news release about the special issue of the Journal of Plasma Physics (JPP), Note 1: Some of this is redundant; Note 2: Links have been removed,

Type One Energy announced today publication of the world’s first comprehensive, self-consistent, and robust physics basis, with conservative design margins, for a practical fusion pilot power plant. This physics basis is presented in a series of seven peer-reviewed scientific papers in a special issue of the prestigious Journal of Plasma Physics (JPP). They serve as the foundation for the company’s first Infinity Two stellarator fusion power plant project, which Type One Energy is developing for the Tennessee Valley Authority (TVA) utility in the U.S.

The Infinity Two fusion pilot power plant physics design basis realistically considers, for the first time, the complex relationship between competing requirements for plasma performance, power plant startup, construction logistics, reliability, and economics utilizing actual power plant operating experience. This Infinity Two baseline physics solution makes use of the inherently favorable operating characteristics of highly optimized stellarator fusion technology using modular superconducting magnets, as was so successfully proven on the W7-X science machine in Germany.

“Why are we the first private fusion company with an agreement to develop a potential fusion power plant project for an energy utility? Because we have a design anchored in reality,” said Christofer Mowry, CEO of Type One Energy. “The physics basis for Infinity Two is grounded in the knowledge of what is required for application to, and performance in, the demanding environment of reliable electrical generation for the power grid. We have an organization that understands this isn’t about designing a science project.”

Led by Chris Hegna, widely recognized as a leading theorist in modern stellarators, Type One Energy performed high-fidelity computational plasma physics analyses to substantially reduce the risk of meeting Infinity Two power plant functional and performance requirements. This unique and transformational achievement is the result of a global development program led by the Type One Energy plasma physics and stellarator engineering organization, with significant contributions from a broad coalition of scientists from national laboratories and universities around the world. The company made use of a spectrum of high-performance computing facilities, including access to the highest-performance U.S. Department of Energy supercomputers such as the exascale Frontier machine at Oak Ridge National Laboratory (ORNL), to perform its stellarator physics simulations.

“We committed to this ambitious fusion commercialization milestone two years ago and today we delivered,” said John Canik, Chief Science and Engineering Officer for Type One Energy. “The team was able to efficiently develop deep plasma physics insights to inform the design of our Infinity Two stellarator, by taking advantage of our access to high performance computing resources. This enabled the Type One Energy team to demonstrate a realistic, integrated stellarator design that moves far beyond conventional thinking and concepts derived from more limited modeling capabilities.”

The consistent and robust physics solution for Infinity Two results in a deuterium-tritium (D-T) fueled, burning plasma stellarator with 800 MW of fusion power and delivers a nominal 350 MWe to the power grid. It is characterized by fusion plasma with resilient and stable behavior across a broad range of operating conditions, very low heat loss due to turbulent transport, as well as tolerable direct energy losses to the stellarator first wall. The Infinity Two stellarator has sufficient room for both adequately sized island divertors to exhaust helium ash and a blanket which provides appropriate shielding and tritium breeding. Type One Energy has high confidence that this essential physics solution provides a good baseline stellarator configuration for the Infinity Two fusion pilot power plant.

“The articles in this issue [of JPP] represent an important step towards a fusion reactor based on the stellarator concept. Thanks to decades of experiments and theoretical research, much of the latter published in JPP, it has become possible to lay out the physics basis for a stellarator power plant in considerable detail,” said Per Helander, head of Stellarator Theory Division at the Max Planck Institute for Plasma Physics. “JPP is very happy to publish this series of papers from Type One Energy, where this has been accomplished in a way that sets new standards for the fidelity and confidence level in this context.”

Important to successful fusion power plant commercialization, this stellarator configuration has enabled Type One Energy to architect a maintenance solution which supports good power plant Capacity Factors (CF) and associated Levelized Cost of Electricity (LCOE). It also supports favorable regulatory requirements for component manufacturing and power plant construction methods essential to achieving a reasonable Over-Night Cost (ONC) for Infinity Two.

About Type One Energy

Type One Energy Group is mission-driven to provide sustainable, affordable fusion power to the world. Established in 2019 and venture-backed in 2023, the company is led by a team of globally recognized fusion scientists with a strong track record of building state-of-the-art stellarator fusion machines, together with veteran business leaders experienced in scaling companies and commercializing energy technologies. Type One Energy applies proven advanced manufacturing methods, modern computational physics and high-field superconducting magnets to develop its optimized stellarator fusion energy system. Its FusionDirect development program pursues the lowest-risk, shortest-schedule path to a fusion power plant over the coming decade, using a partner-intensive and capital-efficient strategy. Type One Energy is committed to community engagement in the development and deployment of its clean energy technology. For more information, visit www.typeoneenergy.com or follow us on LinkedIn.

While the company is currently headquartered in Knoxville, Tennessee, it was originally a spinoff from the University of Wisconsin-Madison, according to a March 30, 2023 posting on the university’s College of Engineering website,

Type One Energy, a Middleton, Wisconsin-based fusion energy company with roots in the University of Wisconsin-Madison’s College of Engineering, recently announced its first round of seed funding, raising $29 million from investors. The company has also onboarded a new, highly experienced CEO [Christofer Mowry].

Type One, founded in 2019 by a team of globally recognized fusion scientists and business leaders, is hoping to commercialize stellarator technology over the next decade. Stellarators are a type of fusion reactor that uses powerful magnets to confine ultra-hot streams of plasma in order to create the conditions for fusion reactions. Energy from fusion promises to be clean, safe, renewable power. The company is using advanced manufacturing methods, modern computational physics and high-field superconducting magnets to develop its stellarator through an initiative called FusionDirect.

According to Type One Energy’s About page, there are four offices, with the headquarters in Tennessee,

Knoxville (Headquarters)
2410 Cherahala Blvd.
Knoxville, TN 37931

Madison
316 W Washington Ave. Suite 300
Madison, WI 53703

Boston
299 Washington St. Suites C & E
Woburn, MA 01801

Vancouver
1140 West Pender St.
Vancouver, BC V6E 4G1

The mention of an office in Vancouver, Canada piqued my curiosity, but before getting to that, I’m going to include some informative excerpts about nuclear energy (both fission and fusion) from this August 31, 2023 article written by Tina Tosukhowong on behalf of TDK Ventures, which was posted on Medium,

Fusion power is the key to the energy transformation that humanity needs to drive decarbonization, clean, and baseload energy production that is inherently fail-safe, with no risk of long-lived radioactive waste, while also delivering on ever-growing energy-consumption demands at the global scale. Fusion is hard and requires exceptional conditions for sustained reaction (which is part of what makes it so safe), which has long served as a deterrent for technical maturation and industrial viability. …

The current reality of our world is monumental fossil-fuel dependence. This, coupled with unprecedented levels of energy demand, has resulted in over 136,700 TWh (1 TWh = 10¹² watt-hours) of energy consumed via fossil fuels annually [1]. Chief among the many consequences of this dependence is the now very looming threat of climate catastrophe, which will soon be irreversible if global temperature rise is not abated and held to within 1.5 °C of pre-industrial levels. To do so, the nearly 40 gigatons of CO2 emissions generated each year must be steadily reduced and eventually mitigated entirely [2]. A fundamental shift in how power is generated globally is the only way forward. Humanity needs an energy transformation — the right energy transformation.

Alternative energy-generation techniques, such as wind, solar, geothermal, and hydroelectric approaches have all made excellent strides, and indeed in just the United States electricity generated by renewable methods doubled from 10 to 20% of total between 2010 and 2020 [3–4]. These numbers are incredibly encouraging and lend significant credence to the journey to net-zero emission energy generation. However, while these standard renewable approaches should be championed, wind and solar are intermittent and require a large amount of land to deploy, while geothermal and hydroelectric are not available in every geography.

By far the most viable candidates for continuous clean energy generation to replace coal-fired power plants are nuclear-driven technologies, i.e. nuclear fission or nuclear fusion. Nuclear fission has been a proven effective method ever since it was first demonstrated almost 80 years ago underneath the University of Chicago football stadium by Nobel Laureate Enrico Fermi [5]. Heavier atomic elements, in most cases Uranium-235, are exposed to and bombarded by neutrons. This causes the Uranium to split, resulting in two slightly less-heavy elements (like Barium and Krypton). This in turn causes energy to be released and more neutrons to be ejected, which bombard other nearby Uranium-235 atoms, at which point the process cascades into a chain reaction. The released energy (heat) is utilized in the same way coal is burned in a traditional power plant, being subsequently used to generate electricity, usually via the creation of steam to drive a turbine [6]. While already having reached viable commercial maturity, fission carries inherent and nontrivial safety concerns. An unhampered chain reaction can quickly lead to meltdown with disastrous consequences, and, even when properly managed, the end reaction does generate radioactive waste whose half-life can last hundreds of thousands of years.

Figure 1. Breakdown of a nuclear fission reaction [6]. An incident neutron bombards a fissile heavy element, splitting it and releasing energy and more neutrons, setting off a chain reaction.
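The cascade described above can be made concrete with a toy calculation. Assuming, purely hypothetically, that each fission triggers k further fissions, the generation-by-generation count grows geometrically; a quick sketch in Python:

```python
# Toy model of a fission chain reaction: starting from one fission,
# each fission triggers k further fissions in the next generation.
# k is a made-up multiplication factor for illustration only.
def fissions_per_generation(k, generations):
    """Number of fissions in each generation, starting from one."""
    counts = [1]
    for _ in range(generations):
        counts.append(counts[-1] * k)
    return counts

print(fissions_per_generation(2, 10))
# [1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024] -- runaway growth when k > 1
print(fissions_per_generation(1, 3))
# [1, 1, 1, 1] -- k = 1 is the steady state a power reactor is controlled to
```

The "unhampered chain reaction" the article warns about is the k > 1 case; reactor control systems exist to hold k at exactly 1.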

Especially given modernization efforts and meteoric gains in safety (thanks to advances in materials science like ceramic coatings), fission will continue to be a critical piece of a better, greener energy transformation. However, in extending our vision to an even brighter future with no such concerns — carbon emissions or safety — nuclear fusion is humanity’s silver bullet. Instead of breaking down atoms leading to a chain reaction, fusion is the combining of atoms (usually isotopes of Hydrogen) into heavier elements, which also results in energy release / heat generation [7]. Like fission, fusion can be designed to be a continuous energy source that can serve as a permanent backbone to the power grid. It is extremely energy dense, with 1 kg of fusion fuel producing the same amount of energy as 1,000,000 kg of coal, and it is inherently fail-safe with no long-term radioactive waste.

As a concept, if fusion is a silver bullet to answer humanity’s energy transformation needs, then why haven’t we done so already? The appeal seems so obvious, what’s the hold up? Simply put, nuclear fusion is hard for the very same reason the process is inherently safe. Atoms in the process must have enough energy to overcome electrostatic repulsive forces between the two positive charges of their nuclei to fuse. The key figure of merit to evaluate fusion is the so-called “Lawson Triple Product.” Essentially, this means that in order to generate energy by fusion at a rate greater than the rate of energy loss to the environment, the nuclei must be very close together (as represented by n — the plasma density), kept at a high enough temperature (as represented by T — temperature), and for long enough time to sustain fusion (as represented by τ — the confinement time). The triple product required to achieve fusion “ignition” (the state where the rate of energy production is higher than the rate of loss) depends on the fuel type and occurs within a plasma state. A deuterium and tritium (D-T) system has the lowest Lawson Triple Product requirement, where fusion can achieve a viable threshold for ignition when the density of the fuel atoms, n, multiplied by the fuel temperature, T, multiplied by the confinement time, τ, is greater than 5×10²¹ (nTτ > 5×10²¹ keV-s/m³) [8–9]. For context, the temperature alone in this scenario must be higher than 100-million degrees Celsius.

Figure 2. (Left) Conceptual illustration of a fusion reaction with Deuterium (²H) and Tritium (³H) forming an Alpha particle (⁴He) and free neutron along with energy released as heat (Right). To initiate fusion, repelling electrostatic charge must be overcome via conditions meeting the minimum Lawson Triple Product threshold
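To make the triple-product threshold concrete, here is a minimal sketch; the plasma values plugged in are illustrative assumptions, not figures from the article:

```python
# Lawson triple product: n (density, m^-3) * T (temperature, keV) * tau
# (confinement time, s). The D-T threshold below is the one quoted above;
# the sample plasma values are hypothetical.
def triple_product(n_per_m3, T_keV, tau_s):
    """Return the Lawson triple product n*T*tau in keV·s/m^3."""
    return n_per_m3 * T_keV * tau_s

D_T_THRESHOLD = 5e21  # keV·s/m^3, D-T ignition threshold from the text

# Hypothetical reactor-grade plasma: n = 1e20 m^-3, T = 15 keV, tau = 4 s
value = triple_product(1e20, 15, 4)
print(f"{value:.1e}")         # 6.0e+21
print(value > D_T_THRESHOLD)  # True: this plasma would clear the threshold
```

Any one factor can be traded against the others, which is why magnetic-confinement machines chase long confinement times at moderate density while inertial approaches chase enormous densities for nanoseconds.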

Tosukhowong’s August 31, 2023 article provides a good overview, keeping in mind that it is slanted to justify TDK’s investment in Type One Energy.

Why a Vancouver, Canada office?

As for Type One Energy’s Vancouver (British Columbia, Canada) connection, I was reminded of General Fusion, a local fusion energy company, while speculating about the connection. First speculative question: could Type One Energy’s presence in Canada allow it to access Canadian government funds for its research? Second speculative question: does the company want access to people who might hesitate to move to the US, or might want to move out of the US, but would move to Canada?

The US is currently in an unstable state, as suggested in this April 3, 2025 opinion piece by Les Leyne for vancouverisawesome.com,

Les Leyne: Trump’s incoherence makes responding to tariff wall tricky

Trump’s announcement was so incoherent that much of the rest of the world had to scramble to grasp even the basic details

B.C. officials were guarded Wednesday [April 2, 2025] about the impact on Canada of the tariff wall U.S. President Donald Trump erected around the U.S., but it appears it could have been worse.

Trump’s announcement was so incoherent that much of the rest of the world had to scramble to grasp even the basic details. So cabinet ministers begged for more time to check the impacts.

“It’s still very uncertain,” said Housing Minister Ravi Kahlon, who chairs the “war room” committee responsible for countering tariff threats. “It’s hard to make sense from President Trump’s speech.” [emphasis mine]

Kahlon said the challenge is that tariff policies change hour by hour, “and anything can happen.”

On April 2, 2025 US President Donald Trump announced tariffs (then paused some of the tariffs on April 9, 2025) and some of the targets seemed a bit odd, from an April 2, 2025 article by Alex Galbraith for salon.com, Note: Links have been removed,

“Trade war with penguins”: Trump places 10% tariff on uninhabited Antarctic islands

Planned tariffs shared by the White House included a 10% duty on imports from the barren Heard and McDonald Islands

For once in his life, Donald Trump underpromised and over-delivered. 

The president announced a 10% duty on all imports on Wednesday [April 2, 2025], along with a raft of reciprocal tariffs on U.S. trading partners. An extensive graphic released by the White House showed how far Trump was willing to take his tit-for-tat trade war, including a shocking levy of 10% on all imports from the Heard and McDonald Islands. 

If you haven’t heard of this powerhouse of global trade and territory of Australia, you aren’t alone. Few have outside of Antarctic researchers and seals. These extremely remote islands about 1,000 miles north of Antarctica consist mostly of barren tundra. They’re also entirely uninhabited. 

The news that we were starting a trade war with penguins spread quickly after Trump’s announcement. …

U.S. stock futures crumbled following the news of Trump’s widespread tariffs. Dow futures fell by nearly 1,000 points while NASDAQ and S&P futures fell by 3 to 4%. American companies’ stock values rapidly tumbled after the announcement, with large retail importers seeing significant losses. …

No word from the penguins about the ‘pause’. I’m assuming Donald Trump’s next book will be titled, “The art of negotiating trade deals with penguins.” Can’t wait to read it.

(Perhaps someone should tell him there are no penguins in the Arctic so he can’t bypass Canadians or Greenlanders to make a deal.)

Now for the local story.

General Fusion

There’ve been two recent developments at General Fusion. Most recently, an April 2, 2025 General Fusion news release announces a new hire, Note: Links have been removed,

Bob Smith is joining General Fusion as a strategic advisor. Smith brings more than 35 years of experience developing, scaling, and launching world-changing technologies, including spearheading new products and innovation in the aerospace industry at United Space Alliance, Sandia Labs, and Honeywell before serving as CEO of Blue Origin. He joins General Fusion as the company’s Lawson Machine 26 (LM26) fusion demonstration begins operations and progresses toward transformative technical milestones on the path to commercialization.

“I’ve been watching the fusion energy industry closely for my entire career. Fusion is the last energy source humanity will ever need, and I believe its impact as a zero-carbon energy source will transform the global energy supply at the time needed to fight the worst consequences of climate change,” said Smith. “I am thrilled to work with General Fusion. Their novel approach has inherent and distinctive benefits for the generation of commercially competitive fusion power. It’s exciting to join at a time when the team is about to demonstrate the fundamental physics behind their system and move to scaling up to a pilot plant.”

The LM26 program marks a significant step towards commercialization, as the company’s unique Magnetized Target Fusion (MTF) approach makes the path to powering the grid with fusion energy more straightforward than other technologies—because it practically addresses barriers to fusion commercialization, such as neutron material degradation, sustainable fuel production, and efficient energy extraction. As a strategic advisor, Smith will leverage his experience advancing game-changing technologies to help guide General Fusion’s technology development and strategic growth.

“Bob’s insights and experience will be invaluable as we execute the LM26 program and look beyond it to propel our practical technology to powering the grid by the mid-2030s,” said Greg Twinney, CEO, General Fusion. “We are grateful for his commitment of his in-demand time and expertise to our mission and look forward to working together to make fusion power a reality!”

About Bob Smith:

Bob is an experienced business leader in the aerospace and defense industry with extensive technical and operational expertise across the sector. He worked at and managed federal labs, led developments at a large government contractor, grew businesses at a Fortune 100 multinational, and scaled up a launch and space systems startup. Bob also has extensive international experience and has worked with suppliers and OEMs in all the major aerospace regions, including establishing new sites and factories in Europe, India, China, and Puerto Rico.

Bob’s prior leadership roles include Chairman and Chief Executive Officer of Blue Origin, President of Mechanical Systems & Components at Honeywell Aerospace, Chief Technology Officer at Honeywell Aerospace, Chairman of NTESS (Sandia Labs), and Executive Director of Space Shuttle Upgrades at United Space Alliance.

Bob holds a Bachelor of Science degree in aerospace engineering from Texas A&M, a Master of Science degree in engineering/applied mathematics from Brown University, a doctorate from the University of Texas in aerospace engineering, and a business degree from MIT’s Sloan School of Management. Bob is also a Fellow of the Royal Aeronautical Society, a Fellow of the American Institute of Aeronautics and Astronautics, and an Academician in the International Academy of Astronautics.

Quick Facts:  

  • Fusion energy is the ultimate clean energy solution—it is the energy source that powers the sun and stars. Fusion is the process by which two light nuclei merge to form a heavier one, producing a massive amount of energy.
  • General Fusion’s Magnetized Target Fusion (MTF) technology is designed to scale for cost-efficient power plants. It uses mechanical compression to create fusion conditions in short pulses, eliminating the need for expensive lasers or superconducting magnets. An MTF power plant is designed to produce its own fuel and inherently includes a method to extract the energy and put it to work.
  • Lawson Machine 26 (LM26) is a world-first Magnetized Target Fusion demonstration. Launched, designed, and assembled in just 16 months, the machine is now forming magnetized plasmas regularly at 50 per cent commercial scale. It is advancing towards a series of results that will demonstrate MTF in a commercially relevant way: 10 million degrees Celsius (1 keV), 100 million degrees Celsius (10 keV), and scientific breakeven equivalent (100% Lawson).
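The temperature milestones in the bullet above equate 1 keV with 10 million degrees Celsius and 10 keV with 100 million. Those are round numbers; the underlying conversion (1 eV ≈ 11,605 K, via the Boltzmann constant) can be checked in a few lines:

```python
# Convert plasma temperatures quoted in keV to degrees Celsius.
# 1 eV corresponds to about 11,604.5 K (Boltzmann-constant relation).
EV_TO_KELVIN = 11604.5  # kelvin per electron-volt

def kev_to_celsius(T_keV):
    """Convert a plasma temperature in keV to degrees Celsius."""
    return T_keV * 1000 * EV_TO_KELVIN - 273.15

print(round(kev_to_celsius(1) / 1e6))   # 12  -> quoted as "10 million degrees"
print(round(kev_to_celsius(10) / 1e6))  # 116 -> quoted as "100 million degrees"
```

So the press release's figures are order-of-magnitude round-downs of roughly 11.6 and 116 million degrees Celsius, a common shorthand in fusion writing.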

About General Fusion
General Fusion is pursuing a fast and practical approach to commercial fusion energy and is headquartered in Richmond, Canada. The company was established in 2002 and is funded by a global syndicate of leading energy venture capital firms, industry leaders, and technology pioneers. Learn more at www.generalfusion.com.

Bob Smith and Blue Origin: things did not go well

Sometimes you end up in a job and things do not work out well, and that seems to have been the case at Blue Origin, according to a September 25, 2023 article by Eric Berger for Ars Technica,

After six years of running Blue Origin, Bob Smith announced in a company-wide email on Monday that he will be “stepping aside” as chief executive of the space company founded by Jeff Bezos.

“It has been my privilege to be part of this great team, and I am confident that Blue Origin’s greatest achievements are still ahead of us,” Smith wrote in an email. “We’ve rapidly scaled this company from its prototyping and research roots to a large, prominent space business.”

Shortly after Smith’s email, a Blue Origin spokesperson said the company’s new chief executive will be Dave Limp, who stepped down as Amazon’s vice president of devices and services last month.

To put things politely, Smith has had a rocky tenure as Blue Origin’s chief executive. After being personally vetted and hired by Bezos, Smith took over from Rob Meyerson in 2017. The Honeywell engineer was given a mandate to transform Blue Origin into a large and profitable space business.

He did succeed in growing Blue Origin. The company had about 1,500 employees when Smith arrived, and the company now employs nearly 11,000 people. But he has been significantly late on a number of key programs, including the BE-4 rocket engine and the New Glenn rocket.

As a space reporter, I have spoken with dozens of current and former Blue Origin employees, and virtually none of them have had anything positive to say about Smith’s tenure as chief executive. I asked one current employee about the hiring of Limp on Monday afternoon, and their response was, “Anything is better than Bob.”

Although it is very far from an exact barometer, Smith has received consistently low ratings on Glassdoor for his performance as chief executive of Blue Origin. And two years ago, a group of current and former Blue Origin employees wrote a blistering letter about the company under Smith. “In our experience, Blue Origin’s culture sits on a foundation that ignores the plight of our planet, turns a blind eye to sexism, is not sufficiently attuned to safety concerns, and silences those who seek to correct wrongs,” the essay authors wrote.

With any corporate culture, there will be growing pains, of course. But Smith brought a traditional aerospace mindset into a company that had hitherto been guided by a new space vision, leading to a high turnover rate. And Blue Origin remains significantly underwater, financially. It is likely that Bezos is still providing about $2 billion a year to support the company’s cash needs.

Crucially, as Blue Origin meandered under Smith’s tenure, SpaceX soared, launching hundreds of rockets and thousands of satellites. Smith, clearly, was not the leader Blue Origin needed to make the company more competitive with SpaceX in launch and other spaceflight activities. It became something of a parlor game in the space industry to guess when Bezos would finally get around to firing Smith.

On the technical front, a March 27, 2025 General Fusion news release announces “Peer-reviewed publication confirms General Fusion achieved plasma energy confinement time required for its LM26 large-scale fusion machine,” Note: Links have been removed,

New results published in Nuclear Fusion confirm General Fusion successfully created magnetized plasmas that achieved energy confinement times exceeding 10 milliseconds. The published energy confinement time results were achieved on General Fusion’s PI3 plasma injector — the world’s largest and most powerful plasma injector of its kind. Commissioned in 2017, PI3 formed approximately 20,000 plasmas in a machine of 50 per cent commercial scale. The plasma injector is now integrated into General Fusion’s Lawson Machine 26 (LM26) — a world-first Magnetized Target Fusion demonstration tracking toward game-changing technical milestones that will advance the company’s ultimate mission: generating zero-carbon fusion energy for the grid in the next decade.

The 10-millisecond energy confinement time is the duration required to compress plasmas in LM26 to achieve key temperature thresholds of 1 keV, 10 keV, and, ultimately, scientific breakeven equivalent (100% Lawson). These results were imperative to de-risking LM26. The demonstration machine is now forming plasmas regularly, and the company is optimizing its plasma performance in preparation for compressing plasmas to create fusion and heating from compression.    

Key Findings: 

  • The plasma injector now integrated into General Fusion’s LM26 achieved energy confinement times exceeding 10 milliseconds, the pre-compression confinement time required for LM26’s targeted technical milestones. These results were achieved without requiring active magnetic stabilization or auxiliary heating. This means the results were achieved without superconducting magnets, demonstrating the company’s cost-effective approach.  
  • The plasma’s energy confinement time improved when the plasma injector vessel was coated with natural lithium. A key differentiator in General Fusion’s commercial approach is its use of a liquid lithium wall to compress plasmas during compression. In addition to the confinement time advantages shown in this paper, the liquid lithium wall will also protect a commercial MTF machine from neutron damage, enable the machine to breed its own fuel, and provide an efficient method for extracting energy from the machine.
  • The maximum energy confinement time achieved by PI3 was approximately 12 milliseconds. The machine’s maximum plasma density was approximately 6×10¹⁹ m⁻³, and maximum plasma temperatures exceeded 400 eV. These strong pre-compression results support LM26’s transformative targets.
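For a rough sense of scale, the PI3 figures in these bullets can be combined into a Lawson-style triple product. The three maxima were not necessarily reached simultaneously, so this is strictly a back-of-envelope sketch; the large gap against the D-T ignition threshold (~5×10²¹ keV·s/m³) is what compression in LM26 is meant to close:

```python
# Back-of-envelope Lawson-style product from the PI3 maxima quoted above.
# These maxima were not necessarily simultaneous; treat this as a sketch.
n_m3  = 6e19    # plasma density, m^-3
T_keV = 0.4     # plasma temperature, ~400 eV expressed in keV
tau_s = 0.012   # energy confinement time, ~12 ms

product = n_m3 * T_keV * tau_s
print(f"{product:.2e} keV·s/m^3")  # 2.88e+17 keV·s/m^3, far below ~5e21 for D-T
```

That four-orders-of-magnitude gap is consistent with these being pre-compression plasmas: Magnetized Target Fusion relies on mechanical compression to drive density and temperature up sharply during the pulse.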

Quotes:  

“LM26 is designed to achieve a series of results that will demonstrate MTF in a commercially relevant way. Following LM26’s results, our unique approach makes the path to powering the grid with fusion energy more straightforward than other technologies because we have front-loaded the work to address the barriers to commercialization.”  

Dr. Michel Laberge
Founder and Chief Science Officer

“For over 16 years, I have worked hand in hand with Michel to advance General Fusion’s practical technology. This company is entrepreneurial at its core. We pride ourselves on building real machines that get results that matter, and I’m thrilled to have the achievements recognized in Nuclear Fusion.”

Mike Donaldson
Senior Vice President, Technology Development

Here’s a link to and a citation for the paper,

Thermal energy confinement time of spherical tokamak plasmas in PI3 by A. Tancetti, C. Ribeiro, S.J. Howard, S. Coop, C.P. McNally, M. Reynolds, P. Kholodov, F. Braglia, R. Zindler, C. Macdonald. Nuclear Fusion, Volume 65, Number 3. DOI: 10.1088/1741-4326/adb8fb. Published 28 February 2025. © 2025 The Author(s). Published by IOP Publishing Ltd on behalf of the IAEA [International Atomic Energy Agency]

This paper is open access.

For anyone curious about General Fusion, I have a brief overview and history of the company and their particular approach to fusion energy in my February 6, 2024 posting (scroll down to ‘The Canadians’).

The sound of frogs (and other amphibians) and climate change

At least once a year I highlight some work about frogs. It’s usually about a new species, but this time it’s all about frog sounds (as well as sounds from other amphibians).

Caption: The calls of the midwife toad and other amphibians have served to test the sound classifier. Credit: Jaime Bosch (MNCN-CSIC)

In any event, here’s more from an April 30, 2018 Spanish Foundation for Science and Technology (FECYT) press release (also on EurekAlert but with a May 17, 2018 publication date),

The sounds of amphibians are altered by the increase in ambient temperature, a phenomenon that, in addition to interfering with reproductive behaviour, serves as an indicator of global warming. Researchers at the University of Seville have resorted to artificial intelligence to create an automatic classifier of the thousands of frog and toad sounds that can be recorded in a natural environment.

One of the consequences of climate change is its impact on the physiological functions of animals, such as frogs and toads with their calls. Their mating call, which plays a crucial role in the sexual selection and reproduction of these amphibians, is affected by the increase in ambient temperature.

When this exceeds a certain threshold, the physiological processes associated with the sound production are restricted, and some calls are even actually inhibited. In fact, the beginning, duration and intensity of calls from the male to the female are changed, which influences reproductive activity.

Taking into account this phenomenon, the analysis and classification of the sounds produced by certain species of amphibians and other animals have turned out to be a powerful indicator of temperature fluctuations and, therefore, of the existence and evolution of global warming.

To capture the sounds of frogs, networks of audio sensors are placed and connected wirelessly in areas that can reach several hundred square kilometres. The problem is that a huge amount of bio-acoustic information is collected in environments as noisy as a jungle, and this makes it difficult to identify the species and their calls.

To solve this, engineers from the University of Seville have resorted to artificial intelligence. “We’ve segmented the sound into temporary windows or audio frames and have classified them by means of decision trees, an automatic learning technique that is used in computing”, explains Amalia Luque Sendra, co-author of the work.

To perform the classification, the researchers based it on MPEG-7 parameters and audio descriptors, a standard way of representing audiovisual information. The details are published in the journal Expert Systems with Applications.

This technique has been put to the test with real sounds of amphibians recorded in the middle of nature and provided by the National Museum of Natural Sciences. More specifically, 868 records with 369 mating calls sung by the male and 63 release calls issued by the female natterjack toad (Epidalea calamita), along with 419 mating calls and 17 distress calls of the common midwife toad (Alytes obstetricans).

“In this case we obtained a success rate close to 90% when classifying the sounds,” observes Luque Sendra, who recalls that, in addition to the types of calls, the number of individuals of certain amphibian species that are heard in a geographical region over time can also be used as an indicator of climate change.

“A temperature increase affects the calling patterns,” she says, “but since these in most cases have a sexual calling nature, they also affect the number of individuals. With our method, we still can’t directly determine the exact number of specimens in an area, but it is possible to get a first approximation.”
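The pipeline the researchers describe, cutting audio into frames, computing descriptors per frame, and labeling each frame with a decision tree, can be sketched with scikit-learn. Everything below is invented for illustration: the two synthetic "call types," the simple spectral features standing in for the paper's MPEG-7 descriptors, and the tree settings.

```python
# Sketch of frame-wise decision-tree classification of audio, in the
# spirit of the approach described above. Features and data are toy
# stand-ins, not the MPEG-7 descriptors or recordings the paper used.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def frame_features(signal, frame_len=512):
    """Split a 1-D signal into frames; compute toy spectral features per frame."""
    n = len(signal) // frame_len
    frames = signal[: n * frame_len].reshape(n, frame_len)
    spectra = np.abs(np.fft.rfft(frames, axis=1))
    # Spectral centroid (brightness) and mean energy per frame.
    centroid = (spectra * np.arange(spectra.shape[1])).sum(axis=1) / spectra.sum(axis=1)
    energy = (frames ** 2).mean(axis=1)
    return np.column_stack([centroid, energy])

# Two synthetic "call types": a 400 Hz tone and a 6 kHz tone, both noisy.
t = np.arange(512 * 40) / 16000.0
low = np.sin(2 * np.pi * 400 * t) + 0.1 * rng.standard_normal(t.size)
high = np.sin(2 * np.pi * 6000 * t) + 0.1 * rng.standard_normal(t.size)

X = np.vstack([frame_features(low), frame_features(high)])
y = np.array([0] * 40 + [1] * 40)  # frame labels: 0 = "low call", 1 = "high call"

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.score(X, y))  # training accuracy; these toy classes separate cleanly
```

The real system faces a much harder version of this problem: jungle-level background noise, overlapping callers, and many species, which is why the authors report a success rate near 90% rather than the perfect separation a toy example gives.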

In addition to the image of the midwife toad, the researchers included this image to illustrate their work,

Caption: This is the architecture of a wireless sensor network. Credit: J. Luque et al./Sensors

Here’s a link to and a citation for the paper,

Non-sequential automatic classification of anuran sounds for the estimation of climate-change indicators by Amalia Luque, Javier Romero-Lemos, Alejandro Carrasco, Julio Barbancho. Expert Systems with Applications, Volume 95, 1 April 2018, Pages 248-260. DOI: https://doi.org/10.1016/j.eswa.2017.11.016. Available online 10 November 2017

This paper is open access.