In addition to the competition to develop commercial quantum computing, there’s the competition to develop commercial nuclear fusion energy. I have four stories about nuclear fusion: one from Spain, one from China, one from the US, and one from Vancouver. There are also a couple of segues into history and the recently (April 2, 2025) announced US tariffs (chaos has since ensued as these have become ‘on again/off again’ tariffs), but the bulk of this posting is focused on the latest (January – early April 2025) in fusion energy.
Fission nuclear energy, where atoms are split, is better known; fusion nuclear energy, released when light atomic nuclei merge, is what powers stars. For anyone unfamiliar with the word tokamak as applied to nuclear fusion (it is mentioned in all the stories), you can find out more in the Tokamak Wikipedia entry.
In a pioneering approach to achieve fusion energy, the SMART device has successfully generated its first tokamak plasma. This step brings the international fusion community closer to achieving sustainable, clean, and virtually limitless energy through controlled fusion reactions.
The SMART tokamak, a state-of-the-art experimental fusion device designed, constructed and operated by the Plasma Science and Fusion Technology Laboratory of the University of Seville, is a spherical tokamak unique worldwide for its flexible shaping capabilities. SMART has been designed to demonstrate the unique physics and engineering properties of Negative Triangularity shaped plasmas towards compact fusion power plants based on Spherical Tokamaks.
Prof. Manuel García Muñoz, Principal Investigator of the SMART tokamak, stated: “This is an important achievement for the entire team as we are now entering the operational phase of SMART. The SMART approach is a potential game changer with attractive fusion performance and power handling for future compact fusion reactors. We have exciting times ahead!” Prof. Eleonora Viezzer, co-PI of the SMART project, adds: “We were all very excited to see the first magnetically confined plasma and are looking forward to exploiting the capabilities of the SMART device together with the international scientific community. SMART has awoken great interest worldwide.”
When negative becomes positive and compact
The triangularity describes the shape of the plasma. Most tokamaks operate with positive triangularity, meaning that the plasma shape looks like a D. When the D is mirrored (as shown in the figure on the right), the plasma has negative triangularity.
Negative triangularity plasma shapes feature enhanced performance because they suppress the instabilities that expel particles and energy from the plasma, preventing severe damage to the tokamak wall. Besides offering high fusion performance, negative triangularity also features attractive power handling solutions, given that the plasma covers a larger divertor area over which to distribute the heat exhaust. This also facilitates the engineering design for future compact fusion power plants.
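As an aside, the effect of the triangularity parameter on the plasma cross-section is easy to see numerically. Here is a minimal sketch (in Python) using the standard Miller-type boundary parameterization, where flipping the sign of the triangularity δ mirrors the D-shape; the parameter values are purely illustrative and are not SMART’s actual design values:

```python
import numpy as np

def plasma_boundary(R0, a, kappa, delta, n=200):
    """Miller-type parameterization of a tokamak plasma cross-section.

    R0: major radius (m), a: minor radius (m), kappa: elongation,
    delta: triangularity (positive -> D-shape, negative -> mirrored D).
    """
    theta = np.linspace(0.0, 2.0 * np.pi, n)
    R = R0 + a * np.cos(theta + delta * np.sin(theta))  # radial coordinate
    Z = kappa * a * np.sin(theta)                       # vertical coordinate
    return R, Z

# Illustrative numbers only (not SMART's parameters):
R_pos, Z_pos = plasma_boundary(R0=0.4, a=0.25, kappa=1.8, delta=+0.4)
R_neg, Z_neg = plasma_boundary(R0=0.4, a=0.25, kappa=1.8, delta=-0.4)
```

Plotting the two boundaries (with matplotlib, say) reproduces the D and mirrored-D shapes the press release describes.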
Fusion2Grid aimed at developing the foundation for the most compact fusion power plant
SMART is the first step in the Fusion2Grid strategy, led by the PSFT team in collaboration with the international fusion community, which is aimed at developing the most compact and most efficient magnetically confined fusion power plant based on Negative Triangularity shaped Spherical Tokamaks.
SMART will be the first compact spherical tokamak operating at fusion temperatures with negative triangularity shaped plasmas.
The objective of SMART is to provide the physics and engineering basis for the most compact design of a fusion power plant based on high-field Spherical Tokamaks combined with Negative Triangularity. The solenoid-driven plasma represents a major achievement in the timeline of getting SMART online and advancing towards the most compact fusion device.
The Plasma Science and Fusion Technology Lab of the University of Seville hosts the SMall Aspect Ratio Tokamak (SMART) and leads several worldwide efforts on energetic particles and plasma transport and stability towards the development of magnetically confined fusion energy.
Caption: The Experimental Advanced Superconducting Tokamak achieved a remarkable scientific milestone by maintaining steady-state high-confinement plasma operation for an impressive 1,066 seconds. Credit: Image by HFIPS (Hefei Institutes of Physical Science at the Chinese Academy of Sciences)
China has made a business announcement; no academic paper is mentioned in the January 21, 2025 press release on EurekAlert (also available on phys.org as a January 21, 2025 news item), Note: A link has been removed,
The Experimental Advanced Superconducting Tokamak (EAST), commonly known as China’s “artificial sun,” has achieved a remarkable scientific milestone by maintaining steady-state high-confinement plasma operation for an impressive 1,066 seconds. This accomplishment, reached on Monday, sets a new world record and marks a significant breakthrough in the pursuit of fusion power generation.
The duration of 1,066 seconds is a critical advancement in fusion research. This milestone, achieved by the Institute of Plasma Physics (ASIPP) at Hefei Institutes of Physical Scienece [sic] (HFIPS) of the Chinese Academy of Sciences, far surpasses the previous world record of 403 seconds, also set by EAST in 2023.
The ultimate goal of developing an artificial sun is to replicate the nuclear fusion processes that occurr [sic] in the sun, providing humanity with a limitless and clean energy source, and enabling exploration beyond our solar system.
Scientists worldwide have dedicated over 70 years to this ambitious goal. However, generating electricity from a nuclear fusion device involves overcoming key challenges, including reaching temperatures exceeding 100 million degrees Celsius, maintaining stable long-term operation, and ensuring precise control of the fusion process.
“A fusion device must achieve stable operation at high efficiency for thousands of seconds to enable the self-sustaining circulation of plasma, which is essential for the continuous power generation of future fusion plants,” said SONG Yuntao, ASIPP director and also vice president of HFIPS. He said that the recent record is monumental, marking a critical step toward realizing a functional fusion reactor.
According to GONG Xianzu, head of the EAST Physics and Experimental Operations division, several systems of the EAST device have been upgraded since the last round of experiments. For example, the heating system, which previously operated at the equivalent power of nearly 70,000 household microwave ovens, has now doubled its power output while maintaining stability and continuity.
Since its inception in 2006, EAST has served as an open testing platform for both Chinese and international scientists to conduct fusion-related experiments and research.
China officially joined the International Thermonuclear Experimental Reactor (ITER) program in 2006 as its seventh member. Under the agreement, China is responsible for approximately 9 percent of the project’s construction and operation, with ASIPP serving as the primary institution for the Chinese mission.
ITER, currently under construction in southern France, is set to become the world’s largest magnetic confinement plasma physics experiment and the largest experimental tokamak nuclear fusion reactor upon completion.
In recent years, EAST has consistently achieved groundbreaking advancements in high-confinement mode, a fundamental operational mode for experimental fusion reactors like ITER and the future China Fusion Engineering Test Reactor (CFETR). These accomplishments provide invaluable insights and references for the global development of fusion reactors.
“We hope to expand international collaboration via EAST and bring fusion energy into practical use for humanity,” said SONG.
In Hefei, Anhui Province, China, where EAST is loacated [sic], a new generation of experimental fusion research facilities is currently under construction. These facilities aim to further accelerate the development and application of fusion energy.
I always feel a little less confident about the information when there are mistakes. Three typos in the same press release? Maybe someone forgot to give it a final once-over?
Successfully harnessing the power of fusion energy could lead to cleaner and safer energy for all – and contribute substantially to combatting [UK spelling] the climate crisis. Towards this goal, Type One Energy has published a comprehensive, self-consistent, and robust physics basis for a practical fusion pilot power plant.
This groundbreaking research is presented in a series of six peer-reviewed scientific papers in a special issue of the prestigious Journal of Plasma Physics (JPP), published by Cambridge University Press.
The articles serve as the foundation for the company’s first fusion power plant project, which Type One Energy is developing with the Tennessee Valley Authority utility in the United States.
Alex Schekochihin, Professor of Theoretical Physics at the University of Oxford and Editor of the JPP, spoke with enthusiasm about this development:
“JPP is very proud to provide a platform for rigorous peer review and publication of the papers presenting the physics basis of the Infinity Two stellarator — an innovative and ground-breaking addition to the expanding family of proposed fusion power plant designs.
“Fusion science and technology are experiencing a period of very rapid development, driven by both public and private enthusiasm for fusion power. In this environment of creative and entrepreneurial ferment, it is crucial that new ideas and designs are both publicly shared and thoroughly scrutinised by the scientific community — Type One Energy and JPP are setting the gold standard for how this is done (as we did with Commonwealth Fusion Systems 5 years ago for their SPARC physics basis).”
The new physics design basis for the pilot power plant is a robust effort to consider realistically the complex relationship between challenging, competing requirements that all need to function together for fusion energy to be possible.
This new physics solution also builds on the operating characteristics of high-performing stellarator fusion technology – a stellarator being a machine that uses complex, helical magnetic fields to confine the plasma, thereby enabling scientists to control it and create suitable conditions for fusion. This technology is already being used with success on the world’s largest research stellarator, the Wendelstein 7-X, located in Germany, but the challenge embraced by Type One Energy’s new design is how to scale it up to a pilot plant.
Building the future of energy
Functional fusion technology could offer limitless clean energy. As global energy demands increase and energy security is front of mind, too, this new physics design basis comes at an excellent time.
Christofer Mowry, CEO of Type One Energy, is cognisant of the landmark nature of his company’s achievement and proud of its strong, real-world foundations.
“The physics basis for our new fusion power plant is grounded in Type One Energy’s expert knowledge about reliable, economic, electrical generation for the power grid. We have an organisation that understands this isn’t only about designing a science project.”
This research was developed collaboratively between Type One Energy and a broad coalition of scientists from national laboratories and universities around the world. Collaborating organisations included the US Department of Energy, whose supercomputers, such as the exascale Frontier machine at Oak Ridge National Laboratory, were used to perform the physics simulations.
While commercial fusion energy has yet to move from theory into practice, this new research marks an important and promising milestone. Clean and abundant energy may yet become reality.
This is not directly related to fusion energy, so you might want to skip this section.
Caption: Type One Energy employees at the Bull Run [emphasis mine] Fossil Plant, soon to be home to the prototype Infinity One. Credit: Type One Energy
I wonder if anyone argued for a change of name given how charged the US history associated with ‘Bull Run’ is, from the First Battle of Bull Run Wikipedia entry, Note: Links have been removed,
The First Battle of Bull Run, called the Battle of First Manassas[1] by Confederate forces, was the first major battle of the American Civil War. The battle was fought on July 21, 1861, in Prince William County, Virginia, just north of what is now the city of Manassas and about thirty miles west-southwest of Washington, D.C. The Union Army was slow in positioning themselves, allowing Confederate reinforcements time to arrive by rail. Each side had about 18,000 poorly trained and poorly led troops. The battle was a Confederate victory and was followed by a disorganized post-battle retreat of the Union forces.
…
A Confederate victory the first time and the second time (Second Battle of Bull Run Wikipedia entry)? For anyone unfamiliar with the history, the US Civil War was fought from 1861 to 1865 between Union and Confederate forces. The Confederate states had seceded from the union (US) and were fighting to retain their slavery-based economy; they lost the war.
Had anyone consulted me, I would have advised changing the name from Bull Run to something less charged (pun noted) before using the site to host a prototype fusion energy pilot plant.
Back to the usual programme.
Type One Energy
Type One Energy issued a March 27, 2025 news release about the special issue of the Journal of Plasma Physics (JPP), Note 1: Some of this is redundant; Note 2: Links have been removed,
Type One Energy announced today publication of the world’s first comprehensive, self-consistent, and robust physics basis, with conservative design margins, for a practical fusion pilot power plant. This physics basis is presented in a series of seven peer-reviewed scientific papers in a special issue of the prestigious Journal of Plasma Physics (JPP). They serve as the foundation for the company’s first Infinity Two stellarator fusion power plant project, which Type One Energy is developing for the Tennessee Valley Authority (TVA) utility in the U.S.
The Infinity Two fusion pilot power plant physics design basis realistically considers, for the first time, the complex relationship between competing requirements for plasma performance, power plant startup, construction logistics, reliability, and economics utilizing actual power plant operating experience. This Infinity Two baseline physics solution makes use of the inherently favorable operating characteristics of highly optimized stellarator fusion technology using modular superconducting magnets, as was so successfully proven on the W7-X science machine in Germany.
“Why are we the first private fusion company with an agreement to develop a potential fusion power plant project for an energy utility? Because we have a design anchored in reality,” said Christofer Mowry, CEO of Type One Energy. “The physics basis for Infinity Two is grounded in the knowledge of what is required for application to, and performance in, the demanding environment of reliable electrical generation for the power grid. We have an organization that understands this isn’t about designing a science project.”
Led by Chris Hegna, widely recognized as a leading theorist in modern stellarators, Type One Energy performed high-fidelity computational plasma physics analyses to substantially reduce the risk of meeting Infinity Two power plant functional and performance requirements. This unique and transformational achievement is the result of a global development program led by the Type One Energy plasma physics and stellarator engineering organization, with significant contributions from a broad coalition of scientists from national laboratories and universities around the world. The company made use of a spectrum of high-performance computing facilities, including access to the highest-performance U.S. Department of Energy supercomputers such as the exascale Frontier machine at Oak Ridge National Laboratory (ORNL), to perform its stellarator physics simulations.
“We committed to this ambitious fusion commercialization milestone two years ago and today we delivered,” said John Canik, Chief Science and Engineering Officer for Type One Energy. “The team was able to efficiently develop deep plasma physics insights to inform the design of our Infinity Two stellarator, by taking advantage of our access to high performance computing resources. This enabled the Type One Energy team to demonstrate a realistic, integrated stellarator design that moves far beyond conventional thinking and concepts derived from more limited modeling capabilities.”
The consistent and robust physics solution for Infinity Two results in a deuterium-tritium (D-T) fueled, burning plasma stellarator with 800 MW of fusion power and delivers a nominal 350 MWe to the power grid. It is characterized by fusion plasma with resilient and stable behavior across a broad range of operating conditions, very low heat loss due to turbulent transport, as well as tolerable direct energy losses to the stellarator first wall. The Infinity Two stellarator has sufficient room for both adequately sized island divertors to exhaust helium ash and a blanket which provides appropriate shielding and tritium breeding. Type One Energy has high confidence that this essential physics solution provides a good baseline stellarator configuration for the Infinity Two fusion pilot power plant.
“The articles in this issue [of JPP] represent an important step towards a fusion reactor based on the stellarator concept. Thanks to decades of experiments and theoretical research, much of the latter published in JPP, it has become possible to lay out the physics basis for a stellarator power plant in considerable detail,” said Per Helander, head of Stellarator Theory Division at the Max Planck Institute for Plasma Physics. “JPP is very happy to publish this series of papers from Type One Energy, where this has been accomplished in a way that sets new standards for the fidelity and confidence level in this context.”
Important to successful fusion power plant commercialization, this stellarator configuration has enabled Type One Energy to architect a maintenance solution which supports good power plant Capacity Factors (CF) and associated Levelized Cost of Electricity (LCOE). It also supports favorable regulatory requirements for component manufacturing and power plant construction methods essential to achieving a reasonable Over-Night Cost (ONC) for Infinity Two.
About Type One Energy
Type One Energy Group is mission-driven to provide sustainable, affordable fusion power to the world. Established in 2019 and venture-backed in 2023, the company is led by a team of globally recognized fusion scientists with a strong track record of building state-of-the-art stellarator fusion machines, together with veteran business leaders experienced in scaling companies and commercializing energy technologies. Type One Energy applies proven advanced manufacturing methods, modern computational physics and high-field superconducting magnets to develop its optimized stellarator fusion energy system. Its FusionDirect development program pursues the lowest-risk, shortest-schedule path to a fusion power plant over the coming decade, using a partner-intensive and capital-efficient strategy. Type One Energy is committed to community engagement in the development and deployment of its clean energy technology. For more information, visit www.typeoneenergy.com or follow us on LinkedIn.
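Before moving on, the headline power figures in the release lend themselves to a quick sanity check. Here is a crude sketch of the grid-output fraction implied by the quoted 800 MW and 350 MWe numbers; it ignores blanket energy multiplication, thermal-cycle details, and recirculating power, so it is only a rough ratio, not a statement about the actual plant design:

```python
# Crude ratio of grid output to fusion power for Infinity Two,
# using only the figures quoted in the release.
P_fusion_MW = 800.0  # fusion power produced by the plasma
P_grid_MWe = 350.0   # nominal electrical output delivered to the grid

print(f"Implied output fraction = {P_grid_MWe / P_fusion_MW:.0%}")
# Roughly 44% of the fusion power shows up as electricity in these figures.
```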
While the company is currently headquartered in Knoxville, Tennessee, it was originally a spinoff company from the University of Wisconsin-Madison according to a March 30, 2023 posting on the university’s College of Engineering website,
Type One Energy, a Middleton, Wisconsin-based fusion energy company with roots in the University of Wisconsin-Madison’s College of Engineering, recently announced its first round of seed funding, raising $29 million from investors. The company has also onboarded a new, highly experienced CEO [Christofer Mowry].
Type One, founded in 2019 by a team of globally recognized fusion scientists and business leaders, is hoping to commercialize stellarator technology over the next decade. Stellarators are a type of fusion reactor that uses powerful magnets to confine ultra-hot streams of plasma in order to create the conditions for fusion reactions. Energy from fusion promises to be clean, safe, renewable power. The company is using advanced manufacturing methods, modern computational physics and high-field superconducting magnets to develop its stellarator through an initiative called FusionDirect.
…
According to Type One Energy’s About page, there are four offices, with the headquarters in Tennessee,
Madison 316 W Washington Ave. Suite 300 Madison, WI 53703
Boston 299 Washington St. Suites C & E Woburn, MA 01801
Vancouver 1140 West Pender St. Vancouver, BC V6E 4G1
The mention of an office in Vancouver, Canada, piqued my curiosity, but before getting to that, I’m going to include some informative excerpts about nuclear energy (both fission and fusion) from this August 31, 2023 article written by Tina Tosukhowong on behalf of TDK Ventures, which was posted on Medium,
Fusion power is the key to the energy transformation that humanity needs to drive decarbonization, clean, and baseload energy production that is inherently fail-safe, with no risk of long-lived radioactive waste, while also delivering on ever-growing energy-consumption demands at the global scale. Fusion is hard and requires exceptional conditions for sustained reaction (which is part of what makes it so safe), which has long served as a deterrent for technical maturation and industrial viability. …
…
The current reality of our world is monumental fossil-fuel dependence. This, coupled with unprecedented levels of energy demand has resulted in the over 136,700 TWh (that’s 10¹²) of energy consumed via fossil fuels annually [1]. Chief repercussion among the many consequences of this dependence is the now very looming threat of climate catastrophe, which will soon be irreversible if global temperature rise is not abated and held to within 1.5 °C of pre-industrial levels. To do so, the nearly 40 gigatons of CO2 emissions generated each year must be steadily reduced and eventually mitigated entirely [2]. A fundamental shift in how power is generated globally is the only way forward. Humanity needs an energy transformation — the right energy transformation.
Alternative energy-generation techniques, such as wind, solar, geothermal, and hydroelectric approaches have all made excellent strides, and indeed in just the United States electricity generated by renewable methods doubled from 10 to 20% of total between 2010 and 2020 [3–4]. These numbers are incredibly encouraging and give significant credence in the journey to net-zero emission energy generation. However, while these standard renewable approaches should be championed, wind and solar are intermittent and require a large amount of land to deploy, while geothermal and hydroelectric are not available in every geography.
By far the most viable candidates for continuous clean energy generation to replace coal-fired power plants are nuclear-driven technologies, i.e. nuclear fission or nuclear fusion. Nuclear fission has been a proven effective method ever since it was first demonstrated almost 80 years ago underneath the University of Chicago football Stadium by Nobel Laureate Enrico Fermi [5]. Heavier atomic elements, in most cases Uranium-235, are exposed to and bombarded by neutrons. This causes the Uranium to split resulting in two slightly less-heavy elements (like Barium and Krypton). This in turn causes energy to be released and more neutrons to be ejected and bombard other nearby Uranium-235, at which point the process cascades into a chain reaction. The released energy (heat) is utilized in the same way coal is burned in a traditional power plant, being subsequently used to generate electricity usually via the creation of steam to drive a turbine [6]. While already having reached viable commercial maturity, fission carries inherent and nontrivial safety concerns. An unhampered chain reaction can quickly lead to meltdown with disastrous consequences, and, even when properly managed, the end reaction does generate radioactive waste whose half-life can last hundreds of thousands of years.
Figure 1. Breakdown of a nuclear fission reaction [6]. An incident neutron bombards a fissile heavy element, splitting it and releasing energy and more neutrons, setting off a chain reaction.
Especially given modernization efforts and meteoric gains in safety (thanks to advents in material science like ceramic coatings), fission will continue to be a critical piece to better, greener energy transformation. However, in extending our vision to an even brighter future with no such concerns — carbon emissions or safety — nuclear fusion is humanity’s silver bullet. Instead of breaking down atoms leading to a chain reaction, fusion is the combining of atoms (usually isotopes of Hydrogen) into heavier elements which also results in energy release / heat generation [7]. Like fission, fusion can be designed to be a continuous energy source that can serve as a permanent backbone to the power grid. It is extremely energy dense, with 1 kg of fusion fuel producing the same amount of energy as 1,000,000 kg of coal, and it is inherently fail-safe with no long-term radioactive waste.
…
As a concept, if fusion is a silver bullet to answer humanity’s energy transformation needs, then why haven’t we done so already? The appeal seems so obvious, what’s the hold up? Simply put, nuclear fusion is hard for the very same reason the process is inherently safe. Atoms in the process must have enough energy to overcome electrostatic repulsive forces between the two positive charges of their nuclei to fuse. The key figure of merit to evaluate fusion is the so-called “Lawson Triple Product.” Essentially, this means in order to generate energy by fusion at a rate greater than the rate of energy loss to the environment, the nuclei must be very close together (as represented by n — the plasma density), kept at a high enough temperature (as represented by T — temperature), and for long enough time to sustain fusion (as represented by τ — the confinement time). The triple product required to achieve fusion “ignition” (the state where the rate of energy production is higher than the rate of loss) depends on the fuel type and occurs within a plasma state. A deuterium and tritium (D-T) system has the lowest Lawson Triple Product requirement, where fusion can achieve a viable threshold for ignition when the density of the fuel atoms, n, multiplied by the fuel temperature, T, multiplied by the confinement time, τ, is greater than 5×10²¹ (nTτ > 5×10²¹ keV-s/m³) [8–9]. For context, the temperature alone in this scenario must be higher than 100-million degrees Celsius.
Figure 2. (Left) Conceptual illustration of a fusion reaction with Deuterium (²H) and Tritium (³H) forming an Alpha particle (⁴He) and a free neutron, along with energy released as heat. (Right) To initiate fusion, repelling electrostatic charge must be overcome via conditions meeting the minimum Lawson Triple Product threshold.
…
Tosukhowong’s August 31, 2023 article provides a good overview keeping in mind that it is slanted to justify TDK’s investment in Type One Energy.
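To make the triple-product arithmetic concrete, here is a minimal sketch that checks a set of plasma parameters against the approximate D-T ignition threshold quoted above; the example values are illustrative, not measurements from any particular machine:

```python
# Lawson triple product check for a D-T plasma.
# Threshold quoted in the article: nT*tau > 5e21 keV*s/m^3.
LAWSON_DT = 5e21  # keV * s / m^3

def triple_product(n_m3, T_keV, tau_s):
    """Return the Lawson triple product n*T*tau in keV*s/m^3."""
    return n_m3 * T_keV * tau_s

# Illustrative reactor-grade values: density 1e20 m^-3,
# temperature 15 keV (well above 100 million K), confinement 4 s.
nTtau = triple_product(1e20, 15.0, 4.0)
verdict = "meets" if nTtau > LAWSON_DT else "falls short of"
print(f"nT*tau = {nTtau:.1e} keV*s/m^3, which {verdict} the threshold")
```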
Why a Vancouver, Canada office?
As for Type One Energy’s Vancouver (British Columbia, Canada) connection, while speculating about it I was reminded of General Fusion, a local fusion energy company. First speculative question: could Type One Energy’s presence in Canada allow it to access Canadian government funds for its research? Second speculative question: do they want to have access to people who might hesitate to move to the US, or might want to move out of the US, but would move to Canada?
The US is currently in an unstable state, as suggested in this April 3, 2025 opinion piece by Les Leyne for vancouverisawesome.com,
Les Leyne: Trump’s incoherence makes responding to tariff wall tricky
Trump’s announcement was so incoherent that much of the rest of the world had to scramble to grasp even the basic details
B.C. officials were guarded Wednesday [April 2, 2025] about the impact on Canada of the tariff wall U.S. President Donald Trump erected around the U.S., but it appears it could have been worse.
Trump’s announcement was so incoherent that much of the rest of the world had to scramble to grasp even the basic details. So cabinet ministers begged for more time to check the impacts.
“It’s still very uncertain,” said Housing Minister Ravi Kahlon, who chairs the “war room” committee responsible for countering tariff threats. “It’s hard to make sense from President Trump’s speech.” [emphasis mine]
Kahlon said the challenge is that tariff policies change hour by hour, “and anything can happen.”
…
On April 2, 2025 US President Donald Trump announced tariffs (then paused some of the tariffs on April 9, 2025) and some of the targets seemed a bit odd, from an April 2, 2025 article by Alex Galbraith for salon.com, Note: Links have been removed,
“Trade war with penguins”: Trump places 10% tariff on uninhabited Antarctic islands
Planned tariffs shared by the White House included a 10% duty on imports from the barren Heard and McDonald Islands
For once in his life, Donald Trump underpromised and over-delivered.
The president announced a 10% duty on all imports on Wednesday [April 2, 2025], along with a raft of reciprocal tariffs on U.S. trading partners. An extensive graphic released by the White House showed how far Trump was willing to take his tit-for-tat trade war, including a shocking levy of 10% on all imports from the Heard and McDonald Islands.
If you haven’t heard of this powerhouse of global trade and territory of Australia, you aren’t alone. Few have outside of Antarctic researchers and seals. These extremely remote islands about 1,000 miles north of Antarctica consist mostly of barren tundra. They’re also entirely uninhabited.
The news that we were starting a trade war with penguins spread quickly after Trump’s announcement. …
U.S. stock futures crumbled following the news of Trump’s widespread tariffs. Dow futures fell by nearly 1,000 points while NASDAQ and S&P futures fell by 3 to 4%. American companies’ stock values rapidly tumbled after the announcement, with large retail importers seeing significant losses. …
No word from the penguins about the ‘pause’. I’m assuming Donald Trump’s next book will be titled, “The art of negotiating trade deals with penguins.” Can’t wait to read it.
(Perhaps someone should tell him there are no penguins in the Arctic so he can’t bypass Canadians or Greenlanders to make a deal.)
Now for the local story.
General Fusion
There’ve been two recent developments at General Fusion. Most recently, an April 2, 2025 General Fusion news release announces a new hire, Note: Links have been removed,
Bob Smith is joining General Fusion as a strategic advisor. Smith brings more than 35 years of experience developing, scaling, and launching world-changing technologies, including spearheading new products and innovation in the aerospace industry at United Space Alliance, Sandia Labs, and Honeywell before serving as CEO of Blue Origin. He joins General Fusion as the company’s Lawson Machine 26 (LM26) fusion demonstration begins operations and progresses toward transformative technical milestones on the path to commercialization.
“I’ve been watching the fusion energy industry closely for my entire career. Fusion is the last energy source humanity will ever need, and I believe its impact as a zero-carbon energy source will transform the global energy supply at the time needed to fight the worst consequences of climate change,” said Smith. “I am thrilled to work with General Fusion. Their novel approach has inherent and distinctive benefits for the generation of commercially competitive fusion power. It’s exciting to join at a time when the team is about to demonstrate the fundamental physics behind their system and move to scaling up to a pilot plant.”
The LM26 program marks a significant step towards commercialization, as the company’s unique Magnetized Target Fusion (MTF) approach makes the path to powering the grid with fusion energy more straightforward than other technologies—because it practically addresses barriers to fusion commercialization, such as neutron material degradation, sustainable fuel production, and efficient energy extraction. As a strategic advisor, Smith will leverage his experience advancing game-changing technologies to help guide General Fusion’s technology development and strategic growth.
“Bob’s insights and experience will be invaluable as we execute the LM26 program and look beyond it to propel our practical technology to powering the grid by the mid-2030s,” said Greg Twinney, CEO, General Fusion. “We are grateful for his commitment of his in-demand time and expertise to our mission and look forward to working together to make fusion power a reality!”
About Bob Smith:
Bob is an experienced business leader in the aerospace and defense industry with extensive technical and operational expertise across the sector. He worked at and managed federal labs, led developments at a large government contractor, grew businesses at a Fortune 100 multinational, and scaled up a launch and space systems startup. Bob also has extensive international experience and has worked with suppliers and OEMs in all the major aerospace regions, including establishing new sites and factories in Europe, India, China, and Puerto Rico.
Bob’s prior leadership roles include Chairman and Chief Executive Officer of Blue Origin, President of Mechanical Systems & Components at Honeywell Aerospace, Chief Technology Officer at Honeywell Aerospace, Chairman of NTESS (Sandia Labs), and Executive Director of Space Shuttle Upgrades at United Space Alliance.
Bob holds a Bachelor of Science degree in aerospace engineering from Texas A&M, a Master of Science degree in engineering/applied mathematics from Brown University, a doctorate from the University of Texas in aerospace engineering, and a business degree from MIT’s Sloan School of Management. Bob is also a Fellow of the Royal Aeronautical Society, a Fellow of the American Institute of Aeronautics and Astronautics, and an Academician in the International Academy of Astronautics.
Quick Facts:
Fusion energy is the ultimate clean energy solution—it is the energy source that powers the sun and stars. Fusion is the process by which two light nuclei merge to form a heavier one, producing a massive amount of energy.
General Fusion’s Magnetized Target Fusion (MTF) technology is designed to scale for cost-efficient power plants. It uses mechanical compression to create fusion conditions in short pulses, eliminating the need for expensive lasers or superconducting magnets. An MTF power plant is designed to produce its own fuel and inherently includes a method to extract the energy and put it to work.
Lawson Machine 26 (LM26) is a world-first Magnetized Target Fusion demonstration. Launched, designed, and assembled in just 16 months, the machine is now forming magnetized plasmas regularly at 50 per cent commercial scale. It is advancing towards a series of results that will demonstrate MTF in a commercially relevant way: 10 million degrees Celsius (1 keV), 100 million degrees Celsius (10 keV), and scientific breakeven equivalent (100% Lawson).
About General Fusion
General Fusion is pursuing a fast and practical approach to commercial fusion energy and is headquartered in Richmond, Canada. The company was established in 2002 and is funded by a global syndicate of leading energy venture capital firms, industry leaders, and technology pioneers. Learn more at www.generalfusion.com.
…
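As an aside, the keV milestones quoted for LM26 map onto the temperature figures via the standard conversion 1 eV ≈ 11,605 K; the release’s 10 million and 100 million degree figures are round numbers. A minimal sketch of the arithmetic:

```python
# Convert plasma temperatures from keV to kelvin (1 eV ~ 11,604.5 K).
EV_TO_K = 11604.5

for label, T_keV in [("LM26 milestone 1", 1.0), ("LM26 milestone 2", 10.0)]:
    T_K = T_keV * 1000.0 * EV_TO_K
    print(f"{label}: {T_keV:g} keV = {T_K:.2e} K "
          f"(~{T_K / 1e6:.0f} million degrees)")
```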
Bob Smith and Blue Origin: things did not go well
Sometimes you end up in a job and things do not work out well; that seems to have been the case at Blue Origin, according to a September 25, 2023 article by Eric Berger for Ars Technica,
After six years of running Blue Origin, Bob Smith announced in a company-wide email on Monday that he will be “stepping aside” as chief executive of the space company founded by Jeff Bezos.
“It has been my privilege to be part of this great team, and I am confident that Blue Origin’s greatest achievements are still ahead of us,” Smith wrote in an email. “We’ve rapidly scaled this company from its prototyping and research roots to a large, prominent space business.”
Shortly after Smith’s email, a Blue Origin spokesperson said the company’s new chief executive will be Dave Limp, who stepped down as Amazon’s vice president of devices and services last month.
…
To put things politely, Smith has had a rocky tenure as Blue Origin’s chief executive. After being personally vetted and hired by Bezos, Smith took over from Rob Meyerson in 2017. The Honeywell engineer was given a mandate to transform Blue Origin into a large and profitable space business.
He did succeed in growing Blue Origin. The company had about 1,500 employees when Smith arrived, and the company now employs nearly 11,000 people. But he has been significantly late on a number of key programs, including the BE-4 rocket engine and the New Glenn rocket.
As a space reporter, I have spoken with dozens of current and former Blue Origin employees, and virtually none of them have had anything positive to say about Smith’s tenure as chief executive. I asked one current employee about the hiring of Limp on Monday afternoon, and their response was, “Anything is better than Bob.”
Although it is very far from an exact barometer, Smith has received consistently low ratings on Glassdoor for his performance as chief executive of Blue Origin. And two years ago, a group of current and former Blue Origin employees wrote a blistering letter about the company under Smith. “In our experience, Blue Origin’s culture sits on a foundation that ignores the plight of our planet, turns a blind eye to sexism, is not sufficiently attuned to safety concerns, and silences those who seek to correct wrongs,” the essay authors wrote.
With any corporate culture, there will be growing pains, of course. But Smith brought a traditional aerospace mindset into a company that had hitherto been guided by a new space vision, leading to a high turnover rate. And Blue Origin remains significantly underwater, financially. It is likely that Bezos is still providing about $2 billion a year to support the company’s cash needs.
Crucially, as Blue Origin meandered under Smith’s tenure, SpaceX soared, launching hundreds of rockets and thousands of satellites. Smith, clearly, was not the leader Blue Origin needed to make the company more competitive with SpaceX in launch and other spaceflight activities. It became something of a parlor game in the space industry to guess when Bezos would finally get around to firing Smith.
…
On the technical front, a March 27, 2025 General Fusion news release announces “Peer-reviewed publication confirms General Fusion achieved plasma energy confinement time required for its LM26 large-scale fusion machine,” Note: Links have been removed,
New results published in Nuclear Fusion confirm General Fusion successfully created magnetized plasmas that achieved energy confinement times exceeding 10 milliseconds. The published energy confinement time results were achieved on General Fusion’s PI3 plasma injector — the world’s largest and most powerful plasma injector of its kind. Commissioned in 2017, PI3 formed approximately 20,000 plasmas in a machine of 50 per cent commercial scale. The plasma injector is now integrated into General Fusion’s Lawson Machine 26 (LM26) — a world-first Magnetized Target Fusion demonstration tracking toward game-changing technical milestones that will advance the company’s ultimate mission: generating zero-carbon fusion energy for the grid in the next decade.
The 10-millisecond energy confinement time is the duration required to compress plasmas in LM26 to achieve key temperature thresholds of 1 keV, 10 keV, and, ultimately, scientific breakeven equivalent (100% Lawson). These results were imperative to de-risking LM26. The demonstration machine is now forming plasmas regularly, and the company is optimizing its plasma performance in preparation for compressing plasmas to create fusion and heating from compression.
Key Findings:
The plasma injector now integrated into General Fusion’s LM26 achieved energy confinement times exceeding 10 milliseconds, the pre-compression confinement time required for LM26’s targeted technical milestones. These results were achieved without requiring active magnetic stabilization or auxiliary heating. This means the results were achieved without superconducting magnets, demonstrating the company’s cost-effective approach.
The plasma’s energy confinement time improved when the plasma injector vessel was coated with natural lithium. A key differentiator in General Fusion’s commercial approach is its use of a liquid lithium wall to compress plasmas during compression. In addition to the confinement time advantages shown in this paper, the liquid lithium wall will also protect a commercial MTF machine from neutron damage, enable the machine to breed its own fuel, and provide an efficient method for extracting energy from the machine.
The maximum energy confinement time achieved by PI3 was approximately 12 milliseconds. The machine’s maximum plasma density was approximately 6×10¹⁹ m⁻³, and maximum plasma temperatures exceeded 400 eV. These strong pre-compression results support LM26’s transformative targets.
Quotes:
“LM26 is designed to achieve a series of results that will demonstrate MTF in a commercially relevant way. Following LM26’s results, our unique approach makes the path to powering the grid with fusion energy more straightforward than other technologies because we have front-loaded the work to address the barriers to commercialization.”
Dr. Michel Laberge, Founder and Chief Science Officer
“For over 16 years, I have worked hand in hand with Michel to advance General Fusion’s practical technology. This company is entrepreneurial at its core. We pride ourselves on building real machines that get results that matter, and I’m thrilled to have the achievements recognized in Nuclear Fusion.”
Mike Donaldson, Senior Vice President, Technology Development
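Plugging the release’s approximate pre-compression PI3 figures into the same triple-product arithmetic used earlier shows how much ground the compression stage is meant to make up. A rough back-of-envelope sketch, using only the numbers quoted above:

```python
# Rough pre-compression triple product for PI3, from the release's figures.
n = 6e19      # m^-3, maximum plasma density
T_keV = 0.4   # keV (400 eV), maximum plasma temperature
tau = 0.012   # s, maximum energy confinement time (~12 ms)

nTtau = n * T_keV * tau
print(f"PI3 pre-compression nT*tau = {nTtau:.1e} keV*s/m^3")
# ~2.9e17, roughly four orders of magnitude below the ~5e21 D-T
# threshold; closing that gap is what MTF compression is for.
```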
For anyone curious about General Fusion, I have a brief overview and history of the company and their particular approach to fusion energy in my February 6, 2024 posting (scroll down to ‘The Canadians’).
Before diving into some of the latest quantum computing doings, here’s why quantum computing is so highly prized and chased after, from the Quantum supremacy Wikipedia entry, Note: Links have been removed,
In quantum computing, quantum supremacy or quantum advantage is the goal of demonstrating that a programmable quantum computer can solve a problem that no classical computer can solve in any feasible amount of time, irrespective of the usefulness of the problem.[1][2][3] The term was coined by John Preskill in 2011,[1][4] but the concept dates to Yuri Manin’s 1980[5] and Richard Feynman’s 1981[6] proposals of quantum computing.
…
Quantum supremacy and quantum advantage have been mentioned a few times here over the years. You can check my March 6, 2020 posting for when researchers from the University of California at Santa Barbara claimed quantum supremacy and my July 31, 2023 posting for when D-Wave Systems claimed a quantum advantage on optimization problems. I’d understood quantum supremacy and quantum advantage to be synonymous but according to the article in Betakit (keep scrolling down to the D-Wave subhead and then to the ‘A controversy of sorts’ subhead in this posting), that’s not so.
The latest news on the quantum front comes from Microsoft (February 2025) and D-Wave Systems (March 2025).
Microsoft claims a new state of matter for breakthroughs in quantum computing
Here’s the February 19, 2025 news announcement from Microsoft’s Chetan Nayak, Technical Fellow and Corporate Vice President of Quantum Hardware, Note: Links have been removed,
Quantum computers promise to transform science and society—but only after they achieve the scale that once seemed distant and elusive, and their reliability is ensured by quantum error correction. Today, we’re announcing rapid advancements on the path to useful quantum computing:
Majorana 1: the world’s first Quantum Processing Unit (QPU) powered by a Topological Core, designed to scale to a million qubits on a single chip.
A hardware-protected topological qubit: research published today in Nature, along with data shared at the Station Q meeting, demonstrate our ability to harness a new type of material and engineer a radically different type of qubit that is small, fast, and digitally controlled.
A device roadmap to reliable quantum computation: our path from single-qubit devices to arrays that enable quantum error correction.
Building the world’s first fault-tolerant prototype (FTP) based on topological qubits: Microsoft is on track to build an FTP of a scalable quantum computer—in years, not decades—as part of the final phase of the Defense Advanced Research Projects Agency (DARPA) Underexplored Systems for Utility-Scale Quantum Computing (US2QC) program.
Together, these milestones mark a pivotal moment in quantum computing as we advance from scientific exploration to technological innovation.
Harnessing a new type of material
All of today’s announcements build on our team’s recent breakthrough: the world’s first topoconductor. This revolutionary class of materials enables us to create topological superconductivity, a new state of matter that previously existed only in theory. The advance stems from Microsoft’s innovations in the design and fabrication of gate-defined devices that combine indium arsenide (a semiconductor) and aluminum (a superconductor). When cooled to near absolute zero and tuned with magnetic fields, these devices form topological superconducting nanowires with Majorana Zero Modes (MZMs) at the wires’ ends.
…
Chris Vallance’s February 19, 2025 article for the British Broadcasting Corporation (BBC) news online website provides a description of Microsoft’s claims and makes note of the competitive quantum research environment,
Microsoft has unveiled a new chip called Majorana 1 that it says will enable the creation of quantum computers able to solve “meaningful, industrial-scale problems in years, not decades”.
It is the latest development in quantum computing – tech which uses principles of particle physics to create a new type of computer able to solve problems ordinary computers cannot.
Creating quantum computers powerful enough to solve important real-world problems is very challenging – and some experts believe them to be decades away.
Microsoft says this timetable can now be sped up because of the “transformative” progress it has made in developing the new chip involving a “topological conductor”, based on a new material it has produced.
The firm believes its topoconductor has the potential to be as revolutionary as the semiconductor was in the history of computing.
But experts have told the BBC more data is needed before the significance of the new research – and its effect on quantum computing – can be fully assessed.
Jensen Huang – boss of the leading chip firm, Nvidia – said in January he believed “very useful” quantum computing would come in 20 years.
Chetan Nayak, a technical fellow of quantum hardware at Microsoft, said he believed the developments would shake up conventional thinking about the future of quantum computers.
“Many people have said that quantum computing, that is to say useful quantum computers, are decades away,” he said. “I think that this brings us into years rather than decades.”
Travis Humble, director of the Quantum Science Center of Oak Ridge National Laboratory in the US, said he agreed Microsoft would now be able to deliver prototypes faster – but warned there remained work to do.
“The long term goals for solving industrial applications on quantum computers will require scaling up these prototypes even further,” he said.
…
While rivals produced a steady stream of announcements – notably Google’s “Willow” at the end of 2024 – Microsoft seemed to be taking longer.
Pursuing this approach was, in the company’s own words, a “high-risk, high-rewards” strategy, but one it now believes is going to pay off.
Purdue University’s (Indiana, US) February 25, 2025 news release on EurekAlert announces publication of the research, Note: Links have been removed,
Microsoft Quantum published an article in Nature on Feb. 19 [2025] detailing recent advances in the measurement of quantum devices that will be needed to realize a topological quantum computer. Among the authors are Microsoft scientists and engineers who conduct research at Microsoft Quantum Lab West Lafayette, located at Purdue University. In an announcement by Microsoft Quantum, the team describes the operation of a device that is a necessary building block for a topological quantum computer. The published results are an important milestone along the path to construction of quantum computers that are potentially more robust and powerful than existing technologies.
“Our hope for quantum computation is that it will aid chemists, materials scientists and engineers working on the design and manufacturing of new materials that are so important to our daily lives,” said Michael Manfra, scientific director of Microsoft Quantum Lab West Lafayette and the Bill and Dee O’Brien Distinguished Professor of Physics and Astronomy, professor of materials engineering, and professor of electrical and computer engineering at Purdue. “The promise of quantum computation is in accelerating scientific discovery and its translation into useful technology. For example, if quantum computers reduce the time and cost to produce new lifesaving therapeutic drugs, that is real societal impact.”
The Microsoft Quantum Lab West Lafayette team advanced the complex layered materials that make up the quantum plane of the full device architecture used in the tests. Microsoft scientists working with Manfra are experts in advanced semiconductor growth techniques, including molecular beam epitaxy, that are used to build low-dimensional electron systems that form the basis for quantum bits, or qubits. They built the semiconductor and superconductor layers with atomic layer precision, tailoring the material’s properties to those needed for the device architecture.
Manfra, a member of the Purdue Quantum Science and Engineering Institute, credited the strong relationship between Purdue and Microsoft, built over the course of a decade, with the advances conducted at Microsoft Quantum Lab West Lafayette. In 2017 Purdue deepened its relationship with Microsoft with a multiyear agreement that includes embedding Microsoft employees with Manfra’s research team at Purdue.
“This was a collaborative effort by a very sophisticated team, with a vital contribution from the Microsoft scientists at Purdue,” Manfra said. “It’s a Microsoft team achievement, but it’s also the culmination of a long-standing partnership between Purdue and Microsoft. It wouldn’t have been possible without an environment at Purdue that was conducive to this mode of work — I attempted to blend industrial with academic research to the betterment of both communities. I think that’s a success story.”
Quantum science and engineering at Purdue is a pillar of the Purdue Computes initiative, which is focused on advancing research in computing, physical AI, semiconductors and quantum technologies.
“This research breakthrough in the measurement of the state of quasi particles is a milestone in the development of topological quantum computing, and creates a watershed moment in the semiconductor-superconductor hybrid structure,” Purdue President Mung Chiang said. “Marking also the latest success in the strategic initiative of Purdue Computes, the deep collaboration that Professor Manfra and his team have created with the Microsoft Quantum Lab West Lafayette on the Purdue campus exemplifies the most impactful industry research partnership at any American university today.”
Most approaches to quantum computers rely on local degrees of freedom to encode information. The spin of an electron is a classic example of a qubit. But an individual spin is prone to disturbance — by relatively common things like heat, vibrations or interactions with other quantum particles — which can corrupt quantum information stored in the qubit, necessitating a great deal of effort in detecting and correcting errors. Instead of spin, topological quantum computers store information in a more distributed manner; the qubit state is encoded in the state of many particles acting in concert. Consequently, it is harder to scramble the information as the state of all the particles must be changed to alter the qubit state.
In the Nature paper, the Microsoft team was able to accurately and quickly measure the state of quasi particles that form the basis of the qubit.
“The device is used to measure a basic property of a topological qubit quickly,” Manfra said. “The team is excited to build on these positive results.”
“The team in West Lafayette pushed existing epitaxial technology to a new state-of-the-art for semiconductor-superconductor hybrid structures to ensure a perfect interface between each of the building blocks of the Microsoft hybrid system,” said Sergei Gronin, a Microsoft Quantum Lab scientist.
“The materials quality that is required for quantum computing chips necessitates constant improvements, so that’s one of the biggest challenges,” Gronin said. “First, we had to adjust and improve semiconductor technology to meet a new level that nobody was able to achieve before. But equally important was how to create this hybrid system. To do that, we had to merge a semiconducting part and a superconducting part. And that means you need to perfect the semiconductor and the superconductor and perfect the interface between them.”
While work discussed in the Nature article was performed by Microsoft employees, the exposure to industrial-scale research and development is an outstanding opportunity for Purdue students in Manfra’s academic group as well. John Watson, Geoffrey Gardner and Saeed Fallahi, who are among the coauthors of the paper, earned their doctoral degrees under Manfra and now work for Microsoft Quantum at locations in Redmond, Washington, and Copenhagen, Denmark. Most of Manfra’s former students now work for quantum computing companies, including Microsoft. Tyler Lindemann, who works in the West Lafayette lab and helped to build the hybrid semiconductor-superconductor structures required for the device, is earning a doctoral degree from Purdue under Manfra’s supervision.
“Working in Professor Manfra’s lab in conjunction with my work for Microsoft Quantum has given me a head start in my professional development, and been fruitful for my academic work,” Lindemann said. “At the same time, many of the world-class scientists and engineers at Microsoft Quantum have some background in academia, and being able to draw from their knowledge and experience is an indispensable resource in my graduate studies. From both perspectives, it’s a great opportunity.”
Here’s a link to and a citation for the paper,
Interferometric single-shot parity measurement in InAs–Al hybrid devices by Microsoft Azure Quantum, Morteza Aghaee, Alejandro Alcaraz Ramirez, Zulfi Alam, Rizwan Ali, Mariusz Andrzejczuk, Andrey Antipov, Mikhail Astafev, Amin Barzegar, Bela Bauer, Jonathan Becker, Umesh Kumar Bhaskar, Alex Bocharov, Srini Boddapati, David Bohn, Jouri Bommer, Leo Bourdet, Arnaud Bousquet, Samuel Boutin, Lucas Casparis, Benjamin J. Chapman, Sohail Chatoor, Anna Wulff Christensen, Cassandra Chua, Patrick Codd, William Cole, Paul Cooper, Fabiano Corsetti, Ajuan Cui, Paolo Dalpasso, Juan Pablo Dehollain, Gijs de Lange, Michiel de Moor, Andreas Ekefjärd, Tareq El Dandachi, Juan Carlos Estrada Saldaña, Saeed Fallahi, Luca Galletti, Geoff Gardner, Deshan Govender, Flavio Griggio, Ruben Grigoryan, Sebastian Grijalva, Sergei Gronin, Jan Gukelberger, Marzie Hamdast, Firas Hamze, Esben Bork Hansen, Sebastian Heedt, Zahra Heidarnia, Jesús Herranz Zamorano, Samantha Ho, Laurens Holgaard, John Hornibrook, Jinnapat Indrapiromkul, Henrik Ingerslev, Lovro Ivancevic, Thomas Jensen, Jaspreet Jhoja, Jeffrey Jones, Konstantin V. Kalashnikov, Ray Kallaher, Rachpon Kalra, Farhad Karimi, Torsten Karzig, Evelyn King, Maren Elisabeth Kloster, Christina Knapp, Dariusz Kocon, Jonne V. Koski, Pasi Kostamo, Mahesh Kumar, Tom Laeven, Thorvald Larsen, Jason Lee, Kyunghoon Lee, Grant Leum, Kongyi Li, Tyler Lindemann, Matthew Looij, Julie Love, Marijn Lucas, Roman Lutchyn, Morten Hannibal Madsen, Nash Madulid, Albert Malmros, Michael Manfra, Devashish Mantri, Signe Brynold Markussen, Esteban Martinez, Marco Mattila, Robert McNeil, Antonio B. Mei, Ryan V. Mishmash, Gopakumar Mohandas, Christian Mollgaard, Trevor Morgan, George Moussa, Chetan Nayak, Jens Hedegaard Nielsen, Jens Munk Nielsen, William Hvidtfelt Padkar Nielsen, Bas Nijholt, Mike Nystrom, Eoin O’Farrell, Thomas Ohki, Keita Otani, Brian Paquelet Wütz, Sebastian Pauka, Karl Petersson, Luca Petit, Dima Pikulin, Guen Prawiroatmodjo, Frank Preiss, Eduardo Puchol Morejon, Mohana Rajpalke, Craig Ranta, Katrine Rasmussen, David Razmadze, Outi Reentila, David J. Reilly, Yuan Ren, Ken Reneris, Richard Rouse, Ivan Sadovskyy, Lauri Sainiemi, Irene Sanlorenzo, Emma Schmidgall, Cristina Sfiligoj, Mustafeez Bashir Shah, Kevin Simoes, Shilpi Singh, Sarat Sinha, Thomas Soerensen, Patrick Sohr, Tomas Stankevic, Lieuwe Stek, Eric Stuppard, Henri Suominen, Judith Suter, Sam Teicher, Nivetha Thiyagarajah, Raj Tholapi, Mason Thomas, Emily Toomey, Josh Tracy, Michelle Turley, Shivendra Upadhyay, Ivan Urban, Kevin Van Hoogdalem, David J. Van Woerkom, Dmitrii V. Viazmitinov, Dominik Vogel, John Watson, Alex Webster, Joseph Weston, Georg W. Winkler, Di Xu, Chung Kai Yang, Emrah Yucelen, Roland Zeisel, Guoji Zheng & Justin Zilke. Nature 638, 651–655 (2025). DOI: https://doi.org/10.1038/s41586-024-08445-2 Published online: 19 February 2025 Issue Date: 20 February 2025
This paper is open access. Note: I usually tag all of the authors but not this time.
Controversy over this and previous Microsoft quantum computing claims
Elizabeth Hlavinka’s March 17, 2025 article for Salon.com provides an overview, Note: Links have been removed,
The matter making up the world around us has long since been organized into three neat categories: solids, liquids and gases. But last month [February 2025], Microsoft announced that it had allegedly discovered another state of matter, originally theorized to exist in 1937.
This new state of matter called the Majorana zero mode is made up of quasiparticles, which act as their own particle and antiparticle. The idea is that the Majorana zero mode could be used to build a quantum computer, which could help scientists answer complex questions that standard computers are not capable of solving, with implications for medicine, cybersecurity and artificial intelligence.
In late February [2025], Sen. Ted Cruz presented Microsoft’s new computer chip at a congressional hearing, saying, “Technologies like this new chip I hold in the palm of my hand, the Majorana 1 quantum chip, are unlocking a new era of computing that will transform industries from health care to energy, solving problems that today’s computers simply cannot.”
However, Microsoft’s announcement, claiming a “breakthrough in quantum computing,” was met with skepticism from some physicists in the field. Proving that this form of quantum computing can work requires first demonstrating the existence of Majorana quasiparticles, measuring what the Majorana particles are doing, and creating something called a topological qubit used to store quantum information.
But some say that not all of the data necessary to prove this has been included in the research paper published in Nature, on which this announcement is based. And due to a fraught history of similar claims from the company being disputed and ultimately rescinded, some are extra wary of the results. [emphasis mine]
…
It’s not the first time Microsoft has faced backlash from presenting findings in the field. In 2018, the company reported that it had detected the presence of Majorana zero-modes in a research paper, but the paper was retracted by Nature, the journal that published it, after a report from independent experts put the findings under more intense scrutiny.
In the [2018] report, four physicists not involved in the research concluded that it did not appear that Microsoft had intentionally misrepresented the data, but instead seemed to be “caught up in the excitement of the moment [emphasis mine].”
Establishing the existence of these particles is extremely complex in part because disorder in the device can create signals that mimic these quasiparticles when they are not actually there.
Modern computers are encoded in bits, which can either be in a zero state (no current flowing through them) or a one state (current flowing). These bits work together to send information and signals that communicate with the computer, powering everything from cell phones to video games.
Companies like Google, IBM and Amazon have invested in designing another form of quantum computer that uses chips built with “qubits,” or quantum bits. Qubits can exist in both zero and one states at the same time due to a phenomenon called superposition.
However, qubits are subject to external noise from the environment that can affect their performance, said Dr. Paolo Molignini, a researcher in theoretical quantum physics at Stockholm University.
“Because qubits are in a superposition of zero and one, they are very prone to errors and they are very prone to what is called decoherence, which means there could be noise, thermal fluctuations or many things that can collapse the state of the qubits,” Molignini told Salon in a video call. “Then you basically lose all of the information that you were encoding.”
…
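For readers who like to see the ideas in code, here’s a minimal Python sketch of what superposition and decoherence mean. Everything in it is illustrative; the 100-microsecond coherence time is an assumption I picked for the example, not a figure from Microsoft, Google, or anyone else,

```python
import numpy as np

# A qubit in an equal superposition of |0> and |1>: (|0> + |1>)/sqrt(2)
psi = np.array([1.0, 1.0]) / np.sqrt(2)

# Born rule: measurement probabilities are the squared amplitudes.
print(np.abs(psi) ** 2)  # -> [0.5 0.5], an equal chance of reading 0 or 1

# Toy dephasing model: the off-diagonal "coherence" terms of the
# density matrix decay as exp(-t/T2); the 0/1 populations are untouched.
rho = np.outer(psi, psi.conj())
T2 = 100e-6  # assumed coherence time of 100 microseconds (illustrative)
for t in (0.0, 50e-6, 200e-6):
    rho_t = rho.copy()
    rho_t[0, 1] *= np.exp(-t / T2)
    rho_t[1, 0] *= np.exp(-t / T2)
    print(f"t = {t:.0e} s\n{np.round(rho_t, 3)}")
```

The point of the toy model is Molignini’s: the 0/1 probabilities survive, but the off-diagonal coherences, which are what quantum algorithms actually exploit, decay away.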
In December [2024], Google said its quantum computer could perform a calculation that a standard computer could complete in 10 septillion years — a period far longer than the age of the universe — in just under five minutes.
However, a general-purpose computer would require billions of qubits, so these approaches are still a far cry from having practical applications, said Dr. Patrick Lee, a physicist at the Massachusetts Institute of Technology [MIT], who co-authored the report leading to the 2018 Nature paper’s retraction.
Microsoft is taking a different approach to quantum computing by trying to develop a topological qubit, which has the ability to store information in multiple places at once. Topological qubits exist within the Majorana zero states and are appealing because they can theoretically offer greater protection against environmental noise that destroys information within a quantum system.
Think of it like an arrow, where the arrowhead holds a portion of the information and the arrow tail holds the rest, Lee said. Distributing information across space like this is called topological protection.
“If you are able to put them far apart from each other, then you have a chance of maintaining the identity of the arrow even if it is subject to noise,” Lee told Salon in a phone interview. “The idea is that if the noise affects the head, it doesn’t kill the arrow and if it affects only the tail it doesn’t kill your arrow. It has to affect both sides simultaneously to kill your arrow, and that is very unlikely if you are able to put them apart.”
…
… Lee believes that even if the data doesn’t entirely prove that topological qubits exist in the Majorana zero-state, it still represents a scientific advancement. But he noted that several important issues need to be solved before it has practical implications. For one, the coherence time of these particles — or how long they can exist without being affected by environmental noise — is still very short, he explained.
“They make a measurement, come back, and the qubit has changed, so you have lost your coherence,” Lee said. “With this very short time, you cannot do anything with it.”
…
“I just wish they [Microsoft] were a bit more careful with their claims because I fear that if they don’t measure up to what they are saying, there might be a backlash at some point where people say, ‘You promised us all these fancy things and where are they now?’” Molignini said. “That might damage the entire quantum community, not just themselves.”
D-Wave Quantum Systems claims quantum supremacy over real world problem solution
A March 15, 2025 article by Bob Yirka for phys.org announces the news from D-Wave Quantum Systems. Note: The company, which had its headquarters in Canada (Burnaby, BC), now seems to be largely a US company, with its main headquarters in Palo Alto, California, and an ancillary or junior (?) headquarters in Canada. Note: A link has been removed,
A team of quantum computer researchers at quantum computer maker D-Wave, working with an international team of physicists and engineers, is claiming that its latest quantum processor has been used to run a quantum simulation faster than could be done with a classical computer.
In their paper published in the journal Science, the group describes how they ran a quantum version of a mathematical approximation regarding how matter behaves when it changes states, such as from a gas to a liquid—in a way that they claim would be nearly impossible to conduct on a traditional computer.
New landmark peer-reviewed paper published in Science, “Beyond-Classical Computation in Quantum Simulation,” unequivocally validates D-Wave’s achievement of the world’s first and only demonstration of quantum computational supremacy on a useful, real-world problem
Research shows D-Wave annealing quantum computer performs magnetic materials simulation in minutes that would take nearly one million years and more than the world’s annual electricity consumption to solve using a classical supercomputer built with GPU clusters
D-Wave Advantage2 annealing quantum computer prototype used in supremacy achievement, a testament to the system’s remarkable performance capabilities
PALO ALTO, Calif. – March 12, 2025 – D-Wave Quantum Inc. (NYSE: QBTS) (“D-Wave” or the “Company”), a leader in quantum computing systems, software, and services and the world’s first commercial supplier of quantum computers, today announced a scientific breakthrough published in the esteemed journal Science, confirming that its annealing quantum computer outperformed one of the world’s most powerful classical supercomputers in solving complex magnetic materials simulation problems with relevance to materials discovery. The new landmark peer-reviewed paper, “Beyond-Classical Computation in Quantum Simulation,” validates this achievement as the world’s first and only demonstration of quantum computational supremacy on a useful problem.
An international collaboration of scientists led by D-Wave performed simulations of quantum dynamics in programmable spin glasses—computationally hard magnetic materials simulation problems with known applications to business and science—on both D-Wave’s Advantage2™ prototype annealing quantum computer and the Frontier supercomputer at the Department of Energy’s Oak Ridge National Laboratory. The work simulated the behavior of a suite of lattice structures and sizes across a variety of evolution times and delivered a multiplicity of important material properties. D-Wave’s quantum computer performed the most complex simulation in minutes and with a level of accuracy that would take nearly one million years using the supercomputer. In addition, it would require more than the world’s annual electricity consumption to solve this problem using the supercomputer, which is built with graphics processing unit (GPU) clusters.
“This is a remarkable day for quantum computing. Our demonstration of quantum computational supremacy on a useful problem is an industry first. All other claims of quantum systems outperforming classical computers have been disputed or involved random number generation of no practical value,” said Dr. Alan Baratz, CEO of D-Wave. “Our achievement shows, without question, that D-Wave’s annealing quantum computers are now capable of solving useful problems beyond the reach of the world’s most powerful supercomputers. We are thrilled that D-Wave customers can use this technology today to realize tangible value from annealing quantum computers.”
Realizing an Industry-First Quantum Computing Milestone
The behavior of materials is governed by the laws of quantum physics. Understanding the quantum nature of magnetic materials is crucial to finding new ways to use them for technological advancement, making materials simulation and discovery a vital area of research for D-Wave and the broader scientific community. Magnetic materials simulations, like those conducted in this work, use computer models to study how tiny particles not visible to the human eye react to external factors. Magnetic materials are widely used in medical imaging, electronics, superconductors, electrical networks, sensors, and motors.
“This research proves that D-Wave’s quantum computers can reliably solve quantum dynamics problems that could lead to discovery of new materials,” said Dr. Andrew King, senior distinguished scientist at D-Wave. “Through D-Wave’s technology, we can create and manipulate programmable quantum matter in ways that were impossible even a few years ago.”
Materials discovery is a computationally complex, energy-intensive and expensive task. Today’s supercomputers and high-performance computing (HPC) centers, which are built with tens of thousands of GPUs, do not always have the computational processing power to conduct complex materials simulations in a timely or energy-efficient manner. For decades, scientists have aspired to build a quantum computer capable of solving complex materials simulation problems beyond the reach of classical computers. D-Wave’s advancements in quantum hardware have made it possible for its annealing quantum computers to process these types of problems for the first time.
“This is a significant milestone made possible through over 25 years of research and hardware development at D-Wave, two years of collaboration across 11 institutions worldwide, and more than 100,000 GPU and CPU hours of simulation on one of the world’s fastest supercomputers as well as computing clusters in collaborating institutions,” said Dr. Mohammad Amin, chief scientist at D-Wave. “Besides realizing Richard Feynman’s vision of simulating nature on a quantum computer, this research could open new frontiers for scientific discovery and quantum application development.”
Advantage2 System Demonstrates Powerful Performance Gains
The results shown in “Beyond-Classical Computation in Quantum Simulation” were enabled by D-Wave’s previous scientific milestones published in Nature Physics (2022) and Nature (2023), which theoretically and experimentally showed that quantum annealing provides a quantum speedup in complex optimization problems. These scientific advancements led to the development of the Advantage2 prototype’s fast anneal feature, which played an essential role in performing the precise quantum calculations needed to demonstrate quantum computational supremacy.
“The broader quantum computing research and development community is collectively building an understanding of the types of computations for which quantum computing can overtake classical computing. This effort requires ongoing and rigorous experimentation,” said Dr. Trevor Lanting, chief development officer at D-Wave. “This work is an important step toward sharpening that understanding, with clear evidence of where our quantum computer was able to outperform classical methods. We believe that the ability to recreate the entire suite of results we produced is not possible classically. We encourage our peers in academia to continue efforts to further define the line between quantum and classical capabilities, and we believe these efforts will help drive the development of ever more powerful quantum computing technology.”
The Advantage2 prototype used to achieve quantum computational supremacy is available for customers to use today via D-Wave’s Leap™ real-time quantum cloud service. The prototype provides substantial performance improvements over previous-generation Advantage systems, including increased qubit coherence, connectivity, and energy scale, which enables higher-quality solutions to larger, more complex problems. Moreover, D-Wave now has an Advantage2 processor that is four times larger than the prototype used in this work and has extended the simulations from hundreds of qubits to thousands of qubits, significantly larger than those described in the paper.
Leading Industry Voices Echo Support
Dr. Hidetoshi Nishimori, Professor, Department of Physics, Tokyo Institute of Technology: “This paper marks a significant milestone in demonstrating the real-world applicability of large-scale quantum computing. Through rigorous benchmarking of quantum annealers against state-of-the-art classical methods, it convincingly establishes a quantum advantage in tackling practical problems, revealing the transformative potential of quantum computing at an unprecedented scale.”
Dr. Seth Lloyd, Professor of Quantum Mechanical Engineering, MIT: “Although large-scale, fully error-corrected quantum computers are years in the future, quantum annealers can probe the features of quantum systems today. In an elegant paper, the D-Wave group has used a large-scale quantum annealer to uncover patterns of entanglement in a complex quantum system that lie far beyond the reach of the most powerful classical computer. The D-Wave result shows the promise of quantum annealers for exploring exotic quantum effects in a wide variety of systems.”
Dr. Travis Humble, Director of Quantum Science Center, Distinguished Scientist at Oak Ridge National Laboratory: “ORNL seeks to expand the frontiers of computation through many different avenues, and benchmarking quantum computing for materials science applications provides critical input to our understanding of new computational capabilities.”
Dr. Juan Carrasquilla, Associate Professor at the Department of Physics, ETH Zürich: “I believe these results mark a critical scientific milestone for D-Wave. They also serve as an invitation to the scientific community, as these results offer a strong benchmark and motivation for developing novel simulation techniques for out-of-equilibrium dynamics in quantum many-body physics. Furthermore, I hope these findings encourage theoretical exploration of the computational challenges involved in performing such simulations, both classically and quantum-mechanically.”
Dr. Victor Martin-Mayor, Professor of Theoretical Physics, Universidad Complutense de Madrid: “This paper is not only a tour-de-force for experimental physics, it is also remarkable for the clarity of the results. The authors have addressed a problem that is regarded both as important and as very challenging to a classical computer. The team has shown that their quantum annealer performs better at this task than the state-of-the-art methods for classical simulation.”
Dr. Alberto Nocera, Senior Staff Scientist, The University of British Columbia: “Our work shows the impracticability of state-of-the-art classical simulations to simulate the dynamics of quantum magnets, opening the door for quantum technologies based on analog simulators to solve scientific questions that may otherwise remain unanswered using conventional computers.”
About D-Wave Quantum Inc.
D-Wave is a leader in the development and delivery of quantum computing systems, software, and services. We are the world’s first commercial supplier of quantum computers, and the only company building both annealing and gate-model quantum computers. Our mission is to help customers realize the value of quantum, today. Our 5,000+ qubit Advantage™ quantum computers, the world’s largest, are available on-premises or via the cloud, supported by 99.9% availability and uptime. More than 100 organizations trust D-Wave with their toughest computational challenges. With over 200 million problems submitted to our Advantage systems and Advantage2™ prototypes to date, our customers apply our technology to address use cases spanning optimization, artificial intelligence, research and more. Learn more about realizing the value of quantum computing today and how we’re shaping the quantum-driven industrial and societal advancements of tomorrow: www.dwavequantum.com.
Forward-Looking Statements
Certain statements in this press release are forward-looking, as defined in the Private Securities Litigation Reform Act of 1995. These statements involve risks, uncertainties, and other factors that may cause actual results to differ materially from the information expressed or implied by these forward-looking statements and may not be indicative of future results. These forward-looking statements are subject to a number of risks and uncertainties, including, among others, various factors beyond management’s control, including the risks set forth under the heading “Risk Factors” discussed under the caption “Item 1A. Risk Factors” in Part I of our most recent Annual Report on Form 10-K or any updates discussed under the caption “Item 1A. Risk Factors” in Part II of our Quarterly Reports on Form 10-Q and in our other filings with the SEC. Undue reliance should not be placed on the forward-looking statements in this press release in making an investment decision, which are based on information available to us on the date hereof. We undertake no duty to update this information unless required by law.
Here’s a link to and a citation for the most recent paper,
Beyond-classical computation in quantum simulation by Andrew D. King, Alberto Nocera, Marek M. Rams, Jacek Dziarmaga, Roeland Wiersema, William Bernoudy, Jack Raymond, Nitin Kaushal, Niclas Heinsdorf, Richard Harris, Kelly Boothby, Fabio Altomare, Mohsen Asad, Andrew J. Berkley, Martin Boschnak, Kevin Chern, Holly Christiani, Samantha Cibere, Jake Connor, Martin H. Dehn, Rahul Deshpande, Sara Ejtemaee, Pau Farre, Kelsey Hamer, Emile Hoskinson, Shuiyuan Huang, Mark W. Johnson, Samuel Kortas, Eric Ladizinsky, Trevor Lanting, Tony Lai, Ryan Li, Allison J. R. MacDonald, Gaelen Marsden, Catherine C. McGeoch, Reza Molavi, Travis Oh, Richard Neufeld, Mana Norouzpour, Joel Pasvolsky, Patrick Poitras, Gabriel Poulin-Lamarre, Thomas Prescott, Mauricio Reis, Chris Rich, Mohammad Samani, Benjamin Sheldan, Anatoly Smirnov, Edward Sterpka, Berta Trullas Clavera, Nicholas Tsai, Mark Volkmann, Alexander M. Whiticar, Jed D. Whittaker, Warren Wilkinson, Jason Yao, T.J. Yi, Anders W. Sandvik, Gonzalo Alvarez, Roger G. Melko, Juan Carrasquilla, Marcel Franz, and Mohammad H. Amin. Science, 12 Mar 2025 (First Release). DOI: 10.1126/science.ado6285
This paper appears to be open access. Note: I usually tag all of the authors but not this time either.
A controversy of sorts
Madison McLauchlan’s March 19, 2025 article for BetaKit (a website for Canadian startup news and tech innovation) takes a closer look at the claims. Note: Links have been removed,
Canadian-born company D-Wave Quantum Systems said it achieved “quantum supremacy” last week after publishing what it calls a groundbreaking paper in the prestigious journal Science. Despite the lofty term, Canadian experts say supremacy is not the be-all, end-all of quantum innovation.
D-Wave, which has labs in Palo Alto, Calif., and Burnaby, BC, claimed in a statement that it has shown “the world’s first and only demonstration of quantum computational supremacy on a useful, real-world problem.”
Coined in the early 2010s by physicist John Preskill, quantum supremacy is the ability of a quantum computing system to solve a problem no classical computer can in a feasible amount of time. The metric makes no mention of whether the problem needs to be useful or relevant to real life. Google researchers published a paper in Nature in 2019 claiming they cleared that bar with the Sycamore quantum processor. Researchers at the University of Science and Technology of China claimed they demonstrated quantum supremacy several times.
D-Wave’s attempt differs in that its researchers aimed to solve a real-world materials-simulation problem with quantum computing—one the company claims would be nearly impossible for a traditional computer to solve in a reasonable amount of time. D-Wave used an annealing quantum computer, which is designed to solve optimization problems. The problem is represented as an energy landscape, where the “lowest energy state” corresponds to the solution.
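The “energy landscape” picture is easy to demonstrate with a classical caricature. The Python sketch below runs plain simulated annealing on a toy spin glass; D-Wave’s hardware anneals a physical quantum system rather than executing an algorithm like this, and the Science paper concerns quantum dynamics rather than optimization, so treat this purely as an illustration of what “lowest energy state equals solution” means (all numbers are invented),

```python
import math
import random

random.seed(1)
n = 20  # number of spins; tiny compared to real hardware

# Random +/-1 couplings between every pair of spins make this a toy
# spin glass.
J = {(i, j): random.choice([-1.0, 1.0])
     for i in range(n) for j in range(i + 1, n)}

def energy(s):
    """Ising energy E(s) = -sum_{i<j} J_ij * s_i * s_j; lower is better."""
    return -sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

# Classical simulated annealing: accept uphill moves with probability
# exp(-dE/T) and slowly lower the "temperature" T.
s = [random.choice([-1, 1]) for _ in range(n)]
T = 5.0
while T > 0.01:
    i = random.randrange(n)
    before = energy(s)
    s[i] *= -1                      # propose flipping one spin
    dE = energy(s) - before
    if dE > 0 and random.random() > math.exp(-dE / T):
        s[i] *= -1                  # reject the uphill move
    T *= 0.999                      # cool
print("final energy:", energy(s))
```

Early on, when the temperature is high, energy-raising flips are often accepted so the search can escape local minima; as the temperature falls, the configuration settles toward a low-energy state.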
While exciting, quantum supremacy is just one metric among several that mark the progress toward widely useful quantum computers, industry experts told BetaKit.
…
“It is a very important and mostly academic metric, but certainly not the most important in the grand scheme of things, as it doesn’t take into account the usefulness of the algorithm,” said Martin Laforest, managing partner at Quantacet, a specialized venture capital fund for quantum startups.
He added that Google and Xanadu’s [Xanadu Quantum Technologies based in Toronto, Canada] past claims to quantum supremacy were “extraordinary pieces of work, but didn’t unlock practicality.”
Laforest, along with executives at Canadian quantum startups Nord Quantique and Photonic, say that the milestones of ‘quantum utility’ or ‘quantum advantage’ may be more important than supremacy.
According to quantum computing company QuEra, quantum advantage is the demonstration of a quantum algorithm solving a real-world problem on a quantum computer faster than any classical algorithm running on any classical computer. On the other hand, quantum utility, according to IBM, refers to when a quantum computer is able to perform reliable computations at a scale beyond brute-force classical computing methods that provide exact solutions to computational problems.
…
Error correction hasn’t traditionally been considered a requirement for quantum supremacy, but Laforest told BetaKit the term is “an ever-moving target, constantly challenged by advances in classical algorithms.” He added: “In my opinion, some level of supremacy or utility may be possible in niche areas without error correction, but true disruption requires it.”
Paul Terry, CEO of Vancouver-based Photonic, thinks that though D-Wave’s claim to quantum supremacy shows “continued progress to real value,” scalability is the industry’s biggest hurdle to overcome.
…
But as with many milestone claims in the quantum space, D-Wave’s latest innovation has been met with scrutiny from industry competitors and researchers, some of whom question the breakthrough’s significance and claim that classical computers have achieved similar results. Laforest echoed this sentiment.
“Personally, I wouldn’t say it’s an unequivocal demonstration of supremacy, but it is a damn nice experiment that once again shows the murky zone between traditional computing and early quantum advantage,” Laforest said.
Originally founded out of the University of British Columbia, D-Wave went public on the New York Stock Exchange just over two years ago through a merger with a special-purpose acquisition company in 2022. D-Wave became a Delaware-domiciled corporation as part of the deal.
Earlier this year, D-Wave’s stock price dropped after Nvidia CEO Jensen Huang publicly stated that he estimated that useful quantum computers were more than 15 years away. D-Wave’s stock price, which had been struggling, has seen a considerable bump in recent months alongside a broader boost in the quantum market. The price popped after its most recent earnings, shared right after its quantum supremacy announcement.
The beat goes on
Some of this is standard in science. There’s always a debate over big claims and it’s not unusual for people to get overexcited and have to make a retraction. Scientists are people too. That said, there’s a lot of money on the line and that appears to be making the situation even more volatile than usual.
That last paragraph was completed on the morning of March 21, 2025 and later that afternoon I came across this March 21, 2025 article by Michael Grothaus for Fast Company, Note: Links have been removed,
Quantum computing stocks got pummeled yesterday, with the four most prominent public quantum computing companies—IonQ, Rigetti Computing, Quantum Computing Inc., and D-Wave Quantum Inc.—falling anywhere from over 9% to over 18%. The reason? A lot of it may have to do with AI chip giant Nvidia. Again.
Stocks crash yesterday on Nvidia quantum news
Yesterday was a bit of a bloodbath on the stock market for the four most prominent publicly traded quantum computing companies. …
…
All four of these quantum computing stocks [IonQ, Inc.; Rigetti Computing, Inc.; Quantum Computing Inc.; D-Wave Quantum Inc.] tumbled on the day that AI chip giant Nvidia kicked off its two-day Quantum Day event. In a blog post from January 14 announcing Quantum Day, Nvidia said the event “brings together leading experts for a comprehensive and balanced perspective on what businesses should expect from quantum computing in the coming decades — mapping the path toward useful quantum applications.”
…
Besides bringing quantum experts together, the AI behemoth also announced that it will be launching a new quantum computing research center in Boston.
Called the NVIDIA Accelerated Quantum Research Center (NVAQC), the new research lab “will help solve quantum computing’s most challenging problems, ranging from qubit noise to transforming experimental quantum processors into practical devices,” the company said in a press release.
The NVAQC’s location in Boston means it will be near both Harvard University and the Massachusetts Institute of Technology (MIT).
…
Before Nvidia’s announcement yesterday, IonQ, Rigetti, D-Wave, and Quantum Computing Inc. were the leaders in the nascent field of quantum computing. And while they still are right now (Nvidia’s quantum research lab hasn’t been built yet), the fear is that Nvidia could use its deep pockets to quickly buy its way into a leadership spot in the field. With its $2.9 trillion market cap, the company can easily afford to throw billions of research dollars into quantum computing.
As noted by the Motley Fool, the location of the NVIDIA Accelerated Quantum Research Center in Boston will also allow Nvidia to more easily tap into top quantum talent from Harvard and MIT—talent that may have otherwise gone to IonQ, Rigetti, D-Wave, and Quantum Computing Inc.
Nvidia’s announcement is a massive about-face from the company in regard to how it views quantum computing. It’s also the second time that Nvidia has caused quantum stocks to crash this year. Back in January, shares in prominent quantum computing companies fell after Huang said that practical use of quantum computing was decades away.
Those comments were something quantum computing company CEOs like D-Wave’s Alan Baratz took issue with. “It’s an egregious error on Mr. Huang’s part,” Baratz told Fast Company at the time. “We’re not decades away from commercial quantum computers. They exist. There are companies that are using our quantum computer today.”
According to Investor’s Business Daily, Huang reportedly got the idea for Nvidia’s Quantum Day event after the blowback to his comments, inviting quantum computing executives to the event to explain why he was incorrect about quantum computing.
A September 10, 2024 news item on ScienceDaily provides a technical explanation of how memristors can retain information without a power source,
Phase separation, when molecules part like oil and water, works alongside oxygen diffusion to help memristors — electrical components that store information using electrical resistance — retain information even after the power is shut off, according to a University of Michigan led study recently published in Matter.
Up to this point, explanations have not fully captured how memristors retain information without a power source (a property known as nonvolatile memory), because models and experiments do not match up.
“While experiments have shown devices can retain information for over 10 years, the models used in the community show that information can only be retained for a few hours,” said Jingxian Li, U-M doctoral graduate of materials science and engineering and first author of the study.
To better understand the underlying phenomenon driving nonvolatile memristor memory, the researchers focused on a device known as resistive random access memory, or RRAM, an alternative to the volatile RAM used in classical computing that is particularly promising for energy-efficient artificial intelligence applications.
The specific RRAM studied, a filament-type valence change memory (VCM), sandwiches an insulating tantalum oxide layer between two platinum electrodes. When a certain voltage is applied to the platinum electrodes, a conductive filament of tantalum ions forms a bridge through the insulator to the electrodes, which allows electricity to flow, putting the cell in a low resistance state representing a “1” in binary code. If a different voltage is applied, the filament is dissolved as returning oxygen atoms react with the tantalum ions, “rusting” the conductive bridge and returning the cell to a high resistance state, representing a binary code of “0”.
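If the set/reset picture is hard to visualize, here’s a toy model of a filament-type cell in Python. The voltage thresholds and resistance values are invented for illustration only; real VCM devices are analog, history-dependent, and far messier,

```python
class ToyRRAMCell:
    """Caricature of a filament-type VCM cell: a SET voltage forms the
    filament (low resistance, logic 1); a RESET voltage of the opposite
    polarity dissolves it (high resistance, logic 0).
    All thresholds and resistances are invented for illustration."""
    V_SET, V_RESET = 1.5, -1.2   # volts (assumed)
    R_ON, R_OFF = 1e3, 1e6       # ohms (assumed)

    def __init__(self):
        self.resistance = self.R_OFF     # start with no filament -> "0"

    def apply(self, volts):
        if volts >= self.V_SET:
            self.resistance = self.R_ON   # filament bridges the oxide
        elif volts <= self.V_RESET:
            self.resistance = self.R_OFF  # oxygen "rusts" the bridge

    def read(self):
        return 1 if self.resistance == self.R_ON else 0

cell = ToyRRAMCell()
cell.apply(2.0)      # SET pulse
print(cell.read())   # -> 1
cell.apply(-1.5)     # RESET pulse
print(cell.read())   # -> 0
```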
It was once thought that RRAM retains information over time because oxygen is too slow to diffuse back. However, a series of experiments revealed that previous models have neglected the role of phase separation.
“In these devices, oxygen ions prefer to be away from the filament and will never diffuse back, even after an indefinite period of time. This process is analogous to how a mixture of water and oil will not mix, no matter how much time we wait, because they have lower energy in a de-mixed state,” said Yiyang Li, U-M assistant professor of materials science and engineering and senior author of the study.
To test retention time, the researchers sped up experiments by increasing the temperature. One hour at 250°C is equivalent to about 100 years at 85°C—the typical temperature of a computer chip.
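That temperature-acceleration claim is a standard Arrhenius calculation, and it’s a nice sanity check to back out the activation energy it implies. Here’s a short Python sketch; the Arrhenius form and the Boltzmann constant aside, everything comes from the two numbers quoted above,

```python
import math

k_B = 8.617e-5                              # Boltzmann constant, eV/K
T_test, T_use = 250 + 273.15, 85 + 273.15   # kelvin
accel = (100 * 365 * 24) / 1                # 100 years in hours vs 1 hour

# Arrhenius model: rate ~ exp(-Ea / (k_B * T)), so the acceleration
# factor between two temperatures is exp(Ea/k_B * (1/T_use - 1/T_test)).
Ea = k_B * math.log(accel) / (1 / T_use - 1 / T_test)
print(f"implied activation energy: {Ea:.2f} eV")   # roughly 1.3 eV
```

An implied activation energy of roughly 1.3 eV is in the right ballpark for ion motion in oxides, which is reassuring for the equivalence quoted above.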
Using the extremely high-resolution imaging of atomic force microscopy, the researchers imaged filaments, which measure only about five nanometers, or 20 atoms, wide, forming within the one-micron-wide RRAM device.
“We were surprised that we could find the filament in the device. It’s like finding a needle in a haystack,” Li said.
The research team found that different sized filaments yielded different retention behavior. Filaments smaller than about 5 nanometers dissolved over time, whereas filaments larger than 5 nanometers strengthened over time. The size-based difference cannot be explained by diffusion alone.
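A size threshold like that is exactly what classical nucleation theory predicts when a surface-energy cost competes with a bulk free-energy gain. The paper has its own model, which I haven’t reproduced here; this is just the textbook argument, in LaTeX, for readers who want the intuition,

```latex
% Classical nucleation picture (illustrative, not the paper's model):
% surface cost scales as r^2, bulk free-energy gain as r^3.
\[
  \Delta G(r) = 4\pi r^{2}\gamma - \tfrac{4}{3}\pi r^{3}\,\lvert\Delta g_{v}\rvert,
  \qquad
  \frac{d\,\Delta G}{dr}\Big|_{r^{*}} = 0
  \;\Longrightarrow\;
  r^{*} = \frac{2\gamma}{\lvert\Delta g_{v}\rvert}.
\]
% Filaments with r < r* lower their free energy by shrinking (dissolve);
% those with r > r* lower it by growing (strengthen).
```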
Together, experimental results and models incorporating thermodynamic principles showed the formation and stability of conductive filaments depend on phase separation.
The research team leveraged phase separation to extend memory retention from one day to well over 10 years in a rad-hard memory chip—a memory device built to withstand radiation exposure for use in space exploration.
Other applications include in-memory computing for more energy efficient AI applications or memory devices for electronic skin—a stretchable electronic interface designed to mimic the sensory capabilities of human skin. Also known as e-skin, this material could be used to provide sensory feedback to prosthetic limbs, create new wearable fitness trackers or help robots develop tactile sensing for delicate tasks.
“We hope that our findings can inspire new ways to use phase separation to create information storage devices,” Li said.
Researchers at Ford Research, Dearborn; Oak Ridge National Laboratory; University at Albany; NY CREATES; Sandia National Laboratories; and Arizona State University, Tempe contributed to this study.
…
Here’s a link to and a citation for the paper,
Thermodynamic origin of nonvolatility in resistive memory by Jingxian Li, Anirudh Appachar, Sabrina L. Peczonczyk, Elisa T. Harrison, Anton V. Ievlev, Ryan Hood, Dongjae Shin, Sangmin Yoo, Brianna Roest, Kai Sun, Karsten Beckmann, Olya Popova, Tony Chiang, William S. Wahby, Robin B. Jacobs-Gedrim, Matthew J. Marinella, Petro Maksymovych, John T. Heron, Nathaniel Cady, Wei D. Lu, Suhas Kumar, A. Alec Talin, Wenhao Sun, Yiyang Li. Matter. DOI: https://doi.org/10.1016/j.matt.2024.07.018 Published online: August 26, 2024
Ultimately, the researchers are working on ways to make agriculture more sustainable but, in the meantime, there’s this June 7, 2024 news item on ScienceDaily describing this work,
Gene silencing in plants has faced significant challenges, primarily due to the difficulty of transporting RNA molecules across plant cell membranes and achieving systemic effects. Traditional genetic engineering methods are time-consuming and often limited by plant genotype. Due to these challenges, there is a pressing need for innovative solutions to facilitate efficient gene silencing and enhance crop productivity.
…
A June 7, 2024 news release from Nanjing Agricultural University The Academy of Science (publisher of Horticulture Research) on EurekAlert, which originated the news item, describes the proposed solution. Note: Links have been removed,
Researchers from the University of Connecticut and Oak Ridge National Laboratory have developed an innovative method using cationized bovine serum albumin (cBSA) and double-stranded RNA (dsRNA) nanocomplexes to achieve effective systemic gene silencing in plants. Published (DOI: 10.1093/hr/uhae045) in Horticulture Research on February 22, 2024, this study demonstrates the potential of these nanocomplexes to overcome the limitations of traditional RNA delivery methods, offering a new tool for plant biotechnology.
The study presents the development of cBSA/dsRNA nanocomplexes for systemic gene silencing in tobacco and poplar plants. By modifying bovine serum albumin to carry a positive charge, researchers created nanocomplexes that bind dsRNA molecules, facilitating their transport and systemic gene silencing. Experiments demonstrated successful silencing of the DR5-GUS and 35S-GUS genes, achieving significant reductions in gene expression. This technology proved effective in delivering RNA molecules across plant cell membranes, overcoming the negative charge barrier of naked RNA applications. Offering a convenient, fast, and non-transgenic approach, this method holds promise for gene function characterization, crop improvement, and large-scale agricultural applications due to its scalability and cost-effectiveness.
Dr. Yi Li, a lead researcher on the project, stated, “The development of cBSA/dsRNA nanocomplexes represents a significant advancement in plant biotechnology. This technology not only facilitates efficient gene silencing but also offers a practical and scalable solution for improving crop productivity. We believe this method will pave the way for new applications in gene editing and agricultural research.”
The implications of this research are vast, offering a potential solution for transient gene silencing in field-grown crops, including orchard trees. This technology could enhance crop productivity by targeting genes that influence drought tolerance, fruit development, and stress resistance, all without the need for genetic modification. The scalable and inexpensive nature of this method could make it a game-changer for sustainable agriculture.
The research and the journal where it is published both have interesting pedigrees. From the June 7, 2024 news release,
Funding information
This work was supported by the USDA National Institute of Food and Agriculture SCRI (grant no. 2015-70016-23027) and the Connecticut-Storrs Agriculture Experimental Station.
Horticulture Research is an open access journal of Nanjing Agricultural University and ranked number one in the Horticulture category of the Journal Citation Reports™ from Clarivate, 2022. The journal is committed to publishing original research articles, reviews, perspectives, comments, correspondence articles and letters to the editor related to all major horticultural plants and disciplines, including biotechnology, breeding, cellular and molecular biology, evolution, genetics, inter-species interactions, physiology, and the origination and domestication of crops.
You can add the UK to the US/China mix since the website hosting Horticulture Research is Oxford Academic,
Oxford Academic is Oxford University Press’s academic research platform, providing access to over 50,000 books and 500 journals
…
Finally, here’s a link to and a citation for the paper,
Engineered dsRNA–protein nanoparticles for effective systemic gene silencing in plants by Huayu Sun, Ankarao Kalluri, Dan Tang, Jingwen Ding, Longmei Zhai, Xianbin Gu, Yanjun Li, Huseyin Yer, Xiaohan Yang, Gerald A Tuskan, Zhanao Deng, Frederick G Gmitter Jr, Hui Duan, Challa Kumar, Yi Li. Horticulture Research, Volume 11, Issue 4, April 2024, uhae045, DOI: https://doi.org/10.1093/hr/uhae045 Published online: 22 February 2024
A different approach to neuromorphic (brainlike) computing is described in an August 28, 2023 news item on phys.org, Note: A link has been removed,
The word “fractals” might inspire images of psychedelic colors spiraling into infinity in a computer animation. An invisible, but powerful and useful, version of this phenomenon exists in the realm of dynamic magnetic fractal networks.
Dustin Gilbert, assistant professor in the Department of Materials Science and Engineering [University of Tennessee, US], and colleagues have published new findings in the behavior of these networks—observations that could advance neuromorphic computing capabilities.
Their research is detailed in their article “Skyrmion-Excited Spin-Wave Fractal Networks,” cover story for the August 17, 2023, issue of Advanced Materials.
“Most magnetic materials—like in refrigerator magnets—are just comprised of domains where the magnetic spins all orient parallel,” said Gilbert. “Almost 15 years ago, a German research group discovered these special magnets where the spins make loops—like a nanoscale magnetic lasso. These are called skyrmions.”
Named for legendary particle physicist Tony Skyrme, a skyrmion’s magnetic swirl gives it a non-trivial topology. As a result of this topology, the skyrmion has particle-like properties—they are hard to create or destroy, they can move and even bounce off of each other. The skyrmion also has dynamic modes—they can wiggle, shake, stretch, whirl, and breath[e].
As the skyrmions “jump and jive,” they are creating magnetic spin waves with a very narrow wavelength. The interactions of these waves form an unexpected fractal structure.
“Just like a person dancing in a pool of water, they generate waves which ripple outward,” said Gilbert. “Many people dancing make many waves, which normally would seem like a turbulent, chaotic sea. We measured these waves and showed that they have a well-defined structure and collectively form a fractal which changes trillions of times per second.”
Fractals are important and interesting because they are inherently tied to a “chaos effect”—small changes in initial conditions lead to big changes in the fractal network.
“Where we want to go with this is that if you have a skyrmion lattice and you illuminate it with spin waves, the way the waves make their way through this fractal-generating structure is going to depend very intimately on its construction,” said Gilbert. “So, if you could write individual skyrmions, it can effectively process incoming spin waves into something on the backside—and it’s programmable. It’s a neuromorphic architecture.”
The Advanced Materials cover illustration [image at top of this posting] depicts a visual representation of this process, with the skyrmions floating on top of a turbulent blue sea illustrative of the chaotic structure generated by the spin wave fractal.
“Those waves interfere just like if you throw a handful of pebbles into a pond,” said Gilbert. “You get a choppy, turbulent mess. But it’s not just any simple mess, it’s actually a fractal. We have an experiment now showing that the spin waves generated by skyrmions aren’t just a mess of waves, they have inherent structure of their very own. By, essentially, controlling those stones that we ‘throw in,’ you get very different patterns, and that’s what we’re driving towards.”
The discovery was made in part by neutron scattering experiments at the Oak Ridge National Laboratory (ORNL) High Flux Isotope Reactor and at the National Institute of Standards and Technology (NIST) Center for Neutron Research. Neutrons are magnetic and pass through materials easily, making them ideal probes for studying materials with complex magnetic behavior such as skyrmions and other quantum phenomena.
Gilbert’s co-authors for the new article are Nan Tang, Namila Liyanage, and Liz Quigley, students in his research group; Alex Grutter and Julie Borchers from the National Institute of Standards and Technology (NIST); Lisa DeBeer-Schmitt and Mike Fitzsimmons from Oak Ridge National Laboratory; and Eric Fullerton, Sheena Patel, and Sergio Montoya from the University of California, San Diego.
The team’s next step is to build a working model using the skyrmion behavior.
“If we can develop thinking computers, that, of course, is extraordinarily important,” said Gilbert. “So, we will propose to make a miniaturized, spin wave neuromorphic architecture.” He also hopes that the ripples from this UT Knoxville discovery inspire researchers to explore uses for a spiraling range of future applications.
Here’s a link to and a citation for the paper,
Skyrmion-Excited Spin-Wave Fractal Networks by Nan Tang, W. L. N. C. Liyanage, Sergio A. Montoya, Sheena Patel, Lizabeth J. Quigley, Alexander J. Grutter, Michael R. Fitzsimmons, Sunil Sinha, Julie A. Borchers, Eric E. Fullerton, Lisa DeBeer-Schmitt, Dustin A. Gilbert. Advanced Materials, Volume 35, Issue 33, August 17, 2023, 2300416. DOI: https://doi.org/10.1002/adma.202300416 First published: 04 May 2023
Nice to learn of this news, which is on the CBC (Canadian Broadcasting Corporation) news online website. From a December 13, 2022 news item provided by Associated Press (Note: the news item was updated to include general description and some Canadian content at about 12 pm PT),
…
Researchers at the Lawrence Livermore National Laboratory in California for the first time produced more energy in a fusion reaction than was used to ignite it, [emphasis mine] something called net energy gain, the Energy Department said.
…
Peter Behr’s December 13, 2022 article on Politico.com about the US Department of Energy’s big announcement also breaks the news,
The Department of Energy announced Tuesday [December 12, 2022] that its scientists have produced the first ever fusion reaction that yielded more energy than the reaction required, an essential step in the long path toward commercial fusion power, officials said.
The experiment Dec. 5 [2022], at the Lawrence Livermore National Laboratory in California, took a few billionths of a second. But laboratory leaders said today that it demonstrated for the first time that sustained fusion power is possible.
…
Behr explains what nuclear fusion is but first he touches on why scientists are so interested in the process, from his December 13, 2022 article,
…
In theory, nuclear fusion could produce massive amounts of energy without producing long-lasting radioactive waste, or posing the risk of meltdowns. That’s unlike nuclear fission, which powers today’s reactors.
Fission results when radioactive atoms — most commonly uranium — are split by neutrons in controlled chain reactions, creating lighter atoms and large amounts of radiation and energy to produce electric power.
Fusion is the opposite process. In the most common approach, swirling hydrogen isotopes are forced together under tremendous heat to create helium and energy for power generation. This is the same process that powers the sun and other stars. But scientists have been trying since the mid-20th century to find a way to use it to generate power on Earth.
…
There are two main approaches to making fusion happen and I found a description of them in an October 2022 article about local company General Fusion by Nelson Bennett for Business in Vancouver magazine (paper version),
Most fusion companies are pursuing one of two approaches: Magnet [sic] or inertial confinement. General Fusion is one of the few that is taking a more hybrid approach, magnetic confinement with pulse compression.
Fusion occurs when smaller nuclei are fused together under tremendous force into larger nuclei, with a release of energy occurring in the form of neutrons. It’s what happens to stars when gravitational force creates extreme heat that turns on the fusion engine.
Replicating that in a machine requires some form of confinement to squeeze plasma, a kind of super-hot fog of unbound positive and negative particles, to the point where nuclei fuse.
One approach is inertial confinement, in which lasers are focused on a small capsule of heavy hydrogen fuel (deuterium and tritium) to create ignition. This takes a tremendous amount of energy, and the challenge for all fusion efforts is to get a sustained ignition that produces more energy than it takes to get ignition, called net energy gain.
The other main approach is magnetic confinement, using powerful magnets in a machine called a tokamak to contain and squeeze plasma into a donut-shaped form called a torus.
General Fusion uses magnets to confine the plasma, but to get ignition it uses pistons arrayed around a spherical chamber to fire synchronously to essentially collapse the plasma on itself and spark ignition.
General Fusion’s machine uses liquid metal spinning inside a chamber that acts as a protective barrier between the hot plasma and the machine, basically a sphere of plasma contained within a sphere of liquid metal. This protects the machine from damage.
The temperatures generated in fusion, up to 150 million degrees Celsius, are roughly ten times hotter than the core of the sun (about 15 million degrees Celsius), and can destroy the machines that produce them. This makes durability a big challenge in any machine.
…
The Lawrence Livermore National Laboratory (LLNL) issued a December 13, 2022 news release, which provides more detail about their pioneering work, Note: I have changed the order of the paragraphs but all of this is from the news release,
…
Fusion is the process by which two light nuclei combine to form a single heavier nucleus, releasing a large amount of energy. In the 1960s, a group of pioneering scientists at LLNL hypothesized that lasers could be used to induce fusion in a laboratory setting. Led by physicist John Nuckolls, who later served as LLNL director from 1988 to 1994, this revolutionary idea became inertial confinement fusion, kicking off more than 60 years of research and development in lasers, optics, diagnostics, target fabrication, computer modeling and simulation and experimental design.
To pursue this concept, LLNL built a series of increasingly powerful laser systems, leading to the creation of NIF [National Ignition Facility], the world’s largest and most energetic laser system. NIF — located at LLNL in Livermore, California — is the size of a sports stadium and uses powerful laser beams to create temperatures and pressures like those in the cores of stars and giant planets, and inside exploding nuclear weapons.
…
LLNL’s experiment surpassed the fusion threshold by delivering 2.05 megajoules (MJ) of energy to the target, resulting in 3.15 MJ of fusion energy output, demonstrating for the first time a most fundamental science basis for inertial fusion energy (IFE). Many advanced science and technology developments are still needed to achieve simple, affordable IFE to power homes and businesses, and DOE is currently restarting a broad-based, coordinated IFE program in the United States. Combined with private-sector investment, there is a lot of momentum to drive rapid progress toward fusion commercialization.
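In other words, the target gain, the figure of merit for ignition, came in just above break-even. In LaTeX form,

```latex
\[
  Q \;=\; \frac{E_{\text{out}}}{E_{\text{in}}}
    \;=\; \frac{3.15\ \text{MJ}}{2.05\ \text{MJ}}
    \;\approx\; 1.54 \;>\; 1 .
\]
```

Worth remembering: that ratio counts only the laser energy delivered to the target. The facility drew far more energy from the grid to fire the lasers (roughly 300 MJ by most accounts), so this is net energy gain as a scientific milestone, not yet an engineering one.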
If you want to see some really excited comments from scientists just read the LLNL’s December 13, 2022 news release. Even the news release’s banner is exuberant. Behr’s December 13, 2022 article for Politico.com also looks ahead to the US race to commercialize fusion,
Fearful that China might wind up dominating fusion energy in the second half of this century, Congress in 2020 told DOE [Department of Energy] to begin funding development of a utility-scale fusion pilot plant that could deliver at least 50 megawatts of power to the U.S. grid.
…
In September [2022], DOE invited private companies to apply for an initial $50 million in research grants to help fund development of detailed pilot plant plans.
“We’re seeking strong partnerships between DOE and the private sector,” a senior DOE official told POLITICO’s E&E News recently. The official was not willing to speak on the record, saying the grant process is ongoing and confidential.
As the competition proceeds, DOE will set technical milestones or requirements, challenging the teams to show how critical engineering challenges will be overcome. DOE’s goal is “hopefully to enable a fusion pilot to operate in the early 2030s,” the official added.
At least 15 U.S. and foreign fusion companies have submitted requests for an initial total of $50 million in pilot plant grants, and some of them are pursuing the laser-ignition fusion process that Lawrence Livermore has pioneered, said Holland. He did not name the companies because the competition is confidential.
…
I wonder if General Fusion whose CEO (Chief Executive Officer) Greg Twinney declared, “Commercializing fusion energy is within reach, and General Fusion is ready to deliver it to the grid by the 2030s …” (in a December 12, 2022 company press release) is part of the US competition.
I noticed that General Fusion lists this at the end of the press release,
… Founded in 2002, we are headquartered in Vancouver, Canada, with additional centers co-located with internationally recognized fusion research laboratories near London, U.K., and Oak Ridge, Tennessee, U.S.A.
The Oak Ridge National Laboratory (ORNL), like the LLNL, is a US Department of Energy research facility.
As for General Fusion’s London connection, I have more about that in my October 28, 2022 posting “Overview of fusion energy scene,” which includes General Fusion’s then latest news about a commercialization agreement with the UKAEA (UK Atomic Energy Authority) and a ‘fusion’ video by rapper Baba Brinkman along with the overview.
In early October 2022, Alain Aspect, John Clauser and Anton Zeilinger were jointly awarded the 2022 Nobel Prize in Physics for work each scientist performed independently of the others.
Alain Aspect Institut d’Optique Graduate School – Université Paris-Saclay and École Polytechnique, Palaiseau, France
John F. Clauser J.F. Clauser & Assoc., Walnut Creek, CA, USA
Anton Zeilinger University of Vienna, Austria
“for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science”
Entangled states – from theory to technology
Alain Aspect, John Clauser and Anton Zeilinger have each conducted groundbreaking experiments using entangled quantum states, where two particles behave like a single unit even when they are separated. Their results have cleared the way for new technology based upon quantum information.
The ineffable effects of quantum mechanics are starting to find applications. There is now a large field of research that includes quantum computers, quantum networks and secure quantum encrypted communication.
One key factor in this development is how quantum mechanics allows two or more particles to exist in what is called an entangled state. What happens to one of the particles in an entangled pair determines what happens to the other particle, even if they are far apart.
For a long time, the question was whether the correlation was because the particles in an entangled pair contained hidden variables, instructions that tell them which result they should give in an experiment. In the 1960s, John Stewart Bell developed the mathematical inequality that is named after him. This states that if there are hidden variables, the correlation between the results of a large number of measurements will never exceed a certain value. However, quantum mechanics predicts that a certain type of experiment will violate Bell’s inequality, thus resulting in a stronger correlation than would otherwise be possible.
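The most common experimental form of Bell’s inequality is the CHSH version: any local hidden-variable theory must satisfy |S| ≤ 2, while quantum mechanics predicts up to 2√2 ≈ 2.83. Here’s a small Python sketch using the textbook correlation function for polarization-entangled photons at the standard optimal analyzer settings (the formula and angles are standard results, not taken from the laureates’ papers),

```python
import math

def E(a, b):
    """Quantum prediction for the polarization correlation of photons
    entangled in the Phi+ state, with analyzers at angles a and b (radians)."""
    return math.cos(2 * (a - b))

# Standard CHSH settings: 0, 45, 22.5, and 67.5 degrees.
a, a2 = 0.0, math.pi / 4
b, b2 = math.pi / 8, 3 * math.pi / 8

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"S = {S:.3f}")   # 2.828 = 2*sqrt(2), violating the local bound of 2
```

Measured values of S significantly above 2, as in Clauser’s and Aspect’s experiments, are what rule out the hidden-variable explanation.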
John Clauser developed John Bell’s ideas, leading to a practical experiment. When he took the measurements, they supported quantum mechanics by clearly violating a Bell inequality. This means that quantum mechanics cannot be replaced by a theory that uses hidden variables.
Some loopholes remained after John Clauser’s experiment. Alain Aspect developed the setup, using it in a way that closed an important loophole. He was able to switch the measurement settings after an entangled pair had left its source, so the setting that existed when they were emitted could not affect the result.
Using refined tools and long series of experiments, Anton Zeilinger started to use entangled quantum states. Among other things, his research group has demonstrated a phenomenon called quantum teleportation, which makes it possible to move a quantum state from one particle to one at a distance.
“It has become increasingly clear that a new kind of quantum technology is emerging. We can see that the laureates’ work with entangled states is of great importance, even beyond the fundamental questions about the interpretation of quantum mechanics,” says Anders Irbäck, Chair of the Nobel Committee for Physics.
There are some practical applications for their work on establishing quantum entanglement as Dr. Nicholas Peters, University of Tennessee and Oak Ridge National Laboratory (ORNL), explains in his October 7, 2022 essay for The Conversation,
Unhackable communications devices, high-precision GPS and high-resolution medical imaging all have something in common. These technologies—some under development and some already on the market—all rely on the non-intuitive quantum phenomenon of entanglement.
Two quantum particles, like pairs of atoms or photons, can become entangled. That means a property of one particle is linked to a property of the other, and a change to one particle instantly affects the other particle, regardless of how far apart they are. This correlation is a key resource in quantum information technologies.
For the most part, quantum entanglement is still a subject of physics research, but it’s also a component of commercially available technologies, and it plays a starring role in the emerging quantum information processing industry.
…
Quantum entanglement is a critical element of quantum information processing, and photonic entanglement of the type pioneered by the Nobel laureates is crucial for transmitting quantum information. Quantum entanglement can be used to build large-scale quantum communications networks.
On a path toward long-distance quantum networks, Jian-Wei Pan, one of Zeilinger’s former students, and colleagues demonstrated entanglement distribution to two locations separated by 764 miles (1,203 km) on Earth via satellite transmission. However, direct transmission rates of quantum information are limited due to loss, meaning too many photons get absorbed by matter in transit so not enough reach the destination.
Entanglement is critical for solving this roadblock, through the nascent technology of quantum repeaters. An important milestone for early quantum repeaters, called entanglement swapping, was demonstrated by Zeilinger and colleagues in 1998. Entanglement swapping links one each of two pairs of entangled photons, thereby entangling the two initially independent photons, which can be far apart from each other.
…
Perhaps the best-known quantum communications application is Quantum Key Distribution (QKD), which allows someone to securely distribute encryption keys. If those keys are stored properly, they will be secure, even from future powerful, code-breaking quantum computers.
…
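Two ideas from Peters’ essay lend themselves to small simulations. First, entanglement swapping: the sketch below (my own, using numpy; the four-photon bookkeeping is the standard textbook construction, not anything from the laureates’ papers) entangles pairs (1,2) and (3,4), projects the middle pair onto a Bell state, and confirms that photons 1 and 4, which never interacted, come out sharing that Bell state,

```python
import numpy as np

phi = np.array([1, 0, 0, 1]) / np.sqrt(2)   # Bell state |Phi+> = (|00> + |11>)/sqrt(2)
state = np.kron(phi, phi)                   # photons (1,2) entangled, photons (3,4) entangled

# Bell-state measurement on the middle pair (photons 2 and 3)
P_phi = np.outer(phi, phi)                  # projector onto |Phi+>
P_23 = np.kron(np.eye(2), np.kron(P_phi, np.eye(2)))
projected = P_23 @ state
projected /= np.linalg.norm(projected)      # renormalize after a successful projection

def ket(i1, i2, i3, i4):                    # basis vector |i1 i2 i3 i4>
    v = np.zeros(16)
    v[8 * i1 + 4 * i2 + 2 * i3 + i4] = 1.0
    return v

# Photons 1 and 4, which never met, should now share |Phi+>
target = sum(ket(a, c, c, a) for a in (0, 1) for c in (0, 1)) / 2
print(abs(target @ projected) ** 2)         # fidelity ~1.0
```

And second, a bare-bones sketch of key sifting in BB84-style QKD, idealized with no eavesdropper and no error correction (again my illustration, not production code),

```python
import random

n = 24
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("RD") for _ in range(n)]   # R = rectilinear, D = diagonal
bob_bases   = [random.choice("RD") for _ in range(n)]

# Ideal channel: matching bases reproduce Alice's bit; mismatched bases
# give a random outcome, and those positions are discarded during sifting.
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

sifted = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
print("sifted key:", sifted)   # on average about half the positions survive
```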
I don’t usually embed videos that are longer than five minutes, but this one has a good explanation of cryptography (both classical and quantum),
The video host, Physics Girl (website), is also known as Dianna Cowern.
I wonder if there’s going to be a rush to fund and commercialize more quantum physics projects. There’s certainly an upsurge in activity locally and in Canada (I assume the same is true elsewhere) as my July 26, 2022 posting “Quantum Mechanics & Gravity conference (August 15 – 19, 2022) launches Vancouver (Canada)-based Quantum Gravity Institute and more” makes clear.
At the simplest of levels, nanopores are nanometre-sized holes in an insulating membrane. The hole allows ions to pass through the membrane when a voltage is applied, resulting in a measurable current. When a molecule passes through a nanopore, it causes a change in the current that can be used to characterize and even identify individual molecules. Nanopores are extremely powerful single-molecule biosensing devices and can be used to detect and sequence DNA, RNA, and even proteins. Recently, they have been used in SARS-CoV-2 virus sequencing.
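To make that readout principle concrete, here’s a small sketch of my own (the numbers are invented for illustration) in which a molecule transiting the pore appears as a dip below the open-pore baseline current,

```python
import random

BASELINE, NOISE = 2.0, 0.02   # invented numbers: open-pore current (nA) and noise

# Simulated current trace: a molecule transits the pore around samples 40-54
trace = [BASELINE + random.gauss(0, NOISE) for _ in range(100)]
trace[40:55] = [0.6 * BASELINE + random.gauss(0, NOISE) for _ in range(15)]

# A translocation event shows up as samples well below the open-pore baseline
event = [i for i, c in enumerate(trace) if c < 0.8 * BASELINE]
print(f"blockade detected from sample {event[0]} to {event[-1]}")
```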
Solid-state nanopores are an extremely versatile type of nanopore formed in ultrathin membranes (less than 50 nanometres thick), made from materials such as silicon nitride (SiNx). Solid-state nanopores can be created with a range of diameters and can withstand a multitude of conditions. One of the most appealing techniques with which to fabricate nanopores is Controlled Breakdown (CBD). This technique is quick, reduces fabrication costs, does not require specialized equipment, and can be automated.
CBD is a technique in which an electric field is applied across the membrane to induce a current. At some point, a spike in the current is observed, signifying pore formation. The voltage is then quickly reduced to ensure the fabrication of a single, small nanopore.
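In control terms, CBD amounts to a watch-and-cut feedback loop. Here’s a minimal sketch of that logic, with a simulated membrane current standing in for a real instrument; the voltages and threshold are invented values, not settings from the study,

```python
import random

# Hypothetical instrument interface: the membrane current here is simulated,
# with a sudden jump standing in for dielectric breakdown (pore formation).
def read_current(voltage, t):
    leakage = 1e-9 * voltage                  # small steady leakage current (A)
    spike = 1e-6 if t > 50 else 0.0           # abrupt rise once a pore forms
    return leakage + spike + random.gauss(0, 1e-11)

V_FAB, V_HOLD = 8.0, 0.5      # invented values: fabrication vs. safe holding voltage
SPIKE_THRESHOLD = 1e-7        # invented value: current jump that signals a pore

voltage = V_FAB
for t in range(200):
    if read_current(voltage, t) > SPIKE_THRESHOLD:
        voltage = V_HOLD      # cut the field at once to keep the pore single and small
        print(f"current spike at step {t}: pore formed, voltage reduced")
        break
```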
The mechanisms underlying this process have not been fully elucidated, so an international team involving ITQB NOVA decided to investigate further how electrical conduction through the membrane occurs during breakdown, namely how oxidation and reduction reactions (redox reactions, which involve electron loss or gain, respectively) influence the process. To do this, the team created three devices in which the electric field is applied to the membrane (a silicon-rich SiNx membrane) in different ways: via metal electrodes on both sides of the membrane; via electrolyte solutions on both sides of the membrane; and via a mixed device with a metal electrode on one side and an electrolyte solution on the other.
Results showed that redox reactions must occur at the membrane-electrolyte interface, whilst the metal electrodes circumvent this need. The team also demonstrated that, because of this phenomenon, nanopore fabrication could be localized to certain regions by performing CBD with metal microelectrodes on the membrane surface. Finally, by varying the content of silicon in the membrane, the investigators demonstrated that conduction and nanopore formation are highly dependent on the membrane material, since it limits the electrical current in the membrane.
“Controlling the location of nanopores has been of interest to us for a number of years”, says James Yates. Pedro Sousa adds that “our findings suggest that CBD can be used to integrate pores with complementary micro or nanostructures, such as tunneling electrodes or field-effect sensors, across a range of different membrane materials.” These devices may then be used for the detection of specific molecules, such as proteins, DNA, or antibodies, and applied to a wide array of scenarios, including pandemic surveillance or food safety.
This project was developed by a research team led by ITQB NOVA’s James Yates and has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation program (grant agreements No. 724300 and 875525). Co-author Pedro Miguel Sousa is also from ITQB NOVA. The other consortium members are from the University of Oxford, Oak Ridge National Laboratory, Imperial College London and Queen Mary University of London. The authors would like to thank Andrew Briggs for providing financial support.
Here’s a link to and a citation for the paper,
Understanding Electrical Conduction and Nanopore Formation During Controlled Breakdown by Jasper P. Fried, Jacob L. Swett, Binoy Paulose Nadappuram, Aleksandra Fedosyuk, Pedro Miguel Sousa, Dayrl P. Briggs, Aleksandar P. Ivanov, Joshua B. Edel, Jan A. Mol, James R. Yates. Small. DOI: https://doi.org/10.1002/smll.202102543 First published: 01 August 2021
Caption: Researchers at ORNL’s Center for Nanophase Materials Sciences demonstrated the first example of capacitance in a lipid-based biomimetic membrane, opening nondigital routes to advanced, brain-like computation. Credit: Michelle Lehman/Oak Ridge National Laboratory, U.S. Dept. of Energy
The last time I wrote about memcapacitors (June 30, 2014 posting: Memristors, memcapacitors, and meminductors for faster computers), the ideas were largely theoretical; I believe this work is the first research I’ve seen on the topic. From an October 17, 2019 news item on ScienceDaily,
Researchers at the Department of Energy’s Oak Ridge National Laboratory [ORNL], the University of Tennessee and Texas A&M University demonstrated bio-inspired devices that accelerate routes to neuromorphic, or brain-like, computing.
Results published in Nature Communications report the first example of a lipid-based “memcapacitor,” a charge storage component with memory that processes information much like synapses do in the brain. Their discovery could support the emergence of computing networks modeled on biology for a sensory approach to machine learning.
“Our goal is to develop materials and computing elements that work like biological synapses and neurons—with vast interconnectivity and flexibility—to enable autonomous systems that operate differently than current computing devices and offer new functionality and learning capabilities,” said Joseph Najem, a recent postdoctoral researcher at ORNL’s Center for Nanophase Materials Sciences, a DOE Office of Science User Facility, and current assistant professor of mechanical engineering at Penn State.
The novel approach uses soft materials to mimic biomembranes and simulate the way nerve cells communicate with one another.
The team designed an artificial cell membrane, formed at the interface of two lipid-coated water droplets in oil, to explore the material’s dynamic, electrophysiological properties. At applied voltages, charges build up on both sides of the membrane as stored energy, analogous to the way capacitors work in traditional electric circuits.
But unlike regular capacitors, the memcapacitor can “remember” a previously applied voltage and—literally—shape how information is processed. The synthetic membranes change surface area and thickness depending on electrical activity. These shapeshifting membranes could be tuned as adaptive filters for specific biophysical and biochemical signals.
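To get a feel for how voltage-history-dependent geometry produces memory, here’s a toy model of my own construction (not the equations from the paper): capacitance is C = εA/d, and the membrane’s area and thickness each relax toward voltage-dependent targets, so the charge traced against the drive voltage forms a hysteresis loop,

```python
import math

EPS = 1.0                 # normalized permittivity
A, d = 1.0, 1.0           # normalized membrane area and thickness
TAU = 20.0                # relaxation time, in simulation steps

for t in range(200):
    V = math.sin(2 * math.pi * t / 100)        # sinusoidal drive voltage
    A += ((1.0 + 0.3 * V ** 2) - A) / TAU      # area creeps toward a voltage-dependent target
    d += ((1.0 - 0.2 * V ** 2) - d) / TAU      # thickness creeps the opposite way
    C = EPS * A / d                            # instantaneous capacitance
    Q = C * V                                  # stored charge depends on the drive's history
    if t % 25 == 0:
        print(f"t={t:3d}  V={V:+.2f}  C={C:.3f}  Q={Q:+.3f}")
```

Because A and d lag the drive, the same instantaneous voltage can correspond to different capacitance values depending on what came before, which is the ‘memory’ in memcapacitor.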
“The novel functionality opens avenues for nondigital signal processing and machine learning modeled on nature,” said ORNL’s Pat Collier, a CNMS staff research scientist.
A distinct feature of all digital computers is the separation of processing and memory. Information is transferred back and forth between the hard drive and the central processor, creating an inherent bottleneck in the architecture no matter how small or fast the hardware can be.
Neuromorphic computing, modeled on the nervous system, employs architectures that are fundamentally different in that memory and signal processing are co-located in memory elements—memristors, memcapacitors and meminductors.
These “memelements” make up the synaptic hardware of systems that mimic natural information processing, learning and memory.
Systems designed with memelements offer advantages in scalability and low power consumption, but the real goal is to carve out an alternative path to artificial intelligence, said Collier.
Tapping into biology could enable new computing possibilities, especially in the area of “edge computing,” such as wearable and embedded technologies that are not connected to a cloud but instead make on-the-fly decisions based on sensory input and past experience.
Biological sensing has evolved over billions of years into a highly sensitive system with receptors in cell membranes that are able to pick out a single molecule of a specific odor or taste. “This is not something we can match digitally,” Collier said.
Digital computation is built around digital information, the binary language of ones and zeros coursing through electronic circuits. It can emulate the human brain, but its solid-state components do not compute sensory data the way a brain does.
“The brain computes sensory information pushed through synapses in a neural network that is reconfigurable and shaped by learning,” said Collier. “Incorporating biology—using biomembranes that sense bioelectrochemical information—is key to developing the functionality of neuromorphic computing.”
While numerous solid-state versions of memelements have been demonstrated, the team’s biomimetic elements represent new opportunities for potential “spiking” neural networks that can compute natural data in natural ways.
Spiking neural networks are intended to simulate the way neurons spike with electrical potential and, if the signal is strong enough, pass it on to their neighbors through synapses, carving out learning pathways that are pruned over time for efficiency.
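That spiking behaviour is simple enough to sketch. Below is the generic textbook leaky integrate-and-fire neuron (my illustration, not the team’s model): it leaks charge, integrates weighted input spikes and fires when its potential crosses a threshold,

```python
THRESHOLD, RESET, LEAK = 1.0, 0.0, 0.95   # firing threshold, reset level, leak factor

def lif(input_spikes, weight=0.3):
    v, output = RESET, []
    for s in input_spikes:
        v = v * LEAK + weight * s     # leak a little, then integrate the weighted input
        if v >= THRESHOLD:
            output.append(1)          # strong enough: fire and pass the spike on
            v = RESET
        else:
            output.append(0)
    return output

print(lif([1, 1, 1, 1, 0, 0, 1, 1, 1, 1]))   # fires once the integrated input crosses threshold
```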
A bio-inspired version with analog data processing is a distant aim. Current early-stage research focuses on developing the components of bio-circuitry.
“We started with the basics, a memristor that can weigh information via conductance to determine if a spike is strong enough to be broadcast through a network of synapses connecting neurons,” said Collier. “Our memcapacitor goes further in that it can actually store energy as an electric charge in the membrane, enabling the complex ‘integrate and fire’ activity of neurons needed to achieve dense networks capable of brain-like computation.”
The team’s next steps are to explore new biomaterials and study simple networks to achieve more complex brain-like functionalities with memelements.
Here’s a link to and a citation for the paper,
Dynamical nonlinear memory capacitance in biomimetic membranes by Joseph S. Najem, Md Sakib Hasan, R. Stanley Williams, Ryan J. Weiss, Garrett S. Rose, Graham J. Taylor, Stephen A. Sarles & C. Patrick Collier. Nature Communications volume 10, Article number: 3239 (2019) DOI: https://doi.org/10.1038/s41467-019-11223-8 Published July 19, 2019
This paper is open access.
One final comment: you might recognize one of the authors, R. Stanley Williams, who in 2008 helped launch ‘memristor’ research.
While it’s true enough in English, where you don’t spell the word team with the letter ‘i’, that’s not the case in French, where the word is ‘équipe’. It makes me wonder how many other languages in the world have an ‘i’ in team.
Moving on. This English language saying is true enough in its way, but there is no team unless you have a group of ‘I’s, and the trick is getting them to work together, as a July 18, 2019 Northwestern University news release (received via email) about a new online training tool notes,
Coaching scientists to play well together
Free tool shows how to avoid fights over data and authorship conflicts
- ‘You stole my idea’ or ‘I’m not getting credit for my work’ are common disputes
- Only tool validated by research to help scientists collaborate smoothly
- Many NSF [US National Science Foundation] and NIH [US National Institutes of Health] grants now require applicants to show readiness for team science
- Scientists can’t do it on their own
CHICAGO — When scientists from different disciplines collaborate – as is increasingly necessary to confront the complexity of challenging research problems – interpersonal tussles often arise. One scientist may accuse another of stealing her ideas. Or, a researcher may feel he is not getting credit for his work or doesn’t have access to important data.
“Interdisciplinary team science is now the state of the art across all branches of science and engineering,” said Bonnie Spring, professor of preventive medicine at Northwestern University Feinberg School of Medicine. “But very few scientists have been trained to work with others outside of their own disciplinary silo.”
The skill is critical because many National Institutes of Health and National Science Foundation grants require applicants to show readiness for team science.
A free, online training tool developed by Northwestern — teamscience.net — has been proven to help scientists develop skills to work with other scientists outside their own discipline.
A new study led by Spring showed scientists who completed the program’s modules – called COALESCE – significantly boosted their knowledge about team science and increased their self-confidence about being able to successfully work in scientific teams. Most people who completed one or more modules (84%) said that the experience of taking the modules was very likely to positively impact their future research.
The study will be published July 18 [2019] in the Journal of Clinical and Translational Science.
There are few training resources to teach scientists how to collaborate, and the ones that are available don’t have evidence of their effectiveness. Teamscience.net is the only free, validated-by-research tool available to anyone at any time.
Almost 1,000 of the COALESCE users opted voluntarily to respond to questions about the learning modules, providing information about how taking each module influenced team science knowledge, skills and attitudes.
‘You stole my idea’
The most common area of dispute among collaborating scientists is authorship concerns, such as accusations that one person stole ideas from another or that a contributor was not getting credit for his or her work, the study authors said. Other disputes arise around access to and analysis of data, utilization of materials or resources and the general direction of the research itself. Underlying all of these issues is a common failure to prepare for working collaboratively with other scientists.
“Preparing in advance before starting to collaborate, often through the creation of a formal collaboration agreement document, is the best way to head off these types of disputes,” said Angela Pfammatter, assistant professor of preventive medicine at Feinberg and a coauthor on the paper.
Spring suggested “having scientists discuss their expectations of one another and the collaboration to prevent acrimonious conflicts.”
Skills to play well together
These skills are critical to a successful scientific team, the authors said:
- The ability to choose team members who have the right mix of expertise, temperament and accessibility to round out a team.
- The ability to anticipate what could go wrong and to develop contingency plans in advance.
- The ability to manage conflict within the team.
The teamscience.net modules help scientists acquire these skills by letting them interact with different problem scenarios that can arise in team-based research. Scientists can try out different solutions and learn from mistakes in a safe, online environment.
More than 16,000 people have accessed the resource in the past six years. Demand for team science training is expected to increase as interdisciplinary teams set out to tackle some of science’s most challenging problems.
Other Northwestern authors on the paper are Ekaterina Klyachko, Phillip Rak, H. Gene McFadden, Juned Siddique and Leland Bardsley.
Funding support for COALESCE is from the National Institutes of Health, National Center for Advancing Translational Sciences grants 3UL1RR025741 and UL1TR001422 and its Office of Behavioral and Social Sciences Research.
I once got caught here on this blog between two warring scientists. My August 24, 2015 posting was a pretty standard one for me. Initially, it was one of my more minimalistic pieces, with a copy of the text from a university news release announcing the research and a link to the academic paper. I can’t remember if the problem was which scientist was listed first and which was listed last, but one of them took exception and contacted me to explain how it was wrong. (Note: These decisions are not made by me.) I did my best to fix whatever the problem was, and then the other scientist contacted me. After the dust settled, I ended up with a dog’s breakfast for my posting and a new policy.
Getting back to COALESCE: I wish the Northwestern University researchers all the best as they look for ways to help scientists work together more smoothly and cooperatively.