In addition to the competition to develop commercial quantum computing, there’s the competition to develop commercial nuclear fusion energy. I have four stories about nuclear fusion, one from Spain, one from China, one from the US, and one from Vancouver. There are also a couple of segues into history and the recently (April 2, 2025) announced US tariffs (chaos has since ensued as these have become ‘on again/off again’ tariffs) but the bulk of this posting is focused on the latest (January – early April 2025) in fusion energy.
Fission nuclear energy, where atoms are split, is better known; fusion energy is released when light atomic nuclei merge, the process that powers stars. For anyone unfamiliar with the word tokamak as applied to nuclear fusion (which is mentioned in all the stories), you can find out more in the Tokamak Wikipedia entry.
In a pioneering approach to achieve fusion energy, the SMART device has successfully generated its first tokamak plasma. This step brings the international fusion community closer to achieving sustainable, clean, and virtually limitless energy through controlled fusion reactions.
The SMART tokamak, a state-of-the-art experimental fusion device designed, constructed and operated by the Plasma Science and Fusion Technology Laboratory of the University of Seville, is a worldwide unique spherical tokamak due to its flexible shaping capabilities. SMART has been designed to demonstrate the unique physics and engineering properties of Negative Triangularity shaped plasmas towards compact fusion power plants based on Spherical Tokamaks.
Prof. Manuel García Muñoz, Principal Investigator of the SMART tokamak, stated: “This is an important achievement for the entire team as we are now entering the operational phase of SMART. The SMART approach is a potential game changer with attractive fusion performance and power handling for future compact fusion reactors. We have exciting times ahead!” Prof. Eleonora Viezzer, co-PI of the SMART project, adds: “We were all very excited to see the first magnetically confined plasma and are looking forward to exploiting the capabilities of the SMART device together with the international scientific community. SMART has awoken great interest worldwide.”
When negative becomes positive and compact
The triangularity describes the shape of the plasma. Most tokamaks operate with positive triangularity, meaning that the plasma shape looks like a D. When the D is mirrored (as shown in the figure on the right), the plasma has negative triangularity.
Negative triangularity plasma shapes offer enhanced performance because they suppress instabilities that expel particles and energy from the plasma, preventing severe damage to the tokamak wall. Besides offering high fusion performance, negative triangularity also features attractive power-handling solutions, since it provides a larger divertor area for distributing the heat exhaust. This also facilitates the engineering design of future compact fusion power plants.
Fusion2Grid aimed at developing the foundation for the most compact fusion power plant
SMART is the first step in the Fusion2Grid strategy, led by the PSFT team in collaboration with the international fusion community, which aims to develop the most compact and most efficient magnetically confined fusion power plant based on Negative Triangularity shaped Spherical Tokamaks.
SMART will be the first compact spherical tokamak operating at fusion temperatures with negative triangularity shaped plasmas.
The objective of SMART is to provide the physics and engineering basis for the most compact design of a fusion power plant based on high-field Spherical Tokamaks combined with Negative Triangularity. The solenoid-driven plasma represents a major achievement in the timeline of getting SMART online and advancing towards the most compact fusion device.
The Plasma Science and Fusion Technology Lab of the University of Seville hosts the SMall Aspect Ratio Tokamak (SMART) and leads several worldwide efforts on energetic particles and plasma transport and stability towards the development of magnetically confined fusion energy.
Caption: The Experimental Advanced Superconducting Tokamak achieved a remarkable scientific milestone by maintaining steady-state high-confinement plasma operation for an impressive 1,066 seconds. Credit: Image by HFIPS (Hefei Institutes of Physical Science at the Chinese Academy of Sciences)
China has made a business announcement and there is no academic paper mentioned in their January 21, 2025 press release on EurekAlert (also available on phys.org as a January 21, 2025 news item), Note: A link has been removed,
The Experimental Advanced Superconducting Tokamak (EAST), commonly known as China’s “artificial sun,” has achieved a remarkable scientific milestone by maintaining steady-state high-confinement plasma operation for an impressive 1,066 seconds. This accomplishment, reached on Monday, sets a new world record and marks a significant breakthrough in the pursuit of fusion power generation.
The duration of 1,066 seconds is a critical advancement in fusion research. This milestone, achieved by the Institute of Plasma Physics (ASIPP) at Hefei Institutes of Physical Scienece [sic] (HFIPS) of the Chinese Academy of Sciences, far surpasses the previous world record of 403 seconds, also set by EAST in 2023.
The ultimate goal of developing an artificial sun is to replicate the nuclear fusion processes that occurr [sic] in the sun, providing humanity with a limitless and clean energy source, and enabling exploration beyond our solar system.
Scientists worldwide have dedicated over 70 years to this ambitious goal. However, generating electricity from a nuclear fusion device involves overcoming key challenges, including reaching temperatures exceeding 100 million degrees Celsius, maintaining stable long-term operation, and ensuring precise control of the fusion process.
“A fusion device must achieve stable operation at high efficiency for thousands of seconds to enable the self-sustaining circulation of plasma, which is essential for the continuous power generation of future fusion plants,” said SONG Yuntao, ASIPP director and also vice president of HFIPS. He said that the recent record is monumental, marking a critical step toward realizing a functional fusion reactor.
According to GONG Xianzu, head of the EAST Physics and Experimental Operations division, several systems of the EAST device have been upgraded since the last round of experiments. For example, the heating system, which previously operated at the equivalent power of nearly 70,000 household microwave ovens, has now doubled its power output while maintaining stability and continuity.
Since its inception in 2006, EAST has served as an open testing platform for both Chinese and international scientists to conduct fusion-related experiments and research.
China officially joined the International Thermonuclear Experimental Reactor (ITER) program in 2006 as its seventh member. Under the agreement, China is responsible for approximately 9 percent of the project’s construction and operation, with ASIPP serving as the primary institution for the Chinese mission.
ITER, currently under construction in southern France, is set to become the world’s largest magnetic confinement plasma physics experiment and the largest experimental tokamak nuclear fusion reactor upon completion.
In recent years, EAST has consistently achieved groundbreaking advancements in high-confinement mode, a fundamental operational mode for experimental fusion reactors like ITER and the future China Fusion Engineering Test Reactor (CFETR). These accomplishments provide invaluable insights and references for the global development of fusion reactors.
“We hope to expand international collaboration via EAST and bring fusion energy into practical use for humanity,” said SONG.
In Hefei, Anhui Province, China, where EAST is loacated [sic], a new generation of experimental fusion research facilities is currently under construction. These facilities aim to further accelerate the development and application of fusion energy.
I always feel a little less confident about the information when there are mistakes. Three typos in the same press release? Maybe someone forgot to give it a final once-over?
Successfully harnessing the power of fusion energy could lead to cleaner and safer energy for all – and contribute substantially to combatting [UK spelling] the climate crisis. Towards this goal, Type One Energy has published a comprehensive, self-consistent, and robust physics basis for a practical fusion pilot power plant.
This groundbreaking research is presented in a series of six peer-reviewed scientific papers in a special issue of the prestigious Journal of Plasma Physics (JPP), published by Cambridge University Press.
The articles serve as the foundation for the company’s first fusion power plant project, which Type One Energy is developing with the Tennessee Valley Authority utility in the United States.
Alex Schekochihin, Professor of Theoretical Physics at the University of Oxford and Editor of the JPP, spoke with enthusiasm about this development:
“JPP is very proud to provide a platform for rigorous peer review and publication of the papers presenting the physics basis of the Infinity Two stellarator — an innovative and ground-breaking addition to the expanding family of proposed fusion power plant designs.
“Fusion science and technology are experiencing a period of very rapid development, driven by both public and private enthusiasm for fusion power. In this environment of creative and entrepreneurial ferment, it is crucial that new ideas and designs are both publicly shared and thoroughly scrutinised by the scientific community — Type One Energy and JPP are setting the gold standard for how this is done (as we did with Commonwealth Fusion Systems 5 years ago for their SPARC physics basis).”
The new physics design basis for the pilot power plant is a robust effort to consider realistically the complex relationship between challenging, competing requirements that all need to function together for fusion energy to be possible.
This new physics solution also builds on the operating characteristics of high-performing stellarator fusion technology – a stellarator being a machine that uses complex, helical magnetic fields to confine the plasma, thereby enabling scientists to control it and create suitable conditions for fusion. This technology is already being used with success on the world’s largest research stellarator, the Wendelstein 7-X, located in Germany, but the challenge embraced by Type One Energy’s new design is how to scale it up to a pilot plant.
Building the future of energy
Functional fusion technology could offer limitless clean energy. As global energy demands increase and energy security is also front of mind, this new physics design basis comes at an excellent time.
Christofer Mowry, CEO of Type One Energy, is cognisant of the landmark nature of his company’s achievement and proud of its strong, real-world foundations.
“The physics basis for our new fusion power plant is grounded in Type One Energy’s expert knowledge about reliable, economic, electrical generation for the power grid. We have an organisation that understands this isn’t only about designing a science project.”
This research was developed collaboratively between Type One Energy and a broad coalition of scientists from national laboratories and universities around the world. Collaborating organisations included the US Department of Energy, whose supercomputers, such as the exascale Frontier machine at Oak Ridge National Laboratory, were used to perform the physics simulations.
While commercial fusion energy has yet to move from theory into practice, this new research marks an important and promising milestone. Clean and abundant energy may yet become reality.
This is not directly related to fusion energy, so you might want to skip this section.
Caption: Type One Energy employees at the Bull Run [emphasis mine] Fossil Plant, soon to be home to the prototype Infinity One. Credit: Type One Energy
I wonder if anyone argued for a change of name given how charged the US history associated with ‘Bull Run’ is, from the First Battle of Bull Run Wikipedia entry, Note: Links have been removed,
The First Battle of Bull Run, called the Battle of First Manassas[1] by Confederate forces, was the first major battle of the American Civil War. The battle was fought on July 21, 1861, in Prince William County, Virginia, just north of what is now the city of Manassas and about thirty miles west-southwest of Washington, D.C. The Union Army was slow in positioning themselves, allowing Confederate reinforcements time to arrive by rail. Each side had about 18,000 poorly trained and poorly led troops. The battle was a Confederate victory and was followed by a disorganized post-battle retreat of the Union forces.
…
A Confederate victory the first time and the second time (Second Battle of Bull Run Wikipedia entry)? For anyone unfamiliar with the history, the US Civil War was fought from 1861 to 1865 between Union and Confederate forces. The Confederate states had seceded from the Union (US) and were fighting to retain their slavery-based economy; they lost the war.
Had anyone consulted me, I would have advised changing the name from Bull Run to something less charged (pun noted) before hosting a prototype fusion energy pilot plant there.
Back to the usual programme.
Type One Energy
Type One Energy issued a March 27, 2025 news release about the special issue of the Journal of Plasma Physics (JPP), Note 1: Some of this is redundant; Note 2: Links have been removed,
Type One Energy announced today publication of the world’s first comprehensive, self-consistent, and robust physics basis, with conservative design margins, for a practical fusion pilot power plant. This physics basis is presented in a series of seven peer-reviewed scientific papers in a special issue of the prestigious Journal of Plasma Physics (JPP). They serve as the foundation for the company’s first Infinity Two stellarator fusion power plant project, which Type One Energy is developing for the Tennessee Valley Authority (TVA) utility in the U.S.
The Infinity Two fusion pilot power plant physics design basis realistically considers, for the first time, the complex relationship between competing requirements for plasma performance, power plant startup, construction logistics, reliability, and economics utilizing actual power plant operating experience. This Infinity Two baseline physics solution makes use of the inherently favorable operating characteristics of highly optimized stellarator fusion technology using modular superconducting magnets, as was so successfully proven on the W7-X science machine in Germany.
“Why are we the first private fusion company with an agreement to develop a potential fusion power plant project for an energy utility? Because we have a design anchored in reality,” said Christofer Mowry, CEO of Type One Energy. “The physics basis for Infinity Two is grounded in the knowledge of what is required for application to, and performance in, the demanding environment of reliable electrical generation for the power grid. We have an organization that understands this isn’t about designing a science project.”
Led by Chris Hegna, widely recognized as a leading theorist in modern stellarators, Type One Energy performed high-fidelity computational plasma physics analyses to substantially reduce the risk of meeting Infinity Two power plant functional and performance requirements. This unique and transformational achievement is the result of a global development program led by the Type One Energy plasma physics and stellarator engineering organization, with significant contributions from a broad coalition of scientists from national laboratories and universities around the world. The company made use of a spectrum of high-performance computing facilities, including access to the highest-performance U.S. Department of Energy supercomputers such as the exascale Frontier machine at Oak Ridge National Laboratory (ORNL), to perform its stellarator physics simulations.
“We committed to this ambitious fusion commercialization milestone two years ago and today we delivered,” said John Canik, Chief Science and Engineering Officer for Type One Energy. “The team was able to efficiently develop deep plasma physics insights to inform the design of our Infinity Two stellarator, by taking advantage of our access to high performance computing resources. This enabled the Type One Energy team to demonstrate a realistic, integrated stellarator design that moves far beyond conventional thinking and concepts derived from more limited modeling capabilities.”
The consistent and robust physics solution for Infinity Two results in a deuterium-tritium (D-T) fueled, burning plasma stellarator with 800 MW of fusion power and delivers a nominal 350 MWe to the power grid. It is characterized by fusion plasma with resilient and stable behavior across a broad range of operating conditions, very low heat loss due to turbulent transport, as well as tolerable direct energy losses to the stellarator first wall. The Infinity Two stellarator has sufficient room for both adequately sized island divertors to exhaust helium ash and a blanket which provides appropriate shielding and tritium breeding. Type One Energy has high confidence that this essential physics solution provides a good baseline stellarator configuration for the Infinity Two fusion pilot power plant.
“The articles in this issue [of JPP] represent an important step towards a fusion reactor based on the stellarator concept. Thanks to decades of experiments and theoretical research, much of the latter published in JPP, it has become possible to lay out the physics basis for a stellarator power plant in considerable detail,” said Per Helander, head of Stellarator Theory Division at the Max Planck Institute for Plasma Physics. “JPP is very happy to publish this series of papers from Type One Energy, where this has been accomplished in a way that sets new standards for the fidelity and confidence level in this context.”
Important to successful fusion power plant commercialization, this stellarator configuration has enabled Type One Energy to architect a maintenance solution which supports good power plant Capacity Factors (CF) and associated Levelized Cost of Electricity (LCOE). It also supports favorable regulatory requirements for component manufacturing and power plant construction methods essential to achieving a reasonable Over-Night Cost (ONC) for Infinity Two.
About Type One Energy
Type One Energy Group is mission-driven to provide sustainable, affordable fusion power to the world. Established in 2019 and venture-backed in 2023, the company is led by a team of globally recognized fusion scientists with a strong track record of building state-of-the-art stellarator fusion machines, together with veteran business leaders experienced in scaling companies and commercializing energy technologies. Type One Energy applies proven advanced manufacturing methods, modern computational physics and high-field superconducting magnets to develop its optimized stellarator fusion energy system. Its FusionDirect development program pursues the lowest-risk, shortest-schedule path to a fusion power plant over the coming decade, using a partner-intensive and capital-efficient strategy. Type One Energy is committed to community engagement in the development and deployment of its clean energy technology. For more information, visit www.typeoneenergy.com or follow us on LinkedIn.
While the company is currently headquartered in Knoxville, Tennessee, it was originally a spinoff company from the University of Wisconsin-Madison according to a March 30, 2023 posting on the university’s College of Engineering website,
Type One Energy, a Middleton, Wisconsin-based fusion energy company with roots in the University of Wisconsin-Madison’s College of Engineering, recently announced its first round of seed funding, raising $29 million from investors. The company has also onboarded a new, highly experienced CEO [Christofer Mowry].
Type One, founded in 2019 by a team of globally recognized fusion scientists and business leaders, is hoping to commercialize stellarator technology over the next decade. Stellarators are a type of fusion reactor that uses powerful magnets to confine ultra-hot streams of plasma in order to create the conditions for fusion reactions. Energy from fusion promises to be clean, safe, renewable power. The company is using advanced manufacturing methods, modern computational physics and high-field superconducting magnets to develop its stellarator through an initiative called FusionDirect.
…
According to Type One Energy’s About page, there are four offices with the headquarters in Tennessee,
Madison 316 W Washington Ave. Suite 300 Madison, WI 53703
Boston 299 Washington St. Suites C & E Woburn, MA 01801
Vancouver 1140 West Pender St. Vancouver, BC V6E 4G1
The mention of an office in Vancouver, Canada piqued my curiosity, but before getting to that, I’m going to include some informative excerpts about nuclear energy (both fission and fusion) from this August 31, 2023 article written by Tina Tosukhowong on behalf of TDK Ventures, which was posted on Medium,
Fusion power is the key to the energy transformation that humanity needs to drive decarbonization, clean, and baseload energy production that is inherently fail-safe, with no risk of long-lived radioactive waste, while also delivering on ever-growing energy-consumption demands at the global scale. Fusion is hard and requires exceptional conditions for sustained reaction (which is part of what makes it so safe), which has long served as a deterrent for technical maturation and industrial viability. …
…
The current reality of our world is monumental fossil-fuel dependence. This, coupled with unprecedented levels of energy demand has resulted in the over 136,700 TWh (that’s 10¹²) of energy consumed via fossil fuels annually [1]. Chief repercussion among the many consequences of this dependence is the now very looming threat of climate catastrophe, which will soon be irreversible if global temperature rise is not abated and held to within 1.5 °C of pre-industrial levels. To do so, the nearly 40 gigatons of CO2 emissions generated each year must be steadily reduced and eventually mitigated entirely [2]. A fundamental shift in how power is generated globally is the only way forward. Humanity needs an energy transformation — the right energy transformation.
Alternative energy-generation techniques, such as wind, solar, geothermal, and hydroelectric approaches have all made excellent strides, and indeed in just the United States electricity generated by renewable methods doubled from 10 to 20% of total between 2010 and 2020 [3–4]. These numbers are incredibly encouraging and give significant credence in the journey to net-zero emission energy generation. However, while these standard renewable approaches should be championed, wind and solar are intermittent and require a large amount of land to deploy, while geothermal and hydroelectric are not available in every geography.
By far the most viable candidates for continuous clean energy generation to replace coal-fired power plants are nuclear-driven technologies, i.e. nuclear fission or nuclear fusion. Nuclear fission has been a proven effective method ever since it was first demonstrated almost 80 years ago underneath the University of Chicago football Stadium by Nobel Laureate Enrico Fermi [5]. Heavier atomic elements, in most cases Uranium-235, are exposed to and bombarded by neutrons. This causes the Uranium to split resulting in two slightly less-heavy elements (like Barium and Krypton). This in turn causes energy to be released and more neutrons to be ejected and bombard other nearby Uranium-235, at which point the process cascades into a chain reaction. The released energy (heat) is utilized in the same way coal is burned in a traditional power plant, being subsequently used to generate electricity usually via the creation of steam to drive a turbine [6]. While already having reached viable commercial maturity, fission carries inherent and nontrivial safety concerns. An unhampered chain reaction can quickly lead to meltdown with disastrous consequences, and, even when properly managed, the end reaction does generate radioactive waste whose half-life can last hundreds of thousands of years.
Figure 1. Breakdown of a nuclear fission reaction [6]. Incident neutron bombards a fissile heavy element, splitting it and release energy and more nuclei setting off a chain reaction.
Especially given modernization efforts and meteoric gains in safety (thanks to advents in material science like ceramic coatings), fission will continue to be a critical piece to better, greener energy transformation. However, in extending our vision to an even brighter future with no such concerns — carbon emissions or safety — nuclear fusion is humanity’s silver bullet. Instead of breaking down atoms leading to a chain reaction, fusion is the combining of atoms (usually isotopes of Hydrogen) into heavier elements which also results in energy release / heat generation [7]. Like fission, fusion can be designed to be a continuous energy source that can serve as a permanent backbone to the power grid. It is extremely energy dense, with 1 kg of fusion fuel producing the same amount of energy as 1,000,000 kg of coal, and it is inherently fail-safe with no long-term radioactive waste.
…
As a concept, if fusion is a silver bullet to answer humanity’s energy transformation needs, then why haven’t we done so already? The appeal seems so obvious, what’s the hold up? Simply put, nuclear fusion is hard for the very same reason the process is inherently safe. Atoms in the process must have enough energy to overcome electrostatic repulsive forces between the two positive charges of their nuclei to fuse. The key figure of merit to evaluate fusion is the so-called “Lawson Triple Product.” Essentially, this means in order to generate energy by fusion at a rate greater than the rate of energy loss to the environment, the nuclei must be very close together (as represented by n — the plasma density), kept at a high enough temperature (as represented by T — temperature), and for long enough time to sustain fusion (as represented by τ — the confinement time). The triple product required to achieve fusion “ignition” (the state where the rate of energy production is higher than the rate of loss) depends on the fuel type and occurs within a plasma state. A deuterium and tritium (D-T) system has the lowest Lawson Triple product requirement, where fusion can achieve a viable threshold for ignition when the density of the fuel atoms, n, multiplied by the fuel temperature, T, multiplied by the confinement time, τ, is greater than 5×10²¹ (nTτ > 5×10²¹ keV-s/m³) [8–9]. For context, the temperature alone in this scenario must be higher than 100-million degrees Celsius.
Figure 2. (Left) Conceptual illustration of a fusion reaction with Deuterium (²H) and Tritium (³H) forming an Alpha particle (⁴He) and a free neutron, along with energy released as heat. (Right) To initiate fusion, repelling electrostatic charge must be overcome via conditions meeting the minimum Lawson Triple Product threshold.
…
Tosukhowong’s August 31, 2023 article provides a good overview, keeping in mind that it is slanted to justify TDK’s investment in Type One Energy.
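Since the Lawson triple product comes up again later in this posting (General Fusion’s demonstration machine is literally named Lawson Machine 26), here’s a minimal sketch, in Python, of the ignition check Tosukhowong describes. The 5×10²¹ keV-s/m³ threshold is the D-T figure quoted above; the function names and the sample plasma values are mine, purely for illustration.

# A minimal sketch of the Lawson triple product check for deuterium-tritium (D-T) fuel.
# The 5e21 keV·s/m^3 threshold is the figure quoted in Tosukhowong's article; the sample
# numbers below are illustrative only and do not describe any particular machine.

DT_IGNITION_THRESHOLD = 5e21  # keV·s/m^3, approximate ignition threshold for D-T fuel

def lawson_triple_product(density_per_m3, temperature_kev, confinement_time_s):
    """Return the triple product n*T*tau in keV·s/m^3."""
    return density_per_m3 * temperature_kev * confinement_time_s

def reaches_ignition(density_per_m3, temperature_kev, confinement_time_s):
    """True if the triple product meets or exceeds the D-T ignition threshold."""
    return lawson_triple_product(density_per_m3, temperature_kev, confinement_time_s) >= DT_IGNITION_THRESHOLD

# Hypothetical plasma: 1e20 particles per cubic metre, 10 keV, 3 seconds of confinement.
n, T, tau = 1e20, 10.0, 3.0
print(lawson_triple_product(n, T, tau))  # 3e+21 keV·s/m^3
print(reaches_ignition(n, T, tau))       # False: still short of the 5e21 threshold

As the hypothetical numbers show, even a plasma hotter than 100 million degrees falls short of ignition if its density or confinement time is too low, which is the crux of why fusion is hard.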
Why a Vancouver, Canada office?
As for Type One Energy’s Vancouver (British Columbia, Canada) connection, I was reminded of General Fusion, a local fusion energy company, while speculating about the connection. First speculative question: could Type One Energy’s presence in Canada allow it to access Canadian government funds for its research? Second speculative question: do they want to have access to people who might hesitate to move to the US or might want to move out of the US but would move to Canada?
The US is currently in an unstable state as suggested in this April 3, 2025 opinion piece by Les Leyne for vancouverisawesome.com,
Les Leyne: Trump’s incoherence makes responding to tariff wall tricky
Trump’s announcement was so incoherent that much of the rest of the world had to scramble to grasp even the basic details
B.C. officials were guarded Wednesday [April 2, 2025] about the impact on Canada of the tariff wall U.S. President Donald Trump erected around the U.S., but it appears it could have been worse.
Trump’s announcement was so incoherent that much of the rest of the world had to scramble to grasp even the basic details. So cabinet ministers begged for more time to check the impacts.
“It’s still very uncertain,” said Housing Minister Ravi Kahlon, who chairs the “war room” committee responsible for countering tariff threats. “It’s hard to make sense from President Trump’s speech.” [emphasis mine]
Kahlon said the challenge is that tariff policies change hour by hour, “and anything can happen.”
…
On April 2, 2025 US President Donald Trump announced tariffs (then paused some of the tariffs on April 9, 2025) and some of the targets seemed a bit odd, from an April 2, 2025 article by Alex Galbraith for salon.com, Note: Links have been removed,
“Trade war with penguins”: Trump places 10% tariff on uninhabited Antarctic islands
Planned tariffs shared by the White House included a 10% duty on imports from the barren Heard and McDonald Islands
For once in his life, Donald Trump underpromised and over-delivered.
The president announced a 10% duty on all imports on Wednesday [April 2, 2025], along with a raft of reciprocal tariffs on U.S. trading partners. An extensive graphic released by the White House showed how far Trump was willing to take his tit-for-tat trade war, including a shocking levy of 10% on all imports from the Heard and McDonald Islands.
If you haven’t heard of this powerhouse of global trade and territory of Australia, you aren’t alone. Few have outside of Antarctic researchers and seals. These extremely remote islands about 1,000 miles north of Antarctica consist mostly of barren tundra. They’re also entirely uninhabited.
The news that we were starting a trade war with penguins spread quickly after Trump’s announcement. …
U.S. stock futures crumbled following the news of Trump’s widespread tariffs. Dow futures fell by nearly 1,000 points while NASDAQ and S&P futures fell by 3 to 4%. American companies’ stock values rapidly tumbled after the announcement, with large retail importers seeing significant losses. …
No word from the penguins about the ‘pause’. I’m assuming Donald Trump’s next book will be titled, “The art of negotiating trade deals with penguins.” Can’t wait to read it.
(Perhaps someone should tell him there are no penguins in the Arctic so he can’t bypass Canadians or Greenlanders to make a deal.)
Now for the local story.
General Fusion
There’ve been two recent developments at General Fusion. Most recently, an April 2, 2025 General Fusion news release announces a new hire, Note: Links have been removed,
Bob Smith is joining General Fusion as a strategic advisor. Smith brings more than 35 years of experience developing, scaling, and launching world-changing technologies, including spearheading new products and innovation in the aerospace industry at United Space Alliance, Sandia Labs, and Honeywell before serving as CEO of Blue Origin. He joins General Fusion as the company’s Lawson Machine 26 (LM26) fusion demonstration begins operations and progresses toward transformative technical milestones on the path to commercialization.
“I’ve been watching the fusion energy industry closely for my entire career. Fusion is the last energy source humanity will ever need, and I believe its impact as a zero-carbon energy source will transform the global energy supply at the time needed to fight the worst consequences of climate change,” said Smith. “I am thrilled to work with General Fusion. Their novel approach has inherent and distinctive benefits for the generation of commercially competitive fusion power. It’s exciting to join at a time when the team is about to demonstrate the fundamental physics behind their system and move to scaling up to a pilot plant.”
The LM26 program marks a significant step towards commercialization, as the company’s unique Magnetized Target Fusion (MTF) approach makes the path to powering the grid with fusion energy more straightforward than other technologies—because it practically addresses barriers to fusion commercialization, such as neutron material degradation, sustainable fuel production, and efficient energy extraction. As a strategic advisor, Smith will leverage his experience advancing game-changing technologies to help guide General Fusion’s technology development and strategic growth.
“Bob’s insights and experience will be invaluable as we execute the LM26 program and look beyond it to propel our practical technology to powering the grid by the mid-2030s,” said Greg Twinney, CEO, General Fusion. “We are grateful for his commitment of his in-demand time and expertise to our mission and look forward to working together to make fusion power a reality!”
About Bob Smith:
Bob is an experienced business leader in the aerospace and defense industry with extensive technical and operational expertise across the sector. He worked at and managed federal labs, led developments at a large government contractor, grew businesses at a Fortune 100 multinational, and scaled up a launch and space systems startup. Bob also has extensive international experience and has worked with suppliers and OEMs in all the major aerospace regions, including establishing new sites and factories in Europe, India, China, and Puerto Rico.
Bob’s prior leadership roles include Chairman and Chief Executive Officer of Blue Origin, President of Mechanical Systems & Components at Honeywell Aerospace, Chief Technology Officer at Honeywell Aerospace, Chairman of NTESS (Sandia Labs), and Executive Director of Space Shuttle Upgrades at United Space Alliance.
Bob holds a Bachelor of Science degree in aerospace engineering from Texas A&M, a Master of Science degree in engineering/applied mathematics from Brown University, a doctorate from the University of Texas in aerospace engineering, and a business degree from MIT’s Sloan School of Management. Bob is also a Fellow of the Royal Aeronautical Society, a Fellow of the American Institute of Aeronautics and Astronautics, and an Academician in the International Academy of Astronautics.
Quick Facts:
Fusion energy is the ultimate clean energy solution—it is the energy source that powers the sun and stars. Fusion is the process by which two light nuclei merge to form a heavier one, producing a massive amount of energy.
General Fusion’s Magnetized Target Fusion (MTF) technology is designed to scale for cost-efficient power plants. It uses mechanical compression to create fusion conditions in short pulses, eliminating the need for expensive lasers or superconducting magnets. An MTF power plant is designed to produce its own fuel and inherently includes a method to extract the energy and put it to work.
Lawson Machine 26 (LM26) is a world-first Magnetized Target Fusion demonstration. Launched, designed, and assembled in just 16 months, the machine is now forming magnetized plasmas regularly at 50 per cent commercial scale. It is advancing towards a series of results that will demonstrate MTF in a commercially relevant way: 10 million degrees Celsius (1 keV), 100 million degrees Celsius (10 keV), and scientific breakeven equivalent (100% Lawson).
About General Fusion
General Fusion is pursuing a fast and practical approach to commercial fusion energy and is headquartered in Richmond, Canada. The company was established in 2002 and is funded by a global syndicate of leading energy venture capital firms, industry leaders, and technology pioneers. Learn more at www.generalfusion.com.
…
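A quick aside on the temperature milestones in those Quick Facts: equating 1 keV with roughly 10 million degrees Celsius rests on the standard conversion between particle energy and temperature (about 11,600 kelvin per electronvolt), so the exact figure is closer to 11.6 million degrees and the press release is rounding down. Here’s a minimal sketch of the arithmetic in Python; the constant is standard physics and the function name is mine.

# Converting plasma temperatures quoted in keV to degrees Celsius.
# Uses the standard relation 1 eV ≈ 11,604.5 K (from the Boltzmann constant).

EV_TO_KELVIN = 11_604.5  # kelvin per electronvolt

def kev_to_celsius(temperature_kev):
    """Convert a plasma temperature given in keV to degrees Celsius."""
    kelvin = temperature_kev * 1_000 * EV_TO_KELVIN
    return kelvin - 273.15  # the 273.15 offset is negligible at these temperatures

print(kev_to_celsius(1))   # roughly 1.16e7 °C, i.e. about 11.6 million degrees
print(kev_to_celsius(10))  # roughly 1.16e8 °C, i.e. about 116 million degrees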
Bob Smith and Blue Origin: things did not go well
Sometimes you end up in a job and things do not work out well, and that seems to have been the case at Blue Origin according to a September 25, 2023 article by Eric Berger for Ars Technica,
After six years of running Blue Origin, Bob Smith announced in a company-wide email on Monday that he will be “stepping aside” as chief executive of the space company founded by Jeff Bezos.
“It has been my privilege to be part of this great team, and I am confident that Blue Origin’s greatest achievements are still ahead of us,” Smith wrote in an email. “We’ve rapidly scaled this company from its prototyping and research roots to a large, prominent space business.”
Shortly after Smith’s email, a Blue Origin spokesperson said the company’s new chief executive will be Dave Limp, who stepped down as Amazon’s vice president of devices and services last month.
…
To put things politely, Smith has had a rocky tenure as Blue Origin’s chief executive. After being personally vetted and hired by Bezos, Smith took over from Rob Meyerson in 2017. The Honeywell engineer was given a mandate to transform Blue Origin into a large and profitable space business.
He did succeed in growing Blue Origin. The company had about 1,500 employees when Smith arrived, and the company now employs nearly 11,000 people. But he has been significantly late on a number of key programs, including the BE-4 rocket engine and the New Glenn rocket.
As a space reporter, I have spoken with dozens of current and former Blue Origin employees, and virtually none of them have had anything positive to say about Smith’s tenure as chief executive. I asked one current employee about the hiring of Limp on Monday afternoon, and their response was, “Anything is better than Bob.”
Although it is very far from an exact barometer, Smith has received consistently low ratings on Glassdoor for his performance as chief executive of Blue Origin. And two years ago, a group of current and former Blue Origin employees wrote a blistering letter about the company under Smith. “In our experience, Blue Origin’s culture sits on a foundation that ignores the plight of our planet, turns a blind eye to sexism, is not sufficiently attuned to safety concerns, and silences those who seek to correct wrongs,” the essay authors wrote.
With any corporate culture, there will be growing pains, of course. But Smith brought a traditional aerospace mindset into a company that had hitherto been guided by a new space vision, leading to a high turnover rate. And Blue Origin remains significantly underwater, financially. It is likely that Bezos is still providing about $2 billion a year to support the company’s cash needs.
Crucially, as Blue Origin meandered under Smith’s tenure, SpaceX soared, launching hundreds of rockets and thousands of satellites. Smith, clearly, was not the leader Blue Origin needed to make the company more competitive with SpaceX in launch and other spaceflight activities. It became something of a parlor game in the space industry to guess when Bezos would finally get around to firing Smith.
…
On the technical front, a March 27, 2025 General Fusion news release announces “Peer-reviewed publication confirms General Fusion achieved plasma energy confinement time required for its LM26 large-scale fusion machine,” Note: Links have been removed,
New results published in Nuclear Fusion confirm General Fusion successfully created magnetized plasmas that achieved energy confinement times exceeding 10 milliseconds. The published energy confinement time results were achieved on General Fusion’s PI3 plasma injector — the world’s largest and most powerful plasma injector of its kind. Commissioned in 2017, PI3 formed approximately 20,000 plasmas in a machine of 50 per cent commercial scale. The plasma injector is now integrated into General Fusion’s Lawson Machine 26 (LM26) — a world-first Magnetized Target Fusion demonstration tracking toward game-changing technical milestones that will advance the company’s ultimate mission: generating zero-carbon fusion energy for the grid in the next decade.
The 10-millisecond energy confinement time is the duration required to compress plasmas in LM26 to achieve key temperature thresholds of 1 keV, 10 keV, and, ultimately, scientific breakeven equivalent (100% Lawson). These results were imperative to de-risking LM26. The demonstration machine is now forming plasmas regularly, and the company is optimizing its plasma performance in preparation for compressing plasmas to create fusion and heating from compression.
Key Findings:
The plasma injector now integrated into General Fusion’s LM26 achieved energy confinement times exceeding 10 milliseconds, the pre-compression confinement time required for LM26’s targeted technical milestones. These results were achieved without requiring active magnetic stabilization or auxiliary heating. This means the results were achieved without superconducting magnets, demonstrating the company’s cost-effective approach.
The plasma’s energy confinement time improved when the plasma injector vessel was coated with natural lithium. A key differentiator in General Fusion’s commercial approach is its use of a liquid lithium wall to compress plasmas during compression. In addition to the confinement time advantages shown in this paper, the liquid lithium wall will also protect a commercial MTF machine from neutron damage, enable the machine to breed its own fuel, and provide an efficient method for extracting energy from the machine.
The maximum energy confinement time achieved by PI3 was approximately 12 milliseconds. The machine’s maximum plasma density was approximately 6×10¹⁹ m⁻³, and maximum plasma temperatures exceeded 400 eV. These strong pre-compression results support LM26’s transformative targets.
Quotes:
“LM26 is designed to achieve a series of results that will demonstrate MTF in a commercially relevant way. Following LM26’s results, our unique approach makes the path to powering the grid with fusion energy more straightforward than other technologies because we have front-loaded the work to address the barriers to commercialization.”
Dr. Michel Laberge Founder and Chief Science Officer
“For over 16 years, I have worked hand in hand with Michel to advance General Fusion’s practical technology. This company is entrepreneurial at its core. We pride ourselves on building real machines that get results that matter, and I’m thrilled to have the achievements recognized in Nuclear Fusion.”
Mike Donaldson Senior Vice President, Technology Development
For anyone curious about General Fusion, I have a brief overview and history of the company and their particular approach to fusion energy in my February 6, 2024 posting (scroll down to ‘The Canadians’).
I don’t usually stumble across stories about natural nanoparticles; almost all the stories here are about engineered nanoparticles. Nice to get a change of pace. Plus, I love rain. As I sit here composing this post, the rain is pelting against my windows.
This November 8, 2024 news item on ScienceDaily announces a natural nanoparticle story that is centered on the Amazon rainforest,
Atmospheric aerosol particles are essential for the formation of clouds and precipitation, thereby influencing the Earth’s energy budget, water cycle, and climate. However, the origin of aerosol particles in pristine air over the Amazon rainforest during the wet season is poorly understood. A new study, led by the Max Planck Institute for Chemistry in Mainz, reveals that rainfall regularly induces bursts of newly formed nanoparticles in the air above the forest canopy.
…
Caption: A rain front approaches the ATTO research station in the Amazon rainforest. Credit: Sebastian Brill, Max Planck Institute for Chemistry
An international research team from Germany, Brazil, Sweden, and China now showed that rainfall regularly induces bursts of nanoparticles that can grow to form cloud condensation nuclei. The scientists analyzed comprehensive long-term measurements of aerosol particles, trace gases, and meteorological data from the Amazon Tall Tower Observatory, ATTO, which is equipped with sophisticated instrumentation and measurement towers that are up to 325 m high. The observatory is located in the middle of the Amazon rainforest in northern Brazil, about 150 kilometers north-east of Manaus, and jointly operated by scientists from Germany and Brazil.
Luiz Machado, first author of the study now published in the journal Nature Geoscience, explains: “Rainfall removes aerosol particles and introduces ozone from the atmosphere into the forest canopy. Ozone can oxidize plant-emitted volatile organic compounds, especially terpenes, and the oxidation products can enhance the formation of new particles, leading to temporary bursts of nanoparticles.”
Nanoparticle concentrations are highest just above the forest canopy
The researchers discovered that nanoparticle concentrations are highest just above the forest canopy and decrease with increasing altitude. “This gradient persists throughout the wet season, indicating continuous particle formation in the canopy and an upward flux of newly formed particles that can grow by further uptake of low volatile molecules and serve as cloud condensation nuclei”, adds Christopher Pöhlker, co-author and research group leader at the Max Planck Institute for Chemistry. Among the low volatile molecules involved in the formation and growth of natural nanoparticles in the atmosphere are oxygen- and nitrogen-containing organic compounds that are formed upon oxidation of isoprene, terpenes, and other volatile organic compounds, which are naturally emitted by plants and oxidized by ozone and hydroxyl radicals in the air.
Earlier studies had detected new particle formation in the outflow of convective clouds in the upper troposphere and suggested a downward flux rather than an upward flux of newly formed nanoparticles.
“Our findings imply a paradigm shift in the scientific understanding of interactions between the rainforest, aerosols, clouds, and precipitation in the Amazon, which are important for regional and global climate”, concludes Ulrich Pöschl, co-author and director at the Max Planck Institute for Chemistry.
About ATTO: The Amazon Tall Tower Observatory (ATTO) is an internationally collaborative research site in the central Amazon, dedicated to studying atmospheric processes and the exchange of energy, water, and gases between the biosphere and atmosphere. It is one of the world’s most critical observatories for understanding the impacts of climate change on tropical forests.
Here’s a link to and a citation for the paper,
Frequent rainfall-induced new particle formation within the canopy in the Amazon rainforest by Luiz A. T. Machado, Gabriela R. Unfer, Sebastian Brill, Stefanie Hildmann, Christopher Pöhlker, Yafang Cheng, Jonathan Williams, Hartwig Harder, Meinrat O. Andreae, Paulo Artaxo, Joachim Curtius, Marco A. Franco, Micael A. Cecchini, Achim Edtbauer, Thorsten Hoffmann, Bruna Holanda, Théodore Khadir, Radovan Krejci, Leslie A. Kremper, Yunfan Liu, Bruno B. Meller, Mira L. Pöhlker, Carlos A. Quesada, Akima Ringsdorf, Ilona Riipinen, Susan Trumbore, Stefan Wolff, Jos Lelieveld & Ulrich Pöschl. Nature Geoscience volume 17, pages 1225–1232 (2024) DOI: https://doi.org/10.1038/s41561-024-01585-0 Published online: 08 November 2024 Issue Date: December 2024
Part one offers a brief discussion of the Canadian quantum scene and, more saliently, the expert panel along with some details about the agencies (three were nameless) that requested the report and the questions they wanted answered.
Getting back to the topic, this second of two parts offers some report highlights (from my perspective as someone who is not an expert on quantum technologies, or any other technology for that matter) and includes my comments interlaced with excerpts from the report.
A little history from the introduction to the report on quantum potential
Early in the 20th century, physicists believed they had a solid understanding of how the physical world functioned. Using classical theories inherited from luminaries such as Isaac Newton and James Clerk Maxwell, physicists thought that nearly all laws of the physical world could be accounted for. Lord Kelvin is often attributed as saying: “There is nothing new to be discovered in physics now. All that remains is more and more precise measurement.” [p. 2 of the print version and p. 30 of the PDF version]
Conspicuously missing from traditional accounts of the early history of quantum mechanics are the contributions of women, racialized researchers, and researchers outside Europe and North America. Despite systemic barriers and inequities in research and educational opportunities, these scientists made important contributions to the development of the field. For example, Canadian physicist Laura Chalk performed the first experiments that confirmed Schrödinger’s theory of wave mechanics. Wu Chien-Shiung, well-known for her groundbreaking parity-violation experiments, was the first to measure clear evidence of the correlations between entangled pairs of photons. … [pp. 3-4 in print version and pp. 31-32 in PDF version]
The source of the panel’s evidence is always interesting (to me, if no one else), from the Quantum Potential Introduction,
The panel’s assessment was based on a review of diverse sources of evidence, including peer-reviewed publications and grey literature (i.e., policy documents, government publications and websites, as well as reports by national and international organizations and committees). The panel also engaged with guest speakers from the Canadian Security Intelligence Service (CSIS). In October 2022, the panel made site visits to two firms, D-Wave and Photonic Inc. [emphasis mine], as part of its evidence gathering. In January 2023, it hosted a session at Quantum Days 2023, a Canadian conference for quantum academics, students, and professionals, and received input from participants on topics relevant to this assessment.
D-Wave Systems was formerly headquartered in Burnaby, British Columbia (BC); according to its Wikipedia entry, the company is now headquartered in Palo Alto, California. Many still consider it to be a BC company as it maintains a presence here. Coincidentally, a “Senior Vice-President, Quantum Technologies and Systems Products, D-Wave Systems Inc.,” Mark W. Johnson, was listed in the panel’s report as a member of the expert panel.
Coincidence No. 2: Photonic Inc. boasts Stephanie Simmons, a former member of the expert panel and the company’s founder and current Chief Quantum Officer (according to the About Us webpage‘s team member profile on the Photonic website). Simmons is also a professor at Simon Fraser University in the Vancouver metro area of British Columbia (BC) and co-chair, with Raymond Laflamme (chair of the expert panel), of Canada’s Quantum Advisory Council, which monitors the National Quantum Strategy. Photonic, Simmons’ company, is headquartered in Coquitlam, BC (according to its LinkedIn profile).
Despite the fact that members of the expert panel are mostly located in eastern Canada, they seem to have made their only site visits to ‘quantum’ businesses in what is known as the metro Vancouver area while missing a major ‘quantum’ business (Xanadu Quantum Technologies) headquartered in Toronto, Canada. Seems odd.
It gets even odder in Chapter 3, in the subsection headed “Quantum expertise is unevenly distributed across Canada” (scroll down to Chapter 3 and look for the subhead in the report excerpts), where Xanadu Quantum Technologies remains unmentioned in a listing of Ontario companies while both Photonic Inc. and D-Wave Systems appear in a listing of British Columbia companies.
Adding to the oddity of it all, Christian Weedbrook, founder and CEO of Xanadu Quantum Technologies, is listed as a member of Canada’s Quantum Advisory Council (as of October 3, 2024), representing one of the few companies to be found on the list. “The Quantum Advisory Council has been established to provide impartial advice to Innovation, Science and Economic Development Canada and monitor the progress of the National Quantum Strategy.”
Getting back to the report, here’s the last bit from the Quantum Potential Introduction,
1.5 Report Structure
Chapter 2 introduces three categories of quantum technologies — computing, communications, and sensing — and explores their commercialization potential. The chapter provides an overview of the possible economic impact of quantum technologies for select sectors in Canada that are among the most likely adopters and beneficiaries of quantum technologies. Chapter 3 describes the quantum technology landscape, positioning Canada within the global ecosystem in terms of research activity, market activity, public policy, and the international quantum value chain. Chapters 4 and 5 provide an in-depth review of the ethical, legal, social, and policy implications as well as the institutional and regulatory challenges associated with the adoption of quantum technologies. Chapter 6 examines the enabling conditions and potential levers available to the public and private sectors to advance the adoption of quantum technologies. Chapter 7 introduces a responsible approach to the adoption of quantum technologies and provides the panel’s final reflections.
Commercialization in Chapter two
“Chapter 2: Commercialization and Adoption of Quantum Technologies” offers this piece of information about why adoption of quantum computing may take some time, from Quantum Potential,
… Economic factors also play an important role; in 2020, the cost of a classical bit was on the order of one-millionth of a cent, whereas the cost of a physical qubit was around US$10,000 and the cost of a logical qubit was estimated to be over US$1 million, and likely between US$10 to 100 million [emphases mine] (Swallow & Joneckis, 2021). [p. 16 in print version and page 44 of PDF version]
…
An estimate of “over US$1 million and likely between US$10 to 100 million” suggests that the experts have no idea of the potential cost for a logical qubit.
Moving on: who knew? It seems we already have quantum sensors in operation in Canada, from Quantum Potential Chapter 2,
Several federal government programs have already been launched to support the development and adoption of quantum sensors in Canada (ISED, 2023d). These include the Department of National Defence (DND) Innovation for Defence Excellence and Security (IDEaS) call for quantum sensing projects for defence and security (DND, 2023); the NRC Internet of Things: Quantum Sensors Challenge program (NRC, 2022b) and Quantum Photonic Sensing and Security program (NRC, 2022a); and the Innovative Solutions Canada call to support pre-commercial quantum sensor prototypes “that can be tested in real life settings and address a variety of priorities within the Government of Canada” (ISC, 2022). [p. 18 in print version and p. 46 in PDF version]
There’s an interesting subsection on Communications, from Quantum Potential Chapter 2,
Broadly speaking, quantum communications involve two related areas: (i) transmitting quantum information from one location to another, and (ii) ensuring secure communications that cannot be decrypted by a quantum computer. As to the first point, there are many potential applications for transmitting quantum information, including a quantum internet that could link quantum computers together; blind quantum computing that allows clients to remotely access a quantum computer in such a way that the vendor does not have access to their information; exchanging quantum states for cryptographic protocols; and linking quantum sensors. Importantly, these applications require quantum networks that allow for the exchange of qubits and the distribution of entangled quantum states across nodes in that network (Judge, 2022).4
As to the second point (i.e., ensuring secure communications), many applications of quantum communications relate to cryptography. [emphasis mine] This is a vast area of research and technology development that includes key distribution, encryption, digital signatures, authentications, digital currencies, and much more. As noted in Chapter 1, large quantum computers will someday have the ability to break nearly all existing encryption schemes, such as the RSA and elliptic curve cryptography algorithms that are widely used today, as well as common alternatives. To mitigate this threat, there are at least two possible options. … [p. 22 in print version and p. 50 in PDF version]
This becomes more interesting when you know that Laflamme, chair of the expert panel, has a major research interest in quantum cryptography, from my May 11, 2015 posting; scroll down to the ‘Raymond Laflamme’ subhead,
…
One of his [Laflamme’s] major research interests is quantum cryptography, a means of passing messages you can ensure are private. Laflamme’s team and a team in Vienna (Austria) have enabled two quantum communication systems, one purely terrestrial version, which can exchange messages with another such system up to 100 km. away. [emphases mine and added December 6, 2024]
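Going back to the Chapter 2 point about large quantum computers breaking RSA and elliptic curve cryptography: here is a deliberately tiny, purely illustrative sketch of why that matters. The primes are toy-sized and the brute-force loop stands in for what Shor’s algorithm would do efficiently on a large fault-tolerant quantum computer; this is not a real attack, just the logic of one.

```python
# Toy illustration of why efficient factoring breaks RSA. The primes are
# deliberately tiny; the brute-force loop stands in for what Shor's algorithm
# would do efficiently on a large fault-tolerant quantum computer.
p, q = 61, 53
n = p * q                      # public modulus
e = 17                         # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent (Python 3.8+ modular inverse)

msg = 42
cipher = pow(msg, e, n)        # "encrypt" with the public key

# An attacker who can factor n recovers the private key outright.
for cand in range(2, n):
    if n % cand == 0:
        p_found, q_found = cand, n // cand
        break
d_found = pow(e, -1, (p_found - 1) * (q_found - 1))
print(pow(cipher, d_found, n) == msg)   # True: plaintext recovered
```

Real RSA moduli are thousands of bits long, which is exactly why the report treats a large quantum computer, rather than classical brute force, as the threat that motivates quantum-resistant cryptography.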
If you’re curious about quantum communications or cybersecurity, you might want to pay special attention to Quantum Potential Chapter 2’s subsection ‘2.1.3 Communications’, which includes some surprising (to me) information,
Canada is among several countries working on satellite QKD technology
Satellite communication is another way of implementing QKD [quantum key distribution; a type of quantum encryption] while avoiding the issues associated with fibre optic cabling. It will likely be used for long-distance QKD, while fibre-based QKD will be used for local communications networks. Canada is currently a leader [emphasis mine] in this area and is pursuing satellite-based QKD through the Quantum Encryption and Science Satellite (QEYSSat) program. Led by the Canadian Space Agency, QEYSSat is a collaborative project that includes Honeywell Aerospace and the Institute for Quantum Computing at the University of Waterloo, with nine additional collaborators at universities across Canada (including a QKD ground station at the University of Calgary) and nine additional collaborating organizations around the world (CSA, 2020; UWaterloo, 2021). Other foreign jurisdictions actively developing satellite-based QKD include China (Optica, 2022; Jones, 2023), Germany (DLR, n.d.), India (ET Telecom, 2023), Israel (QuantLR, 2021), Japan (Mamiya et al., 2022), Luxembourg (Burkitt-Gray, 2021), Singapore (GW, 2019; SpeQtral, 2022), the United Kingdom (Pultarova, 2021), and the European Union (Kramer, 2022; E.C., 2023) [pp. 26 – 27 in the print version and pp. 54 – 55 in the PDF version]
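For anyone who, like me, wants a feel for where a QKD key actually comes from, here is a minimal, idealized BB84-style sketch. It is deliberately simplified: random classical bits stand in for photon polarizations, and there is no eavesdropper, noise, error correction, or privacy amplification, all of which a real fibre or satellite system (such as QEYSSat) would need.

```python
import random

# Minimal, idealized BB84-style sketch: random bits stand in for photon
# polarizations, and there is no eavesdropper or channel noise. Real QKD
# systems add error correction and privacy amplification on top of this
# sifting step.
random.seed(1)
n = 32
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]   # rectilinear or diagonal
bob_bases   = [random.choice("+x") for _ in range(n)]

# Bob's measurement matches Alice's bit only when the bases agree;
# otherwise his result is random and the position is discarded.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: both sides publicly compare bases and keep the matching positions.
sifted_key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
print(len(sifted_key), "of", n, "bits survive sifting")
```

On average about half the positions survive sifting, which is why practical QKD links care so much about raw transmission rates.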
I’ve stumbled across a couple of news releases announcing research on quantum networks, one of the topics in Chapter 2 of Quantum Potential,
…
Although quantum networking is an active area of research — for example, it is one of the central goals of a €1 billion commitment by the European Union (Quantum Flagship, 2017; Cartlidge, 2018) — it is far from technological maturity and commercial availability, and unlikely to be available before 2035 (QDNL, 2020). Nevertheless, many countries are interested in building quantum networks, including Canada (Box 2.1), and near-term investments in these networks will be beneficial to a wide range of quantum technologies over the longer term. [p. 27 in print version and p. 55 in PDF version]
Sadly, Box 2.1 doesn’t offer very exciting information, from Chapter two of Quantum Potential,
Box 2.1 A National Secure Quantum Communications Network
Canada’s National Quantum Strategy (NQS) identified the implementation of a national secure quantum communications network as a key priority (ISED, 2023d). According to the NQS, there is a large commercial market tied to the secure transmission of digital information, which is highly vulnerable to emerging quantum technologies. A secure quantum communications network would incorporate quantum communications technologies as well as QRC protocols to mitigate these risks. DND has committed to developing quantum networks capable of transmitting quantum information over long distances by 2030 (DND & CAF, 2023). The NQS also identifies both land- and satellite-based infrastructure as important, pointing to QEYSSat and the NRC’s High-Throughput and Secure Networks Challenge program as initiatives directed toward this goal (ISED, 2023d). However, in the panel’s view, it is critical that interconnectedness be retained between national and international partners; this can be achieved if Canada is involved in the development and adoption of international standards (Sections 5.4 and 6.2.3) [p. 28 in print version and p. 56 in PDF version]
Since Chapter 2 includes commercialization, it makes sense that market size would be included, from Quantum Potential,
2.1.4 Potential Market Size for Quantum Technologies
Several sources (mainly, but not exclusively, consultancy firms) have attempted to estimate the potential market size for different quantum technologies and how they could grow over time (Figure 2.1). However, the panel cautions that these numbers are inherently speculative, represent rough estimates at best, and should be treated with a high degree of skepticism, as it is difficult to predict the market potential of technologies that are still years (or even decades) away from maturity, and for which few practical applications exist at the time such estimates are drawn. Indeed, in the panel’s view, these numbers are likely inaccurate and exemplify quantum hype (Section 4.2.3). They are presented here simply to demonstrate the high level of uncertainty around the size and growth rate of the market for quantum technologies over the next decades. Moreover, the panel believes that focusing on market size may be myopic, as quantum technologies will undoubtedly have significant social and economic impacts across a wide range of areas. Comparing estimates can also be difficult because various sources assess the market sizes for different sets of technologies and categorize quantum technologies in different ways. Additionally, while most estimates are presented in terms of “revenue” (Batra et al., 2021; Bobier et al., 2021; CSIRO, 2022), others are described as “sales” (Doyletech Corporation, 2020), “market potential” (QDNL, 2020), and “estimated market” (McKinsey, 2022); as such, it is unclear whether these estimates are directly comparable. It is also notable that estimates for the same technology from different sources can differ by as much as an order of magnitude, particularly when they are projected further into the future (e.g., 15 to 20 years). However, in all estimates, quantum computing has a much higher potential market value compared to communications and sensing, accounting for between 50% and 80% of the projected quantum technologies market over the next two decades. [p. 56 – 57 in the print version and pp. 56 – 57 in the PDF version]
…
So taking into account that nobody really knows what the market potential is (I very much appreciate the honesty), the expert panel offers a best guess as to what the economic impact might be for Canada, from Quantum Potential,
The economic impacts of quantum technologies in Canada are potentially very large
Modelling by Doyletech (2020) — which was commissioned by the NRC and cited in Canada’s NQS — projects that the total economic impact of quantum technologies in Canada (including indirect and induced effects) could be $138.9 billion by 2045. This would represent roughly 2.7% to 3.3% of the total Canadian economy in 2045, and result in over $42.3 billion in tax revenues. To put these numbers in perspective, Canada’s aerospace sector generated about $28 billion in economic activity in 2016, representing about 1.3% of the total economy (Doyletech Corporation, 2020). Importantly, however, in the panel’s view, these (and similar) estimates are highly speculative and should be treated with caution given the difficulty of predicting the economic impacts of technologies that are still years (or even decades) away from maturity and for which few practical applications exist (recall Section 2.1.4). Regardless, quantum technologies may present a significant return on investment for Canada. For example, Quantum Delta NL predicts a roughly eight-fold return on quantum technologies for the Dutch economy over the medium term (QDNL, 2020). [p. 38 in print version and p. 66 in PDF version]
…
Most analyses of the economic value of quantum technologies focus on technology developers, not adopters
The potential total market size for quantum technologies has been assessed by several different sources. However, these estimates almost exclusively focus on the market size for suppliers of quantum technology (i.e., developers of quantum computers, sensors, and communications technology) and not on the economic benefits for the adopters or end-users of these technologies. In addition, all available estimates of the economic benefits for adopters of quantum technologies focus entirely on quantum computing, with no estimates of the value created by the adoption of quantum sensors or communications. [p. 40 in print version and p. 68 in PDF version]
…
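Since the Doyletech projection gives both a dollar figure and a share of the economy, a quick arithmetic check shows what those percentages imply. Nothing here goes beyond the numbers quoted in the excerpt,

```python
# Quick arithmetic on the Doyletech figures quoted above (nothing beyond
# the numbers in the excerpt): what total 2045 economy do they imply?
impact = 138.9e9          # projected total economic impact, CAD
share_low, share_high = 0.027, 0.033
implied_economy_high = impact / share_low    # a smaller share implies a larger economy
implied_economy_low = impact / share_high
print(f"implied 2045 economy: ${implied_economy_low/1e12:.1f}T to ${implied_economy_high/1e12:.1f}T")
print(f"tax revenue as a share of the projected impact: {42.3e9 / impact:.0%}")
```

In other words, the projection assumes a Canadian economy of very roughly $4 to $5 trillion in 2045, which is one more reason to treat the figures as speculative, as the panel itself does.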
Chapter 3: Quantum Technology Landscape
This chapter offers a more informed overview of the Canadian ‘quantum scene’ than my admittedly idiosyncratic approach as noted in Part one, from Quantum Potential,
Early investments and talent development in quantum-based research have given Canada a strong foundation as it experiences the second quantum revolution; however, given its relatively small population and research budget, there is a risk of losing early domestic advantage to China, the United States, and members of the European Union (Dunlop, 2019; ISED, 2022d). In a report on the development of a national quantum strategy, the evolution of the domestic quantum ecosystem is described as the product of a “relatively neutral Canadian public policy environment,” leadership and philanthropy from private investment, successful individual institutions and investigators, and the ability of start-ups to develop their own markets (Dunlop, 2019). As a result, Canada’s nascent quantum ecosystem has grown into a small, yet interdisciplinary network made up of industry organizations, research centres, and business accelerators and incubators (Dunlop, 2019).
Canada’s quantum technology value chain strongly depends on international partnerships. This, however, is not unique to Canada; several of the necessary materials and components can only be obtained from a handful of foreign suppliers. This chapter will highlight some international dependencies and identify areas of the value chain where Canada may be able to emerge as a global leader. [p. 43 in paper version and p. 71 in PDF version]
…
There’s a little more detail about the ecosystem, from Quantum Potential,
3.1 Quantum Activity in Canada
Canada’s quantum landscape is composed of a range of organizations and institutions that include universities, start-ups, internationally competitive companies, and industry networks and consortia. While Canada has an active and vibrant research and start-up ecosystem, increasing international competition is beginning to challenge its global position. Likewise, Canadian companies are lagging in metrics related to intellectual property (IP) protection [emphasis mine]; these could be indicators of future challenges related to commercialization and technology adoption. There may be other issues related to technology adoption stemming from Canada’s localized hubs of expertise, which largely exclude the Atlantic provinces and the territories. However, Canadian organizations have a wide-reaching international network of partnerships and collaborations, which could help them in a variety of ways, from attracting talent to connecting companies with larger future markets. [p. 44 in print version and p. 72 in PDF version]
Assessing your research position using intellectual property as a metric could be considered problematic (more about that in my comments).
For anyone interested in the Canadian quantum scene, this is a very interesting subsection, from Quantum Potential,
Quantum expertise is unevenly distributed across Canada
Section 3.1.1 showed that quantum-based research originates in several hubs across Canada; British Columbia, Alberta, Ontario, and Quebec have all developed thriving quantum technology ecosystems, and these clusters are somewhat differentiated by speciality [sic]. However, this also means that certain regions are not as well represented. For example, Atlantic Canada and the territories are largely absent from the National Quantum Strategy (NQS), specifically the Regional Development section of the Commercialization pillar (see ISED, 2023d). The ramifications of an uneven distribution of quantum expertise are discussed in Chapter 4.
National and regional quantum technology research hubs in Canada attempt to facilitate collaboration among researchers, industry partners, and government stakeholders to advance the development and commercialization of quantum technologies. Ideally, these hubs and networks can help develop a talent pipeline that connects students to industry and supports Canada’s innovation ecosystem. Some of the programs discussed below are geographically centred hubs, while others seek to connect academic, industrial, and government entities across the country.
British Columbia has a highly collaborative quantum community, with universities conducting research across various fields, such as quantum computing, communications, and materials (S. Simmons, personal communication, 2023). The Stewart Blusson Quantum Matter Institute at the University of British Columbia (UBC, n.d.) and Simon Fraser University’s 4D Labs (SFU, n.d.) provide testing, fabrication, and prototyping facilities for researchers and companies developing quantum materials, circuits, and devices. Quantum BC, a joint initiative led by three leading research universities, “aims to stimulate and enrich collaborative efforts across research, training and innovation in quantum computing” (Quantum BC, 2022). Through the NSERC CREATE Quantum Computing program, Quantum BC offers a unique training experience for graduate students in quantum computing hardware and software, in part via internships with industry partners (Quantum BC, n.d.). British Columbia also has a growing ecosystem of quantum technology companies and start-ups. Nine quantum companies collectively employ more than 500 employees, receive more than $270 million in funding and hold more than 404 patents (Mantha & Turner, 2023). Notable companies include D-Wave, 1QBit, Good Chemistry [a 1QBit spin-off company that was acquired by SandboxAQ on January 9, 2024; SandboxAQ is an Alphabet {formerly known as Google} spin-off company], and Photonic Inc. Quantum Algorithms Institute (QAI) connects academia to industry, and supports the growth of the province’s quantum computing ecosystem (Wong, 2021). QAI supports practical initiatives to increase quantum awareness, grow the quantum workforce, and educate new customers about quantum solutions to solve business challenges.
Alberta has a 20-year history in quantum research, development, and commercialization, with major quantum advances and investments totalling over $30 million (CFI, 2023). This is enhanced by synergistic areas of strength, such as nanotechnology and AI. The University of Alberta, University of Calgary, and University of Lethbridge have collaborated since 2015 via Quantum Alberta, a consortium of academic and industry experts who joined together to elevate quantum science and technology R&D and commercialization in Alberta (Quantum Alberta, 2022). Spin-off companies, such as Quantized Technologies Inc., Quantum Silicon Inc., and Zero Point Cryogenics, are part of a nascent, growing quantum start-up culture. In 2022, Quantum City — a partnership among the University of Calgary, the Government of Alberta, and leading technology company Mphasis — was established with over $100 million in private and public investments (UofC, 2022). Quantum City is a global quantum knowledge translation hub, bringing together researchers, quantum companies, and early adopters of quantum technologies and services. It is investing in 15 new University of Calgary quantum faculty positions, as well as training and upskilling programs (e.g., master’s in quantum computing, NSERC CREATE in Innovators for Quantum Computing Deployment), in collaboration with the Université de Sherbrooke, a quantum fabrication and characterization facility (qLab), and an incubation and ideas collision hub (qHub) (B.C. Sanders, personal communication, 2023).
In Ontario, an ecosystem consisting of Waterloo and the Quantum Valley Hub was established in 2001. It brings together more than a dozen organizations and start-ups working in fundamental physics, experimental implementation, device engineering, and venture capital (TQT, n.d.-a, n.d.-b), including the Perimeter Institute, a non-profit organization focused on foundational physics; the Institute for Quantum Computing, which aims to develop quantum information science and technology; the Quantum-Nano Fabrication and Characterization Facility, which specializes in building quantum devices; a Canada First Research Excellence Fund project with a focus on quantum health; the Quantum Valley Ideas Lab; and Quantum Valley Investments, a $100 million venture capital fund (R. Laflamme, personal communication, 2023). Some of the start-ups include ISARA Corporation, High Q Technologies, Single Quantum, Universal Quantum Devices, Aegis Quantum, and Quantum Benchmark. Quantum Valley was built as a public-private partnership (PPP) and has benefited from ongoing, strong support from the governments of Canada and Ontario, philanthropy (most significantly from Mike and Ophelia Lazaridis and Douglas Fregin), and the University of Waterloo. More than $1 billion has been invested to date. The ecosystem takes advantage of Waterloo’s strong innovation base and entrepreneurial culture, existing talent, unique R&D infrastructure, and strong network of collaborators, forming a community with a shared vision of a quantum future.
In Quebec, the Innovation Zone in Quantum Science and Technological Applications is formed around the Université de Sherbrooke and represents investments of over $435 million in the region, of which $131 million is public funding (C. Sarra-Bournet, personal communication, 2023). Companies such as 1QBit, Bell, IBM, PASQAL, and Eidos-Sherbrooke have also committed to investments of $270 million over five years. Of note, the Quebec-IBM Discovery Accelerator program is involved with the installation of a 127-qubit IBM quantum computer in collaboration with Plateforme d’Innovation Numérique et Quantique (PINQ2) at its IBM Bromont facility (PINQ2, 2023). Sherbrooke is also the home of Canadian university spin-offs such as Nord Quantique, SBQuantum, and Qubic Technologies. The R&D infrastructure capabilities of the hub are provided by the Integrated Innovation Chain (IIC), led by Institut quantique (IQ), the Interdisciplinary Institute for Technological Innovation (3IT), and the MiQro Innovation Collaborative Centre (C2MI), which acts as a bridge between university research and the development of new products transferred to industry (Quebec Quantique, n.d.; Université de Sherbrooke, n.d.). Since 2010, the IIC has benefited from more than $1 billion in investments, with more than 60% coming from industrial partners. This ecosystem is part of a semiconductors corridor that was recently part of a memorandum of understanding (MOU) between the United States and Canada related to the U.S. CHIPS and Science Act (Platt, 2023) (Box 5.5).[pp. 54 -57 in print version and pp. 82 – 85 in PDF version]
As noted earlier, I noticed that Xanadu Quantum Technologies, which started life as a University of Toronto startup, is not mentioned in the Ontario subsection. Moving on.
International dependencies are noted further on in Chapter 3, from Quantum Potential,
3.2.1 International Dependencies
Most quantum technologies cannot currently be bought off the shelf, and the list of parts and materials needed to build them could include hundreds of different vendors. Moreover, many of these components may only be available from one or two vendors globally, and specialized equipment could, in some cases, be back-ordered by a year or more (INDU, 2022b). Although quantum computers are some of the most complicated quantum technologies, communications and sensing devices may also experience similar supply chain challenges and international dependencies. … [p. 68 in the print version and p. 96 in the PDF version]
…
Geopolitical events can also affect access to materials and devices (INDU, 2022b). For example, starting in 2022, war in Ukraine significantly reduced the availability of crucial materials such as neon gas and high-purity Silicon-28, while much of the world currently depends on Taiwan for certain types of semiconductors and microcontrollers. Assessing international supply chain dependencies is not trivial. In some cases, the provenance of certain components or materials may not be apparent. For example, electronics, rare Earth magnets, and other raw materials may be sourced from foreign vendors, but where these vendors source the materials may not be known (Parker et al., 2022).[p. 70 in the print version and p. 98 in the PDF version]
Yes, I think we in Canada have become increasingly aware of our vulnerabilities vis à vis supply chains and how we get and provide access to critical materials.
Chapter 4: Ethical, Social, and Institutional Challenges
Surprising no one, it seems implementation of quantum technologies could make things worse and/or it could make things better, from Quantum Potential,
… The review of challenges presented in this chapter is the first part of a broader analysis of the multifaceted ethical, legal, social, and policy implications of quantum technologies, also known as Quantum ELSPI (Kop, n.d.). This chapter focuses on social and ethical issues within the framework of Quantum ELSPI. Chapter 5 reviews the interconnected legal and policy implications. [p. 81 in paper version and p. 107 in PDF version]
For anyone who’s been following ELSPI or ELSI through various new and emerging technologies, this seems like a walk down memory lane, from Quantum Potential,
4.1 Ethical Challenges and Quantum Ethics
Like many other technologies, quantum technologies can have both beneficial and harmful applications. For example, quantum-resistant cryptography (QRC) can enhance individual and collective privacy and security, but the ability of quantum computers to overcome existing cryptography can facilitate mass surveillance and access to confidential information and undermine digital infrastructure essential for healthcare, banking, and public utilities. Quantum technologies can enable scientific breakthroughs in medical research and chemistry, but some actors can exploit the complexity of underlying quantum phenomena to spread misinformation and undermine public trust in scientific progress. Additionally, disparate access to quantum technologies and the expertise necessary for their adoption can exacerbate the digital divide among different communities, regions, and countries.
Due to the relative novelty of quantum technologies, ethical principles guiding human conduct in the face of both beneficial and harmful applications are only beginning to emerge. Quantum ethics is a new field of applied ethics that focuses on moral behaviour in the domain of quantum technologies (Kop, 2021a). It “calls for humans to act virtuously, abiding by the standards of ethical practice and conduct set by the quantum community, and to make sure these actions have desirable consequences, with the latter being higher in rank in case it conflicts with the former” (Kop, 2021a).[p. 82 in print version and p. 110 in PDF version]
I don’t see anything new or quantum-specific about the ELSI or ELSPI concerns in that excerpt or in the subsequent table and paragraphs. As for what follows, most of that seems to be focused on the same old problems but given a twist because quantum technologies could provide solutions while creating new problems to such issues as privacy, confidentiality, etc. For example, from Quantum Potential,
Encryption protects two types of personal data: stored data (i.e., data-at-rest) and data that are sent over the internet (i.e., data-in-flight). Some researchers suggest it is relatively easy to create quantum-resistant data-at-rest systems, and many modern encryption systems already address a possible risk of decryption (Hoofnagle & Garfinkel, 2021). Quantum computers will most likely jeopardize data-in-flight that were sent over the internet at some point in the past and captured and archived by non-governmental actors or intelligence agencies (Hoofnagle & Garfinkel, 2021). Although there is no publicly accessible and reliable information on the ongoing interception of data-in-flight, it is reasonable to assume that any message transmitted anywhere in the world might be captured and stored by some person or agency, and then unlocked at some point in the future (Mosca & Munson, n.d.). While this risk is real, it may also be overstated, because conducting cryptanalysis will require access to a powerful quantum computer, as well as time to perform the analysis (Hoofnagle & Garfinkel, 2021). [ p. 84 in print version and p. 112 in PDF version]
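The ‘capture now, unlock later’ concern in that excerpt is often summarized with a rule of thumb attributed to Michele Mosca (whose work is cited in the passage): if the time your data must stay secret plus the time you need to migrate to quantum-resistant cryptography exceeds the time until a cryptanalytically relevant quantum computer arrives, the data is already at risk. A minimal sketch, with placeholder years rather than forecasts,

```python
# Rule-of-thumb check often attributed to Michele Mosca (cited in the excerpt):
# data is at risk if  shelf_life + migration_time > years_until_quantum_threat.
# The numbers below are placeholders for illustration, not forecasts.
def harvest_now_decrypt_later_risk(shelf_life_years, migration_years, years_to_crqc):
    """Return True if intercepted ciphertext could still matter by the time it is broken."""
    return shelf_life_years + migration_years > years_to_crqc

print(harvest_now_decrypt_later_risk(shelf_life_years=15, migration_years=5, years_to_crqc=12))  # True
print(harvest_now_decrypt_later_risk(shelf_life_years=2,  migration_years=3, years_to_crqc=12))  # False
```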
In addition to bypassing current encryption, there could be environmental issues, from Quantum Potential,
4.2.2 Environmental Impacts of Quantum Technologies
Low-dimensional materials and nanomaterials (e.g., zero-dimensional semiconductor quantum dots, semiconductor nanowires, carbon nanotubes) hold promise for quantum technologies (Alfieri et al., 2022). Nanomaterials can, among other things, improve the coherence of qubits and the purity and brightness of quantum emitters, serving as conduits for quantum sensing and imaging (Alfieri et al., 2022). They are, however, double-edged swords. The unique properties that make them beneficial for product development, such as their size and high reactivity, also create environmental and safety concerns (NIEHS, 2021). For example, nanotechnology can allow sensors to find the smallest amounts of chemical vapours; however, it is often impossible to detect the level of nanoparticles in the air. This property presents a concern for the health and safety of employees in workplaces that use nanomaterials (NIEHS, 2021). Moreover, the use of nanomaterials by the quantum industry can increase the number of nanoparticles released into the environment.
The main challenges in conducting research on nanoscale materials include determining their quantity, evaluating biological reaction, and measuring the level of exposure and risk (NIEHS, 2022). In 2022, Environment and Climate Change Canada and Health Canada published their draft Framework for the Risk Assessment of Manufactured Nanomaterials under the Canadian Environmental Protection Act, 1999 (CEPA) (ECCC & HC, 2022). The framework provides that conclusions reached about a substance through the assessment process may differ, depending on its form (e.g., the traditional chemical or nanomaterial form, or variations among different nanoscale forms). Government scientists intend to use a weight-of-evidence approach to decide whether a nanomaterial released in the environment is considered toxic under CEPA (ECCC & HC, 2022). If adopted, the framework may determine how the government will regulate nanomaterials used in quantum technologies, including applicable environmental and human health risk assessments. As of September 2023, the draft framework has not been adopted. [p. 86 in the print version and p. 114 in the PDF version]
Although I’m glad to see awareness of possible environmental impacts, some of this section seems familiar from discussions about nanotechnology and environmental impacts. I have another niggle. For some reason, this report does not distinguish engineered nanoparticles (which are what the authors are concerned with) from naturally occurring nanoparticles, which we’ve lived with for millennia.
As for this next section, it’s a pretty standard concern where new technologies are concerned, from Quantum Potential,
4.2.3 Exploitation of Misleading Views About Quantum Technologies
Quantum technologies are fundamentally different from other disruptive technologies, such as nanotechnology or AI, due to their perceived ability to operationalize the principles of quantum mechanics [emphasis mine] (Chapter 1). While the proliferation of quantum technologies (and quantum computers’ ability to solve some classically hard problems) has generated interest in the principles of quantum mechanics, specialists — let alone non-specialists — have difficulty understanding and explaining how quantum technologies work and what they might be able to achieve (Aaronson, 2021). For example, a public dialogue exercise conducted in 2017 [emphasis mine] in the United Kingdom found that the public was familiar with the word quantum but had limited understanding of how it applied to quantum technologies (EPSRC, n.d.). Quantum research is dominated by specialized public and private organizations, including defence and intelligence agencies, which contributes to the aura of mystery and secrecy surrounding quantum technologies.
Exploitation of misleading views about quantum technologies can stoke fear and undermine public trust
Some actors can exploit the scientific complexity of quantum mechanics to facilitate the spread and public acceptance of misinformation about quantum technologies, which may lead to public controversies. Lessons can be learned from incidents fuelled by misinformation, such as the attacks on 5G towers in the United Kingdom (Parveen & Waterson, 2020), anti-vaccination attitudes, and the attempted bombing of IBM’s nanotechnology facility in Switzerland (WEF, 2022b).
In addition to causing reputational or even physical harm to researchers and organizations working in the area of quantum, misinformation could erode public trust in quantum technologies (WEF, 2022b). In this context, bolstering public trust may require allocating public funds to confirm already-settled research findings. For example, in the European Union, genetically modified (GM) food safety research has received significant funding due to public controversies about genetically modified organisms (GMOs) (Ryan et al., 2020). Between 1982 and 2010, the European Commission spent over €300 million on GMO safety research, leading scientists to conclude that biotechnology was not riskier than commonly used plant breeding technologies (E.C., 2010). [pp. 86 -87 in the paper version and pp. 114 – 115 in the PDF version]
So, the impact of quantum technologies is different because it’s quantum? On the face of it, this seems like circular reasoning. That said, I’m inclined to agree that quantum technologies will likely present different problems than other emerging technologies. Perhaps, we also ought to be considering (or mentioning in this report) the possible impacts from synergies between these technologies.
The mention of violence and extreme responses to emerging technologies points to the thoughtfulness with which this report was compiled.
There is this in Chapter 4 that I want to highlight, from Quantum Potential,
The adage known as Amara’s Law, coined by American researcher and technology forecaster Roy Amara, states that:
“We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” [emphasis and quotation marks mine]
Over the long term, it is exceedingly likely that quantum technologies — and quantum computing in particular — will be incredibly disruptive and transformative, impacting society in ways that cannot currently be predicted. … [p. 87 in the print version and p. 117 in the PDF version]
Point well taken. The discussion that follows Amara’s Law covers the harms that could be inflicted by hype, privacy violations, threats to constitutional rights and freedoms, the increasing divide between haves and have-nots, and what happens to people’s jobs, from Quantum Potential,
Box 4.3 The Port of Los Angeles Logistics Optimization
The Port of Los Angeles is the largest U.S. facility for handling shipborne cargo. An initiative at Pier 300, one of the port’s largest terminals, leveraged D-Wave’s computational power to optimize the port’s logistics (D-Wave, 2022). The Hyper-Optimized Nodal Efficiency (HONE) engine processed data from more than 100,000 different cargo-handling runs across a range of real-world and hypothetical scenarios, in order to identify opportunities for optimization. As a result, the terminal uses nearly 40% fewer of its crane resources for the unloading process, and each of the cranes travels a considerably smaller average distance per day. The cranes have also increased their deliveries by more than 50%, and each truck is spending nearly 10 minutes less receiving the payload at the terminal (D-Wave, 2022). Arguably, a classical program could have been used to optimize logistics; the D-Wave annealing solution was chosen because the project team was familiar with it (QCR, 2022b).
At the same time, the growing use of automation and robotics is a complex issue. It requires considering both the private sector’s desire to increase the efficiency of supply chain management and the impacts of workflow optimization on the workforce. For example, to support the workforce of the Port of Los Angeles, the Government of California funded the Goods Movement Training Campus for truck drivers, mechanics, welders, and others who might require upskilling or re-skilling due to automation, and created professional development opportunities for future hires (Spectrum News 1, 2022). [pp. 93 – 94 in the print version and pp. 121 – 122 in PDF version]
…
They’ve made everything more efficient with the computational power of D-Wave Systems (a Canadian company), which is noted with a number of statistics, and yet they don’t have any statistics for the number of people affected? Interesting.
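Neither the Pier 300 data nor D-Wave’s HONE model is public, so the following is only a generic, made-up sketch of the kind of problem an annealer gets handed: a tiny quadratic binary ‘which crane takes which batch’ assignment with a congestion penalty, brute-forced classically here because the instance is so small. The costs are invented for illustration,

```python
from itertools import product

# Generic sketch of the kind of quadratic binary optimization an annealer is
# handed (the Port of LA data and D-Wave's HONE model are not public, so the
# costs below are made up). x[j] = 1 means job j goes to crane B, 0 means crane A.
travel_cost = [(4, 7), (6, 3), (5, 5)]   # (cost on crane A, cost on crane B) per job
congestion_penalty = 2                    # extra cost when two jobs share a crane

def total_cost(x):
    cost = sum(travel_cost[j][x[j]] for j in range(len(x)))
    # quadratic term: penalize every pair of jobs assigned to the same crane
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            if x[i] == x[j]:
                cost += congestion_penalty
    return cost

# Brute force the 2**3 possible assignments; an annealer samples this cost
# landscape instead of enumerating it.
best = min(product((0, 1), repeat=3), key=total_cost)
print(best, total_cost(best))
```

As the report’s own caveat notes (QCR, 2022b), a classical solver can handle problems like this; the interesting question is what happens as the instances grow.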
The mention of bias and explicability (everything happens in a black box) draws heavily on the experience with artificial intelligence, from Quantum Potential,
Quantum machine learning may exacerbate discrimination or other kinds of unfairness resulting from automated decision-making. Among the reasons existing AI systems generate skewed, inaccurate, or discriminatory results is because available datasets and models make AI biased by design (Robertson et al., 2020; CCA, 2022; Crawford, 2022). Marginalized and racialized people and groups are disproportionately impacted by AI trained on bad data (i.e., missing, incorrect, or inconsistent data) (Richardson et al., 2019). Researchers found that the use of AI to make hiring decisions discriminated against women (Dastin, 2018) and people with physical and mental disabilities (Fruchterman & Mellea, 2018). AI has perpetuated discrimination against Black people in various contexts, including healthcare (Obermeyer et al., 2019), the criminal justice system (Robertson et al., 2020), and online content moderation (Sap et al., 2019). In Canada, data collection and processing practices often discriminate against Indigenous communities (Robertson et al., 2020) and minimize or disregard Indigenous knowledges and experiences (CCA, 2022). The existing funding models enable select institutions to define the research agenda and extract Indigenous data while downplaying the potential negative impacts of these practices for Indigenous communities (GC, 2019).
A lack of gender and racial diversity in STEM (Section 4.4.2) could also amplify bias in quantum-enabled automated decision-making systems. Lessons can be learned from many applications of AI — including facial recognition, speech recognition, and hiring tools — where a lack of diversity in the AI industry has fed into data selection and technology design processes, resulting in outcomes that are biased against women and minoritized and racialized people (West et al., 2019; Stinson, 2022). [p. 94 in the print version and p. 122 in the PDF version]
…
It seems that in some ways quantum technologies may not be so different from other emerging technologies.
Technology adoption in Canada faces a number of barriers both governmental and corporate. Given that the banking industry is very concentrated (as is the telecommunications industry), this projection of how the big banks will adapt to quantum technology is illuminating, from Quantum Potential,
Canada’s big banks adopt technological innovations implemented elsewhere, provided the risks and benefits are well known
Various reports cite the financial sector as one of the key adopters of quantum technologies (Chapter 2). However, as is the case with the telecommunications sector, the state of competition in the financial sector may affect the speed of adoption. On one hand, Canada’s banking system has high concentration levels; the six largest banks, known as the Big Six (Bank of Montreal, Bank of Nova Scotia, Canadian Imperial Bank of Commerce, National Bank of Canada, Royal Bank of Canada, and Toronto Dominion), controlled around 90% of overall banking assets from 1996 to 2015 (McKeown, 2017). On the other hand, unlike the telecommunications sector, there is inconclusive evidence on the state of competition in the financial sector (Bednar et al., 2022), and high concentration levels are not indicative of the degree of competition among incumbent firms or market contestability (CCA, 2009). In a contestable market, firms willing to enter and exit the market do not face prohibitive barriers, and the prospect of nascent competition may encourage innovation by incumbent firms (CCA, 2009).
One study that measured the degree of contestability in the Canadian banking industry concluded that the sector is characterized by monopolistic competition (Allen & Liu, 2007), which disincentivizes so-called visible innovations (e.g., innovations in services) because they can be relatively quickly replicated by competitors, thus minimizing the advantages sought by the first innovator (CCA, 2009). As an alternative, innovation usually targets internal processes (e.g., physical capital and software for ICT [information and communications technology]), which are hidden from competitors and, therefore, harder to reproduce. For example, between 2009 and 2019, Canada’s six largest banks collectively invested $100 billion in technology, substantially improving their in-house cybersecurity. This indicates that Canadian banks may be interested in investing in security-enhancing quantum technologies. However, the banking sector remains relatively risk-averse with respect to technological innovations. Usually, it adopts successful technological innovations implemented elsewhere, provided their risks and benefits are well known (the so-called early follower innovation strategy) (CCA, 2009). [pp. 98 – 99 in the print version and pp. 125 – 127 in the PDF version]
Chapter 4 concludes with some data about problems attracting women into quantum technology fields and with attracting and retaining international talent.
Chapter 5: Legal and regulatory challenges
There doesn’t seem to be much new ground covered but there is some fascinating language in this section on privacy, from Quantum Potential,
According to Dekker and Martin-Bariteau (2022), existing privacy frameworks provide a foundation for assessing whether any particular application of quantum sensing is reasonable “within the context of that technology’s sensing capabilities (i.e., the degree of invasiveness on an individual’s privacy).” For example, the technique of counterfactual ghost imaging (Box 5.2) shows how, from a legal standpoint, the use of quantum sensing for surveillance is not significantly different from the use of any other surveillance technology. [emphasis mine] This technique allows a third party to collect information about a person without their consent and knowledge. The sensing technique, however, is irrelevant because the reasonable expectation of privacy test remains the same as long as certain factual circumstances are met. Such surveillance practices may be illegal both in the private and public sector contexts (Dekker & Martin-Bariteau, 2022).
Box 5.2 Counterfactual Ghost Imaging
Counterfactuality refers to using quantum effects to examine objects or transmit messages without exchanging matter or energy between the two parties when transferring information (Hance & Rarity, 2021). A single photon can be used to go through an interferometer (a device merging two or more sources of light to create an interference pattern) and identify an object or its characteristics without a physical interaction with it (LIGO Caltech, n.d.). A method called ghost imaging uses entangled photon pairs to detect obscure objects with significantly better “signal-to-noise ratio while preventing over-illumination” (Zhang et al., 2019). [box ends]
Still, evaluating privacy implications of quantum-based surveillance may present challenges for courts unfamiliar with this sensing technology. Quantum sensing comes with its own ethical considerations, and its application without guidance and oversight can lead to privacy violations. Proactive regulation and ongoing oversight could support the accountable use of quantum sensing by governments, law enforcement, and private actors (Dekker & Martin-Bariteau, 2022). [pp. 106 -107 in print version and pp. 134 – 135 in PDF version]
Counterfactual ghost imaging is a pretty evocative term.
The discussion on privacy mentions an element of Bill C-27 An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts, which had escaped me in previous stories here, from Quantum Potential,
Bill C-27 excludes anonymized data from data protection rules
In June 2022, the Government of Canada introduced Bill C-27 (An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts) (House of Commons of Canada, 2022a). The Consumer Privacy Protection Act (CPPA) suggested under Bill C-27 establishes separate categories of anonymized and de-identified data. Under the bill, to anonymize “means to irreversibly and permanently modify personal information, in accordance with generally accepted best practices, to ensure that no individual can be identified from the information, whether directly or indirectly, by any means,” whereas to de-identify “means to modify personal information so that an individual cannot be directly identified from it, though a risk of the individual being identified remains” (House of Commons of Canada, 2022a). Bill C-27 kept de-identified data within the regulatory framework but excluded anonymized data [emphasis mine], assuming that they cannot be re-identified (Dekker & Martin-Bariteau, 2022; House of Commons of Canada, 2022a). These separate data categories were introduced partially to offer organizations more flexibility in the processing of anonymized and de-identified information for “internal research, analysis and development purposes” (House of Commons of Canada, 2022a; Gratton et al., 2023). Anonymized data are also exempt from retention limits, and the right of erasure does not apply to them (House of Commons of Canada, 2022a; Scassa, 2022).
Some researchers, however, criticized the proposal to exclude anonymized data from data protection rules, partly because quantum-enabled AI systems may be able to re-identify anonymized data, thereby exacerbating privacy risks (Dekker & Martin-Bariteau, 2022). The proposed Artificial Intelligence and Data Act (part of Bill C-27) aims to partially address this issue by imposing anonymized data governance requirements on private sector organizations, requiring them to “establish measures with respect to (a) the manner in which data is anonymized; and (b) the use or management of anonymized data” (House of Commons of Canada, 2022a). As of September 2023, Bill C-27 has not passed [emphasis mine]. [p. 109 in the print version and p. 137 in the PDF version]
I did not realize that ‘anonymized data’ was being excluded from the bill. Given the quantum computing capabilities described in this report, it seems to be an odd and short-sighted choice on the government’s part.
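For readers wondering why researchers object to that exemption, here is a toy, entirely classical linkage sketch (the records and names are made up, and no quantum computing is involved) of how joining supposedly anonymized records to a public directory on a few quasi-identifiers can re-identify someone. The report’s concern is that quantum-enabled AI could do this kind of matching at much larger scale,

```python
# Toy linkage attack (entirely classical, no quantum computing needed) showing
# why "anonymized" data can sometimes be re-identified. The records and names
# below are made up for illustration.
anonymized_health = [
    {"postal": "V6K", "birth_year": 1980, "sex": "F", "diagnosis": "asthma"},
    {"postal": "T2N", "birth_year": 1975, "sex": "M", "diagnosis": "diabetes"},
]
public_directory = [
    {"name": "A. Example", "postal": "V6K", "birth_year": 1980, "sex": "F"},
    {"name": "B. Sample",  "postal": "T2N", "birth_year": 1975, "sex": "M"},
]

quasi_identifiers = ("postal", "birth_year", "sex")
for record in anonymized_health:
    matches = [p for p in public_directory
               if all(p[k] == record[k] for k in quasi_identifiers)]
    if len(matches) == 1:   # a unique match re-identifies the "anonymous" record
        print(matches[0]["name"], "->", record["diagnosis"])
```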
The approach to intellectual property is pretty standard, from Quantum Potential,
Quantum computers and other quantum applications, such as sensing, cryptography, and communications, are eligible for IP protection (Kop, 2021b; Rand & Rand, 2022). IP rights encompass several rights regimes, including patents, copyright, and trade secrets [emphasis mine]. Generally, these regimes aim to promote innovation [emphasis mine] by granting the owners an exclusive right to make public, commercialize, reproduce, and limit distribution of their inventions (McKenna, 2006). IP rights are instrumental in building the quantum sector’s value appropriation strategy because they can prevent, for a period of time, third parties from deriving economic benefits from the inventions or original expressions of IP owners (DOJ & FTC, 2007). [p. 110 in the print version and p. 138 in the PDF version]
I appreciate the use of the verb ‘aim’, which suggests a little caution with regard to the phrase ‘promote innovation’.
Following on the ‘caution’, there are some suggestions that intellectual property such as copyright (which can be applied to quantum technologies) for one was never intended to ‘promote innovation’, from the description for the 2024 book, “Who Owns This Sentence? a History of Copyrights and Wrongs” by David Bellos and Alexandre Montagu (published by W. W. Norton & Company),
Copyright is everywhere. Your smartphone [emphases mine] incorporates thousands of items of intellectual property. Someone owns the reproduction rights to photographs of your dining table. At this very moment, battles are raging over copyright in the output of artificial intelligence programs. Not only books but wallpaper, computer programs, pop songs, cartoon characters, snapshots, and cuddly toys [emphasis mine] are now deemed to be intellectual properties–making copyright a labyrinthine construction of laws with colorful and often baffling rationales covering almost all products of human creativity. It wasn’t always so. Copyright has its roots in eighteenth-century London, where it was first established to limit printers’ control of books [emphases mine] . But a handful of little-noticed changes in the late twentieth century brought about a new enclosure of the cultural commons, concentrating ownership of immaterial goods in very few hands. Copyright’s metastasis can’t be understood without knowing its backstory, a long tangle of high ideals, low greed, opportunism, and word-mangling that allowed poems and novels (and now, even ringtones and databases) to be treated as if they were no different from farms and houses. Principled arguments against copyright arose from the start and nearly abolished it in the nineteenth century. Nonetheless, countless revisions have made copyright ever stronger.
In many ways copyright and other intellectual property regimes have been weaponized by large companies for control of the markets while smaller companies seem to use these tools more as defensive measures, from Quantum Potential,
The exercise of IP rights by a dominant player can make a winner-takes-all scenario [emphasis mine] more likely. For example, Microsoft has used a topological structure to build a quantum computer while many of its competitors have relied on superconductors (Hoofnagle & Garfinkel, 2021). If its approach is more successful, Microsoft could protect the engineering aspects of topological structures by invoking trade secrecy and “by selling its quantum computers as a service rather than as standalone devices [emphasis mine]” (Hoofnagle & Garfinkel, 2021). While this strategy would allow the company to gain a competitive advantage, it could also impede the development of hybrid quantum processors [emphasis mine] that draw on the strengths of technologies developed by different IP owners.
Patents owned by larger firms [emphasis mine] can also discourage follow-on research, as well as product development and commercialization by SMEs, because “the cost of accessing those patents, through either royalties or legal battles, may simply be too high for small firms to sustain” (Gallini & Hollis, 2019). Dense webs “of overlapping intellectual property rights” that impede technology commercialization by SMEs (i.e., patent thickets [emphasis mine]) (Shapiro, 2000) can be found in two technology fields where Canadian innovators have historically enjoyed a relative advantage [emphasis mine]: computers and communications (Gallini & Hollis, 2019). [p. 111 in the print version and p. 139 in the PDF version]
…
IP assets can stimulate the growth of SMEs [emphasis mine; SME is small and medium-sized enterprises] in different stages of development (including start-ups and university spin-offs) (EPO, 2017). Compared to firms that lack IP rights, IP-holding SMEs are more likely to receive higher amounts of financing (since IP can be used as collateral for loans), innovate, realize plans for domestic and international expansion, and experience higher growth (Collette & Santilli, 2019). IP rights, and particularly patents, enable innovators to fend off competitors, protect their businesses from large firms [emphasis mine], and build patent portfolios that facilitate cross-licensing agreements (Gallini & Hollis, 2019). [p. 112 in print version and p. 140 in PDF version]
Unlike patents, copyright arises automatically and protects original expressions of ideas, including those contained in software such as “computer source code, visual user interface elements, API [Application Programming Interface] structure, user documentation and product guides” (Bereskin & Parr LLP, n.d.). The functional aspects of software, however, are not subject to copyright protection (Samuelson, 2017).
Copyright can protect aspects of quantum technology that constitute “literary works” under the Copyright Act (GC, 1985e; Bereskin & Parr LLP, n.d.). For example, in some contexts, the following components of a quantum computer are eligible for copyright protection: “quantum software, the APIs, quantum arithmetic unit (quantum addition, subtraction, multiplication, and exponentiation), runtime assertion and configuration, quantum computing platforms, program paradigm and languages, the Bacon-Shor stabilization code, color codes, and surface codes” (Kop, 2021b).
These components are eligible for copyright if they meet the criterion of fixation (White, 2013; Dylan, 2019). A work is fixed when it is “expressed to some extent at least in some material form, capable of identification and having a more or less permanent endurance” (Exchequer Court, 1954; Hagen et al., 2022). Fixation is one of the main requirements of copyright because it prevents people from claiming legal protection for thoughts (Schmit, 2013). However, according to Schmit (2013), the material form requirement could be problematic for quantum software because the quantum object code cannot be fixed for “more than a transitory duration due to superposition” that “allows a system of n-qubits to be any or all of 2^n different possibilities simultaneously” [original emphasis was an italicized ‘simultaneously’ in the standardized text of the original report]. [p. 113 in the print version and p. 140 in the PDF version]
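For what it’s worth, the “2^n different possibilities” in that excerpt is just a count of the amplitudes needed to describe n qubits. A minimal sketch in plain Python (no quantum library), using an equal superposition as the example,

```python
# Minimal illustration of the "2^n possibilities" in the excerpt: describing
# n qubits classically takes 2**n complex amplitudes (here an equal superposition).
n = 3
dim = 2 ** n
state = [1 / dim ** 0.5] * dim       # equal-weight amplitude for each basis state
print(dim, "basis states:", [format(i, f"0{n}b") for i in range(dim)])
print("probabilities sum to", round(sum(a * a for a in state), 10))
```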
I’m going to speed this up by skipping through areas where my understanding is poor. That means skipping the rest of Chapter 5, which goes on to include sections on Competition Law, Standards and Standardization, Domestic and Foreign Trade Regulations, and more. I’m also going to skip Chapter 6: Enabling Conditions for Adoption entirely.
Chapter 7: Framework for the Responsible Adoption of Quantum Technologies
I found the reflections from the expert panel to be the most interesting part of Chapter 7, from Quantum Potential,
Quantum technologies are being recognized globally as critical investments, inspiring many countries to commit millions if not billions of dollars to their development. The panel notes that these substantial investments are necessary, as quantum technology research can be slow and expensive, with much of the equipment and raw materials needing to be imported from specific suppliers (some of which are the only option). Canada cannot mine, manufacture, or otherwise create every input along any given quantum product’s supply chain. As such, it is crucial that it cultivate robust and reliable international collaborations to supplement the points of the quantum value chain Canadian companies cannot fulfill, while also providing larger and additional export markets for domestic quantum companies. [p. 171 in the print version and p. 199 in the PDF version]
…
In 2023, Canada published its National Quantum Strategy (NQS). While the NQS is a good starting point for developing a domestic quantum ecosystem, the panel believes promising initiatives remain underfunded, especially compared to jurisdictions that have committed more significant amounts. … The NQS [National Quantum Strategy] largely focuses on supply-side initiatives with less support for stimulating diffusion and adoption. Although there is some spotlight on adopting users and sectors — such as the proposed roadmapping process — jurisdictions leading in the quantum space (e.g., China, United States) employ comprehensive technology adoption strategies for both the public and private sectors.
In the panel’s view, the NQS does not pay sufficient attention to ELSPI [emphasis mine; ethical, legal, social, and policy implications] related to the adoption of quantum technologies. As noted above, some quantum sensors could exacerbate surveillance concerns, and quantum computers could threaten digital encryption and worsen individual and collective discrimination. These capabilities complicate the application, interpretation, and enforcement of Canada’s privacy and data protection laws. In addition, there is the challenge of ensuring equitable and broad access to quantum technologies across Canada as they become available. This vulnerability is aggravated by the fact that big technology companies can exploit their market power to dictate the terms and conditions of access; moreover, the application and enforcement of Canada’s IP and competition law may favour major market players. Finally, because quantum science is conceptually challenging, quantum technologies are likewise difficult to understand and can be shrouded in mystery or overhyped. [p. 172 in the print version and p. 200 in the PDF version]
Strangely (or not), there is no mention, among the remedies suggested by the panel, of public outreach or of raising public awareness (the ‘social’ aspect of ELSPI).
Final comments
I learned a lot from Quantum Potential. Clear prose (always appreciated), good explanations of various quantum technologies (thank you!), and a clear-eyed view of the pitfalls associated with developing emerging technologies in Canada. The examination of the pitfalls (some of which are common to emerging technologies with a special emphasis on quantum technologies) gave this report an extra lift.
For the first time (other than when the query is focused on the issue) since I’ve been reading these things, the report addresses diversity issues, specifically, the lack of diversity. I was particularly impressed with this section in Chapter 4, from Quantum Potential,
There is a lack of diversity in STEM [science, technology, engineering, and mathematics]
The continued lack of diversity in STEM is another reason why skilled personnel are hard to find and why talent is not being used to its full potential. For example, Indigenous people make up less than 2% of all STEM sector employees (Cooper, 2020). Only 4.1% of Indigenous workers have post-secondary education in STEM disciplines compared to 10.4% of non-Indigenous people (Kazmi, 2022). Studies show that minoritized and racialized professors are underrepresented, have lower wages than their white colleagues, and feel that their contributions are undervalued by their peers (Henry et al., 2017). …
Math, computer science, and engineering are dominated by men
When it comes to attracting women researchers to quantum fields in Canada, “there is huge competition for the relatively few female candidates in quantum technologies, but this has not necessarily translated into more women entering relevant programs of study” (ISED, 2022d). … women accounted for 56% of post-secondary enrolment between 2010 and 2019. However, only 38.5% of STEM students were women, with even lower rates in math and computer science (28%) and engineering (22%) (Mahboubi, 2022). … [p. 101 in print version and p. 120 in PDF version]
…
The statistics are very much in line with what I’ve been reading for years about diversity. Like many people, in addition to the notion that diversity is about being more fair, I’ve long believed that mixed teams have better problem solving skills. Then, lightning struck …
Cognitive diversity?
I was looking for proof that diverse teams are stronger when I stumbled across a March 30, 2017 article “Teams Solve Problems Faster When They’re More Cognitively Diverse” by Alison Reynolds and David Lewis for the Harvard Business Review, Note: I have a caveat (warning/caution), which follows the excerpts,
Looking at the executive teams we work with as consultants and those we teach in the classroom, increased diversity of gender, ethnicity, and age is apparent. Over recent decades the rightful endeavor to achieve a more representative workforce has had an impact. Of course, there is a ways to go, but progress has been made.
Throughout this period, we have run a strategic execution exercise with executive groups focused on managing new, uncertain, and complex situations. The exercise requires the group to formulate and execute a strategy to achieve a specified outcome, against the clock.
Received wisdom is that the more diverse the teams in terms of age, ethnicity, and gender, the more creative and productive they are likely to be [emphasis mine]. But having run the execution exercise around the world more than 100 times over the last 12 years, we have found no correlation between this type of diversity and performance [emphasis mine]. With an average group size of 16, comprising senior executives, MBA students, general managers, scientists, teachers, and teenagers, our observations have been consistent. Some groups have fared exceptionally well and others incredibly badly, irrespective of diversity in gender, ethnicity, and age.
Since there is so much focus on the importance of diversity in problem solving, we were intrigued by these results. If not diversity, what accounted for such variability in performance? We wanted to understand what led some groups to succeed and others to crash and burn. This led us to consider differences that go beyond gender, ethnicity, or age. We began to look more closely at cognitive diversity [emphasis mine].
Cognitive diversity has been defined as differences in perspective or information processing styles. It is not predicted by factors such as gender, ethnicity, or age. Here we are interested in a specific aspect of cognitive diversity: how individuals think about and engage with new, uncertain, and complex situations.
…
These cognitive preferences are established when we are young. They are independent of our education, our culture, and other social conditioning. Two things about cognitive diversity make it particularly easy to overlook.
…
First, it is less visible than, for example, ethnic and gender diversity.
…
The second factor that contributes to cognitive diversity being overlooked is that we create cultural barriers that restrict the degree of cognitive diversity, even when we don’t mean to.
There is a familiar saying: “We recruit in our own image.” This bias doesn’t end with demographic distinctions like race or gender, or with the recruiting process, for that matter. Colleagues gravitate toward the people who think and express themselves in a similar way [emphases mine]. As a result, organizations often end up with like-minded teams. …
If you look for it, cognitive diversity is all around — but people like to fit in, so they are cautious about sticking their necks out [emphasis mine]. When we have a strong, homogenous culture (e.g., an engineering culture, an operational culture, or a relational culture), we stifle the natural cognitive diversity in groups through the pressure to conform.
…
If cognitive diversity is what we need to succeed in dealing with new, uncertain, and complex situations, we need to encourage people to reveal and deploy their different modes of thinking. We need to make it safe to try things multiple ways. This means leaders will have to get much better at building their team’s sense of psychological safety.
There is much talk of authentic leadership, i.e., being yourself. Perhaps it is even more important that leaders focus on enabling others to be themselves.
I’d forgotten (after years as a freelancer and also because I have a tendency to underestimate it) the importance of conformity at work and, most of all, for getting ahead. “Go along to get along,” is an old adage my father shared with me as he prepared me for the world of work. That covers a lot of ground including the notion that you conform to expectations.
The authors’ point about cognitive diversity is well made; I had slipped into the habit of viewing more diversity as an automatic guarantor of more diverse thinking being brought to the table. I now see that was a bit simplistic.
Next, a not entirely unrelated topic:
Could arts/humanities/social science practitioners enhance the expert panel?
In a report where the expert panel notes that trying to imagine the impact that quantum technology might have on society in the future is very difficult, no one thinks to call on culture workers, sociologists, philosophers, writers, musicians, visual artists, theatre artists, etc. Really?
Sadly, this is not the first time that the Council of Canadian Academies and one of its expert panels has overlooked the obvious. Here’s what I had to say about a previous expert panel in a February 22, 2013 posting,
I was very excited when the forthcoming assessment The State of Canada’s Science Culture was announced in early 2012 (or was it late 2011?). At any rate, much has happened since then including what appears to be some political shenanigans. The assessment was originally requested by the Canada Science and Technology Museums Corporation. After many, many months the chair of the panel was announced, Arthur Carty, and mentioned here in my Dec. 19, 2012 posting.
…
Could they not have found one visual or performing artist or writer or culture maker to add to this expert panel? One of them might have added a hint of creativity or imagination to this assessment [emphasis mine and added January 6, 2025]. …
As for incorporating other marginalized groups, be it by race, ethnicity, social class, ability, etc., the panel members’ biography pages do not give any hint of whether or not any attempt was made. I hope attempts will be made during the information gathering process and that those attempts will be documented, however briefly, in the forthcoming assessment [Science Culture: Where Canada Stands].
…
Notably, there is a joint programme (an art/sci residency known as the Quantum Studio) between the Stewart Blusson Quantum Matter Institute (Blusson QMI) and the Morris and Helen Belkin Art Gallery (the Belkin), both at UBC (University of British Columbia), in partnership with The Embassy of France in Canada. For the curious, my October 27, 2024 posting provides more detail, scroll down about 25% of the way.
Adding arts/humanities/social science practitioners is no more a guarantor of cognitive diversity than diversity of race, social class, ethnicity, gender, etc. would be, but it certainly can’t hurt, and there is precedent for believing that an outsider might come up with something interesting. For example, the Star Trek television series influenced the design of mobile phones: a set designer for a 1960s science fiction television show came up with the basic design for the cell phone. (You can find out more about how Star Trek influenced technology in 2017’s “Treknology: The Science of Star Trek from Tricorders to Warp Drive” by Ethan Siegel.)
Public engagement and intellectual property
It was good to see public engagement/awareness mentioned in the report. Unfortunately, that’s all the topic rated. It seems that this activity, as is often the case with emerging technologies, will be relegated to the later stages of quantum technology development in Canada.
Taking into account that it’s helpful to know how much progress you’re making, measuring that progress by the number of patents (intellectual property) being filed could be problematic. It’s a little surprising that the expert panel, which had reservations in other areas, didn’t note any with regard to counting patents. After all, is that measuring progress or legal paperwork?
So, how are we doing quantumwise?
If you’re looking for good news, Quantum Potential does offer a bit, but I gather that, in its eagerness to avoid hype and answer the questions as posed by the sponsoring agencies, the expert panel focused largely on problems. (Confession: I too have a tendency to focus on problems.)
Luckily, there’s a January 5, 2023 article (somewhat dated but the most up-to-date I can find) by James Dargan for the Quantum Insider,
Canada is one of the leading countries when it comes to research and the commercial aspects of quantum computing (QC).
…
With several quantum research institutes and labs, including Simmons’ [Photonic inc.’s founder Stephanie Simmons] very own Silicon Quantum Technologies Lab at Simon Fraser University, Canada can also take pride in the research efforts being done in the sector at places like the Université de Sherbrooke — EPIQ, Université de Sherbrooke — Institut quantique, the University of British Columbia — Advanced Materials and Process Engineering Laboratory (AMPEL), the University of British Columbia — Quantum Information Science, the University of British Columbia — Quantum Matter institute, the University of Calgary — Institute for Quantum Science and Technology, the University of Toronto — Centre for Quantum Information and Quantum Control, and finally the University of Waterloo — Institute for Quantum Computing.
Furthermore, it is in the top ten countries worldwide in planned public funding for quantum technology at more than $600 million, a number that is growing but still behind China with $15 billion, the EU with $7.2 billion and the US with $1.3 billion, according to a McKinsey report.
Canada is also the home of the first quantum computing company in existence, D-Wave (see below), and presently the birthplace to more than two dozen quantum computing startups.
…
[article lists 25 Canadian quantum companies]
There you have it, a dose of boosterism.
Where are we going from here?
As the panel noted, there is an international race to develop quantum technology. The UN has declared 2025 as the International Year of Quantum Science and Technology (IYQ; see more about the announcement in an October 3, 2024 posting on the International Union of Pure and Applied Chemistry [IUPAC] website or in a June 7, 2024 posting on Quantum Insider by Matt Swayne, which reproduces the UN press release),
Recognizing the importance of quantum science and the need for wider awareness of its past and future impact, dozens of national scientific societies gathered together to support marking 100 years of quantum mechanics with a U.N.-declared international year.
On June 7, 2024, the United Nations proclaimed 2025 as the International Year of Quantum Science and Technology (IYQ). According to the proclamation, this year-long, worldwide initiative will “be observed through activities at all levels aimed at increasing public awareness of the importance of quantum science and applications.”
Anyone, anywhere can participate in IYQ by helping others to learn more about quantum on this centennial occasion, participating in or organizing an IYQ event, or simply taking the time to learn more about quantum science and technology.
I didn’t see any Canadian events listed on the IYQ website in early February 2025 but the situation (as of February 18, 2025) has already changed. Quantum Days 2025 will take place February 19 – 21, 2025 in Toronto, Ontario. Tickets are already sold out but you can check out the Quantum Days 2025 website to get a sense of the programming. The most accessible Canadian event currently listed (as of February 19, 2025) appears to be a May 9, 2025 event billed on the IYQ website as Quantum Canada Open Doors,
Quantum Canada: Open Doors invites Canadians from every province and territory to experience the incredible world of quantum through free, accessible events.
From lab tours and public lectures to hands-on workshops and industry showcases, this one-day event will highlight Canada’s leadership in quantum science and technology. Whether you’re a quantum expert, a curious student, or simply someone eager to learn, there’s something for everyone.
Canada was represented at the International Year’s two-day launch, February 4 – 5, 2025, in Paris, as noted in my January 31, 2025 posting. On day one (February 4, 2025, in the 11:50-12:40 Roundtable Discussion: Pushing the Frontiers of Quantum Science and Technology), there was Stephanie Simmons (Simon Fraser University professor and Founder & Chief Quantum Officer at Photonic, Co-Chair of Canada’s National Quantum Advisory Council).
Also on day one, there was John Donohue, Senior Manager of Scientific Outreach at the Institute for Quantum Computing, University of Waterloo, in the 14:00-14:50 Panel Discussion: Public Engagement and Education in Quantum Science and Technology.
On day two, Shohini Ghose, Professor of Physics and Computer Science at Wilfrid Laurier University and Chief Technology Officer, Quantum Algorithms Institute, participated in the 09:45-10:45 Panel Discussion: Ethics of Quantum Technologies.
The popular imagination and quantum physics
Quantum physics doesn’t occupy the popular imagination in quite the way that artificial intelligence does—not yet. However, it was a bit startling to come across this quote in a Vanity Fair article (February 2025 issue, p. 77 in the paper version),
“Scott was like a quantum entangled particle,” said another friend. “He had both an inferiority and superiority complex.”
Most of the people consulted for the article (Atlas Shrugged by Dirk Smillie) were part of the financial services community, in which Scott Minerd was a significant player prior to his death at the age of 63.
It’s not a community where I’d expect to hear a ‘quantum’ analogy. Also, I believe the speaker was trying to reference quantum superposition, not entanglement. However, for someone interested in how new and emerging technologies become known to the public, the speaker’s probable profession and error are both fascinating, especially as the quote appears in a high-end magazine article about extreme wealth during the International Year of Quantum Science and Technology.
Back to the regular programme
Given the focus on security issues in the report, there’s an intriguing announcement in this December 26, 2024 posting by Matt Swayne for The Quantum Insider,
Insider Brief
Proposals must align with national defense priorities while incorporating Indigenous knowledge and demonstrating socio-economic benefits for Canada’s broader innovation ecosystem.
Canada’s IDEaS program is offering $19 million in grants to develop cutting-edge technologies that support NORAD [North American Aerospace Defense Command (NORAD)] modernization and enhance North American defense capabilities [emphasis mine].
The program targets early-stage innovations in quantum computing [emphasis mine], autonomous systems, Arctic mobility, counter-drone measures, and sustainable energy solutions.
…
Quantum Technologies
The contest encourages advances in quantum computing, including logical quantum bits (qubits) for fault-tolerant systems, quantum algorithms for optimization and anomaly detection, and quantum repeaters for secure networking. These innovations have the potential to transform how data is processed and secured in defense systems.
As for the question I asked in the subhead, “Where are we going from here?,” I don’t think anybody knows. If you have some insights please do share them in the comments.
The Artificial Intelligence (AI) Action Summit held from February 10 – 11, 2025 in Paris seems to have been pretty exciting. President Emmanuel Macron announced a 109-billion-euro investment in the French AI sector on February 10, 2025 (I have more in my February 13, 2025 posting [scroll down to the ‘What makes Canadian (and Greenlandic) minerals and water so important?’ subhead]). I also have this snippet, which suggests Macron is eager to provide an alternative to US domination in the field of AI, from a February 10, 2025 posting on CGTN (China Global Television Network),
French President Emmanuel Macron announced on Sunday night [February 10, 2025] that France is set to receive a total investment of 109 billion euros (approximately $112 billion) in artificial intelligence over the coming years.
Speaking in a televised interview on public broadcaster France 2, Macron described the investment as “the equivalent for France of what the United States announced with ‘Stargate’.”
He noted that the funding will come from the United Arab Emirates, major American and Canadian investment funds [emphases mine], as well as French companies.
The AI summit wasn’t the only thing on Canadian minds in Paris; the tariff threats followed the delegation there, as this excerpt from news coverage of the summit notes,
Prime Minister Justin Trudeau warned U.S. Vice-President J.D. Vance that punishing tariffs on Canadian steel and aluminum will hurt his home state of Ohio, a senior Canadian official said.
The two leaders met on the sidelines of an international summit in Paris Tuesday [February 11, 2025], as the Trump administration moves forward with its threat to impose 25 per cent tariffs on all steel and aluminum imports, including from its biggest supplier, Canada, effective March 12.
…
Speaking to reporters on Wednesday [February 12, 2025] as he departed from Brussels, Trudeau characterized the meeting as a brief chat that took place as the pair met.
…
“It was just a quick greeting exchange,” Trudeau said. “I highlighted that $2.2 billion worth of steel and aluminum exports from Canada go directly into the Ohio economy, often to go into manufacturing there.
“He nodded, and noted it, but it wasn’t a longer exchange than that.”
…
Vance didn’t respond to Canadian media’s questions about the tariffs while arriving at the summit on Tuesday [February 11, 2025].
…
Additional insight can be gained from a February 10, 2025 PBS (US Public Broadcasting Service) posting of an AP (Associated Press) article with contributions from Kelvin Chan and Angela Charlton in Paris, Ken Moritsugu in Beijing, and Aijaz Hussain in New Delhi,
JD Vance stepped onto the world stage this week for the first time as U.S. vice president, using a high-stakes AI summit in Paris and a security conference in Munich to amplify Donald Trump’s aggressive new approach to diplomacy.
The 40-year-old vice president, who was just 18 months into his tenure as a senator before joining Trump’s ticket, is expected, while in Paris, to push back on European efforts to tighten AI oversight while advocating for a more open, innovation-driven approach.
The AI summit has drawn world leaders, top tech executives, and policymakers to discuss artificial intelligence’s impact on global security, economics, and governance. High-profile attendees include Chinese Vice Premier Zhang Guoqing, signaling Beijing’s deep interest in shaping global AI standards.
Macron also called for “simplifying” rules in France and the European Union to allow AI advances, citing sectors like healthcare, mobility, and energy, and the need to “resynchronize with the rest of the world.”
“We are most of the time too slow,” he said.
The summit underscores a three-way race for AI supremacy: Europe striving to regulate and invest, China expanding access through state-backed tech giants, and the U.S. under Trump prioritizing a hands-off approach.
…
Vance has signaled he will use the Paris summit as a venue for candid discussions with world leaders on AI and geopolitics.
“I think there’s a lot that some of the leaders who are present at the AI summit could do to, frankly — bring the Russia-Ukraine conflict to a close, help us diplomatically there — and so we’re going to be focused on those meetings in France,” Vance told Breitbart News.
Vance is expected to meet separately Tuesday with Indian Prime Minister Narendra Modi and European Commission President Ursula von der Leyen, according to a person familiar with planning who spoke on the condition of anonymity.
…
Modi is co-hosting the summit with Macron in an effort to prevent the sector from becoming a U.S.-China battle.
Indian Foreign Secretary Vikram Misri stressed the need for equitable access to AI to avoid “perpetuating a digital divide that is already existing across the world.”
But the U.S.-China rivalry overshadowed broader international talks.
…
The U.S.-China rivalry didn’t entirely overshadow the talks. At least one former Chinese diplomat chose to make her presence felt by chastising a Canadian academic, according to a February 11, 2025 article by Matthew Broersma for silicon.co.uk,
A representative of China at this week’s AI Action Summit in Paris stressed the importance of collaboration on artificial intelligence, while engaging in a testy exchange with Yoshua Bengio, a Canadian academic considered one of the “Godfathers” of AI.
Fu Ying, a former Chinese government official and now an academic at Tsinghua University in Beijing, said the name of China’s official AI Development and Safety Network was intended to emphasise the importance of collaboration to manage the risks around AI.
She also said tensions between the US and China were impeding the ability to develop AI safely.
…
… Fu Ying, a former vice minister of foreign affairs in China and the country’s former UK ambassador, took veiled jabs at Prof Bengio, who was also a member of the panel.
…
Zoe Kleinman’s February 10, 2025 article for the British Broadcasting Corporation (BBC) news online website also notes the encounter,
A former Chinese official poked fun at a major international AI safety report led by “AI Godfather” professor Yoshua Bengio and co-authored by 96 global experts – in front of him.
Fu Ying, former vice minister of foreign affairs and once China’s UK ambassador, is now an academic at Tsinghua University in Beijing.
The pair were speaking at a panel discussion ahead of a two-day global AI summit starting in Paris on Monday [February 10, 2025].
The aim of the summit is to unite world leaders, tech executives, and academics to examine AI’s impact on society, governance, and the environment.
Fu Ying began by thanking Canada’s Prof Bengio for the “very, very long” document, adding that the Chinese translation stretched to around 400 pages and she hadn’t finished reading it.
She also had a dig at the title of the AI Safety Institute – of which Prof Bengio is a member.
China now has its own equivalent; but they decided to call it The AI Development and Safety Network, she said, because there are lots of institutes already but this wording emphasised the importance of collaboration.
The AI Action Summit is welcoming guests from 80 countries, with OpenAI chief executive Sam Altman, Microsoft president Brad Smith and Google chief executive Sundar Pichai among the big names in US tech attending.
Elon Musk is not on the guest list but it is currently unknown whether he will decide to join them. [As of February 13, 2025, Mr. Musk did not attend the summit, which ended February 11, 2025.]
A key focus is regulating AI in an increasingly fractured world. The summit comes weeks after a seismic industry shift as China’s DeepSeek unveiled a powerful, low-cost AI model, challenging US dominance.
The pair’s heated exchanges were a symbol of global political jostling in the powerful AI arms race, but Fu Ying also expressed regret about the negative impact of current hostilities between the US and China on the progress of AI safety.
…
She gave a carefully-crafted glimpse behind the curtain of China’s AI scene, describing an “explosive period” of innovation since the country first published its AI development plan in 2017, five years before ChatGPT became a viral sensation in the west.
She added that “when the pace [of development] is rapid, risky stuff occurs” but did not elaborate on what might have taken place.
“The Chinese move faster [than the west] but it’s full of problems,” she said.
Fu Ying argued that building AI tools on foundations which are open source, meaning everyone can see how they work and therefore contribute to improving them, was the most effective way to make sure the tech did not cause harm.
Most of the US tech giants do not share the tech which drives their products.
Open source offers humans “better opportunities to detect and solve problems”, she said, adding that “the lack of transparency among the giants makes people nervous”.
But Prof Bengio disagreed.
His view was that open source also left the tech wide open for criminals to misuse.
He did however concede that “from a safety point of view”, it was easier to spot issues with the viral Chinese AI assistant DeepSeek, which was built using open source architecture, than ChatGPT, whose code has not been shared by its creator OpenAI.
As for the report Fu Ying needled Bengio about, here’s more on the International AI Safety Report,
Announced in November 2023 at the AI Safety Summit at Bletchley Park, England, and inspired by the workings of the United Nations Intergovernmental Panel on Climate Change, the report consolidates leading international expertise on AI and its risks.
Supported by the United Kingdom’s Department for Science, Innovation and Technology, Bengio, founder and scientific director of the UdeM-affiliated Mila – Quebec AI Institute, led a team of 96 international experts in drafting the report.
The experts were drawn from 30 countries, the U.N., the European Union and the OECD [Organisation for Economic Cooperation and Development]. Their report will help inform discussions next month at the AI Action Summit in Paris, France and serve as a global handbook on AI safety to help support policymakers.
Towards a common understanding
The most advanced AI systems in the world now have the ability to write increasingly sophisticated computer programs, identify cyber vulnerabilities, and perform on a par with human PhD-level experts on tests in biology, chemistry, and physics.
In what is identified as a key development for policymakers to monitor, the AI Safety Report published today warns that AI systems are also increasingly capable of acting as AI agents, autonomously planning and acting in pursuit of a goal.
As policymakers worldwide grapple with the rapid and unpredictable advancements in AI, the report contributes to bridging the gap by offering a scientific understanding of emerging risks to guide decision-making.
The document sets out the first comprehensive, independent, and shared scientific understanding of advanced AI systems and their risks, highlighting how quickly the technology has evolved.
Several areas require urgent research attention, according to the report, including how rapidly capabilities will advance, how general-purpose AI models work internally, and how they can be designed to behave reliably.
Three distinct categories of AI risks are identified:
Malicious use risks: these include cyberattacks, the creation of AI-generated child-sexual-abuse material, and even the development of biological weapons;
System malfunctions: these include bias, reliability issues, and the potential loss of control over advanced general-purpose AI systems;
Systemic risks: these stem from the widespread adoption of AI, include workforce disruption, privacy concerns, and environmental impacts.
The report places particular emphasis on the urgency of increasing transparency and understanding in AI decision-making as the systems become more sophisticated and the technology continues to develop at a rapid pace.
While there are still many challenges in mitigating the risks of general-purpose AI, the report highlights promising areas for future research and concludes that progress can be made.
Ultimately, it emphasizes that while AI capabilities could advance at varying speeds, their development and potential risks are not a foregone conclusion. The outcomes depend on the choices that societies and governments make today and in the future.
“The capabilities of general-purpose AI have increased rapidly in recent years and months,” said Bengio. “While this holds great potential for society, AI also presents significant risks that must be carefully managed by governments worldwide.
“This report by independent experts aims to facilitate constructive and evidence-based discussion around these risks and serves as a common basis for policymakers around the world to understand general-purpose AI capabilities, risks and possible mitigations.”
There have been two previous AI Safety Summits that I’m aware of and you can read about them in my May 21, 2024 posting about the one in Korea and in my November 2, 2023 posting about the first summit at Bletchley Park in the UK.
I was taught in high school that the US was running out of its resources and that Canada still had much of its resources. That was decades ago. As well, throughout the years, usually during a vote in Québec about separating, I’ve heard rumblings about the US absorbing part or all of Canada as something they call ‘Manifest Destiny,’ which dates back to the 19th century.
Unlike the previous forays into Manifest Destiny, this one has not been precipitated by any discussion of separation.
Manifest Destiny
It took a while for that phrase to emerge this time but when it finally did, the Canadian Broadcasting Corporation (CBC) online news published a January 19, 2025 article by Ainsley Hawthorn providing some context for the term, Note: Links have been removed,
U.S. president-elect Donald Trump says he’s prepared to use economic force to turn Canada into America’s 51st state, and it’s making Canadians — two-thirds of whom believe he’s sincere — anxious.
But the last time Canada faced the threat of American annexation, it united us more than ever before, leading to the foundation of our country as we know it today.
In the 1860s, several prominent U.S. politicians advocated for annexing the colonies of British North America.
“I look on Rupert’s Land [modern-day Manitoba and parts of Alberta, Saskatchewan, Nunavut, Ontario, and Quebec] and Canada, and see how an ingenious people and a capable, enlightened government are occupied with bridging rivers and making railroads and telegraphs,” Secretary of State William Henry Seward told a crowd in St. Paul, Minn. while campaigning on behalf of presidential candidate Abraham Lincoln.
“I am able to say, it is very well; you are building excellent states to be hereafter admitted into the American Union.”
Seward believed in Manifest Destiny, the doctrine that the United States would inevitably expand across the entire North American continent. While he seems to have preferred to acquire territory through negotiation rather than aggression, Canadians weren’t wholly assured of America’s peaceful intentions.
…
In the late 1850s and early 1860s, Canadian parliament had been so deadlocked it had practically come to a standstill. Within just a few years, American pressure created a sense of unity so great it led to Confederation.
The current conversation around annexation is likewise uniting Canada’s leaders to a degree we’ve rarely seen in recent years.
Representatives across the political spectrum are sharing a common message, the same message as British North Americans in the late nineteenth century: despite our problems, Canadians value Canada.
Critical minerals and water
Prime Minister Justin Trudeau had a few comments to make about US President Donald Trump’s motivation for ‘absorbing’ Canada as the 51st state, from a February 7, 2025 CBC news online article by Peter Zimonjic,
Prime Minister Justin Trudeau told business leaders at the Canada-U.S. Economic Summit in Toronto that U.S. President Donald Trump’s threat to annex Canada “is a real thing” motivated by his desire to tap into the country’s critical minerals.
“Mr. Trump has it in mind that the easiest way to do it is absorbing our country and it is a real thing,” Trudeau said, before a microphone cut out at the start of the closed-door meeting.
The prime minister made the remarks to more than 100 business leaders after delivering an opening address to the summit Friday morning [February 7, 2025], outlining the key issues facing the country when it comes to Canada’s trading relationship with the U.S.
After the opening address, media were ushered out of the room when a microphone that was left on picked up what was only meant to be heard by attendees [emphasis mine].
…
Automotive Parts Manufacturers’ Association president Flavio Volpe was in the room when Trudeau made the comments. He said the prime minister went on to say that Trump is driven because the U.S. could benefit from Canada’s critical mineral resources.
…
There was more, from a February 7, 2025 article by Nick Taylor-Vaisey for Politico, Note: A link has been removed,
…
In remarks caught on tape by The Toronto Star, Trudeau suggested the president is keenly aware of Canada’s vast mineral resources. “I suggest that not only does the Trump administration know how many critical minerals we have but that may be even why they keep talking about absorbing us and making us the 51st state,” Trudeau said.
…
All of this reminded me of US President Joe Biden’s visit to Canada and his interest in critical minerals, which I mentioned briefly in my comments about the 2023 federal budget, from my April 17, 2023 posting (scroll down to the ‘Canadian economic theory (the staples theory), mining, nuclear energy, quantum science, and more’ subhead),
Critical minerals are getting a lot of attention these days. (They were featured in the 2022 budget, see my April 19, 2022 posting, scroll down to the Mining subhead.) This year, US President Joe Biden, in his first visit to Canada as President, singled out critical minerals at the end of his 28 hour state visit (from a March 24, 2023 CBC news online article by Alexander Panetta; Note: Links have been removed),
There was a pot of gold at the end of President Joe Biden’s jaunt to Canada. It’s going to Canada’s mining sector.
The U.S. military will deliver funds this spring to critical minerals projects in both the U.S. and Canada. The goal is to accelerate the development of a critical minerals industry on this continent.
The context is the United States’ intensifying rivalry with China.
The U.S. is desperate to reduce its reliance on its adversary for materials needed to power electric vehicles, electronics and many other products, and has set aside hundreds of millions of dollars under a program called the Defence Production Act.
The Pentagon already has told Canadian companies they would be eligible to apply. It has said the cash would arrive as grants, not loans.
On Friday [March 24, 2023], before Biden left Ottawa, he promised they’ll get some.
The White House and the Prime Minister’s Office announced that companies from both countries will be eligible this spring for money from a $250 million US fund.
Which Canadian companies? The leaders didn’t say. Canadian officials have provided the U.S. with a list of at least 70 projects that could warrant U.S. funding.
…
“Our nations are blessed with incredible natural resources,” Biden told Canadian parliamentarians during his speech in the House of Commons.
“Canada in particular has large quantities of critical minerals [emphasis mine] that are essential for our clean energy future, for the world’s clean energy future.
…
I don’t think there’s any question that the US knows how much, where, and how easily ‘extractable’ Canadian critical minerals might be.
Pressure builds
On the same day (Monday, February 3, 2025) the tariffs were postponed for a month, Trudeau had two telephone calls with US President Donald Trump. According to a February 9, 2025 article by Steve Chase and Stefanie Marotta for the Globe and Mail, Trump and his minions are exploring the possibility of acquiring Canada by means other than a trade war or economic domination,
…
He [Trudeau] talked about two phone conversations he had with Mr. Trump on Monday [February 3, 2025] before the President agreed to delay the steep tariffs on Canadian goods for 30 days.
During the calls, the Prime Minister recalled Mr. Trump referred to a four-page memo that included a list of grievances he had with Canadian trade and commercial rules, including the President’s false claim that US banks are unable to operate in Canada. …
In the second conversation with Mr. Trump on Monday, the Prime Minister told the summit, the President asked him whether he was familiar with the Treaty of 1908, a pact between the United States and Britain that defined the border between the United States and Canada. He told Mr. Trudeau he should look it up.
Mr. Trudeau told the summit he thought the treaty had been superseded by other developments such as the repatriation of the Canadian Constitution – in other words, that the border cannot be dissolved by repealing that treaty. He told the audience that international law would prevent the dissolution of the 1908 Treaty from leading to the erasure of the border. For example, various international laws define sovereign borders, including the United Nations Charter, of which both countries are signatories and which includes protections for territorial integrity.
A source familiar with the calls said Mr. Trump’s reference to the 1908 Treaty was taken as an implied threat. … [p. A3 in paper version]
I imagine Mr. Trump and/or his minions will keep trying to find one pretext or another for this attempt to absorb or annex or wage war (economically or otherwise) on Canada.
What makes Canadian (and Greenlandic) minerals and water so important?
You may have noticed the January 21, 2025 announcement by Mr. Trump about the ‘Stargate Project,’ a proposed US $500B AI infrastructure company (you can find more about the Stargate Project (Stargate LLC) in its Wikipedia entry).
Most likely not a coincidence, on February 10, 2025, France’s President Emmanuel Macron announced a 109-billion-euro investment in the French AI sector, from the February 9, 2025 Reuters preannouncement article,
France will announce private sector investments totalling some 109 billion euros ($112.5 billion [US]) in its artificial intelligence sector during the Paris AI summit which opens on Monday, President Emmanuel Macron said.
The financing includes plans by Canadian investment firm [emphasis mine] Brookfield to invest 20 billion euros in AI projects in France and financing from the United Arab Emirates which could hit 50 billion euros in the years ahead, Macron’s office said.
…
Big projects, non? It’s no surprise critical minerals will be necessary but the need for massive amounts of water may be. My October 16, 2023 posting focuses on water and AI development, specifically ChatGPT-4,
A September 9, 2023 news item (an Associated Press article by Matt O’Brien and Hannah Fingerhut) on phys.org, also published September 12, 2023 on the Iowa Public Radio website, describes an unexpected cost for building ChatGPT and other AI agents, Note: Links [in the excerpt] have been removed,
The cost of building an artificial intelligence product like ChatGPT can be hard to measure.
But one thing Microsoft-backed OpenAI needed for its technology was plenty of water [emphases mine], pulled from the watershed of the Raccoon and Des Moines rivers in central Iowa to cool a powerful supercomputer as it helped teach its AI systems how to mimic human writing.
As they race to capitalize on a craze for generative AI, leading tech developers including Microsoft, OpenAI and Google have acknowledged that growing demand for their AI tools carries hefty costs, from expensive semiconductors to an increase in water consumption.
But they’re often secretive about the specifics. Few people in Iowa knew about its status as a birthplace of OpenAI’s most advanced large language model, GPT-4, before a top Microsoft executive said in a speech it “was literally made next to cornfields west of Des Moines.”
…
In its latest environmental report, Microsoft disclosed that its global water consumption spiked 34% from 2021 to 2022 (to nearly 1.7 billion gallons, or more than 2,500 Olympic-sized swimming pools), a sharp increase compared to previous years that outside researchers tie to its AI research. [emphases mine]
…
As for how much water was diverted in Iowa for a data centre project, from my October 16, 2023 posting
…
Jason Clayworth’s September 18, 2023 article for AXIOS describes the issue from the Iowan perspective, Note: Links [from the excerpt] have been removed,
Future data center projects in West Des Moines will only be considered if Microsoft can implement technology that can “significantly reduce peak water usage,” the Associated Press reports.
Why it matters: Microsoft’s five WDM data centers — the “epicenter for advancing AI” — represent more than $5 billion in investments in the last 15 years.
Yes, but: They consumed as much as 11.5 million gallons of water a month for cooling, or about 6% of WDM’s total usage during peak summer usage during the last two years, according to information from West Des Moines Water Works.
…
The bottom line is that these technologies consume a lot of water and require critical minerals.
Greenland
Evan Dyer’s January 16, 2025 article for CBC news online describes both US military strategic interests and hunger for resources, Note 1: Article links have been removed; Note 2: I have added one link to a Wikipedia entry,
The person who first put a bug in Donald Trump’s ear about Greenland — if a 2022 biography is to be believed — was his friend Ronald Lauder, a New York billionaire and heir to the Estée Lauder cosmetics fortune.
But it would be wrong to believe that U.S. interest in Greenland originated with idle chatter at the country club, rather than real strategic considerations.
Trump’s talk of using force to annex Greenland — which would be an unprovoked act of war against a NATO ally — has been rebuked by Greenlandic, Danish and European leaders. A Fox News team that travelled to Greenland’s capital Nuuk reported back to the Trump-friendly show Fox & Friends that “most of the people we spoke with did not support Trump’s comments and found them offensive.”
…
Certainly, military considerations motivated the last U.S. attempt at buying Greenland in 1946.
…
The military value to the U.S. of acquiring Greenland is much less clear in 2025 than it was in 1946.
Russian nuclear submarines no longer need to traverse the GIUK [the GIUK gap; “{sometimes written G-I-UK} is an area in the northern Atlantic Ocean that forms a naval choke point. Its name is an acronym for Greenland, Iceland, and the United Kingdom, the gap being the two stretches of open ocean among these three landmasses.”]. They can launch their missiles from closer to home.
And in any case, the U.S. already has a military presence on Greenland, used for early warning, satellite tracking and marine surveillance. The Pentagon simply ignored Denmark’s 1957 ban on nuclear weapons on Greenlandic territory. Indeed, an American B-52 bomber carrying four hydrogen bombs crashed in Greenland in 1968.
“The U.S. already has almost unhindered access [emphasis mine], and just building on their relationship with Greenland is going to do far more good than talk of acquisition,” said Dwayne Menezes, director of the Polar Research and Policy Initiative in London.
The complication, he says, is Greenland’s own independence movement. All existing defence agreements involving the U.S. presence in Greenland are between Washington and the Kingdom of Denmark. [emphasis mine]
“They can’t control what’s happening between Denmark and Greenland,” Menezes said. “Over the long term, the only way to mitigate that risk altogether is by acquiring Greenland.”
Menezes also doesn’t believe U.S. interest in Greenland is purely military.
And Trump’s incoming national security adviser Michael Waltz [emphasis mine] appeared to confirm as much when asked by Fox News why the administration wanted Greenland.
“This is about critical minerals, this is about natural resources [emphasis mine]. This is about, as the ice caps pull back, the Chinese are now cranking out icebreakers and are pushing up there.”
…
While the United States has an abundance of natural resources, it risks coming up short in two vital areas: rare-earth minerals and freshwater.
Greenland’s apparent barrenness belies its richness in those two key 21st-century resources.
The U.S. rise to superpower was driven partly by the good fortune of having abundant reserves of oil, which fuelled its industrial growth. The country is still a net exporter of petroleum.
China, Washington’s chief strategic rival, had no such luck. It has to import more than two-thirds of its oil, and is now importing more than six times as much as it did in 2000.
But the future may not favour the U.S. as much as the past.
…
I stand corrected, where oil is concerned. From Dyer’s January 16, 2025 article, Note: Links have been removed,
…
It’s China, and not the U.S., that nature blessed with rich deposits of rare-earth elements, a collection of 17 metals such as yttrium and scandium that are increasingly necessary for high-tech applications from cellphones and flat-screen TVs to electric cars.
The rare-earth element neodymium is an essential part of many computer hard drives and defence systems including electronic displays, guidance systems, lasers, radar and sonar.
Three decades ago, the U.S. produced a third of the world’s rare-earth elements, and China about 40 per cent. By 2011, China had 97 per cent of world production, and its government was increasingly limiting and controlling exports.
The U.S. has responded by opening new mines and spurring recovery and recycling to reduce dependence on China.
…
Such efforts have allowed the U.S. to claw back about 20 per cent of the world’s annual production of rare-earth elements. But that doesn’t change the fact that China has about 44 million tonnes of reserves, compared to fewer than two million in the U.S.
“There’s a huge dependency on China,” said Menezes. “It offers China the economic leverage, in the midst of a trade war in particular, to restrict supply to the West, thus crippling industries like defence, the green transition. This is where Greenland comes in.”
Greenland’s known reserves are almost equivalent to those of the entire U.S., and much more may lie beneath its icebound landscape.
“Greenland is believed to be able to meet at least 25 per cent of global rare-earth demand well into the future,” he said.
An abundance of freshwater
The melting ice caps referenced by Trump’s nominee for national security adviser are another Greenlandic resource the world is increasingly interested in.
Seventy per cent of the world’s freshwater is locked up in the Antarctic ice cap. Of the remainder, two-thirds is in Greenland, in a massive ice cap that is turning to liquid at nearly twice the volume of melting in Antarctica.
“We know this because you can weigh the ice sheet from satellites,” said Christian Schoof, a professor of Earth, ocean and atmospheric sciences at the University of British Columbia who spent part of last year in Greenland studying ice cap melting.
“The ice sheet is heavy enough that it affects the orbit of satellites going over it. And you can record the change in that acceleration of satellites due to the ice sheet over time, and directly weigh the ice sheet.”
…
“There is a growing demand for freshwater on the world market, and the use of the vast water potential in Greenland may contribute to meeting this demand,” the Greenland government announces on its website.
The Geological Survey of Denmark and Greenland found 10 locations that were suitable for the commercial exploitation of Greenland’s ice and water, and has already issued a number of licenses.
…
Schoof told CBC News that past projects that attempted to tow Greenlandic ice to irrigate farms in the Middle East “haven’t really taken off … but humans are resourceful and inventive, and we face some really significant issues in the future.”
For the U.S., those issues include the 22-year-long “megadrought” which has left the western U.S. [emphases mine] drier than at any time in the past 1,200 years, and which is already threatening the future of some American cities.
…
As important as they are, there’s more than critical minerals and water, according to Dyer’s January 16, 2025 article
…
Even the “rock flour” that lies under the ice cap could have great commercial and strategic importance.
Ground into nanoparticles by the crushing weight of the ice, research has revealed it to have almost miraculous properties, says Menezes.
“Scientists have found that Greenlandic glacial flour has a particular nutrient composition that enables it to be regenerative of soil conditions elsewhere,” he told CBC News. “It improves agricultural yields. It has direct implications for food security.”
Spreading Greenland rock flour on corn fields in Ghana produced a 30 to 50 per cent increase in crop yields. Similar yield gains occurred when it was spread on Danish fields that produce the barley for Carlsberg beer.
…
Canada
It’s getting a little tiring keeping up with Mr. Trump’s tariff tear (using ‘tear’ as a verbal noun; from the Cambridge dictionary, verb: TEAR definition: 1. to pull or be pulled apart, or to pull pieces off: 2. to move very quickly …).
The bottom line is that Mr. Trump wants something and certainly Canadian critical minerals and water constitute either his entire interest or, at least, his main interest for now, with more to be determined later.
Niall McGee’s February 9, 2025 article for the Globe and Mail provides an overview of the US’s dependence on Canada’s critical minerals,
…
The US relies on Canada for a huge swath of its critical mineral imports, including 40 per cent of its primary nickel for its defence industry, 30 per cent of its uranium, which is used in its nuclear-power fleet, and 79 per cent of its potash for growing crops.
The US produces only small amounts of all three, while Canada is the world’s biggest potash producer, the second biggest in uranium, and number six in nickel.
If the US wants to buy fewer critical minerals from Canada, in many cases it would be forced to source them from hostile countries such as Russia and China.
…
Vancouver-based Teck Resources Ltd. is one of the few North American suppliers of germanium. The critical mineral is used in fibre-optic networks, infrared vision systems, solar panels. The US relies on Canada for 23 per cent of its imports of germanium.
China in December [2024] banned exports of the critical mineral to the US citing national security concerns. The ban raised fears of possible shortages for the US.
“It’s obvious we have a lot of what Trump wants to support America’s ambitions, from both an economic and a geopolitical standpoint,” says Martin Turenne, CEO of Vancouver-based FPX Nickel Corp., which is developing a massive nickel project in British Columbia. [p. B5 paper version]
…
Akshay Kulkarni’s January 15, 2025 article for CBC news online provides more details about British Columbia and its critical minerals, Note: Links have been removed,
…
The premier had suggested Tuesday [January 14, 2025] that retaliatory tariffs and export bans could be part of the response, and cited a smelter operation located in Trail, B.C. [emphasis mine; keep reading], which exports minerals that Eby [Premier of British Columbia, David Eby] said are critical for the U.S.
…
The U.S. and Canada both maintain lists of critical minerals — ranging from aluminum and tin to more obscure elements like ytterbium and hafnium — that both countries say are important for defence, energy production and other key areas.
Michael Goehring, the president of the Mining Association of B.C., said B.C. has access to or produces 16 of the 50 minerals considered critical by the U.S.
[Photo caption from the CBC article: Individual atoms of silicon and germanium are seen following an atom probe tomography (APT) measurement at Polytechnique Montreal. Both minerals are manufactured in B.C. (Christinne Muschi/The Canadian Press)]
“We have 17 critical mineral projects on the horizon right now, along with a number of precious metal projects,” he told CBC News on Tuesday [January 14, 2025].
“The 17 critical mineral projects alone represent some $32 billion in potential investment for British Columbia,” he added.
John Steen, director of the Bradshaw Research Institute for Minerals and Mining at the University of B.C., pointed to germanium — which is manufactured at Teck’s facility in Trail [emphasis mine] — as one of the materials most important to U.S industry.
…
There are a number of mines and manufacturing facilities across B.C. and Canada for critical minerals.
The B.C. government says the province is Canada’s largest producer of copper, and only producer of molybdenum, which are both considered critical minerals.
…
There’s also graphite, not in BC but in Québec. This April 8, 2023 article by Christian Paas-Lang for CBC news online focuses largely on issues of how to access and exploit graphite and also, importantly, Indigenous concerns, but this excerpt focuses on graphite as a critical mineral,
A mining project might not be what comes to mind when you think of the transition to a lower emissions economy. But embedded in electric vehicles, solar panels and hydrogen fuel storage are metals and minerals that come from mines like the one in Lac-des-Îles, Que.
The graphite mine, owned by the company Northern Graphite, is just one of many projects aimed at extracting what are now officially dubbed “critical minerals” — substances of significant strategic and economic importance to the future of national economies.
Lac-des-Îles is the only significant graphite mining project in North America, accounting for Canada’s contribution to an industry dominated by China.
…
There was another proposed graphite mine in Québec, which encountered significant push back from the local Indigenous community as noted in my November 26, 2024 posting, “Local resistance to Lomiko Metals’ Outaouais graphite mine.” The posting also provides a very brief update of graphite mining in Canada.
It seems to me that water does not get the attention that it should and that’s why I lead with water in my headline. Eric Reguly’s February 9, 2025 article in the Globe and Mail highlights some of the water issues facing the US, not just Iowa,
…
Water may be the real reason, or one of the top reasons, propelling his [Mr. Trump’s] desire to turn Canada into Minnesota North. Canadians represent 0.5 per cent of the globe’s population yet sit on 20% or more of its fresh water. Vast tracts of the United States routinely suffer from water shortages, which are drying up rivers – the once mighty Colorado River no longer reaches the Pacific Ocean – shrinking aquifers beneath farmland and preventing water-intensive industries from building factories. Warming average temperatures will intensify the shortages. [p. B2 in paper version]
…
Reguly is more interested in the impact water shortages have on industry. He also offers a brief history of US interest in acquiring Canadian water resources dating back to the first North American Free Trade Agreement (NAFTA), which came into effect on January 1, 1994.
A March 6, 2024 article by Elia Nilsen for CNN television news online details Colorado River geography and gives you a sense of just how serious the situation is, Note: Links have been removed,
Seven Western states are starting to plot a future for how much water they’ll draw from the dwindling Colorado River in a warmer, drier world.
The river is the lifeblood for the West – providing drinking water for tens of millions, irrigating crops, and powering homes and industry with hydroelectric dams.
…
This has bought states more time to figure out how to divvy up the river after 2026, when the current operating guidelines expire.
To that end, the four upper basin river states of Colorado, Utah, New Mexico and Wyoming submitted their proposal for how future cuts should be divvied up among the seven states to the federal government on Tuesday [March 5, 2024], and the three lower basin states of California, Arizona and Nevada submitted their plan on Wednesday [March 6, 2024].
One thing is clear from the competing plans: The two groups of states do not agree so far on who should bear the brunt of future cuts if water levels drop in the Colorado River basin.
…
As of a December 12, 2024 article by Shannon Mullane for watereducationcolorado.org, the states are still wrangling and they are not the only interested parties, Note: A link has been removed,
… officials from seven states are debating the terms of a new agreement for how to store, release and deliver Colorado River water for years to come, and they have until 2026 to finalize a plan. This month, the tone of the state negotiations soured as some state negotiators threw barbs and others called for an end to the political rhetoric and saber-rattling.
…
The state negotiators are not the only players at the table: Tribal leaders, federal officials, environmental organizations, agricultural groups, cities, industrial interests and others are weighing in on the process.
…
Water use from the Colorado River has international implications, as this February 5, 2025 essay for The Conversation (Water is the other US-Mexico border crisis, and the supply crunch is getting worse) by Gabriel Eckstein, professor of law at Texas A&M University, and Rosario Sanchez, senior research scientist at the Texas Water Resources Institute and at Texas A&M University, makes clear, Note: Links have been removed,
…
The Colorado River provides water to more than 44 million people, including seven U.S. and two Mexican states, 29 Indian tribes and 5.5 million acres of farmland. Only about 10% of its total flow reaches Mexico. The river once emptied into the Gulf of California, but now so much water is withdrawn along its course that since the 1960s it typically peters out in the desert.
…
At least 28 aquifers – underground rock formations that contain water – also traverse the border. With a few exceptions, very little information on these shared resources exists. One thing that is known is that many of them are severely overtapped and contaminated.
Nonetheless, reliance on aquifers is growing as surface water supplies dwindle. Some 80% of groundwater used in the border region goes to agriculture. The rest is used by farmers and industries, such as automotive and appliance manufacturers.
Over 10 million people in 30 cities and communities throughout the border region rely on groundwater for domestic use. Many communities, including Ciudad Juarez; the sister cities of Nogales in both Arizona and Sonora; and the sister cities of Columbus in New Mexico and Puerto Palomas in Chihuahua, get all or most of their fresh water from these aquifers.
…
A booming region
About 30 million people live within 100 miles (160 kilometers) of the border on both sides. Over the next 30 years, that figure is expected to double.
Municipal and industrial water use throughout the region is also expected to increase. In Texas’ lower Rio Grande Valley, municipal use alone could more than double by 2040.
At the same time, as climate change continues to worsen, scientists project that snowmelt will decrease and evaporation rates will increase. The Colorado River’s baseflow – the portion of its volume that comes from groundwater, rather than from rain and snow – may decline by nearly 30% in the next 30 years.
Precipitation patterns across the region are projected to be uncertain and erratic for the foreseeable future. This trend will fuel more extreme weather events, such as droughts and floods, which could cause widespread harm to crops, industrial activity, human health and the environment.
Further stress comes from growth and development. Both the Colorado River and Rio Grande are tainted by pollutants from agricultural, municipal and industrial sources. Cities on both sides of the border, especially on the Mexican side, have a long history of dumping untreated sewage into the Rio Grande. Of the 55 water treatment plants located along the border, 80% reported ongoing maintenance, capacity and operating problems as of 2019.
Drought across the border region is already stoking domestic and bilateral tensions. Competing water users are struggling to meet their needs, and the U.S. and Mexico are straining to comply with treaty obligations for sharing water [emphasis mine].
…
Getting back to Canada and water, Reguly’s February 9, 2025 article notes Mr. Trump’s attitude towards our water,
…
Mr. Trump’s transaction-oriented brain knows that water availability translates into job availability. If Canada were forced to export water by bulk to the United States, Canada would in effect be exporting jobs and America absorbing them. In the fall [2024] when he was campaigning, he called British Columbia “essentially a very large faucet” [emphasis mine] that could be used to overcome California’s permanent water deficit.
…
In Canada’s favour, Canadians have been united in their opposition to bulk water exports. That sentiment is codified in the Transboundary Waters Protection Act, which bans large scale removal from waterways shared with the United States. … [p. B2 in paper version]
…
It’s reassuring to read that we have some rules regarding water removal, but British Columbia also has a water treaty with the US, the Columbia River Treaty, and an update to it lingers in limbo, as Kirk Lapointe notes in his February 6, 2025 article for vancouverisawesome.com. Lapointe attributes the delay in ratifying the update to shortcomings on both sides of the negotiating table, while expressing concern over Mr. Trump’s possible machinations should this matter cross his radar.
What about Ukraine’s critical minerals?
A February 13, 2025 article by Geoff Nixon for CBC news online provides some of the latest news on the situation between the US and Ukraine, Note: Links have been removed,
Ukraine has clearly grabbed the attention of U.S. President Donald Trump with its apparent willingness to share access to rare-earth resources with Washington, in exchange for its continued support and security guarantees.
Trump wants what he calls “equalization” for support the U.S. has provided to Ukraine in the wake of Russia’s full-scale invasion. And he wants this payment in the form of Ukraine’s rare earth minerals, metals “and other things,” as the U.S. leader put it last week.
U.S. Treasury Secretary Scott Bessent has travelled to Ukraine to discuss the proposition, which was first raised with Trump last fall [2024], telling reporters Wednesday [February 12, 2025] that he hoped a deal could be reached within days.
Bessent says such a deal could provide a “security shield” in post-war Ukraine. Ukrainian President Volodymyr Zelenskyy, meanwhile, said in his daily address that it would both strengthen Ukraine’s security and “give new momentum to our economic relations.”
But just how much trust can Kyiv put in a Trump-led White House to provide support to Ukraine, now and in the future? Ukraine may not be in a position to back away from the offer, with Trump’s interest piqued and U.S. support remaining critical for Kyiv after nearly three years of all-out war with Russia.
“I think the problem for Ukraine is that it doesn’t really have much choice,” said Oxana Shevel, an associate professor of political science at Boston’s Tufts University.
…
Then there’s the issue of the Ukrainian minerals, which have to remain in Kyiv’s hands in order for the U.S. to access them — a point Zelenskyy and other Ukraine officials have underlined.
There are more than a dozen elements considered to be rare earths, and Ukraine’s Institute of Geology says those that can be found in Ukraine include lanthanum, cerium, neodymium, erbium and yttrium. EU-funded research also indicates that Ukraine has scandium reserves. But the details of the data are classified.
Rare earths are used in manufacturing magnets that turn power into motion for electric vehicles, in cellphones and other electronics, as well as for scientific and industrial applications.
…
Trump has said he wants the equivalent of $500 billion US in rare earth minerals.
Yuriy Gorodnichenko, a professor of economics at the University of California, Berkeley, says any effort to develop and extract these resources won’t happen overnight and it’s unclear how plentiful they are.
“The fact is, nobody knows how much you have for sure there and what is the value of that,” he said in an interview.
“It will take years to do geological studies,” he said. “Years to build extraction facilities.”
…
Just how desperate is the US?
Yes, the United States has oil, but it doesn’t have much in the way of the materials it needs for new technologies, and it’s running out of something very basic: water.
I don’t know how desperate the US is, but Mr. Trump’s flailings suggest that the answer is very, very desperate.
There’s been quite the kerfuffle over DeepSeek during the last few days. This January 27, 2025 article by Alexandra Mae Jones for the Canadian Broadcasting Corporation (CBC) news online was my introduction to DeepSeek AI, Note: A link has been removed,
There’s a new player in AI on the world stage: DeepSeek, a Chinese startup that’s throwing tech valuations into chaos and challenging U.S. dominance in the field with an open-source model that they say they developed for a fraction of the cost of competitors.
DeepSeek’s free AI assistant — which by Monday [January 27, 2025] had overtaken rival ChatGPT to become the top-rated free application on Apple’s App Store in the United States — offers the prospect of a viable, cheaper AI alternative, raising questions on the heavy spending by U.S. companies such as Apple and Microsoft, amid a growing investor push for returns.
U.S. stocks dropped sharply on Monday [January 27, 2025], as the surging popularity of DeepSeek sparked a sell-off in U.S. chipmakers.
…
“[DeepSeek] performs as well as the leading models in Silicon Valley and in some cases, according to their claims, even better,” Sheldon Fernandez, co-founder of DarwinAI, told CBC News. “But they did it with a fractional amount of the resources is really what is turning heads in our industry.”
…
What is DeepSeek?
Little is known about the small Hangzhou startup behind DeepSeek, which was founded out of a hedge fund in 2023, but largely develops open-source AI models.
Its researchers wrote in a paper last month that the DeepSeek-V3 model, launched on Jan. 10 [2025], cost less than $6 million US to develop and uses less data than competitors, running counter to the assumption that AI development will eat up increasing amounts of money and energy.
Some analysts are skeptical about DeepSeek’s $6 million claim, pointing out that this figure only covers computing power. But Fernandez said that even if you triple DeepSeek’s cost estimates, it would still cost significantly less than its competitors.
The open source release of DeepSeek-R1, which came out on Jan. 20 [2025] and uses DeepSeek-V3 as its base, also means that developers and researchers can look at its inner workings, run it on their own infrastructure and build on it, although its training data has not been made available.
…
“Instead of paying OpenAI $20 a month or $200 a month for the latest advanced versions of these models, [people] can really get these types of features for free. And so it really upends a lot of the business model that a lot of these companies were relying on to justify their very high valuations.”
…
A key difference between DeepSeek’s AI assistant, R1, and other chatbots like OpenAI’s ChatGPT is that DeepSeek lays out its reasoning when it answers prompts and questions, something developers are excited about.
“The dealbreaker is the access to the raw thinking steps,” Elvis Saravia, an AI researcher and co-founder of the U.K.-based AI consulting firm DAIR.AI, wrote on X, adding that the response quality was “comparable” to OpenAI’s latest reasoning model, o1.
U.S. dominance in AI challenged
One of the reasons DeepSeek is making headlines is because its development occurred despite U.S. actions to keep Americans at the top of AI development. In 2022, the U.S. curbed exports of computer chips to China, hampering their advanced supercomputing development.
…
The latest AI models from DeepSeek are widely seen to be competitive with those of OpenAI and Meta, which rely on high-end computer chips and extensive computing power.
…
Christine Mui, in a January 27, 2025 article for Politico, notes the stock ‘crash’ taking place while focusing on the US policy implications, Note: Links set by Politico have been removed while I have added one link,
A little-known Chinese artificial intelligence startup shook the tech world this weekend by releasing an OpenAI-like assistant, which shot to the No.1 ranking on Apple’s app store and caused American tech giants’ stocks to tumble.
From Washington’s perspective, the news raised an immediate policy alarm: It happened despite consistent, bipartisan efforts to stifle AI progress in China.
…
In tech terms, what freaked everyone out about DeepSeek’s R1 model is that it replicated — and in some cases, surpassed — the performance of OpenAI’s cutting-edge o1 product across a host of performance benchmarks, at a tiny fraction of the cost.
The business takeaway was straightforward. DeepSeek’s success shows that American companies might not need to spend nearly as much as expected to develop AI models. That both intrigues and worries investors and tech leaders.
…
The policy implications, though, are more complex. Washington’s rampant anxiety about beating China has led to policies that the industry has very mixed feelings about.
On one hand, most tech firms hate the export controls that stop them from selling as much to the world’s second-largest economy, and force them to develop new products if they want to do business with China. If DeepSeek shows those rules are pointless, many would be delighted to see them go away.
On the other hand, anti-China, protectionist sentiment has encouraged Washington to embrace a whole host of industry wishlist items, from a lighter-touch approach to AI rules to streamlined permitting for related construction projects. Does DeepSeek mean those, too, are failing? Or does it trigger a doubling-down?
DeepSeek’s success truly seems to challenge the belief that the future of American AI demands ever more chips and power. That complicates Trump’s interest in rapidly building out that kind of infrastructure in the U.S.
Why pour $500 billion into the Trump-endorsed “Stargate” mega project [announced by Trump on January 21, 2025] — and why would the market reward companies like Meta that spend $65 billion in just one year on AI — if DeepSeek claims it only took $5.6 million and second-tier Nvidia chips to train one of its latest models? (U.S. industry insiders dispute the startup’s figures and claim they don’t tell the full story, but even at 100 times that cost, it would be a bargain.)
…
Tech companies, of course, love the recent bloom of federal support, and it’s unlikely they’ll drop their push for more federal investment to match anytime soon. Marc Andreessen, a venture capitalist and Trump ally, argued today that DeepSeek should be seen as “AI’s Sputnik moment,” one that raises the stakes for the global competition.
That would strengthen the case that some American AI companies have been pressing for the new administration to invest government resources into AI infrastructure (OpenAI), tighten restrictions on China (Anthropic) and ease up on regulations to ensure their developers build “artificial general intelligence” before their geopolitical rivals.
…
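To put the figures quoted in the Politico piece side by side, here is a back-of-envelope sketch. It uses only the numbers in the excerpt above (DeepSeek’s claimed $5.6 million, the skeptics’ hundredfold markup, Meta’s $65 billion single-year spend, and the $500 billion Stargate project); it is an arithmetic illustration, not an independent estimate.

```python
# Back-of-envelope comparison using only the figures quoted in the Politico excerpt.
# All values in US dollars.
deepseek_claimed_training_cost = 5.6e6   # DeepSeek's claimed cost to train one model
skeptical_multiplier = 100               # "even at 100 times that cost, it would be a bargain"
meta_annual_ai_spend = 65e9              # Meta's reported one-year AI spend
stargate_project = 500e9                 # Trump-endorsed "Stargate" mega project

adjusted_cost = deepseek_claimed_training_cost * skeptical_multiplier
print(f"100x the DeepSeek claim: ${adjusted_cost / 1e9:.2f} billion")                        # $0.56 billion
print(f"As a share of Meta's one-year AI spend: {adjusted_cost / meta_annual_ai_spend:.1%}")  # ~0.9%
print(f"As a share of the Stargate commitment:  {adjusted_cost / stargate_project:.2%}")      # ~0.11%
```

Even with the skeptics’ hundredfold markup applied, the claimed training cost comes to roughly one percent of Meta’s single-year AI budget, which is what makes the comparison so arresting.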
The British Broadcasting Corporation’s (BBC) Peter Hoskins & Imran Rahman-Jones provided a European perspective and some additional information in their January 27, 2025 article for BBC news online, Note: Links have been removed,
US tech giant Nvidia lost over a sixth of its value after the surging popularity of a Chinese artificial intelligence (AI) app spooked investors in the US and Europe.
DeepSeek, a Chinese AI chatbot reportedly made at a fraction of the cost of its rivals, launched last week but has already become the most downloaded free app in the US.
AI chip giant Nvidia and other tech firms connected to AI, including Microsoft and Google, saw their values tumble on Monday [January 27, 2025] in the wake of DeepSeek’s sudden rise.
In a separate development, DeepSeek said on Monday [January 27, 2025] it will temporarily limit registrations because of “large-scale malicious attacks” on its software.
The DeepSeek chatbot was reportedly developed for a fraction of the cost of its rivals, raising questions about the future of America’s AI dominance and the scale of investments US firms are planning.
…
DeepSeek is powered by the open source DeepSeek-V3 model, which its researchers claim was trained for around $6m – significantly less than the billions spent by rivals.
But this claim has been disputed by others in AI.
The researchers say they use already existing technology, as well as open source code – software that can be used, modified or distributed by anybody free of charge.
DeepSeek’s emergence comes as the US is restricting the sale of the advanced chip technology that powers AI to China.
To continue their work without steady supplies of imported advanced chips, Chinese AI developers have shared their work with each other and experimented with new approaches to the technology.
This has resulted in AI models that require far less computing power than before.
It also means that they cost a lot less than previously thought possible, which has the potential to upend the industry.
After DeepSeek-R1 was launched earlier this month, the company boasted of “performance on par with” one of OpenAI’s latest models when used for tasks such as maths, coding and natural language reasoning.
…
In Europe, Dutch chip equipment maker ASML ended Monday’s trading with its share price down by more than 7% while shares in Siemens Energy, which makes hardware related to AI, had plunged by a fifth.
“This idea of a low-cost Chinese version hasn’t necessarily been forefront, so it’s taken the market a little bit by surprise,” said Fiona Cincotta, senior market analyst at City Index.
“So, if you suddenly get this low-cost AI model, then that’s going to raise concerns over the profits of rivals, particularly given the amount that they’ve already invested in more expensive AI infrastructure.”
Singapore-based technology equity adviser Vey-Sern Ling told the BBC it could “potentially derail the investment case for the entire AI supply chain”.
…
Who founded DeepSeek?
The company was founded in 2023 by Liang Wenfeng in Hangzhou, a city in southeastern China.
The 40-year-old, an information and electronic engineering graduate, also founded the hedge fund that backed DeepSeek.
He reportedly built up a store of Nvidia A100 chips, now banned from export to China.
Experts believe this collection – which some estimates put at 50,000 – led him to launch DeepSeek, by pairing these chips with cheaper, lower-end ones that are still available to import.
Mr Liang was recently seen at a meeting between industry experts and the Chinese premier Li Qiang.
In a July 2024 interview with The China Academy, Mr Liang said he was surprised by the reaction to the previous version of his AI model.
“We didn’t expect pricing to be such a sensitive issue,” he said.
“We were simply following our own pace, calculating costs, and setting prices accordingly.”
…
A January 28, 2025 article by Daria Solovieva for salon.com covers much the same territory as the others and includes a few details about security issues,
…
The pace at which U.S. consumers have embraced DeepSeek is raising national security concerns similar to those surrounding TikTok, the social media platform that faces a ban unless it is sold to a non-Chinese company.
The U.S. Supreme Court this month upheld a federal law that requires TikTok’s sale. The Court sided with the U.S. government’s argument that the app can collect and track data on its 170 million American users. President Donald Trump has paused enforcement of the ban until April to try to negotiate a deal.
But “the threat posed by DeepSeek is more direct and acute than TikTok,” Luke de Pulford, co-founder and executive director of non-profit Inter-Parliamentary Alliance on China, told Salon.
DeepSeek is a fully Chinese company and is subject to Communist Party control, unlike TikTok which positions itself as independent from parent company ByteDance, he said.
“DeepSeek logs your keystrokes, device data, location and so much other information and stores it all in China,” de Pulford said. “So you’ll never know if the Chinese state has been crunching your data to gain strategic advantage, and DeepSeek would be breaking the law if they told you.”
I wonder if other AI companies in other countries also log keystrokes, etc. Is it theoretically possible that one of those governments or their agencies could gain access to your data? The situation is obvious where China is concerned, but people in other countries may face the same issues.
Censorship: DeepSeek and ChatGPT
Anis Heydari’s January 28, 2025 article for CBC news online reveals some surprising results from a head-to-head comparison between DeepSeek and ChatGPT,
The Chinese-made AI chatbot DeepSeek may not always answer some questions about topics that are often censored by Beijing, according to tests run by CBC News and The Associated Press, and is providing different information than its U.S.-owned competitor ChatGPT.
The new, free chatbot has sparked discussions about the competition between China and the U.S. in AI development, with many users flocking to test it.
But experts warn users should be careful with what information they provide to such software products.
It is also “a little bit surprising,” according to one researcher, that topics which are often censored within China are seemingly also being restricted elsewhere.
“A lot of services will differentiate based on where the user is coming from when deciding to deploy censorship or not,” said Jeffrey Knockel, who researches software censorship and surveillance at the Citizen Lab at the University of Toronto’s Munk School of Global Affairs & Public Policy.
“With this one, it just seems to be censoring everyone.”
…
Both CBC News and The Associated Press posed questions to DeepSeek and OpenAI’s ChatGPT, with mixed and differing results.
For example, DeepSeek seemed to indicate an inability to answer fully when asked “What does Winnie the Pooh mean in China?” For many Chinese people, the Winnie the Pooh character is used as a playful taunt of President Xi Jinping, and social media searches about that character were previously, briefly banned in China.
DeepSeek said the bear is a beloved cartoon character that is adored by countless children and families in China, symbolizing joy and friendship.
Then, abruptly, it added the Chinese government is “dedicated to providing a wholesome cyberspace for its citizens,” and that all online content is managed under Chinese laws and socialist core values, with the aim of protecting national security and social stability.
CBC News was unable to produce this response. DeepSeek instead said “some internet users have drawn comparisons between Winnie the Pooh and Chinese leaders, leading to increased scrutiny and restrictions on the character’s imagery in certain contexts,” when asked the same question on an iOS app on a CBC device in Canada.
…
Asked if Taiwan is a part of China — another touchy subject — it [DeepSeek] began by saying the island’s status is a “complex and sensitive issue in international relations,” adding that China claims Taiwan, but that the island itself operates as a “separate and self-governing entity” which many people consider to be a sovereign nation.
But as that answer was being typed out, for both CBC and the AP, it vanished and was replaced with: “Sorry, that’s beyond my current scope. Let’s talk about something else.”
…
… Brent Arnold, a data breach lawyer in Toronto, says there are concerns about DeepSeek, which explicitly says in its privacy policy that the information it collects is stored on servers in China.
That information can include the type of device used, user “keystroke patterns,” and even “activities on other websites and apps or in stores, including the products or services you purchased, online or in person” depending on whether advertising services have shared those with DeepSeek.
“The difference between this and another AI company having this is now, the Chinese government also has it,” said Arnold.
While much, if not all, of the data DeepSeek collects is the same as that of U.S.-based companies such as Meta or Google, Arnold points out that — for now — the U.S. has checks and balances if governments want to obtain that information.
“With respect to America, we assume the government operates in good faith if they’re investigating and asking for information, they’ve got a legitimate basis for doing so,” he said.
Right now, Arnold says it’s not accurate to compare Chinese and U.S. authorities in terms of their ability to take personal information. But that could change.
“I would say it’s a false equivalency now. But in the months and years to come, we might start to say you don’t see a whole lot of difference in what one government or another is doing,” he said.
…
Graham Fraser’s January 28, 2025 article comparing DeepSeek to the others (OpenAI’s ChatGPT and Google’s Gemini) for BBC news online took a different approach,
…
Writing Assistance
When you ask ChatGPT what the most popular reasons to use ChatGPT are, it says that assisting people to write is one of them.
From gathering and summarising information in a helpful format to even writing blog posts on a topic, ChatGPT has become an AI companion for many across different workplaces.
As a proud Scottish football [soccer] fan, I asked ChatGPT and DeepSeek to summarise the best Scottish football players ever, before asking the chatbots to “draft a blog post summarising the best Scottish football players in history”.
DeepSeek responded in seconds, with a top ten list – Kenny Dalglish of Liverpool and Celtic was number one. It helpfully summarised which position the players played in, their clubs, and a brief list of their achievements.
DeepSeek also detailed two non-Scottish players – Rangers legend Brian Laudrup, who is Danish, and Celtic hero Henrik Larsson. For the latter, it added “although Swedish, Larsson is often included in discussions of Scottish football legends due to his impact at Celtic”.
For its subsequent blog post, it did go into detail of Laudrup’s nationality before giving a succinct account of the careers of the players.
ChatGPT’s answer to the same question contained many of the same names, with “King Kenny” once again at the top of the list.
Its detailed blog post briefly and accurately went into the careers of all the players.
It concluded: “While the game has changed over the decades, the impact of these Scottish greats remains timeless.” Indeed.
For this fun test, DeepSeek was certainly comparable to its best-known US competitor.
Coding
…
Brainstorming ideas
…
Learning and research
…
Steaming ahead
The tasks I set the chatbots were simple but they point to something much more significant – the winner of the so-called AI race is far from decided.
For all the vast resources US firms have poured into the tech, their Chinese rival has shown their achievements can be emulated.
…
Reception from the science community
Days before the news outlets discovered DeepSeek, the company published a paper about its Large Language Models (LLMs) and its new chatbot on arXiv. Here’s a little more information,
DeepSeek-R1: Incentivizing Reasoning Capability in LLMs via Reinforcement Learning
[over 100 authors are listed]
We introduce our first-generation reasoning models, DeepSeek-R1-Zero and DeepSeek-R1. DeepSeek-R1-Zero, a model trained via large-scale reinforcement learning (RL) without supervised fine-tuning (SFT) as a preliminary step, demonstrates remarkable reasoning capabilities. Through RL, DeepSeek-R1-Zero naturally emerges with numerous powerful and intriguing reasoning behaviors. However, it encounters challenges such as poor readability, and language mixing. To address these issues and further enhance reasoning performance, we introduce DeepSeek-R1, which incorporates multi-stage training and cold-start data before RL. DeepSeek-R1 achieves performance comparable to OpenAI-o1-1217 on reasoning tasks. To support the research community, we open-source DeepSeek-R1-Zero, DeepSeek-R1, and six dense models (1.5B, 7B, 8B, 14B, 32B, 70B) distilled from DeepSeek-R1 based on Qwen and Llama.
A Chinese-built large language model called DeepSeek-R1 is thrilling scientists as an affordable and open rival to ‘reasoning’ models such as OpenAI’s o1.
These models generate responses step-by-step, in a process analogous to human reasoning. This makes them more adept than earlier language models at solving scientific problems and could make them useful in research. Initial tests of R1, released on 20 January, show that its performance on certain tasks in chemistry, mathematics and coding is on par with that of o1 — which wowed researchers when it was released by OpenAI in September.
“This is wild and totally unexpected,” Elvis Saravia, an AI researcher and co-founder of the UK-based AI consulting firm DAIR.AI, wrote on X.
R1 stands out for another reason. DeepSeek, the start-up in Hangzhou that built the model, has released it as ‘open-weight’, meaning that researchers can study and build on the algorithm. Published under an MIT licence, the model can be freely reused but is not considered fully open source, because its training data has not been made available.
“The openness of DeepSeek is quite remarkable,” says Mario Krenn, leader of the Artificial Scientist Lab at the Max Planck Institute for the Science of Light in Erlangen, Germany. By comparison, o1 and other models built by OpenAI in San Francisco, California, including its latest effort o3 are “essentially black boxes”, he says.
DeepSeek hasn’t released the full cost of training R1, but it is charging people using its interface around one-thirtieth of what o1 costs to run. The firm has also created mini ‘distilled’ versions of R1 to allow researchers with limited computing power to play with the model. An “experiment that cost more than £300 with o1, cost less than $10 with R1,” says Krenn. “This is a dramatic difference which will certainly play a role in its future adoption.”
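Because the distilled R1 checkpoints are open-weight, the point about researchers running the model “on their own infrastructure” is quite literal. Here is a minimal sketch using the Hugging Face transformers library; the repository name and the <think>…</think> convention for separating the reasoning trace from the final answer are my assumptions about the release, not details taken from the articles quoted above, and should be checked against the actual model card.

```python
# Minimal sketch: load a small distilled DeepSeek-R1 model and split its reasoning
# trace from its final answer. Assumes (1) the checkpoint is published on the
# Hugging Face Hub under the name below and (2) the model wraps its chain of
# thought in <think>...</think> tags; both are assumptions to verify.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"  # assumed repository name
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

messages = [{"role": "user", "content": "Why does ice float on water?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
text = tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

# If the model emits its reasoning inside <think>...</think>, separate it out.
if "</think>" in text:
    reasoning, _, answer = text.partition("</think>")
    print("Reasoning trace:\n", reasoning.replace("<think>", "").strip())
    print("\nFinal answer:\n", answer.strip())
else:
    print(text)
```

The smaller distilled checkpoints can plausibly run on a single consumer GPU, which is presumably what makes the under-$10 experiments Krenn describes feasible.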
Different strokes for different folks or, in this case, somewhat different approaches to healing different wounds.
Infected wounds
A July 17, 2024 news item on Nanowerk highlights work from China’s Research Center for Neutrophil Engineering Technology (affiliated with Suzhou Hospital of Nanjing Medical University), Note: A link has been removed,
Infectious wounds represent a critical challenge in healthcare, especially for diabetic patients grappling with ineffective antibiotics and escalating drug resistance. Conventional therapies often inadequately address deep tissue infections, highlighting the need for more innovative solutions. Engineered nanovesicles (NVs) from activated neutrophils provide a precise mechanism to combat pathogens deeply embedded in tissues, potentially revolutionizing the management of complex infectious wounds and boosting overall treatment efficacy.
Researchers at the Research Center for Neutrophil Engineering Technology have achieved a significant advancement in medical nanotechnology. Their findings, published in the journal Burns & Trauma (“Engineered nanovesicles from activated neutrophils with enriched bactericidal proteins have molecular debridement ability and promote infectious wound healing”), detail the creation of novel neutrophil-engineered NVs.
This study reveals that engineered NVs derived from activated neutrophils not only mimic the physical properties of exosomes but surpass them due to their rich content of bactericidal proteins. Extensively tested both in vitro and in vivo, these NVs effectively combat key pathogens like Staphylococcus aureus and Escherichia coli, which contribute to deep tissue infections. The NVs promote rapid debridement, significantly reduce bacterial populations, and boost collagen deposition, thus hastening the healing process. This research positions NVs as a formidable alternative to traditional antibiotics, introducing a novel method for treating resistant infections and advancing the field of wound care.
Dr. Bingwei Sun, the lead researcher, emphasized, “These engineered NVs mark a major advancement in the management of infectious diseases. By targeting the infection site with high levels of bactericidal proteins, we achieve swift and effective healing, thereby opening new paths for the treatment of chronic and resistant infections.”
The advent of activated neutrophil-derived NVs signifies a major leap in medical technology, potentially reducing healthcare costs and enhancing patient outcomes. This innovation not only promises to improve wound healing in diabetic and other chronic infection patients but also sets the stage for further development of biologically inspired therapeutic strategies.
A July 17, 2024 news item on phys.org announces work from another team developing its own approach to healing wounds, albeit, a different category of wounds,
Diabetic wounds are notoriously challenging to treat, due to prolonged inflammation and a high risk of infection. Traditional treatments generally offer only passive protection and fail to dynamically interact with the wound environment.
In a new article published in Burns & Trauma on June 5, 2024, a research team from Mudanjiang Medical University and allied institutions assesses the effectiveness of PLLA nanofibrous membranes.
Infused with curcumin and silver nanoparticles, these membranes are designed to substantially enhance the healing processes in diabetic wounds by targeting fundamental issues like excessive inflammation and infection.
This research centered on developing PLLA/C/Ag nanofibrous membranes through air-jet spinning, achieving a consistent fiber distribution essential for effective therapeutic delivery. The membranes boast dual benefits: antioxidant properties that reduce harmful reactive oxygen species in wound environments and potent antibacterial activity that decreases infection risks.
…
A July 17, 2024 Maximum Academic Press ‘press release‘ on EurekAlert provides more information about the research, Note 1: This press release appears to have originated the news item, which was then edited and rewritten; Note 2: Links have been removed,
In a pioneering study, researchers have developed a poly (L-lactic acid) (PLLA) nanofibrous membrane enhanced with curcumin and silver nanoparticles (AgNPs), aimed at improving the healing of diabetic wounds. This advanced dressing targets critical barriers such as inflammation, oxidative stress, and bacterial infections, which hinder the recovery process in diabetic patients. The study’s results reveal a promising therapeutic strategy that could revolutionize care for diabetes-related wounds.
Diabetic wounds are notoriously challenging to heal, with prolonged inflammation and a high risk of infection. Traditional treatments generally offer only passive protection and fail to dynamically interact with the wound environment. The creation of bioactive dressings like the poly (L-lactic acid) (PLLA) nanofibrous membranes incorporated with AgNPs and curcumin (PLLA/C/Ag) membranes signifies a crucial shift towards therapies that actively correct imbalances in the wound healing process, offering a more effective solution for managing diabetic wounds.
Published (DOI: 10.1093/burnst/tkae009) in Burns & Trauma on June 5, 2024, this trailblazing research by a team from Mudanjiang Medical University and allied institutions assesses the effectiveness of PLLA nanofibrous membranes. Infused with curcumin and silver nanoparticles, these membranes are designed to substantially enhance the healing processes in diabetic wounds by targeting fundamental issues like excessive inflammation and infection.
This research centered on developing PLLA/C/Ag nanofibrous membranes through air-jet spinning, achieving a consistent fiber distribution essential for effective therapeutic delivery. The membranes boast dual benefits: antioxidant properties that reduce harmful reactive oxygen species in wound environments and potent antibacterial activity that decreases infection risks. In vivo tests on diabetic mice demonstrated the membranes’ capability to promote crucial healing processes such as angiogenesis and collagen deposition. These findings illustrate that PLLA/C/Ag membranes not only protect wounds but also actively support and expedite the healing process, marking them as a significant therapeutic innovation for diabetic wound management with potential for broader chronic wound care applications.
Dr. Yanhui Chu, a principal investigator of the study, highlighted the importance of these developments: “The PLLA/C/Ag membranes are a significant breakthrough in diabetic wound care. Their ability to effectively modulate the wound environment and enhance healing could establish a new standard in treatment, providing hope to millions affected by diabetes-related complications.”
The deployment of PLLA/C/Ag nanofibrous membranes in clinical environments could transform the treatment of diabetic wounds, offering a more active and effective approach. Beyond diabetes management, this technology has the potential for extensive applications in various chronic wounds, paving the way for future breakthroughs in bioactive wound dressings. This study not only progresses our understanding of wound management but also paves new paths for developing adaptive treatments for complex wound scenarios.
As I think most people know, publishing of any kind is a tough business, particularly these days. This instability has led to some interesting corporate relationships. For example, Springer Nature (a German-British academic publisher) is the outcome of several mergers, as the Springer Nature Wikipedia entry notes,
The company originates from several journals and publishing houses, notably Springer-Verlag, which was founded in 1842 by Julius Springer in Berlin[4] (the grandfather of Bernhard Springer who founded Springer Publishing in 1950 in New York),[5] Nature Publishing Group which has published Nature since 1869,[6] and Macmillan Education, which goes back to Macmillan Publishers founded in 1843.[7]
Springer Nature was formed in 2015 by the merger of Nature Publishing Group, Palgrave Macmillan, and Macmillan Education (held by Holtzbrinck Publishing Group) with Springer Science+Business Media (held by BC Partners). Plans for the merger were first announced on 15 January 2015.[8] The transaction was concluded in May 2015 with Holtzbrinck having the majority 53% share.[9]
…
Now you have what was an independent science journal, Nature, owned by Springer Nature. By the way, Springer Nature also acquired Scientific American, another major science publication.
Relatedly, seeing Maximum Academic Press as the issuer for the press releases mentioned here aroused my curiosity. I haven’t stumbled across the company before but found this on the company’s About Us webpage, Note: Links have been removed,
Maximum Academic Press (MAP) is an independent publishing company with focus on publishing golden open access academic journals. From 2020 to now, MAP has successfully launched 24 academic journals which cover the research fields of agriculture, biology, environmental sciences, engineering and humanities and social sciences.
Professor Zong-Ming (Max) Cheng, chief editor and founder of MAP, who earned his Ph.D from Cornell University in 1991 and worked as an Assistant, Associate and Professor at North Dakota State University and University of Tennessee for over 30 years. Prior to establishing MAP, Dr. Cheng launched Horticulture Research (initially published by Nature Publishing Group) in 2014, Plant Phenomics (published by American Association of Advancement of Sciences, AAAS) in 2019, and BioDesign Research (published by AAAS) in 2020, and served as the Editor-in-Chief, Co-Editors-in-Chief, and the executive editor, respectively. Dr. Cheng wishes to apply all successful experiences in launching and managing these three high quality journals to MAP-published journals with highest quality and ethics standards.
…
It was a little bit of a surprise to see that MAP doesn’t publish the journal, Burns & Trauma, where the studies (cited here) were published. From the Burns & Trauma About the Journal webpage on the Oxford University Press website for Oxford Academic journals,
Aims and scope
Burns & Trauma is an open access, peer-reviewed journal publishing the latest developments in basic, clinical, and translational research related to burns and traumatic injuries, with a special focus on various aspects of biomaterials, tissue engineering, stem cells, critical care, immunobiology, skin transplantation, prevention, and regeneration of burns and trauma injury.
Society affiliations
Burns & Trauma is the official journal of Asia-Pacific Society of Scar Medicine, Chinese Burn Association, Chinese Burn Care and Rehabilitation Association and Chinese Society for Scar Medicine. It is sponsored by the Institute of Burn Research, Southwest Hospital (First Affiliated Hospital of Army Medical University), China.
…
I don’t know what to make of it all but I can safely say scientific publishing has gotten quite complicated since the days that Nature first published its own eponymous journal.
A June 5, 2024 news item on phys.org announces new research into ‘aqueous’ wearable batteries,
Researchers have developed a safer, cheaper, better performing and more flexible battery option for wearable devices. A paper describing the “recipe” for their new battery type was published in the journal Nano Research Energy on June 3 [2024].
Fitness trackers. Smart watches. Virtual-reality headsets. Even smart clothing and implants. Wearable smart devices are everywhere these days. But for greater comfort, reliability and longevity, these devices will require greater levels of flexibility and miniaturization of their energy storage mechanisms, which are often frustratingly bulky, heavy and fragile. On top of this, any improvements cannot come at the expense of safety.
As a result, in recent years, a great deal of battery research has focused on the development of “micro” flexible energy storage devices, or MFESDs. A range of different structures and electrochemical foundations have been explored, and among them, aqueous micro batteries offer many distinct advantages.
Aqueous batteries—those that use a water-based solution as an electrolyte (the medium that allows transport of ions in the battery and thus creating an electric circuit) are nothing new. They have been around since the late 19th century. However, their energy density—or the amount of energy contained in the battery per unit of volume—is too low for use in things like electric vehicles as they would take up too much space. Lithium-ion batteries are far more appropriate for such uses.
At the same time, aqueous batteries are much less flammable, and thus safer, than lithium-ion batteries. They are also much cheaper. As a result of this more robust safety and low cost, aqueous options have increasingly been explored as one of the better options for MFESDs. These are termed aqueous micro batteries, or just AMBs.
“Up till now, sadly, AMBs have not lived up to their potential,” said Ke Niu, a materials scientist with the Guangxi Key Laboratory of Optical and Electronic Materials and Devices at the Guilin University of Technology—one of the lead researchers on the team. “To be able to be used in a wearable device, they need to withstand a certain degree of real-world bending and twisting. But most of those explored so far fail in the face of such stress.”
To overcome this, any fractures or failure points in an AMB would need to be self-healing following such stress. Unfortunately, the self-healing AMBs that have been developed so far have tended to depend on metallic compounds as the carriers of charge in the battery’s electric circuit. This has the undesirable side-effect of strong reaction between the metal’s ions and the materials that the electrodes (the battery’s positive and negative electrical conductors) are made out of. This in turn reduces the battery’s reaction rate (the speed at which the electrochemical reactions at the heart of any battery take place), drastically limiting performance.
“So we started investigating the possibility of non-metallic charge carriers, as these would not suffer from the same difficulties from interaction with the electrodes,” added Junjie Shi, another leading member of the team and a researcher with the School of Physics and Center for Nanoscale Characterization & Devices (CNCD) at the Huazhong University of Science and Technology in Wuhan.
The research team alighted upon ammonium ions, derived from abundantly available ammonium salts, as the optimal charge carriers. They are far less corrosive than other options and have a wide electrochemical stability window.
“But ammonium ions are not the only ingredient in the recipe needed to make our batteries self-healing,” said Long Zhang, the third leading member of the research team, also at CNCD.
For that, the team incorporated the ammonium salts into a hydrogel—a polymer material that can absorb and retain a large amount of water without disturbing its structure. This gives hydrogels impressive flexibility—delivering precisely the sort of self-healing character needed. Gelatin is probably the most well-known hydrogel, although the researchers in this case opted for a polyvinyl alcohol hydrogel (PVA) for its great strength and low cost.
To optimize compatibility with the ammonium electrolyte, titanium carbide—a ‘2D’ nanomaterial with only a single layer of atoms—was chosen for the anode (the negative electrode) material for its excellent conductivity. Meanwhile manganese dioxide, already commonly used in dry cell batteries, was woven into a carbon nanotube matrix (again to improve conductivity) for the cathode (the positive electrode).
Testing of the prototype self-healing battery showed it exhibited excellent energy density, power density, cycle life, flexibility, and self-healing even after ten self-healing cycles.
The team now aims to further develop and optimise their prototype in preparation for commercial production.
Nano Research Energy is launched by Tsinghua University Press and exclusively available via SciOpen, aiming at being an international, open-access and interdisciplinary journal. We will publish research on cutting-edge advanced nanomaterials and nanotechnology for energy. It is dedicated to exploring various aspects of energy-related research that utilizes nanomaterials and nanotechnology, including but not limited to energy generation, conversion, storage, conservation, clean energy, etc. Nano Research Energy will publish four types of manuscripts, that is, Communications, Research Articles, Reviews, and Perspectives in an open-access form.
SciOpen is a professional open access resource for discovery of scientific and technical content published by the Tsinghua University Press and its publishing partners, providing the scholarly publishing community with innovative technology and market-leading capabilities. SciOpen provides end-to-end services across manuscript submission, peer review, content hosting, analytics, and identity management and expert advice to ensure each journal’s development by offering a range of options across all functions as Journal Layout, Production Services, Editorial Services, Marketing and Promotions, Online Functionality, etc. By digitalizing the publishing process, SciOpen widens the reach, deepens the impact, and accelerates the exchange of ideas.
This paper is open access by means of a “Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited.”
Nanotechnology’s enormous potential across various sectors has long attracted the eye of investors, keen to capitalise on its commercial potency.
Yet the initial propulsion provided by traditional venture capital avenues was reined back when the reality of long development timelines, regulatory hurdles, and difficulty in translating scientific advances into commercially viable products became apparent.
While the initial flurry of activity declined in the early part of the 21st century, a new kid on the investing block has proved an enticing option beyond traditional funding methods.
Corporate venture capital has, over the last 10 years, emerged as a key plank in turning ideas into commercial reality.
Simply put, corporate venture capital (CVC) has seen large corporations, recognising the strategic value of nanotechnology, establish their own VC arms to invest in promising start-ups.
The likes of Samsung, Johnson & Johnson and BASF have all sought to get an edge on their competition by sinking money into start-ups in nano and other technologies, which could deliver benefits to them in the long term.
…
Unlike traditional VC firms, CVCs invest with a strategic lens, aligning their investments with their core business goals. For instance, BASF’s venture capital arm, BASF Venture Capital, focuses on nanomaterials with applications in coatings, chemicals, and construction.
It has an evergreen EUR 250 million fund available and will consider everything from seed to Series B investment opportunities.
…
Samsung Ventures takes a similar approach, explaining: “Our major investment areas are in semiconductors, telecommunication, software, internet, bioengineering and the medical industry from start-ups to established companies that are about to be listed on the stock market.
…
While historically concentrated in North America and Europe, CVC activity in nanotechnology is expanding to Asia, with China being a major player.
China has, perhaps not surprisingly, seen considerable growth in nano over the last decade, and few will bet against it being the primary driver of innovation over the next 10 years.
As ever, the long development cycles of emerging nano breakthroughs can frequently deter some CVCs with shorter investment horizons.
…
2023 Nanotechnology patent applications: which countries top the list?
A March 28, 2024 article from statnano.com provides interesting data concerning patent applications,
In 2023, a total of 18,526 nanotechnology patent applications were published at the United States Patent and Trademark Office (USPTO) and the European Patent Office (EPO). The United States accounted for approximately 40% of these nanotechnology patent publications, followed by China, South Korea, and Japan in the next positions.
According to a statistical analysis conducted by StatNano using data from the Orbit database, the USPTO published 84% of the 18,526 nanotechnology patent applications in 2023, which is more than five times the number published by the EPO. However, the EPO saw a nearly 17% increase in nanotechnology patent publications compared to the previous year, while the USPTO’s growth was around 4%.
Nanotechnology patents are defined, based on the ISO/TS 18110 standard, as those having at least one claim related to nanotechnology or patents classified with an IPC classification code related to nanotechnology, such as B82.
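The IPC half of that definition is mechanical enough to sketch in code. The minimal filter below is purely illustrative: the record structure and sample entries are invented for the example, and the “at least one claim related to nanotechnology” half of the definition requires human or text-analysis review that a prefix check cannot capture. It is not StatNano’s actual methodology or dataset.

```python
# Illustrative filter: flag patent records whose IPC codes fall under class B82
# (the IPC class covering nanostructures, e.g. the B82B and B82Y subclasses).
# Record format and sample data are invented for this example.
from typing import Iterable

def is_nanotech_by_ipc(ipc_codes: Iterable[str]) -> bool:
    """Return True if any IPC code belongs to class B82."""
    return any(code.strip().upper().startswith("B82") for code in ipc_codes)

sample_records = [
    {"publication": "US-2023-0000001-A1", "ipc": ["B82Y 30/00", "C01B 32/158"]},
    {"publication": "EP-4000000-A1",      "ipc": ["H01L 29/06"]},
    {"publication": "US-2023-0000002-A1", "ipc": ["B82B 3/00"]},
]

nanotech = [r["publication"] for r in sample_records if is_nanotech_by_ipc(r["ipc"])]
print(nanotech)  # ['US-2023-0000001-A1', 'US-2023-0000002-A1']
```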
Regulation of artificial intelligence (AI) has become very topical in the last couple of years. There was an AI safety summit in November 2023 at Bletchley Park in the UK (see my November 2, 2023 posting for more about that international meeting).
A very software approach?
This year (2024) has seen a rise in legislative and proposed legislative activity. I have some articles on a few of these activities. China was the first to enact regulations of any kind on AI according to Matt Sheehan’s February 27, 2024 paper for the Carnegie Endowment for International Peace,
In 2021 and 2022, China became the first country to implement detailed, binding regulations on some of the most common applications of artificial intelligence (AI). These rules formed the foundation of China’s emerging AI governance regime, an evolving policy architecture that will affect everything from frontier AI research to the functioning of the world’s second-largest economy, from large language models in Africa to autonomous vehicles in Europe.
…
The Chinese Communist Party (CCP) and the Chinese government started that process with the 2021 rules on recommendation algorithms, an omnipresent use of the technology that is often overlooked in international AI governance discourse. Those rules imposed new obligations on companies to intervene in content recommendations, granted new rights to users being recommended content, and offered protections to gig workers subject to algorithmic scheduling. The Chinese party-state quickly followed up with a new regulation on “deep synthesis,” the use of AI to generate synthetic media such as deepfakes. Those rules required AI providers to watermark AI-generated content and ensure that content does not violate people’s “likeness rights” or harm the “nation’s image.” Together, these two regulations also created and amended China’s algorithm registry, a regulatory tool that would evolve into a cornerstone of the country’s AI governance regime.
…
The UK has adopted a more generalized approach focused on encouraging innovation, according to Valeria Gallo and Suchitra Nair’s February 21, 2024 article for Deloitte (a British professional services firm also considered one of the big four accounting firms worldwide),
At a glance
The UK Government has adopted a cross-sector and outcome-based framework for regulating AI, underpinned by five core principles. These are safety, security and robustness, appropriate transparency and explainability, fairness, accountability and governance, and contestability and redress.
Regulators will implement the framework in their sectors/domains by applying existing laws and issuing supplementary regulatory guidance. Selected regulators will publish their AI annual strategic plans by 30th April [2024], providing businesses with much-needed direction.
Voluntary safety and transparency measures for developers of highly capable AI models and systems will also supplement the framework and the activities of individual regulators.
The framework will not be codified into law for now, but the Government anticipates the need for targeted legislative interventions in the future. These interventions will address gaps in the current regulatory framework, particularly regarding the risks posed by complex General Purpose AI and the key players involved in its development.
Organisations must prepare for increased AI regulatory activity over the next year, including guidelines, information gathering, and enforcement. International firms will inevitably have to navigate regulatory divergence.
…
While most of the focus appears to be on the software (e.g., General Purpose AI), the UK framework does not preclude hardware.
As part of its digital strategy, the EU wants to regulate artificial intelligence (AI) to ensure better conditions for the development and use of this innovative technology. AI can create many benefits, such as better healthcare; safer and cleaner transport; more efficient manufacturing; and cheaper and more sustainable energy.
In April 2021, the European Commission proposed the first EU regulatory framework for AI. It says that AI systems that can be used in different applications are analysed and classified according to the risk they pose to users. The different risk levels will mean more or less regulation.
The agreed text is expected to be finally adopted in April 2024. It will be fully applicable 24 months after entry into force, but some parts will be applicable sooner:
*The ban of AI systems posing unacceptable risks will apply six months after the entry into force
*Codes of practice will apply nine months after entry into force
*Rules on general-purpose AI systems that need to comply with transparency requirements will apply 12 months after the entry into force
High-risk systems will have more time to comply with the requirements as the obligations concerning them will become applicable 36 months after the entry into force.
…
This EU initiative, like the UK framework, seems largely focused on AI software and according to the Wikipedia entry “Regulation of artificial intelligence,”
… The AI Act is expected to come into effect in late 2025 or early 2026.[109]
I do have a few postings about Canadian regulatory efforts, which also seem to be focused on software but don’t preclude hardware. The January 20, 2024 posting, “Canada’s voluntary code of conduct relating to advanced generative AI (artificial intelligence) systems,” includes information about legislative efforts, although my May 1, 2023 posting, “Canada, AI regulation, and the second reading of the Digital Charter Implementation Act, 2022 (Bill C-27),” offers more comprehensive information about Canada’s legislative progress, or lack thereof.
A February 15, 2024 news item on ScienceDaily suggests that regulating hardware may be the most effective way of regulating AI,
Chips and datacentres — the ‘compute’ power driving the AI revolution — may be the most effective targets for risk-reducing AI policies as they have to be physically possessed, according to a new report.
A global registry tracking the flow of chips destined for AI supercomputers is one of the policy options highlighted by a major new report calling for regulation of “compute” — the hardware that underpins all AI — to help prevent artificial intelligence misuse and disasters.
Other technical proposals floated by the report include “compute caps” — built-in limits to the number of chips each AI chip can connect with — and distributing a “start switch” for AI training across multiple parties to allow for a digital veto of risky AI before it feeds on data.
The experts point out that powerful computing chips required to drive generative AI models are constructed via highly concentrated supply chains, dominated by just a handful of companies — making the hardware itself a strong intervention point for risk-reducing AI policies.
The report, published 14 February [2024], is authored by nineteen experts and co-led by three University of Cambridge institutes — the Leverhulme Centre for the Future of Intelligence (LCFI), the Centre for the Study of Existential Risk (CSER) and the Bennett Institute for Public Policy — along with OpenAI and the Centre for the Governance of AI.
“Artificial intelligence has made startling progress in the last decade, much of which has been enabled by the sharp increase in computing power applied to training algorithms,” said Haydn Belfield, a co-lead author of the report from Cambridge’s LCFI.
“Governments are rightly concerned about the potential consequences of AI, and looking at how to regulate the technology, but data and algorithms are intangible and difficult to control.
“AI supercomputers consist of tens of thousands of networked AI chips hosted in giant data centres often the size of several football fields, consuming dozens of megawatts of power,” said Belfield.
“Computing hardware is visible, quantifiable, and its physical nature means restrictions can be imposed in a way that might soon be nearly impossible with more virtual elements of AI.”
The computing power behind AI has grown exponentially since the “deep learning era” kicked off in earnest, with the amount of “compute” used to train the largest AI models doubling around every six months since 2010. The biggest AI models now use 350 million times more compute than thirteen years ago.
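A quick aside from me: here’s a back-of-the-envelope check of those growth figures, done in Python (my own arithmetic, not part of the report; it assumes steady exponential growth),

import math

years = 13
doubling_months = 6  # doubling time quoted above

doublings = years * 12 / doubling_months          # 26 doublings in 13 years
growth_factor = 2 ** doublings                    # about 6.7e7, i.e. roughly 67 million-fold

# The 350-million-fold figure implies a slightly faster pace; solve for the
# doubling time it corresponds to.
implied_doublings = math.log2(350e6)              # about 28.4 doublings
implied_months = years * 12 / implied_doublings   # about 5.5 months per doubling

print(f"Strict 6-month doubling over {years} years: ~{growth_factor:.1e}x")
print(f"350-million-fold growth implies a doubling every ~{implied_months:.1f} months")

In other words, the two figures in the announcement are consistent with a doubling time of a little under six months.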
Government efforts across the world over the past year – including the US Executive Order on AI, EU AI Act, China’s Generative AI Regulation, and the UK’s AI Safety Institute – have begun to focus on compute when considering AI governance.
Outside of China, the cloud compute market is dominated by three companies, termed “hyperscalers”: Amazon, Microsoft, and Google. “Monitoring the hardware would greatly help competition authorities in keeping in check the market power of the biggest tech companies, and so opening the space for more innovation and new entrants,” said co-author Prof Diane Coyle from Cambridge’s Bennett Institute.
The report provides “sketches” of possible directions for compute governance, highlighting the analogy between AI training and uranium enrichment. “International regulation of nuclear supplies focuses on a vital input that has to go through a lengthy, difficult and expensive process,” said Belfield. “A focus on compute would allow AI regulation to do the same.”
Policy ideas are divided into three camps: increasing the global visibility of AI computing; allocating compute resources for the greatest benefit to society; enforcing restrictions on computing power.
For example, a regularly-audited international AI chip registry requiring chip producers, sellers, and resellers to report all transfers would provide precise information on the amount of compute possessed by nations and corporations at any one time.
The report even suggests a unique identifier could be added to each chip to prevent industrial espionage and “chip smuggling”.
“Governments already track many economic transactions, so it makes sense to increase monitoring of a commodity as rare and powerful as an advanced AI chip,” said Belfield. However, the team point out that such approaches could lead to a black market in untraceable “ghost chips”.
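For what it’s worth, the registry idea boils down to an auditable log of transfer records keyed by those unique chip identifiers. Here is a minimal sketch of what such a record might look like (my own illustration; the field names and structure are hypothetical, not taken from the report),

from dataclasses import dataclass
from datetime import date

@dataclass
class ChipTransfer:
    chip_id: str        # the unique per-chip identifier the report suggests
    seller: str
    buyer: str
    transferred_on: date

# The registry itself is then just a log of reported transfers per chip.
registry: dict[str, list[ChipTransfer]] = {}

def report_transfer(transfer: ChipTransfer) -> None:
    registry.setdefault(transfer.chip_id, []).append(transfer)

report_transfer(ChipTransfer("chip-0001", "FabCo", "CloudCo", date(2024, 2, 14)))
print(len(registry["chip-0001"]))  # 1 reported transfer for this chip

Whether such a ledger could be kept honest, given the “ghost chip” worry mentioned above, is another question entirely.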
Other suggestions to increase visibility – and accountability – include reporting of large-scale AI training by cloud computing providers, and privacy-preserving “workload monitoring” to help prevent an arms race if massive compute investments are made without enough transparency.
“Users of compute will engage in a mixture of beneficial, benign and harmful activities, and determined groups will find ways to circumvent restrictions,” said Belfield. “Regulators will need to create checks and balances that thwart malicious or misguided uses of AI computing.”
These might include physical limits on chip-to-chip networking, or cryptographic technology that allows for remote disabling of AI chips in extreme circumstances. One suggested approach would require the consent of multiple parties to unlock AI compute for particularly risky training runs, a mechanism familiar from nuclear weapons.
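To make the multi-party “unlock” idea a little more concrete, here’s a toy m-of-n approval check (again, my own illustration; the report envisions hardware- and cryptography-enforced mechanisms rather than a simple software vote count, and all the names below are hypothetical),

REQUIRED_APPROVALS = 3  # hypothetical threshold, e.g. 3 of 5 designated parties

def may_unlock_compute(approvals: set, recognised_parties: set) -> bool:
    """Allow a risky training run only if enough distinct, recognised parties approve."""
    return len(approvals & recognised_parties) >= REQUIRED_APPROVALS

parties = {"regulator", "cloud_provider", "developer", "auditor", "safety_board"}
print(may_unlock_compute({"regulator", "auditor"}, parties))                    # False: quorum not met
print(may_unlock_compute({"regulator", "auditor", "cloud_provider"}, parties))  # True: quorum met

The quorum logic is the whole point of the sketch; in the report’s framing, the veto would be enforced by the chips themselves rather than by code an operator could simply bypass.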
AI risk mitigation policies might see compute prioritised for research most likely to benefit society – from green energy to health and education. This could even take the form of major international AI “megaprojects” that tackle global issues by pooling compute resources.
The report’s authors are clear that their policy suggestions are “exploratory” rather than fully fledged proposals and that they all carry potential downsides, from risks of proprietary data leaks to negative economic impacts and the hampering of positive AI development.
They offer five considerations for regulating AI through compute, including the exclusion of small-scale and non-AI computing, regular revisiting of compute thresholds, and a focus on privacy preservation.
Added Belfield: “Trying to govern AI models as they are deployed could prove futile, like chasing shadows. Those seeking to establish AI regulation should look upstream to compute, the source of the power driving the AI revolution. If compute remains ungoverned it poses severe risks to society.”
Authors include: Girish Sastry, Lennart Heim, Haydn Belfield, Markus Anderljung, Miles Brundage, Julian Hazell, Cullen O’Keefe, Gillian K. Hadfield, Richard Ngo, Konstantin Pilz, George Gor, Emma Bluemke, Sarah Shoker, Janet Egan, Robert F. Trager, Shahar Avin, Adrian Weller, Yoshua Bengio, and Diane Coyle.
The authors are associated with these companies/agencies: OpenAI, Centre for the Governance of AI (GovAI), Leverhulme Centre for the Future of Intelligence at the Uni. of Cambridge, Oxford Internet Institute, Institute for Law & AI, University of Toronto Vector Institute for AI, Georgetown University, ILINA Program, Harvard Kennedy School (of Government), *AI Governance Institute,* Uni. of Oxford, Centre for the Study of Existential Risk at Uni. of Cambridge, Uni. of Cambridge, Uni. of Montreal / Mila, Bennett Institute for Public Policy at the Uni. of Cambridge.
“The ILINA program is dedicated to providing an outstanding platform for Africans to learn and work on questions around maximizing wellbeing and responding to global catastrophic risks,” according to the organization’s homepage.
*As for the AI Governance Institute, I believe that should be the Centre for the Governance of AI at Oxford University since the associated academic is Robert F. Trager from the University of Oxford.
As the months (years?) fly by, I guess we’ll find out if this hardware approach gains any traction where AI regulation is concerned.