Tag Archives: Sebastian Gonzalez

Canadian quantum companies chase US DARPA’s (Defense Advanced Research Projects Agency) $$$ and RIP Raymond Laflamme

Canada’s quantum community (i.e., three companies) is currently ‘competing’ for US science funding. It seems an odd choice given all the news about science funding cuts and freezes, along with the Trump administration’s chaotic and increasingly untrustworthy management of government.

On April 3, 2025, the US Defense Advanced Research Projects Agency (DARPA) announced that nearly 20 companies had been selected for what it describes as Stage A of the Quantum Benchmarking Initiative (QBI) ‘challenge’,

Here’s more from that April 3, 2025 DARPA notice,

Nearly 20 quantum computing companies have been chosen to enter the initial stage of DARPA’s Quantum Benchmarking Initiative (QBI), in which they will characterize their unique concepts for creating a useful, fault-tolerant quantum computer within a decade.

QBI, which kicked off in July 2024, aims to determine whether it’s possible to build such a computer much faster than conventional predictions. Specifically, QBI is designed to rigorously verify and validate whether any quantum computing approach can achieve utility-scale operation — meaning its computational value exceeds its cost — by the year 2033.

“We selected these companies for Stage A following a review of their written abstracts and daylong oral presentations before a team of U.S. quantum experts to determine whether their proposed concepts might be able to reach industrial utility,” said Joe Altepeter, DARPA QBI program manager. “For the chosen companies, now the real work begins. Stage A is a six-month sprint in which they’ll provide comprehensive technical details of their concepts to show that they hold water and could plausibly lead to a transformative, fault-tolerant quantum computer in under 10 years.”

The following companies* are pursuing a variety of technologies for creating quantum bits (qubits) — the building block for quantum computers — including superconducting qubits, trapped ion qubits, neutral atom qubits, photonic qubits, semiconductor spin qubits, and other novel approaches listed below:

  • Alice & Bob — Cambridge, Massachusetts, and Paris, France (superconducting cat qubits)
  • Atlantic Quantum — Cambridge, Massachusetts (fluxonium qubits with co-located cryogenic controls)
  • Atom Computing — Boulder, Colorado (scalable arrays of neutral atoms)
  • Diraq — Sydney, Australia, with operations in Palo Alto, California, and Boston, Massachusetts (silicon CMOS spin qubits)
  • Hewlett Packard Enterprise — Houston, Texas (superconducting qubits with advanced fabrication)
  • IBM — Yorktown Heights, NY (quantum computing with modular superconducting processors)
  • IonQ — College Park, Maryland (trapped-ion quantum computing)
  • Nord Quantique — Sherbrooke, Quebec, Canada (superconducting qubits with bosonic error correction)
  • Oxford Ionics — Oxford, UK and Boulder, Colorado (trapped-ions)
  • Photonic Inc. — Vancouver, British Columbia, Canada (optically-linked silicon spin qubits)
  • Quantinuum — Broomfield, Colorado (trapped-ion quantum charge-coupled device (QCCD) architecture)
  • Quantum Motion — London, UK (MOS-based silicon spin qubits)
  • QuEra Computing — Boston, Massachusetts (neutral atom qubits)
  • Rigetti Computing — Berkeley, California (superconducting tunable transmon qubits)
  • Silicon Quantum Computing Pty. Ltd. — Sydney, Australia (precision atom qubits in silicon)
  • Xanadu — Toronto, Canada (photonic quantum computing)

Companies that successfully complete Stage A will move to a yearlong Stage B, during which DARPA will rigorously examine their research and development approach, followed by a final Stage C where the QBI independent verification and validation (IV&V) team will test the companies’ computer hardware.

“During Stage B we’ll thoroughly review all aspects of their R&D plans to see if they can go the distance — not just meet next year’s milestones — and stand the test of trying to build a transformative technology on this kind of a timeline,” Altepeter explained. “Those who make it through Stages A and B will enter the final portion of the program, Stage C, where a full-size IV&V team will conduct real-time, rigorous evaluation of the components, subsystems, and algorithms – everything that goes into building a fault-tolerant quantum computer for real. And we’ll do all these evaluations without slowing the companies down.”

QBI is not a competition between companies [emphasis mine]; rather, it aims to scan the landscape of commercial quantum computing efforts to spot every company on a plausible path to a useful quantum computer.

DARPA recently announced that Microsoft and PsiQuantum are entering the third and final phase of the Underexplored Systems for Utility-Scale Quantum Computing (US2QC) program, a pilot effort that was expanded to become QBI. Both companies were participating in the second phase of US2QC when the QBI expansion was announced. The final Phase of US2QC has the same technical goals as Stage C of QBI – verification and validation of an industrially useful quantum computer.

“We’ve built and are expanding our world-class IV&V team of U.S. quantum experts, leveraging federal and state test facilities to separate hype from reality in quantum computing,” Altepeter said. “Our team is eager to scrutinize the commercial concepts, designs, R&D plans, and prototype hardware — all with the goal of helping the U.S. government identify and support efforts that are genuinely advancing toward transformative, fault-tolerant quantum computing.”

For more information on QBI visit: www.darpa.mil/QBI.

*16 of the 18 companies are being announced; two are still in negotiations. DARPA will update this announcement once their agreements are signed.

Editor’s Note: This update was edited on April 29, 2025 to add QuEra Computing to the list of companies selected for Stage A.

This sounds like DARPA will pick and choose which bits of technology it may want to develop. Also, who owns the technology? An April 5, 2025 article by Sean Silcoff and Ivan Semeniuk for the Globe and Mail raises the question and answers it (more or less), Note: I have the paper version of the article,

Three Canadian quantum computer companies are in the running for up to US$316-million apiece in funding from the US government if they can prove within eight years that their machines will work at scale.

The companies – Xanadu Quantum Technologies Inc. of Toronto, Vancouver-based Photonic Inc. and Nord Quantique of Sherbrooke, Que. – are among 18 groups from Canada, the US, Britain, and Australia that have qualified for the first stage (Stage A) of the Quantum Benchmarking Initiative (QBI).

“QBI is not meant to choose a winner and fund your research and development plan, [emphasis mine]” said Dr. Joe Altepeter, the QBI’s program manager. Rather, the program is structured to reward only those that can quickly execute against their roadmaps and deliver something useful.

However, making it through will likely anoint a winner or winners in the global race to develop a working quantum computer. [emphasis mine]

“I can’t think of any other program that has generated this much excitement and interest from startups and big companies – and a lot of investors know about it,” said Christian Weedbrook, Xanadu’s founder and chief executive officer [CEO].

Quantum computer developers have collectively raised and spent billions of dollars so far, and QBI will likely influence financiers in determining whom to continue backing.

Conversely, “groups that don’t get in will be challenged to raise venture capital,” said Ray [Raymond] Laflamme, co-chair of the federal Quantum Advisory Council. The council has recommended the Canadian government provide matching funds [emphasis mine] to any domestic company that makes it through QBI.

Council co-chair Stephanie Simmons, who is also the founder and chief quantum officer [CQO] of Photonic, said the US government will gain access to “deep knowledge that other governments won’t have” [emphasis mine] through QBI.

“That will give them geopolitical and other advantages [emphasis mine] that are important in the upcoming economy.” Creating a matching program here would mean “this information would also be owned by the Canadian government.”

“I would love to be surprised. If companies make it through the gauntlet, you’re really willing to advocate for them inside the US government in rooms that they can’t go to and say, ‘Look, we did our best to show this doesn’t work, these guys made it, they can really build this thing,’” he [Dr. Joe Altepeter] said, adding that the program was designed to be a “simple, cheap way” to determine that.

Mr. Laflamme agreed that QBI “is a very smart way for the US to keep at the front. By this, the US will know who has the lead in the world and where the people are, everywhere.” [p. B11 paper version]

Clearly, the US has much to gain from this ‘non-competition’. It’s not clear to me what Canada will gain.

One quick note. D-Wave Systems is mentioned in Silcoff’s and Semeniuk’s April 5, 2025 article and described as a Canadian company. That is questionable. It was headquartered in the Vancouver area, British Columbia, Canada for a number of years but is now, according to its Wikipedia entry, headquartered in Palo Alto, California, US (see the sidebar). The company retains laboratories and offices in British Columbia.

It would seem that Silcoff’s and Semeniuk’s April 5, 2025 article hosted one of M. Laflamme’s last interviews.

RIP Raymond Laflamme, July 19, 1960 – June 19, 2025

I’ve had to interview more than one ‘horse’s behind’ (two members of the forestry faculty at the University of British Columbia spring to mind); M. Laflamme was most assuredly not one of them. It was a privilege to interview him for a May 11, 2015 posting about Research2Reality, a Canadian social media engagement project (scroll down to the subhead with his name),

Who convinces a genius that he’s gotten an important cosmological concept wrong or ignored it? Alongside Don Page, Laflamme accomplished that feat as one of Stephen Hawking’s PhD students at the University of Cambridge. Today (May 11, 2015), Laflamme is (from his Wikipedia entry)

… co-founder and current director of the Institute for Quantum Computing at the University of Waterloo. He is also a professor in the Department of Physics and Astronomy at the University of Waterloo and an associate faculty member at Perimeter Institute for Theoretical Physics. Laflamme is currently a Canada Research Chair in Quantum Information.

The Council of Canadian Academies’ (CCA) July 22, 2025 The Advance newsletter (received via email) held this notice, Note: A link has been removed,

And Ray Laflamme, the theoretical physicist and Canada Research Chair in Quantum Information, died on June 19 [2025] following a lengthy battle with cancer. Laflamme, founding director of the Institute for Quantum Computing at the University of Waterloo, served as chair of our Expert Panel on the Responsible Adoption of Quantum Technologies. …

I have a commentary on the CCA report issued by Laflamme and his expert panel. The report was published in November 2023 and my commentary was published in two parts about 15 months later,

To wildly paraphrase John Donne (“for whom the bell tolls,” from Meditation XVII), M. Laflamme’s death diminishes us, but more importantly, his life enhanced us all in ways both small and large. Thank you.

And the quantum goes on

Members of the Canadian quantum community that M. Laflamme helped build have recently announced a breakthrough. From a July 10, 2025 TRIUMF news release (also on Quantum Wire), Note: A link has been removed,

A cross-Canada team of researchers have brought quantum and generative AI together to prepare for the Large Hadron Collider’s next upgrade.

In the world of collider physics, simulations play a key role in analyzing data from particle accelerators. Now, a cross-Canada effort is combining quantum with generative AI to create novel simulation models for the next big upgrade of the Large Hadron Collider (LHC) – the world’s largest particle accelerator [located at the European particle physics laboratory CERN, in Switzerland].

In a paper published in npj Quantum Information, a team that includes researchers from TRIUMF, Perimeter Institute, and the National Research Council of Canada (NRC) are the first to use annealing quantum computing and deep generative AI to create simulations that are fast, accurate, and computationally efficient. If the models continue to improve, they could represent a new way to create synthetic data to help with analysis in particle collisions.

Why simulations are essential for collider physics

Simulations broadly assist collider physics researchers in two ways. First, researchers use them to statistically match observed data to theoretical models. Second, scientists use simulated data to help optimize the design of the data analysis, for instance by isolating the signal they are studying from irrelevant background events.

“To do the data analysis at the LHC, you need to create copious amounts of simulations of collision events,” explains Wojciech Fedorko, one of the principal investigators on the paper and Deputy Department Head, Scientific Computing at TRIUMF, Canada’s particle accelerator centre in Vancouver. “Basically, you take your hypothesis, and you simulate it under multiple scenarios. One of those scenarios will statistically best match the real data that has been produced in the real experiment.”

Currently, the LHC is preparing for a major shutdown in anticipation of its high luminosity upgrade. When it comes back online, it will require more complex simulations that are reliably accurate, fast to produce, and computationally efficient. Those requirements have the potential to create a bottleneck, as supplying the computational power required to create these simulations will no longer be feasible.

“Simulations are projected to cost millions of CPU years annually when the high luminosity LHC turns on,” says Javier Toledo-Marín, a research scientist jointly appointed at Perimeter Institute and TRIUMF. “It’s financially and environmentally unsustainable to keep doing business as usual.”

When quantum and generative AI collide 

Particle physicists use specialized detectors called calorimeters to measure the energy released by the showers of particles that result from collisions. Scientists combine the readings from these and other detectors to piece together what happened at the initial collision. It’s through this process of comparing simulations to experimental data that researchers discovered the Higgs boson at the Large Hadron Collider in 2012. Compared to the other sub-detector systems within the LHC experiments, calorimeters and the data they produce are the most computationally intensive to simulate, and as such they represent a major opportunity for efficiency gains.

In 2022, a scientific “challenge” was issued by researchers seeking to spur rapid advances in calorimeter computations, in an attempt to address the coming computational bottleneck at the LHC. Named the “CaloChallenge,” the challenge provided datasets based on LHC experiments for teams to develop and benchmark simulations of calorimeter readings. Fedorko and the team are the only ones so far to take a full-scale quantum approach, thanks to an assist from D-Wave Quantum Inc.’s annealing quantum computing technology.

Annealing quantum computing is a process typically used to find the lowest-energy state of a system, or a state close to it, which makes it useful for optimization problems.
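For readers who’d like to see the idea in miniature, here’s a classical toy sketch of my own (this is simulated annealing on an ordinary computer, not quantum annealing, and it’s not the team’s code): finding the lowest-energy configuration of a tiny four-spin Ising chain.

```python
# A classical toy illustration of annealing: find the lowest-energy
# configuration of a four-spin Ising chain. Aligned neighbours lower
# the energy, so the ground state is all spins pointing the same way.
import math
import random

random.seed(0)
J = 1.0  # ferromagnetic coupling strength

def energy(spins):
    # E = -J * sum of s_i * s_{i+1}; minimized when neighbours agree
    return -J * sum(a * b for a, b in zip(spins, spins[1:]))

spins = [random.choice([-1, 1]) for _ in range(4)]
best = list(spins)
temperature = 2.0
for _ in range(2000):
    i = random.randrange(len(spins))
    old_e = energy(spins)
    spins[i] *= -1                      # propose flipping one spin
    delta = energy(spins) - old_e
    # Metropolis rule: keep the flip if it lowers the energy, or
    # occasionally even if it raises it (to escape local minima)
    if delta > 0 and random.random() >= math.exp(-delta / temperature):
        spins[i] *= -1                  # reject the flip
    if energy(spins) < energy(best):
        best = list(spins)              # remember the best state seen
    temperature *= 0.999                # slowly "cool" the system

print(best, energy(best))               # all spins aligned, energy -3.0
```

A quantum annealer attacks the same kind of energy-minimization problem, but exploits quantum effects rather than thermal fluctuations to explore the landscape.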

After discussions with D-Wave, Fedorko, Toledo-Marín, and the rest of the team determined that D-Wave’s annealing quantum computers could be used for simulation generation. You just need to use annealing to manipulate qubits (the smallest bits of quantum information) in an unconventional way.

In the D-Wave quantum processor, there is a mechanism that ensures the ratio between the “bias” on a given qubit and the “weight” linking it to another qubit stays the same throughout the annealing process. With the help of D-Wave, the team realized that they could use this mechanism to instead guarantee outcomes for a subset of the qubits on a device. “We basically hijacked that mechanism to fix in place some of the spins,” says Fedorko. “This mechanism can be used to ‘condition’ the processor – for example, to generate showers with specific desired properties, like the energy of a particle impinging on the calorimeter.”

The end result: an unconventional way to use annealing quantum computing to generate high-quality synthetic data for analyzing particle collisions.
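In classical terms, “fixing the spins in place” amounts to clamping some variables to chosen values before the anneal, so that every sample the machine returns respects that condition. Here’s a toy sketch of my own (a classical stand-in for illustration, not the paper’s code):

```python
# Toy illustration: condition an annealer-style sampler by clamping
# one spin so that every returned sample respects the chosen condition.
import math
import random

random.seed(1)
J = 1.0  # ferromagnetic coupling strength

def energy(spins):
    # E = -J * sum of s_i * s_{i+1}; lowest when neighbours agree
    return -J * sum(a * b for a, b in zip(spins, spins[1:]))

def anneal(clamped, n=4, steps=2000):
    """Anneal an n-spin chain; `clamped` maps spin index -> fixed value."""
    spins = [clamped.get(i, random.choice([-1, 1])) for i in range(n)]
    best = list(spins)
    temperature = 2.0
    for _ in range(steps):
        i = random.randrange(n)
        if i in clamped:
            continue                    # spins "fixed in place" never flip
        old_e = energy(spins)
        spins[i] *= -1                  # propose a flip
        delta = energy(spins) - old_e
        if delta > 0 and random.random() >= math.exp(-delta / temperature):
            spins[i] *= -1              # mostly reject energy-raising flips
        if energy(spins) < energy(best):
            best = list(spins)
        temperature *= 0.999            # cool slowly
    return best

# Clamping spin 0 to -1 steers the whole chain: the lowest-energy
# configuration consistent with that condition is all spins at -1.
print(anneal({0: -1}))
```

Conditioning a generative model this way is what lets the team ask for, say, showers matching a particular incoming particle energy rather than arbitrary samples.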

The next phase of collider physics simulations

The published result is important because of its performance on three metrics: the speed to generate the simulations, their accuracy, and the computational resources they require. “For speed, we are in the top bound of results published by other teams and our accuracy is above average,” Toledo-Marín says. “What makes our framework competitive is really the unique combination of several factors – speed, accuracy, and energy consumption.”

Essentially, many types of quantum processing units (QPUs) must be kept at an extremely low temperature, but giving a QPU multiple tasks doesn’t significantly increase its energy requirements. A standard graphics processing unit (GPU), by contrast, will increase its energy use with each job it receives. As advanced GPUs become more and more power-hungry, QPUs can potentially scale up without a corresponding increase in computational energy requirements.

Looking forward, the team is excited to test their models on new incoming data so they can fine-tune them, increasing both speed and accuracy. If all goes well, annealing quantum computing could become an essential part of generating simulations.

“It’s a good example of being able to scale something in the field of quantum machine learning to something practical that can potentially be deployed,” says Toledo-Marín.

The authors are grateful for the support of their many funders and contributors, which include the University of British Columbia, the University of Virginia, the NRC, D-Wave, and MITACS [originally funded as: Mathematics of Information Technology and Complex Systems; now a nonprofit research organization].

A joint July 10, 2025 Perimeter Institute for Theoretical Physics and TRIUMF news release on Newswise (also on the Quantum Insider but published July 11, 2025) is markedly shorter and more ‘boosterish’ than what appears to be the TRIUMF news release,

In a landmark achievement for Canadian science, a team of scientists led by TRIUMF and the Perimeter Institute for Theoretical Physics have unveiled transformative research that – for the first time – merges quantum computing techniques with advanced AI to model complex simulations in a fast, accurate and energy-efficient way.

“This is a uniquely Canadian success story,” said Wojciech Fedorko, Deputy Department Head, Scientific Computing at TRIUMF. “Uniting the expertise from our country’s research institutions and industry leaders has not only advanced our ability to carry out fundamental research, but also demonstrated Canada’s ability to lead the world in quantum and AI innovation.”

In any event, here’s a link to and a citation for the paper,

Conditioned quantum-assisted deep generative surrogate for particle-calorimeter interactions by J. Quetzalcóatl Toledo-Marín, Sebastian Gonzalez, Hao Jia, Ian Lu, Deniz Sogutlu, Abhishek Abhishek, Colin Gay, Eric Paquet, Roger G. Melko, Geoffrey C. Fox, Maximilian Swiatlowski & Wojciech Fedorko. npj Quantum Information volume 11, Article number: 114 (2025) DOI: https://doi.org/10.1038/s41534-025-01040-x Published: 07 July 2025

This paper is open access.

Raymond Julien Joseph Laflamme (July 19, 1960 – June 19, 2025)

[image downloaded from https://uwaterloo.ca/news/global-impact/opinion-canadas-stake-quantum-race]