This posting is all about a local quantum computing company, D-Wave Systems (Burnaby is one of the Metro Vancouver municipalities). The company has been featured here from time to time, usually for its quantum technology (it is considered a technology star in local and [I think] other circles), but my March 9, 2018 posting about the SXSW (South by Southwest) festival noted that Bo Ewald, President, D-Wave Systems US, was a member of the ‘Quantum Computing: Science Fiction to Science Fact’ panel.
Now, they’re back making technology announcements like this August 22, 2018 news item on phys.org (Note: Links have been removed),
D-Wave Systems today [August 22, 2018] published a milestone study demonstrating a topological phase transition using its 2048-qubit annealing quantum computer. This complex quantum simulation of materials is a major step toward reducing the need for time-consuming and expensive physical research and development.
The paper, entitled “Observation of topological phenomena in a programmable lattice of 1,800 qubits”, was published in the peer-reviewed journal Nature. This work marks an important advancement in the field and demonstrates again that the fully programmable D-Wave quantum computer can be used as an accurate simulator of quantum systems at a large scale. The methods used in this work could have broad implications in the development of novel materials, realizing Richard Feynman’s original vision of a quantum simulator. This new research comes on the heels of D-Wave’s recent Science paper demonstrating a different type of phase transition in a quantum spin-glass simulation. The two papers together signify the flexibility and versatility of the D-Wave quantum computer in quantum simulation of materials, in addition to other tasks such as optimization and machine learning.
In the early 1970s, theoretical physicists Vadim Berezinskii, J. Michael Kosterlitz and David Thouless predicted a new state of matter characterized by nontrivial topological properties. The work was awarded the Nobel Prize in Physics in 2016. D-Wave researchers demonstrated this phenomenon by programming the D-Wave 2000Q™ system to form a two-dimensional frustrated lattice of artificial spins. The observed topological properties in the simulated system cannot exist without quantum effects and closely agree with theoretical predictions.
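For readers who like to tinker: the D-Wave lattice is a quantum, frustrated one, but the classical two-dimensional XY model is where Berezinskii, Kosterlitz, and Thouless first located the vortex excitations behind this transition. Here is a toy Monte Carlo sketch of my own (not D-Wave's code; the lattice size and temperature are illustrative) that builds such a lattice and counts vortex charges; on a periodic lattice the charges must cancel exactly, which makes a nice sanity check:

```python
import math
import random

random.seed(0)
L = 16        # lattice side (toy size; the D-Wave simulation used 1,800 qubits)
BETA = 1.5    # inverse temperature, an illustrative value

# one angle ("spin direction") per site, periodic boundary conditions
theta = [[random.uniform(0.0, 2.0 * math.pi) for _ in range(L)] for _ in range(L)]

def site_energy(i, j, angle):
    """XY-model energy of site (i, j): -cos of each neighbour difference."""
    e = 0.0
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        e -= math.cos(angle - theta[(i + di) % L][(j + dj) % L])
    return e

def metropolis_sweep():
    """One Monte Carlo sweep: propose a new angle at L*L randomly chosen sites."""
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        new = random.uniform(0.0, 2.0 * math.pi)
        d_e = site_energy(i, j, new) - site_energy(i, j, theta[i][j])
        if d_e <= 0.0 or random.random() < math.exp(-BETA * d_e):
            theta[i][j] = new

def wrap(x):
    """Map an angle difference into (-pi, pi]."""
    while x <= -math.pi:
        x += 2.0 * math.pi
    while x > math.pi:
        x -= 2.0 * math.pi
    return x

def total_vorticity():
    """Sum of vortex charges (winding numbers) over all plaquettes."""
    total = 0
    for i in range(L):
        for j in range(L):
            a = theta[i][j]
            b = theta[i][(j + 1) % L]
            c = theta[(i + 1) % L][(j + 1) % L]
            d = theta[(i + 1) % L][j]
            w = wrap(b - a) + wrap(c - b) + wrap(d - c) + wrap(a - d)
            total += round(w / (2.0 * math.pi))
    return total

for _ in range(20):
    metropolis_sweep()
print(total_vorticity())  # net vortex charge on a torus is always 0
```

Watching how bound vortex-antivortex pairs unbind as the temperature rises is, in essence, the transition the D-Wave processor was programmed to reproduce in its quantum version.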
“This paper represents a breakthrough in the simulation of physical systems which are otherwise essentially impossible,” said 2016 Nobel laureate Dr. J. Michael Kosterlitz. “The test reproduces most of the expected results, which is a remarkable achievement. This gives hope that future quantum simulators will be able to explore more complex and poorly understood systems so that one can trust the simulation results in quantitative detail as a model of a physical system. I look forward to seeing future applications of this simulation method.”
“The work described in the Nature paper represents a landmark in the field of quantum computation: for the first time, a theoretically predicted state of matter was realized in quantum simulation before being demonstrated in a real magnetic material,” said Dr. Mohammad Amin, chief scientist at D-Wave. “This is a significant step toward reaching the goal of quantum simulation, enabling the study of material properties before making them in the lab, a process that today can be very costly and time consuming.”
“Successfully demonstrating physics of Nobel Prize-winning importance on a D-Wave quantum computer is a significant achievement in and of itself. But in combination with D-Wave’s recent quantum simulation work published in Science, this new research demonstrates the flexibility and programmability of our system to tackle recognized, difficult problems in a variety of areas,” said Vern Brownell, D-Wave CEO.
“D-Wave’s quantum simulation of the Kosterlitz-Thouless transition is an exciting and impactful result. It not only contributes to our understanding of important problems in quantum magnetism, but also demonstrates solving a computationally hard problem with a novel and efficient mapping of the spin system, requiring only a limited number of qubits and opening new possibilities for solving a broader range of applications,” said Dr. John Sarrao, principal associate director for science, technology, and engineering at Los Alamos National Laboratory.
“The ability to demonstrate two very different quantum simulations, as we reported in Science and Nature, using the same quantum processor, illustrates the programmability and flexibility of D-Wave’s quantum computer,” said Dr. Andrew King, principal investigator for this work at D-Wave. “This programmability and flexibility were two key ingredients in Richard Feynman’s original vision of a quantum simulator and open up the possibility of predicting the behavior of more complex engineered quantum systems in the future.”
The achievements presented in Nature and Science join D-Wave’s continued work with world-class customers and partners on real-world prototype applications (“proto-apps”) across a variety of fields. The 70+ proto-apps developed by customers span optimization, machine learning, quantum material science, cybersecurity, and more. Many of the proto-apps’ results show that D-Wave systems are approaching, and sometimes surpassing, conventional computing in terms of performance or solution quality on real problems, at pre-commercial scale. As the power of D-Wave systems and software expands, these proto-apps point to the potential for scaled customer application advantage on quantum computers.
The company has prepared a video describing Richard Feynman’s proposal about quantum computing and celebrating their latest achievement,
In 1982, Richard Feynman proposed the idea of simulating the quantum physics of complex systems with a programmable quantum computer. In August 2018, his vision was realized when researchers from D-Wave Systems and the Vector Institute demonstrated the simulation of a topological phase transition—the subject of the 2016 Nobel Prize in Physics—in a fully programmable D-Wave 2000Q™ annealing quantum computer. This complex quantum simulation of materials is a major step toward reducing the need for time-consuming and expensive physical research and development.
You may want to check out the comments in response to the video.
Here’s a link to and a citation for the Nature paper,
Observation of topological phenomena in a programmable lattice of 1,800 qubits by Andrew D. King, Juan Carrasquilla, Jack Raymond, Isil Ozfidan, Evgeny Andriyash, Andrew Berkley, Mauricio Reis, Trevor Lanting, Richard Harris, Fabio Altomare, Kelly Boothby, Paul I. Bunyk, Colin Enderud, Alexandre Fréchette, Emile Hoskinson, Nicolas Ladizinsky, Travis Oh, Gabriel Poulin-Lamarre, Christopher Rich, Yuki Sato, Anatoly Yu. Smirnov, Loren J. Swenson, Mark H. Volkmann, Jed Whittaker, Jason Yao, Eric Ladizinsky, Mark W. Johnson, Jeremy Hilton, & Mohammad H. Amin. Nature volume 560, pages 456–460 (2018) DOI: https://doi.org/10.1038/s41586-018-0410-x Published 22 August 2018
This paper is behind a paywall but, for those who don’t have access, there is a synopsis here.
For anyone curious about the earlier paper published in July 2018, here’s a link and a citation,
Phase transitions in a programmable quantum spin glass simulator by R. Harris, Y. Sato, A. J. Berkley, M. Reis, F. Altomare, M. H. Amin, K. Boothby, P. Bunyk, C. Deng, C. Enderud, S. Huang, E. Hoskinson, M. W. Johnson, E. Ladizinsky, N. Ladizinsky, T. Lanting, R. Li, T. Medina, R. Molavi, R. Neufeld, T. Oh, I. Pavlov, I. Perminov, G. Poulin-Lamarre, C. Rich, A. Smirnov, L. Swenson, N. Tsai, M. Volkmann, J. Whittaker, J. Yao. Science 13 Jul 2018: Vol. 361, Issue 6398, pp. 162-165 DOI: 10.1126/science.aat2025
Libraries, archives, records management, oral history, etc.: there are many institutions and names for how we manage collective and personal memory. You might call it a peculiarly human obsession stretching back into antiquity. For example, there’s the Library of Alexandria (Wikipedia entry), founded in the third (or possibly second) century BCE (before the common era) and reputed to store all the knowledge in the world. It was destroyed, although accounts differ as to when and how, but its loss remains a potent reminder of memory’s fragility.
These days, the technology community is terribly concerned with storing ever more bits of data on materials that are reaching their limits for storage. I have news of a possible solution, an interview of sorts with the researchers working on this new technology, and some very recent research into policies for cryptocurrency mining and development. That bit about cryptocurrency makes more sense when you read the response to one of the interview questions.
It seems University of Alberta researchers may have found a way to increase memory exponentially, from a July 23, 2018 news item on ScienceDaily,
The most dense solid-state memory ever created could soon exceed the capabilities of current computer storage devices by 1,000 times, thanks to a new technique scientists at the University of Alberta have perfected.
“Essentially, you can take all 45 million songs on iTunes and store them on the surface of one quarter,” said Roshan Achal, PhD student in Department of Physics and lead author on the new research. “Five years ago, this wasn’t even something we thought possible.”
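The quarter claim is easy to sanity-check with back-of-the-envelope arithmetic. The figures below are my own assumptions (a ~24 mm coin, the reported ~138 terabytes per square inch, and a few megabytes per compressed song), not numbers from the paper:

```python
import math

# All figures here are my assumptions for a sanity check, not the paper's:
# reported density ~138 terabytes per square inch; a quarter is ~23.88 mm wide.
DENSITY_TB_PER_SQ_IN = 138
QUARTER_DIAMETER_MM = 23.88
MB_PER_SONG = 4  # a typical compressed track

radius_in = (QUARTER_DIAMETER_MM / 2.0) / 25.4   # mm -> inches
area_sq_in = math.pi * radius_in ** 2
capacity_tb = DENSITY_TB_PER_SQ_IN * area_sq_in
songs = capacity_tb * 1e6 / MB_PER_SONG          # 1 TB = 1e6 MB

print(f"{capacity_tb:.0f} TB on a quarter, ~{songs / 1e6:.0f} million songs")
# -> 96 TB on a quarter, ~24 million songs
```

At 4 MB a song that works out to roughly 24 million songs; the 45-million figure lines up if you assume about 2 MB per track, so the claim is in the right ballpark either way.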
Previous discoveries were stable only at cryogenic conditions, meaning this new finding puts society light years closer to meeting the need for more storage for the current and continued deluge of data. One of the most exciting features of this memory is that it’s road-ready for real-world temperatures, as it can withstand normal use and transportation beyond the lab.
“What is often overlooked in the nanofabrication business is actual transportation to an end user; that simply was not possible until now, given temperature restrictions,” continued Achal. “Our memory is stable well above room temperature and precise down to the atom.”
Achal explained that immediate applications will be data archival. Next steps will be increasing readout and writing speeds, meaning even more flexible applications.
“With this last piece of the puzzle now in-hand, atom-scale fabrication will become a commercial reality in the very near future,” said Robert Wolkow, Achal’s PhD supervisor. Wolkow’s spin-off company, Quantum Silicon Inc., is hard at work on commercializing atom-scale fabrication for use in all areas of the technology sector.
To demonstrate the new discovery, Achal, Wolkow, and their fellow scientists not only fabricated the world’s smallest maple leaf, they also encoded the entire alphabet at a density of 138 terabytes per square inch, roughly equivalent to writing 350,000 letters across a grain of rice. For a playful twist, Achal also encoded music as an atom-sized song, the first 24 notes of which will make any video-game player of the 80s and 90s nostalgic for yesteryear but excited for the future of technology and society.
As noted in the news release, there is an atom-sized song, which is available in this video,
For interested parties, you can find Quantum Silicon (QSI) here. My Edmonton geography is all but nonexistent; still, it seems to me the company address on Saskatchewan Drive is a University of Alberta address. It’s also the address for the National Research Council of Canada. Perhaps this is a university/government spin-off company?
I sent some questions to the researchers at the University of Alberta who very kindly provided me with the following answers. Roshan Achal passed on one of the questions to his colleague Taleana Huff for her response. Both Achal and Huff are associated with QSI.
Unfortunately I could not find any pictures of all three researchers (Achal, Huff, and Wolkow) together.
Roshan Achal (left) used nanotechnology perfected by his PhD supervisor, Robert Wolkow (right) to create atomic-scale computer memory that could exceed the capacity of today’s solid-state storage drives by 1,000 times. (Photo: Faculty of Science)
(1) SHRINKING THE MANUFACTURING PROCESS TO THE ATOMIC SCALE HAS
ATTRACTED A LOT OF ATTENTION OVER THE YEARS, STARTING WITH SCIENCE
FICTION OR RICHARD FEYNMAN OR K. ERIC DREXLER, ETC. IN ANY EVENT, THE
ORIGINS ARE CONTESTED, SO I WON’T PUT YOU ON THE SPOT BY ASKING WHO
STARTED IT ALL; INSTEAD, HOW DID YOU GET STARTED?
I got started in this field about 6 years ago, when I undertook an MSc
with Dr. Wolkow here at the University of Alberta. Before that point, I
had only ever heard of a scanning tunneling microscope from what was
taught in my classes. I was aware of the famous IBM logo made up from
just a handful of atoms using this machine, but I didn’t know what
else could be done. Here, Dr. Wolkow introduced me to his line of
research, and I saw the immense potential for growth in this area and
decided to pursue it further. I had the chance to interact with and
learn from nanofabrication experts and gain the skills necessary to
begin playing around with my own techniques and ideas during my PhD.
(2) AS I UNDERSTAND IT, THESE ARE THE PIECES YOU’VE BEEN
WORKING ON: (1) THE TUNGSTEN MICROSCOPE TIP, WHICH MAKE[s] (2) THE SMALLEST
QUANTUM DOTS (SINGLE ATOMS OF SILICON), (3) THE AUTOMATION OF THE
QUANTUM DOT PRODUCTION PROCESS, AND (4) THE “MOST DENSE SOLID-STATE
MEMORY EVER CREATED.” WHAT’S MISSING FROM THE LIST AND IS THAT WHAT
YOU’RE WORKING ON NOW?
One of the things missing from the list, that we are currently working
on, is the ability to easily communicate (electrically) from the
macroscale (our world) to the nanoscale, without the use of a scanning
tunneling microscope. With this, we would be able to then construct
devices using the other pieces we’ve developed up to this point, and
then integrate them with more conventional electronics. This would bring
us yet another step closer to the realization of atomic-scale manufacturing.
(3) PERHAPS YOU COULD CLARIFY SOMETHING FOR ME. USUALLY WHEN SOLID STATE
MEMORY IS MENTIONED, THERE’S GREAT CONCERN ABOUT MOORE’S LAW. IS
THIS WORK GOING TO CREATE A NEW LAW? AND WHAT, IF ANYTHING, DOES
YOUR MEMORY DEVICE HAVE TO DO WITH QUANTUM COMPUTING?
That is an interesting question. With the density we’ve achieved,
there are not too many surfaces where atomic sites are more closely
spaced to allow for another factor of two improvement. In that sense, it
would be difficult to improve memory densities further using these
techniques alone. In order to continue Moore’s law, new techniques, or
storage methods would have to be developed to move beyond atomic-scale storage.
The memory design itself does not have anything to do with quantum
computing; however, the lithographic techniques developed through our
work may enable the development of certain quantum-dot-based quantum computers.
(4) THIS MAY BE A LITTLE OUT OF LEFT FIELD (OR FURTHER OUT THAN THE
OTHERS): COULD YOUR MEMORY DEVICE HAVE AN IMPACT ON THE
DEVELOPMENT OF CRYPTOCURRENCY AND BLOCKCHAIN? IF SO, WHAT MIGHT THAT IMPACT BE?
I am not very familiar with these topics; however, co-author Taleana
Huff has provided some thoughts:
Taleana Huff (downloaded from https://ca.linkedin.com/in/taleana-huff)
“The memory, as we’ve designed it, might not have too much of an
impact in and of itself. Cryptocurrencies fall into two categories:
Proof of Work and Proof of Stake. Proof of Work relies on raw
computational power to solve a difficult math problem. If you solve it,
you get rewarded with a small amount of that coin. The problem is that
it can take a lot of power and energy for your computer to crunch
through that problem. Faster access to memory alone could perhaps
streamline small parts of this slightly, but it would be very slight.
Proof of Stake is already quite power efficient and wouldn’t really
have a drastic advantage from better, faster computers.
Now, atomic-scale circuitry built using these new lithographic
techniques that we’ve developed, which could perform computations at
significantly lower energy costs, would be huge for Proof of Work coins.
One of the things holding bitcoin back, for example, is that mining it
is now consuming power on the order of the annual energy consumption
required by small countries. A more efficient way to mine while still
taking the same amount of time to solve the problem would make bitcoin
much more attractive as a currency.”
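For anyone who hasn’t seen Proof of Work up close, here’s a minimal toy version (my own sketch, not Bitcoin’s actual protocol, which double-hashes 80-byte block headers at vastly higher difficulty). It shows exactly why mining burns energy: every failed guess is a wasted hash, and a wasted bit of electricity:

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Find a nonce so SHA-256(block_data + nonce) falls below a target.

    Every failed guess is a wasted hash (and wasted energy), which is
    exactly why Proof of Work mining consumes so much power.
    """
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# 16 leading zero bits takes roughly 65,000 guesses on average; Bitcoin's
# real network difficulty demands on the order of 10^22 guesses per block.
winning_nonce = mine(b"toy block", 16)
print(winning_nonce)
```

The only dial that matters is `difficulty_bits`: each extra bit doubles the expected number of guesses, and hence the energy spent, without changing the answer's usefulness at all.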
Thank you to Roshan Achal and Taleana Huff for helping me to further explore the implications of their work with Dr. Wolkow.
As usual, after receiving the replies I have more questions, but these people have other things to do, so I’ll content myself with noting that there is something extraordinary in the fact that we can imagine a near future where atomic-scale manufacturing is possible and where, as Achal says, ” … storage methods would have to be developed to move beyond atomic-scale [emphasis mine] storage”. In decades past it was the stuff of science fiction or of theorists who didn’t have the tools to turn the idea into a reality. With Wolkow’s, Achal’s, Huff’s, and their colleagues’ work, atomic-scale manufacturing is attainable in the foreseeable future.
Hopefully we’ll be wiser than we have been in the past in how we deploy these new manufacturing techniques. Of course, before we need the wisdom, scientists, as Achal notes, need to find a new way to communicate between the macroscale and the nanoscale.
A study [behind a paywall] published in Energy Research & Social Science warns that failure to lower the energy use by Bitcoin and similar Blockchain designs may prevent nations from reaching their climate change mitigation obligations under the Paris Agreement.
The study, authored by Jon Truby, PhD, Assistant Professor, Director of the Centre for Law & Development, College of Law, Qatar University, Doha, Qatar, evaluates the financial and legal options available to lawmakers to moderate blockchain-related energy consumption and foster a sustainable and innovative technology sector. Based on this rigorous review and analysis of the technologies, ownership models, and jurisdictional case law and practices, the article recommends an approach that imposes new taxes, charges, or restrictions to reduce demand by users, miners, and miner manufacturers who employ polluting technologies, and offers incentives that encourage developers to create less energy-intensive/carbon-neutral Blockchain.
“Digital currency mining is the first major industry developed from Blockchain, because its transactions alone consume more electricity than entire nations,” said Dr. Truby. “It needs to be directed towards sustainability if it is to realize its potential advantages.
“Many developers have taken no account of the environmental impact of their designs, so we must encourage them to adopt consensus protocols that do not result in high emissions. Taking no action means we are subsidizing high energy-consuming technology and causing future Blockchain developers to follow the same harmful path. We need to de-socialize the environmental costs involved while continuing to encourage progress of this important technology to unlock its potential economic, environmental, and social benefits,” explained Dr. Truby.
As a digital ledger that is accessible to, and trusted by all participants, Blockchain technology decentralizes and transforms the exchange of assets through peer-to-peer verification and payments. Blockchain technology has been advocated as being capable of delivering environmental and social benefits under the UN’s Sustainable Development Goals. However, Bitcoin’s system has been built in a way that is reminiscent of physical mining of natural resources – costs and efforts rise as the system reaches the ultimate resource limit and the mining of new resources requires increasing hardware resources, which consume huge amounts of electricity.
Putting this into perspective, Dr. Truby said, “the processes involved in a single Bitcoin transaction could provide electricity to a British home for a month – with the environmental costs socialized for private benefit.
“Bitcoin is here to stay, and so, future models must be designed without reliance on energy consumption so disproportionate to their economic or social benefits.”
The study evaluates various Blockchain technologies by their carbon footprints and recommends how to tax or restrict Blockchain types at different phases of production and use to discourage polluting versions and encourage cleaner alternatives. It also analyzes the legal measures that can be introduced to encourage technology innovators to develop low-emissions Blockchain designs. The specific recommendations include imposing levies to prevent path-dependent inertia from constraining innovation:
Registration fees collected by brokers from digital coin buyers.
“Bitcoin Sin Tax” surcharge on digital currency ownership.
Green taxes and restrictions on machinery purchases/imports (e.g. Bitcoin mining machines).
Smart contract transaction charges.
According to Dr. Truby, these findings may lead to new taxes, charges or restrictions, but could also lead to financial rewards for innovators developing carbon-neutral Blockchain.
The press release doesn’t fully reflect Dr. Truby’s thoughtfulness or the incentives he has suggested; it’s not all surcharges, taxes, and fees. Here’s a sample from the conclusion,
The possibilities of Blockchain are endless and incentivisation can help solve various climate change issues, such as through the development of digital currencies to fund climate finance programmes. This type of public-private finance initiative is envisioned in the Paris Agreement, and fiscal tools can incentivize innovators to design financially rewarding Blockchain technology that also achieves environmental goals. Bitcoin, for example, has various utilitarian intentions in its White Paper, which may or may not turn out to be as envisioned, but it would not have been such a success without investors seeking remarkable returns. Embracing such technology, and promoting a shift in behaviour with such fiscal tools, can turn the industry itself towards achieving innovative solutions for environmental goals.
I realize Wolkow et al. are not focused on cryptocurrency and blockchain technology per se, but as Huff notes in her reply, “… new lithographic techniques that we’ve developed, which could perform computations at significantly lower energy costs, would be huge for Proof of Work coins.”
Whether or not there are implications for cryptocurrencies, energy needs, climate change, etc., it’s the kind of innovative work being done by scientists at the University of Alberta which may have implications in fields far beyond the researchers’ original intentions such as more efficient computation and data storage.
ETA Aug. 6, 2018: Dexter Johnson weighed in with an August 3, 2018 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website),
Researchers at the University of Alberta in Canada have developed a new approach to rewritable data storage technology by using a scanning tunneling microscope (STM) to remove and replace hydrogen atoms from the surface of a silicon wafer. If this approach realizes its potential, it could lead to a data storage technology capable of storing 1,000 times more data than today’s hard drives, up to 138 terabytes per square inch.
As a bit of background, Gerd Binnig and Heinrich Rohrer developed the first STM in 1981, for which they later received the 1986 Nobel Prize in Physics. In the over 30 years since an STM first imaged an atom by exploiting a phenomenon known as tunneling—which causes electrons to jump from the surface atoms of a material to the tip of an ultrasharp electrode suspended a few angstroms above—the technology has become the backbone of so-called nanotechnology.
In addition to imaging the world on the atomic scale for the last thirty years, STMs have been experimented with as a potential data storage device. Last year, we reported on how IBM (where Binnig and Rohrer first developed the STM) used an STM in combination with an iron atom to serve as an electron-spin resonance sensor to read the magnetic pole of holmium atoms. The north and south poles of the holmium atoms served as the 0 and 1 of digital logic.
The Canadian researchers have taken a somewhat different approach to making an STM into a data storage device by automating a known technique that uses the ultrasharp tip of the STM to apply a voltage pulse above an atom to remove individual hydrogen atoms from the surface of a silicon wafer. Once the atom has been removed, there is a vacancy on the surface. These vacancies can be patterned on the surface to create devices and memories.
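Conceptually, the patterned vacancies behave like an ordinary bit array, with “hydrogen removed” as 1 and “hydrogen intact” as 0. Here’s a toy model of my own devising (the team’s actual STM control software is of course far more involved) that writes and reads text this way:

```python
# Toy model, my own illustration: a grid of hydrogen sites on silicon, where
# True means "hydrogen removed" (a vacancy, read as 1) and False means
# "hydrogen intact" (read as 0).
def write_text(grid, row, text):
    """Pattern one row of sites with the 8-bit code of each character."""
    bits = "".join(f"{ord(ch):08b}" for ch in text)
    for col, bit in enumerate(bits):
        grid[row][col] = (bit == "1")  # True = desorb the hydrogen atom here

def read_text(grid, row, n_chars):
    """Scan the row back and decode 8 sites per character."""
    bits = "".join("1" if grid[row][col] else "0" for col in range(8 * n_chars))
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

grid = [[False] * 64 for _ in range(4)]  # 4 rows of 64 atomic sites
write_text(grid, 0, "Si")
print(read_text(grid, 0, 2))  # -> Si
```

The rewritability the researchers demonstrated corresponds to flipping a site back and forth; the automation work is what makes doing this reliably, atom by atom, practical at all.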
If you have the time, I recommend reading Dexter’s posting as he provides clear explanations, additional insight into the work, and more historical detail.
This new ‘breed’ of memristor (a component in brain-like/neuromorphic computing) is a kind of thin film. First, here’s an explanation of neuromorphic computing from the Finnish researchers looking into this new kind of memristor, via a January 10, 2018 news item on Nanowerk,
The internet of things [IoT] is coming, that much we know. But it won’t arrive until we have components and chips that can handle the explosion of data that comes with IoT. In 2020, there will already be 50 billion industrial internet sensors in place all around us. A single autonomous device – a smart watch, a cleaning robot, or a driverless car – can produce gigabytes of data each day, whereas an Airbus may have over 10 000 sensors in one wing alone.
Two hurdles need to be overcome. First, current transistors in computer chips must be miniaturized to the size of only few nanometres; the problem is they won’t work anymore then. Second, analysing and storing unprecedented amounts of data will require equally huge amounts of energy. Sayani Majumdar, Academy Fellow at Aalto University, along with her colleagues, is designing technology to tackle both issues.
Majumdar has with her colleagues designed and fabricated the basic building blocks of future components in what are called “neuromorphic” computers inspired by the human brain. It’s a field of research on which the largest ICT companies in the world and also the EU are investing heavily. Still, no one has yet come up with a nano-scale hardware architecture that could be scaled to industrial manufacture and use.
“The technology and design of neuromorphic computing is advancing more rapidly than its rival revolution, quantum computing. There is already wide speculation both in academia and company R&D about ways to inscribe heavy computing capabilities in the hardware of smart phones, tablets and laptops. The key is to achieve the extreme energy-efficiency of a biological brain and mimic the way neural networks process information through electric impulses,” explains Majumdar.
Basic components for computers that work like the brain
In their recent article in Advanced Functional Materials, Majumdar and her team show how they have fabricated a new breed of “ferroelectric tunnel junctions”, that is, few-nanometre-thick ferroelectric thin films sandwiched between two electrodes. They have abilities beyond existing technologies and bode well for energy-efficient and stable neuromorphic computing.
The junctions work at low voltages of less than five volts and with a variety of electrode materials – including silicon, used in chips in most of our electronics. They also can retain data for more than 10 years without power and be manufactured in normal conditions.
Tunnel junctions have up to this point mostly been made of metal oxides and require 700 degree Celsius temperatures and high vacuums to manufacture. Ferroelectric materials also contain lead which makes them – and all our computers – a serious environmental hazard.
“Our junctions are made out of organic hydro-carbon materials and they would reduce the amount of toxic heavy metal waste in electronics. We can also make thousands of junctions a day at room temperature without them suffering from the water or oxygen in the air”, explains Majumdar.
What makes ferroelectric thin film components great for neuromorphic computers is their ability to switch between not only binary states – 0 and 1 – but a large number of intermediate states as well. This allows them to ‘memorise’ information not unlike the brain: to store it for a long time with minute amounts of energy and to retain the information they have once received – even after being switched off and on again.
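The arithmetic behind that advantage is simple: a cell with N distinguishable states stores log2(N) bits instead of one. A small sketch of my own, with the conductance range and level counts entirely invented for illustration:

```python
import math

def bits_per_cell(levels: int) -> float:
    """A cell with N distinguishable states stores log2(N) bits."""
    return math.log2(levels)

def nearest_level(conductance: float, levels: int, g_min=0.0, g_max=1.0) -> int:
    """Map an analogue conductance reading back to the closest stored level.

    The conductance range and level count here are invented for illustration.
    """
    step = (g_max - g_min) / (levels - 1)
    return round((conductance - g_min) / step)

print(bits_per_cell(2))         # a binary cell: 1.0 bit
print(bits_per_cell(16))        # 16 intermediate states: 4.0 bits
print(nearest_level(0.52, 16))  # a noisy read near 0.52 decodes as level 8
```

This is also why the record-high on/off current ratio the team reports matters: the wider the window between states, the more intermediate levels can be told apart reliably.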
We are no longer talking of transistors, but ‘memristors’. They are ideal for computation similar to that in biological brains. Take for example the Mars 2020 Rover about to go chart the composition of another planet. For the Rover to work and process data on its own using only a single solar panel as an energy source, the unsupervised algorithms in it will need to use an artificial brain in the hardware.
“What we are striving for now, is to integrate millions of our tunnel junction memristors into a network on a one square centimetre area. We can expect to pack so many in such a small space because we have now achieved a record-high difference in the current between on and off-states in the junctions and that provides functional stability. The memristors could then perform complex tasks like image and pattern recognition and make decisions autonomously,” says Majumdar.
The probe-station device (the full instrument, left, and a closer view of the device connection, right) which measures the electrical responses of the basic components for computers mimicking the human brain. The tunnel junctions are on a thin film on the substrate plate. Photo: Tapio Reinekoski
It’s that time of year again. The South by Southwest (SXSW) entertainment conference is being held from March 9-18, 2018. The science portion of the conference can be found in the Intelligent Future sessions; from the description,
Imagine a new kind of computer that can quickly solve problems that would stump even the world’s most powerful supercomputers. Quantum computers are fundamentally different. They can store information as not only just ones and zeros, but in all the shades of gray in-between. Several companies and government agencies are investing billions of dollars in the field of quantum information. But what will quantum computers be used for?
Antia Lamas-Linares is a Research Associate in the High Performance Computing group at TACC. Her background is as an experimentalist with quantum computing systems, including work done with them at the Centre for Quantum Technologies in Singapore. She joins podcast host Jorge Salazar to talk about her South by Southwest panel and about some of her latest research on quantum information.
Lamas-Linares co-authored a study (doi: 10.1117/12.2290561) in the Proceedings of the SPIE, The International Society for Optical Engineering, that published in February of 2018. The study, “Secure Quantum Clock Synchronization,” proposed a protocol to verify and secure time synchronization of distant atomic clocks, such as those used for GPS signals in cell phone towers and other places. “It’s important work,” explained Lamas-Linares, “because people are worried about malicious parties messing with the channels of GPS. What James Troupe (Applied Research Laboratories, UT Austin) and I looked at was whether we can use techniques from quantum cryptography and quantum information to make something that is inherently unspoofable.”
Antia Lamas-Linares: The most important thing is that quantum technologies is a really exciting field. And it’s exciting in a fundamental sense. We don’t quite know what we’re going to get out of it. We know a few things, and that’s good enough to drive research. But the things we don’t know are much broader than the things we know, and it’s going to be really interesting. Keep your eyes open for this.
Texas Advanced Computing Center at University of Texas
Startups and established players have sold 2000 Qubit systems, made freely available cloud access to quantum computer processors, and created large scale open source initiatives, all taking quantum computing from science fiction to science fact. Government labs and others like IBM, Microsoft, Google are developing software for quantum computers. What problems will be solved with this quantum leap in computing power that cannot be solved today with the world’s most powerful supercomputers?
[Programming descriptions are generated by participants and do not necessarily reflect the opinions of SXSW.]
Favorited by (1128)
Primary Entry: Platinum Badge, Interactive Badge
Secondary Entry: Music Badge, Film Badge
Event Type: Session
Track: Intelligent Future
I wonder what ‘level’ means? I was not able to find an answer (quickly).
It was a bit surprising to find someone from D-Wave Systems (a Vancouver-based quantum computing enterprise) at an entertainment conference. Still, it shouldn’t have been. Two other examples immediately come to mind: the TED (technology, entertainment, and design) conferences have been melding technology, if not science, with creative activities of all kinds for many years (TED 2018: The Age of Amazement, April 10-14, 2018 in Vancouver [Canada]), and Beakerhead (2018 dates: Sept. 19-23) has been melding art, science, and engineering in a festival held in Calgary (Canada) since 2013. One comment about TED: it was held for several years in California (1984, 1990-2013) and moved to Vancouver in 2014.
For anyone wanting to browse the 2018 SXSW Intelligent Future sessions online, go here. Or, for anyone wanting to hear Antia Lamas-Linares talk about quantum computing, there’s the interview with Jorge Salazar (mentioned in the news release),
One of the winners in Canada’s 2017 federal budget announcement of the Pan-Canadian Artificial Intelligence Strategy was Edmonton, Alberta. It’s a fact which sometimes goes unnoticed while Canadians marvel at the wonderfulness found in Toronto and Montréal where it seems new initiatives and monies are being announced on a weekly basis (I exaggerate) for their AI (artificial intelligence) efforts.
Intriguingly, it seems that Edmonton has higher aims than (an almost unnoticed) leadership in AI. Physicists at the University of Alberta have announced hopes to be just as successful as their AI brethren in a Nov. 27, 2017 article by Juris Graney for the Edmonton Journal,
Physicists at the University of Alberta [U of A] are hoping to emulate the success of their artificial intelligence studying counterparts in establishing the city and the province as the nucleus of quantum nanotechnology research in Canada and North America.
Google’s artificial intelligence research division DeepMind announced in July it had chosen Edmonton as its first international AI research lab, based on a long-running partnership with the U of A’s 10-person AI lab.
Retaining the brightest minds in the AI and machine-learning fields while enticing a global tech leader to Alberta was heralded as a coup for the province and the university.
It is something U of A physics professor John Davis believes the university’s new graduate program, Quanta, can help achieve in the world of quantum nanotechnology.
The field of quantum mechanics had long been a realm of theoretical science based on the theory that atomic and subatomic material like photons or electrons behave both as particles and waves.
“When you get right down to it, everything has both behaviours (particle and wave) and we can pick and choose certain scenarios which one of those properties we want to use,” he said.
But, Davis said, physicists and scientists are “now at the point where we understand quantum physics and are developing quantum technology to take to the marketplace.”
“Quantum computing used to be realm of science fiction, but now we’ve figured it out, it’s now a matter of engineering,” he said.
Quantum computing labs are being bought by large tech companies such as Google, IBM and Microsoft because they realize they are only a few years away from having this power, he said.
Those making the groundbreaking developments may want to commercialize their finds and take the technology to market and that is where Quanta comes in.
East vs. West—Again?
In his article, “Quantum Supremacy,” Ivan Semeniuk ignores any quantum research effort not located in either Waterloo, Ontario, or metro Vancouver, British Columbia, to describe a struggle between the East and the West (a standard Canadian trope). From Semeniuk’s Oct. 17, 2017 quantum article [link follows the excerpts] for the Globe and Mail’s October 2017 issue of the Report on Business (ROB),
Lazaridis [Mike], of course, has experienced lost advantage first-hand. As co-founder and former co-CEO of Research in Motion (RIM, now called Blackberry), he made the smartphone an indispensable feature of the modern world, only to watch rivals such as Apple and Samsung wrest away Blackberry’s dominance. Now, at 56, he is engaged in a high-stakes race that will determine who will lead the next technology revolution. In the rolling heartland of southwestern Ontario, he is laying the foundation for what he envisions as a new Silicon Valley—a commercial hub based on the promise of quantum technology.
Semeniuk skips over the story of how Blackberry lost its advantage. I came onto that story late in the game when Blackberry was already in serious trouble due to a failure to recognize that the field it had helped to create was moving in a new direction. If memory serves, the company was trying to keep its technology wholly proprietary, which meant that developers couldn’t easily create apps to extend the phone’s features. Blackberry also fought a legal battle in the US with a patent troll, draining company resources and energy in what proved to be a futile effort.
Since then Lazaridis has invested heavily in quantum research. He gave the University of Waterloo a serious chunk of money as they named their Quantum Nano Centre (QNC) after him and his wife, Ophelia (you can read all about it in my Sept. 25, 2012 posting about the then new centre). The best details for Lazaridis’ investments in Canada’s quantum technology are to be found on the Quantum Valley Investments, About QVI, History webpage,
History has repeatedly demonstrated the power of research in physics to transform society. As a student of history and a believer in the power of physics, Mike Lazaridis set out in 2000 to make real his bold vision to establish the Region of Waterloo as a world leading centre for physics research. That is, a place where the best researchers in the world would come to do cutting-edge research and to collaborate with each other and in so doing, achieve transformative discoveries that would lead to the commercialization of breakthrough technologies.
Establishing a World Class Centre in Quantum Research:
The first step in this regard was the establishment of the Perimeter Institute for Theoretical Physics. Perimeter was established in 2000 as an independent theoretical physics research institute. Mike started Perimeter with an initial pledge of $100 million (which at the time was approximately one third of his net worth). Since that time, Mike and his family have donated a total of more than $170 million to the Perimeter Institute. In addition to this unprecedented monetary support, Mike also devotes his time and influence to help lead and support the organization in everything from the raising of funds with government and private donors to helping to attract the top researchers from around the globe to it. Mike’s efforts helped Perimeter achieve and grow its position as one of a handful of leading centres globally for theoretical research in fundamental physics.
Perimeter is located in a Governor-General’s award-winning building in Waterloo. Success in recruiting and resulting space requirements led to an expansion of the Perimeter facility. A uniquely designed addition, which has been described as spaceship-like, was opened in 2011 as the Stephen Hawking Centre, in recognition of one of the most famous physicists alive today, who holds the position of Distinguished Visiting Research Chair at Perimeter and is a strong friend and supporter of the organization.
Recognizing the need for collaboration between theorists and experimentalists, in 2002, Mike applied his passion and his financial resources toward the establishment of The Institute for Quantum Computing at the University of Waterloo. IQC was established as an experimental research institute focusing on quantum information. Mike established IQC with an initial donation of $33.3 million. Since that time, Mike and his family have donated a total of more than $120 million to the University of Waterloo for IQC and other related science initiatives. As in the case of the Perimeter Institute, Mike devotes considerable time and influence to help lead and support IQC in fundraising and recruiting efforts. Mike’s efforts have helped IQC become one of the top experimental physics research institutes in the world.
Mike and Doug Fregin have been close friends since grade 5. They are also co-founders of BlackBerry (formerly Research In Motion Limited). Doug shares Mike’s passion for physics and supported Mike’s efforts at the Perimeter Institute with an initial gift of $10 million. Since that time Doug has donated a total of $30 million to Perimeter Institute. Separately, Doug helped establish the Waterloo Institute for Nanotechnology at the University of Waterloo with total gifts of $29 million. As suggested by its name, WIN is devoted to research in the area of nanotechnology. It has established as an area of primary focus the intersection of nanotechnology and quantum physics.
With a donation of $50 million from Mike which was matched by both the Government of Canada and the province of Ontario as well as a donation of $10 million from Doug, the University of Waterloo built the Mike & Ophelia Lazaridis Quantum-Nano Centre, a state of the art laboratory located on the main campus of the University of Waterloo that rivals the best facilities in the world. QNC was opened in September 2012 and houses researchers from both IQC and WIN.
Leading the Establishment of Commercialization Culture for Quantum Technologies in Canada:
For many years, theorists have been able to demonstrate the transformative powers of quantum mechanics on paper. That said, converting these theories to experimentally demonstrable discoveries has, putting it mildly, been a challenge. Many naysayers have suggested that achieving these discoveries was not possible and even the believers suggested that it could likely take decades to achieve these discoveries. Recently, a buzz has been developing globally as experimentalists have been able to achieve demonstrable success with respect to Quantum Information based discoveries. Local experimentalists are very much playing a leading role in this regard. It is believed by many that breakthrough discoveries that will lead to commercialization opportunities may be achieved in the next few years and certainly within the next decade.
Recognizing the unique challenges for the commercialization of quantum technologies (including risk associated with uncertainty of success, complexity of the underlying science and high capital / equipment costs) Mike and Doug have chosen to once again lead by example. The Quantum Valley Investment Fund will provide commercialization funding, expertise and support for researchers that develop breakthroughs in Quantum Information Science that can reasonably lead to new commercializable technologies and applications. Their goal in establishing this Fund is to lead in the development of a commercialization infrastructure and culture for Quantum discoveries in Canada and thereby enable such discoveries to remain here.
Semeniuk goes on to set the stage for Waterloo/Lazaridis vs. Vancouver (from Semeniuk’s 2017 ROB article),
… as happened with Blackberry, the world is once again catching up. While Canada’s funding of quantum technology ranks among the top five in the world, the European Union, China, and the US are all accelerating their investments in the field. Tech giants such as Google [also known as Alphabet], Microsoft and IBM are ramping up programs to develop companies and other technologies based on quantum principles. Meanwhile, even as Lazaridis works to establish Waterloo as the country’s quantum hub, a Vancouver-area company has emerged to challenge that claim. The two camps—one methodically focused on the long game, the other keen to stake an early commercial lead—have sparked an East-West rivalry that many observers of the Canadian quantum scene are at a loss to explain.
Is it possible that some of the rivalry might be due to an influential individual who has invested heavily in a ‘quantum valley’ and has a history of trying to ‘own’ a technology?
Getting back to D-Wave Systems, the Vancouver company, I have written about them a number of times (particularly in 2015; for the full list: input D-Wave into the blog search engine). This June 26, 2015 posting includes a reference to an article in The Economist magazine about D-Wave’s commercial opportunities while the bulk of the posting is focused on a technical breakthrough.
Semeniuk offers an overview of the D-Wave Systems story,
D-Wave was born in 1999, the same year Lazaridis began to fund quantum science in Waterloo. From the start, D-Wave had a more immediate goal: to develop a new computer technology to bring to market. “We didn’t have money or facilities,” says Geordie Rose, a physics PhD who co-founded the company and served in various executive roles. …
The group soon concluded that the kind of machine most scientists were pursuing based on so-called gate-model architecture was decades away from being realized—if ever. …
Instead, D-Wave pursued another idea, based on a principle dubbed “quantum annealing.” This approach seemed more likely to produce a working system, even if the applications that would run on it were more limited. “The only thing we cared about was building the machine,” says Rose. “Nobody else was trying to solve the same problem.”
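To give a flavour of the annealing principle Rose describes, here is a sketch of its classical cousin, simulated annealing (my own illustration, not D-Wave’s code). It minimizes the energy of a tiny “frustrated” Ising triangle, the same class of optimization problem a quantum annealer is built to solve; a quantum annealer explores the energy landscape with quantum effects rather than thermal jitter, but the problem setup is the same.

```python
import math
import random

# A toy Ising problem: choose spins s_i in {-1, +1} to minimize
# E(s) = -sum_(i,j) J[i,j] * s_i * s_j. The triangle below is
# "frustrated": no assignment satisfies all three couplings at once,
# so the best achievable energy is -1 rather than -3.
J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): -1.0}

def energy(spins):
    return -sum(J[i, j] * spins[i] * spins[j] for (i, j) in J)

def anneal(steps=5000, t_start=2.0, t_end=0.01, seed=1):
    random.seed(seed)
    spins = [random.choice([-1, 1]) for _ in range(3)]
    e = energy(spins)
    for step in range(1, steps + 1):
        # geometric cooling schedule from t_start down to t_end
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(3)
        spins[i] *= -1                 # propose a single spin flip
        e_new = energy(spins)
        if e_new <= e or random.random() < math.exp(-(e_new - e) / t):
            e = e_new                  # accept: downhill always, uphill with Boltzmann probability
        else:
            spins[i] *= -1             # reject: undo the flip
    return spins, e

best_spins, best_e = anneal()          # best_e reaches the ground-state energy, -1
```

The early high-temperature steps let the search escape bad configurations; as the temperature drops, the spins settle into a minimum-energy state.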
D-Wave debuted its first prototype at an event in California in February 2007, running it through a few basic problems such as solving a Sudoku puzzle and finding the optimal seating plan for a wedding reception. … “They just assumed we were hucksters,” says Hilton [Jeremy Hilton, D-Wave senior vice-president of systems]. Federico Spedalieri, a computer scientist at the University of Southern California’s [USC] Information Sciences Institute who has worked with D-Wave’s system, says the limited information the company provided about the machine’s operation provoked outright hostility. “I think that played against them a lot in the following years,” he says.
It seems Lazaridis is not the only one who likes to hold company information tightly.
Back to Semeniuk and D-Wave,
Today [October 2017], the Los Alamos National Laboratory owns a D-Wave machine, which costs about $15 million. Others pay to access D-Wave systems remotely. This year, for example, Volkswagen fed data from thousands of Beijing taxis into a machine located in Burnaby [one of the municipalities that make up metro Vancouver] to study ways to optimize traffic flow.
But the application for which D-Wave has the highest hopes is artificial intelligence. Any AI program hinges on the “training” through which a computer acquires automated competence, and the 2000Q [a D-Wave computer] appears well suited to this task. …
Yet, for all the buzz D-Wave has generated, with several research teams outside Canada investigating its quantum annealing approach, the company has elicited little interest from the Waterloo hub. As a result, what might seem like a natural development—the Institute for Quantum Computing acquiring access to a D-Wave machine to explore and potentially improve its value—has not occurred. …
I am particularly interested in this comment as it concerns public funding (from Semeniuk’s article),
Vern Brownell, a former Goldman Sachs executive who became CEO of D-Wave in 2009, calls the lack of collaboration with Waterloo’s research community “ridiculous,” adding that his company’s efforts to establish closer ties have proven futile. “I’ll be blunt: I don’t think our relationship is good enough,” he says. Brownell also points out that, while hundreds of millions in public funds have flowed into Waterloo’s ecosystem, little funding is available for Canadian scientists wishing to make the most of D-Wave’s hardware—despite the fact that it remains unclear which core quantum technology will prove the most profitable.
There’s a lot more to Semeniuk’s article but this is the last excerpt,
The world isn’t waiting for Canada’s quantum rivals to forge a united front. Google, Microsoft, IBM, and Intel are racing to develop a gate-model quantum computer—the sector’s ultimate goal. (Google’s researchers have said they will unveil a significant development early next year.) With the U.K., Australia and Japan pouring money into quantum, Canada, an early leader, is under pressure to keep up. The federal government is currently developing a strategy for supporting the country’s evolving quantum sector and, ultimately, getting a return on its approximately $1-billion investment over the past decade [emphasis mine].
I wonder where the “approximately $1-billion … ” figure came from. I ask because some years ago MP Peter Julian asked the government for information about how much Canadian federal money had been invested in nanotechnology. The government replied with sheets of paper (a pile approximately 2 inches high) that had funding disbursements from various ministries. Each ministry had its own method with different categories for listing disbursements and the titles for the research projects were not necessarily informative for anyone outside a narrow specialty. (Peter Julian’s assistant had kindly sent me a copy of the response they had received.) The bottom line is that it would have been close to impossible to determine the amount of federal funding devoted to nanotechnology using that data. So, where did the $1-billion figure come from?
In any event, it will be interesting to see how the Council of Canadian Academies assesses the ‘quantum’ situation in its more academically inclined, “The State of Science and Technology and Industrial Research and Development in Canada,” when it’s released later this year (2018).
Despite any doubts one might have about Lazaridis’ approach to research and technology, his tremendous investment and support cannot be denied. Without him, Canada’s quantum research efforts would be substantially less significant. As for the ‘cowboys’ in Vancouver, it takes a certain temperament to found a start-up company and it seems the D-Wave folks have more in common with Lazaridis than they might like to admit. As for the Quanta graduate programme, it’s early days yet and no one should ever count out Alberta.
Meanwhile, one can continue to hope that a more thoughtful approach to regional collaboration will be adopted so Canada can continue to blaze trails in the field of quantum research.
Robert Wolkow is no stranger to mastering the ultra-small and the ultra-fast. A pioneer in atomic-scale science with a Guinness World Record to boot (for a needle with a single atom at the point), Wolkow’s team, together with collaborators at the Max Planck Institute in Hamburg, has just released findings that detail how to create atomic switches for electricity, many times smaller than what is currently used.
What does it all mean? With applications for practical systems like silicon semi-conductor electronics, it means smaller, more efficient, more energy-conserving computers, as just one example of the technology revolution that is unfolding right before our very eyes (if you can squint that hard).
“This is the first time anyone’s seen a switching of a single-atom channel,” explains Wolkow, a physics professor at the University of Alberta and the Principal Research Officer at Canada’s National Institute for Nanotechnology. “You’ve heard of a transistor—a switch for electricity—well, our switches are almost a hundred times smaller than the smallest on the market today.”
Today’s tiniest transistors operate at the 14 nanometer level, which still represents thousands of atoms. Wolkow and his team at the University of Alberta, NINT, and his spinoff QSi have worked the technology down to just a few atoms. Since computers are simply a composition of many on/off switches, the findings point the way not only to ultra-efficient general purpose computing but also to a new path to quantum computing.
Green technology for the digital economy
“We’re using this technology to make ultra-green, energy-conserving general purpose computers but also to further the development of quantum computers. We are building the most energy conserving electronics ever, consuming about a thousand times less power than today’s electronics.”
While the new tech is small, the potential societal, economic, and environmental impact of Wolkow’s discovery is very large. Today, our electronics consume several percent of the world’s electricity. As the size of the energy footprint of the digital economy increases, material and energy conservation is becoming increasingly important.
Wolkow says there are surprising benefits to being smaller, both for normal computers, and, for quantum computers too. “Quantum systems are characterized by their delicate hold on information. They’re ever so easily perturbed. Interestingly though, the smaller the system gets, the fewer upsets.” Therefore, Wolkow explains, you can create a system that is simultaneously amazingly small, using less material and churning through less energy, while holding onto information just right.
Smaller systems equal smaller environmental footprint
When the new technology is fully developed, it will lead to not only a smaller energy footprint but also more affordable systems for consumers. “It’s kind of amazing when everything comes together,” says Wolkow.
Wolkow is one of the few people in the world talking about atom-scale manufacturing and believes we are witnessing the beginning of the revolution to come. He and his team have been working with large-scale industry leader Lockheed Martin as the entry point to the market.
“It’s something you don’t even hear about yet, but atom-scale manufacturing is going to be world-changing. People think it’s not quite doable, but we’re already making things out of atoms routinely. We aren’t doing it just because. We are doing it because the things we can make have ever more desirable properties. They’re not just smaller. They’re different and better. This is just the beginning of what will be at least a century of developments in atom-scale manufacturing, and it will be transformational.”
Bill Mah in a Nov. 1, 2016 article for the Edmonton Journal delves a little further into issues around making transistors smaller and the implications of a single-atom switch,
Current computers use transistors, which are essentially valves for flowing streams of electrons around a circuit. In recent years, engineers have found ways to make these devices smaller, but pushing electrons through narrow spaces raises the danger of the machines overheating and failing.
“The transistors get too hot so you have to run them slower and more gently, so we’re getting more power in modern computers because there are more transistors, but we can’t run them very quickly because they make a lot of heat and they actually just shut down and fail.”
The smallest transistors are currently about 14 nanometres. A nanometre is one-billionth of a metre and contains groupings of 1,000 or more atoms. The switches detailed by Wolkow and his colleagues will shrink them down to just a few atoms.
Potential benefits from the advance could lead to much more energy-efficient and smaller computers, an increasingly important consideration as the power consumption of digital devices keeps growing.
“The world is using about three per cent of our energy today on digital communications and computers,” Wolkow said. “Various reports I’ve seen say that it could easily go up to 10 or 15 per cent in a couple of decades, so it’s crucial that we get that under control.”
Wolkow’s team has received funding from companies such as Lockheed Martin and local investors.
The advances could also open a path to quantum computing. “It turns out these same building blocks … enable a quantum computer, so we’re kind of feverishly working on that at the same time.”
There is an animation illustrating a single-atom switch,
This animation represents an electrical current being switched on and off. Remarkably, the current is confined to a channel that is just one atom wide. Also, the switch is made of just one atom. When the atom in the centre feels an electric field tugging at it, it loses its electron. Once that electron is lost, the many electrons in the body of the silicon (to the left) have a clear passage to flow through. When the electric field is removed, an electron gets trapped in the central atom, switching the current off. Courtesy: University of Alberta
Here’s a link to and a citation for the research paper,
Time-resolved single dopant charge dynamics in silicon by Mohammad Rashidi, Jacob A. J. Burgess, Marco Taucer, Roshan Achal, Jason L. Pitters, Sebastian Loth, & Robert A. Wolkow. Nature Communications 7, Article number: 13258 (2016) doi:10.1038/ncomms13258 Published online: 26 October 2016
Using a small quantum system consisting of three superconducting qubits, researchers at UC Santa Barbara and Google have uncovered a link between aspects of classical and quantum physics thought to be unrelated: classical chaos and quantum entanglement. Their findings suggest that it would be possible to use controllable quantum systems to investigate certain fundamental aspects of nature.
“It’s kind of surprising because chaos is this totally classical concept — there’s no idea of chaos in a quantum system,” said Charles Neill, a researcher in the UCSB Department of Physics and lead author of a paper that appears in Nature Physics. “Similarly, there’s no concept of entanglement within classical systems. And yet it turns out that chaos and entanglement are really very strongly and clearly related.”
Initiated in the 17th century, classical physics generally examines and describes systems larger than atoms and molecules. It consists of hundreds of years’ worth of study including Newton’s laws of motion, electrodynamics, relativity, and thermodynamics, as well as chaos theory — the field that studies the behavior of highly sensitive and unpredictable systems. One classic example of chaos theory is the weather, in which a relatively small change in one part of the system is enough to foil predictions — and vacation plans — anywhere on the globe.
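The weather example is really about sensitivity to initial conditions. A quick way to see that behaviour for yourself (my illustration, not part of the study) is the logistic map at its chaotic parameter value: two starting points differing by one part in a billion end up on completely different trajectories within a few dozen steps.

```python
# Two orbits of the logistic map x -> r*x*(1-x) at the fully chaotic
# parameter r = 4, started one part in a billion apart. The tiny initial
# gap roughly doubles each step, so the orbits decorrelate completely
# within a few dozen iterations: the "butterfly effect" in a few lines.
def logistic_orbit(x0, r=4.0, steps=40):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-9)
gap = [abs(x - y) for x, y in zip(a, b)]   # starts at 1e-9, grows to order 1
```

Exactly the same arithmetic, a difference in the ninth decimal place, and no hope of long-range prediction: that is chaos in the classical sense used above.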
At smaller size and length scales in nature, however, such as those involving atoms and photons and their behaviors, classical physics falls short. In the early 20th century quantum physics emerged, with its seemingly counterintuitive and sometimes controversial science, including the notions of superposition (the theory that a particle can be located in several places at once) and entanglement (particles that are deeply linked behave as such despite physical distance from one another).
And so began the continuing search for connections between the two fields.
All systems are fundamentally quantum systems, according [to] Neill, but the means of describing in a quantum sense the chaotic behavior of, say, air molecules in an evacuated room, remains limited.
Imagine taking a balloon full of air molecules, somehow tagging them so you could see them and then releasing them into a room with no air molecules, noted co-author and UCSB/Google researcher Pedram Roushan. One possible outcome is that the air molecules remain clumped together in a little cloud following the same trajectory around the room. And yet, he continued, as we can probably intuit, the molecules will more likely take off in a variety of velocities and directions, bouncing off walls and interacting with each other, resting after the room is sufficiently saturated with them.
“The underlying physics is chaos, essentially,” he said. The molecules coming to rest — at least on the macroscopic level — is the result of thermalization, or of reaching equilibrium after they have achieved uniform saturation within the system. But in the infinitesimal world of quantum physics, there is still little to describe that behavior. The mathematics of quantum mechanics, Roushan said, do not allow for the chaos described by Newtonian laws of motion.
To investigate, the researchers devised an experiment using three quantum bits, the basic computational units of the quantum computer. Unlike classical computer bits, which utilize a binary system of two possible states (e.g., zero/one), a qubit can also use a superposition of both states (zero and one) as a single state. Additionally, multiple qubits can entangle, or link so closely that their measurements will automatically correlate. By manipulating these qubits with electronic pulses, Neill caused them to interact, rotate and evolve in the quantum analog of a highly sensitive classical system.
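For readers who want something concrete, the superposition and entanglement described above can be sketched with plain linear algebra (my own illustration; the experiment itself used superconducting hardware, not this code): a Hadamard gate puts one qubit into an equal superposition, and a CNOT gate then entangles it with a second qubit into a Bell state.

```python
import numpy as np

# One qubit in superposition, then two qubits entangled: a state-vector
# sketch of the concepts described above (illustrative only).
ket0 = np.array([1.0, 0.0])

# A Hadamard gate turns |0> into the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
plus = H @ ket0

# A CNOT gate applied to (H|0>) tensor |0> yields the Bell state
# (|00> + |11>)/sqrt(2): measuring either qubit gives a random bit,
# but the two measurement outcomes are always perfectly correlated.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
bell = CNOT @ np.kron(plus, ket0)

probs = bell ** 2   # probabilities for outcomes 00, 01, 10, 11: [0.5, 0, 0, 0.5]
```

Neither qubit on its own has a definite value, yet the pair’s outcomes are locked together; that correlation is the entanglement the researchers manipulate with electronic pulses.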
The result is a map of entanglement entropy of a qubit that, over time, comes to strongly resemble that of classical dynamics — the regions of entanglement in the quantum map resemble the regions of chaos on the classical map. The islands of low entanglement in the quantum map are located in the places of low chaos on the classical map.
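The “entanglement entropy” being mapped has a precise definition: the von Neumann entropy of one qubit’s reduced density matrix. A minimal sketch of that definition (mine, not the researchers’ analysis code) shows it is zero for an unentangled state and one full bit for a maximally entangled Bell state.

```python
import numpy as np

# Entanglement entropy of one qubit in a two-qubit pure state:
# the von Neumann entropy (in bits) of its reduced density matrix.
# Zero means unentangled; one bit means maximally entangled.
def entanglement_entropy(psi):
    psi = np.asarray(psi, dtype=complex).reshape(2, 2)  # amplitudes indexed (A, B)
    rho_a = psi @ psi.conj().T                          # partial trace over qubit B
    evals = np.linalg.eigvalsh(rho_a).real
    evals = evals[evals > 1e-12]                        # drop (numerically) zero eigenvalues
    return float(-np.sum(evals * np.log2(evals)))

product = [1.0, 0.0, 0.0, 0.0]                     # |00>: entropy 0 bits
bell = [1 / np.sqrt(2), 0.0, 0.0, 1 / np.sqrt(2)]  # (|00> + |11>)/sqrt(2): entropy 1 bit
```

Plotting this quantity for one qubit across many initial conditions is what produces the entropy map that the researchers compared against the classical chaos map.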
“There’s a very clear connection between entanglement and chaos in these two pictures,” said Neill. “And, it turns out that thermalization is the thing that connects chaos and entanglement. It turns out that they are actually the driving forces behind thermalization.
“What we realize is that in almost any quantum system, including on quantum computers, if you just let it evolve and you start to study what happens as a function of time, it’s going to thermalize,” added Neill, referring to the quantum-level equilibration. “And this really ties together the intuition between classical thermalization and chaos and how it occurs in quantum systems that entangle.”
The study’s findings have fundamental implications for quantum computing. At the level of three qubits, the computation is relatively simple, said Roushan, but as researchers push to build increasingly sophisticated and powerful quantum computers that incorporate more qubits to study highly complex problems that are beyond the ability of classical computing — such as those in the realms of machine learning, artificial intelligence, fluid dynamics or chemistry — a quantum processor optimized for such calculations will be a very powerful tool.
“It means we can study things that are completely impossible to study right now, once we get to bigger systems,” said Neill.
Experimental link between quantum entanglement (left) and classical chaos (right) found using a small quantum computer. (Image courtesy of UCSB)
Here’s a link to and a citation for the paper,
Ergodic dynamics and thermalization in an isolated quantum system by C. Neill, P. Roushan, M. Fang, Y. Chen, M. Kolodrubetz, Z. Chen, A. Megrant, R. Barends, B. Campbell, B. Chiaro, A. Dunsworth, E. Jeffrey, J. Kelly, J. Mutus, P. J. J. O’Malley, C. Quintana, D. Sank, A. Vainsencher, J. Wenner, T. C. White, A. Polkovnikov, & J. M. Martinis. Nature Physics (2016) doi:10.1038/nphys3830 Published online 11 July 2016
The upcoming performance featuring a quantum computer built by D-Wave Systems (a Canadian company) and Welsh mezzo soprano Juliette Pochin is the première of “Superposition” by Alexis Kirke. A July 13, 2016 news item on phys.org provides more detail,
What happens when you combine the pure tones of an internationally renowned mezzo soprano and the complex technology of a $15 million quantum supercomputer?
The answer will be exclusively revealed to audiences at the Port Eliot Festival [Cornwall, UK] when Superposition, created by Plymouth University composer Alexis Kirke, receives its world premiere later this summer.
A D-Wave 1000 Qubit Quantum Processor. Credit: D-Wave Systems Inc
Combining the arts and sciences, as Dr Kirke has done with many of his previous works, the 15-minute piece will begin dark and mysterious with celebrated performer Juliette Pochin singing a low-pitched slow theme.
But gradually the quiet sounds of electronic ambience will emerge over or beneath her voice, as the sounds of her singing are picked up by a microphone and sent over the internet to the D-Wave quantum computer at the University of Southern California.
It then reacts with behaviours in the quantum realm that are turned into sounds back in the performance venue, the Round Room at Port Eliot, creating a unique and ground-breaking duet.
And when the singer ends, the quantum processes are left to slowly fade away naturally, making their final sounds as the lights go to black.
Dr Kirke, a member of the Interdisciplinary Centre for Computer Music Research at Plymouth University, said:
“There are only a handful of these computers accessible in the world, and this is the first time one has been used as part of a creative performance. So while it is a great privilege to be able to put this together, it is an incredibly complex area of computing and science and it has taken almost two years to get to this stage. For most people, this will be the first time they have seen a quantum computer in action and I hope it will give them a better understanding of how it works in a creative and innovative way.”
Plymouth University is the official Creative and Cultural Partner of the Port Eliot Festival, taking place in South East Cornwall from July 28 to 31, 2016 [emphasis mine].
And Superposition will be one of a number of showcases of University talent and expertise as part of the first Port Eliot Science Lab. Being staged in the Round Room at Port Eliot, it will give festival goers the chance to explore science, see performances and take part in a range of experiments.
The three-part performance will tell the story of Niobe, one of the more tragic figures in Greek mythology, but in this case a nod to the fact that the heart of the quantum computer contains the metal named after her, niobium. It will also feature a monologue from Hamlet, interspersed with terms from quantum computing.
This is the latest of Dr Kirke’s pioneering performance works, with previous productions including an opera based on the financial crisis and a piece using a cutting edge wave-testing facility as an instrument of percussion.
Geordie Rose, CTO and Founder, D-Wave Systems, said:
“D-Wave’s quantum computing technology has been investigated in many areas such as image recognition, machine learning and finance. We are excited to see Dr Kirke, a pioneer in the field of quantum physics and the arts, utilising a D-Wave 2X in his next performance. Quantum computing is positioned to have a tremendous social impact, and Dr Kirke’s work serves not only as a piece of innovative computer arts research, but also as a way of educating the public about these new types of exotic computing machines.”
Professor Daniel Lidar, Director of the USC Center for Quantum Information Science and Technology, said:
“This is an exciting time to be in the field of quantum computing. This is a field that was purely theoretical until the 1990s and now is making huge leaps forward every year. We have been researching the D-Wave machines for four years now, and have recently upgraded to the D-Wave 2X – the world’s most advanced commercially available quantum optimisation processor. We were very happy to welcome Dr Kirke on a short training residence here at the University of Southern California recently; and are excited to be collaborating with him on this performance, which we see as a great opportunity for education and public awareness.”
Since I can’t be there, I’m hoping they will be able to successfully livestream the performance. According to Kirke who very kindly responded to my query, the festival’s remote location can make livecasting a challenge. He did note that a post-performance documentary is planned and there will be footage from the performance.
He has also provided more information about the singer and the technical/computer aspects of the performance (from a July 18, 2016 email),
Juliette Pochin: I’ve worked with her before, a couple of years ago. She has an amazing voice and style, is musically adventurous (she is a music producer herself), and brings great grace and charisma to a performance. She can be heard in the Harry Potter and Lord of the Rings soundtracks and has performed at venues such as the Royal Albert Hall and Proms in the Park, and with Meatloaf!
Score: The score is in 3 parts of about 5 minutes each. There is a traditional score for parts 1 and 3 that Juliette will sing from. I wrote these manually in traditional music notation. However she can sing in free time and wait for the computer to respond. It is a very dramatic score, almost operatic. The computer’s responses are based on two algorithms: a superposition chord system, and a pitch-loudness entanglement system. The superposition chord system sends a harmony problem to the D-Wave in response to Juliette’s approximate pitch amongst other elements. The D-Wave uses an 8-qubit optimizer to return potential chords. Each potential chord has an energy associated with it. In theory the lowest energy chord is that preferred by the algorithm. However in the performance I will combine the chord solutions to create superposition chords. These are chords which represent, in a very loose way, the superposed solutions which existed in the D-Wave before collapse of the qubits. Technically they are the results of multiple collapses, but metaphorically I can’t think of a more beautiful representation of superposition: chords. These will accompany Juliette, sometimes clashing with her. Sometimes giving way to her.
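Kirke’s superposition-chord system can be sketched in a few lines of code. Everything below is a hypothetical mock-up: `mock_anneal` and its toy energy function stand in for the D-Wave’s 8-qubit optimizer, which is not publicly documented here; only the overall shape (sample several low-energy chord solutions, then merge them into one “superposition chord”) follows his description.

```python
import random

def mock_anneal(root_pitch, n_reads=8):
    """Stand-in for the annealer: return (chord, energy) samples,
    where lower energy means a better harmonic fit to the sung pitch."""
    candidates = []
    for _ in range(n_reads):
        chord = sorted(random.sample(range(12), 3))  # three pitch classes
        # Toy energy: total circular distance of chord tones from the root
        energy = sum(min((p - root_pitch) % 12, (root_pitch - p) % 12)
                     for p in chord)
        candidates.append((chord, energy))
    return sorted(candidates, key=lambda ce: ce[1])  # lowest energy first

def superposition_chord(root_pitch, n_combine=3):
    """Merge the n lowest-energy candidate chords into one chord,
    loosely representing the superposed solutions before readout."""
    samples = mock_anneal(root_pitch)
    merged = set()
    for chord, _ in samples[:n_combine]:
        merged.update(chord)
    return sorted(merged)
```

In this sketch the “collapse” of each anneal yields one candidate chord, and sounding several of them at once is the metaphorical superposition Kirke describes.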
The second subsystem generates non-pitched noises of different lengths, roughnesses and loudness. These are responses to Juliette, but also a result of a simple D-Wave entanglement. We know the D-Wave can entangle in 8-qubit groups. I send a binary representation of Juliette’s loudness to 4 qubits and one of approximate pitch to another 4, then entangle the two. The entanglement weights are chosen for the variety of solutions they produce amongst the qubits, rather than by a particular musical logic. So the non-pitched subsystem is more of a sonification of entanglement than a musical algorithm.
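The entanglement subsystem can likewise be sketched as a small Ising problem, with the usual caveats: the biases, the coupling strength, the brute-force solver standing in for the annealer, and the mapping from spins to noise parameters are all invented for illustration; only the 4-plus-4 qubit layout with couplers between the two groups follows Kirke’s description.

```python
import itertools

def to_bits(value, n=4):
    """Encode an integer (0-15) as n spin values (+1/-1)."""
    return [1 if (value >> i) & 1 else -1 for i in range(n)]

def ising_energy(spins, h, J):
    """Energy of a spin configuration under local fields h and couplers J."""
    e = sum(h[i] * s for i, s in enumerate(spins))
    e += sum(J[(i, j)] * spins[i] * spins[j] for (i, j) in J)
    return e

def sonify(loudness, pitch, coupling=0.5):
    """Hypothetical sonification: bias qubits 0-3 toward the loudness bits,
    qubits 4-7 toward the pitch bits, couple the two groups, then read
    noise parameters off the lowest-energy configuration."""
    h = {}
    for i, b in enumerate(to_bits(loudness)):
        h[i] = -b          # pull qubit i toward the loudness bit
    for i, b in enumerate(to_bits(pitch)):
        h[4 + i] = -b      # pull qubit 4+i toward the pitch bit
    J = {(i, 4 + i): coupling for i in range(4)}  # inter-group couplers
    # Brute-force all 2^8 configurations (stand-in for the annealer)
    best = min(itertools.product([-1, 1], repeat=8),
               key=lambda s: ising_energy(s, h, J))
    # Invented mapping from the solution to two noise parameters in [0, 1]
    length = sum(1 for s in best[:4] if s > 0) / 4
    roughness = sum(1 for s in best[4:] if s > 0) / 4
    return length, roughness
```

With the couplers switched on, the two 4-qubit groups no longer settle independently, which is the (loose) analogue of the entanglement being sonified.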
Thank you Dr. Kirke for a fascinating technical description and for a description of Juliette Pochin that makes one long to hear her in performance.
For anyone wondering about data sonification, I also have a Feb. 7, 2014 post featuring a data sonification project by Dr. Domenico Vicinanza which includes a sound clip of his Voyager 1 & 2 spacecraft duet.
Whoever wrote the news release used a very catchy title “Particle zoo in a quantum computer”; I just wish they’d explained it. Looking up the definition for a ‘particle zoo’ didn’t help as much as I’d hoped. From the particle zoo entry on Wikipedia (Note: Links have been removed),
In particle physics, the term particle zoo is used colloquially to describe a relatively extensive list of the then known “elementary particles” that almost look like hundreds of species in the zoo.
In the history of particle physics, the situation was particularly confusing in the late 1960s. Before the discovery of quarks, hundreds of strongly interacting particles (hadrons) were known, and believed to be distinct elementary particles in their own right. It was later discovered that they were not elementary particles, but rather composites of the quarks. The set of particles believed today to be elementary is known as the Standard Model, and includes quarks, bosons and leptons.
I believe the writer used the term to indicate that the simulation undertaken involved elementary particles. If you have a better explanation, please feel free to add it to the comments for this post.
Elementary particles are the fundamental building blocks of matter, and their properties are described by the Standard Model of particle physics. The discovery of the Higgs boson at CERN in 2012 constitutes a further step towards the confirmation of the Standard Model. However, many aspects of this theory are still not understood because their complexity makes it hard to investigate them with classical computers. Quantum computers may provide a way to overcome this obstacle as they can simulate certain aspects of elementary particle physics in a well-controlled quantum system. Physicists from the University of Innsbruck and the Institute for Quantum Optics and Quantum Information (IQOQI) at the Austrian Academy of Sciences have now done exactly that: In an international first, Rainer Blatt’s and Peter Zoller’s research teams have simulated lattice gauge theories in a quantum computer. …
Gauge theories describe the interaction between elementary particles, such as quarks and gluons, and they are the basis for our understanding of fundamental processes. “Dynamical processes, for example, the collision of elementary particles or the spontaneous creation of particle-antiparticle pairs, are extremely difficult to investigate,” explains Christine Muschik, theoretical physicist at the IQOQI. “However, scientists quickly reach a limit when processing numerical calculations on classical computers. For this reason, it has been proposed to simulate these processes by using a programmable quantum system.” In recent years, many interesting concepts have been proposed, but until now it was impossible to realize them. “We have now developed a new concept that allows us to simulate the spontaneous creation of electron-positron pairs out of the vacuum by using a quantum computer,” says Muschik. The quantum system consists of four electromagnetically trapped calcium ions that are controlled by laser pulses. “Each pair of ions represents a particle and an antiparticle,” explains experimental physicist Esteban A. Martinez. “We use laser pulses to simulate the electromagnetic field in a vacuum. Then we are able to observe how particle pairs are created by quantum fluctuations from the energy of this field. By looking at the ions’ fluorescence, we see whether particles and antiparticles were created. We are able to modify the parameters of the quantum system, which allows us to observe and study the dynamic process of pair creation.”
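For readers curious what such a simulation looks like structurally, here is a rough numerical sketch of the gauge-eliminated Schwinger model (quantum electrodynamics in one spatial dimension) on four spins, the kind of lattice gauge theory the Innsbruck team simulated. The couplings, evolution time, and sign conventions below are illustrative choices of mine, not the experiment’s parameters; on a classical computer this is just a 16-dimensional matrix exponential, which is exactly why larger instances need a quantum simulator.

```python
import numpy as np
from scipy.linalg import expm

N = 4  # four spins, matching the four trapped ions
I2 = np.eye(2)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
sp = np.array([[0, 1], [0, 0]], dtype=complex)  # sigma-plus (raising)
sm = sp.conj().T                                # sigma-minus (lowering)

def site(op, n):
    """Embed a single-spin operator at site n of the N-spin chain."""
    full = np.array([[1.0 + 0j]])
    for k in range(N):
        full = np.kron(full, op if k == n else I2)
    return full

w, m, J = 1.0, 0.5, 1.0  # hopping, fermion mass, electric-field energy (illustrative)
dim = 2 ** N
H = np.zeros((dim, dim), dtype=complex)
for n in range(N - 1):   # pair creation/annihilation (hopping) term
    H += w * (site(sp, n) @ site(sm, n + 1) + site(sm, n) @ site(sp, n + 1))
for n in range(N):       # staggered mass term
    H += (m / 2) * (-1) ** n * site(sz, n)
for n in range(N - 1):   # electric-field energy after eliminating the gauge field
    L = sum(0.5 * (site(sz, l) + (-1) ** l * np.eye(dim)) for l in range(n + 1))
    H += J * (L @ L)

# Bare vacuum: the staggered spin state with no particles or antiparticles
vac = np.zeros(dim, dtype=complex)
vac[0b1010] = 1.0  # spins down, up, down, up (site 0 is the leftmost bit)

# Particle number density: zero in the vacuum, positive once pairs appear
nu_op = sum((-1) ** n * site(sz, n) + np.eye(dim) for n in range(N)) / (2 * N)

def density(psi):
    return (psi.conj() @ nu_op @ psi).real

nu0 = density(vac)                 # no pairs in the bare vacuum
psi_t = expm(-1j * H * 0.5) @ vac  # unitary evolution under H
nu_t = density(psi_t)              # quantum fluctuations have created pairs
```

Evolving the bare vacuum under this Hamiltonian populates particle-antiparticle pairs, the same qualitative signature the experiment reads off the ions’ fluorescence.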
Combining different fields of physics
With this experiment, the physicists in Innsbruck have built a bridge between two different fields in physics: They have used atomic physics experiments to study questions in high-energy physics. While hundreds of theoretical physicists work on the highly complex theories of the Standard Model and experiments are carried out at extremely expensive facilities, such as the Large Hadron Collider at CERN, quantum simulations may be carried out by small groups in tabletop experiments. “These two approaches complement one another perfectly,” says theoretical physicist Peter Zoller. “We cannot replace the experiments that are done with particle colliders. However, by developing quantum simulators, we may be able to understand these experiments better one day.” Experimental physicist Rainer Blatt adds: “Moreover, we can study new processes by using quantum simulation. For example, in our experiment we also investigated particle entanglement produced during pair creation, which is not possible in a particle collider.” The physicists are convinced that future quantum simulators will potentially be able to solve important questions in high-energy physics that cannot be tackled by conventional methods.
Foundation for a new research field
It was only a few years ago that the idea to combine high-energy and atomic physics was proposed. With this work it has been implemented experimentally for the first time. “This approach is conceptually very different from previous quantum simulation experiments studying many-body physics or quantum chemistry. The simulation of elementary particle processes is theoretically very complex and, therefore, has to satisfy very specific requirements. For this reason it is difficult to develop a suitable protocol,” underlines Zoller. The conditions for the experimental physicists were equally demanding: “This is one of the most complex experiments that has ever been carried out in a trapped-ion quantum computer,” says Blatt. “We are still figuring out how these quantum simulations work and will only gradually be able to apply them to more challenging phenomena.” The great theoretical as well as experimental expertise of the physicists in Innsbruck was crucial for the breakthrough. Both Blatt and Zoller emphasize that they have been doing research on quantum computers for many years now and have gained a lot of experience in their implementation. Innsbruck has become one of the leading centers for research in quantum physics; here, the theoretical and experimental branches work together at an extremely high level, which enables them to gain novel insights into fundamental phenomena.
This work on quantum networks comes from a joint Singapore/UK research project, described in a June 2, 2016 news item on ScienceDaily,
You can’t sign up for the quantum internet just yet, but researchers have reported a major experimental milestone towards building a global quantum network — and it’s happening in space.
With a network that carries information in the quantum properties of single particles, you can create secure keys for secret messaging and potentially connect powerful quantum computers in the future. But scientists think you will need equipment in space to get global reach.
Researchers from the National University of Singapore (NUS) and the University of Strathclyde, UK, have become the first to test in orbit technology for satellite-based quantum network nodes.
They have put a compact device carrying components used in quantum communication and computing into orbit. And it works: the team report first data in a paper published 31 May 2016 in the journal Physical Review Applied.
The team’s device, dubbed SPEQS, creates and measures pairs of light particles, called photons. Results from space show that SPEQS is making pairs of photons with correlated properties – an indicator of performance.
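The news item doesn’t say how those correlated properties are quantified, but a standard sanity check for any photon-pair source is to compare measured coincidences against the accidental background expected from two uncorrelated detectors. A minimal sketch with made-up numbers (not SPEQS telemetry):

```python
def accidental_rate(singles_1, singles_2, window):
    """Expected accidental coincidences per second between two
    uncorrelated detectors: product of the singles rates and the
    coincidence window."""
    return singles_1 * singles_2 * window

def true_pair_rate(coincidences, singles_1, singles_2, window):
    """Measured coincidence rate minus the accidental background;
    a clear excess indicates genuinely correlated photon pairs."""
    return coincidences - accidental_rate(singles_1, singles_2, window)

# Illustrative figures: 100,000 singles/s per detector, a 1 ns
# coincidence window, and 1,000 measured coincidences/s
acc = accidental_rate(1e5, 1e5, 1e-9)
pairs = true_pair_rate(1e3, 1e5, 1e5, 1e-9)
```

With these numbers the accidental background is only about 10 events per second, so nearly all of the 1,000 coincidences would reflect real pair correlations.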
Team-leader Alexander Ling, an Assistant Professor at the Centre for Quantum Technologies (CQT) at NUS said, “This is the first time anyone has tested this kind of quantum technology in space.”
The team had to be inventive to redesign a delicate, table-top quantum setup to be small and robust enough to fly inside a nanosatellite only the size of a shoebox. The whole satellite weighs just 1.65 kilogrammes.
Making correlated photons is a precursor to creating entangled photons. Described by Einstein as “spooky action at a distance”, entanglement is a connection between quantum particles that lends security to communication and power to computing.
Professor Artur Ekert, Director of CQT, invented the idea of using entangled particles for cryptography. He said, “Alex and his team are taking entanglement, literally, to a new level. Their experiments will pave the road to secure quantum communication and distributed quantum computation on a global scale. I am happy to see that Singapore is one of the world leaders in this area.”
Local quantum networks already exist [emphasis mine]. The problem Ling’s team aims to solve is a distance limit. Losses limit quantum signals sent through air at ground level or optical fibre to a few hundred kilometres – but we might ultimately use entangled photons beamed from satellites to connect points on opposite sides of the planet. Although photons from satellites still have to travel through the atmosphere, going top-to-bottom is roughly equivalent to going only 10 kilometres at ground level.
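That distance limit is easy to see with a back-of-envelope calculation. Assuming a typical telecom-fibre loss of 0.2 dB/km (my figure, not the article’s), the fraction of photons surviving a fibre run falls off exponentially with distance:

```python
def fibre_transmission(distance_km, loss_db_per_km=0.2):
    """Fraction of photons surviving a fibre run of the given length,
    assuming a constant per-kilometre loss in decibels."""
    return 10 ** (-loss_db_per_km * distance_km / 10)

t_10km = fibre_transmission(10)      # roughly 0.63: most photons survive
t_1000km = fibre_transmission(1000)  # about 1e-20: essentially none arrive
```

After 10 km roughly 63% of photons survive, but after 1,000 km the survival probability is about 10^-20, which is why a satellite downlink that is equivalent to only about 10 km of ground-level air looks so attractive for a global network.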
The group’s first device is a technology pathfinder. It takes photons from a Blu-ray laser and splits them into two, then measures the pair’s properties, all on board the satellite. To do this it contains a laser diode, crystals, mirrors and photon detectors carefully aligned inside an aluminum block. This sits on top of a 10 centimetres by 10 centimetres printed circuit board packed with control electronics.
Through a series of pre-launch tests – and one unfortunate incident – the team became more confident that their design could survive a rocket launch and space conditions. The team had a device in the October 2014 Orbital-3 rocket which exploded on the launch pad. The satellite containing that first device was later found on a beach intact and still in working order.
Even with the success of the more recent mission, a global network is still a few milestones away. The team’s roadmap calls for a series of launches, with the next space-bound SPEQS (Small Photon-Entangling Quantum System) slated to produce entangled photons.
With later satellites, the researchers will try sending entangled photons to Earth and to other satellites. The team are working with standard “CubeSat” nanosatellites, which can get relatively cheap rides into space as rocket ballast. Ultimately, completing a global network would mean having a fleet of satellites in orbit and an array of ground stations.
In the meantime, quantum satellites could also carry out fundamental experiments – for example, testing entanglement over distances bigger than Earth-bound scientists can manage. “We are reaching the limits of how precisely we can test quantum theory on Earth,” said co-author Dr Daniel Oi at the University of Strathclyde.