Tag Archives: quantum computing

Cornwall (UK) connects with University of Southern California for performance by a quantum computer (D-Wave) and mezzo soprano Juliette Pochin

The upcoming performance of a quantum computer built by D-Wave Systems (a Canadian company) and Welsh mezzo soprano Juliette Pochin is the première of “Superposition” by Alexis Kirke. A July 13, 2016 news item on phys.org provides more detail,

What happens when you combine the pure tones of an internationally renowned mezzo soprano and the complex technology of a $15 million quantum supercomputer?

The answer will be exclusively revealed to audiences at the Port Eliot Festival [Cornwall, UK] when Superposition, created by Plymouth University composer Alexis Kirke, receives its world premiere later this summer.

A D-Wave 1000 Qubit Quantum Processor. Credit: D-Wave Systems Inc

A July 13, 2016 Plymouth University press release, which originated the news item, expands on the theme,

Combining the arts and sciences, as Dr Kirke has done with many of his previous works, the 15-minute piece will begin dark and mysterious with celebrated performer Juliette Pochin singing a low-pitched slow theme.

But gradually the quiet sounds of electronic ambience will emerge over or beneath her voice, as the sounds of her singing are picked up by a microphone and sent over the internet to the D-Wave quantum computer at the University of Southern California.

It then reacts with behaviours in the quantum realm that are turned into sounds back in the performance venue, the Round Room at Port Eliot, creating a unique and ground-breaking duet.

And when the singer ends, the quantum processes are left to slowly fade away naturally, making their final sounds as the lights go to black.

Dr Kirke, a member of the Interdisciplinary Centre for Computer Music Research at Plymouth University, said:

“There are only a handful of these computers accessible in the world, and this is the first time one has been used as part of a creative performance. So while it is a great privilege to be able to put this together, it is an incredibly complex area of computing and science and it has taken almost two years to get to this stage. For most people, this will be the first time they have seen a quantum computer in action and I hope it will give them a better understanding of how it works in a creative and innovative way.”

Plymouth University is the official Creative and Cultural Partner of the Port Eliot Festival, taking place in South East Cornwall from July 28 to 31, 2016 [emphasis mine].

And Superposition will be one of a number of showcases of University talent and expertise as part of the first Port Eliot Science Lab. Being staged in the Round Room at Port Eliot, it will give festival goers the chance to explore science, see performances and take part in a range of experiments.

The three-part performance will tell the story of Niobe, one of the more tragic figures in Greek mythology, but in this case a nod to the fact the heart of the quantum computer contains the metal named after her, niobium. It will also feature a monologue from Hamlet, interspersed with terms from quantum computing.

This is the latest of Dr Kirke’s pioneering performance works, with previous productions including an opera based on the financial crisis and a piece using a cutting edge wave-testing facility as an instrument of percussion.

Geordie Rose, CTO and Founder, D-Wave Systems, said:

“D-Wave’s quantum computing technology has been investigated in many areas such as image recognition, machine learning and finance. We are excited to see Dr Kirke, a pioneer in the field of quantum physics and the arts, utilising a D-Wave 2X in his next performance. Quantum computing is positioned to have a tremendous social impact, and Dr Kirke’s work serves not only as a piece of innovative computer arts research, but also as a way of educating the public about these new types of exotic computing machines.”

Professor Daniel Lidar, Director of the USC Center for Quantum Information Science and Technology, said:

“This is an exciting time to be in the field of quantum computing. This is a field that was purely theoretical until the 1990s and now is making huge leaps forward every year. We have been researching the D-Wave machines for four years now, and have recently upgraded to the D-Wave 2X – the world’s most advanced commercially available quantum optimisation processor. We were very happy to welcome Dr Kirke on a short training residence here at the University of Southern California recently; and are excited to be collaborating with him on this performance, which we see as a great opportunity for education and public awareness.”

Since I can’t be there, I’m hoping they will be able to successfully livestream the performance. According to Kirke who very kindly responded to my query, the festival’s remote location can make livecasting a challenge. He did note that a post-performance documentary is planned and there will be footage from the performance.

He has also provided more information about the singer and the technical/computer aspects of the performance (from a July 18, 2016 email),

Juliette Pochin: I’ve worked with her before, a couple of years ago. She has an amazing voice and style, is musically adventurous (she is a music producer herself), and brings great grace and charisma to a performance. She can be heard in the Harry Potter and Lord of the Rings soundtracks and has performed at venues such as the Royal Albert Hall, Proms in the Park, and with Meatloaf!

Score: The score is in 3 parts of about 5 minutes each. There is a traditional score for parts 1 and 3 that Juliette will sing from. I wrote these manually in traditional music notation. However she can sing in free time and wait for the computer to respond. It is a very dramatic score, almost operatic. The computer’s responses are based on two algorithms: a superposition chord system, and a pitch-loudness entanglement system. The superposition chord system sends a harmony problem to the D-Wave in response to Juliette’s approximate pitch amongst other elements. The D-Wave uses an 8-qubit optimizer to return potential chords. Each potential chord has an energy associated with it. In theory the lowest energy chord is that preferred by the algorithm. However in the performance I will combine the chord solutions to create superposition chords. These are chords which represent, in a very loose way, the superposed solutions which existed in the D-Wave before collapse of the qubits. Technically they are the results of multiple collapses, but metaphorically I can’t think of a more beautiful representation of superposition: chords. These will accompany Juliette, sometimes clashing with her. Sometimes giving way to her.
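Kirke's description of the superposition chord system (candidate chords come back from the optimizer with energies, and the low-energy solutions are merged into one chord) can be sketched classically. Everything below is an assumption for illustration: the tone set, the toy energy function, and all the names are mine, not Kirke's actual code or the D-Wave interface.

```python
import random

# Hypothetical sketch: each 8-bit solution selects which of eight chord tones
# (semitone offsets above the sung pitch) are sounded, and carries an energy;
# lower energy means a more "preferred" harmony.

CHORD_TONES = [0, 3, 4, 7, 8, 10, 12, 14]  # semitone offsets (assumed)

def toy_energy(bits):
    """Stand-in for the optimizer's energy: penalise semitone clashes."""
    tones = [t for b, t in zip(bits, CHORD_TONES) if b]
    clashes = sum(1 for i in range(len(tones))
                  for j in range(i + 1, len(tones))
                  if abs(tones[i] - tones[j]) % 12 == 1)
    return clashes - 0.1 * len(tones)  # favour fuller, less clashing chords

def sample_solutions(n=20, seed=1):
    """Classical stand-in for repeated D-Wave reads: random 8-bit solutions,
    sorted from lowest to highest energy."""
    rng = random.Random(seed)
    sols = [[rng.randint(0, 1) for _ in range(8)] for _ in range(n)]
    return sorted(sols, key=toy_energy)

def superposition_chord(sung_midi_pitch, k=3):
    """Merge the k lowest-energy solutions into one 'superposition chord'."""
    merged = set()
    for bits in sample_solutions()[:k]:
        merged |= {sung_midi_pitch + t
                   for b, t in zip(bits, CHORD_TONES) if b}
    return sorted(merged)

print(superposition_chord(57))  # chord built around the pitch A3
```

The merge step is the point: rather than keeping only the single lowest-energy answer, several near-minimum solutions sound at once, which is the metaphor for superposition Kirke describes.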

The second subsystem generates non-pitched noises of different lengths, roughnesses and loudness. These are responses to Juliette, but also a result of a simple D-Wave entanglement. We know the D-Wave can entangle in 8-qubit groups. I send a binary representation of Juliette’s loudness to 4 qubits and one of approximate pitch to another 4, then entangle the two. The chosen entanglement weights are selected for their variety of solutions amongst the qubits, rather than by a particular musical logic. So the non-pitched subsystem is more of a sonification of entanglement than a musical algorithm.
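The mapping Kirke outlines (loudness quantised to 4 bits, pitch to another 4, the two registers coupled) might look roughly like this. This is a classical stand-in that simply biases one register towards the other; the actual piece uses D-Wave qubit couplings, and all ranges, probabilities and names here are assumed for illustration.

```python
import random

def quantise(value, lo, hi, bits=4):
    """Map a value in [lo, hi] to an n-bit integer (default 4 bits)."""
    frac = min(max((value - lo) / (hi - lo), 0.0), 1.0)
    return round(frac * (2 ** bits - 1))

def sonify(loudness_db, pitch_hz, seed=0):
    """Turn a sung note into parameters for a non-pitched noise burst."""
    rng = random.Random(seed)
    loud = quantise(loudness_db, 40, 100)   # 4-bit loudness register
    pitch = quantise(pitch_hz, 200, 1000)   # 4-bit approximate-pitch register
    # "Entangled measurement" stand-in: each pitch bit flips with small
    # probability, so the registers come out correlated but not identical.
    coupled = pitch ^ sum((rng.random() < 0.1) << i for i in range(4))
    # Map the registers to noise parameters: length, roughness, loudness.
    return {"length_s": 0.2 + 0.1 * loud,
            "roughness": coupled / 15.0,
            "gain": loud / 15.0}

print(sonify(72.0, 440.0))
```

The design choice mirrors Kirke's remark: the coupling is chosen for variety of outcomes rather than musical logic, so the output is a sonification of the correlation itself.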

Thank you Dr. Kirke for a fascinating technical description and for a description of Juliette Pochin that makes one long to hear her in performance.

For anyone who’s thinking of attending the performance or curious, you can find out more about the Port Eliot festival here, Juliette Pochin here, and Alexis Kirke here.

For anyone wondering about data sonification, I also have a Feb. 7, 2014 post featuring a data sonification project by Dr. Domenico Vicinanza which includes a sound clip of his Voyager 1 & 2 spacecraft duet.

Simulating elementary physics in a quantum simulation (particle zoo in a quantum computer?)

Whoever wrote the news release used a very catchy title “Particle zoo in a quantum computer”; I just wish they’d explained it. Looking up the definition for a ‘particle zoo’ didn’t help as much as I’d hoped. From the particle zoo entry on Wikipedia (Note: Links have been removed),

In particle physics, the term particle zoo[1][2] is used colloquially to describe a relatively extensive list of the then known “elementary particles” that almost look like hundreds of species in the zoo.

In the history of particle physics, the situation was particularly confusing in the late 1960s. Before the discovery of quarks, hundreds of strongly interacting particles (hadrons) were known, and believed to be distinct elementary particles in their own right. It was later discovered that they were not elementary particles, but rather composites of the quarks. The set of particles believed today to be elementary is known as the Standard Model, and includes quarks, bosons and leptons.

I believe the writer used the term to indicate that the simulation undertaken involved elementary particles. If you have a better explanation, please feel free to add it to the comments for this post.

Here’s the news from a June 22, 2016 news item on ScienceDaily,

Elementary particles are the fundamental building blocks of matter, and their properties are described by the Standard Model of particle physics. The discovery of the Higgs boson at CERN in 2012 constitutes a further step towards the confirmation of the Standard Model. However, many aspects of this theory are still not understood because their complexity makes it hard to investigate them with classical computers. Quantum computers may provide a way to overcome this obstacle as they can simulate certain aspects of elementary particle physics in a well-controlled quantum system. Physicists from the University of Innsbruck and the Institute for Quantum Optics and Quantum Information (IQOQI) at the Austrian Academy of Sciences have now done exactly that: In an international first, Rainer Blatt’s and Peter Zoller’s research teams have simulated lattice gauge theories in a quantum computer. …

A June 23, 2016 University of Innsbruck (Universität Innsbruck) press release, which seems to have originated the news item, provides more detail,

Gauge theories describe the interaction between elementary particles, such as quarks and gluons, and they are the basis for our understanding of fundamental processes. “Dynamical processes, for example, the collision of elementary particles or the spontaneous creation of particle-antiparticle pairs, are extremely difficult to investigate,” explains Christine Muschik, theoretical physicist at the IQOQI. “However, scientists quickly reach a limit when processing numerical calculations on classical computers. For this reason, it has been proposed to simulate these processes by using a programmable quantum system.” In recent years, many interesting concepts have been proposed, but until now it was impossible to realize them. “We have now developed a new concept that allows us to simulate the spontaneous creation of electron-positron pairs out of the vacuum by using a quantum computer,” says Muschik. The quantum system consists of four electromagnetically trapped calcium ions that are controlled by laser pulses. “Each pair of ions represents a particle-antiparticle pair,” explains experimental physicist Esteban A. Martinez. “We use laser pulses to simulate the electromagnetic field in a vacuum. Then we are able to observe how particle pairs are created by quantum fluctuations from the energy of this field. By looking at the ions’ fluorescence, we see whether particles and antiparticles were created. We are able to modify the parameters of the quantum system, which allows us to observe and study the dynamic process of pair creation.”

Combining different fields of physics

With this experiment, the physicists in Innsbruck have built a bridge between two different fields in physics: They have used atomic physics experiments to study questions in high-energy physics. While hundreds of theoretical physicists work on the highly complex theories of the Standard Model and experiments are carried out at extremely expensive facilities, such as the Large Hadron Collider at CERN, quantum simulations may be carried out by small groups in tabletop experiments. “These two approaches complement one another perfectly,” says theoretical physicist Peter Zoller. “We cannot replace the experiments that are done with particle colliders. However, by developing quantum simulators, we may be able to understand these experiments better one day.” Experimental physicist Rainer Blatt adds: “Moreover, we can study new processes by using quantum simulation. For example, in our experiment we also investigated particle entanglement produced during pair creation, which is not possible in a particle collider.” The physicists are convinced that future quantum simulators will potentially be able to solve important questions in high-energy physics that cannot be tackled by conventional methods.

Foundation for a new research field

It was only a few years ago that the idea to combine high-energy and atomic physics was proposed. With this work it has been implemented experimentally for the first time. “This approach is conceptually very different from previous quantum simulation experiments studying many-body physics or quantum chemistry. The simulation of elementary particle processes is theoretically very complex and, therefore, has to satisfy very specific requirements. For this reason it is difficult to develop a suitable protocol,” underlines Zoller. The conditions for the experimental physicists were equally demanding: “This is one of the most complex experiments that has ever been carried out in a trapped-ion quantum computer,” says Blatt. “We are still figuring out how these quantum simulations work and will only gradually be able to apply them to more challenging phenomena.” The great theoretical as well as experimental expertise of the physicists in Innsbruck was crucial for the breakthrough. Both Blatt and Zoller emphasize that they have been doing research on quantum computers for many years now and have gained a lot of experience in their implementation. Innsbruck has become one of the leading centers for research in quantum physics; here, the theoretical and experimental branches work together at an extremely high level, which enables them to gain novel insights into fundamental phenomena.

Here’s a link to and a citation for the paper,

Real-time dynamics of lattice gauge theories with a few-qubit quantum computer by Esteban A. Martinez, Christine A. Muschik, Philipp Schindler, Daniel Nigg, Alexander Erhard, Markus Heyl, Philipp Hauke, Marcello Dalmonte, Thomas Monz, Peter Zoller, & Rainer Blatt.  Nature 534, 516–519 (23 June 2016)  doi:10.1038/nature18318 Published online 22 June 2016

This paper is behind a paywall.

There is a soundcloud audio file featuring an explanation of the work from the lead author, Esteban A. Martinez,

Testing technology for a global quantum network

This work on quantum networks comes from a joint Singapore/UK research project, from a June 2, 2016 news item on ScienceDaily,

You can’t sign up for the quantum internet just yet, but researchers have reported a major experimental milestone towards building a global quantum network — and it’s happening in space.

With a network that carries information in the quantum properties of single particles, you can create secure keys for secret messaging and potentially connect powerful quantum computers in the future. But scientists think you will need equipment in space to get global reach.

Researchers from the National University of Singapore (NUS) and the University of Strathclyde, UK, have become the first to test in orbit technology for satellite-based quantum network nodes.

They have put a compact device carrying components used in quantum communication and computing into orbit. And it works: the team report first data in a paper published 31 May 2016 in the journal Physical Review Applied.

A June 2, 2016 National University of Singapore press release, which originated the news item, provides more detail,

The team’s device, dubbed SPEQS, creates and measures pairs of light particles, called photons. Results from space show that SPEQS is making pairs of photons with correlated properties – an indicator of performance.

Team-leader Alexander Ling, an Assistant Professor at the Centre for Quantum Technologies (CQT) at NUS said, “This is the first time anyone has tested this kind of quantum technology in space.”

The team had to be inventive to redesign a delicate, table-top quantum setup to be small and robust enough to fly inside a nanosatellite only the size of a shoebox. The whole satellite weighs just 1.65 kilogrammes.

Towards entanglement

Making correlated photons is a precursor to creating entangled photons. Described by Einstein as “spooky action at a distance”, entanglement is a connection between quantum particles that lends security to communication and power to computing.

Professor Artur Ekert, Director of CQT, invented the idea of using entangled particles for cryptography. He said, “Alex and his team are taking entanglement, literally, to a new level. Their experiments will pave the road to secure quantum communication and distributed quantum computation on a global scale. I am happy to see that Singapore is one of the world leaders in this area.”

Local quantum networks already exist [emphasis mine]. The problem Ling’s team aims to solve is a distance limit. Losses limit quantum signals sent through air at ground level or optical fibre to a few hundred kilometers – but we might ultimately use entangled photons beamed from satellites to connect points on opposite sides of the planet. Although photons from satellites still have to travel through the atmosphere, going top-to-bottom is roughly equivalent to going only 10 kilometres at ground level.
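The distance limit described above can be sanity-checked with a simple exponential-loss model. The 0.2 dB/km attenuation figure below is a typical telecom-fibre value assumed for illustration, not taken from the paper:

```python
# Back-of-envelope check on why ground links stall at a few hundred km,
# under an assumed fibre loss of ~0.2 dB/km.

def transmission_db(loss_db):
    """Fraction of photons surviving a given total loss in dB."""
    return 10 ** (-loss_db / 10)

def fibre_transmission(km, db_per_km=0.2):
    """Surviving fraction after `km` kilometres of fibre (assumed loss rate)."""
    return transmission_db(db_per_km * km)

for km in (100, 300, 500):
    print(km, "km ->", fibre_transmission(km))
```

On this model, 100 km already costs 20 dB (one photon in a hundred survives), and 500 km leaves roughly one in ten billion, which is why a satellite pass through only ~10 km-equivalent of atmosphere is so attractive.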

The group’s first device is a technology pathfinder. It takes photons from a Blu-ray laser and splits them into two, then measures the pair’s properties, all on board the satellite. To do this it contains a laser diode, crystals, mirrors and photon detectors carefully aligned inside an aluminum block. This sits on top of a 10-centimetre by 10-centimetre printed circuit board packed with control electronics.

Through a series of pre-launch tests – and one unfortunate incident – the team became more confident that their design could survive a rocket launch and space conditions. The team had a device in the October 2014 Orbital-3 rocket which exploded on the launch pad. The satellite containing that first device was later found on a beach intact and still in working order.

Future plans

Even with the success of the more recent mission, a global network is still a few milestones away. The team’s roadmap calls for a series of launches, with the next space-bound SPEQS slated to produce entangled photons. SPEQS stands for Small Photon-Entangling Quantum System.

With later satellites, the researchers will try sending entangled photons to Earth and to other satellites. The team are working with standard “CubeSat” nanosatellites, which can get relatively cheap rides into space as rocket ballast. Ultimately, completing a global network would mean having a fleet of satellites in orbit and an array of ground stations.

In the meantime, quantum satellites could also carry out fundamental experiments – for example, testing entanglement over distances bigger than Earth-bound scientists can manage. “We are reaching the limits of how precisely we can test quantum theory on Earth,” said co-author Dr Daniel Oi at the University of Strathclyde.

Here’s a link to and a citation for the paper,

Generation and Analysis of Correlated Pairs of Photons aboard a Nanosatellite by Zhongkan Tang, Rakhitha Chandrasekara, Yue Chuan Tan, Cliff Cheng, Luo Sha, Goh Cher Hiang, Daniel K. L. Oi, and Alexander Ling. Phys. Rev. Applied 5, 054022 DOI: http://dx.doi.org/10.1103/PhysRevApplied.5.054022 Published 31 May 2016

This paper is behind a paywall.

Nanodevices and quantum entanglement

A May 30, 2016 news item on phys.org introduces a scientist with an intriguing approach to quantum computing,

Creating quantum computers, which some people believe will be the next generation of computers with the ability to outperform machines based on conventional technology, depends upon harnessing the principles of quantum mechanics, or the physics that governs the behavior of particles at the subatomic scale. Entanglement—a concept that Albert Einstein once called “spooky action at a distance”—is integral to quantum computing, as it allows two physically separated particles to store and exchange information.

Stevan Nadj-Perge, assistant professor of applied physics and materials science, is interested in creating a device that could harness the power of entangled particles within a usable technology. However, one barrier to the development of quantum computing is decoherence, or the tendency of outside noise to destroy the quantum properties of a quantum computing device and ruin its ability to store information.

Nadj-Perge, who is originally from Serbia, received his undergraduate degree from Belgrade University and his PhD from Delft University of Technology in the Netherlands. He received a Marie Curie Fellowship in 2011, and joined the Caltech Division of Engineering and Applied Science in January after completing postdoctoral appointments at Princeton and Delft.

He recently talked with us about how his experimental work aims to resolve the problem of decoherence.

A May 27, 2016 California Institute of Technology (CalTech) news release by Jessica Stoller-Conrad, which originated the news item, proceeds with a question and answer format,

What is the overall goal of your research?

A large part of my research is focused on finding ways to store and process quantum information. Typically, if you have a quantum system, it loses its coherent properties—and therefore, its ability to store quantum information—very quickly. Quantum information is very fragile and even the smallest amount of external noise messes up quantum states. This is true for all quantum systems. There are various schemes that tackle this problem and postpone decoherence, but the one that I’m most interested in involves Majorana fermions. These particles were proposed to exist in nature almost eighty years ago but interestingly were never found.

Relatively recently theorists figured out how to engineer these particles in the lab. It turns out that, under certain conditions, when you combine certain materials and apply high magnetic fields at very cold temperatures, electrons will form a state that looks exactly as you would expect from Majorana fermions. Furthermore, such engineered states allow you to store quantum information in a way that postpones decoherence.

How exactly is quantum information stored using these Majorana fermions?

The fascinating property of these particles is that they always come in pairs. If you can store information in a pair of Majorana fermions it will be protected against all of the usual environmental noise that affects quantum states of individual objects. The information is protected because it is not stored in a single particle but in the pair itself. My lab is developing ways to engineer nanodevices which host Majorana fermions. Hopefully one day our devices will find applications in quantum computing.

Why did you want to come to Caltech to do this work?

The concept of engineered Majorana fermions and topological protection was, to a large degree, conceived here at Caltech by Alexei Kitaev [Ronald and Maxine Linde Professor of Theoretical Physics and Mathematics] who is in the physics department. A couple of physicists here at Caltech, Gil Refael [professor of theoretical physics and executive officer of physics] and Jason Alicea [professor of theoretical physics], are doing theoretical work that is very relevant for my field.

Do you have any collaborations planned here?

Nothing formal, but I’ve been talking a lot with Gil and Jason. A student of mine also uses resources in the lab of Harry Atwater [Howard Hughes Professor of Applied Physics and Materials Science and director of the Joint Center for Artificial Photosynthesis], who has experience with materials that are potentially useful for our research.

How does that project relate to your lab’s work?

There are two-dimensional, or 2-D, materials that are basically very thin sheets of atoms. Graphene [emphasis mine]—a single layer of carbon atoms—is one example, but you can create single layer sheets of atoms with many materials. Harry Atwater’s group is working on solar cells made of a 2-D material. We are thinking of using the same materials and combining them with superconductors—materials that can conduct electricity without releasing heat, sound, or any other form of energy—in order to produce Majorana fermions.

How do you do that?

There are several proposed ways of using 2-D materials to create Majorana fermions. The majority of these materials have a strong spin-orbit coupling—an interaction of a particle’s spin with its motion—which is one of the key ingredients for creating Majoranas. Also some of the 2-D materials can become superconductors at low temperatures. One of the ideas that we are seriously considering is using a 2-D material as a substrate on which we could build atomic chains that will host Majorana fermions.

What got you interested in science when you were young?

I don’t come from a family of scientists; my father is an engineer and my mother is an administrative worker. But my father first got me interested in science. As an engineer, he was always solving something and he brought home some of the problems he was working on. I worked with him and picked it up at an early age.

How are you adjusting to life in California?

Well, I like being outdoors, and here we have the mountains and the beach and it’s really amazing. The weather here is so much better than the other places I’ve lived. If you want to get the impression of what the weather in the Netherlands is like, you just replace the number of sunny days here with the number of rainy days there.

I wish Stevan Nadj-Perge good luck!

Lockheed Martin upgrades to 1000+ Qubit D-Wave system

D-Wave Systems, a Canadian quantum computing company, seems to be making new business announcements on a weekly basis. After last week’s US Los Alamos National Laboratory announcement (Nov. 12, 2015 posting), there’s a Nov. 16, 2015 news item on Nanotechnology Now,

Harris & Harris Group, Inc. (NASDAQ:TINY), an investor in transformative companies enabled by disruptive science, notes that its portfolio company, D-Wave Systems, Inc., announced that it has entered into a multi-year agreement with Lockheed Martin to upgrade the company’s 512-qubit D-Wave Two™ quantum computer to the new D-Wave 2X™ system with 1,000+ qubits.

A Nov. 16, 2015 D-Wave Systems news release provides more details about the deal,

D-Wave Systems Inc., the world’s first quantum computing company, today announced that it has entered into a multi-year agreement with Lockheed Martin (NYSE: LMT) to upgrade the company’s 512-qubit D-Wave Two™ quantum computer to the new D-Wave 2X™ system with 1,000+ qubits. This represents the second system upgrade since Lockheed Martin became D-Wave’s first customer in 2011 with the purchase of a 128 qubit D-Wave One™ system. The agreement includes the system, maintenance and associated professional services.

“Our mission is to solve complex challenges, advance scientific discovery and deliver innovative solutions to our customers, which requires expertise in the most advanced technologies,” said Greg Tallant, Lockheed Martin fellow and lead for the University of Southern California-Lockheed Martin Quantum Computation Center (QCC). “Through our continued investment in D-Wave technology, we are able to push the boundaries of quantum computing and apply the latest technologies to address the real-world problems being faced by our customers.”

For quantum computing, the performance gain over traditional computing is most evident in exceedingly complex computational problems. This could be in areas such as validating the performance of software or vehicle planning and scheduling. With the new D-Wave system, Lockheed Martin researchers will be able to explore solutions for significantly larger computational problems with improved accuracy and execution time.

The new system will be hosted at the University of Southern California-Lockheed Martin Quantum Computation Center, which first began exploring the power of quantum computing with the D-Wave One, the world’s first quantum computer.

The installation of the D-Wave 2X system will be completed in January 2016.

Who knows what next week will bring for D-Wave, which by the way is located in Vancouver, Canada or, more accurately, Burnaby?

Quantum teleportation

It’s been two years (my Aug. 16, 2013 posting features a German-Japanese collaboration) since the last quantum teleportation posting here. First, a little visual stimulation,

Captain James T Kirk (credit: http://www.comicvine.com/james-t-kirk/4005-20078/)

Captain Kirk is played by William Shatner, who is from Montréal, Canada, and that’s not the only Canadian connection to this story, which is really about some research at the University of York (UK). From an Oct. 1, 2015 news item on Nanotechnology Now,

Mention the word ‘teleportation’ and for many people it conjures up “Beam me up, Scottie” images of Captain James T Kirk.

But in the last two decades quantum teleportation – transferring the quantum structure of an object from one place to another without physical transmission — has moved from the realms of Star Trek fantasy to tangible reality.

A Sept. 30, 2015 University of York press release, which originated the news item, describes the quantum teleportation research problem and solution,

Quantum teleportation is an important building block for quantum computing, quantum communication and quantum networks and, eventually, a quantum Internet. While theoretical proposals for a quantum Internet already exist, the problem for scientists is that there is still debate over which of various technologies provides the most efficient and reliable teleportation system. This is the dilemma which an international team of researchers, led by Dr Stefano Pirandola of the Department of Computer Science at the University of York, set out to resolve.

In a paper published in Nature Photonics, the team, which included scientists from the Freie Universität Berlin and the Universities of Tokyo and Toronto [emphasis mine], reviewed the theoretical ideas around quantum teleportation focusing on the main experimental approaches and their attendant advantages and disadvantages.

None of the technologies alone provide a perfect solution, so the scientists concluded that a hybridisation of the various protocols and underlying structures would offer the most fruitful approach.

For instance, systems using photonic qubits work over distances up to 143 kilometres, but they are probabilistic in that only 50 per cent of the information can be transported. To resolve this, such photon systems may be used in conjunction with continuous variable systems, which are 100 per cent effective but currently limited to short distances.

Most importantly, teleportation-based optical communication needs an interface with suitable matter-based quantum memories where quantum information can be stored and further processed.

Dr Pirandola, who is also a member of the York Centre for Quantum Technologies, said: “We don’t have an ideal or universal technology for quantum teleportation. The field has developed a lot but we seem to need to rely on a hybrid approach to get the best from each available technology.

“The use of quantum teleportation as a building block for a quantum network depends on its integration with quantum memories. The development of good quantum memories would allow us to build quantum repeaters, therefore extending the range of teleportation. They would also give us the ability to store and process the transmitted quantum information at local quantum computers.

“This could ultimately form the backbone of a quantum Internet. The revised hybrid architecture will likely rely on teleportation-based long-distance quantum optical communication, interfaced with solid state devices for quantum information processing.”

Here’s a link to and a citation for the paper,

Advances in quantum teleportation by S. Pirandola, J. Eisert, C. Weedbrook, A. Furusawa, & S. L. Braunstein. Nature Photonics 9, 641–652 (2015) doi:10.1038/nphoton.2015.154 Published online 29 September 2015

This paper is behind a paywall.


D-Wave upgrades Google’s quantum computing capabilities

Vancouver-based (more accurately, Burnaby-based) D-Wave Systems has scored a coup as key customers have upgraded from a 512-qubit system to a system with over 1,000 qubits. (The technical breakthrough and concomitant interest from the business community were mentioned here in a June 26, 2015 posting.) As for the latest business breakthrough, here’s more from a Sept. 28, 2015 D-Wave press release,

D-Wave Systems Inc., the world’s first quantum computing company, announced that it has entered into a new agreement covering the installation of a succession of D-Wave systems located at NASA’s Ames Research Center in Moffett Field, California. This agreement supports collaboration among Google, NASA and USRA (Universities Space Research Association) that is dedicated to studying how quantum computing can advance artificial intelligence and machine learning, and the solution of difficult optimization problems. The new agreement enables Google and its partners to keep their D-Wave system at the state-of-the-art for up to seven years, with new generations of D-Wave systems to be installed at NASA Ames as they become available.

“The new agreement is the largest order in D-Wave’s history, and indicative of the importance of quantum computing in its evolution toward solving problems that are difficult for even the largest supercomputers,” said D-Wave CEO Vern Brownell. “We highly value the commitment that our partners have made to D-Wave and our technology, and are excited about the potential use of our systems for machine learning and complex optimization problems.”

Cade Metz’s Sept. 28, 2015 article for Wired magazine provides some interesting observations about D-Wave computers along with some explanations of quantum computing (Note: Links have been removed),

Though the D-Wave machine is less powerful than many scientists hope quantum computers will one day be, the leap to 1000 qubits represents an exponential improvement in what the machine is capable of. What is it capable of? Google and its partners are still trying to figure that out. But Google has said it’s confident there are situations where the D-Wave can outperform today’s non-quantum machines, and scientists at the University of Southern California [USC] have published research suggesting that the D-Wave exhibits behavior beyond classical physics.

A quantum computer operates according to the principles of quantum mechanics, the physics of very small things, such as electrons and photons. In a classical computer, a transistor stores a single “bit” of information. If the transistor is “on,” it holds a 1, and if it’s “off,” it holds a 0. But in a quantum computer, thanks to what’s called the superposition principle, information is held in a quantum system that can exist in two states at the same time. This “qubit” can store a 0 and 1 simultaneously.

Two qubits, then, can hold four values at any given time (00, 01, 10, and 11). And as you keep increasing the number of qubits, you exponentially increase the power of the system. The problem is that building a qubit is an extremely difficult thing. If you read information from a quantum system, it “decoheres.” Basically, it turns into a classical bit that houses only a single value.
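The counting in that excerpt is easy to check for yourself. A few lines of Python (a toy enumeration, not a quantum simulation) show that a register of n qubits spans 2^n basis states, so each extra qubit doubles the space:

```python
# Each qubit doubles the number of basis states a quantum register
# can hold in superposition: n qubits -> 2**n bit strings.
def basis_states(n_qubits):
    """List the classical bit strings an n-qubit register can superpose."""
    return [format(i, f"0{n_qubits}b") for i in range(2 ** n_qubits)]

print(basis_states(2))        # the four values: ['00', '01', '10', '11']
print(len(basis_states(10)))  # 1024 -- growth is exponential in n
```

Running it confirms the article’s “four values” for two qubits, and makes the exponential growth concrete.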

D-Wave claims to have found a solution to the decoherence problem, and that appears to be borne out by the USC researchers. Still, it isn’t a general quantum computer (from Metz’s article),

… researchers at USC say that the system appears to display a phenomenon called “quantum annealing” that suggests it’s truly operating in the quantum realm. Regardless, the D-Wave is not a general quantum computer—that is, it’s not a computer for just any task. But D-Wave says the machine is well-suited to “optimization” problems, where you’re facing many, many different ways forward and must pick the best option, and to machine learning, where computers teach themselves tasks by analyzing large amounts of data.
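For readers new to the term “annealing,” here is a hedged classical analogue: simulated annealing over a one-dimensional “energy landscape.” D-Wave’s hardware exploits quantum effects rather than random hops, and the landscape and cooling schedule below are invented purely for illustration, but the goal is the same: escape local minima to find the lowest point.

```python
import math
import random

def energy(x):
    """A bumpy one-dimensional landscape; the global minimum sits near x = -0.5."""
    return x * x + 3 * math.sin(3 * x)

def simulated_anneal(steps=20000, seed=1):
    """Classical simulated annealing: random moves are accepted more freely
    while the 'temperature' is high, letting the walker climb out of
    local minima before settling into the lowest valley it finds."""
    random.seed(seed)
    x = 4.0                                    # start far from the global minimum
    best_x, best_e = x, energy(x)
    for step in range(steps):
        temp = 5.0 * (1 - step / steps) + 1e-3     # linear cooling schedule
        candidate = x + random.uniform(-0.5, 0.5)
        delta = energy(candidate) - energy(x)
        # Always accept downhill moves; accept uphill moves with a
        # temperature-dependent probability (the annealing trick).
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = candidate
        if energy(x) < best_e:
            best_x, best_e = x, energy(x)
    return best_x, best_e

best_x, best_e = simulated_anneal()
print(best_x, best_e)  # a point near the landscape's lowest valley
```

A quantum annealer aims at the same kind of search, but uses quantum tunnelling rather than thermal hops to cross the barriers between minima.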

It takes a lot of innovation before you make big strides forward, and I think D-Wave is to be congratulated on producing what is, to my knowledge, the only commercially available form of quantum computing of any sort in the world.

ETA Oct. 6, 2015* at 1230 hours PST: Minutes after publishing about D-Wave I came across this item (h/t Quirks & Quarks twitter) about Australian researchers and their quantum computing breakthrough. From an Oct. 6, 2015 article by Hannah Francis for the Sydney (Australia) Morning Herald,

For decades scientists have been trying to turn quantum computing — which allows for multiple calculations to happen at once, making it immeasurably faster than standard computing — into a practical reality rather than a moonshot theory. Until now, they have largely relied on “exotic” materials to construct quantum computers, making them unsuitable for commercial production.

But researchers at the University of New South Wales have patented a new design, published in the scientific journal Nature on Tuesday, created specifically with computer industry manufacturing standards in mind and using affordable silicon, which is found in regular computer chips like those we use every day in smartphones or tablets.

“Our team at UNSW has just cleared a major hurdle to making quantum computing a reality,” the director of the university’s Australian National Fabrication Facility, Andrew Dzurak, the project’s leader, said.

“As well as demonstrating the first quantum logic gate in silicon, we’ve also designed and patented a way to scale this technology to millions of qubits using standard industrial manufacturing techniques to build the world’s first quantum processor chip.”

According to the article, the university is looking for industrial partners to help them exploit this breakthrough. Francis’s article features an embedded video, as well as more detail.

*It was Oct. 6, 2015 in Australia but Oct. 5, 2015 my side of the international date line.

ETA Oct. 6, 2015 (my side of the international date line): An Oct. 5, 2015 University of New South Wales news release on EurekAlert provides additional details.

Here’s a link to and a citation for the paper,

A two-qubit logic gate in silicon by M. Veldhorst, C. H. Yang, J. C. C. Hwang, W. Huang, J. P. Dehollain, J. T. Muhonen, S. Simmons, A. Laucht, F. E. Hudson, K. M. Itoh, A. Morello & A. S. Dzurak. Nature (2015) doi:10.1038/nature15263 Published online 05 October 2015

This paper is behind a paywall.

D-Wave passes 1000-qubit barrier

A local (Vancouver, Canada-based) quantum computing company, D-Wave, is making quite a splash lately due to a technical breakthrough. h/t’s to Speaking up for Canadian Science for the Business in Vancouver article and to Nanotechnology Now for the Harris & Harris Group press release and Economist article.

A June 22, 2015 article by Tyler Orton for Business in Vancouver describes D-Wave’s latest technical breakthrough,

“This updated processor will allow significantly more complex computational problems to be solved than ever before,” Jeremy Hilton, D-Wave’s vice-president of processor development, wrote in a June 22 [2015] blog entry.

Regular computers use bits – ones and zeroes – to make calculations, while quantum computers rely on qubits.

Qubits possess a “superposition” that allows them to be one and zero at the same time, meaning they can calculate all possible values in a single operation.

But the algorithm for a full-scale quantum computer requires 8,000 qubits.

A June 23, 2015 Harris & Harris Group press release adds more information about the breakthrough,

Harris & Harris Group, Inc. (Nasdaq: TINY), an investor in transformative companies enabled by disruptive science, notes that its portfolio company, D-Wave Systems, Inc., announced that it has successfully fabricated 1,000 qubit processors that power its quantum computers. D-Wave’s quantum computer runs a quantum annealing algorithm to find the lowest points, corresponding to optimal or near optimal solutions, in a virtual “energy landscape.” Every additional qubit doubles the search space of the processor. At 1,000 qubits, the new processor considers 2^1000 possibilities simultaneously, a search space which is substantially larger than the 2^512 possibilities available to the company’s currently available 512 qubit D-Wave Two. In fact, the new search space contains far more possibilities than there are particles in the observable universe.
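The press release’s size comparison is easy to verify with Python’s arbitrary-precision integers. The ~10^80 particle count used below is the commonly quoted rough estimate for the observable universe, not a figure from the release:

```python
# Search-space sizes from the press release, computed exactly with
# Python's arbitrary-precision integers.
old_space = 2 ** 512     # 512-qubit D-Wave Two
new_space = 2 ** 1000    # new 1,000-qubit processor

# Each additional qubit doubles the search space: 488 doublings here.
assert new_space == old_space * 2 ** 488

# Compare with a common rough estimate of ~10^80 particles
# in the observable universe (an assumption, not a measured figure).
particles_estimate = 10 ** 80
print(new_space > particles_estimate)   # True
print(len(str(new_space)))              # 302 -- a 302-digit number
```

So 2^1000 is a 302-digit number, comfortably exceeding the particle estimate, just as the release claims.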

A June 22, 2015 D-Wave news release, which originated the technical details about the breakthrough found in the Harris & Harris press release, provides more information along with some marketing hype (hyperbole), Note: Links have been removed,

As the only manufacturer of scalable quantum processors, D-Wave breaks new ground with every succeeding generation it develops. The new processors, comprising over 128,000 Josephson tunnel junctions, are believed to be the most complex superconductor integrated circuits ever successfully yielded. They are fabricated in part at D-Wave’s facilities in Palo Alto, CA and at Cypress Semiconductor’s wafer foundry located in Bloomington, Minnesota.

“Temperature, noise, and precision all play a profound role in how well quantum processors solve problems.  Beyond scaling up the technology by doubling the number of qubits, we also achieved key technology advances prioritized around their impact on performance,” said Jeremy Hilton, D-Wave vice president, processor development. “We expect to release benchmarking data that demonstrate new levels of performance later this year.”

The 1000-qubit milestone is the result of intensive research and development by D-Wave and reflects a triumph over a variety of design challenges aimed at enhancing performance and boosting solution quality. Beyond the much larger number of qubits, other significant innovations include:

  • Lower Operating Temperature: While the previous generation processor ran at a temperature close to absolute zero, the new processor runs 40% colder. The lower operating temperature enhances the importance of quantum effects, which increases the ability to discriminate the best result from a collection of good candidates.
  • Reduced Noise: Through a combination of improved design, architectural enhancements and materials changes, noise levels have been reduced by 50% in comparison to the previous generation. The lower noise environment enhances problem-solving performance while boosting reliability and stability.
  • Increased Control Circuitry Precision: In the testing to date, the increased precision coupled with the noise reduction has demonstrated improved precision by up to 40%. To accomplish both while also improving manufacturing yield is a significant achievement.
  • Advanced Fabrication: The new processors comprise over 128,000 Josephson junctions (tunnel junctions with superconducting electrodes) in a 6-metal layer planar process with 0.25μm features, believed to be the most complex superconductor integrated circuits ever built.
  • New Modes of Use: The new technology expands the boundaries of ways to exploit quantum resources. In addition to performing discrete optimization like its predecessor, firmware and software upgrades will make it easier to use the system for sampling applications.

“Breaking the 1000 qubit barrier marks the culmination of years of research and development by our scientists, engineers and manufacturing team,” said D-Wave CEO Vern Brownell. “It is a critical step toward bringing the promise of quantum computing to bear on some of the most challenging technical, commercial, scientific, and national defense problems that organizations face.”

A June 20, 2015 article in The Economist notes the growing commercial interest in quantum computing and provides a good introduction to the topic. The article includes an analysis of various research efforts in Canada (they mention D-Wave), the US, and the UK. These excerpts don’t do justice to the article but will hopefully whet your appetite or provide an overview for anyone with limited time,

A COMPUTER proceeds one step at a time. At any particular moment, each of its bits—the binary digits it adds and subtracts to arrive at its conclusions—has a single, definite value: zero or one. At that moment the machine is in just one state, a particular mixture of zeros and ones. It can therefore perform only one calculation next. This puts a limit on its power. To increase that power, you have to make it work faster.

But bits do not exist in the abstract. Each depends for its reality on the physical state of part of the computer’s processor or memory. And physical states, at the quantum level, are not as clear-cut as classical physics pretends. That leaves engineers a bit of wriggle room. By exploiting certain quantum effects they can create bits, known as qubits, that do not have a definite value, thus overcoming classical computing’s limits.

… The biggest question is what the qubits themselves should be made from.

A qubit needs a physical system with two opposite quantum states, such as the direction of spin of an electron orbiting an atomic nucleus. Several things which can do the job exist, and each has its fans. Some suggest nitrogen atoms trapped in the crystal lattices of diamonds. Calcium ions held in the grip of magnetic fields are another favourite. So are the photons of which light is composed (in this case the qubit would be stored in the plane of polarisation). And quasiparticles, which are vibrations in matter that behave like real subatomic particles, also have a following.

The leading candidate at the moment, though, is to use a superconductor in which the qubit is either the direction of a circulating current, or the presence or absence of an electric charge. Both Google and IBM are banking on this approach. It has the advantage that superconducting qubits can be arranged on semiconductor chips of the sort used in existing computers. That, the two firms think, should make them easier to commercialise.

Google is also collaborating with D-Wave of Vancouver, Canada, which sells what it calls quantum annealers. The field’s practitioners took much convincing that these devices really do exploit the quantum advantage, and in any case they are limited to a narrower set of problems—such as searching for images similar to a reference image. But such searches are just the type of application of interest to Google. In 2013, in collaboration with NASA and USRA, a research consortium, the firm bought a D-Wave machine in order to put it through its paces. Hartmut Neven, director of engineering at Google Research, is guarded about what his team has found, but he believes D-Wave’s approach is best suited to calculations involving fewer qubits, while Dr Martinis and his colleagues build devices with more.

It’s not clear to me if the writers at The Economist were aware of D-Wave’s latest breakthrough at the time of writing but I think not. In any event, they (The Economist writers) have included a provocative tidbit about quantum encryption,

Documents released by Edward Snowden, a whistleblower, revealed that the Penetrating Hard Targets programme of America’s National Security Agency was actively researching “if, and how, a cryptologically useful quantum computer can be built”. In May IARPA [Intelligence Advanced Research Projects Activity], the American government’s intelligence-research arm, issued a call for partners in its Logical Qubits programme, to make robust, error-free qubits. In April, meanwhile, Tanja Lange and Daniel Bernstein of Eindhoven University of Technology, in the Netherlands, announced PQCRYPTO, a programme to advance and standardise “post-quantum cryptography”. They are concerned that encrypted communications captured now could be subjected to quantum cracking in the future. That means strong pre-emptive encryption is needed immediately.

I encourage you to read the Economist article.

Two final comments. (1) The latest piece about D-Wave, prior to this one, was in a Feb. 6, 2015 posting about then-new investment in the company. (2) A Canadian effort in the field of quantum cryptography was mentioned in a May 11, 2015 posting (scroll down about 50% of the way), which features a profile of Raymond Laflamme of the University of Waterloo’s Institute for Quantum Computing, in the context of an announcement about the science media initiative Research2Reality.

More investment money for Canada’s D-Wave Systems (quantum computing)

A Feb. 2, 2015 news item on Nanotechnology Now features D-Wave Systems (located in the Vancouver region, Canada) and its recent funding bonanza of $29 million (CAD),

Harris & Harris Group, Inc. (Nasdaq:TINY), an investor in transformative companies enabled by disruptive science, notes the announcement by portfolio company, D-Wave Systems, Inc., that it has closed $29 million (CAD) in funding from a large institutional investor, among others. This funding will be used to accelerate development of D-Wave’s quantum hardware and software and expand the software application ecosystem. This investment brings total funding in D-Wave to $174 million (CAD), with approximately $62 million (CAD) raised in 2014. Harris & Harris Group’s total investment in D-Wave is approximately $5.8 million (USD). D-Wave’s announcement also includes highlights of 2014, a year of strong growth and advancement for D-Wave.

A Jan. 29, 2015 D-Wave news release provides more details about the new investment and D-Wave’s 2014 triumphs,

D-Wave Systems Inc., the world’s first quantum computing company, today announced that it has closed $29 million in funding from a large institutional investor, among others. This funding will be used to accelerate development of D-Wave’s quantum hardware and software and expand the software application ecosystem. This investment brings total funding in D-Wave to $174 million (CAD), with approximately $62 million raised in 2014.

“The investment is a testament to the progress D-Wave continues to make as the leader in quantum computing systems,” said Vern Brownell, CEO of D-Wave. “The funding we received in 2014 will advance our quantum hardware and software development, as well as our work on leading edge applications of our systems. By making quantum computing available to more organizations, we’re driving our goal of finding solutions to the most complex optimization and machine learning applications in national defense, computing, research and finance.”

The funding follows a year of strong growth and advancement for D-Wave. Highlights include:

•    Significant progress made towards the release of the next D-Wave quantum system featuring a 1000 qubit processor, which is currently undergoing testing in D-Wave’s labs.
•    The company’s patent portfolio grew to over 150 issued patents worldwide, with 11 new U.S. patents being granted in 2014, covering aspects of D-Wave’s processor technology, systems and techniques for solving computational problems using D-Wave’s technology.
•    D-Wave Professional Services launched, providing quantum computing experts to collaborate directly with customers, and deliver training classes on the usage and programming of the D-Wave system to a number of national laboratories, businesses and universities.
•    Partnerships were established with DNA-SEQ and 1QBit, companies that are developing quantum software applications in the spheres of medicine and finance, respectively.
•    Research throughout the year continued to validate D-Wave’s work, including a study showing further evidence of quantum entanglement by D-Wave and USC [University of Southern California] scientists, published in Physical Review X this past May.

Since 2011, some of the most prestigious organizations in the world, including Lockheed Martin, NASA, Google, USC and the Universities Space Research Association (USRA), have partnered with D-Wave to use their quantum computing systems. In 2015, these partners will continue to work with the D-Wave computer, conducting pioneering research in machine learning, optimization, and space exploration.

D-Wave, which already employs over 120 people, plans to expand hiring with the additional funding. Key areas of growth include research, processor and systems development and software engineering.

Harris & Harris Group offers a description of D-Wave which mentions nanotechnology and hosts a couple of explanatory videos,

D-Wave Systems develops an adiabatic quantum computer (QC).

Privately Held

The Market
Electronics – High Performance Computing

The Problem
Traditional or “classical computers” are constrained by the sequential character of data processing that makes the solving of non-polynomial (NP)-hard problems difficult or potentially impossible in reasonable timeframes. These types of computationally intense problems are commonly observed in software verifications, scheduling and logistics planning, integer programming, bioinformatics and financial portfolio optimization.
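To make the combinatorial explosion behind those problems concrete, here is a brute-force subset-sum search (a standard NP-hard problem, chosen purely for illustration; it is not one of D-Wave’s benchmark problems). Every subset must be examined in the worst case, and the number of subsets doubles with each added element:

```python
from itertools import combinations

def subset_sum_bruteforce(values, target):
    """Exhaustively search all 2**len(values) subsets for one summing
    to target -- exponential cost of the kind that makes NP-hard
    scheduling and portfolio problems intractable at scale."""
    checked = 0
    for r in range(len(values) + 1):
        for combo in combinations(values, r):
            checked += 1
            if sum(combo) == target:
                return combo, checked
    return None, checked

solution, tried = subset_sum_bruteforce([3, 34, 4, 12, 5, 2], 9)
print(solution, tried)  # (4, 5) found after checking 18 subsets
```

Six elements mean only 64 subsets; at sixty elements the same search would need roughly 10^18 checks, which is the wall classical brute force hits.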

D-Wave’s Solution
D-Wave develops quantum computers that are capable of processing data using the quantum mechanical properties of matter. This leverage of quantum mechanics enables the identification of solutions to some non-polynomial (NP)-hard problems in a reasonable timeframe, instead of the exponential time needed for any classical digital computer. D-Wave sold and installed its first quantum computing system to a commercial customer in 2011.

Nanotechnology Factor
To function properly, the D-Wave processor requires tight control and manipulation of quantum mechanical phenomena. This control and manipulation is achieved by creating integrated circuits based on Josephson junctions and other superconducting circuitry. By picking superconductors, D-Wave managed to combine quantum mechanical behavior with the macroscopic dimensions needed for high-yield design and manufacturing.

It seems D-Wave has made some research and funding strides since I last wrote about the company in a Jan. 19, 2012 posting, although there is no mention of quantum computer sales.

Could there be a quantum internet?

We’ve always had limited success with predicting future technologies by examining current technologies. For example, the Internet and World Wide Web as we experience them today would have been unthinkable for most people in the 1950s, when computers inhabited entire buildings and satellites were a brand new technology designed for space exploration, not for bouncing communication signals around the planet. That said, this new work on a ‘quantum internet’ from Eindhoven University of Technology is quite intriguing (from a Dec. 15, 2014 news item on Nanowerk),

In the same way as we now connect computers in networks through optical signals, it could also be possible to connect future quantum computers in a ‘quantum internet’. The optical signals would then consist of individual light particles or photons. One prerequisite for a working quantum internet is control of the shape of these photons. Researchers at Eindhoven University of Technology (TU/e) and the FOM foundation [Foundation for Fundamental Research on Matter] have now succeeded for the first time in getting this control within the required short time.

A Dec. 15, 2014 Eindhoven University of Technology (TU/e) press release, which originated the news item, describes one of the problems with a ‘quantum internet’ and the researchers’ solution,

Quantum computers could in principle communicate with each other by exchanging individual photons to create a ‘quantum internet’. The shape of the photons, in other words how their energy is distributed over time, is vital for successful transmission of information. This shape must be symmetric in time, while photons that are emitted by atoms normally have an asymmetric shape. Therefore, this process requires external control in order to create a quantum internet.
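The symmetry the press release refers to can be made concrete with a toy calculation (the pulse shapes below are assumptions for illustration, not the researchers’ data): a spontaneously emitted photon has a one-sided exponential envelope, while an ideal transmission pulse is mirror-symmetric about its peak, like a Gaussian.

```python
import math

def exp_decay(t, tau=1.0):
    """One-sided exponential envelope, like spontaneous emission (peak at t = 0)."""
    return math.exp(-t / tau) if t >= 0 else 0.0

def gaussian(t, sigma=1.0):
    """Time-symmetric pulse centred on t = 0."""
    return math.exp(-(t * t) / (2 * sigma * sigma))

def asymmetry(pulse, half_width=3.0, samples=300):
    """Integrate |pulse(t) - pulse(-t)|: zero for a perfectly symmetric pulse."""
    dt = half_width / samples
    return sum(abs(pulse(k * dt) - pulse(-k * dt))
               for k in range(1, samples + 1)) * dt

print(asymmetry(gaussian))    # 0.0 -- mirror-symmetric in time
print(asymmetry(exp_decay))   # close to 1 -- strongly asymmetric
```

The TU/e experiment amounts to reshaping something like the first profile toward the second, fast enough to act within the photon’s nanosecond-scale emission.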

Optical cavity

Researchers at TU/e and FOM have succeeded in getting the required degree of control by embedding a quantum dot – a piece of semiconductor material that can transmit photons – into a ‘photonic crystal’, thereby creating an optical cavity. Then the researchers applied a very short electrical pulse to the cavity, which influences how the quantum dot interacts with it, and how the photon is emitted. By varying the strength of this pulse, they were able to control the shape of the transmitted photons.

Within a billionth of a second

The Eindhoven researchers are the first to achieve this, thanks to the use of electrical pulses shorter than a nanosecond, a billionth of a second. This is vital for use in quantum communication, as research leader Andrea Fiore of TU/e explains: “The emission of a photon only lasts for one nanosecond, so if you want to change anything you have to do it within that time. It’s like the shutter of a high-speed camera, which has to be very short if you want to capture something that changes very fast in an image. By controlling the speed at which you send a photon, you can in principle achieve very efficient exchange of photons, which is important for the future quantum internet.”

Here’s a link to and a citation for the paper,

Dynamically controlling the emission of single excitons in photonic crystal cavities by Francesco Pagliano, YongJin Cho, Tian Xia, Frank van Otten, Robert Johne, & Andrea Fiore. Nature Communications 5, Article number: 5786 doi:10.1038/ncomms6786 Published 15 December 2014

This is an open access paper.

ETA Dec. 16, 2014 at 1230 hours PDT: There is a copy of the Dec. 15, 2014 news release on EurekAlert.