Tag Archives: quantum computing

Testing technology for a global quantum network

This work on quantum networks comes from a joint Singapore/UK research project. From a June 2, 2016 news item on ScienceDaily,

You can’t sign up for the quantum internet just yet, but researchers have reported a major experimental milestone towards building a global quantum network — and it’s happening in space.

With a network that carries information in the quantum properties of single particles, you can create secure keys for secret messaging and potentially connect powerful quantum computers in the future. But scientists think you will need equipment in space to get global reach.

Researchers from the National University of Singapore (NUS) and the University of Strathclyde, UK, have become the first to test in orbit technology for satellite-based quantum network nodes.

They have put a compact device carrying components used in quantum communication and computing into orbit. And it works: the team report first data in a paper published 31 May 2016 in the journal Physical Review Applied.

A June 2, 2016 National University of Singapore press release, which originated the news item, provides more detail,

The team’s device, dubbed SPEQS, creates and measures pairs of light particles, called photons. Results from space show that SPEQS is making pairs of photons with correlated properties – an indicator of performance.

Team-leader Alexander Ling, an Assistant Professor at the Centre for Quantum Technologies (CQT) at NUS said, “This is the first time anyone has tested this kind of quantum technology in space.”

The team had to be inventive to redesign a delicate, table-top quantum setup to be small and robust enough to fly inside a nanosatellite only the size of a shoebox. The whole satellite weighs just 1.65 kilogrammes.
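To get a feel for why photon pairs "with correlated properties" serve as a performance indicator, here is a minimal sketch of the statistics involved. It assumes an idealized source in which both photons of a pair share one polarization bit, plus a made-up 2% detector error rate; the numbers and variable names are illustrative, not the team's.

```python
import numpy as np

rng = np.random.default_rng(7)
n_pairs = 100_000

# Idealized correlated source: both photons of a pair carry the same
# polarization bit (0 = horizontal, 1 = vertical).
shared = rng.integers(0, 2, n_pairs)
signal = shared.copy()
idler = shared.copy()

# Real detectors misfire; flip each outcome with a made-up error rate.
p_err = 0.02
signal ^= (rng.random(n_pairs) < p_err).astype(signal.dtype)
idler ^= (rng.random(n_pairs) < p_err).astype(idler.dtype)

# Fraction of coincidences that agree: close to 1.0 for a working pair
# source, close to 0.5 for uncorrelated background counts.
print("coincidence agreement:", np.mean(signal == idler))
```

An agreement well above 0.5 is what tells the team, from orbit, that the source is still producing genuinely correlated pairs rather than accidental coincidences.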

Towards entanglement

Making correlated photons is a precursor to creating entangled photons. Described by Einstein as “spooky action at a distance”, entanglement is a connection between quantum particles that lends security to communication and power to computing.

Professor Artur Ekert, Director of CQT, invented the idea of using entangled particles for cryptography. He said, “Alex and his team are taking entanglement, literally, to a new level. Their experiments will pave the road to secure quantum communication and distributed quantum computation on a global scale. I am happy to see that Singapore is one of the world leaders in this area.”

Local quantum networks already exist [emphasis mine]. The problem Ling’s team aims to solve is a distance limit. Losses limit quantum signals sent through air at ground level or optical fibre to a few hundred kilometers – but we might ultimately use entangled photons beamed from satellites to connect points on opposite sides of the planet. Although photons from satellites still have to travel through the atmosphere, going top-to-bottom is roughly equivalent to going only 10 kilometres at ground level.
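The "roughly equivalent to 10 kilometres at ground level" comparison is easy to sanity-check, because channel loss is exponential in distance. A back-of-envelope sketch, assuming the standard 0.2 dB/km loss figure for telecom fibre (my number, not the press release's):

```python
def transmittance(length_km: float, loss_db_per_km: float) -> float:
    """Fraction of photons surviving a channel with the given loss."""
    return 10 ** (-loss_db_per_km * length_km / 10)

FIBER_LOSS = 0.2  # dB/km, a typical figure for telecom fibre at 1550 nm

for d in (100, 500, 1000):
    print(f"{d:>4} km of fibre: {transmittance(d, FIBER_LOSS):.3e} of photons survive")

# The vertical pass through the atmosphere costs only a few dB, roughly
# the ~10 km ground-level equivalent quoted above.
print(f"  ~10 km equivalent: {transmittance(10, FIBER_LOSS):.2f} of photons survive")
```

At 1,000 km of fibre only about one photon in 10^20 arrives, which is why no amount of source brightness rescues a long ground link; a satellite pass, by contrast, loses only a factor of a few.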

The group’s first device is a technology pathfinder. It takes photons from a Blu-ray laser and splits them into two, then measures the pair’s properties, all on board the satellite. To do this it contains a laser diode, crystals, mirrors and photon detectors carefully aligned inside an aluminum block. This sits on top of a 10 centimetre by 10 centimetre printed circuit board packed with control electronics.

Through a series of pre-launch tests – and one unfortunate incident – the team became more confident that their design could survive a rocket launch and space conditions. The team had a device in the October 2014 Orbital-3 rocket which exploded on the launch pad. The satellite containing that first device was later found on a beach intact and still in working order.

Future plans

Even with the success of the more recent mission, a global network is still a few milestones away. The team’s roadmap calls for a series of launches, with the next space-bound SPEQS slated to produce entangled photons. SPEQS stands for Small Photon-Entangling Quantum System.

With later satellites, the researchers will try sending entangled photons to Earth and to other satellites. The team are working with standard “CubeSat” nanosatellites, which can get relatively cheap rides into space as rocket ballast. Ultimately, completing a global network would mean having a fleet of satellites in orbit and an array of ground stations.

In the meantime, quantum satellites could also carry out fundamental experiments – for example, testing entanglement over distances bigger than Earth-bound scientists can manage. “We are reaching the limits of how precisely we can test quantum theory on Earth,” said co-author Dr Daniel Oi at the University of Strathclyde.

Here’s a link to and a citation for the paper,

Generation and Analysis of Correlated Pairs of Photons aboard a Nanosatellite by Zhongkan Tang, Rakhitha Chandrasekara, Yue Chuan Tan, Cliff Cheng, Luo Sha, Goh Cher Hiang, Daniel K. L. Oi, and Alexander Ling. Phys. Rev. Applied 5, 054022 DOI: http://dx.doi.org/10.1103/PhysRevApplied.5.054022 Published 31 May 2016

This paper is behind a paywall.

Nanodevices and quantum entanglement

A May 30, 2016 news item on phys.org introduces a scientist with an intriguing approach to quantum computing,

Creating quantum computers, which some people believe will be the next generation of computers with the ability to outperform machines based on conventional technology, depends upon harnessing the principles of quantum mechanics, or the physics that governs the behavior of particles at the subatomic scale. Entanglement—a concept that Albert Einstein once called “spooky action at a distance”—is integral to quantum computing, as it allows two physically separated particles to store and exchange information.

Stevan Nadj-Perge, assistant professor of applied physics and materials science, is interested in creating a device that could harness the power of entangled particles within a usable technology. However, one barrier to the development of quantum computing is decoherence, or the tendency of outside noise to destroy the quantum properties of a quantum computing device and ruin its ability to store information.

Nadj-Perge, who is originally from Serbia, received his undergraduate degree from Belgrade University and his PhD from Delft University of Technology in the Netherlands. He received a Marie Curie Fellowship in 2011, and joined the Caltech Division of Engineering and Applied Science in January after completing postdoctoral appointments at Princeton and Delft.

He recently talked with us about how his experimental work aims to resolve the problem of decoherence.

A May 27, 2016 California Institute of Technology (CalTech) news release by Jessica Stoller-Conrad, which originated the news item, proceeds with a question and answer format,

What is the overall goal of your research?

A large part of my research is focused on finding ways to store and process quantum information. Typically, if you have a quantum system, it loses its coherent properties—and therefore, its ability to store quantum information—very quickly. Quantum information is very fragile and even the smallest amount of external noise messes up quantum states. This is true for all quantum systems. There are various schemes that tackle this problem and postpone decoherence, but the one that I’m most interested in involves Majorana fermions. These particles were proposed to exist in nature almost eighty years ago but interestingly were never found.

Relatively recently theorists figured out how to engineer these particles in the lab. It turns out that, under certain conditions, when you combine certain materials and apply high magnetic fields at very cold temperatures, electrons will form a state that looks exactly as you would expect from Majorana fermions. Furthermore, such engineered states allow you to store quantum information in a way that postpones decoherence.

How exactly is quantum information stored using these Majorana fermions?

The fascinating property of these particles is that they always come in pairs. If you can store information in a pair of Majorana fermions it will be protected against all of the usual environmental noise that affects quantum states of individual objects. The information is protected because it is not stored in a single particle but in the pair itself. My lab is developing ways to engineer nanodevices which host Majorana fermions. Hopefully one day our devices will find applications in quantum computing.

Why did you want to come to Caltech to do this work?

The concept of engineered Majorana fermions and topological protection was, to a large degree, conceived here at Caltech by Alexei Kitaev [Ronald and Maxine Linde Professor of Theoretical Physics and Mathematics] who is in the physics department. A couple of physicists here at Caltech, Gil Refael [professor of theoretical physics and executive officer of physics] and Jason Alicea [professor of theoretical physics], are doing theoretical work that is very relevant for my field.

Do you have any collaborations planned here?

Nothing formal, but I’ve been talking a lot with Gil and Jason. A student of mine also uses resources in the lab of Harry Atwater [Howard Hughes Professor of Applied Physics and Materials Science and director of the Joint Center for Artificial Photosynthesis], who has experience with materials that are potentially useful for our research.

How does that project relate to your lab’s work?

There are two-dimensional, or 2-D, materials that are basically very thin sheets of atoms. Graphene [emphasis mine]—a single layer of carbon atoms—is one example, but you can create single layer sheets of atoms with many materials. Harry Atwater’s group is working on solar cells made of a 2-D material. We are thinking of using the same materials and combining them with superconductors—materials that can conduct electricity without releasing heat, sound, or any other form of energy—in order to produce Majorana fermions.

How do you do that?

There are several proposed ways of using 2-D materials to create Majorana fermions. The majority of these materials have a strong spin-orbit coupling—an interaction of a particle’s spin with its motion—which is one of the key ingredients for creating Majoranas. Also some of the 2-D materials can become superconductors at low temperatures. One of the ideas that we are seriously considering is using a 2-D material as a substrate on which we could build atomic chains that will host Majorana fermions.

What got you interested in science when you were young?

I don’t come from a family of scientists; my father is an engineer and my mother is an administrative worker. But my father first got me interested in science. As an engineer, he was always solving something and he brought home some of the problems he was working on. I worked with him and picked it up at an early age.

How are you adjusting to life in California?

Well, I like being outdoors, and here we have the mountains and the beach and it’s really amazing. The weather here is so much better than the other places I’ve lived. If you want to get the impression of what the weather in the Netherlands is like, you just replace the number of sunny days here with the number of rainy days there.

I wish Stevan Nadj-Perge good luck!

Lockheed Martin upgrades to 1000+ Qubit D-Wave system

D-Wave Systems, a Canadian quantum computing company, seems to be making new business announcements on a weekly basis. After last week’s US Los Alamos National Laboratory announcement (Nov. 12, 2015 posting), there’s a Nov. 16, 2015 news item on Nanotechnology Now,

Harris & Harris Group, Inc. (NASDAQ:TINY), an investor in transformative companies enabled by disruptive science, notes that its portfolio company, D-Wave Systems, Inc., announced that it has entered into a multi-year agreement with Lockheed Martin to upgrade the company’s 512-qubit D-Wave Two™ quantum computer to the new D-Wave 2X™ system with 1,000+ qubits.

A Nov. 16, 2015 D-Wave Systems news release provides more details about the deal,

D-Wave Systems Inc., the world’s first quantum computing company, today announced that it has entered into a multi-year agreement with Lockheed Martin (NYSE: LMT) to upgrade the company’s 512-qubit D-Wave Two™ quantum computer to the new D-Wave 2X™ system with 1,000+ qubits. This represents the second system upgrade since Lockheed Martin became D-Wave’s first customer in 2011 with the purchase of a 128 qubit D-Wave One™ system. The agreement includes the system, maintenance and associated professional services.

“Our mission is to solve complex challenges, advance scientific discovery and deliver innovative solutions to our customers, which requires expertise in the most advanced technologies,” said Greg Tallant, Lockheed Martin fellow and lead for the University of Southern California-Lockheed Martin Quantum Computation Center (QCC). “Through our continued investment in D-Wave technology, we are able to push the boundaries of quantum computing and apply the latest technologies to address the real-world problems being faced by our customers.”

For quantum computing, the performance gain over traditional computing is most evident in exceedingly complex computational problems. This could be in areas such as validating the performance of software or vehicle planning and scheduling. With the new D-Wave system, Lockheed Martin researchers will be able to explore solutions for significantly larger computational problems with improved accuracy and execution time.

The new system will be hosted at the University of Southern California-Lockheed Martin Quantum Computation Center, which first began exploring the power of quantum computing with the D-Wave One, the world’s first quantum computer.

The installation of the D-Wave 2X system will be completed in January 2016.

Who knows what next week will bring for D-Wave, which by the way is located in Vancouver, Canada or, more accurately, Burnaby?

Quantum teleportation

It’s been two years (my Aug. 16, 2013 posting features a German-Japanese collaboration) since the last quantum teleportation posting here. First, a little visual stimulation,

Captain James T Kirk (credit: http://www.comicvine.com/james-t-kirk/4005-20078/)


Captain Kirk, also known as William Shatner, is from Montréal, Canada and that’s not the only Canadian connection to this story, which is really about some research at the University of York (UK). From an Oct. 1, 2015 news item on Nanotechnology Now,

Mention the word ‘teleportation’ and for many people it conjures up “Beam me up, Scottie” images of Captain James T Kirk.

But in the last two decades quantum teleportation – transferring the quantum structure of an object from one place to another without physical transmission — has moved from the realms of Star Trek fantasy to tangible reality.

A Sept. 30, 2015 University of York press release, which originated the news item, describes the quantum teleportation research problem and solution,

Quantum teleportation is an important building block for quantum computing, quantum communication and quantum networks and, eventually, a quantum Internet. While theoretical proposals for a quantum Internet already exist, the problem for scientists is that there is still debate over which of various technologies provides the most efficient and reliable teleportation system. This is the dilemma which an international team of researchers, led by Dr Stefano Pirandola of the Department of Computer Science at the University of York, set out to resolve.

In a paper published in Nature Photonics, the team, which included scientists from the Freie Universität Berlin and the Universities of Tokyo and Toronto [emphasis mine], reviewed the theoretical ideas around quantum teleportation focusing on the main experimental approaches and their attendant advantages and disadvantages.

None of the technologies alone provide a perfect solution, so the scientists concluded that a hybridisation of the various protocols and underlying structures would offer the most fruitful approach.

For instance, systems using photonic qubits work over distances up to 143 kilometres, but they are probabilistic in that only 50 per cent of the information can be transported. To resolve this, such photon systems may be used in conjunction with continuous variable systems, which are 100 per cent effective but currently limited to short distances.

Most importantly, teleportation-based optical communication needs an interface with suitable matter-based quantum memories where quantum information can be stored and further processed.

Dr Pirandola, who is also a member of the York Centre for Quantum Technologies, said: “We don’t have an ideal or universal technology for quantum teleportation. The field has developed a lot but we seem to need to rely on a hybrid approach to get the best from each available technology.

“The use of quantum teleportation as a building block for a quantum network depends on its integration with quantum memories. The development of good quantum memories would allow us to build quantum repeaters, therefore extending the range of teleportation. They would also give us the ability to store and process the transmitted quantum information at local quantum computers.

“This could ultimately form the backbone of a quantum Internet. The revised hybrid architecture will likely rely on teleportation-based long-distance quantum optical communication, interfaced with solid state devices for quantum information processing.”
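For readers who want to see what the protocol under review actually does, here is a minimal state-vector simulation of textbook one-qubit teleportation in plain numpy. It is a generic sketch, not code from the paper: qubit 0 holds the unknown state, qubits 1 and 2 are the shared entangled pair, and two classical bits carry Alice's measurement result to Bob.

```python
import numpy as np

rng = np.random.default_rng(0)

I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def lift(gates, n=3):
    """Tensor a dict {qubit: gate} up to an n-qubit operator (qubit 0 leftmost)."""
    out = np.array([[1.0 + 0j]])
    for q in range(n):
        out = np.kron(out, gates.get(q, I))
    return out

def cnot(control, target):
    """|0><0|_c (x) I  +  |1><1|_c (x) X_t."""
    P0 = np.diag([1, 0]).astype(complex)
    P1 = np.diag([0, 1]).astype(complex)
    return lift({control: P0}) + lift({control: P1, target: X})

# Random unknown state on qubit 0; qubits 1 and 2 start in |0>.
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)
state = np.kron(psi, np.kron([1, 0], [1, 0])).astype(complex)

# Entangle qubit 1 (Alice's half) with qubit 2 (Bob's half).
state = cnot(1, 2) @ (lift({1: H}) @ state)

# Alice's Bell measurement on qubits 0 and 1.
state = lift({0: H}) @ (cnot(0, 1) @ state)
amps = state.reshape(2, 2, 2)                     # indices: q0, q1, q2
probs = np.einsum('abc,abc->ab', amps, amps.conj()).real
probs /= probs.sum()
a, b = np.unravel_index(rng.choice(4, p=probs.ravel()), (2, 2))

# Bob's qubit collapses; the two classical bits tell him which fix to apply.
bob = amps[a, b] / np.linalg.norm(amps[a, b])
if b:
    bob = X @ bob
if a:
    bob = Z @ bob

# Up to a global phase, Bob now holds the original state.
print("fidelity:", round(abs(np.vdot(psi, bob)) ** 2, 6))
```

Whatever outcome Alice gets, the fidelity comes out as 1.0: the shared entanglement plus two classical bits fully reconstruct the state at Bob's end, without the photon itself making the trip.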

Here’s a link to and a citation for the paper,

Advances in quantum teleportation by S. Pirandola, J. Eisert, C. Weedbrook, A. Furusawa, & S. L. Braunstein. Nature Photonics 9, 641–652 (2015) doi:10.1038/nphoton.2015.154 Published online 29 September 2015

This paper is behind a paywall.

D-Wave upgrades Google’s quantum computing capabilities

Vancouver-based (more accurately, Burnaby-based) D-Wave Systems has scored a coup as key customers have upgraded from a 512-qubit system to a system with over 1,000 qubits. (The technical breakthrough and concomitant interest from the business community were mentioned here in a June 26, 2015 posting.) As for the latest business breakthrough, here’s more from a Sept. 28, 2015 D-Wave press release,

D-Wave Systems Inc., the world’s first quantum computing company, announced that it has entered into a new agreement covering the installation of a succession of D-Wave systems located at NASA’s Ames Research Center in Moffett Field, California. This agreement supports collaboration among Google, NASA and USRA (Universities Space Research Association) that is dedicated to studying how quantum computing can advance artificial intelligence and machine learning, and the solution of difficult optimization problems. The new agreement enables Google and its partners to keep their D-Wave system at the state-of-the-art for up to seven years, with new generations of D-Wave systems to be installed at NASA Ames as they become available.

“The new agreement is the largest order in D-Wave’s history, and indicative of the importance of quantum computing in its evolution toward solving problems that are difficult for even the largest supercomputers,” said D-Wave CEO Vern Brownell. “We highly value the commitment that our partners have made to D-Wave and our technology, and are excited about the potential use of our systems for machine learning and complex optimization problems.”

Cade Metz’s Sept. 28, 2015 article for Wired magazine provides some interesting observations about D-Wave computers along with some explanations of quantum computing (Note: Links have been removed),

Though the D-Wave machine is less powerful than many scientists hope quantum computers will one day be, the leap to 1000 qubits represents an exponential improvement in what the machine is capable of. What is it capable of? Google and its partners are still trying to figure that out. But Google has said it’s confident there are situations where the D-Wave can outperform today’s non-quantum machines, and scientists at the University of Southern California [USC] have published research suggesting that the D-Wave exhibits behavior beyond classical physics.

A quantum computer operates according to the principles of quantum mechanics, the physics of very small things, such as electrons and photons. In a classical computer, a transistor stores a single “bit” of information. If the transistor is “on,” it holds a 1, and if it’s “off,” it holds a 0. But in a quantum computer, thanks to what’s called the superposition principle, information is held in a quantum system that can exist in two states at the same time. This “qubit” can store a 0 and 1 simultaneously.

Two qubits, then, can hold four values at any given time (00, 01, 10, and 11). And as you keep increasing the number of qubits, you exponentially increase the power of the system. The problem is that building a qubit is an extremely difficult thing. If you read information from a quantum system, it “decoheres.” Basically, it turns into a classical bit that houses only a single value.
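Metz's description is easy to reproduce with a toy state vector. The sketch below is generic numpy, nothing D-Wave-specific: it puts two qubits into equal superposition, prints the four simultaneous amplitudes, and then shows the collapse to a single classical outcome when the register is read.

```python
import numpy as np

rng = np.random.default_rng(1)

# One qubit in an equal superposition of 0 and 1.
plus = np.array([1, 1]) / np.sqrt(2)

# Two such qubits: the joint state is a vector of 2**2 = 4 amplitudes,
# one for each of 00, 01, 10 and 11, all present at once.
state = np.kron(plus, plus)
print(dict(zip(["00", "01", "10", "11"], np.round(state, 3))))

# Reading the register collapses it: one classical outcome comes back,
# sampled with probability |amplitude|**2.
probs = np.abs(state) ** 2
print("measured:", rng.choice(["00", "01", "10", "11"], p=probs))
```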

D-Wave claims to have found a solution to the decoherence problem, and that appears to be borne out by the USC researchers. Still, it isn’t a general quantum computer (from Metz’s article),

… researchers at USC say that the system appears to display a phenomenon called “quantum annealing” that suggests it’s truly operating in the quantum realm. Regardless, the D-Wave is not a general quantum computer—that is, it’s not a computer for just any task. But D-Wave says the machine is well-suited to “optimization” problems, where you’re facing many, many different ways forward and must pick the best option, and to machine learning, where computers teach themselves tasks by analyzing large amounts of data.
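Quantum annealing needs quantum hardware, but its classical cousin, simulated annealing, illustrates what "finding the lowest point in an energy landscape" means. Below is a toy run on a random 20-spin Ising problem; the construction is mine and purely illustrative, not D-Wave's software or problem format.

```python
import numpy as np

rng = np.random.default_rng(42)

# A random Ising "energy landscape": n spins (+1/-1), pairwise couplings J.
n = 20
J = np.triu(rng.normal(size=(n, n)), 1)  # couplings above the diagonal

def energy(s):
    return s @ J @ s

s = rng.choice([-1, 1], size=n)
for T in np.geomspace(5.0, 0.01, 5000):   # slowly lower the "temperature"
    i = rng.integers(n)
    flipped = s.copy()
    flipped[i] = -flipped[i]
    dE = energy(flipped) - energy(s)
    # Always accept downhill moves; sometimes accept uphill ones early on,
    # which lets the search escape shallow local minima.
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s = flipped

print("final energy:", energy(s))
```

The annealer's pitch is that quantum tunneling can do this escaping-of-local-minima more effectively than thermal fluctuations for some landscapes.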

It takes a lot of innovation before you make big strides forward and I think D-Wave is to be congratulated on producing what is to my knowledge the only commercially available form of quantum computing of any sort in the world.

ETA Oct. 6, 2015* at 1230 hours PST: Minutes after publishing about D-Wave I came across this item (h/t Quirks & Quarks twitter) about Australian researchers and their quantum computing breakthrough. From an Oct. 6, 2015 article by Hannah Francis for the Sydney (Australia) Morning Herald,

For decades scientists have been trying to turn quantum computing — which allows for multiple calculations to happen at once, making it immeasurably faster than standard computing — into a practical reality rather than a moonshot theory. Until now, they have largely relied on “exotic” materials to construct quantum computers, making them unsuitable for commercial production.

But researchers at the University of New South Wales have patented a new design, published in the scientific journal Nature on Tuesday, created specifically with computer industry manufacturing standards in mind and using affordable silicon, which is found in regular computer chips like those we use every day in smartphones or tablets.

“Our team at UNSW has just cleared a major hurdle to making quantum computing a reality,” the director of the university’s Australian National Fabrication Facility, Andrew Dzurak, the project’s leader, said.

“As well as demonstrating the first quantum logic gate in silicon, we’ve also designed and patented a way to scale this technology to millions of qubits using standard industrial manufacturing techniques to build the world’s first quantum processor chip.”

According to the article, the university is looking for industrial partners to help them exploit this breakthrough. Francis’s article features an embedded video, as well as more detail.

*It was Oct. 6, 2015 in Australia but Oct. 5, 2015 my side of the international date line.

ETA Oct. 6, 2015 (my side of the international date line): An Oct. 5, 2015 University of New South Wales news release on EurekAlert provides additional details.

Here’s a link to and a citation for the paper,

A two-qubit logic gate in silicon by M. Veldhorst, C. H. Yang, J. C. C. Hwang, W. Huang, J. P. Dehollain, J. T. Muhonen, S. Simmons, A. Laucht, F. E. Hudson, K. M. Itoh, A. Morello & A. S. Dzurak. Nature (2015) doi:10.1038/nature15263 Published online 05 October 2015

This paper is behind a paywall.

D-Wave passes 1000-qubit barrier

A local (Vancouver, Canada-based) quantum computing company, D-Wave is making quite a splash lately due to a technical breakthrough. h/t’s to Speaking up for Canadian Science for the Business in Vancouver article and to Nanotechnology Now for the Harris & Harris Group press release and Economist article.

A June 22, 2015 article by Tyler Orton for Business in Vancouver describes D-Wave’s latest technical breakthrough,

“This updated processor will allow significantly more complex computational problems to be solved than ever before,” Jeremy Hilton, D-Wave’s vice-president of processor development, wrote in a June 22 [2015] blog entry.

Regular computers use bits – ones and zeroes – to make calculations, while quantum computers rely on qubits.

Qubits possess a “superposition” that allows them to be one and zero at the same time, meaning a quantum computer can calculate all possible values in a single operation.

But the algorithm for a full-scale quantum computer requires 8,000 qubits.

A June 23, 2015 Harris & Harris Group press release adds more information about the breakthrough,

Harris & Harris Group, Inc. (Nasdaq: TINY), an investor in transformative companies enabled by disruptive science, notes that its portfolio company, D-Wave Systems, Inc., announced that it has successfully fabricated 1,000 qubit processors that power its quantum computers. D-Wave’s quantum computer runs a quantum annealing algorithm to find the lowest points, corresponding to optimal or near optimal solutions, in a virtual “energy landscape.” Every additional qubit doubles the search space of the processor. At 1,000 qubits, the new processor considers 2^1000 possibilities simultaneously, a search space which is substantially larger than the 2^512 possibilities available to the company’s currently available 512-qubit D-Wave Two. In fact, the new search space contains far more possibilities than there are particles in the observable universe.
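The search-space arithmetic in the press release is easy to verify; Python integers are arbitrary precision, so the comparison with the roughly 10^80 particles usually quoted for the observable universe is exact:

```python
import math

print(math.log10(2 ** 1000))   # ~301.0, i.e. a 302-digit number
print(math.log10(2 ** 512))    # ~154.1, i.e. a 155-digit number
print(2 ** 1000 > 10 ** 80)    # True: far more states than particles
```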

A June 22, 2015 D-Wave news release, which originated the technical details about the breakthrough found in the Harris & Harris press release, provides more information along with some marketing hype, or hyperbole (Note: Links have been removed),

As the only manufacturer of scalable quantum processors, D-Wave breaks new ground with every succeeding generation it develops. The new processors, comprising over 128,000 Josephson tunnel junctions, are believed to be the most complex superconductor integrated circuits ever successfully yielded. They are fabricated in part at D-Wave’s facilities in Palo Alto, CA and at Cypress Semiconductor’s wafer foundry located in Bloomington, Minnesota.

“Temperature, noise, and precision all play a profound role in how well quantum processors solve problems.  Beyond scaling up the technology by doubling the number of qubits, we also achieved key technology advances prioritized around their impact on performance,” said Jeremy Hilton, D-Wave vice president, processor development. “We expect to release benchmarking data that demonstrate new levels of performance later this year.”

The 1000-qubit milestone is the result of intensive research and development by D-Wave and reflects a triumph over a variety of design challenges aimed at enhancing performance and boosting solution quality. Beyond the much larger number of qubits, other significant innovations include:

  • Lower Operating Temperature: While the previous generation processor ran at a temperature close to absolute zero, the new processor runs 40% colder. The lower operating temperature enhances the importance of quantum effects, which increases the ability to discriminate the best result from a collection of good candidates.
  • Reduced Noise: Through a combination of improved design, architectural enhancements and materials changes, noise levels have been reduced by 50% in comparison to the previous generation. The lower noise environment enhances problem-solving performance while boosting reliability and stability.
  • Increased Control Circuitry Precision: In the testing to date, the increased precision coupled with the noise reduction has demonstrated improved precision by up to 40%. To accomplish both while also improving manufacturing yield is a significant achievement.
  • Advanced Fabrication: The new processors comprise over 128,000 Josephson junctions (tunnel junctions with superconducting electrodes) in a 6-metal layer planar process with 0.25μm features, believed to be the most complex superconductor integrated circuits ever built.
  • New Modes of Use: The new technology expands the boundaries of ways to exploit quantum resources. In addition to performing discrete optimization like its predecessor, firmware and software upgrades will make it easier to use the system for sampling applications.

“Breaking the 1000 qubit barrier marks the culmination of years of research and development by our scientists, engineers and manufacturing team,” said D-Wave CEO Vern Brownell. “It is a critical step toward bringing the promise of quantum computing to bear on some of the most challenging technical, commercial, scientific, and national defense problems that organizations face.”

A June 20, 2015 article in The Economist notes there is now commercial interest and provides good introductory information about quantum computing. The article includes an analysis of various research efforts in Canada (they mention D-Wave), the US, and the UK. These excerpts don’t do justice to the article but will hopefully whet your appetite or provide an overview for anyone with limited time,

A COMPUTER proceeds one step at a time. At any particular moment, each of its bits—the binary digits it adds and subtracts to arrive at its conclusions—has a single, definite value: zero or one. At that moment the machine is in just one state, a particular mixture of zeros and ones. It can therefore perform only one calculation next. This puts a limit on its power. To increase that power, you have to make it work faster.

But bits do not exist in the abstract. Each depends for its reality on the physical state of part of the computer’s processor or memory. And physical states, at the quantum level, are not as clear-cut as classical physics pretends. That leaves engineers a bit of wriggle room. By exploiting certain quantum effects they can create bits, known as qubits, that do not have a definite value, thus overcoming classical computing’s limits.

… The biggest question is what the qubits themselves should be made from.

A qubit needs a physical system with two opposite quantum states, such as the direction of spin of an electron orbiting an atomic nucleus. Several things which can do the job exist, and each has its fans. Some suggest nitrogen atoms trapped in the crystal lattices of diamonds. Calcium ions held in the grip of magnetic fields are another favourite. So are the photons of which light is composed (in this case the qubit would be stored in the plane of polarisation). And quasiparticles, which are vibrations in matter that behave like real subatomic particles, also have a following.

The leading candidate at the moment, though, is to use a superconductor in which the qubit is either the direction of a circulating current, or the presence or absence of an electric charge. Both Google and IBM are banking on this approach. It has the advantage that superconducting qubits can be arranged on semiconductor chips of the sort used in existing computers. That, the two firms think, should make them easier to commercialise.

Google is also collaborating with D-Wave of Vancouver, Canada, which sells what it calls quantum annealers. The field’s practitioners took much convincing that these devices really do exploit the quantum advantage, and in any case they are limited to a narrower set of problems—such as searching for images similar to a reference image. But such searches are just the type of application of interest to Google. In 2013, in collaboration with NASA and USRA, a research consortium, the firm bought a D-Wave machine in order to put it through its paces. Hartmut Neven, director of engineering at Google Research, is guarded about what his team has found, but he believes D-Wave’s approach is best suited to calculations involving fewer qubits, while Dr Martinis and his colleagues build devices with more.

It’s not clear to me if the writers at The Economist were aware of D-Wave’s latest breakthrough at the time of writing but I think not. In any event, they (The Economist writers) have included a provocative tidbit about quantum encryption,

Documents released by Edward Snowden, a whistleblower, revealed that the Penetrating Hard Targets programme of America’s National Security Agency was actively researching “if, and how, a cryptologically useful quantum computer can be built”. In May IARPA [Intelligence Advanced Research Projects Agency], the American government’s intelligence-research arm, issued a call for partners in its Logical Qubits programme, to make robust, error-free qubits. In April, meanwhile, Tanja Lange and Daniel Bernstein of Eindhoven University of Technology, in the Netherlands, announced PQCRYPTO, a programme to advance and standardise “post-quantum cryptography”. They are concerned that encrypted communications captured now could be subjected to quantum cracking in the future. That means strong pre-emptive encryption is needed immediately.

I encourage you to read the Economist article.

Two final comments. (1) The latest piece, prior to this one, about D-Wave was in a Feb. 6, 2015 posting about then new investment into the company. (2) A Canadian effort in the field of quantum cryptography was mentioned in a May 11, 2015 posting (scroll down about 50% of the way) featuring a profile of Raymond Laflamme, at the University of Waterloo’s Institute of Quantum Computing in the context of an announcement about science media initiative Research2Reality.

More investment money for Canada’s D-Wave Systems (quantum computing)

A Feb. 2, 2015 news item on Nanotechnology Now features D-Wave Systems (located in the Vancouver region, Canada) and its recent funding bonanza of $29M (CAD),

Harris & Harris Group, Inc. (Nasdaq:TINY), an investor in transformative companies enabled by disruptive science, notes the announcement by portfolio company, D-Wave Systems, Inc., that it has closed $29 million (CAD) in funding from a large institutional investor, among others. This funding will be used to accelerate development of D-Wave’s quantum hardware and software and expand the software application ecosystem. This investment brings total funding in D-Wave to $174 million (CAD), with approximately $62 million (CAD) raised in 2014. Harris & Harris Group’s total investment in D-Wave is approximately $5.8 million (USD). D-Wave’s announcement also includes highlights of 2014, a year of strong growth and advancement for D-Wave.

A Jan. 29, 2015 D-Wave news release provides more details about the new investment and D-Wave’s 2014 triumphs,

D-Wave Systems Inc., the world’s first quantum computing company, today announced that it has closed $29 million in funding from a large institutional investor, among others. This funding will be used to accelerate development of D-Wave’s quantum hardware and software and expand the software application ecosystem. This investment brings total funding in D-Wave to $174 million (CAD), with approximately $62 million raised in 2014.

“The investment is a testament to the progress D-Wave continues to make as the leader in quantum computing systems,” said Vern Brownell, CEO of D-Wave. “The funding we received in 2014 will advance our quantum hardware and software development, as well as our work on leading edge applications of our systems. By making quantum computing available to more organizations, we’re driving our goal of finding solutions to the most complex optimization and machine learning applications in national defense, computing, research and finance.”

The funding follows a year of strong growth and advancement for D-Wave. Highlights include:

•    Significant progress made towards the release of the next D-Wave quantum system featuring a 1000 qubit processor, which is currently undergoing testing in D-Wave’s labs.
•    The company’s patent portfolio grew to over 150 issued patents worldwide, with 11 new U.S. patents being granted in 2014, covering aspects of D-Wave’s processor technology, systems and techniques for solving computational problems using D-Wave’s technology.
•    D-Wave Professional Services launched, providing quantum computing experts to collaborate directly with customers, and deliver training classes on the usage and programming of the D-Wave system to a number of national laboratories, businesses and universities.
•    Partnerships were established with DNA-SEQ and 1QBit, companies that are developing quantum software applications in the spheres of medicine and finance, respectively.
•    Research throughout the year continued to validate D-Wave’s work, including a study showing further evidence of quantum entanglement by D-Wave and USC  [University of Southern California] scientists, published in Physical Review X this past May.

Since 2011, some of the most prestigious organizations in the world, including Lockheed Martin, NASA, Google, USC and the Universities Space Research Association (USRA), have partnered with D-Wave to use their quantum computing systems. In 2015, these partners will continue to work with the D-Wave computer, conducting pioneering research in machine learning, optimization, and space exploration.

D-Wave, which already employs over 120 people, plans to expand hiring with the additional funding. Key areas of growth include research, processor and systems development and software engineering.

Harris & Harris Group offers a description of D-Wave which mentions nanotechnology and hosts a couple of explanatory videos,

D-Wave Systems develops an adiabatic quantum computer (QC).

Status
Privately Held

The Market
Electronics – High Performance Computing

The Problem
Traditional or “classical computers” are constrained by the sequential character of data processing that makes the solving of NP-hard (nondeterministic polynomial-time hard) problems difficult or potentially impossible in reasonable timeframes. These types of computationally intense problems are commonly observed in software verifications, scheduling and logistics planning, integer programming, bioinformatics and financial portfolio optimization.

D-Wave’s Solution
D-Wave develops quantum computers that are capable of processing data using quantum mechanical properties of matter. This leverage of quantum mechanics enables the identification of solutions to some NP-hard problems in a reasonable timeframe, instead of the exponential time needed for any classical digital computer. D-Wave sold and installed its first quantum computing system to a commercial customer in 2011.

Nanotechnology Factor
To function properly, the D-Wave processor requires tight control and manipulation of quantum mechanical phenomena. This control and manipulation is achieved by creating integrated circuits based on Josephson junctions and other superconducting circuitry. By picking superconductors, D-Wave managed to combine quantum mechanical behavior with the macroscopic dimensions needed for high-yield design and manufacturing.

It seems D-Wave has made some research and funding strides since I last wrote about the company in a Jan. 19, 2012 posting, although there is no mention of quantum computer sales.

Could there be a quantum internet?

We’ve always had limited success with predicting future technologies by examining current technologies. For example, the Internet and World Wide Web as we experience them today would have been unthinkable for most people in the 1950s, when computers inhabited entire buildings and satellites were a brand new technology designed for space exploration, not bouncing communication signals around the planet. That said, this new work on a ‘quantum internet’ from Eindhoven University of Technology is quite intriguing (from a Dec. 15, 2014 news item on Nanowerk),

In the same way as we now connect computers in networks through optical signals, it could also be possible to connect future quantum computers in a ‘quantum internet’. The optical signals would then consist of individual light particles or photons. One prerequisite for a working quantum internet is control of the shape of these photons. Researchers at Eindhoven University of Technology (TU/e) and the FOM foundation [Foundation for Fundamental Research on Matter] have now succeeded for the first time in getting this control within the required short time.

A Dec. 15, 2014 Eindhoven University of Technology (TU/e) press release, which originated the news item, describes one of the problems with a ‘quantum internet’ and the researchers’ solution,

Quantum computers could in principle communicate with each other by exchanging individual photons to create a ‘quantum internet’. The shape of the photons, in other words how their energy is distributed over time, is vital for successful transmission of information. This shape must be symmetric in time, while photons that are emitted by atoms normally have an asymmetric shape. Therefore, this process requires external control in order to create a quantum internet.

Optical cavity

Researchers at TU/e and FOM have succeeded in getting the required degree of control by embedding a quantum dot – a piece of semiconductor material that can transmit photons – into a ‘photonic crystal’, thereby creating an optical cavity. Then the researchers applied a very short electrical pulse to the cavity, which influences how the quantum dot interacts with it, and how the photon is emitted. By varying the strength of this pulse, they were able to control the shape of the transmitted photons.

Within a billionth of a second

The Eindhoven researchers are the first to achieve this, thanks to the use of electrical pulses shorter than a nanosecond, a billionth of a second. This is vital for use in quantum communication, as research leader Andrea Fiore of TU/e explains: “The emission of a photon only lasts for one nanosecond, so if you want to change anything you have to do it within that time. It’s like the shutter of a high-speed camera, which has to be very short if you want to capture something that changes very fast in an image. By controlling the speed at which you send a photon, you can in principle achieve very efficient exchange of photons, which is important for the future quantum internet.”
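The point about pulse shape can be illustrated with two toy envelopes: the one-sided exponential that a bare emitter produces, versus a time-symmetric pulse. The symmetry measure below, the overlap of a pulse with its own time reverse, is my choice of illustration rather than the paper's figure of merit:

```python
import numpy as np

t = np.linspace(0, 5, 2001)   # time axis, in units of the emission window
dt = t[1] - t[0]

# A photon emitted by a bare emitter has a one-sided exponential envelope:
# sharp rise, slow decay, asymmetric in time.
exp_envelope = np.exp(-t)

# The goal described above is a time-symmetric shape, which a receiving
# node can absorb as efficiently as the sender emitted it.
sym_envelope = np.exp(-((t - 2.5) ** 2))

def symmetry(f):
    """Overlap of a pulse with its time reverse: 1.0 = perfectly symmetric."""
    f = f / np.sqrt(np.sum(f ** 2) * dt)
    return np.sum(f * f[::-1]) * dt

print("exponential envelope:", round(symmetry(exp_envelope), 3))  # ~0.07
print("symmetric envelope:  ", round(symmetry(sym_envelope), 3))  # ~1.0
```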

Here’s a link to and a citation for the paper,

Dynamically controlling the emission of single excitons in photonic crystal cavities by Francesco Pagliano, YongJin Cho, Tian Xia, Frank van Otten, Robert Johne, & Andrea Fiore. Nature Communications 5, Article number: 5786 doi:10.1038/ncomms6786 Published 15 December 2014

This is an open access paper.

ETA Dec. 16, 2014 at 1230 hours PDT: There is a copy of the Dec. 15, 2014 news release on EurekAlert.

IBM weighs in with plans for a 7nm computer chip

On the heels of Intel’s announcement about a deal utilizing their 14nm low-power manufacturing process and speculations about a 10nm computer chip (my July 9, 2014 posting), IBM makes an announcement about a 7nm chip as per this July 10, 2014 news item on Azonano,

IBM today [July 10, 2014] announced it is investing $3 billion over the next 5 years in two broad research and early stage development programs to push the limits of chip technology needed to meet the emerging demands of cloud computing and Big Data systems. These investments will push IBM’s semiconductor innovations from today’s breakthroughs into the advanced technology leadership required for the future.

A very comprehensive July 10, 2014 news release lays out the company’s plans for this $3B investment representing 10% of IBM’s total research budget,

The first research program is aimed at so-called “7 nanometer and beyond” silicon technology that will address serious physical challenges that are threatening current semiconductor scaling techniques and will impede the ability to manufacture such chips. The second is focused on developing alternative technologies for post-silicon era chips using entirely different approaches, which IBM scientists and other experts say are required because of the physical limitations of silicon based semiconductors.

Cloud and big data applications are placing new challenges on systems, just as the underlying chip technology is facing numerous significant physical scaling limits.  Bandwidth to memory, high speed communication and device power consumption are becoming increasingly challenging and critical.

The teams will comprise IBM Research scientists and engineers from Albany and Yorktown, New York; Almaden, California; and Europe. In particular, IBM will be investing significantly in emerging areas of research that are already underway at IBM such as carbon nanoelectronics, silicon photonics, new memory technologies, and architectures that support quantum and cognitive computing. [emphasis mine]

These teams will focus on providing orders of magnitude improvement in system level performance and energy efficient computing. In addition, IBM will continue to invest in the nanosciences and quantum computing–two areas of fundamental science where IBM has remained a pioneer for over three decades.

7 nanometer technology and beyond

IBM Researchers and other semiconductor experts predict that while challenging, semiconductors show promise to scale from today’s 22 nanometers down to 14 and then 10 nanometers in the next several years. However, scaling to 7 nanometers and perhaps below by the end of the decade will require significant investment and innovation in semiconductor architectures as well as invention of new tools and techniques for manufacturing.

“The question is not if we will introduce 7 nanometer technology into manufacturing, but rather how, when, and at what cost?” said John Kelly, senior vice president, IBM Research. “IBM engineers and scientists, along with our partners, are well suited for this challenge and are already working on the materials science and device engineering required to meet the demands of the emerging system requirements for cloud, big data, and cognitive systems. This new investment will ensure that we produce the necessary innovations to meet these challenges.”

“Scaling to 7nm and below is a terrific challenge, calling for deep physics competencies in processing nano materials affinities and characteristics. IBM is one of a very few companies who has repeatedly demonstrated this level of science and engineering expertise,” said Richard Doherty, technology research director, The Envisioneering Group.

Bridge to a “Post-Silicon” Era

Silicon transistors, tiny switches that carry information on a chip, have been made smaller year after year, but they are approaching a point of physical limitation. Their increasingly small dimensions, now reaching the nanoscale, will prohibit any gains in performance due to the nature of silicon and the laws of physics. Within a few more generations, classical scaling and shrinkage will no longer yield the sizable benefits of lower power, lower cost and higher speed processors that the industry has become accustomed to.

With virtually all electronic equipment today built on complementary metal–oxide–semiconductor (CMOS) technology, there is an urgent need for new materials and circuit architecture designs compatible with this engineering process as the technology industry nears physical scalability limits of the silicon transistor.

Beyond 7 nanometers, the challenges dramatically increase, requiring a new kind of material to power systems of the future, and new computing platforms to solve problems that are unsolvable or difficult to solve today. Potential alternatives include new materials such as carbon nanotubes, and non-traditional computational approaches such as neuromorphic computing, cognitive computing, machine learning techniques, and the science behind quantum computing.

As the leader in advanced schemes that point beyond traditional silicon-based computing, IBM holds over 500 patents for technologies that will drive advancements at 7nm and beyond silicon — more than twice the nearest competitor. These continued investments will accelerate the invention and introduction into product development for IBM’s highly differentiated computing systems for cloud, and big data analytics.

Several exploratory research breakthroughs that could lead to major advancements in delivering dramatically smaller, faster and more powerful computer chips, include quantum computing, neurosynaptic computing, silicon photonics, carbon nanotubes, III-V technologies, low power transistors and graphene:

Quantum Computing

The most basic piece of information that a typical computer understands is a bit. Much like a light that can be switched on or off, a bit can have only one of two values: “1” or “0.” A quantum bit, or qubit, can hold a 1, a 0, or both values at once. Described as superposition, this special property of qubits enables quantum computers to weed through millions of solutions all at once, while desktop PCs would have to consider them one at a time.

IBM is a world leader in superconducting qubit-based quantum computing science and is a pioneer in the field of experimental and theoretical quantum information, fields that are still in the category of fundamental science – but one that, in the long term, may allow the solution of problems that are today either impossible or impractical to solve using conventional machines. The team recently demonstrated the first experimental realization of parity check with three superconducting qubits, an essential building block for one type of quantum computer.

Neurosynaptic Computing

Bringing together nanoscience, neuroscience, and supercomputing, IBM and university partners have developed an end-to-end ecosystem including a novel non-von Neumann architecture, a new programming language, as well as applications. This novel technology allows for computing systems that emulate the brain’s computing efficiency, size and power usage. IBM’s long-term goal is to build a neurosynaptic system with ten billion neurons and a hundred trillion synapses, all while consuming only one kilowatt of power and occupying less than two liters of volume.
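Taking those stated targets at face value, the power budget per component works out as follows (my arithmetic, not IBM's):

```python
# IBM's stated target: 1e10 neurons, 1e14 synapses, 1 kW of power.
neurons, synapses, watts = 1e10, 1e14, 1000.0

print(f"power per neuron:  {watts / neurons:.0e} W")   # ~1e-07 W
print(f"power per synapse: {watts / synapses:.0e} W")  # ~1e-11 W, tens of picowatts
```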

Silicon Photonics

IBM has been a pioneer in the area of CMOS integrated silicon photonics for over 12 years, a technology that integrates functions for optical communications on a silicon chip, and the IBM team has recently designed and fabricated the world’s first monolithic silicon photonics based transceiver with wavelength division multiplexing.  Such transceivers will use light to transmit data between different components in a computing system at high data rates, low cost, and in an energetically efficient manner.

Silicon nanophotonics takes advantage of pulses of light for communication rather than traditional copper wiring and provides a super highway for large volumes of data to move at rapid speeds between computer chips in servers, large datacenters, and supercomputers, thus alleviating the limitations of congested data traffic and high-cost traditional interconnects.

Businesses are entering a new era of computing that requires systems to process and analyze, in real-time, huge volumes of information known as Big Data. Silicon nanophotonics technology provides answers to Big Data challenges by seamlessly connecting various parts of large systems, whether few centimeters or few kilometers apart from each other, and move terabytes of data via pulses of light through optical fibers.

III-V technologies

IBM researchers have demonstrated the world’s highest transconductance on a self-aligned III-V channel metal-oxide semiconductor (MOS) field-effect transistor (FET) device structure that is compatible with CMOS scaling. These materials and structural innovations are expected to pave the path for technology scaling at 7nm and beyond. With more than an order of magnitude higher electron mobility than silicon, integrating III-V materials into CMOS enables higher performance at lower power density, allowing for an extension to power/performance scaling to meet the demands of cloud computing and big data systems.

Carbon Nanotubes

IBM Researchers are working in the area of carbon nanotube (CNT) electronics and exploring whether CNTs can replace silicon beyond the 7 nm node.  As part of its activities for developing carbon nanotube based CMOS VLSI circuits, IBM recently demonstrated — for the first time in the world — 2-way CMOS NAND gates using 50 nm gate length carbon nanotube transistors.

IBM also has demonstrated the capability for purifying carbon nanotubes to 99.99 percent, the highest (verified) purities demonstrated to date, and transistors at 10 nm channel length that show no degradation due to scaling–this is unmatched by any other material system to date.

Carbon nanotubes are single atomic sheets of carbon rolled up into a tube. The carbon nanotubes form the core of a transistor device that will work in a fashion similar to the current silicon transistor, but will be better performing. They could be used to replace the transistors in chips that power data-crunching servers, high performing computers and ultra fast smart phones.

Carbon nanotube transistors can operate as excellent switches at molecular dimensions of less than ten nanometers – the equivalent of 10,000 times thinner than a strand of human hair and less than half the size of the leading silicon technology. Comprehensive modeling of the electronic circuits suggests that about a five to ten times improvement in performance compared to silicon circuits is possible.

Graphene

Graphene is pure carbon in the form of a one atomic layer thick sheet.  It is an excellent conductor of heat and electricity, and it is also remarkably strong and flexible.  Electrons can move in graphene about ten times faster than in commonly used semiconductor materials such as silicon and silicon germanium. Its characteristics offer the possibility to build faster switching transistors than are possible with conventional semiconductors, particularly for applications in the handheld wireless communications business where it will be a more efficient switch than those currently used.

Recently in 2013, IBM demonstrated the world’s first graphene based integrated circuit receiver front end for wireless communications. The circuit consisted of a 2-stage amplifier and a down converter operating at 4.3 GHz.

Next Generation Low Power Transistors

In addition to new materials like CNTs, new architectures and innovative device concepts are required to boost future system performance. Power dissipation is a fundamental challenge for nanoelectronic circuits. To explain the challenge, consider a leaky water faucet — even after closing the valve as far as possible water continues to drip — this is similar to today’s transistor, in that energy is constantly “leaking” or being lost or wasted in the off-state.

A potential alternative to today’s power hungry silicon field effect transistors are so-called steep slope devices. They could operate at much lower voltage and thus dissipate significantly less power. IBM scientists are researching tunnel field effect transistors (TFETs). In this special type of transistors the quantum-mechanical effect of band-to-band tunneling is used to drive the current flow through the transistor. TFETs could achieve a 100-fold power reduction over complementary CMOS transistors, so integrating TFETs with CMOS technology could improve low-power integrated circuits.
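The "steep slope" label has a concrete number behind it. A conventional transistor relies on thermionic emission, so its subthreshold swing cannot beat ln(10)·kT/q, roughly 60 mV per decade of current at room temperature; tunneling devices are not bound by that limit. A quick check of the textbook figure (standard device physics, not a number from IBM's release):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
Q = 1.602176634e-19  # elementary charge, C

def thermionic_swing_mv_per_decade(temperature_k: float = 300.0) -> float:
    """Minimum subthreshold swing of a conventional FET: ln(10) * kT/q."""
    return math.log(10) * K_B * temperature_k / Q * 1000

print(f"room-temperature limit: {thermionic_swing_mv_per_decade():.1f} mV/decade")

# Halving the swing doubles how hard the device turns off per volt of gate
# swing, which is where the claimed power savings come from.
for swing_mv in (60, 30):
    decades = 300 / swing_mv
    print(f"{swing_mv} mV/decade -> {decades:.0f} decades of current suppression in 0.3 V")
```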

Recently, IBM has developed a novel method to integrate III-V nanowires and heterostructures directly on standard silicon substrates and built the first ever InAs/Si tunnel diodes and TFETs using InAs as source and Si as channel with wrap-around gate as steep slope device for low power consumption applications.

“In the next ten years computing hardware systems will be fundamentally different as our scientists and engineers push the limits of semiconductor innovations to explore the post-silicon future,” said Tom Rosamilia, senior vice president, IBM Systems and Technology Group. “IBM Research and Development teams are creating breakthrough innovations that will fuel the next era of computing systems.”

IBM’s historic contributions to silicon and semiconductor innovation include the invention and/or first implementation of: the single cell DRAM, the “Dennard scaling laws” underpinning “Moore’s Law”, chemically amplified photoresists, copper interconnect wiring, Silicon on Insulator, strained engineering, multi core microprocessors, immersion lithography, high speed silicon germanium (SiGe), High-k gate dielectrics, embedded DRAM, 3D chip stacking, and Air gap insulators.

IBM researchers are also credited with initiating the era of nanodevices following the Nobel Prize-winning invention of the scanning tunneling microscope, which enabled nano- and atomic-scale invention and innovation.

IBM will also continue to fund and collaborate with university researchers to explore and develop future technologies for the semiconductor industry. In particular, IBM will continue to support and fund university research through public-private partnerships such as the NanoElectronics Research Initiative (NRI), the Semiconductor Advanced Research Network (STARnet), and the Global Research Consortium (GRC) of the Semiconductor Research Corporation.

I highlighted ‘memory systems’ as this brings to mind HP Labs and their major investment in ‘memristive’ technologies noted in my June 26, 2014 posting,

… During a two-hour presentation held a year and a half ago, they laid out how the computer might work, its benefits, and the expectation that about 75 percent of HP Labs personnel would be dedicated to this one project. “At the end, Meg [Meg Whitman, CEO of HP] turned to [Chief Financial Officer] Cathie Lesjak and said, ‘Find them more money,’” says John Sontag, the vice president of systems research at HP, who attended the meeting and is in charge of bringing the Machine to life. “People in Labs see this as a once-in-a-lifetime opportunity.”

The Machine is based on the memristor and other associated technologies.

Getting back to IBM, there’s this analysis of the $3B investment ($600M/year for five years) by Alex Konrad in a July 10, 2014 article for Forbes (Note: A link has been removed),

When IBM … announced a $3 billion commitment to even tinier semiconductor chips that no longer depended on silicon on Wednesday, the big news was that IBM’s putting a lot of money into a future for chips where Moore’s Law no longer applies. But on second glance, the move to spend billions on more experimental ideas like silicon photonics and carbon nanotubes shows that IBM’s finally shifting large portions of its research budget into more ambitious and long-term ideas.

… IBM tells Forbes the $3 billion isn’t additional money being added to its R&D spend, an area where analysts have told Forbes they’d like to see more aggressive cash commitments in the future. IBM will still spend about $6 billion a year on R&D, 6% of revenue. Ten percent of that research budget, however, now has to come from somewhere else to fuel these more ambitious chip projects.
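
The arithmetic in that paragraph hangs together; spelled out (my restatement, nothing new):

total_commitment = 3e9                       # $3 billion over five years
annual_commitment = total_commitment / 5     # $600 million per year
annual_rd_budget = 6e9                       # IBM's ~$6 billion yearly R&D spend

print(annual_commitment)                     # 600000000.0
print(annual_commitment / annual_rd_budget)  # 0.1 -> ten percent of the R&D budget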

Neal Ungerleider’s July 11, 2014 article for Fast Company focuses on the neuromorphic computing and quantum computing aspects of this $3B initiative (Note: Links have been removed),

The new R&D initiatives fall into two categories: Developing nanotech components for silicon chips for big data and cloud systems, and experimentation with “post-silicon” microchips. This will include research into quantum computers which don’t know binary code, neurosynaptic computers which mimic the behavior of living brains, carbon nanotubes, graphene tools and a variety of other technologies.

IBM’s investment is one of the largest for quantum computing to date; the company is one of the biggest researchers in the field, along with a Canadian company named D-Wave which is partnering with Google and NASA to develop quantum computer systems.

The curious can find D-Wave Systems here. There's also a January 19, 2012 posting here which discusses D-Wave's situation at that time.

Final observation: these are fascinating developments, especially for the insight they provide into the worries troubling HP Labs, Intel, and IBM as they jockey for position.

ETA July 14, 2014: Dexter Johnson has a July 11, 2014 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) about the IBM announcement, which features some responses he received from IBM officials to his queries,

While this may be a matter of fascinating speculation for investors, the impact on nanotechnology development is going to be significant. To get a better sense of what it all means, I was able to talk to some of the key figures of IBM’s push in nanotechnology research.

I conducted e-mail interviews with Tze-Chiang (T.C.) Chen, vice president science & technology, IBM Fellow at the Thomas J. Watson Research Center and Wilfried Haensch, senior manager, physics and materials for logic and communications, IBM Research.

Silicon versus Nanomaterials

First, I wanted to get a sense for how long IBM envisioned sticking with silicon and when they expected the company would permanently make the move away from CMOS to alternative nanomaterials. Unfortunately, as expected, I didn’t get solid answers, except for them to say that new manufacturing tools and techniques need to be developed now.

He goes on to ask about carbon nanotubes and graphene. Interestingly, IBM does not have a wide range of electronics applications in mind for graphene.  I encourage you to read Dexter’s posting as Dexter got answers to some very astute and pointed questions.

Graphene, Perimeter Institute, and condensed matter physics

In short, researchers at Canada’s Perimeter Institute are working on theoretical models involving graphene, which could lead to quantum computing. A July 3, 2014 Perimeter Institute news release by Erin Bow (also on EurekAlert) provides some insight into the connections between graphene and condensed matter physics (Note: Bow has included some good basic explanations of graphene, quasiparticles, and more for beginners),

One of the hottest materials in condensed matter research today is graphene.

Graphene had an unlikely start: it began with researchers messing around with pencil marks on paper. Pencil “lead” is actually made of graphite, which is a soft crystal lattice made of nothing but carbon atoms. When pencils deposit that graphite on paper, the lattice is laid down in thin sheets. By pulling that lattice apart into thinner sheets – originally using Scotch tape – researchers discovered that they could make flakes of crystal just one atom thick.

The name for this atom-scale chicken wire is graphene. Those folks with the Scotch tape, Andre Geim and Konstantin Novoselov, won the 2010 Nobel Prize for discovering it. “As a material, it is completely new – not only the thinnest ever but also the strongest,” wrote the Nobel committee. “As a conductor of electricity, it performs as well as copper. As a conductor of heat, it outperforms all other known materials. It is almost completely transparent, yet so dense that not even helium, the smallest gas atom, can pass through it.”

Developing a theoretical model of graphene

Graphene is not just a practical wonder – it’s also a wonderland for theorists. Confined to the two-dimensional surface of the graphene, the electrons behave strangely. All kinds of new phenomena can be seen, and new ideas can be tested. Testing new ideas in graphene is exactly what Perimeter researchers Zlatko Papić and Dmitry (Dima) Abanin set out to do.

“Dima and I started working on graphene a very long time ago,” says Papić. “We first met in 2009 at a conference in Sweden. I was a grad student and Dima was in the first year of his postdoc, I think.”

The two young scientists got to talking about what new physics they might be able to observe in the strange new material when it is exposed to a strong magnetic field.

“We decided we wanted to model the material,” says Papić. They’ve been working on their theoretical model of graphene, on and off, ever since. The two are now both at Perimeter Institute, where Papić is a postdoctoral researcher and Abanin is a faculty member. They are both cross-appointed with the Institute for Quantum Computing (IQC) at the University of Waterloo.

In January 2014, they published a paper in Physical Review Letters presenting new ideas about how to induce a strange but interesting state in graphene – one where it appears as if particles inside it have a fraction of an electron’s charge.

It’s called the fractional quantum Hall effect (FQHE), and it’s head-turning. Like the speed of light or Planck’s constant, the charge of the electron is a fixed point in the disorienting quantum universe.

Every system in the universe carries whole multiples of a single electron’s charge. When the FQHE was first discovered in the 1980s, condensed matter physicists quickly worked out that the fractionally charged “particles” inside their semiconductors were actually quasiparticles – that is, emergent collective behaviours of the system that imitate particles.

Graphene is an ideal material in which to study the FQHE. “Because it’s just one atom thick, you have direct access to the surface,” says Papić. “In semiconductors, where FQHE was first observed, the gas of electrons that create this effect are buried deep inside the material. They’re hard to access and manipulate. But with graphene you can imagine manipulating these states much more easily.”

In the January paper, Abanin and Papić reported novel types of FQHE states that could arise in bilayer graphene – that is, in two sheets of graphene laid one on top of another – when it is placed in a strong perpendicular magnetic field. In an earlier work from 2012, they argued that applying an electric field across the surface of bilayer graphene could offer a unique experimental knob to induce transitions between FQHE states. Combining the two effects, they argued, would be an ideal way to look at special FQHE states and the transitions between them.

Once the scientists developed their theory they went to work on some experiments,

Two experimental groups – one in Geneva, involving Abanin, and one at Columbia, involving both Abanin and Papić – have since put the electric field + magnetic field method to good use. The paper by the Columbia group appears in the July 4 issue of Science. A third group, led by Amir Yacoby of Harvard, is doing closely related work.

“We often work hand-in-hand with experimentalists,” says Papić. “One of the reasons I like condensed matter is that often even the most sophisticated, cutting-edge theory stands a good chance of being quickly checked with experiment.”

Inside both the magnetic and electric field, the electrical resistance of the graphene demonstrates the strange behaviour characteristic of the FQHE. Instead of resistance that varies in a smooth curve with voltage, resistance jumps suddenly from one level to another, and then plateaus – a kind of staircase of resistance. Each stair step is a different state of matter, defined by the complex quantum tangle of charges, spins, and other properties inside the graphene.
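
To attach numbers to that staircase (my own sketch using the standard quantized Hall resistance formula, not figures from the news release): each plateau sits at R = h/(νe²), where the filling factor ν is an integer for the ordinary quantum Hall effect and a fraction for the FQHE.

from fractions import Fraction

h = 6.62607015e-34    # Planck constant, J*s
e = 1.602176634e-19   # elementary charge, C

# Each plateau in the staircase sits at R_xy = h / (nu * e^2).
for nu in (Fraction(1), Fraction(1, 3), Fraction(5, 2)):
    R = h / (float(nu) * e**2)
    print(f"nu = {nu}: R_xy = {R:,.0f} ohms")
# nu = 1 gives ~25,813 ohms (the von Klitzing constant); 5/2 is an example
# of the even-denominator states discussed below.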

“The number of states is quite rich,” says Papić. “We’re very interested in bilayer graphene because of the number of states we are detecting and because we have these mechanisms – like tuning the electric field – to study how these states are interrelated, and what happens when the material changes from one state to another.”

For the moment, researchers are particularly interested in the stair steps whose “height” is described by a fraction with an even denominator. That’s because the quasiparticles in that state are expected to have an unusual property.

There are two kinds of particles in our three-dimensional world: fermions (such as electrons), where two identical particles can’t occupy one state, and bosons (such as photons), where two identical particles actually want to occupy one state. In three dimensions, fermions are fermions and bosons are bosons, and never the twain shall meet.

But a sheet of graphene doesn’t have three dimensions – it has two. It’s effectively a tiny two-dimensional universe, and in that universe, new phenomena can occur. For one thing, fermions and bosons can meet halfway – becoming anyons, which can be anywhere in between fermions and bosons. The quasiparticles in these special stair-step states are expected to be anyons.
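
In symbols (a standard textbook statement, not something specific to the Perimeter work), exchanging two identical particles multiplies the two-particle wavefunction by a phase:

\[
\psi(x_2, x_1) = e^{i\theta}\,\psi(x_1, x_2), \qquad
\theta =
\begin{cases}
0 & \text{bosons,}\\
\pi & \text{fermions,}\\
\text{anything in between} & \text{anyons (two dimensions only).}
\end{cases}
\]

For non-Abelian anyons, the phase factor is replaced by a matrix acting on a set of degenerate states, and that matrix structure is what makes them candidates for encoding qubits, as the next paragraphs explain.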

In particular, the researchers are hoping these quasiparticles will be non-Abelian anyons, as their theory indicates they should be. That would be exciting because non-Abelian anyons can be used in the making of qubits.

Graphene qubits?

Qubits are to quantum computers what bits are to ordinary computers: both a basic unit of information and the basic piece of equipment that stores that information. Because of their quantum complexity, qubits are more powerful than ordinary bits and their power grows exponentially as more of them are added. A quantum computer of only a hundred qubits can tackle certain problems beyond the reach of even the best non-quantum supercomputers. Or, it could, if someone could find a way to build stable qubits.
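
The "grows exponentially" claim can be made concrete: describing n qubits classically means tracking 2^n complex amplitudes. A toy illustration (mine, not from the news release):

# Describing n qubits classically takes 2**n complex amplitudes.
for n in (10, 50, 100):
    print(f"{n} qubits -> 2**{n} = {2**n:.3e} amplitudes")
# 100 qubits -> ~1.268e+30 amplitudes, vastly more numbers than any classical
# machine could store, which is the sense in which a hundred-qubit computer
# escapes classical simulation.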

The drive to make qubits is part of the reason why graphene is a hot research area in general, and why even-denominator FQHE states – with their special anyons – are sought after in particular.

“A state with some number of these anyons can be used to represent a qubit,” says Papić. “Our theory says they should be there and the experiments seem to bear that out – certainly the even-denominator FQHE states seem to be there, at least according to the Geneva experiments.”

That’s still a step away from experimental proof that those even-denominator stair-step states actually contain non-Abelian anyons. More work remains, but Papić is optimistic: “It might be easier to prove in graphene than it would be in semiconductors. Everything is happening right at the surface.”

It’s still early, but it looks as if bilayer graphene may be the magic material that allows this kind of qubit to be built. That would be a major mark on the unlikely line between pencil lead and quantum computers.

Here are links for further research,

January PRL paper mentioned above: http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.112.046602

Experimental paper from the Geneva graphene group, including Abanin: http://pubs.acs.org/doi/abs/10.1021/nl5003922

Experimental paper from the Columbia graphene group, including both Abanin and Papić: http://arxiv.org/abs/1403.2112. This paper is featured in the journal Science.

Related experiment on bilayer graphene by Amir Yacoby’s group at Harvard: http://www.sciencemag.org/content/early/2014/05/28/science.1250270

The Nobel Prize press release on graphene, mentioned above: http://www.nobelprize.org/nobel_prizes/physics/laureates/2010/press.html

I recently posted a piece about some research into the ‘scotch-tape technique’ for isolating graphene (June 30, 2014 posting). Amusingly, Geim argued against calling it the ‘scotch-tape technique’, something I found out only recently.