Tag Archives: quantum computing

D-Wave upgrades Google’s quantum computing capabilities

Vancouver-based (more accurately, Burnaby-based) D-Wave Systems has scored a coup as key customers have upgraded from a 512-qubit system to a system with over 1,000 qubits. (The technical breakthrough and concomitant interest from the business community were mentioned here in a June 26, 2015 posting.) As for the latest business breakthrough, here’s more from a Sept. 28, 2015 D-Wave press release,

D-Wave Systems Inc., the world’s first quantum computing company, announced that it has entered into a new agreement covering the installation of a succession of D-Wave systems located at NASA’s Ames Research Center in Moffett Field, California. This agreement supports collaboration among Google, NASA and USRA (Universities Space Research Association) that is dedicated to studying how quantum computing can advance artificial intelligence and machine learning, and the solution of difficult optimization problems. The new agreement enables Google and its partners to keep their D-Wave system at the state-of-the-art for up to seven years, with new generations of D-Wave systems to be installed at NASA Ames as they become available.

“The new agreement is the largest order in D-Wave’s history, and indicative of the importance of quantum computing in its evolution toward solving problems that are difficult for even the largest supercomputers,” said D-Wave CEO Vern Brownell. “We highly value the commitment that our partners have made to D-Wave and our technology, and are excited about the potential use of our systems for machine learning and complex optimization problems.”

Cade Metz’s Sept. 28, 2015 article for Wired magazine provides some interesting observations about D-Wave computers along with some explanations of quantum computing (Note: Links have been removed),

Though the D-Wave machine is less powerful than many scientists hope quantum computers will one day be, the leap to 1000 qubits represents an exponential improvement in what the machine is capable of. What is it capable of? Google and its partners are still trying to figure that out. But Google has said it’s confident there are situations where the D-Wave can outperform today’s non-quantum machines, and scientists at the University of Southern California [USC] have published research suggesting that the D-Wave exhibits behavior beyond classical physics.

A quantum computer operates according to the principles of quantum mechanics, the physics of very small things, such as electrons and photons. In a classical computer, a transistor stores a single “bit” of information. If the transistor is “on,” it holds a 1, and if it’s “off,” it holds a 0. But in a quantum computer, thanks to what’s called the superposition principle, information is held in a quantum system that can exist in two states at the same time. This “qubit” can store a 0 and 1 simultaneously.

Two qubits, then, can hold four values at any given time (00, 01, 10, and 11). And as you keep increasing the number of qubits, you exponentially increase the power of the system. The problem is that building a qubit is an extremely difficult thing. If you read information from a quantum system, it “decoheres.” Basically, it turns into a classical bit that houses only a single value.
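The doubling described above can be sketched in a few lines of Python. This is illustrative only: it enumerates the classical basis states an n-qubit register spans, not actual quantum amplitudes.

```python
from itertools import product

# An n-qubit register spans 2**n classical basis states, so each
# added qubit doubles the space. This enumerates the labels only;
# it does not simulate quantum behaviour.
def basis_states(n):
    return ["".join(bits) for bits in product("01", repeat=n)]

print(basis_states(2))        # -> ['00', '01', '10', '11']
print(len(basis_states(10)))  # -> 1024, doubling with every added qubit
```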

D-Wave claims to have found a solution to the decoherence problem, and that claim appears to be borne out by the USC researchers. Still, it isn’t a general-purpose quantum computer (from Metz’s article),

… researchers at USC say that the system appears to display a phenomenon called “quantum annealing” that suggests it’s truly operating in the quantum realm. Regardless, the D-Wave is not a general quantum computer—that is, it’s not a computer for just any task. But D-Wave says the machine is well-suited to “optimization” problems, where you’re facing many, many different ways forward and must pick the best option, and to machine learning, where computers teach themselves tasks by analyzing large amounts of data.

It takes a lot of innovation to make big strides forward, and I think D-Wave is to be congratulated on producing what is, to my knowledge, the only commercially available form of quantum computing of any sort in the world.

ETA Oct. 6, 2015* at 1230 hours PST: Minutes after publishing about D-Wave I came across this item (h/t Quirks & Quarks twitter) about Australian researchers and their quantum computing breakthrough. From an Oct. 6, 2015 article by Hannah Francis for the Sydney (Australia) Morning Herald,

For decades scientists have been trying to turn quantum computing — which allows for multiple calculations to happen at once, making it immeasurably faster than standard computing — into a practical reality rather than a moonshot theory. Until now, they have largely relied on “exotic” materials to construct quantum computers, making them unsuitable for commercial production.

But researchers at the University of New South Wales have patented a new design, published in the scientific journal Nature on Tuesday, created specifically with computer industry manufacturing standards in mind and using affordable silicon, which is found in regular computer chips like those we use every day in smartphones or tablets.

“Our team at UNSW has just cleared a major hurdle to making quantum computing a reality,” the director of the university’s Australian National Fabrication Facility, Andrew Dzurak, the project’s leader, said.

“As well as demonstrating the first quantum logic gate in silicon, we’ve also designed and patented a way to scale this technology to millions of qubits using standard industrial manufacturing techniques to build the world’s first quantum processor chip.”

According to the article, the university is looking for industrial partners to help it exploit this breakthrough. Francis’s article features an embedded video, as well as more detail.

*It was Oct. 6, 2015 in Australia but Oct. 5, 2015 my side of the international date line.

ETA Oct. 6, 2015 (my side of the international date line): An Oct. 5, 2015 University of New South Wales news release on EurekAlert provides additional details.

Here’s a link to and a citation for the paper,

A two-qubit logic gate in silicon by M. Veldhorst, C. H. Yang, J. C. C. Hwang, W. Huang, J. P. Dehollain, J. T. Muhonen, S. Simmons, A. Laucht, F. E. Hudson, K. M. Itoh, A. Morello & A. S. Dzurak. Nature (2015) doi:10.1038/nature15263 Published online 05 October 2015

This paper is behind a paywall.

D-Wave passes 1000-qubit barrier

A local (Vancouver, Canada-based) quantum computing company, D-Wave, is making quite a splash lately due to a technical breakthrough. h/t to Speaking up for Canadian Science for the Business in Vancouver article, and to Nanotechnology Now for the Harris & Harris Group press release and the Economist article.

A June 22, 2015 article by Tyler Orton for Business in Vancouver describes D-Wave’s latest technical breakthrough,

“This updated processor will allow significantly more complex computational problems to be solved than ever before,” Jeremy Hilton, D-Wave’s vice-president of processor development, wrote in a June 22 [2015] blog entry.

Regular computers use bits – ones and zeroes – to make calculations, while quantum computers rely on qubits.

Qubits possess a “superposition” that allows them to be one and zero at the same time, meaning a qubit can calculate all possible values in a single operation.

But the algorithm for a full-scale quantum computer requires 8,000 qubits.

A June 23, 2015 Harris & Harris Group press release adds more information about the breakthrough,

Harris & Harris Group, Inc. (Nasdaq: TINY), an investor in transformative companies enabled by disruptive science, notes that its portfolio company, D-Wave Systems, Inc., announced that it has successfully fabricated 1,000 qubit processors that power its quantum computers. D-Wave’s quantum computer runs a quantum annealing algorithm to find the lowest points, corresponding to optimal or near optimal solutions, in a virtual “energy landscape.” Every additional qubit doubles the search space of the processor. At 1,000 qubits, the new processor considers 2^1000 possibilities simultaneously, a search space which is substantially larger than the 2^512 possibilities available to the company’s currently available 512-qubit D-Wave Two. In fact, the new search space contains far more possibilities than there are particles in the observable universe.
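The search-space arithmetic in the press release is easy to check with plain Python integers. Note the ~10^80 figure for particles in the observable universe is a commonly cited estimate, not a number from the release.

```python
# Sanity-check the press release's search-space claims.
d_wave_two = 2 ** 512     # search space of the 512-qubit D-Wave Two
new_chip   = 2 ** 1000    # search space of the new 1,000-qubit processor
particles  = 10 ** 80     # common estimate, assumed here

# Each extra qubit doubles the search space: 1000 - 512 = 488 doublings.
print(new_chip == d_wave_two * 2 ** 488)  # -> True
print(new_chip > particles)               # -> True
print(d_wave_two > particles)             # -> True: even 512 qubits exceed it
```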

A June 22, 2015 D-Wave news release, which originated the technical details about the breakthrough found in the Harris & Harris press release, provides more information along with some marketing hype (hyperbole), Note: Links have been removed,

As the only manufacturer of scalable quantum processors, D-Wave breaks new ground with every succeeding generation it develops. The new processors, comprising over 128,000 Josephson tunnel junctions, are believed to be the most complex superconductor integrated circuits ever successfully yielded. They are fabricated in part at D-Wave’s facilities in Palo Alto, CA and at Cypress Semiconductor’s wafer foundry located in Bloomington, Minnesota.

“Temperature, noise, and precision all play a profound role in how well quantum processors solve problems.  Beyond scaling up the technology by doubling the number of qubits, we also achieved key technology advances prioritized around their impact on performance,” said Jeremy Hilton, D-Wave vice president, processor development. “We expect to release benchmarking data that demonstrate new levels of performance later this year.”

The 1000-qubit milestone is the result of intensive research and development by D-Wave and reflects a triumph over a variety of design challenges aimed at enhancing performance and boosting solution quality. Beyond the much larger number of qubits, other significant innovations include:

  • Lower Operating Temperature: While the previous generation processor ran at a temperature close to absolute zero, the new processor runs 40% colder. The lower operating temperature enhances the importance of quantum effects, which increases the ability to discriminate the best result from a collection of good candidates.
  • Reduced Noise: Through a combination of improved design, architectural enhancements and materials changes, noise levels have been reduced by 50% in comparison to the previous generation. The lower noise environment enhances problem-solving performance while boosting reliability and stability.
  • Increased Control Circuitry Precision: In the testing to date, the increased precision coupled with the noise reduction has demonstrated improved precision by up to 40%. To accomplish both while also improving manufacturing yield is a significant achievement.
  • Advanced Fabrication:  The new processors comprise over 128,000 Josephson junctions (tunnel junctions with superconducting electrodes) in a 6-metal layer planar process with 0.25μm features, believed to be the most complex superconductor integrated circuits ever built.
  • New Modes of Use: The new technology expands the boundaries of ways to exploit quantum resources.  In addition to performing discrete optimization like its predecessor, firmware and software upgrades will make it easier to use the system for sampling applications.

“Breaking the 1000 qubit barrier marks the culmination of years of research and development by our scientists, engineers and manufacturing team,” said D-Wave CEO Vern Brownell. “It is a critical step toward bringing the promise of quantum computing to bear on some of the most challenging technical, commercial, scientific, and national defense problems that organizations face.”

A June 20, 2015 article in The Economist notes there is now commercial interest as it provides good introductory information about quantum computing. The article includes an analysis of various research efforts in Canada (they mention D-Wave), the US, and the UK. These excerpts don’t do justice to the article but will hopefully whet your appetite or provide an overview for anyone with limited time,

A COMPUTER proceeds one step at a time. At any particular moment, each of its bits—the binary digits it adds and subtracts to arrive at its conclusions—has a single, definite value: zero or one. At that moment the machine is in just one state, a particular mixture of zeros and ones. It can therefore perform only one calculation next. This puts a limit on its power. To increase that power, you have to make it work faster.

But bits do not exist in the abstract. Each depends for its reality on the physical state of part of the computer’s processor or memory. And physical states, at the quantum level, are not as clear-cut as classical physics pretends. That leaves engineers a bit of wriggle room. By exploiting certain quantum effects they can create bits, known as qubits, that do not have a definite value, thus overcoming classical computing’s limits.
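The Economist’s point that a qubit “does not have a definite value” can be illustrated with a toy single-qubit simulation: the state is a pair of amplitudes over 0 and 1 rather than a definite bit. This is a generic sketch, not tied to any D-Wave or Google hardware.

```python
import random

# A qubit's state is a pair of amplitudes (a, b) over |0> and |1>,
# with |a|**2 + |b|**2 == 1. A Hadamard gate turns a definite |0>
# into an equal superposition of both values.
def hadamard(a, b):
    s = 1 / 2 ** 0.5
    return (s * (a + b), s * (a - b))

def measure(a, b):
    # Reading the qubit collapses it: 0 with probability |a|**2, else 1.
    return 0 if random.random() < abs(a) ** 2 else 1

a, b = hadamard(1.0, 0.0)  # start in a definite |0>
print(round(abs(a) ** 2, 3), round(abs(b) ** 2, 3))  # -> 0.5 0.5
print(measure(a, b))       # 0 or 1, at random
```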

… The biggest question is what the qubits themselves should be made from.

A qubit needs a physical system with two opposite quantum states, such as the direction of spin of an electron orbiting an atomic nucleus. Several things which can do the job exist, and each has its fans. Some suggest nitrogen atoms trapped in the crystal lattices of diamonds. Calcium ions held in the grip of magnetic fields are another favourite. So are the photons of which light is composed (in this case the qubit would be stored in the plane of polarisation). And quasiparticles, which are vibrations in matter that behave like real subatomic particles, also have a following.

The leading candidate at the moment, though, is to use a superconductor in which the qubit is either the direction of a circulating current, or the presence or absence of an electric charge. Both Google and IBM are banking on this approach. It has the advantage that superconducting qubits can be arranged on semiconductor chips of the sort used in existing computers. That, the two firms think, should make them easier to commercialise.

Google is also collaborating with D-Wave of Vancouver, Canada, which sells what it calls quantum annealers. The field’s practitioners took much convincing that these devices really do exploit the quantum advantage, and in any case they are limited to a narrower set of problems—such as searching for images similar to a reference image. But such searches are just the type of application of interest to Google. In 2013, in collaboration with NASA and USRA, a research consortium, the firm bought a D-Wave machine in order to put it through its paces. Hartmut Neven, director of engineering at Google Research, is guarded about what his team has found, but he believes D-Wave’s approach is best suited to calculations involving fewer qubits, while Dr Martinis and his colleagues build devices with more.

It’s not clear to me if the writers at The Economist were aware of D-Wave’s latest breakthrough at the time of writing, but I think not. In any event, they (The Economist writers) have included a provocative tidbit about quantum encryption,

Documents released by Edward Snowden, a whistleblower, revealed that the Penetrating Hard Targets programme of America’s National Security Agency was actively researching “if, and how, a cryptologically useful quantum computer can be built”. In May IARPA [Intelligence Advanced Research Projects Activity], the American government’s intelligence-research arm, issued a call for partners in its Logical Qubits programme, to make robust, error-free qubits. In April, meanwhile, Tanja Lange and Daniel Bernstein of Eindhoven University of Technology, in the Netherlands, announced PQCRYPTO, a programme to advance and standardise “post-quantum cryptography”. They are concerned that encrypted communications captured now could be subjected to quantum cracking in the future. That means strong pre-emptive encryption is needed immediately.

I encourage you to read the Economist article.

Two final comments. (1) The latest piece, prior to this one, about D-Wave was in a Feb. 6, 2015 posting about then-new investment in the company. (2) A Canadian effort in the field of quantum cryptography was mentioned in a May 11, 2015 posting (scroll down about 50% of the way), which features a profile of Raymond Laflamme of the University of Waterloo’s Institute for Quantum Computing, in the context of an announcement about the science media initiative Research2Reality.

More investment money for Canada’s D-Wave Systems (quantum computing)

A Feb. 2, 2015 news item on Nanotechnology Now features D-Wave Systems (located in the Vancouver region, Canada) and its recent funding bonanza of $29 million (CAD),

Harris & Harris Group, Inc. (Nasdaq:TINY), an investor in transformative companies enabled by disruptive science, notes the announcement by portfolio company, D-Wave Systems, Inc., that it has closed $29 million (CAD) in funding from a large institutional investor, among others. This funding will be used to accelerate development of D-Wave’s quantum hardware and software and expand the software application ecosystem. This investment brings total funding in D-Wave to $174 million (CAD), with approximately $62 million (CAD) raised in 2014. Harris & Harris Group’s total investment in D-Wave is approximately $5.8 million (USD). D-Wave’s announcement also includes highlights of 2014, a year of strong growth and advancement for D-Wave.

A Jan. 29, 2015 D-Wave news release provides more details about the new investment and D-Wave’s 2014 triumphs,

D-Wave Systems Inc., the world’s first quantum computing company, today announced that it has closed $29 million in funding from a large institutional investor, among others. This funding will be used to accelerate development of D-Wave’s quantum hardware and software and expand the software application ecosystem. This investment brings total funding in D-Wave to $174 million (CAD), with approximately $62 million raised in 2014.

“The investment is a testament to the progress D-Wave continues to make as the leader in quantum computing systems,” said Vern Brownell, CEO of D-Wave. “The funding we received in 2014 will advance our quantum hardware and software development, as well as our work on leading edge applications of our systems. By making quantum computing available to more organizations, we’re driving our goal of finding solutions to the most complex optimization and machine learning applications in national defense, computing, research and finance.”

The funding follows a year of strong growth and advancement for D-Wave. Highlights include:

•    Significant progress made towards the release of the next D-Wave quantum system featuring a 1000 qubit processor, which is currently undergoing testing in D-Wave’s labs.
•    The company’s patent portfolio grew to over 150 issued patents worldwide, with 11 new U.S. patents being granted in 2014, covering aspects of D-Wave’s processor technology, systems and techniques for solving computational problems using D-Wave’s technology.
•    D-Wave Professional Services launched, providing quantum computing experts to collaborate directly with customers, and deliver training classes on the usage and programming of the D-Wave system to a number of national laboratories, businesses and universities.
•    Partnerships were established with DNA-SEQ and 1QBit, companies that are developing quantum software applications in the spheres of medicine and finance, respectively.
•    Research throughout the year continued to validate D-Wave’s work, including a study showing further evidence of quantum entanglement by D-Wave and USC  [University of Southern California] scientists, published in Physical Review X this past May.

Since 2011, some of the most prestigious organizations in the world, including Lockheed Martin, NASA, Google, USC and the Universities Space Research Association (USRA), have partnered with D-Wave to use their quantum computing systems. In 2015, these partners will continue to work with the D-Wave computer, conducting pioneering research in machine learning, optimization, and space exploration.

D-Wave, which already employs over 120 people, plans to expand hiring with the additional funding. Key areas of growth include research, processor and systems development and software engineering.

Harris & Harris Group offers a description of D-Wave which mentions nanotechnology and hosts a couple of explanatory videos,

D-Wave Systems develops an adiabatic quantum computer (QC).

Privately Held

The Market
Electronics – High Performance Computing

The Problem
Traditional or “classical” computers are constrained by the sequential character of data processing, which makes solving NP-hard (non-deterministic polynomial-time hard) problems difficult or potentially impossible in reasonable timeframes. These types of computationally intense problems are commonly observed in software verification, scheduling and logistics planning, integer programming, bioinformatics and financial portfolio optimization.

D-Wave’s Solution
D-Wave develops quantum computers that are capable of processing data using the quantum mechanical properties of matter. This use of quantum mechanics enables the identification of solutions to some NP-hard problems in a reasonable timeframe, instead of the exponential time needed by any classical digital computer. D-Wave sold and installed its first quantum computing system to a commercial customer in 2011.

Nanotechnology Factor
To function properly, the D-Wave processor requires tight control and manipulation of quantum mechanical phenomena. This control and manipulation is achieved by creating integrated circuits based on Josephson junctions and other superconducting circuitry. By choosing superconductors, D-Wave managed to combine quantum mechanical behavior with the macroscopic dimensions needed for high-yield design and manufacturing.
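The optimization problems mentioned above (scheduling, portfolio optimization, and the like) share a classical solution space that doubles with every added item. A brute-force toy example, with made-up asset data, shows the shape of the problem: pick the subset of assets with the highest return that fits a budget by checking all 2^n subsets.

```python
from itertools import combinations

# Hypothetical portfolio data: name -> (cost, return). Brute force
# examines every subset -- feasible for 4 assets, hopeless for hundreds,
# which is why these problems are framed as hard for classical machines.
assets = {"A": (4, 3), "B": (3, 2), "C": (5, 4), "D": (2, 1)}
budget = 7

feasible = [s for r in range(len(assets) + 1)
            for s in combinations(assets, r)
            if sum(assets[x][0] for x in s) <= budget]
best = max(feasible, key=lambda s: sum(assets[x][1] for x in s))
print(best, sum(assets[x][1] for x in best))  # -> ('A', 'B') 5
```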

It seems D-Wave has made some research and funding strides since I last wrote about the company in a Jan. 19, 2012 posting, although there is no mention of quantum computer sales.

Could there be a quantum internet?

We’ve always had limited success with predicting future technologies by examining current technologies. For example, the Internet and World Wide Web as we experience them today would have been unthinkable for most people in the 1950s, when computers inhabited entire buildings and satellites were a brand new technology designed for space exploration, not for bouncing communication signals around the planet. That said, this new work on a ‘quantum internet’ from Eindhoven University of Technology is quite intriguing (from a Dec. 15, 2014 news item on Nanowerk),

In the same way as we now connect computers in networks through optical signals, it could also be possible to connect future quantum computers in a ‘quantum internet’. The optical signals would then consist of individual light particles or photons. One prerequisite for a working quantum internet is control of the shape of these photons. Researchers at Eindhoven University of Technology (TU/e) and the FOM foundation  [Foundation for Fundamental Research on Matter] have now succeeded for the first time in getting this control within the required short time.

A Dec. 15, 2014 Eindhoven University of Technology (TU/e) press release, which originated the news item, describes one of the problems with a ‘quantum internet’ and the researchers’ solution,

Quantum computers could in principle communicate with each other by exchanging individual photons to create a ‘quantum internet’. The shape of the photons, in other words how their energy is distributed over time, is vital for successful transmission of information. This shape must be symmetric in time, while photons that are emitted by atoms normally have an asymmetric shape. Therefore, this process requires external control in order to create a quantum internet.

Optical cavity

Researchers at TU/e and FOM have succeeded in getting the required degree of control by embedding a quantum dot – a piece of semiconductor material that can transmit photons – into a ‘photonic crystal’, thereby creating an optical cavity. Then the researchers applied a very short electrical pulse to the cavity, which influences how the quantum dot interacts with it, and how the photon is emitted. By varying the strength of this pulse, they were able to control the shape of the transmitted photons.

Within a billionth of a second

The Eindhoven researchers are the first to achieve this, thanks to the use of electrical pulses shorter than a nanosecond (a billionth of a second). This is vital for use in quantum communication, as research leader Andrea Fiore of TU/e explains: “The emission of a photon only lasts for one nanosecond, so if you want to change anything you have to do it within that time. It’s like the shutter of a high-speed camera, which has to be very short if you want to capture something that changes very fast in an image. By controlling the speed at which you send a photon, you can in principle achieve very efficient exchange of photons, which is important for the future quantum internet.”
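The symmetry problem the press release describes can be illustrated numerically (with made-up units and pulse shapes, purely as a sketch): an atom’s natural emission rises abruptly and decays exponentially, which is asymmetric in time, while transmission favours a time-symmetric pulse such as a Gaussian.

```python
import math

ts = [i * 0.1 for i in range(1, 51)]  # sample times after the pulse peak

def exp_decay(t):   # natural emission: abrupt rise, exponential tail
    return math.exp(-t) if t >= 0 else 0.0

def gaussian(t):    # shaped, time-symmetric pulse
    return math.exp(-t * t)

def asymmetry(pulse):
    # Total mismatch between pulse(t) and its time-reverse pulse(-t);
    # zero for a perfectly time-symmetric shape.
    return sum(abs(pulse(t) - pulse(-t)) for t in ts)

print(asymmetry(gaussian))                         # -> 0.0, symmetric
print(asymmetry(exp_decay) > asymmetry(gaussian))  # -> True
```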

Here’s a link to and a citation for the paper,

Dynamically controlling the emission of single excitons in photonic crystal cavities by Francesco Pagliano, YongJin Cho, Tian Xia, Frank van Otten, Robert Johne, & Andrea Fiore. Nature Communications 5, Article number: 5786 doi:10.1038/ncomms6786 Published 15 December 2014

This is an open access paper.

ETA Dec. 16, 2014 at 1230 hours PDT: There is a copy of the Dec. 15, 2014 news release on EurekAlert.

IBM weighs in with plans for a 7nm computer chip

On the heels of Intel’s announcement about a deal utilizing their 14nm low-power manufacturing process and speculations about a 10nm computer chip (my July 9, 2014 posting), IBM makes an announcement about a 7nm chip as per this July 10, 2014 news item on Azonano,

IBM today [July 10, 2014] announced it is investing $3 billion over the next 5 years in two broad research and early stage development programs to push the limits of chip technology needed to meet the emerging demands of cloud computing and Big Data systems. These investments will push IBM’s semiconductor innovations from today’s breakthroughs into the advanced technology leadership required for the future.

A very comprehensive July 10, 2014 news release lays out the company’s plans for this $3B investment representing 10% of IBM’s total research budget,

The first research program is aimed at so-called “7 nanometer and beyond” silicon technology that will address serious physical challenges that are threatening current semiconductor scaling techniques and will impede the ability to manufacture such chips. The second is focused on developing alternative technologies for post-silicon era chips using entirely different approaches, which IBM scientists and other experts say are required because of the physical limitations of silicon based semiconductors.

Cloud and big data applications are placing new challenges on systems, just as the underlying chip technology is facing numerous significant physical scaling limits.  Bandwidth to memory, high speed communication and device power consumption are becoming increasingly challenging and critical.

The teams will comprise IBM Research scientists and engineers from Albany and Yorktown, New York; Almaden, California; and Europe. In particular, IBM will be investing significantly in emerging areas of research that are already underway at IBM such as carbon nanoelectronics, silicon photonics, new memory technologies, and architectures that support quantum and cognitive computing. [emphasis mine]

These teams will focus on providing orders of magnitude improvement in system level performance and energy efficient computing. In addition, IBM will continue to invest in the nanosciences and quantum computing–two areas of fundamental science where IBM has remained a pioneer for over three decades.

7 nanometer technology and beyond

IBM Researchers and other semiconductor experts predict that while challenging, semiconductors show promise to scale from today’s 22 nanometers down to 14 and then 10 nanometers in the next several years.  However, scaling to 7 nanometers and perhaps below, by the end of the decade will require significant investment and innovation in semiconductor architectures as well as invention of new tools and techniques for manufacturing.

“The question is not if we will introduce 7 nanometer technology into manufacturing, but rather how, when, and at what cost?” said John Kelly, senior vice president, IBM Research. “IBM engineers and scientists, along with our partners, are well suited for this challenge and are already working on the materials science and device engineering required to meet the demands of the emerging system requirements for cloud, big data, and cognitive systems. This new investment will ensure that we produce the necessary innovations to meet these challenges.”

“Scaling to 7nm and below is a terrific challenge, calling for deep physics competencies in processing nano materials affinities and characteristics. IBM is one of a very few companies who has repeatedly demonstrated this level of science and engineering expertise,” said Richard Doherty, technology research director, The Envisioneering Group.

Bridge to a “Post-Silicon” Era

Silicon transistors, tiny switches that carry information on a chip, have been made smaller year after year, but they are approaching a point of physical limitation. Their increasingly small dimensions, now reaching the nanoscale, will prohibit any gains in performance due to the nature of silicon and the laws of physics. Within a few more generations, classical scaling and shrinkage will no longer yield the sizable benefits of lower power, lower cost and higher speed processors that the industry has become accustomed to.

With virtually all electronic equipment today built on complementary metal–oxide–semiconductor (CMOS) technology, there is an urgent need for new materials and circuit architecture designs compatible with this engineering process as the technology industry nears physical scalability limits of the silicon transistor.

Beyond 7 nanometers, the challenges dramatically increase, requiring a new kind of material to power systems of the future, and new computing platforms to solve problems that are unsolvable or difficult to solve today. Potential alternatives include new materials such as carbon nanotubes, and non-traditional computational approaches such as neuromorphic computing, cognitive computing, machine learning techniques, and the science behind quantum computing.

As the leader in advanced schemes that point beyond traditional silicon-based computing, IBM holds over 500 patents for technologies that will drive advancements at 7nm and beyond silicon — more than twice the nearest competitor. These continued investments will accelerate the invention and introduction into product development for IBM’s highly differentiated computing systems for cloud, and big data analytics.

Several exploratory research breakthroughs that could lead to major advancements in delivering dramatically smaller, faster and more powerful computer chips, include quantum computing, neurosynaptic computing, silicon photonics, carbon nanotubes, III-V technologies, low power transistors and graphene:

Quantum Computing

The most basic piece of information that a typical computer understands is a bit. Much like a light that can be switched on or off, a bit can have only one of two values: “1” or “0.” A quantum bit, or qubit, can hold a “1,” a “0,” or both values at once. Described as superposition, this special property of qubits enables quantum computers to weed through millions of solutions all at once, while desktop PCs would have to consider them one at a time.
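The “one at a time” limit in the passage above can be made concrete: an unstructured classical search over N candidates checks them sequentially, about N/2 on average. (Sketch only; a quantum computer does not literally test everything in parallel, but algorithms such as Grover’s need only roughly sqrt(N) steps.)

```python
import random

# Count how many candidates a sequential classical search examines
# before finding the target.
def classical_search(n, target):
    checks = 0
    for candidate in range(n):
        checks += 1
        if candidate == target:
            return checks

n = 100_000
trials = [classical_search(n, random.randrange(n)) for _ in range(20)]
print(sum(trials) / len(trials))  # on the order of n/2 checks per search
```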

IBM is a world leader in superconducting qubit-based quantum computing science and is a pioneer in the field of experimental and theoretical quantum information, fields that are still in the category of fundamental science – but one that, in the long term, may allow the solution of problems that are today either impossible or impractical to solve using conventional machines. The team recently demonstrated the first experimental realization of parity check with three superconducting qubits, an essential building block for one type of quantum computer.

Neurosynaptic Computing

Bringing together nanoscience, neuroscience, and supercomputing, IBM and university partners have developed an end-to-end ecosystem including a novel non-von Neumann architecture, a new programming language, as well as applications. This novel technology allows for computing systems that emulate the brain’s computing efficiency, size and power usage. IBM’s long-term goal is to build a neurosynaptic system with ten billion neurons and a hundred trillion synapses, all while consuming only one kilowatt of power and occupying less than two liters of volume.
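Those targets are easier to appreciate with a quick back-of-the-envelope calculation, using only the figures quoted above (the arithmetic is mine):

```python
# Figures from IBM's stated long-term goal.
neurons = 10_000_000_000          # ten billion neurons
synapses = 100_000_000_000_000    # one hundred trillion synapses
power_w = 1_000.0                 # one kilowatt total

synapses_per_neuron = synapses / neurons        # 10,000 per neuron
power_per_neuron_nw = power_w / neurons * 1e9   # ~100 nanowatts per neuron
```

A power budget on the order of 100 nanowatts per neuron is the kind of efficiency conventional von Neumann chips cannot approach, which is the whole point of the brain-inspired architecture.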

Silicon Photonics

IBM has been a pioneer in the area of CMOS integrated silicon photonics for over 12 years, a technology that integrates functions for optical communications on a silicon chip, and the IBM team has recently designed and fabricated the world’s first monolithic silicon-photonics-based transceiver with wavelength division multiplexing. Such transceivers will use light to transmit data between different components in a computing system at high data rates, low cost, and in an energetically efficient manner.

Silicon nanophotonics takes advantage of pulses of light for communication rather than traditional copper wiring and provides a super highway for large volumes of data to move at rapid speeds between computer chips in servers, large datacenters, and supercomputers, thus alleviating the limitations of congested data traffic and high-cost traditional interconnects.

Businesses are entering a new era of computing that requires systems to process and analyze, in real time, huge volumes of information known as Big Data. Silicon nanophotonics technology provides answers to Big Data challenges by seamlessly connecting various parts of large systems, whether a few centimeters or a few kilometers apart from each other, and moving terabytes of data via pulses of light through optical fibers.
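To make "terabytes via pulses of light" concrete, here is a rough sketch with illustrative numbers; the channel count and per-channel rate are my assumptions, not figures from IBM:

```python
# Wavelength division multiplexing: several data streams share one
# fiber, each riding on its own color of light.
channels = 4               # assumed number of wavelengths per fiber
gbps_per_channel = 25.0    # assumed per-wavelength data rate

link_gbps = channels * gbps_per_channel   # 100 Gb/s aggregate per fiber
terabyte_bits = 8e12
seconds_per_terabyte = terabyte_bits / (link_gbps * 1e9)  # 80 s per TB
```

Distance barely matters at these scales: light covers a few kilometers of fiber in microseconds, so the link rate, not the separation between racks, sets the transfer time.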

III-V technologies

IBM researchers have demonstrated the world’s highest transconductance on a self-aligned III-V channel metal-oxide semiconductor (MOS) field-effect transistor (FET) device structure that is compatible with CMOS scaling. These materials and structural innovations are expected to pave the path for technology scaling at 7nm and beyond. With more than an order of magnitude higher electron mobility than silicon, integrating III-V materials into CMOS enables higher performance at lower power density, allowing for an extension of power/performance scaling to meet the demands of cloud computing and big data systems.

Carbon Nanotubes

IBM Researchers are working in the area of carbon nanotube (CNT) electronics and exploring whether CNTs can replace silicon beyond the 7 nm node.  As part of its activities for developing carbon nanotube based CMOS VLSI circuits, IBM recently demonstrated — for the first time in the world — 2-way CMOS NAND gates using 50 nm gate length carbon nanotube transistors.

IBM also has demonstrated the capability of purifying carbon nanotubes to 99.99 percent, the highest (verified) purity demonstrated to date, and transistors at 10 nm channel length that show no degradation due to scaling, a result unmatched by any other material system to date.

Carbon nanotubes are single atomic sheets of carbon rolled up into a tube. The carbon nanotubes form the core of a transistor device that will work in a fashion similar to the current silicon transistor, but will be better performing. They could be used to replace the transistors in chips that power data-crunching servers, high performing computers and ultra fast smart phones.

Carbon nanotube transistors can operate as excellent switches at molecular dimensions of less than ten nanometers – 10,000 times thinner than a strand of human hair and less than half the size of the leading silicon technology. Comprehensive modeling of the electronic circuits suggests that about a five to ten times improvement in performance compared to silicon circuits is possible.
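The "10,000 times thinner than a strand of human hair" figure checks out if you take a hair to be about 100 micrometres across (a common rough value; my assumption, not a number from IBM):

```python
hair_nm = 100_000.0      # ~100 micrometres, a typical human hair width
cnt_channel_nm = 10.0    # the sub-ten-nanometer switch dimension cited above

ratio = hair_nm / cnt_channel_nm   # 10,000x, matching the claim
```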

Graphene

Graphene is pure carbon in the form of a one atomic layer thick sheet.  It is an excellent conductor of heat and electricity, and it is also remarkably strong and flexible.  Electrons can move in graphene about ten times faster than in commonly used semiconductor materials such as silicon and silicon germanium. Its characteristics offer the possibility to build faster switching transistors than are possible with conventional semiconductors, particularly for applications in the handheld wireless communications business where it will be a more efficient switch than those currently used.

Recently in 2013, IBM demonstrated the world’s first graphene based integrated circuit receiver front end for wireless communications. The circuit consisted of a 2-stage amplifier and a down converter operating at 4.3 GHz.

Next Generation Low Power Transistors

In addition to new materials like CNTs, new architectures and innovative device concepts are required to boost future system performance. Power dissipation is a fundamental challenge for nanoelectronic circuits. To explain the challenge, consider a leaky water faucet — even after closing the valve as far as possible, water continues to drip. This is similar to today’s transistor, in that energy is constantly “leaking” or being lost or wasted in the off-state.

A potential alternative to today’s power-hungry silicon field effect transistors is so-called steep-slope devices. They could operate at much lower voltage and thus dissipate significantly less power. IBM scientists are researching tunnel field effect transistors (TFETs). In this special type of transistor, the quantum-mechanical effect of band-to-band tunneling is used to drive the current flow through the transistor. TFETs could achieve a 100-fold power reduction over complementary CMOS transistors, so integrating TFETs with CMOS technology could improve low-power integrated circuits.
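The press release doesn't define "steep slope," but the usual benchmark is the thermal (Boltzmann) limit on how sharply a conventional transistor can turn off: roughly 60 mV of gate voltage per tenfold change in current at room temperature. Band-to-band tunneling lets TFETs switch below that limit. The limit itself is a one-line calculation:

```python
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K
Q = 1.602176634e-19   # elementary charge, C

def subthreshold_limit_mv_per_decade(temp_k=300.0):
    """Minimum subthreshold swing of a conventional FET: (kT/q) * ln(10).
    Steep-slope devices such as TFETs aim to switch below this value."""
    return (K_B * temp_k / Q) * math.log(10.0) * 1000.0

limit = subthreshold_limit_mv_per_decade()  # ~60 mV/decade at 300 K
```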

Recently, IBM has developed a novel method to integrate III-V nanowires and heterostructures directly on standard silicon substrates and built the first ever InAs/Si tunnel diodes and TFETs using InAs as source and Si as channel with wrap-around gate as steep slope device for low power consumption applications.

“In the next ten years computing hardware systems will be fundamentally different as our scientists and engineers push the limits of semiconductor innovations to explore the post-silicon future,” said Tom Rosamilia, senior vice president, IBM Systems and Technology Group. “IBM Research and Development teams are creating breakthrough innovations that will fuel the next era of computing systems.”

IBM’s historic contributions to silicon and semiconductor innovation include the invention and/or first implementation of: the single cell DRAM, the “Dennard scaling laws” underpinning “Moore’s Law”, chemically amplified photoresists, copper interconnect wiring, Silicon on Insulator, strained engineering, multi core microprocessors, immersion lithography, high speed silicon germanium (SiGe), High-k gate dielectrics, embedded DRAM, 3D chip stacking, and Air gap insulators.

IBM researchers also are credited with initiating the era of nano devices following the Nobel prize winning invention of the scanning tunneling microscope which enabled nano and atomic scale invention and innovation.

IBM will also continue to fund and collaborate with university researchers to explore and develop the future technologies for the semiconductor industry. In particular, IBM will continue to support and fund university research through private-public partnerships such as the NanoElectronics Research Initiative (NRI), the Semiconductor Advanced Research Network (STARnet), and the Global Research Consortium (GRC) of the Semiconductor Research Corporation.

I highlighted ‘memory systems’ as this brings to mind HP Labs and their major investment in ‘memristive’ technologies noted in my June 26, 2014 posting,

… During a two-hour presentation held a year and a half ago, they laid out how the computer might work, its benefits, and the expectation that about 75 percent of HP Labs personnel would be dedicated to this one project. “At the end, Meg [Meg Whitman, CEO of HP] turned to [Chief Financial Officer] Cathie Lesjak and said, ‘Find them more money,’” says John Sontag, the vice president of systems research at HP, who attended the meeting and is in charge of bringing the Machine to life. “People in Labs see this as a once-in-a-lifetime opportunity.”

The Machine is based on the memristor and other associated technologies.

Getting back to IBM, there’s this analysis of the $3B investment ($600M/year for five years) by Alex Konrad in a July 10, 2014 article for Forbes (Note: A link has been removed),

When IBM … announced a $3 billion commitment to even tinier semiconductor chips that no longer depended on silicon on Wednesday, the big news was that IBM’s putting a lot of money into a future for chips where Moore’s Law no longer applies. But on second glance, the move to spend billions on more experimental ideas like silicon photonics and carbon nanotubes shows that IBM’s finally shifting large portions of its research budget into more ambitious and long-term ideas.

… IBM tells Forbes the $3 billion isn’t additional money being added to its R&D spend, an area where analysts have told Forbes they’d like to see more aggressive cash commitments in the future. IBM will still spend about $6 billion a year on R&D, 6% of revenue. Ten percent of that research budget, however, now has to come from somewhere else to fuel these more ambitious chip projects.

Neal Ungerleider’s July 11, 2014 article for Fast Company focuses on the neuromorphic computing and quantum computing aspects of this $3B initiative (Note: Links have been removed),

The new R&D initiatives fall into two categories: Developing nanotech components for silicon chips for big data and cloud systems, and experimentation with “post-silicon” microchips. This will include research into quantum computers which don’t know binary code, neurosynaptic computers which mimic the behavior of living brains, carbon nanotubes, graphene tools and a variety of other technologies.

IBM’s investment is one of the largest for quantum computing to date; the company is one of the biggest researchers in the field, along with a Canadian company named D-Wave which is partnering with Google and NASA to develop quantum computer systems.

The curious can find D-Wave Systems here. There’s also a January 19, 2012 posting here which discusses the D-Wave’s situation at that time.

Final observation, these are fascinating developments especially for the insight they provide into the worries troubling HP Labs, Intel, and IBM as they jockey for position.

ETA July 14, 2014: Dexter Johnson has a July 11, 2014 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) about the IBM announcement, which features some responses he received from IBM officials to his queries,

While this may be a matter of fascinating speculation for investors, the impact on nanotechnology development  is going to be significant. To get a better sense of what it all means, I was able to talk to some of the key figures of IBM’s push in nanotechnology research.

I conducted e-mail interviews with Tze-Chiang (T.C.) Chen, vice president science & technology, IBM Fellow at the Thomas J. Watson Research Center and Wilfried Haensch, senior manager, physics and materials for logic and communications, IBM Research.

Silicon versus Nanomaterials

First, I wanted to get a sense for how long IBM envisioned sticking with silicon and when they expected the company would permanently make the move away from CMOS to alternative nanomaterials. Unfortunately, as expected, I didn’t get solid answers, except for them to say that new manufacturing tools and techniques need to be developed now.

He goes on to ask about carbon nanotubes and graphene. Interestingly, IBM does not have a wide range of electronics applications in mind for graphene.  I encourage you to read Dexter’s posting as Dexter got answers to some very astute and pointed questions.

Graphene, Perimeter Institute, and condensed matter physics

In short, researchers at Canada’s Perimeter Institute are working on theoretical models involving graphene, which could lead to quantum computing. A July 3, 2014 Perimeter Institute news release by Erin Bow (also on EurekAlert) provides some insight into the connections between graphene and condensed matter physics (Note: Bow has included some good basic explanations of graphene, quasiparticles, and more for beginners),

One of the hottest materials in condensed matter research today is graphene.

Graphene had an unlikely start: it began with researchers messing around with pencil marks on paper. Pencil “lead” is actually made of graphite, which is a soft crystal lattice made of nothing but carbon atoms. When pencils deposit that graphite on paper, the lattice is laid down in thin sheets. By pulling that lattice apart into thinner sheets – originally using Scotch tape – researchers discovered that they could make flakes of crystal just one atom thick.

The name for this atom-scale chicken wire is graphene. Those folks with the Scotch tape, Andre Geim and Konstantin Novoselov, won the 2010 Nobel Prize for discovering it. “As a material, it is completely new – not only the thinnest ever but also the strongest,” wrote the Nobel committee. “As a conductor of electricity, it performs as well as copper. As a conductor of heat, it outperforms all other known materials. It is almost completely transparent, yet so dense that not even helium, the smallest gas atom, can pass through it.”

Developing a theoretical model of graphene

Graphene is not just a practical wonder – it’s also a wonderland for theorists. Confined to the two-dimensional surface of the graphene, the electrons behave strangely. All kinds of new phenomena can be seen, and new ideas can be tested. Testing new ideas in graphene is exactly what Perimeter researchers Zlatko Papić and Dmitry (Dima) Abanin set out to do.

“Dima and I started working on graphene a very long time ago,” says Papić. “We first met in 2009 at a conference in Sweden. I was a grad student and Dima was in the first year of his postdoc, I think.”

The two young scientists got to talking about what new physics they might be able to observe in the strange new material when it is exposed to a strong magnetic field.

“We decided we wanted to model the material,” says Papić. They’ve been working on their theoretical model of graphene, on and off, ever since. The two are now both at Perimeter Institute, where Papić is a postdoctoral researcher and Abanin is a faculty member. They are both cross-appointed with the Institute for Quantum Computing (IQC) at the University of Waterloo.

In January 2014, they published a paper in Physical Review Letters presenting new ideas about how to induce a strange but interesting state in graphene – one where it appears as if particles inside it have a fraction of an electron’s charge.

It’s called the fractional quantum Hall effect (FQHE), and it’s head turning. Like the speed of light or Planck’s constant, the charge of the electron is a fixed point in the disorienting quantum universe.

Every system in the universe carries whole multiples of a single electron’s charge. When the FQHE was first discovered in the 1980s, condensed matter physicists quickly worked out that the fractionally charged “particles” inside their semiconductors were actually quasiparticles – that is, emergent collective behaviours of the system that imitate particles.

Graphene is an ideal material in which to study the FQHE. “Because it’s just one atom thick, you have direct access to the surface,” says Papić. “In semiconductors, where FQHE was first observed, the gas of electrons that create this effect are buried deep inside the material. They’re hard to access and manipulate. But with graphene you can imagine manipulating these states much more easily.”

In the January paper, Abanin and Papić reported novel types of FQHE states that could arise in bilayer graphene – that is, in two sheets of graphene laid one on top of another – when it is placed in a strong perpendicular magnetic field. In an earlier work from 2012, they argued that applying an electric field across the surface of bilayer graphene could offer a unique experimental knob to induce transitions between FQHE states. Combining the two effects, they argued, would be an ideal way to look at special FQHE states and the transitions between them.

Once the scientists developed their theory they went to work on some experiments,

Two experimental groups – one in Geneva, involving Abanin, and one at Columbia, involving both Abanin and Papić – have since put the electric field + magnetic field method to good use. The paper by the Columbia group appears in the July 4 issue of Science. A third group, led by Amir Yacoby of Harvard, is doing closely related work.

“We often work hand-in-hand with experimentalists,” says Papić. “One of the reasons I like condensed matter is that often even the most sophisticated, cutting-edge theory stands a good chance of being quickly checked with experiment.”

Inside both the magnetic and electric field, the electrical resistance of the graphene demonstrates the strange behaviour characteristic of the FQHE. Instead of resistance that varies in a smooth curve with voltage, resistance jumps suddenly from one level to another, and then plateaus – a kind of staircase of resistance. Each stair step is a different state of matter, defined by the complex quantum tangle of charges, spins, and other properties inside the graphene.
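The heights of those resistance stair steps are set by fundamental constants: each plateau sits at h/(νe²), where ν is the "filling factor" (an integer for the ordinary quantum Hall effect, a fraction such as 1/3 for the FQHE). A quick calculation of my own shows the scale involved:

```python
H = 6.62607015e-34    # Planck constant, J*s
E = 1.602176634e-19   # elementary charge, C

def hall_plateau_ohms(filling_factor):
    """Quantized Hall resistance R_xy = h / (nu * e^2)."""
    return H / (filling_factor * E ** 2)

r_integer = hall_plateau_ohms(1)       # ~25,813 ohms, the von Klitzing constant
r_fqhe = hall_plateau_ohms(1.0 / 3.0)  # a fractional step, three times higher
```

The striking thing is that these plateau values depend only on h and e, not on the sample, which is why the effect is used as a resistance standard.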

“The number of states is quite rich,” says Papić. “We’re very interested in bilayer graphene because of the number of states we are detecting and because we have these mechanisms – like tuning the electric field – to study how these states are interrelated, and what happens when the material changes from one state to another.”

For the moment, researchers are particularly interested in the stair steps whose “height” is described by a fraction with an even denominator. That’s because the quasiparticles in that state are expected to have an unusual property.

There are two kinds of particles in our three-dimensional world: fermions (such as electrons), where two identical particles can’t occupy one state, and bosons (such as photons), where two identical particles actually want to occupy one state. In three dimensions, fermions are fermions and bosons are bosons, and never the twain shall meet.

But a sheet of graphene doesn’t have three dimensions – it has two. It’s effectively a tiny two-dimensional universe, and in that universe, new phenomena can occur. For one thing, fermions and bosons can meet halfway – becoming anyons, which can be anywhere in between fermions and bosons. The quasiparticles in these special stair-step states are expected to be anyons.

In particular, the researchers are hoping these quasiparticles will be non-Abelian anyons, as their theory indicates they should be. That would be exciting because non-Abelian anyons can be used in the making of qubits.

Graphene qubits?

Qubits are to quantum computers what bits are to ordinary computers: both a basic unit of information and the basic piece of equipment that stores that information. Because of their quantum complexity, qubits are more powerful than ordinary bits and their power grows exponentially as more of them are added. A quantum computer of only a hundred qubits can tackle certain problems beyond the reach of even the best non-quantum supercomputers. Or, it could, if someone could find a way to build stable qubits.
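The "power grows exponentially" claim can be made concrete: a full classical description of n qubits must track 2^n complex amplitudes, which is why a hundred qubits outruns any conceivable classical memory. A quick illustration:

```python
def amplitudes_needed(n_qubits):
    """A full classical description of n qubits requires 2**n complex
    amplitudes: the source of the exponential growth in power."""
    return 2 ** n_qubits

# 10 qubits: about a thousand amplitudes, trivial for a laptop.
small = amplitudes_needed(10)

# 100 qubits: ~1.3e30 amplitudes; at 16 bytes each that is ~2e31 bytes,
# vastly more memory than exists in all the world's computers combined.
memory_bytes = amplitudes_needed(100) * 16
```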

The drive to make qubits is part of the reason why graphene is a hot research area in general, and why even-denominator FQHE states – with their special anyons – are sought after in particular.

“A state with some number of these anyons can be used to represent a qubit,” says Papić. “Our theory says they should be there and the experiments seem to bear that out – certainly the even-denominator FQHE states seem to be there, at least according to the Geneva experiments.”

That’s still a step away from experimental proof that those even-denominator stair-step states actually contain non-Abelian anyons. More work remains, but Papić is optimistic: “It might be easier to prove in graphene than it would be in semiconductors. Everything is happening right at the surface.”

It’s still early, but it looks as if bilayer graphene may be the magic material that allows this kind of qubit to be built. That would be a major mark on the unlikely line between pencil lead and quantum computers.

Here are links for further research,

January PRL paper mentioned above: http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.112.046602

Experimental paper from the Geneva graphene group, including Abanin: http://pubs.acs.org/doi/abs/10.1021/nl5003922

Experimental paper from the Columbia graphene group, including both Abanin and Papić: http://arxiv.org/abs/1403.2112. This paper is featured in the journal Science.

Related experiment on bilayer graphene by Amir Yacoby’s group at Harvard: http://www.sciencemag.org/content/early/2014/05/28/science.1250270

The Nobel Prize press release on graphene, mentioned above: http://www.nobelprize.org/nobel_prizes/physics/laureates/2010/press.html

I recently posted a piece about some research into the ‘scotch-tape technique’ for isolating graphene (June 30, 2014 posting). Amusingly, Geim argued against calling it the ‘scotch-tape technique,’ something I found out only recently.

Technion-Israel Institute of Technology and the University of Waterloo (Canada) together at last

A March 18, 2014 University of Waterloo news release describes a new agreement signed at a joint Technion-Israel Institute of Technology-University of Waterloo conference held in Israel.

“As two of the world’s top innovation universities, the University of Waterloo and Technion are natural partners,” said Feridun Hamdullahpur, president and vice-chancellor of the University of Waterloo. “This partnership positions both Waterloo and Technion for accelerated progress in the key areas of quantum information science, nanotechnology, and water. [emphasis mine] These disciplines will help to shape the future of communities, industries, and everyday life.”

The conference to mark the start of the new partnership, and a reciprocal event in Waterloo planned for later in 2014, is funded by a donation to the University of Waterloo from The Gerald Schwartz & Heather Reisman Foundation.

“The agreement between the University of Waterloo and Technion will lead to joint research projects between Israeli and Canadian scientists in areas crucial for making our world a better place,” said Peretz Lavie, president of Technion. “I could not think of a better partner for such projects than the University of Waterloo.”

The new partnership agreement will connect students and faculty from both institutions with global markets through technology transfer and commercialization opportunities with industrial partners in Canada and in Israel.

“This partnership between two global innovation leaders puts in place the conditions to support research breakthroughs and new opportunities for commercialization on an international scale,” said George Dixon, vice-president of research at Waterloo. “University of Waterloo and Technion have a history of research collaboration going back almost 20 years.”

Which one of these items does not fit on the list “quantum information science, nanotechnology, and water?” I pick water. I think they mean water remediation or water desalination or, perhaps, water research.

Given the issues with the lack of potable water in that region the interest in water is eminently understandable. (My Feb. 24, 2014 posting mentions the situation in the Middle East in the context of water desalination research at a new nanotechnology centre at Oman’s Sultan Qaboos University.)

Carbon nanotubes, good vibrations, and quantum computing

Apparently carbon nanotubes can store information within their vibrations and this could have implications for quantum computing, from the Mar. 21, 2013 news release on EurekAlert,

A carbon nanotube that is clamped at both ends can be excited to oscillate. Like a guitar string, it vibrates for an amazingly long time. “One would expect that such a system would be strongly damped, and that the vibration would subside quickly,” says Simon Rips, first author of the publication. “In fact, the string vibrates more than a million times. The information is thus retained up to one second. That is long enough to work with.”

Since such a string oscillates among many physically equivalent states, the physicists resorted to a trick: an electric field in the vicinity of the nanotube ensures that two of these states can be selectively addressed. The information can then be written and read optoelectronically. “Our concept is based on available technology,” says Michael Hartmann, head of the Emmy Noether research group Quantum Optics and Quantum Dynamics at the TU Muenchen. “It could take us a step closer to the realization of a quantum computer.”
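A rough consistency check on the numbers in the quote: if the nanotube completes its million-plus oscillations within the one-second retention window, the mechanical mode must be ringing at around a megahertz. That inference is mine, drawn only from the quoted figures, not a number given in the release:

```python
oscillations = 1_000_000   # "vibrates more than a million times"
retention_s = 1.0          # "retained up to one second"

implied_freq_hz = oscillations / retention_s  # ~1 MHz mechanical mode
period_us = 1e6 / implied_freq_hz             # ~1 microsecond per cycle
```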

The research paper can be found here,

Quantum Information Processing with Nanomechanical Qubits
Simon Rips and Michael J. Hartmann,
Physical Review Letters, 110, 120503 (2013) DOI: 10.1103/PhysRevLett.110.120503
Link: http://prl.aps.org/abstract/PRL/v110/i12/e120503

The paper is behind a paywall.

There are two Good Vibrations songs on YouTube, one by the Beach Boys and one by Marky Mark. I decided to go with this Beach Boys version in part due to its technical description at http://youtu.be/NwrKKbaClME,

FIRST TRUE STEREO version with lead vocals properly placed in the mix. I also restored the original full length of the bridge that was edited out of the released version. An official true stereo mix of the vocal version was not made back in 1967. While there are other “stereo” versions posted, for the most part they are “fake” or poor stereo versions. I tried to make the best judicious decision on sound quality, stereo imaging and mastering while maintaining TRUE STEREO integrity given the source parts available. I hope you enjoy it!

The video,

What is a diamond worth?

A couple of diamond-related news items have crossed my path lately causing me to consider diamonds and their social implications. I’ll start first with the news items, according to an April 4, 2012 news item on physorg.com a quantum computer has been built inside a diamond (from the news item),

Diamonds are forever – or, at least, the effects of this diamond on quantum computing may be. A team that includes scientists from USC has built a quantum computer in a diamond, the first of its kind to include protection against “decoherence” – noise that prevents the computer from functioning properly.

I last mentioned decoherence in my July 21, 2011 posting about a joint (University of British Columbia, University of California at Santa Barbara and the University of Southern California) project on quantum computing.

According to the April 5, 2012 news item by Robert Perkins for the University of Southern California (USC),

The multinational team included USC professor Daniel Lidar and USC postdoctoral researcher Zhihui Wang, as well as researchers from the Delft University of Technology in the Netherlands, Iowa State University and the University of California, Santa Barbara. The findings were published today in Nature.

The team’s diamond quantum computer system featured two quantum bits, or qubits, made of subatomic particles.

As opposed to traditional computer bits, which can encode distinctly either a one or a zero, qubits can encode a one and a zero at the same time. This property, called superposition, along with the ability of quantum states to “tunnel” through energy barriers, some day will allow quantum computers to perform optimization calculations much faster than traditional computers.

Like all diamonds, the diamond used by the researchers has impurities – things other than carbon. The more impurities in a diamond, the less attractive it is as a piece of jewelry because it makes the crystal appear cloudy.

The team, however, utilized the impurities themselves.

A rogue nitrogen nucleus became the first qubit. In a second flaw sat an electron, which became the second qubit. (Though put more accurately, the “spin” of each of these subatomic particles was used as the qubit.)

Electrons are smaller than nuclei and perform computations much more quickly, but they also fall victim more quickly to decoherence. A qubit based on a nucleus, which is large, is much more stable but slower.

“A nucleus has a long decoherence time – in the milliseconds. You can think of it as very sluggish,” said Lidar, who holds appointments at the USC Viterbi School of Engineering and the USC Dornsife College of Letters, Arts and Sciences.

Though solid-state computing systems have existed before, this was the first to incorporate decoherence protection – using microwave pulses to continually switch the direction of the electron spin rotation.

“It’s a little like time travel,” Lidar said, because switching the direction of rotation time-reverses the inconsistencies in motion as the qubits move back to their original position.
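The refocusing trick is a classic spin-echo idea, and a toy simulation shows why it works. In this sketch (my own illustration, not the USC team's model), an ensemble of spins precesses at slightly different rates; left alone, their phases fan out, but reversing the sense of rotation halfway through cancels the accumulated spread exactly:

```python
import math

# Spins precess at slightly different rates (static disorder).
freqs = [2.0 * i / 2000 for i in range(2000)]
t = 40.0  # total evolution time, arbitrary units

def coherence(phases):
    """Magnitude of the averaged phase factor: 1.0 = fully coherent,
    near 0 = completely dephased."""
    re = sum(math.cos(p) for p in phases) / len(phases)
    im = sum(math.sin(p) for p in phases) / len(phases)
    return math.hypot(re, im)

# Free evolution: each spin accumulates phase f*t; the ensemble dephases.
dephased = coherence([f * t for f in freqs])

# Echo: evolve for t/2, flip the rotation sense, evolve for t/2.
# The second half undoes the first and every spin returns to phase zero.
refocused = coherence([f * (t / 2) - f * (t / 2) for f in freqs])
```

Real dynamical decoupling repeats such flips rapidly so that even slowly fluctuating noise averages away, which is what the microwave pulse train in the diamond experiment does.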

Here’s an image I downloaded from the USC webpage hosting Perkins’s news item,

The diamond in the center measures 1 mm X 1 mm. Photo/Courtesy of Delft University of Technology/UC Santa Barbara

I’m not sure what they were trying to illustrate with the image but I thought it would provide an interesting contrast to the video which follows about the world’s first all-diamond ring,

I first came across this ring in Laura Hibberd’s March 22, 2012 piece for Huffington Post. For anyone who feels compelled to find out more about it, here’s the jeweller’s (Shawish) website.

What with the posting about Neal Stephenson and Diamond Age (aka, The Diamond Age Or A Young Lady’s Illustrated Primer; a novel that integrates nanotechnology into a story about the future and ubiquitous diamonds), a quantum computer in a diamond, and this ring, I’ve started to wonder about the role diamonds will have in society. Will they be integrated into everyday objects or will they remain objects of desire? My guess is that the diamonds we create by manipulating carbon atoms will be considered everyday items while the ones which have been formed in the bowels of the earth will retain their status.

Rail system and choreography metaphors in a couple of science articles

If you are going to use a metaphor/analogy when you’re writing about a science topic, whether because you want to reach beyond an audience that’s expert in the topic you’re covering, grab attention from an audience that’s inundated with material, or play (for writers, this can be a form of play [for this writer, anyway]), I think you need to remain true to your metaphor. I realize that’s a lot tougher than it sounds.

I’ve got examples of the use of metaphors/analogies in two recent pieces of science writing.

First, here’s the title for a Jan. 23, 2012 article by Samantha Chan for The Asian Scientist,

Scientists Build DNA Rail System For Nanomotors, Complete With Tracks & Switches

Then, there’s the text where the analogy/metaphor of a railway system with tracks and switchers is developed further and abandoned for origami tiles,

Expanding on previous work with engines traveling on straight tracks, a team of researchers at Kyoto University and the University of Oxford have used DNA building blocks to construct a motor capable of navigating a programmable network of tracks with multiple switches.

In this latest effort, the scientists built a network of tracks and switches atop DNA origami tiles, which made it possible for motor molecules to travel along these rail systems.

Sometimes, the material at hand is the issue. ‘DNA origami tiles’ is an established term in this field, so Chan can’t change it to ‘DNA origami ties’, which would have fit the railway analogy. By the way, the analogy itself comes from (or was influenced by) the title the scientists chose for their paper published in Nature Nanotechnology (it’s behind a paywall),

A DNA-based molecular motor that can navigate a network of tracks

All in all, this was a skillful attempt to get the most out of a metaphor/analogy.

For my second example, I’m using a Jan. 12, 2012 news release by John Sullivan for Princeton University, which was published as a Jan. 12, 2012 news item on Nanowerk. Here’s the headline from Princeton,

Ten-second dance of electrons is step toward exotic new computers

This sets up the text for the first few paragraphs (found in both the Princeton news release and the Nanowerk news item),

In the basement of Hoyt Laboratory at Princeton University, Alexei Tyryshkin clicked a computer mouse and sent a burst of microwaves washing across a silicon crystal suspended in a frozen cylinder of stainless steel.

The waves pulsed like distant music across the crystal and deep within its heart, billions of electrons started spinning to their beat.

Reaching into the silicon crystal and choreographing the dance of 100 billion infinitesimal particles is an impressive achievement on its own, but it is also a stride toward developing the technology for powerful machines known as quantum computers.

Sullivan has written some very appealing text for an audience who may or may not know about quantum computers.

Somebody on Nanowerk changed the headline to this,

Choreographing dance of electrons offers promise in pursuit of quantum computers

Here, the title has been skilfully reworded for an audience that knows more about quantum computers, while retaining the metaphor. Nicely done.

Sullivan’s text goes on to provide a fine explanation of an issue in quantum computing (maintaining coherence) for an audience not expert in quantum computing. The one niggle I do have is a shift in the metaphor,

To understand why it is so hard, imagine circus performers spinning plates on the top of sticks. Now imagine a strong wind blasting across the performance space, upending the plates and sending them crashing to the ground. In the subatomic realm, that wind is magnetism, and much of the effort in the experiment goes to minimizing its effect. By using a magnetically calm material like silicon-28, the researchers are able to keep the electrons spinning together for much longer.

Wasn’t there a way to stay with dance? You could have had dancers spinning props, or perhaps the dancers themselves being blown off course, and avoided the circus performers altogether. Yes, the circus is more colourful and appealing but, in this instance, I would have worked to maintain the metaphor first introduced, assuming I’d noticed that I’d switched metaphors.

So, I think I can safely say that using metaphors is tougher than it looks.