Tag Archives: IARPA

Nucleic acid-based memory storage

We’re running out of memory. To be more specific, there are two problems: a finite supply of silicon and a limit to how much data silicon-based memory can store. An April 27, 2016 news item on Nanowerk announces a nucleic acid-based approach to solving the memory problem,

A group of Boise State [Boise State University in Idaho, US] researchers, led by associate professor of materials science and engineering and associate dean of the College of Innovation and Design Will Hughes, is working toward a better way to store digital information using nucleic acid memory (NAM).

An April 25, 2016 Boise State University news release, which originated the news item, expands on the theme of computer memory and provides more details about the approach,

It’s no secret that as a society we generate vast amounts of data each year. So much so that server farms today draw roughly 30 billion watts of electricity, equivalent to the output of 30 nuclear power plants.

And the demand keeps growing. The global flash memory market is predicted to reach $30.2 billion this year, potentially growing to $80.3 billion by 2025. Experts estimate that by 2040, the demand for global memory will exceed the projected supply of silicon (the raw material used to make flash memory). Furthermore, electronic memory is rapidly approaching its fundamental size limits because of the difficulty of storing electrons in small dimensions.

Hughes, with post-doctoral researcher Reza Zadegan and colleagues Victor Zhirnov (Semiconductor Research Corporation), Gurtej Sandhu (Micron Technology Inc.) and George Church (Harvard University), is looking to DNA molecules to solve the problem. Nucleic acid — the “NA” in “DNA” — far surpasses electronic memory in retention time, according to the researchers, while also providing greater information density and lower energy of operation.

Their conclusions are outlined in an invited commentary in the prestigious journal Nature Materials published earlier this month.

“DNA is the data storage material of life in general,” said Hughes. “Because of its physical and chemical properties, it also may become the data storage material of our lives.” It may sound like science fiction, but Hughes will participate in an invitation-only workshop this month at the Intelligence Advanced Research Projects Activity (IARPA) to envision a portable DNA hard drive that would have 500 terabytes of searchable data – that’s about the size of the Library of Congress Web Archive.

“When information bits are encoded into polymer strings, researchers and manufacturers can manage and manipulate physical, chemical and biological information with standard molecular biology techniques,” the paper [in Nature Materials] states.

Cost-competitive technologies to read and write DNA could lead to real-world applications ranging from artificial chromosomes, digital hard drives and information-management systems, to a platform for watermarking and tracking genetic content or next-generation encryption tools that necessitate physical rather than electronic embodiment.

Here’s how it works. Current binary code uses 0’s and 1’s to represent bits of information. A computer program then accesses a specific decoder to turn the numbers back into usable data. With nucleic acid memory, 0’s and 1’s are replaced with the nucleotides A, T, C and G. Known as monomers, they are covalently bonded to form longer polymer chains, also known as information strings.
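To make the encoding idea concrete, here’s a minimal Python sketch of the simplest possible mapping, two bits per nucleotide. This is my own illustration, not the scheme described in the paper; practical DNA encodings also have to add error correction and avoid trouble-prone sequences such as long runs of the same base,

```python
# A toy two-bits-per-nucleotide codec (illustrative only).
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(bits: str) -> str:
    """Turn an even-length binary string into a nucleotide string."""
    assert len(bits) % 2 == 0, "pad the bit string to an even length first"
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> str:
    """Recover the binary string from a nucleotide string."""
    return "".join(BASE_TO_BITS[base] for base in strand)

message = format(ord("B"), "08b")  # 'B' -> '01000010'
strand = encode(message)           # -> 'CAAG'
assert decode(strand) == message
```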

Because of its extraordinary storage density, DNA could contain all the information in the world in a small box measuring 10 x 10 x 10 centimeters. NAM could thus be used as a sustainable time capsule for massive scientific, financial, governmental, historical, genealogical, personal and genetic records.
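A quick back-of-envelope check of that claim (my own arithmetic, not the paper’s; the 1 bit per cubic nanometre figure is a round-number assumption about DNA’s theoretical density),

```python
# Rough sanity check of the "10 x 10 x 10 cm box" claim, assuming a
# theoretical density of ~1 bit per cubic nanometre (my assumption).
bits_per_nm3 = 1.0
nm3_per_cm3 = 1e21                 # (1e7 nm per cm) cubed
box_volume_cm3 = 10 * 10 * 10      # a cube 10 cm on a side
capacity_bits = bits_per_nm3 * nm3_per_cm3 * box_volume_cm3
capacity_zb = capacity_bits / 8 / 1e21   # zettabytes
print(f"~{capacity_zb:.0f} ZB")    # ~125 ZB, the same order of magnitude as
# estimates of all the digital data in the world in the mid-2010s
```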

Better yet, DNA can store digital information for a very long time – thousands to millions of years. Usable information has already been extracted from DNA in bones that are 700,000 years old, making nucleic acid memory a promising archival material. Nucleic acid memory also uses 100 million times less energy than storing data electronically in flash, and the data can live on for generations.

At Boise State, Hughes and Zadegan are examining DNA’s stability under extreme conditions. DNA strands are subjected to temperatures ranging from -20 to 100 degrees Celsius, and to a variety of UV exposures, to see whether they retain their information. What they’re finding is that far less information is lost with NAM than with today’s industry-standard storage media.

Here’s a link to and a citation for the Nature Materials paper,

Nucleic acid memory by Victor Zhirnov, Reza M. Zadegan, Gurtej S. Sandhu, George M. Church, & William L. Hughes. Nature Materials 15, 366–370 (2016). doi:10.1038/nmat4594 Published online 23 March 2016

This paper is behind a paywall.

D-Wave passes 1000-qubit barrier

A local (Vancouver, Canada-based) quantum computing company, D-Wave, is making quite a splash lately due to a technical breakthrough. h/t to Speaking up for Canadian Science for the Business in Vancouver article, and to Nanotechnology Now for the Harris & Harris Group press release and the Economist article.

A June 22, 2015 article by Tyler Orton for Business in Vancouver describes D-Wave’s latest technical breakthrough,

“This updated processor will allow significantly more complex computational problems to be solved than ever before,” Jeremy Hilton, D-Wave’s vice-president of processor development, wrote in a June 22 [2015] blog entry.

Regular computers use bits – ones and zeroes – to make calculations, while quantum computers rely on qubits.

Qubits possess a “superposition” that allows them to be one and zero at the same time, meaning a quantum computer can consider all possible values in a single operation.

But the algorithm for a full-scale quantum computer requires 8,000 qubits.
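To see what that doubling looks like in practice, here’s a toy state-vector sketch (my own illustration using NumPy; a real quantum computer doesn’t store these amplitudes explicitly). An n-qubit register in equal superposition spreads its state over 2^n basis states at once,

```python
import numpy as np

def uniform_superposition(n_qubits: int) -> np.ndarray:
    """State vector of n qubits in equal superposition: 2**n amplitudes."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

for n in range(1, 5):
    state = uniform_superposition(n)
    print(f"{n} qubit(s) -> {state.size} amplitudes")
# 1 qubit(s) -> 2 amplitudes
# 2 qubit(s) -> 4 amplitudes
# 3 qubit(s) -> 8 amplitudes
# 4 qubit(s) -> 16 amplitudes
```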

A June 23, 2015 Harris & Harris Group press release adds more information about the breakthrough,

Harris & Harris Group, Inc. (Nasdaq: TINY), an investor in transformative companies enabled by disruptive science, notes that its portfolio company, D-Wave Systems, Inc., announced that it has successfully fabricated 1,000 qubit processors that power its quantum computers.  D-Wave’s quantum computer runs a quantum annealing algorithm to find the lowest points, corresponding to optimal or near optimal solutions, in a virtual “energy landscape.”  Every additional qubit doubles the search space of the processor.  At 1,000 qubits, the new processor considers 21000 possibilities simultaneously, a search space which is substantially larger than the 2512 possibilities available to the company’s currently available 512 qubit D-Wave Two. In fact, the new search space contains far more possibilities than there are particles in the observable universe.

A June 22, 2015 D-Wave news release, which originated the technical details about the breakthrough found in the Harris & Harris press release, provides more information along with some marketing hype (hyperbole). Note: Links have been removed,

As the only manufacturer of scalable quantum processors, D-Wave breaks new ground with every succeeding generation it develops. The new processors, comprising over 128,000 Josephson tunnel junctions, are believed to be the most complex superconductor integrated circuits ever successfully yielded. They are fabricated in part at D-Wave’s facilities in Palo Alto, CA and at Cypress Semiconductor’s wafer foundry located in Bloomington, Minnesota.

“Temperature, noise, and precision all play a profound role in how well quantum processors solve problems.  Beyond scaling up the technology by doubling the number of qubits, we also achieved key technology advances prioritized around their impact on performance,” said Jeremy Hilton, D-Wave vice president, processor development. “We expect to release benchmarking data that demonstrate new levels of performance later this year.”

The 1000-qubit milestone is the result of intensive research and development by D-Wave and reflects a triumph over a variety of design challenges aimed at enhancing performance and boosting solution quality. Beyond the much larger number of qubits, other significant innovations include:

  • Lower Operating Temperature: While the previous generation processor ran at a temperature close to absolute zero, the new processor runs 40% colder. The lower operating temperature enhances the importance of quantum effects, which increases the ability to discriminate the best result from a collection of good candidates.
  • Reduced Noise: Through a combination of improved design, architectural enhancements and materials changes, noise levels have been reduced by 50% in comparison to the previous generation. The lower noise environment enhances problem-solving performance while boosting reliability and stability.
  • Increased Control Circuitry Precision: In the testing to date, the combination of improved control circuitry precision and reduced noise has increased precision by up to 40%. To accomplish both while also improving manufacturing yield is a significant achievement.
  • Advanced Fabrication:  The new processors comprise over 128,000 Josephson junctions (tunnel junctions with superconducting electrodes) in a 6-metal layer planar process with 0.25μm features, believed to be the most complex superconductor integrated circuits ever built.
  • New Modes of Use: The new technology expands the boundaries of ways to exploit quantum resources.  In addition to performing discrete optimization like its predecessor, firmware and software upgrades will make it easier to use the system for sampling applications.

“Breaking the 1000 qubit barrier marks the culmination of years of research and development by our scientists, engineers and manufacturing team,” said D-Wave CEO Vern Brownell. “It is a critical step toward bringing the promise of quantum computing to bear on some of the most challenging technical, commercial, scientific, and national defense problems that organizations face.”

A June 20, 2015 article in The Economist notes the growing commercial interest in quantum computing and provides good introductory information about the field. The article includes an analysis of various research efforts in Canada (they mention D-Wave), the US, and the UK. These excerpts don’t do justice to the article but will hopefully whet your appetite or provide an overview for anyone with limited time,

A COMPUTER proceeds one step at a time. At any particular moment, each of its bits—the binary digits it adds and subtracts to arrive at its conclusions—has a single, definite value: zero or one. At that moment the machine is in just one state, a particular mixture of zeros and ones. It can therefore perform only one calculation next. This puts a limit on its power. To increase that power, you have to make it work faster.

But bits do not exist in the abstract. Each depends for its reality on the physical state of part of the computer’s processor or memory. And physical states, at the quantum level, are not as clear-cut as classical physics pretends. That leaves engineers a bit of wriggle room. By exploiting certain quantum effects they can create bits, known as qubits, that do not have a definite value, thus overcoming classical computing’s limits.

… The biggest question is what the qubits themselves should be made from.

A qubit needs a physical system with two opposite quantum states, such as the direction of spin of an electron orbiting an atomic nucleus. Several things which can do the job exist, and each has its fans. Some suggest nitrogen atoms trapped in the crystal lattices of diamonds. Calcium ions held in the grip of magnetic fields are another favourite. So are the photons of which light is composed (in this case the qubit would be stored in the plane of polarisation). And quasiparticles, which are vibrations in matter that behave like real subatomic particles, also have a following.

The leading candidate at the moment, though, is to use a superconductor in which the qubit is either the direction of a circulating current, or the presence or absence of an electric charge. Both Google and IBM are banking on this approach. It has the advantage that superconducting qubits can be arranged on semiconductor chips of the sort used in existing computers. That, the two firms think, should make them easier to commercialise.

Google is also collaborating with D-Wave of Vancouver, Canada, which sells what it calls quantum annealers. The field’s practitioners took much convincing that these devices really do exploit the quantum advantage, and in any case they are limited to a narrower set of problems—such as searching for images similar to a reference image. But such searches are just the type of application of interest to Google. In 2013, in collaboration with NASA and USRA, a research consortium, the firm bought a D-Wave machine in order to put it through its paces. Hartmut Neven, director of engineering at Google Research, is guarded about what his team has found, but he believes D-Wave’s approach is best suited to calculations involving fewer qubits, while Dr Martinis and his colleagues build devices with more.

It’s not clear to me if the writers at The Economist were aware of D-Wave’s latest breakthrough at the time of writing, but I think not. In any event, they (The Economist writers) have included a provocative tidbit about quantum encryption,

Documents released by Edward Snowden, a whistleblower, revealed that the Penetrating Hard Targets programme of America’s National Security Agency was actively researching “if, and how, a cryptologically useful quantum computer can be built”. In May IARPA [Intelligence Advanced Research Projects Activity], the American government’s intelligence-research arm, issued a call for partners in its Logical Qubits programme, to make robust, error-free qubits. In April, meanwhile, Tanja Lange and Daniel Bernstein of Eindhoven University of Technology, in the Netherlands, announced PQCRYPTO, a programme to advance and standardise “post-quantum cryptography”. They are concerned that encrypted communications captured now could be subjected to quantum cracking in the future. That means strong pre-emptive encryption is needed immediately.

I encourage you to read the Economist article.

Two final comments. (1) The latest piece about D-Wave, prior to this one, was in a Feb. 6, 2015 posting about a then-new investment in the company. (2) A Canadian effort in the field of quantum cryptography was mentioned in a May 11, 2015 posting (scroll down about 50% of the way), which featured a profile of Raymond Laflamme of the University of Waterloo’s Institute for Quantum Computing in the context of an announcement about the science media initiative Research2Reality.