Category Archives: electronics

Silicene in Saskatchewan (Canada)

There’s some very exciting news coming out of the province of Saskatchewan (Canada) about silicene, a material some view as a possible rival to graphene (although that’s problematic according to my Jan. 12, 2014 posting) while others (the US Argonne National Laboratory) challenge its existence (my Aug. 1, 2014 posting).

The researchers in Saskatchewan seem quite confident in silicene’s existence according to a Sept. 9, 2014 news item on phys.org,

“Once a device becomes too small it falls prey to the strange laws of the quantum world,” says University of Saskatchewan researcher Neil Johnson, who is using the Canadian Light Source synchrotron to help develop the next generation of computer materials. Johnson is a member of Canada Research Chair Alexander Moewes’ group of graduate students studying the nature of materials using synchrotron radiation.

His work focuses on silicene, a recent and exciting addition to the class of two-dimensional materials. Silicene is made up of an almost flat hexagonal pattern of silicon atoms. Every second atom in each hexagonal ring is slightly lifted, resulting in a buckled sheet that looks the same from the top or the bottom.

A Sept. 9, 2014 Canadian Light Source news release, which originated the news item, provides background as to how Johnson started studying silicene and some details about the work,

In 2012, mere months before Johnson began to study silicene, the material was first created by the research group of Prof. Guy Le Lay of Aix-Marseille University, using silver as a base for the thin film. The Le Lay group is the world leader in silicene growth, and taught Johnson and his colleagues how to make it at the CLS themselves.

“I read the paper when the Le Lay group announced they had made silicene, and within three or four months, Alex had arranged for us to travel down to the Advanced Light Source with these people who had made it for the first time,” says Johnson. It was an exciting collaboration for the young physicist.

“This paper had already been cited over a hundred times in a matter of months. It was a major paper, and we were going to measure this new material that no one had really started doing experiments on yet.”

The most pressing question facing silicene research was its potential as a semiconductor. Today, most electronics use silicon as a switch, and researchers looking for new materials to manage quantum effects in computing could easily use the 2-D version if it were also semiconducting.

Calculations had shown that because of the special buckling of silicene, it would have what’s called a Dirac cone – a special electronic structure that could allow researchers to tune the band gap, or the energy space between electron levels. The band gap is what makes a semiconductor: if the space is too small, the material is simply a conductor. Too large, and there is no conduction at all.
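The conductor/semiconductor/insulator distinction described above can be sketched in a few lines of Python. This is my own toy illustration, not from the news release; the 4 eV cutoff is a rough, conventional dividing line between semiconductors and insulators, not a hard physical constant.

```python
# Toy sketch: classifying a material by its band gap, the energy an
# electron must overcome to reach the conduction band. Thresholds are
# illustrative values, not from the CLS news release.

def classify_by_band_gap(gap_ev: float) -> str:
    """Rough classification by band gap in electronvolts."""
    if gap_ev <= 0.0:
        return "conductor"       # no gap: electrons flow freely, as in a metal
    elif gap_ev < 4.0:
        return "semiconductor"   # modest gap: conduction can be switched or tuned
    else:
        return "insulator"       # large gap: effectively no conduction

print(classify_by_band_gap(0.0))   # conductor
print(classify_by_band_gap(1.1))   # semiconductor (bulk silicon is ~1.1 eV)
print(classify_by_band_gap(9.0))   # insulator
```

Johnson’s eventual result, described below, amounts to finding that silicene-on-silver lands in the first category.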

Since silicene has only ever been made on a silver base, the materials community also wondered if silicene would maintain its semiconducting properties in this condition. Though its atomic structure is slightly different than freestanding silicene, it was still predicted to have a band gap. However, silver is a metal, which may make the silicene act as a metal as well.

No one really knew how silicene would behave on its silver base.

To adapt the Le Lay group’s silicene-growing process to the equipment at the CLS took several days of work. Though the team had succeeded in silicene synthesis at the Advanced Light Source at Berkeley Lab, they had no way to keep those samples under vacuum to protect them from oxygen damage. Thanks to the work of fellow beamteam members Drs. David Muir and Israel Perez, samples grown at the CLS could be produced, transported and measured in a matter of hours without ever leaving a vacuum chamber.

Johnson grew the silicene sheets at the Resonant Elastic and Inelastic X-ray Scattering (REIXS) beamline, then transferred them in a vacuum to the XAS/XES endstation for analysis. Finally, Johnson could find the answer to the silicene question.

“I didn’t really know what to expect until I saw the XAS and XES on the same energy scale, and I thought to myself, that looks like a metal,” says Johnson.

And while that result is unfortunate for those searching for a new computing wonder material, it does provide some vital information to that search.

“Our result does help to guide the hunt for 2-D silicon in the future, suggesting that metallic substrates should be avoided at all costs,” Johnson explains. “We’re hopeful that we can grow a similar structure on other substrates, ideally ones that leave the semiconducting nature of silicene intact.”

That work is already in process, with Johnson and his colleagues planning to explore three other growing bases this summer, along with multilayers and nanoribbons of silicene.

Like the Dutch researchers in the Jan. 12, 2014 posting, Johnson finds that silicene is not serious competition for graphene (as regards its electrical properties), but he does not challenge its existence. He does note problems with the silver substrate, although he comes to a different conclusion than the Argonne National Laboratory researchers did (Aug. 1, 2014 posting).

Here’s a link to and a citation for Johnson’s paper,

The Metallic Nature of Epitaxial Silicene Monolayers on Ag(111) by Neil W. Johnson, Patrick Vogt, Andrea Resta, Paola De Padova, Israel Perez, David Muir, Ernst Z. Kurmaev, Guy Le Lay, and Alexander Moewes. Advanced Functional Materials, Volume 24, Issue 33, pages 5253–5259, September 3, 2014. DOI: 10.1002/adfm.201400769. Article first published online: June 10, 2014.

This paper is behind a paywall.

Buckydiamondoids steer electron flow

One doesn’t usually think about buckyballs (Buckminsterfullerenes) and diamondoids as being together in one molecule but that has not stopped scientists from trying to join them and, in this case, successfully. From a Sept. 9, 2014 news item on ScienceDaily,

Scientists have married two unconventional forms of carbon — one shaped like a soccer ball, the other a tiny diamond — to make a molecule that conducts electricity in only one direction. This tiny electronic component, known as a rectifier, could play a key role in shrinking chip components down to the size of molecules to enable faster, more powerful devices.

Here’s an illustration the scientists have provided,

Illustration of a buckydiamondoid molecule under a scanning tunneling microscope (STM). In this study the STM made images of the buckydiamondoids and probed their electronic properties.

A Sept. 9, 2014 Stanford University news release by Glenda Chui (also on EurekAlert), which originated the news item, provides some information about this piece of international research along with background information on buckyballs and diamondoids (Note: Links have been removed),

“We wanted to see what new, emergent properties might come out when you put these two ingredients together to create a ‘buckydiamondoid,’” said Hari Manoharan of the Stanford Institute for Materials and Energy Sciences (SIMES) at the U.S. Department of Energy’s SLAC National Accelerator Laboratory. “What we got was basically a one-way valve for conducting electricity – clearly more than the sum of its parts.”

The research team, which included scientists from Stanford University and institutions in Belgium, Germany and Ukraine, reported its results Sept. 9 in Nature Communications.

Many electronic circuits have three basic components: a material that conducts electrons; rectifiers, which commonly take the form of diodes, to steer that flow in a single direction; and transistors to switch the flow on and off. Scientists combined two offbeat ingredients – buckyballs and diamondoids – to create the new diode-like component.

Buckyballs – short for buckminsterfullerenes – are hollow carbon spheres whose 1985 discovery earned three scientists a Nobel Prize in chemistry. Diamondoids are tiny linked cages of carbon joined, or bonded, as they are in diamonds, with hydrogen atoms linked to the surface, but weighing less than a billionth of a billionth of a carat. Both are subjects of a lot of research aimed at understanding their properties and finding ways to use them.

In 2007, a team led by researchers from SLAC and Stanford discovered that a single layer of diamondoids on a metal surface can emit and focus electrons into a tiny beam. Manoharan and his colleagues wondered: What would happen if they paired an electron-emitting diamondoid with another molecule that likes to grab electrons? Buckyballs are just that sort of electron-grabbing molecule.

Details are then provided about this specific piece of research (from the Stanford news release),

For this study, diamondoids were produced in the SLAC laboratory of SIMES researchers Jeremy Dahl and Robert Carlson, who are world experts in extracting the tiny diamonds from petroleum. The diamondoids were then shipped to Germany, where chemists at Justus-Liebig University figured out how to attach them to buckyballs.

The resulting buckydiamondoids, which are just a few nanometers long, were tested in SIMES laboratories at Stanford. A team led by graduate student Jason Randel and postdoctoral researcher Francis Niestemski used a scanning tunneling microscope to make images of the hybrid molecules and measure their electronic behavior. They discovered that the hybrid is an excellent rectifier: The electrical current flowing through the molecule was up to 50 times stronger in one direction, from electron-spitting diamondoid to electron-catching buckyball, than in the opposite direction. This is something neither component can do on its own.
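The “up to 50 times stronger” figure quoted above is a rectification ratio: forward current divided by reverse current at the same bias magnitude. A quick sketch (my own illustration; the current values are made up for the example, not measured data from the paper):

```python
# Sketch of how a rectification ratio is computed: the ratio of current
# flowing in the forward direction (diamondoid -> buckyball) to current
# in the reverse direction at equal and opposite bias.
# The numbers below are illustrative, not the Stanford team's data.

def rectification_ratio(i_forward: float, i_reverse: float) -> float:
    """Ratio of forward to reverse current magnitudes."""
    return abs(i_forward) / abs(i_reverse)

i_fwd = 50.0   # illustrative forward current (arbitrary units)
i_rev = 1.0    # illustrative reverse current at the same bias magnitude
print(rectification_ratio(i_fwd, i_rev))  # 50.0
```

A ratio of 1 would mean no rectification at all; an ideal diode would have an effectively infinite ratio.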

While this is not the first molecular rectifier ever invented, it’s the first one made from just carbon and hydrogen, a simplicity researchers find appealing, said Manoharan, who is an associate professor of physics at Stanford. The next step, he said, is to see if transistors can be constructed from the same basic ingredients.

“Buckyballs are easy to make – they can be isolated from soot – and the type of diamondoid we used here, which consists of two tiny cages, can be purchased commercially,” he said. “And now that our colleagues in Germany have figured out how to bind them together, others can follow the recipe. So while our research was aimed at gaining fundamental insights about a novel hybrid molecule, it could lead to advances that help make molecular electronics a reality.”

Other research collaborators came from the Catholic University of Louvain in Belgium and Kiev Polytechnic Institute in Ukraine. The primary funding for the work came from the U.S. Department of Energy Office of Science (Basic Energy Sciences, Materials Sciences and Engineering Divisions).

Here’s a link to and a citation for the paper,

Unconventional molecule-resolved current rectification in diamondoid–fullerene hybrids by Jason C. Randel, Francis C. Niestemski, Andrés R. Botello-Mendez, Warren Mar, Georges Ndabashimiye, Sorin Melinte, Jeremy E. P. Dahl, Robert M. K. Carlson, Ekaterina D. Butova, Andrey A. Fokin, Peter R. Schreiner, Jean-Christophe Charlier & Hari C. Manoharan. Nature Communications 5, Article number: 4877. DOI: 10.1038/ncomms5877. Published September 9, 2014.

This paper is open access. The scientists provided not only a standard illustration but a pretty picture of the buckydiamondoid,

Caption: An international team led by researchers at SLAC National Accelerator Laboratory and Stanford University joined two offbeat carbon molecules -- diamondoids, the square cages at left, and buckyballs, the soccer-ball shapes at right -- to create "buckydiamondoids," center. These hybrid molecules function as rectifiers, conducting electrons in only one direction, and could help pave the way to molecular electronic devices. Credit: Manoharan Lab/Stanford University

Nanoscale light confinement without metal (photonic circuits) at the University of Alberta (Canada)

To be more accurate, this is a step forward towards photonic circuits according to an Aug. 20, 2014 news item on Azonano,

The invention of fibre optics revolutionized the way we share information, allowing us to transmit data at volumes and speeds we’d only previously dreamed of. Now, electrical engineering researchers at the University of Alberta are breaking another barrier, designing nano-optical cables small enough to replace the copper wiring on computer chips.

This could result in radical increases in computing speeds and reduced energy use by electronic devices.

“We’re already transmitting data from continent to continent using fibre optics, but the killer application is using this inside chips for interconnects—that is the Holy Grail,” says Zubin Jacob, an electrical engineering professor leading the research. “What we’ve done is come up with a fundamentally new way of confining light to the nano scale.”

At present, the diameter of fibre optic cables is limited to about one thousandth of a millimetre. Cables designed by graduate student Saman Jahani and Jacob are 10 times smaller—small enough to replace copper wiring still used on computer chips. (To put that into perspective, a dime is about one millimetre thick.)
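The size comparisons in the paragraph above can be checked with a little arithmetic (my own back-of-envelope sketch; “one thousandth of a millimetre” is one micrometre, and ten times smaller is 100 nanometres):

```python
# Back-of-envelope check of the sizes quoted above. All values in metres.
fibre_diameter = 1e-3 / 1000      # "one thousandth of a millimetre" = 1 micrometre
new_cable = fibre_diameter / 10   # "10 times smaller" = ~100 nanometres
dime_thickness = 1e-3             # "a dime is about one millimetre thick"

print(fibre_diameter)             # ~1e-06 m (1 micrometre)
print(new_cable)                  # ~1e-07 m (100 nanometres)
print(dime_thickness / new_cable) # ~10000 -- the new cable is ~10,000x thinner than a dime
```

In other words, roughly ten thousand of these nano-optical cables could be stacked across the thickness of a dime.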

An Aug. 19, 2014 University of Alberta news release by Richard Cairney (also on EurekAlert), which originated the news item, provides more technical detail and information about funding,

Jahani and Jacob have used metamaterials to redefine the textbook phenomenon of total internal reflection, discovered 400 years ago by the German scientist Johannes Kepler while working on telescopes.

Researchers around the world have been stymied in their efforts to develop effective fibre optics at smaller sizes. One popular solution has been reflective metallic claddings that keep light waves inside the cables. But the biggest hurdle is increased temperatures: metal causes problems after a certain point.

“If you use metal, a lot of light gets converted to heat. That has been the major stumbling block. Light gets converted to heat and the information literally burns up—it’s lost.”

Jacob and Jahani have designed a new, non-metallic metamaterial that enables them to “compress” and contain light waves in the smaller cables without creating heat, slowing the signal or losing data. …

The team’s research is funded by the Natural Sciences and Engineering Research Council of Canada and the Helmholtz-Alberta Initiative.

Jacob and Jahani are now building the metamaterials on a silicon chip to outperform current light confining strategies used in industry.

Given that this work is being performed at the nanoscale and these scientists are located at the Canadian university that houses Canada’s National Institute for Nanotechnology (NINT), the absence of any mention of the NINT comes as a surprise (more about this organization after the link to the researchers’ paper).

Here’s a link to and a citation for the paper,

Transparent subdiffraction optics: nanoscale light confinement without metal by Saman Jahani and Zubin Jacob. Optica, Vol. 1, Issue 2, pp. 96-100 (2014) http://dx.doi.org/10.1364/OPTICA.1.000096

This paper is open access.

In a search for the NINT’s website I found this summary at the University of Alberta’s NINT webpage,

The National Institute for Nanotechnology (NINT) was established in 2001 and is operated as a partnership between the National Research Council and the University of Alberta. Many NINT researchers are affiliated with both the National Research Council and University of Alberta.

NINT is a unique, integrated, multidisciplinary institute involving researchers from fields such as physics, chemistry, engineering, biology, informatics, pharmacy, and medicine. The main focus of the research being done at NINT is the integration of nano-scale devices and materials into complex nanosystems that can be put to practical use. Nanotechnology is a relatively new field of research, so people at NINT are working to discover “design rules” for nanotechnology and to develop platforms for building nanosystems and materials that can be constructed and programmed for a particular application. NINT aims to increase knowledge and support innovation in the area of nanotechnology, as well as to create work that will have long-term relevance and value for Alberta and Canada.

The University of Alberta’s NINT webpage also offers a link to the NINT’s latest rebranded website. The failure to mention the NINT becomes more curious when looking at a description of the NINT’s programmes, one of which is hybrid nanoelectronics (Note: A link has been removed),

Hybrid NanoElectronics provide revolutionary electronic functions that may be utilized by industry through creating circuits that operate using mechanisms unique to the nanoscale. This may include functions that are not possible with conventional circuitry to provide smaller, faster and more energy-efficient components, and extend the development of electronics beyond the end of the roadmap.

After looking at a list of the researchers affiliated with the NINT, it’s apparent that neither Jahani nor Jacob is part of that team. Perhaps they have preferred to work independently of the NINT, which is one of the National Research Council of Canada’s institutes.

Cyborgs (a presentation) at the American Chemical Society’s 248th meeting

There will be a plethora of chemistry news online over the next few days as the American Chemical Society’s (ACS) 248th meeting takes place in San Francisco, CA, from Aug. 10-14, 2014. Unexpectedly, an Aug. 11, 2014 news item on Azonano highlights a meeting presentation focused on cyborgs,

No longer just fantastical fodder for sci-fi buffs, cyborg technology is bringing us tangible progress toward real-life electronic skin, prosthetics and ultraflexible circuits. Now taking this human-machine concept to an unprecedented level, pioneering scientists are working on the seamless marriage between electronics and brain signaling with the potential to transform our understanding of how the brain works — and how to treat its most devastating diseases.

An Aug. 10, 2014 ACS news release on EurekAlert provides more detail about the presentation (Note: Links have been removed),

“By focusing on the nanoelectronic connections between cells, we can do things no one has done before,” says Charles M. Lieber, Ph.D. “We’re really going into a new size regime for not only the device that records or stimulates cellular activity, but also for the whole circuit. We can make it really look and behave like smart, soft biological material, and integrate it with cells and cellular networks at the whole-tissue level. This could get around a lot of serious health problems in neurodegenerative diseases in the future.”

These disorders, such as Parkinson’s, involve malfunctioning nerve cells and can lead to difficulty with the most mundane and essential movements that most of us take for granted: walking, talking, eating and swallowing.

Scientists are working furiously to get to the bottom of neurological disorders. But they involve the body’s most complex organ — the brain — which is largely inaccessible to detailed, real-time scrutiny. This inability to see what’s happening in the body’s command center hinders the development of effective treatments for diseases that stem from it.

By using nanoelectronics, it could become possible for scientists to peer for the first time inside cells, see what’s going wrong in real time and ideally set them on a functional path again.

For the past several years, Lieber has been working to dramatically shrink cyborg science to a level that’s thousands of times smaller and more flexible than other bioelectronic research efforts. His team has made ultrathin nanowires that can monitor and influence what goes on inside cells. Using these wires, they have built ultraflexible, 3-D mesh scaffolding with hundreds of addressable electronic units, and they have grown living tissue on it. They have also developed the tiniest electronic probe ever that can record even the fastest signaling between cells.

Rapid-fire cell signaling controls all of the body’s movements, including breathing and swallowing, which are affected in some neurodegenerative diseases. And it’s at this level where the promise of Lieber’s most recent work enters the picture.

In one of the lab’s latest directions, Lieber’s team is figuring out how to inject their tiny, ultraflexible electronics into the brain and allow them to become fully integrated with the existing biological web of neurons. They’re currently in the early stages of the project and are working with rat models.

“It’s hard to say where this work will take us,” he says. “But in the end, I believe our unique approach will take us on a path to do something really revolutionary.”

Lieber acknowledges funding from the U.S. Department of Defense, the National Institutes of Health and the U.S. Air Force.

I first covered Lieber’s work in an Aug. 27, 2012 posting highlighting some good descriptions from Lieber and his colleagues of their work. There’s also an Aug. 26, 2012 article by Peter Reuell in the Harvard Gazette (featuring a very good technical description for someone not terribly familiar with the field but able to grasp some technical information while managing their own [mine] ignorance). The posting and the article provide details about the foundational work for Lieber’s 2014 presentation at the ACS meeting.

Lieber will be speaking next at the IEEE (Institute of Electrical and Electronics Engineers) 14th International Conference on Nanotechnology, being held August 18-21, 2014 in Toronto, Ontario, Canada.

As for some of Lieber’s latest published work, there’s more information in my Feb. 20, 2014 posting which features a link to a citation for the paper (behind a paywall) in question.

TrueNorth, a brain-inspired chip architecture from IBM and Cornell University

For a Canadian, “true north” is invariably followed by “strong and free” when singing our national anthem; for many Canadians it is almost the only phrase that is remembered without hesitation. Consequently, some of the buzz surrounding the publication of a paper celebrating ‘TrueNorth’, a brain-inspired chip, is a bit disconcerting. Nonetheless, here is the latest IBM (in collaboration with Cornell University) news from an Aug. 8, 2014 news item on Nanowerk,

Scientists from IBM unveiled the first neurosynaptic computer chip to achieve an unprecedented scale of one million programmable neurons, 256 million programmable synapses and 46 billion synaptic operations per second per watt. At 5.4 billion transistors, this fully functional and production-scale chip is currently one of the largest CMOS chips ever built, yet, while running at biological real time, it consumes a minuscule 70mW—orders of magnitude less power than a modern microprocessor. A neurosynaptic supercomputer the size of a postage stamp that runs on the energy equivalent of a hearing-aid battery, this technology could transform science, technology, business, government, and society by enabling vision, audition, and multi-sensory applications.
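The headline efficiency figure (46 billion synaptic operations per second per watt) and the 70 mW power draw can be combined in a quick back-of-envelope calculation (my own arithmetic, not from the news item):

```python
# Back-of-envelope arithmetic on the TrueNorth figures quoted above:
# 46 billion synaptic operations per second per watt, running at 70 mW.
sops_per_watt = 46e9   # synaptic operations / second / watt
power_watts = 70e-3    # 70 milliwatts

total_sops = sops_per_watt * power_watts
print(total_sops)      # roughly 3.2 billion synaptic operations per second
```

So the chip sustains on the order of three billion synaptic operations per second on hearing-aid-battery power, which is the point of the "minuscule 70mW" comparison with conventional microprocessors.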

An Aug. 7, 2014 IBM news release, which originated the news item, provides an overview of the multi-year process this breakthrough represents (Note: Links have been removed),

There is a huge disparity between the human brain’s cognitive capability and ultra-low power consumption when compared to today’s computers. To bridge the divide, IBM scientists created something that didn’t previously exist—an entirely new neuroscience-inspired scalable and efficient computer architecture that breaks path with the prevailing von Neumann architecture used almost universally since 1946.

This second generation chip is the culmination of almost a decade of research and development, including the initial single core hardware prototype in 2011 and software ecosystem with a new programming language and chip simulator in 2013.

The new cognitive chip architecture has an on-chip two-dimensional mesh network of 4096 digital, distributed neurosynaptic cores, where each core module integrates memory, computation, and communication, and operates in an event-driven, parallel, and fault-tolerant fashion. To enable system scaling beyond single-chip boundaries, adjacent chips, when tiled, can seamlessly connect to each other—building a foundation for future neurosynaptic supercomputers. To demonstrate scalability, IBM also revealed a 16-chip system with sixteen million programmable neurons and four billion programmable synapses.
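The per-core numbers fall out of the headline figures, since “one million neurons” and “256 million synapses” are actually the powers of two 2^20 and 2^28, which divide evenly across the 4096 cores. A quick check (my own arithmetic):

```python
# The headline "one million neurons" and "256 million synapses" are the
# powers of two 2**20 and 2**28, which divide evenly across 4096 cores.
cores = 4096
neurons = 2**20        # 1,048,576 -- "one million programmable neurons"
synapses = 2**28       # 268,435,456 -- "256 million programmable synapses"

print(neurons // cores)   # 256 neurons per core
print(synapses // cores)  # 65536 synapses per core (a 256 x 256 crossbar)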

“IBM has broken new ground in the field of brain-inspired computers, in terms of a radically new architecture, unprecedented scale, unparalleled power/area/speed efficiency, boundless scalability, and innovative design techniques. We foresee new generations of information technology systems – that complement today’s von Neumann machines – powered by an evolving ecosystem of systems, software, and services,” said Dr. Dharmendra S. Modha, IBM Fellow and IBM Chief Scientist, Brain-Inspired Computing, IBM Research. “These brain-inspired chips could transform mobility, via sensory and intelligent applications that can fit in the palm of your hand but without the need for Wi-Fi. This achievement underscores IBM’s leadership role at pivotal transformational moments in the history of computing via long-term investment in organic innovation.”

The Defense Advanced Research Projects Agency (DARPA) has funded the project since 2008 with approximately $53M via Phase 0, Phase 1, Phase 2, and Phase 3 of the Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program. Current collaborators include Cornell Tech and iniLabs, Ltd.

Building the Chip

The chip was fabricated using Samsung’s 28nm process technology that has a dense on-chip memory and low-leakage transistors.

“It is an astonishing achievement to leverage a process traditionally used for commercially available, low-power mobile devices to deliver a chip that emulates the human brain by processing extreme amounts of sensory information with very little power,” said Shawn Han, vice president of Foundry Marketing, Samsung Electronics. “This is a huge architectural breakthrough that is essential as the industry moves toward the next-generation cloud and big-data processing. It’s a pleasure to be part of technical progress for next-generation through Samsung’s 28nm technology.”

The event-driven circuit elements of the chip used the asynchronous design methodology developed at Cornell Tech [aka Cornell University] and refined with IBM since 2008.

“After years of collaboration with IBM, we are now a step closer to building a computer similar to our brain,” said Professor Rajit Manohar, Cornell Tech.

The combination of cutting-edge process technology, hybrid asynchronous-synchronous design methodology, and new architecture has led to a power density of 20mW/cm2 which is nearly four orders of magnitude less than today’s microprocessors.
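The “nearly four orders of magnitude” claim can be sanity-checked against a typical modern CPU. Note the ~100 W/cm² figure below is an assumed ballpark for a conventional microprocessor, not a number from the news release:

```python
# Checking the "nearly four orders of magnitude" power-density claim.
# The 100 W/cm^2 CPU figure is an assumed typical value for a modern
# microprocessor, not taken from the IBM news release.
import math

truenorth_density = 20e-3   # 20 mW/cm^2, as quoted above
typical_cpu_density = 100.0 # ~100 W/cm^2 (assumption)

ratio = typical_cpu_density / truenorth_density
print(ratio)             # 5000.0
print(math.log10(ratio)) # ~3.7 -- i.e. "nearly four orders of magnitude"
```

A factor of 5,000 is about 10^3.7, which is consistent with the release’s phrasing.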

Advancing the SyNAPSE Ecosystem

The new chip is a component of a complete end-to-end vertically integrated ecosystem spanning a chip simulator, neuroscience data, supercomputing, neuron specification, programming paradigm, algorithms and applications, and prototype design models. The ecosystem supports all aspects of the programming cycle from design through development, debugging, and deployment.

To bring forth this fundamentally different technological capability to society, IBM has designed a novel teaching curriculum for universities, customers, partners, and IBM employees.

Applications and Vision

This ecosystem signals a shift in moving computation closer to the data, taking in vastly varied kinds of sensory data, analyzing and integrating real-time information in a context-dependent way, and dealing with the ambiguity found in complex, real-world environments.

Looking to the future, IBM is working on integrating multi-sensory neurosynaptic processing into mobile devices constrained by power, volume and speed; integrating novel event-driven sensors with the chip; real-time multimedia cloud services accelerated by neurosynaptic systems; and neurosynaptic supercomputers by tiling multiple chips on a board, creating systems that would eventually scale to one hundred trillion synapses and beyond.

Building on previously demonstrated neurosynaptic cores with on-chip, online learning, IBM envisions building learning systems that adapt in real world settings. While today’s hardware is fabricated using a modern CMOS process, the underlying architecture is poised to exploit advances in future memory, 3D integration, logic, and sensor technologies to deliver even lower power, denser package, and faster speed.

I have two articles that may prove of interest. Peter Stratton’s Aug. 7, 2014 article for The Conversation provides an easy-to-read introduction to both brains, human and computer (as they apply to this research), and TrueNorth (h/t phys.org, which also hosts Stratton’s article). There’s also an Aug. 7, 2014 article by Rob Farber for techenablement.com, which includes information from a range of text and video sources about TrueNorth and cognitive computing, as it’s also known (well worth checking out).

Here’s a link to and a citation for the paper,

A million spiking-neuron integrated circuit with a scalable communication network and interface by Paul A. Merolla, John V. Arthur, Rodrigo Alvarez-Icaza, Andrew S. Cassidy, Jun Sawada, Filipp Akopyan, Bryan L. Jackson, Nabil Imam, Chen Guo, Yutaka Nakamura, Bernard Brezzo, Ivan Vo, Steven K. Esser, Rathinakumar Appuswamy, Brian Taba, Arnon Amir, Myron D. Flickner, William P. Risk, Rajit Manohar, and Dharmendra S. Modha. Science, 8 August 2014: Vol. 345, no. 6197, pp. 668-673. DOI: 10.1126/science.1254642

This paper is behind a paywall.

Carbyne stretches from theory to reality and reveals its conundrum-type self

Rice University (Texas, US) scientists have taken a rather difficult material, carbyne, and twisted it to reveal new properties according to a July 21, 2014 news item on ScienceDaily,

Applying just the right amount of tension to a chain of carbon atoms can turn it from a metallic conductor to an insulator, according to Rice University scientists.

Stretching the material known as carbyne — a hard-to-make, one-dimensional chain of carbon atoms — by just 3 percent can begin to change its properties in ways that engineers might find useful for mechanically activated nanoscale electronics and optics.

A July 21, 2014 Rice University news release (also on EurekAlert), which originated the news item, describes carbyne and some of the difficulties the scientists addressed in their research on the material,

Until recently, carbyne has existed mostly in theory, though experimentalists have made some headway in creating small samples of the finicky material. The carbon chain would theoretically be the strongest material ever, if only someone could make it reliably.

The first-principles calculations by [Rice University’s Boris] Yakobson and his co-authors, Rice postdoctoral researcher Vasilii Artyukhov and graduate student Mingjie Liu, show that stretching carbon chains activates the transition from conductor to insulator by widening the material’s band gap. Band gaps, which free electrons must overcome to complete a circuit, give materials the semiconducting properties that make modern electronics possible.

In their previous work on carbyne, the researchers believed they saw hints of the transition, but they had to dig deeper to find that stretching would effectively turn the material into a switch.

Each carbon atom has four electrons available to form covalent bonds. In their relaxed state, the atoms in a carbyne chain would be more or less evenly spaced, with two bonds between them. But the atoms are never static, due to natural quantum uncertainty, which Yakobson said keeps them from slipping into a less-stable Peierls distortion.

“Peierls said one-dimensional metals are unstable and must become semiconductors or insulators,” Yakobson said. “But it’s not that simple, because there are two driving factors.”

One, the Peierls distortion, “wants to open the gap that makes it a semiconductor.” The other, called zero-point vibration (ZPV), “wants to maintain uniformity and the metal state.”

Yakobson explained that ZPV is a manifestation of quantum uncertainty, which says atoms are always in motion. “It’s more a blur than a vibration,” he said. “We can say carbyne represents the uncertainty principle in action, because when it’s relaxed, the bonds are constantly confused between 2-2 and 1-3, to the point where they average out and the chain remains metallic.”

But stretching the chain shifts the balance toward alternating long and short (1-3) bonds. That progressively opens a band gap beginning at about 3 percent tension, according to the computations. The Rice team created a phase diagram to illustrate the relationship of the band gap to strain and temperature.
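The strain-to-gap relationship can be put in concrete terms with a toy tight-binding model of a dimerized chain (the textbook physics behind a Peierls gap). To be clear, the hopping strength, the 3 percent onset, and the strain coupling below are illustrative assumptions of mine, not values from the Rice group's first-principles calculations.

```python
def band_gap(t_long, t_short):
    """Band gap of a dimerized one-dimensional chain.

    For alternating hoppings t1, t2 the dispersion is
    E(k) = +/- sqrt(t1**2 + t2**2 + 2*t1*t2*cos(k)), so the gap at the
    zone boundary (k = pi) is 2*|t1 - t2|: equal hoppings give a metal,
    alternating long/short bonds give a semiconductor.
    """
    return 2.0 * abs(t_long - t_short)


def carbyne_gap_vs_strain(strain, t0=3.0, onset=0.03, coupling=10.0):
    """Illustrative band gap (eV) versus tensile strain for a carbyne-like chain.

    Assumption: below the ~3% onset, zero-point vibration averages the
    bonds (no dimerization, zero gap); above it, the long/short bond
    alternation grows linearly with the excess strain.
    """
    delta = coupling * t0 * max(0.0, strain - onset)  # hopping alternation
    return band_gap(t0 + delta, t0 - delta)


for s in (0.00, 0.02, 0.03, 0.05, 0.07):
    print(f"strain {s:4.0%}: gap = {carbyne_gap_vs_strain(s):.2f} eV")
```

The model reproduces the qualitative behaviour in the release: zero gap (metallic) below the onset, then a gap that widens steadily with further tension.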

How carbyne is attached to electrodes also matters, Artyukhov said. “Different bond connectivity patterns can affect the metallic/dielectric state balance and shift the transition point, potentially to where it may not be accessible anymore,” he said. “So one has to be extremely careful about making the contacts.”

“Carbyne’s structure is a conundrum,” he said. “Until this paper, everybody was convinced it was single-triple, with a long bond then a short bond, caused by Peierls instability.” He said the realization that quantum vibrations may quench Peierls, together with the team’s earlier finding that tension can increase the band gap and make carbyne more insulating, prompted the new study.

“Other researchers considered the role of ZPV in Peierls-active systems, even carbyne itself, before we did,” Artyukhov said. “However, in all previous studies only two possible answers were being considered: either ‘carbyne is semiconducting’ or ‘carbyne is metallic,’ and the conclusion, whichever one, was viewed as sort of a timeless mathematical truth, a static ‘ultimate verdict.’ What we realized here is that you can use tension to dynamically go from one regime to the other, which makes it useful on a completely different level.”

Yakobson noted the findings should encourage more research into the formation of stable carbyne chains and may apply equally to other one-dimensional chains subject to Peierls distortions, including conducting polymers and charge/spin density-wave materials.

According to the news release the research was funded by the U.S. Air Force Office of Scientific Research, the Office of Naval Research Multidisciplinary University Research Initiative, and the Robert Welch Foundation. (I can’t recall another instance of the air force and the navy funding the same research.) In any event, here’s a link to and a citation for the paper,

Mechanically Induced Metal–Insulator Transition in Carbyne by Vasilii I. Artyukhov, Mingjie Liu, and Boris I. Yakobson. Nano Lett., Article ASAP DOI: 10.1021/nl5017317 Publication Date (Web): July 3, 2014

Copyright © 2014 American Chemical Society

This paper is behind a paywall.

The researchers have provided an image to illustrate their work,

[downloaded from http://pubs.acs.org/doi/abs/10.1021/nl5017317]

I’m not sure what the bird is doing in the image but it caught my fancy. There is another less whimsical illustration (you can see it in the July 21, 2014 news item on ScienceDaily) and I believe the same caption can be used for the one I’ve chosen from the journal’s abstract page, “Carbyne chains of carbon atoms can be either metallic or semiconducting, according to first-principle calculations by scientists at Rice University. Stretching the chain dimerizes the atoms, opening a band gap between the pairs. Credit: Vasilii Artyukhov/Rice University.”

I last wrote about carbyne in an Oct. 9, 2013 posting where I noted that the material was unlikely to dethrone graphene as it didn’t appear to have properties useful in electronic applications. It seems the scientists have proved otherwise, at least in the laboratory.

Transmetalation, substituting one set of metal atoms for another set

Transmetalation bears a resemblance of sorts to transmutation. While the chemists from the University of Oregon aren’t turning lead to gold through an alchemical process they are switching out individual metal atoms, aluminum for indium. From a July 21, 2014 news item on ScienceDaily,

The yield so far is small, but chemists at the University of Oregon have developed a low-energy, solution-based mineral substitution process to make a precursor to transparent thin films that could find use in electronics and alternative energy devices.

A paper describing the approach is highlighted on the cover of the July 21 [2014] issue of the journal Inorganic Chemistry, which draws the most citations of research in the inorganic and nuclear chemistry fields. [emphasis mine] The paper was chosen by the American Chemical Society journal as an ACS Editor’s Choice for its potential scientific and broad public interest when it was initially published online.

One observation unrelated to the research: the competition amongst universities seems to be heating up. While journals often tout their impact factor, it’s usually done more discreetly than in what amounts to a citation in the second paragraph of the university news release, which originated the news item.

The July 21, 2014 University of Oregon news release (also on EurekAlert), describes the work in more detail,

The process described in the paper represents a new approach to transmetalation, in which individual atoms of one metal complex — a cluster in this case — are individually substituted in water. For this study, Maisha K. Kamunde-Devonish and Milton N. Jackson Jr., doctoral students in the Department of Chemistry and Biochemistry, replaced aluminum atoms with indium atoms.

The goal is to develop inorganic clusters as precursors that result in dense thin films with negligible defects, resulting in new functional materials and thin-film metal oxides. The latter would have wide application in a variety of electronic devices.

“Since the number of compounds that fit this bill is small, we are looking at transmetalation as a method for creating new precursors with new combinations of metals that would circumvent barriers to performance,” Kamunde-Devonish said.

Components in these devices now use deposition techniques that require a lot of energy in the form of pressure or temperature. Doing so in a more green way — reducing chemical waste during preparation — could reduce manufacturing costs and allow for larger-scale materials, she said.

“In essence,” said co-author Darren W. Johnson, a professor of chemistry, “we can prepare one type of nanoscale cluster compound, and then step-by-step substitute out the individual metal atoms to make new clusters that cannot be made by direct methods. The cluster we report in this paper serves as an excellent solution precursor to make very smooth thin films of amorphous aluminum indium oxide, a semiconductor material that can be used in transparent thin-film transistors.”
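Johnson’s ‘remodeling’ metaphor amounts to stepwise bookkeeping: swap one metal atom at a time and track the intermediate compositions. The sketch below is purely illustrative; the 13-metal-atom cluster size and the number of substitution steps are my assumptions, not data from the paper.

```python
def stepwise_substitution(parent="Al", incoming="In", sites=13, steps=4):
    """List the cluster compositions visited by atom-by-atom substitution.

    Illustrative bookkeeping only: 'sites' (a hypothetical 13-metal
    cluster) and 'steps' are assumptions, not measurements.
    """
    series = []
    for n in range(steps + 1):
        if n == 0:
            series.append(f"{parent}{sites}")              # untouched parent cluster
        else:
            series.append(f"{parent}{sites - n}{incoming}{n}")  # n atoms swapped out
    return series


print(" -> ".join(stepwise_substitution()))
# Al13 -> Al12In1 -> Al11In2 -> Al10In3 -> Al9In4
```

The point of the exercise: each intermediate is a distinct heterometallic cluster that, per the release, may not be reachable by direct synthesis.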

Transmetalation normally involves a reaction done in organic chemistry in which the substitution of metal ions generates new metal-carbon bonds for use in catalytic systems and to synthesize new metal complexes.

“This is a new way to use the process,” Kamunde-Devonish said. “Usually you take smaller building blocks and put them together to form a mix of your basic two or three metals. Instead of building a house from the ground up, we’re doing some remodeling. In everyday life that happens regularly, but in chemistry it doesn’t happen very often. We’ve been trying to make materials, compounds, anything that can be useful to improve the processes used to make thin films that find application in a variety of electronic devices.”

The process, she added, could be turned into a toolbox that allows for precise substitutions to generate specifically desired properties. “Currently, we can only make small amounts,” she said, “but the fact that we can do this will allow us to get a fundamental understanding of how this process happens. The technology is possible already. It’s just a matter of determining if this type of material we’ve produced is the best for the process.”

Here’s a citation for and a link to the paper,

Transmetalation of Aqueous Inorganic Clusters: A Useful Route to the Synthesis of Heterometallic Aluminum and Indium Hydroxo–Aquo Clusters by Maisha K. Kamunde-Devonish, Milton N. Jackson, Jr., Zachary L. Mensinger, Lev N. Zakharov, and Darren W. Johnson. Inorg. Chem., 2014, 53 (14), pp 7101–7105 DOI: 10.1021/ic403121r Publication Date (Web): April 18, 2014

Copyright © 2014 American Chemical Society

This paper appears to be open access (I was able to view the HTML version when I clicked).

Better RRAM memory devices in the short term

Given my recent spate of posts about computing and the future of the chip (list to follow at the end of this post), this Rice University [Texas, US] research suggests that some improvements to current memory devices might be coming to the market in the near future. From a July 12, 2014 news item on Azonano,

Rice University’s breakthrough silicon oxide technology for high-density, next-generation computer memory is one step closer to mass production, thanks to a refinement that will allow manufacturers to fabricate devices at room temperature with conventional production methods.

A July 10, 2014 Rice University news release, which originated the news item, provides more detail,

Tour [Rice chemist James Tour] and colleagues began work on their breakthrough RRAM technology more than five years ago. The basic concept behind resistive memory devices is the insertion of a dielectric material — one that won’t normally conduct electricity — between two wires. When a sufficiently high voltage is applied across the wires, a narrow conduction path can be formed through the dielectric material.

The presence or absence of these conduction pathways can be used to represent the binary 1s and 0s of digital data. Research with a number of dielectric materials over the past decade has shown that such conduction pathways can be formed, broken and reformed thousands of times, which means RRAM can be used as the basis of rewritable random-access memory.
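The form/break/reform behaviour described above is, from a programmer's point of view, a simple state machine, which can be sketched in a few lines of Python. The threshold voltages here are placeholders I chose for illustration, not measurements from the Rice devices.

```python
class RRAMCell:
    """Minimal behavioral model of a resistive-memory (RRAM) cell.

    A dielectric sits between two wires and is switched between a
    high-resistance state (no conduction path, read as 0) and a
    low-resistance state (filament formed, read as 1).
    """

    SET_V = 2.0      # write voltage that forms the conduction path (placeholder)
    RESET_V = -2.0   # opposite-polarity voltage that ruptures it (placeholder)

    def __init__(self):
        self.low_resistance = False  # no filament yet: cell stores 0

    def apply(self, volts):
        """Apply a voltage pulse across the cell."""
        if volts >= self.SET_V:
            self.low_resistance = True    # filament formed
        elif volts <= self.RESET_V:
            self.low_resistance = False   # filament broken
        # smaller voltages (e.g. read pulses) leave the state untouched

    def read(self):
        """Non-destructive read: the resistance state encodes the bit."""
        return 1 if self.low_resistance else 0


cell = RRAMCell()
cell.apply(2.5)
print(cell.read())   # 1
cell.apply(-2.5)
print(cell.read())   # 0
```

Because the state survives with no power applied, this is nonvolatile memory; the thousands of form/break cycles mentioned above correspond to repeated set/reset calls.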

RRAM is under development worldwide and expected to supplant flash memory technology in the marketplace within a few years because it is faster than flash and can pack far more information into less space. For example, manufacturers have announced plans for RRAM prototype chips that will be capable of storing about one terabyte of data on a device the size of a postage stamp — more than 50 times the data density of current flash memory technology.

The key ingredient of Rice’s RRAM is its dielectric component, silicon oxide. Silicon is the most abundant element on Earth and the basic ingredient in conventional microchips. Microelectronics fabrication technologies based on silicon are widespread and easily understood, but until the 2010 discovery of conductive filament pathways in silicon oxide in Tour’s lab, the material wasn’t considered an option for RRAM.

Since then, Tour’s team has raced to further develop its RRAM and even used it for exotic new devices like transparent flexible memory chips. At the same time, the researchers also conducted countless tests to compare the performance of silicon oxide memories with competing dielectric RRAM technologies.

“Our technology is the only one that satisfies every market requirement, both from a production and a performance standpoint, for nonvolatile memory,” Tour said. “It can be manufactured at room temperature, has an extremely low forming voltage, high on-off ratio, low power consumption, nine-bit capacity per cell, exceptional switching speeds and excellent cycling endurance.”

In the latest study, a team headed by lead author and Rice postdoctoral researcher Gunuk Wang showed that using a porous version of silicon oxide could dramatically improve Rice’s RRAM in several ways. First, the porous material reduced the forming voltage — the power needed to form conduction pathways — to less than two volts, a 13-fold improvement over the team’s previous best and a number that stacks up against competing RRAM technologies. In addition, the porous silicon oxide also allowed Tour’s team to eliminate the need for a “device edge structure.”

“That means we can take a sheet of porous silicon oxide and just drop down electrodes without having to fabricate edges,” Tour said. “When we made our initial announcement about silicon oxide in 2010, one of the first questions I got from industry was whether we could do this without fabricating edges. At the time we could not, but the change to porous silicon oxide finally allows us to do that.”

Wang said, “We also demonstrated that the porous silicon oxide material increased the endurance cycles more than 100 times as compared with previous nonporous silicon oxide memories. Finally, the porous silicon oxide material has a capacity of up to nine bits per cell, which is the highest number among oxide-based memories, and the multibit capacity is unaffected by high temperatures.”
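Nine bits per cell is a demanding target: each cell must reliably distinguish 2^9 = 512 resistance levels rather than the two of a conventional cell. Here's a back-of-the-envelope sketch; the resistance range is an invented placeholder, and real multi-level cells use calibrated, often non-uniform level spacing.

```python
def levels_needed(bits_per_cell):
    """A memory cell storing n bits must resolve 2**n distinct states."""
    return 2 ** bits_per_cell


def target_resistance(value, bits=9, r_min=1e3, r_max=1e6):
    """Map a multi-bit value onto one of 2**bits evenly spaced resistance
    targets. The ohm range is a placeholder for illustration only."""
    n_levels = levels_needed(bits)
    assert 0 <= value < n_levels
    step = (r_max - r_min) / (n_levels - 1)
    return r_min + value * step


print(levels_needed(1))   # 2 states for a conventional single-bit cell
print(levels_needed(9))   # 512 states for a nine-bit cell
```

The exponential growth in required levels is why temperature stability matters so much: thermal drift that is harmless with two widely separated states can blur 512 closely spaced ones.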

Tour said the latest developments with porous silicon oxide — reduced forming voltage, elimination of need for edge fabrication, excellent endurance cycling and multi-bit capacity — are extremely appealing to memory companies.

“This is a major accomplishment, and we’ve already been approached by companies interested in licensing this new technology,” he said.

Here’s a link to and a citation for the paper,

Nanoporous Silicon Oxide Memory by Gunuk Wang, Yang Yang, Jae-Hwang Lee, Vera Abramova, Huilong Fei, Gedeng Ruan, Edwin L. Thomas, and James M. Tour. Nano Lett., Article ASAP DOI: 10.1021/nl501803s Publication Date (Web): July 3, 2014

Copyright © 2014 American Chemical Society

This paper is behind a paywall.

As for my recent spate of posts on computers and chips, there’s a July 11, 2014 posting about IBM, a 7nm chip, and much more; a July 9, 2014 posting about Intel and its 14nm low-power chip processing and plans for a 10nm chip; and, finally, a June 26, 2014 posting about HP Labs and its plans for memristive-based computing and their project dubbed ‘The Machine’.

IBM weighs in with plans for a 7nm computer chip

On the heels of Intel’s announcement about a deal utilizing their 14nm low-power manufacturing process and speculations about a 10nm computer chip (my July 9, 2014 posting), IBM makes an announcement about a 7nm chip as per this July 10, 2014 news item on Azonano,

IBM today [July 10, 2014] announced it is investing $3 billion over the next 5 years in two broad research and early stage development programs to push the limits of chip technology needed to meet the emerging demands of cloud computing and Big Data systems. These investments will push IBM’s semiconductor innovations from today’s breakthroughs into the advanced technology leadership required for the future.

A very comprehensive July 10, 2014 news release lays out the company’s plans for this $3B investment representing 10% of IBM’s total research budget,

The first research program is aimed at so-called “7 nanometer and beyond” silicon technology that will address serious physical challenges that are threatening current semiconductor scaling techniques and will impede the ability to manufacture such chips. The second is focused on developing alternative technologies for post-silicon era chips using entirely different approaches, which IBM scientists and other experts say are required because of the physical limitations of silicon based semiconductors.

Cloud and big data applications are placing new challenges on systems, just as the underlying chip technology is facing numerous significant physical scaling limits.  Bandwidth to memory, high speed communication and device power consumption are becoming increasingly challenging and critical.

The teams will comprise IBM Research scientists and engineers from Albany and Yorktown, New York; Almaden, California; and Europe. In particular, IBM will be investing significantly in emerging areas of research that are already underway at IBM such as carbon nanoelectronics, silicon photonics, new memory technologies, and architectures that support quantum and cognitive computing. [emphasis mine]

These teams will focus on providing orders of magnitude improvement in system level performance and energy efficient computing. In addition, IBM will continue to invest in the nanosciences and quantum computing–two areas of fundamental science where IBM has remained a pioneer for over three decades.

7 nanometer technology and beyond

IBM researchers and other semiconductor experts predict that while challenging, semiconductors show promise to scale from today’s 22 nanometers down to 14 and then 10 nanometers in the next several years. However, scaling to 7 nanometers, and perhaps below, by the end of the decade will require significant investment and innovation in semiconductor architectures as well as invention of new tools and techniques for manufacturing.

“The question is not if we will introduce 7 nanometer technology into manufacturing, but rather how, when, and at what cost?” said John Kelly, senior vice president, IBM Research. “IBM engineers and scientists, along with our partners, are well suited for this challenge and are already working on the materials science and device engineering required to meet the demands of the emerging system requirements for cloud, big data, and cognitive systems. This new investment will ensure that we produce the necessary innovations to meet these challenges.”

“Scaling to 7nm and below is a terrific challenge, calling for deep physics competencies in processing nano materials affinities and characteristics. IBM is one of a very few companies who has repeatedly demonstrated this level of science and engineering expertise,” said Richard Doherty, technology research director, The Envisioneering Group.

Bridge to a “Post-Silicon” Era

Silicon transistors, tiny switches that carry information on a chip, have been made smaller year after year, but they are approaching a point of physical limitation. Their increasingly small dimensions, now reaching the nanoscale, will prohibit any gains in performance due to the nature of silicon and the laws of physics. Within a few more generations, classical scaling and shrinkage will no longer yield the sizable benefits of lower power, lower cost and higher speed processors that the industry has become accustomed to.

With virtually all electronic equipment today built on complementary metal–oxide–semiconductor (CMOS) technology, there is an urgent need for new materials and circuit architecture designs compatible with this engineering process as the technology industry nears physical scalability limits of the silicon transistor.

Beyond 7 nanometers, the challenges dramatically increase, requiring a new kind of material to power systems of the future, and new computing platforms to solve problems that are unsolvable or difficult to solve today. Potential alternatives include new materials such as carbon nanotubes, and non-traditional computational approaches such as neuromorphic computing, cognitive computing, machine learning techniques, and the science behind quantum computing.

As the leader in advanced schemes that point beyond traditional silicon-based computing, IBM holds over 500 patents for technologies that will drive advancements at 7nm and beyond silicon — more than twice the nearest competitor. These continued investments will accelerate the invention and introduction into product development for IBM’s highly differentiated computing systems for cloud, and big data analytics.

Several exploratory research breakthroughs that could lead to major advancements in delivering dramatically smaller, faster and more powerful computer chips include quantum computing, neurosynaptic computing, silicon photonics, carbon nanotubes, III-V technologies, low power transistors and graphene:

Quantum Computing

The most basic piece of information that a typical computer understands is a bit. Much like a light that can be switched on or off, a bit can have only one of two values: “1” or “0.” A quantum bit, or qubit, by contrast, can hold both values at the same time. Described as superposition, this special property of qubits enables quantum computers to weed through millions of solutions all at once, while desktop PCs would have to consider them one at a time.
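The ‘millions at once’ claim can be made concrete with a small state-vector sketch: applying a Hadamard gate to each of n qubits puts the register into an equal superposition of all 2^n bit strings. (This is illustrative only; getting a useful answer out requires quantum interference, not just parallel storage, and measuring collapses the state to a single bit string.)

```python
import math


def uniform_superposition(n_qubits):
    """State vector after a Hadamard gate on every qubit of |00...0>.

    Each of the 2**n basis states (bit strings) carries the equal
    amplitude 1/sqrt(2**n); this is the sense in which n qubits 'hold'
    all 2**n classical values at once.
    """
    dim = 2 ** n_qubits
    amplitude = 1.0 / math.sqrt(dim)
    return [amplitude] * dim


state = uniform_superposition(20)
print(len(state))                 # 1048576 basis states from just 20 qubits
print(sum(a * a for a in state))  # squared amplitudes (probabilities) sum to 1.0
```

Note the classical cost of the simulation: a conventional machine needs 2^n numbers to describe what n qubits hold, which is exactly why simulating quantum systems is hard for desktop PCs.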

IBM is a world leader in superconducting qubit-based quantum computing science and is a pioneer in the field of experimental and theoretical quantum information, fields that are still in the category of fundamental science – but one that, in the long term, may allow the solution of problems that are today either impossible or impractical to solve using conventional machines. The team recently demonstrated the first experimental realization of parity check with three superconducting qubits, an essential building block for one type of quantum computer.

Neurosynaptic Computing

Bringing together nanoscience, neuroscience, and supercomputing, IBM and university partners have developed an end-to-end ecosystem including a novel non-von Neumann architecture, a new programming language, as well as applications. This novel technology allows for computing systems that emulate the brain’s computing efficiency, size and power usage. IBM’s long-term goal is to build a neurosynaptic system with ten billion neurons and a hundred trillion synapses, all while consuming only one kilowatt of power and occupying less than two liters of volume.

Silicon Photonics

IBM has been a pioneer in the area of CMOS integrated silicon photonics for over 12 years, a technology that integrates functions for optical communications on a silicon chip, and the IBM team has recently designed and fabricated the world’s first monolithic silicon photonics based transceiver with wavelength division multiplexing.  Such transceivers will use light to transmit data between different components in a computing system at high data rates, low cost, and in an energetically efficient manner.

Silicon nanophotonics takes advantage of pulses of light for communication rather than traditional copper wiring and provides a super highway for large volumes of data to move at rapid speeds between computer chips in servers, large datacenters, and supercomputers, thus alleviating the limitations of congested data traffic and high-cost traditional interconnects.

Businesses are entering a new era of computing that requires systems to process and analyze, in real-time, huge volumes of information known as Big Data. Silicon nanophotonics technology provides answers to Big Data challenges by seamlessly connecting various parts of large systems, whether a few centimeters or a few kilometers apart from each other, and moving terabytes of data via pulses of light through optical fibers.

III-V technologies

IBM researchers have demonstrated the world’s highest transconductance on a self-aligned III-V channel metal-oxide semiconductor (MOS) field-effect transistor (FET) device structure that is compatible with CMOS scaling. These materials and structural innovations are expected to pave the way for technology scaling at 7nm and beyond. With more than an order of magnitude higher electron mobility than silicon, integrating III-V materials into CMOS enables higher performance at lower power density, allowing for an extension of power/performance scaling to meet the demands of cloud computing and big data systems.

Carbon Nanotubes

IBM Researchers are working in the area of carbon nanotube (CNT) electronics and exploring whether CNTs can replace silicon beyond the 7 nm node.  As part of its activities for developing carbon nanotube based CMOS VLSI circuits, IBM recently demonstrated — for the first time in the world — 2-way CMOS NAND gates using 50 nm gate length carbon nanotube transistors.

IBM also has demonstrated the capability for purifying carbon nanotubes to 99.99 percent, the highest (verified) purities demonstrated to date, and transistors at 10 nm channel length that show no degradation due to scaling; this is unmatched by any other material system to date.

Carbon nanotubes are single atomic sheets of carbon rolled up into a tube. The carbon nanotubes form the core of a transistor device that will work in a fashion similar to the current silicon transistor, but will be better performing. They could be used to replace the transistors in chips that power data-crunching servers, high performing computers and ultra fast smart phones.

Carbon nanotube transistors can operate as excellent switches at molecular dimensions of less than ten nanometers – 10,000 times thinner than a strand of human hair and less than half the size of the leading silicon technology. Comprehensive modeling of the electronic circuits suggests that about a five to ten times improvement in performance compared to silicon circuits is possible.

Graphene

Graphene is pure carbon in the form of a one atomic layer thick sheet.  It is an excellent conductor of heat and electricity, and it is also remarkably strong and flexible.  Electrons can move in graphene about ten times faster than in commonly used semiconductor materials such as silicon and silicon germanium. Its characteristics offer the possibility to build faster switching transistors than are possible with conventional semiconductors, particularly for applications in the handheld wireless communications business where it will be a more efficient switch than those currently used.

In 2013, IBM demonstrated the world’s first graphene-based integrated circuit receiver front end for wireless communications. The circuit consisted of a 2-stage amplifier and a down converter operating at 4.3 GHz.

Next Generation Low Power Transistors

In addition to new materials like CNTs, new architectures and innovative device concepts are required to boost future system performance. Power dissipation is a fundamental challenge for nanoelectronic circuits. To explain the challenge, consider a leaky water faucet — even after closing the valve as far as possible water continues to drip — this is similar to today’s transistor, in that energy is constantly “leaking” or being lost or wasted in the off-state.

A potential alternative to today’s power hungry silicon field effect transistors are so-called steep slope devices. They could operate at much lower voltage and thus dissipate significantly less power. IBM scientists are researching tunnel field effect transistors (TFETs). In this special type of transistors the quantum-mechanical effect of band-to-band tunneling is used to drive the current flow through the transistor. TFETs could achieve a 100-fold power reduction over complementary CMOS transistors, so integrating TFETs with CMOS technology could improve low-power integrated circuits.
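The advantage of steep-slope devices can be quantified. A conventional transistor's turn-off sharpness (subthreshold swing) is limited by Boltzmann statistics to about 60 mV per decade of current at room temperature; TFETs can beat that limit because band-to-band tunneling filters out the hot tail of the electron energy distribution. A quick calculation of the conventional limit:

```python
import math


def thermal_swing_mV_per_decade(temp_kelvin=300.0):
    """Boltzmann limit on a conventional MOSFET's subthreshold swing.

    S = ln(10) * k*T/q, i.e. the minimum gate-voltage change needed to
    shift the drain current by a factor of ten. Steep-slope devices
    such as TFETs are not bound by this thermal limit.
    """
    k = 1.380649e-23     # Boltzmann constant, J/K
    q = 1.602176634e-19  # elementary charge, C
    return math.log(10) * k * temp_kelvin / q * 1e3  # millivolts per decade


print(round(thermal_swing_mV_per_decade(), 1))  # 59.5
```

A sub-60 mV/decade swing lets a transistor reach its on-state at a lower supply voltage, and since dynamic power scales with voltage squared, that is where the promised 100-fold power reduction comes from.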

Recently, IBM has developed a novel method to integrate III-V nanowires and heterostructures directly on standard silicon substrates and built the first ever InAs/Si tunnel diodes and TFETs using InAs as source and Si as channel with wrap-around gate as steep slope device for low power consumption applications.

“In the next ten years computing hardware systems will be fundamentally different as our scientists and engineers push the limits of semiconductor innovations to explore the post-silicon future,” said Tom Rosamilia, senior vice president, IBM Systems and Technology Group. “IBM Research and Development teams are creating breakthrough innovations that will fuel the next era of computing systems.”

IBM’s historic contributions to silicon and semiconductor innovation include the invention and/or first implementation of: the single cell DRAM, the “Dennard scaling laws” underpinning “Moore’s Law”, chemically amplified photoresists, copper interconnect wiring, Silicon on Insulator, strained engineering, multi core microprocessors, immersion lithography, high speed silicon germanium (SiGe), High-k gate dielectrics, embedded DRAM, 3D chip stacking, and Air gap insulators.

IBM researchers also are credited with initiating the era of nano devices following the Nobel prize winning invention of the scanning tunneling microscope which enabled nano and atomic scale invention and innovation.

IBM will also continue to fund and collaborate with university researchers to explore and develop the future technologies for the semiconductor industry. In particular, IBM will continue to support and fund university research through private-public partnerships such as the Nanoelectronics Research Initiative (NRI), the Semiconductor Technology Advanced Research Network (STARnet), and the Global Research Consortium (GRC) of the Semiconductor Research Corporation.

I highlighted ‘new memory technologies’ as this brings to mind HP Labs and their major investment in ‘memristive’ technologies noted in my June 26, 2014 posting,

… During a two-hour presentation held a year and a half ago, they laid out how the computer might work, its benefits, and the expectation that about 75 percent of HP Labs personnel would be dedicated to this one project. “At the end, Meg [Meg Whitman, CEO of HP] turned to [Chief Financial Officer] Cathie Lesjak and said, ‘Find them more money,’” says John Sontag, the vice president of systems research at HP, who attended the meeting and is in charge of bringing the Machine to life. “People in Labs see this as a once-in-a-lifetime opportunity.”

The Machine is based on the memristor and other associated technologies.

Getting back to IBM, there’s this analysis of the $3B investment ($600M/year for five years) by Alex Konrad in a July 10, 2014 article for Forbes (Note: A link has been removed),

When IBM … announced a $3 billion commitment to even tinier semiconductor chips that no longer depended on silicon on Wednesday, the big news was that IBM’s putting a lot of money into a future for chips where Moore’s Law no longer applies. But on second glance, the move to spend billions on more experimental ideas like silicon photonics and carbon nanotubes shows that IBM’s finally shifting large portions of its research budget into more ambitious and long-term ideas.

… IBM tells Forbes the $3 billion isn’t additional money being added to its R&D spend, an area where analysts have told Forbes they’d like to see more aggressive cash commitments in the future. IBM will still spend about $6 billion a year on R&D, 6% of revenue. Ten percent of that research budget, however, now has to come from somewhere else to fuel these more ambitious chip projects.

Neal Ungerleider’s July 11, 2014 article for Fast Company focuses on the neuromorphic computing and quantum computing aspects of this $3B initiative (Note: Links have been removed),

The new R&D initiatives fall into two categories: Developing nanotech components for silicon chips for big data and cloud systems, and experimentation with “post-silicon” microchips. This will include research into quantum computers which don’t know binary code, neurosynaptic computers which mimic the behavior of living brains, carbon nanotubes, graphene tools and a variety of other technologies.

IBM’s investment is one of the largest for quantum computing to date; the company is one of the biggest researchers in the field, along with a Canadian company named D-Wave which is partnering with Google and NASA to develop quantum computer systems.

The curious can find D-Wave Systems here. There’s also a January 19, 2012 posting here which discusses D-Wave’s situation at that time.

Final observation: these are fascinating developments, especially for the insight they provide into the worries troubling HP Labs, Intel, and IBM as they jockey for position.

ETA July 14, 2014: Dexter Johnson has a July 11, 2014 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) about the IBM announcement, which features some responses he received from IBM officials to his queries,

While this may be a matter of fascinating speculation for investors, the impact on nanotechnology development is going to be significant. To get a better sense of what it all means, I was able to talk to some of the key figures of IBM’s push in nanotechnology research.

I conducted e-mail interviews with Tze-Chiang (T.C.) Chen, vice president science & technology, IBM Fellow at the Thomas J. Watson Research Center and Wilfried Haensch, senior manager, physics and materials for logic and communications, IBM Research.

Silicon versus Nanomaterials

First, I wanted to get a sense for how long IBM envisioned sticking with silicon and when they expected the company would permanently make the move away from CMOS to alternative nanomaterials. Unfortunately, as expected, I didn’t get solid answers, except for them to say that new manufacturing tools and techniques need to be developed now.

He goes on to ask about carbon nanotubes and graphene. Interestingly, IBM does not have a wide range of electronics applications in mind for graphene. I encourage you to read Dexter’s posting, as he got answers to some very astute and pointed questions.

The age of the ‘nano-pixel’

As mentioned here before, ‘The Diamond Age: Or, A Young Lady’s Illustrated Primer’, a 1995 novel by Neal Stephenson, featured in its opening chapter a flexible, bendable, rollable newspaper screen. It’s one of those devices promised by ‘nano evangelists’ that never quite seems to come into existence. However, ‘hope springs eternal’, as they say, and a team from the University of Oxford claims to be bringing us one step closer.

From a July 10, 2014 University of Oxford press release (also on EurekAlert but dated July 9, 2014 and on Azonano as a July 10, 2014 news item),

A new discovery will make it possible to create pixels just a few hundred nanometres across that could pave the way for extremely high-resolution and low-energy thin, flexible displays for applications such as ‘smart’ glasses, synthetic retinas, and foldable screens.

A team led by Oxford University scientists explored the link between the electrical and optical properties of phase change materials (materials that can change from an amorphous to a crystalline state). They found that by sandwiching a seven nanometre thick layer of a phase change material (GST) between two layers of a transparent electrode they could use a tiny current to ‘draw’ images within the sandwich ‘stack’.

Here’s a series of images the researchers have created using this technology,

Still images drawn with the technology: at around 70 micrometres across each image is smaller than the width of a human hair. Courtesy University of Oxford

The press release offers a technical description,

Initially still images were created using an atomic force microscope but the team went on to demonstrate that such tiny ‘stacks’ can be turned into prototype pixel-like devices. These ‘nano-pixels’ – just 300 by 300 nanometres in size – can be electrically switched ‘on and off’ at will, creating the coloured dots that would form the building blocks of an extremely high-resolution display technology.

‘We didn’t set out to invent a new kind of display,’ said Professor Harish Bhaskaran of Oxford University’s Department of Materials, who led the research. ‘We were exploring the relationship between the electrical and optical properties of phase change materials and then had the idea of creating this GST ‘sandwich’ made up of layers just a few nanometres thick. We found that not only were we able to create images in the stack but, to our surprise, thinner layers of GST actually gave us better contrast. We also discovered that altering the size of the bottom electrode layer enabled us to change the colour of the image.’

The layers of the GST sandwich are created using a sputtering technique where a target is bombarded with high energy particles so that atoms from the target are deposited onto another material as a thin film.

‘Because the layers that make up our devices can be deposited as thin films they can be incorporated into very thin flexible materials – we have already demonstrated that the technique works on flexible Mylar sheets around 200 nanometres thick,’ said Professor Bhaskaran. ‘This makes them potentially useful for ‘smart’ glasses, foldable screens, windshield displays, and even synthetic retinas that mimic the abilities of photoreceptor cells in the human eye.’

Peiman Hosseini of Oxford University’s Department of Materials, first author of the paper, said: ‘Our models are so good at predicting the experiment that we can tune our prototype ‘pixels’ to create any colour we want – including the primary colours needed for a display. One of the advantages of our design is that, unlike most conventional LCD screens, there would be no need to constantly refresh all pixels, you would only have to refresh those pixels that actually change (static pixels remain as they were). This means that any display based on this technology would have extremely low energy consumption.’
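The refresh scheme Hosseini describes is, in essence, a dirty-pixel update: since the GST ‘pixels’ are non-volatile, only the pixels whose value actually changes between frames need to be rewritten. Here’s a minimal sketch of that idea in Python; the frame representation and the `refresh` helper are my own illustration, not anything from the Oxford work,

```python
# Hypothetical sketch of selective ("dirty pixel") refresh: because the
# pixels hold their state without power, a new frame only requires
# writes to the pixels that differ from the previous frame.

def refresh(previous, current):
    """Return the list of (x, y, value) writes needed to show `current`,
    given that the display is already showing `previous`."""
    writes = []
    for y, (old_row, new_row) in enumerate(zip(previous, current)):
        for x, (old, new) in enumerate(zip(old_row, new_row)):
            if old != new:  # static pixels are left untouched
                writes.append((x, y, new))
    return writes

frame_a = [[0, 0], [0, 1]]
frame_b = [[0, 1], [0, 1]]
print(refresh(frame_a, frame_b))  # → [(1, 0, 1)]
```

A mostly static image (an e-reader page, say) would generate almost no writes at all, which is where the claimed energy savings come from.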

The research suggests that flexible paper-thin displays based on the technology could have the capacity to switch between a power-saving ‘colour e-reader mode’ and a backlit display capable of showing video. Such displays could be created using cheap materials and, because they would be solid-state, promise to be reliable and easy to manufacture. The tiny ‘nano-pixels’ make the technology ideal for applications, such as smart glasses, where an image would be projected at a larger size since, even enlarged, they would offer very high resolution.

Professor David Wright of the Department of Engineering at the University of Exeter, co-author of the paper, said: ‘Along with many other researchers around the world we have been looking into the use of these GST materials for memory applications for many years, but no one before thought of combining their electrical and optical functionality to provide entirely new kinds of non-volatile, high-resolution, electronic colour displays – so our work is a real breakthrough.’

The phase change material used was the alloy Ge2Sb2Te5 (Germanium-Antimony-Tellurium or GST) sandwiched between electrode layers made of indium tin oxide (ITO).

I gather the researchers are looking for investors (from the press release),

Whilst the work is still in its early stages, realising its potential, the Oxford team has filed a patent on the discovery with the help of Isis Innovation, Oxford University’s technology commercialisation company. Isis is now discussing the displays with companies who are interested in assessing the technology, and with investors.

Here’s a link to and a citation for the paper,

An optoelectronic framework enabled by low-dimensional phase-change films by Peiman Hosseini, C. David Wright, & Harish Bhaskaran. Nature 511, 206–211 (10 July 2014) doi:10.1038/nature13487 Published online 09 July 2014

This paper is behind a paywall.