In a groundbreaking development, Professor Xingbin Yan and his team have successfully merged two seemingly disparate research areas: supercapacitors, traditionally used in energy storage, and memristors, integral to neuromorphic computing. Their study introduces a novel phenomenon—the memristive effect in supercapacitors.
“Scientifically, we combine two seemingly disparate research areas: supercapacitors, traditionally used in energy storage, and memristors, integral to neuromorphic computing,” Prof. Xingbin Yan said. “The introduction of memristive behavior in supercapacitors not only enriches the fundamental physics underlying capacitive and memristive components but also extends supercapacitor technology into the realm of artificial synapses. This advancement opens new avenues for developing neuromorphic devices with advanced functionalities, offering a fresh research paradigm for the field.”
In 1971, Leon Chua at UC Berkeley introduced the memristor, proposing it as the fourth fundamental circuit element. Memristors have variable resistance that retains its value after current stops, mimicking neuron behavior, and are considered key for future memory and AI devices. In 2008, HP Labs [R. Stanley Williams and his team] developed nanoscale memristors based on TiO2. However, solid-state devices struggle with simulating chemical synapses. Fluidic memristors are promising due to their biocompatibility and ability to perform various neuromorphic functions. Confining ions in nanoscale channels allows for functionalities like ion diodes and switches, with some systems exhibiting memristive effects.
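For anyone who wants to see what a memristor actually does, the HP device is commonly described with the linear ion-drift model from the team’s 2008 Nature paper. Here’s a minimal Python sketch of that model (the parameter values are illustrative, not HP’s actual device values), showing the defining ‘pinched hysteresis’ behaviour: zero current at zero voltage, but a resistance that remembers its history,

```python
import math

# Linear ion-drift memristor model (Strukov et al., Nature 2008). Parameter
# values below are illustrative only, not the actual HP device values.
R_ON, R_OFF = 100.0, 16e3    # ohms: fully doped / fully undoped resistance
D = 10e-9                    # m: TiO2 film thickness
MU = 1e-14                   # m^2 s^-1 V^-1: dopant (ion) mobility

def simulate(voltages, dt, x0=0.1):
    """Integrate the normalized state x = w/D under a voltage drive."""
    x = x0
    i_hist = []
    for v in voltages:
        m = R_ON * x + R_OFF * (1.0 - x)   # memristance M(x)
        i = v / m
        x += dt * MU * R_ON / D**2 * i     # ion drift moves the doped boundary
        x = min(max(x, 0.0), 1.0)          # state stays in [0, 1]
        i_hist.append(i)
    return i_hist

steps = 2000
dt = 1.0 / steps
voltages = [math.sin(2 * math.pi * 2 * k * dt) for k in range(steps)]
currents = simulate(voltages, dt)
# Pinched hysteresis: whenever the drive voltage is zero the current is zero,
# yet the resistance "remembers" the charge that has already passed through.
print(currents[0])  # 0.0 at v = 0
```

Plotting current against voltage for this sweep traces the pinched hysteresis loop that is the memristor’s fingerprint.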
In 2021, Bocquet [Lydéric Bocquet, École Normale Supérieure] et al. predicted that two-dimensional nanoscopic channels could achieve ionic memory functions. Their simulations showed strong nonlinear transport effects in these channels. They confined electrolytes to a monolayer and observed that salts could form tightly bound pairs. Following this, Bocquet’s team created nanoscale fluidic devices with salt solutions, showing hysteresis effects and variable memory times. Similarly, Mao et al. found comparable results with polymer electrolyte brushes, demonstrating hysteresis and frequency-dependent I-V curves. Both studies highlight advances in controlling ions in nanofluidic devices that mimic biological systems.
Supercapacitors are known for their higher power density, rapid response, and long lifespan, making them essential for applications in electronics, aerospace, transportation, and smart grids. Recently, a novel type of capacitive ionic device, called supercapacitor-diodes (CAPodes), has been introduced. These devices enable controlled and selective unidirectional ion transport, enhancing the functionality of supercapacitors.
In supercapacitors, charge storage occurs through ion adsorption or rapid redox reactions at the electrode surface, a principle similar to that in fluidic memristors. Inspired by CAPodes, the innovative idea is to explore whether a supercapacitor can be designed with nano-ion channels within the electrode material to achieve memory performance similar to that of fluidic memristors. If feasible, this would not only enhance traditional energy storage but also enable hysteresis in the transport and redistribution of electrolyte ions under varying electric fields.
In this design, the nanochannels of the ZIF-7 electrode in an aqueous supercapacitor allow for the enrichment and dissipation of anionic species (OH−) under varying voltage regimes. This results in a hysteresis effect in ion conductivity, which imparts memristive behavior to the supercapacitor. Consequently, the CAPistor combines the programmable resistance and memory functions of an ionic memristor with the energy storage capabilities of a supercapacitor. This integration opens up new possibilities for extending supercapacitors’ traditional applications into advanced fields such as biomimetic nanofluidic ionics and neuromorphic computing.
Here’s a link to and a citation for the paper,
Constructing a supercapacitor-memristor through non-linear ion transport in MOF nanochannels by Pei Tang, Pengwei Jing, Zhiyuan Luo, Kekang Liu, Xiaoxi Zhao, Yining Lao, Qianqian Yao, Chuyi Zhong, Qingfeng Fu, Jian Zhu, Yanghui Liu, Qingyun Dou, Xingbin Yan. National Science Review, Volume 11, Issue 10, October 2024, nwae322, DOI: https://doi.org/10.1093/nsr/nwae322 Published: 11 September 2024
Katherine Bourzac’s September 16, 2024 article for IEEE (Institute of Electrical and Electronics Engineers) Spectrum magazine provides an accessible (relatively speaking) description of a possible breakthrough for neuromorphic computing. Note: Links have been removed,
In electrical engineering, “we just take it for granted that the signal decays” as it travels, says Timothy Brown, a postdoc in materials physics at Sandia National Lab who was part of the group of researchers who made the self-amplifying device. Even the best wires and chip interconnects put up resistance to the flow of electrons, degrading signal quality over even relatively small distances. This constrains chip designs—lossy interconnects are broken up into ever smaller lengths, and signals are bolstered by buffers and drivers. A 1-square-centimeter chip has about 10,000 repeaters to drive signals, estimates R. Stanley Williams, a professor of computer engineering at Texas A&M University.
Williams is one of the pioneers of neuromorphic computing, which takes inspiration from the nervous system. Axons, the electrical cables that carry signals from the body of a nerve cell to synapses where they connect with projections from other cells, are made up of electrically resistant materials. Yet they can carry high fidelity signals over long distances. The longest axons in the human body are about 1 meter, running from the base of the spine to the feet. Blue whales are thought to have 30 m long axons stretching to the tips of their tails. If something bites the whale’s tail, it will react rapidly. Even from 30 meters away, “the pulses arrive perfectly,” says Williams. “That’s something that doesn’t exist in electrical engineering.”
That’s because axons are active transmission lines: they provide gain to the signal along their length. Williams says he started pondering how to mimic this in an inorganic system 12 years ago. A grant from the US Department of Energy enabled him to build a team with the necessary resources to make it happen. The team included Williams, Brown, and Suhas Kumar, a materials physicist at Sandia.
Axons are coated with an insulating layer called the myelin sheath. Where there are gaps in the sheath, positively charged sodium and potassium ions can move in and out of the axon, changing the voltage across the cell membrane and pumping in energy in the process. Some of that energy gets taken up by the electrical signal, amplifying it.
…
Williams and his team wanted to mimic this in a simple structure. They didn’t try to mimic all the physical structures in axons—instead, they sought guidance in a mathematical description of how they amplify signals. Axons operate in a mode called the “edge of chaos,” which combines stable and unstable qualities. This may seem inherently contradictory. Brown likens this kind of system to a saddle that’s curved with two dips. The saddle curves up towards the front and the back, keeping you stable as you rock back and forth. But if you get jostled from side to side, you’re more likely to fall off. When you’re riding in the saddle, you’re operating at the edge of chaos, in a semistable state. In the abstract space of electrical engineering, that jostling is equivalent to wiggles in current and voltage.
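Brown’s saddle analogy translates neatly into a toy dynamical system: one direction that relaxes back toward the seat and one that runs away from it. The little Python sketch below is my own illustration of a saddle point, not the model from the Nature paper,

```python
# A saddle point: the simplest "semistable" system in Brown's analogy.
# One direction relaxes back (stable), the other runs away (unstable).
# This is an illustrative toy, not the model from the Nature paper.
def saddle_step(state, dt=0.01):
    x, y = state
    return (x - dt * x,     # front-back rocking: pulled back toward the seat
            y + dt * y)     # side-to-side jostle: pushed away from the seat

front_back = (0.5, 0.0)     # perturbation along the stable direction
side_to_side = (0.0, 0.5)   # perturbation along the unstable direction
for _ in range(1000):
    front_back = saddle_step(front_back)
    side_to_side = saddle_step(side_to_side)

print(front_back[0])    # decays toward 0: the rider stays seated
print(side_to_side[1])  # grows without bound: the rider falls off
```

Operating “at the edge of chaos” means sitting near such a point, where small wiggles in current and voltage can either die away or be amplified depending on their direction.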
…
There’s a long way to go from this first experimental demonstration to a reimagining of computer chip interconnects. The team is providing samples for other researchers [emphasis mine] who want to verify their measurements. And they’re trying other materials to see how well they do—LaCoO3 [lanthanum cobalt oxide] is only the first one they’ve tested.
Williams hopes this research will show electrical engineers new ideas about how to move forward. “The dream is to redesign chips,” he says. Electrical engineers have long known about nonlinear dynamics, but have hardly ever taken advantage of them, Williams says. “This requires thinking about things and doing measurements differently than they have been done for 50 years,” he says.
If you have the time, please read Bourzac’s September 16, 2024 article in its entirety. For those who want the technical nitty gritty, here’s a link to and a citation for the paper,
Axon-like active signal transmission by Timothy D. Brown, Alan Zhang, Frederick U. Nitta, Elliot D. Grant, Jenny L. Chong, Jacklyn Zhu, Sritharini Radhakrishnan, Mahnaz Islam, Elliot J. Fuller, A. Alec Talin, Patrick J. Shamberger, Eric Pop, R. Stanley Williams & Suhas Kumar. Nature volume 633, pages 804–810 (2024) DOI: https://doi.org/10.1038/s41586-024-07921-z Published online: 11 September 2024 Issue Date: 26 September 2024
In a landmark advancement, researchers at the Indian Institute of Science (IISc) have developed a brain-inspired analog computing platform capable of storing and processing data in an astonishing 16,500 conductance states within a molecular film. Published today in the journal Nature, this breakthrough represents a huge step forward over traditional digital computers in which data storage and processing are limited to just two states.
Such a platform could potentially bring complex AI tasks, like training Large Language Models (LLMs), to personal devices like laptops and smartphones, thus taking us closer to democratising the development of AI tools. These developments are currently restricted to resource-heavy data centres, due to a lack of energy-efficient hardware. With silicon electronics nearing saturation, designing brain-inspired accelerators that can work alongside silicon chips to deliver faster, more efficient AI is also becoming crucial.
“Neuromorphic computing has had its fair share of unsolved challenges for over a decade,” explains Sreetosh Goswami, Assistant Professor at the Centre for Nano Science and Engineering (CeNSE), IISc, who led the research team. “With this discovery, we have almost nailed the perfect system – a rare feat.”
The fundamental operation underlying most AI algorithms is quite basic – matrix multiplication, a concept taught in high school maths. But in digital computers, these calculations hog a lot of energy. The platform developed by the IISc team drastically cuts down both the time and energy involved, making these calculations a lot faster and easier.
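To make that point concrete, here’s a toy Python sketch (my own, not the IISc team’s code) that counts the arithmetic work in a plain matrix multiplication. An n × n product costs roughly 2n³ operations on a digital machine, and that is exactly the kernel analog in-memory accelerators target,

```python
# Counting the work in a plain matrix multiplication. An n x n product takes
# roughly 2*n**3 arithmetic operations on a digital machine, which is the
# kernel analog in-memory accelerators aim to speed up. Toy illustration only.
def matmul_with_count(a, b):
    n, m, p = len(a), len(b), len(b[0])
    ops = 0
    out = [[0.0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            for k in range(m):
                out[i][j] += a[i][k] * b[k][j]  # one multiply + one add
                ops += 2
    return out, ops

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
c, ops = matmul_with_count(a, b)
print(c)    # [[19.0, 22.0], [43.0, 50.0]]
print(ops)  # 16 operations for a 2x2 product; grows as 2*n**3
```

An analog crossbar performs all of those multiply-accumulates in one physical step, by letting currents sum along wires, which is where the time and energy savings come from.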
The molecular system at the heart of the platform was designed by Sreebrata Goswami, Visiting Professor at CeNSE. As molecules and ions wiggle and move within a material film, they create countless unique memory states, many of which have been inaccessible so far. Most digital devices are only able to access two states (high and low conductance), without being able to tap into the infinite number of intermediate states possible.
By using precisely timed voltage pulses, the IISc team found a way to effectively trace a much larger number of molecular movements, and map each of these to a distinct electrical signal, forming an extensive “molecular diary” of different states. “This project brought together the precision of electrical engineering with the creativity of chemistry, letting us control molecular kinetics very precisely inside an electronic circuit powered by nanosecond voltage pulses,” explains Sreebrata Goswami.
Tapping into these tiny molecular changes allowed the team to create a highly precise and efficient neuromorphic accelerator, which can store and process data within the same location, similar to the human brain. Such accelerators can be seamlessly integrated with silicon circuits to boost their performance and energy efficiency.
A key challenge that the team faced was characterising the various conductance states, which proved impossible using existing equipment. The team designed a custom circuit board that could measure voltages as tiny as a millionth of a volt, to pinpoint these individual states with unprecedented accuracy.
The team also turned this scientific discovery into a technological feat. They were able to recreate NASA’s iconic “Pillars of Creation” image from the James Webb Space Telescope data – originally created by a supercomputer – using just a tabletop computer. They were also able to do this at a fraction of the time and energy that traditional computers would need.
The team includes several students and research fellows at IISc. Deepak Sharma performed the circuit and system design and electrical characterisation, Santi Prasad Rath handled synthesis and fabrication, Bidyabhusan Kundu tackled the mathematical modelling, and Harivignesh S crafted bio-inspired neuronal response behaviour. The team also collaborated with Stanley Williams [also known as R. Stanley Williams], Professor at Texas A&M University and Damien Thompson, Professor at the University of Limerick.
The researchers believe that this breakthrough could be one of India’s biggest leaps in AI hardware, putting the country on the map of global technology innovation. Navakanta Bhat, Professor at CeNSE and an expert in silicon electronics, led the circuit and system design in this project. “What stands out is how we have transformed complex physics and chemistry understanding into groundbreaking technology for AI hardware,” he explains. “In the context of the India Semiconductor Mission, this development could be a game-changer, revolutionising industrial, consumer and strategic applications. The national importance of such research cannot be overstated.”
With support from the Ministry of Electronics and Information Technology, the IISc team is now focused on developing a fully indigenous integrated neuromorphic chip. “This is a completely home-grown effort, from materials to circuits and systems,” emphasises Sreetosh Goswami. “We are well on our way to translating this technology into a system-on-a-chip.”
Here’s a link to and a citation for the paper,
Linear symmetric self-selecting 14-bit kinetic molecular memristors by Deepak Sharma, Santi Prasad Rath, Bidyabhusan Kundu, Anil Korkmaz, Harivignesh S, Damien Thompson, Navakanta Bhat, Sreebrata Goswami, R. Stanley Williams & Sreetosh Goswami. Nature volume 633, pages 560–566 (2024) DOI: https://doi.org/10.1038/s41586-024-07902-2 Published online: 11 September 2024 Issue Date: 19 September 2024
In a February 13, 2023 essay, Michael Berger who runs the Nanowerk website provides an overview of brainlike (neuromorphic) engineering.
This essay is the most extensive piece I’ve seen on Berger’s website and it covers everything from the reasons why scientists are so interested in mimicking the human brain to specifics about memristors. Here are a few excerpts (Note: Links have been removed),
Neuromorphic engineering is a cutting-edge field that focuses on developing computer hardware and software systems inspired by the structure, function, and behavior of the human brain. The ultimate goal is to create computing systems that are significantly more energy-efficient, scalable, and adaptive than conventional computer systems, capable of solving complex problems in a manner reminiscent of the brain’s approach.
This interdisciplinary field draws upon expertise from various domains, including neuroscience, computer science, electronics, nanotechnology, and materials science. Neuromorphic engineers strive to develop computer chips and systems incorporating artificial neurons and synapses, designed to process information in a parallel and distributed manner, akin to the brain’s functionality.
Key challenges in neuromorphic engineering encompass developing algorithms and hardware capable of performing intricate computations with minimal energy consumption, creating systems that can learn and adapt over time, and devising methods to control the behavior of artificial neurons and synapses in real-time.
Neuromorphic engineering has numerous applications in diverse areas such as robotics, computer vision, speech recognition, and artificial intelligence. The aspiration is that brain-like computing systems will give rise to machines better equipped to tackle complex and uncertain tasks, which currently remain beyond the reach of conventional computers.
It is essential to distinguish between neuromorphic engineering and neuromorphic computing, two related but distinct concepts. Neuromorphic computing represents a specific application of neuromorphic engineering, involving the utilization of hardware and software systems designed to process information in a manner akin to human brain function.
…
One of the major obstacles in creating brain-inspired computing systems is the vast complexity of the human brain. Unlike traditional computers, the brain operates as a nonlinear dynamic system that can handle massive amounts of data through various input channels, filter information, store key information in short- and long-term memory, learn by analyzing incoming and stored data, make decisions in a constantly changing environment, and do all of this while consuming very little power.
…
The Human Brain Project [emphasis mine], a large-scale research project launched in 2013, aims to create a comprehensive, detailed, and biologically realistic simulation of the human brain, known as the Virtual Brain. One of the goals of the project is to develop new brain-inspired computing technologies, such as neuromorphic computing.
The final Human Brain Project Summit 2023 will take place in Marseille, France, from March 28-31, 2023.
As the ten-year European Flagship Human Brain Project (HBP) approaches its conclusion in September 2023, the final HBP Summit will highlight the scientific achievements of the project at the interface of neuroscience and technology and the legacy that it will leave for the brain research community. …
One last excerpt from the essay,
Neuromorphic computing is a radical reimagining of computer architecture at the transistor level, modeled after the structure and function of biological neural networks in the brain. This computing paradigm aims to build electronic systems that attempt to emulate the distributed and parallel computation of the brain by combining processing and memory in the same physical location.
This is unlike traditional computing, which is based on von Neumann systems consisting of three different units: processing unit, I/O unit, and storage unit. This stored program architecture is a model for designing computers that uses a single memory to store both data and instructions, and a central processing unit to execute those instructions. This design, first proposed by mathematician and computer scientist John von Neumann, is widely used in modern computers and is considered to be the standard architecture for computer systems and relies on a clear distinction between memory and processing.
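For the curious, a stored-program machine can be sketched in a few lines: a single memory array holds both instructions and data, and a fetch-decode-execute loop shuttles everything through the processor. The three-instruction machine below is a hypothetical illustration, not any real instruction set,

```python
# A toy stored-program (von Neumann) machine: one memory array holds both
# instructions and data, and a fetch-decode-execute loop moves everything
# through the CPU. Hypothetical 3-instruction ISA for illustration only.
def run(memory):
    acc, pc = 0, 0                    # accumulator and program counter
    while True:
        op, arg = memory[pc]          # fetch from the single shared memory
        pc += 1
        if op == "LOAD":              # decode + execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "HALT":
            return acc

# Program (entries 0-2) and data (entries 4-5) live side by side in memory.
mem = [("LOAD", 4), ("ADD", 5), ("HALT", 0), None, 10, 32]
print(run(mem))  # 42
```

Every `memory[...]` access in that loop is traffic between processor and memory; that round-tripping is the von Neumann bottleneck that in-memory neuromorphic designs try to eliminate.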
…
I found the diagram Berger included, contrasting von Neumann’s design with a neuromorphic design, illuminating,
Berger offers a very good overview and I recommend reading his February 13, 2023 essay on neuromorphic engineering with one proviso, Note: A link has been removed,
Many researchers in this field see memristors as a key device component for neuromorphic engineering. Memristor – or memory resistor – devices are non-volatile nanoelectronic memory devices that were first theorized [emphasis mine] by Leon Chua in the 1970s. However, it was some thirty years later that the first practical device was fabricated in 2008 by a group led by Stanley Williams [sometimes cited as R. Stanley Williams] at HP Research Labs.
…
Chua wasn’t the first, as he himself has noted. Chua arrived at his theory independently in the 1970s, but Bernard Widrow theorized what he called a ‘memistor’ in the 1960s. In fact, “Memristors: they are older than you think” is a May 22, 2012 posting, which featured the article “Two centuries of memristors” by Themistoklis Prodromakis, Christofer Toumazou and Leon Chua, published in Nature Materials.
Most of us try to get it right but we don’t always succeed. It’s always good practice to read everyone (including me) with a little skepticism.
A September 1, 2021 news item on ScienceDaily announces a new type of memristor from Texas A&M University (Texas A&M or TAMU) and the National University of Singapore (NUS),
In a discovery published in the journal Nature, an international team of researchers has described a novel molecular device with exceptional computing prowess.
Reminiscent of the plasticity of connections in the human brain, the device can be reconfigured on the fly for different computational tasks by simply changing applied voltages. Furthermore, like nerve cells can store memories, the same device can also retain information for future retrieval and processing.
“The brain has the remarkable ability to change its wiring around by making and breaking connections between nerve cells. Achieving something comparable in a physical system has been extremely challenging,” said Dr. R. Stanley Williams [emphasis mine], professor in the Department of Electrical and Computer Engineering at Texas A&M University. “We have now created a molecular device with dramatic reconfigurability, which is achieved not by changing physical connections like in the brain, but by reprogramming its logic.”
Dr. T. Venkatesan, director of the Center for Quantum Research and Technology (CQRT) at the University of Oklahoma, Scientific Affiliate at the National Institute of Standards and Technology, Gaithersburg, and adjunct professor of electrical and computer engineering at the National University of Singapore, added that their molecular device might in the future help design next-generation processing chips with enhanced computational power and speed, while consuming significantly less energy.
Whether it is the familiar laptop or a sophisticated supercomputer, digital technologies face a common nemesis, the von Neumann bottleneck. This delay in computational processing is a consequence of current computer architectures, wherein the memory, containing data and programs, is physically separated from the processor. As a result, computers spend a significant amount of time shuttling information between the two systems, causing the bottleneck. Also, despite extremely fast processor speeds, these units can be idling for extended amounts of time during periods of information exchange.
As an alternative to conventional electronic parts used for designing memory units and processors, devices called memristors offer a way to circumvent the von Neumann bottleneck. Memristors, such as those made of niobium dioxide and vanadium dioxide, transition from being an insulator to a conductor at a set temperature. This property gives these types of memristors the ability to perform computations and store data.
However, despite their many advantages, these metal oxide memristors are made of rare-earth elements and can operate only in restrictive temperature regimes. Hence, there has been an ongoing search for promising organic molecules that can perform a comparable memristive function, said Williams.
Dr. Sreebrata Goswami, a professor at the Indian Association for the Cultivation of Science, designed the material used in this work. The compound has a central metal atom (iron) bound to three phenyl azo pyridine organic molecules called ligands.
“This behaves like an electron sponge that can absorb as many as six electrons reversibly, resulting in seven different redox states,” said Sreebrata. “The interconnectivity between these states is the key behind the reconfigurability shown in this work.”
Dr. Sreetosh Goswami, a researcher at the National University of Singapore, devised this project by creating a tiny electrical circuit consisting of a 40-nanometer layer of molecular film sandwiched between a layer of gold on top and gold-infused nanodisc and indium tin oxide at the bottom.
On applying a negative voltage to the device, Sreetosh witnessed a current-voltage profile unlike anything anyone had seen before. Unlike metal-oxide memristors that can switch from metal to insulator at only one fixed voltage, the organic molecular devices could switch back and forth from insulator to conductor at several discrete sequential voltages.
“So, if you think of the device as an on-off switch, as we were sweeping the voltage more negative, the device first switched from on to off, then off to on, then on to off and then back to on. I’ll say that we were just blown out of our seat,” said Venkatesan. “We had to convince ourselves that what we were seeing was real.”
Sreetosh and Sreebrata investigated the molecular mechanisms underlying the curious switching behavior using an imaging technique called Raman spectroscopy. In particular, they looked for spectral signatures in the vibrational motion of the organic molecule that could explain the multiple transitions. Their investigation revealed that sweeping the voltage negative triggered the ligands on the molecule to undergo a series of reduction, or electron-gaining, events that caused the molecule to transition between off state and on states.
Next, to describe the extremely complex current-voltage profile of the molecular device mathematically, Williams deviated from the conventional approach of basic physics-based equations. Instead, he described the behavior of the molecules using a decision tree algorithm with “if-then-else” statements, a commonplace line of code in several computer programs, particularly digital games.
“Video games have a structure where you have a character that does something, and then something occurs as a result. And so, if you write that out in a computer algorithm, they are if-then-else statements,” said Williams. “Here, the molecule is switching from on to off as a consequence of applied voltage, and that’s when I had the eureka moment to use decision trees to describe these devices, and it worked very well.”
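Williams’s decision-tree description can be sketched as a few nested if-then-else statements. The voltage thresholds below are invented for illustration; the real ones come from the device measurements reported in the paper,

```python
# Williams's insight: describe the multi-transition switching with nested
# if-then-else rules rather than a physics-based equation. The voltage
# thresholds here are made up for illustration; the real ones come from
# the measured device.
def device_state(v):
    """Map an applied voltage to an on/off state via a small decision tree."""
    if v > -0.5:
        return "on"
    elif v > -1.0:
        return "off"     # first transition: on -> off
    elif v > -1.5:
        return "on"      # second transition: off -> on
    elif v > -2.0:
        return "off"     # third transition: on -> off
    else:
        return "on"      # fourth transition: back to on

# Sweeping the voltage more negative reproduces the on/off/on/off/on
# sequence Venkatesan describes.
sweep = [0.0, -0.75, -1.25, -1.75, -2.25]
print([device_state(v) for v in sweep])  # ['on', 'off', 'on', 'off', 'on']
```

Reprogramming the device amounts to selecting a different decision tree with a different voltage pulse, which is what gives it the reconfigurability of thousands of hard-wired transistors.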
But the researchers went a step further to exploit these molecular devices to run programs for different real-world computational tasks. Sreetosh showed experimentally that their devices could perform fairly complex computations in a single time step and then be reprogrammed to perform another task in the next instant.
“It was quite extraordinary; our device was doing something like what the brain does, but in a very different way,” said Sreetosh. “When you’re learning something new or when you’re deciding, the brain can actually reconfigure and change physical wiring around. Similarly, we can logically reprogram or reconfigure our devices by giving them a different voltage pulse than they’ve seen before.”
Venkatesan noted that it would take thousands of transistors to perform the same computational functions as one of their molecular devices with its different decision trees. Hence, he said their technology might first be used in handheld devices, like cell phones and sensors, and other applications where power is limited.
Other contributors to the research include Dr. Abhijeet Patra and Dr. Ariando from the National University of Singapore; Dr. Rajib Pramanick and Dr. Santi Prasad Rath from the Indian Association for the Cultivation of Science; Dr. Martin Foltin from Hewlett Packard Enterprise, Colorado; and Dr. Damien Thompson from the University of Limerick, Ireland.
Venkatesan said that this research is indicative of the future discoveries from this collaborative team, which will include the center of nanoscience and engineering at the Indian Institute of Science and the Microsystems and Nanotechnology Division at the NIST.
I’ve highlighted R. Stanley Williams because he and his team at HP [Hewlett Packard] Labs helped to kick off current memristor research in 2008 with the publication of two papers as per my April 5, 2010 posting,
In 2008, two memristor papers were published in Nature and Nature Nanotechnology, respectively. In the first (Nature, May 2008 [article still behind a paywall]), a team at HP Labs claimed they had proved the existence of memristors (a fourth member of electrical engineering’s ‘Holy Trinity’ of the capacitor, resistor, and inductor). In the second paper (Nature Nanotechnology, July 2008 [article still behind a paywall]) the team reported that they had achieved engineering control.
Many electronic devices today are dependent on semiconductor logic circuits based on switches hard-wired to perform predefined logic functions. Physicists from the National University of Singapore (NUS), together with an international team of researchers, have developed a novel molecular memristor, or an electronic memory device, that has exceptional memory reconfigurability.
Unlike hard-wired standard circuits, the molecular device can be reconfigured using voltage to embed different computational tasks. The energy-efficient new technology, which is capable of enhanced computational power and speed, can potentially be used in edge computing, as well as handheld devices and applications with limited power resource.
“This work is a significant breakthrough in our quest to design low-energy computing. The idea of using multiple switching in a single element draws inspiration from how the brain works and fundamentally reimagines the design strategy of a logic circuit,” said Associate Professor Ariando from the NUS Department of Physics who led the research.
The research was first published in the journal Nature on 1 September 2021, and carried out in collaboration with the Indian Association for the Cultivation of Science, Hewlett Packard Enterprise, the University of Limerick, the University of Oklahoma, and Texas A&M University.
Brain-inspired technology
“This new discovery can contribute to developments in edge computing as a sophisticated in-memory computing approach to overcome the von Neumann bottleneck, a delay in computational processing seen in many digital technologies due to the physical separation of memory storage from a device’s processor,” said Assoc Prof Ariando. The new molecular device also has the potential to contribute to designing next generation processing chips with enhanced computational power and speed.
“Similar to the flexibility and adaptability of connections in the human brain, our memory device can be reconfigured on the fly for different computational tasks by simply changing applied voltages. Furthermore, like how nerve cells can store memories, the same device can also retain information for future retrieval and processing,” said first author Dr Sreetosh Goswami, Research Fellow from the Department of Physics at NUS.
Research team member Dr Sreebrata Goswami, who was a Senior Research Scientist at NUS and previously Professor at the Indian Association for the Cultivation of Science, conceptualised and designed a molecular system belonging to the chemical family of phenyl azo pyridines that have a central metal atom bound to organic molecules called ligands. “These molecules are like electron sponges that can offer as many as six electron transfers resulting in five different molecular states. The interconnectivity between these states is the key behind the device’s reconfigurability,” explained Dr Sreebrata Goswami.
Dr Sreetosh Goswami created a tiny electrical circuit consisting of a 40-nanometer layer of molecular film sandwiched between a top layer of gold, and a bottom layer of gold-infused nanodisc and indium tin oxide. He observed an unprecedented current-voltage profile upon applying a negative voltage to the device. Unlike conventional metal-oxide memristors that are switched on and off at only one fixed voltage, these organic molecular devices could switch between on-off states at several discrete sequential voltages.
Using an imaging technique called Raman spectroscopy, spectral signatures in the vibrational motion of the organic molecule were observed to explain the multiple transitions. Dr Sreebrata Goswami explained, “Sweeping the negative voltage triggered the ligands on the molecule to undergo a series of reduction, or electron-gaining, events which caused the molecule to transition between off and on states.”
The researchers described the behavior of the molecules using a decision tree algorithm with “if-then-else” statements, a construct used in the coding of many computer programs, particularly digital games, rather than the conventional approach of basic physics-based equations.
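For readers who like to see the idea rather than just read about it, here’s a minimal sketch (mine, not the researchers’) of what a decision tree over applied voltages might look like; the threshold voltages and state labels are invented for illustration and do not come from the paper,

```python
# Hypothetical sketch of the "decision tree" description of a molecular
# memristor: the on/off state is chosen by nested if-then-else branches
# over the applied voltage. All thresholds and labels are placeholders.

def molecular_state(voltage_v):
    """Map an applied voltage (volts) to a device state via if-then-else rules."""
    if voltage_v > -0.5:
        return "off"      # no ligand reduction yet
    elif voltage_v > -1.0:
        return "on-1"     # first reduction switches the device on
    elif voltage_v > -1.5:
        return "off-2"    # an intermediate off state
    else:
        return "on-2"     # further reduction switches it on again

# Sweeping the negative voltage steps the device through discrete states.
sweep = [0.0, -0.6, -1.2, -1.8]
states = [molecular_state(v) for v in sweep]
```

The point is only that a handful of branching rules, rather than a physics equation, captures the sequence of discrete switching voltages.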
New possibilities for energy-efficient devices
Building on their research, the team used the molecular memory devices to run programs for different real-world computational tasks. As a proof of concept, the team demonstrated that their technology could perform complex computations in a single step, and could be reprogrammed to perform another task in the next instant. An individual molecular memory device could perform the same computational functions as thousands of transistors, making the technology a more powerful and energy-efficient memory option.
“The technology might first be used in handheld devices, like cell phones and sensors, and other applications where power is limited,” added Assoc Prof Ariando.
The team is in the midst of building new electronic devices incorporating their innovation and is working with collaborators to conduct simulation and benchmarking against existing technologies.
Other contributors to the research paper include Abhijeet Patra and Santi Prasad Rath from NUS, Rajib Pramanick from the Indian Association for the Cultivation of Science, Martin Foltin from Hewlett Packard Enterprise, Damien Thompson from the University of Limerick, T. Venkatesan from the University of Oklahoma, and R. Stanley Williams from Texas A&M University.
Here’s a link to and a citation for the paper,
Decision trees within a molecular memristor by Sreetosh Goswami, Rajib Pramanick, Abhijeet Patra, Santi Prasad Rath, Martin Foltin, A. Ariando, Damien Thompson, T. Venkatesan, Sreebrata Goswami & R. Stanley Williams. Nature volume 597, pages 51–56 (2021). DOI: https://doi.org/10.1038/s41586-021-03748-0 Published: 01 September 2021. Issue Date: 02 September 2021.
Given R. Stanley Williams’s presence on the author list, it’s a bit surprising that there’s no mention of memristors. If I read the signs rightly, the interest is shifting, in some cases, from the memristor to a more comprehensive grouping of circuit elements referred to as ‘neuristors’ or, more likely, ‘nanocircuit elements’ in the effort to achieve brainlike (neuromorphic) computing (engineering). (Williams was the leader of the HP Labs team that offered proof and more of the memristor’s existence, which I mentioned here in an April 5, 2010 posting. There are many, many postings on this topic here; try ‘memristors’ or ‘brainlike computing’ for your search terms.)
A September 24, 2020 news item on ScienceDaily announces a recent development in the field of neuromorphic engineering,
In the September [2020] issue of the journal Nature, scientists from Texas A&M University, Hewlett Packard Labs and Stanford University have described a new nanodevice that acts almost identically to a brain cell. Furthermore, they have shown that these synthetic brain cells can be joined together to form intricate networks that can then solve problems in a brain-like manner.
“This is the first study where we have been able to emulate a neuron with just a single nanoscale device, which would otherwise need hundreds of transistors,” said Dr. R. Stanley Williams, senior author on the study and professor in the Department of Electrical and Computer Engineering. “We have also been able to successfully use networks of our artificial neurons to solve toy versions of a real-world problem that is computationally intense even for the most sophisticated digital technologies.”
In particular, the researchers have demonstrated proof of concept that their brain-inspired system can identify possible mutations in a virus, which is highly relevant for ensuring the efficacy of vaccines and medications for strains exhibiting genetic diversity.
Over the past decades, digital technologies have become smaller and faster largely because of the advancements in transistor technology. However, these critical circuit components are fast approaching their limit of how small they can be built, initiating a global effort to find a new type of technology that can supplement, if not replace, transistors.
In addition to this “scaling-down” problem, transistor-based digital technologies have other well-known challenges. For example, they struggle at finding optimal solutions when presented with large sets of data.
“Let’s take a familiar example of finding the shortest route from your office to your home. If you have to make a single stop, it’s a fairly easy problem to solve. But if for some reason you need to make 15 stops in between, you have 43 billion routes to choose from,” said Dr. Suhas Kumar, lead author on the study and researcher at Hewlett Packard Labs. “This is now an optimization problem, and current computers are rather inept at solving it.”
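If you’d like to check the arithmetic behind that “43 billion routes” figure, one plausible reading (my reconstruction, not something stated in the paper) is 14!/2: the orderings of 15 stops with one fixed, counting a route and its reverse as the same,

```python
import math

# One plausible reading of the press release's "43 billion" figure:
# with n stops where one is held fixed and a route and its reverse are
# treated as identical, there are (n-1)!/2 distinct orderings.

def route_count(stops):
    """Distinct orderings of `stops` cities, one fixed, reverse = same route."""
    return math.factorial(stops - 1) // 2

n_routes = route_count(15)
print(n_routes)  # 43589145600, about 43 billion
```

Whatever the exact counting convention, the factorial growth is the point: each added stop multiplies the number of routes, which is why the problem overwhelms brute-force search.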
Kumar added that another arduous task for digital machines is pattern recognition, such as identifying a face as the same regardless of viewpoint or recognizing a familiar voice buried within a din of sounds.
But tasks that can send digital machines into a computational tizzy are ones at which the brain excels. In fact, brains are not just quick at recognition and optimization problems, but they also consume far less energy than digital systems. Hence, by mimicking how the brain solves these types of tasks, Williams said brain-inspired or neuromorphic systems could potentially overcome some of the computational hurdles faced by current digital technologies.
To build a neuron, the fundamental building block of the brain, the researchers assembled a synthetic nanoscale device consisting of layers of different inorganic materials, each with a unique function. However, they said the real magic happens in the thin layer made of the compound niobium dioxide.
When a small voltage is applied to this region, its temperature begins to increase. But when the temperature reaches a critical value, niobium dioxide undergoes a quick change in personality, turning from an insulator to a conductor. But as it begins to conduct electric currents, its temperature drops and niobium dioxide switches back to being an insulator.
These back-and-forth transitions enable the synthetic devices to generate a pulse of electrical current that closely resembles the profile of electrical spikes, or action potentials, produced by biological neurons. Further, by changing the voltage across their synthetic neurons, the researchers reproduced a rich range of neuronal behaviors observed in the brain, such as sustained, burst and chaotic firing of electrical spikes.
“Capturing the dynamical behavior of neurons is a key goal for brain-inspired computers,” said Kumar. “Altogether, we were able to recreate around 15 types of neuronal firing profiles, all using a single electrical component and at much lower energies compared to transistor-based circuits.”
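The insulator-to-conductor cycling described above is essentially a relaxation oscillator. Here’s a toy model of my own (every constant is illustrative; the real NbO2 Mott-transition physics is far richer) showing how thermal feedback alone can produce current spikes,

```python
# Toy relaxation-oscillator model of the niobium dioxide cycle: Joule
# heating warms the insulating film until it switches to a conductor,
# conduction cools it until it switches back. All constants invented.

def simulate_spikes(steps=2000, v=1.0):
    temp, conducting = 0.0, False
    t_high, t_low = 1.0, 0.4          # hypothetical switching thresholds
    trace = []
    for _ in range(steps):
        if conducting:
            current = v * 10.0        # low resistance: large current
            temp -= 0.05              # device cools while conducting
            if temp < t_low:
                conducting = False    # back to insulator
        else:
            current = v * 0.1         # high resistance: small current
            temp += 0.01              # Joule heating warms the film
            if temp > t_high:
                conducting = True     # insulator-to-metal transition
        trace.append(current)
    return trace

trace = simulate_spikes()
# Count the abrupt low-to-high current jumps, i.e. the "spikes".
spikes = sum(1 for a, b in zip(trace, trace[1:]) if b > a * 5)
```

The back-and-forth between the two thresholds yields a periodic train of current pulses, a crude analogue of the action-potential-like output the researchers report.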
To evaluate if their synthetic neurons [neuristor?] can solve real-world problems, the researchers first wired 24 such nanoscale devices together in a network inspired by the connections between the brain’s cortex and thalamus, a well-known neural pathway involved in pattern recognition. Next, they used this system to solve a toy version of the viral quasispecies reconstruction problem, where mutant variations of a virus are identified without a reference genome.
By means of data inputs, the researchers introduced the network to short gene fragments. Then, by programming the strength of connections between the artificial neurons within the network, they established basic rules about joining these genetic fragments. The jigsaw puzzle-like task for the network was to list mutations in the virus’ genome based on these short genetic segments.
The researchers found that within a few microseconds, their network of artificial neurons settled down in a state that was indicative of the genome for a mutant strain.
Williams and Kumar noted this result is proof of principle that their neuromorphic systems can quickly perform tasks in an energy-efficient way.
The researchers said the next steps in their research will be to expand the repertoire of the problems that their brain-like networks can solve by incorporating other firing patterns and some hallmark properties of the human brain like learning and memory. They also plan to address hardware challenges for implementing their technology on a commercial scale.
“Calculating the national debt or solving some large-scale simulation is not the type of task the human brain is good at and that’s why we have digital computers. Alternatively, we can leverage our knowledge of neuronal connections for solving problems that the brain is exceptionally good at,” said Williams. “We have demonstrated that depending on the type of problem, there are different and more efficient ways of doing computations other than the conventional methods using digital computers with transistors.”
If you look at the news release on EurekAlert, you’ll see this informative image is titled: NeuristerSchematic [sic],
(On the university website, the image is credited to Rachel Barton.) You can see one of the first mentions of a ‘neuristor’ here in an August 24, 2017 posting.
Here’s a link to and a citation for the paper,
Third-order nanocircuit elements for neuromorphic engineering by Suhas Kumar, R. Stanley Williams & Ziwen Wang. Nature volume 585, pages 518–523 (2020). DOI: https://doi.org/10.1038/s41586-020-2735-5 Published: 23 September 2020. Issue Date: 24 September 2020.
I went down a rabbit hole while trying to figure out the difference between ‘organic’ memristors and standard memristors. I have put the results of my investigation at the end of this post. First, there’s the news.
An April 21, 2020 news item on ScienceDaily explains why researchers are so focused on memristors and brainlike computing,
The advent of artificial intelligence, machine learning and the internet of things is expected to change modern electronics and bring forth the fourth Industrial Revolution. The pressing question for many researchers is how to handle this technological revolution.
“It is important for us to understand that the computing platforms of today will not be able to sustain at-scale implementations of AI algorithms on massive datasets,” said Thirumalai Venkatesan, one of the authors of a paper published in Applied Physics Reviews, from AIP Publishing.
“Today’s computing is way too energy-intensive to handle big data. We need to rethink our approaches to computation on all levels: materials, devices and architecture that can enable ultralow energy computing.”
Brain-inspired electronics with organic memristors could offer a functionally promising and cost-effective platform, according to Venkatesan. Memristive devices are electronic devices with an inherent memory that are capable of both storing data and performing computation. Since memristors are functionally analogous to the operation of neurons, the computing units in the brain, they are optimal candidates for brain-inspired computing platforms.
Until now, oxides have been the leading candidate as the optimum material for memristors. Different material systems have been proposed but none have been successful so far.
“Over the last 20 years, there have been several attempts to come up with organic memristors, but none of those have shown any promise,” said Sreetosh Goswami, lead author on the paper. “The primary reason behind this failure is their lack of stability, reproducibility and ambiguity in mechanistic understanding. At a device level, we are now able to solve most of these problems.”
This new generation of organic memristors is developed based on metal azo complex devices, which are the brainchild of Sreebrata Goswami, a professor at the Indian Association for the Cultivation of Science in Kolkata and another author on the paper.
“In thin films, the molecules are so robust and stable that these devices can eventually be the right choice for many wearable and implantable technologies or a body net, because these could be bendable and stretchable,” said Sreebrata Goswami. A body net is a series of wireless sensors that stick to the skin and track health.
The next challenge will be to produce these organic memristors at scale, said Venkatesan.
“Now we are making individual devices in the laboratory. We need to make circuits for large-scale functional implementation of these devices.”
This undated article on Nanowerk provides a relatively complete and technical description of memristors in general (Note: A link has been removed),
A memristor (named as a portmanteau of memory and resistor) is a non-volatile electronic memory device that was first theorized by Leon Ong Chua in 1971 as the fourth fundamental two-terminal circuit element following the resistor, the capacitor, and the inductor (IEEE Transactions on Circuit Theory, “Memristor-The missing circuit element”).
Its special property is that its resistance can be programmed (resistor function) and subsequently remains stored (memory function). Unlike other memories that exist today in modern electronics, memristors are stable and remember their state even if the device loses power.
However, it was only almost 40 years later that the first practical device was fabricated. This was in 2008, when a group led by Stanley Williams at HP Research Labs realized that switching of the resistance between a conducting and less conducting state in metal-oxide thin-film devices was showing Leon Chua’s memristor behavior. …
The article on Nanowerk includes an embedded video presentation on memristors given by Stanley Williams (also known as R. Stanley Williams).
The memristor is composed of the transition metal ruthenium complexed with “azo-aromatic ligands.” [emphasis mine] The theoretical work enabling this material was performed at Yale, and the organic molecules were synthesized at the Indian Association for the Cultivation of Sciences. …
I highlighted ‘ligands’ because that appears to be the difference. However, there is more than one type of ligand on Wikipedia.
In coordination chemistry, a ligand[help 1] is an ion or molecule (functional group) that binds to a central metal atom to form a coordination complex …
Ligand, an atom, ion, or functional group that donates one or more of its electrons through a coordinate covalent bond to one or more central atoms or ions
Ligand (biochemistry), a substance that binds to a protein
a ‘guest’ in host–guest chemistry
I did take a look at the paper and did not see any references to proteins or other biomolecules that I could recognize as such. I’m not sure why the researchers are describing their device as an ‘organic’ memristor but this may reflect a shortcoming in the definitions I have found or shortcomings in my reading of the paper rather than an error on their parts.
Hopefully, more research will be forthcoming and it will be possible to better understand the terminology.
The last time I wrote about memcapacitors (June 30, 2014 posting: Memristors, memcapacitors, and meminductors for faster computers), the ideas were largely theoretical; I believe this work is the first research I’ve seen on the topic. From an October 17, 2019 news item on ScienceDaily,
Researchers at the Department of Energy’s Oak Ridge National Laboratory [ORNL], the University of Tennessee and Texas A&M University demonstrated bio-inspired devices that accelerate routes to neuromorphic, or brain-like, computing.
Results published in Nature Communications report the first example of a lipid-based “memcapacitor,” a charge storage component with memory that processes information much like synapses do in the brain. Their discovery could support the emergence of computing networks modeled on biology for a sensory approach to machine learning.
“Our goal is to develop materials and computing elements that work like biological synapses and neurons—with vast interconnectivity and flexibility—to enable autonomous systems that operate differently than current computing devices and offer new functionality and learning capabilities,” said Joseph Najem, a recent postdoctoral researcher at ORNL’s Center for Nanophase Materials Sciences, a DOE Office of Science User Facility, and current assistant professor of mechanical engineering at Penn State.
The novel approach uses soft materials to mimic biomembranes and simulate the way nerve cells communicate with one another.
The team designed an artificial cell membrane, formed at the interface of two lipid-coated water droplets in oil, to explore the material’s dynamic, electrophysiological properties. At applied voltages, charges build up on both sides of the membrane as stored energy, analogous to the way capacitors work in traditional electric circuits.
But unlike regular capacitors, the memcapacitor can “remember” a previously applied voltage and—literally—shape how information is processed. The synthetic membranes change surface area and thickness depending on electrical activity. These shapeshifting membranes could be tuned as adaptive filters for specific biophysical and biochemical signals.
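To make the “remembering” concrete, here’s a minimal memcapacitor model of my own devising: the capacitance depends on an internal state variable that integrates the voltage history, so past bias leaves a lasting trace. It is purely illustrative; the lipid membrane’s geometry-driven capacitance is modeled much more carefully in the paper,

```python
# Toy memcapacitive element: capacitance is set by a bounded internal
# state that integrates the applied-voltage history. All parameters are
# invented for illustration.

class Memcapacitor:
    def __init__(self, c_min=1.0, c_max=2.0, rate=0.1):
        self.state = 0.0                      # memory variable in [0, 1]
        self.c_min, self.c_max, self.rate = c_min, c_max, rate

    def step(self, voltage, dt=1.0):
        # State drifts up under positive bias, down under negative bias.
        self.state += self.rate * voltage * dt
        self.state = min(1.0, max(0.0, self.state))
        return self.capacitance()

    def capacitance(self):
        return self.c_min + (self.c_max - self.c_min) * self.state

mc = Memcapacitor()
for _ in range(5):
    mc.step(+1.0)                   # a train of positive voltage pulses
after_positive = mc.capacitance()   # capacitance has grown
mc.step(0.0)                        # at zero bias the state persists
```

The key behavior is in the last line: removing the bias does not reset the capacitance, which is the “memory” in memcapacitor.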
“The novel functionality opens avenues for nondigital signal processing and machine learning modeled on nature,” said ORNL’s Pat Collier, a CNMS staff research scientist.
A distinct feature of all digital computers is the separation of processing and memory. Information is transferred back and forth from the hard drive and the central processor, creating an inherent bottleneck in the architecture no matter how small or fast the hardware can be.
Neuromorphic computing, modeled on the nervous system, employs architectures that are fundamentally different in that memory and signal processing are co-located in memory elements—memristors, memcapacitors and meminductors.
These “memelements” make up the synaptic hardware of systems that mimic natural information processing, learning and memory.
Systems designed with memelements offer advantages in scalability and low power consumption, but the real goal is to carve out an alternative path to artificial intelligence, said Collier.
Tapping into biology could enable new computing possibilities, especially in the area of “edge computing,” such as wearable and embedded technologies that are not connected to a cloud but instead make on-the-fly decisions based on sensory input and past experience.
Biological sensing has evolved over billions of years into a highly sensitive system with receptors in cell membranes that are able to pick out a single molecule of a specific odor or taste. “This is not something we can match digitally,” Collier said.
Digital computation is built around digital information, the binary language of ones and zeros coursing through electronic circuits. It can emulate the human brain, but its solid-state components do not compute sensory data the way a brain does.
“The brain computes sensory information pushed through synapses in a neural network that is reconfigurable and shaped by learning,” said Collier. “Incorporating biology—using biomembranes that sense bioelectrochemical information—is key to developing the functionality of neuromorphic computing.”
While numerous solid-state versions of memelements have been demonstrated, the team’s biomimetic elements represent new opportunities for potential “spiking” neural networks that can compute natural data in natural ways.
Spiking neural networks are intended to simulate the way neurons spike with electrical potential and, if the signal is strong enough, pass it on to their neighbors through synapses, carving out learning pathways that are pruned over time for efficiency.
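The “spike if the signal is strong enough” behavior is usually captured by a leaky integrate-and-fire model. Here’s a hedged sketch with invented parameters, just to show the mechanism,

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential
# integrates inputs while leaking, fires when it crosses a threshold,
# then resets. Threshold and leak factor are illustrative.

def lif(inputs, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron fires a spike."""
    v, spike_times = 0.0, []
    for t, i in enumerate(inputs):
        v = v * leak + i            # leaky integration of input current
        if v >= threshold:
            spike_times.append(t)   # fire a spike...
            v = 0.0                 # ...and reset the membrane
    return spike_times

# A steady sub-threshold input still fires, but only after it has been
# integrated over several time steps.
spikes = lif([0.3] * 20)
```

No single input of 0.3 reaches the threshold of 1.0; firing only happens because the membrane accumulates inputs over time, which is the “integrate” in integrate-and-fire.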
A bio-inspired version with analog data processing is a distant aim. Current early-stage research focuses on developing the components of bio-circuitry.
“We started with the basics, a memristor that can weigh information via conductance to determine if a spike is strong enough to be broadcast through a network of synapses connecting neurons,” said Collier. “Our memcapacitor goes further in that it can actually store energy as an electric charge in the membrane, enabling the complex ‘integrate and fire’ activity of neurons needed to achieve dense networks capable of brain-like computation.”
The team’s next steps are to explore new biomaterials and study simple networks to achieve more complex brain-like functionalities with memelements.
Here’s a link to and a citation for the paper,
Dynamical nonlinear memory capacitance in biomimetic membranes by Joseph S. Najem, Md Sakib Hasan, R. Stanley Williams, Ryan J. Weiss, Garrett S. Rose, Graham J. Taylor, Stephen A. Sarles & C. Patrick Collier. Nature Communications volume 10, Article number: 3239 (2019). DOI: https://doi.org/10.1038/s41467-019-11223-8 Published: July 19, 2019
This paper is open access.
One final comment, you might recognize one of the authors (R. Stanley Williams) who in 2008 helped launch ‘memristor’ research.
Down the memristor rabbit hole one more time.* I started out with news about two new papers and inadvertently found two more. In a bid to keep this posting to a manageable size, I’m stopping at four.
Memristor (or memory resistor) devices are non-volatile electronic memory devices that were first theorized by Leon Chua in the 1970s. However, it was some thirty years later that the first practical device was fabricated. This was in 2008, when a group led by Stanley Williams at HP Research Labs realized that switching of the resistance between a conducting and less conducting state in metal-oxide thin-film devices was showing Leon Chua’s memristor behaviour.
…
The high interest in memristor devices also stems from the fact that these devices emulate the memory and learning properties of biological synapses, i.e., the electrical resistance value of the device is dependent on the history of the current flowing through it.
There is a huge effort underway to use memristor devices in neuromorphic computing applications and it is now reasonable to imagine the development of a new generation of artificial intelligent devices with very low power consumption (non-volatile), ultra-fast performance and high-density integration.
These discoveries come at an important juncture in microelectronics, since there is increasing disparity between computational needs of Big Data, Artificial Intelligence (A.I.) and the Internet of Things (IoT), and the capabilities of existing computers. The increases in speed, efficiency and performance of computer technology cannot continue in the same manner as it has done since the 1960s.
…
To date, most memristor research has focussed on the electronic switching properties of the device. However, for many applications it is useful to have an additional handle (or degree of freedom) on the device to control its resistive state. For example, memory and processing in the brain also involve numerous chemical and biochemical reactions that control the brain’s structure and its evolution through development.
To emulate this in a simple solid-state system composed of just switches alone is not possible. In our research, we are interested in using light to mediate this essential control.
…
We have demonstrated that light can be used to make short and long-term memory and we have shown how light can modulate a special type of learning, called spike timing dependent plasticity (STDP). STDP involves two neuronal spikes incident across a synapse at the same time. Depending on the relative timing of the spikes and their overlap across the synaptic cleft, the connection strength is either strengthened or weakened.
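(An aside from me: the STDP rule Kemp describes is commonly written as an exponential function of the gap between pre- and postsynaptic spike times. Here’s a hedged sketch with invented amplitudes and time constant, not values from Kemp’s work,)

```python
import math

# Pair-based STDP: if the presynaptic spike precedes the postsynaptic
# spike the synapse strengthens (potentiation); if it follows, the
# synapse weakens (depression). Amplitudes and tau are illustrative.

def stdp_dw(t_pre, t_post, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:   # pre before post: strengthen, decaying with the gap
        return a_plus * math.exp(-dt / tau)
    else:        # post before (or with) pre: weaken
        return -a_minus * math.exp(dt / tau)

ltp = stdp_dw(t_pre=0.0, t_post=10.0)   # positive: connection strengthened
ltd = stdp_dw(t_pre=10.0, t_post=0.0)   # negative: connection weakened
```

The sign flip around zero timing gap is exactly the “strengthened or weakened depending on relative timing” behavior described above.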
In our earlier work, we were only able to achieve small switching effects in memristors using light. In our latest work (Advanced Electronic Materials, “Percolation Threshold Enables Optical Resistive-Memory Switching and Light-Tuneable Synaptic Learning in Segregated Nanocomposites”), we take advantage of a percolating-like nanoparticle morphology to vastly increase the magnitude of the switching between electronic resistance states when light is incident on the device.
…
We have used an inhomogeneous percolating network consisting of metallic nanoparticles distributed in filamentary-like conduction paths. Electronic conduction and the resistance of the device is very sensitive to any disruption of the conduction path(s).
By embedding the nanoparticles in a polymer that can expand or contract with light the conduction pathways are broken or re-connected causing very large changes in the electrical resistance and memristance of the device.
Our devices could lead to the development of new memristor-based artificial intelligence systems that are adaptive and reconfigurable using a combination of optical and electronic signalling. Furthermore, they have the potential for the development of very fast optical cameras for artificial intelligence recognition systems.
Our work provides a nice proof-of-concept but the materials used means the optical switching is slow. The materials are also not well suited to industry fabrication. In our on-going work we are addressing these switching speed issues whilst also focussing on industry compatible materials.
Currently we are working on a new type of optical memristor device that should give us orders of magnitude improvement in the optical switching speeds whilst also retaining a large difference between the resistance on and off states. We hope to be able to achieve nanosecond switching speeds. The materials used are also compatible with industry standard methods of fabrication.
The new devices should also have applications in optical communications, interfacing and photonic computing. We are currently looking for commercial investors to help fund the research on these devices so that we can bring the device specifications to a level of commercial interest.
…
If you’re interested in memristors, Kemp’s article is well written and quite informative for nonexperts, assuming of course you can tolerate not understanding everything perfectly.
Here are links and citations for two papers. The first is the latest referred to in the article, a May 2019 paper and the second is a paper appearing in July 2019.
Memristors, demonstrated by solid-state devices with continuously tunable resistance, have emerged as a new paradigm for self-adaptive networks that require synapse-like functions. Spin-based memristors offer advantages over other types of memristors because of their significant endurance and high energy efficiency.
However, it remains a challenge to build dense and functional spintronic memristors with structures and materials that are compatible with existing ferromagnetic devices. Ta/CoFeB/MgO heterostructures are commonly used in interfacial PMA-based [perpendicular magnetic anisotropy] magnetic tunnel junctions, which exhibit large tunnel magnetoresistance and are implemented in commercial MRAM [magnetic random access memory] products.
“To achieve the memristive function, DW is driven back and forth in a continuous manner in the CoFeB layer by applying in-plane positive or negative current pulses along the Ta layer, utilizing SOT that the current exerts on the CoFeB magnetization,” said Shuai Zhang, a coauthor in the paper. “Slowly propagating domain wall generates a creep in the detection area of the device, which yields a broad range of intermediate resistive states in the AHE [anomalous Hall effect] measurements. Consequently, AHE resistance is modulated in an analog manner, being controlled by the pulsed current characteristics including amplitude, duration, and repetition number.”
“For a follow-up study, we are working on more neuromorphic operations, such as spike-timing-dependent plasticity and paired pulsed facilitation,” concludes You. …
Here are links to and citations for the paper (Note: It’s a little confusing, but I believe that one of the links will take you to the online version; as for the ‘open access’ link, keep reading),
A Spin–Orbit‐Torque Memristive Device by Shuai Zhang, Shijiang Luo, Nuo Xu, Qiming Zou, Min Song, Jijun Yun, Qiang Luo, Zhe Guo, Ruofan Li, Weicheng Tian, Xin Li, Hengan Zhou, Huiming Chen, Yue Zhang, Xiaofei Yang, Wanjun Jiang, Ka Shen, Jeongmin Hong, Zhe Yuan, Li Xi, Ke Xia, Sayeef Salahuddin, Bernard Dieny, Long You. Advanced Electronic Materials Volume 5, Issue 4 April 2019 (print version) 1800782 DOI: https://doi.org/10.1002/aelm.201800782 First published [online]: 30 January 2019 Note: there is another DOI, https://doi.org/10.1002/aelm.201970022 where you can have open access to Memristors: A Spin–Orbit‐Torque Memristive Device (Adv. Electron. Mater. 4/2019)
The paper published online in January 2019 is behind a paywall and the paper (almost the same title) published in April 2019 has a new DOI and is open access. Final note: I tried accessing the ‘free’ paper and opened up a free file for the artwork featuring the work from China on the back cover of the April 2019 issue of Advanced Electronic Materials.
Korea
Usually when I see the words transparency and flexibility, I expect to see graphene as one of the materials. That’s not the case for this paper (a link to and a citation for it follow),
Here’s the abstract for the paper where you’ll see that the material is made up of zinc oxide and silver nanowires,
An artificial photonic synapse having a tunable manifold synaptic response can be an essential step forward for the advancement of novel neuromorphic computing. In this work, we reported the development of a highly transparent and flexible two-terminal ZnO/Ag-nanowires/PET photonic artificial synapse [emphasis mine]. The device shows purely photo-triggered all-essential synaptic functions such as transition from short- to long-term plasticity, paired-pulse facilitation, and spike-timing-dependent plasticity, including versatile memory capability. Importantly, the strain-induced piezo-phototronic effect within ZnO provides an additional degree of regulation to modulate all of the synaptic functions in multi-levels. The observed effect is quantitatively explained as a dynamic of photo-induced electron-hole trapping/detrapping via defect states such as oxygen vacancies. We revealed that the synaptic functions can be consolidated and converted by applied strain, which has not previously been applied in any of the reported synaptic devices. This study will open a new avenue for the scientific community to control and design highly transparent wearable neuromorphic computing.
Mott memristors (mentioned in my Aug. 24, 2017 posting about neuristors and brainlike computing) get fuller treatment in an Oct. 9, 2017 posting by Samuel K. Moore on the Nanoclast blog (found on the IEEE [Institute of Electrical and Electronics Engineers] website). Note 1: Links have been removed; Note 2: I quite like Moore’s writing style but he’s not for the impatient reader,
When you’re really harried, you probably feel like your head is brimful of chaos. You’re pretty close. Neuroscientists say your brain operates in a regime termed the “edge of chaos,” and it’s actually a good thing. It’s a state that allows for fast, efficient analog computation of the kind that can solve problems that grow vastly more difficult as they become bigger in size.
The trouble is, if you’re trying to replicate that kind of chaotic computation with electronics, you need an element that both acts chaotically—how and when you want it to—and could scale up to form a big system.
“No one had been able to show chaotic dynamics in a single scalable electronic device,” says Suhas Kumar, a researcher at Hewlett Packard Labs, in Palo Alto, Calif. Until now, that is.
He, John Paul Strachan, and R. Stanley Williams recently reported in the journal Nature that a particular configuration of a certain type of memristor contains that seed of controlled chaos. What’s more, when they simulated wiring these up into a type of circuit called a Hopfield neural network, the circuit was capable of solving a ridiculously difficult problem—1,000 instances of the traveling salesman problem—at a rate of 10 trillion operations per second per watt.
(It’s not an apples-to-apples comparison, but the world’s most powerful supercomputer as of June 2017 managed 93,015 trillion floating point operations per second but consumed 15 megawatts doing it. So about 6 billion operations per second per watt.)
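Moore’s back-of-envelope figure is easy to reproduce; here’s the arithmetic spelled out,

```python
# Reproducing the supercomputer comparison quoted above: 93,015 trillion
# floating-point operations per second at 15 megawatts works out to
# roughly 6 billion operations per second per watt.

flops = 93_015e12              # 93,015 trillion FLOP/s (June 2017 figure)
power_w = 15e6                 # 15 MW
ops_per_watt = flops / power_w
print(f"{ops_per_watt:.3e}")   # about 6.2e9 ops/s per watt
```

That is roughly three orders of magnitude below the 10 trillion operations per second per watt claimed for the simulated Hopfield network, which is the gap Moore is highlighting.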
The device in question is called a Mott memristor. Memristors generally are devices that hold a memory, in the form of resistance, of the current that has flowed through them. The most familiar type is called resistive RAM (or ReRAM or RRAM, depending on who’s asking). Mott memristors have an added ability in that they can also reflect a temperature-driven change in resistance.
The HP Labs team made their memristor from an 8-nanometer-thick layer of niobium dioxide (NbO2) sandwiched between two layers of titanium nitride. The bottom titanium nitride layer was in the form of a 70-nanometer wide pillar. “We showed that this type of memristor can generate chaotic and nonchaotic signals,” says Williams, who invented the memristor based on theory by Leon Chua.
…
(The traveling salesman problem is one of these. In it, the salesman must find the shortest route that lets him visit all of his customers’ cities, without going through any of them twice. It’s a difficult problem because it becomes exponentially more difficult to solve with each city you add.)
Here’s what the niobium dioxide-based Mott memristor looks like,
Photo: Suhas Kumar/Hewlett Packard Labs. A micrograph shows the construction of a Mott memristor composed of an 8-nanometer-thick layer of niobium dioxide between two layers of titanium nitride.