Brainlike transistor and human intelligence

This brainlike transistor (not a memristor) is important because it functions at room temperature as opposed to others, which require cryogenic temperatures.

A December 20, 2023 Northwestern University news release (received via email; also on EurekAlert) fills in the details,

  • Researchers develop transistor that simultaneously processes and stores information like the human brain
  • Transistor goes beyond categorization tasks to perform associative learning
  • Transistor identified similar patterns, even when given imperfect input
  • Previous similar devices could only operate at cryogenic temperatures; new transistor operates at room temperature, making it more practical

EVANSTON, Ill. — Taking inspiration from the human brain, researchers have developed a new synaptic transistor capable of higher-level thinking.

Designed by researchers at Northwestern University, Boston College and the Massachusetts Institute of Technology (MIT), the device simultaneously processes and stores information just like the human brain. In new experiments, the researchers demonstrated that the transistor goes beyond simple machine-learning tasks to categorize data and is capable of performing associative learning.

Although previous studies have leveraged similar strategies to develop brain-like computing devices, those transistors could not function outside of cryogenic temperatures. The new device, by contrast, is stable at room temperature. It also operates at fast speeds, consumes very little energy and retains stored information even when power is removed, making it ideal for real-world applications.

The study was published today (Dec. 20 [2023]) in the journal Nature.

“The brain has a fundamentally different architecture than a digital computer,” said Northwestern’s Mark C. Hersam, who co-led the research. “In a digital computer, data move back and forth between a microprocessor and memory, which consumes a lot of energy and creates a bottleneck when attempting to perform multiple tasks at the same time. On the other hand, in the brain, memory and information processing are co-located and fully integrated, resulting in orders of magnitude higher energy efficiency. Our synaptic transistor similarly achieves concurrent memory and information processing functionality to more faithfully mimic the brain.”

Hersam is the Walter P. Murphy Professor of Materials Science and Engineering at Northwestern’s McCormick School of Engineering. He also is chair of the department of materials science and engineering, director of the Materials Research Science and Engineering Center and member of the International Institute for Nanotechnology. Hersam co-led the research with Qiong Ma of Boston College and Pablo Jarillo-Herrero of MIT.

Recent advances in artificial intelligence (AI) have motivated researchers to develop computers that operate more like the human brain. Conventional digital computing systems have separate processing and storage units, causing data-intensive tasks to devour large amounts of energy. With smart devices continuously collecting vast quantities of data, researchers are scrambling to uncover new ways to process it all without consuming an increasing amount of power. Currently, the memory resistor, or “memristor,” is the most well-developed technology that can perform combined processing and memory function. But memristors still suffer from energy-costly switching.

“For several decades, the paradigm in electronics has been to build everything out of transistors and use the same silicon architecture,” Hersam said. “Significant progress has been made by simply packing more and more transistors into integrated circuits. You cannot deny the success of that strategy, but it comes at the cost of high power consumption, especially in the current era of big data where digital computing is on track to overwhelm the grid. We have to rethink computing hardware, especially for AI and machine-learning tasks.”

To rethink this paradigm, Hersam and his team explored new advances in the physics of moiré patterns, a type of geometrical design that arises when two patterns are layered on top of one another. When two-dimensional materials are stacked, new properties emerge that do not exist in one layer alone. And when those layers are twisted to form a moiré pattern, unprecedented tunability of electronic properties becomes possible.

For the new device, the researchers combined two different types of atomically thin materials: bilayer graphene and hexagonal boron nitride. When stacked and purposefully twisted, the materials formed a moiré pattern. By rotating one layer relative to the other, the researchers could achieve different electronic properties in each graphene layer even though they are separated by only atomic-scale dimensions. With the right choice of twist, researchers harnessed moiré physics for neuromorphic functionality at room temperature.

“With twist as a new design parameter, the number of permutations is vast,” Hersam said. “Graphene and hexagonal boron nitride are very similar structurally but just different enough that you get exceptionally strong moiré effects.”

To test the transistor, Hersam and his team trained it to recognize similar — but not identical — patterns. Just two months earlier, Hersam had introduced a new nanoelectronic device capable of analyzing and categorizing data in an energy-efficient manner, but his new synaptic transistor takes machine learning and AI one leap further.

“If AI is meant to mimic human thought, one of the lowest-level tasks would be to classify data, which is simply sorting into bins,” Hersam said. “Our goal is to advance AI technology in the direction of higher-level thinking. Real-world conditions are often more complicated than current AI algorithms can handle, so we tested our new devices under more complicated conditions to verify their advanced capabilities.”

First the researchers showed the device one pattern: 000 (three zeros in a row). Then, they asked the AI to identify similar patterns, such as 111 or 101. “If we trained it to detect 000 and then gave it 111 and 101, it knows 111 is more similar to 000 than 101,” Hersam explained. “000 and 111 are not exactly the same, but both are three digits in a row. Recognizing that similarity is a higher-level form of cognition known as associative learning.”

In experiments, the new synaptic transistor successfully recognized similar patterns, displaying its associative memory. Even when the researchers threw curveballs — like giving it incomplete patterns — it still successfully demonstrated associative learning.

“Current AI can be easy to confuse, which can cause major problems in certain contexts,” Hersam said. “Imagine if you are using a self-driving vehicle, and the weather conditions deteriorate. The vehicle might not be able to interpret the more complicated sensor data as well as a human driver could. But even when we gave our transistor imperfect input, it could still identify the correct response.”

The study, “Moiré synaptic transistor with room-temperature neuromorphic functionality,” was primarily supported by the National Science Foundation.

Here’s a link to and a citation for the paper,

Moiré synaptic transistor with room-temperature neuromorphic functionality by Xiaodong Yan, Zhiren Zheng, Vinod K. Sangwan, Justin H. Qian, Xueqiao Wang, Stephanie E. Liu, Kenji Watanabe, Takashi Taniguchi, Su-Yang Xu, Pablo Jarillo-Herrero, Qiong Ma & Mark C. Hersam. Nature volume 624, pages 551–556 (2023) DOI: https://doi.org/10.1038/s41586-023-06791-1 Published online: 20 December 2023 Issue Date: 21 December 2023

This paper is behind a paywall.

100-fold increase in AI energy efficiency

Most people don’t realize how much energy computing, streaming video, and other technologies consume, and AI (artificial intelligence) consumes a lot. (For more about work being done in this area, there’s my October 13, 2023 posting about an upcoming ArtSci Salon event in Toronto featuring Laura U. Marks’s recent work ‘Streaming Carbon Footprint’ and my October 16, 2023 posting about how much water is used for AI.)

So this news is welcome, from an October 12, 2023 Northwestern University news release (also received via email and on EurekAlert), Note: Links have been removed,

AI just got 100-fold more energy efficient

Nanoelectronic device performs real-time AI classification without relying on the cloud

– AI is so energy hungry that most data analysis must be performed in the cloud
– New energy-efficient device enables AI tasks to be performed within wearables
– This allows real-time analysis and diagnostics for faster medical interventions
– Researchers tested the device by classifying 10,000 electrocardiogram samples
– The device successfully identified six types of heartbeats with 95% accuracy

Northwestern University engineers have developed a new nanoelectronic device that can perform accurate machine-learning classification tasks in the most energy-efficient manner yet. Using 100-fold less energy than current technologies, the device can crunch large amounts of data and perform artificial intelligence (AI) tasks in real time without beaming data to the cloud for analysis.

With its tiny footprint, ultra-low power consumption and lack of lag time to receive analyses, the device is ideal for direct incorporation into wearable electronics (like smart watches and fitness trackers) for real-time data processing and near-instant diagnostics.

To test the concept, engineers used the device to classify large amounts of information from publicly available electrocardiogram (ECG) datasets. Not only could the device efficiently and correctly identify an irregular heartbeat, it also was able to determine the arrhythmia subtype from among six different categories with near 95% accuracy.

The research was published today (Oct. 12 [2023]) in the journal Nature Electronics.

“Today, most sensors collect data and then send it to the cloud, where the analysis occurs on energy-hungry servers before the results are finally sent back to the user,” said Northwestern’s Mark C. Hersam, the study’s senior author. “This approach is incredibly expensive, consumes significant energy and adds a time delay. Our device is so energy efficient that it can be deployed directly in wearable electronics for real-time detection and data processing, enabling more rapid intervention for health emergencies.”

A nanotechnology expert, Hersam is Walter P. Murphy Professor of Materials Science and Engineering at Northwestern’s McCormick School of Engineering. He also is chair of the Department of Materials Science and Engineering, director of the Materials Research Science and Engineering Center and member of the International Institute for Nanotechnology. Hersam co-led the research with Han Wang, a professor at the University of Southern California, and Vinod Sangwan, a research assistant professor at Northwestern.

Before machine-learning tools can analyze new data, these tools must first accurately and reliably sort training data into various categories. For example, if a tool is sorting photos by color, then it needs to recognize which photos are red, yellow or blue in order to accurately classify them. An easy chore for a human, yes, but a complicated — and energy-hungry — job for a machine.

For current silicon-based technologies to categorize data from large sets like ECGs, it takes more than 100 transistors — each requiring its own energy to run. But Northwestern’s nanoelectronic device can perform the same machine-learning classification with just two devices. By reducing the number of devices, the researchers drastically reduced power consumption and developed a much smaller device that can be integrated into a standard wearable gadget.

The secret behind the novel device is its unprecedented tunability, which arises from a mix of materials. While traditional technologies use silicon, the researchers constructed the miniaturized transistors from two-dimensional molybdenum disulfide and one-dimensional carbon nanotubes. So instead of needing many silicon transistors — one for each step of data processing — the reconfigurable transistors are dynamic enough to switch among various steps.

“The integration of two disparate materials into one device allows us to strongly modulate the current flow with applied voltages, enabling dynamic reconfigurability,” Hersam said. “Having a high degree of tunability in a single device allows us to perform sophisticated classification algorithms with a small footprint and low energy consumption.”
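The release doesn’t spell out the algorithm, but the paper cited at the end of this section is titled “Reconfigurable mixed-kernel heterojunction transistors for personalized support vector machine classification,” i.e., a single tunable device stands in for a family of kernel (similarity) functions. A toy software sketch of such a mixed kernel, assuming Gaussian and sigmoid components (a common pairing; the device’s actual kernel shapes may differ):

```python
import math

def gaussian_kernel(x: float, center: float, gamma: float = 4.0) -> float:
    """Bell-shaped response, peaked at `center`."""
    return math.exp(-gamma * (x - center) ** 2)

def sigmoid_kernel(x: float, center: float, k: float = 4.0) -> float:
    """Step-like response that switches on near `center`."""
    return 1.0 / (1.0 + math.exp(-k * (x - center)))

def mixed_kernel(x: float, center: float, alpha: float) -> float:
    """alpha = 1 gives a pure Gaussian, alpha = 0 a pure sigmoid.
    In software, each shape is a separate function; the appeal of the
    device is that one tunable element sweeps this whole family."""
    return alpha * gaussian_kernel(x, center) + (1 - alpha) * sigmoid_kernel(x, center)

for alpha in (0.0, 0.5, 1.0):
    print(f"alpha={alpha}: response={mixed_kernel(0.5, 0.0, alpha):.3f}")
```

In a fixed silicon implementation, each kernel shape would need its own circuit; here, the `alpha` knob plays the role of the applied voltages Hersam describes.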

To test the device, the researchers looked to publicly available medical datasets. They first trained the device to interpret data from ECGs, a task that typically requires significant time from trained health care workers. Then, they asked the device to classify six types of heartbeats: normal, atrial premature beat, premature ventricular contraction, paced beat, left bundle branch block beat and right bundle branch block beat.
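For readers unfamiliar with multiclass classification, the six-way task has this general shape: map a feature vector extracted from each heartbeat to one of the six labels. The sketch below uses a nearest-centroid rule as a simple software stand-in for the paper's support-vector-machine approach; only the six labels come from the release — the feature values are invented for illustration:

```python
# Hypothetical per-class feature centroids (e.g. beat interval, QRS width).
# Values are made up for illustration; the six labels are from the release.
centroids = {
    "normal":                            (0.80, 0.08),
    "atrial premature beat":             (0.55, 0.08),
    "premature ventricular contraction": (0.60, 0.14),
    "paced beat":                        (1.00, 0.12),
    "left bundle branch block beat":     (0.85, 0.15),
    "right bundle branch block beat":    (0.82, 0.13),
}

def classify(sample):
    """Assign the label whose centroid is nearest (squared Euclidean)."""
    return min(centroids,
               key=lambda lbl: sum((s - c) ** 2
                                   for s, c in zip(sample, centroids[lbl])))

print(classify((0.79, 0.08)))  # prints "normal"
```

The reported result is that the device performs this kind of mapping in hardware with just two elements, rather than in software running on 100-plus silicon transistors.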

The nanoelectronic device accurately identified each arrhythmia type out of 10,000 ECG samples. By bypassing the need to send data to the cloud, the device not only saves critical time for a patient but also protects privacy.

“Every time data are passed around, it increases the likelihood of the data being stolen,” Hersam said. “If personal health data is processed locally — such as on your wrist in your watch — that presents a much lower security risk. In this manner, our device improves privacy and reduces the risk of a breach.”

Hersam imagines that, eventually, these nanoelectronic devices could be incorporated into everyday wearables, personalized to each user’s health profile for real-time applications. They would enable people to make the most of the data they already collect without sapping power.

“Artificial intelligence tools are consuming an increasing fraction of the power grid,” Hersam said. “It is an unsustainable path if we continue relying on conventional computer hardware.”

Here’s a link to and a citation for the paper,

Reconfigurable mixed-kernel heterojunction transistors for personalized support vector machine classification by Xiaodong Yan, Justin H. Qian, Jiahui Ma, Aoyang Zhang, Stephanie E. Liu, Matthew P. Bland, Kevin J. Liu, Xuechun Wang, Vinod K. Sangwan, Han Wang & Mark C. Hersam. Nature Electronics (2023) DOI: https://doi.org/10.1038/s41928-023-01042-7 Published: 12 October 2023

This paper is behind a paywall.