
Save energy with neuromorphic (brainlike) hardware

It seems the appetite for computing power is bottomless, which presents a problem in a world where energy resources are increasingly constrained. A May 24, 2022 news item on ScienceDaily announces research into neuromorphic computing which hints that the energy efficiency long promised by the technology may be realized in the foreseeable future,

For the first time, TU Graz’s [Graz University of Technology; Austria] Institute of Theoretical Computer Science and Intel Labs demonstrated experimentally that a large neural network can process sequences such as sentences while consuming four to sixteen times less energy running on neuromorphic hardware than on non-neuromorphic hardware. The new research is based on Intel Labs’ Loihi neuromorphic research chip, which draws on insights from neuroscience to create chips that function similarly to the biological brain.

Rich Uhlig, managing director of Intel Labs, holds one of Intel’s Nahuku boards, each of which contains 8 to 32 Intel Loihi neuromorphic chips. Intel’s latest neuromorphic system, Pohoiki Beach, is made up of multiple Nahuku boards and contains 64 Loihi chips. Pohoiki Beach was introduced in July 2019. (Credit: Tim Herman/Intel Corporation)

A May 24, 2022 Graz University of Technology (TU Graz) press release (also on EurekAlert), which originated the news item, delves further into the research (Note: Links have been removed),

The research was funded by The Human Brain Project (HBP), one of the largest research projects in the world, with more than 500 scientists and engineers across Europe studying the human brain. The results of the research are published in the paper “Memory for AI Applications in Spike-based Neuromorphic Hardware” [sic] (DOI 10.1038/s42256-022-00480-w) in Nature Machine Intelligence.

Human brain as a role model

Smart machines and intelligent computers that can autonomously recognize and infer objects and relationships between different objects are the subjects of worldwide artificial intelligence (AI) research. Energy consumption is a major obstacle on the path to a broader application of such AI methods. It is hoped that neuromorphic technology will provide a push in the right direction. Neuromorphic technology is modelled after the human brain, which is highly efficient in using energy. To process information, its hundred billion neurons consume only about 20 watts, not much more energy than an average energy-saving light bulb.

In the research, the group focused on algorithms that work with temporal processes. For example, the system had to answer questions about a previously told story and grasp the relationships between objects or people from the context. The hardware tested consisted of 32 Loihi chips.

Loihi research chip: up to sixteen times more energy-efficient than non-neuromorphic hardware

“Our system is four to sixteen times more energy-efficient than other AI models on conventional hardware,” says Philipp Plank, a doctoral student at TU Graz’s Institute of Theoretical Computer Science. Plank expects further efficiency gains as these models are migrated to the next generation of Loihi hardware, which significantly improves the performance of chip-to-chip communication.

“Intel’s Loihi research chips promise to bring gains in AI, especially by lowering their high energy cost,” said Mike Davies, director of Intel’s Neuromorphic Computing Lab. “Our work with TU Graz provides more evidence that neuromorphic technology can improve the energy efficiency of today’s deep learning workloads by re-thinking their implementation from the perspective of biology.”

Mimicking human short-term memory

In their neuromorphic network, the group reproduced a presumed memory mechanism of the brain, as Wolfgang Maass, Philipp Plank’s doctoral supervisor at the Institute of Theoretical Computer Science, explains: “Experimental studies have shown that the human brain can store information for a short period of time even without neural activity, namely in so-called ‘internal variables’ of neurons. Simulations suggest that a fatigue mechanism of a subset of neurons is essential for this short-term memory.”

Direct proof is lacking because these internal variables cannot yet be measured, but it means that the network only needs to check which neurons are currently fatigued in order to reconstruct what information it has previously processed. In other words, previous information is stored in the non-activity of neurons, and non-activity consumes the least energy.
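How a fatigue mechanism can act as a silent short-term memory is easier to see in a toy model. The sketch below is my own illustration, not the model from the paper: a single leaky integrate-and-fire neuron with an adaptive (fatiguing) threshold keeps an elevated internal variable for a while after it stops spiking, so recent activity can still be read out from a neuron that is currently silent.

```python
# Minimal sketch (not the paper's actual model): a leaky integrate-and-fire
# neuron with an adaptive ("fatiguing") threshold. After the neuron has been
# active, its fatigue variable stays elevated for a while even though it emits
# no spikes, so recent activity is still readable from this internal variable.
import numpy as np

def simulate(input_current, tau_mem=20.0, tau_adapt=200.0,
             v_thresh=1.0, beta=0.5, dt=1.0):
    """Return spike train and adaptation trace for a one-neuron toy model."""
    v, a = 0.0, 0.0                      # membrane potential, fatigue variable
    spikes, adaptation = [], []
    for i_t in input_current:
        v += dt / tau_mem * (-v + i_t)   # leaky integration of the input
        a += dt / tau_adapt * (-a)       # fatigue slowly decays back to zero
        if v > v_thresh + beta * a:      # effective threshold is raised by fatigue
            spikes.append(1)
            v = 0.0                      # reset membrane potential after a spike
            a += 1.0                     # each spike increases the fatigue
        else:
            spikes.append(0)
        adaptation.append(a)
    return np.array(spikes), np.array(adaptation)

# Drive the neuron for 100 ms, then remove the input for 300 ms.
stim = np.concatenate([np.full(100, 2.0), np.zeros(300)])
spikes, adaptation = simulate(stim)

# The neuron falls silent once the stimulus ends, but the fatigue variable
# still "remembers" that it was recently active.
print("spikes after stimulus:", spikes[100:].sum())
print("fatigue right after stimulus vs. 300 ms later:",
      round(adaptation[100], 2), round(adaptation[-1], 2))
```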

Symbiosis of recurrent and feed-forward network

The researchers link two types of deep learning networks for this purpose. Feedback (recurrent) neural networks are responsible for “short-term memory.” Many such recurrent modules filter possibly relevant information out of the input signal and store it. A feed-forward network then determines which of the relationships found are important for solving the task at hand. Meaningless relationships are screened out, and the neurons only fire in those modules where relevant information has been found. This process ultimately leads to energy savings.
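As a rough illustration of this division of labour (my own sketch, with made-up dimensions and random weights rather than the published architecture), a small recurrent module can carry the short-term memory over a sequence while a feed-forward readout keeps only the features it deems relevant and leaves the rest inactive:

```python
# Minimal sketch (assumed architecture, not the published model): a small
# recurrent module acts as short-term memory over an input sequence, and a
# feed-forward readout then keeps only the features it considers relevant,
# leaving the rest silent (no activity, hence little energy on spiking hardware).
import numpy as np

rng = np.random.default_rng(0)

n_in, n_rec, n_out = 8, 16, 4
W_in = rng.normal(0, 0.3, (n_rec, n_in))     # input -> recurrent module
W_rec = rng.normal(0, 0.3, (n_rec, n_rec))   # recurrent (feedback) connections
W_out = rng.normal(0, 0.3, (n_out, n_rec))   # recurrent -> feed-forward readout

def run(sequence):
    """Process a sequence and return the sparse feed-forward readout."""
    h = np.zeros(n_rec)                       # recurrent state = short-term memory
    for x in sequence:
        h = np.tanh(W_in @ x + W_rec @ h)     # store/update candidate information
    y = W_out @ h
    return np.maximum(y, 0.0)                 # ReLU: irrelevant outputs stay at zero

sequence = rng.normal(size=(10, n_in))        # toy "story" of 10 input steps
readout = run(sequence)
print("readout:", np.round(readout, 2))
print("inactive (screened-out) outputs:", int((readout == 0).sum()))
```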

“Recurrent neural structures are expected to provide the greatest gains for applications running on neuromorphic hardware in the future,” said Davies. “Neuromorphic hardware like Loihi is uniquely suited to facilitate the fast, sparse and unpredictable patterns of network activity that we observe in the brain and need for the most energy efficient AI applications.”

This research was financially supported by Intel and the European Human Brain Project, which connects neuroscience, medicine, and brain-inspired technologies in the EU. For this purpose, the project is creating a permanent digital research infrastructure, EBRAINS. This research work is anchored in the Fields of Expertise Human and Biotechnology and Information, Communication & Computing, two of the five Fields of Expertise of TU Graz.

Here’s a link to and a citation for the paper,

A Long Short-Term Memory for AI Applications in Spike-based Neuromorphic Hardware by Arjun Rao, Philipp Plank, Andreas Wild & Wolfgang Maass. Nature Machine Intelligence (2022) DOI: https://doi.org/10.1038/s42256-022-00480-w Published: 19 May 2022

This paper is behind a paywall.

For anyone interested in the EBRAINS project, here’s a description from their About page,

EBRAINS provides digital tools and services which can be used to address challenges in brain research and brain-inspired technology development. Its components are designed with, by, and for researchers. The tools assist scientists to collect, analyse, share, and integrate brain data, and to perform modelling and simulation of brain function.

EBRAINS’ goal is to accelerate the effort to understand human brain function and disease.

This EBRAINS research infrastructure is the entry point for researchers to discover EBRAINS services. The services are being developed and powered by the EU-funded Human Brain Project.

You can register to use the EBRAINS research infrastructure HERE

One last note: the Human Brain Project is a major European Union (EU)-funded science initiative (1 billion euros) announced in 2013, with the funding to be paid out over 10 years.

Brain composer

This is a representation of the work they are doing on brain-computer interfaces (BCI) at the Technical University of Graz (TU Graz; Austria),

A Sept. 11, 2017 news item on phys.org announces the research into thinking melodies and turning them into a musical score,

TU Graz researchers develop new brain-computer interface application that allows music to be composed by the power of thought. They have published their results in the current issue of the journal PLOS ONE.

Brain-computer interfaces (BCI) can replace bodily functions to a certain degree. Thanks to BCI, physically impaired persons can control special prostheses via their minds, surf the internet and write emails.

A group led by BCI expert Gernot Müller-Putz from TU Graz’s Institute of Neural Engineering shows that experiences of quite a different tone can be sounded from the keys of brain-computer interfaces. Starting from an established BCI method for writing, the team has developed a new application by which music can be composed and transferred onto a musical score through the power of thought. It employs a special cap that measures brain waves, the adapted BCI, music composition software, and a bit of musical knowledge.

A Sept. 6, 2017 TU Graz press release by Suzanne Eigner, which originated the news item, explains the research in more detail,

The basic principle of the BCI method used, which is called P300, can be briefly described: various options, such as letters or notes, pauses, chords, etc. flash by one after the other in a table. If you’re trained and can focus on the desired option while it lights up, you cause a minute change in your brain waves. The BCI recognises this change and draws conclusions about the chosen option.
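To make the P300 principle concrete, here is a minimal sketch using synthetic data and hypothetical option names (not the study’s actual pipeline): each option flashes repeatedly, the flashes of the attended option evoke a small positive deflection roughly 300 ms later, and averaging the EEG epochs per option makes that deflection stand out so the chosen option can be identified.

```python
# Minimal sketch (synthetic data, not the study's pipeline): in a P300 speller,
# each option flashes several times; flashes of the attended option evoke a
# positive deflection roughly 300 ms after the stimulus. Averaging the EEG
# epochs per option and comparing that time window reveals the chosen option.
import numpy as np

rng = np.random.default_rng(1)

fs = 250                                  # sampling rate in Hz
epoch = int(0.6 * fs)                     # 600 ms epoch after each flash
p300_window = slice(int(0.25 * fs), int(0.45 * fs))   # ~250-450 ms after the flash

options = ["note C", "note D", "note E", "pause"]   # hypothetical items in the table
attended = 2                              # the user focuses on "note E"
n_flashes = 15                            # each option flashes 15 times

def synthetic_epoch(has_p300):
    """One EEG epoch: noise, plus a small bump around 300 ms if attended."""
    x = rng.normal(0, 1.0, epoch)
    if has_p300:
        t = np.arange(epoch) / fs
        x += 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    return x

# Average the epochs for each option and score the P300 time window.
scores = []
for idx, _ in enumerate(options):
    epochs = [synthetic_epoch(idx == attended) for _ in range(n_flashes)]
    average = np.mean(epochs, axis=0)     # averaging suppresses the background noise
    scores.append(average[p300_window].mean())

print("scores per option:", np.round(scores, 2))
print("selected option:", options[int(np.argmax(scores))])
```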

Musical test persons

Eighteen test persons chosen for the study by Gernot Müller-Putz, Andreas Pinegger and Selina C. Wriessnegger from TU Graz’s Institute of Neural Engineering, as well as Hannah Hiebel, now at the Institute of Cognitive Psychology & Neuroscience at the University of Graz, had to “think” melodies onto a musical score. All test subjects were in sound bodily health during the study and had a certain degree of basic musical and compositional knowledge, since they all played musical instruments to some degree. Among the test persons was the late Graz composer and clarinettist, Franz Cibulka. “The results of the BCI compositions can really be heard. And what is more important: the test persons enjoyed it. After a short training session, all of them could start composing, see their melodies on the score and then play them. The very positive results of the study with bodily healthy test persons are the first step in a possible expansion of the BCI composition to patients,” stresses Müller-Putz.

Sideshow of BCI research

This little-noticed sideshow of the lively BCI research at TU Graz, with its distinct focus on disabled persons, shows us which other avenues may yet be worth exploring. Meanwhile there are some initial attempts at BCI systems on smartphones. This makes it easier for people to use BCI applications, since the smartphone, as a powerful computer, is becoming part of the BCI system. It is thus conceivable, for instance, to have BCI apps which can analyse brain signals for various applications. “20 years ago, the idea of composing a piece of music using the power of the mind was unimaginable. Now we can do it, and at the same time have tens of new, different ideas which are in part, once again, a long way from becoming reality. We still need a bit more time before it is mature enough for daily applications. The BCI community is working in many directions at high pressure.”

Here’s a link to and a citation for the paper,

Composing only by thought: Novel application of the P300 brain-computer interface by Andreas Pinegger, Hannah Hiebel, Selina C. Wriessnegger, Gernot R. Müller-Putz. PLOS ONE DOI: https://doi.org/10.1371/journal.pone.0181584 Published: September 6, 2017

This paper is open access.

This BCI ‘sideshow’ reminded me of The Music Man, a musical by Meredith Willson. It was both a play and a film and I’ve only ever seen the 1962 film. It features a con man, Harold Hill, who sells musical instruments and uniforms in small towns in Iowa. He has no musical training but while he’s conning the townspeople he convinces them that he can provide musical training with his ‘think method’. After falling in love with one of the townsfolk, he is hunted down and made to prove his method works. This is a clip from a Broadway revival of the play where Harold Hill is hoping that his ‘think method’ will yield results,

Of course, the people in this study had musical training so they could think a melody into a musical score, but I find the echo from the past amusing nonetheless.