Monthly Archives: January 2023

Implantable living pharmacy

I stumbled across a very interesting US Defense Advanced Research Projects Agency (DARPA) project (from an August 30, 2021 posting on Northwestern University’s Rivnay Lab [a laboratory for organic bioelectronics] blog),

Our lab has received a cooperative agreement with DARPA to develop a wireless, fully implantable ‘living pharmacy’ device that could help regulate human sleep patterns. The project is through DARPA’s BTO (biotechnology office)’s Advanced Acclimation and Protection Tool for Environmental Readiness (ADAPTER) program, meant to address physical challenges of travel, such as jetlag and fatigue.

The device, called NTRAIN (Normalizing Timing of Rhythms Across Internal Networks of Circadian Clocks), would control the body’s circadian clock, reducing the time it takes for a person to recover from disrupted sleep/wake cycles by as much as half the usual time.

The project spans five institutions: Northwestern, Rice University, Carnegie Mellon, the University of Minnesota, and Blackrock Neurotech.

Prior to the Aug. 30, 2021 posting, Amanda Morris wrote a May 13, 2021 article for Northwestern NOW (university magazine), which provides more details about the project, Note: A link has been removed,

The first phase of the highly interdisciplinary program will focus on developing the implant. The second phase, contingent on the first, will validate the device. If that milestone is met, then researchers will test the device in human trials, as part of the third phase. The full funding corresponds to $33 million over four-and-a-half years. 

Nicknamed the “living pharmacy,” the device could be a powerful tool for military personnel, who frequently travel across multiple time zones, and shift workers including first responders, who vacillate between overnight and daytime shifts.

Combining synthetic biology with bioelectronics, the team will engineer cells to produce the same peptides that the body makes to regulate sleep cycles, precisely adjusting timing and dose with bioelectronic controls. When the engineered cells are exposed to light, they will generate precisely dosed peptide therapies. 

“This control system allows us to deliver a peptide of interest on demand, directly into the bloodstream,” said Northwestern’s Jonathan Rivnay, principal investigator of the project. “No need to carry drugs, no need to inject therapeutics and — depending on how long we can make the device last — no need to refill the device. It’s like an implantable pharmacy on a chip that never runs out.” 

Beyond controlling circadian rhythms, the researchers believe this technology could be modified to release other types of therapies with precise timing and dosing for potentially treating pain and disease. The DARPA program also will help researchers better understand sleep/wake cycles, in general.

“The experiments carried out in these studies will enable new insights into how internal circadian organization is maintained,” said Turek [Fred W. Turek], who co-leads the sleep team with Vitaterna [Martha Hotz Vitaterna]. “These insights will lead to new therapeutic approaches for sleep disorders as well as many other physiological and mental disorders, including those associated with aging where there is often a spontaneous breakdown in temporal organization.” 

For those who like to dig even deeper, Dieynaba Young’s June 17, 2021 article for Smithsonian Magazine (GetPocket.com link to article) provides greater context and greater satisfaction, Note: Links have been removed,

In 1926, Fritz Kahn completed Man as Industrial Palace, the preeminent lithograph in his five-volume publication The Life of Man. The illustration shows a human body bustling with tiny factory workers. They cheerily operate a brain filled with switchboards, circuits and manometers. Below their feet, an ingenious network of pipes, chutes and conveyor belts makes up the blood circulatory system. The image epitomizes a central motif in Kahn’s oeuvre: the parallel between human physiology and manufacturing, or the human body as a marvel of engineering.

An apparatus in the embryonic stage of development at the time of this writing in June of 2021—the so-called “implantable living pharmacy”—could have easily originated in Kahn’s fervid imagination. The concept is being developed by the Defense Advanced Research Projects Agency (DARPA) in conjunction with several universities, notably Northwestern and Rice. Researchers envision a miniaturized factory, tucked inside a microchip, that will manufacture pharmaceuticals from inside the body. The drugs will then be delivered to precise targets at the command of a mobile application. …

The implantable living pharmacy, which is still in the “proof of concept” stage of development, is actually envisioned as two separate devices—a microchip implant and an armband. The implant will contain a layer of living synthetic cells, along with a sensor that measures temperature, a short-range wireless transmitter and a photo detector. The cells are sourced from a human donor and reengineered to perform specific functions. They’ll be mass produced in the lab, and slathered onto a layer of tiny LED lights.

The microchip will be set with a unique identification number and encryption key, then implanted under the skin in an outpatient procedure. The chip will be controlled by a battery-powered hub attached to an armband. That hub will receive signals transmitted from a mobile app.

If a soldier wishes to reset their internal clock, they’ll simply grab their phone, log onto the app and enter their upcoming itinerary—say, a flight departing at 5:30 a.m. from Arlington, Virginia, and arriving 16 hours later at Fort Buckner in Okinawa, Japan. Using short-range wireless communications, the hub will receive the signal and activate the LED lights inside the chip. The lights will shine on the synthetic cells, stimulating them to generate two compounds that are naturally produced in the body. The compounds will be released directly into the bloodstream, heading towards targeted locations, such as a tiny, centrally-located structure in the brain called the suprachiasmatic nucleus (SCN) that serves as master pacemaker of the circadian rhythm. Whatever the target location, the flow of biomolecules will alter the natural clock. When the soldier arrives in Okinawa, their body will be perfectly in tune with local time.

The synthetic cells will be kept isolated from the host’s immune system by a membrane constructed of novel biomaterials, allowing only nutrients and oxygen in and only the compounds out. Should anything go wrong, the host would swallow a pill that would kill the cells inside the chip only, leaving the rest of the body unaffected.

If you have the time, I recommend reading Young’s June 17, 2021 Smithsonian Magazine article (GetPocket.com link to article) in its entirety. Young goes on to discuss hacking, malware, ethical/societal issues, and more.

There is an animation of Kahn’s original poster in a June 23, 2011 posting on openculture.com (also found on Vimeo; Der Mensch als Industriepalast [Man as Industrial Palace]).

Credits: Idea & Animation: Henning M. Lederer / led-r-r.net; Sound-Design: David Indge; and original poster art: Fritz Kahn.

New chip for neuromorphic computing runs at a fraction of the energy of today’s systems

An August 17, 2022 news item on Nanowerk announces big (so to speak) claims from a team researching neuromorphic (brainlike) computer chips,

An international team of researchers has designed and built a chip that runs computations directly in memory and can run a wide variety of artificial intelligence (AI) applications–all at a fraction of the energy consumed by computing platforms for general-purpose AI computing.

The NeuRRAM neuromorphic chip brings AI a step closer to running on a broad range of edge devices, disconnected from the cloud, where they can perform sophisticated cognitive tasks anywhere and anytime without relying on a network connection to a centralized server. Applications abound in every corner of the world and every facet of our lives, and range from smart watches, to VR headsets, smart earbuds, smart sensors in factories and rovers for space exploration.

The NeuRRAM chip is not only twice as energy efficient as the state-of-the-art “compute-in-memory” chips, an innovative class of hybrid chips that runs computations in memory, it also delivers results that are just as accurate as conventional digital chips. Conventional AI platforms are a lot bulkier and typically are constrained to using large data servers operating in the cloud.

In addition, the NeuRRAM chip is highly versatile and supports many different neural network models and architectures. As a result, the chip can be used for many different applications, including image recognition and reconstruction as well as voice recognition.

…

An August 17, 2022 University of California at San Diego (UCSD) news release (also on EurekAlert), which originated the news item, provides more detail than usually found in a news release,

“The conventional wisdom is that the higher efficiency of compute-in-memory is at the cost of versatility, but our NeuRRAM chip obtains efficiency while not sacrificing versatility,” said Weier Wan, the paper’s first corresponding author and a recent Ph.D. graduate of Stanford University who worked on the chip while at UC San Diego, where he was co-advised by Gert Cauwenberghs in the Department of Bioengineering. 

The research team, co-led by bioengineers at the University of California San Diego, presents their results in the Aug. 17 [2022] issue of Nature.

Currently, AI computing is both power hungry and computationally expensive. Most AI applications on edge devices involve moving data from the devices to the cloud, where the AI processes and analyzes it. Then the results are moved back to the device. That’s because most edge devices are battery-powered and as a result only have a limited amount of power that can be dedicated to computing. 

By reducing power consumption needed for AI inference at the edge, this NeuRRAM chip could lead to more robust, smarter and accessible edge devices and smarter manufacturing. It could also lead to better data privacy as the transfer of data from devices to the cloud comes with increased security risks. 

On AI chips, moving data from memory to computing units is one major bottleneck. 

“It’s the equivalent of doing an eight-hour commute for a two-hour work day,” Wan said. 

To solve this data transfer issue, researchers used what is known as resistive random-access memory, a type of non-volatile memory that allows for computation directly within memory rather than in separate computing units. RRAM and other emerging memory technologies used as synapse arrays for neuromorphic computing were pioneered in the lab of Philip Wong, Wan’s advisor at Stanford and a main contributor to this work. Computation with RRAM chips is not necessarily new, but generally it leads to a decrease in the accuracy of the computations performed on the chip and a lack of flexibility in the chip’s architecture. 
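For readers who want a concrete picture of compute-in-memory, the crossbar idea can be sketched in a few lines of Python. The stored conductances act as matrix weights, the applied voltages as the input vector, and the summed column currents are the matrix-vector product. The values here are invented for illustration; a real RRAM array does this summation in analog, with all the noise that implies,

```python
# Conductances stored in the array: G[i][j] is the device at row i, column j.
G = [[1.0, 0.5],
     [0.2, 0.8],
     [0.3, 0.1]]          # siemens (illustrative values)

v = [0.1, 0.2, 0.3]       # volts applied to the rows

# Kirchhoff's current law: each column current is the sum of v[i] * G[i][j],
# so the whole matrix-vector product emerges in one analog step rather than
# being shuttled between memory and a separate compute unit.
currents = [sum(v[i] * G[i][j] for i in range(len(v))) for j in range(2)]
print([round(c, 6) for c in currents])  # [0.23, 0.24]
```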

“Compute-in-memory has been common practice in neuromorphic engineering since it was introduced more than 30 years ago,” Cauwenberghs said.  “What is new with NeuRRAM is that the extreme efficiency now goes together with great flexibility for diverse AI applications with almost no loss in accuracy over standard digital general-purpose compute platforms.”

A carefully crafted methodology was key to the work with multiple levels of “co-optimization” across the abstraction layers of hardware and software, from the design of the chip to its configuration to run various AI tasks. In addition, the team made sure to account for various constraints that span from memory device physics to circuits and network architecture. 

“This chip now provides us with a platform to address these problems across the stack from devices and circuits to algorithms,” said Siddharth Joshi, an assistant professor of computer science and engineering at the University of Notre Dame, who started working on the project as a Ph.D. student and postdoctoral researcher in Cauwenberghs’ lab at UC San Diego. 

Chip performance

Researchers measured the chip’s energy efficiency by a measure known as energy-delay product, or EDP. EDP combines both the amount of energy consumed for every operation and the time it takes to complete the operation. By this measure, the NeuRRAM chip achieves 1.6 to 2.3 times lower EDP (lower is better) and 7 to 13 times higher computational density than state-of-the-art chips. 

Researchers ran various AI tasks on the chip. It achieved 99% accuracy on a handwritten digit recognition task; 85.7% on an image classification task; and 84.7% on a Google speech command recognition task. In addition, the chip also achieved a 70% reduction in image-reconstruction error on an image-recovery task. These results are comparable to existing digital chips that perform computation under the same bit-precision, but with drastic savings in energy. 

Researchers point out that one key contribution of the paper is that all the results featured are obtained directly on the hardware. In many previous works of compute-in-memory chips, AI benchmark results were often obtained partially by software simulation. 

Next steps include improving architectures and circuits and scaling the design to more advanced technology nodes. Researchers also plan to tackle other applications, such as spiking neural networks.

“We can do better at the device level, improve circuit design to implement additional features and address diverse applications with our dynamic NeuRRAM platform,” said Rajkumar Kubendran, an assistant professor for the University of Pittsburgh, who started work on the project while a Ph.D. student in Cauwenberghs’ research group at UC San Diego.

In addition, Wan is a founding member of a startup that works on productizing the compute-in-memory technology. “As a researcher and an engineer, my ambition is to bring research innovations from labs into practical use,” Wan said. 

New architecture 

The key to NeuRRAM’s energy efficiency is an innovative method to sense output in memory. Conventional approaches use voltage as input and measure current as the result. But this leads to the need for more complex and more power hungry circuits. In NeuRRAM, the team engineered a neuron circuit that senses voltage and performs analog-to-digital conversion in an energy efficient manner. This voltage-mode sensing can activate all the rows and all the columns of an RRAM array in a single computing cycle, allowing higher parallelism. 

In the NeuRRAM architecture, CMOS neuron circuits are physically interleaved with RRAM weights. It differs from conventional designs where CMOS circuits are typically at the periphery of the RRAM weights. The neuron’s connections with the RRAM array can be configured to serve as either input or output of the neuron. This allows neural network inference in various data flow directions without incurring overheads in area or power consumption. This in turn makes the architecture easier to reconfigure. 

To make sure that accuracy of the AI computations can be preserved across various neural network architectures, researchers developed a set of hardware algorithm co-optimization techniques. The techniques were verified on various neural networks including convolutional neural networks, long short-term memory, and restricted Boltzmann machines. 

As a neuromorphic AI chip, NeuRRAM performs parallel distributed processing across 48 neurosynaptic cores. To simultaneously achieve high versatility and high efficiency, NeuRRAM supports data-parallelism by mapping a layer in the neural network model onto multiple cores for parallel inference on multiple data. Also, NeuRRAM offers model-parallelism by mapping different layers of a model onto different cores and performing inference in a pipelined fashion.
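The two mapping strategies can be sketched abstractly. The layer names and the assignment scheme below are hypothetical, meant only to show the difference between replicating one layer across cores versus pipelining layers across cores; this is not NeuRRAM’s actual mapper,

```python
# 48 cores, as in the chip; the names are just labels for illustration.
cores = [f"core{i}" for i in range(48)]

# Data-parallelism: one layer replicated across several cores, each replica
# handling a different input from the batch (round-robin here).
def map_data_parallel(layer, inputs, n_replicas):
    return {x: (layer, cores[k % n_replicas]) for k, x in enumerate(inputs)}

# Model-parallelism: different layers on different cores, run as a pipeline.
def map_model_parallel(layers):
    return {layer: cores[i] for i, layer in enumerate(layers)}

print(map_data_parallel("conv1", ["img0", "img1", "img2"], n_replicas=2))
print(map_model_parallel(["conv1", "conv2", "fc"]))
```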

An international research team

The work is the result of an international team of researchers. 

The UC San Diego team designed the CMOS circuits that implement the neural functions interfacing with the RRAM arrays to support the synaptic functions in the chip’s architecture, for high efficiency and versatility. Wan, working closely with the entire team, implemented the design; characterized the chip; trained the AI models; and executed the experiments. Wan also developed a software toolchain that maps AI applications onto the chip. 

The RRAM synapse array and its operating conditions were extensively characterized and optimized at Stanford University. 

The RRAM array was fabricated and integrated onto CMOS at Tsinghua University. 

The team at Notre Dame contributed to both the design and architecture of the chip and the subsequent machine learning model design and training.

The research started as part of the National Science Foundation funded Expeditions in Computing project on Visual Cortex on Silicon at Penn State University, with continued funding support from the Office of Naval Research Science of AI program, the Semiconductor Research Corporation and DARPA [{US} Defense Advanced Research Projects Agency] JUMP program, and Western Digital Corporation. 

Here’s a link to and a citation for the paper,

A compute-in-memory chip based on resistive random-access memory by Weier Wan, Rajkumar Kubendran, Clemens Schaefer, Sukru Burc Eryilmaz, Wenqiang Zhang, Dabin Wu, Stephen Deiss, Priyanka Raina, He Qian, Bin Gao, Siddharth Joshi, Huaqiang Wu, H.-S. Philip Wong & Gert Cauwenberghs. Nature volume 608, pages 504–512 (2022) DOI: https://doi.org/10.1038/s41586-022-04992-8 Published: 17 August 2022 Issue Date: 18 August 2022

This paper is open access.

Synaptic transistors for brainlike computers based on (more environmentally friendly) graphene

An August 9, 2022 news item on ScienceDaily describes research investigating materials other than silicon for neuromorphic (brainlike) computing purposes,

Computers that think more like human brains are inching closer to mainstream adoption. But many unanswered questions remain. Among the most pressing: what types of materials can serve as the best building blocks to unlock the potential of this new style of computing?

For most traditional computing devices, silicon remains the gold standard. However, there is a movement to use more flexible, efficient and environmentally friendly materials for these brain-like devices.

In a new paper, researchers from The University of Texas at Austin developed synaptic transistors for brain-like computers using the thin, flexible material graphene. These transistors are similar to synapses in the brain, which connect neurons to each other.

An August 8, 2022 University of Texas at Austin news release (also on EurekAlert but published August 9, 2022), which originated the news item, provides more detail about the research,

“Computers that think like brains can do so much more than today’s devices,” said Jean Anne Incorvia, an assistant professor in the Cockrell School of Engineering’s Department of Electrical and Computer Engineering and the lead author on the paper published today in Nature Communications. “And by mimicking synapses, we can teach these devices to learn on the fly, without requiring huge training methods that take up so much power.”

The Research: A combination of graphene and Nafion, a polymer membrane material, makes up the backbone of the synaptic transistor. Together, these materials demonstrate key synaptic-like behaviors — most importantly, the ability for the pathways to strengthen over time as they are used more often, a type of neural muscle memory. In computing, this means that devices will be able to get better at tasks like recognizing and interpreting images over time and do it faster.
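That use-dependent strengthening can be mimicked with a toy saturating update rule: each activation nudges a synaptic weight toward a maximum, so pathways used more often end up stronger. The rule and constants below are illustrative only, not the device physics reported in the paper,

```python
# Toy model of "neural muscle memory": a synaptic weight that strengthens
# with every use and saturates at W_MAX. Constants are illustrative.
W_MAX = 1.0
LEARNING_RATE = 0.2

def stimulate(weight, pulses):
    """Apply `pulses` activations; each nudges the weight toward W_MAX."""
    for _ in range(pulses):
        weight += LEARNING_RATE * (W_MAX - weight)
    return weight

for n in (1, 5, 20):
    # More use -> stronger connection, approaching but never exceeding W_MAX.
    print(n, round(stimulate(0.1, n), 3))
```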

Another important finding is that these transistors are biocompatible, which means they can interact with living cells and tissue. That is key for potential applications in medical devices that come into contact with the human body. Most materials used for these early brain-like devices are toxic, so they would not be able to contact living cells in any way.

Why It Matters: With new high-tech concepts like self-driving cars, drones and robots, we are reaching the limits of what silicon chips can efficiently do in terms of data processing and storage. For these next-generation technologies, a new computing paradigm is needed. Neuromorphic devices mimic processing capabilities of the brain, a powerful computer for immersive tasks.

“Biocompatibility, flexibility, and softness of our artificial synapses is essential,” said Dmitry Kireev, a post-doctoral researcher who co-led the project. “In the future, we envision their direct integration with the human brain, paving the way for futuristic brain prosthesis.”

Will It Really Happen: Neuromorphic platforms are starting to become more common. Leading chipmakers such as Intel and Samsung have either produced neuromorphic chips already or are in the process of developing them. However, current chip materials place limitations on what neuromorphic devices can do, so academic researchers are working hard to find the perfect materials for soft brain-like computers.

“It’s still a big open space when it comes to materials; it hasn’t been narrowed down to the next big solution to try,” Incorvia said. “And it might not be narrowed down to just one solution, with different materials making more sense for different applications.”

The Team: The research was led by Incorvia and Deji Akinwande, professor in the Department of Electrical and Computer Engineering. The two have collaborated many times in the past, and Akinwande is a leading expert in graphene, using it in multiple research breakthroughs, most recently as part of a wearable electronic tattoo for blood pressure monitoring.

The idea for the project was conceived by Samuel Liu, a Ph.D. student and first author on the paper, in a class taught by Akinwande. Kireev then suggested the specific project. Harrison Jin, an undergraduate electrical and computer engineering student, measured the devices and analyzed data.

The team collaborated with T. Patrick Xiao and Christopher Bennett of Sandia National Laboratories, who ran neural network simulations and analyzed the resulting data.

Here’s a link to and a citation for the ‘graphene transistor’ paper,

Metaplastic and energy-efficient biocompatible graphene artificial synaptic transistors for enhanced accuracy neuromorphic computing by Dmitry Kireev, Samuel Liu, Harrison Jin, T. Patrick Xiao, Christopher H. Bennett, Deji Akinwande & Jean Anne C. Incorvia. Nature Communications volume 13, Article number: 4386 (2022) DOI: https://doi.org/10.1038/s41467-022-32078-6 Published: 28 July 2022

This paper is open access.

Neuromorphic computing and liquid-light interaction

Simulation result of light affecting liquid geometry, which in turn affects reflection and transmission properties of the optical mode, thus constituting a two-way light–liquid interaction mechanism. The degree of deformation serves as an optical memory, allowing it to store the power magnitude of the previous optical pulse and use fluid dynamics to affect the subsequent optical pulse at the same actuation region, thus constituting an architecture where memory is part of the computation process. Credit: Gao et al., doi 10.1117/1.AP.4.4.046005

This is a fascinating approach to neuromorphic (brainlike) computing and, given my recent post (August 29, 2022) about human cells being incorporated into computer chips, it’s part of my recent spate of posts about neuromorphic computing. From a July 25, 2022 news item on phys.org,

Sunlight sparkling on water evokes the rich phenomena of liquid-light interaction, spanning spatial and temporal scales. While the dynamics of liquids have fascinated researchers for decades, the rise of neuromorphic computing has sparked significant efforts to develop new, unconventional computational schemes based on recurrent neural networks, crucial to supporting a wide range of modern technological applications, such as pattern recognition and autonomous driving. As biological neurons also rely on a liquid environment, a convergence may be attained by bringing nanoscale nonlinear fluid dynamics to neuromorphic computing.

A July 25, 2022 SPIE (International Society for Optics and Photonics) press release (also on EurekAlert), which originated the news item, provides more detail,

Researchers from University of California San Diego recently proposed a novel paradigm where liquids, which usually do not strongly interact with light on a micro- or nanoscale, support significant nonlinear response to optical fields. As reported in Advanced Photonics, the researchers predict a substantial light–liquid interaction effect through a proposed nanoscale gold patch operating as an optical heater and generating thickness changes in a liquid film covering the waveguide.

The liquid film functions as an optical memory. Here’s how it works: Light in the waveguide affects the geometry of the liquid surface, while changes in the shape of the liquid surface affect the properties of the optical mode in the waveguide, thus constituting a mutual coupling between the optical mode and the liquid film. Importantly, as the liquid geometry changes, the properties of the optical mode undergo a nonlinear response; after the optical pulse stops, the magnitude of the liquid film’s deformation indicates the power of the previous optical pulse.

Remarkably, unlike traditional computational approaches, the nonlinear response and the memory reside at the same spatial region, thus suggesting realization of a compact (beyond von-Neumann) architecture where memory and computational unit occupy the same space. The researchers demonstrate that the combination of memory and nonlinearity allows the possibility of “reservoir computing” capable of performing digital and analog tasks, such as nonlinear logic gates and handwritten image recognition.
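For the curious, “reservoir computing” can be sketched minimally: a small, fixed random network supplies exactly the two ingredients named above, nonlinearity (here a tanh) and fading memory of past inputs, and only a linear readout (omitted here) would be trained. Everything below is a generic toy, not the authors’ optofluidic system,

```python
import math, random

# A tiny fixed random "reservoir": N nonlinear units with random input
# and recurrent weights. The reservoir is never trained; only a linear
# readout on its state would be (not shown).
random.seed(0)
N = 5
W_in = [random.uniform(-1, 1) for _ in range(N)]
W = [[random.uniform(-0.4, 0.4) for _ in range(N)] for _ in range(N)]

def run(inputs):
    state = [0.0] * N
    for u in inputs:
        # tanh = nonlinearity; the recurrent term carries fading memory.
        state = [math.tanh(W_in[i] * u + sum(W[i][j] * state[j] for j in range(N)))
                 for i in range(N)]
    return state

# Same final input, different history: the final states differ, showing the
# reservoir remembers the past.
print(run([1.0, 0.0, 0.5]))
print(run([0.0, 1.0, 0.5]))
```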

Their model also exploits another significant liquid feature: nonlocality. This enables them to predict computation enhancement that is simply not possible in solid state material platforms with limited nonlocal spatial scale. Despite nonlocality, the model does not quite achieve the levels of modern solid-state optics-based reservoir computing systems, yet the work nonetheless presents a clear roadmap for future experimental works aiming to validate the predicted effects and explore intricate coupling mechanisms of various physical processes in a liquid environment for computation.

Using multiphysics simulations to investigate coupling between light, fluid dynamics, heat transport, and surface tension effects, the researchers predict a family of novel nonlinear and nonlocal optical effects. They go a step further by indicating how these can be used to realize versatile, nonconventional computational platforms. Taking advantage of a mature silicon photonics platform, they suggest improvements to state-of-the-art liquid-assisted computation platforms by around five orders of magnitude in space and at least two orders of magnitude in speed.

Here’s a link to and a citation for the paper,

Thin liquid film as an optical nonlinear-nonlocal medium and memory element in integrated optofluidic reservoir computer by Chengkuan Gao, Prabhav Gaur, Shimon Rubin, Yeshaiahu Fainman. Advanced Photonics, 4(4), 046005 (2022). https://doi.org/10.1117/1.AP.4.4.046005 Published: 1 July 2022

This paper is open access.

Guide for memristive hardware design

An August 15, 2022 news item on ScienceDaily announces a type of guide for memristive hardware design,

They are many times faster than flash memory and require significantly less energy: memristive memory cells could revolutionize the energy efficiency of neuromorphic [brainlike] computers. In these computers, which are modeled on the way the human brain works, memristive cells function like artificial synapses. Numerous groups around the world are working on the use of corresponding neuromorphic circuits — but often with a lack of understanding of how they work and with faulty models. Jülich researchers have now summarized the physical principles and models in a comprehensive review article in the renowned journal Advances in Physics.

An August 15, 2022 Forschungszentrum Juelich press release (also on EurekAlert), which originated the news item, describes two papers designed to help researchers better understand and design memristive hardware,

Certain tasks – such as recognizing patterns and language – are performed highly efficiently by a human brain, requiring only about one ten-thousandth of the energy of a conventional, so-called “von Neumann” computer. One of the reasons lies in the structural differences: In a von Neumann architecture, there is a clear separation between memory and processor, which requires constant moving of large amounts of data. This is time and energy consuming – the so-called von Neumann bottleneck. In the brain, the computational operation takes place directly in the data memory and the biological synapses perform the tasks of memory and processor at the same time.

In Jülich, scientists have been working for more than 15 years on special data storage devices and components that can have similar properties to the synapses in the human brain. So-called memristive memory devices, also known as memristors, are considered to be extremely fast, energy-saving and can be miniaturized very well down to the nanometer range. The functioning of memristive cells is based on a very special effect: Their electrical resistance is not constant, but can be changed and reset again by applying an external voltage, theoretically continuously. The change in resistance is controlled by the movement of oxygen ions. If these move out of the semiconducting metal oxide layer, the material becomes more conductive and the electrical resistance drops. This change in resistance can be used to store information.
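The behaviour described above, a resistance that drifts under an applied voltage and then stays put when the voltage is removed, can be caricatured in a few lines. The linear drift rule and the constants are invented for illustration; they are not a physical valence-change model of the kind the review article develops,

```python
# Toy memristive cell: applying a voltage shifts the resistance (standing in
# for oxygen-ion motion in the oxide), and the new value is retained with no
# voltage applied (non-volatility). Constants and drift rule are illustrative.
R_ON, R_OFF = 1e3, 1e5   # ohms, resistance bounds

class Memristor:
    def __init__(self):
        self.r = R_OFF    # start in the high-resistance state

    def apply(self, voltage, dt=1e-6, k=5e10):
        # Positive voltage lowers resistance (SET); negative raises it (RESET).
        self.r -= k * voltage * dt
        self.r = max(R_ON, min(R_OFF, self.r))  # clamp to physical bounds

m = Memristor()
m.apply(+1.0)             # SET pulse: resistance drops...
print(m.r)                # 50000.0 -- ...and persists with no voltage applied
m.apply(-1.0)             # RESET pulse: back toward R_OFF
print(m.r)                # 100000.0
```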

The processes that can occur in cells are very complex and vary depending on the material system. Three researchers from the Jülich Peter Grünberg Institute – Prof. Regina Dittmann, Dr. Stephan Menzel, and Prof. Rainer Waser – have therefore compiled their research results in a detailed review article, “Nanoionic memristive phenomena in metal oxides: the valence change mechanism”. They explain in detail the various physical and chemical effects in memristors and shed light on the influence of these effects on the switching properties of memristive cells and their reliability.

“If you look at current research activities in the field of neuromorphic memristor circuits, they are often based on empirical approaches to material optimization,” said Rainer Waser, director at the Peter Grünberg Institute. “Our goal with our review article is to give researchers something to work with in order to enable insight-driven material optimization.” The team of authors worked on the approximately 200-page article for ten years and naturally had to keep incorporating advances in knowledge.

“The analog functioning of memristive cells required for their use as artificial synapses is not the normal case. Usually, there are sudden jumps in resistance, generated by the mutual amplification of ionic motion and Joule heat,” explains Regina Dittmann of the Peter Grünberg Institute. “In our review article, we provide researchers with the necessary understanding of how to change the dynamics of the cells to enable an analog operating mode.”

“You see time and again that groups simulate their memristor circuits with models that don’t take into account high dynamics of the cells at all. These circuits will never work.” said Stephan Menzel, who leads modeling activities at the Peter Grünberg Institute and has developed powerful compact models that are now in the public domain (www.emrl.de/jart.html). “In our review article, we provide the basics that are extremely helpful for a correct use of our compact models.”

Roadmap for neuromorphic computing

The “Roadmap of Neuromorphic Computing and Engineering”, which was published in May 2022, shows how neuromorphic computing can help to reduce the enormous energy consumption of IT globally. In it, researchers from the Peter Grünberg Institute (PGI-7), together with leading experts in the field, have compiled the various technological possibilities, computational approaches, learning algorithms and fields of application. 

According to the study, applications in the field of artificial intelligence, such as pattern recognition or speech recognition, are likely to benefit in a special way from the use of neuromorphic hardware. This is because they are based – much more so than classical numerical computing operations – on the shifting of large amounts of data. Memristive cells make it possible to process these gigantic data sets directly in memory without transporting them back and forth between processor and memory. This could improve the energy efficiency of artificial neural networks by orders of magnitude.
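The in-memory computation described above can be made concrete with a toy model: each memristive cell in a crossbar array stores a conductance (the "weight"), input voltages are applied along the rows, and Ohm's law plus Kirchhoff's current law deliver the weighted sums as column currents in a single analog step. The sketch below (NumPy, with invented values; it is an illustration of the general principle, not code from the review article) shows the matrix-vector product such a crossbar computes.

```python
import numpy as np

def crossbar_mvm(conductances, voltages):
    """Column currents of an ideal memristive crossbar.

    I_j = sum_i G[i][j] * V[i] -- the whole matrix-vector product
    happens "in memory," with no data shuttled to a processor.
    """
    return voltages @ conductances

# 3x2 array of stored conductances (siemens) and a 3-element input (volts).
# All values are invented for this sketch.
G = np.array([[1.0e-3, 2.0e-3],
              [0.5e-3, 1.0e-3],
              [2.0e-3, 0.5e-3]])
V = np.array([0.1, 0.2, 0.3])

print(crossbar_mvm(G, V))  # column currents in amperes: [8.0e-4, 5.5e-4]
```

In a physical crossbar the multiply-accumulate is performed by the device physics itself, which is where the energy savings over shuttling data between processor and memory come from.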

Memristive cells can also be interconnected to form high-density matrices that enable neural networks to learn locally. This so-called edge computing thus shifts computations from the data center to the factory floor, the vehicle, or the home of people in need of care. Thus, monitoring and controlling processes or initiating rescue measures can be done without sending data via a cloud. “This achieves two things at the same time: you save energy, and at the same time, personal data and data relevant to security remain on site,” says Prof. Dittmann, who played a key role in creating the roadmap as editor.

Here’s a link to and a citation for the ‘roadmap’,

2022 roadmap on neuromorphic computing and engineering by Dennis V Christensen, Regina Dittmann, Bernabe Linares-Barranco, Abu Sebastian, Manuel Le Gallo, Andrea Redaelli, Stefan Slesazeck, Thomas Mikolajick, Sabina Spiga, Stephan Menzel, Ilia Valov, Gianluca Milano, Carlo Ricciardi, Shi-Jun Liang, Feng Miao, Mario Lanza, Tyler J Quill, Scott T Keene, Alberto Salleo, Julie Grollier, Danijela Marković, Alice Mizrahi, Peng Yao, J Joshua Yang, Giacomo Indiveri, John Paul Strachan, Suman Datta, Elisa Vianello, Alexandre Valentian, Johannes Feldmann, Xuan Li, Wolfram H P Pernice, Harish Bhaskaran, Steve Furber, Emre Neftci, Franz Scherr, Wolfgang Maass, Srikanth Ramaswamy, Jonathan Tapson, Priyadarshini Panda, Youngeun Kim, Gouhei Tanaka, Simon Thorpe, Chiara Bartolozzi, Thomas A Cleland, Christoph Posch, ShihChii Liu, Gabriella Panuccio, Mufti Mahmud, Arnab Neelim Mazumder, Morteza Hosseini, Tinoosh Mohsenin, Elisa Donati, Silvia Tolu, Roberto Galeazzi, Martin Ejsing Christensen, Sune Holm, Daniele Ielmini and N Pryds. Neuromorphic Computing and Engineering, Volume 2, Number 2. DOI: 10.1088/2634-4386/ac4a83 Published 20 May 2022 © 2022 The Author(s)

This paper is open access.

Here’s the most recent paper,

Nanoionic memristive phenomena in metal oxides: the valence change mechanism by Regina Dittmann, Stephan Menzel & Rainer Waser. Advances in Physics, Volume 70, Issue 2 (2021), Pages 155-349. DOI: https://doi.org/10.1080/00018732.2022.2084006 Published online: 06 Aug 2022

This paper is behind a paywall.

Lab-made cartilage gel for stiff, achy knees

Researchers claim their lab-made cartilage is better than the real thing in an August 11, 2022 news item on phys.org, Note: Links have been removed,

Over-the-counter pain relievers, physical therapy, steroid injections—some people have tried it all and are still dealing with knee pain.

Often knee pain comes from the progressive wear and tear of cartilage known as osteoarthritis, which affects nearly one in six adults—867 million people—worldwide. For those who want to avoid replacing the entire knee joint, there may soon be another option that could help patients get back on their feet fast, pain-free, and stay that way.

Writing in the journal Advanced Functional Materials, a Duke University-led team says they have created the first gel-based cartilage substitute that is even stronger and more durable than the real thing.

Caption: Duke researchers have developed a gel-based cartilage substitute to relieve achy knees that’s even stronger and more durable than the real thing. Clinical trials to start next year. Credit: Canva / Benjamin Wiley, Duke University

Here’s the August 11, 2022 Duke University news release (also on EurekAlert), which originated the news item, where you’ll find more details about the research, Note: Links have been removed,

Mechanical testing reveals that the Duke team’s hydrogel — a material made of water-absorbing polymers — can be pressed and pulled with more force than natural cartilage, and is three times more resistant to wear and tear.

Implants made of the material are currently being developed by Sparta Biomedical and tested in sheep. Researchers are gearing up to begin clinical trials in humans next year.

“If everything goes according to plan, the clinical trial should start as soon as April 2023,” said Duke chemistry professor Benjamin Wiley, who led the research along with Duke mechanical engineering and materials science professor Ken Gall.

To make this material, the Duke team took thin sheets of cellulose fibers and infused them with a polymer called polyvinyl alcohol — a viscous goo consisting of stringy chains of repeating molecules — to form a gel.

The cellulose fibers act like the collagen fibers in natural cartilage, Wiley said — they give the gel strength when stretched. The polyvinyl alcohol helps it return to its original shape. The result is a Jello-like material, 60% water, which is supple yet surprisingly strong.

Natural cartilage can withstand a whopping 5,800 to 8,500 pounds per inch of tugging and squishing, respectively, before reaching its breaking point. Their lab-made version is the first hydrogel that can handle even more. It is 26% stronger than natural cartilage in tension, something like suspending seven grand pianos from a key ring, and 66% stronger in compression — which would be like parking a car on a postage stamp.

“It’s really off the charts in terms of hydrogel strength,” Wiley said.

The team has already made hydrogels with remarkable properties. In 2020, they reported that they had created the first hydrogel strong enough for knees, which feel the force of two to three times body weight with each step.

Putting the gel to practical use as a cartilage replacement, however, presented additional design challenges. One was achieving the upper limits of cartilage’s strength. Activities like hopping, lunging, or climbing stairs put some 10 Megapascals of pressure on the cartilage in the knee, or about 1,400 pounds per square inch. But the tissue can take up to four times that before it breaks.
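As a quick sanity check on the figures in that paragraph, converting 10 megapascals to pounds per square inch (using the standard factor 1 psi = 6,894.757 Pa) gives roughly 1,450 psi, consistent with the "about 1,400 pounds per square inch" quoted:

```python
# Unit-conversion check of the pressure figure quoted in the news release.
PA_PER_PSI = 6894.757  # pascals per pound-force per square inch

def mpa_to_psi(mpa):
    """Convert megapascals to pounds per square inch."""
    return mpa * 1.0e6 / PA_PER_PSI

print(round(mpa_to_psi(10)))  # 1450, i.e. "about 1,400 psi"
```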

“We knew there was room for improvement,” Wiley said.

In the past, researchers attempting to create stronger hydrogels used a freeze-thaw process to produce crystals within the gel, which drive out water and help hold the polymer chains together. In the new study, instead of freezing and thawing the hydrogel, the researchers used a heat treatment called annealing to coax even more crystals to form within the polymer network.

By increasing the crystal content, the researchers were able to produce a gel that can withstand five times as much stress from pulling and nearly twice as much squeezing relative to freeze-thaw methods.

The improved strength of the annealed gel also helped solve a second design challenge: securing it to the joint and getting it to stay put.

Cartilage forms a thin layer that covers the ends of bones so they don’t grind against one another. Previous studies haven’t been able to attach hydrogels directly to bone or cartilage with sufficient strength to keep them from breaking loose or sliding off. So the Duke team came up with a different approach.

Their method of attachment involves cementing and clamping the hydrogel to a titanium base. This is then pressed and anchored into a hole where the damaged cartilage used to be. Tests show the design stays fastened 68% more firmly than natural cartilage on bone.

“Another concern for knee implants is wear over time, both of the implant itself and the opposing cartilage,” Wiley said.

Other researchers have tried replacing damaged cartilage with knee implants made of metal or polyethylene, but because these materials are stiffer than cartilage they can chafe against other parts of the knee.

In wear tests, the researchers took artificial cartilage and natural cartilage and spun them against each other a million times, with a pressure similar to what the knee experiences during walking. Using a high-resolution X-ray scanning technique called micro-computed tomography (micro-CT), the scientists found that the surface of their lab-made version held up three times better than the real thing. Yet because the hydrogel mimics the smooth, slippery, cushiony nature of real cartilage, it protects other joint surfaces from friction as they slide against the implant.

Natural cartilage is remarkably durable stuff. But once damaged, it has limited ability to heal because it doesn’t have any blood vessels, Wiley said.

In the United States, osteoarthritis is twice as common today as it was a century ago. Surgery is an option when conservative treatments fail. Over the decades surgeons have developed a number of minimally invasive approaches, such as removing loose cartilage, or making holes to stimulate new growth, or transplanting healthy cartilage from a donor. But all of these methods require months of rehab, and some percentage of them fail over time.

Generally considered a last resort, total knee replacement is a proven way to relieve pain. But artificial joints don’t last forever, either. Particularly for younger patients who want to avoid major surgery for a device that will only need to be replaced again down the line, Wiley said, “there’s just not very good options out there.”

“I think this will be a dramatic change in treatment for people at this stage,” Wiley said.

This work was supported in part by Sparta Biomedical and by the Shared Materials Instrumentation Facility at Duke University. Wiley and Gall are shareholders in Sparta Biomedical.

Here’s a link to and a citation for the paper,

A Synthetic Hydrogel Composite with a Strength and Wear Resistance Greater than Cartilage by Jiacheng Zhao, Huayu Tong, Alina Kirillova, William J. Koshut, Andrew Malek, Natasha C. Brigham, Matthew L. Becker, Ken Gall, Benjamin J. Wiley. Advanced Functional Materials DOI: https://doi.org/10.1002/adfm.202205662 First published: 04 August 2022

This paper is behind a paywall.

You can find Sparta Biomedical here.

Memristive forming strategy

This is highly technical and it’s here since I’m informally collecting all the research that I stumble across concerning memristors and neuromorphic engineering.

From a Sept. 5, 2022 news item on Nanowerk, Note: A link has been removed,

The silicon-based CMOS [complementary metal-oxide-semiconductor] technology is fast approaching its physical limits, and the electronics industry urgently needs new techniques to sustain long-term development. Two-dimensional (2D) semiconductors, like transition-metal dichalcogenides (TMDs), have become a competitive alternative to traditional semiconducting materials in the post-Moore era and have attracted worldwide interest. However, before they can be used in practical applications, some key obstacles must be resolved. One of them is the large electrical contact resistance at the metal-semiconductor interfaces.

The large contact resistances mainly come from two sources: the high tunneling barrier caused by the wide van der Waals (vdW) gap between the 2D material and the metal electrode, and the high Schottky barrier accompanied by strong Fermi level pinning at the metal-semiconductor interface.

Four strategies including edge contact, doping TMDs, phase engineering, and using special metals, have been developed to address this problem. However, they all have shortcomings.

In a new work (Nano Letters, “Van der Waals Epitaxy and Photoresponse of Hexagonal Tellurium Nanoplates on Flexible Mica Sheets”) coming out of Zhenxing Wang’s group at the National Center for Nanoscience and Technology [located in Beijing, China], the researchers have proposed a brand-new strategy for lowering the contact resistance of 2D semiconductors that offers good feasibility, wide generality, and high stability.

You can fill in the blanks at Nanowerk, or here’s a link to and a citation for the paper,

Van der Waals Epitaxy and Photoresponse of Hexagonal Tellurium Nanoplates on Flexible Mica Sheets by Qisheng Wang, Muhammad Safdar, Kai Xu, Misbah Mirza, Zhenxing Wang, and Jun He. ACS Nano 2014, 8, 7, 7497–7505. DOI: https://doi.org/10.1021/nn5028104 Publication Date: July 2, 2014 Copyright © 2014 American Chemical Society

This paper is behind a paywall.

Making longer-lasting bandages with sound and bubbles

This research into longer-lasting bandages, described in an August 12, 2022 news item on phys.org, comes from McGill University (Montréal, Canada),

Researchers have discovered that they can control the stickiness of adhesive bandages using ultrasound waves and bubbles. This breakthrough could lead to new advances in medical adhesives, especially in cases where adhesives are difficult to apply such as on wet skin.

“Bandages, glues, and stickers are common bioadhesives that are used at home or in clinics. However, they don’t usually adhere well on wet skin. It’s also challenging to control where they are applied and the strength and duration of the formed adhesion,” says McGill University Professor Jianyu Li, who led the research team of engineers, physicists, chemists, and clinicians.

Caption: Adhesive hydrogel applied on skin under ultrasound probe. Credit: Ran Huo and Jianyu Li

An August 12, 2022 McGill University news release (also on EurekAlert), which originated the news item, delves further into the work,

“We were surprised to find that by simply playing around with ultrasonic intensity, we can control very precisely the stickiness of adhesive bandages on many tissues,” says lead author Zhenwei Ma, a former student of Professor Li and now a Killam Postdoctoral Fellow at the University of British Columbia.

Ultrasound induced bubbles control stickiness

In collaboration with physicists Professor Outi Supponen and Claire Bourquard from the Institute of Fluid Dynamics at ETH Zurich, the team experimented with ultrasound induced microbubbles to make adhesives stickier. “The ultrasound induces many microbubbles, which transiently push the adhesives into the skin for stronger bioadhesion,” says Professor Supponen. “We can even use theoretical modeling to estimate exactly where the adhesion will happen.”

Their study, published in the journal Science, shows that the adhesives are compatible with living tissue in rats. The adhesives can also potentially be used to deliver drugs through the skin. “This paradigm-shifting technology will have great implications in many branches of medicine,” says University of British Columbia Professor Zu-hua Gao. “We’re very excited to translate this technology for applications in clinics for tissue repair, cancer therapy, and precision medicine.”

“By merging mechanics, materials and biomedical engineering, we envision the broad impact of our bioadhesive technology in wearable devices, wound management, and regenerative medicine,” says Professor Li, who is also a Canada Research Chair in Biomaterials and Musculoskeletal Health.

Here’s a link to and a citation for the paper,

Controlled tough bioadhesion mediated by ultrasound by Zhenwei Ma, Claire Bourquard, Qiman Gao, Shuaibing Jiang, Tristan De Iure-Grimmel, Ran Huo, Xuan Li, Zixin He, Zhen Yang, Galen Yang, Yixiang Wang, Edmond Lam, Zu-hua Gao, Outi Supponen and Jianyu Li. Science 11 Aug 2022 Vol 377, Issue 6607 pp. 751-755 DOI: 10.1126/science.abn8699

This paper is behind a paywall.

I haven’t seen this before but it seems that one of the journal’s editors decided to add a standalone paragraph to hype some of the other papers about adhesives in the issue,

A sound way to make it stick

Tissue adhesives play a role in temporary or permanent tissue repair, wound management, and the attachment of wearable electronics. However, it can be challenging to tailor the adhesive strength to ensure reversibility when desired and to maintain permeability. Ma et al. designed hydrogels made of polyacrylamide or poly(N-isopropylacrylamide) combined with alginate that are primed using a solution containing nanoparticles of chitosan, gelatin, or cellulose nanocrystals (see the Perspective by Es Sayed and Kamperman). The application of ultrasound causes cavitation that pushes the primer molecules into the tissue. The mechanical interlocking of the anchors eventually results in strong adhesion between hydrogel and tissue without the need for chemical bonding. Tests on porcine or rat skin showed enhanced adhesion energy and interfacial fatigue resistance with on-demand detachment. —MSL

I like the wordplay and am guessing that MSL is:

Marc S. Lavine
Senior Editor
Education: BASc, University of Toronto; PhD, University of Cambridge
Areas of responsibility: Reviews; materials science, biomaterials, engineering

Data storytelling in libraries

I had no idea that there was such enthusiasm for data storytelling, but it seems libraries require a kit for the topic. From an August 30, 2022 University of Illinois School of Information Sciences news release (also on EurekAlert), Note: A link has been removed,

A new project led by Associate Professor Kate McDowell and Assistant Professor Matthew Turk of the School of Information Sciences (iSchool) at the University of Illinois Urbana-Champaign will help libraries tell data stories that connect with their audiences. Their project, “Data Storytelling Toolkit for Librarians,” has received a two-year, $99,330 grant from the Institute of Museum and Library Services (IMLS grant RE-250094-OLS-21), under the Laura Bush 21st Century Librarian Program, which supports innovative research by untenured, tenure-track faculty.

“There are thousands of librarians who are skittish about data but love stories,” explained McDowell, who co-teaches a data storytelling course at the iSchool with Turk. “And there are hundreds of librarians who see data as fundamental, but until those librarians have a language through which to connect with the passions of the thousands who love stories, this movement toward strategic data use in the field of libraries will be stifled, along with the potential collaborative creativity of librarians.”

The data storytelling toolkit will provide a set of easy-to-adapt templates, which librarians can use to move quickly from data to story to storytelling. Librarians will be able to use the toolkit to plug in data they already have and generate data visualization and narrative structure options.

“To give an example, public libraries need to communicate employment impact. In this case, the data story will include who has become employed based on library services, how (journey map showing a visual sequence of steps from job seeking to employment), a structure for the story of an individual’s outcomes, and a strong data visualization strategy for communicating this impact,” said McDowell.

According to the researchers, the toolkit will be clearly defined so that librarians understand the potential for communicating with data but also fully adaptable to each librarian’s setting and to the communication needs inside the organization and with the public. The project will focus on community college and public libraries, with initial collaborators to include Ericson Public Library in Boone, Iowa; Oregon City (OR) Public Library; Moraine Valley Community College in Palos Hills, Illinois; Jackson State Community College in Jackson, Tennessee; and The Urbana Free Library.

McDowell’s storytelling research has involved training collaborations with advancement staff both at the University of Illinois Urbana-Champaign and the University of Illinois system; storytelling consulting work for multiple nonprofits including the 50th anniversary of the statewide Prairie Rivers Network that protects Illinois water; and storytelling lectures for the Consortium of Academic and Research Libraries in Illinois (CARLI). McDowell researches and publishes in the areas of storytelling at work, social justice storytelling, and what library storytelling can teach the information sciences about data storytelling. She holds both an MS and PhD in library and information science from Illinois.

Turk also holds an appointment with the Department of Astronomy in the College of Liberal Arts and Sciences at the University of Illinois. His research focuses on how individuals interact with data and how that data is processed and understood. He is a recipient of the prestigious Gordon and Betty Moore Foundation’s Moore Investigator Award in Data-Driven Discovery. Turk holds a PhD in physics from Stanford University.

I found some earlier information about a data storytelling course taught by the two researchers, from a September 25, 2019 University of Illinois School of Information Sciences news release, which provides some additional insight,

Collecting and understanding data is important, but equally important is the ability to tell meaningful stories based on data. Students in the iSchool’s Data Science Storytelling course (IS 590DST) learn data visualization as well as storytelling techniques, a combination that will prove valuable to their employers as they enter the workforce.

The course instructors, Associate Professor and Interim Associate Dean for Academic Affairs Kate McDowell and Assistant Professor Matthew Turk, introduced Data Science Storytelling in fall 2017. The course combines McDowell’s research interests in storytelling practices and applications and Turk’s research interests in data analysis and visualization.

Students in the course learn storytelling concepts, narrative theories, and performance techniques as well as how to develop stories in a collaborative workshop style. They also work with data visualization toolkits, which involves some knowledge of coding.

Ashley Hetrick (MS ’18) took Data Science Storytelling because she wanted “the skills to be able to tell the right story when the time is right for it.” She appreciated the practical approach, which allowed the students to immediately apply the skills they learned, such as developing a story structure and using a pandas DataFrame to support and build a story. Hetrick is using those skills in her current work as assistant director for research data engagement and education at the University of Illinois.

“I combine tools and methods from data science and analytics with storytelling to make sense of my unit’s data and to help researchers make sense of theirs,” she said. “In my experience, few researchers like data for its own sake. They collect, care for, and analyze data because they’re after what all storytellers are after: meaning. They want to find the signal in all of this noise. And they want others to find it too, perhaps long after their own careers are complete. Each dataset is a story and raw material for stories waiting to be told.”

According to Turk, the students who have enrolled in the course have been outstanding, “always finding ways to tell meaningful stories from data.” He hopes they leave the class with an understanding that stories permeate their lives and that shaping the stories they tell others and about others is a responsibility they carry with them.

“One reason that this course means a lot to me is because it gives students the opportunity to really bring together the different threads of study at the iSchool,” Turk said. “It’s a way to combine across levels of technicality, and it gives students permission to take a holistic approach to how they present data.”
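For readers curious what the pandas work Hetrick mentions might look like in practice, here is a minimal, hypothetical example (all column names and figures are invented) of moving from raw records to the single statistic that anchors a data story, in the spirit of the employment-impact example McDowell gives above:

```python
import pandas as pd

# Hypothetical library-program records: which patrons attended a job
# workshop, and which later found employment. Invented for illustration.
visits = pd.DataFrame({
    "patron": ["A", "B", "C", "D", "E"],
    "attended_job_workshop": [True, True, False, True, False],
    "found_employment": [True, True, False, False, False],
})

# The "story number": employment rate among workshop attendees vs. others.
rate = visits.groupby("attended_job_workshop")["found_employment"].mean()
print(rate)
# attended_job_workshop
# False    0.000000
# True     0.666667
```

A single comparative rate like this is the kind of figure that can then be wrapped in a journey map and an individual's story, as the toolkit proposes.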

I didn’t put much effort into it but did find three other courses on data storytelling, one at the University of Texas (my favourite), one at the University of Toronto, and one (Data Visualization and Storytelling) at the University of British Columbia. The one at the University of British Columbia is available through the business school, the other two are available through information/library science faculties.

‘Necrobotic’ spiders as mechanical grippers

A July 25, 2022 news item on ScienceDaily describes research utilizing dead spiders,

Spiders are amazing. They’re useful even when they’re dead.

Rice University mechanical engineers are showing how to repurpose deceased spiders as mechanical grippers that can blend into natural environments while picking up objects, like other insects, that outweigh them.

Caption: An illustration shows the process by which Rice University mechanical engineers turn deceased spiders into necrobotic grippers, able to grasp items when triggered by hydraulic pressure. Credit: Preston Innovation Laboratory/Rice University

A July 25, 2022 Rice University news release (also on EurekAlert but published August 4, 2022), which originated the news item, explains the reasoning, Note: Links have been removed,

“It happens to be the case that the spider, after it’s deceased, is the perfect architecture for small scale, naturally derived grippers,” said Daniel Preston of Rice’s George R. Brown School of Engineering. 

An open-access study in Advanced Science outlines the process by which Preston and lead author Faye Yap harnessed a spider’s physiology in a first step toward a novel area of research they call “necrobotics.”

Preston’s lab specializes in soft robotic systems that often use nontraditional materials, as opposed to hard plastics, metals and electronics. “We use all kinds of interesting new materials like hydrogels and elastomers that can be actuated by things like chemical reactions, pneumatics and light,” he said. “We even have some recent work on textiles and wearables. 

“This area of soft robotics is a lot of fun because we get to use previously untapped types of actuation and materials,” Preston said. “The spider falls into this line of inquiry. It’s something that hasn’t been used before but has a lot of potential.”

Unlike people and other mammals that move their limbs by synchronizing opposing muscles, spiders use hydraulics. A chamber near their heads contracts to send blood to limbs, forcing them to extend. When the pressure is relieved, the legs contract. 

The cadavers Preston’s lab pressed into service were wolf spiders, and testing showed they were reliably able to lift more than 130% of their own body weight, and sometimes much more. They had the grippers manipulate a circuit board, move objects and even lift another spider.  

The researchers noted smaller spiders can carry heavier loads relative to their size. Conversely, the larger the spider, the smaller the load it can carry in comparison to its own body weight. Future research will likely involve testing this concept with spiders smaller than the wolf spider, Preston said.

Yap said the project began shortly after Preston established his lab in Rice’s Department of Mechanical Engineering in 2019.

“We were moving stuff around in the lab and we noticed a curled up spider at the edge of the hallway,” she said. “We were really curious as to why spiders curl up after they die.”

A quick search found the answer: “Spiders do not have antagonistic muscle pairs, like biceps and triceps in humans,” Yap said. “They only have flexor muscles, which allow their legs to curl in, and they extend them outward by hydraulic pressure. When they die, they lose the ability to actively pressurize their bodies. That’s why they curl up. 

“At the time, we were thinking, ‘Oh, this is super interesting.’ We wanted to find a way to leverage this mechanism,” she said.

Internal valves in the spiders’ hydraulic chamber, or prosoma, allow them to control each leg individually, and that will also be the subject of future research, Preston said. “The dead spider isn’t controlling these valves,” he said. “They’re all open. That worked out in our favor in this study, because it allowed us to control all the legs at the same time.”

Setting up a spider gripper was fairly simple. Yap tapped into the prosoma chamber with a needle, attaching it with a dab of superglue. The other end of the needle was connected to one of the lab’s test rigs or a handheld syringe, which delivered a minute amount of air to activate the legs almost instantly. 

The lab ran one ex-spider through 1,000 open-close cycles to see how well its limbs held up, and found it to be fairly robust. “It starts to experience some wear and tear as we get close to 1,000 cycles,” Preston said. “We think that’s related to issues with dehydration of the joints. We think we can overcome that by applying polymeric coatings.”

What turns the lab’s work from a cool stunt into a useful technology?

Preston said a few necrobotic applications have occurred to him. “There are a lot of pick-and-place tasks we could look into, repetitive tasks like sorting or moving objects around at these small scales, and maybe even things like assembly of microelectronics,” he said. 

“Another application could be deploying it to capture smaller insects in nature, because it’s inherently camouflaged,” Yap added. 

“Also, the spiders themselves are biodegradable,” Preston said. “So we’re not introducing a big waste stream, which can be a problem with more traditional components.”

Preston and Yap are aware the experiments may sound to some people like the stuff of nightmares, but they said what they’re doing doesn’t qualify as reanimation. 

“Despite looking like it might have come back to life, we’re certain that it’s inanimate, and we’re using it in this case strictly as a material derived from a once-living spider,” Preston said. “It’s providing us with something really useful.”

Co-authors of the paper are graduate students Zhen Liu and Trevor Shimokusu and postdoctoral fellow Anoop Rajappan. Preston is an assistant professor of mechanical engineering.

Here’s a link to and a citation for the paper,

Necrobotics: Biotic Materials as Ready-to-Use Actuators by Te Faye Yap, Zhen Liu, Anoop Rajappan, Trevor J. Shimokusu, Daniel J. Preston. Advanced Science DOI: https://doi.org/10.1002/advs.202201174 First published: 25 July 2022

As noted in the news release, this paper is open access.