
Bioinspired, biomimetic stimulation for the next generation of neuroprosthetics

ETH researchers have developed a prosthetic leg that communicates with the brain via natural signals. (Photograph: Keystone) Courtesy: ETH Zurich

A February 21, 2024 ETH Zurich press release by Ori Schipper (also on EurekAlert) announces a ‘nature-inspired’ or bioinspired approach to neuroprosthetics,

Prostheses that connect to the nervous system have been available for several years. Now, researchers at ETH Zurich have found evidence that neuroprosthetics work better when they use signals that are inspired by nature.

In brief

*Neuroprostheses are electro-mechanical devices that are connected to the nervous system. As yet, these are unable to provide natural communication with the brain. Instead, they often evoke artificial, unpleasant sensations, similar to a tingling feeling over the skin.
*This paraesthesia might be caused by overstimulation of the nervous system. ETH Zurich researchers together with colleagues in Germany, Serbia and Russia have proposed that neuroprosthetics should transmit biomimetic signals that are easier for the brain to understand.
*These new findings are relevant to arm and leg prostheses as well as various other aids and devices, including spinal implants and electrodes for brain stimulation. 

A few years ago, a team of researchers working under Professor Stanisa Raspopovic at the ETH Zurich Neuroengineering Lab gained worldwide attention when they announced that their prosthetic legs had enabled amputees to feel sensations from this artificial body part for the first time. Unlike commercial leg prostheses, which simply provide amputees with stability and support, the ETH researchers’ prosthetic device was connected to the sciatic nerve in the test subjects’ thigh via implanted electrodes.

This electrical connection enabled the neuroprosthesis to communicate with the patient’s brain, for example relaying information on the constant changes in pressure detected on the sole of the prosthetic foot when walking. This gave the test subjects greater confidence in their prosthesis – and it enabled them to walk considerably faster on challenging terrains. “Our experimental leg prosthesis succeeded in evoking natural sensations. That’s something current neuroprostheses are mainly unable to do; instead, they mostly evoke artificial, unpleasant sensations,” Raspopovic says.

This is probably because today’s neuroprostheses use time-constant electrical pulses to stimulate the nervous system. “That’s not only unnatural, but also inefficient,” Raspopovic says. In a recently published paper, he and his team used the example of their leg prostheses to highlight the benefits of using naturally inspired, biomimetic stimulation to develop the next generation of neuroprosthetics.
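To make the distinction concrete, here’s a toy sketch (not from the paper; every number here is invented) contrasting a time-constant pulse train with a biomimetic one whose pulse frequency follows a modelled pressure profile during a step:

```python
# Toy comparison of time-constant vs biomimetic stimulation pulse trains.
# All numbers are illustrative, not taken from the paper.
import math

def pressure(t):
    """Hypothetical heel-to-toe pressure profile over one step (0..1 s)."""
    return max(0.0, math.sin(math.pi * t))  # rises, peaks mid-stance, falls

def pulse_times(freq_of_t, duration=1.0):
    """Emit pulse timestamps, spacing each pulse by the current 1/frequency."""
    t, times = 0.0, []
    while t < duration:
        f = freq_of_t(t)
        if f <= 0:            # no activity right now: just advance time
            t += 0.01
            continue
        times.append(t)
        t += 1.0 / f
    return times

# Time-constant stimulation: a fixed 50 Hz regardless of gait phase.
constant = pulse_times(lambda t: 50.0)

# Biomimetic stimulation: pulse frequency tracks the pressure profile (0-100 Hz).
biomimetic = pulse_times(lambda t: 100.0 * pressure(t))

print(len(constant), len(biomimetic))
```

In this framing the time-constant train ignores the gait phase entirely, while the biomimetic train is silent around heel-off and densest at mid-stance, which is the intuition behind the researchers’ argument.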

Model simulates activation of nerves in the sole

To generate these biomimetic signals, Natalija Katic – a doctoral student in Raspopovic’s research group – developed a computer model called FootSim. It is based on data collected by collaborators in Canada, who recorded the activity of natural receptors, named mechanoreceptors, in the sole of the foot while touching different points on the feet of volunteers with a vibrating rod.

The model simulates the dynamic behaviour of large numbers of mechanoreceptors in the sole of the foot and generates the neural signals that shoot up the nerves in the leg towards the brain – from the moment the heel strikes the ground and the weight of the body starts to shift forward to the outside of the foot until the toes push off the ground ready for the next step. “Thanks to this model, we can see how sensory receptors from the sole, and the connected nerves, behave during walking or running, which is experimentally impossible to measure,” Katic says.
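FootSim itself isn’t reproduced in this post, but the basic idea – a pressure field sweeping from heel to toe, sampled by a population of receptors that fire when local pressure exceeds a threshold – can be sketched in a few lines. Every constant below (receptor count, thresholds, gains) is invented for illustration:

```python
# Rough sketch of a mechanoreceptor-population model in the spirit of FootSim.
# Receptor positions, gains and thresholds are invented for illustration.
import math
import random

random.seed(0)

# Scatter hypothetical mechanoreceptors across a 10 cm x 25 cm sole.
receptors = [(random.uniform(0, 10), random.uniform(0, 25)) for _ in range(100)]

def sole_pressure(x, y, t):
    """Toy pressure field: a Gaussian bump travelling heel (y=0) to toe (y=25)."""
    cy = 25.0 * t  # centre of pressure moves with stance phase t in [0, 1]
    return math.exp(-((y - cy) ** 2) / 20.0 - ((x - 5.0) ** 2) / 10.0)

def firing_rates(t, threshold=0.1, gain=80.0):
    """Map local pressure at each receptor to a firing rate (Hz)."""
    return [max(0.0, gain * (sole_pressure(x, y, t) - threshold))
            for (x, y) in receptors]

# Total population activity at heel strike vs toe-off.
heel = sum(firing_rates(0.0))
toe = sum(firing_rates(1.0))
print(heel, toe)
```

The point of such a model is that the spatiotemporal pattern of population activity, not any single receptor, is what the nerve carries towards the brain during a step.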

Information overload in the spinal cord

To assess how closely the biomimetic signals calculated by the model correspond to the signals emitted by real neurons, Giacomo Valle – a postdoc in Raspopovic’s research group – worked with colleagues in Germany, Serbia and Russia on experiments with cats, whose nervous system processes movement in a similar way to that of humans. The experiments took place in 2019 at the Pavlov Institute of Physiology in St. Petersburg and were carried out in accordance with the relevant European Union guidelines.

The researchers implanted electrodes, connecting some to the nerve in the leg and some to the spinal cord to discover how the signals are transmitted through the nervous system. When the researchers applied pressure to the bottom of the cat’s paw, thereby evoking the natural neural response that occurs when a cat takes a step, the peculiar pattern of activity recorded in the spinal cord did indeed resemble the patterns that were elicited in the spinal cord when the researchers stimulated the leg nerve with biomimetic signals.

By contrast, the conventional approach of time-constant stimulation of the sciatic nerve in the cat’s thigh elicited a markedly different pattern of activation in the spinal cord. “This clearly shows that the commonly used stimulation methods cause the neural networks in the spine to be flooded with information,” Valle says. “This information overload could be the reason for the unpleasant sensations or paraesthesia reported by some users of neuroprosthetics,” Raspopovic adds.

Learning the language of the nervous system

In their clinical trial with leg amputees, the researchers were able to show that biomimetic stimulation is superior to time-constant stimulation. Their work clearly demonstrated how the signals that mimicked nature produced better results: not only were the test subjects able to climb steps faster, they also made fewer mistakes in a task that required them to climb the same steps while spelling words backwards. “Biomimetic neurostimulation allows subjects to concentrate on other things while walking,” Raspopovic says, “so we concluded that this type of stimulation is more naturally processed and less taxing on the brain.”

Raspopovic, whose lab forms part of the ETH Institute of Robotics and Intelligent Systems, believes that these new findings are not only relevant to the limb prostheses he and his team have been working on for over half a decade. He argues that the need to move away from unnatural, time-constant stimulation towards biomimetic signals also applies to a whole series of other aids and devices, including spinal implants and electrodes for brain stimulation. “We need to learn the language of the nervous system,” Raspopovic says. “Then we’ll be able to communicate with the brain in ways it really understands.”

Here’s a link to and a citation for the paper,

Biomimetic computer-to-brain communication enhancing naturalistic touch sensations via peripheral nerve stimulation by Giacomo Valle, Natalija Katic Secerovic, Dominic Eggemann, Oleg Gorskii, Natalia Pavlova, Francesco M. Petrini, Paul Cvancara, Thomas Stieglitz, Pavel Musienko, Marko Bumbasirevic & Stanisa Raspopovic. Nature Communications volume 15, Article number: 1151 (2024) DOI: https://doi.org/10.1038/s41467-024-45190-6 Published: 20 February 2024

This paper is open access.

It was a bit of a surprise to see mention of some Canadian collaborators with regard to the earlier work featuring FootSim, a computer model. Here’s a link to and a citation for that paper; this version is housed at ETH Zurich,

Modeling foot sole cutaneous afferents: FootSim by Natalija Katic, Rodrigo Kazu Siqueira, Luke Cleland, Nicholas Strzalkowski, Leah Bent, Stanisa Raspopovic, and Hannes Saal. Originally published in: iScience 26(1), DOI https://doi.org/10.1016/j.isci.2022.105874 Publication date: 2023-01-20 Permanent link: https://doi.org/10.3929/ethz-b-000591102

This paper too is open access.

A solar, self-charging supercapacitor for wearable technology

Ravinder Dahiya, Carlos García Núñez, and their colleagues at the University of Glasgow (Scotland) strike again (see my May 10, 2017 posting for their first ‘solar-powered graphene skin’ research announcement). Last time it was all about robots and prosthetics, this time they’ve focused on wearable technology according to a July 18, 2018 news item on phys.org,

A new form of solar-powered supercapacitor could help make future wearable technologies lighter and more energy-efficient, scientists say.

In a paper published in the journal Nano Energy, researchers from the University of Glasgow’s Bendable Electronics and Sensing Technologies (BEST) group describe how they have developed a promising new type of graphene supercapacitor, which could be used in the next generation of wearable health sensors.

A July 18, 2018 University of Glasgow press release, which originated the news item, explains further,

Currently, wearable systems generally rely on relatively heavy, inflexible batteries, which can be uncomfortable for long-term users. The BEST team, led by Professor Ravinder Dahiya, have built on their previous success in developing flexible sensors by developing a supercapacitor which could power health sensors capable of conforming to wearers’ bodies, offering more comfort and a more consistent contact with skin to better collect health data.

Their new supercapacitor uses layers of flexible, three-dimensional porous foam formed from graphene and silver to produce a device capable of storing and releasing around three times more power than any similar flexible supercapacitor. The team demonstrated the durability of the supercapacitor, showing that it provided power consistently across 25,000 charging and discharging cycles.

They have also found a way to charge the system by integrating it with the flexible solar-powered skin already developed by the BEST group, effectively creating an entirely self-charging system, as well as a pH sensor which uses the wearer’s sweat to monitor their health.

Professor Dahiya said: “We’re very pleased by the progress this new form of solar-powered supercapacitor represents. A flexible, wearable health monitoring system which only requires exposure to sunlight to charge has a lot of obvious commercial appeal, but the underlying technology has a great deal of additional potential.

“This research could take the wearable systems for health monitoring to remote parts of the world where solar power is often the most reliable source of energy, and it could also increase the efficiency of hybrid electric vehicles. We’re already looking at further integrating the technology into flexible synthetic skin which we’re developing for use in advanced prosthetics.” [emphasis mine]

In addition to the team’s work on robots, prosthetics, and graphene ‘skin’ mentioned in the May 10, 2017 posting, the team is working on a synthetic ‘brainy’ skin for which they have just received £1.5m in funding from the Engineering and Physical Sciences Research Council (EPSRC).

Brainy skin

A July 3, 2018 University of Glasgow press release discusses the proposed work in more detail,

A robotic hand covered in ‘brainy skin’ that mimics the human sense of touch is being developed by scientists.

University of Glasgow’s Professor Ravinder Dahiya has plans to develop ultra-flexible, synthetic Brainy Skin that ‘thinks for itself’.

The super-flexible, hypersensitive skin may one day be used to make more responsive prosthetics for amputees, or to build robots with a sense of touch.

Brainy Skin reacts like human skin, which has its own neurons that respond immediately to touch rather than having to relay the whole message to the brain.

This electronic ‘thinking skin’ is made from silicon-based printed neural transistors and graphene – an ultra-thin form of carbon that is only one atom thick, but stronger than steel.

The new version is more powerful, less cumbersome and would work better than earlier prototypes, also developed by Professor Dahiya and his Bendable Electronics and Sensing Technologies (BEST) team at the University’s School of Engineering.

His futuristic research, called neuPRINTSKIN (Neuromorphic Printed Tactile Skin), has just received another £1.5m in funding from the Engineering and Physical Sciences Research Council (EPSRC).

Professor Dahiya said: “Human skin is an incredibly complex system capable of detecting pressure, temperature and texture through an array of neural sensors that carry signals from the skin to the brain.

“Inspired by real skin, this project will harness the technological advances in electronic engineering to mimic some features of human skin, such as softness, bendability and now, also sense of touch. This skin will not just mimic the morphology of the skin but also its functionality.

“Brainy Skin is critical for the autonomy of robots and for a safe human-robot interaction to meet emerging societal needs such as helping the elderly.”

Synthetic ‘Brainy Skin’ with sense of touch gets £1.5m funding. Photo of Professor Ravinder Dahiya

This latest advance means tactile data is gathered over large areas by the synthetic skin’s computing system rather than sent to the brain for interpretation.

With additional EPSRC funding, which extends Professor Dahiya’s fellowship by another three years, he plans to introduce tactile skin with neuron-like processing. This breakthrough in the tactile sensing research will lead to the first neuromorphic tactile skin, or ‘brainy skin.’

To achieve this, Professor Dahiya will add a new neural layer to the e-skin that he has already developed using printed silicon nanowires.

Professor Dahiya added: “By adding a neural layer underneath the current tactile skin, neuPRINTSKIN will add significant new perspective to the e-skin research, and trigger transformations in several areas such as robotics, prosthetics, artificial intelligence, wearable systems, next-generation computing, and flexible and printed electronics.”

The Engineering and Physical Sciences Research Council (EPSRC) is part of UK Research and Innovation, a non-departmental public body funded by a grant-in-aid from the UK government.

EPSRC is the main funding body for engineering and physical sciences research in the UK. By investing in research and postgraduate training, the EPSRC is building the knowledge and skills base needed to address the scientific and technological challenges facing the nation.

Its portfolio covers a vast range of fields from healthcare technologies to structural engineering, manufacturing to mathematics, advanced materials to chemistry. The research funded by EPSRC has impact across all sectors. It provides a platform for future UK prosperity by contributing to a healthy, connected, resilient, productive nation.

It’s fascinating to note how these pieces of research fit together: wearable technology for health monitoring, more responsive robot ‘skin’ and, possibly, prosthetic devices that would allow someone to feel again.

The latest research paper

Getting back to the solar-charging supercapacitors mentioned in the opening, here’s a link to and a citation for the team’s latest research paper,

Flexible self-charging supercapacitor based on graphene-Ag-3D graphene foam electrodes by Libu Manjakkal, Carlos García Núñez, Wenting Dang, Ravinder Dahiya. Nano Energy Volume 51, September 2018, Pages 604-612 DOI: https://doi.org/10.1016/j.nanoen.2018.06.072

This paper is open access.

Prosthetic pain

“Feeling no pain” can be a euphemism for being drunk. However, there are some people for whom it’s not a euphemism: for one reason or another, they literally feel no pain. Amputees are one such group, and researchers at Johns Hopkins University (Maryland, US) have found a way for them to feel pain again.

A June 20, 2018 news item on ScienceDaily provides an introduction to the research and to the reason for it,

Amputees often experience the sensation of a “phantom limb” — a feeling that a missing body part is still there.

That sensory illusion is closer to becoming a reality thanks to a team of engineers at the Johns Hopkins University that has created an electronic skin. When layered on top of prosthetic hands, this e-dermis brings back a real sense of touch through the fingertips.

“After many years, I felt my hand, as if a hollow shell got filled with life again,” says the anonymous amputee who served as the team’s principal volunteer tester.

Made of fabric and rubber laced with sensors to mimic nerve endings, e-dermis recreates a sense of touch as well as pain by sensing stimuli and relaying the impulses back to the peripheral nerves.

A June 20, 2018 Johns Hopkins University news release (also on EurekAlert), which originated the news item, explores the research in more depth,

“We’ve made a sensor that goes over the fingertips of a prosthetic hand and acts like your own skin would,” says Luke Osborn, a graduate student in biomedical engineering. “It’s inspired by what is happening in human biology, with receptors for both touch and pain.”

“This is interesting and new,” Osborn said, “because now we can have a prosthetic hand that is already on the market and fit it with an e-dermis that can tell the wearer whether he or she is picking up something that is round or whether it has sharp points.”

The work – published June 20 in the journal Science Robotics – shows it is possible to restore a range of natural, touch-based feelings to amputees who use prosthetic limbs. The ability to detect pain could be useful, for instance, not only in prosthetic hands but also in lower limb prostheses, alerting the user to potential damage to the device.

Human skin contains a complex network of receptors that relay a variety of sensations to the brain. This network provided a biological template for the research team, which includes members from the Johns Hopkins departments of Biomedical Engineering, Electrical and Computer Engineering, and Neurology, and from the Singapore Institute of Neurotechnology.

Bringing a more human touch to modern prosthetic designs is critical, especially when it comes to incorporating the ability to feel pain, Osborn says.

“Pain is, of course, unpleasant, but it’s also an essential, protective sense of touch that is lacking in the prostheses that are currently available to amputees,” he says. “Advances in prosthesis designs and control mechanisms can aid an amputee’s ability to regain lost function, but they often lack meaningful, tactile feedback or perception.”

That is where the e-dermis comes in, conveying information to the amputee by stimulating peripheral nerves in the arm, making the so-called phantom limb come to life. The e-dermis device does this by electrically stimulating the amputee’s nerves in a non-invasive way, through the skin, says the paper’s senior author, Nitish Thakor, a professor of biomedical engineering and director of the Biomedical Instrumentation and Neuroengineering Laboratory at Johns Hopkins.

“For the first time, a prosthesis can provide a range of perceptions, from fine touch to noxious to an amputee, making it more like a human hand,” says Thakor, co-founder of Infinite Biomedical Technologies, the Baltimore-based company that provided the prosthetic hardware used in the study.

Inspired by human biology, the e-dermis enables its user to sense a continuous spectrum of tactile perceptions, from light touch to noxious or painful stimulus. The team created a “neuromorphic model” mimicking the touch and pain receptors of the human nervous system, allowing the e-dermis to electronically encode sensations just as the receptors in the skin would. Tracking brain activity via electroencephalography, or EEG, the team determined that the test subject was able to perceive these sensations in his phantom hand.
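The press release doesn’t include the model itself, but the flavour of a neuromorphic touch/pain encoder can be suggested with a minimal leaky integrate-and-fire sketch. The split into two channels and all the constants below are my own illustrative guesses, not the team’s actual model:

```python
# Minimal leaky integrate-and-fire (LIF) sketch of touch vs pain encoding.
# The two-channel split mirrors the press release; every constant is invented.

def lif_spike_count(input_current, steps=1000, dt=0.001,
                    tau=0.02, threshold=1.0):
    """Count spikes from a LIF unit driven by a constant input current."""
    v, spikes = 0.0, 0
    for _ in range(steps):
        v += dt * (-v / tau + input_current)  # leaky integration (Euler step)
        if v >= threshold:
            spikes += 1
            v = 0.0                           # reset membrane after a spike
    return spikes

def encode(pressure, sharpness):
    """Touch channel follows pressure; pain channel is gated by sharpness."""
    touch = lif_spike_count(100.0 * pressure)
    pain = lif_spike_count(200.0 * pressure * max(0.0, sharpness - 0.5))
    return touch, pain

round_object = encode(pressure=0.8, sharpness=0.2)   # blunt: touch only
sharp_object = encode(pressure=0.8, sharpness=0.9)   # pointed: touch + pain
print(round_object, sharp_object)
```

The key property, as in the skin itself, is that the same contact pressure produces spiking in the pain channel only once sharpness crosses a nociceptor-like threshold.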

The researchers then connected the e-dermis output to the volunteer by using a noninvasive method known as transcutaneous electrical nerve stimulation, or TENS. In a pain-detection task, the team determined that the test subject and the prosthesis were able to experience a natural, reflexive reaction both to pain when touching a pointed object and to non-pain when touching a round object.

The e-dermis is not sensitive to temperature; for this study, the team focused on detecting object curvature (for touch and shape perception) and sharpness (for pain perception). The e-dermis technology could be used to make robotic systems more human, and it could also be extended to astronaut gloves and space suits, Osborn says.

The researchers plan to further develop the technology and better understand how to provide meaningful sensory information to amputees in the hopes of making the system ready for widespread patient use.

Johns Hopkins is a pioneer in the field of upper limb dexterous prostheses. More than a decade ago, the university’s Applied Physics Laboratory led the development of the advanced Modular Prosthetic Limb, which an amputee patient controls with the muscles and nerves that once controlled his or her real arm or hand.

In addition to the funding from Space@Hopkins, which fosters space-related collaboration across the university’s divisions, the team also received grants from the Applied Physics Laboratory Graduate Fellowship Program and the Neuroengineering Training Initiative through the National Institute of Biomedical Imaging and Bioengineering at the National Institutes of Health under grant T32EB003383.

The e-dermis was tested over the course of one year on an amputee who volunteered in the Neuroengineering Laboratory at Johns Hopkins. The subject frequently repeated the testing to demonstrate consistent sensory perceptions via the e-dermis. The team has worked with four other amputee volunteers in other experiments to provide sensory feedback.

Here’s a video about this work,

Sarah Zhang’s June 20, 2018 article for The Atlantic reveals a few more details while covering some of the material in the news release,

Osborn and his team added one more feature to make the prosthetic hand, as he puts it, “more lifelike, more self-aware”: When it grasps something too sharp, it’ll open its fingers and immediately drop it—no human control necessary. The fingers react in just 100 milliseconds, the speed of a human reflex. Existing prosthetic hands have a similar degree of theoretically helpful autonomy: If an object starts slipping, the hand will grasp more tightly. Ideally, users would have a way to override a prosthesis’s reflex, like how you can hold your hand on a stove if you really, really want to. After all, the whole point of having a hand is being able to tell it what to do.
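A reflex like the one Zhang describes could, in outline, be a fast sensor-check loop that opens the hand when sharpness crosses a threshold, with an override flag so the user can choose to hold on anyway. This is purely my hypothetical sketch, not the team’s controller; the sensor names and constants are invented:

```python
# Hypothetical pain-reflex loop: release a grasped object within ~100 ms
# of detecting a sharp contact, without waiting for user input.
import time

SHARPNESS_THRESHOLD = 0.7   # illustrative; tuned per sensor in practice
LOOP_DT = 0.01              # 10 ms control loop: reacts well inside 100 ms

class HandController:
    def __init__(self):
        self.grasping = True
        self.user_override = False   # lets the user hold on deliberately

    def read_sharpness(self):
        return 0.9                   # stub sensor: pretend we touched a point

    def step(self):
        """One control tick: release reflexively if sharp and not overridden."""
        if (self.grasping and not self.user_override
                and self.read_sharpness() > SHARPNESS_THRESHOLD):
            self.grasping = False    # open the fingers
        return self.grasping

hand = HandController()
start = time.monotonic()
while hand.step():
    time.sleep(LOOP_DT)
elapsed = time.monotonic() - start
print(f"released after {elapsed * 1000:.0f} ms")
```

The `user_override` flag mirrors Zhang’s point that users should be able to overrule the reflex, the way you can keep your hand on a stove if you really want to.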

Here’s a link to and a citation for the paper,

Prosthesis with neuromorphic multilayered e-dermis perceives touch and pain by Luke E. Osborn, Andrei Dragomir, Joseph L. Betthauser, Christopher L. Hunt, Harrison H. Nguyen, Rahul R. Kaliki, and Nitish V. Thakor. Science Robotics 20 Jun 2018: Vol. 3, Issue 19, eaat3818 DOI: 10.1126/scirobotics.aat3818

This paper is behind a paywall.

A bioengineered robot hand with its own nervous system: machine/flesh and a job opening

A November 14, 2017 news item on phys.org announces a grant for a research project which will see engineered robot hands combined with regenerative medicine to imbue neuroprosthetic hands with the sense of touch,

The sense of touch is often taken for granted. For someone without a limb or hand, losing that sense of touch can be devastating. While highly sophisticated prostheses with complex moving fingers and joints are available to mimic almost every hand motion, they remain frustratingly difficult and unnatural for the user. This is largely because they lack the tactile experience that guides every movement. This void in sensation results in limited use or abandonment of these very expensive artificial devices. So why not make a prosthesis that can actually “feel” its environment?

That is exactly what an interdisciplinary team of scientists from Florida Atlantic University and the University of Utah School of Medicine aims to do. They are developing a first-of-its-kind bioengineered robotic hand that will grow and adapt to its environment. This “living” robot will have its own peripheral nervous system directly linking robotic sensors and actuators. FAU’s College of Engineering and Computer Science is leading the multidisciplinary team that has received a four-year, $1.3 million grant from the National Institute of Biomedical Imaging and Bioengineering of the [US] National Institutes of Health for a project titled “Virtual Neuroprosthesis: Restoring Autonomy to People Suffering from Neurotrauma.”

A November 14, 2017 Florida Atlantic University (FAU) news release by Gisele Galoustian, which originated the news item, goes into more detail,

With expertise in robotics, bioengineering, behavioral science, nerve regeneration, electrophysiology, microfluidic devices, and orthopedic surgery, the research team is creating a living pathway from the robot’s touch sensation to the user’s brain to help amputees control the robotic hand. A neuroprosthesis platform will enable them to explore how neurons and behavior can work together to regenerate the sensation of touch in an artificial limb.

At the core of this project is a cutting-edge robotic hand and arm developed in the BioRobotics Laboratory in FAU’s College of Engineering and Computer Science. Just like human fingertips, the robotic hand is equipped with numerous sensory receptors that respond to changes in the environment. Controlled by a human, it can sense pressure changes, interpret the information it is receiving and interact with various objects. It adjusts its grip based on an object’s weight or fragility. But the real challenge is figuring out how to send that information back to the brain using living residual neural pathways to replace those that have been damaged or destroyed by trauma.
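The grip behaviour described above can be caricatured as a small control loop: tighten when slip is detected, but cap the force for fragile objects. The sensor names and constants below are hypothetical, not FAU’s:

```python
# Hypothetical grip-adjustment loop: tighten on slip, cap force for fragility.
# All names and constants are invented for illustration.

def adjust_grip(force, slipping, fragile,
                step=0.5, fragile_cap=3.0, max_force=10.0):
    """Return an updated grip force (N) given slip feedback and fragility."""
    if slipping:
        force += step                      # tighten until the object holds
    cap = fragile_cap if fragile else max_force
    return min(force, cap)                 # never crush a fragile object

# A fragile object slips repeatedly: force rises but never exceeds the cap.
force = 1.0
for _ in range(10):
    force = adjust_grip(force, slipping=True, fragile=True)
print(force)
```

The interesting engineering problem in the FAU project is not this loop itself but, as the release says, routing the resulting tactile information back to the brain through living neural pathways.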

“When the peripheral nerve is cut or damaged, it uses the rich electrical activity that tactile receptors create to restore itself. We want to examine how the fingertip sensors can help damaged or severed nerves regenerate,” said Erik Engeberg, Ph.D., principal investigator, an associate professor in FAU’s Department of Ocean and Mechanical Engineering, and director of FAU’s BioRobotics Laboratory. “To accomplish this, we are going to directly connect these living nerves in vitro and then electrically stimulate them on a daily basis with sensors from the robotic hand to see how the nerves grow and regenerate while the hand is operated by limb-absent people.”

For the study, the neurons will not be kept in conventional petri dishes. Instead, they will be placed in biocompatible microfluidic chambers that provide a nurturing environment mimicking the basic function of living cells. Sarah E. Du, Ph.D., co-principal investigator, an assistant professor in FAU’s Department of Ocean and Mechanical Engineering, and an expert in the emerging field of microfluidics, has developed these tiny customized artificial chambers with embedded micro-electrodes. The research team will be able to stimulate the neurons with electrical impulses from the robot’s hand to help regrowth after injury. They will morphologically and electrically measure in real-time how much neural tissue has been restored.

Jianning Wei, Ph.D., co-principal investigator, an associate professor of biomedical science in FAU’s Charles E. Schmidt College of Medicine, and an expert in neural damage and regeneration, will prepare the neurons in vitro, observe them grow and see how they fare and regenerate in the aftermath of injury. This “virtual” method will give the research team multiple opportunities to test and retest the nerves without any harm to subjects.

Using an electroencephalogram (EEG) to detect electrical activity in the brain, Emmanuelle Tognoli, Ph.D., co-principal investigator, associate research professor in FAU’s Center for Complex Systems and Brain Sciences in the Charles E. Schmidt College of Science, and an expert in electrophysiology and neural, behavioral, and cognitive sciences, will examine how the tactile information from the robotic sensors is passed onto the brain to distinguish scenarios with successful or unsuccessful functional restoration of the sense of touch. Her objective: to understand how behavior helps nerve regeneration and how this nerve regeneration helps the behavior.

Once the nerve impulses from the robot’s tactile sensors have gone through the microfluidic chamber, they are sent back to the human user manipulating the robotic hand. This is done with a special device that converts the signals coming from the microfluidic chambers into a controllable pressure at a cuff placed on the remaining portion of the amputated person’s arm. Users will know if they are squeezing the object too hard or if they are losing their grip.
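The final link in that chain, turning chamber activity into cuff pressure on the residual arm, could in principle be a simple calibrated mapping. This sketch is illustrative only, with invented ranges; the real device would be calibrated per user:

```python
# Toy mapping from recorded neural activity to cuff-pressure feedback.
# Ranges and scaling are invented; a real device calibrates per user.

def spikes_to_cuff_pressure(spike_rate_hz, max_rate=300.0, max_kpa=40.0):
    """Linearly map a chamber spike rate onto a safe cuff-pressure range."""
    clipped = min(max(spike_rate_hz, 0.0), max_rate)  # clamp to valid range
    return max_kpa * clipped / max_rate

# Gentle contact, near-maximal squeeze, and no contact at all.
print(spikes_to_cuff_pressure(30))    # light grip: low pressure
print(spikes_to_cuff_pressure(290))   # squeezing hard: strong warning pressure
print(spikes_to_cuff_pressure(0))     # losing grip: no pressure
```

This is how the user would "know if they are squeezing the object too hard or if they are losing their grip", as the release puts it: the cuff pressure tracks the sensed activity.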

Engeberg also is working with Douglas T. Hutchinson, M.D., co-principal investigator and a professor in the Department of Orthopedics at the University of Utah School of Medicine, who specializes in hand and orthopedic surgery. They are developing a set of tasks and behavioral neural indicators of performance that will ultimately reveal how to promote a healthy sensation of touch in amputees and limb-absent people using robotic devices. The research team also is seeking a post-doctoral researcher with multi-disciplinary experience to work on this breakthrough project.

Here’s more about the job opportunity from the FAU BioRobotics Laboratory job posting (I checked on January 30, 2018 and it seems applications are still being accepted),

Post-doctoral Opportunity

Date Posted: Oct. 13, 2017

The BioRobotics Lab at Florida Atlantic University (FAU) invites applications for a NIH NIBIB-funded Postdoctoral position to develop a Virtual Neuroprosthesis aimed at providing a sense of touch in amputees and limb-absent people.

Candidates should have a Ph.D. in one of the following fields: mechanical engineering, electrical engineering, biomedical engineering, bioengineering, or a related discipline, with interest and/or experience in transdisciplinary work at the intersection of robotic hands, biology, and biomedical systems. Prior experience in the neural field will be considered an advantage, though not a necessity. Underrepresented minorities and women are warmly encouraged to apply.

The postdoctoral researcher will be co-advised across the department of Mechanical Engineering and the Center for Complex Systems & Brain Sciences through an interdisciplinary team whose expertise spans Robotics, Microfluidics, Behavioral and Clinical Neuroscience and Orthopedic Surgery.

The position will be for one year with a possibility of extension based on performance. Salary will be commensurate with experience and qualifications. Review of applications will begin immediately and continue until the position is filled.

The application should include:

  1. a cover letter with research interests and experiences,
  2. a CV, and
  3. names and contact information for three professional references.

Qualified candidates can contact Erik Engeberg, Ph.D., Associate Professor, in the FAU Department of Ocean and Mechanical Engineering at eengeberg@fau.edu. Please reference AcademicKeys.com in your cover letter when applying for or inquiring about this job announcement.

You can find the apply button on this page. Good luck!

Exploring the science of Iron Man (prior to the opening of Captain America: Civil War, aka, Captain America vs. Iron Man)

Not unexpectedly, there’s a news item about science and Iron Man (it’s getting quite common for the science in movies to be promoted and discussed) just a few weeks before the movie Captain America: Civil War or, as it’s also known, Captain America vs. Iron Man opens in the US. From an April 26, 2016 news item on phys.org,

… how much of our favourite superheros’ power lies in science and how much is complete fiction?

As Iron Man’s name suggests, he wears a suit of “iron” which gives him his abilities—superhuman strength, flight and an arsenal of weapons—and protects him from harm.

In scientific parlance, the Iron Man suit is an exoskeleton, which is worn outside the body to enhance it.

An April 26, 2016 posting by Chris Marr on the ScienceNetwork Western Australia blog, which originated the news item, provides an interesting overview of exoskeletons and some of the scientific obstacles still to be overcome before they become commonplace,

In the 1960s, the first real powered exoskeleton appeared—a machine integrated with the human frame and movements which provided the wearer with 25 times his natural lifting capacity.

The major drawback then was that the unit itself weighed in at 680kg.

UWA [University of Western Australia] Professor Adrian Keating suggests that some of the technology seen in the latest Marvel blockbuster, such as controlling the exoskeleton with simple thoughts, will be available in the near future by leveraging ongoing advances of multi-disciplinary research teams.

“Dust grain-sized micromachines could be programmed to cooperate to form reconfigurable materials such as the retractable face mask, for example,” Prof Keating says.

However, all of these devices are in need of a power unit small enough to be carried yet providing enough capacity for more than a few minutes of superhuman use, he says.

Does anyone have a spare Arc Reactor?

Currently, most exoskeleton development has been for medical applications, with devices designed to give mobility to amputees and paraplegics, and there are a number in commercial production and use.

Dr Lei Cui, who lectures in Mechatronics at Curtin University, has recently developed both a hand and leg exoskeleton, designed for use by patients who have undergone surgery or have nerve dysfunction, spinal injuries or muscular dysfunction.

“Currently we use an internal battery that lasts about two hours in the glove, which can be programmed for only four different movement patterns,” Dr Cui says.

Dr Cui’s exoskeletons are made from plastic, making them light but offering little protection compared to the titanium exterior of Stark’s favourite suit.

It’s clear that we are a long way from being able to produce a working Iron Man suit at all, let alone one that flies, protects the wearer and has the capacity to fight back.

This is not the first time I’ve featured a science and pop culture story here. You can check out my April 28, 2014 posting for a story about how Captain America’s shield could be a supercapacitor (it also has a link to a North Carolina State University blog featuring science and other comic book heroes) and there is my May 6, 2013 post about Iron Man 3 and a real life injectable nano-network.

As for ScienceNetwork Western Australia, here’s more from their About SWNA page,

ScienceNetwork Western Australia (SNWA) is an online science news service devoted to sharing WA’s achievements in science and technology.

SNWA is produced by Scitech, the state’s science and technology centre and supported by the WA Government’s Office of Science via the Department of the Premier and Cabinet.

Our team of freelance writers work with in-house editors based at Scitech to bring you news from all fields of science, and from the research, government and private industry sectors working throughout the state. Our writers also produce profile stories on scientists. We collaborate with leading WA institutions to bring you Perspectives from prominent WA scientists and opinion leaders.

We also share news of science-related events and information about the greater WA science community including WA’s Chief Scientist, the Premier’s Science Awards, Innovator of the Year Awards and information on regional community science engagement.

Since our commencement in 2003 we have grown to share WA’s stories with local, national and global audiences. Our articles are regularly republished in print and online media in the metropolitan and regional areas.

Bravo to the Western Australia government! I wish there were initiatives of this type in Canada; the closest we have is the French-language Agence Science-Presse supported by the Province of Québec.

Human Bionic Project; amputations, prosthetics, and disabilities

Sydney Brownstone’s June 26, 2013 article about The Human Bionic Project for Fast Company touches on human tragedy and the ways in which we attempt to cope by focusing on researcher David Sengeh’s work (Note: Links have been removed),

In the Iraq and Afghanistan wars alone, nearly 1,600 American soldiers have woken up without a limb. Fifteen survivors of the Boston marathon bombings are new amputees. And in Sierra Leone, where MIT graduate student David Sengeh is from, brutal tactics during the country’s 11-year civil war resulted in somewhere between 4,000 and 10,000 amputations in a country of less than 6 million people.

Many amputees go through the costly, lengthy process of transitioning to prosthetics, but it’s difficult even for prosthetic research specialists to gather information about the replacement parts outside their narrow fields. That’s part of the reason why, in December of last year, Sengeh and a research team began developing an interactive Inspector Gadget–a repository of all the FDA-approved [US Food and Drug Administration] replacement parts they could find.

So far, the Human Bionic Project has between 40 and 50 points of reference on its corporeal map–everything from artificial hearts to bionic jaws. In addition to photos and descriptions, the team will soon be looking to source videos of prosthetics in action from the public. Sengeh also hopes to integrate a timeline, tracking bionic parts throughout history, from the bionic toes of Ancient Egypt to the 3-D printed fingers of modern times.

“In [Haitian and Sierra Leonian] Creole, the word for disabled, like an amputee, is ‘scrap,'” Sengeh said. “I wanted to change that, because I know that we can get full functionality and become able-bodied.”

Do read Brownstone’s article as I haven’t, by any means, excerpted all the interesting bits.

There’s also more at The Human Bionic Project. Here’s a description (or manifesto) from the home page,

The Human Bionic Project begs for the fundamental redefinition of disability, illness, and disease as we have known it throughout history. It dares us to imagine the seamless interaction between the human being and machines. This interactive learning platform enables the user to visualize and learn about the comprehensive advances in human repair and enhancement that can be achieved with current technology. We can also wonder about what the human being will look like by the 22nd Century (year 2100) based on cutting edge advances in science and technology — more specifically in the fields of biomechanics, and electronics.

The Human Bionic Project serves as a call to action for technologists all around the world to think about the design of bionics in a fundamentally new way; how can we engineer all bionic elements for the human body using a similar protocol and architecture? Could we have the behaviour of the bionic knee be in sync with that of the bionic ankle of an above-knee amputee? How can we design a bionic eye that sees beyond what the biological eye can observe and use that information to help humans in critical situations? We have to imagine bionics not as singular units developed to replace or augment human parts but rather as part of a human-bionic system aimed at redefining what it means to be human.

Some of the ideas presented are already products used today, while others are prototypes explored by various research laboratories and inquisitive humans around the world. The works presented here are not ours and are publicly available. We have credited all the authors who are leading these extraordinary research initiatives.

You can find more about prosthetics, etc. on the ‘Inspector Gadget‘ page (it features an outline of a human body highlighted with red dots; click on a red dot to get details about prosthetics and other forms of augmentation). I don’t find this to be an especially friendly or intuitive interface. I think this is an MIT (Massachusetts Institute of Technology) student project and I find MIT tends to favour minimalism on its institutional and student websites. Still, there’s some fascinating information if you care to persist.

Here are more details about the folks and the funding supporting The Human Bionic Project (from the bottom of the home page),

A project by David Moinina Sengeh. Collaborator: Reza Naeeni. Web development: Yannik Messerli. Undergraduate research assistant: Nicholas Fine. Funded by The Other Festival at MIT Media Lab (2013). Follow us on twitter: @humanbionicproj. …

I last mentioned human enhancement/augmentation in my June 17, 2013 commentary on You Are Very Star, a transmedia theatre experience taking place in Vancouver until June 29, 2013. I have written many times on the topic of human enhancement including a May 2, 2013 posting about a bionic ear; a Feb. 15, 2013 posting about a bionic eye; and a Jan. 30, 2013 posting about a BBC documentary on building a bionic man, amongst others.