Tag Archives: e-skin

Touchy robots and prosthetics

I have briefly speculated about the importance of touch elsewhere (see my July 19, 2019 posting regarding BlocKit and blockchain; scroll down about 50% of the way) but the two news bits that follow put a different spin on the importance of touch.

Exceptional sense of touch

Robots need a sense of touch to perform their tasks and a July 18, 2019 National University of Singapore press release (also on EurekAlert) announces work on an improved sense of touch,

Robots and prosthetic devices may soon have a sense of touch equivalent to, or better than, the human skin with the Asynchronous Coded Electronic Skin (ACES), an artificial nervous system developed by a team of researchers at the National University of Singapore (NUS).

The new electronic skin system achieved ultra-high responsiveness and robustness to damage, and can be paired with any kind of sensor skin layers to function effectively as an electronic skin.

The innovation, achieved by Assistant Professor Benjamin Tee and his team from the Department of Materials Science and Engineering at the NUS Faculty of Engineering, was first reported in prestigious scientific journal Science Robotics on 18 July 2019.

Faster than the human sensory nervous system

“Humans use our sense of touch to accomplish almost every daily task, such as picking up a cup of coffee or making a handshake. Without it, we will even lose our sense of balance when walking. Similarly, robots need to have a sense of touch in order to interact better with humans, but robots today still cannot feel objects very well,” explained Asst Prof Tee, who has been working on electronic skin technologies for over a decade in hope of giving robots and prosthetic devices a better sense of touch.

Drawing inspiration from the human sensory nervous system, the NUS team spent a year and a half developing a sensor system that could potentially perform better. While the ACES electronic nervous system detects signals like the human sensory nervous system does, it is made up of a network of sensors connected via a single electrical conductor, unlike the nerve bundles in the human skin. It is also unlike existing electronic skins, whose interlinked wiring systems can make them sensitive to damage and difficult to scale up.

Elaborating on the inspiration, Asst Prof Tee, who also holds appointments in the NUS Department of Electrical and Computer Engineering, NUS Institute for Health Innovation & Technology (iHealthTech), N.1 Institute for Health and the Hybrid Integrated Flexible Electronic Systems (HiFES) programme, said, “The human sensory nervous system is extremely efficient, and it works all the time to the extent that we often take it for granted. It is also very robust to damage. Our sense of touch, for example, does not get affected when we suffer a cut. If we can mimic how our biological system works and make it even better, we can bring about tremendous advancements in the field of robotics where electronic skins are predominantly applied.”

ACES can detect touches more than 1,000 times faster than the human sensory nervous system. For example, it is capable of differentiating physical contacts between different sensors in less than 60 nanoseconds – the fastest ever achieved for an electronic skin technology – even with large numbers of sensors. ACES-enabled skin can also accurately identify the shape, texture and hardness of objects within 10 milliseconds, ten times faster than the blinking of an eye. This is enabled by the high fidelity and capture speed of the ACES system.

The ACES platform can also be designed for high robustness to physical damage, an important property for electronic skins because they come into frequent physical contact with the environment. Unlike the interlinked wiring used in existing electronic skins, all the sensors in ACES connect to a common electrical conductor, with each sensor operating independently. This allows ACES-enabled electronic skins to continue functioning as long as there is one connection between a sensor and the conductor, making them less vulnerable to damage.
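The press release says only that many independent sensors share one conductor; the underlying paper describes sensors transmitting asynchronous coded pulse signatures. The Python sketch below illustrates that general idea; the signature scheme, slot counts, and decoding threshold are invented for illustration and are not the published ACES protocol.

```python
import random

# Each sensor owns a unique pseudo-random pulse pattern (its "signature").
# All sensors fire onto one shared line; the receiver recovers which sensors
# were touched by checking each known signature against the pulses it sees.

SIG_LEN = 64        # time slots in one signature window (invented value)
PULSES_PER_SIG = 8  # pulses a sensor emits within the window (invented value)

def make_signature(seed):
    """Deterministic pseudo-random set of pulse slots for one sensor."""
    rng = random.Random(seed)
    return frozenset(rng.sample(range(SIG_LEN), PULSES_PER_SIG))

def transmit(active_sensors, signatures):
    """Superimpose the active sensors' pulses on a single shared conductor."""
    line = [0] * SIG_LEN
    for sensor in active_sensors:
        for slot in signatures[sensor]:
            line[slot] += 1  # pulses from different sensors simply add up
    return line

def decode(line, signatures, threshold=0.9):
    """Judge a sensor active if most of its signature slots carry pulses."""
    return {sensor for sensor, sig in signatures.items()
            if sum(1 for slot in sig if line[slot] > 0) / len(sig) >= threshold}

signatures = {i: make_signature(i) for i in range(100)}
touched = {3, 42, 77}
received = transmit(touched, signatures)
recovered = decode(received, signatures)  # contains every touched sensor
```

Because every touched sensor's full signature lands on the line, decoding always finds it; with enough sensors active at once, overlapping signatures can produce occasional false positives, which is the sort of trade-off a real coding scheme has to engineer around.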

Smart electronic skins for robots and prosthetics

ACES’ simple wiring system and remarkable responsiveness even with increasing numbers of sensors are key characteristics that will facilitate the scale-up of intelligent electronic skins for Artificial Intelligence (AI) applications in robots, prosthetic devices and other human-machine interfaces.

“Scalability is a critical consideration as big pieces of high performing electronic skins are required to cover the relatively large surface areas of robots and prosthetic devices,” explained Asst Prof Tee. “ACES can be easily paired with any kind of sensor skin layers, for example, those designed to sense temperatures and humidity, to create high performance ACES-enabled electronic skin with an exceptional sense of touch that can be used for a wide range of purposes,” he added.

For instance, pairing ACES with the transparent, self-healing and water-resistant sensor skin layer also recently developed by Asst Prof Tee’s team, creates an electronic skin that can self-repair, like the human skin. This type of electronic skin can be used to develop more realistic prosthetic limbs that will help disabled individuals restore their sense of touch.

Other potential applications include developing more intelligent robots that can perform disaster recovery tasks or take over mundane operations such as packing of items in warehouses. The NUS team is therefore looking to further apply the ACES platform to advanced robots and prosthetic devices in the next phase of their research.

For those who like videos, the researchers have prepared this,

Here’s a link to and a citation for the paper,

A neuro-inspired artificial peripheral nervous system for scalable electronic skins by Wang Wei Lee, Yu Jun Tan, Haicheng Yao, Si Li, Hian Hian See, Matthew Hon, Kian Ann Ng, Betty Xiong, John S. Ho and Benjamin C. K. Tee. Science Robotics Vol. 4, Issue 32, eaax2198, 31 July 2019 DOI: 10.1126/scirobotics.aax2198 Published online first: 17 Jul 2019

This paper is behind a paywall.

Picking up a grape and holding his wife’s hand

This story comes from Canadian Broadcasting Corporation (CBC) Radio: a July 25, 2019 CBC Radio ‘As It Happens’ article by Sheena Goodyear, with a six-minute audio story embedded in the text,

The West Valley City, Utah, real estate agent [Keven Walgamott] lost his left hand in an electrical accident 17 years ago. Since then, he’s tried out a few different prosthetic limbs, but always found them too clunky and uncomfortable.

Then he decided to work with the University of Utah in 2016 to test out new prosthetic technology that mimics the sensation of human touch, allowing Walgamott to perform delicate tasks with precision — including shaking his wife’s hand. 

“I extended my left hand, she came and extended hers, and we were able to feel each other with the left hand for the first time in 13 years, and it was just a marvellous and wonderful experience,” Walgamott told As It Happens guest host Megan Williams. 

Walgamott, one of seven participants in the University of Utah study, was able to use an advanced prosthetic hand called the LUKE Arm to pick up an egg without cracking it, pluck a single grape from a bunch, hammer a nail, take a ring on and off his finger, fit a pillowcase over a pillow and more. 

While performing the tasks, Walgamott was able to actually feel the items he was holding and correctly gauge the amount of pressure he needed to exert — mimicking a process the human brain does automatically.

“I was able to feel something in each of my fingers,” he said. “What I feel, I guess the easiest way to explain it, is little electrical shocks.”

Those shocks — which he describes as a kind of a tingling sensation — intensify as he tightens his grip.

“Different variations of the intensity of the electricity as I move my fingers around and as I touch things,” he said. 

To make that [sense of touch] happen, the researchers implanted electrodes into the nerves on Walgamott’s forearm, allowing his brain to communicate with his prosthetic through a computer outside his body. That means he can move the hand just by thinking about it.

But those signals also work in reverse.

The team attached sensors to the hand of a LUKE Arm. Those sensors detect touch and positioning, and send that information to the electrodes so it can be interpreted by the brain.

For Walgamott, performing a series of menial tasks as a team of scientists recorded his progress was “fun to do.”

“I’d forgotten how well two hands work,” he said. “That was pretty cool.”

But it was also a huge relief from the phantom limb pain he has experienced since the accident, which he describes as a “burning sensation” in the place where his hand used to be.

A July 24, 2019 University of Utah news release (also on EurekAlert) provides more detail about the research,

Keven Walgamott had a good “feeling” about picking up the egg without crushing it.

What seems simple for nearly everyone else can be more of a Herculean task for Walgamott, who lost his left hand and part of his arm in an electrical accident 17 years ago. But he was testing out the prototype of a high-tech prosthetic arm with fingers that not only can move, they can move with his thoughts. And thanks to a biomedical engineering team at the University of Utah, he “felt” the egg well enough so his brain could tell the prosthetic hand not to squeeze too hard.

That’s because the team, led by U biomedical engineering associate professor Gregory Clark, has developed a way for the “LUKE Arm” (so named after the robotic hand that Luke Skywalker got in “The Empire Strikes Back”) to mimic the way a human hand feels objects by sending the appropriate signals to the brain. Their findings were published in a new paper co-authored by U biomedical engineering doctoral student Jacob George, former doctoral student David Kluger, Clark and other colleagues in the latest edition of the journal Science Robotics. A copy of the paper may be obtained by emailing robopak@aaas.org.

“We changed the way we are sending that information to the brain so that it matches the human body. And by matching the human body, we were able to see improved benefits,” George says. “We’re making more biologically realistic signals.”

That means an amputee wearing the prosthetic arm can sense the touch of something soft or hard, understand better how to pick it up and perform delicate tasks that would otherwise be impossible with a standard prosthetic with metal hooks or claws for hands.

“It almost put me to tears,” Walgamott says about using the LUKE Arm for the first time during clinical tests in 2017. “It was really amazing. I never thought I would be able to feel in that hand again.”

Walgamott, a real estate agent from West Valley City, Utah, and one of seven test subjects at the U, was able to pluck grapes without crushing them, pick up an egg without cracking it and hold his wife’s hand with a sensation in the fingers similar to that of an able-bodied person.

“One of the first things he wanted to do was put on his wedding ring. That’s hard to do with one hand,” says Clark. “It was very moving.”

Those things are accomplished through a complex series of mathematical calculations and modeling.

The LUKE Arm

The LUKE Arm has been in development for some 15 years. The arm itself is made of mostly metal motors and parts with a clear silicon “skin” over the hand. It is powered by an external battery and wired to a computer. It was developed by DEKA Research & Development Corp., a New Hampshire-based company founded by Segway inventor Dean Kamen.

Meanwhile, the U’s team has been developing a system that allows the prosthetic arm to tap into the wearer’s nerves, which are like biological wires that send signals to the arm to move. It does that thanks to an invention by U biomedical engineering Emeritus Distinguished Professor Richard A. Normann called the Utah Slanted Electrode Array. The array is a bundle of 100 microelectrodes and wires that are implanted into the amputee’s nerves in the forearm and connected to a computer outside the body. The array interprets the signals from the still-remaining arm nerves, and the computer translates them to digital signals that tell the arm to move.
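To make that decoding step concrete, here is a deliberately simplified Python sketch in which per-electrode firing rates are mapped to movement commands through a linear decoder. The electrode count matches the array described above, but the weights, the degree-of-freedom count, and the linear model itself are placeholders for illustration, not the Utah team's actual decoder.

```python
import random

N_ELECTRODES = 100  # the Utah Slanted Electrode Array has 100 microelectrodes
N_DOF = 6           # hypothetical degrees of freedom for the prosthetic hand

# Placeholder decoder weights; a real decoder would be fit to recorded data.
random.seed(0)
weights = [[random.gauss(0.0, 0.1) for _ in range(N_ELECTRODES)]
           for _ in range(N_DOF)]

def decode_movement(firing_rates):
    """Map per-electrode firing rates (Hz) to one command per degree of freedom."""
    assert len(firing_rates) == N_ELECTRODES
    return [sum(w * r for w, r in zip(row, firing_rates)) for row in weights]

# One time step: noisy firing rates in, a six-element movement command out.
command = decode_movement([random.uniform(0.0, 50.0) for _ in range(N_ELECTRODES)])
```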

But it also works the other way. Performing tasks such as picking up objects requires more than just the brain telling the hand to move. The prosthetic hand must also learn how to “feel” the object in order to know how much pressure to exert, because you can’t figure that out just by looking at it.

First, the prosthetic arm has sensors in its hand that send signals to the nerves via the array to mimic the feeling the hand gets upon grabbing something. But equally important is how those signals are sent. It involves understanding how your brain deals with transitions in information when it first touches something. Upon first contact of an object, a burst of impulses runs up the nerves to the brain and then tapers off. Recreating this was a big step.
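The “burst at contact, then taper” behaviour described above can be sketched as a stimulation rate with two parts: a sustained term tracking pressure and a transient term tracking how fast pressure just rose. The two-term form and all the gains below are invented for illustration; the published model was fit to recorded nerve impulses.

```python
# Stimulation rate = sustained response to pressure + transient onset burst.
# Gains are arbitrary illustrative values, not the published parameters.

def biomimetic_rate(pressure, prev_pressure, dt,
                    sustained_gain=50.0, onset_gain=400.0):
    """Return a stimulation rate (pulses/s) for the current time step."""
    sustained = sustained_gain * pressure
    onset = onset_gain * max(0.0, (pressure - prev_pressure) / dt)
    return sustained + onset

dt = 0.01  # seconds per time step
trace = [0.0] * 5 + [1.0] * 10  # pressure steps from 0 to 1 at contact
rates = [biomimetic_rate(p, q, dt) for q, p in zip(trace, trace[1:])]
# rates spike at the 0 -> 1 step, then settle to the sustained value
```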

“Just providing sensation is a big deal, but the way you send that information is also critically important, and if you make it more biologically realistic, the brain will understand it better and the performance of this sensation will also be better,” says Clark.

To achieve that, Clark’s team used mathematical calculations along with recorded impulses from a primate’s arm to create an approximate model of how humans receive these different signal patterns. That model was then implemented into the LUKE Arm system.

Future research

In addition to creating a prototype of the LUKE Arm with a sense of touch, the overall team is already developing a version that is completely portable and does not need to be wired to a computer outside the body. Instead, everything would be connected wirelessly, giving the wearer complete freedom.

Clark says the Utah Slanted Electrode Array is also capable of sending signals to the brain for more than just the sense of touch, such as pain and temperature, though the paper primarily addresses touch. And while their work currently has only involved amputees who lost their extremities below the elbow, where the muscles to move the hand are located, Clark says their research could also be applied to those who lost their arms above the elbow.

Clark hopes that in 2020 or 2021, three test subjects will be able to take the arm home to use, pending federal regulatory approval.

The research involves a number of institutions including the U’s Department of Neurosurgery, Department of Physical Medicine and Rehabilitation and Department of Orthopedics, the University of Chicago’s Department of Organismal Biology and Anatomy, the Cleveland Clinic’s Department of Biomedical Engineering and Utah neurotechnology companies Ripple Neuro LLC and Blackrock Microsystems. The project is funded by the Defense Advanced Research Projects Agency and the National Science Foundation.

“This is an incredible interdisciplinary effort,” says Clark. “We could not have done this without the substantial efforts of everybody on that team.”

Here’s a link to and a citation for the paper,

Biomimetic sensory feedback through peripheral nerve stimulation improves dexterous use of a bionic hand by J. A. George, D. T. Kluger, T. S. Davis, S. M. Wendelken, E. V. Okorokova, Q. He, C. C. Duncan, D. T. Hutchinson, Z. C. Thumser, D. T. Beckler, P. D. Marasco, S. J. Bensmaia and G. A. Clark. Science Robotics Vol. 4, Issue 32, eaax2352, 31 July 2019 DOI: 10.1126/scirobotics.aax2352 Published online first: 24 Jul 2019

This paper is definitely behind a paywall.

The University of Utah researchers have produced a video highlighting their work,

A solar, self-charging supercapacitor for wearable technology

Ravinder Dahiya, Carlos García Núñez, and their colleagues at the University of Glasgow (Scotland) strike again (see my May 10, 2017 posting for their first ‘solar-powered graphene skin’ research announcement). Last time it was all about robots and prosthetics; this time they’ve focused on wearable technology, according to a July 18, 2018 news item on phys.org,

A new form of solar-powered supercapacitor could help make future wearable technologies lighter and more energy-efficient, scientists say.

In a paper published in the journal Nano Energy, researchers from the University of Glasgow’s Bendable Electronics and Sensing Technologies (BEST) group describe how they have developed a promising new type of graphene supercapacitor, which could be used in the next generation of wearable health sensors.

A July 18, 2018 University of Glasgow press release, which originated the news item, explains further,

Currently, wearable systems generally rely on relatively heavy, inflexible batteries, which can be uncomfortable for long-term users. The BEST team, led by Professor Ravinder Dahiya, have built on their previous success in developing flexible sensors by developing a supercapacitor which could power health sensors capable of conforming to wearers’ bodies, offering more comfort and more consistent contact with the skin to better collect health data.

Their new supercapacitor uses layers of flexible, three-dimensional porous foam formed from graphene and silver to produce a device capable of storing and releasing around three times more power than any similar flexible supercapacitor. The team demonstrated the durability of the supercapacitor, showing that it provided power consistently across 25,000 charging and discharging cycles.
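As a side note on how a durability figure like “25,000 cycles” is usually reported: the cell is charged and discharged repeatedly while capacitance retention is tracked. The toy fade model in this Python sketch is invented purely for illustration; the actual retention curve is in the paper.

```python
def capacitance_after(cycles, c0=1.0, fade_per_cycle=1e-6):
    """Toy linear-fade model of capacitance versus charge/discharge cycles."""
    return c0 * max(0.0, 1.0 - fade_per_cycle * cycles)

# Retention is the fraction of the initial capacitance that survives cycling.
retention = capacitance_after(25_000) / capacitance_after(0)
print(f"retention after 25,000 cycles: {retention:.1%}")  # prints 97.5%
```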

They have also found a way to charge the system by integrating it with the flexible solar-powered skin already developed by the BEST group, effectively creating an entirely self-charging system, as well as a pH sensor which uses the wearer’s sweat to monitor their health.

Professor Dahiya said: “We’re very pleased by the progress this new form of solar-powered supercapacitor represents. A flexible, wearable health monitoring system which only requires exposure to sunlight to charge has a lot of obvious commercial appeal, but the underlying technology has a great deal of additional potential.

“This research could take the wearable systems for health monitoring to remote parts of the world where solar power is often the most reliable source of energy, and it could also increase the efficiency of hybrid electric vehicles. We’re already looking at further integrating the technology into flexible synthetic skin which we’re developing for use in advanced prosthetics.” [emphasis mine]

In addition to the work on robots, prosthetics, and graphene ‘skin’ mentioned in the May 10, 2017 posting, the team is developing a synthetic ‘brainy’ skin, for which they have just received £1.5m in funding from the Engineering and Physical Sciences Research Council (EPSRC).

Brainy skin

A July 3, 2018 University of Glasgow press release discusses the proposed work in more detail,

A robotic hand covered in ‘brainy skin’ that mimics the human sense of touch is being developed by scientists.

University of Glasgow’s Professor Ravinder Dahiya has plans to develop ultra-flexible, synthetic Brainy Skin that ‘thinks for itself’.

The super-flexible, hypersensitive skin may one day be used to make more responsive prosthetics for amputees, or to build robots with a sense of touch.

Brainy Skin reacts like human skin, which has its own neurons that respond immediately to touch rather than having to relay the whole message to the brain.

This electronic ‘thinking skin’ is made from silicon based printed neural transistors and graphene – an ultra-thin form of carbon that is only an atom thick, but stronger than steel.

The new version is more powerful, less cumbersome and would work better than earlier prototypes, also developed by Professor Dahiya and his Bendable Electronics and Sensing Technologies (BEST) team at the University’s School of Engineering.

His futuristic research, called neuPRINTSKIN (Neuromorphic Printed Tactile Skin), has just received another £1.5m in funding from the Engineering and Physical Sciences Research Council (EPSRC).

Professor Dahiya said: “Human skin is an incredibly complex system capable of detecting pressure, temperature and texture through an array of neural sensors that carry signals from the skin to the brain.

“Inspired by real skin, this project will harness the technological advances in electronic engineering to mimic some features of human skin, such as softness, bendability and now, also sense of touch. This skin will not just mimic the morphology of the skin but also its functionality.

“Brainy Skin is critical for the autonomy of robots and for a safe human-robot interaction to meet emerging societal needs such as helping the elderly.”

Synthetic ‘Brainy Skin’ with sense of touch gets £1.5m funding. Photo of Professor Ravinder Dahiya

This latest advance means tactile data is gathered over large areas by the synthetic skin’s computing system rather than sent to the brain for interpretation.

With additional EPSRC funding, which extends Professor Dahiya’s fellowship by another three years, he plans to introduce tactile skin with neuron-like processing. This breakthrough in the tactile sensing research will lead to the first neuromorphic tactile skin, or ‘brainy skin.’

To achieve this, Professor Dahiya will add a new neural layer to the e-skin that he has already developed using printed silicon nanowires.

Professor Dahiya added: “By adding a neural layer underneath the current tactile skin, neuPRINTSKIN will add significant new perspective to the e-skin research, and trigger transformations in several areas such as robotics, prosthetics, artificial intelligence, wearable systems, next-generation computing, and flexible and printed electronics.”

The Engineering and Physical Sciences Research Council (EPSRC) is part of UK Research and Innovation, a non-departmental public body funded by a grant-in-aid from the UK government.

EPSRC is the main funding body for engineering and physical sciences research in the UK. By investing in research and postgraduate training, the EPSRC is building the knowledge and skills base needed to address the scientific and technological challenges facing the nation.

Its portfolio covers a vast range of fields from healthcare technologies to structural engineering, manufacturing to mathematics, advanced materials to chemistry. The research funded by EPSRC has impact across all sectors. It provides a platform for future UK prosperity by contributing to a healthy, connected, resilient, productive nation.

It’s fascinating to note how these pieces of research fit together for wearable technology and health monitoring and creating more responsive robot ‘skin’ and, possibly, prosthetic devices that would allow someone to feel again.

The latest research paper

Getting back to the solar-charging supercapacitors mentioned in the opening, here’s a link to and a citation for the team’s latest research paper,

Flexible self-charging supercapacitor based on graphene-Ag-3D graphene foam electrodes by Libu Manjakkal, Carlos García Núñez, Wenting Dang and Ravinder Dahiya. Nano Energy Volume 51, September 2018, Pages 604-612 DOI: https://doi.org/10.1016/j.nanoen.2018.06.072

This paper is open access.

The sense of touch via artificial skin

Scientists have been working for years to allow artificial skin to transmit what the brain would recognize as the sense of touch. For anyone who has lost a limb and gotten a prosthetic replacement, the loss of touch is reputedly one of the more difficult losses to accept. The sense of touch is also vital in robotics if the field is to expand to include activities reliant on the sense of touch, e.g., how much pressure do you use to grasp a cup; how much strength do you apply when moving an object from one place to another?

For anyone interested in the ‘electronic skin and pursuit of touch’ story, I have a Nov. 15, 2013 posting which highlights the evolution of the research into e-skin and what was then some of the latest work.

This posting is a 2015 update of sorts featuring the latest e-skin research from Stanford University and Xerox PARC. (Dexter Johnson in an Oct. 15, 2015 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] site) provides a good research summary.) For anyone with an appetite for more, there’s this from an Oct. 15, 2015 American Association for the Advancement of Science (AAAS) news release on EurekAlert,

Using flexible organic circuits and specialized pressure sensors, researchers have created an artificial “skin” that can sense the force of static objects. Furthermore, they were able to transfer these sensory signals to the brain cells of mice in vitro using optogenetics. For the many people around the world living with prosthetics, such a system could one day allow them to feel sensation in their artificial limbs. To create the artificial skin, Benjamin Tee et al. developed a specialized circuit out of flexible, organic materials. It translates static pressure into digital signals that depend on how much mechanical force is applied. A particular challenge was creating sensors that can “feel” the same range of pressure that humans can. Thus, on the sensors, the team used carbon nanotubes molded into pyramidal microstructures, which are particularly effective at tunneling the signals from the electric field of nearby objects to the receiving electrode in a way that maximizes sensitivity.

Transferring the digital signal from the artificial skin system to the cortical neurons of mice proved to be another challenge, since conventional light-sensitive proteins used in optogenetics do not stimulate neural spikes for sufficient durations for these digital signals to be sensed. Tee et al. therefore engineered new optogenetic proteins able to accommodate longer intervals of stimulation. Applying these newly engineered optogenetic proteins to fast-spiking interneurons of the somatosensory cortex of mice in vitro sufficiently prolonged the stimulation interval, allowing the neurons to fire in accordance with the digital stimulation pulse. These results indicate that the system may be compatible with other fast-spiking neurons, including peripheral nerves.

And, there’s an Oct. 15, 2015 Stanford University news release on EurekAlert describing this work from another perspective,

The heart of the technique is a two-ply plastic construct: the top layer creates a sensing mechanism and the bottom layer acts as the circuit to transport electrical signals and translate them into biochemical stimuli compatible with nerve cells. The top layer in the new work featured a sensor that can detect pressure over the same range as human skin, from a light finger tap to a firm handshake.

Five years ago, Bao’s [Zhenan Bao, a professor of chemical engineering at Stanford,] team members first described how to use plastics and rubbers as pressure sensors by measuring the natural springiness of their molecular structures. They then increased this natural pressure sensitivity by indenting a waffle pattern into the thin plastic, which further compresses the plastic’s molecular springs.

To exploit this pressure-sensing capability electronically, the team scattered billions of carbon nanotubes through the waffled plastic. Putting pressure on the plastic squeezes the nanotubes closer together and enables them to conduct electricity.

This allowed the plastic sensor to mimic human skin, which transmits pressure information as short pulses of electricity, similar to Morse code, to the brain. Increasing pressure on the waffled nanotubes squeezes them even closer together, allowing more electricity to flow through the sensor, and those varied impulses are sent as short pulses to the sensing mechanism. Remove pressure, and the flow of pulses relaxes, indicating light touch. Remove all pressure and the pulses cease entirely.
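The pressure-to-pulse-rate encoding just described can be sketched in a few lines of Python: pressure raises the nanotube network's conductance, and the readout turns higher conductance into faster pulses. The saturating curve and every number below are made up for illustration; the device's real transfer curve is in the paper.

```python
import math

def conductance(pressure_kpa, g_max=1.0, k=0.05):
    """Toy saturating curve: conductance rises with pressure, then levels off."""
    return g_max * (1.0 - math.exp(-k * pressure_kpa))

def pulse_frequency(pressure_kpa, f_max=200.0):
    """The readout emits pulses faster as more current flows through the sensor."""
    return f_max * conductance(pressure_kpa)

for p in (0, 10, 50, 100):  # roughly: no touch, light tap, grip, firm handshake
    print(f"{p:3d} kPa -> {pulse_frequency(p):6.1f} pulses/s")
```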

The team then hooked this pressure-sensing mechanism to the second ply of their artificial skin, a flexible electronic circuit that could carry pulses of electricity to nerve cells.

Importing the signal

Bao’s team has been developing flexible electronics that can bend without breaking. For this project, team members worked with researchers from PARC, a Xerox company, which has a technology that uses an inkjet printer to deposit flexible circuits onto plastic. Covering a large surface is important to making artificial skin practical, and the PARC collaboration offered that prospect.

Finally the team had to prove that the electronic signal could be recognized by a biological neuron. It did this by adapting a technique developed by Karl Deisseroth, a fellow professor of bioengineering at Stanford who pioneered a field that combines genetics and optics, called optogenetics. Researchers bioengineer cells to make them sensitive to specific frequencies of light, then use light pulses to switch cells, or the processes being carried on inside them, on and off.

For this experiment the team members engineered a line of neurons to simulate a portion of the human nervous system. They translated the electronic pressure signals from the artificial skin into light pulses, which activated the neurons, proving that the artificial skin could generate a sensory output compatible with nerve cells.

Optogenetics was only used as an experimental proof of concept, Bao said, and other methods of stimulating nerves are likely to be used in real prosthetic devices. Bao’s team has already worked with Bianxiao Cui, an associate professor of chemistry at Stanford, to show that direct stimulation of neurons with electrical pulses is possible.

Bao’s team envisions developing different sensors to replicate, for instance, the ability to distinguish corduroy versus silk, or a cold glass of water from a hot cup of coffee. This will take time. There are six types of biological sensing mechanisms in the human hand, and the experiment described in Science reports success in just one of them.

But the current two-ply approach means the team can add sensations as it develops new mechanisms. And the inkjet printing fabrication process suggests how a network of sensors could be deposited over a flexible layer and folded over a prosthetic hand.

“We have a lot of work to take this from experimental to practical applications,” Bao said. “But after spending many years in this work, I now see a clear path where we can take our artificial skin.”

Here’s a link to and a citation for the paper,

A skin-inspired organic digital mechanoreceptor by Benjamin C.-K. Tee, Alex Chortos, Andre Berndt, Amanda Kim Nguyen, Ariane Tom, Allister McGuire, Ziliang Carter Lin, Kevin Tien, Won-Gyu Bae, Huiliang Wang, Ping Mei, Ho-Hsiu Chou, Bianxiao Cui, Karl Deisseroth, Tse Nga Ng, & Zhenan Bao. Science 16 October 2015 Vol. 350 no. 6258 pp. 313-316 DOI: 10.1126/science.aaa9306

This paper is behind a paywall.

Electronic skin and its evolution

In his Nov. 15, 2013 Nanowerk Spotlight article, Michael Berger features a review in the journal Advanced Materials covering 25 years of work on e-skin (aka electronic skin or artificial skin),

Advances in materials, fabrication strategies and device designs for flexible and stretchable electronics and sensors make it possible to envision a not-too-distant future where ultra-thin, flexible circuits based on inorganic semiconductors can be wrapped and attached to any imaginable surface, including body parts and even internal organs. Robotic technologies will also benefit as it becomes possible to fabricate electronic skin (‘e-skin’) that, for instance, could allow surgical robots to interact, in a soft contacting mode, with their surroundings through touch. In addition to giving robots a finer sense of touch, engineers believe that e-skin technology could also be used to create things like wallpapers that double as touchscreen displays and dashboard laminates that allow drivers to adjust electronic controls with the wave of a hand.

Here’s a link to and a citation for the 25-year review of work on e-skin,

25th Anniversary Article: The Evolution of Electronic Skin (E-Skin): A Brief History, Design Considerations, and Recent Progress by Mallory L. Hammock, Alex Chortos, Benjamin C.-K. Tee, Jeffrey B.-H. Tok, and Zhenan Bao. Advanced Materials Volume 25, Issue 42, pages 5997–6038, November 13, 2013 Article first published online: 22 OCT 2013 DOI: 10.1002/adma.201302240

© 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

The review article is behind a paywall but Berger’s synopsis offers a good overview* and tidbits such as this timeline (Berger offers a larger version) which includes important moments in science fiction (Note: Links in the caption have been removed),

Figure 1. A brief chronology of the evolution of e-skin. We emphasize several science fictional events in popular culture that inspired subsequent critical technological advancements in the development of e-skin. Images reproduced with permission: “micro-structured pressure sensor,”[18] “stretchable OLEDs,”[20b] “stretchable OPVs,”[21a] “stretchable, transparent e-skin,”[22] “macroscale nanowire e-skin,”[23a] “rechargeable, stretchable batteries,”[137] “interlocked e-skin.”[25] Copyright, respectively, 2010, 2009, 2012, 2005, 2010, 2013, 2012. Macmillan Publishers Ltd. “Flexible, active-matrix e-skin” image reproduced with permission.[26a] Copyright, 2004. National Academy of Sciences USA. “Epidermal electronics” image reproduced with permission.[390a] Copyright, American Association for the Advancement of Science. “Stretchable batteries” image reproduced with permission.[27] “Infrared e-skin” image reproduced with permission.[8b] Copyright 2001, IEEE. “Anthropomorphic cybernetic hand” image reproduced with permission.[426] Copyright 2006, IEEE. [downloaded from http://onlinelibrary.wiley.com.proxy.lib.sfu.ca/doi/10.1002/adma.201302240/full]

Here’s an excerpt from the review article outlining the 1970s – 1990s period featuring some of the science fiction which has influenced the science (Note: Links have been removed),

The prospect of creating artificial skin was in many ways inspired by science fiction, which propelled the possibility of e-skin into the imagination of both the general public as well as the scientific community. One of the first science fiction books to explore the use of mechanical replacement organs was Caidin’s Cyborg in 1971, on which the famed Six Million Dollar Man television series about a man with a bionic replacement arm and eye was later based (1974).[4] Shortly after, at the beginning of the 1980s, George Lucas created a vision of a future with e-skin in the famous Star Wars series. In particular, he depicted a scene showing a medical robot installing an electronic hand with full sensory perception on the main character, Luke Skywalker.[5] Shortly after, in 1984, the Terminator movie series depicted humanoid robots and even a self-healing robot.[6] These fictitious renditions of e-skin took place against a real-life backdrop of vibrant microelectronics research that began bridging science fiction with scientific reality.

Early technological advancements in the development of e-skin were concomitant with their science fiction inspirations. In 1974, Clippinger et al. demonstrated a prosthetic hand capable of discrete sensor feedback.[7] Nearly a decade later, Hewlett-Packard (HP) marketed a personal computer (HP-150) that was equipped with a touchscreen, allowing users to activate functions by simply touching the display. It was the first mass-marketed electronic device capitalizing on the intuitive nature of human touch. In 1985, General Electric (GE) built the first sensitive skin for a robotic arm using discrete infrared sensors placed on a flexible sheet at a resolution of ≈5 cm.[8] The fabricated sensitive skin was proximally aware of its surroundings, allowing the robot’s arm to avert potential obstacles and effectively maneuver within its physical environment. Despite the robotic arm’s lack of fingers and low resolution, it was capable of demonstrating that electronics integrated into a membrane could allow for natural human–machine interaction. For example, the robotic arm was able to ‘dance’ with a ballerina without any pre-programmed motions.[8] In addition to the ability of an artificial skin to interact with its surroundings, it is equally critical that the artificial skin mimics the mechanical properties of human skin to accommodate its various motions. Hence, to build life-like prosthetics or humanoid robots, soft, flexible, and stretchable electronics needed to be developed.

In the 1990s, scientists began using flexible electronic materials to create large-area, low-cost and printable sensor sheets. Jiang et al. proposed one of the first flexible sensor sheets for tactile shear force sensing by creating silicon (Si) micro-electro-mechanical (MEM) islands by etching thin Si wafers and integrating them on flexible polyimide foils.[9] Much work has since been done to enhance the reliability of large sensor sheets to mechanical bending.[10] Around the same time, flexible arrays fabricated from organic semiconductors began to emerge that rivaled the performance of amorphous Si.[11]

Just before the turn of the millennium, the first “Sensitive Skin Workshop” was held in Washington DC under the aegis of the National Science Foundation and the Defense Advanced Research Projects Agency, bringing together approximately sixty researchers from different sectors of academia, industry, and government. It was discovered that there was significant industrial interest in e-skins for various applications, ranging from robotics to health care. A summary of concepts outlined in the workshop was compiled by Lumelsky et al.[12] In the early 2000s, the pace of e-skin development significantly increased as a result of this workshop, and researchers began to explore different types of sensors that could be more easily integrated with microprocessors.

I have written about e-skin a number of times, most recently in a July 9, 2013 posting about work on flexible sensors and gold nanoparticles being conducted at Technion-Israel Institute of Technology. This review helps to contextualize projects such as the one at Technion and elsewhere.

*To avoid redundancy ‘synopsis’ was replaced by ‘overview’ on Oct. 19, 2015.

The gold beneath your skin (artificial skin, that is)

Artificial skin that can sense as if it were real skin isn’t here yet but scientists at Technion-Israel Institute of Technology have created a flexible sensor that could fulfill that promise. From a July 9, 2013 news item on Azonano,

Using tiny gold particles and a kind of resin, a team of scientists at the Technion-Israel Institute of Technology has discovered how to make a new kind of flexible sensor that one day could be integrated into electronic skin, or e-skin.

If scientists learn how to attach e-skin to prosthetic limbs, people with amputations might once again be able to feel changes in their environments. The findings appear in the June issue of ACS Applied Materials & Interfaces.

The July 8, 2013 American Technion Society news release by Kevin Hattori, which originated the news item, describes the problems with developing flexible sensors that can mimic natural skin,

Researchers have long been interested in flexible sensors, but have had trouble adapting them for real-world use. To make its way into mainstream society, a flexible sensor would have to run on low voltage (so it would be compatible with the batteries in today’s portable devices), measure a wide range of pressures, and make more than one measurement at a time, including humidity, temperature, pressure, and the presence of chemicals. In addition, these sensors would also have to be able to be made quickly, easily, and cheaply.

Here are more details about the sensor and about how the researchers created it,

The Technion team’s sensor has all of these qualities. The secret is the use of monolayer-capped nanoparticles that are only 5-8 nanometers in diameter. They are made of gold and surrounded by connector molecules called ligands. In fact, “monolayer-capped nanoparticles can be thought of as flowers, where the center of the flower is the gold or metal nanoparticle and the petals are the monolayer of organic ligands that generally protect it,” says Haick.

The team discovered that when these nanoparticles are laid on top of a substrate – in this case, made of PET (flexible polyethylene terephthalate), the same plastic found in soda bottles – the resulting compound conducted electricity differently depending on how the substrate was bent. (The bending motion brings some particles closer to others, increasing how quickly electrons can pass between them.) This electrical property means that the sensor can detect a large range of pressures, from tens of milligrams to tens of grams. “The sensor is very stable and can be attached to any surface shape while keeping the function stable,” says Dr. Nir Peled, Head of the Thoracic Cancer Research and Detection Center at Israel’s Sheba Medical Center, who was not involved in the research.

And by varying how thick the substrate is, as well as what it is made of, scientists can modify how sensitive the sensor is. Because these sensors can be customized, they could in the future perform a variety of other tasks, including monitoring strain on bridges and detecting cracks in engines.
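For readers who like to see the mechanism in code, here is a rough sketch of why bending changes the sensor's reading: electron tunneling between the monolayer-capped gold nanoparticles gives a resistance that depends exponentially on the inter-particle gap, and bending the substrate changes that gap. All constants below are illustrative assumptions, not values from the paper.

```python
import math

# Illustrative constants (assumed, not measured values from the Technion work)
BETA = 1.2e10       # tunneling decay constant, 1/m (assumed)
R0 = 1.0e3          # film resistance at the rest gap, ohms (assumed)
REST_GAP = 1.0e-9   # rest inter-particle gap, m (assumed)

def resistance(strain: float) -> float:
    """Resistance of a nanoparticle film under strain.

    Positive strain (convex bending) widens the gap between particles
    and raises resistance; negative strain narrows the gap, letting
    electrons tunnel more easily, so resistance drops.
    """
    gap = REST_GAP * (1.0 + strain)
    return R0 * math.exp(BETA * (gap - REST_GAP))

# Compression lowers resistance; tension raises it.
print(resistance(-0.01) < resistance(0.0) < resistance(0.01))  # True
```

The exponential dependence is what gives the film its wide dynamic range, from tens of milligrams to tens of grams of applied pressure.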

According to research team leader Professor Hossam Haick, the new sensor is at least 10 times more touch-sensitive than existing e-skin.

Here’s a link to and a citation for the published paper,

Tunable Touch Sensor and Combined Sensing Platform: Toward Nanoparticle-based Electronic Skin by Meital Segev-Bar , Avigail Landman, Maayan Nir-Shapira, Gregory Shuster, and Hossam Haick. ACS Appl. Mater. Interfaces, 2013, 5 (12), pp 5531–5541 DOI: 10.1021/am400757q Publication Date (Web): June 4, 2013

Copyright © 2013 American Chemical Society

The paper is behind a paywall.

COllaborative Network for Training in Electronic Skin Technology (CONTEST) looking for twelve researchers

The CONTEST (COllaborative Network for Training in Electronic Skin Technology) project was launched today in Italy. According to the Aug. 21, 2012 news item on Nanowerk,

“Flexible electronics” is one of the most significant challenges in the field of future electronics. The possibility of realizing flexible and bendable electronic circuits, that can be rolled up, twisted or inserted in films around objects, would introduce a range of infinite applications in multiple fields, including healthcare, robotics and energy.

In this area, the Fondazione Bruno Kessler of Trento will coordinate the CONTEST project (COllaborative Network for Training in Electronic Skin Technology), an Initial Training Network (ITN) Marie Curie project funded by the European Commission involving European research, academic and business players. These include seven full partners (Fondazione Bruno Kessler, Italy; ST Microelectronics, Italy; Technical University Munich, Germany; Fraunhofer EMFT, Germany; University College London, UK; Imperial College London, UK; and Shadow Robotics Company, UK) and two associate partners (University of Cambridge, UK, and University of Tokyo, Japan).

The CONTEST project page at the Fondazione Bruno Kessler website offers more details,

At the heart of the CONTEST programme lies the multidisciplinary research training of young researchers. The CONTEST network will recruit twelve excellent Early-Stage Researchers (e.g. PhD students) and two Experienced Researchers (e.g. Post-Doc fellows). Information for submitting applications is available at the project’s website: http://www.contest-itn.eu/.
CONTEST activities will be coordinated by Ravinder S. Dahiya, researcher at the Bio-MEMS Unit (BIO-Micro-Electro-Mechanical-Systems) of the Center for Materials and Microsystems (Fondazione Bruno Kessler) and by Leandro Lorenzelli, head of the Bio-MEMS Unit.
“The disruptive flexible electronics technology – says Ravinder S. Dahiya – will create change and improve the electronic market landscape and usher in a new revolution in multifunctional electronics. It will transform to an unprecedented degree our view of electronics and how we, as a society, interact with intelligent and responsive systems.”
“The investigation, in a very multidisciplinary framework, of technological approaches for thin flexible components – explains Leandro Lorenzelli – will generate new paradigms and concepts for microelectronic devices and systems with new functionalities tailored to the needs of a wide range of applications including robotics, biomedical instrumentations and smart cities.”

Here’s more about the 12 researchers they’re recruiting, excerpted from the Job Openings page on the CONTEST project website (Note: I have removed some links),

We have been awarded a large interdisciplinary project on electronic skin and applications, called CONTEST (COllaborative Network for Training in Electronic Skin Technology). We are therefore looking for 12 excellent Early-Stage Researchers (e.g. PhD students) and 2 Experienced Researchers (e.g. Post-Doc), associated to:

  • Fondazione Bruno Kessler, Trento, Italy (2 Early-Stage Researcher positions on silicon based flexible sensors (e.g. touch sensors), electronic circuits and 1 Experienced Researcher position on system integration)  …,
  • ST Microelectronics, Catania, Italy (2 Early-Stage Researcher positions on chemical/physical sensors on flexible substrates, and metal patterned substrates for integrating flexible sensing elements)…,
  • Technical University Munich, Germany (3 Early-Stage Researcher positions on organic semiconductor based electronics devices and circuits, modeling of flexible devices and sensors … , and artificial skin in humanoids…,
  • Fraunhofer EMFT, Munich, Germany (1 Early-Stage Researcher position on assembly on film substrates and foil integration as well as 1 Experienced Researcher position on reliability and ESD issues of components during flex integration) … ,
  • University College London, UK (2 Early-Stage Researcher positions on organic semiconductor based interconnects, solutions processed sensors, alternative on-skin energy schemes, patterning of e-skin and stretchable interconnects using blends of graphene in polymeric materials …
  • Imperial College London, UK (1 Early-Stage Researcher position on human sensori-motor control and robotics) …, and
  • Shadow Robotics Company, UK (1 Early-Stage Researcher position on biorobotics and mechatronics) ….

Mobility rules apply to all these positions. Researchers can be of any nationality. They are required to undertake trans-national mobility (i.e. move from one country to another) when taking up their appointment. One general rule applies to the appointment of researchers: At the time of recruitment by the host organization, researchers must not have resided or carried out their main activity (work, studies, etc.) in the country of their host organization (i.e. recruiting institute) for more than 12 months in the 3 years immediately prior to the reference date. Short stays such as holidays and/or compulsory national service are not taken into account.

Good luck to all who apply! Priority will be given to applications received by Sept. 30, 2012.

Nanotechnology-enabled robot skin

We take it for granted most of the time. The ability to sense pressure and respond appropriately doesn't seem like any great gift, but without it you'd crush fragile objects or be unable to hold onto heavy ones.

It’s this ability to sense pressure that’s a stumbling block for robot makers who want to move robots into jobs that require some dexterity, e.g., one that could clean your windows and your walls without damaging one or failing to clean the other.

Two research teams have recently published papers about their work on solving the ‘pressure problem’. From the article by Jason Palmer for BBC News,

The materials, which can sense pressure as sensitively and quickly as human skin, have been outlined by two groups reporting in [the journal] Nature Materials.

The skins are arrays of small pressure sensors that convert tiny changes in pressure into electrical signals.

The arrays are built into or under flexible rubber sheets that could be stretched into a variety of shapes.

The materials could be used to sheath artificial limbs or to create robots that can pick up and hold fragile objects. They could also be used to improve tools for minimally-invasive surgery.
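The excerpt above describes the basic architecture: a grid of cells, each converting a local pressure change into an electrical signal, read out row by row. Here's a hypothetical sketch of what that readout loop might look like; the `read_conductance` function stands in for real hardware, and the noise threshold is an assumption for illustration.

```python
from typing import Callable, List, Tuple

def scan_array(rows: int, cols: int,
               read_conductance: Callable[[int, int], float],
               baseline: float) -> List[Tuple[int, int, float]]:
    """Scan a grid of pressure sensors and report touched cells.

    Returns (row, col, delta) for every cell whose conductance deviates
    from the unloaded baseline, i.e. wherever pressure is applied.
    """
    touches = []
    for r in range(rows):
        for c in range(cols):
            delta = read_conductance(r, c) - baseline
            if abs(delta) > 1e-6:  # ignore readings within the noise floor
                touches.append((r, c, delta))
    return touches

# Simulated 3x3 skin patch with a single pressed cell at (1, 2)
fake = lambda r, c: 2.5 if (r, c) == (1, 2) else 1.0
print(scan_array(3, 3, fake, baseline=1.0))  # [(1, 2, 1.5)]
```

A real robot skin would also need debouncing, calibration drift handling, and much faster scanning, but the row-and-column sweep is the core idea.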

One team is located at the University of California, Berkeley and the other at Stanford University. The Berkeley team headed by Ali Javey, associate professor of electrical engineering and computer sciences has named their artificial skin ‘e-skin’. From the article by Dan Nosowitz on the Fast Company website,

Researchers at the University of California at Berkeley, backed by DARPA funding, have come up with a thin prototype material that’s getting science nerds all in a tizzy about the future of robotics.

This material is made from germanium and silicon nanowires grown on a cylinder, then rolled around a sticky polyimide substrate. What does that get you? As CNet says, “The result was a shiny, thin, and flexible electronic material organized into a matrix of transistors, each of which with hundreds of semiconductor nanowires.”

But what takes the material to the next level is the thin layer of pressure-sensitive rubber added to the prototype’s surface, capable of measuring pressures between zero and 15 kilopascals–about the normal range of pressure for a low-intensity human activity, like, say, writing a blog post. Basically, this rubber layer turns the nanowire material into a sort of artificial skin, which is being played up as a miracle material.
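To make the 0-15 kilopascal figure concrete, here is a hedged sketch of converting a raw reading from such a pressure-sensitive rubber layer into kilopascals. The linear calibration and the 10-bit analog-to-digital converter are assumptions for illustration only; a real device would need a measured calibration curve.

```python
ADC_MAX = 1023     # assumed 10-bit analog-to-digital converter
RANGE_KPA = 15.0   # full-scale pressure quoted in the article

def to_kilopascals(raw: int) -> float:
    """Map a raw ADC count onto 0-15 kPa, clamping out-of-range values."""
    raw = max(0, min(ADC_MAX, raw))
    return RANGE_KPA * raw / ADC_MAX

print(to_kilopascals(0))     # 0.0
print(to_kilopascals(1023))  # 15.0
```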

As Nosowitz points out, this is a remarkable achievement, but it is only a first step, since skin registers pressure, pain, temperature, wetness, and more. Here’s an illustration of Berkeley’s e-skin (Source: University of California Berkeley, accessed from http://berkeley.edu/news/media/releases/2010/09/12_eskin.shtml Sept. 14, 2010),

An artist’s illustration of an artificial e-skin with nanowire active matrix circuitry covering a hand. The fragile egg illustrates the functionality of the e-skin device for prosthetic and robotic applications.

The Stanford team’s approach has some similarities to the Berkeley’s (from Jason Palmer’s BBC article),

“Javey’s work is a nice demonstration of their capability in making a large array of nanowire TFTs [thin film transistors],” said Zhenan Bao of Stanford University, whose group demonstrated the second approach.

The heart of Professor Bao’s devices is a micro-structured rubber sheet in the middle of the TFT – effectively re-creating the functionality of the Berkeley group’s skins with fewer layers.

“Instead of laminating a pressure-sensitive resistor array on top of a nanowire TFT array, we made our transistors to be pressure sensitive,” Professor Bao explained to BBC News.
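Bao's idea can be sketched numerically: if the micro-structured rubber serves as the transistor's gate dielectric, then compressing it thins the dielectric, raises the gate capacitance, and with it the drain current. The parallel-plate estimate and every constant below are illustrative assumptions, not figures from the Stanford paper.

```python
# Illustrative constants (assumed)
EPSILON = 2.2e-11        # dielectric permittivity of the rubber, F/m
AREA = 1.0e-6            # gate area, m^2
REST_THICKNESS = 5.0e-6  # rubber thickness with no load, m

def drain_current(compression: float, k: float = 1.0e4) -> float:
    """Relative drain current for a rubber dielectric compressed by the
    given fraction (0 = no load).

    Current is taken as proportional to gate capacitance, which rises
    as the dielectric thins (simple parallel-plate estimate).
    """
    thickness = REST_THICKNESS * (1.0 - compression)
    capacitance = EPSILON * AREA / thickness
    return k * capacitance

# Pressing the rubber (10% compression) increases the current.
print(drain_current(0.10) > drain_current(0.0))  # True
```

This is why the transistor itself becomes the pressure sensor, removing the need to laminate a separate resistor array on top.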

Here’s a short video about the Stanford team’s work (Source: Stanford University, accessed from http://news.stanford.edu/news/2010/september/sensitive-artificial-skin-091210.html Sept. 14, 2010),

Both approaches to the ‘pressure problem’ have at least one shortcoming: according to Palmer’s BBC article, the Berkeley team’s e-skin is less sensitive than Stanford’s, while the Stanford team’s artificial skin is less flexible. I also noticed that the Berkeley team, at least, is being funded by DARPA ([US Dept. of Defense] Defense Advanced Research Projects Agency), so I’m assuming a fair degree of military interest, which always gives me pause. Nonetheless, bravo to both teams.