Tag Archives: artificial skin

The sense of touch via artificial skin

Scientists have been working for years to allow artificial skin to transmit what the brain would recognize as the sense of touch. For anyone who has lost a limb and gotten a prosthetic replacement, the loss of touch is reputedly one of the more difficult losses to accept. Touch is also vital in robotics if the field is to expand into activities that rely on it, e.g., how much pressure do you use to grasp a cup; how much strength do you apply when moving an object from one place to another?

For anyone interested in the ‘electronic skin and pursuit of touch’ story, I have a Nov. 15, 2013 posting which highlights the evolution of the research into e-skin and what was then some of the latest work.

This posting is a 2015 update of sorts featuring the latest e-skin research from Stanford University and Xerox PARC. (Dexter Johnson in an Oct. 15, 2015 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] site) provides a good research summary.) For anyone with an appetite for more, there’s this from an Oct. 15, 2015 American Association for the Advancement of Science (AAAS) news release on EurekAlert,

Using flexible organic circuits and specialized pressure sensors, researchers have created an artificial “skin” that can sense the force of static objects. Furthermore, they were able to transfer these sensory signals to the brain cells of mice in vitro using optogenetics. For the many people around the world living with prosthetics, such a system could one day allow them to feel sensation in their artificial limbs. To create the artificial skin, Benjamin Tee et al. developed a specialized circuit out of flexible, organic materials. It translates static pressure into digital signals that depend on how much mechanical force is applied. A particular challenge was creating sensors that can “feel” the same range of pressure that humans can. Thus, on the sensors, the team used carbon nanotubes molded into pyramidal microstructures, which are particularly effective at tunneling the signals from the electric field of nearby objects to the receiving electrode in a way that maximizes sensitivity. Transferring the digital signal from the artificial skin system to the cortical neurons of mice proved to be another challenge, since conventional light-sensitive proteins used in optogenetics do not stimulate neural spikes for sufficient durations for these digital signals to be sensed. Tee et al. therefore engineered new optogenetic proteins able to accommodate longer intervals of stimulation. Applying these newly engineered optogenetic proteins to fast-spiking interneurons of the somatosensory cortex of mice in vitro sufficiently prolonged the stimulation interval, allowing the neurons to fire in accordance with the digital stimulation pulse. These results indicate that the system may be compatible with other fast-spiking neurons, including peripheral nerves.

And, there’s an Oct. 15, 2015 Stanford University news release on EurekAlert describing this work from another perspective,

The heart of the technique is a two-ply plastic construct: the top layer creates a sensing mechanism and the bottom layer acts as the circuit to transport electrical signals and translate them into biochemical stimuli compatible with nerve cells. The top layer in the new work featured a sensor that can detect pressure over the same range as human skin, from a light finger tap to a firm handshake.

Five years ago, Bao’s [Zhenan Bao, a professor of chemical engineering at Stanford,] team members first described how to use plastics and rubbers as pressure sensors by measuring the natural springiness of their molecular structures. They then increased this natural pressure sensitivity by indenting a waffle pattern into the thin plastic, which further compresses the plastic’s molecular springs.

To exploit this pressure-sensing capability electronically, the team scattered billions of carbon nanotubes through the waffled plastic. Putting pressure on the plastic squeezes the nanotubes closer together and enables them to conduct electricity.

This allowed the plastic sensor to mimic human skin, which transmits pressure information as short pulses of electricity, similar to Morse code, to the brain. Increasing pressure on the waffled nanotubes squeezes them even closer together, allowing more electricity to flow through the sensor, and those varied impulses are sent as short pulses to the sensing mechanism. Remove pressure, and the flow of pulses relaxes, indicating light touch. Remove all pressure and the pulses cease entirely.

The team then hooked this pressure-sensing mechanism to the second ply of their artificial skin, a flexible electronic circuit that could carry pulses of electricity to nerve cells.

Importing the signal

Bao’s team has been developing flexible electronics that can bend without breaking. For this project, team members worked with researchers from PARC, a Xerox company, which has a technology that uses an inkjet printer to deposit flexible circuits onto plastic. Covering a large surface is important to making artificial skin practical, and the PARC collaboration offered that prospect.

Finally the team had to prove that the electronic signal could be recognized by a biological neuron. It did this by adapting a technique developed by Karl Deisseroth, a fellow professor of bioengineering at Stanford who pioneered a field that combines genetics and optics, called optogenetics. Researchers bioengineer cells to make them sensitive to specific frequencies of light, then use light pulses to switch cells, or the processes being carried on inside them, on and off.

For this experiment the team members engineered a line of neurons to simulate a portion of the human nervous system. They translated the electronic pressure signals from the artificial skin into light pulses, which activated the neurons, proving that the artificial skin could generate a sensory output compatible with nerve cells.

Optogenetics was only used as an experimental proof of concept, Bao said, and other methods of stimulating nerves are likely to be used in real prosthetic devices. Bao’s team has already worked with Bianxiao Cui, an associate professor of chemistry at Stanford, to show that direct stimulation of neurons with electrical pulses is possible.

Bao’s team envisions developing different sensors to replicate, for instance, the ability to distinguish corduroy versus silk, or a cold glass of water from a hot cup of coffee. This will take time. There are six types of biological sensing mechanisms in the human hand, and the experiment described in Science reports success in just one of them.

But the current two-ply approach means the team can add sensations as it develops new mechanisms. And the inkjet printing fabrication process suggests how a network of sensors could be deposited over a flexible layer and folded over a prosthetic hand.

“We have a lot of work to take this from experimental to practical applications,” Bao said. “But after spending many years in this work, I now see a clear path where we can take our artificial skin.”
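For readers who like to tinker, here’s a toy Python sketch of the pulse-frequency encoding the Stanford release describes, where more pressure means more electricity and a faster train of Morse-code-like pulses. To be clear, the mapping and all the numbers below are my own illustrative assumptions, not values from the published device,

```python
# Toy model of a "digital mechanoreceptor": pressure compresses the
# nanotube-filled waffle structure, conductance rises, and the circuit
# encodes that as a pulse (spike) rate. The linear mapping and the
# constants are illustrative assumptions, not the team's actual design.

def pulse_frequency_hz(pressure_kpa, max_pressure_kpa=100.0, max_freq_hz=200.0):
    """Map applied pressure to an output pulse rate in hertz.

    Zero pressure -> no pulses; the response saturates at max_pressure_kpa.
    """
    if pressure_kpa <= 0:
        return 0.0
    fraction = min(pressure_kpa / max_pressure_kpa, 1.0)
    return fraction * max_freq_hz

# A light finger tap vs. a firm handshake (rough example pressures)
light_tap = pulse_frequency_hz(5.0)    # low pulse rate
handshake = pulse_frequency_hz(50.0)   # higher pulse rate
print(light_tap, handshake)  # → 10.0 100.0
```

The point of the sketch is just the shape of the encoding: remove pressure and the pulses relax; remove all pressure and they cease entirely, exactly as the release describes.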

Here’s a link to and a citation for the paper,

A skin-inspired organic digital mechanoreceptor by Benjamin C.-K. Tee, Alex Chortos, Andre Berndt, Amanda Kim Nguyen, Ariane Tom, Allister McGuire, Ziliang Carter Lin, Kevin Tien, Won-Gyu Bae, Huiliang Wang, Ping Mei, Ho-Hsiu Chou, Bianxiao Cui, Karl Deisseroth, Tse Nga Ng, & Zhenan Bao. Science 16 October 2015 Vol. 350 no. 6258 pp. 313-316 DOI: 10.1126/science.aaa9306

This paper is behind a paywall.

Chameleon-like artificial skin

A March 12, 2015 news item on phys.org describes artificial skin inspired by chameleons,

Borrowing a trick from nature, engineers from the University of California at Berkeley have created an incredibly thin, chameleon-like material that can be made to change color—on demand—by simply applying a minute amount of force.

This new material-of-many-colors offers intriguing possibilities for an entirely new class of display technologies, color-shifting camouflage, and sensors that can detect otherwise imperceptible defects in buildings, bridges, and aircraft.

“This is the first time anybody has made a flexible chameleon-like skin that can change color simply by flexing it,” said Connie J. Chang-Hasnain, a member of the Berkeley team and co-author on a paper published today in Optica, The Optical Society’s (OSA) new journal.

A March 12, 2015 OSA news release (also on EurekAlert), which originated the news item, provides more information about this structural color project,

The colors we typically see in paints, fabrics, and other natural substances occur when white, broad spectrum light strikes their surfaces. The unique chemical composition of each surface then absorbs various bands, or wavelengths of light. Those that aren’t absorbed are reflected back, with shorter wavelengths giving objects a blue hue and longer wavelengths appearing redder and the entire rainbow of possible combinations in between. Changing the color of a surface, such as the leaves on the trees in autumn, requires a change in chemical make-up.

Recently, engineers and scientists have been exploring another approach, one that would create designer colors without the use of chemical dyes and pigments. Rather than controlling the chemical composition of a material, it’s possible to control the surface features on the tiniest of scales so they interact and reflect particular wavelengths of light. This type of “structural color” is much less common in nature, but is used by some butterflies and beetles to create a particularly iridescent display of color.

Controlling light with structures rather than traditional optics is not new. In astronomy, for example, evenly spaced slits known as diffraction gratings are routinely used to direct light and spread it into its component colors. Efforts to control color with this technique, however, have proved impractical because the optical losses are simply too great.

The authors of the Optica paper applied a similar principle, though with a radically different design, to achieve the color control they were looking for. In place of slits cut into a film they instead etched rows of ridges onto a single, thin layer of silicon. Rather than spreading the light into a complete rainbow, however, these ridges — or bars — reflect a very specific wavelength of light. By “tuning” the spaces between the bars, it’s possible to select the specific color to be reflected. Unlike the slits in a diffraction grating, however, the silicon bars were extremely efficient and readily reflected the frequency of light they were tuned to.

Fascinatingly, the reflected colors can be selected (from the news release),

Since the spacing, or period, of the bars is the key to controlling the color they reflect, the researchers realized it would be possible to subtly shift the period — and therefore the color — by flexing or bending the material.

“If you have a surface with very precise structures, spaced so they can interact with a specific wavelength of light, you can change its properties and how it interacts with light by changing its dimensions,” said Chang-Hasnain.

Earlier efforts to develop a flexible, color shifting surface fell short on a number of fronts. Metallic surfaces, which are easy to etch, were inefficient, reflecting only a portion of the light they received. Other surfaces were too thick, limiting their applications, or too rigid, preventing them from being flexed with sufficient control.

The Berkeley researchers were able to overcome both these hurdles by forming their grating bars using a semiconductor layer of silicon approximately 120 nanometers thick. Its flexibility was imparted by embedding the silicon bars into a flexible layer of silicone. As the silicone was bent or flexed, the period of the grating spacings responded in kind.

The semiconductor material also allowed the team to create a skin that was incredibly thin, perfectly flat, and easy to manufacture with the desired surface properties. This produces materials that reflect precise and very pure colors and that are highly efficient, reflecting up to 83 percent of the incoming light.

Their initial design, subjected to a change in period of a mere 25 nanometers, created brilliant colors that could be shifted from green to yellow, orange, and red – across a 39-nanometer range of wavelengths. Future designs, the researchers believe, could cover a wider range of colors and reflect light with even greater efficiency.
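If you want to see roughly how the numbers in that last paragraph hang together, here’s a small Python sketch that linearly interpolates the reported figures: a 25-nanometer change in grating period spanning a 39-nanometer range of reflected wavelengths, from green toward red. The baseline wavelength and the linearity are my own assumptions for illustration, not measured data,

```python
# Rough linear sketch of the period-to-color tuning described above.
# The release reports a 25 nm change in grating period shifting the
# reflected wavelength across a 39 nm range (green toward red); the
# starting wavelength and the linear model are illustrative assumptions.

PERIOD_RANGE_NM = 25.0      # reported change in grating period
WAVELENGTH_RANGE_NM = 39.0  # reported reflected-wavelength range

def reflected_wavelength_nm(period_shift_nm, base_wavelength_nm=560.0):
    """Interpolate the reflected wavelength from a shift in grating period.

    base_wavelength_nm (~green) is an assumed starting point.
    """
    shift = max(0.0, min(period_shift_nm, PERIOD_RANGE_NM))
    return base_wavelength_nm + (shift / PERIOD_RANGE_NM) * WAVELENGTH_RANGE_NM

print(reflected_wavelength_nm(0))    # green end → 560.0
print(reflected_wavelength_nm(25))   # red-shifted end → 599.0
```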

Here’s a link to and a citation for the paper,

Flexible photonic metastructures for tunable coloration by Li Zhu, Jonas Kapraun, James Ferrara, and Connie J. Chang-Hasnain. Optica, Vol. 2, Issue 3, pp. 255-258 (2015)
http://dx.doi.org/10.1364/OPTICA.2.000255

This paper is open access (for now at least).

Final note: I recently wrote about research into how real chameleons are able to effect colour changes in a March 16, 2015 post.

Electronic skin and its evolution

Michael Berger has featured an article from the journal Advanced Materials, which reviews 25 years of work on e-skin (aka electronic skin or artificial skin), in his Nov. 15, 2013 Nanowerk Spotlight series article,

Advances in materials, fabrication strategies and device designs for flexible and stretchable electronics and sensors make it possible to envision a not-too-distant future where ultra-thin, flexible circuits based on inorganic semiconductors can be wrapped and attached to any imaginable surface, including body parts and even internal organs. Robotic technologies will also benefit as it becomes possible to fabricate electronic skin (‘e-skin’) that, for instance, could allow surgical robots to interact, in a soft contacting mode, with their surroundings through touch. In addition to giving robots a finer sense of touch, engineers believe that e-skin technology could also be used to create things like wallpapers that double as touchscreen displays and dashboard laminates that allow drivers to adjust electronic controls with the wave of a hand.

Here’s a link to and a citation for the 25-year review of work on e-skin,

25th Anniversary Article: The Evolution of Electronic Skin (E-Skin): A Brief History, Design Considerations, and Recent Progress by Mallory L. Hammock, Alex Chortos, Benjamin C.-K. Tee, Jeffrey B.-H. Tok, and Zhenan Bao. Advanced Materials Volume 25, Issue 42, pages 5997–6038, November 13, 2013 Article first published online: 22 OCT 2013 DOI: 10.1002/adma.201302240

© 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

The review article is behind a paywall but Berger’s synopsis offers a good overview* and tidbits such as this timeline (Berger offers a larger version) which includes important moments in science fiction (Note: Links in the caption have been removed),

Figure 1. A brief chronology of the evolution of e-skin. We emphasize several science fictional events in popular culture that inspired subsequent critical technological advancements in the development of e-skin. Images reproduced with permission: “micro-structured pressure sensor,”[18] “stretchable OLEDs,”[20b] “stretchable OPVs,”[21a] “stretchable, transparent e-skin,”[22] “macroscale nanowire e-skin,”[23a] “rechargeable, stretchable batteries,”[137] “interlocked e-skin.”[25] Copyright, respectively, 2010, 2009, 2012, 2005, 2010, 2013, 2012. Macmillan Publishers Ltd. “Flexible, active-matrix e-skin” image reproduced with permission.[26a] Copyright, 2004. National Academy of Sciences USA. “Epidermal electronics” image reproduced with permission.[390a] Copyright, American Association for the Advancement of Science. “Stretchable batteries” image reproduced with permission.[27] “Infrared e-skin” image reproduced with permission.[8b] Copyright 2001, IEEE. “Anthropomorphic cybernetic hand” image reproduced with permission.[426] Copyright 2006, IEEE. [downloaded from http://onlinelibrary.wiley.com.proxy.lib.sfu.ca/doi/10.1002/adma.201302240/full]

Here’s an excerpt from the review article outlining the 1970s – 1990s period featuring some of the science fiction which has influenced the science (Note: Links have been removed),

The prospect of creating artificial skin was in many ways inspired by science fiction, which propelled the possibility of e-skin into the imagination of both the general public as well as the scientific community. One of the first science fiction books to explore the use of mechanical replacement organs was Caidin’s Cyborg in 1971, on which the famed Six Million Dollar Man television series about a man with a bionic replacement arm and eye was later based (1974).[4] Shortly after, at the beginning of the 1980s, George Lucas created a vision of a future with e-skin in the famous Star Wars series. In particular, he depicted a scene showing a medical robot installing an electronic hand with full sensory perception on the main character, Luke Skywalker.[5] Shortly after, in 1984, the Terminator movie series depicted humanoid robots and even a self-healing robot.[6] These fictitious renditions of e-skin took place against a real-life backdrop of vibrant microelectronics research that began bridging science fiction with scientific reality.

Early technological advancements in the development of e-skin were concomitant with their science fiction inspirations. In 1974, Clippinger et al. demonstrated a prosthetic hand capable of discrete sensor feedback.[7] Nearly a decade later, Hewlett-Packard (HP) marketed a personal computer (HP-150) that was equipped with a touchscreen, allowing users to activate functions by simply touching the display. It was the first mass-marketed electronic device capitalizing on the intuitive nature of human touch. In 1985, General Electric (GE) built the first sensitive skin for a robotic arm using discrete infrared sensors placed on a flexible sheet at a resolution of ≈5 cm.[8] The fabricated sensitive skin was proximally aware of its surroundings, allowing the robot’s arm to avert potential obstacles and effectively maneuver within its physical environment. Despite the robotic arm’s lack of fingers and low resolution, it was capable of demonstrating that electronics integrated into a membrane could allow for natural human–machine interaction. For example, the robotic arm was able to ‘dance’ with a ballerina without any pre-programmed motions.[8] In addition to the ability of an artificial skin to interact with its surroundings, it is equally critical that the artificial skin mimics the mechanical properties of human skin to accommodate its various motions. Hence, to build life-like prosthetics or humanoid robots, soft, flexible, and stretchable electronics needed to be developed.

In the 1990s, scientists began using flexible electronic materials to create large-area, low-cost and printable sensor sheets. Jiang et al. proposed one of the first flexible sensor sheets for tactile shear force sensing by creating silicon (Si) micro-electro-mechanical (MEM) islands by etching thin Si wafers and integrating them on flexible polyimide foils.[9] Much work has since been done to enhance the reliability of large sensor sheets to mechanical bending.[10] Around the same time, flexible arrays fabricated from organic semiconductors began to emerge that rivaled the performance of amorphous Si.[11]

Just before the turn of the millennium, the first “Sensitive Skin Workshop” was held in Washington DC under the aegis of the National Science Foundation and the Defense Advanced Research Projects Agency, bringing together approximately sixty researchers from different sectors of academia, industry, and government. It was discovered that there was significant industrial interest in e-skins for various applications, ranging from robotics to health care. A summary of concepts outlined in the workshop was compiled by Lumelsky et al.[12] In the early 2000s, the pace of e-skin development significantly increased as a result of this workshop, and researchers began to explore different types of sensors that could be more easily integrated with microprocessors.

I have written about e-skin a number of times, most recently in a July 9, 2013 posting about work on flexible sensors and gold nanoparticles being conducted at Technion-Israel Institute of Technology. This review helps to contextualize projects such as the one at Technion and elsewhere.

*To avoid redundancy ‘synopsis’ was replaced by ‘overview’ on Oct. 19, 2015.

Feeling artificial skin

In reading about some of the latest work on artificial skin and feeling, I was reminded of a passage from a description of the ‘uncanny valley’ by Masahiro Mori (excerpted from my March 10, 2011 posting about robots [geminoid robots, in particular])

… this kind of prosthetic hand is too real and when we notice it is prosthetic, we have a sense of strangeness. So if we shake the hand, we are surprised by the lack of soft tissue and cold temperature.

According to a March 29, 2012 news item on Nanowerk, this state of affairs is about to change,

Sooner than later, robots may have the ability to “feel.” In a paper published online March 26 in Advanced Functional Materials (“Mechanical Resuscitation of Chemical Oscillations in Belousov–Zhabotinsky Gels”), a team of researchers from the University of Pittsburgh [Pitt] and the Massachusetts Institute of Technology (MIT) demonstrated that a nonoscillating gel can be resuscitated in a fashion similar to a medical cardiopulmonary resuscitation. These findings pave the way for the development of a wide range of new applications that sense mechanical stimuli and respond chemically—a natural phenomenon few materials have been able to mimic.

“Think of it like human skin, which can provide signals to the brain that something on the body is deformed or hurt,” says Balazs [Anna Balazs, Distinguished Professor of Chemical and Petroleum Engineering in Pitt’s Swanson School of Engineering]. “This gel has numerous far-reaching applications, such as artificial skin that could be sensory—a holy grail in robotics.”

The Pitt March 29, 2012 news release reveals some of the personal motivation behind the research,

“My mother would often tease me when I was young, saying I was like a mimosa plant— shy and bashful,” says Balazs. “As a result, I became fascinated with the plant and its unique hide-and-seek qualities—the plant leaves fold inward and droop when touched or shaken, reopening just minutes later. I knew there had to be a scientific application regarding touch, which led me to studies like this in mechanical and chemical energy.”

Here’s a more technical description of the joint Pitt/MIT research team’s work (from the Pitt news release),

A team of researchers at Pitt made predictions regarding the behavior of Belousov-Zhabotinsky (BZ) gel, a material that was first fabricated in the late 1990s and shown to pulsate in the absence of any external stimuli. In fact, under certain conditions, the gel sitting in a petri dish resembles a beating heart.

Along with her colleagues, [Balazs] predicted that BZ gel not previously oscillating could be re-excited by mechanical pressure. The prediction was actualized by MIT researchers, who proved that chemical oscillations can be triggered by mechanically compressing the BZ gel beyond a critical stress.

I’m always fascinated by what motivates people and so Balazs’s story about the mimosa strikes me as both charming and instructive as to the sources for creative inspiration in any field.

If I read the news release rightly, we’ve still got a long way to go before ‘seeing’ robots with skin that can ‘feel’.

Nanotechnology-enabled robot skin

We take it for granted most of the time. The ability to sense pressure and respond appropriately doesn’t seem like any great gift, but without it, you’d crush fragile objects or be unable to hold onto heavy ones.

It’s this ability to sense pressure that’s a stumbling block for robot makers who want to move robots into jobs that require some dexterity, e.g., one that could clean your windows and your walls without damaging the former or failing to clean the latter.

Two research teams have recently published papers about their work on solving the ‘pressure problem’. From the article by Jason Palmer for BBC News,

The materials, which can sense pressure as sensitively and quickly as human skin, have been outlined by two groups reporting in [the journal] Nature Materials.

The skins are arrays of small pressure sensors that convert tiny changes in pressure into electrical signals.

The arrays are built into or under flexible rubber sheets that could be stretched into a variety of shapes.

The materials could be used to sheath artificial limbs or to create robots that can pick up and hold fragile objects. They could also be used to improve tools for minimally-invasive surgery.

One team is located at the University of California, Berkeley and the other at Stanford University. The Berkeley team, headed by Ali Javey, associate professor of electrical engineering and computer sciences, has named its artificial skin ‘e-skin’. From the article by Dan Nosowitz on the Fast Company website,

Researchers at the University of California at Berkeley, backed by DARPA funding, have come up with a thin prototype material that’s getting science nerds all in a tizzy about the future of robotics.

This material is made from germanium and silicon nanowires grown on a cylinder, then rolled around a sticky polyimide substrate. What does that get you? As CNet says, “The result was a shiny, thin, and flexible electronic material organized into a matrix of transistors, each of which with hundreds of semiconductor nanowires.”

But what takes the material to the next level is the thin layer of pressure-sensitive rubber added to the prototype’s surface, capable of measuring pressures between zero and 15 kilopascals–about the normal range of pressure for a low-intensity human activity, like, say, writing a blog post. Basically, this rubber layer turns the nanowire material into a sort of artificial skin, which is being played up as a miracle material.
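Since the e-skin is described as a matrix of transistors topped with a rubber layer sensing roughly zero to 15 kilopascals, here’s a small Python sketch of what reading such a sensor grid might look like. The clamping and the normalized readout are my own assumptions for illustration, not Berkeley’s actual circuit,

```python
# Illustrative sketch of reading a small e-skin sensor matrix: each
# element carries an applied pressure in kilopascals, and the rubber
# layer is described as sensing about 0-15 kPa. The clamping and the
# normalized 0.0-1.0 "reading" are assumptions, not the actual design.

SENSE_MIN_KPA = 0.0
SENSE_MAX_KPA = 15.0

def read_matrix(pressure_map):
    """Return normalized readings (0.0-1.0) for a 2D pressure map (kPa)."""
    readings = []
    for row in pressure_map:
        out_row = []
        for p in row:
            clamped = max(SENSE_MIN_KPA, min(p, SENSE_MAX_KPA))
            out_row.append(clamped / SENSE_MAX_KPA)
        readings.append(out_row)
    return readings

# A 2x2 patch: light touch, firm press, out-of-range press, no contact
print(read_matrix([[3.0, 15.0], [30.0, 0.0]]))
# → [[0.2, 1.0], [1.0, 0.0]]
```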

As Nosowitz points out, this is a remarkable achievement, but it is only a first step, since skin registers not just pressure but also pain, temperature, wetness, and more. Here’s an illustration of Berkeley’s e-skin (Source: University of California Berkeley, accessed from http://berkeley.edu/news/media/releases/2010/09/12_eskin.shtml Sept. 14, 2010),

An artist’s illustration of an artificial e-skin with nanowire active matrix circuitry covering a hand. The fragile egg illustrates the functionality of the e-skin device for prosthetic and robotic applications.

The Stanford team’s approach has some similarities to the Berkeley team’s (from Jason Palmer’s BBC article),

“Javey’s work is a nice demonstration of their capability in making a large array of nanowire TFTs [thin film transistors],” said Zhenan Bao of Stanford University, whose group demonstrated the second approach.

The heart of Professor Bao’s devices is a micro-structured rubber sheet in the middle of the TFT – effectively re-creating the functionality of the Berkeley group’s skins with fewer layers.

“Instead of laminating a pressure-sensitive resistor array on top of a nanowire TFT array, we made our transistors to be pressure sensitive,” Professor Bao explained to BBC News.

Here’s a short video about the Stanford team’s work (Source: Stanford University, accessed from http://news.stanford.edu/news/2010/september/sensitive-artificial-skin-091210.html Sept. 14, 2010),

Both approaches to the ‘pressure problem’ have at least one shortcoming. The Berkeley team’s e-skin is less sensitive than Stanford’s, while the Stanford team’s artificial skin is less flexible than e-skin, as per Palmer’s BBC article. Also, I noticed that the Berkeley team at least is being funded by DARPA ([US Dept. of Defense] Defense Advanced Research Projects Agency), so I’m assuming a fair degree of military interest, which always gives me pause. Nonetheless, bravo to both teams.