Tag Archives: artificial skin

Robots with living human skin tissue?

So far, it looks like they’ve managed a single robotic finger. I expect it will take a great deal more work before an entire robotic hand is covered in living skin. BTW, I have a few comments at the end of this post.

Caption: Illustration showing the cutting and healing process of the robotic finger (A), its anchoring structure (B) and fabrication process (C). Credit: ©2022 Takeuchi et al.

I have two news releases highlighting the work. This is a June 9, 2022 Cell Press news release,

From action heroes to villainous assassins, biohybrid robots made of both living and artificial materials have been at the center of many sci-fi fantasies, inspiring today’s robotic innovations. It’s still a long way until human-like robots walk among us in our daily lives, but scientists from Japan are bringing us one step closer by crafting living human skin on robots. The method developed, presented June 9 in the journal Matter, not only gave a robotic finger skin-like texture, but also water-repellent and self-healing functions.

“The finger looks slightly ‘sweaty’ straight out of the culture medium,” says first author Shoji Takeuchi, a professor at the University of Tokyo, Japan. “Since the finger is driven by an electric motor, it is also interesting to hear the clicking sounds of the motor in harmony with a finger that looks just like a real one.”

Looking “real” like a human is one of the top priorities for humanoid robots that are often tasked to interact with humans in healthcare and service industries. A human-like appearance can improve communication efficiency and evoke likability. While current silicone skin made for robots can mimic human appearance, it falls short when it comes to delicate textures like wrinkles and lacks skin-specific functions. Attempts at fabricating living skin sheets to cover robots have also had limited success, since it’s challenging to conform them to dynamic objects with uneven surfaces.

“With that method, you have to have the hands of a skilled artisan who can cut and tailor the skin sheets,” says Takeuchi. “To efficiently cover surfaces with skin cells, we established a tissue molding method to directly mold skin tissue around the robot, which resulted in a seamless skin coverage on a robotic finger.”

To craft the skin, the team first submerged the robotic finger in a cylinder filled with a solution of collagen and human dermal fibroblasts, the two main components that make up the skin’s connective tissues. Takeuchi says the study’s success lies within the natural shrinking tendency of this collagen and fibroblast mixture, which shrank and tightly conformed to the finger. Like paint primers, this layer provided a uniform foundation for the next coat of cells—human epidermal keratinocytes—to stick to. These cells make up 90% of the outermost layer of skin, giving the robot a skin-like texture and moisture-retaining barrier properties.

The crafted skin had enough strength and elasticity to bear the dynamic movements as the robotic finger curled and stretched. The outermost layer was thick enough to be lifted with tweezers and repelled water, which provides various advantages in performing specific tasks like handling electrostatically charged tiny polystyrene foam, a material often used in packaging. When wounded, the crafted skin could even self-heal like humans’ with the help of a collagen bandage, which gradually morphed into the skin and withstood repeated joint movements.

“We are surprised by how well the skin tissue conforms to the robot’s surface,” says Takeuchi. “But this work is just the first step toward creating robots covered with living skin.” The developed skin is much weaker than natural skin and can’t survive long without constant nutrient supply and waste removal. Next, Takeuchi and his team plan to address those issues and incorporate more sophisticated functional structures within the skin, such as sensory neurons, hair follicles, nails, and sweat glands.

“I think living skin is the ultimate solution to give robots the look and touch of living creatures since it is exactly the same material that covers animal bodies,” says Takeuchi.

A June 10, 2022 University of Tokyo news release (also on EurekAlert but published June 9, 2022) covers some of the same ground while providing more technical details,

Researchers from the University of Tokyo pool knowledge of robotics and tissue culturing to create a controllable robotic finger covered with living skin tissue. The robotic digit had living cells and supporting organic material grown on top of it for ideal shaping and strength. As the skin is soft and can even heal itself, it could be useful in applications that require a gentle touch but also robustness. The team aims to add other kinds of cells into future iterations, giving devices the ability to sense as we do.

Professor Shoji Takeuchi is a pioneer in the field of biohybrid robots, the intersection of robotics and bioengineering. Together with researchers from around the University of Tokyo, he explores things such as artificial muscles, synthetic odor receptors, lab-grown meat, and more. His most recent creation is both inspired by and aims to aid medical research on skin damage such as deep wounds and burns, as well as help advance manufacturing.

“We have created a working robotic finger that articulates just as ours does, and is covered by a kind of artificial skin that can heal itself,” said Takeuchi. “Our skin model is a complex three-dimensional matrix that is grown in situ on the finger itself. It is not grown separately then cut to size and adhered to the device; our method provides a more complete covering and is more strongly anchored too.”

Three-dimensional skin models have been used for some time for cosmetic and drug research and testing, but this is the first time such materials have been used on a working robot. In this case, the synthetic skin is made from a lightweight collagen matrix known as a hydrogel, within which several kinds of living skin cells called fibroblasts and keratinocytes are grown. The skin is grown directly on the robotic component which proved to be one of the more challenging aspects of this research, requiring specially engineered structures that can anchor the collagen matrix to them, but it was worth it for the aforementioned benefits.

“Our creation is not only soft like real skin but can repair itself if cut or damaged in some way. So we imagine it could be useful in industries where in situ repairability is important as are humanlike qualities, such as dexterity and a light touch,” said Takeuchi. “In the future, we will develop more advanced versions by reproducing some of the organs found in skin, such as sensory cells, hair follicles and sweat glands. Also, we would like to try to coat larger structures.”

The main long-term aim for this research is to open up new possibilities in advanced manufacturing industries. Having humanlike manipulators could allow for the automation of things currently only achievable by highly skilled professionals. Other areas such as cosmetics, pharmaceuticals and regenerative medicine could also benefit. This could potentially reduce cost, time and complexity of research in these areas and could even reduce the need for animal testing.

Here’s a link to and a citation for the paper,

Living skin on a robot by Michio Kawai, Minghao Nie, Haruka Oda, Yuya Morimoto, and Shoji Takeuchi. Matter DOI: https://doi.org/10.1016/j.matt.2022.05.019 Published: June 9, 2022

This paper appears to be open access.

There are more images and there’s at least one video, all of which can be found by clicking on the links to one or both of the news releases and to the paper. Personally, I found the images fascinating and …

Frankenstein, cyborgs, and more

The word is creepy. I find the robot finger images fascinating and creepy. The work brings to mind Frankenstein (by Mary Shelley) and The Island of Dr. Moreau (by H. G. Wells), both of which feature cautionary tales. Dr. Frankenstein tries to bring to life a dead ‘person’ assembled from parts of various corpses, and Dr. Moreau attempts to create hybrids composed of humans and animals. It’s fascinating how 19th century nightmares prefigure some of the research being performed now.

The work also brings to mind the ‘uncanny valley’, a term coined by Masahiro Mori, where people experience discomfort when something that’s not human seems too human. (I have an excerpt from an essay that Mori wrote about the uncanny valley in my March 10, 2011 posting; scroll down about 50% of the way.) The diagram which accompanies it illustrates the gap between the least uncanny or the familiar (a healthy person, a puppet, etc.) and the most uncanny or the unfamiliar (a corpse, a zombie, a prosthetic hand).

Mori notes that the uncanny valley is not immovable; things change and the unfamiliar becomes familiar. Presumably, one day, I will no longer find robots with living skin to be creepy.

All of this changes the meaning (for me) of a term I coined for this site, ‘machine/flesh’. At the time, I was thinking of prosthetics and implants and how deeply they are being integrated into the body. But this research reverses the process. Now, the body (skin in this case) is being added to the machine (robot).

Iron oxide nanoparticles for artificial skin with super powers

A January 28, 2019 news item on ScienceDaily describes the possibilities for a skin replacement material,

A new type of sensor could lead to artificial skin that someday helps burn victims ‘feel’ and safeguards the rest of us, University of Connecticut researchers suggest in a paper in Advanced Materials.

Our skin’s ability to perceive pressure, heat, cold, and vibration is a critical safety function that most people take for granted. But burn victims, those with prosthetic limbs, and others who have lost skin sensitivity for one reason or another, can’t take it for granted, and often injure themselves unintentionally.

Chemists Islam Mosa from UConn [University of Connecticut], and James Rusling from UConn and UConn Health, along with University of Toronto engineer Abdelsalam Ahmed, wanted to create a sensor that can mimic the sensing properties of skin. Such a sensor would need to be able to detect pressure, temperature, and vibration. But perhaps it could do other things too, the researchers thought.

“It would be very cool if it had abilities human skin does not; for example, the ability to detect magnetic fields, sound waves, and abnormal behaviors,” said Mosa.

A January 22, 2019 UConn news release (also on EurekAlert but dated January 28, 2019), which originated the news item, gives more detail about the work,

Mosa and his colleagues created such a sensor with a silicone tube wrapped in a copper wire and filled with a special fluid made of tiny particles of iron oxide just one billionth of a meter long, called nanoparticles. The nanoparticles rub around the inside of the silicone tube and create an electric current. The copper wire surrounding the silicone tube picks up the current as a signal. When this tube is bumped by something experiencing pressure, the nanoparticles move and the electric signal changes. Sound waves also create waves in the nanoparticle fluid, and the electric signal changes in a different way than when the tube is bumped.

The researchers found that magnetic fields alter the signal too, in a way distinct from pressure or sound waves. Even a person moving around while carrying the sensor changes the electrical current, and the team found they could distinguish between the electrical signals caused by walking, running, jumping, and swimming.

Metal skin might sound like a superhero power, but this skin wouldn’t make the wearer Colossus from the X-men. Rather, Mosa and his colleagues hope it could help burn victims “feel” again, and perhaps act as an early warning for workers exposed to dangerously high magnetic fields. Because the rubber exterior is completely sealed and waterproof, it could also serve as a wearable monitor to alert parents if their child fell into deep water in a pool, for example.

“The inspiration was to make something durable that would last for a very long time, and could detect multiple hazards,” Mosa says. The team has yet to test the sensor for its response to heat and cold, but they suspect it will work for those as well. The next step is to make the sensor in a flat configuration, more like skin, and see if it still works.
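
The release above describes a single sensor whose output changes in distinguishable ways for pressure, sound waves, magnetic fields, and body motion. Purely to make that signal-processing idea concrete, here is a toy Python sketch (my own illustration, not the UConn team’s code): it guesses the disturbance type from one voltage trace using made-up amplitude and frequency thresholds, so the function name and every numerical value are assumptions.

# Toy illustration only -- NOT the UConn team's code. It sketches how the
# kinds of disturbance described above (a slow pressure bump, a sound-wave
# ripple, a steady magnetic offset) might be told apart from one voltage
# trace. All threshold values are made up for the example.
import numpy as np

def classify_trace(voltage, sample_rate_hz):
    """Guess the disturbance type from a 1-D voltage trace (hypothetical heuristics)."""
    v = voltage - np.mean(voltage)               # remove the DC offset for the AC analysis
    rms = np.sqrt(np.mean(v ** 2))               # overall ripple strength
    spectrum = np.abs(np.fft.rfft(v))
    freqs = np.fft.rfftfreq(len(v), d=1.0 / sample_rate_hz)
    dominant_hz = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

    if abs(np.mean(voltage)) > 0.5 and rms < 0.05:     # steady shift, little ripple
        return "magnetic-field-like (steady offset)"
    if dominant_hz > 50:                               # fast oscillation
        return "sound-like (high-frequency ripple)"
    if rms > 0.1:                                      # slow, large excursion
        return "pressure-like (low-frequency bump)"
    return "no clear disturbance"

# Example: a slow 2 Hz 'bump' riding on noise, sampled at 1 kHz
t = np.linspace(0, 1, 1000, endpoint=False)
trace = 0.3 * np.sin(2 * np.pi * 2 * t) + 0.01 * np.random.randn(t.size)
print(classify_trace(trace, sample_rate_hz=1000))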

Here’s a link to and a citation for the paper,

An Ultra‐Shapeable, Smart Sensing Platform Based on a Multimodal Ferrofluid‐Infused Surface by Abdelsalam Ahmed, Islam Hassan, Islam M. Mosa, Esraa Elsanadidy, Mohamed Sharafeldin, James F. Rusling, Shenqiang Ren. Advanced Materials DOI: https://doi.org/10.1002/adma.201807201 First published: 28 January 2019

This paper is behind a paywall.

The sense of touch via artificial skin

Scientists have been working for years to allow artificial skin to transmit what the brain would recognize as the sense of touch. For anyone who has lost a limb and gotten a prosthetic replacement, the loss of touch is reputedly one of the more difficult losses to accept. The sense of touch is also vital in robotics if the field is to expand into activities reliant on touch, e.g., how much pressure do you use to grasp a cup; how much strength do you apply when moving an object from one place to another?

For anyone interested in the ‘electronic skin and pursuit of touch’ story, I have a Nov. 15, 2013 posting which highlights the evolution of the research into e-skin and what was then some of the latest work.

This posting is a 2015 update of sorts featuring the latest e-skin research from Stanford University and Xerox PARC. (Dexter Johnson in an Oct. 15, 2015 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] site) provides a good research summary.) For anyone with an appetite for more, there’s this from an Oct. 15, 2015 American Association for the Advancement of Science (AAAS) news release on EurekAlert,

Using flexible organic circuits and specialized pressure sensors, researchers have created an artificial “skin” that can sense the force of static objects. Furthermore, they were able to transfer these sensory signals to the brain cells of mice in vitro using optogenetics. For the many people around the world living with prosthetics, such a system could one day allow them to feel sensation in their artificial limbs. To create the artificial skin, Benjamin Tee et al. developed a specialized circuit out of flexible, organic materials. It translates static pressure into digital signals that depend on how much mechanical force is applied. A particular challenge was creating sensors that can “feel” the same range of pressure that humans can. Thus, on the sensors, the team used carbon nanotubes molded into pyramidal microstructures, which are particularly effective at tunneling the signals from the electric field of nearby objects to the receiving electrode in a way that maximizes sensitivity. Transferring the digital signal from the artificial skin system to the cortical neurons of mice proved to be another challenge, since conventional light-sensitive proteins used in optogenetics do not stimulate neural spikes for sufficient durations for these digital signals to be sensed. Tee et al. therefore engineered new optogenetic proteins able to accommodate longer intervals of stimulation. Applying these newly engineered optogenetic proteins to fast-spiking interneurons of the somatosensory cortex of mice in vitro sufficiently prolonged the stimulation interval, allowing the neurons to fire in accordance with the digital stimulation pulse. These results indicate that the system may be compatible with other fast-spiking neurons, including peripheral nerves.

And, there’s an Oct. 15, 2015 Stanford University news release on EurekAlert describing this work from another perspective,

The heart of the technique is a two-ply plastic construct: the top layer creates a sensing mechanism and the bottom layer acts as the circuit to transport electrical signals and translate them into biochemical stimuli compatible with nerve cells. The top layer in the new work featured a sensor that can detect pressure over the same range as human skin, from a light finger tap to a firm handshake.

Five years ago, Bao’s [Zhenan Bao, a professor of chemical engineering at Stanford] team members first described how to use plastics and rubbers as pressure sensors by measuring the natural springiness of their molecular structures. They then increased this natural pressure sensitivity by indenting a waffle pattern into the thin plastic, which further compresses the plastic’s molecular springs.

To exploit this pressure-sensing capability electronically, the team scattered billions of carbon nanotubes through the waffled plastic. Putting pressure on the plastic squeezes the nanotubes closer together and enables them to conduct electricity.

This allowed the plastic sensor to mimic human skin, which transmits pressure information as short pulses of electricity, similar to Morse code, to the brain. Increasing pressure on the waffled nanotubes squeezes them even closer together, allowing more electricity to flow through the sensor, and those varied impulses are sent as short pulses to the sensing mechanism. Remove pressure, and the flow of pulses relaxes, indicating light touch. Remove all pressure and the pulses cease entirely.

The team then hooked this pressure-sensing mechanism to the second ply of their artificial skin, a flexible electronic circuit that could carry pulses of electricity to nerve cells.

Importing the signal

Bao’s team has been developing flexible electronics that can bend without breaking. For this project, team members worked with researchers from PARC, a Xerox company, which has a technology that uses an inkjet printer to deposit flexible circuits onto plastic. Covering a large surface is important to making artificial skin practical, and the PARC collaboration offered that prospect.

Finally the team had to prove that the electronic signal could be recognized by a biological neuron. It did this by adapting a technique developed by Karl Deisseroth, a fellow professor of bioengineering at Stanford who pioneered a field that combines genetics and optics, called optogenetics. Researchers bioengineer cells to make them sensitive to specific frequencies of light, then use light pulses to switch cells, or the processes being carried on inside them, on and off.

For this experiment the team members engineered a line of neurons to simulate a portion of the human nervous system. They translated the electronic pressure signals from the artificial skin into light pulses, which activated the neurons, proving that the artificial skin could generate a sensory output compatible with nerve cells.

Optogenetics was only used as an experimental proof of concept, Bao said, and other methods of stimulating nerves are likely to be used in real prosthetic devices. Bao’s team has already worked with Bianxiao Cui, an associate professor of chemistry at Stanford, to show that direct stimulation of neurons with electrical pulses is possible.

Bao’s team envisions developing different sensors to replicate, for instance, the ability to distinguish corduroy versus silk, or a cold glass of water from a hot cup of coffee. This will take time. There are six types of biological sensing mechanisms in the human hand, and the experiment described in Science reports success in just one of them.

But the current two-ply approach means the team can add sensations as it develops new mechanisms. And the inkjet printing fabrication process suggests how a network of sensors could be deposited over a flexible layer and folded over a prosthetic hand.

“We have a lot of work to take this from experimental to practical applications,” Bao said. “But after spending many years in this work, I now see a clear path where we can take our artificial skin.”
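
The Stanford release above describes pressure being encoded much like Morse code: a firmer press produces more frequent pulses. Here is a minimal Python sketch of that idea (my own illustration, not the Stanford/PARC circuit or its actual characteristics); the pressure range, pulse rates, and function names are assumptions chosen only for the example.

# A minimal sketch of the pulse-rate idea described above -- NOT the
# Stanford/PARC device's actual behaviour. A static pressure is mapped to a
# pulse frequency so that a firmer press produces faster pulses; every
# number here is an assumption chosen for illustration.
import numpy as np

def pressure_to_pulse_rate(pressure_kpa, max_pressure_kpa=50.0,
                           min_rate_hz=5.0, max_rate_hz=200.0):
    """Map a static pressure to a pulse frequency: harder press, faster pulses."""
    p = np.clip(pressure_kpa, 0.0, max_pressure_kpa) / max_pressure_kpa
    if p == 0:
        return 0.0                              # no pressure, no pulses
    return min_rate_hz + p * (max_rate_hz - min_rate_hz)

def pulse_train(pressure_kpa, duration_s=0.1, sample_rate_hz=10_000):
    """Generate a square pulse train whose frequency encodes the pressure."""
    rate = pressure_to_pulse_rate(pressure_kpa)
    t = np.arange(0, duration_s, 1.0 / sample_rate_hz)
    if rate == 0:
        return t, np.zeros_like(t)
    return t, (np.sin(2 * np.pi * rate * t) > 0).astype(float)

for p in (0.0, 1.0, 10.0, 40.0):                # light tap through firm squeeze
    print(f"{p:5.1f} kPa -> {pressure_to_pulse_rate(p):6.1f} pulses per second")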

Here’s a link to and a citation for the paper,

A skin-inspired organic digital mechanoreceptor by Benjamin C.-K. Tee, Alex Chortos, Andre Berndt, Amanda Kim Nguyen, Ariane Tom, Allister McGuire, Ziliang Carter Lin, Kevin Tien, Won-Gyu Bae, Huiliang Wang, Ping Mei, Ho-Hsiu Chou, Bianxiao Cui, Karl Deisseroth, Tse Nga Ng, & Zhenan Bao. Science 16 October 2015 Vol. 350 no. 6258 pp. 313-316 DOI: 10.1126/science.aaa9306

This paper is behind a paywall.

Chameleon-like artificial skin

A March 12, 2015 news item on phys.org describes artificial skin inspired by chameleons,

Borrowing a trick from nature, engineers from the University of California at Berkeley have created an incredibly thin, chameleon-like material that can be made to change color—on demand—by simply applying a minute amount of force.

This new material-of-many-colors offers intriguing possibilities for an entirely new class of display technologies, color-shifting camouflage, and sensors that can detect otherwise imperceptible defects in buildings, bridges, and aircraft.

“This is the first time anybody has made a flexible chameleon-like skin that can change color simply by flexing it,” said Connie J. Chang-Hasnain, a member of the Berkeley team and co-author on a paper published today in Optica, The Optical Society’s (OSA) new journal.

A March 12, 2015 OSA news release (also on EurekAlert), which originated the news item, provides more information about this structural color project,

The colors we typically see in paints, fabrics, and other natural substances occur when white, broad spectrum light strikes their surfaces. The unique chemical composition of each surface then absorbs various bands, or wavelengths of light. Those that aren’t absorbed are reflected back, with shorter wavelengths giving objects a blue hue and longer wavelengths appearing redder and the entire rainbow of possible combinations in between. Changing the color of a surface, such as the leaves on the trees in autumn, requires a change in chemical make-up.

Recently, engineers and scientists have been exploring another approach, one that would create designer colors without the use of chemical dyes and pigments. Rather than controlling the chemical composition of a material, it’s possible to control the surface features on the tiniest of scales so they interact and reflect particular wavelengths of light. This type of “structural color” is much less common in nature, but is used by some butterflies and beetles to create a particularly iridescent display of color.

Controlling light with structures rather than traditional optics is not new. In astronomy, for example, evenly spaced slits known as diffraction gratings are routinely used to direct light and spread it into its component colors. Efforts to control color with this technique, however, have proved impractical because the optical losses are simply too great.

The authors of the Optica paper applied a similar principle, though with a radically different design, to achieve the color control they were looking for. In place of slits cut into a film they instead etched rows of ridges onto a single, thin layer of silicon. Rather than spreading the light into a complete rainbow, however, these ridges — or bars — reflect a very specific wavelength of light. By “tuning” the spaces between the bars, it’s possible to select the specific color to be reflected. Unlike the slits in a diffraction grating, however, the silicon bars were extremely efficient and readily reflected the frequency of light they were tuned to.

Fascinatingly, the reflected colors can be selected (from the news release),

Since the spacing, or period, of the bars is the key to controlling the color they reflect, the researchers realized it would be possible to subtly shift the period — and therefore the color — by flexing or bending the material.

“If you have a surface with very precise structures, spaced so they can interact with a specific wavelength of light, you can change its properties and how it interacts with light by changing its dimensions,” said Chang-Hasnain.

Earlier efforts to develop a flexible, color shifting surface fell short on a number of fronts. Metallic surfaces, which are easy to etch, were inefficient, reflecting only a portion of the light they received. Other surfaces were too thick, limiting their applications, or too rigid, preventing them from being flexed with sufficient control.

The Berkeley researchers were able to overcome both these hurdles by forming their grating bars using a semiconductor layer of silicon approximately 120 nanometers thick. Its flexibility was imparted by embedding the silicon bars into a flexible layer of silicone. As the silicone was bent or flexed, the period of the grating spacings responded in kind.

The semiconductor material also allowed the team to create a skin that was incredibly thin, perfectly flat, and easy to manufacture with the desired surface properties. This produces materials that reflect precise and very pure colors and that are highly efficient, reflecting up to 83 percent of the incoming light.

Their initial design, subjected to a change in period of a mere 25 nanometers, created brilliant colors that could be shifted from green to yellow, orange, and red – across a 39-nanometer range of wavelengths. Future designs, the researchers believe, could cover a wider range of colors and reflect light with even greater efficiency.
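
As a rough back-of-the-envelope check (my own, not the researchers’), if the reflected wavelength λ scales roughly linearly with the grating period Λ, a small change in period produces a proportional change in colour. Taking, purely for illustration, a green wavelength of about 550 nm and a period on the order of 350 nm (assumed values, not figures from the paper):

\[
\frac{\Delta\lambda}{\lambda} \approx \frac{\Delta\Lambda}{\Lambda}
\quad\Rightarrow\quad
\Delta\lambda \approx 550\,\text{nm} \times \frac{25\,\text{nm}}{350\,\text{nm}} \approx 39\,\text{nm}
\]

which lands in the same ballpark as the 39-nanometer tuning range reported above.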

Here’s a link to and a citation for the paper,

Flexible photonic metastructures for tunable coloration by Li Zhu, Jonas Kapraun, James Ferrara, and Connie J. Chang-Hasnain. Optica, Vol. 2, Issue 3, pp. 255-258 (2015) DOI: http://dx.doi.org/10.1364/OPTICA.2.000255

This paper is open access (for now at least).

Final note: I recently wrote about research into how real chameleons are able to effect colour changes in a March 16, 2015 post.

Electronic skin and its evolution

Michael Berger has featured an article from the journal Advanced Materials, which reviews 25 years of work on e-skin (aka electronic skin or artificial skin), in his Nov. 15, 2013 Nanowerk Spotlight article,

Advances in materials, fabrication strategies and device designs for flexible and stretchable electronics and sensors make it possible to envision a not-too-distant future where ultra-thin, flexible circuits based on inorganic semiconductors can be wrapped and attached to any imaginable surface, including body parts and even internal organs. Robotic technologies will also benefit as it becomes possible to fabricate electronic skin (‘e-skin’) that, for instance, could allow surgical robots to interact, in a soft contacting mode, with their surroundings through touch. In addition to giving robots a finer sense of touch, engineers believe that e-skin technology could also be used to create things like wallpapers that double as touchscreen displays and dashboard laminates that allow drivers to adjust electronic controls with the wave of a hand.

Here’s a link to and a citation for the 25-year review of work on e-skin,

25th Anniversary Article: The Evolution of Electronic Skin (E-Skin): A Brief History, Design Considerations, and Recent Progress by Mallory L. Hammock, Alex Chortos, Benjamin C.-K. Tee, Jeffrey B.-H. Tok, and Zhenan Bao. Advanced Materials Volume 25, Issue 42, pages 5997–6038, November 13, 2013. Article first published online: 22 OCT 2013. DOI: 10.1002/adma.201302240

© 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

The review article is behind a paywall but Berger’s synopsis offers a good overview* and tidbits such as this timeline (Berger offers a larger version) which includes important moments in science fiction (Note: Links in the caption have been removed),

Figure 1. A brief chronology of the evolution of e-skin. We emphasize several science fictional events in popular culture that inspired subsequent critical technological advancements in the development of e-skin. Images reproduced with permission: “micro-structured pressure sensor,”[18] “stretchable OLEDs,”[20b] “stretchable OPVs,”[21a] “stretchable, transparent e-skin,”[22] “macroscale nanowire e-skin,”[23a] “rechargeable, stretchable batteries,”[137] “interlocked e-skin.”[25] Copyright, respectively, 2010, 2009, 2012, 2005, 2010, 2013, 2012. Macmillan Publishers Ltd. “Flexible, active-matrix e-skin” image reproduced with permission.[26a] Copyright, 2004. National Academy of Sciences USA. “Epidermal electronics” image reproduced with permission.[390a] Copyright, American Association for the Advancement of Science. “Stretchable batteries” image reproduced with permission.[27] “Infrared e-skin” image reproduced with permission.[8b] Copyright 2001, IEEE. “Anthropomorphic cybernetic hand” image reproduced with permission.[426] Copyright 2006, IEEE. [downloaded from http://onlinelibrary.wiley.com.proxy.lib.sfu.ca/doi/10.1002/adma.201302240/full]

Here’s an excerpt from the review article outlining the 1970s – 1990s period featuring some of the science fiction which has influenced the science (Note: Links have been removed),

The prospect of creating artificial skin was in many ways inspired by science fiction, which propelled the possibility of e-skin into the imagination of both the general public as well as the scientific community. One of the first science fiction books to explore the use of mechanical replacement organs was Caidin’s Cyborg in 1971, on which the famed Six Million Dollar Man television series about a man with a bionic replacement arm and eye was later based (1974).[4] Shortly after, at the beginning of the 1980s, George Lucas created a vision of a future with e-skin in the famous Star Wars series. In particular, he depicted a scene showing a medical robot installing an electronic hand with full sensory perception on the main character, Luke Skywalker.[5] Shortly after, in 1984, the Terminator movie series depicted humanoid robots and even a self-healing robot.[6] These fictitious renditions of e-skin took place against a real-life backdrop of vibrant microelectronics research that began bridging science fiction with scientific reality.

Early technological advancements in the development of e-skin were concomitant with their science fiction inspirations. In 1974, Clippinger et al. demonstrated a prosthetic hand capable of discrete sensor feedback.[7] Nearly a decade later, Hewlett-Packard (HP) marketed a personal computer (HP-150) that was equipped with a touchscreen, allowing users to activate functions by simply touching the display. It was the first mass-marketed electronic device capitalizing on the intuitive nature of human touch. In 1985, General Electric (GE) built the first sensitive skin for a robotic arm using discrete infrared sensors placed on a flexible sheet at a resolution of ≈5 cm.[8] The fabricated sensitive skin was proximally aware of its surroundings, allowing the robot’s arm to avert potential obstacles and effectively maneuver within its physical environment. Despite the robotic arm’s lack of fingers and low resolution, it was capable of demonstrating that electronics integrated into a membrane could allow for natural human–machine interaction. For example, the robotic arm was able to ‘dance’ with a ballerina without any pre-programmed motions.[8] In addition to the ability of an artificial skin to interact with its surroundings, it is equally critical that the artificial skin mimics the mechanical properties of human skin to accommodate its various motions. Hence, to build life-like prosthetics or humanoid robots, soft, flexible, and stretchable electronics needed to be developed.

In the 1990s, scientists began using flexible electronic materials to create large-area, low-cost and printable sensor sheets. Jiang et al. proposed one of the first flexible sensor sheets for tactile shear force sensing by creating silicon (Si) micro-electro-mechanical (MEM) islands by etching thin Si wafers and integrating them on flexible polyimide foils.[9] Much work has since been done to enhance the reliability of large sensor sheets to mechanical bending.[10] Around the same time, flexible arrays fabricated from organic semiconductors began to emerge that rivaled the performance of amorphous Si.[11]

Just before the turn of the millennium, the first “Sensitive Skin Workshop” was held in Washington DC under the aegis of the National Science Foundation and the Defense Advanced Research Projects Agency, bringing together approximately sixty researchers from different sectors of academia, industry, and government. It was discovered that there was significant industrial interest in e-skins for various applications, ranging from robotics to health care. A summary of concepts outlined in the workshop was compiled by Lumelsky et al.[12] In the early 2000s, the pace of e-skin development significantly increased as a result of this workshop, and researchers began to explore different types of sensors that could be more easily integrated with microprocessors.

I have written about e-skin a number of times, most recently in a July 9, 2013 posting about work on flexible sensors and gold nanoparticles being conducted at Technion-Israel Institute of Technology. This review helps to contextualize projects such as the one at Technion and elsewhere.

*To avoid redundancy ‘synopsis’ was replaced by ‘overview’ on Oct. 19, 2015.

Feeling artificial skin

In reading about some of the latest work on artificial skin and feeling, I was reminded of a passage from a description of the ‘uncanny valley’ by Masahiro Mori (excerpted from my March 10, 2011 posting about robots [geminoid robots, in particular])

… this kind of prosthetic hand is too real and when we notice it is prosthetic, we have a sense of strangeness. So if we shake the hand, we are surprised by the lack of soft tissue and cold temperature.

According to a March 29, 2012 news item on Nanowerk, this state of affairs is about to change,

Sooner than later, robots may have the ability to “feel.” In a paper published online March 26 in Advanced Functional Materials (“Mechanical Resuscitation of Chemical Oscillations in Belousov–Zhabotinsky Gels”), a team of researchers from the University of Pittsburgh [Pitt] and the Massachusetts Institute of Technology (MIT) demonstrated that a nonoscillating gel can be resuscitated in a fashion similar to a medical cardiopulmonary resuscitation. These findings pave the way for the development of a wide range of new applications that sense mechanical stimuli and respond chemically—a natural phenomenon few materials have been able to mimic.

“Think of it like human skin, which can provide signals to the brain that something on the body is deformed or hurt,” says Balazs [Anna Balazs, Distinguished Professor of Chemical and Petroleum Engineering in Pitt’s Swanson School of Engineering]. “This gel has numerous far-reaching applications, such as artificial skin that could be sensory—a holy grail in robotics.”

The Pitt March 29, 2012 news release reveals some of the personal motivation behind the research,

“My mother would often tease me when I was young, saying I was like a mimosa plant— shy and bashful,” says Balazs. “As a result, I became fascinated with the plant and its unique hide-and-seek qualities—the plant leaves fold inward and droop when touched or shaken, reopening just minutes later. I knew there had to be a scientific application regarding touch, which led me to studies like this in mechanical and chemical energy.”

Here’s a more technical description of the joint Pitt/MIT research team’s work (from the Pitt news release),

A team of researchers at Pitt made predictions regarding the behavior of Belousov-Zhabotinsky (BZ) gel, a material that was first fabricated in the late 1990s and shown to pulsate in the absence of any external stimuli. In fact, under certain conditions, the gel sitting in a petri dish resembles a beating heart.

Along with her colleagues, [Balazs] predicted that BZ gel not previously oscillating could be re-excited by mechanical pressure. The prediction was actualized by MIT researchers, who proved that chemical oscillations can be triggered by mechanically compressing the BZ gel beyond a critical stress.
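
For readers who like to see the threshold idea in action, here is a generic toy simulation (emphatically not the Pitt/MIT Belousov-Zhabotinsky gel model) using the textbook FitzHugh-Nagumo equations, in which a constant ‘stress’ input pushed past a threshold switches a quiescent system into sustained oscillation; mapping mechanical stress onto a single drive parameter is purely an illustrative assumption.

# A generic toy, NOT the Pitt/MIT Belousov-Zhabotinsky gel model: the
# textbook FitzHugh-Nagumo equations, in which a constant 'stress' input
# pushed past a threshold switches a quiescent system into sustained
# oscillation. Mapping mechanical stress onto a single drive term is an
# assumption made purely for illustration.
import numpy as np

def simulate(stress_input, t_end=200.0, dt=0.01, a=0.7, b=0.8, tau=12.5):
    """Integrate the FitzHugh-Nagumo equations with a constant drive (Euler method)."""
    n = int(t_end / dt)
    v, w = -1.2, -0.6                           # start near the resting state
    vs = np.empty(n)
    for i in range(n):
        dv = v - v ** 3 / 3 - w + stress_input
        dw = (v + a - b * w) / tau
        v += dt * dv
        w += dt * dw
        vs[i] = v
    return vs

for stress in (0.0, 0.5):                       # below vs. above the oscillation threshold
    trace = simulate(stress)
    swing = np.ptp(trace[len(trace) // 2:])     # peak-to-peak after transients
    state = "oscillating" if swing > 1.0 else "quiescent"
    print(f"stress input {stress}: {state} (late peak-to-peak swing {swing:.2f})")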

I’m always fascinated by what motivates people and so Balazs’s story about the mimosa strikes me as both charming and instructive as to the sources for creative inspiration in any field.

If I read the news release rightly, we’ve still got a long way to go before ‘seeing’ robots with skin that can ‘feel’.

Nanotechnology-enabled robot skin

We take it for granted most of the time. The ability to sense pressure and respond appropriately doesn’t seem like any great gift, but without it, you’d crush fragile objects or be unable to hold onto the heavy ones.

It’s this ability to sense pressure that’s a stumbling block for robot makers who want to move robots into jobs that require some dexterity, e.g., a robot that could clean your windows and your walls without damaging one or failing to clean the other.

Two research teams have recently published papers about their work on solving the ‘pressure problem’. From the article by Jason Palmer for BBC News,

The materials, which can sense pressure as sensitively and quickly as human skin, have been outlined by two groups reporting in [the journal] Nature Materials.

The skins are arrays of small pressure sensors that convert tiny changes in pressure into electrical signals.

The arrays are built into or under flexible rubber sheets that could be stretched into a variety of shapes.

The materials could be used to sheath artificial limbs or to create robots that can pick up and hold fragile objects. They could also be used to improve tools for minimally-invasive surgery.

One team is located at the University of California, Berkeley and the other at Stanford University. The Berkeley team, headed by Ali Javey, associate professor of electrical engineering and computer sciences, has named its artificial skin ‘e-skin’. From the article by Dan Nosowitz on the Fast Company website,

Researchers at the University of California at Berkeley, backed by DARPA funding, have come up with a thin prototype material that’s getting science nerds all in a tizzy about the future of robotics.

This material is made from germanium and silicon nanowires grown on a cylinder, then rolled around a sticky polyimide substrate. What does that get you? As CNet says, “The result was a shiny, thin, and flexible electronic material organized into a matrix of transistors, each of which with hundreds of semiconductor nanowires.”

But what takes the material to the next level is the thin layer of pressure-sensitive rubber added to the prototype’s surface, capable of measuring pressures between zero and 15 kilopascals–about the normal range of pressure for a low-intensity human activity, like, say, writing a blog post. Basically, this rubber layer turns the nanowire material into a sort of artificial skin, which is being played up as a miracle material.
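
As a purely hypothetical illustration of how such an array might be read out (not Berkeley’s actual electronics), the short Python sketch below treats the e-skin as a small grid of sensors whose conductance is assumed to rise linearly with pressure, and converts raw readings back into the 0–15 kilopascal range quoted above; the calibration, grid size, and variable names are all invented for the example.

# A hypothetical read-out sketch, NOT Berkeley's actual electronics: treat
# the e-skin as a small grid of sensors whose conductance is assumed to rise
# linearly with pressure, and convert raw readings back into the 0-15 kPa
# range quoted above. The calibration and grid size are invented for the example.
import numpy as np

MAX_PRESSURE_KPA = 15.0                         # upper end of the range quoted above

def conductance_to_pressure(conductance, g_min=0.0, g_max=1.0):
    """Map normalized conductance (assumed linear in pressure) to kilopascals."""
    g = np.clip((conductance - g_min) / (g_max - g_min), 0.0, 1.0)
    return g * MAX_PRESSURE_KPA

# Simulated 4x4 read-out from the transistor matrix: one 'finger press' in a corner
raw = np.zeros((4, 4))
raw[0:2, 0:2] = [[0.8, 0.5], [0.5, 0.2]]
pressure_map = conductance_to_pressure(raw)
print(np.round(pressure_map, 1))                # kPa at each grid point
print("peak pressure:", pressure_map.max(), "kPa")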

As Nosowitz points out, this is a remarkable achievement and it is a first step, since skin registers pressure, pain, temperature, wetness, and more. Here’s an illustration of Berkeley’s e-skin (Source: University of California Berkeley, accessed from http://berkeley.edu/news/media/releases/2010/09/12_eskin.shtml on Sept. 14, 2010),

An artist’s illustration of an artificial e-skin with nanowire active matrix circuitry covering a hand. The fragile egg illustrates the functionality of the e-skin device for prosthetic and robotic applications.

The Stanford team’s approach has some similarities to the Berkeley team’s (from Jason Palmer’s BBC article),

“Javey’s work is a nice demonstration of their capability in making a large array of nanowire TFTs [thin film transistors],” said Zhenan Bao of Stanford University, whose group demonstrated the second approach.

The heart of Professor Bao’s devices is a micro-structured rubber sheet in the middle of the TFT – effectively re-creating the functionality of the Berkeley group’s skins with fewer layers.

“Instead of laminating a pressure-sensitive resistor array on top of a nanowire TFT array, we made our transistors to be pressure sensitive,” Professor Bao explained to BBC News.

Here’s a short video about the Stanford team’s work (Source: Stanford University, accessed from http://news.stanford.edu/news/2010/september/sensitive-artificial-skin-091210.html on Sept. 14, 2010),

Both approaches to the ‘pressure problem’ have at least one shortcoming. The Berkeley team’s e-skin has less sensitivity than Stanford’s, while the Stanford team’s artificial skin is less flexible than e-skin, as per Palmer’s BBC article. Also, I noticed that the Berkeley team at least is being funded by DARPA ([US Dept. of Defense] Defense Advanced Research Projects Agency), so I’m assuming a fair degree of military interest, which always gives me pause. Nonetheless, bravo to both teams.