
Robot skin that feels heat, pain, and pressure

This June 17, 2025 news item on ScienceDaily announces research into a robot skin that more closely mimics biological skin (human and otherwise),

Scientists have developed a low-cost, durable, highly-sensitive robotic ‘skin’ that can be added to robotic hands like a glove, enabling robots to detect information about their surroundings in a way that’s similar to humans.

The researchers, from the University of Cambridge and University College London (UCL), developed the flexible, conductive skin, which is easy to fabricate and can be melted down and formed into a wide range of complex shapes. The technology senses and processes a range of physical inputs, allowing robots to interact with the physical world in a more meaningful way.

A June 11, 2025 University of Cambridge news release (also on EurekAlert) by Sarah Collins, which originated the news item, describes what makes this work a breakthrough,

Unlike other solutions for robotic touch, which typically work via sensors embedded in small areas and require different sensors to detect different types of touch, the entirety of the electronic skin developed by the Cambridge and UCL researchers is a sensor, bringing it closer to our own sensor system: our skin.  

Although the robotic skin is not as sensitive as human skin, it can detect signals from over 860,000 tiny pathways in the material, enabling it to recognise different types of touch and pressure – like the tap of a finger, a hot or cold surface, damage caused by cutting or stabbing, or multiple points being touched at once – in a single material.

The researchers used a combination of physical tests and machine learning techniques to help the robotic skin ‘learn’ which of these pathways matter most, so it can sense different types of contact more efficiently.
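The release doesn’t say how that ‘learning’ works under the hood. Purely as an illustration of one standard approach (not necessarily the authors’ method), here’s a minimal Python sketch that ranks sensing pathways by their mutual information with the touch label; the synthetic data and channel counts are my own inventions,

import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 64))       # 400 synthetic readings across 64 pathways
y = rng.integers(0, 3, size=400)     # three pretend touch classes
X[:, :5] += y[:, None] * 0.7         # make the first five pathways informative

scores = mutual_info_classif(X, y, random_state=0)
print("most informative pathways:", np.argsort(scores)[::-1][:5])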

In addition to potential future applications for humanoid robots or human prosthetics where a sense of touch is vital, the researchers say the robotic skin could be useful in industries as varied as the automotive sector or disaster relief. The results are reported in the journal Science Robotics.

Electronic skins work by converting physical information – like pressure or temperature – into electronic signals. In most cases, different types of sensors are needed for different types of touch – one type of sensor to detect pressure, another for temperature, and so on – which are then embedded into soft, flexible materials. However, the signals from these different sensors can interfere with each other, and the materials are easily damaged.

“Having different sensors for different types of touch leads to materials that are complex to make,” said lead author Dr David Hardman from Cambridge’s Department of Engineering. “We wanted to develop a solution that can detect multiple types of touch at once, but in a single material.”

“At the same time, we need something that’s cheap and durable, so that it’s suitable for widespread use,” said co-author Dr Thomas George Thuruthel from UCL.

Their solution uses one type of sensor that reacts differently to different types of touch, known as multi-modal sensing. While it’s challenging to separate out the cause of each signal, multi-modal sensing materials are easier to make and more robust.

The researchers melted down a soft, stretchy and electrically conductive gelatine-based hydrogel, and cast it into the shape of a human hand. They tested a range of different electrode configurations to determine which gave them the most useful information about different types of touch. From just 32 electrodes placed at the wrist, they were able to collect over 1.7 million pieces of information over the whole hand, thanks to the tiny pathways in the conductive material.
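The release doesn’t show the arithmetic behind those figures, but a back-of-envelope count of four-terminal impedance measurements (my own reading, not the authors’) hints at where a number like 860,000 could come from,

from math import comb

electrodes = 32
drive_pairs = comb(electrodes, 2)         # 496 ways to pick a current-injection pair
sense_pairs = comb(electrodes - 2, 2)     # 435 ways to pick a voltage-sensing pair
patterns = drive_pairs * sense_pairs      # 215,760 distinct four-terminal patterns
print(patterns, patterns * 4)             # both polarities of each pair: 863,040
# 863,040 is just over the "860,000 tiny pathways" in the release, and two
# readings per pattern would be about 1.7 million; that mapping is my
# speculation, though, not something the release states.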

The skin was then tested on different types of touch: the researchers blasted it with a heat gun, pressed it with their fingers and a robotic arm, gently touched it with their fingers, and even cut it open with a scalpel. The team then used the data gathered during these tests to train a machine learning model so the hand would recognise what the different types of touch meant. 
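Again, the release doesn’t name the model. As a hedged sketch of what ‘training a machine learning model’ on labelled touch recordings can look like (the stand-in data and classes are mine, not the paper’s),

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
touch_types = ["press", "light touch", "heat", "cut"]   # invented label set
X = rng.normal(size=(800, 32))              # one 32-electrode reading per event
y = rng.integers(0, len(touch_types), size=800)
X[:, :6] += y[:, None] * 0.9                # make the synthetic classes separable

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))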

“We’re able to squeeze a lot of information from these materials – they can take thousands of measurements very quickly,” said Hardman, who is a postdoctoral researcher in the lab of co-author Professor Fumiya Iida. “They’re measuring lots of different things at once, over a large surface area.”

“We’re not quite at the level where the robotic skin is as good as human skin, but we think it’s better than anything else out there at the moment,” said Thuruthel. “Our method is flexible and easier to build than traditional sensors, and we’re able to calibrate it using human touch for a range of tasks.”

In future, the researchers are hoping to improve the durability of the electronic skin, and to carry out further tests on real-world robotic tasks.

The research was supported by Samsung Global Research Outreach Program, the Royal Society, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Fumiya Iida is a Fellow of Corpus Christi College, Cambridge.

Here’s a link to and a citation for the paper,

Multimodal information structuring with single-layer soft skins and high-density electrical impedance tomography by David Hardman, Thomas George Thuruthel, and Fumiya Iida. Science Robotics, 11 June 2025, Vol. 10, Issue 103. DOI: 10.1126/scirobotics.adq2303

This paper is behind a paywall.

Emotional robots

This is some very intriguing work,

“I’ve always felt that robots shouldn’t just be modeled after humans [emphasis mine] or be copies of humans,” he [Guy Hoffman, assistant professor at Cornell University] said. “We have a lot of interesting relationships with other species. Robots could be thought of as one of those ‘other species,’ not trying to copy what we do but interacting with us with their own language, tapping into our own instincts.”

A July 16, 2018 Cornell University news release on EurekAlert offers more insight into the work,

Cornell University researchers have developed a prototype of a robot that can express “emotions” through changes in its outer surface. The robot’s skin covers a grid of texture units whose shapes change based on the robot’s feelings.

Assistant professor of mechanical and aerospace engineering Guy Hoffman, who has given a TEDx talk on “Robots with ‘soul'” said the inspiration for designing a robot that gives off nonverbal cues through its outer skin comes from the animal world, based on the idea that robots shouldn’t be thought of in human terms.

“I’ve always felt that robots shouldn’t just be modeled after humans or be copies of humans,” he said. “We have a lot of interesting relationships with other species. Robots could be thought of as one of those ‘other species,’ not trying to copy what we do but interacting with us with their own language, tapping into our own instincts.”

Their work is detailed in a paper, “Soft Skin Texture Modulation for Social Robots,” presented at the International Conference on Soft Robotics in Livorno, Italy. Doctoral student Yuhan Hu was lead author; the paper was featured in IEEE Spectrum, a publication of the Institute of Electrical and Electronics Engineers.

Hoffman and Hu’s design features an array of two shapes, goosebumps and spikes, which map to different emotional states. The actuation units for both shapes are integrated into texture modules, with fluidic chambers connecting bumps of the same kind.
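The paper has the real control details; purely to illustrate the kind of mapping the release describes, from emotional state to texture shape, here’s a minimal sketch in which every name and value is invented,

from dataclasses import dataclass

@dataclass
class TextureCommand:
    shape: str        # "goosebumps" or "spikes", as in the release
    inflation: float  # 0.0 (flat) to 1.0 (fully raised) in the fluidic chambers

# Invented mapping; the robot's actual emotional model is not described here
EMOTION_TO_TEXTURE = {
    "calm":    TextureCommand("goosebumps", 0.2),
    "excited": TextureCommand("goosebumps", 0.9),
    "angry":   TextureCommand("spikes", 1.0),
}

def actuate(emotion: str) -> TextureCommand:
    # Default to a flat skin for unknown states
    return EMOTION_TO_TEXTURE.get(emotion, TextureCommand("goosebumps", 0.0))

print(actuate("angry"))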

The team tried two different actuation control systems, with minimizing size and noise level a driving factor in both designs. “One of the challenges,” Hoffman said, “is that a lot of shape-changing technologies are quite loud, due to the pumps involved, and these make them also quite bulky.”

Hoffman does not have a specific application for his robot with texture-changing skin mapped to its emotional state. At this point, just proving that this can be done is a sizable first step. “It’s really just giving us another way to think about how robots could be designed,” he said.

Future challenges include scaling the technology to fit into a self-contained robot – whatever shape that robot takes – and making the technology more responsive to the robot’s immediate emotional changes.

“At the moment, most social robots express [their] internal state only by using facial expressions and gestures,” the paper concludes. “We believe that the integration of a texture-changing skin, combining both haptic [feel] and visual modalities, can thus significantly enhance the expressive spectrum of robots for social interaction.”

A video helps to explain the work,

I don’t consider ‘sleepy’ to be an emotional state but, as noted earlier, this is intriguing. You can find out more in a July 9, 2018 article by Tom Fleischman for the Cornell Chronicle (Note: the news release was fashioned from this article, so you will find some redundancy should you read it in its entirety),

In 1872, Charles Darwin published his third major work on evolutionary theory, “The Expression of the Emotions in Man and Animals,” which explores the biological aspects of emotional life.

In it, Darwin writes: “Hardly any expressive movement is so general as the involuntary erection of the hairs, feathers and other dermal appendages … it is common throughout three of the great vertebrate classes.” Nearly 150 years later, the field of robotics is starting to draw inspiration from those words.

“The aspect of touch has not been explored much in human-robot interaction, but I often thought that people and animals do have this change in their skin that expresses their internal state,” said Guy Hoffman, assistant professor and Mills Family Faculty Fellow in the Sibley School of Mechanical and Aerospace Engineering (MAE).

Inspired by this idea, Hoffman and students in his Human-Robot Collaboration and Companionship Lab have developed a prototype of a robot that can express “emotions” through changes in its outer surface. …

Part of our relationship with other species is our understanding of the nonverbal cues animals give off – like the raising of fur on a dog’s back or a cat’s neck, or the ruffling of a bird’s feathers. Those are unmistakable signals that the animal is somehow aroused or angered; the fact that they can be both seen and felt strengthens the message.

“Yuhan put it very nicely: She said that humans are part of the family of species, they are not disconnected,” Hoffman said. “Animals communicate this way, and we do have a sensitivity to this kind of behavior.”

You can find the paper presented at the International Conference on Soft Robotics in Livorno, Italy, ‘Soft Skin Texture Modulation for Social Robots’ by Yuhan Hu, Zhengnan Zhao, Abheek Vimal, and Guy Hoffman, here.

Nanotechnology-enabled robot skin

We take it for granted most of the time. The ability to sense pressure and respond appropriately doesn’t seem like any great gift, but without it you’d crush fragile objects or be unable to hold onto heavy ones.

It’s this ability to sense pressure that’s a stumbling block for robot makers who want to move robots into jobs that require some dexterity, e.g., a robot that could clean your windows and your walls without damaging the one or failing to clean the other.

Two research teams have recently published papers about their work on solving the ‘pressure problem’. From the article by Jason Palmer for BBC News,

The materials, which can sense pressure as sensitively and quickly as human skin, have been outlined by two groups reporting in [the journal] Nature Materials.

The skins are arrays of small pressure sensors that convert tiny changes in pressure into electrical signals.

The arrays are built into or under flexible rubber sheets that could be stretched into a variety of shapes.

The materials could be used to sheath artificial limbs or to create robots that can pick up and hold fragile objects. They could also be used to improve tools for minimally-invasive surgery.

One team is located at the University of California, Berkeley and the other at Stanford University. The Berkeley team, headed by Ali Javey, associate professor of electrical engineering and computer sciences, has named its artificial skin ‘e-skin’. From the article by Dan Nosowitz on the Fast Company website,

Researchers at the University of California at Berkeley, backed by DARPA funding, have come up with a thin prototype material that’s getting science nerds all in a tizzy about the future of robotics.

This material is made from germanium and silicon nanowires grown on a cylinder, then rolled around a sticky polyimide substrate. What does that get you? As CNet says, “The result was a shiny, thin, and flexible electronic material organized into a matrix of transistors, each of which with hundreds of semiconductor nanowires.”

But what takes the material to the next level is the thin layer of pressure-sensitive rubber added to the prototype’s surface, capable of measuring pressures between zero and 15 kilopascals – about the normal range of pressure for a low-intensity human activity, like, say, writing a blog post. Basically, this rubber layer turns the nanowire material into a sort of artificial skin, which is being played up as a miracle material.
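Nosowitz doesn’t give the sensor’s transfer curve, so treat this as a rough illustration only: a sketch that maps a raw readout onto that zero to 15 kilopascal range, assuming a simple linear calibration (real pressure-sensitive rubber is typically nonlinear, and all the numbers below are invented),

RAW_MIN, RAW_MAX = 120, 3900   # invented ADC counts at 0 and 15 kPa
P_MAX_KPA = 15.0               # the range quoted in the article

def raw_to_kpa(raw: float) -> float:
    frac = (raw - RAW_MIN) / (RAW_MAX - RAW_MIN)
    frac = min(max(frac, 0.0), 1.0)   # clamp to the calibrated range
    return frac * P_MAX_KPA

for reading in (100, 2000, 4000):
    print(reading, "->", round(raw_to_kpa(reading), 2), "kPa")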

As Nosowitz points out, this is a remarkable achievement, but it is only a first step, since skin registers pressure, pain, temperature, wetness, and more. Here’s an illustration of Berkeley’s e-skin (Source: University of California Berkeley, accessed from http://berkeley.edu/news/media/releases/2010/09/12_eskin.shtml Sept. 14, 2010),

An artist’s illustration of an artificial e-skin with nanowire active matrix circuitry covering a hand. The fragile egg illustrates the functionality of the e-skin device for prosthetic and robotic applications.

The Stanford team’s approach has some similarities to the Berkeley team’s (from Jason Palmer’s BBC article),

“Javey’s work is a nice demonstration of their capability in making a large array of nanowire TFTs [thin film transistors],” said Zhenan Bao of Stanford University, whose group demonstrated the second approach.

The heart of Professor Bao’s devices is a micro-structured rubber sheet in the middle of the TFT – effectively re-creating the functionality of the Berkeley group’s skins with fewer layers.

“Instead of laminating a pressure-sensitive resistor array on top of a nanowire TFT array, we made our transistors to be pressure sensitive,” Professor Bao explained to BBC News.
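If the micro-structured rubber sheet is acting as the transistor’s dielectric, as Bao’s description suggests, then squeezing it thinner raises the gate capacitance, which in turn modulates the current. A rough worked example with the parallel-plate formula and invented dimensions,

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(eps_r, area_m2, thickness_m):
    # C = eps_r * eps0 * A / d for a parallel-plate capacitor
    return eps_r * EPS0 * area_m2 / thickness_m

# Invented numbers: a 1 mm^2 pad with rubber of relative permittivity ~3
for t_um in (10, 8, 6):   # dielectric compressed from 10 um down to 6 um
    c = plate_capacitance(3.0, 1e-6, t_um * 1e-6)
    print(f"{t_um} um -> {c * 1e12:.2f} pF")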

Here’s a short video about the Stanford team’s work (Source: Stanford University, accessed from http://news.stanford.edu/news/2010/september/sensitive-artificial-skin-091210.html Sept. 14, 2010),

Both approaches to the ‘pressure problem’ have at least one shortcoming: the Berkeley team’s e-skin is less sensitive than Stanford’s, while the Stanford team’s artificial skin is less flexible than e-skin, as per Palmer’s BBC article. Also, I noticed that the Berkeley team at least is being funded by DARPA ([US Dept. of Defense] Defense Advanced Research Projects Agency), so I’m assuming a fair degree of military interest, which always gives me pause. Nonetheless, bravo to both teams.