Emotional robots

This is some very intriguing work,

“I’ve always felt that robots shouldn’t just be modeled after humans [emphasis mine] or be copies of humans,” he [Guy Hoffman, assistant professor at Cornell University] said. “We have a lot of interesting relationships with other species. Robots could be thought of as one of those ‘other species,’ not trying to copy what we do but interacting with us with their own language, tapping into our own instincts.”

A July 16, 2018 Cornell University news release on EurekAlert offers more insight into the work,

Cornell University researchers have developed a prototype of a robot that can express “emotions” through changes in its outer surface. The robot’s skin covers a grid of texture units whose shapes change based on the robot’s feelings.

Assistant professor of mechanical and aerospace engineering Guy Hoffman, who has given a TEDx talk on “Robots with ‘soul’,” said the inspiration for designing a robot that gives off nonverbal cues through its outer skin comes from the animal world, based on the idea that robots shouldn’t be thought of in human terms.

“I’ve always felt that robots shouldn’t just be modeled after humans or be copies of humans,” he said. “We have a lot of interesting relationships with other species. Robots could be thought of as one of those ‘other species,’ not trying to copy what we do but interacting with us with their own language, tapping into our own instincts.”

Their work is detailed in a paper, “Soft Skin Texture Modulation for Social Robotics,” presented at the International Conference on Soft Robotics in Livorno, Italy. Doctoral student Yuhan Hu was lead author; the paper was featured in IEEE Spectrum, a publication of the Institute of Electrical and Electronics Engineers.

Hoffman and Hu’s design features an array of two shapes, goosebumps and spikes, which map to different emotional states. The actuation units for both shapes are integrated into texture modules, with fluidic chambers connecting bumps of the same kind.

The team tried two different actuation control systems, with minimizing size and noise level a driving factor in both designs. “One of the challenges,” Hoffman said, “is that a lot of shape-changing technologies are quite loud, due to the pumps involved, and these make them also quite bulky.”

Hoffman does not have a specific application for his robot with texture-changing skin mapped to its emotional state. At this point, just proving that this can be done is a sizable first step. “It’s really just giving us another way to think about how robots could be designed,” he said.

Future challenges include scaling the technology to fit into a self-contained robot – whatever shape that robot takes – and making the technology more responsive to the robot’s immediate emotional changes.

“At the moment, most social robots express [their] internal state only by using facial expressions and gestures,” the paper concludes. “We believe that the integration of a texture-changing skin, combining both haptic [feel] and visual modalities, can thus significantly enhance the expressive spectrum of robots for social interaction.”
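The mapping from emotional state to skin texture is easy to picture as code. Here is a minimal C++ sketch of that lookup; the shape names (goosebumps and spikes) come from the paper, but the emotion labels, intensity values, and interface are my own illustrative assumptions, and a real controller would drive fluidic pumps and valves rather than print to a console.

```cpp
// Hypothetical sketch of an emotion-to-texture mapping, in the spirit of
// Hu and Hoffman's design. The shape names follow the paper; the emotion
// labels and drive levels are illustrative assumptions, not the authors'
// implementation.
#include <iostream>
#include <map>
#include <string>

// Actuation command for one sample frame: 0.0 = flat skin, 1.0 = fully raised.
struct TextureCommand {
    double goosebumps;  // drive level for the goosebump fluidic chambers
    double spikes;      // drive level for the spike fluidic chambers
};

// Each emotional state maps to a blend of the two shape arrays.
const std::map<std::string, TextureCommand> kEmotionMap = {
    {"calm",    {0.0, 0.0}},  // skin stays smooth
    {"excited", {0.8, 0.0}},  // goosebumps rise
    {"angry",   {0.0, 0.9}},  // spikes extend
    {"afraid",  {0.6, 0.4}},  // mixed response
};

void expressEmotion(const std::string& emotion) {
    auto it = kEmotionMap.find(emotion);
    if (it == kEmotionMap.end()) return;  // unknown state: leave skin as-is
    // A real controller would command pumps and valves here; we just report.
    std::cout << emotion << " -> goosebumps " << it->second.goosebumps
              << ", spikes " << it->second.spikes << "\n";
}

int main() {
    expressEmotion("excited");
    expressEmotion("angry");
}
```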

A video helps to explain the work,

I don’t consider ‘sleepy’ to be an emotional state but, as noted earlier, this is intriguing. You can find out more in a July 9, 2018 article by Tom Fleischman for the Cornell Chronicle (Note: the news release was fashioned from this article, so you will find some redundancy should you read it in its entirety),

In 1872, Charles Darwin published his third major work on evolutionary theory, “The Expression of the Emotions in Man and Animals,” which explores the biological aspects of emotional life.

In it, Darwin writes: “Hardly any expressive movement is so general as the involuntary erection of the hairs, feathers and other dermal appendages … it is common throughout three of the great vertebrate classes.” Nearly 150 years later, the field of robotics is starting to draw inspiration from those words.

“The aspect of touch has not been explored much in human-robot interaction, but I often thought that people and animals do have this change in their skin that expresses their internal state,” said Guy Hoffman, assistant professor and Mills Family Faculty Fellow in the Sibley School of Mechanical and Aerospace Engineering (MAE).

Inspired by this idea, Hoffman and students in his Human-Robot Collaboration and Companionship Lab have developed a prototype of a robot that can express “emotions” through changes in its outer surface. …

Part of our relationship with other species is our understanding of the nonverbal cues animals give off – like the raising of fur on a dog’s back or a cat’s neck, or the ruffling of a bird’s feathers. Those are unmistakable signals that the animal is somehow aroused or angered; the fact that they can be both seen and felt strengthens the message.

“Yuhan put it very nicely: She said that humans are part of the family of species, they are not disconnected,” Hoffman said. “Animals communicate this way, and we do have a sensitivity to this kind of behavior.”

You can find the paper, ‘Soft Skin Texture Modulation for Social Robotics’ by Yuhan Hu, Zhengnan Zhao, Abheek Vimal, and Guy Hoffman, presented at the International Conference on Soft Robotics in Livorno, Italy, here.

From the bleeding edge to the cutting edge to ubiquitous? The PaperPhone, an innovation case study in progress

This story has it all: military, patents, international competition and cooperation, sex (well, not according to the academics but I think it’s possible), and a bizarre device – the PaperPhone (last mentioned in my May 6, 2011 posting on Human-Computer Interfaces).

“If you want to know what technologies people will be using 10 years in the future, talk to the people who’ve been working on a lab project for 10 years,” said Dr. Roel Vertegaal, Director of the Human Media Lab at Queen’s University in Kingston, Ontario. By the way, 10 years is roughly the length of time Vertegaal and his team have been working on a flexible/bendable phone/computer and he believes that it will be another five to 10 years before the device is available commercially.

Image from Human Media Lab press kit

As you can see in the image, the prototype device looks like a thin piece of plastic that displays a menu. In real life, that black bit to the left of the image is the head of a cable with many wires connecting it to a computer. Here’s a physical description of the device, copied from the paper (PaperPhone: Understanding the Use of Bend Gestures in Mobile Devices with Flexible Electronic Paper Displays) written by Byron Lahey, Audrey Girouard, Winslow Burleson and Vertegaal,

PaperPhone consists of an Arizona State University Flexible Display Center 3.7” Bloodhound flexible electrophoretic display, augmented with a layer of 5 Flexpoint 2” bidirectional bend sensors. The prototype is driven by an E Ink Broadsheet AM 300 Kit featuring a Gumstix processor. The prototype has a refresh rate of 780 ms for a typical full screen gray scale image.

An Arduino microcontroller obtains data from the Flexpoint bend sensors at a frequency of 20 Hz. Figure 2 shows the back of the display, with the bend sensor configuration mounted on a flexible printed circuit (FPC) of our own design. We built the FPC by printing its design on DuPont Pyralux flexible circuit material with a solid ink printer, then etching the result to obtain a fully functional flexible circuit substrate. PaperPhone is not fully wireless. This is because of the supporting rigid electronics that are required to drive the display. A single, thin cable bundle connects the AM300 and Arduino hardware to the display and sensors. This design maximizes the flexibility and mobility of the display, while keeping its weight to a minimum. The AM300 and Arduino are connected to a laptop running a Max 5 patch that processes sensor data, performs bend gesture recognition and sends images to the display. p. 3
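For readers who want a feel for what that sensing pipeline involves, here is an illustrative Arduino-style C++ sketch of the sampling loop the paper describes: five bend sensors read at 20 Hz and streamed over the cable to the host. The pin assignments and message format are my assumptions for illustration; in the actual prototype, this data went to a Max 5 patch on a laptop, which performed the bend gesture recognition (in its simplest form, thresholding each sensor’s deviation from its rest value).

```cpp
// Illustrative Arduino-style sketch of the sensing loop described above:
// sample five bend sensors at 20 Hz and stream the readings to the host.
// Pin assignments, baud rate, and the comma-separated message format are
// assumptions, not the authors' code.
#include <Arduino.h>

const int kNumSensors = 5;                 // five Flexpoint bend sensors
const int kSensorPins[kNumSensors] = {A0, A1, A2, A3, A4};
const unsigned long kSamplePeriodMs = 50;  // 50 ms per frame -> 20 Hz

void setup() {
    Serial.begin(115200);
}

void loop() {
    unsigned long start = millis();
    // Read each sensor and send one comma-separated line per sample frame.
    for (int i = 0; i < kNumSensors; i++) {
        Serial.print(analogRead(kSensorPins[i]));  // raw 0-1023 bend value
        Serial.print(i < kNumSensors - 1 ? "," : "\n");
    }
    // Hold the 20 Hz sample rate regardless of loop overhead.
    unsigned long elapsed = millis() - start;
    if (elapsed < kSamplePeriodMs) delay(kSamplePeriodMs - elapsed);
}
```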

It may look ungainly, but it represents a significant step forward for the technology: this team (composed of researchers from Queen’s University, Arizona State University, and E Ink Corporation) appears to have produced the world’s only working prototype of a personal, portable, flexible device that will let you make phone calls, play music, read a book, and more by bending it. As they continue to develop the product, the device will become wireless.

The PaperPhone and the research about ‘bending’, i.e., the kinds of bending gestures people would find easiest and most intuitive to use when activating the device, were presented in Vancouver in an early session at the CHI 2011 Conference, where I got a chance to speak to Dr. Vertegaal and his team.

Amongst other nuggets, I found out the US Department of Defense (not DARPA [Defense Advanced Research Projects Agency], oddly enough) has provided funding for the project. Military interest is focused on the device’s low energy requirements, low-light screen, and light weight, in addition to its potential ability to be folded up and carried like a piece of paper (i.e., it could mould itself to fit a number of tight spaces) as opposed to the rigid, unyielding borders of a standard mobile device. Of course, all of these factors are quite attractive to consumers too.

As is imperative these days, the ‘bends’ that activate the device have been patented and Vertegaal is in the process of developing a startup company that will bring this device and others to market. Queen’s University has an ‘industrial transfer’ office (they probably call it something else) which is assisting him with the startup.

There is international interest in the PaperPhone that is both collaborative and competitive. Vertegaal’s team at Queen’s is partnered with a team at Arizona State University led by Dr. Winslow Burleson, professor in the Computer Systems Engineering and the Arts, Media, and Engineering graduate program, and with Michael McCreary, Vice President of Research & Development at E Ink Corporation, representing an industry partner.

On the competitive side of things, the UK’s University of Cambridge and the Finnish Nokia Research Centre have been working on the Morph, which, as I noted in my May 6, 2011 posting, still seems to be more concept than project.

Vertegaal noted that the idea of a flexible screen is not new and that North American companies have gone bankrupt trying to bring the screens to market. These days, you have to go to Taiwan for industrial production of flexible screens such as the PaperPhone’s.

One of my last questions to the team was about pornography. (In the early days of the Internet [which had its origins in military research], there were only two industries that made money online, pornography and gambling. The gambling opportunities seem pretty similar to what we already enjoy.) After an amused response, the consensus was that like gambling it’s highly unlikely a flexible phone could lend itself to anything new in the field of pornography. Personally, I’m not convinced about that one.

So there you have a case study for innovation. Work considered bleeding edge 10 years ago is now cutting edge and, in the next five to 10 years, that work will become a consumer product. Along the way you have military investment, international collaboration and competition, failure and success, and, possibly, sex.