
Comedy club performances show how robots and humans connect via humor

Caption: Naomi Fitter and Jon the Robot. Credit: Johanna Carson, OSU College of Engineering

Robot comedian is not my first thought on seeing that image; ventriloquist’s dummy is what came to mind. However, it’s not the first time I’ve been wrong about something. A May 19, 2020 news item on ScienceDaily reveals the truth about Jon, a comedian in robot form,

Standup comedian Jon the Robot likes to tell his audiences that he does lots of auditions but has a hard time getting bookings.

“They always think I’m too robotic,” he deadpans.

If raucous laughter follows, he comes back with, “Please tell the booking agents how funny that joke was.”

If it doesn’t, he follows up with, “Sorry about that. I think I got caught in a loop. Please tell the booking agents that you like me … that you like me … that you like me … that you like me.”
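For readers who like to see the logic spelled out, here is a minimal Python sketch of how that kind of post-joke adaptation could work; the helper names, the threshold, and the audio measurement are my own illustrative assumptions, not the researchers' code,

```python
# Minimal sketch of adaptive post-joke "tags" (illustrative only; the
# threshold, helper names, and audio measurement are hypothetical).

def measure_laughter_level() -> float:
    """Return a normalized loudness reading from the room microphone (stub)."""
    return 0.0  # replace with a real audio input in an actual system

def choose_tag(laughter_level: float, threshold: float = 0.5) -> str:
    """Pick a follow-up line that acknowledges how the joke landed."""
    if laughter_level >= threshold:
        # The joke landed: play off the laughter.
        return "Please tell the booking agents how funny that joke was."
    # The joke fell flat: lean into the robot persona instead.
    return ("Sorry about that. I think I got caught in a loop. "
            "Please tell the booking agents that you like me ... "
            "that you like me ... that you like me ... that you like me.")

if __name__ == "__main__":
    print(choose_tag(measure_laughter_level()))
```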

Jon the Robot, with assistance from Oregon State University researcher Naomi Fitter, recently wrapped up a 32-show tour of comedy clubs in greater Los Angeles and in Oregon, generating guffaws and, more importantly, data that scientists and engineers can use to help robots and people relate more effectively with one another via humor.

A May 18, 2020 Oregon State University (OSU) news release (also on EurekAlert), which originated the news item, delves further into this intriguing research,

“Social robots and autonomous social agents are becoming more and more ingrained in our everyday lives,” said Fitter, assistant professor of robotics in the OSU College of Engineering. “Lots of them tell jokes to engage users – most people understand that humor, especially nuanced humor, is essential to relationship building. But it’s challenging to develop entertaining jokes for robots that are funny beyond the novelty level.”

Live comedy performances are a way for robots to learn “in the wild” which jokes and which deliveries work and which ones don’t, Fitter said, just like human comedians do.

The comedy tour comprised two studies, with assistance from a team of Southern California comedians in coming up with material true to, and appropriate for, a robot comedian.

The first study, consisting of 22 performances in the Los Angeles area, demonstrated that audiences found a robot comic with good timing – giving the audience the right amounts of time to react, etc. – to be significantly funnier than one without good timing.

The second study, based on 10 routines in Oregon, determined that an “adaptive performance” – delivering post-joke “tags” that acknowledge an audience’s reaction to the joke – wasn’t necessarily funnier overall, but the adaptations almost always improved the audience’s perception of individual jokes. In the second study, all performances featured appropriate timing.

“In bad-timing mode, the robot always waited a full five seconds after each joke, regardless of audience response,” Fitter said. “In appropriate-timing mode, the robot used timing strategies to pause for laughter and continue when it subsided, just like an effective human comedian would. Overall, joke response ratings were higher when the jokes were delivered with appropriate timing.”
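A minimal sketch of those two timing modes might look like the following; the audio-level helper, the polling interval, and the silence threshold are assumptions for illustration, not the study's actual implementation,

```python
import time

# Illustrative sketch of the two timing modes described above.
# get_room_loudness(), the threshold, and the timing constants are
# hypothetical stand-ins, not the researchers' implementation.

def get_room_loudness() -> float:
    """Return a normalized microphone level for the room (stub)."""
    return 0.0

def bad_timing_pause() -> None:
    """Bad-timing mode: always wait a fixed five seconds after a joke."""
    time.sleep(5.0)

def appropriate_timing_pause(silence_threshold: float = 0.2,
                             quiet_needed: float = 1.0,
                             max_wait: float = 10.0) -> None:
    """Appropriate-timing mode: hold for laughter, continue once it subsides."""
    start = time.time()
    quiet_since = None
    while time.time() - start < max_wait:
        if get_room_loudness() < silence_threshold:
            quiet_since = quiet_since or time.time()
            if time.time() - quiet_since >= quiet_needed:
                return  # laughter has subsided; deliver the next joke
        else:
            quiet_since = None  # audience is still laughing; keep holding
        time.sleep(0.1)
```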

The number of performances, given to audiences of 10 to 20, provided enough data to identify significant differences between distinct modes of robot comedy performance, and the research helped to answer key questions about comedic social interaction, Fitter said.

“Audience size, social context, cultural context, the microphone-holding human presence and the novelty of a robot comedian may have influenced crowd responses,” Fitter said. “The current software does not account for differences in laughter profiles, but future work can account for these differences using a baseline response measurement. The only sensing we used to evaluate joke success was audio readings. Future work might benefit from incorporating additional types of sensing.”

Still, the studies have key implications for artificial intelligence efforts to understand group responses to dynamic, entertaining social robots in real-world environments, she said.

“Also, possible advances in comedy from this work could include improved techniques for isolating and studying the effects of comedic techniques and better strategies to help comedians assess the success of a joke or routine,” she said. “The findings will guide our next steps toward giving autonomous social agents improved humor capabilities.”

The studies were published in the proceedings of the Association for Computing Machinery [ACM]/Institute of Electrical and Electronics Engineers [IEEE] International Conference on Human-Robot Interaction [HRI].

Here’s another link to the two studies, which were published as a single paper and first presented at the 2020 International Conference on Human-Robot Interaction [HRI], along with a citation for the published presentation,

Comedians in Cafes Getting Data: Evaluating Timing and Adaptivity in Real-World Robot Comedy Performance by John Vilk and Naomi T. Fitter. HRI ’20: Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, March 2020, Pages 223–231 DOI: https://doi.org/10.1145/3319502.3374780

The paper is open access and the researchers have embedded an mp4 file which includes parts of the performances. Enjoy!

Emotional robots

This is some very intriguing work,

“I’ve always felt that robots shouldn’t just be modeled after humans [emphasis mine] or be copies of humans,” he [Guy Hoffman, assistant professor at Cornell University] said. “We have a lot of interesting relationships with other species. Robots could be thought of as one of those ‘other species,’ not trying to copy what we do but interacting with us with their own language, tapping into our own instincts.”

A July 16, 2018 Cornell University news release on EurekAlert offers more insight into the work,

Cornell University researchers have developed a prototype of a robot that can express “emotions” through changes in its outer surface. The robot’s skin covers a grid of texture units whose shapes change based on the robot’s feelings.

Assistant professor of mechanical and aerospace engineering Guy Hoffman, who has given a TEDx talk on “Robots with ‘soul’,” said the inspiration for designing a robot that gives off nonverbal cues through its outer skin comes from the animal world, based on the idea that robots shouldn’t be thought of in human terms.

“I’ve always felt that robots shouldn’t just be modeled after humans or be copies of humans,” he said. “We have a lot of interesting relationships with other species. Robots could be thought of as one of those ‘other species,’ not trying to copy what we do but interacting with us with their own language, tapping into our own instincts.”

Their work is detailed in a paper, “Soft Skin Texture Modulation for Social Robots,” presented at the International Conference on Soft Robotics in Livorno, Italy. Doctoral student Yuhan Hu was lead author; the paper was featured in IEEE Spectrum, a publication of the Institute of Electrical and Electronics Engineers.

Hoffman and Hu’s design features an array of two shapes, goosebumps and spikes, which map to different emotional states. The actuation units for both shapes are integrated into texture modules, with fluidic chambers connecting bumps of the same kind.

The team tried two different actuation control systems, with minimizing size and noise level a driving factor in both designs. “One of the challenges,” Hoffman said, “is that a lot of shape-changing technologies are quite loud, due to the pumps involved, and these make them also quite bulky.”
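For a sense of how such a mapping could be wired up in software, here is a hedged Python sketch that maps a few emotional states to the two texture shapes and an actuation level; the emotion labels, pressure values, and actuate_texture() interface are hypothetical and not taken from the researchers' control code,

```python
# Illustrative mapping from a robot's emotional state to skin texture
# actuation. The emotion labels, inflation levels, and actuate_texture()
# interface are hypothetical, not reproduced from the paper.

EMOTION_TO_TEXTURE = {
    "calm":    ("goosebumps", 0.2),  # (shape, relative inflation level 0..1)
    "excited": ("goosebumps", 0.8),
    "alert":   ("spikes", 0.5),
    "angry":   ("spikes", 0.9),
}

def actuate_texture(shape: str, level: float) -> None:
    """Stand-in for driving the fluidic chambers of one texture module."""
    print(f"Inflating {shape} chambers to {level:.0%}")

def express_emotion(emotion: str) -> None:
    """Look up the texture mapped to an emotion and actuate it."""
    shape, level = EMOTION_TO_TEXTURE.get(emotion, ("goosebumps", 0.0))
    actuate_texture(shape, level)

if __name__ == "__main__":
    express_emotion("angry")
```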

Hoffman does not have a specific application for his robot with texture-changing skin mapped to its emotional state. At this point, just proving that this can be done is a sizable first step. “It’s really just giving us another way to think about how robots could be designed,” he said.

Future challenges include scaling the technology to fit into a self-contained robot – whatever shape that robot takes – and making the technology more responsive to the robot’s immediate emotional changes.

“At the moment, most social robots express [their] internal state only by using facial expressions and gestures,” the paper concludes. “We believe that the integration of a texture-changing skin, combining both haptic [feel] and visual modalities, can thus significantly enhance the expressive spectrum of robots for social interaction.”

A video helps to explain the work,

I don’t consider ‘sleepy’ to be an emotional state but, as noted earlier, this is intriguing. You can find out more in a July 9, 2018 article by Tom Fleischman for the Cornell Chronicle (Note: the news release was fashioned from this article, so you will find some redundancy should you read it in its entirety),

In 1872, Charles Darwin published his third major work on evolutionary theory, “The Expression of the Emotions in Man and Animals,” which explores the biological aspects of emotional life.

In it, Darwin writes: “Hardly any expressive movement is so general as the involuntary erection of the hairs, feathers and other dermal appendages … it is common throughout three of the great vertebrate classes.” Nearly 150 years later, the field of robotics is starting to draw inspiration from those words.

“The aspect of touch has not been explored much in human-robot interaction, but I often thought that people and animals do have this change in their skin that expresses their internal state,” said Guy Hoffman, assistant professor and Mills Family Faculty Fellow in the Sibley School of Mechanical and Aerospace Engineering (MAE).

Inspired by this idea, Hoffman and students in his Human-Robot Collaboration and Companionship Lab have developed a prototype of a robot that can express “emotions” through changes in its outer surface. …

Part of our relationship with other species is our understanding of the nonverbal cues animals give off – like the raising of fur on a dog’s back or a cat’s neck, or the ruffling of a bird’s feathers. Those are unmistakable signals that the animal is somehow aroused or angered; the fact that they can be both seen and felt strengthens the message.

“Yuhan put it very nicely: She said that humans are part of the family of species, they are not disconnected,” Hoffman said. “Animals communicate this way, and we do have a sensitivity to this kind of behavior.”

You can find the paper presented at the International Conference on Soft Robotics in Livorno, Italy, ‘Soft Skin Texture Modulation for Social Robotics’ by Yuhan Hu, Zhengnan Zhao, Abheek Vimal, and Guy Hoffman, here.

Internet of toys, the robotification of childhood, and privacy issues

Leave it to the European Commission’s (EC) Joint Research Centre (JRC) to look into the future of toys. As far as I’m aware, there are no such moves in either Canada or the US despite the ubiquity of robot toys and other such devices. From a March 23, 2017 EC JRC press release (also on EurekAlert),

Action is needed to monitor and control the emerging Internet of Toys, concludes a new JRC report. Privacy and security are highlighted as main areas of concern.

Large numbers of connected toys have been put on the market over the past few years, and the turnover is expected to reach €10 billion by 2020 – up from just €2.6 billion in 2015.

Connected toys come in many different forms, from smart watches to teddy bears that interact with their users. They are connected to the internet and together with other connected appliances they form the Internet of Things, which is bringing technology into our daily lives more than ever.

However, the toys’ ability to record, store and share information about their young users raises concerns about children’s safety, privacy and social development.

A team of JRC scientists and international experts looked at the safety, security, privacy and societal questions emerging from the rise of the Internet of Toys. The report invites policymakers, industry, parents and teachers to study connected toys more in depth in order to provide a framework which ensures that these toys are safe and beneficial for children.

Robotification of childhood

Robots are no longer used only in industry to carry out repetitive or potentially dangerous tasks. In recent years, robots have entered our everyday lives, and children are increasingly likely to encounter robotic or artificial intelligence-enhanced toys.

We still know relatively little about the consequences of children’s interaction with robotic toys. However, it is conceivable that they represent both opportunities and risks for children’s cognitive, socio-emotional and moral-behavioural development.

For example, social robots may further the acquisition of foreign language skills by compensating for the lack of native speakers as language tutors or by removing the barriers and peer pressure encountered in the classroom. There is also evidence about the benefits of child-robot interaction for children with developmental problems, such as autism or learning difficulties, who may find human interaction difficult.

However, the internet-based personalization of children’s education via filtering algorithms may also increase the risk of ‘educational bubbles’ where children only receive information that fits their pre-existing knowledge and interest – similar to adult interaction on social media networks.

Safety and security considerations

The rapid rise in internet-connected toys also raises concerns about children’s safety and privacy. In particular, the way that data gathered by connected toys is analysed, manipulated and stored is not transparent, which poses an emerging threat to children’s privacy.

The data provided by children while they play, i.e. the sounds, images and movements recorded by connected toys, is personal data protected by the EU data protection framework, as well as by the new General Data Protection Regulation (GDPR). However, information on how this data is stored, analysed and shared might be hidden in long privacy statements or policies and often goes unnoticed by parents.

Whilst children’s right to privacy is the most immediate concern linked to connected toys, there is also a long term concern: growing up in a culture where the tracking, recording and analysing of children’s everyday choices becomes a normal part of life is also likely to shape children’s behaviour and development.

Usage framework to guide the use of connected toys

The report calls for industry and policymakers to create a connected toys usage framework to act as a guide for their design and use.

This would also help toymakers to meet the challenge of complying with the new European General Data Protection Regulation (GDPR), which comes into force in May 2018 and will increase citizens’ control over their personal data.

The report also calls for the connected toy industry and academic researchers to work together to produce better designed and safer products.

Advice for parents

The report concludes that it is paramount that we understand how children interact with connected toys and which risks and opportunities they entail for children’s development.

“These devices come with really interesting possibilities and the more we use them, the more we will learn about how to best manage them. Locking them up in a cupboard is not the way to go. We as adults have to understand how they work – and how they might ‘misbehave’ – so that we can provide the right tools and the right opportunities for our children to grow up happy in a secure digital world,” said Stéphane Chaudron, the report’s lead researcher at the Joint Research Centre (JRC).

The authors of the report encourage parents to get informed about the capabilities, functions, security measures and privacy settings of toys before buying them. They also urge parents to focus on the quality of play by observing their children, talking to them about their experiences and playing alongside and with their children.

Protecting and empowering children

Through the Alliance to better protect minors online and with the support of UNICEF, NGOs, Toy Industries Europe and other industry and stakeholder groups, European and global ICT and media companies are working to improve the protection and empowerment of children when using connected toys. This self-regulatory initiative is facilitated by the European Commission and aims to create a safer and more stimulating digital environment for children.

There’s an engaging video accompanying this press release,

You can find the report (Kaleidoscope on the Internet of Toys: Safety, security, privacy and societal insights) here and both the PDF and print versions are free (although I imagine you’ll have to pay postage for the print version). This report was published in 2016; the authors are Stéphane Chaudron, Rosanna Di Gioia, Monica Gemo, Donell Holloway, Jackie Marsh, Giovanna Mascheroni, Jochen Peter, and Dylan Yamada-Rice, and organizations involved include European Cooperation in Science and Technology (COST), Digital Literacy and Multimodal Practices of Young Children (DigiLitEY), and COST Action IS1410. DigiLitEY is a European network of 33 countries focusing on research in this area (2015-2019).