
Robots and a new perspective on disability

I’ve long wondered how disabilities would be viewed in a future where technology could render them largely irrelevant (h/t May 4, 2017 news item on phys.org). A May 4, 2017 essay by Thusha (Gnanthusharan) Rajendran of Heriot-Watt University on TheConversation.com provides a perspective on the possibilities (Note: Links have been removed),

When dealing with the otherness of disability, the Victorians in their shame built huge out-of-sight asylums, and their legacy of “them” and “us” continues to this day. Two hundred years later, technologies offer us an alternative view. The digital age is shattering barriers, and what used to be the norm is now being challenged.

What if we could change the environment, rather than the person? What if a virtual assistant could help a visually impaired person with their online shopping? And what if a robot “buddy” could help a person with autism navigate the nuances of workplace politics? These are just some of the questions that are being asked and which need answers as the digital age challenges our perceptions of normality.

The treatment of people with developmental conditions has a chequered history. In towns and cities across Britain, you will still see large Victorian buildings that were once places to “look after” people with disabilities, that is, remove them from society. Things became worse still during the time of the Nazis with an idealisation of the perfect and rejection of Darwin’s idea of natural diversity.

Today we face similar challenges about differences versus abnormalities. Arguably, current diagnostic systems do not help, because they diagnose the person and not “the system”. So, a child has challenging behaviour, rather than being in distress; the person with autism has a communication disorder rather than simply not being understood.

Natural-born cyborgs

In contrast, the digital world is all about systems. The field of human-computer interaction is about how things work between humans and computers or robots. Philosopher Andy Clark argues that humans have always been natural-born cyborgs – that is, we have always used technology (in its broadest sense) to improve ourselves.

The most obvious example is language itself. In the digital age we can become truly digitally enhanced. How many of us Google something rather than remembering it? How do you feel when you have no access to wi-fi? How much do we favour texting, tweeting and Facebook over face-to-face conversations? How much do we love and need our smartphones?

In the new field of social robotics, my colleagues and I are developing a robot buddy to help adults with autism to understand, for example, if their boss is pleased or displeased with their work. For many adults with autism, it is not the work itself that stops them from having successful careers, it is the social environment surrounding work. From the stress-inducing interview to workplace politics, the modern world of work is a social minefield. It is not easy, at times, for us neurotypicals, but for a person with autism it is a world full of contradictions and implied meaning.

Rajendran goes on to highlight efforts with autistic individuals; he also includes this video of his December 14, 2016 TEDx Heriot-Watt University talk, which largely focuses on his work with robots and autism (Note: This runs approximately 15 mins.),

The talk reminded me of a Feb. 6, 2017 posting (scroll down about 33% of the way) where I discussed a recent book about science communication and its failure to recognize the importance of pop culture in that endeavour. As an example, I used a then-recent announcement from MIT (Massachusetts Institute of Technology) about its emotion detection wireless application and the almost simultaneous appearance of that application in a Feb. 2, 2017 episode of The Big Bang Theory (a popular US television comedy), where a character who could be seen as autistic makes use of the emotion detection device.

In any event, the work described in the MIT news release is very similar to Rajendran’s, although the communication is delivered to the public through entirely different channels: a TEDx talk and TheConversation.com (channels aimed at academics and those with academic interests) versus a pop culture television comedy with broad appeal.

Meet Pepper, a robot for health care clinical settings

A Canadian project to introduce robots like Pepper into clinical settings (aside: can seniors’ facilities be far behind?) is the subject of a June 23, 2017 news item on phys.org,

McMaster and Ryerson universities today announced the Smart Robots for Health Communication project, a joint research initiative designed to introduce social robotics and artificial intelligence into clinical health care.

A June 22, 2017 McMaster University news release, which originated the news item, provides more detail,

With the help of Softbank’s humanoid robot Pepper and IBM Bluemix Watson Cognitive Services, the researchers will study health information exchange through a state-of-the-art human-robot interaction system. The project is a collaboration between David Harris Smith, professor in the Department of Communication Studies and Multimedia at McMaster University, Frauke Zeller, professor in the School of Professional Communication at Ryerson University, and Hermenio Lima, a dermatologist and professor of medicine at McMaster’s Michael G. DeGroote School of Medicine. His main research interests are in the area of immunodermatology and technology applied to human health.

The research project involves the development and analysis of physical and virtual human-robot interactions, and has the capability to improve healthcare outcomes by helping healthcare professionals better understand patients’ behaviour.

Zeller and Harris Smith have previously worked together on hitchBOT, the friendly hitchhiking robot that travelled across Canada and has since found its new home in the [Canada] Science and Technology Museum in Ottawa.

“Pepper will help us highlight some very important aspects and motives of human behaviour and communication,” said Zeller.

Designed to be used in professional environments, Pepper is a humanoid robot that can interact with people, ‘read’ emotions, learn, move and adapt to its environment, and even recharge on its own. Pepper is able to perform facial recognition and develop individualized relationships when it interacts with people.

Lima, the clinic director, said: “We are excited to have the opportunity to potentially transform patient engagement in a clinical setting, and ultimately improve healthcare outcomes by adapting to clients’ communications needs.”

At Ryerson, Pepper was funded by the Co-lab in the Faculty of Communication and Design (FCAD). FCAD’s Co-lab provides strategic leadership, technological support and acquisitions of technologies that are shaping the future of communications.

“This partnership is a testament to the collaborative nature of innovation,” said dean of FCAD, Charles Falzon. “I’m thrilled to support this multidisciplinary project that pushes the boundaries of research, and allows our faculty and students to find uses for emerging tech inside and outside the classroom.”

“This project exemplifies the value that research in the Humanities can bring to the wider world, in this case building understanding and enhancing communications in critical settings such as health care,” says McMaster’s Dean of Humanities, Ken Cruikshank.

The integration of IBM Watson cognitive computing services with the state-of-the-art social robot Pepper offers a rich source of research potential for the projects at Ryerson and McMaster. IBM Canada and [Southern Ontario Smart Computing Innovation Platform] SOSCIP also support the integration by providing the project with access to high-performance research computing resources and staff in Ontario.

“We see this as the initiation of an ongoing collaborative university and industry research program to develop and test applications of embodied AI, a research program that is well-positioned to integrate and apply emerging improvements in machine learning and social robotics innovations,” said Harris Smith.
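
The news release mentions integrating IBM Watson cognitive services with Pepper but, understandably, doesn’t explain what such a pipeline looks like in practice. For the curious, here is a minimal, purely hypothetical sketch of the general pattern (the robot captures an utterance, a cloud language service scores it, and the robot adjusts its spoken reply); the service URL, credentials, and response fields are invented placeholders, and the robot-side calls assume SoftBank’s NAOqi Python SDK,

# Purely illustrative sketch (not the project's code): relay a transcribed
# utterance from Pepper to a cloud language service and adjust the spoken reply.
# The endpoint URL, credentials, and response fields below are placeholders.

import requests  # generic HTTP client
import qi        # NAOqi Python SDK that ships with SoftBank's Pepper

ANALYZE_URL = "https://example-cloud-nlu/v1/analyze"  # placeholder, not a real Watson endpoint
API_KEY = "replace-with-your-credentials"             # placeholder


def analyze_sentiment(text):
    """Send text to a hypothetical sentiment service and return a label."""
    response = requests.post(
        ANALYZE_URL,
        auth=("apikey", API_KEY),
        json={"text": text, "features": {"sentiment": {}}},
        timeout=10,
    )
    response.raise_for_status()
    # Field names are illustrative; a real service's response schema will differ.
    return response.json().get("sentiment", {}).get("label", "neutral")


def main(robot_ip="pepper.local"):
    # Connect to the robot and grab its text-to-speech service.
    session = qi.Session()
    session.connect("tcp://{0}:9559".format(robot_ip))
    tts = session.service("ALTextToSpeech")

    # Stand-in for output from the robot's own speech recognition.
    utterance = "I'm a bit worried about my test results."

    if analyze_sentiment(utterance) == "negative":
        tts.say("That sounds stressful. Let me find someone who can help.")
    else:
        tts.say("Thanks for letting me know.")


if __name__ == "__main__":
    main()

In a real clinical deployment the transcription would come from Pepper’s own speech recognition, the analysis from whichever Watson services the team actually licenses, and everything would sit behind the privacy and consent safeguards a health care setting demands.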

I just went to a presentation at the facility where my mother lives and it was all about delivering more individualized and better care for residents. Given that most seniors in British Columbia care facilities do not receive the number of service hours per resident recommended by the province due to funding issues, it seemed a well-meaning initiative offered in the face of daunting odds against success. Now with this news, I wonder what impact ‘Pepper’ might ultimately have on seniors and on the people who currently deliver service. Of course, this assumes that researchers will be able to tackle problems with understanding various accents and communication strategies, which are strongly influenced by culture and, over time, the aging process.

After writing that last paragraph I stumbled onto this June 27, 2017 Sage Publications press release on EurekAlert about a related matter,

Existing digital technologies must be exploited to enable a paradigm shift in current healthcare delivery which focuses on tests, treatments and targets rather than the therapeutic benefits of empathy. Writing in the Journal of the Royal Society of Medicine, Dr Jeremy Howick and Dr Sian Rees of the Oxford Empathy Programme, say a new paradigm of empathy-based medicine is needed to improve patient outcomes, reduce practitioner burnout and save money.

Empathy-based medicine, they write, re-establishes relationship as the heart of healthcare. “Time pressure, conflicting priorities and bureaucracy can make practitioners less likely to express empathy. By re-establishing the clinical encounter as the heart of healthcare, and exploiting available technologies, this can change”, said Dr Howick, a Senior Researcher in Oxford University’s Nuffield Department of Primary Care Health Sciences.

Technology is already available that could reduce the burden of practitioner paperwork by gathering basic information prior to consultation, for example via email or a mobile device in the waiting room.

During the consultation, the computer screen could be placed so that both patient and clinician can see it, a help to both if needed, for example, to show infographics on risks and treatment options to aid decision-making and the joint development of a treatment plan.

Dr Howick said: “The spread of alternatives to face-to-face consultations is still in its infancy, as is our understanding of when a machine will do and when a person-to-person relationship is needed.” However, he warned, technology can also get in the way. A computer screen can become a barrier to communication rather than an aid to decision-making. “Patients and carers need to be involved in determining the need for, and designing, new technologies”, he said.

I sincerely hope that the Canadian project has taken into account some of the issues described in the ’empathy’ press release and in the article, which can be found here,

Overthrowing barriers to empathy in healthcare: empathy in the age of the Internet
by J Howick and S Rees. Journal of the Royal Society of Medicine. Article first published online: June 27, 2017. DOI: https://doi.org/10.1177/0141076817714443

This article is open access.

Public science outreach in New York

I couldn’t believe my eyes when I first saw the title, Being Human in the 21st Century (a New York Academy of Sciences outreach series), in an article about social robotics at physorg.com. I’ve been thinking about that very issue since I wrote my paper, Whose electric brain? Sadly, I won’t be able to attend the series (I don’t live in New York, and four of the five talks have already been given).

Here’s a brief description of the series,

One of the signature traits of being human is our quest to define what it means to “be human.” But that definition is always changing—now perhaps more than ever. From virtual reality to mundane reality, science and technology continue to push the boundaries of human existence. In this series, Science & the City will examine what it means to be human in the 21st century.

Here are the titles in the series,

System Overload: The Limits of Human Memory, September 6, 2011
Celluloid Science: Humanizing Life in the Lab, October 20, 2011
Virtual Humanity: The Anthropology of Online Worlds, November 9, 2011
Familiar but Strange: Exploring our Relationships with Robots, December 5, 2011
Matchmaking in the Digital Age, February 15, 2012

I was especially interested in the talk about robots since I have written on that topic quite regularly (my March 10, 2011 posting on Geminoid robots and the uncanny valley is one example). Here’s a description of the two speakers,

Chris Bregler

New York University

Chris Bregler’s primary research interests are in the areas of motion capture, animation, computer vision, graphics, statistical learning, gaming, and applications in the bio/medical field, human-computer interaction, and artificial intelligence. Currently he focuses on human movement research, including projects in human face, speech, and full-body motion analysis and animation, movement style, expressions, body language, and Massive Multiplayer Mocap games. Most of these projects are interdisciplinary collaborations with other (computer) scientists, engineers, dancers, animators, bio/medical experts, game designers, and producers.

Heather Knight

Carnegie Mellon University

Heather Knight is currently conducting her doctoral research at Carnegie Mellon’s Robotics Institute. She is also founder of Marilyn Monrobot Labs in New York City, which creates socially intelligent robot performances and sensor-based electronic art. Her previous work includes robotics and instrumentation at NASA’s Jet Propulsion Laboratory, interactive installations with Syyn Labs, and field applications and sensor design at Aldebaran Robotics; she is an alumna of the Personal Robots Group at the MIT Media Lab. Knight earned her bachelor’s and master’s degrees at MIT in electrical engineering and computer science, with a minor in mechanical engineering.

I drilled down for more detail and was quite interested to find out that Knight’s Marilyn Monrobot is a robot theatre company while Bregler has been active in theatre and movies,

He was the chair for the SIGGRAPH Electronic Theater and Animation Festival. He has been active in the Visual Effects industry, for example, as the lead developer of ILM’s Multitrack system that has been used in many feature film productions.

I hope to see this talk available as a podcast one of these days. In the meantime, the New York Academy of Sciences runs a large outreach programme titled Science & the City, where you can find podcasts and listings for the various public series and events that it produces.

If you do live in New York City or will be there around Valentine’s Day, the last scheduled talk in this particular series (Being Human in the 21st Century), Matchmaking in the Digital Age, will be held Feb. 15, 2012, 7 pm – 8:30 pm (from the event page),

In the not-too-distant past people found love through real-world social networks: family, friends, jobs, and social groups. But online dating has completely changed the way we find love and shifted matchmaking to a mathematical science. Now millions of singles turn over large amounts of personal data to computers, hoping an algorithm will find them the perfect mate. One leading online dating site is using that data to uncover the anthropology of human mating.

Founded by a mathematician, OKCupid analyzes dating data to draw funny, revealing, and downright strange conclusions about the sex lives of humans. OKCupid’s resident blogger, Christian Rudder, will give a behind-the-scenes look into human mating in the 21st century, just in time for Valentine’s Day. You’ll never look at your love life the same way again.

Join us for a reception afterward, where you may just meet someone the old-fashioned way.

Pricing for the event is as follows,

Member: $15
Student / Postdoc / Fellow Member: $15
Nonmember: $25
Student / Postdoc / Fellow Nonmember: $20

The address and contact details:

The New York Academy of Sciences

7 World Trade Center
250 Greenwich Street, 40th floor
New York, NY 10007-2157
212.298.8600
nyas@nyas.org

You can register for the event here.