Category Archives: public perceptions

Comedy club performances show how robots and humans connect via humor

Caption: Naomi Fitter and Jon the Robot. Credit: Johanna Carson, OSU College of Engineering

Robot comedian is not my first thought on seeing that image; ventriloquist’s dummy is what came to mind. However, it’s not the first time I’ve been wrong about something. A May 19, 2020 news item on ScienceDaily reveals the truth about Jon, a comedian in robot form,

Standup comedian Jon the Robot likes to tell his audiences that he does lots of auditions but has a hard time getting bookings.

“They always think I’m too robotic,” he deadpans.

If raucous laughter follows, he comes back with, “Please tell the booking agents how funny that joke was.”

If it doesn’t, he follows up with, “Sorry about that. I think I got caught in a loop. Please tell the booking agents that you like me … that you like me … that you like me … that you like me.”

Jon the Robot, with assistance from Oregon State University researcher Naomi Fitter, recently wrapped up a 32-show tour of comedy clubs in greater Los Angeles and in Oregon, generating guffaws and, more importantly, data that scientists and engineers can use to help robots and people relate more effectively with one another via humor.

A May 18, 2020 Oregon State University (OSU) news release (also on EurekAlert), which originated the news item, delves further into this intriguing research,

“Social robots and autonomous social agents are becoming more and more ingrained in our everyday lives,” said Fitter, assistant professor of robotics in the OSU College of Engineering. “Lots of them tell jokes to engage users – most people understand that humor, especially nuanced humor, is essential to relationship building. But it’s challenging to develop entertaining jokes for robots that are funny beyond the novelty level.”

Live comedy performances are a way for robots to learn “in the wild” which jokes and which deliveries work and which ones don’t, Fitter said, just like human comedians do.

The comedy tour comprised two studies and included assistance from a team of Southern California comedians in coming up with material true to, and appropriate for, a robot comedian.

The first study, consisting of 22 performances in the Los Angeles area, demonstrated that audiences found a robot comic with good timing – giving the audience the right amounts of time to react, etc. – to be significantly funnier than one without good timing.

The second study, based on 10 routines in Oregon, determined that an “adaptive performance” – delivering post-joke “tags” that acknowledge an audience’s reaction to the joke – wasn’t necessarily funnier overall, but the adaptations almost always improved the audience’s perception of individual jokes. In the second study, all performances featured appropriate timing.

“In bad-timing mode, the robot always waited a full five seconds after each joke, regardless of audience response,” Fitter said. “In appropriate-timing mode, the robot used timing strategies to pause for laughter and continue when it subsided, just like an effective human comedian would. Overall, joke response ratings were higher when the jokes were delivered with appropriate timing.”
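To make the two timing modes concrete, here is a hypothetical Python sketch of the pause logic Fitter describes; the laughter threshold, sample period, and wait limits are my assumptions for illustration, not values from the study:

```python
LAUGHTER_THRESHOLD = 0.3  # assumed normalized audio level that counts as laughter

def pause_after_joke(audio_samples, timing_mode,
                     sample_period=0.1, fixed_pause=5.0, max_wait=2.0):
    """Return how many seconds the robot pauses after delivering a joke.

    audio_samples is an iterable of normalized room-audio levels (0.0-1.0),
    one reading per sample_period seconds -- a stand-in for the live
    microphone sensing used in the performances.
    """
    if timing_mode == "bad":
        return fixed_pause          # always wait the full fixed pause
    # "appropriate" timing: hold while laughter lasts, continue when it subsides
    elapsed = 0.0
    laughing = False
    for level in audio_samples:
        elapsed += sample_period
        if level >= LAUGHTER_THRESHOLD:
            laughing = True         # laughter detected; keep holding
        elif laughing:
            return elapsed          # laughter has subsided; resume the set
        elif elapsed >= max_wait:
            return elapsed          # no laugh came; move on to the next joke
    return elapsed
```

The same loop also supports the adaptive "tags" from the second study: whether the pause ended because laughter subsided or because none came tells the robot which follow-up line to pick.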

The number of performances, given to audiences of 10 to 20, provided enough data to identify significant differences between distinct modes of robot comedy performance, and the research helped to answer key questions about comedic social interaction, Fitter said.

“Audience size, social context, cultural context, the microphone-holding human presence and the novelty of a robot comedian may have influenced crowd responses,” Fitter said. “The current software does not account for differences in laughter profiles, but future work can account for these differences using a baseline response measurement. The only sensing we used to evaluate joke success was audio readings. Future work might benefit from incorporating additional types of sensing.”

Still, the studies have key implications for artificial intelligence efforts to understand group responses to dynamic, entertaining social robots in real-world environments, she said.

“Also, possible advances in comedy from this work could include improved techniques for isolating and studying the effects of comedic techniques and better strategies to help comedians assess the success of a joke or routine,” she said. “The findings will guide our next steps toward giving autonomous social agents improved humor capabilities.”

The studies were published by the Association for Computing Machinery [ACM]/Institute of Electrical and Electronics Engineers’ [IEEE] International Conference on Human-Robot Interaction [HRI].

Here’s another link to the two studies published in a single paper, which were first presented at the 2020 International Conference on Human-Robot Interaction [HRI], along with a citation for the title of the published presentation,

Comedians in Cafes Getting Data: Evaluating Timing and Adaptivity in Real-World Robot Comedy Performance by John Vilk and Naomi T Fitter. HRI ’20: Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, March 2020, Pages 223–231, DOI: https://doi.org/10.1145/3319502.3374780

The paper is open access and the researchers have embedded an mp4 file which includes parts of the performances. Enjoy!

The Broad Institute gives us another reason to love CRISPR

More and more, this resembles a public relations campaign. First, CRISPR (clustered regularly interspaced short palindromic repeats) gene editing is going to be helpful with COVID-19 and now it can help us to deal with conservation issues. (See my May 26, 2020 posting about the latest CRISPR doings as of May 7, 2020; included is a brief description of the patent dispute between Broad Institute and UC Berkeley and musings about a public relations campaign.)

A May 21, 2020 news item on ScienceDaily announces how CRISPR could be useful for conservation,

The gene-editing technology CRISPR has been used for a variety of agricultural and public health purposes — from growing disease-resistant crops to, more recently, a diagnostic test for the virus that causes COVID-19. Now a study involving fish that look nearly identical to the endangered Delta smelt finds that CRISPR can be a conservation and resource management tool, as well. The researchers think its ability to rapidly detect and differentiate among species could revolutionize environmental monitoring.

Caption: Longfin smelt can be difficult to differentiate from endangered Delta smelt. Here, a longfin smelt is swabbed for genetic identification through a CRISPR tool called SHERLOCK. Credit: Alisha Goodbla/UC Davis

A May 21, 2020 University of California at Davis (UC Davis) news release (also on EurekAlert) by Kat Kerlin, which originated the news item, provides more detail (Note: A link has been removed),

The study, published in the journal Molecular Ecology Resources, was led by scientists at the University of California, Davis, and the California Department of Water Resources in collaboration with MIT Broad Institute [emphasis mine].

As a proof of concept, it found that the CRISPR-based detection platform SHERLOCK (Specific High-sensitivity Enzymatic Reporter Unlocking) [emphasis mine] was able to genetically distinguish threatened fish species from similar-looking nonnative species in nearly real time, with no need to extract DNA.

“CRISPR can do a lot more than edit genomes,” said co-author Andrea Schreier, an adjunct assistant professor in the UC Davis animal science department. “It can be used for some really cool ecological applications, and we’re just now exploring that.”

WHEN GETTING IT WRONG IS A BIG DEAL

The scientists focused on three fish species of management concern in the San Francisco Estuary: the U.S. threatened and California endangered Delta smelt, the California threatened longfin smelt and the nonnative wakasagi. These three species are notoriously difficult to visually identify, particularly in their younger stages.

Hundreds of thousands of Delta smelt once lived in the Sacramento-San Joaquin Delta before the population crashed in the 1980s. Only a few thousand are estimated to remain in the wild.

“When you’re trying to identify an endangered species, getting it wrong is a big deal,” said lead author Melinda Baerwald, a project scientist at UC Davis at the time the study was conceived and currently an environmental program manager with California Department of Water Resources.

For example, state and federal water pumping projects have to reduce water exports if enough endangered species, like Delta smelt or winter-run chinook salmon, get sucked into the pumps. Rapid identification makes real-time decision making about water operations feasible.

FROM HOURS TO MINUTES

Typically to accurately identify the species, researchers rub a swab over the fish to collect a mucus sample or take a fin clip for a tissue sample. Then they drive or ship it to a lab for a genetic identification test and await the results. Not counting travel time, that can take, at best, about four hours.

SHERLOCK shortens this process from hours to minutes. Researchers can identify the species within about 20 minutes, at remote locations, noninvasively, with no specialized lab equipment. Instead, they use either a handheld fluorescence reader or a flow strip that works much like a pregnancy test — a band on the strip shows if the target species is present.
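As a rough illustration of how a fluorescence reading becomes a yes/no species call, here is a hypothetical sketch; the fold-change cutoff is my assumption and does not come from the paper:

```python
def call_species(sample_fluorescence, no_template_control, fold_change=3.0):
    """Return True if the target species' genetic signature is judged present.

    Mimics the band-present/band-absent readout of the flow strip: a sample
    is called positive when its fluorescence exceeds the no-template control
    reading by at least `fold_change`. Both inputs are raw reader units.
    """
    if no_template_control <= 0:
        raise ValueError("control reading must be positive")
    return sample_fluorescence / no_template_control >= fold_change
```

A flow strip collapses the same decision into a visible band; a handheld fluorescence reader just makes the threshold explicit.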

“Anyone working anywhere could use this tool to quickly come up with a species identification,” Schreier said.

OTHER CRYPTIC CRITTERS

While the three fish species were the only animals tested for this study, the researchers expect the method could be used for other species, though more research is needed to confirm. If so, this sort of onsite, real-time capability may be useful for confirming species at crime scenes, in the animal trade at border crossings, for monitoring poaching, and for other animal and human health applications.

“There are a lot of cryptic species we can’t accurately identify with our naked eye,” Baerwald said. “Our partners at MIT are really interested in pathogen detection for humans. We’re interested in pathogen detection for animals as well as using the tool for other conservation issues.”

Here’s a link to and a citation for the paper,

Rapid and accurate species identification for ecological studies and monitoring using CRISPR‐based SHERLOCK by Melinda R. Baerwald, Alisha M. Goodbla, Raman P. Nagarajan, Jonathan S. Gootenberg, Omar O. Abudayyeh, Feng Zhang, Andrea D. Schreier. Molecular Ecology Resources, First published: 12 May 2020, DOI: https://doi.org/10.1111/1755-0998.13186

This paper is behind a paywall.

The business of CRISPR

SHERLOCK™ is a trademark for what Sherlock Biosciences calls one of its engineering biology platforms. From the Sherlock Biosciences Technology webpage,

What is SHERLOCK™?

SHERLOCK is an evolution of CRISPR technology, which others use to make precise edits in genetic code. SHERLOCK can detect the unique genetic fingerprints of virtually any DNA or RNA sequence in any organism or pathogen. Developed by our founders and licensed exclusively from the Broad Institute, SHERLOCK is a method for single molecule detection of nucleic acid targets and stands for Specific High Sensitivity Enzymatic Reporter unLOCKing. It works by amplifying genetic sequences and programming a CRISPR molecule to detect the presence of a specific genetic signature in a sample, which can also be quantified. When it finds those signatures, the CRISPR enzyme is activated and releases a robust signal. This signal can be adapted to work on a simple paper strip test, in laboratory equipment, or to provide an electrochemical readout that can be read with a mobile phone.

However, things get a little more confusing when you look at the Broad Institute’s Developing Diagnostics and Treatments webpage,

Ensuring the SHERLOCK diagnostic platform is easily accessible, especially in the developing world, where the need for inexpensive, reliable, field-based diagnostics is the most urgent

SHERLOCK (Specific High-sensitivity Enzymatic Reporter unLOCKing) is a CRISPR-based diagnostic tool that is rapid, inexpensive, and highly sensitive, with the potential to have a transformative effect on research and global public health. The SHERLOCK platform can detect viruses, bacteria, or other targets in clinical samples such as urine or blood, and reveal results on a paper strip — without the need for extensive specialized equipment. This technology could potentially be used to aid the response to infectious disease outbreaks, monitor antibiotic resistance, detect cancer, and more. SHERLOCK tools are freely available [emphasis mine] for academic research worldwide, and the Broad Institute’s licensing framework [emphasis mine] ensures that the SHERLOCK diagnostic platform is easily accessible in the developing world, where inexpensive, reliable, field-based diagnostics are urgently needed.

Here’s what I suspect: as stated, the Broad Institute has free SHERLOCK licenses for academic institutions and not-for-profit organizations, but Sherlock Biosciences, a Broad Institute spinoff company, is for-profit and has trademarked SHERLOCK for commercial purposes.

Final thoughts

This looks like a relatively subtle campaign to influence public perceptions. Genetic modification or genetic engineering as exemplified by the CRISPR gene editing technique is a force for the good of all. It will help us in our hour of need (COVID-19 pandemic) and it can help us save various species and better manage our resources.

This contrasts greatly with the publicity generated by the CRISPR twins situation where a scientist claimed to have successfully edited the germline for twins, Lulu and Nana. This was done despite a voluntary, worldwide moratorium on germline editing of viable embryos. (Search the terms [either here or on a standard search engine] ‘CRISPR twins’, ‘Lulu and Nana’, and/or ‘He Jiankui’ for details about the scandal.)

In addition to presenting CRISPR as beneficial in the short term rather than the distant future, this publicity also subtly positions the Broad Institute as CRISPR’s owner.

Or, maybe I’m wrong. Regardless, I’m watching.

US Food and Drug Administration (FDA) gives first authorization for CRISPR (clustered regularly interspaced short palindromic repeats) use in COVID-19 crisis

Clustered regularly interspaced short palindromic repeats (CRISPR) gene editing has been largely confined to laboratory use or tested in agricultural trials. I believe that is true worldwide excepting the CRISPR twin scandal. (There are numerous postings about the CRISPR twins here including a Nov. 28, 2018 post, a May 17, 2019 post, and a June 20, 2019 post. Update: It was reported (3rd para.) in December 2019 that He had been sentenced to three years’ jail time.)

Connie Lin in a May 7, 2020 article for Fast Company reports on this surprising decision by the US Food and Drug Administration (FDA) (Note: A link has been removed),

The U.S. Food and Drug Administration has granted Emergency Use Authorization to a COVID-19 test that uses controversial gene-editing technology CRISPR.

This marks the first time CRISPR has been authorized by the FDA, although only for the purpose of detecting the coronavirus, and not for its far more contentious applications. The new test kit, developed by Cambridge, Massachusetts-based Sherlock Biosciences, will be deployed in laboratories certified to carry out high-complexity procedures and is “rapid,” returning results in about an hour as opposed to those that rely on the standard polymerase chain reaction method, which typically requires six hours.

The announcement was made in the FDA’s Coronavirus (COVID-19) Update: May 7, 2020 Daily Roundup (4th item in the bulleted list). Or, you can read the May 6, 2020 letter (PDF) sent to John Vozella of Sherlock Biosciences by the FDA.

As well, there’s the May 7, 2020 Sherlock BioSciences news release (the most informative of the lot),

Sherlock Biosciences, an Engineering Biology company dedicated to making diagnostic testing better, faster and more affordable, today announced the company has received Emergency Use Authorization (EUA) from the U.S. Food and Drug Administration (FDA) for its Sherlock™ CRISPR SARS-CoV-2 kit for the detection of the virus that causes COVID-19, providing results in approximately one hour.

“While it has only been a little over a year since the launch of Sherlock Biosciences, today we have made history with the very first FDA-authorized use of CRISPR technology, which will be used to rapidly identify the virus that causes COVID-19,” said Rahul Dhanda, co-founder, president and CEO of Sherlock Biosciences. “We are committed to providing this initial wave of testing kits to physicians, laboratory experts and researchers worldwide to enable them to assist frontline workers leading the charge against this pandemic.”

The Sherlock™ CRISPR SARS-CoV-2 test kit is designed for use in laboratories certified under the Clinical Laboratory Improvement Amendments of 1988 (CLIA), 42 U.S.C. §263a, to perform high complexity tests. Based on the SHERLOCK method, which stands for Specific High-sensitivity Enzymatic Reporter unLOCKing, the kit works by programming a CRISPR molecule to detect the presence of a specific genetic signature – in this case, the genetic signature for SARS-CoV-2 – in a nasal swab, nasopharyngeal swab, oropharyngeal swab or bronchoalveolar lavage (BAL) specimen. When the signature is found, the CRISPR enzyme is activated and releases a detectable signal. In addition to SHERLOCK, the company is also developing its INSPECTR™ platform to create an instrument-free, handheld test – similar to that of an at-home pregnancy test – that utilizes Sherlock Biosciences’ Synthetic Biology platform to provide rapid detection of a genetic match of the SARS-CoV-2 virus.

“When our lab collaborated with Dr. Feng Zhang’s team to develop SHERLOCK, we believed that this CRISPR-based diagnostic method would have a significant impact on global health,” said James J. Collins, co-founder and board member of Sherlock Biosciences and Termeer Professor of Medical Engineering and Science for MIT’s Institute for Medical Engineering and Science (IMES) and Department of Biological Engineering. “During what is a major healthcare crisis across the globe, we are heartened that the first FDA-authorized use of CRISPR will aid in the fight against this global COVID-19 pandemic.”

Access to rapid diagnostics is critical for combating this pandemic and is a primary focus for Sherlock Biosciences co-founder and board member, David R. Walt, Ph.D., who co-leads the Mass [Massachusetts] General Brigham Center for COVID Innovation.

“SHERLOCK enables rapid identification of a single alteration in a DNA or RNA sequence in a single molecule,” said Dr. Walt. “That precision, coupled with its capability to be deployed to multiplex over 100 targets or as a simple point-of-care system, will make it a critical addition to the arsenal of rapid diagnostics already being used to detect COVID-19.”

This development is particularly interesting since there was a major intellectual property dispute over CRISPR between the Broad Institute (a Harvard University and Massachusetts Institute of Technology [MIT] joint initiative) and the University of California at Berkeley (UC Berkeley). The Broad Institute mostly won in the first round of the patent fight, as I noted in a March 15, 2017 post, but, as far as I’m aware, UC Berkeley is still disputing that decision.

In the period before receiving authorization, it appears that Sherlock Biosciences was doing a little public relations and ‘consciousness raising’ work. Here’s a sample from a May 5, 2020 article by Sharon Begley for STAT (Note: Links have been removed),

The revolutionary genetic technique better known for its potential to cure thousands of inherited diseases could also solve the challenge of Covid-19 diagnostic testing, scientists announced on Tuesday. A team headed by biologist Feng Zhang of the McGovern Institute at MIT and the Broad Institute has repurposed the genome-editing tool CRISPR into a test able to quickly detect as few as 100 coronavirus particles in a swab or saliva sample.

Crucially, the technique, dubbed a “one pot” protocol, works in a single test tube and does not require the many specialty chemicals, or reagents, whose shortage has hampered the rollout of widespread Covid-19 testing in the U.S. It takes about an hour to get results, requires minimal handling, and in preliminary studies has been highly accurate, Zhang told STAT. He and his colleagues, led by the McGovern’s Jonathan Gootenberg and Omar Abudayyeh, released the protocol on their STOPCovid.science website.

Because the test has not been approved by the Food and Drug Administration, it is only for research purposes for now. But minutes before speaking to STAT on Monday, Zhang and his colleagues were on a conference call with FDA officials about what they needed to do to receive an “emergency use authorization” that would allow clinical use of the test. The FDA has used EUAs to fast-track Covid-19 diagnostics as well as experimental therapies, including remdesivir, after less extensive testing than usually required.

For an EUA, the agency will require the scientists to validate the test, which they call STOPCovid, on dozens to hundreds of samples. Although “it is still early in the process,” Zhang said, he and his colleagues are confident enough in its accuracy that they are conferring with potential commercial partners who could turn the test into a cartridge-like device, similar to a pregnancy test, enabling Covid-19 testing at doctor offices and other point-of-care sites.

“It could potentially even be used at home or at workplaces,” Zhang said. “It’s inexpensive, does not require a lab, and can return results within an hour using a paper strip, not unlike a pregnancy test. This helps address the urgent need for widespread, accurate, inexpensive, and accessible Covid-19 testing.” Public health experts say the availability of such a test is one of the keys to safely reopening society, which will require widespread testing, and then tracing and possibly isolating the contacts of those who test positive.

If you have time, do read Begley’s article in full.

Elder care robot being tested by Washington State University team

I imagine that at some point the Washington State University’s (WSU) ‘elder care’ robot will be tested by senior citizens as opposed to the students described in a January 14, 2019 WSU news release (also on EurekAlert) by Will Ferguson,

A robot created by Washington State University scientists could help elderly people with dementia and other limitations live independently in their own homes.

The Robot Activity Support System, or RAS, uses sensors embedded in a WSU smart home to determine where its residents are, what they are doing and when they need assistance with daily activities.

It navigates through rooms and around obstacles to find people on its own, provides video instructions on how to do simple tasks and can even lead its owner to objects like their medication or a snack in the kitchen.

“RAS combines the convenience of a mobile robot with the activity detection technology of a WSU smart home to provide assistance in the moment, as the need for help is detected,” said Bryan Minor, a postdoctoral researcher in the WSU School of Electrical Engineering and Computer Science.

Minor works in the lab of Diane Cook, professor of electrical engineering and computer science and director of the WSU Center for Advanced Studies in Adaptive Systems.

For the last decade, Cook and Maureen Schmitter-Edgecombe, a WSU professor of psychology, have led CASAS researchers in the development of smart home technologies that could enable elderly adults with memory problems and other impairments to live independently.

Currently, an estimated 50 percent of adults over the age of 85 need assistance with everyday activities such as preparing meals and taking medication, and the annual cost for this assistance in the US is nearly $2 trillion.

With the number of adults over 85 expected to triple by 2050, Cook and Schmitter-Edgecombe hope that technologies like RAS and the WSU smart home will alleviate some of the financial strain on the healthcare system by making it easier for older adults to live alone.

“Upwards of 90 percent of older adults prefer to age in place as opposed to moving into a nursing home,” Cook said. “We want to make it so that instead of bringing in a caregiver or sending these people to a nursing home, we can use technology to help them live independently on their own.”

RAS is the first robot CASAS researchers have tried to incorporate into their smart home environment. They recently published a study in the journal Cognitive Systems Research that demonstrates how RAS could make life easier for older adults struggling to live independently.

In the study, CASAS researchers recruited 26 undergraduate and graduate students [emphasis mine] to complete three activities in a smart home with RAS as an assistant.

The activities were getting ready to walk the dog, taking medication with food and water, and watering household plants.

When the smart home sensors detected a human failed to initiate or was struggling with one of the tasks, RAS received a message to help.

The robot then used its mapping and navigation camera, sensors and software to find the person and offer assistance.

The person could then indicate through a tablet interface that they wanted to see a video of the next step in the activity they were performing or a video of the entire activity, or they could ask the robot to lead them to objects needed to complete the activity, like the dog’s leash or a granola bar from the kitchen.
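The assistance sequence in the preceding paragraphs (sensors detect a struggle, the robot navigates to the person, the person picks a prompt) can be sketched as follows; every name here is hypothetical, and this is an illustration of the described workflow, not the actual RAS software:

```python
# Hypothetical sketch of one RAS assistance episode. The callables stand in
# for the smart-home sensors, the robot's navigation stack, and the tablet UI.
PROMPT_OPTIONS = ("next_step_video", "full_activity_video", "lead_to_object")

def assist(activity, detect_struggle, navigate_to_person, show_tablet_menu):
    """Run one assistance episode; return the chosen prompt, or None if unneeded."""
    if not detect_struggle(activity):
        return None                      # resident initiated the task and is doing fine
    navigate_to_person()                 # find the person, avoiding obstacles
    choice = show_tablet_menu(PROMPT_OPTIONS)
    if choice not in PROMPT_OPTIONS:
        raise ValueError(f"unknown prompt choice: {choice!r}")
    return choice
```

Keeping the sensing, navigation, and interface behind plain callables mirrors how the smart home, not the robot, decides when help is needed.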

Afterwards, the study participants were asked to rate the robot’s performance. Most of the participants rated RAS’ performance favorably and found the robot’s tablet interface easy to use. They also reported the next-step video as being the most useful of the prompts.

“While we are still in an early stage of development, our initial results with RAS have been promising,” Minor said. “The next step in the research will be to test RAS’ performance with a group of older adults to get a better idea of what prompts, video reminders and other preferences they have regarding the robot.”

Here’s a link to and a citation for the paper,

Robot-enabled support of daily activities in smart home environment by Garrett Wilson, Christopher Pereyda, Nisha Raghunath, Gabriel de la Cruz, Shivam Goel, Sepehr Nesaei, Bryan Minor, Maureen Schmitter-Edgecombe, Matthew E. Taylor, Diane J. Cook. Cognitive Systems Research, Volume 54, May 2019, Pages 258-272, DOI: https://doi.org/10.1016/j.cogsys.2018.10.032

This paper is behind a paywall.

Other ‘caring’ robots

Dutch filmmaker, Sander Burger, directed a documentary about ‘caredroids’ for seniors titled ‘Alice Cares’ or ‘Ik ben Alice’ in Dutch. It premiered at the 2015 Vancouver (Canada) International Film Festival and was featured in a January 22, 2015 article by Neil Young for the Hollywood Reporter,


The benign side of artificial intelligence enjoys a rare cinematic showcase in Sander Burger’s Alice Cares (Ik ben Alice), a small-scale Dutch documentary that reinvents no wheels but proves as unassumingly delightful as its eponymous, diminutive “care-robot.” Touching lightly on social and technological themes that are increasingly relevant to nearly all industrialized societies, this quiet charmer bowed at Rotterdam ahead of its local release and deserves wider exposure via festivals and small-screen outlets.

… Developed by the US firm Hanson Robotics, “Alice”—has the stature and face of a girl of eight, but an adult female’s voice—is primarily intended to provide company for lonely seniors.

Burger shows Alice “visiting” the apartments of three octogenarian Dutch ladies, the contraption overcoming their hosts’ initial wariness and quickly forming chatty bonds. This prototype “care-droid” represents the technology at a relatively early stage, with Alice unable to move anything apart from her head, eyes (which incorporate tiny cameras) and mouth. Her body is made much more obviously robotic in appearance than the face, to minimize the chances of her interlocutors mistaking her for an actual human. Such design-touches are discussed by Alice’s programmer in meetings with social-workers, which Burger and his editor Manuel Rombley intersperse between the domestic exchanges that provide the bulk of the running-time.

‘Alice’ was also featured in the Lancet’s (a general medical journal) July 18, 2015 article by Natalie Harrison,

“I’m going to ask you some questions about your life. Do you live independently? Are you lonely?” If you close your eyes and start listening to the film Alice Cares, you would think you were overhearing a routine conversation between an older woman and a health-care worker. It’s only when the woman, Martha Remkes, ends the conversation with “I don’t feel like having a robot in my home, I prefer a human being” that you realise something is amiss. In the Dutch documentary Alice Cares, Alice Robokind, a prototype caredroid developed in a laboratory in Amsterdam, is sent to live with three women who require care and company, with rather surprising results.

Although the idea of health robots has been around for a couple of decades, research into the use of robots with older adults is a fairly new area. Alex Mihailidis, from the Intelligent Assistive Technology and Systems Lab [University of Toronto] in Toronto, ON, Canada, explains: “For carers, robots have been used as tools that can help to alleviate burden typically associated with providing continuous care”. He adds that “as robots become more viable and are able to perform common physical tasks, they can be very valuable in helping caregivers complete common tasks such as moving a person in and out of bed”. Although Japan and Korea are regarded as the world leaders in this research, the European Union and the USA are also making progress. At the Edinburgh Centre for Robotics, for example, researchers are working to develop more complex sensor and navigation technology for robots that work alongside people and on assisted living prosthetics technologies. This research is part of a collaboration between the University of Edinburgh and Heriot-Watt University that was awarded £6 million in funding as part of a wider £85 million investment into industrial technology in the UK Government’s Eight Great Technologies initiative. Robotics research is clearly flourishing and the global market for service and industrial robots is estimated to reach almost US$60 billion by 2020.

The idea for Alice Cares came to director Sander Burger after he read about a group of scientists at the VU University of Amsterdam in the Netherlands who were about to test a health-care robot on older people. “The first thing I felt was some resentment against the idea—I was curious why I was so offended by the whole idea and just called the scientists to see if I could come by to see what they were doing. …

… With software to generate and regulate Alice’s emotions, an artificial moral reasoner, a computational model of creativity, and full access to the internet, the investigators hoped to create a robotic care provider that was intelligent, sensitive, creative, and entertaining. “The robot was specially developed for social skills, in short, she was programmed to make the elderly women feel less lonely”, explains Burger.

Copyright © 2015 Alice Cares KeyDocs

Both the Young and Harrison articles are well worth reading, should you have the time. Also, there’s an Ik ben Alice website (it’s in Dutch only).

Meanwhile, Canadians can look to Humber River Hospital (HRH; Toronto, Ontario) for a glimpse of another humanoid ‘carebot’, from a July 25, 2018 HRH Foundation blog entry,

Earlier this year, a special new caregiver joined the Child Life team at the Humber River Hospital. Pepper, the humanoid robot, helps our Child Life Specialists decrease patient anxiety, increase their comfort and educate young patients and their families. Pepper embodies perfectly the intersection of compassion and advanced technology for which Humber River is renowned.

Humber River Hospital is committed to making the hospital experience a better one for our patients and their families from the moment they arrive, and Pepper the robot helps us do that! Pepper is child-sized with large, expressive eyes and a sweet voice. It greets visitors, provides directions, plays games, does yoga and even dances. Using facial recognition to detect human emotions, it adapts its behaviour according to the mood of the person with whom it’s interacting. Pepper makes the Hospital an even more welcoming place for everyone it encounters.

Humber currently has two Peppers on staff: one used exclusively by the Child Life Program to help young patients feel at ease, and a second to greet patients and their families at the Hospital’s main entrance.

While Pepper robots are used around the world in such industries as retail and hospitality, Humber River is the first hospital in Canada to use Pepper in a healthcare setting. Using dedicated applications built specifically for the Hospital, Pepper’s interactive touch-screen display helps visitors find specific departments, washrooms, exits and more. In addition to answering questions and sharing information, Pepper entertains, plays games and is always available for a selfie.

I’m guessing that they had a ‘soft’ launch for Pepper, because there’s an Oct. 25, 2018 HRH news release announcing Pepper’s deployment,

Pepper® can greet visitors, provide directions, play games, do yoga and even dance

Humber River Hospital has joined forces with SoftBank Robotics America (SBRA) to launch a new pilot program with Pepper the humanoid robot.  Beginning this week, Pepper will greet, help guide, engage and entertain patients and visitors who enter the hospital’s main entrance hall.

“While the healthcare sector has talked about this technology for some time now, we are ambitious and confident at Humber River Hospital to make the move and become the first hospital in Canada to pilot this technology,” states Barbara Collins, President and CEO, Humber River Hospital.

Pepper by the numbers:
Stands 1.2 m (4ft) tall and weighs 29 kg (62lb)
Features three cameras – two HD cameras and one 3D depth sensor – to “see” and interact with people
20 engines in Pepper’s head, arms and back control its precise movements
A 10-inch chest-mounted touchscreen tablet that Pepper uses to convey information and encourage input

Finally, there’s a 2012 movie, Robot & Frank (mentioned here before in this Oct. 13, 2017 posting; scroll down to Robots and pop culture subsection) which provides an intriguing example of how ‘carebots’ might present unexpected ethical challenges. Hint: Frank is a senior citizen and former jewel thief who decides to pass on some skills.

Final thoughts

It’s fascinating to me that every time I’ve looked at articles about robots taking on tasks usually performed by humans, some expert or other sweetly notes that robots will be used to help humans with tasks that are ‘boring’ or ‘physical’, with the implication that humans will focus on more rewarding work. From Harrison’s Lancet article (in a previous excerpt),

… Alex Mihailidis, from the Intelligent Assistive Technology and Systems Lab in Toronto, ON, Canada, explains: “For carers, robots have been used as tools that can help to alleviate burden typically associated with providing continuous care”. He adds that “as robots become more viable and are able to perform common physical tasks, they can be very valuable in helping caregivers …

For all the emphasis on robots taking over burdensome physical tasks, Burger’s documentary makes it clear that these early versions are being used primarily to provide companionship. Yes, HRH’s Pepper® is taking over some repetitive tasks, such as giving directions, but it’s also playing games and providing companionship.

As for what it will mean ultimately, that’s something we, as a society, need to consider.

S.NET (Society for the Study of New and Emerging Technologies) 2019 conference in Quito, Ecuador: call for abstracts

Why isn’t the S.NET abbreviation SSNET? That’s what it should be, given the organization’s full name: Society for the Study of New and Emerging Technologies. S.NET smacks of a compromise or consensus decision of some kind. Also, the ‘New’ in its name was ‘Nanoscience’ at one time (see my Oct. 22, 2013 posting).

Now onto 2019 and the conference, which, for the first time ever, is being held in Latin America. Here’s more from a February 4, 2019 S.NET email about the call for abstracts,

2019 Annual S.NET Meeting
Contrasting Visions of Technological Change

The 11th Annual S.NET meeting will take place November 18-20, 2019, at the Latin American Faculty of Social Sciences in Quito, Ecuador.

This year’s meeting will provide rich opportunities to reflect on technological change by establishing a dialogue between contrasting visions on how technology becomes closely intertwined with social orders.  We aim to open the black box of technological change by exploring the sociotechnical agreements that help to explain why societies follow certain technological trajectories. Contributors are invited to explore the ramifications of technological change, reflect on the policy process of technology, and debate whether or why technological innovation is a matter for democracy.

Following the transnational nature of S.NET, the meeting will highlight the diverse geographical and cultural approaches to technological innovation, the forces driving sociotechnical change, and social innovation.  It is of paramount importance to question the role of technology in the shaping of society and the outcomes of these configurations.  What happens when these arrangements come into being, are transformed or fall apart?  Does technology create contestation?  Why and how should we engage with contested visions of technology change?

This is the first time that the S.NET Meeting will take place in Latin America and we encourage panels and presentations with contrasting voices from both the Global North and the Global South. 

Topics of interest include, but are not limited to:

Sociotechnical imaginaries of innovation
The role of technology on shaping nationhood and nation identities
Decision-making processes on science and technology public policies
Co-creation approaches to promote public innovation
Grassroots innovation, sustainability and democracy
Visions and cultural imaginaries
Role of social sciences and humanities in processes of technological change
In addition, we welcome contributions on:
Research dynamics and organization
Innovation and use
Governance and regulation
Politics and ethics
Roles of publics and stakeholders

Keynote Speakers
TBA (check the conference website for updates!)

Deadlines & Submission Instructions
The program committee invites contributions from scholars, technology developers and practitioners, and welcomes presentations from a range of disciplines spanning the humanities, social and natural sciences.  We invite individual paper submissions, open panel and closed session proposals, student posters, and special format sessions, including events that are innovative in form and content.

The deadline for abstract submissions is *April 18, 2019* [extended to May 12, 2019].  Abstracts should be approximately 250 words in length, emailed in PDF format to 2019snet@gmail.com.  Notifications of acceptance can be expected by May 30, 2019.

Junior scholars and those with limited resources are strongly encouraged to apply, as the organizing committee is actively investigating potential sources of financial support.

Details on the conference can be found here: https://www.flacso.edu.ec/snet2019/

Local Organizing Committee
María Belén Albornoz, Isarelis Pérez, Javier Jiménez, Mónica Bustamante, Jorge Núñez, Maka Suárez.

Venue
FLACSO Ecuador is located in the heart of Quito.  Most hotels, museums, shopping centers and other cultural hotspots in the city are located near the campus and are easily accessible by public or private transportation.  Due to this proximity and easy access, meeting participants will be able to enjoy Quito’s rich cultural life during their stay.

About S.NET
S.NET is an international association that promotes intellectual exchange and critical inquiry about the advancement of new and emerging technologies in society.  The aim of the association is to advance critical reflection from various perspectives on developments in a broad range of new and emerging fields, including, but not limited to, nanoscale science and engineering, biotechnology, synthetic biology, cognitive science, ICT and Big Data, and geo-engineering.  Current S.NET board members are: Michael Bennett (President), Maria Belen Albornoz, Claire Shelley-Egan, Ana Delgado, Ana Viseu, Nora Vaage, Chris Toumey, Poonam Pandey, Sylvester Johnson, Lotte Krabbenborg, and Maria Joao Ferreira Maia.

Don’t forget, the deadline for your abstract is *April 18, 2019* [extended to May 12, 2019].

For anyone curious about what Quito might look like, there’s this from Quito’s Wikipedia entry,

Clockwise from top: Calle La Ronda, Iglesia de la Compañía de Jesús, El Panecillo as seen from Northern Quito, Carondelet Palace, Central-Northern Quito, Parque La Carolina and Iglesia y Monasterio de San Francisco. Credit: montage of landmarks of the City of Quito, Ecuador, assembled by various authors from files found in Wikimedia Commons. CC BY-SA 3.0

Good luck to everyone submitting an abstract.

*Date for abstract submissions changed from April 18, 2019 to May 12, 2019 on April 24, 2019