Category Archives: robots

Science events (Einstein, getting research to patients, sleep, and art/science) in Vancouver (Canada), Jan. 23 – 28, 2016

There are five upcoming science events in seven days (Jan. 23 – 28, 2016) in the Vancouver area.

Einstein Centenary Series

The first is a Saturday morning lecture on Jan. 23, 2016, the first of 2016 in a joint TRIUMF (Canada’s national laboratory for particle and nuclear physics), UBC (University of British Columbia), and SFU (Simon Fraser University) series featuring Einstein’s work and its implications. From the event brochure (pdf), which lists the entire series,

TRIUMF, UBC and SFU are proud to present the 2015-2016 Saturday morning lecture series on the frontiers of modern physics. These free lectures are a level appropriate for high school students and members of the general public.

Parallel lecture series will be held at TRIUMF on the UBC South Campus, and at SFU Surrey Campus.

Lectures start at 10:00 am and 11:10 am. Parking is available.

For information, registration and directions, see:
http://www.triumf.ca/saturday-lectures

January 23, 2016 TRIUMF Auditorium (UBC, Vancouver)
1. General Relativity – the theory (Jonathan Kozaczuk, TRIUMF)
2. Einstein and Light: stimulated emission, photoelectric effect and quantum theory (Mark Van Raamsdonk, UBC)

January 30, 2016 SFU Surrey Room 2740 (SFU, Surrey Campus)

1. General Relativity – the theory (Jonathan Kozaczuk, TRIUMF)
2. Einstein and Light: stimulated emission, photoelectric effect and quantum theory (Mark Van Raamsdonk, UBC)

These lectures are free. One more note: the series will be capped off with a special lecture by Kip Thorne (astrophysicist and consultant for the movie Interstellar) at Science World on Thursday, April 14, 2016. More about that closer to the date.

Café Scientifique

On Tuesday, January 26, 2016 at 7:30 pm in the back room of The Railway Club (2nd floor of 579 Dunsmuir St. [at Seymour St.]), Café Scientifique will be hosting a talk about science and serving patients (from the Jan. 5, 2016 announcement),

Our speakers for the evening will be Dr. Millan Patel and Dr. Shirin Kalyan.  The title of their talk is:

Helping Science to Serve Patients

Science in general and biotechnology in particular are auto-catalytic. That is, they catalyze their own evolution and so generate breakthroughs at an exponentially increasing rate.  The experience of patients is not exponentially getting better, however.  This talk, with a medical geneticist and an immunologist who believe science can deliver far more for patients, will focus on structural and cultural impediments in our system and ways they and others have developed to either lower or leapfrog the barriers. We hope to engage the audience in a highly interactive discussion to share thoughts and perspectives on this important issue.

There is additional information about Dr. Millan Patel here and Dr. Shirin Kalyan here. It would appear both speakers are researchers and academics. While I welcome the emphasis on the patient and the acknowledgement that medical research benefits are not being delivered in quantity or quality to patients, it seems odd that they don’t have a clinician (a doctor who deals almost exclusively with patients, as opposed to two researchers) to add to their perspective.

You may want to take a look at my Jan. 22, 2016 ‘open science’ and Montreal Neurological Institute posting for a look at how researchers there are responding to the issue.

Curiosity Collider

This is an art/science event from an organization that sprang into existence sometime during summer 2015 (my July 7, 2015 posting featuring Curiosity Collider).

When: 8:00pm on Wednesday, January 27, 2016. Door opens at 7:30pm.
Where: Café Deux Soleils. 2096 Commercial Drive, Vancouver, BC (Google Map).
Cost: $5.00 cover (sliding scale) at the door. Proceeds will be used to cover the cost of running this event, and to fund future Curiosity Collider events.

Part I. Speakers

Part II. Open Mic

  • 90 seconds to share your art-science ideas. Think they are “ridiculous”? Well, we think it could be ridiculously awesome – we are looking for creative ideas!
  • Don’t have an idea (yet)? Contribute by sharing your expertise.
  • Chat with other art-science enthusiasts, strike up a conversation to collaborate, all disciplines/backgrounds welcome.
  • Want to showcase your project in the future? Participate in our fall art-science competition (more to come)!

Follow updates on twitter via @ccollider or #CollideConquer

Good luck on the open mic (should you have a project)!

Brain Talks

This particular Brain Talk event is taking place at Vancouver General Hospital (VGH; there is also another Brain Talks series which takes place at the University of British Columbia). Yes, members of the public can attend the VGH version; they didn’t throw me out the last time I was there. Here’s more about the next VGH Brain Talks,

Sleep: biological & pathological perspectives

Thursday, Jan 28, 6:00pm @ Paetzold Auditorium, Vancouver General Hospital

Speakers:

Peter Hamilton, Sleep technician ~ Sleep Architecture

Dr. Robert Comey, MD ~ Sleep Disorders

Dr. Maia Love, MD ~ Circadian Rhythms

Panel discussion and wine and cheese reception to follow!

Please RSVP here

You may want to keep in mind that the event is organized by people who don’t organize events often. Nice people but you may need to search for crackers for your cheese and your wine comes out of a box (and I think it might have been self-serve the time I attended).

What a fabulous week we have ahead of us—Happy Weekend!

Exceeding the sensitivity of skin with a graphene elastomer

A Jan. 14, 2016 news item on Nanowerk announces the latest in ‘sensitive’ skin,

A new sponge-like material, discovered by Monash [Monash University in Australia] researchers, could have diverse and valuable real-life applications. The new elastomer could be used to create soft, tactile robots to help care for elderly people, perform remote surgical procedures or build highly sensitive prosthetic hands.

Graphene-based cellular elastomer, or G-elastomer, is highly sensitive to pressure and vibrations. Unlike other viscoelastic substances such as polyurethane foam or rubber, G-elastomer bounces back extremely quickly under pressure, despite its exceptionally soft nature. This unique, dynamic response has never been found in existing soft materials, and has excited and intrigued researchers Professor Dan Li and Dr Ling Qiu from the Monash Centre for Atomically Thin Materials (MCATM).

A Jan. 14, 2016 Monash University media release, which originated the news item, offers some insights from the researchers,

According to Dr Qiu, “This graphene elastomer is a flexible, ultra-light material which can detect pressures and vibrations across a broad bandwidth of frequencies. It far exceeds the response range of our skin, and it also has a very fast response time, much faster than conventional polymer elastomer.

“Although we often take it for granted, the pressure sensors in our skin allow us to do things like hold a cup without dropping it, crushing it, or spilling the contents. The sensitivity and response time of G-elastomer could allow a prosthetic hand or a robot to be even more dexterous than a human, while the flexibility could allow us to create next generation flexible electronic devices,” he said.

Professor Li, a director of MCATM, said, ‘Although we are still in the early stages of discovering graphene’s potential, this research is an excellent breakthrough. What we do know is that graphene could have a huge impact on Australia’s economy, both from a resources and innovation perspective, and we’re aiming to be at the forefront of that research and development.’

Dr Qiu’s research has been published in the latest edition of the prestigious journal Advanced Materials and is protected by a suite of patents.

Are they trying to protect the work from competition or from wholesale theft?

After all, the idea behind patents and copyrights was to encourage innovation and competition by ensuring that inventors and creators would benefit from their work. An example that comes to mind is the Xerox company which for many years had a monopoly on photocopy machines by virtue of their patent. Once the patent ran out (patents and copyrights were originally intended to be in place for finite time periods) and Xerox had made much, much money, competitors were free to create and market their own photocopy machines, which they did quite promptly. Since those days, companies have worked to extend patent and copyright time periods in efforts to stifle competition.

Getting back to Monash, I do hope the researchers are able to benefit from their work and wish them well. I also hope that they enjoy plenty of healthy competition spurring them onto greater innovation.

Here’s a link to and a citation for their paper,

Ultrafast Dynamic Piezoresistive Response of Graphene-Based Cellular Elastomers by Ling Qiu, M. Bulut Coskun, Yue Tang, Jefferson Z. Liu, Tuncay Alan, Jie Ding, Van-Tan Truong, and Dan Li. Advanced Materials, Volume 28, Issue 1, January 6, 2016, Pages 194–200. DOI: 10.1002/adma.201503957. First published: 2 November 2015.

This paper appears to be open access.

Spermbot alternative for infertility issues

A German team that’s been working with sperm to develop a biological motor has announced it may have an alternative treatment for infertility, according to a Jan. 13, 2016 news item on Nanowerk,

Sperm that don’t swim well [also known as low motility] rank high among the main causes of infertility. To give these cells a boost, women trying to conceive can turn to artificial insemination or other assisted reproduction techniques, but success can be elusive. In an attempt to improve these odds, scientists have developed motorized “spermbots” that can deliver poor swimmers — that are otherwise healthy — to an egg. …

A Jan. 13, 2016 American Chemical Society (ACS) news release (*also on EurekAlert*), which originated the news item, expands on the theme,

Artificial insemination is a relatively inexpensive and simple technique that involves introducing sperm to a woman’s uterus with a medical instrument. Overall, the success rate is on average under 30 percent, according to the Human Fertilisation & Embryology Authority of the United Kingdom. In vitro fertilization can be more effective, but it’s a complicated and expensive process. It requires removing eggs from a woman’s ovaries with a needle, fertilizing them outside the body and then transferring the embryos to her uterus or a surrogate’s a few days later. Each step comes with a risk for failure. Mariana Medina-Sánchez, Lukas Schwarz, Oliver G. Schmidt and colleagues from the Institute for Integrative Nanosciences at IFW Dresden in Germany wanted to see if they could come up with a better option than the existing methods.

Building on previous work on micromotors, the researchers constructed tiny metal helices just large enough to fit around the tail of a sperm. Their movements can be controlled by a rotating magnetic field. Lab testing showed that the motors can be directed to slip around a sperm cell, drive it to an egg for potential fertilization and then release it. The researchers say that although much more work needs to be done before their technique can reach clinical testing, the success of their initial demonstration is a promising start.
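As a rough, back-of-the-envelope illustration of why a rotating field translates into forward motion (this sketch is mine, not the researchers’): a rigid magnetic helix spinning in sync with the field advances roughly one pitch per turn, heavily reduced by viscous slip at the micro scale. All parameter values below are illustrative guesses, not figures from the paper.

```python
def helix_speed_um_s(field_freq_hz, pitch_um=5.0, slip=0.5):
    """Rough forward speed of a magnetically rotated microhelix.

    Below its step-out frequency the helix spins in sync with the
    rotating field; each full turn advances it by about one pitch,
    cut down by viscous slip (slip=0 would be a screw in a solid;
    microswimmers in water slip substantially). The pitch and slip
    values here are illustrative, not measured.
    """
    return field_freq_hz * pitch_um * (1.0 - slip)

# Doubling the field's rotation frequency doubles the forward speed,
# which is the sense in which the helix can be "driven" to an egg.
slow = helix_speed_um_s(5.0)    # 12.5 um/s with these made-up numbers
fast = helix_speed_um_s(10.0)   # 25.0 um/s
```

The point of the sketch is only that speed scales linearly with the field’s rotation rate until the helix can no longer keep up with the field.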

For those who prefer to watch their news, there’s this,


This team got a flurry of interest in 2014 when they first announced their research on using sperm as a biological motor. Tracy Staedter in a Jan. 15, 2014 article for Discovery.com describes their then results,

To create these tiny robots, the scientists first had to catch a few. First, they designed microtubes, which are essentially thin sheets of titanium and iron — which have a magnetic property — rolled into conical tubes, with one end wider than the other. Next, they put the microtubes into a solution in a Petri dish and added bovine sperm cells, which are similar in size to human sperm. When a live sperm entered the wider end of the tube, it became trapped down near the narrow end. The scientists also closed the wider end, so the sperm wouldn’t swim out. And because sperm are so determined, the trapped cell pushed against the tube, moving it forward.

Next, the scientists used a magnetic field to guide the tube in the direction they wanted it to go, relying on the sperm for the propulsion.

The quick-swimming spermbots could be controlled from outside a person’s body to deliver payloads of drugs and even sperm itself to parts of the body where they’re needed, whether that’s a cancer tumor or an egg.

This work isn’t nanotechnology per se but it has been published in ACS Nano Letters. Here’s a link to and a citation for the paper,

Cellular Cargo Delivery: Toward Assisted Fertilization by Sperm-Carrying Micromotors by Mariana Medina-Sánchez, Lukas Schwarz, Anne K. Meyer, Franziska Hebenstreit, and Oliver G. Schmidt. Nano Lett., 2016, 16 (1), pp. 555–561. DOI: 10.1021/acs.nanolett.5b04221. Publication Date (Web): December 21, 2015.

Copyright © 2015 American Chemical Society

This paper is behind a paywall.

*‘(also on EurekAlert)’ text and link added Jan. 14, 2016.

KAIST (Korea Advanced Institute of Science and Technology) will lead an Ideas Lab at 2016 World Economic Forum

The theme for the 2016 World Economic Forum (WEF) is ‘Mastering the Fourth Industrial Revolution’. I’m losing track of how many industrial revolutions we’ve had and this seems like a vague theme. However, there is enlightenment to be had in this Nov. 17, 2015 Korea Advanced Institute of Science and Technology (KAIST) news release on EurekAlert,

KAIST researchers will lead an IdeasLab on biotechnology for an aging society while HUBO, the winner of the 2015 DARPA Robotics Challenge, will interact with the forum participants, offering an experience of state-of-the-art robotics technology

Moving on from the news release’s subtitle, there’s more enlightenment,

Representatives from the Korea Advanced Institute of Science and Technology (KAIST) will attend the 2016 Annual Meeting of the World Economic Forum to run an IdeasLab and showcase its humanoid robot.

With over 2,500 leaders from business, government, international organizations, civil society, academia, media, and the arts expected to participate, the 2016 Annual Meeting will take place on Jan. 20-23, 2016 in Davos-Klosters, Switzerland. Under the theme of ‘Mastering the Fourth Industrial Revolution,’ [emphasis mine] global leaders will discuss the period of digital transformation [emphasis mine] that will have profound effects on economies, societies, and human behavior.

President Sung-Mo Steve Kang of KAIST will join the Global University Leaders Forum (GULF), a high-level academic meeting to foster collaboration among experts on issues of global concern for the future of higher education and the role of science in society. He will discuss how the emerging revolution in technology will affect the way universities operate and serve society. KAIST is the only Korean university participating in GULF, which is composed of prestigious universities invited from around the world.

Four KAIST professors, including Distinguished Professor Sang Yup Lee of the Chemical and Biomolecular Engineering Department, will lead an IdeasLab on ‘Biotechnology for an Aging Society.’

Professor Lee said, “In recent decades, much attention has been paid to the potential effect of the growth of an aging population and problems posed by it. At our IdeasLab, we will introduce some of our research breakthroughs in biotechnology to address the challenges of an aging society.”

In particular, he will present his latest research in systems biotechnology and metabolic engineering. His research has explained the mechanisms of how traditional Oriental medicine works in our bodies by identifying structural similarities between effective compounds in traditional medicine and human metabolites, and has proposed more effective treatments by employing such compounds.

KAIST will also display its networked mobile medical service system, ‘Dr. M.’ Built upon a ubiquitous and mobile Internet, such as the Internet of Things, wearable electronics, and smart homes and vehicles, Dr. M will provide patients with a more affordable and accessible healthcare service.

In addition, Professor Jun-Ho Oh of the Mechanical Engineering Department will showcase his humanoid robot, ‘HUBO,’ during the Annual Meeting. His research team won the International Humanoid Robotics Challenge hosted by the United States Defense Advanced Research Projects Agency (DARPA), which was held in Pomona, California, on June 5-6, 2015. With 24 international teams participating in the finals, HUBO completed all eight tasks in 44 minutes and 28 seconds, 6 minutes earlier than the runner-up, and almost 11 minutes earlier than the third-place team. Team KAIST walked away with the grand prize of USD 2 million.

Professor Oh said, “Robotics technology will grow exponentially in this century, becoming a real driving force to expedite the Fourth Industrial Revolution. I hope HUBO will offer an opportunity to learn about the current advances in robotics technology.”

President Kang pointed out, “KAIST has participated in the Annual Meeting of the World Economic Forum since 2011 and has engaged with a broad spectrum of global leaders through numerous presentations and demonstrations of our excellence in education and research. Next year, we will choreograph our first robotics exhibition on HUBO and present high-tech research results in biotechnology, which, I believe, epitomizes how science and technology breakthroughs in the Fourth Industrial Revolution will shape our future in an unprecedented way.”

Based on what I’m reading in the KAIST news release, I think the conversation about the ‘Fourth revolution’ may veer toward robotics and artificial intelligence (referred to in code as “digital transformation”) as developments in these fields are likely to affect various economies.  Before proceeding with that thought, take a look at this video showcasing HUBO at the DARPA challenge,


I’m quite impressed with how the robot can recalibrate its grasp so it can pick things up and plug an electrical cord into an outlet, and how it knows whether wheels or legs will be needed to complete a task, all due to algorithms which give the robot a type of artificial intelligence. While it may seem more like a machine than anything else, there’s also this version of a HUBO,

Description
English: Photo by David Hanson
Date 26 October 2006 (original upload date)
Source Transferred from en.wikipedia to Commons by Mac.
Author Dayofid at English Wikipedia

It’ll be interesting to see whether the researchers make HUBO seem more humanoid by giving it a face for its interactions with WEF attendees. A face would be more engaging but also more threatening, since there is increasing concern over robots taking work away from humans, with implications for various economies. There’s more about HUBO in its Wikipedia entry.

As for the IdeasLab, that’s been in place at the WEF since 2009 according to this WEF July 19, 2011 news release announcing an IdeasLab hub (Note: A link has been removed),

The World Economic Forum is publicly launching its biannual interactive IdeasLab hub on 19 July [2011] at 10.00 CEST. The unique IdeasLab hub features short documentary-style, high-definition (HD) videos of preeminent 21st century ideas and critical insights. The hub also provides dynamic Pecha Kucha presentations and visual IdeaScribes that trace and package complex strategic thinking into engaging and powerful images. All videos are HD broadcast quality.

To share the knowledge captured by the IdeasLab sessions, which have been running since 2009, the Forum is publishing 23 of the latest sessions, seen as the global benchmark of collaborative learning and development.

So while you might not be able to visit an IdeasLab presentation at the WEF meetings, you could still get a chance to see the sessions later.

Getting back to the robotics and artificial intelligence aspect of the 2016 WEF’s ‘digital’ theme, I noticed some reluctance to discuss how the field of robotics is affecting work and jobs in a broadcast of the Canadian television show, ‘Conversations with Conrad’.

For those unfamiliar with the interviewer, Conrad Black is somewhat infamous in Canada for a number of reasons (from the Conrad Black Wikipedia entry), Note: Links have been removed,

Conrad Moffat Black, Baron Black of Crossharbour, KSG (born 25 August 1944) is a Canadian-born British former newspaper publisher and author. He is a non-affiliated life peer, and a convicted felon in the United States for fraud.[n 1] Black controlled Hollinger International, once the world’s third-largest English-language newspaper empire,[3] which published The Daily Telegraph (UK), Chicago Sun Times (U.S.), The Jerusalem Post (Israel), National Post (Canada), and hundreds of community newspapers in North America, before he was fired by the board of Hollinger in 2004.[4]

In 2004, a shareholder-initiated prosecution of Black began in the United States. Over $80 million in assets claimed to have been improperly taken or inappropriately spent by Black.[5] He was convicted of three counts of fraud and one count of obstruction of justice in a U.S. court in 2007 and sentenced to six and a half years’ imprisonment. In 2011 two of the charges were overturned on appeal and he was re-sentenced to 42 months in prison on one count of mail fraud and one count of obstruction of justice.[6] Black was released on 4 May 2012.[7]

Despite or perhaps because of his chequered past, he is often a good interviewer and he definitely attracts interesting guests. In an Oct. 26, 2015 programme, he interviewed both former Canadian astronaut Chris Hadfield and Canadian-American David Frum, who’s currently editor of Atlantic Monthly and a former speechwriter for George W. Bush.

It was Black’s conversation with Frum which surprised me. They discuss robotics without ever once using the word. In a section where Frum notes that manufacturing is returning to the US, he also notes that it doesn’t mean more jobs and cites a newly commissioned plant in the eastern US employing about 40 people where before it would have employed hundreds or thousands. Unfortunately, the video has not been made available as I write this (Nov. 20, 2015) but that situation may change. You can check here.

Final thought, my guess is that economic conditions are fragile and I don’t think anyone wants to set off panic by mentioning robotics and disappearing jobs.

The sense of touch via artificial skin

Scientists have been working for years to allow artificial skin to transmit what the brain would recognize as the sense of touch. For anyone who has lost a limb and gotten a prosthetic replacement, the loss of touch is reputedly one of the more difficult losses to accept. The sense of touch is also vital in robotics if the field is to expand and include activities reliant on touch, e.g., how much pressure do you use to grasp a cup; how much strength do you apply when moving an object from one place to another?

For anyone interested in the ‘electronic skin and pursuit of touch’ story, I have a Nov. 15, 2013 posting which highlights the evolution of the research into e-skin and what was then some of the latest work.

This posting is a 2015 update of sorts featuring the latest e-skin research from Stanford University and Xerox PARC. (Dexter Johnson in an Oct. 15, 2015 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineering] site) provides a good research summary.) For anyone with an appetite for more, there’s this from an Oct. 15, 2015 American Association for the Advancement of Science (AAAS) news release on EurekAlert,

Using flexible organic circuits and specialized pressure sensors, researchers have created an artificial “skin” that can sense the force of static objects. Furthermore, they were able to transfer these sensory signals to the brain cells of mice in vitro using optogenetics. For the many people around the world living with prosthetics, such a system could one day allow them to feel sensation in their artificial limbs. To create the artificial skin, Benjamin Tee et al. developed a specialized circuit out of flexible, organic materials. It translates static pressure into digital signals that depend on how much mechanical force is applied. A particular challenge was creating sensors that can “feel” the same range of pressure that humans can. Thus, on the sensors, the team used carbon nanotubes molded into pyramidal microstructures, which are particularly effective at tunneling the signals from the electric field of nearby objects to the receiving electrode in a way that maximizes sensitivity. Transferring the digital signal from the artificial skin system to the cortical neurons of mice proved to be another challenge, since conventional light-sensitive proteins used in optogenetics do not stimulate neural spikes for sufficient durations for these digital signals to be sensed. Tee et al. therefore engineered new optogenetic proteins able to accommodate longer intervals of stimulation. Applying these newly engineered optogenic proteins to fast-spiking interneurons of the somatosensory cortex of mice in vitro sufficiently prolonged the stimulation interval, allowing the neurons to fire in accordance with the digital stimulation pulse. These results indicate that the system may be compatible with other fast-spiking neurons, including peripheral nerves.

And, there’s an Oct. 15, 2015 Stanford University news release on EurkeAlert describing this work from another perspective,

The heart of the technique is a two-ply plastic construct: the top layer creates a sensing mechanism and the bottom layer acts as the circuit to transport electrical signals and translate them into biochemical stimuli compatible with nerve cells. The top layer in the new work featured a sensor that can detect pressure over the same range as human skin, from a light finger tap to a firm handshake.

Five years ago, Bao’s [Zhenan Bao, a professor of chemical engineering at Stanford,] team members first described how to use plastics and rubbers as pressure sensors by measuring the natural springiness of their molecular structures. They then increased this natural pressure sensitivity by indenting a waffle pattern into the thin plastic, which further compresses the plastic’s molecular springs.

To exploit this pressure-sensing capability electronically, the team scattered billions of carbon nanotubes through the waffled plastic. Putting pressure on the plastic squeezes the nanotubes closer together and enables them to conduct electricity.

This allowed the plastic sensor to mimic human skin, which transmits pressure information as short pulses of electricity, similar to Morse code, to the brain. Increasing pressure on the waffled nanotubes squeezes them even closer together, allowing more electricity to flow through the sensor, and those varied impulses are sent as short pulses to the sensing mechanism. Remove pressure, and the flow of pulses relaxes, indicating light touch. Remove all pressure and the pulses cease entirely.
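To make that frequency coding concrete, here’s a toy sketch (mine, not the Stanford team’s; the numbers and the linear mapping are invented for illustration): pressure squeezes the nanotube network, more current flows, and the sensor reports pressure as a pulse rate, sparse for a light touch and dense for a firm grip.

```python
def pulse_rate_hz(pressure_kpa, max_pressure_kpa=50.0, max_rate_hz=200.0):
    """Map applied pressure to an electrical pulse frequency.

    0 kPa produces no pulses; pressures at or above max_pressure_kpa
    saturate at max_rate_hz. Real skin mechanoreceptors respond
    nonlinearly; a linear ramp keeps the illustration simple, and all
    the constants here are invented for the example.
    """
    if pressure_kpa <= 0:
        return 0.0
    clamped = min(pressure_kpa, max_pressure_kpa)
    return max_rate_hz * (clamped / max_pressure_kpa)

light_tap = pulse_rate_hz(2.0)     # sparse pulses (~8 Hz here)
firm_shake = pulse_rate_hz(40.0)   # dense pulses (~160 Hz here)
```

Remove the pressure and the rate drops back to zero, matching the “pulses cease entirely” behavior described above.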

The team then hooked this pressure-sensing mechanism to the second ply of their artificial skin, a flexible electronic circuit that could carry pulses of electricity to nerve cells.

Importing the signal

Bao’s team has been developing flexible electronics that can bend without breaking. For this project, team members worked with researchers from PARC, a Xerox company, which has a technology that uses an inkjet printer to deposit flexible circuits onto plastic. Covering a large surface is important to making artificial skin practical, and the PARC collaboration offered that prospect.

Finally the team had to prove that the electronic signal could be recognized by a biological neuron. It did this by adapting a technique developed by Karl Deisseroth, a fellow professor of bioengineering at Stanford who pioneered a field that combines genetics and optics, called optogenetics. Researchers bioengineer cells to make them sensitive to specific frequencies of light, then use light pulses to switch cells, or the processes being carried on inside them, on and off.

For this experiment the team members engineered a line of neurons to simulate a portion of the human nervous system. They translated the electronic pressure signals from the artificial skin into light pulses, which activated the neurons, proving that the artificial skin could generate a sensory output compatible with nerve cells.
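Here is a hypothetical sketch of that last translation step (my invention for illustration; the paper’s actual encoding differs): as the AAAS release notes, conventional opsins need each light pulse to last some minimum duration, so short electrical pulses must be stretched, and any pulses that would then overlap merged into one.

```python
def to_light_pulses(electrical_pulse_times_ms, min_pulse_ms=10.0):
    """Convert electrical pulse onset times (ms) into (start, duration)
    light pulses, stretching each pulse to the minimum duration the
    opsin needs and merging pulses that would overlap after stretching.

    The 10 ms minimum is a made-up stand-in for the opsin's constraint.
    """
    pulses = []
    for t in sorted(electrical_pulse_times_ms):
        if pulses and t < pulses[-1][0] + pulses[-1][1]:
            # Overlaps the previous stretched pulse: extend it instead.
            start, dur = pulses[-1]
            pulses[-1] = (start, max(dur, t + min_pulse_ms - start))
        else:
            pulses.append((t, min_pulse_ms))
    return pulses

# Three electrical pulses at 0, 5 and 30 ms become two light pulses:
# the first two merge into one 15 ms pulse, the third stands alone.
light = to_light_pulses([0.0, 5.0, 30.0])
```

The sketch only captures the general idea that a fast digital pulse train has to be reshaped before a light-sensitive neuron can follow it.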

Optogenetics was only used as an experimental proof of concept, Bao said, and other methods of stimulating nerves are likely to be used in real prosthetic devices. Bao’s team has already worked with Bianxiao Cui, an associate professor of chemistry at Stanford, to show that direct stimulation of neurons with electrical pulses is possible.

Bao’s team envisions developing different sensors to replicate, for instance, the ability to distinguish corduroy versus silk, or a cold glass of water from a hot cup of coffee. This will take time. There are six types of biological sensing mechanisms in the human hand, and the experiment described in Science reports success in just one of them.

But the current two-ply approach means the team can add sensations as it develops new mechanisms. And the inkjet printing fabrication process suggests how a network of sensors could be deposited over a flexible layer and folded over a prosthetic hand.

“We have a lot of work to take this from experimental to practical applications,” Bao said. “But after spending many years in this work, I now see a clear path where we can take our artificial skin.”

Here’s a link to and a citation for the paper,

A skin-inspired organic digital mechanoreceptor by Benjamin C.-K. Tee, Alex Chortos, Andre Berndt, Amanda Kim Nguyen, Ariane Tom, Allister McGuire, Ziliang Carter Lin, Kevin Tien, Won-Gyu Bae, Huiliang Wang, Ping Mei, Ho-Hsiu Chou, Bianxiao Cui, Karl Deisseroth, Tse Nga Ng, & Zhenan Bao. Science 16 October 2015 Vol. 350 no. 6258 pp. 313-316 DOI: 10.1126/science.aaa9306

This paper is behind a paywall.

Informal roundup of robot movies and television programmes and a glimpse into our robot future

David Bruggeman has written an informal series of posts about robot movies. The latest, a June 27, 2015 posting on his Pasco Phronesis blog, highlights the latest Terminator film and opines that the recent interest could be traced back to the rebooted Battlestar Galactica television series (Note: Links have been removed),

I suppose this could be traced back to the reboot of Battlestar Galactica over a decade ago, but robots and androids have become an increasing presence on film and television, particularly in the last 2 years.

In the movies, the new Terminator film comes out next week, and the previews suggest we will see a new generation of killer robots traveling through time and space.  Chappie is now out on your digital medium of choice (and I’ll post about any science fiction science policy/SciFiSciPol once I see it), so you can compare its robot police to those from either edition of Robocop or the 2013 series Almost Human.  Robots also have a role …

The new television series he mentions, Humans (click on About) debuted on the US tv channel, AMC, on Sunday, June 28, 2015 (yesterday).

HUMANS is set in a parallel present, where the latest must-have gadget for any busy family is a Synth – a highly-developed robotic servant, eerily similar to its live counterpart. In the hope of transforming the way his family lives, father Joe Hawkins (Tom Goodman-Hill) purchases a Synth (Gemma Chan) against the wishes of his wife (Katharine Parkinson), only to discover that sharing life with a machine has far-reaching and chilling consequences.

Here’s a bit more information from its Wikipedia entry,

Humans (styled as HUM∀NS) is a British-American science fiction television series, debuted in June 2015 on Channel 4 and AMC.[2] Written by the British team Sam Vincent and Jonathan Brackley, based on the award-winning Swedish science fiction drama Real Humans, the series explores the emotional impact of the blurring of the lines between humans and machines. The series is produced jointly by AMC, Channel 4 and Kudos.[3] The series will consist of eight episodes.[4]

David also wrote about Ex Machina, a recent robot film with artistic ambitions, in an April 26, 2015 posting on his Pasco Phronesis blog,

I finally saw Ex Machina, which recently opened in the United States.  It’s a minimalist film, with few speaking roles and a plot revolving around an intelligence test.  Of the robot movies out this year, it has received the strongest reviews, and it may take home some trophies during the next awards season.  Shot in Norway, the film is both lovely to watch and tricky to engage.  I finished the film not quite sure what the characters were thinking, and perhaps that’s a lesson from the film.

Unlike Chappie and Automata, the intelligent robot at the center of Ex Machina is not out in the world. …

He started the series with a Feb. 8, 2015 posting, which previews the movies covered in his later postings and also includes a couple of others not mentioned in either the April or June posting: Avengers: Age of Ultron and Spare Parts.

It’s interesting to me that these robots are mostly unrelated to the benign robots of earlier science fiction: the movie ‘Forbidden Planet’ (a reworking of Shakespeare’s The Tempest set in outer space), the 1960s television programme ‘Lost in Space’, and the animated TV series The Jetsons, also from the 1960s. As far as I can tell, not having seen the new movies in question, the only benign robot in the current crop would be ‘Chappie’. It should be mentioned that the ‘Terminator’, in the person of Arnold Schwarzenegger, has over the course of three or four movies evolved from a destructive robot bent on evil into a destructive robot working on behalf of good.

I’ll add one more television programme, though I’m not sure whether the robot boy is good or evil: Extant, in which Halle Berry’s robot son seems to be living out a version of the Pinocchio story (an ersatz child who wants to become human). The series is enjoying its second season on US television as of July 1, 2015.

Despite one or two ‘sweet’ robots, there seems to be a trend toward ominous robots. Perhaps, in addition to Battlestar Galactica, the concerns being raised by prominent scientists such as Stephen Hawking, and by those associated with the Centre for the Study of Existential Risk at the University of Cambridge, have something to do with this trend. They may also partially explain why Chappie did not do as well at the box office as hoped; thematically, it was swimming against the current.

As for a glimpse into the future, there’s this Children’s Hospital of Los Angeles June 29, 2015 news release,

Many hospitals lack the resources and patient volume to employ a round-the-clock, neonatal intensive care specialist to treat their youngest and sickest patients. Telemedicine–with real-time audio and video communication between a neonatal intensive care specialist and a patient–can provide access to this level of care.

A team of neonatologists at Children’s Hospital Los Angeles investigated the use of robot-assisted telemedicine in performing bedside rounds and directing daily care for infants with mild-to-moderate disease. They found no significant differences in patient outcomes when telemedicine was used and noted a high level of parent satisfaction. This is the first published report of using telemedicine for patient rounds in a neonatal intensive care unit (NICU). Results will be published online first on June 29 in the Journal of Telemedicine and Telecare.

Glimpse into the future?

The part I find most fascinating is that there was no difference in outcomes; moreover, the parents’ satisfaction rate was high when robots (telemedicine) were used. Finally, of the families who completed the after-care survey (45%), all indicated they would be comfortable with another telemedicine (robot) experience. My comment: should robots prove to be cheaper in the long run, and should these results hold as more studies are done, I imagine hospitals will introduce them as a means of cutting costs.

AI assistant makes scientific discovery at Tufts University (US)

In light of this latest research from Tufts University, I thought it might be interesting to review the “algorithms, artificial intelligence (AI), robots, and world of work” situation before moving on to Tufts’ latest science discovery. My Feb. 5, 2015 post provides a roundup of sorts regarding work and automation. For those who’d like the latest, there’s a May 29, 2015 article by Sophie Weiner for Fast Company featuring a predictive interactive tool designed by NPR (US National Public Radio), based on data from Oxford University researchers, which estimates how likely it is that your job will be automated (no one knows for sure) (Note: A link has been removed),

Paralegals and food service workers: the robots are coming.

So suggests this interactive visualization by NPR. The bare-bones graphic lets you select a profession, from tellers and lawyers to psychologists and authors, to determine who is most at risk of losing their jobs in the coming robot revolution. From there, it spits out a percentage. …

You can find the interactive NPR tool here. I checked out the scientist categories (in descending order of danger: Historians [43.9%], Economists, Geographers, Survey Researchers, Epidemiologists, Chemists, Animal Scientists, Sociologists, Astronomers, Social Scientists, Political Scientists, Materials Scientists, Conservation Scientists, and Microbiologists [1.2%]), none of whom seem to be in imminent danger when you consider that bookkeepers are rated at 97.6%.
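For the curious, the tool’s basic mechanic (look up an occupation, get back its estimated risk, compare) is easy to sketch. Here’s a toy Python version using only the three percentages quoted above; the figures for the other scientist categories aren’t given in the post, so they’re omitted:

```python
# A toy version of the NPR-style lookup, using only the automation-risk
# figures quoted in the post (percent probability of automation).
automation_risk = {
    "Historians": 43.9,
    "Microbiologists": 1.2,
    "Bookkeepers": 97.6,
}

def most_at_risk(risks):
    """Return occupations ranked from most to least automatable."""
    return sorted(risks, key=risks.get, reverse=True)

print(most_at_risk(automation_risk))
# ['Bookkeepers', 'Historians', 'Microbiologists']
```

Even this trivial ranking makes the post’s point: the quoted scientist categories sit far below bookkeepers on the risk scale.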

Here at last is the news from Tufts (from a June 4, 2015 Tufts University news release, also on EurekAlert),

An artificial intelligence system has for the first time reverse-engineered the regeneration mechanism of planaria–the small worms whose extraordinary power to regrow body parts has made them a research model in human regenerative medicine.

The discovery by Tufts University biologists presents the first model of regeneration discovered by a non-human intelligence and the first comprehensive model of planarian regeneration, which had eluded human scientists for over 100 years. The work, published in PLOS Computational Biology, demonstrates how “robot science” can help human scientists in the future.

To mine the fast-growing mountain of published experimental data in regeneration and developmental biology Lobo and Levin developed an algorithm that would use evolutionary computation to produce regulatory networks able to “evolve” to accurately predict the results of published laboratory experiments that the researchers entered into a database.

“Our goal was to identify a regulatory network that could be executed in every cell in a virtual worm so that the head-tail patterning outcomes of simulated experiments would match the published data,” Lobo said.

The paper represents a successful application of the growing field of “robot science” – which Levin says can help human researchers by doing much more than crunch enormous datasets quickly.

“While the artificial intelligence in this project did have to do a whole lot of computations, the outcome is a theory of what the worm is doing, and coming up with theories of what’s going on in nature is pretty much the most creative, intuitive aspect of the scientist’s job,” Levin said. “One of the most remarkable aspects of the project was that the model it found was not a hopelessly-tangled network that no human could actually understand, but a reasonably simple model that people can readily comprehend. All this suggests to me that artificial intelligence can help with every aspect of science, not only data mining but also inference of meaning of the data.”
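For readers curious about the mechanics, the evolutionary-computation loop described above can be sketched in miniature. To be clear, this is not Lobo and Levin’s implementation (their system evolves full regulatory-network graphs against a database of real planarian experiments); it is only the bare skeleton of the search idea, with bit-vectors standing in for candidate networks and a fixed target vector standing in for published experiment outcomes:

```python
import random

# Bit-vectors stand in for candidate regulatory networks; the "published
# data" is a fixed target vector of experiment outcomes. Fitness counts how
# many simulated outcomes agree with the data, and the population evolves
# (elitist selection plus point mutation) until a candidate matches them all.
TARGET = [1, 0, 1, 1, 0, 1, 0, 0]  # stand-in for published experiment outcomes

def fitness(candidate):
    """Number of simulated outcomes that agree with the 'published' data."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.2):
    """Flip each 'regulatory rule' bit with the given probability."""
    return [1 - g if random.random() < rate else g for g in candidate]

def evolve(pop_size=20, generations=500):
    random.seed(0)  # deterministic, purely for the sake of the example
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TARGET):
            break  # found a model reproducing all the "experiments"
        survivors = pop[: pop_size // 2]  # keep the best half unchanged
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return pop[0]

best = evolve()
print(fitness(best), "of", len(TARGET), "outcomes matched")
```

The hard part in the real work, of course, was not the loop but representing networks and simulating experiments well enough that a match with the data constitutes a genuine model of regeneration.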

Here’s a link to and a citation for the paper,

Inferring Regulatory Networks from Experimental Morphological Phenotypes: A Computational Method Reverse-Engineers Planarian Regeneration by Daniel Lobo and Michael Levin. PLOS Computational Biology DOI: 10.1371/journal.pcbi.1004295 Published: June 4, 2015

This paper is open access.

It will be interesting to see if attributing the discovery to an algorithm sets off criticism suggesting that the researchers overstated the role the AI assistant played.

I sing the body cyber: two projects funded by the US National Science Foundation

Points to anyone who recognized the reference to Walt Whitman’s poem, “I Sing the Body Electric,” from his classic collection, Leaves of Grass (1867 edition; h/t Wikipedia entry). I wonder if the cyber-physical systems (CPS) work being funded by the US National Science Foundation (NSF) will occasion poetry too.

More practically, a May 15, 2015 news item on Nanowerk describes two cyber-physical systems (CPS) research projects newly funded by the NSF,

Today [May 12, 2015] the National Science Foundation (NSF) announced two, five-year, center-scale awards totaling $8.75 million to advance the state-of-the-art in medical and cyber-physical systems (CPS).

One project will develop “Cyberheart”–a platform for virtual, patient-specific human heart models and associated device therapies that can be used to improve and accelerate medical-device development and testing. The other project will combine teams of microrobots with synthetic cells to perform functions that may one day lead to tissue and organ re-generation.

CPS are engineered systems that are built from, and depend upon, the seamless integration of computation and physical components. Often called the “Internet of Things,” CPS enable capabilities that go beyond the embedded systems of today.

“NSF has been a leader in supporting research in cyber-physical systems, which has provided a foundation for putting the ‘smart’ in health, transportation, energy and infrastructure systems,” said Jim Kurose, head of Computer & Information Science & Engineering at NSF. “We look forward to the results of these two new awards, which paint a new and compelling vision for what’s possible for smart health.”

Cyber-physical systems have the potential to benefit many sectors of our society, including healthcare. While advances in sensors and wearable devices have the capacity to improve aspects of medical care, from disease prevention to emergency response, and synthetic biology and robotics hold the promise of regenerating and maintaining the body in radical new ways, little is known about how advances in CPS can integrate these technologies to improve health outcomes.

These new NSF-funded projects will investigate two very different ways that CPS can be used in the biological and medical realms.

A May 12, 2015 NSF news release (also on EurekAlert), which originated the news item, describes the two CPS projects,

Bio-CPS for engineering living cells

A team of leading computer scientists, roboticists and biologists from Boston University, the University of Pennsylvania and MIT have come together to develop a system that combines the capabilities of nano-scale robots with specially designed synthetic organisms. Together, they believe this hybrid “bio-CPS” will be capable of performing heretofore impossible functions, from microscopic assembly to cell sensing within the body.

“We bring together synthetic biology and micron-scale robotics to engineer the emergence of desired behaviors in populations of bacterial and mammalian cells,” said Calin Belta, a professor of mechanical engineering, systems engineering and bioinformatics at Boston University and principal investigator on the project. “This project will impact several application areas ranging from tissue engineering to drug development.”

The project builds on previous research by each team member in diverse disciplines and early proof-of-concept designs of bio-CPS. According to the team, the research is also driven by recent advances in the emerging field of synthetic biology, in particular the ability to rapidly incorporate new capabilities into simple cells. Researchers so far have not been able to control and coordinate the behavior of synthetic cells in isolation, but the introduction of microrobots that can be externally controlled may be transformative.

In this new project, the team will focus on bio-CPS with the ability to sense, transport and work together. As a demonstration of their idea, they will develop teams of synthetic cell/microrobot hybrids capable of constructing a complex, fabric-like surface.

Vijay Kumar (University of Pennsylvania), Ron Weiss (MIT), and Douglas Densmore (BU) are co-investigators of the project.

Medical-CPS and the ‘Cyberheart’

CPS such as wearable sensors and implantable devices are already being used to assess health, improve quality of life, provide cost-effective care and potentially speed up disease diagnosis and prevention. [emphasis mine]

Extending these efforts, researchers from seven leading universities and centers are working together to develop far more realistic cardiac and device models than currently exist. This so-called “Cyberheart” platform can be used to test and validate medical devices faster and at a far lower cost than existing methods. CyberHeart also can be used to design safe, patient-specific device therapies, thereby lowering the risk to the patient.

“Innovative ‘virtual’ design methodologies for implantable cardiac medical devices will speed device development and yield safer, more effective devices and device-based therapies, than is currently possible,” said Scott Smolka, a professor of computer science at Stony Brook University and one of the principal investigators on the award.

The group’s approach combines patient-specific computational models of heart dynamics with advanced mathematical techniques for analyzing how these models interact with medical devices. The analytical techniques can be used to detect potential flaws in device behavior early on during the device-design phase, before animal and human trials begin. They also can be used in a clinical setting to optimize device settings on a patient-by-patient basis before devices are implanted.

“We believe that our coordinated, multi-disciplinary approach, which balances theoretical, experimental and practical concerns, will yield transformational results in medical-device design and foundations of cyber-physical system verification,” Smolka said.

The team will develop virtual device models which can be coupled together with virtual heart models to realize a full virtual development platform that can be subjected to computational analysis and simulation techniques. Moreover, they are working with experimentalists who will study the behavior of virtual and actual devices on animals’ hearts.

Co-investigators on the project include Edmund Clarke (Carnegie Mellon University), Elizabeth Cherry (Rochester Institute of Technology), W. Rance Cleaveland (University of Maryland), Flavio Fenton (Georgia Tech), Rahul Mangharam (University of Pennsylvania), Arnab Ray (Fraunhofer Center for Experimental Software Engineering [Germany]) and James Glimm and Radu Grosu (Stony Brook University). Richard A. Gray of the U.S. Food and Drug Administration is another key contributor.

It is fascinating to observe how the terminology is shifting: pacemakers and deep brain stimulators, once simply implants, are now “CPS such as wearable sensors and implantable devices … .” A new category, CPS, has been created, conjoining medical devices with sensing devices from the consumer market, such as wearable fitness monitors. I imagine it’s an attempt to quell fears about injecting strange things into, or adding strange things to, your body: microrobots and nanorobots partially derived from synthetic biology research that are “… capable of performing heretofore impossible functions, from microscopic assembly to cell sensing within the body.” They’ve also sneaked in a reference to synthetic biology, an area of research where some concerns have been expressed. From my March 19, 2013 post about a poll on synthetic biology concerns,

In our latest survey, conducted in January 2013, three-fourths of respondents say they have heard little or nothing about synthetic biology, a level consistent with that measured in 2010. While initial impressions about the science are largely undefined, these feelings do not necessarily become more positive as respondents learn more. The public has mixed reactions to specific synthetic biology applications, and almost one-third of respondents favor a ban “on synthetic biology research until we better understand its implications and risks,” while 61 percent think the science should move forward.

I imagine that for scientists, 61% in favour of more research is not particularly comforting given how easily and quickly public opinion can shift.

3D printing soft robots and flexible electronics with metal alloys

This research comes from Purdue University (Indiana, US) which seems to be on a publishing binge these days. From an April 7, 2015 news item on Nanowerk,

New research shows how inkjet-printing technology can be used to mass-produce electronic circuits made of liquid-metal alloys for “soft robots” and flexible electronics.

Elastic technologies could make possible a new class of pliable robots and stretchable garments that people might wear to interact with computers or for therapeutic purposes. However, new manufacturing techniques must be developed before soft machines become commercially feasible, said Rebecca Kramer, an assistant professor of mechanical engineering at Purdue University.

“We want to create stretchable electronics that might be compatible with soft machines, such as robots that need to squeeze through small spaces, or wearable technologies that aren’t restrictive of motion,” she said. “Conductors made from liquid metal can stretch and deform without breaking.”

A new potential manufacturing approach focuses on harnessing inkjet printing to create devices made of liquid alloys.

“This process now allows us to print flexible and stretchable conductors onto anything, including elastic materials and fabrics,” Kramer said.

An April 7, 2015 Purdue University news release (also on EurekAlert) by Emil Venere, which originated the news item, expands on the theme,

A research paper about the method will appear on April 18 [2015] in the journal Advanced Materials. The paper generally introduces the method, called mechanically sintered gallium-indium nanoparticles, and describes research leading up to the project. It was authored by postdoctoral researcher John William Boley, graduate student Edward L. White and Kramer.

A printable ink is made by dispersing the liquid metal in a non-metallic solvent using ultrasound, which breaks up the bulk liquid metal into nanoparticles. This nanoparticle-filled ink is compatible with inkjet printing.

“Liquid metal in its native form is not inkjet-able,” Kramer said. “So what we do is create liquid metal nanoparticles that are small enough to pass through an inkjet nozzle. Sonicating liquid metal in a carrier solvent, such as ethanol, both creates the nanoparticles and disperses them in the solvent. Then we can print the ink onto any substrate. The ethanol evaporates away so we are just left with liquid metal nanoparticles on a surface.”

After printing, the nanoparticles must be rejoined by applying light pressure, which renders the material conductive. This step is necessary because the liquid-metal nanoparticles are initially coated with oxidized gallium, which acts as a skin that prevents electrical conductivity.

“But it’s a fragile skin, so when you apply pressure it breaks the skin and everything coalesces into one uniform film,” Kramer said. “We can do this either by stamping or by dragging something across the surface, such as the sharp edge of a silicon tip.”

The approach makes it possible to select which portions to activate depending on particular designs, suggesting that a blank film might be manufactured for a multitude of potential applications.

“We selectively activate what electronics we want to turn on by applying pressure to just those areas,” said Kramer, who this year was awarded an Early Career Development award from the National Science Foundation, which supports research to determine how to best develop the liquid-metal ink.

The process could make it possible to rapidly mass-produce large quantities of the film.

Future research will explore how the interaction between the ink and the surface being printed on might be conducive to the production of specific types of devices.

“For example, how do the nanoparticles orient themselves on hydrophobic versus hydrophilic surfaces? How can we formulate the ink and exploit its interaction with a surface to enable self-assembly of the particles?” she said.

The researchers also will study and model how individual particles rupture when pressure is applied, providing information that could allow the manufacture of ultrathin traces and new types of sensors.

Here’s a link to and a citation for the paper,

Nanoparticles: Mechanically Sintered Gallium–Indium Nanoparticles by John William Boley, Edward L. White and Rebecca K. Kramer. Advanced Materials Volume 27, Issue 14, page 2270, April 8, 2015 DOI: 10.1002/adma.201570094 Article first published online: 7 APR 2015

© 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

This article is behind a paywall.

A bio-inspired robotic sock from Singapore’s National University

Should you ever be confined to a bed over a long period of time or find yourself unable to move your legs at will, this robotic sock could help you avoid blood clots according to a Feb. 10, 2015 National University of Singapore news release (also on EurekAlert but dated Feb. 13, 2015),

Patients who are bedridden or unable to move their legs are often at risk of developing Deep Vein Thrombosis (DVT), a potentially life-threatening condition caused by blood clots forming along the lower extremity veins of the legs. A team of researchers from the National University of Singapore’s (NUS) Yong Loo Lin School of Medicine and Faculty of Engineering has invented a novel sock that can help prevent DVT and improve survival rates of patients.

Equipped with soft actuators that mimic the tentacle movements of corals, the robotic sock emulates natural lower leg muscle contractions in the wearer’s leg, thereby promoting blood circulation throughout the wearer’s body. In addition, the novel device can potentially optimise therapy sessions and enable the patient’s lower leg movements to be monitored to improve therapy outcomes.

The invention is created by Assistant Professor Lim Jeong Hoon from the NUS Department of Medicine, as well as Assistant Professor Raye Yeow Chen Hua and first-year PhD candidate Mr Low Fanzhe of the NUS Department of Biomedical Engineering.

The news release goes on to contrast this new technique with the pharmacological and other methods currently in use,

Current approaches to prevent DVT include pharmacological methods which involve using anti-coagulation drugs to prevent blood from clotting, and mechanical methods that involve the use of compressive stimulations to assist blood flow.

While pharmacological methods are competent in preventing DVT, there is a primary detrimental side effect – there is higher risk of excessive bleeding which can lead to death, especially for patients who suffered hemorrhagic stroke. On the other hand, current mechanical methods such as the use of compression stockings have not demonstrated significant reduction in DVT risk.

In the course of exploring an effective solution that can prevent DVT, Asst Prof Lim, who is a rehabilitation clinician, was inspired by the natural role of the human ankle muscles in facilitating venous blood flow back to the heart. He worked with Asst Prof Yeow and Mr Low to derive a method that can perform this function for patients who are bedridden or unable to move their legs.

The team turned to nature for inspiration to develop a device that is akin to human ankle movements. They found similarities in the elegant structural design of the coral tentacle, which can extend to grab food and contract to bring the food closer for consumption, and invented soft actuators that mimic this “push and pull” mechanism.

By integrating the actuators with a sock and the use of a programmable pneumatic pump-valve control system, the invention is able to create the desired robot-assisted ankle joint motions to facilitate blood flow in the leg.

Explaining the choice of materials, Mr Low said, “We chose to use only soft components and actuators to increase patient comfort during use, hence minimising the risk of injury from excessive mechanical forces. Compression stockings are currently used in the hospital wards, so it makes sense to use a similar sock-based approach to provide comfort and minimise bulk on the ankle and foot.”

The sock complements conventional ankle therapy exercises that therapists perform on patients, thereby optimising therapy time and productivity. In addition, the sock can be worn for prolonged durations to provide robot-assisted therapy, on top of the therapist-assisted sessions. The sock is also embedded with sensors to track the ankle joint angle, allowing the patient’s ankle motion to be monitored for better treatment.

Said Asst Prof Yeow, “Given its compact size, modular design and ease of use, the soft robotic sock can be adopted in hospital wards and rehabilitation centres for on-bed applications to prevent DVT among stroke patients or even at home for bedridden patients. By reducing the risk of DVT using this device, we hope to improve survival rates of these patients.”
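Since the team has not published details of their control scheme, here is a purely hypothetical sketch of the kind of timed pump-valve cycle such a sock might run. The state names, timings, and cycle structure below are all invented for illustration; the release says only that a programmable pneumatic pump-valve system drives the coral-inspired actuators through ankle-like motions:

```python
# Hypothetical pump-valve cycle for one robot-assisted "ankle contraction":
# inflate the soft actuators, hold, deflate, then rest before the next cycle.
# All states and durations are invented; nothing here comes from the NUS team.
CYCLE = [
    ("inflate", 2.0),  # actuators extend/contract the ankle
    ("hold", 1.0),     # sustain the assisted posture
    ("deflate", 2.0),  # actuators relax, ankle returns
    ("rest", 5.0),     # pause between assisted contractions
]

def schedule(n_cycles):
    """Yield (valve_state, duration_in_seconds) commands for n assisted cycles."""
    for _ in range(n_cycles):
        yield from CYCLE

# Six cycles at 10 seconds each -> one minute of assisted therapy.
total_seconds = sum(duration for _, duration in schedule(6))
print(total_seconds)  # 60.0
```

The embedded ankle-angle sensors mentioned in the release would presumably feed back into such a loop, but how (or whether) they modulate the cycle is not described.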

The team does not seem to have published any papers about this work, although there are plans for clinical trials and commercialization (from the news release),

To further investigate the effectiveness of the robotic sock, Asst Prof Lim, Asst Prof Yeow and Mr Low will be conducting pilot clinical trials with about 30 patients at the National University Hospital over six months, starting March 2015. They hope that the pilot clinical trials will help them to obtain patient and clinical feedback to further improve the design and capabilities of the device.

The team intends to conduct trials across different local hospitals for better evaluation, and they also hope to commercialise the device in future.

The researchers have provided an image of the sock on a ‘patient’,

 Caption: NUS researchers (from right to left) Assistant Professor Raye Yeow, Mr Low Fanzhe and Dr Liu Yuchun demonstrating the novel bio-inspired robotic sock. Credit: National University of Singapore

