Tag Archives: Star Wars

Touchy robots and prosthetics

I have briefly speculated about the importance of touch elsewhere (see my July 19, 2019 posting regarding BlocKit and blockchain; scroll down about 50% of the way), but the news bit below and the one following it put a different spin on it.

Exceptional sense of touch

Robots need a sense of touch to perform their tasks and a July 18, 2019 National University of Singapore press release (also on EurekAlert) announces work on an improved sense of touch,

Robots and prosthetic devices may soon have a sense of touch equivalent to, or better than, the human skin with the Asynchronous Coded Electronic Skin (ACES), an artificial nervous system developed by a team of researchers at the National University of Singapore (NUS).

The new electronic skin system achieved ultra-high responsiveness and robustness to damage, and can be paired with any kind of sensor skin layers to function effectively as an electronic skin.

The innovation, achieved by Assistant Professor Benjamin Tee and his team from the Department of Materials Science and Engineering at the NUS Faculty of Engineering, was first reported in prestigious scientific journal Science Robotics on 18 July 2019.

Faster than the human sensory nervous system

“Humans use our sense of touch to accomplish almost every daily task, such as picking up a cup of coffee or making a handshake. Without it, we will even lose our sense of balance when walking. Similarly, robots need to have a sense of touch in order to interact better with humans, but robots today still cannot feel objects very well,” explained Asst Prof Tee, who has been working on electronic skin technologies for over a decade in hope of giving robots and prosthetic devices a better sense of touch.

Drawing inspiration from the human sensory nervous system, the NUS team spent a year and a half developing a sensor system that could potentially perform better. While the ACES electronic nervous system detects signals like the human sensory nervous system, it is made up of a network of sensors connected via a single electrical conductor, unlike the nerve bundles in the human skin. It is also unlike existing electronic skins, which have interlinked wiring systems that can make them sensitive to damage and difficult to scale up.

Elaborating on the inspiration, Asst Prof Tee, who also holds appointments in the NUS Department of Electrical and Computer Engineering, NUS Institute for Health Innovation & Technology (iHealthTech), N.1 Institute for Health and the Hybrid Integrated Flexible Electronic Systems (HiFES) programme, said, “The human sensory nervous system is extremely efficient, and it works all the time to the extent that we often take it for granted. It is also very robust to damage. Our sense of touch, for example, does not get affected when we suffer a cut. If we can mimic how our biological system works and make it even better, we can bring about tremendous advancements in the field of robotics where electronic skins are predominantly applied.”

ACES can detect touches more than 1,000 times faster than the human sensory nervous system. For example, it is capable of differentiating physical contacts between different sensors in less than 60 nanoseconds – the fastest ever achieved for an electronic skin technology – even with large numbers of sensors. ACES-enabled skin can also accurately identify the shape, texture and hardness of objects within 10 milliseconds, ten times faster than the blinking of an eye. This is enabled by the high fidelity and capture speed of the ACES system.

The ACES platform can also be designed to achieve high robustness to physical damage, an important property for electronic skins because they come into frequent physical contact with the environment. Unlike the current system used to interconnect sensors in existing electronic skins, all the sensors in ACES can be connected to a common electrical conductor with each sensor operating independently. This allows ACES-enabled electronic skins to continue functioning as long as there is one connection between the sensor and the conductor, making them less vulnerable to damage.
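
Neither the press release nor the abstract spells out the encoding scheme, but the general idea of many sensors sharing one conductor can be sketched in a few lines of code. In this toy sketch (my own illustration, not the NUS team's implementation), each sensor is assumed to emit a unique pseudo-random pulse signature onto the shared line, and a receiver works out which sensors were touched by correlating the line signal against every known signature.

```python
# Toy sketch (my illustration, not the actual ACES implementation): many
# sensors share one conductor, each emitting a unique pseudo-random pulse
# signature when it detects pressure. The receiver identifies which sensors
# fired by correlating the summed line signal against every known signature.
import numpy as np

rng = np.random.default_rng(0)
N_SENSORS, SIG_LEN, LINE_LEN = 16, 64, 1024

# Each sensor is assigned a unique +/-1 signature (an assumption for this
# sketch; the real device uses its own pulse signatures).
signatures = rng.choice([-1.0, 1.0], size=(N_SENSORS, SIG_LEN))

# Simulate three sensors firing at different times on the shared line.
line = np.zeros(LINE_LEN)
events = {3: 100, 7: 300, 12: 600}             # sensor id -> start sample
for sensor, start in events.items():
    line[start:start + SIG_LEN] += signatures[sensor]
line += 0.2 * rng.standard_normal(LINE_LEN)    # electrical noise on the line

# Decode with a matched filter per signature; a strong peak marks a touch.
for sensor in range(N_SENSORS):
    corr = np.correlate(line, signatures[sensor], mode="valid")
    peak = int(np.argmax(corr))
    if corr[peak] > 0.6 * SIG_LEN:             # crude detection threshold
        print(f"sensor {sensor:2d} touched near sample {peak}")
```

Running it prints the three simulated touch locations. Because each sensor is identified by its own signature rather than its own wire, disconnecting one sensor leaves the decoding of the others untouched, which is roughly the robustness argument made above.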

Smart electronic skins for robots and prosthetics

ACES’ simple wiring system and remarkable responsiveness even with increasing numbers of sensors are key characteristics that will facilitate the scale-up of intelligent electronic skins for Artificial Intelligence (AI) applications in robots, prosthetic devices and other human machine interfaces.

“Scalability is a critical consideration as big pieces of high performing electronic skins are required to cover the relatively large surface areas of robots and prosthetic devices,” explained Asst Prof Tee. “ACES can be easily paired with any kind of sensor skin layers, for example, those designed to sense temperatures and humidity, to create high performance ACES-enabled electronic skin with an exceptional sense of touch that can be used for a wide range of purposes,” he added.

For instance, pairing ACES with the transparent, self-healing and water-resistant sensor skin layer also recently developed by Asst Prof Tee’s team, creates an electronic skin that can self-repair, like the human skin. This type of electronic skin can be used to develop more realistic prosthetic limbs that will help disabled individuals restore their sense of touch.

Other potential applications include developing more intelligent robots that can perform disaster recovery tasks or take over mundane operations such as packing of items in warehouses. The NUS team is therefore looking to further apply the ACES platform on advanced robots and prosthetic devices in the next phase of their research.

For those who like videos, the researchers have prepared this,

Here’s a link to and a citation for the paper,

A neuro-inspired artificial peripheral nervous system for scalable electronic skins by Wang Wei Lee, Yu Jun Tan, Haicheng Yao, Si Li, Hian Hian See, Matthew Hon, Kian Ann Ng, Betty Xiong, John S. Ho and Benjamin C. K. Tee. Science Robotics Vol. 4, Issue 32, eaax2198, 31 July 2019. DOI: 10.1126/scirobotics.aax2198 Published online first: 17 Jul 2019

This paper is behind a paywall.

Picking up a grape and holding his wife’s hand

This story comes from Canadian Broadcasting Corporation (CBC) Radio, specifically a July 25, 2019 CBC Radio ‘As It Happens’ article by Sheena Goodyear with a six-minute audio story embedded in the text,

The West Valley City, Utah, real estate agent [Keven Walgamott] lost his left hand in an electrical accident 17 years ago. Since then, he’s tried out a few different prosthetic limbs, but always found them too clunky and uncomfortable.

Then he decided to work with the University of Utah in 2016 to test out new prosthetic technology that mimics the sensation of human touch, allowing Walgamott to perform delicate tasks with precision — including shaking his wife’s hand. 

“I extended my left hand, she came and extended hers, and we were able to feel each other with the left hand for the first time in 13 years, and it was just a marvellous and wonderful experience,” Walgamott told As It Happens guest host Megan Williams. 

Walgamott, one of seven participants in the University of Utah study, was able to use an advanced prosthetic hand called the LUKE Arm to pick up an egg without cracking it, pluck a single grape from a bunch, hammer a nail, take a ring on and off his finger, fit a pillowcase over a pillow and more. 

While performing the tasks, Walgamott was able to actually feel the items he was holding and correctly gauge the amount of pressure he needed to exert — mimicking a process the human brain does automatically.

“I was able to feel something in each of my fingers,” he said. “What I feel, I guess the easiest way to explain it, is little electrical shocks.”

Those shocks — which he describes as a kind of a tingling sensation — intensify as he tightens his grip.

“Different variations of the intensity of the electricity as I move my fingers around and as I touch things,” he said. 

To make that [sense of touch] happen, the researchers implanted electrodes into the nerves on Walgamott’s forearm, allowing his brain to communicate with his prosthetic through a computer outside his body. That means he can move the hand just by thinking about it.

But those signals also work in reverse.

The team attached sensors to the hand of a LUKE Arm. Those sensors detect touch and positioning, and send that information to the electrodes so it can be interpreted by the brain.

For Walgamott, performing a series of menial tasks as a team of scientists recorded his progress was “fun to do.”

“I’d forgotten how well two hands work,” he said. “That was pretty cool.”

But it was also a huge relief from the phantom limb pain he has experienced since the accident, which he describes as a “burning sensation” in the place where his hand used to be.

A July 24, 2019 University of Utah news release (also on EurekAlert) provides more detail about the research,

Keven Walgamott had a good “feeling” about picking up the egg without crushing it.

What seems simple for nearly everyone else can be more of a Herculean task for Walgamott, who lost his left hand and part of his arm in an electrical accident 17 years ago. But he was testing out the prototype of a high-tech prosthetic arm with fingers that not only can move, they can move with his thoughts. And thanks to a biomedical engineering team at the University of Utah, he “felt” the egg well enough so his brain could tell the prosthetic hand not to squeeze too hard.

That’s because the team, led by U biomedical engineering associate professor Gregory Clark, has developed a way for the “LUKE Arm” (so named after the robotic hand that Luke Skywalker got in “The Empire Strikes Back”) to mimic the way a human hand feels objects by sending the appropriate signals to the brain. Their findings were published in a new paper co-authored by U biomedical engineering doctoral student Jacob George, former doctoral student David Kluger, Clark and other colleagues in the latest edition of the journal Science Robotics. A copy of the paper may be obtained by emailing robopak@aaas.org.

“We changed the way we are sending that information to the brain so that it matches the human body. And by matching the human body, we were able to see improved benefits,” George says. “We’re making more biologically realistic signals.”

That means an amputee wearing the prosthetic arm can sense the touch of something soft or hard, understand better how to pick it up and perform delicate tasks that would otherwise be impossible with a standard prosthetic with metal hooks or claws for hands.

“It almost put me to tears,” Walgamott says about using the LUKE Arm for the first time during clinical tests in 2017. “It was really amazing. I never thought I would be able to feel in that hand again.”

Walgamott, a real estate agent from West Valley City, Utah, and one of seven test subjects at the U, was able to pluck grapes without crushing them, pick up an egg without cracking it and hold his wife’s hand with a sensation in the fingers similar to that of an able-bodied person.

“One of the first things he wanted to do was put on his wedding ring. That’s hard to do with one hand,” says Clark. “It was very moving.”

Those things are accomplished through a complex series of mathematical calculations and modeling.

The LUKE Arm

The LUKE Arm has been in development for some 15 years. The arm itself is made of mostly metal motors and parts with a clear silicone “skin” over the hand. It is powered by an external battery and wired to a computer. It was developed by DEKA Research & Development Corp., a New Hampshire-based company founded by Segway inventor Dean Kamen.

Meanwhile, the U’s team has been developing a system that allows the prosthetic arm to tap into the wearer’s nerves, which are like biological wires that send signals to the arm to move. It does that thanks to an invention by U biomedical engineering Emeritus Distinguished Professor Richard A. Normann called the Utah Slanted Electrode Array. The array is a bundle of 100 microelectrodes and wires that are implanted into the amputee’s nerves in the forearm and connected to a computer outside the body. The array interprets the signals from the still-remaining arm nerves, and the computer translates them to digital signals that tell the arm to move.
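
The news release doesn't describe the decoding algorithm itself, so here is a deliberately simple sketch of the nerve-to-arm direction only: a plain least-squares linear decoder (my assumption for illustration, not the Utah team's actual method) that maps per-electrode firing rates to a single hand-aperture command.

```python
# Toy sketch of the decoding direction only (nerve activity -> arm command).
# The release doesn't name the algorithm, so this uses an assumed linear
# decoder fit by least squares; the "recordings" below are synthetic and the
# whole thing is an illustration, not the Utah team's actual method.
import numpy as np

N_ELECTRODES = 100                     # Utah Slanted Electrode Array channels
rng = np.random.default_rng(1)

# Hypothetical calibration set: firing rates (Hz) recorded while the wearer
# attempts hand movements, paired with the intended hand aperture
# (0 = closed, 1 = open). Both are simulated here just to show the mechanics.
rates = rng.poisson(lam=20, size=(500, N_ELECTRODES)).astype(float)
hidden = rng.normal(scale=0.01, size=N_ELECTRODES)
aperture = rates @ hidden
aperture = (aperture - aperture.min()) / (aperture.max() - aperture.min())

# Fit a linear decoder with a bias term, then decode one new sample.
X = np.hstack([rates, np.ones((len(rates), 1))])
weights, *_ = np.linalg.lstsq(X, aperture, rcond=None)

new_rates = rng.poisson(lam=20, size=N_ELECTRODES).astype(float)
command = float(np.clip(np.append(new_rates, 1.0) @ weights, 0.0, 1.0))
print(f"decoded hand-aperture command: {command:.2f}")
```

In a real system the calibration data would come from actual nerve recordings, and a more sophisticated decoder would likely be used; the point here is only the overall flow from per-electrode firing rates to a motor command sent to the prosthesis.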

But it also works the other way. Performing tasks such as picking up objects requires more than just the brain telling the hand to move. The prosthetic hand must also learn how to “feel” the object in order to know how much pressure to exert, because you can’t figure that out just by looking at it.

First, the prosthetic arm has sensors in its hand that send signals to the nerves via the array to mimic the feeling the hand gets upon grabbing something. But equally important is how those signals are sent. It involves understanding how the brain deals with transitions in information when the hand first touches something. Upon first contact with an object, a burst of impulses runs up the nerves to the brain and then tapers off. Recreating this was a big step.

“Just providing sensation is a big deal, but the way you send that information is also critically important, and if you make it more biologically realistic, the brain will understand it better and the performance of this sensation will also be better,” says Clark.

To achieve that, Clark’s team used mathematical calculations along with recorded impulses from a primate’s arm to create an approximate model of how humans receive these different signal patterns. That model was then implemented into the LUKE Arm system.
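
The exact model isn't given in the release, but the "burst of impulses that tapers off" idea can be illustrated with a toy stimulation profile. The exponential form and the gain and time-constant values below are assumptions for illustration; the team's actual model was built from recorded nerve activity.

```python
# Toy illustration of the "burst then taper" pattern described above. The
# exponential form and the numbers are assumptions for illustration; the
# team's actual model was fit to recorded nerve activity.
import numpy as np

def biomimetic_rate(pressure, t_since_contact, sustained_gain=80.0,
                    onset_gain=220.0, tau=0.05):
    """Stimulation pulse rate (Hz): a large onset transient that decays with
    time constant tau toward a sustained rate proportional to pressure."""
    onset = onset_gain * pressure * np.exp(-t_since_contact / tau)
    return onset + sustained_gain * pressure

def flat_rate(pressure, sustained_gain=80.0):
    """Non-biomimetic baseline: rate tracks pressure only, no onset burst."""
    return sustained_gain * pressure

pressure = 0.5                                   # normalized grip pressure
for t in np.linspace(0.0, 0.5, 6):               # seconds after first contact
    print(f"t={t:0.2f}s  biomimetic={biomimetic_rate(pressure, t):6.1f} Hz"
          f"  flat={flat_rate(pressure):6.1f} Hz")
```

The biomimetic profile starts well above the flat baseline and settles onto it within a few hundred milliseconds, which is the "burst then taper" behaviour the quote describes.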

Future research

In addition to creating a prototype of the LUKE Arm with a sense of touch, the overall team is already developing a version that is completely portable and does not need to be wired to a computer outside the body. Instead, everything would be connected wirelessly, giving the wearer complete freedom.

Clark says the Utah Slanted Electrode Array is also capable of sending signals to the brain for more than just the sense of touch, such as pain and temperature, though the paper primarily addresses touch. And while their work currently has only involved amputees who lost their extremities below the elbow, where the muscles to move the hand are located, Clark says their research could also be applied to those who lost their arms above the elbow.

Clark hopes that in 2020 or 2021, three test subjects will be able to take the arm home to use, pending federal regulatory approval.

The research involves a number of institutions including the U’s Department of Neurosurgery, Department of Physical Medicine and Rehabilitation and Department of Orthopedics, the University of Chicago’s Department of Organismal Biology and Anatomy, the Cleveland Clinic’s Department of Biomedical Engineering and Utah neurotechnology companies Ripple Neuro LLC and Blackrock Microsystems. The project is funded by the Defense Advanced Research Projects Agency and the National Science Foundation.

“This is an incredible interdisciplinary effort,” says Clark. “We could not have done this without the substantial efforts of everybody on that team.”

Here’s a link to and a citation for the paper,

Biomimetic sensory feedback through peripheral nerve stimulation improves dexterous use of a bionic hand by J. A. George, D. T. Kluger, T. S. Davis, S. M. Wendelken, E. V. Okorokova, Q. He, C. C. Duncan, D. T. Hutchinson, Z. C. Thumser, D. T. Beckler, P. D. Marasco, S. J. Bensmaia and G. A. Clark. Science Robotics Vol. 4, Issue 32, eaax2352, 31 July 2019. DOI: 10.1126/scirobotics.aax2352 Published online first: 24 Jul 2019

This paper is definitely behind a paywall.

The University of Utah researchers have produced a video highlighting their work,

US National Institute of Standards and Technology and molecules made of light (lightsabres anyone?)

As I recall, lightsabres are a Star Wars invention. I gather we’re a long way from running around with lightsabres but there is hope, if that should be your dream, according to a Sept. 9, 2015 news item on Nanowerk,

… a team including theoretical physicists from JQI [Joint Quantum Institute] and NIST [US National Institute of Standards and Technology] has taken another step toward building objects out of photons, and the findings hint that weightless particles of light can be joined into a sort of “molecule” with its own peculiar force.

Here’s an artist’s conception of the light “molecule” provided by the researchers,

Researchers show that two photons, depicted in this artist’s conception as waves (left and right), can be locked together at a short distance. Under certain conditions, the photons can form a state resembling a two-atom molecule, represented as the blue dumbbell shape at center. Credit: E. Edwards/JQI

A Sept. 8, 2015 NIST news release (also available on EurekAlert*), which originated the news item, provides more information about the research (Note: Links have been removed),

The findings build on previous research that several team members contributed to before joining NIST. In 2013, collaborators from Harvard, Caltech and MIT found a way to bind two photons together so that one would sit right atop the other, superimposed as they travel. Their experimental demonstration was considered a breakthrough, because no one had ever constructed anything by combining individual photons—inspiring some to imagine that real-life lightsabers were just around the corner.

Now, in a paper forthcoming in Physical Review Letters, the NIST and University of Maryland-based team (with other collaborators) has shown theoretically that by tweaking a few parameters of the binding process, photons could travel side by side, a specific distance from each other. The arrangement is akin to the way that two hydrogen atoms sit next to each other in a hydrogen molecule.

“It’s not a molecule per se, but you can imagine it as having a similar kind of structure,” says NIST’s Alexey Gorshkov. “We’re learning how to build complex states of light that, in turn, can be built into more complex objects. This is the first time anyone has shown how to bind two photons a finite distance apart.”

While the new findings appear to be a step in the right direction—if we can build a molecule of light, why not a sword?—Gorshkov says he is not optimistic that Jedi Knights will be lining up at NIST’s gift shop anytime soon. The main reason is that binding photons requires extreme conditions difficult to produce with a roomful of lab equipment, let alone fit into a sword’s handle. Still, there are plenty of other reasons to make molecular light—humbler than lightsabers, but useful nonetheless.

“Lots of modern technologies are based on light, from communication technology to high-definition imaging,” Gorshkov says. “Many of them would be greatly improved if we could engineer interactions between photons.”

For example, engineers need a way to precisely calibrate light sensors, and Gorshkov says the findings could make it far easier to create a “standard candle” that shines a precise number of photons at a detector. Perhaps more significant to industry, binding and entangling photons could allow computers to use photons as information processors, a job that electronic switches in your computer do today.

Not only would this provide a new basis for creating computer technology, but it also could result in substantial energy savings. Phone messages and other data that currently travel as light beams through fiber optic cables have to be converted into electrons for processing—an inefficient step that wastes a great deal of electricity. If both the transport and the processing of the data could be done with photons directly, it could reduce these energy losses.

Gorshkov says it will be important to test the new theory in practice for these and other potential benefits.

“It’s a cool new way to study photons,” he says. “They’re massless and fly at the speed of light. Slowing them down and binding them may show us other things we didn’t know about them before.”

Here are links and citations for the paper. First, there’s an early version on arXiv.org and then the peer-reviewed version, which is not yet available,

Coulomb bound states of strongly interacting photons by M. F. Maghrebi, M. J. Gullans, P. Bienias, S. Choi, I. Martin, O. Firstenberg, M. D. Lukin, H. P. Büchler, A. V. Gorshkov. arXiv:1505.03859 [quant-ph] (or arXiv:1505.03859v1 [quant-ph] for this version)

Coulomb bound states of strongly interacting photons by M. F. Maghrebi, M. J. Gullans, P. Bienias, S. Choi, I. Martin, O. Firstenberg, M. D. Lukin, H. P. Büchler, and A. V. Gorshkov.
Phys. Rev. Lett. forthcoming in September 2015.

The first version (arXiv) is open access and I’m not sure whether the Physical Review Letters study will be behind a paywall or available as an open access paper.

*EurekAlert link added 10:34 am PST on Sept. 11, 2015.

Luke Skywalker (Mark Hamill) and nanotechnology?

I don’t often get the chance to reference Star Wars and last week’s announcement that A. P. N. G. Enterprises has hired Mark Hamill (Luke Skywalker) to be a creative consultant wouldn’t ordinarily afford that opportunity. However, in this case, it turns out that Hamill will be consulting on a series of multimedia projects based on the comic book series, NEW-GEN. From the Oct. 14, 2011 news article by Mark Langshaw on Digital Spy,

JD Matonti, Chris Matonti and Julia Coppola of APNG created the series for Marvel Comics. The story takes place in an extra-dimensional world where everything is controlled by nanotechnology. [emphasis mine] It follows a fierce conflict between two top scientists.

A NEW-GEN movie is in the works, directed by Matonti. The company also plans to bring the series to television, mobile devices and video games.

“NEW-GEN is a fresh and powerful new story that will surely resonate with audiences across multiple platforms,” Hamill said.

I wonder what they mean by “… everything is controlled by nanotechnology?”

There’s a bit more information in the Oct. 16, 2011 news item in the Calgary Herald,

To introduce Hamill, the company will release a special six-issue graphic novel, NEW-GEN: Volume One, featuring a foreword from Hamill who shares his thoughts on this next-generation comic franchise that revolves around the battle over nanotechnology.

Matt Blum in his Oct. 13, 2011 article for Wired Magazine notes,

You may not be aware of it, but Mark Hamill is a huge comic book geek. Yes, the man who played Luke Skywalker and whose second career as a voice actor is second to none gets just as excited by comic books as most of the folks who hang on his every word as Joker.

Anyone curious about the NEW-GEN comic books can sample the first issue free. Here’s the summary of the first issue available from the download page,

New-Gen exists in another dimension; a utopian society where human beings, creatures, and robots co-exist in complete harmony until a battle over nanotechnology rages between two superhuman scientists. Gabriel banishes his former friend, Deadalus, to The Underworld, sends his infant twin sons to Earth and takes in the young children and creatures affected by Nanotechnology. The children and creatures grow up possessing unique “NanoPowers” in the Association for the Protection of the New Generation (A.P.N.G.) and will oppose Deadalus as he evolves into the purely evil Sly whose goal is to transform worlds.

If it seems familiar, it should be as I last profiled this series in my July 22, 2010 posting where I noted that the series originally launched in 2008 and had been reworked into something edgier for its 2010 relaunch. At that time, they were offering the first three issues for free.

Liquid lenses and integrated research into nanotechnology safety

A flexible, fluid micro lens has been created by engineers at Penn State University. Here’s why it’s interesting news (from Nanowerk News),

Like tiny Jedi knights, tunable fluidic micro lenses can focus and direct light at will to count cells, evaluate molecules or create on-chip optical tweezers, according to a team of Penn State engineers. They may also provide imaging in medical devices, eliminating the necessity and discomfort of moving the tip of a probe.

For more about the work, go here. On a sidenote, this is the first time I’ve seen a Star Wars metaphor used. Depending on the nature of the breakthrough, you usually get Spiderman, Harry Potter, or Star Trek if they’re using a science fiction metaphor.

In other news, the Institute of Occupational Medicine (IOM) in Scotland will be leading a multi-million Euro project, Engineered NanoParticle Risk Assessment (ENPRA). From Nanowerk News (again),

The 3 ½ year IOM-led project, worth €3.7 million, harnesses the knowledge and capabilities of 15 European and 6 US partners including three US Federal Agencies: EPA, NIOSH and NIH-NIEHS. Under the coordination of Dr Lang Tran, IOM’s Director of Computational Toxicology, ENPRA will utilise the latest advances within in vitro, in vivo and in silico approaches to nanotechnology environment, health & safety (EHS) research to realise its aims.

There’s more about the project here. For anyone not familiar with the US abbreviations, EPA = Environmental Protection Agency, NIOSH = National Institute for Occupational Safety and Health, and NIH-NIEHS = National Institutes of Health – National Institute of Environmental Health Sciences.

I don’t know, but this seems like a lot of government agencies, and it could take them years just to figure out what one another’s abbreviations stand for. Even so, bravo for taking the first steps.