
Using touch (bionic fingers) instead of X-rays

This is not the most exciting video, but it is weirdly fascinating (thank you to ScientifiCult).

A February 15, 2023 news item on Nanowerk provides a textual description of what you’re seeing in the video, Note: A link has been removed,

What if, instead of using X-rays or ultrasound, we could use touch to image the insides of human bodies and electronic devices? In a study publishing in the journal Cell Reports Physical Science (“A smart bionic finger for subsurface tactile-tomography”), researchers present a bionic finger that can create 3D maps of the internal shapes and textures of complex objects by touching their exterior surface.

“We were inspired by human fingers, which have the most sensitive tactile perception that we know of,” says senior author Jianyi Luo, a professor at Wuyi University. “For example, when we touch our own bodies with our fingers, we can sense not only the texture of our skin, but also the outline of the bone beneath it.”

“Our bionic finger goes beyond previous artificial sensors that were only capable of recognizing and discriminating between external shapes, surface textures, and hardness,” says co-author Zhiming Chen, a lecturer at Wuyi University.

The bionic finger “scans” an object by moving across it and applying pressure—think of a constant stream of pokes or prods. With each poke, the finger’s carbon fibers compress, and the degree to which they compress provides information about the relative stiffness or softness of the object. Depending on its material, the object itself may also compress when poked: rigid objects hold their shape, while soft objects deform when enough pressure is applied. This information, along with the location at which it was recorded, is relayed to a personal computer and displayed onscreen as a 3D map.
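Neither the news release nor the paper excerpted here includes any code, but the poke-measure-map loop it describes can be sketched in a few lines. The following Python snippet is a toy simulation of the idea only: the phantom, the linear force-to-depth model, and every name and number in it are invented placeholders for illustration, not anything taken from the paper.

```python
# Toy illustration only -- not the authors' code. All numbers, names, and the
# linear "poke" model are invented to show the general idea of tactile
# tomography: press a grid of points, infer local stiffness from how deep
# each press sinks, and assemble the estimates into a map that reveals a
# rigid shape buried under a soft layer.
import numpy as np

SOFT_KPA, RIGID_KPA = 20.0, 400.0          # assumed effective stiffnesses

# Hypothetical phantom: a soft slab with a rigid letter-like inclusion.
phantom = np.full((24, 24), SOFT_KPA)
phantom[6:18, 11:13] = RIGID_KPA           # vertical stroke of the shape
phantom[6:9, 7:17] = RIGID_KPA             # horizontal stroke

def poke(stiffness_kpa, force_n=0.5, noise=0.02, rng=None):
    """Simulate one press: softer material lets the finger sink deeper.
    Returns an indentation depth (mm) with a little sensor noise."""
    if rng is None:
        rng = np.random.default_rng()
    depth_mm = force_n / stiffness_kpa * 1000.0      # toy linear model
    return depth_mm * (1.0 + rng.normal(0.0, noise))

def scan(sample, force_n=0.5):
    """Raster-scan the surface point by point, a 'constant stream of
    pokes', and invert each depth reading into a stiffness estimate."""
    rng = np.random.default_rng(seed=0)
    estimate = np.zeros_like(sample)
    for i in range(sample.shape[0]):
        for j in range(sample.shape[1]):
            depth_mm = poke(sample[i, j], force_n, rng=rng)
            estimate[i, j] = force_n / depth_mm * 1000.0
    return estimate

stiffness_map = scan(phantom)
# Threshold the map to localize the buried rigid shape.
print((stiffness_map > 100.0).astype(int))
```

Run as-is, the printed 0/1 grid shows the buried shape recovered purely from simulated touch readings; the real device goes further, resolving how far beneath the surface each feature sits to produce the 3D maps described above.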

A February 13, 2023 Cell Press news release on EurekAlert, which originated the news item, provides more details about the research and some hints at what the researchers may do next,

The researchers tested the bionic finger’s ability to map out the internal and external features of complex objects made of multiple types of material, such as a rigid “letter A” buried under a layer of soft silicone, as well as more abstractly shaped objects. When they used it to scan a small compound object made of three different materials—a rigid internal material, a soft internal material, and a soft outer coating—the bionic finger was able to discriminate not only between the soft outer coating and the internal hard ridges but also between the soft outer coating and the soft material that filled the internal grooves.

Next, the researchers tested the finger’s ability to sense and image simulated human tissue. They created this tissue—consisting of a skeletal component (three layers of hard polymer) and a soft silicone “muscle” layer—using 3D printing. The bionic finger was able to reproduce a 3D profile of the tissue’s structure and locate a simulated blood vessel beneath the muscle layer.

The team also explored the bionic finger’s ability to diagnose issues in electronic devices without opening them up. By scanning the surface of a defective electronic device with the bionic finger, the researchers were able to create a map of its internal electrical components and pinpoint the location at which the circuit was disconnected, as well as a mis-drilled hole, without breaking through the encapsulating layer.

“This tactile technology opens up a non-optical way for the nondestructive testing of the human body and flexible electronics,” says Luo. “Next, we want to develop the bionic finger’s capacity for omnidirectional detection with different surface materials.”

Here’s a link to and a citation for the paper,

A smart bionic finger for subsurface tactile tomography by Yizhou Li, Zhiming Chen, Youbin Chen, Hao Yang, Junyong Lu, Zhennan Li, Yongyao Chen, Dongyi Ding, Cuiying Zeng, Bingpu Zhou, Hongpeng Liang, Xingpeng Huang, Jiajia Hu, Jingcheng Huang, Jinxiu Wen, Jianyi Luo. Cell Reports Physical Science Volume 4, Issue 2, 101257, 15 February 2023 DOI: https://doi.org/10.1016/j.xcrp.2023.101257 Published online: February 15, 2023

This paper is open access.