Tunable metasurfaces and reshaping the future of light

Thinner, meaning smaller and less bulky, is a prized quality in technologies such as phones, batteries, and, in this case, lenses. From a May 16, 2022 news item on ScienceDaily,

The technological advancement of optical lenses has long been a significant marker of human scientific achievement. Eyeglasses, telescopes, cameras, and microscopes have all literally and figuratively allowed us to see the world in a new light. Lenses are also a fundamental component of manufacturing nanoelectronics by the semiconductor industry.

One of the most impactful breakthroughs in lens technology in recent history has been the development of photonic metasurfaces — artificially engineered nano-scale materials with remarkable optical properties. Georgia Tech [Georgia Institute of Technology] researchers at the forefront of this technology have demonstrated the first electrically tunable photonic metasurface platform, in a study published by Nature Communications.

“Metasurfaces can make the optical systems very thin, and as they become easier to control and tune, you’ll soon find them in cell phone cameras and similar electronic imaging systems,” said Ali Adibi, professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology [Georgia Tech; US].

A May 10, 2022 Georgia Tech news release (also on EurekAlert but published May 16, 2022), which originated the news item, provides more detail,

The pronounced tuning measures achieved through the new platform represent a critical advancement towards the development of miniaturized reconfigurable metasurfaces. The results of the study have shown a record eleven-fold change in the reflective properties, a large range of spectral tuning for operation, and much faster tuning speed.

Heating Up Metasurfaces

Metasurfaces are a class of nanophotonic materials in which a large range of miniaturized elements are engineered to affect the transmission and reflection of light at different frequencies in a controlled way.

“When viewed under very strong microscopes, metasurfaces look like a periodic array of posts,” said Adibi. “The best analogy would be to think of a LEGO pattern formed by connecting many similar LEGO bricks next to each other.”

Since their inception, metasurfaces have been used to demonstrate that very thin optical devices can affect light propagation, with metalenses (flat, ultrathin lenses) being the most developed application.

Despite impressive progress, most demonstrated metasurfaces are passive, meaning their performance cannot be changed (or tuned) after fabrication. The work presented by Adibi and his team, led by Ph.D. candidate Sajjad Abdollahramezani, applies electrical heat to a special class of nanophotonic materials to create a platform that can enable reconfigurable metasurfaces to be easily manufactured with high levels of optical modulation.

PCMs Provide the Answer

A wide range of materials may be used to form metasurfaces, including metals, oxides, and semiconductors, but Abdollahramezani and Adibi’s research focuses on phase-change materials (PCMs) because they can form the most effective structures with the smallest feature sizes. PCMs are substances that absorb and release heat as they are heated and cooled. They are called “phase-change” materials because they shift from one crystallization state to another during thermal cycling. Water changing from a liquid to a solid or gas is the most common example.

The Georgia Tech team’s experiments are substantially more complicated than heating and freezing water. Knowing that the optical properties of PCMs can be altered by local heating, they have harnessed the full potential of the PCM alloy Ge2Sb2Te5 (GST), which is a compound of germanium, antimony, and tellurium.

By combining the optical design with a miniaturized electrical microheater underneath, the team can change the crystalline phase of the GST to make active tuning of the metasurface device possible. The fabricated metasurfaces were developed at Georgia Tech’s Institute for Electronics and Nanotechnology (IEN) and tested in characterization labs by illuminating the reconfigurable metasurfaces with laser light at different frequencies and measuring the properties of the reflected light in real time.
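The news release does not describe the underlying optical model, but partial crystallization of GST is commonly approximated with an effective-medium mixing rule. Here is a minimal sketch of that idea, assuming Lorentz-Lorenz mixing; the permittivity values in the usage note are illustrative placeholders, not measured data from this study.

```python
# Hedged sketch: as a GST film is heated, some volume fraction crystallizes,
# and its optical response sits between the amorphous and crystalline states.
# Lorentz-Lorenz effective-medium mixing is one common approximation for this;
# it is an assumption here, not the method reported in the paper.

def effective_permittivity(eps_amorphous, eps_crystalline, fraction):
    """Estimate the permittivity of partially crystallized GST.

    `fraction` is the crystallized volume fraction, between 0 (fully
    amorphous) and 1 (fully crystalline).
    """
    if not 0.0 <= fraction <= 1.0:
        raise ValueError("fraction must be in [0, 1]")

    def ll(eps):
        # Lorentz-Lorenz (Clausius-Mossotti) term for one phase.
        return (eps - 1.0) / (eps + 2.0)

    # Mix the two phases linearly in Lorentz-Lorenz space...
    mix = fraction * ll(eps_crystalline) + (1.0 - fraction) * ll(eps_amorphous)
    # ...then invert the relation to recover an effective permittivity.
    return (1.0 + 2.0 * mix) / (1.0 - mix)
```

For example, with illustrative values of 16 (amorphous) and 36 (crystalline), a half-crystallized film comes out between the two endpoints, which is what lets a microheater sweep the metasurface’s reflection continuously rather than only switching between two states.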

What Tunable Metasurfaces Mean for the Future

Driven by device miniaturization and system integration, as well as their ability to selectively reflect different colors of light, metasurfaces are rapidly replacing bulky optical assemblies of the past. Immediate impact on technologies like LiDAR systems for autonomous cars, imaging, spectroscopy, and sensing is expected.

With further development, more aggressive applications like computing, augmented reality, photonic chips for artificial intelligence, and biohazard detection can also be envisioned, according to Abdollahramezani and Adibi.

“As the platform continues to develop, reconfigurable metasurfaces will be found everywhere,” said Adibi. “They will even empower smaller endoscopes to go deep inside the body for better imaging and help medical sensors detect different biomarkers in blood.”

Funding: This material is based upon work supported by the National Science Foundation (NSF) under Grant No. 1837021. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF. The work was primarily funded by Office of Naval Research (ONR) (N00014-18-1-2055, Dr. B. Bennett) and by Defense Advanced Research Projects Agency [DARPA] (D19AC00001, Dr. R. Chandrasekar). W.C. acknowledges support from ONR (N00014-17-1-2555) and National Science Foundation (NSF) (DMR-2004749). A. Alù acknowledges support from Air Force Office of Scientific Research and the Simons Foundation. M.W. acknowledges support by the Deutsche Forschungsgemeinschaft (SFB 917). M.E.S. acknowledges financial support of NSF-CHE (1608801). This work was performed in part at the Georgia Tech Institute for Electronics and Nanotechnology (IEN), a member of the National Nanotechnology Coordinated Infrastructure (NNCI), which is supported by NSF (ECCS1542174).

Caption: Georgia Tech professor Ali Adibi [on the right] with Ph.D. candidate Sajjad Abdollahramezani [on the left holding an unidentified object] in Ali’s Photonics Research Group lab where the characterization of the tunable metasurfaces takes place. Credit: Georgia Tech

I am charmed by this image. Neither of these two is a professional at posing for photographers. Nonetheless, they look pleased and happy to help the publicity team spread the word about their research; they also seem like they’re looking forward to getting back to work.

Here’s a link to and a citation for the paper,

Electrically driven reprogrammable phase-change metasurface reaching 80% efficiency by Sajjad Abdollahramezani, Omid Hemmatyar, Mohammad Taghinejad, Hossein Taghinejad, Alex Krasnok, Ali A. Eftekhar, Christian Teichrib, Sanchit Deshmukh, Mostafa A. El-Sayed, Eric Pop, Matthias Wuttig, Andrea Alù, Wenshan Cai & Ali Adibi. Nature Communications volume 13, Article number: 1696 (2022) DOI: https://doi.org/10.1038/s41467-022-29374-6 Published: 30 March 2022

This paper is open access.

Electrotactile rendering device virtualizes the sense of touch

I stumbled across this November 15, 2022 news item on Nanowerk highlighting work on the sense of touch in virtual environments, originally announced in October 2022,

A collaborative research team co-led by City University of Hong Kong (CityU) has developed a wearable tactile rendering system, which can mimic the sensation of touch with high spatial resolution and a rapid response rate. The team demonstrated its application potential in a braille display, adding the sense of touch in the metaverse for functions such as virtual reality shopping and gaming, and potentially facilitating the work of astronauts, deep-sea divers and others who need to wear thick gloves.

Here’s what you’ll need to wear for this virtual tactile experience,

Caption: The new wearable tactile rendering system can mimic touch sensations with high spatial resolution and a rapid response rate. Credit: Robotics X Lab and City University of Hong Kong

An October 20, 2022 City University of Hong Kong (CityU) press release (also on EurekAlert), which originated the news item, delves further into the research,

“We can hear and see our families over a long distance via phones and cameras, but we still cannot feel or hug them. We are physically isolated by space and time, especially during this long-lasting pandemic,” said Dr Yang Zhengbao, Associate Professor in the Department of Mechanical Engineering of CityU, who co-led the study. “Although there has been great progress in developing sensors that digitally capture tactile features with high resolution and high sensitivity, we still lack a system that can effectively virtualize the sense of touch, one that can record and play back tactile sensations over space and time.”

In collaboration with Chinese tech giant Tencent’s Robotics X Laboratory, the team developed a novel electrotactile rendering system for displaying various tactile sensations with high spatial resolution and a rapid response rate. Their findings were published in the scientific journal Science Advances under the title “Super-resolution Wearable Electro-tactile Rendering System”.

Limitations in existing techniques

Existing techniques to reproduce tactile stimuli can be broadly classified into two categories: mechanical and electrical stimulation. By applying a localised mechanical force or vibration to the skin, mechanical actuators can elicit stable and continuous tactile sensations. However, they tend to be bulky, limiting the spatial resolution when integrated into a portable or wearable device. In contrast, electrotactile stimulators, which evoke touch sensations by passing a local electric current through the skin at the location of the electrode, can be light and flexible while offering higher resolution and a faster response. But most of them rely on high-voltage direct-current (DC) pulses (up to hundreds of volts) to penetrate the stratum corneum, the outermost layer of the skin, and stimulate the receptors and nerves, which poses a safety concern. The tactile rendering resolution also left room for improvement.

The latest electrotactile actuator developed by the team is very thin and flexible and can be easily integrated into a finger cot. This fingertip wearable device can display different tactile sensations, such as pressure, vibration, and texture roughness, in high fidelity. Instead of using DC pulses, the team developed a high-frequency alternating stimulation strategy and succeeded in lowering the operating voltage to below 30 V, ensuring that the tactile rendering is safe and comfortable.

They also proposed a novel super-resolution strategy that can render tactile sensations at locations between physical electrodes, rather than only at the electrode locations themselves. This increases the spatial resolution of their stimulators by more than three times (from 25 to 105 points), giving the user a more realistic tactile perception.
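The release does not spell out how a sensation is placed between electrodes, but a common approach in haptics is to drive two neighbouring electrodes with complementary amplitudes so the perceived point falls between them. This sketch assumes a simple linear weighting over a one-dimensional electrode row; the function names and the point count are illustrative, not taken from the paper.

```python
# Hypothetical sketch of rendering a "virtual" tactile point between two
# physical electrodes by weighting their drive amplitudes. The linear
# weighting law is an assumption for illustration.

def virtual_point_amplitudes(position, max_amplitude=1.0):
    """Split the drive amplitude between electrode A (position 0.0) and
    electrode B (position 1.0) so the perceived stimulus lands at `position`."""
    if not 0.0 <= position <= 1.0:
        raise ValueError("position must lie between the two electrodes")
    amp_a = max_amplitude * (1.0 - position)
    amp_b = max_amplitude * position
    return amp_a, amp_b

def addressable_points(n_electrodes, points_per_gap=2):
    """Count addressable points along one electrode row: the physical
    electrodes plus virtual points interpolated in each gap."""
    return n_electrodes + (n_electrodes - 1) * points_per_gap
```

With two virtual points per gap, a row of 25 electrodes would yield 73 addressable points; the paper’s reported jump from 25 to 105 points presumably reflects its own (two-dimensional) interpolation scheme.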

Tactile stimuli with high spatial resolution

“Our new system can elicit tactile stimuli with both high spatial resolution (76 dots/cm2), similar to the density of related receptors in the human skin, and a rapid response rate (4 kHz),” said Mr Lin Weikang, a PhD student at CityU, who made and tested the device.

The team ran different tests to show various application possibilities of this new wearable electrotactile rendering system. For example, they proposed a new Braille strategy that is much easier for people with a visual impairment to learn.

The proposed strategy breaks letters and numerical digits down into individual strokes, ordered in the same way they are written. By wearing the new electrotactile rendering system on a fingertip, the user can recognise each character by feeling the direction and the sequence of its strokes. “This would be particularly useful for people who lose their eyesight later in life, allowing them to continue to read and write using the same alphabetic system they are used to, without the need to learn the whole Braille dot system,” said Dr Yang.
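The stroke-based idea above can be sketched as a lookup from characters to ordered stroke directions, played back on the fingertip in writing order. The stroke table here is invented for demonstration; the paper’s actual encoding may differ.

```python
# Illustrative sketch of the stroke-based character rendering described above.
# Each character maps to an ordered list of stroke directions, which a
# stimulator would play back one by one. The encodings below are assumptions.

STROKES = {
    "L": ["down", "right"],        # vertical stem, then the base
    "T": ["right", "down"],        # top bar, then the stem
    "7": ["right", "down-left"],   # top bar, then the diagonal
}

def stroke_sequence(text):
    """Return the ordered strokes needed to render `text`, character by
    character, in writing order."""
    sequence = []
    for ch in text.upper():
        if ch not in STROKES:
            raise KeyError(f"no stroke encoding for {ch!r}")
        sequence.extend(STROKES[ch])
    return sequence
```

For example, `stroke_sequence("LT")` yields the four strokes of “L” followed by “T”, mirroring how a sighted writer would draw them.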

Enabling touch in the metaverse

Second, the new system is well suited for VR/AR [virtual reality/augmented reality] applications and games, adding the sense of touch to the metaverse. The electrodes can be made highly flexible and scalable to cover larger areas, such as the palm. The team demonstrated that a user can virtually sense the texture of clothes in a virtual fashion shop. The user also experiences an itchy sensation in the fingertips when being licked by a VR cat. When stroking a virtual cat’s fur, the user can feel a variance in the roughness as the strokes change direction and speed.

The system can also be useful in transmitting fine tactile details through thick gloves. The team successfully integrated the thin, light electrodes of the electrotactile rendering system and flexible tactile sensors into a safety glove. The tactile sensor array captures the pressure distribution on the exterior of the glove and relays the information to the user in real time through tactile stimulation. In the experiment, the user could quickly and accurately locate a tiny steel washer, just 1 mm in radius and 0.44 mm thick, based on the tactile feedback from the glove’s sensors and stimulators. This shows the system’s potential to enable high-fidelity tactile perception, which is currently unavailable to astronauts, firefighters, deep-sea divers and others who need to wear thick protective suits or gloves.
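The sensor-to-stimulator relay described above can be sketched as a per-frame mapping from glove pressure readings to stimulation amplitudes. The noise threshold, the normalisation, and the array layout are all assumptions for illustration; the paper’s actual signal chain is not given in the release.

```python
# Minimal sketch of relaying one frame of glove pressure readings to the
# fingertip stimulator array. Thresholds and the linear mapping are
# illustrative assumptions, not parameters from the paper.

def relay_frame(pressures, threshold=0.05, max_amplitude=1.0):
    """Map one frame of sensor pressures (values in [0, 1]) to stimulator
    amplitudes, suppressing readings below the noise threshold."""
    peak = max(pressures)
    if peak < threshold:
        # Nothing is pressing on the glove; keep all stimulators off.
        return [0.0] * len(pressures)
    # Scale so the strongest contact receives the full stimulation amplitude,
    # preserving the relative pressure pattern (e.g. the washer's outline).
    return [max_amplitude * p / peak if p >= threshold else 0.0
            for p in pressures]
```

Run at the system’s reported 4 kHz response rate, a loop like this would let the wearer feel a small object’s pressure footprint through the glove in real time.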

“We expect our technology to benefit a broad spectrum of applications, such as information transmission, surgical training, teleoperation, and multimedia entertainment,” added Dr Yang.

Here’s a link to and a citation for the paper,

Super-resolution wearable electrotactile rendering system by Weikang Lin, Dongsheng Zhang, Wang Wei Lee, Xuelong Li, Ying Hong, Qiqi Pan, Ruirui Zhang, Guoxiang Peng, Hong Z. Tan, Zhengyou Zhang, Lei Wei, and Zhengbao Yang. Science Advances 9 Sep 2022 Vol 8, Issue 36 DOI: 10.1126/sciadv.abp8738

This paper is open access.