Fission chips—using vinegar to produce ultraviolet (UV) light sensors for more efficient and flexible wearable devices?

Thank you to whoever wrote the headline “Fission chips – How vinegar could revolutionize sensor processing” (I love wordplay), used for an August 28, 2024 news item on ScienceDaily,

Researchers at Macquarie University [Australia] have developed a new way to produce ultraviolet (UV) light sensors, which could lead to more efficient and flexible wearable devices.

The study, published in the journal Small in July [2024], shows how acetic acid vapour — essentially vinegar fumes — can rapidly improve the performance of zinc oxide nanoparticle-based sensors without high-temperature processing.

An August 23, 2024 Macquarie University press release (also on EurekAlert but published on August 28, 2024), which originated the news item, provides more detail about the new technique,

Co-author Professor Shujuan Huang, from the School of Engineering at Macquarie University, says: “We found by briefly exposing the sensor to vinegar vapour, adjoining particles of zinc oxide on the sensor’s surface would merge together, forming a bridge that could conduct energy.”

Joining zinc oxide nanoparticles together is a critical part of building tiny sensors, as it creates channels for electrons to flow through.

The research team found that their vapour method could make UV detectors 128,000 times more responsive than untreated ones, and that the sensors could still accurately detect UV light without interference, making them highly sensitive and reliable.
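
For readers curious how a figure like “128,000 times more responsive” is typically arrived at, here is a minimal sketch of a responsivity comparison. All the numbers below are hypothetical placeholders for illustration, not measurements from the Macquarie study; responsivity is conventionally the net photocurrent generated per watt of incident light:

```python
# Illustrative sketch: computing a photodetector responsivity enhancement
# factor. Every value here is a hypothetical placeholder, not data from
# the paper.

def responsivity(photo_current_a, dark_current_a, incident_power_w):
    """Responsivity in A/W: net photocurrent per watt of incident UV light."""
    return (photo_current_a - dark_current_a) / incident_power_w

incident_power = 1e-6  # assume 1 microwatt of incident UV power

untreated = responsivity(photo_current_a=2.0e-11, dark_current_a=1.0e-11,
                         incident_power_w=incident_power)
treated = responsivity(photo_current_a=1.28e-6, dark_current_a=1.0e-11,
                       incident_power_w=incident_power)

print(f"Untreated: {untreated:.2e} A/W")
print(f"Treated:   {treated:.2e} A/W")
print(f"Enhancement: {treated / untreated:,.0f}x")  # ~128,000x with these numbers
```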

Associate Professor Noushin Nasiri, co-author on the paper and head of the Nanotech Laboratory at Macquarie University, says: “Usually, these sensors are processed in an oven, heated at high temperature for 12 hours or so, before they can operate or transmit any signal.”

Instead, the team found a simple chemical way to replicate the effects of the heat treatment.

“We found a way to process these sensors at room temperature with a very cheap ingredient – vinegar. You just expose the sensor to vinegar vapour for five minutes, and that’s it – you have a working sensor,” she says.

To create the sensors, the researchers sprayed a zinc solution into a flame, producing a fine mist of zinc oxide nanoparticles that settled onto platinum electrodes. This formed a thin sponge-like film, which they then exposed to vinegar vapour for five to 20 minutes.

The vinegar vapour changed how the tiny particles in the film were arranged, helping the particles connect to each other, so electrons could flow through the sensor. At the same time, the particles stayed small enough to detect light effectively.

“These sensors are made of many, many tiny particles that need to be connected for the sensor to work,” says Associate Professor Nasiri.

“Until we treat them, the particles just sit next to each other, almost as if they have a wall around them, so when light creates an electrical signal in one particle, it can’t easily travel to the next particle. That’s why an untreated sensor doesn’t give us a good signal.”

The researchers carried out intensive testing of different formulations before striking the right balance in their process.

“Water alone isn’t strong enough to make the particles join. But pure vinegar is too strong and destroys the whole structure,” says Professor Huang. “We had to find just the right mix.”

The study shows the best results came from sensors exposed to the vapour for around 15 minutes. Longer exposure times caused too many structural changes and worse performance.

“The unique structure of these highly porous nanofilms enables oxygen to penetrate deeply, so that the entire film is part of the sensing mechanism,” Professor Huang says.

The new room-temperature vapour technique has many advantages over current high-temperature methods. It allows the use of heat-sensitive materials and flexible bases, and is cheaper and better for the environment.

Associate Professor Nasiri says the process can easily be scaled up commercially.

“The sensor materials could be laid out on a rolling plate, passing through an enclosed environment with vinegar vapours, and be ready to use in less than 20 minutes.”

The process will be a real advantage in creating wearable UV sensors, which need to be flexible and to use very little power.

Associate Professor Nasiri says the method could be extended beyond UV sensors to other types of sensors, using simple chemical vapour treatments in place of high-temperature processing across a wide range of functional materials, nanostructures and substrates.

Here’s a link to and a citation for the paper,

Vapor-Tailored Nanojunctions in Ultraporous ZnO Nanoparticle Networks for Superior UV Photodetection by Jeff Huang, Xiaohu Chen, Shujuan Huang, Noushin Nasiri. Small, first published: 20 July 2024. DOI: https://doi.org/10.1002/smll.202402558

This paper is open access.

Smartphone as augmented reality system with software from Brown University

You need to see this (video demonstration embedded in the original post),

Amazing, eh? The researchers are scheduled to present this work sometime this week at the ACM Symposium on User Interface Software and Technology (UIST) being held in New Orleans, US, from October 20-23, 2019.

Here’s more about ‘Portal-ble’ in an October 16, 2019 news item on ScienceDaily,

A new software system developed by Brown University [US] researchers turns cell phones into augmented reality portals, enabling users to place virtual building blocks, furniture and other objects into real-world backdrops, and use their hands to manipulate those objects as if they were really there.

The developers hope the new system, called Portal-ble, could be a tool for artists, designers, game developers and others to experiment with augmented reality (AR). The team will present the work later this month at the ACM Symposium on User Interface Software and Technology (UIST 2019) in New Orleans. The source code for Android is freely available for download on the researchers’ website, and iPhone code will follow soon.

“AR is going to be a great new mode of interaction,” said Jeff Huang, an assistant professor of computer science at Brown who developed the system with his students. “We wanted to make something that made AR portable so that people could use it anywhere without any bulky headsets. We also wanted people to be able to interact with the virtual world in a natural way using their hands.”

An October 16, 2019 Brown University news release (also on EurekAlert), which originated the news item, provides more detail,

Huang said the idea for Portal-ble’s “hands-on” interaction grew out of some frustration with AR apps like Pokemon GO. AR apps use smartphones to place virtual objects (like Pokemon characters) into real-world scenes, but interacting with those objects requires users to swipe on the screen.

“Swiping just wasn’t a satisfying way of interacting,” Huang said. “In the real world, we interact with objects with our hands. We turn doorknobs, pick things up and throw things. So we thought manipulating virtual objects by hand would be much more powerful than swiping. That’s what’s different about Portal-ble.”

The platform makes use of a small infrared sensor mounted on the back of a phone. The sensor tracks the position of people’s hands in relation to virtual objects, enabling users to pick objects up, turn them, stack them or drop them. It also lets people use their hands to virtually “paint” onto real-world backdrops. As a demonstration, Huang and his students used the system to paint a virtual garden into a green space on Brown’s College Hill campus.

Huang says the main technical contribution of the work was developing the right accommodations and feedback tools to enable people to interact intuitively with virtual objects.

“It turns out that picking up a virtual object is really hard if you try to apply real-world physics,” Huang said. “People try to grab in the wrong place, or they put their fingers through the objects. So we had to observe how people tried to interact with these objects and then make our system able to accommodate those tendencies.”

To do that, Huang enlisted students in a class he was teaching to come up with tasks they might want to do in the AR world — stacking a set of blocks, for example. The students then asked other people to try performing those tasks using Portal-ble, while recording what people were able to do and what they couldn’t. They could then adjust the system’s physics and user interface to make interactions more successful.

“It’s a little like what happens when people draw lines in Photoshop,” Huang said. “The lines people draw are never perfect, but the program can smooth them out and make them perfectly straight. Those were the kinds of accommodations we were trying to make with these virtual objects.”
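
The Portal-ble source itself is on the researchers’ website; purely as an illustration of the kind of accommodation being described (the function names, tolerance value and snapping rule below are my own assumptions, not the paper’s implementation), a grab test can be relaxed so that a pinch landing near an object still counts:

```python
import math

# Illustrative sketch of a "forgiving grab" accommodation, in the spirit of
# what the paragraph above describes. The 4 cm tolerance and the snapping
# rule are hypothetical, not taken from the Portal-ble code.

GRAB_TOLERANCE_M = 0.04  # accept pinches within ~4 cm of the object's surface

def try_grab(pinch_point, obj_center, obj_radius):
    """Return True if a pinch near (not necessarily touching) the object
    counts as a grab, compensating for fingers that overshoot or stop short."""
    gap = math.dist(pinch_point, obj_center) - obj_radius
    return gap <= GRAB_TOLERANCE_M

# A pinch 3 cm outside a 5 cm-radius block still "snaps" onto it:
print(try_grab(pinch_point=(0.0, 0.0, 0.08),
               obj_center=(0.0, 0.0, 0.0),
               obj_radius=0.05))  # True
```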

The team also added sensory feedback — visual highlights on objects and phone vibrations — to make interactions easier. Huang said he was somewhat surprised that the vibrations helped, since users feel them in the hand holding the phone, not in the hand that’s actually grabbing for the virtual object. Even so, the vibration feedback helped users interact with objects more successfully.
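
As a rough sketch of how those two feedback channels might be wired together (again an assumption for illustration; the class and the vibration cue stand in for whatever rendering and haptics calls the real system uses):

```python
from dataclasses import dataclass

# Illustrative sketch of the two feedback channels the article mentions.
# The class and the haptic-pulse convention are hypothetical stand-ins for
# the platform's real rendering and haptics APIs.

@dataclass
class VirtualObject:
    highlight: str = "none"  # "none", "nearby", or "held"

def give_feedback(obj: VirtualObject, within_reach: bool, grabbed: bool) -> bool:
    """Update the object's visual highlight; return True when the caller
    should fire a short vibration in the phone-holding hand."""
    if grabbed:
        obj.highlight = "held"
        return True
    obj.highlight = "nearby" if within_reach else "none"
    return False

block = VirtualObject()
print(give_feedback(block, within_reach=True, grabbed=False), block.highlight)
print(give_feedback(block, within_reach=True, grabbed=True), block.highlight)
```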

In follow-up studies, users reported that the accommodations and feedback used by the system made tasks significantly easier, less time-consuming and more satisfying.

Huang and his students plan to continue working with Portal-ble — expanding its object library, refining interactions and developing new activities. They also hope to streamline the system to make it run entirely on a phone. Currently, it relies on an infrared sensor and an external compute stick for extra processing power.

Huang hopes people will download the freely available source code and try it for themselves.

“We really just want to put this out there and see what people do with it,” he said. “The code is on our website for people to download, edit and build off of. It will be interesting to see what people do with it.”

Co-authors on the research paper were Jing Qian, Jiaju Ma, Xiangyu Li, Benjamin Attal, Haoming Lai, James Tompkin and John Hughes. The work was supported by the National Science Foundation (IIS-1552663) and by a gift from Pixar.

You can find the conference paper here on jeffhuang.com,

Portal-ble: Intuitive Free-hand Manipulation in Unbounded Smartphone-based Augmented Reality by Jing Qian, Jiaju Ma, Xiangyu Li, Benjamin Attal, Haoming Lai, James Tompkin, John F. Hughes, Jeff Huang. Brown University, Providence RI, USA; Southeast University, Nanjing, China. Presented at the ACM Symposium on User Interface Software and Technology (UIST 2019), New Orleans, US, October 20-23, 2019.

This is the first time I’ve seen an augmented reality system that seems accessible, i.e., affordable. You can find out more on the Portal-ble ‘resource’ page where you’ll also find a link to the source code repository. The researchers, as noted in the news release, have an Android version available now with an iPhone version to be released in the future.