Cellulose nanocrystals—bio-based nanomaterials derived from natural resources such as plant cellulose—are valuable for their use in water treatment, packaging, tissue engineering, electronics, antibacterial coatings and much more. Though the materials provide a sustainable alternative to non-bio-based materials, transporting them in liquid form taxes industrial infrastructure and carries environmental costs.
A team of Penn State [Pennsylvania State University] chemical engineering researchers studied the mechanisms of drying the nanocrystals and proposed a nanoengineering approach that renders the nanocrystals highly redispersible in aqueous media, while retaining their full functionality, to make them easier to store and transport. They published their results in the journal Biomacromolecules.
This image illustrates what the drying process does,
This graphic representation of hairy cellulose nanocrystals, shown attached at their hairy ends when dried (right), will be featured as the Biomacromolecules journal cover in the Jan. 17 issue. Credit: Sheikhi Research Group. All Rights Reserved.
“We looked at how we could take hairy nanocrystals, dry them in ovens, and redisperse them in solutions containing different ions,” said co-first author Breanna Huntington, current chemical engineering doctoral student at the University of Delaware and former member of the Sheikhi Research Group while an undergraduate student at Penn State. “We then compared their functionality to conventional, non-hairy cellulose nanocrystals.”
The nanocrystals have negatively charged cellulose chains at their ends, known as hairs. When rehydrated, the hairs repel each other and separate, dispersing again through a liquid, as a result of electrosteric repulsion — a combination of charge-driven (electrostatic) and excluded-volume (steric) effects.
“The hairy ends of the nanocrystals are nanoengineered to be negatively charged and repel each other when placed in an aqueous medium,” said corresponding author Amir Sheikhi, Penn State assistant professor of chemical engineering and of biomedical engineering. “To have maximum function, the nanocrystals must be separate, individual particles, not chained together as they are when they are dry.”
After the hairy particles were redispersed, the researchers measured their size and surface properties and found that their characteristics and performance matched those of particles that had never been dried. They also found the particles performed well and maintained their stability in liquid mixtures of varying salinity and pH.
“The hairy nanocrystals can become redispersed even at high salt concentrations, which is convenient, as they remain functional in harsh media and may be used in a broad range of applications,” said co-first author Mica Pitcher, Penn State doctoral student in chemistry, supervised by Sheikhi. “This work may pave the way for sustainable and large-scale processing of nanocelluloses without using additive or energy-intensive methods.”
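To make the salt point concrete, here is a back-of-the-envelope sketch (my own illustration, not from the paper) of the Debye screening length, which sets the range of electrostatic repulsion between charged particles and shrinks as ionic strength rises; that shrinking is exactly why purely electrostatic stabilization fails in salty media while the steric 'hairs' keep working,

```python
import numpy as np

# Debye screening length for a 1:1 electrolyte in water at 25 C.
# Standard textbook formula; all values are for illustration only.
eps0, eps_r = 8.854e-12, 78.5          # permittivity of vacuum, of water
kT = 1.381e-23 * 298.15                # thermal energy (J)
e, NA = 1.602e-19, 6.022e23            # elementary charge, Avogadro's number

def debye_length_nm(c_molar):
    I = c_molar * 1e3                  # ionic strength in mol/m^3
    return np.sqrt(eps0 * eps_r * kT / (2 * NA * e**2 * I)) * 1e9

for c in (0.001, 0.01, 0.1, 1.0):      # salt concentration, mol/L
    print(f"{c:5.3f} M -> Debye length ~ {debye_length_nm(c):5.2f} nm")
```

At 1 mM the screening length is about 10 nm; at 1 M it collapses to roughly 0.3 nm, leaving the hairs' excluded volume to do the repelling.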
The Penn State College of Engineering Summer Research Experiences for Undergraduates program and the NASA Pennsylvania Space Grant Consortium graduate fellowship program supported this work.
This news is intriguing since researchers usually want to enhance memory, not weaken it. Interestingly, this October 3, 2022 news item on ScienceDaily doesn’t immediately answer why you might want to weaken memory,
Robotics and wearable devices might soon get a little smarter with the addition of a stretchy, wearable synaptic transistor developed by Penn State engineers. The device works like neurons in the brain to send signals to some cells and inhibit others in order to enhance and weaken the devices’ memories.
Led by Cunjiang Yu, Dorothy Quiggle Career Development Associate Professor of Engineering Science and Mechanics and associate professor of biomedical engineering and of materials science and engineering, the team designed the synaptic transistor to be integrated in robots or wearables and use artificial intelligence to optimize functions. The details were published on Sept. 29 [2022] in Nature Electronics.
“Mirroring the human brain, robots and wearable devices using the synaptic transistor can use its artificial neurons to ‘learn’ and adapt their behaviors,” Yu said. “For example, if we burn our hand on a stove, it hurts, and we know to avoid touching it next time. The same results will be possible for devices that use the synaptic transistor, as the artificial intelligence is able to ‘learn’ and adapt to its environment.”
According to Yu, the artificial neurons in the device were designed to perform like neurons in the ventral tegmental area, a tiny segment of the human brain located in the uppermost part of the brain stem. Neurons process and transmit information by releasing neurotransmitters at their synapses, typically located at the neural cell ends. Excitatory neurotransmitters trigger the activity of other neurons and are associated with enhancing memories, while inhibitory neurotransmitters reduce the activity of other neurons and are associated with weakening memories.
“Unlike all other areas of the brain, neurons in the ventral tegmental area are capable of releasing both excitatory and inhibitory neurotransmitters at the same time,” Yu said. “By designing the synaptic transistor to operate with both synaptic behaviors simultaneously, fewer transistors are needed [emphasis mine] compared to conventional integrated electronics technology, which simplifies the system architecture and allows the device to conserve energy.”
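To illustrate the 'both behaviors at once' idea in Yu's quote, here is a toy weight-update rule (my own sketch, not the device model from the Nature Electronics paper) in which every input pulse applies an excitatory (potentiating) and an inhibitory (depressing) term at the same time, with the balance deciding whether the stored 'memory' is enhanced or weakened,

```python
# Hypothetical synapse-like element: one pulse simultaneously applies
# excitatory and inhibitory contributions to a single stored weight.
# All parameter values are assumptions for illustration.
def update_weight(w, pulse, a_exc=0.10, a_inh=0.06, w_max=1.0):
    dw = a_exc * pulse * (w_max - w)   # excitatory: pushes weight up
    dw -= a_inh * pulse * w            # inhibitory: pulls weight down
    return min(max(w + dw, 0.0), w_max)

w = 0.5
for p in (1, 1, 1, 0, 0, 1):           # presynaptic pulse train
    w = update_weight(w, p)
    print(f"pulse={p}  weight={w:.3f}")
```

Because both terms act on every pulse, a single element covers behavior that would otherwise require separate excitatory and inhibitory devices, which is the economy Yu describes.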
To model soft, stretchy biological tissues, the researchers used stretchable bilayer semiconductor materials to fabricate the device, allowing it to stretch and twist while in use, according to Yu. Conventional transistors, on the other hand, are rigid and will break when deformed.
“The transistor is mechanically deformable and functionally reconfigurable, yet still retains its functions when stretched extensively,” Yu said. “It can attach to a robot or wearable device to serve as their outermost skin.”
In addition to Yu, other contributors include Hyunseok Shim and Shubham Patel, Penn State Department of Engineering Science and Mechanics; Yongcao Zhang, the University of Houston Materials Science and Engineering Program; Faheem Ershad, Penn State Department of Biomedical Engineering and University of Houston Department of Biomedical Engineering; Binghao Wang, School of Electronic Science and Engineering, Southeast University [Note: There’s one in Bangladesh, one in China, and there’s a Southeastern University in Florida, US] and Department of Chemistry and the Materials Research Center, Northwestern University; Zhihua Chen, Flexterra Inc.; Tobin J. Marks, Department of Chemistry and the Materials Research Center, Northwestern University; Antonio Facchetti, Flexterra Inc. and Northwestern University’s Department of Chemistry and Materials Research Center.
An April 7, 2022 news item on Nanowerk announces graphene research that could lead to advances in optoelectronics (Note: Links have been removed),
An international team, co-led by researchers at The University of Manchester’s National Graphene Institute (NGI) in the UK and the Penn State [Pennsylvania State University] College of Engineering in the US, has developed a tunable graphene-based platform that allows for fine control over the interaction between light and matter in the terahertz (THz) spectrum to reveal rare phenomena known as exceptional points.
The team published their results in Science (“Topological engineering of terahertz light using electrically tuneable exceptional point singularities”).
The work could advance optoelectronic technologies that better generate, control and sense light, and could potentially improve communications, according to the researchers. They demonstrated a way to control THz waves, which exist at frequencies between those of microwaves and infrared waves. The feat could contribute to the development of ‘beyond-5G’ wireless technology for high-speed communication networks.
Light and matter can couple, interacting at different levels: weakly, where they might be correlated but do not change each other’s constituents; or strongly, where their interactions can fundamentally change the system. The ability to control how the coupling shifts from weak to strong and back again has been a major challenge to advancing optoelectronic devices — a challenge researchers have now solved.
“We have demonstrated a new class of optoelectronic devices using concepts of topology — a branch of mathematics studying properties of geometric objects,” said co-corresponding author Coskun Kocabas, professor of 2D device materials at The University of Manchester. “Using exceptional point singularities, we show that topological concepts can be used to engineer optoelectronic devices that enable new ways to manipulate terahertz light.”
Exceptional points are spectral singularities — points at which any two spectral values in an open system coalesce. They are, unsurprisingly, exceptionally sensitive and respond to even the smallest changes to the system, revealing curious yet desirable characteristics, according to co-corresponding author Şahin K. Özdemir, associate professor of engineering science and mechanics at Penn State.
“At an exceptional point, the energy landscape of the system is considerably modified, resulting in reduced dimensionality and skewed topology,” said Özdemir, who is also affiliated with the Materials Research Institute, Penn State. “This, in turn, enhances the system’s response to perturbations, modifies the local density of states leading to the enhancement of spontaneous emission rates and leads to a plethora of phenomena. Control of exceptional points, and the physical processes that occur at them, could lead to applications for better sensors, imaging, lasers and much more.”
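For readers who want a concrete handle on 'spectral values coalescing', here is a minimal numerical sketch (my own illustration, not the paper's device model): two coupled modes with different loss rates, whose complex eigenvalues merge at the exceptional point,

```python
import numpy as np

# Toy non-Hermitian two-mode Hamiltonian (illustrative values only).
# The eigenvalues coalesce at the exceptional point, which for this
# matrix sits at coupling g = |gamma1 - gamma2| / 2 = 0.10.
omega = 1.0                        # shared resonance frequency
gamma1, gamma2 = 0.05, 0.25        # loss rates of the two modes

for g in (0.05, 0.10, 0.15):       # below, at, and above the EP
    H = np.array([[omega - 1j * gamma1, g],
                  [g, omega - 1j * gamma2]])
    e1, e2 = np.linalg.eigvals(H)
    print(f"g = {g:.2f}  eigenvalue splitting = {abs(e1 - e2):.4f}")
```

Below the exceptional point the two modes share a frequency but decay at different rates (weak coupling); above it they split in frequency (strong coupling); at the point itself the two eigenvalues become one, which is why the system is so exquisitely sensitive there.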
Platform composition
The platform the researchers developed consists of a graphene-based tunable THz resonator, with a gold-foil gate electrode forming a bottom reflective mirror. Above it, a graphene layer is book-ended with electrodes, forming a tunable top mirror. A non-volatile ionic liquid electrolyte layer sits between the mirrors, enabling control of the top mirror’s reflectivity by changing the applied voltage. In the middle of the device, between the mirrors, are molecules of alpha lactose, a sugar commonly found in milk.
The system is controlled by two adjusters. One raises the lower mirror to change the length of the cavity — tuning the resonance frequency to couple the light with the collective vibrational modes of the organic sugar molecules, which serve as a fixed number of oscillators for the system. The other adjuster changes the voltage applied to the top graphene mirror — altering the graphene’s reflectivity to shift the system’s energy-loss imbalance and thereby adjust the coupling strength. This delicate fine tuning shifts weakly coupled terahertz light and organic molecules to become strongly coupled and vice versa.
“Exceptional points coincide with the crossover point between the weak and strong coupling regimes of terahertz light with collective molecular vibrations,” Özdemir said.
He noted that these singularity points are typically studied and observed in the coupling of analogous modes or systems, such as two optical modes, electronic modes or acoustic modes.
“This work is one of rare cases where exceptional points are demonstrated to emerge in the coupling of two modes with different physical origins,” Kocabas said. “Due to the topology of the exceptional points, we observed a significant modulation in the magnitude and phase of the terahertz light, which could find applications in next-generation THz communications.”
Unprecedented phase modulation in the THz spectrum
As the researchers apply voltage and adjust the resonance, they drive the system to an exceptional point and beyond. Before, at and beyond the exceptional point, the geometric properties — the topology — of the system change.
One such change is the phase modulation, which describes how a wave changes as it propagates and interacts in the THz field. Controlling the phase and amplitude of THz waves is a technological challenge, the researchers said, but their platform demonstrates unprecedented levels of phase modulation. The researchers moved the system through exceptional points, as well as along loops around exceptional points in different directions, and measured how it responded to the changes. Depending on the system’s topology at the point of measurement, the phase modulation ranged from zero to four orders of magnitude larger.
“We can electrically steer the device through an exceptional point, which enables electrical control on reflection topology,” said first author M. Said Ergoktas. “Only by controlling the topology of the system electronically could we achieve these huge modulations.”
According to the researchers, the topological control of light-matter interactions around an exceptional point enabled by the graphene-based platform has potential applications ranging from topological optoelectronic and quantum devices to topological control of physical and chemical processes.
Contributors include Kaiyuan Wang, Gokhan Bakan, Thomas B. Smith, Alessandro Principi and Kostya S. Novoselov, University of Manchester; Sina Soleymani, graduate student in the Penn State Department of Engineering Science and Mechanics; Sinan Balci, Izmir Institute of Technology, Turkey; Nurbek Kakenov, who conducted work for this paper while at Bilkent University, Turkey.
I love the language in this press release, especially ‘spectral singularities’. The explanations are much appreciated and help make this image more than a pretty picture,
Caption: An international team, co-led by researchers at The University of Manchester’s National Graphene Institute (NGI) in the UK and the Penn State College of Engineering in the US, has developed a tunable graphene-based platform that allows for fine control over the interaction between light and matter in the terahertz (THz) spectrum to reveal rare phenomena known as exceptional points. The feat could contribute to the development of beyond-5G wireless technology for high-speed communication networks. Credit: Image Design, Pietro Steiner, The University of Manchester
Oddly, there is an identical press release dated April 8, 2022 on the Pennsylvania State University website with a byline crediting Ashley J. WennersHerron and Alan Beck. Interestingly, the first author is from Penn State and the second is from the University of Manchester.
Talk of artificial brains (also known as brainlike computing or neuromorphic computing) usually turns to memory fairly quickly. This February 3, 2022 news item on ScienceDaily does too, although the focus is on how memory and forgetting affect the ability to learn,
When the human brain learns something new, it adapts. But when artificial intelligence learns something new, it tends to forget information it already learned.
As companies use more and more data to improve how AI recognizes images, learns languages and carries out other complex tasks, a paper publishing in Science this week shows a way that computer chips could dynamically rewire themselves to take in new data like the brain does, helping AI to keep learning over time.
“The brains of living beings can continuously learn throughout their lifespan. We have now created an artificial platform for machines to learn throughout their lifespan,” said Shriram Ramanathan, a professor in Purdue University’s [Indiana, US] School of Materials Engineering who specializes in discovering how materials could mimic the brain to improve computing.
Unlike the brain, which constantly forms new connections between neurons to enable learning, the circuits on a computer chip don’t change. A circuit that a machine has been using for years isn’t any different than the circuit that was originally built for the machine in a factory.
This is a problem for making AI more portable, such as for autonomous vehicles or robots in space that would have to make decisions on their own in isolated environments. If AI could be embedded directly into hardware rather than just running on software as AI typically does, these machines would be able to operate more efficiently.
In this study, Ramanathan and his team built a new piece of hardware that can be reprogrammed on demand through electrical pulses. Ramanathan believes that this adaptability would allow the device to take on all of the functions that are necessary to build a brain-inspired computer.
“If we want to build a computer or a machine that is inspired by the brain, then correspondingly, we want to have the ability to continuously program, reprogram and change the chip,” Ramanathan said.
Toward building a brain in chip form
The hardware is a small, rectangular device made of a material called perovskite nickelate, which is very sensitive to hydrogen. Applying electrical pulses at different voltages allows the device to shuffle a concentration of hydrogen ions in a matter of nanoseconds, creating states that the researchers found could be mapped out to corresponding functions in the brain.
When the device has more hydrogen near its center, for example, it can act as a neuron, a single nerve cell. With less hydrogen at that location, the device serves as a synapse, a connection between neurons, which is what the brain uses to store memory in complex neural circuits.
Through simulations of the experimental data, the Purdue team’s collaborators at Santa Clara University and Portland State University showed that the internal physics of this device creates a dynamic structure for an artificial neural network that is able to more efficiently recognize electrocardiogram patterns and digits compared to static networks. This neural network uses “reservoir computing,” which explains how different parts of a brain communicate and transfer information.
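The press release's gloss on reservoir computing is brief, so here is a minimal echo-state sketch (my own illustration in code; the paper implements its reservoir in nickelate hardware): a fixed random recurrent network is driven by the input, and only a linear readout is trained,

```python
import numpy as np

# Echo-state network: the recurrent "reservoir" is random and fixed;
# training touches only the linear readout. Sizes and the toy task
# are assumptions for illustration.
rng = np.random.default_rng(0)
n_res, n_in = 100, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

def run_reservoir(u):
    x, states = np.zeros(n_res), []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)          # nonlinear state update
        states.append(x.copy())
    return np.array(states)

t = np.linspace(0, 20, 500)
u, y = np.sin(t)[:, None], np.sin(t + 0.3)       # toy task: predict a phase shift
X = run_reservoir(u)
W_out = np.linalg.lstsq(X, y, rcond=None)[0]     # train the readout only
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```

The appeal for hardware like the nickelate device is that the reservoir's rich dynamics can come for free from the physics, leaving only the cheap linear readout to train.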
Researchers from The Pennsylvania State University also demonstrated in this study that as new problems are presented, a dynamic network can “pick and choose” which circuits are the best fit for addressing those problems.
Since the team was able to build the device using standard semiconductor-compatible fabrication techniques and operate the device at room temperature, Ramanathan believes that this technique can be readily adopted by the semiconductor industry.
“We demonstrated that this device is very robust,” said Michael Park, a Purdue Ph.D. student in materials engineering. “After programming the device over a million cycles, the reconfiguration of all functions is remarkably reproducible.”
The researchers are working to demonstrate these concepts on large-scale test chips that would be used to build a brain-inspired computer.
Experiments at Purdue were conducted at the FLEX Lab and Birck Nanotechnology Center of Purdue’s Discovery Park. The team’s collaborators at Argonne National Laboratory, the University of Illinois, Brookhaven National Laboratory and the University of Georgia conducted measurements of the device’s properties.
Here’s a link to and a citation for the paper,
Reconfigurable perovskite nickelate electronics for artificial intelligence by Hai-Tian Zhang, Tae Joon Park, A. N. M. Nafiul Islam, Dat S. J. Tran, Sukriti Manna, Qi Wang, Sandip Mondal, Haoming Yu, Suvo Banik, Shaobo Cheng, Hua Zhou, Sampath Gamage, Sayantan Mahapatra, Yimei Zhu, Yohannes Abate, Nan Jiang, Subramanian K. R. S. Sankaranarayanan, Abhronil Sengupta, Christof Teuscher, Shriram Ramanathan. Science • 3 Feb 2022 • Vol 375, Issue 6580 • pp. 533-539 • DOI: 10.1126/science.abj7943
An Oct. 29, 2020 news item on ScienceDaily features an explanation of the reasons for investigating brainlike (neuromorphic) computing,
As progress in traditional computing slows, new forms of computing are coming to the forefront. At Penn State, a team of engineers is attempting to pioneer a type of computing that mimics the efficiency of the brain’s neural networks while exploiting the brain’s analog nature.
Modern computing is digital, made up of two states, on-off or one and zero. An analog computer, like the brain, has many possible states. It is the difference between flipping a light switch on or off and turning a dimmer switch to varying amounts of lighting.
Neuromorphic or brain-inspired computing has been studied for more than 40 years, according to Saptarshi Das, the team leader and Penn State [Pennsylvania State University] assistant professor of engineering science and mechanics. What’s new is that as the limits of digital computing have been reached, the need for high-speed image processing, for instance for self-driving cars, has grown. The rise of big data, which requires types of pattern recognition for which the brain architecture is particularly well suited, is another driver in the pursuit of neuromorphic computing.
“We have powerful computers, no doubt about that, the problem is you have to store the memory in one place and do the computing somewhere else,” Das said.
The shuttling of this data from memory to logic and back again takes a lot of energy and slows the speed of computing. In addition, this computer architecture requires a lot of space. If the computation and memory storage could be located in the same space, this bottleneck could be eliminated.
“We are creating artificial neural networks, which seek to emulate the energy and area efficiencies of the brain,” explained Thomas Shranghamer, a doctoral student in the Das group and first author on a paper recently published in Nature Communications. “The brain is so compact it can fit on top of your shoulders, whereas a modern supercomputer takes up a space the size of two or three tennis courts.”
Like synapses connecting the neurons in the brain that can be reconfigured, the artificial neural networks the team is building can be reconfigured by applying a brief electric field to a sheet of graphene, the one-atom-thick layer of carbon atoms. In this work they show at least 16 possible memory states, as opposed to the two in most oxide-based memristors, or memory resistors [emphasis mine].
“What we have shown is that we can control a large number of memory states with precision using simple graphene field effect transistors [emphasis mine],” Das said.
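A toy calculation (mine, not the paper's) shows why 16 states per device matters for neuromorphic hardware: quantizing synaptic weights to 16 levels instead of 2 cuts the storage error per cell dramatically,

```python
import numpy as np

# Quantize target weights to the nearest of n_states evenly spaced
# levels; compare 2-state (binary memristor) vs 16-state storage.
def quantize(w, n_states):
    levels = np.linspace(-1.0, 1.0, n_states)
    idx = np.abs(w[:, None] - levels[None, :]).argmin(axis=1)
    return levels[idx]

rng = np.random.default_rng(0)
weights = rng.uniform(-1, 1, 10_000)   # hypothetical synaptic weights
for n in (2, 16):
    err = np.mean((weights - quantize(weights, n)) ** 2)
    print(f"{n:2d} states -> mean squared quantization error {err:.5f}")
```

Finer levels per device mean fewer devices per synapse, which is where the area and energy savings come from.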
The team thinks that ramping up this technology to a commercial scale is feasible. With many of the largest semiconductor companies actively pursuing neuromorphic computing, Das believes they will find this work of interest.
Caption: Tiny quantum fluctuations in the early universe explain two major mysteries about the large-scale structure of the universe, in a cosmic tango of the very small and the very large. A new study by researchers at Penn State used the theory of quantum loop gravity to account for these mysteries, which Einstein’s theory of general relativity considers anomalous. Credit: Dani Zemba, Penn State
A July 29, 2020 news item on ScienceDaily announces a study showing that quantum loop cosmology can account for some large-scale mysteries,
While [1] Einstein’s theory of general relativity can explain a large array of fascinating astrophysical and cosmological phenomena, some aspects of the properties of the universe at the largest-scales remain a mystery. A new study using loop quantum cosmology — a theory that uses quantum mechanics to extend gravitational physics beyond Einstein’s theory of general relativity — accounts for two major mysteries. While the differences in the theories occur at the tiniest of scales — much smaller than even a proton — they have consequences at the largest of accessible scales in the universe. The study, which appears online July 29 [2020] in the journal Physical Review Letters, also provides new predictions about the universe that future satellite missions could test.
While [2] a zoomed-out picture of the universe looks fairly uniform, it does have a large-scale structure, for example because galaxies and dark matter are not uniformly distributed throughout the universe. The origin of this structure has been traced back to the tiny inhomogeneities observed in the Cosmic Microwave Background (CMB)–radiation that was emitted when the universe was 380 thousand years young that we can still see today. But the CMB itself has three puzzling features that are considered anomalies because they are difficult to explain using known physics.
“While [3] seeing one of these anomalies may not be that statistically remarkable, seeing two or more together suggests we live in an exceptional universe,” said Donghui Jeong, associate professor of astronomy and astrophysics at Penn State and an author of the paper. “A recent study in the journal Nature Astronomy proposed an explanation for one of these anomalies that raised so many additional concerns, they flagged a ‘possible crisis in cosmology’ [emphasis mine]. Using quantum loop cosmology, however, we have resolved two of these anomalies naturally, avoiding that potential crisis.”
Research over the last three decades has greatly improved our understanding of the early universe, including how the inhomogeneities in the CMB were produced in the first place. These inhomogeneities are a result of inevitable quantum fluctuations in the early universe. During a highly accelerated phase of expansion at very early times–known as inflation–these primordial, minuscule fluctuations were stretched under gravity’s influence and seeded the observed inhomogeneities in the CMB.
“To understand how primordial seeds arose, we need a closer look at the early universe, where Einstein’s theory of general relativity breaks down,” said Abhay Ashtekar, Evan Pugh Professor of Physics, holder of the Eberly Family Chair in Physics, and director of the Penn State Institute for Gravitation and the Cosmos. “The standard inflationary paradigm based on general relativity treats space time as a smooth continuum. Consider a shirt that appears like a two-dimensional surface, but on closer inspection you can see that it is woven by densely packed one-dimensional threads. In this way, the fabric of space time is really woven by quantum threads. In accounting for these threads, loop quantum cosmology allows us to go beyond the continuum described by general relativity where Einstein’s physics breaks down–for example beyond the Big Bang.”
The researchers’ previous investigation into the early universe replaced the idea of a Big Bang singularity, where the universe emerged from nothing, with the Big Bounce, where the current expanding universe emerged from a super-compressed mass that was created when the universe contracted in its preceding phase. They found that all of the large-scale structures of the universe accounted for by general relativity are equally explained by inflation after this Big Bounce using equations of loop quantum cosmology.
In the new study, the researchers determined that inflation under loop quantum cosmology also resolves two of the major anomalies that appear under general relativity.
“The primordial fluctuations we are talking about occur at the incredibly small Planck scale,” said Brajesh Gupt, a postdoctoral researcher at Penn State at the time of the research and currently at the Texas Advanced Computing Center of the University of Texas at Austin. “A Planck length is about 20 orders of magnitude smaller than the radius of a proton. But corrections to inflation at this unimaginably small scale simultaneously explain two of the anomalies at the largest scales in the universe, in a cosmic tango of the very small and the very large.”
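Gupt's '20 orders of magnitude' checks out against the standard values (Planck length about 1.6 × 10⁻³⁵ m, proton charge radius about 8.4 × 10⁻¹⁶ m),

$$
\frac{\ell_{\mathrm{P}}}{r_{\mathrm{proton}}} \approx \frac{1.6\times10^{-35}\,\mathrm{m}}{8.4\times10^{-16}\,\mathrm{m}} \approx 1.9\times10^{-20},
$$

that is, the Planck length is indeed roughly 20 orders of magnitude smaller.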
The researchers also produced new predictions about a fundamental cosmological parameter and primordial gravitational waves that could be tested during future satellite missions, including LiteBird and Cosmic Origins Explorer, which will continue to improve our understanding of the early universe.
That’s a lot of ‘while’. I’ve done this sort of thing, too, and whenever I come across it later, it’s painful.
On a practical level, it’s becoming clear that we need to become more thoughtful about our use of water. We here in Canada tend to take our water for granted, as if we have an inexhaustible supply. According to this August 21, 2008 CBC (Canadian Broadcasting Corporation) online news item, that’s not the case,
Canada’s stores of fresh water are not as plentiful as once thought, and threaten to pinch the economy and pit provinces against each other, a federal document says.
An internal report drafted last December [2007] by Environment Canada warns that climate change and a growing population will further drain resources.
“We can no longer take our extensive water supplies for granted,” says the report, titled A Federal Perspective on Water Quantity Issues.
The Canadian Press obtained the 21-page draft report under the Access to Information Act.
It suggests the federal government take a more hands-on role in managing the country’s water, which is now largely done by the provinces. Ottawa still manages most of the fresh water in the North through water boards.
The Conservatives promised a national water strategy in last fall’s throne speech but have been criticized since for announcing only piecemeal projects.
The Tories, like the previous Liberal government, are also behind in publishing annual reports required by law that show how water supplies are used and maintained.
The last assessment posted on Environment Canada’s website is from 2005-06.
The internal draft report says the government currently does not know enough about the country’s water to properly manage it.
‘This is not a crisis yet. Why would we expect any government, regardless of political leaning or level, to do anything about it?’
“Canada lacks sound information at a national scale on the major uses and user[s] of water,” it says.
“National forecasting of water availability has never been done because traditionally our use of the resource was thought to be unlimited.”
Canada has a fifth of the world’s supply of fresh water, but only seven per cent of it is renewable. The rest comes from ice-age glaciers and underground aquifers.
One per cent of Canada’s total water supply is renewed each year by precipitation, the report says.
Moreover, government data on the country’s groundwater reserves is deemed “sparse and often inadequate.”
That’s in contrast to the United States, which has spent more than a decade mapping its underground water reserves. Canada shares aquifers with the U.S., and the report says: “Our lack of data places Canada at strategic disadvantage for bilateral negotiations with the U.S.”
A comprehensive review [World Wildlife Fund: Watershed Reports, a national assessment of Canada’s freshwater; 2017] of freshwater ecosystems reveals rising threats from pollution, overuse, invasive species and climate change among other problems. Yet, the biggest threat of all may be a lack of information that hinders effective regulation, Ivan Semeniuk reports. …
Some of that information may be out of date.
Getting back on topic, here’s one possible solution to better managing our use of water.
Every day, more than 141 billion liters of water are used solely to flush toilets. With millions of global citizens experiencing water scarcity, what if that amount could be reduced by 50%?
The possibility may exist through research conducted at Penn State, released today (Nov. 18) in Nature Sustainability.
“Our team has developed a robust bio-inspired, liquid, sludge- and bacteria-repellent coating that can essentially make a toilet self-cleaning,” said Tak-Sing Wong, Wormley Early Career Professor of Engineering and associate professor of mechanical engineering and biomedical engineering.
…
Penn State researchers have developed a method that dramatically reduces the amount of water needed to flush a conventional toilet, which usually requires 6 liters. Image: Wong Laboratory for Nature Inspired Engineering
In the Wong Laboratory for Nature Inspired Engineering, housed within the Department of Mechanical Engineering and the Materials Research Institute, researchers have developed a method that dramatically reduces the amount of water needed to flush a conventional toilet, which usually requires 6 liters.
Co-developed by Jing Wang, a doctoral graduate from Wong’s lab, the liquid-entrenched smooth surface (LESS) coating is a two-step spray that, among other applications, can be applied to a ceramic toilet bowl. The first spray, created from molecularly grafted polymers, is the initial step in building an extremely smooth and liquid-repellent foundation.
“When it dries, the first spray grows molecules that look like little hairs, with a diameter about 1,000,000 times thinner than a human hair,” Wang said.
While this first application creates an extremely smooth surface as is, the second spray infuses a thin layer of lubricant around those nanoscopic “hairs” to create a super-slippery surface.
“When we put that coating on a toilet in the lab and dump synthetic fecal matter on it, it (the synthetic fecal matter) just completely slides down and nothing sticks to it (the toilet),” Wang said.
With this novel slippery surface, the toilets can effectively clean residue from inside the bowl and dispose of the waste with only a fraction of the water previously needed. The researchers also predict the coating could last for about 500 flushes in a conventional toilet before a reapplication of the lubricant layer is needed.
While other liquid-infused slippery surfaces can take hours to cure, the LESS two-step coating takes less than five minutes. The researchers’ experiments also found the surface effectively repelled bacteria, particularly ones that spread infectious diseases and unpleasant odors.
If the coating were widely adopted in the United States, the water saved could be directed toward other important activities, to drought-stricken areas or to regions experiencing chronic water scarcity, said the researchers.
Driven by these humanitarian solutions, the researchers also hope their work can make an impact in the developing world. The technology could be used within waterless toilets, which are used extensively around the world.
“Poop sticking to the toilet is not only unpleasant to users, but it also presents serious health concerns,” Wong said.
However, if a waterless toilet or urinal used the LESS coating, the team predicts these types of fixtures would be more appealing and safer for widespread use.
To address these issues in both the United States and around the world, Wong and his collaborators, Wang, Birgitt Boschitsch, and Nan Sun, all mechanical engineering alumni, began a start-up venture.
With support from the Ben Franklin Technology Partners’ TechCelerator, the National Science Foundation, the Department of Energy, the Office of Naval Research, the Rice Business Plan Competition and Y-Combinator, their company, spotLESS Materials, is already bringing the LESS coating to market.
“Our goal is to bring impactful technology to the market so everyone can benefit,” Wong said. “To maximize the impact of our coating technology, we need to get it out of the lab.”
Looking forward, the team hopes spotLESS Materials will play a role in sustaining the world’s water resources and continue expanding the reach of their technology.
“As a researcher in an academic setting, my goal is to invent things that everyone can benefit from,” Wong said. “As a Penn Stater, I see this culture being amplified through entrepreneurship, and I’m excited to contribute.”
This paper is behind a paywall. However, the researchers have made a brief video available,
There you have it. One random thought, that toilet image reminded me of the controversy over Marcel Duchamp, the Fountain, and who actually submitted a urinal for consideration as a piece of art (Jan. 23, 2019 posting). Hint: Some believe it was Baroness Elsa von Freytag-Loringhoven.
Caption: Researchers at ORNL’s Center for Nanophase Materials Sciences demonstrated the first example of capacitance in a lipid-based biomimetic membrane, opening nondigital routes to advanced, brain-like computation. Credit: Michelle Lehman/Oak Ridge National Laboratory, U.S. Dept. of Energy
The last time I wrote about memcapacitors (June 30, 2014 posting: Memristors, memcapacitors, and meminductors for faster computers), the ideas were largely theoretical; I believe this work is the first research I’ve seen on the topic. From an October 17, 2019 news item on ScienceDaily,
Researchers at the Department of Energy’s Oak Ridge National Laboratory [ORNL], the University of Tennessee and Texas A&M University demonstrated bio-inspired devices that accelerate routes to neuromorphic, or brain-like, computing.
Results published in Nature Communications report the first example of a lipid-based “memcapacitor,” a charge storage component with memory that processes information much like synapses do in the brain. Their discovery could support the emergence of computing networks modeled on biology for a sensory approach to machine learning.
“Our goal is to develop materials and computing elements that work like biological synapses and neurons—with vast interconnectivity and flexibility—to enable autonomous systems that operate differently than current computing devices and offer new functionality and learning capabilities,” said Joseph Najem, a recent postdoctoral researcher at ORNL’s Center for Nanophase Materials Sciences, a DOE Office of Science User Facility, and current assistant professor of mechanical engineering at Penn State.
The novel approach uses soft materials to mimic biomembranes and simulate the way nerve cells communicate with one another.
The team designed an artificial cell membrane, formed at the interface of two lipid-coated water droplets in oil, to explore the material’s dynamic, electrophysiological properties. At applied voltages, charges build up on both sides of the membrane as stored energy, analogous to the way capacitors work in traditional electric circuits.
But unlike regular capacitors, the memcapacitor can “remember” a previously applied voltage and—literally—shape how information is processed. The synthetic membranes change surface area and thickness depending on electrical activity. These shapeshifting membranes could be tuned as adaptive filters for specific biophysical and biochemical signals.
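To see what 'capacitance with memory' means in circuit terms, here is a generic toy model (my sketch; it makes no attempt to capture the lipid membrane's actual electrophysiology): the capacitance depends on an internal state that integrates the voltage history, so the stored charge at a given voltage depends on what came before,

```python
import numpy as np

# Generic memcapacitor toy: q = C(x) * v, where the state x rises with
# the drive and relaxes back. All dynamics and parameters are assumed.
dt, tau = 1e-4, 0.02
C_min, C_max = 1.0, 3.0                  # capacitance bounds (arb. units)
t = np.arange(0.0, 0.2, dt)
v = np.sin(2 * np.pi * 50 * t)           # sinusoidal drive

x, caps = 0.0, []
for v_t in v:
    x += dt / tau * (abs(v_t) * (1 - x) - 0.3 * x)  # memory state update
    caps.append(C_min + (C_max - C_min) * x)        # state-dependent C

print(f"capacitance visited: {min(caps):.2f} to {max(caps):.2f} (arb. units)")
```

Plotting charge against voltage for such a device traces the pinched hysteresis loop characteristic of memelements, rather than a plain capacitor's single straight line.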
“The novel functionality opens avenues for nondigital signal processing and machine learning modeled on nature,” said ORNL’s Pat Collier, a CNMS staff research scientist.
A distinct feature of all digital computers is the separation of processing and memory. Information is transferred back and forth from the hard drive and the central processor, creating an inherent bottleneck in the architecture no matter how small or fast the hardware can be.
Neuromorphic computing, modeled on the nervous system, employs architectures that are fundamentally different in that memory and signal processing are co-located in memory elements—memristors, memcapacitors and meminductors.
These “memelements” make up the synaptic hardware of systems that mimic natural information processing, learning and memory.
Systems designed with memelements offer advantages in scalability and low power consumption, but the real goal is to carve out an alternative path to artificial intelligence, said Collier.
Tapping into biology could enable new computing possibilities, especially in the area of “edge computing,” such as wearable and embedded technologies that are not connected to a cloud but instead make on-the-fly decisions based on sensory input and past experience.
Biological sensing has evolved over billions of years into a highly sensitive system with receptors in cell membranes that are able to pick out a single molecule of a specific odor or taste. “This is not something we can match digitally,” Collier said.
Digital computation is built around digital information, the binary language of ones and zeros coursing through electronic circuits. It can emulate the human brain, but its solid-state components do not compute sensory data the way a brain does.
“The brain computes sensory information pushed through synapses in a neural network that is reconfigurable and shaped by learning,” said Collier. “Incorporating biology—using biomembranes that sense bioelectrochemical information—is key to developing the functionality of neuromorphic computing.”
While numerous solid-state versions of memelements have been demonstrated, the team’s biomimetic elements represent new opportunities for potential “spiking” neural networks that can compute natural data in natural ways.
Spiking neural networks are intended to simulate the way neurons spike with electrical potential and, if the signal is strong enough, pass it on to their neighbors through synapses, carving out learning pathways that are pruned over time for efficiency.
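For anyone who hasn't met a spiking neuron model, here is the textbook leaky integrate-and-fire version in a few lines (illustrative parameters, nothing from the paper),

```python
import numpy as np

# Leaky integrate-and-fire: the membrane potential integrates input,
# leaks toward rest, and fires/resets on crossing threshold.
rng = np.random.default_rng(1)
dt, tau = 1e-3, 0.02                    # time step (s), membrane constant
v, v_thresh, v_reset = 0.0, 1.0, 0.0
spike_times = []

for step in range(2000):                # simulate 2 seconds
    i_syn = 2.5 * rng.random()          # random synaptic drive
    v += dt / tau * (-v + i_syn)        # leaky integration
    if v >= v_thresh:                   # strong enough: spike and reset
        spike_times.append(step * dt)
        v = v_reset

print(f"{len(spike_times)} spikes in 2 s; first few at {spike_times[:3]}")
```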
A bio-inspired version with analog data processing is a distant aim. Current early-stage research focuses on developing the components of bio-circuitry.
“We started with the basics, a memristor that can weigh information via conductance to determine if a spike is strong enough to be broadcast through a network of synapses connecting neurons,” said Collier. “Our memcapacitor goes further in that it can actually store energy as an electric charge in the membrane, enabling the complex ‘integrate and fire’ activity of neurons needed to achieve dense networks capable of brain-like computation.”
The team’s next steps are to explore new biomaterials and study simple networks to achieve more complex brain-like functionalities with memelements.
Here’s a link to and a citation for the paper,
Dynamical nonlinear memory capacitance in biomimetic membranes by Joseph S. Najem, Md Sakib Hasan, R. Stanley Williams, Ryan J. Weiss, Garrett S. Rose, Graham J. Taylor, Stephen A. Sarles & C. Patrick Collier. Nature Communications volume 10, Article number: 3239 (2019) DOI: https://doi.org/10.1038/s41467-019-11223-8 Published July 19, 2019
This paper is open access.
One final comment, you might recognize one of the authors (R. Stanley Williams) who in 2008 helped launch ‘memristor’ research.
I think this form of ‘cannibalism’ could also be described as a form of ‘self-assembly’. That said, here is an August 31, 2018 news item on ScienceDaily announcing ‘cannibalistic’ materials,
Scientists at the [US] Department of Energy’s [DOE] Oak Ridge National Laboratory [ORNL] induced a two-dimensional material to cannibalize itself for atomic “building blocks” from which stable structures formed.
The findings, reported in Nature Communications, provide insights that may improve design of 2D materials for fast-charging energy-storage and electronic devices.
“Under our experimental conditions, titanium and carbon atoms can spontaneously form an atomically thin layer of 2D transition-metal carbide, which was never observed before,” said Xiahan Sang of ORNL.
He and ORNL’s Raymond Unocic led a team that performed in situ experiments using state-of-the-art scanning transmission electron microscopy (STEM), combined with theory-based simulations, to reveal the mechanism’s atomistic details.
“This study is about determining the atomic-level mechanisms and kinetics that are responsible for forming new structures of a 2D transition-metal carbide such that new synthesis methods can be realized for this class of materials,” Unocic added.
The starting material was a 2D ceramic called a MXene (pronounced “max een”). Unlike most ceramics, MXenes are good electrical conductors because they are made from alternating atomic layers of carbon or nitrogen sandwiched within transition metals like titanium.
The research was a project of the Fluid Interface Reactions, Structures and Transport (FIRST) Center, a DOE Energy Frontier Research Center that explores fluid–solid interface reactions that have consequences for energy transport in everyday applications. Scientists conducted experiments to synthesize and characterize advanced materials and performed theory and simulation work to explain observed structural and functional properties of the materials. New knowledge from FIRST projects provides guideposts for future studies.
The high-quality material used in these experiments was synthesized by Drexel University scientists, in the form of five-ply single-crystal monolayer flakes of MXene. The flakes were taken from a parent crystal called “MAX,” which contains a transition metal denoted by “M”; an element such as aluminum or silicon, denoted by “A”; and either a carbon or nitrogen atom, denoted by “X.” The researchers used an acidic solution to etch out the monoatomic aluminum layers, exfoliate the material and delaminate it into individual monolayers of a titanium carbide MXene (Ti3C2).
The ORNL scientists suspended a large MXene flake on a heating chip with holes drilled in it so no support material, or substrate, interfered with the flake. Under vacuum, the suspended flake was exposed to heat and irradiated with an electron beam to clean the MXene surface and fully expose the layer of titanium atoms.
MXenes are typically inert because their surfaces are covered with protective functional groups—oxygen, hydrogen and fluorine atoms that remain after acid exfoliation. After protective groups are removed, the remaining material activates. Atomic-scale defects—“vacancies” created when titanium atoms are removed during etching—are exposed on the outer ply of the monolayer. “These atomic vacancies are good initiation sites,” Sang said. “It’s favorable for titanium and carbon atoms to move from defective sites to the surface.” In an area with a defect, a pore may form when atoms migrate.
“Once those functional groups are gone, now you’re left with a bare titanium layer (and underneath, alternating carbon, titanium, carbon, titanium) that’s free to reconstruct and form new structures on top of existing structures,” Sang said.
High-resolution STEM imaging proved that atoms moved from one part of the material to another to build structures. Because the material feeds on itself, the growth mechanism is cannibalistic.
“The growth mechanism is completely supported by density functional theory and reactive molecular dynamics simulations, thus opening up future possibilities to use these theory tools to determine the experimental parameters required for synthesizing specific defect structures,” said Adri van Duin of Penn State [Pennsylvania State University].
Most of the time, only one additional layer [of carbon and titanium] grew on a surface. The material changed as atoms built new layers. Ti3C2 turned into Ti4C3, for example.
“These materials are efficient at ionic transport, which lends itself well to battery and supercapacitor applications,” Unocic said. “How does ionic transport change when we add more layers to nanometer-thin MXene sheets?” This question may spur future studies.
“Because MXenes containing molybdenum, niobium, vanadium, tantalum, hafnium, chromium and other metals are available, there are opportunities to make a variety of new structures containing more than three or four metal atoms in cross-section (the current limit for MXenes produced from MAX phases),” Yury Gogotsi of Drexel University added. “Those materials may show different useful properties and create an array of 2D building blocks for advancing technology.”
At ORNL’s Center for Nanophase Materials Sciences (CNMS), Yu Xie, Weiwei Sun and Paul Kent performed first-principles theory calculations to explain why these materials grew layer by layer instead of forming alternate structures, such as squares. Xufan Li and Kai Xiao helped understand the growth mechanism, which minimizes surface energy to stabilize atomic configurations. Penn State scientists conducted large-scale dynamical reactive force field simulations showing how atoms rearranged on surfaces, confirming defect structures and their evolution as observed in experiments.
The researchers hope the new knowledge will help others grow advanced materials and generate useful nanoscale structures.
Here’s a link to and a citation for the paper,
In situ atomistic insight into the growth mechanisms of single layer 2D transition metal carbides by Xiahan Sang, Yu Xie, Dundar E. Yilmaz, Roghayyeh Lotfi, Mohamed Alhabeb, Alireza Ostadhossein, Babak Anasori, Weiwei Sun, Xufan Li, Kai Xiao, Paul R. C. Kent, Adri C. T. van Duin, Yury Gogotsi, & Raymond R. Unocic. Nature Communications volume 9, Article number: 2266 (2018) DOI: https://doi.org/10.1038/s41467-018-04610-0 Published 11 June 2018
Museum curators planning to develop virtual exhibits online should choose communication and navigation technologies that match the experience they want to offer their visitors, according to a team of researchers.
“When curators think about creating a real-world exhibit, they are thinking about what the theme is and what they want their visitors to get out of the exhibit,” said S. Shyam Sundar, Distinguished Professor of Communications and co-director of the Media Effects Research Laboratory. “What this study suggests is that, just like curators need to be coherent in the content of the exhibit, they need to be conscious of the tools that they employ in their virtual museums.” [emphasis mine]
For some reason that phrase “need to be conscious of the tools that they employ” reminds me of Marshall McLuhan and his dictum “the medium is the message.” Here’s more about the study from the news release,
Many museum curators hope to create an authentic experience in their online museums by using technology to mimic aspects of the social, personal and physical aspects of a real-world museum experience. However, a more-is-better approach to technology may actually hinder that authentic experience, the researchers suggest.
In a study, visitors to an online virtual art museum found that technology tools used to communicate about and navigate through the exhibits were considered helpful when they were available separately, but less so when they were offered together. The researchers tested customization tools that helped the participants create their own art gallery, live-chat technology to facilitate communication with other visitors and 3-D navigation tools that some participants used to explore the museum.
The participants’ experiences often depended on what tools and what combinations of tools they used, according to the researchers, who released their findings in a recent issue of the International Journal of Human-Computer Interaction.
The news release goes on to provide some examples of when technologies do not mesh together for a good experience,
“When live chat and customization are offered together, for example, the combination of tools may be perceived to have increased usability, but it turns out perceived usability was greater when either customization or live chat was used separately than when both functions were offered together, or neither of them,” said Sundar. “We saw similar results not just with perceived usability, but also with sense of control and agency.”
The live chatting tool gave participants a feeling of social presence in the museum, but when live chatting was used in conjunction with the 3D navigation tool, the visitor had less of a sense of control, said Sundar, who worked with Eun Go, assistant professor of broadcasting and journalism, Western Illinois University; Hyang-Sook Kim, assistant professor of mass communication and media communication studies, Towson University; and Bo Zhang, doctoral candidate in mass communications, Penn State.
Similarly, participants indicated the live chatting function lessened the realistic experience of the 3D tool, according to the researchers, who suggested that chatting may increase the user’s cognitive burden as they try to navigate through the site.
Each of these tools carries unique meaning for users, Sundar said. While customization provides an individualized experience, live-chatting signals a social experience of the site.
“Our data also suggest that expert users prefer tools that offer more agency or control to users whereas novices appreciate a variety of tools on the interface,” he added.
Users may react similarly to these tools on other online platforms, not just during visits to online museums, Sundar said.
“We might be able to apply this research on tools you might add to news sites, for example, or it could be used to improve educational sites and long-distance learning,” he added. “You just have to be careful about how you deploy the tools because more is not always better.”
The researchers recruited 126 participants for the study. The subjects were assigned one of eight different website variations that tested their reactions to customization, live chat, 3D navigation and combinations of those tools during their visit to a virtual version of the Museum of Modern Art. The museum’s artworks were made available through the Google Art Project.
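For the record, eight variations is exactly what a full 2 × 2 × 2 factorial design over the three tools produces; a quick enumeration (labels mine),

```python
from itertools import product

# Every present/absent combination of three tools -> 8 conditions.
tools = ("customization", "live_chat", "3d_navigation")
for i, combo in enumerate(product((False, True), repeat=3), start=1):
    enabled = [t for t, on in zip(tools, combo) if on] or ["none"]
    print(f"variation {i}: {', '.join(enabled)}")
```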