An Oct. 29, 2020 news item on ScienceDaily explains the reasons for investigating brainlike (neuromorphic) computing,
As progress in traditional computing slows, new forms of computing are coming to the forefront. At Penn State, a team of engineers is attempting to pioneer a type of computing that mimics the efficiency of the brain’s neural networks while exploiting the brain’s analog nature.
Modern computing is digital, made up of two states, on-off or one and zero. An analog computer, like the brain, has many possible states. It is the difference between flipping a light switch on or off and adjusting a dimmer switch through a range of brightness levels.
Neuromorphic or brain-inspired computing has been studied for more than 40 years, according to Saptarshi Das, the team leader and Penn State [Pennsylvania State University] assistant professor of engineering science and mechanics. What’s new is that as the limits of digital computing have been reached, the need for high-speed image processing, for instance for self-driving cars, has grown. The rise of big data, which requires types of pattern recognition for which the brain architecture is particularly well suited, is another driver in the pursuit of neuromorphic computing.
“We have powerful computers, no doubt about that, the problem is you have to store the memory in one place and do the computing somewhere else,” Das said.
The shuttling of this data from memory to logic and back again takes a lot of energy and slows the speed of computing. In addition, this computer architecture requires a lot of space. If the computation and memory storage could be located in the same space, this bottleneck could be eliminated.
“We are creating artificial neural networks, which seek to emulate the energy and area efficiencies of the brain,” explained Thomas Shranghamer, a doctoral student in the Das group and first author on a paper recently published in Nature Communications. “The brain is so compact it can fit on top of your shoulders, whereas a modern supercomputer takes up a space the size of two or three tennis courts.”
Like synapses connecting the neurons in the brain that can be reconfigured, the artificial neural networks the team is building can be reconfigured by applying a brief electric field to a sheet of graphene, the one-atomic-thick layer of carbon atoms. In this work they show at least 16 possible memory states, as opposed to the two in most oxide-based memristors, or memory resistors [emphasis mine].
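The jump from two states to sixteen matters because the information a single device can store grows with the logarithm of its state count. A quick back-of-the-envelope check (my own sketch, not from the paper):

```python
import math

# Bits per memory element grow as log2 of the number of
# distinguishable states the element can hold.
binary_states = 2      # conventional digital memory, most oxide memristors
graphene_states = 16   # states reported for the graphene devices

print(math.log2(binary_states))    # 1.0 bit per cell
print(math.log2(graphene_states))  # 4.0 bits per cell
```

So a sixteen-state device holds four times the information of a binary one, which is part of the area-efficiency argument the team is making.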
“What we have shown is that we can control a large number of memory states with precision using simple graphene field effect transistors [emphasis mine],” Das said.
The team thinks that ramping up this technology to a commercial scale is feasible. With many of the largest semiconductor companies actively pursuing neuromorphic computing, Das believes they will find this work of interest.
A July 29, 2020 news item on ScienceDaily announces a study showing that quantum loop cosmology can account for some large-scale mysteries,
While Einstein’s theory of general relativity can explain a large array of fascinating astrophysical and cosmological phenomena, some aspects of the properties of the universe at the largest scales remain a mystery. A new study using loop quantum cosmology — a theory that uses quantum mechanics to extend gravitational physics beyond Einstein’s theory of general relativity — accounts for two major mysteries. While the differences in the theories occur at the tiniest of scales — much smaller than even a proton — they have consequences at the largest of accessible scales in the universe. The study, which appears online July 29 in the journal Physical Review Letters, also provides new predictions about the universe that future satellite missions could test.
While a zoomed-out picture of the universe looks fairly uniform, it does have a large-scale structure, for example because galaxies and dark matter are not uniformly distributed throughout the universe. The origin of this structure has been traced back to the tiny inhomogeneities observed in the Cosmic Microwave Background (CMB), radiation that was emitted when the universe was 380 thousand years young that we can still see today. But the CMB itself has three puzzling features that are considered anomalies because they are difficult to explain using known physics.
“While seeing one of these anomalies may not be that statistically remarkable, seeing two or more together suggests we live in an exceptional universe,” said Donghui Jeong, associate professor of astronomy and astrophysics at Penn State and an author of the paper. “A recent study in the journal Nature Astronomy proposed an explanation for one of these anomalies that raised so many additional concerns, they flagged a ‘possible crisis in cosmology’ [emphasis mine]. Using quantum loop cosmology, however, we have resolved two of these anomalies naturally, avoiding that potential crisis.”
Research over the last three decades has greatly improved our understanding of the early universe, including how the inhomogeneities in the CMB were produced in the first place. These inhomogeneities are a result of inevitable quantum fluctuations in the early universe. During a highly accelerated phase of expansion at very early times–known as inflation–these primordial, minuscule fluctuations were stretched under gravity’s influence and seeded the observed inhomogeneities in the CMB.
“To understand how primordial seeds arose, we need a closer look at the early universe, where Einstein’s theory of general relativity breaks down,” said Abhay Ashtekar, Evan Pugh Professor of Physics, holder of the Eberly Family Chair in Physics, and director of the Penn State Institute for Gravitation and the Cosmos. “The standard inflationary paradigm based on general relativity treats space time as a smooth continuum. Consider a shirt that appears like a two-dimensional surface, but on closer inspection you can see that it is woven by densely packed one-dimensional threads. In this way, the fabric of space time is really woven by quantum threads. In accounting for these threads, loop quantum cosmology allows us to go beyond the continuum described by general relativity where Einstein’s physics breaks down–for example beyond the Big Bang.”
The researchers’ previous investigation into the early universe replaced the idea of a Big Bang singularity, where the universe emerged from nothing, with the Big Bounce, where the current expanding universe emerged from a super-compressed mass that was created when the universe contracted in its preceding phase. They found that all of the large-scale structures of the universe accounted for by general relativity are equally explained by inflation after this Big Bounce using equations of loop quantum cosmology.
In the new study, the researchers determined that inflation under loop quantum cosmology also resolves two of the major anomalies that appear under general relativity.
“The primordial fluctuations we are talking about occur at the incredibly small Planck scale,” said Brajesh Gupt, a postdoctoral researcher at Penn State at the time of the research and currently at the Texas Advanced Computing Center of the University of Texas at Austin. “A Planck length is about 20 orders of magnitude smaller than the radius of a proton. But corrections to inflation at this unimaginably small scale simultaneously explain two of the anomalies at the largest scales in the universe, in a cosmic tango of the very small and the very large.”
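Gupt’s “20 orders of magnitude” claim is easy to verify with standard textbook values for the Planck length and the proton charge radius (the constants below are my own inputs, not figures from the paper):

```python
import math

# Standard physical constants (approximate, textbook values).
planck_length = 1.616e-35  # meters
proton_radius = 8.4e-16    # meters (proton charge radius)

# How many powers of ten separate the two scales?
ratio = proton_radius / planck_length
print(round(math.log10(ratio)))  # 20
```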
The researchers also produced new predictions about a fundamental cosmological parameter and primordial gravitational waves that could be tested during future satellite missions, including LiteBIRD and Cosmic Origins Explorer, which will continue to improve our understanding of the early universe.
That’s a lot of ‘while’. I’ve done this sort of thing, too, and whenever I come across it later, it’s painful.
Canada’s stores of fresh water are not as plentiful as once thought, and threaten to pinch the economy and pit provinces against each other, a federal document says.
An internal report drafted last December by Environment Canada warns that climate change and a growing population will further drain resources.
“We can no longer take our extensive water supplies for granted,” says the report, titled A Federal Perspective on Water Quantity Issues.
The Canadian Press obtained the 21-page draft report under the Access to Information Act.
It suggests the federal government take a more hands-on role in managing the country’s water, which is now largely done by the provinces. Ottawa still manages most of the fresh water in the North through water boards.
The Conservatives promised a national water strategy in last fall’s throne speech but have been criticized since for announcing only piecemeal projects.
The Tories, like the previous Liberal government, are also behind in publishing annual reports required by law that show how water supplies are used and maintained.
The last assessment posted on Environment Canada’s website is from 2005-06.
The internal draft report says the government currently does not know enough about the country’s water to properly manage it.
‘This is not a crisis yet. Why would we expect any government, regardless of political leaning or level, to do anything about it?’
“Canada lacks sound information at a national scale on the major uses and user[s] of water,” it says.
“National forecasting of water availability has never been done because traditionally our use of the resource was thought to be unlimited.”
Canada has a fifth of the world’s supply of fresh water, but only seven per cent of it is renewable. The rest comes from ice-age glaciers and underground aquifers.
One per cent of Canada’s total water supply is renewed each year by precipitation, the report says.
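Putting the report’s figures together (my own arithmetic, assuming “a fifth” means 20% and that the 7% renewable figure applies to Canada’s share):

```python
# Figures from the report, combined illustratively.
canada_share = 0.20        # a fifth of the world's fresh water supply
renewable_fraction = 0.07  # only seven per cent of that is renewable

# Canada's renewable fresh water as a share of the world total:
renewable_share = canada_share * renewable_fraction
print(round(renewable_share * 100, 1))  # 1.4 (percent)
```

That framing makes the report’s warning more concrete: the headline “fifth of the world’s supply” shrinks to roughly 1.4 per cent once the non-renewable glacier and aquifer water is set aside.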
Moreover, government data on the country’s groundwater reserves is deemed “sparse and often inadequate.”
That’s in contrast to the United States, which has spent more than a decade mapping its underground water reserves. Canada shares aquifers with the U.S., and the report says: “Our lack of data places Canada at strategic disadvantage for bilateral negotiations with the U.S.”
A comprehensive review [World Wildlife Fund: a national assessment of Canada’s freshwater, Watershed Reports; 2017] of freshwater ecosystems reveals rising threats from pollution, overuse, invasive species and climate change among other problems. Yet, the biggest threat of all may be a lack of information that hinders effective regulation, Ivan Semeniuk reports. …
Some of that information may be out of date.
Getting back on topic, here’s one possible solution to better managing our use of water.
Every day, more than 141 billion liters of water are used solely to flush toilets. With millions of global citizens experiencing water scarcity, what if that amount could be reduced by 50%?
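The scale of that “what if” is easy to put in numbers (a rough sketch using the article’s figure):

```python
# Global daily toilet-flush water use, from the article.
daily_flush_liters = 141e9

# A 50% reduction, as the researchers propose:
daily_savings = daily_flush_liters * 0.5
yearly_savings = daily_savings * 365

print(daily_savings / 1e9)    # 70.5 billion liters saved per day
print(yearly_savings / 1e12)  # ~25.7 trillion liters saved per year
```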
The possibility may exist through research conducted at Penn State, released today (Nov. 18) in Nature Sustainability.
“Our team has developed a robust bio-inspired, liquid, sludge- and bacteria-repellent coating that can essentially make a toilet self-cleaning,” said Tak-Sing Wong, Wormley Early Career Professor of Engineering and associate professor of mechanical engineering and biomedical engineering.
In the Wong Laboratory for Nature Inspired Engineering, housed within the Department of Mechanical Engineering and the Materials Research Institute, researchers have developed a method that dramatically reduces the amount of water needed to flush a conventional toilet, which usually requires 6 liters.
Co-developed by Jing Wang, a doctoral graduate from Wong’s lab, the liquid-entrenched smooth surface (LESS) coating is a two-step spray that, among other applications, can be applied to a ceramic toilet bowl. The first spray, created from molecularly grafted polymers, is the initial step in building an extremely smooth and liquid-repellent foundation.
“When it dries, the first spray grows molecules that look like little hairs, with a diameter about 1,000,000 times thinner than a human hair’s,” Wang said.
While this first application creates an extremely smooth surface as is, the second spray infuses a thin layer of lubricant around those nanoscopic “hairs” to create a super-slippery surface.
“When we put that coating on a toilet in the lab and dump synthetic fecal matter on it, it (the synthetic fecal matter) just completely slides down and nothing sticks to it (the toilet),” Wang said.
With this novel slippery surface, the toilets can effectively clean residue from inside the bowl and dispose of the waste with only a fraction of the water previously needed. The researchers also predict the coating could last for about 500 flushes in a conventional toilet before a reapplication of the lubricant layer is needed.
While other liquid-infused slippery surfaces can take hours to cure, the LESS two-step coating takes less than five minutes. The researchers’ experiments also found the surface effectively repelled bacteria, particularly ones that spread infectious diseases and unpleasant odors.
If it were widely adopted in the United States, the coating could free up critical water resources for other important activities, including in drought-stricken areas or regions experiencing chronic water scarcity, the researchers said.
Driven by these humanitarian solutions, the researchers also hope their work can make an impact in the developing world. The technology could be used within waterless toilets, which are used extensively around the world.
“Poop sticking to the toilet is not only unpleasant to users, but it also presents serious health concerns,” Wong said.
However, if a waterless toilet or urinal used the LESS coating, the team predicts these types of fixtures would be more appealing and safer for widespread use.
To address these issues in both the United States and around the world, Wong and his collaborators, Wang, Birgitt Boschitsch, and Nan Sun, all mechanical engineering alumni, began a start-up venture.
With support from the Ben Franklin Technology Partners’ TechCelerator, the National Science Foundation, the Department of Energy, the Office of Naval Research, the Rice Business Plan Competition and Y-Combinator, their company, spotLESS Materials, is already bringing the LESS coating to market.
“Our goal is to bring impactful technology to the market so everyone can benefit,” Wong said. “To maximize the impact of our coating technology, we need to get it out of the lab.”
Looking forward, the team hopes spotLESS Materials will play a role in sustaining the world’s water resources and continue expanding the reach of their technology.
“As a researcher in an academic setting, my goal is to invent things that everyone can benefit from,” Wong said. “As a Penn Stater, I see this culture being amplified through entrepreneurship, and I’m excited to contribute.”
This paper is behind a paywall. However, the researchers have made a brief video available,
There you have it. One random thought, that toilet image reminded me of the controversy over Marcel Duchamp, the Fountain, and who actually submitted a urinal for consideration as a piece of art (Jan. 23, 2019 posting). Hint: Some believe it was Baroness Elsa von Freytag-Loringhoven.
The last time I wrote about memcapacitors (June 30, 2014 posting: Memristors, memcapacitors, and meminductors for faster computers), the ideas were largely theoretical; I believe this work is the first research I’ve seen on the topic. From an October 17, 2019 news item on ScienceDaily,
Researchers at the Department of Energy’s Oak Ridge National Laboratory [ORNL], the University of Tennessee and Texas A&M University demonstrated bio-inspired devices that accelerate routes to neuromorphic, or brain-like, computing.
Results published in Nature Communications report the first example of a lipid-based “memcapacitor,” a charge storage component with memory that processes information much like synapses do in the brain. Their discovery could support the emergence of computing networks modeled on biology for a sensory approach to machine learning.
“Our goal is to develop materials and computing elements that work like biological synapses and neurons—with vast interconnectivity and flexibility—to enable autonomous systems that operate differently than current computing devices and offer new functionality and learning capabilities,” said Joseph Najem, a recent postdoctoral researcher at ORNL’s Center for Nanophase Materials Sciences, a DOE Office of Science User Facility, and current assistant professor of mechanical engineering at Penn State.
The novel approach uses soft materials to mimic biomembranes and simulate the way nerve cells communicate with one another.
The team designed an artificial cell membrane, formed at the interface of two lipid-coated water droplets in oil, to explore the material’s dynamic, electrophysiological properties. At applied voltages, charges build up on both sides of the membrane as stored energy, analogous to the way capacitors work in traditional electric circuits.
But unlike regular capacitors, the memcapacitor can “remember” a previously applied voltage and—literally—shape how information is processed. The synthetic membranes change surface area and thickness depending on electrical activity. These shapeshifting membranes could be tuned as adaptive filters for specific biophysical and biochemical signals.
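A memcapacitor can be sketched as a capacitor whose capacitance depends on an internal state variable driven by the voltage history. The toy model below is my own qualitative illustration of that "memory" behavior, not the authors' lipid-membrane physics; the state variable `x` loosely stands in for the membrane's changing geometry:

```python
# Toy memcapacitor: capacitance depends on a state variable x that
# tracks the applied-voltage history, so the stored charge at a given
# instant depends on the device's past, not just its present bias.

def simulate_memcapacitor(voltages, dt=1e-3, c_min=1.0, c_max=3.0, tau=0.05):
    x = 0.0        # internal state in [0, 1] (stand-in for membrane geometry)
    charges = []
    for v in voltages:
        # State drifts toward 1 under positive bias, toward 0 under negative.
        target = 1.0 if v > 0 else 0.0
        x += (target - x) * dt / tau
        c = c_min + (c_max - c_min) * x   # state-dependent capacitance
        charges.append(c * v)             # q = C(x) * v
    return charges

# Same final voltage, different histories -> different stored charge:
up = simulate_memcapacitor([+1.0] * 100 + [0.5])
down = simulate_memcapacitor([-1.0] * 100 + [0.5])
print(up[-1] > down[-1])  # True: the device "remembers" its past bias
```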
“The novel functionality opens avenues for nondigital signal processing and machine learning modeled on nature,” said ORNL’s Pat Collier, a CNMS staff research scientist.
A distinct feature of all digital computers is the separation of processing and memory. Information is transferred back and forth between the hard drive and the central processor, creating an inherent bottleneck in the architecture no matter how small or fast the hardware becomes.
Neuromorphic computing, modeled on the nervous system, employs architectures that are fundamentally different in that memory and signal processing are co-located in memory elements—memristors, memcapacitors and meminductors.
These “memelements” make up the synaptic hardware of systems that mimic natural information processing, learning and memory.
Systems designed with memelements offer advantages in scalability and low power consumption, but the real goal is to carve out an alternative path to artificial intelligence, said Collier.
Tapping into biology could enable new computing possibilities, especially in the area of “edge computing,” such as wearable and embedded technologies that are not connected to a cloud but instead make on-the-fly decisions based on sensory input and past experience.
Biological sensing has evolved over billions of years into a highly sensitive system with receptors in cell membranes that are able to pick out a single molecule of a specific odor or taste. “This is not something we can match digitally,” Collier said.
Digital computation is built around digital information, the binary language of ones and zeros coursing through electronic circuits. It can emulate the human brain, but its solid-state components do not compute sensory data the way a brain does.
“The brain computes sensory information pushed through synapses in a neural network that is reconfigurable and shaped by learning,” said Collier. “Incorporating biology—using biomembranes that sense bioelectrochemical information—is key to developing the functionality of neuromorphic computing.”
While numerous solid-state versions of memelements have been demonstrated, the team’s biomimetic elements represent new opportunities for potential “spiking” neural networks that can compute natural data in natural ways.
Spiking neural networks are intended to simulate the way neurons spike with electrical potential and, if the signal is strong enough, pass it on to their neighbors through synapses, carving out learning pathways that are pruned over time for efficiency.
A bio-inspired version with analog data processing is a distant aim. Current early-stage research focuses on developing the components of bio-circuitry.
“We started with the basics, a memristor that can weigh information via conductance to determine if a spike is strong enough to be broadcast through a network of synapses connecting neurons,” said Collier. “Our memcapacitor goes further in that it can actually store energy as an electric charge in the membrane, enabling the complex ‘integrate and fire’ activity of neurons needed to achieve dense networks capable of brain-like computation.”
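The “integrate and fire” behavior Collier describes can be illustrated with a minimal leaky integrate-and-fire neuron. This is a standard textbook model, not the team’s hardware; the weight, leak, and threshold values are arbitrary choices for the sketch:

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential
# integrates weighted input, leaks over time, and emits a spike
# when it crosses a threshold (then resets).

def lif_neuron(inputs, weight=0.3, leak=0.9, threshold=1.0):
    v = 0.0
    spikes = []
    for x in inputs:
        v = v * leak + weight * x   # leaky integration of the input
        if v >= threshold:          # strong enough -> fire a spike
            spikes.append(1)
            v = 0.0                 # reset after firing
        else:
            spikes.append(0)
    return spikes

# A steady strong input drives periodic spiking; a weak one never fires.
print(lif_neuron([1.0] * 10))  # spikes appear once v crosses threshold
print(lif_neuron([0.1] * 10))  # all zeros: input too weak to fire
```

The weak input settles at a subthreshold potential and is simply forgotten, which is the filtering behavior spiking networks exploit.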
The team’s next steps are to explore new biomaterials and study simple networks to achieve more complex brain-like functionalities with memelements.
Here’s a link to and a citation for the paper,
Dynamical nonlinear memory capacitance in biomimetic membranes by Joseph S. Najem, Md Sakib Hasan, R. Stanley Williams, Ryan J. Weiss, Garrett S. Rose, Graham J. Taylor, Stephen A. Sarles & C. Patrick Collier. Nature Communications volume 10, Article number: 3239 (2019) DOI: https://doi.org/10.1038/s41467-019-11223-8 Published July 19, 2019
This paper is open access.
One final comment, you might recognize one of the authors (R. Stanley Williams) who in 2008 helped launch ‘memristor’ research.
I think this form of ‘cannibalism’ could also be described as a form of ‘self-assembly’. That said, here is an August 31, 2018 news item on ScienceDaily announcing ‘cannibalistic’ materials,
Scientists at the [US] Department of Energy’s [DOE] Oak Ridge National Laboratory [ORNL] induced a two-dimensional material to cannibalize itself for atomic “building blocks” from which stable structures formed.
The findings, reported in Nature Communications, provide insights that may improve design of 2D materials for fast-charging energy-storage and electronic devices.
“Under our experimental conditions, titanium and carbon atoms can spontaneously form an atomically thin layer of 2D transition-metal carbide, which was never observed before,” said Xiahan Sang of ORNL.
He and ORNL’s Raymond Unocic led a team that performed in situ experiments using state-of-the-art scanning transmission electron microscopy (STEM), combined with theory-based simulations, to reveal the mechanism’s atomistic details.
“This study is about determining the atomic-level mechanisms and kinetics that are responsible for forming new structures of a 2D transition-metal carbide such that new synthesis methods can be realized for this class of materials,” Unocic added.
The starting material was a 2D ceramic called a MXene (pronounced “max een”). Unlike most ceramics, MXenes are good electrical conductors because they are made from alternating atomic layers of carbon or nitrogen sandwiched within transition metals like titanium.
The research was a project of the Fluid Interface Reactions, Structures and Transport (FIRST) Center, a DOE Energy Frontier Research Center that explores fluid–solid interface reactions that have consequences for energy transport in everyday applications. Scientists conducted experiments to synthesize and characterize advanced materials and performed theory and simulation work to explain observed structural and functional properties of the materials. New knowledge from FIRST projects provides guideposts for future studies.
The high-quality material used in these experiments was synthesized by Drexel University scientists, in the form of five-ply single-crystal monolayer flakes of MXene. The flakes were taken from a parent crystal called “MAX,” which contains a transition metal denoted by “M”; an element such as aluminum or silicon, denoted by “A”; and either a carbon or nitrogen atom, denoted by “X.” The researchers used an acidic solution to etch out the monoatomic aluminum layers, exfoliate the material and delaminate it into individual monolayers of a titanium carbide MXene (Ti3C2).
The ORNL scientists suspended a large MXene flake on a heating chip with holes drilled in it so no support material, or substrate, interfered with the flake. Under vacuum, the suspended flake was exposed to heat and irradiated with an electron beam to clean the MXene surface and fully expose the layer of titanium atoms.
MXenes are typically inert because their surfaces are covered with protective functional groups—oxygen, hydrogen and fluorine atoms that remain after acid exfoliation. After protective groups are removed, the remaining material activates. Atomic-scale defects—“vacancies” created when titanium atoms are removed during etching—are exposed on the outer ply of the monolayer. “These atomic vacancies are good initiation sites,” Sang said. “It’s favorable for titanium and carbon atoms to move from defective sites to the surface.” In an area with a defect, a pore may form when atoms migrate.
“Once those functional groups are gone, now you’re left with a bare titanium layer (and underneath, alternating carbon, titanium, carbon, titanium) that’s free to reconstruct and form new structures on top of existing structures,” Sang said.
High-resolution STEM imaging proved that atoms moved from one part of the material to another to build structures. Because the material feeds on itself, the growth mechanism is cannibalistic.
“The growth mechanism is completely supported by density functional theory and reactive molecular dynamics simulations, thus opening up future possibilities to use these theory tools to determine the experimental parameters required for synthesizing specific defect structures,” said Adri van Duin of Penn State [Pennsylvania State University].
Most of the time, only one additional layer [of carbon and titanium] grew on a surface. The material changed as atoms built new layers. Ti3C2 turned into Ti4C3, for example.
“These materials are efficient at ionic transport, which lends itself well to battery and supercapacitor applications,” Unocic said. “How does ionic transport change when we add more layers to nanometer-thin MXene sheets?” This question may spur future studies.
“Because MXenes containing molybdenum, niobium, vanadium, tantalum, hafnium, chromium and other metals are available, there are opportunities to make a variety of new structures containing more than three or four metal atoms in cross-section (the current limit for MXenes produced from MAX phases),” Yury Gogotsi of Drexel University added. “Those materials may show different useful properties and create an array of 2D building blocks for advancing technology.”
At ORNL’s Center for Nanophase Materials Sciences (CNMS), Yu Xie, Weiwei Sun and Paul Kent performed first-principles theory calculations to explain why these materials grew layer by layer instead of forming alternate structures, such as squares. Xufan Li and Kai Xiao helped understand the growth mechanism, which minimizes surface energy to stabilize atomic configurations. Penn State scientists conducted large-scale dynamical reactive force field simulations showing how atoms rearranged on surfaces, confirming defect structures and their evolution as observed in experiments.
The researchers hope the new knowledge will help others grow advanced materials and generate useful nanoscale structures.
Museum curators planning to develop virtual exhibits online should choose communication and navigation technologies that match the experience they want to offer their visitors, according to a team of researchers.
“When curators think about creating a real-world exhibit, they are thinking about what the theme is and what they want their visitors to get out of the exhibit,” said S. Shyam Sundar, Distinguished Professor of Communications and co-director of the Media Effects Research Laboratory. “What this study suggests is that, just like curators need to be coherent in the content of the exhibit, they need to be conscious of the tools that they employ in their virtual museums.” [emphasis mine]
For some reason that phrase “need to be conscious of the tools that they employ” reminds me of Marshall McLuhan and his dictum “the medium is the message.” Here’s more about the study from the news release,
Many museum curators hope to create an authentic experience in their online museums by using technology to mimic aspects of the social, personal and physical aspects of a real-world museum experience. However, a more-is-better approach to technology may actually hinder that authentic experience, the researchers suggest.
In a study, visitors to an online virtual art museum considered the technology tools used to communicate about and navigate through the exhibits helpful when they were available separately, but less so when they were offered together. The researchers tested customization tools that helped the participants create their own art gallery, live-chat technology to facilitate communication with other visitors and 3-D navigation tools that some participants used to explore the museum.
The participants’ experiences often depended on what tools and what combinations of tools they used, according to the researchers, who released their findings in a recent issue of the International Journal of Human-Computer Interaction.
The news release goes on to provide some examples of when technologies do not mesh together for a good experience,
“When live chat and customization are offered together, for example, the combination of tools may be perceived to have increased usability, but it turns out the perceived usability of either customization or live chat used separately was greater than that of both functions together, or neither of the functions,” said Sundar. “We saw similar results not just with perceived usability, but also with sense of control and agency.”
The live chatting tool gave participants a feeling of social presence in the museum, but when live chatting was used in conjunction with the 3D navigation tool, the visitor had less of a sense of control, said Sundar, who worked with Eun Go, assistant professor of broadcasting and journalism, Western Illinois University; Hyang-Sook Kim, assistant professor of mass communication and media communication studies, Towson University; and Bo Zhang, doctoral candidate in mass communications, Penn State.
Similarly, participants indicated the live chatting function lessened the realistic experience of the 3D tool, according to the researchers, who suggested that chatting may increase the user’s cognitive burden as they try to navigate through the site.
Each of these tools carries unique meaning for users, Sundar said. While customization provides an individualized experience, live-chatting signals a social experience of the site.
“Our data also suggest that expert users prefer tools that offer more agency or control to users whereas novices appreciate a variety of tools on the interface,” he added.
Users may react to these tools on other online platforms, not just during visits to online museums, Sundar said.
“We might be able to apply this research on tools you might add to news sites, for example, or it could be used to improve educational sites and long-distance learning,” he added. “You just have to be careful about how you deploy the tools because more is not always better.”
The researchers recruited 126 participants for the study. The subjects were assigned one of eight different website variations that tested their reactions to customization, live chat, 3D navigation and combinations of those tools during their visit to a virtual version of the Museum of Modern Art. The museum’s artworks were made available through the Google Art Project.
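The eight website variations correspond to a full 2×2×2 factorial design over the three tools. A quick sketch of how such conditions are enumerated and participants assigned (my own illustration of the design, not the researchers’ actual procedure):

```python
import itertools
import random

# Full factorial design: every on/off combination of the three tools.
tools = ["customization", "live_chat", "3d_navigation"]
conditions = list(itertools.product([False, True], repeat=len(tools)))
print(len(conditions))  # 8 website variations

# Randomly assign each of the 126 participants to one condition.
random.seed(0)  # arbitrary seed, for reproducibility of the sketch
assignments = [random.choice(conditions) for _ in range(126)]
print(len(assignments))  # 126
```

This structure is what lets the researchers separate the effect of each tool from the effects of the tool combinations, such as the live chat plus customization interaction described above.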