Category Archives: energy

Iridescent giant clams could point the way to safety, climatologically speaking

Giant clams in Palau (Cynthia Barnett)

These don’t look like any clams I’ve ever seen, but that is the point of Cynthia Barnett’s absorbing Sept. 10, 2018 article for The Atlantic (Note: A link has been removed),

Snorkeling amid the tree-tangled rock islands of Ngermid Bay in the western Pacific nation of Palau, Alison Sweeney lingers at a plunging coral ledge, photographing every giant clam she sees along a 50-meter transect. In Palau, as in few other places in the world, this means she is going to be underwater for a skin-wrinkling long time.

At least the clams are making it easy for Sweeney, a biophysicist at the University of Pennsylvania. The animals plump from their shells like painted lips, shimmering in blues, purples, greens, golds, and even electric browns. The largest are a foot across and radiate from the sea floor, but most are the smallest of the giant clams, five-inch Tridacna crocea, living higher up on the reef. Their fleshy Technicolor smiles beam in all directions from the corals and rocks of Ngermid Bay.

… Some of the corals are bleached from the conditions in Ngermid Bay, where naturally high temperatures and acidity mirror the expected effects of climate change on the global oceans. (Ngermid Bay is more commonly known as “Nikko Bay,” but traditional leaders and government officials are working to revive the indigenous name of Ngermid.)

Even those clams living on bleached corals are pulsing color, like wildflowers in a white-hot desert. Sweeney’s ponytail flows out behind her as she nears them with her camera. They startle back into their fluted shells. Like bashful fairytale creatures cursed with irresistible beauty, they cannot help but draw attention with their sparkly glow.

Barnett makes them seem magical and perhaps they are (Note: A link has been removed),

It’s the glow that drew Sweeney’s attention to giant clams, and to Palau, a tiny republic of more than 300 islands between the Philippines and Guam. Its sun-laden waters are home to seven of the world’s dozen giant-clam species, from the storied Tridacna gigas—which can weigh an estimated 550 pounds and measure over four feet across—to the elegantly fluted Tridacna squamosa. Sweeney first came to the archipelago in 2009, while working on animal iridescence as a post-doctoral fellow at the University of California at Santa Barbara. Whether shimmering from a blue morpho butterfly’s wings or a squid’s skin, iridescence is almost always associated with a visual signal—one used to attract mates or confuse predators. Giant clams’ luminosity is not such a signal. So, what is it?

In the years since, Sweeney and her colleagues have discovered that the clams’ iridescence is essentially the outer glow of a solar transformer—optimized over millions of years to run on sunlight and algal biofuel. Giant clams reach their cartoonish proportions thanks to an exceptional ability to grow their own photosynthetic algae in vertical farms spread throughout their flesh. Sweeney and other scientists think this evolved expertise may shed light on alternative fuel technologies and other industrial solutions for a warming world.

Barnett goes on to describe Palau’s relationship to the clams and the clams’ environment,

Palau’s islands have been inhabited for at least 3,400 years, and from the start, giant clams were a staple of diet, daily life, and even deity. Many of the islands’ oldest-surviving tools are crafted of thick giant-clam shell: arched-blade adzes, fishhooks, gougers, heavy taro-root pounders. Giant-clam shell makes up more than three-fourths of some of the oldest shell middens in Palau, a percentage that decreases through the centuries. Archaeologists suggest that the earliest islanders depleted the giant clams that crowded the crystalline shallows, then may have self-corrected. Ancient Palauan conservation law, known as bul, prohibited fishing during critical spawning periods, or when a species showed signs of over-harvesting.

Before the Christianity that now dominates Palauan religion sailed in on eighteenth-century mission ships, the culture’s creation lore began with a giant clam called to life in an empty sea. The clam grew bigger and bigger until it sired Latmikaik, the mother of human children, who birthed them with the help of storms and ocean currents.

The legend evokes giant clams in their larval phase, moving with the currents for their first two weeks of life. Before they can settle, the swimming larvae must find and ingest one or two photosynthetic alga, which later multiply, becoming self-replicating fuel cells. After the larvae down the alga and develop a wee shell and a foot, they kick around like undersea farmers, looking for a sunny spot for their crop. When they’ve chosen a well-lit home in a shallow lagoon or reef, they affix to the rock, their shell gaping to the sky. After the sun hits and photosynthesis begins, the microalgae will multiply to millions, or in the case of T. gigas, billions, and clam and algae will live in symbiosis for life.

Giant clam is a beloved staple in Palau and many other Pacific islands, prepared raw with lemon, simmered into coconut soup, baked into a savory pancake, or sliced and sautéed in a dozen other ways. But luxury demand for their ivory-like shells and their adductor muscle, which is coveted as high-end sashimi and an alleged aphrodisiac, has driven T. gigas extinct in China, Taiwan, and other parts of their native habitat. Some of the toughest marine-protection laws in the world, along with giant-clam aquaculture pioneered here, have helped Palau’s wild clams survive. The Palau Mariculture Demonstration Center raises hundreds of thousands of giant clams a year, supplying local clam farmers who sell to restaurants and the aquarium trade and keeping pressure off the wild population. But as other nations have wiped out their clams, Palau’s 230,000-square-mile ocean territory is an increasing target of illegal foreign fishers.

Barnett delves into how the country of Palau is responding to the voracious appetite for the giant clams and other marine life,

Palau, drawing on its ancient conservation tradition of bul, is fighting back. In 2015, President Tommy Remengesau Jr. signed into law the Palau National Marine Sanctuary Act, which prohibits fishing in 80 percent of Palau’s Exclusive Economic Zone and creates a domestic fishing area in the remaining 20 percent, set aside for local fishers selling to local markets. In 2016, the nation received a $6.6 million grant from Japan to launch a major renovation of the Palau Mariculture Demonstration Center. Now under construction at the waterfront on the southern tip of Malakal Island, the new facility will amp up clam-aquaculture research and increase giant-clam production five-fold, to more than a million seedlings a year.

Last year, Palau amended its immigration policy to require that all visitors sign a pledge to behave in an ecologically responsible manner. The pledge, stamped into passports by an immigration officer who watches you sign, is written to the island’s children:

Children of Palau, I take this pledge, as your guest, to preserve and protect your beautiful and unique island home. I vow to tread lightly, act kindly and explore mindfully. I shall not take what is not given. I shall not harm what does not harm me. The only footprints I shall leave are those that will wash away.

The pledge is winning hearts and public-relations awards. But Palau’s existential challenge is still the collective “we,” the world’s rising carbon emissions and the resulting upturns in global temperatures, sea levels, and destructive storms.

F. Umiich Sengebau, Palau’s Minister for Natural Resources, Environment, and Tourism, grew up on Koror and is full of giant-clam proverbs, wisdom and legends from his youth. He tells me a story I also heard from an elder in the state of Airai: that in old times, giant clams were known as “stormy-weather food,” the fresh staple that was easy to collect and have on hand when it was too stormy to go out fishing.

As Palau faces the storms of climate change, Sengebau sees giant clams becoming another sort of stormy-weather food, serving as a secure source of protein; a fishing livelihood; a glowing icon for tourists; and now, an inspiration for alternative energy and other low-carbon technologies. “In the old days, clams saved us,” Sengebau tells me. “I think there’s a lot of power in that, a great power and meaning in the history of clams as food, and now clams as science.”

I highly recommend Barnett’s article, which is part of a larger series. From a November 6, 2017 The Atlantic press release,

The Atlantic is expanding the global footprint of its science writing today with a multi-year series to investigate life in all of its multitudes. The series, “Life Up Close,” created with support from Howard Hughes Medical Institute’s Department of Science Education (HHMI), begins today at TheAtlantic.com. In the first piece for the project, “The Zombie Diseases of Climate Change,” The Atlantic’s Robinson Meyer travels to Greenland to report on the potentially dangerous microbes emerging from thawing Arctic permafrost.

The project is ambitious in both scope and geographic reach, and will explore how life is adapting to our changing planet. Journalists will travel the globe to examine these changes as they happen to microbes, plants, and animals in oceans, grasslands, forests, deserts, and the icy poles. The Atlantic will question where humans should look for life next: from the Martian subsurface, to Europa’s oceans, to the atmosphere of nearby stars and beyond. “Life Up Close” will feature at least twenty reported pieces continuing through 2018.

“The Atlantic has been around for 160 years, but that’s a mere pinpoint in history when it comes to questions of life and where it started, and where we’re going,” said Ross Andersen, The Atlantic’s senior editor who oversees science, tech, and health. “The questions that this project will set out to tackle are critical; and this support will allow us to cover new territory in new and more ambitious ways.”

About The Atlantic:
Founded in 1857 and today one of the fastest growing media platforms in the industry, The Atlantic has throughout its history championed the power of big ideas and continues to shape global debate across print, digital, events, and video platforms. With its award-winning digital presence TheAtlantic.com and CityLab.com on cities around the world, The Atlantic is a multimedia forum on the most critical issues of our times—from politics, business, urban affairs, and the economy, to technology, arts, and culture. The Atlantic is celebrating its 160th anniversary this year. Bob Cohn is president of The Atlantic and Jeffrey Goldberg is editor in chief.

About the Howard Hughes Medical Institute (HHMI) Department of Science Education:
HHMI is the leading private nonprofit supporter of scientific research and science education in the United States. The Department of Science Education’s BioInteractive division produces free, high quality educational media for science educators and millions of students around the globe, its HHMI Tangled Bank Studios unit crafts powerful stories of scientific discovery for television and big screens, and its grants program aims to transform science education in universities and colleges. For more information, visit www.hhmi.org.

Getting back to the giant clams, sometimes all you can do is marvel, eh?

Bristly hybrid materials

Caption: [Image 1] A carbon fiber covered with a spiky forest of NiCoHC nanowires. Credit: All images reproduced from reference 1 under a Creative Commons Attribution 4.0 International License. © 2018 KAUST

It makes me think of small, cuddly things like cats and dogs, but it’s not. From an August 7, 2018 King Abdullah University of Science and Technology (KAUST; Saudi Arabia) news release (also published on August 12, 2018 on EurekAlert),

By combining multiple nanomaterials into a single structure, scientists can create hybrid materials that incorporate the best properties of each component and outperform any single substance. A controlled method for making triple-layered hollow nanostructures has now been developed at KAUST. The hybrid structures consist of a conductive organic core sandwiched between layers of electrocatalytically active metals: their potential uses range from better battery electrodes to renewable fuel production.

Although several methods exist to create two-layer materials, making three-layered structures has proven much more difficult, says Peng Wang from the Water Desalination and Reuse Center who co-led the current research with Professor Yu Han, member of the Advanced Membranes and Porous Materials Center at KAUST. The researchers developed a new, dual-template approach, explains Sifei Zhuo, a postdoctoral member of Wang’s team.

The researchers grew their hybrid nanomaterial directly on carbon paper–a mat of electrically conductive carbon fibers. They first produced a bristling forest of nickel cobalt hydroxyl carbonate (NiCoHC) nanowires on the surface of each carbon fiber (image 1). Each tiny inorganic bristle was then coated with an organic layer called hydrogen substituted graphdiyne (HsGDY) (image 2 [not included here]).

Next was the key dual-template step. When the team added a chemical mixture that reacts with the inner NiCoHC, the HsGDY acted as a partial barrier. Some nickel and cobalt ions from the inner layer diffused outward, where they reacted with thiomolybdate from the surrounding solution to form the outer nickel-, cobalt-co-doped MoS2 (Ni,Co-MoS2) layer. Meanwhile, some sulfur ions from the added chemicals diffused inwards to react with the remaining nickel and cobalt. The resulting substance (image 3 [not included here]) had the structure Co9S8, Ni3S2@HsGDY@Ni,Co-MoS2, in which the conductive organic HsGDY layer is sandwiched between two inorganic layers (image 4 [not included here]).

The triple-layer material showed good performance at electrocatalytically breaking up water molecules to generate hydrogen, a potential renewable fuel. The researchers also created other triple-layer materials using the dual-template approach.

“These triple-layered nanostructures hold great potential in energy conversion and storage,” says Zhuo. “We believe it could be extended to serve as a promising electrode in many electrochemical applications, such as in supercapacitors and sodium-/lithium-ion batteries, and for use in water desalination.”

Here’s a link to and a citation for the paper,

Dual-template engineering of triple-layered nanoarray electrode of metal chalcogenides sandwiched with hydrogen-substituted graphdiyne by Sifei Zhuo, Yusuf Shi, Lingmei Liu, Renyuan Li, Le Shi, Dalaver H. Anjum, Yu Han, & Peng Wang. Nature Communications, volume 9, Article number: 3132 (2018). DOI: https://doi.org/10.1038/s41467-018-05474-0. Published 07 August 2018.

This paper is open access.


Bringing memristors to the masses and cutting down on energy use

One of my earliest posts featuring memristors (May 9, 2008) focused on their potential for energy savings but since then most of my postings feature research into their application in the field of neuromorphic (brainlike) computing. (For a description and abbreviated history of the memristor go to this page on my Nanotech Mysteries Wiki.)

In a sense this July 30, 2018 news item on Nanowerk is a return to the beginning,

A new way of arranging advanced computer components called memristors on a chip could enable them to be used for general computing, which could cut energy consumption by a factor of 100.

This would improve performance in low power environments such as smartphones or make for more efficient supercomputers, says a University of Michigan researcher.

“Historically, the semiconductor industry has improved performance by making devices faster. But although the processors and memories are very fast, they can’t be efficient because they have to wait for data to come in and out,” said Wei Lu, U-M professor of electrical and computer engineering and co-founder of memristor startup Crossbar Inc.

Memristors might be the answer. Named as a portmanteau of memory and resistor, they can be programmed to have different resistance states–meaning they store information as resistance levels. These circuit elements enable memory and processing in the same device, cutting out the data transfer bottleneck experienced by conventional computers in which the memory is separate from the processor.

A July 30, 2018 University of Michigan news release (also on EurekAlert), which originated the news item, expands on the theme,

… unlike ordinary bits, which are 1 or 0, memristors can have resistances that are on a continuum. Some applications, such as computing that mimics the brain (neuromorphic), take advantage of the analog nature of memristors. But for ordinary computing, trying to differentiate among small variations in the current passing through a memristor device is not precise enough for numerical calculations.

Lu and his colleagues got around this problem by digitizing the current outputs—defining current ranges as specific bit values (i.e., 0 or 1). The team was also able to map large mathematical problems into smaller blocks within the array, improving the efficiency and flexibility of the system.
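A minimal sketch of that digitizing idea, in Python. This is my illustration rather than the team’s method, and the current levels and threshold are hypothetical; the point is simply that thresholding a noisy analog readout into defined ranges recovers clean bit values:

```python
# Digitize an analog memristor readout current into a bit value by
# thresholding. The current ranges are hypothetical; a real device would
# be calibrated so each range maps unambiguously to a 0 or a 1.

def digitize_current(current_amps, threshold=5e-6):
    """Map a noisy analog readout current to a binary value."""
    return 1 if current_amps >= threshold else 0

# Noisy readouts cluster around two nominal levels (here ~1 uA and ~9 uA);
# thresholding midway recovers clean bits despite device variation.
readouts = [1.2e-6, 8.7e-6, 0.9e-6, 9.4e-6]
bits = [digitize_current(i) for i in readouts]
print(bits)  # [0, 1, 0, 1]
```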

Computers with these new blocks, which the researchers call “memory-processing units,” could be particularly useful for implementing machine learning and artificial intelligence algorithms. They are also well suited to tasks that are based on matrix operations, such as simulations used for weather prediction. The simplest mathematical matrices, akin to tables with rows and columns of numbers, can map directly onto the grid of memristors.


The memristor array situated on a circuit board. Credit: Mohammed Zidan, Nanoelectronics group, University of Michigan.

Once the memristors are set to represent the numbers, operations that multiply and sum the rows and columns can be taken care of simultaneously, with a set of voltage pulses along the rows. The current measured at the end of each column contains the answers. A typical processor, in contrast, would have to read the value from each cell of the matrix, perform multiplication, and then sum up each column in series.

“We get the multiplication and addition in one step. It’s taken care of through physical laws. We don’t need to manually multiply and sum in a processor,” Lu said.
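The physics Lu describes (Ohm’s law doing the multiplications, Kirchhoff’s current law doing the sums) can be sketched numerically. This is an idealized model, not the team’s code: each memristor’s conductance G[i][j] stores a matrix entry, row voltages encode the input vector, and the current collected at each column is one dot product:

```python
# Idealized crossbar: column currents I_j = sum_i V_i * G[i][j]
# (Ohm's law per device, Kirchhoff's current law per column).

def crossbar_multiply(conductances, row_voltages):
    """Analog matrix-vector product performed 'in one step' by the array."""
    n_rows = len(conductances)
    n_cols = len(conductances[0])
    return [
        sum(row_voltages[i] * conductances[i][j] for i in range(n_rows))
        for j in range(n_cols)
    ]

G = [[1.0, 2.0],
     [3.0, 4.0]]   # conductances store the matrix
V = [0.5, 1.0]     # voltage pulses encode the vector
print(crossbar_multiply(G, V))  # [3.5, 5.0]
```

A conventional processor would loop over every cell to compute the same result; in the array, all the multiply-and-sum operations happen concurrently in the analog domain.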

His team chose to solve partial differential equations as a test for a 32×32 memristor array—which Lu imagines as just one block of a future system. These equations, including those behind weather forecasting, underpin many problems in science and engineering but are very challenging to solve. The difficulty comes from the complicated forms and multiple variables needed to model physical phenomena.

When solving partial differential equations exactly is impossible, solving them approximately can require supercomputers. These problems often involve very large matrices of data, so the memory-processor communication bottleneck is neatly solved with a memristor array. The equations Lu’s team used in their demonstration simulated a plasma reactor, such as those used for integrated circuit fabrication.
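To see why a PDE solver reduces to the matrix arithmetic a memristor array accelerates, here is a toy example (my sketch, unrelated to the plasma-reactor simulation): a 1D Laplace problem discretized by finite differences and solved by Jacobi iteration, where every sweep is just a neighbor-weighted multiply-and-sum:

```python
# A 1D Laplace problem u'' = 0 with fixed endpoint values, solved by
# Jacobi iteration. Each sweep is a matrix-vector-style neighbor sum --
# the repeated multiply-and-add a memristor array can perform in place.

def jacobi_laplace_1d(left, right, n_interior=8, sweeps=500):
    u = [0.0] * n_interior
    for _ in range(sweeps):
        u = [
            0.5 * ((u[i - 1] if i > 0 else left) +
                   (u[i + 1] if i < n_interior - 1 else right))
            for i in range(n_interior)
        ]
    return u

# The solution converges to a straight line between the boundary values.
u = jacobi_laplace_1d(left=0.0, right=1.0)
print([round(x, 2) for x in u])  # [0.11, 0.22, 0.33, 0.44, 0.56, 0.67, 0.78, 0.89]
```

Real problems involve far larger matrices, which is exactly where keeping the data inside the memory array pays off.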

This work is described in a study, “A general memristor-based partial differential equation solver,” published in the journal Nature Electronics.

It was supported by the Defense Advanced Research Projects Agency (DARPA) (grant no. HR0011-17-2-0018) and by the National Science Foundation (NSF) (grant no. CCF-1617315).

Here’s a link and a citation for the paper,

A general memristor-based partial differential equation solver by Mohammed A. Zidan, YeonJoo Jeong, Jihang Lee, Bing Chen, Shuo Huang, Mark J. Kushner & Wei D. Lu. Nature Electronics, volume 1, pages 411–420 (2018). DOI: https://doi.org/10.1038/s41928-018-0100-6. Published 13 July 2018.

This paper is behind a paywall.

For the curious, Dr. Lu’s startup company, Crossbar, can be found here.

A solar, self-charging supercapacitor for wearable technology

Ravinder Dahiya, Carlos García Núñez, and their colleagues at the University of Glasgow (Scotland) strike again (see my May 10, 2017 posting for their first ‘solar-powered graphene skin’ research announcement). Last time it was all about robots and prosthetics, this time they’ve focused on wearable technology according to a July 18, 2018 news item on phys.org,

A new form of solar-powered supercapacitor could help make future wearable technologies lighter and more energy-efficient, scientists say.

In a paper published in the journal Nano Energy, researchers from the University of Glasgow’s Bendable Electronics and Sensing Technologies (BEST) group describe how they have developed a promising new type of graphene supercapacitor, which could be used in the next generation of wearable health sensors.

A July 18, 2018 University of Glasgow press release, which originated the news item, explains further,

Currently, wearable systems generally rely on relatively heavy, inflexible batteries, which can be uncomfortable for long-term users. The BEST team, led by Professor Ravinder Dahiya, have built on their previous success in developing flexible sensors by developing a supercapacitor which could power health sensors capable of conforming to wearer’s bodies, offering more comfort and a more consistent contact with skin to better collect health data.

Their new supercapacitor uses layers of flexible, three-dimensional porous foam formed from graphene and silver to produce a device capable of storing and releasing around three times more power than any similar flexible supercapacitor. The team demonstrated the durability of the supercapacitor, showing that it provided power consistently across 25,000 charging and discharging cycles.

They have also found a way to charge the system by integrating it with flexible solar powered skin already developed by the BEST group, effectively creating an entirely self-charging system, as well as a pH sensor which uses wearer’s sweat to monitor their health.

Professor Dahiya said: “We’re very pleased by the progress this new form of solar-powered supercapacitor represents. A flexible, wearable health monitoring system which only requires exposure to sunlight to charge has a lot of obvious commercial appeal, but the underlying technology has a great deal of additional potential.

“This research could take the wearable systems for health monitoring to remote parts of the world where solar power is often the most reliable source of energy, and it could also increase the efficiency of hybrid electric vehicles. We’re already looking at further integrating the technology into flexible synthetic skin which we’re developing for use in advanced prosthetics.” [emphasis mine]

In addition to the work on robots, prosthetics, and graphene ‘skin’ mentioned in the May 10, 2017 posting, the team is working on a synthetic ‘brainy’ skin for which they have just received £1.5m in funding from the Engineering and Physical Sciences Research Council (EPSRC).

Brainy skin

A July 3, 2018 University of Glasgow press release discusses the proposed work in more detail,

A robotic hand covered in ‘brainy skin’ that mimics the human sense of touch is being developed by scientists.

University of Glasgow’s Professor Ravinder Dahiya has plans to develop ultra-flexible, synthetic Brainy Skin that ‘thinks for itself’.

The super-flexible, hypersensitive skin may one day be used to make more responsive prosthetics for amputees, or to build robots with a sense of touch.

Brainy Skin reacts like human skin, which has its own neurons that respond immediately to touch rather than having to relay the whole message to the brain.

This electronic ‘thinking skin’ is made from silicon based printed neural transistors and graphene – an ultra-thin form of carbon that is only an atom thick, but stronger than steel.

The new version is more powerful, less cumbersome and would work better than earlier prototypes, also developed by Professor Dahiya and his Bendable Electronics and Sensing Technologies (BEST) team at the University’s School of Engineering.

His futuristic research, called neuPRINTSKIN (Neuromorphic Printed Tactile Skin), has just received another £1.5m funding from the Engineering and Physical Sciences Research Council (EPSRC).

Professor Dahiya said: “Human skin is an incredibly complex system capable of detecting pressure, temperature and texture through an array of neural sensors that carry signals from the skin to the brain.

“Inspired by real skin, this project will harness the technological advances in electronic engineering to mimic some features of human skin, such as softness, bendability and now, also sense of touch. This skin will not just mimic the morphology of the skin but also its functionality.

“Brainy Skin is critical for the autonomy of robots and for a safe human-robot interaction to meet emerging societal needs such as helping the elderly.”

Synthetic ‘Brainy Skin’ with sense of touch gets £1.5m funding. Photo of Professor Ravinder Dahiya

This latest advance means tactile data is gathered over large areas by the synthetic skin’s computing system rather than sent to the brain for interpretation.

With additional EPSRC funding, which extends Professor Dahiya’s fellowship by another three years, he plans to introduce tactile skin with neuron-like processing. This breakthrough in the tactile sensing research will lead to the first neuromorphic tactile skin, or ‘brainy skin.’

To achieve this, Professor Dahiya will add a new neural layer to the e-skin that he has already developed using printing silicon nanowires.

Professor Dahiya added: “By adding a neural layer underneath the current tactile skin, neuPRINTSKIN will add significant new perspective to the e-skin research, and trigger transformations in several areas such as robotics, prosthetics, artificial intelligence, wearable systems, next-generation computing, and flexible and printed electronics.”

The Engineering and Physical Sciences Research Council (EPSRC) is part of UK Research and Innovation, a non-departmental public body funded by a grant-in-aid from the UK government.

EPSRC is the main funding body for engineering and physical sciences research in the UK. By investing in research and postgraduate training, the EPSRC is building the knowledge and skills base needed to address the scientific and technological challenges facing the nation.

Its portfolio covers a vast range of fields from healthcare technologies to structural engineering, manufacturing to mathematics, advanced materials to chemistry. The research funded by EPSRC has impact across all sectors. It provides a platform for future UK prosperity by contributing to a healthy, connected, resilient, productive nation.

It’s fascinating to note how these pieces of research fit together: wearable technology for health monitoring, more responsive robot ‘skin,’ and, possibly, prosthetic devices that would allow someone to feel again.

The latest research paper

Getting back to the solar-charging supercapacitors mentioned in the opening, here’s a link to and a citation for the team’s latest research paper,

Flexible self-charging supercapacitor based on graphene-Ag-3D graphene foam electrodes by Libu Manjakkal, Carlos García Núñez, Wenting Dang, & Ravinder Dahiya. Nano Energy, Volume 51, September 2018, Pages 604–612. DOI: https://doi.org/10.1016/j.nanoen.2018.06.072

This paper is open access.

Brainy and brainy: a novel synaptic architecture and a neuromorphic computing platform called SpiNNaker

I have two items about brainlike computing. The first item hearkens back to memristors, a topic I have been following since 2008. (If you’re curious about the various twists and turns, just enter the term ‘memristor’ in this blog’s search engine.) The latest on memristors is from a team that includes IBM (US), École Polytechnique Fédérale de Lausanne (EPFL; Switzerland), and the New Jersey Institute of Technology (NJIT; US). The second bit comes from a Jülich Research Centre team in Germany and concerns an approach to brain-like computing that does not include memristors.

Multi-memristive synapses

In the inexorable march to make computers function more like human brains (neuromorphic engineering/computing), an international team has announced its latest results in a July 10, 2018 news item on Nanowerk,

Two New Jersey Institute of Technology (NJIT) researchers, working with collaborators from the IBM Research Zurich Laboratory and the École Polytechnique Fédérale de Lausanne, have demonstrated a novel synaptic architecture that could lead to a new class of information processing systems inspired by the brain.

The findings are an important step toward building more energy-efficient computing systems that also are capable of learning and adaptation in the real world. …

A July 10, 2018 NJIT news release (also on EurekAlert) by Tracey Regan, which originated the news item, adds more details,

The researchers, Bipin Rajendran, an associate professor of electrical and computer engineering, and S. R. Nandakumar, a graduate student in electrical engineering, have been developing brain-inspired computing systems that could be used for a wide range of big data applications.

Over the past few years, deep learning algorithms have proven to be highly successful in solving complex cognitive tasks such as controlling self-driving cars and language understanding. At the heart of these algorithms are artificial neural networks – mathematical models of the neurons and synapses of the brain – that are fed huge amounts of data so that the synaptic strengths are autonomously adjusted to learn the intrinsic features and hidden correlations in these data streams.

However, the implementation of these brain-inspired algorithms on conventional computers is highly inefficient, consuming huge amounts of power and time. This has prompted engineers to search for new materials and devices to build special-purpose computers that can incorporate the algorithms. Nanoscale memristive devices, electrical components whose conductivity depends approximately on prior signaling activity, can be used to represent the synaptic strength between the neurons in artificial neural networks.

While memristive devices could potentially lead to faster and more power-efficient computing systems, they are also plagued by several reliability issues that are common to nanoscale devices. Their efficiency stems from their ability to be programmed in an analog manner to store multiple bits of information; however, their electrical conductivities vary in a non-deterministic and non-linear fashion.

In the experiment, the team showed how multiple nanoscale memristive devices exhibiting these characteristics could nonetheless be configured to efficiently implement artificial intelligence algorithms such as deep learning. Prototype chips from IBM containing more than one million nanoscale phase-change memristive devices were used to implement a neural network for the detection of hidden patterns and correlations in time-varying signals.

“In this work, we proposed and experimentally demonstrated a scheme to obtain high learning efficiencies with nanoscale memristive devices for implementing learning algorithms,” Nandakumar says. “The central idea in our demonstration was to use several memristive devices in parallel to represent the strength of a synapse of a neural network, but only choose one of them to be updated at each step based on the neuronal activity.”
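The scheme Nandakumar describes can be sketched in a few lines of Python. This is an illustration of the idea, not the authors’ implementation: several devices together represent one synapse, the weight is the sum of their conductances, and a simple counter arbitrates which single device receives each update:

```python
# Multi-memristive synapse sketch: N devices per synapse, weight = the
# SUM of their conductances, and a counter selects exactly one device
# to program at each update step.

class MultiMemristiveSynapse:
    def __init__(self, n_devices=3):
        self.conductances = [0.0] * n_devices
        self.counter = 0  # arbitration: which device gets the next update

    @property
    def weight(self):
        return sum(self.conductances)

    def update(self, delta):
        # Program a single device only; rotating through the devices
        # spreads out (and so averages away) per-device nonlinearity
        # and variability.
        self.conductances[self.counter] += delta
        self.counter = (self.counter + 1) % len(self.conductances)

syn = MultiMemristiveSynapse(n_devices=3)
for _ in range(6):
    syn.update(0.1)
print(round(syn.weight, 1))                     # 0.6
print([round(g, 1) for g in syn.conductances])  # [0.2, 0.2, 0.2]
```

Because each individual device absorbs only a fraction of the updates, the synapse as a whole behaves more linearly and reliably than any one noisy nanoscale device could.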

Here’s a link to and a citation for the paper,

Neuromorphic computing with multi-memristive synapses by Irem Boybat, Manuel Le Gallo, S. R. Nandakumar, Timoleon Moraitis, Thomas Parnell, Tomas Tuma, Bipin Rajendran, Yusuf Leblebici, Abu Sebastian, & Evangelos Eleftheriou. Nature Communications, volume 9, Article number: 2514 (2018). DOI: https://doi.org/10.1038/s41467-018-04933-y. Published 28 June 2018.

This is an open access paper.

Also, they’ve got a couple of very nice introductory paragraphs, which I’m including here (from the June 28, 2018 paper in Nature Communications; Note: Links have been removed),

The human brain with less than 20 W of power consumption offers a processing capability that exceeds the petaflops mark, and thus outperforms state-of-the-art supercomputers by several orders of magnitude in terms of energy efficiency and volume. Building ultra-low-power cognitive computing systems inspired by the operating principles of the brain is a promising avenue towards achieving such efficiency. Recently, deep learning has revolutionized the field of machine learning by providing human-like performance in areas, such as computer vision, speech recognition, and complex strategic games1. However, current hardware implementations of deep neural networks are still far from competing with biological neural systems in terms of real-time information-processing capabilities with comparable energy consumption.

One of the reasons for this inefficiency is that most neural networks are implemented on computing systems based on the conventional von Neumann architecture with separate memory and processing units. There are a few attempts to build custom neuromorphic hardware that is optimized to implement neural algorithms2,3,4,5. However, as these custom systems are typically based on conventional silicon complementary metal oxide semiconductor (CMOS) circuitry, the area efficiency of such hardware implementations will remain relatively low, especially if in situ learning and non-volatile synaptic behavior have to be incorporated. Recently, a new class of nanoscale devices has shown promise for realizing the synaptic dynamics in a compact and power-efficient manner. These memristive devices store information in their resistance/conductance states and exhibit conductivity modulation based on the programming history6,7,8,9. The central idea in building cognitive hardware based on memristive devices is to store the synaptic weights as their conductance states and to perform the associated computational tasks in place.

The two essential synaptic attributes that need to be emulated by memristive devices are the synaptic efficacy and plasticity. …

It gets more complicated from there.
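One idea from those introductory paragraphs is worth making concrete: “perform the associated computational tasks in place” means that, once synaptic weights are stored as conductances, a whole matrix-vector multiplication falls out of Ohm’s and Kirchhoff’s laws. A minimal sketch, with values invented for the example:

```python
def crossbar_matvec(G, V):
    """Column currents of a memristive crossbar: I_j = sum_i G[i][j] * V[i],
    i.e. Ohm's law per device plus Kirchhoff's current law per column,
    which is exactly a matrix-vector product."""
    n_rows, n_cols = len(G), len(G[0])
    return [sum(G[i][j] * V[i] for i in range(n_rows)) for j in range(n_cols)]

# conductance states = stored synaptic weights (illustrative values)
G = [[0.5, 0.25],
     [0.25, 0.5]]
V = [1.0, -1.0]  # input voltages applied to the rows
print(crossbar_matvec(G, V))  # → [0.25, -0.25]
```

In hardware the currents sum in a single analog step; the Python loop just spells out the physics.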

Now onto the next bit.

SpiNNaker

At a guess, those capitalized N’s are meant to indicate ‘neural networks’. As best I can determine, SpiNNaker is not based on the memristor. Moving on, a July 11, 2018 news item on phys.org announces work from a team examining how neuromorphic hardware and neuromorphic software work together,

A computer built to mimic the brain’s neural networks produces similar results to those of the best brain-simulation supercomputer software currently used for neural-signaling research, finds a new study published in the open-access journal Frontiers in Neuroscience. Tested for accuracy, speed and energy efficiency, this custom-built computer, named SpiNNaker, has the potential to overcome the speed and power consumption problems of conventional supercomputers. The aim is to advance our knowledge of neural processing in the brain, to include learning and disorders such as epilepsy and Alzheimer’s disease.

A July 11, 2018 Frontiers Publishing news release on EurekAlert, which originated the news item, expands on the latest work,

“SpiNNaker can support detailed biological models of the cortex–the outer layer of the brain that receives and processes information from the senses–delivering results very similar to those from an equivalent supercomputer software simulation,” says Dr. Sacha van Albada, lead author of this study and leader of the Theoretical Neuroanatomy group at the Jülich Research Centre, Germany. “The ability to run large-scale detailed neural networks quickly and at low power consumption will advance robotics research and facilitate studies on learning and brain disorders.”

The human brain is extremely complex, comprising 100 billion interconnected brain cells. We understand how individual neurons and their components behave and communicate with each other and on the larger scale, which areas of the brain are used for sensory perception, action and cognition. However, we know less about the translation of neural activity into behavior, such as turning thought into muscle movement.

Supercomputer software has helped by simulating the exchange of signals between neurons, but even the best software run on the fastest supercomputers to date can only simulate 1% of the human brain.

“It is presently unclear which computer architecture is best suited to study whole-brain networks efficiently. The European Human Brain Project and Jülich Research Centre have performed extensive research to identify the best strategy for this highly complex problem. Today’s supercomputers require several minutes to simulate one second of real time, so studies on processes like learning, which take hours and days in real time, are currently out of reach,” explains Professor Markus Diesmann, co-author, head of the Computational and Systems Neuroscience department at the Jülich Research Centre.

He continues, “There is a huge gap between the energy consumption of the brain and today’s supercomputers. Neuromorphic (brain-inspired) computing allows us to investigate how close we can get to the energy efficiency of the brain using electronics.”

Developed over the past 15 years and based on the structure and function of the human brain, SpiNNaker — part of the Neuromorphic Computing Platform of the Human Brain Project — is a custom-built computer composed of half a million simple computing elements controlled by its own software. The researchers compared the accuracy, speed and energy efficiency of SpiNNaker with that of NEST–a specialist supercomputer software currently in use for brain neuron-signaling research.

“The simulations run on NEST and SpiNNaker showed very similar results,” reports Steve Furber, co-author and Professor of Computer Engineering at the University of Manchester, UK. “This is the first time such a detailed simulation of the cortex has been run on SpiNNaker, or on any neuromorphic platform. SpiNNaker comprises 600 circuit boards incorporating over 500,000 small processors in total. The simulation described in this study used just six boards–1% of the total capability of the machine. The findings from our research will improve the software to reduce this to a single board.”

Van Albada shares her future aspirations for SpiNNaker, “We hope for increasingly large real-time simulations with these neuromorphic computing systems. In the Human Brain Project, we already work with neuroroboticists who hope to use them for robotic control.”

Before getting to the link and citation for the paper, here’s a description of SpiNNaker’s hardware from the ‘Spiking neural network’ Wikipedia entry, Note: Links have been removed,

Neurogrid, built at Stanford University, is a board that can simulate spiking neural networks directly in hardware. SpiNNaker (Spiking Neural Network Architecture) [emphasis mine], designed at the University of Manchester, uses ARM processors as the building blocks of a massively parallel computing platform based on a six-layer thalamocortical model.[5]

Now for the link and citation,

Performance Comparison of the Digital Neuromorphic Hardware SpiNNaker and the Neural Network Simulation Software NEST for a Full-Scale Cortical Microcircuit Model by Sacha J. van Albada, Andrew G. Rowley, Johanna Senk, Michael Hopkins, Maximilian Schmidt, Alan B. Stokes, David R. Lester, Markus Diesmann, and Steve B. Furber. Front. Neurosci. 12:291. doi: 10.3389/fnins.2018.00291 Published: 23 May 2018

As noted earlier, this is an open access paper.

Nanostructured materials and radiation

If you’re planning on using nanostructured materials in a nuclear facility, you might want to check out this work from a June 8, 2018 Purdue University (Indiana, US) news release by Brian L. Huchel,

A professor in the Purdue College of Engineering examined the potential use of various materials in nuclear reactors in an extensive review article in the journal Progress in Materials Science.

The article, titled “Radiation Damage in Nanostructured Materials,” was led by Xinghang Zhang, a professor of materials engineering. It will be published in the July issue of the journal.

Zhang said there is a significant demand for advanced materials that can survive high temperatures and high doses of radiation. These materials contain a large number of internal interfaces, called defect sinks, which are too small to be seen with the naked eye; materials built around such sinks may form the next generation of materials used in nuclear reactors.

“Nanostructured materials with abundant internal defect sinks are promising as these materials have shown significantly improved radiation tolerance,” he said. “However, there are many challenges and fundamental science questions that remain to be solved before these materials can have applications in advanced nuclear reactors.”

The 100-page article, which took two years to write, focuses on metallic materials and metal-ceramic compounds, reviewing the effects of various types of internal defect sinks on the reduction of radiation damage in nanostructured materials.

Under extreme radiation conditions, a large number of defects and their clusters are generated inside materials, and such significant microstructural damage often leads to degradation of the mechanical and physical properties of the materials.

The article discusses the use of combined defect-sink networks to collaboratively improve the radiation tolerance of nanomaterials, while pointing out the need to improve the thermal and radiation stability of the defect sinks.

“The field of radiation damage in nanostructured materials is an exciting and rapidly evolving arena, enriched with challenges and opportunities,” Zhang said. “The integration of extensive research effort, resources and expertise in various fields may eventually lead to the design of advanced nanomaterials with unprecedented radiation tolerance.”

Jin Li, co-author of the review article and a postdoctoral fellow in the School of Materials Engineering, said researchers with different expertise worked collaboratively on the article, which contains more than 100 pages, 100 figures and 700 references.

The team involved in the research article included researchers from Purdue, Texas A&M University, Drexel University, the University of Nebraska-Lincoln and China University of Petroleum-Beijing, as well as Sandia National Laboratory, Los Alamos National Laboratory and Idaho National Laboratory.

Here’s an image illustrating the work,

Various imperfections in nanostructures, called defect sinks, can enhance the material’s tolerance to radiation. (Photo/Xinghang Zhang)

Here’s a link to and a citation for the paper,

Radiation damage in nanostructured materials by Xinghang Zhang, Khalid Hattar, Youxing Chen, Lin Shao, Jin Li, Cheng Sun, Kaiyuan Yu, Nan Li, Mitra L. Taheri, Haiyan Wang, Jian Wang, Michael Nastasi. Progress in Materials Science Volume 96, July 2018, Pages 217-321 https://doi.org/10.1016/j.pmatsci.2018.03.002

This paper is behind a paywall.

ht/ to June 8, 2018 Nanowerk news item.

New semiconductor material from pigment produced by fungi?

Chlorociboria aeruginascens fungus on a tree log. (Image: Oregon State University)

Apparently the pigment derived from the fungi you see in the above picture is used by visual artists and, perhaps soon, will be used by electronics manufacturers. From a June 5, 2018 news item on Nanowerk,

Researchers at Oregon State University are looking at a highly durable organic pigment, used by humans in artwork for hundreds of years, as a promising possibility as a semiconductor material.

Findings suggest it could become a sustainable, low-cost, easily fabricated alternative to silicon in electronic or optoelectronic applications where the high-performance capabilities of silicon aren’t required.

Optoelectronics is technology that makes combined use of light and electronics, as in solar cells; the pigment being studied is xylindein.

A June 5, 2018 Oregon State University news release by Steve Lundeberg, which originated the news item, expands on the theme,

“Xylindein is pretty, but can it also be useful? How much can we squeeze out of it?” said Oregon State University [OSU] physicist Oksana Ostroverkhova. “It functions as an electronic material but not a great one, but there’s optimism we can make it better.”

Xylindein is secreted by two wood-eating fungi in the Chlorociboria genus. Any wood that’s infected by the fungi is stained a blue-green color, and artisans have prized xylindein-affected wood for centuries.

The pigment is so stable that decorative products made half a millennium ago still exhibit its distinctive hue. It holds up against prolonged exposure to heat, ultraviolet light and electrical stress.

“If we can learn the secret for why those fungi-produced pigments are so stable, we could solve a problem that exists with organic electronics,” Ostroverkhova said. “Also, many organic electronic materials are too expensive to produce, so we’re looking to do something inexpensively in an ecologically friendly way that’s good for the economy.”

With current fabrication techniques, xylindein tends to form non-uniform films with a porous, irregular, “rocky” structure.

“There’s a lot of performance variation,” she said. “You can tinker with it in the lab, but you can’t really make a technologically relevant device out of it on a large scale. But we found a way to make it more easily processed and to get a decent film quality.”

Ostroverkhova and collaborators in OSU’s colleges of Science and Forestry blended xylindein with a transparent, non-conductive polymer, poly(methyl methacrylate), abbreviated to PMMA and sometimes known as acrylic glass. They drop-cast solutions of both pristine xylindein and a xylindein-PMMA blend onto electrodes on a glass substrate for testing.

They found the non-conducting polymer greatly improved the film structure without a detrimental effect on xylindein’s electrical properties. And the blended films actually showed better photosensitivity.

“Exactly why that happened, and its potential value in solar cells, is something we’ll be investigating in future research,” Ostroverkhova said. “We’ll also look into replacing the polymer with a natural product – something sustainable made from cellulose. We could grow the pigment from the cellulose and be able to make a device that’s all ready to go.

“Xylindein will never beat silicon, but for many applications, it doesn’t need to beat silicon,” she said. “It could work well for depositing onto large, flexible substrates, like for making wearable electronics.”

This research, whose findings were recently published in MRS Advances, represents the first use of a fungus-produced material in a thin-film electrical device.

“And there are a lot more of the materials,” Ostroverkhova said. “This is just first one we’ve explored. It could be the beginning of a whole new class of organic electronic materials.”

Here’s a link to and a citation for the paper,

Fungi-Derived Pigments for Sustainable Organic (Opto)Electronics by Gregory Giesbers, Jonathan Van Schenck, Sarath Vega Gutierrez, Sara Robinson. MRS Advances https://doi.org/10.1557/adv.2018.446 Published online: 21 May 2018

This paper is behind a paywall.

View Dynamic Glass—intelligent windows sold commercially

At last, commercially available ‘smart’, that is, electrochromic windows.

An April 17, 2018 article by Conor Shine for Dallas News describes a change at the Dallas Fort Worth (DFW) International Airport that has cooled things down,

At DFW International Airport, the coolest seats in the house can be found near Gate A28.

That’s where the airport, working with California-based technology company View, has replaced a bank of tarmac-facing windows with panes coated in microscopic layers of electrochromic ceramic that significantly reduce the amount of heat and glare coming into the terminal.

The technology, referred to as dynamic glass, uses an electrical current to change how much light is let in and has been shown to reduce surface temperatures on gate area seats and carpets by as much as 15 degrees compared to standard windows. All that heat savings add up, with View estimating its product can cut energy costs by as much as 20 percent when the technology is deployed widely in a building.

At DFW Airport, the energy bill runs about $18 million per year, putting the potential savings from dynamic glass into the hundreds of thousands, or even millions of dollars, annually.
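The arithmetic behind that estimate is simple enough to check. The 20 percent figure is View’s best case for building-wide deployment, which is why the realistic range lands at “hundreds of thousands, or even millions” rather than the full upper bound:

```python
annual_energy_bill = 18_000_000  # DFW's stated yearly energy cost, in dollars
max_savings_fraction = 0.20      # View's estimate for wide deployment

# upper bound if the 20 percent figure applied to the entire bill;
# in practice only part of the bill is glazing-related cooling load
print(round(annual_energy_bill * max_savings_fraction))  # → 3600000
```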

Besides the money, it’s an appealing set of characteristics for DFW Airport, which is North America’s only carbon-neutral airport and regularly ranks among the top large airports for customer experience in the world.

After installing the dynamic glass near Gate A28 and a nearby Twisted Root restaurant in September at a cost of $49,000, the airport is now looking at ordering more for use throughout its terminals, although how many and at what cost hasn’t been finalized yet.

On a recent weekday morning, the impact of the dynamic glass was on full display. As sunlight beamed into Gate A25, passengers largely avoided the seats near the standard windows, favoring shadier spots a bit further into the terminal.

A few feet away, the bright natural light takes on a subtle blue hue and the temperature near the windows is noticeably cooler. There, passengers seemed to pay no mind to sitting in the sun, with window-adjacent seats filling up quickly.

According to Jeff Platón, View’s vice president of marketing, there are considerable savings to be had when you cut down on air conditioning.

View’s April 17, 2018 news release (PDF) about a study of their technology in use at the airport provides more detail,

View®, the leader in dynamic glass, today announced the results of a study on the impact of in-terminal passenger experience and its correlation to higher revenues and reduced operational expenses. The study, conducted at Dallas Fort Worth International Airport (DFW), found that terminal windows fitted with View Dynamic Glass overwhelmingly improved passenger comfort over conventional glass, resulting in an 83 percent increase in passenger dwell time at a preferred gate seat and a 102 percent increase in concession spending. The research study was conducted by DFW Airport, View, Inc., and an independent aviation market research group.

It’s been a long time (I’ve been waiting about 10 years) but it seems that commercially available ‘smart’ glass is here—at the airport, anyway.

ht/ April 20, 2018 news item on phys.org

When nanoparticles collide

The science of collisions at the nanoscale (although it looks more like kissing to me) could lead to some helpful discoveries, according to an April 5, 2018 news item on Nanowerk,

Helmets that do a better job of preventing concussions and other brain injuries. Earphones that protect people from damaging noises. Devices that convert “junk” energy from airport runway vibrations into usable power.

New research on the events that occur when tiny specks of matter called nanoparticles smash into each other could one day inform the development of such technologies.

Before getting to the news release proper, here’s a gif released by the university,

A digital reconstruction shows how individual atoms in two largely spherical nanoparticles react when the nanoparticles collide in a vacuum. In the reconstruction, the atoms turn blue when they are in contact with the opposing nanoparticle. Credit: Yoichi Takato

An April 4, 2018 University at Buffalo news release (also on EurekAlert) by Charlotte Hsu, which originated the news item, fills in some details,

Using supercomputers, scientists led by the University at Buffalo modeled what happens when two nanoparticles collide in a vacuum. The team ran simulations for nanoparticles with three different surface geometries: those that are largely circular (with smooth exteriors); those with crystal facets; and those that possess sharp edges.

“Our goal was to lay out the forces that control energy transport at the nanoscale,” says study co-author Surajit Sen, PhD, professor of physics in UB’s College of Arts and Sciences. “When you have a tiny particle that’s 10, 20 or 50 atoms across, does it still behave the same way as larger particles, or grains? That’s the guts of the question we asked.”

“The guts of the answer,” Sen adds, “is yes and no.”

“Our research is useful because it builds the foundation for designing materials that either transmit or absorb energy in desired ways,” says first author Yoichi Takato, PhD. Takato, a physicist at AGC Asahi Glass and former postdoctoral scholar at the Okinawa Institute of Science and Technology in Japan, completed much of the study as a doctoral candidate in physics at UB. “For example, you could potentially make an ultrathin material that is energy absorbent. You could imagine that this would be practical for use in helmets and head gear that can help to prevent head and combat injuries.”

The study was published on March 21 in Proceedings of the Royal Society A by Takato, Sen and Michael E. Benson, who completed his portion of the work as an undergraduate physics student at UB. The scientists ran their simulations at the Center for Computational Research, UB’s academic supercomputing facility.

What happens when nanoparticles crash

The new research focused on small nanoparticles — those with diameters of 5 to 15 nanometers. The scientists found that in collisions, particles of this size behave differently depending on their shape.

For example, nanoparticles with crystal facets transfer energy well when they crash into each other, making them an ideal component of materials designed to harvest energy. When it comes to energy transport, these particles adhere to scientific norms that govern macroscopic linear systems — including chains of equal-sized masses with springs in between them — that are visible to the naked eye.

In contrast, nanoparticles that are rounder in shape, with amorphous surfaces, adhere to nonlinear force laws. This, in turn, means they may be especially useful for shock mitigation. When two spherical nanoparticles collide, energy dissipates around the initial point of contact on each one instead of propagating all the way through both. The scientists report that at crash velocities of about 30 meters per second, atoms within each particle shift only near the initial point of contact.
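The linear-versus-nonlinear distinction maps onto textbook contact laws. A flat, faceted contact behaves like a Hooke spring (force proportional to overlap), while smooth elastic spheres follow the Hertz law with its 3/2 exponent. The exponents fitted in the paper itself may differ, so treat this sketch as an illustration of the two regimes rather than the authors' model:

```python
def linear_contact_force(x, k=1.0):
    # Hooke-like law, F = k*x: the regime the faceted particles follow,
    # matching a macroscopic chain of equal masses joined by springs
    return k * x if x > 0 else 0.0

def hertz_contact_force(x, k=1.0):
    # Hertz law for smooth elastic spheres, F = k*x**1.5: the classic
    # nonlinear contact law (exponent assumed, not taken from the paper)
    return k * x ** 1.5 if x > 0 else 0.0

# the nonlinear law is softer below x = 1 and stiffer above it
for x in (0.25, 1.0, 4.0):
    print(x, linear_contact_force(x), hertz_contact_force(x))
```

Because the Hertz force vanishes faster than linearly at small overlap, weak disturbances barely couple neighboring spheres, which gives some intuition for why collision energy stays near the point of contact.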

Nanoparticles with sharp edges are less predictable: according to the new study, their energy-transport behavior varies with the sharpness of their edges.

Designing a new generation of materials

“From a very broad perspective, the kind of work we’re doing has very exciting prospects,” Sen says. “It gives engineers fundamental information about nanoparticles that they didn’t have before. If you’re designing a new type of nanoparticle, you can now think about doing it in a way that takes into account what happens when you have very small nanoparticles interacting with each other.”

Though many scientists are working with nanotechnology, the way the tiniest of nanoparticles behave when they crash into each other is largely an open question, Takato says.

“When you’re designing a material, what size do you want the nanoparticle to be? How will you lay out the particles within the material? How compact do you want it to be? Our study can inform these decisions,” Takato says.

Here’s a link to and a citation for the paper,

Small nanoparticles, surface geometry and contact forces by Yoichi Takato, Michael E. Benson, Surajit Sen. Proceedings of the Royal Society A (Mathematical, Physical, and Engineering Sciences) Published 21 March 2018. DOI: 10.1098/rspa.2017.0723

This paper is behind a paywall.

Mixing the unmixable for all new nanoparticles

This news comes out of the University of Maryland, and the discovery could lead to nanoparticles that have never before been imagined. From a March 29, 2018 news item on ScienceDaily,

Making a giant leap in the ‘tiny’ field of nanoscience, a multi-institutional team of researchers is the first to create nanoscale particles composed of up to eight distinct elements generally known to be immiscible, or incapable of being mixed or blended together. The blending of multiple, unmixable elements into a unified, homogenous nanostructure, called a high entropy alloy nanoparticle, greatly expands the landscape of nanomaterials — and what we can do with them.

This research makes a significant advance on previous efforts that have typically produced nanoparticles limited to only three different elements and to structures that do not mix evenly. Essentially, it is extremely difficult to squeeze and blend different elements into individual particles at the nanoscale. The team, which includes lead researchers at University of Maryland, College Park (UMD)’s A. James Clark School of Engineering, published a peer-reviewed paper based on the research featured on the March 30 [2018] cover of Science.
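The “high entropy” in the name refers to the configurational entropy of mixing, which grows with the number of elements; for an equimolar alloy the textbook expression collapses to R·ln(n). Here is a quick check of how much eight elements buy over three (standard formula, not a number from the article):

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def mixing_entropy(n_elements):
    """Ideal configurational entropy of an equimolar alloy:
    delta_S = -R * sum(x_i * ln(x_i)) = R * ln(n) when every x_i = 1/n."""
    x = 1.0 / n_elements
    return -R * n_elements * x * math.log(x)

for n in (3, 5, 8):
    print(n, round(mixing_entropy(n), 2))
```

Going from three to eight elements nearly doubles the mixing entropy, which helps stabilize a single homogeneous phase against separation into the elements’ preferred structures.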

A March 29, 2018 University of Maryland press release (also on EurekAlert), which originated the news item, delves further (Note: Links have been removed),

“Imagine the elements that combine to make nanoparticles as Lego building blocks. If you have only one to three colors and sizes, then you are limited by what combinations you can use and what structures you can assemble,” explains Liangbing Hu, associate professor of materials science and engineering at UMD and one of the corresponding authors of the paper. “What our team has done is essentially enlarged the toy chest in nanoparticle synthesis; now, we are able to build nanomaterials with nearly all metallic and semiconductor elements.”

The researchers say this advance in nanoscience opens vast opportunities for a wide range of applications that includes catalysis (the acceleration of a chemical reaction by a catalyst), energy storage (batteries or supercapacitors), and bio/plasmonic imaging, among others.

To create the high entropy alloy nanoparticles, the researchers employed a two-step method of flash heating followed by flash cooling. Metallic elements such as platinum, nickel, iron, cobalt, gold, copper, and others were exposed to a rapid thermal shock of approximately 3,000 degrees Fahrenheit, or about half the temperature of the sun, for 0.055 seconds. The extremely high temperature resulted in uniform mixtures of the multiple elements. The subsequent rapid cooling (more than 100,000 degrees Fahrenheit per second) stabilized the newly mixed elements into the uniform nanomaterial.
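Taking the quoted numbers at face value, the entire heat-and-quench cycle finishes in well under a tenth of a second. A back-of-the-envelope check (the ambient temperature is my assumption; it is not stated in the release):

```python
shock_temp_f = 3_000            # flash-heating temperature, degrees F
shock_time_s = 0.055            # quoted duration of the thermal shock, s
cooling_rate_f_per_s = 100_000  # quoted minimum cooling rate, degrees F/s
ambient_f = 70                  # assumed room temperature (my assumption)

# time to quench from shock temperature back to ambient at that rate
quench_time_s = (shock_temp_f - ambient_f) / cooling_rate_f_per_s
print(shock_time_s + quench_time_s)  # total cycle time, under 0.1 s
```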

“Our method is simple, but one that nobody else has applied to the creation of nanoparticles. By using a physical science approach, rather than a traditional chemistry approach, we have achieved something unprecedented,” says Yonggang Yao, a Ph.D. student at UMD and one of the lead authors of the paper.

To demonstrate one potential use of the nanoparticles, the research team used them as advanced catalysts for ammonia oxidation, which is a key step in the production of nitric acid (a liquid acid that is used in the production of ammonium nitrate for fertilizers, making plastics, and in the manufacturing of dyes). They were able to achieve 100 percent oxidation of ammonia and 99 percent selectivity toward desired products with the high entropy alloy nanoparticles, proving their ability as highly efficient catalysts.

Yao says another potential use of the nanoparticles as catalysts could be the generation of chemicals or fuels from carbon dioxide.

“The potential applications for high entropy alloy nanoparticles are not limited to the field of catalysis. With cross-discipline curiosity, the demonstrated applications of these particles will become even more widespread,” says Steven D. Lacey, a Ph.D. student at UMD and also one of the lead authors of the paper.

This research was performed through a multi-institutional collaboration of Prof. Liangbing Hu’s group at the University of Maryland, College Park; Prof. Reza Shahbazian-Yassar’s group at University of Illinois at Chicago; Prof. Ju Li’s group at the Massachusetts Institute of Technology; Prof. Chao Wang’s group at Johns Hopkins University; and Prof. Michael Zachariah’s group at the University of Maryland, College Park.

What outside experts are saying about this research:

“This is quite amazing; Dr. Hu creatively came up with this powerful technique, carbo-thermal shock synthesis, to produce high entropy alloys of up to eight different elements in a single nanoparticle. This is indeed unthinkable for bulk materials synthesis. This is yet another beautiful example of nanoscience!” says Peidong Yang, the S.K. and Angela Chan Distinguished Professor of Energy and professor of chemistry at the University of California, Berkeley and member of the American Academy of Arts and Sciences.

“This discovery opens many new directions. There are simulation opportunities to understand the electronic structure of the various compositions and phases that are important for the next generation of catalyst design. Also, finding correlations among synthesis routes, composition, and phase structure and performance enables a paradigm shift toward guided synthesis,” says George Crabtree, Argonne Distinguished Fellow and director of the Joint Center for Energy Storage Research at Argonne National Laboratory.

More from the research coauthors:

“Understanding the atomic order and crystalline structure in these multi-element nanoparticles reveals how the synthesis can be tuned to optimize their performance. It would be quite interesting to further explore the underlying atomistic mechanisms of the nucleation and growth of high entropy alloy nanoparticle,” says Reza Shahbazian-Yassar, associate professor at the University of Illinois at Chicago and a corresponding author of the paper.

“Carbon metabolism drives ‘living’ metal catalysts that frequently move around, split, or merge, resulting in a nanoparticle size distribution that’s far from the ordinary, and highly tunable,” says Ju Li, professor at the Massachusetts Institute of Technology and a corresponding author of the paper.

“This method enables new combinations of metals that do not exist in nature and do not otherwise go together. It enables robust tuning of the composition of catalytic materials to optimize the activity, selectivity, and stability, and the application will be very broad in energy conversions and chemical transformations,” says Chao Wang, assistant professor of chemical and biomolecular engineering at Johns Hopkins University and one of the study’s authors.

Here’s a link to and a citation for the paper,

Carbothermal shock synthesis of high-entropy-alloy nanoparticles by Yonggang Yao, Zhennan Huang, Pengfei Xie, Steven D. Lacey, Rohit Jiji Jacob, Hua Xie, Fengjuan Chen, Anmin Nie, Tiancheng Pu, Miles Rehwoldt, Daiwei Yu, Michael R. Zachariah, Chao Wang, Reza Shahbazian-Yassar, Ju Li, Liangbing Hu. Science 30 Mar 2018: Vol. 359, Issue 6383, pp. 1489-1494 DOI: 10.1126/science.aan5412

This paper is behind a paywall.