Rice University engineers have zeroed in on the optimal architecture for storing hydrogen in “white graphene” nanomaterials — a design like a Lilliputian skyscraper with “floors” of boron nitride sitting one atop another and held precisely 5.2 angstroms apart by boron nitride pillars.
Caption: Thousands of hours of calculations on Rice University’s two fastest supercomputers found that the optimal architecture for packing hydrogen into “white graphene” involves making skyscraper-like frameworks of vertical columns and two-dimensional floors that are about 5.2 angstroms apart. In this illustration, hydrogen molecules (white) sit between sheet-like floors of graphene (gray) that are supported by boron nitride pillars (pink and blue). Researchers found that identical structures made wholly of boron nitride had unprecedented capacity for storing readily available hydrogen. Credit: Lei Tao/Rice University
“The motivation is to create an efficient material that can take up and hold a lot of hydrogen — both by volume and weight — and that can quickly and easily release that hydrogen when it’s needed,” [emphasis mine] said the study’s lead author, Rouzbeh Shahsavari, assistant professor of civil and environmental engineering at Rice.
Hydrogen is the lightest and most abundant element in the universe, and its energy-to-mass ratio — the amount of available energy per pound of raw material, for example — far exceeds that of fossil fuels. It’s also the cleanest way to generate electricity: The only byproduct is water. A 2017 report by market analysts at BCC Research found that global demand for hydrogen storage materials and technologies will likely reach $5.4 billion annually by 2021.
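To put that energy-to-mass claim in numbers, here is a quick back-of-the-envelope comparison. The heating values below are approximate textbook figures of my own choosing, not numbers from the article:

```python
# Rough comparison of specific energy (energy available per unit mass).
# Approximate lower heating values; illustrative figures only.
specific_energy_mj_per_kg = {
    "hydrogen": 120.0,  # ~120 MJ/kg
    "gasoline": 44.0,   # ~44 MJ/kg
    "coal": 24.0,       # ~24 MJ/kg (varies widely by grade)
}

ratio = (specific_energy_mj_per_kg["hydrogen"]
         / specific_energy_mj_per_kg["gasoline"])
print(f"Hydrogen stores ~{ratio:.1f}x more energy per kilogram than gasoline")
```

Roughly a factor of three per kilogram, which is why the "per pound of raw material" framing favours hydrogen so strongly.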
Hydrogen’s primary drawbacks relate to portability, storage and safety. While large volumes can be stored under high pressure in underground salt domes and specially designed tanks, small-scale portable tanks — the equivalent of an automobile gas tank — have so far eluded engineers.
Following months of calculations on two of Rice’s fastest supercomputers, Shahsavari and Rice graduate student Shuo Zhao found the optimal architecture for storing hydrogen in boron nitride. One form of the material, hexagonal boron nitride (hBN), consists of atom-thick sheets of boron and nitrogen and is sometimes called white graphene because the atoms are spaced exactly like carbon atoms in flat sheets of graphene.
Previous work in Shahsavari’s Multiscale Materials Lab found that hybrid materials of graphene and boron nitride could hold enough hydrogen to meet the Department of Energy’s storage targets for light-duty fuel cell vehicles.
“The choice of material is important,” he said. “Boron nitride has been shown to be better in terms of hydrogen absorption than pure graphene, carbon nanotubes or hybrids of graphene and boron nitride.
“But the spacing and arrangement of hBN sheets and pillars is also critical,” he said. “So we decided to perform an exhaustive search of all the possible geometries of hBN to see which worked best. We also expanded the calculations to include various temperatures, pressures and dopants, trace elements that can be added to the boron nitride to enhance its hydrogen storage capacity.”
Zhao and Shahsavari set up numerous “ab initio” tests, computer simulations that used first principles of physics. Shahsavari said the approach was computationally intense but worth the extra effort because it offered the most precision.
“We conducted nearly 4,000 ab initio calculations to try and find that sweet spot where the material and geometry go hand in hand and really work together to optimize hydrogen storage,” he said.
Unlike materials that store hydrogen through chemical bonding, Shahsavari said boron nitride is a sorbent that holds hydrogen through physical bonds, which are weaker than chemical bonds. That’s an advantage when it comes to getting hydrogen out of storage because sorbent materials tend to discharge more easily than their chemical cousins, Shahsavari said.
He said the choice of boron nitride sheets or tubes and the corresponding spacing between them in the superstructure were the key to maximizing capacity.
“Without pillars, the sheets sit naturally one atop the other about 3 angstroms apart, and very few hydrogen atoms can penetrate that space,” he said. “When the distance grew to 6 angstroms or more, the capacity also fell off. At 5.2 angstroms, there is a cooperative attraction from both the ceiling and floor, and the hydrogen tends to clump in the middle. Conversely, models made purely of BN tubes — not sheets — had less storage capacity.”
Shahsavari said models showed that the pure hBN tube-sheet structures could hold 8 weight percent of hydrogen. (Weight percent is a measure of concentration, similar to parts per million.) Physical experiments are needed to verify that capacity, but the DOE’s ultimate target is 7.5 weight percent, and Shahsavari’s models suggest even more hydrogen can be stored in his structure if trace amounts of lithium are added to the hBN.
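For readers unfamiliar with the unit, "weight percent" works like this. The helper function below is my own illustration of the standard definition, and the 8 g / 92 g example is mine, not the researchers' calculation:

```python
# Gravimetric capacity ("weight percent") is the mass of stored hydrogen
# divided by the total mass of hydrogen plus host material, times 100.
def weight_percent(mass_h2_g, mass_host_g):
    return 100.0 * mass_h2_g / (mass_h2_g + mass_host_g)

# 8 g of hydrogen held in 92 g of host material:
print(f"{weight_percent(8.0, 92.0):.1f} wt%")  # 8.0 wt%
```

So a 1-kilogram block of the modeled material at 8 wt% would carry about 80 grams of hydrogen, comfortably above the DOE's 7.5 wt% ultimate target.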
Finally, Shahsavari said, irregularities in the flat, floor-like sheets of the structure could also prove useful for engineers.
“Wrinkles form naturally in the sheets of pillared boron nitride because of the nature of the junctions between the columns and floors,” he said. “In fact, this could also be advantageous because the wrinkles can provide toughness. If the material is placed under load or impact, that buckled shape can unbuckle easily without breaking. This could add to the material’s safety, which is a big concern in hydrogen storage devices.
“Furthermore, the high thermal conductivity and flexibility of BN may provide additional opportunities to control the adsorption and release kinetics on-demand,” Shahsavari said. “For example, it may be possible to control release kinetics by applying an external voltage, heat or an electric field.”
I may be wrong, but “The motivation is to create an efficient material that can take up and hold a lot of hydrogen — both by volume and weight — and that can quickly and easily release that hydrogen when it’s needed, …” sounds like a description of a supercapacitor. One other comment: this research appears to be ‘in silico’, i.e., all the testing has been done as computer simulations, and the proposed materials themselves have yet to be synthesized and tested physically.
I’d have to see it to believe it but researchers at the US Dept. of Energy (DOE) Lawrence Berkeley National Laboratory (LBNL) have developed a new kind of ‘bijel’ which would allow for some pretty nifty robotics. From a Sept. 25, 2017 news item on ScienceDaily,
A new two-dimensional film, made of polymers and nanoparticles and developed by researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), can direct two different non-mixing liquids into a variety of exotic architectures. This finding could lead to soft robotics, liquid circuitry, shape-shifting fluids, and a host of new materials that use soft, rather than solid, substances.
The study, reported today in the journal Nature Nanotechnology, presents the newest entry in a class of substances known as bicontinuous jammed emulsion gels, or bijels, which hold promise as a malleable liquid that can support catalytic reactions, electrical conductivity, and energy conversion.
Bijels are typically made of immiscible, or non-mixing, liquids. People who shake their bottle of vinaigrette before pouring the dressing on their salad are familiar with such liquids. As soon as the shaking stops, the liquids start to separate again, with the lower density liquid – often oil – rising to the top.
Trapping, or jamming, particles where these immiscible liquids meet can prevent the liquids from completely separating, stabilizing the substance into a bijel. What makes bijels remarkable is that, rather than just making the spherical droplets that we normally see when we try to mix oil and water, the particles at the interface shape the liquids into complex networks of interconnected fluid channels.
Bijels are notoriously difficult to make, however, involving exact temperatures at precisely timed stages. In addition, the liquid channels are normally more than 5 micrometers across, making them too large to be useful in energy conversion and catalysis.
“Bijels have long been of interest as next-generation materials for energy applications and chemical synthesis,” said study lead author Caili Huang. “The problem has been making enough of them, and with features of the right size. In this work, we crack that problem.”
Huang started the work as a graduate student with Thomas Russell, the study’s principal investigator, at Berkeley Lab’s Materials Sciences Division, and he continued the project as a postdoctoral researcher at DOE’s Oak Ridge National Laboratory.
Creating a new bijel recipe
The method described in this new study simplifies the bijel process by first using specially coated particles about 10-20 nanometers in diameter. The smaller-sized particles line the liquid interfaces much more quickly than the ones used in traditional bijels, making the smaller channels that are highly valued for applications.
Illustration shows key stages of bijel formation. Clockwise from top left, two non-mixing liquids are shown. Ligands (shown in yellow) with amine groups are dispersed throughout the oil or solvent, and nanoparticles coated with carboxylic acids (shown as blue dots) are scattered in the water. With vigorous shaking, the nanoparticles and ligands form a “supersoap” that gets trapped at the interface of the two liquids. The bottom panel is a magnified view of the jammed nanoparticle supersoap. (Credit: Caili Huang/ORNL)
“We’ve basically taken liquids like oil and water and given them a structure, and it’s a structure that can be changed,” said Russell, a visiting faculty scientist at Berkeley Lab. “If the nanoparticles are responsive to electrical, magnetic, or mechanical stimuli, the bijels can become reconfigurable and re-shaped on demand by an external field.”
The researchers were able to prepare new bijels from a variety of common organic, water-insoluble solvents, such as toluene, with ligands dissolved in the solvent, and deionized water containing the nanoparticles. To ensure thorough mixing of the liquids, they subjected the emulsion to a vortex spinning at 3,200 revolutions per minute.
“This extreme shaking creates a whole bunch of new places where these particles and polymers can meet each other,” said study co-author Joe Forth, a postdoctoral fellow at Berkeley Lab’s Materials Sciences Division. “You’re synthesizing a lot of this material, which is in effect a thin, 2-D coating of the liquid surfaces in the system.”
The liquids remained a bijel even after one week, a sign of the system’s stability.
Russell, who is also a professor of polymer science and engineering at the University of Massachusetts-Amherst, added that these shape-shifting characteristics would be valuable in microreactors, microfluidic devices, and soft actuators.
Nanoparticles had not been seriously considered in bijels before because their small size made them hard to trap in the liquid interface. To resolve that problem, the researchers coated nano-sized particles with carboxylic acids and put them in water. They then took polymers with an added amine group – a derivative of ammonia – and dissolved them in the toluene.
At left is a vial of bijel stabilized with nanoparticle surfactants. On the right is the same vial after a week of inversion, showing that the nanoparticles kept the liquids from moving. (Credit: Caili Huang/ORNL)
This configuration took advantage of the amine group’s affinity to water, a characteristic that is comparable to surfactants, like soap. Their nanoparticle “supersoap” was designed so that the nanoparticles join ligands, forming an octopus-like shape with a polar head and nonpolar legs that get jammed at the interface, the researchers said.
“Bijels are really a new material, and also excitingly weird in that they are kinetically arrested in these unusual configurations,” said study co-author Brett Helms, a staff scientist at Berkeley Lab’s Molecular Foundry. “The discovery that you can make these bijels with simple ingredients is a surprise. We all have access to oils and water and nanocrystals, allowing broad tunability in bijel properties. This platform also allows us to experiment with new ways to control their shape and function since they are both responsive and reconfigurable.”
The nanoparticles were made of silica, but the researchers noted that in previous studies they used graphene and carbon nanotubes to form nanoparticle surfactants.
“The key is that the nanoparticles can be made of many materials,” said Russell. “The most important thing is what’s on the surface.”
This is an animation of the bijel:
3-D rendering of the nanoparticle bijel taken by confocal microscope. (Credit: Caili Huang/ORNL [Oak Ridge National Laboratory] and Joe Forth/Berkeley Lab)
In discussions about water desalination and carbon nanomaterials, it’s graphene that’s usually mentioned these days. By contrast, scientists from the US Department of Energy’s Lawrence Livermore National Laboratory (LLNL) have turned to carbon nanotubes,
There are two news items about the work at LLNL on ScienceDaily; this first one, originated by the American Association for the Advancement of Science (AAAS), offers a succinct summary of the work. From an August 24, 2017 news item on ScienceDaily,
At just the right size, carbon nanotubes can filter water with better efficiency than biological proteins, a new study reveals. The results could pave the way to new water filtration systems, at a time when demands for fresh water pose a global threat to sustainable development.
A class of biological proteins, called aquaporins, is able to effectively filter water, yet scientists have not been able to manufacture scalable systems that mimic this ability. Aquaporins usually exhibit channels for filtering water molecules at a narrow width of 0.3 nanometers, which forces the water molecules into a single-file chain.
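A quick bit of arithmetic shows why a 0.3-nanometer channel forces that single-file chain. The ~0.275 nm effective diameter of a water molecule is an approximate literature value I am supplying for illustration, not a figure from the article:

```python
# A water molecule's effective diameter is roughly 0.275 nm (approximate
# literature value), so only one molecule fits across a 0.3 nm channel.
channel_width_nm = 0.3
water_diameter_nm = 0.275

molecules_across = int(channel_width_nm // water_diameter_nm)
print(molecules_across)  # 1 -> water must pass in a single-file chain
```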
Here, Ramya H. Tunuguntla and colleagues experimented with nanotubes of different widths to see which ones are best for filtering water. Intriguingly, they found that carbon nanotubes with a width of 0.8 nanometers outperformed aquaporins in filtering efficiency by a factor of six.
These narrow carbon nanotube porins (nCNTPs) were still slim enough to force the water molecules into a single-file chain. The researchers attribute the differences between aquaporins and nCNTPs to differences in hydrogen bonding — whereas pore-lining residues in aquaporins can donate or accept H bonds to incoming water molecules, the walls of CNTPs cannot form H bonds, permitting unimpeded water flow.
The nCNTPs in this study maintained their permeability in typical saltwater, diminishing only at very high salt concentrations. Lastly, the team found that by changing the charges at the mouth of the nanotube, they can alter the ion selectivity. This advancement is highlighted in a Perspective [in Science magazine] by Zuzanna Siwy and Francesco Fornasiero.
Lawrence Livermore scientists, in collaboration with researchers at Northeastern University, have developed carbon nanotube pores that can exclude salt from seawater. The team also found that water permeability in carbon nanotubes (CNTs) with diameters smaller than a nanometer (0.8 nm) exceeds that of wider carbon nanotubes by an order of magnitude.
The nanotubes, hollow structures made of carbon atoms in a unique arrangement, are more than 50,000 times thinner than a human hair. The super smooth inner surface of the nanotube is responsible for their remarkably high water permeability, while the tiny pore size blocks larger salt ions.
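That "more than 50,000 times thinner than a human hair" figure checks out with simple arithmetic, using the 50-micrometer hair width (the low end of the 50–70 µm range quoted later in this post) and an assumed ~1 nm tube diameter:

```python
# Sanity check on the hair-width comparison.
hair_diameter_nm = 50_000      # 50 micrometers = 50,000 nm
nanotube_diameter_nm = 1.0     # ~1 nm outer diameter (assumed)

print(hair_diameter_nm / nanotube_diameter_nm)  # 50000.0
```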
There’s a rather lovely illustration for this work,
An artist’s depiction of the promise of carbon nanotube porins for desalination. The image depicts a stylized carbon nanotube pipe that delivers clean desalinated water from the ocean to a kitchen tap. Image by Ryan Chen/LLNL
Increasing demands for fresh water pose a global threat to sustainable development, resulting in water scarcity for 4 billion people. Current water purification technologies can benefit from the development of membranes with specialized pores that mimic highly efficient and water selective biological proteins.
“We found that carbon nanotubes with diameters smaller than a nanometer bear a key structural feature that enables enhanced transport. The narrow hydrophobic channel forces water to translocate in a single-file arrangement, a phenomenon similar to that found in the most efficient biological water transporters,” said Ramya Tunuguntla, an LLNL postdoctoral researcher and co-author of the manuscript appearing in the Aug. 24 edition of Science.
Computer simulations and experimental studies of water transport through CNTs with diameters larger than 1 nm showed enhanced water flow, but did not match the transport efficiency of biological proteins and did not separate salt efficiently, especially at higher salinities. The key breakthrough achieved by the LLNL team was to use smaller-diameter nanotubes that delivered the required boost in performance.
“These studies revealed the details of the water transport mechanism and showed that rational manipulation of these parameters can enhance pore efficiency,” said Meni Wanunu, a physics professor at Northeastern University and co-author on the study.
“Carbon nanotubes are a unique platform for studying molecular transport and nanofluidics,” said Alex Noy, LLNL principal investigator on the CNT project and a senior author on the paper. “Their sub-nanometer size, atomically smooth surfaces and similarity to cellular water transport channels make them exceptionally suited for this purpose, and it is very exciting to make a synthetic water channel that performs better than nature’s own.”
This discovery by the LLNL scientists and their colleagues has clear implications for the next generation of water purification technologies and will spur a renewed interest in development of the next generation of high-flux membranes.
Earth’s surface is 70 percent water, but only a tiny portion—0.007 percent—is available to drink.
As potable water sources dwindle, the global population increases every year. One potential solution to quenching the planet’s thirst is through desalinization—the process of removing salt from seawater. While tantalizing, this approach has always been too expensive and energy intensive for large-scale feasibility.
Now, researchers from Northeastern have made a discovery that could change that, making desalinization easier, faster and cheaper than ever before. In a paper published Thursday [August 24, 2017] in Science, the group describes how carbon nanotubes of a certain size act as the perfect filter for salt—the smallest and most abundant water contaminant.
Filtering water is tricky because water molecules want to stick together. The “H” in H2O is hydrogen, and hydrogen bonds are strong, requiring a lot of energy to separate. Water tends to bulk up and resist being filtered. But nanotubes do it rapidly, with ease.
A carbon nanotube is like an impossibly small rolled up sheet of paper, about a nanometer in diameter. For comparison, the diameter of a human hair is 50 to 70 micrometers—50,000 times wider. The tube’s minuscule size, exactly 0.8 nm, only allows one water molecule to pass through at a time. This single-file lineup disrupts the hydrogen bonds, so water can be pushed through the tubes at an accelerated pace, with no bulking.
“You can imagine if you’re a group of people trying to run through the hallway holding hands, it’s going to be a lot slower than running through the hallway single-file,” said co-author Meni Wanunu, associate professor of physics at Northeastern. Wanunu and postdoctoral student Robert Henley collaborated with scientists at the Lawrence Livermore National Laboratory in California to conduct the research.
Scientists led by Aleksandr Noy at Lawrence Livermore discovered last year that carbon nanotubes were an ideal channel for proton transport. For this new study, Henley brought expertise and technology from Wanunu’s Nanoscale Biophysics Lab to Noy’s lab, and together they took the research one step further.
In addition to being precisely the right size for passing single water molecules, carbon nanotubes have a negative electric charge. This causes them to reject anything with the same charge, like the negative ions in salt, as well as other unwanted particles.
“While salt has a hard time passing through because of the charge, water is a neutral molecule and passes through easily,” Wanunu said. Scientists in Noy’s lab had theorized that carbon nanotubes could be designed for specific ion selectivity, but they didn’t have a reliable system of measurement. Luckily, “That’s the bread and butter of what we do in Meni’s lab,” Henley said. “It created a nice symbiotic relationship.”
“Robert brought the cutting-edge measurement and design capabilities of Wanunu’s group to my lab, and he was indispensable in developing a new platform that we used to measure the ion selectivity of the nanotubes,” Noy said.
The result is a novel system that could have major implications for the future of water security. The study showed that carbon nanotubes are better at desalinization than any other existing method — natural or man-made.
To keep their momentum going, the two labs have partnered with a leading water purification organization based in Israel. And the group was recently awarded a National Science Foundation/Binational Science Foundation grant to conduct further studies and develop water filtration platforms based on their new method. As they continue the research, the researchers hope to start programs where students can learn the latest on water filtration technology—with the goal of increasing that 0.007 percent.
As is usual in these cases there’s a fair degree of repetition, but there’s always at least one nugget of new information — in this case, a link to Israel. As I have noted many times, the Middle East is experiencing serious water issues. My most recent ‘water and the Middle East’ piece is an August 21, 2017 post about rainmaking at the Masdar Institute in the United Arab Emirates. Approximately 50% of the way down the posting, I mention Israel and Palestine’s conflict over water.
For all the talk about research with stem cells, it seems they’re not that easy to produce. An April 10, 2017 news item on ScienceDaily describes the problem and how a research team at Iowa State University may have developed a solution,
Researchers looking for ways to regenerate nerves can have a hard time obtaining key tools of their trade.
Schwann cells are an example. They form sheaths around axons, the tail-like parts of nerve cells that carry electrical impulses. They promote regeneration of those axons. And they secrete substances that promote the health of nerve cells.
In other words, they’re very useful to researchers hoping to regenerate nerve cells, specifically peripheral nerve cells, those cells outside the brain and spinal cord.
But Schwann cells are hard to come by in useful numbers.
So researchers have been taking readily available and noncontroversial mesenchymal stem cells (also called bone marrow stromal stem cells; they can form bone, cartilage and fat cells) and using a chemical process to turn them, or as researchers say, differentiate them, into Schwann cells. But it’s an arduous, step-by-step and expensive process.
Researchers at Iowa State University are exploring what they hope will be a better way to transform those stem cells into Schwann-like cells. They’ve developed a nanotechnology that uses inkjet printers to print multi-layer graphene circuits and also uses lasers to treat and improve the surface structure and conductivity of those circuits.
Iowa State University researchers, left to right, Metin Uz, Suprem Das, Surya Mallapragada and Jonathan Claussen are developing technologies to promote nerve regrowth. The monitor shows mesenchymal stem cells (the white) aligned along graphene circuits (the black). Credit: Photo by Christopher Gannon/Iowa State University
It turns out mesenchymal stem cells adhere and grow well on the treated circuit’s raised, rough and 3-D nanostructures. Add small doses of electricity – 100 millivolts for 10 minutes per day over 15 days – and the stem cells become Schwann-like cells.
The researchers’ findings are featured on the front cover of the scientific journal Advanced Healthcare Materials. Jonathan Claussen, an Iowa State assistant professor of mechanical engineering and an associate of the U.S. Department of Energy’s Ames Laboratory, is lead author. Suprem Das, a postdoctoral research associate in mechanical engineering and an associate of the Ames Laboratory; and Metin Uz, a postdoctoral research associate in chemical and biological engineering, are first authors.
The project is supported by funds from the Roy J. Carver Charitable Trust, the U.S. Army Medical Research and Materiel Command, Iowa State’s College of Engineering, the department of mechanical engineering and the Carol Vohs Johnson Chair in Chemical and Biological Engineering held by Surya Mallapragada, an Anson Marston Distinguished Professor in Engineering, an associate of the Ames Laboratory and a paper co-author.
“This technology could lead to a better way to differentiate stem cells,” Uz said. “There is huge potential here.”
The electrical stimulation is very effective, differentiating 85 percent of the stem cells into Schwann-like cells compared to 75 percent by the standard chemical process, according to the research paper. The electrically differentiated cells also produced 80 nanograms per milliliter of nerve growth factor compared to 55 nanograms per milliliter for the chemically treated cells.
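Those figures translate into sizeable relative gains. A quick calculation — my own arithmetic on the numbers quoted above, not from the paper:

```python
# Relative gains of electrical over chemical differentiation,
# using the figures quoted in the research paper summary above.
electrical_pct, chemical_pct = 85.0, 75.0    # differentiation yield, %
electrical_ngf, chemical_ngf = 80.0, 55.0    # nerve growth factor, ng/mL

yield_gain = (electrical_pct - chemical_pct) / chemical_pct * 100
ngf_gain = (electrical_ngf - chemical_ngf) / chemical_ngf * 100

print(f"Differentiation yield: +{yield_gain:.0f}%")  # about +13%
print(f"NGF secretion: +{ngf_gain:.0f}%")            # about +45%
```

In other words, the electrical route differentiated about 13 percent more cells than the chemical baseline, and those cells secreted roughly 45 percent more nerve growth factor.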
The researchers report the results could lead to changes in how nerve injuries are treated inside the body.
“These results help pave the way for in vivo peripheral nerve regeneration where the flexible graphene electrodes could conform to the injury site and provide intimate electrical stimulation for nerve cell regrowth,” the researchers wrote in a summary of their findings.
The paper reports several advantages to using electrical stimulation to differentiate stem cells into Schwann-like cells:
doing away with the arduous steps of chemical processing
reducing costs by eliminating the need for expensive nerve growth factors
potentially increasing control of stem cell differentiation with precise electrical stimulation
and creating a low maintenance, artificial framework for neural damage repairs.
A key to making it all work is a graphene inkjet printing process developed in Claussen’s research lab. The process takes advantage of graphene’s wonder-material properties – it’s a great conductor of electricity and heat, it’s strong, stable and biocompatible – to produce low-cost, flexible and even wearable electronics.
But there was a problem: once graphene electronic circuits were printed, they had to be treated to improve electrical conductivity. That usually meant high temperatures or chemicals. Either could damage flexible printing surfaces including plastic films or paper.
Claussen and his research group solved the problem by developing computer-controlled laser technology that selectively irradiates inkjet-printed graphene oxide. The treatment removes ink binders and reduces graphene oxide to graphene – physically stitching together millions of tiny graphene flakes. The process makes electrical conductivity more than a thousand times better.
The collaboration of Claussen’s group of nanoengineers developing printed graphene technologies and Mallapragada’s group of chemical engineers working on nerve regeneration began with some informal conversations on campus.
That led to experimental attempts to grow stem cells on printed graphene and then to electrical stimulation experiments.
“We knew this would be a really good platform for electrical stimulation,” Das said. “But we didn’t know it would differentiate these cells.”
But now that it has, the researchers say there are new possibilities to think about. The technology, for example, could one day be used to create dissolvable or absorbable nerve regeneration materials that could be surgically placed in a person’s body and wouldn’t require a second surgery to remove.
A March 7, 2017 news item on phys.org describes some of the US Argonne National Laboratory’s research into oil spill cleanup technology,
When the Deepwater Horizon drilling pipe blew out seven years ago, beginning the worst oil spill [BP oil spill in the Gulf of Mexico] in U.S. history, those in charge of the recovery discovered a new wrinkle: the millions of gallons of oil bubbling from the sea floor weren’t all collecting on the surface, where the oil could be skimmed or burned. Some of it was forming a plume and drifting through the ocean under the surface.
Now, scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory have invented a new foam, called Oleo Sponge, that addresses this problem. The material not only easily adsorbs oil from water, but is also reusable and can pull dispersed oil from the entire water column—not just the surface.
“The Oleo Sponge offers a set of possibilities that, as far as we know, are unprecedented,” said co-inventor Seth Darling, a scientist with Argonne’s Center for Nanoscale Materials and a fellow of the University of Chicago’s Institute for Molecular Engineering.
“We already have a library of molecules that can grab oil, but the problem is how to get them into a useful structure and bind them there permanently.”
The scientists started out with common polyurethane foam, used in everything from furniture cushions to home insulation. This foam has lots of nooks and crannies, like an English muffin, which could provide ample surface area to grab oil; but they needed to give the foam a new surface chemistry in order to firmly attach the oil-loving molecules.
Previously, Darling and fellow Argonne chemist Jeff Elam had developed a technique called sequential infiltration synthesis, or SIS, which can be used to infuse hard metal oxide atoms within complicated nanostructures.
After some trial and error, they found a way to adapt the technique to grow an extremely thin layer of metal oxide “primer” near the foam’s interior surfaces. This serves as the perfect glue for attaching the oil-loving molecules, which are deposited in a second step; they hold onto the metal oxide layer with one end and reach out to grab oil molecules with the other.
The result is Oleo Sponge, a block of foam that easily adsorbs oil from the water. The material, which looks a bit like an outdoor seat cushion, can be wrung out to be reused—and the oil itself recovered.
In tests at a giant seawater tank in New Jersey called Ohmsett, the National Oil Spill Response Research & Renewable Energy Test Facility, the Oleo Sponge successfully collected diesel and crude oil from both below and on the water surface.
“The material is extremely sturdy. We’ve run dozens to hundreds of tests, wringing it out each time, and we have yet to see it break down at all,” Darling said.
Oleo Sponge could potentially also be used routinely to clean harbors and ports, where diesel and oil tend to accumulate from ship traffic, said John Harvey, a business development executive with Argonne’s Technology Development and Commercialization division.
Elam, Darling and the rest of the team are continuing to develop the technology.
“The technique offers enormous flexibility, and can be adapted to other types of cleanup besides oil in seawater. You could attach a different molecule to grab any specific substance you need,” Elam said.
The team is actively looking to commercialize [emphasis mine] the material, Harvey said; those interested in licensing the technology or collaborating with the laboratory on further development may contact email@example.com.
Having written about memristors and neuromorphic engineering a number of times here, I’m quite intrigued to see some research into another nanoscale device for mimicking the functions of a human brain.
The announcement about the latest research from the team at the US Department of Energy’s Argonne National Laboratory is in a Feb. 14, 2017 news item on Nanowerk (Note: A link has been removed),
Research published in Nature Scientific Reports (“Ferroelectric symmetry-protected multibit memory cell”) lays out a theoretical map to use ferroelectric material to process information using multivalued logic – a leap beyond the simple ones and zeroes that make up our current computing systems that could let us process information much more efficiently.
The language of computers is written in just two symbols – ones and zeroes, meaning yes or no. But a world of richer possibilities awaits us if we could expand to three or more values, so that the same physical switch could encode much more information.
“Most importantly, this novel logic unit will enable information processing using not only “yes” and “no”, but also “either yes or no” or “maybe” operations,” said Valerii Vinokur, a materials scientist and Distinguished Fellow at the U.S. Department of Energy’s Argonne National Laboratory and the corresponding author on the paper, along with Laurent Baudry with the Lille University of Science and Technology and Igor Lukyanchuk with the University of Picardie Jules Verne.
This is the way our brains operate, and they’re something on the order of a million times more efficient than the best computers we’ve ever managed to build – while consuming orders of magnitude less energy.
“Our brains process so much more information, but if our synapses were built like our current computers are, the brain would not just boil but evaporate from the energy they use,” Vinokur said.
While the advantages of this type of computing, called multivalued logic, have long been known, the problem is that we haven’t discovered a material system that could implement it. Right now, transistors can only operate as “on” or “off,” so this new system would have to find a new way to consistently maintain more states – as well as be easy to read and write and, ideally, to work at room temperature.
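The payoff of extra logic states is easy to quantify: a cell with k stable states stores log2(k) bits, so fewer multivalued cells are needed for the same data. Here is a quick back-of-the-envelope sketch (my own illustration, not code from the paper):

```python
import math

def bits_per_cell(states: int) -> float:
    """Information capacity of one memory cell with `states` stable levels."""
    return math.log2(states)

def cells_needed(total_bits: int, states: int) -> int:
    """Number of cells required to store `total_bits` of information."""
    return math.ceil(total_bits / bits_per_cell(states))

# A binary cell stores 1 bit; a four-state ferroelectric cell stores 2 bits,
# halving the cell count for the same data. Even three states ("yes", "no",
# "maybe") already cut it by roughly a third.
binary_cells = cells_needed(1024, 2)      # 1024
ternary_cells = cells_needed(1024, 3)     # 647
quaternary_cells = cells_needed(1024, 4)  # 512
```

Of course, the hard part, as the researchers note, is finding a material that holds those extra states stably and reads and writes them reliably at room temperature.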
Hence Vinokur and the team’s interest in ferroelectrics, a class of materials whose polarization can be controlled with electric fields. As ferroelectrics physically change shape when the polarization changes, they’re very useful in sensors and other devices, such as medical ultrasound machines. Scientists are very interested in tapping these properties for computer memory and other applications; but the theory behind their behavior is very much still emerging.
The new paper lays out a recipe by which we could tap the properties of very thin films of a particular class of ferroelectric material called perovskites.
According to the calculations, perovskite films could hold two, three, or even four polarization positions that are energetically stable – “so they could ‘click’ into place, and thus provide a stable platform for encoding information,” Vinokur said.
The team calculated these stable configurations and how to manipulate the polarization to move it between stable positions using electric fields, Vinokur said.
“When we realize this in a device, it will enormously increase the efficiency of memory units and processors,” Vinokur said. “This offers a significant step towards realization of so-called neuromorphic computing, which strives to model the human brain.”
Vinokur said the team is working with experimentalists to apply the principles to create a working system.
Identification of the precise 3-D coordinates of iron, shown in red, and platinum atoms in an iron-platinum nanoparticle. Courtesy of Colin Ophus and Florian Nickel/Berkeley Lab
The image of the iron-platinum nanoparticle (referenced in the headline) reminds me of foetal ultrasound images. A Feb. 1, 2017 news item on ScienceDaily tells us more,
In the world of the very tiny, perfection is rare: virtually all materials have defects on the atomic level. These imperfections — missing atoms, atoms of one type swapped for another, and misaligned atoms — can uniquely determine a material’s properties and function. Now, UCLA [University of California at Los Angeles] physicists and collaborators have mapped the coordinates of more than 23,000 individual atoms in a tiny iron-platinum nanoparticle to reveal the material’s defects.
The results demonstrate that the positions of tens of thousands of atoms can be precisely identified and then fed into quantum mechanics calculations to correlate imperfections and defects with material properties at the single-atom level.
Jianwei “John” Miao, a UCLA professor of physics and astronomy and a member of UCLA’s California NanoSystems Institute, led the international team in mapping the atomic-level details of the bimetallic nanoparticle, more than a trillion of which could fit within a grain of sand.
“No one has seen this kind of three-dimensional structural complexity with such detail before,” said Miao, who is also a deputy director of the Science and Technology Center on Real-Time Functional Imaging. This new National Science Foundation-funded consortium consists of scientists at UCLA and five other colleges and universities who are using high-resolution imaging to address questions in the physical sciences, life sciences and engineering.
Miao and his team focused on an iron-platinum alloy, a very promising material for next-generation magnetic storage media and permanent magnet applications.
By taking multiple images of the iron-platinum nanoparticle with an advanced electron microscope at Lawrence Berkeley National Laboratory and using powerful reconstruction algorithms developed at UCLA, the researchers determined the precise three-dimensional arrangement of atoms in the nanoparticle.
“For the first time, we can see individual atoms and chemical composition in three dimensions. Everything we look at, it’s new,” Miao said.
The team identified and located more than 6,500 iron and 16,600 platinum atoms and showed how the atoms are arranged in nine grains, each of which contains different ratios of iron and platinum atoms. Miao and his colleagues showed that atoms closer to the interior of the grains are more regularly arranged than those near the surfaces. They also observed that the interfaces between grains, called grain boundaries, are more disordered.
“Understanding the three-dimensional structures of grain boundaries is a major challenge in materials science because they strongly influence the properties of materials,” Miao said. “Now we are able to address this challenge by precisely mapping out the three-dimensional atomic positions at the grain boundaries for the first time.”
The researchers then used the three-dimensional coordinates of the atoms as inputs into quantum mechanics calculations to determine the magnetic properties of the iron-platinum nanoparticle. They observed abrupt changes in magnetic properties at the grain boundaries.
“This work makes significant advances in characterization capabilities and expands our fundamental understanding of structure-property relationships, which is expected to find broad applications in physics, chemistry, materials science, nanoscience and nanotechnology,” Miao said.
In the future, as the researchers continue to determine the three-dimensional atomic coordinates of more materials, they plan to establish an online databank for the physical sciences, analogous to protein databanks for the biological and life sciences. “Researchers can use this databank to study material properties truly on the single-atom level,” Miao said.
Miao and his team also look forward to applying their method called GENFIRE (GENeralized Fourier Iterative Reconstruction) to biological and medical applications. “Our three-dimensional reconstruction algorithm might be useful for imaging like CT scans,” Miao said. Compared with conventional reconstruction methods, GENFIRE requires fewer images to compile an accurate three-dimensional structure.
That means that radiation-sensitive objects can be imaged with lower doses of radiation.
The US Dept. of Energy (DOE) Lawrence Berkeley National Laboratory issued their own Feb. 1, 2017 news release (also on EurekAlert) about the work with a focus on how their equipment made this breakthrough possible (it repeats a little of the info. from the UCLA news release),
Scientists used one of the world’s most powerful electron microscopes to map the precise location and chemical type of 23,000 atoms in an extremely small particle made of iron and platinum.
The 3-D reconstruction reveals the arrangement of atoms in unprecedented detail, enabling the scientists to measure chemical order and disorder in individual grains, which sheds light on the material’s properties at the single-atom level. Insights gained from the particle’s structure could lead to new ways to improve its magnetic performance for use in high-density, next-generation hard drives.
What’s more, the technique used to create the reconstruction, atomic electron tomography (which is like an incredibly high-resolution CT scan), lays the foundation for precisely mapping the atomic composition of other useful nanoparticles. This could reveal how to optimize the particles for more efficient catalysts, stronger materials, and disease-detecting fluorescent tags.
Microscopy data was obtained and analyzed by scientists from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) at the Molecular Foundry, in collaboration with Foundry users from UCLA, Oak Ridge National Laboratory, and the United Kingdom’s University of Birmingham. …
Atoms are the building blocks of matter, and the patterns in which they’re arranged dictate a material’s properties. These patterns can also be exploited to greatly improve a material’s function, which is why scientists are eager to determine the 3-D structure of nanoparticles at the smallest scale possible.
“Our research is a big step in this direction. We can now take a snapshot that shows the positions of all the atoms in a nanoparticle at a specific point in its growth. This will help us learn how nanoparticles grow atom by atom, and it sets the stage for a materials-design approach starting from the smallest building blocks,” says Mary Scott, who conducted the research while she was a Foundry user, and who is now a staff scientist. Scott and fellow Foundry scientists Peter Ercius and Colin Ophus developed the method in close collaboration with Jianwei Miao, a UCLA professor of physics and astronomy.
Their nanoparticle reconstruction builds on an achievement they reported last year in which they measured the coordinates of more than 3,000 atoms in a tungsten needle to a precision of 19 trillionths of a meter (19 picometers), which is many times smaller than a hydrogen atom. Now, they’ve taken the same precision, added the ability to distinguish different elements, and scaled up the reconstruction to include tens of thousands of atoms.
Importantly, their method maps the position of each atom in a single, unique nanoparticle. In contrast, X-ray crystallography and cryo-electron microscopy plot the average position of atoms from many identical samples. These methods make assumptions about the arrangement of atoms, which isn’t a good fit for nanoparticles because no two are alike.
“We need to determine the location and type of each atom to truly understand how a nanoparticle functions at the atomic scale,” says Ercius.
A TEAM approach
The scientists’ latest accomplishment hinged on the use of one of the highest-resolution transmission electron microscopes in the world, called TEAM I. It’s located at the National Center for Electron Microscopy, which is a Molecular Foundry facility. The microscope scans a sample with a focused beam of electrons, and then measures how the electrons interact with the atoms in the sample. It also has a piezo-controlled stage that positions samples with unmatched stability and position-control accuracy.
The researchers began growing an iron-platinum nanoparticle from its constituent elements, and then stopped the particle’s growth before it was fully formed. They placed the “partially baked” particle in the TEAM I stage, obtained a 2-D projection of its atomic structure, rotated it a few degrees, obtained another projection, and so on. Each 2-D projection provides a little more information about the full 3-D structure of the nanoparticle.
They sent the projections to Miao at UCLA, who used a sophisticated computer algorithm to convert the 2-D projections into a 3-D reconstruction of the particle. The individual atomic coordinates and chemical types were then traced from the 3-D density based on the knowledge that iron atoms are lighter than platinum atoms. The resulting atomic structure contains 6,569 iron atoms and 16,627 platinum atoms, with each atom’s coordinates precisely plotted to less than the width of a hydrogen atom.
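GENFIRE itself is a sophisticated iterative reconstruction algorithm, but the core idea (each 2-D projection constrains the 3-D structure) can be illustrated with a toy, unfiltered back-projection in two dimensions. This is my own simplified sketch, not the researchers' code:

```python
import numpy as np

# Toy illustration of tomographic back-projection: a point-like "particle"
# on a 2-D grid is projected along two axes, and summing the back-projected
# profiles already localizes it. (The real experiment uses many tilt angles
# and iterative refinement to reach atomic resolution in 3-D.)
grid = np.zeros((64, 64))
grid[40, 22] = 1.0  # hypothetical atom position, chosen for illustration

proj_rows = grid.sum(axis=1)  # projection onto the vertical axis
proj_cols = grid.sum(axis=0)  # projection onto the horizontal axis

# Unfiltered back-projection: smear each projection back across the grid
# and sum. Regions consistent with every projection reinforce each other.
backproj = proj_rows[:, None] + proj_cols[None, :]

# The brightest pixel of the reconstruction coincides with the atom.
peak = np.unravel_index(np.argmax(backproj), backproj.shape)
```

With only two projections the reconstruction is crude; adding more tilt angles, or a smarter algorithm like GENFIRE, sharpens the result while needing fewer images than conventional methods, which is exactly why lower radiation doses become possible.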
Translating the data into scientific insights
Interesting features emerged at this extreme scale after Molecular Foundry scientists used code they developed to analyze the atomic structure. For example, the analysis revealed chemical order and disorder in interlocking grains, in which the iron and platinum atoms are arranged in different patterns. This has large implications for how the particle grew and its real-world magnetic properties. The analysis also revealed single-atom defects and the width of disordered boundaries between grains, measurements that had not previously been possible for complex 3-D grain boundaries.
“The important materials science problem we are tackling is how this material transforms from a highly randomized structure, what we call a chemically-disordered structure, into a regular highly-ordered structure with the desired magnetic properties,” says Ophus.
To explore how the various arrangements of atoms affect the nanoparticle’s magnetic properties, scientists from DOE’s Oak Ridge National Laboratory ran computer calculations on the Titan supercomputer at ORNL, using the coordinates and chemical type of each atom, to simulate the nanoparticle’s behavior in a magnetic field. This allowed the scientists to see patterns of atoms that are very magnetic, which is ideal for hard drives. They also saw patterns with poor magnetic properties that could sap a hard drive’s performance.
“This could help scientists learn how to steer the growth of iron-platinum nanoparticles so they develop more highly magnetic patterns of atoms,” says Ercius.
Adds Scott, “More broadly, the imaging technique will shed light on the nucleation and growth of ordered phases within nanoparticles, which isn’t fully theoretically understood but is critically important to several scientific disciplines and technologies.”
The folks at the Berkeley Lab have created a video (notice where the still image from the beginning of this post appears),
The Oak Ridge National Laboratory (ORNL), not wanting to be left out, has been mentioned in a Feb. 3, 2017 news item on ScienceDaily,
… researchers working with magnetic nanoparticles at the University of California, Los Angeles (UCLA), and the US Department of Energy’s (DOE’s) Lawrence Berkeley National Laboratory (Berkeley Lab) approached computational scientists at DOE’s Oak Ridge National Laboratory (ORNL) to help solve a unique problem: to model magnetism at the atomic level using experimental data from a real nanoparticle.
“These types of calculations have been done for ideal particles with ideal crystal structures but not for real particles,” said Markus Eisenbach, a computational scientist at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at ORNL.
Eisenbach develops quantum mechanical electronic structure simulations that predict magnetic properties in materials. Working with Paul Kent, a computational materials scientist at ORNL’s Center for Nanophase Materials Sciences, the team collaborated with researchers at UCLA and Berkeley Lab’s Molecular Foundry to combine world-class experimental data with world-class computing to do something new: simulate magnetism atom by atom in a real nanoparticle.
Using the new data from the research teams on the West Coast, Eisenbach and Kent were able to precisely model the measured atomic structure, including defects, from a unique iron-platinum (FePt) nanoparticle and simulate its magnetic properties on the 27-petaflop Titan supercomputer at the OLCF.
Electronic structure codes take atomic and chemical structure and solve for the corresponding magnetic properties. However, these structures are typically derived from many 2-D electron microscopy or x-ray crystallography images averaged together, resulting in a representative, but not true, 3-D structure.
“In this case, researchers were able to get the precise 3-D structure for a real particle,” Eisenbach said. “The UCLA group has developed a new experimental technique where they can tell where the atoms are–the coordinates–and the chemical resolution, or what they are — iron or platinum.”
The ORNL news release goes on to describe the work from the perspective of the people who ran the supercomputer simulations,
A Supercomputing Milestone
Magnetism at the atomic level is driven by quantum mechanics — a fact that has shaken up classical physics calculations and called for increasingly complex, first-principles calculations, or calculations working forward from fundamental physics equations rather than relying on assumptions that reduce computational workload.
For magnetic recording and storage devices, researchers are particularly interested in magnetic anisotropy, or what direction magnetism favors in an atom.
“If the anisotropy is too weak, a bit written to the nanoparticle might flip at room temperature,” Kent said.
To solve for magnetic anisotropy, Eisenbach and Kent used two computational codes to compare and validate results.
To simulate a supercell of about 1,300 atoms from strongly magnetic regions of the 23,000-atom nanoparticle, they used the Linear Scaling Multiple Scattering (LSMS) code, a first-principles density functional theory code developed at ORNL.
“The LSMS code was developed for large magnetic systems and can tackle lots of atoms,” Kent said.
As principal investigator on 2017, 2016, and previous INCITE program awards, Eisenbach has scaled the LSMS code to Titan for a range of magnetic materials projects, and the in-house code has been optimized for Titan’s accelerated architecture, speeding up calculations more than 8 times on the machine’s GPUs. Exceptionally capable of crunching large magnetic systems quickly, the LSMS code received an Association for Computing Machinery Gordon Bell Prize in high-performance computing achievement in 1998 and 2009, and developments continue to enhance the code for new architectures.
Working with Renat Sabirianov at the University of Nebraska at Omaha, the team also ran VASP, a simulation package that is better suited for smaller atom counts, to simulate regions of about 32 atoms.
“With both approaches, we were able to confirm that the local VASP results were consistent with the LSMS results, so we have a high confidence in the simulations,” Eisenbach said.
Computer simulations revealed that grain boundaries have a strong effect on magnetism. “We found that the magnetic anisotropy energy suddenly transitions at the grain boundaries. These magnetic properties are very important,” Miao said.
In the future, researchers hope that advances in computing and simulation will make a full-particle simulation possible — as first-principles calculations are currently too intensive to solve small-scale magnetism for regions larger than a few thousand atoms.
Also, future simulations like these could show how different fabrication processes, such as the temperature at which nanoparticles are formed, influence magnetism and performance.
“There’s a hope going forward that one would be able to use these techniques to look at nanoparticle growth and understand how to optimize growth for performance,” Kent said.
Finally, here’s a link to and a citation for the paper,
You might want to skip over the reference to snow as it doesn’t have much relevance to this story about ‘melting’, from a Feb. 1, 2017 news item on Nanowerk (Note: A link has been removed),
Snow falls in winter and melts in spring, but what drives the phase change in between?
Although melting is a familiar phenomenon encountered in everyday life, playing a part in many industrial and commercial processes, much remains to be discovered about this transformation at a fundamental level.
In 2015, a team led by the University of Michigan’s Sharon Glotzer used high-performance computing at the Department of Energy’s (DOE’s) Oak Ridge National Laboratory [ORNL] to study melting in two-dimensional (2-D) systems, a problem that could yield insights into surface interactions in materials important to technologies like solar panels, as well as into the mechanism behind three-dimensional melting. The team explored how particle shape affects the physics of a solid-to-fluid melting transition in two dimensions.
Using the Cray XK7 Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility, the team’s [latest?] work revealed that the shape and symmetry of particles can dramatically affect the melting process (“Shape and symmetry determine two-dimensional melting transitions of hard regular polygons”). This fundamental finding could help guide researchers in search of nanoparticles with desirable properties for energy applications.
There is a video of the ‘melting’ process but I have to confess to finding it a bit enigmatic,
To tackle the problem, Glotzer’s team needed a supercomputer capable of simulating systems of up to 1 million hard polygons, simple particles used as stand-ins for atoms, ranging from triangles to 14-sided shapes. Unlike traditional molecular dynamics simulations that attempt to mimic nature, hard polygon simulations give researchers a pared-down environment in which to evaluate shape-influenced physics.
“Within our simulated 2-D environment, we found that the melting transition follows one of three different scenarios depending on the shape of the systems’ polygons,” University of Michigan research scientist Joshua Anderson said. “Notably, we found that systems made up of hexagons perfectly follow a well-known theory for 2-D melting, something that hasn’t been described until now.”
Shifting Shape Scenarios
In 3-D systems such as a thinning icicle, melting takes the form of a first-order phase transition. This means that collections of molecules within these systems exist in either solid or liquid form with no in-between, and the transition involves latent heat, the energy that fuels a solid-to-fluid phase change. In 2-D systems, such as thin-film materials used in batteries and other technologies, melting can be more complex, sometimes exhibiting an intermediate phase known as the hexatic phase.
The hexatic phase, a state characterized as a halfway point between an ordered solid and a disordered liquid, was first theorized in the 1970s by researchers John Kosterlitz, David Thouless, Burt Halperin, David Nelson, and Peter Young. The phase is a principal feature of the KTHNY theory, a 2-D melting theory posited by the researchers (and named based on the first letters of their last names). In 2016 Kosterlitz and Thouless were awarded the Nobel Prize in Physics, along with physicist Duncan Haldane, for their contributions to 2-D materials research.
At the molecular level, solid, hexatic, and liquid systems are defined by the arrangement of their atoms. In a crystalline solid, two types of order are present: translational and orientational. Translational order describes the well-defined paths between atoms over distances, like blocks in a carefully constructed Jenga tower. Orientational order describes the relational and clustered order shared between atoms and groups of atoms over distances. Think of that same Jenga tower turned askew after several rounds of play. The general shape of the tower remains, but its order is now fragmented.
The hexatic phase has no translational order but possesses orientational order. (A liquid has neither translational nor orientational order but exhibits short-range order, meaning any atom will have some average number of neighbors nearby but with no predictable order.)
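The sixfold orientational order that survives in the hexatic phase is conventionally measured with the bond-orientational order parameter psi-6, averaged over a particle's bonds to its neighbors. Here is a minimal sketch of the standard formula (my own illustration; in real analyses the neighbor lists come from a Voronoi or cutoff construction):

```python
import numpy as np

def psi6(center, neighbors):
    """Bond-orientational order parameter for one particle:
    psi6 = |(1/n) * sum_k exp(6i * theta_k)|, where theta_k is the angle
    of the bond from `center` to neighbor k. The magnitude is 1 for
    perfect sixfold (hexagonal) coordination and near 0 for disorder."""
    dx = neighbors[:, 0] - center[0]
    dy = neighbors[:, 1] - center[1]
    theta = np.arctan2(dy, dx)
    return np.abs(np.exp(6j * theta).mean())

# Perfect hexagonal shell: six neighbors at 60-degree intervals.
angles = np.deg2rad(np.arange(0, 360, 60))
hex_shell = np.column_stack([np.cos(angles), np.sin(angles)])
ordered = psi6(np.zeros(2), hex_shell)  # -> 1.0

# Randomly placed neighbors give a much smaller value.
rng = np.random.default_rng(0)
rand_shell = rng.uniform(-1, 1, size=(6, 2))
disordered = psi6(np.zeros(2), rand_shell)
```

Tracking how this quantity and its translational counterpart decay with distance is how simulations distinguish solid, hexatic, and liquid phases.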
On Titan, the team’s HOOMD-blue particle-simulation code used 64 GPUs for each massively parallel Monte Carlo simulation of up to 1 million particles. Researchers explored 11 different shape systems, applying an external pressure to push the particles together. Each system was simulated at 21 different densities, with the lowest densities representing a fluid state and the highest densities a solid state.
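For readers curious what a hard-particle Monte Carlo step actually looks like, here is a minimal serial sketch with hard disks rather than polygons (my own toy version; the team's production runs used the GPU-accelerated HOOMD-blue code at vastly larger scale): propose a small random displacement and accept it only if the moved disk overlaps no other disk.

```python
import numpy as np

rng = np.random.default_rng(42)
L = 10.0        # periodic box edge
sigma = 1.0     # disk diameter (overlap if centers are closer than this)
n_side = 6

# Start from a square lattice, clearly non-overlapping at this density.
xs = (np.arange(n_side) + 0.5) * (L / n_side)
pos = np.array([(x, y) for x in xs for y in xs])
n = len(pos)

def overlaps(p, others):
    """True if a disk at p overlaps any disk in `others` (periodic box)."""
    d = others - p
    d -= L * np.round(d / L)  # minimum-image convention
    return (np.hypot(d[:, 0], d[:, 1]) < sigma).any()

accepted = 0
n_sweeps = 200
for _ in range(n_sweeps * n):
    i = rng.integers(n)
    trial = (pos[i] + rng.uniform(-0.2, 0.2, size=2)) % L
    rest = np.delete(pos, i, axis=0)
    if not overlaps(trial, rest):   # hard-particle rule: no energy,
        pos[i] = trial              # only overlap rejection
        accepted += 1

acceptance = accepted / (n_sweeps * n)
```

Averaging positional and orientational order over many such runs at each density is what reveals which phase, solid, hexatic, or fluid, the system is in.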
The simulations demonstrated multiple melting scenarios hinging on the polygons’ shape. Systems with polygons of seven sides or more closely followed the melting behavior of hard disks, or circles, exhibiting a continuous phase transition from the solid to the hexatic phase and a first-order phase transition from the hexatic to the liquid phase. A continuous phase transition means a constantly changing area in response to a changing external pressure. A first-order phase transition is characterized by a discontinuity in which the volume jumps across the phase transition in response to the changing external pressure. The team found pentagons and fourfold pentilles, irregular pentagons with two different edge lengths, exhibit a first-order solid-to-liquid phase transition.
The most significant finding, however, emerged from hexagon systems, which perfectly followed the phase transition described by the KTHNY theory. In this scenario, the particles shift from solid to hexatic and hexatic to fluid in a perfect continuous phase transition pattern.
“It was actually sort of surprising that no one else has found that until now,” Anderson said, “because it seems natural that the hexagon, with its six sides, and the honeycomb-like hexagonal arrangement would be a perfect match for this theory” in which the hexatic phase generally contains sixfold orientational order.
Glotzer’s team, which recently received a 2017 INCITE allocation, is now applying its leadership-class computing prowess to tackle phase transitions in 3-D. The team is focusing on how fluid particles crystallize into complex colloids—mixtures in which particles are suspended throughout another substance. Common examples of colloids include milk, paper, fog, and stained glass.
“We’re planning on using Titan to study how complexity can arise from these simple interactions, and to do that we’re actually going to look at how the crystals grow and study the kinetics of how that happens,” said Anderson.
Layers of graphene separated by nanotube pillars of boron nitride may be a suitable material to store hydrogen fuel in cars, according to Rice University scientists.
The Department of Energy has set benchmarks for storage materials that would make hydrogen a practical fuel for light-duty vehicles. The Rice lab of materials scientist Rouzbeh Shahsavari determined in a new computational study that pillared boron nitride and graphene could be a candidate.
Shahsavari’s lab had already determined through computer models how tough and resilient pillared graphene structures would be, and later worked boron nitride nanotubes into the mix to model a unique three-dimensional architecture. (Samples of boron nitride nanotubes seamlessly bonded to graphene have been made.)
Just as pillars in a building make space between floors for people, pillars in boron nitride graphene make space for hydrogen atoms. The challenge is to make them enter and stay in sufficient numbers and exit upon demand.
In their latest molecular dynamics simulations, the researchers found that either pillared graphene or pillared boron nitride graphene would offer abundant surface area (about 2,547 square meters per gram) with good recyclable properties under ambient conditions. Their models showed adding oxygen or lithium to the materials would make them even better at binding hydrogen.
They focused the simulations on four variants: pillared structures of boron nitride or pillared boron nitride graphene doped with either oxygen or lithium. At room temperature and ambient pressure, oxygen-doped boron nitride graphene proved the best, holding 11.6 percent of its weight in hydrogen (its gravimetric capacity) and about 60 grams per liter (its volumetric capacity); it easily beat competing technologies like porous boron nitride, metal oxide frameworks and carbon nanotubes.
At a chilly -321 degrees Fahrenheit, the material held 14.77 percent of its weight in hydrogen.
The Department of Energy’s current target for an economic storage medium is the ability to store more than 5.5 percent of its own weight, and 40 grams per liter, in hydrogen under moderate conditions. The ultimate targets are 7.5 weight percent and 70 grams per liter.
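Putting the reported numbers next to the targets makes the comparison explicit (my own arithmetic, using only the figures quoted above):

```python
# Reported room-temperature capacities for oxygen-doped pillared boron
# nitride graphene, versus the DOE storage targets cited in the text.
reported = {"gravimetric_wt_pct": 11.6, "volumetric_g_per_L": 60.0}
doe_current = {"gravimetric_wt_pct": 5.5, "volumetric_g_per_L": 40.0}
doe_ultimate = {"gravimetric_wt_pct": 7.5, "volumetric_g_per_L": 70.0}

meets_current = all(reported[k] > doe_current[k] for k in reported)
meets_ultimate = all(reported[k] > doe_ultimate[k] for k in reported)
# The simulated material clears the current target on both measures and
# the ultimate gravimetric target, but falls short of the ultimate
# 70-grams-per-liter volumetric target.
```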
Shahsavari said hydrogen atoms adsorbed to the undoped pillared boron nitride graphene, thanks to weak van der Waals forces. When the material was doped with oxygen, the atoms bonded strongly with the hybrid and created a better surface for incoming hydrogen, which Shahsavari said would likely be delivered under pressure and would exit when pressure is released.
“Adding oxygen to the substrate gives us good bonding because of the nature of the charges and their interactions,” he said. “Oxygen and hydrogen are known to have good chemical affinity.”
He said the polarized nature of the boron nitride where it bonds with the graphene and the electron mobility of the graphene itself make the material highly tunable for applications.
“What we’re looking for is the sweet spot,” Shahsavari said, describing the ideal conditions as a balance between the material’s surface area and weight, as well as the operating temperatures and pressures. “This is only practical through computational modeling, because we can test a lot of variations very quickly. It would take experimentalists months to do what takes us only days.”
He said the structures should be robust enough to easily surpass the Department of Energy requirement that a hydrogen fuel tank be able to withstand 1,500 charge-discharge cycles.
Shayeganfar [Farzaneh Shayeganfar], a former visiting scholar at Rice, is an instructor at Shahid Rajaee Teacher Training University in Tehran, Iran.
Caption: Simulations by Rice University scientists show that pillared graphene boron nitride may be a suitable storage medium for hydrogen-powered vehicles. Above, the pink (boron) and blue (nitrogen) pillars serve as spacers for carbon graphene sheets (gray). The researchers showed the material worked best when doped with oxygen atoms (red), which enhanced its ability to adsorb and desorb hydrogen (white). Credit: Lei Tao/Rice University
The University of California at Davis (UC Davis) and the University of Washington (state) collaborated in research into fundamental questions on how aquatic animals grow. From an Oct. 24, 2016 news item on ScienceDaily,
For the first time scientists can see how the shells of tiny marine organisms grow atom-by-atom, a new study reports. The advance provides new insights into the mechanisms of biomineralization and will improve our understanding of environmental change in Earth’s past.
Led by researchers from the University of California, Davis and the University of Washington, with key support from the U.S. Department of Energy’s Pacific Northwest National Laboratory, the team examined an organic-mineral interface where the first calcium carbonate crystals start to appear in the shells of foraminifera, a type of plankton.
“We’ve gotten the first glimpse of the biological event horizon,” said Howard Spero, a study co-author and UC Davis geochemistry professor. …
Foraminifera’s Final Frontier
The researchers zoomed in on shells at the atomic level to better understand how growth processes may influence the levels of trace impurities in shells. The team looked at a key stage — the interaction between the biological ‘template’ and the initiation of shell growth. The scientists produced an atom-scale map of the chemistry at this crucial interface in the foraminifera Orbulina universa. This is the first-ever measurement of the chemistry of a calcium carbonate biomineralization template, Spero said.
Among the new findings are elevated levels of sodium and magnesium in the organic layer. This is surprising because the two elements are not considered important architects in building shells, said lead study author Oscar Branson, a former postdoctoral researcher at UC Davis who is now at the Australian National University in Canberra. Also, the greater concentrations of magnesium and sodium in the organic template may need to be considered when investigating past climate with foraminifera shells.
Calibrating Earth’s Climate
Most of what we know about past climate (beyond ice core records) comes from chemical analyses of shells made by the tiny, one-celled creatures called foraminifera, or “forams.” When forams die, their shells sink and are preserved in seafloor mud. The chemistry preserved in ancient shells chronicles climate change on Earth, an archive that stretches back nearly 200 million years.
The calcium carbonate shells incorporate elements from seawater — such as calcium, magnesium and sodium — as the shells grow. The amount of trace impurities in a shell depends on both the surrounding environmental conditions and how the shells are made. For example, the more magnesium a shell has, the warmer the ocean was where that shell grew.
“Finding out how much magnesium there is in a shell can allow us to find out the temperature of seawater going back up to 150 million years,” Branson said.
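The magnesium-temperature proxy Branson describes is commonly expressed as an exponential calibration of the form Mg/Ca = B·exp(A·T). The sketch below inverts that relationship to estimate temperature from a measured Mg/Ca ratio; the coefficients A and B are illustrative assumptions only, since real calibrations are species-specific and empirically determined, and none are given in this article.

```python
import math

# Exponential Mg/Ca-temperature calibration: Mg/Ca = B * exp(A * T).
# A and B below are hypothetical placeholders, not values from this study.
A = 0.09   # sensitivity per degree Celsius (assumed)
B = 0.38   # pre-exponential constant in mmol/mol (assumed)

def mg_ca_from_temperature(t_celsius):
    """Forward model: expected Mg/Ca ratio (mmol/mol) at a given temperature."""
    return B * math.exp(A * t_celsius)

def temperature_from_mg_ca(mg_ca_mmol_mol):
    """Invert the calibration to estimate seawater temperature (deg C)."""
    return math.log(mg_ca_mmol_mol / B) / A

# Usage: a shell with a higher Mg/Ca ratio implies a warmer ocean.
for ratio in (1.0, 2.0, 4.0):
    print(f"Mg/Ca = {ratio:.1f} mmol/mol -> ~{temperature_from_mg_ca(ratio):.1f} deg C")
```

Because the relationship is exponential, doubling the Mg/Ca ratio shifts the temperature estimate by a fixed increment (ln 2 / A), which is why small changes in shell chemistry can record meaningful changes in ocean temperature.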
But magnesium levels also vary within a shell, because of nanometer-scale growth bands. Each band is one day’s growth (similar to the seasonal variations in tree rings). Branson said considerable gaps persist in understanding what exactly causes the daily bands in the shells.
“We know that shell formation processes are important for shell chemistry, but we don’t know much about these processes or how they might have changed through time,” he said. “This adds considerable uncertainty to climate reconstructions.”
The researchers used two cutting-edge techniques: Time-of-Flight Secondary Ion Mass Spectrometry (ToF-SIMS) and Laser-Assisted Atom Probe Tomography (APT). ToF-SIMS is a two-dimensional chemical mapping technique that shows the elemental composition of the surface of a polished sample. The technique was developed for the elemental analysis of complex polymer materials, and is just starting to be applied to natural samples like shells.
APT is an atomic-scale three-dimensional mapping technique, developed for looking at internal structures in advanced alloys, silicon chips and superconductors. The APT imaging was performed at the Environmental Molecular Sciences Laboratory, a U.S. Department of Energy Office of Science User Facility at the Pacific Northwest National Laboratory.
This foraminifera is just starting to form its adult spherical shell. The calcium carbonate spherical shell first forms on a thin organic template, shown here in white, around the dark juvenile skeleton. Calcium carbonate spines then extend from the juvenile skeleton through the new sphere and outward. The bright flecks are algae that the foraminifera “farm” for sustenance. Credit: Howard Spero/University of California, Davis
Unseen out in the ocean, countless single-celled organisms grow protective shells to keep them safe as they drift along, living off other tiny marine plants and animals. Taken together, the shells are so plentiful that when they sink they provide one of the best records for the history of ocean chemistry.
Oceanographers at the University of Washington and the University of California, Davis, have used modern tools to provide an atomic-scale look at how that shell first forms. Results could help answer fundamental questions about how these creatures grow under different ocean conditions, in the past and in the future. …
“There’s this debate among scientists about whether shelled organisms are slaves to the chemistry of the ocean, or whether they have the physiological capacity to adapt to changing environmental conditions,” said senior author Alex Gagnon, a UW assistant professor of oceanography.
The new work shows, he said, that they do exert some biologically-based control over shell formation.
“I think it’s just incredible that we were able to peer into the intricate details of those first moments that set how a seashell forms,” Gagnon said. “And that’s what sets how much of the rest of the skeleton will grow.”
The results could eventually help explain how organisms at the base of the marine food chain will respond to more acidic waters. And while the study looked at one organism, Orbulina universa, which is important for understanding past climate, the same method could be used for other plankton, corals and shellfish.
The study used tools developed for materials science and semiconductor research to view shell formation in the greatest detail yet and to see how the organisms turn seawater into solid mineral.
“We’re interested more broadly in the question ‘How do organisms make shells?'” said first author Oscar Branson, a former postdoctoral researcher at the University of California, Davis who is now at Australian National University in Canberra. “We’ve focused on a key stage in mineral formation — the interaction between biological template materials and the initiation of shell growth by an organism.”
These tiny single-celled animals, called foraminifera, can’t reproduce anywhere but in their natural surroundings, which prevents breeding them in captivity. The researchers caught juvenile foraminifera by diving in deep water off Southern California. They then raised them in the lab, using tiny pipettes to feed them brine shrimp during their weeklong lives.
Marine organisms make their shells from calcium carbonate, drawing the calcium and carbon from surrounding seawater. But the animal first grows a soft template for the mineral to grow over. Because this template is trapped within the growing skeleton, it acts as a snapshot of the chemical conditions during the first part of skeletal growth.
To see this chemical picture, the authors analyzed tiny sections of foraminifera template with a technique called atom probe tomography at the Pacific Northwest National Laboratory. This tool creates an atom-by-atom picture of the organic template, which was located using a chemical tag.
Results show that the template contains more magnesium and sodium atoms than expected, and that this could influence how the mineral in the shell begins to grow around it.
“One of the key stages in growing a skeleton is when you make that first bit, when you build that first bit of structure. Anything that changes that process is a key control point,” Gagnon said.
The clustering of magnesium and sodium atoms in the template suggests that the two elements play a role in the first stages of shell growth. If their availability changes for any reason, that could influence how the shell grows beyond what simple chemistry would predict.
“We can say who the players are — further experiments will have to tell us exactly how important each of them is,” Gagnon said.
Follow-up work will try to grow the shells and create models of their formation to see how the template affects growth under different conditions, such as more acidic water.
“Translating that into, ‘Can these forams survive ocean acidification?’ is still many steps down the line,” Gagnon cautioned. “But you can’t do that until you have a picture of what that surface actually looks like.”
The researchers also hope that by better understanding the exact mechanism of shell growth they could tease apart different aspects of seafloor remains so the shells can be used to reconstruct more than just the ocean’s past temperature. In the study, they showed that the template was responsible for causing fine lines in the shells — one example of the rich chemical information encoded in fossil shells.
“There are ways that you could separate the effects of temperature from other things and learn much more about the past ocean,” Gagnon said.