Tag Archives: Berkeley Lab

Disorder engineering turns ‘white’ nanoparticles to ‘black’ nanoparticles for clean energy

Titanium dioxide crystals are white, except when they’re black. According to an Apr. 10, 2013 news item on Nanowerk, researchers at the Lawrence Berkeley National Laboratory (US) have found a way to change white titanium dioxide crystals to black, thereby changing some of their properties,

A unique atomic-scale engineering technique for turning low-efficiency photocatalytic “white” nanoparticles of titanium dioxide into high-efficiency “black” nanoparticles could be the key to clean energy technologies based on hydrogen.

Samuel Mao, a scientist who holds joint appointments with Berkeley Lab’s Environmental Energy Technologies Division and the University of California at Berkeley, leads the development of a technique for engineering disorder into the nanocrystalline structure of the semiconductor titanium dioxide. This turns the naturally white crystals black in color, a sign that the crystals are now able to absorb infrared as well as visible and ultraviolet light. The expanded absorption spectrum substantially improves the efficiency with which black titanium dioxide can use sunlight to split water molecules for the production of hydrogen.

The Apr. 10, 2013 Berkeley Lab news release, which originated the news item, provides more detail about how this discovery might have an impact on clean energy efforts,

The promise of hydrogen in batteries or fuels is a clean and renewable source of energy that does not exacerbate global climate change. The challenge is cost-effectively mass-producing it. Despite being the most abundant element in the universe, pure hydrogen is scarce on Earth because hydrogen combines with just about any other type of atom. Using solar energy to split the water molecule into hydrogen and oxygen is the ideal way to produce pure hydrogen. This, however, requires an efficient photocatalyst that water won’t corrode. Titanium dioxide can stand up to water but until the work of Mao and his group was only able to absorb ultraviolet light, which accounts for barely ten percent of the energy in sunlight.

In his ACS [American Chemical Society] talk [at the 245th meeting, Apr. 7 – 11, 2013], titled “Disorder Engineering: Turning Titanium Dioxide Nanoparticles Black,” Mao described how he developed the concept of “disorder engineering,” and how the introduction of hydrogenated disorders creates mid-band gap energy states above the valence band maximum to enhance hydrogen mobility. His studies have not only yielded a promising new photocatalyst for generating hydrogen, but have also helped dispel some widely held scientific beliefs.

“Our tests have shown that a good semiconductor photocatalyst does not have to be a single crystal with minimal defects and energy levels just beneath the bottom of conduction band,” Mao said.

Characterization studies at Berkeley Lab’s Advanced Light Source also helped answer the question of how much of the hydrogen detected in their experiments comes from the photocatalytic reaction, and how much comes from hydrogen absorbed in the titanium oxide during the hydrogenation synthesis process.

“Our measurements indicate that only a very small amount of hydrogen is absorbed in black titanium dioxide, about 0.05 milligrams, as compared to the 40 milligrams of hydrogen detected during a 100 hour solar-driven hydrogen production experiment,” Mao said.

I must say, this ‘disorder engineering’ sounds much more appealing than some of the other disorders one hears about (e.g. personality disorders).

Computer simulation errors and corrections

In addition to being a news release, this is a really good piece of science writing by Paul Preuss for the Lawrence Berkeley National Laboratory (Berkeley Lab), from the Jan. 3, 2013 Berkeley Lab news release,

Because modern computers have to depict the real world with digital representations of numbers instead of physical analogues, to simulate the continuous passage of time they have to digitize time into small slices. This kind of simulation is essential in disciplines from medical and biological research, to new materials, to fundamental considerations of quantum mechanics, and the fact that it inevitably introduces errors is an ongoing problem for scientists.

Scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have now identified and characterized the source of tenacious errors and come up with a way to separate the realistic aspects of a simulation from the artifacts of the computer method. …

Here’s more detail about the problem and solution,

How biological molecules move is hardly the only field where computer simulations of molecular-scale motion are essential. The need to use computers to test theories and model experiments that can’t be done on a lab bench is ubiquitous, and the problems that Sivak and his colleagues encountered weren’t new.

“A simulation of a physical process on a computer cannot use the exact, continuous equations of motion; the calculations must use approximations over discrete intervals of time,” says Sivak. “It’s well known that standard algorithms that use discrete time steps don’t conserve energy exactly in these calculations.”

One workhorse method for modeling molecular systems is Langevin dynamics, based on equations first developed by the French physicist Paul Langevin over a century ago to model Brownian motion. Brownian motion is the random movement of particles in a fluid (originally pollen grains on water) as they collide with the fluid’s molecules – particle paths resembling a “drunkard’s walk,” which Albert Einstein had used just a few years earlier to establish the reality of atoms and molecules. Instead of impractical-to-calculate velocity, momentum, and acceleration for every molecule in the fluid, Langevin’s method substituted an effective friction to damp the motion of the particle, plus a series of random jolts.
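To make this concrete, here is a minimal toy sketch of my own (in Python; it is not the researchers’ code): a single particle in a harmonic well advanced by exactly the recipe above, a frictional drift plus a random thermal jolt at each discrete time step. The exact Boltzmann average of the potential energy is 0.5 kT; with a finite time step the simulated average comes out higher, and the bias grows with the step size.

import numpy as np

# Toy discrete-time (overdamped) Langevin simulation: one particle in a
# harmonic well U(x) = 0.5 * k * x^2, advanced by a frictional drift plus a
# random thermal jolt each step (Euler-Maruyama scheme). In continuous time
# the Boltzmann average is <U> = 0.5 * kT; the finite time step biases the
# simulated average upward, and the bias grows with the step size.

def average_potential_energy(dt, n_steps=200_000, k=1.0, kT=1.0, gamma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = 0.0
    jolt_scale = np.sqrt(2.0 * kT * dt / gamma)    # size of the random thermal jolts
    total = 0.0
    for _ in range(n_steps):
        x += (-k * x / gamma) * dt                 # drift: force damped by friction
        x += jolt_scale * rng.standard_normal()    # random jolt from the heat bath
        total += 0.5 * k * x * x
    return total / n_steps

for dt in (0.01, 0.1, 0.5):
    print(f"dt = {dt}: <U> = {average_potential_energy(dt):.3f}  (exact: 0.500)")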

When Sivak and his colleagues used Langevin dynamics to model the behavior of molecular machines, they saw significant differences between what their exact theories predicted and what their simulations produced. They tried to come up with a physical picture of what it would take to produce these wrong answers.

“It was as if extra work were being done to push our molecules around,” Sivak says. “In the real world, this would be a driven physical process, but it existed only in the simulation, so we called it ‘shadow work.’ It took exactly the form of a nonequilibrium driving force.”

They first tested this insight with “toy” models having only a single degree of freedom, and found that when they ignored the shadow work, the calculations were systematically biased. But when they accounted for the shadow work, accurate calculations could be recovered.

“Next we looked at systems with hundreds or thousands of simple molecules,” says Sivak. Using models of water molecules in a box, they simulated the state of the system over time, starting from a given thermal energy but with no “pushing” from outside. “We wanted to know how far the water simulation would be pushed by the shadow work alone.”

The result confirmed that even in the absence of an explicit driving force, the finite-time-step Langevin dynamics simulation acted by itself as a driving nonequilibrium process. Systematic errors resulted from failing to separate this shadow work from the actual “protocol work” that they explicitly modeled in their simulations. For the first time, Sivak and his colleagues were able to quantify the magnitude of the deviations in various test systems.

Such simulation errors can be reduced in several ways, for example by dividing the evolution of the system into ever-finer time steps, because the shadow work is larger when the discrete time steps are larger. But doing so increases the computational expense.

The better approach is to use a correction factor that isolates the shadow work from the physically meaningful work, says Sivak. “We can apply results from our calculation in a meaningful way to characterize the error and correct for it, separating the physically realistic aspects of the simulation from the artifacts of the computer method.”
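There is also a textbook remedy that differs from the fluctuation-theorem correction Sivak describes: “Metropolize” the discrete Langevin step, accepting or rejecting each proposed move so that the chain samples the exact Boltzmann distribution at any step size, at the cost of occasional rejections. Here is a minimal sketch, again my own and only for the toy harmonic well above; it is meant to show that the bias is an artifact of the uncorrected discrete step, not a statement about the paper’s method.

import numpy as np

# Metropolis-adjusted version of the same toy Langevin step. Each proposed
# step is accepted or rejected with the standard Metropolis-Hastings test, so
# the chain samples the exact Boltzmann distribution and the average potential
# energy stays at 0.5 kT (up to Monte Carlo noise) even for large time steps.

def metropolized_average_energy(dt, n_steps=200_000, k=1.0, kT=1.0, gamma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    D = kT / gamma                                   # diffusion coefficient
    beta = 1.0 / kT
    U = lambda z: 0.5 * k * z * z                    # potential energy
    f = lambda z: -k * z                             # force
    drift = lambda z: z + beta * D * f(z) * dt       # deterministic part of a step
    log_q = lambda b, a: -(b - drift(a)) ** 2 / (4.0 * D * dt)  # proposal log density, a -> b

    x, total = 0.0, 0.0
    for _ in range(n_steps):
        proposal = drift(x) + np.sqrt(2.0 * D * dt) * rng.standard_normal()
        log_accept = (-beta * (U(proposal) - U(x))
                      + log_q(x, proposal) - log_q(proposal, x))
        if np.log(rng.random()) < log_accept:        # Metropolis-Hastings acceptance test
            x = proposal
        total += U(x)
    return total / n_steps

for dt in (0.01, 0.1, 0.5):
    print(f"dt = {dt}: <U> = {metropolized_average_energy(dt):.3f}  (exact: 0.500)")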

You can find out more in the Berkeley Lab news release, or (H/T) in the Jan. 3, 2013 news item on Nanowerk, or you can read the paper,

“Using nonequilibrium fluctuation theorems to understand and correct errors in equilibrium and nonequilibrium discrete Langevin dynamics simulations,” by David A. Sivak, John D. Chodera, and Gavin E. Crooks, will appear in Physical Review X (http://prx.aps.org/) and is now available as an arXiv preprint at http://arxiv.org/abs/1107.2967.

This casts a new light on the SPAUN (Semantic Pointer Architecture Unified Network) project from Chris Eliasmith’s team at the University of Waterloo, which announced the most successful attempt yet (my Nov. 29, 2012 posting) to simulate a brain using virtual neurons. Given the probability that Eliasmith’s team was not aware of this work from Berkeley Lab, one imagines that once it has been integrated, SPAUN will be capable of even more extraordinary feats.

Space-time crystals and everlasting clocks

Apparently, a space-time crystal could be useful for such things as studying the many-body problem in physics. Since I hadn’t realized the many-body problem existed and have no idea how this might affect me or anyone else, I will have to take the utility of a space-time crystal on trust. As for the possibility of an everlasting clock, how will I ever know the truth since I’m not everlasting?

The Sept. 24, 2012 news item on Nanowerk about a new development makes the space-time crystal sound quite fascinating,

Imagine a clock that will keep perfect time forever, even after the heat-death of the universe. This is the “wow” factor behind a device known as a “space-time crystal,” a four-dimensional crystal that has periodic structure in time as well as space. However, there are also practical and important scientific reasons for constructing a space-time crystal. With such a 4D crystal, scientists would have a new and more effective means by which to study how complex physical properties and behaviors emerge from the collective interactions of large numbers of individual particles, the so-called many-body problem of physics. A space-time crystal could also be used to study phenomena in the quantum world, such as entanglement, in which an action on one particle impacts another particle even if the two particles are separated by vast distances. [emphasis mine]

While I’m most interested in the possibility of studying entanglement, it seems to me the scientists are guessing, since the verb ‘could’ is used here where they used ‘would’ earlier for studying the many-body problem.

The Sept. 24, 2012 news release by Lynn Yarris for the Lawrence Berkeley National Laboratory (Berkeley Lab), which originated the news item, provides detail on the latest space-time crystal development,

A space-time crystal, however, has only existed as a concept in the minds of theoretical scientists with no serious idea as to how to actually build one – until now. An international team of scientists led by researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) has proposed the experimental design of a space-time crystal based on an electric-field ion trap and the Coulomb repulsion of particles that carry the same electrical charge.

“The electric field of the ion trap holds charged particles in place and Coulomb repulsion causes them to spontaneously form a spatial ring crystal,” says Xiang Zhang, a faculty scientist with Berkeley Lab’s Materials Sciences Division who led this research. “Under the application of a weak static magnetic field, this ring-shaped ion crystal will begin a rotation that will never stop. The persistent rotation of trapped ions produces temporal order, leading to the formation of a space-time crystal at the lowest quantum energy state.”

Because the space-time crystal is already at its lowest quantum energy state, its temporal order – or timekeeping – will theoretically persist even after the rest of our universe reaches entropy, thermodynamic equilibrium or “heat-death.”

This new development builds on some work done earlier this year at the Massachusetts Institute of Technology (MIT), from the Yarris news release,

The concept of a crystal that has discrete order in time was proposed earlier this year by Frank Wilczek, the Nobel-prize winning physicist at the Massachusetts Institute of Technology. While Wilczek mathematically proved that a time crystal can exist, how to physically realize such a time crystal was unclear. Zhang and his group, who have been working on issues with temporal order in a different system since September 2011, have come up with an experimental design to build a crystal that is discrete both in space and time – a space-time crystal.

Traditional crystals are 3D solid structures made up of atoms or molecules bonded together in an orderly and repeating pattern. Common examples are ice, salt and snowflakes. Crystallization takes place when heat is removed from a molecular system until it reaches its lower energy state. At a certain point of lower energy, continuous spatial symmetry breaks down and the crystal assumes discrete symmetry, meaning that instead of the structure being the same in all directions, it is the same in only a few directions.

“Great progress has been made over the last few decades in exploring the exciting physics of low-dimensional crystalline materials such as two-dimensional graphene, one-dimensional nanotubes, and zero-dimensional buckyballs,” says Tongcang Li, lead author of the PRL paper and a post-doc in Zhang’s research group. “The idea of creating a crystal with dimensions higher than that of conventional 3D crystals is an important conceptual breakthrough in physics and it is very exciting for us to be the first to devise a way to realize a space-time crystal.”

Just as a 3D crystal is configured at the lowest quantum energy state when continuous spatial symmetry is broken into discrete symmetry, so too is symmetry breaking expected to configure the temporal component of the space-time crystal. Under the scheme devised by Zhang and Li and their colleagues, a spatial ring of trapped ions in persistent rotation will periodically reproduce itself in time, forming a temporal analog of an ordinary spatial crystal. With a periodic structure in both space and time, the result is a space-time crystal.

Here’s an image created by the team at Berkeley Lab to represent their work on the space-time crystal,

Imagine a clock that will keep perfect time forever or a device that opens new dimensions into quantum phenomena such as emergence and entanglement. (courtesy of Xiang Zhang group[?] at Berkeley Lab)

For anyone who’s interested in this work, I suggest reading either the news item on Nanowerk or the Berkeley Lab news release in full. I will leave you with Natalie Cole and Everlasting Love,

Flipping chirality at the Lawrence Berkeley National Laboratory

First, it might be a good idea to define chirality. From the Lawrence Berkeley National Laboratory (Berkeley Lab) July 10, 2012 news release by Lynn Yarris,

Chirality is the distinct left/right orientation or “handedness” of some types of molecules, meaning the molecule can take one of two mirror image forms. The right-handed and left-handed forms of such molecules, called “enantiomers,” can exhibit strikingly different properties. For example, one enantiomer of the chiral molecule limonene smells of lemon, the other smells of orange. The ability to observe or even switch the chirality of molecules using terahertz (trillion-cycles-per-second) electromagnetic radiation is a much coveted asset in the world of high technology.

As for why anyone would want to flip molecules back and forth between left- and right-handedness (from the news release),

A multi-institutional team of researchers that included scientists with the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has created the first artificial molecules whose chirality can be rapidly switched from a right-handed to a left-handed orientation with a beam of light. This holds potentially important possibilities for the application of terahertz technologies across a wide range of fields, including reduced energy use for data-processing, homeland security and ultrahigh-speed communications.

Here’s how the technique works, from the July 10, 2012 news item on physorg.com,

Working with terahertz (THz) metamaterials engineered from nanometer-sized gold strips with air as the dielectric – Zhang [Xiang Zhang, one of the leaders of this research and a principal investigator with Berkeley Lab’s Materials Sciences Division] and his colleagues fashioned a delicate artificial chiral molecule which they then incorporated with a photoactive silicon medium. Through photoexcitation of their metamolecules with an external beam of light, the researchers observed handedness flipping in the form of circularly polarized emitted THz light. Furthermore, the photoexcitation enabled this chirality flipping and the circular polarization of THz light to be dynamically controlled.

“In contrast to previous demonstrations where chirality was merely switched on or off in metamaterials using photoelectric stimulation, we used an optical switch to actually reverse the chirality of our THz metamolecules,” Zhang says.

The researchers describe in more detail the potential for this new technique,

“The observed giant switchable chirality we can engineer into our metamaterials provides a viable approach towards creating high performance polarimetric devices that are largely not available at terahertz frequencies,” says corresponding author Antoinette Taylor. “This frequency range is particularly interesting because it uniquely reveals information about physical phenomena such as the interactions between or within biologically relevant molecules, and may enable control of electronic states in novel material systems, such as cyclotron resonances in graphene and topological insulators.”

Taylor and her co-authors say that the general design principle of their optically switchable chiral THz metamolecules is not limited to handedness switching but could also be applied to the dynamic reversing of other electromagnetic properties.

From what I understand, metamaterials are very expensive and difficult to produce, which means this exciting advance is likely to remain in the laboratory for at least 10 years.

Carbon sequestration (capturing carbon dioxide) and Zeo++

Carbon capture has been proposed as a way to mitigate global climate change, and Zeo++ is a software package that promises to help in the search for porous materials that will filter out carbon dioxide (capture carbon) before it reaches the atmosphere. From the Mar. 1, 2012 news item on Nanowerk,

Approximately 75 percent of electricity used in the United States is produced by coal-burning power plants that spew carbon dioxide (CO2) into the atmosphere and contribute to global warming. To reduce this effect, many researchers are searching for porous materials to filter out the CO2 generated by these plants before it reaches the atmosphere, a process commonly known as carbon capture. But identifying these materials is easier said than done.

“There are a number of porous substances—including crystalline porous materials, such as zeolites, and metal-organic frameworks—that could be used to capture carbon dioxide from power plant emissions,” says Maciej Haranczyk, a scientist in the Lawrence Berkeley National Laboratory’s (Berkeley Lab) Computational Research Division.

In the category of zeolites alone, Haranczyk notes that there are around 200 known materials and 2.5 million structures predicted by computational methods. That’s why Haranczyk and colleagues have developed a computational tool that can help researchers sort through vast databases of porous materials to identify promising carbon capture candidates—and at record speeds. They call it Zeo++.

Here’s a description of the software from the Zeo++ home page,

Zeo++ is a software package for analysis of crystalline porous materials. Zeo++ can be used to perform geometry-based analysis of structure and topology of the void space inside a material, to alter structures, as well as to generate structure representations to be used in structure similarity calculations. Zeo++ can be used to either analyze a single structure or perform high-throughput analysis of a large database.

Here’s what the scientists are trying to determine when they use the software to analyze the proposed carbon capture materials (from the news item),

Porous materials like zeolites or metal organic frameworks come in a variety of shapes and have a range of pore sizes. It is actually the shape and pore sizes that determine which molecules get absorbed into the material and which ones pass through.

Like molecular sponges, porous materials can also be reused in a cycle of capture and release. For instance, in the case of carbon capture, once the material is saturated and cannot absorb any more CO2, the gas can be extracted, and the cycle repeated.

“Understanding how all of these factors combine to effectively capture carbon is a challenge,” says Richard Luis Martin, a member of the Zeo++ development team and a postdoctoral researcher in Berkeley Lab’s Computational Research Division. “Until Zeo++, there were no easy methods for analyzing such large numbers of material structures and identifying what makes a material an outstanding carbon catcher.”

He notes that siliceous zeolites, to take one example, are composed of the same tetrahedral blocks of silicon and oxygen atoms, but the geometric arrangement of these blocks differs from one zeolite to the next, and this configuration is what determines how CO2 or any other molecule will interact with the porous material.

Before Zeo++, scientists would typically characterize a porous structure based on a single feature, like the size of its largest pore or its total volume of empty space, then compare and categorize it based on this single observation.

“The problem with this one-dimensional description is that it does not tell you anything about how a molecule like CO2 will move through the material,” says Martin. “To identify the most effective materials for absorbing CO2, we need to understand the porous structure from the perspective of the penetrating molecule.”

This is precisely why Zeo++ characterizes these structures by mapping the empty spaces between their atoms. Drawing from a database of the coordinates of all the atoms in each porous structure, Zeo++ generates a 3D map of the voids in each material. This 3D network allows researchers to see where the channels between atoms intersect to create cavities. The size and shape of these cavities determine whether a molecule will pass through the system or be absorbed.

“Zeo++ allows us to do things that would otherwise be physically impossible,” says Smit [Berend Smit leads the Energy Frontier Research Center for Gas Separations Relevant to Clean Energy Technologies at the University of California at Berkeley], whose group is developing laboratory and computational methods for identifying carbon dioxide-absorbing nanomaterials.
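As a toy illustration of that void-mapping idea (this is not Zeo++ itself, and every name and number below is made up for the example), one can lay a grid over a box of atoms and mark the points a spherical probe molecule could occupy without overlapping any atom; the fraction of such points is a crude accessible void fraction, and connected runs of them trace the channels.

import numpy as np

# Toy void mapping, in the spirit of the description above (not the Zeo++ code):
# mark the grid points of a periodic box that lie farther than (atom radius +
# probe radius) from every atom. A roughly CO2-sized probe radius is used; all
# values are illustrative.

def void_fraction(atom_positions, atom_radii, box_length, probe_radius=1.6, n_grid=40):
    axis = np.linspace(0.0, box_length, n_grid, endpoint=False)
    gx, gy, gz = np.meshgrid(axis, axis, axis, indexing="ij")
    grid = np.stack([gx.ravel(), gy.ravel(), gz.ravel()], axis=1)    # (n_grid**3, 3) points

    accessible = np.ones(len(grid), dtype=bool)
    for pos, radius in zip(atom_positions, atom_radii):
        delta = grid - pos
        delta -= box_length * np.round(delta / box_length)           # minimum-image convention
        accessible &= np.linalg.norm(delta, axis=1) > (radius + probe_radius)
    return accessible.mean()                                         # fraction of probe-accessible points

# A made-up "framework": 20 atoms scattered in a 10 x 10 x 10 angstrom box.
rng = np.random.default_rng(42)
positions = rng.uniform(0.0, 10.0, size=(20, 3))
radii = np.full(20, 1.35)                                            # illustrative atomic radius
print(f"Accessible void fraction: {void_fraction(positions, radii, box_length=10.0):.2f}")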

For anyone who’s curious about zeolites, I’ve excerpted this from an essay on Wikipedia (all notes and links have been removed),

Zeolites are microporous, aluminosilicate minerals commonly used as commercial adsorbents. The term zeolite was originally coined in 1756 by Swedish mineralogist Axel Fredrik Cronstedt, who observed that upon rapidly heating the material stilbite, it produced large amounts of steam from water that had been adsorbed by the material. Based on this, he called the material zeolite, from the Greek ζέω (zéo̱), meaning “to boil” and λίθος (líthos), meaning “stone”.

I first mentioned zeolite on this blog in a July 1, 2010 posting about ‘green’ nanotechnology in Alberta’s oil sands.

Abracadabra! A new material!

A Nov. 3, 2011 news release from the US Dept. of Energy (DOE) announced the Materials Project,

Researchers from the Department of Energy’s (DOE’s) Lawrence Berkeley National Laboratory (Berkeley Lab) and the Massachusetts Institute of Technology (MIT) jointly launched today a groundbreaking new online tool called the Materials Project, which operates like a “Google” of material properties, enabling scientists and engineers from universities, national laboratories and private industry to accelerate the development of new materials, including critical materials.

Discovering new materials and strengthening the properties of existing materials are key to improving just about everything humans use – from buildings and highways to modern necessities. For example, advances in a group of materials called “critical materials” are more important to America’s competitiveness than ever before – particularly in the clean energy field.  Cell phones, wind turbines, solar panels and a variety of military technologies depend on these roughly fourteen elements (including nine “rare earth” elements).  With about 90 percent coming from China, there are growing concerns about potential supply shortages and disruptions.

The Dec. 20, 2011 news item on Nanowerk provides more detail,

The project is a direct outgrowth of MIT’s Materials Genome Project, initiated in 2006 by Gerbrand Ceder, the Richard P. Simmons (1953) Professor of Materials Science and Engineering. The idea, he says, is that the site “would become the Google of material properties,” making available data previously scattered in many different places, most of them not even searchable.

For example, it used to require months of work — consulting tables of data, performing calculations and carrying out precise lab tests — to create a single phase diagram showing when compounds incorporating several different elements would be solid, liquid or gas. Now, such a diagram can be generated in a matter of minutes, Ceder says.

The new tool could revolutionize product development in fields from energy to electronics to biochemistry, its developers say, much as search engines have transformed the ability to search for arcane bits of knowledge.

Already, more than 500 researchers from universities, research labs and companies have used the new system to seek new materials for lithium-ion batteries, photovoltaic cells and new lightweight alloys for use in cars, trucks and airplanes. The Materials Project is available for use by anyone, although users must register (free of charge) in order to spend more than a few minutes, or to use the most advanced features.
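For a sense of how the phase-diagram example above works in practice today: the Materials Project exposes its data through a web API, and the pymatgen library that grew out of the project can pull the computed entries for a chemical system and assemble the diagram in a few lines. The sketch below is mine, not from the news item; it assumes a free API key from materialsproject.org, and exact method names may differ between pymatgen versions.

# Sketch of building a Li-Fe-O phase diagram from Materials Project data using
# pymatgen; assumes a registered (free) API key and a recent pymatgen release.
from pymatgen.ext.matproj import MPRester
from pymatgen.analysis.phase_diagram import PhaseDiagram, PDPlotter

with MPRester("YOUR_API_KEY") as mpr:                        # key from materialsproject.org
    entries = mpr.get_entries_in_chemsys(["Li", "Fe", "O"])  # all computed Li-Fe-O compounds

diagram = PhaseDiagram(entries)                 # thermodynamic hull from the computed energies
for entry in diagram.stable_entries:            # compounds predicted to be stable
    print(entry.composition.reduced_formula)

PDPlotter(diagram).show()                       # draw the ternary Li-Fe-O phase diagram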

Interestingly, the Materials Project could have an impact on education too,

The tools could also make a big difference in education, Ceder says: When professors set up experiments to help students learn specific principles, “it used to be that we had to pick easy examples” with known outcomes, he says. Now, it’s possible to set much more challenging exercises.

I wasn’t expecting to find a quote from a Canadian academic but here goes,

Mark Obrovac, an associate professor of chemistry and physics at Dalhousie University in Nova Scotia, says, “The Materials Project has made complex computational techniques available to materials researchers at a click of a mouse. This is a major innovation in materials science, enabling researchers to rapidly predict the structure and properties of materials before they make them, and even of materials that cannot be made. This can significantly accelerate materials development in many important areas, including advanced batteries, microelectronics and telecommunications.”

You can find the Materials Project here.