Tag Archives: Berkeley Lab

Interplanetary invaders (dust particles) may be a delivery system for water and organics to Earth

Researchers at the University of Hawaii and their colleagues at other institutions have determined that interplanetary dust particles (IDPs) can deliver solar wind-generated water in addition to the organics they are known to carry, according to a Jan. 24, 2014 news item on ScienceDaily,

Researchers from the University of Hawaii — Manoa (UHM) School of Ocean and Earth Science and Technology (SOEST), Lawrence Livermore National Laboratory, Lawrence Berkeley National Laboratory, and University of California — Berkeley discovered that interplanetary dust particles (IDPs) could deliver water and organics to Earth and other terrestrial planets.

Interplanetary dust, dust that has come from comets, asteroids, and leftover debris from the birth of the solar system, continually rains down on Earth and other Solar System bodies. These particles are bombarded by solar wind, predominantly hydrogen ions. This ion bombardment knocks the atoms out of order in the silicate mineral crystal and leaves behind oxygen that is more available to react with hydrogen, for example, to create water molecules.

“It is a thrilling possibility that this influx of dust has acted as a continuous rainfall of little reaction vessels containing both the water and organics needed for the eventual origin of life on Earth and possibly Mars,” said Hope Ishii, new Associate Researcher in the Hawaii Institute of Geophysics and Planetology (HIGP) at UHM SOEST and co-author of the study. This mechanism of delivering both water and organics simultaneously would also work for exoplanets, worlds that orbit other stars. These raw ingredients of dust and hydrogen ions from their parent star would allow the process to happen in almost any planetary system.

The Jan. 24, 2014 University of Hawaii news release (also on EurekAlert), which originated the news item, describes the implications of the research,

Implications of this work are potentially huge: Airless bodies in space such as asteroids and the Moon, with ubiquitous silicate minerals, are constantly being exposed to solar wind irradiation that can generate water. In fact, this mechanism of water formation would help explain remotely sensed data of the Moon, in which OH and preliminary evidence of water were detected, and possibly explains the source of water ice in permanently shadowed regions of the Moon.

“Perhaps more exciting,” said Hope Ishii, Associate Researcher in HIGP and co-author of the study, “interplanetary dust, especially dust from primitive asteroids and comets, has long been known to carry organic carbon species that survive entering the Earth’s atmosphere, and we have now demonstrated that it also carries solar-wind-generated water. So we have shown for the first time that water and organics can be delivered together.”

The news release provides some background information and a few details about how the research was conducted,

It has been known since the Apollo era, when astronauts brought back rocks and soil from the Moon, that solar wind causes the chemical makeup of the dust’s surface layer to change. Hence, the idea that solar wind irradiation might produce water species has been around since then, but whether it actually does produce water has been debated. The reasons for the uncertainty are that the amount of water produced is small and it is localized in very thin rims on the surfaces of silicate minerals, so that older analytical techniques were unable to confirm the presence of water.

Using a state-of-the-art transmission electron microscope, the scientists have now actually detected water produced by solar-wind irradiation in the space-weathered rims on silicate minerals in interplanetary dust particles. Further, on the basis of laboratory-irradiated minerals that have similar amorphous rims, they were able to conclude that the water forms from the interaction of solar wind hydrogen ions (H+) with oxygen in the silicate mineral grains.

This recent work does not suggest how much water may have been delivered to Earth in this manner from IDPs.

“In no way do we suggest that it was sufficient to form oceans, for example,” said Ishii. “However, the relevance of our work is not the origin of the Earth’s oceans but that we have shown continuous, co-delivery of water and organics intimately intermixed.”

Here’s a citation for the paper and a link to the abstract,

Detection of solar wind-produced water in irradiated rims on silicate minerals by John Bradley, Hope Ishii, Jeffrey Gillis-Davis, James Ciston, Michael Nielsen, Hans Bechtel, and Michael Martin. Proceedings of the National Academy of Sciences, doi: 10.1073/pnas.1320115111

I believe this paper is behind a paywall.

Cooling it—an application using carbon nanotubes and a theory that hotter leads to cooler

The only thing these two news items have in common is their focus on cooling down electronic devices. Well, there’s also the fact that the work is being done at the nanoscale.

First, there’s a Jan. 23, 2014 news item on Azonano about a technique using carbon nanotubes to cool down microprocessors,

“Cool it!” That’s a prime directive for microprocessor chips and a promising new solution to meeting this imperative is in the offing. Researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) have developed a “process friendly” technique that would enable the cooling of microprocessor chips through carbon nanotubes.

Frank Ogletree, a physicist with Berkeley Lab’s Materials Sciences Division, led a study in which organic molecules were used to form strong covalent bonds between carbon nanotubes and metal surfaces. This improved by six-fold the flow of heat from the metal to the carbon nanotubes, paving the way for faster, more efficient cooling of computer chips. The technique is done through gas vapor or liquid chemistry at low temperatures, making it suitable for the manufacturing of computer chips.

The Jan. 22, 2014 Berkeley Lab news release (also on EurekAlert), which originated the news item, describes the nature  of the problem in more detail,

Overheating is the bane of microprocessors. As transistors heat up, their performance can deteriorate to the point where they no longer function as transistors. With microprocessor chips becoming more densely packed and processing speeds continuing to increase, the overheating problem looms ever larger. The first challenge is to conduct heat out of the chip and onto the circuit board where fans and other techniques can be used for cooling. Carbon nanotubes have demonstrated exceptionally high thermal conductivity but their use for cooling microprocessor chips and other devices has been hampered by high thermal interface resistances in nanostructured systems.

“The thermal conductivity of carbon nanotubes exceeds that of diamond or any other natural material but because carbon nanotubes are so chemically stable, their chemical interactions with most other materials are relatively weak, which makes for  high thermal interface resistance,” Ogletree says. “Intel came to the Molecular Foundry wanting to improve the performance of carbon nanotubes in devices. Working with Nachiket Raravikar and Ravi Prasher, who were both Intel engineers when the project was initiated, we were able to increase and strengthen the contact between carbon nanotubes and the surfaces of other materials. This reduces thermal resistance and substantially improves heat transport efficiency.”

The news release then describes the proposed solution,

Sumanjeet Kaur, lead author of the Nature Communications paper and an expert on carbon nanotubes, with assistance from co-author and Molecular Foundry chemist Brett Helms, used reactive molecules to bridge the carbon nanotube/metal interface – aminopropyl-trialkoxy-silane (APS) for oxide-forming metals, and cysteamine for noble metals. First, vertically aligned carbon nanotube arrays were grown on silicon wafers, and thin films of aluminum or gold were evaporated on glass microscope cover slips. The metal films were then “functionalized” and allowed to bond with the carbon nanotube arrays. Enhanced heat flow was confirmed using a characterization technique developed by Ogletree that allows for interface-specific measurements of heat transport.

“You can think of interface resistance in steady-state heat flow as being an extra amount of distance the heat has to flow through the material,” Kaur says. “With carbon nanotubes, thermal interface resistance adds something like 40 microns of distance on each side of the actual carbon nanotube layer. With our technique, we’re able to decrease the interface resistance so that the extra distance is around seven microns at each interface.”
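Kaur’s “extra distance” picture lends itself to a quick back-of-the-envelope calculation. Here’s a minimal sketch; the 40 and 7 micron per-interface figures come from the quote above, while the 30 micron nanotube layer thickness is my own assumption for illustration only,

```python
# Back-of-the-envelope sketch of the "extra distance" picture of
# thermal interface resistance. The 40 um and 7 um per-interface
# values are from the article; the 30 um nanotube layer thickness
# is a hypothetical figure chosen for illustration.
layer_um = 30.0                    # assumed CNT array thickness (hypothetical)
before_um = layer_um + 2 * 40.0    # effective path with untreated interfaces
after_um = layer_um + 2 * 7.0      # effective path after functionalization

print(f"effective path before: {before_um:.0f} um")
print(f"effective path after:  {after_um:.0f} um")
print(f"reduction in effective thermal path: {before_um / after_um:.1f}x")
```

Under that (assumed) layer thickness, the functionalization cuts the effective thermal path from 110 to 44 microns, a 2.5-fold improvement; the exact factor depends on how thick the actual nanotube layer is.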

Although the approach used by Ogletree, Kaur and their colleagues substantially strengthened the contact between a metal and individual carbon nanotubes within an array, a majority of the nanotubes within the array may still fail to connect with the metal. The Berkeley team is now developing a way to improve the density of carbon nanotube/metal contacts. Their technique should also be applicable to single and multi-layer graphene devices, which face the same cooling issues.

For anyone who’s never heard of the Molecular Foundry before (from the news release),

The Molecular Foundry is one of five DOE [Department of Energy] Nanoscale Science Research Centers (NSRCs), national user facilities for interdisciplinary research at the nanoscale, supported by the DOE Office of Science. Together the NSRCs comprise a suite of complementary facilities that provide researchers with state-of-the-art capabilities to fabricate, process, characterize, and model nanoscale materials, and constitute the largest infrastructure investment of the National Nanotechnology Initiative. The NSRCs are located at DOE’s Argonne, Brookhaven, Lawrence Berkeley, Oak Ridge and Sandia and Los Alamos national laboratories.

My second item comes from the University at Buffalo (UB) in the US. From a Jan. 21, 2014 University at Buffalo news release by Cory Nealon (also on EurekAlert),

Heat in electronic devices is generated by the movement of electrons through transistors, resistors and other elements of an electrical network. Depending on the network, there are a variety of ways, such as cooling fans and heat sinks, to prevent the circuits from overheating.

But as more integrated circuits and transistors are added to devices to boost their computing power, it’s becoming more difficult to keep those elements cool. Most nanoelectronics research centers are working to develop advanced materials that are capable of withstanding the extreme environment inside smartphones, laptops and other devices.

While advanced materials show tremendous potential, the UB research suggests there may still be room within the existing paradigm of electronic devices to continue developing more powerful computers.

To support their findings, the researchers fabricated nanoscale semiconductor devices in a state-of-the-art gallium arsenide crystal provided to UB by Sandia’s Reno [John L. Reno, Center for Integrated Nanotechnologies at Sandia National Laboratories]. The researchers then subjected the chip to a large voltage, squeezing an electrical current through the nanoconductors. This, in turn, increased the amount of heat circulating through the chip’s nanotransistor.

But instead of degrading the device, the nanotransistor spontaneously transformed itself into a quantum state that was protected from the effect of heating and provided a robust channel of electric current. To help explain, Bird [Jonathan Bird, UB professor of electrical engineering] offered an analogy to Niagara Falls.

“The water, or energy, comes from a source; in this case, the Great Lakes. It’s channeled into a narrow point (the Niagara River) and ultimately flows over Niagara Falls. At the bottom of the waterfall is dissipated energy. But unlike the waterfall, this dissipated energy recirculates throughout the chip and changes how heat affects, or in this case doesn’t affect, the network’s operation.”

While this behavior may seem unusual, especially conceptualizing it in terms of water flowing over a waterfall, it is the direct result of the quantum mechanical nature of electronics when viewed on the nanoscale. The current is made up of electrons which spontaneously organize to form a narrow conducting filament through the nanoconductor. It is this filament that is so robust against the effects of heating.

“We’re not actually eliminating the heat, but we’ve managed to stop it from affecting the electrical network. In a way, this is an optimization of the current paradigm,” said Han [J. E. Han, UB Dept. of Physics], who developed the theoretical models which explain the findings.

What an interesting and counter-intuitive approach to managing the heat in our devices.

For those who want more, here’s a link to and citation for the carbon nanotube paper,

Enhanced thermal transport at covalently functionalized carbon nanotube array interfaces by Sumanjeet Kaur, Nachiket Raravikar, Brett A. Helms, Ravi Prasher, & D. Frank Ogletree. Nature Communications 5, Article number: 3082 doi:10.1038/ncomms4082 Published 22 January 2014

This paper is behind a paywall.

Now here’s a link to and a citation for the ‘making it hotter to make it cooler’ paper,

Formation of a protected sub-band for conduction in quantum point contacts under extreme biasing by J. Lee, J. E. Han, S. Xiao, J. Song, J. L. Reno, & J. P. Bird. Nature Nanotechnology (2014) doi:10.1038/nnano.2013.297 Published online 19 January 2014

This paper is behind a paywall although there is an option to preview it for free via ReadCube Access.

Get yourself some e-whiskers for improved tactile sensing

E-whiskers are highly responsive tactile sensor networks made from carbon nanotubes and silver nanoparticles that resemble the whiskers of cats and other mammals. Courtesy: Berkeley Lab [downloaded from http://newscenter.lbl.gov/science-shorts/2014/01/20/e-whiskers/]

A Jan. 21, 2014 news item on Azonano features work from researchers who have mimicked the sensitivity of cats’ and rats’ whiskers by creating e-whiskers,

Researchers with Berkeley Lab and the University of California (UC) Berkeley have created tactile sensors from composite films of carbon nanotubes and silver nanoparticles similar to the highly sensitive whiskers of cats and rats. These new e-whiskers respond to pressure as slight as a single Pascal, about the pressure exerted on a table surface by a dollar bill. Among their many potential applications is giving robots new abilities to “see” and “feel” their surrounding environment.
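The dollar-bill comparison checks out with some quick arithmetic. A hedged sketch, assuming nominal US banknote figures (roughly 1 gram, 156.1 mm by 66.3 mm),

```python
# Sanity check of the "one Pascal ~ a dollar bill on a table" comparison.
# The bill's mass (~1 g) and dimensions (156.1 mm x 66.3 mm) are nominal
# US banknote figures, used here as assumptions for illustration.
g = 9.81                      # gravitational acceleration, m/s^2
mass_kg = 0.001               # ~1 gram
area_m2 = 0.1561 * 0.0663     # bill footprint in m^2

pressure_pa = mass_kg * g / area_m2   # pressure = weight / area
print(f"pressure under a flat dollar bill: {pressure_pa:.2f} Pa")
```

Running this gives roughly 0.95 Pa, consistent with the “single Pascal” figure in the news item.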

The Jan. 20, 2014 Lawrence Berkeley National Laboratory (Berkeley Lab) ‘science short’ by Lynn Yarris, which originated the news item,  provides more information about the research,

“Whiskers are hair-like tactile sensors used by certain mammals and insects to monitor wind and navigate around obstacles in tight spaces,” says the leader of this research, Ali Javey, a faculty scientist in Berkeley Lab’s Materials Sciences Division and a UC Berkeley professor of electrical engineering and computer science. “Our electronic whiskers consist of high-aspect-ratio elastic fibers coated with conductive composite films of nanotubes and nanoparticles. In tests, these whiskers were 10 times more sensitive to pressure than all previously reported capacitive or resistive pressure sensors.”

Javey and his research group have been leaders in the development of e-skin and other flexible electronic devices that can interface with the environment. In this latest effort, they used a carbon nanotube paste to form an electrically conductive network matrix with excellent bendability. To this carbon nanotube matrix they loaded a thin film of silver nanoparticles that endowed the matrix with high sensitivity to mechanical strain.

“The strain sensitivity and electrical resistivity of our composite film is readily tuned by changing the composition ratio of the carbon nanotubes and the silver nanoparticles,” Javey says. “The composite can then be painted or printed onto high-aspect-ratio elastic fibers to form e-whiskers that can be integrated with different user-interactive systems.”

Javey notes that the use of elastic fibers with a small spring constant as the structural component of the whiskers provides large deflection and therefore high strain in response to the smallest applied pressures. As proof-of-concept, he and his research group successfully used their e-whiskers to demonstrate highly accurate 2D and 3D mapping of wind flow. In the future, e-whiskers could be used to mediate tactile sensing for the spatial mapping of nearby objects, and could also lead to wearable sensors for measuring heartbeat and pulse rate.

“Our e-whiskers represent a new type of highly responsive tactile sensor networks for real time monitoring of environmental effects,” Javey says. “The ease of fabrication, light weight and excellent performance of our e-whiskers should have a wide range of applications for advanced robotics, human-machine user interfaces, and biological applications.”

The researchers’ paper has been published in the Proceedings of the National Academy of Sciences and is titled: “Highly sensitive electronic whiskers based on patterned carbon nanotube and silver nanoparticle composite films.”

Here’s what the e-whiskers look like,

An array of seven vertically placed e-whiskers was used for 3D mapping of the wind by Ali Javey and his group [Kuniharu Takei, Zhibin Yu, Maxwell Zheng, Hiroki Ota and Toshitake Takahashi]. Courtesy: Berkeley Lab

Finding a successor to graphene

The folks at the Lawrence Berkeley National Laboratory (Berkeley Lab) have announced a ‘natural’ 3D counterpart of graphene in a Jan. 16, 2014 Berkeley Lab news release (also on EurekAlert and on Azonano dated Jan. 17, 2014),

The discovery of what is essentially a 3D version of graphene – the 2D sheets of carbon through which electrons race at many times the speed at which they move through silicon – promises exciting new things to come for the high-tech industry, including much faster transistors and far more compact hard drives. A collaboration of researchers at the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) has discovered that sodium bismuthate can exist as a form of quantum matter called a three-dimensional topological Dirac semi-metal (3DTDS). This is the first experimental confirmation of 3D Dirac fermions in the interior or bulk of a material, a novel state that was only recently proposed by theorists.

The news release provides a description of graphene and the search for alternatives (counterparts),

Two of the most exciting new materials in the world of high technology today are graphene and topological insulators, crystalline materials that are electrically insulating in the bulk but conducting on the surface. Both feature 2D Dirac fermions (fermions that aren’t their own antiparticle), which give rise to extraordinary and highly coveted physical properties. Topological insulators also possess a unique electronic structure, in which bulk electrons behave like those in an insulator while surface electrons behave like those in graphene.

“The swift development of graphene and topological insulators has raised questions as to whether there are 3D counterparts and other materials with unusual topology in their electronic structure,” says Chen [Yulin Chen, a physicist from the University of Oxford who led this study working with Berkeley Lab’s Advanced Light Source (ALS)]. “Our discovery answers both questions. In the sodium bismuthate we studied, the bulk conduction and valence bands touch only at discrete points and disperse linearly along all three momentum directions to form bulk 3D Dirac fermions. Furthermore, the topology of a 3DTDS electronic structure is also as unique as those of topological insulators.”

I’m a bit puzzled as to how this new material can be described as “essentially a 3D version of graphene,” as my understanding is that graphene must be composed of carbon and have a 2-dimensional honeycomb structure to merit the name. In any event, this new material, sodium bismuthate, has some disadvantages but the discovery is an encouraging development (from the news release),

Sodium bismuthate is too unstable to be used in devices without proper packaging, but it triggers the exploration for the development of other 3DTDS materials more suitable for everyday devices, a search that is already underway. Sodium bismuthate can also be used to demonstrate potential applications of 3DTDS systems, which offer some distinct advantages over graphene.

“A 3DTDS system could provide a significant improvement in efficiency in many applications over graphene because of its 3D volume,” Chen says. “Also, preparing large-size atomically thin single domain graphene films is still a challenge. It could be easier to fabricate graphene-type devices for a wider range of applications from 3DTDS systems.”

In addition, Chen says, a 3DTDS system also opens the door to other novel physical properties, such as giant diamagnetism that diverges when energy approaches the 3D Dirac point, quantum magnetoresistance in the bulk, unique Landau level structures under strong magnetic fields, and oscillating quantum spin Hall effects. All of these novel properties can be a boon for future electronic technologies. Future 3DTDS systems can also serve as an ideal platform for applications in spintronics.

While I don’t understand (again) the image the researchers have included as an illustration of their work, I do find the ‘blue jewels in a pile of junk’ very appealing,

Beamline 10.0.1 at Berkeley Lab’s Advanced Light Source is optimized for the study of electron structures and correlated electron systems. (Photo by Roy Kaltschmidt) Courtesy: Berkeley Lab

Here’s a link to and a citation for the paper,

Discovery of a Three-dimensional Topological Dirac Semimetal, Na3Bi by Zhongkai Liu, Bo Zhou, Yi Zhang, Zhijun Wang, Hongming Weng, Dharmalingam Prabhakaran, Sung-Kwan Mo, Zhi-Xun Shen, Zhong Fang, Xi Dai, and Zahid Hussain. Published Online January 16 2014 Science DOI: 10.1126/science.1245085

This paper is behind a paywall.

Graphene liquid cells and movies at the nanoscale

Here’s an Oct. 3, 2013 news item on Azonano about transmission electron microscopy (TEM) and graphene liquid cells enabling researchers at the Lawrence Berkeley National Laboratory (Berkeley Lab) to make movies,

Through a combination of transmission electron microscopy (TEM) and their own unique graphene liquid cell, the researchers have recorded the three-dimensional motion of DNA connected to gold nanocrystals. This is the first time TEM has been used for 3D dynamic imaging of so-called soft materials.

The researchers have produced an animation illustrating their work,

The Oct. 3, 2013 Berkeley Lab news release, which originated the news item, goes on to describe the challenge of imaging soft materials and how the researchers solved the problem,

In the past, liquid cells featured silicon-based viewing windows whose thickness limited resolution and perturbed the natural state of the soft materials. Zettl [physicist Alex Zettl] and Alivisatos [Paul Alivisatos, Berkeley Lab Director] and their respective research groups overcame these limitations with the development of a liquid cell based on a graphene membrane only a single atom thick. This development was done in close cooperation with researchers at the National Center for Electron Microscopy (NCEM), which is located at Berkeley Lab.

“Our graphene liquid cells pushed the spatial resolution of liquid phase TEM imaging to the atomic scale but still focused on growth trajectories of metallic nanocrystals,” says lead author Qian Chen, a postdoctoral fellow in Alivisatos’s research group. “Now we’ve adapted the technique to imaging the 3D dynamics of soft materials, starting with double-stranded DNA (dsDNA) connected to gold nanocrystals, and achieved nanometer resolution.”

To create the cell, two opposing graphene sheets are bonded to one another by their van der Waals attraction. This forms a sealed nanoscale chamber and creates within the chamber a stable aqueous solution pocket approximately 100 nanometers in height and one micron in diameter. The single-atom-thick graphene membrane of the cells is essentially transparent to the TEM electron beam, minimizing the unwanted loss of imaging electrons and providing superior contrast and resolution compared to silicon-based windows. The aqueous pockets allow for up to two minutes of continuous imaging of soft material samples exposed to a 200 kilovolt (kV) imaging electron beam. During this time, soft material samples can freely rotate.
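For a sense of scale, the pocket dimensions given above imply a remarkably tiny sample volume. A rough sketch, modeling the pocket as a cylinder (an idealization on my part; the release only gives the height and diameter),

```python
import math

# Rough volume of the aqueous pocket described above, idealized as a
# cylinder roughly 100 nm tall and 1 micron in diameter. The actual
# pocket shape will vary, so treat this as an order-of-magnitude figure.
height_m = 100e-9    # ~100 nanometers
radius_m = 0.5e-6    # half of the ~1 micron diameter

volume_m3 = math.pi * radius_m**2 * height_m
volume_attoliters = volume_m3 * 1e3 / 1e-18   # m^3 -> litres -> attolitres

print(f"pocket volume: ~{volume_attoliters:.0f} attoliters")
```

That works out to roughly 80 attoliters, i.e. less than a tenth of a femtoliter of solution being imaged continuously.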

After demonstrating that their graphene liquid cell can seal an aqueous sample solution against a TEM high vacuum, the Berkeley researchers used it to study the types of gold-dsDNA nanoconjugates that have been widely used as dynamic plasmonic probes.

“The presence of double-stranded DNA molecules incorporates the major challenges of studying the dynamics of biological samples with liquid phase TEM,” says Alivisatos. “The high-contrast gold nanocrystals facilitate tracking of our specimens.”

The Alivisatos and Zettl groups were able to observe dimers, pairs of gold nanoparticles, tethered by a single piece of dsDNA, and trimers, three gold nanoparticles, connected into a linear configuration by two single pieces of dsDNA. From a series of 2D projected TEM images captured while the samples were rotating, the researchers were able to reconstruct the 3D configuration and motions of the samples as they evolved over time.

Smarter ‘smart’ windows

It seems to me we may have to find a new way to discuss ‘smart’ windows as there’s only one more category after the comparative ‘smarter’ and that’s the superlative ‘smartest’. Lawrence Berkeley National Laboratory (Berkeley Lab), please, let’s stop the madness now! That said, the Berkeley Lab issued an Aug. 14, 2013 news release (also on EurekAlert) about its latest work on raising the IQ of smart windows,

Researchers at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have designed a new material to make smart windows even smarter. The material is a thin coating of nanocrystals embedded in glass that can dynamically modify sunlight as it passes through a window. Unlike existing technologies, the coating provides selective control over visible light and heat-producing near-infrared (NIR) light, so windows can maximize both energy savings and occupant comfort in a wide range of climates.

Milliron’s [Delia Milliron, a chemist at Berkeley Lab’s Molecular Foundry] research group is already well known for their smart-window technology that blocks NIR without blocking visible light. The technology hinges on an electrochromic effect, where a small jolt of electricity switches the material between NIR-transmitting and NIR-blocking states. This new work takes their approach to the next level by providing independent control over both visible and NIR light. The innovation was recently recognized with a 2013 R&D 100 Award and the researchers are in the early stages of commercializing their technology.

Independent control over NIR light means that occupants can have natural lighting indoors without unwanted thermal gain, reducing the need for both air-conditioning and artificial lighting. The same window can also be switched to a dark mode, blocking both light and heat, or to a bright, fully transparent mode.

“We’re very excited about the combination of unique optical function with the low-cost and environmentally friendly processing technique,” said Llordés, a project scientist working with Milliron. “That’s what turns this ‘universal smart window’ concept into a promising competitive technology.”

Here’s the specific technology that’s been developed, from the news release,

At the heart of their technology is a new “designer” electrochromic material, made from nanocrystals of indium tin oxide embedded in a glassy matrix of niobium oxide. The resulting composite material combines two distinct functionalities—one providing control over visible light and the other, control over NIR—but it is more than the sum of its parts. The researchers found a synergistic interaction in the tiny region where glassy matrix meets nanocrystal that increases the potency of the electrochromic effect, which means they can use thinner coatings without compromising performance. The key is that the way atoms connect across the nanocrystal-glass interface causes a structural rearrangement in the glass matrix. The interaction opens up space inside the glass, allowing charge to move in and out more readily. Beyond electrochromic windows, this discovery suggests new opportunities for battery materials where transport of ions through electrodes can be a challenge.

I notice they’re using indium which, while not technically a rare earth, is a scarce metal often discussed alongside the ‘rare earths’. Last I heard, China, one of the main sources of these materials, was limiting its exports, so this seems like an odd choice of material. Perhaps now that they’ve proved this can be done, they’ll search for more easily available substitutes. Here’s a link to and a citation for the published paper,

Tunable near-infrared and visible-light transmittance in nanocrystal-in-glass composites by Anna Llordés, Guillermo Garcia, Jaume Gazquez, & Delia J. Milliron. Nature 500, 323–326 (15 August 2013) doi:10.1038/nature12398 Published online 14 August 2013

Finally, the researchers have provided an illustration of indium tin oxide nanocrystals,

Nanocrystals of indium tin oxide (shown here in blue) embedded in a glassy matrix of niobium oxide (green) form a composite material that can switch between NIR-transmitting and NIR-blocking states with a small jolt of electricity. A synergistic interaction in the region where glassy matrix meets nanocrystal increases the potency of the electrochromic effect. Courtesy: Berkeley Lab

Integrated artificial photosynthesis nanosystem, a first for Lawrence Berkeley National Laboratory

There’s such a thing as too much information and not enough knowledge, a condition I’m currently suffering from with regard to artificial photosynthesis. Before expanding on that theme, here’s the latest about artificial photosynthesis from a May 16, 2013 Lawrence Berkeley National Laboratory news release (also available on EurekAlert),

In the wake of the sobering news that atmospheric carbon dioxide is now at its highest level in at least three million years, an important advance in the race to develop carbon-neutral renewable energy sources has been achieved. Scientists with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) have reported the first fully integrated nanosystem for artificial photosynthesis. While “artificial leaf” is the popular term for such a system, the key to this success was an “artificial forest.”

Here’s a more detailed description of the system, from the news release,

“Similar to the chloroplasts in green plants that carry out photosynthesis, our artificial photosynthetic system is composed of two semiconductor light absorbers, an interfacial layer for charge transport, and spatially separated co-catalysts,” says Peidong Yang, a chemist with Berkeley Lab’s Materials Sciences Division, who led this research. “To facilitate solar water-splitting in our system, we synthesized tree-like nanowire heterostructures, consisting of silicon trunks and titanium oxide branches. Visually, arrays of these nanostructures very much resemble an artificial forest.”

… Artificial photosynthesis, in which solar energy is directly converted into chemical fuels, is regarded as one of the most promising of solar technologies. A major challenge for artificial photosynthesis is to produce hydrogen cheaply enough to compete with fossil fuels. Meeting this challenge requires an integrated system that can efficiently absorb sunlight and produce charge-carriers to drive separate water reduction and oxidation half-reactions.

More specifically,

“In natural photosynthesis the energy of absorbed sunlight produces energized charge-carriers that execute chemical reactions in separate regions of the chloroplast,” Yang says. “We’ve integrated our nanowire nanoscale heterostructure into a functional system that mimics the integration in chloroplasts and provides a conceptual blueprint for better solar-to-fuel conversion efficiencies in the future.”

When sunlight is absorbed by pigment molecules in a chloroplast, an energized electron is generated that moves from molecule to molecule through a transport chain until ultimately it drives the conversion of carbon dioxide into carbohydrate sugars. This electron transport chain is called a “Z-scheme” because the pattern of movement resembles the letter Z on its side. Yang and his colleagues also use a Z-scheme in their system, only they deploy two Earth-abundant and stable semiconductors – silicon and titanium oxide – loaded with co-catalysts and with an ohmic contact inserted between them. Silicon was used for the hydrogen-generating photocathode and titanium oxide for the oxygen-generating photoanode. The tree-like architecture was used to maximize the system’s performance. Like trees in a real forest, the dense arrays of artificial nanowire trees suppress sunlight reflection and provide more surface area for fuel-producing reactions.

“Upon illumination photo-excited electron−hole pairs are generated in silicon and titanium oxide, which absorb different regions of the solar spectrum,” Yang says. “The photo-generated electrons in the silicon nanowires migrate to the surface and reduce protons to generate hydrogen while the photo-generated holes in the titanium oxide nanowires oxidize water to evolve oxygen molecules. The majority charge carriers from both semiconductors recombine at the ohmic contact, completing the relay of the Z-scheme, similar to that of natural photosynthesis.”

Under simulated sunlight, this integrated nanowire-based artificial photosynthesis system achieved a 0.12-percent solar-to-fuel conversion efficiency. Although comparable to some natural photosynthetic conversion efficiencies, this rate will have to be substantially improved for commercial use. [emphasis mine] However, the modular design of this system allows for newly discovered individual components to be readily incorporated to improve its performance. For example, Yang notes that the photocurrent outputs from the system’s silicon cathodes and titanium oxide anodes do not match, and that the lower photocurrent output from the anodes is limiting the system’s overall performance.

“We have some good ideas to develop stable photoanodes with better performance than titanium oxide,” Yang says. “We’re confident that we will be able to replace titanium oxide anodes in the near future and push the energy conversion efficiency up into single digit percentages.”

Now I can discuss my confusion, which stems from my May 24, 2013 posting about work done at the Argonne National Laboratory,

… Researchers still have a long way to go before they will be able to create devices that match the light harvesting efficiency of a plant.

One reason for this shortcoming, Tiede [Argonne biochemist David Tiede] explained, is that artificial photosynthesis experiments have not been able to replicate the molecular matrix that contains the chromophores. “The level that we are at with artificial photosynthesis is that we can make the pigments and stick them together, but we cannot duplicate any of the external environment,” he said. “The next step is to build in this framework, and then these kinds of quantum effects may become more apparent.”

Because the moment when the quantum effect occurs is so short-lived – less than a trillionth of a second – scientists will have a hard time ascertaining biological and physical rationales for their existence in the first place. [emphasis mine]

It’s not clear to me whether the folks at the Berkeley Lab bypassed the ‘problem’ described by Tiede or solved it to achieve solar-to-fuel conversion rates comparable to natural photosynthesis. As I noted, too much information/not enough knowledge.

Disorder engineering turns ‘white’ nanoparticles to ‘black’ nanoparticles for clean energy

Titanium dioxide crystals are white, except when they’re black. According to an Apr. 10, 2013 news item on Nanowerk, researchers at the Lawrence Berkeley National Laboratory (US) have found a way to change white titanium dioxide crystals to black thereby changing some of their properties,

A unique atomic-scale engineering technique for turning low-efficiency photocatalytic “white” nanoparticles of titanium dioxide into high-efficiency “black” nanoparticles could be the key to clean energy technologies based on hydrogen.

Samuel Mao, a scientist who holds joint appointments with Berkeley Lab’s Environmental Energy Technologies Division and the University of California at Berkeley, leads the development of a technique for engineering disorder into the nanocrystalline structure of the semiconductor titanium dioxide. This turns the naturally white crystals black in color, a sign that the crystals are now able to absorb infrared as well as visible and ultraviolet light. The expanded absorption spectrum substantially improves the efficiency with which black titanium dioxide can use sunlight to split water molecules for the production of hydrogen.

The Apr. 10, 2013 Berkeley Lab news release, which originated the news item, provides more detail about how this discovery might have an impact on clean energy efforts,

The promise of hydrogen in batteries or fuels is a clean and renewable source of energy that does not exacerbate global climate change. The challenge is cost-effectively mass-producing it. Despite being the most abundant element in the universe, pure hydrogen is scarce on Earth because hydrogen combines with just about any other type of atom. Using solar energy to split the water molecule into hydrogen and oxygen is the ideal way to produce pure hydrogen. This, however, requires an efficient photocatalyst that water won’t corrode. Titanium dioxide can stand up to water but, until the work of Mao and his group, was only able to absorb ultraviolet light, which accounts for barely ten percent of the energy in sunlight.

In his ACS [American Chemical Society] talk [at the 245th meeting, Apr. 7 - 11, 2013], titled “Disorder Engineering: Turning Titanium Dioxide Nanoparticles Black,” Mao described how he developed the concept of “disorder engineering,” and how the introduction of hydrogenated disorders creates mid-band gap energy states above the valence band maximum to enhance hydrogen mobility. His studies have not only yielded a promising new photocatalyst for generating hydrogen, but have also helped dispel some widely held scientific beliefs.
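As a quick sanity check on the claim that pristine titanium dioxide absorbs only ultraviolet light, one can convert a band-gap energy into an absorption cutoff wavelength with λ = hc/E. The ~3.2 eV band gap used below is a commonly cited textbook value for anatase titanium dioxide, not a figure taken from the news release:

```python
# A photon must carry at least the band-gap energy to excite an electron,
# so the band gap sets the longest wavelength the material can absorb.
h = 4.135667e-15   # Planck constant in eV·s
c = 2.998e8        # speed of light in m/s
band_gap_eV = 3.2  # approximate band gap of anatase TiO2 (textbook value)

cutoff_nm = h * c / band_gap_eV * 1e9   # convert metres to nanometres
print(f"absorption cutoff ≈ {cutoff_nm:.0f} nm")
```

The cutoff comes out near 387 nm, just inside the ultraviolet, which is consistent with the release's statement that unmodified titanium dioxide misses the visible and infrared portions of sunlight.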

“Our tests have shown that a good semiconductor photocatalyst does not have to be a single crystal with minimal defects and energy levels just beneath the bottom of conduction band,” Mao said.

Characterization studies at Berkeley Lab’s Advanced Light Source also helped answer the question of how much of the hydrogen detected in their experiments comes from the photocatalytic reaction, and how much comes from hydrogen absorbed in the titanium oxide during the hydrogenation synthesis process.

“Our measurements indicate that only a very small amount of hydrogen is absorbed in black titanium dioxide, about 0.05 milligrams, as compared to the 40 milligrams of hydrogen detected during a 100 hour solar-driven hydrogen production experiment,” Mao said.
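The arithmetic behind that comparison is easy to verify; this quick check (my own, not from the news release) shows the pre-absorbed hydrogen accounts for only about an eighth of a percent of the total detected:

```python
absorbed_mg = 0.05   # hydrogen absorbed in the black titanium dioxide
produced_mg = 40.0   # hydrogen detected over the whole experiment
hours = 100          # duration of the solar-driven production run

fraction = absorbed_mg / produced_mg   # share attributable to absorption
rate_mg_per_hr = produced_mg / hours   # average production rate

print(f"absorbed hydrogen is {fraction:.3%} of the total detected")  # 0.125%
print(f"average production rate: {rate_mg_per_hr} mg/hour")          # 0.4
```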

I must say, this ‘disorder engineering’ sounds much more appealing than some of the other disorders one hears about (e.g. personality disorders).

Computer simulation errors and corrections

In addition to being a news release, this is a really good piece of science writing by Paul Preuss for the Lawrence Berkeley National Laboratory (Berkeley Lab), from the Jan. 3, 2013 Berkeley Lab news release,

Because modern computers have to depict the real world with digital representations of numbers instead of physical analogues, to simulate the continuous passage of time they have to digitize time into small slices. This kind of simulation is essential in disciplines from medical and biological research, to new materials, to fundamental considerations of quantum mechanics, and the fact that it inevitably introduces errors is an ongoing problem for scientists.

Scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have now identified and characterized the source of tenacious errors and come up with a way to separate the realistic aspects of a simulation from the artifacts of the computer method. …

Here’s more detail about the problem and solution,

How biological molecules move is hardly the only field where computer simulations of molecular-scale motion are essential. The need to use computers to test theories and model experiments that can’t be done on a lab bench is ubiquitous, and the problems that Sivak and his colleagues encountered weren’t new.

“A simulation of a physical process on a computer cannot use the exact, continuous equations of motion; the calculations must use approximations over discrete intervals of time,” says Sivak. “It’s well known that standard algorithms that use discrete time steps don’t conserve energy exactly in these calculations.”
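Sivak’s point is easy to demonstrate with a toy example (my own illustration, not the authors’ code): integrating an undamped harmonic oscillator with the naive forward-Euler method inflates the total energy a little every step, and the drift shrinks as the time step shrinks:

```python
def simulate(dt, t_total=10.0):
    """Forward-Euler integration of a harmonic oscillator (m = k = 1).
    The exact dynamics conserve the energy 0.5*v**2 + 0.5*x**2,
    but each discrete step multiplies it by (1 + dt**2)."""
    x, v = 1.0, 0.0                      # initial energy is 0.5
    for _ in range(int(t_total / dt)):
        x, v = x + v * dt, v - x * dt    # naive discrete update
    return 0.5 * v * v + 0.5 * x * x

for dt in (0.1, 0.01, 0.001):
    print(f"dt = {dt}: final energy = {simulate(dt):.4f}")
```

Halving the time step reduces the spurious energy gain, but it also doubles the number of steps; that is the cost/accuracy trade-off the news release goes on to describe.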

One workhorse method for modeling molecular systems is Langevin dynamics, based on equations first developed by the French physicist Paul Langevin over a century ago to model Brownian motion. Brownian motion is the random movement of particles in a fluid (originally pollen grains on water) as they collide with the fluid’s molecules – particle paths resembling a “drunkard’s walk,” which Albert Einstein had used just a few years earlier to establish the reality of atoms and molecules. Instead of impractical-to-calculate velocity, momentum, and acceleration for every molecule in the fluid, Langevin’s method substituted an effective friction to damp the motion of the particle, plus a series of random jolts.
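For readers who want a feel for what this looks like in practice, here is a minimal sketch (again mine, not the authors’ code) of overdamped Langevin dynamics for a single particle in a harmonic well, using the standard Euler–Maruyama discretization: a deterministic force damped by friction, plus a random Gaussian jolt each step:

```python
import math
import random

def langevin_step(x, dt, gamma=1.0, k=1.0, kT=1.0):
    """One discrete step of overdamped Langevin dynamics in the
    potential U(x) = 0.5*k*x**2: friction (gamma) damps the motion,
    and a Gaussian kick stands in for collisions with fluid molecules."""
    force = -k * x                                            # -dU/dx
    jolt = math.sqrt(2.0 * kT * dt / gamma) * random.gauss(0.0, 1.0)
    return x + force * dt / gamma + jolt

random.seed(0)
x = 1.0
for _ in range(10_000):
    x = langevin_step(x, dt=0.01)
# over many steps, x samples (approximately) the Boltzmann
# distribution of the well; the approximation degrades as dt grows
```

With a large enough time step the sampled distribution visibly deviates from the true Boltzmann distribution; that systematic deviation is the kind of discretization artifact the researchers trace to “shadow work.”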

When Sivak and his colleagues used Langevin dynamics to model the behavior of molecular machines, they saw significant differences between what their exact theories predicted and what their simulations produced. They tried to come up with a physical picture of what it would take to produce these wrong answers.

“It was as if extra work were being done to push our molecules around,” Sivak says. “In the real world, this would be a driven physical process, but it existed only in the simulation, so we called it ‘shadow work.’ It took exactly the form of a nonequilibrium driving force.”

They first tested this insight with “toy” models having only a single degree of freedom, and found that when they ignored the shadow work, the calculations were systematically biased. But when they accounted for the shadow work, accurate calculations could be recovered.

“Next we looked at systems with hundreds or thousands of simple molecules,” says Sivak. Using models of water molecules in a box, they simulated the state of the system over time, starting from a given thermal energy but with no “pushing” from outside. “We wanted to know how far the water simulation would be pushed by the shadow work alone.”

The result confirmed that even in the absence of an explicit driving force, the finite-time-step Langevin dynamics simulation acted by itself as a driving nonequilibrium process. Systematic errors resulted from failing to separate this shadow work from the actual “protocol work” that they explicitly modeled in their simulations. For the first time, Sivak and his colleagues were able to quantify the magnitude of the deviations in various test systems.

Such simulation errors can be reduced in several ways, for example by dividing the evolution of the system into ever-finer time steps, because the shadow work is larger when the discrete time steps are larger. But doing so increases the computational expense.

The better approach is to use a correction factor that isolates the shadow work from the physically meaningful work, says Sivak. “We can apply results from our calculation in a meaningful way to characterize the error and correct for it, separating the physically realistic aspects of the simulation from the artifacts of the computer method.”

You can find out more in the Berkeley Lab news release, or (H/T) in the Jan. 3, 2013 news item on Nanowerk, or you can read the paper,

“Using nonequilibrium fluctuation theorems to understand and correct errors in equilibrium and nonequilibrium discrete Langevin dynamics simulations,” by David A. Sivak, John D. Chodera, and Gavin E. Crooks, will appear in Physical Review X (http://prx.aps.org/) and is now available as an arXiv preprint at http://arxiv.org/abs/1107.2967.

This casts a new light on the SPAUN (Semantic Pointer Architecture Unified Network) project from Chris Eliasmith’s team at the University of Waterloo, which announced the most successful attempt yet (my Nov. 29, 2012 posting) to simulate a brain using virtual neurons. Given the probability that Eliasmith’s team was not aware of this work from the Berkeley Lab, one imagines that, once it has been integrated, SPAUN will be capable of even more extraordinary feats.

Space-time crystals and everlasting clocks

Apparently, a space-time crystal could be useful for such things as studying the many-body problem in physics. Since I hadn’t realized the many-body problem existed and have no idea how this might affect me or anyone else, I will have to take the utility of a space-time crystal on trust. As for the possibility of an everlasting clock, how will I ever know the truth since I’m not everlasting?

The Sept. 24, 2012 news item on Nanowerk about a new development makes the space-time crystal sound quite fascinating,

Imagine a clock that will keep perfect time forever, even after the heat-death of the universe. This is the “wow” factor behind a device known as a “space-time crystal,” a four-dimensional crystal that has periodic structure in time as well as space. However, there are also practical and important scientific reasons for constructing a space-time crystal. With such a 4D crystal, scientists would have a new and more effective means by which to study how complex physical properties and behaviors emerge from the collective interactions of large numbers of individual particles, the so-called many-body problem of physics. A space-time crystal could also be used to study phenomena in the quantum world, such as entanglement, in which an action on one particle impacts another particle even if the two particles are separated by vast distances. [emphasis mine]

While I’m most interested in the possibility of studying entanglement, it seems to me the scientists are guessing, since the verb ‘could’ is being used where they previously used ‘would’ for studying the many-body problem.

The Sept. 24, 2012 news release by Lynn Yarris for the Lawrence Berkeley National Laboratory (Berkeley Lab), which originated the news item, provides detail on the latest space-time crystal development,

A space-time crystal, however, has only existed as a concept in the minds of theoretical scientists with no serious idea as to how to actually build one – until now. An international team of scientists led by researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) has proposed the experimental design of a space-time crystal based on an electric-field ion trap and the Coulomb repulsion of particles that carry the same electrical charge.

“The electric field of the ion trap holds charged particles in place and Coulomb repulsion causes them to spontaneously form a spatial ring crystal,” says Xiang Zhang, a faculty scientist with Berkeley Lab’s Materials Sciences Division who led this research. “Under the application of a weak static magnetic field, this ring-shaped ion crystal will begin a rotation that will never stop. The persistent rotation of trapped ions produces temporal order, leading to the formation of a space-time crystal at the lowest quantum energy state.”

Because the space-time crystal is already at its lowest quantum energy state, its temporal order – or timekeeping – will theoretically persist even after the rest of our universe reaches entropy, thermodynamic equilibrium or “heat-death.”

This new development builds on some work done earlier this year at the Massachusetts Institute of Technology (MIT), from the Yarris news release,

The concept of a crystal that has discrete order in time was proposed earlier this year by Frank Wilczek, the Nobel Prize-winning physicist at the Massachusetts Institute of Technology. While Wilczek mathematically proved that a time crystal can exist, how to physically realize such a time crystal was unclear. Zhang and his group, who have been working on issues with temporal order in a different system since September 2011, have come up with an experimental design to build a crystal that is discrete both in space and time – a space-time crystal.

Traditional crystals are 3D solid structures made up of atoms or molecules bonded together in an orderly and repeating pattern. Common examples are ice, salt and snowflakes. Crystallization takes place when heat is removed from a molecular system until it reaches its lower energy state. At a certain point of lower energy, continuous spatial symmetry breaks down and the crystal assumes discrete symmetry, meaning that instead of the structure being the same in all directions, it is the same in only a few directions.

“Great progress has been made over the last few decades in exploring the exciting physics of low-dimensional crystalline materials such as two-dimensional graphene, one-dimensional nanotubes, and zero-dimensional buckyballs,” says Tongcang Li, lead author of the PRL paper and a post-doc in Zhang’s research group. “The idea of creating a crystal with dimensions higher than that of conventional 3D crystals is an important conceptual breakthrough in physics and it is very exciting for us to be the first to devise a way to realize a space-time crystal.”

Just as a 3D crystal is configured at the lowest quantum energy state when continuous spatial symmetry is broken into discrete symmetry, so too is symmetry breaking expected to configure the temporal component of the space-time crystal. Under the scheme devised by Zhang and Li and their colleagues, a spatial ring of trapped ions in persistent rotation will periodically reproduce itself in time, forming a temporal analog of an ordinary spatial crystal. With a periodic structure in both space and time, the result is a space-time crystal.

Here’s an image created by the team at the Berkeley Lab to represent their work on the space-time crystal,

Imagine a clock that will keep perfect time forever or a device that opens new dimensions into quantum phenomena such as emergence and entanglement. (courtesy of Xiang Zhang group[?] at Berkeley Lab)

For anyone who’s interested in this work, I suggest reading either the news item on Nanowerk or the Berkeley Lab news release in full. I will leave you with Natalie Cole and Everlasting Love,