Tag Archives: NIST

Hopes for nanocellulose in the fields of medicine and green manufacturing

Initially this seemed like an essay extolling the possibilities for nanocellulose, but it is also a research announcement. From a Nov. 7, 2016 news item on Nanowerk,

What if you could take one of the most abundant natural materials on earth and harness its strength to lighten the heaviest of objects, to replace synthetic materials, or to serve as scaffolding for growing bone in oral health care, a fast-growing area of science?

This all might be possible with cellulose nanocrystals, the molecular matter of all plant life. As industrial filler material, they can be blended with plastics and other synthetics. They are as strong as steel, tough as glass, lightweight, and green.

“Plastics are currently reinforced with fillers made of steel, carbon, Kevlar, or glass. There is an increasing demand in manufacturing for sustainable materials that are lightweight and strong to replace these fillers,” said Douglas M. Fox, associate professor of chemistry at American University.
“Cellulose nanocrystals are an environmentally friendly filler. If there comes a time that they’re used widely in manufacturing, cellulose nanocrystals will lessen the weight of materials, which will reduce energy.”

A Nov. 7, 2016 American University news release on EurekAlert, which originated the news item, continues with details about the research,

Fox has filed a patent application for his work with cellulose nanocrystals, which involves a simple, scalable method to improve their performance. Published results of his method can be found in the chemistry journal ACS Applied Materials and Interfaces. Nanocrystals treated with Fox’s method could serve as a biomaterial and find applications in transportation, infrastructure and wind turbines.

The power of cellulose

Cellulose gives stems, leaves and other organic material in the natural world their strength. That strength already has been harnessed for use in many commercial materials. At the nano-level, cellulose fibers can be broken down into tiny crystals, particles smaller than ten millionths of a meter. Deriving cellulose from natural sources such as wood, tunicates (marine invertebrates also known as sea squirts) and certain kinds of bacteria, researchers prepare crystals of different sizes and strengths.

For all of the industry potential, hurdles abound. As nanocellulose disperses within plastic, scientists must find the sweet spot: the right amount of nanoparticle-matrix interaction that yields the strongest, lightest composite. Fox overcame four main barriers by altering the surface chemistry of nanocrystals with a simple process of ion exchange. Ion exchange reduces water absorption (cellulose composites lose their strength if they absorb water); increases the temperature at which the nanocrystals decompose (needed to blend with plastics); reduces clumping; and improves re-dispersal after the crystals dry.

Cell growth

The use of cellulose nanocrystals as a biomaterial is yet another commercial prospect. In dental regenerative medicine, restoring sufficient bone volume is needed to support a patient’s teeth or dental implants. Researchers at the National Institute of Standards and Technology [NIST], through an agreement with the National Institute of Dental and Craniofacial Research of the National Institutes of Health, are looking for an improved clinical approach that would regrow a patient’s bone. When researchers experimented with Fox’s modified nanocrystals, they were able to disperse the nanocrystals in scaffolds for dental regenerative medicine purposes.

“When we cultivated cells on the cellulose nanocrystal-based scaffolds, preliminary results showed remarkable potential of the scaffolds for both their mechanical properties and the biological response. This suggests that scaffolds with appropriate cellulose nanocrystal concentrations are a promising approach for bone regeneration,” said Martin Chiang, team leader for NIST’s Biomaterials for Oral Health Project.

Fox is also collaborating with the Georgia Institute of Technology and Owens Corning, a company specializing in fiberglass insulation and composites, to research the benefits of replacing the glass-reinforced plastic used in airplanes, cars and wind turbines. He is also working with Vireo Advisors and NIST to characterize the health and safety of cellulose nanocrystals and nanofibers.

“As we continue to show these nanomaterials are safe, and make it easier to disperse them into a variety of materials, we get closer to utilizing nature’s chemically resistant, strong, and most abundant polymer in everyday products,” Fox said.

Here’s a link to and a citation for the paper,

Simultaneously Tailoring Surface Energies and Thermal Stabilities of Cellulose Nanocrystals Using Ion Exchange: Effects on Polymer Composite Properties for Transportation, Infrastructure, and Renewable Energy Applications by Douglas M. Fox, Rebeca S. Rodriguez, Mackenzie N. Devilbiss, Jeremiah Woodcock, Chelsea S. Davis, Robert Sinko, Sinan Keten, and Jeffrey W. Gilman. ACS Appl. Mater. Interfaces, 2016, 8 (40), pp 27270–27281 DOI: 10.1021/acsami.6b06083 Publication Date (Web): September 14, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall.

Mimicking rain and sun to test plastic for nanoparticle release

One of Canada’s nanotechnology experts once informed a House of Commons Committee on Health that nanoparticles encased in plastic (he was talking about cell phones) weren’t likely to harm you except in two circumstances: when workers handle them during manufacturing, and when the product is disposed of. Apparently, under some circumstances, that is no longer the whole story. From a Sept. 30, 2016 news item on Nanowerk,

If the 1967 film “The Graduate” were remade today, Mr. McGuire’s famous advice to young Benjamin Braddock would probably be updated to “Plastics … with nanoparticles.” These days, the mechanical, electrical and durability properties of polymers—the class of materials that includes plastics—are often enhanced by adding miniature particles (smaller than 100 nanometers or billionths of a meter) made of elements such as silicon or silver. But could those nanoparticles be released into the environment after the polymers are exposed to years of sun and water—and if so, what might be the health and ecological consequences?

A Sept. 30, 2016 US National Institute of Standards and Technology (NIST) news release, which originated the news item, describes how the research was conducted and its results (Note: Links have been removed),

In a recently published paper, researchers from the National Institute of Standards and Technology (NIST) describe how they subjected a commercial nanoparticle-infused coating to NIST-developed methods for accelerating the effects of weathering from ultraviolet (UV) radiation and simulated washings of rainwater. Their results indicate that humidity and exposure time are contributing factors for nanoparticle release, findings that may be useful in designing future studies to determine potential impacts.

In their recent experiment, the researchers exposed multiple samples of a commercially available polyurethane coating containing silicon dioxide nanoparticles to intense UV radiation for 100 days inside the NIST SPHERE (Simulated Photodegradation via High-Energy Radiant Exposure), a hollow, 2-meter (7-foot) diameter black aluminum chamber lined with highly UV reflective material that bears a casual resemblance to the Death Star in the film “Star Wars.” For this study, one day in the SPHERE was equivalent to 10 to 15 days outdoors. All samples were weathered at a constant temperature of 50 degrees Celsius (122 degrees Fahrenheit), with one group weathered in extremely dry conditions (approximately 0 percent humidity) and the other in humid conditions (75 percent humidity).
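To put that acceleration factor in perspective, here's a quick back-of-the-envelope conversion (my arithmetic, not NIST's) of the 100 SPHERE days into outdoor-equivalent exposure:

```python
# Back-of-the-envelope conversion of SPHERE exposure time to its outdoor
# equivalent, using the 10x-15x acceleration factor quoted in the news release.
sphere_days = 100
for factor in (10, 15):
    outdoor_days = sphere_days * factor
    print(f"{sphere_days} SPHERE days x {factor} = {outdoor_days} outdoor days "
          f"(~{outdoor_days / 365:.1f} years)")
```

In other words, the 100-day run stands in for roughly three to four years of real-world sun and rain.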

To determine if any nanoparticles were released from the polymer coating during UV exposure, the researchers used a technique they created and dubbed “NIST simulated rain.” Filtered water was converted into tiny droplets, sprayed under pressure onto the individual samples, and then the runoff—with any loose nanoparticles—was collected in a bottle. This procedure was conducted at the beginning of the UV exposure, every two weeks during the weathering run and at the end. All of the runoff fluids were then analyzed by NIST chemists for the presence of silicon and in what amounts. Additionally, the weathered coatings were examined with atomic force microscopy (AFM) and scanning electron microscopy (SEM) to reveal surface changes resulting from UV exposure.

Both sets of coating samples—those weathered in very low humidity and the others in very humid conditions—degraded but released only small amounts of nanoparticles. The researchers found that more silicon was recovered from the samples weathered in humid conditions and that nanoparticle release increased as the UV exposure time increased. Microscopic examination showed that deformations in the coating surface became more numerous with longer exposure time, and that nanoparticles left behind after the coating degraded often bound together in clusters.

“These data, and the data from future experiments of this type, are valuable for developing computer models to predict the long-term release of nanoparticles from commercial coatings used outdoors, and in turn, help manufacturers, regulatory officials and others assess any health and environmental impacts from them,” said NIST research chemist Deborah Jacobs, lead author on the study published in the Journal of Coatings Technology and Research.

Here’s a link to and a citation for the paper,

Surface degradation and nanoparticle release of a commercial nanosilica/polyurethane coating under UV exposure by Deborah S. Jacobs, Sin-Ru Huang, Yu-Lun Cheng, Savelas A. Rabb, Justin M. Gorham, Peter J. Krommenhoek, Lee L. Yu, Tinh Nguyen, Lipiin Sung. J Coat Technol Res (2016) 13: 735. doi:10.1007/s11998-016-9796-2 First published online 13 July 2016

This paper is behind a paywall.

For anyone interested in the details about the House of Commons nano story I told at the start of this post, here’s the June 23, 2010 posting where I summarized the hearing on nanotechnology. If you scroll down about 50% of the way, you’ll find Dr. Nils Petersen’s (then director of Canada’s National Institute of Nanotechnology) comments about nanoparticles being encased. The topic had been nanosunscreens and he was describing the conditions under which he believed nanoparticles could be dangerous.

Creating multiferroic material at room temperature

A Sept. 23, 2016 news item on ScienceDaily describes some research from Cornell University (US),

Multiferroics — materials that exhibit both magnetic and electric order — are of interest for next-generation computing but difficult to create because the conditions conducive to each of those states are usually mutually exclusive. And in most multiferroics found to date, their respective properties emerge only at extremely low temperatures.

Two years ago, researchers in the labs of Darrell Schlom, the Herbert Fisk Johnson Professor of Industrial Chemistry in the Department of Materials Science and Engineering, and Dan Ralph, the F.R. Newman Professor in the College of Arts and Sciences, in collaboration with professor Ramamoorthy Ramesh at UC Berkeley, published a paper announcing a breakthrough in multiferroics involving the only known material in which magnetism can be controlled by applying an electric field at room temperature: the multiferroic bismuth ferrite.

Schlom’s group has partnered with David Muller and Craig Fennie, professors of applied and engineering physics, to take that research a step further: The researchers have combined two non-multiferroic materials, using the best attributes of both to create a new room-temperature multiferroic.

Their paper, “Atomically engineered ferroic layers yield a room-temperature magnetoelectric multiferroic,” was published — along with a companion News & Views piece — Sept. 22 [2016] in Nature. …

A Sept. 22, 2016 Cornell University news release by Tom Fleischman, which originated the news item, details more about the work (Note: A link has been removed),

The group engineered thin films of hexagonal lutetium iron oxide (LuFeO3), a material known to be a robust ferroelectric but not strongly magnetic. The LuFeO3 consists of alternating single monolayers of lutetium oxide and iron oxide, and differs from a strong ferrimagnetic oxide (LuFe2O4), which consists of alternating monolayers of lutetium oxide with double monolayers of iron oxide.

The researchers found, however, that they could combine these two materials at the atomic scale to create a new compound that was not only multiferroic but had better properties than either of the individual constituents. In particular, they found they needed to add just one extra monolayer of iron oxide to every 10 atomic repeats of the LuFeO3 to dramatically change the properties of the system.

That precision engineering was done via molecular-beam epitaxy (MBE), a specialty of the Schlom lab. A technique Schlom likens to “atomic spray painting,” MBE let the researchers design and assemble the two different materials in layers, a single atom at a time.

The combination of the two materials produced a strongly ferrimagnetic layer near room temperature. They then tested the new material at the Lawrence Berkeley National Laboratory (LBNL) Advanced Light Source in collaboration with co-author Ramesh to show that the ferrimagnetic atoms followed the alignment of their ferroelectric neighbors when switched by an electric field.

“It was when our collaborators at LBNL demonstrated electrical control of magnetism in the material that we made that things got super exciting,” Schlom said. “Room-temperature multiferroics are exceedingly rare and only multiferroics that enable electrical control of magnetism are relevant to applications.”

In electronic devices, the advantages of multiferroics include their reversible polarization in response to low-power electric fields – as opposed to heat-generating and power-sapping electrical currents – and their ability to hold their polarized state without the need for continuous power. High-performance memory chips make use of ferroelectric or ferromagnetic materials.

“Our work shows that an entirely different mechanism is active in this new material,” Schlom said, “giving us hope for even better – higher-temperature and stronger – multiferroics for the future.”

Collaborators hailed from the University of Illinois at Urbana-Champaign, the National Institute of Standards and Technology, the University of Michigan and Penn State University.

Here are links to and citations for the paper and its companion piece,

Atomically engineered ferroic layers yield a room-temperature magnetoelectric multiferroic by Julia A. Mundy, Charles M. Brooks, Megan E. Holtz, Jarrett A. Moyer, Hena Das, Alejandro F. Rébola, John T. Heron, James D. Clarkson, Steven M. Disseler, Zhiqi Liu, Alan Farhan, Rainer Held, Robert Hovden, Elliot Padgett, Qingyun Mao, Hanjong Paik, Rajiv Misra, Lena F. Kourkoutis, Elke Arenholz, Andreas Scholl, Julie A. Borchers, William D. Ratcliff, Ramamoorthy Ramesh, Craig J. Fennie, Peter Schiffer et al. Nature 537, 523–527 (22 September 2016) doi:10.1038/nature19343 Published online 21 September 2016

Condensed-matter physics: Multitasking materials from atomic templates by Manfred Fiebig. Nature 537, 499–500  (22 September 2016) doi:10.1038/537499a Published online 21 September 2016

Both the paper and its companion piece are behind a paywall.

Teleporting photons in Calgary (Canada) is a step towards a quantum internet

Scientists at the University of Calgary (Alberta, Canada) have set a distance record for the teleportation of photons and you can see the lead scientist is very pleased.

Wolfgang Tittel, professor of physics and astronomy at the University of Calgary, and a group of PhD students have developed a new quantum key distribution (QKD) system.

A Sept. 21, 2016 news item on phys.org makes the announcement (Note: A link has been removed),

What if you could behave like the crew on the Starship Enterprise and teleport yourself home or anywhere else in the world? As a human, you’re probably not going to realize this any time soon; if you’re a photon, you might want to keep reading.

Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary, has successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometres using The City of Calgary’s fibre optic cable infrastructure. The project began with an Urban Alliance seed grant in 2014.

This accomplishment, which set a new record for distance of transferring a quantum state by teleportation, has landed the researchers a spot in the prestigious Nature Photonics scientific journal. The finding was published back-to-back with a similar demonstration by a group of Chinese researchers.

A Sept. 20, 2016 article by Robson Fletcher for CBC (Canadian Broadcasting News) online provides a bit more insight from the lead researcher (Note: A link has been removed),

“What is remarkable about this is that this information transfer happens in what we call a disembodied manner,” said physics professor Wolfgang Tittel, whose team’s work was published this week in the journal Nature Photonics.

“Our transfer happens without any need for an object to move between these two particles.”

A Sept. 20, 2016 University of Calgary news release by Drew Scherban, which originated the news item, provides more insight into the research,

“Such a network will enable secure communication without having to worry about eavesdropping, and allow distant quantum computers to connect,” says Tittel.

Experiment draws on ‘spooky action at a distance’

The experiment is based on the entanglement property of quantum mechanics, also known as “spooky action at a distance” — a property so mysterious that not even Einstein could come to terms with it.

“Being entangled means that the two photons that form an entangled pair have properties that are linked regardless of how far the two are separated,” explains Tittel. “When one of the photons was sent over to City Hall, it remained entangled with the photon that stayed at the University of Calgary.”

Next, the photon whose state was teleported to the university was generated in a third location in Calgary and then also travelled to City Hall where it met the photon that was part of the entangled pair.

“What happened is the instantaneous and disembodied transfer of the photon’s quantum state onto the remaining photon of the entangled pair, which is the one that remained six kilometres away at the university,” says Tittel.
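For readers who want to see how that "disembodied transfer" works on paper, here's a minimal simulation sketch of the textbook teleportation protocol Tittel is describing. This is my own illustration in Python/NumPy, not the researchers' code, and the qubit labels and helper functions are mine: qubit 0 holds the state to be sent, qubits 1 and 2 form the entangled pair, and only two classical bits pass from sender to receiver.

```python
import numpy as np

rng = np.random.default_rng(7)

# Single-qubit gates and the |0> state
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
ZERO = np.array([1, 0], dtype=complex)

def op_on(op, qubit):
    """Lift a single-qubit operator onto one qubit of a 3-qubit register."""
    ops = [I2, I2, I2]
    ops[qubit] = op
    return np.kron(np.kron(ops[0], ops[1]), ops[2])

def cnot(control, target, n=3):
    """CNOT as a 2^n x 2^n permutation matrix (big-endian qubit order)."""
    dim = 2 ** n
    U = np.zeros((dim, dim), dtype=complex)
    for i in range(dim):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        U[j, i] = 1.0
    return U

def teleport(psi):
    """Teleport the state psi from qubit 0 onto qubit 2 and return it."""
    # Register: qubit 0 holds psi; qubits 1 and 2 start in |0>.
    state = np.kron(np.asarray(psi, dtype=complex), np.kron(ZERO, ZERO))
    # Share a Bell pair between qubit 1 (sender) and qubit 2 (receiver).
    state = cnot(1, 2) @ op_on(H, 1) @ state
    # Sender's Bell-state measurement circuit on qubits 0 and 1.
    state = op_on(H, 0) @ cnot(0, 1) @ state
    # Sample the sender's two classical measurement bits (m0, m1).
    p = np.array([np.sum(np.abs(state[4*m0 + 2*m1 : 4*m0 + 2*m1 + 2]) ** 2)
                  for m0 in (0, 1) for m1 in (0, 1)])
    k = rng.choice(4, p=p)
    m0, m1 = k // 2, k % 2
    # Collapse onto that outcome; what's left is the receiver's qubit.
    bob = state[4*m0 + 2*m1 : 4*m0 + 2*m1 + 2]
    bob = bob / np.linalg.norm(bob)
    # The two classical bits tell the receiver which correction to apply.
    if m1:
        bob = X @ bob
    if m0:
        bob = Z @ bob
    return bob

# Demo: the receiver's qubit matches the input state (fidelity ~ 1),
# even though no physical object carried the state between the parties.
psi = np.array([0.6, 0.8j])
print("fidelity:", abs(np.vdot(psi, teleport(psi))))
```

The real experiment, of course, does this with photons over six kilometres of fibre rather than with matrices, but the logic of the protocol is the same.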

City’s accessible dark fibre makes research possible

The research would not have been possible without access to the proper technology. One of the critical pieces of infrastructure that support quantum networking is accessible dark fibre. Dark fibre, so named because of its composition — a single optical cable with no electronics or network equipment along the alignment — doesn’t interfere with quantum technology.

The City of Calgary is building and provisioning dark fibre to enable next-generation municipal services today and for the future.

“By opening The City’s dark fibre infrastructure to the private and public sector, non-profit companies, and academia, we help enable the development of projects like quantum encryption and create opportunities for further research, innovation and economic growth in Calgary,” said Tyler Andruschak, project manager with Innovation and Collaboration at The City of Calgary.

“The university receives secure access to a small portion of our fibre optic infrastructure and The City may benefit in the future by leveraging the secure encryption keys generated out of the lab’s research to protect our critical infrastructure,” said Andruschak. In order to deliver next-generation services to Calgarians, The City has been increasing its fibre optic footprint, connecting all City buildings, facilities and assets.

Timed to within one millionth of one millionth of a second

As if teleporting a photon wasn’t challenging enough, Tittel and his team encountered a number of other roadblocks along the way.

Due to changes in the outdoor temperature, the transmission time of photons from their creation point to City Hall varied over the course of a day — the time it took the researchers to gather sufficient data to support their claim. This change meant that the two photons would not meet at City Hall.

“The challenge was to keep the photons’ arrival time synchronized to within 10 picoseconds,” says Tittel. “That is one trillionth, or one millionth of one millionth of a second.”
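To get a feel for how tight that tolerance is, here's a rough calculation of my own (the fibre's group index of about 1.47 is my assumption; the news release doesn't specify the fibre type):

```python
# How much optical path does a 10-picosecond timing window correspond to?
c = 299_792_458.0   # speed of light in vacuum, m/s
n_fibre = 1.47      # assumed group index of standard single-mode telecom fibre
dt = 10e-12         # the 10-picosecond window Tittel mentions
path = (c / n_fibre) * dt
print(f"10 ps of travel in fibre is roughly {path * 1e3:.1f} mm")
```

So over kilometres of city fibre, the photons' paths had to be stable to about the width of a grain of rice.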

Secondly, parts of their lab had to be moved to two locations in the city, which, as Tittel explains, was particularly tricky for the measurement station at City Hall, which included state-of-the-art superconducting single-photon detectors developed by the National Institute of Standards and Technology and NASA’s Jet Propulsion Laboratory.

“Since these detectors only work at temperatures less than one degree above absolute zero, the equipment also included a compact cryostat,” said Tittel.

Milestone towards a global quantum Internet

This demonstration is arguably one of the most striking manifestations of a puzzling prediction of quantum mechanics, but it also opens the path to building a future quantum internet, the long-term goal of the Tittel group.

The Urban Alliance is a strategic research partnership between The City of Calgary and University of Calgary, created in 2007 to encourage and co-ordinate the seamless transfer of cutting-edge research between the university and The City of Calgary for the benefit of all our communities. The Urban Alliance is a prime example and vehicle for one of the three foundational commitments of the University of Calgary’s Eyes High vision to fully integrate the university with the community. The City sees the Alliance as playing a key role in realizing its long-term priorities and the imagineCALGARY vision.

Here’s a link to and a citation for the paper,

Quantum teleportation across a metropolitan fibre network by Raju Valivarthi, Marcel.li Grimau Puigibert, Qiang Zhou, Gabriel H. Aguilar, Varun B. Verma, Francesco Marsili, Matthew D. Shaw, Sae Woo Nam, Daniel Oblak, & Wolfgang Tittel. Nature Photonics (2016)  doi:10.1038/nphoton.2016.180 Published online 19 September 2016

This paper is behind a paywall.

I’m 99% certain this is the paper from the Chinese researchers (referred to in the University of Calgary news release),

Quantum teleportation with independent sources and prior entanglement distribution over a network by Qi-Chao Sun, Ya-Li Mao, Si-Jing Chen, Wei Zhang, Yang-Fan Jiang, Yan-Bao Zhang, Wei-Jun Zhang, Shigehito Miki, Taro Yamashita, Hirotaka Terai, Xiao Jiang, Teng-Yun Chen, Li-Xing You, Xian-Feng Chen, Zhen Wang, Jing-Yun Fan, Qiang Zhang & Jian-Wei Pan. Nature Photonics (2016)  doi:10.1038/nphoton.2016.179 Published online 19 September 2016

This too is behind a paywall.

Carbon nanotubes: faster, cheaper, easier, and more consistent

One of the big problems with nanomaterials has to do with production issues such as achieving consistent size and shape. It seems that scientists at the US National Institute of Standards and Technology (NIST) have developed a technique for producing carbon nanotubes (CNTs) that addresses these issues. From a July 19, 2016 news item on Nanotechnology Now,

Just as many of us might be resigned to clogged salt shakers or rush-hour traffic, those working to exploit the special properties of carbon nanotubes have typically shrugged their shoulders when these tiniest of cylinders fill with water during processing. But for nanotube practitioners who have reached their Popeye threshold and “can’t stands no more,” the National Institute of Standards and Technology (NIST) has devised a cheap, quick and effective strategy that reliably enhances the quality and consistency of the materials–important for using them effectively in applications such as new computing technologies.

To prevent filling of the cores of single-wall carbon nanotubes with water or other detrimental substances, the NIST researchers advise intentionally prefilling them with a desired chemical of known properties. Taking this step before separating and dispersing the materials, usually done in water, yields a consistently uniform collection of nanotubes. In quantity and quality, the results are superior to water-filled nanotubes, especially for optical applications such as sensors and photodetectors.

A July 15, 2016 NIST news release, which originated the news item, expands on the theme,

The approach opens a straightforward route for engineering single-wall carbon nanotubes—rolled-up sheets of carbon atoms arranged like chicken wire or honeycombs—with improved or new properties.

“This approach is so easy, inexpensive and broadly useful that I can’t think of a reason not to use it,” said NIST chemical engineer Jeffrey Fagan.

In their proof-of-concept experiments, the NIST team inserted more than 20 different compounds into an assortment of single-wall carbon nanotubes with an interior diameter that ranged from more than 2 down to about 0.5 nanometers. Led by visiting researcher Jochen Campo, the scientists tested their strategy by using hydrocarbons called alkanes as fillers.

The alkanes, which include such familiar compounds as propane and butane, served to render the nanotube interiors unreactive. In other words, the alkane-filled nanotubes behaved almost as if they were empty—precisely the goal of Campo, Fagan and colleagues.

Compared with nanotubes filled with water and possibly ions, acids and other unwanted chemicals encountered during processing, empty nanotubes possess far superior properties. For example, when stimulated by light, empty carbon nanotubes fluoresce far brighter and with sharper signals.

Yet, “spontaneous ingestion” of water or other solvents by the nanotubes during processing is an “endemic but often neglected phenomenon with strong implications for the development of nanotube applications,” the NIST team wrote in a recent article in Nanoscale Horizons.

Perhaps because of the additional cost and effort required to filter out and gather nanotubes, researchers tend to tolerate mixed batches of unfilled (empty) and mostly filled single-wall carbon nanotubes. Separating unfilled nanotubes from these mixtures requires expensive ultracentrifuge equipment and, even then, the yield is only about 10 percent, Campo estimates.

“If your goal is to use nanotubes for electronic circuits, for example, or for fluorescent anti-cancer image contrast agents, then you require much greater quantities of materials of consistent composition and quality,” explained Campo, who was exploring these applications while doing postdoctoral research at the University of Antwerp. “This particular need inspired development of the new prefilling method by asking the question: can we put some passive chemical into the nanotube instead to keep the water out?”

From the very first simple experiments, the answer was yes. And the benefits can be significant. In fluorescence experiments, alkane-filled nanotubes emitted signals two to three times stronger than those emitted by water-filled nanotubes. Performance approached that of empty nanotubes—the gold standard for these comparisons.

As important, the NIST-developed prefilling strategy is controllable, versatile and easily incorporated into existing methods for processing single-wall carbon nanotubes, according to the researchers.

Here’s a link to and citation for the paper,

Enhancing single-wall carbon nanotube properties through controlled endohedral filling by J. Campo, Y. Piao, S. Lam, C. M. Stafford, J. K. Streit, J. R. Simpson, A. R. Hight Walker, and J. A. Fagan. Nanoscale Horiz., 2016,1, 317-324 DOI: 10.1039/C6NH00062B First published online 10 May 2016

This paper is open access but you do need to register on the site (it is a free registration).

US Nanotechnology Initiative for water sustainability

Wednesday, March 23, 2016 was World Water Day and, to coincide with that event, the US National Nanotechnology Initiative (NNI), in collaboration with several other agencies, announced a new ‘signature initiative’. From a March 24, 2016 news item on Nanowerk (Note: A link has been removed),

As a part of the White House Water Summit held yesterday on World Water Day, the Federal agencies participating in the National Nanotechnology Initiative (NNI) announced the launch of a Nanotechnology Signature Initiative (NSI), Water Sustainability through Nanotechnology: Nanoscale Solutions for a Global-Scale Challenge.

A March 23, 2016 NNI news release provides more information about why this initiative is important,

Access to clean water remains one of the world’s most pressing needs. As today’s White House Office of Science and Technology Policy blog post explains, “the small size and exceptional properties of engineered nanomaterials are particularly promising for addressing the key technical challenges related to water quality and quantity.”

“One cannot find an issue more critical to human life and global security than clean, plentiful, and reliable water sources,” said Dr. Michael Meador, Director of the National Nanotechnology Coordination Office (NNCO). “Through the NSI mechanism, NNI member agencies will have an even greater ability to make meaningful strides toward this initiative’s thrust areas: increasing water availability, improving the efficiency of water delivery and use, and enabling next-generation water monitoring systems.”

A March 23, 2016 US White House blog posting by Lloyd Whitman and Lisa Friedersdorf describes the efforts in more detail (Note: A link has been removed),

The small size and exceptional properties of engineered nanomaterials are particularly promising for addressing the pressing technical challenges related to water quality and quantity. For example, the increased surface area—a cubic centimeter of nanoparticles has a surface area larger than a football field—and reactivity of nanometer-scale particles can be exploited to create catalysts for water purification that do not require rare or precious metals. And composites incorporating nanomaterials such as carbon nanotubes might one day enable stronger, lighter, and more durable piping systems and components. Under this NSI, Federal agencies will coordinate and collaborate to more rapidly develop nanotechnology-enabled solutions in three main thrusts: [thrust 1] increasing water availability; [thrust 2] improving the efficiency of water delivery and use; and [thrust 3] enabling next-generation water monitoring systems.
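That football-field comparison is worth a quick sanity check of my own (assuming, for simplicity, monodisperse spherical particles): spheres of diameter d with total particle volume V have a combined surface area of A = 6V/d.

```python
# Rough check of the surface-area claim for a cubic centimetre of
# nanoparticles, assuming monodisperse spheres: A = 6 * V / d.
V = 1e-6  # one cubic centimetre of particle volume, in m^3
for d_nm in (1, 10, 100):
    d = d_nm * 1e-9
    area = 6 * V / d
    print(f"d = {d_nm:>3} nm -> total surface area = {area:,.0f} m^2")
```

An American football field is roughly 5,300 m², so the comparison holds for particles around a nanometre across; at 100 nm, the upper end of the nanoscale, the total area is a more modest 60 m². Either way, the surface-to-volume gain over bulk material is enormous.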

A technical “white paper” released by the agencies this week highlights key technical challenges for each thrust, identifies key objectives to overcome those challenges, and notes areas of research and development where nanotechnology promises to provide the needed solutions. By shining a spotlight on these areas, the new NSI will increase Federal coordination and collaboration, including with public and private stakeholders, which is vital to making progress in these areas. The additional focus and associated collective efforts will advance stewardship of water resources to support the essential food, energy, security, and environment needs of all stakeholders.

We applaud the commitment of the Federal agencies who will participate in this effort—the Department of Commerce/National Institute of Standards and Technology, Department of Energy, Environmental Protection Agency, National Aeronautics and Space Administration, National Science Foundation, and U.S. Department of Agriculture/National Institute of Food and Agriculture. As made clear at this week’s White House Water Summit, the world’s water systems are under tremendous stress, and new and emerging technologies will play a critical role in ensuring a sustainable water future.

The white paper (12 pp.) is titled: Water Sustainability through Nanotechnology: Nanoscale Solutions for a Global-Scale Challenge and describes the thrusts in more detail.

A March 22, 2016 US White House fact sheet lays out more details including funding,

Click here to learn more about all of the commitments and announcements being made today. They include:

  • Nearly $4 billion in private capital committed to investment in a broad range of water-infrastructure projects nationwide. This includes $1.5 billion from Ultra Capital to finance decentralized and scalable water-management solutions, and $500 million from Sustainable Water to develop water reclamation and reuse systems.
  • More than $1 billion from the private sector over the next decade to conduct research and development into new technologies. This includes $500 million from GE to fuel innovation, expertise, and global capabilities in advanced water, wastewater, and reuse technologies.
  • A Presidential Memorandum and supporting Action Plan on building national capabilities for long-term drought resilience in the United States, including by setting drought resilience policy goals, directing specific drought resilience activities to be completed by the end of the year, and permanently establishing the National Drought Resilience Partnership as an interagency task force responsible for coordinating drought-resilience, response, and recovery efforts.
  • Nearly $35 million this year in Federal grants from the Environmental Protection Agency, the National Oceanic and Atmospheric Administration, the National Science Foundation, and the U.S. Department of Agriculture to support cutting-edge water science;
  • The release of a new National Water Model that will dramatically enhance the Nation’s river-forecasting capabilities by delivering forecasts for approximately 2.7 million locations, up from 4,000 locations today (a 700-fold increase in forecast density).
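The "700-fold" figure in that last bullet is a rounded ratio; the quoted location counts give a factor closer to 675:

```python
# Ratio of forecast locations quoted in the fact sheet.
new_locations = 2_700_000
old_locations = 4_000
fold_increase = new_locations / old_locations
print(fold_increase)   # 675.0, rounded up to "700-fold" in the fact sheet
```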

This seems promising and hopefully other countries will follow suit.

Monsieur Kilogram; an SI (international system of units) Superhero

I wouldn’t have thought that measurement was such a crucial issue that it would require a superhero team but I have to admit the folks at the US National Institute of Standards and Technology (NIST) make a compelling case in a Feb. 2, 2016 NIST news release (also on EurekAlert but dated Feb. 4, 2016),

The nefarious Major Uncertainty has kidnapped Monsieur Kilogram, putting the world’s measurements of mass in jeopardy. As the world spirals into “Mass Hysteria,” the remaining SI Superheroes, champions of the metric system, leap into action to save the day, and hopefully Monsieur Kilogram as well.

This crisis kicks off the third and latest adventure from the League of SI Superheroes, the animated online series from the National Institute of Standards and Technology (NIST). “Mass Hysteria” touches upon a topic—how to redefine the basic unit of mass known as the kilogram—that represents a cutting-edge undertaking for researchers working to modernize the worldwide metric measurement system known as the International System of Units (SI).

From the very big to the very small, accurately measuring mass is important in the world around us. For example, many of the products you buy at the grocery store and other places are sold by mass or the related quantity of weight. Roughly speaking, mass is the amount and type of “stuff” there is in something, and weight is the force pulling on the mass by gravity. The masses of every ingredient in medications from aspirin tablets to cancer drugs are carefully measured to ensure that they are both safe and effective. In many cases, medical doctors consider the mass of the patient to determine the dosage of the medications they prescribe as well. And both the fuel and the amount of thrust produced by the huge engines that power airplanes and rockets depend on mass.

Small errors of even a few milligrams per kilogram may not sound like much, but they can be costly when measuring huge quantities of something like a tanker ship full of grain or oil. With medicines, slightly too little of a chemical could make it ineffective and slightly too much could be toxic.
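To put that tanker example in concrete terms, here's the arithmetic with illustrative numbers of my own choosing (a 3 mg/kg bias across a 100,000-tonne shipment; neither figure comes from the news release):

```python
# Illustrative scaling of a tiny per-kilogram error up to cargo size.
bias_mg_per_kg = 3                  # assumed measurement bias, mg per kg
cargo_kg = 100_000 * 1_000          # hypothetical 100,000-tonne shipment

# mg per kg is parts per million, so divide by one million.
error_kg = cargo_kg * bias_mg_per_kg / 1_000_000
print(error_kg)                     # 300.0 kg unaccounted for
```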

Here’s a peek at Monsieur Kilogram, who seems like a pretty tough customer,

Monsieur Kilogram, a character in NIST’s League of SI Superheroes animated online series, is able to determine the mass of any object simply by holding it. ©NIST

Getting back to the news release and the importance of accurate measurement,

Being the last standard unit of measure still based on an actual physical object, in this case a golf-ball-sized cylinder of platinum and iridium, the kilogram is vulnerable to damage, as well as being lost or even stolen. While the international prototype kilogram itself cannot change because it is the kilogram by definition, copies of the international prototype that many countries use as their standard of mass have been gaining or losing mass relative to it.

The SI Superheroes’ latest episode briefly explores one of the efforts to redefine the kilogram in terms of natural forces called the watt balance, a complex machine that uses electric and magnetic forces to balance a 1-kilogram mass against the Earth’s gravity. Precise measurements related to these forces can then be used to provide a consistent definition of the amount of mass in the kilogram.
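The news release keeps the watt balance description qualitative. The standard textbook account is that the instrument runs in two modes whose results are divided so the magnet-coil geometry cancels; here's a sketch with made-up but self-consistent numbers (none of these values are NIST measurements):

```python
# Watt (Kibble) balance bookkeeping, illustrative numbers only.
# Weighing mode: current I through the coil produces a magnetic force
# that balances gravity, m*g = B*L*I.
# Moving mode: driving the coil at velocity v induces a voltage
# U = B*L*v.
# Dividing the two eliminates the hard-to-calibrate factor B*L:
#   m = U * I / (g * v)
g = 9.80665              # local gravitational acceleration, m/s^2
v = 0.002                # coil velocity in moving mode, m/s
U = 1.0                  # induced voltage in moving mode, volts
I = 0.0196133            # balancing current in weighing mode, amperes
mass = U * I / (g * v)
print(mass)              # ~1.0 kg
```

The point of the trick is that mass ends up expressed entirely in electrical and mechanical quantities that can be measured extremely precisely, with no reference to a physical artifact.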

While the superheroes’ antics are not exactly representative of the efforts of actual researchers, super scientists at NIST and elsewhere have been working for years to reduce the errors in their measurement of this quantity to the point where the watt balance can take over for the international prototype kilogram.

They are closing in on their goal, and it is widely anticipated that the kilogram will be redefined in 2018.

Once this process is complete, the kilogram will have been freed from its dependence on a physical object, and anyone with the right technical expertise and equipment will be able to determine the mass of a kilogram for themselves.

Will the SI Superheroes finish the watt balance in time? Watch the next episode and find out! Modeled on the seven base units of the International System of Units, or SI, the members of the League of SI Superheroes are:

Meter Man: With his laser interferometer eyes, graduated arms and extendable body, no dimension is too big or too small for Meter Man to measure.

The Mole: Able to sniff out and count the atoms of every element, the Mole is a master of chemistry.

Professor Second: By reading the vibrations of her laser-cooled cesium atoms, Professor Second can measure any frequency and calibrate any clock.

Monsieur Kilogram: Monsieur Kilogram loves lifting weights, and it shows. He is able to determine the mass of any object simply by holding it.

Ms. Ampere: Ms. Ampere rules the flow of electrons—electrical current—and makes sure that the right amount gets where it needs to go.

Dr. Kelvin: Dr. Kelvin heats up or cools down objects by speeding up or slowing down particles inside them. He can also measure the temperature of anything in the universe with his trusty thermometer.

Candela: Don’t let her small size fool you. Candela’s power over light helps to brighten the whole world.

Catch up on their adventures at The League of SI Superheroes kids’ page. Teachers can also request a classroom set of SI educational materials by submitting their contact information and grade level to TheSI@nist.gov.

Here’s the latest adventure,


A study in contrasts: innovation and education strategies in the US and British Columbia (Canada)

It’s always interesting to contrast two approaches to the same issue, in this case, innovation and education strategies designed to improve the economies of the United States and of British Columbia, a province in Canada.

One of the major differences regarding education in the US and in Canada is that the Canadian federal government, unlike the US federal government, has no jurisdiction over the matter. Education is strictly a provincial responsibility.

I recently wrote a commentary (a Jan. 19, 2016 posting) about the BC government’s Jan. 18, 2016 announcement of its innovation strategy, with a special emphasis on the education aspect. Premier Christy Clark focused largely on the notion of embedding courses on computer coding in schools from K-12 (kindergarten through grade 12), as Jonathon Narvey noted in his Jan. 19, 2016 event recap for Betakit,

While many in the tech sector will be focused on the short-term benefits of a quick injection of large capital [a $100M BC Tech Fund as part of a new strategy was announced in Dec. 2015 but details about the new #BCTECH Strategy were not shared until Jan. 18, 2016], the long-term benefits for the local tech sector are being seeded in local schools. More than 600,000 BC students will be getting basic skills in the K-12 curriculum, with coding academies, more work experience electives and partnerships between high school and post-secondary institutions.

Here’s what I had to say in my commentary (from the Jan. 19, 2016 posting),

… the government wants to embed computer coding into the education system for K-12 (kindergarten to grade 12). One determined reporter (Canadian Press if memory serves) attempted to find out how much this would cost. No answer was forthcoming although there were many words expended. Whether this failure was due to ignorance (disturbing!) or a reluctance to share (also disturbing!) was impossible to tell. Another reporter (Georgia Straight) asked about equipment (coding can be taught with pen and paper but hardware is better). … Getting back to the reporter’s question, no answer was forthcoming although the speaker was loquacious.

Another reporter asked if the government had found any jurisdictions doing anything similar regarding computer coding. It seems they did consider other jurisdictions although it was claimed that BC is the first to strike out in this direction. Oddly, no one mentioned Estonia, known in some circles as E-stonia, where the entire school system was online by the late 1990s in an initiative known as the ‘Tiger Leap Foundation’ which also supported computer coding classes in secondary school (there’s more in Tim Mansel’s May 16, 2013 article about Estonia’s then-latest initiative to embed computer coding into grade school). …

Aside from the BC government’s failure to provide details, I am uncomfortable with what I see as an overemphasis on computer coding that suggests a narrow focus on what constitutes a science and technology strategy for education. I find the US approach closer to what I favour although I may be biased since they are building their strategy around nanotechnology education.

The US approach had been announced in dribs and drabs until recently when a Jan. 26, 2016 news item on Nanotechnology Now indicated a broad-based plan for nanotechnology education (and computer coding),

Over the past 15 years, the Federal Government has invested over $22 billion in R&D under the auspices of the National Nanotechnology Initiative (NNI) to understand and control matter at the nanoscale and develop applications that benefit society. As these nanotechnology-enabled applications become a part of everyday life, it is important for students to have a basic understanding of material behavior at the nanoscale, and some states have even incorporated nanotechnology concepts into their K-12 science standards. Furthermore, application of the novel properties that exist at the nanoscale, from gecko-inspired climbing gloves and invisibility cloaks, to water-repellent coatings on clothes or cellphones, can spark students’ excitement about science, technology, engineering, and mathematics (STEM).

An earlier Jan. 25, 2016 White House blog posting by Lisa Friedersdorf and Lloyd Whitman introduced the notion that nanotechnology is viewed as foundational and a springboard for encouraging interest in STEM (science, technology, engineering, and mathematics) careers while outlining several formal and informal education efforts,

The Administration’s updated Strategy for American Innovation, released in October 2015, identifies nanotechnology as one of the emerging “general-purpose technologies”—a technology that, like the steam engine, electricity, and the Internet, will have a pervasive impact on our economy and our society, with the ability to create entirely new industries, create jobs, and increase productivity. To reap these benefits, we must train our Nation’s students for these high-tech jobs of the future. Fortunately, the multidisciplinary nature of nanotechnology and the unique and fascinating phenomena that occur at the nanoscale mean that nanotechnology is a perfect topic to inspire students to pursue careers in science, technology, engineering, and mathematics (STEM).

The Nanotechnology: Super Small Science series [mentioned in my Jan. 21, 2016 posting] is just the latest example of the National Nanotechnology Initiative (NNI)’s efforts to educate and inspire our Nation’s students. Other examples include:

The announcement that computer coding courses would be integrated into the US K-12 curriculum was made in US President Barack Obama’s 2016 State of the Union speech and covered in a Jan. 30, 2016 article by Jessica Hullinger for Fast Company,

In his final State Of The Union address earlier this month, President Obama called for providing hands-on computer science classes for all students to make them “job ready on day one.” Today, he is unveiling how he plans to do that with his upcoming budget.

The President’s Computer Science for All Initiative seeks to provide $4 billion in funding for states and an additional $100 million directly to school districts in a push to provide access to computer science training in K-12 public schools. The money would go toward things like training teachers, providing instructional materials, and getting kids involved in computer science early in elementary and middle school.

There are more details in Hullinger’s article and in a Jan. 30, 2016 White House blog posting by Megan Smith,

Computer Science for All is the President’s bold new initiative to empower all American students from kindergarten through high school to learn computer science and be equipped with the computational thinking skills they need to be creators in the digital economy, not just consumers, and to be active citizens in our technology-driven world. Our economy is rapidly shifting, and both educators and business leaders are increasingly recognizing that computer science (CS) is a “new basic” skill necessary for economic opportunity and social mobility.

CS for All builds on efforts already being led by parents, teachers, school districts, states, and private sector leaders from across the country.

Nothing says one approach has to be better than the other as there’s usually more than one way to accomplish a set of goals. As well, it’s unfair to expect a provincial government to emulate the federal government of a larger country with more money to spend. I just wish the BC government (a) had shared details such as the budget allotment for their initiative and (b) would hint at a more imaginative, long-range view of STEM education.

Going back to Estonia one last time, in addition to the country’s recent introduction of computer coding classes in grade school, it has also embarked on a nanotechnology/nanoscience educational and entrepreneurial programme as noted in my Sept. 30, 2014 posting,

The University of Tartu (Estonia) announced in a Sept. 29, 2014 press release an educational and entrepreneurial programme about nanotechnology/nanoscience for teachers and students,

To bring nanoscience closer to pupils, educational researchers of the University of Tartu decided to implement the European Union LLP Comenius project “Quantum Spin-Off – connecting schools with high-tech research and entrepreneurship”. The objective of the project is to build a kind of a bridge: at one end, pupils can familiarise themselves with modern science, and at the other, experience its application opportunities at high-tech enterprises. “We also wish to inspire these young people to choose a specialisation related to science and technology in the future,” added Lukk [Maarika Lukk, Coordinator of the project].

The pupils can choose between seven topics of nanotechnology: the creation of artificial muscles, microbiological fuel elements, manipulation of nanoparticles, nanoparticles and ionic liquids as oil additives, materials used in regenerative medicine, deposition and 3D-characterisation of atomically designed structures and a topic covered in English, “Artificial robotic fish with EAP elements”.

Learning is based on study modules in the field of nanotechnology. In addition, each team of pupils will read a scientific publication, selected for them by an expert of that particular field. In that way, pupils will develop an understanding of the field and of scientific texts. On the basis of the scientific publication, the pupils prepare their own research project and a business plan suitable for applying the results of the project.

In each field, experts of the University of Tartu will help to understand the topics. Participants will visit a nanotechnology research laboratory and enterprises using nanotechnologies.

The project lasts for two years and it is also implemented in Belgium, Switzerland and Greece.

As they say, time will tell.

Developing optical microscopes that measure features down to 10 nanometer level on computer chips

The US National Institute of Standards and Technology (NIST) issued a Dec. 2, 2015 news release (also on EurekAlert) announcing a new kind of optical microscope and its possible impact on the semiconductor industry,

National Institute of Standards and Technology (NIST) researchers are seeing the light, but in an altogether different way. And how they are doing it just might be the semiconductor industry’s ticket for extending its use of optical microscopes to measure computer chip features that are approaching 10 nanometers, tiny fractions of the wavelength of light.

The news release goes on to provide details and an explanation of scatterfield imaging,

Using a novel microscope that combines standard through-the-lens viewing with a technique called scatterfield imaging, the NIST team accurately measured patterned features on a silicon wafer that were 30 times smaller than the wavelength of light (450 nanometers) used to examine them. They report* that measurements of the etched lines–as thin as 16 nanometers wide–on the SEMATECH-fabricated wafer were accurate to one nanometer. With the technique, they spotted variations in feature dimensions amounting to differences of a few atoms.

Measurements were confirmed by those made with an atomic force microscope, which achieves sub-nanometer resolution, but is considered too slow for online quality-control measurements. Combined with earlier results, the NIST researchers write, the new proof-of-concept study* suggests that the innovative optical approach could be a “realistic solution to a very challenging problem” facing chip makers and others aiming to harness advances in nanotechnology. All need the means for “nondestructive measurement of nanometer-scale structures with sub-nanometer sensitivity while still having high throughput.”

Light-based, or optical, microscopes can’t “see” features smaller than the wavelength of light, at least not in the crisp detail necessary for making accurate measurements. However, light does scatter when it strikes so-called subwavelength features and patterned arrangements of such features. “Historically, we would ignore this scattered light because it did not yield sufficient resolution,” explains Richard Silver, the physicist who initiated NIST’s scatterfield imaging effort. “Now we know it contains helpful information that provides signatures telling us something about where the light came from.”

With scatterfield imaging, Silver and colleagues methodically illuminate a sample with polarized light from different angles. From this collection of scattered light–nothing more than a sea of wiggly lines to the untrained eye–the NIST team can extract characteristics of the bounced lightwaves that, together, reveal the geometry of features on the specimen.

Light-scattering data are gathered in slices, which together image the volume of scattered light above and into the sample. These slices are analyzed and reconstructed to create a three-dimensional representation. The process is akin to a CT scan, except that the slices are collections of interfering waves, not cross-sectional pictures.

“It’s the ensemble of data that tells us what we’re after,” says project leader Bryan Barnes. “We may not be able to see the lines on the wafer, but we can tell you what you need to know about them–their size, their shape, their spacing.”

Scatterfield imaging has critical prerequisites that must be met before it can yield useful data for high-accuracy measurements of exceedingly small features. Key steps entail detailed evaluation of the path light takes as it beams through lenses, apertures and other system elements before reaching the sample. The path traversed by light scattering from the specimen undergoes the same level of scrutiny. Fortunately, scatterfield imaging lends itself to thorough characterization of both sequences of optical devices, according to the researchers. These preliminary steps are akin to error mapping so that recognized sources of inaccuracy are factored out of the data.

The method also benefits from a little advance intelligence–the as-designed arrangement of circuit lines on a chip, down to the size of individual features. Knowing what is expected to be the result of the complex chip-making process sets up a classic matchup of theory vs. experiment.

The NIST researchers can use standard equations to simulate light scattering from an ideal, defect-free pattern and, in fact, any variation thereof. Using wave analysis software they developed, the team has assembled an indexed library of light-scattering reference models. So once a specimen is scanned, the team relies on computers to compare their real-world data to models and to find close matches.
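As a toy illustration of that library-matching step, with an invented one-dimensional "signature" function standing in for NIST's rigorous electromagnetic simulations, the search itself is just a nearest-model lookup:

```python
import numpy as np

# Toy library matching: precompute "scattering signatures" for a grid
# of candidate line widths, then pick the library entry closest (in a
# least-squares sense) to the measured signature. The signature model
# below is invented for illustration; it is not NIST's simulation.
angles = np.linspace(0.0, np.pi / 2, 50)       # illumination angles

def signature(width_nm):
    # Stand-in for a rigorous light-scattering model at 450 nm.
    return np.cos(2 * np.pi * width_nm * np.sin(angles) / 450.0)

library = {w: signature(w) for w in range(10, 31)}   # 10-30 nm lines

# "Measurement": a 16 nm line plus a little detector noise.
rng = np.random.default_rng(0)
measured = signature(16) + rng.normal(0.0, 1e-3, angles.size)

best = min(library, key=lambda w: np.sum((library[w] - measured) ** 2))
print(best)    # 16
```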

From there, succeeding rounds of analysis home in on the remaining differences, reducing them until the only ones that remain are due to variations in geometry such as irregularities in the height, width, or shape of a line.

Measurement results achieved with the NIST approach might be said to cast light itself in an entirely new light. Their new study, the researchers say, shows that once disregarded scattered light “contains a wealth of accessible optical information.”

Next steps include extending the technique to even shorter wavelengths of light, down to ultraviolet, or 193 nanometers. The aim is to accurately measure features as small as 5 nanometers.

This work is part of a larger NIST effort to supply measurement tools that enable the semiconductor industry to continue doubling the number of devices on a chip about every two years and to help other industries make products with nanoscale features. Recently, NIST and Intel researchers reported using an X-ray technique to accurately measure features on a silicon chip to within fractions of a nanometer.

Here’s a link to and a citation for a PDF of the paper,

Deep-subwavelength Nanometric Image Reconstruction using Fourier Domain Optical Normalization by Jing Qin, Richard M Silver, Bryan M Barnes, Hui Zhou, Ronald G Dixson, and Mark-Alexander Henn. Light: Science & Applications accepted article preview 5 November 2015; e16038 doi: 10.1038/lsa.2016.38

[Note:] This is a PDF file of an unedited peer-reviewed manuscript that has been accepted for publication. NPG are providing this early version of the manuscript as a service to our customers. The manuscript will undergo copy editing, typesetting and a proof review before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers apply.

This seems to be an open access paper but it is an early version.

US National Institute of Standards and Technology and molecules made of light (lightsabres anyone?)

As I recall, lightsabres are a Star Wars invention. I gather we’re a long way from running around with lightsabres but there is hope, if that should be your dream, according to a Sept. 9, 2015 news item on Nanowerk,

… a team including theoretical physicists from JQI [Joint Quantum Institute] and NIST [US National Institute of Standards and Technology] has taken another step toward building objects out of photons, and the findings hint that weightless particles of light can be joined into a sort of “molecule” with its own peculiar force.

Here’s an artist’s conception of the light “molecule” provided by the researchers,

Researchers show that two photons, depicted in this artist’s conception as waves (left and right), can be locked together at a short distance. Under certain conditions, the photons can form a state resembling a two-atom molecule, represented as the blue dumbbell shape at center. Credit: E. Edwards/JQI

A Sept. 8, 2015 NIST news release (also available on EurekAlert*), which originated the news item, provides more information about the research (Note: Links have been removed),

The findings build on previous research that several team members contributed to before joining NIST. In 2013, collaborators from Harvard, Caltech and MIT found a way to bind two photons together so that one would sit right atop the other, superimposed as they travel. Their experimental demonstration was considered a breakthrough, because no one had ever constructed anything by combining individual photons—inspiring some to imagine that real-life lightsabers were just around the corner.

Now, in a paper forthcoming in Physical Review Letters, the NIST and University of Maryland-based team (with other collaborators) has shown theoretically that by tweaking a few parameters of the binding process, photons could travel side by side, a specific distance from each other. The arrangement is akin to the way that two hydrogen atoms sit next to each other in a hydrogen molecule.

“It’s not a molecule per se, but you can imagine it as having a similar kind of structure,” says NIST’s Alexey Gorshkov. “We’re learning how to build complex states of light that, in turn, can be built into more complex objects. This is the first time anyone has shown how to bind two photons a finite distance apart.”

While the new findings appear to be a step in the right direction—if we can build a molecule of light, why not a sword?—Gorshkov says he is not optimistic that Jedi Knights will be lining up at NIST’s gift shop anytime soon. The main reason is that binding photons requires extreme conditions difficult to produce with a roomful of lab equipment, let alone fit into a sword’s handle. Still, there are plenty of other reasons to make molecular light—humbler than lightsabers, but useful nonetheless.

“Lots of modern technologies are based on light, from communication technology to high-definition imaging,” Gorshkov says. “Many of them would be greatly improved if we could engineer interactions between photons.”

For example, engineers need a way to precisely calibrate light sensors, and Gorshkov says the findings could make it far easier to create a “standard candle” that shines a precise number of photons at a detector. Perhaps more significant to industry, binding and entangling photons could allow computers to use photons as information processors, a job that electronic switches in your computer do today.

Not only would this provide a new basis for creating computer technology, but it also could result in substantial energy savings. Phone messages and other data that currently travel as light beams through fiber optic cables have to be converted into electrons for processing–an inefficient step that wastes a great deal of electricity. If both the transport and the processing of the data could be done with photons directly, it could reduce these energy losses.

Gorshkov says it will be important to test the new theory in practice for these and other potential benefits.

“It’s a cool new way to study photons,” he says. “They’re massless and fly at the speed of light. Slowing them down and binding them may show us other things we didn’t know about them before.”

Here are links and citations for the paper. First, there’s an early version on arXiv.org; then there’s the peer-reviewed version, which is not yet available,

Coulomb bound states of strongly interacting photons by M. F. Maghrebi, M. J. Gullans, P. Bienias, S. Choi, I. Martin, O. Firstenberg, M. D. Lukin, H. P. Büchler, A. V. Gorshkov. arXiv:1505.03859 [quant-ph] (or arXiv:1505.03859v1 [quant-ph] for this version)

Coulomb bound states of strongly interacting photons by M. F. Maghrebi, M. J. Gullans, P. Bienias, S. Choi, I. Martin, O. Firstenberg, M. D. Lukin, H. P. Büchler, and A. V. Gorshkov.
Phys. Rev. Lett. forthcoming in September 2015.

The first version (arXiv) is open access and I’m not sure whether the Physical Review Letters study will be behind a paywall or available as an open access paper.

*EurekAlert link added 10:34 am PST on Sept. 11, 2015.