Tag Archives: NIST

Carbon nanotubes: faster, cheaper, easier, and more consistent

One of the big problems with nanomaterials is production: achieving consistent size and shape. It seems that scientists at the US National Institute of Standards and Technology (NIST) have developed a technique for producing carbon nanotubes (CNTs) that addresses these issues. From a July 19, 2016 news item on Nanotechnology Now,

Just as many of us might be resigned to clogged salt shakers or rush-hour traffic, those working to exploit the special properties of carbon nanotubes have typically shrugged their shoulders when these tiniest of cylinders fill with water during processing. But for nanotube practitioners who have reached their Popeye threshold and “can’t stands no more,” the National Institute of Standards and Technology (NIST) has devised a cheap, quick and effective strategy that reliably enhances the quality and consistency of the materials–important for using them effectively in applications such as new computing technologies.

To prevent filling of the cores of single-wall carbon nanotubes with water or other detrimental substances, the NIST researchers advise intentionally prefilling them with a desired chemical of known properties. Taking this step before separating and dispersing the materials, usually done in water, yields a consistently uniform collection of nanotubes. In quantity and quality, the results are superior to water-filled nanotubes, especially for optical applications such as sensors and photodetectors.

A July 15, 2016 NIST news release, which originated the news item, expands on the theme,

The approach opens a straightforward route for engineering the properties of single-wall carbon nanotubes—rolled up sheets of carbon atoms arranged like chicken wire or honeycombs—with improved or new properties.

“This approach is so easy, inexpensive and broadly useful that I can’t think of a reason not to use it,” said NIST chemical engineer Jeffrey Fagan.

In their proof-of-concept experiments, the NIST team inserted more than 20 different compounds into an assortment of single-wall carbon nanotubes with an interior diameter that ranged from more than 2 down to about 0.5 nanometers. Led by visiting researcher Jochen Campo, the scientists tested their strategy by using hydrocarbons called alkanes as fillers.

The alkanes, which include such familiar compounds as propane and butane, served to render the nanotube interiors unreactive. In other words, the alkane-filled nanotubes behaved almost as if they were empty—precisely the goal of Campo, Fagan and colleagues.

Compared with nanotubes filled with water and possibly ions, acids and other unwanted chemicals encountered during processing, empty nanotubes possess far superior properties. For example, when stimulated by light, empty carbon nanotubes fluoresce far brighter and with sharper signals.

Yet, “spontaneous ingestion” of water or other solvents by the nanotubes during processing is an “endemic but often neglected phenomenon with strong implications for the development of nanotube applications,” the NIST team wrote in a recent article in Nanoscale Horizons.

Perhaps because of the additional cost and effort required to filter out and gather nanotubes, researchers tend to tolerate mixed batches of unfilled (empty) and mostly filled single-wall carbon nanotubes. Separating unfilled nanotubes from these mixtures requires expensive ultracentrifuge equipment and, even then, the yield is only about 10 percent, Campo estimates.

“If your goal is to use nanotubes for electronic circuits, for example, or for fluorescent anti-cancer image contrast agents, then you require much greater quantities of materials of consistent composition and quality,” explained Campo, who was exploring these applications while doing postdoctoral research at the University of Antwerp. “This particular need inspired development of the new prefilling method by asking the question, can we put some passive chemical into the nanotube instead to keep the water out.”

From the very first simple experiments, the answer was yes. And the benefits can be significant. In fluorescence experiments, alkane-filled nanotubes emitted signals two to three times stronger than those emitted by water-filled nanotubes. Performance approached that of empty nanotubes—the gold standard for these comparisons.

As important, the NIST-developed prefilling strategy is controllable, versatile and easily incorporated into existing methods for processing single-wall carbon nanotubes, according to the researchers.

Here’s a link to and citation for the paper,

Enhancing single-wall carbon nanotube properties through controlled endohedral filling by J. Campo, Y. Piao, S. Lam, C. M. Stafford, J. K. Streit, J. R. Simpson, A. R. Hight Walker, and J. A. Fagan. Nanoscale Horiz., 2016, 1, 317–324. DOI: 10.1039/C6NH00062B. First published online 10 May 2016.

This paper is open access but you do need to register on the site (it is a free registration).

US Nanotechnology Initiative for water sustainability

Tuesday, March 22, 2016 was World Water Day and to coincide with that event the US National Nanotechnology Initiative (NNI), in collaboration with several other agencies, announced a new ‘signature initiative’. From a March 24, 2016 news item on Nanowerk (Note: A link has been removed),

As a part of the White House Water Summit held yesterday on World Water Day, the Federal agencies participating in the National Nanotechnology Initiative (NNI) announced the launch of a Nanotechnology Signature Initiative (NSI), Water Sustainability through Nanotechnology: Nanoscale Solutions for a Global-Scale Challenge.

A March 23, 2016 NNI news release provides more information about why this initiative is important,

Access to clean water remains one of the world’s most pressing needs. As today’s White House Office of Science and Technology blog post explains, “the small size and exceptional properties of engineered nanomaterials are particularly promising for addressing the key technical challenges related to water quality and quantity.”

“One cannot find an issue more critical to human life and global security than clean, plentiful, and reliable water sources,” said Dr. Michael Meador, Director of the National Nanotechnology Coordination Office (NNCO). “Through the NSI mechanism, NNI member agencies will have an even greater ability to make meaningful strides toward this initiative’s thrust areas: increasing water availability, improving the efficiency of water delivery and use, and enabling next-generation water monitoring systems.”

A March 23, 2016 US White House blog posting by Lloyd Whitman and Lisa Friedersdorf describes the efforts in more detail (Note: A link has been removed),

The small size and exceptional properties of engineered nanomaterials are particularly promising for addressing the pressing technical challenges related to water quality and quantity. For example, the increased surface area—a cubic centimeter of nanoparticles has a surface area larger than a football field—and reactivity of nanometer-scale particles can be exploited to create catalysts for water purification that do not require rare or precious metals. And composites incorporating nanomaterials such as carbon nanotubes might one day enable stronger, lighter, and more durable piping systems and components. Under this NSI, Federal agencies will coordinate and collaborate to more rapidly develop nanotechnology-enabled solutions in three main thrusts: [thrust 1] increasing water availability; [thrust 2] improving the efficiency of water delivery and use; and [thrust 3] enabling next-generation water monitoring systems.

A technical “white paper” released by the agencies this week highlights key technical challenges for each thrust, identifies key objectives to overcome those challenges, and notes areas of research and development where nanotechnology promises to provide the needed solutions. By shining a spotlight on these areas, the new NSI will increase Federal coordination and collaboration, including with public and private stakeholders, which is vital to making progress in these areas. The additional focus and associated collective efforts will advance stewardship of water resources to support the essential food, energy, security, and environment needs of all stakeholders.

We applaud the commitment of the Federal agencies who will participate in this effort—the Department of Commerce/National Institute of Standards and Technology, Department of Energy, Environmental Protection Agency, National Aeronautics and Space Administration, National Science Foundation, and U.S. Department of Agriculture/National Institute of Food and Agriculture. As made clear at this week’s White House Water Summit, the world’s water systems are under tremendous stress, and new and emerging technologies will play a critical role in ensuring a sustainable water future.
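As an aside, that football-field comparison is easy to sanity-check. If a volume V is divided into spheres of radius r, the total surface area works out to 3V/r (the number of spheres multiplied by the area of each), so the area grows rapidly as the particles shrink. Here's a minimal Python sketch; the particle radii are arbitrary example values of mine, not figures from the white paper:

```python
def total_surface_area(volume_m3, radius_m):
    # n = V / ((4/3) * pi * r^3) spheres, each with area 4 * pi * r^2,
    # so the total surface area simplifies to A = 3 * V / r.
    return 3 * volume_m3 / radius_m

volume = 1e-6  # one cubic centimetre, expressed in m^3
for radius_nm in (100, 10, 5, 1):
    area = total_surface_area(volume, radius_nm * 1e-9)
    print(f"r = {radius_nm:>3} nm -> total surface area ~ {area:,.0f} m^2")
```

At a radius of a few nanometres, that cubic centimetre of particles presents hundreds to thousands of square metres of surface, the same order of magnitude as a playing field.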

The white paper (12 pp.) is titled: Water Sustainability through Nanotechnology: Nanoscale Solutions for a Global-Scale Challenge and describes the thrusts in more detail.

A March 22, 2016 US White House fact sheet lays out more details including funding,

The commitments and announcements made that day include:

  • Nearly $4 billion in private capital committed to investment in a broad range of water-infrastructure projects nationwide. This includes $1.5 billion from Ultra Capital to finance decentralized and scalable water-management solutions, and $500 million from Sustainable Water to develop water reclamation and reuse systems.
  • More than $1 billion from the private sector over the next decade to conduct research and development into new technologies. This includes $500 million from GE to fuel innovation, expertise, and global capabilities in advanced water, wastewater, and reuse technologies.
  • A Presidential Memorandum and supporting Action Plan on building national capabilities for long-term drought resilience in the United States, including by setting drought resilience policy goals, directing specific drought resilience activities to be completed by the end of the year, and permanently establishing the National Drought Resilience Partnership as an interagency task force responsible for coordinating drought-resilience, response, and recovery efforts.
  • Nearly $35 million this year in Federal grants from the Environmental Protection Agency, the National Oceanic and Atmospheric Administration, the National Science Foundation, and the U.S. Department of Agriculture to support cutting-edge water science.
  • The release of a new National Water Model that will dramatically enhance the Nation’s river-forecasting capabilities by delivering forecasts for approximately 2.7 million locations, up from 4,000 locations today (a 700-fold increase in forecast density).

This seems promising and hopefully other countries will follow suit.

Monsieur Kilogram: an SI (International System of Units) superhero

I wouldn’t have thought that measurement was such a crucial issue that it would require a superhero team, but I have to admit the folks at the US National Institute of Standards and Technology (NIST) make a compelling case in a Feb. 2, 2016 NIST news release (also on EurekAlert but dated Feb. 4, 2016),

The nefarious Major Uncertainty has kidnapped Monsieur Kilogram, putting the world’s measurements of mass in jeopardy. As the world spirals into “Mass Hysteria,” the remaining SI Superheroes, champions of the metric system, leap into action to save the day, and hopefully Monsieur Kilogram as well.

This crisis kicks off the third and latest adventure from the League of SI Superheroes, the animated online series from the National Institute of Standards and Technology (NIST). “Mass Hysteria” touches upon a topic—how to redefine the basic unit of mass known as the kilogram—that represents a cutting-edge undertaking for researchers working to modernize the worldwide metric measurement system known as the International System of Units (SI).

From the very big to the very small, accurately measuring mass is important in the world around us. For example, many of the products you buy at the grocery store and other places are sold by mass or the related quantity of weight. Roughly speaking, mass is the amount and type of “stuff” there is in something, and weight is the force pulling on the mass by gravity. The masses of every ingredient in medications from aspirin tablets to cancer drugs are carefully measured to ensure that they are both safe and effective. In many cases, medical doctors consider the mass of the patient to determine the dosage of the medications they prescribe as well. And both the fuel and the amount of thrust produced by the huge engines that power airplanes and rockets depend on mass.

Small errors of even a few milligrams per kilogram may not sound like much, but they can be costly when measuring huge quantities of something like a tanker ship full of grain or oil. With medicines, slightly too little of a chemical could make it ineffective and slightly too much could be toxic.
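To put a number on the shipping example (the cargo size is my own illustration, not from the news release): a bias of 5 milligrams per kilogram is five parts per million, so on a 100,000-tonne cargo,

```latex
% 100,000 tonnes = 10^8 kg; a 5 mg/kg bias is a fraction of 5e-6:
10^{8}\ \text{kg} \times 5 \times 10^{-6} = 500\ \text{kg}
```

That is half a tonne of grain or oil miscounted on a single shipment.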

Here’s a peek at Monsieur Kilogram, who seems like a pretty tough customer,

Monsieur Kilogram, a character in NIST’s League of SI Superheroes animated online series, is able to determine the mass of any object simply by holding it. ©NIST

Getting back to the news release and the importance of accurate measurement,

Being the last standard unit of measure still based on an actual physical object, in this case a golf-ball-sized cylinder of platinum and iridium, the kilogram is vulnerable to damage, as well as being lost or even stolen. While the international prototype kilogram itself cannot change because it is the kilogram by definition, copies of the international prototype that many countries use as their standard of mass have been gaining or losing mass relative to it.

The SI Superheroes’ latest episode briefly explores one of the efforts to redefine the kilogram in terms of natural forces called the watt balance, a complex machine that uses electric and magnetic forces to balance a 1-kilogram mass against the Earth’s gravity. Precise measurements related to these forces can then be used to provide a consistent definition of the amount of mass in the kilogram.

While the superheroes’ antics are not exactly representative of the efforts of actual researchers, super scientists at NIST and elsewhere have been working for years to reduce the errors in their measurement of this quantity to the point where the watt balance can take over for the international prototype kilogram.

They are closing in on their goal, and it is widely anticipated that the kilogram will be redefined in 2018.

Once this process is complete, the kilogram will have been freed from its dependence on a physical object, and anyone with the right technical expertise and equipment will be able to determine the mass of a kilogram for themselves.
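For the curious, the heart of a watt balance is two measurement modes that together eliminate the hard-to-characterize magnet and coil geometry. Here is a sketch of the standard relations (simplified; the real instrument demands extraordinary control of alignment, fields and vibration):

```latex
% Weighing mode: the magnetic force on a coil (field B, wire length L,
% current I) balances the weight of the mass m:
mg = BIL
% Moving mode: the same coil, swept through the field at velocity v,
% generates a voltage U:
U = BLv
% Dividing the two relations eliminates the geometry factor BL:
mgv = UI \quad \Longrightarrow \quad m = \frac{UI}{gv}
```

Because the voltage and current are realized through the Josephson and quantum Hall effects, both of which are governed by Planck’s constant, the kilogram ends up tied to a fixed constant of nature rather than to a lump of metal. Getting back to the news release,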

Will the SI Superheroes finish the watt balance in time? Watch the next episode and find out! Modeled on the seven base units of the International System of Units, or SI, the members of the League of SI Superheroes are:

Meter Man: With his laser interferometer eyes, graduated arms and extendable body, no dimension is too big or too small for Meter Man to measure.

The Mole: Able to sniff out and count the atoms of every element, the Mole is a master of chemistry.

Professor Second: By reading the vibrations of her laser-cooled cesium atoms, Professor Second can measure any frequency and calibrate any clock.

Monsieur Kilogram: Monsieur Kilogram loves lifting weights, and it shows. He is able to determine the mass of any object simply by holding it.

Ms. Ampere: Ms. Ampere rules the flow of electrons—electrical current—and makes sure that the right amount gets where it needs to go.

Dr. Kelvin: Dr. Kelvin heats up or cools down objects by speeding up or slowing down particles inside them. He can also measure the temperature of anything in the universe with his trusty thermometer.

Candela: Don’t let her small size fool you. Candela’s power over light helps to brighten the whole world.

Catch up on their adventures at The League of SI Superheroes kids’ page. Teachers can also request a classroom set of SI educational materials by submitting their contact information and grade level to TheSI@nist.gov.

Here’s the latest adventure,

Enjoy!

A study in contrasts: innovation and education strategies in US and British Columbia (Canada)

It’s always interesting to contrast two approaches to the same issue, in this case, innovation and education strategies designed to improve the economies of the United States and of British Columbia, a province in Canada.

One of the major differences regarding education in the US and in Canada is that the Canadian federal government, unlike the US federal government, has no jurisdiction over the matter. Education is strictly a provincial responsibility.

I recently wrote a commentary (a Jan. 19, 2016 posting) about the BC government’s Jan. 18, 2016 announcement of its innovation strategy with a special emphasis on the education aspect. Premier Christy Clark focused largely on the notion of embedding courses on computer coding in schools from K-12 (kindergarten through grade 12), as Jonathon Narvey noted in his Jan. 19, 2016 event recap for Betakit,

While many in the tech sector will be focused on the short-term benefits of a quick injection of large capital [a $100M BC Tech Fund as part of a new strategy was announced in Dec. 2015 but details about the new #BCTECH Strategy were not shared until Jan. 18, 2016], the long-term benefits for the local tech sector are being seeded in local schools. More than 600,000 BC students will be getting basic skills in the K-12 curriculum, with coding academies, more work experience electives and partnerships between high school and post-secondary institutions.

Here’s what I had to say in my commentary (from the Jan. 19, 2016 posting),

… the government wants to embed computer coding into the education system for K-12 (kindergarten to grade 12). One determined reporter (Canadian Press if memory serves) attempted to find out how much this would cost. No answer was forthcoming although there were many words expended. Whether this failure was due to ignorance (disturbing!) or a reluctance to share (also disturbing!) was impossible to tell. Another reporter (Georgia Straight) asked about equipment (coding can be taught with pen and paper but hardware is better). … Getting back to the reporter’s question, no answer was forthcoming although the speaker was loquacious.

Another reporter asked if the government had found any jurisdictions doing anything similar regarding computer coding. It seems they did consider other jurisdictions although it was claimed that BC is the first to strike out in this direction. Oddly, no one mentioned Estonia, known in some circles as E-stonia, where the entire school system was online by the late 1990s in an initiative known as the ‘Tiger Leap Foundation’, which also supported computer coding classes in secondary school (there’s more in Tim Mansel’s May 16, 2013 article about Estonia’s then-latest initiative to embed computer coding into grade school.) …

Aside from the BC government’s failure to provide details, I am uncomfortable with what I see as an overemphasis on computer coding that suggests a narrow focus on what constitutes a science and technology strategy for education. I find the US approach closer to what I favour although I may be biased since they are building their strategy around nanotechnology education.

The US approach had been announced in dribs and drabs until recently when a Jan. 26, 2016 news item on Nanotechnology Now indicated a broad-based plan for nanotechnology education (and computer coding),

Over the past 15 years, the Federal Government has invested over $22 billion in R&D under the auspices of the National Nanotechnology Initiative (NNI) to understand and control matter at the nanoscale and develop applications that benefit society. As these nanotechnology-enabled applications become a part of everyday life, it is important for students to have a basic understanding of material behavior at the nanoscale, and some states have even incorporated nanotechnology concepts into their K-12 science standards. Furthermore, application of the novel properties that exist at the nanoscale, from gecko-inspired climbing gloves and invisibility cloaks, to water-repellent coatings on clothes or cellphones, can spark students’ excitement about science, technology, engineering, and mathematics (STEM).

An earlier Jan. 25, 2016 White House blog posting by Lisa Friedersdorf and Lloyd Whitman introduced the notion that nanotechnology is viewed as foundational and a springboard for encouraging interest in STEM (science, technology, engineering, and mathematics) careers while outlining several formal and informal education efforts,

The Administration’s updated Strategy for American Innovation, released in October 2015, identifies nanotechnology as one of the emerging “general-purpose technologies”—a technology that, like the steam engine, electricity, and the Internet, will have a pervasive impact on our economy and our society, with the ability to create entirely new industries, create jobs, and increase productivity. To reap these benefits, we must train our Nation’s students for these high-tech jobs of the future. Fortunately, the multidisciplinary nature of nanotechnology and the unique and fascinating phenomena that occur at the nanoscale mean that nanotechnology is a perfect topic to inspire students to pursue careers in science, technology, engineering, and mathematics (STEM).

The Nanotechnology: Super Small Science series [mentioned in my Jan. 21, 2016 posting] is just the latest example of the National Nanotechnology Initiative (NNI)’s efforts to educate and inspire our Nation’s students. Other examples include:

The announcement about computer coding courses being integrated into US K-12 education curricula was made in US President Barack Obama’s 2016 State of the Union speech and covered in a Jan. 30, 2016 article by Jessica Hullinger for Fast Company,

In his final State Of The Union address earlier this month, President Obama called for providing hands-on computer science classes for all students to make them “job ready on day one.” Today, he is unveiling how he plans to do that with his upcoming budget.

The President’s Computer Science for All Initiative seeks to provide $4 billion in funding for states and an additional $100 million directly to school districts in a push to provide access to computer science training in K-12 public schools. The money would go toward things like training teachers, providing instructional materials, and getting kids involved in computer science early in elementary and middle school.

There are more details in Hullinger’s article and in a Jan. 30, 2016 White House blog posting by Megan Smith,

Computer Science for All is the President’s bold new initiative to empower all American students from kindergarten through high school to learn computer science and be equipped with the computational thinking skills they need to be creators in the digital economy, not just consumers, and to be active citizens in our technology-driven world. Our economy is rapidly shifting, and both educators and business leaders are increasingly recognizing that computer science (CS) is a “new basic” skill necessary for economic opportunity and social mobility.

CS for All builds on efforts already being led by parents, teachers, school districts, states, and private sector leaders from across the country.

Nothing says one approach has to be better than the other as there’s usually more than one way to accomplish a set of goals. As well, it’s unfair to expect a provincial government to emulate the federal government of a larger country with more money to spend. I just wish the BC government (a) had shared details such as the budget allotment for their initiative and (b) would hint at a more imaginative, long-range view of STEM education.

Going back to Estonia one last time, in addition to the country’s recent introduction of computer coding classes in grade school, it has also embarked on a nanotechnology/nanoscience educational and entrepreneurial programme as noted in my Sept. 30, 2014 posting,

The University of Tartu (Estonia) announced in a Sept. 29, 2014 press release an educational and entrepreneurial programme about nanotechnology/nanoscience for teachers and students,

To bring nanoscience closer to pupils, educational researchers of the University of Tartu decided to implement the European Union LLP Comenius project “Quantum Spin-Off – connecting schools with high-tech research and entrepreneurship”. The objective of the project is to build a kind of a bridge: at one end, pupils can familiarise themselves with modern science, and at the other, experience its application opportunities at high-tech enterprises. “We also wish to inspire these young people to choose a specialisation related to science and technology in the future,” added Lukk [Maarika Lukk, Coordinator of the project].

The pupils can choose between seven topics of nanotechnology: the creation of artificial muscles, microbiological fuel elements, manipulation of nanoparticles, nanoparticles and ionic liquids as oil additives, materials used in regenerative medicine, deposition and 3D-characterisation of atomically designed structures and a topic covered in English, “Artificial robotic fish with EAP elements”.

Learning is based on study modules in the field of nanotechnology. In addition, each team of pupils will read a scientific publication, selected for them by an expert of that particular field. In that way, pupils will develop an understanding of the field and of scientific texts. On the basis of the scientific publication, the pupils prepare their own research project and a business plan suitable for applying the results of the project.

In each field, experts of the University of Tartu will help to understand the topics. Participants will visit a nanotechnology research laboratory and enterprises using nanotechnologies.

The project lasts for two years and it is also implemented in Belgium, Switzerland and Greece.

As they say, time will tell.

Developing optical microscopes that measure features down to the 10 nanometer level on computer chips

The US National Institute of Standards and Technology (NIST) issued a Dec. 2, 2015 news release (also on EurekAlert) announcing a new kind of optical microscope and its possible impact on the semiconductor industry,

National Institute of Standards and Technology (NIST) researchers are seeing the light, but in an altogether different way. And how they are doing it just might be the semiconductor industry’s ticket for extending its use of optical microscopes to measure computer chip features that are approaching 10 nanometers, tiny fractions of the wavelength of light.

The news release goes on to provide details and an explanation of scatterfield imaging,

Using a novel microscope that combines standard through-the-lens viewing with a technique called scatterfield imaging, the NIST team accurately measured patterned features on a silicon wafer that were 30 times smaller than the wavelength of light (450 nanometers) used to examine them. They report that measurements of the etched lines–as thin as 16 nanometers wide–on the SEMATECH-fabricated wafer were accurate to one nanometer. With the technique, they spotted variations in feature dimensions amounting to differences of a few atoms.

Measurements were confirmed by those made with an atomic force microscope, which achieves sub-nanometer resolution, but is considered too slow for online quality-control measurements. Combined with earlier results, the NIST researchers write, the new proof-of-concept study suggests that the innovative optical approach could be a “realistic solution to a very challenging problem” facing chip makers and others aiming to harness advances in nanotechnology. All need the means for “nondestructive measurement of nanometer-scale structures with sub-nanometer sensitivity while still having high throughput.”

Light-based, or optical, microscopes can’t “see” features smaller than the wavelength of light, at least not in the crisp detail necessary for making accurate measurements. However, light does scatter when it strikes so-called subwavelength features and patterned arrangements of such features. “Historically, we would ignore this scattered light because it did not yield sufficient resolution,” explains Richard Silver, the physicist who initiated NIST’s scatterfield imaging effort. “Now we know it contains helpful information that provides signatures telling us something about where the light came from.”

With scatterfield imaging, Silver and colleagues methodically illuminate a sample with polarized light from different angles. From this collection of scattered light–nothing more than a sea of wiggly lines to the untrained eye–the NIST team can extract characteristics of the bounced lightwaves that, together, reveal the geometry of features on the specimen.

Light-scattering data are gathered in slices, which together image the volume of scattered light above and into the sample. These slices are analyzed and reconstructed to create a three-dimensional representation. The process is akin to a CT scan, except that the slices are collections of interfering waves, not cross-sectional pictures.

“It’s the ensemble of data that tells us what we’re after,” says project leader Bryan Barnes. “We may not be able to see the lines on the wafer, but we can tell you what you need to know about them–their size, their shape, their spacing.”

Scatterfield imaging has critical prerequisites that must be met before it can yield useful data for high-accuracy measurements of exceedingly small features. Key steps entail detailed evaluation of the path light takes as it beams through lenses, apertures and other system elements before reaching the sample. The path traversed by light scattering from the specimen undergoes the same level of scrutiny. Fortunately, scatterfield imaging lends itself to thorough characterization of both sequences of optical devices, according to the researchers. These preliminary steps are akin to error mapping so that recognized sources of inaccuracy are factored out of the data.

The method also benefits from a little advance intelligence–the as-designed arrangement of circuit lines on a chip, down to the size of individual features. Knowing what is expected to be the result of the complex chip-making process sets up a classic matchup of theory vs. experiment.

The NIST researchers can use standard equations to simulate light scattering from an ideal, defect-free pattern and, in fact, any variation thereof. Using wave analysis software they developed, the team has assembled an indexed library of light-scattering reference models. So once a specimen is scanned, the team relies on computers to compare their real-world data to models and to find close matches.

From there, succeeding rounds of analysis home in on the remaining differences, reducing them until the only ones that remain are due to variations in geometry such as irregularities in the height, width, or shape of a line.

Measurement results achieved with the NIST approach might be said to cast light itself in an entirely new light. Their new study, the researchers say, shows that once disregarded scattered light “contains a wealth of accessible optical information.”

Next steps include extending the technique to even shorter wavelengths of light, down to ultraviolet, or 193 nanometers. The aim is to accurately measure features as small as 5 nanometers.

This work is part of a larger NIST effort to supply measurement tools that enable the semiconductor industry to continue doubling the number of devices on a chip about every two years and to help other industries make products with nanoscale features. Recently, NIST and Intel researchers reported using an X-ray technique to accurately measure features on a silicon chip to within fractions of a nanometer.
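To make the library-matching step described above a little more concrete, here is a toy sketch in Python. The signature model below is invented purely for illustration (the real reference library comes from rigorous electromagnetic simulations of the measurement); what it shows is the workflow: precompute signatures over a grid of candidate geometries, pick the closest match to the measured data by least squares, then refine from that starting point.

```python
import numpy as np

def simulated_signature(width_nm, height_nm, angles):
    # Stand-in for a rigorous electromagnetic simulation; this toy model
    # just returns a smooth curve that depends on the line geometry.
    return np.cos(angles * width_nm / 10.0) * height_nm / 50.0

angles = np.linspace(0.1, 1.0, 64)  # illumination angles in radians

# Indexed library of reference signatures over candidate line geometries.
widths = np.arange(14.0, 18.5, 0.5)   # line widths, nm
heights = np.arange(40.0, 61.0, 2.0)  # line heights, nm
library = {(w, h): simulated_signature(w, h, angles)
           for w in widths for h in heights}

# A "measured" signature for the demo: a 16 nm x 50 nm line plus noise.
rng = np.random.default_rng(0)
measured = simulated_signature(16.0, 50.0, angles) + rng.normal(0.0, 0.01, angles.size)

# Library search: the entry minimizing the sum-of-squares residual.
best = min(library, key=lambda g: np.sum((library[g] - measured) ** 2))
print(f"closest match: width = {best[0]} nm, height = {best[1]} nm")
```

In the real system, the succeeding rounds of analysis would then fit the geometry continuously around that match until only noise remains.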

Here’s a link to and a citation for a PDF of the paper,

Deep-subwavelength Nanometric Image Reconstruction using Fourier Domain Optical Normalization by Jing Qin, Richard M. Silver, Bryan M. Barnes, Hui Zhou, Ronald G. Dixson, and Mark-Alexander Henn. Light: Science & Applications accepted article preview 5 November 2015; e16038 doi: 10.1038/lsa.2016.38

[Note:] This is a PDF file of an unedited peer-reviewed manuscript that has been accepted for publication. NPG are providing this early version of the manuscript as a service to our customers. The manuscript will undergo copy editing, typesetting and a proof review before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers apply.

This seems to be an open access paper but it is an early version.

US National Institute of Standards and Technology and molecules made of light (lightsabres anyone?)

As I recall, lightsabres are a Star Wars invention. I gather we’re a long way from running around with lightsabres, but there is hope, if that should be your dream, according to a Sept. 9, 2015 news item on Nanowerk,

… a team including theoretical physicists from JQI [Joint Quantum Institute] and NIST [US National Institute of Standards and Technology] has taken another step toward building objects out of photons, and the findings hint that weightless particles of light can be joined into a sort of “molecule” with its own peculiar force.

Here’s an artist’s conception of the light “molecule” provided by the researchers,

Researchers show that two photons, depicted in this artist’s conception as waves (left and right), can be locked together at a short distance. Under certain conditions, the photons can form a state resembling a two-atom molecule, represented as the blue dumbbell shape at center. Credit: E. Edwards/JQI

A Sept. 8, 2015 NIST news release (also available on EurekAlert*), which originated the news item, provides more information about the research (Note: Links have been removed),

The findings build on previous research that several team members contributed to before joining NIST. In 2013, collaborators from Harvard, Caltech and MIT found a way to bind two photons together so that one would sit right atop the other, superimposed as they travel. Their experimental demonstration was considered a breakthrough, because no one had ever constructed anything by combining individual photons—inspiring some to imagine that real-life lightsabers were just around the corner.

Now, in a paper forthcoming in Physical Review Letters, the NIST and University of Maryland-based team (with other collaborators) has shown theoretically that by tweaking a few parameters of the binding process, photons could travel side by side, a specific distance from each other. The arrangement is akin to the way that two hydrogen atoms sit next to each other in a hydrogen molecule.

“It’s not a molecule per se, but you can imagine it as having a similar kind of structure,” says NIST’s Alexey Gorshkov. “We’re learning how to build complex states of light that, in turn, can be built into more complex objects. This is the first time anyone has shown how to bind two photons a finite distance apart.”

While the new findings appear to be a step in the right direction—if we can build a molecule of light, why not a sword?—Gorshkov says he is not optimistic that Jedi Knights will be lining up at NIST’s gift shop anytime soon. The main reason is that binding photons requires extreme conditions difficult to produce with a roomful of lab equipment, let alone fit into a sword’s handle. Still, there are plenty of other reasons to make molecular light—humbler than lightsabers, but useful nonetheless.

“Lots of modern technologies are based on light, from communication technology to high-definition imaging,” Gorshkov says. “Many of them would be greatly improved if we could engineer interactions between photons.”

For example, engineers need a way to precisely calibrate light sensors, and Gorshkov says the findings could make it far easier to create a “standard candle” that shines a precise number of photons at a detector. Perhaps more significant to industry, binding and entangling photons could allow computers to use photons as information processors, a job that electronic switches in your computer do today.

Not only would this provide a new basis for creating computer technology, but it also could result in substantial energy savings. Phone messages and other data that currently travel as light beams through fiber optic cables have to be converted into electrons for processing—an inefficient step that wastes a great deal of electricity. If both the transport and the processing of the data could be done with photons directly, it could reduce these energy losses.

Gorshkov says it will be important to test the new theory in practice for these and other potential benefits.

“It’s a cool new way to study photons,” he says. “They’re massless and fly at the speed of light. Slowing them down and binding them may show us other things we didn’t know about them before.”

Here are links and citations for the paper. First, there’s an early version on arXiv.org and then there’s the peer-reviewed version, which is not yet available,

Coulomb bound states of strongly interacting photons by M. F. Maghrebi, M. J. Gullans, P. Bienias, S. Choi, I. Martin, O. Firstenberg, M. D. Lukin, H. P. Büchler, A. V. Gorshkov. arXiv:1505.03859 [quant-ph] (or arXiv:1505.03859v1 [quant-ph] for this version)

Coulomb bound states of strongly interacting photons by M. F. Maghrebi, M. J. Gullans, P. Bienias, S. Choi, I. Martin, O. Firstenberg, M. D. Lukin, H. P. Büchler, and A. V. Gorshkov.
Phys. Rev. Lett. forthcoming in September 2015.

The first version (arXiv) is open access and I’m not sure whether the Physical Review Letters study will be behind a paywall or be available as an open access paper.

*EurekAlert link added 10:34 am PST on Sept. 11, 2015.

Nanotechnology research protocols for Environment, Health and Safety Studies in the US and a nanomedicine characterization laboratory in the European Union

I have two items relating to nanotechnology and the development of protocols. The first item concerns the launch of a new web portal by the US National Institute of Standards and Technology.

US National Institute of Standards and Technology (NIST)

From a July 1, 2015 news item on Azonano,

As engineered nanomaterials increasingly find their way into commercial products, researchers who study the potential environmental or health impacts of those materials face a growing challenge to accurately measure and characterize them. These challenges affect measurements of basic chemical and physical properties as well as toxicology assessments.

To help nano-EHS (Environment, Health and Safety) researchers navigate the often complex measurement issues, the National Institute of Standards and Technology (NIST) has launched a new website devoted to NIST-developed (or co-developed) and validated laboratory protocols for nano-EHS studies.

A July 1, 2015 NIST news release on EurekAlert, which originated the news item, offers more details about the information available through the web portal,

In common lab parlance, a “protocol” is a specific step-by-step procedure used to carry out a measurement or related activity, including all the chemicals and equipment required. Any peer-reviewed journal article reporting an experimental result has a “methods” section where the authors document their measurement protocol, but those descriptions are necessarily brief and condensed, and may lack validation of any sort. By comparison, on NIST’s new Protocols for Nano-EHS website the protocols are extraordinarily detailed. For ease of citation, they’re published individually–each with its own unique digital object identifier (DOI).

The protocols detail not only what you should do, but why and what could go wrong. The specificity is important, according to program director Debra Kaiser, because of the inherent difficulty of making reliable measurements of such small materials. “Often, if you do something seemingly trivial–use a different size pipette, for example–you get a different result. Our goal is to help people get data they can reproduce, data they can trust.”

A typical caution, for example, notes that if you’re using an instrument that measures the size of nanoparticles in a solution by how they scatter light, it’s important also to measure the transmission spectrum of the particles if they’re colored, because if they happen to absorb light strongly at the same frequency as your instrument, the result may be biased.
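That caution lends itself to a quick automated check. Here is a minimal sketch, assuming you have the particles’ absorbance spectrum as wavelength and absorbance arrays and know your instrument’s laser wavelength; the flag threshold is an arbitrary placeholder of mine, not a NIST-recommended value:

```python
import numpy as np

def absorbance_at(wavelengths_nm, absorbance, laser_nm):
    # Linearly interpolate the measured spectrum at the laser wavelength.
    return float(np.interp(laser_nm, wavelengths_nm, absorbance))

# Example spectrum: made-up numbers with a plasmon-like peak near 520 nm.
wavelengths = np.arange(400, 801, 5)
spectrum = np.exp(-((wavelengths - 520) / 60.0) ** 2)

laser = 532  # nm, a common light-scattering laser line
a = absorbance_at(wavelengths, spectrum, laser)
if a > 0.05:  # placeholder threshold; tune for your own instrument
    print(f"Warning: absorbance {a:.2f} at {laser} nm may bias the size result.")
```

Getting back to the NIST news release,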

“These measurements are difficult because of the small size involved,” explains Kaiser. “Very few new instruments have been developed for this. People are adapting existing instruments and methods for the job, but often those instruments are being operated close to their limits and the methods were developed for chemicals or bulk materials and not for nanomaterials.”

“For example, NIST offers a reference material for measuring the size of gold nanoparticles in solution, and we report six different sizes depending on the instrument you use. We do it that way because different instruments sense different aspects of a nanoparticle’s dimensions. An electron microscope is telling you something different than a dynamic light scattering instrument, and the researcher needs to understand that.”

The nano-EHS protocols offered by the NIST site, Kaiser says, could form the basis for consensus-based, formal test methods such as those published by ASTM and ISO.

NIST’s nano-EHS protocol site currently lists 12 different protocols in three categories: sample preparation, physico-chemical measurements and toxicological measurements. More protocols will be added as they are validated and documented. Suggestions for additional protocols are welcome at nanoprotocols@nist.gov.

The next item concerns European nanomedicine.

CEA-LETI and Europe’s first nanomedicine characterization laboratory

A July 1, 2015 news item on Nanotechnology Now describes the partnership that has led to the launch of the new laboratory,

CEA-Leti today announced the launch of the European Nano-Characterisation Laboratory (EU-NCL) funded by the European Union’s Horizon 2020 research and innovation programme[1]. Its main objective is to reach a level of international excellence in nanomedicine characterisation for medical indications like cancer, diabetes, inflammatory diseases or infections, and make it accessible to all organisations developing candidate nanomedicines prior to their submission to regulatory agencies to get the approval for clinical trials and, later, marketing authorization.

“As reported in the ETPN White Paper[2], there is a lack of infrastructure to support nanotechnology-based innovation in healthcare,” said Patrick Boisseau, head of business development in nanomedicine at CEA-Leti and chairman of the European Technology Platform Nanomedicine (ETPN). “Nanocharacterisation is the first bottleneck encountered by companies developing nanotherapeutics. The EU-NCL project is of most importance for the nanomedicine community, as it will contribute to the competitiveness of nanomedicine products and tools and facilitate regulation in Europe.”

EU-NCL is partnered with the sole international reference facility, the Nanotechnology Characterization Lab of the National Cancer Institute in the U.S. (US-NCL)[3], to get faster international harmonization of analytical protocols.

“We are excited to be part of this cooperative arrangement between Europe and the U.S.,” said Scott E. McNeil, director of U.S. NCL. “We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.”

A July 2, 2015 EMPA (Swiss Federal Laboratories for Materials Science and Technology) news release on EurekAlert provides more detail about the laboratory and the partnerships,

The «European Nanomedicine Characterization Laboratory» (EU-NCL), which was launched on 1 June 2015, has a clear-cut goal: to help bring more nanomedicine candidates into the clinic and on the market, for the benefit of patients and the European pharmaceutical industry. To achieve this, EU-NCL is partnered with the sole international reference facility, the «Nanotechnology Characterization Laboratory» (US-NCL) of the US-National Cancer Institute, to get faster international harmonization of analytical protocols. EU-NCL is also closely connected to national medicine agencies and the European Medicines Agency to continuously adapt its analytical services to requests of regulators. EU-NCL is designed, organized and operated according to the highest EU regulatory and quality standards. «We are excited to be part of this cooperative project between Europe and the U.S.,» says Scott E. McNeil, director of US-NCL. «We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.»

Nine partners from eight countries

EU-NCL, which is funded by the EU for a four-year period with nearly 5 million Euros, brings together nine partners from eight countries: CEA-Tech in Leti and Liten, France, the coordinator of the project; the Joint Research Centre of the European Commission in Ispra, Italy; European Research Services GmbH in Münster Germany; Leidos Biomedical Research, Inc. in Frederick, USA; Trinity College in Dublin, Ireland; SINTEF in Oslo, Norway; the University of Liverpool in the UK; Empa, the Swiss Federal Laboratories for Materials Science and Technology in St. Gallen, Switzerland; Westfälische Wilhelms-Universität (WWU) and Gesellschaft für Bioanalytik, both in Münster, Germany. Together, the partnering institutions will provide a trans-disciplinary testing infrastructure covering a comprehensive set of preclinical characterization assays (physical, chemical, in vitro and in vivo biological testing), which will allow researchers to fully comprehend the biodistribution, metabolism, pharmacokinetics, safety profiles and immunological effects of their medicinal nano-products. The project will also foster the use and deployment of standard operating procedures (SOPs), benchmark materials and quality management for the preclinical characterization of medicinal nano-products. Yet another objective is to promote intersectoral and interdisciplinary communication among key drivers of innovation, especially between developers and regulatory agencies.

The goal: to bring safe and efficient nano-therapeutics faster to the patient

Within EU-NCL, six analytical facilities will offer transnational access to their existing analytical services for public and private developers, and will also develop new or improved analytical assays to keep EU-NCL at the cutting edge of nanomedicine characterization. A complementary set of networking activities will enable EU-NCL to deliver to European academic or industrial scientists the high-quality analytical services they require for accelerating the industrial development of their candidate nanomedicines. The Empa team of Peter Wick at the «Particles-Biology Interactions» lab will be in charge of the quality management of all analytical methods, a key task to guarantee the best possible reproducibility and comparability of the data between the various analytical labs within the consortium. «EU-NCL supports our research activities in developing innovative and safe nanomaterials for healthcare within an international network, which will actively shape future standards in nanomedicine and strengthen Empa as an enabler to facilitate the transfer of novel nanomedicines from bench to bedside», says Wick.

You can find more information about the laboratory on the Horizon 2020 (a European Union science funding programme) project page for the EU-NCL laboratory. For anyone curious about CEA-Leti, it’s a double-layered organization. CEA is France’s Commission on Atomic Energy and Alternative Energy (Commissariat à l’énergie atomique et aux énergies alternatives); you can go here to their French language site (there is an English language clickable option on the page). Leti is one of the CEA’s institutes and is known as either Leti or CEA-Leti; the name is an acronym for Laboratoire d’électronique et de technologie de l’information. Here’s the Leti website (this is the English language version).

US White House establishes new initiatives to commercialize nanotechnology

As I’ve noted several times, there’s a strong push in the US to commercialize nanotechnology, and May 20, 2015 was a banner day for those efforts. The US White House announced a series of new initiatives to speed commercialization in a May 20, 2015 posting by Lloyd Whitman, Tom Kalil, and JJ Raynor,

Today, May 20 [2015], the National Economic Council and the Office of Science and Technology Policy held a forum at the White House to discuss opportunities to accelerate the commercialization of nanotechnology.

In recognition of the importance of nanotechnology R&D, representatives from companies, government agencies, colleges and universities, and non-profits are announcing a series of new and expanded public and private initiatives that complement the Administration’s efforts to accelerate the commercialization of nanotechnology and expand the nanotechnology workforce:

  • The Colleges of Nanoscale Science and Engineering at SUNY Polytechnic Institute in Albany, NY and the National Institute for Occupational Safety and Health are launching the Nano Health & Safety Consortium to advance research and guidance for occupational safety and health in the nanoelectronics and other nanomanufacturing industry settings.
  • Raytheon has brought together a group of representatives from the defense industry and the Department of Defense to identify collaborative opportunities to advance nanotechnology product development, manufacturing, and supply-chain support with a goal of helping the U.S. optimize development, foster innovation, and take more rapid advantage of new commercial nanotechnologies.
  • BASF Corporation is taking a new approach to finding solutions to nanomanufacturing challenges. In March, BASF launched a prize-based “NanoChallenge” designed to drive new levels of collaborative innovation in nanotechnology while connecting with potential partners to co-create solutions that address industry challenges.
  • OCSiAl is expanding the eligibility of its “iNanoComm” matching grant program that provides low-cost, single-walled carbon nanotubes to include more exploratory research proposals, especially proposals for projects that could result in the creation of startups and technology transfers.
  • The NanoBusiness Commercialization Association (NanoBCA) is partnering with Venture for America and working with the National Science Foundation (NSF) to promote entrepreneurship in nanotechnology.  Three companies (PEN, NanoMech, and SouthWest NanoTechnologies) are offering to support NSF’s Innovation Corps (I-Corps) program with mentorship for entrepreneurs-in-training and, along with three other companies (NanoViricides, mPhase Technologies, and Eikos), will partner with Venture for America to hire recent graduates into nanotechnology jobs, thereby strengthening new nanotech businesses while providing needed experience for future entrepreneurs.
  • TechConnect is establishing a Nano and Emerging Technologies Student Leaders Conference to bring together the leaders of nanotechnology student groups from across the country. The conference will highlight undergraduate research and connect students with venture capitalists, entrepreneurs, and industry leaders.  Five universities have already committed to participating, led by the University of Virginia Nano and Emerging Technologies Club.
  • Brewer Science, through its Global Intern Program, is providing more than 30 students from high schools, colleges, and graduate schools across the country with hands-on experience in a wide range of functions within the company.  Brewer Science plans to increase the number of its science and engineering interns by 50% next year and has committed to sharing best practices with other nanotechnology businesses interested in how internship programs can contribute to a small company’s success.
  • The National Institute of Standards and Technology’s Center for Nanoscale Science and Technology is expanding its partnership with the National Science Foundation to provide hands-on experience for students in NSF’s Advanced Technology Education program. The partnership will now run year-round and will include opportunities for students at Hudson Valley Community College and the University of the District of Columbia Community College.
  • Federal agencies participating in the NNI [US National Nanotechnology Initiative], supported by the National Nanotechnology Coordination Office [NNCO], are launching multiple new activities aimed at educating students and the public about nanotechnology, including image and video contests highlighting student research, a new webinar series focused on providing nanotechnology information for K-12 teachers, and a searchable web portal on nano.gov of nanoscale science and engineering resources for teachers and professors.

Interestingly, May 20, 2015 is also the day the NNCO held its second webinar for small- and medium-size businesses in the nanotechnology community. You can find out more about that webinar and future ones by following the links in my May 13, 2015 posting.

Since the US White House announcement, OCSiAl has issued a May 26, 2015 news release which provides a brief history and more details about its newly expanded iNanoComm program,

OCSiAl launched the iNanoComm, which stands for the Integrated Nanotube Commercialization Award, program in February 2015 to help researchers lower the cost of their most promising R&D projects dedicated to SWCNT [single-walled carbon nanotube] applications. The first round received 33 applications from 28 university groups, including The Smalley-Curl Center for Nanoscale Science and Technology at Rice University and the Concordia Center for Composites at Concordia University [Canada] among others. [emphasis mine] The aim of iNanoComm is to stimulate universities and research organizations to develop innovative market products based on nano-augmented materials, also known as clean materials.

Now the program’s criteria are being broadened to enable greater private sector engagement in potential projects and the creation of partnerships in commercializing nanotechnology. The program will now support early stage commercialization efforts connected to university research in the form of start-ups, technology transfers, new businesses and university spinoffs to support the mass commercialization of SWCNT products and technologies.

The announcement of the program’s expansion took place at the 2015 Roundtable of the US NanoBusiness Commercialization Association (NanoBCA), the world’s first non-profit association focused on the commercialization of nanotechnologies. NanoBCA is dedicated to creating an environment that nurtures research and innovation in nanotechnology, promotes tech-transfer of nanotechnology from academia to industry, encourages private capital investments in nanotechnology companies, and helps its corporate members bring innovative nanotechnology products to market.

“Enhancing iNanoComm as a ‘start-up incubator’ is a concrete step in promoting single-wall carbon nanotube applications in the commercial world,” said Max Atanassov, CEO of OCSiAl USA. “It was the logical thing for us to do, now that high quality carbon nanotubes have become broadly available and are affordably priced to be used on a mass industrial scale.”

Vince Caprio, Executive Director of NanoBCA, added that “iNanoComm will make an important contribution to translating fundamental nanotechnology research into commercial products. By facilitating the formation of more start-ups, it will encourage more scientists to pursue their dreams and develop their ideas into commercially successful businesses.”

For more information on the program expansion and how it can reduce the cost of early stage research connected to university projects, visit the iNanoComm website at www.inanocomm.org or contact info@inanocomm.org.

h/t Azonano May 27, 2015 news item

US National Institute of Standards and Technology (NIST) and its whispering gallery for graphene electrons

I like this introduction, from an earlier piece about research that invoked whispering galleries, well enough to reuse it here. From a Feb. 8, 2012 post about whispering galleries for light,

Whispering galleries are always popular with all ages. I know that because I can never get enough time in them as I jostle with seniors, children, young adults, etc. For most humans, the magic of having someone across from you on the other side of the room sound as if they’re beside you whispering in your ear is ever fresh.

According to a May 12, 2015 news item on Nanowerk, the US National Institute of Standards and Technology’s (NIST) whispering gallery is not likely to cause any jostling for space as it exists at the nanoscale,

An international research group led by scientists at the U.S. Commerce Department’s National Institute of Standards and Technology (NIST) has developed a technique for creating nanoscale whispering galleries for electrons in graphene. The development opens the way to building devices that focus and amplify electrons just as lenses focus light and resonators (like the body of a guitar) amplify sound.

The NIST has provided a rather intriguing illustration of this work,

Caption: An international research group led by scientists at NIST has developed a technique for creating nanoscale whispering galleries for electrons in graphene. The researchers used the voltage from a scanning tunneling microscope (right) to push graphene electrons out of a nanoscale area to create the whispering gallery (represented by the protuberances on the left), which is like a circular wall of mirrors to the electron.
Credit: Jon Wyrick, CNST/NIST

A May 8, 2015 NIST news release, which originated the news item, gives a delightful introduction to whispering galleries and more details about this research (Note: Links have been removed),

In some structures, such as the dome in St. Paul’s Cathedral in London, a person standing near a curved wall can hear the faintest sound made along any other part of that wall. This phenomenon, called a whispering gallery, occurs because sound waves will travel along a curved surface much farther than they will along a flat one. Using this same principle, scientists have built whispering galleries for light waves as well, and whispering galleries are found in applications ranging from sensing, spectroscopy and communications to the generation of laser frequency combs.

“The cool thing is that we made a nanometer scale electronic analogue of a classical wave effect,” said NIST researcher Joe Stroscio. “These whispering galleries are unlike anything you see in any other electron based system, and that’s really exciting.”

Ever since graphene, a single layer of carbon atoms arranged in a honeycomb lattice, was first created in 2004, the material has impressed researchers with its strength, ability to conduct electricity and heat and many interesting optical, magnetic and chemical properties.

However, early studies of the behavior of electrons in graphene were hampered by defects in the material. As the manufacture of clean and near-perfect graphene becomes more routine, scientists are beginning to uncover its full potential.

When moving electrons encounter a potential barrier in conventional semiconductors, it takes an increase in energy for the electrons to continue flowing. As a result, they are often reflected, just as one would expect from a ball-like particle.

However, because electrons can sometimes behave like a wave, there is a calculable chance that they will ignore the barrier altogether, a phenomenon called tunneling. Because graphene’s electrons have light-like properties, they can pass through unimpeded—no matter how high the barrier—if they hit the barrier head on. This tendency to tunnel makes it hard to steer electrons in graphene.
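
For a feel of the numbers, here is a minimal sketch in Python, assuming the textbook approximation for a sharp graphene p-n junction, in which chirality conservation gives a transmission probability of about cos²θ; the function name and the sample angles are mine, for illustration only:

```python
import numpy as np

def klein_transmission(theta_rad):
    """Transmission through a sharp graphene p-n junction (toy model).

    Chirality conservation for massless Dirac electrons gives
    T ~ cos^2(theta): certain transmission head-on (theta = 0),
    near-total reflection at grazing incidence.
    """
    return np.cos(theta_rad) ** 2

for deg in (0, 30, 60, 85):
    t = klein_transmission(np.radians(deg))
    print(f"{deg:2d} deg incidence -> transmission {t:.2f}")
```

At θ = 0 the transmission is exactly 1, which is why a straight barrier cannot stop a head-on graphene electron, and why the curved, mirror-like walls described below are needed to trap the rest.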

Enter the graphene electron whispering gallery.

To create a whispering gallery in graphene, the team first enriched the graphene with electrons from a conductive plate mounted below it. With the graphene now crackling with electrons, the research team used the voltage from a scanning tunneling microscope (STM) to push some of them out of a nanoscale-sized area. This created the whispering gallery, which is like a circular wall of mirrors to the electron.

“An electron that hits the step head-on can tunnel straight through it,” said NIST researcher Nikolai Zhitenev. “But if electrons hit it at an angle, their waves can be reflected and travel along the sides of the curved walls of the barrier until they begin to interfere with one another, creating a nanoscale electronic whispering gallery mode.”

The team can control the size and strength, i.e., the leakiness, of the electronic whispering gallery by varying the STM tip’s voltage. The probe not only creates whispering gallery modes, but can detect them as well.
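
For a rough sense of where such resonances might sit, here is a toy estimate of my own, not the MIT team’s theory: treat the circular barrier as an impenetrable disk, so resonant wavevectors fall at zeros of Bessel functions, and use graphene’s linear dispersion E = ħv_F·k to convert them to energies. The 50 nm radius and the angular momentum chosen are assumptions for illustration:

```python
from scipy.special import jn_zeros

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
V_F = 1.0e6             # typical graphene Fermi velocity, m/s
EV = 1.602176634e-19    # joules per electronvolt

def wgm_energies_eV(radius_m, m, n_modes=3):
    """Hard-wall toy estimate of whispering-gallery resonance energies.

    Treats the circular p-n barrier as an impenetrable disk of radius R;
    resonant wavevectors then satisfy k*R = j_{m,n} (the n-th zero of
    the Bessel function J_m), and E = hbar * v_F * k for massless Dirac
    electrons. The real STM-defined barrier is soft and leaky, so these
    are order-of-magnitude numbers only.
    """
    k = jn_zeros(m, n_modes) / radius_m  # resonant wavevectors, 1/m
    return HBAR * V_F * k / EV

# An assumed ~50 nm cavity and a high-angular-momentum "whispering" mode
for n, e in enumerate(wgm_energies_eV(50e-9, m=8), start=1):
    print(f"mode (m=8, n={n}): ~{e * 1e3:.0f} meV")
```

In this crude picture the modes land at a few hundred millielectronvolts for a cavity tens of nanometres across, and shrinking the cavity (or raising the angular momentum) pushes them upward, consistent with the tip voltage acting as a tuning knob.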

NIST researcher Yue Zhao fabricated the high mobility device and performed the measurements with her colleagues Fabian Natterer and Jon Wyrick. A team of theoretical physicists from the Massachusetts Institute of Technology developed the theory describing whispering gallery modes in graphene.

Here’s a link to and a citation for the paper,

Creating and probing electron whispering-gallery modes in graphene by Yue Zhao, Jonathan Wyrick, Fabian D. Natterer, Joaquin F. Rodriguez-Nieva, Cyprian Lewandowski, Kenji Watanabe, Takashi Taniguchi, Leonid S. Levitov, Nikolai B. Zhitenev, & Joseph A. Stroscio. Science 8 May 2015: Vol. 348, no. 6235, pp. 672-675. DOI: 10.1126/science.aaa7469

This paper is behind a paywall.

Making x-ray measurements more accurate

Apparently the method for establishing x-ray measurement standards dates from the 1970s, and the folks at the US National Institute of Standards and Technology (NIST) feel it’s time for a new technique. From a March 9, 2015 NIST news release (also on EurekAlert),

Criminal justice, cosmology and computer manufacturing may not look to have much in common, but these and many other disparate fields all depend on sensitive measurements of X-rays. Scientists at the National Institute of Standards and Technology (NIST) have developed a new method to reduce uncertainty in X-ray wavelength measurement that could provide improvements awaited for decades.

Accurate measurement of X-ray wavelengths depends critically on the ability to measure angles very precisely and with very little margin for error. NIST’s new approach is the first major advance since the 1970s in reducing certain sources of error common in X-ray angle measurement.

Many of us associate X-rays with a doctor’s office, but the uses for these energetic beams go far beyond revealing our skeletons. The ability to sense X-rays at precise wavelengths allows law enforcement to detect and identify trace explosives, or astrophysicists to better understand cosmic phenomena. It all comes down to looking very closely at the X-ray spectrum and measuring the precise position of lines within it. Those lines represent specific wavelengths–which are associated with specific energies–of X-rays that are emitted by the subject being studied. Each material has its own, unique X-ray “fingerprint.”

But a slight error in angle measurement can skew the results, with consequences for quantum theories, research and manufacturing. “While many fields need good X-ray reference data, many of the measurements that presently fill standard reference databases are not great–most data were taken in the 1970s and are often imprecise,” says NIST’s Larry Hudson.

X-ray wavelengths are measured by passing the beam through special crystals and very carefully measuring the angle that exiting rays make with the original beam. While the physics is different, the technique is analogous to the way a prism will split white light into different colors coming out at different angles.

The crystal is typically mounted on a rotating device that spins the crystal to two different positions where a spectral line is observed. The angle between the two is measured–this is a neat geometry trick that determines the line’s position more precisely than a single measurement would, while also cancelling out some potential errors. One inevitable limit is the accuracy of the digital encoder, the device that translates the rotation of the crystal to an angle measurement.
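
To see why the angle dominates the error budget, recall Bragg’s law, nλ = 2d·sin θ, which ties each wavelength to a diffraction angle; differentiating gives δλ/λ ≈ cot θ · δθ, so any encoder error feeds straight into the wavelength. The sketch below propagates an angle uncertainty through Bragg’s law; the 0.06 arcsecond figure is quoted further on, while the Si(111) spacing, the 14° angle and the 0.18 arcsecond “old” figure are illustrative stand-ins of mine, not numbers from the NIST paper:

```python
import numpy as np

ARCSEC = np.pi / (180 * 3600)  # radians per arcsecond

def bragg_wavelength(d_m, theta_rad, order=1):
    """Bragg's law: n * lambda = 2 * d * sin(theta)."""
    return 2.0 * d_m * np.sin(theta_rad) / order

def wavelength_error(lam_m, theta_rad, dtheta_rad):
    """First-order propagation: d(lambda)/lambda = cot(theta) * d(theta)."""
    return lam_m * dtheta_rad / np.tan(theta_rad)

d = 3.1356e-10            # illustrative crystal spacing (Si 111), m
theta = np.radians(14.0)  # illustrative diffraction angle

lam = bragg_wavelength(d, theta)
for dtheta in (0.18 * ARCSEC, 0.06 * ARCSEC):  # assumed old vs. new uncertainty
    rel = wavelength_error(lam, theta, dtheta) / lam
    print(f"angle uncertainty {dtheta / ARCSEC:.2f} arcsec -> "
          f"relative wavelength uncertainty ~{rel:.1e}")
```

With these assumed inputs, cutting the angle uncertainty by a factor of three cuts the wavelength uncertainty by the same factor, into the parts-per-million range.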

The news release goes on to describe the new technique,

Hudson and his co-authors have found a way to dramatically reduce the error in that measurement. Their new approach uses laser beams bouncing off a mirrored polygon that is rotated on the same shaft that would carry the crystal. The approach allows the team to use additional mathematical shortcuts to their advantage. With new NIST sensing instrumentation and analysis, X-ray angles can now be measured routinely with an uncertainty of 0.06 arcseconds–an accuracy more than three times better than the uncalibrated encoder.

Hudson describes this reduction as significant enough to set world records in X-ray wavelength measurement. “If a giant windshield wiper stretched from Washington D.C. to New York City (364 kilometers) and were to sweep out the angle of one of these errors, its tip would move less than the width of a DVD,” he says.
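
Hudson’s image checks out with the small-angle formula s = r·θ (arc length equals radius times angle in radians); a quick sketch, with the unit conversion as the only thing added:

```python
import math

ARCSEC = math.pi / (180 * 3600)  # radians per arcsecond

radius_m = 364e3                 # Washington, D.C. to New York City, as quoted
angle_rad = 0.06 * ARCSEC        # the new angle uncertainty

tip_travel_m = radius_m * angle_rad  # small-angle arc length, s = r * theta
print(f"wiper tip moves ~{tip_travel_m * 100:.0f} cm")  # ~11 cm, under a 12 cm DVD
```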

What do these improvements mean for the fields that depend on X-ray sensing? For one thing, calibrating measurement devices to greater precision will provide better understanding of a host of newly designed materials, which often have complicated crystal structures that give rise to unusual effects such as high-temperature superconductivity. The team’s efforts will permit better understanding of the relationship between the structures and properties of novel materials.

Here’s a link to and a citation for the paper,

A simple method for high-precision calibration of long-range errors in an angle encoder using an electronic nulling autocollimator by Mark N Kinnane, Lawrence T Hudson, Albert Henins, and Marcus H Mendenhall. Metrologia, Volume 52, Number 2. doi:10.1088/0026-1394/52/2/244

This is an open access paper.

For anyone curious about arcseconds, you can find an explanation in the Wikipedia entry titled Minute of arc. Briefly, an arcminute is 1/60 of a degree and an arcsecond is 1/60 of an arcminute, that is, 1/3600 of a degree.