Tag Archives: US National Institute of Standards and Technology

Mimicking rain and sun to test plastic for nanoparticle release

One of Canada’s nanotechnology experts once told a House of Commons Committee on Health that nanoparticles encased in plastic (he was talking about cell phones) weren’t likely to harm you except in two circumstances: when workers handled them during manufacturing and when the product was being disposed of. Apparently, under some circumstances, that is no longer true. From a Sept. 30, 2016 news item on Nanowerk,

If the 1967 film “The Graduate” were remade today, Mr. McGuire’s famous advice to young Benjamin Braddock would probably be updated to “Plastics … with nanoparticles.” These days, the mechanical, electrical and durability properties of polymers—the class of materials that includes plastics—are often enhanced by adding miniature particles (smaller than 100 nanometers or billionths of a meter) made of elements such as silicon or silver. But could those nanoparticles be released into the environment after the polymers are exposed to years of sun and water—and if so, what might be the health and ecological consequences?

A Sept. 30, 2016 US National Institute of Standards and Technology (NIST) news release, which originated the news item, describes how the research was conducted and its results (Note: Links have been removed),

In a recently published paper (link is external), researchers from the National Institute of Standards and Technology (NIST) describe how they subjected a commercial nanoparticle-infused coating to NIST-developed methods for accelerating the effects of weathering from ultraviolet (UV) radiation and simulated washings of rainwater. Their results indicate that humidity and exposure time are contributing factors for nanoparticle release, findings that may be useful in designing future studies to determine potential impacts.

In their recent experiment, the researchers exposed multiple samples of a commercially available polyurethane coating containing silicon dioxide nanoparticles to intense UV radiation for 100 days inside the NIST SPHERE (Simulated Photodegradation via High-Energy Radiant Exposure), a hollow, 2-meter (7-foot) diameter black aluminum chamber lined with highly UV reflective material that bears a casual resemblance to the Death Star in the film “Star Wars.” For this study, one day in the SPHERE was equivalent to 10 to 15 days outdoors. All samples were weathered at a constant temperature of 50 degrees Celsius (122 degrees Fahrenheit) with one group done in extremely dry conditions (approximately 0 percent humidity) and the other in humid conditions (75 percent humidity).
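A quick bit of arithmetic (mine, not NIST’s) puts that acceleration factor in perspective: 100 days in the SPHERE corresponds to roughly 1,000 to 1,500 days outdoors, or about three to four years of weathering,

```python
# Back-of-the-envelope conversion of SPHERE exposure to outdoor weathering,
# using the acceleration factor quoted in the news release (my arithmetic, not NIST's).
sphere_days = 100
acceleration_range = (10, 15)          # 1 SPHERE day ~ 10 to 15 outdoor days
for factor in acceleration_range:
    outdoor_days = sphere_days * factor
    print(f"{factor}x: {outdoor_days} outdoor days (~{outdoor_days / 365:.1f} years)")
# 10x: 1000 outdoor days (~2.7 years)
# 15x: 1500 outdoor days (~4.1 years)
```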

To determine if any nanoparticles were released from the polymer coating during UV exposure, the researchers used a technique they created and dubbed “NIST simulated rain.” Filtered water was converted into tiny droplets, sprayed under pressure onto the individual samples, and then the runoff—with any loose nanoparticles—was collected in a bottle. This procedure was conducted at the beginning of the UV exposure, every two weeks during the weathering run, and at the end. All of the runoff fluids were then analyzed by NIST chemists for the presence of silicon and in what amounts. Additionally, the weathered coatings were examined with atomic force microscopy (AFM) and scanning electron microscopy (SEM) to reveal surface changes resulting from UV exposure.

Both sets of coating samples—those weathered in very low humidity and the others in very humid conditions—degraded but released only small amounts of nanoparticles. The researchers found that more silicon was recovered from the samples weathered in humid conditions and that nanoparticle release increased as the UV exposure time increased. Microscopic examination showed that deformations in the coating surface became more numerous with longer exposure time, and that nanoparticles left behind after the coating degraded often bound together in clusters.

“These data, and the data from future experiments of this type, are valuable for developing computer models to predict the long-term release of nanoparticles from commercial coatings used outdoors, and in turn, help manufacturers, regulatory officials and others assess any health and environmental impacts from them,” said NIST research chemist Deborah Jacobs, lead author on the study published in the Journal of Coatings Technology and Research (link is external).

Here’s a link to and a citation for the paper,

Surface degradation and nanoparticle release of a commercial nanosilica/polyurethane coating under UV exposure by Deborah S. Jacobs, Sin-Ru Huang, Yu-Lun Cheng, Savelas A. Rabb, Justin M. Gorham, Peter J. Krommenhoek, Lee L. Yu, Tinh Nguyen, Lipiin Sung. J Coat Technol Res (2016) 13: 735. doi:10.1007/s11998-016-9796-2 First published online 13 July 2016

This paper is behind a paywall.

For anyone interested in the details of the House of Commons nano story I told at the start of this post, here’s the June 23, 2010 posting where I summarized the hearing on nanotechnology. If you scroll down about 50% of the way, you’ll find comments from Dr. Nils Petersen (then director of Canada’s National Institute of Nanotechnology) about nanoparticles being encased. The topic was nanosunscreens, and he was describing the conditions under which he believed nanoparticles could be dangerous.

Creating multiferroic material at room temperature

A Sept. 23, 2016 news item on ScienceDaily describes some research from Cornell University (US),

Multiferroics — materials that exhibit both magnetic and electric order — are of interest for next-generation computing but difficult to create because the conditions conducive to each of those states are usually mutually exclusive. And in most multiferroics found to date, their respective properties emerge only at extremely low temperatures.

Two years ago, researchers in the labs of Darrell Schlom, the Herbert Fisk Johnson Professor of Industrial Chemistry in the Department of Materials Science and Engineering, and Dan Ralph, the F.R. Newman Professor in the College of Arts and Sciences, in collaboration with professor Ramamoorthy Ramesh at UC Berkeley, published a paper announcing a breakthrough in multiferroics involving the only known material in which magnetism can be controlled by applying an electric field at room temperature: the multiferroic bismuth ferrite.

Schlom’s group has partnered with David Muller and Craig Fennie, professors of applied and engineering physics, to take that research a step further: The researchers have combined two non-multiferroic materials, using the best attributes of both to create a new room-temperature multiferroic.

Their paper, “Atomically engineered ferroic layers yield a room-temperature magnetoelectric multiferroic,” was published — along with a companion News & Views piece — Sept. 22 [2016] in Nature. …

A Sept. 22, 2016 Cornell University news release by Tom Fleischman, which originated the news item, details more about the work (Note: A link has been removed),

The group engineered thin films of hexagonal lutetium iron oxide (LuFeO3), a material known to be a robust ferroelectric but not strongly magnetic. The LuFeO3 consists of alternating single monolayers of lutetium oxide and iron oxide, and differs from a strong ferrimagnetic oxide (LuFe2O4), which consists of alternating monolayers of lutetium oxide with double monolayers of iron oxide.

The researchers found, however, that they could combine these two materials at the atomic scale to create a new compound that was not only multiferroic but had better properties than either of the individual constituents. In particular, they found they needed to add just one extra monolayer of iron oxide to every 10 atomic repeats of the LuFeO3 to dramatically change the properties of the system.

That precision engineering was done via molecular-beam epitaxy (MBE), a specialty of the Schlom lab. A technique Schlom likens to “atomic spray painting,” MBE let the researchers design and assemble the two different materials in layers, a single atom at a time.

The combination of the two materials produced a strongly ferrimagnetic layer near room temperature. They then tested the new material at the Lawrence Berkeley National Laboratory (LBNL) Advanced Light Source in collaboration with co-author Ramesh to show that the ferrimagnetic atoms followed the alignment of their ferroelectric neighbors when switched by an electric field.

“It was when our collaborators at LBNL demonstrated electrical control of magnetism in the material that we made that things got super exciting,” Schlom said. “Room-temperature multiferroics are exceedingly rare and only multiferroics that enable electrical control of magnetism are relevant to applications.”

In electronic devices, the advantages of multiferroics include their reversible polarization in response to low-power electric fields – as opposed to heat-generating and power-sapping electrical currents – and their ability to hold their polarized state without the need for continuous power. High-performance memory chips make use of ferroelectric or ferromagnetic materials.

“Our work shows that an entirely different mechanism is active in this new material,” Schlom said, “giving us hope for even better – higher-temperature and stronger – multiferroics for the future.”

Collaborators hailed from the University of Illinois at Urbana-Champaign, the National Institute of Standards and Technology, the University of Michigan and Penn State University.

Here’s a link to and a citation for the paper and its companion piece,

Atomically engineered ferroic layers yield a room-temperature magnetoelectric multiferroic by Julia A. Mundy, Charles M. Brooks, Megan E. Holtz, Jarrett A. Moyer, Hena Das, Alejandro F. Rébola, John T. Heron, James D. Clarkson, Steven M. Disseler, Zhiqi Liu, Alan Farhan, Rainer Held, Robert Hovden, Elliot Padgett, Qingyun Mao, Hanjong Paik, Rajiv Misra, Lena F. Kourkoutis, Elke Arenholz, Andreas Scholl, Julie A. Borchers, William D. Ratcliff, Ramamoorthy Ramesh, Craig J. Fennie, Peter Schiffer et al. Nature 537, 523–527 (22 September 2016) doi:10.1038/nature19343 Published online 21 September 2016

Condensed-matter physics: Multitasking materials from atomic templates by Manfred Fiebig. Nature 537, 499–500 (22 September 2016) doi:10.1038/537499a Published online 21 September 2016

Both the paper and its companion piece are behind a paywall.

US white paper on neuromorphic computing (or the nanotechnology-inspired Grand Challenge for future computing)

The US has embarked on a number of what is called “Grand Challenges.” I first came across the concept when reading about the Bill and Melinda Gates (of Microsoft fame) Foundation. I gather these challenges are intended to provide funding for research that advances bold visions.

The US National Strategic Computing Initiative was established on July 29, 2015, and its first-anniversary results were announced exactly one year later. Within that initiative, a nanotechnology-inspired Grand Challenge for Future Computing was issued and, according to a July 29, 2016 news item on Nanowerk, a white paper on the topic has been released (Note: A link has been removed),

Today [July 29, 2016], Federal agencies participating in the National Nanotechnology Initiative (NNI) released a white paper (pdf) describing the collective Federal vision for the emerging and innovative solutions needed to realize the Nanotechnology-Inspired Grand Challenge for Future Computing.

The grand challenge, announced on October 20, 2015, is to “create a new type of computer that can proactively interpret and learn from data, solve unfamiliar problems using what it has learned, and operate with the energy efficiency of the human brain.” The white paper describes the technical priorities shared by the agencies, highlights the challenges and opportunities associated with these priorities, and presents a guiding vision for the research and development (R&D) needed to achieve key technical goals. By coordinating and collaborating across multiple levels of government, industry, academia, and nonprofit organizations, the nanotechnology and computer science communities can look beyond the decades-old approach to computing based on the von Neumann architecture and chart a new path that will continue the rapid pace of innovation beyond the next decade.

A July 29, 2016 US National Nanotechnology Coordination Office news release, which originated the news item, further and succinctly describes the contents of the paper,

“Materials and devices for computing have been and will continue to be a key application domain in the field of nanotechnology. As evident by the R&D topics highlighted in the white paper, this challenge will require the convergence of nanotechnology, neuroscience, and computer science to create a whole new paradigm for low-power computing with revolutionary, brain-like capabilities,” said Dr. Michael Meador, Director of the National Nanotechnology Coordination Office. …

The white paper was produced as a collaboration by technical staff at the Department of Energy, the National Science Foundation, the Department of Defense, the National Institute of Standards and Technology, and the Intelligence Community. …

The white paper titled “A Federal Vision for Future Computing: A Nanotechnology-Inspired Grand Challenge” is 15 pp. and it offers tidbits such as this (Note: Footnotes not included),

A new materials base may be needed for future electronic hardware. While most of today’s electronics use silicon, this approach is unsustainable if billions of disposable and short-lived sensor nodes are needed for the coming Internet-of-Things (IoT). To what extent can the materials base for the implementation of future information technology (IT) components and systems support sustainability through recycling and bio-degradability? More sustainable materials, such as compostable or biodegradable systems (polymers, paper, etc.) that can be recycled or reused, may play an important role. The potential role for such alternative materials in the fabrication of integrated systems needs to be explored as well. [p. 5]

The basic architecture of computers today is essentially the same as those built in the 1940s—the von Neumann architecture—with separate compute, high-speed memory, and high-density storage components that are electronically interconnected. However, it is well known that continued performance increases using this architecture are not feasible in the long term, with power density constraints being one of the fundamental roadblocks.7 Further advances in the current approach using multiple cores, chip multiprocessors, and associated architectures are plagued by challenges in software and programming models. Thus, research and development is required in radically new and different computing architectures involving processors, memory, input-output devices, and how they behave and are interconnected. [p. 7]

Neuroscience research suggests that the brain is a complex, high-performance computing system with low energy consumption and incredible parallelism. A highly plastic and flexible organ, the human brain is able to grow new neurons, synapses, and connections to cope with an ever-changing environment. Energy efficiency, growth, and flexibility occur at all scales, from molecular to cellular, and allow the brain, from early to late stage, to never stop learning and to act with proactive intelligence in both familiar and novel situations. Understanding how these mechanisms work and cooperate within and across scales has the potential to offer tremendous technical insights and novel engineering frameworks for materials, devices, and systems seeking to perform efficient and autonomous computing. This research focus area is the most synergistic with the national BRAIN Initiative. However, unlike the BRAIN Initiative, where the goal is to map the network connectivity of the brain, the objective here is to understand the nature, methods, and mechanisms for computation, and how the brain performs some of its tasks. Even within this broad paradigm, one can loosely distinguish between neuromorphic computing and artificial neural network (ANN) approaches. The goal of neuromorphic computing is oriented towards a hardware approach to reverse engineering the computational architecture of the brain. On the other hand, ANNs include algorithmic approaches arising from machine learning, which in turn could leverage advancements and understanding in neuroscience as well as novel cognitive, mathematical, and statistical techniques. Indeed, the ultimate intelligent systems may as well be the result of merging existing ANN (e.g., deep learning) and bio-inspired techniques. [p. 8]
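To make the white paper’s distinction between neuromorphic hardware and artificial neural networks a little more concrete, here is a minimal, purely illustrative sketch of my own (not from the white paper): a spiking leaky integrate-and-fire update of the kind neuromorphic chips implement, alongside the weighted-sum-plus-nonlinearity unit that underlies ANN and deep-learning approaches.

```python
import math

# Illustrative contrast only; neither model is taken from the white paper.

def lif_step(v, input_current, dt=1e-3, tau=0.02, v_thresh=1.0, v_reset=0.0):
    """One leaky integrate-and-fire update (neuromorphic style); returns (voltage, spiked)."""
    v = v + dt * (-v / tau + input_current)   # leak toward zero plus injected current
    if v >= v_thresh:
        return v_reset, True                   # emit a spike and reset
    return v, False

def ann_neuron(inputs, weights, bias=0.0):
    """A conventional artificial neuron: weighted sum passed through a sigmoid."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Drive the spiking neuron for 100 ms and evaluate the ANN neuron once.
v, spikes = 0.0, 0
for _ in range(100):
    v, spiked = lif_step(v, input_current=60.0)
    spikes += spiked
print(spikes, ann_neuron([0.5, -1.2, 3.0], [0.8, 0.1, 0.4]))
```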

As government documents go, this is quite readable.

For anyone interested in learning more about the future federal plans for computing in the US, there is a July 29, 2016 posting on the White House blog celebrating the first year of the US National Strategic Computing Initiative Strategic Plan (29 pp. PDF; awkward but that is the title).

Carbon nanotubes: faster, cheaper, easier, and more consistent

One of the big problems with nanomaterials has to do with production issues such as achieving consistent size and shape. It seems that scientists at the US National Institute of Standards and Technology (NIST) have developed a technique for producing carbon nanotubes (CNTs) that addresses these issues. From a July 19, 2016 news item on Nanotechnology Now,

Just as many of us might be resigned to clogged salt shakers or rush-hour traffic, those working to exploit the special properties of carbon nanotubes have typically shrugged their shoulders when these tiniest of cylinders fill with water during processing. But for nanotube practitioners who have reached their Popeye threshold and “can’t stands no more,” the National Institute of Standards and Technology (NIST) has devised a cheap, quick and effective strategy that reliably enhances the quality and consistency of the materials–important for using them effectively in applications such as new computing technologies.

To prevent filling of the cores of single-wall carbon nanotubes with water or other detrimental substances, the NIST researchers advise intentionally prefilling them with a desired chemical of known properties. Taking this step before separating and dispersing the materials, usually done in water, yields a consistently uniform collection of nanotubes. In quantity and quality, the results are superior to water-filled nanotubes, especially for optical applications such as sensors and photodetectors.

A July 15, 2016 NIST news release, which originated the news item, expands on the theme,

The approach opens a straightforward route for engineering the properties of single-wall carbon nanotubes—rolled-up sheets of carbon atoms arranged like chicken wire or honeycombs—with improved or new properties.

“This approach is so easy, inexpensive and broadly useful that I can’t think of a reason not to use it,” said NIST chemical engineer Jeffrey Fagan.

In their proof-of-concept experiments, the NIST team inserted more than 20 different compounds into an assortment of single-wall carbon nanotubes with an interior diameter that ranged from more than 2 down to about 0.5 nanometers. Led by visiting researcher Jochen Campo, the scientists tested their strategy by using hydrocarbons called alkanes as fillers.

The alkanes, which include such familiar compounds as propane and butane, served to render the nanotube interiors unreactive. In other words, the alkane-filled nanotubes behaved almost as if they were empty—precisely the goal of Campo, Fagan and colleagues.

Compared with nanotubes filled with water and possibly ions, acids and other unwanted chemicals encountered during processing, empty nanotubes possess far superior properties. For example, when stimulated by light, empty carbon nanotubes fluoresce far brighter and with sharper signals.

Yet, “spontaneous ingestion” of water or other solvents by the nanotubes during processing is an “endemic but often neglected phenomenon with strong implications for the development of nanotube applications,” the NIST team wrote in a recent article in Nanoscale Horizons.

Perhaps because of the additional cost and effort required to filter out and gather nanotubes, researchers tend to tolerate mixed batches of unfilled (empty) and mostly filled single-wall carbon nanotubes. Separating unfilled nanotubes from these mixtures requires expensive ultracentrifuge equipment and, even then, the yield is only about 10 percent, Campo estimates.

“If your goal is to use nanotubes for electronic circuits, for example, or for fluorescent anti-cancer image contrast agents, then you require much greater quantities of materials of consistent composition and quality,” explained Campo, who was exploring these applications while doing postdoctoral research at the University of Antwerp. “This particular need inspired development of the new prefilling method by asking the question, can we put some passive chemical into the nanotube instead to keep the water out?”

From the very first simple experiments, the answer was yes. And the benefits can be significant. In fluorescence experiments, alkane-filled nanotubes emitted signals two to three times stronger than those emitted by water-filled nanotubes. Performance approached that of empty nanotubes—the gold standard for these comparisons.

As important, the NIST-developed prefilling strategy is controllable, versatile and easily incorporated into existing methods for processing single-wall carbon nanotubes, according to the researchers.

Here’s a link to and citation for the paper,

Enhancing single-wall carbon nanotube properties through controlled endohedral filling by J. Campo, Y. Piao, S. Lam, C. M. Stafford, J. K. Streit, J. R. Simpson, A. R. Hight Walker, and J. A. Fagan. Nanoscale Horiz., 2016,1, 317-324 DOI: 10.1039/C6NH00062B First published online 10 May 2016

This paper is open access but you do need to register on the site (it is a free registration).

US Nanotechnology Initiative for water sustainability

Wednesday, March 23, 2016 was World Water Day and to coincide with that event the US National Nanotechnology Initiative (NNI) in collaboration with several other agencies announced a new ‘signature initiative’. From a March 24, 2016 news item on Nanowerk (Note: A link has been removed),

As a part of the White House Water Summit held yesterday on World Water Day, the Federal agencies participating in the National Nanotechnology Initiative (NNI) announced the launch of a Nanotechnology Signature Initiative (NSI), Water Sustainability through Nanotechnology: Nanoscale Solutions for a Global-Scale Challenge.

A March 23, 2016 NNI news release provides more information about why this initiative is important,

Access to clean water remains one of the world’s most pressing needs. As today’s White House Office of Science and Technology blog post explains, “the small size and exceptional properties of engineered nanomaterials are particularly promising for addressing the key technical challenges related to water quality and quantity.”

“One cannot find an issue more critical to human life and global security than clean, plentiful, and reliable water sources,” said Dr. Michael Meador, Director of the National Nanotechnology Coordination Office (NNCO). “Through the NSI mechanism, NNI member agencies will have an even greater ability to make meaningful strides toward this initiative’s thrust areas: increasing water availability, improving the efficiency of water delivery and use, and enabling next-generation water monitoring systems.”

A March 23, 2016 US White House blog posting by Lloyd Whitman and Lisa Friedersdorf describes the efforts in more detail (Note: A link has been removed),

The small size and exceptional properties of engineered nanomaterials are particularly promising for addressing the pressing technical challenges related to water quality and quantity. For example, the increased surface area—a cubic centimeter of nanoparticles has a surface area larger than a football field—and reactivity of nanometer-scale particles can be exploited to create catalysts for water purification that do not require rare or precious metals. And composites incorporating nanomaterials such as carbon nanotubes might one day enable stronger, lighter, and more durable piping systems and components. Under this NSI, Federal agencies will coordinate and collaborate to more rapidly develop nanotechnology-enabled solutions in three main thrusts: [thrust 1] increasing water availability; [thrust 2] improving the efficiency of water delivery and use; and [thrust 3] enabling next-generation water monitoring systems.
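That football-field comparison is easy to sanity-check. For spheres of diameter d, the surface area per unit of solid volume is 6/d, so the arithmetic below (mine, not the White House’s) shows one cubic centimetre of particle material reaching thousands of square metres only at the smallest particle sizes, while still dwarfing the 6 square centimetres of surface on a solid one-centimetre cube at any nanoscale size.

```python
# Surface area of 1 cm^3 of material divided into spheres of diameter d
# (a rough check of the quoted claim; my arithmetic, not the blog post's).
volume_m3 = 1e-6                              # one cubic centimetre of solid material
for d_nm in (1, 5, 10, 100):
    area_m2 = 6 * volume_m3 / (d_nm * 1e-9)   # area = 6 * volume / diameter
    print(f"{d_nm:>3} nm particles: {area_m2:,.0f} m^2")
# ~6,000 m^2 at 1 nm and ~600 m^2 at 10 nm; an American football field is
# roughly 5,300 m^2, and a solid 1 cm cube has only 0.0006 m^2 of surface.
```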

A technical “white paper” released by the agencies this week highlights key technical challenges for each thrust, identifies key objectives to overcome those challenges, and notes areas of research and development where nanotechnology promises to provide the needed solutions. By shining a spotlight on these areas, the new NSI will increase Federal coordination and collaboration, including with public and private stakeholders, which is vital to making progress in these areas. The additional focus and associated collective efforts will advance stewardship of water resources to support the essential food, energy, security, and environment needs of all stakeholders.

We applaud the commitment of the Federal agencies who will participate in this effort—the Department of Commerce/National Institute of Standards and Technology, Department of Energy, Environmental Protection Agency, National Aeronautics and Space Administration, National Science Foundation, and U.S. Department of Agriculture/National Institute of Food and Agriculture. As made clear at this week’s White House Water Summit, the world’s water systems are under tremendous stress, and new and emerging technologies will play a critical role in ensuring a sustainable water future.

The white paper (12 pp.) is titled: Water Sustainability through Nanotechnology: Nanoscale Solutions for a Global-Scale Challenge and describes the thrusts in more detail.

A March 22, 2016 US White House fact sheet lays out more details including funding,

Click here to learn more about all of the commitments and announcements being made today. They include:

  • Nearly $4 billion in private capital committed to investment in a broad range of water-infrastructure projects nationwide. This includes $1.5 billion from Ultra Capital to finance decentralized and scalable water-management solutions, and $500 million from Sustainable Water to develop water reclamation and reuse systems.
  • More than $1 billion from the private sector over the next decade to conduct research and development into new technologies. This includes $500 million from GE to fuel innovation, expertise, and global capabilities in advanced water, wastewater, and reuse technologies.
  • A Presidential Memorandum and supporting Action Plan on building national capabilities for long-term drought resilience in the United States, including by setting drought resilience policy goals, directing specific drought resilience activities to be completed by the end of the year, and permanently establishing the National Drought Resilience Partnership as an interagency task force responsible for coordinating drought-resilience, response, and recovery efforts.
  • Nearly $35 million this year in Federal grants from the Environmental Protection Agency, the National Oceanic and Atmospheric Administration, the National Science Foundation, and the U.S. Department of Agriculture to support cutting-edge water science;
  • The release of a new National Water Model that will dramatically enhance the Nation’s river-forecasting capabilities by delivering forecasts for approximately 2.7 million locations, up from 4,000 locations today (a 700-fold increase in forecast density).

This seems promising and hopefully other countries will follow suit.

Monsieur Kilogram: an SI (International System of Units) Superhero

I wouldn’t have thought that measurement was such a crucial issue that it would require a superhero team but I have to admit the folks at the US National Institute of Standards and Technology (NIST) make a compelling case in a Feb. 2, 2016 NIST news release (also on EurekAlert but dated Feb. 4, 2016),

The nefarious Major Uncertainty has kidnapped Monsieur Kilogram, putting the world’s measurements of mass in jeopardy. As the world spirals into “Mass Hysteria,” the remaining SI Superheroes, champions of the metric system, leap into action to save the day, and hopefully Monsieur Kilogram as well.

This crisis kicks off the third and latest adventure from the League of SI Superheroes, the animated online series from the National Institute of Standards and Technology (NIST). “Mass Hysteria” touches upon a topic—how to redefine the basic unit of mass known as the kilogram—that represents a cutting-edge undertaking for researchers working to modernize the worldwide metric measurement system known as the International System of Units (SI).

From the very big to the very small, accurately measuring mass is important in the world around us. For example, many of the products you buy at the grocery store and other places are sold by mass or the related quantity of weight. Roughly speaking, mass is the amount and type of “stuff” there is in something, and weight is the force pulling on the mass by gravity. The masses of every ingredient in medications from aspirin tablets to cancer drugs are carefully measured to ensure that they are both safe and effective. In many cases, medical doctors consider the mass of the patient to determine the dosage of the medications they prescribe as well. And both the fuel and the amount of thrust produced by the huge engines that power airplanes and rockets depend on mass.

Small errors of even a few milligrams per kilogram may not sound like much, but they can be costly when measuring huge quantities of something like a tanker ship full of grain or oil. With medicines, slightly too little of a chemical could make it ineffective and slightly too much could be toxic.
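To see why a few milligrams per kilogram matters at scale, here is a small illustrative calculation of my own (the cargo size and error are made-up numbers, not NIST’s):

```python
# Hypothetical example: the cost of a small mass-calibration error at large scale.
error_mg_per_kg = 5                 # "a few milligrams per kilogram"
cargo_kg = 100_000_000              # a 100,000-tonne tanker of grain or oil
discrepancy_kg = cargo_kg * error_mg_per_kg * 1e-6
print(discrepancy_kg)               # 500 kg of cargo unaccounted for
```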

Here’s a peek at Monsieur Kilogram who seems like a pretty tough customer,

Monsieur Kilogram, a character in NIST’s League of SI Superheroes animated online series, is able to determine the mass of any object simply by holding it. ©NIST

Getting back to the news release and the importance of accurate measurement,

Being the last standard unit of measure still based on an actual physical object, in this case a golf-ball-sized cylinder of platinum and iridium, the kilogram is vulnerable to damage, as well as being lost or even stolen. While the international prototype kilogram itself cannot change because it is the kilogram by definition, copies of the international prototype that many countries use as their standard of mass have been gaining or losing mass relative to it.

The SI Superheroes’ latest episode briefly explores one of the efforts to redefine the kilogram in terms of natural forces called the watt balance, a complex machine that uses electric and magnetic forces to balance a 1-kilogram mass against the Earth’s gravity. Precise measurements related to these forces can then be used to provide a consistent definition of the amount of mass in the kilogram.
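For readers who like equations, the principle behind the watt balance can be summarized compactly (my own sketch of the standard textbook argument, not part of the news release). In weighing mode the electromagnetic force on a current-carrying coil balances the weight of the mass; in velocity mode the same coil is moved through the magnetic field, which lets the hard-to-measure field-geometry factor cancel out:

```latex
% Weighing mode: electromagnetic force balances gravity
m g = B L I
% Velocity mode: moving the same coil at speed v induces a voltage
U = B L v
% Eliminating BL links mechanical and electrical power (hence "watt" balance):
m g v = U I \quad\Rightarrow\quad m = \frac{U I}{g v}
```

Because the voltage U and current I can be measured against quantum electrical standards tied to the Planck constant, this relation lets a mass be realized from fundamental constants rather than from a metal artifact.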

While the superheroes’ antics are not exactly representative of the efforts of actual researchers, super scientists at NIST and elsewhere have been working for years to reduce the errors in their measurement of this quantity to the point where the watt balance can take over for the international prototype kilogram.

They are closing in on their goal, and it is widely anticipated that the kilogram will be redefined in 2018.

Once this process is complete, the kilogram will have been freed from its dependence on a physical object, and anyone with the right technical expertise and equipment will be able to determine the mass of a kilogram for themselves.

Will the SI Superheroes finish the watt balance in time? Watch the next episode and find out! Modeled on the seven base units of the International System of Units, or SI, the League of SI Superheroes are:

Meter Man: With his laser interferometer eyes, graduated arms and extendable body, no dimension is too big or too small for Meter Man to measure.

The Mole: Able to sniff out and count the atoms of every element, the Mole is a master of chemistry.

Professor Second: By reading the vibrations of her laser-cooled cesium atoms, Professor Second can measure any frequency and calibrate any clock.

Monsieur Kilogram: Monsieur Kilogram loves lifting weights, and it shows. He is able to determine the mass of any object simply by holding it.

Ms. Ampere: Ms. Ampere rules the flow of electrons—electrical current—and makes sure that the right amount gets where it needs to go.

Dr. Kelvin: Dr. Kelvin heats up or cools down objects by speeding up or slowing down particles inside them. He can also measure the temperature of anything in the universe with his trusty thermometer.

Candela: Don’t let her small size fool you. Candela’s power over light helps to brighten the whole world.

Catch up on their adventures at The League of SI Superheroes kids’ page. Teachers can also request a classroom set of SI educational materials by submitting their contact information and grade level to TheSI@nist.gov.

Here’s the latest adventure,

Enjoy!

A study in contrasts: innovation and education strategies in US and British Columbia (Canada)

It’s always interesting to contrast two approaches to the same issue, in this case, innovation and education strategies designed to improve the economies of the United States and of British Columbia, a province in Canada.

One of the major differences regarding education in the US and in Canada is that the Canadian federal government, unlike the US federal government, has no jurisdiction over the matter. Education is strictly a provincial responsibility.

I recently wrote a commentary (a Jan. 19, 2016 posting) about the BC government’s Jan. 18, 2016 announcement of its innovation strategy, with a special emphasis on the education aspect. Premier Christy Clark focused largely on the notion of embedding courses on computer coding in schools from K-12 (kindergarten through grade 12), as Jonathon Narvey noted in his Jan. 19, 2016 event recap for Betakit,

While many in the tech sector will be focused on the short-term benefits of a quick injection of large capital [a $100M BC Tech Fund as part of a new strategy was announced in Dec. 2015 but details about the new #BCTECH Strategy were not shared until Jan. 18, 2016], the long-term benefits for the local tech sector are being seeded in local schools. More than 600,000 BC students will be getting basic skills in the K-12 curriculum, with coding academies, more work experience electives and partnerships between high school and post-secondary institutions.

Here’s what I had to say in my commentary (from the Jan. 19, 2016 posting),

… the government wants to embed computer coding into the education system for K-12 (kindergarten to grade 12). One determined reporter (Canadian Press if memory serves) attempted to find out how much this would cost. No answer was forthcoming although there were many words expended. Whether this failure was due to ignorance (disturbing!) or a reluctance to share (also disturbing!) was impossible to tell. Another reporter (Georgia Straight) asked about equipment (coding can be taught with pen and paper but hardware is better). … Getting back to the reporter’s question, no answer was forthcoming although the speaker was loquacious.

Another reporter asked if the government had found any jurisdictions doing anything similar regarding computer coding. It seems they did consider other jurisdictions although it was claimed that BC is the first to strike out in this direction. Oddly, no one mentioned Estonia, known in some circles as E-stonia, where the entire school system was online by the late 1990s in an initiative known as the ‘Tiger Leap Foundation’ which also supported computer coding classes in secondary school (there’s more in Tim Mansel’s May 16, 2013 article about Estonia’s then latest initiative to embed computer coding into grade school.) …

Aside from the BC government’s failure to provide details, I am uncomfortable with what I see as an overemphasis on computer coding that suggests a narrow focus on what constitutes a science and technology strategy for education. I find the US approach closer to what I favour although I may be biased since they are building their strategy around nanotechnology education.

The US approach had been announced in dribs and drabs until recently when a Jan. 26, 2016 news item on Nanotechnology Now indicated a broad-based plan for nanotechnology education (and computer coding),

Over the past 15 years, the Federal Government has invested over $22 billion in R&D under the auspices of the National Nanotechnology Initiative (NNI) to understand and control matter at the nanoscale and develop applications that benefit society. As these nanotechnology-enabled applications become a part of everyday life, it is important for students to have a basic understanding of material behavior at the nanoscale, and some states have even incorporated nanotechnology concepts into their K-12 science standards. Furthermore, application of the novel properties that exist at the nanoscale, from gecko-inspired climbing gloves and invisibility cloaks, to water-repellent coatings on clothes or cellphones, can spark students’ excitement about science, technology, engineering, and mathematics (STEM).

An earlier Jan. 25, 2016 White House blog posting by Lisa Friedersdorf and Lloyd Whitman introduced the notion that nanotechnology is viewed as foundational and a springboard for encouraging interest in STEM (science, technology, engineering, and mathematics) careers while outlining several formal and informal education efforts,

The Administration’s updated Strategy for American Innovation, released in October 2015, identifies nanotechnology as one of the emerging “general-purpose technologies”—a technology that, like the steam engine, electricity, and the Internet, will have a pervasive impact on our economy and our society, with the ability to create entirely new industries, create jobs, and increase productivity. To reap these benefits, we must train our Nation’s students for these high-tech jobs of the future. Fortunately, the multidisciplinary nature of nanotechnology and the unique and fascinating phenomena that occur at the nanoscale mean that nanotechnology is a perfect topic to inspire students to pursue careers in science, technology, engineering, and mathematics (STEM).

The Nanotechnology: Super Small Science series [mentioned in my Jan. 21, 2016 posting] is just the latest example of the National Nanotechnology Initiative (NNI)’s efforts to educate and inspire our Nation’s students. Other examples include:

The announcement about computer coding and courses being integrated in the US education curricula K-12 was made in US President Barack Obama’s 2016 State of the Union speech and covered in a Jan. 30, 2016 article by Jessica Hullinger for Fast Company,

In his final State Of The Union address earlier this month, President Obama called for providing hands-on computer science classes for all students to make them “job ready on day one.” Today, he is unveiling how he plans to do that with his upcoming budget.

The President’s Computer Science for All Initiative seeks to provide $4 billion in funding for states and an additional $100 million directly to school districts in a push to provide access to computer science training in K-12 public schools. The money would go toward things like training teachers, providing instructional materials, and getting kids involved in computer science early in elementary and middle school.

There are more details in Hullinger’s article and in a Jan. 30, 2016 White House blog posting by Megan Smith,

Computer Science for All is the President’s bold new initiative to empower all American students from kindergarten through high school to learn computer science and be equipped with the computational thinking skills they need to be creators in the digital economy, not just consumers, and to be active citizens in our technology-driven world. Our economy is rapidly shifting, and both educators and business leaders are increasingly recognizing that computer science (CS) is a “new basic” skill necessary for economic opportunity and social mobility.

CS for All builds on efforts already being led by parents, teachers, school districts, states, and private sector leaders from across the country.

Nothing says one approach has to be better than the other as there’s usually more than one way to accomplish a set of goals. As well, it’s unfair to expect a provincial government to emulate the federal government of a larger country with more money to spend. I just wish the BC government (a) had shared details such as the budget allotment for their initiative and (b) would hint at a more imaginative, long range view of STEM education.

Going back to Estonia one last time, in addition to the country’s recent introduction of computer coding classes in grade school, it has also embarked on a nanotechnology/nanoscience educational and entrepreneurial programme as noted in my Sept. 30, 2014 posting,

The University of Tartu (Estonia) announced in a Sept. 29, 2014 press release an educational and entrepreneurial programme about nanotechnology/nanoscience for teachers and students,

To bring nanoscience closer to pupils, educational researchers of the University of Tartu decided to implement the European Union LLP Comenius project “Quantum Spin-Off – connecting schools with high-tech research and entrepreneurship”. The objective of the project is to build a kind of a bridge: at one end, pupils can familiarise themselves with modern science, and at the other, experience its application opportunities at high-tech enterprises. “We also wish to inspire these young people to choose a specialisation related to science and technology in the future,” added Lukk [Maarika Lukk, Coordinator of the project].

The pupils can choose between seven topics of nanotechnology: the creation of artificial muscles, microbiological fuel elements, manipulation of nanoparticles, nanoparticles and ionic liquids as oil additives, materials used in regenerative medicine, deposition and 3D-characterisation of atomically designed structures and a topic covered in English, “Artificial robotic fish with EAP elements”.

Learning is based on study modules in the field of nanotechnology. In addition, each team of pupils will read a scientific publication, selected for them by an expert of that particular field. In that way, pupils will develop an understanding of the field and of scientific texts. On the basis of the scientific publication, the pupils prepare their own research project and a business plan suitable for applying the results of the project.

In each field, experts of the University of Tartu will help to understand the topics. Participants will visit a nanotechnology research laboratory and enterprises using nanotechnologies.

The project lasts for two years and it is also implemented in Belgium, Switzerland and Greece.

As they say, time will tell.

Developing optical microscopes that measure features down to 10 nanometer level on computer chips

The US National Institute of Standards and Technology (NIST) issued a Dec. 2, 2015 news release (also on EurekAlert) announcing a new kind of optical microscope and its possible impact on the semiconductor industry,

National Institute of Standards and Technology (NIST) researchers are seeing the light, but in an altogether different way. And how they are doing it just might be the semiconductor industry’s ticket for extending its use of optical microscopes to measure computer chip features that are approaching 10 nanometers, tiny fractions of the wavelength of light.

The news release goes on to provide details and an explanation of scatterfield imaging,

Using a novel microscope that combines standard through-the-lens viewing with a technique called scatterfield imaging, the NIST team accurately measured patterned features on a silicon wafer that were 30 times smaller than the wavelength of light (450 nanometers) used to examine them. They report* that measurements of the etched lines–as thin as 16 nanometers wide–on the SEMATECH-fabricated wafer were accurate to one nanometer. With the technique, they spotted variations in feature dimensions amounting to differences of a few atoms.

Measurements were confirmed by those made with an atomic force microscope, which achieves sub-nanometer resolution, but is considered too slow for online quality-control measurements. Combined with earlier results, the NIST researchers write, the new proof-of-concept study* suggests that the innovative optical approach could be a “realistic solution to a very challenging problem” facing chip makers and others aiming to harness advances in nanotechnology. All need the means for "nondestructive measurement of nanometer-scale structures with sub-nanometer sensitivity while still having high throughput."

Light-based, or optical, microscopes can't "see" features smaller than the wavelength of light, at least not in the crisp detail necessary for making accurate measurements. However, light does scatter when it strikes so-called subwavelength features and patterned arrangements of such features. "Historically, we would ignore this scattered light because it did not yield sufficient resolution," explains Richard Silver, the physicist who initiated NIST's scatterfield imaging effort. "Now we know it contains helpful information that provides signatures telling us something about where the light came from."

With scatterfield imaging, Silver and colleagues methodically illuminate a sample with polarized light from different angles. From this collection of scattered light–nothing more than a sea of wiggly lines to the untrained eye–the NIST team can extract characteristics of the bounced lightwaves that, together, reveal the geometry of features on the specimen.

Light-scattering data are gathered in slices, which together image the volume of scattered light above and into the sample. These slices are analyzed and reconstructed to create a three-dimensional representation. The process is akin to a CT scan, except that the slices are collections of interfering waves, not cross-sectional pictures.

“It’s the ensemble of data that tells us what we’re after,” says project leader Bryan Barnes. “We may not be able to see the lines on the wafer, but we can tell you what you need to know about them–their size, their shape, their spacing.”

Scatterfield imaging has critical prerequisites that must be met before it can yield useful data for high-accuracy measurements of exceedingly small features. Key steps entail detailed evaluation of the path light takes as it beams through lenses, apertures and other system elements before reaching the sample. The path traversed by light scattering from the specimen undergoes the same level of scrutiny. Fortunately, scatterfield imaging lends itself to thorough characterization of both sequences of optical devices, according to the researchers. These preliminary steps are akin to error mapping so that recognized sources of inaccuracy are factored out of the data.

The method also benefits from a little advance intelligence–the as-designed arrangement of circuit lines on a chip, down to the size of individual features. Knowing what is expected to be the result of the complex chip-making process sets up a classic matchup of theory vs. experiment.

The NIST researchers can use standard equations to simulate light scattering from an ideal, defect-free pattern and, in fact, any variation thereof. Using wave analysis software they developed, the team has assembled an indexed library of light-scattering reference models. So once a specimen is scanned, the team relies on computers to compare their real-world data to models and to find close matches.
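Conceptually, that library-matching step is a search for the simulated scattering signature that best fits the measurement. Here is a deliberately simplified sketch of my own (hypothetical data and function names, not NIST’s wave-analysis software) showing the idea as a least-squares lookup:

```python
import numpy as np

def best_match(measured, library):
    """Return the geometry whose simulated signature is closest to the measurement.

    measured : 1-D array of optical intensities from the instrument
    library  : dict mapping geometry parameters, e.g. (height_nm, width_nm),
               to simulated 1-D signatures of the same length
    """
    residuals = {
        geometry: float(np.sum((measured - simulated) ** 2))
        for geometry, simulated in library.items()
    }
    return min(residuals, key=residuals.get)

# Toy usage: four simulated line geometries; pick the one nearest the measurement.
library = {(h, w): np.array([h * 0.01, w * 0.02, h * w * 1e-4])
           for h in (50, 60) for w in (14, 16)}
measurement = np.array([0.60, 0.32, 0.096])
print(best_match(measurement, library))   # -> (60, 16)
```

In the real analysis the comparison is against rigorously simulated scatterfield images, and the residual differences are then refined further, as the next paragraph describes.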

From there, succeeding rounds of analysis home in on the remaining differences, reducing them until the only ones that remain are due to variations in geometry such as irregularities in the height, width, or shape of a line.

Measurement results achieved with the NIST approach might be said to cast light itself in an entirely new light. Their new study, the researchers say, shows that once disregarded scattered light “contains a wealth of accessible optical information.”

Next steps include extending the technique to even shorter wavelengths of light, down to ultraviolet, or 193 nanometers. The aim is to accurately measure features as small as 5 nanometers.

This work is part of a larger NIST effort to supply measurement tools that enable the semiconductor industry to continue doubling the number of devices on a chip about every two years and to help other industries make products with nanoscale features. Recently, NIST and Intel researchers reported using an X-ray technique to accurately measure features on a silicon chip to within fractions of a nanometer.

Here’s a link to and a citation for a PDF of the paper,

Deep-subwavelength Nanometric Image Reconstruction using Fourier Domain Optical Normalization by Jing Qin, Richard M. Silver, Bryan M. Barnes, Hui Zhou, Ronald G. Dixson, and Mark-Alexander Henn. Light: Science & Applications accepted article preview 5 November 2015; e16038 doi: 10.1038/lsa.2016.38

[Note:] This is a PDF file of an unedited peer-reviewed manuscript that has been accepted for publication. NPG are providing this early version of the manuscript as a service to our customers. The manuscript will undergo copy editing, typesetting and a proof review before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers apply.

This seems to be an open access paper but it is an early version.

Nanotechnology research protocols for Environment, Health and Safety Studies in US and a nanomedicine characterization laboratory in the European Union

I have two items relating to nanotechnology and the development of protocols. The first item concerns the launch of a new web portal by the US National Institute of Standards and Technology.

US National Institute of Standards and Technology (NIST)

From a July 1, 2015 news item on Azonano,

As engineered nanomaterials increasingly find their way into commercial products, researchers who study the potential environmental or health impacts of those materials face a growing challenge to accurately measure and characterize them. These challenges affect measurements of basic chemical and physical properties as well as toxicology assessments.

To help nano-EHS (Environment, Health and Safety) researchers navigate the often complex measurement issues, the National Institute of Standards and Technology (NIST) has launched a new website devoted to NIST-developed (or co-developed) and validated laboratory protocols for nano-EHS studies.

A July 1, 2015 NIST news release on EurekAlert, which originated the news item, offers more details about the information available through the web portal,

In common lab parlance, a “protocol” is a specific step-by-step procedure used to carry out a measurement or related activity, including all the chemicals and equipment required. Any peer-reviewed journal article reporting an experimental result has a “methods” section where the authors document their measurement protocol, but those descriptions are necessarily brief and condensed, and may lack validation of any sort. By comparison, on NIST’s new Protocols for Nano-EHS website the protocols are extraordinarily detailed. For ease of citation, they’re published individually–each with its own unique digital object identifier (DOI).

The protocols detail not only what you should do, but why and what could go wrong. The specificity is important, according to program director Debra Kaiser, because of the inherent difficulty of making reliable measurements of such small materials. “Often, if you do something seemingly trivial–use a different size pipette, for example–you get a different result. Our goal is to help people get data they can reproduce, data they can trust.”

A typical caution, for example, notes that if you’re using an instrument that measures the size of nanoparticles in a solution by how they scatter light, it’s important also to measure the transmission spectrum of the particles if they’re colored, because if they happen to absorb light strongly at the same frequency as your instrument, the result may be biased.

“These measurements are difficult because of the small size involved,” explains Kaiser. “Very few new instruments have been developed for this. People are adapting existing instruments and methods for the job, but often those instruments are being operated close to their limits and the methods were developed for chemicals or bulk materials and not for nanomaterials.”

“For example, NIST offers a reference material for measuring the size of gold nanoparticles in solution, and we report six different sizes depending on the instrument you use. We do it that way because different instruments sense different aspects of a nanoparticle’s dimensions. An electron microscope is telling you something different than a dynamic light scattering instrument, and the researcher needs to understand that.”

The nano-EHS protocols offered by the NIST site, Kaiser says, could form the basis for consensus-based, formal test methods such as those published by ASTM and ISO.

NIST’s nano-EHS protocol site currently lists 12 different protocols in three categories: sample preparation, physico-chemical measurements and toxicological measurements. More protocols will be added as they are validated and documented. Suggestions for additional protocols are welcome at nanoprotocols@nist.gov.

The next item concerns European nanomedicine.

CEA-LETI and Europe’s first nanomedicine characterization laboratory

A July 1, 2015 news item on Nanotechnology Now describes the partnership which has led to launch of the new laboratory,

CEA-Leti today announced the launch of the European Nano-Characterisation Laboratory (EU-NCL) funded by the European Union’s Horizon 2020 research and innovation programme[1]. Its main objective is to reach a level of international excellence in nanomedicine characterisation for medical indications like cancer, diabetes, inflammatory diseases or infections, and make it accessible to all organisations developing candidate nanomedicines prior to their submission to regulatory agencies to get the approval for clinical trials and, later, marketing authorization.

“As reported in the ETPN White Paper[2], there is a lack of infrastructure to support nanotechnology-based innovation in healthcare,” said Patrick Boisseau, head of business development in nanomedicine at CEA-Leti and chairman of the European Technology Platform Nanomedicine (ETPN). “Nanocharacterisation is the first bottleneck encountered by companies developing nanotherapeutics. The EU-NCL project is of most importance for the nanomedicine community, as it will contribute to the competitiveness of nanomedicine products and tools and facilitate regulation in Europe.”

EU-NCL is partnered with the sole international reference facility, the Nanotechnology Characterization Lab of the National Cancer Institute in the U.S. (US-NCL)[3], to get faster international harmonization of analytical protocols.

“We are excited to be part of this cooperative arrangement between Europe and the U.S.,” said Scott E. McNeil, director of U.S. NCL. “We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.”

A July 2, 2015 EMPA (Swiss Federal Laboratories for Materials Science and Technology) news release on EurekAlert provides more detail about the laboratory and the partnerships,

The «European Nanomedicine Characterization Laboratory» (EU-NCL), which was launched on 1 June 2015, has a clear-cut goal: to help bring more nanomedicine candidates into the clinic and on the market, for the benefit of patients and the European pharmaceutical industry. To achieve this, EU-NCL is partnered with the sole international reference facility, the «Nanotechnology Characterization Laboratory» (US-NCL) of the US-National Cancer Institute, to get faster international harmonization of analytical protocols. EU-NCL is also closely connected to national medicine agencies and the European Medicines Agency to continuously adapt its analytical services to requests of regulators. EU-NCL is designed, organized and operated according to the highest EU regulatory and quality standards. «We are excited to be part of this cooperative project between Europe and the U.S.,» says Scott E. McNeil, director of US-NCL. «We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.»

Nine partners from eight countries

EU-NCL, which is funded by the EU for a four-year period with nearly 5 million Euros, brings together nine partners from eight countries: CEA-Tech in Leti and Liten, France, the coordinator of the project; the Joint Research Centre of the European Commission in Ispra, Italy; European Research Services GmbH in Münster, Germany; Leidos Biomedical Research, Inc. in Frederick, USA; Trinity College in Dublin, Ireland; SINTEF in Oslo, Norway; the University of Liverpool in the UK; Empa, the Swiss Federal Laboratories for Materials Science and Technology in St. Gallen, Switzerland; Westfälische Wilhelms-Universität (WWU) and Gesellschaft für Bioanalytik, both in Münster, Germany. Together, the partnering institutions will provide a trans-disciplinary testing infrastructure covering a comprehensive set of preclinical characterization assays (physical, chemical, in vitro and in vivo biological testing), which will allow researchers to fully comprehend the biodistribution, metabolism, pharmacokinetics, safety profiles and immunological effects of their medicinal nano-products. The project will also foster the use and deployment of standard operating procedures (SOPs), benchmark materials and quality management for the preclinical characterization of medicinal nano-products. Yet another objective is to promote intersectoral and interdisciplinary communication among key drivers of innovation, especially between developers and regulatory agencies.

The goal: to bring safe and efficient nano-therapeutics faster to the patient

Within EU-NCL, six analytical facilities will offer transnational access to their existing analytical services for public and private developers, and will also develop new or improved analytical assays to keep EU-NCL at the cutting edge of nanomedicine characterization. A complementary set of networking activities will enable EU-NCL to deliver to European academic or industrial scientists the high-quality analytical services they require for accelerating the industrial development of their candidate nanomedicines. The Empa team of Peter Wick at the «Particles-Biology Interactions» lab will be in charge of the quality management of all analytical methods, a key task to guarantee the best possible reproducibility and comparability of the data between the various analytical labs within the consortium. «EU-NCL supports our research activities in developing innovative and safe nanomaterials for healthcare within an international network, which will actively shape future standards in nanomedicine and strengthen Empa as an enabler to facilitate the transfer of novel nanomedicines from bench to bedside», says Wick.

You can find more information about the laboratory on the Horizon 2020 (a European Union science funding programme) project page for the EU-NCL laboratory. For anyone curious about CEA-Leti, it’s a double-layered organization. CEA is France’s Alternative Energies and Atomic Energy Commission (Commissariat à l’énergie atomique et aux énergies alternatives); its French-language site is here (there is a clickable English-language option on the page). Leti, short for Laboratoire d’électronique et de technologie de l’information (Laboratory of Electronics and Information Technology), is one of the CEA’s institutes and is known as either Leti or CEA-Leti. Here’s the Leti website (this is the English language version).

US White House establishes new initiatives to commercialize nanotechnology

As I’ve noted several times, there’s a strong push in the US to commercialize nanotechnology, and May 20, 2015 was a banner day for those efforts. The US White House announced a series of new initiatives to speed commercialization in a May 20, 2015 posting by Lloyd Whitman, Tom Kalil, and JJ Raynor,

Today, May 20 [2015], the National Economic Council and the Office of Science and Technology Policy held a forum at the White House to discuss opportunities to accelerate the commercialization of nanotechnology.

In recognition of the importance of nanotechnology R&D, representatives from companies, government agencies, colleges and universities, and non-profits are announcing a series of new and expanded public and private initiatives that complement the Administration’s efforts to accelerate the commercialization of nanotechnology and expand the nanotechnology workforce:

  • The Colleges of Nanoscale Science and Engineering at SUNY Polytechnic Institute in Albany, NY and the National Institute for Occupational Safety and Health are launching the Nano Health & Safety Consortium to advance research and guidance for occupational safety and health in nanoelectronics and other nanomanufacturing industry settings.
  • Raytheon has brought together a group of representatives from the defense industry and the Department of Defense to identify collaborative opportunities to advance nanotechnology product development, manufacturing, and supply-chain support with a goal of helping the U.S. optimize development, foster innovation, and take more rapid advantage of new commercial nanotechnologies.
  • BASF Corporation is taking a new approach to finding solutions to nanomanufacturing challenges. In March, BASF launched a prize-based “NanoChallenge” designed to drive new levels of collaborative innovation in nanotechnology while connecting with potential partners to co-create solutions that address industry challenges.
  • OCSiAl is expanding the eligibility of its “iNanoComm” matching grant program that provides low-cost, single-walled carbon nanotubes to include more exploratory research proposals, especially proposals for projects that could result in the creation of startups and technology transfers.
  • The NanoBusiness Commercialization Association (NanoBCA) is partnering with Venture for America and working with the National Science Foundation (NSF) to promote entrepreneurship in nanotechnology.  Three companies (PEN, NanoMech, and SouthWest NanoTechnologies) are offering to support NSF’s Innovation Corps (I-Corps) program with mentorship for entrepreneurs-in-training and, along with three other companies (NanoViricides, mPhase Technologies, and Eikos), will partner with Venture for America to hire recent graduates into nanotechnology jobs, thereby strengthening new nanotech businesses while providing needed experience for future entrepreneurs.
  • TechConnect is establishing a Nano and Emerging Technologies Student Leaders Conference to bring together the leaders of nanotechnology student groups from across the country. The conference will highlight undergraduate research and connect students with venture capitalists, entrepreneurs, and industry leaders.  Five universities have already committed to participating, led by the University of Virginia Nano and Emerging Technologies Club.
  • Brewer Science, through its Global Intern Program, is providing more than 30 students from high schools, colleges, and graduate schools across the country with hands-on experience in a wide range of functions within the company.  Brewer Science plans to increase the number of its science and engineering interns by 50% next year and has committed to sharing best practices with other nanotechnology businesses interested in how internship programs can contribute to a small company’s success.
  • The National Institute of Standards and Technology’s Center for Nanoscale Science and Technology is expanding its partnership with the National Science Foundation to provide hands-on experience for students in NSF’s Advanced Technology Education program. The partnership will now run year-round and will include opportunities for students at Hudson Valley Community College and the University of the District of Columbia Community College.
  • Federal agencies participating in the NNI [US National Nanotechnology Initiative], supported by the National Nanotechnology Coordination Office [NNCO], are launching multiple new activities aimed at educating students and the public about nanotechnology, including image and video contests highlighting student research, a new webinar series focused on providing nanotechnology information for K-12 teachers, and a searchable web portal on nano.gov of nanoscale science and engineering resources for teachers and professors.

Interestingly, May 20, 2015 was also the day the NNCO held its second webinar for small- and medium-size businesses in the nanotechnology community. You can find out more about that webinar and future ones by following the links in my May 13, 2015 posting.

Since the US White House announcement, OCSiAl has issued a May 26, 2015 news release which provides a brief history and more details about its newly expanded iNanoComm program,

OCSiAl launched the iNanoComm (Integrated Nanotube Commercialization Award) program in February 2015 to help researchers lower the cost of their most promising R&D projects dedicated to SWCNT [single-walled carbon nanotube] applications. The first round received 33 applications from 28 university groups, including The Smalley-Curl Center for Nanoscale Science and Technology at Rice University and the Concordia Center for Composites at Concordia University [Canada] among others. [emphasis mine] The aim of iNanoComm is to stimulate universities and research organizations to develop innovative market products based on nano-augmented materials, also known as clean materials.

Now the program’s criteria are being broadened to enable greater private sector engagement in potential projects and the creation of partnerships in commercializing nanotechnology. The program will now support early stage commercialization efforts connected to university research in the form of start-ups, technology transfers, new businesses and university spinoffs to support the mass commercialization of SWCNT products and technologies.

The announcement of the program’s expansion took place at the 2015 Roundtable of the US NanoBusiness Commercialization Association (NanoBCA), the world’s first non-profit association focused on the commercialization of nanotechnologies. NanoBCA is dedicated to creating an environment that nurtures research and innovation in nanotechnology, promotes tech-transfer of nanotechnology from academia to industry, encourages private capital investments in nanotechnology companies, and helps its corporate members bring innovative nanotechnology products to market.

“Enhancing iNanoComm as a ‘start-up incubator’ is a concrete step in promoting single-wall carbon nanotube applications in the commercial world,” said Max Atanassov, CEO of OCSiAl USA. “It was the logical thing for us to do, now that high quality carbon nanotubes have become broadly available and are affordably priced to be used on a mass industrial scale.”

Vince Caprio, Executive Director of NanoBCA, added that “iNanoComm will make an important contribution to translating fundamental nanotechnology research into commercial products. By facilitating the formation of more start-ups, it will encourage more scientists to pursue their dreams and develop their ideas into commercially successful businesses.”

For more information on the program expansion and how it can reduce the cost of early stage research connected to university projects, visit the iNanoComm website at www.inanocomm.org or contact info@inanocomm.org.

h/t Azonano May 27, 2015 news item