Tag Archives: US National Institute of Standards and Technology

A study in contrasts: innovation and education strategies in US and British Columbia (Canada)

It’s always interesting to contrast two approaches to the same issue, in this case, innovation and education strategies designed to improve the economies of the United States and of British Columbia, a province in Canada.

One of the major differences regarding education in the US and in Canada is that the Canadian federal government, unlike the US federal government, has no jurisdiction over the matter. Education is strictly a provincial responsibility.

I recently wrote a commentary (a Jan. 19, 2016 posting) about the BC government’s Jan. 18, 2016 announcement of its innovation strategy, with a special emphasis on the education aspect. Premier Christy Clark focused largely on the notion of embedding courses on computer coding in schools from K-12 (kindergarten through grade 12), as Jonathon Narvey noted in his Jan. 19, 2016 event recap for Betakit,

While many in the tech sector will be focused on the short-term benefits of a quick injection of large capital [a $100M BC Tech Fund as part of a new strategy was announced in Dec. 2015 but details about the new #BCTECH Strategy were not shared until Jan. 18, 2016], the long-term benefits for the local tech sector are being seeded in local schools. More than 600,000 BC students will be getting basic skills in the K-12 curriculum, with coding academies, more work experience electives and partnerships between high school and post-secondary institutions.

Here’s what I had to say in my commentary (from the Jan. 19, 2016 posting),

… the government wants to embed computer coding into the education system for K-12 (kindergarten to grade 12). One determined reporter (Canadian Press if memory serves) attempted to find out how much this would cost. No answer was forthcoming although there were many words expended. Whether this failure was due to ignorance (disturbing!) or a reluctance to share (also disturbing!) was impossible to tell. Another reporter (Georgia Straight) asked about equipment (coding can be taught with pen and paper but hardware is better). … Getting back to the reporter’s question, no answer was forthcoming although the speaker was loquacious.

Another reporter asked if the government had found any jurisdictions doing anything similar regarding computer coding. It seems they did consider other jurisdictions although it was claimed that BC is the first to strike out in this direction. Oddly, no one mentioned Estonia, known in some circles as E-stonia, where the entire school system was online by the late 1990s in an initiative known as the ‘Tiger Leap Foundation’ which also supported computer coding classes in secondary school (there’s more in Tim Mansel’s May 16, 2013 article about Estonia’s then latest initiative to embed computer coding into grade school.) …

Aside from the BC government’s failure to provide details, I am uncomfortable with what I see as an overemphasis on computer coding that suggests a narrow focus on what constitutes a science and technology strategy for education. I find the US approach closer to what I favour although I may be biased since they are building their strategy around nanotechnology education.

The US approach had been announced in dribs and drabs until recently when a Jan. 26, 2016 news item on Nanotechnology Now indicated a broad-based plan for nanotechnology education (and computer coding),

Over the past 15 years, the Federal Government has invested over $22 billion in R&D under the auspices of the National Nanotechnology Initiative (NNI) to understand and control matter at the nanoscale and develop applications that benefit society. As these nanotechnology-enabled applications become a part of everyday life, it is important for students to have a basic understanding of material behavior at the nanoscale, and some states have even incorporated nanotechnology concepts into their K-12 science standards. Furthermore, application of the novel properties that exist at the nanoscale, from gecko-inspired climbing gloves and invisibility cloaks, to water-repellent coatings on clothes or cellphones, can spark students’ excitement about science, technology, engineering, and mathematics (STEM).

An earlier Jan. 25, 2016 White House blog posting by Lisa Friedersdorf and Lloyd Whitman introduced the notion that nanotechnology is viewed as foundational and a springboard for encouraging interest in STEM (science, technology, engineering, and mathematics) careers while outlining several formal and informal education efforts,

The Administration’s updated Strategy for American Innovation, released in October 2015, identifies nanotechnology as one of the emerging “general-purpose technologies”—a technology that, like the steam engine, electricity, and the Internet, will have a pervasive impact on our economy and our society, with the ability to create entirely new industries, create jobs, and increase productivity. To reap these benefits, we must train our Nation’s students for these high-tech jobs of the future. Fortunately, the multidisciplinary nature of nanotechnology and the unique and fascinating phenomena that occur at the nanoscale mean that nanotechnology is a perfect topic to inspire students to pursue careers in science, technology, engineering, and mathematics (STEM).

The Nanotechnology: Super Small Science series [mentioned in my Jan. 21, 2016 posting] is just the latest example of the National Nanotechnology Initiative (NNI)’s efforts to educate and inspire our Nation’s students. Other examples include:

The announcement that computer coding courses would be integrated into US K-12 education curricula was made in US President Barack Obama’s 2016 State of the Union speech and covered in a Jan. 30, 2016 article by Jessica Hullinger for Fast Company,

In his final State Of The Union address earlier this month, President Obama called for providing hands-on computer science classes for all students to make them “job ready on day one.” Today, he is unveiling how he plans to do that with his upcoming budget.

The President’s Computer Science for All Initiative seeks to provide $4 billion in funding for states and an additional $100 million directly to school districts in a push to provide access to computer science training in K-12 public schools. The money would go toward things like training teachers, providing instructional materials, and getting kids involved in computer science early in elementary and middle school.

There are more details in Hullinger’s article and in a Jan. 30, 2016 White House blog posting by Megan Smith,

Computer Science for All is the President’s bold new initiative to empower all American students from kindergarten through high school to learn computer science and be equipped with the computational thinking skills they need to be creators in the digital economy, not just consumers, and to be active citizens in our technology-driven world. Our economy is rapidly shifting, and both educators and business leaders are increasingly recognizing that computer science (CS) is a “new basic” skill necessary for economic opportunity and social mobility.

CS for All builds on efforts already being led by parents, teachers, school districts, states, and private sector leaders from across the country.

Nothing says one approach has to be better than the other as there’s usually more than one way to accomplish a set of goals. As well, it’s unfair to expect a provincial government to emulate the federal government of a larger country with more money to spend. I just wish the BC government (a) had shared details such as the budget allotment for their initiative and (b) would hint at a more imaginative, long-range view of STEM education.

Going back to Estonia one last time, in addition to the country’s recent introduction of computer coding classes in grade school, it has also embarked on a nanotechnology/nanoscience educational and entrepreneurial programme as noted in my Sept. 30, 2014 posting,

The University of Tartu (Estonia) announced in a Sept. 29, 2014 press release an educational and entrepreneurial programme about nanotechnology/nanoscience for teachers and students,

To bring nanoscience closer to pupils, educational researchers of the University of Tartu decided to implement the European Union LLP Comenius project “Quantum Spin-Off – connecting schools with high-tech research and entrepreneurship”. The objective of the project is to build a kind of a bridge: at one end, pupils can familiarise themselves with modern science, and at the other, experience its application opportunities at high-tech enterprises. “We also wish to inspire these young people to choose a specialisation related to science and technology in the future,” added Lukk [Maarika Lukk, Coordinator of the project].

The pupils can choose between seven topics of nanotechnology: the creation of artificial muscles, microbiological fuel elements, manipulation of nanoparticles, nanoparticles and ionic liquids as oil additives, materials used in regenerative medicine, deposition and 3D-characterisation of atomically designed structures and a topic covered in English, “Artificial robotic fish with EAP elements”.

Learning is based on study modules in the field of nanotechnology. In addition, each team of pupils will read a scientific publication, selected for them by an expert of that particular field. In that way, pupils will develop an understanding of the field and of scientific texts. On the basis of the scientific publication, the pupils prepare their own research project and a business plan suitable for applying the results of the project.

In each field, experts of the University of Tartu will help to understand the topics. Participants will visit a nanotechnology research laboratory and enterprises using nanotechnologies.

The project lasts for two years and it is also implemented in Belgium, Switzerland and Greece.

As they say, time will tell.

Developing optical microscopes that measure features down to the 10 nanometer level on computer chips

The US National Institute of Standards and Technology (NIST) issued a Dec. 2, 2015 news release (also on EurekAlert) announcing a new kind of optical microscope and its possible impact on the semiconductor industry,

National Institute of Standards and Technology (NIST) researchers are seeing the light, but in an altogether different way. And how they are doing it just might be the semiconductor industry’s ticket for extending its use of optical microscopes to measure computer chip features that are approaching 10 nanometers, tiny fractions of the wavelength of light.

The news release goes on to provide details and an explanation of scatterfield imaging,

Using a novel microscope that combines standard through-the-lens viewing with a technique called scatterfield imaging, the NIST team accurately measured patterned features on a silicon wafer that were 30 times smaller than the wavelength of light (450 nanometers) used to examine them. They report* that measurements of the etched lines–as thin as 16 nanometers wide–on the SEMATECH-fabricated wafer were accurate to one nanometer. With the technique, they spotted variations in feature dimensions amounting to differences of a few atoms.

Measurements were confirmed by those made with an atomic force microscope, which achieves sub-nanometer resolution, but is considered too slow for online quality-control measurements. Combined with earlier results, the NIST researchers write, the new proof-of-concept study* suggests that the innovative optical approach could be a “realistic solution to a very challenging problem” facing chip makers and others aiming to harness advances in nanotechnology. All need the means for “nondestructive measurement of nanometer-scale structures with sub-nanometer sensitivity while still having high throughput.”

Light-based, or optical, microscopes can’t “see” features smaller than the wavelength of light, at least not in the crisp detail necessary for making accurate measurements. However, light does scatter when it strikes so-called subwavelength features and patterned arrangements of such features. “Historically, we would ignore this scattered light because it did not yield sufficient resolution,” explains Richard Silver, the physicist who initiated NIST’s scatterfield imaging effort. “Now we know it contains helpful information that provides signatures telling us something about where the light came from.”

With scatterfield imaging, Silver and colleagues methodically illuminate a sample with polarized light from different angles. From this collection of scattered light–nothing more than a sea of wiggly lines to the untrained eye–the NIST team can extract characteristics of the bounced lightwaves that, together, reveal the geometry of features on the specimen.

Light-scattering data are gathered in slices, which together image the volume of scattered light above and into the sample. These slices are analyzed and reconstructed to create a three-dimensional representation. The process is akin to a CT scan, except that the slices are collections of interfering waves, not cross-sectional pictures.

“It’s the ensemble of data that tells us what we’re after,” says project leader Bryan Barnes. “We may not be able to see the lines on the wafer, but we can tell you what you need to know about them–their size, their shape, their spacing.”

Scatterfield imaging has critical prerequisites that must be met before it can yield useful data for high-accuracy measurements of exceedingly small features. Key steps entail detailed evaluation of the path light takes as it beams through lenses, apertures and other system elements before reaching the sample. The path traversed by light scattering from the specimen undergoes the same level of scrutiny. Fortunately, scatterfield imaging lends itself to thorough characterization of both sequences of optical devices, according to the researchers. These preliminary steps are akin to error mapping so that recognized sources of inaccuracy are factored out of the data.

The method also benefits from a little advance intelligence–the as-designed arrangement of circuit lines on a chip, down to the size of individual features. Knowing what is expected to be the result of the complex chip-making process sets up a classic matchup of theory vs. experiment.

The NIST researchers can use standard equations to simulate light scattering from an ideal, defect-free pattern and, in fact, any variation thereof. Using wave analysis software they developed, the team has assembled an indexed library of light-scattering reference models. So once a specimen is scanned, the team relies on computers to compare their real-world data to models and to find close matches.

From there, succeeding rounds of analysis home in on the remaining differences, reducing them until the only ones that remain are due to variations in geometry such as irregularities in the height, width, or shape of a line.

Measurement results achieved with the NIST approach might be said to cast light itself in an entirely new light. Their new study, the researchers say, shows that once disregarded scattered light “contains a wealth of accessible optical information.”

Next steps include extending the technique to even shorter wavelengths of light, down to ultraviolet, or 193 nanometers. The aim is to accurately measure features as small as 5 nanometers.

This work is part of a larger NIST effort to supply measurement tools that enable the semiconductor industry to continue doubling the number of devices on a chip about every two years and to help other industries make products with nanoscale features. Recently, NIST and Intel researchers reported using an X-ray technique to accurately measure features on a silicon chip to within fractions of a nanometer.
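For readers who like to see an idea in code, here is a toy Python sketch of the library-matching step the news release describes: simulate scattering signatures for a range of line widths, index them in a library, then find the model closest to a measured signature. The forward model and all numbers here are invented for illustration; NIST’s actual electromagnetic simulations and fitting procedures are far more sophisticated.

```python
import numpy as np

def simulate_signature(line_width_nm, angles_deg):
    """Toy forward model: scattered intensity vs. illumination angle.
    Not physical -- it merely makes signatures depend on line width."""
    angles = np.radians(angles_deg)
    return np.cos(angles) ** 2 * np.exp(-line_width_nm / 50.0)

def build_library(widths_nm, angles_deg):
    """Index pre-computed simulated signatures by line width."""
    return {int(w): simulate_signature(w, angles_deg) for w in widths_nm}

def best_match(measured, library):
    """Return the library width whose signature is closest (least squares)."""
    return min(library, key=lambda w: np.sum((library[w] - measured) ** 2))

angles = np.arange(0, 60, 5)                        # illumination angles (deg)
library = build_library(np.arange(10, 31), angles)  # 10-30 nm in 1 nm steps

# "Measure" a 16 nm line with a little noise, then recover its width.
rng = np.random.default_rng(0)
measured = simulate_signature(16, angles) + rng.normal(0, 1e-4, angles.size)
print(best_match(measured, library))  # → 16
```

The succeeding rounds of analysis mentioned in the release would then refine around the best match, rather than stopping at the coarse library grid.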

Here’s a link to and a citation for a PDF of the paper,

Deep-subwavelength Nanometric Image Reconstruction using Fourier Domain Optical Normalization by Jing Qin, Richard M. Silver, Bryan M. Barnes, Hui Zhou, Ronald G. Dixson, and Mark-Alexander Henn. Light: Science & Applications accepted article preview 5 November 2015; e16038 doi: 10.1038/lsa.2016.38

[Note:] This is a PDF file of an unedited peer-reviewed manuscript that has been accepted for publication. NPG are providing this early version of the manuscript as a service to our customers. The manuscript will undergo copy editing, typesetting and a proof review before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers apply.

This seems to be an open access paper but it is an early version.

Nanotechnology research protocols for Environment, Health and Safety Studies in US and a nanomedicine characterization laboratory in the European Union

I have two items relating to nanotechnology and the development of protocols. The first item concerns the launch of a new web portal by the US National Institute of Standards and Technology.

US National Institute of Standards and Technology (NIST)

From a July 1, 2015 news item on Azonano,

As engineered nanomaterials increasingly find their way into commercial products, researchers who study the potential environmental or health impacts of those materials face a growing challenge to accurately measure and characterize them. These challenges affect measurements of basic chemical and physical properties as well as toxicology assessments.

To help nano-EHS (Environment, Health and Safety) researchers navigate the often complex measurement issues, the National Institute of Standards and Technology (NIST) has launched a new website devoted to NIST-developed (or co-developed) and validated laboratory protocols for nano-EHS studies.

A July 1, 2015 NIST news release on EurekAlert, which originated the news item, offers more details about the information available through the web portal,

In common lab parlance, a “protocol” is a specific step-by-step procedure used to carry out a measurement or related activity, including all the chemicals and equipment required. Any peer-reviewed journal article reporting an experimental result has a “methods” section where the authors document their measurement protocol, but those descriptions are necessarily brief and condensed, and may lack validation of any sort. By comparison, on NIST’s new Protocols for Nano-EHS website the protocols are extraordinarily detailed. For ease of citation, they’re published individually–each with its own unique digital object identifier (DOI).

The protocols detail not only what you should do, but why and what could go wrong. The specificity is important, according to program director Debra Kaiser, because of the inherent difficulty of making reliable measurements of such small materials. “Often, if you do something seemingly trivial–use a different size pipette, for example–you get a different result. Our goal is to help people get data they can reproduce, data they can trust.”

A typical caution, for example, notes that if you’re using an instrument that measures the size of nanoparticles in a solution by how they scatter light, it’s important also to measure the transmission spectrum of the particles if they’re colored, because if they happen to absorb light strongly at the same frequency as your instrument, the result may be biased.

“These measurements are difficult because of the small size involved,” explains Kaiser. “Very few new instruments have been developed for this. People are adapting existing instruments and methods for the job, but often those instruments are being operated close to their limits and the methods were developed for chemicals or bulk materials and not for nanomaterials.”

“For example, NIST offers a reference material for measuring the size of gold nanoparticles in solution, and we report six different sizes depending on the instrument you use. We do it that way because different instruments sense different aspects of a nanoparticle’s dimensions. An electron microscope is telling you something different than a dynamic light scattering instrument, and the researcher needs to understand that.”

The nano-EHS protocols offered by the NIST site, Kaiser says, could form the basis for consensus-based, formal test methods such as those published by ASTM and ISO.

NIST’s nano-EHS protocol site currently lists 12 different protocols in three categories: sample preparation, physico-chemical measurements and toxicological measurements. More protocols will be added as they are validated and documented. Suggestions for additional protocols are welcome at nanoprotocols@nist.gov.
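The caution quoted above about colored particles biasing light-scattering size measurements lends itself to a simple pre-measurement check: compare the sample’s absorbance spectrum against the instrument’s laser wavelength. The sketch below is my own illustration of that idea, with an invented threshold and toy spectrum; it is not a NIST protocol.

```python
def absorbance_at(spectrum, wavelength_nm):
    """Linearly interpolate an absorbance spectrum given as {wavelength: A}."""
    pts = sorted(spectrum.items())
    for (w0, a0), (w1, a1) in zip(pts, pts[1:]):
        if w0 <= wavelength_nm <= w1:
            t = (wavelength_nm - w0) / (w1 - w0)
            return a0 + t * (a1 - a0)
    raise ValueError("wavelength outside measured spectrum")

def dls_absorption_warning(spectrum, laser_nm, max_absorbance=0.1):
    """True if the sample absorbs strongly at the instrument's laser line,
    in which case a dynamic light scattering size result may be biased.
    The 0.1 absorbance threshold is an illustrative assumption."""
    return absorbance_at(spectrum, laser_nm) > max_absorbance

# Toy absorbance spectrum for a colored nanoparticle suspension.
gold_like = {400: 0.20, 500: 0.45, 532: 0.60, 600: 0.08, 700: 0.03}

print(dls_absorption_warning(gold_like, 532))  # green laser: True (warn)
print(dls_absorption_warning(gold_like, 633))  # red HeNe laser: False
```

In practice the remedy suggested by the protocol is to measure the transmission spectrum and account for the absorption, not merely to flag it.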

The next item concerns European nanomedicine.

CEA-LETI and Europe’s first nanomedicine characterization laboratory

A July 1, 2015 news item on Nanotechnology Now describes the partnership which has led to launch of the new laboratory,

CEA-Leti today announced the launch of the European Nano-Characterisation Laboratory (EU-NCL) funded by the European Union’s Horizon 2020 research and innovation programme[1]. Its main objective is to reach a level of international excellence in nanomedicine characterisation for medical indications like cancer, diabetes, inflammatory diseases or infections, and make it accessible to all organisations developing candidate nanomedicines prior to their submission to regulatory agencies to get the approval for clinical trials and, later, marketing authorization.

“As reported in the ETPN White Paper[2], there is a lack of infrastructure to support nanotechnology-based innovation in healthcare,” said Patrick Boisseau, head of business development in nanomedicine at CEA-Leti and chairman of the European Technology Platform Nanomedicine (ETPN). “Nanocharacterisation is the first bottleneck encountered by companies developing nanotherapeutics. The EU-NCL project is of most importance for the nanomedicine community, as it will contribute to the competitiveness of nanomedicine products and tools and facilitate regulation in Europe.”

EU-NCL is partnered with the sole international reference facility, the Nanotechnology Characterization Lab of the National Cancer Institute in the U.S. (US-NCL)[3], to get faster international harmonization of analytical protocols.

“We are excited to be part of this cooperative arrangement between Europe and the U.S.,” said Scott E. McNeil, director of U.S. NCL. “We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.”

A July 2, 2015 EMPA (Swiss Federal Laboratories for Materials Science and Technology) news release on EurekAlert provides more detail about the laboratory and the partnerships,

The «European Nanomedicine Characterization Laboratory» (EU-NCL), which was launched on 1 June 2015, has a clear-cut goal: to help bring more nanomedicine candidates into the clinic and on the market, for the benefit of patients and the European pharmaceutical industry. To achieve this, EU-NCL is partnered with the sole international reference facility, the «Nanotechnology Characterization Laboratory» (US-NCL) of the US-National Cancer Institute, to get faster international harmonization of analytical protocols. EU-NCL is also closely connected to national medicine agencies and the European Medicines Agency to continuously adapt its analytical services to requests of regulators. EU-NCL is designed, organized and operated according to the highest EU regulatory and quality standards. «We are excited to be part of this cooperative project between Europe and the U.S.,» says Scott E. McNeil, director of US-NCL. «We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.»

Nine partners from eight countries

EU-NCL, which is funded by the EU for a four-year period with nearly 5 million Euros, brings together nine partners from eight countries: CEA-Tech in Leti and Liten, France, the coordinator of the project; the Joint Research Centre of the European Commission in Ispra, Italy; European Research Services GmbH in Münster, Germany; Leidos Biomedical Research, Inc. in Frederick, USA; Trinity College in Dublin, Ireland; SINTEF in Oslo, Norway; the University of Liverpool in the UK; Empa, the Swiss Federal Laboratories for Materials Science and Technology in St. Gallen, Switzerland; Westfälische Wilhelms-Universität (WWU) and Gesellschaft für Bioanalytik, both in Münster, Germany. Together, the partnering institutions will provide a trans-disciplinary testing infrastructure covering a comprehensive set of preclinical characterization assays (physical, chemical, in vitro and in vivo biological testing), which will allow researchers to fully comprehend the biodistribution, metabolism, pharmacokinetics, safety profiles and immunological effects of their medicinal nano-products. The project will also foster the use and deployment of standard operating procedures (SOPs), benchmark materials and quality management for the preclinical characterization of medicinal nano-products. Yet another objective is to promote intersectoral and interdisciplinary communication among key drivers of innovation, especially between developers and regulatory agencies.

The goal: to bring safe and efficient nano-therapeutics faster to the patient

Within EU-NCL, six analytical facilities will offer transnational access to their existing analytical services for public and private developers, and will also develop new or improved analytical assays to keep EU-NCL at the cutting edge of nanomedicine characterization. A complementary set of networking activities will enable EU-NCL to deliver to European academic or industrial scientists the high-quality analytical services they require for accelerating the industrial development of their candidate nanomedicines. The Empa team of Peter Wick at the «Particles-Biology Interactions» lab will be in charge of the quality management of all analytical methods, a key task to guarantee the best possible reproducibility and comparability of the data between the various analytical labs within the consortium. «EU-NCL supports our research activities in developing innovative and safe nanomaterials for healthcare within an international network, which will actively shape future standards in nanomedicine and strengthen Empa as an enabler to facilitate the transfer of novel nanomedicines from bench to bedside», says Wick.

You can find more information about the laboratory on the Horizon 2020 (a European Union science funding programme) project page for the EU-NCL laboratory. For anyone curious about CEA-Leti, it’s a double-layered organization. CEA is the French Alternative Energies and Atomic Energy Commission (Commissariat à l’énergie atomique et aux énergies alternatives); you can go here to their French language site (there is an English language clickable option on the page). Leti is one of the CEA’s institutes and is known as either Leti or CEA-Leti. I have no idea what Leti stands for. Here’s the Leti website (this is the English language version).

US White House establishes new initiatives to commercialize nanotechnology

As I’ve noted several times, there’s a strong push in the US to commercialize nanotechnology and May 20, 2015 was a banner day for the efforts. The US White House announced a series of new initiatives to speed commercialization efforts in a May 20, 2015 posting by Lloyd Whitman, Tom Kalil, and JJ Raynor,

Today, May 20 [2015], the National Economic Council and the Office of Science and Technology Policy held a forum at the White House to discuss opportunities to accelerate the commercialization of nanotechnology.

In recognition of the importance of nanotechnology R&D, representatives from companies, government agencies, colleges and universities, and non-profits are announcing a series of new and expanded public and private initiatives that complement the Administration’s efforts to accelerate the commercialization of nanotechnology and expand the nanotechnology workforce:

  • The Colleges of Nanoscale Science and Engineering at SUNY Polytechnic Institute in Albany, NY and the National Institute for Occupational Safety and Health are launching the Nano Health & Safety Consortium to advance research and guidance for occupational safety and health in the nanoelectronics and other nanomanufacturing industry settings.
  • Raytheon has brought together a group of representatives from the defense industry and the Department of Defense to identify collaborative opportunities to advance nanotechnology product development, manufacturing, and supply-chain support with a goal of helping the U.S. optimize development, foster innovation, and take more rapid advantage of new commercial nanotechnologies.
  • BASF Corporation is taking a new approach to finding solutions to nanomanufacturing challenges. In March, BASF launched a prize-based “NanoChallenge” designed to drive new levels of collaborative innovation in nanotechnology while connecting with potential partners to co-create solutions that address industry challenges.
  • OCSiAl is expanding the eligibility of its “iNanoComm” matching grant program that provides low-cost, single-walled carbon nanotubes to include more exploratory research proposals, especially proposals for projects that could result in the creation of startups and technology transfers.
  • The NanoBusiness Commercialization Association (NanoBCA) is partnering with Venture for America and working with the National Science Foundation (NSF) to promote entrepreneurship in nanotechnology.  Three companies (PEN, NanoMech, and SouthWest NanoTechnologies), are offering to support NSF’s Innovation Corps (I-Corps) program with mentorship for entrepreneurs-in-training and, along with three other companies (NanoViricides, mPhase Technologies, and Eikos), will partner with Venture for America to hire recent graduates into nanotechnology jobs, thereby strengthening new nanotech businesses while providing needed experience for future entrepreneurs.
  • TechConnect is establishing a Nano and Emerging Technologies Student Leaders Conference to bring together the leaders of nanotechnology student groups from across the country. The conference will highlight undergraduate research and connect students with venture capitalists, entrepreneurs, and industry leaders.  Five universities have already committed to participating, led by the University of Virginia Nano and Emerging Technologies Club.
  • Brewer Science, through its Global Intern Program, is providing more than 30 students from high schools, colleges, and graduate schools across the country with hands-on experience in a wide range of functions within the company.  Brewer Science plans to increase the number of its science and engineering interns by 50% next year and has committed to sharing best practices with other nanotechnology businesses interested in how internship programs can contribute to a small company’s success.
  • The National Institute of Standards and Technology’s Center for Nanoscale Science and Technology is expanding its partnership with the National Science Foundation to provide hands-on experience for students in NSF’s Advanced Technology Education program. The partnership will now run year-round and will include opportunities for students at Hudson Valley Community College and the University of the District of Columbia Community College.
  • Federal agencies participating in the NNI [US National Nanotechnology Initiative], supported by the National Nanotechnology Coordination Office [NNCO], are launching multiple new activities aimed at educating students and the public about nanotechnology, including image and video contests highlighting student research, a new webinar series focused on providing nanotechnology information for K-12 teachers, and a searchable web portal on nano.gov of nanoscale science and engineering resources for teachers and professors.

Interestingly, May 20, 2015 is also the day the NNCO held its second webinar for small- and medium-size businesses in the nanotechnology community. You can find out more about that webinar and future ones by following the links in my May 13, 2015 posting.

Since the US White House announcement, OCSiAl has issued a May 26, 2015 news release which provides a brief history and more details about its newly expanded iNanoComm program,

OCSiAl launched the iNanoComm, which stands for the Integrated Nanotube Commercialization Award, program in February 2015 to help researchers lower the cost of their most promising R&D projects dedicated to SWCNT [single-walled carbon nanotube] applications. The first round received 33 applications from 28 university groups, including The Smalley-Curl Center for Nanoscale Science and Technology at Rice University and the Concordia Center for Composites at Concordia University [Canada] among others. [emphasis mine] The aim of iNanoComm is to stimulate universities and research organizations to develop innovative market products based on nano-augmented materials, also known as clean materials.

Now the program’s criteria are being broadened to enable greater private sector engagement in potential projects and the creation of partnerships in commercializing nanotechnology. The program will now support early stage commercialization efforts connected to university research in the form of start-ups, technology transfers, new businesses and university spinoffs to support the mass commercialization of SWCNT products and technologies.

The announcement of the program’s expansion took place at the 2015 Roundtable of the US NanoBusiness Commercialization Association (NanoBCA), the world’s first non-profit association focused on the commercialization of nanotechnologies. NanoBCA is dedicated to creating an environment that nurtures research and innovation in nanotechnology, promotes tech-transfer of nanotechnology from academia to industry, encourages private capital investments in nanotechnology companies, and helps its corporate members bring innovative nanotechnology products to market.

“Enhancing iNanoComm as a ‘start-up incubator’ is a concrete step in promoting single-wall carbon nanotube applications in the commercial world,” said Max Atanassov, CEO of OCSiAl USA. “It was the logical thing for us to do, now that high quality carbon nanotubes have become broadly available and are affordably priced to be used on a mass industrial scale.”

Vince Caprio, Executive Director of NanoBCA, added that “iNanoComm will make an important contribution to translating fundamental nanotechnology research into commercial products. By facilitating the formation of more start-ups, it will encourage more scientists to pursue their dreams and develop their ideas into commercially successful businesses.”

For more information on the program expansion and how it can reduce the cost of early stage research connected to university projects, visit the iNanoComm website at www.inanocomm.org or contact info@inanocomm.org.

h/t Azonano May 27, 2015 news item

Making x-ray measurements more accurate

Apparently the method for establishing x-ray measurements is from the 1970s and the folks at the US National Institute of Standards and Technology (NIST) feel it’s time for a new technique. From a March 9, 2015 NIST news release (also on EurekAlert),

Criminal justice, cosmology and computer manufacturing may not look to have much in common, but these and many other disparate fields all depend on sensitive measurements of X-rays. Scientists at the National Institute of Standards and Technology (NIST) have developed a new method to reduce uncertainty in X-ray wavelength measurement that could provide improvements awaited for decades.

Accurate measurement of X-ray wavelengths depends critically on the ability to measure angles very precisely and with very little margin for error. NIST’s new approach is the first major advance since the 1970s in reducing certain sources of error common in X-ray angle measurement.

Many of us associate X-rays with a doctor’s office, but the uses for these energetic beams go far beyond revealing our skeletons. The ability to sense X-rays at precise wavelengths allows law enforcement to detect and identify trace explosives, or astrophysicists to better understand cosmic phenomena. It all comes down to looking very closely at the X-ray spectrum and measuring the precise position of lines within it. Those lines represent specific wavelengths–which are associated with specific energies–of X-rays that are emitted by the subject being studied. Each material has its own, unique X-ray “fingerprint.”

But a slight error in angle measurement can skew the results, with consequences for quantum theories, research and manufacturing. “While many fields need good X-ray reference data, many of the measurements that presently fill standard reference databases are not great–most data were taken in the 1970s and are often imprecise,” says NIST’s Larry Hudson.

X-ray wavelengths are measured by passing the beam through special crystals and very carefully measuring the angle that exiting rays make with the original beam. While the physics is different, the technique is analogous to the way a prism will split white light into different colors coming out at different angles.

The crystal is typically mounted on a rotating device that spins the crystal to two different positions where a spectral line is observed. The angle between the two is measured–this is a neat geometry trick that determines the line’s position more precisely than a single measurement would, while also cancelling out some potential errors. One inevitable limit is the accuracy of the digital encoder, the device that translates the rotation of the crystal to an angle measurement.

The news release goes on to describe the new technique,

Hudson and his co-authors have found a way to dramatically reduce the error in that measurement. Their new approach uses laser beams bouncing off a mirrored polygon that is rotated on the same shaft that would carry the crystal. The approach allows the team to use additional mathematical shortcuts to their advantage. With new NIST sensing instrumentation and analysis, X-ray angles can now be measured routinely with an uncertainty of 0.06 arcseconds–an accuracy more than three times better than the uncalibrated encoder.

Hudson describes this reduction as significant enough to set world records in X-ray wavelength measurement. “If a giant windshield wiper stretched from Washington D.C. to New York City (364 kilometers) and were to sweep out the angle of one of these errors, its tip would move less than the width of a DVD,” he says.

What do these improvements mean for the fields that depend on X-ray sensing? For one thing, calibrating measurement devices to greater precision will provide better understanding of a host of newly designed materials, which often have complicated crystal structures that give rise to unusual effects such as high-temperature superconductivity. The team’s efforts will permit better understanding of the relationship between the structures and properties of novel materials.

Here’s a link to and a citation for the paper,

A simple method for high-precision calibration of long-range errors in an angle encoder using an electronic nulling by Mark N Kinnane, Lawrence T Hudson, Albert Henins, and Marcus H Mendenhall. Metrologia, Volume 52, Number 2. doi:10.1088/0026-1394/52/2/244

This is an open access paper.

For anyone curious about arcseconds, you can find an explanation in this Wikipedia entry titled Minute of arc. Briefly, each degree of a 360 degree circle is divided into 60 arcminutes, and each arcminute into 60 arcseconds, so one arcsecond is 1/3600 of a degree.
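Hudson’s windshield-wiper analogy can be checked with a quick back-of-the-envelope calculation (my own sketch; the 364 km distance and the 0.06 arcsecond uncertainty come from the NIST news release):

```python
import math

def arc_length_m(radius_m: float, angle_arcsec: float) -> float:
    """Arc length swept at a given radius by an angle given in arcseconds."""
    angle_rad = angle_arcsec * (math.pi / 180) / 3600  # 1 arcsec = 1/3600 degree
    return radius_m * angle_rad

# A "wiper" stretching the 364 km from Washington D.C. to New York City,
# sweeping NIST's 0.06 arcsecond measurement uncertainty:
tip_m = arc_length_m(364_000, 0.06)
print(f"tip moves {tip_m * 100:.1f} cm")  # about 10.6 cm
```

That tip movement does indeed come in just under the 12 centimetre diameter of a DVD, so the analogy holds up.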

Silver nanoparticle reference materials

When comparing silver nanoparticle toxicity studies, it would be good to know that the studies are all looking at the same type of nanoparticle. Happily, the US National Institute of Standards and Technology (NIST) has developed a silver nanoparticle reference material for just that purpose. From a March 5, 2015 news item on Azonano,

The National Institute of Standards and Technology (NIST) has issued a new silver nanoparticle reference material to support researchers studying potential environmental, health and safety risks associated with the nanoparticles, which are being incorporated in a growing number of consumer and industrial products for their antimicrobial properties. The new NIST test material is believed to be the first of its kind to stabilize the highly reactive silver particles in a freeze-dried, polymer coated, nanoparticle cake for long-term storage.

Nanoparticulate silver is a highly effective bactericide. It is, by some estimates, the most widely used nanomaterial in consumer products. These include socks and shoe liners (it combats foot odor), stain-resistant fabrics, coatings for handrails and keyboards, and a plethora of other applications.

The explosion of “nanosilver” products has driven a like expansion of research to better understand what happens to the material in the environment. “Silver nanoparticles transform, dissolve and precipitate back into nanoparticles again, combine or react with other materials—our understanding of these processes is limited,” says NIST chemist Vince Hackley. “However, in order to study their biological and environmental behavior and fate, one needs to know one is starting with the same material and not some modified or oxidized version. This new reference material targets a broad range of research applications.” [emphasis mine]

A March 3, 2015 NIST news release, which originated the news item, elaborates,

Silver nanoparticles are highly reactive. In the presence of oxygen or moisture they rapidly oxidize, subsequently releasing silver ions. This is the basis for their antimicrobial properties, but it also makes it difficult to create a standardized silver nanoparticle suspension with a long shelf life as a basis for doing comparative environmental studies. The new NIST product is the first to be stabilized by coating and freeze-drying—a technique commonly used in the pharmaceutical industry to preserve blood products and protein-based drugs. The NIST material uses polyvinylpyrrolidone (PVP), a polymer approved by the Food and Drug Administration for many uses, including as a food additive. The freeze-dried PVP-nanosilver cakes are flushed with an inert gas and sealed under a vacuum. Mixing the cake with water reconstitutes the original suspension.

NIST reference materials are designed to be homogeneous and stable. NIST provides the best available estimates for key properties of reference materials. In this case those include the mean silver particle size measured by four different methods, the total silver mass per vial, and the percentage distribution of nanoparticle sizes. The particles have a nominal diameter of 75 nanometers. NIST expects the material to be stable indefinitely when properly stored and handled, but will continue to monitor it for substantive changes in the reported values.

More information on NIST RM 8017, “Polyvinylpyrrolidone Coated Silver Nanoparticles” is available at https://www-s.nist.gov/srmors/view_report.cfm?srm=8017.

Given this development, I’m beginning to question all of the silver studies I’ve seen previously.

Oh so cute! Baby nanotubes!

Scientists from the US National Institute of Standards and Technology and from two US universities have successfully filmed the formation of single-walled carbon nanotubes (SWCNTs) according to a Dec. 2, 2014 news item on Nanowerk,

Single-walled carbon nanotubes are loaded with desirable properties. In particular, the ability to conduct electricity at high rates of speed makes them attractive for use as nanoscale transistors. But this and other properties are largely dependent on their structure, and their structure is determined when the nanotube is just beginning to form.

In a step toward understanding the factors that influence how nanotubes form, researchers at the National Institute of Standards and Technology (NIST), the University of Maryland, and Texas A&M have succeeded in filming them when they are only a few atoms old. These nanotube “baby pictures” give crucial insight into how they germinate and grow, potentially opening the way for scientists to create them en masse with just the properties that they want.

A Dec. 1, 2014 NIST news release, which originated the news item, explains how scientists managed to make movies of SWCNTs as they formed,

To better understand how carbon nanotubes grow and how to grow the ones you want, you need to understand the very beginning of the growth process, called nucleation. To do that, you need to be able to image the nucleation process as it happens. However, this is not easy because it involves a small number of fast-moving atoms, meaning you have to take very high resolution pictures very quickly.

Because fast, high-resolution cameras are expensive, NIST scientists instead slowed the growth rate by lowering the pressure inside their instrument, an environmental scanning transmission electron microscope. Inside the microscope’s chamber, under high heat and low pressure, the team watched as carbon atoms generated from acetylene rained down onto 1.2-nanometer bits of cobalt carbide, where they attached, formed into graphene, encircled the nanoparticle, and began to grow into nanotubes.

“Our observations showed that the carbon atoms attached only to the pure metal facets of the cobalt carbide nanoparticle, and not those facets interlaced with carbon atoms,” says NIST chemist Renu Sharma, who led the research effort. “The burgeoning tube then grew above the cobalt-carbon facets until it found another pure metal surface to attach to, forming a closed cap. Carbon atoms continued to attach at the cobalt facets, pushing the previously formed graphene along toward the cap in a kind of carbon assembly line and lengthening the tube. This whole process took only a few seconds.”

According to Sharma, the carbon atoms seek out the most energetically favorable configurations as they form graphene on the cobalt carbide nanoparticle’s surface. While graphene has a mostly hexagonal, honeycomb-type structure, the geometry of the nanoparticle forces the carbon atoms to arrange themselves into pentagonal shapes within the otherwise honeycomb lattice. Crucially, these pentagonal irregularities in the graphene’s structure are what allows the graphene to curve and become a nanotube.

Because the nanoparticles’ facets also appear to play a deciding role in the nanotube’s diameter and chirality, or direction of twist, the group’s next step will be to measure the chirality of the nanotubes as they grow. The group also plans to use metal nanoparticles with different facets to study their adhesive properties to see how they affect the tubes’ chirality and diameter.

The researchers have made one of their movies available for viewing, but, despite my efforts, I cannot find a way to embed the silent movie. Happily, you can find the ‘baby carbon nanotube’ movie alongside NIST’s Dec. 1, 2014 news release.

As for the research paper, here’s a link and a citation for it,

Nucleation of Graphene and Its Conversion to Single-Walled Carbon Nanotubes by Matthieu Picher, Pin Ann Lin, Jose L. Gomez-Ballesteros, Perla B. Balbuena, and Renu Sharma. Nano Lett., 2014, 14 (11), pp 6104–6108 DOI: 10.1021/nl501977b Publication Date (Web): October 20, 2014

Copyright © 2014 American Chemical Society

This paper is behind a paywall.

Nanosafety research: a quality control issue

Toxicologist Dr. Harald Krug has published a review of several thousand studies on nanomaterials safety exposing problematic research methodologies and conclusions. From an Oct. 29, 2014 news item on Nanowerk (Note: A link has been removed),

Empa [Swiss Federal Laboratories for Materials Science and Technology] toxicologist Harald Krug has lambasted his colleagues in the journal Angewandte Chemie (“Nanosafety Research—Are We on the Right Track?”). He evaluated several thousand studies on the risks associated with nanoparticles and discovered no end of shortcomings: poorly prepared experiments and results that don’t carry any clout. Instead of merely leveling criticism, however, Empa is also developing new standards for such experiments within an international network.

An Oct. 29, 2014 Empa press release (also on EurekAlert), which originated the news item, describes the new enthusiasm for research into nanomaterials and safety,

Researching the safety of nanoparticles is all the rage. Thousands of scientists worldwide are conducting research on the topic, examining the question of whether titanium dioxide nanoparticles from sun creams can get through the skin and into the body, whether carbon nanotubes from electronic products are as hazardous for the lungs as asbestos used to be or whether nanoparticles in food can get into the blood via the intestinal flora, for instance. Public interest is great, research funds are flowing – and the number of scientific projects is skyrocketing: between 1980 and 2010, a total of 5,000 projects were published, followed by another 5,000 in just the last three years. However, the amount of new knowledge has only increased marginally. After all, according to Krug the majority of the projects are poorly executed and all but useless for risk assessments.

The press release goes on to describe various pathways into the body and problems with research methodologies,

How do nanoparticles get into the body?

Artificial nanoparticles measuring between one and 100 nanometers in size can theoretically enter the body in three ways: through the skin, via the lungs and via the digestive tract. Almost every study concludes that healthy, undamaged skin is an effective protective barrier against nanoparticles. When it comes to the route through the stomach and gut, however, the research community is at odds. But upon closer inspection the value of many alarmist reports is dubious – such as when nanoparticles made of soluble substances like zinc oxide or silver are being studied. Although the particles disintegrate and the ions drifting into the body are cytotoxic, this effect has nothing to do with the topic of nanoparticles but is merely linked to the toxicity of the (dissolved) substance and the ingested dose.

Laboratory animals die in vain – drastic overdoses and other errors

Krug also discovered that some researchers maltreat their laboratory animals with absurdly high amounts of nanoparticles. Chinese scientists, for instance, fed mice five grams of titanium oxide per kilogram of body weight, without detecting any effects. By way of comparison: half the amount of kitchen salt would already have killed the animals. A sloppy job is also being made of things in the study of lung exposure to nanoparticles: inhalation experiments are expensive and complex because a defined number of particles has to be swirled around in the air. Although it is easier to place the particles directly in the animal’s windpipe (“instillation”), some researchers overdo it to such an extent that the animals suffocate on the sheer mass of nanoparticles.

While others might well make do without animal testing and conduct in vitro experiments on cells, here, too, cell cultures are covered by layers of nanoparticles that are 500 nanometers thick, causing them to die from a lack of nutrients and oxygen alone – not from a real nano-effect. And even the most meticulous experiment is worthless if the particles used have not been characterized rigorously beforehand. Some researchers simply skip this preparatory work and use the particles “straight out of the box”. Such experiments are irreproducible, warns Krug.

As noted in the news item, the scientists at Empa have devised a solution to some of the problems (from the press release),

The solution: inter-laboratory tests with standard materials
Empa is thus collaborating with research groups like EPFL’s Powder Technology Laboratory, with industrial partners and with Switzerland’s Federal Office of Public Health (FOPH) to find a solution to the problem: on 9 October the “NanoScreen” programme, one of the “CCMX Materials Challenges”, got underway, which is expected to yield a set of pre-validated methods for lab experiments over the next few years. It involves using test materials that have a closely defined particle size distribution, possess well-documented biological and chemical properties and can be altered in certain parameters – such as surface charge. “Thanks to these methods and test substances, international labs will be able to compare, verify and, if need be, improve their experiments,” explains Peter Wick, Head of Empa’s laboratory for Materials-Biology Interactions.

Instead of the all-too-familiar “fumbling around in the dark”, this would provide an opportunity for internationally coordinated research strategies to not only clarify the potential risks of new nanoparticles in retrospect but even be able to predict them. The Swiss scientists therefore coordinate their research activities with the National Institute of Standards and Technology (NIST) in the US, the European Commission’s Joint Research Center (JRC) and the Korean Institute of Standards and Science (KRISS).

Bravo and thank you, Dr. Krug and Empa, for confirming something I’ve suspected thanks to hints from more informed commentators. Unfortunately, my ignorance about research protocols has not permitted me to undertake a better analysis of the research.

Here’s a link to and a citation for the paper,

Nanosafety Research—Are We on the Right Track? by Prof. Dr. Harald F. Krug. Angewandte Chemie International Edition DOI: 10.1002/anie.201403367 Article first published online: 10 OCT 2014

This is an open access paper.

Smallest known reference material issued

I betray some of my occupational origins with this comment: a reference material is not necessarily a reference book. A Sept. 25, 2014 news item on Nanowerk clarifies the use of the term reference material in relation to the US National Institute of Standards and Technology (NIST) while describing a new set designed for use at the nanoscale,

If it’s true that good things come in small packages, then the National Institute of Standards and Technology (NIST) can now make anyone working with nanoparticles very happy. NIST recently issued Reference Material (RM) 8027, the smallest known reference material ever created for validating measurements of these man-made, ultrafine particles between 1 and 100 nanometers in size.

RM 8027 consists of five hermetically sealed ampoules containing one milliliter of silicon nanoparticles—all certified to be close to 2 nanometers in diameter—suspended in toluene.

A Sept. 24, 2014 NIST news release (also on EurekAlert but dated Sept. 25, 2014), which originated the news item, provides a more detailed description and purchasing instructions,

To yield the appropriate sizes for the new RM, the nanocrystals are etched from a silicon wafer, separated using ultrasound and then stabilized within an organic shell. Particle size and chemical composition are determined by dynamic light scattering, analytical centrifugation, electron microscopy and inductively coupled plasma mass spectrometry (ICP-MS), a powerful technique that can measure elements at concentrations as low as several parts per billion.

“For anyone working with nanomaterials at dimensions 5 nanometers or less, our well-characterized nanoparticles can ensure confidence that their measurements are accurate,” says NIST research chemist Vytas Reipa, leader of the team that developed and qualified RM 8027.

Silicon nanoparticles such as those in RM 8027 are being studied as alternative semiconductor materials for next-generation photovoltaic solar cells and solid-state lighting, and as a replacement for carbon in the cathodes of lithium batteries. Another potential application comes from the fact that silicon crystals at dimensions of 5 nanometers or less fluoresce under ultraviolet light. Because of this property, silicon nanoparticles may one day serve as easily detectable “tags” for tracking nanosized substances in biological, environmental or other dynamic systems.
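To get a sense of just how small these reference particles are, here is a rough estimate of how many atoms a 2 nanometer silicon crystal contains (my own sketch, assuming a spherical particle with bulk silicon’s atomic density; the real particles also carry an organic shell):

```python
import math

# Bulk silicon: 8 atoms per cubic unit cell of side 0.543 nm
ATOMS_PER_NM3 = 8 / 0.543 ** 3

def atoms_in_sphere(diameter_nm: float) -> int:
    """Approximate atom count for a spherical silicon crystal."""
    radius_nm = diameter_nm / 2
    volume_nm3 = (4 / 3) * math.pi * radius_nm ** 3
    return round(volume_nm3 * ATOMS_PER_NM3)

print(atoms_in_sphere(2.0))  # on the order of 200 atoms
```

With only a couple of hundred atoms per particle, a large fraction of them sit at the surface, which helps explain why such careful characterization of size and chemistry is needed.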

RM 8027 may be ordered from the NIST Standard Reference Materials Program by phone, (301) 975-2200; by fax, (301) 948-3730; or online at http://www.nist.gov/srm.

NIST has provided an illustration,

Caption: A structural model of a typical silicon nanocrystal (yellow) was stabilized within an organic shell of cyclohexane (blue). Credit: NIST


NIST also supplied this image,

Caption: This is a high-resolution transmission electron microscope photograph of a single silicon nanoparticle. Credit: NIST


As is common with the images that scientists actually use in their work, the colour palette is gray.

Carbon capture with nanoporous material in the oilfields

Researchers at Rice University (Texas) have devised a new technique for carbon capture according to a June 3, 2014 news item on Nanowerk,

Rice University scientists have created an Earth-friendly way to separate carbon dioxide from natural gas at wellheads.

A porous material invented by the Rice lab of chemist James Tour sequesters carbon dioxide, a greenhouse gas, at ambient temperature with pressure provided by the wellhead and lets it go once the pressure is released. The material shows promise to replace more costly and energy-intensive processes.

A June 3, 2014 Rice University news release, which originated the news item, provides a general description of how carbon dioxide is currently removed during fossil fuel production and adds a few more details about the new technology,

Natural gas is the cleanest fossil fuel. Development of cost-effective means to separate carbon dioxide during the production process will improve this advantage over other fossil fuels and enable the economic production of gas resources with higher carbon dioxide content that would be too costly to recover using current carbon capture technologies, Tour said. Traditionally, carbon dioxide has been removed from natural gas to meet pipelines’ specifications.

The Tour lab, with assistance from the National Institute of Standards and Technology (NIST), produced the patented material that pulls only carbon dioxide molecules from flowing natural gas and polymerizes them while under pressure naturally provided by the well.

When the pressure is released, the carbon dioxide spontaneously depolymerizes and frees the sorbent material to collect more.

All of this works in ambient temperatures, unlike current high-temperature capture technologies that use up a significant portion of the energy being produced.

The news release mentions current political/legislative actions in the US and the implications for the oil and gas industry while further describing the advantages of this new technique,

“If the oil and gas industry does not respond to concerns about carbon dioxide and other emissions, it could well face new regulations,” Tour said, noting the White House issued its latest National Climate Assessment last month [May 2014] and, this week [June 2, 2014], set new rules to cut carbon pollution from the nation’s power plants.

“Our technique allows one to specifically remove carbon dioxide at the source. It doesn’t have to be transported to a collection station to do the separation,” he said. “This will be especially effective offshore, where the footprint of traditional methods that involve scrubbing towers or membranes are too cumbersome.

“This will enable companies to pump carbon dioxide directly back downhole, where it’s been for millions of years, or use it for enhanced oil recovery to further the release of oil and natural gas. Or they can package and sell it for other industrial applications,” he said.

This is an epic news release (note to writer: well done), as only now do we get a technical explanation,

The Rice material, a nanoporous solid of carbon with nitrogen or sulfur, is inexpensive and simple to produce compared with the liquid amine-based scrubbers used now, Tour said. “Amines are corrosive and hard on equipment,” he said. “They do capture carbon dioxide, but they need to be heated to about 140 degrees Celsius to release it for permanent storage. That’s a terrible waste of energy.”

Rice graduate student Chih-Chau Hwang, lead author of the paper, first tried to combine amines with porous carbon. “But I still needed to heat it to break the covalent bonds between the amine and carbon dioxide molecules,” he said. Hwang also considered metal oxide frameworks that trap carbon dioxide molecules, but they had the unfortunate side effect of capturing the desired methane as well and they are far too expensive to make for this application.

The porous carbon powder he settled on has massive surface area and turns the neat trick of converting gaseous carbon dioxide into solid polymer chains that nestle in the pores.

“Nobody’s ever seen a mechanism like this,” Tour said. “You’ve got to have that nucleophile (the sulfur or nitrogen atoms) to start the polymerization reaction. This would never work on simple activated carbon; the key is that the polymer forms and provides continuous selectivity for carbon dioxide.”

Methane, ethane and propane molecules that make up natural gas may try to stick to the carbon, but the growing polymer chains simply push them off, he said.

The researchers treated their carbon source with potassium hydroxide at 600 degrees Celsius to produce the powders with either sulfur or nitrogen atoms evenly distributed through the resulting porous material. The sulfur-infused powder performed best, absorbing 82 percent of its weight in carbon dioxide. The nitrogen-infused powder was nearly as good and improved with further processing.

Tour said the material did not degrade over many cycles, “and my guess is we won’t see any. After heating it to 600 degrees C for the one-step synthesis from inexpensive industrial polymers, the final carbon material has a surface area of 2,500 square meters per gram, and it is enormously robust and extremely stable.”
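The 82 percent figure translates into capacity numbers fairly directly. As a rough sketch (my own arithmetic, using the weight-percent uptake from the release and the ideal-gas molar volume):

```python
CO2_MOLAR_MASS_G = 44.01   # grams per mole of CO2
MOLAR_VOLUME_L = 22.414    # litres per mole for an ideal gas at 0 degrees C, 1 atm

def co2_per_gram_sorbent(uptake_fraction: float = 0.82):
    """Mass and STP volume of CO2 captured per gram of sorbent per cycle."""
    grams_co2 = uptake_fraction * 1.0  # per 1 g of sorbent
    moles = grams_co2 / CO2_MOLAR_MASS_G
    return grams_co2, moles * MOLAR_VOLUME_L

g, litres = co2_per_gram_sorbent()
print(f"{g:.2f} g CO2 (about {litres:.2f} L at STP) per gram of sorbent per cycle")
```

In other words, each gram of the sulfur-infused powder would soak up roughly 0.4 litres of CO2 gas per pressure cycle, which gives some sense of why a reusable, ambient-temperature sorbent is attractive at a wellhead.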

Apache Corp., a Houston-based oil and gas exploration and production company, funded the research at Rice and licensed the technology. Tour expected it will take time and more work on manufacturing and engineering aspects to commercialize.

Here’s a link to and a citation for the paper,

Capturing carbon dioxide as a polymer from natural gas by Chih-Chau Hwang, Josiah J. Tour, Carter Kittrell, Laura Espinal, Lawrence B. Alemany, & James M. Tour. Nature Communications 5, Article number: 3961 doi:10.1038/ncomms4961 Published 03 June 2014

This paper is behind a paywall.

The researchers have made an illustration of the material available,

 Illustration by Tanyia Johnson/Rice University

This morning, Azonano posted a June 6, 2014 news item about a patent for carbon capture,

CO2 Solutions Inc. (the “Corporation”), an innovator in the field of enzyme-enabled carbon capture technology, today announced it has received a Notice of Allowance from the U.S. Patent and Trademark Office for its patent application No. 13/264,294 entitled Process for CO2 Capture Using Micro-Particles Comprising Biocatalysts.

One might almost think these announcements were timed to coincide with the US White House’s moves.

As for CO2 Solutions, this company is located in Québec, Canada. You can find out more about the company here (you may want to click on the English language button).