Category Archives: environment

Center for Sustainable Nanotechnology or how not to poison the planet and make it uninhabitable

I received notice of the Center for Sustainable Nanotechnology’s newest deal with the US National Science Foundation in an August 31, 2015 University of Wisconsin-Madison (UW-Madison) news release received by email,

The Center for Sustainable Nanotechnology, a multi-institutional research center based at the University of Wisconsin-Madison, has inked a new contract with the National Science Foundation (NSF) that will provide nearly $20 million in support over the next five years.

Directed by UW-Madison chemistry Professor Robert Hamers, the center focuses on the molecular mechanisms by which nanoparticles interact with biological systems.

Nanotechnology involves the use of materials at the smallest scale, including the manipulation of individual atoms and molecules. Products that use nanoscale materials range from beer bottles and car wax to solar cells and electric and hybrid car batteries. If you read your books on a Kindle, a semiconducting material manufactured at the nanoscale underpins the high-resolution screen.

While there are already hundreds of products that use nanomaterials in various ways, much remains unknown about how these modern materials and the tiny particles they are composed of interact with the environment and living things.

“The purpose of the center is to explore how we can make sure these nanotechnologies come to fruition with little or no environmental impact,” explains Hamers. “We’re looking at nanoparticles in emerging technologies.”

In addition to UW-Madison, scientists from UW-Milwaukee, the University of Minnesota, the University of Illinois, Northwestern University and the Pacific Northwest National Laboratory have been involved in the center’s first phase of research. Joining the center for the next five-year phase are Tuskegee University, Johns Hopkins University, the University of Iowa, Augsburg College, Georgia Tech and the University of Maryland, Baltimore County.

At UW-Madison, Hamers leads efforts in synthesis and molecular characterization of nanomaterials. Soil science Professor Joel Pedersen and chemistry Professor Qiang Cui lead groups exploring the biological and computational aspects of how nanomaterials affect life.

Much remains to be learned about how nanoparticles affect the environment and the multitude of organisms – from bacteria to plants, animals and people – that may be exposed to them.

“Some of the big questions we’re asking are: How is this going to impact bacteria and other organisms in the environment? What do these particles do? How do they interact with organisms?” says Hamers.

For instance, bacteria, the vast majority of which are beneficial or benign organisms, tend to be “sticky” and nanoparticles might cling to the microorganisms and have unintended biological effects.

“There are many different mechanisms by which these particles can do things,” Hamers adds. “The challenge is we don’t know what these nanoparticles do if they’re released into the environment.”

To get at the challenge, Hamers and his UW-Madison colleagues are drilling down to investigate the molecular-level chemical and physical principles that dictate how nanoparticles interact with living things.
Pedersen’s group, for example, is studying the complexities of how nanoparticles interact with cells and, in particular, their surface membranes.

“To enter a cell, a nanoparticle has to interact with a membrane,” notes Pedersen. “The simplest thing that can happen is the particle sticks to the cell. But it might cause toxicity or make a hole in the membrane.”

Pedersen’s group can make model cell membranes in the lab using the same lipids and proteins that are the building blocks of nature’s cells. By exposing the lab-made membranes to nanomaterials now used commercially, Pedersen and his colleagues can see how the membrane-particle interaction unfolds at the molecular level – the scale necessary to begin to understand the biological effects of the particles.

Such studies, Hamers argues, promise a science-based understanding that can help ensure the technology leaves a minimal environmental footprint by identifying issues before they manifest themselves in the manufacturing, use or recycling of products that contain nanotechnology-inspired materials.

To help fulfill that part of the mission, the center has established working relationships with several companies to conduct research on materials in the very early stages of development.

“We’re taking a look-ahead view. We’re trying to get into the technological design cycle,” Hamers says. “The idea is to use scientific understanding to develop a predictive ability to guide technology and guide people who are designing and using these materials.”

What with this initiative and the LCnano Network at Arizona State University (my April 8, 2014 posting; scroll down about 50% of the way), it seems that environmental and health and safety studies of nanomaterials are kicking into a higher gear as commercialization efforts intensify.

Save those coffee grounds; they can be used for fuel storage

A September 1, 2015 news item on Nanowerk features research from Korea that could point the way to using coffee grounds for methane storage (Note: A link has been removed),

Scientists have developed a simple process to treat waste coffee grounds to allow them to store methane. The simple soak and heating process develops a carbon capture nanomaterial with the additional environmental benefits of recycling a waste product.

The results are published today, 03 September 2015, in the journal Nanotechnology (“Activated carbon derived from waste coffee grounds for stable methane storage”). [emphasis mine]

Methane capture and storage provides a double environmental return – it removes a harmful greenhouse gas from the atmosphere that can then be used as a fuel that is cleaner than other fossil fuels.

The process developed by the researchers, based at the Ulsan National Institute of Science and Technology (UNIST), South Korea, involves soaking the waste coffee grounds in sodium hydroxide and heating to 700-900 °C in a furnace. This produced a stable carbon capture material in less than a day – a fraction of the time it takes to produce carbon capture materials.

I wonder if someone meant to embargo this news release as the paper isn’t due to be published until Thurs., Sept. 3, 2015.

In any event, the Institute of Physics (IOP) Sept. 1, 2015 news release on Alpha Galileo and elsewhere is making the rounds. Here’s more from the news release,

“The big thing is we are decreasing the fabrication time and we are using cheap materials,” explains Christian Kemp, an author of the paper now based at Pohang University of Science and Technology, Korea. “The waste material is free compared to all the metals and expensive organic chemicals needed in other processes – in my opinion this is a far easier way to go.”

Kemp found inspiration in his cup of coffee whilst discussing an entirely different project with colleagues at UNIST. “We were sitting around drinking coffee and looked at the coffee grounds and thought ‘I wonder if we can use this for methane storage?’” he continues.

The absorbency of coffee grounds may be the key to successful activation of the material for carbon capture. “It seems when we add the sodium hydroxide to form the activated carbon it absorbs everything,” says Kemp. “We were able to take away one step in the normal activation process – the filtering and washing – because the coffee is such a brilliant absorbent.”

The work also demonstrates hydrogen storage at cryogenic temperatures, and the researchers are now keen to develop hydrogen storage in the activated coffee grounds at less extreme temperatures.

Once the paper has been published I will return to add a link to and a citation for it.

ETA Sept. 3, 2015 (It seems I was wrong about the publication date):

Activated carbon derived from waste coffee grounds for stable methane storage by K Christian Kemp, Seung Bin Baek, Wang-Geun Lee, M Meyyappan, and Kwang S Kim. Nanotechnology, Volume 26, Number 38. doi:10.1088/0957-4484/26/38/385602 Published 2 September 2015 © 2015 IOP Publishing Ltd

This is an open access paper.

Plus, there is a copy of the press release on EurekAlert.

Nanotechnology takes the big data dive

Duke University’s (North Carolina, US) Center for the Environmental Implications of NanoTechnology (CEINT) is back in the news. An August 18, 2015 news item on Nanotechnology Now highlights two new projects intended to launch the field of nanoinformatics,

In two new studies, researchers from across the country spearheaded by Duke University faculty have begun to design the framework on which to build the emerging field of nanoinformatics.

An August 18, 2015 Duke University news release on EurekAlert, which originated the news item, describes the notion of nanoinformatics and how Duke is playing a key role in establishing this field,

Nanoinformatics is, as the name implies, the combination of nanoscale research and informatics. It attempts to determine which information is relevant to the field and then develop effective ways to collect, validate, store, share, analyze, model and apply that information — with the ultimate goal of helping scientists gain new insights into human health, the environment and more.

In the first paper, published on August 10, 2015, in the Beilstein Journal of Nanotechnology, researchers begin the conversation of how to standardize the way nanotechnology data are curated.

Because the field is young and yet extremely diverse, data are collected and reported in different ways in different studies, making it difficult to compare apples to apples. Silver nanoparticles in a Florida swamp could behave entirely differently if studied in the Amazon River. And even if two studies are both looking at their effects in humans, slight variations like body temperature, blood pH levels or nanoparticles only a few nanometers larger can give different results. For future studies to combine multiple datasets to explore more complex questions, researchers must agree on what they need to know when curating nanomaterial data.

“We chose curation as the focus of this first paper because there are so many disparate efforts that are all over the road in terms of their missions, and the only thing they all have in common is that somehow they have to enter data into their resources,” said Christine Hendren, a research scientist at Duke and executive director of the Center for the Environmental Implications of NanoTechnology (CEINT). “So we chose that as the kernel of this effort to be as broad as possible in defining a baseline for the nanoinformatics community.”

The paper is the first in a series of six that will explore what people mean — their vocabulary, definitions, assumptions, research environments, etc. — when they talk about gathering data on nanomaterials in digital form. And to get everyone on the same page, the researchers are seeking input from all stakeholders, including those conducting basic research, studying environmental implications, harnessing nanomaterial properties for applications, developing products and writing government regulations.

The daunting task is being undertaken by the Nanomaterial Data Curation Initiative (NDCI), a project of the National Cancer Informatics Nanotechnology Working Group (NCIP NanoWG) led by a diverse team of nanomaterial data stakeholders. If successful, not only will these disparate interests be able to combine their data, but the project will also highlight what data are missing and help drive the research priorities of the field.
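To make the curation problem concrete, here is a minimal sketch of what a standardized nanomaterial record might look like. The field names are my own invention for illustration, not the NDCI’s actual schema, but they show why a shared structure matters: two labs that fill in the same fields can compare results even when their media and conditions differ.

```python
from dataclasses import dataclass, asdict

@dataclass
class NanomaterialRecord:
    """One curated measurement of a nanomaterial in a specific context.

    Field names are illustrative only, not the NDCI's actual schema.
    """
    material: str           # e.g. "silver nanoparticle"
    core_composition: str   # e.g. "Ag"
    diameter_nm: float      # primary particle size
    coating: str            # surface functionalization
    medium: str             # test system, e.g. "freshwater", "human serum"
    temperature_c: float    # condition that can shift results
    ph: float               # condition that can shift results
    endpoint: str           # what was measured, e.g. "bacterial viability"
    value: float
    units: str

record = NanomaterialRecord(
    material="silver nanoparticle", core_composition="Ag",
    diameter_nm=20.0, coating="citrate", medium="freshwater mesocosm",
    temperature_c=25.0, ph=7.2, endpoint="bacterial viability",
    value=0.85, units="fraction surviving")

# With an agreed schema, records from different studies can be pooled
# and filtered on the same keys rather than reconciled by hand.
print(asdict(record)["medium"])
```

The point of the sketch is simply that a Florida-swamp study and an Amazon-River study become comparable only when both report the same fields, including the “slight variations” (temperature, pH, particle size) the news release mentions.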

In the second paper, published on July 16, 2015, in Science of The Total Environment, Hendren and her colleagues at CEINT propose a new, standardized way of studying the properties of nanomaterials.

“If we’re going to move the field forward, we have to be able to agree on what measurements are going to be useful, which systems they should be measured in and what data gets reported, so that we can make comparisons,” said Hendren.

The proposed strategy uses functional assays — relatively simple tests carried out in standardized, well-described environments — to measure nanomaterial behavior in actual systems.

For some time, the nanomaterial research community has been trying to use measured nanomaterial properties to predict outcomes. For example, what size and composition of a nanoparticle is most likely to cause cancer? The problem, argues Mark Wiesner, director of CEINT, is that this question is far too complex to answer.

“Environmental researchers use a parameter called biological oxygen demand to predict how much oxygen a body of water needs to support its ecosystem,” explains Wiesner. “What we’re basically trying to do with nanomaterials is the equivalent of trying to predict the oxygen level in a lake by taking an inventory of every living organism, mathematically map all of their living mechanisms and interactions, add up all of the oxygen each would take, and use that number as an estimate. But that’s obviously ridiculous and impossible. So instead, you take a jar of water, shake it up, see how much oxygen is taken and extrapolate that. Our functional assay paper is saying do that for nanomaterials.”

The paper makes suggestions as to what nanomaterials’ “jar of water” should be. It identifies what parameters should be noted when studying a specific environmental system, like digestive fluids or wastewater, so that they can be compared down the road.

It also suggests two meaningful processes for nanoparticles that should be measured by functional assays: attachment efficiency (does it stick to surfaces or not) and dissolution rate (does it release ions).

In describing how a nanoinformatics approach informs the implementation of a functional assay testing strategy, Hendren said “We’re trying to anticipate what we want to ask the data down the road. If we’re banking all of this comparable data while doing our near-term research projects, we should eventually be able to support more mechanistic investigations to make predictions about how untested nanomaterials will behave in a given scenario.”
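For readers who like things concrete, here is a rough sketch of how the two functional-assay quantities mentioned above might be computed from assay data. The function names and numbers are hypothetical, not taken from the paper; this is only meant to show that both are simple, directly measurable ratios rather than predictions from first principles.

```python
# Hypothetical sketch; names and data are illustrative, not from the paper.

def attachment_efficiency(attached: int, collisions: int) -> float:
    """Fraction of particle-surface collisions that end in sticking."""
    return attached / collisions

def dissolution_rate(conc_mg_l: list[float], times_h: list[float]) -> float:
    """Average ion-release rate (mg/L per hour) over the assay window."""
    return (conc_mg_l[-1] - conc_mg_l[0]) / (times_h[-1] - times_h[0])

# Example: of 1000 observed collisions with a wastewater-solid surface,
# 320 particles stuck; dissolved-ion concentration rose from 0.0 to
# 1.2 mg/L over 24 hours.
alpha = attachment_efficiency(320, 1000)
rate = dissolution_rate([0.0, 0.5, 1.2], [0.0, 12.0, 24.0])
print(alpha, rate)
```

This is the “jar of water” idea in miniature: measure the aggregate behavior in a well-described system and report the number, instead of trying to model every underlying mechanism.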

Here are links to and citations for the papers,

The Nanomaterial Data Curation Initiative: A collaborative approach to assessing, evaluating, and advancing the state of the field by Christine Ogilvie Hendren, Christina M. Powers, Mark D. Hoover, and Stacey L. Harper.  Beilstein J. Nanotechnol. 2015, 6, 1752–1762. doi:10.3762/bjnano.6.179 Published 18 Aug 2015

A functional assay-based strategy for nanomaterial risk forecasting by Christine Ogilvie Hendren, Gregory V. Lowry, Jason M. Unrine, and Mark R. Wiesner. Science of The Total Environment Available online 16 July 2015 In Press, Corrected Proof  DOI: 10.1016/j.scitotenv.2015.06.100.

The first paper listed is open access while the second paper is behind a paywall.

I’m (mostly) giving the final comments to Dexter Johnson who in an August 20, 2015 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) had this to say (Note: Links have been removed),

It can take days for a supercomputer to unravel all the data contained in a single human genome. So it wasn’t long after mapping the first human genome that researchers coined the umbrella term “bioinformatics” in which a variety of methods and computer technologies are used for organizing and analyzing all that data.

Now teams of researchers led by scientists at Duke University believe that the field of nanotechnology has reached a critical mass of data and that a new field needs to be established, dubbed “nanoinformatics.”

While being able to better organize and analyze data to study the impact of nanomaterials on the environment should benefit the field, what seems to remain a more pressing concern is having the tools for measuring nanomaterials outside of a vacuum and in water and air environments.

I gather Christine Hendren has succeeded Mark Wiesner as CEINT’s executive director.

Self-assembling copper and physiology

An Aug. 24, 2015 news item on Nanowerk highlights work at Louisiana Tech University (US) on self-assembling copper nanocomposites in liquid form,

Faculty at Louisiana Tech University have discovered, for the first time, a new nanocomposite formed by the self-assembly of copper and a biological component that occurs under physiological conditions, which are similar to those found in the human body, and could be used in targeted drug delivery for fighting diseases such as cancer.

The team, led by Dr. Mark DeCoster, the James E. Wyche III Endowed Associate Professor in Biomedical Engineering at Louisiana Tech, has also discovered a way for this synthesis to be carried out in liquid form. This would allow for controlling the scale of the synthesis up or down, and to grow structures with larger features, so they can be observed.

An Aug. 24, 2015 Louisiana Tech University news release by Dave Guerin, which originated the news item, describes possible future applications and the lead researcher’s startup company,

“We are currently investigating how this new material interacts with cells,” said DeCoster. “It may be used, for example for drug delivery, which could be used in theory for fighting diseases such as cancer. Also, as a result of the copper component that we used, there could be some interesting electronics, energy, or optics applications that could impact consumer products. In addition, copper has some interesting and useful antimicrobial features.

“Finally, as the recent environmental spill of mining waste into river systems showed us, metals, including copper, can sometimes make their way into freshwater systems, so our newly discovered metal-composite methods could provide a way to “bind up” unwanted copper into a useful or more stable form.”

DeCoster said there were two aspects of this discovery that surprised him and his research team. First, they found that once formed, these copper nanocomposites were incredibly stable both in liquid or dried form, and remained stable for years. “We have been carrying out this research for at least four years and have a number of samples that are at least two years old and still stable,” DeCoster said.

Second, DeCoster’s group was very surprised that these composites are resistant to agglomeration, which is the process by which material clumps or sticks together.

“This is of benefit because it allows us to work with individual structures in order to separate or modify them chemically,” explains DeCoster. “When materials stick together and clump, as many do, it is much harder to work with them in a logical way. Both of these aspects, however, fit with our hypothesis that the self-assembly that we have discovered is putting positively charged copper together with negatively charged sulfur-containing cystine.”

The research discovery was a team effort that included DeCoster and Louisiana Tech students at the bachelor, master and doctoral level. “The quality of my team in putting together a sustained effort to figure out what was needed to reproducibly carry out the new self-assembly methods and to simplify them really speaks well as to what can be accomplished at Louisiana Tech University,” DeCoster said. “Furthermore, the work is very multi-disciplinary, meaning that it required nanotechnology as well as biological and biochemical insights to make it all work, as well as some essential core instrumentation that we have at Louisiana Tech.”

DeCoster says the future of this research has some potentially high impacts. He and his team are speaking with colleagues and collaborators about how to test these new nanocomposites for applications in bioengineering and larger composites such as materials that would be large enough to be hand-held.

“Our recent publication of the work could generate some interest and new ideas,” said DeCoster. “We are working on new proposals to fund the research and to keep it moving forward. We are currently making these materials on an ‘as needed’ basis, knowing that they can be stored once generated, and if we discover new uses for the nanocomposites, then applications for the materials could lead to income generation through a start-up company that I have formed.”

Here’s a link to and a citation for the paper,

Generation of Scalable, Metallic High-Aspect Ratio Nanocomposites in a Biological Liquid Medium by Kinsey Cotton Kelly, Jessica R. Wasserman, Sneha Deodhar, Justin Huckaby, and Mark A. DeCoster. J. Vis. Exp. [Journal of Visualized Experiments; JoVE] (101), e52901, doi:10.3791/52901 (2015).

This paper/video is behind a paywall.

ISEA (International Symposium on Electronic Arts) 2015 and the pronoun ‘I’

The 2015 International Symposium on Electronic Arts (ISEA 2015), held in Vancouver, ended yesterday, Aug. 19, 2015. It was quite an experience both as a participant and as a presenter (mentioned in my Aug. 14, 2015 posting, Sneak peek: Steep (1): a digital poetry of gold nanoparticles). Both this ISEA and the one I attended previously in 2009 (Belfast, Northern Ireland, and Dublin, Ireland) were jam-packed with sessions, keynote addresses, special events, and exhibitions of various artworks. Exhilarating and exhausting: that is the ISEA experience for me and just about anyone else I talked to here in Vancouver (Canada). In terms of organization, I have to give props to the Irish. Unfortunately, the Vancouver team didn’t seem to have given their volunteers any training, and technical difficulties abounded. Basics proved elusive: a poster outside each room noting which session was taking place, signage indicating which artist’s work was being featured, and good technical support (my technician managed to plug in a few things but seemed disinclined, or perhaps lacked the expertise, to troubleshoot before the presentation). One keynote presentation even had to be moved due to technical requirements, and no one told the volunteer staff, who consequently misdirected people. Ooops.

Despite the difficulties, people remained enthusiastic, and that’s a tribute to both the participants and, importantly, the organizers. The Vancouver ISEA was a huge undertaking, with over 1,000 presentation submissions and over 1,800 artwork submissions. More than 900 people registered, and this was the first ISEA able to offer payment to artists for their installations. Bravo to Philippe Pasquier, Thecla Schiphorst, Kate Armstrong, Malcolm Levy, and all the others who worked hard to pull this off.

Moving on to ‘I’, while the theme for ISEA 2015 was Disruption, I noticed a number of presentations focused on biology and on networks (in particular, generative networks). In some ways this parallels what’s happening in the sciences, where more notice is being given to networks and network communications of all sorts. For example, there’s an Aug. 19, 2015 news item on ScienceDaily suggesting that our use of the pronoun ‘I’ may become outdated. What we consider to be an individual may be better understood as a host for a number of communities or networks,

Recent microbiological research has shown that thinking of plants and animals, including humans, as autonomous individuals is a serious over-simplification.

A series of groundbreaking studies have revealed that what we have always thought of as individuals are actually “biomolecular networks” that consist of visible hosts plus millions of invisible microbes that have a significant effect on how the host develops, the diseases it catches, how it behaves and possibly even its social interactions.

“It’s a case of the whole being greater than the sum of its parts,” said Seth Bordenstein, associate professor of biological sciences at Vanderbilt University, who has contributed to the body of scientific knowledge that is pointing to the conclusion that symbiotic microbes play a fundamental role in virtually all aspects of plant and animal biology, including the origin of new species.

In this case, the parts are the host and its genome plus the thousands of different species of bacteria living in or on the host, along with all their genomes, collectively known as the microbiome. (The host is something like the tip of the iceberg while the bacteria are like the part of the iceberg that is underwater: Nine out of every 10 cells in plant and animal bodies are bacterial. But bacterial cells are so much smaller than host cells that they have generally gone unnoticed.)

An Aug. 19, 2015 Vanderbilt University news release, which originated the news item, describes this provocative idea (no more ‘I’)  further,

Microbiologists have coined new terms for these collective entities — holobiont — and for their genomes — hologenome. “These terms are needed to define the assemblage of organisms that makes up the so-called individual,” said Bordenstein.

In the article “Host Biology in Light of the Microbiome: Ten Principles of Holobionts and Hologenomes” published online Aug. 18 [2015] in the open access journal PLOS Biology, Bordenstein and his colleague Kevin Theis from the University of Michigan take the general concepts involved in this new paradigm and break them down into underlying principles that apply to the entire field of biology.

They make specific and refutable predictions based on these principles and call for other biologists to test them theoretically and experimentally.

“One of the basic expectations from this conceptual framework is that animal and plant experiments that do not account for what is happening at the microbiological level will be incomplete and, in some cases, will be misleading as well,” said Bordenstein.

The first principle they advance is that holobionts and hologenomes are fundamental units of biological organization.

Another is that evolutionary forces such as natural selection and drift may act on the hologenome not just on the genome. So mutations in the microbiome that affect the fitness of a holobiont are just as important as mutations in the host’s genome. However, they argue that this does not change the basic rules of evolution but simply upgrades the types of biological units that the rules may act upon.

Although it does not change the basic rules of evolution, holobionts do have a way to respond to environmental challenges that is not available to individual organisms: They can alter the composition of their bacterial communities. For example, if a holobiont is attacked by a pathogen that the host cannot defend against, another symbiont may fulfill the job by manufacturing a toxin that can kill the invader. In this light, the microbes are as much part of the holobiont immune system as the host immune genes themselves.

According to Bordenstein, these ideas are gaining acceptance in the microbiology community. At the American Society of Microbiology General Meeting in June [2015], he convened the inaugural session on “Holobionts and Their Hologenomes” and ASM’s flagship journal mBio plans to publish a special issue on the topic in the coming year. [emphases are mine]

However, adoption of these ideas has been slower in other fields.

“Currently, the field of biology has reached an inflection point. The silos of microbiology, zoology and botany are breaking down and we hope that this framework will help further unify these fields,” said Bordenstein.

Not only will this powerful holistic approach affect the basic biological sciences but it also is likely to impact the practice of personalized medicine as well, Bordenstein said.

Take the missing heritability problem, for example. Although genome-wide studies have provided valuable insights into the genetic basis of a number of simple diseases, they have only found a small portion of the genetic causes of a number of more complex conditions such as autoimmune and metabolic diseases.

These may in part be “missing” because the genetic factors that cause them are in the microbiome, he pointed out.

“Instead of being so ‘germophobic,’ we need to accept the fact that we live in and benefit from a microbial world. We are as much an environment for microbes as microbes are for us,” said Bordenstein.

Here’s a link to and a citation for the paper,

Host Biology in Light of the Microbiome: Ten Principles of Holobionts and Hologenomes by Seth R. Bordenstein and Kevin R. Theis. PLOS Biology DOI: 10.1371/journal.pbio.1002226 Published: August 18, 2015

This is an open access paper.

It’s intriguing to see artists and scientists exploring ideas that resonate with each other. In fact, ISEA 2015 hosted a couple of sessions on BioArt, as well as sessions devoted to networks. While I wasn’t thinking about networks or biological systems when I wrote my poem on gold nanoparticles, I did pose this possibility (how we become the sum of our parts) at the end:

Nature’s alchemy
breathing them
eating them
drinking them
we become gold
discovering what we are

As for how Raewyn handled the idea, words fail me; please do go see the video here.

Carbon capture with ‘diamonds from the sky’

Before launching into the latest on a new technique for carbon capture, it might be useful to provide some context. Arthur Neslen’s March 23, 2015 opinion piece outlines the issues and notes that one Norwegian Prime Minister resigned when coalition government partners attempted to build gas power plants without carbon capture and storage (CCS) facilities (Note: A link has been removed),

At least 10 European power plants were supposed to begin piping their carbon emissions into underground tombs this year, rather than letting them twirl into the sky. None has done so.

Missed deadlines, squandered opportunities, spiralling costs and green protests have plagued the development of carbon capture and storage (CCS) technology since Statoil proposed the concept more than two decades ago.

But in the face of desperate global warming projections the CCS dream still unites Canadian tar sands rollers with the UN’s Intergovernmental Panel on Climate Change (IPCC), and Shell with some environmentalists.

With 2bn people in the developing world expected to hook up to the world’s dirty energy system by 2050, CCS holds out the tantalising prospect of fossil-led growth that does not fry the planet.

“With CCS in the mix, we can decarbonise in a cost-effective manner and still continue to produce, to some extent, our fossil fuels,” Tim Bertels, Shell’s global CCS portfolio manager, told the Guardian. “You don’t need to divest in fossil fuels, you need to decarbonise them.”

The technology has been gifted “a very significant fraction” of the billions of dollars earmarked by Shell for clean energy research, he added. But the firm is also a vocal supporter of public funding for CCS from carbon markets, as are almost all players in the industry.

Enthusiasm for this plan is not universal (from Neslen’s opinion piece),

Many environmentalists see the idea as a non-starter because it locks high emitting power plants into future energy systems, and obstructs funding for the cheaper renewables revolution already underway. “CCS is completely irrelevant,” said Jeremy Rifkin, a noted author and climate adviser to several governments. “I don’t even think about it. It’s not going to happen. It’s not commercially available and it won’t be commercially viable.”

I recommend reading Neslen’s piece for anyone who’s not already well versed on the issues. He uses Norway as a case study and sums up the overall CCS political situation this way,

In many ways, the debate over carbon capture and storage is a struggle between two competing visions of the societal transformation needed to avert climate disaster. One vision represents the enlightened self-interest of a contributor to the problem. The other cannot succeed without eliminating its highly entrenched opponent. The battle is keenly fought by technological optimists on both sides. But if Norway’s fractious CCS experience is any indicator, it will be decided on the ground by the grimmest of realities.

On that note of urgency, here’s some research on carbon dioxide (CO2) or, more specifically, carbon capture and utilization technology, from an Aug. 19, 2015 news item on Nanowerk,

Finding a technology to shift carbon dioxide (CO2), the most abundant anthropogenic greenhouse gas, from a climate change problem to a valuable commodity has long been a dream of many scientists and government officials. Now, a team of chemists says they have developed a technology to economically convert atmospheric CO2 directly into highly valued carbon nanofibers for industrial and consumer products.

An Aug. 19, 2015 American Chemical Society (ACS) news release (also on EurekAlert), which originated the news item, expands on the theme,

The team will present brand-new research on this new CO2 capture and utilization technology at the 250th National Meeting & Exposition of the American Chemical Society (ACS). ACS is the world’s largest scientific society. The national meeting, which takes place here through Thursday, features more than 9,000 presentations on a wide range of science topics.

“We have found a way to use atmospheric CO2 to produce high-yield carbon nanofibers,” says Stuart Licht, Ph.D., who leads a research team at George Washington University. “Such nanofibers are used to make strong carbon composites, such as those used in the Boeing Dreamliner, as well as in high-end sports equipment, wind turbine blades and a host of other products.”

The researchers had previously reported making fertilizer and cement without emitting CO2. Now, the team, which includes postdoctoral fellow Jiawen Ren, Ph.D., and graduate student Jessica Stuart, says their research could shift CO2 from a global-warming problem to a feedstock for the manufacture of in-demand carbon nanofibers.

Licht calls his approach “diamonds from the sky.” That refers to carbon being the material that diamonds are made of, and also hints at the high value of the products, such as the carbon nanofibers that can be made from atmospheric carbon and oxygen.

Because of its efficiency, this low-energy process can be run using only a few volts of electricity, sunlight and a whole lot of carbon dioxide. At its root, the system uses electrolytic syntheses to make the nanofibers. CO2 is broken down in a high-temperature electrolytic bath of molten carbonates at 1,380 degrees F (750 degrees C). Atmospheric air is added to an electrolytic cell. Once there, the CO2 dissolves when subjected to the heat and direct current through electrodes of nickel and steel. The carbon nanofibers build up on the steel electrode, where they can be removed, Licht says.

To power the syntheses, heat and electricity are produced through a hybrid and extremely efficient concentrating solar-energy system. The system focuses the sun’s rays on a photovoltaic solar cell to generate electricity and on a second system to generate heat and thermal energy, which raises the temperature of the electrolytic cell.

Licht estimates electrical energy costs of this “solar thermal electrochemical process” to be around $1,000 per ton of carbon nanofiber product, which means the cost of running the system is hundreds of times less than the value of product output.

“We calculate that with a physical area less than 10 percent the size of the Sahara Desert, our process could remove enough CO2 to decrease atmospheric levels to those of the pre-industrial revolution within 10 years,” he says. [emphasis mine]
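That claim is easy to sanity-check with rough numbers. Here is a minimal back-of-envelope sketch, using standard figures for the mass of the atmosphere and for CO2 mixing ratios rather than anything from Licht’s paper, that gives a sense of the scale involved:

```python
# Back-of-envelope check of the scale implied by the "pre-industrial in 10 years"
# claim. All figures are rough, publicly known values, not from the paper.
M_ATMOSPHERE = 5.15e18      # total mass of Earth's atmosphere, kg
M_AIR = 28.97               # mean molar mass of air, g/mol
M_CO2 = 44.01               # molar mass of CO2, g/mol
ppm_2015 = 400.0            # approximate atmospheric CO2 in 2015, ppm by volume
ppm_preindustrial = 280.0   # pre-industrial CO2 level, ppm by volume

def co2_mass_kg(ppm):
    """Mass of atmospheric CO2 at a given mixing ratio (ppm by volume)."""
    return M_ATMOSPHERE * (ppm * 1e-6) * (M_CO2 / M_AIR)

to_remove = co2_mass_kg(ppm_2015) - co2_mass_kg(ppm_preindustrial)
per_year = to_remove / 10   # spread over the claimed 10 years

print(f"CO2 to remove: {to_remove/1e12:.0f} Gt")            # ~940 Gt
print(f"Required removal rate: {per_year/1e12:.0f} Gt/yr")  # ~94 Gt/yr
```

The removal rate works out to roughly 90-95 gigatonnes of CO2 per year, well above total global annual emissions, which underlines just how bold the claim is.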

At this time, the system is experimental, and Licht’s biggest challenge will be to ramp up the process and gain experience to make consistently sized nanofibers. “We are scaling up quickly,” he adds, “and soon should be in range of making tens of grams of nanofibers an hour.”

Licht explains that one advance the group has recently achieved is the ability to synthesize carbon fibers using even less energy than when the process was initially developed. “Carbon nanofiber growth can occur at less than 1 volt at 750 degrees C, which for example is much less than the 3-5 volts used in the 1,000 degree C industrial formation of aluminum,” he says.
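Faraday’s law lets us sanity-check the quoted energy cost of roughly $1,000 per ton. The sketch below assumes a 1 V cell and an electricity price of $0.10/kWh; both are illustrative assumptions on my part, not figures from the paper:

```python
# Rough Faraday's-law estimate of the electrolysis energy cost, to see whether
# the quoted ~$1,000/ton figure is plausible. The voltage and electricity price
# are assumptions for illustration, not numbers from Licht's work.
F = 96485.0          # Faraday constant, C/mol
M_C = 12.011         # molar mass of carbon, g/mol
n_electrons = 4      # CO2 -> C is a four-electron reduction
cell_voltage = 1.0   # V, roughly the "less than 1 volt" quoted above
price_per_kwh = 0.10 # assumed electricity price, $/kWh

energy_per_mol = n_electrons * F * cell_voltage          # J per mol of carbon
mols_per_tonne = 1e6 / M_C                               # mol C in one tonne
kwh_per_tonne = energy_per_mol * mols_per_tonne / 3.6e6  # J -> kWh
cost_per_tonne = kwh_per_tonne * price_per_kwh

print(f"{kwh_per_tonne:.0f} kWh/tonne, ~${cost_per_tonne:.0f}/tonne")
```

Under those assumptions the electricity alone comes to roughly $900 per tonne of carbon, which is at least consistent with the order of magnitude Licht estimates.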

A low-energy approach that cleans up the air by converting greenhouse gases into useful materials, and does it quickly, is incredibly exciting. Of course, there are a few questions to be asked. Are the research outcomes reproducible by other teams? Licht notes the team is scaling up the technology, but how soon could it reach industrial strength?

MOFs (metal-organic frameworks) to clean up nuclear waste?

There’s a possibility that metal-organic frameworks could be used to clean up nuclear waste, according to an Aug. 5, 2015 news item,

Among the most versatile and widely applicable classes of materials being studied today are the metal-organic frameworks. These materials, known as MOFs, are characterized by metal ions or metal-ion clusters that are linked together with organic molecules, forming ordered crystal structures that contain tiny cage-like pores with diameters of two nanometers or less.

MOFs can be thought of as highly specialized and customizable sieves. By designing them with pores of a certain size, shape, and chemical composition, researchers can tailor them for specific purposes. A few of the many, many possible applications for MOFs are storing hydrogen in fuel cells, capturing environmental contaminants, or temporarily housing catalytic agents for chemical reactions.

At [US Department of Energy] Brookhaven National Laboratory, physicist Sanjit Ghose and his collaborators have been studying MOFs designed for use in the separation of waste from nuclear reactors, which results from the reprocessing of nuclear fuel rods. He is targeting two waste products in particular: the noble gases xenon (Xe) and krypton (Kr).

An Aug. 4, 2015 Brookhaven National Laboratory news release, which originated the news item, describes not only the research and the reasons for it but also the institutional collaborations necessary to conduct the research,

There are compelling economic and environmental reasons to separate Xe and Kr from the nuclear waste stream. For one, because they have very different half-lives – about 36 days for Xe and nearly 11 years for Kr – pulling out the Xe greatly reduces the amount of waste that needs to be stored long-term before it is safe to handle. Additionally, the extracted Xe can be used for industrial applications, such as in commercial lighting and as an anesthetic. This research may also help scientists determine how to create MOFs that can remove other materials from the nuclear waste stream and expose the remaining unreacted nuclear fuel for further re-use. This could lead to much less overall waste that must be stored long-term and a more efficient system for producing nuclear energy, which is the source of about 20 percent of the electricity in the U.S.

Because Xe and Kr are noble gases, meaning their outer electron orbitals are filled and they don’t tend to bind to other atoms, they are difficult to manipulate. The current method for extracting them from the nuclear waste stream is cryogenic distillation, a process that is energy-intensive and expensive. The MOFs studied here use a very different approach: polarizing the gas atoms dynamically, just enough to draw them in using the van der Waals force. The mechanism works at room temperature, but also at hotter temperatures, which is key if the MOFs are to be used in a nuclear environment.
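The polarizability argument can be made concrete with tabulated numbers. Since the dispersion (van der Waals) attraction scales, to first order, with an atom’s polarizability, comparing standard literature polarizabilities for Xe and Kr (these values come from standard tables, not from the papers discussed here) hints at why these MOFs favour Xe:

```python
# The van der Waals attraction drawing a noble-gas atom toward the pore wall
# scales roughly with the atom's polarizability, so comparing tabulated
# polarizabilities gives a crude feel for why Xe binds more strongly than Kr.
alpha_xe = 4.044  # static dipole polarizability of Xe, angstrom^3 (literature value)
alpha_kr = 2.484  # static dipole polarizability of Kr, angstrom^3 (literature value)

ratio = alpha_xe / alpha_kr
print(f"Xe is ~{ratio:.1f}x more polarizable than Kr")  # ~1.6x
```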

Recently, Ghose co-authored two papers describing MOFs that are capable of adsorbing Xe and Kr and that excel at separating Xe from Kr. The papers were published in the May 22 online edition of the Journal of the American Chemical Society and the April 16 online edition of the Journal of Physical Chemistry Letters.

“Only a handful of noble-gas-specific MOFs have been studied so far, and we felt there was certainly scope for improvement through the discovery of more selective materials,” said Ghose.

Both MOF studies were carried out by large multi-institution collaborations, using a combination of X-ray diffraction, theoretical modeling, and other methods. The X-ray work was performed at Brookhaven’s former National Synchrotron Light Source (permanently closed and replaced by its successor, NSLS-II) and the Advanced Photon Source at Argonne National Laboratory (ANL), both DOE Office of Science User Facilities.

The JACS paper was co-authored by researchers from Brookhaven Lab, Stony Brook University (SBU), Pacific Northwest National Laboratory (PNNL), and the University of Amsterdam. Authors on the JPCL paper include scientists from Brookhaven, SBU, PNNL, ANL, the Deutsches Elektronen-Synchrotron (DESY) in Germany, and DM Strachan, LLC.

Here’s more about the first published paper, which appeared in the Journal of Physical Chemistry Letters (JPCL) (from the news release),

A nickel-based MOF

The MOF studied in the JPCL paper consists of nickel (Ni) and the organic compound dioxido-benzene-dicarboxylate (DOBDC), and is thus referred to as Ni-DOBDC. Ni-DOBDC can adsorb both Xe and Kr at room temperature but is highly selective toward Xe. In fact, it boasts what may be the highest Xe adsorption capacity of any MOF discovered to date.

The group studied Ni-DOBDC using two main techniques: X-ray diffraction and first-principles density functional theory (DFT). The paper is the first published report to detail the adsorption mechanism by which the MOF takes in these noble gases at room temperature and pressure.

“Our results provide a fundamental understanding of the adsorption structure and the interactions between the MOF and the gas by combining direct structural analyses from experimental X-ray diffraction data and DFT calculations,” said Ghose.

The group also discovered the existence of a secondary adsorption site at the pore center, in addition to the six-fold primary site. This seven-atom loading scheme was initially proposed by theorist Yan Li, a co-author of the JPCL paper and formerly on staff at Brookhaven (she is now an editor at Physical Review B), and was then confirmed both experimentally and theoretically. The data also indicate that Xe is adsorbed more strongly than Kr, owing to xenon’s higher atomic polarizability. The researchers further discovered a temperature dependence of the adsorption that enhances this MOF’s selectivity for Xe over Kr: as the temperature rises above room temperature, Kr adsorption drops more sharply than Xe adsorption. Over the entire temperature range tested, Xe adsorption always dominates that of Kr.

“The high separation capacity of Ni-DOBDC suggests that it has great potential for removing Xe from Kr in the off-gas streams in nuclear spent fuel reprocessing, as well as filtering Xe at low concentration from other gas mixtures,” said Ghose.

Ghose and Li are now preparing a manuscript that will discuss a more in-depth investigation into the possibility of packing in even more Xe atoms.

“Because of the confinement offered by each pore, we want to see if it’s possible to fit enough Xe in each chamber to form a solid,” said Li.

Ghose and Li hope to experimentally test this idea at NSLS-II in the future, at the facility’s X-ray Powder Diffraction (XPD) beamline, which Ghose has helped develop and build. Additional future studies of these and other MOFs will also take place at XPD. For example, they want to see what happens when other gases are present, such as nitrogen oxides, to mimic what happens in an actual nuclear reactor.

Then, there was the second paper published in the Journal of the American Chemical Society (JACS),

Another MOF, Another Promising Result

In the JACS paper, Ghose and researchers from Brookhaven, SBU, PNNL, and the University of Amsterdam describe a second MOF, dubbed Stony Brook MOF-2 (SBMOF-2). It also captures both Xe and Kr at room temperature and pressure, although it is about ten times as effective at taking in Xe, with Xe taking up as much as 27 percent of its weight. SBMOF-2 had been theoretically predicted to be an efficient adsorbent for Xe and Kr, but until this research there had been no experimental results to back up the prediction.
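For readers used to the units adsorption papers typically quote, the 27 percent figure can be converted into molar uptake. This is my own arithmetic on the reported number, using the standard molar mass of xenon:

```python
# Converting SBMOF-2's reported Xe uptake (up to 27% of its own weight) into
# the mmol/g units adsorption studies usually quote. The 27 wt% figure is from
# the news release above; the molar mass of Xe is a standard value.
wt_fraction = 0.27   # g Xe per g MOF
M_XE = 131.29        # molar mass of xenon, g/mol

uptake_mmol_g = wt_fraction / M_XE * 1000
print(f"{uptake_mmol_g:.2f} mmol Xe per g SBMOF-2")  # ~2.06 mmol/g
```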

“Our study is different than MOF research done by other groups,” said chemist John Parise, a coauthor of the JACS paper who holds a joint position with Brookhaven and SBU. “We did a lot of testing and investigated the capture mechanism very closely to get clues that would help us understand why the MOF worked, and how to tailor the structure to have even better properties.”

SBMOF-2 contains calcium (Ca) ions and an organic compound with the chemical formula C34H22O8. X-ray data show that its structure is unusual among microporous MOFs. It has fewer calcium sites than expected and an excess of oxygen over calcium. The calcium and oxygen form CaO6, which takes the form of a three-dimensional octahedron. Notably, none of the six oxygen atoms bound to the calcium ion are shared with any other nearby calcium ions. The authors believe that SBMOF-2 is the first microporous MOF with these isolated CaO6 octahedra, which are connected by organic linker molecules.

The group discovered that the preference of SBMOF-2 for Xe over Kr is due to both the geometry and the chemistry of its pores. All the pores have diamond-shaped cross sections, but they come in two sizes, designated type-1 and type-2. Both sizes are a better fit for the Xe atom. The interiors of the pores have walls made of phenyl groups – ring-shaped C6H5 units – along with delocalized electron clouds and H atoms pointing into the pore. The type-2 pores also have hydroxyl anions (OH-) available. All of these features provide potential sites for adsorbed Xe and Kr atoms.

In follow-up studies, Ghose and his colleagues will use these results to guide them as they determine what changes can be made to these MOFs to improve adsorption, as well as to determine what existing MOFs may yield similar or better performance.

Here are links to and citations for both papers,

Understanding the Adsorption Mechanism of Xe and Kr in a Metal–Organic Framework from X-ray Structural Analysis and First-Principles Calculations by Sanjit K. Ghose, Yan Li, Andrey Yakovenko, Eric Dooryhee, Lars Ehm, Lynne E. Ecker, Ann-Christin Dippel, Gregory J. Halder, Denis M. Strachan, and Praveen K. Thallapally. J. Phys. Chem. Lett., 2015, 6 (10), pp 1790–1794 DOI: 10.1021/acs.jpclett.5b00440 Publication Date (Web): April 16, 2015

Copyright © 2015 American Chemical Society

Direct Observation of Xe and Kr Adsorption in a Xe-Selective Microporous Metal–Organic Framework by Xianyin Chen, Anna M. Plonka, Debasis Banerjee, Rajamani Krishna, Herbert T. Schaef, Sanjit Ghose, Praveen K. Thallapally, and John B. Parise. J. Am. Chem. Soc., 2015, 137 (22), pp 7007–7010 DOI: 10.1021/jacs.5b02556 Publication Date (Web): May 22, 2015
Copyright © 2015 American Chemical Society

Both papers are behind a paywall.

Nanomaterials and UV (ultraviolet) light for environmental cleanups

I think this is the first time I’ve seen anything about a technology that removes toxic materials from both water and soil; it’s usually one or the other. A July 22, 2015 news item on Nanowerk makes the announcement (Note: A link has been removed),

Many human-made pollutants in the environment resist degradation through natural processes, and disrupt hormonal and other systems in mammals and other animals. Removing these toxic materials — which include pesticides and endocrine disruptors such as bisphenol A (BPA) — with existing methods is often expensive and time-consuming.

In a new paper published this week in Nature Communications (“Nanoparticles with photoinduced precipitation for the extraction of pollutants from water and soil”), researchers from MIT [Massachusetts Institute of Technology] and the Federal University of Goiás in Brazil demonstrate a novel method for using nanoparticles and ultraviolet (UV) light to quickly isolate and extract a variety of contaminants from soil and water.

A July 21, 2015 MIT news release by Jonathan Mingle, which originated the news item, describes the inspiration and the research in more detail,

Ferdinand Brandl and Nicolas Bertrand, the two lead authors, are former postdocs in the laboratory of Robert Langer, the David H. Koch Institute Professor at MIT’s Koch Institute for Integrative Cancer Research. (Eliana Martins Lima, of the Federal University of Goiás, is the other co-author.) Both Brandl and Bertrand are trained as pharmacists, and describe their discovery as a happy accident: They initially sought to develop nanoparticles that could be used to deliver drugs to cancer cells.

Brandl had previously synthesized polymers that could be cleaved apart by exposure to UV light. But he and Bertrand came to question their suitability for drug delivery, since UV light can be damaging to tissue and cells, and doesn’t penetrate through the skin. When they learned that UV light was used to disinfect water in certain treatment plants, they began to ask a different question.

“We thought if they are already using UV light, maybe they could use our particles as well,” Brandl says. “Then we came up with the idea to use our particles to remove toxic chemicals, pollutants, or hormones from water, because we saw that the particles aggregate once you irradiate them with UV light.”

A trap for ‘water-fearing’ pollution

The researchers synthesized polymers from polyethylene glycol, a widely used compound found in laxatives, toothpaste, and eye drops and approved by the Food and Drug Administration as a food additive, and polylactic acid, a biodegradable plastic used in compostable cups and glassware.

Nanoparticles made from these polymers have a hydrophobic core and a hydrophilic shell. Due to molecular-scale forces, in a solution hydrophobic pollutant molecules move toward the hydrophobic nanoparticles, and adsorb onto their surface, where they effectively become “trapped.” This same phenomenon is at work when spaghetti sauce stains the surface of plastic containers, turning them red: in that case, both the plastic and the oil-based sauce are hydrophobic and interact with each other.

If left alone, these nanomaterials would remain suspended and dispersed evenly in water. But when exposed to UV light, the stabilizing outer shell of the particles is shed, and — now “enriched” by the pollutants — they form larger aggregates that can then be removed through filtration, sedimentation, or other methods.

The researchers used the method to extract phthalates, hormone-disrupting chemicals used to soften plastics, from wastewater; BPA, another endocrine-disrupting synthetic compound widely used in plastic bottles and other resinous consumer goods, from thermal printing paper samples; and polycyclic aromatic hydrocarbons, carcinogenic compounds formed from incomplete combustion of fuels, from contaminated soil.

The process is irreversible and the polymers are biodegradable, minimizing the risks of leaving toxic secondary products to persist in, say, a body of water. “Once they switch to this macro situation where they’re big clumps,” Bertrand says, “you won’t be able to bring them back to the nano state again.”

The fundamental breakthrough, according to the researchers, was confirming that small molecules do indeed adsorb passively onto the surface of nanoparticles.

“To the best of our knowledge, it is the first time that the interactions of small molecules with pre-formed nanoparticles can be directly measured,” they write in Nature Communications.

Nano cleansing

Even more exciting, they say, is the wide range of potential uses, from environmental remediation to medical analysis.

The polymers are synthesized at room temperature, and don’t need to be specially prepared to target specific compounds; they are broadly applicable to all kinds of hydrophobic chemicals and molecules.

“The interactions we exploit to remove the pollutants are non-specific,” Brandl says. “We can remove hormones, BPA, and pesticides that are all present in the same sample, and we can do this in one step.”

And the nanoparticles’ high surface-area-to-volume ratio means that only a small amount is needed to remove a relatively large quantity of pollutants. The technique could thus offer potential for the cost-effective cleanup of contaminated water and soil on a wider scale.
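The surface-area-to-volume argument is simple geometry: for a sphere the ratio is 3/r, so every tenfold reduction in radius buys a tenfold gain in surface per unit volume. A quick sketch with illustrative particle sizes (not sizes taken from the paper):

```python
# Why nanoparticles punch above their weight as sorbents: for a sphere, the
# surface-area-to-volume ratio is 3/r, so shrinking the radius a thousandfold
# (here, 100 micrometres down to 100 nanometres, chosen for illustration)
# yields a thousandfold more surface per unit volume.
import math

def surface_to_volume(radius_m):
    """Surface area / volume for a sphere of the given radius (returns 1/m)."""
    area = 4 * math.pi * radius_m**2
    volume = (4 / 3) * math.pi * radius_m**3
    return area / volume   # simplifies to 3 / radius_m

micro = surface_to_volume(100e-6)  # 100 um particle
nano = surface_to_volume(100e-9)   # 100 nm particle
print(f"gain: {nano / micro:.0f}x")  # 1000x more surface per unit volume
```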

“From the applied perspective, we showed in a system that the adsorption of small molecules on the surface of the nanoparticles can be used for extraction of any kind,” Bertrand says. “It opens the door for many other applications down the line.”

This approach could possibly be further developed, he speculates, to replace the widespread use of organic solvents for everything from decaffeinating coffee to making paint thinners. Bertrand cites DDT, banned for use as a pesticide in the U.S. since 1972 but still widely used in other parts of the world, as another example of a persistent pollutant that could potentially be remediated using these nanomaterials. “And for analytical applications where you don’t need as much volume to purify or concentrate, this might be interesting,” Bertrand says, offering the example of a cheap testing kit for urine analysis of medical patients.

The study also suggests the broader potential for adapting nanoscale drug-delivery techniques for use in environmental remediation.

“That we can apply some of the highly sophisticated, high-precision tools developed for the pharmaceutical industry, and now look at the use of these technologies in broader terms, is phenomenal,” says Frank Gu, an assistant professor of chemical engineering at the University of Waterloo in Canada, and an expert in nanoengineering for health care and medical applications.

“When you think about field deployment, that’s far down the road, but this paper offers a really exciting opportunity to crack a problem that is persistently present,” says Gu, who was not involved in the research. “If you take the normal conventional civil engineering or chemical engineering approach to treating it, it just won’t touch it. That’s where the most exciting part is.”

The researchers have made this illustration of their work available,

Nanoparticles that lose their stability upon irradiation with light have been designed to extract endocrine disruptors, pesticides, and other contaminants from water and soils. The system exploits the large surface-to-volume ratio of nanoparticles, while the photoinduced precipitation ensures nanomaterials are not released in the environment. Image: Nicolas Bertrand Courtesy: MIT


Here’s a link to and a citation for the paper,

Nanoparticles with photoinduced precipitation for the extraction of pollutants from water and soil by Ferdinand Brandl, Nicolas Bertrand, Eliana Martins Lima & Robert Langer. Nature Communications 6, Article number: 7765 doi:10.1038/ncomms8765 Published 21 July 2015

This paper is open access.

Carbon sequestration and buckyballs (aka C60 or buckminsterfullerenes)

Sometime in the last few years I was asked about carbon sequestration (or carbon capture) and nanotechnology and had no answer for the question until now (drat!). A July 13, 2015 Rice University (Texas, US) news release (also on EurekAlert) describes some research into buckyballs and the possibility they could be used to confine greenhouse gases,

Rice University scientists are forging toward tunable carbon-capture materials with a new study that shows how chemical changes affect the abilities of enhanced buckyballs to confine greenhouse gases.

The lab of Rice chemist Andrew Barron found last year that carbon-60 molecules (aka buckyballs, discovered at Rice in the 1980s) gain the ability to sequester carbon dioxide when combined with a polymer known as polyethyleneimine (PEI).

Two critical questions – how and how well – are addressed in a new paper in the American Chemical Society journal Energy & Fuels.

The news release expands on the theme,

The amine-rich combination of C60 and PEI showed its potential in the previous study to capture emissions of carbon dioxide, a greenhouse gas, from such sources as industrial flue gases and natural-gas wells.

In the new study, the researchers found pyrolyzing the material – heating it in an oxygen-free environment – changes its chemical composition in ways that may someday be used to tune what the scientists call PEI-C60 for specific carbon-capture applications.

“One of the things we wanted to see is at what point, chemically, it converts from being something that absorbed best at high temperature to something that absorbed best at low temperature,” Barron said. “In other words, at what point does the chemistry change from one to the other?”

Lead author Enrico Andreoli pyrolyzed PEI-C60 in argon at various temperatures from 100 to 1,000 degrees Celsius (212 to 1,832 degrees Fahrenheit) and then evaluated each batch for carbon uptake.

He discovered the existence of a transition point at 200 C, a boundary between the material’s ability to soak in carbon dioxide through chemical means as opposed to physical absorption.

The material that was pyrolyzed at low temperatures became gooey and failed at pulling in carbon from high-temperature sources by chemical means. The opposite was true for PEI-C60 pyrolyzed at high heat. The now-porous, brittle material became better in low-temperature environments, physically soaking up carbon dioxide molecules.

At 200 C, they found the heat treatment breaks the polymer’s carbon-nitrogen bonds, leading to a drastic decrease in carbon capture by any means.

“One of the goals was to see if we can make this a little less gooey and still have chemical uptake, and the answer is, not really,” Barron said. “It flips from one process to the other. But this does give us a nice continuum of how to get from one to the other.”

Andreoli found that at its peak, untreated PEI-C60 absorbed more than a 10th of its weight in carbon dioxide at high temperatures (0.13 grams per gram of material at 90 C). Pyrolyzed PEI-C60 did nearly as well at low temperatures (0.12 grams at 25 C).
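Those uptake figures can be flipped around to estimate how much sorbent a single capture pass would need per tonne of CO2. This is my own back-of-envelope arithmetic on the reported numbers; a real process would regenerate and cycle the material rather than use it once:

```python
# Translating the reported uptakes (grams of CO2 per gram of sorbent) into the
# sorbent mass needed per tonne of CO2 captured in a single pass.
uptake_hot = 0.13   # g CO2 per g untreated PEI-C60 at 90 C (reported)
uptake_cold = 0.12  # g CO2 per g pyrolyzed PEI-C60 at 25 C (reported)

tonnes_sorbent_hot = 1.0 / uptake_hot    # tonnes of sorbent per tonne of CO2
tonnes_sorbent_cold = 1.0 / uptake_cold

print(f"{tonnes_sorbent_hot:.1f} t sorbent/t CO2 at 90 C")   # ~7.7
print(f"{tonnes_sorbent_cold:.1f} t sorbent/t CO2 at 25 C")  # ~8.3
```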

The researchers, with an eye on potential environmental benefits, continue to refine their process. “This has definitely pointed us in the right direction,” Barron said.

Here’s a link to and a citation for the paper,

Correlating Carbon Dioxide Capture and Chemical Changes in Pyrolyzed Polyethylenimine-C60 by Enrico Andreoli and Andrew R. Barron. Energy Fuels, Article ASAP DOI: 10.1021/acs.energyfuels.5b00778 Publication Date (Web): July 2, 2015

Copyright © 2015 American Chemical Society

This paper is behind a paywall.

Nanotechnology research protocols for Environment, Health and Safety Studies in US and a nanomedicine characterization laboratory in the European Union

I have two items relating to nanotechnology and the development of protocols. The first item concerns the launch of a new web portal by the US National Institute of Standards and Technology.

US National Institute of Standards and Technology (NIST)

From a July 1, 2015 news item on Azonano,

As engineered nanomaterials increasingly find their way into commercial products, researchers who study the potential environmental or health impacts of those materials face a growing challenge to accurately measure and characterize them. These challenges affect measurements of basic chemical and physical properties as well as toxicology assessments.

To help nano-EHS (Environment, Health and Safety) researchers navigate the often complex measurement issues, the National Institute of Standards and Technology (NIST) has launched a new website devoted to NIST-developed (or co-developed) and validated laboratory protocols for nano-EHS studies.

A July 1, 2015 NIST news release on EurekAlert, which originated the news item, offers more details about the information available through the web portal,

In common lab parlance, a “protocol” is a specific step-by-step procedure used to carry out a measurement or related activity, including all the chemicals and equipment required. Any peer-reviewed journal article reporting an experimental result has a “methods” section where the authors document their measurement protocol, but those descriptions are necessarily brief and condensed, and may lack validation of any sort. By comparison, on NIST’s new Protocols for Nano-EHS website the protocols are extraordinarily detailed. For ease of citation, they’re published individually–each with its own unique digital object identifier (DOI).

The protocols detail not only what you should do, but why and what could go wrong. The specificity is important, according to program director Debra Kaiser, because of the inherent difficulty of making reliable measurements of such small materials. “Often, if you do something seemingly trivial–use a different size pipette, for example–you get a different result. Our goal is to help people get data they can reproduce, data they can trust.”

A typical caution, for example, notes that if you’re using an instrument that measures the size of nanoparticles in a solution by how they scatter light, it’s important also to measure the transmission spectrum of the particles if they’re colored, because if they happen to absorb light strongly at the same frequency as your instrument, the result may be biased.

“These measurements are difficult because of the small size involved,” explains Kaiser. “Very few new instruments have been developed for this. People are adapting existing instruments and methods for the job, but often those instruments are being operated close to their limits and the methods were developed for chemicals or bulk materials and not for nanomaterials.”

“For example, NIST offers a reference material for measuring the size of gold nanoparticles in solution, and we report six different sizes depending on the instrument you use. We do it that way because different instruments sense different aspects of a nanoparticle’s dimensions. An electron microscope is telling you something different than a dynamic light scattering instrument, and the researcher needs to understand that.”

The nano-EHS protocols offered by the NIST site, Kaiser says, could form the basis for consensus-based, formal test methods such as those published by ASTM and ISO.

NIST’s nano-EHS protocol site currently lists 12 different protocols in three categories: sample preparation, physico-chemical measurements and toxicological measurements. More protocols will be added as they are validated and documented. Suggestions for additional protocols are welcome at

The next item concerns European nanomedicine.

CEA-LETI and Europe’s first nanomedicine characterization laboratory

A July 1, 2015 news item on Nanotechnology Now describes the partnership which has led to launch of the new laboratory,

CEA-Leti today announced the launch of the European Nano-Characterisation Laboratory (EU-NCL) funded by the European Union’s Horizon 2020 research and innovation programme[1]. Its main objective is to reach a level of international excellence in nanomedicine characterisation for medical indications like cancer, diabetes, inflammatory diseases or infections, and make it accessible to all organisations developing candidate nanomedicines prior to their submission to regulatory agencies to get the approval for clinical trials and, later, marketing authorization.

“As reported in the ETPN White Paper[2], there is a lack of infrastructure to support nanotechnology-based innovation in healthcare,” said Patrick Boisseau, head of business development in nanomedicine at CEA-Leti and chairman of the European Technology Platform Nanomedicine (ETPN). “Nanocharacterisation is the first bottleneck encountered by companies developing nanotherapeutics. The EU-NCL project is of most importance for the nanomedicine community, as it will contribute to the competitiveness of nanomedicine products and tools and facilitate regulation in Europe.”

EU-NCL is partnered with the sole international reference facility, the Nanotechnology Characterization Lab of the National Cancer Institute in the U.S. (US-NCL)[3], to get faster international harmonization of analytical protocols.

“We are excited to be part of this cooperative arrangement between Europe and the U.S.,” said Scott E. McNeil, director of U.S. NCL. “We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.”

A July 2, 2015 EMPA (Swiss Federal Laboratories for Materials Science and Technology) news release on EurekAlert provides more detail about the laboratory and the partnerships,

The «European Nanomedicine Characterization Laboratory» (EU-NCL), which was launched on 1 June 2015, has a clear-cut goal: to help bring more nanomedicine candidates into the clinic and on the market, for the benefit of patients and the European pharmaceutical industry. To achieve this, EU-NCL is partnered with the sole international reference facility, the «Nanotechnology Characterization Laboratory» (US-NCL) of the US-National Cancer Institute, to get faster international harmonization of analytical protocols. EU-NCL is also closely connected to national medicine agencies and the European Medicines Agency to continuously adapt its analytical services to requests of regulators. EU-NCL is designed, organized and operated according to the highest EU regulatory and quality standards. «We are excited to be part of this cooperative project between Europe and the U.S.,» says Scott E. McNeil, director of US-NCL. «We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.»

Nine partners from eight countries

EU-NCL, which is funded by the EU for a four-year period with nearly 5 million Euros, brings together nine partners from eight countries: CEA-Tech in Leti and Liten, France, the coordinator of the project; the Joint Research Centre of the European Commission in Ispra, Italy; European Research Services GmbH in Münster, Germany; Leidos Biomedical Research, Inc. in Frederick, USA; Trinity College in Dublin, Ireland; SINTEF in Oslo, Norway; the University of Liverpool in the UK; Empa, the Swiss Federal Laboratories for Materials Science and Technology in St. Gallen, Switzerland; Westfälische Wilhelms-Universität (WWU) and Gesellschaft für Bioanalytik, both in Münster, Germany. Together, the partnering institutions will provide a trans-disciplinary testing infrastructure covering a comprehensive set of preclinical characterization assays (physical, chemical, in vitro and in vivo biological testing), which will allow researchers to fully comprehend the biodistribution, metabolism, pharmacokinetics, safety profiles and immunological effects of their medicinal nano-products. The project will also foster the use and deployment of standard operating procedures (SOPs), benchmark materials and quality management for the preclinical characterization of medicinal nano-products. Yet another objective is to promote intersectoral and interdisciplinary communication among key drivers of innovation, especially between developers and regulatory agencies.

The goal: to bring safe and efficient nano-therapeutics faster to the patient

Within EU-NCL, six analytical facilities will offer transnational access to their existing analytical services for public and private developers, and will also develop new or improved analytical assays to keep EU-NCL at the cutting edge of nanomedicine characterization. A complementary set of networking activities will enable EU-NCL to deliver to European academic or industrial scientists the high-quality analytical services they require for accelerating the industrial development of their candidate nanomedicines. The Empa team of Peter Wick at the «Particles-Biology Interactions» lab will be in charge of the quality management of all analytical methods, a key task to guarantee the best possible reproducibility and comparability of the data between the various analytical labs within the consortium. «EU-NCL supports our research activities in developing innovative and safe nanomaterials for healthcare within an international network, which will actively shape future standards in nanomedicine and strengthen Empa as an enabler to facilitate the transfer of novel nanomedicines from bench to bedside», says Wick.

You can find more information about the laboratory on the Horizon 2020 (a European Union science funding programme) project page for the EU-NCL laboratory. For anyone curious about CEA-Leti, it’s a double-layered organization. CEA is the French Alternative Energies and Atomic Energy Commission (Commissariat à l’énergie atomique et aux énergies alternatives); you can go here to their French-language site (there is an English-language option clickable on the page). Leti, short for Laboratoire d’électronique et de technologie de l’information, is one of the CEA’s institutes and is known as either Leti or CEA-Leti. Here’s the Leti website (this is the English language version).