Nanotechnology takes the big data dive

Duke University’s (North Carolina, US) Center for the Environmental Implications of NanoTechnology (CEINT) is back in the news. An August 18, 2015 news item on Nanotechnology Now highlights two new projects intended to launch the field of nanoinformatics,

In two new studies, researchers from across the country spearheaded by Duke University faculty have begun to design the framework on which to build the emerging field of nanoinformatics.

An August 18, 2015 Duke University news release on EurekAlert, which originated the news item, describes the notion of nanoinformatics and how Duke is playing a key role in establishing this field,

Nanoinformatics is, as the name implies, the combination of nanoscale research and informatics. It attempts to determine which information is relevant to the field and then develop effective ways to collect, validate, store, share, analyze, model and apply that information — with the ultimate goal of helping scientists gain new insights into human health, the environment and more.

In the first paper, published on August 10, 2015, in the Beilstein Journal of Nanotechnology, researchers begin the conversation about how to standardize the way nanotechnology data are curated.

Because the field is young and yet extremely diverse, data are collected and reported in different ways in different studies, making it difficult to compare apples to apples. Silver nanoparticles in a Florida swamp could behave entirely differently if studied in the Amazon River. And even if two studies are both looking at their effects in humans, slight variations like body temperature, blood pH levels or nanoparticles only a few nanometers larger can give different results. For future studies to combine multiple datasets to explore more complex questions, researchers must agree on what they need to know when curating nanomaterial data.

“We chose curation as the focus of this first paper because there are so many disparate efforts that are all over the road in terms of their missions, and the only thing they all have in common is that somehow they have to enter data into their resources,” said Christine Hendren, a research scientist at Duke and executive director of the Center for the Environmental Implications of NanoTechnology (CEINT). “So we chose that as the kernel of this effort to be as broad as possible in defining a baseline for the nanoinformatics community.”

The paper is the first in a series of six that will explore what people mean — their vocabulary, definitions, assumptions, research environments, etc. — when they talk about gathering data on nanomaterials in digital form. And to get everyone on the same page, the researchers are seeking input from all stakeholders, including those conducting basic research, studying environmental implications, harnessing nanomaterial properties for applications, developing products and writing government regulations.

The daunting task is being undertaken by the Nanomaterial Data Curation Initiative (NDCI), a project of the National Cancer Informatics Program Nanotechnology Working Group (NCIP NanoWG) led by a diverse team of nanomaterial data stakeholders. If successful, not only will these disparate interests be able to combine their data, the project will highlight what data are missing and help drive the research priorities of the field.

In the second paper, published on July 16, 2015, in Science of The Total Environment, Hendren and her colleagues at CEINT propose a new, standardized way of studying the properties of nanomaterials.

“If we’re going to move the field forward, we have to be able to agree on what measurements are going to be useful, which systems they should be measured in and what data gets reported, so that we can make comparisons,” said Hendren.

The proposed strategy uses functional assays — relatively simple tests carried out in standardized, well-described environments — to measure nanomaterial behavior in actual systems.

For some time, the nanomaterial research community has been trying to use measured nanomaterial properties to predict outcomes. For example, what size and composition of a nanoparticle is most likely to cause cancer? The problem, argues Mark Wiesner, director of CEINT, is that this question is far too complex to answer.

“Environmental researchers use a parameter called biological oxygen demand to predict how much oxygen a body of water needs to support its ecosystem,” explains Wiesner. “What we’re basically trying to do with nanomaterials is the equivalent of trying to predict the oxygen level in a lake by taking an inventory of every living organism, mathematically map all of their living mechanisms and interactions, add up all of the oxygen each would take, and use that number as an estimate. But that’s obviously ridiculous and impossible. So instead, you take a jar of water, shake it up, see how much oxygen is taken and extrapolate that. Our functional assay paper is saying do that for nanomaterials.”

The paper makes suggestions as to what nanomaterials’ “jar of water” should be. It identifies what parameters should be noted when studying a specific environmental system, like digestive fluids or wastewater, so that they can be compared down the road.

It also suggests two meaningful processes for nanoparticles that should be measured by functional assays: attachment efficiency (does it stick to surfaces or not?) and dissolution rate (does it release ions?).

In describing how a nanoinformatics approach informs the implementation of a functional assay testing strategy, Hendren said, “We’re trying to anticipate what we want to ask the data down the road. If we’re banking all of this comparable data while doing our near-term research projects, we should eventually be able to support more mechanistic investigations to make predictions about how untested nanomaterials will behave in a given scenario.”

Here are links to and citations for the papers,

The Nanomaterial Data Curation Initiative: A collaborative approach to assessing, evaluating, and advancing the state of the field by Christine Ogilvie Hendren, Christina M. Powers, Mark D. Hoover, and Stacey L. Harper. Beilstein J. Nanotechnol. 2015, 6, 1752–1762. doi:10.3762/bjnano.6.179 Published 18 Aug 2015

A functional assay-based strategy for nanomaterial risk forecasting by Christine Ogilvie Hendren, Gregory V. Lowry, Jason M. Unrine, and Mark R. Wiesner. Science of The Total Environment, Available online 16 July 2015, In Press, Corrected Proof. doi:10.1016/j.scitotenv.2015.06.100

The first paper listed is open access while the second paper is behind a paywall.

I’m (mostly) giving the final comments to Dexter Johnson, who, in an August 20, 2015 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website), had this to say (Note: Links have been removed),

It can take days for a supercomputer to unravel all the data contained in a single human genome. So it wasn’t long after mapping the first human genome that researchers coined the umbrella term “bioinformatics” in which a variety of methods and computer technologies are used for organizing and analyzing all that data.

Now teams of researchers led by scientists at Duke University believe that the field of nanotechnology has reached a critical mass of data and that a new field needs to be established, dubbed “nanoinformatics.”

While being able to better organize and analyze data to study the impact of nanomaterials on the environment should benefit the field, what seems to remain a more pressing concern is having the tools for measuring nanomaterials outside of a vacuum and in water and air environments.

I gather Christine Hendren has succeeded Mark Wiesner as CEINT’s executive director.