Tag Archives: Andrew Maynard

Phyto and nano soil remediation (part 2: nano)

For Part 2, I’ve included part of my original introduction (sans the story about the neighbour’s soil and a picture of Joe Martin):

I’m pleased to repost a couple of pieces on soil remediation written by Joe Martin for the Mind the Science Gap (MTSG) blog.

I wrote about the MTSG blog in my Jan. 12, 2012 posting, which focussed on this University of Michigan project designed by Dr. Andrew Maynard for Master’s students in the university’s Public Health program. Very briefly, here’s a description of Andrew and the program from the About page,

Mind the Science Gap is a science blog with a difference.  For ten weeks between January and April 2012, Masters of Public Health students from the University of Michigan will each be posting weekly articles as they learn how to translate complex science into something a broad audience can understand and appreciate.

Each week, ten students will take a recent scientific publication or emerging area of scientific interest, and write a post on it that is aimed at a non expert and non technical audience.  As the ten weeks progress, they will be encouraged to develop their own area of focus and their own style.

About the Instructor.  Andrew Maynard is Director of the University of Michigan Risk Science Center, and a Professor of Environmental Health Sciences in the School of Public Health.  He writes a regular blog on emerging technologies and societal implications at 2020science.org.

Here’s a bit more about Joe Martin,

I am a second year MPH student in Environmental Quality and Health, and after graduation from this program, I will pursue a Ph.D. in soil science.  My interests lie in soil science and chemistry, human health and how they interact, especially in regards to agricultural practice and productivity.

Here’s part 2, nano soil remediation, from Joe’s Feb. 10, 2012 posting:

Last week I wrote about phytoremediation, and its potential to help us combat and undo soil contamination. But, like any good advanced society, we’re not pinning all our hopes on a single technique. A commenter, Maryse, alerted me to the existence of another promising set of techniques and technologies: nano-remediation.

For those who don’t know, nano-technology is a science which concerns itself with manipulating matter on a very small scale. Nano-particles are commonly described as being between 1 nanometer (nm) and 100 nm in size, though this is hardly a hard and fast rule. (For perspective, a nanometer is one one-millionth of a millimeter. If you aren’t inclined to the metric system, there are roughly twenty-five million nanometers per inch.) At such small scales, the normal properties of compounds can be altered without changing the actual chemical composition. This allows for many new materials and products (such as Ross Nanotechnology’s NeverWet spray) and for new applications of common materials (using graphene to make the well-known carbon nanotubes).
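For anyone who wants to check that conversion themselves, here is a quick two-line sanity check in Python; it is only an illustrative aside of mine, not part of Joe’s original post, and uses nothing beyond the standard metric definitions:

    inch_in_nm = 0.0254 / 1e-9   # 1 inch = 0.0254 m, and 1 nm = 1e-9 m
    print(f"{inch_in_nm:,.0f}")  # 25,400,000 -- roughly twenty-five million nanometers per inch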

When we apply the use of nano-scale particles to the remediation of contaminated soil, we are using nano-remediation. Unlike phytoremediation, this actually encompasses several different strategies which can be broadly classed as adsorptive or reactive (Mueller and Nowack, 2010). The use of iron oxides to adsorb and immobilize metals and arsenic is not a new concept, but nano-particles offer new advantages. When I wrote “adsorb”, I was not making a spelling error; adsorption is a process by which particles adhere to the surface of another material, but do not penetrate into the interior. This makes surface area, not volume, the important characteristic. Nano-particles provide the maximum surface area-to-weight ratio, maximizing the adsorptive surfaces onto which these elements can attach. These adsorptive processes are very effective at binding and immobilizing metals and arsenic, but they do not allow for the removal of the toxic components. This may be less-than-ideal, but in places like Bangladesh, where arsenic contamination of groundwater poses major health risks, it may be just short of a miracle.
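The surface-area point can be made concrete with a rough back-of-the-envelope calculation for idealized spherical particles. This is only an illustrative sketch of mine (not from Joe’s post or the Mueller and Nowack paper), and the density value is an assumption, roughly that of an iron oxide such as magnetite:

    import math

    def surface_area_to_mass(diameter_nm, density_g_cm3=5.2):
        # Surface-area-to-mass ratio (m^2 per gram) for a solid sphere;
        # 5.2 g/cm^3 is roughly the density of magnetite, used only for illustration.
        r_m = (diameter_nm * 1e-9) / 2                             # radius in metres
        area_m2 = 4 * math.pi * r_m**2                             # sphere surface area
        mass_g = (4 / 3) * math.pi * r_m**3 * density_g_cm3 * 1e6  # 1 m^3 = 1e6 cm^3
        return area_m2 / mass_g

    for d in (10_000, 1_000, 100, 10):                             # 10 micrometres down to 10 nm
        print(f"{d:>6} nm particles: ~{surface_area_to_mass(d):.1f} m^2 of surface per gram")

The ratio works out to 6 divided by (density × diameter), so every tenfold reduction in particle size buys roughly a tenfold increase in adsorptive surface per gram of material.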

Reactive nano-remediation strategies focus on organic pollutants, and seem to work best for chlorinated solvents such as the infamous PCBs. Nano-scale zero valent iron, or nZVI, is the most widely explored and tested element used in these methods. The nZVI, or sometimes nZVI bound to various organic molecules like polysaccharides or protein chains, forces redox reactions which rapidly disassemble the offending molecules.

There are other advantages to these nano-molecular techniques aside from the efficiency with which they bind or destroy the offending pollutants. In reactive remediation, the hyper-reactivity of nZVI causes it to react with other common and natural substances, such as dissolved oxygen in ground water, or nitrate and sulfate molecules, and in the process this inactivates the nZVI. While this forces multiple applications of the nano-particle (delivered in slurry form, through an injection well), it also prevents unused iron from drifting out of the treatment zone and becoming a pollutant itself. For both adsorptive and reactive remediation techniques, the active nano-particles are injected into a well dug into or near the contaminated soil and/or groundwater. When injected as a slurry, the nano-particles can drift along with the flow of ground water, effectively creating an “anti-pollution” plume. In other formulations, the active mixture is made to flow less easily, effectively creating a barrier that filters spreading pollution or through which polluted ground water can be pulled.

There are health risks and concerns associated with the production and use of nano-particles, so some caution and validation is needed before they’re used everywhere. However, there have already been some successes with nano-remediation. The example of PCB remediation with nZVI is taken from the great success the US Air Force has had. (PCB contamination is a legacy of their use as fire-suppressants.) Beyond this, while nano-remediation has not been widely applied on surface or near-surface soils, it does enable remediation in deeper soils normally only accessed by “pump-and-treat” methods (which are expensive and can have decades-long time frames). When coupled with other techniques (like phytoremediation), it fits nicely into an expanding tool bag, one which we as a society and species can use to reverse our impact on the planet (and our own health).

Further Reading: There was no way for me to represent the full sum of nano-remediation, never mind nanotechnology, in this post. It has such potential, and is developing at such a rate, that the attention it deserves is better measured in blogs (or perhaps decablogs). So if you are interested in nano-technology or nano-remediation, click through some of the links below.

List of popular blogs: http://www.blogs.com/topten/10-popular-nanotechnology-blogs/, including some very important ones on the health risks of nano-technology.

A cool site listing sites currently using nano-remediation: http://www.nanotechproject.org/inventories/remediation_map/, and another post from the same site dealing with nano-remediation [PEN webcast on site remediation]: http://www.nanotechproject.org/events/archive/remediation/

An excellent peer-reviewed article: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2799454/

Citation: Mueller C and Nowack B. Nanoparticles for Remediation: Solving Big Problems with Little Particles. 2010. Elements, Vol. 6. pp 395-400.

You can read about the other MTSG contributors and find links to their work here.

I have mentioned remediation before on the blog,

soil remediation and Professor Dennis Carroll at the University of Western Ontario in my Nov. 4, 2011 posting

remediation and a patent for Green-nano zero valent iron (G-nZVI) in my June 17, 2011 posting

groundwater remediation and nano zero valent iron (nZVI) at the University of California at Santa Barbara in my March 30, 2011 posting

site remediation and drywall in my Aug. 2, 2010 posting

remediation technologies and oil spills in my May 6, 2010 posting

my March 4, 2010 posting (scroll down about 1/2 way) which is a commentary on the Project on Emerging Nanotechnologies (PEN) webcast about site remediation in Joe’s list of resources

Thank you, Joe, for giving me permission to repost your pieces. For more of Joe’s pieces, read his posts here.

Phyto and nano soil remediation (part 1: phyto/plant)

One of my parents’ neighbours was a lifelong vegetarian and organic gardener. The neighbour, a Dutchman, had been born on the island of Curaçao around 1900, and was gardening organically by the 1940s at the latest. He had wonderful soil and an extraordinary rose garden in the front yard and vegetables in the back, along with his compost heap. After he died in the 1980s, his granddaughter sold the property to a couple who immediately removed the roses, replaced them with grass in the front, and laid a good quantity of cement in the backyard. Those philistines sold the soil and, I imagine, the roses too.

Myself, I’m not a gardener but I have a strong appreciation for the necessity of good soil, so I’m pleased to repost a couple of pieces on soil remediation written by Joe Martin for the Mind the Science Gap (MTSG) blog. First, here’s a little bit about the MTSG blog project and about Joe Martin.

I wrote about the MTSG blog in my Jan. 12, 2012 posting, which focussed on this University of Michigan project designed by Dr. Andrew Maynard for Masters students in the university’s Public Health program. Very briefly, here’s a description of Andrew and the program from the About page,

Mind the Science Gap is a science blog with a difference.  For ten weeks between January and April 2012, Masters of Public Health students from the University of Michigan will each be posting weekly articles as they learn how to translate complex science into something a broad audience can understand and appreciate.

Each week, ten students will take a recent scientific publication or emerging area of scientific interest, and write a post on it that is aimed at a non expert and non technical audience.  As the ten weeks progress, they will be encouraged to develop their own area of focus and their own style.

About the Instructor.  Andrew Maynard is Director of the University of Michigan Risk Science Center, and a Professor of Environmental Health Sciences in the School of Public Health.  He writes a regular blog on emerging technologies and societal implications at 2020science.org.

As for Joe Martin,

I am a second year MPH student in Environmental Quality and Health, and after graduation from this program, I will pursue a Ph.D. in soil science.  My interests lie in soil science and chemistry, human health and how they interact, especially in regards to agricultural practice and productivity.

Here’s a picture,

Joe Martin, Masters of Public Health program, University of Michigan, MTSG blog

Joe gave an excellent description of nano soil remediation but I felt it would be remiss to not include the first part on phyto soil remediation. Here’s his Feb. 3, 2012 posting about plants and soil remediation:

Pictured: The Transcendent Reality of Life and the Universe.

Plants are awesome. It’s from them that we get most of our food. It’s from plants that many of our medicines originated (such as willow and aspirin). We raise the skeletons of our homes and furnish their interiors with trees. Most of our cloth is woven from plant fiber (a statement I feel comfortable making based solely on the sheer weight of denim consumed each year in this country). And although there is an entire world of water plants, all of the plants I listed above are grown in the soil*. How the individual soil particles cling to each other, how they hold water and nutrients, and how the soil provides shelter for the various macro- and micro-organisms is as important to the growth of plants as sunlight.

But no matter how proliferative, no matter how adaptive plants are, there are still spaces inaccessible to them. A clear example would be the Saharan dunes or a frozen tundra plain. However, many of the places where plants can’t survive are created by human activity. The exhaust of smelters provides one example – waste or escaped zinc, copper, cadmium, and lead infiltrate downwind soils and often exterminate many or most of the natural plants. Normal treatment options for remediating metal-contaminated soils are expensive, and can actually create hazards to human health. This is because, like some persistent organic pollutants (the infamous dioxin is a great example), the natural removal of metals from soils often proceeds very slowly, if it proceeds at all. For this reason, remediation of metal-contaminated soil often involves scraping the contaminated portion off and depositing it in a hazardous waste landfill. In cases of old or extensive pollution, the amount of soil can exceed thousands of cubic feet. In this process, contaminated dust can easily be stirred up, priming it to be inhaled by either the workers present or any local populations.

But it can be the cousins of the evicted shrubs and grasses which offer us the best option to undo the heavy metal pollution. In a process called phytoremediation, specific plants are deliberately seeded over the contaminated areas. These plants have been specifically chosen for their tendency to take up the metals in question. (In some cases, this process is also used for persistent organic pollutants, like 2,3,7,8-TCDD, infamously known as dioxin.) These plants are allowed to grow and develop their root systems, but are also selectively mowed to remove the pollutant-laden leaves and stems, and ultimately remove the contaminant from the soil system. Once the pollution level has descended to a sufficiently low level, the field may be left fallow. Otherwise, the remediating plants can be removed and the ground reseeded with natural plants or returned to agricultural, commercial, or residential use.

When it is applicable, phytoremediation offers a significant advantage over either restricted access (a common strategy which amounts to placing a fence around the contaminated site and keeping people out) or soil removal. While the polluted grass clippings must still be treated as hazardous waste, the volume and mass of the hazardous material are greatly reduced. Throughout the process, the remediating plants also serve to fix the soil in place, reducing or preventing runoff and free-blowing dust. Instead of bulldozers and many dump trucks, the equipment needed is reduced to a mower which captures grass or plant clippings and a single dump truck haul each growing season. Finally, the site does not need to be reinforced with topsoil from some other region to return it to useable space. These last few advantages can also greatly reduce the cost of remediation.

The major disadvantages of phytoremediation are time and complexity. Scraping the soil can be done in a few months or less, depending on the size of the area to be remediated. Phytoremediation takes multiple growing seasons, and if the land is a prime space for development this may be unacceptable. Phytoremediation also requires different plants for different pollutants or mixtures of pollutants. I chose the copper, zinc, lead, and cadmium mixture earlier in the article because a 2005 study (Herrero et al., 2005) specifically attempted to measure the ability of rapeseed and sunflower to extract these metals from an artificially contaminated soil. The unfortunate reality is that each contaminant will have to be studied in such a way, meticulously pairing pollutants (or mixtures of them) with a plant. Each of the selected plants must also be able to grow in the soil to be remediated. Regardless of the type of contamination, a North American prairie grass is unlikely to grow well in a Brazilian tropical soil. For these reasons, phytoremediation plans must be individually built for each site. This is costly both in dollars and man-hours. Furthermore, there is always the problem that some pollutants don’t respond well to phytoremediation. While copper, zinc, and cadmium have all been found to respond quite well to phytoremediation, lead does not appear to. In the Herrero et al. study, the plants accumulated lead, but did so in the roots. Unless the roots were dug up, this would not effectively remove the lead from the soil system. Unfortunately, lead is one of the most common heavy metal pollutants, at least in the U.S., a legacy of our former love for leaded gasoline and paint.

Despite these disadvantages, phytoremediation presents a unique opportunity to remove many pollutants. It is by far the least environmentally destructive, and in many cases may be the cheapest, method of remediation. I am happy to see that it appears to be receiving funding and is being actively researched and developed (for those who don’t pursue the reference, the Herrero article came from The International Journal of Phytoremediation). In recent times, we’ve been hit with messages about expanding hydrofracking and the Gulf oil spill, but perhaps I can send you into this weekend with a little positivity about our environmental future. The aggregated techniques and methods which can be termed “phytoremediation” have the potential to do much good at a lower cost than many other remediation techniques. That sounds like a win-win situation to me.

* I am aware that many of these crops can be grown aero- or hydroponically. While these systems do provide many foodstuffs, they are not near the level of soil grown crops, and can be comparatively very expensive. I chose not to discuss them because, well, I aspire to be a soil scientist.

1.) Herrero E, Lopez-Gonzalvez A, Ruiz M, Lucas Garcia J, and Barbas C. Uptake and Distribution of Zinc, Cadmium, Lead, and Copper in Brassica napus var. oleifera and Helianthus annuus Grown in Contaminated Soils. 2005. The International Journal of Phytoremediation. Vol. 5, pp. 153-167.

A note on photos: Any photos I use will be CC licensed. These particular photos are provided by Matthew Saunders (banana flower) and KPC (rapeseed) under an attribution, no commercial, no derivation license. I originally attempted to link to the source in the caption, but WordPress won’t let me for some reason. Until I work that out, the image home can be found under the artists’ names a few sentences earlier. I believe this honors the license and gives proper credit, but if I’ve committed some faux pas (which would not be a surprise), don’t hesitate to comment and correct me. And thanks to those who have done so in previous posts; it’s one of the best ways to learn.

Part 2: nano soil remediation follows.

For more of Joe’s pieces, read his posts here.

Davos, World Economic Forum, and risk

The World Economic Forum’s (WEF) annual meeting in Davos, Switzerland started today, Jan. 25, 2012 and runs until Jan. 29. From the WEF’s home page, here’s what they have to say about the theme for this year’s meeting,

The contextual change at the top of minds remains the rebalancing and deleveraging that is reshaping the global economy. In the near term, this transformation is seen in the context of how developed countries will deleverage without falling back into recession and how emerging countries will curb inflation and avoid future economic bubbles. In the long term, both will play out as the population of our interdependent world not only passes 7 billion but is also interconnected through information technology on a historic scale. The net result will be transformational changes in social values, resource needs and technological advances as never before. In either context, the necessary conceptual models do not exist from which to develop a systemic understanding of the great transformations taking place now and in the future.

It is hubris to frame this transition as a global “management” problem of integrating people, systems and technologies. It is an indisputable leadership challenge that ultimately requires new models, bold ideas and personal courage to ensure that this century improves the human condition rather than capping its potential. Thus, the Annual Meeting 2012 will convene under the theme, The Great Transformation: Shaping New Models, whereby leaders return to their core purpose of defining what the future should look like, aligning stakeholders around that vision and inspiring their institutions to realize that vision.

The meeting is a big deal with lots of important and/or prominent people expected to attend. I usually get my dose of WEF’s annual meeting (sometimes there’s some talk about nanotechnology) from Dr. Andrew Maynard, Director of the University of Michigan Risk Science Center and owner of the 2020 Science blog. I’m not sure if he’s attending this year but he has already profiled the WEF Global Risks 2012 Report in a Jan. 11, 2012 posting on his blog.

The World Economic Forum Global Risks Report is one of the most authoritative annual assessments of emerging issues surrounding risk currently produced. Now in its seventh edition, the 2012 report launched today draws on over 460 experts* from industry, government, academia and civil society to provide insight into 50 global risks across five categories, within a ten-year forward looking window.

As you would expect from such a major undertaking, the report has its limitations. There are some risk trends that maybe aren’t captured as well as they could be – chronic disease and pandemics are further down the list this year than I would have expected. And there are others that capture the headlining concerns of the moment – severe income disparity is the top-listed global risk in terms of likelihood.

Risks are addressed in five broad categories, covering economic, environmental, geopolitical, societal and technological risks. And cutting across these, the report considers three top-level issues under the headings Seeds of Dystopia (action or inaction that leads to fragility in states); How Safe are our Safeguards? (unintended consequences of over, under and unresponsive regulation); and The Dark Side of Connectivity (connectivity-induced vulnerability). These provide a strong framework for approaching the identified risks systemically, and teasing apart complex interactions that could lead to adverse consequences.

I’m always interested in ‘unintended consequences’. (When I worked as a frontline staff member for various bureaucracies, I was able to observe the ‘unintended consequences’ of policies devised by people who had no direct experience or had forgotten their experience.) So, I was quite interested to note these items in Andrew’s excerpts from the report,

Unintended consequences of nanotechnology. Following a trend seen in previous Global Risks reports, the unintended consequences of nanotechnology – while still flagged up – are toward the bottom of the risk spectrum. The potential toxicity of engineered nanomaterials is still mentioned as a concern. But most of the 50 risks addressed are rated as having a higher likelihood and/or impact.

Unintended consequences of new life science technologies. These are also relatively low on the list, but higher up the scale of concern than nanotechnologies. Specifically called out are the possibilities of genetic manipulation through synthetic biology leading to unintended consequences or biological weapons.

Unforeseen consequences of regulation. These are ranked relatively low in terms of likelihood and impact. But the broad significance of unintended consequences is highlighted in the report. These are also linked in with the potential impact and likelihood of global governance failure. Specifically, the report calls for

“A shift in mentality … so that policies, regulations or institutions can offer vital protection in a more agile and cohesive way.”

The report’s authors also ask how leaders can develop anticipatory and holistic approaches to system safeguards; how businesses and governments can prevent a breakdown of trust following the emergence of new risks; and how governments, business and civil society can work together to improve resilience against unforeseen risks.

Andrew has a lot more detail about the risks noted in the report, so I encourage you to read the post in its entirety. I was intrigued by this final passage with its emphasis on communication and trust,

The bottom line? The report concludes that

Decision-makers need to improve understanding of incentives that will improve collaboration in response to global risks;

Trust, or lack of trust, is perceived to be a crucial factor in how risks may manifest themselves. In particular, this refers to confidence, or lack thereof, in leaders, in systems which ensure public safety and in the tools of communication that are revolutionizing how we share and digest information; and

Communication and information sharing on risks must be improved by introducing greater transparency about uncertainty and conveying it to the public in a meaningful way.

One other comment, Andrew notes that he was ‘marginally involved’ (single quotes mine) in the report as a member of the World Economic Forum Agenda Council on Emerging Technologies.

Mind the Science Gap and mentoring

There’s a bunch of Master of Public Health students at the University of Michigan who want to communicate complex science to the public, and you’re invited. The Mind the Science Gap blog is a project of Dr. Andrew Maynard’s. The project is being presented as part of a course. Here’s a description of the course for the students (from the Syllabus webpage),

This course is designed to teach participants how to connect effectively with a non-expert audience when conveying complex science-based information that is relevant to public health, using the medium of a public science blog (http://mtsg.org).

In today’s data-rich and hyper-connected world, the gap between access to information and informed decision-making is widening.  It is a gap that threatens to undermine actions on public health as managers, policy makers, consumers and others struggle to fish relevant information from an ever-growing sea of noise.  And it is a gap that is flourishing in a world where anyone with a smart phone and an Internet connection can become an instant “expert”.

To bridge this gap, the next generation of public health professionals will need to be adept at working with new communication platforms, and skilled at translating “information” into “intelligence” for a broad audience. These skills will become increasingly relevant to communicating effectively with managers, clients and customers.  But more broadly, they will be critical to supporting evidence-informed decisions as social influences continue to guide public health activities within society.

Here’s a bit more about the blog itself and what the students will be doing (from the About page),

Mind the Science Gap is a science blog with a difference.  For ten weeks between January and April 2012, Masters of Public Health students from the University of Michigan will each be posting weekly articles as they learn how to translate complex science into something a broad audience can understand and appreciate.

Each week, ten students will take a recent scientific publication or emerging area of scientific interest, and write a post on it that is aimed at a non expert and non technical audience.  As the ten weeks progress, they will be encouraged to develop their own area of focus and their own style.

And they will be evaluated in the most brutal way possible – by the audience they are writing for!  As this is a public initiative, comments and critiques on each post will be encouraged, and author responses expected.

This is not a course on science blogging.  Rather, it is about teaching public health graduate students how to convey complex information effectively to a non-expert audience, using the medium of a science blog.

The blogging starts Jan. 16, 2012 and you are invited to participate. You can be a casual commenter, or you can join Andrew’s list of almost 40 mentors (people who’ve committed to commenting on the content at least once per week); he’s asking for more. BTW, I (Maryse de la Giroday) am on the list, as is Robyn Sussel, health and academic communicator and principal for Signals, a Vancouver-based communications and graphic design company. If you’re interested in signing up as a mentor, you can contact Andrew through this email address: [email protected]

You can also sign up for RSS feeds.

Dr. Andrew Maynard discusses the Health Canada nanomaterial definition

I have often referred to and linked to Andrew Maynard’s writing on nanotechnology issues and am pleased to note he has kindly answered some questions about the Health Canada Working Definition of Nanomaterial. Before launching into his responses, here’s a little more about him.

Dr. Andrew Maynard was originally trained as a physicist and graduated with a PhD from Cambridge, UK in 1993. He worked for a number of years for the UK Health and Safety Executive before moving to the US to work with the National Institute for Occupational Safety and Health, where he helped set up a nanotechnology safety programme post 2000 when the NNI was established. By 2005, he was employed at the Project on Emerging Nanotechnologies as their Chief Science Advisor. As of April 2010, he assumed responsibility as director of the Risk Science Center at the University of Michigan School of Public Health. He consults internationally on nanotechnology safety issues. He was a member of the expert panel consulted for the nanotechnology report, Small is Different: A Science Perspective on the Regulatory Challenges of Nanotechnology, published by the Council of Canadian Academies in 2008.

Since the 2008 report for the Council of Canadian Academies, Andrew has adopted a different approach to regulating nanotechnology, a change I first noted in an April 15, 2011 posting on the University of Michigan Risk Science Center blog. Excerpted from that posting,

Engineered nanomaterials present regulators with a conundrum – there is a gut feeling that these materials present a new regulatory challenge, yet the nature and resolution of this challenge remains elusive.  But as the debate over the regulation of nanomaterials continues, there are worrying signs that discussions are being driven less by the science of how these materials might cause harm, and more by the politics of confusion and uncertainty.

The genesis of the current dilemma is entirely understandable. Engineered nanomaterials are typically the product of nanotechnology – a technology that has been lauded as leading to designed materials with unique physical and chemical properties.   Intuitively it makes sense that these unique properties could lead to unique risks.  And indeed a rapidly growing body of research is indicating that many nanoscale materials behave differently to their non-nanoscale counterparts in biological environments. Logically, it seems to follow that engineered nanomaterials potentially present risks that depend on their scale, and should be regulated appropriately.

Yet the more we learn about how materials interact with biology, the less clear it becomes where the boundaries of this class of materials called “nanomaterials” lie, or even whether this is a legitimate class of material at all from a regulatory perspective.

I waffle somewhat, largely due to my respect for Andrew and his work and due to my belief that one needs to entertain new approaches for the emerging technologies, even when they make your brain hurt. (Before proceeding with Andrew’s comments, and for anyone who’s interested in my take, here it is: My thoughts on the Health Canada nanomaterial definition.)

In any event, here are Andrew’s responses to my questions,

  • I have warm feelings towards this definition, especially the elaboration where I think they avoided the problem of including naturally occurring nanoparticles (as per your comment about micelles in milk); and they specify a size range without being doctrinaire about it. How do you feel about it, given that you’re not in favour of definitions?

The problem is that, while the Health Canada definition is a valiant attempt to craft a definition based on the current state of science, it is still based on a premise – that size within a well-defined range is a robust indicator of novel risk – that is questionable.  Granted, they try to compensate for the limitations of this premise, but the result still smacks of trying to shoehorn the science into an assumption of what is important.

  • Do you see any pitfalls?

A large part of the problem here is an attempt to oversimplify a complex problem, without having a clear understanding of what the problem is in the first place.  Much of my current thinking – including questioning current approaches to developing definitions – revolves round trying to work out what the problem is before developing the solution.  But this makes commenting on the adequacy or inadequacy of definitions tricky, to say the least.

  • Is there anything you’d like to add?

My sincere apologies, I’ve just got to 5:00 PM on Sunday [Oct. 23, 2011] after working flat out all weekend, and am not sure I have the wherewithal to tackle this before collapsing in a heap.

I am hugely thankful that Dr. Maynard extended himself to answer my questions about the Health Canada definition of nanomaterial. To Andrew: a virtual bouquet of thanks made up of the most stunning flowers and scents you can imagine.

More on US National Nanotechnology Initiative (NNI) and EHS research strategy

In my Oct. 18, 2011 posting I noted that the US National Nanotechnology Initiative (NNI) would be holding a webinar on Oct. 20, 2011 to announce an environmental, health, and safety (EHS) research strategy for federal agencies participating in the NNI. I also noted that I was unable to register for the event. Thankfully, all is not lost. There are a couple of news items on Nanowerk which give some information about the research strategy. The first news item, U.S. government releases environmental, health, and safety research strategy for nanotechnology, from the NNI offers this,

The strategy identifies six core categories of research that together can contribute to the responsible development of nanotechnology: (1) Nanomaterial Measurement Infrastructure, (2) Human Exposure Assessment, (3) Human Health, (4) Environment, (5) Risk Assessment and Risk Management, and (6) Informatics and Modeling. The strategy also aims to address the various ethical, legal, and societal implications of this emerging technology. Notable elements of the 2011 NNI EHS Research Strategy include:

  • The critical role of informatics and predictive modeling in organizing the expanding nanotechnology EHS knowledge base;
  • Targeting and accelerating research through the prioritization of nanomaterials for research; the establishment of standardized measurements, terminology, and nomenclature; and the stratification of knowledge for different applications of risk assessment; and
  • Identification of best practices for the coordination and implementation of NNI interagency collaborations and industrial and international partnerships.

“The EHS Research Strategy provides guidance to all the Federal agencies that have been producing gold-standard scientific data for risk assessment and management, regulatory decision making, product use, research planning, and public outreach,” said Dr. Sally Tinkle, NNI EHS Coordinator and Deputy Director of the National Nanotechnology Coordination Office (NNCO), which coordinates activities of the 25 agencies that participate in the NNI. “This continues a trend in this Administration of increasing support for nanotechnology-related EHS research, as exemplified by new funding in 2011 from the Food and Drug Administration and the Consumer Product Safety Commission and increased funding from both the Environmental Protection Agency and the National Institute of Occupational Safety and Health within the Centers for Disease Control and Prevention.”

The other news item, Responsible development of nanotechnology: Maximizing results while minimizing risk, from Sally Tinkle, Deputy Director of the National Nanotechnology Coordination Office and Tof Carim, Assistant Director for Nanotechnology at OSTP (White House Office of Science and Technology Policy) adds this,

Core research areas addressed in the 2011 strategy include: nanomaterial measurement, human exposure assessment, human health, environment, risk assessment and management, and the new core area of predictive modeling and informatics. Also emphasized in this strategy is a more robust risk assessment component that incorporates product life cycle analysis and ethical, legal, and societal implications of nanotechnology. Most importantly, the strategy introduces principles for targeting and accelerating nanotechnology EHS research so that risk assessment and risk management decisions are based on sound science.

Progress in EHS research is occurring on many fronts as the NNI EHS research agencies have joined together to plan and fund research programs in core areas. For example, the Food and Drug Administration and National Institutes of Health have researched the safety of nanomaterials used in skin products like sunscreen; the Environmental Protection Agency and Consumer Product Safety Commission are monitoring the health and environmental impacts of products containing silver nanoparticles, and National Institute of Occupational Safety and Health has recommended safe handling guidelines for workers in industries and laboratories.

Erwin Gianchandani of the Computing Community Consortium blog focuses, not unnaturally, on the data aspect of the research strategy in his Oct. 20, 2011 posting titled, New Nanotechnology Strategy Touts Big Data, Modeling,

From the EHS Research Strategy:

Expanding informatics capabilities will aid development, analysis, organization, archiving, sharing, and use of data that is acquired in nanoEHS research projects… Effective management of reliable, high-quality data will also help support advanced modeling and simulation capabilities in support of future nanoEHS R&D and nanotechnology-related risk management.

Research needs highlighted span “Big Data”…

Data acquisition: Improvements in data reliability and reproducibility can be effected quickly by leveraging the widespread use of wireless and video-enabled devices by the public and by standards development organizations to capture protocol detail through videos…

Data analysis: The need for sensitivity analysis in conjunction with error and uncertainty analysis is urgent for hazard and exposure estimation and the rational design of nanomaterials… Collaborative efforts in nanomaterial design [will include] curation of datasets with known uncertainties and errors, the use of sensitivity analysis to predict changes in nanomaterial properties, and the development of computational models to augment and elucidate experimental data.

Data sharing: Improved data sharing is a crucial need to accelerate progress in nanoscience by removing the barriers presented by the current “siloed” data environment. Because data must be curated by those who have the most intimate knowledge of how it was obtained and analyzed and how it will be used, a central repository to facilitate sharing is not an optimal solution. However, federating database systems through common data elements would permit rapid semantic search and transparent sharing over all associated databases, while leaving control and curation of the data in the hands of the experts. The use of nanomaterial ontologies to define those data elements together with their computer-readable logical relationships can provide a semantic search capability.

…and predictive modeling:

Predictive models and simulations: The turnaround times for the development and validation of predictive models is measured in years. Pilot websites, applications, and tools should be added to the NCN [Network for Computational Nanotechnology] to speed collaborative code development among relevant modeling and simulation disciplines, including the risk modeling community. The infrastructure should provide for collaborative code development by public and private scientists, code validation exercises, feedback through interested user communities, and the transfer of validated versions to centers such as NanoHUB… Collaborative efforts could supplement nanomaterial characterization measurements to provide more complete sensitivity information and structure-property relationships.

Gianchandani’s post provides an unusual insight into the importance of data where research is concerned. I do recommend reading more of his posting.

Dr. Andrew Maynard posted a comparison of the original draft to the final report on his 2020 Science blog on Oct. 20, 2011,

Given the comments received, I was interested to see how much they had influenced the final strategy.  If you take the time to comment on a federal document, it’s always nice to know that someone has paid attention.  Unfortunately, it isn’t usual practice for the federal government to respond directly to public comments, so I had the arduous task of carrying out a side by side comparison of the draft, and today’s document.

As it turns out, there are extremely few differences between the draft and the final strategy, and even fewer of these alter the substance of the document.  Which means that, by and large, my assessment of the document at the beginning of the year still stands.

Perhaps the most significant changes were in chapter 6 – Risk Assessment and Risk Management Methods. The final strategy presents a substantially revised set of current research needs that more accurately and appropriately (in my opinion) reflect the current state of knowledge and uncertainty (page 66).  This is accompanied by an updated analysis of current projects (page 73), and additional text on page 77 stating

“Risk communication should also be appropriately tailored to the targeted audience. As a result, different approaches may be used to communicate risk(s) by Federal and state agencies, academia, and industry stakeholders with the goal of fostering the development of an effective risk management framework.”

Andrew examines the document further,

Comparing the final strategy to public comments from Günter Oberdörster [professor of Environmental Medicine at the University of Rochester in NY state] on the draft document. I decided to do this as Günter provided some of the most specific public comments, and because he is one of the most respected experts in the field.  The specificity of his comments also provided an indication of the extent to which they had been directly addressed in the final strategy.

Andrew’s post is well worth reading especially if you’ve ever made a submission to a public consultation held by your government.

The research strategy and other associated documents are now available for access and the webinar will be available for viewing at a later date. Go here.

As an aside, I was a little surprised that I was unable to register to view the webinar live (I wonder if I’ll encounter the same difficulties later). It’s the first time I’ve had a problem viewing any such event hosted by a US government agency.

The French and others weigh in on the European nanomaterials definition (included here)

The responses to the announcement of the nanomaterials definition for Europe are coming fast and furious now. A summary from L’Association de Veille et d’Information Civique sur les Enjeux des Nanosciences et des Nanotechnologies (L’Avicenn) is available in an Oct. 20, 2011 news item on Nanowerk (French language version is available here),

Avicenn offers a first insight into the politics hidden behind this supposedly neutral and “scientific” definition, the next obstacles and important meetings, and then concludes on the suspense surrounding the definition that France will finally adopt for the annual mandatory declaration of nanomaterials it is implementing.

In a self-applauding press release, the European Commission announced yesterday that it finally published “a clear definition (of nanomaterials) to ensure that the appropriate chemical safety rules apply”. Nanomaterial is defined as:

  • “a natural, incidental or manufactured material
  • containing particles, in an unbound state or as an aggregate or as an agglomerate
  • and where, for 50% or more of the particles in the number size distribution, one or more external dimensions is in the size range 1 nm – 100 nm.”

Here’s a list of the responding organizations (from the Oct. 20, 2011 news item on Nanowerk),

After the release of this new definition, the most active “stakeholders” have already formally responded: among them, on the side of CSOs, the European Environmental Bureau (EEB) – the federation of 140+ environmental organisations in 31 countries, Friends of the Earth Australia (FoE Australia), the Center for International Environmental Law (CIEL), the European Consumers’ Organisation (BEUC) and the European consumer voice in standardisation (ANEC); on the industrial side, the European Chemical Industry Council (CEFIC).

I posted European nanomaterials definition not good enough about the response from the European Environmental Bureau yesterday (Oct. 19, 2011). So this may seem mildly repetitive (from the English language translation on the Avicenn website),

  • The new 100 nm upper limit

Friends of the Earth Australia, ANEC and BEUC denounce the adoption of the 100 nm upper limit, which they consider too restrictive: these CSOs would have preferred a higher threshold limit that would have encompassed more materials. They point to the Scientific Committee on Emerging and Newly Identified Health Risks (SCENIHR)’s observation that there is no scientific basis for this 100 nm limit, and to the results of toxicology studies on the toxicity of submicron particles over 100 nm.
As illustrated by FoE Australia, “if this definition were applied to regulation, it would mean that where 45% of particles are 95nm in size and 55% particles are 105nm in size, substances would not be regulated as nano” – at the expense of consumers and workers exposed to these substances, over whom the threat of a risk that is assumed but not evaluated will therefore keep hanging.
In response to the EC consultation on its draft definition in 2010, many CSOs [civil society organizations] had argued for a threshold of 300 nm.
FoE Australia warns that “some European cosmetics companies and North American bioactive manufacturers are reformulating their products to exploit the novel optical, chemical and biological properties of larger nanomaterials (ie >100nm) while escaping the labelling and safety assessment requirements that were anticipated for materials 1-100nm in size”.

  • 50% threshold

Some organizations – including CIEL and ANEC – applaud the choice of particle number (i.e. the number of particles) rather than mass as the measuring unit for the size distribution of a nanomaterial product; in contrast, CEFIC (which had strongly advocated using weight concentration rather than particle number distribution to determine the cut-off criterion for nanomaterials) is concerned that the adoption of this definition will add unnecessary burden for companies, leading to added costs and less efficient use of resources. On this point, the Commission followed the recommendations of SCENIHR, which ANEC had particularly supported in 2010.
The Commission, however, largely raised the proportion of nano-sized particles required for a material to qualify as a nanomaterial compared to what was expected: the threshold of 50% or more of the particles in the number size distribution is 50 times higher than the 1% proposed by DG Environment and supported by civil society, and 333 times greater than the 0.15% recommended by SCENIHR and supported by DG Sanco.
CSOs have expressed their surprise, incomprehension and hostility to such a high threshold. For example, CIEL points out that even the German industry had not been so demanding: it had campaigned for a rate of “only” 10%. However, the Commission provided that “in specific cases and where warranted by concerns for the environment, health, safety or competitiveness the number size distribution threshold of 50 % may be replaced by a threshold between 1 and 50 %”. While CIEL and ClientEarth welcome this opportunity, FoE Australia deplores that it puts a huge burden of proof onto the CSOs to demonstrate not only that certain nanomaterials can cause harm but that they do so at a specific proportion of particles in a sample. Showing that some nanomaterials can cause damage is in itself already very difficult, given the uncertainties, the gaps in the safety science, the variability of nanomaterials and the lack of information about real-life exposure. But making the same demonstration by identifying the fraction of nanoparticles in a sample that causes such harm is even more difficult, actually well beyond current scientific knowledge.

  • The inclusion of aggregate and agglomerate

CIEL appreciates the inclusion of aggregate and agglomerate within the definition. CEFIC believes that this measure will make any European legislation on nanomaterials too restrictive.

The apparent technical nature of these debates and, ultimately, the arbitrary selection of thresholds illustrate the strong political dimension at work behind the decisions made by the EC: granted, the European authorities have had to make a decision based on “sound science” – backed by consultation of scientific experts – but in the end, they mainly had to come up with a trade-off between the conflicting interests of stakeholders.
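To make the arithmetic behind these threshold arguments a little more concrete, here’s a rough sketch of the number-based size test using FoE Australia’s 45%/55% example from above. This is my own illustration rather than anything from the Commission or Avicenn, and it ignores aggregates, agglomerates and the 1–50% derogation:

    def is_nanomaterial(diameters_nm, threshold=0.50):
        # Fraction of particles (by number, not by mass) with an external
        # dimension between 1 nm and 100 nm, compared against the threshold.
        in_range = sum(1 for d in diameters_nm if 1 <= d <= 100)
        return in_range / len(diameters_nm) >= threshold

    # FoE Australia's example: 45% of particles at 95 nm, 55% at 105 nm
    sample = [95] * 45 + [105] * 55
    print(is_nanomaterial(sample))                  # False under the 50% threshold
    print(is_nanomaterial(sample, threshold=0.01))  # True under the 1% threshold CSOs argued for

The same sample flips from ‘not nano’ to ‘nano’ depending solely on where the threshold is set, which is the crux of the disagreement.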

Here’s how they hope the French government will respond to all of this (from the English translation on the Avicenn website),

As far as France is concerned, it is not clear at present whether the decree on the annual declaration of “substances with nanoparticle status” will use the new definition of the European Commission. In its decree, the French government might try to maintain a broader definition than the one adopted by the Commission. CSOs are turning with hope towards the French choice, which will be determinant for the future: if the adopted definition is broader than that of the Commission, and therefore more in line with the precautionary principle, it could serve as an example and be followed in other countries.

For anyone who may not be familiar with some recent French nanotechnology history, in the spring of 2010 there were major nanotechnology protests in France during a series of public debates. You can read more about them in my Jan. 26, 2011 posting, Feb. 26, 2010 posting, and followup March 10, 2010 posting, which includes details about a French-language podcast with two Québec academics discussing the French protests.

This does clear up one question I had about European Commission (EC) jurisdictions and national jurisdictions. It seems that countries can choose to create their own definitions although I imagine they cannot be at cross-purposes with the EC definition.

On an almost final note, here’s Dexter Johnson (Nanoclast blog for the Institute of Electrical and Electronics Engineers [IEEE]) in his Oct. 19, 2011 posting,

The definition itself…well, I don’t see how it helps to narrow anything, which I understand to be one of the main purposes of definitions. It would seem that the nanoparticles that are given off when your car’s tires roll along the pavement are now up for regulatory policy (“Nanomaterial” means a natural, incidental or manufactured material containing particles…”). And due to the lack of distinction between “hard” and “soft” nanoparticles in the definition, Andrew Maynard points out that “someone needs to check the micelle size distribution in homogenized milk.”

So what is the fallout from this definition? It would seem to be somewhat less than had been anticipated earlier in the year when worries surrounded getting the definition just right because it would immediately dictate policy.

So basically they have created a class of materials that at the moment are not known to be intrinsically hazardous, but if someday they are, they now have a separate class for them. While some may see this as making some sense, it eludes me.

As for me, I think much depends on future implementations. After all, you can have the best system possible but if it’s being run by fools, you have a big problem. That said, I take Dexter’s point about establishing a class of materials ‘just in case there could be a problem’. I really must take another look at the Health Canada nanomaterials definition.

Note: I removed footnotes from the Avicenn material; these can easily be found by viewing either the Oct. 20, 2011 news item on Nanowerk or the material on the Avicenn site.

ETA Oct. 20, 2011 1500 hours: I forgot to include a link to the ANEC response in this Oct. 20, 2011 news item on Nanowerk.

European nanomaterials definition not good enough

The European Environmental Bureau (EEB) has released a statement about the definition of nanomaterials that has been adopted (mentioned in my Oct. 18, 2011 posting). From the Oct. 19, 2011 news item on Nanowerk,

The European Environmental Bureau (EEB) is deeply disappointed by the European Commission’s decision released yesterday to use a narrow definition for the term “nanomaterial”, indicating that industry lobbying has won over the Commission’s own scientific advisors. EEB did however welcome the fact that a recommendation was adopted and hopes this will clear the way for the EU to actually start regulating on this.

The EEB echoed one of Dr. Andrew Maynard’s concerns (here’s Andrew’s concern from my Oct. 18, 2011 posting),

The threshold of 50% of a material’s number distribution comprising of particles with one or more external dimension between 1 nm – 100 nm. This is a laudable attempt to handle materials comprised of particles of different sizes.  But it is unclear where the scientific basis for the 50% threshold lies, how this applies to aggregates and agglomerates, and how diameter is defined (there is no absolute measure of particle diameter – it depends on how it is defined and measured).

Here’s what the EEB had to say (from the Oct. 19, 2011 news item),

It is completely unclear from the Commission’s publication how the threshold was multiplied by 50 from the original 1% when scientists had in fact called for a 0.15% threshold.

One of Andrew’s commenters provides some insight (Note: It is quite technical) from the comments to Andrew’s Oct. 18, 2011 posting,

The 50% benchmark appears not to be arbitrary: SWNTs are p-FETs when exposed to oxygen and n-FETs otherwise. It has been proven possible to protect half of an SWNT from oxygen exposure, while exposing the other half to oxygen, so this control measure seems to one of flammability risk mitigation. (excerpted from LaVerne Poussaint,  October 18, 2011 at 5:34 pm)

I’ve included Poussaint’s comment as it provides what I consider a fascinating insight into just how complex this conversation can get.

Nanomaterials definition for Europe

After all the ‘Sturm und Drang’ of the last few months (my Sept. 8, 2011 posting summarizes some of the lively discussion), a nanomaterials definition for Europe has been adopted. It is the first ‘cross-cutting’ nanomaterials definition to date, according to the Oct. 18, 2011 news item on Nanowerk,

“Nanomaterials” are materials whose main constituents have a dimension of between 1 and 100 billionths of a metre, according to a Recommendation on the definition of nanomaterial (pdf) adopted by the European Commission today. The announcement marks an important step towards greater protection for citizens, clearly defining which materials need special treatment in specific legislation.

European Environment Commissioner Janez Potocnik said: “I am happy to say that the EU is the first to come forward with a cross-cutting designation of nanomaterials to be used for all regulatory purposes. We have come up with a solid definition based on scientific input and a broad consultation. Industry needs a clear coherent regulatory framework in this important economic sector, and consumers deserve accurate information about these substances. It is an important step towards addressing any possible risks for the environment and human health, while ensuring that this new technology can live up to its potential.”

As I understand it, ‘cross-cutting’ doesn’t refer to national boundaries so much as it refers to agency boundaries. Take, for example, the recent nanomaterial definition (my initial comments are in an Oct. 11, 2011 posting) adopted by Health Canada. It is applicable only to Health Canada’s jurisdictional responsibilities. Environment Canada uses a different definition.

As for the new European definition of nanomaterials, Dr. Andrew Maynard offers some interesting observations on his 2020 Science blog in an Oct. 18, 2011 posting (Note: Andrew favours an approach other than the one adopted by the European Commission and was an active participant in the lively discussion that took place),

1.  The inclusion of incidental and natural materials in the definition. The inference is that any product containing or associated with nanomaterials from any of these sources will potentially be regulated under this definition.  Strict enforcement of this definition would encompass many polymeric materials and most heterogeneous materials currently in use.  And the lack of distinction between “hard” and “soft” nanoparticles means that the definition applies to any substance containing small micelles or liposomes – someone needs to check the micelle size distribution in homogenized milk.

2.  The focus on unbound nanoparticles and their agglomerates and aggregates. This makes sense in terms of targeting materials with the greatest exposure potential.  But it may be hard to apply to complex nanostructured materials which nevertheless present unusual health and environmental risks – such as materials with biologically active structures that are not based on unbound nanoparticles (patterned surfaces, porous materials and nano-engineered micrometer-sized structures come to mind).

3.  The threshold of 50% of a material’s number distribution comprising of particles with one or more external dimension between 1 nm – 100 nm. This is a laudable attempt to handle materials comprised of particles of different sizes.  But it is unclear where the scientific basis for the 50% threshold lies, how this applies to aggregates and agglomerates, and how diameter is defined (there is no absolute measure of particle diameter – it depends on how it is defined and measured).

The desire to identify materials that require further action makes sense.  But I do worry that this definition is a significant move toward requiring industry action and providing consumer information in a way that creates concern and raises economic barriers, without protecting health (and possibly taking the focus off materials that could present unusual risks) – in the “do no harm” and “do good” stakes, it seems somewhat lacking.

Andrew does include the full text of the definition and more points of interest in his full posting. I’m very happy to see his comments as they give me some guidance as I get ready to review the Health Canada definition more closely.
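To make Andrew’s third point a little more concrete, here is a minimal sketch (my own illustration, not anything taken from the Commission’s recommendation or from Andrew’s posting) of how the 50% number-distribution threshold might be applied to a set of measured particle sizes. The function name and the particle measurements are invented for the example, and how ‘diameter’ is defined and measured is exactly the ambiguity Andrew flags.

# Rough Python illustration of the EC recommendation's 50% threshold:
# a material counts as a 'nanomaterial' if at least 50% of its particles,
# counted by number rather than by mass, have an external dimension
# between 1 nm and 100 nm.
def nanomaterial_by_number(diameters_nm, threshold=0.5):
    """diameters_nm: measured external dimensions, in nanometres, for a
    representative sample of individual particles (hypothetical data)."""
    in_range = sum(1 for d in diameters_nm if 1 <= d <= 100)
    fraction = in_range / len(diameters_nm)
    return fraction >= threshold, fraction

# Invented sample: 6 of the 10 particles fall between 1 nm and 100 nm,
# so 60% of the number distribution is in range and the material qualifies.
sample = [5, 20, 45, 60, 80, 95, 150, 300, 500, 2000]
print(nanomaterial_by_number(sample))  # (True, 0.6)

Counting by number is the crux: in the invented sample above, the single 2000 nm particle accounts for nearly all of the mass yet barely moves the number distribution, which is part of why the choice of threshold (the original 1%, the 0.15% scientists called for, or the adopted 50%) makes such a difference to which materials are captured.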

ETA Oct. 18, 2011 1500 hours: The European Commission’s Joint Research Centre (JRC) and the European Academies Science Advisory Council (EASAC) presented the findings of a joint report entitled “Impact of engineered nanomaterials on health: considerations for benefit-risk assessment” (pdf) at an event designed to coincide with the adoption of a definition for nanomaterials. The Oct. 18, 2011 news item on Nanowerk notes,

This fulfils one of the recommendations of the report, which was a call for a precise definition of nanomaterials.

ETA Oct. 18, 2011 1525 hours: I particularly appreciate Andrew’s dry comment about micelle and liposome distribution in milk at the end of his first point.

ETA: NanoWiki offers a roundup of responses in an Oct. 21, 2011 posting.

US National Nanotechnology Initiative holding EHS webinar

There’s an Oct. 15, 2011 news item on Nanowerk announcing the US National Nanotechnology Initiative’s Environmental, Health, and Safety webinar on research strategies.

Federal Agencies participating in the National Nanotechnology Initiative (NNI) are hosting a webinar to announce the release of the 2011 NNI Environmental, Health, and Safety (EHS) Research Strategy and to discuss the development of this document and its key focus areas. The webinar will be held October 20, 2011 from 12 noon until 12:45 p.m. [EDT].

The event will consist of an overview of the strategy’s development followed by comments from industrial, regulatory, and public health perspectives. Dr. John Howard, Nanotechnology Environmental and Health Implications (NEHI) Working Group Co-Chair, will serve as the moderator. Panelists include:

  • Dr. Treye Thomas, NEHI Working Group Co-Chair
  • Dr. Shaun Clancy, Evonik DeGussa Corporation
  • Dr. Janet Carter, Occupational Safety and Health Administration (OSHA)
  • Ms. Lynn Bergeson, Bergeson & Campbell

The webinar will also feature a 20-minute question-and-answer segment following the presentations. Questions may be submitted prior to the webinar to [email protected] beginning at noon (EDT) Wednesday, October 19, 2011 and will be accepted until the close of the webinar at 12:45 p.m. Thursday, October 20, 2011. [???]

I’m pretty sure that last bit describes the window for submitting questions rather than the webinar itself; I can’t imagine a webinar that lasts for 25 hours, at least not on this topic.
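For what it’s worth, a quick back-of-the-envelope check using the times quoted above (this little snippet is mine, purely to show the arithmetic) confirms where the roughly 25-hour figure comes from:

from datetime import datetime

# Times quoted in the Nanowerk item (EDT); the webinar itself runs 45 minutes.
questions_open = datetime(2011, 10, 19, 12, 0)    # noon, Wednesday, Oct. 19
questions_close = datetime(2011, 10, 20, 12, 45)  # 12:45 p.m., Thursday, Oct. 20
print(questions_close - questions_open)  # prints "1 day, 0:45:00", i.e. about 24.75 hours

So the 25 hours describes how long questions can be submitted, not how long anyone is expected to sit through a webinar.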

Registration is necessary to watch the webinar; I tried to register and failed each time. I think the problem is that I don’t have a US zip code. Usually I can fill in a Canadian postal code instead, but this system rejected every attempt. If you do have a US zip code, you can register here.

In preparation for this webinar about EHS research strategies to be undertaken by US federal agencies, Dr. Andrew Maynard has summarized some of the public comments about the key recommendations in the draft version, which was published in December 2010. Excerpted from Andrew’s Oct. 15, 2011 posting,

Bill Kojola

An integrated and linked research effort to assess, via epidemiological studies, the impact of exposure to engineered nanomaterials on human health and any necessary resultant risk assessment/management responses seems to be missing from the strategy.

Andrew Maynard

…what would it take to craft a federal strategy that enabled agencies to work together more effectively in ensuring the safe use of nanomaterials?  I’m not sure that this is entirely possible – an internal strategy will always be constrained by the system in ways that an externally-crafted strategy isn’t.  But I do think that there are three areas in particular that could be built on here:

  1. Principles. The idea of establishing principles to which agencies sign up to is a powerful one, and could be extended further.  For instance, they could include a commitment to working closely and cooperatively with other agencies, to working toward a common set of aims, and to critically reviewing progress towards these aims on a regular basis.
  2. Accountability. The implementation and coordination framework set out in chapter 8 of the draft strategy contains a number of items that, with a bit of work, some group within the federal government could be held accountable to.  Formally, the NNCO would seem to be the most appropriate organization to be held responsible for progress here.  With accountability for actions that support the implementation and coordination of the strategy, a basis could be built for an actionable strategy, rather than wishful thinking.
  3. Innovation. So often in documents like this, there is a sense of defeatism – “this is the system, and there’s nothing we can do to change it”.  Yet there are always innovative ways to circumvent institutional barriers in order to achieve specific ends.  I would strongly encourage the NEHI to start from the question “where [do] we want to go, and how are we going to get there”, rather than “what are we allowed to do”, and from this starting point explore innovative ways of making substantive and measurable progress towards the stated mission of the strategy.  Just one possibility here is to use the model of the Signature Initiatives being developed elsewhere within the NNI – which overcome institutional barriers to encourage agencies to focus on a common challenge.  Something similar to a Signature Initiative focused on predictive modeling, or personal exposure measurement, or nanomaterial characterization, could enable highly coordinated and integrated cross-agency programs that accelerate progress toward specific goals.  But this is just one possibility – there are surely many more ways of getting round the system!

John DiLoreto, The Nanotechnology Coalition

A core mission of the NNI is to foster “technological advancements that benefit society” (Draft NNI 2011 Environmental, Health, and Safety Strategy, page 1). The NNI strategy provides valuable help in identifying key research areas and, in some cases, providing the necessary funding to conduct the research itself. The Coalition believes that to fulfill its mission in this regard, the NNI could and should direct its considerable influence and resources to educating regulatory and other officials in positions of influence about nanotechnology so they can better fulfill their responsibilities to protect the safety of consumers. The EHS research strategy should also examine ways that science-based safety information can be shared with regulatory officials and others in leadership positions and provide scientific resources to assist these officials in understanding what a ‘nanomaterial’ is and help create a better understanding of properties that may impact safety.

David Berube

Section 6, p. 56, line 23/25/26/30 – 23 conflates translation with risk communication (they are different). 25 “approaches” is unclear and should reference levels of acceptable caution. 26 high uncertainty may demand whole new algorithms – your assumption whether risk communication and risk management can be integrated is incorrect. 30 is a good point to discuss the conflation of translation which occurs between parties within similar ranges of understanding and public perception (NGOs) as well as perception of public perception (legislators). Each of these subset publics have different needs and interests and standardization of terminology is hardly sufficient to the task at hand.

p. 57 line 4 – see above and consider we might need to develop algorithms appropriate to different levels of certainty. The assumption the answer to uncertainty is more certainty is not necessarily valid for all publics. The simplified version in the document seems more attuned to strategic communication involving response strategies for different risks and certainty values involving variables like plausibility, phenomenon specificity, exigence, salience, etc.

p. 63 lines 34/37 34 (see above). 37 one model does not fit all. 38 link to trust is very complex and complicated by new/digital media sources as well as new credibility (social media) and reliability.

p. 58 lines 1/5/11/27 (see above) and this demands information sharing and transparency as well as answering how data is defined, who decides what is relevant data, how it is generated, how data is compiled and concatenated, how data is vetted and debunked, and how data is revised. 5 two ways is overly simplistic, try interactional. 11 this is a model issue and we do not have a model for high uncertainty. 27 assumes risk communication is a function of data, esp. scientific data and for many publics that is not true.

p. 76 – Explanation SP objective 4.2 re: needs of the stakeholders – it might be prudent to ask them what their needs are.

Samantha Dozier, PETA

A complete, step-wise method for rigorous characterization is imperative so that measurement is not questioned and studies are not repeated. A clear requirement for nanomaterial characterization will help eliminate redundancy and imprecise data-gathering and will aid in reducing animal use for the field.

For human health effects assessment, the NNI should promote the development of a tiered, weight-of-evidence approach that is based on the most relevant methods available and encourages the NNI to support the incorporation of appropriate in vitro human-relevant cell and tissue assays for all endpoints, instead of relying on inadequately modified, non-validated animal assays. This tiered approach should start with an initial characterization of the nanomaterial, followed by in vitro basal cell and portal-of-entry toxicity assessments according to human exposure potential and a full characterization of the toxicokinetic potential.

There’s a lot more in Andrew’s posting. Seeing how substantive these public comments are, it saddens me even more that Health Canada did not make the submissions to its public consultation on the “Policy Statement on Health Canada’s Working Definition for Nanomaterials” available for viewing (my Oct. 11, 2011 posting).