
Incremental regulation and nanotechnology

I think today will be the end of this series. So, for the last time, the article is ‘Risk Management Principles for Nanotechnology’ by Gary E. Marchant, Douglas J. Sylvester, and Kenneth W. Abbott in Nanoethics, 2008, vol. 2, pp. 43-60.

The authors contend that the regulatory model proposed by Ayres and Braithwaite (discussed in yesterday's post) is not sufficiently flexible to accommodate nanotechnology because that model assumes

“a fully developed regulatory system that can effectively manage a particular set of risks. … advanced nations with highly developed legal systems in which legislatures and agencies can create, communicate, and utilize a range of regulatory options. … high levels of information and understanding on the part of regulators.” (p. 52)

In turn, the authors propose a refinement of the Ayres/Braithwaite model, ‘Incremental Regulation’, which they describe using an example from the US Environmental Protection Agency (EPA):

The EPA Nanomaterials Stewardship Program reflects precisely the approach we espouse here: begin with information gathering and assessment, encourage experiments with self-regulation and multi-stakeholder norms, move gradually to greater governmental involvement to standardize, scale up and supervise voluntary programs, perform all the steps with high levels of transparency and participation, and over time build up a regulatory end state that retains the best of these voluntary mechanisms … along with formal regulation …, as required. (p. 57)

This seems more like a plea to ‘go slow’ rather than rush into regulating before you understand the implications. The approach seems reasonable enough. Of course, implementing these ideas is always where the stumbling blocks appear. I’ve worked in enough jobs where I’ve had to invoke policy in situations the policy makers never envisioned, because [1] they had no practical experience and [2] it’s impossible to create policies that cover every single contingency. That’s kind of a big problem with nanotechnology: none of us has much practical experience with it, and I think the question that hasn’t been addressed is whether or not we are willing to take chances. Then we need to figure out what kinds of chances, for how long, and who will be taking them. More soon.

Inspiration for a new approach to risk regulation for nanotechnology

I’m getting into the home stretch now on the ‘Risk Management Principles for Nanotechnology’ article. After dealing with the ‘classic’ risk principles and the newer precautionary principles, the authors (Marchant, Sylvester, and Abbott) unveil a theory behind their proposed ‘new principles’. The theory is based on work by I. Ayres and J. Braithwaite on something they call ‘Responsive Regulation’. Briefly, they suggest avoiding the regulation/deregulation debate in favour of a flexible regulatory approach where a range of strategies is employed.

With this tool kit [range of strategies] in hand, regulators can play a tit-for-tat strategy: they allow firms to self-regulate so long as the firms reciprocate with responsible action; if instead some firms act opportunistically, regulators respond to the defectors with appropriate penalties and more stringent regulation. (p. 52, Nanoethics, 2008, vol. 2, pp. 43-60)

There are some difficulties associated with this approach, but those are being saved for my next posting in this series.
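For readers who like to see the mechanics spelled out, here is a minimal toy sketch in Python of the tit-for-tat idea. It is my own illustration, not anything from the Ayres/Braithwaite work or the Marchant paper, and it assumes the simplest possible case: a regulator who only reacts to a firm's most recent behaviour.

# Toy model of a "responsive" regulator playing tit-for-tat with a firm.
# The regulator starts by trusting the firm to self-regulate and escalates
# to penalties only after the firm defects, relaxing again once it cooperates.

def regulator_stance(last_move):
    """Choose the regulator's next stance from the firm's last move."""
    if last_move == "cooperate":
        return "self-regulation"   # reciprocate responsible behaviour
    return "penalty"               # respond to opportunism with stricter regulation

firm_moves = ["cooperate", "cooperate", "defect", "defect", "cooperate"]
stance = "self-regulation"         # initial trust before any track record exists
for move in firm_moves:
    print(f"firm: {move:10s} -> regulator: {stance}")
    stance = regulator_stance(move)
print(f"final regulator stance: {stance}")

Obviously the real proposal involves a whole range of intermediate strategies rather than a binary choice between trust and penalty, but the reciprocity logic is the same.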

The Project on Emerging Nanotechnologies has two events coming up. ‘Synthetic Biology: Is Ethics a Showstopper?’ takes place on Thursday, January 8, 2009 from 12:30 pm – 1:30 pm (EST). For information on the location (you have to RSVP) or how to attend via webcast (no RSVP required), check here. The other event is called ‘Nanotech and Your Daily Vitamins: Barriers to Effective FDA Regulation of Nanotechnology-Based Dietary Supplements’ and will be held on Thursday, January 15 (?) from 9:30 am – 10:30 am (EST). The date listed on their website and in their invitation is January 14, which is incorrect. I imagine they’ll correct either the day or the date soon. For more details about the event itself, the physical location (if you’re planning to go, please RSVP), or the webcast directions (RSVP not required), please check here.

The availability heuristic and the perception of risk

It’s taking a lot longer to go through the ‘Risk Management Principles for Nanotechnology’ article than I expected. But let’s move onwards. ‘Availability’ is the other main heuristic used when trying to understand how people perceive risk. This one is about how we assess the likelihood of one or more risks.

According to researchers, individuals who can easily recall a memory specific to a given harm are predisposed to overestimating the probability of its recurrence, compared to other more likely harms to which no memory is attached. (p. 49, Nanoethics, 2008, vol. 2)

This memory extends beyond your personal experience (although that remains the most powerful) all the way to reading or hearing about an incident. The effect can also be exacerbated by imagery and social reinforcement. Probably the most powerful recent example would be ‘frankenfoods’. We read about the cloning of Dolly the sheep (who died prematurely after her cloned ‘birth’), there was the ‘stem cell debate’, and ‘mad cow disease’, all of which somehow got mixed together in a debate on genetically modified food that evolved into a discussion about biotechnology in general. The whole thing was summed up as ‘frankenfood’, a term which fused a very popular icon of science gone mad, Frankenstein, with the food we put in our mouths. (Note: it is a little more complicated than that but I’m not in the mood to write a long paper or dissertation where every nuance and development is discussed.) It was propelled by the media, and activists had one of their most successful campaigns.

Getting back to ‘availability’: it is a very powerful heuristic to use when trying to understand how people perceive risk.

The thing with ‘frankenfoods’ is that it wasn’t planned. Susan Tyler Hitchcock, in her book ‘Frankenstein: A Cultural History’ (2007), traces the birth of the term from a 1992 letter written by Paul Lewis to the New York Times through to its use as a clarion cry for activists, the media, and a newly worried public. Lewis coined the phrase, and one infers from the book that it was done casually. The phrase was picked up by other media outlets and other activists (Lewis is both a professor and an activist). For the full story, check out Hitchcock’s book, pp. 288-294.

I have heard the ETC Group credited with driving the ‘frankenfoods’ debate and pushing the activist agenda. While they may have been active in the debate, I have not been able to find any documentation to support the contention that the ETC Group made it happen. (Please let me know if you have found something.)

The authors (Marchant, Sylvester, and Abbott) of this risk management paper feel that nanotechnology is vulnerable to the same sort of cascading effects that the ‘availability’ heuristic provides a framework for understanding. Coming next, a ‘new’ risk management model.

The affect heuristic and risk management principles

Continuing still with the article by Marchant, Sylvester, and Abbott (‘Risk Management Principles for Nanotechnology’), but first a comment about the report released yesterday by the US National Research Council. I haven’t had a chance to look at it, but the report coverage points to agreement among a surprising set of stakeholders to the effect that there is no appropriate governance (regulation) of nanotechnology. The stakeholders include scientists, industry heavyweights such as BASF and DuPont, as well as not-for-profit organizations (the American Chemistry Council and the Project on Emerging Nanotechnologies). They didn’t mention any activist groups in the materials I’ve seen, but I can’t imagine any disagreement from those quarters.

It’s intriguing that this panel report from the US National Research Council has been released the same week that Nature Nanotechnology published data from the Cultural Cognition Project at Yale Law School warning about possible ‘culture wars’, along with Dietram Scheufele’s latest findings about the impact religion might have on the adoption of nanotechnology. It’s possible that someone is masterminding all of this, but I think there’s a more likely scenario. Most of the people involved know each other because there’s a loose network. They are concerned about the potential for problems, and when they talk to each other they find out about each other’s projects and encourage them. At some point they may have decided that it would be a good tactic to release reports and publish in Nature Nanotechnology at roughly the same time; if so, they got lucky and the various media cooperated unknowingly with this impromptu plan. Alternatively, nobody talked to anyone about these various projects and they simply got lucky. What I don’t believe is that they developed some master plan and carried it out.

On to heuristics. As I understand the word, it means something like guidelines or rules of thumb. In this paper, the authors discuss two specific heuristics that relate to risk perception. (If you’re going to manage risk, you need to understand how it’s perceived.)

Where nanotechnology is concerned, ‘affect’ is considered to be an important heuristic when examining the public’s perception of risk. (Affect is how you feel about something.) Here’s something interesting from the paper:

… numerous studies have shown that where individuals believe a technology has high benefits, they automatically believe its risks are low. This negative correlation has been shown to affect both lay and expert opinions, and is robust even in the face of countervailing evidence. … In short, how individuals feel about a particular stimulus directs how they perceive its dangers or benefits. (p. 48)

What fascinates me is that your assessment of the topic, be it expert or amateur, is still heavily affected by whether or not you believe the technology is beneficial, even when evidence suggests that the dangers are huge.

There’s more about ‘affect’ in the article; if you’re interested, get the journal Nanoethics, 2008, vol. 2, pp. 43-60. Meanwhile, there’s another heuristic that the authors use to build their case for a new risk management principle. That other heuristic is ‘availability’, and there will be more about it tomorrow.