I was sick for a while there, but now I’m back. The article I’ve been talking about is “Risk Management Principles for Nanotechnology” by Gary E. Marchant, Douglas J. Sylvester and Kenneth W. Abbott. The precautionary principle, according to the article, “is often summarized by the phrase ‘better safe than sorry’.” In other words, if there’s a possibility that something bad will happen, don’t do it. As you might expect, this seems like a problematic principle to implement. Do you sit around imagining disaster scenarios, or do you tell yourself everything will be fine? How do you determine the level of possible risk?
One of the reasons I was so interested in the event that the Project on Emerging Nanotechnologies had organized with L’Oréal (the cosmetics firm) was that the company representative was going to discuss how the company implements the precautionary principle when developing and selling its nanotechnology-based cosmetics. Unfortunately, that event has yet to be rescheduled.
The subject of risk is quite topical right now thanks to an article from the folks at Yale Law School’s Cultural Cognition Project (in cooperation with the Project on Emerging Nanotechnologies) that’s just been published in Nature Nanotechnology and which apparently predicts ‘culture wars’. (I read an earlier version of the work online and cited it in a presentation for the 2008 Cascadia Nanotechnology Symposium.) The major thrust of the Yale work is that people weigh the benefits and risks of an emerging technology (in this case, nanotechnology) according to their cultural values. The researchers used anthropologist Mary Douglas’s two cross-cutting dimensions of culture to explain what they mean by culture: one axis runs from hierarchy to egalitarianism, and the other from individualism to communitarianism. One of the findings in the paper is that it doesn’t matter how much information you receive (this relates to the notion of science literacy, i.e., the idea that if you educate people about a technology they will come to accept it and its attendant risks), since your opinion of the technology is more strongly influenced by your cultural values as measured on those two axes. I think at least some of this work is a response to the city of Berkeley’s law regulating nanotechnology research; that legislation was passed unusually quickly and, I believe, was the first of its kind in the US.
Published concurrently in Nature Nanotechnology with the ‘culture wars’ article is a piece by Dietram Scheufele in which he discusses how ‘religion’ or ‘values’ affect attitudes towards nanotechnology. I think it’s based on some of the material he presented at the 2007 American Association for the Advancement of Science annual meeting.