Monthly Archives: December 2008

Sporty nano in Vancouver, Canada

As soon as I saw the title I knew it had to be a nanotech product. “The new ‘no sweat’ science” was an article in the Sunday (Dec. 21, 2008) edition of The Province daily paper. A local company, Firstar Sports (based in Surrey), makes a shirt that wicks away your sweat and never smells. The current CEO, Keith Gracey, wore the shirt over a period of months for his workouts and never washed it. Plus, he never had any complaints about the smell.

The ‘no smell’ part was the clue. There’s been a lot of talk about silver nanoparticles and their anti-bacterial properties, which can be used in bandages to combat infection and in clothing to combat smell. Interestingly, nobody used the word nanotechnology or any of its variants in the article,

Throw in some anti-bacterial silver ions and Firstar’s garments have a 99.9-per-cent kill rate for bacteria after 50 washes, Gracey (CEO) and Thom (Founder and VP) say. [emphasis is mine]

Certainly the marketing and PR folks seem to be backing off from using the word nanotechnology or any of its variants. I commented on this development in my Talking nano posting. I also gave a link to an article by Alex Shmidt about this.

The article in The Province did not mention any risks but I don’t expect the reporter knew enough to ask the question. For the record, I have seen material which indicates that the silver nanoparticles (or ions) wash off, which means they could end up in our water supply. As far as I know, there’s no definitive data on whether or not this poses risks.

Incremental regulation and nanotechnology

I think today will be the end of this series. So, for the last time, the article is ‘Risk Management Principles for Nanotechnology’ by Gary E. Marchant, Douglas J. Sylvester, and Kenneth W. Abbott in Nanoethics, 2008, vol. 2, pp. 43-60.

The authors contend that the regulatory model proposed by Ayres and Braithwaite (discussed in yesterday’s post) is not sufficiently flexible to accommodate nanotechnology, as their model assumes

“a fully developed regulatory system that can effectively manage a particular set of risks. … advanced nations with highly developed legal systems in which legislatures and agencies can create, communicate, and utilize a range of regulatory options. … high levels of information and understanding on the part of regulators.” p. 52

In turn, the authors propose a refinement of the Ayres/Braithwaite model, ‘Incremental Regulation’, which they describe using an example from the US Environmental Protection Agency (EPA),

The EPA Nanomaterials Stewardship Program reflects precisely the approach we espouse here: begin with information gathering and assessment, encourage experiments with self-regulation and multi-stakeholder norms, move gradually to greater governmental involvement to standardize, scale up and supervise voluntary programs, perform all the steps with high levels of transparency and participation, and over time build up a regulatory end state that retains the best of these voluntary mechanisms … along with formal regulation …, as required. p. 57

It seems more like a plea to ‘go slow’ rather than a rush to regulate before you understand the implications. The approach seems reasonable enough. Of course, implementing these ideas always presents a stumbling block. I’ve worked in enough jobs where I’ve had to invoke policy in situations the policy makers never envisioned because [1] they had no practical experience and [2] it’s impossible to create policies that cover every single contingency. That’s a big problem with nanotechnology; none of us has much practical experience with it, and I think the question that hasn’t been addressed is whether or not we are willing to take chances. Then we need to figure out what kind of chances, for how long, and who will be taking them. More soon.

Inspiration for a new approach to risk regulation for nanotechnology

I’m getting into the home stretch now regarding the ‘Risk Management Principles for Nanotechnology’ article. After dealing with the ‘classic’ risk principles and the newer precautionary principles, the authors (Marchant, Sylvester, and Abbott) unveil a theory behind their proposed ‘new principles’. The theory is based on work by I. Ayres and J. Braithwaite on something they call ‘Responsive Regulation’. Briefly, they suggest avoiding the regulation/deregulation debate in favour of a flexible regulatory approach where a range of strategies is employed.

With this tool kit [range of strategies] in hand, regulators can play a tit-for-tat strategy: they allow firms to self-regulate so long as the firms reciprocate with responsible action; if instead some firms act opportunistically, regulators respond to the defectors with appropriate penalties and more stringent regulation. p. 52 (Nanoethics, 2008, vol. 2, pp. 43-60)

There are some difficulties associated with this approach but that is being saved for my next posting in this series.

The Project on Emerging Nanotechnologies has two events coming up. ‘Synthetic Biology: Is Ethics a Showstopper?’ will be held on Thursday, January 8, 2009 from 12:30 pm – 1:30 pm (EST). For information on the location (you have to RSVP) or on how to attend via webcast (no RSVP required), check here. The other event is called ‘Nanotech and Your Daily Vitamins: Barriers to Effective FDA Regulation of Nanotechnology-Based Dietary Supplements’ and will be held on Thursday, January 15 (?) from 9:30 am – 10:30 am (EST). The date listed on their website and in their invitation is January 14, which is incorrect. I imagine they’ll correct either the day or the date soon. For more details about the event itself, the physical location (if you’re planning to go, please RSVP), or the webcast directions (RSVP not required), please check here.

The availability heuristic and the perception of risk

It’s taking a lot longer to go through the Risk Management Principles for Nanotechnology article than I expected. But, let’s move onwards. “Availability” is the other main heuristic used when trying to understand how people perceive risk. This one is about how we assess the likelihood of one or more risks.

According to researchers, individuals who can easily recall a memory specific to a given harm are predisposed to overestimating the probability of its recurrence, compared to other more likely harms to which no memory is attached. p. 49 in Nanoethics, 2008, vol. 2

This memory extends beyond your personal experience (although that remains the most powerful) all the way to reading or hearing about an incident. The effect can also be exacerbated by imagery and social reinforcement. Probably the most powerful recent example would be ‘frankenfoods’. We read about the cloning of Dolly the sheep, who died soon after her ‘birth’; there was the ‘stem cell’ debate; and there was ‘mad cow disease’. These somehow got mixed together in a debate on genetically modified food, which evolved into a discussion about biotechnology in general. The whole thing was summed up as ‘frankenfood’, a term which fused a very popular icon of science gone mad, Frankenstein, with the food we put in our mouths. (Note: It is a little more complicated than that but I’m not in the mood to write a long paper or dissertation where every nuance and development is discussed.) The term was propelled by the media, and activists had one of their most successful campaigns.

Getting back to ‘availability’, it is a very powerful heuristic to use when trying to understand how people perceive risk.

The thing with ‘frankenfoods’ is that it wasn’t planned. Susan Tyler Hitchcock in her book, ‘Frankenstein: a cultural history’ (2007), traces the birth of the term from a 1992 letter written by Paul Lewis to the New York Times through to its use as a clarion cry for activists, the media, and a newly worried public. Lewis coined the phrase and one infers from the book that it was done casually. The phrase was picked up by other media outlets and other activists (Lewis is both a professor and an activist). For the full story, check out Hitchcock’s book, pp. 288-294.

I have heard the ETC Group credited with the ‘frankenfoods’ debate and with pushing the activist agenda. While they may have been active in the debate, I have not been able to find any documentation to support the contention that the ETC Group made it happen. (Please let me know if you have found something.)

The authors (Marchant, Sylvester, and Abbott) of this risk management paper feel that nanotechnology is vulnerable to the same sort of cascading effects that the ‘availability’ heuristic provides a framework for understanding. Coming next, a ‘new’ risk management model.

The affect heuristic and risk management principles

Continuing still with the article by Marchant, Sylvester, and Abbott (Risk Management Principles for Nanotechnology) but first a comment about the report released yesterday by the US National Research Council. I haven’t had a chance to look at it but the coverage points to agreement among a surprising set of stakeholders that there is no appropriate governance (regulation) of nanotechnology. The stakeholders include scientists, industry heavyweights such as BASF and Dupont, as well as not-for-profit organizations (American Chemistry Council and Project on Emerging Nanotechnologies). They didn’t mention any activist groups in the materials I’ve seen but I can’t imagine any disagreement from those quarters.

It’s intriguing that this panel report from the US National Research Council has been released the same week that Nature Nanotechnology has published data from ‘the [sic] Cognition Project’ at Yale Law School warning about possible ‘culture wars’ and Dietram Scheufele’s latest findings about the impact religion might have on the adoption of nanotechnology. It’s possible that someone is masterminding all of this but I think there’s a more likely scenario. Most of the people involved know each other because there’s a loose network. They are concerned about the potential for problems, and when they talk to each other they find out about, and encourage, each other’s projects. At some point they may have decided that it would be a good tactic to release reports and publish in Nature Nanotechnology at roughly the same time; in that case, they got lucky and the various media cooperated unknowingly with this impromptu plan. Conversely, maybe nobody talked to anyone about these various projects and they all just got lucky. What I don’t believe is that they developed some master plan and carried it out.

On to heuristics. As I understand the word, it means rules of thumb or mental shortcuts (more or less). In this paper, the authors discuss two specific heuristics that relate to risk perception. (If you’re going to manage risk, you need to understand how it’s perceived.)

Where nanotechnology is concerned, ‘affect’ is considered to be an important heuristic when examining the public’s perception of risk. (Affect is how you feel about something.) Here’s something interesting from the paper,

… numerous studies have shown that where individuals believe a technology has high benefits, they automatically believe its risks are low. This negative correlation has been shown to affect both lay and expert opinions, and is robust even in the face of countervailing evidence. … In short, how individuals feel about a particular stimulus directs how they perceive its dangers or benefits. p. 48

What fascinates me is that your knowledge about the topic, be it expert or amateur, is still heavily affected by whether or not you believe the technology is beneficial, even when evidence suggests that the dangers are huge.

There’s more about ‘affect’ in the article; if you’re interested, get the journal Nanoethics, 2008, vol. 2, pp. 43-60. Meanwhile, there’s another heuristic that the authors use to build their case for a new risk management principle: ‘availability’. More about that tomorrow.

The precautionary principle and a bit about the ‘culture wars’

I was sick for a while there but now I’m back. The article I’ve been talking about is “Risk Management Principles for Nanotechnology” by Gary E. Marchant, Douglas J. Sylvester and Kenneth W. Abbott. The precautionary principle, according to the article, “is often summarized by the phrase ‘better safe than sorry’.” In other words, if there’s a possibility that something bad will happen, don’t do it. As you might expect, this seems like a problematic principle to implement. Do you sit around imagining disaster scenarios or do you tell yourself everything will be fine? How do you determine the level of possible risk?

One of the reasons I was so interested in the event that the Project on Emerging Nanotechnologies had organized with L’Oreal (cosmetics firm) was that the company representative would be discussing how they were implementing the precautionary principle when developing and selling their nanotechnology-based cosmetics. Unfortunately, that event has yet to be rescheduled.

The subject of risk is quite topical right now due to an article from the folks at Yale Law School’s Cognition Project (in cooperation with the Project on Emerging Nanotechnologies) that’s just been published in Nature Nanotechnology and which apparently predicts ‘culture wars’. (I read an earlier version of the work online and cited it in a presentation for the 2008 Cascadia Nanotechnology Symposium.) The major thrust of the work at Yale is that people will consider the benefits and risks of an emerging technology (in this case, nanotechnology) according to their cultural values. The researchers used anthropologist Mary Douglas’s two cross-cutting dimensions of culture to explain what they mean by culture: on one axis you have hierarchy/egalitarianism and on the other you have individualism/communitarianism. One of the findings in the paper is that it doesn’t matter how much information you receive, since your opinion of the technology is more strongly influenced by your cultural values as measured on those two axes. (This undercuts the notion of science literacy, i.e., the idea that if you educate people about a technology they will come to accept it and its attendant risks.) I think at least some of this work is a response to the city of Berkeley’s law regulating nanotechnology research. The legislation was passed unusually quickly and, I believe, it was the first such legislation in the US.

Concurrently published in Nature Nanotechnology with the ‘culture wars’ article is an article by Dietram Scheufele where he discusses how ‘religion’ or ‘values’ have an impact on attitudes towards nanotechnology. I think this article is based on some of the material he presented last year at the 2007 American Association for the Advancement of Science annual meeting.