Tag Archives: the Cognition Project

Public understanding of science projects as prophylactic treatments

As I stated (in different words) yesterday, prophylaxis is one of the unspoken goals for a lot of these public consultation/engagement/understanding of science projects. The problem is that you have to figure out how a group is going to react before you can make predictions. The recent write-up in Nature Nanotechnology (December 2008 online edition), featuring work from Dan M. Kahan et al. of the Cognition Project at Yale, offers a very interesting way of analyzing how people arrive at their opinions, which I described in my posting here. These researchers suggest/predict that learning more about a science or a technology is not going to be helpful, since opinions get fixed at an early stage and further intellectual input does not (according to their study) change things.

Presumably people would have behaved similarly (i.e., quickly establishing an opinion after a minimum of input) when electricity was introduced. There are a surprising number of similarities between the technology introductions of then (the 19th century) and now. If you want to look at some of the texts from that period, complete with dire predictions about messing with God’s work, there’s an excellent book by Carolyn Marvin called ‘When Old Technologies Were New’.

We are able to predict some things about people, individually and in groups, but we don’t have a very good track record. If we could do it well, every movie would be a financial success, every song would be a hit, and no scientist would ever have their projects halted due to public outcry. More tomorrow.

The affect heuristic and risk management principles

Continuing with the article by Marchant, Sylvester, and Abbott (Risk Management Principles for Nanotechnology), but first a comment about the report released yesterday by the US National Research Council. I haven’t had a chance to look at it, but the coverage points to agreement among a surprising set of stakeholders to the effect that there is no appropriate governance (regulation) of nanotechnology. The stakeholders include scientists, industry heavyweights such as BASF and DuPont, as well as not-for-profit organizations (the American Chemistry Council and the Project on Emerging Nanotechnologies). They didn’t mention any activist groups in the materials I’ve seen, but I can’t imagine any disagreement from those quarters.

It’s intriguing that this panel report from the US National Research Council was released the same week that Nature Nanotechnology published data from ‘the [sic] Cognition Project’ at Yale Law School warning about possible ‘culture wars’, along with Dietram Scheufele’s latest findings about the impact religion might have on the adoption of nanotechnology. It’s possible that someone is masterminding all of this, but I think there’s a more likely scenario. Most of the people involved know each other because there’s a loose network. They are concerned about the potential for problems, and when they talk to each other they find out about each other’s projects and encourage one another. At some point they may have decided that it would be a good tactic to release reports and publish in Nature Nanotechnology at roughly the same time. If so, they got lucky and the various media cooperated unknowingly with this impromptu plan. Alternatively, nobody talked to anyone about these various projects and they simply got lucky. What I don’t believe is that they developed some master plan and carried it out.

On to heuristics. As I understand the word, a heuristic is a rule of thumb or mental shortcut (more or less). In this paper, the authors discuss two specific heuristics that relate to risk perception. (If you’re going to manage risk, you need to understand how it’s perceived.)

Where nanotechnology is concerned, ‘affect’ is considered to be an important heuristic when examining the public’s perception of risk. (Affect is how you feel about something.) Here’s something interesting from the paper:

… numerous studies have shown that where individuals believe a technology has high benefits, they automatically believe its risks are low. This negative correlation has been shown to affect both lay and expert opinions, and is robust even in the face of countervailing evidence. … In short, how individuals feel about a particular stimulus directs how they perceive its dangers or benefits. p. 48

What fascinates me is that your perception of the risks, whether your knowledge is expert or amateur, is still heavily shaped by whether or not you believe the technology is beneficial, even when evidence suggests that the dangers are huge.
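To make that ‘negative correlation’ concrete, here’s a minimal sketch with toy numbers of my own invention (not data from the paper): each pair below is one hypothetical respondent’s benefit and risk rating on a 1-10 scale, and the correlation coefficient comes out strongly negative, i.e., the more beneficial someone thinks the technology is, the lower they rate its risks. (The statistics.correlation call needs Python 3.10 or later.)

# Toy illustration only; all numbers are invented for this example.
from statistics import correlation  # Python 3.10+

# Hypothetical respondents' ratings of a technology (1-10 scale)
perceived_benefit = [9, 8, 8, 7, 5, 4, 3, 2]
perceived_risk    = [2, 3, 2, 4, 5, 6, 8, 9]

r = correlation(perceived_benefit, perceived_risk)
print(f"Pearson r = {r:.2f}")  # strongly negative: high benefit goes with low perceived risk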

There’s more about ‘affect’ in the article; if you’re interested, get the journal Nanoethics, 2008, vol. 2, pp. 43-60. Meanwhile, there’s another heuristic that the authors are using to build their case for a new risk management principle. That heuristic is ‘availability’, and more about that tomorrow.

The precautionary principle and a bit about the ‘culture wars’

I was sick for a while there but now I’m back. The article I’ve been talking about is “Risk Management Principles for Nanotechnology” by Gary E. Marchant, Douglas J. Sylvester and Kenneth W. Abbott. The precautionary principle, according to the article, “is often summarized by the phrase ‘better safe than sorry’.” In other words, if there’s a possibility that something bad will happen, don’t do it. As you might expect, this seems like a problematic principle to implement. Do you sit around imagining disaster scenarios or do you tell yourself everything will be fine? How do you determine the level of possible risk?

One of the reasons I was so interested in the event that the Project on Emerging Nanotechnologies had organized with L’Oreal (cosmetics firm) was that the company representative would be discussing how they were implementing the precautionary principle when developing and selling their nanotechnology-based cosmetics. Unfortunately, that event has yet to be rescheduled.

The subject of risk is quite topical right now due to an article from the folks at Yale Law School’s Cognition Project (in cooperation with the Project on Emerging Nanotechnologies) that’s just been published in Nature Nanotechnology and which apparently predicts ‘culture wars’. (I read an earlier version of the work online and cited it in a presentation for the 2008 Cascadia Nanotechnology Symposium.) The major thrust of the work at Yale is that people consider the benefits and risks of an emerging technology (in this case, nanotechnology) according to their cultural values. They used anthropologist Mary Douglas’s two cross-cutting dimensions of culture to explain what they mean by culture: on one axis you have hierarchy/egalitarianism and on the other axis you have individualism/communitarianism. One of the findings in the paper is that it doesn’t matter how much information you receive (this relates to the notion of science literacy, i.e., the idea that if you educate people about a technology they will come to accept it and its attendant risks), since your opinion of the technology is more strongly influenced by your cultural values as measured on those two axes. I think at least some of this work is a response to the city of Berkeley’s law regulating nanotechnology research. The legislation was passed unusually quickly and, I believe, it was the first such legislation in the US.

Published concurrently in Nature Nanotechnology with the ‘culture wars’ article is one by Dietram Scheufele, in which he discusses how ‘religion’ or ‘values’ have an impact on attitudes towards nanotechnology. I think this article is based on some of the material he presented last year at the 2007 American Association for the Advancement of Science annual meeting.