Tag Archives: science communication

Nanotechnology and the Council of Canadian Academies assessment report

I started discussing the Council of Canadian Academies and its mid-term assessment report (Review of the Council of Canadian Academies; Report from the External Evaluation Panel 2010) yesterday and will finish today with my thoughts on the assessment of the Council’s nanotechnology report and its impact.

Titled Small is Different: A Science Perspective on the Regulatory Challenges of the Nanoscale (2008), the Council’s report is one of the best I’ve read. I highly recommend it to anyone who wants an introduction to some of the issues (and I was much struck by its omission from the list of suggested nanotechnology readings that Peter Julian [Canadian MP] offered in part 2 of his interview). Interestingly, the Council’s nanotechnology report is Case Study No. 3 in the mid-term expert panel assessment report’s Annex 6 (p. 33 in the print version and p. 37 in PDF).

Many respondents were concerned that Health Canada has made no response to, or use of, this report. However, Health Canada respondents were highly enthusiastic about the assessment and the ways in which it is being used to inform the department’s many – albeit still entirely internal – regulatory development activities: “We’ve all read it and used it. The fact that we haven’t responded to the outside is actually a reflection of how busy we’ve been responding to the file on the inside!” [emphases mine]

The report has been particularly valuable in providing a framework to bring together Health Canada’s five – very different – regulatory regimes to identify a common approach and priorities. The sponsor believes the report’s findings have been well-incorporated into its draft working definition of nanomaterials, [emphasis mine] its work with Canadian and international standards agencies, its development of a regulatory framework to address shorter- and longer-term needs, and its creation of a research agenda to aid the development of the science needed to underpin the regulation of nanomaterials in Canada.

I think the next time somebody confronts me as to why I haven’t responded externally to some notice (e.g., paid my strata fees), I’ll assure them that I’ve been ‘responding on the inside’. (Sometimes I cannot resist the low-hanging fruit and I just have to take a bite.)

As for the second paragraph, where they claim that Health Canada has incorporated suggestions from the report into its nanomaterials definition: that’s all well and good, but the thinking is changing and Health Canada doesn’t seem to be responding (or even to be aware of the fact). Take a look at the proposed definition in the current draft bill before the US Senate, which mentions shape, reactivity, and more in addition to size, as compared to Health Canada’s 1 to 100 nm size definition. (See details in this posting from earlier in the week where I compare the proposed US and Canadian definitions.)

Additionally, I think they need to find ways to measure impact that are quantitative as well as qualitative; the qualitative approach itself needs to be revised. Quantitative measures could include the number of reports disseminated in print and online, social networking efforts (if any), the number of times reports are mentioned in the media, etc. They may also want to limit the case studies in future reports so they can provide more depth. The comment about the ‘internal’ impact could have been described at more length. How have the five different Health Canada regulatory regimes come together? Has something substantive occurred?

Finally, it’s hard to know if Julian’s failure to mention the council’s report in his list of nanotechnology readings is a simple failure of memory or a reflection of the Council’s “invisibility.” I’m inclined to believe that it’s the latter.

Science advice and technology assessment in Canada?

Thank you to the folks at The Black Hole blog for their very incisive post about the recent (released April 28, 2010) mid-term assessment report of the Council of Canadian Academies (CCA). Here’s a brief excerpt from The Black Hole posting,

Created in 2005, the Council of Canadian Academies is a not-for-profit corporation that supports science-based, expert assessments to inform public policy development in Canada. It was created with $30 million seed funding from Government which expires in 2015 and just underwent its midterm assessment last week. The report was generally positive and indeed to the casual reader it would appear the CCA has a lot to be proud of and not much to worry about. Digging a little deeper though, one gets the feeling that the CCA is facing a critical juncture in its existence and faces the very real possibility of becoming a heck of a lot less effective in 2015.

The blogger, Dave, goes on to explain that the concerns arise from the CCA’s “lack of visibility” and its “dependence on government sponsors” (I assume this means funding). Given that the CCA is the only agency that provides comprehensive science advice for Canada, this could mean the loss of a very singular resource in the foreseeable future.

In looking at the report very briefly I too noticed a few things that rang warning bells. From the report (p. 9 in print version, p. 13 in PDF),

Recognizing that a great deal of Canada’s intellectual capital lies within the country’s three Academies – the RSC: The Academies of Arts, Humanities and Sciences of Canada, the Canadian Academy of Engineering, and the Canadian Academy of Health Sciences – these organizations were designated the founding members of the Council. The relationship between the Council and its three Member Academies, however, has not [emphases mine] been as productive or cooperative as it could be.

As far as I’m concerned there’s no chance for survival if the CCA can’t develop a good working relationship with its academies. Further, this working relationship will determine the success of the CCA’s efforts to address its “invisibility.” In the report there are three recommendations for communication efforts to make the CCA more visible (p. 13 in print version, p. 16  in PDF),


14. The Board should lead the development of a new communications strategy that builds on the Council’s considerable assets: its reputation, quality product, enthusiastic panellists and scientific advisors, and its key partners, the Academies.

15. The Council should empower and support this broadened scope of voices to engage with a wide range of key stakeholders who could be identifying topics and/or making use of their findings.

16. The Council should continue to seek opportunities to work with the Academies to contribute to international science advisory bodies.

All of these recommendations rely on support from the member academies.

On another note, I find the complete and utter lack of interest in communication efforts aimed at the general public fascinating (I’ve skimmed through the report and have yet to spot anything that concretely addresses it). They are unrelentingly focused on experts and policy makers. I understand that public outreach is not part of the official mandate but the CCA does release reports to the media and arguably they would like their reports to have some impact on the larger society. They might even be interested in public support when the next federal budget that will have an impact on their activities is due or if they try to expand their revenue streams to include something other than government funding. At the very least, they should acknowledge the presence of a much larger world around them and their stakeholders (how do they define stakeholders, anyway? aren’t Canadian citizens stakeholders?).

This indifference to the Canadian citizenry contrasts mightily with the approach Richard Sclove (mentioned in this posting earlier today) is trying to implement with regard to technology assessment in the US. In fact, the indifference contrasts with material I see that comes from the US, the UK, and from the European Community.

NNI’s clumsy attempt to manipulate media; copyright roots

Is it ever a good idea to hand a bunch of experts at your public workshop on nanotechnology risks and ethical issues a list of the facts and comments that you’d like them to give in response to ‘difficult’ questions from the media after you’ve taken a recent shellacking from one reporter who is likely present? While the answer should be obvious, I’m sad to say that the folks at the US National Nanotechnology Initiative (NNI) publicly and demonstrably failed to answer correctly.

The reporter in question is Andrew Schneider who wrote a series on nanotechnology for AOL News. I’ve mentioned his series in passing a few times here and I’m truly disheartened to find myself discussing Schneider and it one more time. For the record, I think it’s well written and there’s some good information about important problems; unfortunately, there’s also a fair chunk of misleading and wrong information. So, in addition to the solid, well-founded material, the series also provides examples of ill-informed and irresponsible science journalism. (Here’s an example of one of his misleading statements. If you want to find it, you have to read down a few paragraphs as that post was about misleading statements being bruited about by individuals with differing perspectives on nanotechnology.) Schneider’s series, if you’re madly curious, is here.

Yesterday, Clayton Teague, director of the National Nanotechnology Coordination Office, provided a riposte on AOL News, where Schneider, a few hours later, offered a devastating nonresponse. Instead, Schneider focused on the NNI’s recent report to the President’s Council of Advisors on Science and Technology (PCAST), getting in a few solid hits before revealing the clumsy attempt to manipulate the media message at the public workshop that the NNI recently held and which Schneider likely attended.

If you want the inside story from the perspective of one of the experts who was at the panel, check out Dr. Andrew Maynard’s latest posting on his 2020 Science blog.

Two more points before I move on (for today anyway). First, Schneider’s ‘nonresponse’ refers to both Andrew and another expert as ‘civilians’.

  • Maynard [director of the Risk Science Center at the University of Michigan School of Public Health] and Jennifer Sass [chief scientist and nano expert for the Natural Resources Defense Council], both leading civilian public health scientists who participated in the review … [emphasis mine]
  • “Surely it is inappropriate for the federal government to advise independent experts what to say on its behalf when it comes to critical news reports,” added Maynard, who was one of the civilian advisers on the panel. [emphasis mine]

As far as I’m aware, only the police and the military refer to the rest of us (who are not them) as civilians. Is Schneider trying to suggest (purposely or not) a police or military state?

As for my second point: somebody passed the list of NNI preferred/approved facts and comments on to Schneider. The first thought would be someone from the expert panel, but it could have come from anyone within the NNI who had access and is sympathetic to Schneider’s concerns about nanotechnology.

Copyright roots

If you’ve ever been curious as to how copyright came about in the first place, head over to Greg Fenton’s item on Techdirt. From the posting where Fenton is commenting on a recent Economist article about copyright,

The Economist goes on to highlight:

Copyright was originally the grant of a temporary government-supported monopoly on copying a work, not a property right.

Surely there will be copyright supporters who will cringe at such a statement. They believe that copyright is “intellectual property”, and therefore their arguments often confuse the requirements for laws that support copyright with those that support physical properties.

The article Fenton refers to is currently open access (but I’m not sure for how long or what the policy is at The Economist). The last lines (with which I heartily concur) from the Economist’s article,

The value society places on creativity means that fair use needs to be expanded and inadvertent infringement should be minimally penalised. None of this should get in the way of the enforcement of copyright, which remains a vital tool in the encouragement of learning. But tools are not ends in themselves. [emphasis mine]

Today’s posting is a short one. About time I did that, eh?

Responsible science communication and magic bullets; lego and pasta analogies; sing about physics

Cancer’s ‘magic bullet’, a term which has been around for decades, is falling into disuse, and deservedly so. So it’s disturbing to see it used by someone in McGill University’s (Montreal, Canada) communications department for a recent breakthrough by their researchers.

The reason ‘magic bullet for cancer’ has been falling into disuse is that it no longer functions well as a metaphor, given what we now know about biology. (The term itself dates from the 19th century and the chemist Paul Ehrlich.) It continues to exist because it’s an easy (and lazy) way to get attention and headlines. Unfortunately, hyperbolic writing of this type obscures the extraordinary and exciting work that researchers are accomplishing. From the news release on the McGill website (also available on Nanowerk here),

A team of McGill Chemistry Department researchers led by Dr. Hanadi Sleiman has achieved a major breakthrough in the development of nanotubes – tiny “magic bullets” that could one day deliver drugs to specific diseased cells.

The lead researcher seems less inclined to irresponsible hyperbole,

One of the possible future applications for this discovery is cancer treatment. However, Sleiman cautions, “we are still far from being able to treat diseases using this technology; this is only a step in that direction. Researchers need to learn how to take these DNA nanostructures, such as the nanotubes here, and bring them back to biology to solve problems in nanomedicine, from drug delivery, to tissue engineering to sensors,” she said.

You’ll notice that the researcher says these ‘DNA nanotubes’ have to be brought “back to biology.” This comment brought to mind a recent post on 2020 Science (Andrew Maynard’s blog) about noted chemist and nanoscientist George Whitesides’ concerns/doubts about the direction of cancer and nanotechnology research. From Andrew’s post,

Cancer treatment has been a poster-child for nanotechnology for almost as long as I’ve been involved with the field. As far back as in 1999, a brochure on nanotechnology published by the US government described future “synthetic anti-body-like nanoscale drugs or devices that might seek out and destroy malignant cells wherever they might be in the body.”

So I was somewhat surprised to see the eminent chemist and nano-scientist George Whitesides questioning how much progress we’ve made in developing nanotechnology-based cancer treatments, in an article published in the Columbia Chronicle.

Whitesides’ comments are quite illuminating (from the article, Microscopic particles have huge possibilites [sic], by Ivana Susic),

George Whitesides, professor of chemistry and chemical biology at Harvard University, said that while the technology sounds impressive, he thinks the focus should be on using nanoparticles in imaging and diagnosing, not treatment.

The problem lies in being able to deliver the treatment to the right cells, and Whitesides said this has proven difficult.

“Cancer cells are abnormal cells, but they’re still us,” he said. [emphasis is mine]

The nanoparticles sent in to destroy the cancer cells may also destroy unaffected cells, because they can sometimes have cancer markers even if they’re healthy. Tumors have also been known to be “genetically flexible” and mutate around several different therapies, Whitesides explained. This keeps them from getting recognized by the therapeutic drugs.

The other problem with targeting cancer cells is the likelihood that only large tumors will be targeted, missing smaller clumps of developing tumors.

“We need something that finds isolated [cancer] clumps that’s somewhere else in the tissue … it’s not a tumor, it’s a whole bunch of tumors,” Whitesides said.

The upside to the treatment possibilities is that they buy the patient time, he said, which is very important to many cancer patients.

“It’s easy to say that one is going to have a particle that’s going to recognize the tumor once it gets there and will do something that triggers the death of the cell, it’s just that we don’t know how to do either one of these parts,” he said.

There is no simple solution. The more scientists learn about biology the more complicated it becomes, not less. [emphasis is mine] Whitesides said one effective way to deal with cancer is to reduce the risk of getting it by reducing the environmental factors that lead to cancer.

“It’s a biology problem, not a particle problem,” he said. [emphasis is mine]

If you are interested, do read Andrew’s post and the comments that follow as well as the article that includes Whitesides’ comments and quotes from Andrew in his guise as Chief Science Advisor for the Project on Emerging Nanotechnologies.

All of this discussion follows on yesterday’s (Mar.17.10) post about how confusing inaccurate science reporting can be.

Moving onwards to two analogies, lego and pasta. Researchers at the University of Glasgow have ‘built’ inorganic (not carbon-based) molecular structures which could potentially be used as more energy-efficient and environmentally friendly catalysts for industrial purposes. From the news item on Nanowerk,

Researchers within the Department of Chemistry created hollow cube-based frameworks from polyoxometalates (POMs) – complex compounds made from metal and oxygen atoms – which stick together like LEGO bricks meaning a whole range of well-defined architectures can be developed with great ease.

The molecular sensing aspects of this new material are related to the potassium and lithium ions, which sit loosely in cavities in the framework. These can be displaced by other positively charged ions such as transition metals or small organic molecules while at the same time leaving the framework intact.

These characteristics highlight some of the many potential uses and applications of POM frameworks, but their principle application is their use as catalysts – a molecule used to start or speed-up a chemical reaction making it more efficient, cost-effective and environmentally friendly.

Moving from lego to pasta with a short stop at the movies, we have MIT researchers describing how they and their team have found a way to ‘imprint’ computer chips by using a new electron-beam lithography process to encourage copolymers to self-assemble on the chip. (Currently, manufacturers use light lasers in a photolithographic process which is becoming less effective as chips grow ever smaller and light waves become too large to use.) From the news item on Nanowerk,

The new technique uses “copolymers” made of two different types of polymer. Berggren [Karl] compares a copolymer molecule to the characters played by Robert De Niro and Charles Grodin in the movie Midnight Run, a bounty hunter and a white-collar criminal who are handcuffed together but can’t stand each other. Ross [Caroline] prefers a homelier analogy: “You can think of it like a piece of spaghetti joined to a piece of tagliatelle,” she says. “These two chains don’t like to mix. So given the choice, all the spaghetti ends would go here, and all the tagliatelle ends would go there, but they can’t, because they’re joined together.” In their attempts to segregate themselves, the different types of polymer chain arrange themselves into predictable patterns. By varying the length of the chains, the proportions of the two polymers, and the shape and location of the silicon hitching posts, Ross, Berggren, and their colleagues were able to produce a wide range of patterns useful in circuit design.

ETA (March 18, 2010): Dexter Johnson at Nanoclast continues with his posts (maybe these will form a series?) about more accuracy in reporting, specifically regarding the news item I’ve just highlighted. Check it out here.

To finish on a completely different note (pun intended), I have a link (courtesy of Dave Bruggeman of the Pasco Phronesis blog by way of the Science Cheerleader blog) to a website eponymously (not sure that’s the right term) named physicssongs.org. Do enjoy such titles as: I got Physics; Snel’s Law – Macarena Style!; and much, much more.

Tomorrow: I’m not sure if I’ll have time to do much more than link to it and point to some commentary but the UK’s Nanotechnologies Strategy has just been released today.

Can you trust science and scientists?; nanoparticle sludge is a good thing

The recent kerfuffle about scientists, climate change, and hacked emails  (see this story in the UK Guardian for more details) is oddly coincidental with a couple of articles I’ve read recently about trust, science, pr, and scientific claims.

Andrew Maynard (2020 Science) wrote Do scientists encourage misleading coverage? to explore some of the issues around how scientists get media coverage for their work as he examines a specific incident.

The easiest, simplest way to get coverage for anything is to make a dramatic statement. e.g. First xxxx in history; Huge losses xxxx; xxx possibly fatal; etc. This can lead to overblown claims and/or a snarky, combative communications style. Maynard’s example features overblown claims about possible future applications of a very exciting development. The serious work was published in Nature Physics but someone at the university has written up a news release and produced a video that features the overblown claims as part of their science outreach. Some of this more dramatic material has been picked up and reproduced elsewhere for general consumption.

The reality is that any scientific endeavour occurs over a long period of time and there are many twists and turns before there is any relative certainty about the discovery and/or the implications for any applications that may arise from it.

In the case of climate change, there is strong evidence but as in any other scientific endeavour there are uncertainties. These uncertainties are integral to how science is practiced because our understanding is constantly being refined (theoretically anyway).

The campaign in the popular media to raise concern about climate change is often quite dramatic and has stripped away much of the uncertainty inherent to scientific data. The campaign has been quite successful but an opportunity was created when the evidence for climate change was presented as irrefutable. Opponents were able to capitalize on anomalies and the uncertainty that is inherent in the practice of science. Interestingly, the opponents are just as dramatic and insist their material is just as irrefutable. So, who do you trust? It’s a pretty basic issue and one that keeps recurring.

The point Maynard, and Matthew Nisbet (Framing Science blog) in his posting Two articles on prediction and hype in science, make is that in trying to engage the public, scientists need to be mindful. Giving in to the impulse to exaggerate or overstate a conclusion for a headline (I do sympathize with that impulse) will do more damage than good to the public’s trust.

Now for something completely different. As more products with nanoparticles enter the marketplace, there’s increasing concern about what happens to them as they are washed off from athletic gear, cleaning products, your body (after using beauty and cosmetic products) and more. According to a newly published paper, scientists may have found a way to remove nanoparticles  from wastewater.  From the news item on Nanowerk,

The new study, details of which are published in Environmental Science & Technology (“Fate of Silica Nanoparticles in Simulated Primary Wastewater Treatment”), simulated primary sewage treatment to show that coating silica nanoparticles with a detergent-like material (called a surfactant) made the nanoparticles interact with components of the sewage to form a solid sludge. This sludge can be separated from the wastewater and disposed of. In contrast, uncoated nanoparticles stayed dispersed in the wastewater and were therefore likely to continue through the effluent stream and potentially on into the environment.

Assuming that nanoparticles entering the environment in substantial quantities is not a good thing, I hope they find some way to deal with them, and this research certainly seems promising.

Pop culture, science communication, and nanotechnology

A few years back I wrote a paper for the Cascadia Nanotech Symposium (March 2007, held in Vancouver) called Engaging Nanotechnology: pop culture, media, and public awareness. I was reminded of it a few days ago when I saw a mention on Andrew Maynard’s 2020 Science blog about a seminar titled Biopolitics of Popular Culture, being held in Irvine, California on Dec. 4, 2009 by the Institute for Ethics and Emerging Technologies. (You can read more of Andrew’s comments here or you can check out the meeting details here.) From the meeting website,

Popular culture is full of tropes and cliches that shape our debates about emerging technologies. Our most transcendent expectations for technology come from pop culture, and the most common objections to emerging technologies come from science fiction and horror, from Frankenstein and Brave New World to Gattaca and the Terminator.

Why is it that almost every person in fiction who wants to live a longer than normal life is evil or pays some terrible price? What does it say about attitudes towards posthuman possibilities when mutants in Heroes or the X-Men, or cyborgs in Battlestar Galactica or Iron Man, or vampires in True Blood or Twilight are depicted as capable of responsible citizenship?

Is Hollywood reflecting a transhuman turn in popular culture, helping us imagine a day when magical and muggle can live together in a peaceful Star Trek federation? Will the merging of pop culture, social networking and virtual reality into a heightened augmented reality encourage us all to make our lives a form of participative fiction?

During this day long seminar we will engage with culture critics, artists, writers, and filmmakers to explore the biopolitics that are implicit in depictions of emerging technology in literature, film and television.

I’m not sure what they mean by biopolitics, especially after the lecture I attended at Simon Fraser University’s downtown campus last night (Nov. 12, 2009), Liminal Livestock. Last night’s lecture by Susan Squier highlighted (this is oversimplified) the relationship between women and chickens in the light of reproductive technologies. From the lecture description,

Adapting SubRosa Art Collective’s memorable question, this talk asks: “What does it mean, to feminism and to agriculture, that women are like chickens and chickens are like women?” As liminal livestock, chickens play a central role in our gendered agricultural imaginary: the zone where we find the “speculative, propositional fabric of agricultural thought.” Analyzing several children’s stories, a novel, and a documentary film, the talk seeks to discover some of the factors that help to shape the role of women in agriculture, and the role of agriculture in women’s lives.

Squier did also discuss reproductive technologies at some length although it’s not obvious from the description that the topic will arise. She discussed the transition of chicken raising as a woman’s job to a man’s job which coincided with the rise of  chicken factory farms. Squier also noted the current interest in raising chickens in city and suburban areas without speculating on possible cultural impacts.

The lecture covered  selective breeding and the shift of university  poultry science departments from the study of science to the study of increasing chicken productivity, which led to tampering with genes and other reproductive technologies. One thing I didn’t realize is that chicken eggs are used for studies on human reproduction. Disturbingly, Squier talked to an American scientist, whose work concerns human reproduction, who moved to Britain because the chicken eggs are of such poor quality in the US.

The relationship between women and chickens was metaphorical and illustrated through popular children’s stories and pop culture artifacts (i.e. poultry beauty pageants featuring women, not chickens) in a way that would require reproducing far more of the lecture than I can here. So if you are interested, I understand that Squier has a book about women and chickens being published, although I can’t find a publication date.

Squier’s lecture and the meeting for the Institute for Ethics and Emerging Technologies present different ways of integrating pop culture elements into the discussion about science and emerging technologies. Since I’m tooting my horn, I’m going to finish with my thoughts on the matter as written in my Cascadia Nanotechnology Symposium paper,

The process of accepting, rejecting, or changing new sciences and new technologies seems more akin to a freewheeling, creative conversation with competing narratives than a transfer of information from experts to nonexperts as per the science literacy model.

The focus on establishing how much awareness the public has about nanotechnology by measuring the number of articles in the newspaper or items in the broadcast media or even tracking the topic in the blogosphere is useful as one of a set of tools.

Disturbing as it is to think that it could be used for purely manipulative purposes, finding out how people develop their attitudes towards new technologies and the interplay between cognition, affect, and values has the potential to help us better understand ourselves and our relationship to the sciences. (In this paper, the terms science and technology are being used interchangeably, as is often the case with nanotechnology.)

Pop culture provides a valuable view into how nonexperts learn about science (books, television, etc.) and accept technological innovations (e.g. rejecting the phonograph as a talking book technology but accepting it for music listening).

There is a collaborative and interactive process at the heart of the nanotechnology ‘discussion’. For example, Drexler appears to be responding to some of his critics by revising some of his earlier suppositions about how nanotechnology would work. Interestingly, he also appears to be downplaying his earlier concerns about nanoassemblers running amok and unleashing the ‘goo’ scenario on us all. (BBC News, June 9, 2004)

In reviewing all of the material about communicating science, public attitudes, and values, one thing stands out: time. Electricity was seen by some as deeply disturbing to the cosmic forces of the universe. There was resistance to the idea for decades and, in some cases (the Amish), that resistance lives on. Despite all this, there is not a country in the world today that doesn’t have electricity.

One final note: I didn’t mean to suggest the inexorable adoption of any and all technologies, my intent was to point out the impossibility of determining a technology’s future adoption or rejection by measuring contemporary attitudes, hostile or otherwise.

’nuff said for today. Happy weekend!

Detecting dangerous liquids in airline luggage with a Josephson junction; NANOvember in Albany, New York; nano haiku for November

To be free of those clear plastic bags which hold all your bottles of liquids when you go through airport security with your luggage! That is a very worthwhile nanotechnology promise. From the news item on Nanowerk,

Restrictions on liquids in carry-on bags on commercial airliners could become a thing of the past thanks to a revolutionary nano-electric device which detects potentially hazardous liquids in luggage in a fraction of a second, according to a team of German scientists. Writing in the journal Superconductor Science and Technology, the researchers at the Forschungszentrum Juelich in western Germany claim that they have been able to do this using an optical approach that detects all existing and future harmful liquids within one fifth of a second.

Since the paper was published, the researchers have been approached by industrial partners about producing a prototype. (sigh) Most likely this means it will be about five years before we see the devices in airports. The device itself is known as a Josephson junction, and you can read more about it on the Azonano site too.

I am happy to see that the College of Nanoscale Science and Engineering (CNSE) at the University at Albany (New York, US) has held a remarkably successful nano event, Community Day, during NANOvember, attracting about 1,000 people. From the news item on Nanowerk,

NANOvember is part of “NEXSTEP,” or “Nanotechnology Explorations for Science, Training and Education Promotion,” a partnership between CNSE and KeyBank. Spearheaded by CNSE’s Nanoeconomics Constellation, the initiative features a variety of educational programs designed to promote greater understanding of the changing economic and business environment in the Capital Region and New York State being driven by nanotechnology. “As nanotechnology increasingly shapes the educational and economic landscapes of the Capital Region, NANOvember offers a platform through which the community can better understand the impact and opportunities driven by this emerging science,” said Jeffrey Stone, president, Capital Region, KeyBank N.A.

I’m impressed they attracted that large a crowd in a city with a population of about 100,000 (Albany county has a population of about 300,000), according to the 2000 census statistics. By contrast, the city of Vancouver (Canada) has a population of about 600,000 with a regional population of approximately 2 million (from the City of Vancouver website on November 9, 2009), and I’m hard-pressed to recall either of our local universities claiming a similar success for one of their community days.

One other point about Albany and nanotechnology: in a July 2008 posting, I noted a $1.5B investment by IBM for a research centre in Albany, NY. So this nanotechnology communication/education event dovetails very nicely with past occurrences and suggests an overall strategy is at work.

Some haiku from NISEnet’s (Nanoscale Informal Science Education Network) newsletter,

After you read this
Your finger nail will have grown
a nanometer
by Troy Dassler

We struggle to show
The size of a molecule.
Kids wait patiently.
by Mike Falvo

You can check out the organization’s blog, The Nano Bite, here.

Selling science; policy founded on evidence-based research

There’s more from the 2009 Canadian Science Policy conference held in Toronto last week. Preston Manning (part 1 and part 2 of his interview for this blog) was Day 2’s keynote speaker, and Rob Annan covers Manning’s suggestions for Canadian science policy here. Reading over Rob’s comments for all three days, I noticed the speakers’ focus was on encouraging scientists to learn to communicate better with politicians, to organize themselves so as to communicate more effectively, and to engage directly in politics, policymaking, and society.

I have commented previously here on how much more effective scientists in the US (and elsewhere) have been with their communication efforts. There is much room for improvement in Canada, although I have to admit to choking on this suggestion of Manning’s,

c) create a working group who can work on the application of the science of communication to the communication of science (he liked that phrase – it’s pretty good). Basically, figure out new and innovative ways to get the message out.

The ‘science of communication’ … hmmm … is this like the science of marketing? or the science of advertising? It sounds as if Manning believes there’s a formula. Well, advertisers have an old saying: “50% of your advertising works, but nobody knows which 50%.”

Take the ‘frankenfoods’ or GM (genetically modified) foods debacle as an example of a wildly successful communications campaign. That was a lightning strike. As I noted in my posting, “The unpredictability of ‘frankenfoods’”, the activist groups got lucky. There was also another element: most successful campaigns, activist or otherwise, are based on persistence and hard work. In other words, you keep pitching. Add to or change your techniques and your tools, tweak your messages, etc., but above all, keep pitching.

Selling science is a complicated affair (what follows is a simplified list): your messages are competing with many others; reciprocity and respect (i.e., listening to what your recipient has to say) are not always included in the equation, especially when what is said seems uninformed or downright foolish by your standards; and your recipients may never be able to accept your message, regardless of the evidence supporting your position.

Andrew Maynard has posted about a situation in the UK where the recipients (government officials) are unable or unwilling to consider a new position despite extensive evidence. Professor David Nutt was, until recently, the senior scientific advisor to the UK government on the misuse of drugs. He was sacked after a paper he authored was released last month (October 2009). There is a newspaper account of the situation by Mark Tran in The Guardian here.

Andrew’s analysis points to something we’ve all observed: people will choose to disbelieve something against all reason. In fact, we’ve all done it. You just don’t want to change your mind about something that’s usually a deeply held belief linked to your basic worldview. I call it the triumph of orthodoxy over fact.

Bravo to Professor Nutt for his thoughtful paper and his courage (I suspect he was well aware that there might be a reprisal).

I hope Canadian scientists do become more involved and communicate more effectively, while realizing that there are no guarantees they will achieve their dearly hoped-for outcomes, at least in the shorter term.

Over the longer term, things change. Universal literacy, democracy, women’s right to vote, ubiquitous electricity: all of these were bitterly fought against over decades or more.