Tag Archives: PEN

Synbio (Synthetic Biology) in society: a May 12, 2010 panel discussion hosted by the Project on Emerging Nanotechnologies

The proper title for this event, hosted by the Project on Emerging Nanotechnologies (PEN), is Synbio in Society: Toward New Forms of Collaboration? It will be webcast live (I hope they’re able to pull that off this time) this coming Wednesday, May 12, 2010. The time is listed as 12:30 pm ET (9:30 am PT), but a light lunch (for attendees at the Washington, DC live event) is also mentioned, and the folks at PEN haven’t, as per their usual practice, specified when the panel itself starts.

From the news release,

One response to society’s concerns about synthetic biology has been to institutionalize the involvement of social scientists in the field. There have been a series of initiatives in which ethics and biosafety approaches have been purposely incorporated into synthetic biology research and development. The collaborative Human Practices model within the NSF-funded SynBERC project was the first initiative in which social scientists were explicitly integrated into a synthetic biology research program. But these new collaborations have also flourished in the UK where four research councils have funded seven scientific networks in synthetic biology that require consideration of ethical, legal and social issues. Another example is the US-UK Synthetic Aesthetics Project, which brings together synthetic biologists, social scientists, designers and artists to explore collaborations between synthetic biology and the creative professions.

Similarly, the European Commission’s Seventh Framework Programme funds projects like Synth-ethics, which “aims at discerning relevant ethical issues in close collaboration with the synthetic biology community.” (http://synthethics.eu/) and SYBHEL, which aims to examine ethical, legal and social aspects of SynBio as it applies to health care (http://sybhel.org/).

On May 12, 2010, the Synthetic Biology Project at the Woodrow Wilson International Center for Scholars will present a panel discussion to explore new forms of collaboration that have emerged between scientists and social scientists working on synthetic biology. A distinguished group of speakers will explore the many ways in which the new science of synthetic biology–far from standing apart from the rest of the academic disciplines–is in constant conversation with the social sciences and the arts.

While I’m not a big fan of the whole synthetic biology movement, I do find this collaboration between sciences/social sciences/arts to be quite intriguing.

You can read more about the event, click through to the live streaming webcast on Wednesday, or RSVP to attend the actual event here.

Quite by chance I found out that Canada’s National Institute for Nanotechnology (NINT) includes synthetic biology in its programme focus. From the Nano Life Sciences at NINT page,

The Nano Life Sciences researchers investigate the fields of synthetic biology, computational biology, protein structure, intermolecular membrane dynamics and microfluidics devices for biological analysis.

* Synthetic biology is a young field that uses genetic engineering and DNA synthesis to develop new proteins and genetic circuits. Proteins are the nanoscale machinery of life while genetic circuits represent computational “logic” capabilities in cells. Research in this field could lead to a “toolkit” for “re-programming” bacteria to produce useful functions.

I haven’t been able to find any more details about the Canadian synbio endeavour on the NINT website.

Oil spills, environmental remediation, and nanotechnology

Oil spills have been on my mind lately as I’ve caught some of the coverage about the BP (British Petroleum) oil spill in the Gulf of Mexico. One leak (the smallest) has been fixed, according to a news item on physorg.com,

Days of work off the coast of Louisiana with underwater submarines nearly a mile below the surface finally bore fruit as a valve was secured over the smallest of the three leaks and the flow shut off.

The feat does not alter the overall amount of crude spilling into the sea and threatening the fragile US Gulf coast, but is significant nonetheless as the focus can now narrow on just two remaining leaks.

“Working with two leaks is going to be a lot easier than working with three leaks. Progress is being made,” US Coast Guard Petty Officer Brandon Blackwell told AFP.

More than two weeks after the Deepwater Horizon rig exploded, the full impact of the disaster is being realized as a massive slick looms off the US Gulf coast, imperilling the livelihoods of shoreline communities.

The news item goes on to detail how much crude oil is still being lost, the oil slick’s progress, the probable impact on the shoreline and animals, and the other efforts being made to ameliorate the situation.

With all the talk about nanotechnology’s potential for helping us clean up these messes, there’s been no mention of it in the current efforts, as Dexter Johnson over at the IEEE’s (Institute of Electrical and Electronics Engineers) Nanoclast blog pointed out the other day. From Dexter’s posting, which features both a discussion about patents for nanotechnology-enabled clean-up products and an interview with Tim Harper,

So to get a sense of where we really are I wanted to get the perspective of my colleague, Tim Harper (principal of Cientifica), who in addition to being a noted expert on the commercialization of nanotechnologies also has devoted his attention to the use of nanotechnologies in cleantech including its remediation capabilities, leading him to his presentation this week in Australia at the conference Cleantech Science and Solutions: mainstream and at the edge.

“If you are looking for a quick fix from nanotechnology, forget it,” says Harper. “Nanotech is already making an impact in reducing energy, and therefore oil use, it is also being used to create stronger lighter materials that can be used for pipelines, and enabling better sensors for early warning of damage, but in terms of cleaning up the mess, the contribution is minor at best.”

Clearly not the hopeful words that many would have hoped for, and the pity is that it might have been different, according to Harper.

“As with all technologies, the applications take a while to develop,” he says. “If someone had come up with some funding 10 years ago for this specific application then we may have had better tools to deal with it.”

Dexter’s posting about patents and Harper’s comments reminded me of an article by Mason Inman that I saw two years ago on the New Scientist website, titled Nanotech ’tissue’ loves oil spills, hates water. From the article,

A material with remarkable oil-absorbing properties has been developed by US researchers. It could help develop high-tech “towels” able to soak up oil spills at sea faster, protecting wildlife and human health.

Almost 200,000 tonnes of oil have been spilled at sea in accidents since the start of the decade, according to the International Tanker Owners Pollution Federation. [This article was posted May 30, 2008]

Clean-up methods have improved in recent years, but separating oil from thousands of gallons of water is still difficult and perhaps the biggest barrier to faster clean ups.

The new water-repellent material is based on manganese oxide nanowires and could provide a blueprint for a new generation of oil-spill cleaners. It is able to absorb up to 20 times its own weight in oil, without sucking up a drop of water.

Unfortunately,

But [Joerg] Lahann [University of Michigan in Ann Arbor, US]  points out that manganese oxide may not be the best material for real-world applications because it could be toxic. He says, though, that the new material “clearly provides a blueprint that can guide the design of future nanomaterials for environmental applications.”

I wonder if they’ve done any research to determine if manganese oxide, in the shape and size required to create this nanotech ’tissue’, is toxic. Intriguingly, there was a recent news item on Nanowerk about toxicology research being undertaken in a marine environment.

Led by Dr. Emilien Pelletier, the Institut des Sciences de la Mer de Rimouski at the Université du Québec à Rimouski has obtained an LVEM5 benchtop electron microscope to help them study the short-term and long-term effects of nano-materials on the marine environment.

Dr. Pelletier is the Canada Research Chair in Marine Ecotoxicology. The overall objective of the chair is to understand the short- and long-term impact of natural and anthropogenic stresses on high-latitude coastal ecosystems, to contribute to the conservation, protection and sustainable development of cold coastal marine resources.

Since the news release was written by the company supplying the microscope, there is no word as to exactly what Dr. Pelletier’s team will be researching and how the work might have an impact on other members of the community, such as the researchers with the ‘oil-hungry nanotech tissue’ made of nanoscale manganese oxide.

There is, as always, a political element to all of this discussion about what we could or couldn’t do with nanotechnology-enabled means to clean up oil spills and/or reduce/eliminate our dependence on oil. This discussion is not new, as Dr. J. Storrs Hall implies during a presentation reported in a recent (May 4, 2010) Foresight Institute blog entry by Dave Cronz, PhD. From the posting,

Here I offer my reflections on some of the highlights of the presentation by Dr. J. Storrs Hall of the Foresight Institute, entitled “Feynman’s Pathway to Nanomanufacturing,” and the panel discussion that followed, “How Do We Get There from Here?” Discussions such as these are crucial opportunities to reflect on – and potentially shape – emerging technologies whose destinies are often left to be determined by “market forces.”

Dr. Hall began with an intriguing argument: Feynman’s top-down approach to reaching the nano scale in manufacturing, achieved through a step-down method of replicating and miniaturizing an entire, fully-equipped machine shop in 1:4 scale over and over would yield countless benefits to science, engineering, and manufacturing at each step. These microscopic, tele-manipulated master-slave “Waldos” (named after Heinlein’s 1942 story “Waldo F. Jones”) would get nanotechnology back on track by focusing on machines and manufacturing, since most of our current emphasis is on science at the nano scale. Feynman’s top-down approach to nanoscale manufacturing is missing from the Foresight Institute’s roadmap, according to Hall, “for political reasons.” This raises a fundamental point: science and technology cannot develop independent of the political and social spheres, which pose as many challenges as the technology. Many would argue that social and technological processes are inseparable and treating them otherwise borders on folly. I commend Dr. Hall for offering his argument. It soon became clear that the panelists who joined him after his presentation disagreed. [bolded emphases mine]

As Dr. Hall aptly noted it’s not dispassionate calculations but “serendipity: the way science always works.”

I’m in agreement with Dr. Hall: the political and social spheres are inseparable from the scientific and technological spheres. As for “emerging technologies whose destinies are often left to be determined by market forces”, Dexter’s posting ends with this,

But foresight is not the strong suit of businesses built around short-term profit motives as evidenced by them [BP] not even investing in the remote systems that would have turned the oil well off and possibly avoided the entire problem.

I strongly recommend reading Dexter’s posting to get the nuances and to explore his links.

I’m going to finish on a faint note of hope. There is work being done on site remediation and it seems to be successful, i.e., nonpolluting, less disruptive to the environment, and cheaper. The Project on Emerging Nanotechnologies (PEN) has a webcast of a presentation titled, Contaminated Site Remediation: Are Nanomaterials the Answer? You can find my comments about the webcast here (scroll down a bit) and PEN’s Nanoremediation Map, which lists projects around the world, although most are in the US. The map is incomplete, since there is no requirement to report a nanoremediation site to PEN, but it does give you an idea of what’s going on. Canada has two sites on the map.

Reinventing technology assessment or why should I start thinking about how to make better decisions about science and technology?

I think a better title for this posting might have been Dr. Strangelove or How I Learned to Stop Worrying and Love the Bomb (old movie title), as it’s got the right rhythm; unfortunately, the sentiment isn’t quite right (although quite close in some places) for this discussion about technology assessment along with the notion of unimpeded science ‘progress’.

Yesterday, April 28, 2010, the Project on Emerging Nanotechnologies (PEN) held an event to launch a new report, Reinventing Technology Assessment for the 21st Century by Richard Sclove. (It’s the second time that PEN has not offered a live webcast of one of these events, and I hope this is due to technical difficulties rather than finances.) The description for the event and links to the report and the speaker’s presentation can be found here.

I’m going to briefly discuss Richard Sclove’s presentation slides (and will watch the webcast, which has hopefully been made, when it’s posted in a few weeks). He offers a brief history of technology assessment (TA) in the US (an office was opened in 1972 and closed in 1995) and a brief description of what it was supposed to accomplish. From the presentation,

Technology Assessment

Enhances societal understanding
of the broad implications of
science and technology, and
improves decision-making.

Sclove also notes that there are now 18 TA agencies in Europe and makes the case that TA is important. What I found particularly interesting in the presentation is his focus on participatory TA. He’s not interested in simply reinstating the TA office in the US but in broadening engagement in the technology assessment process, which is why his presentation and report use the word reinvention.

The suggestion for participation in TA is certainly in line with the current interest in involving citizens in all kinds of work, e.g. citizen scientists (an earlier blog posting) and citizen archivists (earlier blog posting), where volunteers work alongside professionals on certain projects. There is also a similarity to public engagement, where experts and citizens meet to discuss emerging technology with the intent that the experts will take these meetings into account in their decision-making. Sclove’s particular project (he is launching a project based on his report) seems to integrate the two approaches by formalizing the public engagement aspect beyond a series of meetings and/or workshops into a working relationship, such as one between a citizen scientist and a professional scientist.

I find Sclove’s concept appealing and was made to reconsider it after reading Andrew Maynard’s (over at his 2020 Science blog) thoughts about the concept of TA. From Andrew’s posting,

It [TA] is based on the assumption that, if only we can get some insight into where a particular technology innovation is going and what the broader social and economic consequences might be, we should be able to tweak the system to increase the benefits and decrease the downsides.

As an idea, it’s an attractive one. Having the foresight to identify potential hurdles to progress ahead of time and make decisions that help overcome them at an early stage makes sound sense. If businesses wants to develop products that are sustainable over long periods, governments want to craft policies that have long-reaching positive consequences and citizens want to support actions that will benefit them and their children, any intelligence on the potential benefits and pitfalls associated with a new technology is invaluable to informed decision-making.

The trouble is, making sense of a complex future where technology, social issues, politics, economics and sheer human irrationality collide, is anything but straight forward.

It’s the dynamic nature of an emerging technology, as he points out, that makes all of the decision-making and regulatory development so very challenging. Andrew also contrasts the traditional TA concepts with the ideas in a book (Bad Ideas? An arresting history of our inventions) by Robert Winston, who cautions against society’s blind assumption that the adoption of an emerging science or technology is both inevitable and good. You can certainly see that attitude in some of the information about nanotechnology. Even Andrew Schneider (earlier posting discussing the contretemps), who has roundly criticized the National Nanotechnology Initiative’s efforts, assumes that nanotechnology’s adoption is inevitable.

Do read the posting and the comments. Richard Sclove dropped in and I offer this one excerpt from his comment,

Early on your mention that technology assessment (TA) “is based on the assumption that, if only we can get some insight into where a particular technology innovation is going . . . we should be able to tweak the system to increase the benefits and decrease the downsides.” As written, that is exactly right. Although if you read my report carefully, you’ll see that I’m interested in seeing if we can push the capability of TA (both participatory and not) to move beyond only studying one “particular technology” at a time to also considering the synergistic interactions among complexes of (seemingly unrelated) techs.

I noticed that nowhere in Sclove’s full comment does he address the much thornier issue of whether we must adopt an emerging science or technology simply because we can. You can learn more about Sclove’s project, the Expert & Citizen Assessment of Science and Technology (ECAST), here. I notice the founding partners include PEN and the Science Cheerleader, which has been mentioned here from time to time (notably in the posting about citizen scientists).

Peter Julian’s interview about proposing Canada’s first nanotechnology legislation (part 2 of 3); more on the UK Nanotechnologies Strategy; Dylan Thomas, neuroscience and an open reading

This is part 2 of an interview with Member of Parliament Peter Julian, NDP (New Democratic Party), who tabled the first Canadian bill to regulate nanotechnology. Yesterday’s part of the interview featured some biographical notes about Mr. Julian and his answers to questions about why he, in particular, tabled the bill; the involvement of the NDP’s shadow science minister (Jim Maloway); and the NDP’s commitment to science policy. Today, Julian explains why he favours the application of the precautionary principle to nanotechnology, notes the research he used before writing his bill, and comments on a national inventory scheme. NOTE: As some folks may prefer other media or summaries/commentaries on these reports, in situations where I have additional material, I’ve taken the liberty of giving links, clearly marking my additions.

Why do you favour applying the precautionary principle, which has received some criticism as it favours the status quo?

I believe that the precautionary principle does not favour the status quo. The status quo hinders appropriate applications of precaution. Environmental, health, and safety gaps in the application of Nanotechnology are a shared concern between countries, as reflected in recent reports to Congress and the EU and at the OECD. Precaution towards discovery, product, production, use and eventual disposal is simple common sense.

The precautionary principle deters action without reflection. When a product is massively put on the market we have to be sure that it will not have adverse effects on health and the environment, and not just a short lived positive effect on the bottom line.

What research materials support your bill, and are these materials that you would recommend interested citizens read?

I have a list of links concerning these materials:

ED. NOTE:  I offered some commentary here and links to other commentaries here about this report.

  • The Chatham House briefing paper, Regulating Nanomaterials: A Transatlantic Agenda (September 2009), an excellent eight-page read:

http://www.chathamhouse.org.uk/publications/papers/view/-/id/774/

ED. NOTE: There is a Project on Emerging Nanotechnologies (PEN) webcast of a presentation by the folks who authored the report. The webcast and speaker presentations can be found here and my commentary on the webcast here.

ED. NOTE: PEN webcast a presentation by J. Clarence Davies on Oversight of Next Generation Nanotechnology available here along with a speaker’s presentation and additional materials.

  • The National Nanotechnology Initiative document lays out a substantive, and sound, research program. Canada’s strategy remains limited in scope and vision.

http://www.nano.gov/NNI_EHS_Research_Strategy.pdf

I noticed mention of a public inventory for nanomaterials and it reminded me of a proposed Environment Canada nanomaterials inventory or reporting plan that was announced in January 2008. Do you know if this inventory ever took place or what its current status is?

The inventory is not completed yet. The bill develops a mandatory requirement for an inventory and there have been no prior operational inventories regarding nanotechnology products, which is why this bill is so important.

I would like to stress that in addition to the precautionary principle, Bill C-494 is built on a definition of Nanotechnology that adopts a broader and more inclusive definition of nanomaterials. This is consistent with the findings of the UK House of Lords Science and Technology Committee:

  • We recommend that the Government should work towards ensuring that any regulatory definition of nanomaterials proposed at a European level, in particular in the Novel Foods Regulation, should not include a size limit of 100nm but instead refer to ‘the nanoscale’ to ensure that all materials with a dimension under 1000nm are considered. A change in functionality, meaning how a substance interacts with the body, should be the factor that distinguishes a nanomaterial from its larger form within the nanoscale.

UK House of Lords Science and Technology Committee
Nanotechnologies and Food (8 January 2010)
Recommendation 12, p.76

http://www.publications.parliament.uk/pa/ld/ldsctech.htm

This is in contrast with Health Canada policy, which takes a narrow definition of nanomaterials:

  • Health Canada’s Science Policy Directorate announced the adoption of the Interim Policy Statement on Health Canada’s Working Definition for Nanomaterials and its posting on the Health Canada website 2 March 2010. This Government of Canada policy adopts a 1-100nm “inclusive” regulatory benchmark, effective immediately, with a public comment period underway.

http://www.hc-sc.gc.ca/sr-sr/consult/_2010/nanomater/index-eng.php

ED. NOTE: I made an error in my question; the proposed nano inventory by Environment Canada was announced in Jan. 2009. My postings on the announcement are here and here. The odd thing about the announcement was that it was made initially by PEN, which is located in Washington, DC, and subsequently picked up by Canadian news media. As far as I know, Environment Canada has never offered comment about its 2009 plan for a nanotechnology inventory.

Tomorrow Julian wraps up with answers to questions about why someone whose shadow portfolio includes international trade is interested in nanotechnology, and about the potential costs of his proposed legislation.

Peter Julian interview Part 1, Part 3, Comments: Nano Ontario, Comments: nanoAlberta

More on the UK 2010 Nanotechnologies Strategy Report

Dexter Johnson over on Nanoclast has done some detective work in a bid to understand why the market numbers used in the report differ wildly from anyone else’s. From Dexter’s posting,

It [the report] quotes market numbers for nano-enabled products that are such a drastic departure from most estimates that it leaves one questioning why tens of billions of dollars are being poured in by governments around the world to fund research.

If you have the time, do follow along as Dexter trails the company that the UK government used as its source for the market numbers. Amongst other names, I recognized one, ObservatoryNANO. (It was an organization I followed briefly and dismissed as being frivolous.)

One other commenter has emerged: Tim Harper. As the principal of a nanotechnology business consulting company (Cientifica), he might find some inclined to dismiss his comments, but they have the ring of honest frustration and a sincere desire to contribute. From Harper’s posting,

Every UK nanotech report to date has excluded any data provided by UK companies. Even offers of free copies of our market research to government committees looking into various bits of nanotechnology provoke the same response as if we’d offered them a fresh dog turd wrapped in newspaper.

And now for a complete change of pace,

Dylan Thomas and neuroscience

There’s an event tonight (Thursday, March 25, 2010) in Vancouver being put on by the Dylan Thomas Circle (he lived in North Vancouver for a time as he worked on Under the Volcano). It’s being held at the Red Dragon Pub at the Cambrian Hall at 17th & Main St. Doors open at 6:45 pm and the presentation starts at 7:30 pm, followed by an open reading. From the news release,

THE DYLAN THOMAS CIRCLE OF VANCOUVER presents

“Dylan Thomas, Creativity and Neuroscience”

Ariadne Sawyer will lead an exploration into creativity and the creative process as manifest through the works and the life of Dylan Thomas. She will investigate why we are creative, what happens during the creative process and what effect it has upon us.

This will be followed by an intermission and an: ‘OPEN READING’: an invitation to everyone who is interested to read aloud a poem or literary excerpt of their choice. This can be your own work, Dylan’s work or any other writer’s material. Most importantly, it is our chance to indulge in a little of our own creativity and to do it in a relaxed and in a friendly atmosphere.

About Ariadne Sawyer:

Ariadne has done on line Performance Plus Coaching with trainees from England, France, Canada and the United States for the last two years. She has received the Award of Excellence given by McLean-Hunter for the Brain Bulletin Series. Ariadne publishes an electronic newsletter called: Ariadne’s Performance Plus Newsletter along with Performance Plus Tips which are sent to all the participating trainees. She also co-hosts a weekly radio program on CFRO 102.7 FM, which has been on the air for the past two years. The Performance Plus Mini Course has been presented on the show with astounding success. She has two electronic courses available soon on the Internet. Performance Plus Level One and the Performance Plus Diplomacy Course. Ariadne has worked with trainees from Europe, the US and across Canada.

Science festivals in the US; nanoparticles and environmental health and safety report from ENRHES; new technique in molecular biology; PEN’s site remediation webcast commentary

I just came across a notice for the first ever USA Science and Engineering Festival to be held in Washington, DC, Oct. 10-24, 2010. From the Azonano news item,

Agilent Technologies Inc. (NYSE:A) today announced its support of the USA Science & Engineering Festival, the country’s first national science festival. The event will take place in Washington, D.C., in October 2010. The festival, expected to be a multi-cultural and multi-disciplinary celebration of science in the United States, will offer science and engineering organizations throughout the country the opportunity to present hands-on science activities to inspire the next generation of scientists and engineers. Festival organizers already have engaged more than 350 participants from the nation’s leading science and engineering organizations.

From what I’ve seen of their website, they are using the term multi-disciplinary in a fairly conservative sense, i.e., different science and engineering disciplines are being brought together. This contrasts with the approach used in the World Science Festival, being held in New York, June 2-6, 2010, where they mash together artists as well as scientists from many different disciplines.

Michael Berger at Nanowerk sputters a bit as he comments on the Engineered Nanoparticles Review of Health and Environmental Safety (ENRHES) report,

Before we take a look at the report’s findings, it’s quite remarkable that the authors feel compelled to start their introduction section with this sentence: “Nanotechnology is a sector of the material manufacturing industry that has already created a multibillion $US market, and is widely expected to grow to 1 trillion $US by 2015.” Firstly, a lot of people would argue with the narrow definition of nanotechnology as being a sector of the material manufacturing industry. Secondly, it appears that still no publicly funded report can afford to omit the meaningless and nonsensical reference to a ‘trillion dollar industry by 2015’. It really is astonishing how this claim gets regurgitated over and over again – even by serious scientists – without getting scrutinized (read “Debunking the trillion dollar nanotechnology market size hype”). It would be interesting to know if scientific authors, who otherwise operate in a fact-based world, just accept a number picked out of thin air by some consultants because it helps impress their funders; or if they deliberately use what they know is a fishy number because the politicians and bureaucrats who control the purses are easily fooled by sensational claims like these and keep the funding coming.

Sadly, picking a number out of thin air happens more often than we’d like to believe. A few years back I was reading a book about food and how it’s changing as we keep manipulating our food products to make them last longer on the shelf, etc. In one chapter of the book, the author chatted with an individual who helped to define high cholesterol. As he told the story, he and his colleagues (scientists all) got in a room and picked the number that would be used to define a high cholesterol count. (I will try to find the title of that book; unfortunately the memory escapes me at the moment. ETA: Mar. 4, 2010, the book is Last Chance to Eat by Gina Mallet, 2004.) I’ve heard variations of this business of picking a number that sounds good before.

As for the rest of the ENRHES report, Berger has this to say,

Thankfully, the rest of the report stands on solid ground.

I’m using those last two words, “solid ground,” to eventually ease my way into a discussion about site remediation and the Project on Emerging Nanotechnologies’ (PEN) recent webcast. First, there’s a brief and related item on molecular biology.

Scientists at the University of Chicago are trying to develop a method for understanding how biological processes emerge from molecular interactions. From the news item (which includes an audio file of Aaron Dinner, one of the scientists, discussing his work) on physorg.com,

Funded by a $1 million grant from the W.M. Keck Foundation, University of Chicago scientists are aiming to develop a reliable method for determining how biological processes emerge from molecular interactions. The method may permit them to “rewire” the regulatory circuitry of insulin-secreting pancreatic beta cells, which play a major role in type-2 diabetes.

A second goal: to control cell behavior and function more generally, which may ultimately culminate in other applications, including the bioremediation of environmental problems.

The four scientists [Aaron Dinner, Louis Philipson, Rustem Ismagilov, and Norbert Scherer] share an interest in the collective behavior of cells that emerges from a complex ensemble of atoms and molecules working in concert at different scales of time and space. “In a living system you have this hierarchy of coupled time and length scales,” Dinner said. “How is it that all of these different dynamics at one time and length scale get coupled to dynamics at another scale?”

In other words, how does life begin? I know that’s not the question they’re asking, but this work has to lead in that direction and I imagine the synthetic biology people are watching with much interest.

In the more immediate future, this work in molecular biology may lead to better bioremediation, which was the topic at hand in the Project on Emerging Nanotechnologies’ recent (Feb.4.10) webcast. From their website (you can click to view the webcast [approx. 54 mins.] from here),

A new review article appearing in Environmental Health Perspectives (EHP) co-authored by Dr. Todd Kuiken, research associate for the Project on Emerging Nanotechnologies (PEN), Dr. Barbara Karn, Office of Research and Development, U.S. Environmental Protection Agency and Marti Otto, Office of Superfund Remediation and Technology Innovation, U.S. Environmental Protection Agency focuses on the use of nanomaterials for environmental cleanup. It provides an overview of current practices; research findings; societal issues; potential environment, health, and safety implications; and possible future directions for nanoremediation. The authors conclude that the technology could be an effective and economically viable alternative for some current site cleanup practices, but potential risks remain poorly understood.

There is an interactive map of remediation sites available here and, if you scroll down to the bottom of the page, you’ll find a link to the review article or you can go here.

I found the information interesting although I was not the intended audience, which was primarily people who are involved in site remediation and/or are from the US. The short story is that more research needs to be done but there have been some very promising results. The main topic of discussion was the use of nanoscale zero-valent iron (nZVI) particles, which allow for ‘in situ’ site remediation; in other words, you don’t need to move soil and/or pump water through some treatment process. The technique is not appropriate for all sites, but it can be faster and cheaper than current site remediation treatments. There was no mention of any problems or hazards in using nZVI, but then there hasn’t been much research either. The technique is now being used in seven different countries (including Canada, with one site in Ontario and one in Quebec). If I understand it rightly, there is no requirement to report nanotechnology-enabled site remediation, so these numbers are based on self-reports. From the article in Environmental Health Perspectives,

The number of actual applications of nZVI is increasing rapidly. Only a fraction of the projects has been reported, and new projects show up regularly. Figure 2 and Supplemental Material, Table 2 (doi:10.1289/ehp.0900793.S1) describe 44 sites where nanoremediation methods have been tested for site remediation.

I think that’s it for today, tomorrow some news from NISENet (Nanoscale Informal Science Education Network).

Site remediation and nano materials; perspectives on risk assessment; Leonardo’s call for nano and art; a new nano art/science book

The Project on Emerging Nanotechnologies (PEN) is holding an event on site remediation on Feb. 4, 2010 (12:30 pm to 1:30 pm EST). From the news release,

A new review article appearing in Environmental Health Perspectives (EHP) co-authored by Dr. Todd Kuiken, Research Associate for the Project on Emerging Nanotechnologies (PEN), Dr. Barbara Karn, Office of Research and Development, U.S. Environmental Protection Agency and Marti Otto, Office of Superfund Remediation and Technology Innovation, U.S. Environmental Protection Agency focuses on the use of nanomaterials for environmental cleanup. It provides an overview of current practices; research findings; societal issues; potential environment, health, and safety implications; and possible future directions for nanoremediation. The authors conclude that the technology could be an effective and economically viable alternative for some current site cleanup practices, but potential risks remain poorly understood.

PEN’s Contaminated Site Remediation: Are Nanomaterials the Answer? features the EHP article’s authors Kuiken, Karn, and Otto on a panel, with David Rejeski, PEN’s executive director, moderating. PEN also has a map detailing almost 60 sites (mostly in the US, 2 in Canada, 4 in Europe, and 1 in Taiwan) where nanomaterials are being used for remediation. More from the news release,

According to Dr. Kuiken, “Despite the potentially high performance and low cost of nanoremediation, more research is needed to understand and prevent any potential adverse environmental impacts, particularly studies on full-scale ecosystem-wide impacts. To date, little research has been done.”

In its 2004 report Nanoscience and nanotechnologies: opportunities and uncertainties, the British Royal Society and Royal Academy of Engineering recommended that the use of free manufactured nanoparticles be prohibited for environmental applications such as remediation until further research on potential risks and benefits had been conducted. The European Commission’s Scientific Committee on Emerging and Newly Identified Health Risks (SCENIHR) called for further risk research in 2005 while acknowledging environmental remediation technology as one of nanotechnology’s potential benefits.

If you wish to attend in person (i.e. you are in Washington, DC), you are asked to RSVP here (they provide a light lunch starting at 12 pm) or you can watch the webcast (no RSVP necessary and I will put up a link to the webcast closer to the date).

On the topic of risk, Michael Berger has written an in-depth piece about a recently published article, Redefining risk research priorities for nanomaterials, in the Journal of Nanoparticle Research. From Berger’s piece,

While research in quantitative risk characterization of nanomaterials is crucially important, and no one advocates abandoning this approach, scientists and policy makers must face the reality that many of these knowledge gaps cannot be expected to be closed for many years to come – and decision making will need to continue under conditions of uncertainty. At the same time, current chemical-based research efforts are mainly directed at establishing toxicological and ecotoxicological and exposure data for nanomaterials, with comparatively little research undertaken on the tools or approaches that may facilitate near-term decisions.

In other words, there’s a big lag between developing new products using nanomaterials and the research needed to determine the health and environmental risks associated both with the production and use of these new materials. The precautionary principle suggests that we not produce or adopt these products until we are certain about the risks and how to ameliorate and/or eliminate them. That’s an impossible position, as we can never anticipate with any certainty what will happen when something is released to the general public or into the environment at large. From Berger’s piece,

In their article, [Khara Deanna] Grieger [PhD student at Technical University of Denmark (DTU)], Anders Baun, who heads DTU’s Department of Environmental Engineering, and Richard Owen from the Policy Studies Institute in the UK, argue that there has not yet been a significant amount of attention dedicated to the field of timely and informed decision making for near term decisions. “We see this as the central issue for the responsible emergence of nanotechnologies” says Grieger.

Getting back to site remediation using nanomaterials, since it’s already in use as per the map and the authors state that there hasn’t been enough research into risks, do we pull back and adopt the precautionary principle or do we proceed as intelligently as possible in an area where uncertainty rules? That’s a question I will continue to explore as I get my hands on more information.

On a completely different nano front, Leonardo magazine has issued a call for papers on nano and art,

2011 is the International Year of Chemistry! To celebrate Leonardo is seeking to publish papers and artworks on the intersections of chemistry, nanotechnology and art for our on-going special section on nanotechnology and the arts. Since its inception nanotech/science has been intimately connected to chemistry; fullerenes, nanoputians, molecular machines, nano-inorganics and self-assembling molecular systems all spring from the minds and labs of chemists, biochemists and chemical engineers. If you’re a nano-oriented chemist who is serious about art, an artist working on the molecular level, or a chemical educator exploring the mysteries of nano through the arts we are especially seeking submissions from you.

You can send proposals, queries, and/or manuscripts to the Leonardo editorial office: leonardomanuscripts@gmail.com. You can read more about the call for papers here at Leblogducorps or you can go here to the Leonardo online journal.

Meanwhile, Andrew Maynard at 2020 Science is posting about a new book which integrates artwork in an attempt to explain nanotechnology without ever mentioning it. From Andrew’s posting,

How do you write a book about something few people have heard of, and less seem interested in? The answer, it seems, is to write about something else.

Felice Frankel and George Whitesides have clearly taken this lesson to heart. Judged by the cover alone, their new book “No Small Matter: Science at the Nanoscale” is all about science in the twilight zone of the nanoscale – where stuff doesn’t behave in the way intuition says it should.

At any rate, do visit 2020 Science to read more from this posting and at least one other where Andrew has gotten permission to excerpt parts of the book (text and images).