Tag Archives: President’s Council of Advisors on Science and Technology

Report on nano EHS from the US Government Accountability Office (GAO)

According to the June 22, 2012 news item on Nanowerk, the US Government Accountability Office (GAO) has released a new report titled Nanotechnology: Improved Performance Information Needed for Environmental, Health, and Safety Research (published May 2012). From the report,

Nanotechnology involves the ability to control matter at approximately 1 to 100 nanometers. Worldwide trends suggest that products that rely on nanotechnology will be a $3 trillion market by 2020. However, some of the EHS [Environmental, Health, and Safety] impacts of nanotechnology are unknown. The NSTC [National Science and Technology Council] coordinates and oversees the NNI [National Nanotechnology Initiative], an interagency program that, among other things, develops national strategy documents for federal efforts in nanotechnology.

In this context, GAO examined: (1) changes in federal funding for nanotechnology EHS research from fiscal years 2006 to 2010; (2) the nanomaterials that NNI member agencies’ EHS research focused on in fiscal year 2010; (3) the extent to which NNI member agencies collaborate with stakeholders on this research and related strategies; and (4) the extent to which NNI strategy documents address desirable characteristics of national strategies. GAO’s review included seven NNI agencies that funded 93 percent of the EHS research dollars in fiscal year 2010. This report is based on analysis of NNI and agency documents and responses to a questionnaire of nonfederal stakeholders.

GAO recommends that the Director of the Office of Science and Technology Policy (OSTP), which administers the NSTC, (1) coordinate development of performance information for NNI EHS research needs and publicly report this information; and (2) estimate the costs and resources necessary to meet the research needs. OSTP and the seven included agencies neither agreed nor disagreed with the recommendations. [p.2 of the PDF]

This provides some interesting contrast to the National Nanotechnology Initiative’s (NNI) 4th assessment report which I wrote about in my May 2, 2012 posting,

PCAST [President’s Council of Advisors on Science and Technology] acknowledges that the NSET [Nanoscale Science, Engineering, and Technology Subcommittee, which coordinates planning, budgeting, program implementation, and review of the NNI] has acted on our recommendation to identify a central coordinator for nanotechnology-related EHS research within NNCO. The EHS coordinator has done a laudable job developing and communicating the 2011 NNI EHS research strategy. [emphasis mine] However, there is still a lack of integration between nanotechnology-related EHS research funded through the NNI and the kind of information policy makers need to effectively manage potential risks from nanomaterials. The establishment of the Emerging Technologies Interagency Policy Coordination Committee (ETIPC) through OSTP has begun to bridge that gap, but without close integration between ETIPC and the NEHI working group [Nanotechnology Environmental and Health Implications Working Group], the gap may not be sufficiently narrowed. OSTP and the NSET Subcommittee should expand the charter of the NEHI working group to enable the group to address cross-agency nanotechnology-related policy issues more broadly.

Alphabet soup, eh? The best I can gather is that the GAO report identifies gaps that the NNI itself has acknowledged (and has begun to address), as per my emphasis in the excerpt from the 4th assessment. As someone who doesn’t know the politics or have access to inside information, I find the GAO report’s recommendations much easier to understand, since the issues are laid out from a more ‘global’ (or big picture) perspective on US EHS nanotechnology research efforts. The NNI’s 4th assessment report offers more detail and, frankly, I found it more confusing.

This is the second GAO report I’ve read and, again, I love the writing and organization. (Note: I am lauding the report-writing skills.) Thank you to Frank Rusco, Dan Haas, Krista Anderson, Nirmal Chaudhary, Elizabeth Curda, Lorraine Ettaro, Alison O’Neill, Tind Shepper Ryen, Jeanette Soares, Ruth Solomon, Hai Tran, and Jack Wang.

4th assessment of the US’s National Nanotechnology Initiative (found some info. about Canada in the rept.!)

It seems there are a number of reports concerning the US National Nanotechnology Initiative and its efforts and responses to the PCAST 2010 recommendations (I commented on another of these reports in my Dec. 13, 2011 posting). This fourth report/assessment was submitted by the President’s Council of Advisors on Science and Technology (PCAST) and focuses on efforts by various government agencies to follow the recommendations from the 2010 PCAST assessment.

According to the April 27, 2012 news item on Nanowerk,

PCAST found that the Federal agencies in the NNI have made substantial progress in addressing many of the 2010 recommendations that were aimed at maintaining U.S. leadership in nanotechnology. One of the primary goals of the NNI is to stay ahead of heavily-investing competitors such as China, South Korea, the European Union, and Russia. Overall, PCAST concluded that the NNI remains a successful cooperative venture that is supporting high-quality research, facilitating the translation of discoveries into new commercial products, and ensuring the Nation’s continued global leadership in this important field.

The PCAST assessment particularly commends the expanded efforts of the NNCO [National Nanotechnology Coordination Office] in the area of commercialization and coordination with industry, and the NNCO’s release of a focused research strategy for addressing environmental, health, and safety (EHS) implications of nanotechnology. In addition, the assessment recognizes NNI’s strong and growing portfolio of research on the societal implications of nanotechnology, nanotechnology education, and public outreach.

Dexter Johnson at his Nanoclast blog on the IEEE (Institute of Electrical and Electronics Engineers) website comments in his May 1, 2012 posting,

Okay, pat on the back, job well done…uh, wait, there are still some new recommendations that PCAST would like to see addressed.  You can find them in the PDF of the full report on page vii. They fall into the areas of strategic planning, program management, metrics for assessing nanotechnology’s commercial and societal impacts, and…wait for it…increased support for EHS research.

Additional support for EHS research might be a required element for every PCAST report in the future. More interesting to me, however, is this continued emphasis on improved “metrics for assessing nanotechnology’s commercial and societal impacts.”

Dexter goes on to observe that many countries and corporations are interested in better metrics regarding nanotechnology and its impacts and hints that he has a few ideas for better metrics.

I’ve looked at the report and found, to my surprise, mention of Canada. In analyzing the US NNI efforts, the authors also compare US government and corporate funding to that in other countries. On page 14 (print version; p. 30 PDF) of the PCAST 4th Assessment of the NNI, there’s a table which shows the top 10 countries for spending on nanotechnology,

As you can see, Canadian funding has been relatively flat throughout 2008 – 2010. It appears to have decreased slightly in 2009 and remained the same in 2010.

Aside: I’d dearly love to know how they sourced their data. A couple of years ago, a Canadian Member of Parliament (Peter Julian) asked for similar figures and received some 80 pages of Excel spreadsheets from various departments listing any number of research projects that had been funded. (I’d asked Julian’s parliamentary assistant for a copy of the government’s response to his question, which is how I came to see that mess of paper.)

For anyone familiar with the Canadian scene (industrial research in Canada is rare), this next chart won’t be any surprise, from page 14 (print version; p. 30 PDF) of the PCAST 4th Assessment of the NNI,

However, this may be a surprise, from page 15 (print version; p. 31 PDF) of the PCAST 4th Assessment of the NNI,

Good grief! Canada is in the top five countries for venture capital spending on nanotechnology. Of course, we had our banner year in 2008, with quite a dip in 2009 but it looks like we rebounded mildly in 2010.

It’s always interesting for me to analyze the US nanotechnology efforts in relation to the Canadian efforts (as well as getting a sense of the international scene). Actually, I can’t analyze our efforts, since the Canadian government doesn’t tend to share information with its citizens (or provides reams of meaningless data), so I’m driven to finding it in US government documents and in materials provided by international governmental organizations such as the OECD (Organization for Economic Cooperation and Development).

Getting back to the report, which after all is about the US situation, I’m particularly interested in the recommendations for metrics (thank you, Dexter) and EHS. From page 22 (print version; p. 38 PDF) of the PCAST 4th Assessment of the NNI (I have edited out some footnotes),

Agencies should develop a mission-appropriate definition of nanotechnology that enables the tracking of specific nanotechnology investments supported at the program level. The definition and funding details should be published in agency implementation plans to promote clarity.

This recommendation enables each agency to develop a mission-appropriate definition of nanotechnology to characterize its nanotechnology portfolio. Requiring each agency to publish its definition and the resulting budget allocations will improve clarity across the Federal nanotechnology portfolio and ensure that nanotechnology investments are accurately characterized.

The NNCO should track the development of metrics for quantifying the Federal nanotechnology portfolio and implement them to assess NNI outputs.

Current Federal efforts to measure public and private investment, scientific productivity, and workforce have been inconsistent and decentralized. The publication of agency-specific data will enable the NNCO to consistently track nanotechnology investments across the Federal government and enable it to report NNI impacts with greater confidence and transparency.

There is an extensive and growing body of high-quality academic research that is already working toward the establishment of nanotechnology metrics by drawing upon bibliometrics data from the public domain (e.g., publication and patent data). … Bibliometrics data are used as indicators of productivity beyond academia, often in the absence of other metrics from the private sector. As nanotechnology continues to mature and move closer toward commercialization, efforts to more accurately capture economic returns are picking up pace. Examples include the March 2012 International Symposium on Assessing Economic Impacts of Nanotechnologies sponsored jointly by the NNI and the Organization for Economic Co-Operation and Development held in Washington, DC, [mentioned in my March 29, 2012 posting] as well as the upcoming 2012 National Research Council review of the NNI.

A final area in need of metrics development is in the quantification of the nanotechnology workforce.  [emphasis mine] Accurately categorizing agency-level nanotechnology investments will facilitate the identification of nanotechnology trainees, including the academic, scientific, and professional nanotechnology workforce for which there is currently a paucity of data…. One area where such tracking would have significant impact is in the identification of nanotechnology-related jobs for which there are no standard occu­pational codes. Good data on the workforce will enable the implementation of additional measures to identify and mitigate future threats to occupational health and safety.

PCAST recommends that NNCO serve as a central repository to collect these metrics and leverage advances in metrics-development to collect, track, and analyze data regarding publications, patents, educational activities, and the workforce to produce and publish its own statistics on behalf of the NSET. This undertaking is an integral component of cross-agency coordination of the Federal nanotechnology portfolio.

That first recommendation seems problematic. The notion of agencies developing mission-specific definitions of nanotechnology, as recommended, sets the stage for multiple and competing definitions in a situation where you want to standardize as much as possible.

Unfortunately, the alternative is not an improvement. An attempt to standardize across all agencies would most probably lead to years of meetings and discussions before anything was ever measured.

I’m not quite as confident about bibliometrics as the authors of this report are but, as they hint, oftentimes it’s the only quantifiable data available. While there is much talk about establishing other metrics, there is no hint as to how this will be done, who will do it, or whether money will be allocated for the purpose.

The recommendations for further EHS research, from pp. 22-3 (print version; pp. 38-9 PDF) of the PCAST 4th Assessment of the NNI, include (I have edited out a reference to an appendix),

The NSET should establish high-level, cross-agency authoritative and accountable governance of Federal nanotechnology-related EHS research so that the knowledge created as a result of Federal investments can better inform policy makers.

PCAST acknowledges that the NSET has acted on our recommendation to identify a central coordinator for nanotechnology-related EHS research within NNCO. The EHS coordinator has done a laudable job developing and communicating the 2011 NNI EHS research strategy. However, there is still a lack of integration between nanotechnology-related EHS research funded through the NNI and the kind of information policy makers need to effectively manage potential risks from nanomaterials. The establishment of the Emerging Technologies Interagency Policy Coordination Committee (ETIPC) through OSTP has begun to bridge that gap, but without close integration between ETIPC and the NEHI working group, the gap may not be sufficiently narrowed. OSTP and the NSET Subcommittee should expand the charter of the NEHI working group to enable the group to address cross-agency nanotechnology-related policy issues more broadly.

The NSET should increase investment in cross-cutting areas of EHS that promote knowledge transfer such as informatics, partnerships, and instrumentation development.

The 2011 NNI EHS research strategy acknowledges the critical role that informatics, partnerships, and instrumentation development play in a comprehensive approach to addressing nanotechnology risks to human health and the environment. Nascent efforts in informatics should be supported so that advances can be accelerated in this critical cross-cutting area. Rather than continue to support the proliferation of databases that results from many new nano-EHS projects, the effort should be directed at enabling diverse communities to extract meaningful information from each other’s work. New networks that connect researchers together, along with new tools for extracting information from Federally funded research, should be established and supported through the NNI. The findings of the December 2011 workshop to establish a Nanoinformatics 2020 Roadmap in conjunction with the 2011 NNI EHS research strategy can serve as a guide for new work in this area.

Significant progress has been made in the area of partnerships with numerous examples of multistakeholder and interagency collaboration underway. One of these is the Nanorelease Project, which brings together five NNI agencies, non-governmental organizations, a labor union, and several companies, among others, to develop methods for measuring the release of nanomaterials from commercial products. A specific area where better coordination could occur is in the area of occupational safety. The Occupational Safety and Health Administration (OSHA) should work with companies in a non-enforcement capacity to develop better tools for hazard communication similar to the National Institute for Occupational Safety and Health’s (NIOSH) partnership program. This is especially important as the United States seeks to bring its hazard communication standard in alignment with the Globally Harmonized System of Classification and Labeling of Chemicals. Greater engagement by OSHA would also begin to address some of the difficulties companies face in implementing good health and safety programs in their nanomaterial workplaces …

New modes of international cooperation, such as the joint funding of two environmental-impacts consortia by the EPA and the United Kingdom, have also emerged since the 2010 PCAST report. The NNI should increase funding for these cross-cutting activities to leverage the U.S. investment in nanotechnology-related EHS research.

The wealth of abbreviations makes this section a little hard to read. As I understand it, the recommendations are aimed at improving the use of current and future resources by better coordinating research efforts, sharing data (with a special eye to providing information policymakers can use effectively), and collaborating internationally on EHS research.

US National Nanotechnology Initiative reports on last year’s recommendations

Richard M. Jones at the American Institute of Physics (AIP) reports in a Dec. 9, 2011 article in the AIP Bulletin no. 145,

Members of the President’s Council of Advisors on Science and Technology [PCAST] were briefed last month on the implementation of the council’s recommendations regarding the National Nanotechnology Initiative (NNI).  Now in its tenth year, federal agencies participating in the NNI expend about $2 billion per year, having spent a cumulative $14 billion on nanotechnology R&D since its inception.

Jones summarized the presentations (here’s a sampling),

Sally Tinkle, Deputy Director of the National Nanotechnology Coordination Office was the first of four speakers in this sixty-minute briefing. … As examples, she described an increase in the number of public-private partnerships (citing examples from the NIH and NIST), outreach to states (including a full-time employee dedicated to this effort), interactions with officials from the European Union,  better information dissemination programs, and research on health, environmental, safety, ethical, and legal matters.  …

Carlos Pena, Director of Emerging Technology at the Office of Science and Health Coordination of the Food and Drug Administration was the second speaker. He described FDA’s efforts to carefully protect human health while fostering the development of nanotechnology, using science-based decision making. Among those steps it has taken is increasing training of its staff and improved coordination and cooperation with other agencies. …

Other topics covered in a concluding question-and-answer period included monthly inter-agency briefings, meetings with the European Union, products awaiting FDA approval, federal agency funding collaborations, the desirability of a multi-agency roadmap to support further development of nanotechnology, the engagement of nongovernmental stakeholders, and computational support.

You can access the webcast, briefing materials, minutes, etc. from the Nov. 2, 2011 meeting here.

You can view the webcast here.

What I find most interesting is that this particular US government administration is making a big effort at offering access and information about science matters. It seems strange to me that I rarely come across similar information from the Canadian government, which makes no great effort to let us know about their (it is most definitely theirs and not ours) science.

Nano regulatory frameworks are everywhere!

The scene around nanotechnology regulatory frameworks has been frantic in the last month or so (by comparison with any other period during the three years I’ve been blogging about nano). This is my second attempt this month at pulling together information about nanotechnology regulatory frameworks (the first being my June 9, 2011 posting).

I’ll start off slow and easy with this roundup of sorts with a brief look at the international scene, move on to US initiatives, offer a brief comment on the Canadian situation, and wrap up with Europe.

International

Dr. Andrew Maynard at the University of Michigan Risk Science Center (UMRSC) blog has written a commentary about the ISO’s (*International Organization for Standardization) latest set of nanotechnology guidelines in his May 27, 2011 posting.  From the posting,

ISO/TR 31321:2011: Nanotechnologies – Nanomaterial risk evaluation is unashamedly based on the Environmental Defense Fund/DuPont Nano Risk Framework. Much of the structure and content reflects that of the original – a testament to the thought and effort that went into the first document. …The ISO report is written in a much tighter style than that of the original document, and loses some of the occasionally long-winded expositions on what should be done and why. And the ISO document is more compact – 66 pages as opposed to 104. But from a comparative reading, surprisingly little has been changed from the 2007 document.

It’s built around a framework of six steps:

  1. describe materials and applications
  2. material profiles
  3. evaluate risks
  4. assess risk management options
  5. decide, document, and act
  6. review and adapt

From the posting,

Inherent to this framework is the need to make situation-specific decisions that are guided by the Technical Report but not necessarily prescribed by it, and the need to constantly review and revise procedures and decisions. This built-in flexibility and adaptability makes ISO/TR 31321 a powerful tool for developing tailored nanomaterial management strategies that are responsive to new information as it becomes available. It also presents an integrative approach to using materials safely, that deals with the need to make decisions under considerable uncertainty by blurring the line between risk assessment and risk management.

Andrew’s view of these guidelines is largely positive and you can get more details and history by viewing his original commentary. (I first mentioned these new ISO guidelines in my May 18, 2011 posting.)

Moving from the standards scene to insurance, a June 13, 2011 news item on Nanowerk described new general liability classifications for nanotechnology and alternative energy from a different ISO, the US-based Insurance Services Office (from the news item),

The new classifications to address the growing use of nanotechnology are Nanomaterial Distributors and Nanomaterial Manufacturing. The once-limited use of nanotechnology in electronics and information technology industries is now swiftly permeating the consumer marketplace, from cosmetics to clothing and more. The Nanomaterial Distributors classification applies to risks that sell nanomaterials to others, and the Nanomaterial Manufacturing classification applies to risks that manufacture or engineer nanomaterials for others.

“With heightened interest to reduce the carbon footprint, establish energy independence, and increase the use of renewable resources, alternative power is a priority for many,” said Beth Fitzgerald, vice president of commercial lines and modeling at ISO. “In response to the growing demand for alternative energy, ISO introduced classifications for risks in three main areas: biofuels, solar energy, and wind energy. The new classifications will allow for future evaluation of the loss experience of those emerging markets.”

The biofuels classifications consist of Biofuels Manufacturing and Biofuels Distributors. Since ethanol already has a widespread and accepted use, a further distinction is made between “ethanol” and “biofuels other than ethanol.”

The solar energy classifications include Solar Energy Farms, Solar Energy Equipment Dealers or Distributors, and Solar Energy Equipment Manufacturing. The wind energy classifications include Wind Turbine Contractors – Installation, Service, or Repair and onshore and offshore Wind Farms.

* I have for many years understood that ISO is the International Standards Organization and I see from a note on the UMRSC blog that these days it is the International Organization for Standardization.

US

On the US front, three different agencies have made announcements that in one way or another will have an impact on the nanotechnology regulatory frameworks in that country.

The White House Emerging Technologies Interagency Policy Coordination Committee (ETIPC) recently released a set of principles for the regulation and oversight of nanotechnology applications and guidance for the development and implementation of policies at the agency level. From the June 9, 2011 news item on Nanowerk,

The realization of nanotechnology’s full potential will require continued research and flexible, science-based approaches to regulation that protect public health and the environment while promoting economic growth, innovation, competitiveness, exports, and job creation.

In furtherance of those goals, the White House Emerging Technologies Interagency Policy Coordination Committee (ETIPC) has developed a set of principles (pdf) specific to the regulation and oversight of applications of nanotechnology, to guide the development and implementation of policies at the agency level.

These principles reinforce a set of overarching principles (pdf) for the regulation and oversight of emerging technologies released on March 11, 2011. They also reflect recommendations from a report on nanotechnology (pdf) by the President’s Council of Advisors on Science and Technology. The report encourages Federal support for the commercialization of nanotech products and calls for the development of rational, science- and risk-based regulatory approaches that would be based on the full array of a material’s properties and their plausible risks and not simply on the basis of size alone.

You can read more about the guidelines at Nanowerk or on the Environmental Expert website here.

Back over on the UMRSC blog, Dr. Andrew Maynard had these comments in his June 13, 2011 posting,

In a joint memorandum, the Office of Science and Technology Policy, the Office of Management and Budget and the Office of the United States Trade Representative laid out Policy Principles for the U.S. Decision Making Concerning Regulations and Oversight of Applications of Nanotechnology and Nanomaterials.

Reading through it, a number of themes emerge, including:

  • Existing regulatory frameworks provide a firm foundation for the oversight of nanomaterials, but there is a need to respond to new scientific evidence on potential risks, and to consider administrative and legal modifications to the regulatory landscape should the need arise.
  • Regulatory action on nanomaterials should be based on scientific evidence of risk, and not on definitions of materials that do not necessarily reflect the evidence-based likelihood of a material causing harm.
  • There should be no prior judgement on whether nanomaterials are intrinsically benign or harmful, in the absence of supporting scientific evidence.
  • Transparency and communication are important to ensuring effective evidence-based regulation.

Overall, this is a strong set of policy principles that lays the groundwork for developing regulation that is grounded in science and not swayed by speculative whims, and yet is responsive and adaptive to emerging challenges. Gratifyingly, the memorandum begins to touch on some of the concerns I have expressed previously about approaches to nanomaterial regulation that seem not to be evidence-based. There is a reasonable chance that they will help move away from the dogma that engineered nanomaterials should be regulated separately because they are new, to a more nuanced and evidence-based approach to ensuring the safe use of increasingly sophisticated materials. Where it perhaps lacks is in recognizing the importance of other factors in addition to science in crafting effective regulation, and in handling uncertainty in decision making.

June 9, 2011 was quite the day as in addition to the White House documents, the US Environmental Protection Agency (EPA) and the US Food and Drug Administration (FDA) both announced public consultations on nanotechnology regulation.

From the June 9, 2011 news item on Nanowerk about the US EPA public consultation,

The U.S. Environmental Protection Agency announced today it plans to obtain information on nanoscale materials in pesticide products. Under the requirements of the law, EPA will gather information on what nanoscale materials are present in pesticide products to determine whether the registration of a pesticide may cause unreasonable adverse effects on the environment and human health. The proposed policy will be open for public comment.

“We want to obtain timely and accurate information on what nanoscale materials may be in pesticide products, “said Steve Owens assistant administrator for EPA’s Office of Chemical Safety and Pollution Prevention. “This information is needed for EPA to meet its requirement under the law to protect public health and the environment.”

Comments on the Federal Register notice will be accepted until 30 days after publication. The notice will be available at www.regulations.gov in docket number EPA–HQ–OPP–2010-0197. More information or to read the proposed notice: http://www.epa.gov/pesticides/regulating/nanotechnology.html [Pesticides; Policies Concerning Products Containing Nanoscale Materials; Opportunity for Public Comment]

The US FDA has taken a more complicated approach to its public consultation with two notices being issued about the same consultation. The June 9, 2011 news item on Nanowerk had this to say,

The U.S. Food and Drug Administration today released draft guidance to provide regulated industries with greater certainty about the use of nanotechnology, which generally involves materials made up of particles that are at least one billionth of a meter in size. The guidance outlines the agency’s view on whether regulated products contain nanomaterials or involve the application of nanotechnology.

The draft guidance, “Considering Whether an FDA-Regulated Product Involves the Application of Nanotechnology”, is available online and open for public comment. It represents the first step toward providing regulatory clarity on the FDA’s approach to nanotechnology.

Specifically, the agency named certain characteristics – such as the size of nanomaterials used and the exhibited properties of those materials – that may be considered when attempting to identify applications of nanotechnology in regulated products.

“With this guidance, we are not announcing a regulatory definition of nanotechnology,” said Margaret A. Hamburg, MD, Commissioner of Food and Drugs. “However, as a first step, we want to narrow the discussion to these points and work with industry to determine if this focus is an appropriate starting place.”

Then there was a June 15, 2011 news item on Nanowerk offering more details about the draft guidance announcement of June 9, 2011,

The guidelines list things that might be considered when deciding if nanotechnology was used on a product regulated by FDA—including the size of the nanomaterials that were used, and what their properties are.

And FDA wants industry leaders and the public to weigh-in.

Nanotechnology—the science of manipulating materials on a scale so small that it can’t be seen with a regular microscope—could have a broad range of applications, such as increasing the effectiveness of a particular drug or improving the packaging of food or cosmetics. “Nanotechnology is an emerging technology that has the potential to be used in a broad array of FDA-regulated medical products, foods, and cosmetics,” says Carlos Peña, director of FDA’s emerging technology programs. “But because materials in the nanoscale dimension may have different chemical, physical, or biological properties from their larger counterparts, FDA is monitoring the technology to assure such use is beneficial.”

In other words, using nanotechnology can change the way a product looks or operates, Peña says.

Although the technology is still evolving, it’s already in use as display technology for laptop computers, cell phones, and digital cameras. In the medical community, a number of manufacturers have used nanotechnology in:

  • Drugs
  • Medical imaging
  • Antimicrobial materials
  • Medical devices
  • Sunscreens

Andrew Maynard, in his previously noted June 13, 2011 posting on the UMRSC blog, had this to say about the EPA’s draft document,

This is a long and somewhat convoluted document, that spends some time outlining what the agency considers is an engineered nanomaterial, and reviewing nanomaterial hazard data.

Reading the document, EPA still seems somewhat tangled up with definitions of engineered nanomaterials. After outlining conventional attributes associated with engineered nanomaterials, including structures between ~1 – 100 nm and unique or novel properties, the document states,

“These elements do not readily work in a regulatory context because of the high degree of subjectivity involved with interpreting such phrases as “unique or novel properties” or “manufactured or engineered to take advantage of these properties” Moreover the contribution of these subjective elements to risk has not been established.”

This aligns with where my own thinking has been moving in recent years. Yet following this statement, the document reverts back to considering nanoparticles between 1 – 100 nm as the archetypal nanomaterial, and intimates “novel” properties such as “larger surface area per unit volume and/or quantum effects” as raising new risk concerns.

Canadian segue

I’ll point out here that Health Canada’s Interim Policy definition also adheres to the 1 to 100 nm definition for a nanomaterial, a concern I expressed in my submission to the public consultation held last year. Interestingly, although 29 submissions hardly seems a daunting number to read, there has yet to be any public response to them, not even a list of which agencies and individuals made submissions.

Back to US

Andrew also comments on the FDA document,

The FDA Guidance for Industry: Considering Whether an FDA-Regulated Product Involves the Application of Nanotechnology is a very different kettle of fish to the EPA document. It is overtly responsive to the White House memo; it demonstrates a deep understanding of the issues surrounding nanotechnology and regulation; and it is mercifully concise.

To be fair, the scope of the draft guidance is limited to helping manufacturers understand how the agency is approaching nanotechnology-enabled products under their purview. But this is something it does well.

One of the more significant aspects of the guidance is the discussion on regulatory definitions of nanomaterials. Following a line of reasoning established some years ago, the agency focuses on material properties rather than rigid definitions:

“FDA has not to date established regulatory definitions of “nanotechnology,” “nanoscale” or related terms… Based on FDA’s current scientific and technical understanding of nanomaterials and their characteristics, FDA believes that evaluations of safety, effectiveness or public health impact of such products should consider the unique properties and behaviors that nanomaterials may exhibit”

I recommend reading the full text of Andrew’s comments.

Europe

Meanwhile, there was a June 10, 2011 news item on Nanowerk about the availability of 28 presentations from a May 10-12, 2011 joint European workshop hosted by the Engineered NanoParticle Risk Assessment (ENPRA) FP (Framework Programme) 7 project and the European Commission’s Joint Research Centre. From the news item about the Challenges of Regulation and Risk Assessment of Nanomaterials workshop,

Twenty-eight presentations delivered at the Joint JRC Nano event and 2nd ENPRA Stakeholders Workshop are now available on-line: ENPRA Workshop 2011 – Programme with Presentations.

The workshop (by invitation only) involved about 90 participants, from industry, government, NGOs, and academia. …

During two days and a half, 34 experts from 26 different organisations informed the participants on the latest scientific progress in the field of nanoparticles risk assessment produced within national and European projects, and first results of ENPRA FP7 project were presented in detail. In addition, recent developments concerning legislation in the EU and beyond were discussed.

Amongst other participants, you can include representatives of EU Associate and Candidate Countries, environment and workers’ protection organisations, CAIQ (Chinese Academy of Inspection and Quarantine), US-EPA, ECHA, and EFSA.

To close this piece (and I want to do that very badly), I’m going to give Tim Harper at his TNT blog (on the Cientifica website) the final word from his June 10, 2011 posting,

The White House Emerging Technologies Interagency Policy Coordination Committee (ETIPC) has developed a set of principles (pdf) specific to the regulation and oversight of applications of nanotechnology, to guide the development and implementation of policies at the agency level.

I’m glad to see that it addresses those two old bugbears, the confusion between risk and hazard and the prejudging of issues without reference to scientific evidence …

It is an approach which appears to diverge slightly from the European adoption of the precautionary principle …

As with any regulation, the problems will arise not from the original wording, but through its (mis)interpretation and inconsistent application.

Comments on the Golden Triangle workshop for PCAST’s PITAC

I didn’t catch the entire webcast as it was live streaming but what I caught was fascinating to observe. For those who don’t know, PCAST is the US President’s Council of Advisors on Science and Technology and PITAC is the President’s Innovation and Technology Advisory Committee. This morning they held a workshop mentioned in yesterday’s posting here that was focused on innovation in the US regarding information technology, nanotechnology, and biotechnology (the Golden Triangle). You can go to the PCAST website for information about this morning’s workshop and hopefully find a copy of the webcast once they’ve posted it.

A few items from the webcast caught my attention, such as a comment by Judith Estrin (invitee and businesswoman). She talked about a laboratory gap (aka the valley of death), referencing the loss of large industrial labs such as Bell Labs, where, as of August 2008, the focus shifted from basic science to more easily commercialized applications.

I think there’s a significant difference between doing basic research in an academic environment and doing it in an industrial environment. I believe what Estrin is referencing is the support an industrial laboratory can offer a scientist who wants to pursue an avenue of basic research which might not find initial support within the academic structure and/or ongoing support as it makes its arduous way to commercialization.

With the loss of a number of large laboratories, start-up companies are under pressure to fill the gap but they have a big problem trying to support that interstitial space between basic research and applied research as they don’t have sufficient capitalization.

The similarity to the Canadian situation with its lack of industrial laboratories really caught my attention.

Franco Vitiliano, President and CEO of ExQor Technologies Inc., reiterated a point made both earlier and later in the session about the interdisciplinary nature of the work and the difficulty of operating in a business environment that is suspicious of and/or fails to understand that kind of work. I was captivated by his story about bio-nanolasers and how these were developed from an observation made about water drops.

Anita Goel, Chairman and CEO of Nanobiosym Inc., noted that another financing problem lies with current financial models, which are increasingly short-term focused and risk-averse. As well, the current venture capital model is designed to support one technology application for one market. That model fits poorly with the interdisciplinary work currently taking place across biotechnology, nanotechnology, and information technology, where applications are being considered for multiple markets.

There were many astute and interesting speakers. I can’t always remember who said what and sometimes I couldn’t see the person’s placard so I apologize if I’ve wrongly attributed some of the comments. If someone could correct me, I’d be more than happy to edit the changes in.

I was surprised that there were no individuals from the venture capital community or representatives from some of the large companies such as HP Labs, IBM, etc. Most of the start-ups represented at the meeting came from the biomedical sector. I did not hear anyone discuss energy, clean water, site remediation, or other such applications. As far as I could tell, there weren’t any nongovernmental organizations present either. Nonetheless, it was a very crowded table and I imagine that more people would have necessitated a much longer session.

I found the webcast stimulating, but the acid test for this meeting and others of its type is always whether or not action is taken.

As for the Canadian situation with its ‘innovation gap’, there’s more in Rob Annan’s posting, Research policy odds and sods, where he highlights a number of recent articles about Canadian innovation laced with some of his observations. It’s a good roundup of the latest and I encourage you to check it out.

ETA June 23 2010: Dexter Johnson at Nanoclast offers his thoughts on the webcast and notes that while the promotional material suggested a discussion about public engagement, the workshop itself was focused on the ‘innovation gap’. He highlights comments from speakers I did not mention, as well as some of the questions received via Facebook and Twitter. For someone who doesn’t have the time to sit through the webcast, I strongly suggest that you check out Dexter’s posting as he adds insight borne of more intimate knowledge than mine of the US situation.

Measuring professional and national scientific achievements; Canadian science policy conferences

I’m going to start with an excellent study about publication bias in science papers and careerism that I stumbled across this morning on physorg.com (from the news item),

Dr [Daniele] Fanelli [University of Edinburgh] analysed over 1300 papers that declared to have tested a hypothesis in all disciplines, from physics to sociology, the principal author of which was based in a U.S. state. Using data from the National Science Foundation, he then verified whether the papers’ conclusions were linked to the states’ productivity, measured by the number of papers published on average by each academic.

Findings show that papers whose authors were based in more “productive” states were more likely to support the tested hypothesis, independent of discipline and funding availability. This suggests that scientists working in more competitive and productive environments are more likely to make their results look “positive”. It remains to be established whether they do this by simply writing the papers differently or by tweaking and selecting their data.

I was happy to find out that Fanelli’s paper has been published in PLoS [Public Library of Science] ONE, an open access journal. From the paper [numbers in square brackets are citations found at the end of the published paper],

Quantitative studies have repeatedly shown that financial interests can influence the outcome of biomedical research [27], [28] but they appear to have neglected the much more widespread conflict of interest created by scientists’ need to publish. Yet, fears that the professionalization of research might compromise its objectivity and integrity had been expressed already in the 19th century [29]. Since then, the competitiveness and precariousness of scientific careers have increased [30], and evidence that this might encourage misconduct has accumulated. Scientists in focus groups suggested that the need to compete in academia is a threat to scientific integrity [1], and those guilty of scientific misconduct often invoke excessive pressures to produce as a partial justification for their actions [31]. Surveys suggest that competitive research environments decrease the likelihood to follow scientific ideals [32] and increase the likelihood to witness scientific misconduct [33] (but see [34]). However, no direct, quantitative study has verified the connection between pressures to publish and bias in the scientific literature, so the existence and gravity of the problem are still a matter of speculation and debate [35].

Fanelli goes on to describe his research methods and how he came to his conclusion that the pressure to publish may have a significant impact on ‘scientific objectivity’.

This paper provides an interesting counterpoint to a discussion about science metrics or bibliometrics taking place on (the journal) Nature’s website here. It was stimulated by Julia Lane’s recent article titled, Let’s Make Science Metrics More Scientific. The article is open access and comments are invited. From the article [numbers in square brackets refer to citations found at the end of the article],

Measuring and assessing academic performance is now a fact of scientific life. Decisions ranging from tenure to the ranking and funding of universities depend on metrics. Yet current systems of measurement are inadequate. Widely used metrics, from the newly-fashionable Hirsch index to the 50-year-old citation index, are of limited use [1]. Their well-known flaws include favouring older researchers, capturing few aspects of scientists’ jobs and lumping together verified and discredited science. Many funding agencies use these metrics to evaluate institutional performance, compounding the problems [2]. Existing metrics do not capture the full range of activities that support and transmit scientific ideas, which can be as varied as mentoring, blogging or creating industrial prototypes.

The range of comments is quite interesting; I was particularly taken by something Martin Fenner said,

Science metrics are not only important for evaluating scientific output, they are also great discovery tools, and this may indeed be their more important use. Traditional ways of discovering science (e.g. keyword searches in bibliographic databases) are increasingly superseded by non-traditional approaches that use social networking tools for awareness, evaluations and popularity measurements of research findings.

(Fenner’s blog along with more of his comments about science metrics can be found here. If this link doesn’t work, you can get to Fenner’s blog by going to Lane’s Nature article and finding him in the comments section.)

There are a number of issues here: how do we measure scientific work (citations in other papers?), how do we define the impact of that work (do we use social networks?), and, if we do use social networks, how do we measure impact there?
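For concreteness, the Hirsch index mentioned in Lane’s article is a simple computation: a researcher has index h if h of their papers each have at least h citations. A minimal sketch (the citation counts are invented purely for illustration):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:   # the paper at this rank still clears the bar
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations
print(h_index([100, 2, 1]))       # 2: one blockbuster paper barely moves it
```

The second example hints at one of the flaws critics point to: a single heavily cited paper scarcely moves the index, which captures only a narrow slice of a scientist’s output.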

Now, I’m going to add timeline as an issue. Over what period of time are we measuring the impact? I ask the question because of the memristor story. Dr. Leon Chua wrote a paper in 1971 that, apparently, didn’t receive all that much attention at the time but was cited in a 2008 paper which received widespread attention. Meanwhile, Chua had continued to theorize about memristors in a 2003 paper that received so little attention that he abandoned plans to write part 2. Since the recent burst of renewed interest in the memristor and his 2003 paper, Chua has decided to follow up with part 2, hopefully some time in 2011 (as per this April 13, 2010 posting). There’s one more piece to the puzzle: an earlier paper by F. Argall. From Blaise Mouttet’s April 5, 2010 comment here on this blog,

In addition HP’s papers have ignored some basic research in TiO2 multi-state resistance switching from the 1960’s which disclose identical results. See F. Argall, “Switching Phenomena in Titanium Oxide thin Films,” Solid State Electronics, 1968.
http://pdf.com.ru/a/ky1300.pdf

[ETA: April 22, 2010: Blaise Mouttet has provided a link to an article which provides more historical insight into the memristor story. http://knol.google.com/k/memistors-memristors-and-the-rise-of-strong-artificial-intelligence#]

How do you measure, or even track, all of that, short of some science writer taking the time to pursue the story and write a nonfiction book about it?

I’m not counselling that the process be abandoned, but since it seems that people are revisiting the issues, it’s an opportune time to get all the questions on the table.

As for its importance, this process of trying to establish better and new science metrics may seem irrelevant to most people but it has a much larger impact than even the participants appear to realize. Governments measure their scientific progress by touting the number of papers their scientists have produced, amongst other measures such as patents. Measuring the number of published papers has an impact on how governments want to be perceived internationally and within their own borders. Take, for example, something which has both international and national impact: the recent US National Nanotechnology Initiative (NNI) report to the President’s Council of Advisors on Science and Technology (PCAST). The NNI used the number of papers published as a way of measuring the US’s possibly eroding leadership in the field. (China published about 5000 while the US published about 3000.)

I don’t have much more to say other than I hope to see some new metrics.

Canadian science policy conferences

We have two such conferences and both are two years old in 2010. The first one is being held in Gatineau, Québec, May 12 – 14, 2010. Called Public Science in Canada: Strengthening Science and Policy to Protect Canadians [ed. note: protecting us from what?], the target audience for the conference seems to be government employees. David Suzuki (tv host, scientist, environmentalist, author, etc.) and Preston Manning (ex-politico) will be co-presenting a keynote address titled: Speaking Science to Power.

The second conference takes place in Montréal, Québec, Oct. 20-22, 2010. It’s being produced by the Canadian Science Policy Centre. Other than a notice on the home page, there’s not much information about their upcoming conference yet.

I did note that Adam Holbrook (aka J. Adam Holbrook) is both speaking at the May conference and is an advisory committee member for the folks who are organizing the October conference. At the May conference, he will be participating in a session titled: Fostering innovation: the role of public S&T. Holbrook is a local (to me) professor as he works at Simon Fraser University, Vancouver, Canada.

That’s all for today.

The memristor rises; commercialization and academic research in the US; carbon nanotubes could be made safer than we thought

In 2008, two memristor papers were published in Nature and Nature Nanotechnology, respectively. In the first (Nature, May 2008 [article still behind a paywall]), a team at HP Labs claimed they had proved the existence of memristors (a fourth member of electrical engineering’s ‘Holy Trinity of the capacitor, resistor, and inductor’). In the second paper (Nature Nanotechnology, July 2008 [article still behind a paywall]), the team reported that they had achieved engineering control.

I mention this because (a) there’s some new excitement about memristors and (b) I love the story (you can read my summary of the 2008 story here on the Nanotech Mysteries wiki).

Unbeknownst to me in 2008, there was another team, located in Japan, whose work on slime mould inspired research by a group at the University of California San Diego (UC San Diego), which confirmed theorist Leon Chua’s intuition (he first suggested memristors existed in 1971) that biological organisms use memristive systems to learn. From an article (Synapse on a Chip) by Surfdaddy Orca on the HPlus magazine site,

Experiments with slime molds in 2008 by Tetsu Saisuga at Hokkaido University in Sapporo sparked additional research at the University of California, San Diego by Max Di Ventra. Di Ventra was familiar with Chua’s work and built a memristive circuit that was able to learn and predict future signals. This ability turns out to be similar to the electrical activity involved in the ebb and flow of potassium and sodium ions across cellular membranes: synapses altering their response according to the frequency and strength of signals. New Scientist reports that Di Ventra’s work confirmed Chua’s suspicions that “synapses were memristors.” “The ion channel was the missing circuit element I was looking for,” says Chua, “and it already existed in nature.”

Fast forward to 2010: a team at the University of Michigan led by Dr. Wei Lu has shown how synapses behave like memristors (published in Nano Letters, DOI: 10.1021/nl904092h [article behind paywall]). (From the HPlus site article)

Scientific American describes a US military-funded project that is trying to use the memristor “to make neural computing a reality.” DARPA’s Systems of Neuromorphic Adaptive Plastic Scalable Electronics Program (SyNAPSE) is funded to create “electronic neuromorphic machine technology that is scalable to biological levels.”

I’m not sure if the research in Michigan and elsewhere is being funded by DARPA (the US Dept. of Defense’s Defense Advanced Research Project Agency) although it seems likely.

In the short term, scientists talk about energy savings (no need to reboot your computer when you turn it back on). In the longer term, they talk about hardware being able to learn. (Thanks to the Foresight Institute for the latest update on the memristor story and the pointer to HPlus.) Do visit the HPlus site as there are some videos of scientists talking about memristors and additional information (there’s yet another team working on research that is tangentially related).
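The device behaviour behind all this excitement can be sketched in a few lines. The toy simulation below uses the linear ion drift model HP Labs published to describe their device; the model equations are the standard ones, but the parameter values and the sinusoidal drive here are my own arbitrary choices for illustration. The point is that the resistance depends on how much charge has flowed through the device, which is the “memory” in memristor:

```python
import math

# Linear ion drift memristor model (HP Labs, 2008); parameter values
# below are illustrative demonstration choices, not taken from the papers.
R_ON, R_OFF = 100.0, 16_000.0   # resistance when fully doped / undoped (ohms)
D = 10e-9                       # device thickness (m)
MU = 1e-14                      # dopant mobility (m^2 / (V s))

dt, steps = 1e-5, 20_000
w = 0.1 * D                     # state variable: width of the doped region
resistances = []
for k in range(steps):
    v = math.sin(2 * math.pi * 10 * k * dt)    # 10 Hz sinusoidal drive
    m = R_ON * (w / D) + R_OFF * (1 - w / D)   # state-dependent resistance
    i = v / m
    w += MU * (R_ON / D) * i * dt              # ion drift moves the boundary
    w = min(max(w, 0.0), D)                    # state stays within the device
    resistances.append(m)

# Unlike a plain resistor, the resistance drifts with the charge that
# has flowed, and the state persists when the drive is removed.
print(min(resistances), max(resistances))
```

Plotting current against voltage for a run like this would show the pinched hysteresis loop that is the memristor’s signature, and the persistence of `w` with the power off is the “no reboot needed” property mentioned above.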

Commercializing academic research in US

Thanks to Dave Bruggeman at the Pasco Phronesis blog, who’s posted some information about a White House Request for Information (RFI) on commercializing academic research. This is of particular interest not just because of the discussion about innovation in Canada but also because of the US National Nanotechnology Initiative’s report to PCAST (President’s Council of Advisors on Science and Technology; my comments about the webcast of the proceedings are here). From the Pasco Phronesis posting about the NNI report,

The report notes that the U.S. continues to have a strong nanotechnology sector and corresponding support from the government. However, as with most other economic and research sectors, the rest of the world is catching up, or spending enough to try and catch up to the United States.

According to the report, more attention needs to be paid to commercialization efforts (a concern not unique to nanotechnology).

I don’t know how long the White House’s RFI has been under development but it was made public at the end of March 2010 just weeks after the latest series of reports to PCAST. As for the RFI itself, from the Pasco Phronesis posting about it,

The RFI questions are organized around two basic concerns:

  • Seeking ideas for supporting the commercialization and diffusion of university research. This would include best practices, useful models, metrics (with evidence of their success), and suggested changes in federal policy and/or research funding. In addition, the RFI is interested in how commercialization ecosystems can be developed where none exist.
  • Collecting data on private proof of concept centers (POCCs). These entities seek to help get research over the so-called “Valley of Death” between demonstrable research idea and final commercial product. The RFI is looking for similar kinds of information as for commercialization in general: best practices, metrics, underlying conditions that facilitate such centers.

I find the news of this RFI a little surprising since I had the impression that commercialization of academic research in the US is far more advanced than it is here in Canada. Mind you, that impression is based on a conversation I had with a researcher a year ago who commented that his mentor at a US university rolled out more than one start-up company every year. As I understand it, researchers in Canada may start one or two companies in their careers but never a series of them.

Carbon nanotubes, is exposure ok?

There’s some new research which suggests that carbon nanotubes can be broken down by an enzyme. From the news item on Nanowerk,

A team of Swedish and American scientists has shown for the first time that carbon nanotubes can be broken down by an enzyme – myeloperoxidase (MPO) – found in white blood cells. Their discoveries are presented in Nature Nanotechnology (“Carbon nanotubes degraded by neutrophil myeloperoxidase induce less pulmonary inflammation”) and contradict what was previously believed, that carbon nanotubes are not broken down in the body or in nature. The scientists hope that this new understanding of how MPO converts carbon nanotubes into water and carbon dioxide can be of significance to medicine.

“Previous studies have shown that carbon nanotubes could be used for introducing drugs or other substances into human cells,” says Bengt Fadeel, associate professor at the Swedish medical university Karolinska Institutet. “The problem has been not knowing how to control the breakdown of the nanotubes, which can cause unwanted toxicity and tissue damage. Our study now shows how they can be broken down biologically into harmless components.”

I believe they tested single-walled carbon nanotubes (CNTs) only, as the person who wrote the news release seems unaware that multi-walled CNTs also exist. In any event, this could be very exciting if the research holds up under further testing.

PCAST report; University of Alberta claims leadership in providing nanotech facilities for undergrad students; a securities analysis and innovation in Canada; Mar.10.10 UK debate; science songs

Triumph! After a technical glitch or two, I was able to watch the live stream of the presentation by the National Nanotechnology Initiative’s (NNI) representatives, Maxine Savitch and Ed Penhoet, to the President’s Council of Advisors on Science and Technology on Friday, March 12, 2010. The short story (and it’s the same one for every agency): please keep funding us and, please sir, we’d like more. (Oliver Twist reference in that last bit)

More seriously, I was impressed that they adopted a measured approach regarding basic vs. commercialization funding needs and regarding competition for leadership in nanotechnology (the US vs. the rest of the world). There was an acknowledgment that the NNI is ten years old, and from there they launched into the need for funding to commercialize nanotechnology while maintaining their commitment to basic science research. They noted that the US is a leader in nanotechnology but that its leadership is eroding as countries in Europe and Asia, in particular, devote more attention and resources to nanotechnology research.

Surprisingly, they first singled out Germany as a nanotechnology leader; usually it is China that is singled out first as a competitor (by international organizations and other jurisdictions as well as the US) because of its extraordinarily fast rise to the top three or five, depending on what you’re measuring, in nanotechnology research. I think this strategy worked well as it expanded the notion of competition between the US and a single country to emphasize the global aspect of the nanotechnology endeavour and the need for a range of strategies.

I had another surprise while watching the live stream when they discussed strategies for retaining students who study for advanced degrees in the US and return to their home countries on completion. There was talk of stapling a “green card” (permission to work in the US) to the graduate diploma although one member of the council hastened to suggest that they only wanted the “right” kinds of advanced degrees. Presumably the council member did not want to encourage experts with advanced degrees in medieval Italian poetry and other such frippery to remain in the US.

There was considerable concern (which led to a recommendation) about the scarcity of data on commercialization, i.e., the true value of the nanotechnology aspect of a product and its benefits.

Mention was made of risks and hazards with the recommendation that research needs to be focused on defining a path for commercialization and on developing a regulatory framework.

Nanoclast (IEEE blogger), Dexter Johnson, has also commented here on the March 12, 2010 PCAST presentation, if you want another perspective.

The folks at Edmonton’s University of Alberta are doing a little chest beating about the nanotechnology research facilities they make available for undergraduate students. From Elise Stolte’s article in the Edmonton Journal,

In a small, windowless room at the University of Alberta, a dozen undergraduate students sit in the middle of $2-million worth of new equipment sensitive enough to measure an atom, the smallest particle of matter.

It’s the first place in Canada where students not yet finished their first degree can start running real experiments on the nano scale, lab co-ordinator Ben Bathgate said.

Massachusetts Institute of Technology and California’s Stanford University have undergraduate labs that come close, “but they don’t have the range of equipment,” he said.

It’s fragile, state-of-the-art, and so new that one of the 18 machines still has parts in bubble wrap.

I don’t really care whether or not the equipment is better than what they have at Stanford and MIT; I’m just glad to see that an effort is being made to provide students with facilities so they can learn and participate in some exciting and cutting-edge research. This is only part of the picture. Tim Harper over at TNT Log comments on a recent report (Vision for UK Research by the Council for Science and Technology) in his post titled A Concerted Effort to Save British Science,

… there is also a need to start thinking about science in a different way. In fact we really need to look at the whole process of scientific innovation from primary education to technology funding.

This is a holistic approach to the entire endeavour and means that students won’t be left with a degree or certificate and nowhere to go, which leads me to the topic of innovation.

I’ve commented before on innovation in Canada and the fact that there is general agreement that established businesses don’t spend enough money on R&D (research and development). There is an eye-opening study by Mary J. Benner of The Wharton School which provides what may be some insight into the situation. From the news item on physorg.com,

The reluctance of securities analysts to recommend investment in veteran companies using new techniques to grapple with radical technological change may be harming these companies as they struggle to compete, according to a new study in the current issue of Organization Science, a journal of the Institute for Operations Research and the Management Sciences (INFORMS).

The findings suggest that management teams contemplating bold innovation and the adoption of radical technological change may be held back by conservative investment firms that reward firms that stick to their knitting by extending existing technologies.

“This may be short-sighted,” says Dr. Benner. “Existing companies may be rewarded in the short run with increased stock prices for focusing on strategies that extend the financial performance from the old technology, but they may pay later in the face of threatening technological substitutes.”

Benner’s article is behind a paywall but the news item on physorg.com does offer a good summary.

Kudos to Dr. Benner for pointing out that established companies don’t seem to get much support when they want to embrace new technologies. Benner’s discussion about Polaroid and Kodak is quite salutary. (Note: I once worked for Creo Products, a computer-to-plate technology company, which was eventually acquired by Kodak, a company which, last I heard, is now in serious financial trouble.) This study certainly provides a basis for better understanding why Canadian companies aren’t inclined to innovate much.

The Brits enjoyed the third and final debate in this series of UK Cross-Party Science Policy Debates on Tuesday, March 9, 2010. The webcast, which was live streamed from the House of Commons, is available here. At 2.5 hours, I haven’t found the time to listen past the first few minutes. Dave Bruggeman at Pasco Phronesis does provide some commentary from his perspective as a US science policy analyst.

One final bit for today: the Pasco Phronesis blog provides some videos of science songs from the Here Comes Science album by They Might Be Giants.