Regulators not prepared to manage nanotechnology risks according to survey

The focus of the survey mentioned in the heading is on the US regulatory situation regarding nanotechnology and, interestingly, much of the work was done by researchers at the University of British Columbia (UBC; Vancouver, Canada). A Dec. 19, 2013 news item on Nanowerk provides an overview,

In a survey of nanoscientists and engineers, nano-environmental health and safety scientists, and regulators, researchers at the UCSB Center for Nanotechnology in Society (CNS) and at the University of British Columbia found that those who perceive the risks posed by nanotechnology as “novel” are more likely to believe that regulators are unprepared. Representatives of regulatory bodies themselves felt most strongly that this was the case. “The people responsible for regulation are the most skeptical about their ability to regulate,” said CNS Director and co-author Barbara Herr Harthorn.

“The message is essentially,” said first author Christian Beaudrie of the Institute for Resources, Environment, and Sustainability at the University of British Columbia, “the more that risks are seen as new, the less trust survey respondents have in regulatory mechanisms. That is, regulators don’t have the tools to do the job adequately.”

The Dec. (?), 2013 University of California at Santa Barbara news release (also on EurekAlert), which originated the news item, adds this,

The authors also believe that when respondents suggested that more stakeholder groups need to share the responsibility of preparing for the potential consequences of nanotechnologies, this indicated a greater “perceived magnitude or complexity of the risk management challenge.” Therefore, they assert, not only are regulators unprepared, they need input from “a wide range of experts along the nanomaterial life cycle.” These include laboratory scientists, businesses, health and environmental groups (NGOs), and government agencies.

Here’s a link to and a citation for the paper,

Expert Views on Regulatory Preparedness for Managing the Risks of Nanotechnologies by Christian E. H. Beaudrie, Terre Satterfield, Milind Kandlikar, and Barbara H. Harthorn. PLOS [Public Library of Science] ONE, published November 11, 2013. DOI: 10.1371/journal.pone.0080250

All of the papers on PLOS ONE are open access.

I have taken a look at this paper and notice there will be a separate analysis of the Canadian scene produced at a later date. As for the US analysis, this paper certainly confirms the conjectures I had made from my own observations and intuitions, given the uneasiness various groups and individuals have expressed about the regulatory situation.

I would have liked to see a critique of previous studies rather than a summary, as well as a critique of the survey itself in its discussion/conclusion. I also would have liked to see an appendix listing the survey questions in the order in which they were asked, and some qualitative research (one-on-one interviews) rather than 100% dependence on an email survey. That said, I was glad to see they reversed the meaning of some of the questions as a double-check against respondents who might give the same answer (e.g., 9 [very concerned]) throughout as a means of simplifying their participation.

Onward to the survey with an excerpt from the description of how it was conducted,

Subjects were contacted by email in a three-step process, including initial contact and two reminders at two-week intervals. Respondents received an ‘A’ or ‘B’ version of the survey at random, where the wording of several survey questions was modified to reverse the meaning of the question. Questions with alternate wording were reverse-coded during analysis to enable direct comparison of responses. Where appropriate, the sequence of questions was also varied to minimize order effects.
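Reverse-coding simply maps each answer to the opposite end of the response scale before analysis, so that alternately worded questions line up. Here is a minimal Python sketch of the idea; the 9-point scale, the column names, and the data are my illustrative assumptions, not details taken from the paper.

```python
import pandas as pd

# Hypothetical responses; version 'B' respondents saw reversed wording.
responses = pd.DataFrame({
    "survey_version": ["A", "B", "A", "B"],
    "q_regulators_prepared": [2, 8, 3, 7],
})

SCALE_MIN, SCALE_MAX = 1, 9  # assumed bounds of the response scale

# Reverse-code version-B answers so that a high score means the same
# thing in both versions: 9 becomes 1, 8 becomes 2, and so on.
is_b = responses["survey_version"] == "B"
responses.loc[is_b, "q_regulators_prepared"] = (
    SCALE_MIN + SCALE_MAX - responses.loc[is_b, "q_regulators_prepared"]
)

print(responses)  # answers are now directly comparable across versions
```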

Here’s how the researchers separated the experts into various groups (excerpted from the study),

This study thus draws from a systematic sampling of US-based nano-scientists and engineers (NSE, n=114), nano-environmental health and safety scientists (NEHS, n=86), and regulatory decision makers and scientists (NREG, n=54), to characterize how well-prepared different experts think regulatory agencies are for the risk management of nanomaterials and applications. We tested the following hypothesis:

  (1) Expert views on whether US federal agencies are sufficiently prepared for managing any risks posed by nanotechnologies will differ significantly across classes of experts (NSE vs. NEHS vs. NREG).

This difference across experts was anticipated and so tested in reference to four additional hypotheses:

  (2) Experts who see nanotechnologies as novel (i.e., as a new class of materials or objects) will view US federal regulatory agencies as unprepared for managing risks as compared to those who see nanotechnologies as not new (i.e., as little different from their bulk chemical form).
  (3) Experts who deem US federal regulatory agencies as less trustworthy will also view agencies as less prepared compared to those with more trust in agencies.
  (4) Experts who attribute greater collective stakeholder responsibility (e.g., who view a range of stakeholders as equally responsible for managing risks) will see agencies as less prepared compared to those who attribute less responsibility.
  (5) Experts who are more socially and economically conservative will see regulatory agencies as more prepared compared to those with a more liberal orientation.

The researchers included index variables of trust, responsibility, conservatism, novelty-risks, and novelty-benefits, along with education, gender, field of expertise, etc., in a regression analysis.
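For readers who want to picture what such an analysis involves, here is a minimal sketch in Python using pandas and statsmodels. The variable names, the synthetic data, and the coefficients are my illustrative assumptions (chosen to mimic the paper's qualitative finding), not the authors' actual dataset or model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 254  # roughly the combined sample size reported (114 + 86 + 54)

# Synthetic stand-in data; the column names are hypothetical.
df = pd.DataFrame({
    "expert_group": rng.choice(["NSE", "NEHS", "NREG"], size=n),
    "trust": rng.normal(size=n),
    "responsibility": rng.normal(size=n),
    "conservatism": rng.normal(size=n),
    "novelty_risks": rng.normal(size=n),
    "novelty_benefits": rng.normal(size=n),
})

# Make perceived preparedness depend mostly on novelty of risks, with a
# smaller trust effect, echoing the pattern the authors describe.
df["preparedness"] = (
    -0.6 * df["novelty_risks"]
    + 0.3 * df["trust"]
    + rng.normal(scale=0.5, size=n)
)

# Regress preparedness on the index variables; C() expands expert group
# into dummy variables so group differences can be tested directly.
model = smf.ols(
    "preparedness ~ C(expert_group) + trust + responsibility"
    " + conservatism + novelty_risks + novelty_benefits",
    data=df,
).fit()

print(model.summary())  # once novelty_risks is in the model, the expert
                        # group dummies should contribute little
```

In the discussion (or conclusion), the authors had this to say (excerpted from the study),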

Consistent differences exist between expert groups in their views on agency preparedness to manage nanotechnology risks, yet all three groups perceive regulatory agencies as unprepared. What is most striking, however, is that NREG experts see regulatory agencies as considerably less prepared than do their NSE or NEHS counterparts. Taking a closer look, the drivers of experts’ concerns over regulator preparedness tell a more nuanced story. After accounting for other differences, the ‘expert group’ classification per se does not drive the observed differences in preparedness perceptions. Rather, a substantial portion of this difference results from differing assessments of the perceived novelty of risks across expert groups. Of the remaining variables, trust in regulators is a small but significant driver, and our findings suggest a link between concerns over the novelty of nanomaterials and the adequacy of regulatory design. Experts’ views on stakeholder responsibility are not particularly surprising, since greater reliance on a collective responsibility model would require the burden to shift away from regulatory bodies exclusively to other groups, presumably resulting in a reduced sense of preparedness.

Experts’ reliance in part upon socio-political values indicates that personal values also play a minor role in preparedness judgments.

I look forward to seeing the Canadian analysis. The paper is worth reading for some of the more subtle analysis I did not include here.

Nanotechnology analogies and policy

There’s a two-part essay titled Regulating Nanotechnology Via Analogy (part 1, Feb. 12, 2013, and part 2, Feb. 18, 2013), by Patrick McCray on his Leaping Robot blog, that is well worth reading if you are interested in the impact analogies can have on policymaking.

Before launching into the analogies, here’s a bit about Patrick McCray from the Welcome page to his website, (Note: A link has been removed),

As a professor in the History Department of the University of California, Santa Barbara and a co-founder of the Center for Nanotechnology in Society, my work focuses on different technological and scientific communities and their interactions with the public and policy makers. For the past ten years or so, I’ve been especially interested in the historical development of so-called “emerging technologies,” whenever they emerged.

I hope you enjoy wandering around my web site. The section of it that changes most often is my Leaping Robot blog. I update this every few weeks or so with an extended reflection or essay about science and technology, past and future.

In part 1 (Feb. 12, 2013) of the essay, McCray states (Note: Links and footnotes have been removed),

[Blogger’s note: This post is adapted from a talk I gave in March 2012 at the annual Business History Conference; it draws on research done by Roger Eardley-Pryor, an almost-finished graduate student I'm advising at UCSB [University of California at Santa Barbara], and me. I’m posting it here with his permission. This is the first of a two-part essay…some of the images come from slides we put together for the talk.]

Over the last decade, a range of actors – scientists, policy makers, and activists – have used historical analogies to suggest different ways that risks associated with nanotechnology – especially those concerned with potential environmental implications – might be minimized. Some of these analogies make sense…others, while perhaps effective, are based on a less than ideal reading of history.

Analogies have been used before as tools to evaluate new technologies. In 1965, NASA requested comparisons between the American railroad of the 19th century and the space program. In response, MIT historian Bruce Mazlish wrote a classic article that analyzed the utility and limitations of historical analogies. Analogies, he explained, function as both model and myth. Mythically, they offer meaning and emotional security through an original archetype of familiar knowledge. Analogies also furnish models for understanding by construing either a structural or a functional relationship. As such, analogies function as devices of anticipation for what today is fashionably called “anticipatory governance.” They can also serve as a useful tool for risk experts.

McCray goes on to cover some of the early discourse on nanotechnology, the players, and early analogies. While the focus is on the US, the discourse reflects many if not all of the concerns being expressed internationally.

In part 2 posted on Feb. 18, 2013 McCray mentions four of the main analogies used with regard to nanotechnology and risk (Note: Footnotes have been removed),

Example #1 – Genetically Modified Organisms

In April 2003, Prof. Vicki Colvin testified before Congress. A chemist at Rice University, Colvin also directed that school’s Center for Biological and Environmental Nanotechnology. This “emerging technology,” Colvin said, had a considerable “wow index.” However, Colvin warned, every promising new technology came with concerns that could drive it from “wow into yuck and ultimately into bankrupt.” To make her point, Colvin compared nanotech to the recent experiences researchers and industry had had with genetically modified organisms. Colvin’s analogy – “wow to yuck” – made an effective sound bite. But it also conflated two very different histories of two specific emerging technologies.

While some lessons from GMOs are appropriate for controlling the development of nanotechnology, the analogy doesn’t prove watertight. Unlike GMOs, nanotechnology does not always involve biological materials. And genetic engineering, in general, never enjoyed any sort of unalloyed “wow” period. There was “yuck” from the outset. Criticism accompanied GMOs from the very start. Furthermore, giant agribusiness firms prospered handsomely even after the public’s widespread negative reactions to their products. Lastly, living organisms – especially those associated with food – designed for broad release into the environment were almost guaranteed to generate concerns and protests. Rhetorically, the GMO analogy was powerful…but a deeper analysis clearly suggests there were more differences than similarities.

McCray offers three more examples of analogies used to describe nanotechnology: asbestos, (radioactive) fallout, and recombinant DNA. He dissects each and concludes they are not the best analogies to be using, before offering this thought,

So — If historical analogies can teach us anything about the potential regulation of nano and other emerging technologies, they indicate the need to take a little risk in forming socially and politically constructed definitions of nano. These definitions should be based not just on science but rather mirror the complex and messy realm of research, policy, and application. No single analogy fits all cases but an ensemble of several (properly chosen, of course) can suggest possible regulatory options.

I recommend reading both parts of McCray’s essay in full. It’s a timely piece, especially in light of a Feb. 28, 2013 article by Daniel Hurst for the Australian website theage.com.au, in which a union leader raises health fears about nanotechnology by using the response to asbestos health concerns as his analogy,

Union leader Paul Howes has likened nanotechnology to asbestos, calling for more research to ease fears that the growing use of fine particles could endanger manufacturing workers.

“I don’t want to make the mistake that my predecessors made by not worrying about asbestos,” the Australian Workers Union secretary said.

I have covered the topic of carbon nanotubes and asbestos many times, one of the latest instances being this Jan. 16, 2013 posting. Not all carbon nanotubes act like asbestos; it is the long carbon nanotubes that present problems.

Biosensing cocaine

Amusingly, the Feb. 13, 2013 news item on Nanowerk highlights the biosensing aspect of the work in its title,

New biosensing nanotechnology adopts natural mechanisms to detect molecules

(Nanowerk News) Since the beginning of time, living organisms have developed ingenious mechanisms to monitor their environment.

The Feb. 13, 2013 news release from the University of Montreal (Université de Montréal) takes a somewhat different tack by focusing on cocaine,

Detecting cocaine “naturally”

Since the beginning of time, living organisms have developed ingenious mechanisms to monitor their environment. As part of an international study, a team of researchers has adapted some of these natural mechanisms to detect specific molecules such as cocaine more accurately and quickly. Their work may greatly facilitate the rapid screening—less than five minutes—of many drugs, infectious diseases, and cancers.

Professor Alexis Vallée-Bélisle of the University of Montreal Department of Chemistry has worked with Professor Francesco Ricci of the University of Rome Tor Vergata and Professor Kevin W. Plaxco of the University of California at Santa Barbara to improve a new biosensing nanotechnology. The results of the study were recently published in the Journal of the American Chemical Society (JACS).

The scientists have provided an interesting image to illustrate their work,

Artist’s rendering: the research team used an existing cocaine biosensor (in green) and revised its design to react to a series of inhibitor molecules (in blue). They were able to adapt the biosensor to respond optimally even within a large concentration window. Courtesy: University of Montreal

The news release provides some insight into the current state of biosensing and what the research team was attempting to accomplish,

“Nature is a continuing source of inspiration for developing new technologies,” says Professor Francesco Ricci, senior author of the study. “Many scientists are currently working to develop biosensor technology to detect—directly in the bloodstream and in seconds—drug, disease, and cancer molecules.”

“The most recent rapid and easy-to-use biosensors developed by scientists to determine the levels of various molecules such as drugs and disease markers in the blood only do so when the molecule is present in a certain concentration, called the concentration window,” adds Professor Vallée-Bélisle. “Below or above this window, current biosensors lose much of their accuracy.”

To overcome this limitation, the international team looked at nature: “In cells, living organisms often use inhibitor or activator molecules to automatically program the sensitivity of their receptors (sensors), which are able to identify the precise amount of thousands of molecules in seconds,” explains Professor Vallée-Bélisle. “We therefore decided to adapt these inhibition, activation, and sequestration mechanisms to improve the efficiency of artificial biosensors.”

The researchers put their idea to the test by using an existing cocaine biosensor and revising its design so that it would respond to a series of inhibitor molecules. They were able to adapt the biosensor to respond optimally even with a large concentration window. “What is fascinating,” says Alessandro Porchetta, a doctoral student at the University of Rome, “is that we were successful in controlling the interactions of this system by mimicking mechanisms that occur naturally.”

“Besides the obvious applications in biosensor design, I think this work will pave the way for important applications related to the administration of cancer-targeting drugs, an area of increasing importance,” says Professor Kevin Plaxco. “The ability to accurately regulate biosensors’ or nanomachines’ activities will greatly increase their efficiency.”
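The tuning mechanism described in these excerpts can be sketched with textbook binding math: a sequestering (competitive) inhibitor raises the sensor's apparent dissociation constant, which slides the useful detection window along the concentration axis, so a series of inhibitors lets you place that window where it is needed. Here is a minimal Python sketch; the dissociation constants and concentrations are illustrative assumptions, not values from the paper.

```python
import numpy as np

def fraction_bound(target, kd_app):
    """Langmuir isotherm: fraction of sensors occupied by target."""
    return target / (kd_app + target)

KD = 1.0  # assumed intrinsic sensor-target dissociation constant
KI = 1.0  # assumed sensor-inhibitor dissociation constant

targets = np.logspace(-2, 5, 8)  # target (e.g., cocaine) concentrations

for inhibitor in (0.0, 10.0, 100.0):
    # Competitive sequestration raises the apparent Kd, shifting the
    # sensor's responsive window to higher target concentrations.
    kd_app = KD * (1.0 + inhibitor / KI)
    occupancy = np.round(fraction_bound(targets, kd_app), 3)
    print(f"[inhibitor] = {inhibitor:6.1f} -> occupancy: {occupancy}")
```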

The funders for this project are (from the news release),

… the Italian Ministry of Universities and Research (MIUR), the Bill & Melinda Gates Foundation Grand Challenges Explorations program, the European Commission Marie Curie Actions program, the U.S. National Institutes of Health, and the Fonds de recherche du Québec Nature et Technologies.

Here’s a citation and a link to the research paper,

Using Distal-Site Mutations and Allosteric Inhibition To Tune, Extend, and Narrow the Useful Dynamic Range of Aptamer-Based Sensors by Alessandro Porchetta, Alexis Vallée-Bélisle, Kevin W. Plaxco, and Francesco Ricci. J. Am. Chem. Soc., 2012, 134 (51), pp. 20601–20604. DOI: 10.1021/ja310585e. Published online December 6, 2012.

This article is behind a paywall.

One final note: Alexis Vallée-Bélisle has been mentioned here before in the context of a ‘Grand Challenges Canada programme’ (not the Bill and Melinda Gates ‘Grand Challenges’) announcement of several fundees in my Nov. 22, 2012 posting. That funding appears to be for a different project.

Teaching physics visually

Art/science news is usually about a scientist using their own art, or collaborating with an artist, to produce pieces that engage the public. This particular May 23, 2012 news item by Andrea Estrada on the physorg.com website offers a contrast: it highlights a teaching technique that integrates the visual arts into physics courses for physics students,

Based on research she conducted for her doctoral dissertation several years ago, Jatila van der Veen, a lecturer in the College of Creative Studies at UC [University of California] Santa Barbara and a research associate in UC Santa Barbara’s physics department, created a new approach to introductory physics, which she calls “Noether before Newton.” Noether refers to the early 20th-century German mathematician Emmy Noether, who was known for her groundbreaking contributions to abstract algebra and theoretical physics.

Using arts-based teaching strategies, van der Veen has fashioned her course into a portal through which students not otherwise inclined might take the leap into the sciences — particularly physics and mathematics. Her research appears in the current issue of the American Educational Research Journal, in a paper titled “Draw Your Physics Homework? Art as a Path to Understanding in Physics Teaching.”

The May 22, 2012 press release on the UC Santa Barbara website provides this detail about van der Veen’s course,

While traditional introductory physics courses focus on 17th-century Newtonian mechanics, van der Veen takes a contemporary approach. “I start with symmetry and contemporary physics,” she said. “Symmetry is the underlying mathematical principle of all physics, so this allows for several different branches of inclusion, of accessibility.”

Much of van der Veen’s course is based on the principles of “aesthetic education,” an approach to teaching formulated by the educational philosopher Maxine Greene. Greene founded the Lincoln Center Institute, a joint effort of Teachers College, Columbia University, and Lincoln Center. Van der Veen is quick to point out, however, that concepts of physics are at the core of her course. “It’s not simply looking at art that’s involved in physics, or looking at beautiful pictures of galaxies, or making fractal art,” she said. “It’s using the learning modes that are available in the arts and applying them to math and physics.”

Taking a visual approach to the study of physics is not all that far-fetched. “If you read some of Albert Einstein’s writings, you’ll see they’re very visual,” van der Veen said. “And in some of his writings, he talks about how visualization played an important part in the development of his theories.”

Van der Veen has taught her introductory physics course for five years, and over that time has collected data from one particular homework assignment she gives her students: She asks them to read an article by Einstein on the nature of science, and then draw their understanding of it. “I found over the years that no one ever produced the same drawing from the same article,” she said. “I also found that some students think very concretely in words, some think concretely in symbols, some think allegorically, and some think metaphorically.”

Adopting arts-based teaching strategies does not make van der Veen’s course any less rigorous than traditional introductory courses in terms of the abstract concepts students are required to master. It creates a different, more inclusive way of achieving the same end.
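As a gloss on the symmetry claim quoted above (my addition, not something from the press release): Noether's theorem, the result behind van der Veen's course title, ties each continuous symmetry of a physical system to a conserved quantity. In its simplest classical-mechanics form:

```latex
\text{If } L(q, \dot{q}, t) \text{ is invariant under } q \mapsto q + \epsilon\,\delta q ,
\text{ then } J = \frac{\partial L}{\partial \dot{q}}\,\delta q
\text{ is conserved: } \frac{dJ}{dt} = 0 .
```

Time-translation symmetry yields conservation of energy, and spatial-translation symmetry yields conservation of momentum, which is the sense in which symmetry underlies the rest of physics.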

I went to van der Veen’s webpage on the UC Santa Barbara website to find a link to this latest article (open access) and some of her other projects. I have taken a brief look at the “Draw your physics homework?” article (it is 53 pp.) and found these images on p. 29 (PDF) illustrating her approach,

Figure 5. Abstract-representational drawings. 5a (left): female math major, first year; 5b (right): male math major, third year. Used with permission. (downloaded from the American Educational Research Journal, vol. 49, April 2012)

Van der Veen offers some context on the page preceding the image, p. 28,

Two other examples of abstract-representational drawings are shown in Figure 5. I do not have written descriptions, but in each case I determined that each student understood the article by means of verbal explanation. Figure 5a was drawn by a first-year math major, female, in 2010. She explained the meaning of her drawing as representing Einstein’s layers from sensory input (shaded ball at the bottom), to secondary layer of concepts, represented by the two open circles, and finally up to the third level, which explains everything below with a unified theory. The dashes surrounding the perimeter, she told me, represent the limit of our present knowledge. Figure 5b was drawn by a third-year male math major. He explained that the brick-like objects in the foreground are sensory perceptions, and the shaded portion in the center of the drawing, which appears behind the bricks, is the theoretical explanation which unifies all the experiences.

I find the reference to Einstein and visualization compelling in light of the increased interest (as I perceive it) in visualization in the sciences.