Tag Archives: UCSB

Brain research, ethics, and nanotechnology (part one of five)

This post kicks off a series titled ‘Brains, prostheses, nanotechnology, and human enhancement’, which brings together a number of developments in the worlds of neuroscience*, prosthetics, and, incidentally, nanotechnology in the field of interest called human enhancement. Parts one through four are an attempt to draw together a number of new developments, mostly in the US and in Europe. Due to my language skills, which extend to English and, more tenuously, French, I can’t provide a more ‘global perspective’. Part five features a summary.

Barbara Herr Harthorn, head of UCSB’s (University of California at Santa Barbara) Center for Nanotechnology in Society (CNS), one of two such centers in the US (the other is at Arizona State University), was featured in a May 12, 2014 article by Lyz Hoffman for the [Santa Barbara] Independent.com,

… Barbara Harthorn has spent the past eight-plus years leading a team of researchers in studying people’s perceptions of the small-scale science with big-scale implications. Sponsored by the National Science Foundation, CNS enjoys national and worldwide recognition for the social science lens it holds up to physical and life sciences.

Earlier this year, Harthorn attended a meeting hosted by the Presidential Commission for the Study of Bioethical Issues. The commission’s chief focus was on the intersection of ethics and brain research, but Harthorn was invited to share her thoughts on the relationship between ethics and nanotechnology.

(You can find Harthorn’s February 2014 presentation to the Presidential Commission for the Study of Bioethical Issues here on their webcasts page.)

I have excerpted part of the Q&A (questions and answers) from Hoffman’s May 12, 2014 article but encourage you to read the piece in its entirety as it provides both a brief beginners’ introduction to nanotechnology and an insight into some of the more complex social impact issues presented by nano and other emerging technologies vis à vis neuroscience and human enhancement,

So there are some environmental concerns with nanomaterials. What are the ethical concerns? What came across at the Presidential Commission meeting? They’re talking about treatment of Alzheimer’s and neurological brain disorders, where the issue of loss of self is a fairly integral part of the disease. There are complicated issues about patients’ decision-making. Nanomaterials could be used to grow new tissues and potentially new organs in the future.

What could that mean for us? Human enhancement is very interesting. It provokes really fascinating discussions. In our view, the discussions are not much at all about the technologies but very much about the social implications. People feel enthusiastic initially, but when reflecting, the issues of equitable access and justice immediately rise to the surface. We [at CNS] are talking about imagined futures and trying to get at the moral and ethical sort of citizen ideas about the risks and benefits of such technologies. Before they are in the marketplace, [the goal is to] understand and find a way to integrate the public’s ideas in the development process.

Here again is a link to the article.

Links to other posts in the Brains, prostheses, nanotechnology, and human enhancement five-part series:

Part two: BRAIN and ethics in the US with some Canucks (not the hockey team) participating (May 19, 2014)

Part three: Gray Matters: Integrative Approaches for Neuroscience, Ethics, and Society issued May 2014 by US Presidential Bioethics Commission (May 20, 2014)

Part four: Brazil, the 2014 World Cup kickoff, and a mind-controlled exoskeleton (May 20, 2014)

Part five: Brains, prostheses, nanotechnology, and human enhancement: summary (May 20, 2014)

* ‘neursocience’ corrected to ‘neuroscience’ on May 20, 2014.

Regulators not prepared to manage nanotechnology risks according to survey

The focus of the survey mentioned in the heading is on the US regulatory situation regarding nanotechnology and, interestingly, much of the work was done by researchers at the University of British Columbia (UBC; Vancouver, Canada). A Dec. 19, 2013 news item on Nanowerk provides an overview,

In a survey of nanoscientists and engineers, nano-environmental health and safety scientists, and regulators, researchers at the UCSB Center for Nanotechnology in Society (CNS) and at the University of British Columbia found that those who perceive the risks posed by nanotechnology as “novel” are more likely to believe that regulators are unprepared. Representatives of regulatory bodies themselves felt most strongly that this was the case. “The people responsible for regulation are the most skeptical about their ability to regulate,” said CNS Director and co-author Barbara Herr Harthorn.

“The message is essentially,” said first author Christian Beaudrie of the Institute for Resources, Environment, and Sustainability at the University of British Columbia, “the more that risks are seen as new, the less trust survey respondents have in regulatory mechanisms. That is, regulators don’t have the tools to do the job adequately.”

The Dec. (?), 2013 University of California at Santa Barbara news release (also on EurekAlert), which originated the news item, adds this,

The authors also believe that when respondents suggested that more stakeholder groups need to share the responsibility of preparing for the potential consequences of nanotechnologies, this indicated a greater “perceived magnitude or complexity of the risk management challenge.” Therefore, they assert, not only are regulators unprepared, they need input from “a wide range of experts along the nanomaterial life cycle.” These include laboratory scientists, businesses, health and environmental groups (NGOs), and government agencies.

Here’s a link to and a citation for the paper,

Expert Views on Regulatory Preparedness for Managing the Risks of Nanotechnologies by Christian E. H. Beaudrie, Terre Satterfield, Milind Kandlikar, Barbara H. Harthorn. PLOS [Public Library of Science] ONE Published: November 11, 2013 DOI: 10.1371/journal.pone.0080250

All of the papers on PLOS ONE are open access.

I have taken a look at this paper and notice there will be a separate analysis of the Canadian scene produced at a later date. As for the US analysis, this paper certainly confirms conjectures I had made based on my own observations and on the uneasiness various groups and individuals have expressed about the regulatory situation.

I would have liked to see a critique of previous studies rather than a summary, as well as a critique of the survey itself in the discussion/conclusion. I also would have liked to see an appendix listing the survey questions in the order in which they were asked, and some qualitative research (one-on-one interviews) rather than 100% dependence on an email survey. That said, I was glad to see they reversed the meaning of some of the questions to double-check for respondents who might give the same answer (e.g., 9 [very concerned]) throughout as a means of simplifying their participation,

Onward to the survey with an excerpt from the description of how it was conducted,

Subjects were contacted by email in a three-step process, including initial contact and two reminders at two-week intervals. Respondents received an ‘A’ or ‘B’ version of the survey at random, where the wording of several survey questions were modified to reverse the meaning of the question. Questions with alternate wording were reversed-coded during analysis to enable direct comparison of responses. Where appropriate the sequence of questions was also varied to minimize order effects.
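The reverse-coding step described in that excerpt can be sketched in a few lines. This is a generic illustration only, assuming a 9-point response scale like the one mentioned above (9 = very concerned); it is not code from the study.

```python
def reverse_code(response, scale_min=1, scale_max=9):
    """Map a response to a reversed-wording item back onto the
    original scale so 'A' and 'B' survey versions can be compared."""
    return (scale_min + scale_max) - response

# Responses to a reversed item on a hypothetical 'B' version:
version_b_responses = [9, 7, 2, 5]
recoded = [reverse_code(r) for r in version_b_responses]
# recoded == [1, 3, 8, 5]: a 9 on the reversed wording counts as a 1
# on the original scale, which also exposes straight-line respondents.
```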

Here’s how the researchers separated the experts into various groups (excerpted from the study),

This study thus draws from a systematic sampling of US-based nano-scientists and engineers (NSE, n=114), nano-environmental health and safety scientists (NEHS, n=86), and regulatory decision makers and scientists (NREG, n=54), to characterize how well-prepared different experts think regulatory agencies are for the risk management of nanomaterials and applications. We tested the following hypothesis:

  1. (1) Expert views on whether US federal agencies are sufficiently prepared for managing any risks posed by nanotechnologies will differ significantly across classes of experts (NSE vs. NEHS vs. NREG).

This difference across experts was anticipated and so tested in reference to four additional hypotheses:

  2. (2) Experts who see nanotechnologies as novel (i.e., as a new class of materials or objects) will view US federal regulatory agencies as unprepared for managing risks as compared to those who see nanotechnologies as not new (i.e., as little different from their bulk chemical form).
  3. (3) Experts who deem US federal regulatory agencies as less trustworthy will also view agencies as less prepared compared to those with more trust in agencies.
  4. (4) Experts who attribute greater collective stakeholder responsibility (e.g., who view a range of stakeholders as equally responsible for managing risks) will see agencies as less prepared compared to those who attribute less responsibility.
  5. (5) Experts who are more socially and economically conservative will see regulatory agencies as more prepared compared to those with a more liberal orientation.

The researchers included Index Variables of trust, responsibility, conservatism, novelty-risks, and novelty-benefits in relationship to education, gender, field of expertise, etc. for a regression analysis. In the discussion (or conclusion), the authors had this to say (excerpted from the study),
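For readers unfamiliar with this kind of analysis, here is a minimal sketch of how index variables might feed into a regression on a perceived-preparedness score. The variable names, coefficients, and data are hypothetical stand-ins for illustration, not the study’s actual model or results.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 254  # roughly the combined sample (114 NSE + 86 NEHS + 54 NREG)

# Design matrix: intercept, a few index variables, and a group dummy.
X = np.column_stack([
    np.ones(n),                  # intercept
    rng.normal(size=n),          # trust index
    rng.normal(size=n),          # responsibility index
    rng.normal(size=n),          # novelty-of-risks index
    rng.integers(0, 2, size=n),  # dummy: NREG group membership
])

# Simulated preparedness scores built from made-up coefficients plus noise.
y = X @ np.array([5.0, 0.4, -0.3, -0.8, -0.5]) + rng.normal(scale=0.5, size=n)

# Ordinary least squares: each coefficient estimates a driver's effect
# on preparedness judgments with the other variables held constant.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

A negative estimate on the novelty-of-risks variable, for instance, would mirror the paper’s finding that seeing nano-risks as novel predicts seeing regulators as less prepared.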

Consistent differences exist between expert groups in their views on agency preparedness to manage nanotechnology risks, yet all three groups perceive regulatory agencies as unprepared. What is most striking however is that NREG experts see regulatory agencies as considerably less prepared than do their NSE or NEHS counterparts. Taking a closer look, the drivers of experts’ concerns over regulator preparedness tell a more nuanced story. After accounting for other differences, the ‘expert group’ classification per se does not drive the observed differences in preparedness perceptions. Rather a substantial portion of this difference results from differing assessments of the perceived novelty of risks across expert groups. Of the remaining variables, trust in regulators is a small but significant driver, and our findings suggest a link between concerns over the novelty of nanomaterials and the adequacy of regulatory design. Experts’ views on stakeholder responsibility are not particularly surprising since greater reliance on a collective responsibility model would need the burden to move away exclusively from regulatory bodies to other groups, and result presumptively in a reduced sense of preparedness.

Experts’ reliance in part upon socio-political values indicates that personal values also play a minor role in preparedness judgments.

I look forward to seeing the Canadian analysis. The paper is worth reading for some of the more subtle analysis I did not include here.

Structural color and cephalopods at the University of California Santa Barbara

I last wrote about structural color in a Feb. 7, 2013 posting featuring a marvelous article on the topic by Cristina Luiggi in The Scientist. As for cephalopods, one of my favourite postings on the topic is a Feb. 1, 2013 posting which features the giant squid, a newly discovered animal of mythical proportions that appears golden in its native habitat in the deep, deep ocean. Happily, there’s a July 25, 2013 news item on Nanowerk which combines structural color and squid,

Color in living organisms can be formed two ways: pigmentation or anatomical structure. Structural colors arise from the physical interaction of light with biological nanostructures. A wide range of organisms possess this ability, but the biological mechanisms underlying the process have been poorly understood.

Two years ago, an interdisciplinary team from UC Santa Barbara [University of California Santa Barbara a.k.a. UCSB] discovered the mechanism by which a neurotransmitter dramatically changes color in the common market squid, Doryteuthis opalescens. That neurotransmitter, acetylcholine, sets in motion a cascade of events that culminate in the addition of phosphate groups to a family of unique proteins called reflectins. This process allows the proteins to condense, driving the animal’s color-changing process.

The July 25, 2013 UC Santa Barbara news release (also on EurekAlert), which originated the news item, provides a good overview of the team’s work to date and the new work occasioning the news release,

Now the researchers have delved deeper to uncover the mechanism responsible for the dramatic changes in color used by such creatures as squids and octopuses. The findings –– published in the Proceedings of the National Academy of Science, in a paper by molecular biology graduate student and lead author Daniel DeMartini and co-authors Daniel V. Krogstad and Daniel E. Morse –– are featured in the current issue of The Scientist.

Structural colors rely exclusively on the density and shape of the material rather than its chemical properties. The latest research from the UCSB team shows that specialized cells in the squid skin called iridocytes contain deep pleats or invaginations of the cell membrane extending deep into the body of the cell. This creates layers or lamellae that operate as a tunable Bragg reflector. Bragg reflectors are named after the British father and son team who more than a century ago discovered how periodic structures reflect light in a very regular and predictable manner.

“We know cephalopods use their tunable iridescence for camouflage so that they can control their transparency or in some cases match the background,” said co-author Daniel E. Morse, Wilcox Professor of Biotechnology in the Department of Molecular, Cellular and Developmental Biology and director of the Marine Biotechnology Center/Marine Science Institute at UCSB.

“They also use it to create confusing patterns that disrupt visual recognition by a predator and to coordinate interactions, especially mating, where they change from one appearance to another,” he added. “Some of the cuttlefish, for example, can go from bright red, which means stay away, to zebra-striped, which is an invitation for mating.”

The researchers created antibodies to bind specifically to the reflectin proteins, which revealed that the reflectins are located exclusively inside the lamellae formed by the folds in the cell membrane. They showed that the cascade of events culminating in the condensation of the reflectins causes the osmotic pressure inside the lamellae to change drastically due to the expulsion of water, which shrinks and dehydrates the lamellae and reduces their thickness and spacing. The movement of water was demonstrated directly using deuterium-labeled heavy water.

When the acetylcholine neurotransmitter is washed away and the cell can recover, the lamellae imbibe water, rehydrating and allowing them to swell to their original thickness. This reversible dehydration and rehydration, shrinking and swelling, changes the thickness and spacing, which, in turn, changes the wavelength of the light that’s reflected, thus “tuning” the color change over the entire visible spectrum.

“This effect of the condensation on the reflectins simultaneously increases the refractive index inside the lamellae,” explained Morse. “Initially, before the proteins are consolidated, the refractive index –– you can think of it as the density –– inside the lamellae and outside, which is really the outside water environment, is the same. There’s no optical difference so there’s no reflection. But when the proteins consolidate, this increases the refractive index so the contrast between the inside and outside suddenly increases, causing the stack of lamellae to become reflective, while at the same time they dehydrate and shrink, which causes color changes. The animal can control the extent to which this happens –– it can pick the color –– and it’s also reversible. The precision of this tuning by regulating the nanoscale dimensions of the lamellae is amazing.”
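The tuning Morse describes can be approximated with the standard first-order Bragg condition for a periodic two-layer stack at normal incidence, lambda = 2(n1·d1 + n2·d2). The refractive indices and thicknesses below are illustrative guesses, not measured squid values; they simply show how condensation (denser, thinner lamellae) shifts the reflected peak from red toward blue.

```python
def bragg_peak_nm(n_lam, d_lam_nm, n_gap, d_gap_nm):
    """First-order peak reflectance wavelength (nm) of a periodic
    two-layer Bragg stack at normal incidence: 2 * (n1*d1 + n2*d2)."""
    return 2 * (n_lam * d_lam_nm + n_gap * d_gap_nm)

# Relaxed state: hydrated, thicker lamellae -> longer (redder) peak.
relaxed = bragg_peak_nm(n_lam=1.44, d_lam_nm=120, n_gap=1.33, d_gap_nm=120)

# Condensed state: reflectins raise the refractive index while the
# lamellae dehydrate and shrink -> shorter (bluer) peak.
condensed = bragg_peak_nm(n_lam=1.56, d_lam_nm=80, n_gap=1.33, d_gap_nm=95)
# relaxed is ~665 nm (red); condensed is ~502 nm (blue-green)
```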

Another paper by the same team of researchers, published in Journal of the Royal Society Interface with optical physicist Amitabh Ghoshal as the lead author, presents a mathematical analysis of the color change and confirms that the changes in refractive index perfectly correspond to the measurements made with live cells.

A third paper, in press at Journal of Experimental Biology, reports the team’s discovery that female market squid show a set of stripes that can be brightly activated and may function during mating to allow the female to mimic the appearance of the male, thereby reducing the number of mating encounters and aggressive contacts from males. The most significant finding in this study is the discovery of a pair of stripes that switch from being completely transparent to bright white.

“This is the first time that switchable white cells based on the reflectin proteins have been discovered,” Morse noted. “The facts that these cells are switchable by the neurotransmitter acetylcholine, that they contain some of the same reflectin proteins, and that the reflectins are induced to condense to increase the refractive index and trigger the change in reflectance all suggest that they operate by a molecular mechanism fundamentally related to that controlling the tunable color.”

Could these findings one day have practical applications? “In telecommunications we’re moving to more rapid communication carried by light,” said Morse. “We already use optical cables and photonic switches in some of our telecommunications devices. The question is –– and it’s a question at this point –– can we learn from these novel biophotonic mechanisms that have evolved over millions of years of natural selection new approaches to making tunable and switchable photonic materials to more efficiently encode, transmit, and decode information via light?”

In fact, the UCSB researchers are collaborating with Raytheon Vision Systems in Goleta to investigate applications of their discoveries in the development of tunable filters and switchable shutters for infrared cameras. Down the road, there may also be possible applications for synthetic camouflage. [emphasis mine]

There is at least one other research team (the UK’s University of Bristol) considering the camouflage strategies employed by cephalopods and, in their case, zebra fish as noted in my May 4, 2012 posting, Camouflage for everyone.

Getting back to the cephalopod at hand, here’s an image from the UC Santa Barbara team,

This shows the diffusion of the neurotransmitter applied to squid skin at upper right, which induces a wave of iridescence traveling to the lower left and progressing from red to blue. Each object in the image is a living cell, 10 microns long; the dark object in the center of each cell is the cell nucleus. [downloaded from http://www.ia.ucsb.edu/pa/display.aspx?pkey=3076]


For papers currently available online, here are links and citations,

Optical parameters of the tunable Bragg reflectors in squid by Amitabh Ghoshal, Daniel G. DeMartini, Elizabeth Eck, and Daniel E. Morse. J. R. Soc. Interface 6 August 2013 vol. 10 no. 85 20130386 doi: 10.1098/rsif.2013.0386

The Royal Society paper is behind a paywall until August 2014.

Membrane invaginations facilitate reversible water flux driving tunable iridescence in a dynamic biophotonic system by Daniel G. DeMartini, Daniel V. Krogstad, and Daniel E. Morse. Published online before print January 28, 2013, doi: 10.1073/pnas.1217260110
PNAS February 12, 2013 vol. 110 no. 7 2552-2556

The Proceedings of the National Academy of Sciences (PNAS) paper (or the ‘Daniel’ paper as I prefer to think of it) is behind a paywall.

Controlling crystal growth for plastic electronics

A July 4, 2013 news item on Nanowerk highlights research into plastic electronics taking place at Imperial College London (ICL), Note: A link has been removed,

Scientists have discovered a way to better exploit a process that could revolutionise the way that electronic products are made.

The scientists from Imperial College London say improving the industrial process, which is called crystallisation, could revolutionise the way we produce electronic products, leading to advances across a whole range of fields; including reducing the cost and improving the design of plastic solar cells.

The process of making many well-known products from plastics involves controlling the way that microscopic crystals are formed within the material. By controlling the way that these crystals are grown, engineers can determine the properties they want, such as transparency and toughness. Controlling the growth of these crystals involves engineers adding small amounts of chemical additives to plastic formulations. This approach is used in making food boxes and other transparent plastic containers, but up until now it has not been used in the electronics industry.

The team from Imperial have now demonstrated that these additives can also be used to improve how an advanced type of flexible circuitry called plastic electronics is made.

The team found that when the additives were included in the formulation of plastic electronic circuitry they could be printed more reliably and over larger areas, which would reduce fabrication costs in the industry.

The team reported their findings this month in the journal Nature Materials (“Microstructure formation in molecular and polymer semiconductors assisted by nucleation agents”).

The June 7, 2013 Imperial College London news release by Joshua Howgego, which originated the news item, describes the researchers and the process in more detail,

Dr Natalie Stingelin, the leader of the study from the Department of Materials and Centre of Plastic Electronics at Imperial, says:

“Essentially, we have demonstrated a simple way to gain control over how crystals grow in electrically conducting ‘plastic’ semiconductors. Not only will this help industry fabricate plastic electronic devices like solar cells and sensors more efficiently. I believe it will also help scientists experimenting in other areas, such as protein crystallisation, an important part of the drug development process.”

Dr Stingelin and research associate Neil Treat looked at two additives, sold under the names Irgaclear® XT 386 and Millad® 3988, which are commonly used in industry. These chemicals are, for example, some of the ingredients used to improve the transparency of plastic drinking bottles. The researchers experimented with adding tiny amounts of these chemicals to the formulas of several different electrically conducting plastics, which are used in technologies such as security key cards, solar cells and displays.

The researchers found the additives gave them precise control over where crystals would form, meaning they could also control which parts of the printed material would conduct electricity. In addition, the crystallisations happened faster than normal. Usually plastic electronics are exposed to high temperatures to speed up the crystallisation process, but this can degrade the materials. This heat treatment is no longer necessary if the additives are used.

Another industrially important advantage of using small amounts of the additives was that the crystallisation process happened more uniformly throughout the plastics, giving a consistent distribution of crystals. The team say this could enable circuits in plastic electronics to be produced quickly and easily with roll-to-roll printing procedures similar to those used in the newspaper industry. This has been very challenging to achieve previously.

Dr Treat says: “Our work clearly shows that these additives are really good at controlling how materials crystallise. We have shown that printed electronics can be fabricated more reliably using this strategy. But what’s particularly exciting about all this is that the additives showed fantastic performance in many different types of conducting plastics. So I’m excited about the possibilities that this strategy could have in a wide range of materials.”

Dr Stingelin and Dr Treat collaborated with scientists from the University of California Santa Barbara (UCSB), and the National Renewable Energy Laboratory in Golden, US, and the Swiss Federal Institute of Technology on this study. The team are planning to continue working together to see if subtle chemical changes to the additives improve their effects – and design new additives.

There are some big plans for this discovery, from the news release,

They [the multinational team from ICL, UCSB, National Renewable Energy Laboratory, and Swiss Federal Institute of Technology] will be working with the new Engineering and Physical Sciences Research Council (EPSRC)-funded Centre for Innovative Manufacturing in Large Area Electronics in order to drive the industrial exploitation of their process. The £5.6 million of funding for this centre, to be led by researchers from Cambridge University, was announced earlier this year [2013]. They are also exploring collaborations with printing companies with a view to further developing their circuit printing technique.

For the curious, here’s a link to and a citation for the published paper,

Microstructure formation in molecular and polymer semiconductors assisted by nucleation agents by Neil D. Treat, Jennifer A. Nekuda Malik, Obadiah Reid, Liyang Yu, Christopher G. Shuttle, Garry Rumbles, Craig J. Hawker, Michael L. Chabinyc, Paul Smith, & Natalie Stingelin. Nature Materials 12, 628–633 (2013) doi:10.1038/nmat3655 Published online 02 June 2013

This article is open access (at least for now).

Nanotechnology analogies and policy

There’s a two part essay titled, Regulating Nanotechnology Via Analogy (part 1, Feb. 12, 2013 and part 2, Feb. 18, 2013), by Patrick McCray on his Leaping Robot blog that is well worth reading if you are interested in the impact analogies can have on policymaking.

Before launching into the analogies, here’s a bit about Patrick McCray from the Welcome page to his website, (Note: A link has been removed),

As a professor in the History Department of the University of California, Santa Barbara and a co-founder of the Center for Nanotechnology in Society, my work focuses on different technological and scientific communities and their interactions with the public and policy makers. For the past ten years or so, I’ve been especially interested in the historical development of so-called “emerging technologies,” whenever they emerged.

I hope you enjoy wandering around my web site. The section of it that changes most often is my Leaping Robot blog. I update this every few weeks or so with an extended reflection or essay about science and technology, past and future.

In part 1 (Feb. 12, 2013) of the essay, McCray states (Note: Links and footnotes have been removed),

[Blogger’s note: This post is adapted from a talk I gave in March 2012 at the annual Business History Conference; it draws on research done by Roger Eardley-Pryor, an almost-finished graduate student I’m advising at UCSB [University of California at Santa Barbara], and me. I’m posting it here with his permission. This is the first of a two-part essay…some of the images come from slides we put together for the talk.]

Over the last decade, a range of actors – scientists, policy makers, and activists – have used  historical analogies to suggest different ways that risks associated with nanotechnology – especially those concerned with potential environmental implications – might be minimized. Some of these analogies make sense…others, while perhaps effective, are based on a less than ideal reading of history.

Analogies have been used before as tools to evaluate new technologies. In 1965, NASA requested comparisons between the American railroad of the 19th century and the space program. In response, MIT historian Bruce Mazlish wrote a classic article that analyzed the utility and limitations of historical analogies. Analogies, he explained, function as both model and myth. Mythically, they offer meaning and emotional security through an original archetype of familiar knowledge. Analogies also furnish models for understanding by construing either a structural or a functional relationship. As such, analogies function as devices of anticipation for what today is fashionably called “anticipatory governance.” They also can serve as a useful tool for risk experts.

McCray goes on to cover some of the early discourse on nanotechnology, the players, and early analogies. While the focus is on the US, the discourse reflects many if not all of the concerns being expressed internationally.

In part 2 posted on Feb. 18, 2013 McCray mentions four of the main analogies used with regard to nanotechnology and risk (Note: Footnotes have been removed),

Example #1 – Genetically Modified Organisms

In April 2003, Prof. Vicki Colvin testified before Congress. A chemist at Rice University, Colvin also directed that school’s Center for Biological and Environmental Nanotechnology. This “emerging technology,” Colvin said, had a considerable “wow index.” However, Colvin warned, every promising new technology came with concerns that could drive it from “wow into yuck and ultimately into bankrupt.” To make her point, Colvin compared nanotech to recent experiences researchers and industry had experienced with genetically modified organisms. Colvin’s analogy – “wow to yuck” – made an effective sound bite. But it also conflated two very different histories of two specific emerging technologies.

While some lessons from GMOs are appropriate for controlling the development of nanotechnology, the analogy doesn’t prove watertight. Unlike GMOs, nanotechnology does not always involve biological materials. And genetic engineering, in general, never enjoyed any sort of unalloyed “wow” period. There was “yuck” from the outset. Criticism accompanied GMOs from the very start. Furthermore, giant agribusiness firms prospered handsomely even after the public’s widespread negative reactions to their products. Lastly, living organisms – especially those associated with food – designed for broad release into the environment were almost guaranteed to generate concerns and protests. Rhetorically, the GMO analogy was powerful…but a deeper analysis clearly suggests there were more differences than similarities.

McCray offers three more examples of analogies used to describe nanotechnology: asbestos, (radioactive) fallout, and Recombinant DNA which he dissects and concludes are not the best analogies to be using before offering this thought,

So — If historical analogies can teach us anything about the potential regulation of nano and other emerging technologies, they indicate the need to take a little risk in forming socially and politically constructed definitions of nano. These definitions should be based not just on science but rather mirror the complex and messy realm of research, policy, and application. No single analogy fits all cases but an ensemble of several (properly chosen, of course) can suggest possible regulatory options.

I recommend reading both parts of McCray’s essay in full. It’s a timely piece especially in light of a Feb. 28, 2013 article by Daniel Hurst for Australian website, theage.com.au, where a union leader raises health fears about nanotechnology by using the response to asbestos health concerns as the analogy,

Union leader Paul Howes has likened nanotechnology to asbestos, calling for more research to ease fears that the growing use of fine particles could endanger manufacturing workers.

“I don’t want to make the mistake that my predecessors made by not worrying about asbestos,” the Australian Workers Union secretary said.

I have covered the topic of carbon nanotubes and asbestos many times, one of the latest being this Jan. 16, 2013 posting. Not all carbon nanotubes act like asbestos; it is the long carbon nanotubes that present problems.

Biosensing cocaine

Amusingly, the Feb. 13, 2013 news item on Nanowerk highlights the biosensing aspect of the work in its title,

New biosensing nanotechnology adopts natural mechanisms to detect molecules

(Nanowerk News) Since the beginning of time, living organisms have developed ingenious mechanisms to monitor their environment.

The Feb. 13, 2013 news release from the University of Montreal (Université de Montréal) takes a somewhat different tack by focusing on cocaine,

Detecting cocaine “naturally”

Since the beginning of time, living organisms have developed ingenious mechanisms to monitor their environment. As part of an international study, a team of researchers has adapted some of these natural mechanisms to detect specific molecules such as cocaine more accurately and quickly. Their work may greatly facilitate the rapid screening—less than five minutes—of many drugs, infectious diseases, and cancers.

Professor Alexis Vallée-Bélisle of the University of Montreal Department of Chemistry has worked with Professor Francesco Ricci of the University of Rome Tor Vergata and Professor Kevin W. Plaxco of the University of California at Santa Barbara to improve a new biosensing nanotechnology. The results of the study were recently published in the Journal of the American Chemical Society (JACS).

The scientists have provided an interesting image to illustrate their work,

Artist’s rendering: the research team used an existing cocaine biosensor (in green) and revised its design to react to a series of inhibitor molecules (in blue). They were able to adapt the biosensor to respond optimally even within a large concentration window. Courtesy: University of Montreal

The news release provides some insight into the current state of biosensing and what the research team was attempting to accomplish,

“Nature is a continuing source of inspiration for developing new technologies,” says Professor Francesco Ricci, senior author of the study. “Many scientists are currently working to develop biosensor technology to detect—directly in the bloodstream and in seconds—drug, disease, and cancer molecules.”

“The most recent rapid and easy-to-use biosensors developed by scientists to determine the levels of various molecules such as drugs and disease markers in the blood only do so when the molecule is present in a certain concentration, called the concentration window,” adds Professor Vallée-Bélisle. “Below or above this window, current biosensors lose much of their accuracy.”

To overcome this limitation, the international team looked at nature: “In cells, living organisms often use inhibitor or activator molecules to automatically program the sensitivity of their receptors (sensors), which are able to identify the precise amount of thousands of molecules in seconds,” explains Professor Vallée-Bélisle. “We therefore decided to adapt these inhibition, activation, and sequestration mechanisms to improve the efficiency of artificial biosensors.”

The researchers put their idea to the test by using an existing cocaine biosensor and revising its design so that it would respond to a series of inhibitor molecules. They were able to adapt the biosensor to respond optimally even with a large concentration window. “What is fascinating,” says Alessandro Porchetta, a doctoral student at the University of Rome, “is that we were successful in controlling the interactions of this system by mimicking mechanisms that occur naturally.”

“Besides the obvious applications in biosensor design, I think this work will pave the way for important applications related to the administration of cancer-targeting drugs, an area of increasing importance,” says Professor Kevin Plaxco. “The ability to accurately regulate a biosensor’s or nanomachine’s activities will greatly increase their efficiency.”
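The “concentration window” the researchers describe can be sketched with a little arithmetic. The following is an illustrative model of my own (all constants are hypothetical and not taken from the paper): a simple one-site sensor follows a Langmuir binding curve, whose useful range is a fixed 81-fold concentration span, and a competitive inhibitor shifts that span to higher concentrations without redesigning the receptor itself.

```python
def fraction_bound(target, kd):
    """Langmuir isotherm: fraction of receptors occupied at a given target concentration."""
    return target / (kd + target)

def apparent_kd(kd, inhibitor, ki):
    """Competitive inhibition raises the apparent dissociation constant,
    shifting the sensor's response curve toward higher concentrations."""
    return kd * (1 + inhibitor / ki)

kd = 1.0  # intrinsic dissociation constant (arbitrary units, hypothetical)

# A simple one-site sensor climbs from 10% to 90% occupancy over a fixed
# 81-fold span of target concentration, [kd/9, 9*kd]. That span is the
# "concentration window" outside which accuracy degrades.
low, high = kd / 9, 9 * kd   # window edges: 10% and 90% occupancy

# A fixed dose of a competitive inhibitor moves the whole window to higher
# concentrations; mixing sensors tuned with different inhibitor doses can
# extend the overall useful range.
kd_shifted = apparent_kd(kd, inhibitor=10.0, ki=1.0)  # window centred 11x higher
```

The point of the sketch is only that the window’s *width* is fixed by the binding chemistry, so tuning its *position* with inhibitors (as the team did) is what lets a set of sensors cover a much larger overall range.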

The funders for this project are (from the news release),

… the Italian Ministry of Universities and Research (MIUR), the Bill & Melinda Gates Foundation Grand Challenges Explorations program, the European Commission Marie Curie Actions program, the U.S. National Institutes of Health, and the Fonds de recherche du Québec Nature et Technologies.

Here’s a citation and a link to the research paper,

Using Distal-Site Mutations and Allosteric Inhibition To Tune, Extend, and Narrow the Useful Dynamic Range of Aptamer-Based Sensors by Alessandro Porchetta, Alexis Vallée-Bélisle, Kevin W. Plaxco, and Francesco Ricci. J. Am. Chem. Soc., 2012, 134 (51), pp 20601–20604 DOI: 10.1021/ja310585e Publication Date (Web): December 6, 2012

Copyright © 2012 American Chemical Society

This article is behind a paywall.

One final note, Alexis Vallée-Bélisle has been mentioned here before in the context of a ‘Grand Challenges Canada programme’ (not the Bill and Melinda Gates ‘Grand Challenges’) announcement of several fundees in my Nov. 22, 2012 posting. That funding appears to be for a different project.

Teaching physics visually

Art/science news is usually about a scientist using their own art or collaborating with an artist to produce pieces that engage the public. This particular May 23, 2012 news item by Andrea Estrada on the physorg.com website offers a contrast: it highlights a teaching technique integrating the visual arts with physics for physics students,

Based on research she conducted for her doctoral dissertation several years ago, Jatila van der Veen, a lecturer in the College of Creative Studies at UC [University of California] Santa Barbara and a research associate in UC Santa Barbara’s physics department, created a new approach to introductory physics, which she calls “Noether before Newton.” Noether refers to the early 20th-century German mathematician Emmy Noether, who was known for her groundbreaking contributions to abstract algebra and theoretical physics.

Using arts-based teaching strategies, van der Veen has fashioned her course into a portal through which students not otherwise inclined might take the leap into the sciences — particularly physics and mathematics. Her research appears in the current issue of the American Educational Research Journal, in a paper titled “Draw Your Physics Homework? Art as a Path to Understanding in Physics Teaching.”

The May 22, 2012 press release on the UC Santa Barbara website provides this detail about van der Veen’s course,

While traditional introductory physics courses focus on 17th-century Newtonian mechanics, van der Veen takes a contemporary approach. “I start with symmetry and contemporary physics,” she said. “Symmetry is the underlying mathematical principle of all physics, so this allows for several different branches of inclusion, of accessibility.”

Much of van der Veen’s course is based on the principles of “aesthetic education,” an approach to teaching formulated by the educational philosopher Maxine Greene. Greene founded the Lincoln Center Institute, a joint effort of Teachers College, Columbia University, and Lincoln Center. Van der Veen is quick to point out, however, that concepts of physics are at the core of her course. “It’s not simply looking at art that’s involved in physics, or looking at beautiful pictures of galaxies, or making fractal art,” she said. “It’s using the learning modes that are available in the arts and applying them to math and physics.”

Taking a visual approach to the study of physics is not all that far-fetched. “If you read some of Albert Einstein’s writings, you’ll see they’re very visual,” van der Veen said. “And in some of his writings, he talks about how visualization played an important part in the development of his theories.”

Van der Veen has taught her introductory physics course for five years, and over that time has collected data from one particular homework assignment she gives her students: She asks them to read an article by Einstein on the nature of science, and then draw their understanding of it. “I found over the years that no one ever produced the same drawing from the same article,” she said. “I also found that some students think very concretely in words, some think concretely in symbols, some think allegorically, and some think metaphorically.”

Adopting arts-based teaching strategies does not make van der Veen’s course any less rigorous than traditional introductory courses in terms of the abstract concepts students are required to master. It creates a different, more inclusive way of achieving the same end.
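For readers wondering what “symmetry is the underlying mathematical principle of all physics” means concretely, here is a standard textbook illustration (my addition, not from the article): the simplest case of Noether’s theorem, in which time-translation symmetry of a Lagrangian implies conservation of energy.

```latex
% If the Lagrangian L(q, \dot q) has no explicit time dependence
% (time-translation symmetry), the energy
%   E = \dot q \, \partial L / \partial \dot q - L
% is conserved. Expanding dE/dt and applying the Euler--Lagrange equation:
\frac{dE}{dt}
  = \ddot q \frac{\partial L}{\partial \dot q}
  + \dot q \frac{d}{dt}\frac{\partial L}{\partial \dot q}
  - \frac{\partial L}{\partial q}\dot q
  - \frac{\partial L}{\partial \dot q}\ddot q
  = \dot q \left( \frac{d}{dt}\frac{\partial L}{\partial \dot q}
  - \frac{\partial L}{\partial q} \right) = 0
```

Each continuous symmetry (in time, in space, in rotation) yields a conserved quantity in this way, which is presumably why starting from symmetry opens “several different branches of inclusion.”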

I went to look at van der Veen’s webpage on the UC Santa Barbara website to find a link to this latest article (open access) of hers and some of her other projects. I have taken a brief look at the “Draw your physics homework?” article (it is 53 pp.) and found these images on p. 29 (PDF) illustrating her approach,

Figure 5. Abstract-representational drawings. 5a (left): female math major, first year; 5b (right): male math major, third year. Used with permission. (downloaded from the American Educational Research Journal, vol. 49, April 2012)

Van der Veen offers some context on the page preceding the image, p. 28,

Two other examples of abstract-representational drawings are shown in Figure 5. I do not have written descriptions, but in each case I determined that each student understood the article by means of verbal explanation. Figure 5a was drawn by a first-year math major, female, in 2010. She explained the meaning of her drawing as representing Einstein’s layers from sensory input (shaded ball at the bottom), to secondary layer of concepts, represented by the two open circles, and finally up to the third level, which explains everything below with a unified theory. The dashes surrounding the perimeter, she told me, represent the limit of our present knowledge. Figure 5b was drawn by a third-year male math major. He explained that the brick-like objects in the foreground are sensory perceptions, and the shaded portion in the center of the drawing, which appears behind the bricks, is the theoretical explanation which unifies all the experiences.

I find the reference to Einstein and visualization compelling in light of the increased interest (as I perceive it) in visualization currently occurring in the sciences.