Tag Archives: soft machines

Brain and machine as one (machine/flesh)

The essay on brains and machines becoming intertwined is making the rounds. First stop on my tour was its Oct. 4, 2016 appearance on the Mail & Guardian; then there was its (earlier) Oct. 3, 2016 appearance on The Conversation; and finally, moving forward in time again, there was its Oct. 4, 2016 appearance on the World Economic Forum website as part of their Final Frontier series.

The essay was written by Richard Jones, professor of physics at the University of Sheffield (mentioned here many times before, most recently in a Sept. 4, 2014 posting). His book ‘Soft Machines’ provided me with an important and eminently readable introduction to nanotechnology. Here’s more from his essay (Oct. 3, 2016 on The Conversation) about brains and machines (Note: Links have been removed),

Imagine a condition that leaves you fully conscious, but unable to move or communicate, as some victims of severe strokes or other neurological damage experience. This is locked-in syndrome, when the outward connections from the brain to the rest of the world are severed. Technology is beginning to promise ways of remaking these connections, but is it our ingenuity or the brain’s that is making it happen?

Ever since an 18th-century biologist called Luigi Galvani made a dead frog twitch we have known that there is a connection between electricity and the operation of the nervous system. We now know that the signals in neurons in the brain are propagated as pulses of electrical potential, whose effects can be detected by electrodes in close proximity. So in principle, we should be able to build an outward neural interface system – that is to say, a device that turns thought into action.

In fact, we already have the first outward neural interface system to be tested in humans. It is called BrainGate and consists of an array of micro-electrodes, implanted into the part of the brain concerned with controlling arm movements. Signals from the micro-electrodes are decoded and used to control the movement of a cursor on a screen, or the motion of a robotic arm.

A crucial feature of these systems is the need for some kind of feedback. A patient must be able to see the effect of their willed patterns of thought on the movement of the cursor. What’s remarkable is the ability of the brain to adapt to these artificial systems, learning to control them better.
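
For readers who like to see an idea as code, here is a minimal sketch of the decoding step described above: a toy linear decoder that maps simulated firing rates from an electrode array onto a two-dimensional cursor velocity. It is purely illustrative, with made-up data and made-up tuning; it is not the actual BrainGate algorithm.

import numpy as np

# Toy linear decoder: map firing rates from an electrode array to a 2-D cursor velocity.
# Everything here is simulated and illustrative; it is not the BrainGate decoder or data.
rng = np.random.default_rng(0)
n_channels, n_samples = 100, 2000                  # a 100-electrode array, invented "trials"

true_velocity = rng.normal(size=(n_samples, 2))    # intended cursor velocity (vx, vy)
tuning = rng.normal(size=(2, n_channels))          # each channel's (made-up) directional tuning
rates = true_velocity @ tuning + 0.5 * rng.normal(size=(n_samples, n_channels))  # noisy rates

# Fit the decoder by least squares on "calibration" data, then decode held-out activity.
weights, *_ = np.linalg.lstsq(rates[:1500], true_velocity[:1500], rcond=None)
decoded_velocity = rates[1500:] @ weights          # estimated cursor velocities

Real systems layer far more sophistication on top of this (spike sorting, Kalman-style filtering, frequent recalibration), and, as Jones notes, the crucial ingredient is the closed loop: the user watches the cursor, and the brain learns to drive the decoder better.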

You can find out more about BrainGate in my May 17, 2012 posting, which also features a video of a woman controlling a mechanical arm so she can drink from a cup of coffee by herself for the first time in 15 years.

Jones goes on to describe the cochlear implants (although there’s no mention of the controversy; not everyone believes they’re a good idea) and retinal implants that are currently available. He notes this (Note: Links have been removed),

The key message of all this is that brain interfaces now are a reality and that the current versions will undoubtedly be improved. In the near future, for many deaf and blind people, for people with severe disabilities – including, perhaps, locked-in syndrome – there are very real prospects that some of their lost capabilities might be at least partially restored.

Until then, our current neural interface systems are very crude. One problem is size; the micro-electrodes in use now, with diameters of tens of microns, may seem tiny, but they are still coarse compared to the sub-micron dimensions of individual nerve fibres. And there is a problem of scale. The BrainGate system, for example, consists of 100 micro-electrodes in a square array; compare that to the many tens of billions of neurons in the brain. The fact these devices work at all is perhaps more a testament to the adaptability of the human brain than to our technological prowess.

Scale models

So the challenge is to build neural interfaces on scales that better match the structures of biology. Here, we move into the world of nanotechnology. There has been much work in the laboratory to make nano-electronic structures small enough to read out the activity of a single neuron. In the 1990s, Peter Fromherz, at the Max Planck Institute for Biochemistry, was a pioneer of using silicon field effect transistors, similar to those used in commercial microprocessors, to interact with cultured neurons. In 2006, Charles Lieber’s group at Harvard succeeded in using transistors made from single carbon nanotubes – whiskers of carbon just one nanometer in diameter – to measure the propagation of single nerve pulses along the nerve fibres.

But these successes have been achieved, not in whole organisms, but in cultured nerve cells which are typically on something like the surface of a silicon wafer. It’s going to be a challenge to extend these methods into three dimensions, to interface with a living brain. Perhaps the most promising direction will be to create a 3D “scaffold” incorporating nano-electronics, and then to persuade growing nerve cells to infiltrate it to create what would in effect be cyborg tissue – living cells and inorganic electronics intimately mixed.

I have featured Charles Lieber and his work here in two recent posts: ‘Bionic’ cardiac patch with nanoelectric scaffolds and living cells on July 11, 2016 and Long-term brain mapping with injectable electronics on Sept. 22, 2016.

For anyone interested in more about the controversy regarding cochlear implants, there’s this page on the Brown University (US) website. You might also want to check out Gregor Wolbring (professor at the University of Calgary) who has written extensively on the concept of ableism (links to his work can be found at the end of this post). I have excerpted from an Aug. 30, 2011 post the portion where Gregor defines ‘ableism’,

From Gregor’s June 17, 2011 posting on the FedCan blog,

The term ableism evolved from the disabled people rights movements in the United States and Britain during the 1960s and 1970s.  It questions and highlights the prejudice and discrimination experienced by persons whose body structure and ability functioning were labelled as ‘impaired’ as sub species-typical. Ableism of this flavor is a set of beliefs, processes and practices, which favors species-typical normative body structure based abilities. It labels ‘sub-normative’ species-typical biological structures as ‘deficient’, as not able to perform as expected.

The disabled people rights discourse and disability studies scholars question the assumption of deficiency intrinsic to ‘below the norm’ labeled body abilities and the favoritism for normative species-typical body abilities. The discourse around deafness and Deaf Culture would be one example where many hearing people expect the ability to hear. This expectation leads them to see deafness as a deficiency to be treated through medical means. In contrast, many Deaf people see hearing as an irrelevant ability and do not perceive themselves as ill and in need of gaining the ability to hear. Within the disabled people rights framework ableism was set up as a term to be used like sexism and racism to highlight unjust and inequitable treatment.

Ableism is, however, much more pervasive.

You can find out more about Gregor and his work here: http://www.crds.org/research/faculty/Gregor_Wolbring2.shtml or here:
https://www.facebook.com/GregorWolbring.

Richard Jones and soft nanotechnology

One of the first posts on this blog was about Richard Jones’ nanotechnology book, ‘Soft Machines’. I have a ‘soft’ spot for the book which I found to be a good introduction to nanotechnology and well written too.

It’s nice to see the book getting some more attention all these years later. In his Aug. 31, 2014 posting on Nanodot (the Foresight Institute’s blog), James Lewis notes that nanomanufacturing has not progressed as some of the early thinkers in this area had hoped,

Long-term readers of Nanodot will be familiar with the work of Richard Jones, a UK physicist and author of Soft Machines: Nanotechnology and Life, reviewed in Foresight Update Number 55 (2005) page 10. Basically Jones follows Eric Drexler’s lead in Engines of Creation in arguing that the molecular machinery found in nature provides an existence proof of an advanced nanotechnology of enormous capabilities. However, he cites the very different physics governing biomolecular machinery operating in an aqueous environment on the one hand, and macroscopic machine tools of steel and other hard metals, on the other hand. He then argues that rigid diamondoid structures doing atomically precise mechanochemistry, as later presented by Drexler in Nanosystems, although at least theoretically feasible, do not form a practical path to advanced nanotechnology. This stance occasioned several very useful and informative debates on the relative strengths and weaknesses of different approaches to advanced nanotechnology, both on his Soft Machines blog and here on Nanodot (for example “Debate with ‘Soft Machines’ continues”, “Which way(s) to advanced nanotechnology?”, “Recent commentary”). An illuminating interview of Richard Jones over at h+ Magazine not only presents Jones’s current views, but spotlights the lack of substantial effort since 2008 in trying to resolve these issues: “Going Soft on Nanotech”.

Lewis goes on to excerpt parts of the H+ interview which pertain to manufacturing and discusses the implications further. (Note: Eric Drexler not only popularized nanotechnology and introduced us to ‘grey goo’ with his book ‘Engines of Creation’, he also founded the Foresight Institute with his then wife, Christine Peterson. Drexler is no longer formally associated with Foresight.)

In the interests of avoiding duplication, I am focusing on the parts of the H+ interview concerning soft machines and synthetic biology and topics other than manufacturing. From the Nov. 23, 2013 article by Eddie Germino for H+ magazine,

H+: What are “soft machines”?

RJ: I called my book “Soft Machines” to emphasise that the machines of cell biology work on fundamentally different principles to the human-made machines of the macro-world.  Why “soft”?  As a physicist, one of my biggest intellectual influences was the French theoretical physicist Pierre-Gilles de Gennes (1932-2007, Nobel Prize for Physics 1991).  De Gennes popularised the term “soft matter” for those kinds of materials – polymers, colloids, liquid crystals etc – in which the energies with which molecules interact with each other are comparable with thermal energies, making them soft, mutable and responsive.  These are the characteristics of biological matter, so calling the machines of biology “soft machines” emphasises the different principles on which they operate.  Some people will also recognise the allusion to a William Burroughs novel (for whom a soft machine is a human being).
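
To put a rough number on “comparable with thermal energies”: at room temperature the thermal energy scale works out to about a fortieth of an electron volt (my back-of-the-envelope figure, not Jones’s),

k_{\mathrm{B}}T \approx 1.38\times10^{-23}\,\mathrm{J\,K^{-1}} \times 300\,\mathrm{K} \approx 4\times10^{-21}\,\mathrm{J} \approx 25\,\mathrm{meV} \approx \tfrac{1}{40}\,\mathrm{eV}.

A hydrogen bond, at roughly 0.1–0.3 eV, is only a few to ten times this, so the structures it holds together are constantly being jostled and remade by thermal motion; a covalent bond, at several eV (hundreds of times the thermal energy), is effectively permanent by comparison. That is the sense in which soft and biological matter is mutable and responsive.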

H+: What kind of work have you done with soft machines?

RJ: In my own lab we’ve been working on a number of “soft machine” related problems.  At the near-term end, we’ve been trying to understand what makes the molecules go where when you manufacture a solar cell from solutions of organic molecules – the idea here is that if you understand the self-assembly processes you can get a well-defined nanostructure that gives you a high conversion efficiency with a process you can use on a very large scale very cheaply. Further away from applications, we’ve been investigating a new mechanism for propelling micro- and nano-scale particles in water.  We use a spatially asymmetric chemical reaction so the particle creates a concentration gradient around itself, as a result of which osmotic pressure pushes it along.
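
By way of illustration only, here is a toy two-dimensional “active Brownian particle” simulation, the minimal model often used for self-propelled colloids. It stands in for, but does not model, the chemical propulsion mechanism Jones describes, and every parameter is invented for the example.

import numpy as np

# Toy 2-D "active Brownian particle": constant propulsion speed along a direction that
# drifts by rotational diffusion, plus ordinary translational diffusion. A stand-in for a
# chemically self-propelled particle; it does not model the surface reaction itself,
# and all parameters are invented for illustration.
rng = np.random.default_rng(1)
v = 5.0                 # propulsion speed, micrometres per second
D_t = 0.2               # translational diffusion coefficient, um^2/s
D_r = 0.5               # rotational diffusion coefficient, 1/s
dt, n_steps = 0.01, 10_000

positions = np.zeros((n_steps, 2))
theta = 0.0             # orientation of the propulsion axis
for i in range(1, n_steps):
    theta += np.sqrt(2 * D_r * dt) * rng.normal()
    drift = v * dt * np.array([np.cos(theta), np.sin(theta)])
    noise = np.sqrt(2 * D_t * dt) * rng.normal(size=2)
    positions[i] = positions[i - 1] + drift + noise

# At times short compared with 1/D_r the track looks ballistic (propelled); at long times
# it crosses over to an enhanced random walk.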

H+: Putting aside MNT [molecular nanotechnology], what other design approaches would be most likely to yield advanced nanomachines?

RJ: If we are going to use the “soft machines” design paradigm to make functional nano machines, we have two choices.  We can co-opt what nature does, modifying biological systems to do what we want.  In essence, this is what is underlying the current enthusiasm for synthetic biology.  Or we can make synthetic molecules and systems that copy the principles that biology uses, possibly thereby widening the range of environments in which it will work.  Top-down methods are still enormously powerful, but they will have limits.

H+: So “synthetic biology” involves the creation of a custom-made microorganism built with the necessary organic parts and DNA to perform a desired function. Even if it is manmade, it only uses recognizable, biological parts in its construction, albeit arranged in ways that don’t occur in nature. But the second approach involving “synthetic molecules and systems that copy the principles that biology uses” is harder to understand. Can you give some clarifying examples?

RJ: If you wanted to make a molecular motor to work in water, you could use the techniques of molecular biology to isolate biological motors from cells, and this approach does work.  Alternatively, you could work out the principles by which the biological motor worked – these involve shape changes in the macromolecules coupled to chemical reactions – and try to make a synthetic molecule which would operate on similar principles.  This is more difficult than hacking out parts from a biological system, but will ultimately be more flexible and powerful.

H+: Why would it be more flexible and powerful?

RJ: The problem with biological macromolecules is that biology has evolved very effective mechanisms for detecting them and eating them.  So although DNA, for example, is a marvellous material for building nanostructures and devices from, it’s going to be difficult to use these directly in medicine simply because our cells are very good at detecting and destroying foreign DNA.  So using synthetic molecules should lead to more robust systems that can be used in a wider range of environments.

H+: In spite of your admiration for nanoscale soft machines, you’ve said that manmade technology has a major advantage because it can make use of electricity in ways living organisms can’t. Will soft machines use electricity in the future somehow?

RJ: Biology uses electrical phenomenon quite a lot – e.g. in our nervous system – but generally this relies on ion transport rather than coherent electron transport.  Photosynthesis is an exception, as may be certain electron transporting structures recently discovered in some bacteria.  There’s no reason in principle that the principles of self-assembly shouldn’t be used to connect up electronic circuits in which the individual elements are single conducting or semi-conducting molecules.  This idea – “molecular electronics” – is quite old now, but it’s probably fair to say that as a field it hasn’t progressed as fast as people had hoped.

Jones also discusses the term nanotechnology and takes a foray into transhumanism and the singularity (from the Germino article),

H+: What do you think of the label “nanotechnology”? Is it a valid field? What do people most commonly misunderstand about it? 

RJ: Nanotechnology, as the term is used in academia and industry, isn’t really a field in the sense that supramolecular chemistry or surface physics are fields.  It’s more of a socio-political project, which aims to do to physical scientists what the biotech industry did to life scientists – that is, to make them switch their focus from understanding nature to intervening in nature by making gizmos and gadgets, and then to try and make money from that.

What I’ve found, doing quite a lot of work in public engagement around nanotechnology, is that most people don’t have enough awareness of nanotechnology to misunderstand it at all.  Among those who do know something about it, I think the commonest misunderstanding is the belief that it will progress much more rapidly than is actually possible.  It’s a physical technology, not a digital one, so it won’t proceed at the pace we see in digital technologies.  As all laboratory-based nanotechnologists know, the physical world is more cussed than the digital one, and the smaller it gets the more cussed it seems to be…

… 

H+: Your thoughts on picotechnology and femtotechnology?

RJ: There’s a roughly inverse relationship between the energy scales needed to manipulate matter and the distance scale at which that manipulation takes place. Manipulating matter at the picometer scale is essentially a matter of controlling electron energy levels in atoms, which involves electron volt energies.  This is something we’ve got quite good at when we make lasers, for example.  Things are more difficult when we go smaller.  To manipulate matter at the nuclear level – i.e. on femtometer length scales – needs MeV energies, while to manipulate matter at the level of the constituents of hadrons – quarks and gluons – we need GeV energies.  At the moment our technology for manipulating objects at these energy scales is essentially restricted to hurling things at them, which is the business of particle accelerators.  So at the moment we really have no idea how to do femtotechnology of any kind of complexity, nor do we have any idea whether there is anything interesting we could do with it if we could.  I suppose the question is whether there is any scope for complexity within nuclear matter.  Perhaps if we were the sorts of beings that lived inside a neutron star or a quark-gluon plasma we’d know.
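
A back-of-the-envelope way to see that inverse relationship is the standard uncertainty-principle estimate (my rough figures, not Jones’s),

\Delta E \sim \frac{\hbar c}{\Delta x}, \qquad \hbar c \approx 197\,\mathrm{MeV\cdot fm},

so a length scale of 1 fm (nuclei) corresponds to roughly 200 MeV, and probing well inside a nucleon (quarks and gluons) pushes into the GeV range. For atoms, the relevant figure is the binding energy rather than this relativistic estimate: the Rydberg energy of 13.6 eV sets the electron-volt scale Jones mentions for picometer-scale manipulation.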

H+: What do you think of the transhumanist and Singularity movements?

RJ: These are terms that aren’t always used with clearly understood meanings, by me at least.  If by Transhumanism, we are referring to the systematic use of technology to better the lot of humanity, then I’m all in favour.  After all, the modern Western scientific project began with Francis Bacon, who said its purpose was “an improvement in man’s estate and an enlargement of his power over nature”.  And if the essence of Singularitarianism is to say that there’s something radically unknowable about the future, then I’m strongly in agreement.  On the other hand, if we consider Transhumanism and Singularitarianism as part of a belief package promising transcendence through technology, with a belief in a forthcoming era of material abundance, superhuman wisdom and everlasting life, then it’s interesting as a cultural phenomenon.  In this sense it has deep roots in the eschatologies of the apocalyptic traditions of Christianity and Judaism.  These were secularised by Marx and Trotsky, and technologised through, on the one hand, Fyodorov, Tsiolkovsky and the early Russian ideologues of space exploration, and on the other by the British Marxist scientists J.B.S. Haldane and Desmond Bernal.  Of course, the fact that a set of beliefs has a colourful past doesn’t mean they are necessarily wrong, but we should be aware that the deep tendency of humans to predict that their wishes will imminently be fulfilled is a powerful cognitive bias.

Richard goes into more depth about his views on transhumanism and the singularity in an Aug. 24, 2014 posting on his Soft Machines blog,

Transhumanism has never been modern

Transhumanists are surely futurists, if they are nothing else. Excited by the latest developments in nanotechnology, robotics and computer science, they fearlessly look ahead, projecting consequences from technology that are more transformative, more far-reaching, than the pedestrian imaginations of the mainstream. And yet, their ideas, their motivations, do not come from nowhere. They have deep roots, perhaps surprising roots, and following those intellectual trails can give us some important insights into the nature of transhumanism now. From antecedents in the views of the early 20th century British scientific left-wing, and in the early Russian ideologues of space exploration, we’re led back, not to rationalism, but to a particular strand of religious apocalyptic thinking that’s been a persistent feature of Western thought since the middle ages.

The essay that follows is quite dense (many of the thinkers he cites are new to me) so if you’re a beginner in this area, you may want to set some time aside to read this in depth. Also, you will likely want to read the comments which follow the post.

“I write in praise of air,” a catalytic poem absorbing air pollutants on a nanotechnology-enabled billboard

The poem ‘In Praise of Air’, which is on a billboard at the University of Sheffield (UK), is quite literally catalytic. From a May 15, 2014 news item on Nanowerk,

Simon [Armitage], Professor of Poetry at the University, and Pro-Vice-Chancellor for Science Professor Tony Ryan, have collaborated to create a catalytic poem called In Praise of Air – printed on material containing a formula invented at the University which is capable of purifying its surroundings.

Here’s what the billboard looks like,

Courtesy of the University of Sheffield

A May 14, 2014 University of Sheffield news release, which originated the news item, has more details about the project from the scientist’s perspective,

This cheap technology could also be applied to billboards and advertisements alongside congested roads to cut pollution.

Professor Ryan, who came up with the idea of using treated materials to cleanse the air, said: “This is a fun collaboration between science and the arts to highlight a very serious issue of poor air quality in our towns and cities.

“The science behind this is an additive which delivers a real environmental benefit that could actually help cut disease and save lives.

“This poem alone will eradicate the nitrogen oxide pollution created by about 20 cars every day.”

He added: “If every banner, flag or advertising poster in the country did this, we’d have much better air quality. It would add less than £100 to the cost of a poster and would turn advertisements into catalysts in more ways than one. The countless thousands of poster sites that are selling us cars beside our roads could be cleaning up emissions at the same time.”

The 10m x 20m piece of material which the poem is printed on is coated with microscopic pollution-eating particles of titanium dioxide which use sunlight and oxygen to react with nitrogen oxide pollutants and purify the air.

Professor Ryan has been campaigning for some time to have his ingredient added to washing detergent in the UK as part of his Catalytic Clothing project. If manufacturers added it, the UK would meet one of its air quality targets in one step.
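
For the chemically curious, the commonly cited (and much simplified) scheme for photocatalytic removal of nitrogen oxides on titanium dioxide runs roughly as follows; this is my gloss on the textbook mechanism, not something taken from the Sheffield news release,

\mathrm{TiO_2} + h\nu \rightarrow e^- + h^+, \qquad h^+ + \mathrm{H_2O} \rightarrow {\cdot}\mathrm{OH} + \mathrm{H^+}, \qquad \mathrm{NO} \xrightarrow{{\cdot}\mathrm{OH}} \mathrm{HNO_2} \xrightarrow{{\cdot}\mathrm{OH}} \mathrm{NO_2} \xrightarrow{{\cdot}\mathrm{OH}} \mathrm{HNO_3}.

The nitric acid/nitrate end product sits on the surface in tiny quantities and is washed away by rain.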

The news release also describes the arts component and poet’s perspective on this project,

The poem will be on display on the side of the University’s Alfred Denny Building, Western Bank, for one year and its unveiling also marks the launch of this year’s Sheffield Lyric Festival which takes place between 14-17 May 2014 at the University’s Firth Hall.

At a special celebratory event on Thursday (May 15 2014), Simon will read In Praise of Air for the first time in public and Professor Ryan will explain the technology behind the catalytic poem. Volunteers will be wearing catalytic T-shirts.

Dr Joanna Gavins, from the University’s School of English, project manager for the catalytic poem collaboration, who also leads the Lyric Festival, said: “This highlights the innovation and creativity at the heart of the University and its research excellence.

“We are delighted that such a significant event will help launch this year’s Lyric Festival which also features poetry readings by students of the MA in Creative Writing, alongside internationally renowned writers such as Sinead Morrissey and Benjamin Zephaniah, and music from celebrated Sheffield songwriter, Nat Johnson.”

Simon added: “There’s a legacy of poems in public places in Sheffield and, on behalf of the University, I wanted to be part of that dialogue to show what we could do.

“I wanted to write a poem that was approachable, that might catch the attention of the passer-by and the wandering mind, and one that had some local relevance too. But I also hope it’s robust and intricate enough to sustain deeper enquiries – the School of English looks towards it for one thing, and I’d like to think it’s capable of getting the thumbs up or at least a nod from their direction, and from the big-brained students walking up and down Western Bank, and from discerning residents in the neighbourhood.”

He added: “I’ve enjoyed working with the scientists and the science, trying to weave the message into the words, wanting to collaborate both conceptually and with the physical manifestation of the work.

“Poetry often comes out with the intimate and the personal, so it’s strange to think of a piece in such an exposed place, written so large and so bold. I hope the spelling is right!”

For the curious, here’s a link to the In Praise of Air project website where you’ll find the poem and much more,

I write in praise of air.  I was six or five
when a conjurer opened my knotted fist
and I held in my palm the whole of the sky.
I’ve carried it with me ever since.

Let air be a major god, its being
and touch, its breast-milk always tilted
to the lips.  Both dragonfly and Boeing
dangle in its see-through nothingness…

Among the jumbled bric-a-brac I keep
a padlocked treasure-chest of empty space,
and on days when thoughts are fuddled with smog
or civilization crosses the street

with a white handkerchief over its mouth
and cars blow kisses to our lips from theirs
I turn the key, throw back the lid, breathe deep.
My first word, everyone’s  first word, was air.

I like this poem a lot and find it quite inspirational for one of my own projects.

Getting back to Tony Ryan, he and his Catalytic Clothing project have been mentioned here in a Feb. 24, 2012 posting (Catalytic Clothing debuts its kilts at Edinburgh International Science Festival) and in a July 8, 2011 posting featuring a collaboration between Ryan and Professor Helen Storey at the London College of Fashion (Nanotechnology-enabled Catalytic Clothes look good and clean the air). The 2012 posting has an image of two kilted gentlemen and the 2011 posting has a video highlighting one of the dresses, some music from Radiohead, and the ideas behind the project.

You can find out more about Catalytic Clothing and the Lyric Festival (from the news release),

Catalytic Clothing

To find out more about the catalytic clothing project visit http://www.catalytic-clothing.org

Lyric Festival

The Lyric Festival is the [University of Sheffield] Faculty of Arts and Humanities’ annual celebration of the written and spoken word. Each May the festival brings some of the UK’s most renowned and respected writers, broadcasters, academics, and performers to the University, as well as showcasing the talent of Faculty students and staff. For more information visit http://www.sheffield.ac.uk/lyric

One last note about the University of Sheffield: it’s the academic home of Professor Richard Jones, who wrote one of my favourite books about nanotechnology, Soft Machines (featured in one of my earliest pieces here, a May 6, 2008 posting). He is the Pro-Vice-Chancellor – Research & Innovation at the university and a blogger on his Soft Machines blog, where he writes about innovation and research in the UK and where you’ll also find a link to purchase his book.

ETA May 20, 2014: A May 19, 2014 article by JW Dowey for Earth Times offers more details about the technology,

Titanium dioxide coating on cars and aircraft have revolutionised protective nanotechnology. The University of Sheffield has set the target as absorbing the poisonous compounds from vehicle exhausts. Tony Ryan is the professor of physical chemistry in charge of adapting self-cleaning window technology to pollution solutions. The 10m x20m poster they now use on the Alfred Denny university building demonstrates how nitrogen oxides from 20 cars per day could be absorbed efficiently by roadside absorption.

There are more tidbits to be had in the article including the extra cost (£100) of adding the protective coating to the ‘poetic’ billboard (or hoarding as they say in the UK).

Scientific research, failure, and the scanning tunneling microscope

“99% of all you do is failure and that’s maybe the most difficult part of basic research,” said Gerd Binnig in a snippet I’ve culled from an interview with Dexter Johnson (Nanoclast blog on the IEEE [Institute of Electrical and Electronics Engineers] website) posted May 23, 2011 where Binnig discussed why he continued with a project that had failed time and time again. (The snippet is from the 2nd audio file from the top of the posting)

Binnig, along with Heinrich Rohrer, is a Nobel Laureate. Both men won their award for work on the scanning tunneling microscope (STM), the project that had failed countless times and that went on to play an important part in the nanotechnology narrative. Earlier this month, both men were honoured when IBM and ETH Zurich opened the Binnig and Rohrer Nanotechnology Center in Zurich. From the May 17, 2011 news item on Nanowerk,

IBM and ETH Zurich, a premiere European science and engineering university, hosted more than 600 guests from industry, academia and government, to open the Binnig and Rohrer Nanotechnology Center located on the campus of IBM Research – Zurich. The facility is the centerpiece of a 10-year strategic partnership in nanoscience between IBM and ETH Zurich where scientists will research novel nanoscale structures and devices to advance energy and information technologies.

The new Center is named for Gerd Binnig and Heinrich Rohrer, the two IBM scientists and Nobel Laureates who invented the scanning tunneling microscope at the Zurich Research Lab in 1981, thus enabling researchers to see atoms on a surface for the first time. The two scientists attended today’s opening ceremony, at which the new lab was unveiled to the public.

Here’s an excerpt from Dexter’s posting where he gives some context for the audio files,

As promised last week, I would like to share some audio recordings I made of Gerd Binnig and Heinrich Rohrer taking questions from the press during the opening of the new IBM and ETH Zurich nanotechnology laboratory named in their honor.

This first audio file features both Binnig’s and Rohrer’s response to my question of why they were interested in looking at inhomogenities on surfaces in the first place, which led them eventually to creating an instrument for doing it. A more complete history of the STM’s genesis can be found in their joint Nobel lecture here.

The sound quality isn’t the best but these snippets are definitely worth listening to if you find the process of scientific inquiry interesting.

For anyone who’s not familiar with the scanning tunneling microscope, I found this description in the book, Soft Machines: Nanotechnology and Life, by Richard Jones.

Scanning probe microscopes rely on an entirely different principle to both light microscopes and electron microscopes, or indeed our own eyes. Rather than detecting waves that have been scattered from the object we are looking at, one feels the surface of that object with a physical probe. This probe is moved across the surface with high precision. As it tracks the contours of the surface, it is moved up or down in a way that is controlled by some interaction between the tip of the probe and the surface. This interaction could be the flow of electrical current, in the case of a scanning tunneling microscope, or simply the force between the tip and the surface in the case of an atomic force microscope. pp. 17-18
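
To make the feedback idea concrete, here is a minimal constant-current feedback loop in Python; a toy model with invented numbers, not a description of any real instrument. The tunnelling current falls off exponentially with the tip-surface gap, and the controller moves the tip up or down to hold the current at a set-point, so the record of tip height traces out the surface contour.

import numpy as np

# Toy constant-current STM feedback loop; invented parameters, purely illustrative.
kappa = 10.0            # decay constant of the tunnelling current, 1/nm (roughly the right order)
I_set = 1.0             # current set-point, nA
gain = 0.05             # feedback gain, nm of tip motion per unit of log-current error

x = np.linspace(0, 10, 500)                    # lateral positions along the scan line, nm
surface = 0.1 * np.sin(2 * np.pi * x / 2.5)    # pretend surface corrugation, nm

z_tip, trace = 1.0, []                         # tip height above z = 0, nm
for z_surface in surface:
    gap = z_tip - z_surface
    current = I_set * np.exp(-2 * kappa * (gap - 1.0))   # current depends exponentially on the gap
    z_tip += gain * np.log(current / I_set)              # retract if current too high, approach if too low
    trace.append(z_tip)

# In constant-current mode the recorded tip height ("trace") follows the surface contour,
# offset by the constant working gap; that record is the STM image.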

24 regional nanotechnology centres on the block in UK is old news?

From Siobhan Wagner’s July 23, 2010 article on The Engineer website,

Science minister David Willetts told MPs yesterday it is ‘most unlikely’ the UK’s 24 nanotechnology centres will still be in existence in 18 months time.

In the first public meeting of the House of Commons science and technology committee, Willetts said the UK has too many centres that are ‘sub-critical in size’ and resources are fractionalised by region.

‘We have been getting a strong message that especially when times are tight that people want fewer, stronger centres,’ he said.

Given the budget concerns in the UK, the move can’t be any surprise. From Richard Jones’ July 11, 2010 posting on his Soft Machines blog (made before this potential cut was announced),

We know that the budget of his [Willetts’] department – Business, Innovation and Skills – will be cut by somewhere between 25%-33%. [emphasis mine] Science accounts for about 15% of this budget, with Universities accounting for another 29% (not counting the cost of student loans and grants, which accounts for another 27%). So, there’s not going to be a lot of room to protect spending on science and on research in Universities.

What I found particularly interesting in this posting is Willetts’ reference to a philosopher in his speech made July 9, 2010 and Jones’ discussion of what this reference might mean as the UK government grapples with science research, budget cuts, and finding common ground within a coalition that shares the rights and responsibilities of ruling,

More broadly, as society becomes more diverse and cultural traditions increasingly fractured, I see the scientific way of thinking – empiricism – becoming more and more important for binding us together. Increasingly, we have to abide by John Rawls’s standard for public reason – justifying a particular position by arguments that people from different moral or political backgrounds can accept. And coalition, I believe, is good for government and for science, given the premium now attached to reason and evidence. [Jones’ excerpt of Willetts’ speech]

The American political philosopher John Rawls was very concerned about how, in a pluralistic society, one could agree on a common set of moral norms. He rejected the idea that you could construct morality on entirely scientific grounds, as consequentialist ethical systems like utilitarianism try to, instead looking for a principles based morality; but he recognised that this was problematic in a society where Catholics, Methodists, Atheists and Muslims all had their different sets of principles. Hence the idea of trying to find moral principles that everyone in society can agree on, even though the grounds on which they approve of these principles may differ from group to group. In a coalition uniting parties including people as different as Evan Harris and Philippa Stroud [I assume one is a conservative and the other a liberal democrat in the UK’s coalition government] one can see why Willetts might want to call in Rawls for help.

Jones’ posting provides other insights into Willetts’ perspective. (BTW, if you do check out the blog, be sure to read the comments.) As for what this perspective might mean relative to the proposed cut, I don’t know. Unfortunately, I have to wait for a future posting from Jones, in which he will discuss,

The other significant aspect of Willetts’s speech was a wholesale rejection of the “linear model” of science and innovation, but this needs another post to discuss in detail.

In the meantime, Tim Harper, principal of Cientifica (a nanotechnology consulting firm) and TNT blogger, notes,

The lack of any reaction to Friday’s announcement that many of the UK’s nanotech centres would be unlikely to survive is because it is old news.

He goes on to speculate that the government is gradually preparing the public for the really big cuts due in October 2010. He also provides a brief history of the centres and some of the peculiar circumstances of their existence.

Nanotechnology and sunscreens: recalibrating positions and the excruciating business of getting it as right as possible

I’ve been waiting for Andrew Maynard’s comments (on his 2020 Science blog) about the response from Friends of the Earth (FoE) guest bloggers Georgia Miller and Ian Illuminato (ETA June 6, 2016: Just how risky can nanoparticles in sunscreens be? Friends of the Earth respond; a June 15, 2010 posting on the 2020 Science blog) to his earlier posting (Just how risky could nanoparticles in sunscreens be?), in which he challenged them to quantify the nanosunscreen risk to consumers. His reflections on the FoE response and the subsequent discussion are well worth reading. From Andrew’s posting, The safety of nanotechnology-based sunscreens – some reflections,

Getting nanomaterials’ use in context. First, Georgia and Ian, very appropriately in my opinion, brought up the societal context within which new technologies and products are developed and used:

“why not support a discussion about the role of the precautionary principle in the management of uncertain new risks associated with emerging technologies? Why not explore the importance of public choice in the exposure to these risks? Why not contribute to a critical discussion about whose interests are served by the premature commercialisation of products about whose safety we know so little, when there is preliminary evidence of risk and very limited public benefit.”

Andrew again,

… we need to think carefully about how we use scientific knowledge and data – “evidence” – in making decisions.

As he goes on to point out, cherrypicking data isn’t a substantive means of supporting your position over the long run.

Unfortunately, cherry-picking is a common practice on all sides: policymakers, politicians, civil society groups, consumers, medical institutions, and so on. And these days we don’t have the luxury previous generations enjoyed of remaining ignorant about downsides such as pollution and chemical poisoning on a global scale.

Three of the scientists whose work was cited by FoE as proof that nanosunscreens are dangerous either posted comments directly or asked Andrew to post comments on their behalf; these clarified the situation with exquisite care,

Despite FoE’s implications that nanoparticles in sunscreens might cause cancer because they are photoactive, Peter Dobson points out that there are nanomaterials used in sunscreens that are designed not to be photoactive. Brian Gulson, whose work on zinc skin penetration was cited by FoE, points out that his studies only show conclusively that zinc atoms or ions can pass through the skin, not that nanoparticles can pass through. He also notes that the amount of zinc penetration from zinc-based sunscreens is very much lower than the level of zinc people have in their body in the first place. Tilman Butz, who led one of the largest projects on nanoparticle penetration through skin to date, points out that – based on current understanding – the nanoparticles used in sunscreens are too large to penetrate through the skin.

These three comments alone begin to cast the potential risks associated with nanomaterials in sunscreens in a very different light to that presented by FoE. Certainly there are still uncertainties about the possible consequences of using these materials – no-one is denying that. But the weight of evidence suggests that nanomaterials within sunscreens – if engineered and used appropriately – do not present a clear and present threat to human health.

Go to the comments section of the 2020 Science blog for the full text of Peter Dobson’s response, Brian Gulson’s response posted by Andrew on Gulson’s behalf, and Tilman Butz’s response posted by Andrew on Butz’s behalf. (I found these comments very helpful as I had made the mistake of assuming that there was proof that nanoparticles do penetrate the skin barrier [as per my posting of June 23, 2010].)

I want to point out that the stakes are quite high despite the fact that sunscreens are classified as cosmetics. I’ve heard at least one commentator (Pat Roy Mooney of The ETC Group, interview at the 2009 Elevate Festival, at 4:32) scoff at nanotechnology being used in cosmetics, as if it were frivolous. Given the important role sunscreens play in our health these days, a safe sunscreen has to be high on the list of most people’s priorities, but this leads to a question.

Should we stop developing more effective nanotechnology-enabled sunscreens (and by extension, other nanotechnology-enabled products) due to concern that we may cause more harm than good?

Andrew goes on to provide some interesting insight into the issue, citing the Precautionary Principle and supplementing his comments with suggestions from Richard Jones (author of the Soft Machines book and blog, and a consultant to the UK government on various nanotechnology topics) for refining the Precautionary Principle guidelines,

1. what are the benefits that the new technology provides – what are the risks and uncertainties associated with not realising these benefits?

2. what are the risks and uncertainties attached to any current ways we have of realising these benefits using existing technologies?

3. what are the risks and uncertainties of the new technology?

I strongly suggest that anyone interested in the issues around risk, the precautionary principle, emerging technologies, and the role of research read this posting (as well as its predecessors) and as much of the discussion as you can manage.

One additional thought, posited in the comments section by Hilary Sutcliffe (you’ll need to scroll through the comments as I haven’t figured out how to create a direct link to her comment), has to do with companies’ role in making their own research available to the discussion about health, safety, and the environment (HSE),

… we need to be able to access ‘the best available information’ in order to make informed decisions in the face of uncertainty and enable the rounded assessment that Prof Richard Jones suggests. This is indeed essential, but ‘we’ are usually constrained by the lack of one very large chunk of ‘available information’ which is the HSE testing the companies themselves have done which leads them to judge the material or product they have developed is safe.

Further on in the comment, she discusses a project (What’s fair to share?) that her organization (MATTER) is planning, which will look at how companies can share their HSE data without giving away intellectual property and/or competitive advantages.

Finally, I want to paraphrase something I said elsewhere. While I am critical of the tactics used by the Friends of the Earth in this instance, there is no doubt in my mind that the organization and other civil society groups serve a very important role in raising much needed discussion about nanotechnology risks.

Quantum realities and perceptions (part 2)

To sum up Friday’s posting: I discussed the nature of reality (both quantum and macro) and its relationship to our perceptions while examining a Buddhist perspective on science. Today, I’m adding a recently published (Nature Nanotechnology) paper, Anticipating the perceived risk of nanotechnologies, by Terre Satterfield [University of British Columbia, Canada], Milind Kandlikar, Christian E. H. Beaudrie, Joseph Conti and Barbara Herr-Harthorn to the mix.

It’s a meta-analysis of a number of public surveys on nanotechnology and perceptions of risk. From the paper,

Perception is critical [] for a number of reasons: because human behaviour is derivative of what we believe or perceive to be true [emphasis mine]; because perceptions and biases are not easily amenable to change with new knowledge []; and because risk perceptions are said to be, at least in part, the result of social and psychological factors and not a ‘knowledge deficit’ about risks per se []. [Note: I can’t figure out how to reproduce the numbered notes in superscripted form as my WordPress installation is still problematic. Please read the article if you are interested in them.] p. 1 of the PDF.

Although the authors of the paper are not concerned with the ultimate nature of reality, the words I’ve emphasized struck home because they touch on the notion of relationships. From Peter McKnight’s article about Buddhism and science,

In other words, how we define the objects of our knowledge — in this case, particles — depends on the capacity we have to know about them. This instrumentalist view has a deeply Kantian flavour: Kant taught that our knowledge of phenomena is a product of the relation between things and our ways of knowing about them, rather than about things themselves.

… [Matthieu Ricard, Buddhist monk and former geneticist, speaking]

“All properties, all observable phenomena, appear in relationship with each other and dependent on each other. This view of interdependence — one thing arising in dependence on another, and their relationship — actually defines what appear to us as objects. So relations and interdependence are the basic fabric of reality. We participate in that interdependence with our consciousness; we crystallize some aspect of it that appears to us as objects.”

At base, it’s our perception that governs our behaviour, which in turn governs our relationships. Richard Jones, in his 2004 book Soft Machines, had this to say,

Issues that concern the nature of life are particularly prone to lead to such a reaction–hence the gulf that has opened up between many scientists and many of the public about the rights and wrongs of genetic modification. These very profound issues about the proper relationship between man and nature are likely to become very urgent as bionanotechnology develops. p. 217

It seems that Jones is not alone; from the Satterfield et al. paper,

More broadly, as applications move as predicted towards more complex domains where bioinformation and nanotechnologies converge, the nature of the risks involved will move beyond the immediate concerns relating to toxicity and enter into contentious moral and ethical terrains. p. 6 of PDF

For me, the whole thing resembles a very complex conversation. More tomorrow.