Tag Archives: Richard Jones

Brain and machine as one (machine/flesh)

The essay on brains and machines becoming intertwined is making the rounds. It appeared on Oct. 3, 2016 on The Conversation and then, on Oct. 4, 2016, in both the Mail & Guardian and on the World Economic Forum website as part of their Final Frontier series.

The essay was written by Richard Jones, professor of physics at the University of Sheffield (mentioned here many times before, most recently in a Sept. 4, 2014 posting). His book ‘Soft Machines’ provided me with an important and eminently readable introduction to nanotechnology. Here’s more from his essay (Oct. 3, 2016 on The Conversation) about brains and machines (Note: Links have been removed),

Imagine a condition that leaves you fully conscious, but unable to move or communicate, as some victims of severe strokes or other neurological damage experience. This is locked-in syndrome, when the outward connections from the brain to the rest of the world are severed. Technology is beginning to promise ways of remaking these connections, but is it our ingenuity or the brain’s that is making it happen?

Ever since an 18th-century biologist called Luigi Galvani made a dead frog twitch we have known that there is a connection between electricity and the operation of the nervous system. We now know that the signals in neurons in the brain are propagated as pulses of electrical potential, whose effects can be detected by electrodes in close proximity. So in principle, we should be able to build an outward neural interface system – that is to say, a device that turns thought into action.

In fact, we already have the first outward neural interface system to be tested in humans. It is called BrainGate and consists of an array of micro-electrodes, implanted into the part of the brain concerned with controlling arm movements. Signals from the micro-electrodes are decoded and used to control the movement of a cursor on a screen, or the motion of a robotic arm.

A crucial feature of these systems is the need for some kind of feedback. A patient must be able to see the effect of their willed patterns of thought on the movement of the cursor. What’s remarkable is the ability of the brain to adapt to these artificial systems, learning to control them better.

You can find out more about BrainGate in my May 17, 2012 posting, which also features a video of a woman controlling a mechanical arm so she can drink from a cup of coffee by herself for the first time in 15 years.
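For readers curious about what “decoding” electrode signals means in practice, here is a toy sketch of the general idea behind such systems: learn a linear mapping from multi-channel firing rates to cursor velocity. To be clear, this is not BrainGate’s actual algorithm; the channel count, synthetic data, and least-squares fit are all illustrative assumptions.

```python
import numpy as np

# Toy sketch of a linear neural decoder (NOT BrainGate's actual algorithm):
# learn a mapping from simulated multi-electrode firing rates to 2D cursor velocity.
rng = np.random.default_rng(0)

n_channels, n_samples = 96, 500            # electrode count, training samples
true_W = rng.normal(size=(2, n_channels))  # hidden "tuning" of each channel

# Simulated training data: Poisson spike counts and the noisy velocities they drove
rates = rng.poisson(lam=5.0, size=(n_channels, n_samples)).astype(float)
velocity = true_W @ rates + rng.normal(scale=0.5, size=(2, n_samples))

# Fit decoder weights by least squares: velocity ≈ W_hat @ rates
W_hat, *_ = np.linalg.lstsq(rates.T, velocity.T, rcond=None)
W_hat = W_hat.T

# Decode a fresh burst of activity into a cursor velocity command
new_rates = rng.poisson(lam=5.0, size=(n_channels,)).astype(float)
decoded_v = W_hat @ new_rates
print(decoded_v.shape)  # (2,) -> x and y velocity
```

The feedback loop Jones emphasizes sits outside this sketch: the patient watches the cursor and the brain itself adapts its activity to make the fitted decoder work better.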

Jones goes on to describe the cochlear implants (although there’s no mention of the controversy; not everyone believes they’re a good idea) and retinal implants that are currently available. Jones notes this (Note: Links have been removed),

The key message of all this is that brain interfaces now are a reality and that the current versions will undoubtedly be improved. In the near future, for many deaf and blind people, for people with severe disabilities – including, perhaps, locked-in syndrome – there are very real prospects that some of their lost capabilities might be at least partially restored.

Until then, our current neural interface systems are very crude. One problem is size; the micro-electrodes in use now, with diameters of tens of microns, may seem tiny, but they are still coarse compared to the sub-micron dimensions of individual nerve fibres. And there is a problem of scale. The BrainGate system, for example, consists of 100 micro-electrodes in a square array; compare that to the many tens of billions of neurons in the brain. The fact these devices work at all is perhaps more a testament to the adaptability of the human brain than to our technological prowess.

Scale models

So the challenge is to build neural interfaces on scales that better match the structures of biology. Here, we move into the world of nanotechnology. There has been much work in the laboratory to make nano-electronic structures small enough to read out the activity of a single neuron. In the 1990s, Peter Fromherz, at the Max Planck Institute for Biochemistry, was a pioneer of using silicon field effect transistors, similar to those used in commercial microprocessors, to interact with cultured neurons. In 2006, Charles Lieber’s group at Harvard succeeded in using transistors made from single carbon nanotubes – whiskers of carbon just one nanometer in diameter – to measure the propagation of single nerve pulses along the nerve fibres.

But these successes have been achieved, not in whole organisms, but in cultured nerve cells which are typically on something like the surface of a silicon wafer. It’s going to be a challenge to extend these methods into three dimensions, to interface with a living brain. Perhaps the most promising direction will be to create a 3D “scaffold” incorporating nano-electronics, and then to persuade growing nerve cells to infiltrate it to create what would in effect be cyborg tissue – living cells and inorganic electronics intimately mixed.

I have featured Charles Lieber and his work here in two recent posts: ‘Bionic’ cardiac patch with nanoelectric scaffolds and living cells on July 11, 2016 and Long-term brain mapping with injectable electronics on Sept. 22, 2016.

For anyone interested in more about the controversy regarding cochlear implants, there’s this page on the Brown University (US) website. You might also want to check out Gregor Wolbring (professor at the University of Calgary) who has written extensively on the concept of ableism (links to his work can be found at the end of this post). I have excerpted from an Aug. 30, 2011 post the portion where Gregor defines ‘ableism’,

From Gregor’s June 17, 2011 posting on the FedCan blog,

The term ableism evolved from the disabled people rights movements in the United States and Britain during the 1960s and 1970s.  It questions and highlights the prejudice and discrimination experienced by persons whose body structure and ability functioning were labelled as ‘impaired’ as sub species-typical. Ableism of this flavor is a set of beliefs, processes and practices, which favors species-typical normative body structure based abilities. It labels ‘sub-normative’ species-typical biological structures as ‘deficient’, as not able to perform as expected.

The disabled people rights discourse and disability studies scholars question the assumption of deficiency intrinsic to ‘below the norm’ labeled body abilities and the favoritism for normative species-typical body abilities. The discourse around deafness and Deaf Culture would be one example where many hearing people expect the ability to hear. This expectation leads them to see deafness as a deficiency to be treated through medical means. In contrast, many Deaf people see hearing as an irrelevant ability and do not perceive themselves as ill and in need of gaining the ability to hear. Within the disabled people rights framework ableism was set up as a term to be used like sexism and racism to highlight unjust and inequitable treatment.

Ableism is, however, much more pervasive.

You can find out more about Gregor and his work here: http://www.crds.org/research/faculty/Gregor_Wolbring2.shtml

Richard Jones and soft nanotechnology

One of the first posts on this blog was about Richard Jones’ nanotechnology book, ‘Soft Machines’. I have a ‘soft’ spot for the book which I found to be a good introduction to nanotechnology and well written too.

It’s nice to see the book getting some more attention all these years later, as James Lewis notes in his Aug. 31, 2014 posting on Nanodot (the Foresight Institute’s blog) that nanomanufacturing has not progressed as some of the early thinkers in this area had hoped,

Long-term readers of Nanodot will be familiar with the work of Richard Jones, a UK physicist and author of Soft Machines: Nanotechnology and Life, reviewed in Foresight Update Number 55 (2005) page 10. Basically Jones follows Eric Drexler’s lead in Engines of Creation in arguing that the molecular machinery found in nature provides an existence proof of an advanced nanotechnology of enormous capabilities. However, he cites the very different physics governing biomolecular machinery operating in an aqueous environment on the one hand, and macroscopic machine tools of steel and other hard metals, on the other hand. He then argues that rigid diamondoid structures doing atomically precise mechanochemistry, as later presented by Drexler in Nanosystems, although at least theoretically feasible, do not form a practical path to advanced nanotechnology. This stance occasioned several very useful and informative debates on the relative strengths and weaknesses of different approaches to advanced nanotechnology, both on his Soft Machines blog and here on Nanodot (for example “Debate with ‘Soft Machines’ continues”, “Which way(s) to advanced nanotechnology?”, “Recent commentary”). An illuminating interview of Richard Jones over at h+ Magazine not only presents Jones’s current views, but spotlights the lack of substantial effort since 2008 in trying to resolve these issues: “Going Soft on Nanotech”.

Lewis goes on to excerpt parts of the H+ interview which pertain to manufacturing and discusses the implications further. (Note: Eric Drexler not only popularized nanotechnology and introduced us to ‘grey goo’ with his book ‘Engines of Creation’, he also founded the Foresight Institute with his then-wife Christine Peterson. Drexler is no longer formally associated with Foresight.)

In the interests of avoiding duplication, I am focusing on the parts of the H+ interview concerning soft machines and synthetic biology and topics other than manufacturing. From the Nov. 23, 2013 article by Eddie Germino for H+ magazine,

H+: What are “soft machines”?

RJ: I called my book “Soft Machines” to emphasise that the machines of cell biology work on fundamentally different principles to the human-made machines of the macro-world.  Why “soft”?  As a physicist, one of my biggest intellectual influences was the French theoretical physicist Pierre-Gilles de Gennes (1932-2007, Nobel Prize for Physics 1991).  De Gennes popularised the term “soft matter” for those kinds of materials – polymers, colloids, liquid crystals etc – in which the energies with which molecules interact with each other are comparable with thermal energies, making them soft, mutable and responsive.  These are the characteristics of biological matter, so calling the machines of biology “soft machines” emphasises the different principles on which they operate.  Some people will also recognise the allusion to a William Burroughs novel (for whom a soft machine is a human being).

H+: What kind of work have you done with soft machines?

RJ: In my own lab we’ve been working on a number of “soft machine” related problems.  At the near-term end, we’ve been trying to understand what makes the molecules go where when you manufacture a solar cell from solutions of organic molecules – the idea here is that if you understand the self-assembly processes you can get a well-defined nanostructure that gives you a high conversion efficiency with a process you can use on a very large scale very cheaply. Further away from applications, we’ve been investigating a new mechanism for propelling micro- and nano-scale particles in water.  We use a spatially asymmetric chemical reaction so the particle creates a concentration gradient around itself, as a result of which osmotic pressure pushes it along.

H+: Putting aside MNT [molecular nanotechnology], what other design approaches would be most likely to yield advanced nanomachines?

RJ: If we are going to use the “soft machines” design paradigm to make functional nano machines, we have two choices.  We can co-opt what nature does, modifying biological systems to do what we want.  In essence, this is what is underlying the current enthusiasm for synthetic biology.  Or we can make synthetic molecules and systems that copy the principles that biology uses, possibly thereby widening the range of environments in which it will work.  Top-down methods are still enormously powerful, but they will have limits.

H+: So “synthetic biology” involves the creation of a custom-made microorganism built with the necessary organic parts and DNA to perform a desired function. Even if it is manmade, it only uses recognizable, biological parts in its construction, albeit arranged in ways that don’t occur in nature. But the second approach involving “synthetic molecules and systems that copy the principles that biology uses” is harder to understand. Can you give some clarifying examples?

RJ: If you wanted to make a molecular motor to work in water, you could use the techniques of molecular biology to isolate biological motors from cells, and this approach does work.  Alternatively, you could work out the principles by which the biological motor worked – these involve shape changes in the macromolecules coupled to chemical reactions – and try to make a synthetic molecule which would operate on similar principles.  This is more difficult than hacking out parts from a biological system, but will ultimately be more flexible and powerful.

H+: Why would it be more flexible and powerful?

RJ: The problem with biological macromolecules is that biology has evolved very effective mechanisms for detecting them and eating them.  So although DNA, for example, is a marvellous material for building nanostructures and devices from, it’s going to be difficult to use these directly in medicine simply because our cells are very good at detecting and destroying foreign DNA.  So using synthetic molecules should lead to more robust systems that can be used in a wider range of environments.

H+: In spite of your admiration for nanoscale soft machines, you’ve said that manmade technology has a major advantage because it can make use of electricity in ways living organisms can’t. Will soft machines use electricity in the future somehow?

RJ: Biology uses electrical phenomena quite a lot – e.g. in our nervous system – but generally this relies on ion transport rather than coherent electron transport.  Photosynthesis is an exception, as may be certain electron transporting structures recently discovered in some bacteria.  There’s no reason in principle that the principles of self-assembly shouldn’t be used to connect up electronic circuits in which the individual elements are single conducting or semi-conducting molecules.  This idea – “molecular electronics” – is quite old now, but it’s probably fair to say that as a field it hasn’t progressed as fast as people had hoped.

Jones also discusses the term nanotechnology and takes a foray into transhumanism and the singularity (from the Germino article),

H+: What do you think of the label “nanotechnology”? Is it a valid field? What do people most commonly misunderstand about it? 

RJ: Nanotechnology, as the term is used in academia and industry, isn’t really a field in the sense that supramolecular chemistry or surface physics are fields.  It’s more of a socio-political project, which aims to do to physical scientists what the biotech industry did to life scientists – that is, to make them switch their focus from understanding nature to intervening in nature by making gizmos and gadgets, and then to try and make money from that.

What I’ve found, doing quite a lot of work in public engagement around nanotechnology, is that most people don’t have enough awareness of nanotechnology to misunderstand it at all.  Among those who do know something about it, I think the commonest misunderstanding is the belief that it will progress much more rapidly than is actually possible.  It’s a physical technology, not a digital one, so it won’t proceed at the pace we see in digital technologies.  As all laboratory-based nanotechnologists know, the physical world is more cussed than the digital one, and the smaller it gets the more cussed it seems to be…


H+: Your thoughts on picotechnology and femtotechnology?

RJ: There’s a roughly inverse relationship between the energy scales needed to manipulate matter and the distance scale at which that manipulation takes place. Manipulating matter at the picometer scale is essentially a matter of controlling electron energy levels in atoms, which involves electron volt energies.  This is something we’ve got quite good at when we make lasers, for example.  Things are more difficult when we go smaller.  To manipulate matter at the nuclear level – i.e. on femtometer length scales – needs MeV energies, while to manipulate matter at the level of the constituents of hadrons – quarks and gluons – we need GeV energies.  At the moment our technology for manipulating objects at these energy scales is essentially restricted to hurling things at them, which is the business of particle accelerators.  So at the moment we really have no idea how to do femtotechnology of any kind of complexity, nor do we have any idea whether there is anything interesting we could do with it if we could.  I suppose the question is whether there is any scope for complexity within nuclear matter.  Perhaps if we were the sorts of beings that lived inside a neutron star or a quark-gluon plasma we’d know.

H+: What do you think of the transhumanist and Singularity movements?

RJ: These are terms that aren’t always used with clearly understood meanings, by me at least.  If by Transhumanism, we are referring to the systematic use of technology to better the lot of humanity, then I’m all in favour.  After all, the modern Western scientific project began with Francis Bacon, who said its purpose was “an improvement in man’s estate and an enlargement of his power over nature”.  And if the essence of Singularitarianism is to say that there’s something radically unknowable about the future, then I’m strongly in agreement.  On the other hand, if we consider Transhumanism and Singularitarianism as part of a belief package promising transcendence through technology, with a belief in a forthcoming era of material abundance, superhuman wisdom and everlasting life, then it’s interesting as a cultural phenomenon.  In this sense it has deep roots in the eschatologies of the apocalyptic traditions of Christianity and Judaism.  These were secularised by Marx and Trotsky, and technologised through, on the one hand, Fyodorov, Tsiolkovsky and the early Russian ideologues of space exploration, and on the other by the British Marxist scientists J.B.S. Haldane and Desmond Bernal.  Of course, the fact that a set of beliefs has a colourful past doesn’t mean they are necessarily wrong, but we should be aware that the deep tendency of humans to predict that their wishes will imminently be fulfilled is a powerful cognitive bias.
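The inverse relationship between length scale and energy that Jones describes for nuclear and sub-nuclear matter can be illustrated with a quick back-of-envelope calculation: a probe of wavelength d carries energy E ≈ ħc/d. (This is one crude estimate among several relevant scales; the eV figure Jones gives for atoms reflects electron binding energies, a different calculation.)

```python
# Back-of-envelope probe energies: E ~ hbar*c / d.
# hbar*c ≈ 197.327 eV·nm (equivalently 197.327 MeV·fm).
HBAR_C_EV_NM = 197.327  # eV * nm

def probe_energy_ev(d_nm):
    """Characteristic energy (eV) of a probe with wavelength d (in nm)."""
    return HBAR_C_EV_NM / d_nm

for label, d_nm in [("nanometre (molecules)", 1.0),
                    ("picometre", 1e-3),
                    ("femtometre (nuclei)", 1e-6)]:
    print(f"{label}: ~{probe_energy_ev(d_nm):.3g} eV")
```

At a femtometre this gives roughly 200 MeV, consistent with the MeV-to-GeV energies Jones cites for nuclear and quark-gluon physics, and it shows why each step down in scale demands a vastly more energetic (and less delicate) tool.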

Richard goes into more depth about his views on transhumanism and the singularity in an Aug. 24, 2014 posting on his Soft Machines blog,

Transhumanism has never been modern

Transhumanists are surely futurists, if they are nothing else. Excited by the latest developments in nanotechnology, robotics and computer science, they fearlessly look ahead, projecting consequences from technology that are more transformative, more far-reaching, than the pedestrian imaginations of the mainstream. And yet, their ideas, their motivations, do not come from nowhere. They have deep roots, perhaps surprising roots, and following those intellectual trails can give us some important insights into the nature of transhumanism now. From antecedents in the views of the early 20th century British scientific left-wing, and in the early Russian ideologues of space exploration, we’re led back, not to rationalism, but to a particular strand of religious apocalyptic thinking that’s been a persistent feature of Western thought since the middle ages.

The essay that follows is quite dense (many of the thinkers he cites are new to me) so if you’re a beginner in this area, you may want to set some time aside to read this in depth. Also, you will likely want to read the comments which follow the post.

Competition, collaboration, and a smaller budget: the US nano community responds

Before getting to the competition, collaboration, and budget mentioned in the title of this posting, I’m supplying some background information.

Within the context of a May 20, 2014 ‘National Nanotechnology Initiative’ hearing before the U.S. House of Representatives Subcommittee on Research and Technology, Committee on Science, Space, and Technology, the US Government Accountability Office (GAO) presented a 22 pp. précis (PDF; titled: NANOMANUFACTURING AND U.S. COMPETITIVENESS; Challenges and Opportunities) of its 125 pp. report (PDF version titled: Nanomanufacturing: Emergence and Implications for U.S. Competitiveness, the Environment, and Human Health).

Having already commented on the full report itself in a Feb. 10, 2014 posting, I’m pointing you to Dexter Johnson’s May 21, 2014 post on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) where he discusses the précis from the perspective of someone who was consulted by the US GAO when they were writing the full report (Note: Links have been removed),

I was interviewed extensively by two GAO economists for the accompanying [full] report “Nanomanufacturing: Emergence and Implications for U.S. Competitiveness, the Environment, and Human Health,” where I shared background information on research I helped compile and write on global government funding of nanotechnology.

While I acknowledge that the experts who were consulted for this report are more likely the source for its views than I am, I was pleased to see the report reflect many of my own opinions. Most notable among these is bridging the funding gap in the middle stages of the manufacturing-innovation process, which is placed at the top of the report’s list of challenges.

While I am in agreement with much of the report’s findings, it suffers from a fundamental misconception in seeing nanotechnology’s development as a kind of race between countries. [emphases mine]

(I encourage you to read the full text of Dexter’s comments as he offers more than a simple comment about competition.)

Carrying on from this notion of a ‘nanotechnology race’, at least one publication focused on that aspect. From the May 20, 2014 article by Ryan Abbott for CourthouseNews.com,

Nanotech Could Keep U.S. Ahead of China

WASHINGTON (CN) – Four of the nation’s leading nanotechnology scientists told a U.S. House of Representatives panel Tuesday that a little tweaking could go a long way in keeping the United States ahead of China and others in the industry.

The hearing focused on the status of the National Nanotechnology Initiative, a federal program launched in 2001 for the advancement of nanotechnology.

As I noted earlier, the hearing was focused on the National Nanotechnology Initiative (NNI) and all of its efforts. It’s quite intriguing to see what gets emphasized in media reports and, in this case, the dearth of media reports.

I have one more tidbit, the testimony from Lloyd Whitman, Interim Director of the National Nanotechnology Coordination Office and Deputy Director of the Center for Nanoscale Science and Technology, National Institute of Standards and Technology. The testimony is in a May 21, 2014 news item on insurancenewsnet.com,

Testimony by Lloyd Whitman, Interim Director of the National Nanotechnology Coordination Office and Deputy Director of the Center for Nanoscale Science and Technology, National Institute of Standards and Technology

Chairman Bucshon, Ranking Member Lipinski, and Members of the Committee, it is my distinct privilege to be here with you today to discuss nanotechnology and the role of the National Nanotechnology Initiative in promoting its development for the benefit of the United States.

Highlights of the National Nanotechnology Initiative

Our current Federal research and development program in nanotechnology is strong. The NNI agencies continue to further the NNI’s goals of (1) advancing nanotechnology R&D, (2) fostering nanotechnology commercialization, (3) developing and maintaining the U.S. workforce and infrastructure, and (4) supporting the responsible and safe development of nanotechnology. …


The sustained, strategic Federal investment in nanotechnology R&D combined with strong private sector investments in the commercialization of nanotechnology-enabled products has made the United States the global leader in nanotechnology. The most recent (2012) NNAP report analyzed a wide variety of sources and metrics and concluded that “… in large part as a result of the NNI the United States is today… the global leader in this exciting and economically promising field of research and technological development.” A recent report on nanomanufacturing by Congress’s own Government Accountability Office (GAO) arrived at a similar conclusion, again drawing on a wide variety of sources and stakeholder inputs. As discussed in the GAO report, nanomanufacturing and commercialization are key to capturing the value of Federal R&D investments for the benefit of the U.S. economy. The United States leads the world by one important measure of commercial activity in nanotechnology: According to one estimate, U.S. companies invested $4.1 billion in nanotechnology R&D in 2012, far more than investments by companies in any other country.  …

There’s cognitive dissonance at work here as Dexter notes in his own way,

… somewhat ironically, the [GAO] report suggests that one of the ways forward is more international cooperation, at least in the development of international standards. And in fact, one of the report’s key sources of information, Mihail Roco, has made it clear that international cooperation in nanotechnology research is the way forward.

It seems to me that much of the testimony and at least some of the anxiety about being left behind can be traced to a decreased 2015 budget allotment for nanotechnology (mentioned here in a March 31, 2014 posting [US National Nanotechnology Initiative’s 2015 budget request shows a decrease of $200M]).

One can also infer a certain anxiety from a recent presentation by Barbara Herr Harthorn, head of UCSB’s [University of California at Santa Barbara] Center for Nanotechnology in Society (CNS). She was at a February 2014 meeting of the Presidential Commission for the Study of Bioethical Issues (mentioned in parts one and two [the more substantive description of the meeting, which also features a Canadian academic from the genomics community] of my recent series on “Brains, prostheses, nanotechnology, and human enhancement”). I noted in part five of the series what seems to be a shift towards brain research as a likely beneficiary of the public engagement work accomplished under NNI auspices and, in the case of the Canadian academic, the genomics effort.

The Americans are not the only ones feeling competitive as this tweet from Richard Jones, Pro-Vice Chancellor for Research and Innovation at Sheffield University (UK), physicist, and author of Soft Machines, suggests,

May 18

The UK has fewer than 1% of world patents on graphene, despite it being discovered here, according to the FT –

I recall reading a report a few years back which noted that experts in China were concerned about falling behind internationally in their research efforts. These anxieties are not new; C.P. Snow’s book and lecture The Two Cultures (1959) also referenced concerns in the UK about scientific progress and being left behind.

Competition/collaboration is an age-old conundrum, about as ancient as anxieties over being left behind. The question now is how we are all going to resolve these issues this time.

ETA May 28, 2014: The American Institute of Physics (AIP) has produced a summary of the May 20, 2014 hearing as part of their FYI: The AIP Bulletin of Science Policy News, May 27, 2014 (no. 93).

ETA Sept. 12, 2014: My first posting about the diminished budget allocation for the US NNI was this March 31, 2014 posting.

“I write in praise of air,” a catalytic poem absorbing air pollutants on a nanotechnology-enabled billboard

The poem ‘In Praise of Air’, which is on a billboard at the University of Sheffield (UK), is quite literally catalytic. From a May 15, 2014 news item on Nanowerk,

Simon [Armitage], Professor of Poetry at the University, and Pro-Vice-Chancellor for Science Professor Tony Ryan, have collaborated to create a catalytic poem called In Praise of Air – printed on material containing a formula invented at the University which is capable of purifying its surroundings.

Here’s what the billboard looks like,

Courtesy of the University of Sheffield

A May 14, 2014 University of Sheffield news release, which originated the news item, has more details about the project from the scientist’s perspective,

This cheap technology could also be applied to billboards and advertisements alongside congested roads to cut pollution.

Professor Ryan, who came up with the idea of using treated materials to cleanse the air, said: “This is a fun collaboration between science and the arts to highlight a very serious issue of poor air quality in our towns and cities.

“The science behind this is an additive which delivers a real environmental benefit that could actually help cut disease and save lives.

“This poem alone will eradicate the nitrogen oxide pollution created by about 20 cars every day.”

He added: “If every banner, flag or advertising poster in the country did this, we’d have much better air quality. It would add less than £100 to the cost of a poster and would turn advertisements into catalysts in more ways than one. The countless thousands of poster sites that are selling us cars beside our roads could be cleaning up emissions at the same time.”

The 10m x 20m piece of material which the poem is printed on is coated with microscopic pollution-eating particles of titanium dioxide which use sunlight and oxygen to react with nitrogen oxide pollutants and purify the air.

Professor Ryan has been campaigning for some time to have his ingredient added to washing detergent in the UK as part of his Catalytic Clothing project. If manufacturers added it, the UK would meet one of its air quality targets in one step.

The news release also describes the arts component and poet’s perspective on this project,

The poem will be on display on the side of the University’s Alfred Denny Building, Western Bank, for one year and its unveiling also marks the launch of this year’s Sheffield Lyric Festival which takes place between 14-17 May 2014 at the University’s Firth Hall.

At a special celebratory event on Thursday (May 15 2014), Simon will read In Praise of Air for the first time in public and Professor Ryan will explain the technology behind the catalytic poem. Volunteers will be wearing catalytic T-shirts.

Dr Joanna Gavins, from the University’s School of English, project manager for the catalytic poem collaboration, who also leads the Lyric Festival, said: “This highlights the innovation and creativity at the heart of the University and its research excellence.

“We are delighted that such a significant event will help launch this year’s Lyric Festival which also features poetry readings by students of the MA in Creative Writing, alongside internationally renowned writers such as Sinead Morrissey and Benjamin Zephaniah, and music from celebrated Sheffield songwriter, Nat Johnson.”

Simon added: “There’s a legacy of poems in public places in Sheffield and, on behalf of the University, I wanted to be part of that dialogue to show what we could do.

“I wanted to write a poem that was approachable, that might catch the attention of the passer-by and the wandering mind, and one that had some local relevance too. But I also hope it’s robust and intricate enough to sustain deeper enquiries – the School of English looks towards it for one thing, and I’d like to think it’s capable of getting the thumbs up or at least a nod from their direction, and from the big-brained students walking up and down Western Bank, and from discerning residents in the neighbourhood.”

He added: “I’ve enjoyed working with the scientists and the science, trying to weave the message into the words, wanting to collaborate both conceptually and with the physical manifestation of the work.

“Poetry often comes out with the intimate and the personal, so it’s strange to think of a piece in such an exposed place, written so large and so bold. I hope the spelling is right!”

For the curious, here’s a link to the In Praise of Air project website where you’ll find the poem and much more,

I write in praise of air.  I was six or five
when a conjurer opened my knotted fist
and I held in my palm the whole of the sky.
I’ve carried it with me ever since.

Let air be a major god, its being
and touch, its breast-milk always tilted
to the lips.  Both dragonfly and Boeing
dangle in its see-through nothingness…

Among the jumbled bric-a-brac I keep
a padlocked treasure-chest of empty space,
and on days when thoughts are fuddled with smog
or civilization crosses the street

with a white handkerchief over its mouth
and cars blow kisses to our lips from theirs
I turn the key, throw back the lid, breathe deep.
My first word, everyone’s  first word, was air.

I like this poem a lot and find it quite inspirational for one of my own projects.

Getting back to Tony Ryan, he and his Catalytic Clothing project have been mentioned here in a Feb. 24, 2012 posting (Catalytic Clothing debuts its kilts at Edinburgh International Science Festival) and in a July 8, 2011 posting featuring a collaboration between Ryan and Professor Helen Storey at the London College of Fashion (Nanotechnology-enabled Catalytic Clothes look good and clean the air). The 2012 posting has an image of two kilted gentlemen and the 2011 posting has a video highlighting one of the dresses, some music from Radiohead, and the ideas behind the project.

You can find out more about Catalytic Clothing and the Lyric Festival (from the news release),

Catalytic Clothing

To find out more about the catalytic clothing project visit http://www.catalytic-clothing.org

Lyric Festival

The Lyric Festival is the [University of Sheffield] Faculty of Arts and Humanities’ annual celebration of the written and spoken word. Each May the festival brings some of the UK’s most renowned and respected writers, broadcasters, academics, and performers to the University, as well as showcasing the talent of Faculty students and staff. For more information visit http://www.sheffield.ac.uk/lyric

One last note about the University of Sheffield: it’s the academic home for Professor Richard Jones who wrote one of my favourite books about nanotechnology, Soft Machines (featured in one of my earliest pieces here, a May 6, 2008 posting). He is the Pro-Vice-Chancellor – Research & Innovation at the university and a blogger on his Soft Machines blog where he writes about innovation and research in the UK and where you’ll also find a link to purchase his book.

ETA May 20, 2014: A May 19, 2014 article by JW Dowey for Earth Times offers more details about the technology,

Titanium dioxide coatings on cars and aircraft have revolutionised protective nanotechnology. The University of Sheffield has set the target of absorbing the poisonous compounds from vehicle exhausts. Tony Ryan is the professor of physical chemistry in charge of adapting self-cleaning window technology to pollution solutions. The 10m x 20m poster they now use on the Alfred Denny university building demonstrates how the nitrogen oxides from 20 cars per day could be absorbed efficiently at the roadside.

There are more tidbits to be had in the article including the extra cost (£100) of adding the protective coating to the ‘poetic’ billboard (or hoarding as they say in the UK).

Gary Goodyear rouses passions: more on Canada’s National Research Council and its new commitment to business

The office of Gary Goodyear, Minister of State (Science and Technology), in attempting to set the record straight, has inadvertently roused even more passion in Phil Plait’s (Slate.com blogger) bosom and inspired me to examine more commentary about the situation regarding the NRC and its ‘new’ commitment to business.

Phil Plait, in a May 22, 2013 followup to one of his recent postings (I have the details about Plait’s and other commentaries in my May 13, 2013 posting about the NRC’s recent declarations), responds to an email from Michele-Jamali Paquette, the director of communication for Goodyear (Note: A link has been removed),

I read the transcripts, and assuming they are accurate, let me be very clear: Yes, the literal word-for-word quotation I used was incorrect, and one point I made was technically and superficially in error. But the overall point—that this is a terrible move by the NRC and the conservative Canadian government, short-changing real science—still stands. And, in my opinion, Goodyear’s office is simply trying to spin what has become a PR problem.

I’ll note that in her email to me, Paquette quoted my own statement:

John MacDougal [sic], President of the NRC, literally said, “Scientific discovery is not valuable unless it has commercial value”

Paquette took exception to my use of the word “literally,” emphasizing it in her email. (The link, in both her email and my original post, goes to the Toronto Sun story with the garbled quotation.) Apparently MacDougal did not literally say that. But the objection strikes me as political spin since the meaning of what MacDougal said at the press conference is just as I said it was in my original post.

As I pointed out in my first post: Science can and should be done for its own sake. It pays off in the end, but that’s not why we do it. To wit …

Paquette’s choice of which issues to dispute (the 2nd issue was Plait’s original description of the NRC as a funding agency) seems odd and picayune as the objections don’t have an impact on Plait’s main argument,

Unfortunately, despite these errors, the overall meaning remains the same: The NRC is moving away from basic science to support business better, and the statements by both Goodyear and MacDougal [sic] are cause for concern.

Plait goes on to restate his argument and provide a roundup of commentaries. It’s well worth reading for the roundup alone. (One picayune comment from me: I wish Plait would notice that the name of the head of Canada’s National Research Council is spelled this way, John McDougall.)

Happily, Nassif Ghoussoub has also chimed in with a May 22, 2013 posting (on his Piece of Mind blog) regarding the online discussion (Note: Links have been removed),

The Canadian twitter world has been split in the last couple of days. … But then, you have the story of the Tories’ problem with science, be it defunding, muzzling, disbelieving, doubting, preventing, delegitimizing etc. The latter must have restarted with the incredible announcement about the National Research Council (NRC), presented as “Canada sells out science” in Slate, and as “Failure doesn’t come cheap” in Maclean’s. What went unnoticed was the fact that the restructuring turned out to be totally orthogonal to the recommendations of the Jenkins report about the NRC. Then came the latest Science, Technology and Innovation Council (STIC) report, which showed that Canada’s expenditure on research and development has fallen from 16th out of 41 comparable countries in the year Stephen Harper became prime minister, to 23rd in 2011. Paul Wells seems to be racking up hits on his Maclean’s article,  “Stephen Harper and the knowledge economy: perfect strangers.”  But the story of the last 48 hours has been John Dupuis’s chronology of what he calls, “The Canadian war on science” and much more.

Yes, it’s another roundup but it’s complementary (albeit with one or two repetitions) since Plait does not seem all that familiar with the Canadian scene (I find it’s always valuable to have an outside perspective) and Nassif is a longtime insider.

John Dupuis’ May 20, 2013 posting (on his Confessions of a Science Librarian blog), mentioned by both Nassif and Plait, provides an extraordinary listing of stories ranging from 2006 through to 2013 whose headlines alone paint a very bleak picture of the practice of science in Canada,

As is occasionally my habit, I have pulled together a chronology of sorts. It is a chronology of all the various cuts, insults, muzzlings and cancellations that I’ve been able to dig up. Each of them represents a single shot in the Canadian Conservative war on science. It should be noted that not every item in this chronology, if taken in isolation, is necessarily the end of the world. It’s the accumulated evidence that is so damning.

As I’ve noted before, I am no friend of Stephen Harper and his Conservative government and many of their actions have been reprehensible and, at times, seem childishly spiteful but they do occasionally get something right. There was a serious infrastructure problem in Canada. Buildings dedicated to the pursuit of science were sadly aged and no longer appropriate for the use to which they were being put. Harper and his government have poured money into rebuilding infrastructure and for that they should be acknowledged.

As for what the Conservatives are attempting with this shift in direction for the National Research Council (NRC), which has been ongoing for at least two years as I noted in my May 13, 2013 posting, I believe they are attempting to rebalance the Canadian research enterprise. It’s generally agreed that Canada has historically had very poor levels of industrial research and development (R&D), and high levels of industrial R&D are considered, internationally, to be key to a successful economy. (Richard Jones, Pro-Vice-Chancellor for Research and Innovation at the University of Sheffield, UK, discusses how a falling percentage of industrial R&D, taking place over decades, is affecting the UK economy in a May 10, 2013 commentary on the University of Sheffield SPERI [Sheffield Political Economy Research Institute] website.)

This NRC redirection when taken in conjunction with the recent StartUp visa programme (my May 20, 2013 posting discusses Minister of Immigration Jason Kenney’s recent recruitment tour in San Francisco [Silicon Valley]),  is designed to take Canada and Canadians into uncharted territory—the much desired place where we develop a viable industrial R&D sector and an innovative economy in action.

Having reviewed at least some of the commentary, I find a couple of questions left unasked about this international obsession with industrial R&D,

  • is a country’s economic health truly tied to industrial R&D or is this ‘received’ wisdom?
  • if industrial R&D is the key to economic health, what would be the best balance between it and the practice of basic science?

As for the Canadian situation, what might be some of the unintended consequences? It occurs to me that if scientists are rewarded for turning their research into commercially viable products they might be inclined to constrain access to materials. That’s understandable if the enterprise is purely private, but the NRC redirection is aimed at bringing together academics and private enterprise in a scheme that seems a weird amalgam of both.

For example, cellulose nanocrystals (CNC) are not easily accessed if you’re a run-of-the-mill entrepreneur. I’ve had more than one back-channel request about how to purchase the material, and it would seem that access is tightly controlled by the academics and the publicly funded enterprise (in this case, a private business) who produce the material. (I’m speaking of the FPInnovations and Domtar commingling in CelluForce, a CNC production facility and much more. It would make a fascinating case study on how public monies are used to help finance private enterprises and their R&D efforts; the relationship between nongovernmental agencies (FPInnovations, which I believe was an NRC spinoff), various federal public funding agencies, and Domtar, a private enterprise; and the power dynamics between all the players, including the lowly entrepreneur.)

Precautionary principle and the new Swiss synthetic nanomaterial matrix

The precautionary principle appears to be much loved by civil society groups such as the ETC Group and Friends of the Earth. They tend to cite it with some frequency as a means of managing scientific research or, as some might suggest, as a means of stopping research. I have to admit I’ve tended to view the precautionary principle as a way of saying ‘don’t do anything unless you can prove it’s safe’, and that is a gross misunderstanding of the principle. The recent announcement from Switzerland about developing a precautionary matrix for synthetic nanomaterials had me revisiting my ideas.

I found this description of the principle in a July 18, 2010 posting about nanosunscreens by Andrew Maynard on his 2020 Science blog,

The Precautionary Principle is one approach – and a very misunderstood and misused one – to addressing this [risk and uncertainty], and one brought up by FoE and others in the context of sunscreens.  It has many formulations – it’s not a hard and fast principle.  But it is currently described in the European Union in this way:

The precautionary principle should be informed by three specific principles:

  • implementation of the principle should be based on the fullest possible scientific evaluation. As far as possible this evaluation should determine the degree of scientific uncertainty at each stage;
  • any decision to act or not to act pursuant to the precautionary principle must be preceded by a risk evaluation and an evaluation of the potential consequences of inaction;
  • once the results of the scientific evaluation and/or the risk evaluation are available, all the interested parties must be given the opportunity to study the various options available, while ensuring the greatest possible transparency.

This is a pragmatic principle that looks to using evidence and an evaluation of consequences in making informed decisions in the face of uncertainty.  It certainly does not preclude the development or implementation of a new technology until there is certainty on safety.

The emphasis on the potential consequences of inaction are particularly relevant to today’s world, where we are stuck on a technological tight-rope, and where the consequences of not doing something may be more harmful than taking action. [emphasis mine]  Richard Jones [author of Soft Machines and a Professor of Physics and the Pro-Vice-Chancellor for Research and Innovation at the University of Sheffield] picked up on this in his suggestion for a more relevant application of the Precautionary Principle to emerging technologies:

  1. what are the benefits that the new technology provides – what are the risks and uncertainties associated with not realising these benefits?
  2. what are the risks and uncertainties attached to any current ways we have of realising these benefits using existing technologies?
  3. what are the risks and uncertainties of the new technology?

This seems a useful place to start from when faced with the reality of having to make the best possible decisions in the face of uncertainty, and where inaction isn’t an option.

But to make decisions – even when there are gaping holes in the data – you need something to go on.

The new Swiss matrix helps to further flesh out the precautionary principle (from the July 29, 2011 news item on Nanowerk),

The precautionary matrix provides a structured method to assess the “nanospecific precautionary need” of workers, consumers and the environment arising from the production and use of synthetic nanomaterials.

The matrix is a tool to help trade and industry meet their obligations of care and self-monitoring. It helps them to recognise applications which may entail risk and to take precautionary measures to protect human health and the environment. In the case of new developments, the matrix can contribute to the development of safer products. It enables users to conduct an initial analysis on the basis of currently available knowledge and indicates when further investigations are necessary.

The matrix can be found on this page of the Swiss Federal Office of Public Health (scroll to the right of the page for the guidelines, matrix, FAQs, etc.).

One question keeps popping up. The phrase ‘consequences of inaction’ has me asking how do we define inaction? My suspicion is that a research nanoscientist and a representative from a civil society organization may have two very different answers to that question, i.e., ‘we must continue with the research to solve the problem’ as opposed to ‘we must stop the actions that caused the problem in the first place’.

Scientific research, failure, and the scanning tunneling microscope

“99% of all you do is failure and that’s maybe the most difficult part of basic research,” said Gerd Binnig in a snippet I’ve culled from an interview with Dexter Johnson (Nanoclast blog on the IEEE [Institute of Electrical and Electronics Engineers] website), posted May 23, 2011, in which Binnig discussed why he continued with a project that had failed time and time again. (The snippet is from the 2nd audio file from the top of the posting.)

Binnig, along with Heinrich Rohrer, is a Nobel Laureate. Both men won their award for work on the scanning tunneling microscope (STM), the project that had failed countless times and then went on to play an important part in the nanotechnology narrative. Earlier this month, both men were honoured when IBM and ETH Zurich opened the Binnig and Rohrer Nanotechnology Center in Zurich. From the May 17, 2011 news item on Nanowerk,

IBM and ETH Zurich, a premiere European science and engineering university, hosted more than 600 guests from industry, academia and government, to open the Binnig and Rohrer Nanotechnology Center located on the campus of IBM Research – Zurich. The facility is the centerpiece of a 10-year strategic partnership in nanoscience between IBM and ETH Zurich where scientists will research novel nanoscale structures and devices to advance energy and information technologies.

The new Center is named for Gerd Binnig and Heinrich Rohrer, the two IBM scientists and Nobel Laureates who invented the scanning tunneling microscope at the Zurich Research Lab in 1981, thus enabling researchers to see atoms on a surface for the first time. The two scientists attended today’s opening ceremony, at which the new lab was unveiled to the public.

Here’s an excerpt from Dexter’s posting where he gives some context for the audio files,

As promised last week, I would like to share some audio recordings I made of Gerd Binnig and Heinrich Rohrer taking questions from the press during the opening of the new IBM and ETH Zurich nanotechnology laboratory named in their honor.

This first audio file features both Binnig’s and Rohrer’s response to my question of why they were interested in looking at inhomogenities on surfaces in the first place, which led them eventually to creating an instrument for doing it. A more complete history of the STM’s genesis can be found in their joint Nobel lecture here.

The sound quality isn’t the best but these snippets are definitely worth listening to if you find the process of scientific inquiry interesting.

For anyone who’s not familiar with the scanning tunneling microscope, I found this description in the book Soft Machines: Nanotechnology and Life by Richard Jones.

Scanning probe microscopes rely on an entirely different principle to both light microscopes and electron microscopes, or indeed our own eyes. Rather than detecting waves that have been scattered from the object we are looking at, one feels the surface of that object with a physical probe. This probe is moved across the surface with high precision. As it tracks the contours of the surface, it is moved up or down in a way that is controlled by some interaction between the tip of the probe and the surface. This interaction could be the flow of electrical current, in the case of a scanning tunneling microscope, or simply the force between the tip and the surface in the case of an atomic force microscope. pp. 17-18
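For readers who like to see the principle spelled out, here’s a minimal, purely illustrative Python sketch of the constant-current feedback idea Jones describes: the tunnelling current in a model decays exponentially with the tip–surface gap, and a simple feedback loop raises or lowers the tip until the current matches a setpoint, so the recorded tip heights trace the surface contours. All function names and numbers here are my own inventions for illustration, not anything from Jones’s book.

```python
import math

def tunneling_current(gap, k=2.0, i0=1.0):
    # Toy model: tunnelling current decays exponentially with the gap.
    return i0 * math.exp(-k * gap)

def scan_surface(surface_heights, setpoint=0.5, gain=0.5, steps=200):
    """Constant-current scan: at each lateral position, nudge the tip
    up or down until the model current equals the setpoint; the tip
    heights then mirror the surface profile."""
    tip_z = 1.0  # initial tip height above a reference plane
    trace = []
    for h in surface_heights:
        for _ in range(steps):
            gap = tip_z - h
            error = tunneling_current(gap) - setpoint
            tip_z += gain * error  # too much current -> retract the tip
        trace.append(tip_z)
    return trace

# A flat surface with a single atomic-scale bump in the middle:
surface = [0.0, 0.0, 0.2, 0.4, 0.2, 0.0, 0.0]
profile = scan_surface(surface)
```

At equilibrium the gap is constant, so differences in the recorded `profile` reproduce differences in the surface heights — which is exactly why an STM image shows the contours of the surface.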

Nano Science Cafe workshop starts and other NISE Net tidbits

I signed up for an online workshop, held by the Nanoscale Informal Science Education Network (NISE Net), on how to host and produce a Nano Science Café. It started this Monday and so far we’ve been introducing ourselves (approximately 80 people are signed up) and sharing ideas about how to hold these events successfully. Most of the participants are located in the US, although there are two Canucks (me and someone from Ontario). Of course, not everyone has introduced themselves yet.

There’s a blog posting by Larry Bell about NISE Net’s increasing focus on nano’s societal implications,

Just about a year ago NISE Net launched an expanded collaboration with the Center for Nanotechnology in Society and you’ll hear more about upcoming activities in the months ahead. The conversation started when staff from seven science centers brought cart demos and stage presentations to the S.NET conference in Seattle on Labor Day weekend last year. S.NET is a new professional society for the study of nanoscience and emerging technologies in areas of the social sciences and humanities. I was a little naive and thought the participants were all social scientists, but learned that many were historians, political scientists, philosophers, and ethicists and really not social scientists.

I’m not entirely certain what to make of either NISE Net’s interest or S.NET (Society for the Study of Nanoscience and Emerging Technologies) since this first meeting seems to have been focused primarily on hands-on demos and public outreach initiatives. There will be a 2nd annual S.NET meeting in 2010 (from the conference info.),

Second Annual Conference of the Society for the Study of Nanoscience and Emerging Technologies

Darmstadt, Germany – Sept 29 to Oct 2, 2010

(Wednesday afternoon 2pm through Saturday afternoon 4pm)

The plenary speakers and program committee lists a few names I’ve come across,

This year’s plenary speakers are Armin Grunwald, Richard Jones [has written a book about nanotechnology titled Soft Machines and maintains a blog also titled Soft Machines], Andrew Light, Bernard Stiegler, and Jan Youtie.

Program Committee

Diana Bowman (Public Health and Law, University of Melbourne, Australia)

Julia Guivant (Sociology and Political Science, Santa Catarina, Brazil)

David Guston (Political Science/Center for Nanotechnology in Society, Arizona State University, USA) [guest blogged for Andrew Maynard at 2020 Science]

Barbara Herr Harthorn (Feminist Studies, Anthropology, Sociology/Center for Nanotechnology in Society,University of California Santa Barbara, USA)

Brice Laurent (Sociology, Mines ParisTech, France)

Colin Milburn (English, University of California Davis, USA)[has proposed a nanotechnology origins story which pre-dates Richard Feynman’s famous speech, There’s plenty of room at the bottom]

Cyrus Mody (History, Rice University, USA)

Alfred Nordmann (Philosophy, nanoOffice, NanoCenter, Technische Universität Darmstadt and University of South Carolina – chair)

Ingrid Ott (Economics, Karlsruhe Institute of Technology, Germany – co-chair)

Arie Rip (Philosophy of Science and Technology, University of Twente, Netherlands) [read a nano paper where he introduced me to blobology and this metaphor for nanotechnology ‘furniture of the world’]

Ursula Weisenfeld (Business Administration, Leuphana Universität, Lüneburg, Germany)

This looks promising and I wish them good luck with the conference.

As far as conferences go, there’s another one for the Association of Science and Technology Centers (ASTC) in Hawaii, Oct 3 – 5, 2010, which will feature some NISE Net sessions and workshops. You can check out the ASTC conference details here.

Here’s the monthly NISE Net nano haiku,

Kit kit kit kit kit kit kit
There are no nodes now.

by Anders Liljeholm of the Oregon Museum of Science and Industry. Those of you who may not remember that our regional hubs used to be called nodes (or those looking to brush up on their NISE Net vocabulary in general) can check out the NISE Net Glossary in the nisenet.org catalog.