
Richard Jones and soft nanotechnology

One of the first posts on this blog was about Richard Jones’ nanotechnology book, ‘Soft Machines’. I have a ‘soft’ spot for the book, which I found to be a well-written, accessible introduction to nanotechnology.

It’s nice to see the book getting more attention all these years later. James Lewis notes in his Aug. 31, 2014 posting on Nanodot (the Foresight Institute’s blog) that nanomanufacturing has not progressed as some of the field’s early thinkers had hoped,

Long-term readers of Nanodot will be familiar with the work of Richard Jones, a UK physicist and author of Soft Machines: Nanotechnology and Life, reviewed in Foresight Update Number 55 (2005) page 10. Basically Jones follows Eric Drexler’s lead in Engines of Creation in arguing that the molecular machinery found in nature provides an existence proof of an advanced nanotechnology of enormous capabilities. However, he cites the very different physics governing biomolecular machinery operating in an aqueous environment on the one hand, and macroscopic machine tools of steel and other hard metals, on the other hand. He then argues that rigid diamondoid structures doing atomically precise mechanochemistry, as later presented by Drexler in Nanosystems, although at least theoretically feasible, do not form a practical path to advanced nanotechnology. This stance occasioned several very useful and informative debates on the relative strengths and weaknesses of different approaches to advanced nanotechnology, both on his Soft Machines blog and here on Nanodot (for example “Debate with ‘Soft Machines’ continues”, “Which way(s) to advanced nanotechnology?”, “Recent commentary”). An illuminating interview of Richard Jones over at h+ Magazine not only presents Jones’s current views, but spotlights the lack of substantial effort since 2008 in trying to resolve these issues: “Going Soft on Nanotech.”

Lewis goes on to excerpt the parts of the H+ interview that pertain to manufacturing and discusses the implications further. (Note: Eric Drexler not only popularized nanotechnology and introduced us to ‘grey goo’ with his book ‘Engines of Creation’; he also founded the Foresight Institute with his then-wife Christine Peterson. Drexler is no longer formally associated with Foresight.)

In the interests of avoiding duplication, I am focusing on the parts of the H+ interview concerning soft machines, synthetic biology, and topics other than manufacturing. From the Nov. 23, 2013 article by Eddie Germino for H+ Magazine,

H+: What are “soft machines”?

RJ: I called my book “Soft Machines” to emphasise that the machines of cell biology work on fundamentally different principles to the human-made machines of the macro-world.  Why “soft”?  As a physicist, one of my biggest intellectual influences was the French theoretical physicist Pierre-Gilles de Gennes (1932-2007, Nobel Prize for Physics 1991).  De Gennes popularised the term “soft matter” for those kinds of materials – polymers, colloids, liquid crystals etc – in which the energies with which molecules interact with each other are comparable with thermal energies, making them soft, mutable and responsive.  These are the characteristics of biological matter, so calling the machines of biology “soft machines” emphasises the different principles on which they operate.  Some people will also recognise the allusion to a William Burroughs novel (for whom a soft machine is a human being).

H+: What kind of work have you done with soft machines?

RJ: In my own lab we’ve been working on a number of “soft machine” related problems.  At the near-term end, we’ve been trying to understand what makes the molecules go where when you manufacture a solar cell from solutions of organic molecules – the idea here is that if you understand the self-assembly processes you can get a well-defined nanostructure that gives you a high conversion efficiency with a process you can use on a very large scale very cheaply. Further away from applications, we’ve been investigating a new mechanism for propelling micro- and nano-scale particles in water.  We use a spatially asymmetric chemical reaction so the particle creates a concentration gradient around itself, as a result of which osmotic pressure pushes it along.

H+: Putting aside MNT [molecular nanotechnology], what other design approaches would be most likely to yield advanced nanomachines?

RJ: If we are going to use the “soft machines” design paradigm to make functional nano machines, we have two choices.  We can co-opt what nature does, modifying biological systems to do what we want.  In essence, this is what is underlying the current enthusiasm for synthetic biology.  Or we can make synthetic molecules and systems that copy the principles that biology uses, possibly thereby widening the range of environments in which it will work.  Top-down methods are still enormously powerful, but they will have limits.

H+: So “synthetic biology” involves the creation of a custom-made microorganism built with the necessary organic parts and DNA to perform a desired function. Even if it is manmade, it only uses recognizable, biological parts in its construction, albeit arranged in ways that don’t occur in nature. But the second approach involving “synthetic molecules and systems that copy the principles that biology uses” is harder to understand. Can you give some clarifying examples?

RJ: If you wanted to make a molecular motor to work in water, you could use the techniques of molecular biology to isolate biological motors from cells, and this approach does work.  Alternatively, you could work out the principles by which the biological motor worked – these involve shape changes in the macromolecules coupled to chemical reactions – and try to make a synthetic molecule which would operate on similar principles.  This is more difficult than hacking out parts from a biological system, but will ultimately be more flexible and powerful.

H+: Why would it be more flexible and powerful?

RJ: The problem with biological macromolecules is that biology has evolved very effective mechanisms for detecting them and eating them.  So although DNA, for example, is a marvellous material for building nanostructures and devices from, it’s going to be difficult to use these directly in medicine simply because our cells are very good at detecting and destroying foreign DNA.  So using synthetic molecules should lead to more robust systems that can be used in a wider range of environments.

H+: In spite of your admiration for nanoscale soft machines, you’ve said that manmade technology has a major advantage because it can make use of electricity in ways living organisms can’t. Will soft machines use electricity in the future somehow?

RJ: Biology uses electrical phenomena quite a lot – e.g. in our nervous system – but generally this relies on ion transport rather than coherent electron transport.  Photosynthesis is an exception, as may be certain electron transporting structures recently discovered in some bacteria.  There’s no reason in principle that the principles of self-assembly shouldn’t be used to connect up electronic circuits in which the individual elements are single conducting or semi-conducting molecules.  This idea – “molecular electronics” – is quite old now, but it’s probably fair to say that as a field it hasn’t progressed as fast as people had hoped.

Jones also discusses the term ‘nanotechnology’ and makes a foray into transhumanism and the singularity (from the Germino article),

H+: What do you think of the label “nanotechnology”? Is it a valid field? What do people most commonly misunderstand about it? 

RJ: Nanotechnology, as the term is used in academia and industry, isn’t really a field in the sense that supramolecular chemistry or surface physics are fields.  It’s more of a socio-political project, which aims to do to physical scientists what the biotech industry did to life scientists – that is, to make them switch their focus from understanding nature to intervening in nature by making gizmos and gadgets, and then to try and make money from that.

What I’ve found, doing quite a lot of work in public engagement around nanotechnology, is that most people don’t have enough awareness of nanotechnology to misunderstand it at all.  Among those who do know something about it, I think the commonest misunderstanding is the belief that it will progress much more rapidly than is actually possible.  It’s a physical technology, not a digital one, so it won’t proceed at the pace we see in digital technologies.  As all laboratory-based nanotechnologists know, the physical world is more cussed than the digital one, and the smaller it gets the more cussed it seems to be…

… 

H+: Your thoughts on picotechnology and femtotechnology?

RJ: There’s a roughly inverse relationship between the energy scales needed to manipulate matter and the distance scale at which that manipulation takes place. Manipulating matter at the picometer scale is essentially a matter of controlling electron energy levels in atoms, which involves electron volt energies.  This is something we’ve got quite good at when we make lasers, for example.  Things are more difficult when we go smaller.  To manipulate matter at the nuclear level – i.e. on femtometer length scales – needs MeV energies, while to manipulate matter at the level of the constituents of hadrons – quarks and gluons – we need GeV energies.  At the moment our technology for manipulating objects at these energy scales is essentially restricted to hurling things at them, which is the business of particle accelerators.  So at the moment we really have no idea how to do femtotechnology of any kind of complexity, nor do we have any idea whether there is anything interesting we could do with it if we could.  I suppose the question is whether there is any scope for complexity within nuclear matter.  Perhaps if we were the sorts of beings that lived inside a neutron star or a quark-gluon plasma we’d know.

H+: What do you think of the transhumanist and Singularity movements?

RJ: These are terms that aren’t always used with clearly understood meanings, by me at least.  If by Transhumanism, we are referring to the systematic use of technology to better the lot of humanity, then I’m all in favour.  After all, the modern Western scientific project began with Francis Bacon, who said its purpose was “an improvement in man’s estate and an enlargement of his power over nature”.  And if the essence of Singularitarianism is to say that there’s something radically unknowable about the future, then I’m strongly in agreement.  On the other hand, if we consider Transhumanism and Singularitarianism as part of a belief package promising transcendence through technology, with a belief in a forthcoming era of material abundance, superhuman wisdom and everlasting life, then it’s interesting as a cultural phenomenon.  In this sense it has deep roots in the eschatologies of the apocalyptic traditions of Christianity and Judaism.  These were secularised by Marx and Trotsky, and technologised through, on the one hand, Fyodorov, Tsiolkovsky and the early Russian ideologues of space exploration, and on the other by the British Marxist scientists J.B.S. Haldane and Desmond Bernal.  Of course, the fact that a set of beliefs has a colourful past doesn’t mean they are necessarily wrong, but we should be aware that the deep tendency of humans to predict that their wishes will imminently be fulfilled is a powerful cognitive bias.
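
As a rough illustration of the inverse relationship between length scale and energy scale that Jones describes, here is a back-of-the-envelope comparison (my own numbers from standard textbook estimates, not from the interview): compare the binding energy of an electron in an atom with the energy needed to confine anything to nuclear dimensions,

\[
E_{\text{atomic}} \sim \tfrac{1}{2}\,\alpha^{2} m_{e} c^{2} \approx 13.6~\text{eV}
\qquad \text{versus} \qquad
E_{\text{nuclear}} \sim \frac{\hbar c}{1~\text{fm}} \approx 197~\text{MeV},
\]

using the fine-structure constant \(\alpha \approx 1/137\) and the standard value \(\hbar c \approx 197~\text{MeV·fm}\). That gap of roughly seven orders of magnitude is why picometer-scale manipulation is laser territory while femtometer-scale manipulation is, for now, particle-accelerator territory, as Jones says.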

Richard goes into more depth about his views on transhumanism and the singularity in an Aug. 24, 2014 posting on his Soft Machines blog,

Transhumanism has never been modern

Transhumanists are surely futurists, if they are nothing else. Excited by the latest developments in nanotechnology, robotics and computer science, they fearlessly look ahead, projecting consequences from technology that are more transformative, more far-reaching, than the pedestrian imaginations of the mainstream. And yet, their ideas, their motivations, do not come from nowhere. They have deep roots, perhaps surprising roots, and following those intellectual trails can give us some important insights into the nature of transhumanism now. From antecedents in the views of the early 20th century British scientific left-wing, and in the early Russian ideologues of space exploration, we’re led back, not to rationalism, but to a particular strand of religious apocalyptic thinking that’s been a persistent feature of Western thought since the middle ages.

The essay that follows is quite dense (many of the thinkers he cites are new to me), so if you’re a beginner in this area, you may want to set some time aside to read it in depth. Also, you will likely want to read the comments that follow the post.