Tag Archives: There’s plenty of room at the bottom

The memristor as computing device

An Oct. 27, 2016 news item on Nanowerk both builds on the Richard Feynman legend/myth and announces some new work with memristors,

In 1959 renowned physicist Richard Feynman, in his talk “[There’s] Plenty of Room at the Bottom,” spoke of a future in which tiny machines could perform huge feats. Like many forward-looking concepts, his molecule and atom-sized world remained for years in the realm of science fiction.

And then, scientists and other creative thinkers began to realize Feynman’s nanotechnological visions.

In the spirit of Feynman’s insight, and in response to the challenges he issued as a way to inspire scientific and engineering creativity, electrical and computer engineers at UC Santa Barbara [University of California at Santa Barbara, UCSB] have developed a design for a functional nanoscale computing device. The concept involves a dense, three-dimensional circuit operating on an unconventional type of logic that could, theoretically, be packed into a block no bigger than 50 nanometers on any side.

A figure depicting the structure of stacked memristors with dimensions that could satisfy the Feynman Grand Challenge. Photo Credit: Courtesy Image

An Oct. 27, 2016 UCSB news release (also on EurekAlert) by Sonia Fernandez, which originated the news item, offers a basic explanation of the work (useful for anyone unfamiliar with memristors) along with more detail,

“Novel computing paradigms are needed to keep up with the demand for faster, smaller and more energy-efficient devices,” said Gina Adam, postdoctoral researcher at UCSB’s Department of Computer Science and lead author of the paper “Optimized stateful material implication logic for three dimensional data manipulation,” published in the journal Nano Research. “In a regular computer, data processing and memory storage are separated, which slows down computation. Processing data directly inside a three-dimensional memory structure would allow more data to be stored and processed much faster.”

While efforts to shrink computing devices have been ongoing for decades — in fact, Feynman’s challenges as he presented them in his 1959 talk have been met — scientists and engineers continue to carve out room at the bottom for even more advanced nanotechnology. A nanoscale 8-bit adder operating in 50-by-50-by-50 nanometer dimensions, put forth as part of the current Feynman Grand Prize challenge by the Foresight Institute, has not yet been achieved. However, the continuing development and fabrication of progressively smaller components is bringing this virus-sized computing device closer to reality, said Dmitri Strukov, a UCSB professor of computer science.
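For readers who want a concrete sense of what an ‘8-bit adder’ actually computes, here’s a minimal Python sketch of the same function as plain ripple-carry logic. (This only illustrates the behaviour; the Grand Prize entry would have to implement it in nanoscale hardware, and none of the names below come from the challenge itself.)

```python
# A full adder combines two input bits and a carry-in into a sum
# bit and a carry-out; chaining eight of them gives an 8-bit adder.

def full_adder(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def add8(x, y):
    """Add two 8-bit integers with a ripple-carry chain."""
    carry, result = 0, 0
    for i in range(8):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, carry  # sum modulo 256, plus the final carry-out

print(add8(200, 100))  # (44, 1): 300 wraps to 44 with a carry-out set
```

Eight such full adders, plus the wiring between them, are what would have to fit inside the 50-nanometer cube.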

“Our contribution is that we improved the specific features of that logic and designed it so it could be built in three dimensions,” he said.

Key to this development is the use of a logic system called material implication logic combined with memristors — circuit elements whose resistance depends on the magnitude and direction of the current that most recently flowed through them. Unlike the conventional logic and circuitry found in our present computers and other devices, in this form of computing, logic operation and information storage happen simultaneously and in the same place. This greatly reduces the need for the components and space typically used to perform logic operations and to move data back and forth between processing and memory. The result of a computation is immediately stored in a memory element, which prevents data loss in the event of a power outage — a critical function in autonomous systems such as robotics.
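To make ‘logic operation and information storage happen simultaneously’ a little less abstract, here’s a toy Python model of stateful IMPLY logic. In hardware, the result of p IMPLY q overwrites the state of the memristor holding q; here the states are modeled as plain booleans, and all the function names are mine, not the paper’s.

```python
# Toy model of stateful material implication (IMPLY) logic, the
# unconventional logic the UCSB work builds on. Each memristor
# stores one bit as its resistance state; booleans stand in here.

def imply(p, q):
    """IMPLY: returns (NOT p) OR q. In hardware the result
    overwrites the memristor holding q."""
    return (not p) or q

def false_op():
    """FALSE: reset a memristor to logical 0."""
    return False

def nand(p, q):
    """NAND from IMPLY + FALSE, showing functional completeness:
    NAND(p, q) = p IMPLY (q IMPLY 0)."""
    t = imply(q, false_op())   # t = NOT q
    return imply(p, t)         # (NOT p) OR (NOT q)

for p in (False, True):
    for q in (False, True):
        print(p, q, nand(p, q))
```

Because IMPLY plus FALSE can build NAND, and NAND alone can build any logic circuit, a memory array that natively performs IMPLY can in principle compute anything — which is why processing can happen directly inside the memory structure.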

In addition, the researchers reconfigured the traditionally two-dimensional architecture of the memristor into a three-dimensional block, which could then be stacked and packed into the space required to meet the Feynman Grand Prize Challenge.

“Previous groups show that individual blocks can be scaled to very small dimensions, let’s say 10-by-10 nanometers,” said Strukov, who worked at technology company Hewlett-Packard’s labs when they ramped up development of memristors and material implication logic. By applying those results to his group’s developments, he said, the challenge could easily be met.

Memristors are being heavily researched in academia and in industry for their promising uses in memory storage and neuromorphic computing. While implementations of material implication logic are still rather exotic and far from mainstream, uses for it could pop up at any time, particularly in energy-scarce systems such as robotics and medical implants.

“Since this technology is still new, more research is needed to increase its reliability and lifetime and to demonstrate large scale three-dimensional circuits tightly packed in tens or hundreds of layers,” Adam said.

HP Labs, mentioned in the news release, announced the ‘discovery’ of memristors, and the subsequent application of engineering control over them, in two 2008 papers.

Here’s a link to and a citation for the UCSB paper,

Optimized stateful material implication logic for three-dimensional data manipulation by Gina C. Adam, Brian D. Hoskins, Mirko Prezioso, & Dmitri B. Strukov. Nano Research (2016) pp. 1–10. DOI: 10.1007/s12274-016-1260-1 First Online: 29 September 2016

This paper is behind a paywall.

You can find many articles about memristors here by using either ‘memristor’ or ‘memristors’ as your search term.

Hands, Waldo, and nano-scalpels

Hands were featured in Waldo (a 1943 short story by Robert Heinlein) and in Richard Feynman’s 1959 lecture, “There’s Plenty of Room at the Bottom,” both of which described a field we now call nanotechnology. As I put it in my Aug. 17, 2009 posting,

Both of these texts feature the development of ‘smaller and smaller robotic hands to manipulate matter at the atomic and molecular levels’ and both of these have been cited as the birth of nanotechnology.

The details are a bit sketchy but it seems that scientists at the University of Bath (UK) have created a tiny (nanoscale) tool that looks like a hand. From the University of Bath’s Dec. 12, 2011 news release,

The lower picture shows the AFM probe with the nano-hand circled. The upper image is a vastly enlarged image of the nano-hand, showing the beckoning motion spotted by Dr Gordeev.

Here’s a little more about Dr. Gordeev’s observation from the Dec. 12, 2011 news item on Nanowerk,

Dr Sergey Gordeev, from the Department of Physics, was trying to create a nano-scalpel, a tool which can be used by biologists to look inside cells, when the process went wrong.

Dr Gordeev said: “I was amazed when I looked at the nano-scalpel and saw what appeared to be a beckoning hand.

“Nanoscience research is moving very fast at the moment, so maybe the nano-hand is trying to attract people and funders into this area.”

The research group is using funding from Bath Ventures, an organisation which commercialises the results of the University’s research, and private company Diamond Hard Surfaces Ltd, to explore the use of hard coatings for nano-tools, making them more durable and suitable for delicate biological procedures.

I appreciate Dr. Gordeev’s whimsical notion that the hand might be trying to attract funding for this research group.

Nano’s grey goo and the animation series Futurama

You never know where you’re going to find nanotechnology. Most recently I found it in a review of the first few episodes of the animated US tv series, Futurama. Alasdair Wilkins offered a few thoughts about a recent ‘nanotechnology-influenced’ episode, Benderama. From Wilkins’s June 24, 2011 commentary,

“Benderama” is an example of an episode type that pretty much only Futurama is capable of doing: taking an outlandish but vaguely plausible scientific idea and letting that guide the story. Some all-time great episodes have come from this approach: “The Farnsworth Parabox” did this with alternate universes, Bender’s Big Score used time paradoxes (or the lack thereof), and “The Prisoner of Benda” focused on mind-switching. This time around, the topic is the grey goo scenario of nanotechnology, as Bender gains the ability to create two smaller duplicates of himself, who in turn can each create two smaller duplicates of themselves, who in turn…well, you get the idea. Also, the crew deals with Patton Oswalt’s hideous space giant, who can only take so much mockery of his appearance.
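The duplicates-making-duplicates premise is easy to put rough numbers on. Here’s a toy Python model: copies double each generation, and whether the total volume (a stand-in for mass and resources) explodes or dwindles depends on how much smaller each copy is. The scale factor below is an assumption for illustration, not a figure from the episode.

```python
# Toy model of the doubling-duplicates setup: each robot makes two
# copies of itself at scale factor s. The head count grows as 2**n;
# total volume grows or shrinks depending on s.

def generation_stats(n, s):
    """Copies and total volume (original robot = 1) at generation n."""
    copies = 2 ** n
    volume = (2 * s ** 3) ** n  # each copy has volume s**3 of its parent
    return copies, volume

# With s = 0.9 the total volume grows every generation (grey-goo-ish);
# below s = 2 ** (-1/3), roughly 0.794, it shrinks instead.
for n in range(4):
    print(n, *generation_stats(n, 0.9))
```

The crossover at roughly 0.794 is the whole grey goo question in miniature: shrink the copies slowly enough and the swarm’s appetite diverges; shrink them fast enough and it peters out.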

The business about smaller duplicates creating smaller duplicates is very reminiscent of Waldo, the story by Robert Heinlein which according to Colin Milburn influenced the part about creating smaller and smaller hands in Richard Feynman’s famous 1959 talk, There’s plenty of room at the bottom. From a transcript of Feynman’s talk (scroll down 3/4 of the way),

A hundred tiny hands

When I make my first set of slave “hands” at one-fourth scale, I am going to make ten sets. I make ten sets of “hands,” and I wire them to my original levers so they each do exactly the same thing at the same time in parallel. Now, when I am making my new devices one-quarter again as small, I let each one manufacture ten copies, so that I would have a hundred “hands” at the 1/16th size.
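The arithmetic in Feynman’s passage is simple enough to sketch: each generation makes ten copies of itself at one-quarter scale, so two generations yield a hundred hands at 1/16th size. A quick illustrative calculation (the function name is mine):

```python
# Scaling arithmetic from the quoted passage: every generation builds
# ten copies of itself, each at one-quarter the previous scale.

def pantograph_generations(n):
    """Return (number of hands, relative scale) after n generations."""
    return 10 ** n, (1 / 4) ** n

hands, scale = pantograph_generations(2)
print(hands, scale)  # 100 hands at 1/16 scale, as Feynman describes
```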

The ‘grey goo’ scenario was first proposed by K. Eric Drexler in his 1986 book, Engines of Creation. He has since distanced himself from some of his original assertions about ‘grey goo’, and there is still debate as to the plausibility of the scenario.

From a more technical perspective, Feynman, Heinlein and Benderama present a top-down engineering scenario where one continually makes things smaller and smaller as opposed to the increasingly popular bottom-up engineering scenario where one mimics biological processes in an effort to promote self-assembly.

I’m not sure I’d call the science in the episode ‘outlandish but vaguely plausible’ as it seems old-fashioned to me with regard to both the science and the humour. Still, the episode offers some gentle fun on a topic that usually lends itself to ‘end of the earth’ scenarios, so it’s nice to see the change in tone.

Monkey writes baseball story; Feynman symposium at USC; US government releases nanotechnology data sets; World Economic Forum (at Davos) interested in science

To my horror, researchers at Northwestern University in the US have developed software (Stats Monkey) that can automatically generate a story about a baseball game at the press of a button. More specifically, the game’s statistics are fed into a database, which the software then uses to generate the story.

I knew this would happen when I interviewed some expert at Xerox about 4 or 5 years ago. He was happily burbling on about tagging words and being able to call information up into a database and generating text automatically. I noted that as a writer I found the concept disturbing. He claimed that it would never be used for standard writing but just for things which are repetitive. I guess he was thinking it could be used for instructions and such or perhaps he was just trying to placate me. Back to stats monkey: I find it interesting that the researchers don’t display any examples of the ‘writing’. If you are interested, you can check out the project here.

The discussion about the nanotechnology narrative continues. At the University of Southern California, they will be holding a 50th anniversary symposium about the publication (in 1960) of Feynman’s 1959 talk, There’s plenty of room at the bottom, and its impact on nanotechnology. You can read more about the event here or you can see the programme for the symposium here.

Bravo to the US government as they are releasing information to the public in a bid for transparency. Dave Bruggeman at Pasco Phronesis notes that the major science agencies had not released data sets at the time of his posting. Still, the Office of Science and Technology Policy did make data available including data about the National Nanotechnology Initiative,

The National Nanotechnology Initiative (NNI) coordinates Federal nanotechnology research and development among 25 Federal agencies. The data presented here represent NNI investments by agency and program component area (PCA) from the Initiative’s founding in FY 2001 through FY 2010 (requested). These data have been available as part of the NNI’s annual supplements to the President’s Budget. But compared to earlier releases, the data as presented here are more accessible and readily available for analysis by users wishing to assess trends and examine investment allocations over the 10-year history of the NNI. The cumulative NNI investment of nearly $12 billion is advancing our understanding of the unique phenomena and processes that occur at the nanoscale and is helping leverage that knowledge to speed innovation in high-impact opportunity areas such as energy, security, and medicine.

You can get the data set here in either XLS or PDF formats.

It would be very difficult to get this type of information in Canada as we have no central hub for nanotechnology research funding. We do have the National Institute of Nanotechnology, a National Research Council agency jointly funded by the province of Alberta and the federal government, but not all nanotechnology research is done under its auspices. More than one government agency funds nanotechnology research, and there is no reporting mechanism that would allow us to easily find out how much funding there is or where it’s going.

The 2010 edition of the World Economic Forum meeting at Davos takes place January 27 – 31. It’s interesting to note that a meeting devoted to economic issues has sessions on science, social media, the arts, etc. which suggests a much broader view of economics than I’m usually exposed to. However, the session on ‘Entrepreneurial Science’ does ring a familiar note. From the session description,

According to the US National Academy of Sciences, only 0.1% of all funded basic science research results in a commercial venture.

How can the commercial viability of scientific research be improved?

I’m not sure how they derived the figure of 0.1%. Was the data international? Were they talking about government-funded research? Over what period of time? (It’s not uncommon for research to lie fallow for decades before conditions shift sufficiently to allow commercialization.) How do you determine the path from research to commercialization? e.g. Perhaps the work that resulted in a commercial application was based on 10 other studies that did not.

Art conservation and nanotechnology; the science of social networks; carbon nanotubes and possible mesothelioma; Eric Drexler has a few words

It looks like nanotechnology innovations in the field of art conservation may help preserve priceless works for longer and with less damage. The problem as articulated in Michael Berger’s article on Nanowerk is,

“Nowadays, one of the most important problems faced during the cleaning of works of art is the removal of organic materials, mainly acrylic polymers, applied in the past as consolidants or protective coatings,” explains Piero Baglioni, a professor of Physical Chemistry at the University of Florence. “Unfortunately, their application induces a drastic alteration of the interfacial properties of the artwork and leads to increased degradation. These organic materials must therefore be removed.”

Baglioni and his colleagues at the University of Florence have developed “… a micro-emulsion cleaning agent that is designed to dissolve only the organic molecules on the surface of a painting …”

This is a little off Azonano’s usual beat (and mine too) but Rensselaer Polytechnic Institute’s Army Research Laboratory is launching an interdisciplinary research center for the study of social and cognitive networks.  From the news item,

“Rensselaer offers a unique research environment to lead this important new network science center,” said Rensselaer President Shirley Ann Jackson. “We have assembled an outstanding team of researchers, and built powerful new research platforms. The team will work with one of the largest academic supercomputing centers in the world – the Rensselaer Computational Center for Nanotechnology Innovations – and the leading visualization and simulation capabilities within our new Experimental Media and Performing Arts Center. The Center for Social and Cognitive Networks will bring together our world-class scientists in the areas of computer science, cognitive science, physics, Web science, and mathematics in an unprecedented collaboration to investigate all aspects of the ever-changing and global social climate of today.”

The center will study the fundamentals of social and cognitive networks and their roles in today’s society and organizations, including the U.S. Army. The goal will be to gain a deeper understanding of these networks and build a firm scientific basis in the field of network science. The work will include research on large social networks, with a focus on networks with mobile agents. An example of a mobile agent is someone who is interacting (e.g., communicating, observing, helping, distracting, interrupting, etc.) with others while moving around the environment.

My suspicion is that the real goal for the work is to exploit the data for military advantage, if possible. Any other benefits would be incidental. Of course, a fair chunk of the technology we enjoy today (for example, tv and the internet) was investigated by the military first.

I’ve mentioned carbon nanotubes and possible toxicology before. Specifically, some carbon nanotubes resemble asbestos fibers, and pilot studies have suggested they may behave the same way once taken into the body by one means or another. There is a new confirmation of this hypothesis with a study in which mice inhaled carbon nanotubes. From the news item on Nanowerk,

Using mice in an animal model study, the researchers set out to determine what happens when multi-walled carbon nanotubes are inhaled. Specifically, researchers wanted to determine whether the nanotubes would be able to reach the pleura, which is the tissue that lines the outside of the lungs and is affected by exposure to certain types of asbestos fibers which cause the cancer mesothelioma. The researchers used inhalation exposure and found that inhaled nanotubes do reach the pleura and cause health effects.

This was a single exposure, and the mice recovered after three months. More studies will be needed to determine the effects of repeated exposure. This study (Inhaled Carbon Nanotubes Reach the Sub-Pleural Tissue in Mice by Dr. James Bonner, Dr. Jessica Ryman-Rasmussen, Dr. Arnold Brody, et al.) can be found in the Oct. 25, 2009 issue of Nature Nanotechnology.

On Friday (Oct. 23, 2009) I mentioned an essay by Chris Toumey on the forthcoming 50th anniversary of Richard Feynman’s seminal talk, There’s plenty of room at the bottom. Today I found a response to the essay by Eric Drexler. From Drexler’s essay on Nanowerk,

Unfortunately, yesterday’s backward-looking guest article in Nanowerk reinforces the widespread but quite mistaken idea that my views are essentially the opposite of what I’ve stated above, and that those perverse ideas are also those of the Foresight Institute. I cannot speak for that organization, or vice versa, because I left it years ago. Contrary to what the article may suggest, I have no affiliation with the organization whatsoever. Regarding terminology, it is of course entirely appropriate to use the term “nanotechnology” to describe nanoscale technologies. The idea that there is a conflict between progress in the field and future applications of that progress is puzzling. This idea appears to stem from a strange episode that came to a head during the political push for the bill that established and funded the U.S. National Nanotechnology Initiative, an episode in which some leading science spokesmen quite properly rejected a collection of popular fantasies, but quite improperly attributed those fantasies to me. Reading claims by confused enthusiasts and the press that “Drexler says this” or “Drexler says that” is no substitute for reading my journal articles, or the technical analysis in my book, Nanosystems, and in my MIT dissertation. The failure of these leaders to do their homework has had substantial and lingering toxic effects.

(My own focus was on the ‘origin’ story for nanotechnology and not on Drexler’s theories.) If I understand the situation rightly, much of the controversy has its roots in Drexler’s popular book, Engines of Creation. It was written over 20 years ago and struck a note which reverberates to this day. The irony is that there are writers who’d trade places with Drexler in a nanosecond. Imagine having that kind of impact on society and culture (in the US primarily). The downside, as Drexler has discovered, is that the idea or story has taken on a life of its own. For a similar example, take Mary Shelley’s book: Frankenstein is not the monster’s name, it’s the scientist’s name. However, the character took on a life, and a name, of its own.

Plenty of Room at the Bottom’s 50th anniversary; new advance in nanoassembly; satirizing the copyright wars; China’s social media map

There’s plenty of room at the bottom, Richard Feynman’s December 29, 1959 talk for the American Physical Society is considered to be the starting point or origin for nanotechnology and this December marks its 50th anniversary. Chris Toumey, a cultural anthropologist at the University of South Carolina NanoCenter, has an interesting commentary about it (on Nanowerk) and he poses the question, would nanotechnology have existed without Richard Feynman’s talk? Toumey answers yes. You can read the commentary here.

In contrast to Toumey’s speculations, there’s Colin Milburn (a professor at the University of California, Davis) who, in his essay Nanotechnology in the Age of Posthuman Engineering: Science Fiction as Science, suggests that nanotechnology originated in science fiction. You can read more about Milburn, find the citations for the essay I’ve mentioned, and/or download three of his other essays from here.

Ting Xu and her colleagues at the US Dept. of Energy’s Lawrence Berkeley National Laboratory have developed a new technique for self-assembling nanoparticles. From the news item on Physorg.com,

“Bring together the right basic components – nanoparticles, polymers and small molecules – stimulate the mix with a combination of heat, light or some other factors, and these components will assemble into sophisticated structures or patterns,” says Xu. “It is not dissimilar from how nature does it.”

More details are available here.

TechDirt featured a clip from This Hour Has 22 Minutes, a satirical Canadian comedy tv programme, which pokes fun at the scaremongering that features mightily in discussions about copyright. You can find the clip here on YouTube.

I’ve been meaning to mention this tiny item from Fast Company (by Noah Robischon) about China’s social media. From the news bit,

The major players in the U.S. social media world can be counted on one hand: Facebook, MySpace, Twitter, LinkedIn. Not so in China, where the country’s 300 million online users have a panoply of popular social networks to choose from–and Facebook doesn’t even crack the top 10.

Go here to see the infographic illustrating China’s social media landscape.

Happy weekend!