Tag Archives: Nanoclast

Memristors and proteins

The memristor, a two-terminal circuit element joining the resistor, capacitor, and inductor, has until now been demonstrated using nonbiological materials such as metal oxides and carbon. Researchers in Singapore have reported in a paper (in the Sept. 5, 2011 online edition of Small, Protein-Based Memristive Nanodevice) that a memristive nanodevice can be based on a protein. From the Sept. 15, 2011 Spotlight article by Michael Berger on Nanowerk,

Memristors – the fourth fundamental two-terminal circuit element following the resistor, the capacitor, and the inductor – have attracted intensive attention owing to their potential applications for instance in nanoelectronic memories, computer logic, or neuromorphic computer architectures.

“Previous work on memristors were based on man-made inorganic/organic materials, so we asked the question whether it is possible to demonstrate memristors based on natural materials,” Xiaodong Chen, an assistant professor in the School of Materials Science & Engineering at Nanyang Technological University, tells Nanowerk. “Many activities in life exhibit memory behavior and substantial research has focused on biomolecules serving as computing elements, hence, natural biomaterials may have potential to be exploited as electronic memristors.”

This work provides a direct proof that natural biomaterials, especially redox proteins, could be used to fabricate solid state devices with transport junctions, which have potential applications in functional nanocircuits.
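For readers who want the formal picture, here are the standard textbook relations (Leon Chua’s 1971 formulation, not anything specific to the Small paper). Each of the four basic two-terminal elements links a pair of circuit variables, and the memristor is the one that ties flux linkage to charge:

```latex
\text{resistor: } dv = R\,di \qquad
\text{capacitor: } dq = C\,dv \qquad
\text{inductor: } d\varphi = L\,di \qquad
\text{memristor: } d\varphi = M(q)\,dq
```

Because flux linkage and charge are the time integrals of voltage and current, the memristance M(q) acts like a resistance whose value depends on the history of the current that has passed through the device; that history dependence is what gives the element its memory.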

My last posting about memristors was the April 13, 2011 piece, Blood, memristors, cyborgs plus brain-controlled computers, prosthetics, and art.

ETA Sept. 21, 2011: Dexter Johnson at Nanoclast (on the Institute of Electrical and Electronics Engineers website) offers another take on memristors in his Sept. 20, 2011 posting, Memristors Go Biological. I particularly liked this bit,

It’s been just three years since the memristor was identified so if statistical norms of commercialization are in place we can expect another four years of waiting before we see this material in our smart phones. In fact, this timeline is pretty close to HP’s expectations of 2014 as a target date for its incorporation into electronic devices.

During this time researchers have not been and will not be sitting on their hands while engineers work out scalability and yields.

July 2011 update on nanotechnology regulatory framework discussion

It’s getting hard to keep up with the material on nanotechnology regulatory frameworks these days but here’s my latest effort (in no particular order).

Nanowiki published a July 7, 2011 roundup of the discussion about the recent FDA (US Food and Drug Administration) and EPA (US Environmental Protection Agency) initiatives along with a list of selected articles and blog postings to supply context (yes, my blog posting Nano regulatory frameworks are everywhere! of June 22, 2011 was included!). Please do check out their roundup as they mention articles and commentaries that I haven’t.

Also included in the Nanowiki roundup was a draft of an article for Nature magazine by Andrew Maynard (Director of the University of Michigan Risk Science Center) on the topic of nanomaterial definition and nanotechnology regulatory frameworks. The final version of the article is behind a paywall but a draft version can be viewed on Andrew’s 2020 Science blog. From his July 6, 2011 posting,

Five years ago, I was a strong proponent of developing a regulatory definition of engineered nanomaterials.  Today I am not.  Even as policy makers are looking for clear definitions on which to build and implement nano-regulations, the science is showing there is no bright line separating the risks presented by nanometer and non-nanometer scale materials.  As a result, there is a growing danger of science being pushed to one side as government agencies strive to regulate nanomaterials and the products they are used in.

I have mentioned Andrew’s perspective vis-à-vis bypassing a definition of nanomaterials and getting on with the task of setting a regulatory framework in my June 9, 2011 and my April 15, 2011 postings. I expressed some generalized doubts about this approach in the earlier posting while noting that both Andrew and Dexter Johnson (Nanoclast blog on the IEEE [Institute of Electrical and Electronics Engineers] Spectrum website) have a point when they express concern that the definition may be based on public relations concerns rather than science.

Also chiming in on the debate is Scott Rickert (president and chief executive officer of Nanofilm) in his July 8, 2011 article, Six Ways I Know Nanotechnology Is Here To Stay, for Industry Week,

Have you been keeping up on recent government developments that have the nanotechnology industry in an uproar? First there was a dust-up when Clayton Teague stepped down as Director of the National Nanotechnology Coordination Office. There were rumors that the anti-nano forces had run him out. (Not true, by the way.) Then an announcement that the Food and Drug Administration would be looking at nanotechnology safety guidelines got some folks twitching. The same day, the White House released principles to guide the regulation and oversight of nanotechnology applications. That had people running for the exits.

Colleagues who’ve been in nanotechnology for a decade without incident were considering shutting down businesses, afraid a nano-boogieman was going to target them for billion-dollar lawsuits. Start-ups were in fear that the trickle of investment money would completely dry up. Any day I expect to see black armbands popping up in university labs in mourning over lost research grants.

Rickert goes on to suggest that all this recent regulatory activity can be attributed to ‘growing pains’ which he supports with various facts and figures. He has commented on this topic before as I note in my June 17, 2010 posting.

Happy Weekend!

Nanomaterial regulatory frameworks: what’s all the fuss?

I’ve dug up more information on nanomaterials and regulatory frameworks but, before I launch into the discussion, I think it might be interesting to take a look at this graphic of a plant’s potential uptake of various nanomaterials, as it illustrates some of the reasons why there’s so much interest in this topic.

[Graphic: a plant’s potential uptake of various nanomaterials, downloaded from the June 7, 2011 article, Nano & The Food Chain: Another Puzzle, by Gwyneth K. Shaw for the New Haven Independent; originally published in the Journal of Agricultural and Food Chemistry.]

Shaw’s article is about a study (Interaction of Nanoparticles with Edible Plants and Their Possible Implications in the Food Chain [this is behind a paywall]) by researchers at the University of Texas at El Paso, which reviews current studies in the field and suggests that as nanoparticles enter the food chain we need to consider cumulative effects.

Meanwhile, the discussion about developing regulatory frameworks and whether or not we need to have a definition for nanomaterials before setting a regulatory framework continues. From the June 7, 2011 news item on Nanowerk,

The Belgian Presidency of the Council of the European Union organized a high level event on September 14, 2010, bringing together representatives of various associations (consumers, environmental protection, workers, industrial federations), scientists, regulatory experts as well as national and European regulatory bodies, in order to review the legislative initiatives in progress with regard to nanomaterials and to establish an operational framework for the management of incidents in the short term and to achieve improved risk management in the long term.

Initially I confused this meeting with the March 2011 meeting mentioned in my April 14, 2011 posting, but I gather there are a number of meetings (some of which seem remarkably similar) on the topic with various European Union groups and subgroups. The September 2010 meeting was under the auspices of the European Union and the March 2011 meeting was under the auspices of the European Commission (which is part of the European Union bureaucracy). In any event, the September 2010 meeting produced a set of objectives (from the news item),

THE [European Union] PRESIDENCY CONCLUDES THAT, IN ORDER TO protect the workers, consumers health and the environment, and at the same time guarantee the development of a secure and sound economy based notably on innovation and societally acceptable industrial applications that create quality jobs, THE FOLLOWING OBJECTIVES MUST BE REACHED, IN RELATION TO NANOMATERIALS, PRODUCTS CONTAINING NANOMATERIALS AND NANOTECHNOLOGIES:

1) REGARDING THE REGULATORY FRAMEWORK:

  • to effectively address their potential risks and uncertainties, at the earliest, and thus ensure a high level of environment and health protection;
  • to consider their challenges transversally, across sectors, disciplines and regulations;
  • in parallel, to implement specific regulatory measures to deal with their particularities;
  • to appropriately inform and consult consumers, workers and citizens;

2) REGARDING SCIENCE, RESEARCH, INNOVATION AND KNOWLEDGE:

  • to develop the necessary scientific knowledge in a global, coordinated and open manner;
  • to be proactive and to anticipate when dealing with the risks and uncertainties of new technological developments.

IN CONSEQUENCE, THE FOLLOWING ACTIONS HAVE TO BE TAKEN:

  • to take up responsibilities at the Member States level and, during a transitory period, draw up coordinated and integrated national strategies and concrete measures in favour of risk management, information and monitoring;
  • to develop urgently a regulatory definition for nanomaterials that must include nanomaterials all along their lifecycle, including into substances, products, articles, wearing residues and waste; [emphasis mine]
  • to consider nanotechnology as a priority into a future 2nd Environment and Health Action Plan, including inter alia basic and applied research related to them, their specific potential risks, their traceability and the link between innovation, environment and health safety;
  • to clarify the various issues that remain presently unaddressed in the Commission proposals to adapt REACH to the nanomaterials and, in addition to the adaptations to the guidances to include significant modifications into the REACH 2012 review, including the lowering of the tonnage triggers for nanomaterials, modifications to data requirements in REACH annexes, consideration of nanomaterials as new substances, annexes V (exemptions) and XIII review (PBT, vPvB) and the inclusion in REACH of a definition of nanomaterials and articles containing nanomaterials;
  • to increase public and private resources, especially the financial inputs to the OECD WPMN, with the goal of obtaining results to be used for regulatory purposes as soon as possible;
  • to develop harmonized compulsory databases of nanomaterials and products containing nanomaterials;
  • such databases must be the base for traceability, market surveillance, gaining knowledge for better risk prevention and for the improvement of the legislative framework;
  • to take into account, in the design of such databases, the need for providing information to the citizens, workers and consumers regarding nanomaterials and products containing nanomaterials as well as the industry’s need for data protection;
  • claims made on labels of products containing nanomaterials must be regulated and the requirements to inform the consumer of the presence of nanomaterials in consumer products must be defined;
  • to consider sustainability, societal benefits, demands for public participation, and ethical considerations in the public investments in innovative technologies;
  • to establish a systematic, balanced and appropriate link between on the one hand the assessment of risk, early warnings and uncertainties and on the other hand the public investments in innovative technologies in general and nanotechnologies in particular, including financing mechanisms that take such a link into account;
  • to consider research in toxicology and ecotoxicology of nanomaterials, as well as their fate along the whole lifecycle as a high priority.

There is a school of thought that a regulatory framework can be put in place without establishing a definition beforehand as per my April 15, 2011 posting where I mentioned Dr. Andrew Maynard’s proposal and expressed some hesitation. I see Dexter Johnson (of the Nanoclast blog on the IEEE [Institute of Electrical and Electronics Engineers] website), after interviewing Rudolf Strohmeier, Deputy Director General, Directorate General for Research & Innovation for the European Commission at the EuroNano Forum 2011 in Budapest, Hungary, has weighed in with this in his May 31, 2011 posting,

Below is an audio recording I made of my exchange with Mr. Strohmeier. Interestingly, according to him, the definition was necessary for educating EU citizens as much as for developing regulations. …

In fairness, I didn’t really get a chance to follow up with Mr. Strohmeier to see if he could see the problems that arise when you arbitrarily arrive at a definition that may not always reflect the latest science on the topic. Nonetheless, I can’t help but think that a definition that is as much about mollifying the public as it is about good science has inherent risks itself. [emphases mine]

I take Dexter’s and Andrew’s point about the potential problems that creating a definition for what I’m going to call ‘public relations purposes’ could cause but I still haven’t grasped how one would create a regulatory framework without a definition of some kind (but maybe that’s just the writer in me).

All of this certainly puts the Canadian situation into perspective. There’s an interim definition in place. As for a regulatory framework, it appears that the government (Health Canada) favours a case-by-case approach as per its plans to investigate nanosunscreens (noted in my June 3, 2011 posting).

Finger pinches today, heartbeats tomorrow and electricity forever

Devices powered by energy generated and harvested from one’s own body have been of tremendous interest to me. Last year I mentioned some research in this area by Professor Zhong Lin Wang at Georgia Tech (Georgia Institute of Technology) in a July 12, 2010 posting. Well, Wang and his team recently announced that they have developed the first commercially viable nanogenerator. From the March 29, 2011 news item on Physorg.com,

After six years of intensive effort, scientists are reporting development of the first commercially viable nanogenerator, a flexible chip that can use body movements — a finger pinch now en route to a pulse beat in the future — to generate electricity. Speaking here today at the 241st National Meeting & Exposition of the American Chemical Society, they described boosting the device’s power output by thousands of times and its voltage by 150 times to finally move it out of the lab and toward everyday life.

“This development represents a milestone toward producing portable electronics that can be powered by body movements without the use of batteries or electrical outlets,” said lead scientist Zhong Lin Wang, Ph.D. “Our nanogenerators are poised to change lives in the future. Their potential is only limited by one’s imagination.”

Here’s how it works (from Kit Eaton’s article on Fast Company),

The trick used by Dr. Zhong Lin Wang’s team has been to utilize nanowires made of zinc oxide (ZnO). ZnO is a piezoelectric material–meaning it changes shape slightly when an electrical field is applied across it, or a current is generated when it’s flexed by an external force. By combining nanoscopic wires (each 500 times narrower than a human hair) of ZnO into a flexible bundle, the team found it could generate truly workable amounts of energy. The bundle is actually bonded to a flexible polymer slice, and in the experimental setup five pinky-nail-size nanogenerators were stacked up to create a power supply that can push out 1 micro Amp at about 3 volts. That doesn’t sound like a lot, but it was enough to power an LED and an LCD screen in a demonstration of the technology’s effectiveness.
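For scale, assuming the quoted 3 volts and 1 microamp are delivered simultaneously as steady DC (my simplification, not a detail from the article), the stacked power supply works out to about 3 microwatts:

```latex
P = VI = (3\ \mathrm{V}) \times (1 \times 10^{-6}\ \mathrm{A}) = 3 \times 10^{-6}\ \mathrm{W} = 3\ \mu\mathrm{W}
```

That is a minuscule budget by consumer-electronics standards, which is why the ultra-low-power electronics Dexter Johnson mentions below matter so much to the commercialization story.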

Dexter Johnson at Nanoclast on the IEEE (Institute of Electrical and Electronics Engineers) website notes in his March 30, 2011 posting (http://spectrum.ieee.org/nanoclast/semiconductors/nanotechnology/powering-our-electronic-devices-with-nanogenerators-looks-more-feasible) that the nanogenerator’s commercial viability is dependent on work being done at the University of Illinois,

I would have happily chalked this story [about the nanogenerator] up to one more excellent job of getting nanomaterial research into the mainstream press, but because of recent work by Eric Pop and his colleagues at the University of Illinois’s Beckman Institute in reducing the energy consumed by electronic devices it seems a bit more intriguing now.

So low is the energy consumption of the electronics proposed by the University of Illinois research it is to the point where a mobile device may not need a battery but could possibly operate on the energy generated from piezoelectric-enabled nanogenerators contained within such devices like those proposed by Wang.

I have a suspicion it’s going to be a while before I will be wearing nanogenerators to harvest the electricity my body produces. Meanwhile, I have some questions about the possible uses for nanogenerators (from the Kit Eaton article),

The search for tiny power generator technology has slowly inched forward for years for good reason–there are a trillion medical and surveillance uses–not to mention countless consumer electronics applications– for a system that could grab electrical power from something nearby that’s moving even just a tiny bit. Imagine an implanted insulin pump, or a pacemaker that’s powered by the throbbing of the heart or blood vessels nearby (and then imagine the pacemaker powering the heart, which is powered by the pacemaker, and so on and so on….) and you see how useful such a system could be.

It’s the reference to surveillance that makes me a little uneasy.

Innovation discussion in Canada lacks imagination

Today, Feb. 18, 2011, is the last day you have to make a submission to the federal government of Canada’s Review of Federal Support to Research and Development.

By the way, the expert panel appointed and tasked with carrying out this consultation consists of:

Mr. Thomas Jenkins – Chair
Dr. Bev Dahlby
Dr. Arvind Gupta
Ms. Monique F. Leroux
Dr. David Naylor
Mrs. Nobina Robinson

They represent a mix of industry and academia; you can read more about them here, although you will have to click through for each biography. Unfortunately, neither the website nor the consultation paper offers a list of panel members with biographies grouped together for easy scanning.

One sidenote: big kudos to whoever decided this was a good idea (from the Review web page),

Important note: Submissions received by the panel will be made publicly available on this site as early as March 4, 2011. [emphases mine] * The name and organizational affiliation of the individual making the submission will be posted on the site; however, contact information (i.e., email addresses, phone numbers and postal addresses) will not be posted, unless that information is embedded in the submission itself.

This initiative can be viewed in two ways: (a) necessary housecleaning of funding programmes for research and development (R&D) that are not effective and (b) an attempt to kickstart more innovation in Canada, i.e., better ties between government R&D efforts and industry to achieve more productivity. From the consultation paper’s introduction,

WHY A REVIEW?

Innovation by business is a vital part of maintaining a high standard of living in Canada and building Canadian sources of global advantage. The Government of Canada plays an important role in fostering an economic climate that encourages business innovation, including by providing substantial funding through tax incentives and direct program support to enhance business research and development (R&D). Despite the high level of federal support, Canada continues to lag behind other countries in business R&D expenditures (see Figure 1), and this is believed to be a significant factor in contributing to the country’s weak productivity growth. Recognizing this, Budget 2010 announced a comprehensive review of federal support to R&D in order to maximize its contribution to innovation and to economic opportunities for business. (p. 1 print; p. 3 PDF)

I’d like to offer a submission but I can’t for two reasons. (a) I really don’t know much about the ‘housecleaning’ aspects. (b) The panel’s terms of reference vis-à-vis innovation are so constrained that any comments I could offer fall far outside its purview.

Here’s what I mean by ‘constrained terms of reference’ (from the consultation paper),

The Panel has been asked to provide advice related to the following questions:

§ What federal initiatives are most effective in increasing business R&D and facilitating commercially relevant R&D partnerships?

§ Is the current mix and design of tax incentives and direct support for business R&D and business-focused R&D appropriate?

§ What, if any, gaps are evident in the current suite of programming, and what might be done to fill these gaps?

In addition, the Panel’s mandate specifies that its recommendations not result in an increase or decrease to the overall level of funding required for federal R&D initiatives. (p. 3 print; p. 5 PDF)

The ‘housecleaning’ effort is long overdue. Even good government programmes can outlive their usefulness, while ineffective and/or bad programmes don’t get jettisoned soon enough or often enough. If you want a sense of just how complicated our current R&D funding system is, just check this out from Nassif Ghoussoub’s (Piece of Mind blog) Jan. 14, 2011 posting,

Now the number of programs that the government supports, and which are under review is simply mind boggling.

First, you have the largest piece of the puzzle, the $4-billion “Scientific Research and Experimental Development tax credit program” (SR&ED), which seems to be the big elephant in the room. I hardly know anything about this program, besides the fact that it is a federal tax incentive program, administered by the Canada Revenue Agency, that encourages Canadian businesses of all sizes, and in all sectors to conduct research and development in Canada. Former VP of the NRC and former President of Alberta Ingenuity, Peter Hackett, has lots to say about this. Also on youtube.

But you don’t need to be an expert to imagine the line-up of CEOs waiting to testify as to how important these tax incentives are to the country? “Paris vaut bien une messe” [Paris is well worth a mass] and a billion or four are surely worth testifying for.

Next, just take a look (below) at this illustrative list of more directly funded federal programs. Why “illustrative”?, because there is at least one hundred more!

Do you really think that anyone of the heads/directors/presidents (the shopkeepers!) of these programs (the shops!) are going to testify that their programs are deficient and need less funding? What about those individuals that are getting serious funding from these programs (the clients!)?

Nassif’s list is 50 (!) programmes long and he suggests there are another 100 of them. Yes, housecleaning is long overdue but, as Nassif points out, the people most likely to submit comments about these programmes are likely to be beneficiaries disinclined to see their demise.

There is another problem with this ‘housecleaning’ process in that the panel seems to be interested in ‘tweaking’ rather than renovating or rethinking the system. Rob Annan at the Researcher Forum (Don’t leave Canada behind) blog titled his Feb. 4, 2011 post Innovation vs. Invention, as he questions what we mean by innovation (excerpt from his posting),

I wonder if we’ve got the whole thing wrong.

The fact is: universities don’t produce innovation. For that matter, neither does industrial R&D.

What university and industrial research produces is invention.

The Blackberry is not an innovation, it’s an invention. A new cancer-fighting drug is not an innovation, it’s an invention. A more durable prosthetic knee is not an innovation, it’s an invention.

Universities can – and do – produce inventions.

In fact, they produce inventions at an astonishing rate. University tech transfer offices (now usually branded as “centres for innovation and commercialization”) register more intellectual property than could ever be effectively commercialized.

But innovation is distinct from invention. Innovation is about process.

Innovation is about finding more efficient ways to do things. Innovation is about increasing productivity. Innovation is about creating new markets – sometimes through the commercialization of inventions.

Innovation is about the how not about the what.

Thought-provoking, yes? I think a much broader scope needs to be taken if we’re going to really discuss innovation in Canada. I’m talking about culture and making a cultural shift. One of the things I’ve noticed is that everyone keeps saying Canadians aren’t innovative. Fair enough. So, how does adding another government programme change that? As far as I can tell, most of the incentives that were created have simply encouraged people to game the system, which is what you might expect from people who aren’t innovative.

I think one of the questions that should have been asked is, how do you encourage the behaviour, in this case a cultural shift towards innovation, you want when your programmes haven’t elicited that behaviour?

Something else I’d suggest, let’s not confine the question(s) to the usual players as they’ll be inclined to offer more of the same. (There’s an old saying, if you’re a hammer, everything looks like a nail.)

Another aspect of making a cultural shift is modeling at least some of the behaviours. Here’s something Dexter Johnson at the Nanoclast blog (IEEE Spectrum) noticed about US President Barack Obama’s January 2011 State of the Union address in his January 28, 2011 posting,

Earlier this week in the President’s State of the Union Address, a 16-year-old girl by the name Amy Chyao accompanied the First Lady at her seat.

No doubt Ms. Chyao’s presence was a bit of stage craft to underscore the future of America’s ingenuity and innovation because Ms. Chyao, who is still a high school junior, managed to synthesize a nanoparticle that when exposed to infrared light even when it is inside the body can be triggered like a bomb to kill cancer cells. [emphasis mine] Ms. Chyao performed her research and synthesis in the lab of Kenneth J. Balkus, Jr., a chemistry professor at the University of Texas at Dallas.

This is a remarkable achievement and even more so from someone still so young, so we would have to agree with Prof. Balkus’ assessment that “At some point in her future, she’ll be a star.”

However, Chyao was given to us as a shining example of the US potential for innovation, and, as a result, its competitiveness. So beyond stage craft, what is the assessment of innovation for the US in a time of emerging technologies such as nanotechnology? [emphasis mine]

As President Obama attempts to rally the nation with “This is our Sputnik moment”, Andrew Maynard over on his 20/20 blog tries to work out what innovation means in our current context as compared to what it meant 50 years ago at the dawn of the space race.

Notice the emphasis on innovation. Our US neighbours are as concerned as we are about this and what I find interesting is that there are glimmers of a very different approach. Yes, Chyao’s presence was stagecraft but this kind of ‘symbolic communication’ can be incredibly important. I say ‘can’ because if it’s purely stagecraft then it will be condemned as a cheap stunt, but if they are able to mobilize ‘enough’ stories, programmes, education, etc. that support the notion of US ingenuity and innovation, then you can see a cultural shift occur. [Perfection won’t be achieved; there will be failures. What you need are enough stories and successes.] Meanwhile, Canadians keep being told they’re not innovative and ‘we must do something’.

This US consultation may be more stagecraft but it shows that not all consultations have to be as thoroughly constrained as the Canadian one finishing today. From Mike Masnick’s Feb. 9, 2011 posting (The White House Wants Advice On What’s Blocking American Innovation) on Techdirt,

The White House website kicked off a new feature this week, called Advise the Advisor, in which a senior staff member at the White House will post a YouTube video [there’s one in this posting on the Techdirt website] on a particular subject, asking the public to weigh in on that topic via a form. The very first such topic is one near and dear to our hearts: American Innovation. [emphasis mine] …

And here is the answer I provided:

Research on economic growth has shown time and time again the importance of basic innovation towards improving the standard of living of people around the world. Economist Paul Romer’s landmark research into innovation highlighted the key factor in economic growth is increasing the spread of ideas.

Traditionally, many people have considered the patent system to be a key driver for innovation, but, over the last few decades, research has repeatedly suggested that this is not the case. In fact, patents more frequently act as a hindrance to innovation rather than as a help to it. Recent research by James Bessen & Michael Meurer (reviewing dozens of patent studies) found that the costs of patents far outweigh the benefits.

This is a problem I see daily as the founder of a startup in Silicon Valley — often considered one of the most innovative places on earth. Patents are not seen as an incentive to innovation at all. Here, patents are simply feared. The fear is that anyone doing something innovative will be sued out of nowhere by someone with a broad patent. A single patent lawsuit can cost millions of dollars and can waste tons of resources that could have gone towards actual innovation. Firms in Silicon Valley tend to get patents solely for defensive purposes.

Getting back to Dexter, there is one other aspect of his comments that should be considered: the emphasis on ‘emerging technologies’. The circumstances in which we currently find ourselves are hugely different than they were during the Industrial Revolution, the arrival of plastics and pesticides, etc. We understand our science and technology and their impacts quite differently than we did even a generation ago, and that requires a different approach to innovation than the ones we’ve used in the past.

… if technology innovation is as important as Obama (and many others besides) believes it is, how do we develop the twenty first century understanding, tools and institutions to take full advantage of it?

One thing that is clear is that in connecting innovation to action, we will need new insights and “intelligence” on how to make this connection work in today’s world. These will need to address not only the process of technology innovation, but also how we develop and use it within an increasingly connected society, where more people have greater influence over what works – and what doesn’t – than ever before. This was the crux of a proposal coming out of the World Economic Forum Global Redesign Agenda earlier this year, which outlined the need for a new Global Center for Emerging Technologies Intelligence.

But beyond the need for new institutions, there is also the need for far more integrated approaches to building a sustainable future through technology innovation – getting away from the concept of technology innovation as something that is somebody else’s business, and making it everybody’s business. This was a central theme in the World Economic Forum report that Tim Harper of CIENTIFICA Ltd. and I published last week.

There’s a lot more to be said about the topic. Masnick did get a response of sorts to his submission about US innovation (from his Feb. 17, 2011 posting on Techdirt),

Tony was the first of a bunch of you to send over the news that President Obama’s top advisor, David Plouffe, has put up a blog post providing a preliminary overview of what he “heard” via the Ask the Advisor question, which we wrote about last week, concerning “obstacles to innovation.” The only indication that responses like mine were read was a brief mention about how some people complained about how the government, and particularly patent policy, got in the way of innovation:

Many respondents felt that too much government regulation stifled businesses and innovators and that the patent process and intellectual property laws are broken.

Unfortunately, rather than listening to why today’s patent system is a real and significant problem, it appears that Plouffe is using this to score political points for his boss …

Masnick hasn’t lost hope as he goes on to note in his posting.

For yet another perspective, I found that Europeans weighed in on the innovation topic at the American Association for the Advancement of Science (AAAS) 2011 annual meeting this morning (Feb. 18, 2011). From a Government of Canada science blog (http://blogs.science.gc.ca/) posting, Mobilizing resources for research and innovation: the EU model, by Helen Murphy,

EU Commission Director-General of the Joint Research Centre Robert-Jan Smits spoke about what all countries agree on: that research and innovation are essential to prosperity — not just now, but even more so in the future.

He said European leaders are voicing the same message as President Obama, who in his recent State of the Union address linked innovation to “winning the future” — something he called the “Sputnik moment of our generation.”

Smits talked about the challenge of getting agreement among the EU’s 27 member countries on a growth strategy. But they have agreed; they’ve agreed to pursue growth that is smart (putting research and innovation at centre stage), sustainable (using resources efficiently and responsibly) and inclusive (leaving no one behind and creating new jobs).

The goal is ambitious: the EU aims to create nearly four million new jobs in Europe and increase the EU’s GDP by 700 billion Euros by 2025.

What I’m trying to say is that innovation is a big conversation and I hope that the expert panel for Canada’s current consultation on this matter will go beyond its terms of reference to suggest that ‘housecleaning and tweaking’ should be part of a larger initiative that includes using a little imagination.

NISE Net’s YouTube channel

Dexter Johnson at his Nanoclast blog noted in an October 13, 2010 posting that NISE Net (Nanoscale Informal Science Education Network) has placed a number of nanotechnology-related videos on its own YouTube channel. From the Nanoclast posting,

I haven’t really looked at a wide variety of videos that NISE has collected, but the ones that come from a DVD NISE Network produced called “Talking Nano” contains some real gems. In particular, I enjoyed a seminar George Whitesides gave educators and journalists back in 2007 at the Museum of Science in Boston on what they should know and consider important when relating the subject of nanotechnology either to their students or their audience.

Whitesides, of course, is a renowned scientist at Harvard University, and someone who I’ve come to appreciate for his unique perspectives on how nanotechnology will develop.

Dexter features part 1 of the Whitesides interview which he recommends. I haven’t had time to check the video out yet, although based on the pleasure of seeing some of Whitesides’ collaborative work with Felice Frankel in book form, I too would recommend it.

Overpromising and underdelivering: genome, stem cells, gene therapy and nano food

When people talk about overpromising (aka hype/hyperbole) and science, they’re usually referring to overexcited marketing collateral and/or a public relations initiative and/or news media coverage. Scientists themselves don’t tend to be identified as one of the sources for hype even when that’s clearly the case. That’s right, scientists are people too and sometimes they get carried away by their enthusiasms, as Emily Yoffe notes in her excellent Slate essay, The Medical Revolution: Where are the cures promised by stem cells, gene therapy, and the human genome? From Yoffe’s essay,

Dr. J. William Langston has been researching Parkinson’s disease for 25 years. At one time, it seemed likely he’d have to find another disease to study, because a cure for Parkinson’s looked imminent. In the late 1980s, the field of regenerative medicine seemed poised to make it possible for doctors to put healthy tissue in a damaged brain, reversing the destruction caused by the disease.

Langston was one of many optimists. In 1999, the then-head of the National Institute of Neurological Disorders and Stroke, Dr. Gerald Fischbach, testified before the Senate that with “skill and luck,” Parkinson’s could be cured in five to 10 years. Now Langston, who is 67, doesn’t think he’ll see a Parkinson’s cure in his professional lifetime. He no longer uses “the C word” and acknowledges he and others were naive. [emphasis mine] He understands the anger of patients who, he says, “are getting quite bitter” that they remain ill, long past the time when they thought they would have been restored to health.

The disappointments are so acute in part because the promises have been so big. Over the past two decades, we’ve been told that a new age of molecular medicine—using gene therapy, stem cells, and the knowledge gleaned from unlocking the human genome—would bring us medical miracles. [emphasis mine] Just as antibiotics conquered infectious diseases and vaccines eliminated the scourges of polio and smallpox, the ability to manipulate our cells and genes is supposed to vanquish everything from terrible inherited disorders, such as Huntington’s and cystic fibrosis, to widespread conditions like cancer, diabetes, and heart disease.

Yoffe goes on to outline the problems that researchers encounter when trying to ‘fix’ what’s gone wrong.

Parkinson’s disease was long held out as the model for new knowledge and technologies eradicating illnesses. Instead, it has become the model for its unforeseen consequences. [emphasis mine]

Langston, head of the Parkinson’s Institute and Clinical Center, explains that scientists believed the damage to patients took place in a discrete part of the brain, the substantia nigra. “It was a small target. All we’d have to do was replace the missing cells, do it once, and that would cure the disease,” Langston says. “We were wrong about that. This disease hits many other areas of the brain. You can’t just put transplants here and there. The brain is not a pincushion.”

Diseases of all kinds have proven to be infinitely more complex than first realized. Disease is not ‘cause and effect’ driven so much as it is a process with an infinite number of potential inputs and any number of potential outcomes. Take, for example, gene therapy (note: the human genome project was supposed to yield gene therapies),

In some ways, gene therapy for boys with a deadly immune disorder, X-linked severe combined immune deficiency, also known as “bubble boy” disease, is the miracle made manifest. Inserting good genes into these children has allowed some to live normal lives. Unfortunately, within a few years of treatment, a significant minority have developed leukemia. The gene therapy, it turns out, activated existing cancer-causing genes in these children. This results in what the co-discoverer of the structure of DNA, James Watson, calls “the depressing calculus” of curing an invariably fatal disease—and hoping it doesn’t cause a sometimes-fatal one.

For me, it seems that the human genome project was akin to taking a clock apart. Looking at the constituent parts and replacing broken ones does not guarantee that you will be able to assemble a more efficient working version unless you know how the clock worked in the first place. We still don’t understand how the basic parts, the genes, interact with each other, within their environment, or with external inputs.

The state of our ignorance is illustrated by the recent sequencing of the genome of Bishop Desmond Tutu and four Bushmen. Three of the Bushmen had a gene mutation associated with a liver disease that kills people while young. But the Bushmen are all over 80—which means either the variation doesn’t actually cause the disease, or there are other factors protecting the Bushmen.

As for the pressures acting on the scientists themselves,

There are forces, both external and internal, on scientists that almost require them to oversell. Without money, there’s no science. Researchers must constantly convince administrators who control tax dollars, investors, and individual donors that the work they are doing will make a difference. Nancy Wexler says that in order to get funding, “You have to promise cures, that you’ll meet certain milestones within a certain time frame.”

The infomercial-level hype for both gene therapy and stem cells is not just because scientists are trying to convince funders, but because they want to believe. [emphases mine]

Scientific advances, as one of Yoffe’s interview subjects points out, involve a process dogged by failure and setbacks, one requiring an attitude of humility laced with patience and practiced over decades before an ‘overnight success’ occurs, if it ever does.

I was reminded of Yoffe’s article after reading a nano food article recently written by Kate Kelland for Reuters,

In a taste of things to come, food scientists say they have cooked up a way of using nanotechnology to make low-fat or fat-free foods just as appetizing and satisfying as their full-fat fellows.

The implications could be significant in combating the spread of health problems such as obesity, diabetes and heart disease.

There are two promising areas of research. First, they are looking at ways to slow digestion,

One thing they might look into is work by scientists at Britain’s Institute of Food Research (IFR), who said last month they had found an unexpected synergy that helped break down fat and might lead to new ways of slowing digestion, and ultimately to creating foods that made consumers feel fuller.

“Much of the fat in processed foods is eaten in the form of emulsions such as soups, yoghurt, ice cream and mayonnaise,” said the IFR’s Peter Wilde. “We are unpicking the mechanisms of digestion used to break them down so we can design fats in a rational way that are digested more slowly.”

The idea is that if digestion is slower, the final section of the intestine called the ileum will be put on its “ileal brake,” sending a signal to the consumer that means they feel full even though they have eaten less fat.

This sounds harmless, and it’s even possible it’s a good idea, but then replacing diseased tissue with healthy tissue, as they tried with Parkinson’s disease, seemed like a good idea too. Just how well is the digestive process understood?

As for the second promising area of research,

Experts see promise in another nano technique which involves encapsulating nutrients in bubble-like structures known as vesicles that can be engineered to break down and release their contents at specific stages in the digestive system.

According to Vic Morris, a nano expert at the IFR, this technique in a larger form, micro-encapsulation, was well established in the food industry. The major difference with nano-encapsulation was that the smaller size might be able to take nutrients further or deliver them to more appropriate places. [emphasis mine]

They’ve been talking about trying to encapsulate and target medicines to more appropriate places for years and, as far as I’m aware, to no avail. I sense a little overenthusiasm on the experts’ part. Kelland does try to counterbalance this by discussing other issues with nanofood, such as secretiveness about the food companies’ research, experts’ concerns over nanoparticles, and public concerns over genetically modified food. Still, the allure of ‘all you can eat with no consequences’ is likely to overshadow any journalist’s attempt at balanced reporting, with resulting disappointment when somebody realizes it’s all much more complicated than we thought.

Dexter Johnson’s Sept. 22, 2010 posting (Protein-based Nanotubes Pass Electrical Signals Between Cells) on his Nanoclast blog offers more proof that we still have a lot to learn about basic biological processes,

A few years back, scientists led by Hans-Hermann Gerdes at the University of Bergen noticed that there were nanoscale tubes connecting cells sometimes over significant distances. This discovery launched a field known somewhat by the term in the biological community as the “nanotube field.”

Microbiologists remained somewhat skeptical on what this phenomenon was and weren’t entirely pleased with some explanations offered because they seemed to fall outside “existing biological concepts.”

So let’s start summing up. The team notices nanotubes that connect cells over distances, which microbiologists have difficulty accepting as “they [seem] to fall outside existing biological concepts.” [emphasis mine] Now the team has published a paper which suggests that electrical signals pass through the nanotubes and that a ‘gap junction’ enables transmission to nonadjacent cells. (Dexter’s description provides more technical detail in an accessible writing style.)

As Dexter notes,

Another key biological question it helps address–or complicate, as the case may be–is the complexity of the human brain. This research makes the brain drastically more complex than originally thought, according to Gerdes. [emphasis mine]

Getting back to where I started, scientists are people too. They have their enthusiasms as well as pressure to get grants and produce results for governments and other investors, not to mention their own egos. And while I’ve focused on the biological and medical sciences in this article, I think that all the sciences yield more questions than answers and that everything is far more complicated and interconnected than we have yet realized.

ISO nanomaterials definition

There’s a new definition for nanomaterials from the International Organization for Standardization (ISO). From the news item on Nanowerk,

ISO has therefore published a new technical report, ISO/TR 11360:2010, Nanotechnologies – Methodology for the classification and categorization of nanomaterials, offering a comprehensive, globally harmonized methodology for classifying nanomaterials.

ISO/TR 11360 introduces a system called the “nano-tree”, which places nanotechnology concepts into a logical context by indicating relationships among them as a branching out tree. The most basic and common elements are defined as the main trunk of the tree, and nanomaterials are then differentiated in terms of structure, chemical nature and other properties.

“The document provides users with a structured view of nanotechnology, and facilitates a common understanding of its concepts,” says Peter Hatto, Chair of the committee that developed the standard (ISO/TC 229). “It offers a systematic approach and a commonsensical hierarchy”.

The full technical report, ISO/TR 11360:2010, will cost you 112 Swiss francs or, roughly, $112.90 CAD.
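Since the report itself sits behind that paywall, here is a toy Python sketch of what a branching ‘nano-tree’ might look like. The nano-object labels are borrowed from the 2008 ISO vocabulary mentioned below; the rest are illustrative stand-ins, not ISO’s actual categories.

```python
# A toy "nano-tree": the most general concepts sit near the trunk and
# nanomaterials are differentiated along the branches by structure.
# Labels are illustrative; the real ISO/TR 11360 taxonomy is in the paid report.
nano_tree = {
    "nanomaterial": {
        "nano-object": {                # one or more external dimensions
            "nanoparticle": {},         # at the nanoscale (cf. ISO/TS 27687)
            "nanofibre": {},
            "nanoplate": {},
        },
        "nanostructured material": {    # nanoscale internal/surface structure
            "nanocomposite": {},
            "nanoporous material": {},
        },
    }
}

def print_tree(tree, depth=0):
    """Print one indented line per concept, walking from trunk to branches."""
    for concept, branches in tree.items():
        print("  " * depth + concept)
        print_tree(branches, depth + 1)

print_tree(nano_tree)
```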

I’m not sure what the big difference is between this definition and the one I posted about on Oct. 24, 2008, but I suspect the difference lies in the classification level, i.e., the 2008 definition (ISO/TS 27687:2008, titled Nanotechnologies — Terminology and definitions for nano-objects — Nanoparticle, nanofibre and nanoplate) laid the groundwork for this more specific nanomaterials definition.

ETA Aug. 21, 2010: Dexter Johnson at Nanoclast has posted about the new ISO definition and the impact this may have on commercialization of nanomaterials. Go here to read more.

Realism strikes nanotechnology market and employment forecasts

There’s been a new kind of market forecast for nanotechnology kicking around lately. Instead of predicting market values in the trillions, the prediction is in the billions. There’s an item on Nanowerk about this new report,

It therefore is quite refreshing to finally see a market report titled “Nanotechnology: A Realistic Market Assessment” that estimates the worldwide sales revenues for nanotechnology to be $26 billion – yes, that’s illion with a b, not a tr – in 2015.

According to this report, the largest nanotechnology segments in 2009 were nanomaterials, with sales reaching $9 billion in 2009. This is expected to grow to more than $19 billion in 2015. Sales of nanotools, meanwhile, will experience high growth. From a total market revenue of $2.6 billion in 2009, the nanotools segment will increase at a 3.3% CAGR to reach a value of $6,812.5 million in 2015.

These numbers seem more realistic given the commentaries and critiques I’ve seen from more knowledgeable business analysts than me. (There’s more about the report and links to it and other related articles at Nanowerk.)
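In case the acronym is unfamiliar, CAGR (compound annual growth rate) is simply the steady year-over-year growth rate that carries a starting value to an ending value over n years; the standard definition is:

```latex
\mathrm{CAGR} = \left( \frac{V_{\text{end}}}{V_{\text{start}}} \right)^{1/n} - 1
\qquad \Longleftrightarrow \qquad
V_{\text{end}} = V_{\text{start}}\,(1 + \mathrm{CAGR})^{n}
```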

On the same track, I came across an August 10, 2010 posting by Dexter Johnson (Nanoclast) on employment figures for the ‘nanotechnology industry’. From the posting (Nanotech Employment Numbers Remain Inscrutable),

On the one hand, you have the ever-optimistic viewpoint of Mihail C. Roco, a senior adviser for nanotechnology at NSF [National Science Foundation], who helped develop the numbers back in 2000 that estimated that by 2015 2 million workers worldwide, and 800,000 in the US, would be needed to support nanotechnology manufacturing. According to Roco, we’re still on target with estimates that in 2008 there were 160,000 workers in nanotechnology, representing a 25% increase between 2000 and 2008. If that same percentage increase is applied to the years from 2008 to 2015, then you would get 800,000 by 2015 in Roco’s estimates.

As satisfying as it may be to be dead-on accurate with one’s projections, one cannot help be reminded of Upton Sinclair’s quote “It is difficult to get a man to understand something when his job depends on not understanding it.” If you are given the task of predicting the unpredictable you have to stick to the methodology even when it hardly makes sense.
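As an aside, Roco’s figures only line up if his ‘25% increase’ is read as 25 percent per year, compounded, rather than a one-time jump; that is my reading of the numbers, not a method Roco spells out, but the arithmetic is easy to check:

```python
# Compound the 2008 US nanotech workforce estimate forward at 25% per year.
workers_2008 = 160_000
annual_growth = 0.25           # 25% per year (my reading of Roco's figure)
years = 2015 - 2008            # seven years of compounding

workers_2015 = workers_2008 * (1 + annual_growth) ** years
print(round(workers_2015))    # 762939, i.e. roughly the 800,000 projection
```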

Dexter is providing commentary on an article by Ann M. Thayer in Chemical and Engineering News, Filling Nanotech Jobs. In the wake of the US National Nanotechnology Initiative’s (NNI) 10th anniversary this year, Thayer unpacks some of the numbers and projections about nanotechnology’s economic impacts. It is sobering. From the article,

Ten years down the road, and with 2015 just over the horizon, it’s clear that the hype has died down and investment momentum has slowed. Although U.S. government nanotech spending under NNI has totaled nearly $12 billion, according to market research firm Lux Research, the recession has further blunted demand for nanomaterials, slowed technology adoption, and reduced its market projections. Many small firms have closed their doors, and some state nanotech initiatives have stalled.

Beyond the likely effect of the economic downturn on employment, efforts to train a nanotech workforce face other uncertainties. The technology has moved into products and manufacturing, but it is still early in its commercial development path. And while it evolves, it must compete for government and investor attention from newer emerging technologies.

Much of the article focuses on educational efforts to support what was intended as a newly emerging and vibrant nanotechnology field. From Thayer’s article,

Reviews of NNI by the President’s Council of Advisors on Science & Technology and others have recommended improving coordination around education and workforce issues. Often near the top of the list is a call for increased participation by the Departments of Labor and Education, agencies new to NNI in 2006, to provide input and help strengthen efforts.

“This should be the next major step,” Roco agrees. “NSF has created a spectrum of methods and models in education, and now these need to be implemented at a larger scale.” He and others in government are counting on the Commerce Department to help assess industry needs and point universities in the right direction.

But the path forward is unclear, in part because the funding environment is in flux. For example, funding that jump-started some of the early nanotech centers, such as NCLT [National Center for Learning & Teaching], has ended, and the centers must recompete or find other ways to sustain their operations.

Education, like any business, responds to market needs. Murday [James S. Murday, associate director in the University of Southern California’s Office of Research Advancement] supposes that nanoscience education could mirror the materials science field, which came together under government investment in the 1960s. “It’s sort of an existence proof in the past 50 years that you don’t have to be bound by the old disciplines,” Murday says. Instead of getting hung up on what nanotech is or isn’t, “maybe we ought to focus on what we really want, which is new products and figuring out how to design our educational system to make the fastest progress,” he suggests. [emphasis mine]

‘Designing an educational system to make the fastest progress’, as per Murday, reeks of the Industrial Revolution. After all, the reason for near-universal literacy was that industry, in the name of progress, needed better-educated workers. But that’s a side issue.

What this whole discussion brings up is a question of strategy. The easiest comparison for me to make is between the US and Canada. As I’ve noted before (my Aug. 2, 2010 posting), the US has poured a lot of money, time, and energy into a very focused nanotechnology strategy, e.g. the NNI, whereas in Canada the nanotechnology effort has largely been rolled into pre-existing programs.

At this point, it’s impossible to say if there’s a clear-cut right or wrong strategy. As Dexter points out, the people who made and continue to make the projections and decide strategy have a vested interest in being proved right.

Nanomaterial use in construction, in coatings, in site remediation, and on invisible planes

Next to the biomedical and electronics industries, the construction industry is expected to be the sector most affected by nanotechnology, according to a study in the American Chemical Society journal ACS Nano. From the news item on Azonano,

Pedro Alvarez and colleagues note that nanomaterials likely will have a greater impact on the construction industry than any other sector of the economy, except biomedical and electronics applications. Certain nanomaterials can improve the strength of concrete, serve as self-cleaning and self-sanitizing coatings, and provide many other construction benefits. Concerns exist, however, about the potential adverse health and environmental effects of construction nanomaterials.

The scientists analyzed more than 140 studies on the benefits and risks of nanomaterials. …

The article in ACS Nano is titled, “Nanomaterials in the Construction Industry: A Review of Their Applications and Environmental Health and Safety Considerations.”

Still on the construction theme but this time more focused on site remediation, here’s a story about sulfur-rich drywall, which corrodes pipes and wiring while possibly causing respiratory illness. From the news item on Nanowerk,

A nanomaterial originally developed to fight toxic waste is now helping reduce debilitating fumes in homes with corrosive drywall.

Developed by Kenneth Klabunde of Kansas State University, and improved over three decades with support from the National Science Foundation, the FAST-ACT material has been a tool of first responders since 2003.

Now, NanoScale Corporation of Manhattan, Kansas–the company Klabunde co-founded to market the technology–has incorporated FAST-ACT into a cartridge that breaks down the corrosive drywall chemicals.

Homeowners have reported that the chemicals–particularly sulfur compounds such as hydrogen sulfide and sulfur dioxide–have caused respiratory illnesses, wiring corrosion and pipe damage in thousands of U.S. homes with sulfur-rich, imported drywall.

“It is devastating to see what has happened to so many homeowners because of the corrosive drywall problem, but I am glad the technology is available to help,” said Klabunde. “We’ve now adapted the technology we developed through years of research for FAST-ACT for new uses by homeowners, contractors and remediators.”

The company has already tested its new product and found that corrosion was reduced and odor levels dropped to almost imperceptible levels. There are plans to use the company’s technology in the Gulf Coast and in other places where there are airborne toxic substances.

In Europe, Germany has plans to introduce new concrete paving slabs that reduce the quantity of nitrogen oxide in the air. From the news item on Nanowerk,

In Germany, ambient air quality is not always as good as it might be – data from the federal environment ministry makes this all too clear. In 2009, the amounts of toxic nitrogen oxide in the atmosphere exceeded the maximum permitted levels at no fewer than 55 percent of air monitoring stations in urban areas. The ministry reports that road traffic is one of the primary sources of these emissions.

In light of this fact, the Baroque city of Fulda is currently embarking on new ways to combat air pollution. Special paving slabs that will clean the air are to be laid the length of Petersberger Strasse, where recorded pollution levels topped the annual mean limit of 40 micrograms per cubic meter (µg/m3) last year. These paving slabs are coated with titanium dioxide (TiO2), which converts harmful substances such as nitrogen oxides into nitrates. Titanium dioxide is a photocatalyst; it uses sunlight to accelerate a naturally occurring chemical reaction, the speed of which changes with exposure to light.

They’ve already had success with this approach in Italy, but Germany has fewer hours of sunshine and lower light intensities, so the product had to be optimized and tested for German conditions. Testing has shown that the optimized paving slabs’ effect does not wear off quickly (it was tested again at 14 months and at 23 months). Finally, there don’t seem to be any environmentally unpleasant consequences. If you’re curious about the details, do click on the link.
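For the chemically curious, the overall scheme usually cited for photocatalytic air cleaning on titanium dioxide (a textbook summary, not a detail taken from the Fulda project) is a stepwise, light-driven oxidation of nitrogen oxides to nitrate, which rain then rinses off the pavement:

```latex
\mathrm{NO}
\xrightarrow{\ \mathrm{TiO_2},\ h\nu,\ \mathrm{O_2}\ }
\mathrm{NO_2}
\xrightarrow{\ \mathrm{TiO_2},\ h\nu\ }
\mathrm{NO_3^{-}}\ \text{(nitrate)}
```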

One last item; this time it’s a nano-enabled coating that’s a paint. An Israeli company has developed a paint for airplanes that can make them effectively invisible to radar. From Dexter Johnson’s July 14, 2010 posting on Nanoclast,

No, we’re not talking about a Wonder Woman-type of invisible plane, but rather one that becomes very difficult to detect with radar.

The Israel-based Ynetnews is reporting that an Israeli company called Nanoflight has successfully run a test on dummy missiles that were painted with the nano-enabled coating and have shown that radar could not pick them up as missiles.

The YnetNews article rather brutally points out that painting an aircraft with this nanocoating is far cheaper than buying a $5 billion US-made stealth aircraft. Of course, it should also be noted that one sale of a $5 billion aircraft employs a large number of aeronautical engineers, and the high price tag also makes it far more difficult for others to purchase the technology and possess the ability to sneak up on an enemy as well.

You can read more and see a picture of Wonder Woman’s invisible plane by following the link to Dexter’s posting.