The Irish mix up some graphene

There was a lot of excitement (one might almost call it giddiness) earlier this week about a new technique from Irish researchers for producing graphene. From an April 20, 2014 article by Jacob Aron for New Scientist (Note: A link has been removed),

First, pour some graphite powder into a blender. Add water and dishwashing liquid, and mix at high speed. Congratulations, you just made the wonder material graphene.

This surprisingly simple recipe is now the easiest way to mass-produce pure graphene – sheets of carbon just one atom thick. The material has been predicted to revolutionise the electronics industry, based on its unusual electrical and thermal properties. But until now, manufacturing high-quality graphene in large quantities has proved difficult – the best lab techniques manage less than half a gram per hour.

“There are companies producing graphene at much higher rates, but the quality is not exceptional,” says Jonathan Coleman of Trinity College Dublin in Ireland.

Coleman’s team was contracted by Thomas Swan, a chemicals firm based in Consett, UK, to come up with something better. From previous work they knew that it is possible to shear graphene from graphite, the form of carbon found in pencil lead. Graphite is essentially made from sheets of graphene stacked together like a deck of cards, and sliding it in the right way can separate the layers.

Rachel Courtland chimes in with her April 21, 2014 post for the Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) (Note: A link has been removed),

The first graphene was made by pulling layers off of graphite using Scotch tape. Now, in keeping with the low-tech origins of the material, a team at Trinity College Dublin has found that it should be possible to make large quantities of the stuff by mixing up some graphite and stabilizing detergent with a blender.

The graphene produced in this manner isn’t anything like the wafer-scale sheets of single-layer graphene that are being grown by Samsung, IBM and others for high-performance electronics. Instead, the blender-made variety consists of small flakes that are exfoliated off of bits of graphite and then separated out by centrifuge. But small-scale graphene has its place, the researchers say. …

An April 22, 2014 news release from CRANN (the Centre for Research on Adaptive Nanostructures and Nanodevices) at Trinity College Dublin (also on Nanowerk as an April 20, 2014 news item) provides more details about the new technique and about the private/public partnership behind it,

Research team led by Prof Jonathan Coleman discovers new research method to produce large volumes of high quality graphene.

Researchers in AMBER, the Science Foundation Ireland funded materials science centre headquartered at CRANN, Trinity College Dublin have, for the first time, developed a new method of producing industrial quantities of high quality graphene. …

The discovery will change the way many consumer and industrial products are manufactured. The materials will have a multitude of potential applications including advanced food packaging; high strength plastics; foldable touch screens for mobile phones and laptops; super-protective coatings for wind turbines and ships; faster broadband and batteries with dramatically higher capacity than anything available today.

Thomas Swan Ltd. has worked with the AMBER research team for two years and has signed a license agreement to scale up production and make the high quality graphene available to industry globally. The company has already announced two new products as a result of the research discovery (Elicarb®Graphene Powder and Elicarb® Graphene Dispersion).

Until now, researchers have been unable to produce graphene of high quality in large enough quantities. The subject of ongoing international research, the research undertaken by AMBER is the first to perfect a large-scale production of pristine graphene materials and has been highlighted by the highly prestigious Nature Materials publication as a global breakthrough. Professor Coleman and his team used a simple method for transforming flakes of graphite into defect-free graphene using commercially available tools, such as high-shear mixers. They demonstrated that not only could graphene-containing liquids be produced in standard lab-scale quantities of a few hundred millilitres, but the process could be scaled up to produce hundreds of litres and beyond.

Minister for Research and Innovation Sean Sherlock, TD commented; “Professor Coleman’s discovery shows that Ireland has won the worldwide race on the production of this ‘miracle material’. This is something that USA, China, Australia, UK, Germany and other leading nations have all been striving for and have not yet achieved. This announcement shows how the Irish Government’s strategy of focusing investment in science with impact, as well as encouraging industry and academic collaboration, is working.”

Here’s a link to and a citation for the researchers’ paper,

Scalable production of large quantities of defect-free few-layer graphene by shear exfoliation in liquids by Keith R. Paton, Eswaraiah Varrla, Claudia Backes, Ronan J. Smith, Umar Khan, Arlene O’Neill, Conor Boland, Mustafa Lotya, Oana M. Istrate, Paul King, Tom Higgins, Sebastian Barwich, Peter May, Pawel Puczkarski, Iftikhar Ahmed, Matthias Moebius, Henrik Pettersson, Edmund Long, João Coelho, Sean E. O’Brien, Eva K. McGuire, Beatriz Mendoza Sanchez, Georg S. Duesberg, Niall McEvoy, Timothy J. Pennycook, et al. Nature Materials (2014) doi:10.1038/nmat3944 Published online 20 April 2014

This article is mostly behind a paywall but there is a free preview available through ReadCube Access.

For anyone who’s curious about AMBER, here’s more from the About Us page on the CRANN website (Note: A link has been removed),

In October 2013, a new Science Foundation Ireland funded research centre, AMBER (Advanced Materials and BioEngineering Research) was launched. AMBER is jointly hosted in TCD [Trinity College Dublin] by CRANN and the Trinity Centre for Bioengineering, and works in collaboration with the Royal College of Surgeons in Ireland and UCC. The centre provides a partnership between leading researchers in materials science and industry and will deliver internationally leading research that will be industrially and clinically informed with outputs including new discoveries and devices in ICT, medical device and industrial technology sectors.

Finally, Thomas Swan Ltd. can be found here.

Wilson Center hosts ‘Environmental Information: The Roles of Experts and the Public’ on April 29, 2014

Here’s a description of the Wilson Center event, Environmental Information: The Roles of Experts and the Public,

Access to environmental information and use of it for environmental decision making are central pillars of environmental democracy. Yet, not much attention is paid to the question of who is producing it, and for whom? By examining the history of environmental information, since NEPA in 1969, three eras can be identified: information produced by experts, for experts (1969-1992); information produced by experts, to be shared by experts and the public (1992-2011); and finally, information produced by experts and the public to be shared by experts and the public.

Underlying these are changes in access to information, rise in levels of education and rapid change due to digital technologies. The three eras and their implication to environmental decision making will be explored, with special attention to the role of geographical information and geographical information systems and to citizen science.  [emphasis mine]

Tuesday, April 29th from 10:00 – 11:30 am [EST].

I hope the speaker description and the paper being distributed on the event page mean this may be a bit more interesting to those of us curious about citizen science than is immediately apparent from the event description,

Muki (Mordechai) Haklay

Muki Haklay is a Professor of Geographic Information Science in the Department of Civil, Environmental and Geomatic Engineering, University College London.  He is also the Director of the UCL Extreme Citizen Science group, which is dedicated to allowing any community, regardless of their literacy, to use scientific methods and tools to collect, analyze and interpret and use information about their area and activities.

His research interests include Public access and use of Environmental Information; Human-Computer Interaction (HCI) and Usability Engineering aspects of GIS; and Societal aspects of GIS use – in particular, participatory mapping and Citizen Science.

Here’s the paper,

Citizen Science and Volunteered Geographic Information – overview and typology of participation

You can RSVP from the event page if you’re planning to attend this event in Washington, DC in person; alternatively, you can watch a livestream webcast by returning to the event page on April 29, 2014 at 10 am (that will be 7 am if you’re on the West Coast).

Move over laser—the graphene/carbon nanotube spaser is here, on your t-shirt

This graphene/carbon nanotube research comes from Australia, according to an April 16, 2014 news item on Nanowerk,

A team of researchers from Monash University’s [Australia] Department of Electrical and Computer Systems Engineering (ECSE) has modelled the world’s first spaser …

An April 16, 2014 Monash University news release, which originated the news item, describes the spaser and its relationship to lasers,

A new version of “spaser” technology being investigated could mean that mobile phones become so small, efficient, and flexible they could be printed on clothing.

A spaser is effectively a nanoscale laser or nanolaser. It emits a beam of light through the vibration of free electrons, rather than the space-consuming electromagnetic wave emission process of a traditional laser.

The news release also provides more details about the graphene/carbon nanotube spaser research and the possibility of turning t-shirts into telephones,

PhD student and lead researcher Chanaka Rupasinghe said the modelled spaser design using carbon would offer many advantages.

“Other spasers designed to date are made of gold or silver nanoparticles and semiconductor quantum dots while our device would be comprised of a graphene resonator and a carbon nanotube gain element,” Chanaka said.

“The use of carbon means our spaser would be more robust and flexible, would operate at high temperatures, and be eco-friendly.

“Because of these properties, there is the possibility that in the future an extremely thin mobile phone could be printed on clothing.”

Spaser-based devices can be used as an alternative to current transistor-based devices such as microprocessors, memory, and displays to overcome current miniaturising and bandwidth limitations.

The researchers chose to develop the spaser using graphene and carbon nanotubes. They are more than a hundred times stronger than steel and can conduct heat and electricity much better than copper. They can also withstand high temperatures.

Their research showed for the first time that graphene and carbon nanotubes can interact and transfer energy to each other through light. These optical interactions are very fast and energy-efficient, and so are suitable for applications such as computer chips.

“Graphene and carbon nanotubes can be used in applications where you need strong, lightweight, conducting, and thermally stable materials due to their outstanding mechanical, electrical and optical properties. They have been tested as nanoscale antennas, electric conductors and waveguides,” Chanaka said.

Chanaka said a spaser generated high-intensity electric fields concentrated into a nanoscale space. These are much stronger than those generated by illuminating metal nanoparticles by a laser in applications such as cancer therapy.

“Scientists have already found ways to guide nanoparticles close to cancer cells. We can move graphene and carbon nanotubes following those techniques and use the high concentrate fields generated through the spasing phenomena to destroy individual cancer cells without harming the healthy cells in the body,” Chanaka said.

Here’s a link to and a citation for the paper,

Spaser Made of Graphene and Carbon Nanotubes by Chanaka Rupasinghe, Ivan D. Rukhlenko, and Malin Premaratne. ACS Nano, 2014, 8 (3), pp 2431–2438. DOI: 10.1021/nn406015d Publication Date (Web): February 23, 2014
Copyright © 2014 American Chemical Society

This paper is behind a paywall.

Chiral breathing at the Institute of Physical Chemistry of the Polish Academy of Sciences (IPC PAS)

An April 17, 2014 news item on ScienceDaily highlights some research about a polymer that has some special properties,

Electrically controlled glasses with continuously adjustable transparency, new polarisation filters, and even chemosensors capable of detecting single molecules of specific chemicals could be fabricated thanks to a new polymer unprecedentedly combining optical and electrical properties.

An international team of chemists from Italy, Germany, and Poland developed a polymer with unique optical and electric properties. Components of this polymer change their spatial configuration depending on the electric potential applied. In turn, the polarisation of transmitted light is affected. The material can be used, for instance, in polarisation filters and window glasses with continuously adjustable transparency. Due to its mechanical properties, the polymer is also perfectly suitable for fabrication of chemical sensors for selective detection and determination of optically active (chiral) forms of an analyte.

The research findings of the international team headed by Prof. Francesco Sannicolo from the Universita degli Studi di Milano were recently published in Angewandte Chemie International Edition.

“Until now, to give polymers chiral properties, chiral pendants were attached to the polymer backbone. In such designs the polymer was used as a scaffold only. Our polymer is exceptional, with chirality inherent to it, and with no pending groups. The polymer is both a scaffold and an optically active chiral structure. Moreover, the polymer conducts electricity,” comments Prof. Włodzimierz Kutner from the Institute of Physical Chemistry of the Polish Academy of Sciences (IPC PAS) in Warsaw, one of the initiators of the research.

An April 17, 2014 IPC PAS news release (also on EurekAlert), which originated the news item, describes chirality and the breathing metaphor with regard to this new polymer,

Chirality can be best explained by referring to mirror reflection. If two varieties of the same object look like their mutual mirror images, they differ in chirality. Human hands provide perhaps the most universal example of chirality, and the difference between the left and right hand becomes obvious if we try to place a left-handed glove on a right hand. The same difference as between the left and right hand is between two chiral molecules with identical chemical composition. Each of them shows different optical properties, and differently rotates plane-polarised light. In such a case, chemists refer to one chemical compound existing as two optical isomers called enantiomers.
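The hand-and-mirror analogy can be put in numerical terms with a small sketch (my own illustration, not from the paper): the handedness of three substituents arranged around a centre can be summarized by the sign of a triple product, and reflecting the coordinates flips that sign, just as a left-handed glove becomes right-handed in a mirror.

```python
import numpy as np

def handedness(center, a, b, c):
    """Sign (+1 or -1) of the signed volume spanned by the three
    bond vectors running from `center` to substituents a, b and c."""
    va, vb, vc = (np.asarray(p, float) - np.asarray(center, float)
                  for p in (a, b, c))
    return int(np.sign(np.dot(va, np.cross(vb, vc))))

# A right-handed arrangement of three substituents around a centre...
original = handedness([0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1])

# ...and its mirror image (reflect x -> -x): the sign flips,
# the numerical counterpart of a pair of enantiomers.
mirror = handedness([0, 0, 0], [-1, 0, 0], [0, 1, 0], [0, 0, 1])

print(original, mirror)  # opposite signs
```

Identical composition, opposite sign: that sign difference is what the chemists mean by two enantiomers of one compound.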

The polymer presented by Prof. Sannicolo’s team was developed on the basis of thiophene, an organic compound composed of a five-member aromatic ring containing a sulphur atom. Thiophene polymerisation gives rise to a chemically stable polymer of high conductivity. The basic component of the new polymer – its monomer – is made of a dimer with two halves, each made of two thiophene rings and one thianaphthene unit. The halves are connected at a single point and can be partially rotated with respect to each other by applying an electric potential. Depending on the orientation of the halves, the new polymer either assumes or loses chirality. This behaviour is fully reversible and resembles a breathing system, where the “chiral breathing” is controlled by an external electric potential.

The development of a new polymer was initiated thanks to the research on molecular imprinting pursued at the Institute of Physical Chemistry of the PAS. The research resulted, for instance, in the development of polymers used as recognising units (receptors) in chemosensors, capable of selective capturing of molecules of various analytes, for instance nicotine, and also melamine, an ill-reputed chemical detrimental to human health, used as an additive to falsify protein content in milk and dairy products produced in China.

Generally, molecular imprinting consists in creating template-shaped cavities in polymer matrices with molecules of interest used first as cavity templates. Subsequently these templates are washed out from the polymer. As a result, the polymer contains traps with a shape and size matching those of molecules of the removed template. To be used as a receptor in chemosensor to recognize analyte molecules similar to templates or templates themselves, the polymer imprinted with these cavities must show a sufficient mechanical strength.

“Three-dimensional networks we attempted to build at the IPC PAS using existing two-dimensional thiophene derivatives just collapsed after the template molecules were removed. That’s why we asked for assistance our Italian partners, specialising in the synthesis of thiophene derivatives. The problem was to design and synthesise a three-dimensional thiophene derivative that would allow us for cross-linking of our polymers in three dimensions. The thiophene derivative synthesised in Milan has a stable three-dimensional structure, and the controllable chiral properties of the new polymer obtained after the derivative was polymerised, turned out a nice surprise for all of us”, explains Prof. Kutner.

Here’s a link to and a citation for the paper,

Potential-Driven Chirality Manifestations and Impressive Enantioselectivity by Inherently Chiral Electroactive Organic Films by Prof. Francesco Sannicolò, Serena Arnaboldi, Prof. Tiziana Benincori, Dr. Valentina Bonometti, Dr. Roberto Cirilli, Prof. Lothar Dunsch, Prof. Włodzimierz Kutner, Prof. Giovanna Longhi, Prof. Patrizia R. Mussini, Dr. Monica Panigati, Prof. Marco Pierini, and Dr. Simona Rizzo. Angewandte Chemie International Edition Volume 53, Issue 10, pages 2623–2627, March 3, 2014. Article first published online: 5 FEB 2014. DOI: 10.1002/anie.201309585

© 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

This article is behind a paywall.

State-of-the-art biotech and nanophotonics equipment at Québec’s Institut national de la recherche scientifique (INRS)

The Canada Foundation for Innovation (a federal government funding agency) has awarded two infrastructure grants to Québec’s Institut national de la recherche scientifique (INRS), or more specifically their Énergie Matériaux Télécommunications Research Centre, according to an April 18, 2014 news item on Azonano,

Professor Marc André Gauthier and Professor Luca Razzari of the Énergie Matériaux Télécommunications Research Centre have each been awarded large grants from the John R. Evans Leaders Fund of the Canada Foundation for Innovation (CFI) for the acquisition of state-of-the-art biotech and nanophotonics equipment.

To this funding will be added matching grants from the Ministère de l’Enseignement supérieur, de la Recherche, de la Science et de la Technologie (MESRST). These new laboratories will help us develop new approaches for improving health and information technologies, train the next generation of highly qualified high-tech workers, and transfer technology and expertise to local startups.

An April 17, 2014 INRS news release by Gisèle Bolduc, which originated the news item (a French version is available for those who prefer it), provides more details,

Bio-hybrid materials

Professor Gauthier’s new Laboratory of Bio-Hybrid Materials (LBM) will enable him to tackle the numerous challenges of designing these functional materials and make it possible for the biomedical and biotech sectors to take full advantage of their new and unique properties. Professor Gauthier and his team will work on developing new bioorganic reactions involving synthetic and natural molecules and improving those that already exist. They will examine the architecture of protein-polymer grafts and develop methods for adjusting the structure and function of bio-hybrid materials in order to evaluate their therapeutic potential.

Plasmonic nanostructures and nonlinear optics

Professor Luca Razzari will use his Laboratory of Nanostructure-Assisted Spectroscopy and Nonlinear Optics (NASNO Lab) to document the properties of plasmonic nanostructures, improve nanospectroscopies and explore new photonic nanodevices. He will also develop new biosensors able to identify very small numbers of biomarkers. This may have an important impact in the early diagnosis of several diseases such as cancer and life-threatening infectious diseases. Besides this, he will investigate a new generation of nanoplasmonic devices for information and communications technology applications.

Congratulations!

Iran’s work on turmeric (curcumin) as an anti-cancer drug

It’s been a while since I’ve mentioned either Iran or curcumin (a constituent of turmeric) but an April 15, 2014 news item on Nanowerk has given me an opportunity to do both,

Nanotechnology researchers from Tarbiat Modarres University [Iran] produced a new drug capable of detecting and removing cancer cells using turmeric …

The compound is made of curcumin found in the extract of turmeric, and has desirable physical and chemical stability and prevents the proliferation of cancer cells.

An April 16, 2014 Iran Nanotechnology Initiative Council (INIC) news release, which despite its date appears to have originated the news item, fills in details about the research,

In this drug, curcumin with high efficiency (approximately 87%) was loaded in the polymeric nanocarrier, and it created a spherical structure with the size of 140 nm. The drug has high physical and chemical stability. The drug was used successfully in laboratory conditions in the treatment of a type of aggressive tumor in the central nervous system, called glioblastoma (GBM).

The interesting point is that the fatal effect of nanocurcumin on mature stem cells derived from marrow and natural cells of skin fibroblast is observed at a concentration higher than a concentration that is effective on cancer cells. In other words, no fatal effect on natural cells is observed at concentrations that are fatal to cancer cells. It shows that curcumin prefers to enter cancer cells.

The size range of the nanocarrier used in this research is 15-100 nm. Physical and chemical stability, non-toxicity, and biodegradability are among the main characteristics of the nanocarriers. Based on the results, the nanocarrier used in this research has no toxic effect on cells. In other words, all the death in the cells is caused by curcumin, and dendrosome only results in bioavailability and transference of the drug into the cells.

“The drug has the potential to affect a number of message delivery paths in the cells, one of which is cell proliferation path. Therefore, the drug prefers to enter cancer cells rather than various types of natural cells,” the researchers said.

Here’s a link to and a citation for the paper,

Dendrosomal curcumin nanoformulation downregulates pluripotency genes via miR-145 activation in U87MG glioblastoma cells by Maryam Tahmasebi Mirgani, Benedetta Isacchi, Majid Sadeghizadeh, Fabio Marra, Anna Rita Bilia, Seyed Javad Mowla, Farhood Najafi, & Esmael Babaei. International Journal of Nanomedicine, vol. 9, issue 1, January 2014, pp. 403-417. DOI: http://dx.doi.org/10.2147/IJN.S48136

This is an open access paper.

I last wrote about turmeric or more specifically curcumin in a December 25, 2011 posting about research at UCLA (University of California at Los Angeles).

Earth Day, Water Day, and every day

I’m blaming my confusion on the American Chemical Society (ACS), which seemed to be celebrating Earth Day on April 15, 2014 as per its news release highlighting their “Chemists Celebrate Earth Day” video series, while in Vancouver, Canada, we’re celebrating it on April 26, 2014, and elsewhere it seems to be on April 20 this year. Regardless, here’s more about how chemists are celebrating, from the ACS news release,

Water is arguably the most important resource on the planet. In celebration of Earth Day, the American Chemical Society (ACS) is showcasing three scientists whose research keeps water safe, clean and available for future generations. Geared toward elementary and middle school students, the “Chemists Celebrate Earth Day” series highlights the important work that chemists and chemical engineers do every day. The videos are available at http://bit.ly/CCED2014.

The series focuses on the following subjects:

  • Transforming Tech Toys – Featuring Aydogan Ozcan, Ph.D., of UCLA: Ozcan takes everyday gadgets and turns them into powerful mobile laboratories. He’s made a cell phone into a blood analyzer and a bacteria detector, and now he’s built a device that turns a cell phone into a water tester. It can detect very harmful mercury even at very low levels.
  • All About Droughts – Featuring Collins Balcombe of the U.S. Bureau of Reclamation: Balcombe’s job is to keep your drinking water safe and to find new ways to re-use the water that we flush away every day so that it doesn’t go to waste, especially in areas that don’t get much rain.
  • Cleaning Up Our Water – Featuring Anne Morrissey, Ph.D., of Dublin City University: We all take medicines, but did you know that sometimes the medicine doesn’t stay in our bodies? It’s up to Anne Morrissey to figure out how to get potentially harmful pharmaceuticals out of the water supply, and she’s doing it using one of the most plentiful things on the planet: sunlight.

Sadly, I missed marking World Water Day which according to a March 21, 2014 news release I received was being celebrated on Saturday, March 22, 2014 with worldwide events and the release of a new UN report,

World Water Day: UN Stresses Water and Energy Issues 

Tokyo Leads Public Celebrations Around the World

Tokyo — March 21 — The deep-rooted relationships between water and energy were highlighted today during main global celebrations in Tokyo marking the United Nations’ annual World Water Day.

“Water and energy are among the world’s most pre-eminent challenges. This year’s focus of World Water Day brings these issues to the attention of the world,” said Michel Jarraud, Secretary-General of the World Meteorological Organization and Chair of UN-Water, which coordinates World Water Day and freshwater-related efforts UN system-wide.

The UN predicts that by 2030 the global population will need 35% more food, 40% more water and 50% more energy. Already today 768 million people lack access to improved water sources, 2.5 billion people have no improved sanitation and 1.3 billion people cannot access electricity.

“These issues need urgent attention – both now and in the post-2015 development discussions. The situation is unacceptable. It is often the same people who lack access to water and sanitation who also lack access to energy,” said Mr. Jarraud.

The 2014 World Water Development Report (WWDR) – a UN-Water flagship report, produced and coordinated by the World Water Assessment Programme, which is hosted and led by UNESCO – is released on World Water Day as an authoritative status report on global freshwater resources. It highlights the need for policies and regulatory frameworks that recognize and integrate approaches to water and energy priorities.

WWDR, a triennial report from 2003 to 2012, this year becomes an annual edition, responding to the international community’s expression of interest in a concise, evidence-based and yearly publication with a specific thematic focus and recommendations.

WWDR 2014 underlines how water-related issues and choices impact energy and vice versa. For example: drought diminishes energy production, while lack of access to electricity limits irrigation possibilities.

The report notes that roughly 75% of all industrial water withdrawals are used for energy production. Tariffs also illustrate this interdependence: if water is subsidized to sell below cost (as is often the case), energy producers – major water consumers – are less likely to conserve it.  Energy subsidies, in turn, drive up water usage.

The report stresses the imperative of coordinating political governance and ensuring that water and energy prices reflect real costs and environmental impacts.

“Energy and water are at the top of the global development agenda,” said the Rector of United Nations University, David Malone, this year’s coordinator of World Water Day on behalf of UN-Water together with the United Nations Industrial Development Organization (UNIDO).

“Significant policy gaps exist in this nexus at present, and the UN plays an instrumental role in providing evidence and policy-relevant guidance. Through this day, we seek to inform decision-makers, stakeholders and practitioners about the interlinkages, potential synergies and trade-offs, and highlight the need for appropriate responses and regulatory frameworks that account for both water and energy priorities. From UNU’s perspective, it is essential that we stimulate more debate and interactive dialogue around possible solutions to our energy and water challenges.”

UNIDO Director-General LI Yong emphasized the importance of water and energy for inclusive and sustainable industrial development.

“There is a strong call today for integrating the economic dimension, and the role of industry and manufacturing in particular, into the global post-2015 development priorities. Experience shows that environmentally sound interventions in manufacturing industries can be highly effective and can significantly reduce environmental degradation. I am convinced that inclusive and sustainable industrial development will be a key driver for the successful integration of the economic, social and environmental dimensions,” said Mr. LI.

Rather unusually, Michael Berger recently published two Nanowerk Spotlight articles about water (is there a theme, anyone?) within 24 hours of each other. In his March 26, 2014 Spotlight article, Michael Berger focuses on graphene and water remediation (Note: Links have been removed),

The unique properties of nanomaterials are beneficial in applications to remove pollutants from the environment. The extremely small size of nanomaterial particles creates a large surface area in relation to their volume, which makes them highly reactive, compared to non-nano forms of the same materials.

The potential impact areas for nanotechnology in water applications are divided into three categories: treatment and remediation; sensing and detection; and pollution prevention (read more: “Nanotechnology and water treatment”).

Silver, iron, gold, titanium oxides and iron oxides are some of the commonly used nanoscale metals and metal oxides cited by the researchers that can be used in environmental remediation (read more: “Overview of nanomaterials for cleaning up the environment”).

A more recent entrant into this nanomaterial arsenal is graphene. Individual graphene sheets and their functionalized derivatives have been used to remove metal ions and organic pollutants from water. These graphene-based nanomaterials show quite high adsorption performance as adsorbents. However they also cause additional cost because the removal of these adsorbent materials after usage is difficult and there is the risk of secondary environmental pollution unless the nanomaterials are collected completely after usage.

One solution to this problem would be the assembly of individual sheets into three-dimensional (3D) macroscopic structures which would preserve the unique properties of individual graphene sheets, and offer easy collecting and recycling after water remediation.
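As context for what “high adsorption performance” means in practice (my own illustration, not drawn from the Spotlight article), adsorbent capacity for a pollutant is commonly summarized with a Langmuir isotherm, q = q_max·K·C / (1 + K·C); the capacity and affinity numbers below are hypothetical.

```python
def langmuir_uptake(c, q_max=250.0, k=0.05):
    """Equilibrium uptake q (mg pollutant per g adsorbent) at solution
    concentration c (mg/L), from the Langmuir isotherm.
    q_max (mg/g) and k (L/mg) are hypothetical values chosen only to
    illustrate the shape of the curve."""
    return q_max * k * c / (1.0 + k * c)

# Uptake rises with concentration and saturates near q_max:
for c in (1.0, 10.0, 100.0, 1000.0):
    print(f"{c:7.1f} mg/L -> {langmuir_uptake(c):6.1f} mg/g")
```

Fitting measured uptake data to this curve is one standard way researchers compare adsorbents such as the graphene materials described above.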

The March 27, 2014 Nanowerk Spotlight article was written by someone at Alberta’s (Canada) Ingenuity Lab and focuses on their ‘nanobiological’ approach to water remediation (Note: Links have been removed),

At Ingenuity Lab in Edmonton, Alberta, Dr. Carlo Montemagno and a team of world-class researchers have been investigating plausible solutions to existing water purification challenges. They are building on Dr. Montemagno’s earlier patented discoveries by using a naturally-existing water channel protein as the functional unit in water purification membranes [4].

Aquaporins are water-transport proteins that play an important osmoregulation role in living organisms [5]. These proteins boast exceptionally high water permeability (~10^10 water molecules/s), high selectivity for pure water molecules, and a low energy cost, which make aquaporin-embedded membranes well suited as an alternative to conventional RO membranes.
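That permeability figure (roughly 10^10 water molecules per second per channel) can be converted into a volumetric flow using the molar volume of water and Avogadro's number. This back-of-envelope conversion is mine, not the paper's:

```python
AVOGADRO = 6.022e23            # molecules per mole
MOLAR_VOLUME_WATER = 18.0e-6   # m^3 per mole (18 mL/mol)

molecules_per_second = 1e10    # quoted aquaporin permeability, per channel

# volume of a single water molecule, then flow through one channel
vol_per_molecule = MOLAR_VOLUME_WATER / AVOGADRO       # ~3e-29 m^3
flow_m3_per_s = molecules_per_second * vol_per_molecule

# express in femtolitres per second (1 fL = 1e-18 m^3 * 1e3 L/m^3)
print(f"per-channel flow: {flow_m3_per_s * 1e3 * 1e15:.2f} fL/s")
```

About 0.3 femtolitres per second sounds tiny, but a membrane packs enormous numbers of channels per square centimetre, which is where the practical throughput comes from.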

Unlike synthetic polymeric membranes, which are driven by the high pressure-induced diffusion of water through size selective pores, this technology utilizes the biological osmosis mechanism to control the flow of water in cellular systems at low energy. In nature, the direction of osmotic water flow is determined by the osmotic pressure difference between compartments, i.e. water flows toward higher osmotic pressure compartment (salty solution or contaminated water). This direction can however be reversed by applying a pressure to the salty solution (i.e., RO).

The principle of RO is based on the semipermeable characteristics of the separating membrane, which allows the transport of only water molecules depending on the direction of osmotic gradient. Therefore, as envisioned in the recent publication (“Recent Progress in Advanced Nanobiological Materials for Energy and Environmental Applications”), the core of Ingenuity Lab’s approach is to control the direction of water flow through aquaporin channels with a minimum level of pressure and to use aquaporin-embedded biomimetic membranes as an alternative to conventional RO membranes.
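For a sense of the pressures RO must overcome, the van 't Hoff relation π = iMRT gives a rough estimate of osmotic pressure. The sketch below uses textbook constants and an assumed seawater salinity; none of these numbers come from the Ingenuity Lab publication:

```python
R = 0.083145  # gas constant in L·bar/(mol·K)

def osmotic_pressure_bar(molarity, ions_per_formula=2, temp_k=298.0):
    """van 't Hoff estimate pi = i*M*R*T (ideal, dilute-solution limit).

    ions_per_formula=2 for NaCl, which dissociates into Na+ and Cl-.
    """
    return ions_per_formula * molarity * R * temp_k

seawater = osmotic_pressure_bar(0.6)  # assumed ~0.6 M NaCl for seawater
brackish = osmotic_pressure_bar(0.1)  # assumed brackish water

print(f"seawater: ~{seawater:.0f} bar must be exceeded to reverse osmosis")
print(f"brackish: ~{brackish:.0f} bar")
```

Roughly 30 bar just to break even against seawater (real RO plants apply more) is why a low-pressure aquaporin-based membrane would be such an attractive alternative.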

Here’s a link to and a citation for Montemagno’s and his colleague’s paper,

Recent Progress in Advanced Nanobiological Materials for Energy and Environmental Applications by Hyo-Jick Choi and Carlo D. Montemagno. Materials 2013, 6(12), 5821-5856; doi:10.3390/ma6125821

This paper is open access.

Returning to where I started, here’s a water video featuring graphene from the ACS celebration of Earth Day 2014,

Happy Earth Day!

Roadmap to neuromorphic engineering (digital and analog) for the creation of artificial brains *from the Georgia (US) Institute of Technology

While I didn’t mention neuromorphic engineering in my April 16, 2014 posting, which focused on the more general aspect of nanotechnology in Transcendence, a movie starring Johnny Depp and opening on April 18, that specialty (neuromorphic engineering) is what makes the events in the movie ‘possible’ (assuming very large stretches of imagination taking us into the realm of implausibility and beyond). From the IMDB.com plot synopsis for Transcendence,

Dr. Will Caster (Johnny Depp) is the foremost researcher in the field of Artificial Intelligence, working to create a sentient machine that combines the collective intelligence of everything ever known with the full range of human emotions. His highly controversial experiments have made him famous, but they have also made him the prime target of anti-technology extremists who will do whatever it takes to stop him. However, in their attempt to destroy Will, they inadvertently become the catalyst for him to succeed to be a participant in his own transcendence. For his wife Evelyn (Rebecca Hall) and best friend Max Waters (Paul Bettany), both fellow researchers, the question is not if they canbut [sic] if they should. Their worst fears are realized as Will’s thirst for knowledge evolves into a seemingly omnipresent quest for power, to what end is unknown. The only thing that is becoming terrifyingly clear is there may be no way to stop him.

In the film, Caster’s intelligence/consciousness is uploaded to the computer, which suggests the computer has human brainlike qualities and abilities. The effort to make computer or artificial intelligence more humanlike is called neuromorphic engineering and, according to an April 17, 2014 news item on phys.org, researchers at the Georgia Institute of Technology (Georgia Tech) have published a roadmap for this pursuit,

In the field of neuromorphic engineering, researchers study computing techniques that could someday mimic human cognition. Electrical engineers at the Georgia Institute of Technology recently published a “roadmap” that details innovative analog-based techniques that could make it possible to build a practical neuromorphic computer.

A core technological hurdle in this field involves the electrical power requirements of computing hardware. Although a human brain functions on a mere 20 watts of electrical energy, a digital computer that could approximate human cognitive abilities would require tens of thousands of integrated circuits (chips) and a hundred thousand watts of electricity or more – levels that exceed practical limits.

The Georgia Tech roadmap proposes a solution based on analog computing techniques, which require far less electrical power than traditional digital computing. The more efficient analog approach would help solve the daunting cooling and cost problems that presently make digital neuromorphic hardware systems impractical.

“To simulate the human brain, the eventual goal would be large-scale neuromorphic systems that could offer a great deal of computational power, robustness and performance,” said Jennifer Hasler, a professor in the Georgia Tech School of Electrical and Computer Engineering (ECE), who is a pioneer in using analog techniques for neuromorphic computing. “A configurable analog-digital system can be expected to have a power efficiency improvement of up to 10,000 times compared to an all-digital system.”

An April 16, 2014 Georgia Tech news release by Rick Robinson, which originated the news item, describes why Hasler wants to combine analog (based on biological principles) and digital computing approaches to the creation of artificial brains,

Unlike digital computing, in which computers can address many different applications by processing different software programs, analog circuits have traditionally been hard-wired to address a single application. For example, cell phones use energy-efficient analog circuits for a number of specific functions, including capturing the user’s voice, amplifying incoming voice signals, and controlling battery power.

Because analog devices do not have to process binary codes as digital computers do, their performance can be both faster and much less power hungry. Yet traditional analog circuits are limited because they’re built for a specific application, such as processing signals or controlling power. They don’t have the flexibility of digital devices that can process software, and they’re vulnerable to signal disturbance issues, or noise.

In recent years, Hasler has developed a new approach to analog computing, in which silicon-based analog integrated circuits take over many of the functions now performed by familiar digital integrated circuits. These analog chips can be quickly reconfigured to provide a range of processing capabilities, in a manner that resembles conventional digital techniques in some ways.

Over the last several years, Hasler and her research group have developed devices called field programmable analog arrays (FPAA). Like field programmable gate arrays (FPGA), which are digital integrated circuits that are ubiquitous in modern computing, the FPAA can be reconfigured after it’s manufactured – hence the phrase “field-programmable.”

Hasler and Marr’s 29-page paper traces a development process that could lead to the goal of reproducing human-brain complexity. The researchers investigate in detail a number of intermediate steps that would build on one another, helping researchers advance the technology sequentially.

For example, the researchers discuss ways to scale energy efficiency, performance and size in order to eventually achieve large-scale neuromorphic systems. The authors also address how the implementation and the application space of neuromorphic systems can be expected to evolve over time.

“A major concept here is that we have to first build smaller systems capable of a simple representation of one layer of human brain cortex,” Hasler said. “When that system has been successfully demonstrated, we can then replicate it in ways that increase its complexity and performance.”

Among neuromorphic computing’s major hurdles are the communication issues involved in networking integrated circuits in ways that could replicate human cognition. In their paper, Hasler and Marr emphasize local interconnectivity to reduce complexity. Moreover, they argue it’s possible to achieve these capabilities via purely silicon-based techniques, without relying on novel devices that are based on other approaches.

Commenting on the recent publication, Alice C. Parker, a professor of electrical engineering at the University of Southern California, said, “Professor Hasler’s technology roadmap is the first deep analysis of the prospects for large scale neuromorphic intelligent systems, clearly providing practical guidance for such systems, with a nearer-term perspective than our whole-brain emulation predictions. Her expertise in analog circuits, technology and device models positions her to provide this unique perspective on neuromorphic circuits.”

Eugenio Culurciello, an associate professor of biomedical engineering at Purdue University, commented, “I find this paper to be a very accurate description of the field of neuromorphic data processing systems. Hasler’s devices provide some of the best performance per unit power I have ever seen and are surely on the roadmap for one of the major technologies of the future.”

Said Hasler: “In this study, we conclude that useful neural computation machines based on biological principles – and potentially at the size of the human brain – seems technically within our grasp. We think that it’s more a question of gathering the right research teams and finding the funding for research and development than of any insurmountable technical barriers.”

Here’s a link to and a citation for the roadmap,

Finding a roadmap to achieve large neuromorphic hardware systems by Jennifer Hasler and Bo Marr.  Front. Neurosci. (Frontiers in Neuroscience), 10 September 2013 | doi: 10.3389/fnins.2013.00118

This is an open access article (at least, the HTML version is).

I have looked at Hasler’s roadmap and it provides a good and readable overview (even for an amateur like me; note: you do need some tolerance for ‘not knowing’) of the state of neuromorphic engineering’s problems, and suggestions for overcoming them. Here’s a description of a human brain and its power requirements as compared to a computer’s (from the roadmap),

One of the amazing things about the human brain is its ability to perform tasks beyond current supercomputers using roughly 20 W of average power, a level smaller than most individual computer microprocessor chips. A single neuron emulation can tax a high performance processor; given there are 10^12 neurons operating at 20 W, each neuron consumes 20 pW average power. Assuming a neuron is conservatively performing the wordspotting computation (1000 synapses), 100,000 PMAC (PMAC = “Peta” MAC = 10^15 MAC/s) would be required to duplicate the neural structure. A higher computational efficiency due to active dendritic line channels is expected as well as additional computation due to learning. The efficiency of a single neuron would be 5000 PMAC/W (or 5 TMAC/μW). A similar efficiency for 10^11 neurons and 10,000 synapses is expected.

Building neuromorphic hardware requires that technology must scale from current levels given constraints of power, area, and cost: all issues typical in industrial and defense applications; if hardware technology does not scale as other available technologies, as well as takes advantage of the capabilities of IC technology that are currently visible, it will not be successful.
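The roadmap’s headline figures are internally consistent: 20 W spread across 10^12 neurons is 20 pW per neuron, and 100,000 PMAC delivered for 20 W works out to 5000 PMAC/W, i.e., 5 TMAC/µW. A two-line sanity check of that arithmetic (my own, using only the numbers quoted above):

```python
TOTAL_POWER_W = 20.0        # whole-brain average power, per the roadmap
NUM_NEURONS = 1e12          # neuron count used in the quoted passage
TOTAL_COMPUTE = 1e5 * 1e15  # 100,000 PMAC, expressed in MAC/s

power_per_neuron_w = TOTAL_POWER_W / NUM_NEURONS      # 2e-11 W = 20 pW
efficiency_mac_per_w = TOTAL_COMPUTE / TOTAL_POWER_W  # MAC/s per watt

print(f"per-neuron power: {power_per_neuron_w * 1e12:.0f} pW")
print(f"brain efficiency: {efficiency_mac_per_w / 1e15:.0f} PMAC/W")
print(f"                = {efficiency_mac_per_w / 1e18:.0f} TMAC/uW")
```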

One of my main areas of interest is the memristor (a nanoscale ‘device/circuit element’ which emulates synaptic plasticity), which was mentioned in a way that allows me to understand how the device fits (or doesn’t fit) into the overall conceptual framework (from the roadmap),

The density for a 10 nm EEPROM device acting as a synapse begs the question of whether other nanotechnologies can improve on the resulting Si [silicon] synapse density. One transistor per synapse is hard to beat by any approach, particularly in scaled down Si (like 10 nm), when the synapse memory, computation, and update is contained within the EEPROM device. Most nano device technologies [i.e., memristors (Snider et al., 2011)] show considerable difficulties to get to two-dimensional arrays at a similar density level. Recently, a team from U. of Michigan announced the first functioning memristor two-dimensional (30 × 30) array built on a CMOS chip in 2012 (Kim et al., 2012), claiming applications in neuromorphic engineering; the same group has published innovative devices for digital (Jo and Lu, 2009) and analog applications (Jo et al., 2011).

I notice that the reference to the University of Michigan is relatively neutral in tone and that the memristor does not figure substantively in Hasler’s roadmap.

Intriguingly, there is a section on commercialization; I didn’t think the research was at that stage yet (from the roadmap),

Although one can discuss how to build a cortical computer on the size of mammals and humans, the question is how will the technology developed for these large systems impact commercial development. The cost for ICs [integrated circuits or chips] alone for cortex would be approximately $20 M in current prices, which although possible for large users, would not be common to be found in individual households. Throughout the digital processor approach, commercial market opportunities have driven the progress in the field. Getting neuromorphic technology integrated into commercial environment allows us to ride this powerful economic “engine” rather than pull.

In most applications, the important commercial issues include minimization of cost, time to market, just sufficient performance for the application, power consumed, size and weight. The cost of a system built from ICs is, at a macro-level, a function of the area of those ICs, which then affects the number of ICs needed system wide, the number of components used, and the board space used. Efficiency of design tools, testing time and programming time also considerably affect system costs. Time to get an application to market is affected by the ability to reuse or quickly modify existing designs, and is reduced for a new application if existing hardware can be reconfigured, adapting to changing specifications, and a designer can utilize tools that allow rapid modifications to the design. Performance is key for any algorithm, but for a particular product, one only needs a solution to that particular problem; spending time to make the solution elegant is often a losing strategy.

The neuromorphic community has seen some early entries into commercial spaces, but we are just at the very beginning of the process. As the knowledge of neuromorphic engineering has progressed, which have included knowledge of sensor interfaces and analog signal processing, there have been those who have risen to the opportunities to commercialize these technologies. Neuromorphic research led to better understanding of sensory processing, particularly sensory systems interacting with other humans, enabling companies like Synaptics (touch pads), Foveon (CMOS color imagers), and Sonic Innovation (analog–digital hearing aids); Gilder provides a useful history of these two companies elsewhere (Gilder, 2005). From the early progress in analog signal processing we see companies like GTronix (acquired by National Semiconductor, then acquired by Texas Instruments) applying the impact of custom analog signal processing techniques and programmability toward auditory signal processing that improved sound quality requiring ultra-low power levels. Further, we see in companies like Audience there is some success from mapping the computational flow of the early stage auditory system, and implementing part of the event based auditory front-end to achieve useful results for improved voice quality. But the opportunities for the neuromorphic community are just beginning, and directly related to understanding the computational capabilities of these items. The availability of ICs that have these capabilities, whether or not one mentions they have any neuromorphic material, will further drive applications.

One expects that part of a cortex processing system would have significant computational possibilities, as well as cortex structures from smaller animals, and still be able to reach price points for commercial applications. In the following discussion, we will consider the potential of cortical structures at different levels of commercial applications. Figure 24 shows one typical block diagram, algorithms at each stage, resulting power efficiency (say based on current technology), as well as potential applications of the approach. In all cases, we will be considering a single die solution, typical for a commercial product, and will minimize the resulting communication power to I/O off the chip (no power consumed due to external memories or digital processing devices). We will assume a net computational efficiency of 10 TMAC/mW, corresponding to a lower power supply (i.e., mostly 500 mV, but not 180 mV) and slightly larger load capacitances; we make these assumptions as conservative pull back from possible applications, although we expect the more aggressive targets would be reachable. We assume the external power consumed is set by 1 event/second/neuron average event-rate off chip to a nearby IC. Given the input event rate is hard to predict, we don’t include that power requirement but assume it is handled by the input system. In all of these cases, getting the required computation using only digital techniques in a competitive size, weight, and especially power is hard to foresee.

We expect progress in these neuromorphic systems and that should find applications in traditional signal processing and graphics handling approaches. We will continue to have needs in computing that outpace our available computing resources, particularly at a power consumption required for a particular application. For example, the recent emphasis on cloud computing for academic/research problems shows the incredible need for larger computing resources than those directly available, or even projected to be available, for a portable computing platform (i.e., robotics). Of course a server per computing device is not a computing model that scales well. Given scaling limits on computing, both in power, area, and communication, one can expect to see more and more of these issues going forward.

We expect that a range of different ICs and systems will be built, all at different targets in the market. There are options for even larger networks, or integrating these systems with other processing elements on a chip/board. When moving to larger systems, particularly ones with 10–300 chips (3 × 10^7 to 10^9 neurons) or more, one can see utilization of stacking of dies, both decreasing the communication capacitance as well as board complexity. Stacking dies should roughly increase the final chip cost by the number of dies stacked.

In the following subsections, we overview general guidelines to consider when considering using neuromorphic ICs in the commercial market, first for low-cost consumer electronics, and second for a larger neuromorphic processor IC.

I have a casual observation to make. While the authors of the roadmap came to this conclusion, “This study concludes that useful neural computation machines based on biological principles at the size of the human brain seems technically within our grasp,” they’re also leaving themselves some wiggle room because the truth is no one knows if copying a human brain with circuits and various devices will lead to ‘thinking’ as we understand the concept.

For anyone who’s interested, you can search this blog for neuromorphic engineering, artificial brains, and/or memristors as I have many postings on these topics. One of my most recent on the topic of artificial brains is an April 7, 2014 piece titled: Brain-on-a-chip 2014 survey/overview.

One last observation about the movie ‘Transcendence’, has no one else noticed that it’s the ‘Easter’ story with a resurrected and digitized ‘Jesus’?

* Space inserted between ‘brains’ and ‘from’ in head on April 21, 2014.

Vancouver (Canada) and a city conversation about science that could have been better

Institutional insularity is a problem one finds everywhere. Interestingly, very few people see it that way, due in large part to self-reinforcing feedback loops. Take universities, for example, and more specifically, Simon Fraser University’s April 17, 2014 City Conversation (in Vancouver, Canada) featuring Dr. Arvind Gupta (as of July 2014, president of the University of British Columbia) in a presentation titled: Creativity! Connection! Innovation!

Contrary to the hope I expressed in my April 14, 2014 post about the then upcoming event, this was largely an exercise in self-reference. Predictably, given the flyer used to advertise the event (the text was reproduced in its entirety in my April 14, 2014 posting), over 90% of the audiences (Vancouver, Burnaby, and Surrey campuses) were associated with one university or another. Adding to the overwhelmingly ‘insider’ feel of this event, the speaker brought with him two students who had benefited from the organization he currently leads, Mitacs, a Canadian not-for-profit organization that offers funding for internships and fellowships at Canadian universities and formerly a mathematics NCE (Networks of Centres of Excellence, a Canadian federal government program).

Despite the fact that this was billed as a ‘city conversation’, the talk focused largely on universities and their role in efforts to make Canada more productive and the wonderfulness of Mitacs. Unfortunately, what I wanted to hear and talk about was how Gupta, the students, and audience members saw the role of universities in cities, with a special reference to science.

It was less ‘city’ conversation and more ‘let’s focus on ourselves and our issues’ conversation. Mitacs, Canada’s productivity, and discussion about universities and innovation are of little inherent interest to anyone outside a select group of policy wonks (i.e., government and academe).

The conversation was self-referential until the very end. In the last minutes Gupta mentioned cities and science in the context of how cities in other parts of the world are actively supporting science. (For more about this interest elsewhere, you might find this Oct. 21, 2010 posting which features an article by Richard Van Noorden titled, Cities: Building the best cities for science; Which urban regions produce the best research — and can their success be replicated? as illuminating as I did.)

I wish Gupta had started with the last topic he introduced because Vancouverites have a lot of interest in science. In the last two years, TRIUMF, Canada’s national laboratory for particle and nuclear physics, has held a number of events at Science World and elsewhere which have been fully booked with waiting lists. The Peter Wall Institute for Advanced Studies has also held numerous science-themed events which routinely have waiting lists despite being held in one of Vancouver’s largest theatre venues.

If universities really want to invite outsiders into their environs and have city conversations, they need to follow through on the promise (e.g., talking about cities and science in a series titled “City Conversations”), as well as do a better job of publicizing their events, encouraging people to enter their sacred portals, and addressing their ‘outsider’ audiences.

By the way, I have a few hints for the student speakers,

  • don’t scold your audience (you may find Canadians’ use of space shocking but please keep your indignation and sense of superiority to yourself)
  • before you start lecturing (at length) about the importance of interdisciplinary work, you might want to assess your audience’s understanding, otherwise you may find yourself preaching to the choir and/or losing your audience’s attention
  • before you start complaining that there’s no longer a mandatory retirement age and suggesting that this is the reason you can’t get a university job, you may want to consider a few things: (1) your audience’s average age (in this case, I’d estimate it was at least 50 and consequently not likely to be as sympathetic as you might like); (2) the people who work past mandatory retirement may need the money, or are you suggesting your needs are inherently more important?; (3) whether or not a few people stay on past their ‘retirement’ age has less to do with your university job prospects than demographics, and that’s a numbers game (not sure why I’d have to point that out to someone associated with a mathematics organization such as Mitacs)

I expect no one has spoken or will speak to the organizers, Gupta, or the students other than to give them compliments. In fact, it’s unlikely there will be any real critique of having this presentation as part of a series titled “City Conversations” and that brings this posting back to institutional insularity. This problem is everywhere not just in universities and I’m increasingly interested in approaches to mitigating the tendency. If there’s anyone out there who knows of any examples where insularity has been tackled, please do leave a comment and, if possible, links.

Isis Innovation (University of Oxford, UK) spins out buckyball company, Designer Carbon Materials

Buckyballs are also known as Buckminsterfullerenes. The name is derived from Buckminster Fuller, who designed something he called geodesic domes. From the Wikipedia entry (Note: Links have been removed),

Buckminsterfullerene (or bucky-ball) is a spherical fullerene molecule with the formula C60 [C = carbon; 60 is the number of carbon atoms in the molecule]. It has a cage-like fused-ring structure (truncated icosahedron) which resembles a soccer ball, made of twenty hexagons and twelve pentagons, with a carbon atom at each vertex of each polygon and a bond along each polygon edge.

It was first generated in 1985 by Harold Kroto, James R. Heath, Sean O’Brien, Robert Curl, and Richard Smalley at Rice University.[2] Kroto, Curl and Smalley were awarded the 1996 Nobel Prize in Chemistry for their roles in the discovery of buckminsterfullerene and the related class of molecules, the fullerenes. The name is a reference to Buckminster Fuller, as C60 resembles his trademark geodesic domes. Buckminsterfullerene is the most commonly naturally occurring fullerene molecule, as it can be found in small quantities in soot.[3][4] Solid and gaseous forms of the molecule have been detected in deep space.[5]
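The counts in that description hang together: 12 pentagons and 20 hexagons give exactly 60 vertices (carbon atoms) and 90 edges (bonds), and the totals satisfy Euler’s polyhedron formula V − E + F = 2. A quick check (my own, derived from the quoted geometry):

```python
pentagons, hexagons = 12, 20
faces = pentagons + hexagons  # 32 faces on the truncated icosahedron

# each edge is shared by exactly 2 faces; each vertex by exactly 3 faces
edges = (5 * pentagons + 6 * hexagons) // 2     # 90 carbon-carbon bonds
vertices = (5 * pentagons + 6 * hexagons) // 3  # 60 carbon atoms

print(f"{vertices} atoms, {edges} bonds, {faces} faces")
assert vertices - edges + faces == 2  # Euler's formula for convex polyhedra
```

The 60 vertices are where the C60 formula comes from; the same bookkeeping works for any fullerene cage, which always has exactly 12 pentagons.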

Here’s a model of a buckyball,

Courtesy: Isis Innovation (Oxford University)

An April 15, 2014 University of Oxford (Isis Innovation) news release (h/t phys.org) describes the new research and some technical details while avoiding any mention of how they’ve tackled the production problems (a major issue, which has seriously constrained the commercial use of these materials),

The firm, Designer Carbon Materials, has been established by Isis Innovation, the University of Oxford’s technology commercialisation company, and will cost-effectively manufacture commercially useful quantities of the spherical carbon cage structures. Designer Carbon Materials is based on research from Dr Kyriakos Porfyrakis of Oxford University’s Department of Materials.

‘It is possible to insert a variety of useful atoms or atomic clusters into the hollow interior of these ball-like molecules, giving them new and intriguing abilities. Designer Carbon Materials will focus on the production of these value-added materials for a range of applications,’ said Dr Porfyrakis.

‘For instance, fullerenes are currently used as electron acceptors in polymer-based solar cells achieving some of the highest power conversion efficiencies known for these kinds of solar cells. Our endohedral fullerenes are even better electron-acceptors and therefore have the potential to lead to efficiencies exceeding 10 per cent.

‘The materials could also be developed as superior MRI contrast agents for medical imaging and as diagnostics for Alzheimer’s and Parkinson’s, as they are able to detect the presence of superoxide free radical molecules which may cause these conditions. We are receiving fantastic interest from organisations developing these applications, who until now have been unable to access useful quantities of these materials.’

The manufacturing process, patented by Isis Innovation, will continue to be developed by Designer Carbon Materials as it also makes its first sales of these extremely high-value materials.

Tom Hockaday, managing director of Isis Innovation, said: ‘This is a great example of an Isis spin-out which is both looking at exciting future applications for its technology and also answering a real market need. There is already significant demand for these nanomaterials and we expect the first customer orders will be fulfilled over the next few months.’

Investment in the company has been led by Oxford Technology Management and the Oxford Invention Fund. Lucius Carey from Oxford Technology Management said: ‘We are delighted to be investing in Designer Carbon Materials. The purposes of the investment will be to move into commercial premises and to scale up.’

Isis Innovation is a University of Oxford initiative and you can find out more about Isis Innovation here. As for the new spin-out company, Designer Carbon Materials, they have no website that I’ve been able to find but there is this webpage on the Isis Innovation website.