Tag Archives: European Union

Carbon capture with ‘diamonds from the sky’

Before launching into the latest on a new technique for carbon capture, it might be useful to provide some context. Arthur Neslen’s March 23, 2015 opinion piece outlines the issues and notes that one Norwegian Prime Minister resigned when coalition government partners attempted to build gas power plants without carbon capture and storage (CCS) facilities. Note: A link has been removed,

At least 10 European power plants were supposed to begin piping their carbon emissions into underground tombs this year, rather than letting them twirl into the sky. None has done so.

Missed deadlines, squandered opportunities, spiralling costs and green protests have plagued the development of carbon capture and storage (CCS) technology since Statoil proposed the concept more than two decades ago.

But in the face of desperate global warming projections the CCS dream still unites Canadian tar sands rollers with the UN’s Intergovernmental Panel on Climate Change (IPCC), and Shell with some environmentalists.

With 2bn people in the developing world expected to hook up to the world’s dirty energy system by 2050, CCS holds out the tantalising prospect of fossil-led growth that does not fry the planet.


“With CCS in the mix, we can decarbonise in a cost-effective manner and still continue to produce, to some extent, our fossil fuels,” Tim Bertels, Shell’s Glocal CCS portfolio manager told the Guardian. “You don’t need to divest in fossil fuels, you need to decarbonise them.”

The technology has been gifted “a very significant fraction” of the billions of dollars earmarked by Shell for clean energy research, he added. But the firm is also a vocal supporter of public funding for CCS from carbon markets, as are almost all players in the industry.

Enthusiasm for this plan is not universal (from Neslen’s opinion piece),

Many environmentalists see the idea as a non-starter because it locks high emitting power plants into future energy systems, and obstructs funding for the cheaper renewables revolution already underway. “CCS is completely irrelevant,” said Jeremy Rifkin, a noted author and climate adviser to several governments. “I don’t even think about it. It’s not going to happen. It’s not commercially available and it won’t be commercially viable.”

I recommend reading Neslen’s piece for anyone who’s not already well versed on the issues. He uses Norway as a case study and sums up the overall CCS political situation this way,

In many ways, the debate over carbon capture and storage is a struggle between two competing visions of the societal transformation needed to avert climate disaster. One vision represents the enlightened self-interest of a contributor to the problem. The other cannot succeed without eliminating its highly entrenched opponent. The battle is keenly fought by technological optimists on both sides. But if Norway’s fractious CCS experience is any indicator, it will be decided on the ground by the grimmest of realities.

On that note of urgency, here’s some research on carbon dioxide (CO2) or, more specifically, carbon capture and utilization technology, from an Aug. 19, 2015 news item on Nanowerk,

Finding a technology to shift carbon dioxide (CO2), the most abundant anthropogenic greenhouse gas, from a climate change problem to a valuable commodity has long been a dream of many scientists and government officials. Now, a team of chemists says they have developed a technology to economically convert atmospheric CO2 directly into highly valued carbon nanofibers for industrial and consumer products.

An Aug. 19, 2015 American Chemical Society (ACS) news release (also on EurekAlert), which originated the news item, expands on the theme,

The team will present brand-new research on this new CO2 capture and utilization technology at the 250th National Meeting & Exposition of the American Chemical Society (ACS). ACS is the world’s largest scientific society. The national meeting, which takes place here through Thursday, features more than 9,000 presentations on a wide range of science topics.

“We have found a way to use atmospheric CO2 to produce high-yield carbon nanofibers,” says Stuart Licht, Ph.D., who leads a research team at George Washington University. “Such nanofibers are used to make strong carbon composites, such as those used in the Boeing Dreamliner, as well as in high-end sports equipment, wind turbine blades and a host of other products.”

The researchers had previously reported making fertilizer and cement without emitting CO2. Now, the team, which includes postdoctoral fellow Jiawen Ren, Ph.D., and graduate student Jessica Stuart, says their research could shift CO2 from a global-warming problem to a feedstock for the manufacture of in-demand carbon nanofibers.

Licht calls his approach “diamonds from the sky.” That refers to carbon being the material that diamonds are made of, and also hints at the high value of the products, such as the carbon nanofibers that can be made from atmospheric carbon and oxygen.

Because of its efficiency, this low-energy process can be run using only a few volts of electricity, sunlight and a whole lot of carbon dioxide. At its root, the system uses electrolytic syntheses to make the nanofibers. CO2 is broken down in a high-temperature electrolytic bath of molten carbonates at 1,380 degrees F (750 degrees C). Atmospheric air is added to an electrolytic cell. Once there, the CO2 dissolves when subjected to the heat and direct current through electrodes of nickel and steel. The carbon nanofibers build up on the steel electrode, where they can be removed, Licht says.
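
The description above invites a quick back-of-the-envelope check. Assuming a four-electron reduction of carbonate to solid carbon (my reading of molten-carbonate chemistry, not a detail stated in the release), Faraday’s law relates cell current to the mass of carbon deposited on the steel electrode:

```python
# Back-of-envelope sketch: Faraday's law for carbon deposited in a
# molten-carbonate electrolysis cell, assuming the four-electron
# reduction CO3^2- + 4e- -> C + 3O^2- (an assumption, not from the release).
F = 96485.0   # Faraday constant, C/mol
M_C = 12.011  # molar mass of carbon, g/mol
n_e = 4       # electrons transferred per carbon atom (assumed)

def carbon_grams_per_hour(current_amps, current_efficiency=1.0):
    """Mass of carbon (g) deposited per hour at a given cell current."""
    coulombs_per_hour = current_amps * 3600.0
    mol_carbon = current_efficiency * coulombs_per_hour / (n_e * F)
    return mol_carbon * M_C

# A hypothetical 100 A lab cell at 100% current efficiency:
print(round(carbon_grams_per_hour(100), 1))  # ~11.2 g of carbon per hour
```

At that rate, “tens of grams of nanofibers an hour” would correspond to cell currents in the low hundreds of amperes, which seems consistent with the lab-scale numbers Licht quotes below.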

To power the syntheses, heat and electricity are produced through a hybrid and extremely efficient concentrating solar-energy system. The system focuses the sun’s rays on a photovoltaic solar cell to generate electricity and on a second system to generate heat and thermal energy, which raises the temperature of the electrolytic cell.

Licht estimates electrical energy costs of this “solar thermal electrochemical process” to be around $1,000 per ton of carbon nanofiber product, which means the cost of running the system is hundreds of times less than the value of product output.
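
Licht’s dollar figure can be sanity-checked with the same electrochemical arithmetic (the voltage and electricity price below are my assumptions, not his):

```python
# Rough sanity check of the "solar thermal electrochemical process" energy
# cost, using assumed values: ~1 V cell voltage (per Licht's quote below)
# and an illustrative $0.05/kWh electricity price.
F = 96485.0   # Faraday constant, C/mol
M_C = 12.011  # molar mass of carbon, g/mol
n_e = 4       # electrons per carbon atom (assumed)

def kwh_per_ton_carbon(cell_volts):
    """Electrical energy (kWh) to reduce one metric ton of carbon."""
    mol = 1e6 / M_C                   # moles of carbon in one metric ton
    joules = cell_volts * n_e * F * mol
    return joules / 3.6e6             # J -> kWh

kwh = kwh_per_ton_carbon(1.0)         # roughly 8,900 kWh per ton at 1 V
print(round(kwh), round(kwh * 0.05))  # a few hundred dollars per ton
```

A few hundred dollars per ton of electricity at ~1 V is in the same ballpark as Licht’s ~$1,000-per-ton estimate once heat and overheads are included, so the claim is at least internally plausible.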

“We calculate that with a physical area less than 10 percent the size of the Sahara Desert, our process could remove enough CO2 to decrease atmospheric levels to those of the pre-industrial revolution within 10 years,” he says. [emphasis mine]
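
The scale of that claim can be checked with standard figures (the arithmetic below is mine, not Licht’s): roughly 2.13 gigatonnes of atmospheric carbon per ppm of CO2, and a Sahara area of about 9.2 million km²:

```python
# Scale check on the "10 percent of the Sahara in 10 years" claim,
# using standard conversion figures (my arithmetic, not Licht's).
GT_C_PER_PPM = 2.13   # gigatonnes of carbon per ppm of atmospheric CO2
SAHARA_KM2 = 9.2e6    # approximate area of the Sahara desert

ppm_drop = 400 - 280  # roughly today's level down to pre-industrial
carbon_gt = ppm_drop * GT_C_PER_PPM  # ~256 Gt of carbon to remove
per_year_gt = carbon_gt / 10         # over the claimed 10 years

area_m2 = 0.10 * SAHARA_KM2 * 1e6    # 10% of the Sahara, in m^2
kg_per_m2_per_year = per_year_gt * 1e12 / area_m2
print(round(kg_per_m2_per_year, 1))  # ~28 kg of carbon per m^2 per year
```

Tens of kilograms of carbon fiber per square metre per year is an enormous materials output, far beyond any plausible market for nanofibers, which underlines why the emphasized claim deserves scrutiny.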

At this time, the system is experimental, and Licht’s biggest challenge will be to ramp up the process and gain experience to make consistently sized nanofibers. “We are scaling up quickly,” he adds, “and soon should be in range of making tens of grams of nanofibers an hour.”

Licht explains that one advance the group has recently achieved is the ability to synthesize carbon fibers using even less energy than when the process was initially developed. “Carbon nanofiber growth can occur at less than 1 volt at 750 degrees C, which for example is much less than the 3-5 volts used in the 1,000 degree C industrial formation of aluminum,” he says.

A low-energy approach that cleans up the air by converting greenhouse gases into useful materials, and does it quickly, is incredibly exciting. Of course, there are a few questions to be asked. Are the research outcomes reproducible by other teams? Licht notes the team is scaling the technology up, but how soon could it reach industrial strength?

Sunscreen based on algae, reef fish mucus, and chitosan

The proposed sunscreen is all natural and would seem to avoid some of the environmental problems associated with other sunscreens (e.g., washing off into the ocean and polluting it). From a July 29, 2015 American Chemical Society (ACS) news release (also on EurekAlert), Note: Links have been removed,

For consumers searching for just the right sunblock this summer, the options can be overwhelming. But scientists are now turning to the natural sunscreen of algae — which is also found in fish slime — to make a novel kind of shield against the sun’s rays that could protect not only people, but also textiles and outdoor materials. …

Existing sunblock lotions typically work by either absorbing ultraviolet rays or physically blocking them. A variety of synthetic and natural compounds can accomplish this. But most commercial options have limited efficiency, pose risks to the environment and human health or are not stable. To address these shortcomings, Vincent Bulone, Susana C. M. Fernandes and colleagues looked to nature for inspiration.

The researchers used algae’s natural sunscreen molecules, which can also be found in reef fish mucus and microorganisms, and combined them with chitosan, a biopolymer from crustacean shells. Testing showed their materials were biocompatible, stood up well in heat and light, and absorbed both ultraviolet A and ultraviolet B radiation with high efficiency.

The authors acknowledge funding from the European Commission Marie Curie Intra-European Fellowship, the KTH Advanced Carbohydrate Materials Consortium (CarboMat), the Swedish Research Council for Environment, Agricultural Sciences and Spatial Planning (FORMAS) and the Basque Government Department of Education.

Here’s a link to and a citation for the paper,

Exploiting Mycosporines as Natural Molecular Sunscreens for the Fabrication of UV-Absorbing Green Material by Susana C. M. Fernandes, Ana Alonso-Varona, Teodoro Palomares, Verónica Zubillaga, Jalel Labidi, and Vincent Bulone.
ACS Appl. Mater. Interfaces, Article ASAP DOI: 10.1021/acsami.5b04064 Publication Date (Web): July 13, 2015
Copyright © 2015 American Chemical Society

This paper is behind a paywall.

Nanotechnology research protocols for Environment, Health and Safety Studies in US and a nanomedicine characterization laboratory in the European Union

I have two items relating to nanotechnology and the development of protocols. The first item concerns the launch of a new web portal by the US National Institute of Standards and Technology.

US National Institute of Standards and Technology (NIST)

From a July 1, 2015 news item on Azonano,

As engineered nanomaterials increasingly find their way into commercial products, researchers who study the potential environmental or health impacts of those materials face a growing challenge to accurately measure and characterize them. These challenges affect measurements of basic chemical and physical properties as well as toxicology assessments.

To help nano-EHS (Environment, Health and Safety) researchers navigate the often complex measurement issues, the National Institute of Standards and Technology (NIST) has launched a new website devoted to NIST-developed (or co-developed) and validated laboratory protocols for nano-EHS studies.

A July 1, 2015 NIST news release on EurekAlert, which originated the news item, offers more details about the information available through the web portal,

In common lab parlance, a “protocol” is a specific step-by-step procedure used to carry out a measurement or related activity, including all the chemicals and equipment required. Any peer-reviewed journal article reporting an experimental result has a “methods” section where the authors document their measurement protocol, but those descriptions are necessarily brief and condensed, and may lack validation of any sort. By comparison, on NIST’s new Protocols for Nano-EHS website the protocols are extraordinarily detailed. For ease of citation, they’re published individually–each with its own unique digital object identifier (DOI).

The protocols detail not only what you should do, but why and what could go wrong. The specificity is important, according to program director Debra Kaiser, because of the inherent difficulty of making reliable measurements of such small materials. “Often, if you do something seemingly trivial–use a different size pipette, for example–you get a different result. Our goal is to help people get data they can reproduce, data they can trust.”

A typical caution, for example, notes that if you’re using an instrument that measures the size of nanoparticles in a solution by how they scatter light, it’s important also to measure the transmission spectrum of the particles if they’re colored, because if they happen to absorb light strongly at the same frequency as your instrument, the result may be biased.
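
That caution amounts to a simple pre-flight check: compare the sample’s absorbance spectrum against the instrument’s laser wavelength before trusting a light-scattering size result. Here is an illustrative sketch (the function name, data, and threshold are my own, not from any NIST protocol):

```python
# Illustrative pre-flight check for the caution above (names, sample data,
# and the 0.1 absorbance threshold are hypothetical, not NIST's): flag a
# colored nanoparticle sample whose absorbance near the light-scattering
# instrument's laser line may bias the measured size.
def absorbance_warning(spectrum, laser_nm, threshold=0.1):
    """spectrum: {wavelength_nm: absorbance}. Returns True if the sample
    absorbs strongly near the laser line, meaning a transmission
    measurement (or a different laser wavelength) is advisable."""
    nearest = min(spectrum, key=lambda wl: abs(wl - laser_nm))
    return spectrum[nearest] > threshold

# Gold nanoparticles absorb strongly near 520 nm, so a 532 nm laser is
# suspect while a 633 nm laser is much less so (illustrative values).
gold_spectrum = {400: 0.05, 520: 0.90, 532: 0.85, 633: 0.08}
print(absorbance_warning(gold_spectrum, 532))  # True
print(absorbance_warning(gold_spectrum, 633))  # False
```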

“These measurements are difficult because of the small size involved,” explains Kaiser. “Very few new instruments have been developed for this. People are adapting existing instruments and methods for the job, but often those instruments are being operated close to their limits and the methods were developed for chemicals or bulk materials and not for nanomaterials.”

“For example, NIST offers a reference material for measuring the size of gold nanoparticles in solution, and we report six different sizes depending on the instrument you use. We do it that way because different instruments sense different aspects of a nanoparticle’s dimensions. An electron microscope is telling you something different than a dynamic light scattering instrument, and the researcher needs to understand that.”

The nano-EHS protocols offered by the NIST site, Kaiser says, could form the basis for consensus-based, formal test methods such as those published by ASTM and ISO.

NIST’s nano-EHS protocol site currently lists 12 different protocols in three categories: sample preparation, physico-chemical measurements and toxicological measurements. More protocols will be added as they are validated and documented. Suggestions for additional protocols are welcome at nanoprotocols@nist.gov.

The next item concerns European nanomedicine.

CEA-LETI and Europe’s first nanomedicine characterization laboratory

A July 1, 2015 news item on Nanotechnology Now describes the partnership which has led to launch of the new laboratory,

CEA-Leti today announced the launch of the European Nano-Characterisation Laboratory (EU-NCL) funded by the European Union’s Horizon 2020 research and innovation programme[1]. Its main objective is to reach a level of international excellence in nanomedicine characterisation for medical indications like cancer, diabetes, inflammatory diseases or infections, and make it accessible to all organisations developing candidate nanomedicines prior to their submission to regulatory agencies to get the approval for clinical trials and, later, marketing authorization.

“As reported in the ETPN White Paper[2], there is a lack of infrastructure to support nanotechnology-based innovation in healthcare,” said Patrick Boisseau, head of business development in nanomedicine at CEA-Leti and chairman of the European Technology Platform Nanomedicine (ETPN). “Nanocharacterisation is the first bottleneck encountered by companies developing nanotherapeutics. The EU-NCL project is of most importance for the nanomedicine community, as it will contribute to the competitiveness of nanomedicine products and tools and facilitate regulation in Europe.”

EU-NCL is partnered with the sole international reference facility, the Nanotechnology Characterization Lab of the National Cancer Institute in the U.S. (US-NCL)[3], to get faster international harmonization of analytical protocols.

“We are excited to be part of this cooperative arrangement between Europe and the U.S.,” said Scott E. McNeil, director of U.S. NCL. “We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.”

A July 2, 2015 EMPA (Swiss Federal Laboratories for Materials Science and Technology) news release on EurekAlert provides more detail about the laboratory and the partnerships,

The «European Nanomedicine Characterization Laboratory» (EU-NCL), which was launched on 1 June 2015, has a clear-cut goal: to help bring more nanomedicine candidates into the clinic and on the market, for the benefit of patients and the European pharmaceutical industry. To achieve this, EU-NCL is partnered with the sole international reference facility, the «Nanotechnology Characterization Laboratory» (US-NCL) of the US-National Cancer Institute, to get faster international harmonization of analytical protocols. EU-NCL is also closely connected to national medicine agencies and the European Medicines Agency to continuously adapt its analytical services to requests of regulators. EU-NCL is designed, organized and operated according to the highest EU regulatory and quality standards. «We are excited to be part of this cooperative project between Europe and the U.S.,» says Scott E. McNeil, director of US-NCL. «We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.»

Nine partners from eight countries

EU-NCL, which is funded by the EU for a four-year period with nearly 5 million Euros, brings together nine partners from eight countries: CEA-Tech in Leti and Liten, France, the coordinator of the project; the Joint Research Centre of the European Commission in Ispra, Italy; European Research Services GmbH in Münster Germany; Leidos Biomedical Research, Inc. in Frederick, USA; Trinity College in Dublin, Ireland; SINTEF in Oslo, Norway; the University of Liverpool in the UK; Empa, the Swiss Federal Laboratories for Materials Science and Technology in St. Gallen, Switzerland; Westfälische Wilhelms-Universität (WWU) and Gesellschaft für Bioanalytik, both in Münster, Germany. Together, the partnering institutions will provide a trans-disciplinary testing infrastructure covering a comprehensive set of preclinical characterization assays (physical, chemical, in vitro and in vivo biological testing), which will allow researchers to fully comprehend the biodistribution, metabolism, pharmacokinetics, safety profiles and immunological effects of their medicinal nano-products. The project will also foster the use and deployment of standard operating procedures (SOPs), benchmark materials and quality management for the preclinical characterization of medicinal nano-products. Yet another objective is to promote intersectoral and interdisciplinary communication among key drivers of innovation, especially between developers and regulatory agencies.

The goal: to bring safe and efficient nano-therapeutics faster to the patient

Within EU-NCL, six analytical facilities will offer transnational access to their existing analytical services for public and private developers, and will also develop new or improved analytical assays to keep EU-NCL at the cutting edge of nanomedicine characterization. A complementary set of networking activities will enable EU-NCL to deliver to European academic or industrial scientists the high-quality analytical services they require for accelerating the industrial development of their candidate nanomedicines. The Empa team of Peter Wick at the «Particles-Biology Interactions» lab will be in charge of the quality management of all analytical methods, a key task to guarantee the best possible reproducibility and comparability of the data between the various analytical labs within the consortium. «EU-NCL supports our research activities in developing innovative and safe nanomaterials for healthcare within an international network, which will actively shape future standards in nanomedicine and strengthen Empa as an enabler to facilitate the transfer of novel nanomedicines from bench to bedside», says Wick.

You can find more information about the laboratory on the Horizon 2020 (a European Union science funding programme) project page for the EU-NCL laboratory. For anyone curious about CEA-Leti, it’s a double-layered organization. CEA is France’s Commission on Atomic Energy and Alternative Energy (Commissariat à l’énergie atomique et aux énergies alternatives); you can go here to their French language site (there is an English language clickable option on the page). Leti is one of the CEA’s institutes and is known as either Leti or CEA-Leti. I have no idea what Leti stands for. Here’s the Leti website (this is the English language version).

LiquiGlide, a nanotechnology-enabled coating for food packaging and oil and gas pipelines

Getting condiments out of their bottles should be a lot easier in several European countries in the near future. A June 30, 2015 news item on Nanowerk describes the technology and the business deal (Note: A link has been removed),

The days of wasting condiments — and other products — that stick stubbornly to the sides of their bottles may be gone, thanks to MIT [Massachusetts Institute of Technology] spinout LiquiGlide, which has licensed its nonstick coating to a major consumer-goods company.

Developed in 2009 by MIT’s Kripa Varanasi and David Smith, LiquiGlide is a liquid-impregnated coating that acts as a slippery barrier between a surface and a viscous liquid. Applied inside a condiment bottle, for instance, the coating clings permanently to its sides, while allowing the condiment to glide off completely, with no residue.

In 2012, amidst a flurry of media attention following LiquiGlide’s entry in MIT’s $100K Entrepreneurship Competition, Smith and Varanasi founded the startup — with help from the Institute — to commercialize the coating.

Today [June 30, 2015], Norwegian consumer-goods producer Orkla has signed a licensing agreement to use LiquiGlide’s coating for mayonnaise products sold in Germany, Scandinavia, and several other European nations. This comes on the heels of another licensing deal, with Elmer’s [Elmer’s Glue & Adhesives], announced in March [2015].

A June 30, 2015 MIT news release, which originated the news item, provides more details about the researcher/entrepreneurs’ plans,

But this is only the beginning, says Varanasi, an associate professor of mechanical engineering who is now on LiquiGlide’s board of directors and chief science advisor. The startup, which just entered the consumer-goods market, is courting deals with numerous producers of foods, beauty supplies, and household products. “Our coatings can work with a whole range of products, because we can tailor each coating to meet the specific requirements of each application,” Varanasi says.

Apart from providing savings and convenience, LiquiGlide aims to reduce the surprising amount of wasted products — especially food — that stick to container sides and get tossed. For instance, in 2009 Consumer Reports found that up to 15 percent of bottled condiments are ultimately thrown away. Keeping bottles clean, Varanasi adds, could also drastically cut the use of water and energy, as well as the costs associated with rinsing bottles before recycling. “It has huge potential in terms of critical sustainability,” he says.

Varanasi says LiquiGlide aims next to tackle buildup in oil and gas pipelines, which can cause corrosion and clogs that reduce flow. [emphasis mine] Future uses, he adds, could include coatings for medical devices such as catheters, deicing roofs and airplane wings, and improving manufacturing and process efficiency. “Interfaces are ubiquitous,” he says. “We want to be everywhere.”

The news release goes on to describe the research process in more detail and offers a plug for MIT’s innovation efforts,

LiquiGlide was originally developed while Smith worked on his graduate research in Varanasi’s research group. Smith and Varanasi were interested in preventing ice buildup on airplane surfaces and methane hydrate buildup in oil and gas pipelines.

Some initial work was on superhydrophobic surfaces, which trap pockets of air and naturally repel water. But both researchers found that these surfaces don’t, in fact, shed every bit of liquid. During phase transitions — when vapor turns to liquid, for instance — water droplets condense within microscopic gaps on surfaces, and steadily accumulate. This leads to loss of anti-icing properties of the surface. “Something that is nonwetting to macroscopic drops does not remain nonwetting for microscopic drops,” Varanasi says.

Inspired by the work of researcher David Quéré, of ESPCI in Paris, on slippery “hemisolid-hemiliquid” surfaces, Varanasi and Smith invented permanently wet “liquid-impregnated surfaces” — coatings that don’t have such microscopic gaps. The coatings consist of textured solid material that traps a liquid lubricant through capillary and intermolecular forces. The coating wicks through the textured solid surface, clinging permanently under the product, allowing the product to slide off the surface easily; other materials can’t enter the gaps or displace the coating. “One can say that it’s a self-lubricating surface,” Varanasi says.

Mixing and matching the materials, however, is a complicated process, Varanasi says. Liquid components of the coating, for instance, must be compatible with the chemical and physical properties of the sticky product, and generally immiscible. The solid material must form a textured structure while adhering to the container. And the coating can’t spoil the contents: Foodstuffs, for instance, require safe, edible materials, such as plants and insoluble fibers.

To help choose ingredients, Smith and Varanasi developed the basic scientific principles and algorithms that calculate how the liquid and solid coating materials, and the product, as well as the geometry of the surface structures will all interact to find the optimal “recipe.”
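
The release doesn’t reveal those algorithms, but the constraints it lists (immiscibility with the product, a solid texture the lubricant wets, edible materials for foodstuffs) can be sketched as a simple candidate screen. Everything below, the function, the property names, and the sample data, is my own simplified illustration, not LiquiGlide’s actual method:

```python
# Hypothetical screening sketch (my simplification, not LiquiGlide's real
# algorithm): filter candidate lubricants for a liquid-impregnated coating
# by the constraints the article lists: immiscible with the product,
# wetting the textured solid better than the product does (lower contact
# angle), and food-safe when the product is a foodstuff.
def viable_lubricants(candidates, product):
    viable = []
    for lub in candidates:
        if product["name"] in lub["miscible_with"]:
            continue  # must be immiscible with the product
        if lub["contact_angle_on_solid"] >= product["contact_angle_on_solid"]:
            continue  # lubricant must wet the texture better than the product
        if product["edible"] and not lub["food_safe"]:
            continue  # foodstuffs require safe, edible coating materials
        viable.append(lub["name"])
    return viable

ketchup = {"name": "ketchup", "contact_angle_on_solid": 60.0, "edible": True}
candidates = [
    {"name": "plant_oil", "miscible_with": [],
     "contact_angle_on_solid": 10.0, "food_safe": True},
    {"name": "silicone_oil", "miscible_with": [],
     "contact_angle_on_solid": 5.0, "food_safe": False},
    {"name": "water", "miscible_with": ["ketchup"],
     "contact_angle_on_solid": 40.0, "food_safe": True},
]
print(viable_lubricants(candidates, ketchup))  # ['plant_oil']
```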

Today, LiquiGlide develops coatings for clients and licenses the recipes to them. Included are instructions that detail the materials, equipment, and process required to create and apply the coating for their specific needs. “The state of the coating we end up with depends entirely on the properties of the product you want to slide over the surface,” says Smith, now LiquiGlide’s CEO.

Having researched materials for hundreds of different viscous liquids over the years — from peanut butter to crude oil to blood — LiquiGlide also has a database of optimal ingredients for its algorithms to pull from when customizing recipes. “Given any new product you want LiquiGlide for, we can zero in on a solution that meets all requirements necessary,” Varanasi says.

MIT: A lab for entrepreneurs

For years, Smith and Varanasi toyed around with commercial applications for LiquiGlide. But in 2012, with help from MIT’s entrepreneurial ecosystem, LiquiGlide went from lab to market in a matter of months.

Initially the idea was to bring coatings to the oil and gas industry. But one day, in early 2012, Varanasi saw his wife struggling to pour honey from its container. “And I thought, ‘We have a solution for that,’” Varanasi says.

The focus then became consumer packaging. Smith and Varanasi took the idea through several entrepreneurship classes — such as 6.933 (Entrepreneurship in Engineering: The Founder’s Journey) — and MIT’s Venture Mentoring Service and Innovation Teams, where student teams research the commercial potential of MIT technologies.

“I did pretty much every last thing you could do,” Smith says. “Because we have such a brilliant network here at MIT, I thought I should take advantage of it.”

That May [2012], Smith, Varanasi, and several MIT students entered LiquiGlide in the MIT $100K Entrepreneurship Competition, earning the Audience Choice Award — and the national spotlight. A video of ketchup sliding out of a LiquiGlide-coated bottle went viral. Numerous media outlets picked up the story, while hundreds of companies reached out to Varanasi to buy the coating. “My phone didn’t stop ringing, my website crashed for a month,” Varanasi says. “It just went crazy.”

That summer [2012], Smith and Varanasi took their startup idea to MIT’s Global Founders’ Skills Accelerator program, which introduced them to a robust network of local investors and helped them build a solid business plan. Soon after, they raised money from family and friends, and won $100,000 at the MassChallenge Entrepreneurship Competition.

When LiquiGlide Inc. launched in August 2012, clients were already knocking down the door. The startup chose a select number to pay for the development and testing of the coating for its products. Within a year, LiquiGlide was cash-flow positive, and had grown from three to 18 employees in its current Cambridge headquarters.

Looking back, Varanasi attributes much of LiquiGlide’s success to MIT’s innovation-based ecosystem, which promotes rapid prototyping for the marketplace through experimentation and collaboration. This ecosystem includes the Deshpande Center for Technological Innovation, the Martin Trust Center for MIT Entrepreneurship, the Venture Mentoring Service, and the Technology Licensing Office, among other initiatives. “Having a lab where we could think about … translating the technology to real-world applications, and having this ability to meet people, and bounce ideas … that whole MIT ecosystem was key,” Varanasi says.

Here’s the latest LiquiGlide video,


Credits:

Video: Melanie Gonick/MIT
Additional footage courtesy of LiquiGlide™
Music sampled from “Candlepower” by Chris Zabriskie
https://freemusicarchive.org/music/Ch…
http://creativecommons.org/licenses/b…

I had thought the EU (European Union) offered more roadblocks to marketing nanotechnology-enabled products used in food packaging than the US. If anyone knows why a US company would market its products in Europe first I would love to find out.

Opportunity for companies to take a survey on risk management and nanotechnology

A June 8, 2015 news item on Nanowerk features a European Union (EU) Framework Programme 7 (FP7) nanotechnology risk management project and survey,

The EU FP7 Sustainable Nanotechnologies (SUN) project is based on the idea that the current knowledge on environmental and health risks of nanomaterials – while limited – can nevertheless guide nanomanufacturing to avoid liabilities if an integrated approach addressing the complete product lifecycle is applied. SUN aims to evaluate the risks along the supply chains of engineered nanomaterials and incorporate the results into tools and guidelines for sustainable nanomanufacturing.

A May 26, 2015 SUN press release by Stella Stoycheva, which originated the news item, provides more details,

… A key objective of  Sustainable Nanotechnologies (SUN) is to build the SUN Decision Support System (SUNDS) to facilitate safe and sustainable nanomanufacturing and risk management. It will integrate tools for ecological and human health risk assessment, lifecycle assessment, economic assessment and social impact assessment within a sustainability assessment framework. We are currently developing the Technological Alternatives and Risk Management Measures (TARMM) inventory and are looking for companies to fill in a short survey.

… We would appreciate responses from personnel of companies involved in nanotechnology-related activities who are familiar with the risk management practices.

You can go here to take the survey. The focus is on companies, and there don’t seem to be any geographic restrictions, such as limiting participation to EU companies.

The use of graphene scanners in art conservation

A May 20, 2015 news item on phys.org describes a new method of examining art work without damaging it,

Museum curators, art restorers, archaeologists and the broader public will soon be able to learn much more about paintings and other historic objects, thanks to an EU project which has become a pioneer in non-invasive art exploration techniques, based on a graphene scanner.

Researchers working on INSIDDE [INtegration of cost-effective Solutions for Imaging, Detection, and Digitisation of hidden Elements in paintings], which received a EUR 2.9 million investment from FP7 ICT Research Programme, have developed a graphene scanner that can explore under the surface of a painting, or through the dirt covering an ancient object unearthed in an archaeological dig, without touching it.

‘As well as showing sketches or previous paintings that have remained hidden beneath a particular artwork, the scanner, together with post-processing techniques, will allow us to identify and distinguish brushstrokes to understand the creative process,’ explained Javier Gutiérrez, of Spanish technology company Treelogic, which is leading the project.

A May 19, 2015 CORDIS press release, which originated the news item, provides more details about the graphene scanner's capabilities,

The challenge in this field is to develop advanced technologies that avoid damaging the artwork under examination. Solvents and their potential side effects are progressively being replaced by the likes of lasers to remove dirt and varnish from paintings. Limestone-producing bacteria can be used to fill cracks in sculptures. INSIDDE is taking a step further in this direction by using terahertz, a frequency band lying between microwave and infrared in the electromagnetic spectrum.

Until graphene, considered to be one of the materials of the future, came along, it was difficult to generate terahertz frequencies to acquire such detail. Graphene in this application acts as a frequency multiplier, allowing scientists to reveal previously hidden features such as brushstroke textures, pigments and defects, without harming the work.

Although X-ray and infrared reflectography are used elsewhere to carry out this type of study, they heat the object and cannot reach the intermediate layers between the gesso and the varnish in paintings, or other characteristic elements in ceramics. INSIDDE’s device, using terahertz frequency, works in these intermediate layers and does not heat the object.
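For readers wondering why terahertz radiation is so gentle compared with X-rays, a quick back-of-the-envelope calculation (mine, not the project's; the frequencies below are illustrative round numbers) compares photon energies:

```python
# Back-of-envelope: photon energy E = h * f.
# At terahertz frequencies the photon energy is a few milli-electronvolts,
# far below the ~electronvolt scale of chemical bonds (and orders of
# magnitude below ionizing X-rays), which is one reason a terahertz scan
# can probe a painting's layers without heating or damaging it.

H_PLANCK = 6.62607015e-34   # Planck constant, J*s
EV = 1.602176634e-19        # one electronvolt, in joules

def photon_energy_mev(freq_hz):
    """Photon energy in milli-electronvolts for a given frequency in Hz."""
    return H_PLANCK * freq_hz / EV * 1e3

print(photon_energy_mev(1e12))    # ~4.1 meV at 1 THz
print(photon_energy_mev(2.4e18))  # ~10 million meV (about 10 keV), a typical X-ray
```

The roughly million-fold gap in photon energy is the point: terahertz photons carry far too little energy to break chemical bonds in pigments or varnish.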

In conjunction with a commercial scanner mapping the art's upper layers, it can generate full 3D data from the object in a completely non-intrusive way and process this data to extract and interpret features invisible to the naked eye, in a way that has never been done before.

INSIDDE is developing this technology to benefit the general public, too. The 2D and 3D digital models it is producing will be uploaded to the Europeana network and the project aims to make the results available through a smartphone and tablet app to be exploited by local and regional museums. The app is currently being trialled at one of the partners, the Asturias Fine Art Museum in Oviedo. It shows the different layers of the painting the visitor is looking at and provides additional information and audio.

The press release notes that the technology offers some new possibilities,

Although the scanner is still in its trial and calibration phase, the project participants have already unveiled some promising results. Marta Flórez, of the Asturias Fine Art Museum, explained: ‘Using the prototype, we have been able to distinguish clearly between different pigments, which in some cases will avoid having to puncture the painting in order to find out what materials the artist used.’

The prototype is also being validated with some recently unearthed 3rd Century pottery from the Stara Zagora regional history museum in Bulgaria. When the project ends in December 2015, one of the options the consortium is assessing is putting this cost-effective solution at the service of smaller local and regional museums without art restoration departments so that they too, like the bigger museums, can make important discoveries about their collections.

You can find out more about INSIDDE here.

Outcomes for US-European Union bridging Nano environment, health, and safety (EHS) research workshop

According to Lynn Bergeson in an April 14, 2015 news item on Nanotechnology Now, a US-European Union (EU) workshop on nanosafety has published a document,

The National Nanotechnology Initiative (NNI) published on March 23, 2015, the outcomes of the March 12-13, 2015, joint workshop held by the U.S. and the European Union (EU), “Bridging NanoEHS Research Efforts.” …

An undated 2015 US National Nanotechnology Initiative (NNI) notice on the nano.gov site provides more details,

Workshop participants reviewed progress toward COR [communities of research] goals and objectives, shared best practices, and identified areas for cross-COR collaboration. To address new challenges, the CORs were realigned and expanded with the addition of a COR on nanotechnology characterization. The seven CORs now address:

Characterization
Databases and Computational Modeling
Exposure through Product Life
EcoToxicity
Human Toxicity
Risk Assessment
Risk Management and Control

The CORs support the shared goal of responsible nanotechnology development as outlined in the U.S. National Nanotechnology Initiative EHS Research Strategy, and the research strategy of the EU NanoSafety Cluster. The CORs directly address several priorities described in the documents above, including the creation of a comprehensive nanoEHS knowledge base and international cooperation on the development of best practices and consensus standards.

The CORs are self-run, with technical support provided by the European Commission and the U.S. National Nanotechnology Coordination Office. Each Community has European and American co-chairs who convene meetings and teleconferences, guide the discussions, and set the group’s agenda. Participation in the CORs is free and open to any interested individuals. More information is available at www.us-eu.org.

The workshop was organized by the European Commission and the U.S. National Nanotechnology Initiative under the auspices of the agreement for scientific and technological cooperation between the European Union and the United States.

Coincidentally, I received an April 13, 2015 notice about the European Commission’s NanoSafety Cluster’s Spring 2015 newsletter concerning their efforts but found no mention of the ‘bridging workshop’. Presumably, information was not available prior to the newsletter’s deadline.

In my April 8, 2014 posting about a US proposed rule for reporting nanomaterials, I included information about the US and its efforts to promote or participate in harmonizing the nano situation internationally. Scroll down about 35% of the way to find information about the Canada-U.S. Regulatory Cooperation Council (RCC) Nanotechnology Initiative, the Organisation for Economic Cooperation and Development (OECD) effort, and the International Organization for Standardization (ISO) effort.

Converting light to electricity at femto speeds

This is a pretty remarkable (to me anyway) piece of research on speeding up the process of converting light to electricity. From an April 14, 2015 Institute of Photonic Sciences (ICFO) press release (also on EurekAlert but dated April 15, 2015),

The efficient conversion of light into electricity plays a crucial role in many technologies, ranging from cameras to solar cells. It also forms an essential step in data communication applications, since it allows for information carried by light to be converted into electrical information that can be processed in electrical circuits. Graphene is an excellent material for ultrafast conversion of light to electrical signals, but so far it was not known how fast graphene responds to ultrashort flashes of light.

The new device that the researchers developed is capable of converting light into electricity in less than 50 femtoseconds (a twentieth of a millionth of a millionth of a second). To do this, the researchers used a combination of ultrafast pulse-shaped laser excitation and highly sensitive electrical readout. As Klaas-Jan Tielrooij comments, “the experiment uniquely combined the ultrafast pulse shaping expertise obtained from single molecule ultrafast photonics with the expertise in graphene electronics. Facilitated by graphene’s nonlinear photo-thermoelectric response, these elements enabled the observation of femtosecond photodetection response times.”

The ultrafast creation of a photovoltage in graphene is possible due to the extremely fast and efficient interaction between all conduction band carriers in graphene. This interaction leads to a rapid creation of an electron distribution with an elevated electron temperature. Thus, the energy absorbed from light is efficiently and rapidly converted into electron heat. Next, the electron heat is converted into a voltage at the interface of two graphene regions with different doping. This photo-thermoelectric effect turns out to occur almost instantaneously, thus enabling the ultrafast conversion of absorbed light into electrical signals. As Prof. van Hulst states, “it is amazing how graphene allows direct non-linear detecting of ultrafast femtosecond (fs) pulses”.
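As a rough illustration of the photo-thermoelectric step described above, here is a minimal numerical sketch. The Seebeck coefficients and electron-temperature rise below are placeholder values chosen for illustration only, not figures from the Tielrooij et al. paper:

```python
# Illustrative sketch of the photo-thermoelectric (PTE) photovoltage
# generated at a junction between two graphene regions with different
# doping:
#     V_PTE = (S1 - S2) * dT_e
# where S1 and S2 are the Seebeck coefficients of the two regions and
# dT_e is the light-induced rise in electron temperature at the junction.
# All numbers below are placeholders, not values from the paper.

def pte_voltage(seebeck_1, seebeck_2, delta_t_electron):
    """Photovoltage (in volts) from the Seebeck difference across the junction."""
    return (seebeck_1 - seebeck_2) * delta_t_electron

# Example: S1 = +50 uV/K, S2 = -50 uV/K (opposite doping),
# with electron heating of 100 K:
v = pte_voltage(50e-6, -50e-6, 100.0)
print(v)  # 0.01 V, i.e. 10 mV
```

The sketch captures why opposite doping on either side of the junction matters: the voltage scales with the *difference* in Seebeck coefficients, so identically doped regions would produce no photovoltage at all.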

The results obtained from the findings of this work, which has been partially funded by the EC Graphene Flagship, open a new pathway towards ultra-fast optoelectronic conversion. As Prof. Koppens comments, “Graphene photodetectors keep showing fascinating performances addressing a wide range of applications”.

Here’s a link to and a citation for the paper,

Generation of photovoltage in graphene on a femtosecond timescale through efficient carrier heating by K. J. Tielrooij, L. Piatkowski, M. Massicotte, A. Woessner, Q. Ma, Y. Lee, K. S. Myhro, C. N. Lau, P. Jarillo-Herrero, N. F. van Hulst & F. H. L. Koppens. Nature Nanotechnology (2015) doi:10.1038/nnano.2015.54 Published online 13 April 2015

This paper is behind a paywall but there is a free preview via ReadCube Access.

NANoREG (Environment, Health & Safety) halfway through its project term

A March 18, 2015 news item on Nanowerk announces a third NANoREG newsletter marking the halfway point in the project's term (Note: Links have been removed),

NANoREG is the first FP7 project to deliver the answers needed by regulators and legislators on EHS [Environment, Health & Safety] by linking them to a scientific evaluation of data and test methods.

Time-wise, the NANoREG project is now at its halfway point. After setting the basic conditions for its R&D work, the project now focuses on the generation of reliable and comparable experimental data on the EHS aspects of the selected NANoREG nanomaterials. These data will form the basis for the main “end products” of the NANoREG project: the Regulatory Framework and the NANoREG Toolbox. Highlights of this experimental work and results will be shared with you in this 3rd NANoREG Newsletter (pdf).

The editorial for the 3rd issue of the NANoREG newsletter, which seems to have originated the news item, describes upcoming initiatives,

The Regulatory Framework and the NANoREG Toolbox just mentioned will be developed in close cooperation with organisations involved in standardisation and in the regulatory aspects of nanomaterials like ECHA [European Chemicals Agency], OECD [Organisation for Economic Co-operation and Development], CEN [European Committee for Standardization] and ISO [International Organization for Standardization]. The results of other EU FP7 [Framework Programme 7] and H2020 [Horizon 2020] [research funding] projects will also be taken into account when developing these products. One of these projects is the H2020 project NANoREG II, which focuses on Safe by Design and will start in the 2nd or 3rd quarter of 2015.

The coordinated and integrated approach in developing the Framework and the NANoREG Toolbox is one of the main elements of the H2020 funded Coordination and Support Action (CSA) “ProSafe” that recently had its Kick-Off meeting in Aix-en-Provence, France. Just like NANoREG this CSA is coordinated by the Dutch Ministry of Infrastructure and the Environment and as such executed by me. Other elements of this CSA are – among others – the expansions of the involvement of EU and non-EU countries in the NANoREG project in order to broaden the platform of support for the NANoREG results world-wide (“NANoREG+”), the exploitation of synergies between the NANoREG project and other “nanosafety” projects and data management.

The outcome of the CSA will be a White Paper that can be used by policy makers, regulators and industry to establish methods for measuring and assessing the EHS aspects of nanomaterials and that will give guidance to industry on how to implement “safe by design”. A forerunner of the White Paper will be the subject of a three-day scientific conference to be held at the end of 2016. It will include the results of the NANoREG project, the results of the evaluation of EHS data available at the OECD and results from other sources. After consulting risk assessors and policymakers, the White Paper will be published in the first quarter of 2017.

This project has reached out beyond Europe for partners (from the editorial for the 3rd NANoREG newsletter),

It is quite a challenge we face. Given the expertise and scientific authority of our partners, including the Czech, Brazilian and South Korean parties that recently joined the NANoREG project, I am confident however that we will succeed in reaching our goal: creating a solid basis for a balanced combination of nanosafety and innovation that will be beneficial to society.

I hope NANoREG is successful with its goal of “creating a solid basis for a balanced combination of nanosafety and innovation that will be beneficial to society.”

I last wrote about NANoREG in a March 21, 2014 posting.

Nanomedicine living up to its promise?

Michael Berger has written a March 10, 2015 Nanowerk spotlight article reviewing nanomedicine's progress, or lack thereof (Note: Links have been removed),

In early 2003, the European Science Foundation launched its Scientific Forward Look on Nanomedicine, a foresight study (report here ;pdf) and in 2004, the U.S. National Institute[s] of Health (NIH) published its Roadmap (now Common Fund) of the Nanomedicine Initiative. This program began in 2005 with a national network of eight Nanomedicine Development Centers. Now, in the second half of this 10-year program, the four centers best positioned to effectively apply their findings to translational studies were selected to continue receiving support.

A generally accepted definition of nanomedicine refers to highly specific medical intervention at the molecular scale for curing disease or repairing damaged tissues, such as bone, muscle, or nerve.

Much of Berger’s article is based on a paper by Subbu Venkatraman, Director of the NTU (Nanyang Technological University)-Northwestern Nanomedicine Institute in Singapore: Has nanomedicine lived up to its promise?, 2014 Nanotechnology 25 372501 doi:10.1088/0957-4484/25/37/372501 (Note: Links have been removed),

… Historically, the approval of Doxil as the very first nanotherapeutic product in 1995 is generally regarded as the dawn of nanomedicine for human use. Since then, research activity in this area has been frenetic, with, for example, 2000 patents being generated in 2003, in addition to 1200 papers [2]. In the same time period, a total of 207 companies were involved in developing nanomedicinal products in diagnostics, imaging, drug delivery and implants. About 38 products loosely classified as nanomedicine products were in fact approved by 2004. Out of these, however, a number of products (five in all) were based on PEG-ylated proteins, which strictly speaking, are not so much nanomedicine products as molecular therapeutics. Nevertheless, the promise of nanomedicine was being translated into funding for small companies, and into clinical success, so that by 2013, the number of approved products had reached 54 in all, with another 150 in various stages of clinical trials [3]. The number of companies and institutions had risen to 241 (including research centres that were working on nanomedicine). A PubMed search on articles relating to nanomedicine shows 7400 hits over 10 years, of which 1874 were published in 2013 alone. Similarly, the US patent office database shows 409 patents (since 1976) that were granted in nanomedicine, with another 679 applications awaiting approval. So judging by research activity and funding the field of nanomedicine has been very fertile; however, when we use the yardstick of clinical success and paradigm shifts in treatment, the results appear more modest.

Both Berger’s spotlight article and Venkatraman’s review provide interesting reading and neither is especially long.