Tag Archives: Norway

Risk assessments not the only path to nanotechnology regulation

Nanowerk has republished an essay about nanotechnology regulation from Australia’s The Conversation in an Aug. 25, 2015 news item (Note: A link has been removed),

When it comes to nanotechnology, Australians have shown strong support for regulation and safety testing.

One common way of deciding whether and how nanomaterials should be regulated is to conduct a risk assessment. This involves calculating the risk a substance or activity poses based on the associated hazards or dangers and the level of exposure to people or the environment.

However, our recent review (“Risk Analysis of Nanomaterials: Exposing Nanotechnology’s Naked Emperor”) found some serious shortcomings of the risk assessment process for determining the safety of nanomaterials.

We have argued that these shortcomings are so significant that risk assessment is effectively a naked emperor [reference to a children’s story “The Emperor’s New Clothes“].
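For readers unfamiliar with the mechanics, the hazard-and-exposure calculation described above is often boiled down to a risk quotient: the ratio of a predicted exposure level to a level believed to cause no harm. Here is a minimal sketch of that arithmetic; the numbers are invented and real regulatory practice is far more involved,

    # Illustrative sketch of a risk quotient (predicted exposure divided by a
    # presumed no-effect level); values above 1 flag a potential concern.
    # All numbers below are invented for illustration.
    def risk_quotient(predicted_exposure, no_effect_level):
        return predicted_exposure / no_effect_level

    # hypothetical nanomaterial, concentrations in micrograms per litre
    print(risk_quotient(predicted_exposure=0.4, no_effect_level=2.0))  # 0.2, below the level of concern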

The original Aug. 24, 2015 article written by Fern Wickson (Scientist/Program Coordinator at GenØk – Centre for Biosafety in Norway) and Georgia Miller (PhD candidate at UNSW [University of New South Wales], Australia) points out an oft ignored issue with regard to nanotechnology regulation,

Risk assessment has been the dominant decision-aiding tool used by regulators of new technologies for decades, despite it excluding key questions that the community cares about. [emphasis mine] For example: do we need this technology; what are the alternatives; how will it affect social relations, and; who should be involved in decision making?

Wickson and Miller also note more frequently discussed issues,

A fundamental problem is a lack of nano-specific regulation. Most sector-based regulation does not include a “trigger” for nanomaterials to face specific risk assessment. Where a substance has been approved for use in its macro form, it requires no new assessment.

Even if such a trigger were present, there is also currently no cross-sectoral or international agreement on the definition of what constitutes a nanomaterial.

Another barrier is the lack of measurement capability and validated methods for safety testing. We still do not have the means to conduct routine identification of nanomaterials in the complex “matrix” of finished products or the environment.

This makes supply chain tracking and safety testing under real-world conditions very difficult. Despite ongoing investment in safety research, the lack of validated test methods and different methods yielding diverse results allows scientific uncertainty to persist.

With regard to the first problem, the assumption that a material proven safe at the macroscale is also safe at the nanoscale informs regulation in Canada and, as far as I’m aware, every other jurisdiction that has any type of nanomaterial regulation. I’ve had mixed feelings about this. On the one hand, we haven’t seen any serious problems associated with the use of nanomaterials; on the other hand, such problems can be slow to emerge.

The second issue mentioned, the lack of an internationally consistent definition, seems to be a relatively common problem in a lot of areas. As far as I’m aware, there aren’t that many international agreements on safety measures; nuclear weapons and endangered animals and plants (CITES) are two of the few that come to mind.

The lack of protocols for safety testing of nanomaterials mentioned in the last paragraph of the excerpt is of rising concern. For example, there’s my July 7, 2015 posting featuring two efforts: Nanotechnology research protocols for Environment, Health and Safety Studies in US and a nanomedicine characterization laboratory in the European Union. Despite this and other efforts, I do think more can and should be done to standardize tests and protocols (without killing new types of research and results which don’t fit the models).

The authors do seem to be presenting a circular argument with this (from their Aug. 24, 2015 article; Note: A link has been removed),

Indeed, scientific uncertainty about nanomaterials’ risk profiles is a key barrier to their reliable assessment. A review funded by the European Commission concluded that:

[…] there is still insufficient data available to conduct the in depth risk assessments required to inform the regulatory decision making process on the safety of NMs [nanomaterials].

Reliable assessment of any chemical or drug is a major problem. We do have some good risk profiles, but how many times have pharmaceutical companies developed a drug that passed successfully through human clinical trials only to present a serious risk once released to the general population? Assessing risk is a very complex problem, even with risk profiles and extensive testing.

Unmentioned throughout the article are naturally occurring nanoparticles (nanomaterials) and those created inadvertently through some manufacturing or other process. In fact, we have been ingesting nanomaterials throughout time. That said, I do agree we need to carefully consider the impact that engineered nanomaterials could have on us and the environment as ever more are being added.

To that end, the authors make some suggestions (Note: Links have been removed),

There are well-developed alternate decision-aiding tools available. One is multicriteria mapping, which seeks to evaluate various perspectives on an issue. Another is problem formulation and options assessment, which expands science-based risk assessment to engage a broader range of individuals and perspectives.

There is also pedigree assessment, which explores the framing and choices taking place at each step of an assessment process so as to better understand the ambiguity of scientific inputs into political processes.

Another, though less well developed, approach popular in Europe involves a shift from risk to innovation governance, with emphasis on developing “responsible research and innovation”.

I have some hesitation about recommending this read due to Georgia Miller’s involvement and the fact that I don’t have the time to check all the references. Miller was a spokesperson for Friends of the Earth (FoE) Australia, a group which led a substantive campaign against ‘nanosunscreens’. Here’s a July 20, 2010 posting where I featured some cherrypicking/misrepresentation of data by FoE in the persons of Georgia Miller and Ian Illuminato.

My Feb. 9, 2012 posting highlights the unintended consequences (avoidance of all sunscreens by some participants in a survey) of the FoE’s campaign in Australia (Note [1]: The percentage of people likely to avoid all sunscreens due to their concerns with nanoparticles in their sunscreens was originally reported to be 17%; Note [2]: Australia has the highest incidence of skin cancer in the world),

Feb. 21, 2012 correction: According to the information in the Feb. 20, 2012 posting on 2020 Science, the percentage of Australians likely to avoid using sunscreens is 13%,

This has just landed in my email in box from Craig Cormick at the Department of Industry, Innovation, Science, Research and Tertiary Education in Australia, and I thought I would pass it on given the string of posts on nanoparticles in sunscreens on 2020 Science over the past few years:

“An online poll of 1,000 people, conducted in January this year, shows that one in three Australians had heard or read stories about the risks of using sunscreens with nanoparticles in them,” Dr Cormick said.

“Thirteen percent of this group were concerned or confused enough that they would be less likely to use any sunscreen, whether or not it contained nanoparticles, putting themselves at increased risk of developing potentially deadly skin cancers.

“The study also found that while one in five respondents stated they would go out of their way to avoid using sunscreens with nanoparticles in them, over three in five would need to know more information before deciding.”

This article with Fern Wickson (with whom I don’t always agree but who, as far as I know, hasn’t played any games with research) helps somewhat, but it’s going to take more than this before I feel comfortable recommending Ms. Miller’s work for further reading.

Carbon capture with ‘diamonds from the sky’

Before launching into the latest on a new technique for carbon capture, it might be useful to provide some context. Arthur Neslen’s March 23, 2015 opinion piece outlines the issues and notes that one Norwegian Prime Minister resigned when coalition government partners attempted to build gas power plants without carbon capture and storage (CCS) facilities (Note: A link has been removed),

At least 10 European power plants were supposed to begin piping their carbon emissions into underground tombs this year, rather than letting them twirl into the sky. None has done so.

Missed deadlines, squandered opportunities, spiralling costs and green protests have plagued the development of carbon capture and storage (CCS) technology since Statoil proposed the concept more than two decades ago.

But in the face of desperate global warming projections the CCS dream still unites Canadian tar sands rollers with the UN’s Intergovernmental Panel on Climate Change (IPCC), and Shell with some environmentalists.

With 2bn people in the developing world expected to hook up to the world’s dirty energy system by 2050, CCS holds out the tantalising prospect of fossil-led growth that does not fry the planet.


“With CCS in the mix, we can decarbonise in a cost-effective manner and still continue to produce, to some extent, our fossil fuels,” Tim Bertels, Shell’s global CCS portfolio manager, told the Guardian. “You don’t need to divest in fossil fuels, you need to decarbonise them.”

The technology has been gifted “a very significant fraction” of the billions of dollars earmarked by Shell for clean energy research, he added. But the firm is also a vocal supporter of public funding for CCS from carbon markets, as are almost all players in the industry.

Enthusiasm for this plan is not universal (from Neslen’s opinion piece),

Many environmentalists see the idea as a non-starter because it locks high emitting power plants into future energy systems, and obstructs funding for the cheaper renewables revolution already underway. “CCS is completely irrelevant,” said Jeremy Rifkin, a noted author and climate adviser to several governments. “I don’t even think about it. It’s not going to happen. It’s not commercially available and it won’t be commercially viable.”

I recommend reading Neslen’s piece for anyone who’s not already well versed on the issues. He uses Norway as a case study and sums up the overall CCS political situation this way,

In many ways, the debate over carbon capture and storage is a struggle between two competing visions of the societal transformation needed to avert climate disaster. One vision represents the enlightened self-interest of a contributor to the problem. The other cannot succeed without eliminating its highly entrenched opponent. The battle is keenly fought by technological optimists on both sides. But if Norway’s fractious CCS experience is any indicator, it will be decided on the ground by the grimmest of realities.

On that note of urgency, here’s some research on carbon dioxide (CO2) or, more specifically, carbon capture and utilization technology, from an Aug. 19, 2015 news item on Nanowerk,

Finding a technology to shift carbon dioxide (CO2), the most abundant anthropogenic greenhouse gas, from a climate change problem to a valuable commodity has long been a dream of many scientists and government officials. Now, a team of chemists says they have developed a technology to economically convert atmospheric CO2 directly into highly valued carbon nanofibers for industrial and consumer products.

An Aug. 19, 2015 American Chemical Society (ACS) news release (also on EurekAlert), which originated the news item, expands on the theme,

The team will present brand-new research on this new CO2 capture and utilization technology at the 250th National Meeting & Exposition of the American Chemical Society (ACS). ACS is the world’s largest scientific society. The national meeting, which takes place here through Thursday, features more than 9,000 presentations on a wide range of science topics.

“We have found a way to use atmospheric CO2 to produce high-yield carbon nanofibers,” says Stuart Licht, Ph.D., who leads a research team at George Washington University. “Such nanofibers are used to make strong carbon composites, such as those used in the Boeing Dreamliner, as well as in high-end sports equipment, wind turbine blades and a host of other products.”

Previously, the researchers had made fertilizer and cement without emitting CO2, which they reported. Now, the team, which includes postdoctoral fellow Jiawen Ren, Ph.D., and graduate student Jessica Stuart, says their research could shift CO2 from a global-warming problem to a feedstock for the manufacture of in-demand carbon nanofibers.

Licht calls his approach “diamonds from the sky.” That refers to carbon being the material that diamonds are made of, and also hints at the high value of the products, such as the carbon nanofibers that can be made from atmospheric carbon and oxygen.

Because of its efficiency, this low-energy process can be run using only a few volts of electricity, sunlight and a whole lot of carbon dioxide. At its root, the system uses electrolytic syntheses to make the nanofibers. CO2 is broken down in a high-temperature electrolytic bath of molten carbonates at 1,380 degrees F (750 degrees C). Atmospheric air is added to an electrolytic cell. Once there, the CO2 dissolves when subjected to the heat and direct current through electrodes of nickel and steel. The carbon nanofibers build up on the steel electrode, where they can be removed, Licht says.

To power the syntheses, heat and electricity are produced through a hybrid and extremely efficient concentrating solar-energy system. The system focuses the sun’s rays on a photovoltaic solar cell to generate electricity and on a second system to generate heat and thermal energy, which raises the temperature of the electrolytic cell.

Licht estimates electrical energy costs of this “solar thermal electrochemical process” to be around $1,000 per ton of carbon nanofiber product, which means the cost of running the system is hundreds of times less than the value of product output.

“We calculate that with a physical area less than 10 percent the size of the Sahara Desert, our process could remove enough CO2 to decrease atmospheric levels to those of the pre-industrial revolution within 10 years,” he says. [emphasis mine]

At this time, the system is experimental, and Licht’s biggest challenge will be to ramp up the process and gain experience to make consistently sized nanofibers. “We are scaling up quickly,” he adds, “and soon should be in range of making tens of grams of nanofibers an hour.”

Licht explains that one advance the group has recently achieved is the ability to synthesize carbon fibers using even less energy than when the process was initially developed. “Carbon nanofiber growth can occur at less than 1 volt at 750 degrees C, which for example is much less than the 3-5 volts used in the 1,000 degree C industrial formation of aluminum,” he says.
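Out of curiosity about the chemistry quoted above, here is a rough back-of-the-envelope reading of the electrolysis step. It assumes the four-electron reduction of carbonate to solid carbon that is usually used to describe this kind of molten-carbonate cell; the cell current is an arbitrary example, not a figure from Licht’s team,

    # Faraday's-law estimate of carbon deposited per unit charge, assuming a
    # four-electron reduction of carbonate to solid carbon
    # (CO3^2- + 4e- -> C + 3O^2-). Numbers are illustrative only.
    FARADAY = 96485.0        # coulombs per mole of electrons
    M_CARBON = 12.011        # grams per mole of carbon
    ELECTRONS_PER_ATOM = 4

    def carbon_grams(current_amps, hours):
        charge = current_amps * hours * 3600.0             # coulombs
        moles_carbon = charge / (ELECTRONS_PER_ATOM * FARADAY)
        return moles_carbon * M_CARBON

    # a hypothetical 10 A laboratory cell running for one hour
    print(round(carbon_grams(10, 1), 2), "g of carbon")    # ~1.12 g

At that rate, the “tens of grams of nanofibers an hour” mentioned above would call for currents in the hundreds of amps, or many cells running in parallel, which gives some sense of the scale-up challenge.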

A low-energy approach that cleans up the air by converting greenhouse gases into useful materials, and does it quickly, is incredibly exciting. Of course, there are a few questions to be asked. Are the research outcomes reproducible by other teams? Licht notes the team is scaling the technology up, but how soon could the process reach industrial scale?
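As for the claim I emphasized about shrinking atmospheric CO2 back to pre-industrial levels within a decade, here is a hedged back-of-the-envelope check of the scale involved, taking the claim at face value and using the common approximation of roughly 7.8 gigatonnes of CO2 per ppm of atmospheric concentration,

    # Scale check of the quoted claim, not of the underlying engineering.
    # Assumes ~7.8 Gt of CO2 per ppm of atmospheric concentration, a commonly
    # used approximation; concentrations are rounded.
    GT_CO2_PER_PPM = 7.8
    current_ppm, preindustrial_ppm = 400.0, 280.0
    years = 10

    co2_to_remove = (current_ppm - preindustrial_ppm) * GT_CO2_PER_PPM   # ~936 Gt CO2
    carbon_fraction = 12.0 / 44.0                                        # mass fraction of carbon in CO2
    carbon_per_year = co2_to_remove * carbon_fraction / years

    print(round(co2_to_remove), "Gt of CO2 in total, or about",
          round(carbon_per_year), "Gt of carbon per year")               # 936 and 26

Capturing on the order of 25 gigatonnes of carbon a year is more than double current global fossil-fuel carbon emissions (roughly 10 gigatonnes of carbon annually), which suggests just how extraordinary the claim is.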

Replacing metal with nanocellulose paper

The quest to find uses for nanocellulose materials has taken a step forward with some work coming from the University of Maryland (US). From a July 24, 2015 news item on Nanowerk,

Researchers at the University of Maryland recently discovered that paper made of cellulose fibers is tougher and stronger the smaller the fibers get … . For a long time, engineers have sought a material that is both strong (resistant to non-recoverable deformation) and tough (tolerant of damage).

“Strength and toughness are often exclusive to each other,” said Teng Li, associate professor of mechanical engineering at UMD. “For example, a stronger material tends to be brittle, like cast iron or diamond.”

A July 23, 2015 University of Maryland news release, which originated the news item, provides details about the thinking which buttresses this research along with some details about the research itself,

The UMD team pursued the development of a strong and tough material by exploring the mechanical properties of cellulose, the most abundant renewable bio-resource on Earth. Researchers made papers with several sizes of cellulose fibers – all too small for the eye to see – ranging in size from about 30 micrometers to 10 nanometers. The paper made of 10-nanometer-thick fibers was 40 times tougher and 130 times stronger than regular notebook paper, which is made of cellulose fibers a thousand times larger.

“These findings could lead to a new class of high performance engineering materials that are both strong and tough, a Holy Grail in materials design,” said Li.

High performance yet lightweight cellulose-based materials might one day replace conventional structural materials (i.e. metals) in applications where weight is important. This could lead, for example, to more energy efficient and “green” vehicles. In addition, team members say, transparent cellulose nanopaper may become feasible as a functional substrate in flexible electronics, resulting in paper electronics, printable solar cells and flexible displays that could radically change many aspects of daily life.

Cellulose fibers can easily form many hydrogen bonds. Once broken, the hydrogen bonds can reform on their own—giving the material a ‘self-healing’ quality. The UMD team discovered that the smaller the cellulose fibers, the more hydrogen bonds per unit area. This means paper made of very small fibers can both hold together better and re-form more quickly, which is the key to cellulose nanopaper being both strong and tough.

“It is helpful to know why cellulose nanopaper is both strong and tough, especially when the underlying reason is also applicable to many other materials,” said Liangbing Hu, assistant professor of materials science at UMD.

To confirm, the researchers tried a similar experiment using carbon nanotubes that were similar in size to the cellulose fibers. The carbon nanotubes had much weaker bonds holding them together, so under tension they did not hold together as well. Paper made of carbon nanotubes is weak, though individually nanotubes are arguably the strongest material ever made.

One possible future direction for the research is the improvement of the mechanical performance of carbon nanotube paper.

“Paper made of a network of carbon nanotubes is much weaker than expected,” said Li. “Indeed, it has been a grand challenge to translate the superb properties of carbon nanotubes at nanoscale to macroscale. Our research findings shed light on a viable approach to addressing this challenge and achieving carbon nanotube paper that is both strong and tough.”
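Before getting to the paper itself, here is a rough geometric illustration of why shrinking the fibre diameter increases the bondable surface so dramatically. Treating fibres as simple cylinders is my own simplification for illustration; it is not the paper’s ‘anomalous’ scaling law,

    # Idealized cylinders: surface area per unit volume scales as 4/diameter,
    # so going from ~30 micrometre fibres to ~10 nanometre fibrils multiplies
    # the surface available for hydrogen bonding by roughly 3000x.
    def specific_surface(diameter_m):
        return 4.0 / diameter_m              # m^2 of surface per m^3 of fibre

    coarse = specific_surface(30e-6)         # fibres in regular notebook paper
    nano = specific_surface(10e-9)           # fibrils in the nanopaper
    print(round(nano / coarse))              # 3000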

Here’s a link to and a citation for the paper,

Anomalous scaling law of strength and toughness of cellulose nanopaper by Hongli Zhu, Shuze Zhu, Zheng Jia, Sepideh Parvinian, Yuanyuan Li, Oeyvind Vaaland, Liangbing Hu, and Teng Li. PNAS (Proceedings of the National Academy of Sciences), vol. 112, no. 29, July 21, 2015. DOI: 10.1073/pnas.1502870112

This paper is behind a paywall.

There is a lot of research on applications for nanocellulose, everywhere it seems, except Canada, which at one time was a leader in the business of producing cellulose nanocrystals (CNC).

Here’s a sampling of some of my most recent posts on nanocellulose,

Nanocellulose as a biosensor (July 28, 2015)

Microscopy, Paper and Fibre Research Institute (Norway), and nanocellulose (July 8, 2015)

Nanocellulose markets report released (June 5, 2015; US market research)

New US platform for nanocellulose and occupational health and safety research (June 1, 2015; Note: As you find new applications, you need to concern yourself with occupational health and safety.)

‘Green’, flexible electronics with nanocellulose materials (May 26, 2015; research from China)

Treating municipal wastewater and dirty industry byproducts with nanocellulose-based filters (Dec. 23, 2014; research from Sweden)

Nanocellulose and an intensity of structural colour (June 16, 2014; research about replacing toxic pigments with structural colour from the UK)

I ask again, where are the Canadians? If anybody has an answer, please let me know.

Microscopy, Paper and Fibre Research Institute (Norway), and nanocellulose

In keeping with a longstanding interest here in nanocellulose (aka cellulose nanomaterials), the Norwegian Paper and Fibre Research Institute’s (PFI) undated 2015 announcement about new ion milling equipment and a new scanning electron microscope suitable for research into cellulose at the nanoscale caught my eye,

In order to advance the microscopy capabilities of cellulose-based materials and thanks to a grant from the Norwegian Pulp and Paper Research Institute foundation, PFI has invested in modern ion milling equipment and a new Scanning Electron Microscope (SEM).

Unusually, the entire news release is being stored at Nanowerk as a July 3, 2015 news item (Note: Links have been removed),

“There are several microscopy techniques that can be used for characterizing cellulose materials, but the scanning electron microscope is one of the most preferable ones as the microscope is easy to use, versatile and provides a multi-scale assessment”, explains Gary Chinga-Carrasco, lead scientist at the PFI Biocomposite area.

“However, good microscopy depends to a large extent on an adequate and optimized preparation of the samples”, adds Per Olav Johnsen, senior engineer and microscopy expert at PFI.

“We are always trying to be in front in the development of new characterization methods, facilitating research and giving support to our industrial partners”, says Chinga-Carrasco, who has been active in developing new methods for characterization of paper, biocomposites and nanocellulose and cannot hide his enthusiasm when he talks about PFI’s new equipment. “In the first period after the installation it is important to work with the equipment with several material samples and techniques to really become confident with its use and reveal its potential”.

The team at PFI is now offering new methods for assessing cellulose materials in great detail. They point out that they have various activities and projects where they already see a big potential with the new equipment.

Examples of these efforts are the assessment of porous nanocellulose structures for biomedical applications (for instance in the NanoHeal program) and the assessment of surface-modified wood fibres for use in biocomposites (for instance in the FiberComp project).

Also unusual is the lack of detail about the microscope’s and ion milling machine’s technical specifications and capabilities.

The NanoHeal program was last mentioned here in an April 14, 2014 post and first mentioned here in an Aug. 23, 2012 posting.

Final comment: I wonder if Nanowerk is embarking on a new initiative where the company agrees to store news releases for various agencies, such as PFI, that would prefer not to archive their own materials. Just a thought.

Nanotechnology research protocols for Environment, Health and Safety Studies in US and a nanomedicine characterization laboratory in the European Union

I have two items relating to nanotechnology and the development of protocols. The first item concerns the launch of a new web portal by the US National Institute of Standards and Technology.

US National Institute of Standards and Technology (NIST)

From a July 1, 2015 news item on Azonano,

As engineered nanomaterials increasingly find their way into commercial products, researchers who study the potential environmental or health impacts of those materials face a growing challenge to accurately measure and characterize them. These challenges affect measurements of basic chemical and physical properties as well as toxicology assessments.

To help nano-EHS (Environment, Health and Safety) researchers navigate the often complex measurement issues, the National Institute of Standards and Technology (NIST) has launched a new website devoted to NIST-developed (or co-developed) and validated laboratory protocols for nano-EHS studies.

A July 1, 2015 NIST news release on EurekAlert, which originated the news item, offers more details about the information available through the web portal,

In common lab parlance, a “protocol” is a specific step-by-step procedure used to carry out a measurement or related activity, including all the chemicals and equipment required. Any peer-reviewed journal article reporting an experimental result has a “methods” section where the authors document their measurement protocol, but those descriptions are necessarily brief and condensed, and may lack validation of any sort. By comparison, on NIST’s new Protocols for Nano-EHS website the protocols are extraordinarily detailed. For ease of citation, they’re published individually–each with its own unique digital object identifier (DOI).

The protocols detail not only what you should do, but why and what could go wrong. The specificity is important, according to program director Debra Kaiser, because of the inherent difficulty of making reliable measurements of such small materials. “Often, if you do something seemingly trivial–use a different size pipette, for example–you get a different result. Our goal is to help people get data they can reproduce, data they can trust.”

A typical caution, for example, notes that if you’re using an instrument that measures the size of nanoparticles in a solution by how they scatter light, it’s important also to measure the transmission spectrum of the particles if they’re colored, because if they happen to absorb light strongly at the same frequency as your instrument, the result may be biased.

“These measurements are difficult because of the small size involved,” explains Kaiser. “Very few new instruments have been developed for this. People are adapting existing instruments and methods for the job, but often those instruments are being operated close to their limits and the methods were developed for chemicals or bulk materials and not for nanomaterials.”

“For example, NIST offers a reference material for measuring the size of gold nanoparticles in solution, and we report six different sizes depending on the instrument you use. We do it that way because different instruments sense different aspects of a nanoparticle’s dimensions. An electron microscope is telling you something different than a dynamic light scattering instrument, and the researcher needs to understand that.”

The nano-EHS protocols offered by the NIST site, Kaiser says, could form the basis for consensus-based, formal test methods such as those published by ASTM and ISO.

NIST’s nano-EHS protocol site currently lists 12 different protocols in three categories: sample preparation, physico-chemical measurements and toxicological measurements. More protocols will be added as they are validated and documented. Suggestions for additional protocols are welcome at nanoprotocols@nist.gov.
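Going back to the caution about light-absorbing particles skewing a scattering measurement, here is a toy illustration of the kind of sanity check being recommended. The laser wavelength is a common one for such instruments, but the spectrum values and the threshold are invented,

    # Toy check: flag a sample whose absorbance near the instrument's laser
    # wavelength is high enough to bias a light-scattering size measurement.
    # Spectrum values and the 0.1 threshold are invented for illustration.
    laser_nm = 633                              # common He-Ne laser line
    absorbance = {500: 0.9, 550: 0.6, 600: 0.3, 633: 0.25, 700: 0.05}

    if absorbance[laser_nm] > 0.1:
        print("Sample absorbs near the laser line; the size result may be biased.")
    else:
        print("Absorption at the laser line looks negligible.")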

The next item concerns European nanomedicine.

CEA-LETI and Europe’s first nanomedicine characterization laboratory

A July 1, 2015 news item on Nanotechnology Now describes the partnership which has led to launch of the new laboratory,

CEA-Leti today announced the launch of the European Nano-Characterisation Laboratory (EU-NCL) funded by the European Union’s Horizon 2020 research and innovation programme[1]. Its main objective is to reach a level of international excellence in nanomedicine characterisation for medical indications like cancer, diabetes, inflammatory diseases or infections, and make it accessible to all organisations developing candidate nanomedicines prior to their submission to regulatory agencies to get the approval for clinical trials and, later, marketing authorization.

“As reported in the ETPN White Paper[2], there is a lack of infrastructure to support nanotechnology-based innovation in healthcare,” said Patrick Boisseau, head of business development in nanomedicine at CEA-Leti and chairman of the European Technology Platform Nanomedicine (ETPN). “Nanocharacterisation is the first bottleneck encountered by companies developing nanotherapeutics. The EU-NCL project is of most importance for the nanomedicine community, as it will contribute to the competitiveness of nanomedicine products and tools and facilitate regulation in Europe.”

EU-NCL is partnered with the sole international reference facility, the Nanotechnology Characterization Lab of the National Cancer Institute in the U.S. (US-NCL)[3], to get faster international harmonization of analytical protocols.

“We are excited to be part of this cooperative arrangement between Europe and the U.S.,” said Scott E. McNeil, director of U.S. NCL. “We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.”

A July 2, 2015 EMPA (Swiss Federal Laboratories for Materials Science and Technology) news release on EurekAlert provides more detail about the laboratory and the partnerships,

The «European Nanomedicine Characterization Laboratory» (EU-NCL), which was launched on 1 June 2015, has a clear-cut goal: to help bring more nanomedicine candidates into the clinic and on the market, for the benefit of patients and the European pharmaceutical industry. To achieve this, EU-NCL is partnered with the sole international reference facility, the «Nanotechnology Characterization Laboratory» (US-NCL) of the US-National Cancer Institute, to get faster international harmonization of analytical protocols. EU-NCL is also closely connected to national medicine agencies and the European Medicines Agency to continuously adapt its analytical services to requests of regulators. EU-NCL is designed, organized and operated according to the highest EU regulatory and quality standards. «We are excited to be part of this cooperative project between Europe and the U.S.,» says Scott E. McNeil, director of US-NCL. «We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.»

Nine partners from eight countries

EU-NCL, which is funded by the EU for a four-year period with nearly 5 million Euros, brings together nine partners from eight countries: CEA-Tech in Leti and Liten, France, the coordinator of the project; the Joint Research Centre of the European Commission in Ispra, Italy; European Research Services GmbH in Münster Germany; Leidos Biomedical Research, Inc. in Frederick, USA; Trinity College in Dublin, Ireland; SINTEF in Oslo, Norway; the University of Liverpool in the UK; Empa, the Swiss Federal Laboratories for Materials Science and Technology in St. Gallen, Switzerland; Westfälische Wilhelms-Universität (WWU) and Gesellschaft für Bioanalytik, both in Münster, Germany. Together, the partnering institutions will provide a trans-disciplinary testing infrastructure covering a comprehensive set of preclinical characterization assays (physical, chemical, in vitro and in vivo biological testing), which will allow researchers to fully comprehend the biodistribution, metabolism, pharmacokinetics, safety profiles and immunological effects of their medicinal nano-products. The project will also foster the use and deployment of standard operating procedures (SOPs), benchmark materials and quality management for the preclinical characterization of medicinal nano-products. Yet another objective is to promote intersectoral and interdisciplinary communication among key drivers of innovation, especially between developers and regulatory agencies.

The goal: to bring safe and efficient nano-therapeutics faster to the patient

Within EU-NCL, six analytical facilities will offer transnational access to their existing analytical services for public and private developers, and will also develop new or improved analytical assays to keep EU-NCL at the cutting edge of nanomedicine characterization. A complementary set of networking activities will enable EU-NCL to deliver to European academic or industrial scientists the high-quality analytical services they require for accelerating the industrial development of their candidate nanomedicines. The Empa team of Peter Wick at the «Particles-Biology Interactions» lab will be in charge of the quality management of all analytical methods, a key task to guarantee the best possible reproducibility and comparability of the data between the various analytical labs within the consortium. «EU-NCL supports our research activities in developing innovative and safe nanomaterials for healthcare within an international network, which will actively shape future standards in nanomedicine and strengthen Empa as an enabler to facilitate the transfer of novel nanomedicines from bench to bedside», says Wick.

You can find more information about the laboratory on the Horizon 2020 (a European Union science funding programme) project page for the EU-NCL laboratory. For anyone curious about CEA-Leti, it’s a double-layered organization. CEA is France’s Alternative Energies and Atomic Energy Commission (Commissariat à l’énergie atomique et aux énergies alternatives); you can go here to their French language site (there is an English language clickable option on the page). Leti is one of the CEA’s institutes and is known as either Leti or CEA-Leti. I have no idea what Leti stands for. Here’s the Leti website (this is the English language version).

LiquiGlide, a nanotechnology-enabled coating for food packaging and oil and gas pipelines

Getting condiments out of their bottles should be a lot easier in several European countries in the near future. A June 30, 2015 news item on Nanowerk describes the technology and the business deal (Note: A link has been removed),

The days of wasting condiments — and other products — that stick stubbornly to the sides of their bottles may be gone, thanks to MIT [Massachusetts Institute of Technology] spinout LiquiGlide, which has licensed its nonstick coating to a major consumer-goods company.

Developed in 2009 by MIT’s Kripa Varanasi and David Smith, LiquiGlide is a liquid-impregnated coating that acts as a slippery barrier between a surface and a viscous liquid. Applied inside a condiment bottle, for instance, the coating clings permanently to its sides, while allowing the condiment to glide off completely, with no residue.

In 2012, amidst a flurry of media attention following LiquiGlide’s entry in MIT’s $100K Entrepreneurship Competition, Smith and Varanasi founded the startup — with help from the Institute — to commercialize the coating.

Today [June 30, 2015], Norwegian consumer-goods producer Orkla has signed a licensing agreement to use LiquiGlide’s coating for mayonnaise products sold in Germany, Scandinavia, and several other European nations. This comes on the heels of another licensing deal, with Elmer’s [Elmer’s Glue & Adhesives], announced in March [2015].

A June 30, 2015 MIT news release, which originated the news item, provides more details about the researcher/entrepreneurs’ plans,

But this is only the beginning, says Varanasi, an associate professor of mechanical engineering who is now on LiquiGlide’s board of directors and chief science advisor. The startup, which just entered the consumer-goods market, is courting deals with numerous producers of foods, beauty supplies, and household products. “Our coatings can work with a whole range of products, because we can tailor each coating to meet the specific requirements of each application,” Varanasi says.

Apart from providing savings and convenience, LiquiGlide aims to reduce the surprising amount of wasted products — especially food — that stick to container sides and get tossed. For instance, in 2009 Consumer Reports found that up to 15 percent of bottled condiments are ultimately thrown away. Keeping bottles clean, Varanasi adds, could also drastically cut the use of water and energy, as well as the costs associated with rinsing bottles before recycling. “It has huge potential in terms of critical sustainability,” he says.

Varanasi says LiquiGlide aims next to tackle buildup in oil and gas pipelines, which can cause corrosion and clogs that reduce flow. [emphasis mine] Future uses, he adds, could include coatings for medical devices such as catheters, deicing roofs and airplane wings, and improving manufacturing and process efficiency. “Interfaces are ubiquitous,” he says. “We want to be everywhere.”

The news release goes on to describe the research process in more detail and offers a plug for MIT’s innovation efforts,

LiquiGlide was originally developed while Smith worked on his graduate research in Varanasi’s research group. Smith and Varanasi were interested in preventing ice buildup on airplane surfaces and methane hydrate buildup in oil and gas pipelines.

Some initial work was on superhydrophobic surfaces, which trap pockets of air and naturally repel water. But both researchers found that these surfaces don’t, in fact, shed every bit of liquid. During phase transitions — when vapor turns to liquid, for instance — water droplets condense within microscopic gaps on surfaces, and steadily accumulate. This leads to loss of anti-icing properties of the surface. “Something that is nonwetting to macroscopic drops does not remain nonwetting for microscopic drops,” Varanasi says.

Inspired by the work of researcher David Quéré, of ESPCI in Paris, on slippery “hemisolid-hemiliquid” surfaces, Varanasi and Smith invented permanently wet “liquid-impregnated surfaces” — coatings that don’t have such microscopic gaps. The coatings consist of textured solid material that traps a liquid lubricant through capillary and intermolecular forces. The coating wicks through the textured solid surface, clinging permanently under the product, allowing the product to slide off the surface easily; other materials can’t enter the gaps or displace the coating. “One can say that it’s a self-lubricating surface,” Varanasi says.

Mixing and matching the materials, however, is a complicated process, Varanasi says. Liquid components of the coating, for instance, must be compatible with the chemical and physical properties of the sticky product, and generally immiscible. The solid material must form a textured structure while adhering to the container. And the coating can’t spoil the contents: Foodstuffs, for instance, require safe, edible materials, such as plants and insoluble fibers.

To help choose ingredients, Smith and Varanasi developed the basic scientific principles and algorithms that calculate how the liquid and solid coating materials, and the product, as well as the geometry of the surface structures will all interact to find the optimal “recipe.”

Today, LiquiGlide develops coatings for clients and licenses the recipes to them. Included are instructions that detail the materials, equipment, and process required to create and apply the coating for their specific needs. “The state of the coating we end up with depends entirely on the properties of the product you want to slide over the surface,” says Smith, now LiquiGlide’s CEO.

Having researched materials for hundreds of different viscous liquids over the years — from peanut butter to crude oil to blood — LiquiGlide also has a database of optimal ingredients for its algorithms to pull from when customizing recipes. “Given any new product you want LiquiGlide for, we can zero in on a solution that meets all requirements necessary,” Varanasi says.

MIT: A lab for entrepreneurs

For years, Smith and Varanasi toyed around with commercial applications for LiquiGlide. But in 2012, with help from MIT’s entrepreneurial ecosystem, LiquiGlide went from lab to market in a matter of months.

Initially the idea was to bring coatings to the oil and gas industry. But one day, in early 2012, Varanasi saw his wife struggling to pour honey from its container. “And I thought, ‘We have a solution for that,’” Varanasi says.

The focus then became consumer packaging. Smith and Varanasi took the idea through several entrepreneurship classes — such as 6.933 (Entrepreneurship in Engineering: The Founder’s Journey) — and MIT’s Venture Mentoring Service and Innovation Teams, where student teams research the commercial potential of MIT technologies.

“I did pretty much every last thing you could do,” Smith says. “Because we have such a brilliant network here at MIT, I thought I should take advantage of it.”

That May [2012], Smith, Varanasi, and several MIT students entered LiquiGlide in the MIT $100K Entrepreneurship Competition, earning the Audience Choice Award — and the national spotlight. A video of ketchup sliding out of a LiquiGlide-coated bottle went viral. Numerous media outlets picked up the story, while hundreds of companies reached out to Varanasi to buy the coating. “My phone didn’t stop ringing, my website crashed for a month,” Varanasi says. “It just went crazy.”

That summer [2012], Smith and Varanasi took their startup idea to MIT’s Global Founders’ Skills Accelerator program, which introduced them to a robust network of local investors and helped them build a solid business plan. Soon after, they raised money from family and friends, and won $100,000 at the MassChallenge Entrepreneurship Competition.

When LiquiGlide Inc. launched in August 2012, clients were already knocking down the door. The startup chose a select number to pay for the development and testing of the coating for its products. Within a year, LiquiGlide was cash-flow positive, and had grown from three to 18 employees in its current Cambridge headquarters.

Looking back, Varanasi attributes much of LiquiGlide’s success to MIT’s innovation-based ecosystem, which promotes rapid prototyping for the marketplace through experimentation and collaboration. This ecosystem includes the Deshpande Center for Technological Innovation, the Martin Trust Center for MIT Entrepreneurship, the Venture Mentoring Service, and the Technology Licensing Office, among other initiatives. “Having a lab where we could think about … translating the technology to real-world applications, and having this ability to meet people, and bounce ideas … that whole MIT ecosystem was key,” Varanasi says.

Here’s the latest LiquiGlide video,


Credits:

Video: Melanie Gonick/MIT
Additional footage courtesy of LiquiGlide™
Music sampled from “Candlepower” by Chris Zabriskie
https://freemusicarchive.org/music/Ch…
http://creativecommons.org/licenses/b…

I had thought the EU (European Union) offered more roadblocks to marketing nanotechnology-enabled products used in food packaging than the US. If anyone knows why a US company would market its products in Europe first I would love to find out.

Construction and nanotechnology research in Scandinavia

I keep hearing about the possibilities for better (less polluting, more energy efficient, etc.) building construction materials, but there never seems to be much progress. A June 15, 2015 news item on Nanowerk, which suggests some serious efforts are being made in Scandinavia, may help to explain the delay,

It isn’t cars and vehicle traffic that produce the greatest volumes of climate gas emissions – it’s our own homes. But new research will soon be putting an end to all that!

The building sector is currently responsible for 40% of global energy use and climate gas emissions. This is an under-communicated fact in a world where vehicle traffic and exhaust emissions get far more attention.

In the future, however, we will start to see construction materials and high-tech systems integrated into building shells that are specifically designed to remedy this situation. Such systems will be intelligent and multifunctional. They will consume less energy and generate lower levels of harmful climate gas emissions.

With this objective in mind, researchers at SINTEF are currently testing microscopic nanoparticles as insulation materials, applying voltages to window glass and facades as a means of saving energy, and developing solar cells that prevent the accumulation of snow and ice.

Research Director Susie Jahren and Research Manager Petra Rüther are heading SINTEF’s strategic efforts in the field of future construction materials. They say that although there are major commercial opportunities available in the development of green and low carbon building technologies, the construction industry is somewhat bound by tradition and unable to pay for research into future technology development. [emphasis mine]

A June 15, 2015 SINTEF (Scandinavia’s largest independent research organisation) news release on the Alpha Galileo website, which originated the news item, provides an overview of the research being conducted into nanotechnology-enabled construction materials (Note: I have added some subheadings and ruthlessly trimmed the text),

[Insulation]

SINTEF researcher Bente Gilbu Tilset is sitting in her office in Forskningsveien 1 in Oslo [Norway]. She and her colleagues are looking into the manufacture of super-insulation materials made up of microscopic nanospheres.

“Our aim is to create a low thermal conductivity construction material”, says Tilset. “When gas molecules collide, energy is transferred between them. If the pores in a given material are small enough, for example less than 100 nanometres in diameter, a molecule will collide more often with the pore walls than with other gas molecules. This will effectively reduce the thermal conductivity of the gas. So, the smaller the pores, the lower the conductivity of the gas”, she says.
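The effect Tilset is describing is usually called the Knudsen effect, and a standard textbook estimate gives a feel for the numbers. The coefficient and the mean free path used below are typical approximations for air, not figures from SINTEF,

    # Hedged sketch of the Knudsen effect: when pore size approaches the mean
    # free path of air (~70 nm at ambient conditions), gas conduction drops.
    # A common estimate is k_eff = k_air / (1 + 2*beta*Kn), with beta ~ 1.6.
    K_AIR = 0.026              # W/(m*K), still air near room temperature
    MEAN_FREE_PATH = 70e-9     # metres, approximate for ambient air
    BETA = 1.6

    def gas_conductivity(pore_diameter_m):
        kn = MEAN_FREE_PATH / pore_diameter_m       # Knudsen number
        return K_AIR / (1.0 + 2.0 * BETA * kn)

    for d in (1e-6, 100e-9, 20e-9):                 # 1 um, 100 nm and 20 nm pores
        print(f"{d*1e9:6.0f} nm pores: ~{gas_conductivity(d)*1000:.1f} mW/(m*K)")

With 20-nanometre pores, the gas contribution drops to roughly a tenth of the conductivity of still air, which is the point of building insulation from nanospheres in the first place.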

[Solar cells]

As part of the project “Bygningsintegrerte solceller for Norge” (Building Integrated Photovoltaics, BIPV Norway), researchers from SINTEF, NTNU, the IFE [Institute for Energy Technology, Norway] and Teknova [company created by the Nordic Institute for Studies in Innovation {NIFU}, located in Norway], are planning to look into how we can utilise solar cells as integral housing construction components, and how they can be adapted to Norwegian daylight and climatic conditions.

One of the challenges is to develop a solar cell which prevents the accumulation of snow and ice. The cells must be robust enough to withstand harsh wind and weather conditions and have lifetimes that enable them to function as electricity generators.

[Energy]

Today, we spend 90 per cent of our time indoors. This is as much as three times more than in the 1950s. We are also letting less daylight into our buildings as a result of energy considerations and construction engineering requirements. Research shows that daylight is very important to our health, well-being and biological rhythms. It also promotes productivity and learning. So the question is – is it possible to save energy and get the benefits of greater exposure to daylight?

Technologies involving thermochromic, photochromic and electrochromic pigments can help us to control how sunlight enters our buildings, all according to our requirements for daylight and warmth from the sun.

Self-healing concrete

Every year, between 40 and 120 million Euros are spent in Europe on the maintenance of bridges, tunnels and construction walls. These time-consuming and costly activities have to be reduced, and the project CAPDESIGN is aiming to make a contribution in this field.

The objective of the project is to produce concrete that can be ‘restored’ after being exposed to loads and stresses by means of self-healing agents that prevent the formation of cracks. The method involves mixing small capsules into the wet concrete before it hardens. These remain in the matrix until loads or other factors threaten to crack it. The capsules then burst and the self-healing agents are released to repair the structure.

At SINTEF, researchers are working with the material that makes up the capsule shells. The shell has to be able to protect the self-healing agent in the capsules for an extended period and then, under the right conditions, break down and release the agents in response to the formation of cracks caused by temperature, pH, or a load or stress resulting from an impact or shaking. At the same time, the capsules must not impair the ductility or the mechanical properties of the newly-mixed concrete.

You’ll notice most of the research seems to be taking place in Norway. I suspect that is because the story comes from Gemini.no/en, a joint Norwegian University of Science and Technology (NTNU)/SINTEF website. Anyone wishing to test their Norwegian reading skills need only omit ‘/en’ from the URL.

Gender gaps in science and how statistics prove and disprove the finding

A Feb. 17, 2015 Northwestern University news release by Hilary Hurd Anyaso (also on EurekAlert) features research suggesting that parity in the numbers of men and women students pursuing science degrees is being achieved,

Scholars from diverse fields have long proposed that interlocking factors such as cognitive abilities, discrimination and interests may cause more women than men to leave the science, technology, engineering and mathematics (STEM) pipeline after entering college.

Now a new Northwestern University analysis has poked holes in the much referenced “leaky pipeline” metaphor.

The research shows that the bachelor’s-to-Ph.D. pipeline in science and engineering fields no longer leaks more women than men as it did in the past.

Curt Rice, a professor at Norway’s University of Tromsø, has challenged the findings in a Feb. 18, 2015 post on his eponymous website (more about that later).

The news release goes on to describe how the research was conducted and the conclusions researchers drew from the data,

The researchers used data from two large nationally representative research samples to reconstruct a 30-year portrait of how bachelor’s-to-Ph.D. persistence rates for men and women have changed in the United States since the 1970s. For this study, the term STEM persistence rate refers to the proportion of students who earned a Ph.D. in a particular STEM field (e.g. engineering) among students who had earlier received bachelor’s degrees in that same field.

They were particularly surprised that the gender persistence gap completely closed in pSTEM fields (physical science, technology, engineering and mathematics) — the fields in which women are most underrepresented.

Among students earning pSTEM bachelor’s degrees in the 1970s, men were 1.6 to 1.7 times as likely as women to later earn a pSTEM Ph.D. However, this gap completely closed by the 1990s.

Men still outnumber women by approximately three to one among pSTEM Ph.D. earners. But those differences in representation are not explained by differences in persistence from the bachelor’s to Ph.D. degree, said David Miller, an advanced doctoral student in psychology at Northwestern and lead author of the study.

“Our analysis shows that women are overcoming any potential gender biases that may exist in graduate school or undergraduate mentoring about pursuing graduate school,” Miller said. “In fact, the percentage of women among pSTEM degree earners is now higher at the Ph.D. level than at the bachelor’s, 27 percent versus 25 percent.”

Jonathan Wai, a Duke University Talent Identification Program research scientist and co-author of the study, said a narrowing of gender gaps makes sense given increased efforts to promote gender diversity in science and engineering.

“But a complete closing of the gap was unexpected, especially given recent evidence of gender bias in science mentoring,” Wai said.

Consequently, the widely used leaky pipeline metaphor is a dated description of gender differences in postsecondary STEM education, Wai added.

Other research shows that gaps in persistence rates are also small to nonexistent past the Ph.D., Miller said.

“For instance, in physical science and engineering fields, male and female Ph.D. holders are equally likely to earn assistant professorships and academic tenure,” Miller said.

The leaky pipeline metaphor is inaccurate for nearly all postsecondary pathways in STEM, Miller said, with two important exceptions.

“The Ph.D.-to-assistant-professor pipeline leaks more women than men in life science and economics,” he said. “Differences in those fields are large and important.”

The implications of the research, Miller said, are important in guiding research, resources and strategies to explain and change gender imbalances in science.

“The leaking pipeline metaphor could potentially direct thought and resources away from other strategies that could more potently increase women’s representation in STEM,” he said.

For instance, plugging leaks in the pipeline from the beginning of college to the bachelor’s degree would fail to substantially increase women’s representation among U.S. undergraduates in the pSTEM fields, Miller said.

Of concern, women’s representation among pSTEM bachelor’s degrees has been decreasing during the past decade, Miller noted. “Our analyses indicate that women’s representation at the Ph.D. level is starting to follow suit by declining for the first time in over 40 years,” he said.

“This recent decline at the Ph.D. level could likely mean that women’s progress at the assistant professor level might also slow down or reverse in future years, so these trends will need to be watched closely,” Wai said.

While the researchers are encouraged that gender gaps in doctoral persistence have closed, they stressed that accurately assessing and changing gender biases in science should remain an important goal for educators and policy makers.

Before moving on to Rice’s comments, here’s a link to and citation for the paper,

The bachelor’s to Ph.D. STEM pipeline no longer leaks more women than men: a 30-year analysis by David I. Miller and Jonathan Wai. Front. Psychol., 17 February 2015, doi: 10.3389/fpsyg.2015.00037

This paper is open access (at least for now).

Maybe the situation isn’t improving after all

Curt Rice’s response titled, The incontinent pipeline: it’s not just women leaving higher education, suggests this latest research has unmasked a problem (Note: Links have been removed),

Freshly published research gives a more nuanced picture. The traditional recitation of percentages at various points along the pipeline provides a snapshot. The new research is more like a time-lapse film.

Unfortunately, the new study doesn’t actually show a pipeline being tightened up to leak less. Instead, it shows a pipeline that is leaking even more! The convergence in persistence rates for men and women is not a result of an increase in the rate of women taking a PhD; it’s the result of a decline in the rate of men doing so. It’s as though the holes have gotten bigger — they used to be so small that only women slipped through, but now men slide out, too.

Rice believes that this improvement is ‘relative improvement’, i.e., improvement that exists only in relation to the declining numbers of men, a statistic to which Rice gives more weight than the Northwestern researchers appear to have done. ‘Absolute improvement’ would mean that the number of women studying in the field had increased while men’s numbers had held steady or increased as well.
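A toy numerical example makes Rice’s distinction concrete. The rates below are invented to keep the arithmetic obvious; they are not the study’s figures, although the 1.6 ratio deliberately echoes the 1970s gap quoted above,

    # Invented numbers illustrating how a persistence gap can "close" because
    # men's rate falls rather than because women's rate rises.
    def persistence(phd_earners, bachelors_earners):
        return phd_earners / bachelors_earners

    # hypothetical 1970s-style cohorts of 100 bachelor's graduates each
    men_early, women_early = persistence(16, 100), persistence(10, 100)
    # hypothetical later cohorts: women unchanged, men declined
    men_late, women_late = persistence(10, 100), persistence(10, 100)

    print(men_early / women_early)   # 1.6 -> men 1.6x as likely to persist
    print(men_late / women_late)     # 1.0 -> gap closed with no absolute gain for women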

To be fair, the authors of the paper seem to have taken at least some of this decline in men’s numbers into account (from the research paper),

Reasons for the convergences in persistence rates remain unclear. Sometimes the convergence was driven by declines in men’s rates (e.g., in mathematics/computer science), increases in women’s rates (e.g., in physical science), or both (e.g., in engineering). […]

Overenthusiasm in the news release

Unfortunately, the headline and bullet list of highlights suggest a more ebullient research conclusion than seems warranted by the actual research results.

Think again about gender gap in science
Bachelor’s-to-Ph.D. pipeline in science, engineering no longer ‘leaks’ more women than men, new 30-year analysis finds

Research shows dated ‘leaky pipeline’ assumptions about gender imbalances in science

  • Men outnumber women as Ph.D. earners in science but no longer in doctoral persistence
  • Dramatic increase of women in science at Ph.D., assistant professorship levels since 1970s, but recent decline since 2010 may be of concern for future supply of female scientists
  • Assessing inaccurate assumptions key to correcting gender biases in science

Here’s the researchers’ conclusion,

Overall, these results and supporting literature point to the need to understand gender differences at the bachelor’s level and below to understand women’s representation in STEM at the Ph.D. level and above. Women’s representation in computer science, engineering, and physical science (pSTEM) fields has been decreasing at the bachelor’s level during the past decade. Our analyses indicate that women’s representation at the Ph.D. level is starting to follow suit by declining for the first time in over 40 years (Figure 2). This recent decline may also cause women’s gains at the assistant professor level and beyond to also slow down or reverse in the next few years. Fortunately, however, pathways for entering STEM are considerably diverse at the bachelor’s level and below. For instance, our prior research indicates that undergraduates who join STEM from a non-STEM field can substantially help the U.S. meet needs for more well-trained STEM graduates (Miller et al., under review). Addressing gender differences at the bachelor’s level could have potent effects at the Ph.D. level, especially now that women and men are equally likely to later earn STEM Ph.D.’s after the bachelor’s.

The conclusion seems to contradict the researchers’ statements in the news release,

“But a complete closing of the gap was unexpected, especially given recent evidence of gender bias in science mentoring,” Wai said.

Consequently, the widely used leaky pipeline metaphor is a dated description of gender differences in postsecondary STEM education, Wai added.

Other research shows that gaps in persistence rates are also small to nonexistent past the Ph.D., Miller said.

Incomplete pipeline

Getting back to Rice, he notes the pipeline in the Northwestern paper is incomplete (Note: Links have been removed),

In addition to the dubious celebration of the decline of persistence rates of men, the new research article also looks at an incomplete pipeline. In particular, it leaves aside the important issue of which PhD institutions students get into. For young researchers moving towards academic careers, we know that a few high-prestige universities are responsible for training future faculty members at nearly all other research universities. Are women and men getting into those high prestige universities in the same numbers? Or do women go to lower prestige institutions?

Following on that thought about lower prestige institutions and their impact on careers, there’s a Feb. 23, 2015 article by Joel Warner and Aaron Clauset in Slate investigating the situation, one which affects both men and women,

The United States prides itself on offering broad access to higher education, and thanks to merit-based admissions, ample financial aid, and emphasis on diverse student bodies, our country can claim some success in realizing this ideal.

The situation for aspiring professors is far grimmer. Aaron Clauset, a co-author of this article, is the lead author of a new study published in Science Advances that scrutinized more than 16,000 faculty members in the fields of business, computer science, and history at 242 schools. He and his colleagues found, as the paper puts it, a “steeply hierarchical structure that reflects profound social inequality.” The data revealed that just a quarter of all universities account for 71 to 86 percent of all tenure-track faculty in the U.S. and Canada in these three fields. Just 18 elite universities produce half of all computer science professors, 16 schools produce half of all business professors, and eight schools account for half of all history professors.

Then, Warner and Clauset said this about gender bias,

Here’s further evidence that the current system isn’t merely sorting the best of the best from the merely good. Female graduates of elite institutions tend to slip 15 percent further down the academic hierarchy than do men from the same institutions, evidence of gender bias to go along with the bias toward the top schools.
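For anyone curious about how a statistic like “a quarter of universities account for most of the faculty” is derived, here is a minimal Python sketch; the study examined more than 16,000 real faculty placements, whereas the institutions and counts below are entirely invented for illustration,

# Hypothetical counts of Ph.D. graduates from each institution who were hired
# into tenure-track posts; the institutions and numbers are invented for illustration.
placements = {"Univ A": 120, "Univ B": 80, "Univ C": 30, "Univ D": 15, "Univ E": 5}

total = sum(placements.values())
running, producers_for_half = 0, 0
for school, count in sorted(placements.items(), key=lambda kv: kv[1], reverse=True):
    running += count
    producers_for_half += 1
    if running >= total / 2:
        break

print(producers_for_half, "of", len(placements), "institutions account for half of all hires")
# With these invented numbers, 2 of 5 institutions (40%) supply half the faculty;
# the Science Advances study found far steeper concentration in its real placement data.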

I suggest reading the Slate article, Rice’s post, and, if you have time, the Northwestern University research paper.

Coda: All about Curt Rice

Finally, this is for anyone who’s unfamiliar with Curt Rice (from the About page on his website; Note: Links have been removed),

In addition to my work as a professor at the University of Tromsø, I have three other roles that are closely related to the content on this website. I was elected by the permanent faculty to sit on the university board, I lead Norway’s Committee on Gender Balance and Diversity in Research, and I am the head of the Board for Current Research Information System in Norway (CRIStin). In all of these roles, I work to pursue my conviction that research and education are essential to improving society, and that making universities better therefore has the potential to make societies better.

I’m currently writing a book on gender balance. Why do men and women have different career paths? Why should we care? How can we start to make things better? Why is improving gender balance not only the right thing to do, but also the smart thing to do? For a taste of my approach, grab a copy of my free ebook on gender equality.

Beyond this book project, I use my speaking and writing engagements to reach audiences on the topics that excite me the most: gender balance, open access, leadership issues and more. These interests have grown during the past decade while I’ve had the privilege to occupy what were then two brand new leadership positions at the University of Tromsø.

From 2009–2013, I served as the elected Vice Rector for Research & Development (prorektor for forskning og utvikling). Before that, from 2002–2008, I was the founding director of my university’s first Norwegian Center of Excellence, the Center for Advanced Study in Theoretical Linguistics (CASTL). Given the luxury of being able to define those positions, I was able to pursue my passion for improving academic life by working to enhance conditions for education and research.

I’m part of the European Science Foundation’s genderSTE COST action (Gender, Science, Technology and Environment); I helped create the BALANSE program at the Research Council of Norway, which is designed to increase the numbers of women at the highest levels of research organizations. I am on the Advisory Board of the European Commission project EGERA (Effective Gender Equality in Research and Academia); I was on the Science Leaders Panel of the genSET project, in which we advised the European Commission about gender in science; I am a member of the Steering Committee for the Gender Summits.

I also led a national task force on research-based education that issued many suggestions for Norwegian institutions.

The quantum chemistry of nanomedicines

A Jan. 29, 2015 news item on Nanowerk provides an overview of the impact quantum chemical reactions may have on nanomedicines. Intriguingly, this line of inquiry started with computations of white dwarf stars,

Quantum chemical calculations have been used to solve big mysteries in space. Soon the same calculations may be used to produce tomorrow’s cancer drugs.

Some years ago research scientists at the University of Oslo in Norway were able to show that the chemical bonding in the magnetic fields of small, compact stars, so-called white dwarf stars, is different from that on Earth. Their calculations pointed to a completely new bonding mechanism between two hydrogen atoms. The news attracted great attention in the media. The discovery, which in fact was made before astrophysicists themselves observed the first hydrogen molecules in white dwarf stars, was made by UiO’s Centre for Theoretical and Computational Chemistry. They based their work on accurate quantum chemical calculations of what happens when atoms and molecules are exposed to extreme conditions.

A Jan. 29, 2015 University of Oslo press release by Yngve Vogt, which originated the news item, offers a substantive description of molecules, electrons, and more for those of us whose last chemistry class is lost in the mists of time,

The research team is headed by Professor Trygve Helgaker, who for the last thirty years has taken the international lead on the design of a computer system for calculating quantum chemical reactions in molecules.

Quantum chemical calculations are needed to explain what happens to the electrons’ trajectories within a molecule.

Consider what happens when UV radiation sends energy-rich photons into your cells. This increases the energy level of the molecules. The outcome may well be that some of the molecules break up. This is exactly what happens when you sun-bathe.

“The extra energy will affect the behaviour of electrons and can destroy the chemical bonding within the molecule. This can only be explained by quantum chemistry. The quantum chemical models are used to produce a picture of the forces and tensions at play between the atoms and the electrons of a molecule, and of what is required for a molecule to dissociate,” says Trygve Helgaker.

The absurd world of the electrons

The quantum chemical calculations solve the Schrödinger equation for molecules. This equation is fundamental to all chemistry and describes the whereabouts of all electrons within a molecule. But here we need to pay attention, for things are really rather more complicated than that. Your high school physics teacher will have told you that electrons circle the atom. Things are not that simple, though, in the world of quantum physics. Electrons are not only particles, but waves as well. The electrons can be in many places at the same time. It’s impossible to keep track of their position. However, there is hope. Quantum chemical models describe the electrons’ statistical positions. In other words, they can establish the probable location of each electron.

The results of a quantum chemical calculation are often more accurate than what is achievable experimentally.

Among other things, the quantum chemical calculations can be used to predict chemical reactions. This means that the chemists will no longer have to rely on guesstimates in the lab. It is also possible to use quantum chemical calculations in order to understand what happens in experiments.

Enormous calculations

The calculations are very demanding.

“The Schrödinger equation is a highly complicated, partial differential equation, which cannot be accurately solved. Instead, we need to make do with heavy simulations”, says researcher Simen Kvaal.

The computations are so demanding that the scientists use one of the University’s fastest supercomputers.

“We are constantly stretching the boundaries of what is possible. We are restricted by the available machine capacity,” explains Helgaker.

Ten years ago it took two weeks to carry out the calculations for a molecule with 140 atoms. Now it can be done in two minutes.

“That’s 20,000 times faster than ten years ago. The computation process is now running 200 times faster because the computers have been doubling their speed every eighteen months. And the process is a further 100 times faster because the software has been undergoing constant improvement,” says senior engineer Simen Reine.

This year the research group has used 40 million CPU hours, of which twelve million were on the University’s supercomputer, which is fitted with ten thousand parallel processors. This allows ten thousand CPU hours to be over and done with in 60 minutes.

“We will always fill the computer’s free capacity. The higher the computational capacity, the bigger and more reliable the calculations.”

Thanks to ever faster computers, the quantum chemists are able to study ever larger molecules.

Today, it’s routine to carry out a quantum chemical calculation of what happens within a molecule of up to 400 atoms. By using simplified models it is possible to study molecules with several thousand atoms. This does, however, mean that some of the effects within the molecule are not being described in detail.

The researchers are now getting close to a level which enables them to study the quantum mechanics of living cells.

“This is exciting. The molecules of living cells may contain many hundred thousand atoms, but there is no need to describe the entire molecule using quantum mechanical principles. Consequently, we are already at a stage when we can help solve biological problems.”
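For readers who would like to see the equation the researchers keep referring to, the time-independent Schrödinger equation for the N electrons of a molecule can be written as

\hat{H}\,\Psi(\mathbf{r}_1, \ldots, \mathbf{r}_N) = E\,\Psi(\mathbf{r}_1, \ldots, \mathbf{r}_N)

where \hat{H} is the molecular Hamiltonian (the electrons’ kinetic energy plus their Coulomb attraction to the nuclei and repulsion among themselves), E is the electronic energy, and |\Psi|^2 gives the probability of finding the electrons at positions \mathbf{r}_1 through \mathbf{r}_N, which is what the press release means by the electrons’ ‘statistical positions’. Because the repulsion term couples all of the electrons, the equation can be solved exactly only for the very simplest systems; everything larger needs the approximations and heavy simulations described above.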

There’s more from the press release which describes how this work could be applied in the future,

Hunting for the electrons of the insulin molecule

The chemists are thus able to combine sophisticated models with simpler ones. “This will always be a matter of what level of precision and detail you require. The optimal approach would have been to use the Schrödinger equation for everything.”

By way of compromise they can give a detailed description of every electron in some parts of the model, while in other parts they are only looking at average numbers.

Simen Reine has been using the team’s computer program, while working with Aarhus University [Denmark], on a study of the insulin molecule. An insulin molecule consists of 782 atoms and 3,500 electrons.

“All electrons repel each other, while at the same time being pulled towards the atomic nuclei. The nuclei also repel each other. Nevertheless, the molecule remains stable. In order to study a molecule to a high level of precision, we therefore need to consider how all of the electrons move relative to one another. Such calculations are referred to as correlated and are very reliable.”

A complete correlated calculation of the insulin molecule takes nearly half a million CPU hours. If they were given the opportunity to run the program on the entire University’s supercomputer, the calculations would theoretically take two days.

“In ten years, we’ll be able to make these calculations in two minutes.”

Medically important

“Quantum chemical calculations can help describe phenomena at a level that may be difficult to access experimentally, but may also provide support for interpreting and planning experiments. Today, the calculations will be put to best use within the fields of molecular biology and biochemistry,” says Knut Fægri [vice-rector at the University of Oslo].

“Quantum chemistry is a fundamental theory which is important for explaining molecular events, which is why it is essential to our understanding of biological systems,” says [Associate Professor] Michele Cascella.

By way of an example, he refers to the analysis of enzymes. Enzymes are molecular catalysts that boost the chemical reactions within our cells.

Cascella also points to nanomedicines, which are drugs tasked with distributing medicine round our bodies in a much more accurate fashion.

“In nanomedicine we need to understand physical phenomena on a nano scale, forming as correct a picture as possible of molecular phenomena. In this context, quantum chemical calculations are important,” explains Michele Cascella.

Proteins and enzymes

Professor K. Kristoffer Andersson at the Department of Biosciences uses the simpler form of quantum chemical calculations to study the details of protein structures and the chemical atomic and electronic functions of enzymes.

“It is important to understand the chemical reaction mechanism, and how enzymes and proteins work. Quantum chemical calculations will teach us more about how proteins go about their tasks, step by step. We can also use the calculations to look at activation energy, i.e. how much energy is required to reach a certain state. It is therefore important to understand the chemical reaction patterns in biological molecules in order to develop new drugs,” says Andersson.

His research will also be useful in the search for cancer drugs. He studies radicals, which may be important to cancer. Among other things, he is looking at the function of metal ions in proteins. These are ions with a large number of protons, neutrons and electrons.

Photosynthesis

Professor Einar Uggerud at the Department of Chemistry has uncovered an entirely new form of chemical bonding through sophisticated experiments and quantum chemical calculations.

Working with research fellow Glenn Miller, Professor Uggerud has found an unusually fragile key molecule, in a kite-shaped structure, consisting of magnesium, carbon and oxygen. The molecule may provide a new understanding of photosynthesis. Photosynthesis, which forms the basis for all life, converts CO2 into sugar molecules.

The molecule reacts so fast with water and other molecules that it has only been possible to study in isolation from other molecules, in a vacuum chamber.

“Time will tell whether the molecule really has an important connection with photosynthesis,” says Einar Uggerud.

I’m delighted with this explanation as it corrects my understanding of chemical bonds and helps me to better understand computational chemistry. Thank you University of Oslo and Yngve Vogt.
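As a side note, the computing figures quoted above hang together arithmetically; here is a quick back-of-the-envelope check in Python using only the numbers given in the press release,

# Numbers quoted in the University of Oslo press release.
processors = 10_000                  # parallel processors on the university's supercomputer
cpu_hours_per_wall_hour = processors * 1
print(cpu_hours_per_wall_hour)       # 10,000 CPU hours completed per hour of wall-clock time

insulin_cpu_hours = 500_000          # 'nearly half a million CPU hours' for a correlated calculation
wall_hours = insulin_cpu_hours / processors
print(wall_hours, "hours, i.e. roughly", round(wall_hours / 24, 1), "days")
# About 50 hours, which matches the quoted estimate of roughly two days
# if the whole machine could be devoted to the insulin molecule.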

Finally, here’s a representation of an insulin molecule as understood by quantum computation,

[Image: quantum chemical model of an insulin molecule]

INSULIN: Working with Aarhus University, Simen Reine has calculated the tensions between the electrons and atoms of an insulin molecule. An insulin molecule consists of 782 atoms and 3,500 electrons. Illustration: Simen Reine-UiO

 

PlasCarb: producing graphene and renewable hydrogen from food waste

I have two tidbits about PlasCarb, the first being an announcement of its existence and the second an announcement of its recently published research. A Jan. 13, 2015 news item on Nanowerk describes the PlasCarb project (Note: A link has been removed),

The Centre for Process Innovation (CPI) is leading a European collaborative project that aims to transform food waste into a sustainable source of significant economic added value, namely graphene and renewable hydrogen.

The project titled PlasCarb will transform biogas generated by the anaerobic digestion of food waste using an innovative low energy microwave plasma process to split biogas (methane and carbon dioxide) into high value graphitic carbon and renewable hydrogen.
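The announcement doesn’t spell out the chemistry, but a minimal sketch of the idealised mass balance, assuming the methane fraction of the biogas is split cleanly into solid carbon and hydrogen (CH4 -> C + 2 H2), gives a feel for the quantities involved; the figures below are illustrative only and are not PlasCarb’s actual yields,

# Idealised stoichiometry for methane splitting: CH4 -> C + 2 H2.
# Illustrative only; the real process handles biogas (CH4 + CO2) in a microwave plasma
# and actual yields will differ.
M_CH4, M_C, M_H2 = 16.04, 12.01, 2.016   # molar masses, g/mol

methane_tonnes = 1.0
moles = methane_tonnes / M_CH4            # tonne-moles of CH4
carbon_tonnes = moles * M_C               # ~0.75 t of graphitic carbon per tonne of CH4
hydrogen_tonnes = moles * 2 * M_H2        # ~0.25 t of hydrogen per tonne of CH4

print(round(carbon_tonnes, 2), "t carbon and", round(hydrogen_tonnes, 2), "t hydrogen per tonne of methane")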

A Jan. 13, 2015 CPI press release, which originated the news item, describes the project and its organization in greater detail,

CPI, as the coordinator of the project, is responsible for the technical aspects in the separation of biogas into methane and carbon dioxide, and the separation of the graphitic carbon produced from the renewable hydrogen. The infrastructure at CPI allows for the microwave plasma process to be trialled and optimised at pilot production scale, with a future technology roadmap devised for commercial scale manufacturing.

Graphene is one of the most interesting inventions of modern times. Stronger than steel, yet light, the material conducts electricity and heat. It has been used for a wide variety of applications, from strengthening tennis rackets, spray on radiators, to building semiconductors, electric circuits and solar cells.

The sustainable creation of graphene and renewable hydrogen from food waste provides a sustainable method of dealing with the food waste problem that the European Union faces. It is estimated that 90 million tonnes of food is wasted each year, a figure which could rise to approximately 126 million tonnes by 2020. In the UK alone, food waste equates to a financial loss to business of at least £5 billion per year.

Dr Keith Robson, Director of Formulation and Flexible Manufacturing at CPI, said, “PlasCarb will provide an innovative solution to the problems associated with food waste, which is one of the biggest challenges that the European Union faces in the drive towards a low carbon economy. The project will not only seek to reduce food waste but also use new technological methods to turn it into renewable energy resources which themselves are of economic value, and all in a sustainable manner.”

PlasCarb will utilise quality research and specialist industrial process engineering to optimise the quality and economic value of the graphene and hydrogen, further enhancing the sustainability of the process life cycle.

Graphitic carbon has been identified as one of Europe’s economically critical raw materials and is of strategic importance in the development of future emerging technologies. The global market for graphite, either mined or synthetic, is worth over €10 billion per annum. Hydrogen is already used in significant quantities by industry and is recognised as having great potential as a future transport fuel for a low carbon economy. The ability to produce renewable hydrogen also has added benefits, as currently 95% of hydrogen is produced from fossil fuels. Moreover, it is currently projected that increasing demand for raw materials from fossil sources will lead to price volatility, accelerated environmental degradation and rising political tensions over resource access.

Therefore, the latter stages of the project will be dedicated to the market uptake of the PlasCarb process and the output products, through the development of an economically sustainable business strategy, a financial risk assessment of the project results and a flexible financial model that is able to act as a primary screen of economic viability. Based on this, an economic analysis of the process will be determined. Through the development of a decentralised business model for widespread trans-European implementation, the valorisation of food waste will have the potential to be undertaken for the benefit of local economies and employment. More specifically, three interrelated post project exploitation markets have been defined: food waste management, high value graphite and RH2 sales.

PlasCarb is a 3-year collaborative project, co-funded under the European Union’s Seventh Framework Programme (FP7), and will further reinforce Europe’s leading position in environmental technologies and innovation in high value carbon. The consortium is composed of eight partners led by CPI from five European countries, whose complementary research and industrial expertise will enable the required results to be successfully delivered. The project partners are: The Centre for Process Innovation (UK), GasPlas AS (NO), CNRS (FR), Fraunhofer IBP (DE), Uvasol Ltd (UK), GAP Waste Management (UK), Geonardo Ltd. (HU), Abalonyx AS (NO).

You can find PlasCarb here.

The second announcement can be found in a PlasCarb Jan. 14, 2015 press release announcing the publication of research on heterostructures of graphene ribbons,

Few materials have received as much attention from the scientific world or have raised so many hopes with a view to their potential deployment in new applications as graphene has. This is largely due to its superlative properties: it is the thinnest material in existence, almost transparent, the strongest, the stiffest and at the same time the most stretchable, the best thermal conductor, the one with the highest intrinsic charge carrier mobility, plus many more fascinating features. Its electronic properties can vary enormously through its confinement inside nanostructured systems, for example. That is why ribbons or rows of graphene with nanometric widths are emerging as tremendously interesting electronic components. On the other hand, due to the great variability of electronic properties upon minimal changes in the structure of these nanoribbons, exact control on an atomic level is an indispensable requirement to make the most of all their potential.

The lithographic techniques used in conventional nanotechnology do not yet have such resolution and precision. In the year 2010, however, a way was found to synthesise nanoribbons with atomic precision by means of so-called molecular self-assembly. Molecules designed for this purpose are deposited onto a surface in such a way that they react with each other and give rise to perfectly specified graphene nanoribbons by means of a highly reproducible process and without any other external mediation than heating to the required temperature. In 2013 a team of scientists from the University of Berkeley and the Centre for Materials Physics (CFM), a mixed CSIC (Spanish National Research Council) and UPV/EHU (University of the Basque Country) centre, extended this very concept to new molecules that formed wider graphene nanoribbons and therefore had new electronic properties. This same group has now managed to go a step further by creating, through this self-assembly, heterostructures that blend segments of graphene nanoribbons of two different widths.

The forming of heterostructures with different materials has been a concept widely used in electronic engineering and has enabled huge advances to be made in conventional electronics. “We have now managed for the first time to form heterostructures of graphene nanoribbons modulating their width on a molecular level with atomic precision. What is more, their subsequent characterisation by means of scanning tunnelling microscopy and spectroscopy, complemented with first principles theoretical calculations, has shown that it gives rise to a system with very interesting electronic properties which include, for example, the creation of what are known as quantum wells,” pointed out the scientist Dimas de Oteyza, who has participated in this project. This work, the results of which are being published this very week in the journal Nature Nanotechnology, therefore constitutes a significant success towards the desired deployment of graphene in commercial electronic applications.
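The quantum wells mentioned at the end of that excerpt arise because electrons confined in a narrow ribbon segment have more widely spaced energy levels than electrons in a wider segment. A toy particle-in-a-box model in Python (my simplification, not the scanning tunnelling spectroscopy or first-principles treatment used in the paper, and with hypothetical segment widths) shows the width dependence,

import math

H = 6.62607015e-34      # Planck constant, J s
M_E = 9.1093837015e-31  # electron mass, kg
EV = 1.602176634e-19    # joules per electronvolt

def box_levels_ev(width_nm, n_max=3):
    """Energy levels of a particle in a 1-D box of the given width, in eV."""
    L = width_nm * 1e-9
    return [n**2 * H**2 / (8 * M_E * L**2) / EV for n in range(1, n_max + 1)]

for width in (1.0, 2.0):   # hypothetical narrow and wide ribbon-segment widths, in nm
    print(width, "nm:", [round(e, 3) for e in box_levels_ev(width)])

# The narrower 'segment' has roughly four times the level spacing of the wider one,
# which is the basic reason alternating narrow and wide nanoribbon sections can act
# like barriers and wells for the electrons.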

Here’s a link to and a citation for the paper,

Molecular bandgap engineering of bottom-up synthesized graphene nanoribbon heterojunctions by Yen-Chia Chen, Ting Cao, Chen Chen, Zahra Pedramrazi, Danny Haberer, Dimas G. de Oteyza, Felix R. Fischer, Steven G. Louie, & Michael F. Crommie. Nature Nanotechnology (2015) doi:10.1038/nnano.2014.307 Published online 12 January 2015

This article is behind a paywall but there is a free preview available via ReadCube access.