Tag Archives: Netherlands

Open access to nanoparticles and nanocomposites

One of the major issues for developing nanotechnology-enabled products is access to nanoparticles and nanocomposites. For example, I’ve had a number of requests from entrepreneurs for suggestions as to how to access cellulose nanocrystals (CNC) so they can develop a product idea. (It’s been a few years since the last request and I hope that means it’s easier to get access to CNC.)

Regardless, access remains a problem and the European Union has devised a solution which allows open access to nanoparticles and nanocomposites through project Co-Pilot. The announcement was made in a May 10, 2016 news item on Nanowerk (Note: A link has been removed),

“What opportunities does nanotechnology offer in general, and what can nanoparticles do for my products and processes?” So far, this question cannot be answered easily. Preparation and modification of nanoparticles and their further processing require special technical infrastructure and complex knowledge. For small and medium-sized businesses, building this infrastructure “on the off chance” is often not worth it. Even large companies shy away from the risks. As a result, many good ideas just stay in the drawer.

Simple, open access to high-quality infrastructure for the reliable production of small batches of functionalized nanoparticles and nanocomposites for testing could ease the way towards new nano-based products for chemical and pharmaceutical companies. The European Union has allocated funds for the construction of a number of pilot lines and open-access infrastructure within the framework of the EU project CoPilot.

A May 9, 2016 Fraunhofer-Institut für Silicatforschung press release, which originated the news item, offers greater description,

A consortium of 13 partners from research and industry, including nanotechnology specialist TNO from the Netherlands and the Fraunhofer Institute for Silicate Research ISC from Wuerzburg, Germany, as well as seven nanomaterial manufacturers, is currently setting up the pilot line in Wuerzburg. First, they are establishing particle production, modification and compounding at pilot scale based on four different model systems. The approach enables maximum variability and flexibility for the pilot production of various particle systems and composites. Two further open-access lines will be established at TNO in Eindhoven and at the Sueddeutsche Kunststoffzentrum SKZ in Selb.

The “nanoparticle kitchen”

Essential elements of the pilot line in Wuerzburg are particle synthesis in batches of up to 100 liters, modification and separation methods such as a semi-continuously operating centrifuge, in-line analysis, and techniques for the uniform, agglomeration-free incorporation of nanoparticles into composites. Dr. Karl Mandel, head of Particle Technology at Fraunhofer ISC, compares the pilot line with a high-tech kitchen: “We provide the top-notch equipment and the star chefs to synthesize a nano menu à la carte as well as nanoparticles according to individual requests. Thus, companies can test their own recipes – or our existing recipes – before they take up cooking themselves or set up their own nano kitchen.”

In the future, the EU project will offer companies a contact point where they can try out a nano idea and obtain enough material for sampling and for estimating future production costs. This minimizes development risk on the one hand and maximizes flexibility and production safety on the other. To give as many companies as possible the opportunity to influence the direction and setup of the nanoparticle kitchen, the project partners will offer open meetings on a regular basis.

I gather Co-Pilot has been offering workshops. The next is in July 2016 according to the press release,

The next workshop in this context takes place at Fraunhofer ISC in Wuerzburg on 7th July 2016. The partners will present the pilot line and the first results of the four model systems – layered double hydroxide nanoparticle polymer composites for flame-inhibiting fillers, titanium dioxide nanoparticles for high refractive index composites, magnetic particles for innovative catalysts and hollow silica composites for anti-glare coatings. Interested companies can find more information about the upcoming workshop on the website of the project, www.h2020copilot.eu, and on the website of Fraunhofer ISC, www.isc.fraunhofer.de, which hosts the event.

I tracked down a tiny bit more information about the July 2016 workshop in a May 2, 2016 Co-Pilot press release,

On July 7, 2016, the CoPilot project partners will give an insight into the many new functionalizations and applications of tailored nanoparticles in the workshop “The Nanoparticle Kitchen – particles and functions à la carte”, taking place in Wuerzburg, Germany. Join the Fraunhofer ISC’s lab tour of the “Nanoparticle Kitchen”, listen to presentations from research institutes and industry, and discuss your ideas with experts. Nanoparticles offer many options for today’s and tomorrow’s products.

More about program and registration soon on this [CoPilot] website!

I wonder if they’re considering this open access to nanoparticles and nanocomposites approach elsewhere?

Artificial intelligence used for wildlife protection

PAWS (Protection Assistant for Wildlife Security), an artificial intelligence (AI) program, has been tested in Uganda and Malaysia, according to an April 22, 2016 US National Science Foundation (NSF) news release (also on EurekAlert but dated April 21, 2016), Note: Links have been removed,

A century ago, more than 60,000 tigers roamed the wild. Today, the worldwide estimate has dwindled to around 3,200. Poaching is one of the main drivers of this precipitous drop. Whether killed for skins, medicine or trophy hunting, humans have pushed tigers to near-extinction. The same applies to other large animal species like elephants and rhinoceros that play unique and crucial roles in the ecosystems where they live.

Human patrols serve as the most direct form of protection of endangered animals, especially in large national parks. However, protection agencies have limited resources for patrols.

With support from the National Science Foundation (NSF) and the Army Research Office, researchers are using artificial intelligence (AI) and game theory to solve poaching, illegal logging and other problems worldwide, in collaboration with researchers and conservationists in the U.S., Singapore, Netherlands and Malaysia.

“In most parks, ranger patrols are poorly planned, reactive rather than pro-active, and habitual,” according to Fei Fang, a Ph.D. candidate in the computer science department at the University of Southern California (USC).

Fang is part of an NSF-funded team at USC led by Milind Tambe, professor of computer science and industrial and systems engineering and director of the Teamcore Research Group on Agents and Multiagent Systems.

Their research builds on the idea of “green security games” — the application of game theory to wildlife protection. Game theory uses mathematical and computer models of conflict and cooperation between rational decision-makers to predict the behavior of adversaries and plan optimal approaches for containment. The Coast Guard and Transportation Security Administration have used similar methods developed by Tambe and others to protect airports and waterways.

“This research is a step in demonstrating that AI can have a really significant positive impact on society and allow us to assist humanity in solving some of the major challenges we face,” Tambe said.

PAWS puts the claws in anti-poaching

The team presented papers describing how they use their methods to improve the success of human patrols around the world at the AAAI Conference on Artificial Intelligence in February [2016].

The researchers first created an AI-driven application called PAWS (Protection Assistant for Wildlife Security) in 2013 and tested the application in Uganda and Malaysia in 2014. Pilot implementations of PAWS revealed some limitations, but also led to significant improvements.

Here’s a video describing the issues and PAWS,

For those who prefer to read about the details rather than listen, there’s more from the news release,

PAWS uses data on past patrols and evidence of poaching. As it receives more data, the system “learns” and improves its patrol planning. Already, the system has led to more observations of poacher activities per kilometer.

Its key technical advance lies in its ability to incorporate complex terrain information, including the topography of protected areas. That results in practical patrol routes that minimize elevation changes, saving time and energy. Moreover, the system can also take into account the natural transit paths that have the most animal traffic – and thus the most poaching – creating a “street map” for patrols.

“We need to provide actual patrol routes that can be practically followed,” Fang said. “These routes need to go back to a base camp and the patrols can’t be too long. We list all possible patrol routes and then determine which is most effective.”

The application also randomizes patrols to avoid falling into predictable patterns.

“If the poachers observe that patrols go to some areas more often than others, then the poachers place their snares elsewhere,” Fang said.
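
For readers who want a concrete sense of what “game theory for patrol planning” means, here is a minimal sketch of the underlying idea; it is not the PAWS code, and the routes, areas and animal-traffic values are invented for illustration. The park manager commits to a randomized choice among candidate patrol routes, and a small linear program picks route probabilities that minimize the payoff a poacher could obtain by watching patrol frequencies and targeting the least-covered, highest-traffic area.

    # Toy "green security game": choose a randomized patrol schedule so that a
    # poacher who observes patrol frequencies gains as little as possible.
    # Illustrative only -- not the PAWS algorithm or its data.
    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical animal-traffic value of each area (higher = more attractive to poachers).
    area_value = np.array([8.0, 5.0, 3.0, 6.0])

    # coverage[r, a] = 1 if candidate patrol route r passes through area a (made-up routes).
    coverage = np.array([
        [1, 1, 0, 0],   # route 0
        [0, 1, 1, 0],   # route 1
        [0, 0, 1, 1],   # route 2
        [1, 0, 0, 1],   # route 3
    ], dtype=float)
    n_routes, n_areas = coverage.shape

    # Variables: x_0..x_{R-1} (route probabilities) and z (worst-case poacher payoff).
    # Minimize z subject to value_a * (1 - sum_r coverage[r, a] * x_r) <= z for every area a,
    # sum_r x_r = 1 and x_r >= 0.
    c = np.zeros(n_routes + 1)
    c[-1] = 1.0
    A_ub = np.hstack([-(area_value[:, None] * coverage.T), -np.ones((n_areas, 1))])
    b_ub = -area_value
    A_eq = np.hstack([np.ones((1, n_routes)), np.zeros((1, 1))])
    b_eq = np.array([1.0])
    bounds = [(0, 1)] * n_routes + [(None, None)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    probs, worst_case = res.x[:n_routes], res.x[-1]
    print("patrol route probabilities:", np.round(probs, 3))
    print("worst-case poacher payoff:", round(worst_case, 3))

In the real system the areas come from terrain and animal-density maps, routes must respect length and elevation constraints, and the poacher model is learned from past patrol data, but the commit-to-a-randomized-strategy structure is the same.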

Since 2015, two non-governmental organizations, Panthera and Rimbat, have used PAWS to protect forests in Malaysia. The research won the Innovative Applications of Artificial Intelligence award for deployed application, as one of the best AI applications with measurable benefits.

The team recently combined PAWS with a new tool called CAPTURE (Comprehensive Anti-Poaching Tool with Temporal and Observation Uncertainty Reasoning) that predicts attacking probability even more accurately.

In addition to helping patrols find poachers, the tools may assist them with intercepting trafficked wildlife products and other high-risk cargo, adding another layer to wildlife protection. The researchers are in conversations with wildlife authorities in Uganda to deploy the system later this year. They will present their findings at the 15th International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2016) in May.

“There is an urgent need to protect the natural resources and wildlife on our beautiful planet, and we computer scientists can help in various ways,” Fang said. “Our work on PAWS addresses one facet of the problem, improving the efficiency of patrols to combat poaching.”

The team’s game-theoretic approach has yet another potential application, the prevention of illegal logging,

While Fang and her colleagues work to develop effective anti-poaching patrol planning systems, other members of the USC team are developing complementary methods to prevent illegal logging, a major economic and environmental problem for many developing countries.

The World Wildlife Fund estimates trade in illegally harvested timber to be worth between $30 billion and $100 billion annually. The practice also threatens ancient forests and critical habitats for wildlife.

Researchers at USC, the University of Texas at El Paso and Michigan State University recently partnered with the non-profit organization Alliance Vohoary Gasy to limit the illegal logging of rosewood and ebony trees in Madagascar, which has caused a loss of forest cover on the island nation.

Forest protection agencies also face limited budgets and must cover large areas, making sound investments in security resources critical.

The research team worked to determine the balance of security resources in which Madagascar should invest to maximize protection, and to figure out how to best deploy those resources.

Past work in game theory-based security typically involved specified teams — the security workers assigned to airport checkpoints, for example, or the air marshals deployed on flight tours. Finding optimal security solutions for those scenarios is difficult; a solution involving an open-ended team had not previously been feasible.

To solve this problem, the researchers developed a new method called SORT (Simultaneous Optimization of Resource Teams) that they have been experimentally validating using real data from Madagascar.

The research team created maps of the national parks, modeled the costs of all possible security resources using local salaries and budgets, and computed the best combination of resources given these conditions.
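
The news release does not spell out how SORT searches the space of possible teams, so the following is only a toy illustration of the core decision it automates: choosing a mix of security resources under a fixed budget. The resource names, costs and protection scores are invented, and the published method optimizes team composition and deployment together at a much larger scale.

    # Toy budget-constrained team selection (not the published SORT algorithm):
    # enumerate every affordable combination of resource types and keep the one
    # with the highest (made-up) protection score.
    from itertools import product

    # Hypothetical resources: (name, unit cost per month, protection score per unit, max units).
    resources = [
        ("ranger patrol", 300, 5.0, 6),
        ("community scout", 120, 2.0, 8),
        ("vehicle", 500, 4.0, 2),
        ("camera trap set", 200, 1.5, 5),
    ]
    budget = 2500

    best_score, best_team = -1.0, None
    for counts in product(*(range(r[3] + 1) for r in resources)):
        cost = sum(n * r[1] for n, r in zip(counts, resources))
        if cost > budget:
            continue
        # Diminishing returns: each additional unit of a resource is worth a bit less.
        score = sum(r[2] * sum(0.9 ** k for k in range(n)) for n, r in zip(counts, resources))
        if score > best_score:
            best_score, best_team = score, counts

    print("budget:", budget)
    for (name, *_), n in zip(resources, best_team):
        print(f"  {name}: {n}")
    print("protection score:", round(best_score, 2))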

“We compared the value of using an optimal team determined by our algorithm versus a randomly chosen team and the algorithm did significantly better,” said Sara Mc Carthy, a Ph.D. student in computer science at USC.

The algorithm is simple and fast, and can be generalized to other national parks with different characteristics. The team is working to deploy it in Madagascar in association with the Alliance Vohoary Gasy.

“I am very proud of what my PhD students Fei Fang and Sara Mc Carthy have accomplished in this research on AI for wildlife security and forest protection,” said Tambe, the team lead. “Interdisciplinary collaboration with practitioners in the field was key in this research and allowed us to improve our research in artificial intelligence.”

Moreover, the project shows other computer science researchers the potential impact of applying their research to the world’s problems.

“This work is not only important because of the direct beneficial impact that it has on the environment, protecting wildlife and forests, but because of the way that it can inspire others to dedicate their efforts to making the world a better place,” Mc Carthy said.

The curious can find out more about Panthera here and about Alliance Vohoary Gasy here (be prepared to use your French language skills). Unfortunately, I could not find more information about Rimbat.

Graphene Flagship high points

The European Union’s Graphene Flagship project has provided a series of highlights in place of an overview for the project’s ramp-up phase (in 2013 the Graphene Flagship was announced as one of two winners of a science competition, the other being the Human Brain Project; each project received a prize of 1B Euros). Here are the highlights from the April 19, 2016 Graphene Flagship press release,

Graphene and Neurons – the Best of Friends

Flagship researchers have shown that it is possible to interface untreated graphene with neuron cells whilst maintaining the integrity of these vital cells [1]. This result is a significant first step towards using graphene to produce better deep brain implants which can both harness and control the brain.

Image caption: Graphene and Neurons

This paper emerged from the Graphene Flagship Work Package Health and Environment. Prof. Prato, the WP leader from the University of Trieste in Italy, commented that “We are currently involved in frontline research in graphene technology towards biomedical applications, exploring the interactions between graphene nano- and micro-sheets with the sophisticated signalling machinery of nerve cells. Our work is a first step in that direction.”

[1] Fabbro A., et al., Graphene-Based Interfaces do not Alter Target Nerve Cells. ACS Nano, 10 (1), 615 (2016).

Pressure Sensing with Graphene: Quite a Squeeze

The Graphene Flagship developed a small, robust, highly efficient squeeze-film pressure sensor [2]. Pressure sensors are present in most mobile handsets, and replacing the current sensor membrane with a graphene membrane allows the sensor to shrink in size while significantly increasing its responsiveness and lifetime.

Discussing this work which emerged from the Graphene Flagship Work Package Sensors is the paper’s lead author, Robin Dolleman from the Technical University of Delft in The Netherlands “After spending a year modelling various systems the idea of the squeeze-film pressure sensor was formed. Funding from the Graphene Flagship provided the opportunity to perform the experiments and we obtained very good results. We built a squeeze-film pressure sensor from 31 layers of graphene, which showed a 45 times higher response than silicon based devices, while reducing the area of the device by a factor of 25. Currently, our work is focused on obtaining similar results on monolayer graphene.”

 

[2] Dolleman R. J. et al., Graphene Squeeze-Film Pressure Sensors. Nano Lett., 16, 568 (2016)
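
As a rough illustration of why a very thin, light membrane pays off here, the following back-of-the-envelope sketch uses the simple textbook picture of a squeeze-film sensor: gas trapped in the shallow cavity under the vibrating membrane is compressed, adding a stiffness proportional to ambient pressure and shifting the membrane’s resonance frequency. The cavity depth and in-vacuum resonance frequency below are illustrative guesses, not the parameters of the Delft device, and the simple isothermal model ignores many real effects.

    # Back-of-the-envelope squeeze-film estimate (illustrative numbers, simple isothermal model).
    # Idea: compressing the gas in a shallow cavity of depth g0 adds a spring constant per unit
    # area of roughly p/g0, so the resonance frequency rises with ambient pressure p:
    #   f_res(p)^2 ~= f0^2 + p / (4 * pi^2 * g0 * rho_h)
    import math

    rho_graphene_layer = 7.6e-7     # kg/m^2, areal mass density of one graphene layer
    n_layers = 31                   # multilayer membrane, as in the quoted work
    rho_h = n_layers * rho_graphene_layer
    g0 = 400e-9                     # m, assumed cavity depth (illustrative guess)
    f0 = 15e6                       # Hz, assumed in-vacuum resonance frequency (illustrative guess)

    def f_res(p_pa):
        """Resonance frequency (Hz) at ambient pressure p_pa (Pa) in this simple model."""
        return math.sqrt(f0 ** 2 + p_pa / (4 * math.pi ** 2 * g0 * rho_h))

    for p in (1e3, 1e4, 1e5):  # 10 mbar, 100 mbar, 1 bar
        print(f"p = {p:8.0f} Pa  ->  f_res ~ {f_res(p) / 1e6:6.2f} MHz")

Because the added stiffness scales inversely with the membrane’s mass per unit area, a membrane only a few tens of atomic layers thick responds far more strongly to a given pressure change than a much thicker silicon one, which is the qualitative point behind the quoted 45-fold improvement.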

Frictionless Graphene


Image caption: A graphene nanoribbon was anchored at the tip of an atomic force microscope and dragged over a gold surface. The observed friction force was extremely low.

Research done within the Graphene Flagship has observed the onset of superlubricity in graphene nanoribbons sliding on a surface, unravelling the role played by ribbon size and elasticity [3]. This important finding opens up the development potential of nanographene frictionless coatings. This research, led by the Graphene Flagship Work Package Nanocomposites, also involved researchers from Work Package Materials and Work Package Health and the Environment, a shining example of the inter-disciplinary, cross-collaborative approach to research undertaken within the Graphene Flagship. Discussing this further is the Work Package Nanocomposites Leader, Dr Vincenzo Palermo from CNR National Research Council, Italy “Strengthening the collaboration and interactions with other Flagship Work Packages created added value through a strong exchange of materials, samples and information”.

[3] Kawai S., et al., Superlubricity of graphene nanoribbons on gold surfaces. Science. 351, 6276, 957 (2016) 

Graphene Paddles Forward

Work undertaken within the Graphene Flagship saw Spanish automotive interiors specialist, and Flagship partner, Grupo Antolin SA work in collaboration with Roman Kayaks to develop an innovative kayak that incorporates graphene into its thermoset polymeric matrices. The use of graphene and related materials results in a significant increase in both impact strength and stiffness, improving the resistance to breakage in critical areas of the boat. Pushing the graphene canoe well beyond the prototype demonstration bubble, Roman Kayaks chose to use the K-1 kayak in the Canoe Marathon World Championships held in September [2015] in Gyor, Hungary, where the Graphene Canoe was really put through its paces.

Talking further about this collaboration from the Graphene Flagship Work Package Production is the WP leader, Dr Ken Teo from Aixtron Ltd., UK “In the Graphene Flagship project, Work Package Production works as a technology enabler for real-world applications. Here we show the world’s first K-1 kayak (5.2 meters long), using graphene-related materials developed by Grupo Antolin. We are very happy to see that graphene is creating value beyond traditional industries.”

Graphene Production – a Kitchen Sink Approach

Researchers from the Graphene Flagship have devised a way of producing large quantities of graphene by separating graphite flakes in liquids with a rotating tool that works in much the same way as a kitchen blender [4]. This paves the way to mass production of high quality graphene at a low cost.

The method was developed within the Graphene Flagship Work Package Production and is talked about further here by the WP deputy leader, Prof. Jonathan Coleman from Trinity College Dublin, Ireland “This technique produced graphene at higher rates than most other methods, and produced sheets of 2D materials that will be useful in a range of applications, from printed electronics to energy generation.”

[4] Paton K.R., et al., Scalable production of large quantities of defect-free few-layer graphene by shear exfoliation in liquids. Nat. Mater. 13, 624 (2014).

Flexible Displays – Rolled Up in your Pocket

Working with researchers from the Graphene Flagship, the Flagship partner FlexEnable demonstrated the world’s first flexible display with graphene incorporated into its pixel backplane. Combined with an electrophoretic imaging film, the result is a low-power, durable display suitable for use in many and varied environments.

Emerging from the Graphene Flagship Work Package Flexible Electronics, this illustrates the power of collaboration. Talking about this is the WP leader Dr Henrik Sandberg from the VTT Technical Research Centre of Finland Ltd., Finland “Here we show the power of collaboration. To deliver these flexible demonstrators and prototypes we have seen materials experts working together with components manufacturers and system integrators. These devices will have a potential impact in several emerging fields such as wearables and the Internet of Things.”

Fibre-Optics Data Boost from Graphene

A team of researchers from the Graphene Flagship has demonstrated high-performance photodetectors for infrared fibre-optic communication systems based on wafer-scale graphene [5]. This can increase the amount of information transferred whilst at the same time making the devices smaller and more cost-effective.

Discussing this work which emerged from the Graphene Flagship Work Package Optoelectronics is the paper’s lead author, Daniel Schall from AMO, Germany “Graphene has outstanding properties when it comes to the mobility of its electric charge carriers, and this can increase the speed at which electronic devices operate.”

[5] Schall D., et al., 50 GBit/s Photodetectors Based on Wafer-Scale Graphene for Integrated Silicon Photonic Communication Systems. ACS Photonics. 1 (9), 781 (2014)

Rechargeable Batteries with Graphene

A number of different research groups within the Graphene Flagship are working on rechargeable batteries. One group has developed a graphene-based rechargeable battery of the lithium-ion type used in portable electronic devices [6]. Graphene is incorporated into the battery anode in the form of a spreadable ink containing a suspension of graphene nanoflakes giving an increased energy efficiency of 20%. A second group of researchers have demonstrated a lithium-oxygen battery with high energy density, efficiency and stability [7]. They produced a device with over 90% efficiency that may be recharged more than 2,000 times. Their lithium-oxygen cell features a porous, ‘fluffy’ electrode made from graphene together with additives that alter the chemical reactions at work in the battery.

Graphene Flagship researchers show how the 2D material graphene can improve the energy capacity, efficiency and stability of lithium-oxygen batteries.

Both devices were developed in different groups within the Graphene Flagship Work Package Energy and speaking of the technology further is Prof. Clare Grey from Cambridge University, UK “What we’ve achieved is a significant advance for this technology, and suggests whole new areas for research – we haven’t solved all the problems inherent to this chemistry, but our results do show routes forward towards a practical device”.

[6] Hassoun J., et al., An Advanced Lithium-Ion Battery Based on a Graphene Anode and a Lithium Iron Phosphate Cathode. Nano Lett., 14 (8), 4901 (2014)

[7] Liu T., et al., Cycling Li-O2 batteries via LiOH formation and decomposition. Science, 350, 6260, 530 (2015)

Graphene – What and Why?

Graphene is a two-dimensional material formed by a single atom-thick layer of carbon, with the carbon atoms arranged in a honeycomb-like lattice. This transparent, flexible material has a number of unique properties. For example, it is 100 times stronger than steel, and conducts electricity and heat with great efficiency.

A number of practical applications for graphene are currently being developed. These include flexible and wearable electronics and antennas, sensors, optoelectronics and data communication systems, medical and bioengineering technologies, filtration, super-strong composites, photovoltaics and energy storage.

Graphene and Beyond

The Graphene Flagship also covers other layered materials, as well as hybrids formed by combining graphene with these complementary materials, or with other materials and structures, ranging from polymers, to metals, cement, and traditional semiconductors such as silicon. Graphene is just the first of thousands of possible single layer materials. The Flagship plans to accelerate their journey from laboratory to factory floor.

Especially exciting is the possibility of stacking monolayers of different elements to create materials not found in nature, with properties tailored for specific applications. Such composite layered materials could be combined with other nanomaterials, such as metal nanoparticles, in order to further enhance their properties and uses.​

Graphene – the Fruit of European Scientific Excellence

Europe, North America and Asia are all active centres of graphene R&D, but Europe has special claim to be at the centre of this activity. The ground-breaking experiments on graphene recognised in the award of the 2010 Nobel Prize in Physics were conducted by European physicists, Andre Geim and Konstantin Novoselov, both at Manchester University. Since then, graphene research in Europe has continued apace, with major public funding for specialist centres, and the stimulation of academic-industrial partnerships devoted to graphene and related materials. It is European scientists and engineers who as part of the Graphene Flagship are closely coordinating research efforts, and accelerating the transfer of layered materials from the laboratory to factory floor.

For anyone who would like links to the published papers, you can check out an April 20, 2016 news item featuring the Graphene Flagship highlights on Nanowerk.

Cities as incubators of technological and economic growth: from the rustbelt to the brainbelt

An April 10, 2016 news article by Xumei Dong on the timesunion website casts a light on what some feel is an emerging ‘brainbelt’ (Note: Links have been removed),

Albany [New York state, US], in the forefront of nanotechnology research, is one of the fastest-growing cities for tech jobs, according to a new book exploring hot spots of innovation across the globe.

“You have GlobalFoundries, which has thousands of employees working in one of the most modern plants in the world,” says Antoine van Agtmael, the Dutch-born investor who wrote “The Smartest Places on Earth: Why Rustbelts Are the Emerging Hotspots of Global Innovation” with Dutch journalist Fred Bakker.

Their book, mentioned in a Brookings Institution panel discussion last week [April 6, 2016], lists Albany as a leading innovation hub — part of an emerging “brainbelt” in the United States.

The Brookings Institution’s The smartest places on Earth: Why rustbelts are the emerging hotspots of global innovation event page provides more details and includes an embedded video of the event (running time: roughly 1 hour 17 mins.), Note: A link has been removed,

The conventional wisdom in manufacturing has long held that the key to maintaining a competitive edge lies in making things as cheaply as possible, which saw production outsourced to the developing world in pursuit of ever-lower costs. In contradiction to that prevailing wisdom, authors Antoine van Agtmael, a Brookings trustee, and Fred Bakker crisscrossed the globe and found that the economic tide is beginning to shift from its obsession with cheap goods to the production of smart ones.

Their new book, “The Smartest Places on Earth” (PublicAffairs, 2016), examines this changing dynamic and the transformation of “rustbelt” cities, the former industrial centers of the U.S. and Europe, into a “brainbelt” of design and innovation.

On Wednesday, April 6 [2016], Centennial Scholar Bruce Katz and the Metropolitan Policy Program hosted an event discussing these emerging hotspots and how cities such as Akron, Albany, Raleigh-Durham, Minneapolis-St. Paul, and Portland in the United States, and Eindhoven, Malmo, Dresden, and Oulu in Europe are seizing the initiative and recovering their economic strength.

You can find the book here or if a summary and biographies of the authors will suffice, there’s this,

The remarkable story of how rustbelt cities such as Akron and Albany in the United States and Eindhoven in Europe are becoming the unlikely hotspots of global innovation, where sharing brainpower and making things smarter—not cheaper—is creating a new economy that is turning globalization on its head

Antoine van Agtmael and Fred Bakker counter recent conventional wisdom that the American and northern European economies have lost their initiative in innovation and their competitive edge by focusing on an unexpected and hopeful trend: the emerging sources of economic strength coming from areas once known as “rustbelts” that had been written off as yesterday’s story.

In these communities, a combination of forces—visionary thinkers, local universities, regional government initiatives, start-ups, and big corporations—have created “brainbelts.” Based on trust, a collaborative style of working, and freedom of thinking prevalent in America and Europe, these brainbelts are producing smart products that are transforming industries by integrating IT, sensors, big data, new materials, new discoveries, and automation. From polymers to medical devices, the brainbelts have turned the tide from cheap, outsourced production to making things smart right in our own backyard. The next emerging market may, in fact, be the West.

About Antoine van Agtmael and Fred Bakker

Antoine van Agtmael is senior adviser at Garten Rothkopf, a public policy advisory firm in Washington, DC. He was a founder, CEO, and CIO of Emerging Markets Management LLC; previously he was deputy director of the capital markets department of the International Finance Corporation (“IFC”), the private sector oriented affiliate of the World Bank, and a division chief in the World Bank’s borrowing operations. He was an adjunct professor at Georgetown Law Center and taught at the Harvard Institute of Politics. Mr. van Agtmael is chairman of the NPR Foundation, a member of the board of NPR, and chairman of its Investment Committee. He is also a trustee of The Brookings Institution and cochairman of its International Advisory Council. He is on the President’s Council on International Activities at Yale University, the Advisory Council of Johns Hopkins University’s Paul H. Nitze School of Advanced International Studies (SAIS), and a member of the Council on Foreign Relations

Alfred Bakker, until his recent retirement, was a journalist specializing in monetary and financial affairs with Het Financieele Dagblad, the “Financial Times of Holland,” serving as deputy editor, editor-in-chief and CEO. In addition to his writing and editing duties he helped develop the company from a newspaper publisher into a multimedia company, developing several websites, a business news radio channel, and a quarterly business magazine, FD Outlook, and was responsible for the establishment of FD Intelligence.

A hardcover copy of the book is $25.99, presumably in US currency.

With over 150 partners from over 20 countries, the European Union’s Graphene Flagship research initiative unveils its work package devoted to biomedical technologies

An April 11, 2016 news item on Nanowerk announces the Graphene Flagship’s latest work package,

With a budget of €1 billion, the Graphene Flagship represents a new form of joint, coordinated research on an unprecedented scale, forming Europe’s biggest ever research initiative. It was launched in 2013 to bring together academic and industrial researchers to take graphene from the realm of academic laboratories into European society in the timeframe of 10 years. The initiative currently involves over 150 partners from more than 20 European countries. The Graphene Flagship, coordinated by Chalmers University of Technology (Sweden), is implemented around 15 scientific Work Packages on specific science and technology topics, such as fundamental science, materials, health and environment, energy, sensors, flexible electronics and spintronics.

Today [April 11, 2016], the Graphene Flagship announced in Barcelona the creation of a new Work Package devoted to Biomedical Technologies, one emerging application area for graphene and other 2D materials. This initiative is led by Professor Kostas Kostarelos, from the University of Manchester (United Kingdom), and ICREA Professor Jose Antonio Garrido, from the Catalan Institute of Nanoscience and Nanotechnology (ICN2, Spain). The Kick-off event, held in the Casa Convalescència of the Universitat Autònoma de Barcelona (UAB), is co-organised by ICN2 (ICREA Prof Jose Antonio Garrido), Centro Nacional de Microelectrónica (CNM-IMB-CSIC, CIBER-BBN; CSIC Tenured Scientist Dr Rosa Villa), and Institut d’Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS; ICREA Prof Mavi Sánchez-Vives).

An April 11, 2016 ICN2 press release, which originated the news item, provides more detail about the Biomedical Technologies work package and other work packages,

The new Work Package will focus on the development of implants based on graphene and 2D-materials that have therapeutic functionalities for specific clinical outcomes, in disciplines such as neurology, ophthalmology and surgery. It will include research in three main areas: Materials Engineering; Implant Technology & Engineering; and Functionality and Therapeutic Efficacy. The objective is to explore novel implants with therapeutic capacity that will be further developed in the next phases of the Graphene Flagship.

The Materials Engineering area will be devoted to the production, characterisation, chemical modification and optimisation of graphene materials that will be adopted for the design of implants and therapeutic element technologies. Its results will be applied by the Implant Technology and Engineering area to the design of implant technologies. Several teams will work in parallel on retinal, cortical, and deep brain implants, as well as devices to be applied in the peripheral nervous system. Finally, the Functionality and Therapeutic Efficacy area activities will centre on the development of devices that, in addition to interfacing with the nervous system for recording and stimulation of electrical activity, also have therapeutic functionality.

Stimulation therapies will focus on the adoption of graphene materials in implants with stimulation capabilities in Parkinson’s, blindness and epilepsy disease models. On the other hand, biological therapies will focus on the development of graphene materials as transport devices of biological molecules (nucleic acids, protein fragments, peptides) for modulation of neurophysiological processes. Both approaches involve a transversal innovation environment that brings together the efforts of different Work Packages within the Graphene Flagship.

A leading role for Barcelona in Graphene and 2D-Materials

The kick-off meeting of the new Graphene Flagship Work Package takes place in Barcelona because of the strong involvement of local institutions and the high international profile of Catalonia in 2D-materials and biomedical research. Institutions such as the Catalan Institute of Nanoscience and Nanotechnology (ICN2) develop frontier research in a supportive environment which attracts talented researchers from abroad, such as ICREA Research Prof Jose Antonio Garrido, Group Leader of the ICN2 Advanced Electronic Materials and Devices Group and now also Deputy Leader of the Biomedical Technologies Work Package. Until summer 2015 he was leading a research group at the Technische Universität München (Germany).

Further Graphene Flagship events in Barcelona are planned; in May 2016 ICN2 will also host a meeting of the Spintronics Work Package. ICREA Prof Stephan Roche, Group Leader of the ICN2 Theoretical and Computational Nanoscience Group, is the deputy leader of this Work Package led by Prof Bart van Wees, from the University of Groningen (The Netherlands). Another Work Package, on optoelectronics, is led by Prof Frank Koppens from the Institute of Photonic Sciences (ICFO, Spain), with Prof Andrea Ferrari from the University of Cambridge (United Kingdom) as deputy. Thus a number of prominent research institutes in Barcelona are deeply involved in the coordination of this European research initiative.

Kostas Kostarelos, the leader of the Biomedical Technologies Graphene Flagship work package, has been mentioned here before in the context of his blog posts for The Guardian science blog network (see my Aug. 7, 2014 post for a link to his post on metaphors used in medicine).

When based on plastic materials, contemporary art can degrade quickly

There’s an intriguing April 1, 2016 article by Josh Fischman for Scientific American about a problem with artworks from the 20th century and later—plastic-based materials (Note: A link has been removed),

Conservators at museums and art galleries have a big worry. They believe there is a good chance the art they showcase now will not be fit to be seen in one hundred years, according to researchers in a project  called Nanorestart. Why? After 1940, artists began using plastic-based material that was a far cry from the oil-based paints used by classical painters. Plastic is also far more fragile, it turns out. Its chemical bonds readily break. And they cannot be restored using techniques historically relied upon by conservators.

So art conservation scientists have turned to nanotechnology for help.

Sadly, there isn’t any detail in Fischman’s article about how nanotechnology is playing or might play a role in this conservation effort. Further investigation into the two projects (NanoRestART and POPART) mentioned by Fischman didn’t provide much more detail about NanoRestART’s science aspect but POPART does provide some details.

NanoRestART

It’s probably too soon (this project isn’t even a year old) to be getting much in the way of the nanoscience details, but NanoRestART has big plans according to its website homepage,

The conservation of this diverse cultural heritage requires advanced solutions at the cutting edge of modern chemistry and material science in an entirely new scientific framework that will be developed within the NANORESTART project.

The NANORESTART project will focus on the synthesis of novel poly-functional nanomaterials and on the development of highly innovative restoration techniques to address the conservation of a wide variety of materials mainly used by modern and contemporary artists.

In NANORESTART, enterprises and academic centers of excellence in the field of synthesis and characterization of nano- and advanced materials have joined forces with complementary conservation institutions and freelance restorers. This multidisciplinary approach will cover the development of different materials in response to real conservation needs, the testing of such materials, the assessment of their environmental impact, and their industrial scalability.

NanoRestART’s (NANOmaterials for the REStoration of works of ART) project page spells out their goals in the order in which they are being approached,

The ground-breaking nature of our research can be more easily outlined by focussing on specific issues. The main conservation challenges that will be addressed in the project are:

 

Conservation challenge 1: Cleaning of contemporary painted and plastic surfaces (CC1)

Conservation challenge 2: Stabilization of canvases and painted layers in contemporary art (CC2)

Conservation challenge 3: Removal of unwanted modern materials (CC3)

Conservation challenge 4: Enhanced protection of artworks in museums and outdoors (CC4)

The European Commission provides more information about the project on its CORDIS website’s NanoRestART webpage including the start and end dates for the project and the consortium members,

From 2015-06-01 to 2018-12-01, ongoing project

CHALMERS TEKNISKA HOEGSKOLA AB (Sweden)
MIRABILE ANTONIO (France)
NATIONALMUSEET (Denmark)
CONSIGLIO NAZIONALE DELLE RICERCHE (Italy)
UNIVERSITY COLLEGE CORK, NATIONAL UNIVERSITY OF IRELAND, CORK (Ireland)
MBN NANOMATERIALIA SPA (Italy)
KEMIJSKI INSTITUT (Slovenia)
CHEVALIER AURELIA (France)
UNIVERSIDADE FEDERAL DO RIO GRANDE DO SUL (Brazil)
UNIVERSITA CA’ FOSCARI VENEZIA (Italy)
AKZO NOBEL PULP AND PERFORMANCE CHEMICALS AB (Sweden)
COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES (France)
ARKEMA FRANCE SA (France)
UNIVERSIDAD DE SANTIAGO DE COMPOSTELA (Spain)
UNIVERSITY COLLEGE LONDON (United Kingdom)
ZFB ZENTRUM FUR BUCHERHALTUNG GMBH (Germany)
UNIVERSITAT DE BARCELONA (Spain)
THE BOARD OF TRUSTEES OF THE TATE GALLERY (United Kingdom)
ASSOCIAZIONE ITALIANA PER LA RICERCA INDUSTRIALE – AIRI (Italy)
THE ART INSTITUTE OF CHICAGO (United States)
MINISTERIO DE EDUCACION, CULTURA Y DEPORTE (Spain)
STICHTING HET RIJKSMUSEUM (Netherlands)
UNIVERSITEIT VAN AMSTERDAM (Netherlands)
UNIVERSIDADE FEDERAL DO RIO DE JANEIRO (Brazil)
ACCADEMIA DI BELLE ARTI DI BRERA (Italy)

It was a bit surprising to see Brazil and the US as participants but The Art Institute of Chicago has done nanotechnology-enabled conservation in the past as per my March 24, 2014 posting about a Renoir painting. I’m not familiar with the Brazilian organization.

POPART

POPART (Preservation of Plastic Artefacts in museum collections), mentioned by Fischman, was a European Commission project that ran from 2008 to 2012. Reports can be found on the CORDIS Popart webpage. The final report has some interesting bits (Note: I have added subheads in the [] square brackets),

To achieve a valid comparison of the various invasive and non-invasive techniques proposed for the identification and characterisation of plastics, a sample collection (SamCo) of plastics artefacts of about 100 standard and reference plastic objects was gathered. SamCo was made up of two kinds of reference materials: standards and objects. Each standard represents the reference material of a ‘pure’ plastic; while each object represents the reference of the same plastic as in the standards, but compounded with pigments, dyestuffs, fillers, anti oxidants, plasticizers etc.  Three partners ICN [Instituut Collectie Nederland], V&A [Victoria and Albert Museum] and Natmus [National Museet] collected different natural and synthetic plastics from the ICN reference collections of plastic objects, from flea markets, antique shops and from private collections and from their own collection to contribute to SamCo, the sample collection for identification by POPART partners. …

As a successive step, the collections of the following museums were surveyed:

-Victoria & Albert Museum (V&A), London, U.K.
-Stedelijk Museum, Amsterdam, The Netherlands
-Musée d’Art Moderne et d’Art Contemporaine (MAMAC) Nice, France
-Musée d’Art moderne, St. Etienne, France
-Musée Galliera, Paris, France

At the V&A approximately 200 objects were surveyed. Good or fair conservation conditions were found for about 85% of the objects, whereas the remaining 15% was in poor or even in unacceptable (3%) conditions. In particular, crazing and delamination of polyurethane faux leather and surface stickiness and darkening of plasticized PVC were observed. The situation at the Stedelijk Museum in Amsterdam was particularly favourable because a previous survey had been done in 1995 so that it was possible to make a comparison with the Popart survey in 2010. A total of 40 objects, which comprised plastics dating from as early as the 1930s to the newer plastics of the 1980s, were considered and their actual conservation state compared with the 1995 records. Of the objects surveyed in 2010, it can be concluded that 21 remained in the same condition. 13 objects containing PA, PUR, PVC, PP or natural rubber deteriorated due to chemical and physical degradation, while works of art containing either PMMA or PS deteriorated due to mechanical damage and incorrect artist’s technique (an inappropriate adhesive). 6 works of art (containing either PA or PMMA or both) changed into a better condition due to restoration or replacements. More than 230 objects have been examined in the 3 museums in France. A particular effort was devoted to the identification of the constituting plastics materials. Surveys were undertaken without any sophisticated equipment, in order to work under museums’ everyday conditions. Plastics hidden by other materials or by paint layers were not or only barely accessible, which is why the final count of some plastics may be underestimated in the final results. Another outcome is that plastic identification has been made at a general level only, by trying to identify the polymer family each plastic belongs to. Lastly, evidence of chemical degradation processes that do not cause visible or perceptible damage has not been detected and could not be taken into account in the final results.

… The most damaged artefacts were those made of cellulose acetate, cellulose nitrate and PVC.

[Polly (the doll)]

One of the main issues that is of interest for conservators and curators is to assess which kinds of plastics are most vulnerable to deterioration and to what extent they can deteriorate under the environmental conditions normally encountered in museums. Although one might expect that real-time deterioration could be ascertained by a careful investigation of museum objects on display or in storage, real objects or artworks may not be sampled due to ethical considerations. Therefore, reference objects were prepared by Natmus in the form of a doll (Polly) for simultaneous exposures in different environmental conditions. The doll comprised 11 different plastics representative of types typically found in modern museum collections. The 16 identical dolls produced were exposed in different places, not only in normal exhibit conditions, but also in some selected extreme conditions to ascertain possible acceleration of the deterioration process. In most cases the environmental parameters were also measured. The dolls were periodically evaluated by visual inspection and in selected cases by instrumental analyses.

In conclusion the experimental campaign carried out with Polly dolls can be viewed as a pilot study aimed at tackling the practical issues related to the monitoring of real three dimensional plastic artworks and the surrounding environment.

The overall exposure period (one and a half years) was sufficient to observe initial changes in the more susceptible polymers, such as polyurethane ethers and esters, and polyamide, with detectable chromatic changes and surface effects. Conversely, the other polymers were shown to be stable in the same conditions over this time period.

[Polly as an awareness raising tool]

Last but not least, the educational and communication benefits of an object like Polly facilitated the dissemination of the Popart Project to the public, and increased the awareness of issues associated with plastics in museum collections.

[Cleaning issues]

Mechanical cleaning has long been perceived as the least damaging technique to remove soiling from plastics. The results obtained from POPART suggest that the risks of introducing scratches or residues by mechanical cleaning are measurable. Some plastics were clearly more sensitive to mechanical damage than others. From the model plastics evaluated, HIPS was the most sensitive followed by HDPE, PVC, PMMA and CA. Scratches could not be measured on XPS due to its inhomogeneous surfaces. Plasticised PVC scratched easily, but appeared to repair itself because plasticiser migrated to surfaces and filled scratches.

Photo micrographs revealed that although all 22 cleaning materials evaluated in POPART scratched test plastics, some scratches were sufficiently shallow to be invisible to the naked eye. Duzzit and Scotch Brite sponges as well as all paper based products caused more scratching of surfaces than brushes and cloths. Some cleaning materials, notably Akapad yellow and white sponges, compressed air, latex and synthetic rubber sponges and goat hair brushes left residues on surfaces. These residues were only visible on glass-clear, transparent test plastics such as PMMA. HDPE and HIPS surfaces both had matte and roughened appearances after cleaning with dry-ice. XPS was completely destroyed by the treatment. No visible changes were present on PMMA and PVC.

Of the cleaning methods evaluated, only canned air and natural and synthetic feather dusters left surfaces unchanged. Natural and synthetic feather dusters, microfiber, spectacle and cotton cloths, cotton buds, sable hair brushes and leather chamois showed good results when applied to clean model plastics.

Most mechanical cleaning materials induced static electricity after cleaning, causing immediate attraction of dust. It was also noticed that generally when adding an aqueous cleaning agent to a cleaning material, the area scratched was reduced. This implied that cleaning agents also functioned as lubricants. A similar effect was exhibited by white spirit and isopropanol.

Based on cleaning vectors, Judith Hofenk de Graaff detergent, distilled water and Dehypon LS45 were the least damaging cleaning agents for all model plastics evaluated. None of the aqueous cleaning agents caused visible changes when used in combination with the least damaging cleaning materials. Sable hair brush, synthetic feather duster and yellow Akapad sponge were unsuitable for applying aqueous cleaning agents. Polyvinyl acetate sponge swelled in contact with solvents and was only suitable for aqueous cleaning processes.

Based on cleaning vectors, white spirit was the least damaging solvent. Acetone and Surfynol 61 were the most damaging for all model plastics and cannot be recommended for cleaning plastics. Surfynol 61 dissolved polyvinyl acetate sponge and left a milky residue on surfaces, which was particularly apparent on clear PMMA surfaces. Surfynol 61 left residues on surfaces on evaporating and acetone evaporated too rapidly to lubricate cleaning materials thereby increasing scratching of surfaces.

Supercritical carbon dioxide induced discolouration and mechanical damage to the model plastics, particularly to XPS, CA and PMMA and should not be used for conservation cleaning of plastics.

Potential Impact:
Cultural heritage is recognised as an economic factor; the cost of decay of cultural heritage and the risk associated with some materials in collections may be high. It is generally estimated that plastics, developed in great numbers since the 20th century’s interbellum, will not survive that long. This means that fewer generations will have access to lasting plastic art for study, contemplation and enjoyment. On the other hand, it will normally be easier to reveal a contemporary object’s technological secrets because of better documentation and easier access to artists’ working methods, ideas and intentions. A first, more or less world-encompassing recognition of the problems involved with museum objects made wholly or in part of plastics came through the conference ‘Saving the Twentieth Century’ held in Ottawa, Canada in 1991. This was followed later by ‘Modern Art, Who Cares’ in Amsterdam, The Netherlands in 1997, ‘Mortality Immortality? The Legacy of Modern Art’ in Los Angeles, USA in 1998 and, much more recently, ‘Plastics – Looking at the Future and Learning from the Past’ in London, UK in 2007. A growing professional interest in the care of plastics was clearly reflected in the creation of an ICOM-CC working group dedicated to modern materials in 1996, its name change to Modern Materials and Contemporary Art in 2002, and its growing membership from 60 at inception to over 200 at the 16th triennial conference in Lisbon, Portugal in 2011 and tentatively to over 300 as one of the aims put forward in the 2011-2014 programme of that ICOM-CC working group. …

[Intellectual property]

Another element pertaining to the conservation of modern art is artists’ copyright, which extends at least 50 years beyond their death. Damage, value and copyright may all influence the way in which damage is measured through scientific analysis, more specifically through the application of invasive or non-invasive techniques. Any selection of those will not only have an influence on the extent of observable damage, but also on the detail of the information gathered that is necessary to explain damage and to suggest conservation measures.

[How much is deteriorating?]

… it is obvious from surveys carried out in several museums in France, the UK and The Netherlands that from 15 to 35% of what I would then call an average plastic-material-based collection is in a poor to unacceptable condition. However, some 75% would require cleaning.

I hope to find out more about how nanotechnology is expected to be implemented in the conservation and preservation of plastic-based art. The NanoRestART project started in June 2015 and hopefully more information will be disseminated in the next year or so.

While it’s not directly related, there was some work with conservation of daguerreotypes (19th century photographic technique) and nanotechnology mentioned in my Nov. 17, 2015 posting which was a followup to my Jan. 10, 2015 posting about the project and the crisis precipitating it.

2-D melting and surface premelting at single-particle resolution

Scientists at the Hong Kong University of Science and Technology (HKUST) and the University of Amsterdam (in the Netherlands) have measured surface premelting with single-particle resolution. From a March 15, 2016 HKUST news release on EurekAlert,

The surface of a solid often melts into a thin layer of liquid even below its melting point. Such surface premelting is prevalent in all classes of solids; for instance, two pieces of ice can fuse below 0°C because the premelted surface water becomes embedded inside the bulk at the contact point and thus freeze. Premelting facilitates crystal growth and is critical in metallurgy, geology, and meteorology such as glacier movement, frost heave, snowflake growth and skating. However, the causative factors of various premelting scenarios, and the effect of dimensionality on premelting are poorly understood due to the lack of microscopic measurements.

To this end, researchers from the Hong Kong University of Science and Technology (HKUST) and the University of Amsterdam conducted research in which they were able to measure surface premelting with single-particle resolution for the first time by using novel colloidal crystals. They found that dimensionality is crucial to bulk melting and bulk solid-solid transitions, which strongly affect surface melting behaviors. To the researchers’ surprise, a crystal with free surfaces (a solid-vapor interface) melted homogeneously from both surfaces and within the bulk, in contrast to the commonly assumed heterogeneous melting from surfaces. These observations pose new challenges for premelting and melting theories.

The research team was led by associate professor of physics Yilong Han and graduate student Bo Li from HKUST. HKUST graduate students Feng Wang, Di Zhou, Yi Peng, and postdoctoral researcher Ran Ni from the University of Amsterdam in the Netherlands also participated in the research.

Micrometer-sized colloidal spheres in liquid suspensions have been used as powerful model systems for the study of phase transitions because the thermal-motion trajectories of these “big atoms” can be directly visualized under an optical microscope. “Previous studies mainly used repulsive colloids, which cannot form stable solid-vapor interfaces,” said Han. “Here, we made a novel type of colloid with temperature-sensitive attractions which can better mimic atoms, since all atoms have attractions, or otherwise they could not condense into a stable solid in air. We assembled these attractive spheres into large, well-tunable two-dimensional colloidal crystals with free surfaces for the first time.

“This paves the way to study surface physics using colloidal model systems. Our first project along this direction is about surface premelting, which was poorly understood before. Surprisingly, we found that it is also related to bulk melting and solid-solid transitions,” Han added.

The team found that two-dimensional (2D) monolayer crystals premelted into a thin layer of liquid with a constant thickness, an exotic phenomenon known as incomplete blocked premelting. By contrast, the surface-liquid thickness of the two- or three-layer thin-film crystal increased to infinity as it approached its melting point, i.e. conventional complete premelting. Such blocked surface premelting has occasionally been observed, e.g. in ice and germanium, but lacks theoretical explanation.

“Here, we found that the premelting of the 2D crystal was triggered by an abrupt lattice dilation, because the crystal can no longer provide enough attractions to surface particles after a drop in density,” Li said. “Before the surface liquid grew thick, the bulk crystal collapsed and melted due to mechanical instability. This provides a new, simple mechanism for blocked premelting. The two-layer crystals are mechanically stable because their particles have more neighbors; thus they exhibit conventional surface melting.”

As an abrupt dilation does not change the lattice symmetry, this is an isostructural solid-solid transition, which usually occurs in metallic and multiferroic materials. The colloidal system provides the first experimental observation of isostructural solid-solid transition at the single-particle level.

The mechanical instability induced homogeneous melting from within the crystal rather than heterogeneous melting from the surface. “We observed that the 2D melting is a first-order transition with a homogeneous proliferation of grain boundaries, which confirmed the grain-boundary-mediated 2D melting theory,” said Han. “First-order 2D melting has been observed in some molecular monolayers, but the theoretically predicted grain-boundary formation has not been observed before.”

Here’s a link to and a citation for the paper,

Imaging the Homogeneous Nucleation During the Melting of Superheated Colloidal Crystals by Ziren Wang, Feng Wang, Yi Peng, Zhongyu Zheng, and Yilong Han. Science, 05 Oct 2012: Vol. 338, Issue 6103, pp. 87-90. DOI: 10.1126/science.1224763

This paper is behind a paywall.

Science advice conference in Brussels, Belgium, Sept. 29 – 30, 2016 and a call for speakers

This is the second such conference and the organizers are issuing a call for speakers; the first was held in New Zealand in 2014 (my April 8, 2014 post offers an overview of the then-proposed science advice conference). Thanks to David Bruggeman and his Feb. 23, 2016 posting (on the Pasco Phronesis blog) for the information about this latest one (Note: A link has been removed),

The International Network for Global Science Advice (INGSA) is holding its second global conference in Brussels this September 29 and 30, in conjunction with the European Commission. The organizers have the following goals for the conference:

  • Identify core principles and best practices, common to structures providing scientific advice for governments worldwide.
  • Identify practical ways to improve the interaction of the demand and supply side of scientific advice.
  • Describe, by means of practical examples, the impact of effective science advisory processes.

Here’s a little more about the conference from its webpage on the INGSA website,

Science and Policy-Making: towards a new dialogue

29th – 30th September 2016, Brussels, Belgium

Call for suggestions for speakers for the parallel sessions

BACKGROUND:

“Science advice has never been in greater demand; nor has it been more contested.”[1] The most complex and sensitive policy issues of our time are those for which the available scientific evidence is ever growing and multidisciplinary, but still carries uncertainties. Yet these are the very issues for which scientific input is needed most. In this environment, the usefulness and legitimacy of expertise seems obvious to scientists, but is this view shared by policy-makers?

OBJECTIVES:

A two-day conference will take place in Brussels, Belgium, on Thursday 29th and Friday 30th September 2016. Jointly organised by the European Commission and the International Network for Government Science Advice (INGSA), the conference will bring together users and providers of scientific advice on critical, global issues. Policy-makers, leading practitioners and scholars in the field of science advice to governments, as well as other stakeholders, will explore principles and practices in a variety of current and challenging policy contexts. It will also present the new Scientific Advice Mechanism [SAM] of the European Commission [emphasis mine; I have more about SAM further down in the post] to the international community. Through keynote lectures and plenary discussions and topical parallel sessions, the conference aims to take a major step towards responding to the challenge best articulated by the World Science Forum Declaration of 2015:

“The need to define the principles, processes and application of science advice and to address the theoretical and practical questions regarding the independence, transparency, visibility and accountability of those who receive and provide advice has never been more important. We call for concerted action of scientists and policy-makers to define and promulgate universal principles for developing and communicating science to inform and evaluate policy based on responsibility, integrity, independence, and accountability.”

The conference seeks to:

  • Identify core principles and best practices, common to structures providing scientific advice for governments worldwide.
  • Identify practical ways to improve the interaction of the demand and supply side of scientific advice.
  • Describe, by means of practical examples, the impact of effective science advisory processes.

The Programme Committee comprises:

Eva Alisic, Co-Chair of the Global Young Academy

Tateo Arimoto, Director of Science, Technology and Innovation Programme; The Japanese National Graduate Institute for Policy Studies

Peter Gluckman, Chair of INGSA and Prime Minister’s Chief Science Advisor, New Zealand (co-chair)

Robin Grimes, UK Foreign Office Chief Scientific Adviser

Heide Hackmann, International Council for Science (ICSU)

Theodoros Karapiperis, European Parliament – Head of Scientific Foresight Unit (STOA), European Parliamentary Research Service (EPRS) – Science and Technology Options Assessment Panel

Johannes Klumpers, European Commission, Head of Unit – Scientific Advice Mechanism (SAM) (co-chair)

Martin Kowarsch, Head of the Working Group Scientific assessments, Ethics and Public Policy, Mercator Research Institute on Global Commons and Climate Change

David Mair, European Commission – Joint Research Centre (JRC)

Rémi Quirion, Chief Scientist,  Province of Québec, Canada

Flavia Schlegel, UNESCO Assistant Director-General for the Natural Sciences

Henrik Wegener, Executive Vice President, Chief Academic Officer, Provost at Technical University of Denmark, Chair of the EU High Level Group of Scientific Advisors

James Wilsdon, Chair of INGSA, Professor of Research Policy, Director of Impact & Engagement, University of Sheffield

Format

The conference will be a combination of plenary lectures and topical panels in parallel (concurrent) sessions outlined below. Each session will include three speakers (15 minute address with 5 minute Q & A each) plus a 30 minute moderated discussion.

Parallel Session I: Scientific advice for global policy

The pathways of science advice are a product of a country’s own cultural history and will necessarily differ across jurisdictions. Yet, there is an increasing number of global issues that require science advice. Can scientific advice help to address issues requiring action at international level? What are the considerations for providing science advice in these contexts? What are the examples from which we can learn what works and what does not work in informing policy-making through scientific advice?

Topics to be addressed include:

Climate Change – Science for the Paris Agreement: Did it work?
Migration: How can science advice help?
Zika fever, dementia, obesity, etc.: how can science advice help policy address global health challenges?

Parallel Session II: Getting equipped – developing the practice of providing scientific advice for policy

The practice of science advice to public policy requires a new set of skills that are neither strictly scientific nor policy-oriented, but a hybrid of both. Negotiating the interface between science and policy requires translational and navigational skills that are often not acquired through formal training and education. What are the considerations in developing these unique capacities, both in general and for particular contexts? In order to be best prepared for informing policy-making, upcoming needs for scientific advice should ideally be anticipated. Apart from scientific evidence sensu stricto, can other sources such as the arts, humanities, foresight and horizon scanning provide useful insights for scientific advice? How can scientific advice make best use of such tools and methods?

Topics to be addressed include:

How to close the gap between the need and the capacity for science advice in developing countries with limited or emerging science systems?
What skills do scientists and policymakers need for a better dialogue?
Foresight and science advice: can foresight and horizon scanning help inform the policy agenda?

Parallel Session III: Scientific advice for and with society

In many ways, the practice of science advice has become a key pillar in what has been called the ‘new social contract for science[2]’. Science advice translates knowledge, making it relevant to society through both better informed policy and by helping communities and their elected representatives to make better informed decisions about the impacts of technology. Yet providing science advice is often a distributed and disconnected practice in which academies, formal advisors, journalists, stakeholder organisations and individual scientists play an important role. The resulting mix of information can be complex and even contradictory, particularly as advocate voices and social media join the open discourse. What considerations are there in an increasingly open practice of science advice?

Topics to be addressed include:

Science advice and the media: Lost in translation?
Beyond the ivory tower: How can academies best contribute to science advice for policy?
What is the role of other stakeholders in science advice?

Parallel Session IV: Science advice crossing borders

Science advisors and advisory mechanisms are called upon not just for nationally-relevant advice, but also for issues that increasingly cross borders. In this, the importance of international alignment and collaborative possibilities may be obvious, but there may be inherent tensions. In addition, there may be legal and administrative obstacles to transnational scientific advice. What are these hurdles and how can they be overcome? To what extent are science advisory systems also necessarily diplomatic and what are the implications of this in practice?

Topics to be addressed include:

How is science advice applied across national boundaries in practice?
What support do policymakers need from science advice to implement the Sustainable Development Goals in their countries?
Science diplomacy: can scientists extend the reach of diplomats?

Call for Speakers

The European Commission and INGSA are now in the process of identifying speakers for the above conference sessions. As part of this process we invite those interested in speaking to submit their ideas. Interested policy-makers, scientists and scholars in the field of scientific advice, as well as business and civil-society stakeholders are warmly encouraged to submit proposals. Alternatively, you may propose an appropriate speaker.

The conference webpage includes a form should you wish to submit yourself or someone else as a speaker.

New Scientific Advice Mechanism of the European Commission

For anyone unfamiliar with the Scientific Advice Mechanism (SAM) mentioned in the conference’s notes: once the term of office of Anne Glover, chief science adviser for the European Commission (EC), was completed in 2014, EC president Jean-Claude Juncker obliterated the position. Glover, the first and only science adviser for the EC, was to be replaced by an advisory council and a new science advice mechanism.

David Bruggeman describes the situation as it then stood in a May 14, 2015 posting (Note: A link has been removed),

Earlier this week European Commission President Juncker met with several scientists along with Commission Vice President for Jobs, Growth, Investment and Competitiveness [Jyrki] Katainen and the Commissioner for Research, Science and Innovation [Carlos] Moedas. …

What details are publicly available are currently limited to this slide deck.  It lists two main mechanisms for science advice: a high-level group of eminent scientists (numbering seven) and a structured relationship with the science academies of EU member states, both backed by staffing and resource support from the Commission.  The deck gives a deadline of this fall for the high-level group to be identified and stood up.

… The Commission may use this high-level group more as a conduit than a source for policy advice.  A reasonable question to ask is whether or not the high-level group can meet the Commission’s expectations, and those of the scientific community with which it is expected to work.

David updated the information in a January 29, 2016 posting (Note: Links have been removed),

Today the High Level Group of the newly constituted Scientific Advice Mechanism (SAM) of the European Union held its first meeting.  The seven members of the group met with Commissioner for Research, Science and Innovation Carlos Moedas and Andrus Ansip, the Commission’s Vice-President with responsibility for the Digital Single Market (a Commission initiative focused on making a Europe-wide digital market and improving support and infrastructure for digital networks and services).

Given it’s early days, there’s little more to discuss than the membership of this advisory committee (from the SAM High Level Group webpage),

Janusz Bujnicki

Professor, Head of the Laboratory of Bioinformatics and Protein Engineering, International Institute of Molecular and Cell Biology, Warsaw

Professor of Biology and head of a research group at IIMCB in Warsaw and at Adam Mickiewicz University, Poznań, Poland. Janusz Bujnicki graduated from the Faculty of Biology, University of Warsaw in 1998, defended his PhD in 2001, was awarded his habilitation in 2005 and received the title of professor in 2009.

Bujnicki’s research combines bioinformatics, structural biology and synthetic biology. His scientific achievements include the development of methods for computational modeling of protein and RNA 3D structures, the discovery and characterization of enzymes involved in RNA metabolism, and the engineering of proteins with new functions. He is an author of more than 290 publications, which have been cited by other researchers more than 5,400 times (as of October 2015). Bujnicki has received numerous awards, prizes, fellowships, and grants, including the EMBO/HHMI Young Investigator Programme award, an ERC Starting Grant, awards from the Polish Ministry of Science and the Polish Prime Minister, and was decorated with the Knight’s Cross of the Order of Polonia Restituta by the President of the Republic of Poland. In 2013 he won the national plebiscite “Poles with Verve” in the Science category.

Bujnicki has been involved in various scientific organizations and advisory bodies, including the Polish Young Academy, civic movement Citizens of Science, Life, Environmental and Geo Sciences panel of the Science Europe organization, and Scientific Policy Committee – an advisory body of the Ministry of Science and Higher Education in Poland. He is also an executive editor of the scientific journal Nucleic Acids Research.

Pearl Dykstra

Professor of Sociology, Erasmus University Rotterdam

Professor Dykstra has a chair in Empirical Sociology and is Director of Research of the Department of Public Administration and Sociology at the Erasmus University Rotterdam. Previously, she had a chair in Kinship Demography at Utrecht University (2002-2009) and was a senior scientist at the Netherlands Interdisciplinary Demographic Institute (NIDI) in The Hague (1990-2009).

Her publications focus on intergenerational solidarity, aging societies, family change, aging and the life course, and late-life well-being. She is an elected member of the Netherlands Royal Academy of Arts and Sciences (KNAW, 2004) and Vice-President of the KNAW as of 2011, elected Member of the Dutch Social Sciences Council (SWR, 2006), and elected Fellow of the Gerontological Society of America (2010). In 2012 she received an ERC Advanced Investigator Grant for the research project “Families in context”, which will focus on the ways in which policy, economic, and cultural contexts structure interdependence in families.

Elvira Fortunato

Deputy Chair

Professor, Materials Science Department of the Faculty of Science and Technology, NOVA University, Lisbon

Professor Fortunato is a full professor in the Materials Science Department of the Faculty of Science and Technology of the New University of Lisbon, a Fellow of the Portuguese Engineering Academy since 2009, and was decorated as a Grand Officer of the Order of Prince Henry the Navigator by the President of the Republic in 2010 for her scientific achievements worldwide. In 2015 she was appointed by the Portuguese President as Chair of the Organizing Committee of the Celebrations of the National Day of Portugal, Camões and the Portuguese Communities.

She was also a member of the Portuguese National Scientific & Technological Council between 2012 and 2015 and a member of the advisory board of DG CONNECT (2014-15).

Currently she is the director of the Institute of Nanomaterials, Nanofabrication and Nanomodeling and of CENIMAT. She is a member of the board of trustees of the Luso-American Foundation (Portugal/USA, 2013-2020).

Fortunato pioneered European research on transparent electronics, namely thin-film transistors based on oxide semiconductors, demonstrating that oxide materials can be used as true semiconductors. In 2008, in the first ERC call, she received an Advanced Grant for the project “Invisible”, considered a success story. In the same year she and her colleagues demonstrated the possibility of making the first paper transistor, starting a new field in the area of paper electronics.

Fortunato has published over 500 papers and during the last 10 years has received more than 16 international prizes and distinctions for her work (e.g., IDTechEx USA 2009 (paper transistor); European Woman Innovation prize, Finland 2011).

Rolf-Dieter Heuer

Director-General of the European Organization for Nuclear Research (CERN)

Professor Heuer is an experimental particle physicist and has been CERN Director-General since January 2009. His mandate, ending in December 2015, was characterised by the start of the Large Hadron Collider (LHC) in 2009 and its energy increase in 2015, the discovery of the Higgs boson and the geographical enlargement of CERN membership. He also actively engaged CERN in promoting the importance of science and STEM education for the sustainable development of society. From 2004 to 2008, Prof. Heuer was research director for particle and astroparticle physics at the DESY laboratory, Germany, where he oriented the particle physics groups towards the LHC by joining both large experiments, ATLAS and CMS. He initiated the restructuring and focusing of German high-energy physics at the energy frontier, with particular emphasis on the LHC (Helmholtz Alliance “Physics at the Terascale”). In April 2016 he will become President of the German Physical Society. He is designated President of the Council of SESAME (Synchrotron-Light for Experimental Science and Applications in the Middle East).

Prof. Heuer has published over 500 scientific papers and holds many honorary degrees from universities in Europe, Asia, Australia and Canada. He is a member of several academies of sciences in Europe, in particular the German Academy of Sciences Leopoldina, and an Honorary Member of the European Physical Society. In 2015 he received the Grand Cross 1st class of the Order of Merit of the Federal Republic of Germany.

Julia Slingo

Chief Scientist, Met Office, Exeter

Dame Julia Slingo became Met Office Chief Scientist in February 2009 where she leads a team of over 500 scientists working on a very broad portfolio of research that underpins weather forecasting, climate prediction and climate change projections. During her time as Chief Scientist she has fostered much stronger scientific partnerships across UK academia and international research organisations, recognising the multi-disciplinary and grand challenge nature of weather and climate science and services. She works closely with UK Government Chief Scientific Advisors and is regularly called to give evidence on weather and climate related issues.

Before joining the Met Office she was the Director of Climate Research in NERC’s National Centre for Atmospheric Science, at the University of Reading. In 2006 she founded the Walker Institute for Climate System Research at Reading, aimed at addressing the cross disciplinary challenges of climate change and its impacts. Julia has had a long-term career in atmospheric physics, climate modelling and tropical climate variability, working at the Met Office, ECMWF and NCAR in the USA.

Dame Julia has published over 100 peer reviewed papers and has received numerous awards including the prestigious IMO Prize of the World Meteorological Organization for her outstanding work in meteorology, climatology, hydrology and related sciences. She is a Fellow of the Royal Society, an Honorary Fellow of the Royal Society of Chemistry and an Honorary Fellow of the Institute of Physics.

Cédric Villani

Director, Henri Poincaré Institute, Paris

Born in 1973 in France, Cédric Villani is a mathematician, director of the Institut Henri Poincaré in Paris (from 2009), and professor at the Université Claude Bernard of Lyon (from 2010). In December 2013 he was elected to the French Academy of Sciences.

He has worked on the theory of partial differential equations involved in statistical mechanics, specifically the Boltzmann equation, and on nonlinear Landau damping. He was awarded the Fields Medal in 2010 for this work.

Since then he has played an informal role as ambassador for the French mathematical community to the media (press, radio, television) and society in general. His books for non-specialists, in particular Théorème vivant (2012, translated into a dozen languages), La Maison des mathématiques (2014, with J.-Ph. Uzan and V. Moncorgé) and Les Rêveurs lunaires (2015, with E. Baudoin), have all found a wide audience. He has also given hundreds of lectures for all kinds of audiences around the world.

He participates actively in the administration of science through the Institut Henri Poincaré, but also by sitting on a number of panels and committees, including the higher council of research and the strategic council of Paris. Since 2010 he has been involved in fostering mathematics in Africa, through programs by the Next Einstein Initiative and the World Bank.

Believing in the commitment of scientists to society, Villani is also President of the Association Musaïques, a European federalist and a father of two.

Henrik C. Wegener

Chair

Executive Vice President, Chief Academic Officer and Provost, Technical University of Denmark

Henrik C. Wegener has been Executive Vice President and Chief Academic Officer at the Technical University of Denmark since 2011. He received his M.Sc. in food science and technology from the University of Copenhagen in 1988, his Ph.D. in microbiology from the University of Copenhagen in 1992, and his Master in Public Administration (MPA) from Copenhagen Business School in 2005.

Henrik C. Wegener was director of the National Food Institute, DTU, from 2006-2011 and before that head of the Department of Epidemiology and Risk Assessment at the National Food and Veterinary Research Institute, Denmark (2004-2006). From 1994-1999 he was director of the Danish Zoonosis Centre, and from 1999-2004 professor of zoonosis epidemiology at the Danish Veterinary Institute. He was stationed at World Health Organization headquarters in Geneva from 1999-2000. With more than 3,700 citations (h-index 34), he is the author of over 150 scientific papers in journals, research monographs and proceedings on food safety, zoonoses, antimicrobial resistance and emerging infectious diseases.

He has served as advisor and reviewer to national and international authorities & governments, international organizations and private companies, universities and research foundations, and he has served, and is presently serving, on several national and international committees and boards on food safety, veterinary public health and research policy.

Henrik C. Wegener has received several awards, including the Alliance for the Prudent Use of Antibiotics International Leadership Award in 2003.

That’s quite a mix of sciences and I’m happy to see a social scientist has been included.

Conference submissions

Getting back to the conference and its call for speakers, the deadline for submissions is March 25, 2016. Interestingly, there’s also this (from conference webpage),

The deadline for submissions is 25th March 2016. The conference programme committee with session chairs will review all proposals and select those that best fit the aim of each session while also representing a diverse range of perspectives. We aim to inform selected speakers within 4 weeks of the deadline to enable travel planning to Brussels.

To make the conference as accessible as possible, there is no registration fee. [emphasis mine] The European Commission will cover travel accommodation costs only for confirmed speakers for whom the travel and accommodation arrangements will be made by the Commission itself, on the basis of the speakers’ indication.

Good luck!

*Heading for conference submissions added on Feb. 29, 2016 at 1155 hours.

Making diesel cleaner

A Dec. 10, 2015 news item on Nanowerk announces a new method for producing diesel fuels (Note: A link has been removed),

Researchers from KU Leuven [Belgium] and Utrecht University [Netherlands] have discovered a new approach to the production of fuels (Nature, “Nanoscale intimacy in bifunctional catalysts for selective conversion of hydrocarbons”). Their new method can be used to produce much cleaner diesel. It can quickly be scaled up for industrial use. In 5 to 10 years, we may see the first cars driven by this new clean diesel.

A Dec. 10, 2015 KU Leuven press release, which originated the news item, provides more detail about the research,

The production of fuel involves the use of catalysts. These substances trigger the chemical reactions that convert raw material into fuel. In the case of diesel, small catalyst granules are added to the raw material to sufficiently change the molecules of the raw material to produce useable fuel.

Catalysts can have one or more chemical functions. The catalyst that was used for this particular study has two functions, represented by two different materials: a metal (platinum) and a solid-state acid. During the production process for diesel, the molecules bounce to and fro between the metal and the acid. Each time a molecule comes into contact with one of the materials, it changes a little bit. At the end of the process, the molecules are ready to be used for diesel fuel.

The assumption has always been that the metal and the solid-state acid in the catalyst should be as close together as possible. That would speed up the production process by helping the molecules bounce to and fro more quickly. Professor Johan Martens (KU Leuven) and Professor Krijn de Jong (Utrecht University) have now discovered that this assumption is incorrect. [emphasis mine] If the functions within a catalyst are nanometres apart, the process yields better molecules for cleaner fuel.

“Our results are the exact opposite of what we had expected. At first, we thought that the samples had been switched or that something was wrong with our analysis”, says Professor Martens. “We repeated the experiments three times, only to arrive at the same conclusion: the current theory is wrong. There has to be a minimum distance between the functions within a catalyst. This goes against what the industry has been doing for the past 50 years.”

The new technique can optimise quite a few molecules in diesel. Cars that are driven by this clean diesel would emit far fewer particulates and less CO2. The researchers believe that their method can be scaled up for industrial use with relative ease, so the new diesel could be used in cars in 5 to 10 years.

The new technique can be applied to petroleum-based fuels, but also to renewable carbon from biomass.
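As an aside, here is a back-of-the-envelope way to see why the spacing between the two catalytic functions matters at all. This is my own toy sketch, not the KU Leuven/Utrecht model: treat each pass over the acid function as having some probability of cracking the intermediate instead of merely isomerising it, and let the metal–acid spacing set how many acid-site events occur before the intermediate is re-hydrogenated at the metal. The function name, the per-event cracking probability and the event counts below are invented for illustration only.

```python
# Toy estimate: how over-cracking grows with the number of acid-site
# events an intermediate undergoes before returning to a metal site.
# All numbers are illustrative assumptions, not data from the paper.

P_CRACK_PER_EVENT = 0.05  # assumed chance that one acid-site event cracks the molecule

def survival(n_acid_events: int, p_crack: float = P_CRACK_PER_EVENT) -> float:
    """Probability the intermediate is only isomerised (never cracked)
    after n_acid_events consecutive acid-catalysed steps."""
    return (1.0 - p_crack) ** n_acid_events

# Fewer acid events between metal visits -> more of the desired isomers survive.
for n in (1, 5, 20, 50):
    print(f"{n:3d} acid-site events -> {survival(n):.1%} of intermediates un-cracked")
```

The study's point is that the real catalyst geometry controls this event count in a counterintuitive way; the sketch only illustrates why the count matters for selectivity.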

A fifty-year-old assumption has been found wrong. Interesting, non? In any event, here’s a link to and a citation for the paper,

Nanoscale intimacy in bifunctional catalysts for selective conversion of hydrocarbons by Jovana Zecevic, Gina Vanbutsele, Krijn P. de Jong, & Johan A. Martens. Nature 528, 245–248 (10 December 2015). DOI: 10.1038/nature16173. Published online 09 December 2015.

This paper is behind a paywall.

Why Factory publishes book about research on nanotechnology in architecture

The book titled Barba. Life in the Fully Adaptable Environment, published by nai010 and The Why Factory, a think tank operated by the Dutch architectural firm MVRDV and Delft University of Technology in the Netherlands, is a little difficult to describe. From a Nov. 16, 2015 MVRDV press release,

Is the end of brick and mortar near? How could nanotechnology change buildings and cities in the future? A speculation by The Why Factory on this topic is illustrated in the best tradition of science fiction in the newly published book Barba. Life in the Fully Adaptable Environment. It forms the point of departure for a series of interactive experiments, installations and proposals towards the development of new, body-based and fully adaptive architectures. A beautiful existential story comes alive, a story closer to us than you’d ever have thought. Imagine a new substance that could be steered and altered in real time. Imagine creating a flexible material that could change its shape, that could shrink and expand, that could do almost anything. The Why Factory calls this fictional material Barba. With Barba, we would be able to adapt our environment to every desire and to every need.

The press release delves into the inspiration for the material and the book,

… The first inspiration came from ‘Barbapapa’, an illustrated cartoon character from the 1970s. Invented and drawn by Talus Taylor and Annette Tison, the friendly, blobby protagonist of the eponymous children’s books and television programme could change his shape to resemble different objects. With Barbapapa’s smooth morphosis in mind, The Why Factory wondered how today’s advancements in robotics, material science and computing might allow us to create environments that transform themselves as easily as Barbapapa could. Neither Barbapapa’s inventors nor anybody else from the team behind the cartoon were involved in this project, but The Why Factory owes them absolute gratitude for the inspiration of Barbapapa.

“Barba is a fantastic matter that does whatever we wish for” says Winy Maas, Professor at The Why Factory and MVRDV co-founder. “You can programme your environment like a computer game. You could wake up in a modernist villa that you transform into a Roman Spa after breakfast. Cities can be totally transformed when offices just disappear after office hours.”

The book moves away from pure speculation, however, and makes steps towards real world application, including illustrated vision, programming experiments and applied prototypes. As co-author of the book, Ulf Hackauf, explains, “We started this book with a vision, which we worked out to form a consistent future scenario. This we took as a point of departure for experiments and speculations, including programming, installations and material research. It eventually led us to prototypes, which could form a first step for making Barba real.”

Barba developed through a series of projects organized by The Why Factory and undertaken in collaboration between Delft University of Technology, ETH Zürich and the European Institute of Innovation and Technology. The research was developed over the course of numerous design studios at the Why Factory and elsewhere. Students and collaborators of the Why Factory have all contributed to the book.

The press release goes on to offer some information about Why Factory,

The Why Factory explores possibilities for the development of our cities by focusing on the production of models and visualisations for cities of the future. Education and research of The Why Factory are combined in a research lab and platform that aims to analyse, theorise and construct future cities. It investigates within the given world and produces future scenarios beyond it; from universal to specific and global to local. It proposes, constructs and envisions hypothetical societies and cities; from science to fiction and vice versa. The Why Factory thus acts as a future world scenario making machinery, engaging in a public debate on architecture and urbanism. Their findings are then communicated to the wider public in a variety of ways, including exhibitions, publications, workshops, and panel discussions.

Based on the Why Factory description, I’m surmising that the book is meant to provoke interactivity in some way. However, there doesn’t seem to be a prescribed means of interacting with the Why Factory or the authors (Winy Maas, Ulf Hackauf, Adrien Ravon, and Patrick Healy), so perhaps the book is meant to be a piece of fiction/manual for interested educators, architects, and others who want to create ‘think tank’ environments where people speculate about nanotechnology and architecture.

In any event, you can order the book from this nai010 webpage,

How nanotechnology might drastically change cities and architecture

> New, body-based and fully adaptive architecture
How could nanotechnology change buildings and cities in the future? Imagine a new substance that could be steered and altered in real time. Imagine …

As for The Why Factory, you can find out more here on the think tank’s About page.

One last comment: in checking out MVRDV, the Dutch architectural firm mentioned earlier as one of The Why Factory’s operating organizations, I came across this piece of news generated as a consequence of the Nov. 13, 2015 Paris attacks,

The Why Factory alumna Emilie Meaud died in Friday’s Paris attacks. Our thoughts are with their family, friends and colleagues.

Nov 17, 2015

To our great horror and shock we received the terrible news that The Why Factory alumna Emilie Meaud (29) died in the Paris attacks of last Friday. She finished her master in Architecture at TU-Delft in 2012 and worked at the Agence Chartier-Dalix. She was killed alongside her twin sister Charlotte. Our thoughts are with their family, friends and colleagues.

Amen.