Tag Archives: Netherlands

Nuclear magnetic resonance microscope breaks records

Dutch researchers have found a way to apply the principles underlying magnetic resonance imaging (MRI) to a microscope designed *for* examining matter and life at the nanoscale. From a July 15, 2016 news item on phys.org,

A new nuclear magnetic resonance (NMR) microscope gives researchers an improved instrument to study fundamental physical processes. It also offers new possibilities for medical science—for example, to better study proteins in Alzheimer’s patients’ brains. …

A Leiden Institute of Physics press release, which originated the news item, expands on the theme,

If you get a knee injury, physicians use an MRI machine to look right through the skin and see what exactly is the problem. For this trick, doctors make use of the fact that our body’s atomic nuclei are electrically charged and spin around their axes. Just like small electromagnets, they induce their own magnetic field. By placing the knee in a uniform magnetic field, the nuclei line up with their axes pointing in the same direction. The MRI machine then sends a specific type of radio wave through the knee, causing some axes to flip. After this signal is turned off, those nuclei flip back after some time, each emitting a small radio wave as it does. Those waves give away the atoms’ locations, and provide physicians with an accurate image of the knee.

NMR

MRI is the medical application of Nuclear Magnetic Resonance (NMR), which is based on the same principle and was invented by physicists to conduct fundamental research on materials. One of the things they study with NMR is the so-called relaxation time. This is the time scale at which the nuclei flip back and it gives a lot of information about a material’s properties.
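For readers who like to see the numbers: in the simplest model, the relaxation described here is an exponential recovery of the nuclear magnetization with time constant T1. A quick sketch (the T1 value is arbitrary, not taken from the paper):

```python
import math

def longitudinal_magnetization(t, t1, m0=1.0):
    """Longitudinal magnetization recovering toward equilibrium m0
    after a flip, following Mz(t) = M0 * (1 - exp(-t / T1))."""
    return m0 * (1.0 - math.exp(-t / t1))

# After one relaxation time T1 (here 0.5 s), the signal has recovered
# to about 63% of its equilibrium value.
mz = longitudinal_magnetization(t=0.5, t1=0.5)
print(round(mz, 3))  # 0.632
```

Measuring how fast this curve climbs back toward equilibrium is what gives the relaxation time, and with it the material information the researchers are after.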

Microscope

To study materials on the smallest of scales as well, physicists go one step further and develop NMR microscopes, with which they study the mechanics behind physical processes at the level of a group of atoms. Now Leiden PhD students Jelmer Wagenaar and Arthur de Haan have built an NMR microscope, together with principal investigator Tjerk Oosterkamp, that operates at a record temperature of 42 millikelvin—close to absolute zero. In their article in Physical Review Applied they prove it works by measuring the relaxation time of copper. They achieved a thousand times higher sensitivity than existing NMR microscopes—also a world record.

Alzheimer

With their microscope, they give physicists an instrument to conduct fundamental research on many physical phenomena, like systems displaying strange behavior in extreme cold. And like NMR eventually led to MRI machines in hospitals, NMR microscopes have great potential too. Wagenaar: ‘One example is that you might be able to use our technique to study Alzheimer patients’ brains at the molecular level, in order to find out how iron is locked up in proteins.’

Here’s a link to and a citation for the paper,

Probing the Nuclear Spin-Lattice Relaxation Time at the Nanoscale by J. J. T. Wagenaar, A. M. J. den Haan, J. M. de Voogd, L. Bossoni, T. A. de Jong, M. de Wit, K. M. Bastiaans, D. J. Thoen, A. Endo, T. M. Klapwijk, J. Zaanen, and T. H. Oosterkamp. Phys. Rev. Applied 6, 014007. DOI: 10.1103/PhysRevApplied.6.014007. Published 15 July 2016.

This paper is open access.

*’fro’ changed to ‘for’ on Aug. 3, 2016.

Trans-Atlantic Platform (T-AP) is a unique collaboration of humanities and social science researchers from Europe and the Americas

Launched in 2013, the Trans-Atlantic Platform is co-chaired by Dr. Ted Hewitt, president of the Social Sciences and Humanities Research Council of Canada (SSHRC), and Dr. Renée van Kessel-Hagesteijn, Netherlands Organisation for Scientific Research—Social Sciences (NWO—Social Sciences).

An EU (European Union) publication, International Innovation, features an interview about T-AP with Ted Hewitt in a June 30, 2016 posting,

The Trans-Atlantic Platform is a unique collaboration of humanities and social science funders from Europe and the Americas. International Innovation’s Rebecca Torr speaks with Ted Hewitt, President of the Social Sciences and Humanities Research Council and Co-Chair of T-AP to understand more about the Platform and its pilot funding programme, Digging into Data.

Many commentators have called for better integration between natural and social scientists, to ensure that the societal benefits of STEM research are fully realised. Does the integration of diverse scientific disciplines form part of T-AP’s remit, and if so, how are you working to achieve this?

T-AP was designed primarily to promote and facilitate research across SSH. However, given the Platform’s thematic priorities and the funding opportunities being contemplated, we anticipate that a good number of non-SSH [emphasis mine] researchers will be involved.

As an example, on March 1, T-AP launched its first pilot funding opportunity: the T-AP Digging into Data Challenge. One of the sponsors is the Natural Sciences and Engineering Research Council of Canada (NSERC), Canada’s federal funding agency for research in the natural sciences and engineering. Their involvement ensures that the perspective of the natural sciences is included in the challenge. The Digging into Data Challenge is open to any project that addresses research questions in the SSH by using large-scale digital data analysis techniques, and is then able to show how these techniques can lead to new insights. And the challenge specifically aims to advance multidisciplinary collaborative projects.

When you tackle a research question or undertake research to address a social challenge, you need collaboration between various SSH disciplines or between SSH and STEM disciplines. So, while proposals must address SSH research questions, the individual teams often involve STEM researchers, such as computer scientists.

In previous rounds of the Digging into Data Challenge, this has led to invaluable research. One project looked at how the media shaped public opinion around the 1918 Spanish flu pandemic. Another used CT scans to examine hundreds of mummies, ultimately discovering that atherosclerosis, a form of heart disease, was prevalent 4,000 years ago. In both cases, these multidisciplinary historical research projects have helped inform our thinking about the present.

Of course, Digging into Data isn’t the only research area in which T-AP will be involved. Since its inception, T-AP partners have identified three priority areas beyond digital scholarship: diversity, inequality and difference; resilient and innovative societies; and transformative research on the environment. Each of these areas touches on a variety of SSH fields, while the transformative research on the environment area has strong connections with STEM fields. In September 2015, T-AP organised a workshop around this third priority area; environmental science researchers were among the workshop participants.

I wish Hewitt hadn’t described researchers from disciplines other than the humanities and social sciences as “non-SSH.” The designation divides the world in two: us and the non-(take your pick): non-Catholic/Muslim/American/STEM/SSH/etc.

Getting back to the interview, it is surprisingly Canuck-centric in places,

How does T-AP fit in with Social Sciences and Humanities Research Council of Canada (SSHRC)’s priorities?

One of the objectives in SSHRC’s new strategic plan is to develop partnerships that enable us to expand the reach of our funding. As T-AP provides SSHRC with links to 16 agencies across Europe and the Americas, it is an efficient mechanism for us to broaden the scope of our support and promotion of post-secondary-based research and training in SSH.

It also provides an opportunity to explore cutting-edge areas of research, such as big data (as we did with the first call we put out, Digging into Data). The research enterprise is becoming increasingly international, by which I mean that researchers are working on issues with international dimensions or collaborating in international teams. In this globalised environment, SSHRC must partner with international funders to support research excellence. By developing international funding opportunities, T-AP helps researchers create teams better positioned to tackle the most exciting and promising research topics.

Finally, it is a highly effective way of broadly promoting the value of SSH research throughout Canada and around the globe. There are significant costs and complexities involved in international research, and uncoordinated funding from multiple national funders can actually create barriers to collaboration. A platform like T-AP helps funders coordinate and streamline processes.

The interview gets a little more international scope when it turns to the data project,

What is the significance of your pilot funding programme in digital scholarship and what types of projects will it support?

The T-AP Digging into Data Challenge is significant for several reasons. First, the geographic reach of Digging is truly significant. With 16 participants from 11 countries, this round of Digging has significantly broader participation than previous rounds. This is also the first time Digging into Data includes funders from South America.

The T-AP Digging into Data Challenge is open to any research project that addresses questions in SSH. What those projects will end up being is anybody’s guess – projects from past competitions have involved fields ranging from musicology to anthropology to political science.

The Challenge’s main focus is, of course, the use of big data in research.

You may want to read the interview in its entirety here.

I have checked out the Trans-Atlantic Platform website but cannot determine how someone or some institution might consult that site for information on how to get involved in their projects or get funding. However, there is a T-AP Digging into Data website where there is evidence of the first international call for funding submissions. Sadly, the deadline for the 2016 call has passed if the website is to be believed (sometimes people are late when changing deadline dates).

3D brain-on-a-chip from the University of Twente

Dutch researchers have developed a 3D brain-on-a-chip according to a June 23, 2016 news item on Nanowerk,

To study brain cells’ operation and test the effect of medication on individual cells, the conventional Petri dish with flat electrodes is not sufficient. For truly realistic studies, cells have to flourish within three-dimensional surroundings.

Bart Schurink, researcher at University of Twente’s MESA+ Institute for Nanotechnology, has developed a sieve with 900 openings, each of which has the shape of an inverted pyramid. On top of this array of pyramids, a micro-reactor takes care of cell growth. Schurink defends his PhD thesis June 23 [2016].

A June 23, 2016 University of Twente press release, which originated the news item, provides more detail,

A brain-on-a-chip demands more than a series of electrodes in 2D, on which brain cells can be cultured. To mimic the brain in a realistic way, you need facilities for fluid flow, and the cells need some freedom for themselves even when they are kept at predefined spaces. Schurink therefore developed a micro sieve structure with hundreds of openings on a 2 by 2 mm surface. Each of these holes has the shape of an inverted pyramid. Each pyramid, in turn, is equipped with an electrode, for measuring electrical signals or sending stimuli to the network. At the same time, liquids can flow through tiny holes, needed to capture the cells and for sending nutrients or medication to a single cell.

NEURONAL NETWORK

After neurons have been placed inside all the pyramids, they will start to form a network. This is not just a 2D network between the holes: by placing a micro reactor on top of the sieve, a neuron network can develop in the vertical direction as well. Growth and electrical activity can be monitored subsequently: each individual cell can be identified by the pyramid it is in. Manufacturing this system demands a lot both of the production facilities at UT’s NanoLab and of the creative solutions the designers come up with. For example, finding the proper way of guaranteeing the same dimensions for every hole is quite challenging.

Schurink’s new µSEA (micro sieve electrode array) has been tested with living cells from the brains of laboratory rats. Both the positioning of the cells and neuronal network growth have been tested. The result of this PhD research is a fully new research platform for performing research on the brain, brain diseases and the effects of medication.

Schurink (1982) has conducted his research within the group Meso Scale Chemical Systems, of Prof Han Gardeniers. The group is part of the MESA+ Institute for Nanotechnology of the University of Twente. Schurink’s thesis is titled ‘Microfabrication and microfluidics for 3D brain-on-chip’ …

I have written one other piece about a ‘3D’ organ-on-a-chip project in China (my Jan. 29, 2016 posting).

Artists classified the animal kingdom?

Where taxonomy and biology are concerned, my knowledge begins and ends with Carl Linnaeus, the Swedish scientist who ushered in modern taxonomy. It was with some surprise that I found out artists also helped develop the field. From a June 21, 2016 news item on ScienceDaily,

In the sixteenth and seventeenth centuries artists were fascinated by how the animal kingdom was classified. They were in some instances ahead of natural historians.

This is one of the findings of art historian Marrigje Rikken. She will defend her PhD on 23 June [2016] on animal images in visual art. In recent years she has studied how images of animals between 1550 and 1630 became an art genre in themselves. ‘The close relationship between science and art at that time was remarkable,’ Rikken comments. ‘Artists tried to bring some order to the animal kingdom, just as biologists did.’

A June 21, 2016 Universiteit Leiden (Leiden University, Netherlands) press release, which originated the news item, expands on the theme,

In some cases the artists were ahead of their times. They became interested in insects, for example, before they attracted the attention of natural historians. It was artist Joris Hoefnagel who in 1575 made the first miniatures featuring beetles, butterflies and dragonflies, indicating how they were related to one another. In his four albums Hoefnagel divided the animal species according to the elements of fire, water, air and earth, but within these classifications he grouped animals on the basis of shared characteristics.


Beetles, butterflies, and dragonflies by Joris Hoefnagel. Courtesy: Universiteit Leiden

The press release goes on,

Other illustrators, print-makers and painters tried to bring some cohesion to the animal kingdom. Some of them used an alphabetical system, but artist Marcus Gheeraerts published a print as early as 1583 [visible below, Ed.] in which he grouped even-toed ungulates together. The giraffe and sheep – both visible in Gheeraerts’ print – belong to this group of animals. This doesn’t apply to all of Gheeraerts’ animals. The mythical unicorn, which was featured by Gheeraerts, no longer appears in contemporary biology books.

Wealthy courtiers

According to Rikken, the so-called menageries played an important role historically in how animals were represented. These forerunners of today’s zoos were popular in the sixteenth and seventeenth centuries, particularly among wealthy rulers and courtiers. Unfamiliar exotic animals regularly arrived and were immediately committed to paper by artists. Rikken: ‘The toucan, for example, was immortalised in 1615 by Jan Brueghel the Elder, court painter in Brussels.’ [See the main image, Ed.]

In the flesh

Rikken also discovered that the number of animals featured in a work gradually increased. ‘Artists from the 1570s generally included one or just a few animals per work. With the arrival of print series a decade later, each illustration tended to include more and more animals. This trend reached its peak in the lavish paintings produced around 1600.’ These paintings are also much more varied than the drawings and prints. Illustrators and print-makers often blindly copied one another’s motifs, even showing the animals in an identical pose. Artists had no hesitation in including the same animal in different positions. Rikken: ‘This allowed them to show that they had observed the animal in the flesh.’

Even-toed ungulates by Marcus Gheeraerts. Courtesy: Leiden Universiteit


Yet more proof or, at least, a very strong suggestion that art and science are tightly linked.

Lungs: EU SmartNanoTox and Pneumo NP

I have three news bits about lungs: one concerning relatively new techniques for testing the impact nanomaterials may have on lungs, and two concerning developments at PneumoNP, the first regarding a new technique for getting antibiotics to a lung infected with pneumonia and the second a new antibiotic.

Predicting nanotoxicity in the lungs

From a June 13, 2016 news item on Nanowerk,

Scientists at the Helmholtz Zentrum München [German Research Centre for Environmental Health] have received more than one million euros in the framework of the European Horizon 2020 Initiative [a major European Commission science funding initiative successor to the Framework Programme 7 initiative]. Dr. Tobias Stöger and Dr. Otmar Schmid from the Institute of Lung Biology and Disease and the Comprehensive Pneumology Center (CPC) will be using the funds to develop new tests to assess risks posed by nanomaterials in the airways. This could contribute to reducing the need for complex toxicity tests.

A June 13, 2016 Helmholtz Zentrum München (German Research Centre for Environmental Health) press release, which originated the news item, expands on the theme,

Nanoparticles are extremely small particles that can penetrate into remote parts of the body. While researchers are investigating various strategies for harvesting the potential of nanoparticles for medical applications, they could also pose inherent health risks*. Currently the hazard assessment of nanomaterials necessitates a complex and laborious procedure. In addition to complete material characterization, controlled exposure studies are needed for each nanomaterial in order to guarantee the toxicological safety.

As a part of the EU SmartNanoTox project, which has now been funded with a total of eight million euros, eleven European research partners, including the Helmholtz Zentrum München, want to develop a new concept for the toxicological assessment of nanomaterials.

Reference database for hazardous substances

Biologist Tobias Stöger and physicist Otmar Schmid, both research group heads at the Institute of Lung Biology and Disease, hope that the use of modern methods will help to advance the assessment procedure. “We hope to make more reliable nanotoxicity predictions by using modern approaches involving systems biology, computer modelling, and appropriate statistical methods,” states Stöger.

The lung experts are concentrating primarily on the respiratory tract. The approach involves defining a representative selection of toxic nanomaterials and conducting an in-depth examination of their structure and the various molecular modes of action that lead to their toxicity. These data are then digitized and transferred to a reference database for new nanomaterials. Economical tests that are easy to conduct should then make it possible to assess the toxicological potential of these new nanomaterials by comparing the test results with what is already known from the database. “This should make it possible to predict whether or not a newly developed nanomaterial poses a health risk,” Otmar Schmid says.
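The press release doesn’t say how the comparison against the reference database would work; one minimal way to picture it is a nearest-neighbour lookup in descriptor space. Everything below (the descriptors, their values, and the hazard labels) is invented for illustration:

```python
import math

# Hypothetical reference database: material descriptors -> known hazard label.
# All numbers are made up for illustration, not real toxicology data.
reference_db = {
    "CuO":  {"surface_area": 120.0, "diameter_nm": 25.0, "hazard": "high"},
    "TiO2": {"surface_area": 50.0,  "diameter_nm": 80.0, "hazard": "low"},
    "ZnO":  {"surface_area": 90.0,  "diameter_nm": 40.0, "hazard": "high"},
}

def predict_hazard(surface_area, diameter_nm):
    """Predict the hazard of a new nanomaterial as the label of the most
    similar database entry (Euclidean distance in descriptor space)."""
    def distance(entry):
        return math.hypot(entry["surface_area"] - surface_area,
                          entry["diameter_nm"] - diameter_nm)
    nearest = min(reference_db.values(), key=distance)
    return nearest["hazard"]

print(predict_hazard(surface_area=100.0, diameter_nm=30.0))  # high
```

A real system would use many more descriptors (and systems-biology data, per the project description), but the principle is the same: a cheap measurement is matched against well-characterized reference materials instead of running a full toxicity study.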

* Review: Schmid, O. and Stoeger, T. (2016). Surface area is the biologically most effective dose metric for acute nanoparticle toxicity in the lung. Journal of Aerosol Science, DOI:10.1016/j.jaerosci.2015.12.006

The SmartNanoTox webpage is here on the European Commission’s Cordis website.

Carrying antibiotics into lungs (PneumoNP)

I received this news from the European Commission’s PneumoNP project (I wrote about PneumoNP in a June 26, 2014 posting when it was first announced). This latest development is from a March 21, 2016 email (the original can be found here on the How to pack antibiotics in nanocarriers webpage on the PneumoNP website),

PneumoNP researchers work on a complex task: attaching or encapsulating antibiotics with nanocarriers that are stable enough to be included in an aerosol formulation, to pass through the respiratory tract and finally deliver the antibiotics to the areas of the lungs affected by pneumonia infections. The good news is that they have finally identified two promising methods for generating nanocarriers.

So far, compacting polymer coils into single-chain nanoparticles in water under mild conditions had been an unsolved problem. But in Spain, IK4-CIDETEC scientists developed a covalent-based method that produces nanocarriers with remarkable stability under those particular conditions. As the cherry on the cake, the preparation is scalable for industrial production. IK4-CIDETEC has patented the process.

Fig.: A polymer coil (step 1) compacts into a nanocarrier with cross-linkers (step 2). Then, antibiotics get attached to the nanocarrier (step 3).


At the same time, another route to produce lipidic nanocarriers has been developed by researchers at Utrecht University. In particular, they optimized a method consisting of assembling lipids directly around a drug. As a result, the generated lipidic nanocarriers show encouraging stability properties and are able to carry a sufficient quantity of antibiotics.

Fig.: In the presence of antibiotics, a lipidic layer (step 1) aggregates around the drug (step 2) until the lipids form a capsule around the antibiotics (step 3).

Assays of both polymeric and lipidic nanocarriers are currently being performed by the ITEM Fraunhofer Institute in Germany, Ingeniatrics Tecnologias in Spain and Erasmus Medical Centre in the Netherlands. Part of these tests is to make sure that the nanocarriers are not toxic to cells. Other tests verify the efficiency of the antibiotics against Klebsiella pneumoniae bacteria when they are attached to nanocarriers.

A new antibiotic for pneumonia (PneumoNP)

A June 14, 2016 PneumoNP press release (received via email) announces work on a promising new approach to an antibiotic for pneumonia,

The antimicrobial peptide M33 may be the long-sought substitute to treat difficult lung infections, like multi-drug resistant pneumonia.

In 2013, the European Respiratory Society predicted 3 million cases of pneumonia in Europe every year [1]. The standard treatment for pneumonia is an intravenous administration of a combination of drugs. This leads to the development of antibiotic resistance in the population. Gradually, doctors are running out of solutions to cure patients. An Italian company suggests a new option: the M33 peptide.

A few years ago, the Italian company SetLance SRL decided to investigate the M33 peptide. The antimicrobial peptide is an optimized version of an artificial peptide sequence selected for its efficacy and stability. So far, it has shown encouraging in-vitro results against multidrug-resistant Gram-negative bacteria, including Klebsiella pneumoniae. With the support of EU funding through the PneumoNP project, SetLance SRL had the opportunity to develop a new formulation of M33 that enhances its antimicrobial activity.

The new formulation of M33 fights Gram-negative bacteria in three steps. First, M33 binds to the lipopolysaccharides (LPS) on the outer membrane of the bacteria. Then the molecule forms a helix and finally disrupts the membrane, causing the cytoplasm to leak out. The peptide enabled up to 80% of mice to survive Pseudomonas aeruginosa-based lung infections. Beyond these encouraging results, the toxicity of the new M33 formulation seems to be much lower than that of antimicrobial peptides currently used in clinical practice, like colistin [2].

Lately, SetLance has scaled up the synthesis route and is now able to produce several hundred milligrams per batch. The molecule is robust enough for industrial production. We may expect this drug to enter clinical development and validation at the beginning of 2018.

[1] http://www.erswhitebook.org/chapters/acute-lower-respiratory-infections/pneumonia/
[2] Ceccherini et al., Antimicrobial activity of levofloxacin-M33 peptide conjugation or combination, Chem Med Comm. 2016; Brunetti et al., In vitro and in vivo efficacy, toxicity, bio-distribution and resistance selection of a novel antibacterial drug candidate. Scientific Reports 2016

I believe all the references are open access.

Brief final comment

The only element linking these news bits together is that they concern the lungs.

Open access to nanoparticles and nanocomposites

One of the major issues for developing nanotechnology-enabled products is access to nanoparticles and nanocomposites. For example, I’ve had a number of requests from entrepreneurs for suggestions as to how to access cellulose nanocrystals (CNC) so they can develop a product idea. (It’s been a few years since the last request and I hope that means it’s easier to get access to CNC.)

Regardless, access remains a problem and the European Union has devised a solution which allows open access to nanoparticles and nanocomposites through project Co-Pilot. The announcement was made in a May 10, 2016 news item on Nanowerk (Note: A link has been removed),

“What opportunities does nanotechnology provide in general, and what can nanoparticles do for my products and processes?” So far, this question cannot be answered easily. The preparation and modification of nanoparticles and their further processing require special technical infrastructure and complex knowledge. For small and medium-sized businesses, building this infrastructure on the off chance is often not worth it. Even large companies shy away from the risks. As a result, many good ideas simply stay in the drawer.

Simple and open access to high-class infrastructure for the reliable production of small batches of functionalized nanoparticles and nanocomposites for testing could ease the way toward new nano-based products for chemical and pharmaceutical companies. The European Union has allocated funds for the construction of a number of pilot lines and open-access infrastructure within the framework of the EU project CoPilot.

A May 9, 2016 Fraunhofer-Institut für Silicatforschung press release, which originated the news item, offers greater description,

A consortium of 13 partners from research and industry, including nanotechnology specialist TNO from the Netherlands and the Fraunhofer Institute for Silicate Research ISC from Wuerzburg, Germany, as well as seven nanomaterial manufacturers, is currently setting up the pilot line in Wuerzburg. First, they are establishing particle production, modification and compounding on a pilot scale, based on four different model systems. The approach enables maximum variability and flexibility for the pilot production of various particle systems and composites. Two further open-access lines will be established at TNO in Eindhoven and at the Sueddeutsche Kunststoffzentrum SKZ in Selb.

The “nanoparticle kitchen”

Essential elements of the pilot line in Wuerzburg are particle synthesis in batches of up to 100 liters, modification and separation methods such as a semi-continuously operating centrifuge, in-line analysis, and techniques for the uniform and agglomeration-free incorporation of nanoparticles into composites. Dr. Karl Mandel, head of Particle Technology at Fraunhofer ISC, compares the pilot line with a high-tech kitchen: “We provide the top-notch equipment and the star chefs to synthesize a nano menu à la carte, as well as nanoparticles according to individual requests. Thus, companies can test their own recipes – or our existing recipes – before they practice their own cooking or set up their own nano kitchen.”

In the future, the EU project will offer companies a contact point if they want to try out their nano idea and require enough material for sampling and for estimating future production costs. This can, on the one hand, minimize the development risk and, on the other, maximize flexibility and production safety. To give lots of companies the opportunity to influence the direction and setup of the nanoparticle kitchen, the project partners will offer open meetings on a regular basis.

I gather Co-Pilot has been offering workshops. The next is in July 2016 according to the press release,

The next workshop in this context takes place at Fraunhofer ISC in Wuerzburg on 7 July 2016. The partners will present the pilot line and the first results of the four model systems – layered double hydroxide nanoparticle polymer composites for flame-inhibiting fillers, titanium dioxide nanoparticles for high-refractive-index composites, magnetic particles for innovative catalysts and hollow silica composites for anti-glare coatings. Interested companies can find more information about the upcoming workshop on the website of the project www.h2020copilot.eu and on the website of Fraunhofer ISC www.isc.fraunhofer.de, which hosts the event.

I tracked down a tiny bit more information about the July 2016 workshop in a May 2, 2016 Co-Pilot press release,

On July 7, 2016, the CoPilot project partners will give an inside view of the many new functionalizations and applications of tailored nanoparticles in the workshop “The Nanoparticle Kitchen – particles and functions à la carte”, taking place in Wuerzburg, Germany. Join the Fraunhofer ISC’s lab tour of the “Nanoparticle Kitchen”, listen to the presentations from research institutes and industry, and discuss your ideas with experts. Nanoparticles offer many options for today’s and tomorrow’s products.

More about program and registration soon on this [CoPilot] website!

I wonder if they’re considering this open access to nanoparticles and nanocomposites approach elsewhere?

Artificial intelligence used for wildlife protection

PAWS (Protection Assistant for Wildlife Security), an artificial intelligence (AI) program, has been tested in Uganda and Malaysia, according to an April 22, 2016 US National Science Foundation (NSF) news release (also on EurekAlert but dated April 21, 2016). Note: Links have been removed,

A century ago, more than 60,000 tigers roamed the wild. Today, the worldwide estimate has dwindled to around 3,200. Poaching is one of the main drivers of this precipitous drop. Whether killed for skins, medicine or trophy hunting, humans have pushed tigers to near-extinction. The same applies to other large animal species like elephants and rhinoceros that play unique and crucial roles in the ecosystems where they live.

Human patrols serve as the most direct form of protection of endangered animals, especially in large national parks. However, protection agencies have limited resources for patrols.

With support from the National Science Foundation (NSF) and the Army Research Office, researchers are using artificial intelligence (AI) and game theory to solve poaching, illegal logging and other problems worldwide, in collaboration with researchers and conservationists in the U.S., Singapore, Netherlands and Malaysia.

“In most parks, ranger patrols are poorly planned, reactive rather than pro-active, and habitual,” according to Fei Fang, a Ph.D. candidate in the computer science department at the University of Southern California (USC).

Fang is part of an NSF-funded team at USC led by Milind Tambe, professor of computer science and industrial and systems engineering and director of the Teamcore Research Group on Agents and Multiagent Systems.

Their research builds on the idea of “green security games” — the application of game theory to wildlife protection. Game theory uses mathematical and computer models of conflict and cooperation between rational decision-makers to predict the behavior of adversaries and plan optimal approaches for containment. The Coast Guard and Transportation Security Administration have used similar methods developed by Tambe and others to protect airports and waterways.
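The defender-commits-first structure behind these "green security games" can be illustrated with a toy example. Everything here — the two targets, the payoff numbers, the single patrol resource — is hypothetical and far simpler than Teamcore's actual models: the defender commits to a randomized coverage strategy, the attacker observes it and best-responds, and the defender optimizes against that best response.

```python
# A minimal sketch of a Stackelberg security game (hypothetical payoffs,
# not the actual PAWS model): one defender resource split across two targets.
# The defender commits to coverage probabilities; the attacker best-responds.

def attacker_best_response(p, targets):
    # The attacker picks the target with the highest expected payoff,
    # given coverage probabilities p[t].
    def att_utility(t):
        covered_pay, uncovered_pay = targets[t]["att"]
        return p[t] * covered_pay + (1 - p[t]) * uncovered_pay
    return max(targets, key=att_utility)

def solve(targets, steps=1000):
    # Grid search over the defender's coverage split (enough for two targets).
    best = None
    for i in range(steps + 1):
        p = {"A": i / steps, "B": 1 - i / steps}
        t = attacker_best_response(p, targets)
        covered_pay, uncovered_pay = targets[t]["def"]
        def_util = p[t] * covered_pay + (1 - p[t]) * uncovered_pay
        if best is None or def_util > best[0]:
            best = (def_util, dict(p), t)
    return best

targets = {
    # (payoff if the target is covered, payoff if it is uncovered)
    "A": {"att": (-5, 10), "def": (2, -10)},
    "B": {"att": (-2, 4),  "def": (1, -4)},
}
util, coverage, attacked = solve(targets)
```

A grid search suffices for two targets; deployed systems solve much larger Stackelberg games with specialized mixed-integer or linear programming solvers.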

“This research is a step in demonstrating that AI can have a really significant positive impact on society and allow us to assist humanity in solving some of the major challenges we face,” Tambe said.

PAWS puts the claws in anti-poaching

The team presented papers describing how they use their methods to improve the success of human patrols around the world at the AAAI Conference on Artificial Intelligence in February [2016].

The researchers first created an AI-driven application called PAWS (Protection Assistant for Wildlife Security) in 2013 and tested the application in Uganda and Malaysia in 2014. Pilot implementations of PAWS revealed some limitations, but also led to significant improvements.

Here’s a video describing the issues and PAWS,

For those who prefer to read about the details rather than listen, there’s more from the news release,

PAWS uses data on past patrols and evidence of poaching. As it receives more data, the system “learns” and improves its patrol planning. Already, the system has led to more observations of poacher activities per kilometer.

Its key technical advance lies in its ability to incorporate complex terrain information, including the topography of protected areas. That results in practical patrol routes that minimize elevation changes, saving time and energy. Moreover, the system can also take into account the natural transit paths that have the most animal traffic – and thus the most poaching – creating a “street map” for patrols.
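The trade-off between distance and elevation change described above can be sketched as a weighted shortest-path problem. The terrain graph, elevation values, and `climb_weight` penalty below are all invented for illustration, not the actual PAWS terrain model:

```python
import heapq

# A toy sketch (not the PAWS terrain model): plan a patrol leg over a small
# terrain graph where each edge's cost combines distance and elevation change,
# so the cheapest path avoids needless climbing.

def edge_cost(dist, elev_a, elev_b, climb_weight=2.0):
    return dist + climb_weight * abs(elev_b - elev_a)

def cheapest_route(edges, elevation, start, goal):
    # Dijkstra's algorithm; edges: {node: [(neighbour, distance), ...]}
    pq, seen = [(0.0, start, [start])], set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, dist in edges.get(node, []):
            if nxt not in seen:
                c = cost + edge_cost(dist, elevation[node], elevation[nxt])
                heapq.heappush(pq, (c, nxt, path + [nxt]))
    return float("inf"), []

edges = {
    "camp":   [("ridge", 2.0), ("valley", 3.0)],
    "ridge":  [("lookout", 2.0)],
    "valley": [("lookout", 3.0)],
}
elevation = {"camp": 100, "ridge": 400, "valley": 90, "lookout": 150}
cost, route = cheapest_route(edges, elevation, "camp", "lookout")
```

With these numbers the planner prefers the longer but flatter valley route over the short, steep ridge — the same qualitative behaviour the press release describes.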

“We need to provide actual patrol routes that can be practically followed,” Fang said. “These routes need to go back to a base camp and the patrols can’t be too long. We list all possible patrol routes and then determine which is most effective.”

The application also randomizes patrols to avoid falling into predictable patterns.

“If the poachers observe that patrols go to some areas more often than others, then the poachers place their snares elsewhere,” Fang said.
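The randomization idea can be shown in a few lines: sample each day's route from a mixed strategy, so patrols favour high-risk areas without becoming predictable. The route names and weights here are hypothetical, not taken from PAWS:

```python
import random

# A toy illustration (not the actual PAWS planner): pick each day's patrol
# route by sampling from a mixed strategy, so patrols stay unpredictable
# while still favouring high-risk areas.
routes = {
    "ridge-loop":  0.5,   # hypothetical weights, e.g. from observed poaching risk
    "river-trail": 0.3,
    "east-fence":  0.2,
}

def plan_patrols(days, routes, seed=None):
    rng = random.Random(seed)
    names = list(routes)
    weights = [routes[r] for r in names]
    return [rng.choices(names, weights=weights)[0] for _ in range(days)]

schedule = plan_patrols(30, routes, seed=42)
```

Over many days the high-weight route is visited most often, but a poacher watching any given week cannot predict where the next patrol will go.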

Since 2015, two non-governmental organizations, Panthera and Rimbat, have used PAWS to protect forests in Malaysia. The research won the Innovative Applications of Artificial Intelligence award for deployed application, as one of the best AI applications with measurable benefits.

The team recently combined PAWS with a new tool called CAPTURE (Comprehensive Anti-Poaching Tool with Temporal and Observation Uncertainty Reasoning) that predicts attacking probability even more accurately.

In addition to helping patrols find poachers, the tools may assist them with intercepting trafficked wildlife products and other high-risk cargo, adding another layer to wildlife protection. The researchers are in conversations with wildlife authorities in Uganda to deploy the system later this year. They will present their findings at the 15th International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2016) in May.

“There is an urgent need to protect the natural resources and wildlife on our beautiful planet, and we computer scientists can help in various ways,” Fang said. “Our work on PAWS addresses one facet of the problem, improving the efficiency of patrols to combat poaching.”

There is yet another potential use for PAWS: the prevention of illegal logging,

While Fang and her colleagues work to develop effective anti-poaching patrol planning systems, other members of the USC team are developing complementary methods to prevent illegal logging, a major economic and environmental problem for many developing countries.

The World Wildlife Fund estimates trade in illegally harvested timber to be worth between $30 billion and $100 billion annually. The practice also threatens ancient forests and critical habitats for wildlife.

Researchers at USC, the University of Texas at El Paso and Michigan State University recently partnered with the non-profit organization Alliance Vohoary Gasy to limit the illegal logging of rosewood and ebony trees in Madagascar, which has caused a loss of forest cover on the island nation.

Forest protection agencies also face limited budgets and must cover large areas, making sound investments in security resources critical.

The research team worked to determine the balance of security resources in which Madagascar should invest to maximize protection, and to figure out how to best deploy those resources.

Past work in game theory-based security typically involved specified teams — the security workers assigned to airport checkpoints, for example, or the air marshals deployed on flight tours. Finding optimal security solutions for those scenarios is difficult; a solution involving an open-ended team had not previously been feasible.

To solve this problem, the researchers developed a new method called SORT (Simultaneous Optimization of Resource Teams) that they have been experimentally validating using real data from Madagascar.

The research team created maps of the national parks, modeled the costs of all possible security resources using local salaries and budgets, and computed the best combination of resources given these conditions.

“We compared the value of using an optimal team determined by our algorithm versus a randomly chosen team and the algorithm did significantly better,” said Sara Mc Carthy, a Ph.D. student in computer science at USC.

The algorithm is simple and fast, and can be generalized to other national parks with different characteristics. The team is working to deploy it in Madagascar in association with the Alliance Vohoary Gasy.
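In spirit, the team-selection problem resembles a knapsack: choose a set of security resources whose total cost fits the budget while maximizing protective value. The brute-force sketch below uses made-up costs and values, not the Madagascar data, and enumerates every subset — the real SORT method scales far beyond this:

```python
from itertools import combinations

# A simplified sketch of the team-optimization idea behind SORT (hypothetical
# numbers, not the real Madagascar data): choose which security resources to
# fund, under a budget, to maximize total protective value.
resources = {             # name: (annual cost, protective value)
    "ranger":     (10, 12),
    "vehicle":    (25, 20),
    "drone":      (40, 35),
    "informant":  (5,  6),
    "watchtower": (15, 10),
}

def best_team(resources, budget):
    # Exhaustive search over all subsets; fine for a handful of resource types.
    best_value, best = 0, ()
    names = list(resources)
    for k in range(len(names) + 1):
        for team in combinations(names, k):
            cost = sum(resources[r][0] for r in team)
            value = sum(resources[r][1] for r in team)
            if cost <= budget and value > best_value:
                best_value, best = value, team
    return best_value, set(best)

value, team = best_team(resources, budget=60)
```

Comparing the optimized team against a randomly chosen one of the same cost reproduces, in miniature, the comparison Mc Carthy describes.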

“I am very proud of what my PhD students Fei Fang and Sara Mc Carthy have accomplished in this research on AI for wildlife security and forest protection,” said Tambe, the team lead. “Interdisciplinary collaboration with practitioners in the field was key in this research and allowed us to improve our research in artificial intelligence.”

Moreover, the project shows other computer science researchers the potential impact of applying their research to the world’s problems.

“This work is not only important because of the direct beneficial impact that it has on the environment, protecting wildlife and forests, but on the way that it can inspire others to dedicate their efforts into making the world a better place,” Mc Carthy said.

The curious can find out more about Panthera here and about Alliance Vohoary Gasy here (be prepared to use your French language skills). Unfortunately, I could not find more information about Rimbat.

Graphene Flagship high points

The European Union’s Graphene Flagship project has provided a series of highlights in place of an overview for the project’s ramp-up phase. (In 2013 the Graphene Flagship was announced as one of two winners of a science competition, the other winner being the Human Brain Project; each project received a prize of 1B Euros.) Here are the highlights from the April 19, 2016 Graphene Flagship press release,

Graphene and Neurons – the Best of Friends

Flagship researchers have shown that it is possible to interface untreated graphene with neuron cells whilst maintaining the integrity of these vital cells [1]. This result is a significant first step towards using graphene to produce better deep brain implants which can both harness and control the brain.

This paper emerged from the Graphene Flagship Work Package Health and Environment. Prof. Prato, the WP leader from the University of Trieste in Italy, commented: “We are currently involved in frontline research in graphene technology towards biomedical applications, exploring the interactions between graphene nano- and micro-sheets with the sophisticated signalling machinery of nerve cells. Our work is a first step in that direction.”

[1] Fabbro A., et al., Graphene-Based Interfaces do not Alter Target Nerve Cells. ACS Nano, 10 (1), 615 (2016).

Pressure Sensing with Graphene: Quite a Squeeze

The Graphene Flagship developed a small, robust, highly efficient squeeze film pressure sensor [2]. Pressure sensors are present in most mobile handsets and by replacing current sensor membranes with a graphene membrane they allow the sensor to decrease in size and significantly increase its responsiveness and lifetime.

Discussing this work, which emerged from the Graphene Flagship Work Package Sensors, is the paper’s lead author, Robin Dolleman from the Delft University of Technology in The Netherlands: “After spending a year modelling various systems the idea of the squeeze-film pressure sensor was formed. Funding from the Graphene Flagship provided the opportunity to perform the experiments and we obtained very good results. We built a squeeze-film pressure sensor from 31 layers of graphene, which showed a 45 times higher response than silicon based devices, while reducing the area of the device by a factor of 25. Currently, our work is focused on obtaining similar results on monolayer graphene.”

[2] Dolleman R. J. et al., Graphene Squeeze-Film Pressure Sensors. Nano Lett., 16, 568 (2016)

Frictionless Graphene


Image caption: A graphene nanoribbon was anchored at the tip of an atomic force microscope and dragged over a gold surface. The observed friction force was extremely low.

Research done within the Graphene Flagship has observed the onset of superlubricity in graphene nanoribbons sliding on a surface, unravelling the role played by ribbon size and elasticity [3]. This important finding opens up the development potential of nanographene frictionless coatings. This research, led by the Graphene Flagship Work Package Nanocomposites, also involved researchers from Work Package Materials and Work Package Health and the Environment, a shining example of the inter-disciplinary, cross-collaborative approach to research undertaken within the Graphene Flagship. Discussing this further is the Work Package Nanocomposites Leader, Dr Vincenzo Palermo from the CNR National Research Council, Italy: “Strengthening the collaboration and interactions with other Flagship Work Packages created added value through a strong exchange of materials, samples and information”.

[3] Kawai S., et al., Superlubricity of graphene nanoribbons on gold surfaces. Science. 351, 6276, 957 (2016) 

Graphene Paddles Forward

Work undertaken within the Graphene Flagship saw Spanish automotive interiors specialist, and Flagship partner, Grupo Antolin SA work in collaboration with Roman Kayaks to develop an innovative kayak that incorporates graphene into its thermoset polymeric matrices. The use of graphene and related materials results in a significant increase in both impact strength and stiffness, improving the resistance to breakage in critical areas of the boat. Pushing the graphene canoe well beyond the prototype demonstration bubble, Roman Kayaks chose to use the K-1 kayak in the Canoe Marathon World Championships held in September in Gyor, Hungary where the Graphene Canoe was really put through its paces.

Talking further about this collaboration from the Graphene Flagship Work Package Production is the WP leader, Dr Ken Teo from Aixtron Ltd., UK: “In the Graphene Flagship project, Work Package Production works as a technology enabler for real-world applications. Here we show the world’s first K-1 kayak (5.2 meters long), using graphene related materials developed by Grupo Antolin. We are very happy to see that graphene is creating value beyond traditional industries.”

Graphene Production – a Kitchen Sink Approach

Researchers from the Graphene Flagship have devised a way of producing large quantities of graphene by separating graphite flakes in liquids with a rotating tool that works in much the same way as a kitchen blender [4]. This paves the way to mass production of high quality graphene at a low cost.

The method was developed within the Graphene Flagship Work Package Production; its deputy leader, Prof. Jonathan Coleman from Trinity College Dublin, Ireland, explains: “This technique produced graphene at higher rates than most other methods, and produced sheets of 2D materials that will be useful in a range of applications, from printed electronics to energy generation.”

[4] Paton K.R., et al., Scalable production of large quantities of defect-free few-layer graphene by shear exfoliation in liquids. Nat. Mater. 13, 624 (2014).

Flexible Displays – Rolled Up in your Pocket

Working with researchers from the Graphene Flagship the Flagship partner, FlexEnable, demonstrated the world’s first flexible display with graphene incorporated into its pixel backplane. Combined with an electrophoretic imaging film, the result is a low-power, durable display suitable for use in many and varied environments.

Emerging from the Graphene Flagship Work Package Flexible Electronics, this work illustrates the power of collaboration. Talking about this is the WP leader, Dr Henrik Sandberg from the VTT Technical Research Centre of Finland Ltd., Finland: “Here we show the power of collaboration. To deliver these flexible demonstrators and prototypes we have seen materials experts working together with components manufacturers and system integrators. These devices will have a potential impact in several emerging fields such as wearables and the Internet of Things.”

Fibre-Optics Data Boost from Graphene

A team of researchers from the Graphene Flagship has demonstrated high-performance photodetectors for infrared fibre-optic communication systems based on wafer-scale graphene [5]. This can increase the amount of information transferred while at the same time making the devices smaller and more cost effective.

Discussing this work, which emerged from the Graphene Flagship Work Package Optoelectronics, is the paper’s lead author, Daniel Schall from AMO, Germany: “Graphene has outstanding properties when it comes to the mobility of its electric charge carriers, and this can increase the speed at which electronic devices operate.”

[5] Schall D., et al., 50 GBit/s Photodetectors Based on Wafer-Scale Graphene for Integrated Silicon Photonic Communication Systems. ACS Photonics. 1 (9), 781 (2014)

Rechargeable Batteries with Graphene

A number of different research groups within the Graphene Flagship are working on rechargeable batteries. One group has developed a graphene-based rechargeable battery of the lithium-ion type used in portable electronic devices [6]. Graphene is incorporated into the battery anode in the form of a spreadable ink containing a suspension of graphene nanoflakes giving an increased energy efficiency of 20%. A second group of researchers have demonstrated a lithium-oxygen battery with high energy density, efficiency and stability [7]. They produced a device with over 90% efficiency that may be recharged more than 2,000 times. Their lithium-oxygen cell features a porous, ‘fluffy’ electrode made from graphene together with additives that alter the chemical reactions at work in the battery.

Graphene Flagship researchers show how the 2D material graphene can improve the energy capacity, efficiency and stability of lithium-oxygen batteries.

Both devices were developed by different groups within the Graphene Flagship Work Package Energy, and speaking of the technology further is Prof. Clare Grey from Cambridge University, UK: “What we’ve achieved is a significant advance for this technology, and suggests whole new areas for research – we haven’t solved all the problems inherent to this chemistry, but our results do show routes forward towards a practical device”.

[6] Hassoun J., et al., An Advanced Lithium-Ion Battery Based on a Graphene Anode and a Lithium Iron Phosphate Cathode. Nano Lett., 14 (8), 4901 (2014)

[7] Liu T., et al., Cycling Li-O2 batteries via LiOH formation and decomposition. Science. 350, 6260, 530 (2015)

Graphene – What and Why?

Graphene is a two-dimensional material formed by a single atom-thick layer of carbon, with the carbon atoms arranged in a honeycomb-like lattice. This transparent, flexible material has a number of unique properties. For example, it is 100 times stronger than steel, and conducts electricity and heat with great efficiency.

A number of practical applications for graphene are currently being developed. These include flexible and wearable electronics and antennas, sensors, optoelectronics and data communication systems, medical and bioengineering technologies, filtration, super-strong composites, photovoltaics and energy storage.

Graphene and Beyond

The Graphene Flagship also covers other layered materials, as well as hybrids formed by combining graphene with these complementary materials, or with other materials and structures, ranging from polymers, to metals, cement, and traditional semiconductors such as silicon. Graphene is just the first of thousands of possible single layer materials. The Flagship plans to accelerate their journey from laboratory to factory floor.

Especially exciting is the possibility of stacking monolayers of different elements to create materials not found in nature, with properties tailored for specific applications. Such composite layered materials could be combined with other nanomaterials, such as metal nanoparticles, in order to further enhance their properties and uses.

Graphene – the Fruit of European Scientific Excellence

Europe, North America and Asia are all active centres of graphene R&D, but Europe has special claim to be at the centre of this activity. The ground-breaking experiments on graphene recognised in the award of the 2010 Nobel Prize in Physics were conducted by European physicists, Andre Geim and Konstantin Novoselov, both at Manchester University. Since then, graphene research in Europe has continued apace, with major public funding for specialist centres, and the stimulation of academic-industrial partnerships devoted to graphene and related materials. It is European scientists and engineers who as part of the Graphene Flagship are closely coordinating research efforts, and accelerating the transfer of layered materials from the laboratory to factory floor.

For anyone who would like links to the published papers, you can check out an April 20, 2016 news item featuring the Graphene Flagship highlights on Nanowerk.

Cities as incubators of technological and economic growth: from the rustbelt to the brainbelt

An April 10, 2016 news article by Xumei Dong on the timesunion website casts a light on what some feel is an emerging ‘brainbelt’ (Note: Links have been removed),

Albany [New York state, US], in the forefront of nanotechnology research, is one of the fastest-growing cities for tech jobs, according to a new book exploring hot spots of innovation across the globe.

“You have GlobalFoundries, which has thousands of employees working in one of the most modern plants in the world,” says Antoine van Agtmael, the Dutch-born investor who wrote “The Smartest Places on Earth: Why Rustbelts Are the Emerging Hotspots of Global Innovation” with Dutch journalist Fred Bakker.

Their book, mentioned in a Brookings Institution panel discussion last week [April 6, 2016], lists Albany as a leading innovation hub — part of an emerging “brainbelt” in the United States.

The Brookings Institution’s The smartest places on Earth: Why rustbelts are the emerging hotspots of global innovation event page provides more details and includes an embedded video of the event (running time: roughly 1 hour 17 mins.), Note: A link has been removed,

The conventional wisdom in manufacturing has long held that the key to maintaining a competitive edge lies in making things as cheaply as possible, which saw production outsourced to the developing world in pursuit of ever-lower costs. In contradiction to that prevailing wisdom, authors Antoine van Agtmael, a Brookings trustee, and Fred Bakker crisscrossed the globe and found that the economic tide is beginning to shift from its obsession with cheap goods to the production of smart ones.

Their new book, “The Smartest Places on Earth” (PublicAffairs, 2016), examines this changing dynamic and the transformation of “rustbelt” cities, the former industrial centers of the U.S. and Europe, into a “brainbelt” of design and innovation.

On Wednesday, April 6 [2016], Centennial Scholar Bruce Katz and the Metropolitan Policy Program hosted an event discussing these emerging hotspots and how cities such as Akron, Albany, Raleigh-Durham, Minneapolis-St. Paul, and Portland in the United States, and Eindhoven, Malmo, Dresden, and Oulu in Europe are seizing the initiative and recovering their economic strength.

You can find the book here or if a summary and biographies of the authors will suffice, there’s this,

The remarkable story of how rustbelt cities such as Akron and Albany in the United States and Eindhoven in Europe are becoming the unlikely hotspots of global innovation, where sharing brainpower and making things smarter—not cheaper—is creating a new economy that is turning globalization on its head.

Antoine van Agtmael and Fred Bakker counter recent conventional wisdom that the American and northern European economies have lost their initiative in innovation and their competitive edge by focusing on an unexpected and hopeful trend: the emerging sources of economic strength coming from areas once known as “rustbelts” that had been written off as yesterday’s story.

In these communities, a combination of forces—visionary thinkers, local universities, regional government initiatives, start-ups, and big corporations—have created “brainbelts.” Based on trust, a collaborative style of working, and freedom of thinking prevalent in America and Europe, these brainbelts are producing smart products that are transforming industries by integrating IT, sensors, big data, new materials, new discoveries, and automation. From polymers to medical devices, the brainbelts have turned the tide from cheap, outsourced production to making things smart right in our own backyard. The next emerging market may, in fact, be the West.

About Antoine van Agtmael and Fred Bakker

Antoine van Agtmael is senior adviser at Garten Rothkopf, a public policy advisory firm in Washington, DC. He was a founder, CEO, and CIO of Emerging Markets Management LLC; previously he was deputy director of the capital markets department of the International Finance Corporation (“IFC”), the private sector oriented affiliate of the World Bank, and a division chief in the World Bank’s borrowing operations. He was an adjunct professor at Georgetown Law Center and taught at the Harvard Institute of Politics. Mr. van Agtmael is chairman of the NPR Foundation, a member of the board of NPR, and chairman of its Investment Committee. He is also a trustee of The Brookings Institution and cochairman of its International Advisory Council. He is on the President’s Council on International Activities at Yale University, the Advisory Council of Johns Hopkins University’s Paul H. Nitze School of Advanced International Studies (SAIS), and a member of the Council on Foreign Relations.

Fred Bakker, until his recent retirement, was a journalist specializing in monetary and financial affairs with Het Financieele Dagblad, the “Financial Times of Holland,” serving as deputy editor, editor-in-chief, and CEO. In addition to his writing and editing duties, he helped develop the company from a newspaper publisher into a multimedia company, developing several websites, a business news radio channel, and a quarterly business magazine, FD Outlook, and was responsible for the establishment of FD Intelligence.

A hard cover copy of the book is $25.99, presumably US currency.