Tag Archives: Netherlands

2016 Nobel Chemistry Prize for molecular machines

Wednesday, Oct. 5, 2016 was the day three scientists were awarded the Nobel Prize in Chemistry for their work on molecular machines, according to an Oct. 5, 2016 news item on phys.org,

Three scientists won the Nobel Prize in chemistry on Wednesday [Oct. 5, 2016] for developing the world’s smallest machines, 1,000 times thinner than a human hair but with the potential to revolutionize computer and energy systems.

Frenchman Jean-Pierre Sauvage, Scottish-born Fraser Stoddart and Dutch scientist Bernard “Ben” Feringa share the 8 million kronor ($930,000) prize for the “design and synthesis of molecular machines,” the Royal Swedish Academy of Sciences said.

Machines at the molecular level have taken chemistry to a new dimension and “will most likely be used in the development of things such as new materials, sensors and energy storage systems,” the academy said.

Practical applications are still far away—the academy said molecular motors are at the same stage that electrical motors were in the first half of the 19th century—but the potential is huge.

Dexter Johnson in an Oct. 5, 2016 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) provides some insight into the matter (Note: A link has been removed),

In what seems to have come both as a shock to some of the recipients and a confirmation to all those who envision molecular nanotechnology as the true future of nanotechnology, Bernard Feringa, Jean-Pierre Sauvage, and Sir J. Fraser Stoddart have been awarded the 2016 Nobel Prize in Chemistry for their development of molecular machines.

The Nobel Prize was awarded to all three of the scientists based on their complementary work over nearly three decades. First, in 1983, Sauvage (currently at Strasbourg University in France) was able to link two ring-shaped molecules to form a chain. Then, eight years later, Stoddart, a professor at Northwestern University in Evanston, Ill., demonstrated that a molecular ring could turn on a thin molecular axle. Then, eight years after that, Feringa, a professor at the University of Groningen, in the Netherlands, built on Stoddart’s work and fabricated a molecular rotor blade that could spin continually in the same direction.

Speaking of the Nobel committee’s selection, Donna Nelson, a chemist and president of the American Chemical Society told Scientific American: “I think this topic is going to be fabulous for science. When the Nobel Prize is given, it inspires a lot of interest in the topic by other researchers. It will also increase funding.” Nelson added that this line of research will be fascinating for kids. “They can visualize it, and imagine a nanocar. This comes at a great time, when we need to inspire the next generation of scientists.”

The Economist, which appears to be previewing an article about the 2016 Nobel prizes ahead of the print version, has this to say in its Oct. 8, 2016 article,

BIGGER is not always better. Anyone who doubts that has only to look at the explosion of computing power which has marked the past half-century. This was made possible by continual shrinkage of the components computers are made from. That success has, in turn, inspired a search for other areas where shrinkage might also yield dividends.

One such, which has been poised delicately between hype and hope since the 1990s, is nanotechnology. What people mean by this term has varied over the years—to the extent that cynics might be forgiven for wondering if it is more than just a fancy rebranding of the word “chemistry”—but nanotechnology did originally have a fairly clear definition. It was the idea that machines with moving parts could be made on a molecular scale. And in recognition of this goal Sweden’s Royal Academy of Science this week decided to award this year’s Nobel prize for chemistry to three researchers, Jean-Pierre Sauvage, Sir Fraser Stoddart and Bernard Feringa, who have never lost sight of nanotechnology’s original objective.

Optimists talk of manufacturing molecule-sized machines ranging from drug-delivery devices to miniature computers. Pessimists recall that nanotechnology is a field that has been puffed up repeatedly by both researchers and investors, only to deflate in the face of practical difficulties.

There is, though, reason to hope it will work in the end. This is because, as is often the case with human inventions, Mother Nature has got there first. One way to think of living cells is as assemblies of nanotechnological machines. For example, the enzyme that produces adenosine triphosphate (ATP)—a molecule used in almost all living cells to fuel biochemical reactions—includes a spinning molecular machine rather like Dr Feringa’s invention. This works well. The ATP generators in a human body turn out so much of the stuff that over the course of a day they create almost a body-weight’s-worth of it. Do something equivalent commercially, and the hype around nanotechnology might prove itself justified.

Congratulations to the three winners!

Sonifying a swimmer’s performance to improve technique by listening

I imagine that, with the 2016 Olympic Games over, athletes and their coaches will soon start training for the 2020 Games. Researchers at Bielefeld University (Germany) have developed a new system for helping swimmers improve their technique (Note: The following video is German language with English language subtitles),

An Aug. 4, 2016 Bielefeld University press release (also on EurekAlert), tells more,

Since 1896, swimming has been an event in the Olympic games. Back then it was the swimmer’s physical condition that was decisive in securing a win, but today it is mostly technique that determines who takes home the title of world champion. Researchers at Bielefeld University have developed a system that professional swimmers can use to optimize their swimming technique. The system expands the athlete’s perception and feel for the water by enabling them to hear, in real time, how the pressure of the water flows created by the swimmer changes with their movements. This gives the swimmer an advantage over his competitors because he can refine the execution of his technique. This “Swimming Sonification” system was developed at the Cluster of Excellence Cognitive Interaction Technology (CITEC) of Bielefeld University. In a video, Bielefeld University’s own “research_tv” reports on the new system.

“Swimmers see the movements of their hands. They also feel how the water glides over their hands, and they sense how quickly they are moving forwards. However, the majority of swimmers are not very aware of one significant factor: how the pressure exerted by the flow of the water on their bodies changes,” says Dr. Thomas Hermann of the Cluster of Excellence Cognitive Interaction Technology (CITEC). The sound researcher is working on converting data into sounds that can be used to benefit the listener. This is called sonification, a process in which measured data values are systematically turned into audible sounds and noises. “In this project, we are using the pressure from water flows as the data source,” says Hermann, who heads CITEC research group Ambient Intelligence. “We convert into sound how the pressure of water flows changes while swimming – in real time. We play the sounds to the swimmer over headphones so that they can then adjust their movements based on what they hear,” explains Hermann.

For this research project on swimming sonification, Dr. Hermann is working together with Dr. Bodo Ungerechts of the Faculty of Psychology and Sports Science. As a biomechanist, Dr. Ungerechts deals with how human beings control their movements, particularly with swimming. “If a swimmer registers how the flow pressure changes by hearing, he can better judge, for instance, how he can produce more thrust at similar energy costs. This gives the swimmer a more encompassing perception of his movements in the water,” says Dr. Ungerechts. The researcher even tested the system out for himself. “I was surprised at just how well the sonification and the effects of the water flow, which I felt myself, corresponded with one another,” he says. The system is intuitive and easy to use. “You immediately start playing with the sounds to hear, for example, what tonal effect spreading your fingers apart or changing the position of your hand has,” says Ungerechts. The new system should open up new training possibilities for athletes. “By using this system, swimmers develop a harmony – a kind of melody. If a swimmer very quickly masters a lap, they can use the recording of the melody to mentally re-imagine and retrace the successful execution of this lap. This mental training can also help athletes perform successfully in competitions.” To this, Thomas Hermann adds: “The ear is great at perceiving rhythm and changes in rhythm. In this way, swimmers can find their own rhythm and use this to orient themselves in the water.”

This system includes two gloves with thin tube ends that serve as pressure sensors and are fixed between the fingers. The swimmer wears these gloves during practice. The tubes are linked to a measuring instrument, which is currently connected to the swimmer via a line while he or she is swimming. The measuring device transmits data about water flow pressure to a laptop. Custom-made software then sonifies the data, meaning that it turns the information into sound. “During repeated hand actions, for instance, the system can make rising and sinking flow pressure audible as increasing or decreasing tonal pitches,” says Thomas Hermann. Other settings that sonify features such as symmetry or steadiness can also be activated as needed.
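For readers curious what that pressure-to-pitch mapping might look like in code, here is a minimal sketch; it is purely illustrative (the pressure range and frequency band are my own assumptions, not details of the CITEC software),

```python
# Illustrative sketch (not the CITEC software): map flow-pressure
# samples linearly onto an audible frequency range, so that rising
# pressure is heard as a rising tone and falling pressure as a falling one.
def pressure_to_frequency(pressure, p_min=0.0, p_max=1.0,
                          f_min=220.0, f_max=880.0):
    """Map a pressure reading onto a frequency (Hz) between f_min and f_max."""
    # Clamp the reading to the calibrated pressure range.
    p = max(p_min, min(p_max, pressure))
    # Normalise to 0..1, then scale into the chosen frequency band.
    t = (p - p_min) / (p_max - p_min)
    return f_min + t * (f_max - f_min)

# A short run of made-up pressure samples from one hand stroke.
samples = [0.1, 0.3, 0.6, 0.9, 0.5]
tones = [round(pressure_to_frequency(p)) for p in samples]
print(tones)  # [286, 418, 616, 814, 550]
```

In a real system these frequencies would drive a synthesizer in real time; the point here is only the shape of the mapping.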

The sounds are transmitted to the swimmer in real time over headphones. When the swimmer modifies a movement, he hears live how this also changes the sound. With the sonification of aquatic flow pressure, the swimmer can now practice the front crawl in such a way that, for instance, both hands displace the water masses with the same water flow form – to do this, the swimmer just has to make sure that he generates the same sound pattern with each hand. Because the coach also hears the sounds over speakers, he can base the instructions he gives to the swimmer not only on the movements he observes, but also on the sounds generated by the swimmer and their rhythm (e.g. “Move your hands so that the tonal pitch increases faster”).

For this sonification project, Thomas Hermann and Bodo Ungerechts are working with Daniel Cesarini, Ph.D., a researcher from the Department of Information Engineering at the University of Pisa in Italy. Dr. Cesarini developed the measuring device that analyzes the aquatic flow pressure data.

In a practical workshop held in September 2015, professional swimmers tested the system out and confirmed that it indeed helped them to optimize their swimming technique. Of the 10 swimmers who participated, three qualify for international competitions, and one of the female swimmers is competing this year at the Paralympics in Rio de Janeiro, Brazil. The workshop was funded by the Cluster of Excellence Cognitive Interaction Technology (CITEC). In addition to this, swim teams at the PSV Eindhoven (Philips Sports Union Eindhoven) in the Netherlands tested the new system out for two months, using it as part of their technique training sessions. The PSV swim club competes in the top swimming league in the Netherlands.

“It is advantageous for swimmers to receive immediate feedback on their swimming form,” says Thomas Hermann. “People learn more quickly when they get direct feedback because they can immediately test how the feedback – in this case, the sound – changes when they try out something new.”

The researchers want to continue developing their current prototype. “We are planning to develop a wearable system that can be used independently by the user, without the help of others,” says Thomas Hermann. In addition to this, the new sonification method is planned to be incorporated into long-term training programs in cooperation with swim clubs.

My first post about sonification was this February 7, 2014 post titled, Data sonification: listening to your data instead of visualizing it.

As for this swimmer’s version of data sonification, you can find out more about the project here and/or here.

Nuclear magnetic resonance microscope breaks records

Dutch researchers have found a way to apply the principles underlying magnetic resonance imaging (MRI) to a microscope designed *for* examining matter and life at the nanoscale. From a July 15, 2016 news item on phys.org,

A new nuclear magnetic resonance (NMR) microscope gives researchers an improved instrument to study fundamental physical processes. It also offers new possibilities for medical science—for example, to better study proteins in Alzheimer’s patients’ brains. …

A Leiden Institute of Physics press release, which originated the news item, expands on the theme,

If you get a knee injury, physicians use an MRI machine to look right through the skin and see what exactly is the problem. For this trick, doctors make use of the fact that our body’s atomic nuclei are electrically charged and spin around their axis. Just like small electromagnets they induce their own magnetic field. By placing the knee in a uniform magnetic field, the nuclei line up with their axis pointing in the same direction. The MRI machine then sends a specific type of radio waves through the knee, causing some axes to flip. After turning off this signal, those nuclei flip back after some time, emitting a small radio wave. Those waves give away the atoms’ location, and provide physicians with an accurate image of the knee.

MRI is the medical application of Nuclear Magnetic Resonance (NMR), which is based on the same principle and was invented by physicists to conduct fundamental research on materials. One of the things they study with NMR is the so-called relaxation time. This is the time scale at which the nuclei flip back and it gives a lot of information about a material’s properties.
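To make “relaxation time” a little more concrete, here is a small sketch of recovering T1 from the standard exponential recovery law M(t) = M0(1 - exp(-t/T1)). The numbers are made up and noiseless; this is only an illustration, not the Leiden group’s analysis,

```python
import math

# Simulate a noiseless longitudinal-relaxation measurement:
# the magnetization recovers as M(t) = M0 * (1 - exp(-t / T1)).
M0, T1_true = 1.0, 2.0  # arbitrary units, seconds (invented values)
times = [0.5, 1.0, 2.0, 4.0, 8.0]
signal = [M0 * (1 - math.exp(-t / T1_true)) for t in times]

# Recover T1: since ln(1 - M/M0) = -t / T1, a least-squares line
# through the origin gives the slope -1/T1.
xs = times
ys = [math.log(1 - m / M0) for m in signal]
slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
T1_est = -1 / slope
print(round(T1_est, 3))  # 2.0
```

Real measurements are noisy and need proper curve fitting, but the principle (timing how fast the nuclei flip back) is the same.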

To study materials on the smallest of scales as well, physicists go one step further and develop NMR microscopes, with which they study the mechanics behind physical processes at the level of a group of atoms. Now Leiden PhD students Jelmer Wagenaar and Arthur de Haan have built an NMR microscope, together with principal investigator Tjerk Oosterkamp, that operates at a record temperature of 42 milliKelvin—close to absolute zero. In their article in Physical Review Applied they prove it works by measuring the relaxation time of copper. They achieved a thousand times higher sensitivity than existing NMR microscopes—also a world record.

With their microscope, they give physicists an instrument to conduct fundamental research on many physical phenomena, like systems displaying strange behavior in extreme cold. And like NMR eventually led to MRI machines in hospitals, NMR microscopes have great potential too. Wagenaar: ‘One example is that you might be able to use our technique to study Alzheimer patients’ brains at the molecular level, in order to find out how iron is locked up in proteins.’

Here’s a link to and a citation for the paper,

Probing the Nuclear Spin-Lattice Relaxation Time at the Nanoscale by J. J. T. Wagenaar, A. M. J. den Haan, J. M. de Voogd, L. Bossoni, T. A. de Jong, M. de Wit, K. M. Bastiaans, D. J. Thoen, A. Endo, T. M. Klapwijk, J. Zaanen, and T. H. Oosterkamp. Phys. Rev. Applied 6, 014007. Published 15 July 2016. DOI: http://dx.doi.org/10.1103/PhysRevApplied.6.014007

This paper is open access.

*’fro’ changed to ‘for’ on Aug. 3, 2016.

Trans-Atlantic Platform (T-AP) is a unique collaboration of humanities and social science researchers from Europe and the Americas

Launched in 2013, the Trans-Atlantic Platform is co-chaired by Dr. Ted Hewitt, president of the Social Sciences and Humanities Research Council of Canada (SSHRC), and Dr. Renée van Kessel-Hagesteijn, Netherlands Organisation for Scientific Research—Social Sciences (NWO—Social Sciences).

An EU (European Union) publication, International Innovation, features an interview about T-AP with Ted Hewitt in a June 30, 2016 posting,

The Trans-Atlantic Platform is a unique collaboration of humanities and social science funders from Europe and the Americas. International Innovation’s Rebecca Torr speaks with Ted Hewitt, President of the Social Sciences and Humanities Research Council and Co-Chair of T-AP to understand more about the Platform and its pilot funding programme, Digging into Data.

Many commentators have called for better integration between natural and social scientists, to ensure that the societal benefits of STEM research are fully realised. Does the integration of diverse scientific disciplines form part of T-AP’s remit, and if so, how are you working to achieve this?

T-AP was designed primarily to promote and facilitate research across SSH. However, given the Platform’s thematic priorities and the funding opportunities being contemplated, we anticipate that a good number of non-SSH [emphasis mine] researchers will be involved.

As an example, on March 1, T-AP launched its first pilot funding opportunity: the T-AP Digging into Data Challenge. One of the sponsors is the Natural Sciences and Engineering Research Council of Canada (NSERC), Canada’s federal funding agency for research in the natural sciences and engineering. Their involvement ensures that the perspective of the natural sciences is included in the challenge. The Digging into Data Challenge is open to any project that addresses research questions in the SSH by using large-scale digital data analysis techniques, and is then able to show how these techniques can lead to new insights. And the challenge specifically aims to advance multidisciplinary collaborative projects.

When you tackle a research question or undertake research to address a social challenge, you need collaboration between various SSH disciplines or between SSH and STEM disciplines. So, while proposals must address SSH research questions, the individual teams often involve STEM researchers, such as computer scientists.

In previous rounds of the Digging into Data Challenge, this has led to invaluable research. One project looked at how the media shaped public opinion around the 1918 Spanish flu pandemic. Another used CT scans to examine hundreds of mummies, ultimately discovering that atherosclerosis, a form of heart disease, was prevalent 4,000 years ago. In both cases, these multidisciplinary historical research projects have helped inform our thinking of the present.
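To give a flavour of the “large-scale digital data analysis” behind projects like the 1918 flu study, here is a toy sketch of counting topic mentions per year across a set of dated articles. The articles below are invented for illustration; this is in no way the actual project’s code or data,

```python
from collections import Counter

# Invented miniature "newspaper corpus": (date, article text) pairs.
articles = [
    ("1918-10-02", "influenza spreads through the city"),
    ("1918-10-15", "schools closed as influenza deaths rise"),
    ("1919-01-07", "influenza wave recedes"),
    ("1919-03-20", "city life returns to normal"),
]

# Count, per year, how many articles mention the topic of interest.
mentions_per_year = Counter(
    date[:4] for date, text in articles if "influenza" in text.lower()
)
print(dict(mentions_per_year))  # {'1918': 2, '1919': 1}
```

Scaled up to millions of digitized pages, this kind of simple counting is one building block of tracing how media coverage of the pandemic rose and fell.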

Of course, Digging into Data isn’t the only research area in which T-AP will be involved. Since its inception, T-AP partners have identified three priority areas beyond digital scholarship: diversity, inequality and difference; resilient and innovative societies; and transformative research on the environment. Each of these areas touches on a variety of SSH fields, while the transformative research on the environment area has strong connections with STEM fields. In September 2015, T-AP organised a workshop around this third priority area; environmental science researchers were among the workshop participants.

I wish Hewitt hadn’t described researchers from disciplines other than the humanities and social sciences as “non-SSH.” The designation divides the world in two: us and non-us; take your pick: non-Catholic/Muslim/American/STEM/SSH/etc.

Getting back to the interview, it is surprisingly Canuck-centric in places,

How does T-AP fit in with Social Sciences and Humanities Research Council of Canada (SSHRC)’s priorities?

One of the objectives in SSHRC’s new strategic plan is to develop partnerships that enable us to expand the reach of our funding. As T-AP provides SSHRC with links to 16 agencies across Europe and the Americas, it is an efficient mechanism for us to broaden the scope of our support and promotion of post-secondary-based research and training in SSH.

It also provides an opportunity to explore cutting edge areas of research, such as big data (as we did with the first call we put out, Digging into Data). The research enterprise is becoming increasingly international, by which I mean that researchers are working on issues with international dimensions or collaborating in international teams. In this globalised environment, SSHRC must partner with international funders to support research excellence. By developing international funding opportunities, T-AP helps researchers create teams better positioned to tackle the most exciting and promising research topics.

Finally, it is a highly effective way of broadly promoting the value of SSH research throughout Canada and around the globe. There are significant costs and complexities involved in international research, and uncoordinated funding from multiple national funders can actually create barriers to collaboration. A platform like T-AP helps funders coordinate and streamline processes.

The interview gets a little more international scope when it turns to the data project,

What is the significance of your pilot funding programme in digital scholarship and what types of projects will it support?

The T-AP Digging into Data Challenge is significant for several reasons. First, the geographic reach of Digging is truly significant. With 16 participants from 11 countries, this round of Digging has significantly broader participation than previous rounds. This is also the first time Digging into Data includes funders from South America.

The T-AP Digging into Data Challenge is open to any research project that addresses questions in SSH. What those projects will end up being is anybody’s guess – projects from past competitions have involved fields ranging from musicology to anthropology to political science.

The Challenge’s main focus is, of course, the use of big data in research.

You may want to read the interview in its entirety here.

I have checked out the Trans-Atlantic Platform website but cannot determine how someone or some institution might consult that site for information on how to get involved in their projects or get funding. However, there is a T-AP Digging into Data website where there is evidence of the first international call for funding submissions. Sadly, the deadline for the 2016 call has passed if the website is to be believed (sometimes people are late when changing deadline dates).

3D brain-on-a-chip from the University of Twente

Dutch researchers have developed a 3D brain-on-a-chip according to a June 23, 2016 news item on Nanowerk,

To study brain cells’ operation and test the effect of medication on individual cells, the conventional Petri dish with flat electrodes is not sufficient. For truly realistic studies, cells have to flourish within three-dimensional surroundings.

Bart Schurink, researcher at University of Twente’s MESA+ Institute for Nanotechnology, has developed a sieve with 900 openings, each of which has the shape of an inverted pyramid. On top of this array of pyramids, a micro-reactor takes care of cell growth. Schurink defends his PhD thesis June 23 [2016].

A June 23, 2016 University of Twente press release, which originated the news item, provides more detail,

A brain-on-a-chip demands more than a series of electrodes in 2D, on which brain cells can be cultured. To mimic the brain in a realistic way, you need facilities for fluid flow, and the cells need some freedom for themselves even when they are kept at predefined spaces. Schurink therefore developed a micro sieve structure with hundreds of openings on a 2 by 2 mm surface. Each of these holes has the shape of an inverted pyramid. Each pyramid, in turn, is equipped with an electrode, for measuring electrical signals or sending stimuli to the network. At the same time, liquids can flow through tiny holes, needed to capture the cells and for sending nutrients or medication to a single cell.


After neurons have been placed inside all the pyramids, they will start to form a network. This is not just a 2D network between the holes: by placing a micro reactor on top of the sieve, a neuron network can develop in the vertical direction as well. Growth and electrical activity can be monitored subsequently: each individual cell can be identified by the pyramid it is in. Manufacturing this system demands a lot both of the production facilities at UT’s NanoLab and of the creative solutions the designers come up with. For example, finding the proper way of guaranteeing the same dimensions for every hole is quite challenging.
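Because each of the 900 openings carries its own electrode, a recorded signal can be traced back to the specific pyramid holding the cell that produced it. Here is a toy sketch of that bookkeeping; the 30 x 30 layout and the centre-to-centre spacing are my own assumptions for illustration, not details from the thesis,

```python
# Hypothetical bookkeeping for a 900-opening sieve: assume the openings
# form a 30 x 30 grid on the 2 x 2 mm (2000 x 2000 micrometre) surface.
GRID = 30                # assumed 30 x 30 layout of the 900 openings
PITCH_UM = 2000 / GRID   # assumed centre-to-centre spacing in micrometres

def pyramid_position(channel):
    """Map an electrode channel (0..899) to its pyramid's grid and position."""
    row, col = divmod(channel, GRID)
    return row, col, col * PITCH_UM, row * PITCH_UM

print(pyramid_position(0))    # (0, 0, 0.0, 0.0)
print(pyramid_position(899))  # (29, 29, ~1933.3, ~1933.3)
```

Any spike detected on a given channel can then be attributed to the single cell captured in that pyramid.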

Schurink’s new µSEA (micro sieve electrode array) has been tested with living cells, from the brains of laboratory rats. Both the positioning of the cells and neuronal network growth have been tested. The result of this PhD research is a fully new research platform for performing research on the brain, diseases and effects of medication.

Schurink (1982) has conducted his research within the group Meso Scale Chemical Systems, of Prof Han Gardeniers. The group is part of the MESA+ Institute for Nanotechnology of the University of Twente. Schurink’s thesis is titled ‘Microfabrication and microfluidics for 3D brain-on-chip’ …

I have written one other piece about a ‘3D’ organ-on-a-chip project in China (my Jan. 29, 2016 posting).

Artists classified the animal kingdom?

Where taxonomy and biology are concerned, my knowledge begins and ends with Carl Linnaeus, the Swedish scientist who ushered in modern taxonomy. It was with some surprise that I found out artists also helped develop the field. From a June 21, 2016 news item on ScienceDaily,

In the sixteenth and seventeenth centuries artists were fascinated by how the animal kingdom was classified. They were in some instances ahead of natural historians.

This is one of the findings of art historian Marrigje Rikken. She will defend her PhD on 23 June [2016] on animal images in visual art. In recent years she has studied how images of animals between 1550 and 1630 became an art genre in themselves. ‘The close relationship between science and art at that time was remarkable,’ Rikken comments. ‘Artists tried to bring some order to the animal kingdom, just as biologists did.’

A June 21, 2016 Universiteit Leiden (Leiden University, Netherlands) press release, which originated the news item, expands on the theme,

In some cases the artists were ahead of their times. They became interested in insects, for example, before they attracted the attention of natural historians. It was artist Joris Hoefnagel who in 1575 made the first miniatures featuring beetles, butterflies and dragonflies, indicating how they were related to one another. In his four albums Hoefnagel divided the animal species according to the elements of fire, water, air and earth, but within these classifications he grouped animals on the basis of shared characteristics.


Beetles, butterflies, and dragonflies by Joris Hoefnagel. Courtesy: Universiteit Leiden

The press release goes on,

Other illustrators, print-makers and painters tried to bring some cohesion to the animal kingdom. Some of them used an alphabetical system, but artist Marcus Gheeraerts published a print as early as 1583 [visible below, Ed.] in which he grouped even-toed ungulates together. The giraffe and sheep – both visible on Gheeraerts’ print – belong to this group of animals. This doesn’t apply to all of Gheeraerts’ animals. The mythical unicorn, which was featured by Gheeraerts, no longer appears in contemporary biology books.

Wealthy courtiers

According to Rikken, the so-called menageries played an important role historically in how animals were represented. These forerunners of today’s zoos were popular in the sixteenth and seventeenth centuries, particularly among wealthy rulers and courtiers. Unfamiliar exotic animals regularly arrived and were immediately committed to paper by artists. Rikken: ‘The toucan, for example, was immortalised in 1615 by Jan Brueghel the Elder, court painter in Brussels.’ [See the main image, Ed.]

In the flesh

Rikken also discovered that the number of animals featured in a work gradually increased. ‘Artists from the 1570s generally included one or just a few animals per work. With the arrival of print series a decade later, each illustration tended to include more and more animals. This trend reached its peak in the lavish paintings produced around 1600.’ These paintings are also much more varied than the drawings and prints. Illustrators and print-makers often blindly copied one another’s motifs, even showing the animals in an identical pose. Artists had no hesitation in including the same animal in different positions. Rikken: ‘This allowed them to show that they had observed the animal in the flesh.’

Even-toed ungulates by Marcus Gheeraerts. Courtesy: Leiden Universiteit


Yet more proof or, at least, a very strong suggestion that art and science are tightly linked.

Lungs: EU SmartNanoTox and Pneumo NP

I have three news bits about lungs: one concerning relatively new techniques for testing the impact nanomaterials may have on lungs, and two concerning developments at PneumoNP: the first regarding a new technique for getting antibiotics to a lung infected with pneumonia and the second, a new antibiotic.

Predicting nanotoxicity in the lungs

From a June 13, 2016 news item on Nanowerk,

Scientists at the Helmholtz Zentrum München [German Research Centre for Environmental Health] have received more than one million euros in the framework of the European Horizon 2020 Initiative [a major European Commission science funding initiative, successor to the Framework Programme 7 initiative]. Dr. Tobias Stöger and Dr. Otmar Schmid from the Institute of Lung Biology and Disease and the Comprehensive Pneumology Center (CPC) will be using the funds to develop new tests to assess risks posed by nanomaterials in the airways. This could contribute to reducing the need for complex toxicity tests.

A June 13, 2016 Helmholtz Zentrum München (German Research Centre for Environmental Health) press release, which originated the news item, expands on the theme,

Nanoparticles are extremely small particles that can penetrate into remote parts of the body. While researchers are investigating various strategies for harvesting the potential of nanoparticles for medical applications, they could also pose inherent health risks*. Currently the hazard assessment of nanomaterials necessitates a complex and laborious procedure. In addition to complete material characterization, controlled exposure studies are needed for each nanomaterial in order to guarantee the toxicological safety.

As a part of the EU SmartNanoTox project, which has now been funded with a total of eight million euros, eleven European research partners, including the Helmholtz Zentrum München, want to develop a new concept for the toxicological assessment of nanomaterials.

Reference database for hazardous substances

Biologist Tobias Stöger and physicist Otmar Schmid, both research group heads at the Institute of Lung Biology and Disease, hope that the use of modern methods will help to advance the assessment procedure. “We hope to make more reliable nanotoxicity predictions by using modern approaches involving systems biology, computer modelling, and appropriate statistical methods,” states Stöger.

The lung experts are concentrating primarily on the respiratory tract. The approach involves defining a representative selection of toxic nanomaterials and conducting an in-depth examination of their structure and the various molecular modes of action that lead to their toxicity. These data are then digitalized and transferred to a reference database for new nanomaterials. Economical tests that are easy to conduct should then make it possible to assess the toxicological potential of these new nanomaterials by comparing the test results with what is already known from the database. “This should make it possible to predict whether or not a newly developed nanomaterial poses a health risk,” Otmar Schmid says.
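The comparison step can be pictured with a toy sketch: a new material’s measured descriptors are matched against a reference database, and the new material inherits the hazard label of its nearest known neighbour. The descriptors, database entries, and labels below are all invented; the actual project relies on far more sophisticated systems-biology and statistical models,

```python
import math

# Invented reference database:
# name -> ((diameter_nm, surface_area_m2_per_g), hazard_label)
reference_db = {
    "material_A": ((20.0, 150.0), "high"),
    "material_B": ((200.0, 15.0), "low"),
    "material_C": ((50.0, 60.0), "moderate"),
}

def predict_hazard(descriptors):
    """Label a new material by its nearest neighbour in descriptor space."""
    def dist(entry):
        (d, s), _label = entry
        return math.hypot(d - descriptors[0], s - descriptors[1])
    _descriptors, label = min(reference_db.values(), key=dist)
    return label

# A new material with small diameter and large surface area lands
# closest to material_A, so it inherits that hazard label.
print(predict_hazard((25.0, 140.0)))  # high
```

The economical tests mentioned above would supply the descriptors; the database lookup then replaces a full toxicity study for routine screening.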

* Review: Schmid, O. and Stoeger, T. (2016). Surface area is the biologically most effective dose metric for acute nanoparticle toxicity in the lung. Journal of Aerosol Science, DOI:10.1016/j.jaerosci.2015.12.006

The SmartNanoTox webpage is here on the European Commission’s Cordis website.

Carrying antibiotics into lungs (PneumoNP)

I received this news from the European Commission’s PneumoNP project (I wrote about PneumoNP in a June 26, 2014 posting when it was first announced). This latest development is from a March 21, 2016 email (the original can be found here on the How to pack antibiotics in nanocarriers webpage on the PneumoNP website),

PneumoNP researchers work on a complex task: attaching antibiotics to, or encapsulating them in, nanocarriers that are stable enough to be included in an aerosol formulation, pass through the respiratory tract, and finally deliver the antibiotics to the areas of the lungs affected by pneumonia infections. The good news is that they have finally identified two promising methods for generating nanocarriers.

Until now, compacting polymer coils into single-chain nanoparticles in water under mild conditions was an unsolved problem. But in Spain, IK4-CIDETEC scientists developed a covalent-based method that produces nanocarriers with remarkable stability under those conditions. As a cherry on the cake, the preparation is scalable for industrial production. IK4-CIDETEC has patented the process.

Fig.: A polymer coil (step 1) compacts into a nanocarrier with cross-linkers (step 2). Then, antibiotics get attached to the nanocarrier (step 3).

At the same time, another route to produce lipidic nanocarriers has been developed by researchers from Utrecht University. In particular, they optimized a method that assembles lipids directly around a drug. The resulting lipidic nanocarriers show encouraging stability and are able to carry sufficient quantities of antibiotics.

Fig.: In the presence of antibiotics, a lipidic layer (step 1) aggregates around the drug (step 2) until the lipids form a capsule around the antibiotics (step 3).

Assays of both polymeric and lipidic nanocarriers are currently being performed by the ITEM Fraunhofer Institute in Germany, Ingeniatrics Tecnologias in Spain, and Erasmus Medical Centre in the Netherlands. Some of these tests make sure that the nanocarriers are not toxic to cells; others verify the efficiency of the antibiotics against Klebsiella pneumoniae bacteria when attached to nanocarriers.

A new antibiotic for pneumonia (PneumoNP)

A June 14, 2016 PneumoNP press release (received via email) announces work on a promising new approach to an antibiotic for pneumonia,

The antimicrobial peptide M33 may be the long-sought substitute to treat difficult lung infections, like multi-drug resistant pneumonia.

In 2013, the European Respiratory Society predicted 3 million cases of pneumonia in Europe every year [1]. The standard treatment for pneumonia is intravenous administration of a combination of drugs. This leads to the development of antibiotic resistance in the population. Gradually, doctors are running out of solutions to cure patients. An Italian company suggests a new option: the M33 peptide.

A few years ago, the Italian company SetLance SRL decided to investigate the M33 peptide. This antimicrobial peptide is an optimized version of an artificial peptide sequence selected for its efficacy and stability. So far, it has shown encouraging in-vitro results against multidrug-resistant Gram-negative bacteria, including Klebsiella pneumoniae. With the support of EU funding through the PneumoNP project, SetLance SRL had the opportunity to develop a new formulation of M33 that enhances its antimicrobial activity.

The new formulation of M33 fights Gram-negative bacteria in three steps. First, M33 binds to the lipopolysaccharides (LPS) on the outer membrane of the bacteria. The molecule then forms a helix and finally disrupts the membrane, causing the cytoplasm to leak out. The peptide enabled up to 80% of mice to survive Pseudomonas aeruginosa-based lung infections. Beyond these encouraging results, the toxicity of the new M33 formulation seems to be much lower than that of antimicrobial peptides currently used in clinical practice, such as colistin [2].

Lately, SetLance has scaled up the synthesis route and is now able to produce several hundred milligrams per batch. The molecule is robust enough for industrial production. We may expect this drug to enter clinical development and validation at the beginning of 2018.

[1] http://www.erswhitebook.org/chapters/acute-lower-respiratory-infections/pneumonia/
[2] Ceccherini et al., "Antimicrobial activity of levofloxacin-M33 peptide conjugation or combination," MedChemComm, 2016; Brunetti et al., "In vitro and in vivo efficacy, toxicity, bio-distribution and resistance selection of a novel antibacterial drug candidate," Scientific Reports, 2016

I believe all the references are open access.

Brief final comment

The only element linking these news bits together is that they concern the lungs.

Open access to nanoparticles and nanocomposites

One of the major issues for developing nanotechnology-enabled products is access to nanoparticles and nanocomposites. For example, I’ve had a number of requests from entrepreneurs for suggestions as to how to access cellulose nanocrystals (CNC) so they can develop a product idea. (It’s been a few years since the last request and I hope that means it’s easier to get access to CNC.)

Regardless, access remains a problem and the European Union has devised a solution which allows open access to nanoparticles and nanocomposites through project Co-Pilot. The announcement was made in a May 10, 2016 news item on Nanowerk (Note: A link has been removed),

“What opportunities does nanotechnology offer in general, and what do nanoparticles offer for my products and processes?” So far, this question cannot be answered easily. The preparation and modification of nanoparticles and their further processing require special technical infrastructure and complex knowledge. For small and medium-sized businesses, building this infrastructure on the off chance of success is often not worth it. Even large companies shy away from the risks. As a result, many good ideas just stay in the drawer.

Simple, open access to high-quality infrastructure for the reliable production of small batches of functionalized nanoparticles and nanocomposites for testing could ease the way towards new nano-based products for chemical and pharmaceutical companies. The European Union has allocated funds for the construction of a number of pilot lines and open-access infrastructure within the framework of the EU project CoPilot.

A May 9, 2016 Fraunhofer-Institut für Silicatforschung press release, which originated the news item, offers greater description,

A consortium of 13 partners from research and industry, including the nanotechnology specialist TNO from the Netherlands and the Fraunhofer Institute for Silicate Research ISC from Wuerzburg, Germany, as well as seven nanomaterial manufacturers, is currently setting up the pilot line in Wuerzburg. First, they are establishing particle production, modification, and compounding at pilot scale based on four different model systems. The approach enables maximum variability and flexibility for the pilot production of various particle systems and composites. Two further open-access lines will be established at TNO in Eindhoven and at the Sueddeutsche Kunststoffzentrum SKZ in Selb.

The “nanoparticle kitchen”

Essential elements of the pilot line in Wuerzburg are particle synthesis in batches of up to 100 liters, modification and separation methods such as a semi-continuously operating centrifuge, in-line analysis, and techniques for the uniform, agglomeration-free incorporation of nanoparticles into composites. Dr. Karl Mandel, head of Particle Technology at Fraunhofer ISC, compares the pilot line to a high-tech kitchen: “We provide the top-notch equipment and the star chefs to synthesize a nano menu à la carte as well as nanoparticles according to individual requests. Thus, companies can test their own recipes – or our existing recipes – before they practice their own cooking or set up their own nano kitchen.”

In the future, the EU project will offer companies a contact point if they want to try out a nano idea and need enough material for sampling and for estimating future production costs. This minimizes the development risk on the one hand and maximizes flexibility and production reliability on the other. To give many companies the opportunity to influence the direction and setup of the nanoparticle kitchen, the project partners will offer open meetings on a regular basis.

I gather Co-Pilot has been offering workshops. The next is in July 2016 according to the press release,

The next workshop in this context takes place at Fraunhofer ISC in Wuerzburg on 7 July 2016. The partners will present the pilot line and the first results of the four model systems – layered double hydroxide nanoparticle-polymer composites for flame-inhibiting fillers, titanium dioxide nanoparticles for high-refractive-index composites, magnetic particles for innovative catalysts, and hollow silica composites for anti-glare coatings. Interested companies can find more information about the upcoming workshop on the website of the project www.h2020copilot.eu and on the website of Fraunhofer ISC www.isc.fraunhofer.de, which hosts the event.

I tracked down a tiny bit more information about the July 2016 workshop in a May 2, 2016 Co-Pilot press release,

On July 7, 2016, the CoPilot project partners will give an inside view of the many new functionalizations and applications of tailored nanoparticles in the workshop “The Nanoparticle Kitchen – particles and functions à la carte,” taking place in Wuerzburg, Germany. Join the Fraunhofer ISC’s lab tour of the “Nanoparticle Kitchen,” listen to presentations from research institutes and industry, and discuss your ideas with experts. Nanoparticles offer many options for today’s and tomorrow’s products.

More about program and registration soon on this [CoPilot] website!

I wonder if they’re considering this open access to nanoparticles and nanocomposites approach elsewhere?

Artificial intelligence used for wildlife protection

PAWS (Protection Assistant for Wildlife Security), an artificial intelligence (AI) program, has been tested in Uganda and Malaysia, according to an April 22, 2016 US National Science Foundation (NSF) news release (also on EurekAlert but dated April 21, 2016; Note: Links have been removed),

A century ago, more than 60,000 tigers roamed the wild. Today, the worldwide estimate has dwindled to around 3,200. Poaching is one of the main drivers of this precipitous drop. Whether killed for skins, medicine or trophy hunting, humans have pushed tigers to near-extinction. The same applies to other large animal species like elephants and rhinoceros that play unique and crucial roles in the ecosystems where they live.

Human patrols serve as the most direct form of protection of endangered animals, especially in large national parks. However, protection agencies have limited resources for patrols.

With support from the National Science Foundation (NSF) and the Army Research Office, researchers are using artificial intelligence (AI) and game theory to solve poaching, illegal logging and other problems worldwide, in collaboration with researchers and conservationists in the U.S., Singapore, Netherlands and Malaysia.

“In most parks, ranger patrols are poorly planned, reactive rather than pro-active, and habitual,” according to Fei Fang, a Ph.D. candidate in the computer science department at the University of Southern California (USC).

Fang is part of an NSF-funded team at USC led by Milind Tambe, professor of computer science and industrial and systems engineering and director of the Teamcore Research Group on Agents and Multiagent Systems.

Their research builds on the idea of “green security games” — the application of game theory to wildlife protection. Game theory uses mathematical and computer models of conflict and cooperation between rational decision-makers to predict the behavior of adversaries and plan optimal approaches for containment. The Coast Guard and Transportation Security Administration have used similar methods developed by Tambe and others to protect airports and waterways.
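The “security game” idea can be illustrated with a deliberately tiny example: a defender with one patrol resource splits coverage probability between two sites, and a rational poacher attacks wherever the expected payoff is highest, so the defender's best strategy equalizes the poacher's options. The site names, values, and grid-search solver below are invented for illustration; the actual research uses far richer Stackelberg game models:

```python
# Toy "green security game": one patrol resource split between two sites.
# The poacher observes the coverage probabilities and attacks the site
# with the highest expected payoff; the defender picks the split that
# minimizes that best response (a minimax strategy). All numbers invented.

SITE_VALUES = {"waterhole": 10.0, "ridge": 4.0}  # payoff to the poacher if uncaught

def attacker_payoff(coverage):
    """Poacher's best expected payoff given the defender's coverage probabilities."""
    return max(v * (1.0 - coverage[s]) for s, v in SITE_VALUES.items())

def best_coverage(steps=1000):
    """Grid-search the split of one patrol resource that minimizes the
    poacher's best response."""
    best = None
    for i in range(steps + 1):
        c = i / steps
        cov = {"waterhole": c, "ridge": 1.0 - c}
        payoff = attacker_payoff(cov)
        if best is None or payoff < best[1]:
            best = (cov, payoff)
    return best

coverage, payoff = best_coverage()
# The high-value waterhole gets most (but not all) of the coverage,
# leaving the poacher indifferent between the two sites.
print(round(coverage["waterhole"], 2), round(payoff, 2))
```

Notice that the defender does not simply guard the most valuable site all the time: doing so would make the cheap site a sure win for the poacher. Randomized, value-weighted coverage is the core insight the researchers scale up to real parks.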

“This research is a step in demonstrating that AI can have a really significant positive impact on society and allow us to assist humanity in solving some of the major challenges we face,” Tambe said.

PAWS puts the claws in anti-poaching

The team presented papers describing how they use their methods to improve the success of human patrols around the world at the AAAI Conference on Artificial Intelligence in February [2016].

The researchers first created an AI-driven application called PAWS (Protection Assistant for Wildlife Security) in 2013 and tested the application in Uganda and Malaysia in 2014. Pilot implementations of PAWS revealed some limitations, but also led to significant improvements.

Here’s a video describing the issues and PAWS,

For those who prefer to read about the details rather than listen, there’s more from the news release,

PAWS uses data on past patrols and evidence of poaching. As it receives more data, the system “learns” and improves its patrol planning. Already, the system has led to more observations of poacher activities per kilometer.

Its key technical advance lies in its ability to incorporate complex terrain information, including the topography of protected areas. That results in practical patrol routes that minimize elevation changes, saving time and energy. Moreover, the system can also take into account the natural transit paths that have the most animal traffic – and thus the most poaching – creating a “street map” for patrols.

“We need to provide actual patrol routes that can be practically followed,” Fang said. “These routes need to go back to a base camp and the patrols can’t be too long. We list all possible patrol routes and then determine which is most effective.”

The application also randomizes patrols to avoid falling into predictable patterns.

“If the poachers observe that patrols go to some areas more often than others, then the poachers place their snares elsewhere,” Fang said.
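The constraints and randomization Fang describes can be sketched as weighted random sampling over feasible routes: only routes short enough to return to base camp qualify, and the sampling weights follow animal traffic (where snares are likeliest). The route names, lengths, and weights below are invented; this is a sketch of the idea, not the PAWS algorithm itself:

```python
# Hypothetical sketch of randomized patrol planning: enumerate feasible
# routes (those that fit a daily length limit), weight them by animal
# traffic, and sample one at random so patrols stay unpredictable.
import random

ROUTES = {
    # route name: (length_km, animal_traffic_score) -- invented values
    "river_loop":   (8.0, 0.9),
    "ridge_trail":  (12.0, 0.5),
    "valley_sweep": (6.0, 0.7),
}
MAX_LENGTH_KM = 10.0  # a patrol must be able to return to base camp

def sample_patrol(rng=random):
    """Pick one feasible route at random, weighted by animal traffic."""
    feasible = {name: traffic for name, (length, traffic) in ROUTES.items()
                if length <= MAX_LENGTH_KM}  # ridge_trail is too long here
    names, weights = zip(*feasible.items())
    return rng.choices(names, weights=weights, k=1)[0]

random.seed(0)  # fixed seed only so the example is reproducible
week = [sample_patrol() for _ in range(7)]
print(week)
```

Over many days the mix of routes tracks the traffic weights, but a poacher watching any single day cannot predict where the patrol will go, which is precisely the property the randomization is meant to deliver.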

Since 2015, two non-governmental organizations, Panthera and Rimbat, have used PAWS to protect forests in Malaysia. The research won the Innovative Applications of Artificial Intelligence award for deployed application, as one of the best AI applications with measurable benefits.

The team recently combined PAWS with a new tool called CAPTURE (Comprehensive Anti-Poaching Tool with Temporal and Observation Uncertainty Reasoning) that predicts attacking probability even more accurately.

In addition to helping patrols find poachers, the tools may assist them with intercepting trafficked wildlife products and other high-risk cargo, adding another layer to wildlife protection. The researchers are in conversations with wildlife authorities in Uganda to deploy the system later this year. They will present their findings at the 15th International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2016) in May.

“There is an urgent need to protect the natural resources and wildlife on our beautiful planet, and we computer scientists can help in various ways,” Fang said. “Our work on PAWS addresses one facet of the problem, improving the efficiency of patrols to combat poaching.”

There is yet another potential use for PAWS, the prevention of illegal logging,

While Fang and her colleagues work to develop effective anti-poaching patrol planning systems, other members of the USC team are developing complementary methods to prevent illegal logging, a major economic and environmental problem for many developing countries.

The World Wildlife Fund estimates trade in illegally harvested timber to be worth between $30 billion and $100 billion annually. The practice also threatens ancient forests and critical habitats for wildlife.

Researchers at USC, the University of Texas at El Paso and Michigan State University recently partnered with the non-profit organization Alliance Vohoary Gasy to limit the illegal logging of rosewood and ebony trees in Madagascar, which has caused a loss of forest cover on the island nation.

Forest protection agencies also face limited budgets and must cover large areas, making sound investments in security resources critical.

The research team worked to determine the balance of security resources in which Madagascar should invest to maximize protection, and to figure out how to best deploy those resources.

Past work in game theory-based security typically involved specified teams — the security workers assigned to airport checkpoints, for example, or the air marshals deployed on flight tours. Finding optimal security solutions for those scenarios is difficult; a solution involving an open-ended team had not previously been feasible.

To solve this problem, the researchers developed a new method called SORT (Simultaneous Optimization of Resource Teams) that they have been experimentally validating using real data from Madagascar.

The research team created maps of the national parks, modeled the costs of all possible security resources using local salaries and budgets, and computed the best combination of resources given these conditions.
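The budget-constrained team selection behind SORT can be illustrated with a toy exhaustive search: enumerate subsets of security resources and keep the highest-value combination that fits the budget. The resource names, costs, and protection values below are invented for illustration; the real SORT algorithm handles much larger, structured search spaces than brute force could:

```python
# Toy sketch of optimizing a security resource team under a budget.
# Resource names, costs, and protection values are invented.
from itertools import combinations

# resource: (annual_cost, protection_value) -- hypothetical units
RESOURCES = {
    "ranger_team": (30, 50),
    "drone":       (20, 35),
    "checkpoint":  (15, 20),
    "informant":   (10, 15),
}
BUDGET = 50

def best_team():
    """Exhaustive search: the highest-value subset of resources within budget."""
    best, best_value = (), 0
    items = list(RESOURCES.items())
    for r in range(1, len(items) + 1):
        for combo in combinations(items, r):
            cost = sum(c for _, (c, _) in combo)
            value = sum(v for _, (_, v) in combo)
            if cost <= BUDGET and value > best_value:
                best, best_value = tuple(name for name, _ in combo), value
    return best, best_value

team, value = best_team()
print(team, value)  # → ('ranger_team', 'drone') 85
```

With four resources, brute force is instant; with realistic team sizes the number of subsets explodes, which is why SORT's contribution is an efficient optimization over open-ended teams rather than the enumeration shown here.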

“We compared the value of using an optimal team determined by our algorithm versus a randomly chosen team and the algorithm did significantly better,” said Sara Mc Carthy, a Ph.D. student in computer science at USC.

The algorithm is simple and fast, and can be generalized to other national parks with different characteristics. The team is working to deploy it in Madagascar in association with the Alliance Vohoary Gasy.

“I am very proud of what my PhD students Fei Fang and Sara Mc Carthy have accomplished in this research on AI for wildlife security and forest protection,” said Tambe, the team lead. “Interdisciplinary collaboration with practitioners in the field was key in this research and allowed us to improve our research in artificial intelligence.”

Moreover, the project shows other computer science researchers the potential impact of applying their research to the world’s problems.

“This work is not only important because of the direct beneficial impact it has on the environment, protecting wildlife and forests, but also because of the way it can inspire others to dedicate their efforts to making the world a better place,” Mc Carthy said.

The curious can find out more about Panthera here and about Alliance Vohoary Gasy here (be prepared to use your French language skills). Unfortunately, I could not find more information about Rimbat.