Tag Archives: Switzerland

Deep learning and some history from the Swiss National Science Foundation (SNSF)

A June 27, 2016 news item on phys.org provides a measured analysis of deep learning and its current state of development (from a Swiss perspective),

In March 2016, the world Go champion Lee Sedol lost 1-4 against the artificial intelligence AlphaGo. For many, this was yet another defeat for humanity at the hands of the machines. Indeed, the success of the AlphaGo software was forged in an area of artificial intelligence that has seen huge progress over the last decade. Deep learning, as it’s called, uses artificial neural networks to carry out its calculations; the software architecture mimics biological neural networks.

Much of the progress in deep learning is thanks to the work of Jürgen Schmidhuber, director of the IDSIA (Istituto Dalle Molle di Studi sull’Intelligenza Artificiale) which is located in the suburbs of Lugano. The IDSIA doctoral student Shane Legg and a group of former colleagues went on to found DeepMind, the startup acquired by Google in early 2014 for USD 500 million. The DeepMind algorithms eventually wound up in AlphaGo.

“Schmidhuber is one of the best at deep learning,” says Boi Faltings of the EPFL Artificial Intelligence Lab. “He never let go of the need to keep working at it.” According to Stéphane Marchand-Maillet of the University of Geneva computing department, “he’s been in the race since the very beginning.”

A June 27, 2016 SNSF news release (first published as a story in Horizons no. 109 June 2016) by Fabien Goubet, which originated the news item, goes on to provide a brief history,

The real strength of deep learning is structural recognition, and winning at Go is just an illustration of this, albeit a rather resounding one. Elsewhere, and for some years now, we have seen it applied to an entire spectrum of areas, such as visual and vocal recognition, online translation tools and smartphone personal assistants. One underlying principle of machine learning is that algorithms must first be trained using copious examples. Naturally, this has been helped by the deluge of user-generated content spawned by smartphones and web 2.0, stretching from Facebook photo comments to official translations published on the Internet. By feeding a machine thousands of accurately tagged images of cats, for example, it learns first to recognise those cats and later any image of a cat, including those it hasn’t been fed.

Deep learning isn’t new; it just needed modern computers to come of age. As far back as the early 1950s, biologists tried to lay out formal principles to explain the working of the brain’s cells. In 1957, the psychologist Frank Rosenblatt of the Cornell Aeronautical Laboratory published a numerical model based on these concepts, thereby creating the very first artificial neural network. Once integrated into a calculator, it learned to recognise rudimentary images.

“This network only contained eight neurones organised in a single layer. It could only recognise simple characters”, says Claude Touzet of the Adaptive and Integrative Neuroscience Laboratory of Aix-Marseille University. “It wasn’t until 1985 that we saw the second generation of artificial neural networks featuring multiple layers and much greater performance”. This breakthrough was made simultaneously by three researchers: Yann LeCun in Paris, Geoffrey Hinton in Toronto and Terrence Sejnowski in Baltimore.

Byte-size learning

In multilayer networks, each layer learns to recognise the precise visual characteristics of a shape. The deeper the layer, the more abstract the characteristics. With cat photos, the first layer analyses pixel colour, and the following layer recognises the general form of the cat. This layered design can support calculations across many layers, even thousands, and it is this depth of architecture that gave rise to the name ‘deep learning’.

Marchand-Maillet explains: “Each artificial neurone is assigned an input value, which it computes using a mathematical function, only firing if the output exceeds a pre-defined threshold”. In this way, it reproduces the behaviour of real neurones, which only fire and transmit information when the input signal (the potential difference across the entire neural circuit) reaches a certain level. In the artificial model, the results of a single layer are weighted, added up and then sent as the input signal to the following layer, which processes that input using different functions, and so on and so forth.
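To make the weighted-sum-and-threshold idea concrete, here is a minimal sketch in Python/NumPy. It is purely illustrative: the layer sizes, random weights and hard threshold are assumptions for demonstration, not the architecture of AlphaGo or any system mentioned above.

```python
# Illustrative sketch of the mechanism described above: each artificial neurone
# weights its inputs, sums them, and "fires" only if the result exceeds a threshold;
# one layer's outputs become the next layer's inputs.
import numpy as np

def layer(inputs, weights, threshold=0.0):
    """One layer: a weighted sum per neurone, then a fire/don't-fire threshold."""
    summed = weights @ inputs                      # weighted and added up
    return (summed > threshold).astype(float)      # fires only above the threshold

rng = np.random.default_rng(0)
x = rng.random(4)                 # e.g. four pixel values from a photo
w1 = rng.normal(size=(8, 4))      # first layer: 8 neurones, each looking at 4 inputs
w2 = rng.normal(size=(3, 8))      # second layer processes the first layer's output

hidden = layer(x, w1)
output = layer(hidden, w2)
print(output)
```

Real networks replace the hard threshold with smooth activation functions and learn their weights from data rather than drawing them at random, but the layer-by-layer flow of weighted signals is the same.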

For example, if a system is trained with great quantities of photos of apples and watermelons, it will progressively learn to distinguish them on the basis of diameter, says Marchand-Maillet. If it cannot decide (e.g., when processing a picture of a tiny watermelon), the subsequent layers take over by analysing the colours or textures of the fruit in the photo, and so on. In this way, every step in the process further refines the assessment.

Video games to the rescue

For decades, limited computing power held back more complex applications, even at the cutting edge. Industry walked away, and deep learning survived only thanks to the video games sector, which eventually began producing graphics chips, or GPUs, with unprecedented power at accessible prices: up to 6 teraflops (i.e., 6 trillion calculations per second) for a few hundred dollars. “There’s no doubt that it was this calculating power that laid the ground for the quantum leap in deep learning”, says Touzet. GPUs are also very good at parallel calculations, a useful function for executing the innumerable simultaneous operations required by neural networks.

Although image analysis is getting great results, things are more complicated for sequential data objects such as natural spoken language and video footage. This has formed part of Schmidhuber’s work since 1989, and his response has been to develop recurrent neural networks in which neurones communicate with each other in loops, feeding processed data back into the initial layers.

Such sequential data analysis is highly dependent on context and precursory data. In Lugano, networks have been instructed to memorise the order of a chain of events. Long Short Term Memory (LSTM) networks can distinguish ‘boat’ from ‘float’ by recalling the sound that preceded ‘oat’ (i.e., either ‘b’ or ‘fl’). “Recurrent neural networks are more powerful than other approaches such as the Hidden Markov models”, says Schmidhuber, who also notes that Google Voice integrated LSTMs in 2015. “With looped networks, the number of layers is potentially infinite”, says Faltings [?].
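As a rough illustration of the ‘looped’ idea, the sketch below runs a bare-bones recurrent update over a sequence: the hidden state carries information about earlier inputs forward, which is how a network can remember whether ‘b’ or ‘fl’ preceded ‘oat’. This is a simplified vanilla recurrence, not Schmidhuber’s actual LSTM, which adds gating to preserve memories over much longer spans; all sizes and weights here are arbitrary placeholders.

```python
# Toy recurrent update (not a real LSTM): the hidden state h is fed back in at
# every step, so the final state depends on the whole sequence, not just the
# last input.
import numpy as np

rng = np.random.default_rng(1)
W_in = rng.normal(size=(16, 8))    # weights applied to the current input
W_rec = rng.normal(size=(16, 16))  # weights applied to the fed-back state

def run(sequence):
    h = np.zeros(16)                           # hidden state: the network's "memory"
    for x in sequence:                         # x encodes one sound or character
        h = np.tanh(W_in @ x + W_rec @ h)      # new state mixes input and past state
    return h

boat = [rng.random(8) for _ in range(3)]       # stand-in for the sounds 'b', 'oa', 't'
float_ = [rng.random(8) for _ in range(4)]     # stand-in for 'f', 'l', 'oa', 't'
print(run(boat)[:3], run(float_)[:3])          # final states differ because the histories differ
```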

For Schmidhuber, deep learning is just one aspect of artificial intelligence; the real thing will lead to “the most important change in the history of our civilisation”. But Marchand-Maillet sees deep learning as “a bit of hype, leading us to believe that artificial intelligence can learn anything provided there’s data. But it’s still an open question as to whether deep learning can really be applied to every last domain”.

It’s nice to get an historical perspective and eye-opening to realize that scientists have been working on these concepts since the 1950s.

Cleaning up nuclear waste gases with nanotechnology-enabled materials

Swiss and US scientists have developed a nanoporous crystal that could be used to clean up nuclear waste gases according to a June 13, 2016 news item on Nanowerk (Note: A link has been removed),

An international team of scientists at EPFL [École polytechnique fédérale de Lausanne in Switzerland] and in the US has discovered a material that can clear out radioactive waste from nuclear plants more efficiently, cheaply, and safely than current methods.

Nuclear energy is one of the cheapest alternatives to carbon-based fossil fuels. But nuclear-fuel reprocessing plants generate waste gas that is currently too expensive and dangerous to deal with. Scanning hundreds of thousands of materials, scientists led by EPFL and their US colleagues have now discovered a material that can absorb nuclear waste gases much more efficiently, cheaply and safely. The work is published in Nature Communications (“Metal–organic framework with optimally selective xenon adsorption and separation”).

A June 14, 2016 EPFL press release (also on EurekAlert), which originated the news item, explains further,

Nuclear-fuel reprocessing plants generate volatile radionuclides such as xenon and krypton, which escape in the so-called “off-gas” of these facilities – the gases emitted as byproducts of the chemical process. Current ways of capturing and clearing out these gases involve distillation at very low temperatures, which is expensive in terms of both energy and capital costs, and poses a risk of explosion.

Scientists led by Berend Smit’s lab at EPFL (Sion) and colleagues in the US have now identified a material that can be used as an efficient, cheaper, and safer alternative to separate xenon and krypton – and at room temperature. The material, abbreviated as SBMOF-1, is a nanoporous crystal and belongs to a class of materials that are currently used to clear out CO2 emissions and other dangerous pollutants. These materials are also very versatile, and scientists can tweak them to self-assemble into ordered, pre-determined crystal structures. In this way, they can synthesize millions of tailor-made materials that can be optimized for gas separation and storage, catalysis, chemical sensing and optics.

The scientists carried out high-throughput screening of large material databases covering over 125,000 candidates. To do this, they used molecular simulations to find structures that can separate xenon and krypton under conditions that match those involved in reprocessing nuclear waste.

Because xenon has a much shorter half-life than krypton – a month versus a decade – the scientists had to find a material that would be selective for both but would capture them separately. As xenon is used in commercial lighting, propulsion, imaging, anesthesia and insulation, it can also be sold back into the chemical market to offset costs.

The scientists identified and confirmed that SBMOF-1 shows remarkable xenon capturing capacity and xenon/krypton selectivity under nuclear-plant conditions and at room temperature.

The US partners have also made an announcement with this June 13, 2016 Pacific Northwest National Laboratory (PNNL) news release (also on EurekAlert), Note: It is a little repetitive but there’s good additional information,

Researchers are investigating a new material that might help in nuclear fuel recycling and waste reduction by capturing certain gases released during reprocessing. Conventional technologies to remove these radioactive gases operate at extremely low, energy-intensive temperatures. Because it works at ambient temperature, the new material has the potential to save energy and make reprocessing cleaner and less expensive. The reclaimed materials can also be reused commercially.

Appearing in Nature Communications, the work is a collaboration between experimentalists and computer modelers exploring the characteristics of materials known as metal-organic frameworks.

“This is a great example of computer-inspired material discovery,” said materials scientist Praveen Thallapally of the Department of Energy’s Pacific Northwest National Laboratory. “Usually the experimental results are more realistic than computational ones. This time, the computer modeling showed us something the experiments weren’t telling us.”

Waste avoidance

Recycling nuclear fuel can reuse uranium and plutonium — the majority of the used fuel — that would otherwise be destined for waste. Researchers are exploring technologies that enable safe, efficient, and reliable recycling of nuclear fuel for use in the future.

A multi-institutional, international collaboration is studying materials to replace costly, inefficient recycling steps. One important step is collecting radioactive gases xenon and krypton, which arise during reprocessing. To capture xenon and krypton, conventional technologies use cryogenic methods in which entire gas streams are brought to a temperature far below where water freezes — such methods are energy intensive and expensive.

Thallapally, working with Maciej Haranczyk and Berend Smit of Lawrence Berkeley National Laboratory [LBNL] and others, has been studying materials called metal-organic frameworks, also known as MOFs, that could potentially trap xenon and krypton without having to use cryogenics.

These materials have tiny pores inside, so small that often only a single molecule can fit inside each pore. When one gas species has a higher affinity for the pore walls than other gas species, metal-organic frameworks can be used to separate gaseous mixtures by selectively adsorbing that species.

To find the best MOF for xenon and krypton separation, computational chemists led by Haranczyk and Smit screened 125,000 possible MOFs for their ability to trap the gases. Although these gases can come in radioactive varieties, they are part of a group of chemically inert elements called “noble gases.” The team used computing resources at NERSC, the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility at LBNL.

“Identifying the optimal material for a given process, out of thousands of possible structures, is a challenge due to the sheer number of materials. Given that the characterization of each material can take up to a few hours of simulations, the entire screening process may fill a supercomputer for weeks,” said Haranczyk. “Instead, we developed an approach to assess the performance of materials based on their easily computable characteristics. In this case, seven different characteristics were necessary for predicting how the materials behaved, and our team’s grad student Cory Simon’s application of machine learning techniques greatly sped up the material discovery process by eliminating those that didn’t meet the criteria.”
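The passage above describes the general recipe: compute cheap descriptors for every candidate, use a learned model to predict performance, and reserve the expensive simulations for the best-ranked structures. The sketch below shows that workflow in outline only; the descriptor count (seven), the random-forest surrogate and the shortlist size are illustrative assumptions, not the actual pipeline from the Nature Communications paper.

```python
# Hypothetical screening sketch: a surrogate model trained on a small set of fully
# simulated materials is used to rank the whole 125,000-structure library, so that
# costly molecular simulations are only run on a short list of promising candidates.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
descriptors = rng.random((125_000, 7))            # placeholder: 7 cheap characteristics per MOF

train_idx = rng.choice(125_000, size=500, replace=False)
simulated_selectivity = rng.random(500)           # placeholder for simulated Xe/Kr selectivity

surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
surrogate.fit(descriptors[train_idx], simulated_selectivity)

predicted = surrogate.predict(descriptors)        # cheap prediction for every candidate
shortlist = np.argsort(predicted)[-200:]          # only these go on to full simulation
print(f"{len(shortlist)} candidates shortlisted out of {descriptors.shape[0]}")
```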

The team’s models identified the MOF that trapped xenon most selectively and had a pore size close to the size of a xenon atom — SBMOF-1, which they then tested in the lab at PNNL.

After optimizing the preparation of SBMOF-1, Thallapally and his team at PNNL tested the material by running a mixture of gases through it — including a non-radioactive form of xenon and krypton — and measuring what came out the other end. Oxygen, helium, nitrogen, krypton, and carbon dioxide all beat xenon out. This indicated that xenon becomes trapped within SBMOF-1’s pores until the gas saturates the material.

Other tests also showed that in the absence of xenon, SBMOF-1 captures krypton. During actual separations, then, operators would pass the gas streams through SBMOF-1 twice to capture both gases.

The team also tested SBMOF-1’s ability to hang onto xenon in conditions of high humidity. Humidity interferes with cryogenics, and gases must be dehydrated before putting them through the ultra-cold method, another time-consuming expense. SBMOF-1, however, performed quite admirably, retaining more than 85 percent of the amount of xenon in high humidity as it did in dry conditions.

The final step in collecting xenon or krypton gas would be to put the MOF material under a vacuum, which sucks the gas out of the molecular cages for safe storage. A last laboratory test examined how stable the material was by repeatedly filling it up with xenon gas and then vacuuming out the xenon. After 10 cycles of this, SBMOF-1 collected just as much xenon as the first cycle, indicating a high degree of stability for long-term use.

Thallapally attributes this stability to the manner in which SBMOF-1 interacts with xenon. Rather than chemical reactions between the molecular cages and the gases, the relationship is purely physical. The material can last a lot longer without constantly going through chemical reactions, he said.

A model finding

Although the researchers showed that SBMOF-1 is a good candidate for nuclear fuel reprocessing, getting these results wasn’t smooth sailing. In the lab, the researchers had followed a previously worked out protocol from Stony Brook University to prepare SBMOF-1. Part of that protocol requires them to “activate” SBMOF-1 by heating it up to 300 degrees Celsius, three times the temperature of boiling water.

Activation cleans out material left in the pores from MOF synthesis. Laboratory tests of the activated SBMOF-1, however, showed the material didn’t behave as well as it should, based on the computer modeling results.

The researchers at PNNL repeated the lab experiments. This time, however, they activated SBMOF-1 at a lower temperature, 100 degrees Celsius, or the actual temperature of boiling water. Subjecting the material to the same lab tests, the researchers found SBMOF-1 behaving as expected, and better than at the higher activation temperature.

But why? To figure out where the discrepancy came from, the researchers modeled what happened to SBMOF-1 at 300 degrees Celsius. Unexpectedly, the pores squeezed in on themselves.

“When we heated the crystal that high, atoms within the pore tilted and partially blocked the pores,” said Thallapally. “The xenon doesn’t fit.”

Armed with these new computational and experimental insights, the researchers can explore SBMOF-1 and other MOFs further for nuclear fuel recycling. These MOFs might also be able to capture other noble gases such as radon, a gas known to pool in some basements.

Researchers hailed from several other institutions as well as those listed earlier, including University of California, Berkeley, Ecole Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, Brookhaven National Laboratory, and IMDEA Materials Institute in Spain. This work was supported by the [US] Department of Energy Offices of Nuclear Energy and Science.

Here’s an image the researchers have provided to illustrate their work,

Caption: The crystal structure of SBMOF-1 (green = Ca, yellow = S, red = O, gray = C, white = H). The light blue surface is a visualization of the one-dimensional channel that SBMOF-1 creates for the gas molecules to move through. The darker blue surface illustrates where a Xe atom sits in the pores of SBMOF-1 when it adsorbs. Credit: Berend Smit/EPFL/University of California Berkeley

Here’s a link to and a citation for the paper,

Metal–organic framework with optimally selective xenon adsorption and separation by Debasis Banerjee, Cory M. Simon, Anna M. Plonka, Radha K. Motkuri, Jian Liu, Xianyin Chen, Berend Smit, John B. Parise, Maciej Haranczyk, & Praveen K. Thallapally. Nature Communications 7, Article number: 11831 doi:10.1038/ncomms11831 Published 13 June 2016

This paper is open access.

Final comment: this is the second time in the past month I’ve stumbled across more positive approaches to nuclear energy. The first time was a talk (Why Nuclear Power is Necessary) held in Vancouver, Canada in May 2016 (details here). I’m not trying to suggest anything unduly sinister, but it is interesting since, for most of my adult life, nuclear power has been viewed with fear and suspicion.

More from PETA (People for the Ethical Treatment of Animals) about nanomaterials and lungs

Science progresses by increments. First, there was an April 27, 2016 post featuring some recent work on nanomaterials and lungs by the organization People for the Ethical Treatment of Animals (PETA). Now, approximately one month later, PETA announces a new paper on the topic according to a May 26, 2016 news item on phys.org,

A scientist from the PETA International Science Consortium Ltd. is the lead author of a review on pulmonary fibrosis that results from inhaling nanomaterials, which has been published in Archives of Toxicology. The coauthors are scientists from Health Canada, West Virginia University, and the University of Fribourg in Switzerland.

A May 26, 2016 PETA news release on EurekAlert, which originated the news item, provides more detail (Note: Links have been removed),

The increasing use of nanomaterials in consumer goods such as paint, building materials, and food products has increased the likelihood of human exposure. Inhalation is one of the most prominent routes by which exposure can occur, and because inhalation of nanomaterials may be linked to lung problems such as pulmonary fibrosis, testing is conducted to assess the safety of these materials.

The review is one part of the proceedings of a 2015 workshop [mentioned in my Sept. 3, 2015 posting] organized by the PETA International Science Consortium, at which scientists discussed recommendations for designing an in vitro approach to assessing the toxicity of nanomaterials in the human lung. The workshop also produced another report that was recently published in Archives of Toxicology (Clippinger et al. 2016) and a review published in Particle and Fibre Toxicology (Polk et al. 2016) [mentioned in my April 27, 2016 posting] on exposing nanomaterials to cells grown in vitro.

The expert recommendations proposed at the workshop are currently being used to develop an in vitro system to predict the development of lung fibrosis in humans, which is being funded by the Science Consortium.

“International experts who took part in last year’s workshop have advanced the understanding and application of non-animal methods of studying nanomaterial effects in the lung,” says Dr. Monita Sharma, nanotoxicology specialist at the Consortium and lead author of the review in Archives of Toxicology. “Good science is leading the way toward more humane testing of nanomaterials, which, in turn, will lead to better protection of human health.”

Here’s a link to and a citation for the paper,

Predicting pulmonary fibrosis in humans after exposure to multi-walled carbon nanotubes (MWCNTs) by Monita Sharma, Jake Nikota, Sabina Halappanavar, Vincent Castranova, Barbara Rothen-Rutishauser, Amy J. Clippinger. Archives of Toxicology pp 1-18 DOI: 10.1007/s00204-016-1742-7 First online: 23 May 2016

This paper is behind a paywall.

Scented video games: a nanotechnology project in Europe

Ten years ago, when I was working on a master’s degree (creative writing and new media), I was part of a group presentation on multimedia and, to prepare, I started a conversation about scent as part of a multimedia experience. Our group leader was somewhat outraged. He’d led international multimedia projects and, as far as he was concerned, the ‘scent’ discussion was a waste of time when we were trying to prepare a major presentation.

He was right and wrong. I think you’re supposed to have these discussions when you’re learning and exploring ideas but, in 2006, there wasn’t much work of that type to discuss. It seems things may be changing according to a May 21, 2016 news item on Nanowerk (Note: A link has been removed),

Controlled odour emission could transform video games and television viewing experiences and benefit industries such as pest control and medicine [emphasis mine]. The NANOSMELL project aims to switch smells on and off by tagging artificial odorants with nanoparticles exposed to electromagnetic field.

I wonder if the medicinal possibilities include nanotechnology-enabled aroma therapy?

Getting back to the news, a May 10, 2016 European Commission press release, which originated the news item, expands on the theme,

The ‘smellyvision’ – a TV that offers olfactory as well as visual stimulation – has been a science fiction staple for years. However, realising this concept has proved difficult given the sheer complexity of how smell works and the technical challenges of emitting odours on demand.

NANOSMELL will specifically address these two challenges by developing artificial smells that can be switched on and off remotely. This would be achieved by tagging specific DNA-based artificial odorants – chemical compounds that give off smells – with nanoparticles that respond to external electromagnetic fields.

With the ability to remotely control these artificial odours, the project team would then be able to examine exactly how olfactory receptors respond. Sensory imaging to investigate the patterns of neural activity and behavioural tests will be carried out in animals.

The project would next apply artificial odorants to the human olfactory system and measure perceptions by switching artificial smells on and off. Researchers will also assess whether artificial odorants have a role to play in wound healing by placing olfactory receptors in skin.

The researchers aim to develop controllable odour-emitting components that will further understanding of smell and open the door to novel odour-emitting applications in fields ranging from entertainment to medicine.

Project details

  • Project acronym: NanoSmell
  • Participants: Israel (Coordinator), Spain, Germany, Switzerland
  • Project Reference N° 662629
  • Total cost: € 3 979 069
  • EU contribution: € 3 979 069
  • Duration: September 2015 – September 2019

You can find more information on the European Commission’s NANOSMELL project page.

Frankenstein and Switzerland in 2016

The Frankenstein Bicentennial celebration is in process as various events and projects are now being launched. In a Nov. 12, 2015 posting I made mention of the Frankenstein Bicentennial Project 1818-2018 at Arizona State University (ASU; scroll down about 15% of the way),

… the Transmedia Museum (Frankenstein Bicentennial Project 1818-2018).  This project is being hosted by Arizona State University. From the project homepage,

No work of literature has done more to shape the way people imagine science and its moral consequences than Frankenstein; or The Modern Prometheus, Mary Shelley’s enduring tale of creation and responsibility. The novel’s themes and tropes—such as the complex dynamic between creator and creation—continue to resonate with contemporary audiences. Frankenstein continues to influence the way we confront emerging technologies, conceptualize the process of scientific research, imagine the motivations and ethical struggles of scientists, and weigh the benefits of innovation against its unforeseen pitfalls.

The Frankenstein Bicentennial Project will infuse science and engineering endeavors with considerations of ethics. It will use the power of storytelling and art to shape processes of innovation and empower public appraisal of techno-scientific research and creation. It will offer humanists and artists a new set of concerns around research, public policy, and the ramifications of exploration and invention. And it will inspire new scientific and technological advances inspired by Shelley’s exploration of our inspiring and terrifying ability to bring new life into the world. Frankenstein represents a landmark fusion of science, ethics, and literary expression.

The bicentennial provides an opportunity for vivid reflection on how science is culturally framed and understood by the public, as well as our ethical limitations and responsibility for nurturing the products of our creativity. It is also a moment to unveil new scientific and technological marvels, especially in the areas of synthetic biology and artificial intelligence. Engaging with Frankenstein allows scholars and educators, artists and writers, and the public at large to consider the history of scientific invention, reflect on contemporary research, and question the future of our technological society. Acting as a network hub for the bicentennial celebration, ASU will encourage and coordinate collaboration across institutions and among diverse groups worldwide.

2016 Frankenstein events

Now, there’s an exhibition in Switzerland where Frankenstein was ‘born’ according to a May 12, 2016 news item on phys.org,

Frankenstein, the story of a scientist who brings to life a cadaver and causes his own downfall, has for two centuries given voice to anxiety surrounding the unrelenting advance of science.

To mark the 200 years since England’s Mary Shelley first imagined the ultimate horror story during a visit to a frigid, rain-drenched Switzerland, an exhibit opens in Geneva Friday called “Frankenstein, Creation of Darkness”.

In the dimly-lit, expansive basement at the Martin Bodmer Foundation, a long row of glass cases holds 15 hand-written, yellowed pages from a notebook where Shelley in 1816 wrote the first version of what is considered a masterpiece of romantic literature.

The idea for her “miserable monster” came when, at just 18, she and her future husband, English poet Percy Bysshe Shelley, went to a summer home—the Villa Diodati—rented by literary great Lord Byron on the outskirts of Geneva.

The current private owners of the picturesque manor overlooking Lake Geneva will also open their lush gardens to guided tours during the nearby exhibit which runs to October 9 [May 13 – Oct. 9, 2016].

While the spot today is lovely, with pink and purple lilacs spilling from the terraces and gravel walkways winding through rose-covered arches, in the summer of 1816 the atmosphere was more somber.

A massive eruption from the Tambora volcano in Indonesia wreaked havoc with the global climate that year, and a weather report for Geneva in June on display at the exhibit mentions “not a single leaf” had yet appeared on the oak trees.

To pass the time, poet Lord Byron challenged the band of literary bohemians gathered at the villa to each invent a ghost story, resulting in several famous pieces of writing.

English doctor and author John Polidori came up with the idea for “The Vampyre”, which was published three years later and is considered to have pioneered the romantic vampyre genre, including works like Bram Stoker’s “Dracula”.

That book figures among a multitude of first editions at the Geneva exhibit, including three of Mary Shelley’s “Frankenstein, or the Modern Prometheus”—the most famous story to emerge from the competition.

Here’s a description of the exhibit, from the Martin Bodmer Foundation’s Frankenstein webpage,

To celebrate the 200th anniversary of the writing of this historically influential work of literature, the Martin Bodmer Foundation presents a major exhibition on the origins of Frankenstein, the perspectives it opens and the questions it raises.

A best seller since its first publication in 1818, Mary Shelley’s novel continues to demand attention. The questions it raises remain at the heart of literary and philosophical concerns: the ethics of science, climate change, the technologisation of the human body, the unconscious, human otherness, the plight of the homeless and the dispossessed.

The exposition Frankenstein: Creation of Darkness recreates the beginnings of the novel in its first manuscript and printed forms, along with paintings and engravings that evoke the world of 1816. A variety of literary and scientific works are presented as sources of the novel’s ideas. While exploring the novel’s origins, the exhibition also evokes the social and scientific themes of the novel that remain important in our own day.

For what it’s worth, I have come across analyses which suggest science and technology may not have been the primary concern at the time. There are interpretations which suggest issues around childbirth (very dangerous until modern times) and fear of disfigurement and disfigured individuals. What makes Frankenstein, both the creature and the book, so fascinating is how flexible the interpretations can be. (For more about Frankenstein and flexibility, read Susan Tyler Hitchcock’s 2009 book, Frankenstein: a cultural history.)

There’s one more upcoming Frankenstein event, from The Frankenstein Bicentennial announcement webpage,

On June 14 and 15, 2016, the Brocher Foundation, Arizona State University, Duke University, and the University of Lausanne will host “Frankenstein’s Shadow,” a symposium in Geneva, Switzerland to commemorate the origin of Frankenstein and assess its influence in different times and cultures, particularly its resonance in debates about public policy governing biotechnology and medicine. These dates place the symposium almost exactly 200 years after Mary Shelley initially conceived the idea for Frankenstein on June 16, 1816, and in almost exactly the same geographical location on the shores of Lake Geneva.

If you’re interested in details such as the programme schedule, there’s this PDF,

Frankenstein’s Shadow Conference

Enjoy!

Measuring the van der Waals forces between individual atoms for the first time

A May 13, 2016 news item on Nanowerk heralds the first measurement of the van der Waals forces between individual atoms,

Physicists at the Swiss Nanoscience Institute and the University of Basel have succeeded in measuring the very weak van der Waals forces between individual atoms for the first time. To do this, they fixed individual noble gas atoms within a molecular network and determined the interactions with a single xenon atom that they had positioned at the tip of an atomic force microscope. As expected, the forces varied according to the distance between the two atoms; but, in some cases, the forces were several times larger than theoretically calculated.

A May 13, 2016 University of Basel press release (also on EurekAlert), which originated the news item, provides an explanation of van der Waals forces (the most comprehensive I’ve seen) and technical details about how the research was conducted,

Van der Waals forces act between non-polar atoms and molecules. Although they are very weak in comparison to chemical bonds, they are hugely significant in nature. They play an important role in all processes relating to cohesion, adhesion, friction or condensation and are, for example, essential for a gecko’s climbing skills.

Van der Waals interactions arise due to a temporary redistribution of electrons in the atoms and molecules. This results in the occasional formation of dipoles, which in turn induce a redistribution of electrons in closely neighboring molecules. Due to the formation of dipoles, the two molecules experience a mutual attraction, which is referred to as a van der Waals interaction. This only exists temporarily but is repeatedly re-formed. The individual forces are the weakest binding forces that exist in nature, but they add up to reach magnitudes that we can perceive very clearly on the macroscopic scale – as in the example of the gecko.
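For readers who want the textbook form of this attraction: at large separations the London dispersion energy between two atoms falls off as the sixth power of their distance, and the corresponding force even faster. The snippet below simply evaluates that standard long-range form; the C6 coefficient is an arbitrary placeholder rather than a value from this experiment, which also probes distances where the simple formula no longer holds.

```python
# Textbook long-range van der Waals (London dispersion) interaction, E(r) = -C6/r^6.
# C6 below is a rough placeholder for a pair of heavy noble-gas atoms, not a fitted value.
import numpy as np

C6 = 1.0e-77                      # J * m^6 (placeholder order of magnitude)

def vdw_energy(r):
    return -C6 / r**6             # attractive, vanishing rapidly with distance

def vdw_force(r):
    return -6 * C6 / r**7         # F = -dE/dr; the negative sign means attraction

for r in np.linspace(4e-10, 8e-10, 5):            # separations of 4 to 8 angstroms
    print(f"r = {r*1e10:4.1f} A   F = {vdw_force(r):+.2e} N")
```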

Fixed within the nano-beaker

To measure the van der Waals forces, scientists in Basel used a low-temperature atomic force microscope with a single xenon atom on the tip. They then fixed the individual argon, krypton and xenon atoms in a molecular network. This network, which is self-organizing under certain experimental conditions, contains so-called nano-beakers of copper atoms in which the noble gas atoms are held in place like a bird egg. Only with this experimental set-up is it possible to measure the tiny forces between microscope tip and noble gas atom, as a pure metal surface would allow the noble gas atoms to slide around.

Compared with theory

The researchers compared the measured forces with calculated values and displayed them graphically. As expected from the theoretical calculations, the measured forces fell dramatically as the distance between the atoms increased. While there was good agreement between measured and calculated curve shapes for all of the noble gases analyzed, the absolute measured forces were larger than had been expected from calculations according to the standard model. Above all for xenon, the measured forces were larger than the calculated values by a factor of up to two.

The scientists are working on the assumption that, even in the noble gases, charge transfer occurs and therefore weak covalent bonds are occasionally formed, which would explain the higher values.

Here’s a link to and a citation for the paper,

Van der Waals interactions and the limits of isolated atom models at interfaces by Shigeki Kawai, Adam S. Foster, Torbjörn Björkman, Sylwia Nowakowska, Jonas Björk, Filippo Federici Canova, Lutz H. Gade, Thomas A. Jung, & Ernst Meyer. Nature Communications 7, Article number: 11559  doi:10.1038/ncomms11559 Published 13 May 2016

This is an open access paper.

New model to track flow of nanomaterials through our air, earth, and water

Just how many tons of nanoparticles are making their way through the environment? Scientists at the Swiss Federal Laboratories for Materials Science and Technology (Empa) have devised a new model which could help answer that question. From a May 12, 2016 news item on phys.org,

Carbon nanotubes remain attached to materials for years while titanium dioxide and nanozinc are rapidly washed out of cosmetics and accumulate in the ground. Within the National Research Program “Opportunities and Risks of Nanomaterials” (NRP 64) a team led by Empa scientist Bernd Nowack has developed a new model to track the flow of the most important nanomaterials in the environment.

A May 12, 2016 Empa press release by Michael Hagmann, which also originated the news item, provides more detail such as an estimated tonnage for titanium dioxide nanoparticles produced annually in Europe,

How many man-made nanoparticles make their way into the air, earth or water? In order to assess these amounts, a group of researchers led by Bernd Nowack from Empa, the Swiss Federal Laboratories for Materials Science and Technology, has developed a computer model as part of the National Research Program “Opportunities and Risks of Nanomaterials” (NRP 64). “Our estimates offer the best available data at present about the environmental accumulation of nanosilver, nanozinc, nano-titanium dioxide and carbon nanotubes”, says Nowack.

In contrast to the static calculations hitherto in use, their new, dynamic model does not just take into account the significant growth in the production and use of nanomaterials, but also makes provision for the fact that different nanomaterials are used in different applications. For example, nanozinc and nano-titanium dioxide are found primarily in cosmetics. Roughly half of these nanoparticles find their way into our waste water within the space of a year, and from there they enter into sewage sludge. Carbon nanotubes, however, are integrated into composite materials in which they are immobilized, and are thus found for example in tennis racquets and bicycle frames. It can take over ten years before they are released, when these products end up in waste incineration or are recycled.

39,000 metric tons of nanoparticles

The researchers involved in this study come from Empa, ETH Zurich and the University of Zurich. They use an estimated annual production of nano-titanium dioxide across Europe of 39,000 metric tons – considerably more than the total for all other nanomaterials. Their model calculates how much of this enters the atmosphere, surface waters, sediments and the earth, and accumulates there. In the EU, the use of sewage sludge as fertilizer (a practice forbidden in Switzerland) means that nano-titanium dioxide today reaches an average concentration of 61 micrograms per kilo in affected soils.
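To give a feel for what such a material-flow calculation involves, here is a deliberately simplified, non-probabilistic sketch: an annual production figure is split among environmental compartments using fixed transfer coefficients, and the stocks are summed over the years. Only the 39,000-tonne production estimate comes from the article; every transfer coefficient below is an invented placeholder, and Empa’s actual model is dynamic, probabilistic and far more detailed.

```python
# Minimal material-flow sketch (placeholder coefficients, constant production).
production_t_per_year = 39_000         # nano-TiO2 produced annually in Europe (from the article)
years = range(2000, 2021)

# Assumed fractions of each year's production reaching each compartment:
transfer = {
    "wastewater / sewage sludge": 0.50,
    "soil": 0.30,
    "surface water": 0.05,
    "air": 0.01,
    "retained in products": 0.14,
}

stocks = {compartment: 0.0 for compartment in transfer}
for _ in years:
    for compartment, fraction in transfer.items():
        stocks[compartment] += production_t_per_year * fraction

for compartment, tonnes in stocks.items():
    print(f"{compartment:28s} {tonnes:12,.0f} t accumulated")
```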

Knowing the degree of accumulation in the environment is only the first step in the risk assessment of nanomaterials, however. Now this data has to be compared with results of eco-toxicological tests and the statutory thresholds, says Nowack. A risk assessment has not been carried out with his new model so far. Earlier work with data from a static model showed, however, that the concentrations determined for all four nanomaterials investigated are not expected to have any impact on the environment.

But in the case of nanozinc at least, its concentration in the environment is approaching the critical level. This is why this particular nanomaterial has to be given priority in future eco-toxicological studies – even though nanozinc is produced in smaller quantities than nano-titanium dioxide. Furthermore, eco-toxicological tests have until now been carried out primarily with freshwater organisms. The researchers conclude that additional investigations using soil-dwelling organisms are a priority.

Here are links to and citations for papers featuring the work,

Dynamic Probabilistic Modeling of Environmental Emissions of Engineered Nanomaterials by Tian Yin Sun, Nikolaus A. Bornhöft, Konrad Hungerbühler, and Bernd Nowack. Environ. Sci. Technol., 2016, 50 (9), pp 4701–4711 DOI: 10.1021/acs.est.5b05828 Publication Date (Web): April 04, 2016

Probabilistic environmental risk assessment of five nanomaterials (nano-TiO2, nano-Ag, nano-ZnO, CNT, and fullerenes) by Claudia Coll, Dominic Notter, Fadri Gottschalk, Tianyin Sun, Claudia Som, & Bernd Nowack. Nanotoxicology Volume 10, Issue 4, 2016 pages 436-444 DOI: 10.3109/17435390.2015.1073812 Published online: 10 Nov 2015

The first paper, which is listed in Environmental Science & Technology, appears to be open access while the second paper is behind a paywall.

Nanosafety Cluster newsletter—excerpts from the Spring 2016 issue

The European Commission’s NanoSafety Cluster Newsletter (no.7) Spring 2016 edition is some 50 pp. long and it provides a roundup of activities and forthcoming events. Here are a few excerpts,

“Closer to the Market” Roadmap (CTTM) now finalised

Hot off the press! The Cluster’s “Closer to the Market” Roadmap (CTTM) is a multi-dimensional, stepwise plan targeting a framework to deliver safe nano-enabled products to the market. After some years of discussions, several consultations with a huge number of experts in the nanosafety field, conferences at which the issue of market implementation of nanotechnologies was discussed, writing hours/days, and finally two public consultation rounds, the CTTM is now finalised.

As stated in the Executive Summary: “Nano-products and nano-enabled applications need a clear and easy-to-follow human and environmental safety framework for the development along the innovation chain from initial idea to market and beyond that facilitates navigation through the complex regulatory and approval processes under which different product categories fall.”

Download it here, and get involved in its implementation through the Cluster!
Authors: Andreas Falk*, Christa Schimpel, Andrea Haase, Benoît Hazebrouck, Carlos Fito López, Adriele Prina-Mello, Kai Savolainen, Adriënne Sips, Jesús M. Lopez de Ipiña, Iseult Lynch, Costas Charitidis, Visser Germ

NanoDefine hosts Synergy Workshop with NSC projects

NanoDefine organised the 2nd Nanosafety Cluster (NSC) Synergy Workshop at the Netherlands House for Education and Research in Brussels on 2nd February 2016. The aim was to identify overlaps and synergies existing between different projects that could develop into outstanding cooperation opportunities.

One central issue was the building of a common ontology and a European framework for data management and analysis, as planned within eNanoMapper, to facilitate a closer interdisciplinary collaboration between NSC projects and to better address the need for proper data storage, analysis and sharing (Open Access).

Unexpectedly, there’s a Canadian connection,

Discovering protocols for nanoparticles: the soils case
NanoFASE WP7 & NanoSafety Cluster WG3 Exposure

In NanoFASE, of course, we focus on the exposure to nanomaterials. Having consistent and meaningful protocols to characterize the fate of nanomaterials in different environments is therefore of great interest to us. Soils and sediments are in this respect very cumbersome. Even for conventional chemicals, the development of protocols for describing fate in terrestrial systems has been a long road.

The special considerations of nanomaterials make this job even harder. For instance, how does one handle the fact that the interaction between soils and nanoparticles is always out of equilibrium? How does one distinguish between the nanoparticles that are still mobile and those that are attached to soil?

In the case of conventional chemicals, a single measurement of a filtered soil suspension often suffices to find the mobile fraction, as long as one is sure that equilibrium has been attained. Equilibrium never occurs in the case of nanoparticles, and the distinction between attached and suspended particles is analytically harder to make.

Current activity in NanoFASE is focusing on finding protocols to characterize this interaction. Not only does the protocol have to provide meaningful parameters that can be used, e.g. in modelling, but the method itself should also be fast and cheap enough that a lot of data can be collected in a reasonable amount of time. NanoFASE is in a good position to do this, because of its focus on fate and because of its many international collaborators.

For instance, the Swedish Agricultural University (Uppsala) is collaborating with McGill University (Montreal, Canada [emphasis mine]), an advisory partner to NanoFASE, in developing the OECD [Organization for Economic Cooperation and Development] protocol for column tests (OECD test nr 312: “Leaching in soil columns”). The effort is led by Yasir Sultan from Environment Canada and by Karlheinz Weinfurtner from the Fraunhofer Institute in Germany. Initial results show the transport of nanomaterials in soil columns to be very limited.

The OECD protocol therefore does not often lead to measurable breakthrough curves that can be modelled to provide information about nanomaterial mobility in soils, and most likely requires adaptations to account for the relatively low mobility of typical pristine nanomaterials.

OECD 312 prescribes the use of 40 cm columns, which is most likely too long to show a breakthrough in the case of nanoparticles. Testing in NanoFASE will therefore focus on working with shorter columns and also investigating the effect of the flow speed.
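As an aside on what modelling a breakthrough curve means in practice, the sketch below runs a crude one-dimensional advection model with first-order attachment of particles to soil grains and reports how much of the injected material reaches the outlet. Column length, flow velocity and the attachment rate are invented placeholders, not OECD 312 parameters; with strong attachment, essentially nothing breaks through, which mirrors the limited transport reported above.

```python
# Crude 1-D column model: advection of suspended nanoparticles plus first-order
# attachment to the soil matrix. All parameter values are illustrative placeholders.
import numpy as np

L, v, k_att = 0.40, 0.10, 2.0     # column length (m), pore velocity (m/h), attachment rate (1/h)
dx, dt = 0.005, 0.01              # spatial step (m) and time step (h)

x = np.arange(0.0, L + dx, dx)
C = np.zeros_like(x)              # normalised suspended-particle concentration along the column

for _ in range(int(48 / dt)):     # 48 hours of continuous injection
    C_new = C.copy()
    C_new[0] = 1.0                                        # constant inlet concentration
    C_new[1:] = (C[1:]
                 - v * dt / dx * (C[1:] - C[:-1])         # upwind advection
                 - k_att * dt * C[1:])                    # removal by attachment
    C = np.clip(C_new, 0.0, None)

print(f"outlet concentration after 48 h: {C[-1]:.4f} of the inlet value")
```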

The progress and the results of this action will be reported on our website (www.nanofase.eu).

Now, wastewater,

ENM [engineered nanomaterial] Transformation in and Release from Managed Waste Streams (WP5): The NanoFASE pilot Wastewater Treatment Plant is up and running and producing sludge – soon we’ll be dosing with nanoparticles to test “real world” aging.

WP5 led by Ralf Kaegi of EAWAG [Swiss Federal Institute of Aquatic Science and Technology] (Switzerland) will establish transformation and release rates of ENM during their passage through different reactors. We are focusing on wastewater treatment plants (WWTPs), solid waste and dedicated sewage sludge incinerators as well as landfills (see figure below). Additionally, lab-scale experiments using pristine and well characterized materials, representing the realistic fate relevant forms at each stage, will allow us to obtain a mechanistic understanding of the transformation processes in waste treatment reactors. Our experimental results will feed directly into the development of a mathematical model describing the transformation and transfer of ENMs through the investigated reactors.

I’m including this since I’ve been following the ‘silver nanoparticle story’ for some time,

NanoMILE publication update: NanoMILE on the air and on the cover

Dramatic differences in behavior of nano-silver during the initial wash cycle and for its further dissolution/transformation potential over time depending on detergent composition and form.

In an effort to better relate nanomaterial aging procedures to those which they are most likely to undergo during the life cycle of nano-enhanced products, in this paper we describe the various transformations which are possible when exposing Ag engineered nanoparticles (ENPs) to a suite of commercially available washing detergents (Figure 1). While Ag ENP transformation and washing of textiles has received considerable attention in recent years, our study is novel in that we (1) used several commercially available detergents allowing us to estimate the various changes possible in individual homes and commercial washing settings; (2) we have continued method development of state-of-the-art nanometrology techniques, including single particle ICP-MS, for the detection and characterization of ENPs in complex media; and (3) we were able to provide novel additions to the knowledge base of the environmental nanotechnology research community both in terms of the analytical methods (e.g. the first time ENP aggregates have been definitively analyzed via single particle ICP-MS) and broadening the scope of “real world” conditions that should be considered when understanding AgENP through their life cycle.

Our findings, which were recently published in Environmental Science & Technology (2015, 49: 9665), indicate that the washing detergent chemistry causes dramatic differences in ENP behavior during the initial wash cycle and has ramifications for the dissolution/transformation potential of the Ag ENPs over time (see Figure 2). The use of silver as an antimicrobial treatment in textiles continues to garner considerable attention. Last year we published a manuscript in ACS Nano that considered how various silver treatments to textiles (conventional and nano) both release nano-sized material after the wash cycle with similar chemical characteristics. That study essentially conveyed that multiple silver treatments would become more similar through the product life cycle. Our newest work expands this by investigating one silver ENP under various washing conditions, thereby creating more varied silver products as an end result.

Fascinating stuff if you’ve been following the issues around nanotechnology and safety.

Towards the end of the newsletter on pp. 46-48, they list opportunities for partnerships, collaboration, and research posts and they list websites where you can check out job opportunities. Good Luck!

Less pollution from ships with nanofilter

04.05.16 – Cargo ships are among the leading sources of pollution on the planet. Starting in 2020, however, stricter sulfur emission standards will take effect. A low-cost solution for reaching the new targets may come from an EPFL start-up, which is developing a nanostructured filter for use in a ship’s exhaust stacks. Copyright Alain Herzog Courtesy EPFL

A May 4, 2016 news item on Nanowerk describes a marine initiative from the École polytechnique fédérale de Lausanne (EPFL) in Switzerland,

Around 55,000 cargo ships ply the oceans every day, powered by a fuel that is dirtier than diesel. And owing to lax standards, maritime transport has emerged as one of the leading emitters – alongside air transport – of nitrogen oxide and sulfur. But the International Maritime Organization has enacted tighter emission limits, with new standards set to take effect in 2020. In response, an EPFL start-up is developing a low-cost and eco-friendly solution: a filter that can be installed in the ships’ exhaust stacks. The start-up, Daphne Technology, could do well on this massive market.

Given that no oceans or seas border Switzerland, it’s a rather interesting initiative on their part. Here’s more from a May 4, 2016 EPFL press release, which originated the news item,

Lowering sulfur emissions to below 1%

Under laboratory conditions, the nanostructured filter is able to cut sulfur emissions to below 1% and nitrogen oxide emissions to 15% of the current standards. This is a major improvement, seeing as the new standards will require an approximately 14% reduction in sulfur emissions.

Manufacturing the filters is similar to manufacturing solar cells. A thin metal plate – titanium in this case – is nanostructured in order to increase its surface area, and a number of substances are deposited in extremely thin layers. The plates are then placed vertically and evenly spaced, creating channels through which the toxic gases travel. The gases are captured by the nanostructured surfaces. This approach is considered eco-friendly because the substances in the filter are designed to be recycled. And the exhaust gas itself becomes inert and could be used in a variety of products, such as fertilizer.

The main challenges now are to figure out a way to make these filters on large surfaces, and to bring down the cost. It was at EPFL’s Swiss Plasma Center that researcher Mario Michan found a machine that he could modify to meet his needs: it uses plasma to deposit thin layers of substances. The next step is to produce a prototype that can be tested under real-world conditions.

Michan came up with his solution for toxic gas emissions after he worked on merchant ships while completing his Master’s degree in microengineering. It took several years, some techniques he picked up in the various labs in which he worked, and a few patents for Michan to make headway on his project. It was while he was working in another field at CERN and observing the technologies used to coat the inside of particle accelerators that he discovered a process needed for his original concept. An EPFL patent tying together the various aspects of the technology and several manufacturing secrets should be filed this year.

According to the European Environment Agency, merchant ships give off 204 times more sulfur than the billion cars on the roads worldwide. Michan estimates that his nanostructured filters, if they were used by all cargo ships, would reduce these emissions to around twice the level given off by all cars, and the ships would not need to switch to another fuel. Other solutions exist, but his market research showed that they were all lacking in some way: “Marine diesel fuel is cleaner but much more expensive and would drive up fuel costs by 50% according to ship owners. And the other technologies that have been proposed cannot be used on boats or they only cut down on sulfur emissions without addressing the problem of nitrogen oxide.”

The Daphne Technology website is here.

An atom without properties?

There’s some rather intriguing Swiss research into atoms and so-called Bell correlations, according to an April 21, 2016 news item on ScienceDaily,

The microscopic world is governed by the rules of quantum mechanics, where the properties of a particle can be completely undetermined and yet strongly correlated with those of other particles. Physicists from the University of Basel have observed these so-called Bell correlations for the first time between hundreds of atoms. Their findings are published in the scientific journal Science.

Everyday objects possess properties independently of each other and regardless of whether we observe them or not. Einstein famously asked whether the moon still exists if no one is there to look at it; we answer with a resounding yes. This apparent certainty does not exist in the realm of small particles. The location, speed or magnetic moment of an atom can be entirely indeterminate and yet still depend greatly on the measurements of other distant atoms.

An April 21, 2016 University of Basel (Switzerland) press release (also on EurekAlert), which originated the news item, provides further explanation,

With the (false) assumption that atoms possess their properties independently of measurements and independently of each other, a so-called Bell inequality can be derived. If it is violated by the results of an experiment, it follows that the properties of the atoms must be interdependent. This is described as Bell correlations between atoms, which also imply that each atom takes on its properties only at the moment of the measurement. Before the measurement, these properties are not only unknown – they do not even exist.

A team of researchers led by professors Nicolas Sangouard and Philipp Treutlein from the University of Basel, along with colleagues from Singapore, have now observed these Bell correlations for the first time in a relatively large system, specifically among 480 atoms in a Bose-Einstein condensate. Earlier experiments showed Bell correlations with a maximum of four light particles or 14 atoms. The results mean that these peculiar quantum effects may also play a role in larger systems.

Large number of interacting particles

In order to observe Bell correlations in systems consisting of many particles, the researchers first had to develop a new method that does not require measuring each particle individually – which would require a level of control beyond what is currently possible. The team succeeded in this task with the help of a Bell inequality that was only recently discovered. The Basel researchers tested their method in the lab with small clouds of ultracold atoms cooled with laser light down to a few billionths of a degree above absolute zero. The atoms in the cloud constantly collide, causing their magnetic moments to become slowly entangled. When this entanglement reaches a certain magnitude, Bell correlations can be detected. Author Roman Schmied explains: “One would expect that random collisions simply cause disorder. Instead, the quantum-mechanical properties become entangled so strongly that they violate classical statistics.”

More specifically, each atom is first brought into a quantum superposition of two states. After the atoms have become entangled through collisions, researchers count how many of the atoms are actually in each of the two states. This division varies randomly between trials. If these variations fall below a certain threshold, it appears as if the atoms have ‘agreed’ on their measurement results; this agreement describes precisely the Bell correlations.
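The following toy simulation illustrates only the counting-statistics idea in the paragraph above, not the actual Bell witness used in the Basel experiment: it compares the run-to-run spread of the state populations for independent atoms (binomial statistics) with an artificially “squeezed” data set whose fluctuations fall below that baseline.

```python
# Toy counting-statistics comparison (illustrative only, not the paper's Bell test).
import numpy as np

rng = np.random.default_rng(0)
N, trials = 480, 10_000

# Independent atoms: each atom ends up in either state with probability 1/2,
# so the counts follow binomial statistics with variance N/4.
independent_counts = rng.binomial(N, 0.5, size=trials)

# Crude stand-in for correlated atoms: fluctuations artificially squeezed below binomial.
correlated_counts = np.round(N / 2 + 0.3 * (independent_counts - N / 2))

for label, counts in [("independent atoms", independent_counts),
                      ("correlated atoms (toy)", correlated_counts)]:
    normalised_variance = counts.var() / (N / 4)
    print(f"{label:24s} variance / binomial limit = {normalised_variance:.2f}")
```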

New scientific territory

The work presented, which was funded by the National Centre of Competence in Research Quantum Science and Technology (NCCR QSIT), may open up new possibilities in quantum technology; for example, for generating random numbers or for quantum-secure data transmission. New prospects in basic research open up as well: “Bell correlations in many-particle systems are a largely unexplored field with many open questions – we are entering uncharted territory with our experiments,” says Philipp Treutlein.

Here’s a link to and a citation for the paper,

Bell correlations in a Bose-Einstein condensate by Roman Schmied, Jean-Daniel Bancal, Baptiste Allard, Matteo Fadel, Valerio Scarani, Philipp Treutlein, Nicolas Sangouard. Science  22 Apr 2016: Vol. 352, Issue 6284, pp. 441-444 DOI: 10.1126/science.aad8665

This paper is behind a paywall.