Category Archives: nanotechnology

Popping (nano)bubbles!

Who doesn’t love to pop bubbles? Well, there’s probably someone out there, but it does seem to be a near-universal delight (especially with the advent of bubble wrap, which I’ve seen more than one person happily popping). Scientists are no more immune to that impulse than the rest of us, although they approach the whole endeavour from a more technical perspective, where popping bubbles becomes destabilization and bubble rupture. From a Sept. 28, 2017 American Institute of Physics (AIP) news release (also on EurekAlert),

Nanobubbles have recently gained popularity for their unique properties and expansive applications. Their large surface area and high stability in saturated liquids make nanobubbles ideal candidates for food science, medicine and environmental advancements. Nanobubbles also have long lifetimes of hours or days, and greater applicability than traditional macrobubbles, which typically only last for seconds.

The stability of nanobubbles is well understood, but the mechanisms causing their eventual destabilization are still in question. Using molecular dynamics simulations (MDS), researchers from the Beijing University of Chemical Technology explored the effect of surfactants — components that lower surface tension — on the stabilization of nanobubbles. They report their findings on the surprising mechanisms of destabilization [emphasis mine] for both soluble and insoluble surfactants this week [Sept. 25-29, 2017] in Applied Physics Letters, from AIP Publishing.

Researchers investigated the differences between soluble and insoluble surfactants and their varying influence on nanobubble stability using MDS software. They created a controlled model system in which the only variables that could be manipulated were the number of surfactants and the interaction between the surfactant and the substrate (the base of the model where the bubble forms), in order to measure the direct influence of surfactants on nanobubble stability.

Analyzing both soluble and insoluble surfactants, the group focused on two possible mechanisms of destabilization: contact line depinning, where the surfactant flexibility reduces the forces responsible for stabilizing the bubble shape, causing it to rupture from lack of inner surface force; and surface tension reduction, causing a liquid-to-vapor phase transition. [A back-of-the-envelope look at the interfacial physics follows the quoted release.]

They found soluble surfactants initiated nanobubble depinning when a large amount, roughly 80 percent, of the surfactant was adsorbed by the substrate, eventually causing the nanobubbles to burst.

“However, when small concentrations of soluble surfactant were introduced it remained dissolved, and adsorption onto the substrate was insignificant, generating a negligible effect on nanobubble stability,” said Xianren Zhang at Beijing University of Chemical Technology.

Simulations with insoluble surfactants showed comparable results to soluble surfactants when interacting heavily with substrates, but a new mechanism was discovered demonstrating a liquid-to-vapor transition model of bubble rupture [emphasis mine].

The transition is similar to how we traditionally envision bubbles popping, occurring when a surfactant significantly reduces the surface tension on the outside of the nanobubble. Nanobubbles destabilize in this fashion when a large amount of surfactant is present but relatively little surfactant-substrate interaction, around 40 percent, occurs.

These findings are critical to understanding nanobubble stability and have implications for nanobubble interaction with other molecules, including proteins and contaminants. Nanobubble applications could revolutionize aspects of modern medicine such as ultrasound techniques, expand functions in food science, and improve waste water treatment. But better characterizing basic properties like instability is essential to fully utilizing their potential in these applications.
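
A back-of-the-envelope note on those two mechanisms: both hinge on what happens at the gas-liquid interface, and the Young-Laplace relation (standard interfacial physics, not a formula from the paper) shows why the interface dominates at these scales. The excess pressure inside a bubble whose surface has curvature radius $R$ in a liquid with surface tension $\gamma$ is

$$\Delta P = P_{\text{in}} - P_{\text{out}} = \frac{2\gamma}{R}$$

For clean water ($\gamma \approx 0.072$ N/m) and $R = 100$ nm, $\Delta P \approx 1.4$ MPa, roughly fourteen atmospheres. With internal pressures that sensitive to $\gamma$ and $R$, an adsorbing surfactant that unpins the contact line or cuts the surface tension can plausibly tip a nanobubble from stable to ruptured, which is the territory the simulations explore.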

The researchers have made this image illustrating their work available,

Several typical snapshots for nanobubbles losing their stability with various concentrations of surfactants and levels of interaction with substrates. In each picture, top panel shows evolution of the system with all involved particles, while in the bottom panel, solvent molecules are not shown to clarify the effect of surfactants. CREDIT: Qianxiang Xiao, Yawei Liu, Zhenjiang Guo, Zhiping Liu, and Xianren Zhang

Here’s a link to and a citation for the paper,

How nanobubbles lose stability: Effects of surfactants by Qianxiang Xiao, Yawei Liu, Zhenjiang Guo, Zhiping Liu, and Xianren Zhang. Appl. Phys. Lett. 111, 131601 (2017); doi: http://dx.doi.org/10.1063/1.5000831

This paper is open access.

Manipulating graphene’s conductivity with honey

Honey can be used for many things: to heal wounds, for advice (you catch more flies with honey), to clean your hair (see suggestion no. 19 here) and even scientific inspiration, according to a Sept. 22, 2017 news item on phys.org,

Dr. Richard Ordonez, a nanomaterials scientist at the Space and Naval Warfare Systems Center Pacific (SSC Pacific), was having stomach pains last year. So begins the story of the accidental discovery that honey—yes, the bee byproduct—is an effective, non-toxic substitute for the manipulation of the current and voltage characteristics of graphene.

The news item was originated by a Sept. 22, 2017 article by Katherine Connor (who works for the US Space and Naval Warfare Systems Center) and placed in cemag.us,

Ordonez’s lab mate and friend Cody Hayashi gave him some store-bought honey as a Christmas gift and anti-inflammatory for his stomach, and Ordonez kept it near his work station for daily use. One day in the lab, the duo was investigating various dielectric materials they could use to fabricate a graphene transistor. First, the team tried to utilize water as a top-gate dielectric to manipulate graphene’s electrical conductivity. This approach was unsuccessful, so they proceeded with various compositions of sugar and deionized water, another electrolyte, which still resulted in negligible performance. That’s when the honey caught Ordonez’s eye, and an accidental scientific breakthrough was realized.

The finding is detailed in a paper in Nature Scientific Reports, in which the team describes how honey produces a nanometer-sized electric double layer at the interface with graphene that can be used to gate the ambipolar transport of graphene. [A back-of-the-envelope estimate of this gating effect follows the quoted article.]

“As a top-gate dielectric, water is much too conductive, so we moved to sugar and de-ionized water to control the ionic composition in hopes we could reduce conductivity,” Ordonez explains. “However, sugar water didn’t work for us either because, as a gate-dielectric, there was still too much leakage current. Out of frustration, literally inches away from me was the honey Cody had bought, so we decided to drop-cast the honey on graphene to act as top-gate dielectric — I thought maybe the honey would mimic dielectric gels I read about in literature. To our surprise — everyone said it’s not going to work — we tried and it did.”

Image of the liquid-metal graphene field-effect transistor (LM-GFET) and representation of charge distribution in electrolytic gate dielectrics comprised of honey. Image: Space and Naval Warfare Systems Center


Ordonez, Hayashi, and a team of researchers from SSC Pacific, in collaboration with the University of Hawai′i at Mānoa, have been developing novel graphene devices as part of a Navy Innovative Science and Engineering (NISE)-funded effort to imbue the Navy with inexpensive, lightweight, flexible graphene-based devices that can be used as next-generation sensors and wearable devices.

“Traditionally, electrolytic gate transistors are made with ionic gel materials,” Hayashi says. “But you must be proficient with the processes to synthesize them, and it can take several months to figure out the correct recipe that is required for these gels to function in the environment. Some of the liquids are toxic, so experimentation must be conducted in an atmospheric-controlled environment. Honey is completely different — it performs similarly to these much more sophisticated materials, but is safe, inexpensive, and easier to use. The honey was an intermediate step towards using ionic gels, and possibly a replacement for certain applications.”

Ordonez and Hayashi envision the honey-based version of graphene products being used for rapid prototyping of devices, since the devices can be created quickly and easily redesigned based on results. Instead of having to spend months developing the materials before even beginning to incorporate it into devices, using honey allows the team to get initial tests underway without waiting for costly fabrication equipment.

Ordonez also sees a use for such products in science, technology, engineering, and math (STEM) outreach efforts, since the honey is non-toxic and could be used to teach students about graphene.

This latest innovation and publication was a follow-on from the group’s discovery last year that liquid metals can be used in place of rigid electrodes such as gold and silver to electrically contact graphene. This, coupled with research on graphene and multi-spectral detection, earned them the Federal Laboratory Consortium Far West Regional Award in the category of Outstanding Technology Development.

SSC Pacific is the naval research and development lab responsible for ensuring Information Warfare superiority for warfighters, including the areas of cyber, command and control, intelligence, surveillance and reconnaissance, and space systems.
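
Before the citation, a quick aside on why a nanometer-thick electric double layer gates graphene so strongly. Here is a back-of-the-envelope estimate in Python; the relative permittivity, layer thickness, and gate voltage are my own illustrative assumptions rather than values from the paper (only the “nanometer-sized” thickness echoes the description above).

```python
# Back-of-envelope estimate of electric-double-layer (EDL) gating of graphene.
# All numeric inputs are illustrative assumptions, not values from the paper.
EPS0 = 8.854e-12      # vacuum permittivity, F/m
E_CHARGE = 1.602e-19  # elementary charge, C

eps_r = 20.0   # assumed relative permittivity of the honey electrolyte
d_edl = 1e-9   # assumed EDL thickness: ~1 nm ("nanometer-sized" layer)
v_gate = 1.0   # assumed gate voltage away from graphene's Dirac point, V

# Parallel-plate approximation: capacitance per unit area of the EDL
c_area = EPS0 * eps_r / d_edl          # F/m^2
# Gate-induced sheet carrier density in the graphene channel
n_sheet = c_area * v_gate / E_CHARGE   # carriers per m^2

print(f"EDL capacitance ~ {c_area * 1e2:.1f} uF/cm^2")     # 1 F/m^2 = 100 uF/cm^2
print(f"Carrier density ~ {n_sheet * 1e-4:.2e} per cm^2")
```

With these assumptions the double layer delivers on the order of 18 microfarads per square centimetre and shifts the carrier density by about 10^14 per square centimetre, far more gating than a solid oxide of practical thickness, which is the general appeal of electrolytic gates.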

Here’s a link to and a citation for the paper,

Rapid Fabrication of Graphene Field-Effect Transistors with Liquid-metal Interconnects and Electrolytic Gate Dielectric Made of Honey by Richard C. Ordonez, Cody K. Hayashi, Carlos M. Torres, Jordan L. Melcher, Nackieb Kamin, Godwin Severa, & David Garmire. Scientific Reports 7, Article number: 10171 (2017) doi:10.1038/s41598-017-10043-4 Published online: 31 August 2017

This paper is open access.

Calligraphy ink and cancer treatment

Courtesy of ACS Omega and the researchers

Nice illustration! I wish I could credit the artist. For anyone who needs a little text to make sense of it, there’s a Sept. 27, 2017 news item on Nanowerk (Note: A link has been removed),

For hundreds of years, Chinese calligraphers have used a plant-based ink to create beautiful messages and art. Now, one group reports in ACS Omega (“New Application of Old Material: Chinese Traditional Ink for Photothermal Therapy of Metastatic Lymph Nodes”) that this ink could noninvasively and effectively treat cancer cells that spread, or metastasize, to lymph nodes.

A Sept. 27, 2017 American Chemical Society (ACS) news release, which originated the news item, reveals more about the research,

As cancer cells leave a tumor, they frequently make their way to lymph nodes, which are part of the immune system. In this case, the main treatment option is surgery, but this can result in complications. Photothermal therapy (PTT) is an emerging noninvasive treatment option in which nanomaterials are injected and accumulate in cancer cells. A laser heats up the nanomaterials, and this heat kills the cells. Many of these nanomaterials are expensive, difficult-to-make and toxic. However, a traditional Chinese ink called Hu-Kaiwen ink (Hu-ink) has similar properties to the nanomaterials used in PTT. For example, they are the same color, and are both carbon-based and stable in water. So Wuli Yang and colleagues wanted to see if Hu-ink could be a good alternative material for PTT.

The researchers analyzed Hu-ink and found that it consists of nanoparticles and thin layers of carbon. When Hu-ink was heated with a laser, its temperature rose by 131 degrees Fahrenheit, much higher than current nanomaterials. Under PTT conditions, the Hu-ink killed cancer cells in a laboratory dish, but under normal conditions, the ink was non-toxic. This was also the scenario observed in mice with tumors. The researchers also noted that Hu-ink could act as a probe to locate tumors and metastases because it absorbs near-infrared light, which goes through skin.

Being a little curious about Hu-ink’s similarity to nanomaterials, I looked for more detail in the paper (Note: Links have been removed). From the Introduction,

Photothermal therapy (PTT) is an emerging tumor treatment strategy, which utilizes hyperthermia generated from absorbed near-infrared (NIR) light energy by photoabsorbing agents to kill tumor cells.(7-13) Different from chemotherapy, surgical treatment, and radiotherapy, PTT is noninvasive and more efficient.(7, 14, 15) In the past decade, PTT with diverse nanomaterials to eliminate cancer metastases lymph nodes has attracted extensive attention by several groups, including our group.(3, 16-20) For instance, Liu and his co-workers developed a treatment method based on PEGylated single-walled carbon nanotubes for PTT of tumor sentinel lymph nodes and achieved remarkably improved treatment effect in an animal tumor model.(21) To meet the clinical practice, the potential metastasis of deeper lymph nodes was further ablated in our previous work, using magnetic graphene oxide as a theranostic agent.(22) However, preparation of these artificial nanomaterials usually requires high cost, complicated synthetic process, and unavoidably toxic catalyst or chemicals,(23, 24) which impede their future clinical application. For the clinical application, exploring an environment-friendly material with simple preparation procedure, good biocompatibility, and excellent therapeutic efficiency is still highly desired. [emphases mine]

From the Preparation and Characterization of Hu-Ink section,

To obtain an applicable sample, the condensed Hu-ink was first diluted into aqueous dispersion with a lower concentration. The obtained Hu-ink dispersion without any further treatment was black in color and stable in physiological environment, including water, phosphate-buffered saline (PBS), and Roswell Park Memorial Institute (RPMI) 1640; furthermore, no aggregation was observed even after keeping undisturbed for 3 days (Figure 2a). The nanoscaled morphology of Hu-ink was examined by transmission electron microscopy (TEM) (Figure 2b), which demonstrates that Hu-ink mainly exist in the form of small aggregates. These small aggregates consist of a few nanoparticles with diameter of about 20–50 nm. Dynamic light scattering (DLS) measurement (Figure 2c) further shows that Hu-ink aqueous dispersion possesses a hydrodynamic diameter of about 186 nm (polydispersity index: 0.18), which was a crucial prerequisite for biomedical applications.(29) In the X-ray diffraction (XRD) pattern, no other characteristic peaks are found except carbon peak (Figure S1, Supporting Information), which confirms that the main component of Hu-ink is carbon.(25) Raman spectroscopy was a common tool to characterize graphene-related materials.(30) D band (∼1300 cm–1, corresponding to the defects) and G band (∼1600 cm–1, related to the sp2 carbon sites) peaks could be observed in Figure 2d with the ratio ID/IG = 0.96, which confirms the existence of graphene sheetlike structure in Hu-ink.(31) The UV–vis–NIR spectra (Figure 2e) also revealed that Hu-ink has high absorption in the NIR region around 650–900 nm, in which hemoglobin and water, the major absorbers of biological tissue, have their lowest absorption coefficient.(32) The high NIR absorption capability of Hu-ink encouraged us to investigate its photothermal properties.(33-35) Hu-ink dispersions with different concentrations were irradiated under an 808 nm laser (the commercial and widely used wavelength in photothermal therapy).(8-13) [emphases mine]

Curiosity satisfied! For those who’d like to investigate even further, here’s a link to and a citation for the paper,

New Application of Old Material: Chinese Traditional Ink for Photothermal Therapy of Metastatic Lymph Nodes by Sheng Wang, Yongbin Cao, Qin Zhang, Haibao Peng, Lei Liang, Qingguo Li, Shun Shen, Aimaier Tuerdi, Ye Xu, Sanjun Cai, and Wuli Yang. ACS Omega, 2017, 2 (8), pp 5170–5178 DOI: 10.1021/acsomega.7b00993 Publication Date (Web): August 30, 2017

Copyright © 2017 American Chemical Society

This paper appears to be open access.

Limitless energy and the International Thermonuclear Experimental Reactor (ITER)

Over 30 years in the dreaming, the International Thermonuclear Experimental Reactor (ITER) is now said to be halfway to completing construction. A December 6, 2017 ITER press release (received via email) makes the joyful announcement,

WORLD’S MOST COMPLEX MACHINE IS 50 PERCENT COMPLETED
ITER is proving that fusion is the future source of clean, abundant, safe and economic energy

The International Thermonuclear Experimental Reactor (ITER), a project to prove that fusion power can be produced on a commercial scale and is sustainable, is now 50 percent built to initial operation. Fusion is the same energy source from the Sun that gives the Earth its light and warmth.

ITER will use hydrogen fusion, controlled by superconducting magnets, to produce massive heat energy. In the commercial machines that will follow, this heat will drive turbines to produce electricity with these positive benefits:

* Fusion energy is carbon-free and environmentally sustainable, yet much more powerful than fossil fuels. A pineapple-sized amount of hydrogen offers as much fusion energy as 10,000 tons of coal. [A rough cross-check of this figure follows the press release.]

* ITER uses two forms of hydrogen fuel: deuterium, which is easily extracted from seawater; and tritium, which is bred from lithium inside the fusion reactor. The supply of fusion fuel for industry and megacities is abundant, enough for millions of years.

* When the fusion reaction is disrupted, the reactor simply shuts down, safely and without external assistance. Tiny amounts of fuel are used, about 2-3 grams at a time, so there is no physical possibility of a meltdown accident.

* Building and operating a fusion power plant is targeted to be comparable to the cost of a fossil fuel or nuclear fission plant. But unlike today’s nuclear plants, a fusion plant will not have the costs of high-level radioactive waste disposal. And unlike fossil fuel plants, fusion will not have the environmental cost of releasing CO2 and other pollutants.

ITER is the most complex science project in human history. The hydrogen plasma will be heated to 150 million degrees Celsius, ten times hotter than the core of the Sun, to enable the fusion reaction. The process happens in a donut-shaped reactor, called a tokamak(*), which is surrounded by giant magnets that confine and circulate the superheated, ionized plasma, away from the metal walls. The superconducting magnets must be cooled to minus 269°C, as cold as interstellar space.

The ITER facility is being built in Southern France by a scientific partnership of 35 countries. ITER’s specialized components, roughly 10 million parts in total, are being manufactured in industrial facilities all over the world. They are subsequently shipped to the ITER worksite, where they must be assembled, piece-by-piece, into the final machine.

Each of the seven ITER members (the European Union, China, India, Japan, Korea, Russia, and the United States) is fabricating a significant portion of the machine. This adds to ITER’s complexity.

In a message dispatched on December 1 [2017] to top-level officials in ITER member governments, the ITER project reported that it had completed 50 percent of the “total construction work scope through First Plasma” (**). First Plasma, scheduled for December 2025, will be the first stage of operation for ITER as a functional machine.

“The stakes are very high for ITER,” writes Bernard Bigot, Ph.D., Director-General of ITER. “When we prove that fusion is a viable energy source, it will eventually replace burning fossil fuels, which are non-renewable and non-sustainable. Fusion will be complementary with wind, solar, and other renewable energies.

“ITER’s success has demanded extraordinary project management, systems engineering, and almost perfect integration of our work.

“Our design has taken advantage of the best expertise of every member’s scientific and industrial base. No country could do this alone. We are all learning from each other, for the world’s mutual benefit.”

The ITER 50 percent milestone is getting significant attention.

“We are fortunate that ITER and fusion has had the support of world leaders, historically and currently,” says Director-General Bigot. “The concept of the ITER project was conceived at the 1985 Geneva Summit between Ronald Reagan and Mikhail Gorbachev. When the ITER Agreement was signed in 2006, it was strongly supported by leaders such as French President Jacques Chirac, U.S. President George W. Bush, and Indian Prime Minister Manmohan Singh.

“More recently, President Macron and U.S. President Donald Trump exchanged letters about ITER after their meeting this past July. One month earlier, President Xi Jinping of China hosted Russian President Vladimir Putin and other world leaders in a showcase featuring ITER and fusion power at the World EXPO in Astana, Kazakhstan.

“We know that other leaders have been similarly involved behind the scenes. It is clear that each ITER member understands the value and importance of this project.”

Why use this complex manufacturing arrangement?

More than 80 percent of the cost of ITER, about $22 billion or EUR 18 billion, is contributed in the form of components manufactured by the partners. Many of these massive components of the ITER machine must be precisely fitted (for example, 17-meter-high magnets with less than a millimeter of tolerance). Each component must be ready on time to fit into the Master Schedule for machine assembly.

Members asked for this deal for three reasons. First, it means that most of the ITER costs paid by any member are actually paid to that member’s companies; the funding stays in-country. Second, the companies working on ITER build new industrial expertise in major fields such as electromagnetics, cryogenics, robotics, and materials science. Third, this new expertise leads to innovation and spin-offs in other fields.

For example, expertise gained working on ITER’s superconducting magnets is now being used to map the human brain more precisely than ever before.

The European Union is paying 45 percent of the cost; China, India, Japan, Korea, Russia, and the United States each contribute 9 percent equally. All members share in ITER’s technology; they receive equal access to the intellectual property and innovation that comes from building ITER.

When will commercial fusion plants be ready?

ITER scientists predict that fusion plants will start to come on line as soon as 2040. The exact timing, according to fusion experts, will depend on the level of public urgency and political will that translates to financial investment.

How much power will they provide?

The ITER tokamak will produce 500 megawatts of thermal power. This size is suitable for studying a “burning” or largely self-heating plasma, a state of matter that has never been produced in a controlled environment on Earth. In a burning plasma, most of the plasma heating comes from the fusion reaction itself. Studying the fusion science and technology at ITER’s scale will enable optimization of the plants that follow.

A commercial fusion plant will be designed with a slightly larger plasma chamber, for 10-15 times more electrical power. A 2,000-megawatt fusion electricity plant, for example, would supply 2 million homes.

How much would a fusion plant cost and how many will be needed?

The initial capital cost of a 2,000-megawatt fusion plant will be in the range of $10 billion. These capital costs will be offset by extremely low operating costs, negligible fuel costs, and infrequent component replacement costs over the 60-year-plus life of the plant. Capital costs will decrease with large-scale deployment of fusion plants.

At current electricity usage rates, one fusion plant would be more than enough to power a city the size of Washington, D.C. The entire D.C. metropolitan area could be powered with four fusion plants, with zero carbon emissions.

“If fusion power becomes universal, the use of electricity could be expanded greatly, to reduce the greenhouse gas emissions from transportation, buildings and industry,” predicts Dr. Bigot. “Providing clean, abundant, safe, economic energy will be a miracle for our planet.”

*     *     *

FOOTNOTES:

* “Tokamak” is a word of Russian origin meaning a toroidal or donut-shaped magnetic chamber. Tokamaks have been built and operated for the past six decades. They are today’s most advanced fusion device design.

** “Total construction work scope,” as used in ITER’s project performance metrics, includes design, component manufacturing, building construction, shipping and delivery, assembly, and installation.
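
Stepping outside the press release for a moment: the pineapple comparison is easy to sanity-check. Here is a rough calculation of my own; the 17.6 MeV deuterium-tritium yield and the coal energy density are textbook figures rather than numbers from ITER, and I assume D-T fuel since that is what ITER will eventually burn.

```python
# Rough cross-check of the "pineapple vs. 10,000 tons of coal" claim.
# Assumed figures (not from the press release): a D-T fusion reaction
# releases 17.6 MeV; bituminous coal holds roughly 29 MJ/kg.
MEV = 1.602e-13   # joules per MeV
U = 1.661e-27     # atomic mass unit, kg

e_per_reaction = 17.6 * MEV                   # J per D-T fusion event
fuel_mass = (2.014 + 3.016) * U               # kg of D + T consumed per event
fusion_j_per_kg = e_per_reaction / fuel_mass  # ~3.4e14 J/kg of fuel

coal_j_per_kg = 29e6   # J/kg, typical bituminous coal
coal_mass = 1e7        # 10,000 metric tons, in kg

fuel_needed = coal_mass * coal_j_per_kg / fusion_j_per_kg
print(f"D-T fuel matching 10,000 t of coal: {fuel_needed:.2f} kg")  # ~0.86 kg
```

A bit under a kilogram of fuel, which is indeed pineapple territory, so the press release’s figure holds up.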

It is an extraordinary project on many levels as Henry Fountain notes in a March 27, 2017 article for the New York Times (Note: Links have been removed),

At a dusty construction site here amid the limestone ridges of Provence, workers scurry around immense slabs of concrete arranged in a ring like a modern-day Stonehenge.

It looks like the beginnings of a large commercial power plant, but it is not. The project, called ITER, is an enormous, and enormously complex and costly, physics experiment. But if it succeeds, it could determine the power plants of the future and make an invaluable contribution to reducing planet-warming emissions.

ITER, short for International Thermonuclear Experimental Reactor (and pronounced EAT-er), is being built to test a long-held dream: that nuclear fusion, the atomic reaction that takes place in the sun and in hydrogen bombs, can be controlled to generate power.

ITER will produce heat, not electricity. But if it works — if it produces more energy than it consumes, which smaller fusion experiments so far have not been able to do — it could lead to plants that generate electricity without the climate-affecting carbon emissions of fossil-fuel plants or most of the hazards of existing nuclear reactors that split atoms rather than join them.

Success, however, has always seemed just a few decades away for ITER. The project has progressed in fits and starts for years, plagued by design and management problems that have led to long delays and ballooning costs.

ITER is moving ahead now, with a director-general, Bernard Bigot, who took over two years ago after an independent analysis that was highly critical of the project. Dr. Bigot, who previously ran France’s atomic energy agency, has earned high marks for resolving management problems and developing a realistic schedule based more on physics and engineering and less on politics.

The site here is now studded with tower cranes as crews work on the concrete structures that will support and surround the heart of the experiment, a doughnut-shaped chamber called a tokamak. This is where the fusion reactions will take place, within a plasma, a roiling cloud of ionized atoms so hot that it can be contained only by extremely strong magnetic fields.

Here’s a rendering of the proposed reactor,

Source: ITER Organization

It seems the folks at the New York Times decided to remove the notes which help make sense of this image. However, it does get the idea across.

If I read the article rightly, the official cost in March 2017 was around 22 billion euros, and more will likely be needed. You can read Fountain’s article for more information about fusion and ITER or go to the ITER website.

I could have sworn a local (Vancouver area) company called General Fusion was involved in the ITER project but I can’t track down any sources for confirmation. The sole connection I could find is in a documentary about fusion technology,

Here’s a little context for the film from a July 4, 2017 General Fusion news release (Note: A link has been removed),

A new documentary featuring General Fusion has captured the exciting progress in fusion across the public and private sectors.

Let There Be Light made its international premiere at the South By Southwest (SXSW) music and film festival in March [2017] to critical acclaim. The film was quickly purchased by Amazon Video, where it will be available for more than 70 million users to stream.

Let There Be Light follows scientists at General Fusion, ITER and Lawrenceville Plasma Physics in their pursuit of a clean, safe and abundant source of energy to power the world.

The feature length documentary has screened internationally across Europe and North America. Most recently it was shown at the Hot Docs film festival in Toronto, where General Fusion founder and Chief Scientist Dr. Michel Laberge joined fellow fusion physicist Dr. Mark Henderson from ITER at a series of Q&A panels with the filmmakers.

Laberge and Henderson were also interviewed by the popular CBC radio science show Quirks and Quarks, discussing different approaches to fusion, its potential benefits, and the challenges it faces.

It is yet to be confirmed when the film will be released for streaming; check Amazon Video for details.

You can find out more about General Fusion here.

Brief final comment

ITER is a breathtaking effort, but if you’ve read about other large-scale projects (building a railway across the Canadian Rocky Mountains, establishing telecommunications in an astonishing number of countries around the world, getting someone to the moon, eliminating smallpox, building the pyramids, etc.), delays and setbacks seem to be standard operating procedure, both for the successes I’ve described and for the failures we’ve forgotten. Where ITER will finally rest on the continuum between success and failure is yet to be determined, but the problems experienced so far are not necessarily a predictor.

I wish the engineers, scientists, visionaries, and others great success with finding better ways to produce energy.

Europe’s cathedrals get a ‘lift’ with nanoparticles

That headline is a teensy bit laboured but I couldn’t resist the levels of wordplay available to me. They’re working on a cathedral close to the Leaning Tower of Pisa in this video about the latest in stone preservation in Europe.

I have covered the topic of preserving stone monuments before (most recently in my Oct. 21, 2014 posting). The action in this field seems to be taking place mostly in Europe, specifically Italy, although other countries are also quite involved.

Finally, getting to the European Commission’s latest stone monument preservation project, Nano-Cathedral, a Sept. 26, 2017 news item on Nanowerk announces the latest developments,

Just a few meters from Pisa’s famous Leaning Tower, restorers are defying scorching temperatures to bring back shine to the city’s Cathedral.

Ordinary restoration techniques like lasers are being used on much of the stonework that dates back to the 11th century. But a brand new technique is also being used: a new material made of innovative nanoparticles. The aim is to consolidate the inner structure of the stones. It’s being applied mainly on marble.

A March 7, 2017 item on the Euro News website, which originated the Nanowerk news item, provides more detail,

“Marble has very low porosity, which means we have to use nanometric particles in order to go deep inside the stone, to ensure that the treatment is both efficient while still allowing the stone to breathe,” explains Roberto Cela, civil engineer at Opera Della Primaziale Pisana.

The material developed by the European research team includes calcium carbonate, which is a mix of calcium oxide, water and carbon dioxide.

The nano-particles penetrate the stone, cementing its decaying structure.

“It is important that these particles have the same chemical nature as the stones that are being treated, so that the physical and mechanical processes that occur over time don’t lead to the break-up of the stones,” says Dario Paolucci, chemist at the University of Pisa.

Vienna’s St Stephen’s is another of the five cathedrals where the new restoration materials are being tested.

The first challenge for researchers is to determine the mechanical characteristics of the cathedral’s stones. Since there are few original samples to work on, they had to figure out a way of “ageing” samples of stones of similar nature to those originally used.

“We tried different things: we tried freeze storage, we tried salts and acids, and we decided to go for thermal ageing,” explains Matea Ban, material scientist at the University of Technology in Vienna. “So what happens is that we heat the stone at certain temperatures. Minerals inside then expand in certain directions, and when they expand they build up stresses to neighbouring minerals and then they crack, and we need those cracks in order to consolidate them.”

Consolidating materials were then applied on a variety of limestones, sandstones and marble – a selection of the different types of stones that were used to build cathedrals around Europe.

What researchers are looking for are very specific properties.

“First of all, the consolidating material has to be well absorbed by the stone,” says petrologist Johannes Weber of the University of Applied Arts in Vienna. “Then, as it evaporates, it has to settle properly within the stone structure. It should not shrink too much. All materials shrink when drying, including consolidating materials. They should adhere to the particles of the stone but shouldn’t completely obstruct its pores.”

Further tests are underway in cathedrals across Europe in the hope of better protecting our invaluable cultural heritage.
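
One small clarification of my own: describing the consolidant as calcium carbonate that “is a mix of calcium oxide, water and carbon dioxide” compresses a two-step chemistry. As typically described for lime-based nanoparticle consolidants (my gloss, not the project’s wording), calcium oxide is first slaked to calcium hydroxide, and the resulting nanoparticles, once carried into the stone, carbonate on exposure to air:

$$\mathrm{CaO + H_2O \rightarrow Ca(OH)_2}$$

$$\mathrm{Ca(OH)_2 + CO_2 \rightarrow CaCO_3 + H_2O}$$

The end product, calcium carbonate, is the same mineral that makes up marble and limestone, which is presumably what Paolucci means by the particles having “the same chemical nature as the stones” being treated.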

There’s a bit more detail about Nano-Cathedral on the Opera della Primaziale Pisana (OPA) website (from their Nano-Cathedral project page),

With the meeting of June 3 this year, the Nano-Cathedral project kicked off, supported by the European Union’s Horizon 2020 programme in the field of nanotechnology applied to cultural heritage, with funding of about 6.5 million euros.

A total of six monumental buildings will be for three years under the eyes and hands of petrographers, geologists, chemists and restorers of the institutes belonging to the Consortium: five cathedrals have been selected to represent the cultural diversity within Europe from the perspective of developing shared values and transnational identity, and a contemporary monumental building entirely clad in Carrara marble, the Opera House of Oslo.

Purpose: the testing of nanomaterials for the conservation of marble and the outer surfaces of our ‘cathedrals’.
The field of investigation for checking degradation and testing new consolidating and protective products comprises the Cathedral of Pisa together with the cathedrals of Cologne, Vienna, Ghent and Vitoria.
For the selection of case studies, we cross-checked requirements for historical and architectural value but also for the different types of construction materials (marble, limestone and sandstone), as well as the distribution of the six monumental buildings across European climates.

The Cathedral of Pisa is the most southern, fully within a Mediterranean climate, and is therefore subject to degradation very different from that recorded under the weather conditions of the Scandinavian peninsula; all the intermediate climate phases are covered by Ghent, Vitoria, Cologne and Vienna.

At the conclusion of the three-year project, once the analyses in situ and in the laboratory are completed and all the experiments are tested on each identified portion of each monumental building, an intervention protocol will be defined in detail in order to identify the mineralogical and petrographic characteristics of the stone materials and of their degradation, and to assess the causes and mechanisms of the associated alteration, including interactions with environmental pollution. Then we will be able to identify the most appropriate method of restoration and testing of nanotechnology products for the consolidation and protection of different stone materials.

In 2018 we hope to have new materials to protect and safeguard the ‘skin’ of our historic buildings and monuments for a long time.

Back to my headline and the second piece of wordplay, ‘lift’ as in ‘skin lift’ in that last sentence.

I realize this is a bit off topic but it’s worth taking a look at OPA’s home page,

Gabriele D’Annunzio effectively condenses the wonder and admiration that catch whoever visits the Duomo Square of Pisa.

The Opera della Primaziale Pisana (OPA) is a non-profit organisation which was established in order to oversee the first works for the construction of the monuments in the Piazza del Duomo, subject to its own charter which includes the protection, promotion and enhancement of its heritage, in order to pass the religious and artistic meaning onto future generations.

«L’Ardea roteò nel cielo di Cristo, sul prato dei Miracoli.» [Roughly: “The heron wheeled in the sky of Christ, over the meadow of Miracles.”]
Gabriele d’Annunzio in Forse che sì forse che no (1910)

If you go to the home page, you can buy tickets to visit the monuments surrounding the square and there are other notices including one for a competition (it’s too late to apply but the details are interesting) to construct four stained glass windows for the Pisa cathedral.

Liquid circuitry, shape-shifting fluids and more

I’d have to see it to believe it but researchers at the US Dept. of Energy (DOE) Lawrence Berkeley National Laboratory (LBNL) have developed a new kind of ‘bijel’ which would allow for some pretty nifty robotics. From a Sept. 25, 2017 news item on ScienceDaily,

A new two-dimensional film, made of polymers and nanoparticles and developed by researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), can direct two different non-mixing liquids into a variety of exotic architectures. This finding could lead to soft robotics, liquid circuitry, shape-shifting fluids, and a host of new materials that use soft, rather than solid, substances.

The study, reported today in the journal Nature Nanotechnology, presents the newest entry in a class of substances known as bicontinuous jammed emulsion gels, or bijels, which hold promise as a malleable liquid that can support catalytic reactions, electrical conductivity, and energy conversion.

A Sept. 25, 2017 LBNL news release (also on EurekAlert), which originated the news item, expands on the theme,

Bijels are typically made of immiscible, or non-mixing, liquids. People who shake their bottle of vinaigrette before pouring the dressing on their salad are familiar with such liquids. As soon as the shaking stops, the liquids start to separate again, with the lower density liquid – often oil – rising to the top.

Trapping, or jamming, particles where these immiscible liquids meet can prevent the liquids from completely separating, stabilizing the substance into a bijel. What makes bijels remarkable is that, rather than just making the spherical droplets that we normally see when we try to mix oil and water, the particles at the interface shape the liquids into complex networks of interconnected fluid channels.

Bijels are notoriously difficult to make, however, involving exact temperatures at precisely timed stages. In addition, the liquid channels are normally more than 5 micrometers across, making them too large to be useful in energy conversion and catalysis.

“Bijels have long been of interest as next-generation materials for energy applications and chemical synthesis,” said study lead author Caili Huang. “The problem has been making enough of them, and with features of the right size. In this work, we crack that problem.”

Huang started the work as a graduate student with Thomas Russell, the study’s principal investigator, at Berkeley Lab’s Materials Sciences Division, and he continued the project as a postdoctoral researcher at DOE’s Oak Ridge National Laboratory.

Creating a new bijel recipe

The method described in this new study simplifies the bijel process by first using specially coated particles about 10-20 nanometers in diameter. The smaller-sized particles line the liquid interfaces much more quickly than the ones used in traditional bijels, making the smaller channels that are highly valued for applications.

Illustration shows key stages of bijel formation. Clockwise from top left, two non-mixing liquids are shown. Ligands (shown in yellow) with amine groups are dispersed throughout the oil or solvent, and nanoparticles coated with carboxylic acids (shown as blue dots) are scattered in the water. With vigorous shaking, the nanoparticles and ligands form a “supersoap” that gets trapped at the interface of the two liquids. The bottom panel is a magnified view of the jammed nanoparticle supersoap. (Credit: Caili Huang/ORNL)

“We’ve basically taken liquids like oil and water and given them a structure, and it’s a structure that can be changed,” said Russell, a visiting faculty scientist at Berkeley Lab. “If the nanoparticles are responsive to electrical, magnetic, or mechanical stimuli, the bijels can become reconfigurable and re-shaped on demand by an external field.”

The researchers were able to prepare new bijels from a variety of common organic, water-insoluble solvents, such as toluene, that had ligands dissolved in it, and deionized water, which contained the nanoparticles. To ensure thorough mixing of the liquids, they subjected the emulsion to a vortex spinning at 3,200 revolutions per minute.

“This extreme shaking creates a whole bunch of new places where these particles and polymers can meet each other,” said study co-author Joe Forth, a postdoctoral fellow at Berkeley Lab’s Materials Sciences Division. “You’re synthesizing a lot of this material, which is in effect a thin, 2-D coating of the liquid surfaces in the system.”

The liquids remained a bijel even after one week, a sign of the system’s stability.

Russell, who is also a professor of polymer science and engineering at the University of Massachusetts-Amherst, added that these shape-shifting characteristics would be valuable in microreactors, microfluidic devices, and soft actuators.

Nanoparticle supersoap

Nanoparticles had not been seriously considered in bijels before because their small size made them hard to trap in the liquid interface. [A quick look at the physics behind that follows the quoted release.] To resolve that problem, the researchers coated nano-sized particles with carboxylic acids and put them in water. They then took polymers with an added amine group – a derivative of ammonia – and dissolved them in the toluene.

At left is a vial of bijel stabilized with nanoparticle surfactants. On the right is the same vial after a week of inversion, showing that the nanoparticle kept the liquids from moving. (Credit: Caili Huang/ORNL)

This configuration took advantage of the amine group’s affinity to water, a characteristic that is comparable to surfactants, like soap. Their nanoparticle “supersoap” was designed so that the nanoparticles join ligands, forming an octopus-like shape with a polar head and nonpolar legs that get jammed at the interface, the researchers said.

“Bijels are really a new material, and also excitingly weird in that they are kinetically arrested in these unusual configurations,” said study co-author Brett Helms, a staff scientist at Berkeley Lab’s Molecular Foundry. “The discovery that you can make these bijels with simple ingredients is a surprise. We all have access to oils and water and nanocrystals, allowing broad tunability in bijel properties. This platform also allows us to experiment with new ways to control their shape and function since they are both responsive and reconfigurable.”

The nanoparticles were made of silica, but the researchers noted that in previous studies they used graphene and carbon nanotubes to form nanoparticle surfactants.

“The key is that the nanoparticles can be made of many materials,” said Russell.  “The most important thing is what’s on the surface.”

This is an animation of the bijel,

3-D rendering of the nanoparticle bijel taken by confocal microscope. (Credit: Caili Huang/ORNL [Oak Ridge National Laboratory] and Joe Forth/Berkeley Lab)
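
As promised above, a quick look at why small particles are hard to trap. Standard colloid science (textbook material, not an equation from this paper) puts the energy needed to detach a spherical particle of radius $r$ from an interface with interfacial tension $\gamma$ and three-phase contact angle $\theta$ at

$$E_d = \pi r^2 \gamma \left(1 - \lvert \cos\theta \rvert\right)^2$$

Because the binding energy scales with $r^2$, shrinking a particle from a micrometre to ten nanometres weakens its grip on the interface by a factor of ten thousand, and for contact angles far from 90 degrees $E_d$ can approach the thermal energy $k_BT$, letting the particle diffuse right back out. Pairing the carboxylic-acid-coated nanoparticles with amine-tipped ligands, as the release describes, in effect assembles a larger and more strongly bound object at the interface.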

Here’s a link to and a citation for the paper,

Bicontinuous structured liquids with sub-micrometre domains using nanoparticle surfactants by Caili Huang, Joe Forth, Weiyu Wang, Kunlun Hong, Gregory S. Smith, Brett A. Helms & Thomas P. Russell. Nature Nanotechnology (2017) doi:10.1038/nnano.2017.182 25 September 2017

This paper is behind a paywall.

Plastic nanoparticles and brain damage in fish

Researchers in Sweden suggest plastic nanoparticles may cause brain damage in fish according to a Sept. 25, 2017 news item on phys.org,

Calculations have shown that 10 per cent of all plastic produced around the world ultimately ends up in the oceans. As a result, a large majority of global marine debris is in fact plastic waste. Human production of plastics is a well-known environmental concern, but few studies have studied the effects of tiny plastic particles, known as nanoplastic particles.

“Our study is the first to show that nanosized plastic particles can accumulate in fish brains”, says Tommy Cedervall, a chemistry researcher at Lund University.

A Sept. 25, 2017 Lund University press release, which originated the news item, provides more detail about the research,

The Lund University researchers studied how nanoplastics may be transported through different organisms in the aquatic ecosystem, i.e. via algae and animal plankton to larger fish. Tiny plastic particles in the water are eaten by animal plankton, which in turn are eaten by fish.

According to Cedervall, the study includes several interesting results on how plastic of different sizes affects aquatic organisms. Most importantly, it provides evidence that nanoplastic particles can indeed cross the blood-brain barrier in fish and thus accumulate inside fish’s brain tissue.

In addition, the researchers involved in the present study have demonstrated the occurrence of behavioural disorders in fish that are affected by nanoplastics. They eat slower and explore their surroundings less. The researchers believe that these behavioural changes may be linked to brain damage caused by the presence of nanoplastics in the brain.

Another result of the study is that animal plankton die when exposed to nanosized plastic particles, while larger plastic particles do not affect them. Overall, these different effects of nanoplastics may have an impact on the ecosystem as a whole.

“It is important to study how plastics affect ecosystems and that nanoplastic particles likely have a more dangerous impact on aquatic ecosystems than larger pieces of plastics”, says Tommy Cedervall.

However, he does not dare to draw the conclusion that plastic nanoparticles could accumulate in other tissues in fish and thus potentially be transmitted to humans through consumption.

“No, we are not aware of any such studies and are therefore very cautious about commenting on it”, says Tommy Cedervall.

Here’s a link to and a citation for the paper,

Brain damage and behavioural disorders in fish induced by plastic nanoparticles delivered through the food chain by Karin Mattsson, Elyse V. Johnson, Anders Malmendal, Sara Linse, Lars-Anders Hansson & Tommy Cedervall. Scientific Reports 7, Article number: 11452 (2017) doi:10.1038/s41598-017-10813-0 Published online: 13 September 2017

This paper is open access.

Predictive policing in Vancouver—the first jurisdiction in Canada to employ a machine learning system for property theft reduction

Predictive policing has come to Canada, specifically, Vancouver. A July 22, 2017 article by Matt Meuse for the Canadian Broadcasting Corporation (CBC) news online describes the new policing tool,

The Vancouver Police Department is implementing a city-wide “predictive policing” system that uses machine learning to prevent break-ins by predicting where they will occur before they happen — the first of its kind in Canada.

Police chief Adam Palmer said that, after a six-month pilot project in 2016, the system is now accessible to all officers via their cruisers’ onboard computers, covering the entire city.

“Instead of officers just patrolling randomly throughout the neighbourhood, this will give them targeted areas it makes more sense to patrol in because there’s a higher likelihood of crime to occur,” Palmer said.


Things got off to a slow start as the system familiarized itself [during a 2016 pilot project] with the data, and floundered in the fall due to unexpected data corruption.

But Special Const. Ryan Prox said the system reduced property crime by as much as 27 per cent in areas where it was tested, compared to the previous four years.

The accuracy of the system was also tested by having it generate predictions for a given day, and then watching to see what happened that day without acting on the predictions.

Palmer said the system was getting accuracy rates between 70 and 80 per cent.

When a location is identified by the system, Palmer said officers can be deployed to patrol that location. …

“Quite often … that visible presence will deter people from committing crimes [altogether],” Palmer said.

Though similar systems are used in the United States, Palmer said the system is the first of its kind in Canada, and was developed specifically for the VPD.

While the current focus is on residential break-ins, Palmer said the system could also be tweaked for use with car theft — though likely not with violent crime, which is far less predictable.

Palmer dismissed the inevitable comparison to the 2002 Tom Cruise film Minority Report, in which people are arrested to prevent them from committing crimes in the future.

“We’re not targeting people, we’re targeting locations,” Palmer said. “There’s nothing dark here.”

If you want to get a sense of just how dismissive Chief Palmer was, there’s a July 21, 2017 press conference (run time: approx. 21 mins.) embedded with a media release of the same date. The media release offered these details,

The new model is being implemented after the VPD ran a six-month pilot study in 2016 that contributed to a substantial decrease in residential break-and-enters.

The pilot ran from April 1 to September 30, 2016. The number of residential break-and-enters during the test period was compared to the monthly average over the same period for the previous four years (2012 to 2015). The highest drop in property crime – 27 per cent – was measured in June.

The new model provides data in two-hour intervals for locations where residential and commercial break-and-enters are anticipated. The information is for 100-metre and 500-metre zones. Police resources can be dispatched to that area on foot or in patrol cars, to provide a visible presence to deter thieves.

The VPD’s new predictive policing model is built on GEODASH – an advanced machine-learning technology that was implemented by the VPD in 2015. A public version of GEODASH was introduced in December 2015 and is publicly available on vpd.ca. It retroactively plots the location of crimes on a map to provide a general idea of crime trends to the public.
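
To make the mechanics a little more concrete, here is a toy sketch of how grid-based property-crime forecasting is typically set up: aggregate historical incidents into spatial cells and time bins, then train a classifier to score each cell’s risk for the next interval. This is my own illustration running on synthetic data, not the VPD’s system; GEODASH’s internals aren’t public.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for a geocoded incident history: a city divided into
# a 20x20 grid of cells, observed over 1,000 two-hour intervals.
n_cells, n_steps = 400, 1000
base_risk = rng.beta(2, 50, size=n_cells)               # each cell's long-run rate
incidents = rng.random((n_steps, n_cells)) < base_risk  # True = break-in reported

# Features for each (cell, time) pair: counts of recent incidents in that
# cell over trailing windows of two-hour bins.
def make_features(hist, t):
    return np.stack([
        hist[t - 12:t].sum(axis=0),    # last ~1 day
        hist[t - 84:t].sum(axis=0),    # last ~1 week
        hist[t - 336:t].sum(axis=0),   # last ~4 weeks
    ], axis=1)

X = np.vstack([make_features(incidents, t) for t in range(336, n_steps - 1)])
y = np.concatenate([incidents[t + 1] for t in range(336, n_steps - 1)])

model = LogisticRegression().fit(X, y)

# Score every cell for the next two-hour interval and patrol the top few.
latest = make_features(incidents, n_steps - 1)
risk = model.predict_proba(latest)[:, 1]
print("Cells to patrol next interval:", np.argsort(risk)[::-1][:5])
```

Even a toy like this makes one limitation visible: the model only ever sees recorded incidents, so its “risk” is really a forecast of where past reports clustered, a point that matters in the criticism discussed below.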

I wish Chief Palmer had been a bit more open to discussion about the implications of ‘predictive policing’. In the US, where these systems have been employed in various jurisdictions, there’s some concern arising after an almost euphoric initial response, as a Nov. 21, 2016 article by Logan Koepke for slate.com notes (Note: Links have been removed),

When predictive policing systems began rolling out nationwide about five years ago, coverage was often uncritical and overly reliant on references to Minority Report’s precog system. The coverage made predictive policing—the computer systems that attempt to use data to forecast where crime will happen or who will be involved—seem almost magical.

Typically, though, articles glossed over Minority Report’s moral about how such systems can go awry. Even Slate wasn’t immune, running a piece in 2011 called “Time Cops” that said, when it came to these systems, “Civil libertarians can rest easy.”

This soothsaying language extended beyond just media outlets. According to former New York City Police Commissioner William Bratton, predictive policing is the “wave of the future.” Microsoft agrees. One vendor even markets its system as “better than a crystal ball.” More recent coverage has rightfully been more balanced, skeptical, and critical. But many still seem to miss an important point: When it comes to predictive policing, what matters most isn’t the future—it’s the past.

Some predictive policing systems incorporate information like the weather, a location’s proximity to a liquor store, or even commercial data brokerage information. But at their core, they rely either mostly or entirely on historical crime data held by the police. Typically, these are records of reported crimes—911 calls or “calls for service”—and other crimes the police detect. Software automatically looks for historical patterns in the data, and uses those patterns to make its forecasts—a process known as machine learning.

Intuitively, it makes sense that predictive policing systems would base their forecasts on historical crime data. But historical crime data has limits. Criminologists have long emphasized that crime reports—and other statistics gathered by the police—do not necessarily offer an accurate picture of crime in a community. The Department of Justice’s National Crime Victimization Survey estimates that from 2006 to 2010, 52 percent of violent crime went unreported to police, as did 60 percent of household property crime. Essentially: Historical crime data is a direct record of how law enforcement responds to particular crimes, rather than the true rate of crime. Rather than predicting actual criminal activity, then, the current systems are probably better at predicting future police enforcement.

Koepke goes on to cover other potential issues with ‘predictive policing’ in this thoughtful piece. He also co-authored an August 2016 report, Stuck in a Pattern: Early evidence on “predictive” policing and civil rights.

There seems to be increasing attention on machine learning and bias, as noted in my May 24, 2017 posting, where I provide links to other FrogHeart postings on the topic. There’s also this Feb. 28, 2017 posting about a new regional big data sharing project, the Cascadia Urban Analytics Cooperative, where I mention Cathy O’Neil (author of the book Weapons of Math Destruction) and her critique in a subsection titled ‘Algorithms and big data’.

I would like to see some oversight and some discussion in Canada about this brave new world of big data.

One final comment, it is possible to get access to the Vancouver Police Department’s data through the City of Vancouver’s Open Data Catalogue (home page).

A jellyfish chat on November 28, 2017 at the Café Scientifique Vancouver get-together

Café Scientifique Vancouver sent me an announcement (via email) about their upcoming event,

We are pleased to announce our next café which will happen on TUESDAY, NOVEMBER 28TH at 7:30PM in the back room of YAGGER'S DOWNTOWN (433 W Pender).

JELLYFISH – FRIEND, FOE, OR FOOD?

Did you know that in addition to stinging swimmers, jellyfish also cause extensive damage to fisheries and coastal power plants? As threats such as overfishing, pollution, and climate change alter the marine environment, recent media reports are proclaiming that jellyfish are taking over the oceans. Should we hail to our new jellyfish overlords or do we need to examine the evidence behind these claims? Join Café Scientifique on Nov. 28, 2017 to learn everything you ever wanted to know about jellyfish, and find out if jelly burgers are coming soon to a menu near you.

Our speaker for the evening will be DR. LUCAS BROTZ, a Postdoctoral Research Fellow with the Sea Around Us at UBC’s Institute for the Oceans and Fisheries. Lucas has been studying jellyfish for more than a decade, and has been called “Canada’s foremost jellyfish researcher” by CBC Nature of Things host Dr. David Suzuki. Lucas has participated in numerous international scientific collaborations, and his research has been featured in more than 100 media outlets including Nature News, The Washington Post, and The New York Times. He recently received the Michael A. Bigg award for highly significant student research as part of the Coastal Ocean Awards at the Vancouver Aquarium.

We hope to see you there!

You can find out more about Lucas Brotz here and about Sea Around Us here.

For anyone who’s curious about the jellyfish ‘issue’, there’s a November 8, 2017 Norwegian University of Science and Technology press release on AlphaGallileo or on EurekAlert, which provides insight into the problems and the possibilities,

Jellyfish could be a resource in producing microplastic filters, fertilizer or fish feed. A new 6 million euro project called GoJelly, funded by the EU and coordinated by the GEOMAR Helmholtz Centre for Ocean Research, Germany, and including partners at the Norwegian University of Science and Technology (NTNU) and SINTEF [headquartered in Trondheim, Norway, it is the largest independent research organisation in Scandinavia; more about SINTEF in its Wikipedia entry], hopes to turn jellyfish from a nuisance into a useful product.

Global climate change and the human impact on marine ecosystems have led to dramatic decreases in the number of fish in the ocean. They have also had an unforeseen side effect: because overfishing decreases the numbers of jellyfish competitors, jellyfish blooms are on the rise.

The GoJelly project, coordinated by the GEOMAR Helmholtz Centre for Ocean Research, Germany, would like to transform problematic jellyfish into a resource that can be used to produce microplastic filters, fertilizer or fish feed. The EU has just approved funding of EUR 6 million over four years to support the project through its Horizon 2020 programme.

Rising water temperatures, ocean acidification and overfishing seem to favour jellyfish blooms. More and more often, they appear in huge numbers that have already destroyed entire fish farms on European coasts and blocked cooling systems of power stations near the coast. A number of jellyfish species are poisonous, while some tropical species are even among the most toxic animals on earth.

“In Europe alone, the imported American comb jelly has a biomass of one billion tons. While we tend to ignore the jellyfish, there must be other solutions,” says Jamileh Javidpour of GEOMAR, initiator and coordinator of the GoJelly project, which is a consortium of 15 scientific institutions from eight countries led by the GEOMAR Helmholtz Centre for Ocean Research in Kiel.

The project will first entail exploring the life cycle of a number of jellyfish species. A lack of knowledge about life cycles makes it almost impossible to predict when and why a large jellyfish bloom will occur. “This is what we want to change so that large jellyfish swarms can be caught before they reach the coasts,” says Javidpour.

At the same time, the project partners will also try to answer the question of what to do with jellyfish once they have been caught. One idea is to use the jellyfish to battle another, man-made threat.

“Studies have shown that mucus of jellyfish can bind microplastic. Therefore, we want to test whether biofilters can be produced from jellyfish. These biofilters could then be used in sewage treatment plants or in factories where microplastic is produced,” the GoJelly researchers say.

Jellyfish can also be used as fertilizers for agriculture or as aquaculture feed. “Fish in fish farms are currently fed with captured wild fish, which does not reduce the problem of overfishing, but increases it. Jellyfish as feed would be much more sustainable and would protect natural fish stocks,” says the GoJelly team.

Another option is using jellyfish as food for humans. “In some cultures, jellyfish are already on the menu. As long as the end product is no longer slimy, it could also gain greater general acceptance,” said Javidpour. Last but not least, jellyfish contain collagen, a substance very much sought after in the cosmetics industry.

Project partners from the Norwegian University of Science and Technology, led by Nicole Aberle-Malzahn, and SINTEF Ocean, led by Rachel Tiller, will analyse how abiotic (hydrography, temperature), biotic (abundance, biomass, ecology, reproduction) and biochemical parameters (stoichiometry, food quality) affect the initiation of jellyfish blooms.

Based on a comprehensive analysis of triggering mechanisms, origin of seed populations and ecological modelling, the researchers hope to be able to make more reliable predictions on jellyfish bloom formation of specific taxa in the GoJelly target areas. This knowledge will allow sustainable harvesting of jellyfish communities from various Northern and Southern European populations.

This harvest will provide a marine biomass of unknown potential, which researchers at SINTEF Ocean, among others, will explore for possible ways to use the material.

A team from SINTEF Ocean’s strategic program Clean Ocean will also work with European colleagues on developing a filter from the mucus of the jellyfish that will catch microplastics from household products (from fleece sweaters, the breakdown of plastic products, or cosmetics, for example) and prevent these from entering the marine ecosystem.

Finally, SINTEF Ocean will examine the socio-ecological system and use games to explore the potential of an emerging international management regime for a global effort to mitigate the negative effects of microplastics in the oceans.

“Jellyfish can be used for many purposes. We see this as an opportunity to use the potential of the huge biomass drifting right in front of our front door,” Javidpour said.

You can find out more about GoJelly on their Twitter account.

Could CRISPR (clustered regularly interspaced short palindromic repeats) be weaponized?

On the occasion of an American team’s recent publication of research in which they edited the germline (embryos), I produced a three-part series about CRISPR (clustered regularly interspaced short palindromic repeats), sometimes referred to as CRISPR/Cas9 (links offered at the end of this post).

Somewhere in my series, there’s a quote about how CRISPR could be used as a ‘weapon of mass destruction’, and it seems this has been a hot topic for the last year or so, as James Revill, research fellow at the University of Sussex, notes in his August 31, 2017 essay on theconversation.com (h/t phys.org August 31, 2017 news item), Note: Links have been removed,

The gene editing technique CRISPR has been in the limelight after scientists reported they had used it to safely remove disease in human embryos for the first time. This follows a “CRISPR craze” over the last couple of years, with the number of academic publications on the topic growing steadily.

There are good reasons for the widespread attention to CRISPR. The technique allows scientists to “cut and paste” DNA more easily than in the past. It is being applied to a number of different peaceful areas, ranging from cancer therapies to the control of disease carrying insects.

Some of these applications – such as the engineering of mosquitoes to resist the parasite that causes malaria – effectively involve tinkering with ecosystems. CRISPR has therefore generated a number of ethical and safety concerns. Some also worry that applications being explored by defence organisations that involve “responsible innovation in gene editing” may send worrying signals to other states.

Concerns are also mounting that gene editing could be used in the development of biological weapons. In 2016, Bill Gates remarked that “the next epidemic could originate on the computer screen of a terrorist intent on using genetic engineering to create a synthetic version of the smallpox virus”. More recently, in July 2017, John Sotos, of Intel Health & Life Sciences, stated that gene editing research could “open up the potential for bioweapons of unimaginable destructive potential”.

An annual worldwide threat assessment report of the US intelligence community in February 2016 argued that the broad availability and low cost of the basic ingredients of technologies like CRISPR make it particularly concerning.

A Feb. 11, 2016 news item on sciencemagazine.org offers a précis of some of the reactions while a February 9, 2016 article by Antonio Regalado for the Massachusetts Institute of Technology’s MIT Technology Review delves into the matter more deeply,

Genome editing is a weapon of mass destruction.

That’s according to James Clapper, [former] U.S. director of national intelligence, who on Tuesday, in the annual worldwide threat assessment report of the U.S. intelligence community, added gene editing to a list of threats posed by “weapons of mass destruction and proliferation.”

Gene editing refers to several novel ways to alter the DNA inside living cells. The most popular method, CRISPR, has been revolutionizing scientific research, leading to novel animals and crops, and is likely to power a new generation of gene treatments for serious diseases (see “Everything You Need to Know About CRISPR’s Monster Year”).

It is gene editing’s relative ease of use that worries the U.S. intelligence community, according to the assessment. “Given the broad distribution, low cost, and accelerated pace of development of this dual-use technology, its deliberate or unintentional misuse might lead to far-reaching economic and national security implications,” the report said.

The choice by the U.S. spy chief to call out gene editing as a potential weapon of mass destruction, or WMD, surprised some experts. It was the only biotechnology appearing in a tally of six more conventional threats, like North Korea’s suspected nuclear detonation on January 6 [2016], Syria’s undeclared chemical weapons, and new Russian cruise missiles that might violate an international treaty.

The report is an unclassified version of the “collective insights” of the Central Intelligence Agency, the National Security Agency, and half a dozen other U.S. spy and fact-gathering operations.

Although the report doesn’t mention CRISPR by name, Clapper clearly had the newest and the most versatile of the gene-editing systems in mind. The CRISPR technique’s low cost and relative ease of use—the basic ingredients can be bought online for $60—seems to have spooked intelligence agencies.

….

However, one has to be careful with the hype surrounding new technologies and, at present, the security implications of CRISPR are probably modest. There are easier, cruder methods of creating terror. CRISPR would only get aspiring biological terrorists so far. Other steps, such as growing and disseminating biological weapons agents, would typically be required for it to become an effective weapon. This would require additional skills and places CRISPR-based biological weapons beyond the reach of most terrorist groups. At least for the time being.

A July 5, 2016 opinion piece by Malcolm Dando for Nature argues for greater safeguards,

In Geneva next month [August 2016], officials will discuss updates to the global treaty that outlaws the use of biological weapons. The 1972 Biological Weapons Convention (BWC) was the first agreement to ban an entire class of weapons, and it remains a crucial instrument to stop scientific research on viruses, bacteria and toxins from being diverted into military programmes.

The BWC is the best route to ensure that nations take the biological-weapons threat seriously. Most countries have struggled to develop and introduce strong and effective national programmes — witness the difficulty the United States had in agreeing what oversight system should be applied to gain-of-function experiments that created more-dangerous lab-grown versions of common pathogens.

As scientific work advances — the CRISPR gene-editing system has been flagged as the latest example of possible dual-use technology — this treaty needs to be regularly updated. This is especially important because it has no formal verification system. Proposals for declarations, monitoring visits and inspections were vetoed by the United States in 2001, on the grounds that such verification threatened national security and confidential business information.

Even so, issues such as the possible dual-use threat from gene-editing systems will not be easily resolved. But we have to try. Without the involvement of the BWC, codes of conduct and oversight systems set up at national level are unlikely to be effective. The stakes are high, and after years of fumbling, we need strong international action to monitor and assess the threats from the new age of biological techniques.

Revill notes the latest BWC agreement and suggests future directions,

This convention is imperfect and lacks a way to ensure that states are compliant. Moreover, it has not been adequately “tended to” by its member states recently, with the last major meeting unable to agree a further programme of work. Yet it remains the cornerstone of an international regime against the hostile use of biology. All 178 state parties declared in December of 2016 their continued determination “to exclude completely the possibility of the use of (biological) weapons, and their conviction that such use would be repugnant to the conscience of humankind”.

These states therefore need to address the hostile potential of CRISPR. Moreover, they need to do so collectively. Unilateral national measures, such as reasonable biological security procedures, are important. However, preventing the hostile exploitation of CRISPR is not something that can be achieved by any single state acting alone.

As such, when states party to the convention meet later this year, it will be important to agree to a more systematic and regular review of science and technology. Such reviews can help with identifying and managing the security risks of technologies such as CRISPR, as well as allowing an international exchange of information on some of the potential benefits of such technologies.

Most states supported the principle of enhanced reviews of science and technology under the convention at the last major meeting. But they now need to seize the opportunity and agree on the practicalities of such reviews in order to prevent the convention being left behind by developments in science and technology.

Experts (military, intelligence, medical, etc.) are not the only ones concerned about CRISPR according to a February 11, 2016 article by Sharon Begley for statnews.com (Note: A link has been removed),

Most Americans oppose using powerful new technology to alter the genes of unborn babies, according to a new poll — even to prevent serious inherited diseases.

They expressed the strongest disapproval for editing genes to create “designer babies” with enhanced intelligence or looks.

But the poll, conducted by STAT and Harvard T.H. Chan School of Public Health, found that people have mixed, and apparently not firm, views on emerging genetic techniques. US adults are almost evenly split on whether the federal government should fund research on editing genes before birth to keep children from developing diseases such as cystic fibrosis or Huntington’s disease.

“They’re not against scientists trying to improve [genome-editing] technologies,” said Robert Blendon, professor of health policy and political analysis at Harvard’s Chan School, perhaps because they recognize that one day there might be a compelling reason to use such technologies. An unexpected event, such as scientists “eliminating a terrible disease” that a child would have otherwise inherited, “could change people’s views in the years ahead,” Blendon said.

But for now, he added, “people are concerned about editing the genes of those who are yet unborn.”

A majority, however, wants government regulators to approve gene therapy to treat diseases in children and adults.

The STAT-Harvard poll comes as scientists and policy makers confront the ethical, social, and legal implications of these revolutionary tools for changing DNA. Thanks to a technique called CRISPR-Cas9, scientists can easily, and with increasing precision, modify genes through the genetic analog of a computer’s “find and replace” function.
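That “find and replace” description is, of course, just a computing metaphor, but it is an easy one to picture in code. Here is a toy Python sketch of the metaphor only; the sequences and the ‘guide’ are invented, and real CRISPR-Cas9 is a guided biochemical cut-and-repair process, not string manipulation:

```python
# Toy sketch of the "find and replace" metaphor only. The sequences and the
# "guide" below are invented; real CRISPR-Cas9 is a guided biochemical
# cut-and-repair process acting on chromosomes, not string manipulation.
def edit_sequence(genome: str, guide: str, replacement: str) -> str:
    """Replace the first site matched by `guide` with `replacement`."""
    site = genome.find(guide)
    if site == -1:
        return genome  # no match: nothing for the 'guide' to target
    return genome[:site] + replacement + genome[site + len(guide):]

# A made-up 'disease variant' is swapped for a made-up 'healthy' sequence.
genome = "ATGGCCTTAGGCTTACCGA"
print(edit_sequence(genome, guide="TTAGGC", replacement="TTAGCC"))
# -> ATGGCCTTAGCCTTACCGA
```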

I find it surprising that there’s resistance to removing diseases found in the germline (embryos). When public consultations on nanotechnology were being held, the one area where people tended to be quite open to research was health and medicine. Where food was concerned, however, people had far more concerns.

If you’re interested in the STAT-Harvard poll, you can find it here. As for James Revill, he has written a more substantive version of this essay as a paper, which is available here.

On a semi-related note, I found STAT (statnews.com) to be an interesting and accessibly written online health science publication. Here’s more from the About Us page (Note: A link has been removed),

What’s STAT all about?
STAT is a national publication focused on finding and telling compelling stories about health, medicine, and scientific discovery. We produce daily news, investigative articles, and narrative projects in addition to multimedia features. We tell our stories from the places that matter to our readers — research labs, hospitals, executive suites, and political campaigns.

Why did you call it STAT?
In medical parlance, “stat” means important and urgent, and that’s what we’re all about — quickly and smartly delivering good stories. Read more about the origins of our name here.

Who’s behind the new publication?
STAT is produced by Boston Globe Media. Our headquarters is located in Boston but we have bureaus in Washington, New York, Cleveland, Atlanta, San Francisco, and Los Angeles. It was started by John Henry, the owner of Boston Globe Media and the principal owner of the Boston Red Sox. Rick Berke is executive editor.

So is STAT part of The Boston Globe?
They’re distinct properties but the two share content and complement one another.

Is it free?
Much of STAT is free. We also offer STAT Plus, a premium subscription plan that includes exclusive reporting about the pharmaceutical and biotech industries as well as other benefits. Learn more about it here.

Who’s working for STAT?
Some of the best-sourced science, health, and biotech journalists in the country, as well as motion graphics artists and data visualization specialists. Our team includes talented writers, editors, and producers capable of the kind of explanatory journalism that complicated science issues sometimes demand.

Who’s your audience?
You. Even if you don’t work in science, have never stepped foot in a hospital, or hated high school biology, we’ve got something for you. And for the lab scientists, health professionals, business leaders, and policy makers, we think you’ll find coverage here that interests you, too. The world of health, science, and medicine is booming and yielding fascinating stories. We explore how they affect us all.

….

As promised, here are the links to my three-part series on CRISPR,

Part 1 opens the series with a basic description of CRISPR and the germline research that occasioned it, along with some of the other (non-weapon) ethical issues and patent disputes arising from this new technology. CRISPR and editing the germline in the US (part 1 of 3): In the beginning

Part 2 covers three critical responses to the reporting, which between them describe the technology in more detail and the possibility of ‘designer babies’. CRISPR and editing the germline in the US (part 2 of 3): ‘designer babies’?

Part 3 is all about public discussion or, rather, the lack of and need for it, according to a couple of social scientists. Informally, there is some discussion via pop culture, as Joelle Renstrom notes, although she is focused on the larger issues touched on by the television series Orphan Black, and as I touch on in my final comments. CRISPR and editing the germline in the US (part 3 of 3): public discussions and pop culture

Finally, I hope to stumble across studies from other countries about how they are responding to the possibilities presented by CRISPR/Cas9 so that I can offer a more global perspective than this largely US one. At the very least, it would be interesting to find out if there are differences.