Manipulating graphene’s conductivity with honey

Honey can be used for many things: to heal wounds, for advice (“You catch more flies with honey”), to clean your hair (see suggestion no. 19 here) and even, according to a Sept. 22, 2017 news item on phys.org, for scientific inspiration,

Dr. Richard Ordonez, a nanomaterials scientist at the Space and Naval Warfare Systems Center Pacific (SSC Pacific), was having stomach pains last year. So begins the story of the accidental discovery that honey—yes, the bee byproduct—is an effective, non-toxic substitute for the manipulation of the current and voltage characteristics of graphene.

The news item was originated by a Sept. 22, 2017 article by Katherine Connor (who works for the US Space and Naval Warfare Systems Center) and placed in cemag.us,

Ordonez’ lab mate and friend Cody Hayashi gave him some store-bought honey as a Christmas gift and anti-inflammatory for his stomach, and Ordonez kept it near his work station for daily use. One day in the lab, the duo was investigating various dielectric materials they could use to fabricate a graphene transistor. First, the team tried to utilize water as a top-gate dielectric to manipulate graphene’s electrical conductivity. This approach was unsuccessful, so they proceeded with various compositions of sugar and deionized water, another electrolyte, which still resulted in negligible performance. That’s when the honey caught Ordonez’ eye, and an accidental scientific breakthrough was realized.

The finding is detailed in a paper in Scientific Reports, in which the team describes how honey produces a nanometer-sized electric double layer at the interface with graphene that can be used to gate the ambipolar transport of graphene.

“As a top-gate dielectric, water is much too conductive, so we moved to sugar and de-ionized water to control the ionic composition in hopes we could reduce conductivity,” Ordonez explains. “However, sugar water didn’t work for us either because, as a gate-dielectric, there was still too much leakage current. Out of frustration, literally inches away from me was the honey Cody had bought, so we decided to drop-cast the honey on graphene to act as top-gate dielectric — I thought maybe the honey would mimic dielectric gels I read about in literature. To our surprise — everyone said it’s not going to work — we tried and it did.”

Image of the liquid-metal graphene field-effect transistor (LM-GFET) and representation of charge distribution in electrolytic gate dielectrics comprised of honey. Image: Space and Naval Warfare Systems Center

 

Ordonez, Hayashi, and a team of researchers from SSC Pacific, in collaboration with the University of Hawai′i at Mānoa, have been developing novel graphene devices as part of a Navy Innovative Science and Engineering (NISE)-funded effort to imbue the Navy with inexpensive, lightweight, flexible graphene-based devices that can be used as next-generation sensors and wearable devices.

“Traditionally, electrolytic gate transistors are made with ionic gel materials,” Hayashi says. “But you must be proficient with the processes to synthesize them, and it can take several months to figure out the correct recipe that is required for these gels to function in the environment. Some of the liquids are toxic, so experimentation must be conducted in an atmospheric-controlled environment. Honey is completely different — it performs similarly to these much more sophisticated materials, but is safe, inexpensive, and easier to use. The honey was an intermediate step towards using ionic gels, and possibly a replacement for certain applications.”

Ordonez and Hayashi envision the honey-based version of graphene products being used for rapid prototyping of devices, since the devices can be created quickly and easily redesigned based on results. Instead of having to spend months developing the materials before even beginning to incorporate it into devices, using honey allows the team to get initial tests underway without waiting for costly fabrication equipment.

Ordonez also sees a use for such products in science, technology, engineering, and math (STEM) outreach efforts, since the honey is non-toxic and could be used to teach students about graphene.

This latest innovation and publication was a follow-on from the group’s discovery last year that liquid metals can be used in place of rigid electrodes such as gold and silver to electrically contact graphene. This, coupled with research on graphene and multi-spectral detection, earned them the Federal Laboratory Consortium Far West Regional Award in the category of Outstanding Technology Development.

SSC Pacific is the naval research and development lab responsible for ensuring Information Warfare superiority for warfighters, including the areas of cyber, command and control, intelligence, surveillance and reconnaissance, and space systems.

Here’s a link to and a citation for the paper,

Rapid Fabrication of Graphene Field-Effect Transistors with Liquid-metal Interconnects and Electrolytic Gate Dielectric Made of Honey by Richard C. Ordonez, Cody K. Hayashi, Carlos M. Torres, Jordan L. Melcher, Nackieb Kamin, Godwin Severa, & David Garmire. Scientific Reports 7, Article number: 10171 (2017) doi:10.1038/s41598-017-10043-4 Published online: 31 August 2017

This paper is open access.

Calligraphy ink and cancer treatment

Courtesy of ACS Omega and the researchers

Nice illustration! I wish I could credit the artist. For anyone who needs a little text to make sense of it, there’s a Sept. 27, 2017 news item on Nanowerk (Note: A link has been removed),

For hundreds of years, Chinese calligraphers have used a plant-based ink to create beautiful messages and art. Now, one group reports in ACS Omega (“New Application of Old Material: Chinese Traditional Ink for Photothermal Therapy of Metastatic Lymph Nodes”) that this ink could noninvasively and effectively treat cancer cells that spread, or metastasize, to lymph nodes.

A Sept. 27, 2017 American Chemical Society (ACS) news release, which originated the news item, reveals more about the research,

As cancer cells leave a tumor, they frequently make their way to lymph nodes, which are part of the immune system. In this case, the main treatment option is surgery, but this can result in complications. Photothermal therapy (PTT) is an emerging noninvasive treatment option in which nanomaterials are injected and accumulate in cancer cells. A laser heats up the nanomaterials, and this heat kills the cells. Many of these nanomaterials are expensive, difficult-to-make and toxic. However, a traditional Chinese ink called Hu-Kaiwen ink (Hu-ink) has similar properties to the nanomaterials used in PTT. For example, they are the same color, and are both carbon-based and stable in water. So Wuli Yang and colleagues wanted to see if Hu-ink could be a good alternative material for PTT.

The researchers analyzed Hu-ink and found that it consists of nanoparticles and thin layers of carbon. When Hu-ink was heated with a laser, its temperature rose by 131 degrees Fahrenheit, much higher than current nanomaterials. Under PTT conditions, the Hu-ink killed cancer cells in a laboratory dish, but under normal conditions, the ink was non-toxic. This was also the scenario observed in mice with tumors. The researchers also noted that Hu-ink could act as a probe to locate tumors and metastases because it absorbs near-infrared light, which goes through skin.

Being a little curious about Hu-ink’s similarity to nanomaterials, I looked for more detail in the paper (Note: Links have been removed). From the: Introduction,

Photothermal therapy (PTT) is an emerging tumor treatment strategy, which utilizes hyperthermia generated from absorbed near-infrared (NIR) light energy by photoabsorbing agents to kill tumor cells.(7-13) Different from chemotherapy, surgical treatment, and radiotherapy, PTT is noninvasive and more efficient.(7, 14, 15) In the past decade, PTT with diverse nanomaterials to eliminate cancer metastases lymph nodes has attracted extensive attention by several groups, including our group.(3, 16-20) For instance, Liu and his co-workers developed a treatment method based on PEGylated single-walled carbon nanotubes for PTT of tumor sentinel lymph nodes and achieved remarkably improved treatment effect in an animal tumor model.(21) To meet the clinical practice, the potential metastasis of deeper lymph nodes was further ablated in our previous work, using magnetic graphene oxide as a theranostic agent.(22) However, preparation of these artificial nanomaterials usually requires high cost, complicated synthetic process, and unavoidably toxic catalyst or chemicals,(23, 24) which impede their future clinical application. For the clinical application, exploring an environment-friendly material with simple preparation procedure, good biocompatibility, and excellent therapeutic efficiency is still highly desired. [emphases mine]

From the: Preparation and Characterization of Hu-Ink

To obtain an applicable sample, the condensed Hu-ink was first diluted into aqueous dispersion with a lower concentration. The obtained Hu-ink dispersion without any further treatment was black in color and stable in physiological environment, including water, phosphate-buffered saline (PBS), and Roswell Park Memorial Institute (RPMI) 1640; furthermore, no aggregation was observed even after keeping undisturbed for 3 days (Figure 2a). The nanoscaled morphology of Hu-ink was examined by transmission electron microscopy (TEM) (Figure 2b), which demonstrates that Hu-ink mainly exist in the form of small aggregates. These small aggregates consist of a few nanoparticles with diameter of about 20–50 nm. Dynamic light scattering (DLS) measurement (Figure 2c) further shows that Hu-ink aqueous dispersion possesses a hydrodynamic diameter of about 186 nm (polydispersity index: 0.18), which was a crucial prerequisite for biomedical applications.(29) In the X-ray diffraction (XRD) pattern, no other characteristic peaks are found except carbon peak (Figure S1, Supporting Information), which confirms that the main component of Hu-ink is carbon.(25) Raman spectroscopy was a common tool to characterize graphene-related materials.(30) D band (∼1300 cm–1, corresponding to the defects) and G band (∼1600 cm–1, related to the sp2 carbon sites) peaks could be observed in Figure 2d with the ratio ID/IG = 0.96, which confirms the existence of graphene sheetlike structure in Hu-ink.(31) The UV–vis–NIR spectra (Figure 2e) also revealed that Hu-ink has high absorption in the NIR region around 650–900 nm, in which hemoglobin and water, the major absorbers of biological tissue, have their lowest absorption coefficient.(32) The high NIR absorption capability of Hu-ink encouraged us to investigate its photothermal properties.(33-35) Hu-ink dispersions with different concentrations were irradiated under an 808 nm laser (the commercial and widely used wavelength in photothermal therapy).(8-13) 
[emphases mine]

Curiosity satisfied! For those who’d like to investigate even further, here’s a link to and a citation for the paper,

New Application of Old Material: Chinese Traditional Ink for Photothermal Therapy of Metastatic Lymph Nodes by Sheng Wang, Yongbin Cao, Qin Zhang, Haibao Peng, Lei Liang, Qingguo Li, Shun Shen, Aimaier Tuerdi, Ye Xu, Sanjun Cai, and Wuli Yang. ACS Omega, 2017, 2 (8), pp 5170–5178 DOI: 10.1021/acsomega.7b00993 Publication Date (Web): August 30, 2017

Copyright © 2017 American Chemical Society

This paper appears to be open access.

Limitless energy and the International Thermonuclear Experimental Reactor (ITER)

Over 30 years in the dreaming, the International Thermonuclear Experimental Reactor (ITER) is now said to be 1/2 way to completing construction. A December 6, 2017 ITER press release (received via email) makes the joyful announcement,

WORLD’S MOST COMPLEX MACHINE IS 50 PERCENT COMPLETED
ITER is proving that fusion is the future source of clean, abundant, safe and economic energy

The International Thermonuclear Experimental Reactor (ITER), a project to prove that fusion power can be produced on a commercial scale and is sustainable, is now 50 percent built to initial operation. Fusion is the same energy source from the Sun that gives the Earth its light and warmth.

ITER will use hydrogen fusion, controlled by superconducting magnets, to produce massive heat energy. In the commercial machines that will follow, this heat will drive turbines to produce electricity with these positive benefits:

* Fusion energy is carbon-free and environmentally sustainable, yet much more powerful than fossil fuels. A pineapple-sized amount of hydrogen offers as much fusion energy as 10,000 tons of coal.

* ITER uses two forms of hydrogen fuel: deuterium, which is easily extracted from seawater; and tritium, which is bred from lithium inside the fusion reactor. The supply of fusion fuel for industry and megacities is abundant, enough for millions of years.

* When the fusion reaction is disrupted, the reactor simply shuts down, safely and without external assistance. Tiny amounts of fuel are used, about 2-3 grams at a time, so there is no physical possibility of a meltdown accident.

* Building and operating a fusion power plant is targeted to be comparable to the cost of a fossil fuel or nuclear fission plant. But unlike today’s nuclear plants, a fusion plant will not have the costs of high-level radioactive waste disposal. And unlike fossil fuel plants, fusion will not have the environmental cost of releasing CO2 and other pollutants.
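The “pineapple-sized amount of hydrogen” comparison in the list above can be sanity-checked with a quick back-of-envelope calculation. This is my own sketch, not from the press release: it assumes deuterium-tritium fuel releasing 17.6 MeV per reaction and coal yielding roughly 29 MJ/kg.

```python
# Back-of-envelope check of the "pineapple vs 10,000 tons of coal" claim.
# Assumed figures (not from the press release): D-T fusion releases
# 17.6 MeV per D + T -> He-4 + n reaction; coal yields about 29 MJ/kg.

MEV_TO_J = 1.602e-13           # joules per MeV
AMU_TO_KG = 1.661e-27          # kilograms per atomic mass unit

energy_per_reaction_j = 17.6 * MEV_TO_J       # one fusion event
fuel_mass_per_reaction_kg = 5.0 * AMU_TO_KG   # deuterium (2 u) + tritium (3 u)

fusion_j_per_kg = energy_per_reaction_j / fuel_mass_per_reaction_kg
coal_j_per_kg = 29e6

coal_mass_kg = 10_000 * 1_000                 # 10,000 metric tons of coal
equivalent_fuel_kg = coal_mass_kg * coal_j_per_kg / fusion_j_per_kg

print(f"Fusion fuel energy density: {fusion_j_per_kg:.2e} J/kg")
print(f"D-T fuel matching 10,000 t of coal: {equivalent_fuel_kg:.2f} kg")
```

The result comes out to roughly a kilogram of fuel, i.e. about a pineapple’s worth of hydrogen isotopes, so the press release’s comparison holds up to within the roughness of the assumptions.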

ITER is the most complex science project in human history. The hydrogen plasma will be heated to 150 million degrees Celsius, ten times hotter than the core of the Sun, to enable the fusion reaction. The process happens in a donut-shaped reactor, called a tokamak(*), which is surrounded by giant magnets that confine and circulate the superheated, ionized plasma, away from the metal walls. The superconducting magnets must be cooled to minus 269°C, as cold as interstellar space.

The ITER facility is being built in Southern France by a scientific partnership of 35 countries. ITER’s specialized components, roughly 10 million parts in total, are being manufactured in industrial facilities all over the world. They are subsequently shipped to the ITER worksite, where they must be assembled, piece-by-piece, into the final machine.

Each of the seven ITER members (the European Union, China, India, Japan, Korea, Russia, and the United States) is fabricating a significant portion of the machine. This adds to ITER’s complexity.

In a message dispatched on December 1 [2017] to top-level officials in ITER member governments, the ITER project reported that it had completed 50 percent of the “total construction work scope through First Plasma” (**). First Plasma, scheduled for December 2025, will be the first stage of operation for ITER as a functional machine.

“The stakes are very high for ITER,” writes Bernard Bigot, Ph.D., Director-General of ITER. “When we prove that fusion is a viable energy source, it will eventually replace burning fossil fuels, which are non-renewable and non-sustainable. Fusion will be complementary with wind, solar, and other renewable energies.

“ITER’s success has demanded extraordinary project management, systems engineering, and almost perfect integration of our work.

“Our design has taken advantage of the best expertise of every member’s scientific and industrial base. No country could do this alone. We are all learning from each other, for the world’s mutual benefit.”

The ITER 50 percent milestone is getting significant attention.

“We are fortunate that ITER and fusion has had the support of world leaders, historically and currently,” says Director-General Bigot. “The concept of the ITER project was conceived at the 1985 Geneva Summit between Ronald Reagan and Mikhail Gorbachev. When the ITER Agreement was signed in 2006, it was strongly supported by leaders such as French President Jacques Chirac, U.S. President George W. Bush, and Indian Prime Minister Manmohan Singh.

“More recently, President Macron and U.S. President Donald Trump exchanged letters about ITER after their meeting this past July. One month earlier, President Xi Jinping of China hosted Russian President Vladimir Putin and other world leaders in a showcase featuring ITER and fusion power at the World EXPO in Astana, Kazakhstan.

“We know that other leaders have been similarly involved behind the scenes. It is clear that each ITER member understands the value and importance of this project.”

Why use this complex manufacturing arrangement?

More than 80 percent of the cost of ITER, about $22 billion or EUR 18 billion, is contributed in the form of components manufactured by the partners. Many of these massive components of the ITER machine must be precisely fitted: for example, 17-meter-high magnets with less than a millimeter of tolerance. Each component must be ready on time to fit into the Master Schedule for machine assembly.

Members asked for this deal for three reasons. First, it means that most of the ITER costs paid by any member are actually paid to that member’s companies; the funding stays in-country. Second, the companies working on ITER build new industrial expertise in major fields, such as electromagnetics, cryogenics, robotics, and materials science. Third, this new expertise leads to innovation and spin-offs in other fields.

For example, expertise gained working on ITER’s superconducting magnets is now being used to map the human brain more precisely than ever before.

The European Union is paying 45 percent of the cost; China, India, Japan, Korea, Russia, and the United States each contribute 9 percent equally. All members share in ITER’s technology; they receive equal access to the intellectual property and innovation that comes from building ITER.

When will commercial fusion plants be ready?

ITER scientists predict that fusion plants will start to come on line as soon as 2040. The exact timing, according to fusion experts, will depend on the level of public urgency and political will that translates to financial investment.

How much power will they provide?

The ITER tokamak will produce 500 megawatts of thermal power. This size is suitable for studying a “burning” or largely self-heating plasma, a state of matter that has never been produced in a controlled environment on Earth. In a burning plasma, most of the plasma heating comes from the fusion reaction itself. Studying the fusion science and technology at ITER’s scale will enable optimization of the plants that follow.

A commercial fusion plant will be designed with a slightly larger plasma chamber, for 10-15 times more electrical power. A 2,000-megawatt fusion electricity plant, for example, would supply 2 million homes.
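The sizing claim above implies an average household load, which is easy to check. A quick sketch (the per-home figures below follow from the press release’s own numbers; whether 1 kW average is realistic for a given country is a separate question):

```python
# Sanity check of "a 2,000-megawatt fusion electricity plant would
# supply 2 million homes": what average load per home does that imply?
plant_output_mw = 2000
homes_supplied = 2_000_000

avg_power_per_home_w = plant_output_mw * 1e6 / homes_supplied
annual_energy_per_home_kwh = avg_power_per_home_w / 1000 * 8760  # hours/year

print(f"Average power per home: {avg_power_per_home_w:.0f} W")
print(f"Implied annual consumption: {annual_energy_per_home_kwh:,.0f} kWh")
```

That works out to a continuous 1 kW per home, or about 8,760 kWh per year, which is in the right neighbourhood for a typical North American household.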

How much would a fusion plant cost and how many will be needed?

The initial capital cost of a 2,000-megawatt fusion plant will be in the range of $10 billion. These capital costs will be offset by extremely low operating costs, negligible fuel costs, and infrequent component replacement costs over the 60-year-plus life of the plant. Capital costs will decrease with large-scale deployment of fusion plants.

At current electricity usage rates, one fusion plant would be more than enough to power a city the size of Washington, D.C. The entire D.C. metropolitan area could be powered with four fusion plants, with zero carbon emissions.

“If fusion power becomes universal, the use of electricity could be expanded greatly, to reduce the greenhouse gas emissions from transportation, buildings and industry,” predicts Dr. Bigot. “Providing clean, abundant, safe, economic energy will be a miracle for our planet.”

*     *     *

FOOTNOTES:

* “Tokamak” is a word of Russian origin meaning a toroidal or donut-shaped magnetic chamber. Tokamaks have been built and operated for the past six decades. They are today’s most advanced fusion device design.

** “Total construction work scope,” as used in ITER’s project performance metrics, includes design, component manufacturing, building construction, shipping and delivery, assembly, and installation.

It is an extraordinary project on many levels as Henry Fountain notes in a March 27, 2017 article for the New York Times (Note: Links have been removed),

At a dusty construction site here amid the limestone ridges of Provence, workers scurry around immense slabs of concrete arranged in a ring like a modern-day Stonehenge.

It looks like the beginnings of a large commercial power plant, but it is not. The project, called ITER, is an enormous, and enormously complex and costly, physics experiment. But if it succeeds, it could determine the power plants of the future and make an invaluable contribution to reducing planet-warming emissions.

ITER, short for International Thermonuclear Experimental Reactor (and pronounced EAT-er), is being built to test a long-held dream: that nuclear fusion, the atomic reaction that takes place in the sun and in hydrogen bombs, can be controlled to generate power.

ITER will produce heat, not electricity. But if it works — if it produces more energy than it consumes, which smaller fusion experiments so far have not been able to do — it could lead to plants that generate electricity without the climate-affecting carbon emissions of fossil-fuel plants or most of the hazards of existing nuclear reactors that split atoms rather than join them.

Success, however, has always seemed just a few decades away for ITER. The project has progressed in fits and starts for years, plagued by design and management problems that have led to long delays and ballooning costs.

ITER is moving ahead now, with a director-general, Bernard Bigot, who took over two years ago after an independent analysis that was highly critical of the project. Dr. Bigot, who previously ran France’s atomic energy agency, has earned high marks for resolving management problems and developing a realistic schedule based more on physics and engineering and less on politics.

The site here is now studded with tower cranes as crews work on the concrete structures that will support and surround the heart of the experiment, a doughnut-shaped chamber called a tokamak. This is where the fusion reactions will take place, within a plasma, a roiling cloud of ionized atoms so hot that it can be contained only by extremely strong magnetic fields.

Here’s a rendering of the proposed reactor,

Source: ITER Organization

It seems the folks at the New York Times decided to remove the notes which help make sense of this image. However, it does get the idea across.

If I read the article rightly, the official cost in March 2017 was around 22 billion euros, and more will likely be needed. You can read Fountain’s article for more information about fusion and ITER or go to the ITER website.

I could have sworn a local (Vancouver area) company called General Fusion was involved in the ITER project but I can’t track down any sources for confirmation. The sole connection I could find is in a documentary about fusion technology,

Here’s a little context for the film from a July 4, 2017 General Fusion news release (Note: A link has been removed),

A new documentary featuring General Fusion has captured the exciting progress in fusion across the public and private sectors.

Let There Be Light made its international premiere at the South By Southwest (SXSW) music and film festival in March [2017] to critical acclaim. The film was quickly purchased by Amazon Video, where it will be available for more than 70 million users to stream.

Let There Be Light follows scientists at General Fusion, ITER and Lawrenceville Plasma Physics in their pursuit of a clean, safe and abundant source of energy to power the world.

The feature length documentary has screened internationally across Europe and North America. Most recently it was shown at the Hot Docs film festival in Toronto, where General Fusion founder and Chief Scientist Dr. Michel Laberge joined fellow fusion physicist Dr. Mark Henderson from ITER at a series of Q&A panels with the filmmakers.

Laberge and Henderson were also interviewed by the popular CBC radio science show Quirks and Quarks, discussing different approaches to fusion, its potential benefits, and the challenges it faces.

It is yet to be confirmed when the film will be released for streaming; check Amazon Video for details.

You can find out more about General Fusion here.

Brief final comment

ITER is a breathtaking effort, but if you’ve read about other large-scale projects such as building a railway across the Canadian Rocky Mountains, establishing telecommunications in an astonishing number of countries around the world, getting someone to the moon, eliminating smallpox, building the pyramids, etc., this mix of setbacks and breakthroughs seems standard operating procedure, both for the successes I’ve described and for the failures we’ve forgotten. Where ITER will finally rest on the continuum between success and failure is yet to be determined, but the problems experienced so far are not necessarily a predictor.

I wish the engineers, scientists, visionaries, and others great success with finding better ways to produce energy.

Europe’s cathedrals get a ‘lift’ with nanoparticles

That headline is a teensy bit laboured but I couldn’t resist the levels of wordplay available to me. They’re working on a cathedral close to the Leaning Tower of Pisa in this video about the latest in stone preservation in Europe.

I have covered the topic of preserving stone monuments before (most recently in my Oct. 21, 2014 posting). The action in this field seems to be taking place mostly in Europe, specifically Italy, although other countries are also quite involved.

Finally, getting to the European Commission’s latest stone monument preservation project, Nano-Cathedral, a Sept. 26, 2017 news item on Nanowerk announces the latest developments,

Just a few meters from Pisa’s famous Leaning Tower, restorers are defying scorching temperatures to bring back shine to the city’s Cathedral.

Ordinary restoration techniques like lasers are being used on much of the stonework that dates back to the 11th century. But a brand new technique is also being used: a new material made of innovative nanoparticles. The aim is to consolidate the inner structure of the stones. It’s being applied mainly on marble.

A March 7, 2017 item on the Euro News website, which originated the Nanowerk news item, provides more detail,

“Marble has very low porosity, which means we have to use nanometric particles in order to go deep inside the stone, to ensure that the treatment is both efficient while still allowing the stone to breathe,” explains Roberto Cela, civil engineer at Opera Della Primaziale Pisana.

The material developed by the European research team includes calcium carbonate, which is a mix of calcium oxide, water and carbon dioxide.

The nano-particles penetrate the stone cementing its decaying structure.

“It is important that these particles have the same chemical nature as the stones that are being treated, so that the physical and mechanical processes that occur over time don’t lead to the break-up of the stones,” says Dario Paolucci, chemist at the University of Pisa.

Vienna’s St Stephen’s is another of the five cathedrals where the new restoration materials are being tested.

The first challenge for researchers is to determine the mechanical characteristics of the cathedral’s stones. Since there are few original samples to work on, they had to figure out a way of “ageing” samples of stones of similar nature to those originally used.

“We tried different things: we tried freeze storage, we tried salts and acids, and we decided to go for thermal ageing,” explains Matea Ban, material scientist at the University of Technology in Vienna. “So what happens is that we heat the stone at certain temperatures. Minerals inside then expand in certain directions, and when they expand they build up stresses to neighbouring minerals and then they crack, and we need those cracks in order to consolidate them.”

Consolidating materials were then applied on a variety of limestones, sandstones and marble – a selection of the different types of stones that were used to build cathedrals around Europe.

What researchers are looking for are very specific properties.

“First of all, the consolidating material has to be well absorbed by the stone,” says petrologist Johannes Weber of the University of Applied Arts in Vienna. “Then, as it evaporates, it has to settle properly within the stone structure. It should not shrink too much. All materials shrink when drying, including consolidating materials. They should adhere to the particles of the stone but shouldn’t completely obstruct its pores.”

Further tests are underway in cathedrals across Europe in the hope of better protecting our invaluable cultural heritage.

There’s a bit more detail about Nano-Cathedral on the Opera della Primaziale Pisana (OPA) website (from their Nano-Cathedral project page),

With the meeting of June 3 this year, the Nano-Cathedral project kicked off, supported by the European Union through Horizon 2020 in the field of nanotechnology applied to cultural heritage, with a fund of about 6.5 million euro.

A total of six monumental buildings will spend three years under the eyes and hands of petrographers, geologists, chemists and restorers from the institutes belonging to the Consortium: five cathedrals have been selected to represent the cultural diversity within Europe from the perspective of developing shared values and transnational identity, plus a contemporary monumental building entirely clad in Carrara marble, the Opera House of Oslo.

Purpose: the testing of nanomaterials for the conservation of marble and the outer surfaces of our ‘cathedrals’.
The field of investigation to check degradation, testing new consolidating and protective products is the Cathedral of Pisa together with the Cathedrals of Cologne, Vienna, Ghent and Vitoria.
For the selection of case studies we have cross-checked requirements for their historical and architectural value but also for the different types of construction materials – marble, limestone and sandstone – as well as the location of the six monumental buildings across European climates.

The Cathedral of Pisa is the most southern, fully positioned in a Mediterranean climate, and therefore subject to degradation very different from that recorded under the weather conditions of the Scandinavian peninsula; all the intermediate climate phases are covered by Ghent, Vitoria, Cologne and Vienna.

At the conclusion of the three-year project, once the analyses in situ and in the laboratory are completed and all the experiments are tested on each identified portion of each monumental building, a detailed intervention protocol will be defined in order to identify the mineralogical and petrographic characteristics of the stone materials and of their degradation, and to assess the causes and mechanisms of the associated alteration, including interactions with environmental pollution. Then we will be able to identify the most appropriate method of restoration and testing of nanotechnology products for the consolidation and protection of the different stone materials.

In 2018 we hope to have new materials to protect and safeguard the ‘skin’ of our historic buildings and monuments for a long time.

Back to my headline and the second piece of wordplay, ‘lift’ as in ‘skin lift’ in that last sentence.

I realize this is a bit off topic but it’s worth taking a look at ORA’s home page,

Gabriele D’Annunzio effectively condenses the wonder and admiration that catch whoever visits the Duomo Square of Pisa.

The Opera della Primaziale Pisana (OPA) is a non-profit organisation which was established in order to oversee the first works for the construction of the monuments in the Piazza del Duomo, subject to its own charter which includes the protection, promotion and enhancement of its heritage, in order to pass the religious and artistic meaning onto future generations.

«L’Ardea roteò nel cielo di Cristo, sul prato dei Miracoli.» (“The heron wheeled in the sky of Christ, over the meadow of the Miracles.”)
Gabriele d’Annunzio in Forse che sì forse che no (1910)

If you go to the home page, you can buy tickets to visit the monuments surrounding the square and there are other notices including one for a competition (it’s too late to apply but the details are interesting) to construct four stained glass windows for the Pisa cathedral.

Liquid circuitry, shape-shifting fluids and more

I’d have to see it to believe it but researchers at the US Dept. of Energy (DOE) Lawrence Berkeley National Laboratory (LBNL) have developed a new kind of ‘bijel’ which would allow for some pretty nifty robotics. From a Sept. 25, 2017 news item on ScienceDaily,

A new two-dimensional film, made of polymers and nanoparticles and developed by researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), can direct two different non-mixing liquids into a variety of exotic architectures. This finding could lead to soft robotics, liquid circuitry, shape-shifting fluids, and a host of new materials that use soft, rather than solid, substances.

The study, reported today in the journal Nature Nanotechnology, presents the newest entry in a class of substances known as bicontinuous jammed emulsion gels, or bijels, which hold promise as a malleable liquid that can support catalytic reactions, electrical conductivity, and energy conversion.

A Sept. 25, 2017 LBNL news release (also on EurekAlert), which originated the news item, expands on the theme,

Bijels are typically made of immiscible, or non-mixing, liquids. People who shake their bottle of vinaigrette before pouring the dressing on their salad are familiar with such liquids. As soon as the shaking stops, the liquids start to separate again, with the lower density liquid – often oil – rising to the top.

Trapping, or jamming, particles where these immiscible liquids meet can prevent the liquids from completely separating, stabilizing the substance into a bijel. What makes bijels remarkable is that, rather than just making the spherical droplets that we normally see when we try to mix oil and water, the particles at the interface shape the liquids into complex networks of interconnected fluid channels.

Bijels are notoriously difficult to make, however, involving exact temperatures at precisely timed stages. In addition, the liquid channels are normally more than 5 micrometers across, making them too large to be useful in energy conversion and catalysis.

“Bijels have long been of interest as next-generation materials for energy applications and chemical synthesis,” said study lead author Caili Huang. “The problem has been making enough of them, and with features of the right size. In this work, we crack that problem.”

Huang started the work as a graduate student with Thomas Russell, the study’s principal investigator, at Berkeley Lab’s Materials Sciences Division, and he continued the project as a postdoctoral researcher at DOE’s Oak Ridge National Laboratory.

Creating a new bijel recipe

The method described in this new study simplifies the bijel process by first using specially coated particles about 10-20 nanometers in diameter. The smaller-sized particles line the liquid interfaces much more quickly than the ones used in traditional bijels, making the smaller channels that are highly valued for applications.

Illustration shows key stages of bijel formation. Clockwise from top left, two non-mixing liquids are shown. Ligands (shown in yellow) with amine groups are dispersed throughout the oil or solvent, and nanoparticles coated with carboxylic acids (shown as blue dots) are scattered in the water. With vigorous shaking, the nanoparticles and ligands form a “supersoap” that gets trapped at the interface of the two liquids. The bottom panel is a magnified view of the jammed nanoparticle supersoap. (Credit: Caili Huang/ORNL)

“We’ve basically taken liquids like oil and water and given them a structure, and it’s a structure that can be changed,” said Russell, a visiting faculty scientist at Berkeley Lab. “If the nanoparticles are responsive to electrical, magnetic, or mechanical stimuli, the bijels can become reconfigurable and re-shaped on demand by an external field.”

The researchers were able to prepare new bijels from a variety of common organic, water-insoluble solvents, such as toluene, in which ligands had been dissolved, and deionized water, which contained the nanoparticles. To ensure thorough mixing of the liquids, they subjected the emulsion to a vortex spinning at 3,200 revolutions per minute.

“This extreme shaking creates a whole bunch of new places where these particles and polymers can meet each other,” said study co-author Joe Forth, a postdoctoral fellow at Berkeley Lab’s Materials Sciences Division. “You’re synthesizing a lot of this material, which is in effect a thin, 2-D coating of the liquid surfaces in the system.”

The liquids remained a bijel even after one week, a sign of the system’s stability.

Russell, who is also a professor of polymer science and engineering at the University of Massachusetts-Amherst, added that these shape-shifting characteristics would be valuable in microreactors, microfluidic devices, and soft actuators.

Nanoparticle supersoap

Nanoparticles had not been seriously considered in bijels before because their small size made them hard to trap in the liquid interface. To resolve that problem, the researchers coated nano-sized particles with carboxylic acids and put them in water. They then took polymers with an added amine group – a derivative of ammonia – and dissolved them in the toluene.

At left is a vial of bijel stabilized with nanoparticle surfactants. On the right is the same vial after a week of inversion, showing that the nanoparticle kept the liquids from moving. (Credit: Caili Huang/ORNL)

This configuration took advantage of the amine group’s affinity to water, a characteristic that is comparable to surfactants, like soap. Their nanoparticle “supersoap” was designed so that the nanoparticles join ligands, forming an octopus-like shape with a polar head and nonpolar legs that get jammed at the interface, the researchers said.

“Bijels are really a new material, and also excitingly weird in that they are kinetically arrested in these unusual configurations,” said study co-author Brett Helms, a staff scientist at Berkeley Lab’s Molecular Foundry. “The discovery that you can make these bijels with simple ingredients is a surprise. We all have access to oils and water and nanocrystals, allowing broad tunability in bijel properties. This platform also allows us to experiment with new ways to control their shape and function since they are both responsive and reconfigurable.”

The nanoparticles were made of silica, but the researchers noted that in previous studies they used graphene and carbon nanotubes to form nanoparticle surfactants.

“The key is that the nanoparticles can be made of many materials,” said Russell.  “The most important thing is what’s on the surface.”

This is an animation of the bijel

3-D rendering of the nanoparticle bijel taken by confocal microscope. (Credit: Caili Huang/ORNL [Oak Ridge National Laboratory] and Joe Forth/Berkeley Lab)

Here’s a link to and a citation for the paper,

Bicontinuous structured liquids with sub-micrometre domains using nanoparticle surfactants by Caili Huang, Joe Forth, Weiyu Wang, Kunlun Hong, Gregory S. Smith, Brett A. Helms & Thomas P. Russell. Nature Nanotechnology (2017) doi:10.1038/nnano.2017.182 Published online 25 September 2017

This paper is behind a paywall.

Of musical parodies, Despacito, and evolution

What great timing! I just found out about a musical science parody featuring evolution and biology, and learned the latest news about the study of evolution on one of the islands in the Galapagos (where Charles Darwin made some of his observations). Thanks to Stacey Johnson, whose November 24, 2017 posting on the Signals blog featured Evo-Devo (Despacito Biology Parody), an A Capella Science music video from Tim Blais,

Now, for the latest regarding the Galapagos and evolution (from a November 24, 2017 news item on ScienceDaily),

The arrival 36 years ago of a strange bird to a remote island in the Galapagos archipelago has provided direct genetic evidence of a novel way in which new species arise.

In this week’s issue of the journal Science, researchers from Princeton University and Uppsala University in Sweden report that the newcomer belonging to one species mated with a member of another species resident on the island, giving rise to a new species that today consists of roughly 30 individuals.

The study comes from work conducted on Darwin’s finches, which live on the Galapagos Islands in the Pacific Ocean. The remote location has enabled researchers to study the evolution of biodiversity due to natural selection.

The direct observation of the origin of this new species occurred during field work carried out over the last four decades by B. Rosemary and Peter Grant, two scientists from Princeton, on the small island of Daphne Major.

A November 23, 2017 Princeton University news release on EurekAlert, which originated the news item, provides more detail,

“The novelty of this study is that we can follow the emergence of new species in the wild,” said B. Rosemary Grant, a senior research biologist, emeritus, and a senior biologist in the Department of Ecology and Evolutionary Biology. “Through our work on Daphne Major, we were able to observe the pairing up of two birds from different species and then follow what happened to see how speciation occurred.”

In 1981, a graduate student working with the Grants on Daphne Major noticed the newcomer, a male that sang an unusual song and was much larger in body and beak size than the three resident species of birds on the island.

“We didn’t see him fly in from over the sea, but we noticed him shortly after he arrived. He was so different from the other birds that we knew he did not hatch from an egg on Daphne Major,” said Peter Grant, the Class of 1877 Professor of Zoology, Emeritus, and a professor of ecology and evolutionary biology, emeritus.

The researchers took a blood sample and released the bird, which later bred with a resident medium ground finch of the species Geospiza fortis, initiating a new lineage. The Grants and their research team followed the new “Big Bird lineage” for six generations, taking blood samples for use in genetic analysis.

In the current study, researchers from Uppsala University analyzed DNA collected from the parent birds and their offspring over the years. The investigators discovered that the original male parent was a large cactus finch of the species Geospiza conirostris from Española island, which is more than 100 kilometers (about 62 miles) to the southeast in the archipelago.

The remarkable distance meant that the male finch was not able to return home to mate with a member of his own species and so chose a mate from among the three species already on Daphne Major. This reproductive isolation is considered a critical step in the development of a new species when two separate species interbreed.

The offspring were also reproductively isolated because their song, which is used to attract mates, was unusual and failed to attract females from the resident species. The offspring also differed from the resident species in beak size and shape, which is a major cue for mate choice. As a result, the offspring mated with members of their own lineage, strengthening the development of the new species.

Researchers previously assumed that the formation of a new species takes a very long time, but in the Big Bird lineage it happened in just two generations, according to observations made by the Grants in the field in combination with the genetic studies.

All 18 species of Darwin’s finches derived from a single ancestral species that colonized the Galápagos about one to two million years ago. The finches have since diversified into different species, and changes in beak shape and size have allowed different species to utilize different food sources on the Galápagos. A critical requirement for speciation to occur through hybridization of two distinct species is that the new lineage must be ecologically competitive — that is, good at competing for food and other resources with the other species — and this has been the case for the Big Bird lineage.

“It is very striking that when we compare the size and shape of the Big Bird beaks with the beak morphologies of the other three species inhabiting Daphne Major, the Big Birds occupy their own niche in the beak morphology space,” said Sangeet Lamichhaney, a postdoctoral fellow at Harvard University and the first author on the study. “Thus, the combination of gene variants contributed from the two interbreeding species in combination with natural selection led to the evolution of a beak morphology that was competitive and unique.”

The definition of a species has traditionally included the inability to produce fully fertile progeny from interbreeding species, as is the case for the horse and the donkey, for example. However, in recent years it has become clear that some closely related species, which normally avoid breeding with each other, do indeed produce offspring that can pass genes to subsequent generations. The authors of the study have previously reported that there has been a considerable amount of gene flow among species of Darwin’s finches over the last several thousands of years.

One of the most striking aspects of this study is that hybridization between two distinct species led to the development of a new lineage that after only two generations behaved as any other species of Darwin’s finches, explained Leif Andersson, a professor at Uppsala University who is also affiliated with the Swedish University of Agricultural Sciences and Texas A&M University. “A naturalist who came to Daphne Major without knowing that this lineage arose very recently would have recognized this lineage as one of the four species on the island. This clearly demonstrates the value of long-running field studies,” he said.

It is likely that new lineages like the Big Birds have originated many times during the evolution of Darwin’s finches, according to the authors. The majority of these lineages have gone extinct but some may have led to the evolution of contemporary species. “We have no indication about the long-term survival of the Big Bird lineage, but it has the potential to become a success, and it provides a beautiful example of one way in which speciation occurs,” said Andersson. “Charles Darwin would have been excited to read this paper.”

Here’s a link to and a citation for the paper,

Rapid hybrid speciation in Darwin’s finches by Sangeet Lamichhaney, Fan Han, Matthew T. Webster, Leif Andersson, B. Rosemary Grant, Peter R. Grant. Science 23 Nov 2017: eaao4593 DOI: 10.1126/science.aao4593

This paper is behind a paywall.

Happy weekend! And for those who love their Despacito, there’s this parody featuring three Italians in a small car (thanks again to Stacey Johnson’s blog posting),

Plastic nanoparticles and brain damage in fish

Researchers in Sweden suggest plastic nanoparticles may cause brain damage in fish according to a Sept. 25, 2017 news item on phys.org,

Calculations have shown that 10 per cent of all plastic produced around the world ultimately ends up in the oceans. As a result, a large majority of global marine debris is in fact plastic waste. Human production of plastics is a well-known environmental concern, but few studies have investigated the effects of tiny plastic particles, known as nanoplastic particles.

“Our study is the first to show that nanosized plastic particles can accumulate in fish brains”, says Tommy Cedervall, a chemistry researcher at Lund University.

A Sept. 25, 2017 Lund University press release, which originated the news item, provides more detail about the research,

The Lund University researchers studied how nanoplastics may be transported through different organisms in the aquatic ecosystem, i.e. via algae and animal plankton to larger fish. Tiny plastic particles in the water are eaten by animal plankton, which in turn are eaten by fish.

According to Cedervall, the study includes several interesting results on how plastic of different sizes affects aquatic organisms. Most importantly, it provides evidence that nanoplastic particles can indeed cross the blood-brain barrier in fish and thus accumulate inside fish’s brain tissue.

In addition, the researchers involved in the present study have demonstrated the occurrence of behavioural disorders in fish that are affected by nanoplastics. They eat slower and explore their surroundings less. The researchers believe that these behavioural changes may be linked to brain damage caused by the presence of nanoplastics in the brain.

Another result of the study is that animal plankton die when exposed to nanosized plastic particles, while larger plastic particles do not affect them. Overall, these different effects of nanoplastics may have an impact on the ecosystem as a whole.

“It is important to study how plastics affect ecosystems and that nanoplastic particles likely have a more dangerous impact on aquatic ecosystems than larger pieces of plastics”, says Tommy Cedervall.

However, he does not dare to draw the conclusion that plastic nanoparticles could accumulate in other tissues in fish and thus potentially be transmitted to humans through consumption.

“No, we are not aware of any such studies and are therefore very cautious about commenting on it”, says Tommy Cedervall.

Here’s a link to and a citation for the paper,

Brain damage and behavioural disorders in fish induced by plastic nanoparticles delivered through the food chain by Karin Mattsson, Elyse V. Johnson, Anders Malmendal, Sara Linse, Lars-Anders Hansson & Tommy Cedervall. Scientific Reports 7, Article number: 11452 (2017) doi:10.1038/s41598-017-10813-0 Published online: 13 September 2017

This paper is open access.

Predictive policing in Vancouver—the first jurisdiction in Canada to employ a machine learning system for property theft reduction

Predictive policing has come to Canada, specifically, Vancouver. A July 22, 2017 article by Matt Meuse for the Canadian Broadcasting Corporation (CBC) news online describes the new policing tool,

The Vancouver Police Department is implementing a city-wide “predictive policing” system that uses machine learning to prevent break-ins by predicting where they will occur before they happen — the first of its kind in Canada.

Police chief Adam Palmer said that, after a six-month pilot project in 2016, the system is now accessible to all officers via their cruisers’ onboard computers, covering the entire city.

“Instead of officers just patrolling randomly throughout the neighbourhood, this will give them targeted areas it makes more sense to patrol in because there’s a higher likelihood of crime to occur,” Palmer said.

 

Things got off to a slow start as the system familiarized itself [during a 2016 pilot project] with the data, and floundered in the fall due to unexpected data corruption.

But Special Const. Ryan Prox said the system reduced property crime by as much as 27 per cent in areas where it was tested, compared to the previous four years.

The accuracy of the system was also tested by having it generate predictions for a given day, and then watching to see what happened that day without acting on the predictions.

Palmer said the system was getting accuracy rates between 70 and 80 per cent.
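The backtest Palmer describes (generating predictions for a day, withholding them from patrols, then checking what actually happened) amounts to measuring a hit rate. The VPD has not published its methodology, so the following is only a minimal sketch of that kind of evaluation; the function name and grid-cell coordinates are hypothetical.

```python
from typing import List, Set, Tuple

def hit_rate(predicted_zones: Set[Tuple[int, int]],
             actual_incidents: List[Tuple[int, int]]) -> float:
    """Fraction of actual incidents that fell inside a predicted zone."""
    if not actual_incidents:
        return 0.0
    hits = sum(1 for cell in actual_incidents if cell in predicted_zones)
    return hits / len(actual_incidents)

# Predictions for one day (grid-cell coordinates), withheld from patrols.
predicted = {(3, 4), (7, 1), (5, 5)}
# Break-ins actually reported that day.
observed = [(3, 4), (7, 1), (2, 2), (5, 5)]

print(hit_rate(predicted, observed))  # 3 of 4 incidents predicted -> 0.75
```

An accuracy of 70 to 80 per cent would mean roughly three out of four reported break-ins fell inside a flagged zone, as in this toy example.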

When a location is identified by the system, Palmer said officers can be deployed to patrol that location. …

“Quite often … that visible presence will deter people from committing crimes [altogether],” Palmer said.

Though similar systems are used in the United States, Palmer said the system is the first of its kind in Canada, and was developed specifically for the VPD.

While the current focus is on residential break-ins, Palmer said the system could also be tweaked for use with car theft — though likely not with violent crime, which is far less predictable.

Palmer dismissed the inevitable comparison to the 2002 Tom Cruise film Minority Report, in which people are arrested to prevent them from committing crimes in the future.

“We’re not targeting people, we’re targeting locations,” Palmer said. “There’s nothing dark here.”

If you want to get a sense of just how dismissive Chief Palmer was, there’s a July 21, 2017 press conference (run time: approx. 21 mins.) embedded with a media release of the same date. The media release offered these details,

The new model is being implemented after the VPD ran a six-month pilot study in 2016 that contributed to a substantial decrease in residential break-and-enters.

The pilot ran from April 1 to September 30, 2016. The number of residential break-and enters during the test period was compared to the monthly average over the same period for the previous four years (2012 to 2015). The highest drop in property crime – 27 per cent – was measured in June.

The new model provides data in two-hour intervals for locations where residential and commercial break-and-enters are anticipated. The information is for 100-metre and 500-metre zones. Police resources can be dispatched to that area on foot or in patrol cars, to provide a visible presence to deter thieves.
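The media release does not say how the VPD discretizes its data, but forecasting in two-hour intervals over 100-metre zones implies binning each incident report into a grid cell and a time window. Here is a minimal sketch of such binning, assuming projected metre coordinates; the function and constants are illustrative, not the VPD's.

```python
from datetime import datetime

CELL_SIZE_M = 100    # zone edge length, per the release
WINDOW_HOURS = 2     # forecast interval, per the release

def to_bin(easting_m: float, northing_m: float, when: datetime):
    """Map an incident to a (grid cell, time window) bin."""
    cell = (int(easting_m // CELL_SIZE_M), int(northing_m // CELL_SIZE_M))
    window = (when.date(), when.hour // WINDOW_HOURS)
    return cell, window

cell, window = to_bin(491_250.0, 5_458_930.0, datetime(2016, 6, 3, 15, 20))
print(cell, window)  # (4912, 54589) (datetime.date(2016, 6, 3), 7)
```

The 500-metre zones mentioned in the release would simply use a larger cell size over the same records.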

The VPD’s new predictive policing model is built on GEODASH – an advanced machine-learning technology that was implemented by the VPD in 2015. A public version of GEODASH was introduced in December 2015 and is publicly available on vpd.ca. It retroactively plots the location of crimes on a map to provide a general idea of crime trends to the public.

I wish Chief Palmer had been a bit more open to discussion about the implications of ‘predictive policing’. In the US, where these systems have been employed in various jurisdictions, some concern is arising after an almost euphoric initial response, as a Nov. 21, 2016 article by Logan Koepke for slate.com notes (Note: Links have been removed),

When predictive policing systems began rolling out nationwide about five years ago, coverage was often uncritical and overly reliant on references to Minority Report’s precog system. The coverage made predictive policing—the computer systems that attempt to use data to forecast where crime will happen or who will be involved—seem almost magical.

Typically, though, articles glossed over Minority Report’s moral about how such systems can go awry. Even Slate wasn’t immune, running a piece in 2011 called “Time Cops” that said, when it came to these systems, “Civil libertarians can rest easy.”

This soothsaying language extended beyond just media outlets. According to former New York City Police Commissioner William Bratton, predictive policing is the “wave of the future.” Microsoft agrees. One vendor even markets its system as “better than a crystal ball.” More recent coverage has rightfully been more balanced, skeptical, and critical. But many still seem to miss an important point: When it comes to predictive policing, what matters most isn’t the future—it’s the past.

Some predictive policing systems incorporate information like the weather, a location’s proximity to a liquor store, or even commercial data brokerage information. But at their core, they rely either mostly or entirely on historical crime data held by the police. Typically, these are records of reported crimes—911 calls or “calls for service”—and other crimes the police detect. Software automatically looks for historical patterns in the data, and uses those patterns to make its forecasts—a process known as machine learning.

Intuitively, it makes sense that predictive policing systems would base their forecasts on historical crime data. But historical crime data has limits. Criminologists have long emphasized that crime reports—and other statistics gathered by the police—do not necessarily offer an accurate picture of crime in a community. The Department of Justice’s National Crime Victimization Survey estimates that from 2006 to 2010, 52 percent of violent crime went unreported to police, as did 60 percent of household property crime. Essentially: Historical crime data is a direct record of how law enforcement responds to particular crimes, rather than the true rate of crime. Rather than predicting actual criminal activity, then, the current systems are probably better at predicting future police enforcement.
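Koepke's point, that these systems forecast future enforcement from past reports, can be seen even in the simplest possible baseline: rank grid cells by historical report counts and flag the top ones. This toy model is mine, not any vendor's; it just makes explicit that the output can only be as representative as the reported-crime input.

```python
from collections import Counter

# Reported break-ins by grid cell over a historical window.
# These records reflect what was *reported*, not all crime that occurred.
history = [(3, 4), (3, 4), (3, 4), (7, 1), (7, 1), (5, 5), (2, 2)]

def forecast(history, k=2):
    """Naive baseline: the k most-reported cells become tomorrow's hotspots."""
    counts = Counter(history)
    return [cell for cell, _ in counts.most_common(k)]

print(forecast(history))  # [(3, 4), (7, 1)]
```

Real systems add weather, recency weighting, and other features, but if under-reported areas are missing from `history`, no amount of modelling on top of it will surface them.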

Koepke goes on to cover other potential issues with ‘predictive policing’ in this thoughtful piece. He also co-authored an August 2016 report, Stuck in a Pattern: Early evidence on “predictive” policing and civil rights.

There seems to be increasing attention on machine learning and bias, as noted in my May 24, 2017 posting, where I provide links to other FrogHeart postings on the topic, and in this Feb. 28, 2017 posting about a new regional big data sharing project, the Cascadia Urban Analytics Cooperative, where I mention Cathy O’Neil (author of the book Weapons of Math Destruction) and her critique in a subsection titled: Algorithms and big data.

I would like to see some oversight and some discussion in Canada about this brave new world of big data.

One final comment: it is possible to get access to the Vancouver Police Department’s data through the City of Vancouver’s Open Data Catalogue (home page).

A jellyfish chat on November 28, 2017 at Café Scientifique Vancouver get together

Café Scientifique Vancouver sent me an announcement (via email) about their upcoming event,

We are pleased to announce our next café which will happen on TUESDAY, NOVEMBER 28TH at 7:30PM in the back room of YAGGER'S DOWNTOWN (433 W Pender).

JELLYFISH – FRIEND, FOE, OR FOOD?

Did you know that in addition to stinging swimmers, jellyfish also cause extensive damage to fisheries and coastal power plants? As threats such as overfishing, pollution, and climate change alter the marine environment, recent media reports are proclaiming that jellyfish are taking over the oceans. Should we hail our new jellyfish overlords or do we need to examine the evidence behind these claims? Join Café Scientifique on Nov. 28, 2017 to learn everything you ever wanted to know about jellyfish, and find out if jelly burgers are coming soon to a menu near you.

Our speaker for the evening will be DR. LUCAS BROTZ, a Postdoctoral Research Fellow with the Sea Around Us at UBC’s Institute for the Oceans and Fisheries. Lucas has been studying jellyfish for more than a decade, and has been called “Canada’s foremost jellyfish researcher” by CBC Nature of Things host Dr. David Suzuki. Lucas has participated in numerous international scientific collaborations, and his research has been featured in more than 100 media outlets including Nature News, The Washington Post, and The New York Times. He recently received the Michael A. Bigg award for highly significant student research as part of the Coastal Ocean Awards at the Vancouver Aquarium.

We hope to see you there!

You can find out more about Lucas Brotz here and about Sea Around Us here.

For anyone who’s curious about the jellyfish ‘issue’, there’s a November 8, 2017 Norwegian University of Science and Technology press release on AlphaGalileo or on EurekAlert, which provides insight into the problems and the possibilities,

Jellyfish could be a resource in producing microplastic filters, fertilizer or fish feed. A new 6 million euro project called GoJelly, funded by the EU and coordinated by the GEOMAR Helmholtz Centre for Ocean Research, Germany, and including partners at the Norwegian University of Science and Technology (NTNU) and SINTEF [headquartered in Trondheim, Norway, SINTEF is the largest independent research organisation in Scandinavia; more about SINTEF in its Wikipedia entry], hopes to turn jellyfish from a nuisance into a useful product.

Global climate change and the human impact on marine ecosystems has led to dramatic decreases in the number of fish in the ocean. It has also had an unforeseen side effect: because overfishing decreases the numbers of jellyfish competitors, their blooms are on the rise.

The GoJelly project, coordinated by the GEOMAR Helmholtz Centre for Ocean Research, Germany, would like to transform problematic jellyfish into a resource that can be used to produce microplastic filter, fertilizer or fish feed. The EU has just approved funding of EUR 6 million over 4 years to support the project through its Horizon 2020 programme.

Rising water temperatures, ocean acidification and overfishing seem to favour jellyfish blooms. More and more often, they appear in huge numbers that have already destroyed entire fish farms on European coasts and blocked cooling systems of power stations near the coast. A number of jellyfish species are poisonous, while some tropical species are even among the most toxic animals on earth.

“In Europe alone, the imported American comb jelly has a biomass of one billion tons. While we tend to ignore the jellyfish, there must be other solutions,” says Jamileh Javidpour of GEOMAR, initiator and coordinator of the GoJelly project, which is a consortium of 15 scientific institutions from eight countries led by the GEOMAR Helmholtz Centre for Ocean Research in Kiel.

The project will first entail exploring the life cycle of a number of jellyfish species. A lack of knowledge about life cycles makes it almost impossible to predict when and why a large jellyfish bloom will occur. “This is what we want to change so that large jellyfish swarms can be caught before they reach the coasts,” says Javidpour.

At the same time, the project partners will also try to answer the question of what to do with jellyfish once they have been caught. One idea is to use the jellyfish to battle another, man-made threat.

“Studies have shown that mucus of jellyfish can bind microplastic. Therefore, we want to test whether biofilters can be produced from jellyfish. These biofilters could then be used in sewage treatment plants or in factories where microplastic is produced,” the GoJelly researchers say.

Jellyfish can also be used as fertilizers for agriculture or as aquaculture feed. “Fish in fish farms are currently fed with captured wild fish, which does not reduce the problem of overfishing, but increases it. Jellyfish as feed would be much more sustainable and would protect natural fish stocks,” says the GoJelly team.

Another option is using jellyfish as food for humans. “In some cultures, jellyfish are already on the menu. As long as the end product is no longer slimy, it could also gain greater general acceptance,” said Javidpour. Last but not least, jellyfish contain collagen, a substance very much sought after in the cosmetics industry.

Project partners from the Norwegian University of Science and Technology, led by Nicole Aberle-Malzahn, and SINTEF Ocean, led by Rachel Tiller, will analyse how abiotic (hydrography, temperature), biotic (abundance, biomass, ecology, reproduction) and biochemical parameters (stoichiometry, food quality) affect the initiation of jellyfish blooms.

Based on a comprehensive analysis of triggering mechanisms, origin of seed populations and ecological modelling, the researchers hope to be able to make more reliable predictions on jellyfish bloom formation of specific taxa in the GoJelly target areas. This knowledge will allow sustainable harvesting of jellyfish communities from various Northern and Southern European populations.

This harvest will provide a marine biomass of unknown potential that will be explored by researchers at SINTEF Ocean, among others, to explore the possible ways to use the material.

A team from SINTEF Ocean’s strategic program Clean Ocean will also work with European colleagues on developing a filter from the mucus of the jellyfish that will catch microplastics from household sources (fleece sweaters, the breakdown of plastic products, or cosmetics, for example) and prevent these from entering the marine ecosystem.

Finally, SINTEF Ocean will examine the socio-ecological system, using games to explore the potential of an emerging international management regime for a global effort to mitigate the negative effects of microplastics in the oceans.

“Jellyfish can be used for many purposes. We see this as an opportunity to use the potential of the huge biomass drifting right in front of our front door,” Javidpour said.

You can find out more about GoJelly on their Twitter account.

Colour: an art/science open call for submissions

The submission deadline for this open ‘art/sci’ call is January 17, 2018 (from a November 29, 2017 Art/Science Salon announcement; received via email),

COLOUR: WHAT DO YOU MEAN BY THAT?

An exhibition exploring colour as a phenomenon that crosses the
boundaries of the arts and sciences.

Artists and designers revel in, and seek to understand, the visceral,
physical and ephemeral qualities of colour. Sir Isaac Newton began his
scientific experiments with light and prisms as ‘a very pleasing
divertisement, to view the vivid and intense colours produced
thereby’. His investigations ultimately changed our understanding of
the fundamental nature of light and colour. Johann Wolfgang von Goethe
challenged Newton’s understanding as limited, and introduced colour as
an emotionally charged phenomenon. He proposed an alternative
methodological approach based on ‘empathic observation’.

COLOUR: WHAT DO YOU MEAN BY THAT? calls for art inspired by, or
questioning, scientific concepts about colour: art that encapsulates
colour knowledge from multiple perspectives.

We are not looking for the merely colourful – rather we look for work
engaging ideas, theories and aspects of colour – both conceptual and
physical – that highlight colour knowledge as richly meaningful across
diverse ways of knowing.

To this end, we invite proposals that present, consider, or respond to
research about colour and colour phenomena. Work may relate to:

* physical colour phenomena, e.g. light sources, interference,
iridescence, scattering, reflection
* chemistry of dyes & pigments
* colour vision / colour perception
* colour renderings of energies outside of the visible spectrum
(ultraviolet, infra-red, etc.)
* colour meanings (cultural, scientific, philosophical)
* cross-sensory colour sensations and understandings
* colour theories
* colour histories

SHOW DATES: MARCH 7-25, 2018.

COLOUR: WHAT DO YOU MEAN BY THAT? is jointly sponsored by Propeller
Gallery and the Colour Research Society of Canada [1]

SHOW LOCATION: Propeller Gallery, 30 Abell St, Toronto, ON, Canada

SUBMISSION DEADLINE: Wed Jan 17, 2018, 11:59pm [which timezone?]

SUBMIT YOUR APPLICATION HERE:
HTTP://HUUTAART.COM/OPENCALLS/COLOUR-WHAT-DO-YOU-MEAN-BY-THAT [2]

SUBMISSION REQUIREMENTS:

You may make up to three submissions, provided the concept is
substantially different for each piece. With each submission, please
provide at least one image (maximum four images) relevant to your
proposal.

Details about yourself and your work including:

* Name, address, email, phone number, with a brief bio.
* Title of Work, Year, Medium, Size and Value in $CAD.
* A brief written statement about the work, including how the work
deals with, or draws its inspiration from, diverse ways of knowing about
colour (max. 150 words).
* NON-REFUNDABLE SUBMISSION FEE OF $50.00 PER SUBMISSION.

CURATORIAL TEAM MEMBERS: Doreen Balabanoff, Robin Kingsburgh, Janet
Read, Judith Tinkl

ADDITIONAL INFORMATION:

* 25% commission collected on any work sold as a result of this
exhibition.
* For more information visit our website: www.propellerctr.com [3]
* If selected, you agree to allow us to use your submission material,
without compensation, in any potential catalogue/publication for this
exhibition.
* Selected artists will be contacted by email no later than January
31. Delivery instructions will be given at that time.
* An event at the exhibition, related to International Colour Day,
March 21st, will be announced in early 2018.

Please direct inquiries to:

Nathan Heuvingh
Gallery Director
gallery@propellerctr.com
1-416-504-7142

Facebook: https://www.facebook.com/PropellerTO/ [4]
Twitter: @PropellerTO
Instagram: @propellerygallery_to

The co-sponsor for this upcoming exhibition, the Colour Research Society of Canada, has a website that proved to be a delightful surprise.

Getting back to COLOUR: WHAT DO YOU MEAN BY THAT?, good luck with your submission, and should it be accepted, good luck with sales!