Tag Archives: Australia

Café Scientifique (Vancouver, Canada) on climate change and rise of complex life on Nov. 24, 2015 and Member of Parliament Joyce Murray’s Paris Climate Conference breakfast meeting

On Tuesday, November 24, 2015 at 7:30 pm in the back room of The Railway Club (2nd floor of 579 Dunsmuir St. [at Seymour St.]), Café Scientifique will be hosting a talk about climate change and the rise of complex life (from the Nov. 12, 2015 announcement),

Our speaker for the evening will be Dr. Mark Jellinek.  The title of his talk is:

The Formation and Breakup of Earth’s Supercontinents and the Remarkable Link to Earth’s Climate and the Rise of Complex Life

Earth history is marked by the intermittent formation and breakup of “supercontinents”, where all the land mass is organized much like a completed jigsaw puzzle centered at the equator or pole of the planet. Such events disrupt the mantle convective motions that cool our planet, in turn affecting the volcanic and weathering processes that maintain Earth’s remarkably hospitable climate. In this talk I will explore how the last two supercontinental cycles impelled Earth into profoundly different climate extremes: a ~150 million year long cold period involving protracted global glaciations beginning about 800 million years ago and a ~100 million year long period of extreme warming beginning about 170 million years ago. One of the most provocative features of the last period of global glaciation is the rapid emergence of complex, multicellular animals about 650 million years ago. Why global glaciation might stimulate such an evolutionary bifurcation is, however, unclear. Predictable environmental stresses related to effects of the formation and breakup of the supercontinent Rodinia on ocean chemistry and Earth’s surface climate may play a crucial and unexpected role that I will discuss.

Dr. Jellinek is a professor in the Dept. of Earth, Ocean and Atmospheric Sciences at the University of British Columbia; his research interests include Volcanology, Geodynamics, Planetary Science, and Geological Fluid Mechanics. You can find out more about Dr. Jellinek and his work here.

Joyce Murray and the Paris Climate Conference (sold out)

Joyce Murray is a Canadian Member of Parliament (Liberal) for the riding of Vancouver Quadra who hosts a regular breakfast meeting where topics of interest (child care, seniors, transportation, the arts, big data, etc.) are discussed. From a Nov. 13, 2015 email announcement,

You are invited to our first post-election Vancouver Quadra MP Breakfast Connections on November 27th at Enigma Restaurant, for a discussion with Dr. Mark Jaccard on why the heat will be on world leaders in Paris in the days leading to December 12th, at the Paris Climate Conference (COP 21).

After 20 years of UN negotiations, the world expects a legally binding universal agreement on climate to keep temperature increases below 2°C! The climate heat will especially be on laggards like Canada and Australia’s new Prime Ministers. What might be expected of the Right Honorable Justin Trudeau and his provincial premiers? What are the possible outcomes of COP21?

Dr. Jaccard has worked with leadership in countries like China and the United States, and helped develop British Columbia’s innovative Climate Action Plan and Carbon Tax.

Join us for this unique opportunity to engage with a climate policy expert who has participated in this critical global journey. From the occasion of the 1992 Rio Earth Summit resulting in the UN Framework Convention on Climate Change (UNFCCC), through the third Conference of Parties’ (COP3) Kyoto Protocol, to COP21 today, the building blocks for a binding international solution have been assembled. What’s still missing?

Mark has been a professor in the School of Resource and Environmental Management at Simon Fraser University since 1986 and is a global leader and consultant on structuring climate mitigation solutions. Former Chair and CEO of the British Columbia Utilities Commission, he has published over 100 academic papers, most of these related to his principal research focus: the design and application of energy-economy models that assess the effectiveness of sustainable energy and climate policies.

When: Friday November 27th 7:30 to 9:00AM

Where: Enigma Restaurant 4397 west 10th Avenue (at Trimble)

Cost: $20 includes a hot buffet breakfast; $10 for students (cash only please)

RSVP by emailing joyce.murray.c1@parl.gc.ca or call 604-664-9220


They’re not even taking names for a waiting list. You can find out more about Dr. Jaccard’s work here.

Primordial goo for implants

Using the words ‘goo’ and ‘nanotechnology’ together almost always leads to ‘end of world’ scenarios referred to as ‘grey goo’, or there’s the alternative ‘green goo’ version, also known as ecophagy. Presumably, that’s why Australian researchers avoided the word ‘nanotechnology’ in their study of the original goo, the primordial goo from which all life oozed, to develop a coating for medical implants. From a Nov. 16, 2015 (Australia) Commonwealth Scientific and Industrial Research Organisation (CSIRO) press release (also on EurekAlert),

Australia’s national science research organisation, CSIRO, has developed an innovative new coating that could be used to improve medical devices and implants, thanks to a “goo” thought to have been home to the building blocks of life.

The molecules from this primordial goo – known as prebiotic compounds – can be traced back billions of years and have been studied intensively since their discovery several decades ago.

For the first time, Australian researchers have uncovered a way to use these molecules to assist with medical treatments.

“We wanted to use these prehistoric molecules, which are believed to have been the source of all life evolving on Earth, to see if we could apply the chemistry in a practical way.” [Dr. Richard Evans, CSIRO researcher]

The team discovered that the coating is bio-friendly and cells readily grow and colonise it.

It could be applied to medical devices to improve their performance and acceptance by the body.

This could assist with a range of medical procedures.

“The non-toxic coating is adhesive and will coat almost any material, making its potential biomedical applications really broad,” Dr Evans said.

The researchers also experimented with adding silver compounds, in order to produce an antibacterial coating that can be used on devices such as catheters to avoid infections.

“Other compounds can also be added to implants to reduce friction, make them more durable and resistant to wear,” Dr Evans said.

The coating process the scientists developed is very simple and uses methods and substances that are readily available.

This means biomedical manufacturers can produce improved results more cost effectively compared to existing coatings.

CSIRO is the first organisation to investigate practical applications of this kind using prebiotic chemistry.

“This research opens the door to a host of new biomedical possibilities that are still yet to be explored,” Dr Evans said.

CSIRO is seeking to partner with biomedical manufacturers to exploit this technology.

Here’s a link to and a citation for the paper,

Prebiotic-chemistry inspired polymer coatings for biomedical and material science applications by Helmut Thissen, Aylin Koegler, Mario Salwiczek, Christopher D Easton, Yue Qu, Trevor Lithgow, and Richard A Evans. NPG Asia Materials (2015) 7, e225. doi:10.1038/am.2015.122 Published online 13 November 2015

This is an open access paper.

Café Scientifique (Vancouver, Canada) and noise on Oct. 27, 2015

On Tuesday, October 27, 2015, Café Scientifique, in the back room of The Railway Club (2nd floor of 579 Dunsmuir St. [at Seymour St.]), will be hosting a talk on the history of noise (from the Oct. 13, 2015 announcement),

Our speaker for the evening will be Dr. Shawn Bullock.  The title of his talk is:

The History of Noise: Perspectives from Physics and Engineering

The word “noise” is often synonymous with “nuisance,” which implies something to be avoided as much as possible. We label blaring sirens, the space between stations on the radio dial and the din of a busy street as “noise.” Is noise simply a sound we don’t like? We will consider the evolution of how scientists and engineers have thought about noise, beginning in the 19th-century and continuing to the present day. We will explore the idea of noise both as a social construction and as a technological necessity. We’ll also touch on critical developments in the study of sound, the history of physics and engineering, and the development of communications technology.

This description is almost identical to the description Bullock gave for a November 2014 talk he titled Snap, Crackle, Pop!: A Short History of Noise, which he summarized this way after delivering the talk,

I used ideas from the history of physics, the history of music, the discipline of sound studies, and the history of electrical engineering to make the point that understanding “noise” is essential to understanding advancements in physics and engineering in the last century. We began with a discussion of 19th-century attitudes toward noise (and its association with “progress” and industry) before moving on to examine the early history of recorded sound and music, early attempts to measure noise, and the noise abatement movement. I concluded with a brief overview of my recent work on the role of noise in the development of the modem during the early Cold War.

You can find out more about Dr. Bullock, who is an assistant professor of science education at Simon Fraser University, here at his website.

On the subject of noise, although not directly related to Bullock’s work, there’s some research suggesting that noise may be having a serious impact on marine life. From an Oct. 8, 2015 Elsevier press release on EurekAlert,

Quiet areas should be sectioned off in the oceans to give us a better picture of the impact human generated noise is having on marine animals, according to a new study published in Marine Pollution Bulletin. By assigning zones through which ships cannot travel, researchers will be able to compare the behavior of animals in these quiet zones to those living in noisier areas, helping decide the best way to protect marine life from harmful noise.

The authors of the study, from the University of St Andrews, UK, the Oceans Initiative, Cornell University, USA, and Curtin University, Australia, say focusing on protecting areas that are still quiet will give researchers a better insight into the true impact we are having on the oceans.

Almost all marine organisms, including mammals like whales and dolphins, fish and even invertebrates, use sound to find food, avoid predators, choose mates and navigate. Chronic noise from human activities such as shipping can have a big impact on these animals, since it interferes with their acoustic signaling – increased background noise can mean animals are unable to hear important signals, and they tend to swim away from sources of noise, disrupting their normal behavior.

The number of ships in the oceans has increased fourfold since 1992, increasing marine noise dramatically. Ships are also getting bigger, and therefore noisier: in 2000 the biggest cargo ships could carry 8,000 containers; today’s biggest carry 18,000.

“Marine animals, especially whales, depend on a naturally quiet ocean for survival, but humans are polluting major portions of the ocean with noise,” said Dr. Christopher Clark from the Bioacoustics Research Program, Cornell University. “We must make every effort to protect quiet ocean regions now, before they grow too noisy from the din of our activities.”

For the new study, lead author Dr. Rob Williams and the team mapped out areas of high and low noise pollution in the oceans around Canada. Using shipping route and speed data from Environment Canada, the researchers put together a model of noise based on ships’ location, size and speed, calculating the cumulative sound they produce over the course of a year. They used the maps to predict how noisy they thought a particular area ought to be.

To test their predictions, in partnership with Cornell University, they deployed 12 autonomous hydrophones – devices that can measure noise in water – and found a correlation in terms of how the areas ranked from quietest to noisiest. The quiet areas are potential noise conservation zones.

“We tend to focus on problems in conservation biology. This was a fun study to work on, because we looked for opportunities to protect species by working with existing patterns in noise and animal distribution, and found that British Columbia offers many important habitats for whales that are still quiet,” said Dr. Rob Williams, lead author of the study. “If we think of quiet, wild oceans as a natural resource, we are lucky that Canada is blessed with globally rare pockets of acoustic wilderness. It makes sense to talk about protecting acoustic sanctuaries before we lose them.”

Although it is clear that noise has an impact on marine organisms, the exact effect is still not well understood. By changing their acoustic environment, we could be inadvertently choosing winners and losers in terms of survival; researchers are still at an early stage of predicting who will win or lose under different circumstances. The quiet areas the team identified could serve as experimental control sites for research like the International Quiet Ocean Experiment to see what effects ocean noise is having on marine life.

“Sound is perceived differently by different species, and some are more affected by noise than others,” said Christine Erbe, co-author of the study and Director of the Marine Science Center, Curtin University, Australia.

So far, the researchers have focused on marine mammals – whales, dolphins, porpoises, seals and sea lions. With a Pew Fellowship in Marine Conservation, Dr. Williams now plans to look at the effects of noise on fish, which are less well understood. By starting to quantify those effects and letting people know what the likely economic impact on fisheries, or on fish that are culturally important, might be, Dr. Williams hopes to get the attention of the people who make decisions that affect ocean noise.

“When protecting highly mobile and migratory species that are poorly studied, it may make sense to focus on threats rather than the animals themselves. Shipping patterns decided by humans are often more predictable than the movements of whales and dolphins,” said Erin Ashe, co-author of the study and co-founder of the Oceans Initiative from the University of St Andrews.

Keeping areas of the ocean quiet is easier than reducing noise in already busy zones, say the authors of the study. However, if future research that stems from noise protected zones indicates that overall marine noise should be reduced, there are several possible approaches to reducing noise. The first is speed reduction: the faster a ship goes, the noisier it gets, so slowing down would reduce overall noise. The noisiest ships could also be targeted for replacement: by reducing the noise produced by the noisiest 10% of ships in use today, overall marine noise could be reduced by more than half. The third, more long-term, option would be to build quieter ships from the outset.

I can’t help wondering why Canadian scientists aren’t involved in this research taking place off our shores. Regardless, here’s a link to and a citation for the paper,

Quiet(er) marine protected areas by Rob Williams, Christine Erbe, Erin Ashe, & Christopher W. Clark. Marine Pollution Bulletin, available online 16 September 2015, In Press, Corrected Proof. doi:10.1016/j.marpolbul.2015.09.012

This is an open access paper.

Copyright and patent protections and human rights

The United Nations (UN) and cultural rights don’t immediately leap to mind when the subjects of copyright and patents are discussed. A Mar. 13, 2015 posting by Tim Cushing on Techdirt and an Oct. 14, 2015 posting by Glyn Moody, also on Techdirt, explain the connection in the person of Farida Shaheed, the UN Special Rapporteur on cultural rights and the author of two UN reports, one on copyright and one on patents.

From the Mar. 13, 2015 posting by Tim Cushing,

… Farida Shaheed, has just delivered a less-than-complimentary report on copyright to the UN’s Human Rights Council. Shaheed’s report actually examines where copyright meshes with arts and science — the two areas it’s supposed to support — and finds it runs contrary to the rosy image of incentivized creation perpetuated by the MPAAs and RIAAs of the world.

Shaheed said a “widely shared concern stems from the tendency for copyright protection to be strengthened with little consideration to human rights issues.” This is illustrated by trade negotiations conducted in secrecy, and with the participation of corporate entities, she said.

She stressed the fact that one of the key points of her report is that intellectual property rights are not human rights. “This equation is false and misleading,” she said.

The last statement fires shots over the bows of “moral rights” purveyors, as well as those who view infringement as a moral issue, rather than just a legal one.

Shaheed also points out that the protections being installed around the world at the behest of incumbent industries are not necessarily reflective of creators’ desires. …

Glyn Moody’s Oct. 14, 2015 posting features Shaheed’s latest report on patents,

… As the summary to her report puts it:

There is no human right to patent protection. The right to protection of moral and material interests cannot be used to defend patent laws that inadequately respect the right to participate in cultural life, to enjoy the benefits of scientific progress and its applications, to scientific freedoms and the right to food and health and the rights of indigenous peoples and local communities.

Patents, when properly structured, may expand the options and well-being of all people by making new possibilities available. Yet, they also give patent-holders the power to deny access to others, thereby limiting or denying the public’s right of participation to science and culture. The human rights perspective demands that patents do not extend so far as to interfere with individuals’ dignity and well-being. Where patent rights and human rights are in conflict, human rights must prevail.

The report touches on many issues previously discussed here on Techdirt. For example, how pharmaceutical patents limit access to medicines by those unable to afford the high prices monopolies allow — a particularly hot topic in the light of TPP’s rules on data exclusivity for biologics. The impact of patents on seed independence is considered, and there is a warning about corporate sovereignty chapters in trade agreements, and the chilling effects they can have on the regulatory function of states and their ability to legislate in the public interest — for example, with patent laws.

I have two Canadian examples for data exclusivity and corporate sovereignty issues, both from Techdirt. There’s an Oct. 19, 2015 posting by Glyn Moody featuring a recent Health Canada move to threaten a researcher into suppressing information from human clinical trials,

… one of the final sticking points of the TPP negotiations [Trans Pacific Partnership] was the issue of data exclusivity for the class of drugs known as biologics. We’ve pointed out that the very idea of giving any monopoly on what amounts to facts is fundamentally anti-science, but that’s a rather abstract way of looking at it. A recent case in Canada makes plain what data exclusivity means in practice. As reported by CBC [Canadian Broadcasting Corporation] News, it concerns unpublished clinical trial data about a popular morning sickness drug:

Dr. Navindra Persaud has been fighting for four years to get access to thousands of pages of drug industry documents being held by Health Canada.

He finally received the material a few weeks ago, but now he’s being prevented from revealing what he has discovered.

That’s because Health Canada required him to sign a confidentiality agreement, and has threatened him with legal action if he breaks it.

The clinical trials data is so secret that he’s been told that he must destroy the documents once he’s read them, and notify Health Canada in writing that he has done so….

For those who aren’t familiar with it, the Trans Pacific Partnership is a proposed trade agreement including 12 countries (Australia, Brunei Darussalam, Canada, Chile, Japan, Malaysia, Mexico, New Zealand, Peru, Singapore, United States, and Vietnam) from the Pacific Rim. If all the countries sign on (it looks as if they will; Canada’s new Prime Minister as of Oct. 19, 2015 seems to be in favour of the agreement although he has yet to make a definitive statement), the TPP will represent a trading block that is almost double the size of the European Union.

An Oct. 8, 2015 posting by Mike Masnick provides a description of corporate sovereignty and of the Eli Lilly suit against the Canadian government.

We’ve pointed out a few times in the past that while everyone refers to the Trans Pacific Partnership (TPP) agreement as a “free trade” agreement, the reality is that there’s very little in there that’s actually about free trade. If it were truly a free trade agreement, then there would be plenty of reasons to support it. But the details show it’s not, and yet, time and time again, we see people supporting the TPP because “well, free trade is good.” …
… it’s that “harmonizing regulatory regimes” thing where the real nastiness lies, and where you quickly discover that most of the key factors in the TPP are not at all about free trade, but the opposite. It’s about as protectionist as can be. That’s mainly because of the really nasty corporate sovereignty clauses in the agreement (which are officially called “investor state dispute settlement” or ISDS in an attempt to make it sound so boring you’ll stop paying attention). Those clauses basically allow large incumbents to force the laws of countries to change to their will. Companies who feel that some country’s regulation somehow takes away “expected profits” can convene a tribunal, and force a country to change its laws. Yes, technically a tribunal can only issue monetary sanctions against a country, but countries who wish to avoid such monetary payments will change their laws.

Remember how Eli Lilly is demanding $500 million from Canada after Canada rejected some Eli Lilly patents, noting that the new compound didn’t actually do anything new and useful? Eli Lilly claims that using such a standard to reject patents unfairly attacks its expected future profits, and thus it can demand $500 million from Canadian taxpayers. Now, imagine that on all sorts of other systems.

Cultural rights, human rights, corporate rights. It would seem that corporate rights are going to run counter to human rights, if nothing else.

Policy makers, beware experts! And, evidence too

There is much to admire in this new research but there’s also a troubling conflation.

An Oct. 14, 2015 University of Cambridge press release (also on EurekAlert) cautions policy makers about making use of experts,

The accuracy and reliability of expert advice is often compromised by “cognitive frailties”, and needs to be interrogated with the same tenacity as research data to avoid weak and ill-informed policy, warn two leading risk analysis and conservation researchers in the journal Nature today.

While many governments aspire to evidence-based policy [emphasis mine], the researchers say the evidence on experts themselves actually shows that they are highly susceptible to “subjective influences” – from individual values and mood, to whether they stand to gain or lose from a decision – and, while highly credible, experts often vastly overestimate their objectivity and the reliability of peers.

They appear to be conflating evidence and expertise. Evidence usually means data while expertise is a more nebulous concept. (Presumably, an expert is someone whose opinion is respected for one reason or another and who has studied the evidence and drawn some conclusions from it.)

The study described in the press release notes that one of the weaknesses of relying on experts is that they are subject to bias. They don’t mention that evidence or data can also be subject to bias but perhaps that’s why they suggest the experts should provide and assess the evidence on which they are basing their advice,

The researchers caution that conventional approaches of informing policy by seeking advice from either well-regarded individuals or assembling expert panels needs to be balanced with methods that alleviate the effects of psychological and motivational bias.

They offer a straightforward framework for improving expert advice, and say that experts should provide and assess [emphasis mine] evidence on which decisions are made – but not advise decision makers directly, which can skew impartiality.

“We are not advocating replacing evidence with expert judgements, rather we suggest integrating and improving them,” write professors William Sutherland and Mark Burgman from the universities of Cambridge and Melbourne respectively.

“Policy makers use expert evidence as though it were data. So they should treat expert estimates with the same critical rigour that must be applied to data,” they write.

“Experts must be tested, their biases minimised, their accuracy improved, and their estimates validated with independent evidence. Put simply, experts should be held accountable for their opinions.”

Sutherland and Burgman point out that highly regarded experts are routinely shown to be no better than novices at making judgements.

However, several processes have been shown to improve performances across the spectrum, they say, such as ‘horizon scanning’ – identifying all possible changes and threats – and ‘solution scanning’ – listing all possible options, using both experts and evidence, to reduce the risk of overlooking valuable alternatives.

To get better answers from experts, they need better, more structured questions, say the authors. “A seemingly straightforward question, ‘How many diseased animals are there in the area?’ for example, could be interpreted very differently by different people. Does it include those that are infectious and those that have recovered? What about those yet to be identified?” said Sutherland, from Cambridge’s Department of Zoology.

“Structured question formats that extract upper and lower boundaries, degrees of confidence and force consideration of alternative theories are important for shoring against slides into group-think, or individuals getting ascribed greater credibility based on appearance or background,” he said.

When seeking expert advice, all parties must be clear about what they expect of each other, says Burgman, Director of the Centre of Excellence for Biosecurity Risk Analysis. “Are policy makers expecting estimates of facts, predictions of the outcome of events, or advice on the best course of action?”

“Properly managed, experts can help with estimates and predictions, but providing advice assumes the expert shares the same values and objectives as the decision makers. Experts need to stick to helping provide and assess evidence on which such decisions are made,” he said.

Sutherland and Burgman have created a framework of eight key ways to improve the advice of experts. These include using groups – not individuals – with diverse, carefully selected members well within their expertise areas.

They also caution against being bullied or “starstruck” by the over-assertive or heavyweight. “People who are less self-assured will seek information from a more diverse range of sources, and age, number of qualifications and years of experience do not explain an expert’s ability to predict future events – a finding that applies in studies from geopolitics to ecology,” said Sutherland.

Added Burgman: “Some experts are much better than others at estimation and prediction. However, the only way to tell a good expert from a poor one is to test them. Qualifications and experience don’t help to tell them apart.”

“The cost of ignoring these techniques – of using experts inexpertly – is less accurate information and so more frequent, and more serious, policy failures,” write the researchers.

Here’s a link to and a citation for the paper,

Policy advice: Use experts wisely by William J. Sutherland & Mark Burgman. Nature 526, 317–318 (15 October 2015) doi:10.1038/526317a

It’s good to see a nuanced attempt to counteract mindless adherence to expert opinion. I hope that in future work they will also treat evidence and data as needing to be approached cautiously.

D-Wave upgrades Google’s quantum computing capabilities

Vancouver-based (more accurately, Burnaby-based) D-Wave Systems has scored a coup as key customers have upgraded from a 512-qubit system to a system with over 1,000 qubits. (The technical breakthrough and concomitant interest from the business community was mentioned here in a June 26, 2015 posting.) As for the latest business breakthrough, here’s more from a Sept. 28, 2015 D-Wave press release,

D-Wave Systems Inc., the world’s first quantum computing company, announced that it has entered into a new agreement covering the installation of a succession of D-Wave systems located at NASA’s Ames Research Center in Moffett Field, California. This agreement supports collaboration among Google, NASA and USRA (Universities Space Research Association) that is dedicated to studying how quantum computing can advance artificial intelligence and machine learning, and the solution of difficult optimization problems. The new agreement enables Google and its partners to keep their D-Wave system at the state-of-the-art for up to seven years, with new generations of D-Wave systems to be installed at NASA Ames as they become available.

“The new agreement is the largest order in D-Wave’s history, and indicative of the importance of quantum computing in its evolution toward solving problems that are difficult for even the largest supercomputers,” said D-Wave CEO Vern Brownell. “We highly value the commitment that our partners have made to D-Wave and our technology, and are excited about the potential use of our systems for machine learning and complex optimization problems.”

Cade Metz’s Sept. 28, 2015 article for Wired magazine provides some interesting observations about D-Wave computers along with some explanations of quantum computing (Note: Links have been removed),

Though the D-Wave machine is less powerful than many scientists hope quantum computers will one day be, the leap to 1000 qubits represents an exponential improvement in what the machine is capable of. What is it capable of? Google and its partners are still trying to figure that out. But Google has said it’s confident there are situations where the D-Wave can outperform today’s non-quantum machines, and scientists at the University of Southern California [USC] have published research suggesting that the D-Wave exhibits behavior beyond classical physics.

A quantum computer operates according to the principles of quantum mechanics, the physics of very small things, such as electrons and photons. In a classical computer, a transistor stores a single “bit” of information. If the transistor is “on,” it holds a 1, and if it’s “off,” it holds a 0. But in a quantum computer, thanks to what’s called the superposition principle, information is held in a quantum system that can exist in two states at the same time. This “qubit” can store a 0 and 1 simultaneously.

Two qubits, then, can hold four values at any given time (00, 01, 10, and 11). And as you keep increasing the number of qubits, you exponentially increase the power of the system. The problem is that building a qubit is an extremely difficult thing. If you read information from a quantum system, it “decoheres.” Basically, it turns into a classical bit that houses only a single value.

D-Wave claims to have found a solution to the decoherence problem and that appears to be borne out by the USC researchers. Still, it isn’t a general quantum computer (from Metz’s article),

… researchers at USC say that the system appears to display a phenomenon called “quantum annealing” that suggests it’s truly operating in the quantum realm. Regardless, the D-Wave is not a general quantum computer—that is, it’s not a computer for just any task. But D-Wave says the machine is well-suited to “optimization” problems, where you’re facing many, many different ways forward and must pick the best option, and to machine learning, where computers teach themselves tasks by analyzing large amounts of data.

It takes a lot of innovation before you make big strides forward, and I think D-Wave is to be congratulated on producing what is, to my knowledge, the only commercially available form of quantum computing in the world.

ETA Oct. 6, 2015* at 1230 hours PST: Minutes after publishing about D-Wave I came across this item (h/t Quirks & Quarks twitter) about Australian researchers and their quantum computing breakthrough. From an Oct. 6, 2015 article by Hannah Francis for the Sydney (Australia) Morning Herald,

For decades scientists have been trying to turn quantum computing — which allows for multiple calculations to happen at once, making it immeasurably faster than standard computing — into a practical reality rather than a moonshot theory. Until now, they have largely relied on “exotic” materials to construct quantum computers, making them unsuitable for commercial production.

But researchers at the University of New South Wales have patented a new design, published in the scientific journal Nature on Tuesday, created specifically with computer industry manufacturing standards in mind and using affordable silicon, which is found in regular computer chips like those we use every day in smartphones or tablets.

“Our team at UNSW has just cleared a major hurdle to making quantum computing a reality,” the director of the university’s Australian National Fabrication Facility, Andrew Dzurak, the project’s leader, said.

“As well as demonstrating the first quantum logic gate in silicon, we’ve also designed and patented a way to scale this technology to millions of qubits using standard industrial manufacturing techniques to build the world’s first quantum processor chip.”

According to the article, the university is looking for industrial partners to help them exploit this breakthrough. Francis’s article features an embedded video, as well as more detail.

*It was Oct. 6, 2015 in Australia but Oct. 5, 2015 my side of the international date line.

ETA Oct. 6, 2015 (my side of the international date line): An Oct. 5, 2015 University of New South Wales news release on EurekAlert provides additional details.

Here’s a link to and a citation for the paper,

A two-qubit logic gate in silicon by M. Veldhorst, C. H. Yang, J. C. C. Hwang, W. Huang, J. P. Dehollain, J. T. Muhonen, S. Simmons, A. Laucht, F. E. Hudson, K. M. Itoh, A. Morello & A. S. Dzurak. Nature (2015) doi:10.1038/nature15263 Published online 05 October 2015

This paper is behind a paywall.

Global overview of nano-enabled food and agriculture regulation

First off, this post features an open access paper summarizing global regulation of nanotechnology in agriculture and food production. From a Sept. 11, 2015 news item on Nanowerk,

An overview of regulatory solutions worldwide on the use of nanotechnology in food and feed production shows a differing approach: only the EU and Switzerland have nano-specific provisions incorporated in existing legislation, whereas other countries count on non-legally binding guidance and standards for industry. Collaboration among countries across the globe is required to share information and ensure protection for people and the environment, according to the paper …

A Sept. 11, 2015 European Commission Joint Research Centre press release (also on EurekAlert*), which originated the news item, summarizes the paper in more detail (Note: Links have been removed),

The paper “Regulatory aspects of nanotechnology in the agri/feed/food sector in EU and non-EU countries” reviews how potential risks or the safety of nanotechnology are managed in different countries around the world and recognises that this may have implication on the international market of nano-enabled agricultural and food products.

Nanotechnology offers substantial prospects for the development of innovative products and applications in many industrial sectors, including agricultural production, animal feed and treatment, food processing and food contact materials. While some applications are already marketed, many other nano-enabled products are currently under research and development, and may enter the market in the near future. Expected benefits of such products include increased efficacy of agrochemicals through nano-encapsulation, enhanced bioavailability of nutrients or more secure packaging material through antimicrobial nanoparticles.

As with any other regulated product, applicants applying for market approval have to demonstrate the safe use of such new products without posing undue safety risks to the consumer and the environment. Some countries have been more active than others in examining the appropriateness of their regulatory frameworks for dealing with the safety of nanotechnologies. As a consequence, different approaches have been adopted in regulating nano-based products in the agri/feed/food sector.

The analysis shows that the EU along with Switzerland are the only ones which have introduced binding nanomaterial definitions and/or specific provisions for some nanotechnology applications. An example would be the EU labelling requirements for food ingredients in the form of ‘engineered nanomaterials’. Other regions in the world regulate nanomaterials more implicitly mainly by building on non-legally binding guidance and standards for industry.

The overview of existing legislation and guidances published as an open access article in the journal Regulatory Toxicology and Pharmacology is based on information gathered by the JRC, RIKILT-Wageningen and the European Food Safety Authority (EFSA) through literature research and a dedicated survey.

Here’s a link to and a citation for the paper,

Regulatory aspects of nanotechnology in the agri/feed/food sector in EU and non-EU countries by Valeria Amenta, Karin Aschberger, Maria Arena, Hans Bouwmeester, Filipa Botelho Moniz, Puck Brandhoff, Stefania Gottardo, Hans J.P. Marvin, Agnieszka Mech, Laia Quiros Pesudo, Hubert Rauscher, Reinhilde Schoonjans, Maria Vittoria Vettori, Stefan Weigel, Ruud J. Peters. Regulatory Toxicology and Pharmacology Volume 73, Issue 1, October 2015, Pages 463–476. doi:10.1016/j.yrtph.2015.06.016

This is the most inclusive overview I’ve seen yet. The authors cover Asian countries, South America, Africa, and the Middle East, as well as the usual suspects in Europe and North America.

Given I’m a Canadian blogger I feel obliged to include their summary of the Canadian situation (Note: Links have been removed),

4.2. Canada

The Canadian Food Inspection Agency (CFIA) and Public Health Agency of Canada (PHAC), who have recently joined the Health Portfolio of Health Canada, are responsible for food regulation in Canada. No specific regulation for nanotechnology-based food products is available but such products are regulated under the existing legislative and regulatory frameworks.11 In October 2011 Health Canada published a “Policy Statement on Health Canada’s Working Definition for Nanomaterials” (Health Canada, 2011), the document provides a (working) definition of NM which is focused, similarly to the US definition, on the nanoscale dimensions, or on the nanoscale properties/phenomena of the material (see Annex I). For what concerns general chemicals regulation in Canada, the New Substances (NS) program must ensure that new substances, including substances that are at the nano-scale (i.e. NMs), are assessed in order to determine their toxicological profile ( Environment Canada, 2014). The approach applied involves a pre-manufacture and pre-import notification and assessment process. In 2014, the New Substances program published a guidance aimed at increasing clarity on which NMs are subject to assessment in Canada ( Environment Canada, 2014).

Canadian and US regulatory agencies are working towards harmonising the regulatory approaches for NMs under the US-Canada Regulatory Cooperation Council (RCC) Nanotechnology Initiative.12 Canada and the US recently published a Joint Forward Plan where findings and lessons learnt from the RCC Nanotechnology Initiative are discussed (Canada–United States Regulatory Cooperation Council (RCC) 2014).

Based on their summary of the Canadian situation, with which I am familiar, they’ve done a good job. Here are a few of the countries whose regulatory instruments have not been mentioned here before (Note: Links have been removed),

In Turkey a national or regional policy for the responsible development of nanotechnology is under development (OECD, 2013b). Nanotechnology is considered as a strategic technological field and at present 32 nanotechnology research centres are working in this field. Turkey participates as an observer in the EFSA Nano Network (Section 3.6) along with other EU candidate countries Former Yugoslav Republic of Macedonia, and Montenegro (EFSA, 2012). The Inventory and Control of Chemicals Regulation entered into force in Turkey in 2008, which represents a scale-down version of the REACH Regulation (Bergeson et al. 2010). Moreover, the Ministry of Environment and Urban Planning published a Turkish version of CLP Regulation (known as SEA in Turkish) to enter into force as of 1st June 2016 (Intertek).

The Russian legislation on food safety is based on regulatory documents such as the Sanitary Rules and Regulations (“SanPiN”), but also on national standards (known as “GOST”) and technical regulations (Office of Agricultural Affairs of the USDA, 2009). The Russian policy on nanotechnology in the industrial sector has been defined in some national programmes (e.g. Nanotechnology Industry Development Program) and a Russian Corporation of Nanotechnologies was established in 2007.15 As reported by FAO/WHO (FAO/WHO, 2013), 17 documents which deal with the risk assessment of NMs in the food sector were released within such federal programs. Safe reference levels on nanoparticles impact on the human body were developed and implemented in the sanitary regulation for the nanoforms of silver and titanium dioxide and, single wall carbon nanotubes (FAO/WHO, 2013).

Other countries included in this overview are Brazil, India, Japan, China, Malaysia, Iran, Thailand, Taiwan, Australia, New Zealand, US, South Africa, South Korea, Switzerland, and the countries of the European Union.

*EurekAlert link added Sept. 14, 2015.

Hector Barron Escobar and his virtual nanomaterial atomic models for the oil, mining, and energy industries

I think there’s some machine translation at work in the Aug. 27, 2015 news item about Hector Barron Escobar on Azonano,

By using supercomputers the team creates virtual atomic models that interact under different conditions before being taken to the real world, allowing savings in time and money.

With the goal of potentiating the oil, mining and energy industries, as well as counteracting the emission of greenhouse gases, the nanotechnologist Hector Barron Escobar designs more efficient and profitable nanomaterials.

The Mexican, who lives in Australia, studies the physical and chemical properties of platinum and palladium, metals with excellent catalytic properties that improve processes in petrochemistry, solar cells and fuel cells, which because of their scarcity have a high and unprofitable price, hence the need to analyze their properties and make them long lasting.

Structured materials that the specialist in nanotechnology designs can be implemented in the petrochemical and automotive industries. In the first, they accelerate reactions in the production of hydrocarbons, and in the second, nanomaterials are placed in catalytic converters of vehicles to transform the pollutants emitted by combustion into less harmful waste.

An August 26, 2015 Investigación y Desarrollo press release on Alpha Galileo, which originated the news item, continues Barron Escobar’s profile,

PhD Barron Escobar, who majored in physics at the National University of Mexico (UNAM), says that these are created by using supercomputers to make virtual atomic models interact under different conditions before being taken to the real world.

Barron recounts how he came to Australia at the invitation of his doctoral advisor, Amanda Partner, with whom he analyzed the electronic properties of gold in the United States.

He explains that using computer models in the Virtual Nanoscience Laboratory (VNLab) in Australia, he creates nanoparticles that interact in different environmental conditions such as temperature and pressure. He also analyzes their mechanical and electronic properties, which provide specific information about behavior and gives the best working conditions. Together, these data serve to establish appropriate patterns or trends in a particular application.

The work of the research team serves as a guide for experts from the University of New South Wales in Australia, with which they cooperate, to build nanoparticles with specific functions. “This way we perform virtual experiments, saving time, money and offer the type of material conditions and ideal size for a specific catalytic reaction, which by the traditional way would cost a lot of money trying to find what is the right substance” Barron Escobar comments.

Currently he designs nanomaterials for the mining company Orica, because in this industry explosives need to be controlled in order to avoid damaging the minerals or the environment.

Research is also immersed in the creation of fuel cells; with the use of the catalysts designed by Barron it is possible to produce more electricity without polluting.

Additionally, they enhance the effectiveness of catalytic converters in petrochemistry, where these materials help accelerate oxidation processes of hydrogen and carbon, which are present in all chemical reactions when fuel and gasoline are created. “We can identify the ideal particles for improving this type of reactions.”

The nanotechnology specialist also seeks to analyze the catalytic properties of bimetallic materials like titanium, ruthenium and gold, as well as how they react according to size, shape and composition.

Barron Escobar chose to study nanomaterials because it is interesting to see how matter at the nano level completely changes its properties: at a large scale it has a definite color but takes on another at the nanoscale, and many applications can be obtained with these metals.

For anyone interested in Orica, there’s more here on their website; as for Dr. Hector Barron Escobar, there’s this webpage on  Australia’s Commonwealth Scientific and Industrial Research Organisation (CSIRO) website.

Risk assessments not the only path to nanotechnology regulation

Nanowerk has republished an essay about nanotechnology regulation from Australia’s The Conversation in an Aug. 25, 2015 news item (Note: A link has been removed),

When it comes to nanotechnology, Australians have shown strong support for regulation and safety testing.

One common way of deciding whether and how nanomaterials should be regulated is to conduct a risk assessment. This involves calculating the risk a substance or activity poses based on the associated hazards or dangers and the level of exposure to people or the environment.

However, our recent review (“Risk Analysis of Nanomaterials: Exposing Nanotechnology’s Naked Emperor”) found some serious shortcomings of the risk assessment process for determining the safety of nanomaterials.

We have argued that these shortcomings are so significant that risk assessment is effectively a naked emperor [reference to a children’s story “The Emperor’s New Clothes“].

The original Aug. 24, 2015 article written by Fern Wickson (Scientist/Program Coordinator at GenØk – Centre for Biosafety in Norway) and Georgia Miller (PhD candidate at UNSW [University of New South Wales], Australia) points out an oft ignored issue with regard to nanotechnology regulation,

Risk assessment has been the dominant decision-aiding tool used by regulators of new technologies for decades, despite it excluding key questions that the community cares about. [emphasis mine] For example: do we need this technology; what are the alternatives; how will it affect social relations, and; who should be involved in decision making?

Wickson and Miller also note more frequently discussed issues,

A fundamental problem is a lack of nano-specific regulation. Most sector-based regulation does not include a “trigger” for nanomaterials to face specific risk assessment. Where a substance has been approved for use in its macro form, it requires no new assessment.

Even if such a trigger were present, there is also currently no cross-sectoral or international agreement on the definition of what constitutes a nanomaterial.

Another barrier is the lack of measurement capability and validated methods for safety testing. We still do not have the means to conduct routine identification of nanomaterials in the complex “matrix” of finished products or the environment.

This makes supply chain tracking and safety testing under real-world conditions very difficult. Despite ongoing investment in safety research, the lack of validated test methods and different methods yielding diverse results allows scientific uncertainty to persist.

With regard to the first problem, the assumption that if a material at the macroscale is safe, then the same is true at the nanoscale informs regulation in Canada and, as far as I’m aware, every other jurisdiction that has any type of nanomaterial regulation. I’ve had mixed feelings about this. On the one hand, we haven’t seen any serious problems associated with the use of nanomaterials but on the other hand, these problems can be slow to emerge.

The second issue mentioned, the lack of a consistent definition internationally, seems to be a relatively common problem in a lot of areas. As far as I’m aware, there aren’t that many international agreements for safety measures. Nuclear weapons and endangered animals and plants (CITES) being two of the few that come to mind.

The lack of protocols for safety testing of nanomaterials mentioned in the last paragraph of the excerpt is of rising concern. For example, there’s my July 7, 2015 posting featuring two efforts: Nanotechnology research protocols for Environment, Health and Safety Studies in US and a nanomedicine characterization laboratory in the European Union. Despite this and other efforts, I do think more can and should be done to standardize tests and protocols (without killing new types of research and results which don’t fit the models).

The authors do seem to be presenting a circular argument with this (from their Aug. 24, 2015 article; Note: A link has been removed),

Indeed, scientific uncertainty about nanomaterials’ risk profiles is a key barrier to their reliable assessment. A review funded by the European Commission concluded that:

[…] there is still insufficient data available to conduct the in depth risk assessments required to inform the regulatory decision making process on the safety of NMs [nanomaterials].

Reliable assessment of any chemical or drug is a major problem. We do have some good risk profiles but how many times have pharmaceutical companies developed a drug that passed successfully through human clinical trials only to present a serious risk when released to the general population? Assessing risk is a very complex problem, even with risk profiles and extensive testing.

Unmentioned throughout the article are naturally occurring nanoparticles (nanomaterials) and those created inadvertently through some manufacturing or other process. In fact, we have been ingesting nanomaterials throughout time. That said, I do agree we need to carefully consider the impact that engineered nanomaterials could have on us and the environment as ever more are being added.

To that end, the authors make some suggestions (Note: Links have been removed),

There are well-developed alternate decision-aiding tools available. One is multicriteria mapping, which seeks to evaluate various perspectives on an issue. Another is problem formulation and options assessment, which expands science-based risk assessment to engage a broader range of individuals and perspectives.

There is also pedigree assessment, which explores the framing and choices taking place at each step of an assessment process so as to better understand the ambiguity of scientific inputs into political processes.

Another, though less well developed, approach popular in Europe involves a shift from risk to innovation governance, with emphasis on developing “responsible research and innovation”.

I have some hesitation about recommending this read due to Georgia Miller’s involvement and the fact that I don’t have the time to check all the references. Miller was a spokesperson for Friends of the Earth (FoE) Australia, a group which led a substantive campaign against ‘nanosunscreens’. Here’s a July 20, 2010 posting where I featured some cherrypicking/misrepresentation of data by FoE in the persons of Georgia Miller and Ian Illuminato.

My Feb. 9, 2012 posting highlights the unintended consequences (avoidance of all sunscreens by some participants in a survey) of the FoE’s campaign in Australia (Note [1]: The percentage of people likely to avoid all sunscreens due to their concerns with nanoparticles in their sunscreens was originally reported to be 17%; Note [2]: Australia has the highest incidence of skin cancer in the world),

Feb.21.12 correction: According to the information in the Feb. 20, 2012 posting on 2020 Science, the percentage of Australians likely to avoid using sunscreens is 13%,

This has just landed in my email in box from Craig Cormick at the Department of Industry, Innovation, Science, Research and Tertiary Education in Australia, and I thought I would pass it on given the string of posts on nanoparticles in sunscreens on 2020 Science over the past few years:

“An online poll of 1,000 people, conducted in January this year, shows that one in three Australians had heard or read stories about the risks of using sunscreens with nanoparticles in them,” Dr Cormick said.

“Thirteen percent of this group were concerned or confused enough that they would be less likely to use any sunscreen, whether or not it contained nanoparticles, putting them selves at increased risk of developing potentially deadly skin cancers.

“The study also found that while one in five respondents stated they would go out of their way to avoid using sunscreens with nanoparticles in them, over three in five would need to know more information before deciding.”

This article with Fern Wickson (with whom I don’t always agree perfectly but who hasn’t played any games with research that I know of) helps somewhat but it’s going to take more than this before I feel comfortable recommending Ms. Miller’s work for further reading.

Lightning strikes to create glass (reshaping rock at the atomic level)

This features glass (more specifically glass tubes), one of my interests, and it’s a fascinating story. From an Aug. 6, 2015 news item on Azonano,

At a rock outcropping in southern France, a jagged fracture runs along the granite. The surface in and around the crevice is discolored black, as if wet or covered in algae.

But, according to a new paper coauthored by the University of Pennsylvania’s Reto Gieré, the real explanation for the rock’s unusual features is more dramatic: a powerful bolt of lightning.

Here’s what the rock looks like afterwards,

A rock fulgurite revealed that lightning strikes alter quartz’s crystal structure on the atomic level. Courtesy: University of Pennsylvania

The researchers have also provided an image taken under a transmission electron microscope,

Gieré and colleagues observed the parallel lines of shock lamellae under a transmission electron microscope. Courtesy: University of Pennsylvania

An Aug. 5, 2015 University of Pennsylvania news release, which originated the news item, provides more technical details about the research,

Using extremely high-resolution microscopy, Gieré, professor and chair of the Department of Earth and Environmental Science in Penn’s School of Arts & Sciences, and his coauthors found that not only had the lightning melted the rock’s surface, resulting in a distinctive black “glaze,” but had transferred enough pressure to deform a thin layer of quartz crystals beneath the surface, resulting in distinct atomic-level structures called shock lamellae.

Prior to this study, the only natural events known to create this type of lamellae were meteorite impacts.

“I think the most exciting thing about this study is just to see what lightning can do,” Gieré said. “To see that lightning literally melts the surface of a rock and changes crystal structures, to me, is fascinating.”

Gieré said the finding serves as a reminder to geologists not to rush to interpret shock lamellae as indicators of a meteorite strike.

“Most geologists are careful; they don’t just use one observation,” he said, “But this is a good reminder to always use multiple observations to draw big conclusions, that there are multiple mechanisms that can result in a similar effect.”

Gieré collaborated on the study with Wolfhard Wimmenauer and Hiltrud Müller-Sigmund of Albert-Ludwigs-Universität, Richard Wirth of GeoForschungsZentrum Potsdam and Gregory R. Lumpkin and Katherine L. Smith of the Australian Nuclear Science and Technology Organization.

The paper was published in the journal American Mineralogist.

Geologists have long known that lightning, through rapid increases in temperature as well as physical and chemical effects, can alter sediments. When it strikes sand, for example, lightning melts the grains, which fuse and form glass tubes known as fulgurites.

Fulgurites can also form when lightning strikes other materials, including rock and soil. The current study examined a rock fulgurite found near Les Pradals, France. Gieré and colleagues took samples from the rock, then cut thin sections and polished them.

Under an optical microscope, they found that the outer black layer — the fulgurite itself — appeared shiny, “almost like a ceramic glaze,” Gieré said.

The layer was also porous, almost like a foam, due to the lightning’s heat vaporizing the rock’s surface. A chemical analysis of the fulgurite layer turned up elevated levels of sulfur dioxide and phosphorous pentoxide, which the researchers believe may have derived from lichen living on the rock’s surface at the time of the lightning strike.

The team further studied the samples using a transmission electron microscope, which allows users to examine specimens at the atomic level. This revealed that the fulgurite lacked any crystalline structure, consistent with it representing a melt formed through the high heat from the lightning strike.

But, in a layer of the sample immediately adjacent to the fulgurite, slightly deeper in the rock, the researchers spotted an unusual feature: a set of straight, parallel lines known as shock lamellae. This feature occurs when the crystal structure of quartz or other minerals deform in response to a vast wave of pressure.

“It’s like if someone pushes you, you rearrange your body to be comfortable,” Gieré said. “The mineral does the same thing.”

The lamellae were present in a layer of the rock only about three micrometers wide, indicating that the energy of the lightning bolt’s impact dissipated over that distance.

This characteristic deformation of crystals had previously only been seen in minerals from sites where meteorites struck. Shock lamellae are believed to form at pressures up to more than 10 gigapascals, or with 20 million times greater force than a boxer’s punch.

Gieré and colleagues hope to study rock fulgurites from other sites to understand the physical and chemical effects of lightning bolts on rocks in greater detail.

Another takeaway for geologists, rock climbers and hikers who spend time on rocks in high, exposed places is to beware when they see the tell-tale shiny black glaze of a rock fulgurite, as it might indicate a site prone to lightning strikes.

“Once it was pointed out to me, I started seeing it again and again,” he said. “I’ve had some close calls with thunderstorms in the field, where I’ve had to throw down my metal instruments and run.”

Here’s a link to and a citation for the paper,

Lightning-induced shock lamellae in quartz by Reto Gieré, Wolfhard Wimmenauer, Hiltrud Müller-Sigmund, Richard Wirth, Gregory R. Lumpkin, and Katherine L. Smith. American Mineralogist, July 2015, v. 100, no. 7, p. 1645-1648. doi: 10.2138/am-2015-5218

This paper is behind a paywall.