Monthly Archives: June 2016

A treasure trove of molecule and battery data released to the public

Scientists working on The Materials Project have taken the notion of open science to their hearts and opened up access to their data according to a June 9, 2016 news item on Nanowerk,

The Materials Project, a Google-like database of material properties aimed at accelerating innovation, has released an enormous trove of data to the public, giving scientists working on fuel cells, photovoltaics, thermoelectrics, and a host of other advanced materials a powerful tool to explore new research avenues. But it has become a particularly important resource for researchers working on batteries. Co-founded and directed by Lawrence Berkeley National Laboratory (Berkeley Lab) scientist Kristin Persson, the Materials Project uses supercomputers to calculate the properties of materials based on first-principles quantum-mechanical frameworks. It was launched in 2011 by the U.S. Department of Energy’s (DOE) Office of Science.

A June 8, 2016 Berkeley Lab news release, which originated the news item, provides more explanation about The Materials Project,

The idea behind the Materials Project is that it can save researchers time by predicting material properties without needing to synthesize the materials first in the lab. It can also suggest new candidate materials that experimentalists had not previously dreamed up. With a user-friendly web interface, users can look up the calculated properties, such as voltage, capacity, band gap, and density, for tens of thousands of materials.

Two sets of data were released last month: nearly 1,500 compounds investigated for multivalent intercalation electrodes and more than 21,000 organic molecules relevant for liquid electrolytes as well as a host of other research applications. Batteries with multivalent cathodes (which have multiple electrons per mobile ion available for charge transfer) are promising candidates for reducing cost and achieving higher energy density than that available with current lithium-ion technology.

The sheer volume and scope of the data is unprecedented, said Persson, who is also a professor in UC Berkeley’s Department of Materials Science and Engineering. “As far as the multivalent cathodes, there’s nothing similar in the world that exists,” she said. “To give you an idea, experimentalists are usually able to focus on one of these materials at a time. Using calculations, we’ve added data on 1,500 different compositions.”

While other research groups have made their data publicly available, what makes the Materials Project so useful are the online tools to search all that data. The recent release includes two new web apps—the Molecules Explorer and the Redox Flow Battery Dashboard—plus an add-on to the Battery Explorer web app enabling researchers to work with other ions in addition to lithium.

“Not only do we give the data freely, we also give algorithms and software to interpret or search over the data,” Persson said.
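
Since the project provides software as well as data, here is a minimal sketch (mine, not from the news release) of how a researcher might query the Materials Project through its public API using the pymatgen library. It assumes a free API key from materialsproject.org; the exact import path, method signatures, and property names can vary between pymatgen versions, so treat it as illustrative rather than definitive.

# Minimal sketch of querying the Materials Project via pymatgen's MPRester.
# Assumes a free API key from materialsproject.org; import paths, method
# signatures, and property names may differ between pymatgen versions.
from pymatgen.ext.matproj import MPRester

API_KEY = "YOUR_API_KEY"  # hypothetical placeholder, not a real key

mpr = MPRester(API_KEY)

# Look up magnesium-containing compounds of up to three elements and pull
# a few of the calculated properties exposed in the project's web apps.
results = mpr.query(
    criteria={"elements": {"$all": ["Mg"]}, "nelements": {"$lte": 3}},
    properties=["material_id", "pretty_formula", "band_gap", "density"],
)

for entry in results[:10]:
    print(entry["material_id"], entry["pretty_formula"],
          entry["band_gap"], entry["density"])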

The Redox Flow Battery app gives scientific parameters as well as techno-economic ones, so battery designers can quickly rule out a molecule that might work well but be prohibitively expensive. The Molecules Explorer app will be useful to researchers far beyond the battery community.

“For multivalent batteries it’s so hard to get good experimental data,” Persson said. “The calculations provide rich and robust benchmarks to assess whether the experiments are actually measuring a valid intercalation process or a side reaction, which is particularly difficult for multivalent energy technology because there are so many problems with testing these batteries.”

Here’s a screen capture from the Battery Explorer app,

The Materials Project’s Battery Explorer app now allows researchers to work with other ions in addition to lithium. Courtesy: The Materials Project

The news release goes on to describe a new discovery made possible by The Materials Project (Note: A link has been removed),

Together with Persson, Berkeley Lab scientist Gerbrand Ceder, postdoctoral associate Miao Liu, and MIT graduate student Ziqin Rong, the Materials Project team investigated some of the more promising materials in detail for high multivalent ion mobility, which is the most difficult property to achieve in these cathodes. This led the team to materials known as thiospinels. One of these thiospinels has double the capacity of the currently known multivalent cathodes and was recently synthesized and tested in the lab by JCESR researcher Linda Nazar of the University of Waterloo, Canada.

“These materials may not work well the first time you make them,” Persson said. “You have to be persistent; for example you may have to make the material very phase pure or smaller than a particular particle size and you have to test them under very controlled conditions. There are people who have actually tried this material before and discarded it because they thought it didn’t work particularly well. The power of the computations and the design metrics we have uncovered with their help is that it gives us the confidence to keep trying.”

The researchers were able to double the energy capacity of what had previously been achieved for this kind of multivalent battery. The study has been published in the journal Energy & Environmental Science in an article titled, “A High Capacity Thiospinel Cathode for Mg Batteries.”

“The new multivalent battery works really well,” Persson said. “It’s a significant advance and an excellent proof-of-concept for computational predictions as a valuable new tool for battery research.”

Here’s a link to and a citation for the paper,

A high capacity thiospinel cathode for Mg batteries by Xiaoqi Sun, Patrick Bonnick, Victor Duffort, Miao Liu, Ziqin Rong, Kristin A. Persson, Gerbrand Ceder, and Linda F. Nazar. Energy Environ. Sci., 2016, Advance Article. DOI: 10.1039/C6EE00724D. First published online 24 May 2016

This paper seems to be behind a paywall.

Getting back to the news release, there’s more about The Materials Project and its user community,

The Materials Project has attracted more than 20,000 users since launching five years ago. Every day about 20 new users register and 300 to 400 people log in to do research.

One of those users is Dane Morgan, a professor of engineering at the University of Wisconsin-Madison who develops new materials for a wide range of applications, including highly active catalysts for fuel cells, stable low-work function electron emitter cathodes for high-powered microwave devices, and efficient, inexpensive, and environmentally safe solar materials.

“The Materials Project has enabled some of the most exciting research in my group,” said Morgan, who also serves on the Materials Project’s advisory board. “By providing easy access to a huge database, as well as tools to process that data for thermodynamic predictions, the Materials Project has enabled my group to rapidly take on materials design projects that would have been prohibitive just a few years ago.”

More materials are being calculated and added to the database every day. In two years, Persson expects another trove of data to be released to the public.

“This is the way to reach a significant part of the research community, to reach students while they’re still learning material science,” she said. “It’s a teaching tool. It’s a science tool. It’s unprecedented.”

Supercomputing clusters at the National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility hosted at Berkeley Lab, provide the infrastructure for the Materials Project.

Funding for the Materials Project is provided by the Office of Science (US Department of Energy), including support through JCESR [Joint Center for Energy Storage Research].

Happy researching!

New elements named (provisionally)

They say it’s provisional, but I suspect it would take an act of god to change the proposed names. From a June 8, 2016 blog posting (scroll down about 25% of the way) on the International Union of Pure and Applied Chemistry (IUPAC) website,

IUPAC is naming the four new elements nihonium, moscovium, tennessine, and oganesson

Following earlier reports that the claims for discovery of these elements have been fulfilled [1, 2], the discoverers have been invited to propose names and the following are now disclosed for public review:

  • Nihonium and symbol Nh, for the element 113,
  • Moscovium and symbol Mc, for the element 115,
  • Tennessine and symbol Ts, for the element 117, and
  • Oganesson and symbol Og, for the element 118.

The IUPAC Inorganic Chemistry Division has reviewed and considered these proposals and recommends these for acceptance. A five-month public review is now set, expiring 8 November 2016, prior to the formal approval by the IUPAC Council.

I can’t figure out how someone from the public might offer a comment about the names.

There’s more from the posting about what kinds of names are acceptable and how the names in this set of four were arrived at,

The guidelines for naming the elements were recently revised [3] and shared with the discoverers to assist in their proposals. Keeping with tradition, newly discovered elements can be named after:
(a) a mythological concept or character (including an astronomical object),
(b) a mineral or similar substance,
(c) a place, or geographical region,
(d) a property of the element, or
(e) a scientist.
The names of all new elements in general would have an ending that reflects and maintains historical and chemical consistency. This would be in general “-ium” for elements belonging to groups 1-16, “-ine” for elements of group 17 and “-on” for elements of group 18. Finally, the names for new chemical elements in English should allow proper translation into other major languages.

For the element with atomic number 113 the discoverers at RIKEN Nishina Center for Accelerator-Based Science (Japan) proposed the name nihonium and the symbol Nh. Nihon is one of the two ways to say “Japan” in Japanese, and literally means “the Land of Rising Sun”. The name is proposed to make a direct connection to the nation where the element was discovered. Element 113 is the first element to have been discovered in an Asian country. While presenting this proposal, the team headed by Professor Kosuke Morita pays homage to the trailblazing work by Masataka Ogawa done in 1908 surrounding the discovery of element 43. The team also hopes that pride and faith in science will displace the lost trust of those who suffered from the 2011 Fukushima nuclear disaster.

For the element with atomic number 115 the name proposed is moscovium with the symbol Mc and for element with atomic number 117, the name proposed is tennessine with the symbol Ts. These are in line with tradition honoring a place or geographical region and are proposed jointly by the discoverers at the Joint Institute for Nuclear Research, Dubna (Russia), Oak Ridge National Laboratory (USA), Vanderbilt University (USA) and Lawrence Livermore National Laboratory (USA).

Moscovium is in recognition of the Moscow region and honors the ancient Russian land that is the home of the Joint Institute for Nuclear Research, where the discovery experiments were conducted using the Dubna Gas-Filled Recoil Separator in combination with the heavy ion accelerator capabilities of the Flerov Laboratory of Nuclear Reactions.

Tennessine is in recognition of the contribution of the Tennessee region, including Oak Ridge National Laboratory, Vanderbilt University, and the University of Tennessee at Knoxville, to superheavy element research, including the production and chemical separation of unique actinide target materials for superheavy element synthesis at ORNL’s High Flux Isotope Reactor (HFIR) and Radiochemical Engineering Development Center (REDC).

For the element with atomic number 118 the collaborating teams of discoverers at the Joint Institute for Nuclear Research, Dubna (Russia) and Lawrence Livermore National Laboratory (USA) proposed the name oganesson and symbol Og. The proposal is in line with the tradition of honoring a scientist and recognizes Professor Yuri Oganessian (born 1933) for his pioneering contributions to transactinoid elements research. His many achievements include the discovery of superheavy elements and significant advances in the nuclear physics of superheavy nuclei including experimental evidence for the “island of stability”.

“It is a pleasure to see that specific places and names (country, state, city, and scientist) related to the new elements is recognized in these four names. Although these choices may perhaps be viewed by some as slightly self-indulgent, the names are completely in accordance with IUPAC rules”, commented Jan Reedijk, who corresponded with the various laboratories and invited the discoverers to make proposals. “In fact, I see it as thrilling to recognize that international collaborations were at the core of these discoveries and that these new names also make the discoveries somewhat tangible.”

So, let’s welcome Tennessine, Moscovium, Nihonium, and Oganesson to the periodic table of elements. I imagine Tom Lehrer’s “The Elements” song will be updated soon. In the meantime, we have this video from ASAP Science, which includes the new elements under their placeholder names from when the addition was first publicized in January 2016 (all of the placeholder names start with U),

Enjoy!

Canadian science petition and a science diplomacy event in Ottawa on June 21, 2016*

The Canadian science policy and science funding scene is hopping these days. Canada’s Minister of Science, Kirsty Duncan, announced a new review of federal funding for fundamental science on Monday, June 13, 2016 (see my June 15, 2016 post for more details and a brief critique of the panel) and now, there’s a new Parliamentary campaign for a science advisor and a Canadian Science Policy Centre event on science diplomacy.

Petition for a science advisor

Kennedy Stewart, Canadian Member of Parliament (Burnaby South) and NDP (New Democratic Party) Science Critic, has launched a campaign for independent science advice for the government. Here’s more from a June 15, 2016 announcement (received via email),

After years of muzzling and misuse of science by the Conservatives, our scientists need lasting protections in order to finally turn the page on the lost Harper decade.

I am writing to ask your support for a new campaign calling for an independent science advisor.

While I applaud the new Liberal government for their recent promises to support science, we have a long way to go to rebuild Canada’s reputation as a global knowledge leader. As NDP Science Critic, I continue to push for renewed research funding and measures to restore scientific integrity.

Canada badly needs a new science advisor to act as a public champion for research and evidence in Ottawa. Although the Trudeau government has committed to creating a Chief Science Officer, the Minister of Science – Dr. Kirsty Duncan – has yet to state whether or not the new officer will be given real independence and a mandate protected by law.

Today, we’re launching a new parliamentary petition calling for just that: https://petitions.parl.gc.ca/en/Petition/Sign/e-415

Can you add your name right now?

Canada’s last national science advisor lacked independence from the government and was easily eliminated in 2008 after the anti-science Harper Conservatives took power.

That’s why the Minister needs to build the new CSO to last and entrench the position in legislation. Rhetoric and half-measures aren’t good enough.

Please add your voice for public science by signing our petition to the Minister of Science.

Thank you for your support,

Breakfast session on science diplomacy

On June 21, 2016, the Canadian Science Policy Centre is presenting a breakfast session on Parliament Hill in Ottawa (from an announcement received via email),

“Science Diplomacy in the 21st Century: The Potential for Tomorrow”
Remarks by Dr. Vaughan Turekian,
Science and Technology Adviser to Secretary of State John Kerry

Event Information
Tuesday, June 21, 2016, Room 238-S, Parliament Hill
7:30am – 8:00am – Continental Breakfast
8:00am – 8:10am – Opening Remarks, MP Terry Beech
8:10am – 8:45am – Dr. Vaughan Turekian Remarks and Q&A

Dr. Turekian’s visit comes during a pivotal time as Canada is undergoing fundamental changes in numerous policy directions surrounding international affairs. With Canada’s comeback on the world stage, there is great potential for science to play an integral role in the conduct of our foreign affairs. The United States is currently one of the leaders in science diplomacy, and as such, listening to Dr. Turekian will provide a unique perspective on the best practices of science diplomacy in the US.

Actually, Dr. Turekian’s visit comes before a North American Summit being held in Ottawa on June 29, 2016 and which has already taken a scientific turn. From a June 16, 2016 news item on phys.org,

Some 200 intellectuals, scientists and artists from around the world urged the leaders of Mexico, the United States and Canada on Wednesday to save North America’s endangered migratory Monarch butterfly.

US novelist Paul Auster, environmental activist Robert F. Kennedy Jr., Canadian poet [Canadian media usually describe her as a writer] Margaret Atwood, British writer Ali Smith and India’s women’s and children’s minister Maneka Sanjay Gandhi were among the signatories of an open letter to the three leaders.

US President Barack Obama, Canadian Prime Minister Justin Trudeau and Mexican President Enrique Pena Nieto will hold a North American summit in Ottawa on June 29 [2016].

The letter by the so-called Group of 100 calls on the three leaders to “take swift and energetic actions to preserve the Monarch’s migratory phenomenon” when they meet this month.

In 1996-1997, the butterflies covered 18.2 hectares (45 acres) of land in Mexico’s central mountains.

It fell to 0.67 hectares in 2013-2014 but rose to 4 hectares this year. Their population is measured by the territory they cover.

They usually arrive in Mexico between late October and early November and head back north in March.

Given this turn of events, I wonder how Turekian, given that he’s held his current position for less than a year, might (or might not) approach the question of Monarch butterflies and diplomacy.

I did a little research about Turekian and found this Sept. 10, 2015 news release announcing his appointment as the Science and Technology Adviser to the US Secretary of State,

On September 8, Dr. Vaughan Turekian, formerly the Chief International Officer at The American Association for the Advancement of Science (AAAS), was named the 5th Science and Technology Adviser to the Secretary of State. In this capacity, Dr. Turekian will advise the Secretary of State and the Under Secretary for Economic Growth, Energy, and the Environment on international environment, science, technology, and health matters affecting the foreign policy of the United States. Dr. Turekian will draw upon his background in atmospheric chemistry and extensive policy experience to promote science, technology, and engineering as integral components of U.S. diplomacy.

Dr. Turekian brings both technical expertise and 14 years of policy experience to the position. As former Chief International Officer for The American Association for the Advancement of Science (AAAS) and Director of AAAS’s Center for Science Diplomacy, Dr. Turekian worked to build bridges between nations based on shared scientific goals, placing special emphasis on regions where traditional political relationships are strained or do not exist. As Editor-in-Chief of Science & Diplomacy, an online quarterly publication, Dr. Turekian published original policy pieces that have served to inform international science policy recommendations. Prior to his work at AAAS, Turekian worked at the State Department as Special Assistant and Adviser to the Under Secretary for Global Affairs on issues related to sustainable development, climate change, environment, energy, science, technology, and health and as a Program Director for the Committee on Global Change Research at the National Academy of Sciences where he was study director for a White House report on climate change science.

Turekian’s last editorial for Science & Diplomacy, dated June 30, 2015, features a title (Evolving Institutions for Twenty-First Century [Science] Diplomacy) that bears a resemblance to the title of his talk in Ottawa, and perhaps it provides a preview (spoilers),

Over the recent decade, its treatment of science and technology issues has increased substantially, with a number of cover stories focused on topics that bridge science, technology, and foreign affairs. This thought leadership reflects a broader shift in thinking within institutions throughout the world about the importance of better integrating the communities of science and diplomacy in novel ways.

In May, a high-level committee convened by Japan’s minister of foreign affairs released fifteen recommendations for how Japan could better incorporate its scientific and technological expertise into its foreign policy. While many of the recommendations were to be predicted, including the establishment of the position of science adviser to the foreign minister, the breadth of the recommendations highlighted numerous new ways Japan could leverage science to meet its foreign policy objectives. The report itself marks a turning point for an institution looking to upgrade its ability to meet and shape the challenges of this still young century.

On the other side of the Pacific, the U.S. National Academy of Sciences released its own assessment of science in the U.S. Department of State. Their report, “Diplomacy for the 21st Century: Embedding a Culture of Science and Technology Throughout the Department of State,” builds on its landmark 1999 report, which, among other things, established the position of science and technology adviser to the secretary of state. The twenty-seven recommendations in the new report are wide ranging, but as a whole speak to the fact that while one of the oldest U.S. institutions of government has made much progress toward becoming more scientifically and technologically literate, there are many more steps that could be taken to leverage science and technology as a key element of U.S. foreign policy.

These two recent reports highlight the importance of foreign ministries as vital instruments of science diplomacy. These agencies of foreign affairs, like their counterparts around the world, are often viewed as conservative and somewhat inflexible institutions focused on stability rather than transformation. However, they are adjusting to a world in which developments in science and technology move rapidly and affect relationships and interactions at bilateral, regional, and global scales.

At the same time that some traditional national instruments of diplomacy are evolving to better incorporate science, international science institutions are also evolving to meet the diplomatic and foreign policy drivers of this more technical century. …

It’s an interesting read and I’m glad to see the mention of Japan in his article. I’d like to see Canadian science advice and policy initiatives take more notice of the rest of the world rather than focusing almost solely on what’s happening in the US and Great Britain (see my June 15, 2016 post for an example of what I mean). On another note, it was disconcerting to find out that Turekian believes that we are only now moving past the Cold War politics of the past.

Unfortunately for anyone wanting to attend the talk, ticket sales have ended even though they were supposed to be open until June 17, 2016. And, there doesn’t seem to be a wait list.

You may want to try arriving at the door and hoping that people have cancelled or failed to arrive, thereby freeing up a ticket. Should you be an MP (Member of Parliament), Senator, or guest of the Canadian Science Policy Conference, you get a free ticket. Should you be anyone else, expect to pay $15, assuming no one is attempting to scalp (sell one for more than it cost) these tickets.

*’ … on June’ in headline changed to ‘ … on June 21, 2016’ on June 17, 2016.

Science literacy, science advice, the US Supreme Court, and Britain’s House of Commons

This ‘think’ piece is going to cover a fair bit of ground including science literacy in the general public and in the US Supreme Court, and what that might mean for science advice and UK Members of Parliament (MPs).

Science literacy generally and in the US Supreme Court

A science literacy report for the US National Academy of Sciences (NAS), due sometime in early to mid-2017, is being crafted with an eye to capturing a different perspective, according to a March 24, 2016 University of Wisconsin-Madison news release by Terry Devitt,

What does it mean to be science literate? How science literate is the American public? How do we stack up against other countries? What are the civic implications of a public with limited knowledge of science and how it works? How is science literacy measured?

These and other questions are under the microscope of a 12-member National Academy of Sciences (NAS) panel — including University of Wisconsin—Madison Life Sciences Communication Professor Dominique Brossard and School of Education Professor Noah Feinstein — charged with sorting through the existing data on American science and health literacy and exploring the association between knowledge of science and public perception of and support for science.

The committee — composed of educators, scientists, physicians and social scientists — will take a hard look at the existing data on the state of U.S. science literacy, the questions asked, and the methods used to measure what Americans know and don’t know about science and how that knowledge has changed over time. Critically for science, the panel will explore whether a lack of science literacy is associated with decreased public support for science or research.

Historically, policymakers and leaders in the scientific community have fretted over a perceived lack of knowledge among Americans about science and how it works. A prevailing fear is that an American public unequipped to come to terms with modern science will ultimately have serious economic, security and civic consequences, especially when it comes to addressing complex and nuanced issues like climate change, antibiotic resistance, emerging diseases, environment and energy choices.

While the prevailing wisdom, inspired by past studies, is that Americans don’t stack up well in terms of understanding science, Brossard is not so convinced. Much depends on what kinds of questions are asked, how they are asked, and how the data is analyzed.

It is very easy, she argues, to do bad social science and past studies may have measured the wrong things or otherwise created a perception about the state of U.S. science literacy that may or may not be true.

“How do you conceptualize scientific literacy? What do people need to know? Some argue that scientific literacy may be as simple as an understanding of how science works, the nature of science, [emphasis mine]” Brossard explains. “For others it may be a kind of ‘civic science literacy,’ where people have enough knowledge to be informed and make good decisions in a civics context.”

Science literacy may not be just for the public; it would seem that US Supreme Court judges may not have a basic understanding of how science works either. David Bruggeman’s March 24, 2016 posting (on his Pasco Phronesis blog) describes a then-current case before the Supreme Court (Justice Antonin Scalia has since died) (Note: Links have been removed),

It’s a case concerning aspects of the University of Texas admissions process for undergraduates and the case is seen as a possible means of restricting race-based considerations for admission. While I think the arguments in the case will likely revolve around factors far removed from science and/or technology, there were comments raised by two Justices that struck a nerve with many scientists and engineers.

Both Justice Antonin Scalia and Chief Justice John Roberts raised questions about the validity of having diversity where science and scientists are concerned [emphasis mine]. Justice Scalia seemed to imply that diversity wasn’t essential for the University of Texas as most African-American scientists didn’t come from schools at the level of the University of Texas (considered the best university in Texas). Chief Justice Roberts was a bit more plain about not understanding the benefits of diversity. He stated, “What unique perspective does a black student bring to a class in physics?”

To that end, Dr. S. James Gates, theoretical physicist at the University of Maryland, and member of the President’s Council of Advisers on Science and Technology (and commercial actor) has an editorial in the March 25 [2016] issue of Science explaining that the value of having diversity in science does not accrue *just* to those who are underrepresented.

Dr. Gates relates his personal experience as a researcher and teacher of how people’s background inform their practice of science, and that two different people may use the same scientific method, but think about the problem differently.

I’m guessing that both Scalia and Roberts, and possibly others, believe that science is the discovery and accumulation of facts. In this worldview, science facts such as gravity are waiting for discovery and formulation into a ‘law’. They do not recognize that most science is a collection of beliefs and may be influenced by personal beliefs. For example, we believe we’ve proved the existence of the Higgs boson but no one associated with the research has ever stated unequivocally that it exists.

For judges who are under the impression that scientific facts are out there somewhere waiting to be discovered, diversity must seem irrelevant. It is not. Who you are affects the questions you ask and how you approach science. The easiest example is to look at how women were viewed when they were subjects in medical research. The fact that women’s physiology is significantly different (and not just in child-bearing ways) was never considered relevant when reporting results. Today, researchers consider not only gender, but age (to some extent), ethnicity, and more when examining results. It’s still not perfect but it was a step forward.

So when Brossard included “… an understanding of how science works, the nature of science …” as an aspect of science literacy, the judges seemed to present a good example of how not understanding science can have a major impact on how others live.

I’d almost forgotten this science literacy piece as I’d started the draft some months ago but then I spotted a news item about a science advice/MP ‘dating’ service in the UK.

Science advice and UK MPs

First, the news, then, the speculation (from a June 6, 2016 news item on ScienceDaily),

MPs have expressed an overwhelming willingness to use a proposed new service to swiftly link them with academics in relevant areas to help ensure policy is based on the latest evidence.

A June 6, 2016 University of Exeter press release, which originated the news item, provides more detail about the proposed service and the research providing the supporting evidence (Note: A link has been removed),

The government is pursuing a drive towards evidence-based policy, yet policy makers still struggle to incorporate evidence into their decisions. One reason for this is limited easy access to the latest research findings or to academic experts who can respond to questions about evidence quickly.

Researchers at Cardiff University, the University of Exeter and University College London have today published results of the largest study to date reporting MPs’ attitudes to evidence in policy making and their reactions to a proposed Evidence Information Service (EIS) – a rapid match-making advisory service that would work alongside existing systems to put MPs in touch with relevant academic experts.

Dr Natalia Lawrence, of the University of Exeter, said: “It’s clear from our study that politicians want to ensure their decisions incorporate the most reliable evidence, but it can sometimes be very difficult for them to know how to access the latest research findings. This new matchmaking service could be a quick and easy way for them to seek advice from cutting-edge researchers and to check their understanding and facts. It could provide a useful complement to existing highly-valued information services.”

The research, published today in the journal Evidence and Policy, reports the findings of a national consultation exercise between politicians and the public. The researchers recruited members of the public to interview their local parliamentary representative. In total, 86 politicians were contacted, with 56 interviews completed. The MPs indicated an overwhelming willingness to use a service such as the EIS, with 85% supporting the idea, but noted a number of potential reservations related to the logistics of the EIS such as response time and familiarity with the service. Yet, the MPs indicated that their logistical reservations could be overcome by accessing the EIS via existing highly-valued parliamentary information services such as those provided by the House of Commons and Lords Libraries. Furthermore, prior to rolling out the EIS on a nationwide basis, it would first need to be piloted.

Developing the proposed EIS in line with feedback from this consultation of MPs would offer the potential to provide policy makers with rapid, reliable and confidential evidence from willing volunteers from the research community.

Professor Chris Chambers, of Cardiff University, said: “The government has given a robust steer that MPs need to link in more with academics to ensure decisions shaping the future of the country are evidence-based. It’s heartening to see that there is a will to adopt this system and we now need to move into a phase of developing a service that is both simple and effective to meet this need.”

The next steps for the project are parallel consultations of academics and members of the public and a pilot of the EIS, using funding from the GW4 alliance of universities, made up of Bath, Bristol, Cardiff and Exeter.

What this study shows:
• The consultation shows that politicians recognise the importance of evidence-based policy making and agree on the need for an easier and more direct linkage between academic experts and policy makers.
• Politicians would welcome the creation of the EIS as a provider of rapid, reliable and confidential evidence.

What this study does not show:
• This study does not show how academics would provide evidence. This was a small-scale study which consulted politicians and has not attempted to give voice to the academic community.
• This study does not detail the mechanism of an operational EIS. Instead it indicates the need for a service such as the EIS and suggests ways in which the EIS can be operationalized.

Here’s a link to and a citation for the paper,

The Evidence Information Service as a new platform for supporting evidence-based policy: a consultation of UK parliamentarians by Natalia Lawrence, Jemma Chambers, Sinead Morrison, Sven Bestmann, Gerard O’Grady, Christopher Chambers, and Andrew Kythreotis. Evidence & Policy: A Journal of Research, Debate and Practice. DOI: http://dx.doi.org/10.1332/174426416X14643531912169 Published online: June 6, 2016

This paper is open access. *Corrected June 17, 2016.*

It’s an interesting idea and I can understand the appeal. However, operationalizing this ‘dating’ or ‘matchmaking’ service could prove quite complex. I appreciate the logistics issues but I’m a little more concerned about the MPs’ science literacy. Are they going to be like the two US justices who believe that science is the pursuit of immutable facts? What happens if two MPs are matched up with different scientists and those two scientists don’t agree about what the evidence says? Or what happens if one scientist is more cautious than the other? There are all kinds of pitfalls. I’m not arguing against the idea but it’s going to require a lot of careful consideration.

Luminescent upconversion nanoparticles could make imaging more efficient

Researchers at the University of Adelaide (Australia) have found a way to embed luminescent nanoparticles in glass, according to a June 8, 2016 news item on Nanotechnology,

This new “hybrid glass” successfully combines the properties of these special luminescent (or light-emitting) nanoparticles with the well-known aspects of glass, such as transparency and the ability to be processed into various shapes including very fine optical fibres.

The research, in collaboration with Macquarie University and University of Melbourne, has been published online in the journal Advanced Optical Materials.

A June 7, 2016 University of Adelaide press release (also on EurekAlert), which originated the news item, offers more detail,

“These novel luminescent nanoparticles, called upconversion nanoparticles, have become promising candidates for a whole variety of ultra-high tech applications such as biological sensing, biomedical imaging and 3D volumetric displays,” says lead author Dr Tim Zhao, from the University of Adelaide’s School of Physical Sciences and Institute for Photonics and Advanced Sensing (IPAS).

“Integrating these nanoparticles into glass, which is usually inert, opens up exciting possibilities for new hybrid materials and devices that can take advantage of the properties of nanoparticles in ways we haven’t been able to do before. For example, neuroscientists currently use dye injected into the brain and lasers to be able to guide a glass pipette to the site they are interested in. If fluorescent nanoparticles were embedded in the glass pipettes, the unique luminescence of the hybrid glass could act like a torch to guide the pipette directly to the individual neurons of interest.”

Although this method was developed with upconversion nanoparticles, the researchers believe their new ‘direct-doping’ approach can be generalised to other nanoparticles with interesting photonic, electronic and magnetic properties. There will be many applications – depending on the properties of the nanoparticle.

“If we infuse glass with a nanoparticle that is sensitive to radiation and then draw that hybrid glass into a fibre, we could have a remote sensor suitable for nuclear facilities,” says Dr Zhao.

To date, the method used to integrate upconversion nanoparticles into glass has relied on the in-situ growth of the nanoparticles within the glass.

“We’ve seen remarkable progress in this area but the control over the nanoparticles and the glass compositions has been limited, restricting the development of many proposed applications,” says project leader Professor Heike Ebendorff-Heidepriem, Deputy Director of IPAS.

“With our new direct doping method, which involves synthesizing the nanoparticles and glass separately and then combining them using the right conditions, we’ve been able to keep the nanoparticles intact and well dispersed throughout the glass. The nanoparticles remain functional and the glass transparency is still very close to its original quality. We are heading towards a whole new world of hybrid glass and devices for light-based technologies.”

Here’s a link to and a citation for the paper,

Upconversion Nanocrystal-Doped Glass: A New Paradigm for Photonic Materials by Jiangbo Zhao, Xianlin Zheng, Erik P. Schartner, Paul Ionescu, Run Zhang, Tich-Lam Nguyen, Dayong Jin, and Heike Ebendorff-Heidepriem. Advanced Optical Materials DOI: 10.1002/adom.201600296 Version of Record online: 30 MAY 2016

This paper is behind a paywall.

Biodegradable films from cellulose nanofibrils

A team at Purdue University (Indiana, US) has developed a new process for biodegradable films based on cellulose according to a June 8, 2016 news item on phys.org,

Purdue University researchers have developed tough, flexible, biodegradable films from cellulose, the main component of plant cell walls. The films could be used for products such as food packaging, agricultural groundcovers, bandages and capsules for medicine or bioactive compounds.

Food scientists Srinivas Janaswamy and Qin Xu engineered the cellophane-like material by solubilizing cellulose using zinc chloride, a common inorganic salt, and adding calcium ions to cause the cellulose chains to become tiny fibers known as nanofibrils, greatly increasing the material’s tensile strength. The zinc chloride and calcium ions work together to form a gel network, allowing the researchers to cast the material into a transparent, food-grade film.

A June 7, 2016 Purdue University news release by Natalie van Hoose, which originated the news item, discusses the need for these films and provides a few more technical details about the work (Note: A link has been removed),

“We’re looking for innovative ways to adapt and use cellulose – an inexpensive and widely available material – for a range of food, biomedical and pharmaceutical applications,” said Janaswamy, research assistant professor of food science and principal author of the study. “Though plastics have a wide variety of applications, their detrimental impact on the environment raises a critical need for alternative materials. Cellulose stands out as a viable option, and our process lays a strong foundation for developing new biodegradable plastics.”

Cellulose’s abundance, renewability and ability to biodegrade make it a promising substitute for petroleum-based products. While a variety of products such as paper, cellophane and rayon are made from cellulose, its tightly interlinked structure and insolubility – qualities that give plants strength and protection – make it a challenging material to work with.

Janaswamy and Xu loosened the cellulose network by adding zinc chloride, which helps push cellulose’s closely packed sheets apart, allowing water to penetrate and solubilize it. Adding calcium ions spurs the formation of nanofibrils through strong bonds between the solubilized cellulose sheets. The calcium ions boost the tensile strength of the films by about 250 percent.

The production process preserves the strength and biodegradability of cellulose while rendering it transparent and flexible.

Because the zinc chloride can be recycled to repeat the process, the method offers an environmentally friendly alternative to conventional means of breaking down cellulose, which tend to rely on toxic chemicals and extreme temperatures.

“Products based on this film can have a no-waste lifecycle,” said Xu, research assistant professor of food science and first author of the study. “This process allows us to create a valuable product from natural materials – including low-value or waste materials such as corn stover or wood chips – that can eventually be returned to the Earth.”

The methodology could be adapted to mass-produce cellulose films, the researchers said.

The next step in the project is to find ways of making the cellulose film insoluble to water while maintaining its ability to biodegrade.

Here’s a link to and a citation for the paper,

A facile route to prepare cellulose-based films by Qin Xu, Chen Chen, Katelyn Rosswurm, Tianming Yao, and Srinivas Janaswamy. Carbohydrate Polymers, Volume 149, 20 September 2016, Pages 274–281. doi:10.1016/j.carbpol.2016.04.114

This paper is behind a paywall.

Taking DNA beyond genetics with living computers and nanobots

You might want to keep a salt shaker with you while reading a June 7, 2016 essay by Matteo Palma (Queen Mary’s University of London) about nanotechnology and DNA on The Conversation website (h/t June 7, 2016 news item on Nanowerk).

This is not a ‘hype’ piece, as Palma backs every claim with links to the research while providing a good overview of some very exciting work, but the mood is a bit euphoric, so you may want to keep the earlier mentioned salt shaker nearby.

Palma offers a very nice beginner introduction, especially helpful for someone who only half-remembers their high school biology (from the June 7, 2016 essay),

DNA is one of the most amazing molecules in nature, providing a way to carry the instructions needed to create almost any lifeform on Earth in a microscopic package. Now scientists are finding ways to push DNA even further, using it not just to store information but to create physical components in a range of biological machines.

Deoxyribonucleic acid or “DNA” carries the genetic information that we, and all living organisms, use to function. It typically comes in the form of the famous double-helix shape, made up of two single-stranded DNA molecules folded into a spiral. Each of these is made up of a series of four different types of molecular component: adenine (A), guanine (G), thymine (T), and cytosine (C).

Genes are made up from different sequences of these building block components, and the order in which they appear in a strand of DNA is what encodes genetic information. But by precisely designing different A,G,T and C sequences, scientists have recently been able to develop new ways of folding DNA into different origami shapes, beyond the conventional double helix.

This approach has opened up new possibilities of using DNA beyond its genetic and biological purpose, turning it into a Lego-like material for building objects that are just a few billionths of a metre in diameter (nanoscale). DNA-based materials are now being used for a variety of applications, ranging from templates for electronic nano-devices, to ways of precisely carrying drugs to diseased cells.
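
As a toy illustration (mine, not from Palma’s essay) of how those four building blocks behave as a programmable code, the short sketch below computes the Watson-Crick complement of a DNA strand. A pairs with T and G pairs with C, and that pairing rule is what DNA origami designers exploit when choosing sequences intended to fold into a target shape.

# Toy illustration (not from the essay): Watson-Crick base pairing.
# A pairs with T, G pairs with C; the complementary strand runs antiparallel,
# so it is read in reverse (the "reverse complement").
PAIRING = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(sequence: str) -> str:
    """Return the reverse complement of a DNA sequence written 5' to 3'."""
    return "".join(PAIRING[base] for base in reversed(sequence.upper()))

if __name__ == "__main__":
    strand = "ATGCGTTAC"
    print(strand, "->", reverse_complement(strand))  # ATGCGTTAC -> GTAACGCAT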

He highlights some Canadian work,

Designing electronic devices that are just nanometres in size opens up all sorts of possible applications but makes it harder to spot defects. As a way of dealing with this, researchers at the University of Montreal have used DNA to create ultrasensitive nanoscale thermometers that could help find minuscule hotspots in nanodevices (which would indicate a defect). They could also be used to monitor the temperature inside living cells.

The nanothermometers are made using loops of DNA that act as switches, folding or unfolding in response to temperature changes. This movement can be detected by attaching optical probes to the DNA. The researchers now want to build these nanothermometers into larger DNA devices that can work inside the human body.

He also mentions the nanobots that will heal your body (according to many works of fiction),

Researchers at Harvard Medical School have used DNA to design and build a nanosized robot that acts as a drug delivery vehicle to target specific cells. The nanorobot comes in the form of an open barrel made of DNA, whose two halves are connected by a hinge held shut by special DNA handles. These handles can recognise combinations of specific proteins present on the surface of cells, including ones associated with diseases.

When the robot comes into contact with the right cells, it opens the container and delivers its cargo. When applied to a mixture of healthy and cancerous human blood cells, these robots showed the ability to target and kill half of the cancer cells, while the healthy cells were left unharmed.

Palma is describing a very exciting development and there are many teams worldwide working on ways to make drugs more effective and less side effect-ridden. However, there does seem to be a bit of a problem with targeted drug delivery as noted in my April 27, 2016 posting,

According to an April 27, 2016 news item on Nanowerk researchers at the University of Toronto (Canada) along with their collaborators in the US (Harvard Medical School) and Japan (University of Tokyo) have determined that less than 1% of nanoparticle-based drugs reach their intended destination …

Less than 1%? Admittedly, nanoparticles are not the same as nanobots but the problem is in the delivery, from my April 27, 2016 posting,

… the authors argue that, in order to increase nanoparticle delivery efficiency, a systematic and coordinated long-term strategy is necessary. To build a strong foundation for the field of cancer nanomedicine, researchers will need to understand a lot more about the interactions between nanoparticles and the body’s various organs than they do today. …

I imagine nanobots will suffer a similar fate since the actual delivery mechanism to a targeted cell is still a mystery.

I quite enjoyed Palma’s essay and appreciated the links he provided. My only proviso: keep a salt shaker nearby. That rosy future is going to take a while to get here.

Canada and its review of fundamental science

Big thanks to David Bruggeman’s June 14, 2016 post (on his Pasco Phronesis blog) for news of Canada’s Fundamental Science Review, which was launched on June 13, 2016 (Note: Links have been removed),

The panel’s mandate focuses on support for fundamental research, research facilities, and platform technologies.  This will include the three granting councils as well as other research organisations such as the Canada Foundation for Innovation. But it does not preclude the panel from considering and providing advice and recommendations on research matters outside of the mandate.  The plan is to make the panel’s work and recommendations readily accessible to the public, either online or through any report or reports the panel produces.  The panel’s recommendations to Minister Duncan are non-binding. …

As Ivan Semeniuk notes at The Globe and Mail [Canadian ‘national’ newspaper], the recent Nurse Review in the U.K., which led to the notable changes underway in the organization of that country’s research councils, seems comparable to this effort.  But I think it worth noting the differences in the research systems of the two countries, and the different political pressures in play.  It is not at all obvious to this writer that the Canadian review would necessarily lead to similar recommendations for a streamlining and reorganization of the Canadian research councils.

Longtime observers of the Canadian science funding scene may recall an earlier review held under the auspices of the Stephen Harper Conservative government known as the ‘Review of Federal Support to R&D’. In fact, it was focused on streamlining government funding for innovation and commercialization of science. The result was the 2011 report, ‘Innovation Canada: A Call to Action’, known popularly as the ‘Jenkins report’ after the panel chair, Tom Jenkins. (More about the report and responses to it can be found in my Oct. 21, 2011 post).

It’s nice to see that fundamental science is being given its turn for attention.

A June 13, 2016 Innovation, Science and Economic Development Canada news release provides more detail about the review and the panel guiding the review,

The Government of Canada understands the role of science in maintaining a thriving, clean economy and in providing the evidence for sound policy decisions. To deliver on this role however, federal programs that support Canada’s research efforts must be aligned in such a way as to ensure they are strategic, effective and focused on meeting the needs of scientists first.

That is why the Honourable Kirsty Duncan, Minister of Science, today launched an independent review of federal funding for fundamental science. The review will assess the program machinery that is currently in place to support science and scientists in Canada. The scope of the review includes the three granting councils [Social Sciences and Humanities Research Council {SSHRC}, Natural Sciences and Engineering Research Council {NSERC}, Canadian Institutes of Health Research {CIHR}] along with certain federally funded organizations such as the Canada Foundation for Innovation [CFI].

The review will be led by an independent panel of distinguished research leaders and innovators including Dr. David Naylor, former president of the University of Toronto and chair of the panel. Other panelists include:

  • Dr. Robert Birgeneau, former chancellor, University of California, Berkeley
  • Dr. Martha Crago, Vice-President, Research, Dalhousie University
  • Mike Lazaridis, co-founder, Quantum Valley Investments
  • Dr. Claudia Malacrida, Associate Vice-President, Research, University of Lethbridge
  • Dr. Art McDonald, former director of the Sudbury Neutrino Observatory, Nobel Laureate
  • Dr. Martha Piper, interim president, University of British Columbia
  • Dr. Rémi Quirion, Chief Scientist, Quebec
  • Dr. Anne Wilson, Canadian Institute for Advanced Research Successful Societies Fellow and professor of psychology, Wilfrid Laurier University

The panel will spend the next six months seeking input from the research community and Canadians on how to optimize support for fundamental science in Canada. The panel will also survey international best practices for funding science and examine whether emerging researchers face barriers that prevent them from achieving career goals. It will look at what must be done to address these barriers and what more can be done to encourage Canada’s scientists to take on bold new research challenges. In addition to collecting input from the research community, the panel will also invite Canadians to participate in the review [emphasis mine] through an online consultation.

Ivan Semeniuk in his June 13, 2016 article for The Globe and Mail provides some interesting commentary about the possible outcomes of this review,

Depending on how its recommendations are taken on board, the panel could trigger anything from minor tweaks to a major rebuild of Ottawa’s science-funding apparatus, which this year is expected to funnel more than $3-billion to Canadian researchers and their labs.

Asked what she most wanted the panel to address, Ms. Duncan cited, as an example, the plight of younger researchers who, in many cases, must wait until they are in their 40s to get federal support.

Another is the risk of losing the benefits of previous investments when funding rules become restrictive, such as a 14-year limit on how long the government can support one of its existing networks of centres of excellence, or the dependence of research projects that are in the national interest on funding streams that require support from provincial governments or private sources.

The current system for proposing and reviewing research grants has been criticized as cumbersome and fraught with biases that mean the best science is not always supported.

In a paper published on Friday in the research journal PLOS One, Trent University biologist Dennis Murray and colleagues combed through 13,526 grant proposals to the Natural Sciences and Engineering Research Council between 2011 and 2014 and found significant evidence that researchers at smaller universities have consistently lower success rates.

Dr. Murray advocates for a more quantitative and impartial system of review to keep such biases at bay.

“There are too many opportunities for human impressions — conscious or unconscious — to make their way into the current evaluation process,” Dr. Murray said.

More broadly, researchers say the time is right for a look at a system that has grown convoluted and less suited to a world in which science is increasingly cross-disciplinary, and international research collaborations are more important.

If you have time, I encourage you to take a look at Semeniuk’s entire article as for the paper he mentions, here’s a link to and a citation for it,

Bias in Research Grant Evaluation Has Dire Consequences for Small Universities by Dennis L. Murray, Douglas Morris, Claude Lavoie, Peter R. Leavitt, Hugh MacIsaac, Michael E. J. Masson, and Marc-Andre Villard. PLOS ONE. http://dx.doi.org/10.1371/journal.pone.0155876 Published: June 3, 2016

This paper is open access.

Getting back to the review and, more specifically, the panel: it’s good to see that four of the nine participants are women, but other than that there doesn’t seem to be much diversity, i.e., the majority (five) spring from the Ontario/Québec nexus of power and all the Canadians are from the southern part of the country. Back to diversity, there is one businessman, Mike Lazaridis, known primarily as the founder of Research in Motion (RIM, more popularly known as the BlackBerry company), making the panel not a wholly ivory tower affair. Still, I hope one day these panels will have members from the Canadian North and international members who come from somewhere other than the US, Great Britain, and/or if they’re having a particularly wild day, Germany. Here are some candidate countries for other places to look for panel members: Japan, Israel, China, South Korea, and India. Other possibilities include one of the South American countries, African countries, and/or the Middle Eastern countries.

Take the continent of Africa, for example, where many countries seem to have successfully tackled one of the issues we face, specifically, the problem of encouraging young researchers. James Wilsdon notes some success in his April 9, 2016 post about Africa and science advice for the Guardian science blogs (Note: Links have been removed),

… some of the brightest talents and most exciting advances in African science were on display at the Next Einstein Forum. This landmark meeting, initiated by the African Institute of Mathematical Sciences, and held in Senegal, brought together almost 1000 researchers, entrepreneurs, businesses and policymakers from across Africa to celebrate and support the continent’s most promising early-career researchers.

A new cadre of fifteen Next Einstein Fellows and fifty-four ambassadors was announced, and the forum ended with an upbeat declaration of commitment to Africa’s role in world-leading, locally-relevant science. …

… UNESCO’s latest global audit of science, published at the end of 2015, concludes that African science is firmly on the rise. The number of journal articles published on the continent rose by sixty per cent from 2008 to 2014. Research investment rose from $12.9 billion in 2007 to $19.9 billion (US dollars) in 2013. Over the same period, R&D expenditure as a percentage of GDP nudged upwards from 0.36 per cent to 0.45 per cent, and the population of active researchers expanded from 150,000 to 190,000.

If you have the time, do read Wilsdon’s piece which covers some of the more difficult aspects facing the science communities in Africa and more.

In any event, it’s a bit late to bemoan the panel’s makeup, but hopefully the government will take note for the future; I’m planning to include some of this critique in my comments to the panel in answer to its request for public input.

You can find out more about Canada’s Fundamental Science Review here and you can easily participate here and/or go here to subscribe for updates.

“Breaking Me Softly” at the nanoscale

“Breaking Me Softly” sounds like a song title but in this case the phrase has been coined to describe a new technique for controlling materials at the nanoscale, according to a June 6, 2016 news item on ScienceDaily,

A finding by a University of Central Florida researcher that unlocks a means of controlling materials at the nanoscale and opens the door to a new generation of manufacturing is featured online in the journal Nature.

Using a pair of pliers in each hand and gradually pulling taut a piece of glass fiber coated in plastic, associate professor Ayman Abouraddy found that something unexpected and never before documented occurred — the inner fiber fragmented in an orderly fashion.

“What we expected to see happen is NOT what happened,” he said. “While we thought the core material would snap into two large pieces, instead it broke into many equal-sized pieces.”

He referred to the technique in the Nature article title as “Breaking Me Softly.”

A June 6, 2016 University of Central Florida (UCF) news release (also on EurekAlert) by Barbara Abney, which originated the news item, expands on the theme,

The process of pulling fibers to force the realignment of the molecules that hold them together, known as cold drawing, has been the standard for mass production of flexible fibers like plastic and nylon for most of the last century.

Abouraddy and his team have shown that the process may also be applicable to multi-layered materials, a finding that could lead to the manufacturing of a new generation of materials with futuristic attributes.

“Advanced fibers are going to be pursuing the limits of anything a single material can endure today,” Abouraddy said.

For example, packaging together materials with optical and mechanical properties, along with sensors that could monitor such vital signs as blood pressure and heart rate, would make it possible to create clothing capable of transmitting vital data to a doctor’s office via the Internet.

The ability to control breakage in a material is critical to developing computerized processes for potential manufacturing, said Yuanli Bai, a fracture mechanics specialist in UCF’s College of Engineering and Computer Science.

Abouraddy contacted Bai, who is a co-author on the paper, about three years ago and asked him to analyze the test results on a wide variety of materials, including silicon, silk, gold and even ice.

He also contacted Robert S. Hoy, a University of South Florida physicist who specializes in the properties of materials like glass and plastic, for a better understanding of what he found.

Hoy said he had never seen the phenomenon Abouraddy was describing, but that it made great sense in retrospect.

The research takes what has traditionally been a problem in materials manufacturing and turns it into an asset, Hoy said.

“Dr. Abouraddy has found a new application of necking,” a process that occurs when cold drawing causes non-uniform strain in a material, Hoy said. “Usually you try to prevent necking, but he exploited it to do something potentially groundbreaking.”

The necking phenomenon was discovered at DuPont at the end of the 1920s and ushered in the age of textiles and garments made of synthetic fibers.

Abouraddy said that cold-drawing is what makes synthetic fibers like nylon and polyester useful. While those fibers are initially brittle, once cold-drawn they toughen up and become useful in everyday commodities.

Only recently have fibers made of multiple materials become possible, he said. That research will be the centerpiece of a $317-million U.S. Department of Defense program focused on smart fibers that Abouraddy and UCF will assist with. The Revolutionary Fibers and Textiles Manufacturing Innovation Institute (RFT-MII), led by the Massachusetts Institute of Technology, will incorporate research findings published in the Nature paper, Abouraddy said.

The implications for manufacturing of the smart materials of the future are vast.

By controlling the mechanical force used to pull the fiber, and therefore the breakage patterns, materials can be developed with customized properties, allowing them to interact with each other and with external forces such as the sun (for harvesting energy) and the internet in customizable ways.

A co-author on the paper, Ali P. Gordon, an associate professor in the Department of Mechanical & Aerospace Engineering and director of UCF’s Mechanics of Materials Research Group, said that the finding is significant because it shows that by carefully controlling the loading condition imparted to the fiber, materials can be developed with tailored performance attributes.

“Processing-structure-property relationships need to be strategically characterized for complex material systems. By combining experiments, microscopy, and computational mechanics, the physical mechanisms of the fragmentation process were more deeply understood,” Gordon said.

Abouraddy teamed up with seven UCF scientists from the College of Optics & Photonics and the College of Engineering & Computer Science (CECS) to write the paper. Additional authors include one researcher each from the Massachusetts Institute of Technology, Nanyang Technological University in Singapore and the University of South Florida.

Here’s a link to and a citation for the paper,

Controlled fragmentation of multimaterial fibres and films via polymer cold-drawing by Soroush Shabahang, Guangming Tao, Joshua J. Kaufman, Yangyang Qiao, Lei Wei, Thomas Bouchenot, Ali P. Gordon, Yoel Fink, Yuanli Bai, Robert S. Hoy & Ayman F. Abouraddy. Nature (2016) doi:10.1038/nature17980 Published online 06 June 2016

This paper is behind a paywall.

Accountability for artificial intelligence decision-making

How does an artificial intelligence program arrive at its decisions? It’s a question that’s no longer academic as these programs take on more decision-making chores, according to a May 25, 2016 Carnegie Mellon University news release (also on EurekAlert) by Byron Spice (Note: Links have been removed),

Machine-learning algorithms increasingly make decisions about credit, medical diagnoses, personalized recommendations, advertising and job opportunities, among other things, but exactly how usually remains a mystery. Now, new measurement methods developed by Carnegie Mellon University [CMU] researchers could provide important insights to this process.

Was it a person’s age, gender or education level that had the most influence on a decision? Was it a particular combination of factors? CMU’s Quantitative Input Influence (QII) measures can provide the relative weight of each factor in the final decision, said Anupam Datta, associate professor of computer science and electrical and computer engineering.

It’s reassuring to know that more requests for transparency of the decision-making process are being made. After all, it’s disconcerting that someone with the life experience of a gnat and/or possibly some issues might be developing an algorithm that could affect your life in some fundamental ways. Here’s more from the news release (Note: Links have been removed),

“Demands for algorithmic transparency are increasing as the use of algorithmic decision-making systems grows and as people realize the potential of these systems to introduce or perpetuate racial or sex discrimination or other social harms,” Datta said.

“Some companies are already beginning to provide transparency reports, but work on the computational foundations for these reports has been limited,” he continued. “Our goal was to develop measures of the degree of influence of each factor considered by a system, which could be used to generate transparency reports.”

These reports might be generated in response to a particular incident — why an individual’s loan application was rejected, or why police targeted an individual for scrutiny, or what prompted a particular medical diagnosis or treatment. Or they might be used proactively by an organization to see if an artificial intelligence system is working as desired, or by a regulatory agency to see whether a decision-making system inappropriately discriminated between groups of people.

Datta, along with Shayak Sen, a Ph.D. student in computer science, and Yair Zick, a post-doctoral researcher in the Computer Science Department, will present their report on QII at the IEEE Symposium on Security and Privacy, May 23–25 [2016], in San Jose, Calif.

Generating these QII measures requires access to the system, but doesn’t necessitate analyzing the code or other inner workings of the system, Datta said. It also requires some knowledge of the input dataset that was initially used to train the machine-learning system.

A distinctive feature of QII measures is that they can explain decisions of a large class of existing machine-learning systems. A significant body of prior work takes a complementary approach, redesigning machine-learning systems to make their decisions more interpretable and sometimes losing prediction accuracy in the process.

QII measures carefully account for correlated inputs while measuring influence. For example, consider a system that assists in hiring decisions for a moving company. Two inputs, gender and the ability to lift heavy weights, are positively correlated with each other and with hiring decisions. Yet transparency into whether the system uses weight-lifting ability or gender in making its decisions has substantive implications for determining if it is engaging in discrimination.

“That’s why we incorporate ideas for causal measurement in defining QII,” Sen said. “Roughly, to measure the influence of gender for a specific individual in the example above, we keep the weight-lifting ability fixed, vary gender and check whether there is a difference in the decision.”
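To make that intervention idea a little more concrete, here’s a rough Python sketch of how one might estimate that kind of single-feature influence for one individual. This is my own illustration, not code released by the CMU group; it assumes a scikit-learn-style classifier with a predict() method, and the names model, X_train, applicant, “gender” and “weight_lifting_ability” are hypothetical,

```python
# Rough illustration of the single-feature (unary) intervention described above.
# My own sketch, not the researchers' code. Assumes a scikit-learn-style
# classifier with a .predict() method; model, X_train and applicant are
# hypothetical placeholders.
import numpy as np
import pandas as pd

def unary_influence(model, X_train, individual, feature, n_samples=1000):
    """Estimate the influence of `feature` on the decision for one individual:
    hold every other input fixed, resample `feature` from its marginal
    distribution in the training data, and see how often the decision changes."""
    original = model.predict(individual.to_frame().T)[0]

    # Many copies of the same individual, differing only in the feature of interest.
    intervened = pd.DataFrame([individual] * n_samples).reset_index(drop=True)
    intervened[feature] = X_train[feature].sample(n_samples, replace=True).to_numpy()

    # Influence = the probability that the intervention flips the decision.
    return np.mean(model.predict(intervened) != original)

# e.g. compare unary_influence(model, X_train, applicant, "gender")
# with    unary_influence(model, X_train, applicant, "weight_lifting_ability")
```

In the moving-company example, a large value for “gender” even with “weight_lifting_ability” held fixed would be the kind of red flag the researchers are talking about.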

Observing that single inputs may not always have high influence, the QII measures also quantify the joint influence of a set of inputs, such as age and income, on outcomes and the marginal influence of each input within the set. Since a single input may be part of multiple influential sets, the average marginal influence of the input is computed using principled game-theoretic aggregation measures previously applied to measure influence in revenue division and voting.

“To get a sense of these influence measures, consider the U.S. presidential election,” Zick said. “California and Texas have influence because they have many voters, whereas Pennsylvania and Ohio have power because they are often swing states. The influence aggregation measures we employ account for both kinds of power.”
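For the aggregation step, here’s an equally rough sketch of a Shapley-value calculation over feature sets. Again, this is only an illustration of the game-theoretic idea the release mentions, not the paper’s own definitions; the helper set_influence(S), meaning the joint influence of a feature set S (for instance estimated by intervening on all of its members at once), is hypothetical,

```python
# Sketch of Shapley-value aggregation of marginal influences -- illustrative only.
# `set_influence(S)` is a hypothetical callable returning the joint influence of
# the feature set S; it should return 0.0 for the empty set.
from itertools import combinations
from math import factorial

def shapley_influence(features, set_influence):
    """Average marginal influence of each feature over all subsets of the others,
    weighted as in the Shapley value."""
    n = len(features)
    values = {}
    for f in features:
        others = [g for g in features if g != f]
        total = 0.0
        for k in range(n):  # size of the subset S that does not contain f
            for subset in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                marginal = set_influence(set(subset) | {f}) - set_influence(set(subset))
                total += weight * marginal
        values[f] = total
    return values
```

The brute-force enumeration above is only practical for a handful of features; in practice values like this are approximated by sampling subsets rather than enumerating all of them.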

The researchers tested their approach against some standard machine-learning algorithms that they used to train decision-making systems on real data sets. They found that the QII provided better explanations than standard associative measures for a host of scenarios they considered, including sample applications for predictive policing and income prediction.

Now, they are seeking collaboration with industrial partners so that they can employ QII at scale on operational machine-learning systems.

Here’s a link to and a citation for a PDF of the paper presented at the May 2016 conference,

Algorithmic Transparency via Quantitative Input Influence: Theory and Experiments with Learning Systems by Anupam Datta, Shayak Sen, Yair Zick. Presented at the IEEE Symposium on Security and Privacy, May 23–25, 2016, in San Jose, Calif.

I’ve also embedded the paper here,

CarnegieMellon_AlgorithmicTransparency