Monthly Archives: February 2016

Science advice conference in Brussels, Belgium, Sept. 29 – 30, 2016 and a call for speakers

This is the second such conference, and the organizers are issuing a call for speakers; the first was held in New Zealand in 2014 (my April 8, 2014 post offers an overview of the then proposed science advice conference). Thanks to David Bruggeman and his Feb. 23, 2016 posting (on the Pasco Phronesis blog) for the information about this latest one (Note: A link has been removed),

The International Network for Global Science Advice (INGSA) is holding its second global conference in Brussels this September 29 and 30, in conjunction with the European Commission. The organizers have the following goals for the conference:

  • Identify core principles and best practices, common to structures providing scientific advice for governments worldwide.
  • Identify practical ways to improve the interaction of the demand and supply side of scientific advice.
  • Describe, by means of practical examples, the impact of effective science advisory processes.

Here’s a little more about the conference from its webpage on the INGSA website,

Science and Policy-Making: towards a new dialogue

29th – 30th September 2016, Brussels, Belgium

Call for suggestions for speakers for the parallel sessions

BACKGROUND:

“Science advice has never been in greater demand; nor has it been more contested.”[1] The most complex and sensitive policy issues of our time are those for which the available scientific evidence is ever growing and multi-disciplined, but still has uncertainties. Yet these are the very issues for which scientific input is needed most. In this environment, the usefulness and legitimacy of expertise seems obvious to scientists, but is this view shared by policy-makers?

OBJECTIVES:

A two-day conference will take place in Brussels, Belgium, on Thursday 29th and Friday 30th September 2016. Jointly organised by the European Commission and the International Network for Government Science Advice (INGSA), the conference will bring together users and providers of scientific advice on critical, global issues. Policy-makers, leading practitioners and scholars in the field of science advice to governments, as well as other stakeholders, will explore principles and practices in a variety of current and challenging policy contexts. It will also present the new Scientific Advice Mechanism [SAM] of the European Commission [emphasis mine; I have more about SAM further down in the post] to the international community. Through keynote lectures and plenary discussions and topical parallel sessions, the conference aims to take a major step towards responding to the challenge best articulated by the World Science Forum Declaration of 2015:

“The need to define the principles, processes and application of science advice and to address the theoretical and practical questions regarding the independence, transparency, visibility and accountability of those who receive and provide advice has never been more important. We call for concerted action of scientists and policy-makers to define and promulgate universal principles for developing and communicating science to inform and evaluate policy based on responsibility, integrity, independence, and accountability.”

The conference seeks to:

  • Identify core principles and best practices, common to structures providing scientific advice for governments worldwide.
  • Identify practical ways to improve the interaction of the demand and supply side of scientific advice.
  • Describe, by means of practical examples, the impact of effective science advisory processes.

The Programme Committee comprises:

Eva Alisic, Co-Chair of the Global Young Academy

Tateo Arimoto, Director of Science, Technology and Innovation Programme; The Japanese National Graduate Institute for Policy Studies

Peter Gluckman, Chair of INGSA and Prime Minister’s Chief Science Advisor, New Zealand (co-chair)

Robin Grimes, UK Foreign Office Chief Scientific Adviser

Heide Hackmann, International Council for Science (ICSU)

Theodoros Karapiperis, European Parliament – Head of Scientific Foresight Unit (STOA), European Parliamentary Research Service (EPRS) – Science and Technology Options Assessment Panel

Johannes Klumpers, European Commission, Head of Unit – Scientific Advice Mechanism (SAM) (co-chair)

Martin Kowarsch, Head of the Working Group Scientific assessments, Ethics and Public Policy, Mercator Research Institute on Global Commons and Climate Change

David Mair, European Commission – Joint Research Centre (JRC)

Rémi Quirion, Chief Scientist, Province of Québec, Canada

Flavia Schlegel, UNESCO Assistant Director-General for the Natural Sciences

Henrik Wegener, Executive Vice President, Chief Academic Officer, Provost at Technical University of Denmark, Chair of the EU High Level Group of Scientific Advisors

James Wilsdon, Chair of INGSA, Professor of Research Policy, Director of Impact & Engagement, University of Sheffield

Format

The conference will be a combination of plenary lectures and topical panels in parallel (concurrent) sessions outlined below. Each session will include three speakers (15 minute address with 5 minute Q & A each) plus a 30 minute moderated discussion.

Parallel Session I: Scientific advice for global policy

The pathways of science advice are a product of a country’s own cultural history and will necessarily differ across jurisdictions. Yet, there is an increasing number of global issues that require science advice. Can scientific advice help to address issues requiring action at international level? What are the considerations for providing science advice in these contexts? What are the examples from which we can learn what works and what does not work in informing policy-making through scientific advice?

Topics to be addressed include:

  • Climate Change – Science for the Paris Agreement: Did it work?
  • Migration: How can science advice help?
  • Zika fever, dementia, obesity etc.; how can science advice help policy to address the global health challenges?

Parallel Session II: Getting equipped – developing the practice of providing scientific advice for policy

The practice of science advice to public policy requires a new set of skills that are neither strictly scientific nor policy-oriented, but a hybrid of both. Negotiating the interface between science and policy requires translational and navigational skills that are often not acquired through formal training and education. What are the considerations in developing these unique capacities, both in general and for particular contexts? In order to be best prepared for informing policy-making, up-coming needs for scientific advice should ideally be anticipated. Apart from scientific evidence sensu stricto, can other sources such as the arts, humanities, foresight and horizon scanning provide useful insights for scientific advice? How can scientific advice make best use of such tools and methods?

Topics to be addressed include:

  • How to close the gap between the need and the capacity for science advice in developing countries with limited or emerging science systems?
  • What skills do scientists and policymakers need for a better dialogue?
  • Foresight and science advice: can foresight and horizon scanning help inform the policy agenda?

Parallel Session III: Scientific advice for and with society

In many ways, the practice of science advice has become a key pillar in what has been called the ‘new social contract for science[2]’. Science advice translates knowledge, making it relevant to society through both better informed policy and by helping communities and their elected representatives to make better informed decisions about the impacts of technology. Yet providing science advice is often a distributed and disconnected practice in which academies, formal advisors, journalists, stakeholder organisations and individual scientists play an important role. The resulting mix of information can be complex and even contradictory, particularly as advocate voices and social media join the open discourse. What considerations are there in an increasingly open practice of science advice?

Topics to be addressed include:

  • Science advice and the media: Lost in translation?
  • Beyond the ivory tower: How can academies best contribute to science advice for policy?
  • What is the role of other stakeholders in science advice?

Parallel Session IV: Science advice crossing borders

Science advisors and advisory mechanisms are called upon not just for nationally-relevant advice, but also for issues that increasingly cross borders. In this, the importance of international alignment and collaborative possibilities may be obvious, but there may be inherent tensions. In addition, there may be legal and administrative obstacles to transnational scientific advice. What are these hurdles and how can they be overcome? To what extent are science advisory systems also necessarily diplomatic and what are the implications of this in practice?

Topics to be addressed include:

  • How is science advice applied across national boundaries in practice?
  • What support do policymakers need from science advice to implement the Sustainable Development Goals in their countries?
  • Science diplomacy: Can scientists extend the reach of diplomats?

Call for Speakers

The European Commission and INGSA are now in the process of identifying speakers for the above conference sessions. As part of this process we invite those interested in speaking to submit their ideas. Interested policy-makers, scientists and scholars in the field of scientific advice, as well as business and civil-society stakeholders are warmly encouraged to submit proposals. Alternatively, you may propose an appropriate speaker.

The conference webpage includes a form should you wish to submit yourself or someone else as a speaker.

New Scientific Advice Mechanism of the European Commission

For anyone unfamiliar with the Scientific Advice Mechanism (SAM) mentioned in the conference’s notes: once the term of office of Anne Glover, chief science adviser for the European Commission (EC), was completed in 2014, the EC president, Jean-Claude Juncker, obliterated the position. Glover, the first and only science adviser for the EC, was to be replaced by an advisory council and a new science advice mechanism.

David Bruggeman describes the situation as it stood in a May 14, 2015 posting (Note: A link has been removed),

Earlier this week European Commission President Juncker met with several scientists along with Commission Vice President for Jobs, Growth, Investment and Competitiveness [Jyrki] Katainen and the Commissioner for Research, Science and Innovation [Carlos] Moedas. …

What details are publicly available are currently limited to this slide deck.  It lists two main mechanisms for science advice, a high-level group of eminent scientists (numbering seven), staffing and resource support from the Commission, and a structured relationship with the science academies of EU member states.  The deck gives a deadline of this fall for the high-level group to be identified and stood up.

… The Commission may use this high-level group more as a conduit than a source for policy advice.  A reasonable question to ask is whether or not the high-level group can meet the Commission’s expectations, and those of the scientific community with which it is expected to work.

David updated the information in a January 29, 2016 posting (Note: Links have been removed),

Today the High Level Group of the newly constituted Scientific Advice Mechanism (SAM) of the European Union held its first meeting.  The seven members of the group met with Commissioner for Research, Science and Innovation Carlos Moedas and Andrus Ansip, the Commission’s Vice-President with responsibility for the Digital Single Market (a Commission initiative focused on making a Europe-wide digital market and improving support and infrastructure for digital networks and services).

Given it’s early days, there’s little more to discuss than the membership of this advisory committee (from the SAM High Level Group webpage),

Janusz Bujnicki

Professor, Head of the Laboratory of Bioinformatics and Protein Engineering, International Institute of Molecular and Cell Biology, Warsaw

Professor of Biology, and head of a research group at IIMCB in Warsaw and at Adam Mickiewicz University, Poznań, Poland. Janusz Bujnicki graduated from the Faculty of Biology, University of Warsaw in 1998, defended his PhD in 2001, was awarded with habilitation in 2005 and with the professor title in 2009.

Bujnicki’s research combines bioinformatics, structural biology and synthetic biology. His scientific achievements include the development of methods for computational modeling of protein and RNA 3D structures, discovery and characterization of enzymes involved in RNA metabolism, and engineering of proteins with new functions. He is an author of more than 290 publications, which have been cited by other researchers more than 5400 times (as of October 2015). Bujnicki received numerous awards, prizes, fellowships, and grants including EMBO/HHMI Young Investigator Programme award, ERC Starting Grant, award of the Polish Ministry of Science and award of the Polish Prime Minister, and was decorated with the Knight’s Cross of the Order of Polonia Restituta by the President of the Republic of Poland. In 2013 he won the national plebiscite “Poles with Verve” in the Science category.

Bujnicki has been involved in various scientific organizations and advisory bodies, including the Polish Young Academy, civic movement Citizens of Science, Life, Environmental and Geo Sciences panel of the Science Europe organization, and Scientific Policy Committee – an advisory body of the Ministry of Science and Higher Education in Poland. He is also an executive editor of the scientific journal Nucleic Acids Research.

Pearl Dykstra

Professor of Sociology, Erasmus University Rotterdam

Professor Dykstra has a chair in Empirical Sociology and is Director of Research of the Department of Public Administration and Sociology at the Erasmus University Rotterdam. Previously, she had a chair in Kinship Demography at Utrecht University (2002-2009) and was a senior scientist at the Netherlands Interdisciplinary Demographic Institute (NIDI) in The Hague (1990-2009).

Her publications focus on intergenerational solidarity, aging societies, family change, aging and the life course, and late-life well-being. She is an elected member of the Netherlands Royal Academy of Arts and Sciences (KNAW, 2004) and Vice-President of the KNAW as of 2011, elected Member of the Dutch Social Sciences Council (SWR, 2006), and elected Fellow of the Gerontological Society of America (2010). In 2012 she received an ERC Advanced Investigator Grant for the research project “Families in context”, which will focus on the ways in which policy, economic, and cultural contexts structure interdependence in families.

Elvira Fortunato

Deputy Chair

Professor, Materials Science Department of the Faculty of Science and Technology, NOVA University, Lisbon

Professor Fortunato is a full professor in the Materials Science Department of the Faculty of Science and Technology of the New University of Lisbon, a Fellow of the Portuguese Engineering Academy since 2009 and decorated as a Grand Officer of the Order of Prince Henry the Navigator by the President of the Republic in 2010, due to her scientific achievements worldwide. In 2015 she was appointed by the Portuguese President Chairman of the Organizing Committee of the Celebrations of the National Day of Portugal, Camões and the Portuguese Communities.

She was also a member of the Portuguese National Scientific & Technological Council between 2012 and 2015 and a member of the advisory board of DG CONNECT (2014-15).

Currently she is the director of the Institute of Nanomaterials, Nanofabrication and Nanomodeling and of CENIMAT. She is member of the board of trustees of Luso-American Foundation (Portugal/USA, 2013-2020).

Fortunato pioneered European research on transparent electronics, namely thin-film transistors based on oxide semiconductors, demonstrating that oxide materials can be used as true semiconductors. In 2008, she received in the 1st ERC edition an Advanced Grant for the project “Invisible”, considered a success story. In the same year she demonstrated with her colleagues the possibility to make the first paper transistor, starting a new field in the area of paper electronics.

Fortunato published over 500 papers and during the last 10 years received more than 16 International prizes and distinctions for her work (e.g: IDTechEx USA 2009 (paper transistor); European Woman Innovation prize, Finland 2011).

Rolf-Dieter Heuer

Director-General of the European Organization for Nuclear Research (CERN)

Professor Heuer is an experimental particle physicist and has been CERN Director-General since January 2009. His mandate, ending December 2015, is characterised by the start of the Large Hadron Collider (LHC) 2009 as well as its energy increase 2015, the discovery of the H-Boson and the geographical enlargement of CERN Membership. He also actively engaged CERN in promoting the importance of science and STEM education for the sustainable development of the society. From 2004 to 2008, Prof. Heuer was research director for particle and astroparticle physics at the DESY laboratory, Germany where he oriented the particle physics groups towards LHC by joining both large experiments, ATLAS and CMS. He has initiated restructuring and focusing of German high energy physics at the energy frontier with particular emphasis on LHC (Helmholtz Alliance “Physics at the Terascale”). In April 2016 he will become President of the German Physical Society. He is designated President of the Council of SESAME (Synchrotron-Light for Experimental Science and Applications in the Middle East).

Prof. Heuer has published over 500 scientific papers and holds many Honorary Degrees from universities in Europe, Asia, Australia and Canada. He is Member of several Academies of Sciences in Europe, in particular of the German Academy of Sciences Leopoldina, and Honorary Member of the European Physical Society. In 2015 he received the Grand Cross 1st class of the Order of Merit of the Federal Republic of Germany.

Julia Slingo

Chief Scientist, Met Office, Exeter

Dame Julia Slingo became Met Office Chief Scientist in February 2009 where she leads a team of over 500 scientists working on a very broad portfolio of research that underpins weather forecasting, climate prediction and climate change projections. During her time as Chief Scientist she has fostered much stronger scientific partnerships across UK academia and international research organisations, recognising the multi-disciplinary and grand challenge nature of weather and climate science and services. She works closely with UK Government Chief Scientific Advisors and is regularly called to give evidence on weather and climate related issues.

Before joining the Met Office she was the Director of Climate Research in NERC’s National Centre for Atmospheric Science, at the University of Reading. In 2006 she founded the Walker Institute for Climate System Research at Reading, aimed at addressing the cross disciplinary challenges of climate change and its impacts. Julia has had a long-term career in atmospheric physics, climate modelling and tropical climate variability, working at the Met Office, ECMWF and NCAR in the USA.

Dame Julia has published over 100 peer reviewed papers and has received numerous awards including the prestigious IMO Prize of the World Meteorological Organization for her outstanding work in meteorology, climatology, hydrology and related sciences. She is a Fellow of the Royal Society, an Honorary Fellow of the Royal Society of Chemistry and an Honorary Fellow of the Institute of Physics.

Cédric Villani

Director, Henri Poincaré Institute, Paris

Born in 1973 in France, Cédric Villani is a mathematician, director of the Institut Henri Poincaré in Paris (from 2009), and professor at the Université Claude Bernard of Lyon (from 2010). In December 2013 he was elected to the French Academy of Sciences.

He has worked on the theory of partial differential equations involved in statistical mechanics, specifically the Boltzmann equation, and on nonlinear Landau damping. He was awarded the Fields Medal in 2010 for his works.

Since then he has been playing an informal role of ambassador for the French mathematical community to the media (press, radio, television) and society in general. His books for non-specialists, in particular Théorème vivant (2012, translated into a dozen languages), La Maison des mathématiques (2014, with J.-Ph. Uzan and V. Moncorgé) and Les Rêveurs lunaires (2015, with E. Baudoin) have all found a wide audience. He has also given hundreds of lectures for all kinds of audiences around the world.

He participates actively in the administration of science, through the Institut Henri Poincaré, but also by sitting in a number of panels and committees, including the higher council of research and the strategic council of Paris. Since 2010 he has been involved in fostering mathematics in Africa, through programs by the Next Einstein Initiative and the World Bank.

Believing in the commitment of scientists in society, Villani is also President of the Association Musaïques, a European federalist and a father of two.

Henrik C. Wegener

Chair

Executive Vice President, Chief Academic Officer and Provost, Technical University of Denmark

Henrik C. Wegener has been Executive Vice President and Chief Academic Officer at the Technical University of Denmark since 2011. He received his M.Sc. in food science and technology at the University of Copenhagen in 1988, his Ph.D. in microbiology at the University of Copenhagen in 1992, and his Master in Public Administration (MPA) from Copenhagen Business School in 2005.

Henrik C. Wegener was director of the National Food Institute, DTU from 2006-2011 and before that head of the Department of Epidemiology and Risk Assessment at the National Food and Veterinary Research Institute, Denmark (2004-2006). From 1994-1999, he was director of the Danish Zoonosis Centre, and from 1999-2004 professor of zoonosis epidemiology at the Danish Veterinary Institute. He was stationed at World Health Organization headquarters in Geneva from 1999-2000. With more than 3,700 citations (h-index 34), he is the author of over 150 scientific papers in journals, research monographs and proceedings, on food safety, zoonoses, antimicrobial resistance and emerging infectious diseases.

He has served as advisor and reviewer to national and international authorities & governments, international organizations and private companies, universities and research foundations, and he has served, and is presently serving, on several national and international committees and boards on food safety, veterinary public health and research policy.

Henrik C. Wegener has received several awards, including the Alliance for the Prudent Use of Antibiotics International Leadership Award in 2003.

That’s quite a mix of sciences and I’m happy to see a social scientist has been included.

Conference submissions

Getting back to the conference and its call for speakers, the deadline for submissions is March 25, 2016. Interestingly, there’s also this (from the conference webpage),

The deadline for submissions is 25th March 2016. The conference programme committee with session chairs will review all proposals and select those that best fit the aim of each session while also representing a diverse range of perspectives. We aim to inform selected speakers within 4 weeks of the deadline to enable travel planning to Brussels.

To make the conference as accessible as possible, there is no registration fee. [emphasis mine] The European Commission will cover travel accommodation costs only for confirmed speakers for whom the travel and accommodation arrangements will be made by the Commission itself, on the basis of the speakers’ indication.

Good luck!

*‘Conference submissions’ heading added on Feb. 29, 2016 at 1155 hours.

Ice-free materials courtesy of penguins

The Humboldt penguin’s feathers don’t allow ice to form, and a team of scientists has figured out why, according to a Feb. 24, 2016 news item on Nanotechnology Now,

Humboldt penguins live in places that dip below freezing in the winter, and despite getting wet, their feathers stay sleek and free of ice. Scientists have now figured out what could make that possible. They report in ACS’ Journal of Physical Chemistry C that the key is in the microstructure of penguins’ feathers. Based on their findings, the scientists replicated the architecture in a nanofiber membrane that could be developed into an ice-proof material.

A Feb. 24, 2016 American Chemical Society (ACS) news release on EurekAlert, which originated the news item, provides a bit more detail,

The range of Humboldt penguins extends from coastal Peru to the tip of southern Chile. Some of these areas can get frigid, and the water the birds swim in is part of a cold ocean current that sweeps up the coast from the Antarctic. Their feathers keep them both warm and ice-free. Scientists had suspected that penguin feathers’ ability to easily repel water explained why ice doesn’t accumulate on them: Water would slide off before freezing. But research has found that under high humidity or ultra-low temperatures, ice can stick to even superhydrophobic surfaces. So Jingming Wang and colleagues sought another explanation.

The researchers closely examined Humboldt penguin feathers using a scanning electron microscope. They found that the feathers were composed of a network of barbs, wrinkled barbules and tiny interlocking hooks. In addition to being hydrophobic, this hierarchical architecture with grooved structures is anti-adhesive. Testing showed ice wouldn’t stick to it. Mimicking the feathers’ microstructure, the researchers developed an icephobic polyimide fiber membrane. They say it could potentially be used in applications such as electrical insulation.

The researchers have provided an image illustrating their work,

[downloaded from http://pubs.acs.org/doi/abs/10.1021/acs.jpcc.5b12298]

Here’s a link to and a citation for the paper,

Icephobicity of Penguins Spheniscus Humboldti and an Artificial Replica of Penguin Feather with Air-Infused Hierarchical Rough Structures by Shuying Wang, Zhongjia Yang, Guangming Gong, Jingming Wang, Juntao Wu, Shunkun Yang, and Lei Jiang. J. Phys. Chem. C, Article ASAP. DOI: 10.1021/acs.jpcc.5b12298. Publication Date (Web): February 3, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall.

Biological supercomputers (living, breathing supercomputers) and an international collaboration spearheaded by Canadian scientists

A living, breathing supercomputer is a bit mind-boggling, but scientists at McGill University (Canada) and their international colleagues have created a working model, according to a Feb. 26, 2016 McGill University news release on EurekAlert (and received via email), Note: A link has been removed,

The substance that provides energy to all the cells in our bodies, Adenosine triphosphate (ATP), may also be able to power the next generation of supercomputers. That is what an international team of researchers led by Prof. Nicolau, the Chair of the Department of Bioengineering at McGill, believe. They’ve published an article on the subject earlier this week in the Proceedings of the National Academy of Sciences (PNAS), in which they describe a model of a biological computer that they have created that is able to process information very quickly and accurately using parallel networks in the same way that massive electronic super computers do.

Except that the model bio supercomputer they have created is a whole lot smaller than current supercomputers, uses much less energy, and uses proteins present in all living cells to function.

Doodling on the back of an envelope

“We’ve managed to create a very complex network in a very small area,” says Dan Nicolau, Sr. with a laugh. He began working on the idea with his son, Dan Jr., more than a decade ago and was then joined by colleagues from Germany, Sweden and The Netherlands, some 7 years ago [there is also one collaborator from the US according to the journal’s [PNAS] list of author affiliations; read on for the link to the paper]. “This started as a back of an envelope idea, after too much rum I think, with drawings of what looked like small worms exploring mazes.”

The model bio-supercomputer that the Nicolaus (father and son) and their colleagues have created came about thanks to a combination of geometrical modelling and engineering knowhow (on the nano scale). It is a first step, in showing that this kind of biological supercomputer can actually work.

The circuit the researchers have created looks a bit like a road map of a busy and very organized city as seen from a plane. Just as in a city, cars and trucks of different sizes, powered by motors of different kinds, navigate through channels that have been created for them, consuming the fuel they need to keep moving.

More sustainable computing

But in the case of the biocomputer, the city is a chip measuring about 1.5 cm square in which channels have been etched. Instead of the electrons that are propelled by an electrical charge and move around within a traditional microchip, short strings of proteins (which the researchers call biological agents) travel around the circuit in a controlled way, their movements powered by ATP, the chemical that is, in some ways, the juice of life for everything from plants to politicians.

Because it is run by biological agents, and as a result hardly heats up at all, the model bio-supercomputer that the researchers have developed uses far less energy than standard electronic supercomputers do, making it more sustainable. Traditional supercomputers use so much electricity that they heat up a lot and then need to be cooled down, often requiring their own power plant to function.

Moving from model to reality

Although the model bio supercomputer was able to very efficiently tackle a complex classical mathematical problem by using parallel computing of the kind used by supercomputers, the researchers recognize that there is still a lot of work ahead to move from the model they have created to a full-scale functional computer.

“Now that this model exists as a way of successfully dealing with a single problem, there are going to be many others who will follow up and try to push it further, using different biological agents, for example,” says Nicolau. “It’s hard to say how soon it will be before we see a full scale bio super-computer. One option for dealing with larger and more complex problems may be to combine our device with a conventional computer to form a hybrid device. Right now we’re working on a variety of ways to push the research further.”

What was once the stuff of science fiction is now just science.

The funding for this project is interesting,

This research was funded by: The European Union Seventh Framework Programme; [US] Defense Advanced Research Projects Agency [DARPA]; NanoLund; The Miller Foundation; The Swedish Research Council; The Carl Trygger Foundation; the German Research Foundation; and by Linnaeus University.

I don’t see a single Canadian funding agency listed.

In any event, here’s a link to and a citation for the paper,

Parallel computation with molecular-motor-propelled agents in nanofabricated networks by Dan V. Nicolau, Jr., Mercy Lard, Till Korten, Falco C. M. J. M. van Delft, Malin Persson, Elina Bengtsson, Alf Månsson, Stefan Diez, Heiner Linke, and Dan V. Nicolau. Proceedings of the National Academy of Sciences (PNAS): http://www.pnas.org/content/early/2016/02/17/1510825113

This paper appears to be open access.

Finally, the researchers have provided an image illustrating their work,

Caption: Strands of proteins of different lengths move around the chip in the bio computer in directed patterns, a bit like cars and trucks navigating the streets of a city. Credit: Till Korten

ETA Feb. 29, 2016: Technical University Dresden’s Feb. 26, 2016 press release on EurekAlert also announces the bio-computer, albeit from a rather different perspective,

The pioneering achievement was developed by researchers from the Technische Universität Dresden and the Max Planck Institute of Molecular Cell Biology and Genetics, Dresden in collaboration with international partners from Canada, England, Sweden, the US, and the Netherlands.

Conventional electronic computers have led to remarkable technological advances in the past decades, but their sequential nature (they process only one computational task at a time) prevents them from solving problems of combinatorial nature such as protein design and folding, and optimal network routing. This is because the number of calculations required to solve such problems grows exponentially with the size of the problem, rendering them intractable with sequential computing. Parallel computing approaches can in principle tackle such problems, but the approaches developed so far have suffered from drawbacks that have made up-scaling and practical implementation very difficult. The recently reported parallel-computing approach aims to address these issues by combining well established nanofabrication technology with molecular motors which are highly energy efficient and inherently work in parallel.

In this approach, which the researchers demonstrate on a benchmark combinatorial problem that is notoriously hard to solve with sequential computers, the problem to be solved is ‘encoded’ into a network of nanoscale channels (Fig. 1a). This is done, on the one hand by mathematically designing a geometrical network that is capable of representing the problem, and on the other hand by fabricating a physical network based on this design using so-called lithography, a standard chip-manufacturing technique.

The network is then explored in parallel by many protein filaments (here actin filaments or microtubules) that are self-propelled by a molecular layer of motor proteins (here myosin or kinesin) covering the bottom of the channels (Fig. 3a). The design of the network using different types of junctions automatically guides the filaments to the correct solutions to the problem (Fig. 1b). This is realized by different types of junctions, causing the filaments to behave in two different ways. As the filaments are rather rigid structures, turning to the left or right is only possible for certain angles of the crossing channels. By defining these options (‘split junctions’ Fig. 2a + 3b and ‘pass junctions’, Fig. 2b + 3c) the scientists achieved an ‘intelligent’ network giving the filaments the opportunity either to cross only straight or to decide between two possible channels with a 50/50 probability.

The time to solve combinatorial problems of size N using this parallel-computing approach scales approximately as N^2, which is a dramatic improvement over the exponential (2^N) time scales required by conventional, sequential computers. Importantly, the approach is fully scalable with existing technologies and uses orders of magnitude less energy than conventional computers, thus circumventing the heating issues that are currently limiting the performance of conventional computing.

The diagrams mentioned were not included with the press release.
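To make the junction mechanism described above concrete, here is a minimal sketch, assuming a subset-sum-style benchmark (the press release doesn’t name the exact problem, so the instance and all names below are my own illustration, not the paper’s network):

```python
import random

# Hypothetical illustration of computing with 'split' and 'pass'
# junctions: each filament meets one split junction per set element and
# either adds that element to its running position (turning) or skips
# it (passing straight). Its exit position then encodes one subset sum.
SET = [2, 5, 9]  # made-up problem instance

def run_filament():
    position = 0
    for element in SET:
        if random.random() < 0.5:  # split junction: 50/50 choice
            position += element    # the 'turn' lane adds the element
        # otherwise the filament passes straight and skips the element
    return position                # exit column = total of the chosen subset

# Many self-propelled agents explore the network simultaneously;
# here they are simulated one after another.
exits = {run_filament() for _ in range(10_000)}
print(sorted(exits))  # achievable subset sums: [0, 2, 5, 7, 9, 11, 14, 16]
```

The number of possible paths doubles with every element added to the set, which is the exponential (2^N) growth the release mentions; the physical network absorbs that growth with more filaments rather than more time, which is why the solution time can scale roughly as N^2 instead.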

Quantum physics experiments designed by an algorithm

A Feb. 22, 2016 news item on Nanotechnology Now describes quantum physics experiments designed by an algorithm,

Quantum physicist Mario Krenn and his colleagues in the group of Anton Zeilinger from the Faculty of Physics at the University of Vienna and the Austrian Academy of Sciences have developed an algorithm which designs new useful quantum experiments. As the computer does not rely on human intuition, it finds novel unfamiliar solutions.

The researchers have provided an image illustrating their work,

Caption: The algorithm Melvin found out that the most simple realization can be asymmetric and therefore counterintuitive. Credit: Copyright: Robert Fickler, Universität Wien (University of Vienna)

A Feb. 22, 2016 University of Vienna press release (also on EurekAlert), which originated the news item, expands on the theme,

The idea was developed when the physicists wanted to create new quantum states in the laboratory, but were unable to conceive of methods to do so. “After many unsuccessful attempts to come up with an experimental implementation, we came to the conclusion that our intuition about these phenomena seems to be wrong. We realized that in the end we were just trying random arrangements of quantum building blocks. And that is what a computer can do as well – but thousands of times faster”, explains Mario Krenn, PhD student in Anton Zeilinger’s group and first author of the research.

After a few hours of calculation, their algorithm – which they call Melvin – found the recipe to the question they were unable to solve, and its structure surprised them. Zeilinger says: “Suppose I want to build an experiment realizing a specific quantum state I am interested in. Then humans intuitively consider setups reflecting the symmetries of the state. Yet Melvin found out that the most simple realization can be asymmetric and therefore counterintuitive. A human would probably never come up with that solution.”

The physicists applied the idea to several other questions and got dozens of new and surprising answers. “The solutions are difficult to understand, but we were able to extract some new experimental tricks we have not thought of before. Some of these computer-designed experiments are being built at the moment in our laboratories”, says Krenn.

Melvin not only tries random arrangements of experimental components, but also learns from previous successful attempts, which significantly speeds up the discovery rate for more complex solutions. In the future, the authors want to apply their algorithm to even more general questions in quantum physics, and hope it helps to investigate new phenomena in laboratories.
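The press release doesn’t reveal Melvin’s internals, but the strategy it describes (random arrangements of building blocks plus learning from successful attempts) can be sketched in a few lines. This is a loose, hypothetical sketch of mine; the component names, the success test and the learning rule are placeholders, not the actual implementation:

```python
import random

# Placeholder optical components; Melvin's real toolbox differs.
BUILDING_BLOCKS = ["beam_splitter", "mirror", "hologram", "dove_prism"]

def produces_interesting_state(setup):
    """Stand-in for the physics calculation that checks whether a
    candidate arrangement yields a useful quantum state."""
    return random.random() < 0.001  # dummy criterion for illustration

# Memory of components seen in successful setups; sampling from it
# biases the search toward arrangements that worked before, which is
# the 'learning' the release describes.
pool = list(BUILDING_BLOCKS)

for attempt in range(100_000):
    setup = [random.choice(pool) for _ in range(6)]  # random arrangement
    if produces_interesting_state(setup):
        pool.extend(setup)  # reinforce the components of a success
        print(f"attempt {attempt}: promising setup {setup}")
```

The feedback step is the key point: without it the loop is pure trial and error; with it, useful sub-arrangements become increasingly likely to reappear, which is what the researchers say speeds up the discovery of more complex solutions.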

Here’s a link to and a citation for the paper,

Automated Search for new Quantum Experiments by Mario Krenn, Mehul Malik, Robert Fickler, Radek Lapkiewicz, Anton Zeilinger. arXiv (Submitted on 9 Sep 2015 (v1), last revised 20 Feb 2016 (this version, v2))

The version of the paper on arXiv.org is open access. The paper has also been accepted by Physical Review Letters but does not seem to have been published online or in print yet,

Automated search for new quantum experiments
by Mario Krenn, Mehul Malik, Robert Fickler, Radek Lapkiewicz, and Anton Zeilinger. Phys. Rev. Lett. Accepted 27 January 2016

There is a copy of the abstract available on the Physical Review Letters site.

Hypersensitivity to nanomedicine: the CARPA reaction

There is some intriguing research (although I do have a reservation) into some unexpected side effects that nanomedicine may have, according to a Feb. 23, 2016 news item on phys.org,

Keywords such as nano-, personalized-, or targeted medicine sound like the bright future. What most people do not know is that nanomedicines can cause severe undesired effects for actually being too big! Those modern medicines easily achieve the size of viruses, which the body potentially recognizes as foreign and starts to defend itself against — a sometimes severe immune response unfolds.

The CARPA phenomenon (Complement Activation-Related PseudoAllergy) is a frequent hypersensitivity response to nanomedicine application. Up to 100 patients worldwide suffer each day from severe reactions, such as cardiac distress, difficulty breathing, chest and back pain or fainting, when their blood gets exposed to certain nanoparticles during medical treatment. Every 10 days one patient even dies due to an uncontrollable anaphylactoid reaction.

Apart from being activated in a different way, this pseudoallergy has the same symptoms as a common allergy, bearing a crucial difference: the reaction takes place without previous sensitizing exposure to a substance, making it hard to predict whether a person will react to a specific nanodrug or be safe. Intrigued by this vital challenge, János Szebeni from Semmelweis University, Budapest, has been working with scientific verve on the decipherment and prevention of the CARPA phenomenon for more than 20 years. With his invaluable support, De Gruyter’s European Journal of Nanomedicine (EJNM) lately devoted an elaborate compilation to the most recent scientific advances on CARPA, presented by renowned experts on the subject.

A Feb. 23, 2016 De Gruyter Publishers press release, which originated the news item, provides more detail,

Interestingly, it’s pigs that turned out to serve as the best model for research on the complex pathomechanism, diagnosis and potential treatment of CARPA. “Pigs’ sensitivity equals that of humans responding most vehemently to reactogenic nanomedicines”, Szebeni states. In a contribution to EJNM’s compilation on CARPA, Rudolf Urbanics and colleagues show that reactions to specific nanodrugs are even quantitatively reproducible in pigs … . Szebeni: “This is absolutely rare in allergy research. In these animals the endpoint of the overreaction is reflected in a rise of pulmonary arterial pressure, being as accurate as a Swiss watch”. Pigs can thus be used for drug screening and prediction of the CARPAgenic potential of nanomedicines. This becomes increasingly important with the ever growing interest in modern drugs requiring reliable preclinical safety assays during the translation process from bench to bedside. Results might also help to personalize nanomedicine administration schedules during, for example, the targeted treatment of cancer. The same holds true for a very recently developed in vitro immunoassay. By simply using a patient’s blood sample, it tests for potential CARPA reactions even before application of specific nanodrugs.

Here’s a link to and a citation for the paper,

Lessons learned from the porcine CARPA model: constant and variable responses to different nanomedicines and administration protocols by Rudolf Urbanics, Péter Bedőcs, János Szebeni. European Journal of Nanomedicine. Volume 7, Issue 3, Pages 219–231, ISSN (Online) 1662-596X, ISSN (Print) 1662-5986, DOI: 10.1515/ejnm-2015-0011, June 2015

This paper appears to be open access.

As for reservations, I’m not sure what occasioned the news release so many months after publication of the paper, and it should be noted that János Szebeni is both one of the paper’s authors and the editor of the European Journal of Nanomedicine.

King Abdullah University of Science and Technology (Saudi Arabia) develops sensors from household materials

Researchers at the King Abdullah University of Science and Technology (KAUST) are developing sensors made of household materials, according to a Feb. 19, 2016 KAUST news release (also on EurekAlert but dated Feb. 21, 2016),

Everyday materials from the kitchen drawer, such as aluminum foil, sticky note paper, sponges and tape, have been used by a team of electrical engineers from KAUST to develop a low-cost sensor that can detect external stimuli, including touch, pressure, temperature, acidity and humidity.

The sensor, which is called Paper Skin, performs as well as other artificial skin applications currently being developed while integrating multiple functions using cost-effective materials.

“This work has the potential to revolutionize the electronics industry and opens the door to commercializing affordable high-performance sensing devices,” stated Muhammad Mustafa Hussain from the University’s Integrated Nanotechnology Lab, where the research was conducted.

Wearable and flexible electronics show promise for a variety of applications, such as wireless monitoring of patient health and touch-free computer interfaces. Current research in this direction employs expensive and sophisticated materials and processes.

The team used sticky note paper to detect humidity, sponges and wipes to detect pressure and aluminum foil to detect motion. Coloring a sticky note with an HB pencil allowed the paper to detect acidity levels, and aluminum foil and conductive silver ink were used to detect temperature differences.

The materials were put together into a simple paper-based platform that was then connected to a device that detected changes in electrical conductivity according to external stimuli.

Increasing levels of humidity, for example, increased the platform’s ability to store an electrical charge, or its capacitance. Exposing the sensor to an acidic solution increased its resistance, while exposing it to an alkaline solution decreased it. Voltage changes were detected with temperature changes. Bringing a finger closer to the platform disturbed its electromagnetic field, decreasing its capacitance.

The team leveraged the various properties of the materials they used, including their porosity, adsorption, elasticity and dimensions to develop the low-cost sensory platform. They also demonstrated that a single integrated platform could simultaneously detect multiple stimuli in real time.
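As a rough idea of what reading out such a platform might look like in software, here is a minimal sketch that maps changes in the measured electrical quantities back to stimuli, following the qualitative behaviour described above. The baseline values, thresholds and labels are my own assumptions, not figures from the paper:

```python
# Infer stimuli from the direction of change in a sensing pad's readings.
# All numbers are hypothetical placeholders for illustration.
def interpret(reading, baseline):
    """reading/baseline: dicts with 'capacitance' (F), 'resistance' (ohm)
    and 'voltage' (V) for one sensing pad."""
    events = []
    if reading["capacitance"] > baseline["capacitance"] * 1.1:
        events.append("humidity rise (more stored charge)")
    elif reading["capacitance"] < baseline["capacitance"] * 0.9:
        events.append("approaching finger (disturbed electromagnetic field)")
    if reading["resistance"] > baseline["resistance"] * 1.1:
        events.append("acidic exposure (resistance up)")
    elif reading["resistance"] < baseline["resistance"] * 0.9:
        events.append("alkaline exposure (resistance down)")
    if abs(reading["voltage"] - baseline["voltage"]) > 0.05:
        events.append("temperature change (voltage shift)")
    return events or ["no stimulus detected"]

baseline = {"capacitance": 1.0e-9, "resistance": 1.0e4, "voltage": 0.0}
sample = {"capacitance": 1.3e-9, "resistance": 0.8e4, "voltage": 0.12}
print(interpret(sample, baseline))
```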

Several challenges must be overcome before a fully autonomous, flexible and multifunctional sensory platform becomes commercially achievable, explained Hussain. Wireless interaction with the paper skin needs to be developed. Reliability tests also need to be conducted to assess how long the sensor can last and how good its performance is under severe bending conditions.

“The next stage will be to optimize the sensor’s integration on this platform for applications in medical monitoring systems. The flexible and conformal sensory platform will enable simultaneous real-time monitoring of body vital signs, such as heart rate, blood pressure, breathing patterns and movement,” Hussain said.

Here’s a link to and a citation for the paper,

Paper Skin Multisensory Platform for Simultaneous Environmental Monitoring by Joanna M. Nassar, Marlon D. Cordero, Arwa T. Kutbee, Muhammad A. Karimi, Galo A. Torres Sevilla, Aftab M. Hussain, Atif Shamim, and Muhammad M. Hussain. Advanced Materials Technologies DOI: 10.1002/admt.201600004 Article first published online: 19 FEB 2016

© 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

This appears to be an open access paper.

Short-term exposure to engineered nanoparticles used for semiconductors not too risky?

Short-term exposure means anywhere from 30 minutes to 48 hours according to the news release, and the concentrations used were much higher than would be expected in current real-life conditions. Still, this research from the University of Arizona and collaborators represents an addition to the data about engineered nanoparticles (ENP) and their possible impact on health and safety. From a Feb. 22, 2016 news item on phys.org,

Short-term exposure to engineered nanoparticles used in semiconductor manufacturing poses little risk to people or the environment, according to a widely read research paper from a University of Arizona-led research team.

Co-authored by 27 researchers from eight U.S. universities, the article, “Physical, chemical and in vitro toxicological characterization of nanoparticles in chemical mechanical planarization suspensions used in the semiconductor industry: towards environmental health and safety assessments,” was published in the Royal Society of Chemistry journal Environmental Science: Nano in May 2015. The paper, which calls for further analysis of potential toxicity for longer exposure periods, was one of the journal’s 10 most downloaded papers in 2015.

A Feb. 17, 2016 University of Arizona news release (also on EurekAlert), which originated the news item, provides more detail,

“This study is extremely relevant both for industry and for the public,” said Reyes Sierra, lead researcher of the study and professor of chemical and environmental engineering at the University of Arizona.

Small Wonder

Engineered nanoparticles are used to make semiconductors, solar panels, satellites, food packaging, food additives, batteries, baseball bats, cosmetics, sunscreen and countless other products. They also hold great promise for biomedical applications, such as cancer drug delivery systems.

Designing and studying nano-scale materials is no small feat. Most university researchers produce them in the laboratory to approximate those used in industry. But for this study, Cabot Microelectronics provided slurries of engineered nanoparticles to the researchers.

“Minus a few proprietary ingredients, our slurries were exactly the same as those used by companies like Intel and IBM,” Sierra said. Both companies collaborated on the study.

The engineers analyzed the physical, chemical and biological attributes of four metal oxide nanomaterials — ceria, alumina, and two forms of silica — commonly used in chemical mechanical planarization slurries for making semiconductors.

Clean Manufacturing

Chemical mechanical planarization is the process used to etch and polish silicon wafers to be smooth and flat so the hundreds of silicon chips attached to their surfaces will produce properly functioning circuits. Even the most infinitesimal scratch on a wafer can wreak havoc on the circuitry.

When their work is done, engineered nanoparticles are released to wastewater treatment facilities. Engineered nanoparticles are not regulated, and their prevalence in the environment is poorly understood [emphasis mine].

Researchers at the UA and around the world are studying the potential effects of these tiny and complex materials on human health and the environment.

“One of the few things we know for sure about engineered nanoparticles is that they behave very differently than other materials,” Sierra said. “For example, they have much greater surface area relative to their volume, which can make them more reactive. We don’t know whether this greater reactivity translates to enhanced toxicity.”

The researchers exposed the four nanoparticles, suspended in separate slurries, to adenocarcinoma human alveolar basal epithelial cells at doses up to 2,000 milligrams per liter for 24 to 38 hours, and to marine bacteria cells, Aliivibrio fischeri, up to 1,300 milligrams per liter for approximately 30 minutes.

These concentrations are much higher than would be expected in the environment, Sierra said.

Using a variety of techniques, including toxicity bioassays, electron microscopy, mass spectrometry and laser scattering, to measure such factors as particle size, surface area and particle composition, the researchers determined that all four nanoparticles posed low risk to the human and bacterial cells.

“These nanoparticles showed no adverse effects on the human cells or the bacteria, even at very high concentrations,” Sierra said. “The cells showed the very same behavior as cells that were not exposed to nanoparticles.”

The authors recommended further studies to characterize potential adverse effects at longer exposures and higher concentrations.

“Think of a fish in a stream where wastewater containing nanoparticles is discharged,” Sierra said. “Exposure to the nanoparticles could be for much longer.”

Here’s a link to and a citation for the paper,

Physical, chemical, and in vitro toxicological characterization of nanoparticles in chemical mechanical planarization suspensions used in the semiconductor industry: towards environmental health and safety assessments by David Speed, Paul Westerhoff, Reyes Sierra-Alvarez, Rockford Draper, Paul Pantano, Shyam Aravamudhan, Kai Loon Chen, Kiril Hristovski, Pierre Herckes, Xiangyu Bi, Yu Yang, Chao Zeng, Lila Otero-Gonzalez, Carole Mikoryak, Blake A. Wilson, Karshak Kosaraju, Mubin Tarannum, Steven Crawford, Peng Yi, Xitong Liu, S. V. Babu, Mansour Moinpour, James Ranville, Manuel Montano, Charlie Corredor, Jonathan Posner, and Farhang Shadman. Environ. Sci.: Nano, 2015,2, 227-244 DOI: 10.1039/C5EN00046G First published online 14 May 2015

This is open access but you may need to register before reading the paper.

The bit about nanoparticles’ “… prevalence in the environment is poorly understood …” and the focus of this research reminded me of an April 2014 announcement (my April 8, 2014 posting; scroll down about 40% of the way) regarding a new research network being hosted by Arizona State University, the LCnano network, which is part of the Life Cycle of Nanomaterials project being funded by the US National Science Foundation. The network’s (LCnano) director is Paul Westerhoff, who is also one of this report’s authors.

Using copyright to shut down easy access to scientific research

This started out as a simple post on copyright and publishers vis-à-vis Sci-Hub, but then John Dupuis wrote a think piece (with which I disagree somewhat) on the situation in a Feb. 22, 2016 posting on his blog, Confessions of a Science Librarian. More on Dupuis and my take on it after a description of the situation.

Sci-Hub

Before getting to the controversy and legal suit, here’s a preamble about the purpose of copyright as per the US constitution from Mike Masnick’s Feb. 17, 2016 posting on Techdirt,

Lots of people are aware of the Constitutional underpinnings of our copyright system. Article 1, Section 8, Clause 8 famously says that Congress has the following power:

To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries.

We’ve argued at great length over the importance of the preamble of that section, “to promote the progress,” but many people are confused about the terms “science” and “useful arts.” In fact, many people not well-versed in the issue often get the two backwards and think that “science” refers to inventions, and thus enables a patent system, while “useful arts” refers to “artistic works” and thus enables the copyright system. The opposite is actually the case. “Science” at the time the Constitution was written was actually synonymous with “learning” and “education” (while “useful arts” was a term meaning invention and new productivity tools).

While over the centuries, many who stood to benefit from an aggressive system of copyright control have tried to rewrite, whitewash or simply ignore this history, turning the copyright system falsely into a “property” regime, the fact is that it was always intended as a system to encourage the wider dissemination of ideas for the purpose of education and learning. The (potentially misguided) intent appeared to be that by granting exclusive rights to a certain limited class of works, it would encourage the creation of those works, which would then be useful in educating the public (and within a few decades enter the public domain).

Masnick’s preamble leads to a case where the publisher Elsevier has attempted to halt the very successful Sci-Hub, which bills itself as “the first pirate website in the world to provide mass and public access to tens of millions of research papers.” From Masnick’s Feb. 17, 2016 posting,

Rightfully, this is being celebrated as a massive boon to science and learning, making these otherwise hidden nuggets of knowledge and science that were previously locked up and hidden away available to just about anyone. And, to be clear, this absolutely fits with the original intent of copyright law — which was to encourage such learning. In a very large number of cases, it is not the creators of this content and knowledge who want the information to be locked up. Many researchers and academics know that their research has much more of an impact the wider it is seen, read, shared and built upon. But the gatekeepers — such as Elsevier and other large academic publishers — have stepped in and demanded copyright, basically for doing very little.

They do not pay the researchers for their work. Often, in fact, that work is funded by taxpayer funds. In some cases, in certain fields, the publishers actually demand that the authors of these papers pay to submit them. The journals do not pay to review the papers either. They outsource that work to other academics for “peer review” — which again, is unpaid. Finally, these publishers profit massively, having convinced many universities that they need to subscribe, often paying many tens or even hundreds of thousands of dollars for subscriptions to journals that very few actually read.

Simon Oxenham of the Neurobonkers blog on the big think website wrote a Feb. 9 (?), 2016 post about Sci-Hub, its originator, and its current legal fight (Note: Links have been removed),

On September 5th, 2011, Alexandra Elbakyan, a researcher from Kazakhstan, created Sci-Hub, a website that bypasses journal paywalls, illegally providing access to nearly every scientific paper ever published immediately to anyone who wants it. …

This was a game changer. Before September 2011, there was no way for people to freely access paywalled research en masse; researchers like Elbakyan were out in the cold. Sci-Hub is the first website to offer this service and now makes the process as simple as the click of a single button.

As the number of papers in the LibGen database expands, the frequency with which Sci-Hub has to dip into publishers’ repositories falls and consequently the risk of Sci-Hub triggering its alarm bells becomes ever smaller. Elbakyan explains, “We have already downloaded most paywalled articles to the library … we have almost everything!” This may well be no exaggeration. Elsevier, one of the most prolific and controversial scientific publishers in the world, recently alleged in court that Sci-Hub is currently harvesting Elsevier content at a rate of thousands of papers per day. Elbakyan puts the number of papers downloaded from various publishers through Sci-Hub in the range of hundreds of thousands per day, delivered to a running total of over 19 million visitors.

In one fell swoop, a network has been created that likely has a greater level of access to science than any individual university, or even government for that matter, anywhere in the world. Sci-Hub represents the sum of countless different universities’ institutional access — literally a world of knowledge. This is important now more than ever in a world where even Harvard University can no longer afford to pay skyrocketing academic journal subscription fees, while Cornell axed many of its Elsevier subscriptions over a decade ago. For researchers outside the US’ and Western Europe’s richest institutions, routine piracy has long been the only way to conduct science, but increasingly the problem of unaffordable journals is coming closer to home.

… This was the experience of Elbakyan herself, who studied in Kazakhstan University and just like other students in countries where journal subscriptions are unaffordable for institutions, was forced to pirate research in order to complete her studies. Elbakyan told me, “Prices are very high, and that made it impossible to obtain papers by purchasing. You need to read many papers for research, and when each paper costs about 30 dollars, that is impossible.”
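
For readers who think in code, the retrieval flow Oxenham describes (serve a paper from the LibGen store when it is already there; dip into a publisher’s repository, and add the result to the store, only when it is not) is a cache-first lookup. Here is a minimal sketch in Python; the function names and the in-memory store are hypothetical stand-ins, not Sci-Hub’s actual code:

```python
# Toy sketch of a cache-first retrieval flow. The store and the fetch
# function are hypothetical stand-ins, not Sci-Hub's real implementation.
library = {}  # DOI -> PDF bytes; plays the role of the LibGen store


def fetch_from_publisher(doi: str) -> bytes:
    """Hypothetical fetch from a publisher's repository."""
    raise NotImplementedError


def get_paper(doi: str) -> bytes:
    # Serve from the library whenever possible; as the library grows,
    # this branch is taken ever more often ("we have almost everything!").
    if doi in library:
        return library[doi]
    pdf = fetch_from_publisher(doi)  # touch the publisher only on a miss
    library[doi] = pdf               # every miss permanently enlarges the library
    return pdf
```

The sketch makes Elbakyan’s point mechanical: every cache miss shrinks the probability of future misses, so publisher-facing traffic, and the risk of triggering alarm bells along with it, falls over time.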

Sci-Hub is not expected to win its case in the US, where one judge has already ordered a preliminary injunction making its former domain unavailable (Sci-Hub has since moved). Should you be sympathetic to Elsevier, you may want to take this into account (Note: Links have been removed),

Elsevier is the world’s largest academic publisher and by far the most controversial. Over 15,000 researchers have vowed to boycott the publisher for charging “exorbitantly high prices” and bundling expensive, unwanted journals with essential journals, a practice that allegedly is bankrupting university libraries. Elsevier also supports SOPA and PIPA, which the researchers claim threaten to restrict the free exchange of information. Elsevier is perhaps most notorious for delivering takedown notices to academics, demanding that they take their own research published with Elsevier off websites like Academia.edu.

The movement against Elsevier has only gathered speed over the course of the last year with the resignation of 31 editorial board members from the Elsevier journal Lingua, who left in protest to set up their own open-access journal, Glossa. Now the battleground has moved from the comparatively niche field of linguistics to the far larger field of cognitive sciences. Last month, a petition of over 1,500 cognitive science researchers called on the editors of the Elsevier journal Cognition to demand Elsevier offer “fair open access”. Elsevier currently charges researchers $2,150 per article if researchers wish their work published in Cognition to be accessible by the public, a sum far higher than the charges that led to the Lingua mutiny.

In her letter to Sweet [New York District Court Judge Robert W. Sweet], Elbakyan made a point that will likely come as a shock to many outside the academic community: Researchers and universities don’t earn a single penny from the fees charged by publishers [emphasis mine] such as Elsevier for accepting their work, while Elsevier has an annual income over a billion U.S. dollars.

As Masnick noted, much of this research is done on the public dime (i.e., funded by taxpayers). For her part, Elbakyan has written a letter defending her actions on ethical rather than legal grounds.

I recommend reading the Oxenham article as it provides details about how the site works and includes text from the letter Elbakyan wrote. For those who don’t have much time, Masnick’s post offers a good précis.

Sci-Hub suit as a distraction from the real issues?

Getting to Dupuis’ Feb. 22, 2016 posting and his perspective on the situation,

My take? Mostly that it’s a sideshow.

One aspect that I have ranted about on Twitter which I think is worth mentioning explicitly is that I think Elsevier and all the other big publishers are actually quite happy to feed the social media rage machine with these whack-a-mole controversies. The controversies act as a sideshow, distracting from the real issues and solutions that they would prefer all of us not to think about.

By whack-a-mole controversies I mean this recurring story of some person or company or group that wants to “free” scholarly articles and then gets sued or harassed by the big publishers or their proxies to force them to shut down. This provokes wide outrage and condemnation aimed at the publishers, especially Elsevier, for whom a special place in hell is reserved according to most advocates of openness (myself included).

In other words: Elsevier and its ilk are thrilled to be the target of all the outrage. Focusing on the whack-a-mole game distracts us from fixing the real problem: the entrenched systems of prestige, incentive and funding in academia. As long as researchers are channelled into “high impact” journals, as long as tenure committees reward publishing in closed rather than open venues, nothing will really change. Until funders get serious about mandating true open access publishing and are willing to put their money where their intentions are, nothing will change. Or at least, progress will be mostly limited to surface victories rather than systemic change.

I think Dupuis is referencing a conflict theory (I can’t remember what it’s called) which suggests that certain types of conflict help keep systems in place even while appearing to attack them. His point is well made, but I disagree somewhat in that I think these conflicts can also raise awareness and activate people who might otherwise ignore or mindlessly comply with those systems. So, if Elsevier and the other publishers are using these lawsuits as diversionary tactics, they may find they’ve made a strategic error.

ETA April 29, 2016: Sci-Hub does seem to move around, so I’ve updated the links to keep it accessible, but its situation can change at any moment.

Sir Mark Welland, nanoscientist, elected as master of St. Catharine’s College in Cambridge (UK)

I first tripped across Mark Welland’s work at the University of Cambridge in 2008 when I was working on my Nanotech Mysteries wiki, a project for my master’s degree. While I did not manage to speak to him directly, I did speak with his secretary and got permission to reproduce some images in the wiki. I have mentioned Welland and his work here from time to time; my April 30, 2010 posting (scroll down about 30% of the way) probably offers the best summary of the parts of his work I’ve stumbled across. There’s also a Cambridge video about nanotechnology featuring Stephen Fry as its host and, if memory serves, an interview with Welland.

Since those days, he has become Sir Mark Welland, and a Feb. 22, 2016 University of Cambridge press release announces the latest news,

The Fellows of St Catharine’s have elected Professor Sir Mark Welland as the next Master of the College.

Professor Sir Mark Welland is Professor of Nanotechnology and Head of Electrical Engineering at the University of Cambridge, where he has established the purpose-built Nanoscience Centre.

Sir Mark is currently researching into a broad range of both fundamental and applied problems. These include using nanotechnology to both understand and treat human diseases, biologically inspired nanomaterials for green technologies, and nanoelectronics for future generation energy transmission and sensing.

From April 2008 until May 2012, Sir Mark was Chief Scientific Adviser to the UK Government Ministry of Defence.

He was elected a Fellow of the Royal Society, a Fellow of the Royal Academy of Engineering, and a Fellow of the Institute of Physics in 2002, a Foreign Fellow of the National Academy of Sciences of India in 2008, and a Foreign Fellow of the Danish Academy of Sciences in 2010.

Sir Mark was awarded a Knighthood in the Queen’s Birthday Honours list in 2011.

Sir Mark brings to the College unrivalled national and international experience and expertise, as well as a thorough understanding of the University and the way it can engage with the wider world.

Sir Mark said: “I am in equal measures humbled and excited at being elected as Master and am looking forward to supporting the Fellows, students and staff of St Catharine’s over the next years.”

“I am honoured to be following Dame Jean, who has set a very high standard of leadership and intellectual rigour.”

Dame Jean said: “The Fellows have elected a distinguished scientist as the 39th Master to lead the College in the next phase of its 543-year history. Sir Mark will find a welcoming and flourishing community at St Catharine’s. I offer him my warmest congratulations on his election, I wish him well for the future, and I hope he will be as happy at St Catharine’s as I have been for almost ten years.”

A Feb. 22, 2016 news item in Cambridge News about Welland’s election as Master of St. Catharine’s College notes the age of the college,

Prof Sir Mark Welland, the university’s current head of electrical engineering, will take up the role in September, succeeding Prof Dame Jean Thomas, who will step down after nine years in charge.

He will be the 39th master of the college, which was founded in 1473 [emphasis mine] and has a population of nearly 800 current students and nearly 60 fellows.

Congratulations Sir Mark!

I note in passing that Canada will be celebrating its 150th anniversary as a country in 2017.

A demonstration of quantum surrealism

The Canadian Institute for Advanced Research (CIFAR) has announced some intriguing new research results. A Feb. 19, 2016 news item on ScienceDaily gets the ball rolling,

New research demonstrates that particles at the quantum level can in fact be seen as behaving something like billiard balls rolling along a table, and not merely as the probabilistic smears that the standard interpretation of quantum mechanics suggests. But there’s a catch — the tracks the particles follow do not always behave as one would expect from “realistic” trajectories, but often in a fashion that has been termed “surrealistic.”

A Feb. 19, 2016 CIFAR news release by Kurt Kleiner, which originated the news item, offers the kind of explanation that allows an amateur such as myself to understand the principles (at least while I’m reading it); thank you, Kurt Kleiner,

In a new version of an old experiment, CIFAR Senior Fellow Aephraim Steinberg (University of Toronto) and colleagues tracked the trajectories of photons as the particles traced a path through one of two slits and onto a screen. But the researchers went further, and observed the “nonlocal” influence of another photon that the first photon had been entangled with.

The results counter a long-standing criticism of an interpretation of quantum mechanics called the De Broglie-Bohm theory. Detractors of this interpretation had faulted it for failing to explain the behaviour of entangled photons realistically. For Steinberg, the results are important because they give us a way of visualizing quantum mechanics that’s just as valid as the standard interpretation, and perhaps more intuitive.

“I’m less interested in focusing on the philosophical question of what’s ‘really’ out there. I think the fruitful question is more down to earth. Rather than thinking about different metaphysical interpretations, I would phrase it in terms of having different pictures. Different pictures can be useful. They can help shape better intuitions.”

At stake is what is “really” happening at the quantum level. The uncertainty principle tells us that we can never know both a particle’s position and momentum with complete certainty. And when we do interact with a quantum system, for instance by measuring it, we disturb the system. So if we fire a photon at a screen and want to know where it will hit, we’ll never know for sure exactly where it will hit or what path it will take to get there.
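
For reference, the relation Kleiner is paraphrasing is the standard Heisenberg bound on the position and momentum uncertainties:

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```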

The standard interpretation of quantum mechanics holds that this uncertainty means that there is no “real” trajectory between the light source and the screen. The best we can do is to calculate a “wave function” that shows the odds of the photon being in any one place at any time, but won’t tell us where it is until we make a measurement.
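
In the double-slit setting this picture can be written down directly. If ψ₁ and ψ₂ are the wave function contributions from the two slits, the probability of a detection at position x on the screen is the squared magnitude of their sum, and the cross term is what produces the interference fringes:

```latex
P(x) \;=\; \lvert \psi_1(x) + \psi_2(x) \rvert^{2}
     \;=\; \lvert \psi_1(x) \rvert^{2} + \lvert \psi_2(x) \rvert^{2}
           + 2\,\operatorname{Re}\!\bigl[\psi_1^{*}(x)\,\psi_2(x)\bigr]
```

(This is also why, in the image caption further down, no interference appears on the screen: once a second photon records which slit was used, the cross term vanishes.)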

Another interpretation, called the De Broglie-Bohm theory, says that the photons do have real trajectories that are guided by a “pilot wave” that accompanies the particle. The wave is still probabilistic, but the particle takes a real trajectory from source to target. It doesn’t simply “collapse” into a particular location once it’s measured.
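
The “pilot wave” is more than a metaphor here: in the De Broglie-Bohm formulation, the particle’s velocity at every instant is fixed by the wave function through the guidance equation, a textbook result quoted below for orientation. Integrating this velocity field for a given wave function is what produces smooth, classical-looking trajectories:

```latex
\mathbf{v}(\mathbf{x},t) \;=\; \frac{\hbar}{m}\,
  \operatorname{Im}\!\left[\frac{\nabla \psi(\mathbf{x},t)}{\psi(\mathbf{x},t)}\right]
```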

In 2011 Steinberg and his colleagues showed that they could follow trajectories for photons by subjecting many identical particles to measurements so weak that the particles were barely disturbed, and then averaging out the information. This method showed trajectories that looked similar to classical ones — say, those of balls flying through the air.
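
The averaging step is the crux of that technique, and it is easy to illustrate numerically. The following toy model uses invented numbers, not the actual optics of the experiment: each individual weak readout is swamped by noise (which is what keeps the disturbance small), yet the ensemble average at each measurement plane recovers the mean path, with the error shrinking as 1/√N:

```python
# Toy model of trajectory reconstruction from weak measurements:
# per-shot readouts are dominated by noise, but averaging a large
# ensemble of identically prepared particles recovers the mean path.
import numpy as np

rng = np.random.default_rng(0)

n_photons = 250_000                   # identically prepared particles
planes = np.linspace(0.0, 1.0, 11)    # imaging planes along the propagation axis
true_mean_path = 0.5 * planes**2      # an assumed "true" mean transverse position

# A weak measurement trades precision for gentleness: the per-shot
# noise here is 20x larger than the entire signal range.
noise_sigma = 10.0
readouts = true_mean_path + rng.normal(0.0, noise_sigma,
                                       size=(n_photons, planes.size))

reconstructed = readouts.mean(axis=0)  # average plane by plane
print(np.round(reconstructed, 3))      # ~ true_mean_path, error ~ sigma/sqrt(N) = 0.02
```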

But critics had pointed out a problem with this viewpoint. Quantum mechanics also tells us that two particles can be entangled, so that a measurement of one particle affects the other. The critics complained that in some cases, a measurement of one particle would lead to an incorrect prediction of the trajectory of the entangled particle. They coined the term “surreal trajectories” to describe them.

In the most recent experiment, Steinberg and colleagues showed that the surrealism was a consequence of non-locality — the fact that the particles were able to influence one another instantaneously at a distance. In fact, the “incorrect” predictions of trajectories by the entangled photon were actually a consequence of where in their course the entangled particles were measured. Considering both particles together, the measurements made sense and were consistent with real trajectories.

Steinberg points out that both the standard interpretation of quantum mechanics and the De Broglie-Bohm interpretation are consistent with experimental evidence, and are mathematically equivalent. But it is helpful in some circumstances to visualize real trajectories, rather than wave function collapses, he says.

An image illustrating the work has been provided,

On the left, a still image from an animation of reconstructed trajectories for photons going through a double-slit. A second photon “measures” which slit each photon traversed, so no interference results on the screen. The image on the right shows the polarisation of this second, “probe.” Credit: Dylan Mahler Courtesy: CIFAR

Here’s a link to and a citation for the paper,

Experimental nonlocal and surreal Bohmian trajectories by Dylan H. Mahler, Lee Rozema, Kent Fisher, Lydia Vermeyden, Kevin J. Resch, Howard M. Wiseman, and Aephraim Steinberg. Science Advances 19 Feb 2016: Vol. 2, no. 2, e1501466 DOI: 10.1126/sciadv.1501466

This article is open access (Science Advances is an open-access journal).