Tag Archives: University of Edinburgh

Water’s liquid-vapour interface

The UK’s National Physical Laboratory (NPL), along with IBM and the University of Edinburgh, has developed a new quantum model for understanding water’s liquid-vapour interface, according to an April 20, 2015 news item on Nanowerk,

The National Physical Laboratory (NPL), the UK’s National Measurement Institute in collaboration with IBM and the University of Edinburgh, has used a new quantum model to reveal the molecular structure of water’s liquid surface.

The liquid-vapour interface of water is one of the most common of all heterogeneous (or non-uniform) environments. Understanding its molecular structure will provide insight into complex biochemical interactions underpinning many biological processes. But experimental measurements of the molecular structure of water’s surface are challenging, and currently competing models predict various different arrangements.

An April 20, 2015 NPL press release on EurekAlert, which originated the news item, describes the model and research in more detail,

The model is based on a single charged particle, the quantum Drude oscillator (QDO), which mimics the way the electrons of a real water molecule fluctuate and respond to their environment. This simplified representation retains interactions not normally accessible in classical models and accurately captures the properties of liquid water.
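For readers who want a little more detail, the standard textbook form of a quantum Drude oscillator is a light particle of charge −q and mass m bound harmonically (frequency ω) to a pseudo-nucleus of charge +q. This is offered as an illustrative sketch of the general QDO idea, not necessarily the exact parametrization used in the NPL/IBM/Edinburgh water model:

$$
H_{\mathrm{QDO}} = \frac{\mathbf{p}^{2}}{2m} + \tfrac{1}{2}\, m\,\omega^{2}\,\lvert \mathbf{r}-\mathbf{R} \rvert^{2},
\qquad
\alpha_{1} = \frac{q^{2}}{m\,\omega^{2}},
\qquad
C_{6} \approx \tfrac{3}{4}\,\hbar\,\omega\,\alpha_{1}^{2},
$$

where r is the position of the oscillating charge, R the centre it is tethered to, α₁ the resulting dipole polarizability, and C₆ the leading London dispersion coefficient between two identical oscillators. Because the polarization and dispersion responses all follow from just three parameters (m, ω, q), the model stays simple while capturing interactions that classical fixed-charge water models miss.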

In new research, published in a featured article in the journal Physical Chemistry Chemical Physics, the team used the QDO model to determine the molecular structure of water’s liquid surface. The results provide new insight into the hydrogen-bonding topology at the interface, which is responsible for the unusually high surface tension of water.

This is the first time the QDO model of water has been applied to the liquid-vapour interface. The results enabled the researchers to identify the intrinsic asymmetry of hydrogen bonds as the mechanism responsible for the surface’s molecular orientation. The model was also capable of predicting the temperature dependence of the surface tension with remarkable accuracy – to within 1 % of experimental values.

Coupled with earlier work on bulk water, this result demonstrates the exceptional transferability of the QDO approach and offers a promising new platform for molecular exploration of condensed matter.

Here’s a link to and a citation for the paper,

Hydrogen bonding and molecular orientation at the liquid–vapour interface of water by Flaviu S. Cipcigan, Vlad P. Sokhan, Andrew P. Jones, Jason Crain and Glenn J. Martyna. Phys. Chem. Chem. Phys., 2015, 17, 8660–8669. DOI: 10.1039/C4CP05506C. First published online 17 Feb 2015

The paper is open access, although you do need to register on the site unless you have some other means of accessing it.

Biosensing devices from Scotland

The timing of Deborah Rowe’s article in the Guardian newspaper is fascinating. Rowe is writing about nanoscale biosensors developed at the University of Edinburgh, research published in Dec. 2013, while her piece, published Sept. 9, 2014, appears less than 10 days before Scotland’s vote (Sept. 18, 2014) on whether it should be independent. Also interesting: the published paper is available as open access until the end of Sept. 2014, which seems like a strategically chosen window.

That said, this is an exciting piece of research if you’re particularly interested in biosensors and ways to produce them more cheaply and at a higher volume (from Rowe’s Sept. 9, 2014 article),

An interdisciplinary research team from the Schools of Engineering and Chemistry at the University of Edinburgh (in association with Nanoflex Ltd), has overcome some of the constraints associated with conventional nano-scale electrode arrays, to develop the first precision-engineered nanoelectrode array system with the promise of high-volume and low-cost.

Such miniaturised electrode arrays have the potential to provide a faster and more sensitive response to, for example, biomolecules than current biosensors. This would make them invaluable components in the increasingly sensitive devices being developed for biomedical sensing and electrochemical applications.

Rowe goes on to describe the researchers’ Microsquare Nanoband Edge Electrode (MNEE) array technology in lucid and brief detail. For those who want more, here’s a link to and a citation for the paper,

Nanoscale electrode arrays produced with microscale lithographic techniques for use in biomedical sensing applications by Jonathan G. Terry, Ilka Schmüser, Ian Underwood, Damion K. Corrigan, Neville J. Freeman, Andrew S. Bunting, Andrew R. Mount, Anthony J. Walton. IET Nanobiotechnology, Volume 7, Issue 4, December 2013, pp. 125–134. DOI: 10.1049/iet-nbt.2013.0049. Print ISSN 1751-8741, Online ISSN 1751-875X. Published Oct. 29, 2013

Given the timing of the Guardian article and the availability of the paper for free access, I was moved to find information about the funding agencies, from the researchers’ IET paper,

Support from the Scottish Funding Council (SFC) is acknowledged through the Edinburgh Research Partnership in engineering and mathematics (ERPem) and the Edinburgh and St Andrews Chemistry (EaStCHEM) initiatives, along with knowledge transfer funding. Support from the Engineering and Physical Sciences Research Council (EPSRC) of the UK through the IeMRC (Smart Microsystems – FS/01/02/10) Grant is acknowledged. Ilka Schmüser thanks the EPSRC and the University of Edinburgh for financial support.

And, there was this from Rowe’s article,

The work is part of a larger R&D programme on the development of smart sensors at the University of Edinburgh. It involves staff and students from the Schools of Engineering and Chemistry thus providing the required broad set of skills and experience. The resulting MNEE technology is currently being commercialised by Nanoflex Ltd.

So, the funding comes from Scottish and UK sources and the company which is commercializing the MNEE is located in the North West of England on the Sci-Tech Daresbury Campus (from the company’s LinkedIn page). This certainly illustrates how entwined the Scottish and UK science scenes are, as is the commercialization process.

I last mentioned Scotland, science, and the independence vote in a July 8, 2014 posting which covers some of the ‘pro’ and ‘con’ thinking at the time.

Peter Higgs and François Englert to receive 2013 Nobel Prize in Physics and TRIUMF name changes?

After all the foofaraw about finding/confirming the existence of the Higgs boson or ‘god’ particle (featured in my July 4, 2012 posting amongst many others), the Royal Swedish Academy of Sciences has decided to award the 2013 Nobel Prize in Physics to two of the individuals responsible for much of the current thinking about subatomic particles and mass (from the Oct. 8, 2013 news item on ScienceDaily),

The Royal Swedish Academy of Sciences has decided to award the Nobel Prize in Physics for 2013 to François Englert of Université Libre de Bruxelles, Brussels, Belgium, and Peter W. Higgs of the University of Edinburgh, UK, “for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles, and which recently was confirmed through the discovery of the predicted fundamental particle, by the ATLAS and CMS experiments at CERN’s Large Hadron Collider.”

François Englert and Peter W. Higgs are jointly awarded the Nobel Prize in Physics 2013 for the theory of how particles acquire mass. In 1964, they proposed the theory independently of each other (Englert together with his now deceased colleague Robert Brout). In 2012, their ideas were confirmed by the discovery of a so called Higgs particle at the CERN laboratory outside Geneva in Switzerland.

TRIUMF, sometimes known as Canada’s national laboratory for particle and nuclear physics, has issued an Oct. 8, 2013 news release,

HIGGS, ENGLERT SHARE 2013 NOBEL PRIZE IN PHYSICS

Canadians Key Part of Historical Nobel Prize to “Godfathers” of the “God Particle”

(Vancouver, BC) — The Royal Swedish Academy of Sciences today awarded the Nobel Prize in physics to Professor Peter W. Higgs (Univ. of Edinburgh) and Professor François Englert (Univ. Libre de Bruxelles) to recognize their work developing the theory of what is now known as the Higgs field, which gives elementary particles mass.  Canadians have played critical roles in all stages of the breakthrough discovery of the Higgs boson particle that validates the original theoretical framework.  Throngs across Canada are celebrating.

More than 150 Canadian scientists and students at 10 different institutions are presently involved in the global ATLAS experiment at CERN.  Canada’s national laboratory for particle and nuclear physics, TRIUMF, has been a focal point for much of the Canadian involvement that has ranged from assisting with the construction of the LHC accelerator to building key elements of the ATLAS detector and hosting one of the ten global Tier-1 Data Centres that stores and processes the physics for the team of thousands.

“The observation of a Higgs Boson at about 125 GeV, or 130 times the mass of the proton, by both the ATLAS and CMS groups is a tremendous achievement,” said Rob McPherson, spokesperson of the ATLAS Canada collaboration, a professor of physics at the University of Victoria and Institute of Particle Physics scientist. “Its existence was predicted in 1964 when theorists reconciled how massive particles came into being.  It took almost half a century to confirm the detailed predictions of the theories in a succession of experiments, and finally to discover the Higgs Boson itself using our 2012 data.”

The Brout-Englert-Higgs (BEH) mechanism was first proposed in 1964 in two papers published independently, the first by Belgian physicists Robert Brout and François Englert, and the second by British physicist Peter Higgs. It explains how the force responsible for beta decay is much weaker than electromagnetism, but is better known as the mechanism that endows fundamental particles with mass. A third paper, published by Americans Gerald Guralnik and Carl Hagen with their British colleague Tom Kibble, further contributed to the development of the new idea, which now forms an essential part of the Standard Model of particle physics. As was pointed out by Higgs, a key prediction of the idea is the existence of a massive boson of a new type, which was discovered by the ATLAS and CMS experiments at CERN in 2012.

The next step will be to determine the precise nature of the Higgs particle and its significance for our understanding of the universe. Are its properties as expected for the Higgs boson predicted by the Standard Model of particle physics? Or is it something more exotic? The Standard Model describes the fundamental particles from which we, and every visible thing in the universe, are made, and the forces acting between them. All the matter that we can see, however, appears to be no more than about 4% of the total. A more exotic version of the Higgs particle could be a bridge to understanding the 96% of the universe that remains obscure.

TRIUMF salutes Peter Higgs and François Englert for their groundbreaking work recognized by today’s Nobel Prize and congratulates the international team of tens of thousands of scientists, engineers, students, and many more from around the world who helped make the discovery.

For spokespeople at the major Canadian universities involved in the Higgs discovery, please see the list below:

CANADIAN CONTACTS

U of Alberta: Doug Gingrich, gingrich@ualberta.ca, 780-492-9501
UBC:  Colin Gay, cgay@physics.ubc.ca, 604-822-2753
Carleton U: Gerald Oakham (& TRIUMF), oakham@physics.carleton.ca, 613-520-7539
McGill U: Brigitte Vachon (also able to interview in French), vachon@physics.mcgill.ca, 514-398-6478
U of Montreal: Claude Leroy (also able to interview in French), leroy@lps.umontreal.ca, 514-343-6722
Simon Fraser U: Mike Vetterli (& TRIUMF, also able to interview in French), vetm@triumf.ca, 778-782-5488
TRIUMF: Isabel Trigger (also able to interview in French), itrigger@triumf.ca, 604-222-7651
U of Toronto: Robert Orr, orr@physics.utoronto.ca, 416-978-6029
U of Victoria: Rob McPherson, rmcphers@triumf.ca, 604-222-7654
York U: Wendy Taylor, taylorw@yorku.ca, 416-736-2100 ext 77758

While I know Canadians have been part of the multi-year, multi-country effort to determine the existence or non-existence of the Higgs Boson and much more in the field of particle physics, I would prefer we were not described as “… Key Part of Historical Nobel Prize … .” The question that springs to mind is: how were Canadian efforts key to this work? The answer is not revealed in the news release, which suggests that the claim may be a little overstated. On the other hand, I do like the bit about ‘saluting Higgs and Englert for their groundbreaking work’.

As for TRIUMF and what appears to be a series of name changes, I’m left somewhat puzzled. This Oct. 8, 2013 news release bears the name (or perhaps it’s a motto or tagline of some sort?): TRIUMF — Accelerating Science for Canada; meanwhile, the website still sports this: TRIUMF, Canada’s national laboratory for particle and nuclear physics, while a July 17, 2013 TRIUMF news release gloried in this name: TRIUMF Accelerators, Inc. (noted in my July 18, 2013 posting). Perhaps TRIUMF is trying to follow in CERN’s footsteps. CERN was once known as the ‘European particle physics laboratory’ but is now known as the European Organization for Nuclear Research and seems to also have the tagline ‘Accelerating science’.

Phytoremediation, clearing pollutants from industrial lands, could also be called phyto-mining

The University of Edinburgh (along with the Universities of Warwick and Birmingham, Newcastle University and Cranfield University), according to its Mar. 4, 2013 news release on EurekAlert, is involved in a phytoremediation project,

Common garden plants are to be used to clean polluted land, with the extracted poisons being used to produce car parts and aid medical research.

Scientists will use plants such as alyssum, pteridaceae and a type of mustard called sinapi to soak up metals from land previously occupied by factories, mines and landfill sites.

Dangerous levels of metals such as arsenic and platinum, which can lurk in the ground and can cause harm to people and animals, will be extracted using a natural process known as phytoremediation.

A Mar. 4, 2013 news item on the BBC News Edinburgh, Fife and East Scotland site offers more details about the project and the technology,

A team of researchers from the Universities of Edinburgh, Warwick, Birmingham, Newcastle and Cranfield has developed a way of extracting the chemicals through a process called phytoremediation, and are testing its effectiveness.

Once the plants have drawn contaminated material out of the soil, they will be harvested and processed in a bio-refinery.

A specially designed bacteria will be added to the waste to transform the toxic metal ions into metallic nanoparticles.

The team said these tiny particles could then be used to develop cancer treatments, and could also be used to make catalytic converters for cars.

Dr Louise Horsfall, of Edinburgh University’s school of biological sciences, said: “Land is a finite resource. As the world’s population grows along with the associated demand for food and shelter, we believe that it is worth decontaminating land to unlock vast areas for better food security and housing.

“I hope to use synthetic biology to enable bacteria to produce high value nanoparticles and thereby help make land decontamination financially viable.”

The research team said the land where phytoremediation was used would also be cleared of chemicals, meaning it could be reused for new building projects.

In my Sept. 28, 2012 posting I featured an international collaboration between universities in the UK, US, Canada, and New Zealand in a ‘phyto-mining’ project bearing some resemblance to this newly announced project. In that project, announced in Fall 2012, scientists were studying how they might recover platinum for reuse from plants grown near mine tailings.

I do have one other posting about phytoremediation. I featured a previously published piece by Joe Martin in a two-part series on the topic of plant (phyto) and nano soil remediation. The March 30, 2012 posting is part one, which focuses on the role of plants in soil remediation.

Phyto-mining and environmental remediation flower in the United Kingdom

Researchers on a £3 million research programme called “Cleaning Land for Wealth” (CL4W) are confident they’ll be able to use flowers and plants to clean soil of poisonous materials (environmental remediation) and to recover platinum (phyto-mining). From the Nov. 21, 2012 news item on Nanowerk,

A consortium of researchers led by WMG (Warwick Manufacturing Group) at the University of Warwick are to embark on a £3 million research programme called “Cleaning Land for Wealth” (CL4W), that will use a common class of flower to restore poisoned soils while at the same time producing perfectly sized and shaped nano sized platinum and arsenic nanoparticles for use in catalytic convertors, cancer treatments and a range of other applications.

The Nov. 20, 2012 University of Warwick news release, which originated the news item, describes both how CL4W came together and how it produced an unintended project benefit,

A “Sandpit” exercise organised by the Engineering and Physical Sciences Research Council (EPSRC) allowed researchers from WMG (Warwick Manufacturing group) at the University of Warwick, Newcastle University, The University of Birmingham, Cranfield University and the University of Edinburgh to come together and share technologies and skills to come up with an innovative multidisciplinary research project that could help solve major technological and environmental challenges.

The researchers pooled their knowledge of how to use plants and bacteria to soak up particular elements and chemicals and how to subsequently harvest, process and collect that material. They have devised an approach to demonstrate the feasibility in which they are confident that they can use common classes of flower and plants (such as Alyssum), to remove poisonous chemicals such as arsenic and platinum from polluted land and water courses potentially allowing that land to be reclaimed and reused.

That in itself would be a significant achievement, but as the sandpit progressed the researchers found that jointly they had the knowledge to achieve much more than just cleaning up the land.

As lead researcher on the project Professor Kerry Kirwan from WMG at the University of Warwick explained:

“The processes we are developing will not only remove poisons such as arsenic and platinum from contaminated land and water courses, we are also confident that we can develop suitable biology and biorefining processes (or biofactories as we are calling them) that can tailor the shapes and sizes of the metallic nanoparticles they will make. This would give manufacturers of catalytic convertors, developers of cancer treatments and other applicable technologies exactly the right shape, size and functionality they need without subsequent refinement. We are also expecting to recover other high value materials such as fine chemicals, pharmaceuticals, anti-oxidants etc. from the crops during the same biorefining process.”

I last mentioned phyto-mining in my Sept. 26, 2012 post with regard to an international project being led by researchers at the University of York (UK). The biorefining processes (biofactories) mentioned by Kirwan take the idea of recovering platinum, etc. one step beyond phyto-mining recovery.

Here’s a picture of the flower (Alyssum) mentioned in the news release,

Alyssum montanum, Unterfranken, Germany, 1988 (image and caption from Wikipedia: http://en.wikipedia.org/wiki/Alyssum)

From the Wikipedia essay (Note: I have removed links),

Alyssum is a genus of about 100–170 species of flowering plants in the family Brassicaceae, native to Europe, Asia, and northern Africa, with the highest species diversity in the Mediterranean region. The genus comprises annual and perennial herbaceous plants or (rarely) small shrubs, growing to 10–100 cm tall, with oblong-oval leaves and yellow or white flowers (pink to purple in a few species).

Health science writing? Australian writer accuses gym equipment of killing you through nanotechnology

Toby McCasker’s Sept. 30, 2012 article for news.com.au is one of the more peculiar pieces I’ve seen about nanotechnology and its dangers. From the article,

Is gym equipment killing you?

THE nanofibres that make up sports and gym equipment just might be doing you more harm than good.

McCasker then blesses us with this wonderful, wonderful passage where he explains his concern,

Why is this (maybe) bad? Nanotechnology sounds awesome, after all. Very cyberpunk. Inject them into your dude piston and become a thrumming love-machine, all that. [emphases mine] They’re maybe bad because researchers from the University of Edinburgh in the UK have just discovered that some nanofibres bear a resemblance to asbestos fibres, which can cause lung cancer.

You can’t inject nanotechnology. Since it’s a field of study, that would be the equivalent of injecting biology or quantum mechanics.

As for nanotechnology being cyberpunk, here’s how cyberpunk is defined in The Free Dictionary,

Noun   1.         cyberpunk – a programmer who breaks into computer systems in order to steal or change or destroy information as a form of cyber-terrorism

cyber-terrorist, hacker

act of terrorism, terrorism, terrorist act – the calculated use of violence (or the threat of violence) against civilians in order to attain goals that are political or religious or ideological in nature; this is done through intimidation or coercion or instilling fear

coder, computer programmer, programmer, software engineer – a person who designs and writes and tests computer programs

terrorist – a radical who employs terror as a political weapon; usually organizes with other terrorists in small cells; often uses religion as a cover for terrorist activities

2.         cyberpunk – a writer of science fiction set in a lawless subculture of an oppressive society dominated by computer technology

author, writer – writes (books or stories or articles or the like) professionally (for pay)

3.         cyberpunk – a genre of fast-paced science fiction involving oppressive futuristic computerized societies

science fiction – literary fantasy involving the imagined impact of science on society

The closest definition that fits McCasker’s usage is this description (the passage by Lawrence Person) of cyberpunk, a post-modern science fiction genre, in Wikipedia,

Cyberpunk plots often center on a conflict among hackers, artificial intelligences, and megacorporations, and tend to be set in a near-future Earth, rather than the far-future settings or galactic vistas found in novels such as Isaac Asimov’s Foundation or Frank Herbert’s Dune. The settings are usually post-industrial dystopias but tend to be marked by extraordinary cultural ferment and the use of technology in ways never anticipated by its creators (“the street finds its own uses for things”). Much of the genre’s atmosphere echoes film noir, and written works in the genre often use techniques from detective fiction.

“Classic cyberpunk characters were marginalized, alienated loners who lived on the edge of society in generally dystopic futures where daily life was impacted by rapid technological change, an ubiquitous datasphere of computerized information, and invasive modification of the human body.” – Lawrence Person

It’s the part about “invasive modification of the human body” which seems closest to McCasker’s “inject them into your dude piston” (dude piston is my new favourite phrase).

As for the reference to nanofibres, McCasker is correct. There are carbon nanotubes that resemble asbestos fibres and there is concern for anyone who may inhale them. As far as I know, the people at greatest risk would be workers who are exposed to the carbon nanotubes directly. I have not heard of anyone getting sick because of their golf clubs, in which carbon nanotubes are often used to make them lighter and stronger.

The research at the University of Edinburgh that McCasker cites (mentioned in my Aug. 22, 2012 posting) is important because it adds to a body of substantive research on carbon nanotubes, asbestos, and the possibility of mesothelioma; it makes no mention of gym equipment.

It’s the length, not the size that matters with nanofibres such as carbon nanotubes

The Aug. 22, 2012 news item on Nanowerk (by way of Feedzilla) features some research at the University of Edinburgh which determined that short nanofibres do not have the same effect on lung cells as longer fibres do. From the news item, here’s a description of why this research was undertaken,

Nanofibres, which can be made from a range of materials including carbon, are about 1,000 times smaller than the width of a human hair and can reach the lung cavity when inhaled.

This may lead to a cancer known as mesothelioma, which is known to be caused by breathing in asbestos fibres, which are similar to nanofibres.

I wrote about research at Brown University which explained why some fibres get stuck in lung cells in a Sept. 22, 2011 posting titled, Why asbestos and carbon nanotubes are so dangerous to cells. The short answer is: if the tip is rounded, the cell mistakes the fibre for a sphere and, in error, it attempts to absorb it. Here’s some speculation on my part about what the results might mean (from my Sept. 22, 2011 posting),

The whole thing has me wondering about long vs. short carbon nanotubes. Does this mean that short carbon nanotubes can be ingested successfully? If so, at what point does short become too long to ingest?

The University of Edinburgh Aug. 22, 2012 news release provides an answer to last year’s speculation about length,

The University study found that lung cells were not affected by short fibres that were less than five-thousandths of a millimetre long.

However, longer fibres can reach the lung cavity, where they become stuck and cause disease.

We knew that long fibres, compared with shorter fibres, could cause tumours but until now we did not know the cut-off length at which this happened. Knowing the length beyond which the tiny fibres can cause disease is important in ensuring that safe fibres are made in the future as well as helping to understand the current risk from asbestos and other fibres, [said] Ken Donaldson, Professor of Respiratory Toxicology.

Sometimes, I surprise myself. I think I’ll take a moment to bask. … Done now!
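To put that cut-off length into the units usually used for nanofibres (my arithmetic, not the university’s):

$$
5 \times 10^{-3}\ \mathrm{mm} = 5\ \mu\mathrm{m} = 5{,}000\ \mathrm{nm},
$$

so the ‘safe’ short fibres are still a few thousand nanometres long. The point, as I understand it, is the fibre’s length, which determines whether the lung’s immune cells can engulf and clear it, rather than its nanoscale diameter.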

Here’s my final thought: while this research suggests short nanofibres won’t cause mesothelioma, it doesn’t rule out other potential problems. So, let’s celebrate this new finding and then get back to investigating nanofibres and their impact on health.

Knotty molecules

I couldn’t resist the wordplay (knotty/naughty) when I saw the Nov. 7, 2011 news item on Nanowerk titled, Tying molecules in knots. From the news item,

A research team headed by Professor David Leigh of the University of Edinburgh (UK) and Academy Professor Kari Rissanen of the University of Jyväskylä (Finland) have made the most complex molecular knot to date, as reported in Nature Chemistry (“A synthetic molecular pentafoil knot”).

However, deliberately tying molecules into well-defined knots so that these effects can be studied is extremely difficult. Up to now, only the simplest type of knot – a trefoil knot – had been prepared by scientists. Now Professor David Leigh’s team (www.catenane.net) at the University of Edinburgh together with Academy Professor Kari Rissanen at the University of Jyväskylä have succeeded in preparing and characterizing a more complex type of knot – a pentafoil knot (also known as a cinquefoil knot or a Solomon’s seal knot) – a knot which looks like a five-pointed star.

Remarkably, the thread that is tied into the star-shaped knot is just 160 atoms in length – that is about 16 nanometers long (one nanometer is one millionth of a millimeter).
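As a quick back-of-the-envelope check of that figure (my arithmetic, using a typical interatomic spacing):

$$
\frac{16\ \mathrm{nm}}{160\ \mathrm{atoms}} = 0.1\ \mathrm{nm\ per\ atom},
$$

which is the same order as the roughly 0.1–0.15 nm bond lengths between neighbouring atoms in an organic chain, so the two numbers in the news item are consistent.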

Will this repopularize macramé (making textiles by knotting fibres)?

Cavandoli macramé (image: Keith Russell)

I found the image in the Macramé essay on Wikipedia; Cavandoli is a form of Italian macramé.

Measuring professional and national scientific achievements; Canadian science policy conferences

I’m going to start with an excellent study about publication bias in science papers and careerism that I stumbled across this morning on physorg.com (from the news item),

Dr [Daniele] Fanelli [University of Edinburgh] analysed over 1300 papers that declared to have tested a hypothesis in all disciplines, from physics to sociology, the principal author of which was based in a U.S. state. Using data from the National Science Foundation, he then verified whether the papers’ conclusions were linked to the states’ productivity, measured by the number of papers published on average by each academic.

Findings show that papers whose authors were based in more “productive” states were more likely to support the tested hypothesis, independent of discipline and funding availability. This suggests that scientists working in more competitive and productive environments are more likely to make their results look “positive”. It remains to be established whether they do this by simply writing the papers differently or by tweaking and selecting their data.

I was happy to find out that Fanelli’s paper has been published in PLoS [Public Library of Science] ONE, an open access journal. From the paper [numbers in square brackets are citations found at the end of the published paper],

Quantitative studies have repeatedly shown that financial interests can influence the outcome of biomedical research [27], [28] but they appear to have neglected the much more widespread conflict of interest created by scientists’ need to publish. Yet, fears that the professionalization of research might compromise its objectivity and integrity had been expressed already in the 19th century [29]. Since then, the competitiveness and precariousness of scientific careers have increased [30], and evidence that this might encourage misconduct has accumulated. Scientists in focus groups suggested that the need to compete in academia is a threat to scientific integrity [1], and those guilty of scientific misconduct often invoke excessive pressures to produce as a partial justification for their actions [31]. Surveys suggest that competitive research environments decrease the likelihood to follow scientific ideals [32] and increase the likelihood to witness scientific misconduct [33] (but see [34]). However, no direct, quantitative study has verified the connection between pressures to publish and bias in the scientific literature, so the existence and gravity of the problem are still a matter of speculation and debate [35].

Fanelli goes on to describe his research methods and how he came to his conclusion that the pressure to publish may have a significant impact on ‘scientific objectivity’.
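For readers who want a concrete picture of the kind of analysis being described, here is a minimal sketch in Python of how one might test whether a paper’s odds of reporting a ‘positive’ result rise with the publication productivity of its first author’s state. The data and column names are invented for illustration; this is not Fanelli’s actual code or dataset.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical toy data: one row per paper (invented, purely illustrative).
# 'positive'     -> 1 if the paper supported its tested hypothesis, else 0
# 'productivity' -> average papers per academic in the first author's U.S. state
papers = pd.DataFrame({
    "positive":     [1, 0, 1, 0, 1, 1, 0, 1, 0, 1],
    "productivity": [4.2, 2.1, 5.0, 3.8, 1.9, 4.7, 2.4, 5.3, 4.4, 2.6],
})

# Logistic regression: do papers from more "productive" states have higher
# odds of reporting a supportive conclusion? (The published study also
# controlled for discipline and funding availability.)
model = smf.logit("positive ~ productivity", data=papers).fit()
print(model.summary())
```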

This paper provides an interesting counterpoint to a discussion about science metrics or bibliometrics taking place on (the journal) Nature’s website here. It was stimulated by Julia Lane’s recent article titled, Let’s Make Science Metrics More Scientific. The article is open access and comments are invited. From the article [numbers in square brackets refer to citations found at the end of the article],

Measuring and assessing academic performance is now a fact of scientific life. Decisions ranging from tenure to the ranking and funding of universities depend on metrics. Yet current systems of measurement are inadequate. Widely used metrics, from the newly-fashionable Hirsch index to the 50-year-old citation index, are of limited use [1]. Their well-known flaws include favouring older researchers, capturing few aspects of scientists’ jobs and lumping together verified and discredited science. Many funding agencies use these metrics to evaluate institutional performance, compounding the problems [2]. Existing metrics do not capture the full range of activities that support and transmit scientific ideas, which can be as varied as mentoring, blogging or creating industrial prototypes.
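Since the Hirsch index comes up in that passage, here is the standard definition worked through in a few lines of Python; a small illustration of how blunt a single number can be, not anyone’s official implementation.

```python
def h_index(citation_counts):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Two very different publication records, same score:
print(h_index([50, 40, 30, 3, 2, 1]))  # 3
print(h_index([4, 4, 3, 3, 3, 3]))     # 3
```

Both records score h = 3 even though one of them includes three heavily cited papers, which is one concrete version of the ‘limited use’ complaint made in the article.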

The range of comments is quite interesting; I was particularly taken by something Martin Fenner said,

Science metrics are not only important for evaluating scientific output, they are also great discovery tools, and this may indeed be their more important use. Traditional ways of discovering science (e.g. keyword searches in bibliographic databases) are increasingly superseded by non-traditional approaches that use social networking tools for awareness, evaluations and popularity measurements of research findings.

(Fenner’s blog along with more of his comments about science metrics can be found here. If this link doesn’t work, you can get to Fenner’s blog by going to Lane’s Nature article and finding him in the comments section.)

There are a number of issues here: how do we measure science work (citations in other papers?), how do we define the impact of science work (do we use social networks?), and, following from that, how do we measure impact when we’re talking about a social network?

Now, I’m going to add timeline as an issue. Over what period of time are we measuring the impact? I ask the question because of the memristor story. Dr. Leon Chua wrote a paper in 1971 that, apparently, didn’t receive all that much attention at the time but was cited in a 2008 paper which received widespread attention. Meanwhile, Chua had continued to theorize about memristors in a 2003 paper that received so little attention that Chua abandoned plans to write part 2. Since the recent burst of renewed interest in the memristor and his 2003 paper, Chua has decided to follow up with part 2, hopefully some time in 2011 (as per this April 13, 2010 posting). There’s one more piece to the puzzle: an earlier paper by F. Argall. From Blaise Mouttet’s April 5, 2010 comment here on this blog,

In addition HP’s papers have ignored some basic research in TiO2 multi-state resistance switching from the 1960’s which disclose identical results. See F. Argall, “Switching Phenomena in Titanium Oxide thin Films,” Solid State Electronics, 1968.
http://pdf.com.ru/a/ky1300.pdf

[ETA: April 22, 2010: Blaise Mouttet has provided a link to an article which provides more historical insight into the memristor story: http://knol.google.com/k/memistors-memristors-and-the-rise-of-strong-artificial-intelligence#]

How do you measure or even track all of that, short of some science writer taking the time to pursue the story and write a nonfiction book about it?

I’m not counselling that the process be abandoned, but since it seems that people are revisiting the issues, it’s an opportune time to get all the questions on the table.

As for its importance, this process of trying to establish better and new science metrics may seem irrelevant to most people, but it has a much larger impact than even the participants appear to realize. Governments measure their scientific progress by touting the number of papers their scientists have produced, amongst other measures such as patents. Measuring the number of published papers has an impact on how governments want to be perceived internationally and within their own borders. Take, for example, something which has both international and national impact: the recent US National Nanotechnology Initiative (NNI) report to the President’s Council of Advisors on Science and Technology (PCAST). The NNI used the number of papers published as a way of measuring the US’s possibly eroding leadership in the field. (China published about 5000 while the US published about 3000.)

I don’t have much more to say other than I hope to see some new metrics.

Canadian science policy conferences

We have two such conferences and both are two years old in 2010. The first one is being held in Gatineau, Québec, May 12 – 14, 2010. Called Public Science in Canada: Strengthening Science and Policy to Protect Canadians [ed. note: protecting us from what?], the conference seems to target government employees. David Suzuki (TV host, scientist, environmentalist, author, etc.) and Preston Manning (ex-politico) will be co-presenting a keynote address titled: Speaking Science to Power.

The second conference takes place in Montréal, Québec, Oct. 20-22, 2010. It’s being produced by the Canadian Science Policy Centre. Other than a notice on the home page, there’s not much information about their upcoming conference yet.

I did note that Adam Holbrook (aka J. Adam Holbrook) is both speaking at the May conference and serving on the advisory committee for the folks who are organizing the October conference. At the May conference, he will be participating in a session titled: Fostering innovation: the role of public S&T. Holbrook is a local (to me) professor, as he works at Simon Fraser University, Vancouver, Canada.

That’s all for today.