Category Archives: science

The origins of gold and other precious metals

The link between this research and my side project on gold nanoparticles is a bit tenuous, but this work on the origins of gold and other precious metals found in the stars is so fascinating that I’m determined to find a connection.

An artist’s impression of two neutron stars colliding. (Credit: Dana Berry / Skyworks Digital, Inc.) Courtesy: Kavli Foundation

From a May 19, 2016 news item on phys.org,

The origin of many of the most precious elements on the periodic table, such as gold, silver and platinum, has perplexed scientists for more than six decades. Now a recent study has an answer, evocatively conveyed in the faint starlight from a distant dwarf galaxy.

In a roundtable discussion, published today [May 19, 2016?], The Kavli Foundation spoke to two of the researchers behind the discovery about why the source of these heavy elements, collectively called “r-process” elements, has been so hard to crack.

From the Spring 2016 Kavli Foundation webpage hosting the “Galactic ‘Gold Mine’ Explains the Origin of Nature’s Heaviest Elements” Roundtable,

RESEARCHERS HAVE SOLVED a 60-year-old mystery regarding the origin of the heaviest elements in nature, conveyed in the faint starlight from a distant dwarf galaxy.

Most of the chemical elements, composing everything from planets to paramecia, are forged by the nuclear furnaces in stars like the Sun. But the cosmic wellspring for a certain set of heavy, often valuable elements like gold, silver, lead and uranium, has long evaded scientists.

Astronomers studying a galaxy called Reticulum II have just discovered that its stars contain whopping amounts of these metals—collectively known as “r-process” elements (See “What is the R-Process?”). Of the 10 dwarf galaxies that have been similarly studied so far, only Reticulum II bears such strong chemical signatures. The finding suggests some unusual event took place billions of years ago that created ample amounts of heavy elements and then strew them throughout the galaxy’s reservoir of gas and dust. This r-process-enriched material then went on to form Reticulum II’s standout stars.

Based on the new study, from a team of researchers at the Kavli Institute at the Massachusetts Institute of Technology, the unusual event in Reticulum II was likely the collision of two, ultra-dense objects called neutron stars. Scientists have hypothesized for decades that these collisions could serve as a primary source for r-process elements, yet the idea had lacked solid observational evidence. Now armed with this information, scientists can further hope to retrace the histories of galaxies based on the contents of their stars, in effect conducting “stellar archeology.”

The Kavli Foundation recently spoke with three astrophysicists about how this discovery can unlock clues about galactic evolution as well as the abundances of certain elements on Earth we use for everything from jewelry-making to nuclear power generation. The participants were:

  • Alexander Ji – is a graduate student in physics at the Massachusetts Institute of Technology (MIT) and a member of the MIT Kavli Institute for Astrophysics and Space Research (MKI). He is lead author of a paper in Nature describing this discovery.
  • Anna Frebel – is the Silverman Family Career Development Assistant Professor in the Department of Physics at MIT and also a member of MKI. Frebel is Ji’s advisor and coauthored the Nature paper. Her work delves into the chemical and physical conditions of the early universe as conveyed by the oldest stars.
  • Enrico Ramirez-Ruiz – is a Professor of Astronomy and Astrophysics at the University of California, Santa Cruz. His research explores violent events in the universe, including the mergers of neutron stars and their role in generating r-process elements.

Here’s a link to and a citation for Ji’s and Frebel’s paper about r-process elements in the stars,

R-process enrichment from a single event in an ancient dwarf galaxy by Alexander P. Ji, Anna Frebel, Anirudh Chiti, & Joshua D. Simon. Nature 531, 610–613 (31 March 2016) doi:10.1038/nature17425 Published online 21 March 2016

This paper is behind a paywall but you can read an edited transcript of the roundtable discussion on the Galactic ‘Gold Mine’ Explains the Origin of Nature’s Heaviest Elements webpage (keep scrolling past the introductory text).

As for my side project, Steep (2) on gold nanoparticles, that’s still in the planning stages but if there’s a way to include this information, I’ll do it.

Two May 31, 2016 talks (Why nuclear power is necessary and DNA is not destiny) in Vancouver, Canada

Both of the upcoming science talks in Vancouver are scheduled for May 31, 2016. Isn’t that always the way?

Why nuclear power is necessary

This talk is being held by ARPICO (Society of Italian Researchers & Professionals in Western Canada). From the ARPICO event page,

Why Nuclear Power is Necessary

Presenter

Patrick Walden graduated with a B.Sc. in Physics from UBC and a Ph.D in Particle Physics from Caltech. His Post Doctoral research was done at the Stanford University Linear Accelerator (SLAC), and since 1974 he has been at TRIUMF here in Vancouver. Patrick has been active in the fields of pion photo-production, meson spectroscopy, the dynamics of pion production from nuclei, and nuclear astrophysics.

Abstract

Nuclear power is the second largest source of greenhouse gas emissions-free energy in the world. It supplies approximately 5% of the world’s total energy demand. Presently, human activity is on the brink of initiating a global greenhouse climate catastrophe unless we can limit our greenhouse gas emissions.

In this talk, Dr. Patrick Walden will examine the concerns about nuclear power and the reasons why, contrary to public perception, nuclear power is one of the safest, most economical, plentiful, and greenest sources of energy available.

Logistics

  • May 31, 2016 – 7:00pm
  • Roundhouse Community Centre – Room B – (181 Roundhouse Mews, Vancouver BC V6Z2W3)
  • Underground pay parking is available, access off Drake St. south of Pacific Blvd.
    Admission by donation. Q&A and complimentary refreshments follow. Registration is highly recommended as seating is limited. RSVP at info@arpico.ca or at EventBrite by May 28th, 2016.

A map for the location can be found here.

There is a SkyTrain station nearby: Yaletown-Roundhouse Canada Line Station.

DNA is not destiny

This month’s Café Scientifique talk is being held in downtown Vancouver at Yaggers (433 W. Pender St.). Details of the talk follow (from the May 13, 2016 email announcement),

… Our speaker for the evening will be Dr. Steven Heine, a Professor in the Department of Psychology at UBC [University of British Columbia]. The title of his talk is:

DNA is Not Destiny: How Essences Distort how we Think about Genes

People the world over are essentialist thinkers – they are attracted to the idea that hidden essences make things as they are. And because genetic concepts remind people of essences, they tend to think of genes in ways similar to essences. That is, people tend to think about genetic causes as immutable, deterministic, homogenous, discrete, and natural.  Dr. Heine will discuss how our essentialist biases lead people to think differently about sex, race, crime, eugenics, and disease whenever these are described in genetic terms. Moreover, Dr. Heine will discuss how our essentialistic biases make people vulnerable to the sensationalist hype that has emerged with the genomic revolution and access to direct-to-consumer genotyping services.

Logistics

Tuesday May 31st, 7:30pm at Yagger’s Downtown (433 W Pender).

I have found a little more information about Dr. Steven Heine and his work (from his University of British Columbia webpage),

Our lab is currently working on three distinct research programs, which we refer to as Cultural Psychology, Meaning Maintenance, and Genetic Essentialism.

Our third research program on genetic essentialism considers how people understand essences and genetic foundations for human behavior. We propose that encounters with genetic explanations for human outcomes prompt people to think of those outcomes in essentialized ways, by viewing those outcomes as more deterministic, immutable, and fatalistic. For example, we find that women are more vulnerable to stereotype threat when they hear of genetic reasons for why men outperform women in math than when they hear of environmental reasons for this difference. We also find that men are more tolerant of sex crimes when they learn of a genetic basis for sexual motivations than when they hear of social-constructivist accounts. We are conducting several studies to explore the ways that people respond to genetic accounts for human conditions.

Have fun at whichever one you choose to attend.

Frankenstein and Switzerland in 2016

The Frankenstein Bicentennial celebration is under way as various events and projects launch. In a Nov. 12, 2015 posting I made mention of the Frankenstein Bicentennial Project 1818-2018 at Arizona State University (ASU; scroll down about 15% of the way),

… the Transmedia Museum (Frankenstein Bicentennial Project 1818-2018).  This project is being hosted by Arizona State University. From the project homepage,

No work of literature has done more to shape the way people imagine science and its moral consequences than Frankenstein; or The Modern Prometheus, Mary Shelley’s enduring tale of creation and responsibility. The novel’s themes and tropes—such as the complex dynamic between creator and creation—continue to resonate with contemporary audiences. Frankenstein continues to influence the way we confront emerging technologies, conceptualize the process of scientific research, imagine the motivations and ethical struggles of scientists, and weigh the benefits of innovation with its unforeseen pitfalls.

The Frankenstein Bicentennial Project will infuse science and engineering endeavors with considerations of ethics. It will use the power of storytelling and art to shape processes of innovation and empower public appraisal of techno-scientific research and creation. It will offer humanists and artists a new set of concerns around research, public policy, and the ramifications of exploration and invention. And it will inspire new scientific and technological advances inspired by Shelley’s exploration of our inspiring and terrifying ability to bring new life into the world. Frankenstein represents a landmark fusion of science, ethics, and literary expression.

The bicentennial provides an opportunity for vivid reflection on how science is culturally framed and understood by the public, as well as our ethical limitations and responsibility for nurturing the products of our creativity. It is also a moment to unveil new scientific and technological marvels, especially in the areas of synthetic biology and artificial intelligence. Engaging with Frankenstein allows scholars and educators, artists and writers, and the public at large to consider the history of scientific invention, reflect on contemporary research, and question the future of our technological society. Acting as a network hub for the bicentennial celebration, ASU will encourage and coordinate collaboration across institutions and among diverse groups worldwide.

2016 Frankenstein events

Now, there’s an exhibition in Switzerland, where Frankenstein was ‘born’, according to a May 12, 2016 news item on phys.org,

Frankenstein, the story of a scientist who brings to life a cadaver and causes his own downfall, has for two centuries given voice to anxiety surrounding the unrelenting advance of science.

To mark the 200 years since England’s Mary Shelley first imagined the ultimate horror story during a visit to a frigid, rain-drenched Switzerland, an exhibit opens in Geneva Friday called “Frankenstein, Creation of Darkness”.

In the dimly-lit, expansive basement at the Martin Bodmer Foundation, a long row of glass cases holds 15 hand-written, yellowed pages from a notebook where Shelley in 1816 wrote the first version of what is considered a masterpiece of romantic literature.

The idea for her “miserable monster” came when at just 18 she and her future husband, English poet Percy Bysshe Shelley, went to a summer home—the Villa Diodati—rented by literary great Lord Byron on the outskirts of Geneva.

The current private owners of the picturesque manor overlooking Lake Geneva will also open their lush gardens to guided tours during the nearby exhibit which runs to October 9 [May 13 – Oct. 9, 2016].

While the spot today is lovely, with pink and purple lilacs spilling from the terraces and gravel walkways winding through rose-covered arches, in the summer of 1816 the atmosphere was more somber.

A massive eruption from the Tambora volcano in Indonesia wreaked havoc with the global climate that year, and a weather report for Geneva in June on display at the exhibit mentions “not a single leaf” had yet appeared on the oak trees.

To pass the time, poet Lord Byron challenged the band of literary bohemians gathered at the villa to each invent a ghost story, resulting in several famous pieces of writing.

English doctor and author John Polidori came up with the idea for “The Vampyre”, which was published three years later and is considered to have pioneered the romantic vampyre genre, including works like Bram Stoker’s “Dracula”.

That book figures among a multitude of first editions at the Geneva exhibit, including three of Mary Shelley’s “Frankenstein, or the Modern Prometheus”—the most famous story to emerge from the competition.

Here’s a description of the exhibit, from the Martin Bodmer Foundation’s Frankenstein webpage,

To celebrate the 200th anniversary of the writing of this historically influential work of literature, the Martin Bodmer Foundation presents a major exhibition on the origins of Frankenstein, the perspectives it opens and the questions it raises.

A best seller since its first publication in 1818, Mary Shelley’s novel continues to demand attention. The questions it raises remain at the heart of literary and philosophical concerns: the ethics of science, climate change, the technologisation of the human body, the unconscious, human otherness, the plight of the homeless and the dispossessed.

The exposition Frankenstein: Creation of Darkness recreates the beginnings of the novel in its first manuscript and printed forms, along with paintings and engravings that evoke the world of 1816. A variety of literary and scientific works are presented as sources of the novel’s ideas. While exploring the novel’s origins, the exhibition also evokes the social and scientific themes of the novel that remain important in our own day.

For what it’s worth, I have come across analyses which suggest science and technology may not have been the primary concern at the time. There are interpretations which point instead to issues around childbirth (very dangerous until modern times) and to fear of disfigurement and disfigured individuals. What makes Frankenstein (the monster and the book) so fascinating is how flexible the interpretations can be. (For more about Frankenstein and flexibility, read Susan Tyler Hitchcock’s 2009 book, Frankenstein: a cultural history.)

There’s one more upcoming Frankenstein event, from The Frankenstein Bicentennial announcement webpage,

On June 14 and 15, 2016, the Brocher Foundation, Arizona State University, Duke University, and the University of Lausanne will host “Frankenstein’s Shadow,” a symposium in Geneva, Switzerland to commemorate the origin of Frankenstein and assess its influence in different times and cultures, particularly its resonance in debates about public policy governing biotechnology and medicine. These dates place the symposium almost exactly 200 years after Mary Shelley initially conceived the idea for Frankenstein on June 16, 1816, and in almost exactly the same geographical location on the shores of Lake Geneva.

If you’re interested in details such as the programme schedule, there’s this PDF,

Frankenstein’s_ShadowConference

Enjoy!

Measuring the van der Waals forces between individual atoms for the first time

A May 13, 2016 news item on Nanowerk heralds the first measurement of the van der Waals forces between individual atoms,

Physicists at the Swiss Nanoscience Institute and the University of Basel have succeeded in measuring the very weak van der Waals forces between individual atoms for the first time. To do this, they fixed individual noble gas atoms within a molecular network and determined the interactions with a single xenon atom that they had positioned at the tip of an atomic force microscope. As expected, the forces varied according to the distance between the two atoms; but, in some cases, the forces were several times larger than theoretically calculated.

A May 13, 2016 University of Basel press release (also on EurekAlert), which originated the news item, provides an explanation of van der Waals forces (the most comprehensive I’ve seen) and technical details about how the research was conducted,

Van der Waals forces act between non-polar atoms and molecules. Although they are very weak in comparison to chemical bonds, they are hugely significant in nature. They play an important role in all processes relating to cohesion, adhesion, friction or condensation and are, for example, essential for a gecko’s climbing skills.

Van der Waals interactions arise due to a temporary redistribution of electrons in the atoms and molecules. This results in the occasional formation of dipoles, which in turn induce a redistribution of electrons in closely neighboring molecules. Due to the formation of dipoles, the two molecules experience a mutual attraction, which is referred to as a van der Waals interaction. This only exists temporarily but is repeatedly re-formed. The individual forces are the weakest binding forces that exist in nature, but they add up to reach magnitudes that we can perceive very clearly on the macroscopic scale – as in the example of the gecko.

Fixed within the nano-beaker

To measure the van der Waals forces, scientists in Basel used a low-temperature atomic force microscope with a single xenon atom on the tip. They then fixed the individual argon, krypton and xenon atoms in a molecular network. This network, which is self-organizing under certain experimental conditions, contains so-called nano-beakers of copper atoms in which the noble gas atoms are held in place like a bird egg. Only with this experimental set-up is it possible to measure the tiny forces between microscope tip and noble gas atom, as a pure metal surface would allow the noble gas atoms to slide around.

Compared with theory

The researchers compared the measured forces with calculated values and displayed them graphically. As expected from the theoretical calculations, the measured forces fell dramatically as the distance between the atoms increased. While there was good agreement between measured and calculated curve shapes for all of the noble gases analyzed, the absolute measured forces were larger than had been expected from calculations according to the standard model. Above all for xenon, the measured forces were larger than the calculated values by a factor of up to two.

The scientists are working on the assumption that, even in the noble gases, charge transfer occurs and therefore weak covalent bonds are occasionally formed, which would explain the higher values.
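
Since the distance dependence is the heart of the comparison, here’s a minimal sketch (my own illustration, not the Basel group’s method, which rests on far more sophisticated calculations) of the simplest textbook model of a van der Waals attraction, the London dispersion potential V(r) = -C6/r^6 and the force that follows from it:

```python
# Minimal sketch (my own illustration, not the paper's method): the simplest
# model of van der Waals (London dispersion) attraction between two neutral
# atoms is V(r) = -C6 / r^6, which gives a force F(r) = -dV/dr = -6*C6 / r^7.

C6 = 1.0  # dispersion coefficient, arbitrary units (hypothetical value)


def vdw_potential(r: float, c6: float = C6) -> float:
    """London dispersion potential V(r) = -C6 / r^6."""
    return -c6 / r**6


def vdw_force_magnitude(r: float, c6: float = C6) -> float:
    """Magnitude of the attractive force, |F(r)| = 6*C6 / r^7."""
    return 6 * c6 / r**7


# The 1/r^7 force falls off steeply: doubling the separation weakens it
# by a factor of 2**7 = 128.
for r in (1.0, 1.5, 2.0, 3.0):
    print(f"r = {r:.1f}  V(r) = {vdw_potential(r):+.4f}  |F(r)| = {vdw_force_magnitude(r):.4f}")
```

Even this toy model shows why the measured forces “fell dramatically” as the tip-atom distance grew; the interesting result is that the absolute values came out up to twice as large as the standard calculations predict.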

Here’s a link to and a citation for the paper,

Van der Waals interactions and the limits of isolated atom models at interfaces by Shigeki Kawai, Adam S. Foster, Torbjörn Björkman, Sylwia Nowakowska, Jonas Björk, Filippo Federici Canova, Lutz H. Gade, Thomas A. Jung, & Ernst Meyer. Nature Communications 7, Article number: 11559  doi:10.1038/ncomms11559 Published 13 May 2016

This is an open access paper.

The Leonardo Project and the master’s DNA (deoxyribonucleic acid)

I’ve never really understood the mania for digging up bodies of famous people in history and trying to ascertain how the person really died or what kind of diseases they may have had but the practice fascinates me. The latest famous person to be subjected to a forensic inquiry centuries after death is Leonardo da Vinci. A May 5, 2016 Human Evolution (journal) news release on EurekAlert provides details,

A team of eminent specialists from a variety of academic disciplines has coalesced around a goal of creating new insight into the life and genius of Leonardo da Vinci by means of authoritative new research and modern detective technologies, including DNA science.

The Leonardo Project is in pursuit of several possible physical connections to Leonardo, beaming radar, for example, at an ancient Italian church floor to help corroborate extensive research to pinpoint the likely location of the tomb of his father and other relatives. A collaborating scholar also recently announced the successful tracing of several likely DNA relatives of Leonardo living today in Italy (see endnotes).

If granted the necessary approvals, the Project will compare DNA from Leonardo’s relatives past and present with physical remnants — hair, bones, fingerprints and skin cells — associated with the Renaissance figure whose life marked the rebirth of Western civilization.

The Project’s objectives, motives, methods, and work to date are detailed in a special issue of the journal Human Evolution, published coincident with a meeting of the group hosted in Florence this week under the patronage of Eugenio Giani, President of the Tuscan Regional Council (Consiglio Regionale della Toscana).

The news release goes on to provide some context for the work,

Born in Vinci, Italy, Leonardo died in 1519, age 67, and was buried in Amboise, southwest of Paris. His creative imagination foresaw and described innovations hundreds of years before their invention, such as the helicopter and armored tank. His artistic legacy includes the iconic Mona Lisa and The Last Supper.

The idea behind the Project, founded in 2014, has inspired and united anthropologists, art historians, genealogists, microbiologists, and other experts from leading universities and institutes in France, Italy, Spain, Canada and the USA, including specialists from the J. Craig Venter Institute of California, which pioneered the sequencing of the human genome.

The work underway resembles in complexity recent projects such as the successful search for the tomb of historic author Miguel de Cervantes and, in March 2015, the identification of England’s King Richard III from remains exhumed from beneath a UK parking lot, fittingly re-interred 500 years after his death.

Like Richard, Leonardo was born in 1452, and was buried in a setting that underwent changes in subsequent years such that the exact location of the grave was lost.

If DNA and other analyses yield a definitive identification, conventional and computerized techniques might reconstruct the face of Leonardo from models of the skull.

In addition to Leonardo’s physical appearance, information potentially revealed from the work includes his ancestry and additional insight into his diet, state of health, personal habits, and places of residence.

According to the news release, the researchers have an agenda that goes beyond facial reconstruction and clues about ancestry and diet,

Beyond those questions, and the verification of Leonardo’s “presumed remains” in the chapel of Saint-Hubert at the Château d’Amboise, the Project aims to develop a genetic profile extensive enough to understand better his abilities and visual acuity, which could provide insights into other individuals with remarkable qualities.

It may also make a lasting contribution to the art world, within which forgery is a multi-billion dollar industry, by advancing a technique for extracting and sequencing DNA from other centuries-old works of art, and associated methods of attribution.

Says Jesse Ausubel, Vice Chairman of the Richard Lounsbery Foundation, sponsor of the Project’s meetings in 2015 and 2016: “I think everyone in the group believes that Leonardo, who devoted himself to advancing art and science, who delighted in puzzles, and whose diverse talents and insights continue to enrich society five centuries after his passing, would welcome the initiative of this team — indeed would likely wish to lead it were he alive today.”

The researchers aim to have the work complete by 2019,

In the journal, group members underline the highly conservative, precautionary approach required at every phase of the Project, which they aim to conclude in 2019 to mark the 500th anniversary of Leonardo’s death.

For example, one objective is to verify whether fingerprints on Leonardo’s paintings, drawings, and notebooks can yield DNA consistent with that extracted from identified remains.

Early last year, Project collaborators from the International Institute for Humankind Studies in Florence opened discussions with the laboratory in that city where Leonardo’s Adoration of the Magi has been undergoing restoration for nearly two years, to explore the possibility of analyzing dust from the painting for possible DNA traces. A crucial question is whether traces of DNA remain or whether restoration measures and the passage of time have obliterated all evidence of Leonardo’s touch.

In preparation for such analysis, a team from the J. Craig Venter Institute and the University of Florence is examining privately owned paintings believed to be of comparable age to develop and calibrate techniques for DNA extraction and analysis. At this year’s meeting in Florence, the researchers also described a pioneering effort to analyze the microbiome of a painting thought to be about five centuries old.

If human DNA can one day be obtained from Leonardo’s work and sequenced, the genetic material could then be compared with genetic information from skeletal or other remains that may be exhumed in the future.

Here’s a list of the participating organizations (from the news release),

  • The Institut de Paléontologie Humaine, Paris
  • The International Institute for Humankind Studies, Florence
  • The Laboratory of Molecular Anthropology and Paleogenetics, Biology Department, University of Florence
  • Museo Ideale Leonardo da Vinci, in Vinci, Italy
  • J. Craig Venter Institute, La Jolla, California
  • Laboratory of Genetic Identification, University of Granada, Spain
  • The Rockefeller University, New York City

You can find the special issue of Human Evolution (HE Vol. 31, 2016 no. 3) here. The introductory essay is open access but the other articles are behind a paywall.

Reddit’s Ask Me Anything with Stacy Konkiel about research metrics today (May 10, 2016)

You have a chance to ask your most pressing questions about research metrics today, May 10, 2016, by 10 am PDT. Here’s more about the panelist for this discussion, Stacy Konkiel, from her Reddit AMA (Ask Me Anything) introduction,

Hi, I am Stacy Konkiel, Outreach & Engagement Manager at Altmetric, and I’m here to talk about whether the metrics and indicators we like to rely upon in science (impact factor, altmetrics, citation counts, etc) to understand “broader impact” and “intellectual merit” are actually measuring what we purport they measure.

I’m not sure they do. Instead, I think that right now we’re just using rough proxies to understand influence and attention, and that we’re in danger of abusing the metrics that are supposed to save us all–altmetrics–just like science has done with the journal impact factor.

But altmetrics and other research metrics don’t have to be Taylorist tools of control. I love the promise they hold for scientists who want to truly understand how their research is truly changing the world.

I especially appreciate the fact that newer metrics allow the “invisible work” that’s being done in science (by the data curators, the software developers, etc.) to be recognized on its standalone merits, rather than as a byproduct of the publication process. That’s been my favorite part of working for Altmetric and, previously, Impactstory–that I can help others to better value the work of grad students, librarians, data scientists, etc.

Today, I want to talk about better measuring research impact, but I’m also open to taking other relevant questions. There will also be some live tweeting from @Altmetric and @digitalsci and questions using the #askstacyaltmetric hashtag.

My favourite question so far is this (it’s a long one),

gocsick 1 point

I might be pessimistic, but I am not sure there will ever be a satisfactory metric or indicator for the simple reason that it is impossible to normalize across fields or even within sub-disciplines within a field. Take my department of materials science for example, we recently hired two assistant professors one in the area of materials chemistry for batteries and energy storage and the other in welding metallurgy. The top candidates for the energy storage position had multiple science and nature publications with h-indices of ~40. The welding candidates had h-indices ~6. The interview process for both positions was interlaced, and in general we felt that the welding candidates were better in terms of depth and breadth of their knowledge and ability to do high quality science. Conversely, while the energy candidates had great numbers and publications seemed to be a bit of a one-trick pony in terms of ability to contribute outside of their very narrow speciality.

My point is that any metric is going to underestimate the contribution of people working in science areas outside of the current sexy topics. Within the field, we all know what journals are well respected and only publish high quality science. For example the Welding Journal and Corrosion are both the premier publications in their fields but each have an impact factor of <2. This is a reflection of the number of people and publications in the field rather than the quality of science.

I suppose I should end the rant and ask some questions.

1) Is the situation I described really a problem or am i just being grumpy? After all the metrics are supposed to measure the impact of the work, and people who work in less popular fields are less likely to make a big “impact”.

2) Is it possible to develop a metric on “quality” rather than impact. If so what would such a metric look like?

3) Altmetrics will likely have the same failings. There are many ways someone’s work can have broader impact that will not necessarily be talked about on social media or popular science blog posts. For example research leading to $$$ saved in industry by making a small change to a manufacturing operation, could have a huge broader impact but never once be tweeted (or cited more than a handful of times for that matter). Is there any potential to develop metrics to help shed light on these other “blind” activities?

Thanks

I hope you get there in time, if not, perhaps someone else has asked the question for you. I look forward to seeing the answers.
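
As an aside, for anyone unfamiliar with the h-indices gocsick cites: a researcher has an h-index of h when h of their papers have each been cited at least h times. A minimal sketch (my own illustration, with made-up citation counts):

```python
# Minimal sketch (my own illustration): a researcher's h-index is the largest
# number h such that h of their papers have at least h citations each.

def h_index(citations: list[int]) -> int:
    """Return the h-index for a list of per-paper citation counts."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

# Made-up citation counts for a "hot topic" record vs. a small-field record:
print(h_index([120, 95, 80, 60, 44, 41, 40, 38, 30]))  # -> 9
print(h_index([15, 9, 7, 5, 3, 2]))                    # -> 4
```

The gap between an h-index of ~40 and one of ~6 can say as much about the size of a field as about the quality of the work, which is exactly gocsick’s point.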

Science (magazine) investigates Sci-Hub (a pirate site for scientific papers)

Sci-Hub, a pirate website for scientific papers, and its progenitor, Alexandra Elbakyan, have generated a couple of articles and an editorial in Science magazine’s latest issue (April 28, 2016?). An April 29, 2016 article by Bob Yirka for phys.org describes one of the articles (Note: Links have been removed),

A correspondent for the Science family of journals has published an investigative piece in Science on Sci-Hub, a website that illegally publishes scholarly literature, i.e. research papers. In his article, John Bohannon describes how he made contact with Alexandra Elbakyan, the founder of what is now the world’s largest site for pirated scholarly articles, data she gave him, and commentary on what was revealed. Bohannon has also published another piece focused exclusively on Elbakyan, describing her as a frustrated science student. Marcia McNutt, Editor-in-Chief of the Science Family also weighs in on her “love-hate” relationship with Sci-Hub, and explains in detail why she believes the site is likely to cause problems for scholarly publishing heading into the future.

An April 28, 2016 American Association for the Advancement of Science (AAAS) news release provides some detail about the number of downloads from the Sci-Hub site,

In this investigative news piece from Science, contributing correspondent John Bohannon dives into data from Sci-Hub, the world’s largest pirate website for scholarly literature. For the first time, basic questions about Sci-Hub’s millions of users can be answered: Where are they and what are they reading? Bohannon’s statistical analysis is based on server log data supplied by Alexandra Elbakyan herself, the neuroscientist who created Sci-Hub in 2011. After establishing contact with her through an encrypted chat system, Bohannon and Elbakyan worked together to create a data set for public release: 28 million Sci-Hub download requests going back to 1 September 2015, including the digital object identifier (DOI) for every paper and the clustered locations of users based on their Internet Protocol address. In his story, Bohannon reveals that Sci-Hub usage is highest in China with 4.4 million download requests over the 6-month period, followed by India and Iran. But Sci-Hub users are not limited to the developing world, he reports; the U.S. is the fifth largest downloader and some of the most intense Sci-Hub activity seems to be happening on US and European university campuses, supporting the claim that many users could be accessing the papers through their libraries, but turn to Sci-Hub for convenience.
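
To make the scale of that analysis concrete, here’s a minimal sketch of the kind of per-country tally Bohannon describes, assuming a hypothetical CSV export of download events (the file name and column names are my invention; the released data set uses DOIs and clustered locations as described above):

```python
# Minimal sketch (hypothetical file and column names): tallying Sci-Hub
# download requests per country from a log of download events, the kind of
# aggregation behind "usage is highest in China ... followed by India and Iran".
import csv
from collections import Counter

downloads_per_country = Counter()

with open("scihub_downloads.csv", newline="", encoding="utf-8") as f:
    # assumed columns: doi, timestamp, city, country
    for row in csv.DictReader(f):
        downloads_per_country[row["country"]] += 1

for country, count in downloads_per_country.most_common(5):
    print(f"{country}: {count} download requests")
```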

Bohannon’s piece appears to be open access. Here’s a link and a citation,

Who’s downloading pirated papers? Everyone by John Bohannon. Science (2016). DOI: 10.1126/science.aaf5664 Published April 28, 2016.

Comments

The analysis of the data is fascinating but I’m not sure why this is being billed as an ‘investigative’ piece. Generally speaking I would expect an investigative piece to unearth new information which has likely been hidden. At the very least, I would expect some juicy inside information (i.e., gossip).

Bohannon certainly had no difficulty getting information (from the April 28, 2016 Science article),

For someone denounced as a criminal by powerful corporations and scholarly societies, Elbakyan was surprisingly forthcoming and transparent. After establishing contact through an encrypted chat system, she worked with me over the course of several weeks to create a data set for public release: every download event over the 6-month period starting 1 September 2015, including the digital object identifier (DOI) for every paper. To protect the privacy of Sci-Hub users, we agreed that she would first aggregate users’ geographic locations to the nearest city using data from Google Maps; no identifying internet protocol (IP) addresses were given to me. (The data set and details on how it was analyzed are freely accessible)

Why would it be surprising that someone who has made a point of freeing scientific research and making it accessible also makes the data from her Sci-Hub site freely available? The action certainly seems consistent with her raison d’être.

Bohannon steers away from making any serious criticisms of the current publishing régimes although he does mention a few bones of contention while laying them to rest, more or less. This is no great surprise since he’s writing for one of the ‘big three’, a journal that could be described as having a vested interest in maintaining the status quo. (For those who are unaware, there are three journals considered the most prestigious or highest impact for scientific studies: Nature, Cell, and Science.)

Characterizing Elbakyan as a ‘frustrated’ student in an April 28, 2016 profile by John Bohannon (The frustrated science student behind Sci-Hub) seems a bit dismissive. Sci-Hub may have been borne of frustration but it is an extraordinary accomplishment.

The piece has resulted in at least one very irate librarian, John Dupuis, from an April 29, 2016 posting on his Confessions of a Science Librarian blog,

Overall, the articles are pretty good descriptions of the Sci-Hub phenomenon and relatively even-handed [emphasis mine], especially coming from one of the big society publishers like AAAS.

There was one bit in the main article, Who’s downloading pirated papers? Everyone, that really stuck in my craw. Basically, Sci-Hub — and all that article piracy — is librarians’ fault.

And for all the researchers at Western universities who use Sci-Hub instead, the anonymous publisher lays the blame on librarians for not making their online systems easier to use and educating their researchers. “I don’t think the issue is access—it’s the perception that access is difficult,” he says.

Fortunately it was countered, in the true “give both sides of the story” style of mainstream journalism, by another quote, this time from a librarian.

“I don’t agree,” says Ivy Anderson, the director of collections for the California Digital Library in Oakland, which provides journal access to the 240,000 researchers of the University of California system. The authentication systems that university researchers must use to read subscription journals from off campus, and even sometimes on campus with personal computers, “are there to enforce publisher restrictions,” she says.

But of course, I couldn’t let it go. Anderson’s response is perfectly fine but somehow there just wasn’t enough rage and exasperation in it. So I stewed about it over night and tweeted up a tweetstorm of rage this morning, with the idea that if the rant was well-received I would capture the text as part of a blog post.

As you may have guessed from my previous comments, I didn’t find the article quite as even-handed as Dupuis did. As for the offence to librarians, I did notice it, but it seems in line with the rest of the piece, which dismisses, downplays, and offloads a few serious criticisms while ignoring how significant issues (a problematic peer review process, exorbitant rates for access to publicly funded research, failure to adequately tag published papers that are under review after serious concerns are raised, failure to respond in a timely fashion when serious concerns are raised about a published paper, positive publication bias, …) have spawned the open access movement and also Sci-Hub. When you consider that governments rely on bibliometric data such as the number of papers published and the number published in high-impact journals (such as one of the ‘big three’), it’s clear there’s a great deal at stake.

Other Sci-Hub pieces here

My last piece about Sci-Hub was a February 25, 2016 posting titled ‘Using copyright to shut down easy access to scientific research’, featuring some of the discussion around Elsevier and its legal suit against Sci-Hub.

Science-themed scriptwriting competition for Google (call for submissions)

David Bruggeman writes about a Google-sponsored scriptwriting competition in an April 28, 2016 posting on his Pasco Phronesis blog (Note: Links have been removed),

At the Tribeca Film Festival last week [the festival ran from April 13 – 24, 2016] Google announced that its CS Education in Media Program is partnering with the website The Black List for a fellowship competition to support the image of computer science and computer scientists in media (H/T STEMDaily).  The Black List is a screenwriting site known for hosting the best unproduced screenplays in Hollywood.

The fellowship could award up to $15,000 for as many as three scripts (one film script and two episodic television pilots).  The writers would use the money to support their work on new materials for six months.  At the end of that period the writer(s) would present that work to Google along with a summary of how the grant helped advance that work and/or affected their career.

Here’s more about the competition from The Black list website’s The Google Computer Science Education in Media Fellowship Call for Submissions webpage,

The Black List is pleased to partner with Google’s Computer Science Education in Media program to offer financial grants in support of the development of three scripts with a focus on changing the image in popular culture of computer science and computer scientists.

REQUIREMENTS

  • The candidate must host a script on www.blcklst.com for at least one week during the opt-in period.
  • Such script must be original to the candidate.
  • The candidate must be competent to contract.
  • If selected for the fellowship, writers must develop a feature screenplay or episodic pilot that changes the image of computer science or computer scientists, particularly as it applies to women and minorities, in popular culture.
  • Further, selected writers must agree that six months following receipt of the fellowship that they will provide a designated representative of Google with a sample of his/her new work along with a report addressing how the grant has been used to advance his/her work and/or impacted his/her career.

SELECTION PROCESS

Beginning April 20, 2016, users of the Black List website can opt into consideration for this fellowship.

On July 15 [2016], the Black List will short list ten writers based on all data available on the Black List website about their opted in feature screenplays and teleplays.

These ten short listed candidates will be asked to submit one-page biographies, which will be sent to Google along with the screenplays/teleplays.

Google will review these 10 scripts and choose the Fellowship recipients. Google reserves the right to grant no fellowships if, in Google’s opinion, no entry is of sufficient merit.

DEADLINES OF NOTE (ALL TIMES 11:59 PM PT)

Evaluation purchase deadline* June 15, 2016

Opt in deadline July 15, 2016

* In order for new script evaluations to guarantee consideration for this opportunity, they must be purchased by midnight on the Evaluation deadline.

ADDITIONAL INFORMATION ABOUT GOOGLE’S COMPUTER SCIENCE EDUCATION IN MEDIA PROGRAM

Why is Google working with Hollywood? 

Google aims to inspire young people around the world not just to use technology, but to create it.  To do so, we need more students pursuing an education in CS, particularly girls and minorities, who have historically been underrepresented in the field. Google wants to prepare the next generation for the workplace of the future, and expand access to CS education that engages and retains students from all backgrounds.

  • Moreover, Google’s research shows that perceptions of CS and computer scientists are primary drivers that motivate girls to pursue CS. “If you can’t see it, you can’t be it,” as our friend Geena Davis notes.
  • Google’s hope is that by dispelling stereotypes and identifying positive portrayals of women in tech it can do for CS what CSI did for the field of forensic science, changing its gender make-up and increasing its appeal to a wider audience.
  • Media is part of the ecosystem that needs to change in conjunction with the other areas of work where Google has invested including increasing access to curriculum, non-profit grants, and policy support. If we don’t address the perceptions piece for both young people and adults through mainstream media, we run the risk of undermining our other efforts in CS education.

Background stats on perceptions of CS: 

Google’s research shows that perceptions of careers in computer science really matter.  Girls who feel that television portrays programmers negatively or who don’t see other students like them taking CS are significantly less likely to get into computing. Interestingly, girls who want a career with social impact are also less likely to go into CS.

Google conducted a research study to identify the factors that most influence girls to study computer science, and the second most important category of factors was Career Perceptions.

  • Girls who felt that television portrays programmers in a negative light were less likely to pursue CS.
  • If a girl didn’t see the right social crowd in a class — that is, if there weren’t enough students like her — she was less likely to go into CS.
  • Girls who want careers with social impact are less likely to go into CS. (It’s clear we need to do a better job of showing how CS can be used to develop solutions to some of the world’s most challenging problems.)
  • Perception accounts for 27% of the decision making for girls to pursue CS. The #1 factor is parent/adult encouragement, which is also influenced by media.

Stats on representation in media:

  • Blacks & Hispanics are already underrepresented on-screen, at 14.1% and 4.9%, respectively.
  • Combine this with the lack of (and misrepresentation of) STEM/CS characters in family movies and prime-time TV, and you get STEM characters that are less than 18% women and CS characters that are less than 13% women.

Proven Success with other Fields:

  • Forensic Science – CSI increased the number of forensic science majors in nationally recognized programs by at least 50% in 5 years – a majority being women.
  • Law – UCLA claimed a 16.5% increase in law school applicants one year after LA Law premiered. Justice Sotomayor credits her interest in law to watching Perry Mason at 10 years old.
 …

FREQUENTLY ASKED QUESTIONS

FAQ & Answers

Go here to register (there is a cost associated with registering but there don’t appear to be any citizenship or residency restrictions, e.g., must be a US citizen or must reside in the US). Good luck!

Nucleic acid-based memory storage

We’re running out of memory. To be more specific, there are two problems: the supply of silicon and a limit to how much silicon-based memory can store. An April 27, 2016 news item on Nanowerk announces a nucleic acid-based approach to solving the memory problem,

A group of Boise State [Boise State University in Idaho, US] researchers, led by associate professor of materials science and engineering and associate dean of the College of Innovation and Design Will Hughes, is working toward a better way to store digital information using nucleic acid memory (NAM).

An April 25, 2016 Boise State University news release, which originated the news item, expands on the theme of computer memory and provides more details about the approach,

It’s no secret that as a society we generate vast amounts of data each year. So much so that the 30 billion watts of electricity used annually by server farms today is roughly equivalent to the output of 30 nuclear power plants.

And the demand keeps growing. The global flash memory market is predicted to reach $30.2 billion this year, potentially growing to $80.3 billion by 2025. Experts estimate that by 2040, the demand for global memory will exceed the projected supply of silicon (the raw material used to store flash memory). Furthermore, electronic memory is rapidly approaching its fundamental size limits because of the difficulty in storing electrons in small dimensions.

Hughes, with post-doctoral researcher Reza Zadegan and colleagues Victor Zhirnov (Semiconductor Research Corporation), Gurtej Sandhu (Micron Technology Inc.) and George Church (Harvard University), is looking to DNA molecules to solve the problem. Nucleic acid — the “NA” in “DNA” — far surpasses electronic memory in retention time, according to the researchers, while also providing greater information density and lower energy of operation.

Their conclusions are outlined in an invited commentary in the prestigious journal Nature Materials published earlier this month.

“DNA is the data storage material of life in general,” said Hughes. “Because of its physical and chemical properties, it also may become the data storage material of our lives.” It may sound like science fiction, but Hughes will participate in an invitation-only workshop this month at the Intelligence Advanced Research Projects Activity (IARPA) to envision a portable DNA hard drive that would have 500 terabytes of searchable data – that’s about the size of the Library of Congress Web Archive.

“When information bits are encoded into polymer strings, researchers and manufacturers can manage and manipulate physical, chemical and biological information with standard molecular biology techniques,” the paper [in Nature Materials?] states.

Cost-competitive technologies to read and write DNA could lead to real-world applications ranging from artificial chromosomes, digital hard drives and information-management systems, to a platform for watermarking and tracking genetic content or next-generation encryption tools that necessitate physical rather than electronic embodiment.

Here’s how it works. Current binary code uses 0’s and 1’s to represent bits of information. A computer program then accesses a specific decoder to turn the numbers back into usable data. With nucleic acid memory, 0’s and 1’s are replaced with the nucleotides A, T, C and G. Known as monomers, they are covalently bonded to form longer polymer chains, also known as information strings.
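
To make that substitution concrete, here’s a minimal sketch (my own illustration; the Nature Materials commentary doesn’t prescribe a particular mapping) of a two-bits-per-nucleotide encoding:

```python
# Minimal sketch (my own illustration, not the scheme from the Nature
# Materials commentary): since DNA has four bases, each nucleotide can carry
# two bits. Map bit pairs to bases to encode bytes as a DNA string and back.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Encode bytes as a DNA sequence, two bits per nucleotide."""
    bitstring = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bitstring[i:i + 2]]
                   for i in range(0, len(bitstring), 2))

def decode(sequence: str) -> bytes:
    """Decode a DNA sequence produced by encode() back into bytes."""
    bitstring = "".join(BASE_TO_BITS[base] for base in sequence)
    return bytes(int(bitstring[i:i + 8], 2)
                 for i in range(0, len(bitstring), 8))

dna = encode(b"Hi")  # 2 bytes -> 8 nucleotides: "CAGACGGC"
assert decode(dna) == b"Hi"
print(dna)
```

Because each of the four bases carries two bits, a DNA string holds twice as many bits as it has nucleotides, which hints at the density advantage the researchers describe.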

Because of DNA’s superior ability to store data, DNA can contain all the information in the world in a small box measuring 10 x 10 x 10 centimeters. NAM could thus be used as a sustainable time capsule for massive scientific, financial, governmental, historical, genealogical, personal and genetic records.

Better yet, DNA can store digital information for a very long time – thousands to millions of years. Currently, usable information has been extracted from DNA in bones that are 700,000 years old, making nucleic acid memory a promising archival material. And nucleic acid memory uses 100 million times less energy than storing data electronically in flash, and the data can live on for generations.

At Boise State, Hughes and Zadegan are examining DNA’s stability under extreme conditions. DNA strands are subjected to temperatures varying from negative 20 degrees Celsius to 100 degrees Celsius, and to a variety of UV exposures to see if they can still retain their information. What they’re finding is that much less information is lost with NAM than with the current state of the industry.

Here’s a link to and a citation for the Nature Materials paper,

Nucleic acid memory by Victor Zhirnov, Reza M. Zadegan, Gurtej S. Sandhu, George M. Church, & William L. Hughes. Nature Materials 15, 366–370 (2016)  doi:10.1038/nmat4594 Published online 23 March 2016

This paper is behind a paywall.

Exploring the science of Iron Man (prior to the opening of Captain America: Civil War, aka, Captain America vs. Iron Man)

Not unexpectedly, there’s a news item about science and Iron Man (it’s getting quite common for the science in movies to be promoted and discussed) just a few weeks before the movie Captain America: Civil War or, as it’s also known, Captain America vs. Iron Man opens in the US. From an April 26, 2016 news item on phys.org,

… how much of our favourite superheroes’ power lies in science and how much is complete fiction?

As Iron Man’s name suggests, he wears a suit of “iron” which gives him his abilities—superhuman strength, flight and an arsenal of weapons—and protects him from harm.

In scientific parlance, the Iron Man suit is an exoskeleton, which is worn outside the body to enhance it.

An April 26, 2016 posting by Chris Marr on the ScienceNetwork Western Australia blog, which originated the news item, provides an interesting overview of exoskeletons and some of the scientific obstacles still to be overcome before they become commonplace,

In the 1960s, the first real powered exoskeleton appeared—a machine integrated with the human frame and movements which provided the wearer with 25 times his natural lifting capacity.

The major drawback then was that the unit itself weighed in at 680kg.

UWA [University of Western Australia] Professor Adrian Keating suggests that some of the technology seen in the latest Marvel blockbuster, such as controlling the exoskeleton with simple thoughts, will be available in the near future by leveraging ongoing advances of multi-disciplinary research teams.

“Dust grain-sized micromachines could be programmed to cooperate to form reconfigurable materials such as the retractable face mask, for example,” Prof Keating says.

However, all of these devices are in need of a power unit small enough to be carried yet providing enough capacity for more than a few minutes of superhuman use, he says.

Does anyone have a spare Arc Reactor?

Currently, most exoskeleton development has been for medical applications, with devices designed to give mobility to amputees and paraplegics, and there are a number in commercial production and use.

Dr Lei Cui, who lectures in Mechatronics at Curtin University, has recently developed both a hand and leg exoskeleton, designed for use by patients who have undergone surgery or have nerve dysfunction, spinal injuries or muscular dysfunction.

“Currently we use an internal battery that lasts about two hours in the glove, which can be programmed for only four different movement patterns,” Dr Cui says.

Dr Cui’s exoskeletons are made from plastic, making them light but offering little protection compared to the titanium exterior of Stark’s favourite suit.

It’s clear that we are a long way from being able to produce a working Iron Man suit at all, let alone one that flies, protects the wearer and has the capacity to fight back.

This is not the first time I’ve featured a science and pop culture story here. You can check out my April 28, 2014 posting for a story about how Captain America’s shield could be a supercapacitor (it also has a link to a North Carolina State University blog featuring science and other comic book heroes) and there is my May 6, 2013 post about Iron Man 3 and a real life injectable nano-network.

As for ScienceNetwork Western Australia, here’s more from their About SNWA page,

ScienceNetwork Western Australia (SNWA) is an online science news service devoted to sharing WA’s achievements in science and technology.

SNWA is produced by Scitech, the state’s science and technology centre and supported by the WA Government’s Office of Science via the Department of the Premier and Cabinet.

Our team of freelance writers work with in-house editors based at Scitech to bring you news from all fields of science, and from the research, government and private industry sectors working throughout the state. Our writers also produce profile stories on scientists. We collaborate with leading WA institutions to bring you Perspectives from prominent WA scientists and opinion leaders.

We also share news of science-related events and information about the greater WA science community including WA’s Chief Scientist, the Premier’s Science Awards, Innovator of the Year Awards and information on regional community science engagement.

Since our commencement in 2003 we have grown to share WA’s stories with local, national and global audiences. Our articles are regularly republished in print and online media in the metropolitan and regional areas.

Bravo to the Western Australia government! I wish there were initiatives of this type in Canada; the closest we have is the French-language Agence Science-Presse, supported by the Province of Québec.