Author Archives: Maryse de la Giroday

New principles for AI (artificial intelligence) research along with some history and a plea for a democratic discussion

For almost a month I’ve been meaning to get to this Feb. 1, 2017 essay by Andrew Maynard (director of Risk Innovation Lab at Arizona State University) and Jack Stilgoe (science policy lecturer at University College London [UCL]) on the topic of artificial intelligence and principles (Note: Links have been removed). First, a walk down memory lane,

Today [Feb. 1, 2017] in Washington DC, leading US and UK scientists are meeting to share dispatches from the frontiers of machine learning – an area of research that is creating new breakthroughs in artificial intelligence (AI). Their meeting follows the publication of a set of principles for beneficial AI that emerged from a conference earlier this year at a place with an important history.

In February 1975, 140 people – mostly scientists, with a few assorted lawyers, journalists and others – gathered at a conference centre on the California coast. A magazine article from the time by Michael Rogers, one of the few journalists allowed in, reported that most of the four days’ discussion was about the scientific possibilities of genetic modification. Two years earlier, scientists had begun using recombinant DNA to genetically modify viruses. The Promethean nature of this new tool prompted scientists to impose a moratorium on such experiments until they had worked out the risks. By the time of the Asilomar conference, the pent-up excitement was ready to burst. It was only towards the end of the conference when a lawyer stood up to raise the possibility of a multimillion-dollar lawsuit that the scientists focussed on the task at hand – creating a set of principles to govern their experiments.

The 1975 Asilomar meeting is still held up as a beacon of scientific responsibility. However, the story told by Rogers, and subsequently by historians, is of scientists motivated by a desire to head-off top down regulation with a promise of self-governance. Geneticist Stanley Cohen said at the time, ‘If the collected wisdom of this group doesn’t result in recommendations, the recommendations may come from other groups less well qualified’. The mayor of Cambridge, Massachusetts was a prominent critic of the biotechnology experiments then taking place in his city. He said, ‘I don’t think these scientists are thinking about mankind at all. I think that they’re getting the thrills and the excitement and the passion to dig in and keep digging to see what the hell they can do’.

The concern in 1975 was with safety and containment in research, not with the futures that biotechnology might bring about. A year after Asilomar, Cohen’s colleague Herbert Boyer founded Genentech, one of the first biotechnology companies. Corporate interests barely figured in the conversations of the mainly university scientists.

Fast-forward 42 years and it is clear that machine learning, natural language processing and other technologies that come under the AI umbrella are becoming big business. The cast list of the 2017 Asilomar meeting included corporate wunderkinds from Google, Facebook and Tesla as well as researchers, philosophers, and other academics. The group was more intellectually diverse than their 1975 equivalents, but there were some notable absences – no public and their concerns, no journalists, and few experts in the responsible development of new technologies.

Maynard and Stilgoe offer a critique of the latest principles,

The principles that came out of the meeting are, at least at first glance, a comforting affirmation that AI should be ‘for the people’, and not to be developed in ways that could cause harm. They promote the idea of beneficial and secure AI, development for the common good, and the importance of upholding human values and shared prosperity.

This is good stuff. But it’s all rather Motherhood and Apple Pie: comforting and hard to argue against, but lacking substance. The principles are short on accountability, and there are notable absences, including the need to engage with a broader set of stakeholders and the public. At the early stages of developing new technologies, public concerns are often seen as an inconvenience. In a world in which populism appears to be trampling expertise into the dirt, it is easy to understand why scientists may be defensive.

I encourage you to read this thoughtful essay in its entirety although I do have one nit to pick: why only US and UK scientists? I imagine the answer may lie in funding and logistics issues, but I find it surprising that the critique makes no mention of the international community as a nod to inclusion.

For anyone interested in the Asilomar AI principles (2017), you can find them here. You can also find videos of the two-day workshop (Jan. 31 – Feb. 1, 2017), titled The Frontiers of Machine Learning (a Raymond and Beverly Sackler USA-UK Scientific Forum hosted by the US National Academy of Sciences), here; videos for each session are available on YouTube.

The physics of melting in two-dimensional systems

You might want to skip over the reference to snow as it doesn’t have much relevance to this story about ‘melting’, from a Feb. 1, 2017 news item on Nanowerk (Note: A link has been removed),

Snow falls in winter and melts in spring, but what drives the phase change in between?
Although melting is a familiar phenomenon encountered in everyday life, playing a part in many industrial and commercial processes, much remains to be discovered about this transformation at a fundamental level.

In 2015, a team led by the University of Michigan’s Sharon Glotzer used high-performance computing at the Department of Energy’s (DOE’s) Oak Ridge National Laboratory [ORNL] to study melting in two-dimensional (2-D) systems, a problem that could yield insights into surface interactions in materials important to technologies like solar panels, as well as into the mechanism behind three-dimensional melting. The team explored how particle shape affects the physics of a solid-to-fluid melting transition in two dimensions.

Using the Cray XK7 Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility, the team’s [latest?] work revealed that the shape and symmetry of particles can dramatically affect the melting process (“Shape and symmetry determine two-dimensional melting transitions of hard regular polygons”). This fundamental finding could help guide researchers in search of nanoparticles with desirable properties for energy applications.

There is a video of the ‘melting’ process, but I have to confess to finding it a bit enigmatic,

A Feb. 1, 2017 ORNL news release (also on EurekAlert), which originated the news item, provides more detail about the research,

To tackle the problem, Glotzer’s team needed a supercomputer capable of simulating systems of up to 1 million hard polygons, simple particles used as stand-ins for atoms, ranging from triangles to 14-sided shapes. Unlike traditional molecular dynamics simulations that attempt to mimic nature, hard polygon simulations give researchers a pared-down environment in which to evaluate shape-influenced physics.

“Within our simulated 2-D environment, we found that the melting transition follows one of three different scenarios depending on the shape of the systems’ polygons,” University of Michigan research scientist Joshua Anderson said. “Notably, we found that systems made up of hexagons perfectly follow a well-known theory for 2-D melting, something that hasn’t been described until now.”

Shifting Shape Scenarios

In 3-D systems such as a thinning icicle, melting takes the form of a first-order phase transition. This means that collections of molecules within these systems exist as either solid or liquid, with no in-between, in the presence of latent heat, the energy that fuels a solid-to-fluid phase change. In 2-D systems, such as thin-film materials used in batteries and other technologies, melting can be more complex, sometimes exhibiting an intermediate phase known as the hexatic phase.

The hexatic phase, a state characterized as a halfway point between an ordered solid and a disordered liquid, was first theorized in the 1970s by researchers John Kosterlitz, David Thouless, Burt Halperin, David Nelson, and Peter Young. The phase is a principal feature of the KTHNY theory, a 2-D melting theory posited by the researchers (and named based on the first letters of their last names). In 2016 Kosterlitz and Thouless were awarded the Nobel Prize in Physics, along with physicist Duncan Haldane, for their contributions to 2-D materials research.

At the molecular level, solid, hexatic, and liquid systems are defined by the arrangement of their atoms. In a crystalline solid, two types of order are present: translational and orientational. Translational order describes the well-defined paths between atoms over distances, like blocks in a carefully constructed Jenga tower. Orientational order describes the relational and clustered order shared between atoms and groups of atoms over distances. Think of that same Jenga tower turned askew after several rounds of play. The general shape of the tower remains, but its order is now fragmented.

The hexatic phase has no translational order but possesses orientational order. (A liquid has neither translational nor orientational order but exhibits short-range order, meaning any atom will have some average number of neighbors nearby but with no predictable order.)
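For readers who like to see the math made concrete: the sixfold orientational order that defines the hexatic phase is commonly quantified with the bond-orientational order parameter ψ6. Here is a minimal, self-contained sketch (my own illustration, not the team’s HOOMD-blue analysis code) of how it is computed for one particle and its neighbors,

```python
import cmath
import math

def psi6(center, neighbors):
    """Local bond-orientational order parameter psi_6.

    For each neighbor, take the angle of the bond from `center`,
    multiply it by 6, and average the resulting unit phasors.
    |psi6| is near 1 for hexagonal (sixfold) local order and
    near 0 for disordered or incompatible arrangements.
    """
    total = 0j
    for (nx, ny) in neighbors:
        theta = math.atan2(ny - center[1], nx - center[0])
        total += cmath.exp(6j * theta)
    return total / len(neighbors)

# A particle at the origin with six neighbors in a perfect hexagon:
hexagon = [(math.cos(k * math.pi / 3), math.sin(k * math.pi / 3))
           for k in range(6)]
print(abs(psi6((0.0, 0.0), hexagon)))  # ~1.0: perfect sixfold order
```

A square arrangement of four neighbors, by contrast, gives |ψ6| of essentially 0, because the phasors cancel in pairs: that contrast is what lets simulations distinguish hexatic regions from liquid ones.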

Deducing the presence of a hexatic phase requires a leadership-class computer that can calculate large hard-particle systems. Glotzer’s team gained access to the OLCF’s 27-petaflop Titan through the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, running its GPU-accelerated HOOMD-blue code to maximize time on the machine.

On Titan, HOOMD-blue used 64 GPUs for each massively parallel Monte Carlo simulation of up to 1 million particles. Researchers explored 11 different shape systems, applying an external pressure to push the particles together. Each system was simulated at 21 different densities, with the lowest densities representing a fluid state and the highest densities a solid state.
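To make the method concrete: HOOMD-blue is a GPU-accelerated package that handles the geometry of polygon overlaps, but the core hard-particle Monte Carlo trial move is simple. The toy sketch below is my own simplification (hard disks instead of polygons, so the overlap test is a one-line distance check) and is purely illustrative, not the team’s code,

```python
import math
import random

def min_image_dist(p, q, box):
    """Distance between two points in a periodic box (minimum image)."""
    dx = (p[0] - q[0]) - box * round((p[0] - q[0]) / box)
    dy = (p[1] - q[1]) - box * round((p[1] - q[1]) / box)
    return math.hypot(dx, dy)

def mc_sweep(positions, box, step=0.1, diameter=1.0):
    """One Monte Carlo sweep over hard disks in a periodic box.

    Hard particles have no finite interaction energy, so the
    Metropolis rule reduces to: accept the trial displacement
    unless it creates an overlap with any other particle.
    """
    for i, (x, y) in enumerate(positions):
        trial = ((x + random.uniform(-step, step)) % box,
                 (y + random.uniform(-step, step)) % box)
        others = positions[:i] + positions[i + 1:]
        if all(min_image_dist(trial, q, box) >= diameter for q in others):
            positions[i] = trial  # accepted
        # else: rejected, the particle stays put
    return positions
```

Because there is no energy scale, temperature plays no role for hard particles; as in the study, phase behavior is driven entirely by density (or the external pressure used to set it).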

The simulations demonstrated multiple melting scenarios hinging on the polygons’ shape. Systems with polygons of seven sides or more closely followed the melting behavior of hard disks, or circles, exhibiting a continuous phase transition from the solid to the hexatic phase and a first-order phase transition from the hexatic to the liquid phase. A continuous phase transition means a constantly changing area in response to a changing external pressure. A first-order phase transition is characterized by a discontinuity in which the volume jumps across the phase transition in response to the changing external pressure. The team found that pentagons and fourfold pentilles, irregular pentagons with two different edge lengths, exhibit a first-order solid-to-liquid phase transition.

The most significant finding, however, emerged from hexagon systems, which perfectly followed the phase transition described by the KTHNY theory. In this scenario, the particles shift from solid to hexatic and from hexatic to fluid in a perfect continuous phase transition pattern.

“It was actually sort of surprising that no one else has found that until now,” Anderson said, “because it seems natural that the hexagon, with its six sides, and the honeycomb-like hexagonal arrangement would be a perfect match for this theory” in which the hexatic phase generally contains sixfold orientational order.

Glotzer’s team, which recently received a 2017 INCITE allocation, is now applying its leadership-class computing prowess to tackle phase transitions in 3-D. The team is focusing on how fluid particles crystallize into complex colloids—mixtures in which particles are suspended throughout another substance. Common examples of colloids include milk, paper, fog, and stained glass.

“We’re planning on using Titan to study how complexity can arise from these simple interactions, and to do that we’re actually going to look at how the crystals grow and study the kinetics of how that happens,” said Anderson.

There is a paper on arXiv,

Shape and symmetry determine two-dimensional melting transitions of hard regular polygons by Joshua A. Anderson, James Antonaglia, Jaime A. Millan, Michael Engel, Sharon C. Glotzer
(Submitted on 2 Jun 2016 (v1), last revised 23 Dec 2016 (this version, v2)) arXiv:1606.00687 [cond-mat.soft] (or arXiv:1606.00687v2)

This paper is open access and open to public peer review.

US report on Women, minorities, and people with disabilities in science and engineering

A Jan. 31, 2017 news item on ScienceDaily announces a new report from the US National Science Foundation’s (NSF) National Center for Science and Engineering Statistics (NCSES),

The National Center for Science and Engineering Statistics (NCSES) today [Jan. 31, 2017,] announced the release of the 2017 Women, Minorities, and Persons with Disabilities in Science and Engineering (WMPD) report, the federal government’s most comprehensive look at the participation of these three demographic groups in science and engineering education and employment.

The report shows the degree to which women, people with disabilities and minorities from three racial and ethnic groups — black, Hispanic and American Indian or Alaska Native — are underrepresented in science and engineering (S&E). Women have reached parity with men in educational attainment but not in S&E employment. Underrepresented minorities account for disproportionately smaller percentages in both S&E education and employment.

Congress mandated the biennial report in the Science and Engineering Equal Opportunities Act as part of the National Science Foundation’s (NSF) mission to encourage and strengthen the participation of underrepresented groups in S&E.

A Jan. 31, 2017 NSF news release (also on EurekAlert), which originated the news item, provides information about why the report is issued every two years and provides highlights from the 2017 report,

“An important part of fulfilling our mission to further the progress of science is producing current, accurate information about the U.S. STEM workforce,” said NSF Director France Córdova. “This report is a valuable resource to the science and engineering policy community.”

NSF maintains a portfolio of programs aimed at broadening participation in S&E, including ADVANCE: Increasing the Participation and Advancement of Women in Academic Science and Engineering Careers; LSAMP: the Louis Stokes Alliances for Minority Participation; and NSF INCLUDES, which focuses on building networks that can scale up proven approaches to broadening participation.

The digest provides highlights and analysis in five topic areas: enrollment, field of degree, occupation, employment status and early career doctorate holders. That last topic area includes analysis of pilot study data from the Early Career Doctorates Survey, a new NCSES product. NCSES also maintains expansive WMPD data tables, updated periodically as new data become available, which present the latest S&E education and workforce data available from NCSES and other agencies. The tables provide the public access to detailed, field-by-field information that includes both percentages and the actual numbers of people involved in S&E.

“WMPD is more than just a single report or presentation,” said NCSES Director John Gawalt. “It is a vast and unique information resource, carefully curated and maintained, that allows anyone (from the general public to highly trained researchers) ready access to data that facilitate and support their own exploration and analyses.”

Key findings from the new digest include:

  • The types of schools where students enroll vary among racial and ethnic groups. Hispanics, American Indians or Alaska Natives and Native Hawaiians or Other Pacific Islanders are more likely to enroll in community colleges. Blacks and Native Hawaiians or Other Pacific Islanders are more likely to enroll in private, for-profit schools.
  • Since the late 1990s, women have earned about half of S&E bachelor’s degrees. But their representation varies widely by field, ranging from 70 percent in psychology to 18 percent in computer sciences.
  • At every level — bachelor’s, master’s and doctorate — underrepresented minority women earn a higher proportion of degrees than their male counterparts. White women, in contrast, earn a smaller proportion of degrees than their male counterparts.
  • Despite two decades of progress, a wide gap in educational attainment remains between underrepresented minorities and whites and Asians, two groups that have higher representation in S&E education than they do in the U.S. population.
  • White men constitute about one-third of the overall U.S. population; they comprise half of the S&E workforce. Blacks, Hispanics and people with disabilities are underrepresented in the S&E workforce.
  • Women’s participation in the workforce varies greatly by field of occupation.
  • In 2015, scientists and engineers had a lower unemployment rate compared to the general U.S. population (3.3 percent versus 5.8 percent), although the rate varied among groups. For example, it was 2.8 percent among white women in S&E but 6.0 percent for underrepresented minority women.

For more information, including access to the digest and data tables, see the updated WMPD website.

Caption: In 2015, women and some minority groups were represented less in science and engineering (S&E) occupations than they were in the US general population. Credit: NSF

R.I.P. Mildred Dresselhaus, Queen of Carbon

I’ve been hearing about Mildred Dresselhaus, professor emerita (retired professor) at the Massachusetts Institute of Technology (MIT), just about as long as I’ve been researching and writing about nanotechnology (about 10 years, including the work for my master’s project and the almost eight years on this blog).

She died on Monday, Feb. 20, 2017 at the age of 86 having broken through barriers for those of her gender, barriers for her subject area, and barriers for her age.

Mark Anderson in his Feb. 22, 2017 obituary for the IEEE (Institute of Electrical and Electronics Engineers) Spectrum website provides a brief overview of her extraordinary life and accomplishments,

Called the “Queen of Carbon Science,” Dresselhaus pioneered the study of carbon nanostructures at a time when studying physical and material properties of commonplace atoms like carbon was out of favor. Her visionary perspectives on the sixth atom in the periodic table—including exploring individual layers of carbon atoms (precursors to graphene), developing carbon fibers stronger than steel, and revealing new carbon structures that were ultimately developed into buckyballs and nanotubes—invigorated the field.

“Millie Dresselhaus began life as the child of poor Polish immigrants in the Bronx; by the end, she was Institute Professor Emerita, the highest distinction awarded by the MIT faculty. A physicist, materials scientist, and electrical engineer, she was known as the ‘Queen of Carbon’ because her work paved the way for much of today’s carbon-based nanotechnology,” MIT president Rafael Reif said in a prepared statement.

Friends and colleagues describe Dresselhaus as a gifted instructor as well as a tireless and inspired researcher. And her boundless generosity toward colleagues, students, and girls and women pursuing careers in science is legendary.

In 1963, Dresselhaus began her own career studying carbon by publishing a paper on graphite in the IBM Journal for Research and Development, a foundational work in the history of nanotechnology. To this day, her studies of the electronic structure of this material serve as a reference point for explorations of the electronic structure of fullerenes and carbon nanotubes. Coauthor, with her husband Gene Dresselhaus, of a leading book on carbon fibers, she began studying the laser vaporization of carbon and the “carbon clusters” that resulted. Researchers who followed her lead discovered a 60-carbon structure that was soon identified as the icosahedral “soccer ball” molecular configuration known as buckminsterfullerene, or buckyball. In 1991, Dresselhaus further suggested that fullerene could be elongated as a tube, and she outlined these imagined objects’ symmetries. Not long after, researchers announced the discovery of carbon nanotubes.

When she began her nearly half-century career at MIT, as a visiting professor, women made up just 4 percent of the undergraduate student population. So Dresselhaus began working toward the improvement of living conditions for women students at the university. Through her leadership, MIT adopted an equal and joint admission process for women and men. (Previously, MIT had propounded the self-fulfilling prophecy of harboring more stringent requirements for women based on less dormitory space and perceived poorer performance.) And so promoting women in STEM—before it was ever called STEM—became one of her passions. Serving as president of the American Physical Society, she spearheaded and launched initiatives like the Committee on the Status of Women in Physics and the society’s more informal committees of visiting women physicists on campuses around the United States, which have increased the female faculty and student populations on the campuses they visit.

If you have the time, please read Anderson’s piece in its entirety.

One fact that has impressed me greatly is that Dresselhaus kept working into her eighties. I featured a paper she published in an April 27, 2012 posting at the age of 82 and she was described in the MIT write up at the time as a professor, not a professor emerita. I later featured Dresselhaus in a May 31, 2012 posting when she was awarded the Kavli Prize for Nanoscience.

It seems she worked almost to the end. Recently, GE (General Electric) posted a video “What If Scientists Were Celebrities?” starring Mildred Dresselhaus,

H/t Mark Anderson’s Feb. 22, 2017 obituary. The video was posted on Feb. 8, 2017.

Goodbye to the Queen of Carbon!

University of Alberta scientists use ultra fast (terahertz) microscopy to see ultra small (electron dynamics)

This is exciting news for Canadian science and the second time there has been a breakthrough development from the province of Alberta within the last five months (see Sept. 21, 2016 posting on quantum teleportation). From a Feb. 21, 2017 news item on ScienceDaily,

For the first time ever, scientists have captured images of terahertz electron dynamics of a semiconductor surface on the atomic scale. The successful experiment indicates a bright future for the new and quickly growing sub-field called terahertz scanning tunneling microscopy (THz-STM), pioneered by the University of Alberta in Canada. THz-STM allows researchers to image electron behaviour at extremely fast timescales and explore how that behaviour changes between different atoms.

A Feb. 21, 2017 University of Alberta news release on EurekAlert, which originated the news item, expands on the theme,

“We can essentially zoom in to observe very fast processes with atomic precision and over super fast time scales,” says Vedran Jelic, PhD student at the University of Alberta and lead author on the new study. “THz-STM provides us with a new window into the nanoworld, allowing us to explore ultrafast processes on the atomic scale. We’re talking a picosecond, or a millionth of a millionth of a second. It’s something that’s never been done before.”

Jelic and his collaborators used their scanning tunneling microscope (STM) to capture images of silicon atoms by raster scanning a very sharp tip across the surface and recording the tip height as it follows the atomic corrugations of the surface. While the original STM can measure and manipulate single atoms–for which its creators earned a Nobel Prize in 1986–it does so using wired electronics and is ultimately limited in speed and thus time resolution.

Modern lasers produce very short light pulses that can measure a whole range of ultra-fast processes, but typically over length scales limited by the wavelength of light at hundreds of nanometers. Much effort has been expended to overcome the challenges of combining ultra-fast lasers with ultra-small microscopy. The University of Alberta scientists addressed these challenges by working in a unique terahertz frequency range of the electromagnetic spectrum that allows wireless implementation. Normally the STM needs an applied voltage in order to operate, but Jelic and his collaborators are able to drive their microscope using pulses of light instead. These pulses occur over really fast timescales, which means the microscope is able to see really fast events.

By incorporating the THz-STM into an ultrahigh vacuum chamber, free from any external contamination or vibration, they are able to accurately position their tip and maintain a perfectly clean surface while imaging ultrafast dynamics of atoms on surfaces. Their next step is to collaborate with fellow material scientists and image a variety of new surfaces on the nanoscale that may one day revolutionize the speed and efficiency of current technology, ranging from solar cells to computer processing.

“Terahertz scanning tunneling microscopy is opening the door to an unexplored regime in physics,” concludes Jelic, who is studying in the Ultrafast Nanotools Lab with University of Alberta professor Frank Hegmann, a world expert in ultra-fast terahertz science and nanophysics.

Here are links to and citations for the team’s 2013 paper and their latest,

An ultrafast terahertz scanning tunnelling microscope by Tyler L. Cocker, Vedran Jelic, Manisha Gupta, Sean J. Molesky, Jacob A. J. Burgess, Glenda De Los Reyes, Lyubov V. Titova, Ying Y. Tsui, Mark R. Freeman, & Frank A. Hegmann. Nature Photonics 7, 620–625 (2013) doi:10.1038/nphoton.2013.151 Published online 07 July 2013

Ultrafast terahertz control of extreme tunnel currents through single atoms on a silicon surface by Vedran Jelic, Krzysztof Iwaszczuk, Peter H. Nguyen, Christopher Rathje, Graham J. Hornig, Haille M. Sharum, James R. Hoffman, Mark R. Freeman, & Frank A. Hegmann. Nature Physics (2017)  doi:10.1038/nphys4047 Published online 20 February 2017

Both papers are behind a paywall.

Quantum Shorts & Quantum Applications event at Vancouver’s (Canada) Science World

This is very short notice but if you do have some free time on Thursday, Feb. 23, 2017 from 6 – 8:30 pm, you can check out Science World’s Quantum: The Exhibition for free and watch a series of short films. Here’s more from the Quantum Shorts & Quantum Applications event page,

Join us for an evening of quantum art and science. Visit Quantum: The Exhibition and view a series of short films inspired by the science, history, and philosophy of quantum. Find some answers to your Quantum questions at this mind-expanding panel discussion.

Thursday, February 23: 

6pm                      Check out Quantum: The Exhibition
7pm                      Quantum Shorts Screening
7:45pm                 Panel Discussion/Presentation
8:30pm                 Q & A

Light refreshments will be available.

There are still spaces as of Wednesday, Feb. 22, 2017; you can register for the event here.

This will be one of the last chances you’ll have to see Quantum: The Exhibition, as the show’s last day here is scheduled for Feb. 26, 2017.

Nominations open for Kabiller Prizes in Nanoscience and Nanomedicine ($250,000 for visionary researcher and $10,000 for young investigator)

For a change I can publish something that doesn’t have a deadline in three days or less! Without more ado (from a Feb. 20, 2017 Northwestern University news release by Megan Fellman [h/t Nanowerk’s Feb. 20, 2017 news item]),

Northwestern University’s International Institute for Nanotechnology (IIN) is now accepting nominations for two prestigious international prizes: the $250,000 Kabiller Prize in Nanoscience and Nanomedicine and the $10,000 Kabiller Young Investigator Award in Nanoscience and Nanomedicine.

The deadline for nominations is May 15, 2017. Details are available on the IIN website.

“Our goal is to recognize the outstanding accomplishments in nanoscience and nanomedicine that have the potential to benefit all humankind,” said David G. Kabiller, a Northwestern trustee and alumnus. He is a co-founder of AQR Capital Management, a global investment management firm in Greenwich, Connecticut.

The two prizes, awarded every other year, were established in 2015 through a generous gift from Kabiller. Current Northwestern-affiliated researchers are not eligible for nomination until 2018 for the 2019 prizes.

The Kabiller Prize — the largest monetary award in the world for outstanding achievement in the field of nanomedicine — celebrates researchers who have made the most significant contributions to the field of nanotechnology and its application to medicine and biology.

The Kabiller Young Investigator Award recognizes young emerging researchers who have made recent groundbreaking discoveries with the potential to make a lasting impact in nanoscience and nanomedicine.

“The IIN at Northwestern University is a hub of excellence in the field of nanotechnology,” said Kabiller, chair of the IIN executive council and a graduate of Northwestern’s Weinberg College of Arts and Sciences and Kellogg School of Management. “As such, it is the ideal organization from which to launch these awards recognizing outstanding achievements that have the potential to substantially benefit society.”

Nanoparticles for medical use are typically no larger than 100 nanometers — comparable in size to the molecules in the body. At this scale, the essential properties of structures (color, melting point, conductivity, etc.) behave uniquely. Researchers are capitalizing on these unique properties in their quest to realize life-changing advances in the diagnosis, treatment and prevention of disease.

“Nanotechnology is one of the key areas of distinction at Northwestern,” said Chad A. Mirkin, IIN director and George B. Rathmann Professor of Chemistry in Weinberg. “We are very grateful for David’s ongoing support and are honored to be stewards of these prestigious awards.”

An international committee of experts in the field will select the winners of the 2017 Kabiller Prize and the 2017 Kabiller Young Investigator Award and announce them in September.

The recipients will be honored at an awards banquet Sept. 27 in Chicago. They also will be recognized at the 2017 IIN Symposium, which will include talks from prestigious speakers, including 2016 Nobel Laureate in Chemistry Ben Feringa, from the University of Groningen, the Netherlands.

2015 recipient of the Kabiller Prize

The winner of the inaugural Kabiller Prize, in 2015, was Joseph DeSimone, the Chancellor’s Eminent Professor of Chemistry at the University of North Carolina at Chapel Hill and the William R. Kenan Jr. Distinguished Professor of Chemical Engineering at North Carolina State University and of Chemistry at UNC-Chapel Hill.

DeSimone was honored for his invention of particle replication in non-wetting templates (PRINT) technology that enables the fabrication of precisely defined, shape-specific nanoparticles for advances in disease treatment and prevention. Nanoparticles made with PRINT technology are being used to develop new cancer treatments, inhalable therapeutics for treating pulmonary diseases, such as cystic fibrosis and asthma, and next-generation vaccines for malaria, pneumonia and dengue.

2015 recipient of the Kabiller Young Investigator Award

Warren Chan, professor at the Institute of Biomaterials and Biomedical Engineering at the University of Toronto, was the recipient of the inaugural Kabiller Young Investigator Award, also in 2015. Chan and his research group have developed an infectious disease diagnostic device for point-of-care use that can differentiate symptoms.

BTW, Warren Chan, winner of the ‘Young Investigator Award’, and/or his work have been featured here a few times, most recently in a Nov. 1, 2016 posting, which is mostly about another award he won but also includes links to some of his work, including my April 27, 2016 post about the discovery that fewer than 1% of nanoparticle-based drugs reach their destination.

Aliens wreak havoc on our personal electronics

The aliens in question are subatomic particles, and the havoc they wreak is low-grade, according to the scientist who was presenting on the topic at the AAAS (American Association for the Advancement of Science) 2017 Annual Meeting (Feb. 16 – 20, 2017) in Boston, Massachusetts. From a Feb. 17, 2017 news item on ScienceDaily,

You may not realize it but alien subatomic particles raining down from outer space are wreaking low-grade havoc on your smartphones, computers and other personal electronic devices.

When your computer crashes and you get the dreaded blue screen or your smartphone freezes and you have to go through the time-consuming process of a reset, most likely you blame the manufacturer: Microsoft or Apple or Samsung. In many instances, however, these operational failures may be caused by the impact of electrically charged particles generated by cosmic rays that originate outside the solar system.

“This is a really big problem, but it is mostly invisible to the public,” said Bharat Bhuva, professor of electrical engineering at Vanderbilt University, in a presentation on Friday, Feb. 17 at a session titled “Cloudy with a Chance of Solar Flares: Quantifying the Risk of Space Weather” at the annual meeting of the American Association for the Advancement of Science in Boston.

A Feb. 17, 2017 Vanderbilt University news release (also on EurekAlert), which originated the news item, expands on the theme,

When cosmic rays traveling at fractions of the speed of light strike the Earth’s atmosphere they create cascades of secondary particles including energetic neutrons, muons, pions and alpha particles. Millions of these particles strike your body each second. Despite their numbers, this subatomic torrent is imperceptible and has no known harmful effects on living organisms. However, a fraction of these particles carry enough energy to interfere with the operation of microelectronic circuitry. When they interact with integrated circuits, they may alter individual bits of data stored in memory. This is called a single-event upset or SEU.
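
A single-event upset can be pictured as one stored bit being inverted by a particle strike. Here is a minimal sketch of the idea (purely illustrative, not code from the research; flipping a bit is an XOR with a one-bit mask):

```python
# Illustrative sketch of a single-event upset (SEU): a cosmic-ray
# secondary particle deposits charge in a memory cell and inverts
# one stored bit. Inverting bit `bit` is an XOR with a one-bit mask.

def flip_bit(value: int, bit: int) -> int:
    """Return `value` with the bit at position `bit` inverted."""
    return value ^ (1 << bit)

stored = 0b0100_0001           # the byte for ASCII 'A' (65)
upset = flip_bit(stored, 1)    # a particle strike flips bit 1
print(chr(stored), stored)     # A 65
print(chr(upset), upset)       # C 67 -- same hardware, different data
```

The same XOR also undoes the flip, which is why a second upset in the same cell would silently restore the original value.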

Since it is difficult to know when and where these particles will strike and they do not do any physical damage, the malfunctions they cause are very difficult to characterize. As a result, determining the prevalence of SEUs is not easy or straightforward. “When you have a single bit flip, it could have any number of causes. It could be a software bug or a hardware flaw, for example. The only way you can determine that it is a single-event upset is by eliminating all the other possible causes,” Bhuva explained.

There have been a number of incidents that illustrate how serious the problem can be, Bhuva reported. For example, in 2003 in the town of Schaerbeek, Belgium, a bit flip in an electronic voting machine added 4,096 extra votes to one candidate. The error was only detected because it gave the candidate more votes than were possible, and it was traced to a single bit flip in the machine’s register. In 2008, the avionics system of a Qantas passenger jet flying from Singapore to Perth appeared to suffer from a single-event upset that caused the autopilot to disengage. As a result, the aircraft dove 690 feet in only 23 seconds, injuring about a third of the passengers seriously enough to cause the aircraft to divert to the nearest airstrip. In addition, there have been a number of unexplained glitches in airline computers – some of which experts feel must have been caused by SEUs – that have resulted in cancellations of hundreds of flights, resulting in significant economic losses.

An analysis of SEU failure rates for consumer electronic devices performed by Ritesh Mastipuram and Edwin Wee at Cypress Semiconductor on a previous generation of technology shows how prevalent the problem may be. Their results were published in 2004 in Electronic Design News and provided the following estimates:

  • A simple cell phone with 500 kilobytes of memory should only have one potential error every 28 years.
  • A router farm like those used by Internet providers, with only 25 gigabytes of memory, may experience one potential networking error that interrupts its operation every 17 hours.
  • A person flying in an airplane at 35,000 feet (where radiation levels are considerably higher than they are at sea level) who is working on a laptop with 500 kilobytes of memory may experience one potential error every five hours.

Bhuva is a member of Vanderbilt’s Radiation Effects Research Group, which was established in 1987 and is the largest academic program in the United States that studies the effects of radiation on electronic systems. The group’s primary focus was on military and space applications. Since 2001, the group has also been analyzing radiation effects on consumer electronics in the terrestrial environment. They have studied this phenomenon in the last eight generations of computer chip technology, including the current generation that uses 3D transistors (known as FinFET) that are only 16 nanometers in size. The 16-nanometer study was funded by a group of top microelectronics companies, including Altera, ARM, AMD, Broadcom, Cisco Systems, Marvell, MediaTek, Renesas, Qualcomm, Synopsys, and TSMC.

“The semiconductor manufacturers are very concerned about this problem because it is getting more serious as the size of the transistors in computer chips shrink and the power and capacity of our digital systems increase,” Bhuva said. “In addition, microelectronic circuits are everywhere and our society is becoming increasingly dependent on them.”

To determine the rate of SEUs in 16-nanometer chips, the Vanderbilt researchers took samples of the integrated circuits to the Irradiation of Chips and Electronics (ICE) House at Los Alamos National Laboratory. There they exposed them to a neutron beam and analyzed how many SEUs the chips experienced. Experts measure the failure rate of microelectronic circuits in a unit called a FIT, which stands for failure in time. One FIT is one failure per transistor in one billion hours of operation. That may seem infinitesimal, but it adds up extremely quickly with billions of transistors in many of our devices and billions of electronic systems in use today (the number of smartphones alone is in the billions). Most electronic components have failure rates measured in the hundreds and thousands of FITs.
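
The FIT arithmetic is simple enough to sketch. The per-transistor rate and transistor count below are illustrative assumptions of mine, not figures from the Vanderbilt study (which is proprietary):

```python
# Expected number of failures from a FIT rate.
# 1 FIT = 1 failure per 1e9 hours of operation (here quoted
# per transistor, as in the article).

HOURS_PER_FIT = 1e9

def expected_failures(fit_per_transistor: float,
                      transistors: float,
                      hours: float) -> float:
    """Expected number of upsets over `hours` of operation."""
    return fit_per_transistor * transistors * hours / HOURS_PER_FIT

# Hypothetical example: even a minuscule per-transistor rate adds up
# across a chip with two billion transistors running for one year.
per_transistor_fit = 1e-6          # assumed, for illustration only
hours_per_year = 24 * 365
chip_year = expected_failures(per_transistor_fit, 2e9, hours_per_year)
print(f"{chip_year:.3f} expected upsets per chip-year")
```

Multiply that by billions of devices in service and the aggregate number of upsets per year becomes substantial, which is the point Bhuva is making.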


Trends in single event upset failure rates at the individual transistor, integrated circuit and system or device level for the three most recent manufacturing technologies. (Bharat Bhuva, Radiation Effects Research Group, Vanderbilt University)

“Our study confirms that this is a serious and growing problem,” said Bhuva. “This did not come as a surprise. Through our research on radiation effects on electronic circuits developed for military and space applications, we have been anticipating such effects on electronic systems operating in the terrestrial environment.”

Although the details of the Vanderbilt studies are proprietary, Bhuva described the general trend that they have found in the last three generations of integrated circuit technology: 28-nanometer, 20-nanometer and 16-nanometer.

As transistor sizes have shrunk, they have required less and less electrical charge to represent a logical bit. So the likelihood that one bit will “flip” from 0 to 1 (or 1 to 0) when struck by an energetic particle has been increasing. This has been partially offset by the fact that as the transistors have gotten smaller they have become smaller targets so the rate at which they are struck has decreased.

More significantly, the current generation of 16-nanometer circuits has a 3D architecture that replaced the previous 2D architecture and has proven to be significantly less susceptible to SEUs. Although this improvement has been offset by the increase in the number of transistors in each chip, the failure rate at the chip level has also dropped slightly. However, the increase in the total number of transistors being used in new electronic systems has meant that the SEU failure rate at the device level has continued to rise.

Unfortunately, it is not practical to simply shield microelectronics from these energetic particles. For example, it would take more than 10 feet of concrete to keep a circuit from being zapped by energetic neutrons. However, there are ways to design computer chips to dramatically reduce their vulnerability.

For cases where reliability is absolutely critical, you can simply design the processors in triplicate and have them vote. Bhuva pointed out: “The probability that SEUs will occur in two of the circuits at the same time is vanishingly small. So if two circuits produce the same result it should be correct.” This is the approach that NASA used to maximize the reliability of spacecraft computer systems.
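
The triplicate-and-vote scheme Bhuva describes is known as triple modular redundancy (TMR). A minimal sketch of the voting step (the function name is mine, purely illustrative):

```python
# Triple modular redundancy (TMR) sketch: run the same computation on
# three independent circuits and accept the majority answer, so that
# a single-event upset in any one replica is outvoted by the other two.

def majority_vote(a, b, c):
    """Return the value that at least two of the three replicas agree on."""
    if a == b or a == c:
        return a
    if b == c:
        return b
    raise RuntimeError("all three replicas disagree -- uncorrectable")

# One replica suffers a bit flip; the other two outvote it.
print(majority_vote(42, 42, 43))  # -> 42
```

The scheme triples the hardware cost, which is why it is used where reliability is critical (spacecraft) rather than in consumer devices.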

The good news, Bhuva said, is that the aviation, medical equipment, IT, transportation, communications, financial and power industries are all aware of the problem and are taking steps to address it. “It is only the consumer electronics sector that has been lagging behind in addressing this problem.”

The engineer’s bottom line: “This is a major problem for industry and engineers, but it isn’t something that members of the general public need to worry much about.”

That’s fascinating and I hope the consumer electronics industry catches up with this ‘alien invasion’ issue. Finally, the ‘bit flips’ made me think of the 1956 movie ‘Invasion of the Body Snatchers’.

Effective sunscreens from nature

The dream is to find sunscreens that don’t endanger humans or pollute the environment and it seems that Spanish scientists may have taken a step closer to making that dream a reality (from a Jan. 30, 2017 Wiley Publications press release (also on EurekAlert),

The ideal sunscreen should block UVB and UVA radiation while being safe and stable. In the journal Angewandte Chemie, Spanish scientists have introduced a new family of UVA and UVB filters based on natural sunscreen substances found in algae and cyanobacteria. They are highly stable and enhance the effectivity [sic] of commercial sunscreens.

Good news for sunseekers. Commercial [sic] available sunscreen lotions can very effectively protect from dangerous radiation in the ultraviolet [spectrum], but they need to be applied regularly and in high amounts to develop their full potential. One of the most critical issues is the limited stability of the UV filter molecules. Inspired by nature, Diego Sampedro and his colleagues from La Rioja University in Logrono and collaborators from Malaga University and Alcala University, Madrid, Spain, have screened a natural class of UV-protecting [blocking?] molecules for their possible use in skin protection. They adjusted the nature-given motif [sic] to the requirements of chemical synthesis and found that the molecules could indeed boost the sun protection factor of common formulations.

The natural sunscreen molecules are called mycosporine-like amino acids (MAAs) and are widespread in the microbial world, most prominently in marine algae and cyanobacteria. MAAs are small molecules derived from amino acids, thermally stable, and they absorb light in the ultraviolet region, protecting the microbial DNA from radiation damage. Thus they are natural sunscreens, which inspired Sampedro and his colleagues to create [a] new class of organic sunscreen compounds.

Theoretical calculations revealed what is chemically needed for a successful design. “We performed a computer calculation of several basic scaffolds [..] to identify the simplest compound that fulfills the requisites for efficient sunscreens”, the authors write. The result of their search was a set of molecules which were readily synthesized, “avoiding the decorating substituents that come from the biosynthetic route.” Thus the small basic molecules can be tuned to give them more favorable properties.

The authors found that the synthesized compounds are characterized by excellent filter capacities in the relevant UV range. In addition they are photostable, much more than, for example, oxybenzene [sic] which is a widely used sunscreen in commercial formulations. They do not react chemically and dissipate radiation as heat (but not to such an extent that the skin temperature would rise as well). And, most importantly, when tested in real formulations, the sun protection factor (SPF) rose by a factor of more than two. Thus they could be promising targets for more stable, more efficient sunscreen lotions. Good news for your next summer vacation.

There’s some unusual phrasing, so I’m guessing that the writer is not accustomed to writing press releases in English. One other comment: it’s oxybenzone that’s often used as an ingredient in commercial sunscreens.

Here’s a link to and a citation for the paper,

Rational Design and Synthesis of Efficient Sunscreens To Boost the Solar Protection Factor by Raúl Losantos, Ignacio Funes-Ardoiz, Dr. José Aguilera, Prof. Enrique Herrera-Ceballos, Dr. Cristina García-Iriepa, Prof. Pedro J. Campos, and Diego Sampedro. Angewandte Chemie International Edition Volume 56, Issue 10, pages 2632–2635, March 1, 2017 DOI: 10.1002/anie.201611627 Version of Record online: 27 JAN 2017

© 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim

This paper is behind a paywall.

I have previously featured work on another natural sunscreen. In that case it was to be derived from English ivy (July 22, 2010 posting); there was an update on the English ivy work in a May 30, 2016 posting but the researcher has moved in a different direction looking at wound healing and armour as possible applications for the research.

Curiosity may not kill the cat but, in science, it might be an antidote to partisanship

I haven’t stumbled across anything from the Cultural Cognition Project at Yale Law School in years so before moving onto their latest news, here’s more about the project,

The Cultural Cognition Project is a group of scholars interested in studying how cultural values shape public risk perceptions and related policy beliefs. Cultural cognition refers to the tendency of individuals to conform their beliefs about disputed matters of fact (e.g., whether global warming is a serious threat; whether the death penalty deters murder; whether gun control makes society more safe or less) to values that define their cultural identities. Project members are using the methods of various disciplines — including social psychology, anthropology, communications, and political science — to chart the impact of this phenomenon and to identify the mechanisms through which it operates. The Project also has an explicit normative objective: to identify processes of democratic decisionmaking by which society can resolve culturally grounded differences in belief in a manner that is both congenial to persons of diverse cultural outlooks and consistent with sound public policymaking.

It’s nice to catch up with some of the project’s latest work, from a Jan. 26, 2017 Yale University news release (also on EurekAlert),

Disputes over science-related policy issues such as climate change or fracking often seem as intractable as other politically charged debates. But in science, at least, simple curiosity might help bridge that partisan divide, according to new research.

In a study slated for publication in the journal Advances in Political Psychology, a Yale-led research team found that people who are curious about science are less polarized in their views on contentious issues than less-curious peers.

In an experiment, they found out why: Science-curious individuals are more willing to engage with surprising information that runs counter to their political predispositions.

“It’s a well-established finding that most people prefer to read or otherwise be exposed to information that fits rather than challenges their political preconceptions,” said research team leader Dan Kahan, Elizabeth K. Dollard Professor of Law and professor of psychology at Yale Law School. “This is called the echo-chamber effect.”

But science-curious individuals are more likely to venture out of that chamber, he said.

“When they are offered the choice to read news articles that support their views or challenge them on the basis of new evidence, science-curious individuals opt for the challenging information,” Kahan said. “For them, surprising pieces of evidence are bright shiny objects — they can’t help but grab at them.”

Kahan and other social scientists previously have shown that information based on scientific evidence can actually intensify — rather than moderate — political polarization on contentious topics such as gun control, climate change, fracking, or the safety of certain vaccines. The new study, which assessed science knowledge among subjects, reiterates the gaping divide separating how conservatives and liberals view science.

Republicans and Democrats with limited knowledge of science were equally likely to agree or disagree with the statement that “there is solid evidence that global warming is caused by human activity.” However, the most science-literate conservatives were much more likely to disagree with the statement than less-knowledgeable peers. The most knowledgeable liberals almost universally agreed with the statement.

“Whatever measure of critical reasoning we used, we always observed this depressing pattern: The members of the public most able to make sense of scientific evidence are in fact the most polarized,” Kahan said.

But knowledge of science, and curiosity about science, are not the same thing, the study shows.

The team became interested in curiosity because of its ongoing collaborative research project to improve public engagement with science documentaries involving the Cultural Cognition Project at Yale Law School, the Annenberg Public Policy Center of the University of Pennsylvania, and Tangled Bank Studios at the Howard Hughes Medical Institute.

They noticed that the curious — those who sought out science stories for personal pleasure — not only were more interested in viewing science films on a variety of topics but also did not display political polarization associated with contentious science issues.

The new study found, for instance, that a much higher percentage of curious liberals and conservatives chose to read stories that ran counter to their political beliefs than did their non-curious peers.

“As their science curiosity goes up, the polarizing effects of higher science comprehension dissipate, and people move the same direction on contentious policies like climate change and fracking,” Kahan said.

It is unclear whether curiosity applied to other controversial issues can minimize the partisan rancor that infects other areas of society. But Kahan believes that the curious from both sides of the political and cultural divide should make good ambassadors to the more doctrinaire members of their own groups.

“Politically curious people are a resource who can promote enlightened self-government by sharing scientific information they are naturally inclined to learn and share,” he said.

Here’s my standard link to and citation for the paper,

Science Curiosity and Political Information Processing by Dan M. Kahan, Asheley R. Landrum, Katie Carpenter, Laura Helft, and Kathleen Hall Jamieson. Political Psychology Volume 38, Issue Supplement S1, February 2017, Pages 179–199. DOI: 10.1111/pops.12396 First published: 26 January 2017

This paper is open access and it can also be accessed here.

I last mentioned Kahan and The Cultural Cognition Project in an April 10, 2014 posting (scroll down about 45% of the way) about responsible science.