Monthly Archives: February 2017

Big data in the Cascadia region: a University of British Columbia (Canada) and University of Washington (US state) collaboration

Before moving on to the news and for anyone unfamiliar with the concept of the Cascadia region, it is an informally proposed political region or a bioregion, depending on your perspective. Adding to the lack of clarity, the region generally includes the province of British Columbia in Canada and the two US states of Washington and Oregon, but Alaska (another US state) and the Yukon (a Canadian territory) may also be included, as well as parts of California, Wyoming, Idaho, and Montana. (You can read more about the Cascadia bioregion here and the proposed political region here.) While it sounds as if more of the US is part of the ‘Cascadia region’, British Columbia and the Yukon cover considerably more territory than all of the mentioned states combined, if you’re taking a landmass perspective.

Cascadia Urban Analytics Cooperative

There was some big news about the smallest version of the Cascadia region on Thursday, Feb. 23, 2017 when the University of British Columbia (UBC), the University of Washington (UW), and Microsoft announced the launch of the Cascadia Urban Analytics Cooperative. From the joint Feb. 23, 2017 news release (read on the UBC website or read on the UW website),

In an expansion of regional cooperation, the University of British Columbia and the University of Washington today announced the establishment of the Cascadia Urban Analytics Cooperative to use data to help cities and communities address challenges from traffic to homelessness. The largest industry-funded research partnership between UBC and the UW, the collaborative will bring faculty, students and community stakeholders together to solve problems, and is made possible thanks to a $1-million gift from Microsoft.

“Thanks to this generous gift from Microsoft, our two universities are poised to help transform the Cascadia region into a technological hub comparable to Silicon Valley and Boston,” said Professor Santa J. Ono, President of the University of British Columbia. “This new partnership transcends borders and strives to unleash our collective brain power, to bring about economic growth that enriches the lives of Canadians and Americans as well as urban communities throughout the world.”

“We have an unprecedented opportunity to use data to help our communities make decisions, and as a result improve people’s lives and well-being. That commitment to the public good is at the core of the mission of our two universities, and we’re grateful to Microsoft for making a community-minded contribution that will spark a range of collaborations,” said UW President Ana Mari Cauce.

Today’s announcement follows last September’s [2016] Emerging Cascadia Innovation Corridor Conference in Vancouver, B.C. The forum brought together regional leaders for the first time to identify concrete opportunities for partnerships in education, transportation, university research, human capital and other areas.

A Boston Consulting Group study unveiled at the conference showed the region between Seattle and Vancouver has “high potential to cultivate an innovation corridor” that competes on an international scale, but only if regional leaders work together. The study says that could be possible through sustained collaboration aided by an educated and skilled workforce, a vibrant network of research universities and a dynamic policy environment.

Microsoft President Brad Smith, who helped convene the conference, said, “We believe that joint research based on data science can help unlock new solutions for some of the most pressing issues in both Vancouver and Seattle. But our goal is bigger than this one-time gift. We hope this investment will serve as a catalyst for broader and more sustainable efforts between these two institutions.”

As part of the Emerging Cascadia conference, British Columbia Premier Christy Clark and Washington Governor Jay Inslee signed a formal agreement that committed the two governments to work closely together to “enhance meaningful and results-driven innovation and collaboration.”  The agreement outlined steps the two governments will take to collaborate in several key areas including research and education.

“Increasingly, tech is not just another standalone sector of the economy, but fully integrated into everything from transportation to social work,” said Premier Clark. “That’s why we’ve invested in B.C.’s thriving tech sector, but committed to working with our neighbours in Washington – and we’re already seeing the results.”

“This data-driven collaboration among some of our smartest and most creative thought-leaders will help us tackle a host of urgent issues,” Gov. Inslee said. “I’m encouraged to see our partnership with British Columbia spurring such interesting cross-border dialogue and excited to see what our students and researchers come up with.”

The Cascadia Urban Analytics Cooperative will revolve around four main programs:

  • The Cascadia Data Science for Social Good (DSSG) Summer Program, which builds on the success of the DSSG program at the UW eScience Institute. The cooperative will coordinate a joint summer program for students across UW and UBC campuses where they work with faculty to create and incubate data-intensive research projects that have concrete benefits for urban communities. One past DSSG project analyzed data from Seattle’s regional transportation system – ORCA – to improve its effectiveness, particularly for low-income transit riders. Another project sought to improve food safety by text mining product reviews to identify unsafe products.
  • Cascadia Data Science for Social Good Scholar Symposium, which will foster innovation and collaboration by bringing together scholars from UBC and the UW involved in projects utilizing technology to advance the social good. The first symposium will be hosted at UW in 2017.
  • Sustained Research Partnerships designed to establish the Pacific Northwest as a center of expertise and activity in urban analytics. The cooperative will support sustained research partnerships between UW and UBC researchers, providing technical expertise, stakeholder engagement and seed funding.
  • Responsible Data Management Systems and Services to ensure data integrity, security and usability. The cooperative will develop new software, systems and services to facilitate data management and analysis, as well as ensure projects adhere to best practices in fairness, accountability and transparency.
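The food-safety project mentioned above gives a flavour of what data-intensive work for urban communities can look like. As a purely hypothetical sketch (the announcement does not describe the project's actual methods, and the keywords and reviews below are invented for illustration), text mining product reviews for safety signals can start as simply as flagging reviews that contain worrying terms:

```python
# Hypothetical sketch of keyword-based review screening, loosely inspired
# by the DSSG food-safety project described above. The keyword list and
# sample reviews are invented; the real project's methods are not
# detailed in the announcement.

# Terms that may signal an unsafe product (illustrative only).
SAFETY_TERMS = {"recall", "sick", "nausea", "mold", "expired", "hospital"}

def flag_reviews(reviews):
    """Return the reviews whose text contains any safety-related term."""
    flagged = []
    for review in reviews:
        # Normalize each word: strip punctuation, lowercase.
        words = {w.strip(".,!?").lower() for w in review.split()}
        if words & SAFETY_TERMS:  # set intersection: any term present?
            flagged.append(review)
    return flagged

reviews = [
    "Great taste, arrived quickly.",
    "My whole family got sick after eating this.",
    "The jar was covered in mold when it arrived!",
]
print(flag_reviews(reviews))  # flags the second and third reviews
```

A real system would go well beyond keyword matching (statistical text classification, for instance), but the sketch shows the basic shape of the task: turning free-text consumer data into a shortlist a regulator could act on.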

At UW, the Cascadia Urban Analytics Collaborative will be overseen by Urbanalytics (urbanalytics.uw.edu), a new research unit in the Information School focused on responsible urban data science. The Collaborative builds on previous investments in data-intensive science through the UW eScience Institute (escience.washington.edu) and investments in urban scholarship through Urban@UW (urban.uw.edu), and also aligns with the UW’s Population Health Initiative (uw.edu/populationhealth) that is addressing the most persistent and emerging challenges in human health, environmental resiliency and social and economic equity. The gift counts toward the UW’s Be Boundless – For Washington, For the World campaign (uw.edu/boundless).

The Collaborative also aligns with the UBC Sustainability Initiative (sustain.ubc.ca) that fosters partnerships beyond traditional boundaries of disciplines, sectors and geographies to address critical issues of our time, as well as the UBC Data Science Institute (dsi.ubc.ca), which aims to advance data science research to address complex problems across domains, including health, science and arts.

Brad Smith, President and Chief Legal Officer of Microsoft, wrote about the joint centre in a Feb. 23, 2017 posting on the Microsoft on the Issues blog (Note: Links have been removed),

The cities of Vancouver and Seattle share many strengths: a long history of innovation, world-class universities and a region rich in cultural and ethnic diversity. While both cities have achieved great success on their own, leaders from both sides of the border realize that tighter partnership and collaboration, through the creation of a Cascadia Innovation Corridor, will expand economic opportunity and prosperity well beyond what each community can achieve separately.

Microsoft supports this vision and today is making a $1 million investment in the Cascadia Urban Analytics Cooperative (CUAC), which is a new joint effort by the University of British Columbia (UBC) and the University of Washington (UW).  It will use data to help local cities and communities address challenges from traffic to homelessness and will be the region’s single largest university-based, industry-funded joint research project. While we recognize the crucial role that universities play in building great companies in the Pacific Northwest, whether it be in computing, life sciences, aerospace or interactive entertainment, we also know research, particularly data science, holds the key to solving some of Vancouver and Seattle’s most pressing issues. This grant will advance this work.

An Oct. 21, 2016 article by Hana Golightly for the Ubyssey newspaper provides a little more detail about the province/state agreement mentioned in the joint UBC/UW news release,

An agreement between BC Premier Christy Clark and Washington Governor Jay Inslee means UBC will be collaborating with the University of Washington (UW) more in the future.

At last month’s [Sept. 2016] Cascadia Conference, Clark and Inslee signed a Memorandum of Understanding with the goal of fostering the growth of the technology sector in both regions. Officially referred to as the Cascadia Innovation Corridor, this partnership aims to reduce boundaries across the region — economic and otherwise.

While the memorandum provides broad goals and is not legally binding, it sets a precedent of collaboration between businesses, governments and universities, encouraging projects that span both jurisdictions. Aiming to capitalize on the cultural commonalities of regional centres Seattle and Vancouver, the agreement prioritizes development in life sciences, clean technology, data analytics and high tech.

Metropolitan centres like Seattle and Vancouver have experienced a surge in growth that sees planners envisioning them as the next Silicon Valleys. Premier Clark and Governor Inslee want to strengthen the ability of their jurisdictions to compete in innovation on a global scale. Accordingly, the memorandum encourages the exploration of “opportunities to advance research programs in key areas of innovation and future technologies among the region’s major universities and institutes.”

A few more questions about the Cooperative

I had a few more questions about the Feb. 23, 2017 announcement, for which (from UBC) Gail C. Murphy, PhD, FRSC, Associate Vice President Research pro tem and Professor of Computer Science, and (from UW) Bill Howe, Associate Professor in the Information School, Adjunct Associate Professor of Computer Science & Engineering, Associate Director and Senior Data Science Fellow at the UW eScience Institute, and Program Director and Faculty Chair of the UW Data Science Masters Degree, have kindly provided answers (Gail Murphy’s replies are prefaced with [GM] and one indent, and Bill Howe’s replies are prefaced with [BH] and two indents),

  • Do you have any projects currently underway? e.g. I see a summer programme is planned. Will there be one in summer 2017? What focus will it have?

[GM] UW and UBC will each be running the Data Science for Social Good program in the summer of 2017. UBC’s announcement of the program is available at: http://dsi.ubc.ca/data-science-social-good-dssg-fellowships

  • Is the $1M from Microsoft going to be given in cash or as ‘in kind goods’ or some combination?

[GM] The $1-million donation is in cash. Microsoft organized the Emerging Cascadia Innovation Corridor Conference in September 2016. It was at the conference that the idea for the partnership was hatched. Through this initiative, UBC and UW will continue to engage with Microsoft to further shared goals in promoting evidence-based innovation to improve life for people in the Cascadia region and beyond.

  • How will the money or goods be disbursed? e.g. Will each institution get 1/2 or is there some sort of joint account?

[GM] The institutions are sharing the funds but will be separately administering the funds they receive.

  • Is data going to be crossing borders? e.g. You mentioned some health care projects. In that case, will data from BC residents be accessed and subject to US rules and regulations? Will BC residents know that their data is being accessed by a 3rd party? What level of consent is required?

[GM] As you point out, there are many issues involved with transferring data across the border. Any projects involving private data will adhere to local laws and ethical frameworks set out by the institutions.

  • Privacy rules vary greatly between the US and Canada. How is that being addressed in this proposed new research?

[No Reply]

  • Will new software and other products be created and who will own them?

[GM] It is too soon for us to comment on whether new software or other products will be created. Any creation of software or other products within the institutions will be governed by institutional policy.

  • Will the research be made freely available?

[GM] UBC researchers must be able to publish the results of research as set out by UBC policy.

[BH] Research output at UW will be made available according to UW policy, but I’ll point out that Microsoft has long been a fantastic partner in advancing our efforts in open and reproducible science, open source software, and open access publishing. 

 UW’s discussion on open access policies is available online.

 

  • What percentage of public funds will be used to enable this project? Will the province of BC and the state of Washington be splitting the costs evenly?

[GM] It is too soon for us to report on specific percentages. At UBC, we will be looking to partner with appropriate funding agencies to support more research with this donation. Applications to funding agencies will involve review of any proposals as per the rules of the funding agency.

  • Will there be any social science and/or ethics component to this collaboration? The press conference referenced data science only.

[GM] We expect, but cannot yet confirm, that some of the projects will involve collaborations with faculty from a broad range of research areas at UBC.

[BH] We are indeed placing a strong emphasis on the intersection between data science, the social sciences, and data ethics.  As examples of activities in this space around UW:

* The Information School at UW (my home school) is actively recruiting a new faculty candidate in data ethics this year

* The Education Working Group at the eScience Institute has created a new campus-wide Data & Society seminar course.

* The Center for Statistics in the Social Sciences (CSSS), which represents the marriage of data science and the social sciences, has been a long-term partner in our activities.

More specifically for this collaboration, we are collecting requirements for new software that emphasizes responsible data science: properly managing sensitive data, combating algorithmic bias, protecting privacy, and more.
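One concrete technique for "combating algorithmic bias" is auditing a system's decisions for demographic parity, i.e. comparing the rate of positive decisions across groups. As a minimal sketch (the data, group labels, and any resemblance to the cooperative's planned software are invented for illustration):

```python
# A minimal sketch of a demographic-parity audit, one common check when
# looking for algorithmic bias. The decisions and groups below are
# invented; the cooperative's actual software is not yet specified.

def positive_rate(decisions, groups, group):
    """Fraction of members of `group` who received a positive decision."""
    members = [d for d, g in zip(decisions, groups) if g == group]
    return sum(members) / len(members)

def parity_gap(decisions, groups):
    """Largest difference in positive-decision rates between any two groups."""
    rates = [positive_rate(decisions, groups, g) for g in set(groups)]
    return max(rates) - min(rates)

# 1 = approved, 0 = denied; groups "A" and "B" are hypothetical.
decisions = [1, 1, 0, 1, 0, 0, 1, 0]
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(parity_gap(decisions, groups))  # 0.75 for A vs 0.25 for B -> 0.5
```

A large gap doesn't by itself prove unfairness, but flagging it forces the question of whether the disparity is justified, which is exactly the kind of accountability the fourth program above calls for.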

Microsoft has been a key partner in this work through their Civic Technology group, for which the Seattle arm is led by Graham Thompson.

  • What impact do you see the new US federal government’s current concerns over borders and immigrants hav[ing] on this project? e.g. Are people whose origins are in Iran, Syria, Yemen, etc. and who are residents of Canada going to be able to participate?

[GM] Students and others eligible to participate in research projects in Canada will be welcomed into the UBC projects. Our hope is that faculty and students working on the Cascadia Urban Analytics Cooperative will be able to exchange ideas freely and move freely back and forth across the border.

  • How will seed funding for ‘Sustained Research Partnerships’ be disbursed? Will there be a joint committee making these decisions?

[GM] We are in the process of elaborating this part of the program. At UBC, we are already experiencing, enjoying and benefitting from increased interaction with the University of Washington and look forward to elaborating more aspects of the program together as the year unfolds.

I had to make a few formatting changes when transferring the answers from emails to this posting: my numbered questions (1-11) became bulleted points and ‘have’ in what was question 10 was changed to ‘having’. The content for the answers has been untouched.

I’m surprised no one answered the privacy question but perhaps they thought the other answers sufficed. Despite an answer to my question, I don’t understand how the universities are sharing the funds but that may just mean I’m having a bad day. (Or perhaps the folks at UBC are being overly careful after the scandals that rocked the Vancouver campus over the last 18 months to two years; see Sophie Sutcliffe’s Dec. 3, 2015 opinion piece for the Ubyssey for details about the scandals.)

Bill Howe’s response about open access (where you can read the journal articles for free) and open source (where you have free access to the software code) was interesting to me as I once worked for a company where the developers complained loud and long about Microsoft’s failure to embrace open source code. Howe’s response is particularly interesting given that Microsoft’s president is also the Chief Legal Officer whose portfolio of responsibilities (I imagine) includes patents.

Matt Day in a Feb. 23, 2017 article for the Seattle Times provides additional perspective (Note: Links have been removed),

Microsoft’s effort to nudge Seattle and Vancouver, B.C., a bit closer together got an endorsement Thursday [Feb. 23, 2017] from the leading university in each city.

The University of Washington and the University of British Columbia announced the establishment of a joint data-science research unit, called the Cascadia Urban Analytics Cooperative, funded by a $1 million grant from Microsoft.

The collaboration will support study of shared urban issues, from health to transit to homelessness, drawing on faculty and student input from both universities.

The partnership has its roots in a September [2016] conference in Vancouver organized by Microsoft’s public affairs and lobbying unit [emphasis mine.] That gathering was aimed at tying business, government and educational institutions in Microsoft’s home region in the Seattle area closer to its Canadian neighbor.

Microsoft last year [2016]* opened an expanded office in downtown Vancouver with space for 750 employees, an outpost partly designed to draw to the Northwest more engineers than the company can get through the U.S. guest worker system [emphasis mine].

There’s nothing wrong with a business offering to contribute to the social good but it does well to remember that a business’s primary agenda is not the social good. So in this case, it seems that public affairs and lobbying is really governmental affairs, and that Microsoft has anticipated, for some time, greater difficulties with getting workers from all sorts of countries across the US border to work in Washington state, making an outpost in British Columbia and closer relations between the constituencies quite advantageous. I wonder what else is on their agenda.

Getting back to UBC and UW, thank you to both Gail Murphy (in particular) and Bill Howe for taking the time to answer my questions. I very much appreciate it as answering 10 questions is a lot of work.

There was one area of interest (cities) that I did not broach with either academic but will mention here.

Cities and their increasing political heft

Clearly Microsoft is focused on urban issues and that would seem to be the ‘flavour du jour’. There’s a May 31, 2016 piece on the TED website by Robert Muggah and Benjamin Barber titled ‘Why cities rule the world‘ (there are video talks embedded in the piece),

Cities are the 21st century’s dominant form of civilization — and they’re where humanity’s struggle for survival will take place. Robert Muggah and Benjamin Barber spell out the possibilities.

Half the planet’s population lives in cities. They are the world’s engines, generating four-fifths of the global GDP. There are over 2,100 cities with populations of 250,000 people or more, including a growing number of mega-cities and sprawling, networked-city areas — conurbations, they’re called — with at least 10 million residents. As the economist Ed Glaeser puts it, “we are an urban species.”

But what makes cities so incredibly important is not just population or economics stats. Cities are humanity’s most realistic hope for future democracy to thrive, from the grassroots to the global. This makes them a stark contrast to so many of today’s nations, increasingly paralyzed by polarization, corruption and scandal.

In a less hyperbolic vein, Parag Khanna’s April 20, 2016 piece for Quartz describes why he (and others) believe that megacities are where the future lies (Note: A link has been removed),

Cities are mankind’s most enduring and stable mode of social organization, outlasting all empires and nations over which they have presided. Today cities have become the world’s dominant demographic and economic clusters.

As the sociologist Christopher Chase-Dunn has pointed out, it is not population or territorial size that drives world-city status, but economic weight, proximity to zones of growth, political stability, and attractiveness for foreign capital. In other words, connectivity matters more than size. Cities thus deserve more nuanced treatment on our maps than simply as homogeneous black dots.

Within many emerging markets such as Brazil, Turkey, Russia, and Indonesia, the leading commercial hub or financial center accounts for at least one-third or more of national GDP. In the UK, London accounts for almost half Britain’s GDP. And in America, the Boston-New York-Washington corridor and greater Los Angeles together combine for about one-third of America’s GDP.

By 2025, there will be at least 40 such megacities. The population of the greater Mexico City region is larger than that of Australia, as is that of Chongqing, a collection of connected urban enclaves in China spanning an area the size of Austria. Cities that were once hundreds of kilometers apart have now effectively fused into massive urban archipelagos, the largest of which is Japan’s Taiheiyo Belt that encompasses two-thirds of Japan’s population in the Tokyo-Nagoya-Osaka megalopolis.

Great and connected cities, Saskia Sassen argues, belong as much to global networks as to the country of their political geography. Today the world’s top 20 richest cities have forged a super-circuit driven by capital, talent, and services: they are home to more than 75% of the largest companies, which in turn invest in expanding across those cities and adding more to expand the intercity network. Indeed, global cities have forged a league of their own, in many ways as denationalized as Formula One racing teams, drawing talent from around the world and amassing capital to spend on themselves while they compete on the same circuit.

The rise of emerging market megacities as magnets for regional wealth and talent has been the most significant contributor to shifting the world’s focal point of economic activity. McKinsey Global Institute research suggests that from now until 2025, one-third of world growth will come from the key Western capitals and emerging market megacities, one-third from the heavily populous middle-weight cities of emerging markets, and one-third from small cities and rural areas in developing countries.

Khanna’s megacities all exist within one country. If Vancouver and Seattle (and perhaps Portland?) were to become a megacity, it would be one of the few to cross national borders.

Khanna has been mentioned here before in a Jan. 27, 2016 posting about cities and technology and a public engagement exercise with the National Research Council of Canada (scroll down to the subsection titled: Cities rising in importance as political entities).

Muggah/Barber’s and Khanna’s 2016 pieces are well worth reading if you have the time.

For what it’s worth, I’m inclined to agree that cities are, and will continue to be, increasing in political importance, along with this area of development:

Algorithms and big data

Concerns are being raised about how big data is being utilized, so I was happy to see specific initiatives to address ethics issues in Howe’s response. For anyone not familiar with the concerns, here’s an excerpt from Cathy O’Neil’s Oct. 18, 2016 article for Wired magazine,

The age of Big Data has generated new tools and ideas on an enormous scale, with applications spreading from marketing to Wall Street, human resources, college admissions, and insurance. At the same time, Big Data has opened opportunities for a whole new class of professional gamers and manipulators, who take advantage of people using the power of statistics.

I should know. I was one of them.

Information is power, and in the age of corporate surveillance, profiles on every active American consumer means that the system is slanted in favor of those with the data. This data helps build tailor-made profiles that can be used for or against someone in a given situation. Insurance companies, which historically sold car insurance based on driving records, have more recently started using such data-driven profiling methods. A Florida insurance company has been found to charge people with low credit scores and good driving records more than people with high credit scores and a drunk driving conviction. It’s become standard practice for insurance companies to charge people not what they represent as a risk, but what they can get away with. The victims, of course, are those least likely to be able to afford the extra cost, but who need a car to get to work.

Big data profiling techniques are exploding in the world of politics. It’s estimated that over $1 billion will be spent on digital political ads in this election cycle, almost 50 times as much as was spent in 2008; this field is a growing part of the budget for presidential as well as down-ticket races. Political campaigns build scoring systems on potential voters—your likelihood of voting for a given party, your stance on a given issue, and the extent to which you are persuadable on that issue. It’s the ultimate example of asymmetric information, and the politicians can use what they know to manipulate your vote or your donation.
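To make the "scoring systems" O'Neil describes a little more concrete, here is a toy sketch of how such a model might turn a few voter attributes into a persuadability score. Everything here is invented for illustration (the feature names, weights, and logistic form); real campaign models are proprietary and far more complex.

```python
import math

# Toy illustration of a voter "scoring system" of the kind O'Neil
# describes: a logistic model mapping a few features to a 0-1
# persuadability score. All feature names and weights are invented;
# this is not any real campaign's model.

WEIGHTS = {"past_turnout": -1.2, "undecided": 2.0, "issue_engagement": 0.8}
BIAS = -0.5

def persuadability(features):
    """Map a feature dict to a 0-1 score via a logistic function."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

likely_loyal = {"past_turnout": 1.0, "undecided": 0.0, "issue_engagement": 0.2}
swing_voter  = {"past_turnout": 0.3, "undecided": 1.0, "issue_engagement": 0.9}
print(persuadability(likely_loyal) < persuadability(swing_voter))  # True
```

The asymmetry O'Neil worries about is visible even in this toy: the campaign sees and acts on the scores, while the voters being scored never do.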

I highly recommend reading O’Neil’s article and, if you have the time, her book ‘Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy’.

Finally

I look forward to hearing more about the Cascadia Urban Analytics Cooperative and the Cascadia Innovation Corridor as they develop. This has the potential to be very exciting although I do have some concerns, such as Microsoft and its agendas, both stated and unstated. After all, the Sept. 2016 meeting was convened by Microsoft and its public affairs/lobbying group, and the topic was innovation, which is code for business; as hinted earlier, business is not synonymous with social good. Having said that, I’m not about to demonize business either. I just think a healthy dose of skepticism is called for. Good things can happen but we need to ensure they do.

Thankfully, my concerns regarding algorithms and big data seem to be shared in some quarters; unfortunately, none of these quarters appear to be located at the University of British Columbia. I hope that’s over-caution with regard to communication rather than a failure to recognize any pitfalls.

ETA Mar. 1, 2017: Interestingly, the UK House of Commons Select Committee on Science and Technology announced an inquiry into the use of algorithms in public and business decision-making on Feb. 28, 2017. As this posting is much too big already, I’ve posted about the UK inquiry separately in a Mar. 1, 2017 posting.

*’2016′ added for clarity on March 24, 2017.

New gecko species in Madagascar with skin specially adapted to tearing

After all these years of writing about geckos and their adhesive properties (they can hang off a wall or ceiling by a single toe), I’ve developed a mild interest in them. From a Feb. 7, 2017 posting by Dirk Steinke for the One species a day blog,

The new species was found in northern Madagascar and its name was build [sic] from the two Greek stems mégas, meaning ‘very large’ and lepís, meaning ‘scale’, and refers to the large size of the scales of this species in comparison to other geckos.

Caption: The new fish-scale gecko, Geckolepis megalepis, has the largest body scales of all geckos. This nocturnal lizard was discovered in the ‘tsingy’ karst formations in northern Madagascar. Credit: F. Glaw

There’s more about the new species and the research in a Feb. 7, 2017 PeerJ news release on EurekAlert,

Many lizards can drop their tails when grabbed, but one group of geckos has gone to particularly extreme lengths to escape predation. Fish-scale geckos in the genus Geckolepis have large scales that tear away with ease, leaving them free to escape whilst the predator is left with a mouth full of scales. Scientists have now described a new species (Geckolepis megalepis) that is the master of this art, possessing the largest scales of any gecko.

The skin of fish-scale geckos is specially adapted to tearing. The large scales are attached only by a relatively narrow region that tears with ease, and beneath them they have a pre-formed splitting zone within the skin itself. Together, these features make them especially good at escaping from predators. Although several other geckos are able to lose their skin like this if they are grasped really firmly, Geckolepis are apparently able to do it actively, and at the slightest touch. And while others might take a long time to regenerate their scales, fish-scale geckos can grow them back, scar-free, in a matter of weeks.

This remarkable (if somewhat gruesome) ability has made these geckos a serious challenge to the scientists who want to study them. Early researchers described how it was necessary to catch them with bundles of cotton wool, to avoid them losing almost all of their skin. Today, little has changed, and researchers try to catch them without touching them if possible, by luring them into plastic bags. But once they are caught, the challenges are not over; identifying and describing them is even harder.

“A study a few years ago showed that our understanding of the diversity of fish-scale geckos was totally inadequate,” says Mark D. Scherz, lead author of the new study and PhD student at the Ludwig Maximilian University of Munich and Zoologische Staatssammlung München, “it showed us that there were actually about thirteen highly distinct genetic lineages in this genus, and not just the three or four species we thought existed. One of the divergent lineages they identified was immediately obvious as a new species, because it had such massive scales. But to name it, we had to find additional reliable characteristics that distinguish it from the other species.” A challenging task indeed: one of the main ways reptile species can be told apart is by their scale patterns, but these geckos lose their scales with such ease that the patterns are often lost by the time they reach adulthood. “You have to think a bit outside the box with Geckolepis. They’re a nightmare to identify. So we turned to micro-CT to get at their skeletons and search there for identifying features.” Micro-CT (micro-computed tomography) is essentially a 3D x-ray of an object. This method is allowing morphologists like Scherz to examine the skeletons of animals without having to dissect them, opening up new approaches to quickly study the internal morphology of animals.

By looking at the skeletons of the geckos, the team was able to identify some features of the skull that distinguish their new species from all others. But they also found some surprises; a species named 150 years ago, Geckolepis maculata, was confirmed to be distinct from the genetic lineage to which it had been assigned. “This is just typical of Geckolepis. You think you have them sorted out, but then you get a result that turns your hypothesis on its head. We still have no idea what Geckolepis maculata really is–we are just getting more and more certain what it’s not.”

The new species, Geckolepis megalepis, which was described by researchers from the US, Germany, and Columbia [sic] in a paper published today in the open access journal PeerJ, is most remarkable because of its huge scales, which are by far the largest of any gecko. The researchers hypothesize that the larger scales tear more easily than smaller scales, because of their greater surface area relative to the attachment area, and larger friction surface. “What’s really remarkable though is that these scales–which are really dense and may even be bony, and must be quite energetically costly to produce–and the skin beneath them tear away with such ease, and can be regenerated quickly and without a scar,” says Scherz. The mechanism for regeneration, which is not well understood, could potentially have applications in human medicine, where regeneration research is already being informed by studies on salamander limbs and lizard tails.

Here’s a link to and a citation for the paper,

Off the scale: a new species of fish-scale gecko (Squamata: Gekkonidae: Geckolepis) with exceptionally large scales by Mark D. Scherz​, Juan D. Daza, Jörn Köhler, Miguel Vences, Frank Glaw. PeerJ 5:e2955 https://doi.org/10.7717/peerj.2955

This paper is open access.

For anyone unfamiliar with ‘gecko research’, scientists are fascinated by their abilities and have been researching them (in a field known as biomimicry or bioinspired engineering or biomimetics) for years with the hope of mimicking those abilities for new applications. You can check out a March 19, 2015 posting or this July 10, 2014 posting for examples or you can search ‘gecko’ on this blog for more examples.

Figuring out how stars are born by watching neutrons ‘quantum tunnelling’ on graphene

A Feb. 3, 2017 news item on Nanowerk announces research that could help us better understand how stars are ‘born’,

Graphene is known as the world’s thinnest material due to its 2D structure, where each sheet is only one carbon atom thick, allowing each atom to engage in a chemical reaction from two sides. Graphene flakes can have a very large proportion of edge atoms, all of which have a particular chemical reactivity.

In addition, chemically active voids created by missing atoms are a surface defect of graphene sheets. These structural defects and edges play a vital role in carbon chemistry and physics, as they alter the chemical reactivity of graphene. In fact, chemical reactions have repeatedly been shown to be favoured at these defect sites.

Interstellar molecular clouds are predominantly composed of hydrogen in molecular form (H2), but also contain a small percentage of dust particles mostly in the form of carbon nanostructures, called polycyclic aromatic hydrocarbons (PAHs). These clouds are often referred to as ‘star nurseries’ as their low temperature and high density allow gravity to locally condense matter in such a way that it initiates H fusion, the nuclear reaction at the heart of each star.

Graphene-based materials, prepared from the exfoliation of graphite oxide, are used as a model of interstellar carbon dust as they contain a relatively large amount of atomic defects, either at their edges or on their surface. These defects are thought to sustain the Eley-Rideal chemical reaction, which recombines two H atoms into one H2 molecule. The observation of interstellar clouds in inhospitable regions of space, including in the direct proximity of giant stars, poses the question of the origin of the stability of hydrogen in the molecular form (H2).

This question stands because the clouds are constantly exposed to intense radiation, which cracks the hydrogen molecules into atoms. Astrochemists suggest that the chemical mechanism responsible for the recombination of atomic H into molecular H2 is catalysed by carbon flakes in interstellar clouds.

A Feb. 2, 2017 Institut Laue-Langevin press release, which originated the news item, provides more insight into the research,

Their [astrochemists’] theories are challenged by the need for a very efficient surface chemistry scenario to explain the observed equilibrium between dissociation and recombination. They had to introduce highly reactive sites into their models so that the capture of an atomic H nearby occurs without fail. These sites, in the form of atomic defects at the surface or edge of the carbon flakes, should be such that the C-H bond formed thereafter allows the H atom to be released easily to recombine with another H atom flying nearby.

A collaboration between the Institut Laue-Langevin (ILL), France, the University of Parma, Italy, and the ISIS Neutron and Muon Source, UK, combined neutron spectroscopy with density functional theory (DFT) molecular dynamics simulations in order to characterise the local environment and vibrations of hydrogen atoms chemically bonded at the surface of substantially defected graphene flakes. Additional analyses were carried out using muon spectroscopy (muSR) and nuclear magnetic resonance (NMR). As availability of the samples is very low, these highly specific techniques were necessary to study the samples; neutron spectroscopy is highly sensitive to hydrogen and allowed accurate data to be gathered at small concentrations.

For the first time ever, this study showed ‘quantum tunnelling’ in these systems, allowing the H atoms bound to C atoms to explore relatively long distances at temperatures as low as those in interstellar clouds. The process involves hydrogen ‘quantum hopping’ from one carbon atom to another in its direct vicinity, tunnelling through energy barriers which could not be overcome given the lack of heat in the interstellar cloud environment. This movement is sustained by the fluctuations of the graphene structure, which bring the H atom into unstable regions and catalyse the recombination process by allowing the release of the chemically bonded H atom. Therefore, it is believed that quantum tunnelling facilitates the reaction for the formation of molecular H2.
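For anyone who wants a feel for why tunnelling matters so much for hydrogen in particular, here is a back-of-the-envelope sketch using the textbook WKB estimate for a rectangular barrier. The barrier height, width, and the comparison with a heavier carbon atom are illustrative guesses of mine, not numbers from the paper,

```python
import numpy as np

# WKB estimate of tunnelling through a rectangular barrier:
# T ~ exp(-2 * kappa * L), with kappa = sqrt(2 m (V - E)) / hbar.
# The 0.2 eV height and 1 angstrom width are illustrative textbook-scale
# values, NOT parameters taken from the ILL study.
HBAR = 1.0545718e-34   # J s
EV = 1.602176634e-19   # J per electron volt
AMU = 1.66053907e-27   # kg per atomic mass unit

def transmission(mass_amu, barrier_ev=0.2, width_m=1e-10):
    kappa = np.sqrt(2 * mass_amu * AMU * barrier_ev * EV) / HBAR
    return np.exp(-2 * kappa * width_m)

t_hydrogen = transmission(1.0)   # a hydrogen atom
t_carbon = transmission(12.0)    # a carbon atom, for comparison

print(f"T(H) ~ {t_hydrogen:.1e}, T(C) ~ {t_carbon:.1e}")
```

Because the exponent grows with the square root of the mass, the light H atom tunnels many orders of magnitude more readily than the carbon atoms of the lattice, which is (roughly) why the hydrogen can hop while the graphene flake stays put; in the actual study the hopping is additionally assisted by fluctuations of the graphene structure.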

ILL scientist and carbon nanostructure specialist, Stéphane Rols says: “The question of how molecular hydrogen forms at the low temperatures in interstellar clouds has always been a driver in astrochemistry research. We’re proud to have combined spectroscopy expertise with the sensitivity of neutrons to identify the intriguing quantum tunnelling phenomenon as a possible mechanism behind the formation of H2; these observations are significant in furthering our understanding of the universe.”

Here’s a link to and a citation for the paper (which dates from Aug. 2016),

Hydrogen motions in defective graphene: the role of surface defects by Chiara Cavallari, Daniele Pontiroli, Mónica Jiménez-Ruiz, Mark Johnson, Matteo Aramini, Mattia Gaboardi, Stewart F. Parker, Mauro Riccó, and Stéphane Rols. Phys. Chem. Chem. Phys., 2016, 18 (36), 24820-24824. DOI: 10.1039/C6CP04727K First published online 22 Aug 2016

This paper is behind a paywall.

Mapping 23,000 atoms in a nanoparticle

Identification of the precise 3-D coordinates of iron, shown in red, and platinum atoms in an iron-platinum nanoparticle. Courtesy of Colin Ophus and Florian Nickel/Berkeley Lab

The image of the iron-platinum nanoparticle (referenced in the headline) reminds me of foetal ultrasound images. A Feb. 1, 2017 news item on ScienceDaily tells us more,

In the world of the very tiny, perfection is rare: virtually all materials have defects on the atomic level. These imperfections — missing atoms, atoms of one type swapped for another, and misaligned atoms — can uniquely determine a material’s properties and function. Now, UCLA [University of California at Los Angeles] physicists and collaborators have mapped the coordinates of more than 23,000 individual atoms in a tiny iron-platinum nanoparticle to reveal the material’s defects.

The results demonstrate that the positions of tens of thousands of atoms can be precisely identified and then fed into quantum mechanics calculations to correlate imperfections and defects with material properties at the single-atom level.

A Feb. 1, 2017 UCLA news release, which originated the news item, provides more detail about the work,

Jianwei “John” Miao, a UCLA professor of physics and astronomy and a member of UCLA’s California NanoSystems Institute, led the international team in mapping the atomic-level details of the bimetallic nanoparticle, more than a trillion of which could fit within a grain of sand.

“No one has seen this kind of three-dimensional structural complexity with such detail before,” said Miao, who is also a deputy director of the Science and Technology Center on Real-Time Functional Imaging. This new National Science Foundation-funded consortium consists of scientists at UCLA and five other colleges and universities who are using high-resolution imaging to address questions in the physical sciences, life sciences and engineering.

Miao and his team focused on an iron-platinum alloy, a very promising material for next-generation magnetic storage media and permanent magnet applications.

By taking multiple images of the iron-platinum nanoparticle with an advanced electron microscope at Lawrence Berkeley National Laboratory and using powerful reconstruction algorithms developed at UCLA, the researchers determined the precise three-dimensional arrangement of atoms in the nanoparticle.

“For the first time, we can see individual atoms and chemical composition in three dimensions. Everything we look at, it’s new,” Miao said.

The team identified and located more than 6,500 iron and 16,600 platinum atoms and showed how the atoms are arranged in nine grains, each of which contains different ratios of iron and platinum atoms. Miao and his colleagues showed that atoms closer to the interior of the grains are more regularly arranged than those near the surfaces. They also observed that the interfaces between grains, called grain boundaries, are more disordered.

“Understanding the three-dimensional structures of grain boundaries is a major challenge in materials science because they strongly influence the properties of materials,” Miao said. “Now we are able to address this challenge by precisely mapping out the three-dimensional atomic positions at the grain boundaries for the first time.”

The researchers then used the three-dimensional coordinates of the atoms as inputs into quantum mechanics calculations to determine the magnetic properties of the iron-platinum nanoparticle. They observed abrupt changes in magnetic properties at the grain boundaries.

“This work makes significant advances in characterization capabilities and expands our fundamental understanding of structure-property relationships, which is expected to find broad applications in physics, chemistry, materials science, nanoscience and nanotechnology,” Miao said.

In the future, as the researchers continue to determine the three-dimensional atomic coordinates of more materials, they plan to establish an online databank for the physical sciences, analogous to protein databanks for the biological and life sciences. “Researchers can use this databank to study material properties truly on the single-atom level,” Miao said.

Miao and his team also look forward to applying their method called GENFIRE (GENeralized Fourier Iterative Reconstruction) to biological and medical applications. “Our three-dimensional reconstruction algorithm might be useful for imaging like CT scans,” Miao said. Compared with conventional reconstruction methods, GENFIRE requires fewer images to compile an accurate three-dimensional structure.

That means that radiation-sensitive objects can be imaged with lower doses of radiation.

The US Dept. of Energy (DOE) Lawrence Berkeley National Laboratory issued their own Feb. 1, 2017 news release (also on EurekAlert) about the work with a focus on how their equipment made this breakthrough possible (it repeats a little of the info. from the UCLA news release),

Scientists used one of the world’s most powerful electron microscopes to map the precise location and chemical type of 23,000 atoms in an extremely small particle made of iron and platinum.

The 3-D reconstruction reveals the arrangement of atoms in unprecedented detail, enabling the scientists to measure chemical order and disorder in individual grains, which sheds light on the material’s properties at the single-atom level. Insights gained from the particle’s structure could lead to new ways to improve its magnetic performance for use in high-density, next-generation hard drives.

What’s more, the technique used to create the reconstruction, atomic electron tomography (which is like an incredibly high-resolution CT scan), lays the foundation for precisely mapping the atomic composition of other useful nanoparticles. This could reveal how to optimize the particles for more efficient catalysts, stronger materials, and disease-detecting fluorescent tags.

Microscopy data was obtained and analyzed by scientists from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) at the Molecular Foundry, in collaboration with Foundry users from UCLA, Oak Ridge National Laboratory, and the United Kingdom’s University of Birmingham. …

Atoms are the building blocks of matter, and the patterns in which they’re arranged dictate a material’s properties. These patterns can also be exploited to greatly improve a material’s function, which is why scientists are eager to determine the 3-D structure of nanoparticles at the smallest scale possible.

“Our research is a big step in this direction. We can now take a snapshot that shows the positions of all the atoms in a nanoparticle at a specific point in its growth. This will help us learn how nanoparticles grow atom by atom, and it sets the stage for a materials-design approach starting from the smallest building blocks,” says Mary Scott, who conducted the research while she was a Foundry user, and who is now a staff scientist. Scott and fellow Foundry scientists Peter Ercius and Colin Ophus developed the method in close collaboration with Jianwei Miao, a UCLA professor of physics and astronomy.

Their nanoparticle reconstruction builds on an achievement they reported last year in which they measured the coordinates of more than 3,000 atoms in a tungsten needle to a precision of 19 trillionths of a meter (19 picometers), which is many times smaller than a hydrogen atom. Now, they’ve taken the same precision, added the ability to distinguish different elements, and scaled up the reconstruction to include tens of thousands of atoms.

Importantly, their method maps the position of each atom in a single, unique nanoparticle. In contrast, X-ray crystallography and cryo-electron microscopy plot the average position of atoms from many identical samples. These methods make assumptions about the arrangement of atoms, which isn’t a good fit for nanoparticles because no two are alike.

“We need to determine the location and type of each atom to truly understand how a nanoparticle functions at the atomic scale,” says Ercius.

A TEAM approach

The scientists’ latest accomplishment hinged on the use of one of the highest-resolution transmission electron microscopes in the world, called TEAM I. It’s located at the National Center for Electron Microscopy, which is a Molecular Foundry facility. The microscope scans a sample with a focused beam of electrons, and then measures how the electrons interact with the atoms in the sample. It also has a piezo-controlled stage that positions samples with unmatched stability and position-control accuracy.

The researchers began growing an iron-platinum nanoparticle from its constituent elements, and then stopped the particle’s growth before it was fully formed. They placed the “partially baked” particle in the TEAM I stage, obtained a 2-D projection of its atomic structure, rotated it a few degrees, obtained another projection, and so on. Each 2-D projection provides a little more information about the full 3-D structure of the nanoparticle.

They sent the projections to Miao at UCLA, who used a sophisticated computer algorithm to convert the 2-D projections into a 3-D reconstruction of the particle. The individual atomic coordinates and chemical types were then traced from the 3-D density based on the knowledge that iron atoms are lighter than platinum atoms. The resulting atomic structure contains 6,569 iron atoms and 16,627 platinum atoms, with each atom’s coordinates precisely plotted to less than the width of a hydrogen atom.
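For readers curious about how a stack of 2-D projections can pin down a 3-D structure at all, here is a deliberately simplified toy in Python. It uses naive back-projection on a tiny grid purely as an illustration of the principle; it is not the GENFIRE algorithm the UCLA team actually used (which iterates between real and Fourier space under physical constraints and gets by with far fewer views),

```python
import numpy as np

# Toy 3-D density: a few point "atoms" scattered on a 16^3 grid.
rng = np.random.default_rng(0)
volume = np.zeros((16, 16, 16))
atoms = rng.integers(0, 16, size=(5, 3))
for x, y, z in atoms:
    volume[x, y, z] = 1.0

# A minimal "tilt series": 2-D projections along the three grid axes.
# (A real experiment tilts the sample by a few degrees at a time and
# records many such images.)
p_x = volume.sum(axis=0)
p_y = volume.sum(axis=1)
p_z = volume.sum(axis=2)

# Naive back-projection: smear each 2-D image back through the volume
# and sum. Voxels where all three projections agree light up strongly;
# empty space averages down toward zero.
recon = (p_x[np.newaxis, :, :]
         + p_y[:, np.newaxis, :]
         + p_z[:, :, np.newaxis])

for x, y, z in atoms:
    assert recon[x, y, z] >= 3.0   # every true atom site is a strong peak
print("mean background level:", recon.mean())
```

Even this crude version shows why more projections help: each view removes ambiguity about where the density sits along its own line of sight, which is the intuition behind the statement below that GENFIRE needs fewer images than conventional methods for an accurate reconstruction.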

Translating the data into scientific insights

Interesting features emerged at this extreme scale after Molecular Foundry scientists used code they developed to analyze the atomic structure. For example, the analysis revealed chemical order and disorder in interlocking grains, in which the iron and platinum atoms are arranged in different patterns. This has large implications for how the particle grew and its real-world magnetic properties. The analysis also revealed single-atom defects and the width of disordered boundaries between grains, measurements that had not previously been possible for complex 3-D grain boundaries.

“The important materials science problem we are tackling is how this material transforms from a highly randomized structure, what we call a chemically-disordered structure, into a regular highly-ordered structure with the desired magnetic properties,” says Ophus.

To explore how the various arrangements of atoms affect the nanoparticle’s magnetic properties, scientists from DOE’s Oak Ridge National Laboratory ran computer calculations on the Titan supercomputer at ORNL–using the coordinates and chemical type of each atom–to simulate the nanoparticle’s behavior in a magnetic field. This allowed the scientists to see patterns of atoms that are very magnetic, which is ideal for hard drives. They also saw patterns with poor magnetic properties that could sap a hard drive’s performance.

“This could help scientists learn how to steer the growth of iron-platinum nanoparticles so they develop more highly magnetic patterns of atoms,” says Ercius.

Adds Scott, “More broadly, the imaging technique will shed light on the nucleation and growth of ordered phases within nanoparticles, which isn’t fully theoretically understood but is critically important to several scientific disciplines and technologies.”

The folks at the Berkeley Lab have created a video (notice where the still image from the beginning of this post appears),

The Oak Ridge National Laboratory (ORNL), not wanting to be left out, has been mentioned in a Feb. 3, 2017 news item on ScienceDaily,

… researchers working with magnetic nanoparticles at the University of California, Los Angeles (UCLA), and the US Department of Energy’s (DOE’s) Lawrence Berkeley National Laboratory (Berkeley Lab) approached computational scientists at DOE’s Oak Ridge National Laboratory (ORNL) to help solve a unique problem: to model magnetism at the atomic level using experimental data from a real nanoparticle.

“These types of calculations have been done for ideal particles with ideal crystal structures but not for real particles,” said Markus Eisenbach, a computational scientist at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at ORNL.

A Feb. 2, 2017 ORNL news release on EurekAlert, which originated the news item, elucidates further on how their team added to the research,

Eisenbach develops quantum mechanical electronic structure simulations that predict magnetic properties in materials. Working with Paul Kent, a computational materials scientist at ORNL’s Center for Nanophase Materials Sciences, the team collaborated with researchers at UCLA and Berkeley Lab’s Molecular Foundry to combine world-class experimental data with world-class computing to do something new–simulate magnetism atom by atom in a real nanoparticle.

Using the new data from the research teams on the West Coast, Eisenbach and Kent were able to precisely model the measured atomic structure, including defects, from a unique iron-platinum (FePt) nanoparticle and simulate its magnetic properties on the 27-petaflop Titan supercomputer at the OLCF.

Electronic structure codes take atomic and chemical structure and solve for the corresponding magnetic properties. However, these structures are typically derived from many 2-D electron microscopy or x-ray crystallography images averaged together, resulting in a representative, but not true, 3-D structure.

“In this case, researchers were able to get the precise 3-D structure for a real particle,” Eisenbach said. “The UCLA group has developed a new experimental technique where they can tell where the atoms are–the coordinates–and the chemical resolution, or what they are — iron or platinum.”

The ORNL news release goes on to describe the work from the perspective of the people who ran the supercomputer simulations,

A Supercomputing Milestone

Magnetism at the atomic level is driven by quantum mechanics — a fact that has shaken up classical physics calculations and called for increasingly complex, first-principle calculations, or calculations working forward from fundamental physics equations rather than relying on assumptions that reduce computational workload.

For magnetic recording and storage devices, researchers are particularly interested in magnetic anisotropy, or what direction magnetism favors in an atom.

“If the anisotropy is too weak, a bit written to the nanoparticle might flip at room temperature,” Kent said.
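Kent's point about bits flipping can be made concrete with the standard Néel-Arrhenius estimate for how long a single-domain magnetic grain holds its orientation. The sketch below uses generic textbook numbers (attempt time and barrier-to-thermal-energy ratios of my own choosing, not values from the FePt study),

```python
import math

# Néel-Arrhenius relaxation time for a single-domain magnetic grain:
# tau = tau0 * exp(K*V / (kB*T)), where K*V is the anisotropy energy
# barrier and kB*T the thermal energy. tau0 ~ 1 ns is a typical attempt
# time; the barrier ratios below are illustrative, not from the paper.
TAU0 = 1e-9  # seconds

def relaxation_time(barrier_over_kt):
    return TAU0 * math.exp(barrier_over_kt)

weak = relaxation_time(20)    # weak anisotropy
strong = relaxation_time(45)  # strong anisotropy

print(f"KV/kT = 20 -> bit flips after ~{weak:.2g} s")
print(f"KV/kT = 45 -> bit stable for ~{strong / 3.15e7:.0f} years")
```

The exponential makes the point: a modest change in the anisotropy barrier turns a bit that randomizes within a second at room temperature into one that is stable for centuries, which is why mapping anisotropy grain by grain matters for storage media.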

To solve for magnetic anisotropy, Eisenbach and Kent used two computational codes to compare and validate results.

To simulate a supercell of about 1,300 atoms from strongly magnetic regions of the 23,000-atom nanoparticle, they used the Linear Scaling Multiple Scattering (LSMS) code, a first-principles density functional theory code developed at ORNL.

“The LSMS code was developed for large magnetic systems and can tackle lots of atoms,” Kent said.

As principal investigator on 2017, 2016, and previous INCITE program awards, Eisenbach has scaled the LSMS code to Titan for a range of magnetic materials projects, and the in-house code has been optimized for Titan’s accelerated architecture, speeding up calculations more than 8 times on the machine’s GPUs. Exceptionally capable of crunching large magnetic systems quickly, the LSMS code received an Association for Computing Machinery Gordon Bell Prize in high-performance computing achievement in 1998 and 2009, and developments continue to enhance the code for new architectures.

Working with Renat Sabirianov at the University of Nebraska at Omaha, the team also ran VASP, a simulation package that is better suited for smaller atom counts, to simulate regions of about 32 atoms.

“With both approaches, we were able to confirm that the local VASP results were consistent with the LSMS results, so we have a high confidence in the simulations,” Eisenbach said.

Computer simulations revealed that grain boundaries have a strong effect on magnetism. “We found that the magnetic anisotropy energy suddenly transitions at the grain boundaries. These magnetic properties are very important,” Miao said.

In the future, researchers hope that advances in computing and simulation will make a full-particle simulation possible — as first-principles calculations are currently too intensive to solve small-scale magnetism for regions larger than a few thousand atoms.

Also, future simulations like these could show how different fabrication processes, such as the temperature at which nanoparticles are formed, influence magnetism and performance.

“There’s a hope going forward that one would be able to use these techniques to look at nanoparticle growth and understand how to optimize growth for performance,” Kent said.

Finally, here’s a link to and a citation for the paper,

Deciphering chemical order/disorder and material properties at the single-atom level by Yongsoo Yang, Chien-Chun Chen, M. C. Scott, Colin Ophus, Rui Xu, Alan Pryor, Li Wu, Fan Sun, Wolfgang Theis, Jihan Zhou, Markus Eisenbach, Paul R. C. Kent, Renat F. Sabirianov, Hao Zeng, Peter Ercius, & Jianwei Miao. Nature 542, 75–79 (02 February 2017) doi:10.1038/nature21042 Published online 01 February 2017

This paper is behind a paywall.

New principles for AI (artificial intelligence) research along with some history and a plea for a democratic discussion

For almost a month I’ve been meaning to get to this Feb. 1, 2017 essay by Andrew Maynard (director of Risk Innovation Lab at Arizona State University) and Jack Stilgoe (science policy lecturer at University College London [UCL]) on the topic of artificial intelligence and principles (Note: Links have been removed). First, a walk down memory lane,

Today [Feb. 1, 2017] in Washington DC, leading US and UK scientists are meeting to share dispatches from the frontiers of machine learning – an area of research that is creating new breakthroughs in artificial intelligence (AI). Their meeting follows the publication of a set of principles for beneficial AI that emerged from a conference earlier this year at a place with an important history.

In February 1975, 140 people – mostly scientists, with a few assorted lawyers, journalists and others – gathered at a conference centre on the California coast. A magazine article from the time by Michael Rogers, one of the few journalists allowed in, reported that most of the four days’ discussion was about the scientific possibilities of genetic modification. Two years earlier, scientists had begun using recombinant DNA to genetically modify viruses. The Promethean nature of this new tool prompted scientists to impose a moratorium on such experiments until they had worked out the risks. By the time of the Asilomar conference, the pent-up excitement was ready to burst. It was only towards the end of the conference when a lawyer stood up to raise the possibility of a multimillion-dollar lawsuit that the scientists focussed on the task at hand – creating a set of principles to govern their experiments.

The 1975 Asilomar meeting is still held up as a beacon of scientific responsibility. However, the story told by Rogers, and subsequently by historians, is of scientists motivated by a desire to head off top-down regulation with a promise of self-governance. Geneticist Stanley Cohen said at the time, ‘If the collected wisdom of this group doesn’t result in recommendations, the recommendations may come from other groups less well qualified’. The mayor of Cambridge, Massachusetts, was a prominent critic of the biotechnology experiments then taking place in his city. He said, ‘I don’t think these scientists are thinking about mankind at all. I think that they’re getting the thrills and the excitement and the passion to dig in and keep digging to see what the hell they can do’.

The concern in 1975 was with safety and containment in research, not with the futures that biotechnology might bring about. A year after Asilomar, Cohen’s colleague Herbert Boyer founded Genentech, one of the first biotechnology companies. Corporate interests barely figured in the conversations of the mainly university scientists.

Fast-forward 42 years and it is clear that machine learning, natural language processing and other technologies that come under the AI umbrella are becoming big business. The cast list of the 2017 Asilomar meeting included corporate wunderkinds from Google, Facebook and Tesla as well as researchers, philosophers, and other academics. The group was more intellectually diverse than their 1975 equivalents, but there were some notable absences – no public and their concerns, no journalists, and few experts in the responsible development of new technologies.

Maynard and Stilgoe offer a critique of the latest principles,

The principles that came out of the meeting are, at least at first glance, a comforting affirmation that AI should be ‘for the people’, and not to be developed in ways that could cause harm. They promote the idea of beneficial and secure AI, development for the common good, and the importance of upholding human values and shared prosperity.

This is good stuff. But it’s all rather Motherhood and Apple Pie: comforting and hard to argue against, but lacking substance. The principles are short on accountability, and there are notable absences, including the need to engage with a broader set of stakeholders and the public. At the early stages of developing new technologies, public concerns are often seen as an inconvenience. In a world in which populism appears to be trampling expertise into the dirt, it is easy to understand why scientists may be defensive.

I encourage you to read this thoughtful essay in its entirety, although I do have one nit to pick: Why only US and UK scientists? I imagine the answer may lie in funding and logistics issues, but I find it surprising that the critique makes no mention of the international community as a nod to inclusion.

For anyone interested in the Asilomar AI principles (2017), you can find them here. You can also find videos of the two-day workshop (Jan. 31 – Feb. 1, 2017) titled The Frontiers of Machine Learning (a Raymond and Beverly Sackler USA-UK Scientific Forum [US National Academy of Sciences]) here (videos for each session are available on YouTube).

The physics of melting in two-dimensional systems

You might want to skip over the reference to snow as it doesn’t have much relevance to this story about ‘melting’, from a Feb. 1, 2017 news item on Nanowerk (Note: A link has been removed),

Snow falls in winter and melts in spring, but what drives the phase change in between? Although melting is a familiar phenomenon encountered in everyday life, playing a part in many industrial and commercial processes, much remains to be discovered about this transformation at a fundamental level.

In 2015, a team led by the University of Michigan’s Sharon Glotzer used high-performance computing at the Department of Energy’s (DOE’s) Oak Ridge National Laboratory [ORNL] to study melting in two-dimensional (2-D) systems, a problem that could yield insights into surface interactions in materials important to technologies like solar panels, as well as into the mechanism behind three-dimensional melting. The team explored how particle shape affects the physics of a solid-to-fluid melting transition in two dimensions.

Using the Cray XK7 Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility, the team’s [latest?] work revealed that the shape and symmetry of particles can dramatically affect the melting process (“Shape and symmetry determine two-dimensional melting transitions of hard regular polygons”). This fundamental finding could help guide researchers in search of nanoparticles with desirable properties for energy applications.

There is a video of the ‘melting’ process but I have to confess to finding it a bit enigmatic,

A Feb. 1, 2017 ORNL news release (also on EurekAlert), which originated the news item, provides more detail about the research,

To tackle the problem, Glotzer’s team needed a supercomputer capable of simulating systems of up to 1 million hard polygons, simple particles used as stand-ins for atoms, ranging from triangles to 14-sided shapes. Unlike traditional molecular dynamics simulations that attempt to mimic nature, hard polygon simulations give researchers a pared-down environment in which to evaluate shape-influenced physics.

“Within our simulated 2-D environment, we found that the melting transition follows one of three different scenarios depending on the shape of the systems’ polygons,” University of Michigan research scientist Joshua Anderson said. “Notably, we found that systems made up of hexagons perfectly follow a well-known theory for 2-D melting, something that hasn’t been described until now.”

Shifting Shape Scenarios

In 3-D systems such as a thinning icicle, melting takes the form of a first-order phase transition. This means that collections of molecules within these systems exist in either solid or liquid form with no in-between, in the presence of latent heat, the energy that fuels a solid-to-fluid phase change. In 2-D systems, such as thin-film materials used in batteries and other technologies, melting can be more complex, sometimes exhibiting an intermediate phase known as the hexatic phase.

The hexatic phase, a state characterized as a halfway point between an ordered solid and a disordered liquid, was first theorized in the 1970s by researchers John Kosterlitz, David Thouless, Burt Halperin, David Nelson, and Peter Young. The phase is a principal feature of the KTHNY theory, a 2-D melting theory posited by the researchers (and named based on the first letters of their last names). In 2016 Kosterlitz and Thouless were awarded the Nobel Prize in Physics, along with physicist Duncan Haldane, for their contributions to 2-D materials research.

At the molecular level, solid, hexatic, and liquid systems are defined by the arrangement of their atoms. In a crystalline solid, two types of order are present: translational and orientational. Translational order describes the well-defined paths between atoms over distances, like blocks in a carefully constructed Jenga tower. Orientational order describes the relational and clustered order shared between atoms and groups of atoms over distances. Think of that same Jenga tower turned askew after several rounds of play. The general shape of the tower remains, but its order is now fragmented.

The hexatic phase has no translational order but possesses orientational order. (A liquid has neither translational nor orientational order but exhibits short-range order, meaning any atom will have some average number of neighbors nearby but with no predictable order.)
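For readers who want to see how ‘orientational order’ is actually measured, the standard diagnostic is the sixfold bond-orientational order parameter, usually written ψ6. Here is a minimal Python sketch of my own (the paper’s analysis is more sophisticated, but the idea is the same): for each particle, average the complex phase exp(6iθ) over the bond angles θ to its near neighbours,

```python
import cmath
import math

def psi6(points, cutoff=1.5):
    """|psi_6| per particle: the sixfold bond-orientational order parameter.

    For each particle, average exp(6i*theta) over the angles of 'bonds'
    to neighbours closer than `cutoff`.  |psi_6| is near 1 in a hexagonal
    crystal (and stays high in the hexatic phase), near 0 in a liquid.
    """
    values = []
    for j, (xj, yj) in enumerate(points):
        terms = []
        for k, (xk, yk) in enumerate(points):
            if k == j:
                continue
            dx, dy = xk - xj, yk - yj
            if math.hypot(dx, dy) < cutoff:
                terms.append(cmath.exp(6j * math.atan2(dy, dx)))
        values.append(abs(sum(terms) / len(terms)) if terms else 0.0)
    return values
```

The factor of six is what ties this quantity to the hexatic phase’s sixfold orientational order: rotating a bond by 60 degrees leaves exp(6iθ) unchanged, so a hexagonally coordinated particle scores close to 1 even after the translational order is gone.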

Deducing the presence of a hexatic phase requires a leadership-class computer that can calculate large hard-particle systems. Glotzer’s team gained access to the OLCF’s 27-petaflop Titan through the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, running its GPU-accelerated HOOMD-blue code to maximize time on the machine.

On Titan, HOOMD-blue used 64 GPUs for each massively parallel Monte Carlo simulation of up to 1 million particles. Researchers explored 11 different shape systems, applying an external pressure to push the particles together. Each system was simulated at 21 different densities, with the lowest densities representing a fluid state and the highest densities a solid state.

The simulations demonstrated multiple melting scenarios hinging on the polygons’ shape. Systems with polygons of seven sides or more closely followed the melting behavior of hard disks, or circles, exhibiting a continuous phase transition from the solid to the hexatic phase and a first-order phase transition from the hexatic to the liquid phase. A continuous phase transition means a constantly changing area in response to a changing external pressure. A first-order phase transition is characterized by a discontinuity in which the volume jumps across the phase transition in response to the changing external pressure. The team found that pentagons and fourfold pentilles, irregular pentagons with two different edge lengths, exhibit a first-order solid-to-liquid phase transition.
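That distinction can be read straight off an equation-of-state curve: a first-order transition leaves a jump in density at some pressure, while a continuous one does not. Here is a crude, hypothetical Python diagnostic along those lines (my own sketch, not the team’s analysis; the threshold value is made up),

```python
def classify_transition(pressures, densities, jump_threshold=0.05):
    """Crude equation-of-state diagnostic.

    A first-order transition shows up as a discontinuity, i.e. a large
    jump in density between adjacent pressure points; a continuous
    transition does not.  Returns the largest jump, where it occurs,
    and a verdict against the threshold.
    """
    biggest, interval = 0.0, None
    for i in range(len(densities) - 1):
        jump = abs(densities[i + 1] - densities[i])
        if jump > biggest:
            biggest, interval = jump, (pressures[i], pressures[i + 1])
    return {"first_order": biggest > jump_threshold,
            "largest_jump": biggest,
            "interval": interval}
```

In practice one would also check how the jump scales with system size, since finite simulations smear out true discontinuities, but the basic signature is this density gap.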

The most significant finding, however, emerged from hexagon systems, which perfectly followed the phase transition described by the KTHNY theory. In this scenario, the particles shift from solid to hexatic and from hexatic to fluid in a perfect continuous phase transition pattern.

“It was actually sort of surprising that no one else has found that until now,” Anderson said, “because it seems natural that the hexagon, with its six sides, and the honeycomb-like hexagonal arrangement would be a perfect match for this theory” in which the hexatic phase generally contains sixfold orientational order.

Glotzer’s team, which recently received a 2017 INCITE allocation, is now applying its leadership-class computing prowess to tackle phase transitions in 3-D. The team is focusing on how fluid particles crystallize into complex colloids—mixtures in which particles are suspended throughout another substance. Common examples of colloids include milk, paper, fog, and stained glass.

“We’re planning on using Titan to study how complexity can arise from these simple interactions, and to do that we’re actually going to look at how the crystals grow and study the kinetics of how that happens,” said Anderson.

There is a paper on arXiv,

Shape and symmetry determine two-dimensional melting transitions of hard regular polygons by Joshua A. Anderson, James Antonaglia, Jaime A. Millan, Michael Engel, and Sharon C. Glotzer
(Submitted on 2 Jun 2016 (v1), last revised 23 Dec 2016 (this version, v2)) arXiv:1606.00687 [cond-mat.soft] (or arXiv:1606.00687v2)

This paper is open access and open to public peer review.

US report on Women, minorities, and people with disabilities in science and engineering

A Jan. 31, 2017 news item on ScienceDaily announces a new report from the US National Science Foundation’s (NSF) National Center for Science and Engineering Statistics (NCSES),

The National Center for Science and Engineering Statistics (NCSES) today [Jan. 31, 2017] announced the release of the 2017 Women, Minorities, and Persons with Disabilities in Science and Engineering (WMPD) report, the federal government’s most comprehensive look at the participation of these three demographic groups in science and engineering education and employment.

The report shows the degree to which women, people with disabilities and minorities from three racial and ethnic groups — black, Hispanic and American Indian or Alaska Native — are underrepresented in science and engineering (S&E). Women have reached parity with men in educational attainment but not in S&E employment. Underrepresented minorities account for disproportionately smaller percentages in both S&E education and employment.

Congress mandated the biennial report in the Science and Engineering Equal Opportunities Act as part of the National Science Foundation’s (NSF) mission to encourage and strengthen the participation of underrepresented groups in S&E.

A Jan. 31, 2017 NSF news release (also on EurekAlert), which originated the news item, provides information about why the report is issued every two years and provides highlights from the 2017 report,

“An important part of fulfilling our mission to further the progress of science is producing current, accurate information about the U.S. STEM workforce,” said NSF Director France Córdova. “This report is a valuable resource to the science and engineering policy community.”

NSF maintains a portfolio of programs aimed at broadening participation in S&E, including ADVANCE: Increasing the Participation and Advancement of Women in Academic Science and Engineering Careers; LSAMP: the Louis Stokes Alliances for Minority Participation; and NSF INCLUDES, which focuses on building networks that can scale up proven approaches to broadening participation.

The digest provides highlights and analysis in five topic areas: enrollment, field of degree, occupation, employment status and early career doctorate holders. That last topic area includes analysis of pilot study data from the Early Career Doctorates Survey, a new NCSES product. NCSES also maintains expansive WMPD data tables, updated periodically as new data become available, which present the latest S&E education and workforce data available from NCSES and other agencies. The tables provide the public access to detailed, field-by-field information that includes both percentages and the actual numbers of people involved in S&E.

“WMPD is more than just a single report or presentation,” said NCSES Director John Gawalt. “It is a vast and unique information resource, carefully curated and maintained, that allows anyone (from the general public to highly trained researchers) ready access to data that facilitate and support their own exploration and analyses.”

Key findings from the new digest include:

  • The types of schools where students enroll vary among racial and ethnic groups. Hispanics, American Indians or Alaska Natives and Native Hawaiians or Other Pacific Islanders are more likely to enroll in community colleges. Blacks and Native Hawaiians or Other Pacific Islanders are more likely to enroll in private, for-profit schools.
  • Since the late 1990s, women have earned about half of S&E bachelor’s degrees. But their representation varies widely by field, ranging from 70 percent in psychology to 18 percent in computer sciences.
  • At every level — bachelor’s, master’s and doctorate — underrepresented minority women earn a higher proportion of degrees than their male counterparts. White women, in contrast, earn a smaller proportion of degrees than their male counterparts.
  • Despite two decades of progress, a wide gap in educational attainment remains between underrepresented minorities and whites and Asians, two groups that have higher representation in S&E education than they do in the U.S. population.
  • White men constitute about one-third of the overall U.S. population; they comprise half of the S&E workforce. Blacks, Hispanics and people with disabilities are underrepresented in the S&E workforce.
  • Women’s participation in the workforce varies greatly by field of occupation.
  • In 2015, scientists and engineers had a lower unemployment rate compared to the general U.S. population (3.3 percent versus 5.8 percent), although the rate varied among groups. For example, it was 2.8 percent among white women in S&E but 6.0 percent for underrepresented minority women.

For more information, including access to the digest and data tables, see the updated WMPD website.

Caption: In 2015, women and some minority groups were represented less in science and engineering (S&E) occupations than they were in the US general population. Credit: NSF

R.I.P. Mildred Dresselhaus, Queen of Carbon

I’ve been hearing about Mildred Dresselhaus, professor emerita (retired professor) at the Massachusetts Institute of Technology (MIT), just about as long as I’ve been researching and writing about nanotechnology (about 10 years, including the work for my master’s project and the almost eight years on this blog).

She died on Monday, Feb. 20, 2017 at the age of 86 having broken through barriers for those of her gender, barriers for her subject area, and barriers for her age.

Mark Anderson in his Feb. 22, 2017 obituary for the IEEE (Institute of Electrical and Electronics Engineers) Spectrum website provides a brief overview of her extraordinary life and accomplishments,

Called the “Queen of Carbon Science,” Dresselhaus pioneered the study of carbon nanostructures at a time when studying physical and material properties of commonplace atoms like carbon was out of favor. Her visionary perspectives on the sixth atom in the periodic table—including exploring individual layers of carbon atoms (precursors to graphene), developing carbon fibers stronger than steel, and revealing new carbon structures that were ultimately developed into buckyballs and nanotubes—invigorated the field.

“Millie Dresselhaus began life as the child of poor Polish immigrants in the Bronx; by the end, she was Institute Professor Emerita, the highest distinction awarded by the MIT faculty. A physicist, materials scientist, and electrical engineer, she was known as the ‘Queen of Carbon’ because her work paved the way for much of today’s carbon-based nanotechnology,” MIT president Rafael Reif said in a prepared statement.

Friends and colleagues describe Dresselhaus as a gifted instructor as well as a tireless and inspired researcher. And her boundless generosity toward colleagues, students, and girls and women pursuing careers in science is legendary.

In 1963, Dresselhaus began her own career studying carbon by publishing a paper on graphite in the IBM Journal for Research and Development, a foundational work in the history of nanotechnology. To this day, her studies of the electronic structure of this material serve as a reference point for explorations of the electronic structure of fullerenes and carbon nanotubes. Coauthor, with her husband Gene Dresselhaus, of a leading book on carbon fibers, she began studying the laser vaporization of carbon and the “carbon clusters” that resulted. Researchers who followed her lead discovered a 60-carbon structure that was soon identified as the icosahedral “soccer ball” molecular configuration known as buckminsterfullerene, or buckyball. In 1991, Dresselhaus further suggested that fullerene could be elongated as a tube, and she outlined these imagined objects’ symmetries. Not long after, researchers announced the discovery of carbon nanotubes.

When she began her nearly half-century career at MIT, as a visiting professor, women made up just 4 percent of the undergraduate student population. So Dresselhaus began working toward the improvement of living conditions for women students at the university. Through her leadership, MIT adopted an equal and joint admission process for women and men. (Previously, MIT had propounded the self-fulfilling prophecy of harboring more stringent requirements for women based on less dormitory space and perceived poorer performance.) And so promoting women in STEM—before it was ever called STEM—became one of her passions. Serving as president of the American Physical Society, she spearheaded and launched initiatives like the Committee on the Status of Women in Physics and the society’s more informal committees of visiting women physicists on campuses around the United States, which have increased the female faculty and student populations on the campuses they visit.

If you have the time, please read Anderson’s piece in its entirety.

One fact that has impressed me greatly is that Dresselhaus kept working into her eighties. In an April 27, 2012 posting, I featured a paper she published at the age of 82, and she was described in the MIT write-up at the time as a professor, not a professor emerita. I later featured Dresselhaus in a May 31, 2012 posting when she was awarded the Kavli Prize for Nanoscience.

It seems she worked almost to the end. Recently, GE (General Electric) posted a video “What If Scientists Were Celebrities?” starring Mildred Dresselhaus,

H/t Mark Anderson’s Feb. 22, 2017 obituary. The video was posted on Feb. 8, 2017.

Goodbye to the Queen of Carbon!

University of Alberta scientists use ultra fast (terahertz) microscopy to see ultra small (electron dynamics)

This is exciting news for Canadian science and the second time there has been a breakthrough development from the province of Alberta within the last five months (see Sept. 21, 2016 posting on quantum teleportation). From a Feb. 21, 2017 news item on ScienceDaily,

For the first time ever, scientists have captured images of terahertz electron dynamics of a semiconductor surface on the atomic scale. The successful experiment indicates a bright future for the new and quickly growing sub-field called terahertz scanning tunneling microscopy (THz-STM), pioneered by the University of Alberta in Canada. THz-STM allows researchers to image electron behaviour at extremely fast timescales and explore how that behaviour changes between different atoms.

A Feb. 21, 2017 University of Alberta news release on EurekAlert, which originated the news item, expands on the theme,

“We can essentially zoom in to observe very fast processes with atomic precision and over super fast time scales,” says Vedran Jelic, PhD student at the University of Alberta and lead author on the new study. “THz-STM provides us with a new window into the nanoworld, allowing us to explore ultrafast processes on the atomic scale. We’re talking a picosecond, or a millionth of a millionth of a second. It’s something that’s never been done before.”

Jelic and his collaborators used their scanning tunneling microscope (STM) to capture images of silicon atoms by raster scanning a very sharp tip across the surface and recording the tip height as it follows the atomic corrugations of the surface. While the original STM can measure and manipulate single atoms–for which its creators earned a Nobel Prize in 1986–it does so using wired electronics and is ultimately limited in speed and thus time resolution.
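For the curious, the ‘recording the tip height’ part works through a feedback loop: the tunneling current falls off exponentially with the tip-sample gap, and the tip is raised or lowered to hold that current constant, so the recorded height traces the atomic corrugation. Here is a toy Python sketch of one scan line, with made-up parameters (my own illustration, not the Alberta group’s instrument software),

```python
import math

def stm_line_scan(surface, setpoint=0.1, kappa=5.0, gain=0.1, iters=200):
    """Constant-current STM feedback along one scan line.

    The tunneling current decays exponentially with the tip-sample gap,
    I = exp(-2*kappa*gap).  At each pixel a feedback loop nudges the tip
    up or down until I matches the setpoint, so the recorded tip height
    traces the surface corrugation.
    """
    tip_z = surface[0] + 1.0            # start well above the surface
    profile = []
    for h in surface:                   # raster across one line of pixels
        for _ in range(iters):
            current = math.exp(-2 * kappa * (tip_z - h))
            # logarithmic error signal: lower the tip if current is too small
            tip_z += gain * math.log(current / setpoint) / (2 * kappa)
        profile.append(tip_z)
    return profile
```

The exponential sensitivity is what gives STM its atomic height resolution; the speed limitation mentioned above comes from this wired feedback electronics, which is exactly what the terahertz light pulses sidestep.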

Modern lasers produce very short light pulses that can measure a whole range of ultra-fast processes, but typically over length scales limited by the wavelength of light at hundreds of nanometers. Much effort has been expended to overcome the challenges of combining ultra-fast lasers with ultra-small microscopy. The University of Alberta scientists addressed these challenges by working in a unique terahertz frequency range of the electromagnetic spectrum that allows wireless implementation. Normally the STM needs an applied voltage in order to operate, but Jelic and his collaborators are able to drive their microscope using pulses of light instead. These pulses occur over really fast timescales, which means the microscope is able to see really fast events.

By incorporating the THz-STM into an ultrahigh vacuum chamber, free from any external contamination or vibration, they are able to accurately position their tip and maintain a perfectly clean surface while imaging ultrafast dynamics of atoms on surfaces. Their next step is to collaborate with fellow material scientists and image a variety of new surfaces on the nanoscale that may one day revolutionize the speed and efficiency of current technology, ranging from solar cells to computer processing.

“Terahertz scanning tunneling microscopy is opening the door to an unexplored regime in physics,” concludes Jelic, who is studying in the Ultrafast Nanotools Lab with University of Alberta professor Frank Hegmann, a world expert in ultra-fast terahertz science and nanophysics.

Here are links to and citations for the team’s 2013 paper and their latest,

An ultrafast terahertz scanning tunnelling microscope by Tyler L. Cocker, Vedran Jelic, Manisha Gupta, Sean J. Molesky, Jacob A. J. Burgess, Glenda De Los Reyes, Lyubov V. Titova, Ying Y. Tsui, Mark R. Freeman, & Frank A. Hegmann. Nature Photonics 7, 620–625 (2013) doi:10.1038/nphoton.2013.151 Published online 07 July 2013

Ultrafast terahertz control of extreme tunnel currents through single atoms on a silicon surface by Vedran Jelic, Krzysztof Iwaszczuk, Peter H. Nguyen, Christopher Rathje, Graham J. Hornig, Haille M. Sharum, James R. Hoffman, Mark R. Freeman, & Frank A. Hegmann. Nature Physics (2017) doi:10.1038/nphys4047 Published online 20 February 2017

Both papers are behind a paywall.

Quantum Shorts & Quantum Applications event at Vancouver’s (Canada) Science World

This is very short notice but if you do have some free time on Thursday, Feb. 23, 2017 from 6 – 8:30 pm, you can check out Science World’s Quantum: The Exhibition for free and watch a series of short films. Here’s more from the Quantum Shorts & Quantum Applications event page,

Join us for an evening of quantum art and science. Visit Quantum: The Exhibition and view a series of short films inspired by the science, history, and philosophy of quantum. Find some answers to your Quantum questions at this mind-expanding panel discussion.

Thursday, February 23: 

6pm                      Check out Quantum: The Exhibition
7pm                      Quantum Shorts Screening
7:45pm                 Panel Discussion/Presentation
8:30pm                 Q & A

Light refreshments will be available.

There are still spaces as of Wed., Feb. 22, 2017; you can register for the event here.

This will be one of the last chances you’ll have to see Quantum: The Exhibition, as the show’s last day here is scheduled for Feb. 26, 2017.