
Revisiting the scientific past for new breakthroughs

A March 2, 2017 article on phys.org features a thought-provoking (and, for some of us, confirming) take on scientific progress (Note: Links have been removed),

The idea that science isn’t a process of constant progress might make some modern scientists feel a bit twitchy. Surely we know more now than we did 100 years ago? We’ve sequenced the genome, explored space and considerably lengthened the average human lifespan. We’ve invented aircraft, computers and nuclear energy. We’ve developed theories of relativity and quantum mechanics to explain how the universe works.

However, treating the history of science as a linear story of progression doesn’t reflect wholly how ideas emerge and are adapted, forgotten, rediscovered or ignored. While we are happy with the notion that the arts can return to old ideas, for example in neoclassicism, this idea is not commonly recognised in science. Is this constraint really present in principle? Or is it more a comment on received practice or, worse, on the general ignorance of the scientific community of its own intellectual history?

For one thing, not all lines of scientific enquiry are pursued to conclusion. For example, a few years ago, historian of science Hasok Chang undertook a careful examination of notebooks from scientists working in the 19th century. He unearthed notes from experiments in electrochemistry whose results received no explanation at the time. After repeating the experiments himself, Chang showed the results still don’t have a full explanation today. These research programmes had not been completed, simply put to one side and forgotten.

A March 1, 2017 essay by Giles Gasper (Durham University), Hannah Smithson (University of Oxford) and Tom McLeish (Durham University) for The Conversation, which originated the article, expands on the theme (Note: Links have been removed),

… looping back into forgotten scientific history might also provide an alternative, regenerative way of thinking that doesn’t rely on what has come immediately before it.

Collaborating with an international team of colleagues, we have taken this hypothesis further by bringing scientists into close contact with scientific treatises from the early 13th century. The treatises were composed by the English polymath Robert Grosseteste – who later became Bishop of Lincoln – between 1195 and 1230. They cover a wide range of topics we would recognise as key to modern physics, including sound, light, colour, comets, the planets, the origin of the cosmos and more.

We have worked with paleographers (handwriting experts) and Latinists to decipher Grosseteste’s manuscripts, and with philosophers, theologians, historians and scientists to provide intellectual interpretation and context to his work. As a result, we’ve discovered that scientific and mathematical minds today still resonate with Grosseteste’s deeply physical and structured thinking.

Our first intuition and hope was that the scientists might bring a new analytic perspective to these very technical texts. And so it proved: the deep mathematical structure of a small treatise on colour, the De colore, was shown to describe what we would now call a three-dimensional abstract co-ordinate space for colour.

But more was true. During the examination of each treatise, at some point one of the group would say: “Did anyone ever try doing …?” or “What would happen if we followed through with this calculation, supposing he meant …”. Responding to this thinker from eight centuries ago has, to our delight and surprise, inspired new scientific work of a rather fresh cut. It isn’t connected in a linear way to current research programmes, but sheds light on them from new directions.
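As an aside: for readers wondering what a “three-dimensional abstract co-ordinate space for colour” looks like in practice, here is a minimal sketch of the modern concept in Python (the RGB axes below are my own illustrative choice, not the axes the researchers recovered from the De colore),

import math

def colour_distance(c1, c2):
    # Euclidean distance between two colours treated as points
    # in a three-dimensional co-ordinate space.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

# Illustrative points in RGB space (my own choice of axes).
crimson = (220, 20, 60)
scarlet = (255, 36, 0)
teal = (0, 128, 128)

print(colour_distance(crimson, scarlet))  # ~71: nearby points, similar reds
print(colour_distance(crimson, teal))     # ~254: distant points, very different colours

Once colours are points in such a space, similarity becomes a matter of geometric distance, which is what makes finding this structure in a 13th-century treatise so striking.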

I encourage you to read the essay in its entirety.

The Canadian science scene and the 2017 Canadian federal budget

There’s not much happening in the 2017-18 budget in terms of new spending, according to Paul Wells’ March 22, 2017 article for TheStar.com,

This is the 22nd or 23rd federal budget I’ve covered. And I’ve never seen the like of the one Bill Morneau introduced on Wednesday [March 22, 2017].

Not even in the last days of the Harper Conservatives did a budget provide for so little new spending — $1.3 billion in the current budget year, total, in all fields of government. That’s a little less than half of one per cent of all federal program spending for this year.

But times are tight. The future is a place where we can dream. So the dollars flow more freely in later years. In 2021-22, the budget’s fifth planning year, new spending peaks at $8.2 billion. Which will be about 2.4 per cent of all program spending.

He’s not alone in this 2017 federal budget analysis; CBC (Canadian Broadcasting Corporation) pundits Chantal Hébert, Andrew Coyne, and Jennifer Ditchburn said much the same during their ‘At Issue’ segment of the March 22, 2017 broadcast of The National (news).

Before I focus on the science and technology budget, here are some general highlights from the CBC’s March 22, 2017 article on the 2017-18 budget announcement (Note: Links have been removed),

Here are highlights from the 2017 federal budget:

  • Deficit: $28.5 billion, up from $25.4 billion projected in the fall.
  • Trend: Deficits gradually decline over next five years — but still at $18.8 billion in 2021-22.
  • Housing: $11.2 billion over 11 years, already budgeted, will go to a national housing strategy.
  • Child care: $7 billion over 10 years, already budgeted, for new spaces, starting 2018-19.
  • Indigenous: $3.4 billion in new money over five years for infrastructure, health and education.
  • Defence: $8.4 billion in capital spending for equipment pushed forward to 2035.
  • Care givers: New care-giving benefit up to 15 weeks, starting next year.
  • Skills: New agency to research and measure skills development, starting 2018-19.
  • Innovation: $950 million over five years to support business-led “superclusters.”
  • Startups: $400 million over three years for a new venture capital catalyst initiative.
  • AI: $125 million to launch a pan-Canadian Artificial Intelligence Strategy.
  • Coding kids: $50 million over two years for initiatives to teach children to code.
  • Families: Option to extend parental leave up to 18 months.
  • Uber tax: GST to be collected on ride-sharing services.
  • Sin taxes: One cent more on a bottle of wine, five cents on a case of 24 beers.
  • Bye-bye: No more Canada Savings Bonds.
  • Transit credit killed: 15 per cent non-refundable public transit tax credit phased out this year.

You can find the entire 2017-18 budget here.

Science and the 2017-18 budget

For anyone interested in the science news, you’ll find most of that in the 2017 budget’s Chapter 1 — Skills, Innovation and Middle Class Jobs. As well, Wayne Kondro has written up a précis in his March 22, 2017 article for Science (magazine),

Finance officials, who speak on condition of anonymity during the budget lock-up, indicated the budgets of the granting councils, the main source of operational grants for university researchers, will be “static” until the government can assess recommendations that emerge from an expert panel formed in 2015 and headed by former University of Toronto President David Naylor to review basic science in Canada [highlighted in my June 15, 2016 posting; $2M has been allocated for the advisor and associated secretariat]. Until then, the officials said, funding for the Natural Sciences and Engineering Research Council of Canada (NSERC) will remain at roughly $848 million, whereas that for the Canadian Institutes of Health Research (CIHR) will remain at $773 million, and for the Social Sciences and Humanities Research Council [SSHRC] at $547 million.

NSERC, though, will receive $8.1 million over 5 years to administer a PromoScience Program that introduces youth, particularly underrepresented groups like Aboriginal people and women, to science, technology, engineering, and mathematics through measures like “space camps and conservation projects.” CIHR, meanwhile, could receive modest amounts from separate plans to identify climate change health risks and to reduce drug and substance abuse, the officials added.

… Canada’s Innovation and Skills Plan, would funnel $600 million over 5 years allocated in 2016, and $112.5 million slated for public transit and green infrastructure, to create Silicon Valley–like “super clusters,” which the budget defined as “dense areas of business activity that contain large and small companies, post-secondary institutions and specialized talent and infrastructure.” …

… The Canadian Institute for Advanced Research will receive $93.7 million [emphasis mine] to “launch a Pan-Canadian Artificial Intelligence Strategy … (to) position Canada as a world-leading destination for companies seeking to invest in artificial intelligence and innovation.”

… Among more specific measures are vows to: Use $87.7 million in previous allocations to the Canada Research Chairs program to create 25 “Canada 150 Research Chairs” honoring the nation’s 150th year of existence, provide $1.5 million per year to support the operations of the office of the as-yet-unappointed national science adviser [see my Dec. 7, 2016 post for information about the job posting, which is now closed]; provide $165.7 million [emphasis mine] over 5 years for the nonprofit organization Mitacs to create roughly 6300 more co-op positions for university students and grads, and provide $60.7 million over five years for new Canadian Space Agency projects, particularly for Canadian participation in the National Aeronautics and Space Administration’s next Mars Orbiter Mission.

Kondro was either reading an earlier version of the budget or made an error regarding Mitacs (from the budget in the “A New, Ambitious Approach to Work-Integrated Learning” subsection),

Mitacs has set an ambitious goal of providing 10,000 work-integrated learning placements for Canadian post-secondary students and graduates each year—up from the current level of around 3,750 placements. Budget 2017 proposes to provide $221 million [emphasis mine] over five years, starting in 2017–18, to achieve this goal and provide relevant work experience to Canadian students.

As well, the budget item for the Pan-Canadian Artificial Intelligence Strategy is $125M.

Moving on from Kondro’s précis, the budget (in the “Positioning National Research Council Canada Within the Innovation and Skills Plan” subsection) announces support for these specific areas of science,

Stem Cell Research

The Stem Cell Network, established in 2001, is a national not-for-profit organization that helps translate stem cell research into clinical applications, commercial products and public policy. Its research holds great promise, offering the potential for new therapies and medical treatments for respiratory and heart diseases, cancer, diabetes, spinal cord injury, multiple sclerosis, Crohn’s disease, auto-immune disorders and Parkinson’s disease. To support this important work, Budget 2017 proposes to provide the Stem Cell Network with renewed funding of $6 million in 2018–19.

Space Exploration

Canada has a long and proud history as a space-faring nation. As our international partners prepare to chart new missions, Budget 2017 proposes investments that will underscore Canada’s commitment to innovation and leadership in space. Budget 2017 proposes to provide $80.9 million on a cash basis over five years, starting in 2017–18, for new projects through the Canadian Space Agency that will demonstrate and utilize Canadian innovations in space, including in the field of quantum technology as well as for Mars surface observation. The latter project will enable Canada to join the National Aeronautics and Space Administration’s (NASA’s) next Mars Orbiter Mission.

Quantum Information

The development of new quantum technologies has the potential to transform markets, create new industries and produce leading-edge jobs. The Institute for Quantum Computing is a world-leading Canadian research facility that furthers our understanding of these innovative technologies. Budget 2017 proposes to provide the Institute with renewed funding of $10 million over two years, starting in 2017–18.

Social Innovation

Through community-college partnerships, the Community and College Social Innovation Fund fosters positive social outcomes, such as the integration of vulnerable populations into Canadian communities. Following the success of this pilot program, Budget 2017 proposes to invest $10 million over two years, starting in 2017–18, to continue this work.

International Research Collaborations

The Canadian Institute for Advanced Research (CIFAR) connects Canadian researchers with collaborative research networks led by eminent Canadian and international researchers on topics that touch all humanity. Past collaborations facilitated by CIFAR are credited with fostering Canada’s leadership in artificial intelligence and deep learning. Budget 2017 proposes to provide renewed and enhanced funding of $35 million over five years, starting in 2017–18.

Earlier this week, I highlighted Canada’s strength in the field of regenerative medicine, specifically stem cells, in a March 21, 2017 posting. The $6M in the current budget doesn’t look like increased funding but rather a one-year extension. I’m sure they’re happy to receive it, but I imagine it’s a little hard to plan major research projects when you’re not sure how long your funding will last.

As for Canadian leadership in artificial intelligence, that was news to me. Here’s more from the budget,

Canada a Pioneer in Deep Learning in Machines and Brains

CIFAR’s Learning in Machines & Brains program has shaken up the field of artificial intelligence by pioneering a technique called “deep learning,” a computer technique inspired by the human brain and neural networks, which is now routinely used by the likes of Google and Facebook. The program brings together computer scientists, biologists, neuroscientists, psychologists and others, and the result is rich collaborations that have propelled artificial intelligence research forward. The program is co-directed by one of Canada’s foremost experts in artificial intelligence, the Université de Montréal’s Yoshua Bengio, and for his many contributions to the program, the University of Toronto’s Geoffrey Hinton, another Canadian leader in this field, was awarded the title of Distinguished Fellow by CIFAR in 2014.

Meanwhile, from chapter 1 of the budget in the subsection titled “Preparing for the Digital Economy,” there is this provision for children,

Providing educational opportunities for digital skills development to Canadian girls and boys—from kindergarten to grade 12—will give them the head start they need to find and keep good, well-paying, in-demand jobs. To help provide coding and digital skills education to more young Canadians, the Government intends to launch a competitive process through which digital skills training organizations can apply for funding. Budget 2017 proposes to provide $50 million over two years, starting in 2017–18, to support these teaching initiatives.

I wonder if BC Premier Christy Clark is heaving a sigh of relief. At the 2016 #BCTECH Summit, she announced that students in BC would learn to code at school and in newly enhanced coding camp programmes (see my Jan. 19, 2016 posting). Interestingly, there was no mention of additional funding to support her initiative. I guess this money from the federal government comes at a good time as we will have a provincial election later this spring where she can announce the initiative again and, this time, mention there’s money for it.

Attracting brains from afar

Ivan Semeniuk in his March 23, 2017 article (for the Globe and Mail) reads between the lines to analyze the budget’s possible impact on Canadian science,

But a between-the-lines reading of the budget document suggests the government also has another audience in mind: uneasy scientists from the United States and Britain.

The federal government showed its hand at the 2017 #BCTECH Summit. From a March 16, 2017 article by Meera Bains for CBC News online,

At the B.C. tech summit, Navdeep Bains, Canada’s minister of innovation, said the government will act quickly to fast track work permits to attract highly skilled talent from other countries.

“We’re taking the processing time, which takes months, and reducing it to two weeks for immigration processing for individuals [who] need to come here to help companies grow and scale up,” Bains said.

“So this is a big deal. It’s a game changer.”

That change will happen through the Global Talent Stream, a new program under the federal government’s temporary foreign worker program.  It’s scheduled to begin on June 12, 2017.

U.S. companies are taking notice and a Canadian firm, True North, is offering to help them set up shop.

“What we suggest is that they think about moving their operations, or at least a chunk of their operations, to Vancouver, set up a Canadian subsidiary,” said the company’s founder, Michael Tippett.

“And that subsidiary would be able to house and accommodate those employees.”

Industry experts say while the future is unclear for the tech sector in the U.S., it’s clear high tech in B.C. is gearing up to take advantage.

US businesses’ attempts to take advantage of Canada’s relative stability and openness to immigration would seem to be the motive for at least one cross-border initiative, the Cascadia Urban Analytics Cooperative. From my Feb. 28, 2017 posting,

There was some big news about the smallest version of the Cascadia region on Thursday, Feb. 23, 2017 when the University of British Columbia (UBC), the University of Washington (state; UW), and Microsoft announced the launch of the Cascadia Urban Analytics Cooperative. From the joint Feb. 23, 2017 news release (read on the UBC website or read on the UW website),

In an expansion of regional cooperation, the University of British Columbia and the University of Washington today announced the establishment of the Cascadia Urban Analytics Cooperative to use data to help cities and communities address challenges from traffic to homelessness. The largest industry-funded research partnership between UBC and the UW, the collaborative will bring faculty, students and community stakeholders together to solve problems, and is made possible thanks to a $1-million gift from Microsoft.

Today’s announcement follows last September’s [2016] Emerging Cascadia Innovation Corridor Conference in Vancouver, B.C. The forum brought together regional leaders for the first time to identify concrete opportunities for partnerships in education, transportation, university research, human capital and other areas.

A Boston Consulting Group study unveiled at the conference showed the region between Seattle and Vancouver has “high potential to cultivate an innovation corridor” that competes on an international scale, but only if regional leaders work together. The study says that could be possible through sustained collaboration aided by an educated and skilled workforce, a vibrant network of research universities and a dynamic policy environment.

It gets better, it seems Microsoft has been positioning itself for a while if Matt Day’s analysis is correct (from my Feb. 28, 2017 posting),

Matt Day in a Feb. 23, 2017 article for The Seattle Times provides additional perspective (Note: Links have been removed),

Microsoft’s effort to nudge Seattle and Vancouver, B.C., a bit closer together got an endorsement Thursday [Feb. 23, 2017] from the leading university in each city.

The partnership has its roots in a September [2016] conference in Vancouver organized by Microsoft’s public affairs and lobbying unit [emphasis mine]. That gathering was aimed at tying business, government and educational institutions in Microsoft’s home region in the Seattle area closer to its Canadian neighbor.

Microsoft last year [2016] opened an expanded office in downtown Vancouver with space for 750 employees, an outpost partly designed to draw to the Northwest more engineers than the company can get through the U.S. guest worker system [emphasis mine].

This was all prior to President Trump’s legislative moves in the US, which have at least one Canadian observer a little more gleeful than I’m comfortable with. From a March 21, 2017 article by Susan Lum  for CBC News online,

U.S. President Donald Trump’s efforts to limit travel into his country while simultaneously cutting money from science-based programs provides an opportunity for Canada’s science sector, says a leading Canadian researcher.

“This is Canada’s moment. I think it’s a time we should be bold,” said Alan Bernstein, president of CIFAR [which on March 22, 2017 was awarded $125M in the Canadian federal budget announcement to launch the Pan-Canadian Artificial Intelligence Strategy], a global research network that funds hundreds of scientists in 16 countries.

Bernstein believes there are many reasons why Canada has become increasingly attractive to scientists around the world, including the political climate in the United States and the Trump administration’s travel bans.

Thankfully, Bernstein calms down a bit,

“It used to be if you were a bright young person anywhere in the world, you would want to go to Harvard or Berkeley or Stanford, or what have you. Now I think you should give pause to that,” he said. “We have pretty good universities here [emphasis mine]. We speak English. We’re a welcoming society for immigrants.”​

Bernstein cautions that Canada should not be seen to be poaching scientists from the United States — but there is an opportunity.

“It’s as if we’ve been in a choir of an opera in the back of the stage and all of a sudden the stars all left the stage. And the audience is expecting us to sing an aria. So we should sing,” Bernstein said.

Bernstein said the federal government, with this week’s so-called innovation budget, can help Canada hit the right notes.

“Innovation is built on fundamental science, so I’m looking to see if the government is willing to support, in a big way, fundamental science in the country.”

Pretty good universities, eh? Thank you, Dr. Bernstein, for keeping some of the boosterism in check. Let’s leave the chest thumping to President Trump and his cronies.

Ivan Semeniuk’s March 23, 2017 article (for the Globe and Mail) provides more details about the situation in the US and in Britain,

Last week, Donald Trump’s first budget request made clear the U.S. President would significantly reduce or entirely eliminate research funding in areas such as climate science and renewable energy if permitted by Congress. Even the National Institutes of Health, which spearheads medical research in the United States and is historically supported across party lines, was unexpectedly targeted for a $6-billion (U.S.) cut that the White House said could be achieved through “efficiencies.”

In Britain, a recent survey found that 42 per cent of academics were considering leaving the country over worries about a less welcoming environment and the loss of research money that a split with the European Union is expected to bring.

In contrast, Canada’s upbeat language about science in the budget makes a not-so-subtle pitch for diversity and talent from abroad, including $117.6-million to establish 25 research chairs with the aim of attracting “top-tier international scholars.”

For good measure, the budget also includes funding for science promotion and $2-million annually for Canada’s yet-to-be-hired Chief Science Advisor, whose duties will include ensuring that government researchers can speak freely about their work.

“What we’ve been hearing over the last few months is that Canada is seen as a beacon, for its openness and for its commitment to science,” said Ms. Duncan [Kirsty Duncan, Minister of Science], who did not refer directly to either the United States or Britain in her comments.

Providing a less optimistic note, Erica Alini in her March 22, 2017 online article for Global News mentions a perennial problem, the Canadian brain drain,

The budget includes a slew of proposed reforms and boosted funding for existing training programs, as well as new skills-development resources for unemployed and underemployed Canadians not covered under current EI-funded programs.

There are initiatives to help women and indigenous people get degrees or training in science, technology, engineering and mathematics (the so-called STEM subjects) and even to teach kids as young as kindergarten-age to code.

But there was no mention of how to make sure Canadians with the right skills remain in Canada, TD Economics’ DePratto [TD is the Toronto-Dominion Bank, which is currently experiencing a scandal; see the March 13, 2017 Huffington Post news item] told Global News.

Canada ranks in the middle of the pack compared to other advanced economies when it comes to its share of its graduates in STEM fields, but the U.S. doesn’t shine either, said DePratto [Brian DePratto, senior economist at TD].

The key difference between Canada and the U.S. is the ability to retain domestic talent and attract brains from all over the world, he noted.

To be blunt, there may be some opportunities for Canadian science but we would do well to remember (a) US businesses have no particular loyalty to Canada and (b) all it takes is an election to change any perceived advantages into disadvantages.

Digital policy and intellectual property issues

Dubbed by some as the ‘innovation’ budget (official title: Building a Strong Middle Class), there is an attempt to address a longstanding innovation issue, from a March 22, 2017 posting by Michael Geist on his eponymous blog (Note: Links have been removed),

The release of today’s [March 22, 2017] federal budget is expected to include a significant emphasis on innovation, with the government revealing how it plans to spend (or re-allocate) hundreds of millions of dollars that is intended to support innovation. Canada’s dismal innovation record needs attention, but spending our way to a more innovative economy is unlikely to yield the desired results. While Navdeep Bains, the Innovation, Science and Economic Development Minister, has talked for months about the importance of innovation, Toronto Star columnist Paul Wells today delivers a cutting but accurate assessment of those efforts:

“This government is the first with a minister for innovation! He’s Navdeep Bains. He frequently posts photos of his meetings on Twitter, with the hashtag “#innovation.” That’s how you know there is innovation going on. A year and a half after he became the minister for #innovation, it’s not clear what Bains’s plans are. It’s pretty clear that within the government he has less than complete control over #innovation. There’s an advisory council on economic growth, chaired by the McKinsey guru Dominic Barton, which periodically reports to the government urging more #innovation.

There’s a science advisory panel, chaired by former University of Toronto president David Naylor, that delivered a report to Science Minister Kirsty Duncan more than three months ago. That report has vanished. One presumes that’s because it offered some advice. Whatever Bains proposes, it will have company.”

Wells is right. Bains has been very visible with plenty of meetings and public photo shoots but no obvious innovation policy direction. This represents a missed opportunity since Bains has plenty of policy tools at his disposal that could advance Canada’s innovation framework without focusing on government spending.

For example, Canada’s communications system – wireless and broadband Internet access – falls directly within his portfolio and is crucial for both business and consumers. Yet Bains has been largely missing in action on the file. He gave approval for the Bell – MTS merger that virtually everyone concedes will increase prices in the province and make the communications market less competitive. There are potential policy measures that could bring new competitors into the market (MVNOs [mobile virtual network operators] and municipal broadband) and that could make it easier for consumers to switch providers (ban on unlocking devices). Some of this falls to the CRTC, but government direction and emphasis would make a difference.

Even more troubling has been his near total invisibility on issues relating to new fees or taxes on Internet access and digital services. Canadian Heritage Minister Mélanie Joly has taken control of the issue with the possibility that Canadians could face increased costs for their Internet access or digital services through mandatory fees to contribute to Canadian content.  Leaving aside the policy objections to such an approach (reducing affordable access and the fact that foreign sources now contribute more toward Canadian English language TV production than Canadian broadcasters and distributors), Internet access and e-commerce are supposed to be Bains’ issue and they have a direct connection to the innovation file. How is it possible for the Innovation, Science and Economic Development Minister to have remained silent for months on the issue?

Bains has been largely missing on trade related innovation issues as well. My Globe and Mail column today focuses on a digital-era NAFTA, pointing to likely U.S. demands on data localization, data transfers, e-commerce rules, and net neutrality.  These are all issues that fall under Bains’ portfolio and will impact investment in Canadian networks and digital services. There are innovation opportunities for Canada here, but Bains has been content to leave the policy issues to others, who will be willing to sacrifice potential gains in those areas.

Intellectual property policy is yet another area that falls directly under Bains’ mandate with an obvious link to innovation, but he has done little on the file. Canada won a huge NAFTA victory late last week involving the Canadian patent system, which was challenged by pharmaceutical giant Eli Lilly. Why has Bains not promoted the decision as an affirmation of Canada’s intellectual property rules?

On the copyright front, the government is scheduled to conduct a review of the Copyright Act later this year, but it is not clear whether Bains will take the lead or again cede responsibility to Joly. The Copyright Act is statutorily under the Industry Minister and reform offers the chance to kickstart innovation. …

For anyone who’s not familiar with this area, innovation is often code for commercialization of science and technology research efforts. These days, digital service and access policies and intellectual property policies are all key to research and innovation efforts.

The country that’s most often (except in mainstream Canadian news media) held up as an example of leadership in innovation is Estonia. The Economist profiled the country in a July 31, 2013 article and a July 7, 2016 article on apolitical.co provides an update.

Conclusions

Science monies for the tri-council science funding agencies (NSERC, SSHRC, and CIHR) are more or less flat, but there were a number of line items in the federal budget which qualify as science funding. The $221M over five years for Mitacs, the $125M for the Pan-Canadian Artificial Intelligence Strategy, additional funding for the Canada Research Chairs, and some of the digital funding could also be included as part of the overall haul. This is in line with the former government’s (Stephen Harper’s Conservatives) penchant for keeping the tri-council’s budgets under control while spreading largesse elsewhere (notably the Perimeter Institute, TRIUMF [Canada’s National Laboratory for Particle and Nuclear Physics], and, in the 2015 budget, $243.5-million towards the Thirty Metre Telescope (TMT) — a massive astronomical observatory to be constructed on the summit of Mauna Kea, Hawaii, a $1.5-billion project). This has led to some hard feelings in the past with regard to ‘big science’ projects getting what some have felt is an undeserved boost in finances while the ‘small fish’ are left scrabbling for the ever-diminishing (due to past budget cuts and inflation) pittances available from the tri-council agencies.

Mitacs, which started life as a federally funded Network of Centres of Excellence focused on mathematics, has since shifted focus to become an innovation ‘champion’. You can find Mitacs here and you can find the organization’s March 2016 budget submission to the House of Commons Standing Committee on Finance here. At the time, they did not request a specific amount of money; they just asked for more.

The amount Mitacs expects to receive this year is over $40M, which represents more than double what they received from the federal government and almost half of their total income in the 2015-16 fiscal year, according to their 2015-16 annual report (see p. 327 for the Mitacs Statement of Operations to March 31, 2016). In fact, the federal government forked over $39,900,189 in the 2015-16 fiscal year, making it Mitacs’ largest supporter, while Mitacs’ total income (receipts) was $81,993,390.

It’s a strange thing, but too much money can be as bad as too little. I wish the folks at Mitacs nothing but good luck with their windfall.

I don’t see anything in the budget that encourages innovation and investment from the industrial sector in Canada.

Finally, innovation is a cultural issue as much as it is a financial one. Having worked with a number of developers and start-up companies, I’d note the most popular business model is to develop a successful business that will be acquired by a large enterprise, thereby allowing the entrepreneurs to retire before the age of 30 (or 40 at the latest). I don’t see anything from the government acknowledging that problem, let alone any attempt to tackle it.

All in all, it was a decent budget with nothing in it to seriously offend anyone.

Brown recluse spider, one of the world’s most venomous spiders, shows off unique spinning technique

Caption: American Brown Recluse Spider is pictured. Credit: Oxford University

According to scientists from Oxford University, this deadly spider could teach us a thing or two about strength. From a Feb. 15, 2017 news item on ScienceDaily,

Brown recluse spiders use a unique micro looping technique to make their threads stronger than that of any other spider, a newly published UK-US collaboration has discovered.

One of the most feared and venomous arachnids in the world, the American brown recluse spider has long been known for its signature necro-toxic venom, as well as its unusual silk. Now, new research offers an explanation for how the spider is able to make its silk uncommonly strong.

Researchers suggest that if applied to synthetic materials, the technique could inspire scientific developments and improve impact absorbing structures used in space travel.

The study, published in the journal Materials Horizons, was produced by scientists from Oxford University’s Department of Zoology, together with a team from the Applied Science Department at Virginia’s College of William & Mary. Their surveillance of the brown recluse spider’s spinning behaviour shows how, and to what extent, the spider manages to strengthen the silk it makes.

A Feb. 15, 2017 University of Oxford press release, which originated the news item, provides more detail about the research,

From observing the arachnid, the team discovered that unlike other spiders, who produce round ribbons of thread, recluse silk is thin and flat. This structural difference is key to the thread’s strength, providing the flexibility needed to prevent premature breakage and withstand the knots created during spinning which give each strand additional strength.

Professor Hannes Schniepp from William & Mary explains: “The theory of knots adding strength is well proven. But adding loops to synthetic filaments always seems to lead to premature fibre failure. Observation of the recluse spider provided the breakthrough solution; unlike all spiders its silk is not round, but a thin, nano-scale flat ribbon. The ribbon shape adds the flexibility needed to prevent premature failure, so that all the microloops can provide additional strength to the strand.”

By using computer simulations to apply this technique to synthetic fibres, the team were able to test and prove that adding even a single loop significantly enhances the strength of the material.

William & Mary PhD student Sean Koebley adds: “We were able to prove that adding even a single loop significantly enhances the toughness of a simple synthetic sticky tape. Our observations open the door to new fibre technology inspired by the brown recluse.”

Speaking on how the recluse’s technique could be applied more broadly in the future, Professor Fritz Vollrath, of the Department of Zoology at Oxford University, expands: “Computer simulations demonstrate that fibres with many loops would be much, much tougher than those without loops. This right away suggests possible applications. For example, carbon filaments could be looped to make them less brittle, and thus allow their use in novel impact absorbing structures. One example would be spider-like webs of carbon-filaments floating in outer space, to capture the drifting space debris that endangers astronauts’ lives and satellite integrity.”
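Out of curiosity, here is that looping argument reduced to a toy calculation in Python (my own sketch with invented numbers, not the team’s published model): if each micro loop behaves as a sacrificial element that dissipates some adhesion energy as it unpeels before the thread’s final failure, then toughness, the total energy absorbed, grows with every loop added,

# Toy model, not the published simulation: each loop unpeels before
# final failure and dissipates a fixed adhesion energy. All units and
# both energy values are arbitrary, invented for illustration.

def toughness(n_loops, elastic_energy=1.0, loop_energy=0.4):
    # Energy absorbed before failure: the thread's final elastic
    # failure plus the energy dissipated opening each loop.
    return elastic_energy + n_loops * loop_energy

for n in (0, 1, 5, 20):
    gain = toughness(n) / toughness(0)
    print(f"{n:2d} loops -> {gain:.1f}x the toughness of a loop-free thread")

Even this crude model shows why a single loop already helps and why many loops could help a great deal, provided, as the flat ribbon geometry allows, the loops do not cause premature failure.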

Here’s a link to and a citation for the paper,

Toughness-enhancing metastructure in the recluse spider’s looped ribbon silk by S. R. Koebley, F. Vollrath, and H. C. Schniepp. Mater. Horiz., 2017, Advance Article. DOI: 10.1039/C6MH00473C. First published online 15 Feb 2017.

This paper is open access although you may need to register with the Royal Society of Chemistry’s publishing site to get access.

Why do objects feel solid when atoms are mostly empty space?

Roger Barlow (professor at University of Huddersfield, UK) has written a Feb. 16, 2017 essay for The Conversation explaining why objects feel solid (Note: A link has been removed),

Chemist John Dalton proposed the theory that all matter and objects are made up of particles called atoms, and this is still accepted by the scientific community, almost two centuries later. Each of these atoms is made up of an incredibly small nucleus and even smaller electrons, which move around at quite a distance from the centre.

If you imagine a table that is a billion times larger, its atoms would be the size of melons. But even so, the nucleus at the centre would still be far too small to see and so would the electrons as they dance around it. So why don’t our fingers just pass through atoms, and why doesn’t light get through the gaps?

To explain why we must look at the electrons. Unfortunately, much of what we are taught at school is simplified – electrons do not orbit the centre of an atom like planets around the sun, like you may have been taught. Instead, think of electrons like a swarm of bees or birds, where the individual motions are too fast to track, but you still see the shape of the overall swarm.

In fact, electrons dance – there is no better word for it. …

Electrons are like a swarm of birds. John Holmes/Wikimedia Commons, CC BY-SA

Here’s one more excerpt from Barlow’s essay,

So why does a table also feel solid? Many websites will tell you that this is due to the repulsion – that two negatively charged things must repel each other. But this is wrong, and shows you should never trust some things on the internet. It feels solid because of the dancing electrons.
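Before you go, here is a quick back-of-the-envelope check of the ‘billion times larger’ analogy from the first excerpt (the sizes below are my own rough orders of magnitude, not figures from the essay),

# Rough order-of-magnitude check of the 'billion times larger' analogy.
ATOM_DIAMETER = 1e-10     # metres: a typical atom
NUCLEUS_DIAMETER = 1e-14  # metres: roughly ten femtometres
SCALE = 1e9               # 'a billion times larger'

print(f"scaled atom:    {ATOM_DIAMETER * SCALE:.1f} m (about melon-sized)")
print(f"scaled nucleus: {NUCLEUS_DIAMETER * SCALE * 1e6:.0f} micrometres (far too small to see)")

The numbers bear Barlow out: the scaled-up atom is about 0.1 metres across while its nucleus is a mere 10 micrometres, well below what the eye can resolve.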

Do enjoy!

Algorithms in decision-making: a government inquiry in the UK

Yesterday’s (Feb. 28, 2017) posting about the newly launched Cascadia Urban Analytics Cooperative grew too big to include interesting tidbits such as this one from Sense about Science (from a Feb. 28, 2017 announcement received via email),

The House of Commons science and technology select committee announced today that it will launch an inquiry into the use of algorithms in decision-making […].

Our campaigns and policy officer Dr Stephanie Mathisen brought this important and under-scrutinised issue to the committee as part of their #MyScienceInquiry initiative; so fantastic news that they are taking up the call.

A Feb. 28, 2017 UK House of Commons Science and Technology Select Committee press release gives more details about the inquiry,

The Science and Technology Committee is launching a new inquiry into the use of algorithms in public and business decision making.

In an increasingly digital world, algorithms are being used to make decisions in a growing range of contexts. From decisions about offering mortgages and credit cards to sifting job applications and sentencing criminals, the impact of algorithms is far reaching.

How an algorithm is formulated, its scope for error or correction, the impact it may have on an individual—and their ability to understand or challenge that decision—are increasingly relevant questions.

This topic was pitched to the Committee by Dr Stephanie Mathisen (Sense about Science) through the Committee’s ‘My Science Inquiry’ open call for inquiry suggestions, and has been chosen as the first subject for the Committee’s attention following that process. It follows the Committee’s recent work on Robotics and AI, and its call for a standing Commission on Artificial Intelligence.

Submit written evidence

The Committee would welcome written submissions by Friday 21 April 2017 on the following points:

  • The extent of current and future use of algorithms in decision-making in Government and public bodies, businesses and others, and the corresponding risks and opportunities;
  • Whether ‘good practice’ in algorithmic decision-making can be identified and spread, including in terms of:
    — The scope for algorithmic decision-making to eliminate, introduce or amplify biases or discrimination, and how any such bias can be detected and overcome;
    — Whether and how algorithmic decision-making can be conducted in a ‘transparent’ or ‘accountable’ way, and the scope for decisions made by an algorithm to be fully understood and challenged;
    — The implications of increased transparency in terms of copyright and commercial sensitivity, and protection of an individual’s data;
  • Methods for providing regulatory oversight of algorithmic decision-making, such as the rights described in the EU General Data Protection Regulation 2016.

The Committee would welcome views on the issues above, and submissions that illustrate how the issues vary by context through case studies of the use of algorithmic decision-making.

You can submit written evidence through the algorithms in decision-making inquiry page.

I looked at the submission form and while it assumes the submitter is from the UK, there doesn’t seem to be any impediment to citizens of other countries making a submission. Since some personal information is included as part of the submission, there is a note about data protection on the Guidance on giving evidence to a Select Committee of the House of Commons webpage.
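For anyone wondering what scrutinising an algorithm for bias might look like at its simplest, here is a toy sketch in Python (the data are entirely invented) of one check the committee’s questions point at: comparing an automated decision’s approval rates across two groups, sometimes called a demographic parity check,

# Invented decisions for two groups, A and B; each tuple is
# (group, approved). A real audit would use the algorithm's
# actual outputs and far more data.
decisions = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", False), ("B", True), ("B", False), ("B", False),
]

def approval_rate(group):
    outcomes = [approved for g, approved in decisions if g == group]
    return sum(outcomes) / len(outcomes)

for group in ("A", "B"):
    print(f"group {group}: {approval_rate(group):.0%} approved")

A 75 per cent versus 25 per cent gap like the one above is a flag for further scrutiny, though equal rates alone would not prove an algorithm fair, which is precisely why the committee is asking how bias ‘can be detected and overcome’.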

Big data in the Cascadia region: a University of British Columbia (Canada) and University of Washington (US state) collaboration

Before moving on to the news and for anyone unfamiliar with the concept of the Cascadia region, it is an informally proposed political region or a bioregion, depending on your perspective. Adding to the lack of clarity, the region generally includes the province of British Columbia in Canada and the two US states, Washington and Oregon, but Alaska (another US state) and the Yukon (a Canadian territory) may also be included, as well as parts of California, Wyoming, Idaho, and Montana. (You can read more about the Cascadia bioregion here and the proposed political region here.) While it sounds as if more of the US is part of the ‘Cascadia region’, British Columbia and the Yukon cover considerably more territory than all of the mentioned states combined, if you’re taking a landmass perspective.

Cascadia Urban Analytics Cooperative

There was some big news about the smallest version of the Cascadia region on Thursday, Feb. 23, 2017 when the University of British Columbia (UBC), the University of Washington (state; UW), and Microsoft announced the launch of the Cascadia Urban Analytics Cooperative. From the joint Feb. 23, 2017 news release (read on the UBC website or read on the UW website),

In an expansion of regional cooperation, the University of British Columbia and the University of Washington today announced the establishment of the Cascadia Urban Analytics Cooperative to use data to help cities and communities address challenges from traffic to homelessness. The largest industry-funded research partnership between UBC and the UW, the collaborative will bring faculty, students and community stakeholders together to solve problems, and is made possible thanks to a $1-million gift from Microsoft.

“Thanks to this generous gift from Microsoft, our two universities are poised to help transform the Cascadia region into a technological hub comparable to Silicon Valley and Boston,” said Professor Santa J. Ono, President of the University of British Columbia. “This new partnership transcends borders and strives to unleash our collective brain power, to bring about economic growth that enriches the lives of Canadians and Americans as well as urban communities throughout the world.”

“We have an unprecedented opportunity to use data to help our communities make decisions, and as a result improve people’s lives and well-being. That commitment to the public good is at the core of the mission of our two universities, and we’re grateful to Microsoft for making a community-minded contribution that will spark a range of collaborations,” said UW President Ana Mari Cauce.

Today’s announcement follows last September’s [2016] Emerging Cascadia Innovation Corridor Conference in Vancouver, B.C. The forum brought together regional leaders for the first time to identify concrete opportunities for partnerships in education, transportation, university research, human capital and other areas.

A Boston Consulting Group study unveiled at the conference showed the region between Seattle and Vancouver has “high potential to cultivate an innovation corridor” that competes on an international scale, but only if regional leaders work together. The study says that could be possible through sustained collaboration aided by an educated and skilled workforce, a vibrant network of research universities and a dynamic policy environment.

Microsoft President Brad Smith, who helped convene the conference, said, “We believe that joint research based on data science can help unlock new solutions for some of the most pressing issues in both Vancouver and Seattle. But our goal is bigger than this one-time gift. We hope this investment will serve as a catalyst for broader and more sustainable efforts between these two institutions.”

As part of the Emerging Cascadia conference, British Columbia Premier Christy Clark and Washington Governor Jay Inslee signed a formal agreement that committed the two governments to work closely together to “enhance meaningful and results-driven innovation and collaboration.”  The agreement outlined steps the two governments will take to collaborate in several key areas including research and education.

“Increasingly, tech is not just another standalone sector of the economy, but fully integrated into everything from transportation to social work,” said Premier Clark. “That’s why we’ve invested in B.C.’s thriving tech sector, but committed to working with our neighbours in Washington – and we’re already seeing the results.”

“This data-driven collaboration among some of our smartest and most creative thought-leaders will help us tackle a host of urgent issues,” Gov. Inslee said. “I’m encouraged to see our partnership with British Columbia spurring such interesting cross-border dialogue and excited to see what our students and researchers come up with.”

The Cascadia Urban Analytics Cooperative will revolve around four main programs:

  • The Cascadia Data Science for Social Good (DSSG) Summer Program, which builds on the success of the DSSG program at the UW eScience Institute. The cooperative will coordinate a joint summer program for students across UW and UBC campuses where they work with faculty to create and incubate data-intensive research projects that have concrete benefits for urban communities. One past DSSG project analyzed data from Seattle’s regional transportation system – ORCA – to improve its effectiveness, particularly for low-income transit riders. Another project sought to improve food safety by text mining product reviews to identify unsafe products.
  • Cascadia Data Science for Social Good Scholar Symposium, which will foster innovation and collaboration by bringing together scholars from UBC and the UW involved in projects utilizing technology to advance the social good. The first symposium will be hosted at UW in 2017.
  • Sustained Research Partnerships designed to establish the Pacific Northwest as a center of expertise and activity in urban analytics. The cooperative will support sustained research partnerships between UW and UBC researchers, providing technical expertise, stakeholder engagement and seed funding.
  • Responsible Data Management Systems and Services to ensure data integrity, security and usability. The cooperative will develop new software, systems and services to facilitate data management and analysis, as well as ensure projects adhere to best practices in fairness, accountability and transparency.

At UW, the Cascadia Urban Analytics Collaborative will be overseen by Urbanalytics (urbanalytics.uw.edu), a new research unit in the Information School focused on responsible urban data science. The Collaborative builds on previous investments in data-intensive science through the UW eScience Institute (escience.washington.edu) and investments in urban scholarship through Urban@UW (urban.uw.edu), and also aligns with the UW’s Population Health Initiative (uw.edu/populationhealth) that is addressing the most persistent and emerging challenges in human health, environmental resiliency and social and economic equity. The gift counts toward the UW’s Be Boundless – For Washington, For the World campaign (uw.edu/boundless).

The Collaborative also aligns with the UBC Sustainability Initiative (sustain.ubc.ca) that fosters partnerships beyond traditional boundaries of disciplines, sectors and geographies to address critical issues of our time, as well as the UBC Data Science Institute (dsi.ubc.ca), which aims to advance data science research to address complex problems across domains, including health, science and arts.

Brad Smith, President and Chief Legal Officer of Microsoft, wrote about the joint centre in a Feb. 23, 2017 posting on the Microsoft on the Issues blog (Note: Links have been removed),

The cities of Vancouver and Seattle share many strengths: a long history of innovation, world-class universities and a region rich in cultural and ethnic diversity. While both cities have achieved great success on their own, leaders from both sides of the border realize that tighter partnership and collaboration, through the creation of a Cascadia Innovation Corridor, will expand economic opportunity and prosperity well beyond what each community can achieve separately.

Microsoft supports this vision and today is making a $1 million investment in the Cascadia Urban Analytics Cooperative (CUAC), which is a new joint effort by the University of British Columbia (UBC) and the University of Washington (UW).  It will use data to help local cities and communities address challenges from traffic to homelessness and will be the region’s single largest university-based, industry-funded joint research project. While we recognize the crucial role that universities play in building great companies in the Pacific Northwest, whether it be in computing, life sciences, aerospace or interactive entertainment, we also know research, particularly data science, holds the key to solving some of Vancouver and Seattle’s most pressing issues. This grant will advance this work.

An Oct. 21, 2016 article by Hana Golightly for the Ubyssey newspaper provides a little more detail about the province/state agreement mentioned in the joint UBC/UW news release,

An agreement between BC Premier Christy Clark and Washington Governor Jay Inslee means UBC will be collaborating with the University of Washington (UW) more in the future.

At last month’s [Sept. 2016] Cascadia Conference, Clark and Inslee signed a Memorandum of Understanding with the goal of fostering the growth of the technology sector in both regions. Officially referred to as the Cascadia Innovation Corridor, this partnership aims to reduce boundaries across the region — economic and otherwise.

While the memorandum provides broad goals and is not legally binding, it sets a precedent of collaboration between businesses, governments and universities, encouraging projects that span both jurisdictions. Aiming to capitalize on the cultural commonalities of regional centres Seattle and Vancouver, the agreement prioritizes development in life sciences, clean technology, data analytics and high tech.

Metropolitan centres like Seattle and Vancouver have experienced a surge in growth that sees planners envisioning them as the next Silicon Valleys. Premier Clark and Governor Inslee want to strengthen the ability of their jurisdictions to compete in innovation on a global scale. Accordingly, the memorandum encourages the exploration of “opportunities to advance research programs in key areas of innovation and future technologies among the region’s major universities and institutes.”

A few more questions about the Cooperative

I had a few more questions about the Feb. 23, 2017 announcement, for which (from UBC) Gail C. Murphy, PhD, FRSC, Associate Vice President Research pro tem and Professor of Computer Science at UBC, and (from UW) Bill Howe, Associate Professor, Information School, Adjunct Associate Professor, Computer Science & Engineering, Associate Director and Senior Data Science Fellow, UW eScience Institute, and Program Director and Faculty Chair, UW Data Science Masters Degree, have kindly provided answers (Gail Murphy’s replies are prefaced with [GM] and one indent and Bill Howe’s replies are prefaced with [BH] and two indents),

  • Do you have any projects currently underway? e.g. I see a summer programme is planned. Will there be one in summer 2017? What focus will it have?

[GM] UW and UBC will each be running the Data Science for Social Good program in the summer of 2017. UBC’s announcement of the program is available at: http://dsi.ubc.ca/data-science-social-good-dssg-fellowships

  • Is the $1M from Microsoft going to be given in cash or as ‘in kind goods’ or some combination?

[GM] The $1-million donation is in cash. Microsoft organized the Emerging Cascadia Innovation Corridor Conference in September 2016. It was at the conference that the idea for the partnership was hatched. Through this initiative, UBC and UW will continue to engage with Microsoft to further shared goals in promoting evidence-based innovation to improve life for people in the Cascadia region and beyond.

  • How will the money or goods be disbursed? e.g. Will each institution get 1/2 or is there some sort of joint account?

[GM] The institutions are sharing the funds but will be separately administering the funds they receive.

  • Is data going to be crossing borders? e.g. You mentioned some health care projects. In that case, will data from BC residents be accessed and subject to US rules and regulations? Will BC residents know that their data is being accessed by a 3rd party? What level of consent is required?

[GM] As you point out, there are many issues involved with transferring data across the border. Any projects involving private data will adhere to local laws and ethical frameworks set out by the institutions.

  • Privacy rules vary greatly between the US and Canada. How is that being addressed in this proposed new research?

[No Reply]

  • Will new software and other products be created and who will own them?

[GM] It is too soon for us to comment on whether new software or other products will be created. Any creation of software or other products within the institutions will be governed by institutional policy.

  • Will the research be made freely available?

[GM] UBC researchers must be able to publish the results of research as set out by UBC policy.

[BH] Research output at UW will be made available according to UW policy, but I’ll point out that Microsoft has long been a fantastic partner in advancing our efforts in open and reproducible science, open source software, and open access publishing. 

 UW's discussion on open access policies is available online.

  • What percentage of public funds will be used to enable this project? Will the province of BC and the state of Washington be splitting the costs evenly?

[GM] It is too soon for us to report on specific percentages. At UBC, we will be looking to partner with appropriate funding agencies to support more research with this donation. Applications to funding agencies will involve review of any proposals as per the rules of the funding agency.

  • Will there be any social science and/or ethics component to this collaboration? The press conference referenced data science only.

[GM] We expect, but cannot yet confirm, that some of the projects will involve collaborations with faculty from a broad range of research areas at UBC.

[BH] We are indeed placing a strong emphasis on the intersection between data science, the social sciences, and data ethics.  As examples of activities in this space around UW:

* The Information School at UW (my home school) is actively recruiting a new faculty candidate in data ethics this year.

* The Education Working Group at the eScience Institute has created a new campus-wide Data & Society seminar course.

* The Center for Statistics in the Social Sciences (CSSS), which represents the marriage of data science and the social sciences, has been a long-term partner in our activities.

More specifically for this collaboration, we are collecting requirements for new software that emphasizes responsible data science: properly managing sensitive data, combating algorithmic bias, protecting privacy, and more.

Microsoft has been a key partner in this work through their Civic Technology group, for which the Seattle arm is led by Graham Thompson.

  • What impact do you see the new US federal government’s current concerns over borders and immigrants hav[ing] on this project? e.g. Are people whose origins are in Iran, Syria, Yemen, etc. and who are residents of Canada going to be able to participate?

[GM] Students and others eligible to participate in research projects in Canada will be welcomed into the UBC projects. Our hope is that faculty and students working on the Cascadia Urban Analytics Cooperative will be able to exchange ideas freely and move freely back and forth across the border.

  • How will seed funding for 'Sustained Research Partnerships' be disbursed? Will there be a joint committee making these decisions?

[GM] We are in the process of elaborating this part of the program. At UBC, we are already experiencing, enjoying and benefitting from increased interaction with the University of Washington and look forward to elaborating more aspects of the program together as the year unfolds.

I had to make a few formatting changes when transferring the answers from emails to this posting: my numbered questions (1-11) became bulleted points and ‘have’ in what was question 10 was changed to ‘having’. The content for the answers has been untouched.

I'm surprised no one answered the privacy question but perhaps they thought the other answers sufficed. Despite an answer to my question, I still don't understand how the universities are sharing the funds, but that may just mean I'm having a bad day. (Or perhaps the folks at UBC are being overly careful after the scandals that rocked the Vancouver campus over the last 18 months to two years; see Sophie Sutcliffe's Dec. 3, 2015 opinion piece for the Ubyssey for details about the scandals.)

Bill Howe's response about open access (where you can read the journal articles for free) and open source (where you have free access to the software code) was interesting to me, as I once worked for a company where the developers complained long and loud about Microsoft's failure to embrace open source code. Howe's response is particularly interesting given that Microsoft's president is also its chief legal officer, whose portfolio of responsibilities (I imagine) includes patents.

Matt Day in a Feb. 23, 2017 article for The Seattle Times provides additional perspective (Note: Links have been removed),

Microsoft’s effort to nudge Seattle and Vancouver, B.C., a bit closer together got an endorsement Thursday [Feb. 23, 2017] from the leading university in each city.

The University of Washington and the University of British Columbia announced the establishment of a joint data-science research unit, called the Cascadia Urban Analytics Cooperative, funded by a $1 million grant from Microsoft.

The collaboration will support study of shared urban issues, from health to transit to homelessness, drawing on faculty and student input from both universities.

The partnership has its roots in a September [2016] conference in Vancouver organized by Microsoft's public affairs and lobbying unit [emphasis mine]. That gathering was aimed at tying business, government and educational institutions in Microsoft's home region in the Seattle area closer to its Canadian neighbor.

Microsoft last year [2016]* opened an expanded office in downtown Vancouver with space for 750 employees, an outpost partly designed to draw to the Northwest more engineers than the company can get through the U.S. guest worker system [emphasis mine].

There's nothing wrong with a business offering to contribute to the social good, but it does well to remember that a business's primary agenda is not the social good. In this case, 'public affairs and lobbying' looks a lot like governmental affairs: Microsoft has anticipated, for some time, greater difficulty getting workers from all sorts of countries across the US border to work in Washington state, which makes an outpost in British Columbia, and closer relations between the two constituencies, quite advantageous. I wonder what else is on their agenda.

Getting back to UBC and UW, thank you to both Gail Murphy (in particular) and Bill Howe for taking the time to answer my questions. I very much appreciate it as answering 10 questions is a lot of work.

There was one area of interest (cities) that I did not broach with either academic but will mention here.

Cities and their increasing political heft

Clearly Microsoft is focused on urban issues, which would seem to be the 'flavour du jour'. There's a May 31, 2016 piece on the TED website by Robert Muggah and Benjamin Barber titled: 'Why cities rule the world' (there are video talks embedded in the piece),

Cities are the 21st century's dominant form of civilization — and they're where humanity's struggle for survival will take place. Robert Muggah and Benjamin Barber spell out the possibilities.

Half the planet’s population lives in cities. They are the world’s engines, generating four-fifths of the global GDP. There are over 2,100 cities with populations of 250,000 people or more, including a growing number of mega-cities and sprawling, networked-city areas — conurbations, they’re called — with at least 10 million residents. As the economist Ed Glaeser puts it, “we are an urban species.”

But what makes cities so incredibly important is not just population or economics stats. Cities are humanity’s most realistic hope for future democracy to thrive, from the grassroots to the global. This makes them a stark contrast to so many of today’s nations, increasingly paralyzed by polarization, corruption and scandal.

In a less hyperbolic vein, Parag Khanna's April 20, 2016 piece for Quartz describes why he (and others) believe that megacities are where the future lies (Note: A link has been removed),

Cities are mankind’s most enduring and stable mode of social organization, outlasting all empires and nations over which they have presided. Today cities have become the world’s dominant demographic and economic clusters.

As the sociologist Christopher Chase-Dunn has pointed out, it is not population or territorial size that drives world-city status, but economic weight, proximity to zones of growth, political stability, and attractiveness for foreign capital. In other words, connectivity matters more than size. Cities thus deserve more nuanced treatment on our maps than simply as homogeneous black dots.

Within many emerging markets such as Brazil, Turkey, Russia, and Indonesia, the leading commercial hub or financial center accounts for at least one-third or more of national GDP. In the UK, London accounts for almost half Britain’s GDP. And in America, the Boston-New York-Washington corridor and greater Los Angeles together combine for about one-third of America’s GDP.

By 2025, there will be at least 40 such megacities. The population of the greater Mexico City region is larger than that of Australia, as is that of Chongqing, a collection of connected urban enclaves in China spanning an area the size of Austria. Cities that were once hundreds of kilometers apart have now effectively fused into massive urban archipelagos, the largest of which is Japan’s Taiheiyo Belt that encompasses two-thirds of Japan’s population in the Tokyo-Nagoya-Osaka megalopolis.

Great and connected cities, Saskia Sassen argues, belong as much to global networks as to the country of their political geography. Today the world’s top 20 richest cities have forged a super-circuit driven by capital, talent, and services: they are home to more than 75% of the largest companies, which in turn invest in expanding across those cities and adding more to expand the intercity network. Indeed, global cities have forged a league of their own, in many ways as denationalized as Formula One racing teams, drawing talent from around the world and amassing capital to spend on themselves while they compete on the same circuit.

The rise of emerging market megacities as magnets for regional wealth and talent has been the most significant contributor to shifting the world’s focal point of economic activity. McKinsey Global Institute research suggests that from now until 2025, one-third of world growth will come from the key Western capitals and emerging market megacities, one-third from the heavily populous middle-weight cities of emerging markets, and one-third from small cities and rural areas in developing countries.

Khanna's megacities all exist within one country. If Vancouver and Seattle (and perhaps Portland?) were to become a megacity, it would be one of the few, if not the only one, to cross a national border.

Khanna has been mentioned here before in a Jan. 27, 2016 posting about cities and technology and a public engagement exercise with the National Research Council of Canada (scroll down to the subsection titled: Cities rising in importance as political entities).

Muggah/Barber's and Khanna's 2016 pieces are well worth reading if you have the time.

For what it's worth, I'm inclined to agree that cities are increasing, and will continue to increase, in political importance, along with this area of development:

Algorithms and big data

Concerns are being raised about how big data is being utilized, so I was happy to see specific initiatives to address ethics issues in Howe's response. For anyone not familiar with the concerns, here's an excerpt from Cathy O'Neil's Oct. 18, 2016 article for Wired magazine,

The age of Big Data has generated new tools and ideas on an enormous scale, with applications spreading from marketing to Wall Street, human resources, college admissions, and insurance. At the same time, Big Data has opened opportunities for a whole new class of professional gamers and manipulators, who take advantage of people using the power of statistics.

I should know. I was one of them.

Information is power, and in the age of corporate surveillance, profiles on every active American consumer means that the system is slanted in favor of those with the data. This data helps build tailor-made profiles that can be used for or against someone in a given situation. Insurance companies, which historically sold car insurance based on driving records, have more recently started using such data-driven profiling methods. A Florida insurance company has been found to charge people with low credit scores and good driving records more than people with high credit scores and a drunk driving conviction. It’s become standard practice for insurance companies to charge people not what they represent as a risk, but what they can get away with. The victims, of course, are those least likely to be able to afford the extra cost, but who need a car to get to work.

Big data profiling techniques are exploding in the world of politics. It’s estimated that over $1 billion will be spent on digital political ads in this election cycle, almost 50 times as much as was spent in 2008; this field is a growing part of the budget for presidential as well as down-ticket races. Political campaigns build scoring systems on potential voters—your likelihood of voting for a given party, your stance on a given issue, and the extent to which you are persuadable on that issue. It’s the ultimate example of asymmetric information, and the politicians can use what they know to manipulate your vote or your donation.

I highly recommend reading O'Neil's article and, if you have the time, her book 'Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy'.

Finally

I look forward to hearing more about the Cascadia Urban Analytics Cooperative and the Cascadia Innovation Corridor as they develop. This has the potential to be very exciting, although I do have some concerns, such as Microsoft and its agendas, both stated and unstated. After all, the Sept. 2016 meeting was convened by Microsoft and its public affairs/lobbying group, and the topic was innovation, which is code for business; as hinted earlier, business is not synonymous with social good. Having said that, I'm not about to demonize business either. I just think a healthy dose of skepticism is called for. Good things can happen but we need to ensure they do.

Thankfully, my concerns regarding algorithms and big data seem to be shared in some quarters; unfortunately, none of those quarters appear to be located at the University of British Columbia. I hope that's over-caution with regard to communication rather than a failure to recognize the pitfalls.

ETA Mar. 1, 2017: Interestingly, the UK House of Commons Select Committee on Science and Technology announced an inquiry into the use of algorithms in public and business decision-making on Feb. 28, 2017. As this posting is much too big already, I've covered the UK inquiry separately in a Mar. 1, 2017 posting.

*’2016′ added for clarity on March 24, 2017.

Figuring out how stars are born by using neutrons to watch hydrogen 'quantum tunnelling' on graphene

A Feb. 3, 2017 news item on Nanowerk announces research that could help us better understand how stars are ‘born’,

Graphene is known as the world’s thinnest material due to its 2D structure, where each sheet is only one carbon atom thick, allowing each atom to engage in a chemical reaction from two sides. Graphene flakes can have a very large proportion of edge atoms, all of which have a particular chemical reactivity.

In addition, chemically active voids created by missing atoms are a surface defect of graphene sheets. These structural defects and edges play a vital role in carbon chemistry and physics, as they alter the chemical reactivity of graphene. In fact, chemical reactions have repeatedly been shown to be favoured at these defect sites.

Interstellar molecular clouds are predominantly composed of hydrogen in molecular form (H2), but also contain a small percentage of dust particles mostly in the form of carbon nanostructures, called polyaromatic hydrocarbons (PAH). These clouds are often referred to as ‘star nurseries’ as their low temperature and high density allows gravity to locally condense matter in such a way that it initiates H fusion, the nuclear reaction at the heart of each star.

Graphene-based materials, prepared from the exfoliation of graphite oxide, are used as a model of interstellar carbon dust as they contain a relatively large amount of atomic defects, either at their edges or on their surface. These defects are thought to sustain the Eley-Rideal chemical reaction, which recombines two H atoms into one H2 molecule. The observation of interstellar clouds in inhospitable regions of space, including in the direct proximity of giant stars, poses the question of the origin of the stability of hydrogen in the molecular form (H2).

This question stands because the clouds are constantly being washed out by intense radiation, hence cracking the hydrogen molecules into atoms. Astrochemists suggest that the chemical mechanism responsible for the recombination of atomic H into molecular H2 is catalysed by carbon flakes in interstellar clouds.

A Feb. 2, 2017 Institut Laue-Langevin press release, which originated the news item, provides more insight into the research,

Their [astrochemists'] theories are challenged by the need for a very efficient surface chemistry scenario to explain the observed equilibrium between dissociation and recombination. They had to introduce highly reactive sites into their models so that the capture of an atomic H nearby occurs without fail. These sites, in the form of atomic defects at the surface or edge of the carbon flakes, should be such that the C-H bond formed thereafter allows the H atom to be released easily to recombine with another H atom flying nearby.

A collaboration between the Institut Laue-Langevin (ILL), France, the University of Parma, Italy, and the ISIS Neutron and Muon Source, UK, combined neutron spectroscopy with density functional theory (DFT) molecular dynamics simulations in order to characterise the local environment and vibrations of hydrogen atoms chemically bonded at the surface of substantially defected graphene flakes. Additional analyses were carried out using muon spectroscopy (muSR) and nuclear magnetic resonance (NMR). As availability of the samples is very low, these highly specific techniques were necessary to study the samples; neutron spectroscopy is highly sensitive to hydrogen and allowed accurate data to be gathered at small concentrations.

For the first time ever, this study showed 'quantum tunnelling' in these systems, allowing the H atoms bound to C atoms to explore relatively long distances at temperatures as low as those in interstellar clouds. The process involves hydrogen 'quantum hopping' from one carbon atom to another in its direct vicinity, tunnelling through energy barriers which could not be overcome given the lack of heat in the interstellar cloud environment. This movement is sustained by the fluctuations of the graphene structure, which bring the H atom into unstable regions and catalyse the recombination process by allowing the release of the chemically bonded H atom. Therefore, it is believed that quantum tunnelling facilitates the reaction for the formation of molecular H2.

ILL scientist and carbon nanostructure specialist, Stéphane Rols says: “The question of how molecular hydrogen forms at the low temperatures in interstellar clouds has always been a driver in astrochemistry research. We’re proud to have combined spectroscopy expertise with the sensitivity of neutrons to identify the intriguing quantum tunnelling phenomenon as a possible mechanism behind the formation of H2; these observations are significant in furthering our understanding of the universe.”
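
For anyone who wants a feel for the numbers, here's a back-of-the-envelope Python sketch of why tunnelling beats ordinary thermal hopping at interstellar cloud temperatures. The barrier height, barrier width and temperature below are illustrative values I've assumed for the comparison, not figures taken from the paper:

import numpy as np

hbar = 1.0545718e-34   # reduced Planck constant (J*s)
kB = 1.380649e-23      # Boltzmann constant (J/K)
m_H = 1.6735575e-27    # mass of a hydrogen atom (kg)

# Assumed, illustrative values -- not from the paper:
barrier_eV = 0.2       # height of the C-H hopping barrier (eV)
width = 1.0e-10        # ~1 angstrom between neighbouring carbon sites (m)
T_cloud = 10.0         # typical interstellar cloud temperature (K)

V = barrier_eV * 1.602176634e-19   # barrier height in joules

# Classical route: Boltzmann factor for hopping *over* the barrier
thermal = np.exp(-V / (kB * T_cloud))

# Quantum route: WKB estimate for tunnelling *through* the barrier
tunnel = np.exp(-2 * width * np.sqrt(2 * m_H * V) / hbar)

print(f"thermal hopping factor at {T_cloud} K: {thermal:.3e}")
print(f"WKB tunnelling factor: {tunnel:.3e}")

With these numbers the thermal factor works out to roughly 10^-101 (effectively never), while the tunnelling factor is roughly 10^-9, which is rare but, multiplied across astronomical numbers of atoms and very long timescales, entirely workable. That, in cartoon form, is why tunnelling matters in an environment with almost no heat.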

Here’s a link to and a citation for the paper (which dates from Aug. 2016),

Hydrogen motions in defective graphene: the role of surface defects by Chiara Cavallari, Daniele Pontiroli, Mónica Jiménez-Ruiz, Mark Johnson, Matteo Aramini, Mattia Gaboardi, Stewart F. Parker, Mauro Riccó, and Stéphane Rols. Phys. Chem. Chem. Phys., 2016, 18 (36), 24820-24824. DOI: 10.1039/C6CP04727K. First published online 22 Aug 2016

This paper is behind a paywall.

Mapping 23,000 atoms in a nanoparticle

Identification of the precise 3-D coordinates of iron, shown in red, and platinum atoms in an iron-platinum nanoparticle. Courtesy of Colin Ophus and Florian Nickel/Berkeley Lab

The image of the iron-platinum nanoparticle (referenced in the headline) reminds me of foetal ultrasound images. A Feb. 1, 2017 news item on ScienceDaily tells us more,

In the world of the very tiny, perfection is rare: virtually all materials have defects on the atomic level. These imperfections — missing atoms, atoms of one type swapped for another, and misaligned atoms — can uniquely determine a material’s properties and function. Now, UCLA [University of California at Los Angeles] physicists and collaborators have mapped the coordinates of more than 23,000 individual atoms in a tiny iron-platinum nanoparticle to reveal the material’s defects.

The results demonstrate that the positions of tens of thousands of atoms can be precisely identified and then fed into quantum mechanics calculations to correlate imperfections and defects with material properties at the single-atom level.

A Feb. 1, 2017 UCLA news release, which originated the news item, provides more detail about the work,

Jianwei “John” Miao, a UCLA professor of physics and astronomy and a member of UCLA’s California NanoSystems Institute, led the international team in mapping the atomic-level details of the bimetallic nanoparticle, more than a trillion of which could fit within a grain of sand.

“No one has seen this kind of three-dimensional structural complexity with such detail before,” said Miao, who is also a deputy director of the Science and Technology Center on Real-Time Functional Imaging. This new National Science Foundation-funded consortium consists of scientists at UCLA and five other colleges and universities who are using high-resolution imaging to address questions in the physical sciences, life sciences and engineering.

Miao and his team focused on an iron-platinum alloy, a very promising material for next-generation magnetic storage media and permanent magnet applications.

By taking multiple images of the iron-platinum nanoparticle with an advanced electron microscope at Lawrence Berkeley National Laboratory and using powerful reconstruction algorithms developed at UCLA, the researchers determined the precise three-dimensional arrangement of atoms in the nanoparticle.

“For the first time, we can see individual atoms and chemical composition in three dimensions. Everything we look at, it’s new,” Miao said.

The team identified and located more than 6,500 iron and 16,600 platinum atoms and showed how the atoms are arranged in nine grains, each of which contains different ratios of iron and platinum atoms. Miao and his colleagues showed that atoms closer to the interior of the grains are more regularly arranged than those near the surfaces. They also observed that the interfaces between grains, called grain boundaries, are more disordered.

“Understanding the three-dimensional structures of grain boundaries is a major challenge in materials science because they strongly influence the properties of materials,” Miao said. “Now we are able to address this challenge by precisely mapping out the three-dimensional atomic positions at the grain boundaries for the first time.”

The researchers then used the three-dimensional coordinates of the atoms as inputs into quantum mechanics calculations to determine the magnetic properties of the iron-platinum nanoparticle. They observed abrupt changes in magnetic properties at the grain boundaries.

“This work makes significant advances in characterization capabilities and expands our fundamental understanding of structure-property relationships, which is expected to find broad applications in physics, chemistry, materials science, nanoscience and nanotechnology,” Miao said.

In the future, as the researchers continue to determine the three-dimensional atomic coordinates of more materials, they plan to establish an online databank for the physical sciences, analogous to protein databanks for the biological and life sciences. “Researchers can use this databank to study material properties truly on the single-atom level,” Miao said.

Miao and his team also look forward to applying their method called GENFIRE (GENeralized Fourier Iterative Reconstruction) to biological and medical applications. “Our three-dimensional reconstruction algorithm might be useful for imaging like CT scans,” Miao said. Compared with conventional reconstruction methods, GENFIRE requires fewer images to compile an accurate three-dimensional structure.

That means that radiation-sensitive objects can be imaged with lower doses of radiation.
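
For the curious, the family of 'Fourier iterative reconstruction' algorithms that GENFIRE belongs to can be sketched in a few lines of Python. This is my own toy illustration of the general idea (alternate between enforcing the measured Fourier data and enforcing physical constraints in real space); it is not the actual GENFIRE code, and the function name and demo values are mine:

import numpy as np

def toy_fourier_iterative_reconstruction(measured_ft, measured_mask, n_iter=200):
    # measured_ft: 3-D grid of Fourier components filled in from the tilt-series
    #              projections (each 2-D projection samples a plane of Fourier
    #              space, per the Fourier slice theorem)
    # measured_mask: boolean grid, True where a component was actually measured
    density = np.zeros(measured_ft.shape)
    for _ in range(n_iter):
        ft = np.fft.fftn(density)
        # Fourier-space constraint: pin the components we actually measured
        ft[measured_mask] = measured_ft[measured_mask]
        density = np.real(np.fft.ifftn(ft))
        # Real-space constraint: electron density cannot be negative
        density[density < 0] = 0
    return density

# Toy demo on synthetic data: pretend we measured 30% of Fourier space
rng = np.random.default_rng(0)
true_density = (rng.random((16, 16, 16)) > 0.95).astype(float)  # sparse 'atoms'
mask = rng.random((16, 16, 16)) < 0.3
recon = toy_fourier_iterative_reconstruction(np.fft.fftn(true_density), mask)

Tracing atoms out of the reconstructed density then amounts to finding its peaks; because platinum scatters electrons more strongly than the lighter iron, the brighter peaks can be assigned to platinum and the dimmer ones to iron, which is how (as the news releases describe) the 16,627 platinum atoms were separated from the 6,569 iron atoms.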

The US Dept. of Energy (DOE) Lawrence Berkeley National Laboratory issued their own Feb. 1, 2017 news release (also on EurekAlert) about the work with a focus on how their equipment made this breakthrough possible (it repeats a little of the info. from the UCLA news release),

Scientists used one of the world’s most powerful electron microscopes to map the precise location and chemical type of 23,000 atoms in an extremely small particle made of iron and platinum.

The 3-D reconstruction reveals the arrangement of atoms in unprecedented detail, enabling the scientists to measure chemical order and disorder in individual grains, which sheds light on the material’s properties at the single-atom level. Insights gained from the particle’s structure could lead to new ways to improve its magnetic performance for use in high-density, next-generation hard drives.

What’s more, the technique used to create the reconstruction, atomic electron tomography (which is like an incredibly high-resolution CT scan), lays the foundation for precisely mapping the atomic composition of other useful nanoparticles. This could reveal how to optimize the particles for more efficient catalysts, stronger materials, and disease-detecting fluorescent tags.

Microscopy data was obtained and analyzed by scientists from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) at the Molecular Foundry, in collaboration with Foundry users from UCLA, Oak Ridge National Laboratory, and the United Kingdom’s University of Birmingham. …

Atoms are the building blocks of matter, and the patterns in which they’re arranged dictate a material’s properties. These patterns can also be exploited to greatly improve a material’s function, which is why scientists are eager to determine the 3-D structure of nanoparticles at the smallest scale possible.

“Our research is a big step in this direction. We can now take a snapshot that shows the positions of all the atoms in a nanoparticle at a specific point in its growth. This will help us learn how nanoparticles grow atom by atom, and it sets the stage for a materials-design approach starting from the smallest building blocks,” says Mary Scott, who conducted the research while she was a Foundry user, and who is now a staff scientist. Scott and fellow Foundry scientists Peter Ercius and Colin Ophus developed the method in close collaboration with Jianwei Miao, a UCLA professor of physics and astronomy.

Their nanoparticle reconstruction builds on an achievement they reported last year in which they measured the coordinates of more than 3,000 atoms in a tungsten needle to a precision of 19 trillionths of a meter (19 picometers), which is many times smaller than a hydrogen atom. Now, they’ve taken the same precision, added the ability to distinguish different elements, and scaled up the reconstruction to include tens of thousands of atoms.

Importantly, their method maps the position of each atom in a single, unique nanoparticle. In contrast, X-ray crystallography and cryo-electron microscopy plot the average position of atoms from many identical samples. These methods make assumptions about the arrangement of atoms, which isn’t a good fit for nanoparticles because no two are alike.

“We need to determine the location and type of each atom to truly understand how a nanoparticle functions at the atomic scale,” says Ercius.

A TEAM approach

The scientists’ latest accomplishment hinged on the use of one of the highest-resolution transmission electron microscopes in the world, called TEAM I. It’s located at the National Center for Electron Microscopy, which is a Molecular Foundry facility. The microscope scans a sample with a focused beam of electrons, and then measures how the electrons interact with the atoms in the sample. It also has a piezo-controlled stage that positions samples with unmatched stability and position-control accuracy.

The researchers began growing an iron-platinum nanoparticle from its constituent elements, and then stopped the particle’s growth before it was fully formed. They placed the “partially baked” particle in the TEAM I stage, obtained a 2-D projection of its atomic structure, rotated it a few degrees, obtained another projection, and so on. Each 2-D projection provides a little more information about the full 3-D structure of the nanoparticle.

They sent the projections to Miao at UCLA, who used a sophisticated computer algorithm to convert the 2-D projections into a 3-D reconstruction of the particle. The individual atomic coordinates and chemical types were then traced from the 3-D density based on the knowledge that iron atoms are lighter than platinum atoms. The resulting atomic structure contains 6,569 iron atoms and 16,627 platinum atoms, with each atom’s coordinates precisely plotted to less than the width of a hydrogen atom.

Translating the data into scientific insights

Interesting features emerged at this extreme scale after Molecular Foundry scientists used code they developed to analyze the atomic structure. For example, the analysis revealed chemical order and disorder in interlocking grains, in which the iron and platinum atoms are arranged in different patterns. This has large implications for how the particle grew and its real-world magnetic properties. The analysis also revealed single-atom defects and the width of disordered boundaries between grains, which was not previously possible in complex 3-D boundaries.

“The important materials science problem we are tackling is how this material transforms from a highly randomized structure, what we call a chemically-disordered structure, into a regular highly-ordered structure with the desired magnetic properties,” says Ophus.

To explore how the various arrangements of atoms affect the nanoparticle’s magnetic properties, scientists from DOE’s Oak Ridge National Laboratory ran computer calculations on the Titan supercomputer at ORNL–using the coordinates and chemical type of each atom–to simulate the nanoparticle’s behavior in a magnetic field. This allowed the scientists to see patterns of atoms that are very magnetic, which is ideal for hard drives. They also saw patterns with poor magnetic properties that could sap a hard drive’s performance.

“This could help scientists learn how to steer the growth of iron-platinum nanoparticles so they develop more highly magnetic patterns of atoms,” says Ercius.

Adds Scott, “More broadly, the imaging technique will shed light on the nucleation and growth of ordered phases within nanoparticles, which isn’t fully theoretically understood but is critically important to several scientific disciplines and technologies.”

The folks at the Berkeley Lab have created a video (notice where the still image from the beginning of this post appears).

The Oak Ridge National Laboratory (ORNL), not wanting to be left out, has been mentioned in a Feb. 3, 2017 news item on ScienceDaily,

… researchers working with magnetic nanoparticles at the University of California, Los Angeles (UCLA), and the US Department of Energy’s (DOE’s) Lawrence Berkeley National Laboratory (Berkeley Lab) approached computational scientists at DOE’s Oak Ridge National Laboratory (ORNL) to help solve a unique problem: to model magnetism at the atomic level using experimental data from a real nanoparticle.

“These types of calculations have been done for ideal particles with ideal crystal structures but not for real particles,” said Markus Eisenbach, a computational scientist at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at ORNL.

A Feb. 2, 2017 ORNL news release on EurekAlert, which originated the news item, elucidates further on how their team added to the research,

Eisenbach develops quantum mechanical electronic structure simulations that predict magnetic properties in materials. Working with Paul Kent, a computational materials scientist at ORNL’s Center for Nanophase Materials Sciences, the team collaborated with researchers at UCLA and Berkeley Lab’s Molecular Foundry to combine world-class experimental data with world-class computing to do something new–simulate magnetism atom by atom in a real nanoparticle.

Using the new data from the research teams on the West Coast, Eisenbach and Kent were able to precisely model the measured atomic structure, including defects, from a unique iron-platinum (FePt) nanoparticle and simulate its magnetic properties on the 27-petaflop Titan supercomputer at the OLCF.

Electronic structure codes take atomic and chemical structure and solve for the corresponding magnetic properties. However, these structures are typically derived from many 2-D electron microscopy or x-ray crystallography images averaged together, resulting in a representative, but not true, 3-D structure.

“In this case, researchers were able to get the precise 3-D structure for a real particle,” Eisenbach said. “The UCLA group has developed a new experimental technique where they can tell where the atoms are–the coordinates–and the chemical resolution, or what they are — iron or platinum.”

The ORNL news release goes on to describe the work from the perspective of the people who ran the supercomputer simulations,

A Supercomputing Milestone

Magnetism at the atomic level is driven by quantum mechanics — a fact that has shaken up classical physics calculations and called for increasingly complex, first-principle calculations, or calculations working forward from fundamental physics equations rather than relying on assumptions that reduce computational workload.

For magnetic recording and storage devices, researchers are particularly interested in magnetic anisotropy, or what direction magnetism favors in an atom.

“If the anisotropy is too weak, a bit written to the nanoparticle might flip at room temperature,” Kent said.

To solve for magnetic anisotropy, Eisenbach and Kent used two computational codes to compare and validate results.

To simulate a supercell of about 1,300 atoms from strongly magnetic regions of the 23,000-atom nanoparticle, they used the Linear Scaling Multiple Scattering (LSMS) code, a first-principles density functional theory code developed at ORNL.

“The LSMS code was developed for large magnetic systems and can tackle lots of atoms,” Kent said.

As principal investigator on 2017, 2016, and previous INCITE program awards, Eisenbach has scaled the LSMS code to Titan for a range of magnetic materials projects, and the in-house code has been optimized for Titan’s accelerated architecture, speeding up calculations more than 8 times on the machine’s GPUs. Exceptionally capable of crunching large magnetic systems quickly, the LSMS code received an Association for Computing Machinery Gordon Bell Prize in high-performance computing achievement in 1998 and 2009, and developments continue to enhance the code for new architectures.

Working with Renat Sabirianov at the University of Nebraska at Omaha, the team also ran VASP, a simulation package that is better suited for smaller atom counts, to simulate regions of about 32 atoms.

“With both approaches, we were able to confirm that the local VASP results were consistent with the LSMS results, so we have a high confidence in the simulations,” Eisenbach said.

Computer simulations revealed that grain boundaries have a strong effect on magnetism. “We found that the magnetic anisotropy energy suddenly transitions at the grain boundaries. These magnetic properties are very important,” Miao said.

In the future, researchers hope that advances in computing and simulation will make a full-particle simulation possible — as first-principles calculations are currently too intensive to solve small-scale magnetism for regions larger than a few thousand atoms.

Also, future simulations like these could show how different fabrication processes, such as the temperature at which nanoparticles are formed, influence magnetism and performance.

“There’s a hope going forward that one would be able to use these techniques to look at nanoparticle growth and understand how to optimize growth for performance,” Kent said.

Finally, here’s a link to and a citation for the paper,

Deciphering chemical order/disorder and material properties at the single-atom level by Yongsoo Yang, Chien-Chun Chen, M. C. Scott, Colin Ophus, Rui Xu, Alan Pryor, Li Wu, Fan Sun, Wolfgang Theis, Jihan Zhou, Markus Eisenbach, Paul R. C. Kent, Renat F. Sabirianov, Hao Zeng, Peter Ercius, & Jianwei Miao. Nature 542, 75–79 (02 February 2017) doi:10.1038/nature21042 Published online 01 February 2017

This paper is behind a paywall.
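
One last aside on this item: Kent's remark that a weakly anisotropic bit 'might flip at room temperature' can be made concrete with the standard Néel-Arrhenius estimate of how long a single-domain grain holds its magnetization. The grain size and anisotropy values below are ballpark figures I've plugged in for illustration, not numbers from the study:

import numpy as np

kB = 1.380649e-23   # Boltzmann constant (J/K)

def neel_relaxation_time(K_u, volume, T, tau0=1e-9):
    # Neel-Arrhenius: tau = tau0 * exp(K_u * V / (kB * T))
    # K_u: anisotropy energy density (J/m^3); tau0: attempt time (~1 ns)
    return tau0 * np.exp(K_u * volume / (kB * T))

d = 5e-9                      # a 5 nm grain, roughly hard-drive scale
vol = (np.pi / 6) * d**3      # volume of a sphere of diameter d

# Ballpark anisotropies (J/m^3): weak, chemically disordered material
# versus the strong, chemically ordered FePt phase
for K_u in (2e5, 7e6):
    tau = neel_relaxation_time(K_u, vol, T=300)
    print(f"K_u = {K_u:.0e} J/m^3 -> retention time ~ {tau:.1e} s")

With the weak anisotropy the bit randomizes within tens of nanoseconds; with the strong anisotropy of the ordered phase the estimate comes out far longer than the age of the universe. Hence the interest in mapping, atom by atom, how much of a nanoparticle has actually transformed into the ordered structure.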

New principles for AI (artificial intelligence) research along with some history and a plea for a democratic discussion

For almost a month I've been meaning to get to this Feb. 1, 2017 essay by Andrew Maynard (director of the Risk Innovation Lab at Arizona State University) and Jack Stilgoe (science policy lecturer at University College London [UCL]) on the topic of artificial intelligence and principles (Note: Links have been removed). First, a walk down memory lane,

Today [Feb. 1, 2017] in Washington DC, leading US and UK scientists are meeting to share dispatches from the frontiers of machine learning – an area of research that is creating new breakthroughs in artificial intelligence (AI). Their meeting follows the publication of a set of principles for beneficial AI that emerged from a conference earlier this year at a place with an important history.

In February 1975, 140 people – mostly scientists, with a few assorted lawyers, journalists and others – gathered at a conference centre on the California coast. A magazine article from the time by Michael Rogers, one of the few journalists allowed in, reported that most of the four days’ discussion was about the scientific possibilities of genetic modification. Two years earlier, scientists had begun using recombinant DNA to genetically modify viruses. The Promethean nature of this new tool prompted scientists to impose a moratorium on such experiments until they had worked out the risks. By the time of the Asilomar conference, the pent-up excitement was ready to burst. It was only towards the end of the conference when a lawyer stood up to raise the possibility of a multimillion-dollar lawsuit that the scientists focussed on the task at hand – creating a set of principles to govern their experiments.

The 1975 Asilomar meeting is still held up as a beacon of scientific responsibility. However, the story told by Rogers, and subsequently by historians, is of scientists motivated by a desire to head-off top down regulation with a promise of self-governance. Geneticist Stanley Cohen said at the time, ‘If the collected wisdom of this group doesn’t result in recommendations, the recommendations may come from other groups less well qualified’. The mayor of Cambridge, Massachusetts was a prominent critic of the biotechnology experiments then taking place in his city. He said, ‘I don’t think these scientists are thinking about mankind at all. I think that they’re getting the thrills and the excitement and the passion to dig in and keep digging to see what the hell they can do’.

The concern in 1975 was with safety and containment in research, not with the futures that biotechnology might bring about. A year after Asilomar, Cohen’s colleague Herbert Boyer founded Genentech, one of the first biotechnology companies. Corporate interests barely figured in the conversations of the mainly university scientists.

Fast-forward 42 years and it is clear that machine learning, natural language processing and other technologies that come under the AI umbrella are becoming big business. The cast list of the 2017 Asilomar meeting included corporate wunderkinds from Google, Facebook and Tesla as well as researchers, philosophers, and other academics. The group was more intellectually diverse than their 1975 equivalents, but there were some notable absences – no public and their concerns, no journalists, and few experts in the responsible development of new technologies.

Maynard and Stilgoe offer a critique of the latest principles,

The principles that came out of the meeting are, at least at first glance, a comforting affirmation that AI should be ‘for the people’, and not to be developed in ways that could cause harm. They promote the idea of beneficial and secure AI, development for the common good, and the importance of upholding human values and shared prosperity.

This is good stuff. But it’s all rather Motherhood and Apple Pie: comforting and hard to argue against, but lacking substance. The principles are short on accountability, and there are notable absences, including the need to engage with a broader set of stakeholders and the public. At the early stages of developing new technologies, public concerns are often seen as an inconvenience. In a world in which populism appears to be trampling expertise into the dirt, it is easy to understand why scientists may be defensive.

I encourage you to read this thoughtful essay in its entirety, although I do have one nit to pick: why only US and UK scientists? I imagine the answer may lie in funding and logistics issues, but I find it surprising that the critique makes no mention of the international community as a nod to inclusion.

For anyone interested in the Asilomar AI principles (2017), you can find them here. You can also find videos of the two-day workshop, the Jan. 31 – Feb. 1, 2017 workshop titled The Frontiers of Machine Learning (a Raymond and Beverly Sackler USA-UK Scientific Forum [US National Academy of Sciences]), here; videos for each session are available on YouTube.

Nanoparticle fertilizer and dreams of a new ‘Green’ revolution

There were hints even while it was happening that the ‘Green Revolution’ of the 1960s was not all it was touted to be. (For those who haven’t come across the term before, the Green Revolution was a better way to farm, a way that would feed everyone on earth. Or, that was the dream.)

Perhaps this time, they’ll be more successful. From a Jan. 15, 2017 news item on ScienceDaily, which offers a perspective on the ‘Green Revolution’ that differs from mine,

The “Green Revolution” of the ’60s and ’70s has been credited with helping to feed billions around the world, with fertilizers being one of the key drivers spurring the agricultural boom. But in developing countries, the cost of fertilizer remains relatively high and can limit food production. Now researchers report in the journal ACS Nano a simple way to make a benign, more efficient fertilizer that could contribute to a second food revolution.

A Jan. 25, 2017 American Chemical Society news release on EurekAlert, which originated the news item, expands on the theme,

Farmers often use urea, a rich source of nitrogen, as fertilizer. Its flaw, however, is that it breaks down quickly in wet soil and forms ammonia. The ammonia is washed away, creating a major environmental issue as it leads to eutrophication of waterways and ultimately enters the atmosphere as nitrous oxide, the main greenhouse gas associated with agriculture. This fast decomposition also limits the amount of nitrogen that can get absorbed by crop roots and requires farmers to apply more fertilizer to boost production. However, in low-income regions where populations continue to grow and the food supply is unstable, the cost of fertilizer can hinder additional applications and cripple crop yields. Nilwala Kottegoda, Veranja Karunaratne, Gehan Amaratunga and colleagues wanted to find a way to slow the breakdown of urea and make one application of fertilizer last longer.

To do this, the researchers developed a simple and scalable method for coating hydroxyapatite (HA) nanoparticles with urea molecules. HA is a mineral found in human and animal tissues and is considered to be environmentally friendly. In water, the urea-HA nanohybrid released nitrogen 12 times more slowly than urea by itself. Initial field tests on rice farms showed that the HA-urea nanohybrid lowered the need for fertilizer by one-half. The researchers say their development could help contribute to a new green revolution to help feed the world's continuously growing population and also improve the environmental sustainability of agriculture.
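
To get a feel for what a 12-fold slower release means in practice, here's a toy Python calculation. It assumes simple first-order release and a made-up one-day half-life for plain urea; both assumptions are mine, for illustration only, and not from the paper:

import numpy as np

def nitrogen_remaining(t_days, half_life_days):
    # Fraction of nitrogen still held, assuming first-order release
    k = np.log(2) / half_life_days
    return np.exp(-k * t_days)

# Hypothetical: plain urea with a 1-day half-life versus a nanohybrid
# releasing 12 times more slowly (a 12-day half-life)
for t in (1, 7, 30):
    print(f"day {t:2d}: urea {nitrogen_remaining(t, 1):6.1%} left, "
          f"nanohybrid {nitrogen_remaining(t, 12):6.1%} left")

By day seven the plain urea is effectively gone (less than 1% left) while the nanohybrid still holds about two-thirds of its nitrogen, which is the point: the fertilizer stays put long enough for crop roots to absorb it, so one application goes further.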

Here’s a link to and a citation for the paper,

Urea-Hydroxyapatite Nanohybrids for Slow Release of Nitrogen by Nilwala Kottegoda, Chanaka Sandaruwan, Gayan Priyadarshana, Asitha Siriwardhana, Upendra A. Rathnayake, Danushka Madushanka Berugoda Arachchige, Asurusinghe R. Kumarasinghe, Damayanthi Dahanayake, Veranja Karunaratne, and Gehan A. J. Amaratunga. ACS Nano, Article ASAP DOI: 10.1021/acsnano.6b07781 Publication Date (Web): January 25, 2017

Copyright © 2017 American Chemical Society

This paper is open access.