This is the final commentary on the report titled ‘INVESTING IN CANADA’S FUTURE; Strengthening the Foundations of Canadian Research’. Part 1 of my commentary provided some introductory material and first thoughts about the report, and Part 2 offered more detailed thoughts; this part singles out ‘special cases’, sums up* my thoughts (circling back to ideas introduced in the first part), and offers links to other commentaries.
Not all of the science funding in Canada is funneled through the four agencies designed for that purpose. The Natural Sciences and Engineering Research Council (NSERC), the Social Sciences and Humanities Research Council (SSHRC), and the Canadian Institutes of Health Research (CIHR) are known collectively as the tri-council funding agencies and are focused on disbursing research funds received from the federal government. The fourth ‘pillar’ agency, the Canada Foundation for Innovation (CFI), is focused on funding for infrastructure and, technically speaking, is a third-party organization along with MITACS, CANARIE, the Perimeter Institute, and others.
In any event, there are also major research facilities and science initiatives which may receive direct funding from the federal government, bypassing the funding agencies and, it would seem, peer review. For example, I featured this in my April 28, 2015 posting about the 2015 federal budget,
The $45 million announced for TRIUMF will support the laboratory’s role in accelerating science in Canada, an important investment in discovery research.
While the news about the CFI seems to have delighted a number of observers, it should be noted (as per Woodgett’s piece) that the $1.3B is to be paid out over six years ($220M per year, more or less) and the money won’t be disbursed until the 2017/18 fiscal year. As for the $45M designated for TRIUMF (Canada’s National Laboratory for Particle and Nuclear Physics), this is exciting news for the lab which seems to have bypassed the usual channels, as it has before, to receive its funding directly from the federal government. [emphases mine]
The Naylor report made this recommendation for Canada’s major research facilities (MRFs),
We heard from many who recommended that the federal government should manage its investments in “Big Science” in a more coordinated manner, with a cradle-to-grave perspective. The Panel agrees. Consistent with NACRI’s overall mandate, it should work closely with the CSA [Chief Science Advisor] in establishing a Standing Committee on Major Research Facilities (MRFs).
CFI defines a national research facility in the following way:
We define a national research facility as one that addresses the needs of a community of Canadian researchers representing a critical mass of users distributed across the country. This is done by providing shared access to substantial and advanced specialized equipment, services, resources, and scientific and technical personnel. The facility supports leading-edge research and technology development, and promotes the mobilization of knowledge and transfer of technology to society. A national research facility requires resource commitments well beyond the capacity of any one institution. A national research facility, whether single-sited, distributed or virtual, is specifically identified or recognized as serving pan-Canadian needs and its governance and management structures reflect this mandate.8
We accept this definition as appropriate for national research facilities to be considered by the Standing Committee on MRFs, but add that the committee should:
• define a capital investment or operating cost level above which such facilities are considered “major” and thus require oversight by this committee (e.g., defined so as to include the national MRFs proposed in Section 6.3: Compute Canada, Canadian Light Source, Canada’s National Design Network, Canadian Research Icebreaker Amundsen, International Vaccine Centre, Ocean Networks Canada, Ocean Tracking Network, and SNOLAB plus the TRIUMF facility); and
• consider international MRFs in which Canada has a significant role, such as astronomical telescopes of global significance.
The structure and function of this Special Standing Committee would closely track the proposal made in 2006 by former NSA [National Science Advisor] Dr Arthur Carty. We return to this topic in Chapter 6. For now, we observe that this approach would involve:
• a peer-reviewed decision on beginning an investment;
• a funded plan for the construction and operation of the facility, with continuing oversight by a peer specialist/agency review group for the specific facility;
• a plan for decommissioning; and
• a regular review scheduled to consider whether the facility still serves current needs.
We suggest that the committee have 10 members, with an eminent scientist as Chair. The members should include the CSA, two representatives from NACRI for liaison, and seven others. The other members should include Canadian and international scientists from a broad range of disciplines and experts on the construction, operation, and administration of MRFs. Consideration should be given to inviting the presidents of NRC [National Research Council of Canada] and CFI to serve as ex-officio members. The committee should be convened by the CSA, have access to the Secretariat associated with the CSA and NACRI, and report regularly to NACRI. (pp. 66-7 print; pp. 100-1 PDF)
I have the impression there’s been some ill feeling over the years regarding some of the major chunks of money given for ‘big science’. At a guess, direct appeals to a federal government that has no official mechanism for assessing the proposed ‘big science’, whether that means a major research facility (e.g., TRIUMF), a major science initiative (e.g., the Pan-Canadian Artificial Intelligence Strategy [keep reading to find out how I got the concept of a major science initiative wrong]), or a third party (MITACS), have seemed unfair to those who must submit funding applications and go through vetting processes. This recommendation would seem to be an attempt to redress some of the issues.
Moving on to the third-party delivery and matching programs,
Three bodies in particular are the largest of these third-party organizations and illustrate the challenges of evaluating contribution agreements: Genome Canada, Mitacs, and Brain Canada. Genome Canada was created in 2000 at a time when many national genomics initiatives were being developed in the wake of the Human Genome Project. It emerged from a “bottom-up” design process driven by genomic scientists to complement existing programs by focusing on large-scale projects and technology platforms. Its funding model emphasized partnerships and matching funds to leverage federal commitments with the objective of rapidly ramping up genomics research in Canada.
This approach has been successful: Genome Canada has received $1.1 billion from the Government of Canada since its creation in 2000, and has raised over $1.6 billion through co-funding commitments, for a total investment in excess of $2.7 billion.34 The scale of Genome Canada’s funding programs allows it to support large-scale genomics research that the granting councils might otherwise not be able to fund. Genome Canada also supports a network of genomics technology and innovation centres with an emphasis on knowledge translation and has built domestic and international strategic partnerships. While its primary focus has been human health, it has also invested extensively in agriculture, forestry, fisheries, environment, and, more recently, oil and gas and mining— all with a view to the application and commercialization of genomic biotechnology.
Mitacs attracts, trains, and retains HQP [highly qualified personnel] in the Canadian research enterprise. Founded in 1999 as an NCE [Network Centre for Excellence], it was developed at a time when enrolments in graduate programs had flat-lined, and links between mathematics and industry were rare. Independent since 2011, Mitacs has focused on providing industrial research internships and postdoctoral fellowships, branching out beyond mathematics to all disciplines. It has leveraged funding effectively from the federal and provincial governments, industry, and not-for-profit organizations. It has also expanded internationally, providing two-way research mobility. Budget 2015 made Mitacs the single mechanism of federal support for postsecondary research internships with a total federal investment of $135.4 million over the next five years. This led to the wind-down of NSERC’s Industrial Postgraduate Scholarships Program. With matching from multiple other sources, Mitacs’ average annual budget is now $75 to $80 million. The organization aims to more than double the number of internships it funds to 10,000 per year by 2020.35
Finally, Brain Canada was created in 1998 (originally called NeuroScience Canada) to increase the scale of brain research funding in Canada and widen its scope with a view to encouraging interdisciplinary collaboration. In 2011 the federal government established the Canada Brain Research Fund to expand Brain Canada’s work, committing $100 million in new public investment for brain research to be matched 1:1 through contributions raised by Brain Canada. According to the STIC ‘State of the Nation’ 2014 report, Canada’s investment in neuroscience research is only about 40 per cent of that in the U.S. after adjusting for the size of the U.S. economy.36 Brain Canada may be filling a void left by declining success rates and flat funding at CIHR.
Recommendation and Elaboration
The Panel noted that, in general, third-party organizations for delivering research funding are particularly effective in leveraging funding from external partners. They fill important gaps in research funding and complement the work of the granting councils and CFI. At the same time, we questioned the overall efficiency of directing federal research funding through third-party organizations, noting that our consultations solicited mixed reactions. Some respondents favoured more overall funding concentrated in the agencies rather than diverting the funding to third-party entities. Others strongly supported the business models of these organizations.
We have indicated elsewhere that a system-wide review panel such as ours is not well-suited to examine these and other organizations subject to third-party agreements. We recommended instead in Chapter 4 that a new oversight body, NACRI, be created to provide expert advice and guidance on when a new entity might reasonably be supported by such an agreement. Here we make the case for enlisting NACRI in determining not just the desirability of initiating a new entity, but also whether contribution agreements should continue and, if so, on what terms.
The preceding sketches of three diverse organizations subject to contribution agreements help illustrate the rationale for this proposal. To underscore the challenges of adjudication, we elaborate briefly. Submissions highlighted that funding from Genome Canada has enabled fundamental discoveries to be made and important knowledge to be disseminated to the Canadian and international research communities. However, other experts suggested a bifurcation with CIHR or NSERC funding research-intensive development of novel technologies, while Genome Canada would focus on application (e.g., large-scale whole genome studies) and commercialization of existing technologies. From the Panel’s standpoint, these observations underscore the subtleties of determining where and how Genome Canada’s mandate overlaps and departs from that of CIHR and NSERC as well as CFI. Added to the complexity of any assessment is Genome Canada’s meaningful role in providing large-scale infrastructure grants and its commercialization program. Mitacs, even more than Genome Canada, bridges beyond academe to the private and non-profit sectors, again highlighting the advantage of having any review overseen by a body with representatives from both spheres. Finally, as did the other two entities, Brain Canada won plaudits, but some interchanges saw discussants ask when and whether it might be more efficient to flow this type of funding on a programmatic basis through CIHR.
We emphasize that the Panel’s intent here is neither to signal agreement nor disagreement with any of these submissions or discussions. We simply wish to highlight that decisions about ongoing funding will involve expert judgments informed by deep expertise in the relevant research areas and, in two of these examples, an ability to bridge from research to innovation and from extramural independent research to the private and non-profit sectors. Under current arrangements, management consulting firms and public servants drive the review and decision-making processes. Our position is that oversight by NACRI and stronger reliance on advice from content experts would be prudent given the sums involved and the nature of the issues. (pp. 102-4 print; pp. 136-8 PDF)
I wasn’t able to find anything other than this about major science initiatives (MSIs),
Big Science facilities, such as MSIs, have had particular challenges in securing ongoing stable operating support. Such facilities often have national or international missions. We termed them “major research facilities” (MRFs) xi in Chapter 4, and proposed an improved oversight mechanism that would provide lifecycle stewardship of these national science resources, starting with the decision to build them in the first instance. (p. 132 print; p. 166 PDF)
So, an MSI is an MRF? (head shaking) Why two terms for the same thing? And, how does the newly announced Pan Canadian Artificial Intelligence Strategy fit into the grand scheme of things?
The last ‘special case’ I’m featuring is the ‘Programme for Research Chairs for Excellent Scholars and Scientists’. Here’s what the report had to say about the state of affairs,
The major sources of federal funding for researcher salary support are the CRC [Canada Research Chair] and CERC [Canada Excellence Research Chair] programs. While some salary support is provided through council-specific programs, these investments have been declining over time. The Panel supports program simplification but, as noted in Chapter 5, we are concerned about the gaps created by the elimination of these personnel awards. While we focus here on the CRC and CERC programs because of their size, profile, and impact, our recommendations will reflect these concerns.
The CRC program was launched in 2000 and remains the Government of Canada’s flagship initiative to keep Canada among the world’s leading countries in higher education R&D. The program has created 2,000 research professorships across Canada with the stated aim “to attract and retain some of the world’s most accomplished and promising minds”5 as part of an effort to curtail the potential academic brain drain to the U.S. and elsewhere. The program is a tri-council initiative with most Chairs allocated to eligible institutions based on the national proportion of total research grant funding they receive from the three granting councils. The vast majority of Chairs are distributed based on area of research, of which 45 per cent align with NSERC, 35 per cent with CIHR, and 20 per cent with SSHRC; an additional special allocation of 120 Chairs can be used in the area of research chosen by the universities receiving the Chairs. There are two types of Chairs: Tier 1 Chairs are intended for outstanding researchers who are recognized as world leaders in their fields and are renewable; Tier 2 Chairs are targeted at exceptional emerging researchers with the potential to become leaders in their field and can be renewed once. Awards are paid directly to the universities and are valued at $200,000 annually for seven years (Tier 1) or $100,000 annually for five years (Tier 2). The program notes that Tier 2 Chairs are not meant to be a feeder group for Tier 1 Chairs; rather, universities are expected to develop a succession plan for their Tier 2 Chairs.
The CERC program was established in 2008 with the expressed aim of “support[ing] Canadian universities in their efforts to build on Canada’s growing reputation as a global leader in research and innovation.”6 The program aims to award world-renowned researchers and their teams with up to $10 million over seven years to establish ambitious research programs at Canadian universities, making these awards among the most prestigious and generous available internationally. There are currently 27 CERCs with funding available to support up to 30 Chairs, which are awarded in the priority areas established by the federal government. The awards, which are not renewable, require 1:1 matching funds from the host institution, and all degree-granting institutions that receive tri-council funding are eligible to compete. Both the CERC and CRC programs are open to Canadians and foreign citizens. However, until the most recent round, the CERCs have been constrained to the government’s STEM-related priorities; this has limited their availability to scholars and scientists from SSHRC-related disciplines. As well, even though Canadian-based researchers are eligible for CERC awards, the practice has clearly been to use them for international recruitment with every award to date going to researchers from abroad.
Similar to research training support, the funding for salary support to researchers and scholars is a significant proportion of total federal research investments, but relatively small with respect to the research ecosystem as a whole. There are more than 45,000 professors and teaching staff at Canada’s universities7 and a very small fraction hold these awards. Nevertheless, the programs can support research excellence by repatriating top Canadian talent from abroad and by recruiting and retaining top international talent in Canada.
The programs can also lead by example in promoting equity and diversity in the research enterprise. Unfortunately, both the CRC and CERC programs suffer from serious challenges regarding equity and diversity, as described in Chapter 5. Both programs have been criticized in particular for under-recruitment of women.
While the CERC program has recruited exclusively from outside Canada, the CRC program has shown declining performance in that regard. A 2016 evaluation of the CRC program8 observed that a rising number of chairholders were held by nominees who originated from within the host institution (57.5 per cent), and another 14.4 per cent had been recruited from other Canadian institutions. The Panel acknowledges that some of these awards may be important to retaining Canadian talent. However, we were also advised in our consultations that CRCs are being used with some frequency to offset salaries as part of regular faculty complement planning.
The evaluation further found that 28.1 per cent of current chairholders had been recruited from abroad, a decline from 32 per cent in the 2010 evaluation. That decline appears set to continue. The evaluation reported that “foreign nominees accounted, on average, for 13 per cent and 15 per cent respectively of new Tier 1 and Tier 2 nominees over the five-year period 2010 to 2014”, terming it a “large decrease” from 2005 to 2009 when the averages respectively were 32 per cent and 31 per cent. As well, between 2010-11 and 2014-15, the attrition rate for chairholders recruited from abroad was 75 per cent higher than for Canadian chairholders, indicating that the program is also falling short in its ability to retain international talent.9
One important factor here appears to be the value of the CRC awards. While they were generous in 2000, their value has remained unchanged for some 17 years, making it increasingly difficult to offer the level of support that world-leading research professors require. The diminishing real value of the awards also means that Chair positions are becoming less distinguishable from regular faculty positions, threatening the program’s relevance and effectiveness. To rejuvenate this program and make it relevant for recruitment and retention of top talent, it seems logical to take two steps:
• ask the granting councils and the Chairs Secretariat to work with universities in developing a plan to restore the effectiveness of these awards; and
• once that plan is approved, increase the award values by 35 per cent, thereby restoring the awards to their original value and making them internationally competitive once again.
In addition, the Panel observes that the original goal was for the program to fund 2,000 Chairs. Due to turnover and delays in filling Chair positions, approximately 10 to 15 per cent of them are unoccupied at any one time.i As a result, the program budget was reduced by $35 million in 2012. However, the occupancy rate has continued to decline since then, with an all-time low of only 1,612 Chair positions (80.6 per cent) filled as of December 2016. The Panel is dismayed by this inefficiency, especially at a time when Tier 2 Chairs remain one of the only external sources of salary support for ECRs [early career researchers]—a group that represents the future of Canadian research and scholarship. (pp. 142-4 print; pp. 176-8 PDF)
I think what you can see as a partial subtext in this report, and what I’m attempting to highlight here in ‘special cases’, is a balancing act between supporting a broad range of research inquiries and pouring huge sums of money into ‘important’ research inquiries for high-impact outcomes.
There are many things to commend in this report, including the writing style. The notions that more coordination is needed amongst the various granting agencies, that greater recognition (i.e., encouragement and funding opportunities) should be given to boundary-crossing research, and that we need more interprovincial collaboration are welcome. And yes, they want more money too. (That request is perfectly predictable. When was the last time a report suggested less funding?) Perhaps more tellingly, the request for money is buttressed with a plea to make it partisan-proof; in short, a plea that funding not keep changing with the political tides.
One area that was not specifically mentioned, except when discussing prizes, was mathematics. I found that a bit surprising given how important the field of mathematics is to virtually all the ‘sciences’. A 2013 report, Spotlight on Science, suggests there’s a problem, as noted in my Oct. 9, 2013 posting about that report (where I also mention Canada’s PISA [Programme for International Student Assessment] scores from the OECD [Organisation for Economic Co-operation and Development], which consistently show Canadian students at the age of 15 [grade 10] do well),
… it appears that we have high dropout rates in the sciences and maths, from an Oct. 8, 2013 news item on the CBC (Canadian Broadcasting Corporation) website,
… Canadians are paying a heavy price for the fact that less than 50 per cent of Canadian high school students graduate with senior courses in science, technology, engineering and math (STEM) at a time when 70 per cent of Canada’s top jobs require an education in those fields, said report released by the science education advocacy group Let’s Talk Science and the pharmaceutical company Amgen Canada.
Spotlight on Science Learning 2013 compiles publicly available information about individual and societal costs of students dropping out STEM courses early.
Even though most provinces only require math and science courses until Grade 10, the report [Spotlight on Science, published by Let’s Talk Science and pharmaceutical company Amgen Canada] found students without Grade 12 math could expect to be excluded from 40 to 75 per cent of programs at Canadian universities, and students without Grade 11 could expect to be excluded from half of community college programs. [emphasis mine]
While I realize that education wasn’t the panel’s mandate, they do reference the topic elsewhere, and while secondary education is a provincial responsibility, there is a direct relationship between it and postsecondary education.
On the lack of imagination front, there was some mention of our aging population but not much planning or discussion about integrating older researchers into the grand scheme of things. It’s all very well to talk about the aging population, but shouldn’t we start introducing these ideas into more of our discussions on such topics as research, rather than only those discussions focused on aging?
Continuing on with the lack of imagination and lack of forethought, I was not able to find any mention of independent scholars. The assumption, as always, is that one is affiliated with an institution. Given the ways in which our work world is changing, with fewer jobs at the institutional level, it seems the panel was not focused on important and far-reaching trends. Also, there was no mention of technologies, such as artificial intelligence, that could affect basic research. One other thing from my wish list that didn’t get mentioned: art/science, or SciArt. Although that really would have been reaching.
Weirdly, one of the topics the panel did note, the pitiful lack of interprovincial scientific collaboration, was completely ignored when it came time for recommendations.
Should you spot any errors in this commentary, please do drop me a comment.
Other responses to the report:
Nassif Ghoussoub (Piece of Mind blog; he’s a professor of mathematics at the University of British Columbia and attended one of the roundtable discussions held by the panel). As you might expect, he focuses on the money end of things in his May 1, 2017 posting.
You can find a series of essays about the report here under the title Response to Naylor Panel Report ** on the Canadian Science Policy Centre website.
There’s also this May 31, 2017 opinion piece by Jamie Cassels for The Vancouver Sun exhorting us to go forth and collaborate internationally, presumably with added funding for the University of Victoria, of which Cassels is the president and vice-chancellor. He seems not to have noticed that Canadians do much more poorly with interprovincial collaboration.
*ETA June 21, 2017: I’ve just stumbled across Ivan Semeniuk’s April 10, 2017 analysis (Globe and Mail newspaper) of the report. It’s substantive and well worth checking out.*
Again, here’s a link to the other parts:
INVESTING IN CANADA’S FUTURE; Strengthening the Foundations of Canadian Research (Review of fundamental research final report) Commentaries
*’up’ added on June 8, 2017 at 15:10 hours PDT.
**’Science Funding Review Panel Report’ was changed to ‘Responses to Naylor Panel Report’ on June 22, 2017.