The critiques I offered in relation to the report’s executive summary (written in early Oct. 2012 but not published until now) and other materials can remain more or less intact now that I’ve read the rest of the report (State of Science and Technology in Canada, 2012 [link to full PDF report]). Overall, I think it’s a useful and good report despite what I consider to be some significant shortcomings, not least of which is the uncritical acceptance of the view that Canada doesn’t patent enough of its science and that its copyright laws are insufficient.
My concern regarding the technometrics (counting patents) is definitely not echoed in the report,
One key weakness of these measures is that not all types of technology development lead to patentable technologies. Some, such as software development, are typically subject to copyright instead. This is particularly relevant for research fields where software development may be a key aspect of developing new technologies such as computer sciences or digital media. Even when patenting is applicable as a means of commercializing and protecting intellectual property (IP), not all inventions are patented. (p. 18 print, p. 42 PDF)
In my view this is a little bit like fussing over the electrical wiring when the foundations of your house are in such bad repair that the whole structure is in imminent danger of collapse. As noted in my critique of the executive summary, the patent system in the US and elsewhere is in deep, deep trouble and is, in fact, hindering innovation. Here’s an interesting comment about patent issues being covered in the media (from a Dec. 27, 2012 posting by Mike Masnick for Techdirt),
There’s been a recent uptick in stories about patent trolling getting mainstream media attention, and the latest example is a recent segment on CBS’s national morning program, CBS This Morning, which explored how patent trolls are hurting the US economy …
… After the segment, done by Jeff Glor, one of the anchors specifically says to him [Austin Meyer of the Laminar company, which is fighting a patent troll in court and getting coverage on the morning news]: “So it sounds like this is really stifling innovation and it hurts small businesses!”
Getting back to the report, I’m in more sympathy with the panel’s use of bibliometrics,
As a mode of research assessment, bibliometric analysis has several important advantages. First, these techniques are built on a well-developed foundation of quantitative data. Publication in peer-reviewed journals is a cornerstone of research dissemination in most scientific and academic disciplines, and bibliometric data are therefore one of the few readily available sources of quantitative information on research activity that allow for comparisons across many fields of research. Second, bibliometric analyses are able to provide information about both research productivity (i.e., the quantity of journal articles produced) and research impact (measured through citations). While there are important methodological issues associated with these metrics (e.g., database coverage by discipline, correct procedures for normalization and aggregation, self-citations, and negative citations, etc.), [emphasis mine] most bibliometric experts agree that, when used appropriately, citation based indicators can be valid measures of the degree to which research has had an impact on later scientific work … (p. 15 print, p. 39, PDF)
Still, I do think that a positive publication bias (i.e., the tendency to publish positive results over negative or inconclusive results) in medical research should have been mentioned, as it is a major area of concern in the use of bibliometrics, especially since medical research is one of the identified areas of Canadian excellence.
The report’s critique of the opinion surveys has to be the least sophisticated in the entire report,
There are limitations related to the use of opinion surveys generally. The most important of these is simply that their results are, in the end, based entirely on the opinions of those surveyed. (p. 20 print, p. 44 PDF)
Let’s see if I’ve got this right. Counting the citations for a paper which was peer-reviewed (i.e., a set of experts were asked for their opinions about the paper prior to publication) and which may have been published due to a positive publication bias yields data (bibliometrics) which are by definition more reliable than an opinion. In short, the Holy Grail (a sacred object in Christian traditions) is data, even though that data or ‘evidence’ is demonstrably based on and biased by opinion, which the report writers themselves identify as a limitation. Talk about a conundrum.
Sadly the humanities, arts, and social sciences (but especially humanities and arts) posed quite the problem regarding evidence-based analysis,
While the Panel believes that most other evidence-gathering activities undertaken for this assessment are equally valid across all fields, the limitations of bibliometrics led the Panel to seek measures of the impact of HASS [Humanities, Arts, and Social Sciences] research that would be equivalent to the use of bibliometrics, and would measure knowledge dissemination by books, book chapters, international awards, exhibitions, and other arts productions (e.g., theatre, cinema, etc.). Despite considerable efforts to collect information, however, the Panel found the data to be sparse and methods to collect it unreliable, such that it was not possible to draw conclusions from the resulting data. In short, the available data for HASS-specific outputs did not match the quality and rigour of the other evidence collected for this report. As a result, this evidence was not used in the Panel’s deliberations.
Interestingly, the expert panel was led by Dr. Eliot Phillipson, Sir John and Lady Eaton Professor of Medicine Emeritus, [emphasis mine] University of Toronto, who received his MD in 1963. Evidence-based medicine is the ne plus ultra of medical publishing these days. Is this deep distress over a lack of evidence/data in other fields a reflection of the chair’s biases? In all the discussion and critique of the methodologies, there was no discussion of reflexivity, i.e., the researcher’s or, in this case, the individual panel members’ (individually or collectively) biases and their possible impact on the report. Even with so-called evidence-based medicine, bias and opinion are issues.
While the panel was not tasked to look into business-led R&D efforts (there is a forthcoming assessment focused on that question), mention was made in Chapter 3 (Research Investment) of the report. I was particularly pleased to see mention of the now defunct Nortel with its important century-long contribution to Canadian R&D efforts. [Full disclosure: I did contract work for Nortel on and off for two years.]
A closer look at recent R&D expenditure trends shows that Canada’s total investment in R&D has declined in real terms between 2006 and 2010, driven mainly by declining private-sector research performance. Both government and higher education R&D expenditures increased modestly over the same five-year period (growing by 4.5 per cent and 7.1 per cent respectively), while business R&D declined by 17 per cent (see Figure 3.3). Much of this decline can be attributed to the failing fortunes and bankruptcy of Nortel Networks Corporation, which was one of Canada’s top corporate R&D spenders for many years. Between 2008 and 2009 alone, global R&D expenditure at Nortel dropped by 48 per cent, from nearly $1.7 billion to approximately $865 million (Re$earch Infosource, 2010) with significant impact on Canada. Although growth in R&D expenditure at other Canadian companies, particularly Research In Motion, partially compensated for the decline at Nortel, the overall downward trend remains. (p. 30 print, p. 54 PDF)
Chapter 4 of the report (Research Productivity and Impact) is filled with colourful tables and various diagrams and charts illustrating areas of strength and weakness within the Canadian research endeavour, my concerns over the metrics notwithstanding. I was a bit startled by our strength in Philosophy and Theology (Table 4.2 on p. 41 print, p. 65 PDF) as it was not touted in the initial publicity about the report. Of course, they can’t mention everything, so there are some other pleasant surprises in here. Going in the other direction, I’m a little disturbed by the drop (down from 1.32 in 1999-2004 to 1.12 in 2005-2010) in the ICT (Information and Communication Technologies) specialization index but that is, as the report notes, a consequence of the Nortel loss, and ICT scores better in other measures.
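For readers wondering what a specialization index score like that 1.12 actually measures, here’s a rough sketch. To be clear, this is a common textbook definition (a country’s share of world publications in a field relative to its share across all fields), not necessarily the panel’s exact formula, and the numbers below are invented for illustration:

```python
# Sketch of a specialization index (SI): a country's publication share in a
# field, relative to its overall publication share. SI > 1.0 means the
# country is relatively specialized in that field. All counts are invented.

def specialization_index(country_field, country_total, world_field, world_total):
    """SI = (country_field / country_total) / (world_field / world_total)."""
    country_share = country_field / country_total
    world_share = world_field / world_total
    return country_share / world_share

# Invented example: 500 ICT papers out of 10,000 national papers,
# against 40,000 ICT papers out of 1,000,000 worldwide.
si = specialization_index(500, 10_000, 40_000, 1_000_000)
print(round(si, 2))  # 1.25 -> relatively specialized in ICT
```

A drop from 1.32 to 1.12, then, means Canada’s ICT output shrank relative to its overall output faster than the world’s did, which is exactly what losing a Nortel would do.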
I very much appreciated the inclusion of the questions used in the surveys and the order in which they were asked, a practice which seems to be disappearing elsewhere. The discussion about possible biases and how the data was weighted to account for biases is interesting,
Because the responding population was significantly different than the sample population (p<0.01) for some countries, the data were weighted to correct for over- or under-representation. For example, Canadians accounted for 4.4 per cent of top-cited researchers, but 7.0 per cent of those that responded. After weighting, Canadians account for 4.4 per cent in the analyses that follow. This weighting changed overall results of how many people ranked each country in the top five by less than one per cent.
Even with weighting to remove bias in choice to respond, there could be a perception that self-selection is responsible for some results. Top-cited Canadian researchers in the population sample were not excluded from the survey but the results for Canada cannot be explained by self-promotion since 37 per cent of all respondents identified Canada among the top five countries in their field, but only 7 per cent (4.4 per cent after weighting) of respondents were from Canada. Similarly, 94 per cent of respondents identified the United States as a top country in their field, yet only 33 per cent (41 per cent after weighting) were from the United States. Furthermore, only 9 per cent of respondents had either worked or studied in Canada, and 28 per cent had no personal experience of, or association with, Canada or Canadian researchers (see Table 5.2). It is reasonable to conclude that the vast majority of respondents based their evaluation of Canadian S&T on its scientific contributions and reputation alone. (p. 65 print, p. 89 PDF)
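The weighting the panel describes is straightforward arithmetic. Here’s a minimal sketch using the Canadian figures quoted above (4.4 per cent of the population of top-cited researchers vs. 7.0 per cent of respondents); the respondent counts are invented for illustration:

```python
# Sketch of the post-stratification weighting described in the report:
# each respondent is weighted by (population share) / (respondent share)
# for their country, so weighted tallies match the sample frame.
# The 4.4% and 7.0% figures come from the report; the counts are invented.

def country_weight(pop_share, resp_share):
    """Weight that restores a country's population share after tallying."""
    return pop_share / resp_share

# Canadians: 4.4% of top-cited researchers, but 7.0% of respondents.
w_canada = country_weight(0.044, 0.070)

# 70 Canadian respondents out of 1,000 would count, after weighting,
# as 70 * w_canada = 44, i.e., back to 4.4% of the total.
weighted_canadians = 70 * w_canada
print(round(weighted_canadians, 1))  # 44.0
```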
There is another possible bias not mentioned in the report and that has to do with answering the question: What do you think my strengths and weaknesses are? If somebody asks you that question and you are replying directly, you are likely to focus on their strong points and be as gentle as possible about their weaknesses. Perhaps the panel should consider having another country ask those questions about Canadian research. We might find the conversation becomes a little more forthright and critical.
Chapter 6 of the report discusses research collaboration, which is acknowledged as poorly served by bibliometrics. Of course, collaboration is a strategy with which Canadians have succeeded, not least because we simply don’t have the resources to go it alone.
One of the features I quite enjoyed in this report is the set of spotlight features. For example, there’s the one on stem cell research,
Spotlight on Canadian Stem Cell Research
Stem cells were discovered by two Canadian researchers, Dr. James Till and the late Dr. Ernest McCulloch, at the University of Toronto over 50 years ago. This great Canadian contribution to medicine laid the foundation for all stem cell research, and put Canada firmly at the forefront of this field, an international leadership position that is still maintained.
Stem cell research, which is increasingly important to the future of cell replacement therapy for diseased or damaged tissues, spans many disciplines. These disciplines include biology, genetics, bioengineering, social sciences, ethics and law, chemical biology, and bioinformatics. The research aims to understand the mechanisms that govern stem cell behaviour, particularly as it relates to disease development and ultimately treatments or cures.
Stem cell researchers in Canada have a strong history of collaboration that has been supported and strengthened since 2001 by the Stem Cell Network (SCN) (one of the federal Networks of Centres of Excellence), a network considered to be a world leader in the field. Grants awarded through the SCN alone have affected the work of more than 125 principal investigators working in 30 institutions from Halifax to Vancouver. Particularly noteworthy institutions include the Terry Fox Laboratory at the BC Cancer Agency; the Hotchkiss Brain Institute in Calgary; Toronto’s Hospital for Sick Children, Mount Sinai Hospital, University Health Network, and the University of Toronto; the Sprott Centre for Stem Cell Research in Ottawa; and the Institute for Research in Immunology and Cancer in Montréal. In 2010, a new Centre for the Commercialization of Regenerative Medicine was formed to further support stem cell initiatives of interest to industry partners.
Today, Canadian researchers are among the most influential in the stem cell and regenerative medicine field. SCN investigators have published nearly 1,000 papers since 2001 in areas such as cancer stem cells; the endogenous repair of heart, muscle, and neural systems; the expansion of blood stem cells for the treatment of a variety of blood-borne diseases; the development of biomaterials for the delivery and support of cellular structures to replace damaged tissues; the direct conversion of skin stem cells to blood; the evolutionary analysis of leukemia stem cells; the identification of pancreatic stem cells; and the isolation of multipotent blood stem cells capable of forming all cells in the human blood system. (p. 96 print, p. 120 PDF)
Getting back to the report and my concerns, Chapter 8 on S&T capacity focuses on science training and education,
• From 2005 to 2009, there were increases in the number of students graduating from Canadian universities at the college, undergraduate, master’s and doctoral levels, with the largest increase at the doctoral level.
• Canada ranks first in the world for its share of population with post-secondary education.
• International students comprise 11 per cent of doctoral students graduating from Canadian universities. The fields with the largest proportions of international students include Earth and Environmental Sciences; Mathematics and Statistics; Agriculture, Fisheries, and Forestry; and Physics and Astronomy.
• From 1997 to 2010, Canada experienced a positive migration flow of researchers, particularly in the fields of Clinical Medicine, Information and Communication Technologies (ICT), Engineering, and Chemistry. Based on Average Relative Citations, the quality of researchers emigrating and immigrating was comparable.
• In three-quarters of fields, the majority of top-cited researchers surveyed thought Canada has world-leading research infrastructure or programs. (p. 118 print, p. 142 PDF)
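Since Average Relative Citations (ARC) does a fair bit of work in this report, here’s a rough sketch of how a field-normalized citation score of this kind is typically computed: each paper’s citation count is divided by the world average for its field, and the ratios are averaged. This is a generic version, not necessarily the panel’s exact method, and the papers and field averages below are invented:

```python
# Sketch of an Average Relative Citations (ARC) style score. Each paper's
# citations are normalized by the world-average citations for its field
# (in practice, also its publication year), then the ratios are averaged.
# A score above 1.0 means above world average. All figures are invented.

def average_relative_citations(papers, field_averages):
    """papers: list of (field, citations); field_averages: field -> world mean."""
    ratios = [cites / field_averages[field] for field, cites in papers]
    return sum(ratios) / len(ratios)

papers = [("physics", 12), ("physics", 6), ("chemistry", 10)]
field_averages = {"physics": 8.0, "chemistry": 5.0}
print(round(average_relative_citations(papers, field_averages), 2))  # 1.42
```

The normalization is what makes it (roughly) fair to compare emigrating and immigrating researchers across fields, as the bullet above does.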
Getting back to more critical matters, I don’t see a reference to jobs in this report. It’s all very well to graduate a large number of science PhDs, which we do, but what’s the point if they can’t find work?
The Black Whole blog on the University Affairs website has discussed and continues to discuss the dearth of jobs in Canada for science graduates.
Chapter 9 of the report breaks down the information on a regional (provincial) basis. As you might expect, the research powerhouses are Ontario, Québec, Alberta and BC. Chapter 10 summarizes the material on a field basis, i.e., Biology; Chemistry; Agriculture, Fisheries, and Forestry; Economics; Social Sciences; etc., and those results were widely discussed at the time and are mentioned in part 1 of this commentary.
One of the most striking results in the report is Chapter 11: Conclusions,
The geographic distribution of the six fields of strength is difficult to determine with precision because of the diminished reliability of data below the national level, and the vastly different size of the research enterprise in each province.
The most reliable data that are independent of size are provincial ARC scores. Using this metric, the leading provinces in each field are as follows:
- Clinical Medicine: Ontario, Quebec, British Columbia, Alberta
- Historical Studies: New Brunswick, Ontario, British Columbia
- ICT: British Columbia, Ontario
- Physics and Astronomy: British Columbia, Alberta, Ontario, Quebec
- Psychology and Cognitive Sciences: British Columbia, Nova Scotia, Ontario
- Visual and Performing Arts: Quebec [emphasis mine] (p. 193 print, p. 217 PDF)
Canada has an international reputation in the visual and performing arts which is driven by one province alone.
As for our fading national reputation in natural resources and environmental S&T, that seems predictable to almost any informed observer given funding decisions over the last several years.
The report does identify some emerging strengths,
Although robust methods of identifying emerging areas of S&T are still in their infancy, the Panel used new bibliometric techniques to identify research clusters and their rates of growth. Rapidly emerging research clusters in Canada have keywords relating, most notably, to:
• wireless technologies and networking,
• information processing and computation,
• nanotechnologies and carbon nanotubes, and
• digital media technologies.
The Survey of Canadian S&T Experts pointed to personalized medicine and health care, several energy technologies, tissue engineering, and digital media as areas in which Canada is well placed to become a global leader in development and application. (p. 195 print; p. 219 PDF)
I wish I were better and faster at crunching numbers because I’d like to spend time examining the data more closely, but the reality is that all data are imperfect, so this report, like any snapshot, is an approximation. Still, I would have liked to have seen some mention of changing practices in science. For example, there’s the protein-folding game, Foldit, which has attracted over 50,000 players (citizen scientists) who have answered questions and posed possibilities that had not occurred to scientists. Whether this trend continues or disappears is a question for the future. What I find disconcerting is how thoroughly this and other shifting practices (scientists publishing research in blogs) and thorny issues such as the highly problematic patent system were ignored. Individual panel members or the report writers themselves may have wanted to include some mention but we’ll never know because the report is presented as a singular, united authority.
In any event, Bravo! to the expert panel and their support team as this can’t have been an easy job.
If you have anything to say about this commentary or the report, please do comment; I would love to hear more opinions.