The State of Science and Technology in Canada, 2012 report—examined (part 2: the rest of the report)

The critiques I offered in relation to the report’s executive summary (written in early Oct. 2012 but not published ’til now) and other materials can remain more or less intact now that I’ve read the rest of the report (State of Science and Technology in Canada, 2012 [link to full PDF report]). Overall, I think it’s a useful and good report despite what I consider to be some significant shortcomings, not least of which is the uncritical acceptance of the view that Canada doesn’t patent enough of its science and that its copyright laws are insufficient.

My concern regarding the technometrics (counting patents) is definitely not echoed in the report,

One key weakness of these measures is that not all types of technology development lead to patentable technologies. Some, such as software development, are typically subject to copyright instead. This is particularly relevant for research fields where software development may be a key aspect of developing new technologies such as computer sciences or digital media. Even when patenting is applicable as a means of commercializing and protecting intellectual property (IP), not all inventions are patented. (p. 18 print, p. 42 PDF)

In my view this is a little bit like fussing over the electrical wiring when the foundations of your house are in such bad repair that the whole structure is in imminent danger of collapse. As noted in my critique of the executive summary, the patent system in the US and elsewhere is in deep, deep trouble and is, in fact, hindering innovation. Here’s an interesting comment about patent issues being covered in the media (from a Dec. 27, 2012 posting by Mike Masnick for Techdirt),

There’s been a recent uptick in stories about patent trolling getting mainstream media attention, and the latest example is a recent segment on CBS’s national morning program, CBS This Morning, which explored how patent trolls are hurting the US economy …

… After the segment, done by Jeff Glor, one of the anchors specifically says to him [Austin Meyer of Laminar Research, a company that is fighting a patent troll in court and getting coverage on the morning news]: “So it sounds like this is really stifling innovation and it hurts small businesses!”

Getting back to the report, I’m in more sympathy with the panel’s use of bibliometrics,

As a mode of research assessment, bibliometric analysis has several important advantages. First, these techniques are built on a well-developed foundation of quantitative data. Publication in peer-reviewed journals is a cornerstone of research dissemination in most scientific and academic disciplines, and bibliometric data are therefore one of the few readily available sources of quantitative information on research activity that allow for comparisons across many fields of research. Second, bibliometric analyses are able to provide information about both research productivity (i.e., the quantity of journal articles produced) and research impact (measured through citations). While there are important methodological issues associated with these metrics (e.g., database coverage by discipline, correct procedures for normalization and aggregation, self-citations, and negative citations, etc.), [emphasis mine] most bibliometric experts agree that, when used appropriately, citation based indicators can be valid measures of the degree to which research has had an impact on later scientific work … (p. 15 print, p. 39, PDF)
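Since the report leans so heavily on citation-based indicators, and later on Average Relative Citations (ARC) scores in particular, it may help to see roughly how a field-normalized citation metric is put together. The report doesn’t spell out the calculation, so this is only a minimal sketch under the usual assumption that each paper’s citation count is divided by the world average for its field and publication year; all the records below are made up.

```python
from collections import defaultdict

# Hypothetical records: (country, field, year, citation_count).
papers = [
    ("Canada", "Clinical Medicine", 2008, 12),
    ("Canada", "Clinical Medicine", 2008, 3),
    ("Canada", "ICT", 2009, 0),
    ("USA", "Clinical Medicine", 2008, 20),
    ("USA", "ICT", 2009, 4),
    ("Germany", "ICT", 2009, 2),
]

# 1. World average citations per (field, year): the normalization baseline.
totals, counts = defaultdict(float), defaultdict(int)
for _, field, year, cites in papers:
    totals[(field, year)] += cites
    counts[(field, year)] += 1
world_avg = {key: totals[key] / counts[key] for key in totals}

# 2. Average relative citations: each paper's citations divided by its
#    field/year average, then averaged per country. A score of 1.0 means
#    the country's papers are cited exactly at the world rate.
def average_relative_citations(country):
    ratios = [cites / world_avg[(field, year)]
              for c, field, year, cites in papers
              if c == country and world_avg[(field, year)] > 0]
    return sum(ratios) / len(ratios) if ratios else float("nan")

for country in ("Canada", "USA", "Germany"):
    print(country, round(average_relative_citations(country), 2))
```

The point of the normalization is that a heavily cited clinical-medicine paper and a modestly cited mathematics paper can both score above 1.0 relative to their own fields, which is what makes cross-field and cross-country comparisons plausible at all.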

Still, I do think that a positive publication bias (i.e., the tendency to publish positive results over negative or inconclusive results) in the field of medical research should have been mentioned, as it is a major area of concern in the use of bibliometrics, especially since one of the identified areas of Canadian excellence is medical research.

The report’s critique of the opinion surveys has to be the least sophisticated in the entire report,

There are limitations related to the use of opinion surveys generally. The most important of these is simply that their results are, in the end, based entirely on the opinions of those surveyed. (p. 20 print, p. 44 PDF)

Let’s see if I’ve got this right. Counting the number of citations received by a paper, which was peer-reviewed (i.e., a set of experts were asked for their opinions about the paper prior to publication) and which may have been published due to a positive publication bias, yields data (bibliometrics) which are by definition more reliable than an opinion. In short, the Holy Grail (a sacred object in Christian traditions) is data, even though that data or ‘evidence’ is provably based on and biased by opinion, which the report writers identify as a limitation. Talk about a conundrum.

Sadly the humanities, arts, and social sciences (but especially humanities and arts) posed quite the problem regarding evidence-based analysis,

While the Panel believes that most other evidence-gathering activities undertaken for this assessment are equally valid across all fields, the limitations of bibliometrics led the Panel to seek measures of the impact of HASS [Humanities, Arts, and Social Sciences] research that would be equivalent to the use of bibliometrics, and would measure knowledge dissemination by books, book chapters, international awards, exhibitions, and other arts productions (e.g., theatre, cinema, etc.). Despite considerable efforts to collect information, however, the Panel found the data to be sparse and methods to collect it unreliable, such that it was not possible to draw conclusions from the resulting data. In short, the available data for HASS-specific outputs did not match the quality and rigour of the other evidence collected for this report. As a result, this evidence was not used in the Panel’s deliberations.

Interestingly, the expert panel was led by Dr. Eliot Phillipson, Sir John and Lady Eaton Professor of Medicine Emeritus, [emphasis mine] University of Toronto, who received his MD in 1963. Evidence-based medicine is the ne plus ultra of medical publishing these days. Is this deep distress over a lack of evidence/data in other fields a reflection of the chair’s biases? In all the discussion and critique of the methodologies, there was no discussion of reflexivity, i.e., the researchers’ or, in this case, the individual panel members’ (individual or collective) biases and their possible impact on the report. Even with so-called evidence-based medicine, bias and opinion are issues.

While the panel was not tasked to look into business-led R&D efforts (there is a forthcoming assessment focused on that question), mention was made of them in Chapter 3 (Research Investment) of the report. I was particularly pleased to see mention of the now defunct Nortel and its important, century-long contribution to Canadian R&D efforts. [Full disclosure: I did contract work for Nortel on and off for two years.]

A closer look at recent R&D expenditure trends shows that Canada’s total investment in R&D has declined in real terms between 2006 and 2010, driven mainly by declining private-sector research performance. Both government and higher education R&D expenditures increased modestly over the same five-year period (growing by 4.5 per cent and 7.1 per cent respectively), while business R&D declined by 17 per cent (see Figure 3.3). Much of this decline can be attributed to the failing fortunes and bankruptcy of Nortel Networks Corporation, which was one of Canada’s top corporate R&D spenders for many years. Between 2008 and 2009 alone, global R&D expenditure at Nortel dropped by 48 per cent, from nearly $1.7 billion to approximately $865 million (Re$earch Infosource, 2010) with significant impact on Canada. Although growth in R&D expenditure at other Canadian companies, particularly Research In Motion, partially compensated for the decline at Nortel, the overall downward trend remains. (p. 30 print, p. 54 PDF)

Chapter 4 of the report (Research Productivity and Impact) is filled with colourful tables and various diagrams and charts illustrating areas of strength and weakness within the Canadian research endeavour, my concerns over the metrics notwithstanding. I was a bit startled by our strength in Philosophy and Theology (Table 4.2 on p. 41 print, p. 65 PDF), as it was not touted in the initial publicity about the report. Of course, they can’t mention everything, so there are some other pleasant surprises in here. Going in the other direction, I’m a little disturbed by the drop (down from 1.32 in 1999-2004 to 1.12 in 2005-2010) in the ICT (Information and Communication Technologies) specialization index, but that is, as the report notes, a consequence of the Nortel loss, and ICT scores better on other measures.
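For what it’s worth, the report doesn’t derive the specialization index in the passages I’ve quoted. Indices of this kind are typically calculated as the share of a country’s own output devoted to a field divided by that field’s share of world output, so a value above 1 means the country publishes disproportionately in that field. Here’s a minimal sketch under that assumption; the publication counts are invented.

```python
# Hypothetical publication counts by field for one country and for the world.
canada = {"ICT": 8_000, "Clinical Medicine": 30_000, "Other": 62_000}
world = {"ICT": 90_000, "Clinical Medicine": 250_000, "Other": 660_000}

def specialization_index(field, country_counts, world_counts):
    """Country's share of its own output in `field`, divided by the
    world's share of output in `field`. Above 1.0 = relatively specialized."""
    country_share = country_counts[field] / sum(country_counts.values())
    world_share = world_counts[field] / sum(world_counts.values())
    return country_share / world_share

for field in ("ICT", "Clinical Medicine"):
    print(field, round(specialization_index(field, canada, world), 2))
```

Read that way, the post-Nortel drop means ICT went from claiming a noticeably larger slice of Canadian output than it claims of world output to only a slightly larger one.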

I very much appreciated the inclusion of the questions used in the surveys and the order in which they were asked, a practice which seems to be disappearing elsewhere. The discussion about possible biases and how the data were weighted to account for them is interesting,

Because the responding population was significantly different than the sample population (p<0.01) for some countries, the data were weighted to correct for over- or under-representation. For example, Canadians accounted for 4.4 per cent of top-cited researchers, but 7.0 per cent of those that responded. After weighting, Canadians account for 4.4 per cent in the analyses that follow. This weighting changed overall results of how many people ranked each country in the top five by less than one per cent.

Even with weighting to remove bias in choice to respond, there could be a perception that self-selection is responsible for some results. Top-cited Canadian researchers in the population sample were not excluded from the survey but the results for Canada cannot be explained by self-promotion since 37 per cent of all respondents identified Canada among the top five countries in their field, but only 7 per cent (4.4 per cent after weighting) of respondents were from Canada. Similarly, 94 per cent of respondents identified the United States as a top country in their field, yet only 33 per cent (41 per cent after weighting) were from the United States. Furthermore, only 9 per cent of respondents had either worked or studied in Canada, and 28 per cent had no personal experience of, or association with, Canada or Canadian researchers (see Table 5.2). It is reasonable to conclude that the vast majority of respondents based their evaluation of Canadian S&T on its scientific contributions and reputation alone. (p. 65 print, p. 89 PDF)
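The weighting described here looks like standard post-stratification: each respondent is weighted by the ratio of their country’s share of the sampled population (the top-cited researchers) to its share of the actual respondents, so that over-represented countries count for less and under-represented ones for more. Here’s a small sketch of the arithmetic; the Canadian and US shares come from the passage above, while the ‘Other’ bucket is simply the remainder and is my own invention.

```python
# Country shares of the sample population (top-cited researchers) vs. the
# survey respondents. Canada and USA figures are from the report's survey
# chapter; "Other" is just the remainder.
population_share = {"Canada": 0.044, "USA": 0.41, "Other": 0.546}
respondent_share = {"Canada": 0.070, "USA": 0.33, "Other": 0.600}

# Post-stratification weight: population share divided by respondent share.
weights = {c: population_share[c] / respondent_share[c] for c in population_share}

def weighted_share(country):
    """Share of the weighted responses contributed by `country`."""
    total = sum(respondent_share[c] * weights[c] for c in respondent_share)
    return respondent_share[country] * weights[country] / total

for country in weights:
    print(country, round(weights[country], 2), round(weighted_share(country), 3))
```

The Canadian weight works out to roughly 0.63, which is how 7.0 per cent of respondents collapses back to the 4.4 per cent used in the analyses.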

There is another possible bias not mentioned in the report, and it has to do with answering the question: what do you think my strengths and weaknesses are? If somebody asks you that question and you are replying directly, you are likely to focus on their strong points and be as gentle as possible about their weaknesses. Perhaps the panel should consider having another country ask those questions about Canadian research. We might find the conversation becomes a little more forthright and critical.

Chapter 6 of the report discusses research collaboration, which is acknowledged as poorly served by bibliometrics. Of course, collaboration is a strategy with which Canadians have succeeded, not least because we simply don’t have the resources to go it alone.

One feature I quite enjoyed in this report is the series of spotlights. For example, there’s the one on stem cell research,

Spotlight on Canadian Stem Cell Research

Stem cells were discovered by two Canadian researchers, Dr. James Till and the late Dr. Ernest McCulloch, at the University of Toronto over 50 years ago. This great Canadian contribution to medicine laid the foundation for all stem cell research, and put Canada firmly at the forefront of this field, an international leadership position that is still maintained.

Stem cell research, which is increasingly important to the future of cell replacement therapy for diseased or damaged tissues, spans many disciplines. These disciplines include biology, genetics, bioengineering, social sciences, ethics and law, chemical biology, and bioinformatics. The research aims to understand the mechanisms that govern stem cell behaviour, particularly as it relates to disease development and ultimately treatments or cures.

Stem cell researchers in Canada have a strong history of collaboration that has been supported and strengthened since 2001 by the Stem Cell Network (SCN) (one of the federal Networks of Centres of Excellence), a network considered to be a world leader in the field. Grants awarded through the SCN alone have affected the work of more than 125 principal investigators working in 30 institutions from Halifax to Vancouver. Particularly noteworthy institutions include the Terry Fox Laboratory at the BC Cancer Agency; the Hotchkiss Brain Institute in Calgary; Toronto’s Hospital for Sick Children, Mount Sinai Hospital, University Health Network, and the University of Toronto; the Sprott Centre for Stem Cell Research in Ottawa; and the Institute for Research in Immunology and Cancer in Montréal. In 2010, a new Centre for the Commercialization of Regenerative Medicine was formed to further support stem cell initiatives of interest to industry partners.

Today, Canadian researchers are among the most influential in the stem cell and regenerative medicine field. SCN investigators have published nearly 1,000 papers since 2001 in areas such as cancer stem cells; the endogenous repair of heart, muscle, and neural systems; the expansion of blood stem cells for the treatment of a variety of blood-borne diseases; the development of biomaterials for the delivery and support of cellular structures to replace damaged tissues; the direct conversion of skin stem cells to blood; the evolutionary analysis of leukemia stem cells; the identification of pancreatic stem cells; and the isolation of multipotent blood stem cells capable of forming all cells in the human blood system. (p. 96 print, p. 120 PDF)

Getting back to the report and my concerns, Chapter 8 on S&T capacity focuses on science training and education,

• From 2005 to 2009, there were increases in the number of students graduating from Canadian universities at the college, undergraduate, master’s and doctoral levels, with the largest increase at the doctoral level.

• Canada ranks first in the world for its share of population with post-secondary education.

• International students comprise 11 per cent of doctoral students graduating from Canadian universities. The fields with the largest proportions of international students include Earth and Environmental Sciences; Mathematics and Statistics; Agriculture, Fisheries, and Forestry; and Physics and Astronomy.

• From 1997 to 2010, Canada experienced a positive migration flow of researchers, particularly in the fields of Clinical Medicine, Information and Communication Technologies (ICT), Engineering, and Chemistry. Based on Average Relative Citations, the quality of researchers emigrating and immigrating was comparable.

• In three-quarters of fields, the majority of top-cited researchers surveyed thought Canada has world-leading research infrastructure or programs. (p. 118 print, p. 142 PDF)

Getting back to more critical matters, I don’t see a reference to jobs in this report. It’s all very well to graduate a large number of science PhDs, which we do, but what’s the point if they can’t find work?

The Black Whole blog on the University Affairs website has discussed and continues to discuss the dearth of jobs in Canada for science graduates.

Chapter 9 of the report breaks down the information on a regional (provincial) basis. As you might expect, the research powerhouses are Ontario, Québec, Alberta, and BC. Chapter 10 summarizes the material on a field basis, i.e., Biology; Chemistry; Agriculture, Fisheries, and Forestry; Economics; Social Sciences; etc., and those results were widely discussed at the time and are mentioned in part 1 of this commentary.

One of the most striking results in the report appears in Chapter 11: Conclusions,

The geographic distribution of the six fields of strength is difficult to determine with precision because of the diminished reliability of data below the national level, and the vastly different size of the research enterprise in each province.

The most reliable data that are independent of size are provincial ARC scores. Using this metric, the leading provinces in each field are as follows:

  • Clinical Medicine: Ontario, Quebec, British Columbia, Alberta
  • Historical Studies: New Brunswick, Ontario, British Columbia
  • ICT: British Columbia, Ontario
  • Physics and Astronomy: British Columbia, Alberta, Ontario, Quebec
  • Psychology and Cognitive Sciences: British Columbia, Nova Scotia, Ontario
  • Visual and Performing Arts: Quebec [emphasis mine] (p. 193 print, p. 217 PDF)

Canada has an international reputation in visual and performing *arts* which is driven by one province alone.

As for our fading national reputation in natural resources and environmental S&T, that seems predictable to almost any informed observer, given the funding decisions of the last several years.

The report does identify some emerging strengths,

Although robust methods of identifying emerging areas of S&T are still in their infancy, the Panel used new bibliometric techniques to identify research clusters and their rates of growth. Rapidly emerging research clusters in Canada have keywords relating, most notably, to:

• wireless technologies and networking,

• information processing and computation,

• nanotechnologies and carbon nanotubes, and

• digital media technologies.

The Survey of Canadian S&T Experts pointed to personalized medicine and health care, several energy technologies, tissue engineering, and digital media as areas in which Canada is well placed to become a global leader in development and application. (p. 195 print; p. 219 PDF)
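The report doesn’t describe these ‘new bibliometric techniques’ in any detail, so, purely as an illustration, here’s one crude way to rank research keywords by how quickly they are growing between two publication windows. A real analysis would first build clusters from co-word or co-citation data rather than looking at single keywords; every record below is invented.

```python
from collections import Counter

# Hypothetical keyword tags on Canadian papers in two time windows.
papers_2003_2006 = [
    {"wireless", "networking"}, {"carbon nanotubes"}, {"databases"},
]
papers_2007_2010 = [
    {"wireless", "networking"}, {"wireless"}, {"carbon nanotubes"},
    {"carbon nanotubes", "nanotechnology"}, {"digital media"}, {"databases"},
]

def keyword_counts(papers):
    """Count how many papers mention each keyword."""
    counts = Counter()
    for keywords in papers:
        counts.update(keywords)
    return counts

early = keyword_counts(papers_2003_2006)
late = keyword_counts(papers_2007_2010)

# Growth between the two windows, with add-one smoothing so keywords absent
# from the early window still get a finite growth rate.
growth = {kw: late[kw] / (early[kw] + 1) for kw in late}
for kw, rate in sorted(growth.items(), key=lambda item: item[1], reverse=True):
    print(f"{kw}: {rate:.2f}")
```

Whatever the panel’s actual method, the idea is the same: the interesting clusters are the ones whose recent output is growing much faster than the baseline.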

I wish I were better and faster at crunching numbers because I’d like to spend time examining the data more closely, but the reality is that all data are imperfect, so this report, like any snapshot, is an approximation. Still, I would have liked to have seen some mention of changing practices in science. For example, there’s the protein-folding game, Foldit, which has attracted over 50,000 players (citizen scientists) who have answered questions and posed possibilities that had not occurred to scientists. Whether this trend will continue or disappear is a question to be answered in the future. What I find disconcerting is how thoroughly this and other shifting practices (scientists publishing research in blogs) and thorny issues such as the highly problematic patent system were ignored. Individual panel members or the report writers themselves may have wanted to include some mention, but we’ll never know because the report is presented as a singular, united authority.

In any event, Bravo! to the expert panel and their support team as this can’t have been an easy job.

If you have anything to say about this commentary or the report, please do comment; I would love to hear more opinions.

*’arts’ added Jan. 19, 2016.

The State of Science and Technology in Canada, 2012 report—examined (part 1: the executive summary)

In my Sept. 27, 2012 posting about its launch, we celebrated the Council of Canadian Academies’ report, The State of Science and Technology in Canada, 2012, unconditionally. Today (Dec. 28, 2012), it’s time for a closer look.

I’m going to start with the report’s executive summary and some of the background information. Here’s the question the 18-member expert panel attempted to answer,

What is the current state of science and technology in Canada?

Additional direction was provided through two sub-questions:

Considering both basic and applied research fields, what are the scientific disciplines and technological applications in which Canada excels? How are these strengths distributed geographically across the country? How do these trends compare with what has been taking place in comparable countries?

In which scientific disciplines and technological applications has Canada shown the greatest improvement/decline in the last five years? What major trends have emerged? Which scientific disciplines and technological applications have the potential to emerge as areas of prominent strength for Canada?  (p. xi paper, p. 13 PDF)

Here’s more general information about the expert panel,

The Council appointed a multidisciplinary expert panel (the Panel) to address these questions. The Panel’s mandate spanned the full spectrum of fields in engineering, the natural sciences, health sciences, social sciences, the arts, and humanities. It focused primarily on research performed in the higher education sector, as well as the government and not-for-profit sectors. The mandate specifically excluded an examination of S&T performed in the private sector (which is the subject of a separate Council assessment on the state of industrial research and development). The Panel’s report builds upon, updates, and expands the Council’s 2006 report, The State of Science and Technology in Canada. (p. xi paper, p. 13 PDF)

As I noted in my Sept. 27, 2012 posting, the experts have stated,

  • The six research fields in which Canada excels are: clinical medicine, historical studies, information and communication technologies (ICT), physics and astronomy, psychology and cognitive sciences, and visual and performing arts.
  • Canadian science and technology is healthy and growing in both output and impact. With less than 0.5 per cent of the world’s population, Canada produces 4.1 per cent of the world’s research papers and nearly 5 per cent of the world’s most frequently cited papers.
  • In a survey of over 5,000 leading international scientists, Canada’s scientific research enterprise was ranked fourth highest in the world, after the United States, United Kingdom, and Germany.
  • Canada is part of a network of international science and technology collaboration that includes the most scientifically advanced countries in the world. Canada is also attracting high-quality researchers from abroad, such that over the past decade there has been a net migration of researchers into the country.
  • Ontario, Quebec, British Columbia and Alberta are the powerhouses of Canadian science and technology, together accounting for 97 per cent of total Canadian output in terms of research papers. These provinces also have the best performance in patent-related measures and the highest per capita numbers of doctoral students, accounting for more than 90 per cent of doctoral graduates in Canada in 2009.
  • Several fields of specialization were identified in other provinces, such as: agriculture, fisheries, and forestry in Prince Edward Island and Manitoba; historical studies in New Brunswick; biology in Saskatchewan; as well as earth and environmental sciences in Newfoundland and Labrador and Nova Scotia.

The Council did release a backgrounder describing the methodology the experts used to arrive at their conclusions,

In total, the Panel used a number of different methodologies to conduct this assessment, including: bibliometrics (the study of patterns in peer-reviewed journal articles); technometrics (the analysis of patent statistics and indicators); an analysis of highly qualified and skilled personnel; and opinion surveys of Canadian and international experts.

• To draw comparisons among the results derived through the different methodologies, and to integrate the findings, a common classification system was required. The Panel selected a classification system that includes 22 research fields composed of 176 sub-fields, which included fields in the humanities, arts, and social sciences.

Recognizing that some measurement tools used by the Panel (e.g. bibliometric measures) are a less relevant way of measuring science and technology strength in the humanities, arts, and social sciences, where research advances may be less often communicated in peer-reviewed journal articles, the Panel made considerable attempts to evaluate measures such as books and book chapters, exhibitions, and esteem measures such as international awards. However, the Panel was hampered by a lack of available data. As a result, the information and data collected did not meet the Council’s high standards and was excluded from the assessment.

• The Panel determined two measures of quality, a field’s international average relative citations (ARC) rank and its rank in the international survey, to be the most relevant in determining the field’s position compared with other advanced countries. Based on these measures of quality, the …

Bibliometric Analysis (the study of patterns in peer-reviewed journal articles)

• Bibliometric analysis has several advantages, namely, that it is built on a well-developed foundation of quantitative data and it is able to provide information on research productivity and impact.

• For this assessment, the Panel relied heavily on bibliometrics to inform their deliberations. The Panel commissioned a comprehensive analysis of Canadian and world publication trends. It included consideration of many different indicators of output and impact, a study of collaboration patterns, and an analysis of researcher migration. Overall, the resulting research was extensive and critical for determining the research fields in which Canada excels.

• Standard bibliometrics do not identify patterns of collaboration among researchers, and may not adequately capture research activity within an interdisciplinary realm. Therefore, the Panel used advanced bibliometric techniques that allow for the identification of patterns of collaboration between Canadian researchers and those in other countries (based on the co-authorship of research papers); and clusters of related research papers, as an alternative approach to assessing Canada’s research strengths.

Technometrics (analysis of patent statistics and indicators)

• Technometrics is an important tool for determining trends in applied research. This type of analysis is routinely used by the Organisation for Economic Co-operation and Development (OECD) and other international organizations in comparing and assessing science and technology outputs across countries.

• In 2006, the Expert Panel on Science and Technology used technometrics to inform their work. In an effort to ensure consistency between the 2006 and the 2012 assessments, technometrics were once again used as a measurement tool.

• The 2012 Panel commissioned a full analysis of Canadian and international patent holdings in the United States Patent and Trademark Office (USPTO) to capture information about Canada’s patent stock and production of intellectual property relative to other advanced economies. Canadians accounted for 18,000 patented inventions in the USPTO, compared to 12,000 at the Canadian Intellectual Property Office during the period 2005-2010.

Opinion Surveys

• To capture a full range of Canadian science and technology activities and strengths, two extensive surveys were commissioned to gather opinions from Canadian experts and from the top one per cent of cited researchers from around the world.

• A survey of Canadian science and technology experts was conducted for the 2006 report. In 2012 this exercise was repeated; however, the survey was modified with three key changes:

o respondents were pre-chosen to ensure those responding were experts in Canadian science and technology;

o to allow comparisons of bibliometric data, the survey was based on the taxonomy of 22 scientific fields and 176 sub-fields; and

o a question regarding the identification of areas of provincial science and technology strength was added.

• To obtain the opinions of international science and technology experts regarding Canada’s science and technology strengths, the Panel conducted a survey of the top-cited one per cent of international researchers. Over 5,000 responded to the survey, including Canadians. This survey, combined with the results from the bibliometric analysis, was used to determine the top six fields of research in which Canada excels.

…

Research Capacity

• The Panel conducted an analysis related to Canadian research capacity. This analysis drew evidence from a variety of sources including bibliometric data and existing information from publications by organizations such as the OECD and Statistics Canada.

• The Panel was also able to look at various Canadian research capacities which included research infrastructure and facilities, trends in Canada’s research faculty and student populations, the degree of collaboration among researchers in Canada and other countries, and researcher migration between Canada and other countries.

To sum it up, they used bibliometrics (how many citations, publications in peer-reviewed journals, etc.), technometrics (the number of patents filed, etc.), and opinion surveys, along with data from other publications. It sounds very impressive, but I am wondering why Canada is so often unmentioned as a top research country in analyses produced outside of Canada. In the 2011 OECD (Organisation for Economic Co-operation and Development) Science, Technology, and Industry scorecard, we didn’t place all that well, according to my Sept. 27, 2011 posting,

Other topics were covered as well, the page hosting the OECD scorecard information boasts a couple of animations, one of particular interest to me (sadly I cannot embed it here). The item of interest is the animation featuring 30 years of R&D investments in OECD and non-OECD countries. It’s a very lively 16 seconds and you may need to view it a few times. You’ll see some countries rocket out of nowhere to make their appearance on the chart (Finland and Korea come to mind) and you’ll see some countries progress steadily while others fall back. The Canadian trajectory shows slow and steady growth until approximately 2000 when we fall back for a year or two after which we remain stagnant. [emphasis added here]

Notably, the 2012 State of Science and Technology in Canada report does not examine investment in the way the OECD scorecard does, even though that’s usually one of the measures for assessing the health of a science and technology sector.

For reasons that are somewhat of a mystery to me, the report indicates dissatisfaction with Canada’s patent performance (we don’t patent often enough),

In contrast to the nation’s strong performance in knowledge generation is its weaker performance in patents and related measures. Despite producing 4.1 per cent of the world’s scientific papers, Canada holds only 1.7 per cent of world patents, and in 2010 had a negative balance of nearly five billion dollars in royalties and licensing revenues. Despite its low quantity of patents, Canada excels in international comparisons of quality, with citations to patents (ARC scores), ranking second in the world, behind the United States. (p. xiii print, p. 15 PDF)

I have written extensively about the problems with the patent system, especially the system in the US, as per Billions lost to patent trolls; US White House asks for comments on intellectual property (IP) enforcement; and more on IP, in my June 28, 2012 posting and many others. As an indicator or metric for excellence in science and technology, counting your patents (or technometrics, as defined by the Council of Canadian Academies) seems problematic. I appreciate that this is a standard technique practiced by other countries, but couldn’t the panel have expressed some reservations about the practice? Yes, they mention problems with the methodology, but they seem unaware that there is growing worldwide dissatisfaction with patent practices.

Thankfully this report is not just a love letter to ourselves. There was an acknowledgement that some areas of excellence have declined since the 2006 report. For those following the Canadian science and technology scene, it can’t be a surprise to see that natural resources and environmental science and technology (S&T) are among the declining areas (not so coincidentally there is less financial investment by the federal government),

This assessment is, in part, an update of the Council’s 2006 assessment of the state of S&T in Canada. Results of the two assessments are not entirely comparable due to methodological differences such as the bibliometric database and classification system used in the two studies, and the survey of top-cited international researchers which was not undertaken in the 2006 assessment. Nevertheless, the Panel concluded that real improvements have occurred in the magnitude and quality of Canadian S&T in several fields including Biology, Clinical Medicine, ICT, Physics and Astronomy, Psychology and Cognitive Sciences, Public Health and Health Services, and Visual and Performing Arts. Two of the four areas identified as strengths in the 2006 report — ICT and health and related life sciences and technologies — have improved by most measures since 2006.

The other two areas identified as strengths in the 2006 report — natural resources and environmental S&T — have not experienced the same improvement as Canadian S&T in general. In the current classification system, these broad areas are now represented mainly by the fields of Agriculture, Fisheries, and Forestry; and Earth and Environmental Sciences. The Panel mapped the current classification system for these fields to the 2006 system and is confident that the overall decline in these fields is real, and not an artefact of different classifications. Scientific output and impact in these fields were either static or declined in 2005–2010 compared to 1994–2004. It should be noted, however, that even though these fields are declining relative to S&T in general, both maintain considerable strength, with Canadian research in Agriculture, Fisheries, and Forestry ranked second in the world in the survey of international researchers, and Earth and Environmental Sciences ranked fourth.

I’m not sure when I’ll get to part 2 of this as I have much on my plate at the moment but I will get back to this.

*ETA July 1, 2016: Evidently I got to part 2 sooner than I planned. It’s in a second Dec. 28, 2012 posting.*

Collaborative nano research

The journal Nature published a study about a trend towards collaborative nanotechnology research in its Dec. 2, 2010 online edition (Note: there’s a paywall and I don’t usually link to articles behind them). From the Dec. 9, 2010 news item on Nanowerk,

Despite their initial focus on national economic competitiveness, the nanotechnology research initiatives now funded by more than 60 countries have become increasingly collaborative, with nearly a quarter of all papers co-authored by researchers across borders.

Researchers from the two leading producers of nanotechnology papers – China and the United States – have become each nation’s most frequent international co-authors. Though Chinese and U.S. researchers now publish roughly the same number of nanotechnology papers, the U.S. retains a lead in the quality of publications – as measured by the number of early citations.

“Despite ten years of emphasis by governments on national nanotechnology initiatives, we find that patterns of nanotechnology research collaboration and funding transcend country boundaries,” said Phillip Shapira, study co-author and a professor in the School of Public Policy at the Georgia Institute of Technology. “For example, we found that U.S. and Chinese researchers have developed a relatively high level of collaboration in nanotechnology research. Each country is the other’s leading collaborator in nanotechnology R&D.”

I’m not convinced that the number of early citations is a good indicator of quality, and I have a couple of questions. First, are papers published in prestigious journals like Science, Nature, etc. more likely to be cited early? Also, are the Chinese papers being published in English or in Chinese first?

Despite my reservations about this ‘quality issue’, I do find the research quite illuminating. More from the news item,

They [the study’s authors] found that although researchers from 152 nations were represented in the survey, just 15 countries represented 90 percent of the papers. The top four countries by author affiliation were the United States (23 percent), China (22 percent), Germany (8 percent) and Japan (8 percent). Papers authored by researchers from more than one nation – which constituted 23 percent of those examined – were assigned to more than one country.

Though the United States and China now produce approximately the same number of papers, the U.S. maintains significant advantages.

“Compared with Chinese counterparts, papers authored by U.S. researchers still have a substantial lead in terms of citation quality and U.S. corporate activity in nanotechnology innovation remains rather larger,” Shapira said. “However, Chinese quality is improving and an increasing number of Chinese companies are becoming engaged in developing and commercializing nano-enabled products.”

Shapira and study collaborator Jue Wang, an assistant professor at Florida International University, had some other interesting findings,

The study also found that sponsors concentrating their funding in fewer institutions had lower research impact as measured by early citation counts.

“Our starting hypothesis is that when groups from multiple institutions vie for funding, there is increased competition, review processes are less partial, and there are more opportunities to select the most improving projects,” Shapira explained.

With increasing budget pressures, growth in nanotechnology funding appears unlikely. How should countries invest their limited funding for greatest benefit?

“One way would be to foster more high-quality international collaborations, perhaps by opening funding competitions to international researchers and by offering travel and mobility awards for domestic researchers to increase alliances with colleagues in other countries,” the researchers suggested in their paper.

China’s nanotechnology rise

Eric Berger’s blog, SciGuy, recently highlighted some data about the number of nanotechnology/nanoscience articles published by Chinese researchers. You can see the entry and the table listing the world’s most prolific (overwhelmingly Chinese) nanotech authors here. It’s interesting to contrast this data with a Nature Nanotechnology editorial from June 2008, where they had tables listing the countries with the most published nanotech articles and the most frequently cited articles. At the time, I thought China was under-represented, although I don’t state it explicitly in my comments here.

Berger was inspired to write his commentary after seeing Eric Drexler’s posting on the topic (Oct. 30, 2009), but I’m directing you to Drexler’s follow-up comments, where he provides some context for better understanding the statistics and cites sources that discuss the matter at more length.

The general consensus seems to be that some of China’s nanotech research is world class and that the quality of the majority of the research papers is either very good or improving rapidly.

There’s also this from the Center for Nanotechnology in Society at the University of California, Santa Barbara (CNS-UCSB) paper, Chinese Nanotechnology Publications (scroll down the page to IRG 4-3 to read the full abstract),

China’s top-down and government-centered approach toward science and technology policy is succeeding in driving academic-publications output. By 2005 China had equaled or possibly surpassed the U.S. in terms of total output for academic/peer-reviewed publications, with a substantial increase in publication rate from around 2003. … We examined US and Chinese nanotechnology trends in the scientific literature and found that Chinese nanotechnology output is growing rapidly and will likely [outperform?] US output in terms of quality as well as quantity within a decade or less (Appelbaum & Parker 2008).

I include this portion of the abstract because the phrase “China’s top-down and government-centered approach to science and technology” points to something that’s not explicitly noted in the abstract: cultural and political climate. Nor was it noted in Bruce Alberts’ speech (in my Is science superior? posting), as Inkbat observed in her comments on that posting. (My apologies to Mr. Alberts if he did make those points; unfortunately, his speech is not available on the conference website, so I’m depending on attendee reports.)

It’s a tricky matter trying to compare countries. China has more people and presumably more scientists than anyone else, all of which should result in more published articles if the area of research is supported by policy.

One of the issues for Canada is that we have a relatively small population and consequently fewer scientists. I commented on some work done by M. Fatih Yegul (in June 2008), in which he contextualizes the number of Canadian articles published on nanotechnology and our focus on collaboration. Here’s part 2 of the series where I mentioned the numbers. (I did not post much material from Yegul’s paper as he was about to present it at an international conference and it had yet to be published. I just checked today [Nov. 4, 2009] and cannot confirm publication.) My comments from part 3 of the series,

It’s all pretty interesting including the suggestion (based on a study that showed Canada as ranking 6th in numbers of science articles published from 1995-2005) that Canada is performing below its own average with regard to nanotechnology research.

I don’t know if the situation in Canada has changed since Yegul wrote and presented his paper but I strongly suspect it has not.

As for the roles that culture, social mores, history, and political environment play, I just can’t manage more than a mention in this posting in an effort to acknowledge their importance.

Do check out Rob Annan’s posting today (Nov. 4, 2009) about Science and Innovation in the wake of the 2009 Canadian Science Policy Conference.

Numbers of published nano articles, China, and Canada’s nanotech

Nature Nanotechnology published an editorial in its June 2008 issue about which countries have published the most articles and which are most often cited. In examining their own journal and a couple of studies, they found that the US has published the most, with China coming up quickly and likely to overtake US output in the near future.

How do you attribute an article to a country? In these studies, they looked at the lead author’s affiliation. For number freaks: Nature Nanotechnology published 94 letters and 55 articles, with 47.6% of the authors located in the US, followed by 8% from the UK, 7.4% from Japan, and 6.7% from Germany. I guess the rest of us make up the other 30% or so. The figures about China’s articles come from other studies that the editorial cites. (I’d link to Nature Nanotech but the journal’s latest issues are behind a paywall. They’ll let you sniff some of the cheese but you won’t be able to take a bite for at least a year.)

One point they do make is that the Chinese articles aren’t cited as often as US articles or even Japanese articles (China’s output has been higher than Japan’s since 1990). All of which is interesting since citations are one measure of quality and/or influence. I think it’s safe to assume that they’re talking about articles written in English, so we’re not looking at language issues. Still, I can think of at least one reason why work from China might not be cited as often: geopolitical tensions.

Here’s another suggestion: where are the Chinese authors getting published? If your work isn’t being published in journals that other interested parties are reading, how are you going to get cited? (Brief related story) I do research for a psychiatrist (he specializes in pain management) who’s interested in checking out some of the latest research on morphine. I have two entirely separate research tracks, each with its own specialized vocabulary and specialty-specific journals. If I use the wrong words, I won’t find the other research material. (Back to the nano) So now there are two other possible problems. Researchers casually thumbing through Nature Nanotechnology are not going to see many articles from China (as per the June 2008 editorial) and, if Chinese researchers are using the vocabulary differently, standard keyword research strategies aren’t going to lead you to their work.

As for the Canadian nanotechnology scene, we don’t seem to be on the radar for either Nature Nanotechnology or the two studies they cited. I’m a little curious about that since there was a presenter at the 2008 Cascadia Nanotechnology Symposium in March who focussed on the number of articles published by Canadian nano researchers. As I recall, he indicated that our numbers are pretty healthy. I’m trying to track that info down but I can’t find M. Fatih Yegul’s (University of Waterloo) presentation on the symposium website or any other published version of his information.