
Canada’s Chief Science Advisor: the first annual report

Dr. Mona Nemer, Canada’s Chief Science Advisor, and her office have issued their 2018 annual report. It is also the office’s first annual report. (Brief bit of history: there was a similar position, National Science Advisor of Canada, from 2004 to 2008. Dr. Arthur Carty, the advisor, was dumped and the position eliminated when the former opposition party won the federal election and formed a new government.)

The report can be found in HTML format here, which is also where you’ll find a link to download the full 40-page report as a PDF.

Being an inveterate ‘skip to the end’ kind of a reader, I took a boo at the Conclusion,

A chief science advisor plays an important role in building consensus and supporting positive change to our national institutions. Having laid the foundation for the function of the Chief Science Advisor in year one, I look forward to furthering the work that has been started and responding to new requests from the Prime Minister, the Minister of Science and members of Cabinet in my second year. I intend to continue working closely with agency heads, deputy ministers and the science community to better position Canada for global leadership in science and innovation.

I think the conclusion could have done with a little more imagination and/or personality, even by government report standards. There is some interesting material in other parts of the report and, on the whole, it is written in a pleasant, accessible style, but it raises questions.

Let’s start with one of the sections that raises questions,

The Budget 2018 commitment of $2.8 billion to build multi-purpose, collaborative federal science and technology facilities presents a once-in-a-generation opportunity to establish a strong foundation for Canada’s future federal science enterprise. Similarly momentous opportunities exist to create a Digital Research Infrastructure Strategy [emphasis mine] for extramural science and a strategic national approach to Major Research Facilities. This is an important step forward for the Government of Canada in bringing principles and standards for federal science in line with practices in the broader scientific community; as such, the Model Policy will not only serve for the benefit of federal science, but will also help facilitate collaboration with the broader scientific community.

Over the past year, my office has participated in the discussion around these potentially transformative initiatives. We offered perspectives on the evidence needed to guide decisions regarding the kind of science infrastructure that can support the increasingly collaborative and multidisciplinary nature of science.

Among other things, I took an active part in the deliberations of the Deputy Ministers Science Committee (DMSC) and in the production of recommendations to Cabinet with respect to federal science infrastructure renewal. I have also analyzed the evolving nature and needs for national science facilities that support the Canadian science community.

Success in building the science infrastructure of the next 50 years and propelling Canadian science and research to new heights will require that the multiple decisions around these foundational initiatives dovetail based on a deep and integrated understanding of Canada’s science system and commitment to maintaining a long-term vision.

Ultimately, these infrastructure investments will need to support a collective vision and strategy for science in Canada, including which functions federal science should perform, and which should be shared with or carried out separately by academic and private sector researchers, in Canada and abroad. These are some of the essential considerations that must guide questions of infrastructure and resource allocation.

Canadian government digital infrastructure is a pretty thorny issue and while the article I’m referencing is old, I strongly suspect they still have many of the same problems. I’m going to excerpt sections that are focused on IT, specifically, the data centres and data storage. Prepare yourself for a bit of a shock as you discover the government did not know how many data centres it had. From a December 28, 2016 article by James Bagnall for the Ottawa Citizen,

Information technology [IT] offered an even richer vein of potential savings. For half a century, computer networks and software applications had multiplied willy-nilly as individual departments and agencies looked after their own needs. [emphases mine] The result was a patchwork of incompatible, higher-cost systems. Standardizing common, basic technologies such as email, data storage and telecommunications seemed logical. [emphasis mine]

It had been tried before. But attempts to centralize the buying of high-tech gear and services had failed, largely because federal departments were allowed to opt out. Most did so. They did not want to give up control of their IT networks to a central agency.

The PCO determined this time would be different. The prime minister had the authority to create a new federal department through a simple cabinet approval known as an order-in-council. Most departments, including a reluctant Canada Revenue Agency and Department of National Defence, would be forced to carve out a significant portion of their IT groups and budgets — about 40 per cent on average — and hand them over to Shared Services.

Crucially, the move would not be subject to scrutiny by Parliament. And so Shared Services [Shared Services Canada] was born on Aug. 3, 2011.

Speed was demanded of the agency from the start. Minutes of meetings involving senior Shared Services staff are studded with references to “tight schedules” and the “urgency” of getting projects done.

Part of that had to do with the sheer age of the government’s infrastructure. The hardware was in danger of breaking down and the underlying software for many applications was so old that suppliers such as Microsoft, PeopleSoft and Adobe had stopped supporting it. [emphases mine]

The faster Shared Services could install new networks, the less money it would be forced to throw at solving the problems caused by older technology.

… Because the formation of Shared Services had been shrouded in such secrecy, the hiring of experienced outsiders happened at the last minute. Grant Westcott would join Shared Services as COO in mid-September.

Sixty-three years old at the time, Westcott had served as assistant deputy minister of Public Services in the late 1990s, when he managed the federal government’s telecommunications networks. In that role, he oversaw the successful effort to prepare systems for the year 2000.

More relevant to his new position at Shared Services, Westcott had also worked for nearly a decade at CIBC. As executive vice-president, he had been instrumental in overhauling the bank’s IT infrastructure. Twenty-two of the Canadian Imperial Bank of Commerce’s [CIBC] data centres were shrunk into just two, saving the bank more than $40 million annually in operating costs. Among other things, Westcott and his team consolidated 15 global communications networks into a single voice and data system. More savings resulted.

It wasn’t without glitches. CIBC was forced on several occasions to apologize to customers for a breakdown in certain banking services.

Nevertheless, Westcott’s experience suggested this was not rocket science, at least not in the private sector. [emphasis mine] Streamlining and modernizing IT networks could be done under the right conditions. …


While Shared Services is paying Bell only for the mailboxes it has already moved [there was an email project, which has since been resolved], the agency is still responsible for keeping the lights on at the government’s legacy data centres [emphasis mine]— where most of the electronic mailboxes, and much of the government’s software applications, for that matter, are currently stored.

These jumbles of computers, servers, wires and cooling systems are weighed down by far too much history — and it took Shared Services a long time to get its arms around it.

“We were not created with a transformation plan already in place because to do that, we needed to know what we had,” Forand [Liseanne Forand, then president of Shared Services Canada] explained to members of Parliament. “So we spent a year counting things.” [emphasis mine]

Shared Services didn’t have to start from scratch. Prior to the agency’s 2011 launch, the PCO’s administrative review tried to estimate the number of data centres in government by interviewing the IT czars for each of the departments. Symptomatic of the far-flung nature of the government’s IT networks — and how loosely these were managed — the PCO [Privy Council Office] didn’t even come close.

“When I was at Privy Council, we thought there might be about 200 data centres,” Benoit Long, senior assistant deputy minister of Shared Services told the House of Commons committee last May [2016], “After a year, we had counted 495 and I am still discovering others today.” [emphasis mine]

Most of these data centres were small — less than 1,000 square feet — often set up by government scientists who didn’t bother telling their superiors. Just 22 were larger than 5,000 square feet — and most of these were in the National Capital Region. These were where the bulk of the government’s data and software applications were stored.

In 2012, Grant Westcott organized a six-week tour of these data centres to see for himself what Shared Services was dealing with. According to an agency employee familiar with the tour’s findings, Westcott came away profoundly shocked.

Nearly all the large data centres were decrepit [emphasis mine], undercapitalized and inefficient users of energy. Yet just one was in the process of being replaced. This was a Heron Road facility with a leaking roof, considered key because it housed Canada Revenue Agency [CRA] data.

Bell Canada had been awarded a contract to build and operate a new data centre in Buckingham. The CRA data would be transferred there in the fall of 2013. A separate part of that facility is also meant to host the government’s email system.

The remaining data centres offered Westcott a depressing snapshot. [emphasis mine]

Most of the gear was housed in office towers, rather than facilities tailormade for heavy-duty electronics. With each new generation of computer or servers came requirements for more power and more robust cooling systems. The entire system was approaching a nervous breakdown.[emphasis mine]

Not only were the data centres inherited by Shared Services running out of space, the arrangements for emergency power bordered on makeshift. One Defence Department data centre, for instance, relied [on] a backup generator powered by a relatively ancient jet engine. [emphases mine] While unconventional, there was nothing unique about pressing military hardware into this type of service. The risk had to do with training — the older the jet engine, the fewer employees there were who knew how to fix it. [emphasis mine]

The system for backing up data wasn’t much better. At Statistics Canada — where Westcott’s group was actually required to swear an oath to keep the data confidential before entering the storage rooms — the backup procedure called for storing copies of the data in a separate but nearby facility. While that’s OK for most disasters, the relative proximity still meant Statcan’s repository of historical data was vulnerable to a bombing or major earthquake.

Two and a quarter years later, I imagine they’ve finished counting their data centres, but how much progress they’ve made on replacing and upgrading the hardware and the facilities is not known to me. It’s hard to believe that all of the fixes have been made, especially after reading the Chief Science Advisor’s 2018 annual report (the relevant bits are coming up shortly).

If a more comprehensive view of the current government digital infrastructure interests you, I highly recommend Bagnall’s article. It’s heavy going for someone who’s not part of the Ottawa scene but it’s worth the effort as it gives you some insight into the federal government’s workings, which seem remarkably similar regardless of which party is in power. Plus, it’s really well written.

Getting back to the report, I realize it’s not within the Chief Science Advisor’s scope to discuss the entirety of the government’s digital infrastructure, but ***Corrected March 22, 2019: Ooops! As originally written, this section was incorrect. Here’s the corrected version; you can find the original version at the end of this post: “… 40 percent of facilities are more than 50 years old …” doesn’t really convey the depth and breadth of the problem. Presumably, these facilities also include digital infrastructure. At any rate, 40 percent of what? Just how many of these facilities are over 50 years old?*** By the way, the situation, as the Chief Science Advisor’s Office describes it, gets a little worse; keep reading,

Over the past two years, the Treasury Board Secretariat has worked with the federal science community to develop an inventory of federal science infrastructure.

The project was successful in creating a comprehensive list of federal science facilities and documenting their state of repair. It reveals that some 40 percent of facilities are more than 50 years old and another 40 percent are more than 25 years old. The problem is most acute in the National Capital Region.

A similar inventory of research equipment is now underway. This will allow for effective coordination of research activities across organizations and with external collaborators. The Canada Foundation for Innovation, which provides federal funding toward the costs of academic research infrastructure, has created a platform upon which this work will build.

***Also corrected on March 20, 2019: Bravo to the Treasury Board for creating an inventory of the aging federal science infrastructure, which hopefully includes the digital.*** Following on Bagnall’s outline of the problems, it had to be discouraging work, necessary but discouraging. ***Also corrected on March 20, 2019: Did they take advantage of the Shared Services Canada data centre counting exercise? Or did they redo the work?*** In any event, both the Treasury Board and the Chief Science Advisor have even more work ahead of them.

Couldn’t there have been a link to the inventory? Is it secret information? It would have been nice if Canada’s Chief Science Advisor’s website had offered additional information or a link to supplement the 2018 annual report.

Admittedly, there probably aren’t that many people who’d like more information about infrastructure and federal science offices, but surely they could give us a little more detail. For example, I understand there’s an AI (artificial intelligence) initiative within the government and, given some of the issues, such as the Phoenix payroll system debacle and the digital infrastructure consisting of ‘chewing gum and baling wire’, my confidence is shaken. Have they learned any lessons? The report doesn’t offer any assurance they are taking these ‘lessons’ into account as they forge onwards.

Lies, damned lies, and statistics

I’ve always liked that line about lies and I’ll get to its applicability with regard to the Chief Science Advisor’s 2018 report but first, from the Lies, Damned Lies, and Statistics Wikipedia entry (Note: Links have been removed),


“Lies, damned lies, and statistics” is a phrase describing the persuasive power of numbers, particularly the use of statistics to bolster weak arguments. It is also sometimes colloquially used to doubt statistics used to prove an opponent’s point.

The phrase was popularized in the United States by Mark Twain (among others), who attributed it to the British prime minister Benjamin Disraeli: “There are three kinds of lies: lies, damned lies, and statistics.” However, the phrase is not found in any of Disraeli’s works and the earliest known appearances were years after his death. Several other people have been listed as originators of the quote, and it is often erroneously attributed to Twain himself.[1]

Here’s a portion of the 2018 report, which is “using the persuasive power of numbers”, in this case, to offer a suggestion about why people mistrust science,

A recent national public survey found that eight in ten respondents wanted to know more about science and how it affects our world, and roughly the same percentage of people are comfortable knowing that scientific answers may not be definitive. 1 [emphases mine] While this would seem to be a positive result, it may also suggest why some people mistrust science, [emphasis mine] believing that results are fluid and can support several different positions. Again, this reveals the importance of effectively communicating science, not only to counter misinformation, but to inspire critical thinking and an appreciation for curiosity and discovery.

I was happy to see the footnote but surprised that the writer(s) simply trotted out a statistic without any hedging about the reliability of the data. Just because someone says that eight out of ten respondents feel a certain way doesn’t give me confidence in the statistic and I explain why in the next section.

They even added that ‘people are aware that scientific answers are not necessarily definitive’. So why not be transparent about the fallibility of statistics in your own report? If that’s not possible in the body of the report, then maybe put the more detailed, nuanced analysis in an appendix. Finally, how did they derive, from the data, the notion that people’s awareness of science as a process of refining and adjusting knowledge might lead to distrust of science? It’s possible, of course, but how do you make that leap based on the data you’re referencing? As for that footnote, where was the link to the report? Curious, I took a look at the report.

Ontario Science Centre and its statistics

For the curious, you can find the Ontario Science Centre’s Canadian Science Attitudes Research report (July 6, 2018) here, where you’ll see the methodology is a little light on detail. (The company that did this work, Leger, is a marketing research and analytics company.) Here’s the methodology, such as it is,


QUANTITATIVE RESEARCH INSTRUMENT

An online survey of 1501 Canadians was completed between June 18 and 26, 2018, using Leger’s online panel. The margin of error for this study was +/-2.5%, 19 times out of 20.

COMPARISON DATA

Where applicable, this year’s results have been compared back to a similar study done in 2017 (OSC Canadian Science Attitudes Research, August 2017). The use of arrows indicates significant changes between the two datasets.

ABOUT LEGER’S ONLINE PANEL

Leger’s online panel has approximately 450,000 members nationally and has a retention rate of 90%.

QUALITY CONTROL

Stringent quality assurance measures allow Leger to achieve the high-quality standards set by the company. As a result, its methods of data collection and storage outperform the norms set by WAPOR (The World Association for Public Opinion Research). These measures are applied at every stage of the project: from data collection to processing, through to analysis. We aim to answer our clients’ needs with honesty, total confidentiality, and integrity.

I didn’t find any more substantive description of the research methodology, but there is a demographic breakdown at the end of the report and, because they’ve given the number of people panelled (1501) at the beginning of the report, you can figure out just how many people fit into the various categories.
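
As an aside, the stated margin of error is also easy enough to sanity-check, as is the headcount behind any demographic category. Here’s a minimal sketch in Python, assuming the standard simple-random-sampling formula at 95% confidence (which, strictly speaking, an opt-in online panel doesn’t satisfy); the 15% category share is a hypothetical figure for illustration.

import math

n = 1501   # respondents, as reported by Leger
z = 1.96   # 95% confidence level ("19 times out of 20")
p = 0.5    # worst-case proportion, the usual convention when quoting a single margin of error

# Margin of error under the simple-random-sampling assumption
moe = z * math.sqrt(p * (1 - p) / n)
print(f"margin of error: +/-{100 * moe:.1f}%")  # prints +/-2.5%

# Rough headcount for a demographic category, e.g. a hypothetical 15% share
share = 0.15
print(f"respondents in a 15% category: about {round(n * share)}")  # about 225

The formula does reproduce the reported +/-2.5%, but with a self-selected panel that figure is more a convention than a true confidence interval, which is part of why the questions below matter.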

Now, the questions: What is a Leger online panel? How can they confirm that the people who took the survey are Canadian? How did they put together this Leger panel, e.g., do people self-select? Why did they establish a category for people aged 65+? (Actually, in one section of the report, they make reference to 55+.) Read on for why that last one might be a good question.

I haven’t seen any surveys or polls originating in Canada that seem to recognize that a 65-year-old is not an 80-year-old. After all, they don’t lump 20-year-olds in with 40-year-olds, because these people are at different stages in their lifespans, often with substantive differences in their outlook. Maybe there are no differences between 65-year-olds and 80-year-olds, but we’ll never know because no one ever asks. When you add in Canada’s aging population, the tendency to lump all ‘old’ people together seems even more thoughtless.

These are just a few questions that spring to mind and, most likely, the pollsters have a more substantive methodology. There may have been a perfectly good reason for not including more detail in the version of the report I’ve linked to. For example, a lot of people will stop reading a report that’s ‘bogged down’ in too much detail. Fair enough, but why not give a link to a more substantive methodology or put it in an appendix?

One more thing: I couldn’t find any data in the Ontario Science Centre report that would support the Chief Science Advisor Office’s speculation about why people might not trust science. They did hedge, as they should: “… seem to be a positive result, it may also suggest why some people mistrust science …”. But the hedging follows directly after the “… eight in ten respondents …”, which structurally suggests that there is data supporting the speculation. If there is, it’s not in the pollster’s report.

Empire building?

I just wish the Office of the Chief Science Advisor had laid the groundwork to support this,

Establishing a National Science Advisory System

Science and scientific information permeate the work of the federal government, creating a continuous need for decision-makers to negotiate uncertainty in interpreting often variable and generally incomplete scientific evidence for policy making. With science and technology increasingly a part of daily life, the need for science advice will only intensify.

As such, my office has been asked to assess and recommend ways to improve the existing science advisory function [emphasis mine] within the federal government. To that end, we examined the science advisory systems in other countries, such as Australia, New Zealand, the United Kingdom, and the United States, as well as the European Union. [emphasis mine]

A key feature of all the systems was a network of department-level science advisors. These are subject matter experts who work closely with senior departmental officials and support the mandate of the chief science advisor. They stand apart from day-to-day operations and provide a neutral sounding board for senior officials and decision-makers evaluating various streams of information. They also facilitate the incorporation of evidence in decision-making processes and act as a link between the department and external stakeholders.

I am pleased to report that our efforts to foster a Canadian science advisory network are already bearing fruit. Four organizations [emphasis mine] have so far moved to create a departmental science advisor position, and the first incumbents at the Canadian Space Agency and the National Research Council [emphasis mine] are now in place. I have every confidence that the network of departmental science advisors will play an important role in enhancing science advice and science activities planning, especially on cross-cutting issues.

Who asked the office to assess and recommend ways to improve the existing science advisory function? By the way, the European Union axed the position of Chief Science Adviser after Anne Glover, the first such adviser in its history, completed the term of her appointment in 2014 (see James Wilsdon’s November 13, 2014 article for the Guardian). Perhaps they meant countries in the European Union?

Which four organizations have created departmental science advisor positions? Presumably the Canadian Space Agency and the National Research Council were two of the organizations?

If you’re looking at other countries’ science advisory systems, do you have some publicly available information, perhaps a report and analysis? Is there a set of best practices? And how do these practices fit into the Canadian context?

More generally, how much is this going to cost? Are these departmental advisors expected to report to the Canadian federal government’s Chief Science Advisor as well as to their own ministry’s deputy minister or higher? Will the Office of the Chief Science Advisor need more staff?

I could be persuaded that increasing the number of advisors and increasing costs to support these efforts are good ideas, but it would be nice to see the Office of the Chief Science Advisor put some effort into persuasion rather than assuming that someone outside the tight nexus of federal government institutions based in Ottawa is willing to accept statements that aren’t detailed and/or supported by data.

Final thoughts

I’m torn. I’m supportive of the position of the Chief Science Advisor of Canada and happy to see a report. It looks like there’s some exciting work being done.

I also hold the Chief Science Advisor and the Office to a high standard. It’s understandable that they may have decided to jettison detail in favour of making the 2018 annual report more readable, but what I don’t understand is the failure to provide a more substantive report or supplementary information that supports and details the work. There’s a lack of transparency and, also, clarity. What do they mean by the word ‘science’? At a guess, they’re not including the social sciences.

On the whole, the report looks a little sloppy and that’s the last thing I expect from the Chief Science Advisor.

*** This is the original and not accurate version: “… 40 percent of facilities are more than 50 years old …” doesn’t really convey the depth and breadth of the problem. Let’s start with any easy question: 40 percent of what? Just how many of these computers are over 50 year old? By the way, the situation as the Chief Science Advisor’s Office describes it, gets a little worse, keep reading, AND THIS: Bravo to the Treasury Board for tracking down those data centres and more. AND THIS: Also, was this part of the Shared Services Canada counting exercise?