Tag Archives: Treasury Board of Canada

Governments need to tell us when and how they’re using AI (artificial intelligence) algorithms to make decisions

I have two items and an exploration of the Canadian scene, all three of which feature governments, artificial intelligence, and responsibility.

Special issue of Information Polity edited by Dutch academics,

A December 14, 2020 IOS Press press release (also on EurekAlert) announces a special issue of Information Polity focused on algorithmic transparency in government,

Amsterdam, NL – The use of algorithms in government is transforming the way bureaucrats work and make decisions in different areas, such as healthcare or criminal justice. Experts address the transparency challenges of using algorithms in decision-making procedures at the macro-, meso-, and micro-levels in this special issue of Information Polity.

Machine-learning algorithms hold huge potential to make government services fairer and more effective and have the potential of “freeing” decision-making from human subjectivity, according to recent research. Algorithms are used in many public service contexts. For example, within the legal system it has been demonstrated that algorithms can predict recidivism better than criminal court judges. At the same time, critics highlight several dangers of algorithmic decision-making, such as racial bias and lack of transparency.

Some scholars have argued that the introduction of algorithms in decision-making procedures may cause profound shifts in the way bureaucrats make decisions and that algorithms may affect broader organizational routines and structures. This special issue on algorithm transparency presents six contributions to sharpen our conceptual and empirical understanding of the use of algorithms in government.

“There has been a surge in criticism towards the ‘black box’ of algorithmic decision-making in government,” explain Guest Editors Sarah Giest (Leiden University) and Stephan Grimmelikhuijsen (Utrecht University). “In this special issue collection, we show that it is not enough to unpack the technical details of algorithms, but also look at institutional, organizational, and individual context within which these algorithms operate to truly understand how we can achieve transparent and responsible algorithms in government. For example, regulations may enable transparency mechanisms, yet organizations create new policies on how algorithms should be used, and individual public servants create new professional repertoires. All these levels interact and affect algorithmic transparency in public organizations.”

The transparency challenges for the use of algorithms transcend different levels of government – from European level to individual public bureaucrats. These challenges can also take different forms; transparency can be enabled or limited by technical tools as well as regulatory guidelines or organizational policies. Articles in this issue address transparency challenges of algorithm use at the macro-, meso-, and micro-level. The macro level describes phenomena from an institutional perspective – which national systems, regulations and cultures play a role in algorithmic decision-making. The meso-level primarily pays attention to the organizational and team level, while the micro-level focuses on individual attributes, such as beliefs, motivation, interactions, and behaviors.

“Calls to ‘keep humans in the loop’ may be moot points if we fail to understand how algorithms impact human decision-making and how algorithmic design impacts the practical possibilities for transparency and human discretion,” notes Rik Peeters, research professor of Public Administration at the Centre for Research and Teaching in Economics (CIDE) in Mexico City. In a review of recent academic literature on the micro-level dynamics of algorithmic systems, he discusses three design variables that determine the preconditions for human transparency and discretion and identifies four main sources of variation in “human-algorithm interaction.”

The article draws two major conclusions: First, human agents are rarely fully “out of the loop,” and levels of oversight and override designed into algorithms should be understood as a continuum. The second pertains to bounded rationality, satisficing behavior, automation bias, and frontline coping mechanisms that play a crucial role in the way humans use algorithms in decision-making processes.

For future research Dr. Peeters suggests taking a closer look at the behavioral mechanisms in combination with identifying relevant skills of bureaucrats in dealing with algorithms. “Without a basic understanding of the algorithms that screen- and street-level bureaucrats have to work with, it is difficult to imagine how they can properly use their discretion and critically assess algorithmic procedures and outcomes. Professionals should have sufficient training to supervise the algorithms with which they are working.”

At the macro-level, algorithms can be an important tool for enabling institutional transparency, writes Alex Ingrams, PhD, Governance and Global Affairs, Institute of Public Administration, Leiden University, Leiden, The Netherlands. This study evaluates a machine-learning approach to open public comments for policymaking to increase institutional transparency of public commenting in a law-making process in the United States. The article applies an unsupervised machine learning analysis of thousands of public comments submitted to the United States Transport Security Administration on a 2013 proposed regulation for the use of new full body imaging scanners in airports. The algorithm highlights salient topic clusters in the public comments that could help policymakers understand open public comments processes. “Algorithms should not only be subject to transparency but can also be used as tool for transparency in government decision-making,” comments Dr. Ingrams.

“Regulatory certainty in combination with organizational and managerial capacity will drive the way the technology is developed and used and what transparency mechanisms are in place for each step,” note the Guest Editors. “On its own these are larger issues to tackle in terms of developing and passing laws or providing training and guidance for public managers and bureaucrats. The fact that they are linked further complicates this process. Highlighting these linkages is a first step towards seeing the bigger picture of why transparency mechanisms are put in place in some scenarios and not in others and opens the door to comparative analyses for future research and new insights for policymakers. To advocate the responsible and transparent use of algorithms, future research should look into the interplay between micro-, meso-, and macro-level dynamics.”

“We are proud to present this special issue, the 100th issue of Information Polity. Its focus on the governance of AI demonstrates our continued desire to tackle contemporary issues in eGovernment and the importance of showcasing excellent research and the insights offered by information polity perspectives,” add Professor Albert Meijer (Utrecht University) and Professor William Webster (University of Stirling), Editors-in-Chief.

This image illustrates the interplay between the various level dynamics,

Caption: Studying algorithms and algorithmic transparency from multiple levels of analyses. Credit: Information Polity.

Here’s a link to and a citation for the special issue,

Algorithmic Transparency in Government: Towards a Multi-Level Perspective
Guest Editors: Sarah Giest, PhD, and Stephan Grimmelikhuijsen, PhD
Information Polity, Volume 25, Issue 4 (December 2020), published by IOS Press

The issue is open access for three months, Dec. 14, 2020 – March 14, 2021.

Two articles from the special issue were featured in the press release,

“The agency of algorithms: Understanding human-algorithm interaction in administrative decision-making,” by Rik Peeters, PhD (https://doi.org/10.3233/IP-200253)

“A machine learning approach to open public comments for policymaking,” by Alex Ingrams, PhD (https://doi.org/10.3233/IP-200256)
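
Since the Ingrams article centres on an unsupervised machine learning analysis that surfaces salient topic clusters in thousands of public comments, a small illustration may help readers picture what that kind of analysis involves. What follows is only a minimal, hypothetical sketch of the general technique (TF-IDF features plus a latent topic model), written in Python with made-up comments and arbitrary parameter choices; it is not the pipeline, data, or settings used in the actual study.

```python
# Hypothetical sketch: clustering public comments into latent topics.
# None of the comments, parameters, or design choices below come from the
# Ingrams study; they only illustrate the general unsupervised approach.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

# Stand-ins for public comments submitted on a proposed regulation.
comments = [
    "Full body scanners invade traveller privacy and should be optional.",
    "The scanners make airports safer and speed up screening lines.",
    "Health effects of the imaging technology have not been studied enough.",
    "Privacy concerns outweigh the marginal security benefit of scanners.",
    "Security screening needs better technology to detect hidden threats.",
    "Radiation exposure from scanners worries frequent flyers and crews.",
]

# Turn the comments into TF-IDF features, dropping common English stop words.
vectorizer = TfidfVectorizer(stop_words="english")
features = vectorizer.fit_transform(comments)

# Factorize the comment-term matrix into a small number of latent topics
# (three is an arbitrary choice for this toy example).
model = NMF(n_components=3, init="nndsvda", random_state=0)
model.fit(features)

# Print the top words per topic so an analyst could label the clusters,
# e.g. 'privacy', 'security', 'health'.
terms = vectorizer.get_feature_names_out()
for topic_index, weights in enumerate(model.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:4]]
    print(f"Topic {topic_index}: {', '.join(top_terms)}")
```

In a real rule-making exercise, the top terms for each topic would give policymakers a first, rough map of what thousands of commenters care about, which is the sense in which Dr. Ingrams argues an algorithm can serve as a tool for transparency.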

An AI governance publication from the US’s Wilson Center

Within one week of the release of the special issue of Information Polity on AI and governments, a Wilson Center (Woodrow Wilson International Center for Scholars) December 21, 2020 news release (received via email) announces a new publication,

Governing AI: Understanding the Limits, Possibilities, and Risks of AI in an Era of Intelligent Tools and Systems by John Zysman & Mark Nitzberg

Abstract

In debates about artificial intelligence (AI), imaginations often run wild. Policy-makers, opinion leaders, and the public tend to believe that AI is already an immensely powerful universal technology, limitless in its possibilities. However, while machine learning (ML), the principal computer science tool underlying today’s AI breakthroughs, is indeed powerful, ML is fundamentally a form of context-dependent statistical inference and as such has its limits. Specifically, because ML relies on correlations between inputs and outputs or emergent clustering in training data, today’s AI systems can only be applied in well-specified problem domains, still lacking the context sensitivity of a typical toddler or house-pet. Consequently, instead of constructing policies to govern artificial general intelligence (AGI), decision-makers should focus on the distinctive and powerful problems posed by narrow AI, including misconceived benefits and the distribution of benefits, autonomous weapons, and bias in algorithms. AI governance, at least for now, is about managing those who create and deploy AI systems, and supporting the safe and beneficial application of AI to narrow, well-defined problem domains. Specific implications of our discussion are as follows:

  • AI applications are part of a suite of intelligent tools and systems and must ultimately be regulated as a set. Digital platforms, for example, generate the pools of big data on which AI tools operate and hence, the regulation of digital platforms and big data is part of the challenge of governing AI. Many of the platform offerings are, in fact, deployments of AI tools. Hence, focusing on AI alone distorts the governance problem.
  • Simply declaring objectives—be they assuring digital privacy and transparency, or avoiding bias—is not sufficient. We must decide what the goals actually will be in operational terms.
  • The issues and choices will differ by sector. For example, the consequences of bias and error will differ from a medical domain or a criminal justice domain to one of retail sales.
  • The application of AI tools in public policy decision making, in transportation design or waste disposal or policing among a whole variety of domains, requires great care. There is a substantial risk of focusing on efficiency when the public debate about what the goals should be in the first place is in fact required. Indeed, public values evolve as part of social and political conflict.
  • The economic implications of AI applications are easily exaggerated. Should public investment concentrate on advancing basic research or on diffusing the tools, user interfaces, and training needed to implement them?
  • As difficult as it will be to decide on goals and a strategy to implement the goals of one community, let alone regional or international communities, any agreement that goes beyond simple objective statements is very unlikely.

Unfortunately, I haven’t been able to successfully download the working paper/report from the Wilson Center’s Governing AI: Understanding the Limits, Possibilities, and Risks of AI in an Era of Intelligent Tools and Systems webpage.

However, I have found a draft version of the report (Working Paper) published August 26, 2020 on the Social Science Research Network. This paper originated at the University of California at Berkeley as part of a series from the Berkeley Roundtable on the International Economy (BRIE). ‘Governing AI: Understanding the Limits, Possibility, and Risks of AI in an Era of Intelligent Tools and Systems’ is also known as the BRIE Working Paper 2020-5.

Canadian government and AI

The special issue on AI and governance and the paper published by the Wilson Center stimulated my interest in the Canadian government’s approach to governance, responsibility, transparency, and AI.

There is information out there but it’s scattered across various government initiatives and ministries. Above all, it is not easy to find open communication. Whether that’s by design or the blindness and/or ineptitude to be found in all organizations, I leave to wiser judges. (I’ve worked in small companies and they have the same problem. In colloquial terms, ‘the right hand doesn’t know what the left hand is doing’.)

Responsible use? Maybe not after 2019

First there’s a government of Canada webpage, Responsible use of artificial intelligence (AI). Other than a note at the bottom of the page “Date modified: 2020-07-28,” all of the information dates from 2016 up to March 2019 (which you’ll find on ‘Our Timeline’). Is nothing new happening?

For anyone interested in responsible use, there are two sections, “Our guiding principles” and “Directive on Automated Decision-Making,” that answer some questions. I found the ‘Directive’ to be more informative with its definitions, objectives, and, even, consequences. Sadly, you need to keep clicking to find the consequences and you’ll end up on The Framework for the Management of Compliance. Interestingly, deputy heads are assumed to be in charge of managing non-compliance. I wonder how employees deal with a non-compliant deputy head?

What about the government’s digital service?

You might think the Canadian Digital Service (CDS) would also have some information about responsible use. CDS was launched in 2017, according to Luke Simon’s July 19, 2017 article on Medium,

In case you missed it, there was some exciting digital government news in Canada Tuesday. The Canadian Digital Service (CDS) launched, meaning Canada has joined other nations, including the US and the UK, that have a federal department dedicated to digital.

At the time, Simon was Director of Outreach at Code for Canada.

Presumably, CDS, from an organizational perspective, is somehow attached to the Minister of Digital Government (it’s a position with virtually no governmental infrastructure as opposed to the Minister of Innovation, Science and Economic Development who is responsible for many departments and agencies). The current minister is Joyce Murray whose government profile offers almost no information about her work on digital services. Perhaps there’s a more informative profile of the Minister of Digital Government somewhere on a government website.

Meanwhile, the folks at CDS are friendly but they don’t offer much substantive information. From the CDS homepage,

Our aim is to make services easier for government to deliver. We collaborate with people who work in government to address service delivery problems. We test with people who need government services to find design solutions that are easy to use.

Learn more

After clicking on Learn more, I found this,

At the Canadian Digital Service (CDS), we partner up with federal departments to design, test and build simple, easy to use services. Our goal is to improve the experience – for people who deliver government services and people who use those services.

How it works

We work with our partners in the open, regularly sharing progress via public platforms. This creates a culture of learning and fosters best practices. It means non-partner departments can apply our work and use our resources to develop their own services.

Together, we form a team that follows the ‘Agile software development methodology’. This means we begin with an intensive ‘Discovery’ research phase to explore user needs and possible solutions to meeting those needs. After that, we move into a prototyping ‘Alpha’ phase to find and test ways to meet user needs. Next comes the ‘Beta’ phase, where we release the solution to the public and intensively test it. Lastly, there is a ‘Live’ phase, where the service is fully released and continues to be monitored and improved upon.

Between the Beta and Live phases, our team members step back from the service, and the partner team in the department continues the maintenance and development. We can help partners recruit their service team from both internal and external sources.

Before each phase begins, CDS and the partner sign a partnership agreement which outlines the goal and outcomes for the coming phase, how we’ll get there, and a commitment to get them done.

As you can see, there’s not a lot of detail and they don’t seem to have included anything about artificial intelligence as part of their operation. (I’ll come back to the government’s implementation of artificial intelligence and information technology later.)

Does the Treasury Board of Canada have charge of responsible AI use?

I think so but there are government departments/ministries that also have some responsibilities for AI and I haven’t seen any links back to the Treasury Board documentation.

For anyone not familiar with the Treasury Board, or even if you are, this December 14, 2009 article (Treasury Board of Canada: History, Organization and Issues) on Maple Leaf Web is quite informative,

The Treasury Board of Canada represent a key entity within the federal government. As an important cabinet committee and central agency, they play an important role in financial and personnel administration. Even though the Treasury Board plays a significant role in government decision making, the general public tends to know little about its operation and activities. [emphasis mine] The following article provides an introduction to the Treasury Board, with a focus on its history, responsibilities, organization, and key issues.

It seems the Minister of Digital Government, Joyce Murray, is part of the Treasury Board and the Treasury Board is the source for the Digital Operations Strategic Plan: 2018-2022.

I haven’t read the entire document but the table of contents doesn’t include a heading for artificial intelligence and there wasn’t any mention of it in the opening comments.

But isn’t there a Chief Information Officer for Canada?

Herein lies a tale (I doubt I’ll ever get the real story) but the answer is a qualified ‘no’. The Chief Information Officer for Canada, Alex Benay (there is an AI aspect) stepped down in September 2019 to join a startup company according to an August 6, 2019 article by Mia Hunt for Global Government Forum,

Alex Benay has announced he will step down as Canada’s chief information officer next month to “take on new challenge” at tech start-up MindBridge.

“It is with mixed emotions that I am announcing my departure from the Government of Canada,” he said on Wednesday in a statement posted on social media, describing his time as CIO as “one heck of a ride”.

He said he is proud of the work the public service has accomplished in moving the national digital agenda forward. Among these achievements, he listed the adoption of public Cloud across government; delivering the “world’s first” ethical AI management framework; [emphasis mine] renewing decades-old policies to bring them into the digital age; and “solidifying Canada’s position as a global leader in open government”.

He also led the introduction of new digital standards in the workplace, and provided “a clear path for moving off” Canada’s failed Phoenix pay system. [emphasis mine]

I cannot find a current Chief Information Officer of Canada despite searches but I did find this List of chief information officers (CIO) by institution. Where there was one, there are now many.

Since September 2019, Mr. Benay has moved again according to a November 7, 2019 article by Meagan Simpson on the BetaKit website (Note: Links have been removed),

Alex Benay, the former CIO [Chief Information Officer] of Canada, has left his role at Ottawa-based Mindbridge after a short few months stint.

The news came Thursday, when KPMG announced that Benay was joining the accounting and professional services organization as partner of digital and government solutions. Benay originally announced that he was joining Mindbridge in August, after spending almost two and a half years as the CIO for the Government of Canada.

Benay joined the AI startup as its chief client officer and, at the time, was set to officially take on the role on September 3rd. According to Benay’s LinkedIn, he joined Mindbridge in August, but if the September 3rd start date is correct, Benay would have only been at Mindbridge for around three months. The former CIO of Canada was meant to be responsible for Mindbridge’s global growth as the company looked to prepare for an IPO in 2021.

Benay told The Globe and Mail that his decision to leave Mindbridge was not a question of fit, or that he considered the move a mistake. He attributed his decision to leave to conversations with Mindbridge customer KPMG, over a period of three weeks. Benay told The Globe that he was drawn to the KPMG opportunity to lead its digital and government solutions practice, something that was more familiar to him given his previous role.

Mindbridge has not completely lost what was touted as a star hire, though, as Benay will be staying on as an advisor to the startup. “This isn’t a cutting the cord and moving on to something else completely,” Benay told The Globe. “It’s a win-win for everybody.”

Via Mr. Benay, I’ve re-introduced artificial intelligence and introduced the Phoenix Pay System; now I’m linking them to the government’s implementation of information technology in a specific case and speculating about the implementation of artificial intelligence algorithms in government.

Phoenix Pay System Debacle (things are looking up), a harbinger for responsible use of artificial intelligence?

I’m happy to hear that the situation where government employees had no certainty about their paycheques is becoming better. After the ‘new’ Phoenix Pay System was implemented in early 2016, government employees found they might get the correct amount on their paycheques, significantly less than they were entitled to, or huge increases.

The instability alone would be distressing but adding to it with the inability to get the problem fixed must have been devastating. Almost five years later, the problems are being resolved and people are getting paid appropriately, more often.

The estimated cost for fixing the problems was, as I recall, over $1B; I think that was a little optimistic. James Bagnall’s July 28, 2020 article for the Ottawa Citizen provides more detail, although not about the current cost, and is the source of my measured optimism,

Something odd has happened to the Phoenix Pay file of late. After four years of spitting out errors at a furious rate, the federal government’s new pay system has gone quiet.

And no, it’s not because of the even larger drama written by the coronavirus. In fact, there’s been very real progress at Public Services and Procurement Canada [PSPC; emphasis mine], the department in charge of pay operations.

Since January 2018, the peak of the madness, the backlog of all pay transactions requiring action has dropped by about half to 230,000 as of late June. Many of these involve basic queries for information about promotions, overtime and rules. The part of the backlog involving money — too little or too much pay, incorrect deductions, pay not received — has shrunk by two-thirds to 125,000.

These are still very large numbers but the underlying story here is one of long-delayed hope. The government is processing the pay of more than 330,000 employees every two weeks while simultaneously fixing large batches of past mistakes.

While officials with two of the largest government unions — Public Service Alliance of Canada [PSAC] and the Professional Institute of the Public Service of Canada [PIPSC] — disagree the pay system has worked out its kinks, they acknowledge it’s considerably better than it was. New pay transactions are being processed “with increased timeliness and accuracy,” the PSAC official noted.

Neither union is happy with the progress being made on historical mistakes. PIPSC president Debi Daviau told this newspaper that many of her nearly 60,000 members have been waiting for years to receive salary adjustments stemming from earlier promotions or transfers, to name two of the more prominent sources of pay errors.

Even so, the sharp improvement in Phoenix Pay’s performance will soon force the government to confront an interesting choice: Should it continue with plans to replace the system?

Treasury Board, the government’s employer, two years ago launched the process to do just that. Last March, SAP Canada — whose technology underpins the pay system still in use at Canada Revenue Agency — won a competition to run a pilot project. Government insiders believe SAP Canada is on track to build the full system starting sometime in 2023.

When Public Services set out the business case in 2009 for building Phoenix Pay, it noted the pay system would have to accommodate 150 collective agreements that contained thousands of business rules and applied to dozens of federal departments and agencies. The technical challenge has since intensified.

Under the original plan, Phoenix Pay was to save $70 million annually by eliminating 1,200 compensation advisors across government and centralizing a key part of the operation at the pay centre in Miramichi, N.B., where 550 would manage a more automated system.

Instead, the Phoenix Pay system currently employs about 2,300.  This includes 1,600 at Miramichi and five regional pay offices, along with 350 each at a client contact centre (which deals with relatively minor pay issues) and client service bureau (which handles the more complex, longstanding pay errors). This has naturally driven up the average cost of managing each pay account — 55 per cent higher than the government’s former pay system according to last fall’s estimate by the Parliamentary Budget Officer.

… As the backlog shrinks, the need for regional pay offices and emergency staffing will diminish. Public Services is also working with a number of high-tech firms to develop ways of accurately automating employee pay using artificial intelligence [emphasis mine].

Given the Phoenix Pay System debacle, it might be nice to see a little information about how the government is planning to integrate more sophisticated algorithms (artificial intelligence) in their operations.

I found this on a Treasury Board webpage, all 1 minute and 29 seconds of it,

The blonde model or actress mentions that companies applying to Public Services and Procurement Canada for placement on the list must use AI responsibly. Her script does not include a definition or guidelines, which, as previously noted, are on the Treasury Board website.

As for Public Services and Procurement Canada, they have an Artificial intelligence source list,

Public Services and Procurement Canada (PSPC) is putting into operation the Artificial intelligence source list to facilitate the procurement of Canada’s requirements for Artificial intelligence (AI).

After research and consultation with industry, academia, and civil society, Canada identified 3 AI categories and business outcomes to inform this method of supply:

Insights and predictive modelling

Machine interactions

Cognitive automation

PSPC is focused only on procuring AI. If there are guidelines on their website for its use, I did not find them.

I found one more government agency that might have some information about artificial intelligence and guidelines for its use, Shared Services Canada,

Shared Services Canada (SSC) delivers digital services to Government of Canada organizations. We provide modern, secure and reliable IT services so federal organizations can deliver digital programs and services that meet Canadians’ needs.

Since the Minister of Digital Government, Joyce Murray, is listed on the homepage, I was hopeful that I could find out more about AI and governance and whether or not the Canadian Digital Service was associated with this government ministry/agency. I was frustrated on both counts.

To sum up, there is no information that I could find, dated after March 2019, about Canada, its government, and its plans for AI, especially responsible management/governance of AI, on a Canadian government website, although I have found guidelines, expectations, and consequences for non-compliance. (Should anyone know which government agency has up-to-date information on its responsible use of AI, please let me know in the Comments.)

Canadian Institute for Advanced Research (CIFAR)

The first mention of the Pan-Canadian Artificial Intelligence Strategy is in my analysis of the Canadian federal budget in a March 24, 2017 posting. Briefly, CIFAR received a big chunk of that money. Here’s more about the strategy from the CIFAR Pan-Canadian AI Strategy homepage,

In 2017, the Government of Canada appointed CIFAR to develop and lead a $125 million Pan-Canadian Artificial Intelligence Strategy, the world’s first national AI strategy.

CIFAR works in close collaboration with Canada’s three national AI Institutes — Amii in Edmonton, Mila in Montreal, and the Vector Institute in Toronto, as well as universities, hospitals and organizations across the country.

The objectives of the strategy are to:

Attract and retain world-class AI researchers by increasing the number of outstanding AI researchers and skilled graduates in Canada.

Foster a collaborative AI ecosystem by establishing interconnected nodes of scientific excellence in Canada’s three major centres for AI: Edmonton, Montreal, and Toronto.

Advance national AI initiatives by supporting a national research community on AI through training programs, workshops, and other collaborative opportunities.

Understand the societal implications of AI by developing global thought leadership on the economic, ethical, policy, and legal implications [emphasis mine] of advances in AI.

Responsible AI at CIFAR

You can find Responsible AI in a webspace devoted to what they have called AI & Society. Here’s more from the homepage,

CIFAR is leading global conversations about AI’s impact on society.

The AI & Society program, one of the objectives of the CIFAR Pan-Canadian AI Strategy, develops global thought leadership on the economic, ethical, political, and legal implications of advances in AI. These dialogues deliver new ways of thinking about issues, and drive positive change in the development and deployment of responsible AI.

Solution Networks

AI Futures Policy Labs

AI & Society Workshops

Building an AI World

Under the category of building an AI World I found this (from CIFAR’s AI & Society homepage),

BUILDING AN AI WORLD

Explore the landscape of global AI strategies.

Canada was the first country in the world to announce a federally-funded national AI strategy, prompting many other nations to follow suit. CIFAR published two reports detailing the global landscape of AI strategies.

I skimmed through the second report and it seems more like a comparative study of various countries’ AI strategies than an overview of responsible use of AI.

Final comments about Responsible AI in Canada and the new reports

I’m glad to see there’s interest in Responsible AI but based on my adventures searching the Canadian government websites and the Pan-Canadian AI Strategy webspace, I’m left feeling hungry for more.

I didn’t find any details about how AI is being integrated into government departments and for what uses. I’d like to know and I’d like to have some say about how it’s used and how the inevitable mistakes will be dealt with.

The great unwashed

What I’ve found is high minded, but, as far as I can tell, there’s absolutely no interest in talking to the ‘great unwashed’. Those of us who are not experts are being left out of these earlier stage conversations.

I’m sure we’ll be consulted at some point but it will be long past the time when our opinions and insights could have had an impact and helped us avoid the problems that experts tend not to see. What we’ll be left with is protest and anger on our part and, finally, grudging admissions and corrections of errors on the government’s part.

Let’s take this as an example. The Phoenix Pay System was implemented in its first phase on Feb. 24, 2016. As I recall, problems developed almost immediately. The second phase of implementation started April 21, 2016. In May 2016 the government hired consultants to fix the problems. On November 29, 2016 the government minister, Judy Foote, admitted a mistake had been made. In February 2017 the government hired consultants to establish what lessons they might learn. By February 15, 2018 the pay problems backlog amounted to 633,000. Source: James Bagnall’s Feb. 23, 2018 ‘timeline’ for the Ottawa Citizen.

Do take a look at the timeline, there’s more to it than what I’ve written here and I’m sure there’s more to the Phoenix Pay System debacle than a failure to listen to warnings from those who would be directly affected. It’s fascinating though how often a failure to listen presages far deeper problems with a project.

The Canadian government, under both Conservative and Liberal administrations, contributed to the Phoenix debacle but it seems the gravest concern is with senior government bureaucrats. You might think things have changed since this recounting of the affair in a June 14, 2018 article by Michelle Zilio for the Globe and Mail,

The three public servants blamed by the Auditor-General for the Phoenix pay system problems were not fired for mismanagement of the massive technology project that botched the pay of tens of thousands of public servants for more than two years.

Marie Lemay, deputy minister for Public Services and Procurement Canada (PSPC), said two of the three Phoenix executives were shuffled out of their senior posts in pay administration and did not receive performance bonuses for their handling of the system. Those two employees still work for the department, she said. Ms. Lemay, who refused to identify the individuals, said the third Phoenix executive retired.

In a scathing report last month, Auditor-General Michael Ferguson blamed three “executives” – senior public servants at PSPC, which is responsible for Phoenix − for the pay system’s “incomprehensible failure.” [emphasis mine] He said the executives did not tell the then-deputy minister about the known problems with Phoenix, leading the department to launch the pay system despite clear warnings it was not ready.

Speaking to a parliamentary committee on Thursday, Ms. Lemay said the individuals did not act with “ill intent,” noting that the development and implementation of the Phoenix project were flawed. She encouraged critics to look at the “bigger picture” to learn from all of Phoenix’s failures.

Mr. Ferguson, whose office spoke with the three Phoenix executives as a part of its reporting, said the officials prioritized some aspects of the pay-system rollout, such as schedule and budget, over functionality. He said they also cancelled a pilot implementation project with one department that would have helped it detect problems indicating the system was not ready.

Mr. Ferguson’s report warned the Phoenix problems are indicative of “pervasive cultural problems” [emphasis mine] in the civil service, which he said is fearful of making mistakes, taking risks and conveying “hard truths.”

Speaking to the same parliamentary committee on Tuesday, Privy Council Clerk [emphasis mine] Michael Wernick challenged Mr. Ferguson’s assertions, saying his chapter on the federal government’s cultural issues is an “opinion piece” containing “sweeping generalizations.”

The Privy Council Clerk is the top level bureaucrat (and there is only one such clerk) in the civil/public service and I think his quotes are quite telling of “pervasive cultural problems.” There’s a new Privy Council Clerk but from what I can tell he was well trained by his predecessor.

Do* we really need senior government bureaucrats?

I now have an example of bureaucratic interference, specifically with the Global Public Health Intelligence Network (GPHIN), where it would seem that not much has changed, from a December 26, 2020 article by Grant Robertson for the Globe & Mail,

When Canada unplugged support for its pandemic alert system [GPHIN] last year, it was a symptom of bigger problems inside the Public Health Agency. Experienced scientists were pushed aside, expertise was eroded, and internal warnings went unheeded, which hindered the department’s response to COVID-19

As a global pandemic began to take root in February, China held a series of backchannel conversations with Canada, lobbying the federal government to keep its borders open.

With the virus already taking a deadly toll in Asia, Heng Xiaojun, the Minister Counsellor for the Chinese embassy, requested a call with senior Transport Canada officials. Over the course of the conversation, the Chinese representatives communicated Beijing’s desire that flights between the two countries not be stopped because it was unnecessary.

“The Chinese position on the continuation of flights was reiterated,” say official notes taken from the call. “Mr. Heng conveyed that China is taking comprehensive measures to combat the coronavirus.”

Canadian officials seemed to agree, since no steps were taken to restrict or prohibit travel. To the federal government, China appeared to have the situation under control and the risk to Canada was low. Before ending the call, Mr. Heng thanked Ottawa for its “science and fact-based approach.”

It was a critical moment in the looming pandemic, but the Canadian government lacked the full picture, instead relying heavily on what Beijing was choosing to disclose to the World Health Organization (WHO). Ottawa’s ability to independently know what was going on in China – on the ground and inside hospitals – had been greatly diminished in recent years.

Canada once operated a robust pandemic early warning system and employed a public-health doctor based in China who could report back on emerging problems. But it had largely abandoned those international strategies over the past five years, and was no longer as plugged-in.

By late February [2020], Ottawa seemed to be taking the official reports from China at their word, stating often in its own internal risk assessments that the threat to Canada remained low. But inside the Public Health Agency of Canada (PHAC), rank-and-file doctors and epidemiologists were growing increasingly alarmed at how the department and the government were responding.

“The team was outraged,” one public-health scientist told a colleague in early April, in an internal e-mail obtained by The Globe and Mail, criticizing the lack of urgency shown by Canada’s response during January, February and early March. “We knew this was going to be around for a long time, and it’s serious.”

China had locked down cities and restricted travel within its borders. Staff inside the Public Health Agency believed Beijing wasn’t disclosing the whole truth about the danger of the virus and how easily it was transmitted. “The agency was just too slow to respond,” the scientist said. “A sane person would know China was lying.”

It would later be revealed that China’s infection and mortality rates were played down in official records, along with key details about how the virus was spreading.

But the Public Health Agency, which was created after the 2003 SARS crisis to bolster the country against emerging disease threats, had been stripped of much of its capacity to gather outbreak intelligence and provide advance warning by the time the pandemic hit.

The Global Public Health Intelligence Network, an early warning system known as GPHIN that was once considered a cornerstone of Canada’s preparedness strategy, had been scaled back over the past several years, with resources shifted into projects that didn’t involve outbreak surveillance.

However, a series of documents obtained by The Globe during the past four months, from inside the department and through numerous Access to Information requests, show the problems that weakened Canada’s pandemic readiness run deeper than originally thought. Pleas from the international health community for Canada to take outbreak detection and surveillance much more seriously were ignored by mid-level managers [emphasis mine] inside the department. A new federal pandemic preparedness plan – key to gauging the country’s readiness for an emergency – was never fully tested. And on the global stage, the agency stopped sending experts [emphasis mine] to international meetings on pandemic preparedness, instead choosing senior civil servants with little or no public-health background [emphasis mine] to represent Canada at high-level talks, The Globe found.

The curtailing of GPHIN and allegations that scientists had become marginalized within the Public Health Agency, detailed in a Globe investigation this past July [2020], are now the subject of two federal probes – an examination by the Auditor-General of Canada and an independent federal review, ordered by the Minister of Health.

Those processes will undoubtedly reshape GPHIN and may well lead to an overhaul of how the agency functions in some areas. The first steps will be identifying and fixing what went wrong. With the country now topping 535,000 cases of COVID-19 and more than 14,700 dead, there will be lessons learned from the pandemic.

Prime Minister Justin Trudeau has said he is unsure what role added intelligence [emphasis mine] could have played in the government’s pandemic response, though he regrets not bolstering Canada’s critical supplies of personal protective equipment sooner. But providing the intelligence to make those decisions early is exactly what GPHIN was created to do – and did in previous outbreaks.

Epidemiologists have described in detail to The Globe how vital it is to move quickly and decisively in a pandemic. Acting sooner, even by a few days or weeks in the early going, and throughout, can have an exponential impact on an outbreak, including deaths. Countries such as South Korea, Australia and New Zealand, which have fared much better than Canada, appear to have acted faster in key tactical areas, some using early warning information they gathered. As Canada prepares itself in the wake of COVID-19 for the next major health threat, building back a better system becomes paramount.

If you have time, do take a look at Robertson’s December 26, 2020 article and the July 2020 Globe investigation. As both articles make clear, senior bureaucrats whose chief attribute seems to have been longevity took over, reallocated resources, drove out experts, and crippled the few remaining experts in the system with a series of bureaucratic demands while taking trips to attend meetings (in desirable locations) for which they had no significant or useful input.

The Phoenix and GPHIN debacles bear a resemblance in that senior bureaucrats took over and in a state of blissful ignorance made a series of disastrous decisions bolstered by politicians who seem to neither understand nor care much about the outcomes.

If you think I’m being harsh, watch Canadian Broadcasting Corporation (CBC) reporter Rosemary Barton’s 2020 year-end interview with Prime Minister Trudeau (Note: there are some commercials) and pay special attention to Trudeau’s answer to the first question,

Responsible AI, eh?

Based on the massive mishandling of the Phoenix Pay System implementation, where top bureaucrats did not follow basic and well established information services procedures, and the Global Public Health Intelligence Network mismanagement by top level bureaucrats, I’m not sure I have a lot of confidence in any Canadian government claims about a responsible approach to using artificial intelligence.

Unfortunately, it doesn’t matter as implementation is most likely already taking place here in Canada.

Enough with the pessimism. I feel it’s necessary to end this on a mildly positive note. Hurray to the government employees who worked through the Phoenix Pay System debacle, the current and former GPHIN experts who continued to sound warnings, and all those people striving to make true the principles of ‘Peace, Order, and Good Government’, the bedrock principles of the Canadian Parliament.

A lot of mistakes have been made but we also do make a lot of good decisions.

*’Doe’ changed to ‘Do’ on May 14, 2021.

Wizards wanted for Canadian federal government positions

The final (you may want to apply as soon as possible) deadline for applying is August 30, 2019 and the salary range is from $57,000 to $61,000. (H/T: Liz Haq’s March 27, 2019 article for Huffington Post Canada.) While this might seem like a departure from my usual fare, it’s possible there’s some science involved since the Treasury Board President, Joyce Murray, is also the Minister of Digital Government and that ‘ministry’ is tightly interwoven with the Treasury Board Secretariat. Other than having a deputy minister and chief information officer, Alex Benay, who reports to the Treasury Board President, there doesn’t appear to be an office or even a webpage dedicated to this ‘ministry’. You can find the Office of the Chief Information Officer on the Treasury Board’s Organizational Structure webpage. Moving on.

Has there been anything this whimsical from any Canadian government (pick your jurisdiction, federal, provincial, or municipal) job posting since the fabled 1960s and 70s? From the Government of Canada Jobs webpage hosting: AS-02 Various Administrative Wizardry Positions – INVENTORY (Note: I’ve changed some of the formatting),

Work environment
Are you a Gryffindor (brave, loyal, courageous and adventurous), a Ravenclaw (wise, creative, clever and knowledgeable) a Hufflepuff (hard working, dedicated, fair, patient) or a Slytherin (resourceful, ambitious, determined and crave leadership)?

No matter what ‘house’ you belong to, Treasury Board of Canada Secretariat (TBS) has various teams that we would love to use our ‘sorting hat’ to place you into. We are looking for strong and motivated candidates that are interested in making an impact on Canadian citizens. With our Talent Management Program, we will help you grow, learn and further develop your magical career within the Public Service. Come and let TBS become your home away from home!

Intent of the process
We will conduct the first random selection of applicants – also known as “wizards” – on April 8, 2019. Therefore, if you would like to increase your chance of being considered in this first group, please ensure to submit your application by April 7, 2019.

A pool of partially qualified persons resulting from this process WILL be created and WILL be used to fill similar positions with linguistic profiles (Bilingual Imperative BBB/BBB and CBC/CBC. In order to continue creating a diverse workforce, some positions may be filled on a bilingual Non-Imperative BBB/BBB and CBC/CBC basis for the following Employment Equity groups: Indigenous Persons, Visible Minorities and Persons with Disabilities) as well as tenures (please refer to Employment Tenure section of this poster) that may vary according to the position being staffed. This pool may be used to staff similar positions in other organizations within the core public administration (http://www.psc-cfp.gc.ca/plcy-pltq/rfli-lirf/index-eng.htm). By applying to this process, you consent to your personal application-related information being shared with other government departments interested in staffing similar positions.

Positions to be filled: Number to be determined
Information you must provide
Your résumé.
In order to be considered, your application must clearly explain how you meet the following (essential qualifications)
Education:
• A secondary school diploma or an acceptable combination of education, training and/or experience.
Degree equivalency
Experience:
• Significant* experience in providing administrative support services;
• Significant* experience in processing, tracking and proof reading documents such as reports, letters, briefing notes, memos or correspondence;
• Experience liaising with and providing advice or guidance to management, staff or clients.

*Significant experience is defined as having performed the duties for a minimum of one (1) year.
The following will be applied / assessed at a later date (essential for the job)
Various language requirements
Bilingual Imperative BBB/BBB and CBC/CBC
Bilingual Non-Imperative BBB/BBB and CBC/CBC. In order to continue creating a diverse workforce, some positions may be filled on a bilingual Non-Imperative BBB/BBB and CBC/CBC basis for the following Employment Equity groups: Indigenous Persons, Visible Minorities and Persons with Disabilities
Information on language requirements
Second Language Writing Skills Self-Assessment
In order to help you decide if you should apply to a bilingual position, an optional self-assessment of your writing skills in your second official language is available for you to take before completing your application.
For more information, please consult:
Unsupervised Internet Test of Second Language Writing Skills
Competencies:
• Demonstrates integrity and respect;
• Thinking things through;
• Working effectively with others;
• Showing initiative and being action-oriented.
Abilities:
• Ability to communicate effectively in writing;
• Ability to communicate effectively orally.
Personal Suitability:
• Reliability;
• Attention to detail.
The following may be applied / assessed at a later date (may be needed for the job)
Asset Qualifications (Although these are not mandatory to be found qualified in this appointment process, you must clearly demonstrate in your resume how you meet the asset criterion if you respond yes.)

Experience:
• Experience working in a legal environment;
• Experience working in a Human Resources environment;
• Experience in using a human resources information management system;
• Experience providing functional support and advice to clients on systems;
• Experience working with advanced Excel functions (for example: macros, pivot tables, formulas, etc.);
• Experience working in a security environment or related field;
• Experience in scheduling and coordinating an Executives calendar (EX-01 level or equivalent or above);
• Experience supervising/managing a team;
• Experience in providing budget support and financial services;
• Experience organizing events or government travel arrangements;
• Experience working on a project or program;
• Experience working in a communication environment;
• Experience working in accommodations or facilities management.
Other information

The Public Service of Canada is committed to building a skilled and diverse workforce that reflects the Canadians we serve. We promote employment equity and encourage you to indicate if you belong to one of the designated groups when you apply.
Information on employment equity

We will communicate with you about this process by email. As a result, you must update your Public Service Resourcing System profile if it changes as well as advise us of these changes via email. Applicants should use an email address that accepts messages from unknown senders (some email systems block such messages).

Come join TBS the “Hogwarts” of the Public Service!
Preference
Preference will be given to veterans and to Canadian citizens, in that order, with the exception of a job located in Nunavut, where Nunavut Inuit will be appointed first.
Information on the preference to veterans
We thank all those who apply. Only those selected for further consideration will be contacted.
Contact information
AS-02 Inventory team / “Dumbledore’s army”
AS02.TBS-AS02.SCT@tbs-sct.gc.ca

Apply online

Here are a few more details that might help you decide if you want to ‘throw your hat in’, from the Government of Canada Jobs webpage hosting: AS-02 Various Administrative Wizardry Positions – INVENTORY (Note: I’ve changed some of the formatting),

Treasury Board of Canada Secretariat
Ottawa (Ontario)
AS-02
Permanent, acting and temporary
$57,430 to $61,877

For further information on the organization, please visit Treasury Board of Canada Secretariat
The Cracking the Code video helps people who are looking for a new career with the Government of Canada to navigate the application process step by step.
Closing date: 30 August 2019 – 23:59, Pacific Time
Who can apply: Persons residing in Canada and Canadian citizens residing abroad. Apply online

I think they’re trying to introduce some fresh air into the federal civil service (or public service, if you prefer). It is badly needed if the about-to-be former Clerk of the Privy Council (Canada’s top ranking bureaucrat), Michael Wernick, is any indication of the state of our government bureaucracy (from a March 18, 2019 news item on CTV [Canadian Television] news online),

He [Michael Wernick] was directly named by Jody Wilson-Raybould [former Attorney General and Justice Minister] as one of the senior officials who she alleges was involved in a “sustained effort” to politically interfere in the criminal prosecution of SNC-Lavalin. She accused Wernick of issuing “veiled threats” if she did not change her mind about instructing federal prosecutors to pursue a remediation agreement rather than continuing with the criminal trial.

During his two appearances before the House Justice Committee on this matter, Wernick delivered direct and sometimes terse responses to MPs’ questions about his alleged involvement. He denied ever making any threats in relation to Wilson-Raybould’s handling of the criminal case against the Quebec company, as she had alleged.

He also raised eyebrows during his first round of testimony when he offered off-topic opening remarks on the state of online discourse, partisanship and the prospect of political assassinations in the upcoming campaign. [emphases mine]

In addition to concerns over his behaviour and perceived partisan comments as part of the SNC-Lavalin affair, MPs have also registered their discomfort with a related role he held: being part of a high-level panel responsible for deciding when and how to inform Canadians about concerning online behaviour during an election campaign.

NDP MP and ethics critic Charlie Angus sent an open letter to Prime Minister Justin Trudeau prior to Wernick’s second round of testimony, saying that Wernick was “deeply compromised,” has “overstepped his role,” and could not remain in his position

In the video clips I’ve seen of Wernick’s testimony before the Justice Committee, he seemed a little condescending and arrogant. If you want to see for yourself, there’s an embedded video of the CTV report on Wernick’s resignation in the March 18, 2019 CTV news item, which includes some of his testimony.

Canada’s Chief Science Advisor: the first annual report

Dr. Mona Nemer, Canada’s Chief Science Advisor, and her office have issued their 2018 annual report. It is also the office’s first annual report. (Brief bit of history: There was a similar position, National Science Advisor (Canada), from 2004 to 2008. Dr. Arthur Carty, the advisor, was dumped and the position eliminated when the party that had been in opposition won the federal election and formed a new government.)

The report can be found in html format here, which is where you’ll also find a link to download the full 40 pp. report as a PDF.

Being an inveterate ‘skip to the end’ kind of a reader, I took a boo at the Conclusion,

A chief science advisor plays an important role in building consensus and supporting positive change to our national institutions. Having laid the foundation for the function of the Chief Science Advisor in year one, I look forward to furthering the work that has been started and responding to new requests from the Prime Minister, the Minister of Science and members of Cabinet in my second year. I intend to continue working closely with agency heads, deputy ministers and the science community to better position Canada for global leadership in science and innovation.

I think the conclusion could have done with a little more imagination and/or personality, even by government report standards. While there is some interesting material in other parts of the report and, on the whole, it is written in a pleasant, accessible style, it raises questions.

Let’s start here with one of the sections that raises questions,

The Budget 2018 commitment of $2.8 billion to build multi-purpose, collaborative federal science and technology facilities presents a once-in-a-generation opportunity to establish a strong foundation for Canada’s future federal science enterprise. Similarly momentous opportunities exist to create a Digital Research Infrastructure Strategy [emphasis mine] for extramural science and a strategic national approach to Major Research Facilities. This is an important step forward for the Government of Canada in bringing principles and standards for federal science in line with practices in the broader scientific community; as such, the Model Policy will not only serve for the benefit of federal science, but will also help facilitate collaboration with the broader scientific community.

Over the past year, my office has participated in the discussion around these potentially transformative initiatives. We offered perspectives on the evidence needed to guide decisions regarding the kind of science infrastructure that can support the increasingly collaborative and multidisciplinary nature of science.

Among other things, I took an active part in the deliberations of the Deputy Ministers Science Committee (DMSC) and in the production of recommendations to Cabinet with respect to federal science infrastructure renewal. I have also analyzed the evolving nature and needs for national science facilities that support the Canadian science community.

Success in building the science infrastructure of the next 50 years and propelling Canadian science and research to new heights will require that the multiple decisions around these foundational initiatives dovetail based on a deep and integrated understanding of Canada’s science system and commitment to maintaining a long-term vision.

Ultimately, these infrastructure investments will need to support a collective vision and strategy for science in Canada, including which functions federal science should perform, and which should be shared with or carried out separately by academic and private sector researchers, in Canada and abroad. These are some of the essential considerations that must guide questions of infrastructure and resource allocation.

Canadian government digital infrastructure is a pretty thorny issue and while the article I’m referencing is old, I strongly suspect they still have many of the same problems. I’m going to excerpt sections that are focused on IT, specifically, the data centres and data storage. Prepare yourself for a bit of a shock as you discover the government did not know how many data centres it had. From a December 28, 2016 article by James Bagnall for the Ottawa Citizen,

Information technology [IT] offered an even richer vein of potential savings. For half a century, computer networks and software applications had multiplied willy-nilly as individual departments and agencies looked after their own needs. [emphases mine] The result was a patchwork of incompatible, higher-cost systems. Standardizing common, basic technologies such as email, data storage and telecommunications seemed logical. [emphasis mine]

It had been tried before. But attempts to centralize the buying of high-tech gear and services had failed, largely because federal departments were allowed to opt out. Most did so. They did not want to give up control of their IT networks to a central agency.

The PCO determined this time would be different. The prime minister had the authority to create a new federal department through a simple cabinet approval known as an order-in-council. Most departments, including a reluctant Canada Revenue Agency and Department of National Defence, would be forced to carve out a significant portion of their IT groups and budgets — about 40 per cent on average — and hand them over to Shared Services.

Crucially, the move would not be subject to scrutiny by Parliament. And so Shared Services [Shared Services Canada] was born on Aug. 3, 2011.

Speed was demanded of the agency from the start. Minutes of meetings involving senior Shared Services staff are studded with references to “tight schedules” and the “urgency” of getting projects done.

Part of that had to do with the sheer age of the government’s infrastructure. The hardware was in danger of breaking down and the underlying software for many applications was so old that suppliers such as Microsoft, PeopleSoft and Adobe had stopped supporting it. [emphases mine]

The faster Shared Services could install new networks, the less money it would be forced to throw at solving the problems caused by older technology.

… Because the formation of Shared Services had been shrouded in such secrecy, the hiring of experienced outsiders happened last minute. Grant Westcott would join Shared Services as COO in mid-September.

Sixty-three years old at the time, Westcott had served as assistant deputy minister of Public Services in the late 1990s, when he managed the federal government’s telecommunications networks. In that role, he oversaw the successful effort to prepare systems for the year 2000.

More relevant to his new position at Shared Services, Westcott had also worked for nearly a decade at CIBC. As executive vice-president, he had been instrumental in overhauling the bank’s IT infrastructure. Twenty-two of the Canadian Imperial Bank of Commerce’s [CIBC] data centres were shrunk into just two, saving the bank more than $40 million annually in operating costs. Among other things, Westcott and his team consolidated 15 global communications networks into a single voice and data system. More savings resulted.

It wasn’t without glitches. CIBC was forced on several occasions to apologize to customers for a breakdown in certain banking services.

Nevertheless, Westcott’s experience suggested this was not rocket science, at least not in the private sector. [emphasis mine] Streamlining and modernizing IT networks could be done under the right conditions. …


While Shared Services is paying Bell only for the mailboxes it has already moved [there was an email project, which has since been resolved], the agency is still responsible for keeping the lights on at the government’s legacy data centres [emphasis mine] — where most of the electronic mailboxes, and much of the government’s software applications, for that matter, are currently stored.

These jumbles of computers, servers, wires and cooling systems are weighed down by far too much history — and it took Shared Services a long time to get its arms around it.

“We were not created with a transformation plan already in place because to do that, we needed to know what we had,” Forand [Liseanne Forand, then president of Shared Services Canada] explained to members of Parliament. “So we spent a year counting things.” [emphasis mine]

Shared Services didn’t have to start from scratch. Prior to the agency’s 2011 launch, the PCO’s administrative review tried to estimate the number of data centres in government by interviewing the IT czars for each of the departments. Symptomatic of the far-flung nature of the government’s IT networks — and how loosely these were managed — the PCO [Privy Council Office] didn’t even come close.

“When I was at Privy Council, we thought there might be about 200 data centres,” Benoit Long, senior assistant deputy minister of Shared Services told the House of Commons committee last May [2016], “After a year, we had counted 495 and I am still discovering others today.” [emphasis mine]

Most of these data centres were small — less than 1,000 square feet — often set up by government scientists who didn’t bother telling their superiors. Just 22 were larger than 5,000 square feet — and most of these were in the National Capital Region. These were where the bulk of the government’s data and software applications were stored.

In 2012, Grant Westcott organized a six-week tour of these data centres to see for himself what Shared Services was dealing with. According to an agency employee familiar with the tour’s findings, Westcott came away profoundly shocked.

Nearly all the large data centres were decrepit [emphasis mine], undercapitalized and inefficient users of energy. Yet just one was in the process of being replaced. This was a Heron Road facility with a leaking roof, considered key because it housed Canada Revenue Agency [CRA] data.

Bell Canada had been awarded a contract to build and operate a new data centre in Buckingham. The CRA data would be transferred there in the fall of 2013. A separate part of that facility is also meant to host the government’s email system.

The remaining data centres offered Westcott a depressing snapshot. [emphasis mine]

Most of the gear was housed in office towers, rather than facilities tailormade for heavy-duty electronics. With each new generation of computer or servers came requirements for more power and more robust cooling systems. The entire system was approaching a nervous breakdown. [emphasis mine]

Not only were the data centres inherited by Shared Services running out of space, the arrangements for emergency power bordered on makeshift. One Defence Department data centre, for instance, relied [on] a backup generator powered by a relatively ancient jet engine. [emphases mine] While unconventional, there was nothing unique about pressing military hardware into this type of service. The risk had to do with training — the older the jet engine, the fewer employees there were who knew how to fix it. [emphasis mine]

The system for backing up data wasn’t much better. At Statistics Canada — where Westcott’s group was actually required to swear an oath to keep the data confidential before entering the storage rooms — the backup procedure called for storing copies of the data in a separate but nearby facility. While that’s OK for most disasters, the relative proximity still meant Statcan’s repository of historical data was vulnerable to a bombing or major earthquake.

Two and a quarter years later, I imagine they’ve finished counting their data centres, but how much progress they might have made on replacing and upgrading the hardware and the facilities is not known to me. It’s hard to believe that all of the fixes have been made, especially after reading the Chief Science Advisor’s 2018 annual report (the relevant bits are coming up shortly).

If a more comprehensive view of the current government digital infrastructure interests you, I highly recommend Bagnall’s article. It’s heavy going for someone who’s not part of the Ottawa scene but it’s worth the effort as it gives you some insight into the federal government’s workings, which seem remarkably similar regardless of which party is in power. Plus, it’s really well written.

Getting back to the report, I realize it’s not within the Chief Science Advisor’s scope to discuss the entirety of the government’s digital infrastructure but ***Corrected March 22, 2019: Oops! As originally written this section was incorrect. Here’s the corrected version; you can find the original version at the end of this post: “… 40 percent of facilities are more than 50 years old …” doesn’t really convey the depth and breadth of the problem. Presumably, these facilities also include digital infrastructure. At any rate, 40 percent of what? Just how many of these facilities are over 50 years old?*** By the way, the situation as the Chief Science Advisor’s Office describes it gets a little worse, keep reading,

Over the past two years, the Treasury Board Secretariat has worked with the federal science community to develop an inventory of federal science infrastructure.

The project was successful in creating a comprehensive list of federal science facilities and documenting their state of repair. It reveals that some 40 percent of facilities are more than 50 years old and another 40 percent are more than 25 years old. The problem is most acute in the National Capital Region.

A similar inventory of research equipment is now underway. This will allow for effective coordination of research activities across organizations and with external collaborators. The Canada Foundation for Innovation, which provides federal funding toward the costs of academic research infrastructure, has created a platform upon which this work will build.

***Also corrected on March 20, 2019: Bravo to the Treasury Board for creating an inventory of the aging federal science infrastructure, which hopefully includes the digital.*** Following on Bagnall’s outline of the problems, it had to be discouraging work, necessary but discouraging. ***Also corrected on March 20, 2019: Did they take advantage of the Shared Services Canada data centre counting exercise? Or did they redo the work?*** In any event, both the Treasury Board and the Chief Science Advisor have even more work ahead of them.

Couldn’t there have been a link to the inventory? Is it secret information? It would have been nice if Canada’s Chief Science Advisor’s website had offered additional information or a link to supplement the 2018 annual report.

Admittedly there probably aren’t that many people who’d like more information about infrastructure and federal science offices but surely they could somehow give us a little more detail. For example, I understand there’s an AI (artificial intelligence) initiative within the government and given some of the issues, such as the Phoenix payroll system debacle and the digital infrastructure consisting of ‘chewing gum and baling wire’, my confidence is shaken. Have they learned any lessons? The report doesn’t offer any assurance they are taking these ‘lessons’ into account as they forge onwards.

Lies, damned lies, and statistics

I’ve always liked that line about lies and I’ll get to its applicability with regard to the Chief Science Advisor’s 2018 report but first, from the Lies, Damned Lies, and Statistics Wikipedia entry (Note: Links have been removed),


“Lies, damned lies, and statistics” is a phrase describing the persuasive power of numbers, particularly the use of statistics to bolster weak arguments. It is also sometimes colloquially used to doubt statistics used to prove an opponent’s point.

The phrase was popularized in the United States by Mark Twain (among others), who attributed it to the British prime minister Benjamin Disraeli: “There are three kinds of lies: lies, damned lies, and statistics.” However, the phrase is not found in any of Disraeli’s works and the earliest known appearances were years after his death. Several other people have been listed as originators of the quote, and it is often erroneously attributed to Twain himself.[1]

Here’s a portion of the 2018 report, which is “using the persuasive power of numbers”, in this case, to offer a suggestion about why people mistrust science,

A recent national public survey found that eight in ten respondents wanted to know more about science and how it affects our world, and roughly the same percentage of people are comfortable knowing that scientific answers may not be definitive. 1 [emphases mine] While this would seem to be a positive result, it may also suggest why some people mistrust science, [emphasis mine] believing that results are fluid and can support several different positions. Again, this reveals the importance of effectively communicating science, not only to counter misinformation, but to inspire critical thinking and an appreciation for curiosity and discovery.

I was happy to see the footnote but surprised that the writer(s) simply trotted out a statistic without any hedging about the reliability of the data. Just because someone says that eight out of ten respondents feel a certain way doesn’t give me confidence in the statistic and I explain why in the next section.

They even added that ‘people are aware that scientific answers are not necessarily definitive’. So why not be transparent about the fallibility of statistics in your own report? If that’s not possible in the body of the report, then maybe put the more detailed, nuanced analysis in an appendix. Finally, how did they derive from the data the notion that awareness of science as a process of refining and adjusting knowledge might lead to distrust of science? It’s possible of course, but how do you make that leap based on the data you’re referencing? As for that footnote, where was the link to the report? Curious, I took a look at the report.

Ontario Science Centre and its statistics

For the curious, you can find the Ontario Science Centre Canadian Science Attitudes Research report (July 6, 2018) here, where you’ll see the methodology is a little light on detail. (The company doing this work, Leger, is a marketing research and analytics company.) Here’s the methodology, such as it is,


QUANTITATIVE RESEARCH INSTRUMENT

An online survey of 1501 Canadians was completed between June 18 and 26, 2018, using Leger’s online panel. The margin of error for this study was +/-2.5%, 19 times out of 20.

COMPARISON DATA

Where applicable, this year’s results have been compared back to a similar study done in 2017 (OSC Canadian Science Attitudes Research, August 2017). The use of arrows ( ) indicate significant changes between the two datasets.

ABOUT LEGER’S ONLINE PANEL

Leger’s online panel has approximately 450,000 members nationally and has a retention rate of 90%.

QUALITY CONTROL

Stringent quality assurance measures allow Leger to achieve the high-quality standards set by the company. As a result, its methods of data collection and storage outperform the norms set by WAPOR (The World Association for Public Opinion Research). These measures are applied at every stage of the project: from data collection to processing, through to analysis. We aim to answer our clients’ needs with honesty, total confidentiality, and integrity.

I didn’t find any more substantive description of the research methodology but there is a demographic breakdown at the end of the report and, because they’ve given you the number of people paneled (1501) at the beginning of the report, you can figure out just how many people fit into the various categories.
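Since the report itself doesn’t spell out what that margin of error means once you start slicing the sample, here is a minimal sketch of the arithmetic in Python; the 15% share assumed for the 65+ group is a hypothetical placeholder (the linked version of the report doesn’t give subgroup counts), not a figure from Leger,

import math

def margin_of_error(n, p=0.5, z=1.96):
    # Worst-case margin of error for a proportion estimated from a simple
    # random sample of n, at the 95% confidence level ("19 times out of 20").
    return z * math.sqrt(p * (1 - p) / n)

full_sample = 1501
print(f"Full sample (n={full_sample}): +/-{margin_of_error(full_sample):.1%}")
# roughly +/-2.5%, which matches the figure quoted in the methodology

# Hypothetical subgroup: suppose about 15% of respondents are 65+.
subgroup = round(0.15 * full_sample)  # about 225 people
print(f"65+ subgroup (n={subgroup}): +/-{margin_of_error(subgroup):.1%}")
# roughly +/-6.5%, so 'eight in ten' within any one subgroup is much fuzzier

# None of this addresses the bigger caveat: an opt-in online panel is not a
# true random sample, so the formula above is, at best, an approximation.

The point isn’t the exact numbers; it’s that the +/-2.5% headline figure applies only to the full sample, and any claim about a subgroup (such as the 65+ respondents) carries considerably more uncertainty, which is exactly the kind of hedging I’d like to see spelled out.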

Now, the questions: What is a Leger online panel? How can they confirm that the people who took the survey are Canadian? How did they put together this Leger panel, e.g., do people self-select? Why did they establish a category for people aged 65+? (Actually, in one section of the report, they make reference to 55+.) Read on for why that last one might be a good question.

I haven’t seen any surveys or polls originating in Canada that seem to recognize that a 65 year old is not an 80 year old. After all, they don’t lump in 20 year olds with 40 year olds because these people are at different stages in their lifespans, often with substantive differences in their outlook. Maybe there are no differences between 65 year olds and 80 year olds but we’ll never know because no one ever asks. When you add in Canada’s aging population, the tendency to lump all ‘old’ people together seems even more thoughtless.

These are just a few questions that spring to mind and most likely, the pollsters have a more substantive methodology. There may have been a perfectly good reason for not including more detail in the version of the report I’ve linked to. For example, a lot of people will stop reading a report that’s ‘bogged down’ in too much detail. Fair enough but why not give a link to a more substantive methodology or put it in an appendix?

One more thing: I couldn’t find any data in the Ontario Science Centre report that would support the Chief Science Advisor Office’s speculation about why people might not trust science. They did hedge, as they should, “… seem to be a positive result, it may also suggest why some people mistrust science …”, but the hedging follows directly after the “… eight in ten respondents …”, which structurally suggests that there is data supporting the speculation. If there is, it’s not in the pollster’s report.

Empire building?

I just wish the office of the Chief Science Advisor had created the foundation to support this,

Establishing a National Science Advisory System

Science and scientific information permeate the work of the federal government, creating a continuous need for decision-makers to negotiate uncertainty in interpreting often variable and generally incomplete scientific evidence for policy making. With science and technology increasingly a part of daily life, the need for science advice will only intensify.

As such, my office has been asked to assess and recommend ways to improve the existing science advisory function [emphasis mine] within the federal government. To that end, we examined the science advisory systems in other countries, such as Australia, New Zealand, the United Kingdom, and the United States, as well as the European Union. [emphasis mine]

A key feature of all the systems was a network of department-level science advisors. These are subject matter experts who work closely with senior departmental officials and support the mandate of the chief science advisor. They stand apart from day-to-day operations and provide a neutral sounding board for senior officials and decision-makers evaluating various streams of information. They also facilitate the incorporation of evidence in decision-making processes and act as a link between the department and external stakeholders.

I am pleased to report that our efforts to foster a Canadian science advisory network are already bearing fruit. Four organizations [emphasis mine] have so far moved to create a departmental science advisor position, and the first incumbents at the Canadian Space Agency and the National Research Council [emphasis mine] are now in place. I have every confidence that the network of departmental science advisors will play an important role in enhancing science advice and science activities planning, especially on cross-cutting issues.

Who asked the office to assess and recommend ways to improve the existing science advisory function? By the way, the European Union axed the position of Chief Science Adviser, after Anne Glover, the first such adviser in their history, completed the term of her appointment in 2014 (see James Wilsdon’s November 13, 2014 article for the Guardian). Perhaps they meant countries in the European Union?

Which four organizations have created departmental science advisor positions? Presumably the Canadian Space Agency and the National Research Council were two of the organizations?

If you’re looking at other countries’ science advisory systems, do you have some publicly available information, perhaps a report and analysis? Is there a set of best practices? And, how do these practices fit into the Canadian context?

More generally, how much is this going to cost? Are these departmental advisors expected to report to the Canadian federal government’s Chief Science Advisor, as well as, their own ministry deputy minister or higher? Will the Office of the Chief Science Advisor need more staff?

I could be persuaded that increasing the number of advisors and increasing costs to support these efforts are good ideas but it would be nice to see the Office of the Chief Science Advisor put some effort into persuasion rather than assuming that someone outside the tight nexus of federal government institutions based in Ottawa is willing to accept statements that aren’t detailed and/or supported by data.

Final thoughts

I’m torn. I’m supportive of the position of the Chief Science Advisor of Canada and happy to see a report. It looks like there’s some exciting work being done.

I also hold the Chief Advisor and the Office to a high standard. It’s understandable that they may have decided to jettison detail in favour of making the 2018 annual report more readable, but what I don’t understand is the failure to provide a more substantive report or information that supports and details the work. There’s a lack of transparency and, also, clarity. What do they mean by the word ‘science’? At a guess, they’re not including the social sciences.

On the whole, the report looks a little sloppy and that’s the last thing I expect from the Chief Science Advisor.

*** This is the original and not accurate version: “… 40 percent of facilities are more than 50 years old …” doesn’t really convey the depth and breadth of the problem. Let’s start with any easy question: 40 percent of what? Just how many of these computers are over 50 year old? By the way, the situation as the Chief Science Advisor’s Office describes it, gets a little worse, keep reading, AND THIS: Bravo to the Treasury Board for tracking down those data centres and more. AND THIS: Also, was this part of the Shared Services Canada counting exercise?

FrogHeart’s good-bye to 2017 and hello to 2018

This is going to be relatively short and sweet(ish). Starting with the 2017 review:

Nano blogosphere and the Canadian blogosphere

From my perspective there’s been a change taking place in the nano blogosphere over the last few years. There are fewer blogs along with fewer postings from those who still blog. Interestingly, some blogs are becoming more generalized. At the same time, Foresight Institute’s Nanodot blog (like FrogHeart) has expanded its range of topics to include artificial intelligence and other subjects. Andrew Maynard’s 2020 Science blog now exists in an archived form but, before its demise, it, too, had started to include other topics, notably risk in its many forms as opposed to risk and nanomaterials. Dexter Johnson’s blog, Nanoclast (on the IEEE [Institute of Electrical and Electronics Engineers] website), maintains its 3x weekly postings. Tim Harper, who often wrote about nanotechnology on his Cientifica blog, appears to have found a more freewheeling approach that is dominated by his Twitter feed, although he also seems (I can’t confirm that the latest posts were written in 2017) to blog here on timharper.net.

The Canadian science blogosphere seems to be getting quieter if Science Borealis (a blog aggregator) is a measure. My overall impression is that the bloggers have been a bit quieter this year with fewer postings on the feed, or perhaps that’s due to some technical issues (sometimes FrogHeart posts do not get onto the feed). On the promising side, Science Borealis teamed with the Science Writers and Communicators of Canada Association to run a contest, “2017 People’s Choice Awards: Canada’s Favourite Science Online!” There were two categories (Favourite Science Blog and Favourite Science Site) and you can find a list of the finalists with links to the winners here.

Big congratulations to the winners: Body of Evidence won Canada’s Favourite Blog 2017 (see the Dec. 6, 2017 article by Alina Fisher for Science Borealis), and Let’s Talk Science won the Canada’s Favourite Science Online 2017 category, as per this announcement.

However, I can’t help wondering: where were ASAP Science, Acapella Science, Quirks & Quarks, IFLS (I f***ing love science), and others on the list of finalists? I would have thought any of these would have a lock on a position as a finalist. These are Canadian online science purveyors and they are hugely popular, which should mean they’d have no problem getting nominated and getting votes. I can’t find the criteria for nominations (or any hint there will be a 2018 contest) so I imagine their absence from the 2017 finalists list will remain a mystery to me.

Looking forward to 2018, I think that the nano blogosphere will continue with its transformation into a more general science/technology-oriented community. To some extent, I believe this reflects the fact that nanotechnology is being absorbed into the larger science/technology effort as foundational (something wiser folks than me predicted some years ago).

As for Science Borealis and the Canadian science online effort, I’m going to interpret the quieter feeds as a sign of a maturing community. After all, there are always ups and downs in enthusiasm and participation and, as I noted earlier, the launch of an online contest is promising, as is the collaboration with the Science Writers and Communicators of Canada.

Canadian science policy

It was a big year.

Canada’s Chief Science Advisor

Canada’s first chief science advisor in many years was announced in 2017: Dr. Mona Nemer stepped into her position sometime in fall 2017. The official announcement was made on Sept. 26, 2017. I covered the event in my Sept. 26, 2017 posting, which includes a few more details than are found in the official announcement.

You’ll also find in that Sept. 26, 2017 posting a brief discourse on the Naylor report (also known as the Review of Fundamental Science) and some speculation on why, to my knowledge, there has been no action taken as a consequence.  The Naylor report was released April 10, 2017 and was covered here in a three-part review, published on June 8, 2017,

INVESTING IN CANADA’S FUTURE; Strengthening the Foundations of Canadian Research (Review of fundamental research final report): 1 of 3

INVESTING IN CANADA’S FUTURE; Strengthening the Foundations of Canadian Research (Review of fundamental research final report): 2 of 3

INVESTING IN CANADA’S FUTURE; Strengthening the Foundations of Canadian Research (Review of fundamental research final report): 3 of 3

I have found another commentary (much briefer than mine) by Paul Dufour on the Canadian Science Policy Centre website (November 9, 2017).

Subnational and regional science funding

This began in 2016 with a workshop mentioned in my November 10, 2016 posting, “Council of Canadian Academies and science policy for Alberta.” By the time the report was published, the endeavour had been transformed into Science Policy: Considerations for Subnational Governments (report here and my June 22, 2017 commentary here).

I don’t know what will come of this but I imagine scientists will be supportive as it means more money and they are always looking for more money. Still, the new government in British Columbia has only one ‘science entity’ and I’m not sure it’s still operational, but it was called the Premier’s Technology Council. To my knowledge, there is no ministry or other agency that is focused primarily or partially on science.

Meanwhile, a couple of representatives from the health sciences (neither of whom were involved in the production of the report) seem quite enthused about the prospects for provincial money in their (Bev Holmes, Interim CEO, Michael Smith Foundation for Health Research, British Columbia, and Patrick Odnokon (CEO, Saskatchewan Health Research Foundation) October 27, 2017 opinion piece for the Canadian Science Policy Centre.

Artificial intelligence and Canadians

An event which I find more interesting with time was the announcement of the Pan-Canadian Artificial Intelligence Strategy in the 2017 Canadian federal budget. Since then there has been a veritable gold rush mentality with regard to artificial intelligence in Canada. One announcement after the next about various corporations opening new offices in Toronto or Montréal has been made in the months since.

What has really piqued my interest recently is a report being written for Canada’s Treasury Board by Michael Karlin (you can learn more from his Twitter feed, although you may need to scroll down past some of his more personal tweets, e.g., something about cassoulet in the Dec. 29, 2017 tweets). As for Karlin’s report, which is a work in progress, you can find out more about the report and Karlin in a December 12, 2017 article by Rob Hunt for the Algorithmic Media Observatory (sponsored by the Social Sciences and Humanities Research Council of Canada [SSHRC], the Centre for the Study of Democratic Citizenship, and the Fonds de recherche du Québec: Société et culture).

You can ring in 2018 by reading and making comments, which could influence the final version, on Karlin’s “Responsible Artificial Intelligence in the Government of Canada,” part of the government’s Digital Disruption White Paper Series.

As for other 2018 news, the Council of Canadian Academies is expected to publish “The State of Science and Technology and Industrial Research and Development in Canada” at some point soon (we hope). This report follows and incorporates two previous ‘states’, The State of Science and Technology in Canada, 2012 (the first of these was a 2006 report) and the 2013 version of The State of Industrial R&D in Canada. There is already some preliminary data for this latest ‘state of’  (you can find a link and commentary in my December 15, 2016 posting).

FrogHeart then (2017) and soon (2018)

On looking back, I see that the year started out at quite a clip as I was attempting to hit the 5000th blog posting mark, which I did on March 3, 2017. I have cut back somewhat from the high of 3 postings/day to approximately 1 posting/day. It makes things more manageable, allowing me to focus on other matters.

By the way, you may note that the ‘Donate’ button has disappeared from my sidebar. I thank everyone who donated from the bottom of my heart. The money was more than currency; it also symbolized encouragement. On the sad side, I moved from one hosting service to a new one (Sibername) late in December 2016 and have been experiencing serious bandwidth issues, which result in FrogHeart’s disappearance from the web for days at a time. I am trying to resolve the issues and hope that such actions as removing the ‘Donate’ button will help.

I wish my readers all the best for 2018 as we explore nanotechnology and other emerging technologies!

(I apologize for any and all errors. I usually take a little more time to write this end-of-year and coming-year piece but due to bandwidth issues I was unable to access my draft and give it at least one review. And at this point, I’m too tired to try spotting errors. If you see any, please do let me know.)