Tag Archives: Barack Obama

China, US, and the race for artificial intelligence research domination

John Markoff and Matthew Rosenberg have written a fascinating analysis of the competition between the US and China regarding technological advances, specifically in the field of artificial intelligence. While the focus of the Feb. 3, 2017 NY Times article is military, the authors make it easy to extrapolate and apply the concepts to other sectors,

Robert O. Work, the veteran defense official retained as deputy secretary by President Trump, calls them his “A.I. dudes.” The breezy moniker belies their serious task: The dudes have been a kitchen cabinet of sorts, and have advised Mr. Work as he has sought to reshape warfare by bringing artificial intelligence to the battlefield.

Last spring, he asked, “O.K., you guys are the smartest guys in A.I., right?”

No, the dudes told him, “the smartest guys are at Facebook and Google,” Mr. Work recalled in an interview.

Now, increasingly, they’re also in China. The United States no longer has a strategic monopoly on the technology, which is widely seen as the key factor in the next generation of warfare.

The Pentagon’s plan to bring A.I. to the military is taking shape as Chinese researchers assert themselves in the nascent technology field. And that shift is reflected in surprising commercial advances in artificial intelligence among Chinese companies. [emphasis mine]

Having read Marshall McLuhan (de rigueur for any Canadian pursuing a degree in communications [sociology-based] anytime from the 1960s into the late 1980s [at least]), I took the movement of technology from military research to consumer applications as the standard path. Television is a classic example but there are many others, including modern plastic surgery. The first time I encountered the reverse (consumer-based technology being adopted by the military) was in the 2004 exhibition “Massive Change: The Future of Global Design,” produced by Bruce Mau for the Vancouver (Canada) Art Gallery.

Markoff and Rosenberg develop their thesis further (Note: Links have been removed),

Last year, for example, Microsoft researchers proclaimed that the company had created software capable of matching human skills in understanding speech.

Although they boasted that they had outperformed their United States competitors, a well-known A.I. researcher who leads a Silicon Valley laboratory for the Chinese web services company Baidu gently taunted Microsoft, noting that Baidu had achieved similar accuracy with the Chinese language two years earlier.

That, in a nutshell, is the challenge the United States faces as it embarks on a new military strategy founded on the assumption of its continued superiority in technologies such as robotics and artificial intelligence.

First announced last year by Ashton B. Carter, President Barack Obama’s defense secretary, the “Third Offset” strategy provides a formula for maintaining a military advantage in the face of a renewed rivalry with China and Russia.

As consumer electronics manufacturing has moved to Asia, both Chinese companies and the nation’s government laboratories are making major investments in artificial intelligence.

The advance of the Chinese was underscored last month when Qi Lu, a veteran Microsoft artificial intelligence specialist, left the company to become chief operating officer at Baidu, where he will oversee the company’s ambitious plan to become a global leader in A.I.

The authors note some recent military moves (Note: Links have been removed),

In August [2016], the state-run China Daily reported that the country had embarked on the development of a cruise missile system with a “high level” of artificial intelligence. The new system appears to be a response to a missile the United States Navy is expected to deploy in 2018 to counter growing Chinese military influence in the Pacific.

Known as the Long Range Anti-Ship Missile, or L.R.A.S.M., it is described as a “semiautonomous” weapon. According to the Pentagon, this means that though targets are chosen by human soldiers, the missile uses artificial intelligence technology to avoid defenses and make final targeting decisions.

The new Chinese weapon typifies a strategy known as “remote warfare,” said John Arquilla, a military strategist at the Naval Postgraduate School in Monterey, Calif. The idea is to build large fleets of small ships that deploy missiles, to attack an enemy with larger ships, like aircraft carriers.

“They are making their machines more creative,” he said. “A little bit of automation gives the machines a tremendous boost.”

Whether or not the Chinese will quickly catch the United States in artificial intelligence and robotics technologies is a matter of intense discussion and disagreement in the United States.

Markoff and Rosenberg return to the world of consumer electronics as they finish their article on AI and the military (Note: Links have been removed),

Moreover, while there appear to be relatively cozy relationships between the Chinese government and commercial technology efforts, the same cannot be said about the United States. The Pentagon recently restarted its beachhead in Silicon Valley, known as the Defense Innovation Unit Experimental facility, or DIUx. It is an attempt to rethink bureaucratic United States government contracting practices in terms of the faster and more fluid style of Silicon Valley.

The government has not yet undone the damage to its relationship with the Valley brought about by Edward J. Snowden’s revelations about the National Security Agency’s surveillance practices. Many Silicon Valley firms remain hesitant to be seen as working too closely with the Pentagon out of fear of losing access to China’s market.

“There are smaller companies, the companies who sort of decided that they’re going to be in the defense business, like a Palantir,” said Peter W. Singer, an expert in the future of war at New America, a think tank in Washington, referring to the Palo Alto, Calif., start-up founded in part by the venture capitalist Peter Thiel. “But if you’re thinking about the big, iconic tech companies, they can’t become defense contractors and still expect to get access to the Chinese market.”

Those concerns are real for Silicon Valley.

If you have the time, I recommend reading the article in its entirety.

Impact of the US regime on thinking about AI?

A March 24, 2017 article by Daniel Gross for Slate.com hints that at least one high-level official in the Trump administration may be a little naïve in his understanding of AI and its impending impact on US society (Note: Links have been removed),

Treasury Secretary Steven Mnuchin is a sharp guy. He’s a (legacy) alumnus of Yale and Goldman Sachs, did well on Wall Street, and was a successful movie producer and bank investor. He’s good at, and willing to, put other people’s money at risk alongside some of his own. While he isn’t the least qualified person to hold the post of treasury secretary in 2017, he’s far from the best qualified. For in his 54 years on this planet, he hasn’t expressed or displayed much interest in economic policy, or in grappling with the big picture macroeconomic issues that are affecting our world. It’s not that he is intellectually incapable of grasping them; they just haven’t been in his orbit.

Which accounts for the inanity he uttered at an Axios breakfast Friday morning about the impact of artificial intelligence on jobs.

“it’s not even on our radar screen…. 50-100 more years” away, he said. “I’m not worried at all” about robots displacing humans in the near future, he said, adding: “In fact I’m optimistic.”

A.I. is already affecting the way people work, and the work they do. (In fact, I’ve long suspected that Mike Allen, Mnuchin’s Axios interlocutor, is powered by A.I.) I doubt Mnuchin has spent much time in factories, for example. But if he did, he’d see that machines and software are increasingly doing the work that people used to do. They’re not just moving goods through an assembly line, they’re soldering, coating, packaging, and checking for quality. Whether you’re visiting a GE turbine plant in South Carolina, or a cable-modem factory in Shanghai, the thing you’ll notice is just how few people there actually are. It’s why, in the U.S., manufacturing output rises every year while manufacturing employment is essentially stagnant. It’s why it is becoming conventional wisdom that automation is destroying more manufacturing jobs than trade. And now dark factories, which can run without lights because there are no people in them, are starting to become a reality. The integration of A.I. into factories is one of the reasons Trump’s promise to bring back manufacturing employment is absurd. You’d think his treasury secretary would know something about that.

It goes far beyond manufacturing, of course. Programmatic advertising buying, Spotify’s recommendation engines, chatbots on customer service websites, Uber’s dispatching system—all of these are examples of A.I. doing the work that people used to do. …

Further undermining Mnuchin’s credibility on the topic of jobs and robots/AI, Matthew Rozsa’s March 28, 2017 article for Salon.com features a study from the US National Bureau of Economic Research (Note: Links have been removed),

A new study by the National Bureau of Economic Research shows that every fully autonomous robot added to an American factory has reduced employment by an average of 6.2 workers, according to a report by BuzzFeed. The study also found that for every fully autonomous robot per thousand workers, the employment rate dropped by 0.18 to 0.34 percentage points and wages fell by 0.25 to 0.5 percentage points.
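
To get a rough sense of what those percentages mean on the ground, here’s a back-of-the-envelope calculation. The labour market in the sketch below is entirely hypothetical (the population and employment figures are my own illustrative assumptions, not numbers from the study):

```python
# Back-of-the-envelope illustration of the NBER figures quoted above.
# Assumptions (illustrative only, not from the study): a local labour market
# of 1,000,000 working-age people, 600,000 of them employed.
working_age_population = 1_000_000
employed_workers = 600_000

# "One more robot per thousand workers" in this hypothetical market:
robots_added = employed_workers / 1_000                    # 600 robots

# Quoted range: the employment-to-population ratio drops 0.18-0.34 percentage points.
jobs_lost_low = working_age_population * 0.18 / 100        # 1,800 jobs
jobs_lost_high = working_age_population * 0.34 / 100       # 3,400 jobs

print(f"{robots_added:.0f} robots -> roughly {jobs_lost_low:,.0f} "
      f"to {jobs_lost_high:,.0f} fewer jobs in this one region")
```

Even under these made-up assumptions, a few hundred robots translate into thousands of lost jobs in a single region, which makes Mnuchin’s ‘50-100 more years’ time frame hard to credit.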

I can’t help wondering: if the US Secretary of the Treasury is this oblivious to what is going on in the workplace, is that representative of other top-tier officials such as the Secretary of Defense, the Secretary of Labor, etc.? What is going to happen to US research in fields such as robotics and AI?

I have two more questions: in future, what happens to research which contradicts a top-tier Trump government official or makes one look foolish? Will it be suppressed?

You can find the report, “Robots and Jobs: Evidence from US Labor Markets” by Daron Acemoglu and Pascual Restrepo (NBER [US National Bureau of Economic Research] Working Paper Series, Working Paper 23285, released March 2017), here. The introduction featured some new information for me: the term ‘technological unemployment’ was introduced in 1930 by John Maynard Keynes.

Moving from a wholly US-centric view of AI

Naturally, in a discussion about AI, it’s all about the US and the country considered its chief science rival, China, with a mention of its old rival, Russia. Europe did rate a mention, albeit as a totality. Having recently found out that Canadians were pioneers in a very important aspect of AI, machine learning, I feel obliged to mention it. You can find more about Canadian AI efforts in my March 24, 2017 posting (scroll down about 40% of the way) where you’ll find a very brief history and mention of the funding for the newly launched Pan-Canadian Artificial Intelligence Strategy.

If any of my readers have information about AI research efforts in other parts of the world, please feel free to write them up in the comments.

Changes to the US 21st Century Nanotechnology Research and Development Act

This is one of Barack Obama’s last acts as President of the US according to a Jan. 5, 2017 posting by Lynn L. Bergeson on the Nanotechnology Now website,

The American Innovation and Competitiveness Act (S. 3084) would amend the 21st Century Nanotechnology Research and Development Act (15 U.S.C. § 7501 et seq.) to change the frequency of National Nanotechnology Initiative (NNI) reports. The strategic plan would be released every five instead of every three years, and the triennial review would be renamed the quadrennial review and be prepared every four years instead of every three. The evaluation of the NNI, which is submitted to Congress, would be due every four instead of every three years. … On December 28, 2016, the bill was presented to President Obama. President Obama is expected to sign the bill.

Congress.gov is hosting the S.3084 – American Innovation and Competitiveness Act webpage listing all of the actions, to date, taken on behalf of this bill; Obama signed the act on Jan. 6, 2017.

One final note: Obama’s last day as US President is Friday, Jan. 20, 2017, but his last ‘full’ day is Thursday, Jan. 19, 2017 (according to a Nov. 4, 2016 posting by Tom Muse for About.com).

Dear Science Minister Kirsty Duncan and Science, Innovation and Economic Development Minister Navdeep Bains: a Happy Canada Day! open letter

Dear Minister of Science Kirsty Duncan and Minister of Science, Innovation and Economic Development Navdeep Bains,

Thank you both. It’s been heartening to note some of the moves you’ve made since entering office. Taking the muzzles off Environment Canada and Natural Resources Canada scientists was a big relief, and it was wonderful to hear that the mandatory long-form census was reinstated along with the Experimental Lakes Area programme. (Btw, I can’t be the only one who’s looking forward to hearing the news once Canada’s Chief Science Officer is appointed. In the fall, eh?)

Changing the name of National Science and Technology Week to “Science Odyssey” and rescheduling it from the fall to the spring seems to have revitalized the effort. Then, there was the news about a review focused on fundamental science (see my June 16, 2016 post). It seems as if the floodgates have opened or, at least, communication about what’s going on has become much freer. Brava and Bravo!

The recently announced (June 29, 2016) third assessment on the State of S&T (Science and Technology) and IR&D (Industrial Research and Development; my July 1, 2016 post features the announcement) by the Council of Canadian Academies adds to the impression that you both have adopted a dizzying pace for science of all kinds in Canada.

With the initiatives I’ve just mentioned in mind, it would seem that encouraging a more vital science culture and re-establishing science as a fundamental part of Canadian society is your aim.

Science education and outreach as a whole population effort

It’s facey to ask for more but that’s what I’m going to do.

In general, the science education and outreach efforts in Canada have focused on children. This is wonderful but not likely to be as successful as we would hope when a significant and influential chunk of the population is largely ignored: adults. (There is a specific situation where outreach to adults is undertaken but more about that later.)

There is research suggesting that children’s attitudes to science and future careers are strongly influenced by their family. From my Oct. 9, 2013 posting,

One of the research efforts in the UK is the ASPIRES research project at King’s College London (KCL), which is examining children’s attitudes to science and future careers. Their latest report, Ten Science Facts and Fictions: the case for early education about STEM careers (PDF), is profiled in a Jan. 11, 2012 news item on physorg.com (from the news item),

Professor Archer [Louise Archer, Professor of Sociology of Education at King’s] said: “Children and their parents hold quite complex views of science and scientists and at age 10 or 11 these views are largely positive. The vast majority of children at this age enjoy science at school, have parents who are supportive of them studying science and even undertake science-related activities in their spare time. They associate scientists with important work, such as finding medical cures, and with work that is well paid.

“Nevertheless, less than 17 per cent aspire to a career in science. These positive impressions seem to lead to the perception that science offers only a very limited range of careers, for example doctor, scientist or science teacher. It appears that this positive stereotype is also problematic in that it can lead people to view science as out of reach for many, only for exceptional or clever people, and ‘not for me’. [emphases mine]

Family as a bigger concept

I suggest that ‘family’ be expanded to include the social environment in which children operate. When I was a kid no one in our family or extended group of friends had been to university let alone become a scientist. My parents had aspirations for me but when it came down to brass tacks, even though I was encouraged to go to university, they were much happier when I dropped out and got a job.

It’s very hard to break out of the mold. The odd thing about it all? I had two uncles who were electricians, which, when you think about it, means they were working in STEM (science, technology, engineering, mathematics) jobs. Electricians, then and now, despite their technical skills, are considered tradespeople.

It seems to me that if more people saw themselves as having STEM or STEM-influenced occupations (hairdressers, artists, auto mechanics, plumbers, electricians, musicians, etc.), we might find more children willing to engage directly in STEM opportunities. We might also find there’s more public support for science in all its guises.

That situation where adults are targeted for science outreach? It’s when the science is considered controversial or problematic and, suddenly, public (actually they mean voter) engagement or outreach is considered vital.

Suggestion

Given the initiatives you both have undertaken and Prime Minister Trudeau’s recent public outbreak of enthusiasm for and interest in quantum computing (my April 18, 2016 posting), I’m hopeful that you will consider the notion and encourage (fund?) science promotion programmes aimed at adults. Preferably attention-grabbing and imaginative programmes.

Should you want to discuss the matter further (I have some suggestions), please feel free to contact me.

Regardless, I’m very happy to see the initiatives that have been undertaken and, just as importantly, the communication about science.

Yours sincerely,

Maryse de la Giroday
(FrogHeart blog)

P.S. I very much enjoyed the June 22, 2016 interview Minister Duncan gave to Léo Charbonneau for University Affairs,

UA: Looking ahead, where would you like Canada to be in terms of research in five to 10 years?

Dr. Duncan: Well, I’ll tell you, it breaks my heart that in a 10-year period we fell from third to eighth place among OECD countries in terms of HERD [government expenditures on higher education research and development as a percentage of gross domestic product]. That should never have happened. That’s why it was so important for me to get that big investment in the granting councils.

Do we have a strong vision for science? Do we have the support of the research community? Do we have the funding systems that allow our world-class researchers to do the work they want to do? And, with the chief science officer, are we building a system where we have the evidence to inform decision-making? My job is to support research and to make sure evidence makes its way to the cabinet table.

As stated earlier, I’m hoping you will expand your vision to include all of Canadian society, not forgetting seniors (being retired or older doesn’t mean that you’re senile and/or incapable of public participation), and to support Canada’s emerging science media environment.

P.P.S. As a longstanding observer of the interplay between pop culture, science, and society I was much amused and inspired by news of Justin Trudeau’s emergence as a character in a Marvel comic book (from a June 28, 2016 CBC [Canadian Broadcasting Corporation] news online item),


The variant cover of the comic Civil War II: Choosing Sides #5, featuring Prime Minister Justin Trudeau surrounded by the members of Alpha Flight: Sasquatch, top, Puck, bottom left, Aurora, right, and Iron Man in the background. (The Canadian Press/Ramon Perez)

Make way, Liberal cabinet: Prime Minister Justin Trudeau will have another all-Canadian crew in his corner as he suits up for his latest feature role — comic book character.

Trudeau will grace the variant cover of issue No. 5 of Marvel’s “Civil War II: Choosing Sides,” due out Aug. 31 [2016].

Trudeau is depicted smiling, sitting relaxed in the boxing ring sporting a Maple Leaf-emblazoned tank, black shorts and red boxing gloves. Standing behind him are Puck, Sasquatch and Aurora, who are members of Canadian superhero squad Alpha Flight. In the left corner, Iron Man is seen with his arms crossed.

“I didn’t want to do a stuffy cover — just like a suit and tie — put his likeness on the cover and call it a day,” said award-winning Toronto-based cartoonist Ramon Perez.

“I wanted to kind of evoke a little bit of what’s different about him than other people in power right now. You don’t see (U.S. President Barack) Obama strutting around in boxing gear, doing push-ups in commercials or whatnot. Just throwing him in his gear and making him almost like an everyday person was kind of fun.”

The variant cover featuring Trudeau will be an alternative to the main cover in circulation showcasing Aurora, Puck, Sasquatch and Nick Fury.

It’s not the first time a Canadian Prime Minister has been featured in a Marvel comic book (from the CBC news item),


Prime Minister Pierre Trudeau in 1979’s Volume 120 of The Uncanny X-Men. (The Canadian Press/Marvel)

Trudeau follows in the prime ministerial footsteps of his late father, Pierre, who graced the pages of “Uncanny X-Men” in 1979.

The news item goes on to describe Edmonton-born artist/writer Chip Zdarsky’s ideas for the 2016 story.

h/t to Reva Seth’s June 29, 2016 article for Fast Company for pointing me to Justin Trudeau’s comic book cover.

Canadian science petition and a science diplomacy event in Ottawa on June 21, 2016*

The Canadian science policy and science funding scene is hopping these days. Canada’s Minister of Science, Kirsty Duncan, announced a new review of federal funding for fundamental science on Monday, June 13, 2016 (see my June 15, 2016 post for more details and a brief critique of the panel) and now, there’s a new Parliamentary campaign for a science advisor and a Canadian Science Policy Centre event on science diplomacy.

Petition for a science advisor

Kennedy Stewart, Canadian Member of Parliament (Burnaby South) and NDP (New Democratic Party) Science Critic, has launched a campaign for independent science advice for the government. Here’s more from a June 15, 2016 announcement (received via email),

After years of muzzling and misuse of science by the Conservatives, our scientists need lasting protections in order to finally turn the page on the lost Harper decade.

I am writing to ask your support for a new campaign calling for an independent science advisor.

While I applaud the new Liberal government for their recent promises to support science, we have a long way to go to rebuild Canada’s reputation as a global knowledge leader. As NDP Science Critic, I continue to push for renewed research funding and measures to restore scientific integrity.

Canada badly needs a new science advisor to act as a public champion for research and evidence in Ottawa. Although the Trudeau government has committed to creating a Chief Science Officer, the Minister of Science – Dr. Kirsty Duncan – has yet to state whether or not the new officer will be given real independence and a mandate protected by law.

Today, we’re launching a new parliamentary petition calling for just that: https://petitions.parl.gc.ca/en/Petition/Sign/e-415

Can you add your name right now?

Canada’s last national science advisor lacked independence from the government and was easily eliminated in 2008 after the anti-science Harper Conservatives took power.

That’s why the Minister needs to build the new CSO to last and entrench the position in legislation. Rhetoric and half-measures aren’t good enough.

Please add your voice for public science by signing our petition to the Minister of Science.

Thank you for your support,

Breakfast session on science diplomacy

On June 21, 2016 the Canadian Science Policy Centre is presenting a breakfast session on Parliament Hill in Ottawa (from an announcement received via email),

“Science Diplomacy in the 21st Century: The Potential for Tomorrow”
Remarks by Dr. Vaughan Turekian,
Science and Technology Adviser to Secretary of State John Kerry

Event Information
Tuesday, June 21, 2016, Room 238-S, Parliament Hill
7:30am – 8:00am – Continental Breakfast
8:00am – 8:10am – Opening Remarks, MP Terry Beech
8:10am – 8:45am – Dr. Vaughan Turekian Remarks and Q&A

Dr. Turekian’s visit comes during a pivotal time as Canada is undergoing fundamental changes in numerous policy directions surrounding international affairs. With Canada’s comeback on the world stage, there is great potential for science to play an integral role in the conduct of our foreign affairs. The United States is currently one of the leaders in science diplomacy, and as such, listening to Dr. Turekian will provide a unique perspective on the best practices of science diplomacy in the US.

Actually, Dr. Turekian’s visit comes just before a North American Summit being held in Ottawa on June 29, 2016, one which has already taken a scientific turn. From a June 16, 2016 news item on phys.org,

Some 200 intellectuals, scientists and artists from around the world urged the leaders of Mexico, the United States and Canada on Wednesday to save North America’s endangered migratory Monarch butterfly.

US novelist Paul Auster, environmental activist Robert F. Kennedy Jr., Canadian poet [Canadian media usually describe her as a writer] Margaret Atwood, British writer Ali Smith and India’s women’s and children’s minister Maneka Sanjay Gandhi were among the signatories of an open letter to the three leaders.

US President Barack Obama, Canadian Prime Minister Justin Trudeau and Mexican President Enrique Pena Nieto will hold a North American summit in Ottawa on June 29 [2016].

The letter by the so-called Group of 100 calls on the three leaders to “take swift and energetic actions to preserve the Monarch’s migratory phenomenon” when they meet this month.

In 1996-1997, the butterflies covered 18.2 hectares (45 acres) of land in Mexico’s central mountains.

It fell to 0.67 hectares in 2013-2014 but rose to 4 hectares this year. Their population is measured by the territory they cover.

They usually arrive in Mexico between late October and early November and head back north in March.

Given this turn of events, I wonder how Turekian, who has held his current position for less than a year, might (or might not) approach the question of Monarch butterflies and diplomacy.

I did a little research about Turekian and found this Sept. 10, 2015 news release announcing his appointment as the Science and Technology Adviser to the US Secretary of State,

On September 8, Dr. Vaughan Turekian, formerly the Chief International Officer at The American Association for the Advancement of Science (AAAS), was named the 5th Science and Technology Adviser to the Secretary of State. In this capacity, Dr. Turekian will advise the Secretary of State and the Under Secretary for Economic Growth, Energy, and the Environment on international environment, science, technology, and health matters affecting the foreign policy of the United States. Dr. Turekian will draw upon his background in atmospheric chemistry and extensive policy experience to promote science, technology, and engineering as integral components of U.S. diplomacy.

Dr. Turekian brings both technical expertise and 14 years of policy experience to the position. As former Chief International Officer for The American Association for the Advancement of Science (AAAS) and Director of AAAS’s Center for Science Diplomacy, Dr. Turekian worked to build bridges between nations based on shared scientific goals, placing special emphasis on regions where traditional political relationships are strained or do not exist. As Editor-in-Chief of Science & Diplomacy, an online quarterly publication, Dr. Turekian published original policy pieces that have served to inform international science policy recommendations. Prior to his work at AAAS, Turekian worked at the State Department as Special Assistant and Adviser to the Under Secretary for Global Affairs on issues related to sustainable development, climate change, environment, energy, science, technology, and health and as a Program Director for the Committee on Global Change Research at the National Academy of Sciences where he was study director for a White House report on climate change science.

Turekian’s last editorial for Science & Diplomacy, dated June 30, 2015, features a title (Evolving Institutions for Twenty-First Century [Science] Diplomacy) bearing a resemblance to the title of his talk in Ottawa and perhaps provides a preview (spoilers),

Over the recent decade, its treatment of science and technology issues has increased substantially, with a number of cover stories focused on topics that bridge science, technology, and foreign affairs. This thought leadership reflects a broader shift in thinking within institutions throughout the world about the importance of better integrating the communities of science and diplomacy in novel ways.

In May, a high-level committee convened by Japan’s minister of foreign affairs released fifteen recommendations for how Japan could better incorporate its scientific and technological expertise into its foreign policy. While many of the recommendations were to be predicted, including the establishment of the position of science adviser to the foreign minister, the breadth of the recommendations highlighted numerous new ways Japan could leverage science to meet its foreign policy objectives. The report itself marks a turning point for an institution looking to upgrade its ability to meet and shape the challenges of this still young century.

On the other side of the Pacific, the U.S. National Academy of Sciences released its own assessment of science in the U.S. Department of State. Their report, “Diplomacy for the 21st Century: Embedding a Culture of Science and Technology Throughout the Department of State,” builds on its landmark 1999 report, which, among other things, established the position of science and technology adviser to the secretary of state. The twenty-seven recommendations in the new report are wide ranging, but as a whole speak to the fact that while one of the oldest U.S. institutions of government has made much progress toward becoming more scientifically and technologically literate, there are many more steps that could be taken to leverage science and technology as a key element of U.S. foreign policy.

These two recent reports highlight the importance of foreign ministries as vital instruments of science diplomacy. These agencies of foreign affairs, like their counterparts around the world, are often viewed as conservative and somewhat inflexible institutions focused on stability rather than transformation. However, they are adjusting to a world in which developments in science and technology move rapidly and affect relationships and interactions at bilateral, regional, and global scales.

At the same time that some traditional national instruments of diplomacy are evolving to better incorporate science, international science institutions are also evolving to meet the diplomatic and foreign policy drivers of this more technical century. …

It’s an interesting read and I’m glad to see the mention of Japan in his article. I’d like to see Canadian science advice and policy initiatives take more notice of the rest of the world rather than focusing almost solely on what’s happening in the US and Great Britain (see my June 15, 2016 post for an example of what I mean). On another note, it was disconcerting to find out that Turekian believes that we are only now moving past the Cold War politics of the past.

Unfortunately for anyone wanting to attend the talk, ticket sales have ended even though they were supposed to be open until June 17, 2016. And, there doesn’t seem to be a wait list.

You may want to try arriving at the door and hoping that people have cancelled or failed to arrive, thereby freeing up a ticket. Should you be an MP (Member of Parliament), Senator, or guest of the Canadian Science Policy Conference, you get a free ticket. Should you be anyone else, expect to pay $15, assuming no one is attempting to scalp (sell one for more than it cost) these tickets.

*’ … on June’ in headline changed to ‘ … on June 21, 2016’ on June 17, 2016.

Prime Minister Trudeau, the quantum physicist

Prime Minister Justin Trudeau’s apparently extemporaneous response to a joking (non)question about quantum computing by a journalist during an April 15, 2016 press conference at the Perimeter Institute for Theoretical Physics in Waterloo, Ontario, Canada has created a buzz online, made international news, and caused Canadians to sit taller.

For anyone who missed the moment, here’s a video clip from the Canadian Broadcasting Corporation (CBC),

Aaron Hutchins in an April 15, 2016 article for Maclean’s magazine digs deeper to find out more about Trudeau and quantum physics (Note: A link has been removed),

Raymond Laflamme knows the drill when politicians visit the Perimeter Institute. A photo op here, a few handshakes there and a tour with “really basic, basic, basic facts” about the field of quantum mechanics.

But when the self-described “geek” Justin Trudeau showed up for a funding announcement on Friday [April 15, 2016], the co-founder and director of the Institute for Quantum Computing at the University of Waterloo wasn’t met with simple nods of the Prime Minister pretending to understand. Trudeau immediately started talking about things being waves and particles at the same time, like cats being dead and alive at the same time. It wasn’t just nonsense—Trudeau was referencing the famous thought experiment of the late legendary physicist Erwin Schrödinger.

“I don’t know where he learned all that stuff, but we were all surprised,” Laflamme says. Soon afterwards, as Trudeau met with one student talking about superconductivity, the Prime Minister asked her, “Why don’t we have high-temperature superconducting systems?” something Laflamme describes as the institute’s “Holy Grail” quest.

“I was flabbergasted,” Laflamme says. “I don’t know how he does in other subjects, but in quantum physics, he knows the basic pieces and the important questions.”

Strangely, Laflamme was not nearly as excited (tongue in cheek) when I demonstrated my understanding of quantum physics during our interview (see my May 11, 2015 posting; scroll down about 40% of the way to the Raymond Laflamme subhead).

As Jon Butterworth comments in his April 16, 2016 posting on the Guardian science blog, the response says something about our expectations regarding politicians,

This seems to have enhanced Trudeau’s reputation no end, and quite right too. But it is worth thinking a bit about why.

The explanation he gives is clear, brief, and understandable to a non-specialist. It is the kind of thing any sufficiently engaged politician could pick up from a decent briefing, given expert help. …

Butterworth also goes on to mention journalists’ expectations,

The reporter asked the question in a joking fashion, not unkindly as far as I can tell, but not expecting an answer either. If this had been an announcement about almost any other government investment, wouldn’t the reporter have expected a brief explanation of the basic ideas behind it? …

As for the announcement being made by Trudeau, there is this April 15, 2016 Perimeter Institute press release (Note: Links have been removed),

Prime Minister Justin Trudeau says the work being done at Perimeter and in Canada’s “Quantum Valley” [emphasis mine] is vital to the future of the country and the world.

Prime Minister Justin Trudeau became both teacher and student when he visited Perimeter Institute today to officially announce the federal government’s commitment to support fundamental scientific research at Perimeter.

Joined by Minister of Science Kirsty Duncan and Small Business and Tourism Minister Bardish Chagger, the self-described “geek prime minister” listened intensely as he received brief overviews of Perimeter research in areas spanning from quantum science to condensed matter physics and cosmology.

“You don’t have to be a geek like me to appreciate how important this work is,” he then told a packed audience of scientists, students, and community leaders in Perimeter’s atrium.

The Prime Minister was also welcomed by 200 teenagers attending the Institute’s annual Inspiring Future Women in Science conference, and via video greetings from cosmologist Stephen Hawking [he was Laflamme’s PhD supervisor], who is a Perimeter Distinguished Visiting Research Chair. The Prime Minister said he was “incredibly overwhelmed” by Hawking’s message.

“Canada is a wonderful, huge country, full of people with big hearts and forward-looking minds,” Hawking said in his message. “It’s an ideal place for an institute dedicated to the frontiers of physics. In supporting Perimeter, Canada sets an example for the world.”

The visit reiterated the Government of Canada’s pledge of $50 million over five years announced in last month’s [March 2016] budget [emphasis mine] to support Perimeter research, training, and outreach.

It was the Prime Minister’s second trip to the Region of Waterloo this year. In January [2016], he toured the region’s tech sector and universities, and praised the area’s innovation ecosystem.

This time, the focus was on the first link of the innovation chain: fundamental science that could unlock important discoveries, advance human understanding, and underpin the groundbreaking technologies of tomorrow.

As for the “quantum valley” in Ontario, I think there might be some competition here in British Columbia with D-Wave Systems (maker of the first commercially available quantum computer, of a sort; my Dec. 16, 2015 post is the most recent one featuring the company) and the University of British Columbia’s Stewart Blusson Quantum Matter Institute.

Getting back to Trudeau, it’s exciting to have someone who seems so interested in at least some aspects of science that he can talk about it with a degree of understanding. I knew he had an interest in literature but there is also this (from his Wikipedia entry; Note: Links have been removed),

Trudeau has a bachelor of arts degree in literature from McGill University and a bachelor of education degree from the University of British Columbia…. After graduation, he stayed in Vancouver and he found substitute work at several local schools and permanent work as a French and math teacher at the private West Point Grey Academy … . From 2002 to 2004, he studied engineering at the École Polytechnique de Montréal, a part of the Université de Montréal.[67] He also started a master’s degree in environmental geography at McGill University, before suspending his program to seek public office.[68] [emphases mine]

Trudeau is not the only political leader to have a strong interest in science. In our neighbour to the south, there’s President Barack Obama, who has done much to promote science since he was elected in 2008. David Bruggeman, in an April 15, 2016 post (Obama hosts DNews segments for Science Channel week of April 11-15, 2016) and an April 17, 2016 post (Obama hosts White House Science Fair), describes two of Obama’s most recent efforts.

ETA April 19, 2016: I’ve found confirmation that this Q&A was somewhat staged as I hinted in the opening with “Prime Minister Justin Trudeau’s apparently extemporaneous response … .” Will Oremus’s April 19, 2016 article for Slate.com breaks the whole news cycle down and points out (Note: A link has been removed),

Over the weekend, even as latecomers continued to dine on the story’s rapidly decaying scraps, a somewhat different picture began to emerge. A Canadian blogger pointed out that Trudeau himself had suggested to reporters at the event that they lob him a question about quantum computing so that he could knock it out of the park with the newfound knowledge he had gleaned on his tour.

The Canadian blogger who tracked this down is J. J. McCullough (Jim McCullough) and you can read his Oct. 16, 2016 posting on the affair here. McCullough has a rather harsh view of the media response to Trudeau’s lecture. Oremus is a bit more measured,

… Monday brought the countertake parade—smaller and less pompous, if no less righteous—led by Gawker with the headline, “Justin Trudeau’s Quantum Computing Explanation Was Likely Staged for Publicity.”

But few of us in the media today are immune to the forces that incentivize timeliness and catchiness over subtlety, and even Gawker’s valuable corrective ended up meriting a corrective of its own. Author J.K. Trotter soon updated his post with comments from Trudeau’s press secretary, who maintained (rather convincingly, I think) that nothing in the episode was “staged”—at least, not in the sinister way that the word implies. Rather, Trudeau had joked that he was looking forward to someone asking him about quantum computing; a reporter at the press conference jokingly complied, without really expecting a response (he quickly moved on to his real question before Trudeau could answer); Trudeau responded anyway, because he really did want to show off his knowledge.

Trotter deserves credit, regardless, for following up and getting a fuller picture of what transpired. He did what those who initially jumped on the story did not, which was to contact the principals for context and comment.

But my point here is not to criticize any particular writer or publication. The too-tidy Trudeau narrative was not the deliberate work of any bad actor or fabricator. Rather, it was the inevitable product of today’s inexorable social-media machine, in which shareable content fuels the traffic-referral engines that pay online media’s bills.

I suggest reading both McCullough’s and Oremus’s posts in their entirety should you find debates about the role of media compelling.

A study in contrasts: innovation and education strategies in US and British Columbia (Canada)

It’s always interesting to contrast two approaches to the same issue, in this case, innovation and education strategies designed to improve the economies of the United States and of British Columbia, a province in Canada.

One of the major differences regarding education in the US and in Canada is that the Canadian federal government, unlike the US federal government, has no jurisdiction over the matter. Education is strictly a provincial responsibility.

I recently wrote a commentary (a Jan. 19, 2016 posting) about the BC government’s Jan. 18, 2016 announcement of its innovation strategy, with a special emphasis on the education aspect. Premier Christy Clark focused largely on the notion of embedding courses on computer coding in schools from K-12 (kindergarten through grade 12), as Jonathon Narvey noted in his Jan. 19, 2016 event recap for Betakit,

While many in the tech sector will be focused on the short-term benefits of a quick injection of large capital [a $100M BC Tech Fund as part of a new strategy was announced in Dec. 2015 but details about the new #BCTECH Strategy were not shared until Jan. 18, 2016], the long-term benefits for the local tech sector are being seeded in local schools. More than 600,000 BC students will be getting basic skills in the K-12 curriculum, with coding academies, more work experience electives and partnerships between high school and post-secondary institutions.

Here’s what I had to say in my commentary (from the Jan. 19, 2016 posting),

… the government wants to embed computer coding into the education system for K-12 (kindergarten to grade 12). One determined reporter (Canadian Press if memory serves) attempted to find out how much this would cost. No answer was forthcoming although there were many words expended. Whether this failure was due to ignorance (disturbing!) or a reluctance to share (also disturbing!) was impossible to tell. Another reporter (Georgia Straight) asked about equipment (coding can be taught with pen and paper but hardware is better). … Getting back to the reporter’s question, no answer was forthcoming although the speaker was loquacious.

Another reporter asked if the government had found any jurisdictions doing anything similar regarding computer coding. It seems they did consider other jurisdictions although it was claimed that BC is the first to strike out in this direction. Oddly, no one mentioned Estonia, known in some circles as E-stonia, where the entire school system was online by the late 1990s in an initiative known as the ‘Tiger Leap Foundation’ which also supported computer coding classes in secondary school (there’s more in Tim Mansel’s May 16, 2013 article about Estonia’s then latest initiative to embed computer coding into grade school.) …

Aside from the BC government’s failure to provide details, I am uncomfortable with what I see as an overemphasis on computer coding that suggests a narrow focus on what constitutes a science and technology strategy for education. I find the US approach closer to what I favour although I may be biased since they are building their strategy around nanotechnology education.

The US approach had been announced in dribs and drabs until recently when a Jan. 26, 2016 news item on Nanotechnology Now indicated a broad-based plan for nanotechnology education (and computer coding),

Over the past 15 years, the Federal Government has invested over $22 billion in R&D under the auspices of the National Nanotechnology Initiative (NNI) to understand and control matter at the nanoscale and develop applications that benefit society. As these nanotechnology-enabled applications become a part of everyday life, it is important for students to have a basic understanding of material behavior at the nanoscale, and some states have even incorporated nanotechnology concepts into their K-12 science standards. Furthermore, application of the novel properties that exist at the nanoscale, from gecko-inspired climbing gloves and invisibility cloaks, to water-repellent coatings on clothes or cellphones, can spark students’ excitement about science, technology, engineering, and mathematics (STEM).

An earlier Jan. 25, 2016 White House blog posting by Lisa Friedersdorf and Lloyd Whitman introduced the notion that nanotechnology is viewed as foundational and a springboard for encouraging interest in STEM (science, technology, engineering, and mathematics) careers while outlining several formal and informal education efforts,

The Administration’s updated Strategy for American Innovation, released in October 2015, identifies nanotechnology as one of the emerging “general-purpose technologies”—a technology that, like the steam engine, electricity, and the Internet, will have a pervasive impact on our economy and our society, with the ability to create entirely new industries, create jobs, and increase productivity. To reap these benefits, we must train our Nation’s students for these high-tech jobs of the future. Fortunately, the multidisciplinary nature of nanotechnology and the unique and fascinating phenomena that occur at the nanoscale mean that nanotechnology is a perfect topic to inspire students to pursue careers in science, technology, engineering, and mathematics (STEM).

The Nanotechnology: Super Small Science series [mentioned in my Jan. 21, 2016 posting] is just the latest example of the National Nanotechnology Initiative (NNI)’s efforts to educate and inspire our Nation’s students. Other examples include:

The announcement about integrating computer coding courses into US K-12 education curricula was made in US President Barack Obama’s 2016 State of the Union speech and covered in a Jan. 30, 2016 article by Jessica Hullinger for Fast Company,

In his final State Of The Union address earlier this month, President Obama called for providing hands-on computer science classes for all students to make them “job ready on day one.” Today, he is unveiling how he plans to do that with his upcoming budget.

The President’s Computer Science for All Initiative seeks to provide $4 billion in funding for states and an additional $100 million directly to school districts in a push to provide access to computer science training in K-12 public schools. The money would go toward things like training teachers, providing instructional materials, and getting kids involved in computer science early in elementary and middle school.

There are more details in Hullinger’s article and in a Jan. 30, 2016 White House blog posting by Megan Smith,

Computer Science for All is the President’s bold new initiative to empower all American students from kindergarten through high school to learn computer science and be equipped with the computational thinking skills they need to be creators in the digital economy, not just consumers, and to be active citizens in our technology-driven world. Our economy is rapidly shifting, and both educators and business leaders are increasingly recognizing that computer science (CS) is a “new basic” skill necessary for economic opportunity and social mobility.

CS for All builds on efforts already being led by parents, teachers, school districts, states, and private sector leaders from across the country.

Nothing says one approach has to be better than the other as there’s usually more than one way to accomplish a set of goals. As well, it’s unfair to expect a provincial government to emulate the federal government of a larger country with more money to spend. I just wish the BC government (a) had shared details such as the budget allotment for their initiative and (b) would hint at a more imaginative, long-range view of STEM education.

Going back to Estonia one last time, in addition to the country’s recent introduction of computer coding classes in grade school, it has also embarked on a nanotechnology/nanoscience educational and entrepreneurial programme as noted in my Sept. 30, 2014 posting,

The University of Tartu (Estonia) announced in a Sept. 29, 2014 press release an educational and entrepreneurial programme about nanotechnology/nanoscience for teachers and students,

To bring nanoscience closer to pupils, educational researchers of the University of Tartu decided to implement the European Union LLP Comenius project “Quantum Spin-Off – connecting schools with high-tech research and entrepreneurship”. The objective of the project is to build a kind of a bridge: at one end, pupils can familiarise themselves with modern science, and at the other, experience its application opportunities at high-tech enterprises. “We also wish to inspire these young people to choose a specialisation related to science and technology in the future,” added Lukk [Maarika Lukk, Coordinator of the project].

The pupils can choose between seven topics of nanotechnology: the creation of artificial muscles, microbiological fuel elements, manipulation of nanoparticles, nanoparticles and ionic liquids as oil additives, materials used in regenerative medicine, deposition and 3D-characterisation of atomically designed structures and a topic covered in English, “Artificial robotic fish with EAP elements”.

Learning is based on study modules in the field of nanotechnology. In addition, each team of pupils will read a scientific publication, selected for them by an expert of that particular field. In that way, pupils will develop an understanding of the field and of scientific texts. On the basis of the scientific publication, the pupils prepare their own research project and a business plan suitable for applying the results of the project.

In each field, experts of the University of Tartu will help to understand the topics. Participants will visit a nanotechnology research laboratory and enterprises using nanotechnologies.

The project lasts for two years and it is also implemented in Belgium, Switzerland and Greece.

As they say, time will tell.

Performances Tom Hanks never gave

The answer to the question, “What makes Tom Hanks look like Tom Hanks?” leads to machine learning and algorithms according to a Dec. 7, 2015 University of Washington news release (also on EurekAlert; Note: Links have been removed),

Tom Hanks has appeared in many acting roles over the years, playing young and old, smart and simple. Yet we always recognize him as Tom Hanks.

Why? Is it his appearance? His mannerisms? The way he moves?

University of Washington researchers have demonstrated that it’s possible for machine learning algorithms to capture the “persona” and create a digital model of a well-photographed person like Tom Hanks from the vast number of images of them available on the Internet.

With enough visual data to mine, the algorithms can also animate the digital model of Tom Hanks to deliver speeches that the real actor never performed.

“One answer to what makes Tom Hanks look like Tom Hanks can be demonstrated with a computer system that imitates what Tom Hanks will do,” said lead author Supasorn Suwajanakorn, a UW graduate student in computer science and engineering.

As for the performances Tom Hanks never gave, the news release offers more detail,

The technology relies on advances in 3-D face reconstruction, tracking, alignment, multi-texture modeling and puppeteering that have been developed over the last five years by a research group led by UW assistant professor of computer science and engineering Ira Kemelmacher-Shlizerman. The new results will be presented in a paper at the International Conference on Computer Vision in Chile on Dec. 16.

The team’s latest advances include the ability to transfer expressions and the way a particular person speaks onto the face of someone else — for instance, mapping former president George W. Bush’s mannerisms onto the faces of other politicians and celebrities.

Here’s a video demonstrating how former President Bush’s speech and mannerisms have been mapped onto other famous faces including Hanks’s,

The research team has future plans for this technology (from the news release),

It’s one step toward a grand goal shared by the UW computer vision researchers: creating fully interactive, three-dimensional digital personas from family photo albums and videos, historic collections or other existing visuals.

As virtual and augmented reality technologies develop, they envision using family photographs and videos to create an interactive model of a relative living overseas or a far-away grandparent, rather than simply Skyping in two dimensions.

“You might one day be able to put on a pair of augmented reality glasses and there is a 3-D model of your mother on the couch,” said senior author Kemelmacher-Shlizerman. “Such technology doesn’t exist yet — the display technology is moving forward really fast — but how do you actually re-create your mother in three dimensions?”

One day the reconstruction technology could be taken a step further, researchers say.

“Imagine being able to have a conversation with anyone you can’t actually get to meet in person — LeBron James, Barack Obama, Charlie Chaplin — and interact with them,” said co-author Steve Seitz, UW professor of computer science and engineering. “We’re trying to get there through a series of research steps. One of the true tests is can you have them say things that they didn’t say but it still feels like them? This paper is demonstrating that ability.”

Existing technologies to create detailed three-dimensional holograms or digital movie characters like Benjamin Button often rely on bringing a person into an elaborate studio. They painstakingly capture every angle of the person and the way they move — something that can’t be done in a living room.

Other approaches still require a person to be scanned by a camera to create basic avatars for video games or other virtual environments. But the UW computer vision experts wanted to digitally reconstruct a person based solely on a random collection of existing images.

To reconstruct celebrities like Tom Hanks, Barack Obama and Daniel Craig, the machine learning algorithms mined a minimum of 200 Internet images taken over time in various scenarios and poses — a process known as learning ‘in the wild.’

“We asked, ‘Can you take Internet photos or your personal photo collection and animate a model without having that person interact with a camera?'” said Kemelmacher-Shlizerman. “Over the years we created algorithms that work with this kind of unconstrained data, which is a big deal.”

Suwajanakorn more recently developed techniques to capture expression-dependent textures — small differences that occur when a person smiles or looks puzzled or moves his or her mouth, for example.

By manipulating the lighting conditions across different photographs, he developed a new approach to densely map the differences from one person’s features and expressions onto another person’s face. That breakthrough enables the team to ‘control’ the digital model with a video of another person, and could potentially enable a host of new animation and virtual reality applications.

“How do you map one person’s performance onto someone else’s face without losing their identity?” said Seitz. “That’s one of the more interesting aspects of this work. We’ve shown you can have George Bush’s expressions and mouth and movements, but it still looks like George Clooney.”

Here’s a link to and a citation for the paper presented at the conference in Chile,

What Makes Tom Hanks Look Like Tom Hanks by Supasorn Suwajanakorn, Steven M. Seitz, Ira Kemelmacher-Shlizerman for the 2015 ICCV conference, Dec. 13 – 15, 2015 in Chile.

You can find out more about the conference here.

Manufacturing innovation in the US and the Institutes for Manufacturing Innovation (IMI)

The announcement from US President Barack Obama about creating a National Network for Manufacturing Innovation (NNMI), which would result in up to 45 Institutes for Manufacturing Innovation (IMI), seems to have been made a while back, as one of the technical focus areas mentioned in the current round of RFIs (requests for information) has already closed. Regardless, here’s more from a Sept. 18, 2014 news item on Azonano,

The President of the United States has launched a major, new initiative focused on strengthening the innovation, performance, competitiveness, and job-creating power of U.S. manufacturing called the National Network for Manufacturing Innovation (NNMI).

The NNMI is comprised of Institutes for Manufacturing Innovation (IMIs) and the President has proposed establishing up to 45 IMIs around the country.

A Sept. ??, 2014 National Nanotechnology Initiative (NNI) news release, which originated the news item, describes the program and the RFIs in more detail,

The IMIs will be regionally centered public private partnerships enabling the scale-up of advanced manufacturing technologies and processes, with the goal of successful transition of existing science and technology into the marketplace for both defense and commercial applications. The purpose of the RFI is for DOD to consider input from industry and academia as part of an effort to select and scope the technology focus areas for future IMIs. The RFI originally sought information about the following technical focus areas:

  • Flexible Hybrid Electronics
  • Photonics (now closed)
  • Engineered Nanomaterials
  • Fiber and Textiles
  • Electronic Packaging and Reliability
  • Aerospace Composites

Submissions received to date relevant to the Photonics topic have been deemed sufficient and this topic area is now closed; all other areas remain open. The RFI contains detailed descriptions of the focus areas along with potential applications, market opportunities, and discussion of current and future Technology Readiness Levels (TRLs).

The National Nanotechnology Coordination Office encourages interested members of the nanotechnology community to view and respond to the RFI as appropriate. [emphasis mine] The IMI institutes have the potential to provide game-changing resources and foster exciting new partnerships for the nanotechnology community.

The current closing date is 10 October 2014. Additional details can be found in the RFI and its amendments.

(I’m highlighting the nanotechnology connection for discussion later in this posting.)

You can find the official RFI for the Institutes for Manufacturing Innovation here along with this information,

The Department of Defense (DoD) wishes to consider input from Industry and Academia as part of an effort to select and scope the technology focus areas for future Institutes for Manufacturing Innovation (IMIs). These IMIs will be regionally centered Public Private Partnerships enabling the scale-up of advanced manufacturing technologies and processes with the goal of successful transition of existing science and technology into the marketplace for both Defense and commercial applications. Each Institute will be led by a not-for-profit organization and focus on one technology area. The Department is requesting responses which will assist in the selection of a technology focus area from those currently under consideration, based upon evidence of national security requirement, economic benefit, technical opportunity, relevance to industry, business case for sustainability, and workforce challenge.

There is also some information about this opportunity on the US government’s Advanced Manufacturing Portal here.

This National Network for Manufacturing Innovation is a particularly interesting development in light of my Feb. 10, 2014 posting about a US Government Accountability Office (GAO) report titled: “Nanomanufacturing: Emergence and Implications for U.S. Competitiveness, the Environment, and Human Health.”

Later in 2014, the NNI budget request was shrunk by $200M (mentioned in my March 31, 2014 posting) and shortly thereafter members of the nanotech community went to Washington, as per my May 23, 2014 posting. Prior to hearing testimony, the representatives on the subcommittee were given a 22 pp. précis (PDF; titled: NANOMANUFACTURING AND U.S. COMPETITIVENESS; Challenges and Opportunities) of the GAO report published in Feb. 2014.

I’ve already highlighted mention of the National Nanotechnology Coordination Office in a news release generated by the National Nanotechnology Initiative (NNI) which features a plea to the nanotechnology community to respond to the RFIs.

Clearly, the US NNI is responding to the notion that research generated by the NNI needs to be commercialized.

Finally, the involvement of the US Department of Defense can’t be a huge surprise to anyone given that military research has contributed greatly to consumer technology. As well, it seems the Dept. of Defense might wish to further capitalize on its own research efforts.

Mopping up that oil spill with a nanocellulose sponge and a segue into Canadian oil and politics

Empa (Swiss Federal Laboratories for Materials Science and Technology or, in German, Eidgenössische Materialprüfungs- und Forschungsanstalt) has announced the development of a nanocellulose sponge useful for cleaning up oil spills in a May 5, 2014 news item on Nanowerk (Note: A link has been removed),

A new, absorbable material from Empa wood research could be of assistance in future oil spill accidents: a chemically modified nanocellulose sponge. The light material absorbs the oil spill, remains floating on the surface and can then be recovered. The absorbent can be produced in an environmentally-friendly manner from recycled paper, wood or agricultural by-products (“Ultralightweight and Flexible Silylated Nanocellulose Sponges for the Selective Removal of Oil from Water”).

A May 2, 2014 Empa news release (also on EurekAlert*), which originated the news item, includes a description of the potential for oil spills due to transport issues, Empa’s proposed clean-up technology, and a request for investment,

All industrial nations need large volumes of oil which is normally delivered by ocean-going tankers or via inland waterways to its destination. The most environmentally-friendly way of cleaning up nature after an oil spill accident is to absorb and recover the floating film of oil. The Empa researchers Tanja Zimmermann and Philippe Tingaut, in collaboration with Gilles Sèbe from the University of Bordeaux, have now succeeded in developing a highly absorbent material which separates the oil film from the water and can then be easily recovered: a “silylated” nanocellulose sponge. In laboratory tests the sponges absorbed up to 50 times their own weight of mineral oil or engine oil. They kept their shape to such an extent that they could be removed with pincers from the water. The next step is to fine-tune the sponges so that they can be used not only on a laboratory scale but also in real disasters. To this end, a partner from industry is currently being sought.

Here’s what the nanocellulose sponge looks like (oil was dyed red and the sponge has absorbed it from the water),

The sponge remains afloat and can be pulled out easily. The oil phase is selectively removed from the surface of water. Image: Empa

The news release describes the substance, nanofibrillated cellulose (NFC), and its advantages,

Nanofibrillated Cellulose (NFC), the basic material for the sponges, is extracted from cellulose-containing materials like wood pulp, agricultural by-products (such as straw) or waste materials (such as recycled paper) by adding water to them and pressing the aqueous pulp through several narrow nozzles at high pressure. This produces a suspension with gel-like properties containing long and interconnected cellulose nanofibres.

When the water from the gel is replaced with air by freeze-drying, a nanocellulose sponge is formed which absorbs both water and oil. This pristine material sinks in water and is thus not useful for the envisaged purpose. The Empa researchers have succeeded in modifying the chemical properties of the nanocellulose in just one process step by admixing a reactive alkoxysilane molecule in the gel before freeze-drying. The nanocellulose sponge loses its hydrophilic properties, is no longer suffused with water and only binds with oily substances.

In the laboratory the “silylated” nanocellulose sponge absorbed test substances like engine oil, silicone oil, ethanol, acetone or chloroform within seconds. Nanofibrillated cellulose sponge, therefore, reconciles several desirable properties: it is absorbent, floats reliably on water even when fully saturated and is biodegradable.
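
To put the “50 times their own weight” figure in perspective, here’s a back-of-envelope sketch. The spill size and oil density below are hypothetical round numbers of my own choosing, and the laboratory absorption ratio may well not hold in the field:

```python
# Back-of-envelope only: sponge mass needed for a hypothetical spill,
# assuming the ~50x laboratory absorption ratio holds at scale.
OIL_DENSITY_KG_PER_L = 0.87    # approximate density of a light crude/mineral oil
ABSORPTION_RATIO = 50          # kg of oil absorbed per kg of sponge (lab figure)

spill_volume_l = 10_000        # hypothetical 10,000-litre spill
oil_mass_kg = spill_volume_l * OIL_DENSITY_KG_PER_L
sponge_needed_kg = oil_mass_kg / ABSORPTION_RATIO

print(f"Oil mass: {oil_mass_kg:.0f} kg")                              # ~8,700 kg
print(f"Sponge needed at 50x absorption: {sponge_needed_kg:.0f} kg")  # ~174 kg
```

In other words, even a modest spill would call for sponge by the hundreds of kilograms, which presumably is part of why the researchers are looking for an industrial partner to scale up production.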

Here’s a link to and a citation for the paper,

Ultralightweight and Flexible Silylated Nanocellulose Sponges for the Selective Removal of Oil from Water by Zheng Zhang, Gilles Sèbe, Daniel Rentsch, Tanja Zimmermann, and Philippe Tingaut. Chem. Mater., 2014, 26 (8), pp 2659–2668 DOI: 10.1021/cm5004164 Publication Date (Web): April 10, 2014

Copyright © 2014 American Chemical Society

This article is behind a paywall.

I featured ‘nanocellulose and oil spills’ research at the University of Wisconsin-Madison in a Feb. 26, 2014 post titled, Cleaning up oil* spills with cellulose nanofibril aerogels (Note: I corrected a typo in my headline, hence the asterisk). I also have a Dec. 31, 2013 piece about a nanotechnology-enabled oil spill recovery technology project (Naimor) searching for funds via crowdfunding. Some major oil projects being considered in Canada and the lack of research on remediation are also mentioned in that post.

Segue Alert! As for the latest on Canada and its oil export situation, there’s a rather interesting May 2, 2014 Bloomberg.com article, ‘Canada Finds China Option No Easy Answer to Keystone Snub’, by Edward Greenspon, Andrew Mayeda, Jeremy van Loon and Rebecca Penty, describing two Canadian oil projects and offering a US perspective,

It was February 2012, three months since President Barack Obama had phoned the Canadian prime minister to say the Keystone XL pipeline designed to carry vast volumes of Canadian crude to American markets would be delayed.

Now Harper [Canadian Prime Minister Stephen Harper] found himself thousands of miles from Canada on the banks of the Pearl River promoting Plan B: a pipeline from Alberta’s landlocked oil sands to the Pacific Coast where it could be shipped in tankers to a place that would certainly have it — China. It was a country to which he had never warmed yet that served his current purposes. [China’s President at that time was Hu Jintao, 2002 – 2012; currently the President is Xi Jinping, 2013 – ]

The writers do a good job of describing a number of factors having an impact on one or both of the pipeline projects. However, no mention is made in the article that Harper is from the province of Alberta and represents that province’s Calgary Southwest riding. For those unfamiliar with Calgary, it is a city dominated by oil companies. I imagine Mr. Harper is under considerable pressure to resolve oil export and transport issues, and I would expect those companies would prefer to see the US issues resolved first, since many of the oil companies in Calgary have US headquarters.

Still, it seems simple: if the US is not interested, as the problems with the Keystone XL pipeline project suggest, ship the oil to China via a pipeline through the province of British Columbia and onto a tanker. What the writers do not mention is yet another complicating factor: the Trudeaus, both Justin and the late Pierre.

As Prime Minister of Canada, Pierre Trudeau was unloved in Alberta, Harper’s home province, due to his energy policies and the creation of the National Energy Program. Harper appears, despite his denials, to have an antipathy towards Pierre Trudeau that goes beyond the political to the personal, and it seems to extend beyond Pierre’s grave to his son, Justin. A March 21, 2014 article by Mark Kennedy for the National Post describes Harper’s response to Trudeau’s 2000 funeral this way,

Stephen Harper, then the 41-year-old president of the National Citizens Coalition (NCC), was a proud conservative who had spent three years as a Reform MP. He had entered politics in the mid-1980s, in part because of his disdain for how Pierre Trudeau’s “Just Society” had changed Canada.

So while others were celebrating Trudeau’s legacy, Harper hammered out a newspaper article eviscerating the former prime minister on everything from policy to personality.

Harper blasted Trudeau Sr. for creating “huge deficits, a mammoth national debt, high taxes, bloated bureaucracy, rising unemployment, record inflation, curtailed trade and declining competitiveness.”

On national unity, he wrote that Trudeau was a failure. “Only a bastardized version of his unity vision remains and his other policies have been rejected and repealed by even his own Liberal party.”

Trudeau had merely “embraced the fashionable causes of his time,” wrote Harper.

Getting personal, he took a jab at Trudeau over not joining the military during the Second World War: “He was also a member of the ‘greatest generation,’ the one that defeated the Nazis in war and resolutely stood down the Soviets in the decades that followed. In those battles however, the ones that truly defined his century, Mr. Trudeau took a pass.”

The article was published in the National Post Oct. 5, 2000 — two days after the funeral.

Kennedy’s article was occasioned by the campaign being led by Harper’s Conservative party against the leader (as of April 2013) of the Liberal Party, Justin Trudeau.

It’s hard to believe that Harper’s hesitation over China is solely due to human rights issues, especially since Harper has not been noted for consistent interest in those issues and, more particularly, since Prime Minister Pierre Trudeau was one of the first ‘Western’ leaders to visit communist China. Interestingly, Harper has been much more enthusiastic about the US than Pierre Trudeau, who, while addressing the Press Club in Washington, DC in March 1969, made this observation (from the Pierre Trudeau Wikiquote entry),

Living next to you [the US] is in some ways like sleeping with an elephant. No matter how friendly and even-tempered is the beast, if I can call it that, one is affected by every twitch and grunt.

On that note, I think Canada is always going to be sleeping with an elephant; the only question is, who’s the elephant now? In any event, perhaps Harper is more comfortable with the elephant he knows and that may explain why China’s offer to negotiate a free trade agreement has been left unanswered (this too was not noted in the Bloomberg article). The offer and lack of response were mentioned by Yuen Pau Woo, President and CEO of the Asia Pacific Foundation of Canada, who spoke at length about China, Canada, and their trade relations at a Jan. 31, 2014 MP breakfast (scroll down for video highlights of the Jan. 31, 2014 breakfast) held by Member of Parliament (MP) for Vancouver-Quadra, Joyce Murray.

Geopolitical tensions and Canadian sensitivities aside, I think Canadians in British Columbia (BC), at least, had best prepare for more oil being transported and the likelihood of spills. In fact, there are already more shipments according to a May 6, 2014 article by Larry Pynn for the Vancouver Sun,

B.C. municipalities work to prevent a disastrous accident as rail transport of oil skyrockets

The number of rail cars transporting crude oil and petroleum products through B.C. jumped almost 200 per cent last year, reinforcing the resolve of municipalities to prevent a disastrous accident similar to the derailment in Lac-Mégantic in Quebec last July [2013].

Transport Canada figures provided at The Vancouver Sun’s request show just under 3,400 oil and petroleum rail-car shipments in B.C. last year, compared with about 1,200 in 2012 and 50 in 2011.

The figures come a week after The Sun revealed that train derailments jumped 20 per cent to 110 incidents last year in B.C., the highest level in five years.

Between 2011 and 2012, oil and petroleum rail-car shipments in BC increased by 2,300 per cent (from 50 to about 1,200). The almost 200 per cent increase between 2012 and 2013 (from about 1,200 to just under 3,400) seems paltry in comparison. Given the growth in shipments and the 20 per cent rise in derailments, one assumes there’s an oil spill waiting to happen. Especially so if the Canadian government manages to come to an agreement regarding the proposed pipeline for BC; frankly, I have concerns about the other pipeline too, since either will require more rail cars, trucks, and/or tankers for transport to major centres, edging us all closer to a major oil spill.
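
For the record, here is the arithmetic behind those percentages, using the Transport Canada figures quoted above:

```python
# Percentage increases implied by the quoted shipment figures (50, ~1,200, ~3,400).
def pct_increase(old, new):
    """Percentage increase from old to new."""
    return (new - old) / old * 100

print(f"2011 -> 2012: {pct_increase(50, 1200):.0f}% increase")    # 2300%
print(f"2012 -> 2013: {pct_increase(1200, 3400):.0f}% increase")  # ~183%, i.e. 'almost 200%'
```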

All of this brings me back to Empa, its oil-absorbing nanocellulose sponges, and the researchers’ plea for investors and funds to further their research. I hope they and all the other researchers (e.g., Naimor) searching for ways to develop and bring their clean-up ideas to market find some support.

*EurekAlert link added May 7, 2014.

ETA May 8, 2014:  Some types of crude oil are more flammable than others according to a May 7, 2014 article by Lindsay Abrams for Salon.com (Note: Links have been removed),

Why oil-by-rail is an explosive disaster waiting to happen
A recent spate of fiery train accidents all have one thing in common: highly volatile cargo from North Dakota

In case the near continuous reports of fiery, deadly oil train accidents hasn’t been enough to convince you, Earth Island Journal is out with a startling investigative piece on North Dakota’s oil boom and the dire need for regulations governing that oil’s transport by rail.

The article is pegged to the train that derailed and exploded last summer in  [Lac-Mégantic] Quebec, killing 47 people, although it just as well could have been the story of the train that derailed and exploded in Alabama last November, the train that derailed and exploded in North Dakota last December, the train that derailed and exploded in Virginia last week or — let’s face it — any future accidents that many see as an inevitability.

The Bakken oil fields in North Dakota are producing over a million barrels of crude oil a day, more than 60 percent of which is shipped by rail. All that greenhouse gas-emitting fossil fuel is bad enough; that more oil spilled in rail accidents last year than the past 35 years combined is also no small thing. But the particular chemical composition of Bakken oil lends an extra weight to these concerns: according to the Pipeline and Hazardous Materials Safety Administration, it may be more flammable and explosive than traditional crude.

While Abrams’ piece is not focused on oil cleanups, it does raise some interesting questions about crude oil transport and whether or not the oil from Alberta might also be more than usually dangerous.

Nanotechnology and the US mega science project: BAM (Brain Activity Map) and more

The Brain Activity Map (BAM) project received budgetary approval as of this morning, Apr. 2, 2013 (I first mentioned BAM in my Mar. 4, 2013 posting when approval seemed imminent). From the news item, Obama Announces Huge Brain-Mapping Project, written by Stephanie Pappas for Yahoo News (Note: Links have been removed),

 President Barack Obama announced a new research initiative this morning (April 2) to map the human brain, a project that will launch with $100 million in funding in 2014.

The Brain Activity Map (BAM) project, as it is called, has been in the planning stages for some time. In the June 2012 issue of the journal Neuron, six scientists outlined broad proposals for developing non-invasive sensors and methods to experiment on single cells in neural networks. This February, President Obama made a vague reference to the project in his State of the Union address, mentioning that it could “unlock the answers to Alzheimer’s.”

In March, the project’s visionaries outlined their final goals in the journal Science. They call for an extended effort, lasting several years, to develop tools for monitoring up to a million neurons at a time. The end goal is to understand how brain networks function.

“It could enable neuroscience to really get to the nitty-gritty of brain circuits, which is the piece that’s been missing from the puzzle,” Rafael Yuste, the co-director of the Kavli Institute for Brain Circuits at Columbia University, who is part of the group spearheading the project, told LiveScience in March. “The reason it’s been missing is because we haven’t had the techniques, the tools.” [Inside the Brain: A Journey Through Time]

Not all neuroscientists support the project, however, with some arguing that it lacks clear goals and may cannibalize funds for other brain research.

….

I believe the $100M mentioned for 2014 would be one installment in a series totaling up to $1B or more. In any event, it seems like a timely moment to comment on the communications campaign that has been waged on behalf of BAM. It reminds me a little of the campaign for graphene, which was waged in the buildup to the decision as to which two projects (in a field of six semi-finalists, then narrowed to a field of four finalists) should each receive a FET (European Union’s Future and Emerging Technology) 1 billion euro research prize. It seemed to me even a year or so before the decision that graphene’s win was a foregone conclusion, but the organizers left nothing to chance and were relentless in their pursuit of attention and media coverage in the buildup to the final decision.

The most recent salvo in the BAM campaign was an attempt to link it with nanotechnology. A shrewd move given that the US has spent well over $1B since the US National Nanotechnology Initiative (NNI) was first approved in 2000. Linking the two projects means the NNI can lend a little authority to the new project (subtext: we’ve supported a mega-project before and that was successful) while the new project BAM can imbue the ageing NNI with some excitement.

Here’s more about nanotechnology and BAM from a Mar. 27, 2013 Spotlight article by Michael Berger on Nanowerk,

A comprehensive understanding of the brain remains an elusive, distant frontier. To arrive at a general theory of brain function would be an historic event, comparable to inferring quantum theory from huge sets of complex spectra and inferring evolutionary theory from vast biological field work. You might have heard about the proposed Brain Activity Map – a project that, like the Human Genome Project, will tap the hive mind of experts to make headway in the understanding of the field. Engineers and nanotechnologists will be needed to help build ever smaller devices for measuring the activity of individual neurons and, later, to control how those neurons function. Computer scientists will be called upon to develop methods for storing and analyzing the vast quantities of imaging and physiological data, and for creating virtual models for studying brain function. Neuroscientists will provide critical biological expertise to guide the research and interpret the results.

Berger goes on to highlight some of the ways nanotechnology-enabled devices could contribute to the effort. He draws heavily on a study published Mar. 20, 2013 online in ACS (American Chemical Society) Nano. Shockingly, the article is open access. Given that this is the first time I’ve come across an open access article in any of the American Chemical Society’s journals, I suspect that there was payment of some kind involved to make this information freely available. (The practice of allowing researchers to pay more in order to guarantee open access to their research in journals that also have articles behind paywalls seems to be in the process of becoming more common.)

Here’s a citation and a link to the article about nanotechnology and BAM,

Nanotools for Neuroscience and Brain Activity Mapping by A. Paul Alivisatos, Anne M. Andrews, Edward S. Boyden, Miyoung Chun, George M. Church, Karl Deisseroth, John P. Donoghue, Scott E. Fraser, Jennifer Lippincott-Schwartz, Loren L. Looger, Sotiris Masmanidis, Paul L. McEuen, Arto V. Nurmikko, Hongkun Park, Darcy S. Peterka, Clay Reid, Michael L. Roukes, Axel Scherer, Mark Schnitzer, Terrence J. Sejnowski, Kenneth L. Shepard, Doris Tsao, Gina Turrigiano, Paul S. Weiss, Chris Xu, Rafael Yuste, and Xiaowei Zhuang. ACS Nano, 2013, 7 (3), pp 1850–1866 DOI: 10.1021/nn4012847 Publication Date (Web): March 20, 2013
Copyright © 2013 American Chemical Society

As these things go, it’s a readable article for people without a neuroscience education provided they don’t mind feeling a little confused from time to time. From Nanotools for Neuroscience and Brain Activity Mapping (Note: Footnotes and links removed),

The Brain Activity Mapping (BAM) Project (…) has three goals in terms of building tools for neuroscience capable of (…) measuring the activity of large sets of neurons in complex brain circuits, (…) computationally analyzing and modeling these brain circuits, and (…) testing these models by manipulating the activities of chosen sets of neurons in these brain circuits.

As described below, many different approaches can, and likely will, be taken to achieve these goals as neural circuits of increasing size and complexity are studied and probed.

The BAM project will focus both on dynamic voltage activity and on chemical neurotransmission. With an estimated 85 billion neurons, 100 trillion synapses, and 100 chemical neurotransmitters in the human brain,(…) this is a daunting task. Thus, the BAM project will start with model organisms, neural circuits (vide infra), and small subsets of specific neural circuits in humans.

Among the approaches that show promise for the required dynamic, parallel measurements are optical and electro-optical methods that can be used to sense neural cell activity such as Ca2+,(…) voltage,(…) and (already some) neurotransmitters;(…) electrophysiological approaches that sense voltages and some electrochemically active neurotransmitters;(…) next-generation photonics-based probes with multifunctional capabilities;(…) synthetic biology approaches for recording histories of function;(…) and nanoelectronic measurements of voltage and local brain chemistry.(…) We anticipate that tools developed will also be applied to glia and more broadly to nanoscale and microscale monitoring of metabolic processes.

Entirely new tools will ultimately be required both to study neurons and neural circuits with minimal perturbation and to study the human brain. These tools might include “smart”, active nanoscale devices embedded within the brain that report on neural circuit activity wirelessly and/or entirely new modalities of remote sensing of neural circuit dynamics from outside the body. Remarkable advances in nanoscience and nanotechnology thus have key roles to play in transduction, reporting, power, and communications.

One of the ultimate goals of the BAM project is that the knowledge acquired and tools developed will prove useful in the intervention and treatment of a wide variety of diseases of the brain, including depression, epilepsy, Parkinson’s, schizophrenia, and others. We note that tens of thousands of patients have already been treated with invasive (i.e., through the skull) treatments. [emphases mine] While we hope to reduce the need for such measures, greatly improved and more robust interfaces to the brain would impact effectiveness and longevity where such treatments remain necessary.
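
To get a feel for why the authors call this a “daunting task,” consider the data rates implied by monitoring even the million neurons mentioned above. The sampling rate and sample size below are illustrative assumptions on my part, not figures from the BAM proposal:

```python
# Back-of-envelope only: data volume from monitoring a million neurons.
# Sampling rate and sample size are assumptions for illustration.
neurons = 1_000_000        # monitoring target mentioned in the proposals
sample_rate_hz = 1_000     # assumed 1 kHz per neuron
bytes_per_sample = 2       # assumed 16-bit samples

bytes_per_second = neurons * sample_rate_hz * bytes_per_sample
print(f"{bytes_per_second / 1e9:.1f} GB per second")        # 2.0 GB/s
print(f"{bytes_per_second * 3600 / 1e12:.1f} TB per hour")  # 7.2 TB/hour
```

At terabytes per hour for a single recording session, it’s easy to see why computer scientists figure as prominently as neuroscientists in these plans.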

Perhaps not so coincidentally, there was this Mar. 29, 2013 news item on Nanowerk,

Some human cells forget to empty their trash bins, and when the garbage piles up, it can lead to Parkinson’s disease and other genetic and age-related disorders. Scientists don’t yet understand why this happens, and Rice University engineering researcher Laura Segatori is hoping to change that, thanks to a prestigious five-year CAREER Award from the National Science Foundation (NSF).

Segatori, Rice’s T.N. Law Assistant Professor of Chemical and Biomolecular Engineering and assistant professor of bioengineering and of biochemistry and cell biology, will use her CAREER grant to create a toolkit for probing the workings of the cellular processes that lead to accumulation of waste material and development of diseases, such as Parkinson’s and lysosomal storage disorders. Each tool in the kit will be a nanoparticle — a speck of matter about the size of a virus — with a specific shape, size and charge. [emphases mine] By tailoring each of these properties, Segatori’s team will create a series of specialized probes that can uncover the workings of a cellular process called autophagy.

“Eventually, once we understand how to design a nanoparticle to activate autophagy, we will use it as a tool to learn more about the autophagic process itself because there are still many question marks in biology regarding how this pathway works,” Segatori said. “It’s not completely clear how it is regulated. It seems that excessive autophagy may activate cell death, but it’s not yet clear. In short, we are looking for more than therapeutic applications. We are also hoping to use these nanoparticles as tools to study the basic science of autophagy.”

There is no direct reference to BAM but there are some intriguing correspondences.

Finally, there is no mention of nanotechnology in this radio broadcast/podcast and transcript, but it does provide more information about BAM (for many folks this was the first time they’d heard about the project) and the hopes and concerns the project raises, while linking it to the Human Genome Project. From the Mar. 31, 2013 posting of a transcript and radio (KERA News; a National Public Radio station) podcast titled, Somewhere Over the Rainbow: The Journey to Map the Human Brain,

During the State of the Union, President Obama said the nation is about to embark on an ambitious project: to examine the human brain and create a road map to the trillions of connections that make it work.

“Every dollar we invested to map the human genome returned $140 to our economy — every dollar,” the president said. “Today, our scientists are mapping the human brain to unlock the answers to Alzheimer’s.”

Details of the project have slowly been leaking out: $3 billion, 10 years of research and hundreds of scientists. The National Institutes of Health is calling it the Brain Activity Map.

Obama isn’t the first to tout the benefits of a huge government science project. But can these projects really deliver? And what is mapping the human brain really going to get us?

Whether one wants to call it a public relations campaign or a marketing campaign is irrelevant. Science does not take place in an environment where data and projects are considered dispassionately. Enormous amounts of money are spent to sway public opinion and policymakers’ decisions.

ETA Apr. 3, 2013: Here are more stories about BAM and the announcement:

BRAIN Initiative Launched to Unlock Mysteries of Human Mind

Obama’s BRAIN Only 1/13 The Size Of Europe’s

BRAIN Initiative Builds on Efforts of Leading Neuroscientists and Nanotechnologists