Tag Archives: Ontario

ARPICO November 13, 2018 event in Vancouver (Canada): The Mysterious Dark-Side of the Universe: From Quarks to the Big Bang with Dark Matter

The Society of Italian Researchers and Professionals in Western Canada (ARPICO) is hosting a physics event for those of us who don’t have PhDs in physics. From an October 24, 2018 ARPICO announcement (received via email),

The second event of ARPICO’s fall 2018 activity will take place on Tuesday, November 13th, 2018 at the Roundhouse Community Centre (Room B). Our speaker will be Dr. Pietro Giampa, a physicist who recently joined the ranks of the TRIUMF laboratories [Canada’s particle accelerator centre and, formerly, Canada’s National Laboratory for Particle and Nuclear Physics] here in Vancouver. Dr. Giampa will give us an intriguing and, importantly, layperson-intelligible overview on the state of our knowledge of the universe especially in regards to so-called dark matter, a chapter of physics that the most complete theoretical model to-date cannot explain. We will learn, among other things, about an ambitious experiment (set up in a Canadian mine!) [emphasis mine] to detect neutrinos, fundamental and very elusive particles of our  cosmos. You can read a summary of Pietro Giampa’s lecture as well as his short professional biography below.

We look forward to seeing everyone there.

The evening agenda is as follows:

  • 6:30 pm – Doors Open for Registration
  • 7:00 pm – Start of the evening event with introductions & lecture by Dr. Pietro Giampa
  • ~8:15 pm – Q & A Period
  • to follow – Mingling & Refreshments until about 9:30 pm

If you have not already done so, please register for the event by visiting the EventBrite link or RSVPing to info@arpico.ca.

Further details are also available at arpico.ca and Eventbrite.

More details from the email announcement,

The Mysterious Dark-Side of the Universe: From Quarks to the Big Bang with Dark Matter

Understanding the true nature of our universe is one of the most fundamental quests of our society. The path of knowledge acquisition in that quest has led us to the hypothesis of “dark matter”, that is, a large proportion of the mass of the universe which appears invisible. In this lecture, with minimal technical language we will journey through the structure and evolution of the universe, from subatomic particles to the big bang, which gave rise to our universe, in an ultimate quest to describe the dark side of the universe called dark matter. We will review what we have learnt thus far about dark matter, and get an in-depth look at how scientists are searching for something that cannot be seen.

Dr. Pietro Giampa originally completed his undergraduate in physics at Royal Holloway University of London in the UK, where he wrote a thesis on SuperSymmetry Searches with the ATLAS Detector (so LHC related). Following his undergraduate, he completed a Master Degree in particle physics at the same institute where he developed a novel technique for directional detection of neutrons. It was after his master that he moved to Canada to complete his Ph.D at Queen’s University in Particle Astrophysics, working on the DEAP-3600 Experiment with Nobel laureate Prof. Arthur McDonald. In the summer of 2017 he moved to TRIUMF, where he is currently the Otto Hausser Fellow. At TRIUMF he continues his research for new forms of physics, by studying Dark Matter and Ultra-Cold Neutrons.

 


WHEN: Tuesday, November 13th, 2018 at 7:00pm (doors open at 6:30pm)

WHERE: Roundhouse Community Centre, Room B – 181 Roundhouse Mews, Vancouver, BC, V6Z 2W3

RSVP: Please RSVP at EventBrite (https://mysteryofdarkmatter.eventbrite.ca/) or email info@arpico.ca


Tickets are Needed

  • Tickets are FREE, but all individuals are requested to obtain “free-admission” tickets on EventBrite site due to limited seating at the venue. Organizers need accurate registration numbers to manage wait lists and prepare name tags.
  • All ARPICO events are 100% staffed by volunteer organizers and helpers; however, room rental, stationery, and guest refreshments are costs incurred and underwritten by members of ARPICO. Therefore, to be fair, all audience participants are asked to donate to the best of their ability at the door or via EventBrite to help defray the costs of the event.

FAQs

  • Where can I contact the organizer with any questions? info@arpico.ca
  • Do I have to bring my printed ticket to the event? No, you do not. Your name will be on our Registration List at the Check-in Desk.
  • Is my registration/ticket transferable? If you are unable to attend, another person may use your ticket. Please send us an email at info@arpico.ca about the substitution so we can correct our audience Registration List and prepare guest name tags.
  • Can I update my registration information? Yes. If you have any questions, contact us at info@arpico.ca
  • I am having trouble using EventBrite and cannot reserve my ticket(s). Can someone at ARPICO help me with my ticket reservation? Of course, simply send your ticket request to us at info@arpico.ca so we can help you.

What are my transport/parking options?

  • Bus/Train: The Canada Line Yaletown Skytrain station is a 1 minute walk from the Roundhouse Community Centre.
  • Parking: Pay Parking is underground at the community centre. Access is available via Drake Street.

With regard to the Canadian mine and neutrino experiments, I hunted down a little more information (from an October 6, 2015 article by Kate Allen for thestar.com), Note: A link has been removed,

Canadian physicist Arthur B. McDonald has won the Nobel Prize for discoveries about the behaviour of a mysterious solar particle, teased from an experiment buried two kilometres below Sudbury [Ontario].

The Queen’s University professor emeritus was honoured for co-discovering that elusive particles known as neutrinos can change their identity — or “oscillate” — as they travel from the sun. It proved that neutrinos must have mass, a finding that upset the Standard Model of particle physics and opened new avenues for research into the fundamental properties of the universe.

McDonald, 72, shares the prize with Takaaki Kajita, whose Japanese collaboration made the same discovery with slightly different methods.

To measure solar neutrinos, McDonald and a 130-person international team built a massive detector in an operational copper mine southwest of Sudbury. …

To solve this problem, McDonald and his colleagues dreamt up SNO. Deep in an INCO mine (now owned by Vale), protected from cosmic radiation constantly bombarding the earth’s surface, the scientists installed a 12-metre-wide acrylic vessel filled with 1,000 tonnes of ultra-pure heavy water. The vessel was surrounded by a geodesic sphere equipped with 9,456 light sensors. The whole thing was sunk in a 34-metre-high cavity filled with regular water.

When neutrinos hit the heavy water, an event that occurred about 10 times a day, they emitted a flash of light, which researchers could analyze to measure the particles’ properties.

Allen’s article has more details for anyone who might want to read up on neutrinos. Regardless, I’m sure Dr. Giampa is fully prepared to guide the uninitiated into the mysteries of the universe as they pertain to dark matter, neutrinos, and ultra-cold neutrons.

The Hedy Lamarr of international research: Canada’s Third assessment of The State of Science and Technology and Industrial Research and Development in Canada (2 of 2)

Taking up from where I left off with my comments on Competing in a Global Innovation Economy: The Current State of R and D in Canada or, as I prefer to call it, the Third assessment of Canada’s S&T (science and technology) and R&D (research and development). (Part 1 for anyone who missed it.)

Is it possible to get past Hedy?

Interestingly (to me anyway), one of our R&D strengths, the visual and performing arts, features sectors where a preponderance of people are dedicated to creating culture in Canada and don’t spend a lot of time trying to make money so they can retire before the age of 40 as so many of our start-up founders do. (Retiring before the age of 40 just reminded me of Hollywood actresses [Hedy] who found, and still find, that work is hard to come by after that age. You may be able but I’m not sure I can get past Hedy.) Perhaps our business people (start-up founders) could take a leaf out of the visual and performing arts handbook? Or, not. There is another question.

Does it matter if we continue to be a ‘branch plant’ economy? Somebody once posed that question to me when I was grumbling that our start-ups never led to larger businesses and acted more like incubators (which could describe our R&D as well). He noted that Canadians have a pretty good standard of living and we’ve been running things this way for over a century and it seems to work for us. Is it that bad? I didn’t have an answer for him then and I don’t have one now, but I think it’s a useful question to ask, and no one on this (2018) expert panel or the previous expert panel (2013) seems to have asked it.

I appreciate that the panel was constrained by the questions given by the government but, given how they snuck in a few items that, technically speaking, were not part of their remit, I’m thinking they might have gone just a bit further. The problem with answering the questions as asked is that if you’ve got the wrong questions, your answers will be garbage (GIGO: garbage in, garbage out) or, as is said where science is concerned, it all comes down to the quality of your questions.

On that note, I would have liked to know more about the survey of top-cited researchers. I think looking at the questions could have been quite illuminating, and I would have liked some information on where (geographically and by area of specialization) most of the answers came from. In keeping with past practice (2012 assessment published in 2013), there is no additional information offered about the survey questions or results. Still, there was this (from the report released April 10, 2018; Note: There may be some difference between the formatting seen here and that seen in the document),

3.1.2 International Perceptions of Canadian Research
As with the 2012 S&T report, the CCA commissioned a survey of top-cited researchers’ perceptions of Canada’s research strength in their field or subfield relative to that of other countries (Section 1.3.2). Researchers were asked to identify the top five countries in their field and subfield of expertise: 36% of respondents (compared with 37% in the 2012 survey) from across all fields of research rated Canada in the top five countries in their field (Figure B.1 and Table B.1 in the appendix). Canada ranks fourth out of all countries, behind the United States, United Kingdom, and Germany, and ahead of France. This represents a change of about 1 percentage point from the overall results of the 2012 S&T survey. There was a 4 percentage point decrease in how often France is ranked among the top five countries; the ordering of the top five countries, however, remains the same.

When asked to rate Canada’s research strength among other advanced countries in their field of expertise, 72% (4,005) of respondents rated Canadian research as “strong” (corresponding to a score of 5 or higher on a 7-point scale) compared with 68% in the 2012 S&T survey (Table 3.4). [pp. 40-41 Print; pp. 78-79 PDF]

Before I forget, there was mention of the international research scene,

Growth in research output, as estimated by number of publications, varies considerably for the 20 top countries. Brazil, China, India, Iran, and South Korea have had the most significant increases in publication output over the last 10 years. [emphases mine] In particular, the dramatic increase in China’s output means that it is closing the gap with the United States. In 2014, China’s output was 95% of that of the United States, compared with 26% in 2003. [emphasis mine]

Table 3.2 shows the Growth Index (GI), a measure of the rate at which the research output for a given country changed between 2003 and 2014, normalized by the world growth rate. If a country’s growth in research output is higher than the world average, the GI score is greater than 1.0. For example, between 2003 and 2014, China’s GI score was 1.50 (i.e., 50% greater than the world average) compared with 0.88 and 0.80 for Canada and the United States, respectively. Note that the dramatic increase in publication production of emerging economies such as China and India has had a negative impact on Canada’s rank and GI score (see CCA, 2016).
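The GI arithmetic quoted above can be sketched in a few lines. A caveat: the formula here is my reading of the report’s description (a country’s output growth between two years, divided by world output growth over the same period); the report doesn’t spell out its calculation, and the publication counts below are invented purely for illustration.

```python
def growth_index(country_start, country_end, world_start, world_end):
    """One plausible reading of the report's Growth Index (GI):
    a country's publication-output growth over a period, normalized by
    world growth over the same period. A score above 1.0 means the
    country grew faster than the world average."""
    country_growth = country_end / country_start
    world_growth = world_end / world_start
    return country_growth / world_growth

# Invented counts: a country that triples its output while the world
# doubles gets a GI of 1.5, matching the report's gloss of China's score
# as "50% greater than the world average".
print(growth_index(100, 300, 100, 200))  # 1.5
```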

As long as I’ve been blogging (10 years), the international research community (in particular the US) has been looking over its shoulder at China.

Patents and intellectual property

As an inventor, Hedy got more than one patent. Much has been made of the fact that, despite an agreement, the US Navy did not pay her or her partner (George Antheil) for work that would lead to significant military use (apparently, it was instrumental in the Bay of Pigs incident, for those familiar with that bit of history), as well as GPS, WiFi, Bluetooth, and more.

Some comments about patents. They are meant to encourage more innovation by ensuring that creators/inventors get paid for their efforts. This is true for a set time period and, when it’s over, other people get access and can innovate further. A patent is not intended to be a lifelong (or inheritable) source of income. The issue in Lamarr’s case is that the navy developed the technology during the patent’s term without telling either her or her partner so, of course, it didn’t need to compensate them despite the original agreement. They really should have paid her and Antheil.

The current patent situation, particularly in the US, is vastly different from the original vision. These days, patents are often used as weapons designed to halt innovation. One item that should be noted is that the Canadian federal budget indirectly addressed patent misuse (from my March 16, 2018 posting),

Surprisingly, no one else seems to have mentioned a new (?) intellectual property strategy introduced in the document (from Chapter 2: Progress; scroll down about 80% of the way, Note: The formatting has been changed),

Budget 2018 proposes measures in support of a new Intellectual Property Strategy to help Canadian entrepreneurs better understand and protect intellectual property, and get better access to shared intellectual property.

What Is a Patent Collective?
A Patent Collective is a way for firms to share, generate, and license or purchase intellectual property. The collective approach is intended to help Canadian firms ensure a global “freedom to operate”, mitigate the risk of infringing a patent, and aid in the defence of a patent infringement suit.

Budget 2018 proposes to invest $85.3 million over five years, starting in 2018–19, with $10 million per year ongoing, in support of the strategy. The Minister of Innovation, Science and Economic Development will bring forward the full details of the strategy in the coming months, including the following initiatives to increase the intellectual property literacy of Canadian entrepreneurs, and to reduce costs and create incentives for Canadian businesses to leverage their intellectual property:

  • To better enable firms to access and share intellectual property, the Government proposes to provide $30 million in 2019–20 to pilot a Patent Collective. This collective will work with Canada’s entrepreneurs to pool patents, so that small and medium-sized firms have better access to the critical intellectual property they need to grow their businesses.
  • To support the development of intellectual property expertise and legal advice for Canada’s innovation community, the Government proposes to provide $21.5 million over five years, starting in 2018–19, to Innovation, Science and Economic Development Canada. This funding will improve access for Canadian entrepreneurs to intellectual property legal clinics at universities. It will also enable the creation of a team in the federal government to work with Canadian entrepreneurs to help them develop tailored strategies for using their intellectual property and expanding into international markets.
  • To support strategic intellectual property tools that enable economic growth, Budget 2018 also proposes to provide $33.8 million over five years, starting in 2018–19, to Innovation, Science and Economic Development Canada, including $4.5 million for the creation of an intellectual property marketplace. This marketplace will be a one-stop, online listing of public sector-owned intellectual property available for licensing or sale to reduce transaction costs for businesses and researchers, and to improve Canadian entrepreneurs’ access to public sector-owned intellectual property.

The Government will also consider further measures, including through legislation, in support of the new intellectual property strategy.

Helping All Canadians Harness Intellectual Property
Intellectual property is one of our most valuable resources, and every Canadian business owner should understand how to protect and use it.

To better understand what groups of Canadians are benefiting the most from intellectual property, Budget 2018 proposes to provide Statistics Canada with $2 million over three years to conduct an intellectual property awareness and use survey. This survey will help identify how Canadians understand and use intellectual property, including groups that have traditionally been less likely to use intellectual property, such as women and Indigenous entrepreneurs. The results of the survey should help the Government better meet the needs of these groups through education and awareness initiatives.

The Canadian Intellectual Property Office will also increase the number of education and awareness initiatives that are delivered in partnership with business, intermediaries and academia to ensure Canadians better understand, integrate and take advantage of intellectual property when building their business strategies. This will include targeted initiatives to support underrepresented groups.

Finally, Budget 2018 also proposes to invest $1 million over five years to enable representatives of Canada’s Indigenous Peoples to participate in discussions at the World Intellectual Property Organization related to traditional knowledge and traditional cultural expressions, an important form of intellectual property.

It’s not wholly clear what they mean by ‘intellectual property’. The focus seems to be on patents, as they are the only form of intellectual property (as opposed to copyright and trademarks) singled out in the budget. As for how the ‘patent collective’ is going to meet all its objectives, this budget supplies no clarity on the matter. On the plus side, I’m glad to see that indigenous peoples’ knowledge is being acknowledged as “an important form of intellectual property” and I hope the discussions at the World Intellectual Property Organization are fruitful.

As for the patent situation in Canada (from the report released April 10, 2018),

Over the past decade, the Canadian patent flow in all technical sectors has consistently decreased. Patent flow provides a partial picture of how patents in Canada are exploited. A negative flow represents a deficit of patented inventions owned by Canadian assignees versus the number of patented inventions created by Canadian inventors. The patent flow for all Canadian patents decreased from about −0.04 in 2003 to −0.26 in 2014 (Figure 4.7). This means that there is an overall deficit of 26% of patent ownership in Canada. In other words, fewer patents were owned by Canadian institutions than were invented in Canada.

This is a significant change from 2003 when the deficit was only 4%. The drop is consistent across all technical sectors in the past 10 years, with Mechanical Engineering falling the least, and Electrical Engineering the most (Figure 4.7). At the technical field level, the patent flow dropped significantly in Digital Communication and Telecommunications. For example, the Digital Communication patent flow fell from 0.6 in 2003 to −0.2 in 2014. This fall could be partially linked to Nortel’s US$4.5 billion patent sale [emphasis mine] to the Rockstar consortium (which included Apple, BlackBerry, Ericsson, Microsoft, and Sony) (Brickley, 2011). Food Chemistry and Microstructural [?] and Nanotechnology both also showed a significant drop in patent flow. [p. 83 Print; p. 121 PDF]
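The flow figures in the quotation are consistent with reading patent flow as the gap between patents owned by Canadian assignees and patents invented by Canadians, expressed as a fraction of the latter. The report doesn’t give the formula, so the sketch below is an assumption on my part, with invented counts.

```python
def patent_flow(owned_by_assignees, invented_domestically):
    """Assumed definition of the report's patent-flow measure: the
    surplus (positive) or deficit (negative) of domestically owned
    patents relative to domestically invented patents, as a fraction
    of inventions."""
    return (owned_by_assignees - invented_domestically) / invented_domestically

# Invented counts: owning 74 patents against 100 invented gives a flow
# of -0.26, i.e., the 26% ownership deficit described in the quotation.
print(patent_flow(74, 100))  # -0.26
```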

Despite a fall in the number of patents for ‘Digital Communication’, we’re still doing well according to statistics elsewhere in this report. Is it possible that patents aren’t that big a deal? Of course, it’s also possible that we are enjoying the benefits of past work and will miss out on future work. (Note: A video of the April 10, 2018 report presentation by Max Blouw features him saying something like that.)

One last note, Nortel died many years ago. Disconcertingly, this report, despite more than one reference to Nortel, never mentions the company’s demise.

Boxed text

While the expert panel wasn’t tasked to answer certain types of questions, as I’ve noted earlier, they managed to sneak in a few items. One of the strategies they used was putting special inserts into text boxes, including this (from the report released April 10, 2018),

Box 4.2
The FinTech Revolution

Financial services is a key industry in Canada. In 2015, the industry accounted for 4.4% of Canadian jobs and about 7% of Canadian GDP (Burt, 2016). Toronto is the second largest financial services hub in North America and one of the most vibrant research hubs in FinTech. Since 2010, more than 100 start-up companies have been founded in Canada, attracting more than $1 billion in investment (Moffatt, 2016). In 2016 alone, venture-backed investment in Canadian financial technology companies grew by 35% to $137.7 million (Ho, 2017). The Toronto Financial Services Alliance estimates that there are approximately 40,000 ICT specialists working in financial services in Toronto alone.

AI, blockchain, [emphasis mine] and other results of ICT research provide the basis for several transformative FinTech innovations including, for example, decentralized transaction ledgers, cryptocurrencies (e.g., bitcoin), and AI-based risk assessment and fraud detection. These innovations offer opportunities to develop new markets for established financial services firms, but also provide entry points for technology firms to develop competing service offerings, increasing competition in the financial services industry. In response, many financial services companies are increasing their investments in FinTech companies (Breznitz et al., 2015). By their own account, the big five banks invest more than $1 billion annually in R&D of advanced software solutions, including AI-based innovations (J. Thompson, personal communication, 2016). The banks are also increasingly investing in university research and collaboration with start-up companies. For instance, together with several large insurance and financial management firms, all big five banks have invested in the Vector Institute for Artificial Intelligence (Kolm, 2017).

I’m glad to see the mention of blockchain while AI (artificial intelligence) is an area where we have innovated (from the report released April 10, 2018),

AI has attracted researchers and funding since the 1960s; however, there were periods of stagnation in the 1970s and 1980s, sometimes referred to as the “AI winter.” During this period, the Canadian Institute for Advanced Research (CIFAR), under the direction of Fraser Mustard, started supporting AI research with a decade-long program called Artificial Intelligence, Robotics and Society, [emphasis mine] which was active from 1983 to 1994. In 2004, a new program called Neural Computation and Adaptive Perception was initiated and renewed twice in 2008 and 2014 under the title, Learning in Machines and Brains. Through these programs, the government provided long-term, predictable support for high-risk research that propelled Canadian researchers to the forefront of global AI development. In the 1990s and early 2000s, Canadian research output and impact on AI were second only to that of the United States (CIFAR, 2016). NSERC has also been an early supporter of AI. According to its searchable grant database, NSERC has given funding to research projects on AI since at least 1991–1992 (the earliest searchable year) (NSERC, 2017a).

The University of Toronto, the University of Alberta, and the Université de Montréal have emerged as international centres for research in neural networks and deep learning, with leading experts such as Geoffrey Hinton and Yoshua Bengio. Recently, these locations have expanded into vibrant hubs for research in AI applications with a diverse mix of specialized research institutes, accelerators, and start-up companies, and growing investment by major international players in AI development, such as Microsoft, Google, and Facebook. Many highly influential AI researchers today are either from Canada or have at some point in their careers worked at a Canadian institution or with Canadian scholars.

As international opportunities in AI research and the ICT industry have grown, many of Canada’s AI pioneers have been drawn to research institutions and companies outside of Canada. According to the OECD, Canada’s share of patents in AI declined from 2.4% in 2000 to 2005 to 2% in 2010 to 2015. Although Canada is the sixth largest producer of top-cited scientific publications related to machine learning, firms headquartered in Canada accounted for only 0.9% of all AI-related inventions from 2012 to 2014 (OECD, 2017c). Canadian AI researchers, however, remain involved in the core nodes of an expanding international network of AI researchers, most of whom continue to maintain ties with their home institutions. Compared with their international peers, Canadian AI researchers are engaged in international collaborations far more often than would be expected by Canada’s level of research output, with Canada ranking fifth in collaboration. [p. 97-98 Print; p. 135-136 PDF]

The only mention of robotics seems to be here in this section, and it’s only in passing. This is a bit surprising given its global importance. I wonder if robotics has been somehow hidden inside the term artificial intelligence, although sometimes it’s vice versa, with ‘robot’ being used to describe artificial intelligence. I’m noticing this trend of assuming the terms are synonymous or interchangeable not just in Canadian publications but elsewhere too. ’nuff said.

Getting back to the matter at hand, the report does note that patenting (technometric data) is problematic (from the report released April 10, 2018),

The limitations of technometric data stem largely from their restricted applicability across areas of R&D. Patenting, as a strategy for IP management, is similarly limited in not being equally relevant across industries. Trends in patenting can also reflect commercial pressures unrelated to R&D activities, such as defensive or strategic patenting practices. Finally, taxonomies for assessing patents are not aligned with bibliometric taxonomies, though links can be drawn to research publications through the analysis of patent citations. [p. 105 Print; p. 143 PDF]

It’s interesting to me that they make reference to many of the same issues that I mention but they seem to forget and don’t use that information in their conclusions.

There is one other piece of boxed text I want to highlight (from the report released April 10, 2018),

Box 6.3
Open Science: An Emerging Approach to Create New Linkages

Open Science is an umbrella term to describe collaborative and open approaches to undertaking science, which can be powerful catalysts of innovation. This includes the development of open collaborative networks among research performers, such as the private sector, and the wider distribution of research that usually results when restrictions on use are removed. Such an approach triggers faster translation of ideas among research partners and moves the boundaries of pre-competitive research to later, applied stages of research. With research results freely accessible, companies can focus on developing new products and processes that can be commercialized.

Two Canadian organizations exemplify the development of such models. In June 2017, Genome Canada, the Ontario government, and pharmaceutical companies invested $33 million in the Structural Genomics Consortium (SGC) (Genome Canada, 2017). Formed in 2004, the SGC is at the forefront of the Canadian open science movement and has contributed to many key research advancements towards new treatments (SGC, 2018). McGill University’s Montréal Neurological Institute and Hospital has also embraced the principles of open science. Since 2016, it has been sharing its research results with the scientific community without restriction, with the objective of expanding “the impact of brain research and accelerat[ing] the discovery of ground-breaking therapies to treat patients suffering from a wide range of devastating neurological diseases” (neuro, n.d.).

This is exciting stuff and I’m happy the panel featured it. (I wrote about the Montréal Neurological Institute initiative in a Jan. 22, 2016 posting.)

More than once, the report notes the difficulties with using bibliometric and technometric data as measures of scientific achievement and progress; open science (along with its cousins, open data and open access) is contributing to those difficulties, as James Somers notes in his April 5, 2018 article ‘The Scientific Paper is Obsolete’ for The Atlantic (Note: Links have been removed),

The scientific paper—the actual form of it—was one of the enabling inventions of modernity. Before it was developed in the 1600s, results were communicated privately in letters, ephemerally in lectures, or all at once in books. There was no public forum for incremental advances. By making room for reports of single experiments or minor technical advances, journals made the chaos of science accretive. Scientists from that point forward became like the social insects: They made their progress steadily, as a buzzing mass.

The earliest papers were in some ways more readable than papers are today. They were less specialized, more direct, shorter, and far less formal. Calculus had only just been invented. Entire data sets could fit in a table on a single page. What little “computation” contributed to the results was done by hand and could be verified in the same way.

The more sophisticated science becomes, the harder it is to communicate results. Papers today are longer than ever and full of jargon and symbols. They depend on chains of computer programs that generate data, and clean up data, and plot data, and run statistical models on data. These programs tend to be both so sloppily written and so central to the results that it’s [sic] contributed to a replication crisis, or put another way, a failure of the paper to perform its most basic task: to report what you’ve actually discovered, clearly enough that someone else can discover it for themselves.

Perhaps the paper itself is to blame. Scientific methods evolve now at the speed of software; the skill most in demand among physicists, biologists, chemists, geologists, even anthropologists and research psychologists, is facility with programming languages and “data science” packages. And yet the basic means of communicating scientific results hasn’t changed for 400 years. Papers may be posted online, but they’re still text and pictures on a page.

What would you get if you designed the scientific paper from scratch today? A little while ago I spoke to Bret Victor, a researcher who worked at Apple on early user-interface prototypes for the iPad and now runs his own lab in Oakland, California, that studies the future of computing. Victor has long been convinced that scientists haven’t yet taken full advantage of the computer. “It’s not that different than looking at the printing press, and the evolution of the book,” he said. After Gutenberg, the printing press was mostly used to mimic the calligraphy in bibles. It took nearly 100 years of technical and conceptual improvements to invent the modern book. “There was this entire period where they had the new technology of printing, but they were just using it to emulate the old media.”

Victor gestured at what might be possible when he redesigned a journal article by Duncan Watts and Steven Strogatz, “Collective dynamics of ‘small-world’ networks.” He chose it both because it’s one of the most highly cited papers in all of science and because it’s a model of clear exposition. (Strogatz is best known for writing the beloved “Elements of Math” column for The New York Times.)

The Watts-Strogatz paper described its key findings the way most papers do, with text, pictures, and mathematical symbols. And like most papers, these findings were still hard to swallow, despite the lucid prose. The hardest parts were the ones that described procedures or algorithms, because these required the reader to “play computer” in their head, as Victor put it, that is, to strain to maintain a fragile mental picture of what was happening with each step of the algorithm. Victor’s redesign interleaved the explanatory text with little interactive diagrams that illustrated each step. In his version, you could see the algorithm at work on an example. You could even control it yourself….
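For readers who would rather not “play computer” in their heads, here is a minimal Python sketch of the kind of procedure the Watts-Strogatz paper describes: build a ring lattice, then rewire each edge with some probability. This is an illustrative toy only, not Victor’s interactive redesign, and the parameter names (n nodes, k nearest neighbours, rewiring probability p) are my shorthand for the paper’s notation:

```python
import random

def watts_strogatz(n, k, p, seed=42):
    """Ring of n nodes, each joined to its k nearest neighbours,
    with each edge rewired with probability p (the 'small-world' knob)."""
    rng = random.Random(seed)
    # Undirected edges stored as (low, high) tuples.
    edges = set()
    for i in range(n):
        for j in range(1, k // 2 + 1):
            a, b = i, (i + j) % n
            edges.add((min(a, b), max(a, b)))
    # Rewire: for each original edge, with probability p reconnect
    # one endpoint to a random node, avoiding self-loops and repeats.
    for a, b in sorted(edges):
        if rng.random() < p:
            candidates = [c for c in range(n)
                          if c != a and (min(a, c), max(a, c)) not in edges]
            if candidates:
                edges.remove((a, b))
                c = rng.choice(candidates)
                edges.add((min(a, c), max(a, c)))
    return edges

g = watts_strogatz(n=20, k=4, p=0.1)
print(len(g))  # edge count stays n*k/2 = 40; only the wiring changes
```

Even a static sketch like this makes Victor’s point: seeing the rewiring step as runnable code (or, better, an interactive diagram) is far easier than reconstructing it from prose.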

For anyone interested in the evolution of how science is conducted and communicated, Somers’ article is a fascinating and in-depth look at future possibilities.

Subregional R&D

I didn’t find this quite as compelling as the last time, perhaps because there’s less information; I think the 2012 report was the first to examine the Canadian R&D scene through a subregional (in that case, provincial) lens. On a high note, this report also covers cities (!) and regions, as well as provinces.

Here’s the conclusion (from the report released April 10, 2018),

Ontario leads Canada in R&D investment and performance. The province accounts for almost half of R&D investment and personnel, research publications and collaborations, and patents. R&D activity in Ontario produces high-quality publications in each of Canada’s five R&D strengths, reflecting both the quantity and quality of universities in the province. Quebec lags Ontario in total investment, publications, and patents, but performs as well (citations) or better (R&D intensity) by some measures. Much like Ontario, Quebec researchers produce impactful publications across most of Canada’s five R&D strengths. Although it invests an amount similar to that of Alberta, British Columbia does so at a significantly higher intensity. British Columbia also produces more highly cited publications and patents, and is involved in more international research collaborations. R&D in British Columbia and Alberta clusters around Vancouver and Calgary in areas such as physics and ICT and in clinical medicine and energy, respectively. [emphasis mine] Smaller but vibrant R&D communities exist in the Prairies and Atlantic Canada [the Maritime provinces plus Newfoundland and Labrador] (and, to a lesser extent, in the Territories) in natural resource industries. [pp.  132-133 Print; pp. 170-171 PDF]

Globally, as urban populations expand exponentially, cities are likely to drive innovation and wealth creation at an increasing rate in the future. In Canada, R&D activity clusters around five large cities: Toronto, Montréal, Vancouver, Ottawa, and Calgary. These five cities create patents and high-tech companies at nearly twice the rate of other Canadian cities. They also account for half of clusters in the services sector, and many in advanced manufacturing.

Many clusters relate to natural resources and long-standing areas of economic and research strength. Natural resource clusters have emerged around the location of resources, such as forestry in British Columbia, oil and gas in Alberta, agriculture in Ontario, mining in Quebec, and maritime resources in Atlantic Canada. The automotive, plastics, and steel industries have the most individual clusters as a result of their economic success in Windsor, Hamilton, and Oshawa. Advanced manufacturing industries tend to be more concentrated, often located near specialized research universities. Strong connections between academia and industry are often associated with these clusters. R&D activity is distributed across the country, varying both between and within regions. It is critical to avoid drawing the wrong conclusion from this fact. This distribution does not imply the existence of a problem that needs to be remedied. Rather, it signals the benefits of diverse innovation systems, with differentiation driven by the needs of and resources available in each province. [pp.  132-133 Print; pp. 170-171 PDF]

Intriguingly, there’s no mention that British Columbia (BC) has its own leading areas of research: Visual & Performing Arts, Psychology & Cognitive Sciences, and Clinical Medicine (according to the table on p. 117 Print, p. 153 PDF).

As I said and hinted earlier, we’ve got brains; they’re just not the kind of brains that command respect.

Final comments

My hat’s off to the expert panel and staff of the Council of Canadian Academies. Combining two previous reports into one could not have been easy. As well, kudos for their attempts to broaden the discussion by mentioning initiatives such as open science and for emphasizing the problems with bibliometrics, technometrics, and other measures. I have covered only parts of this assessment (Competing in a Global Innovation Economy: The Current State of R&D in Canada); there’s a lot more to it, including a substantive list of reference materials (bibliography).

While I have argued that perhaps the situation isn’t quite as bad as the headlines and statistics may suggest, there are some concerning trends for Canadians. We also have to acknowledge that many countries have stepped up their research game, and that’s good for all of us. You don’t get better at anything unless you work and play with others who are better than you are. For example, both India and Italy surpassed us in numbers of published research papers; we slipped from 7th place to 9th. Thank you, Italy and India. (And, Happy ‘Italian Research in the World Day’ on April 15, 2018, in its inaugural year. In Italian: Piano Straordinario “Vivere all’Italiana” – Giornata della ricerca Italiana nel mondo.)

Unfortunately, the reading is harder going than previous R&D assessments in the CCA catalogue. And in the end, I can’t help thinking we’re just a little bit like Hedy Lamarr. Not really appreciated in all of our complexities although the expert panel and staff did try from time to time. Perhaps the government needs to find better ways of asking the questions.

***ETA April 12, 2018 at 1500 PDT: Talk about missing the obvious! I’ve been ranting on about how research strength in visual and performing arts, philosophy and theology, etc. is perfectly fine and could lead to ‘traditional’ science breakthroughs, without underlining the point by noting that Antheil was a musician and Lamarr an actress, and that their signature work set the foundation for electrical engineers (or people with that specialty) to develop WiFi, etc.***

There is, by the way, a Hedy-Canada connection. In 1998, she sued the Canadian software company Corel for its unauthorized use of her image on their Corel Draw 8 product packaging. She won.

More stuff

For those who’d like to see and hear the April 10, 2018 launch for “Competing in a Global Innovation Economy: The Current State of R&D in Canada,” or the Third Assessment as I think of it, go here.

The report can be found here.

For anyone curious about ‘Bombshell: The Hedy Lamarr Story’ to be broadcast on May 18, 2018 as part of PBS’s American Masters series, there’s this trailer,

For the curious, I did find out more about the Hedy Lamarr and Corel Draw lawsuit. John Lettice’s December 2, 1998 article in The Register describes the suit and her subsequent victory in less than admiring terms,

Our picture doesn’t show glamorous actress Hedy Lamarr, who yesterday [Dec. 1, 1998] came to a settlement with Corel over the use of her image on Corel’s packaging. But we suppose that following the settlement we could have used a picture of Corel’s packaging. Lamarr sued Corel earlier this year over its use of a CorelDraw image of her. The picture had been produced by John Corkery, who was 1996 Best of Show winner of the Corel World Design Contest. Corel now seems to have come to an undisclosed settlement with her, which includes a five-year exclusive (oops — maybe we can’t use the pack-shot then) licence to use “the lifelike vector illustration of Hedy Lamarr on Corel’s graphic software packaging”. Lamarr, bless ‘er, says she’s looking forward to the continued success of Corel Corporation,  …

There’s this excerpt from a Sept. 21, 2015 posting (a pictorial essay of Lamarr’s life) by Shahebaz Khan on The Blaze Blog,

6. CorelDRAW:
For several years beginning in 1997, the boxes of Corel DRAW’s software suites were graced by a large Corel-drawn image of Lamarr. The picture won Corel DRAW’s yearly software suite cover design contest in 1996. Lamarr sued Corel for using the image without her permission. Corel countered that she did not own rights to the image. The parties reached an undisclosed settlement in 1998.

There’s also a Nov. 23, 1998 Corel Draw 8 product review by Mike Gorman on mymac.com, which includes a screenshot of the packaging that precipitated the lawsuit. Once they settled, it seems Corel used her image at least one more time.

Canada’s ‘Smart Cities’ will need new technology (5G wireless) and, maybe, graphene

I recently published [March 20, 2018] a piece on ‘smart cities’ covering both an art/science event in Toronto and a Canadian government initiative, without mentioning the new technology needed to support all of the grand plans. On that note, it seems the Canadian federal government and two provincial governments (Québec and Ontario) are prepared to invest in one of those necessary ‘new’ technologies, 5G wireless. The Canadian Broadcasting Corporation’s (CBC) Shawn Benjamin reports on Canada’s 5G plans in suitably breathless (even in text-only) tones of excitement in a March 19, 2018 article,

The federal, Ontario and Quebec governments say they will spend $200 million to help fund research into 5G wireless technology, the next-generation networks with download speeds 100 times faster than current ones can handle.

The so-called “5G corridor,” known as ENCQOR, will see tech companies such as Ericsson, Ciena Canada, Thales Canada, IBM and CGI kick in another $200 million to develop facilities to get the project up and running.

The idea is to set up a network of linked research facilities and laboratories that these companies — and as many as 1,000 more across Canada — will be able to use to test products and services that run on 5G networks.

Benjamin’s description of 5G is focused on what it will make possible in the future,

If you think things are moving too fast, buckle up, because a new 5G cellular network is just around the corner and it promises to transform our lives by connecting nearly everything to a new, much faster, reliable wireless network.

The first networks won’t be operational for at least a few years, but technology and telecom companies around the world are already planning to spend billions to make sure they aren’t left behind, says Lawrence Surtees, a communications analyst with the research firm IDC.

The new 5G is no tentative baby step toward the future. Rather, as Surtees puts it, “the move from 4G to 5G is a quantum leap.”

In a downtown Toronto soundstage, Alan Smithson recently demonstrated a few virtual reality and augmented reality projects that his company MetaVRse is working on.

The potential for VR and AR technology is endless, he said, in large part for its potential to help hurdle some of the walls we are already seeing with current networks.

Virtual reality technology on the market today is continually increasing things like frame rates and screen resolutions in a constant quest to make devices even more lifelike.

… They [current 4G networks] can’t handle the load. But 5G can do so easily, Smithson said, so much so that the current era of bulky augmented reality headsets could be replaced by a pair of normal-looking glasses.

In a 5G world, those internet-connected glasses will automatically recognize everyone you meet, and possibly be able to overlay their name in your field of vision, along with a link to their online profile. …

Benjamin also mentions ‘smart cities’,

In a University of Toronto laboratory, Professor Alberto Leon-Garcia researches connected vehicles and smart power grids. “My passion right now is enabling smart cities — making smart cities a reality — and that means having much more immediate and detailed sense of the environment,” he said.

Faster 5G networks will assist his projects in many ways, by giving planners more instant data on things like traffic patterns, energy consumption, various carbon footprints and much more.

Leon-Garcia points to a brightly lit map of Toronto [image embedded in Benjamin’s article] in his office, and explains that every dot of light represents a sensor transmitting real time data.

Currently, the network is hooked up to things like city buses, traffic cameras and the city-owned fleet of shared bicycles. He currently has thousands of data points feeding him info on his map, but in a 5G world, the network will support about a million sensors per square kilometre.

Very exciting, but where is all this data going? What computers will be processing the information? Where are these sensors located? Benjamin does not venture into those waters, nor does The Economist in a February 13, 2018 article about 5G and the Olympic Games in Pyeongchang, South Korea, but the magazine does note another barrier to 5G implementation,

“FASTER, higher, stronger,” goes the Olympic motto. So it is only appropriate that the next generation of wireless technology, “5G” for short, should get its first showcase at the Winter Olympics under way in Pyeongchang, South Korea. Once fully developed, it is supposed to offer download speeds of at least 20 gigabits per second (4G manages about half that at best) and response times (“latency”) of below 1 millisecond. So the new networks will be able to transfer a high-resolution movie in two seconds and respond to requests in less than a hundredth of the time it takes to blink an eye. But 5G is not just about faster and swifter wireless connections.

The technology is meant to enable all sorts of new services. One such would offer virtual- or augmented-reality experiences. At the Olympics, for example, many contestants are being followed by 360-degree video cameras. At special venues sports fans can don virtual-reality goggles to put themselves right into the action. But 5G is also supposed to become the connective tissue for the internet of things, to link anything from smartphones to wireless sensors and industrial robots to self-driving cars. This will be made possible by a technique called “network slicing”, which allows operators quickly to create bespoke networks that give each set of devices exactly the connectivity they need.

Despite its versatility, it is not clear how quickly 5G will take off. The biggest brake will be economic. [emphasis mine] When the GSMA, an industry group, last year asked 750 telecoms bosses about the most salient impediment to delivering 5G, more than half cited the lack of a clear business case. People may want more bandwidth, but they are not willing to pay for it—an attitude even the lure of the fanciest virtual-reality applications may not change. …
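The arithmetic behind The Economist’s headline figures does check out, by the way. Here is a quick sanity check; note that the 5 GB movie size and 300 ms blink duration below are my own assumptions for illustration, not numbers from the article:

```python
# Sanity-check The Economist's 5G claims.
speed_gbps = 20          # claimed peak 5G download speed, gigabits per second
movie_gigabytes = 5      # assumed size of a high-resolution movie (my figure)
blink_ms = 300           # a blink takes roughly 100-400 ms (my figure)
latency_ms = 1           # claimed 5G response time

# Convert gigabytes to gigabits (x8), then divide by speed.
transfer_s = movie_gigabytes * 8 / speed_gbps
print(transfer_s)                   # 2.0 seconds, matching the claim
print(latency_ms < blink_ms / 100)  # True: under a hundredth of a blink
```

So a 5 GB movie at 20 Gbps takes exactly the two seconds claimed, and a 1 ms response is indeed less than a hundredth of a typical blink.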

That may not be the only brake. Dexter Johnson, in a March 19, 2018 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website), covers some of the others (Note: Links have been removed),

Graphene has been heralded as a “wonder material” for well over a decade now, and 5G has been marketed as the next big thing for at least the past five years. Analysts have suggested that 5G could be the golden ticket to virtual reality and artificial intelligence, and promised that graphene could improve technologies within electronics and optoelectronics.

But proponents of both graphene and 5G have also been accused of stirring up hype. There now seems to be a rising sense within industry circles that these glowing technological prospects will not come anytime soon.

At Mobile World Congress (MWC) in Barcelona last month [February 2018], some misgivings for these long promised technologies may have been put to rest, though, thanks in large part to each other.

In a meeting at MWC with Jari Kinaret, a professor at Chalmers University in Sweden and director of the Graphene Flagship, I took a guided tour around the Pavilion to see some of the technologies poised to have an impact on the development of 5G.

Being invited back to the MWC for three years is a pretty clear indication of how important graphene is to those who are trying to raise the fortunes of 5G. But just how important became more obvious to me in an interview with Frank Koppens, the leader of the quantum nano-optoelectronic group at Institute of Photonic Sciences (ICFO) just outside of Barcelona, last year.

He said: “5G cannot just scale. Some new technology is needed. And that’s why we have several companies in the Graphene Flagship that are putting a lot of pressure on us to address this issue.”

In a collaboration led by CNIT—a consortium of Italian universities and national laboratories focused on communication technologies—researchers from AMO GmbH, Ericsson, Nokia Bell Labs, and Imec have developed graphene-based photodetectors and modulators capable of receiving and transmitting optical data faster than ever before.

The aim of all this speed for transmitting data is to support the ultrafast data streams with extreme bandwidth that will be part of 5G. In fact, at another section during MWC, Ericsson was presenting the switching of a 100 Gigabits per second (Gbps) channel based on the technology.

“The fact that Ericsson is demonstrating another version of this technology demonstrates that from Ericsson’s point of view, this is no longer just research” said Kinaret.

It’s no mystery why the big mobile companies are jumping on this technology. Not only does it provide high-speed data transmission, but it also does it 10 times more efficiently than silicon or doped silicon devices, and will eventually do it more cheaply than those devices, according to Vito Sorianello, senior researcher at CNIT.

Interestingly, Ericsson is one of the tech companies mentioned with regard to Canada’s 5G project, ENCQOR, and Sweden’s Chalmers University, as Dexter Johnson notes, is the lead institution for the Graphene Flagship. One other fact to note: Canada’s resources include graphite mines with ‘premium’ flakes for producing graphene. Canada’s graphite mines are located (as far as I know) in only two Canadian provinces, Ontario and Québec, which also happen to be pitching money into ENCQOR. My March 21, 2018 posting describes the latest entry into the Canadian graphite mining stakes.

As for the questions I posed about processing power, etc., it seems the South Koreans have found answers of some kind, but it’s hard to evaluate as I haven’t found any additional information about 5G and its implementation in South Korea. If anyone has answers, please feel free to leave them in the ‘comments’. Thank you.

Graphite ‘gold’ rush?

Someone in Germany (I think) is very excited about graphite, more specifically, about graphite flakes located in the province of Québec, Canada. The person who wrote this news release might have wanted to run a search for ‘graphite’ and ‘gold rush’, though; the last graphite gold rush seems to have taken place in 2013.

Here’s the March 1, 2018 news release on PR Newswire (Cision) (Note: Some links have been removed),

PALM BEACH, Florida, March 1, 2018 /PRNewswire/ —

MarketNewsUpdates.com News Commentary

Much like the gold rush in North America in the 1800s, people are going out in droves searching for a different kind of precious metal, graphite. The thing your third grade pencils were made of is now one of the hottest commodities on the market. This graphite is not being mined by your run-of-the-mill old-timey soot covered prospectors anymore. Big mining companies are all looking for this important resource integral to the production of lithium ion batteries due to the rise in popularity of electric cars. These players include Graphite Energy Corp. (OTC: GRXXF) (CSE: GRE), Teck Resources Limited (NYSE: TECK), Nemaska Lithium (TSX: NMX), Lithium Americas Corp. (TSX: LAC), and Cruz Cobalt Corp. (TSX-V: CUZ) (OTC: BKTPF).

These companies, looking to manufacture their graphite-based products, have seen steady positive growth over the past year. Their development of cutting-edge new products seems to be paying off. But in order to continue innovating, these companies need the graphite to do it. One junior miner looking to capitalize on the growing demand for this commodity is Graphite Energy Corp.

Graphite Energy is a mining company that is focused on developing graphite resources. Graphite Energy’s state-of-the-art mining technology is friendly to the environment and has indicated graphite carbon (Cg) in the range of 2.20% to 22.30%, with an average of 10.50% Cg, from their Lac Aux Bouleaux Graphite Property in Southern Quebec [Canada].

Not Just Any Graphite Will Do

Graphite is one of the most in-demand technology metals required for a green and sustainable world. Demand is only set to increase as the need for lithium ion batteries grows, fueled by the popularity of electric vehicles. However, not all graphite is created equal. The price of natural graphite has more than doubled since 2013 as companies look to maintain environmental standards that synthetic graphite cannot meet due to its polluting manufacturing process. Synthetic graphite is also very expensive to produce, being derived from petroleum and costing up to ten times as much as natural graphite. Therefore, manufacturers are interested in increasing the proportion of natural graphite in their products in order to lower their costs.

High-grade large flake graphite is the solution to the environmental issues these companies are facing. But there is only so much supply to go around. Recent news by Graphite Energy Corp. on February 26th [2018] showed promising exploratory results. The announcement of the commencement of drilling is a positive step forward to meeting this increased demand.

Everything from batteries to solar panels needs to be made with this natural high-grade flake graphite, because what is the point of powering your home with the sun or charging your car if the products themselves do more harm than good to the environment when produced? However, supply consistency remains an issue, since raw material impurities vary from mine to mine. Certain types of battery technology already require graphite to be almost 100% pure, and it is very possible that purity requirements will increase in the future.

Natural graphite is also the basis of graphene, the uses of which seem limited only by scientists’ imaginations, given the host of new applications announced daily. In a recent study by ResearchSEA, a team from the Ocean University of China and Yunnan Normal University developed a highly efficient dye-sensitized solar cell using a graphene layer. This thin layer of graphene will allow solar panels to generate electricity when it rains.

Graphite Energy Is Keeping It Green

Whether it’s the graphite for the solar panels that will power the homes of tomorrow, or the lithium ion batteries that will fuel the latest cars, these advancements need to be made in an environmentally conscious way. Mining companies like Graphite Energy Corp. specialize in the production of environmentally friendly graphite. The company will be producing its supply of natural graphite with the lowest environmental footprint possible.

From Saltwater To Clean Water Using Graphite

The world’s freshwater supply is at risk of running out. In order to mitigate this global disaster, worldwide spending on desalination technology was an estimated $16.6 billion in 2016. Due to the recent intense droughts in California, the state has accelerated the construction of desalination plants. However, the operating costs and the impact on the environment due to energy requirements for the process, is hindering any real progress in the space, until now.

Jeffrey Grossman, a professor at MIT’s [Massachusetts Institute of Technology, United States] Department of Materials Science and Engineering (DMSE), has been looking into whether graphite/graphene might reduce the cost of desalination.

“A billion people around the world lack regular access to clean water, and that’s expected to more than double in the next 25 years,” Grossman says. “Desalinated water costs five to 10 times more than regular municipal water, yet we’re not investing nearly enough money into research. If we don’t have clean energy we’re in serious trouble, but if we don’t have water we die.”

Grossman’s lab has demonstrated strong results showing that new filters made from graphene could greatly improve the energy efficiency of desalination plants while potentially reducing other costs as well.

Graphite/Graphene producers like Graphite Energy Corp. (OTC: GRXXF) (CSE: GRE) are moving quickly to provide the materials necessary to develop this new generation of desalination plants.

Potential Comparables

Cruz Cobalt Corp. (TSX-V: CUZ) (OTC: BKTPF) Cruz Cobalt Corp. is a cobalt mining company involved in the identification, acquisition and exploration of mineral properties. The company’s geographical segments include the United States and Canada. They are focused on acquiring and developing high-grade cobalt projects in politically stable, environmentally responsible and ethical mining jurisdictions, essential for the rapidly growing rechargeable battery and renewable energy sectors.

Nemaska Lithium (TSE: NMX.TO)

Nemaska Lithium is a lithium mining company. The company is a supplier of lithium hydroxide and lithium carbonate to the emerging lithium battery market that is largely driven by electric vehicles. Nemaska’s mining operations are located in the mining-friendly jurisdiction of Quebec, Canada. Nemaska Lithium has received a notice of allowance of a main patent application on its proprietary process to produce lithium hydroxide and lithium carbonate.

Lithium Americas Corp. (TSX: LAC.TO)

Lithium Americas is developing one of North America’s largest lithium deposits in northern Nevada. It operates two lithium projects, namely the Cauchari-Olaroz project, located in Argentina, and the Lithium Nevada project, located in Nevada. The company manufactures specialty organoclay products, derived from clays, for sale to the oil and gas and other sectors.

Teck Resources Limited (NYSE: TECK)

Teck Resources Limited is a Canadian metals and mining company. Teck’s principal products include coal, copper, and zinc, with secondary products including lead, silver, gold, molybdenum, germanium, indium and cadmium. Teck’s diverse resources focus on providing products that are essential to building a better quality of life for people around the globe.

Graphite Mining Today For A Better Tomorrow

Graphite mining will forever be intertwined with the latest advancements in science and technology. Graphite deserves attention for its various use cases in the automotive, energy, aerospace and robotics industries. In order for these and other industries to become sustainable and environmentally friendly, a reliance on graphite is necessary. Therefore, this rapidly growing sector has the potential to fuel investor interest in the mining space throughout 2018. The near-limitless uses of graphite have the potential to impact every facet of our lives. Companies like Graphite Energy Corp. (OTC: GRXXF) (CSE: GRE) are at the forefront of this technological revolution.

For more information on Graphite Energy Corp. (OTC: GRXXF) (CSE: GRE), please visit streetsignals.com for a free research report.

Streetsignals.com (SS) is the source of the Article and content set forth above. References to any issuer other than the profiled issuer are intended solely to identify industry participants and do not constitute an endorsement of any issuer and do not constitute a comparison to the profiled issuer. FN Media Group (FNM) is a third-party publisher and news dissemination service provider, which disseminates electronic information through multiple online media channels. FNM is NOT affiliated with SS or any company mentioned herein. The commentary, views and opinions expressed in this release by SS are solely those of SS and are not shared by and do not reflect in any manner the views or opinions of FNM. Readers of this Article and content agree that they cannot and will not seek to hold liable SS and FNM for any investment decisions by their readers or subscribers. SS and FNM and their respective affiliated companies are a news dissemination and financial marketing solutions provider and are NOT registered broker-dealers/analysts/investment advisers, hold no investment licenses and may NOT sell, offer to sell or offer to buy any security.

The Article and content related to the profiled company represent the personal and subjective views of the Author (SS), and are subject to change at any time without notice. The information provided in the Article and the content has been obtained from sources which the Author believes to be reliable. However, the Author (SS) has not independently verified or otherwise investigated all such information. None of the Author, SS, FNM, or any of their respective affiliates, guarantee the accuracy or completeness of any such information. This Article and content are not, and should not be regarded as investment advice or as a recommendation regarding any particular security or course of action; readers are strongly urged to speak with their own investment advisor and review all of the profiled issuer’s filings made with the Securities and Exchange Commission before making any investment decisions and should understand the risks associated with an investment in the profiled issuer’s securities, including, but not limited to, the complete loss of your investment. FNM was not compensated by any public company mentioned herein to disseminate this press release but was compensated seventy six hundred dollars by SS, a non-affiliated third party to distribute this release on behalf of Graphite Energy Corp.

FNM HOLDS NO SHARES OF ANY COMPANY NAMED IN THIS RELEASE.

This release contains “forward-looking statements” within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E the Securities Exchange Act of 1934, as amended and such forward-looking statements are made pursuant to the safe harbor provisions of the Private Securities Litigation Reform Act of 1995. “Forward-looking statements” describe future expectations, plans, results, or strategies and are generally preceded by words such as “may”, “future”, “plan” or “planned”, “will” or “should”, “expected,” “anticipates”, “draft”, “eventually” or “projected”. You are cautioned that such statements are subject to a multitude of risks and uncertainties that could cause future circumstances, events, or results to differ materially from those projected in the forward-looking statements, including the risks that actual results may differ materially from those projected in the forward-looking statements as a result of various factors, and other risks identified in a company’s annual report on Form 10-K or 10-KSB and other filings made by such company with the Securities and Exchange Commission. You should consider these factors in evaluating the forward-looking statements included herein, and not place undue reliance on such statements. The forward-looking statements in this release are made as of the date hereof and SS and FNM undertake no obligation to update such statements.

Media Contact:

FN Media Group, LLC
info@marketnewsupdates.com
+1(561)325-8757

SOURCE MarketNewsUpdates.com

Hopefully my insertions of ‘Canada’ and the ‘United States’ help to clarify matters. North America and the United States are not synonyms although they are sometimes used synonymously.

There is another copy of this news release on Wall Street Online (Deutschland), in both English and German. That, by the way, was my first clue that there might be some German interest. The second clue was the Graphite Energy Corp. homepage: unusually for a company with ‘headquarters’ in the Canadian province of British Columbia, there’s an option to read the text in German.

Graphite Energy Corp. seems to be a relatively new player in the ‘rush’ to mine graphite flakes for use in graphene-based applications. One of my first posts about mining for graphite flakes was a July 26, 2011 posting concerning Northern Graphite and their mining operation (Bissett Creek) in Ontario. I don’t write about them often but they are still active if their news releases are to be believed. The latest was issued February 28, 2018 and offers “financial metrics for the Preliminary Economic Assessment (the “PEA”) on the Company’s 100% owned Bissett Creek graphite project.”

The other graphite mining company mentioned here is Lomiko Metals. The latest posting here about Lomiko is a December 23, 2015 piece regarding an analysis and stock price recommendation by a company known as SeeThruEquity. Like Graphite Energy Corp., Lomiko has its mines in Québec and its business headquarters in British Columbia. Lomiko issued a March 16, 2018 news release announcing its reinstatement for trading on the TSX Venture Exchange,

(Vancouver, B.C.) Lomiko Metals Inc. (“Lomiko”) (TSX-V: LMR, OTC: LMRMF, FSE: DH8C) announces it has been successful in its reinstatement application with the TSX Venture Exchange and trading will begin at the opening on Tuesday, March 20, 2018.

Getting back to the flakes, here’s more about Graphite Energy Corp.’s mine (from the About Lac Aux Bouleaux webpage),

Lac Aux Bouleaux

The Lac Aux Bouleaux Property is comprised of 14 mineral claims in one contiguous block totaling 738.12 hectares [of] land on NTS 31J05, near the town of Mont-Laurier in southern Québec. Lac Aux Bouleaux (“LAB”) is a world class graphite property that borders the only producing graphite [mine] in North America [Note: There are three countries in North America, Canada, the United States, and Mexico. Québec is in Canada.]. On the property we have a full production facility already built which includes an open pit mine, processing facility, tailings pond, power and easy access to roads.

High Purity Levels

An important asset of LAB is its metallurgy. The property contains a high proportion of large and jumbo flakes from which a high purity concentrate was proven to be produced across all flakes by a simple flotation process. The concentrate can then be further purified using the province’s green and affordable hydro-electricity to be used in lithium-ion batteries.

The geological work performed to verify the existing data consisted of visiting approachable graphite outcrops and reviewing historical exploration and development work on the property. Large flake graphite showings located on the property were confirmed, with flake sizes in the range of 0.5 to 2 millimeters, typically present in shear zones at the contact of gneisses and marbles where the graphite content usually ranges from 2% to 20%. The property’s results are outstanding, showing jumbo flake natural graphite.

An onsite mill structure, a tailing dam facility, and a historical open mining pit are already present on the property. The property is ready to be put into production based on the existing infrastructure. The company hopes to be able to ship its mined graphite by rail directly to Tesla’s Gigafactory being built in Nevada [United States], which will produce 35GWh of batteries annually by 2020.

Adjacent Properties

The property is located in a very active graphite exploration and production area, adjacent to the south of TIMCAL’s Lac des Iles graphite mine in Quebec which is a world class deposit producing 25,000 tonnes of graphite annually. There are several graphite showings and past producing mines in its vicinity, including a historic deposit located on the property.

The open pit mine, in operation since 1989 with an onsite plant, ranked 5th in world graphite production. The mine is operated by TIMCAL Graphite & Carbon, a subsidiary of Imerys S.A., a French multinational company. The mine has an average grade of 7.5% Cg (graphite carbon) and has been producing 50 different graphite products for various graphite end users around the globe.
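To put those two figures together, here is some back-of-envelope arithmetic of my own (the 90% flotation recovery is an assumption I've supplied for illustration; it is not from the company): producing 25,000 tonnes of graphite a year at a 7.5% Cg head grade implies milling on the order of 330,000 tonnes of ore annually, and more once recovery losses are counted.

```python
# Back-of-envelope: tonnes of ore to mill for a graphite production target,
# given a head grade and an (assumed) flotation recovery.
def ore_required(graphite_tonnes, grade, recovery=1.0):
    """Tonnes of ore milled to yield `graphite_tonnes` of graphite.

    `grade` and `recovery` are fractions (e.g. 0.075 for 7.5% Cg).
    """
    return graphite_tonnes / (grade * recovery)

# Lac des Iles figures quoted above: 25,000 t/yr at 7.5% Cg.
print(round(ore_required(25_000, 0.075)))        # perfect recovery
print(round(ore_required(25_000, 0.075, 0.90)))  # assumed 90% recovery
```

The point of the sketch is simply that grade drives tonnage: halve the grade and the mill throughput needed for the same product doubles.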

Canadians! We have great flakes!

Alberta adds a newish quantum nanotechnology research hub to Canada’s quantum computing research scene

One of the winners in Canada’s 2017 federal budget announcement of the Pan-Canadian Artificial Intelligence Strategy was Edmonton, Alberta. It’s a fact which sometimes goes unnoticed while Canadians marvel at the wonderfulness found in Toronto and Montréal where it seems new initiatives and monies are being announced on a weekly basis (I exaggerate) for their AI (artificial intelligence) efforts.

Alberta’s quantum nanotechnology hub (graduate programme)

Intriguingly, it seems that Edmonton has higher aims than (an almost unnoticed) leadership in AI. Physicists at the University of Alberta have announced hopes to be just as successful as their AI brethren in a Nov. 27, 2017 article by Juris Graney for the Edmonton Journal,

Physicists at the University of Alberta [U of A] are hoping to emulate the success of their artificial intelligence studying counterparts in establishing the city and the province as the nucleus of quantum nanotechnology research in Canada and North America.

Google’s artificial intelligence research division DeepMind announced in July [2017] it had chosen Edmonton as its first international AI research lab, based on a long-running partnership with the U of A’s 10-person AI lab.

Retaining the brightest minds in the AI and machine-learning fields while enticing a global tech leader to Alberta was heralded as a coup for the province and the university.

It is something U of A physics professor John Davis believes the university’s new graduate program, Quanta, can help achieve in the world of quantum nanotechnology.

The field of quantum mechanics had long been a realm of theoretical science based on the theory that atomic and subatomic material like photons or electrons behave both as particles and waves.

“When you get right down to it, everything has both behaviours (particle and wave) and we can pick and choose certain scenarios which one of those properties we want to use,” he said.

But, Davis said, physicists and scientists are “now at the point where we understand quantum physics and are developing quantum technology to take to the marketplace.”

“Quantum computing used to be realm of science fiction, but now we’ve figured it out, it’s now a matter of engineering,” he said.

Quantum computing labs are being bought by large tech companies such as Google, IBM and Microsoft because they realize they are only a few years away from having this power, he said.

Those making the groundbreaking developments may want to commercialize their finds and take the technology to market and that is where Quanta comes in.

East vs. West—Again?

Ivan Semeniuk in his article, Quantum Supremacy, ignores any quantum research effort not located in either Waterloo, Ontario or metro Vancouver, British Columbia to describe a struggle between the East and the West (a standard Canadian trope). From Semeniuk’s Oct. 17, 2017 quantum article [link follows the excerpts] for the Globe and Mail’s October 2017 issue of the Report on Business (ROB),

 Lazaridis [Mike], of course, has experienced lost advantage first-hand. As co-founder and former co-CEO of Research in Motion (RIM, now called Blackberry), he made the smartphone an indispensable feature of the modern world, only to watch rivals such as Apple and Samsung wrest away Blackberry’s dominance. Now, at 56, he is engaged in a high-stakes race that will determine who will lead the next technology revolution. In the rolling heartland of southwestern Ontario, he is laying the foundation for what he envisions as a new Silicon Valley—a commercial hub based on the promise of quantum technology.

Semeniuk skips over the story of how Blackberry lost its advantage. I came to that story late in the game, when Blackberry was already in serious trouble due to a failure to recognize that the field it had helped to create was moving in a new direction. If memory serves, the company was trying to keep its technology wholly proprietary, which meant that developers couldn’t easily create apps to extend the phone’s features. Blackberry also fought a legal battle in the US with a patent troll, draining company resources and energy in what proved to be a futile effort.

Since then Lazaridis has invested heavily in quantum research. He gave the University of Waterloo a serious chunk of money as they named their Quantum Nano Centre (QNC) after him and his wife, Ophelia (you can read all about it in my Sept. 25, 2012 posting about the then new centre). The best details for Lazaridis’ investments in Canada’s quantum technology are to be found on the Quantum Valley Investments, About QVI, History webpage,

History has repeatedly demonstrated the power of research in physics to transform society.  As a student of history and a believer in the power of physics, Mike Lazaridis set out in 2000 to make real his bold vision to establish the Region of Waterloo as a world leading centre for physics research.  That is, a place where the best researchers in the world would come to do cutting-edge research and to collaborate with each other and in so doing, achieve transformative discoveries that would lead to the commercialization of breakthrough  technologies.

Establishing a World Class Centre in Quantum Research:

The first step in this regard was the establishment of the Perimeter Institute for Theoretical Physics.  Perimeter was established in 2000 as an independent theoretical physics research institute.  Mike started Perimeter with an initial pledge of $100 million (which at the time was approximately one third of his net worth).  Since that time, Mike and his family have donated a total of more than $170 million to the Perimeter Institute.  In addition to this unprecedented monetary support, Mike also devotes his time and influence to help lead and support the organization in everything from the raising of funds with government and private donors to helping to attract the top researchers from around the globe to it.  Mike’s efforts helped Perimeter achieve and grow its position as one of a handful of leading centres globally for theoretical research in fundamental physics.

Perimeter is located in a Governor-General award winning designed building in Waterloo.  Success in recruiting and resulting space requirements led to an expansion of the Perimeter facility.  A uniquely designed addition, which has been described as space-ship-like, was opened in 2011 as the Stephen Hawking Centre in recognition of one of the most famous physicists alive today who holds the position of Distinguished Visiting Research Chair at Perimeter and is a strong friend and supporter of the organization.

Recognizing the need for collaboration between theorists and experimentalists, in 2002, Mike applied his passion and his financial resources toward the establishment of The Institute for Quantum Computing at the University of Waterloo.  IQC was established as an experimental research institute focusing on quantum information.  Mike established IQC with an initial donation of $33.3 million.  Since that time, Mike and his family have donated a total of more than $120 million to the University of Waterloo for IQC and other related science initiatives.  As in the case of the Perimeter Institute, Mike devotes considerable time and influence to help lead and support IQC in fundraising and recruiting efforts.  Mike’s efforts have helped IQC become one of the top experimental physics research institutes in the world.

Mike and Doug Fregin have been close friends since grade 5.  They are also co-founders of BlackBerry (formerly Research In Motion Limited).  Doug shares Mike’s passion for physics and supported Mike’s efforts at the Perimeter Institute with an initial gift of $10 million.  Since that time Doug has donated a total of $30 million to Perimeter Institute.  Separately, Doug helped establish the Waterloo Institute for Nanotechnology at the University of Waterloo with total gifts of $29 million.  As suggested by its name, WIN is devoted to research in the area of nanotechnology.  It has established as an area of primary focus the intersection of nanotechnology and quantum physics.

With a donation of $50 million from Mike which was matched by both the Government of Canada and the province of Ontario as well as a donation of $10 million from Doug, the University of Waterloo built the Mike & Ophelia Lazaridis Quantum-Nano Centre, a state of the art laboratory located on the main campus of the University of Waterloo that rivals the best facilities in the world.  QNC was opened in September 2012 and houses researchers from both IQC and WIN.

Leading the Establishment of Commercialization Culture for Quantum Technologies in Canada:

For many years, theorists have been able to demonstrate the transformative powers of quantum mechanics on paper.  That said, converting these theories to experimentally demonstrable discoveries has, putting it mildly, been a challenge.  Many naysayers have suggested that achieving these discoveries was not possible and even the believers suggested that it could likely take decades to achieve these discoveries.  Recently, a buzz has been developing globally as experimentalists have been able to achieve demonstrable success with respect to Quantum Information based discoveries.  Local experimentalists are very much playing a leading role in this regard.  It is believed by many that breakthrough discoveries that will lead to commercialization opportunities may be achieved in the next few years and certainly within the next decade.

Recognizing the unique challenges for the commercialization of quantum technologies (including risk associated with uncertainty of success, complexity of the underlying science and high capital / equipment costs) Mike and Doug have chosen to once again lead by example.  The Quantum Valley Investment Fund will provide commercialization funding, expertise and support for researchers that develop breakthroughs in Quantum Information Science that can reasonably lead to new commercializable technologies and applications.  Their goal in establishing this Fund is to lead in the development of a commercialization infrastructure and culture for Quantum discoveries in Canada and thereby enable such discoveries to remain here.

Semeniuk goes on to set the stage for Waterloo/Lazaridis vs. Vancouver (from Semeniuk’s 2017 ROB article),

… as happened with Blackberry, the world is once again catching up. While Canada’s funding of quantum technology ranks among the top five in the world, the European Union, China, and the US are all accelerating their investments in the field. Tech giants such as Google [also known as Alphabet], Microsoft and IBM are ramping up programs to develop companies and other technologies based on quantum principles. Meanwhile, even as Lazaridis works to establish Waterloo as the country’s quantum hub, a Vancouver-area company has emerged to challenge that claim. The two camps—one methodically focused on the long game, the other keen to stake an early commercial lead—have sparked an East-West rivalry that many observers of the Canadian quantum scene are at a loss to explain.

Is it possible that some of the rivalry might be due to an influential individual who has invested heavily in a ‘quantum valley’ and has a history of trying to ‘own’ a technology?

Getting back to D-Wave Systems, the Vancouver company, I have written about them a number of times (particularly in 2015; for the full list: input D-Wave into the blog search engine). This June 26, 2015 posting includes a reference to an article in The Economist magazine about D-Wave’s commercial opportunities while the bulk of the posting is focused on a technical breakthrough.

Semeniuk offers an overview of the D-Wave Systems story,

D-Wave was born in 1999, the same year Lazaridis began to fund quantum science in Waterloo. From the start, D-Wave had a more immediate goal: to develop a new computer technology to bring to market. “We didn’t have money or facilities,” says Geordie Rose, a physics PhD who co-founded the company and served in various executive roles. …

The group soon concluded that the kind of machine most scientists were pursuing, based on so-called gate-model architecture, was decades away from being realized—if ever. …

Instead, D-Wave pursued another idea, based on a principle dubbed “quantum annealing.” This approach seemed more likely to produce a working system, even if the applications that would run on it were more limited. “The only thing we cared about was building the machine,” says Rose. “Nobody else was trying to solve the same problem.”

D-Wave debuted its first prototype at an event in California in February 2007, running it through a few basic problems such as solving a Sudoku puzzle and finding the optimal seating plan for a wedding reception. … “They just assumed we were hucksters,” says Hilton [Jeremy Hilton, D-Wave senior vice-president of systems]. Federico Spedalieri, a computer scientist at the University of Southern California’s [USC] Information Sciences Institute who has worked with D-Wave’s system, says the limited information the company provided about the machine’s operation provoked outright hostility. “I think that played against them a lot in the following years,” he says.

It seems Lazaridis is not the only one who likes to hold company information tightly.
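For readers wondering what “annealing” actually means computationally: D-Wave’s machines take problems posed as QUBOs (quadratic unconstrained binary optimization), where the goal is to find the binary assignment with the lowest “energy.” The sketch below is the classical cousin of that process, simulated annealing, on a toy two-variable QUBO of my own invention; it illustrates the idea only, and is not D-Wave’s hardware, algorithm, or API.

```python
import math
import random

def qubo_energy(x, Q):
    """Energy of binary vector x under QUBO matrix Q: x^T Q x."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def anneal(Q, steps=5000, t_start=2.0, t_end=0.01, seed=42):
    """Classical simulated annealing: propose single-bit flips,
    always accept downhill moves, and accept uphill moves with a
    probability that shrinks as the 'temperature' cools."""
    rng = random.Random(seed)
    n = len(Q)
    x = [rng.randint(0, 1) for _ in range(n)]
    e = qubo_energy(x, Q)
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / steps)  # geometric cooling
        i = rng.randrange(n)
        x[i] ^= 1                       # propose flipping one bit
        e_new = qubo_energy(x, Q)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                   # accept the move
        else:
            x[i] ^= 1                   # reject: flip the bit back
    return x, e

# Toy QUBO: each variable alone lowers the energy, but setting both
# incurs a penalty, so the minima are (1,0) and (0,1) at energy -1.
Q = [[-1.0, 2.0],
     [0.0, -1.0]]
best_x, best_e = anneal(Q)
```

A quantum annealer explores this same energy landscape physically, via quantum fluctuations rather than thermal ones, which is why the problems it runs must first be cast in QUBO form.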

Back to Semeniuk and D-Wave,

Today [October 2017], the Los Alamos National Laboratory owns a D-Wave machine, which costs about $15 million. Others pay to access D-Wave systems remotely. This year, for example, Volkswagen fed data from thousands of Beijing taxis into a machine located in Burnaby [one of the municipalities that make up metro Vancouver] to study ways to optimize traffic flow.

But the application for which D-Wave has the highest hopes is artificial intelligence. Any AI program hinges on the “training” through which a computer acquires automated competence, and the 2000Q [a D-Wave computer] appears well suited to this task. …

Yet, for all the buzz D-Wave has generated, with several research teams outside Canada investigating its quantum annealing approach, the company has elicited little interest from the Waterloo hub. As a result, what might seem like a natural development—the Institute for Quantum Computing acquiring access to a D-Wave machine to explore and potentially improve its value—has not occurred. …

I am particularly interested in this comment as it concerns public funding (from Semeniuk’s article),

Vern Brownell, a former Goldman Sachs executive who became CEO of D-Wave in 2009, calls the lack of collaboration with Waterloo’s research community “ridiculous,” adding that his company’s efforts to establish closer ties have proven futile. “I’ll be blunt: I don’t think our relationship is good enough,” he says. Brownell also points out that, while hundreds of millions in public funds have flowed into Waterloo’s ecosystem, little funding is available for Canadian scientists wishing to make the most of D-Wave’s hardware—despite the fact that it remains unclear which core quantum technology will prove the most profitable.

There’s a lot more to Semeniuk’s article but this is the last excerpt,

The world isn’t waiting for Canada’s quantum rivals to forge a united front. Google, Microsoft, IBM, and Intel are racing to develop a gate-model quantum computer—the sector’s ultimate goal. (Google’s researchers have said they will unveil a significant development early next year.) With the U.K., Australia and Japan pouring money into quantum, Canada, an early leader, is under pressure to keep up. The federal government is currently developing a strategy for supporting the country’s evolving quantum sector and, ultimately, getting a return on its approximately $1-billion investment over the past decade [emphasis mine].

I wonder where the “approximately $1-billion … ” figure came from. I ask because some years ago MP Peter Julian asked the government for information about how much Canadian federal money had been invested in nanotechnology. The government replied with sheets of paper (a pile approximately 2 inches high) that had funding disbursements from various ministries. Each ministry had its own method with different categories for listing disbursements and the titles for the research projects were not necessarily informative for anyone outside a narrow specialty. (Peter Julian’s assistant had kindly sent me a copy of the response they had received.) The bottom line is that it would have been close to impossible to determine the amount of federal funding devoted to nanotechnology using that data. So, where did the $1-billion figure come from?

In any event, it will be interesting to see how the Council of Canadian Academies assesses the ‘quantum’ situation in its more academically inclined, “The State of Science and Technology and Industrial Research and Development in Canada,” when it’s released later this year (2018).

Finally, you can find Semeniuk’s October 2017 article here but be aware it’s behind a paywall.

Whither we goest?

Despite any doubts one might have about Lazaridis’ approach to research and technology, his tremendous investment and support cannot be denied. Without him, Canada’s quantum research efforts would be substantially less significant. As for the ‘cowboys’ in Vancouver, it takes a certain temperament to found a startup company, and it seems the D-Wave folks have more in common with Lazaridis than they might like to admit. As for the Quanta graduate programme, it’s early days yet, and no one should ever count out Alberta.

Meanwhile, one can continue to hope that a more thoughtful approach to regional collaboration will be adopted so Canada can continue to blaze trails in the field of quantum research.

Ora Sound, a Montréal-based startup, and its ‘graphene’ headphones

For all the excitement about graphene there aren’t that many products as Glenn Zorpette notes in a June 20, 2017 posting about Ora Sound and its headphones on the Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website; Note: Links have been removed),

Graphene has long been touted as a miracle material that would deliver everything from tiny, ultralow-power transistors to the vastly long and ultrastrong cable [PDF] needed for a space elevator. And yet, 13 years of graphene development, and R&D expenditures well in the tens of billions of dollars have so far yielded just a handful of niche products. The most notable by far is a line of tennis racquets in which relatively small amounts of graphene are used to stiffen parts of the frame.

Ora Sound, a Montreal-based [Québec, Canada] startup, hopes to change all that. On 20 June [2017], it unveiled a Kickstarter campaign for a new audiophile-grade headphone that uses cones, also known as membranes, made of a form of graphene. “To the best of our knowledge, we are the first company to find a significant, commercially viable application for graphene,” says Ora cofounder Ari Pinkas, noting that the cones in the headphones are 95 percent graphene.

Kickstarter

It should be noted that participating in a Kickstarter campaign is an investment/gamble. I am not endorsing Ora Sound or its products. That said, this does look interesting (from the ORA: The World’s First Graphene Headphones Kickstarter campaign webpage),

ORA GQ Headphones uses nanotechnology to deliver the most groundbreaking audio listening experience. Scientists have long promised that one day Graphene will find its way into many facets of our lives including displays, electronic circuits and sensors. ORA’s Graphene technology makes it one of the first companies to have created a commercially viable application for this Nobel-prize winning material, a major scientific achievement.

The GQ Headphones come equipped with ORA’s patented GrapheneQ™ membranes, providing unparalleled fidelity. The headphones also offer all the features you would expect from a high-end audio product: wired/wireless operation, a gesture control track-pad, a digital MEMS microphone, breathable lambskin leather and an ear-shaped design optimized for sound quality and isolated comfort.

They have produced a slick video to promote their campaign,

At the time of publishing this post, the campaign will run for another eight days and has raised $650,949 CAD. This is more than $500,000 over the company’s original goal of $135,000. I’m sure they’re ecstatic, but this success can be a mixed blessing: they have many more people expecting a set of headphones than they anticipated, and that can mean production issues.

Further, there appears to be only one member of the team with business experience: Ari Pinkas, whose background includes a few years in marketing strategy followed by founding an online marketplace for teachers. I would imagine Pinkas will be experiencing a very steep learning curve. Hopefully, Helge Seetzen, a member of the company’s advisory board, will be able to offer assistance. According to Seetzen’s Wikipedia entry, he is a “… German technologist and businessman known for imaging & multimedia research and commercialization,” with a Canadian educational background and business experience. The rest of the team and advisory board appear to be academics.

The technology

A March 14, 2017 article by Andy Riga for the Montréal Gazette gives a general description of the technology,

A Montreal startup is counting on technology sparked by a casual conversation between two brothers pursuing PhDs at McGill University.

They were chatting about their disparate research areas — one, in engineering, was working on using graphene, a form of carbon, in batteries; the other, in music, was looking at the impact of electronics on the perception of audio quality.

At first glance, the invention that ensued sounds humdrum.

It’s a replacement for an item you use every day. It’s paper thin, you probably don’t realize it’s there and its design has not changed much in more than a century. Called a membrane or diaphragm, it’s the part of a loudspeaker that vibrates to create the sound from the headphones over your ears, the wireless speaker on your desk, the cellphone in your hand.

Membranes are normally made of paper, Mylar or aluminum.

Ora’s innovation uses graphene, a remarkable material whose discovery garnered two scientists the 2010 Nobel Prize in physics but which has yet to fulfill its promise.

“Because it’s so stiff, our membrane gets better sound quality,” said Robert-Eric Gaskell, who obtained his PhD in sound recording in 2015. “It can produce more sound with less distortion, and the sound that you hear is more true to the original sound intended by the artist.

“And because it’s so light, we get better efficiency — the lighter it is, the less energy it takes.”

In January, the company demonstrated its membrane in headphones at the Consumer Electronics Show, a big trade convention in Las Vegas.

Six cellphone manufacturers expressed interest in Ora’s technology, some of which are now trying prototypes, said Ari Pinkas, in charge of product marketing at Ora. “We’re talking about big cellphone manufacturers — big, recognizable names,” he said.

Technology companies are intrigued by the idea of using Ora’s technology to make smaller speakers so they can squeeze other things, such as bigger batteries, into the limited space in electronic devices, Pinkas said. Others might want to use Ora’s membrane to allow their devices to play music louder, he added.

Makers of regular speakers, hearing aids and virtual-reality headsets have also expressed interest, Pinkas said.

Ora is still working on headphones.

Riga’s article offers a good overview for people who are not familiar with graphene.
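Gaskell’s two claims (stiffness and lightness) can be made concrete with a single figure of merit: the specific modulus E/ρ, whose square root is the speed of sound in the membrane material and which, roughly, sets how high the cone’s break-up resonances sit. The numbers below are order-of-magnitude textbook values I’ve supplied for illustration; they are not Ora’s measurements.

```python
import math

# Approximate Young's modulus E (Pa) and density rho (kg/m^3).
# Order-of-magnitude literature values, supplied for illustration only.
materials = {
    "mylar":    (4e9,    1390),  # PET film
    "aluminum": (69e9,   2700),
    "graphene": (1000e9, 2200),  # in-plane modulus; density near graphite's
}

for name, (E, rho) in materials.items():
    c = math.sqrt(E / rho)  # speed of sound in the material, m/s
    print(f"{name:9s} c ≈ {c:,.0f} m/s")
```

On these rough numbers graphene’s sound speed is several times aluminum’s and more than ten times Mylar’s, which is the quantitative sense in which a stiff, light membrane flexes less and pushes its distortion-causing resonances above the audible band.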

Zorpette’s June 20, 2017 posting (on Nanoclast) offers a few more technical details (Note: Links have been removed),

During an interview and demonstration in the IEEE Spectrum offices, Pinkas and Robert-Eric Gaskell, another of the company’s cofounders, explained graphene’s allure to audiophiles. “Graphene has the ideal properties for a membrane,” Gaskell says. “It’s incredibly stiff, very lightweight—a rare combination—and it’s well damped,” which means it tends to quell spurious vibrations. By those metrics, graphene soundly beats all the usual choices: mylar, paper, aluminum, or even beryllium, Gaskell adds.

The problem is making it in sheets large enough to fashion into cones. So-called “pristine” graphene exists as flakes, [emphasis mine] perhaps 10 micrometers across, and a single atom thick. To make larger, strong sheets of graphene, researchers attach oxygen atoms to the flakes, and then other elements to the oxygen atoms to cross-link the flakes and hold them together strongly in what materials scientists call a laminate structure. The intellectual property behind Ora’s advance came from figuring out how to make these structures suitably thick and in the proper shape to function as speaker cones, Gaskell says. In short, he explains, the breakthrough was “being able to manufacture” in large numbers, “and in any geometry we want.”

Much of the R&D work that led to Ora’s process was done at nearby McGill University, by professor Thomas Szkopek of the Electrical and Computer Engineering department. Szkopek worked with Peter Gaskell, Robert-Eric’s younger brother. Ora is also making use of patents that arose from work done on graphene by the Nguyen Group at Northwestern University, in Evanston, Ill.

Robert-Eric Gaskell and Pinkas arrived at Spectrum with a preproduction model of their headphones, as well as some other headphones for the sake of comparison. The Ora prototype is clearly superior to the comparison models, but that’s not much of a surprise. …

… In the 20 minutes or so I had to audition Ora’s preproduction model, I listened to an assortment of classical and jazz standards and I came away impressed. The sound is precise, with fine details sharply rendered. To my surprise, I was reminded of planar-magnetic type headphones that are now surging in popularity in the upper reaches of the audiophile headphone market. Bass is smooth and tight. Overall, the unit holds up quite well against closed-back models in the $400 to $500 range I’ve listened to from Grado, Bowers & Wilkins, and Audeze.

Ora’s Kickstarter campaign page (Graphene vs GrapheneQ subsection) offers some information about their unique graphene composite,

A TECHNICAL INTRODUCTION TO GRAPHENE

Graphene is a new material, first isolated only 13 years ago. Formed from a single layer of carbon atoms, Graphene is a hexagonal crystal lattice in a perfect honeycomb structure. This fundamental geometry makes Graphene ridiculously strong and lightweight. In its pure form, Graphene is a single atomic layer of carbon. It can be very expensive and difficult to produce in sizes any bigger than small flakes. These challenges have prevented pristine Graphene from being integrated into consumer technologies.

THE GRAPHENEQ™ SOLUTION

At ORA, we’ve spent the last few years creating GrapheneQ, our own, proprietary Graphene-based nanocomposite formulation. We’ve specifically designed and optimized it for use in acoustic transducers. GrapheneQ is a composite material which is over 95% Graphene by weight. It is formed by depositing flakes of Graphene into thousands of layers that are bonded together with proprietary cross-linking agents. Rather than trying to form one, continuous layer of Graphene, GrapheneQ stacks flakes of Graphene together into a laminate material that preserves the benefits of Graphene while allowing the material to be formed into loudspeaker cones.

Scanning Electron Microscope (SEM) Comparison

If you’re interested in more technical information on sound, acoustics, loudspeakers, and Ora’s graphene-based headphones, it’s all there on Ora’s Kickstarter campaign page.

The Québec nanotechnology scene in context and graphite flakes for graphene

There are two Canadian provinces that are heavily invested in nanotechnology research and commercialization efforts. The province of Québec has poured money into its nanotechnology efforts, while the province of Alberta, which has also invested heavily, managed to snare additional federal funds to host Canada’s National Institute of Nanotechnology (NINT). (This appears to be a current NINT website or you can try this one on the National Research Council website.) I’d rank Ontario as a third centre, with the other provinces being considerably less invested. As for the North, I’ve not come across any nanotechnology research from that region. Finally, since I stumble across more material about nanotechnology in Québec than I do for any other province, I rate Québec as the most successful in its efforts.

Regarding graphene, Canada seems to have an advantage. We have great graphite flakes for making graphene. With mines in at least two provinces, Ontario and Québec, we have a ready source of supply. In my first posting (July 25, 2011) about graphite mines here, I had this,

Who knew large flakes could be this exciting? From the July 25, 2011 news item on Nanowerk,

Northern Graphite Corporation has announced that graphene has been successfully made on a test basis using large flake graphite from the Company’s Bissett Creek project in Northern Ontario. Northern’s standard 95%C, large flake graphite was evaluated as a source material for making graphene by an eminent professor in the field at the Chinese Academy of Sciences who is doing research making graphene sheets larger than 30cm2 in size using the graphene oxide methodology. The tests indicated that graphene made from Northern’s jumbo flake is superior to Chinese powder and large flake graphite in terms of size, higher electrical conductivity, lower resistance and greater transparency.

Approximately 70% of production from the Bissett Creek property will be large flake (+80 mesh) and almost all of this will in fact be +48 mesh jumbo flake which is expected to attract premium pricing and be a better source material for the potential manufacture of graphene. The very high percentage of large flakes makes Bissett Creek unique compared to most graphite deposits worldwide which produce a blend of large, medium and small flakes, as well as a large percentage of low value -150 mesh flake and amorphous powder which are not suitable for graphene, Li ion batteries or other high end, high growth applications.

Since then I’ve stumbled across more information about Québec’s mines than Ontario’s, as can be seen:

There are some other mentions of graphite mines in other postings but they are tangential to what’s being featured:

  • my Oct. 26, 2015 posting about St. Jean Carbon and its superconducting graphene;
  • my Feb. 20, 2015 posting about Nanoxplore and graphene production in Québec; and
  • my Feb. 23, 2015 posting about Grafoid and its sister company, Focus Graphite, which gets its graphite flakes from a deposit in the northeastern part of Québec.


After reviewing these posts, I’ve begun to wonder where Ora’s graphite flakes come from. In any event, I wish the folks at Ora and their Kickstarter funders the best of luck.

Artificial intelligence (AI) company (in Montréal, Canada) attracts $135M in funding from Microsoft, Intel, Nvidia and others

It seems there’s a push on to establish Canada as a centre for artificial intelligence research and, if the federal and provincial governments have their way, for commercialization of said research. As always, there seems to be a bit of competition between Toronto (Ontario) and Montréal (Québec) as to which will be the dominant hub for the Canadian effort, if one takes Braga’s word for the situation.

In any event, Toronto initially seemed to have a mild advantage over Montréal with the 2017 Canadian federal government budget announcement that the Canadian Institute for Advanced Research (CIFAR), based in Toronto, would launch a Pan-Canadian Artificial Intelligence Strategy, and with an announcement from the University of Toronto shortly after (from my March 31, 2017 posting),

On the heels of the March 22, 2017 federal budget announcement of $125M for a Pan-Canadian Artificial Intelligence Strategy, the University of Toronto (U of T) has announced the inception of the Vector Institute for Artificial Intelligence in a March 28, 2017 news release by Jennifer Robinson (Note: Links have been removed),

A team of globally renowned researchers at the University of Toronto is driving the planning of a new institute staking Toronto’s and Canada’s claim as the global leader in AI.

Geoffrey Hinton, a University Professor Emeritus in computer science at U of T and vice-president engineering fellow at Google, will serve as the chief scientific adviser of the newly created Vector Institute based in downtown Toronto.

“The University of Toronto has long been considered a global leader in artificial intelligence research,” said U of T President Meric Gertler. “It’s wonderful to see that expertise act as an anchor to bring together researchers, government and private sector actors through the Vector Institute, enabling them to aim even higher in leading advancements in this fast-growing, critical field.”

As part of the Government of Canada’s Pan-Canadian Artificial Intelligence Strategy, Vector will share $125 million in federal funding with fellow institutes in Montreal and Edmonton. All three will conduct research and secure talent to cement Canada’s position as a world leader in AI.

However, Montréal and the province of Québec are no slouches when it comes to supporting technology. From a June 14, 2017 article by Matthew Braga for CBC (Canadian Broadcasting Corporation) news online (Note: Links have been removed),

One of the most promising new hubs for artificial intelligence research in Canada is going international, thanks to a $135 million investment with contributions from some of the biggest names in tech.

The company, Montreal-based Element AI, was founded last October [2016] to help companies that might not have much experience in artificial intelligence start using the technology to change the way they do business.

It’s equal parts general research lab and startup incubator, with employees working to develop new and improved techniques in artificial intelligence that might not be fully realized for years, while also commercializing products and services that can be sold to clients today.

It was co-founded by Yoshua Bengio — one of the pioneers of a type of AI research called machine learning — along with entrepreneurs Jean-François Gagné and Nicolas Chapados, and the Canadian venture capital fund Real Ventures.

In an interview, Bengio and Gagné said the money from the company’s funding round will be used to hire 250 new employees by next January. A hundred will be based in Montreal, but an additional 100 employees will be hired for a new office in Toronto, and the remaining 50 for an Element AI office in Asia — its first international outpost.

They will join more than 100 employees who work for Element AI today, having left jobs at Amazon, Uber and Google, among others, to work at the company’s headquarters in Montreal.

The expansion is a big vote of confidence in Element AI’s strategy from some of the world’s biggest technology companies. Microsoft, Intel and Nvidia all contributed to the round, and each is a key player in AI research and development.

The company has some not unexpected plans and partners (from the Braga article; Note: A link has been removed),

The Series A round was led by Data Collective, a Silicon Valley-based venture capital firm, and included participation by Fidelity Investments Canada, National Bank of Canada, and Real Ventures.

What will it help the company do? Scale, its founders say.

“We’re looking at domain experts, artificial intelligence experts,” Gagné said. “We already have quite a few, but we’re looking at people that are at the top of their game in their domains.

“And at this point, it’s no longer just pure artificial intelligence, but people who understand, extremely well, robotics, industrial manufacturing, cybersecurity, and financial services in general, which are all the areas we’re going after.”

Gagné says that Element AI has already delivered 10 projects to clients in those areas, and have many more in development. In one case, Element AI has been helping a Japanese semiconductor company better analyze the data collected by the assembly robots on its factory floor, in a bid to reduce manufacturing errors and improve the quality of the company’s products.

There’s more to investment in Québec’s AI sector than Element AI (from the Braga article; Note: Links have been removed),

Element AI isn’t the only organization in Canada that investors are interested in.

In September, the Canadian government announced $213 million in funding for a handful of Montreal universities, while both Google and Microsoft announced expansions of their Montreal AI research groups in recent months alongside investments in local initiatives. The province of Quebec has pledged $100 million for AI initiatives by 2022.

Braga goes on to note some other initiatives but at that point the article’s focus is exclusively Toronto.

For more insight into the AI situation in Québec, there’s Dan Delmar’s May 23, 2017 article for the Montreal Express (Note: Links have been removed),

Advocating for massive government spending with little restraint admittedly deviates from the tenor of these columns, but the AI business is unlike any other before it. [emphasis mine] Having leaders acting as fervent advocates for the industry is crucial; resisting the coming technological tide is, as the Borg would say, futile.

The roughly 250 AI researchers who call Montreal home are not simply part of a niche industry. Quebec’s francophone character and Montreal’s multilingual citizenry are certainly factors favouring the development of language technology, but there’s ample opportunity for more ambitious endeavours with broader applications.

AI isn’t simply a technological breakthrough; it is the technological revolution. [emphasis mine] In the coming decades, modern computing will transform all industries, eliminating human inefficiencies and maximizing opportunities for innovation and growth — regardless of the ethical dilemmas that will inevitably arise.

“By 2020, we’ll have computers that are powerful enough to simulate the human brain,” said (in 2009) futurist Ray Kurzweil, author of The Singularity Is Near, a seminal 2006 book that has inspired a generation of AI technologists. Kurzweil’s projections are not science fiction but perhaps conservative, as some forms of AI already effectively replace many human cognitive functions. “By 2045, we’ll have expanded the intelligence of our human-machine civilization a billion-fold. That will be the singularity.”

The singularity concept, borrowed from physicists describing event horizons bordering matter-swallowing black holes in the cosmos, is the point of no return where human and machine intelligence will have completed their convergence. That’s when the machines “take over,” so to speak, and accelerate the development of civilization beyond traditional human understanding and capability.

The claims I’ve highlighted in Delmar’s article have been made before for other technologies: “xxx is like no other business before” and “it is a technological revolution.” Also, if you keep scrolling down to the bottom of the article, you’ll find Delmar is a ‘public relations consultant’ which, if you look at his LinkedIn profile, means he’s a managing partner in a PR firm known as Provocateur.

Bertrand Marotte’s May 20, 2017 article for the Montreal Gazette offers less hyperbole along with additional detail about the Montréal scene (Note: Links have been removed),

It might seem like an ambitious goal, but key players in Montreal’s rapidly growing artificial-intelligence sector are intent on transforming the city into a Silicon Valley of AI.

Certainly, the flurry of activity these days indicates that AI in the city is on a roll. Impressive amounts of cash have been flowing into academia, public-private partnerships, research labs and startups active in AI in the Montreal area.

…, researchers at Microsoft Corp. have successfully developed a computing system able to decipher conversational speech as accurately as humans do. The technology makes the same, or fewer, errors than professional transcribers and could be a huge boon to major users of transcription services like law firms and the courts.

Setting the goal of attaining the critical mass of a Silicon Valley is “a nice point of reference,” said tech entrepreneur Jean-François Gagné, co-founder and chief executive officer of Element AI, an artificial intelligence startup factory launched last year.

The idea is to create a “fluid, dynamic ecosystem” in Montreal where AI research, startup, investment and commercialization activities all mesh productively together, said Gagné, who founded Element with researcher Nicolas Chapados and Université de Montréal deep learning pioneer Yoshua Bengio.

“Artificial intelligence is seen now as a strategic asset to governments and to corporations. The fight for resources is global,” he said.

The rise of Montreal — and rival Toronto — as AI hubs owes a lot to provincial and federal government funding.

Ottawa promised $213 million last September to fund AI and big data research at four Montreal post-secondary institutions. Quebec has earmarked $100 million over the next five years for the development of an AI “super-cluster” in the Montreal region.

The provincial government also created a 12-member blue-chip committee to develop a strategic plan to make Quebec an AI hub, co-chaired by Claridge Investments Ltd. CEO Pierre Boivin and Université de Montréal rector Guy Breton.

But private-sector money has also been flowing in, particularly from some of the established tech giants competing in an intense AI race for innovative breakthroughs and the best brains in the business.

Montreal’s rich talent pool is a major reason Waterloo, Ont.-based language-recognition startup Maluuba decided to open a research lab in the city, said the company’s vice-president of product development, Mohamed Musbah.

“It’s been incredible so far. The work being done in this space is putting Montreal on a pedestal around the world,” he said.

Microsoft struck a deal this year to acquire Maluuba, which is working to crack one of the holy grails of deep learning: teaching machines to read like the human brain does. Among the company’s software developments are voice assistants for smartphones.

Maluuba has also partnered with an undisclosed auto manufacturer to develop speech recognition applications for vehicles. Voice recognition applied to cars can include such things as asking for a weather report or making remote requests for the vehicle to unlock itself.

Marotte’s Twitter profile describes him as a freelance writer, editor, and translator.

Meet Pepper, a robot for health care clinical settings

A Canadian project to introduce robots like Pepper into clinical settings (aside: can seniors’ facilities be far behind?) is the subject of a June 23, 2017 news item on phys.org,

McMaster and Ryerson universities today announced the Smart Robots for Health Communication project, a joint research initiative designed to introduce social robotics and artificial intelligence into clinical health care.

A June 22, 2017 McMaster University news release, which originated the news item, provides more detail,

With the help of Softbank’s humanoid robot Pepper and IBM Bluemix Watson Cognitive Services, the researchers will study health information exchange through a state-of-the-art human-robot interaction system. The project is a collaboration between David Harris Smith, professor in the Department of Communication Studies and Multimedia at McMaster University, Frauke Zeller, professor in the School of Professional Communication at Ryerson University and Hermenio Lima, a dermatologist and professor of medicine at McMaster’s Michael G. DeGroote School of Medicine. His main research interests are in the area of immunodermatology and technology applied to human health.

The research project involves the development and analysis of physical and virtual human-robot interactions, and has the capability to improve healthcare outcomes by helping healthcare professionals better understand patients’ behaviour.

Zeller and Harris Smith have previously worked together on hitchBOT, the friendly hitchhiking robot that travelled across Canada and has since found its new home in the [Canada] Science and Technology Museum in Ottawa.

“Pepper will help us highlight some very important aspects and motives of human behaviour and communication,” said Zeller.

Designed to be used in professional environments, Pepper is a humanoid robot that can interact with people, ‘read’ emotions, learn, move and adapt to its environment, and even recharge on its own. Pepper is able to perform facial recognition and develop individualized relationships when it interacts with people.

Lima, the clinic director, said: “We are excited to have the opportunity to potentially transform patient engagement in a clinical setting, and ultimately improve healthcare outcomes by adapting to clients’ communications needs.”

At Ryerson, Pepper was funded by the Co-lab in the Faculty of Communication and Design. FCAD’s Co-lab provides strategic leadership, technological support and acquisitions of technologies that are shaping the future of communications.

“This partnership is a testament to the collaborative nature of innovation,” said dean of FCAD, Charles Falzon. “I’m thrilled to support this multidisciplinary project that pushes the boundaries of research, and allows our faculty and students to find uses for emerging tech inside and outside the classroom.”

“This project exemplifies the value that research in the Humanities can bring to the wider world, in this case building understanding and enhancing communications in critical settings such as health care,” says McMaster’s Dean of Humanities, Ken Cruikshank.

The integration of IBM Watson cognitive computing services with the state-of-the-art social robot Pepper, offers a rich source of research potential for the projects at Ryerson and McMaster. This integration is also supported by IBM Canada and [Southern Ontario Smart Computing Innovation Platform] SOSCIP by providing the project access to high performance research computing resources and staff in Ontario.

“We see this as the initiation of an ongoing collaborative university and industry research program to develop and test applications of embodied AI, a research program that is well-positioned to integrate and apply emerging improvements in machine learning and social robotics innovations,” said Harris Smith.

I just went to a presentation at the facility where my mother lives and it was all about delivering more individualized and better care for residents. Given that most seniors in British Columbia care facilities do not receive the number of service hours per resident recommended by the province due to funding issues, it seemed a well-meaning initiative offered in the face of daunting odds against success. Now with this news, I wonder what impact ‘Pepper’ might ultimately have on seniors and on the people who currently deliver service. Of course, this assumes that researchers will be able to tackle problems with understanding various accents and communication strategies, which are strongly influenced by culture and, over time, the aging process.

After writing that last paragraph I stumbled onto this June 27, 2017 Sage Publications press release on EurekAlert about a related matter,

Existing digital technologies must be exploited to enable a paradigm shift in current healthcare delivery which focuses on tests, treatments and targets rather than the therapeutic benefits of empathy. Writing in the Journal of the Royal Society of Medicine, Dr Jeremy Howick and Dr Sian Rees of the Oxford Empathy Programme, say a new paradigm of empathy-based medicine is needed to improve patient outcomes, reduce practitioner burnout and save money.

Empathy-based medicine, they write, re-establishes relationship as the heart of healthcare. “Time pressure, conflicting priorities and bureaucracy can make practitioners less likely to express empathy. By re-establishing the clinical encounter as the heart of healthcare, and exploiting available technologies, this can change”, said Dr Howick, a Senior Researcher in Oxford University’s Nuffield Department of Primary Care Health Sciences.

Technology is already available that could reduce the burden of practitioner paperwork by gathering basic information prior to consultation, for example via email or a mobile device in the waiting room.

During the consultation, the computer screen could be placed so that both patient and clinician can see it, a help to both if needed, for example, to show infographics on risks and treatment options to aid decision-making and the joint development of a treatment plan.

Dr Howick said: “The spread of alternatives to face-to-face consultations is still in its infancy, as is our understanding of when a machine will do and when a person-to-person relationship is needed.” However, he warned, technology can also get in the way. A computer screen can become a barrier to communication rather than an aid to decision-making. “Patients and carers need to be involved in determining the need for, and designing, new technologies”, he said.

I sincerely hope that the Canadian project has taken into account some of the issues described in the ’empathy’ press release and in the article, which can be found here,

Overthrowing barriers to empathy in healthcare: empathy in the age of the Internet
by J. Howick and S. Rees. Journal of the Royal Society of Medicine. Article first published online: June 27, 2017. DOI: https://doi.org/10.1177/0141076817714443

This article is open access.

Vector Institute and Canada’s artificial intelligence sector

On the heels of the March 22, 2017 federal budget announcement of $125M for a Pan-Canadian Artificial Intelligence Strategy, the University of Toronto (U of T) has announced the inception of the Vector Institute for Artificial Intelligence in a March 28, 2017 news release by Jennifer Robinson (Note: Links have been removed),

A team of globally renowned researchers at the University of Toronto is driving the planning of a new institute staking Toronto’s and Canada’s claim as the global leader in AI.

Geoffrey Hinton, a University Professor Emeritus in computer science at U of T and vice-president engineering fellow at Google, will serve as the chief scientific adviser of the newly created Vector Institute based in downtown Toronto.

“The University of Toronto has long been considered a global leader in artificial intelligence research,” said U of T President Meric Gertler. “It’s wonderful to see that expertise act as an anchor to bring together researchers, government and private sector actors through the Vector Institute, enabling them to aim even higher in leading advancements in this fast-growing, critical field.”

As part of the Government of Canada’s Pan-Canadian Artificial Intelligence Strategy, Vector will share $125 million in federal funding with fellow institutes in Montreal and Edmonton. All three will conduct research and secure talent to cement Canada’s position as a world leader in AI.

In addition, Vector is expected to receive funding from the Province of Ontario and more than 30 top Canadian and global companies eager to tap this pool of talent to grow their businesses. The institute will also work closely with other Ontario universities with AI talent.

(See my March 24, 2017 posting; scroll down about 25% for the science part, including the Pan-Canadian Artificial Intelligence Strategy of the budget.)

Not obvious in last week’s coverage of the Pan-Canadian Artificial Intelligence Strategy is that the much lauded Hinton has been living in the US and working for Google. These latest announcements (Pan-Canadian AI Strategy and Vector Institute) mean that he’s moving back.

A March 28, 2017 article by Kate Allen for TorontoStar.com provides more details about the Vector Institute, Hinton, and the Canadian ‘brain drain’ as it applies to artificial intelligence (Note: A link has been removed),

Toronto will host a new institute devoted to artificial intelligence, a major gambit to bolster a field of research pioneered in Canada but consistently drained of talent by major U.S. technology companies like Google, Facebook and Microsoft.

The Vector Institute, an independent non-profit affiliated with the University of Toronto, will hire about 25 new faculty and research scientists. It will be backed by more than $150 million in public and corporate funding in an unusual hybridization of pure research and business-minded commercial goals.

The province will spend $50 million over five years, while the federal government, which announced a $125-million Pan-Canadian Artificial Intelligence Strategy in last week’s budget, is providing at least $40 million, backers say. More than two dozen companies have committed millions more over 10 years, including $5 million each from sponsors including Google, Air Canada, Loblaws, and Canada’s five biggest banks [Bank of Montreal (BMO), Canadian Imperial Bank of Commerce (CIBC; President’s Choice Financial), Royal Bank of Canada (RBC), Scotiabank (Tangerine), Toronto-Dominion Bank (TD Canada Trust)].

The mode of artificial intelligence that the Vector Institute will focus on, deep learning, has seen remarkable results in recent years, particularly in image and speech recognition. Geoffrey Hinton, considered the “godfather” of deep learning for the breakthroughs he made while a professor at U of T, has worked for Google since 2013 in California and Toronto.

Hinton will move back to Canada to lead a research team based at the tech giant’s Toronto offices and act as chief scientific adviser of the new institute.

Researchers trained in Canadian artificial intelligence labs fill the ranks of major technology companies, working on tools like instant language translation, facial recognition, and recommendation services. Academic institutions and startups in Toronto, Waterloo, Montreal and Edmonton boast leaders in the field, but other researchers have left for U.S. universities and corporate labs.

The goals of the Vector Institute are to retain, repatriate and attract AI talent, to create more trained experts, and to feed that expertise into existing Canadian companies and startups.

Hospitals are expected to be a major partner, since health care is an intriguing application for AI. Last month, researchers from Stanford University announced they had trained a deep learning algorithm to identify potentially cancerous skin lesions with accuracy comparable to human dermatologists. The Toronto company Deep Genomics is using deep learning to read genomes and identify mutations that may lead to disease, among other things.

Intelligent algorithms can also be applied to tasks that might seem less virtuous, like reading private data to better target advertising. Zemel [Richard Zemel, the institute’s research director and a professor of computer science at U of T] says the centre is creating an ethics working group [emphasis mine] and maintaining ties with organizations that promote fairness and transparency in machine learning. As for privacy concerns, “that’s something we are well aware of. We don’t have a well-formed policy yet but we will fairly soon.”

The institute’s annual funding pales in comparison to the revenues of the American tech giants, which are measured in tens of billions. The risk the institute’s backers are taking is simply creating an even more robust machine learning PhD mill for the U.S.

“They obviously won’t all stay in Canada, but Toronto industry is very keen to get them,” Hinton said. “I think Trump might help there.” Two researchers on Hinton’s new Toronto-based team are Iranian, one of the countries targeted by U.S. President Donald Trump’s travel bans.

Ethics do seem to be a bit of an afterthought. Presumably the Vector Institute’s ‘ethics working group’ won’t include any regular folks. Is there any thought to what the rest of us think about these developments? As there will also be some collaboration with other proposed AI institutes, including ones at the University of Montreal (Université de Montréal) and the University of Alberta (Kate McGillivray’s article, coming up shortly, mentions them), might the ethics group be centered in either Edmonton or Montreal? Interestingly, two Canadians (Timothy Caulfield at the University of Alberta and Eric Racine at Université de Montréal) testified at the US Commission for the Study of Bioethical Issues Feb. 10 – 11, 2014 meeting on brain research, ethics, and nanotechnology. Still speculating here, but I imagine Caulfield and/or Racine could be persuaded to extend their expertise in ethics and the human brain to AI and its neural networks.

Getting back to the topic at hand, the AI scene in Canada, Allen’s article is worth reading in its entirety if you have the time.

Kate McGillivray’s March 29, 2017 article for the Canadian Broadcasting Corporation’s (CBC) news online provides more details about the Canadian AI situation and the new strategies,

With artificial intelligence set to transform our world, a new institute is putting Toronto to the front of the line to lead the charge.

The Vector Institute for Artificial Intelligence, made possible by funding from the federal government revealed in the 2017 budget, will move into new digs in the MaRS Discovery District by the end of the year.

Vector’s funding comes partially from a $125 million investment announced in last Wednesday’s federal budget to launch a pan-Canadian artificial intelligence strategy, with similar institutes being established in Montreal and Edmonton.

“[A.I.] cuts across pretty well every sector of the economy,” said Dr. Alan Bernstein, CEO and president of the Canadian Institute for Advanced Research, the organization tasked with administering the federal program.

“Silicon Valley and England and other places really jumped on it, so we kind of lost the lead a little bit. I think the Canadian federal government has now realized that,” he said.

Stopping up the brain drain

Critical to the strategy’s success is building a homegrown base of A.I. experts and innovators — a problem in the last decade, despite pioneering work on so-called “Deep Learning” by Canadian scholars such as Yoshua Bengio and Geoffrey Hinton, a former University of Toronto professor who will now serve as Vector’s chief scientific advisor.

With few university faculty positions in Canada and with many innovative companies headquartered elsewhere, it has been tough to keep the few graduates specializing in A.I. in town.

“We were paying to educate people and shipping them south,” explained Ed Clark, chair of the Vector Institute and business advisor to Ontario Premier Kathleen Wynne.

The existence of that “fantastic science” will lean heavily on how much buy-in Vector and Canada’s other two A.I. centres get.

Toronto’s portion of the $125 million is a “great start,” said Bernstein, but taken alone, “it’s not enough money.”

“My estimate of the right amount of money to make a difference is a half a billion or so, and I think we will get there,” he said.

Jessica Murphy’s March 29, 2017 article for the British Broadcasting Corporation’s (BBC) news online offers some intriguing detail about the Canadian AI scene,

Canadian researchers have been behind some recent major breakthroughs in artificial intelligence. Now, the country is betting on becoming a big player in one of the hottest fields in technology, with help from the likes of Google and RBC [Royal Bank of Canada].

In an unassuming building on the University of Toronto’s downtown campus, Geoff Hinton laboured for years on the “lunatic fringe” of academia and artificial intelligence, pursuing research in an area of AI called neural networks.

Also known as “deep learning”, neural networks are computer programs that learn in a similar way to human brains. The field showed early promise in the 1980s, but the tech sector turned its attention to other AI methods after that promise seemed slow to develop.

“The approaches that I thought were silly were in the ascendancy and the approach that I thought was the right approach was regarded as silly,” says the British-born [emphasis mine] professor, who splits his time between the university and Google, where he is a vice-president and engineering fellow.

Neural networks are used by the likes of Netflix to recommend what you should binge watch and by smartphone voice-assistant tools. Google DeepMind’s AlphaGo AI used them to win against a human in the ancient game of Go in 2016.

Foteini Agrafioti, who heads up the new RBC Research in Machine Learning lab at the University of Toronto, said those recent innovations made AI attractive to researchers and the tech industry.

“Anything that’s powering Google’s engines right now is powered by deep learning,” she says.

Developments in the field helped jumpstart innovation and paved the way for the technology’s commercialisation. They also captured the attention of Google, IBM and Microsoft, and kicked off a hiring race in the field.

The renewed focus on neural networks has boosted the careers of early Canadian machine learning pioneers like Hinton, the University of Montreal’s Yoshua Bengio, and the University of Alberta’s Richard Sutton.

Money from big tech is coming north, along with investments by domestic corporations like banking multinational RBC and auto parts giant Magna, and millions of dollars in government funding.

Former banking executive Ed Clark will head the institute, and says the goal is to make Toronto, which has the largest concentration of AI-related industries in Canada, one of the top five places in the world for AI innovation and business.

The founders also want it to serve as a magnet and retention tool for top talent aggressively head-hunted by US firms.

Clark says they want to “wake up” Canadian industry to the possibilities of AI, which is expected to have a massive impact on fields like healthcare, banking, manufacturing and transportation.

Google invested C$4.5m (US$3.4m/£2.7m) last November [2016] in the University of Montreal’s Montreal Institute for Learning Algorithms.

Microsoft is funding a Montreal startup, Element AI. The Redmond, Washington-based company also announced it would acquire Montreal-based Maluuba and help fund AI research at the University of Montreal and McGill University.

Thomson Reuters and General Motors both recently moved AI labs to Toronto.

RBC is also investing in the future of AI in Canada, including opening a machine learning lab headed by Agrafioti, co-funding a program to bring global AI talent and entrepreneurs to Toronto, and collaborating with Sutton and the University of Alberta’s Machine Intelligence Institute.

Canadian tech also sees the travel uncertainty created by the Trump administration in the US as making Canada more attractive to foreign talent. (One of Clark’s selling points is that Toronto is an “open and diverse” city.)

This may reverse the ‘brain drain’ but it appears Canada’s role as a ‘branch plant economy’ for foreign (usually US) companies could become an important discussion once more. From the ‘Foreign ownership of companies of Canada’ Wikipedia entry (Note: Links have been removed),

Historically, foreign ownership was a political issue in Canada in the late 1960s and early 1970s, when it was believed by some that U.S. investment had reached new heights (though its levels had actually remained stable for decades), and then in the 1980s, during debates over the Free Trade Agreement.

But the situation has changed, since in the interim period Canada itself became a major investor and owner of foreign corporations. Since the 1980s, Canada’s levels of investment and ownership in foreign companies have been larger than foreign investment and ownership in Canada. In some smaller countries, such as Montenegro, Canadian investment is sizable enough to make up a major portion of the economy. In Northern Ireland, for example, Canada is the largest foreign investor. By becoming foreign owners themselves, Canadians have become far less politically concerned about investment within Canada.

Of note is that Canada’s largest companies by value, and largest employers, tend to be foreign-owned in a way that is more typical of a developing nation than a G8 member. The best example is the automotive sector, one of Canada’s most important industries. It is dominated by American, German, and Japanese giants. Although this situation is not unique to Canada in the global context, it is unique among G-8 nations, and many other relatively small nations also have national automotive companies.

It’s interesting to note that sometimes Canadian companies are the big investors but that doesn’t change our basic position. And, as I’ve noted in other postings (including the March 24, 2017 posting), these government investments in science and technology won’t necessarily lead to a move away from our ‘branch plant economy’ towards an innovative Canada.

You can find out more about the Vector Institute for Artificial Intelligence here.

BTW, I noted that reference to Hinton as ‘British-born’ in the BBC article. He was educated in the UK and subsidized by UK taxpayers (from his Wikipedia entry; Note: Links have been removed),

Hinton was educated at King’s College, Cambridge, graduating in 1970 with a Bachelor of Arts in experimental psychology.[1] He continued his study at the University of Edinburgh, where he was awarded a PhD in artificial intelligence in 1977 for research supervised by H. Christopher Longuet-Higgins.[3][12]

It seems Canadians are not the only ones to experience ‘brain drains’.

Finally, in a Feb. 28, 2017 posting I wrote at length about a recent initiative between the University of British Columbia (Vancouver, Canada) and the University of Washington (Seattle, Washington), the Cascadia Urban Analytics Cooperative, noting that the initiative is being funded by Microsoft to the tune of $1M and is part of a larger cooperative effort between the province of British Columbia and the state of Washington. Artificial intelligence is not the only area where US technology companies are hedging their bets (against Trump’s administration, which seems determined to terrify people from crossing US borders) by investing in Canada.

For anyone interested in a little more information about AI in the US and China, there’s today’s earlier posting (March 31, 2017): China, US, and the race for artificial intelligence research domination.

York University (Toronto, Ontario, Canada) research team creates 3D beating heart and matters of the heart at the Ontario Institute for Regenerative Medicine

I have two items about cardiac research in Ontario. While not strictly speaking about nanotechnology, the two items do touch on topics covered here before: 3D organs and stem cells.

York University and its 3D beating heart

A Feb. 9, 2017 York University news release (also on EurekAlert) describes an innovative approach to creating 3D heart tissue,

Matters of the heart can be complicated, but York University scientists have found a way to create 3D heart tissue that beats in synchronized harmony, like a heart in love, that will lead to better understanding of cardiac health, and improved treatments.

York U chemistry Professor Muhammad Yousaf and his team of grad students have devised a way to stick three different types of cardiac cells together, like Velcro, to make heart tissue that beats as one.

Until now, most 2D and 3D in vitro tissue did not beat in harmony and required scaffolding for the cells to hold onto and grow, causing limitations. In this research, Yousaf and his team made a scaffold-free beating tissue out of three cell types found in the heart – contractile cardiac muscle cells, connective tissue cells and vascular cells.

The researchers believe this is the first 3D in vitro cardiac tissue with three cell types that can beat together as one entity rather than at different intervals.

“This breakthrough will allow better and earlier drug testing, and potentially eliminate harmful or toxic medications sooner,” said Yousaf of York U’s Faculty of Science.

In addition, the substance used to stick cells together (ViaGlue), will provide researchers with tools to create and test 3D in vitro cardiac tissue in their own labs to study heart disease and issues with transplantation. Cardiovascular associated diseases are the leading cause of death globally and are responsible for 40 per cent of deaths in North America.

“Making in vitro 3D cardiac tissue has long presented a challenge to scientists because of the high density of cells and muscularity of the heart,” said Dmitry Rogozhnikov, a chemistry PhD student at York. “For 2D or 3D cardiac tissue to be functional it needs the same high cellular density and the cells must be in contact to facilitate synchronized beating.”

Although the 3D cardiac tissue was created at a millimeter scale, larger versions could be made, said Yousaf, who has created a start-up company OrganoLinX to commercialize the ViaGlue reagent and to provide custom 3D tissues on demand.

Here’s a link to and a citation for the paper,

Scaffold Free Bio-orthogonal Assembly of 3-Dimensional Cardiac Tissue via Cell Surface Engineering by Dmitry Rogozhnikov, Paul J. O’Brien, Sina Elahipanah, & Muhammad N. Yousaf. Scientific Reports 6, Article number: 39806 (2016) doi:10.1038/srep39806 Published online: 23 December 2016

This paper is open access.

Ontario Institute for Regenerative Medicine and its heart stem cell research

In a Feb. 13, 2017 posting on the Ontario Institute for Regenerative Medicine’s expression blog, Steven Erwood writes about how Toronto has become a centre for certain kinds of cardiac research, focusing on specific researchers (Note: Links have been removed),

You may have heard that Paris is the city of love, but you might not know that Toronto specializes in matters of the heart, particularly broken hearts.

Dr. Ren Ke Li, an investigator with the Ontario Institute for Regenerative Medicine, established his lab at the Toronto General Hospital Research Institute in 1993 hoping to find a way to replace the muscle cells, or cardiomyocytes, that are lost after a heart attack. Specifically, Li hoped to transplant a collection of cells, called stem cells, into a heart damaged by a heart attack. Stem cells have the power to differentiate into virtually any cell type, so if Li could coax them to become cardiomyocytes, they could theoretically reverse the damage caused by the heart attack.

Over the years, Li’s experiments using stem cells to regenerate and repair damaged heart tissue, which progressed all the way through to human clinical trials, pushed Li to rethink his approach to heart repair. Most of the transplanted cells failed to engraft to the host tissue and many of those that did successfully integrate into the patient’s heart remained non-contractile, sitting still beside the rest of the beating heart muscle. Despite this, the treatments were still proving beneficial — albeit less beneficial than Li had hoped. These cells weren’t replacing the lost cardiomyocytes, but they were still helping the patient recover. Li was then just beginning to reveal something that is now well described: transplanting exogenous stem cells (originating outside the patient) onto damaged tissue stimulated the endogenous stem cells to repair that damage. These transplanted stem cells were changing the behaviour of the patient’s own stem cells, enhancing their response to injury.

Li calls this process “rejuvenation” — arguing that the reason older populations can’t recover from cardiac injury is that they have fewer stem cells, and those stem cells have lost their ability to repair and regenerate damaged tissue over time. Li argues that the positive effects he was seeing in his experiments and clinical trials were a restoration or reversal of age-related deterioration in repair capability — a rejuvenation of the aged heart.

Li, alongside fellow OIRM [Ontario Institute for Regenerative Medicine] researcher and cardiac surgeon at Toronto General Hospital, Dr. Richard Weisel, dedicated a large part of their research effort to understanding this process. Weisel explains, “We put young cells into old animals, and we can get them to respond to a heart attack like a young person — which is remarkable!”

A team of researchers led by the duo published an article in Basic Research in Cardiology last month describing a new method to rejuvenate the aged heart, and characterizing this rejuvenation at the molecular and cellular level.

Successfully advancing this research to the clinic is where Weisel thinks Toronto provides a unique advantage. “We have the ability to do the clinical trials — the same people who are working on these projects [in the lab], can also take them into the clinic, and a lot of other places in the world [the clinicians and the researchers] are separate. We’ve been doing that for all the areas of stem cell research.” This unique set of circumstances, Weisel argues, more readily allows for a successful transition from research to clinical practice.

But an integrated research and clinical environment isn’t all the city has to offer to those looking to make substantial progress in stem cell therapies. Dr. Michael Laflamme, OIRM researcher and a leading authority on stem cell therapies for cardiac repair, called his decision to relocate to Toronto from the University of Washington in Seattle “a no-brainer”.

Laflamme focuses on improving the existing approaches to exogenous stem cell transplantation in cardiac repair and believes that solving the problems Li faced in his early experiments is just a matter of finding the right cell type. Laflamme, in an ongoing preclinical trial funded by OIRM, is differentiating stem cells in a bioreactor into ventricular cardiomyocytes, the specific type of cell lost after a heart attack, and delivering those cells directly to the scar tissue in hopes of turning it back into muscle. Laflamme is optimistic these ventricular cardiomyocytes might be just the cell type he’s looking for. Using these cells in animal models, although in a mixture of other cardiac cell types, Laflamme explains, “We’ve shown that those cells will stably engraft and they actually become electrically integrated with the rest of the tissue — they will [beat] in synchrony with the rest of the heart.”

Laflamme states that “Toronto is the place where we can get this stuff done better and we can get it done faster,” citing the existing Toronto-based expertise in both the differentiation of stem cells and the biotechnological means to scale these processes as being unparalleled elsewhere in the world.

It’s not only academic researchers and clinicians that recognize Toronto’s potential to advance regenerative medicine and stem cell therapy. Pharmaceutical giant Bayer, partnered with San Francisco-based venture capital firm Versant Ventures, announced last December a USD 225 million investment in a stem cell biotechnology company called BlueRock Therapeutics — the second largest investment of its kind in the history of the biotechnology industry. …

There’s substantially more to Erwood’s piece in the original posting.

One final thought: I wonder if there is a possibility that York University’s ViaGlue might be useful in the work taking place at the Ontario Institute for Regenerative Medicine. I realize the two institutions are in the same city, but do the researchers even know about each other’s work?