Tag Archives: virtual reality (VR)

“Living in a Dream,” part of Cambridge Festival (on display March 31 and April 1, 2023 in the UK)

Caption: Dream artwork by Jewel Chang of Anglia Ruskin University, which will be on display at the Cambridge Festival. Credit: Jewel Chang, Anglia Ruskin University

Let’s clear up a few things. First, as noted in the headline, the Cambridge Festival (March 17 – April 2, 2023) is being held in the UK by the University of Cambridge in the town of Cambridge. Second, the specific festival event featured here is a display put together by students and professors at Anglia Ruskin University (ARU), also in the town of Cambridge, as part of the festival; it will be held for two days, March 31 – April 1, 2023.

A March 27, 2023 ARU press release (also on EurekAlert) provides more details about the two-day display. Note: Links have been removed,

Dreams are being turned into reality as new research investigating the unusual experiences of people with depersonalisation symptoms is being brought to life in an art exhibition at Anglia Ruskin University (ARU) in Cambridge, England.

ARU neuroscientist Dr Jane Aspell has led a major international study into depersonalisation, funded by the Bial Foundation. The “Living in a Dream” project, results from which will be published later this year, found that people who experience depersonalisation symptoms sometimes experience life from a very different perspective, both while awake and while dreaming.

Those experiencing depersonalisation often report feeling as though they are not real and that their body does not belong to them. Dr Aspell’s study, which is the first to examine how people with this disorder experience dreams, collected almost 1,000 dream reports from participants.

Now these dreams have been recreated by eight students from ARU’s MA Illustration course and the artwork will go on display for the first time on 31 March and 1 April as part of the Cambridge Festival.

This collaboration between art and science, led by psychologist Matt Gwyther and illustrator Dr Nanette Hoogslag, with the support of artist and creative technologist Emily Godden, has resulted in 12 original artworks, which have been created using the latest audio-visual technologies, including artificial intelligence (AI), and are presented using a mix of audio-visual installation, virtual reality (VR) experiences, and traditional media.

Dr Jane Aspell, Associate Professor of Cognitive Neuroscience at ARU and Head of the Self and Body Lab, said: “People who experience depersonalisation sometimes feel detached from their self and body, and a common complaint is that it’s like they are watching their own life as a film.

“Because their waking reality is so different, myself and my international collaborators – Dr Anna Ciaunica, Professor Bigna Lenggenhager and Dr Jennifer Windt – were keen to investigate how they experience their dreams.

“People who took part in the study completed daily ‘dream diaries’, and it is fabulous to see how these dreams have been recreated by this group of incredibly talented artists.”

Matt Gwyther added: “Dreams are both incredibly visual and surreal, and you lose so much when attempting to put them into words. By bringing them to life as art, it has not only produced fabulous artwork, but it also helps us as scientists better understand the experiences of our research participants.”

Amongst the artists contributing to the exhibition is MA student Jewel Chang, who has recreated a dream about being chased. When the person woke up, they continued to experience it and were unsure whether they were experiencing the dream or reality.

False awakenings and multiple layers of dreams can be confusing, affecting our perception of time and space. Jewel used AI to create an environment with depth and endless moving patterns that makes the visitor feel trapped in their dream, unable to escape.

Kelsey Wu, meanwhile, used special 3D software and cameras to recreate a dream of floating over hills and forests, and losing balance. The immersive piece, with the audience invited to sit on a grass-covered floor, creates a sense of loss of control of the body, which moves in an abnormal and unbalanced way, and evokes a struggle between illusion and reality as the landscape continuously moves.

Dr Nanette Hoogslag, Course Leader for the MA in Illustration at ARU, said: “This project has been a unique challenge, where students not only applied themselves in supporting scientific research, but investigated and used a range of new technologies, including virtual reality and AI-generated imagery. The final pieces are absolutely remarkable, and also slightly unsettling!”

You can find out more about the 2023 Cambridge Festival here and about the Anglia Ruskin University exhibit, “Living in a Dream: A visual exploration of the self in dreams using AI technology” here.

Electrotactile rendering device virtualizes the sense of touch

I stumbled across this November 15, 2022 news item on Nanowerk highlighting work on the sense of touch in virtual environments, originally announced in October 2022,

A collaborative research team co-led by City University of Hong Kong (CityU) has developed a wearable tactile rendering system, which can mimic the sensation of touch with high spatial resolution and a rapid response rate. The team demonstrated its application potential in a braille display, adding the sense of touch in the metaverse for functions such as virtual reality shopping and gaming, and potentially facilitating the work of astronauts, deep-sea divers and others who need to wear thick gloves.

Here’s what you’ll need to wear for this virtual tactile experience,

Caption: The new wearable tactile rendering system can mimic touch sensations with high spatial resolution and a rapid response rate. Credit: Robotics X Lab and City University of Hong Kong

An October 20, 2022 City University of Hong Kong (CityU) press release (also on EurekAlert), which originated the news item, delves further into the research,

“We can hear and see our families over a long distance via phones and cameras, but we still cannot feel or hug them. We are physically isolated by space and time, especially during this long-lasting pandemic,” said Dr Yang Zhengbao, Associate Professor in the Department of Mechanical Engineering of CityU, who co-led the study. “Although there has been great progress in developing sensors that digitally capture tactile features with high resolution and high sensitivity, we still lack a system that can effectively virtualize the sense of touch, one that can record and play back tactile sensations over space and time.”

In collaboration with Chinese tech giant Tencent’s Robotics X Laboratory, the team developed a novel electrotactile rendering system for displaying various tactile sensations with high spatial resolution and a rapid response rate. Their findings were published in the scientific journal Science Advances under the title “Super-resolution Wearable Electro-tactile Rendering System”.

Limitations in existing techniques

Existing techniques to reproduce tactile stimuli can be broadly classified into two categories: mechanical and electrical stimulation. By applying a localised mechanical force or vibration on the skin, mechanical actuators can elicit stable and continuous tactile sensations. However, they tend to be bulky, limiting the spatial resolution when integrated into a portable or wearable device. Electrotactile stimulators, in contrast, which evoke touch sensations in the skin at the location of the electrode by passing a local electric current through the skin, can be light and flexible while offering higher resolution and a faster response. But most of them rely on high-voltage direct-current (DC) pulses (up to hundreds of volts) to penetrate the stratum corneum, the outermost layer of the skin, to stimulate the receptors and nerves, which poses a safety concern. Also, the tactile rendering resolution needs to be improved.

The latest electro-tactile actuator developed by the team is very thin and flexible and can be easily integrated into a finger cot. This fingertip wearable device can display different tactile sensations, such as pressure, vibration, and texture roughness in high fidelity. Instead of using DC pulses, the team developed a high-frequency alternating stimulation strategy and succeeded in lowering the operating voltage under 30 V, ensuring the tactile rendering is safe and comfortable.
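As an aside, the benefit of the alternating drive can be illustrated numerically. The sketch below is illustrative only: the release reports operation under 30 V but gives no waveform details, so the frequency, amplitude, and sampling values here are invented. It samples a simple sinusoidal alternating drive and confirms the signal stays within the stated voltage ceiling.

```python
import math

def ac_drive(freq_hz=4000, amp_v=25, duration_s=0.001, rate_hz=100_000):
    """Sample a biphasic (alternating) stimulation waveform.

    Alternating polarity at kHz rates couples charge through the
    capacitive stratum corneum at far lower amplitude than the
    hundreds-of-volts DC pulses older stimulators required.
    """
    n = int(duration_s * rate_hz)  # 100 samples over 1 ms
    return [amp_v * math.sin(2 * math.pi * freq_hz * t / rate_hz)
            for t in range(n)]

wave = ac_drive()
print(max(wave) <= 25, min(wave) >= -25)  # True True — stays under ±25 V
```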

They also proposed a novel super-resolution strategy that can render tactile sensation at locations between physical electrodes, instead of only at the electrode locations. This increases the spatial resolution of their stimulators by more than three times (from 25 to 105 points), so the user can feel more realistic tactile perception.
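The release doesn’t explain how the super-resolution rendering works internally, but the general idea of evoking a sensation between two physical electrodes can be sketched as amplitude-weighted interpolation, sometimes called a “phantom sensation” in haptics. Everything in this snippet (the function name, the linear weighting, the electrode positions) is an assumption for illustration, not the team’s published method:

```python
def electrode_weights(target, electrodes):
    """Split stimulation intensity between the two electrodes that
    bracket a target position, so the perceived locus falls between
    them (linear 'phantom sensation' interpolation)."""
    left = max(e for e in electrodes if e <= target)
    right = min(e for e in electrodes if e >= target)
    if left == right:                      # target sits on an electrode
        return {left: 1.0}
    span = right - left
    # the closer electrode gets the larger share of the intensity
    return {left: (right - target) / span,
            right: (target - left) / span}

# electrodes every 1.0 mm; render a virtual point at 0.25 mm
print(electrode_weights(0.25, [0.0, 1.0, 2.0]))  # {0.0: 0.75, 1.0: 0.25}
```

With this kind of scheme, each gap between physical electrodes yields several distinguishable virtual stimulation points, which is consistent with the reported jump from 25 to 105 points.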

Tactile stimuli with high spatial resolution

“Our new system can elicit tactile stimuli with both high spatial resolution (76 dots/cm2), similar to the density of related receptors in the human skin, and a rapid response rate (4 kHz),” said Mr Lin Weikang, a PhD student at CityU, who made and tested the device.

The team ran different tests to show various application possibilities of this new wearable electrotactile rendering system. For example, they proposed a new Braille strategy that is much easier for people with a visual impairment to learn.

The proposed strategy breaks down the alphabet and numerical digits into individual strokes and order in the same way they are written. By wearing the new electrotactile rendering system on a fingertip, the user can recognise the alphabet presented by feeling the direction and the sequence of the strokes with the fingertip sensor. “This would be particularly useful for people who lose their eye sight later in life, allowing them to continue to read and write using the same alphabetic system they are used to, without the need to learn the whole Braille dot system,” said Dr Yang.
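As a hypothetical illustration of the stroke-based idea (the stroke names and letter encodings below are invented; the release does not give the actual encoding), each letter can be stored as an ordered stroke sequence and played back on the fingertip in the order it would be written:

```python
# Hypothetical stroke table: each letter is an ordered list of stroke
# directions, replayed on the fingertip in the same order as handwriting.
STROKES = {
    "L": ["down", "right"],
    "T": ["right", "down"],
    "V": ["down-right", "up-right"],
}

def play(letter, render):
    """Send each stroke of a letter to the tactile renderer in sequence."""
    for stroke in STROKES[letter]:
        render(stroke)

felt = []                 # stand-in for the wearable renderer
play("L", felt.append)
print(felt)  # ['down', 'right']
```

The appeal of such an encoding is that a late-blind reader already knows the stroke order of the letters, so nothing new has to be memorized, unlike the Braille dot system.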

Enabling touch in the metaverse

Second, the new system is well suited for VR/AR [virtual reality/augmented reality] applications and games, adding the sense of touch to the metaverse. The electrodes can be made highly flexible and scalable to cover larger areas, such as the palm. The team demonstrated that a user can virtually sense the texture of clothes in a virtual fashion shop. The user also experiences an itchy sensation in the fingertips when being licked by a VR cat. When stroking a virtual cat’s fur, the user can feel a variance in the roughness as the strokes change direction and speed.

The system can also be useful in transmitting fine tactile details through thick gloves. The team successfully integrated the thin, light electrodes of the electrotactile rendering system into flexible tactile sensors on a safety glove. The tactile sensor array captures the pressure distribution on the exterior of the glove and relays the information to the user in real time through tactile stimulation. In the experiment, the user could quickly and accurately locate a tiny steel washer just 1 mm in radius and 0.44 mm thick based on the tactile feedback from the glove with sensors and stimulators. This shows the system’s potential in enabling high-fidelity tactile perception, which is currently unavailable to astronauts, firefighters, deep-sea divers and others who need to wear thick protective suits or gloves.

“We expect our technology to benefit a broad spectrum of applications, such as information transmission, surgical training, teleoperation, and multimedia entertainment,” added Dr Yang.

Here’s a link to and a citation for the paper,

Super-resolution wearable electrotactile rendering system by Weikang Lin, Dongsheng Zhang, Wang Wei Lee, Xuelong Li, Ying Hong, Qiqi Pan, Ruirui Zhang, Guoxiang Peng, Hong Z. Tan, Zhengyou Zhang, Lei Wei, and Zhengbao Yang. Science Advances 9 Sep 2022 Vol 8, Issue 36 DOI: 10.1126/sciadv.abp8738

This paper is open access.

XR (extended reality) conference in Rome, Italy and four new projects at the Council of Canadian Academies (CCA)

As noted in the headline for this post, I have two items. For anyone unfamiliar with XR and the other (AR, MR, and VR) realities, I found a good description which I placed in my October 22, 2021 posting (scroll down to the “How many realities are there?” subhead about 70% of the way down).

eXtended Reality in Rome

I got an invitation (via a February 24, 2022 email) to participate in a special session at one of the 2022 IEEE (Institute of Electrical and Electronics Engineers) conferences (more about the conference later).

First, from the Special Session 10, eXtended Reality as a gateway to the Metaverse: Practices, Theories, Technologies and Applications webpage,

ABSTRACT

The fast development of Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) solutions over the last few years is transforming how people interact, work, and communicate. The term eXtended Reality (XR) encompasses all those immersive technologies that can shift the boundaries between digital and physical worlds to realize the Metaverse. According to tech companies and venture capitalists, the Metaverse will be a super-platform that convenes sub-platforms: social media, online video games, and ease-of-life apps, all accessible through the same digital space and sharing the same digital economy. Inside the Metaverse, virtual worlds will allow avatars to carry out all human endeavours, including creation, display, entertainment, social interaction, and trading. Thus, the Metaverse will evolve how users interact with brands, intellectual properties, and each other on the Internet. A user could join friends to play a multiplayer game, watch a movie via a streaming service and then attend a university course, precisely the same as in the real world.

The Metaverse development will require new software architecture that will enable decentralized and collaborative virtual worlds. These self-organized virtual worlds will be permanent and will require maintenance operations. In addition, it will be necessary to design efficient data management systems and prevent privacy violations. Finally, the convergence of physical reality, virtual enhancement, and an always-on virtual space highlights the need to rethink the current paradigms for visualization, interaction, and sharing of digital information, moving toward more natural, intuitive, dynamically customizable, multimodal, and multi-user solutions.

TOPICS

The topics of interest include, but are not limited to, the following:

Hardware/Software Architectures for Metaverse

Decentralized and Collaborative Architectures for Metaverse

Interoperability for Metaverse

Tools to help creators to build the Metaverse

Operations and Maintenance in Metaverse

Data security and privacy mechanisms for Metaverse

Cryptocurrency, token, NFT Solutions for Metaverse

Fraud-Detection in Metaverse

Cyber Security for Metaverse

Data Analytics to Identify Malicious Behaviors in Metaverse

Blockchain/AI technologies in Metaverse

Emerging Technologies and Applications for Metaverse

New models to evaluate the impact of the Metaverse

Interactive Data Exploration and Presentation in Metaverse

Human factors issues related to Metaverse

Proof-of-Concept in Metaverse: Experimental Prototyping and Testbeds

ABOUT THE ORGANIZERS

Giuseppe Caggianese is a Research Scientist at the National Research Council of Italy. He received the Laurea degree in computer science magna cum laude in 2010 and the Ph.D. degree in Methods and Technologies for Environmental Monitoring in 2013 from the University of Basilicata, Italy.

His research activities are focused on the field of Human-Computer Interaction (HCI) and Artificial Intelligence (AI) to design and test advanced interfaces adaptive to specific uses and users in both augmented and virtual reality. He authored more than 30 scientific papers published in international journals, conference proceedings, and books. He also serves on program committees of several international conferences and workshops.

Ugo Erra is an Assistant Professor (qualified as Associate Professor) at the University of Basilicata (UNIBAS), Italy. He is the founder of the Computer Graphics Laboratory at the University of Basilicata. He received an MSc/diploma degree in Computer Science from the University of Salerno, Italy, in 2001 and a PhD in Computer Science in 2004.

His research focuses on Real-Time Computer Graphics, Information Visualization, Artificial Intelligence, and Parallel Computing. He has been involved in several research projects; among these, one project funded by the European Commission, on which he worked as a research fellow, and four projects funded by Area Science Park, a public national research organization that promotes the development of innovation processes, on which he was principal investigator. He has (co-)authored about 14 international journal articles, 45 international conference proceedings, and two book chapters. He has supervised four PhD students. He organized the Workshop on Parallel and Distributed Agent-Based Simulations, a satellite workshop of Euro-Par, from 2013 to 2015. He has served more than 20 international conferences as a program committee member and more than ten journals as a referee.

As promised, here’s more about the conference with information about how to respond to the call for papers both for the special session and the conference at large. From the 2022 IEEE International Conference on Metrology for Extended Reality, Artificial Intelligence and Neural Engineering (IEEE MetroXRAINE 2022) website,

The 2022 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence, and Neural Engineering (IEEE MetroXRAINE 2022) will be an international event mainly aimed at creating a synergy between experts in eXtended Reality, Brain-Computer Interface, and Artificial Intelligence, with special attention to measurement [i.e., metrology].

The conference will be a unique opportunity for discussion among scientists, technologists, and companies on very specific sectors in order to increase the visibility and the scientific impact for the participants. The organizing formula will be original owing to the emphasis on the interaction between the participants to exchange ideas and material useful for their research activities.

MetroXRAINE will be configured as a synergistic collection of sessions organized by the individual members of the Scientific Committee. Round tables will be held for different projects and hot research topics. Moreover, we will have demo sessions, student contests, interactive company expositions, awards, and so on.

The Conference will be a hybrid conference [emphasis mine], with the possibility of attendance remotely or in presence.

CALL FOR PAPERS

The Program Committee invites authors to submit abstracts (1 – 2 pages) for the IEEE MetroXRAINE 2022 Conference, 26-28 October, 2022.

All contributions will be peer-reviewed and acceptance will be based on quality, originality and relevance. Accepted papers will be submitted for inclusion into IEEE Xplore Digital Library.

Extended versions of presented papers are eligible for post publication.

Abstract Submission Deadline:

March 28, 2022

Full Paper Submission Deadline:

May 10, 2022

Extended Abstract Acceptance Notification:

June 10, 2022

Final Paper Submission Deadline:

July 30, 2022

According to the email invitation, “IEEE MetroXRAINE 2022 … will be held on October 26-28, 2022 in Rome.” You can find more details on the conference website.

Council of Canadian Academies launches four projects

This too is from an email. From the Council of Canadian Academies (CCA) announcement received February 27, 2022 (you can find the original February 17, 2022 CCA news release here),

The Council of Canadian Academies (CCA) is pleased to announce it will undertake four new assessments beginning this spring:

Gene-edited Organisms for Pest Control
Advances in gene editing tools and technologies have made the process of changing an organism’s genome more efficient, opening up a range of potential applications. One such application is pest control. By editing the genomes of organisms and introducing them into wild populations, it’s now possible to control insect-borne disease and invasive species, or reverse insecticide resistance in pests. But the full implications of using these methods remain uncertain.

This assessment will examine the scientific, bioethical, and regulatory challenges associated with the use of gene-edited organisms and technologies for pest control.

Sponsor: Health Canada’s Pest Management Regulatory Agency

The Future of Arctic and Northern Research in Canada
The Arctic is undergoing unprecedented changes, spurred in large part by climate change and globalization. Record levels of sea ice loss are expected to lead to increased trade through the Northwest Passage. Ocean warming and changes to the tundra will transform marine and terrestrial ecosystems, while permafrost thaw will have significant effects on infrastructure and the release of greenhouse gases. As a result of these trends, Northern communities, and Canada as an Arctic and maritime country, are facing profound economic, social, and ecosystem impacts.

This assessment will examine the key foundational elements to create an inclusive, collaborative, effective, and world-class Arctic and northern science system in Canada.

Sponsor: A consortium of Arctic and northern research and science organizations from across Canada led by ArcticNet

Quantum Technologies
Quantum technologies will affect all sectors of the Canadian economy. Built on the principles of quantum physics, these emerging technologies present significant opportunities in the areas of sensing and metrology, computation and communication, and data science and artificial intelligence, among others. But there is also the potential they could be used to facilitate cyberattacks, putting financial systems, utility grids, infrastructure, personal privacy, and national security at risk. A comprehensive exploration of the capabilities and potential vulnerabilities of these technologies will help to inform their future deployment across society and the economy.

This assessment will examine the impacts, opportunities, and challenges quantum technologies present for industry, governments, and people in Canada.

Sponsor: National Research Council Canada and Innovation, Science and Economic Development Canada

International Science and Technology Partnership Opportunities
International partnerships focused on science, technology, and innovation can provide Canada with an opportunity to advance the state of knowledge in areas of national importance, help address global challenges, and contribute to UN Sustainable Development Goals. Canadian companies could also benefit from global partnerships to access new and emerging markets.

While there are numerous opportunities for international collaborations, Canada has finite resources to support them. Potential partnerships need to be evaluated not just on strengths in areas such as science, technology, and innovation, but also political and economic factors.

This assessment will examine how public, private, and academic organizations can evaluate and prioritize science and technology partnership opportunities with other countries to achieve key national objectives.

Sponsor: Global Affairs Canada

Gene-edited Organisms for Pest Control and International Science and Technology Partnership Opportunities are funded by Innovation, Science and Economic Development Canada (ISED). Quantum Technologies is funded by the National Research Council of Canada (NRC) and ISED, and the Future of Arctic and Northern Research in Canada is funded by a consortium of Arctic and northern research and science organizations from across Canada led by ArcticNet. The reports will be released in 2023-24.

Multidisciplinary expert panels will be appointed in the coming months for all four assessments.

You can find in-progress and completed CCA reports here.

Fingers crossed that the CCA looks a little further afield for their international experts than the US, UK, Australia, New Zealand, and northern Europe.

Finally, I’m guessing that the gene-editing and pest management report will cover and, gingerly, recommend germline editing (which is currently not allowed in Canada) and gene drives too.

It will be interesting to see who’s on that committee. If you’re really interested in the report topic, you may want to check out my April 26, 2019 posting and scroll down to the “Criminal ban on human gene-editing of inheritable cells (in Canada)” subhead where I examined what seemed to be an informal attempt to persuade policy makers to allow germline editing or gene-editing of inheritable cells in Canada.

INTER/her, a talk with Camille Baker about an immersive journey inside the female body on Friday, December 3, 2021

Before getting to the announcement, this talk and Q&A (question and answer) session is being co-hosted by ArtSci Salon at the Fields Institute for Research in Mathematical Sciences and the OCAD University/DMG Bodies in Play (BiP) initiative.

For anyone curious about OCAD, it was the Ontario College of Art and Design and then, in a very odd government/marketing (?) move, they added the word university. As for DMG, in their own words and from their About page, “DMG is a not-for-profit videogame arts organization that creates space for marginalized creators to make, play and critique videogames within a cultural context.” They are located in Toronto, Ontario. Finally, the Art/Sci Salon and the Fields Institute are located at the University of Toronto.

As for the talk, here’s more from the November 28, 2021 Art/Sci Salon announcement (received via email),

Inspired by her own experience with the health care system to treat a post-reproductive disease, interdisciplinary artist [Camille] Baker created the project INTER/her, an immersive installation and VR [virtual reality] experience exploring the inner world of women’s bodies and the reproductive diseases they suffer. The project was created to open up the conversation about phenomena experienced by women in their late 30s (sometimes earlier), their 40s, and sometimes after menopause. Working in consultation with a gynecologist, the project features interviews with several women telling their stories. The themes in the work include issues of female identity, sexuality, body image, loss of body parts, pain, disease, and cancer. INTER/her has a focus on female reproductive diseases explored through a feminist lens: as personal exploration, as a conversation starter, to raise greater public awareness, and to encourage community building. The work also represents the lived experience of women’s pain and anger, and their conflicting thoughts through self-care and the growth of disease. Feelings of mortality are explored through a medical process in male-dominated medical institutions and a dearth of reliable information. https://inter-her.art/

In 2021, the installation was shortlisted for the Lumen Prize.

Join us for a talk and Q&A with the artist to discuss her work and its future development.

Friday, December 3, 6:00 pm EST

Register in advance for this meeting: https://utoronto.zoom.us/meeting/register/tZ0rcO6rpzsvGd057GQmTyAERmRRLI2MQ4L1

After registering, you will receive a confirmation email containing information about joining the meeting.

This talk is co-hosted by the ArtSci Salon at the Fields Institute for Research in Mathematical Sciences and the OCAD University/DMG Bodies in Play (BiP) initiative.

This event will be recorded and archived on the ArtSci Salon YouTube channel.

Bio

Camille Baker is a Professor in Interactive and Immersive Arts at the University for the Creative Arts [UCA], Farnham, Surrey (UK). She is an artist-performer/researcher/curator working across various art forms: immersive experiences, participatory performance and interactive art, mobile media art, tech fashion/soft circuits/DIY electronics, responsive interfaces and environments, and emerging media curating. A maker of participatory performance and immersive artwork, Baker develops methods to explore expressive non-verbal modes of communication, extended embodiment, and presence in real, mixed reality, and interactive art contexts, using XR, haptics/e-textiles, wearable devices and mobile media. She has an ongoing fascination with all things emotional, embodied, felt, sensed, visceral, physical, and relational.

Her 2018 book New Directions in Mobile Media and Performance showcases exciting approaches and artists in this space, as well as her own work. Since 2014 she has been running a regular meetup group with smart/e-textile artists and designers, called e-stitches, where participants share their practice and facilitate workshops on new techniques and innovations. Baker was also Principal Investigator for UCA on the EU-funded STARTS Ecosystem (starts.eu), April 2019 – November 2021, and was the founder and initiator of the EU WEAR Sustain project, January 2017 – April 2019 (wearsustain.eu).

The EU (European Union) provided the funding for S+T+Arts (Science, Technology & the Arts), an initiative of the European Commission. I gather that Baker was involved in two STARTS projects: the WEAR Sustain project and the STARTS Ecosystem.

The metaverse or not

The ‘metaverse’ seems to be everywhere these days, especially since Facebook has made a number of announcements about theirs (more about that later in this posting).

At this point, the metaverse is very hyped up despite having been around for about 30 years. According to the Wikipedia timeline (see the Metaverse entry), the first one was a MOO in 1993 called ‘The Metaverse’. In any event, it seems like it might be a good time to see what’s changed since I dipped my toe into a metaverse (Second Life by Linden Labs) in 2007.

(For grammar buffs, I switched from definite article [the] to indefinite article [a] purposefully. In reading the various opinion pieces and announcements, it’s not always clear whether they’re talking about a single, overarching metaverse [the] replacing the single, overarching internet or whether there will be multiple metaverses, in which case [a].)

The hype/the buzz … call it what you will

This September 6, 2021 piece by Nick Pringle for Fast Company dates the beginning of the metaverse to a 1992 science fiction novel before launching into some typical marketing hype (for those who don’t know, hype is the short form for hyperbole; Note: Links have been removed),

The term metaverse was coined by American writer Neal Stephenson in his 1992 sci-fi hit Snow Crash. But what was far-flung fiction 30 years ago is now nearing reality. At Facebook’s most recent earnings call [June 2021], CEO Mark Zuckerberg announced the company’s vision to unify communities, creators, and commerce through virtual reality: “Our overarching goal across all of these initiatives is to help bring the metaverse to life.”

So what actually is the metaverse? It’s best explained as a collection of 3D worlds you explore as an avatar. Stephenson’s original vision depicted a digital 3D realm in which users interacted in a shared online environment. Set in the wake of a catastrophic global economic crash, the metaverse in Snow Crash emerged as the successor to the internet. Subcultures sprung up alongside new social hierarchies, with users expressing their status through the appearance of their digital avatars.

Today virtual worlds along these lines are formed, populated, and already generating serious money. Household names like Roblox and Fortnite are the most established spaces; however, there are many more emerging, such as Decentraland, Upland, Sandbox, and the soon to launch Victoria VR.

These metaverses [emphasis mine] are peaking at a time when reality itself feels dystopian, with a global pandemic, climate change, and economic uncertainty hanging over our daily lives. The pandemic in particular saw many of us escape reality into online worlds like Roblox and Fortnite. But these spaces have proven to be a place where human creativity can flourish amid crisis.

In fact, we are currently experiencing an explosion of platforms parallel to the dotcom boom. While many of these fledgling digital worlds will become what Ask Jeeves was to Google, I predict [emphasis mine] that a few will match the scale and reach of the tech giant—or even exceed it.

Because the metaverse brings a new dimension to the internet, brands and businesses will need to consider their current and future role within it. Some brands are already forging the way and establishing a new genre of marketing in the process: direct to avatar (D2A). Gucci sold a virtual bag for more than the real thing in Roblox; Nike dropped virtual Jordans in Fortnite; Coca-Cola launched avatar wearables in Decentraland, and Sotheby’s has an art gallery that your avatar can wander in your spare time.

D2A is being supercharged by blockchain technology and the advent of digital ownership via NFTs, or nonfungible tokens. NFTs are already making waves in art and gaming. More than $191 million was transacted on the “play to earn” blockchain game Axie Infinity in its first 30 days this year. This kind of growth makes NFTs hard for brands to ignore. In the process, blockchain and crypto are starting to feel less and less like “outsider tech.” There are still big barriers to be overcome—the UX of crypto being one, and the eye-watering environmental impact of mining being the other. I believe technology will find a way. History tends to agree.

Detractors see the metaverse as a pandemic fad, wrapping it up with the current NFT bubble or reducing it to Zuck’s [Mark Zuckerberg and Facebook] dystopian corporate landscape. This misses the bigger behavior change that is happening among Gen Alpha. When you watch how they play, it becomes clear that the metaverse is more than a buzzword.

For Gen Alpha [emphasis mine], gaming is social life. While millennials relentlessly scroll feeds, Alphas and Zoomers [emphasis mine] increasingly stroll virtual spaces with their friends. Why spend the evening staring at Instagram when you can wander around a virtual Harajuku with your mates? If this seems ridiculous to you, ask any 13-year-old what they think.

Who is Nick Pringle and how accurate are his predictions?

At the end of his September 6, 2021 piece, you’ll find this,

Nick Pringle is SVP [Senior Vice President] executive creative director at R/GA London.

According to the R/GA Wikipedia entry,

… [the company] evolved from a computer-assisted film-making studio to a digital design and consulting company, as part of a major advertising network.

Here’s how Pringle sees our future, his September 6, 2021 piece,

By thinking “virtual first,” you can see how these spaces become highly experimental, creative, and valuable. The products you can design aren’t bound by physics or marketing convention—they can be anything, and are now directly “ownable” through blockchain. …

I believe that the metaverse is here to stay. That means brands and marketers now have the exciting opportunity to create products that exist in multiple realities. The winners will understand that the metaverse is not a copy of our world, and so we should not simply paste our products, experiences, and brands into it.

I emphasized “These metaverses …” in the previous section to highlight how confusing I find the use of ‘metaverses’ vs. ‘worlds’, as the words are sometimes used as synonyms and sometimes as distinct terms. We shift word meanings like this all the time in conversation, but for someone who’s an outsider to a particular occupational group or subculture, the shifts can make for confusion.

As for Gen Alpha and Zoomer, I’m not a fan of ‘Gen anything’ as shorthand for describing a cohort based on birth years. For example, “For Gen Alpha [emphasis mine], gaming is social life,” ignores social and economic classes, as well as the importance of location/geography, e.g., Afghanistan in contrast to the US.

To answer the question I asked: Pringle offers no track record for the accuracy of his predictions, but I was able to discover that he is a “multiple Cannes Lions award-winning creative” (more here).

A more measured view of the metaverse

An October 4, 2021 article (What is the metaverse, and do I have to care? One part definition, one part aspiration, one part hype) by Adi Robertson and Jay Peters for The Verge offers a deeper dive into the metaverse (Note: Links have been removed),

In recent months you may have heard about something called the metaverse. Maybe you’ve read that the metaverse is going to replace the internet. Maybe we’re all supposed to live there. Maybe Facebook (or Epic, or Roblox, or dozens of smaller companies) is trying to take it over. And maybe it’s got something to do with NFTs [non-fungible tokens]?

Unlike a lot of things The Verge covers, the metaverse is tough to explain for one reason: it doesn’t necessarily exist. It’s partly a dream for the future of the internet and partly a neat way to encapsulate some current trends in online infrastructure, including the growth of real-time 3D worlds.

Then what is the real metaverse?

There’s no universally accepted definition of a real “metaverse,” except maybe that it’s a fancier successor to the internet. Silicon Valley metaverse proponents sometimes reference a description from venture capitalist Matthew Ball, author of the extensive Metaverse Primer:

“The Metaverse is an expansive network of persistent, real-time rendered 3D worlds and simulations that support continuity of identity, objects, history, payments, and entitlements, and can be experienced synchronously by an effectively unlimited number of users, each with an individual sense of presence.”

Facebook, arguably the tech company with the biggest stake in the metaverse, describes it more simply:

“The ‘metaverse’ is a set of virtual spaces where you can create and explore with other people who aren’t in the same physical space as you.”

There are also broader metaverse-related taxonomies like one from game designer Raph Koster, who draws a distinction between “online worlds,” “multiverses,” and “metaverses.” To Koster, online worlds are digital spaces — from rich 3D environments to text-based ones — focused on one main theme. Multiverses are “multiple different worlds connected in a network, which do not have a shared theme or ruleset,” including Ready Player One’s OASIS. And a metaverse is “a multiverse which interoperates more with the real world,” incorporating things like augmented reality overlays, VR dressing rooms for real stores, and even apps like Google Maps.

If you want something a little snarkier and more impressionistic, you can cite digital scholar Janet Murray — who has described the modern metaverse ideal as “a magical Zoom meeting that has all the playful release of Animal Crossing.”

But wait, now Ready Player One isn’t a metaverse and virtual worlds don’t have to be 3D? It sounds like some of these definitions conflict with each other.

An astute observation.

Why is the term “metaverse” even useful? “The internet” already covers mobile apps, websites, and all kinds of infrastructure services. Can’t we roll virtual worlds in there, too?

Matthew Ball favors the term “metaverse” because it creates a clean break with the present-day internet. [emphasis mine] “Using the metaverse as a distinctive descriptor allows us to understand the enormity of that change and in turn, the opportunity for disruption,” he said in a phone interview with The Verge. “It’s much harder to say ‘we’re late-cycle into the last thing and want to change it.’ But I think understanding this next wave of computing and the internet allows us to be more proactive than reactive and think about the future as we want it to be, rather than how to marginally affect the present.”

A more cynical spin is that “metaverse” lets companies dodge negative baggage associated with “the internet” in general and social media in particular. “As long as you can make technology seem fresh and new and cool, you can avoid regulation,” researcher Joan Donovan told The Washington Post in a recent article about Facebook and the metaverse. “You can run defense on that for several years before the government can catch up.”

There’s also one very simple reason: it sounds more futuristic than “internet” and gets investors and media people (like us!) excited.

People keep saying NFTs are part of the metaverse. Why?

NFTs are complicated in their own right, and you can read more about them here. Loosely, the thinking goes: NFTs are a way of recording who owns a specific virtual good, creating and transferring virtual goods is a big part of the metaverse, thus NFTs are a potentially useful financial architecture for the metaverse. Or in more practical terms: if you buy a virtual shirt in Metaverse Platform A, NFTs can create a permanent receipt and let you redeem the same shirt in Metaverse Platforms B to Z.

Lots of NFT designers are selling collectible avatars like CryptoPunks, Cool Cats, and Bored Apes, sometimes for astronomical sums. Right now these are mostly 2D art used as social media profile pictures. But we’re already seeing some crossover with “metaverse”-style services. The company Polygonal Mind, for instance, is building a system called CryptoAvatars that lets people buy 3D avatars as NFTs and then use them across multiple virtual worlds.
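The ownership-record mechanics described in that excerpt can be sketched in a few lines. To be clear, this is a toy, in-memory registry of my own devising, not a real blockchain and not the ERC-721 standard; the class and method names are hypothetical. It only illustrates the bookkeeping idea: mint a token, transfer it between owners, and let any platform consult the same ledger.

```python
# Toy sketch of the NFT ownership-record idea: a shared ledger that records
# who minted a virtual good and who owns it now. Real NFTs do this on a
# blockchain; this in-memory version only mimics the bookkeeping.

class OwnershipRegistry:
    """Minimal ledger mapping token IDs to their current owners."""

    def __init__(self):
        self._owners = {}   # token_id -> current owner
        self._history = {}  # token_id -> list of (from_owner, to_owner)

    def mint(self, token_id, creator):
        if token_id in self._owners:
            raise ValueError(f"token {token_id!r} already exists")
        self._owners[token_id] = creator
        self._history[token_id] = [(None, creator)]

    def transfer(self, token_id, seller, buyer):
        # Only the recorded owner can pass the token on.
        if self._owners.get(token_id) != seller:
            raise ValueError("seller does not own this token")
        self._owners[token_id] = buyer
        self._history[token_id].append((seller, buyer))

    def owner_of(self, token_id):
        return self._owners[token_id]


# The "redeem the same shirt on Platform B" idea: any platform consulting
# the shared registry sees the same owner and the same transfer history.
registry = OwnershipRegistry()
registry.mint("virtual-shirt-001", "designer_alice")
registry.transfer("virtual-shirt-001", "designer_alice", "avatar_bob")
print(registry.owner_of("virtual-shirt-001"))  # avatar_bob
```

The interesting part is what the sketch leaves out: on an actual blockchain, no single party controls the registry, which is precisely the property that makes the “portable across Metaverse Platforms A to Z” claim plausible.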

If you have the time, the October 4, 2021 article (What is the metaverse, and do I have to care? One part definition, one part aspiration, one part hype) is definitely worth the read.

Facebook’s multiverse and other news

Since starting this post sometime in September 2021, the situation regarding Facebook has changed a few times. I’ve decided to begin my version of the story from a summer 2021 announcement.

On Monday, July 26, 2021, Facebook announced a new Metaverse product group. From a July 27, 2021 article by Scott Rosenberg for Yahoo News (Note: A link has been removed),

Facebook announced Monday it was forming a new Metaverse product group to advance its efforts to build a 3D social space using virtual and augmented reality tech.

Facebook’s new Metaverse product group will report to Andrew Bosworth, Facebook’s vice president of virtual and augmented reality [emphasis mine], who announced the new organization in a Facebook post.

Facebook, integrity, and safety in the metaverse

On September 27, 2021 Facebook posted this webpage (Building the Metaverse Responsibly by Andrew Bosworth, VP, Facebook Reality Labs [emphasis mine] and Nick Clegg, VP, Global Affairs) on its site,

The metaverse won’t be built overnight by a single company. We’ll collaborate with policymakers, experts and industry partners to bring this to life.

We’re announcing a $50 million investment in global research and program partners to ensure these products are developed responsibly.

We develop technology rooted in human connection that brings people together. As we focus on helping to build the next computing platform, our work across augmented and virtual reality and consumer hardware will deepen that human connection regardless of physical distance and without being tied to devices. 

Introducing the XR [extended reality] Programs and Research Fund

There’s a long road ahead. But as a starting point, we’re announcing the XR Programs and Research Fund, a two-year $50 million investment in programs and external research to help us in this effort. Through this fund, we’ll collaborate with industry partners, civil rights groups, governments, nonprofits and academic institutions to determine how to build these technologies responsibly. 

…

Where integrity and safety are concerned Facebook is once again having some credibility issues according to an October 5, 2021 Associated Press article (Whistleblower testifies Facebook chooses profit over safety, calls for ‘congressional action’) posted on the Canadian Broadcasting Corporation’s (CBC) news online website.

Rebranding Facebook’s integrity and safety issues away?

It seems Facebook’s credibility issues are such that the company is about to rebrand itself according to an October 19, 2021 article by Alex Heath for The Verge (Note: Links have been removed),

Facebook is planning to change its company name next week to reflect its focus on building the metaverse, according to a source with direct knowledge of the matter.

The coming name change, which CEO Mark Zuckerberg plans to talk about at the company’s annual Connect conference on October 28th [2021], but could unveil sooner, is meant to signal the tech giant’s ambition to be known for more than social media and all the ills that entail. The rebrand would likely position the blue Facebook app as one of many products under a parent company overseeing groups like Instagram, WhatsApp, Oculus, and more. A spokesperson for Facebook declined to comment for this story.

Facebook already has more than 10,000 employees building consumer hardware like AR glasses that Zuckerberg believes will eventually be as ubiquitous as smartphones. In July, he told The Verge that, over the next several years, “we will effectively transition from people seeing us as primarily being a social media company to being a metaverse company.”

A rebrand could also serve to further separate the futuristic work Zuckerberg is focused on from the intense scrutiny Facebook is currently under for the way its social platform operates today. A former employee turned whistleblower, Frances Haugen, recently leaked a trove of damning internal documents to The Wall Street Journal and testified about them before Congress. Antitrust regulators in the US and elsewhere are trying to break the company up, and public trust in how Facebook does business is falling.

Facebook isn’t the first well-known tech company to change its company name as its ambitions expand. In 2015, Google reorganized entirely under a holding company called Alphabet, partly to signal that it was no longer just a search engine, but a sprawling conglomerate with companies making driverless cars and health tech. And Snapchat rebranded to Snap Inc. in 2016, the same year it started calling itself a “camera company” and debuted its first pair of Spectacles camera glasses.

If you have time, do read Heath’s article in its entirety.

An October 20, 2021 Thomson Reuters item on CBC (Canadian Broadcasting Corporation) news online includes quotes from some industry analysts about the rebrand,

“It reflects the broadening out of the Facebook business. And then, secondly, I do think that Facebook’s brand is probably not the greatest given all of the events of the last three years or so,” internet analyst James Cordwell at Atlantic Equities said.

“Having a different parent brand will guard against having this negative association transferred into a new brand, or other brands that are in the portfolio,” said Shankha Basu, associate professor of marketing at University of Leeds.

Tyler Jadah’s October 20, 2021 article for the Daily Hive includes an earlier announcement (not mentioned in the other two articles about the rebranding), Note: A link has been removed,

Earlier this week [October 17, 2021], Facebook announced it will start “a journey to help build the next computing platform” and will hire 10,000 new high-skilled jobs within the European Union (EU) over the next five years.

“Working with others, we’re developing what is often referred to as the ‘metaverse’ — a new phase of interconnected virtual experiences using technologies like virtual and augmented reality,” wrote Facebook’s Nick Clegg, the VP of Global Affairs. “At its heart is the idea that by creating a greater sense of “virtual presence,” interacting online can become much closer to the experience of interacting in person.”

Clegg says the metaverse has the potential to help unlock access to new creative, social, and economic opportunities across the globe and the virtual world.

In an email with Facebook’s Corporate Communications Canada, David Troya-Alvarez told Daily Hive, “We don’t comment on rumour or speculation,” in regards to The Verge‘s report.

I will update this posting when and if Facebook rebrands itself into a ‘metaverse’ company.

***See Oct. 28, 2021 update at the end of this posting and prepare yourself for ‘Meta’.***

Who (else) cares about integrity and safety in the metaverse?

Apparently, the international legal firm, Norton Rose Fulbright also cares about safety and integrity in the metaverse. Here’s more from their July 2021 The Metaverse: The evolution of a universal digital platform webpage,

In technology, first-mover advantage is often significant. This is why BigTech and other online platforms are beginning to acquire software businesses to position themselves for the arrival of the Metaverse.  They hope to be at the forefront of profound changes that the Metaverse will bring in relation to digital interactions between people, between businesses, and between them both. 

What is the Metaverse? The short answer is that it does not exist yet. At the moment it is a vision for what the future will be like, where personal and commercial life is conducted digitally in parallel with our lives in the physical world. Sounds too much like science fiction? For something that does not exist yet, the Metaverse is drawing a huge amount of attention and investment in the tech sector and beyond.

Here we look at what the Metaverse is, what its potential is for disruptive change, and some of the key legal and regulatory issues future stakeholders may need to consider.

What are the potential legal issues?

The revolutionary nature of the Metaverse is likely to give rise to a range of complex legal and regulatory issues. We consider some of the key ones below. As time goes by, naturally enough, new ones will emerge.

Data

Participation in the Metaverse will involve the collection of unprecedented amounts and types of personal data. Today, smartphone apps and websites allow organisations to understand how individuals move around the web or navigate an app. Tomorrow, in the Metaverse, organisations will be able to collect information about individuals’ physiological responses, their movements and potentially even brainwave patterns, thereby gauging a much deeper understanding of their customers’ thought processes and behaviours.

Users participating in the Metaverse will also be “logged in” for extended amounts of time. This will mean that patterns of behaviour will be continually monitored, enabling the Metaverse and the businesses (vendors of goods and services) participating in the Metaverse to understand how best to service the users in an incredibly targeted way.

The hungry Metaverse participant

How might actors in the Metaverse target persons participating in the Metaverse? Let us assume one such woman is hungry at the time of participating. The Metaverse may observe a woman frequently glancing at café and restaurant windows and stopping to look at cakes in a bakery window, and determine that she is hungry and serve her food adverts accordingly.

Contrast this with current technology, where a website or app can generally only ascertain this type of information if the woman actively searched for food outlets or similar on her device.

Therefore, in the Metaverse, a user will no longer need to proactively provide personal data by opening up their smartphone and accessing their webpage or app of choice. Instead, their data will be gathered in the background while they go about their virtual lives. 

This type of opportunity comes with great data protection responsibilities. Businesses developing, or participating in, the Metaverse will need to comply with data protection legislation when processing personal data in this new environment. The nature of the Metaverse raises a number of issues around how that compliance will be achieved in practice.

Who is responsible for complying with applicable data protection law? 

In many jurisdictions, data protection laws place different obligations on entities depending on whether an entity determines the purpose and means of processing personal data (referred to as a “controller” under the EU General Data Protection Regulation (GDPR)) or just processes personal data on behalf of others (referred to as a “processor” under the GDPR). 

In the Metaverse, establishing which entity or entities have responsibility for determining how and why personal data will be processed, and who processes personal data on behalf of another, may not be easy. It will likely involve picking apart a tangled web of relationships, and there may be no obvious or clear answers – for example:

Will there be one main administrator of the Metaverse who collects all personal data provided within it and determines how that personal data will be processed and shared?
Or will multiple entities collect personal data through the Metaverse and each determine their own purposes for doing so? 

Either way, many questions arise, including:

How should the different entities each display their own privacy notice to users? 
Or should this be done jointly? 
How and when should users’ consent be collected? 
Who is responsible if users’ personal data is stolen or misused while they are in the Metaverse? 
What data sharing arrangements need to be put in place and how will these be implemented?

There’s a lot more to this page including a look at Social Media Regulation and Intellectual Property Rights.

One other thing, according to the Norton Rose Fulbright Wikipedia entry, it is one of the ten largest legal firms in the world.

How many realities are there?

I’m starting to think we should be talking about RR (real reality), as well as VR (virtual reality), AR (augmented reality), MR (mixed reality), and XR (extended reality). It seems that all of these (except RR, which is implied) will be part of the ‘metaverse’, assuming that it ever comes into existence. Happily, I have found a good summarized description of VR/AR/MR/XR in a March 20, 2018 essay by North of 41 on medium.com,

Summary: VR is immersing people into a completely virtual environment; AR is creating an overlay of virtual content, but can’t interact with the environment; MR is a mixed of virtual reality and the reality, it creates virtual objects that can interact with the actual environment. XR brings all three Reality (AR, VR, MR) together under one term.

If you have the interest and approximately five spare minutes, read the entire March 20, 2018 essay, which has embedded images illustrating the various realities.
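For my own notes, the distinctions in that summary boil down to two rough yes/no questions: is the environment entirely synthetic, and can the virtual objects interact with the real surroundings? The encoding below is my simplification, not anything from the essay or a formal standard.

```python
# A toy encoding of the VR/AR/MR distinctions, using two boolean axes:
# is the environment fully synthetic, and do virtual objects respond to
# the real surroundings? XR is simply the umbrella over all three.
from dataclasses import dataclass


@dataclass(frozen=True)
class Reality:
    name: str
    fully_virtual: bool        # entirely synthetic environment?
    interacts_with_real: bool  # virtual objects react to real surroundings?


VR = Reality("virtual reality", fully_virtual=True, interacts_with_real=False)
AR = Reality("augmented reality", fully_virtual=False, interacts_with_real=False)
MR = Reality("mixed reality", fully_virtual=False, interacts_with_real=True)

XR = (VR, AR, MR)  # "extended reality": the umbrella term

for r in XR:
    print(r.name, r.fully_virtual, r.interacts_with_real)
```

RR (real reality) would be the degenerate case where both answers are no, which is perhaps why nobody bothers to market it.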

Alternate Mixed Realities: an example

TransforMR: Pose-Aware Object Substitution for Composing Alternate Mixed Realities (ISMAR ’21)

Here’s a description from one of the researchers, Mohamed Kari, of the video, which you can see above, and the paper he and his colleagues presented at the 20th IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2021 (from the TransforMR page on YouTube),

We present TransforMR, a video see-through mixed reality system for mobile devices that performs 3D-pose-aware object substitution to create meaningful mixed reality scenes in previously unseen, uncontrolled, and open-ended real-world environments.

To get a sense of how recent this work is, ISMAR 2021 was held from October 4 – 8, 2021.

The team’s 2021 ISMAR paper, TransforMR: Pose-Aware Object Substitution for Composing Alternate Mixed Realities by Mohamed Kari, Tobias Grosse-Puppendahl, Luis Falconeri Coelho, Andreas Rene Fender, David Bethge, Reinhard Schütte, and Christian Holz lists two educational institutions I’d expect to see (University of Duisburg-Essen and ETH Zürich); the surprise was this one: Porsche AG. Perhaps that explains the preponderance of vehicles in this demonstration.

Space walking in virtual reality

Ivan Semeniuk’s October 2, 2021 article for the Globe and Mail highlights a collaboration between Montreal’s Felix and Paul Studios with NASA (US National Aeronautics and Space Administration) and Time studios,

Communing with the infinite while floating high above the Earth is an experience that, so far, has been known to only a handful.

Now, a Montreal production company aims to share that experience with audiences around the world, following the first ever recording of a spacewalk in the medium of virtual reality.

The company, which specializes in creating virtual-reality experiences with cinematic flair, got its long-awaited chance in mid-September when astronauts Thomas Pesquet and Akihiko Hoshide ventured outside the International Space Station for about seven hours to install supports and other equipment in preparation for a new solar array.

The footage will be used in the fourth and final instalment of Space Explorers: The ISS Experience, a virtual-reality journey to space that has already garnered a Primetime Emmy Award for its first two episodes.

From the outset, the production was developed to reach audiences through a variety of platforms for 360-degree viewing, including 5G-enabled smart phones and tablets. A domed theatre version of the experience for group audiences opened this week at the Rio Tinto Alcan Montreal Planetarium. Those who desire a more immersive experience can now see the first two episodes in VR form by using a headset available through the gaming and entertainment company Oculus. Scenes from the VR series are also on offer as part of The Infinite, an interactive exhibition developed by Montreal’s Phi Studio, whose works focus on the intersection of art and technology. The exhibition, which runs until Nov. 7 [2021], has attracted 40,000 visitors since it opened in July [2021?].

At a time when billionaires are able to head off on private extraterrestrial sojourns that almost no one else could dream of, Lajeunesse [Félix Lajeunesse, co-founder and creative director of Felix and Paul studios] said his project was developed with a very different purpose in mind: making it easier for audiences to become eyewitnesses rather than distant spectators to humanity’s greatest adventure.

For the final instalments, the storyline takes viewers outside of the space station with cameras mounted on the Canadarm, and – for the climax of the series – by following astronauts during a spacewalk. These scenes required extensive planning, not only because of the limited time frame in which they could be gathered, but because of the lighting challenges presented by a constantly shifting sun as the space station circles the globe once every 90 minutes.

… Lajeunesse said that it was equally important to acquire shots that are not just technically spectacular but that serve the underlying themes of Space Explorers: The ISS Experience. These include an examination of human adaptation and advancement, and the unity that emerges within a group of individuals from many places and cultures and who must learn to co-exist in a high risk environment in order to achieve a common goal.

If you have the time, do read Semeniuk’s October 2, 2021 article in its entirety. You can find the exhibits (hopefully, you’re in Montreal) The Infinite here and Space Explorers: The ISS experience here (see the preview below),

The realities and the ‘verses

There always seems to be a lot of grappling with new and newish science/technology where people strive to coin terms and define them while everyone, including members of the corporate community, attempts to cash in.

The last time I looked (probably about two years ago), I wasn’t able to find any good definitions for alternate reality and mixed reality. (By good, I mean something which clearly explicated the difference between the two.) It was nice to find something this time.

As for Facebook and its attempts to join/create a/the metaverse, the company’s timing seems particularly fraught. As well, paradigm-shifting technology doesn’t usually start with large corporations. The company is ignoring its own history.

Multiverses

Writing this piece has reminded me of the upcoming movie, “Doctor Strange in the Multiverse of Madness” (Wikipedia entry). While this multiverse is based on a comic book, the idea of a Multiverse (Wikipedia entry) has been around for quite some time,

Early recorded examples of the idea of infinite worlds existed in the philosophy of Ancient Greek Atomism, which proposed that infinite parallel worlds arose from the collision of atoms. In the third century BCE, the philosopher Chrysippus suggested that the world eternally expired and regenerated, effectively suggesting the existence of multiple universes across time.[1] The concept of multiple universes became more defined in the Middle Ages.

Multiple universes have been hypothesized in cosmology, physics, astronomy, religion, philosophy, transpersonal psychology, music, and all kinds of literature, particularly in science fiction, comic books and fantasy. In these contexts, parallel universes are also called “alternate universes”, “quantum universes”, “interpenetrating dimensions”, “parallel universes”, “parallel dimensions”, “parallel worlds”, “parallel realities”, “quantum realities”, “alternate realities”, “alternate timelines”, “alternate dimensions” and “dimensional planes”.

The physics community has debated the various multiverse theories over time. Prominent physicists are divided about whether any other universes exist outside of our own.

Living in a computer simulation or base reality

The whole thing is getting a little confusing for me, so I think I’ll stick with RR (real reality) or, as it’s also known, base reality. For the notion of base reality, I want to thank astronomer David Kipping of Columbia University, quoted in Anil Ananthaswamy’s article, for this analysis of the idea that we might all be living in a computer simulation (from my December 8, 2020 posting; scroll down about 50% of the way to the “Are we living in a computer simulation?” subhead),

… there is a more obvious answer: Occam’s razor, which says that in the absence of other evidence, the simplest explanation is more likely to be correct. The simulation hypothesis is elaborate, presuming realities nested upon realities, as well as simulated entities that can never tell that they are inside a simulation. “Because it is such an overly complicated, elaborate model in the first place, by Occam’s razor, it really should be disfavored, compared to the simple natural explanation,” Kipping says.

Maybe we are living in base reality after all—The Matrix, Musk and weird quantum physics notwithstanding.

To sum it up (briefly)

I’m sticking with the base reality (or real reality) concept, within which various people and companies are attempting to create either a multiplicity of metaverses or the metaverse that effectively replaces the internet. This metaverse can include any and all of these realities (AR/MR/VR/XR) along with base reality. As for Facebook’s attempt to build ‘the metaverse’, it seems a little grandiose.

The computer simulation theory is an interesting thought experiment (just like the multiverse is an interesting thought experiment). I’ll leave them there.

Wherever it is we are living, these are interesting times.

***Updated October 28, 2021: D. (Devindra) Hardawar’s October 28, 2021 article for engadget offers details about the rebranding along with a dash of cynicism (Note: A link has been removed),

Here’s what Facebook’s metaverse isn’t: It’s not an alternative world to help us escape from our dystopian reality, a la Snow Crash. It won’t require VR or AR glasses (at least, not at first). And, most importantly, it’s not something Facebook wants to keep to itself. Instead, as Mark Zuckerberg described to media ahead of today’s Facebook Connect conference, the company is betting it’ll be the next major computing platform after the rise of smartphones and the mobile web. Facebook is so confident, in fact, Zuckerberg announced that it’s renaming itself to “Meta.”

After spending the last decade becoming obsessed with our phones and tablets — learning to stare down and scroll practically as a reflex — the Facebook founder thinks we’ll be spending more time looking up at the 3D objects floating around us in the digital realm. Or maybe you’ll be following a friend’s avatar as they wander around your living room as a hologram. It’s basically a digital world layered right on top of the real world, or an “embodied internet” as Zuckerberg describes.

Before he got into the weeds for his grand new vision, though, Zuckerberg also preempted criticism about looking into the future now, as the Facebook Papers paint the company as a mismanaged behemoth that constantly prioritizes profit over safety. While acknowledging the seriousness of the issues the company is facing, noting that it’ll continue to focus on solving them with “industry-leading” investments, Zuckerberg said: 

“The reality is is that there’s always going to be issues and for some people… they may have the view that there’s never really a great time to focus on the future… From my perspective, I think that we’re here to create things and we believe that we can do this and that technology can make things better. So we think it’s important to to push forward.”

Given the extent to which Facebook, and Zuckerberg in particular, have proven to be untrustworthy stewards of social technology, it’s almost laughable that the company wants us to buy into its future. But, like the rise of photo sharing and group chat apps, Zuckerberg at least has a good sense of what’s coming next. And for all of his talk of turning Facebook into a metaverse company, he’s adamant that he doesn’t want to build a metaverse that’s entirely owned by Facebook. He doesn’t think other companies will either. Like the mobile web, he thinks every major technology company will contribute something towards the metaverse. He’s just hoping to make Facebook a pioneer.

“Instead of looking at a screen, or today, how we look at the Internet, I think in the future you’re going to be in the experiences, and I think that’s just a qualitatively different experience,” Zuckerberg said. It’s not quite virtual reality as we think of it, and it’s not just augmented reality. But ultimately, he sees the metaverse as something that’ll help to deliver more presence for digital social experiences — the sense of being there, instead of just being trapped in a zoom window. And he expects there to be continuity across devices, so you’ll be able to start chatting with friends on your phone and seamlessly join them as a hologram when you slip on AR glasses.

D. (Devindra) Hardawar’s October 28, 2021 article provides a lot more details and I recommend reading it in its entirety.

A 3D spider web, a VR (virtual reality) setup, and sonification (music)

Markus Buehler and his musical spider webs are making news again.

Caption: Cross-sectional images (shown in different colors) of a spider web were combined into this 3D image and translated into music. Credit: Isabelle Su and Markus Buehler

The image (so pretty) you see above comes from a Markus Buehler presentation made at the American Chemical Society (ACS) meeting, ACS Spring 2021, held online April 5-30, 2021. The image was also shown during a press conference, which the ACS has made available for public viewing. More about that later in this posting.

The ACS issued an April 12, 2021 news release (also on EurekAlert), which provides details about Buehler’s latest work on spider webs and music,

Spiders are master builders, expertly weaving strands of silk into intricate 3D webs that serve as the spider’s home and hunting ground. If humans could enter the spider’s world, they could learn about web construction, arachnid behavior and more. Today, scientists report that they have translated the structure of a web into music, which could have applications ranging from better 3D printers to cross-species communication and otherworldly musical compositions.

The researchers will present their results today at the spring meeting of the American Chemical Society (ACS). ACS Spring 2021 is being held online April 5-30 [2021]. Live sessions will be hosted April 5-16, and on-demand and networking content will continue through April 30 [2021]. The meeting features nearly 9,000 presentations on a wide range of science topics.

“The spider lives in an environment of vibrating strings,” says Markus Buehler, Ph.D., the project’s principal investigator, who is presenting the work. “They don’t see very well, so they sense their world through vibrations, which have different frequencies.” Such vibrations occur, for example, when the spider stretches a silk strand during construction, or when the wind or a trapped fly moves the web.

Buehler, who has long been interested in music, wondered if he could extract rhythms and melodies of non-human origin from natural materials, such as spider webs. “Webs could be a new source for musical inspiration that is very different from the usual human experience,” he says. In addition, by experiencing a web through hearing as well as vision, Buehler and colleagues at the Massachusetts Institute of Technology (MIT), together with collaborator Tomás Saraceno at Studio Tomás Saraceno, hoped to gain new insights into the 3D architecture and construction of webs.

With these goals in mind, the researchers scanned a natural spider web with a laser to capture 2D cross-sections and then used computer algorithms to reconstruct the web’s 3D network. The team assigned different frequencies of sound to strands of the web, creating “notes” that they combined in patterns based on the web’s 3D structure to generate melodies. The researchers then created a harp-like instrument and played the spider web music in several live performances around the world.

The team also made a virtual reality setup that allowed people to visually and audibly “enter” the web. “The virtual reality environment is really intriguing because your ears are going to pick up structural features that you might see but not immediately recognize,” Buehler says. “By hearing it and seeing it at the same time, you can really start to understand the environment the spider lives in.”

To gain insights into how spiders build webs, the researchers scanned a web during the construction process, transforming each stage into music with different sounds. “The sounds our harp-like instrument makes change during the process, reflecting the way the spider builds the web,” Buehler says. “So, we can explore the temporal sequence of how the web is being constructed in audible form.” This step-by-step knowledge of how a spider builds a web could help in devising “spider-mimicking” 3D printers that build complex microelectronics. “The spider’s way of ‘printing’ the web is remarkable because no support material is used, as is often needed in current 3D printing methods,” he says.

In other experiments, the researchers explored how the sound of a web changes as it’s exposed to different mechanical forces, such as stretching. “In the virtual reality environment, we can begin to pull the web apart, and when we do that, the tension of the strings and the sound they produce change. At some point, the strands break, and they make a snapping sound,” Buehler says.

The team is also interested in learning how to communicate with spiders in their own language. They recorded web vibrations produced when spiders performed different activities, such as building a web, communicating with other spiders or sending courtship signals. Although the frequencies sounded similar to the human ear, a machine learning algorithm correctly classified the sounds into the different activities. “Now we’re trying to generate synthetic signals to basically speak the language of the spider,” Buehler says. “If we expose them to certain patterns of rhythms or vibrations, can we affect what they do, and can we begin to communicate with them? Those are really exciting ideas.”

You can go here for the April 12, 2021 ‘Making music from spider webs’ ACS press conference; it runs about 30 minutes, and you will hear some ‘spider music’ played.
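The press release doesn’t spell out exactly how the team assigned frequencies to strands, but a common sonification approach treats each strand like an ideal vibrating string, whose pitch is inversely proportional to its length. Here is a toy sketch of that idea; all lengths, the reference pitch, and the function name are invented for illustration, not taken from Buehler’s actual method:

```python
import math

# Hypothetical strand lengths (in mm) from a scanned web -- illustrative only.
strand_lengths_mm = [12.0, 8.5, 21.3, 5.2, 16.7]

BASE_FREQ_HZ = 220.0   # arbitrary reference pitch (A3)
BASE_LENGTH_MM = 10.0  # strand length assigned to the reference pitch

def strand_to_frequency(length_mm):
    """Map a strand to a pitch like an ideal vibrating string: f is
    proportional to 1/length, so shorter strands sound higher."""
    return BASE_FREQ_HZ * (BASE_LENGTH_MM / length_mm)

# One "note" per strand; patterns of these, ordered by the web's 3D
# structure, would form the melodies the release describes.
notes = [round(strand_to_frequency(length), 1) for length in strand_lengths_mm]
print(notes)
```

Doubling a strand’s length halves its pitch under this mapping, which is why stretching the web in the VR environment (described below in the press release) would audibly shift the sound.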

Getting back to the image and spider webs in general, we are most familiar with orb webs (in the part of Canada where I’m from, if nowhere else), which look like spirals and are 2D. There are several other types of webs, some of which are 3D, such as tangle webs (also known as cobwebs), funnel webs, and more. See this March 18, 2020 article “9 Types of Spider Webs: Identification + Pictures & Spiders” by Zach David on Beyond the Treat for more about spiders and their webs. If you have the time, I recommend reading it.

I’ve been following Buehler’s spider web/music work for close to ten years now; the most recent previous posting is an October 23, 2019 posting, where you’ll find a link to an application that makes music from proteins (spider webs are made up of proteins; scroll down about 30% of the way; it’s in the 2nd to last line of the quoted text about the embedded video).

Here is a video (2 mins. 17 secs.) of a spider web music performance that Buehler placed on YouTube,

Feb 3, 2021

Markus J. Buehler

Spider’s Canvas/Arachnodrone show excerpt at Palais de Tokyo, Paris, in November 2018. Video by MIT CAST. More videos can be found on www.arachnodrone.com. The performance was commissioned by Studio Tomás Saraceno (STS), in the context of Saraceno’s carte blanche exhibition, ON AIR. Spider’s Canvas/Arachnodrone was performed by Isabelle Su and Ian Hattwick on the spider web instrument, Evan Ziporyn on the EWI (Electronic Wind Instrument), and Christine Southworth on the guitar and EBow (Electronic Bow)

You can find more about the spider web music and Buehler’s collaborators on http://www.arachnodrone.com/,

Spider’s Canvas / Arachnodrone is inspired by the multifaceted work of artist Tomas Saraceno, specifically his work using multiple species of spiders to make sculptural webs. Different species make very different types of webs, ranging not just in size but in design and functionality. Tomas’ own web sculptures are in essence collaborations with the spiders themselves, placing them sequentially over time in the same space, so that the complex, 3-dimensional sculptural web that results is in fact built by several spiders, working together.

Meanwhile, back among the humans at MIT, Isabelle Su, a Course 1 doctoral student in civil engineering, has been focusing on analyzing the structure of single-species spider webs, specifically the ‘tent webs’ of the cyrtophora citricola, a tropical spider of particular interest to her, Tomas, and Professor Markus Buehler. Tomas gave the department a cyrtophora spider, the department gave the spider a space (a small terrarium without glass), and she in turn built a beautiful and complex web. Isabelle then scanned it in 3D and made a virtual model. At the suggestion of Evan Ziporyn and Eran Egozy, she then ported the model into Unity, a VR/game making program, where a ‘player’ can move through it in numerous ways. Evan & Christine Southworth then worked with her on ‘sonifying’ the web and turning it into an interactive virtual instrument, effectively turning the web into a 1700-string resonating instrument, based on the proportional length of each individual piece of silk and their proximity to one another. As we move through the web (currently just with a computer trackpad, but eventually in a VR environment), we create a ‘sonic biome’: complex ‘just intonation’ chords that come in and out of earshot according to which of her strings we are closest to. That part was all done in MAX/MSP, a very flexible high level audio programming environment, which was connected with the virtual environment in Unity. Our new colleague Ian Hattwick joined the team focusing on sound design and spatialization, building an interface that allowed him to sonically ‘sculpt’ the sculpture in real time, changing amplitude, resonance, and other factors. 
During this performance at Palais de Tokyo, Isabelle toured the web – that’s what the viewer sees – while Ian adjusted sounds, so in essence they were together “playing the web.” Isabelle provides a space (the virtual web) and a specific location within it (by driving through), which is what the viewer sees, from multiple angles, on the 3 scrims. The location has certain acoustic potentialities, and Ian occupies them sonically, just as a real human performer does in a real acoustic space. A rough analogy might be something like wandering through a gothic cathedral or a resonant cave, using your voice or an instrument at different volumes and on different pitches to find sonorous resonances, echoes, etc. Meanwhile, Evan and Christine are improvising with the web instrument, building on Ian’s sound, with Evan on EWI (Electronic Wind Instrument) and Christine on electric guitar with EBow.

For the visuals, Southworth wanted to create the illusion that the performers were actually inside the web. We built a structure covered in sharkstooth scrim, with 3 projectors projecting in and through from 3 sides. Southworth created images using her photographs of local Lexington, MA spider webs mixed with slides of the scan of the web at MIT, and then mixed those images with the projection of the game, creating an interactive replica of Saraceno’s multi-species webs.
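The “sonic biome” described above, where chords “come in and out of earshot” depending on which strings the player is closest to, suggests a simple proximity mechanic. Below is a toy sketch of that mechanic; the positions, frequencies, earshot radius, and linear falloff are all my own invented stand-ins, not the project’s actual MAX/MSP implementation:

```python
import math

# Hypothetical web: each 'string' is (midpoint xyz in metres, frequency in Hz).
# The 275 Hz and 330 Hz entries are just-intonation intervals (5:4 and 3:2)
# above 220 Hz -- chosen only to echo the 'just intonation' chords mentioned.
strings = [
    ((0.0, 0.0, 0.0), 220.0),
    ((0.5, 0.2, 0.1), 275.0),
    ((2.0, 2.0, 2.0), 330.0),  # placed outside earshot of the origin
]

EARSHOT = 1.0  # radius (m) within which a string is audible

def audible_chord(listener_pos):
    """Return (frequency, amplitude) pairs for strings near the listener,
    with amplitude falling off linearly to zero at the earshot radius."""
    chord = []
    for midpoint, freq in strings:
        dist = math.dist(listener_pos, midpoint)
        if dist < EARSHOT:
            chord.append((freq, 1.0 - dist / EARSHOT))
    return chord

print(audible_chord((0.0, 0.0, 0.0)))
```

As the listener moves through the virtual web, the set of audible strings, and hence the chord, changes continuously, which is the effect the quoted description attributes to touring the web at the Palais de Tokyo performance.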

If you listen to the press conference, you will hear Buehler talk about practical applications for this work in materials science.

Vancouver (Canada) Biennale and #ArtProject2020, a free virtual art & technology expo from November 11th to 15th, 2020

It’s a bit odd that the organizers for an event held in Canada would arrange to have Remembrance Day for the opening day and not make any acknowledgements. (For those not familiar with it, here’s more about Remembrance Day (Wikipedia entry) and there’s more here on the Canadian Broadcasting Corporation’s [CBC] Remembrance Day 2020 webpage and on this Nov. 10, 2020 ‘Here’s everything you need to know about the poppy’ article for the Daily Hive.)

The event description is quite exciting and the poster image is engaging, although ….

Courtesy: Vancouver Biennale

Did they intend for the blocks to the left and right (gateway to the bridge?) to look like someone holding both hands giving you the finger on each side? Now that I’ve seen it, I can’t ‘unsee’ it.

Moving on, there’s more information about the expo from a Nov. 9, 2020 Vancouver Biennale announcement (received via email),

The Vancouver Biennale announces a global invitation to #ArtProject2020, a free virtual art and technology expo about how the latest technologies are influencing the art world. The expo will run from November 11th to 15th and feature over 80 international speakers and 40 events offering accessible information and educational resources for digital art. Everyone with a personal or professional interest in art and technology, including curators, galleries, museums, artists, collectors, innovators, experience designers, and futurists will find the expo fascinating and is invited to register. Trilingual programming in English, Spanish, and Chinese will be available.

To reserve a free ticket and see the complete speaker list and schedule, visit www.artproject.io.

Curated by New York-based Colombian artist Jessica Angel, the expo will accompany the Vancouver Biennale’s first exhibition of tokenized art with new works by Jessica Angel, Dina Goldstein, Diana Thorneycroft, and Kristin McIver. Tokenized art is powered by blockchain technology and has redefined digital artwork ownership, allowing artists and collectors the benefit of true digital scarcity. The exhibition will be launched via the blockchain marketplace, Ephimera.

About the Expo

Panel Discussions, Artist Talks, Keynote Speakers: Innovators, curators, legal experts, and artists working at the leading edge of digital art will cover topics including What Is Cryptoart?, Finding Opportunity in the Digital, Women Leading the Art and Tech Movement, The Art of Immersion, Decentralising Power and Resources in the Art World, and Tools for Artists and Collectors. Speakers include The Whitney Museum, Victoria & Albert Museum, Christie’s, Foundation for Art and Blockchain, SuperRare, and Art in America.

Learning: Barrier-free educational workshops will teach participants about using open-source and accessible innovative tools to create, monetize, and collect digital art. Workshops are integrated with various blockchain projects to drive adoption through experience. Featured presenters include Ephimera, Status, and MakerDAO. Indigenous Matriarchs 4 will present from the Immersive Knowledge Transfer series for XR media creators, artists, and storytellers from diverse cultural communities.

Activities: A Crypto-Art Puzzle will drop clues every day of the event, and the Digital Art Battle will challenge artists to draw live. This gamified experience will offer winners rewards in different tokens. Participants can also join the Rare AF team on a Virtual Gallery Tour through the Metaverse, where gallery owners will share the inspirations behind their virtual spaces.

Anchoring the virtual expo is a future physical installation by Jessica Angel. Cleverly titled Voxel Bridge, this public artwork will transform the area underneath Vancouver’s Cambie Street Bridge into a three-layered immersive experience to transport visitors between physical and digital worlds. Working with the vastness of the concrete bridge as first layer, Angel adds her site-specific installation as a second layer, and completes the experience with augmented reality enhancements over the real world as the third and final layer. The installation is slated for completion in Spring 2021 as part of the Vancouver Biennale Exhibition.

“I never want to see the Biennale stuck in the past, presenting only static sculpture in an ever-changing world. We work with what comes next, the yet unknown, and we want to go where the future is heading and where public art has, perhaps, always been going. I am excited for this expo and the next chapter of the Biennale.”  – Barrie Mowatt, Founder & Artistic Director of Vancouver Biennale

“Art is a mobilizing force with the power to bridge seemingly dissimilar worlds, and Voxel Bridge exhibits this capacity. This expo transcends the enjoyment of art into a unifying and experimenting effort, that enables blockchain technology and established art institutions to examine ways of interaction. Join us in the virtual public space, to learn, and to cultivate new forms of participation.”             – Jessica Angel, Artist

Do check the schedule: http://www.artproject.io/ (keep scrolling) and don’t forget it’s free in exchange for your registration information. Enjoy!

Nano 2020: a US education initiative

The US Department of Agriculture has a very interesting funding opportunity, Higher Education Challenge (HEC) Grants Program, as evidenced by the Nano 2020 virtual reality (VR) classroom initiative. Before launching into the specifics of the Nano 2020 project, here’s a description of the funding program,

Projects supported by the Higher Education Challenge Grants Program will: (1) address a state, regional, national, or international educational need; (2) involve a creative or non-traditional approach toward addressing that need that can serve as a model to others; (3) encourage and facilitate better working relationships in the university science and education community, as well as between universities and the private sector, to enhance program quality and supplement available resources; and (4) result in benefits that will likely transcend the project duration and USDA support.

A February 3, 2020 University of Arizona news release by Stacy Pigott (also on EurekAlert but published February 7, 2020) announced a VR classroom where students will be able to interact with nanoscale data gained from agricultural sciences and the life sciences,

Sometimes the smallest of things lead to the biggest ideas. Case in point: Nano 2020, a University of Arizona-led initiative to develop curriculum and technology focused on educating students in the rapidly expanding field of nanotechnology.

The five-year, multi-university project recently met its goal of creating globally relevant and implementable curricula and instructional technologies, to include a virtual reality classroom, that enhance the capacity of educators to teach students about innovative nanotechnology applications in agriculture and the life sciences.

Here’s a video from the University of Arizona’s project proponents which illustrates their classroom,

For those who prefer text or like to have it as a backup, here’s the rest of the news release explaining the project,

Visualizing What is Too Small to be Seen

Nanotechnology involves particles and devices developed and used at the scale of 100 nanometers or less – to put that in perspective, the average diameter of a human hair is 80,000 nanometers. The extremely small scale can make comprehension challenging when it comes to learning about things that cannot be seen with the naked eye.

That’s where the Nano 2020 virtual reality classroom comes in. In a custom-developed VR classroom complete with a laboratory, nanoscale objects come to life for students thanks to the power of science data visualization.

Within the VR environment, students can interact with objects of nanoscale proportions – pick them up, turn them around and examine every nuance of things that would otherwise be too small to see. Students can also interact with their instructor or their peers. The Nano 2020 classroom allows for multi-player functionality, giving educators and students the opportunity to connect in a VR laboratory in real time, no matter where they are in the world.

“The virtual reality technology brings to life this complex content in a way that is oddly simple,” said Matt Mars, associate professor of agricultural leadership and innovation education in the College of Agriculture and Life Sciences and co-director of the Nano 2020 grant. “Imagine if you can take a student and they see a nanometer from a distance, and then they’re able to approach it and see how small it is by actually being in it. It’s mind-blowing, but in a way that students will be like, ‘Oh wow, that is really cool!'”

The technology was developed by Tech Core, a group of student programmers and developers led by director Ash Black in the Eller College of Management.

“The thing that I was the most fascinated with from the beginning was playing with a sense of scale,” said Black, a lifelong technologist and mentor-in-residence at the McGuire Center for Entrepreneurship. “What really intrigued me about virtual reality is that it is a tool where scale is elastic – you can dial it up and dial it down. Obviously, with nanotechnology, you’re dealing with very, very small things that nobody has seen yet, so it seemed like a perfect use of virtual reality.”

Black and Tech Core students including Robert Johnson, Hazza Alkaabi, Matthew Romero, Devon Oberdan, Brandon Erickson and Tim Lukau turned science data into an object, the object into an image, and the image into a 3D rendering that is functional in the VR environment they built.

“I think that being able to interact with objects of nanoscale data in this environment will result in a lot of light bulbs going off in the students’ minds. I think they’ll get it,” Black said. “To be able to experience something that is abstract – like, what does a carbon atom look like – well, if you can actually look at it, that’s suddenly a whole lot of context.”

The VR classroom complements the Nano 2020 curriculum, which globally expands the opportunities for nanotechnology education within the fields of agriculture and the life sciences.

Teaching the Workforce of the Future

“There have been great advances to the use of nanotechnology in the health sciences, but many more opportunities for innovation in this area still exist in the agriculture fields. The idea is to be able to advance these opportunities for innovation by providing some educational tools,” said Randy Burd, who was a nutritional sciences professor at the University of Arizona when he started the Nano 2020 project with funding from a National Institute of Food and Agriculture Higher Education Challenge grant through the United States Department of Agriculture. “It not only will give students the basics of the understanding of the applications, but will give them the innovative thought processes to think of new creations. That’s the real key.”

The goal of the Nano 2020 team, which includes faculty from the University of Arizona, Northern Arizona University and Johns Hopkins University, was to create an online suite of undergraduate courses that was not university-specific, but could be accessed and added to by educators to reach students around the world.

To that end, the team built modular courses in nanotechnology subjects such as glycobiology, optical microscopy and histology, nanomicroscopy techniques, nutritional genomics, applications of magnetic nanotechnology, and design, innovation, and entrepreneurship, to name a few. An online library will be created to facilitate the ongoing expansion of the open-source curricula, which will be disseminated through novel technologies such as the virtual reality classroom.

“It isn’t practical to think that other universities and colleges are just going to be able to launch new courses, because they still need people to teach those courses,” Mars said. “So we created a robust and flexible set of module-based course packages that include exercises, lectures, videos, power points, tools. Instructors will be able to pull out components and integrate them into what already exists to continue to move toward a more comprehensive offering in nanotechnology education.”

According to Mars, the highly adaptable nature of the curriculum and the ability to deliver it in various ways were key components of the Nano 2020 project.

“We approach the project with a strong entrepreneurial mindset and heavy emphasis on innovation. We wanted it to be broadly defined and flexible in structure, so that other institutions access and model the curricula, see its foundation, and adapt that to what their needs were to begin to disseminate the notion of nanotechnology as an underdeveloped but really important field within the larger landscape of agriculture and life sciences,” Mars said. “We wanted to also provide an overlay to the scientific and technological components that would be about adoption in human application, and we approached that through an innovation and entrepreneurial leadership lens.”

Portions of the Nano 2020 curriculum are currently being offered as electives in a certificate program through the Department of Agriculture Education, Technology and Innovation at the University of Arizona. As it becomes more widely disseminated through the higher education community at large, researchers expect the curriculum and VR classroom technology to transcend the boundaries of discipline, institution and geography.

“An online open platform will exist where people can download components and courses, and all of it is framed by the technology, so that these experiences and research can be shared over this virtual reality component,” Burd said. “It’s technologically distinct from what exists now.”

“The idea is that it’s not just curriculum, but it’s the delivery of that curriculum, and the delivery of that curriculum in various ways,” Mars said. “There’s a relatability that comes with the virtual reality that I think is really cool. It allows students to relate to something as abstract as a nanometer, and that is what is really exciting.”

As best I can determine, this VR Nano 2020 classroom is not yet ready for a wide release and, for now, is being offered exclusively at the University of Arizona.
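To give a sense of the “elastic scale” Ash Black describes, here’s a back-of-the-envelope sketch of the magnification a VR scene applies when it renders a nanoscale object at room scale. The 100 nm and 80,000 nm figures come from the news release; the rendered size and the function itself are illustrative assumptions of mine:

```python
# Magnification needed to show nanoscale objects at human scale in VR.
NANOMETRE_M = 1e-9       # one nanometre in metres
HAIR_DIAMETER_NM = 80_000  # average human hair diameter, per the news release

def magnification(object_size_nm, rendered_size_m=1.0):
    """Scale factor needed to render an object of the given nanometre size
    at rendered_size_m in the virtual environment."""
    return rendered_size_m / (object_size_nm * NANOMETRE_M)

# A 100 nm nanoparticle rendered 1 m across is magnified ten million times,
# and a hair is still 800 times wider than that nanoparticle.
print(f"{magnification(100):.0e}")
print(f"hair vs nanoparticle: {HAIR_DIAMETER_NM / 100:.0f}x wider")
```

Because the scale factor is just a parameter, the environment can “dial it up and dial it down,” which is the elasticity Black is pointing to.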

2018 Canadian Science Policy Conference (Nov. 7 – 9, 2018) highlights and Council of Canadian Academies: a communications job, a report, and more

This is going to be a science policy-heavy posting, with both a conference and the latest report from the Council of Canadian Academies (CCA).

2018 Canadian Science Policy Conference

As I noted in my March 1, 2018 posting, this is the fourth year in a row that the conference is being held in Ottawa and the theme for this 10th edition is ‘Building Bridges Between Science, Policy and Society‘.

The dates are November 7 -9, 2018 and as the opening draws closer I’m getting more ‘breathlessly enthusiastic’ announcements. Here are a few highlights from an October 23, 2018 announcement received via email,

CSPC 2018 is honoured to announce that the Honourable Kirsty Duncan, Minister of Science and Sport, will be delivering the keynote speech of the Gala Dinner on Thursday, November 8 at 7:00 PM. Minister Duncan will also hand out the 4th Science Policy Award of Excellence to the winner of this year’s competition.

CSPC 2018 features 250 speakers, a record number, and above is the breakdown of the positions they hold, over 43% of them being at the executive level and 57% of our speakers being women.

*All information as of October 15, 2018

If you think that you will not meet any new people at CSPC and all of the registrants are the same as last year, think again!

Over 57% of  registrants are attending the conference for the FIRST TIME!

Secure your spot today!

*All information as of October 15, 2018

Here’s more from an October 31, 2018 announcement received via email,

One year after her appointment as Canada’s Chief Science Advisor, Dr. Mona Nemer will discuss her experience with the community. Don’t miss this opportunity.

[Canadian Science Policy Centre editorials in advance of conference]

Paul Dufour
“Evidence and Science in Parliament–Looking Back at CSPC and Moving Forward”

Dr. Tom Corr
“Commercializing Innovation in Canada: Advancing in the Right Direction”

Joseph S Sparling, PhD
“Reimagining the Canadian Postdoctoral Training System”

Milton Friesen
“Conspiring Together for Good: Institutional Science and Religion”

Joseph Tafese
“Science and the Next Generation : Science and Inclusivity, Going beyond the Slogans”

Eva Greyeyes
“Opinion Editorial for CSPC, November 2018”

Monique Crichlow
Chris Loken

“Policy Considerations Towards Converged HPC-AI Platforms”

Should you be in the Ottawa area November 7 – 9, 2018, it’s still possible to register.

**Update November 6, 2018: The 2018 CSPC is Sold Out!**

Council of Canadian Academies: job and the ‘managing innovation’ report

Let’s start with the job (from the posting),

October 17, 2018

Role Title:      Director of Communications
Deadline:       November 5, 2018
Salary:            $115,000 to $165,000

About the Council of Canadian Academies
The Council of Canadian Academies (CCA) is a not-for-profit organization that conducts assessments of evidence on scientific topics of public interest to inform decision-making in Canada.

Role Summary
The CCA is seeking an experienced communications professional to join its senior management team as Director of Communications. Reporting to the President and CEO, the Director is responsible for developing and implementing a communications plan for the organization that promotes and highlights the CCA’s work, brand, and overall mission to a variety of potential users and stakeholders; overseeing the publication and dissemination of high-quality hard copy and online products; and providing strategic advice to the President and CCA’s Board, Committees, and Panels. In fulfilling these responsibilities, the Director of Communications is expected to work with a variety of interested groups including the media, the broad policy community, government, and non-governmental organizations.

Key Responsibilities and Accountabilities
Under the direction of the President and CEO, the Director leads a small team of communications and publishing professionals to meet the responsibilities and accountabilities outlined below.

Strategy Development and External Communications
• Develop and execute an overall strategic communications plan for the organization that promotes and highlights the CCA’s work, brand, and overall mission.
• Oversee the CCA’s presence and influence on digital and social platforms including the development and execution of a comprehensive content strategy for linking CCA’s work with the broader science and policy ecosystem with a focus on promoting and disseminating the findings of the CCA’s expert panel reports.
• Provide support, as needed for relevant government relations activities including liaising with communications counterparts, preparing briefing materials, responding to requests to share CCA information, and coordinating any appearances before Parliamentary committees or other bodies.
• Harness opportunities for advancing the uptake and use of CCA assessments, including leveraging the strengths of key partners particularly the founding Academies.

Publication and Creative Services
• Oversee the creative services, quality control, and publication of all CCA’s expert panel reports including translation, layout, quality assurance, graphic design, proofreading, and printing processes.
• Oversee the creative development and publication of all CCA’s corporate materials including the Annual Report and Corporate Plan through content development, editing, layout, translation, graphic design, proofreading, and printing processes.

Advice and Issues Management
• Provide strategic advice and support to the President’s Office, Board of Directors, Committees, and CCA staff about increasing the overall impact of CCA expert panel reports, brand awareness, outreach opportunities, and effective science communication.
• Provide support to the President by anticipating project-based or organizational issues, understanding potential implications, and suggesting strategic management solutions.
• Ensure consistent messages, style, and approaches in the delivery of all internal and external communications across the organization.

Leadership
• Mentor, train, and advise up to five communications and publishing staff on a day-to-day basis and complete annual performance reviews and planning.
• Lead the development and implementation of all CCA-wide policy and procedures relating to all aspects of communications and publishing.
• Represent the issues, needs, and ongoing requirements for the communications and publishing staff as a member of the CCA senior management team.

Knowledge Requirements
The Director of Communications requires:
• Superior knowledge of communications and public relations principles – preferably as it applies in a non-profit or academic setting;
• Extensive experience in communications planning and issues management;
• Knowledge of current research, editorial, and publication production standards and procedures including but not limited to: translation, copy-editing, layout/design, proofreading and publishing;
• Knowledge of evaluating impact of reports and assessments;
• Knowledge in developing content strategy, knowledge mobilization techniques, and creative services and design;
• Knowledge of human resource management techniques and experience managing a team;
• Experience in coordinating, organizing and implementing communications activities including those involving sensitive topics;
• Knowledge of the relationships and major players in Canada’s intramural and extramural science and public policy ecosystem, including awareness of federal science departments and Parliamentary committees, funding bodies, and related research groups;
• Knowledge of Microsoft Office Suite, Adobe Creative Suite, WordPress and other related programs;
• Knowledge of a variety of social media platforms and measurement tools.

Skills Requirements
The Director of Communications must have:
• Superior time and project management skills
• Superior writing skills
• Superior ability to think strategically regarding how best to raise the CCA’s profile and ensure impact of the CCA’s expert panel reports
• Ability to be flexible and adaptable; able to respond quickly to unanticipated demands
• Strong advisory, negotiation, and problem-solving skills
• Strong skills in risk mitigation
• Superior ability to communicate in both written and oral forms, effectively and diplomatically
• Ability to mentor, train, and provide constructive feedback to direct reports

Education and Experience
This knowledge and skillset is typically obtained through the completion of a post-secondary degree in Journalism, Communications, Public Affairs or a related field, and/or a minimum of 10 years of progressive and related experience. Experience in an organization that has addressed topics in public policy would be valuable.

Language Requirements: This position is English Essential. Fluency in French is a strong asset.

To apply to this position please send your CV and cover letter to careers@scienceadvice.ca before November 5, 2018. The cover letter should answer the following questions in 1,000 words or less:

1. How does your background and work experience make you well-suited for the position of Director of Communications at CCA?
2. What trends do you see emerging in the communications field generally, and in science and policy communications more specifically? How might CCA take advantage of these trends and developments?
3. Knowing that CCA is in the business of conducting assessments of evidence on important policy topics, how do you feel communicating this type of science differs from communicating other types of information and knowledge?

Improving Innovation Through Better Management

The Council of Canadian Academies released their ‘Improving Innovation Through Better Management’ report on October 18, 2018. As some of my regular readers (assuming there are some) might have predicted, I have issues.

There’s a distinct disconnect between the described problem and the questions to be answered. From the ‘Improving Innovation Through Better Management’ summary webpage,

While research is world-class and technology start-ups are thriving, few companies grow and mature in Canada. This cycle — invent and sell, invent and sell — allows other countries to capture much of the economic and social benefits of Canadian-invented products, processes, marketing methods, and business models. …

So, the problem is ‘invent and sell’. Leaving aside the questionable conclusion that other countries are reaping the benefits of Canadian innovation (I’ll get back to that shortly), what questions could you ask about how to break the ‘invent and sell, invent and sell’ cycle? Hmm, maybe we should ask, How do we break the ‘invent and sell’ cycle in Canada?

The government presented two questions to deal with the problem and no, how to break the cycle is not one of the questions. From the ‘Improving Innovation Through Better Management’ summary webpage,

… Escaping this cycle may be aided through education and training of innovation managers who can systematically manage ideas for commercial success and motivate others to reimagine innovation in Canada.

To understand how to better support innovation management in Canada, Innovation, Science and Economic Development Canada (ISED) asked the CCA two critical questions: What are the key skills required to manage innovation? And, what are the leading practices for teaching these skills in business schools, other academic departments, colleges/polytechnics, and industry?

As lawyers, journalists, scientists, doctors, librarians, and anyone who’s ever received misinformation can tell you, asking the right questions can make a big difference.

As for the conclusion that other countries are reaping the benefits of Canadian innovation, is there any supporting data? We enjoy a very high standard of living and have done so for at least a couple of generations. The Organization for Economic Cooperation and Development (OECD) has a Better Life Index, which ranks well-being on these 11 dimensions (from the OECD Better Life Index entry on Wikipedia), Note: Links have been removed,

  1. Housing: housing conditions and spending (e.g. real estate pricing)
  2. Income: household income and financial wealth
  3. Jobs: earnings, job security and unemployment
  4. Community: quality of social support network
  5. Education: education and what you get out of it
  6. Environment: quality of environment (e.g. environmental health)
  7. Governance: involvement in democracy
  8. Health
  9. Life Satisfaction: level of happiness
  10. Safety: murder and assault rates
  11. Work-life balance

In 2017, the index ranked Canada as fifth in the world while the US appears to have slipped from a previous ranking of 7th to 8th. (See these Wikipedia entries with relevant subsections for rankings: OECD Better Life Index; Rankings, 2017 ranking and Standard of living in the United States, Measures, 3rd paragraph.)

This notion that other countries are profiting from Canadian innovation while we lag behind has been repeated so often that it has become an article of faith, and I never questioned it until someone else challenged me. This article of faith is repeated internationally, and it sometimes seems that every country in the world is worried that someone else will benefit from its national innovation.

Getting back to the Canadian situation, we’ve decided to approach the problem by not asking questions about our article of faith or about how to break the ‘invent and sell’ cycle. Instead of questioning an assumption and producing an open-ended question, we have these questions: (1) What are the key skills required to manage innovation? (2) What are the leading practices for teaching these skills in business schools, other academic departments, colleges/polytechnics, and industry?

In my world, that first question would be a second-tier question, at best. The second question presupposes the answer: more training in universities and colleges. I took a look at the report’s Expert Panel webpage and found it populated by five individuals who are either academics or have strong ties to academe. They did have a workshop, and the list of participants does include people who run businesses, from the ‘Improving Innovation Through Better Management’ report (Note: Formatting has not been preserved),

Workshop Participants

Max Blouw,
Former President and Vice-Chancellor of
Wilfrid Laurier University (Waterloo, ON)

Richard Boudreault, FCAE,
Chairman, Sigma Energy
Storage (Montréal, QC)

Judy Fairburn, FCAE,
Past Board Chair, Alberta Innovates;
retired EVP Business Innovation & Chief Digital Officer,
Cenovus Energy Inc. (Calgary, AB)

Tom Jenkins, O.C., FCAE,
Chair of the Board, OpenText
(Waterloo, ON)

Sarah Kaplan,
Director of the Institute for Gender and the
Economy and Distinguished Professor, Rotman School of
Management, University of Toronto (Toronto, ON)

Jean-Michel Lemieux,
Senior Vice President of Engineering,
Shopify Inc. (Ottawa, ON)

Elicia Maine,
Academic Director and Professor, i2I, Beedie
School of Business, Simon Fraser University (Vancouver, BC)

Kathy Malas,
Innovation Platform Manager, CHU
Sainte Justine (Montréal, QC)

John L. Mann, FCAE,
Owner, Mann Consulting
(Blenheim, ON)

Jesse Rodgers,
CEO, Volta Labs (Halifax, NS)

Creso Sá,
Professor of Higher Education and Director of
the Centre for the Study of Canadian and International
Higher Education, Ontario Institute for Studies in Education,
University of Toronto (Toronto, ON)

Dhirendra Shukla,
Professor and Chair, J. Herbert Smith
Centre for Technology Management & Entrepreneurship,
Faculty of Engineering, University of New Brunswick
(Fredericton, NB)

Dan Sinai,
Senior Executive, Innovation, IBM Canada
(Toronto, ON)

Valerie Walker,
Executive Director, Business/Higher
Education Roundtable (Ottawa, ON)

J. Mark Weber,
Eyton Director, Conrad School of
Entrepreneurship & Business, University of Waterloo
(Waterloo, ON)

I am a little puzzled by the presence of an IBM executive (Dan Sinai) on this list. Wouldn’t Canadians holding onto their companies be counterproductive to IBM’s interests? As for John L. Mann, I’ve not been able to find him or his consulting company online. It’s unusual not to find any trace of an individual or company online these days.

In all, there were nine individuals representing academic or government institutions on this list. The gender balance is 10 males and five females for the workshop participants, and three males and two females for the expert panel. There is no representation from the North or from Manitoba, Saskatchewan, Prince Edward Island, or Newfoundland.

If they’re serious about looking at how to use innovation to drive higher standards of living, why aren’t there any people from Asian countries that have been succeeding at that very project? South Korea and China come to mind.

I’m sure there are some excellent ideas in the report; I just wish they’d taken their topic to heart and actually tried to approach innovation in Canada in an innovative fashion.

Meanwhile, Vancouver gets another technology hub, from an October 30, 2018 article by Kenneth Chan for the Daily Hive (Vancouver [Canada]), Note: Links have been removed,

Vancouver’s rapidly growing virtual reality (VR) and augmented reality (AR) tech sectors will greatly benefit from a new VR and AR hub created by Launch Academy.

The technology incubator has opened a VR and AR hub at its existing office at 300-128 West Hastings Street in downtown, in partnership with VR/AR Association Vancouver. Immersive tech companies have access to desk space, mentorship programs, VR/AR equipment rentals, investor relations connected to Silicon Valley [emphasis mine], advisory services, and community events and workshops.

Within the Vancouver tech industry, the immersive sector has grown from 15 companies working in VR and AR in 2015 to 220 organizations today.

Globally, the VR and AR market is expected to hit a value of $108 billion by 2021, with tech giants like Amazon, Apple, Facebook, Google, and Microsoft [emphasis mine] investing billions into product development.

In the Vancouver region, the ‘invent and sell’ cycle can be traced back to the 19th century.

One more thing: as I was writing this piece I tripped across this news, “$7.7-billion pact makes Encana more American than Canadian” by Geoffrey Morgan. It’s in the Nov. 2, 2018 print edition of the Vancouver Sun’s front page for business. “Encana Corp., the storied Canadian company that had been slowly transitioning away from Canada and natural gas over the past few years under CEO [Chief Executive Officer] Doug Suttles, has pivoted aggressively to US shale basins. … Suttles, formerly a BP Plc. executive, moved from Calgary [Alberta, Canada] to Denver [Colorado, US], though the company said that was for personal reasons and not a precursor to relocation of Encana’s headquarters.” Yes, that’s quite believable. By the way, Suttles has spent* most of his life in the US (Wikipedia entry).

In any event, it’s not just Canadian emerging technology companies that get sold or somehow shifted out of Canada.

So, should we break the cycle and, if so, how are we going to do it?

*’spend’ corrected to ‘spent’ on November 6, 2018.