There’s a local (Vancouver, Canada) event coming up, as well as a call for papers, an opportunity to watch a workshop previously presented in Toronto, Montréal, and Berlin, and more in these highlights from the April 2025 issue of the Metacreation Lab for Creative AI newsletter (received via email). The first items are listed in date order.
Ars Electronica and Vancouver AI [artificial intelligence] Community Meetup
From the April 2025 Metacreation Lab newsletter,
Call for Papers – EXPANDED 2025 at Ars Electronica
The 13th edition of the EXPANDED Conference, focusing on animation and interactive art, will be held from September 3–5, 2025, at the Ars Electronica Center in Linz, Austria, as part of the Ars Electronica Festival.
Organized in cooperation with ACM [Association for Computing Machinery], the conference invites submissions in categories of Research Papers and Art Papers. Topics of interest include AI-generated images, generative art, virtual production, human-AI collaboration, XR, and more.
Vancouver AI – April 30 [2025] at the H.R. MacMillan Space Centre
Metacreation Lab has proudly supported the Vancouver AI Community Meetup since the beginning. This edition, Mission #16 of BC’s vibrant AI community meetup series, features a talk by Philippe Pasquier exploring the latest in generative and creative AI systems.
Also on the lineup is a special performance by K-PHI-A, a live trio featuring Philippe, PhD student Keon Ju Maverick Lee, and VJ Amagi (Jun Yuri). Their piece, Revival, is an improvisational audiovisual performance where human musicians and AI agents co-create in real time. It blends percussion, electronics, and AI-driven visuals using Autolume and other systems developed at the Metacreation Lab.
Ars Electronica started life as a festival in 1979; it is still produced annually and has grown into a much larger enterprise. From the Ars Electronica About webpage, Note: Links have been removed,
Art, Technology, Society
We have been analyzing and commenting on the Digital Revolution since 1979. Since then, we have been developing projects, strategies and competencies for the Digital Transformation. Together with artists, scientists, technologists, designers, developers, entrepreneurs and activists from all over the world, we address the central questions of our future. The focus is on new technologies and how they change the way we live and work together.
…
A new festival. The first Ars Electronica begins on September 18, 1979. 20 artists and scientists from all over the world gather at this new “Festival for Art, Technology and Society” in Linz to discuss the Digital Revolution and its possible consequences. This Ars Electronica is small, but groundbreaking. The initiative for this came from Hannes Leopoldseder (AT), director of the Upper Austria regional studio of the Austrian Broadcasting Company (ORF), who is passionate about everything that has to do with the future. Together with the electronic musician Hubert Bognermayr (AT), the music producer Ulli A. Rützel (DE) and the cyberneticist and physicist Herbert W. Franke (AT), he lays the foundation stone for a festival that will become the world’s largest and most important of its kind.
…
Between art, technology and society. Over the past four decades, a number of pioneers have turned Ars Electronica into a creative ecosystem that now enjoys a worldwide reputation.
Since 1979 we celebrate once a year the Ars Electronica Festival. More than 1,000 artists, scientists, developers, designers, entrepreneurs and activists are coming to Linz, Austria, to address central questions of our future. For five days, everything revolves around groundbreaking ideas and grand visions, unusual prototypes and innovative collaborations, inspiring art and groundbreaking research, extraordinary performances and irritating interventions, touching sounds and rousing concerts.
Since 1987 we have been awarding the Prix Ars Electronica every year. With several competition categories, we search for groundbreaking projects that revolve around questions of our digital society and rehearse the innovative use of technologies, promising strategies of collaboration and new forms of artistic expression. The best submissions will receive a Golden Nica, considered by the global media art scene to be the most traditional and prestigious award ever.
Since 1996 we have been working at the Ars Electronica Center year after year with tens of thousands of kindergarten children, pupils, apprentices and students on questions concerning the ever-increasing digitalization of our world. The focus is on the potential of the next Game Changer: Artificial Intelligence.
Also since 1996 we operate the Ars Electronica Futurelab, whose international and interdisciplinary team of artists and scientists is researching the future. With interactive scenarios, we prepare central aspects of the Digital Revolution for the general public in order to initiate a democratic discourse.
1998 we initiated create your world. The year-round programme is developed together with young people and includes a competition for under 19 year olds, a festival of its own and a tour through the region. We see create your world as an invitation and challenge at the same time and want to encourage young people to leave the role as mere users of technology behind, to discover new possibilities of acting and designing and to implement their own ideas.
2004 we started Ars Electronica Export with a big exhibition in New York. Since then we have been to Abuja, Athens, Bangkok, Beijing, Berlin, Bilbao, Brussels, Buenos Aires, Doha, Florence, Kiev, London, Madrid, Mexico City, Moscow, Mumbai, Osaka, Sao Paulo, Seoul, Shanghai, Singapore, Tokyo, Tunis, Venice and Zaragoza. Together with partners from art and culture, science and education, business and industry, we organize exhibitions and presentations, conferences and workshops, performances and interventions at all these locations.
Since 2013 our team at Ars Electronica Solutions has been developing market-ready products inspired by visions and prototypes from the artistic cosmos of Ars Electronica. We develop innovative, individual and interactive products and services for exhibitions, brands, trade fairs and events.
Since 2016 we are active all year round in Japan. Especially in Tokyo and Osaka we work together with leading Japanese universities, museums and companies, develop and present artistic projects, design workshop series and Open Labs and dedicate ourselves to the future of our digital society in conferences.
In order to actively shape the digital revolution, people are needed who have a feel for change and recognize connections, develop new strategies and set a course. This is precisely where the 2019 created Future Thinking School aims to support companies and institutions.
Whether at home in the living room or in the office, whether in the classroom or in the lecture hall, in the streetcar or subway, on the train – from everywhere Home Delivery accompanies our virtual visitors on an artistic-scientific journey into our future since 2020.
All our activities since September 18, 1979 have been documented in the form of texts, images and videos and stored in the Ars Electronica Archive. This archive provides us with a unique collection of descriptions and documentations of more than 75,000 projects from four decades of Ars Electronica.
The Expanded Conference (Expanded 2025) will take place from September 3rd to 5th as part of the Ars Electronica Festival 2025. This call for papers focuses on academic papers in the field of Expanded Animation and Interactive Art that explore and experiment with visual expression at the intersection of art, technology, and society. We will have two categories (Research Paper and Art Paper), where submissions will undergo a rigorous review process. All selected speakers will be given a free pass to the Ars Electronica Festival (September 3rd to 7th).
Topics of interest include, but are not limited to:
3D Scanning
AI-generated Images
AI-based artworks
Artistic Computer Animation
Art & Science collaboration projects
Audio-visual Experiments
Data Journalism and Animated Documentary
Data Visualizations
Digital Media Art History
Digital, Hybrid, and Expanded Theater
Expanded Animation
Generative Art
Human-AI interaction and Human-AI collaboration
Hybrids between Animation and Game
Media Facades
Music Visualization
New approaches to artistic research and practice-based methodologies
Participatory art projects
Performance Projects
Playful Interactions and Experiences
Projection Mapping
Projects using NFT, Metaverse, Social Media
Reactive and Interactive audio/visual Work
Real-time CG
Scientific Visualizations
Site-specific Installations
Sound Art and Soundscapes
Tangible Interfaces and New Forms of Experiences
Transmedia Narratives
Virtual Humans and Environments
Virtual Production
VR, AR, MR, XR
…
Again, the submission date for your paper is April 27, 2025. Good luck!
Vancouver AI Community Meetup
Prepare yourself for some sticker shock. Tickets for the meetup are listed at $63.00. As noted earlier, there will be a “talk by Philippe Pasquier, exploring the latest in generative and creative AI systems” and “a special performance by K-PHI-A, a live trio featuring Philippe, PhD student Keon Ju Maverick Lee, and VJ Amagi (Jun Yuri). Their piece, Revival, is an improvisational audiovisual performance where human musicians and AI agents co-create in real time.”
Here’s more about Vancouver AI meetups in a video, which appears to have been excerpted from the March 2025 meetup,
We Don’t Do Panels. We Do Portals. Vancouver AI: March 2025 Recap
This wasn’t a meetup. It was a lightning strike. A 3-hour detonation of mind, matter, and machine where open-source fire met ancestral spirit, and the UFO building lit up like a neural rave.
—————-
⚡ What Went Down:
Damian George (Steloston) & his son Ethan kicked the night off with a warrior’s welcome—Indigenous songs from Tsleil-Waututh territory that cracked open the veil and set the frequency.
—————-
Cai & Charlie spun lo-fi beats with a side of C++ sorcery. DIY synths, live visuals, and analog rebellion powered by AI hacks and imagination. This is what machine-human symbiosis sounds like.
—————-
Michael Tippett dropped cinematic subversion with Mr. Canada, a gonzo AI-generated political series where satire meets social critique and deepfakes become truth bombs. (The king has a button that disables the F-35 fleet—yeah, that happened.)
—————-
Cian Whalley, Zen priest & CTO, took us beyond the binary—teaching us how emotion, code, and consciousness intersect like neural lace. Toyota factory metaphors and Digital Buddha hotlines included.
—————-
Philippe Pasquier, the SFU professor we don’t deserve, taught us how to train your own AI models on your art. No scraping, no stealing. Just artists owning their data and their destiny. Bonus: transparent LED cubes and a revival performance next month with AI-powered music agents. 🔮🎶
—————-
Michelle from Women X AI showed us what a real grassroots intelligence network looks like: 45+ women in tech meeting monthly, giving back to the DTES, and building equity into the foundation of AI.
—————-
Niels showed us what radical vulnerability looks like—raw stories of startup survival, burnout, almost crashing (literally), and choosing sustainable hustle over hypergrowth hype.
—————-
Loki Jorgensen repped the new Mind, AI, and Consciousness crew—channeling 2,000 years of philosophical grind into one big ontological jam session. Curious cats only.
—————-
Patrick Pennefather & Kevin the Pixel Wizard rolled out UBC’s AI video lab with student creators turning prompts into art and AI into cinema. Kevin’s mentorship = 🔥.
—————-
Brittany Smila, our resident poet laureate, slayed the crowd with a poem that read like a bootleg instruction manual for being human. Typos included. Plum cake recipes too.
—————-
Darby stepped up with real UX [user experience design] energy—running card sorts and mapping our collective brain to build a proper web infrastructure for the VAI [Vancouver artificial intelligence] hive mind. Web3 who?
—————-
Rival Technologies’ Julia & Dale announced our first-ever Data Storytelling Hackathon. $2,500 prize, survey data that slaps, and a chance to show how AI can amplify truth instead of burying it. (Brittany wrote the hot dog prompt, you’re welcome.)
—————-
Cloud Summit’s YK Sugi, Bibi Souza & Andre made waves repping an all-volunteer, all-heart community cloud event coming in hot during Web Summit week. Code meets care. Sponsors fund causes. Real ones only.
—————-
Fergus dropped serious policy weight—WOSK Centre for Dialogue BC AI report now live. If you want a seat at the government table, this is your guy.
—————-
Kushal closed the night with a flamethrower. Called out UBC’s xenophobic DeepSeek ban. Defended open-source warriors from China and France (💥shoutout Mistral). No prisoners. No apologies. Just truth.
—————
Khayyam Wakil wrapped it all up with the keynote of the night: a design rebel’s journey from Saskatoon boats to LA VR labs to immersive media Emmys. Lessons in surrender, reinvention, and the real art of quitting right. 🔥
—————-
📍Location: H.R. MacMillan Space Centre — Vancouver, BC (aka the UFO mothership)
🪐 Astronomers on deck. Observatories open till 11. Community stays weird till 10. 🎧 Full audio, speaker list & projects: vancouver.bc-ai.net
🎟️ Next portal opens May 28: lu.ma/VAI17 🖤❤️✊🔥🏴
We don’t do TED Talks. We host real-time cultural reckonings. This is AI for the people—and it’s only getting louder. Bring your edge. Bring your stickers. Bring your weird.
You can go here to get your ticket for the April 30, 2025 Vancouver AI Community Meetup and to find out more about some of the AI events in Vancouver. You may want to check out the possibility of getting an annual pass or membership in the hope of making attendance more affordable.
Two papers and two workshop recordings from the Metacreation Lab
From the April 2025 Metacreation Lab newsletter,
Missed the Autolume Workshop? Watch It Online Now
After holding Autolume workshops in Toronto, Montreal, and Berlin, we brought the Autolume workshop online earlier in April, and the recordings are now available.
Whether you’re new to Autolume or want a refresher, this hands-on session walks you through training your own generative models, creating real-time visuals, and exploring interactive art, all without writing a single line of code.
The Metacreation Lab will be at ISEA 2025 with both a paper presentation and a live performance.
PhD student Arshia Sobhan, with Dr. Philippe Pasquier and Dr. Gabriela Aceves-Sepúlveda, will present “Broken Letters, Broken Narratives: A Case Study on Arabic Script in DALL-E 3”. This critical case study examines how text-to-image generative AI systems, such as DALL-E 3, misrepresent Arabic calligraphy, linking these failures to historical biases and Orientalist aesthetics.
In collaboration with sound artist Joshua Rodenberg, Arshia will also present “Reprising Elements,” an audiovisual performance combining Persian calligraphy, sound art, and generative AI powered by Autolume. This performance is an artistic endeavour that celebrates the fusion of time-honoured techniques with modern advancements.
Our paper “MIDI-GPT: A Controllable Generative Model for Computer-Assisted Multitrack Music Composition” is now officially published in the proceedings of the 39th AAAI Conference on Artificial Intelligence.
MIDI-GPT leverages Transformer architecture to infill musical material at both track and bar levels, with controls for instrument type, style, note density, polyphony, and more. Our experiments show it generates original, stylistically coherent compositions while avoiding duplication from its training data. The system is already making waves through industry collaborations and artistic projects.
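For readers curious what bar-level infilling with control tokens can look like in practice, here is a minimal, hypothetical sketch. It is not the actual MIDI-GPT interface; the function and all token names are invented for illustration. The idea: a multitrack piece is serialized into a flat token sequence, per-track attribute tokens encode controls such as instrument and note density, and the bar to be regenerated is replaced with a mask token for the model to fill in.

```python
# Hypothetical illustration of control-token conditioning for bar-level
# infilling, loosely modelled on the MIDI-GPT description above.
# Token names (<TRACK_n>, <MASK_BAR>, etc.) are invented for this sketch.

def build_infill_prompt(tracks, target_track, target_bar, controls):
    """Serialize a multitrack piece into a flat token list, masking one bar
    and prefixing each track with its control tokens (instrument, density, ...)."""
    tokens = []
    for t, bars in enumerate(tracks):
        tokens.append(f"<TRACK_{t}>")
        for key, value in controls.get(t, {}).items():
            tokens.append(f"<{key.upper()}={value}>")  # e.g. <INSTRUMENT=piano>
        for b, notes in enumerate(bars):
            if t == target_track and b == target_bar:
                tokens.append("<MASK_BAR>")  # the bar the model should infill
            else:
                tokens.extend(f"NOTE_{pitch}" for pitch in notes)
            tokens.append("<BAR_END>")
        tokens.append("<TRACK_END>")
    return tokens

# Two tracks (melody and kick drum), two bars each; ask the model to
# regenerate bar 1 of the melody track at high note density.
prompt = build_infill_prompt(
    tracks=[[[60, 64, 67], [62, 65]], [[36], [38]]],
    target_track=0,
    target_bar=1,
    controls={0: {"instrument": "piano", "density": "high"}},
)
print(prompt)
```

In a real system, a sequence like this would be fed to a Transformer trained to replace the mask with new bar content consistent with the surrounding tracks and the control tokens.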
Not sure how I stumbled across this XR (extended reality) artist-in-residence programme but it’s been in place since 2022 (albeit with some changes). Here’s the announcement for the 2024 artist-in-residence, from the August 14, 2024 Consulate of France in Vancouver press release, Note: Links have been removed,
French artist Pierre Friquet, also known as PYARé, is the latest laureate of the “XR Fall” residency dedicated to XR/AR/VR [extended reality/augmented reality/virtual reality], now in its third edition. He will be in Vancouver from October 29 to November 28, 2024.
This residency is a collaboration between the Consulate General of France in Vancouver, the Alliance française of Vancouver, the cultural institution of the City of “Paris Forum des Images”, Emily Carr University of Art and Design and the Institut français.
A hybrid creator based in Paris, Pierre Friquet has been designing immersive experiences (VR, dome films, AR, video mapping) such as Spaced Out, Jet Lag, Vibrations and Patterns since 2010. His intent is to make people reconnect with their body and sense of self through art and technology.
These experiments have won awards at the Festival du Nouveau Cinéma, the Kaléidoscope festival and the Filmgate festival. His latest VR project, SPACE OUT, an immersive diving mask, was selected for the Sundance New Frontier 2020 festival and featured in the cultural programme of the Paris 2024 Olympic Games. Founder of the NiGHT collective, his projects include aquatic virtual reality.
In Vancouver, he will be working around the character of Captain Nemo, the famous warrior scientist in Jules Verne’s novel “20,000 leagues under the sea”.
The residency’s objective is to create an immersive experience allowing users to embody Captain Nemo in a VR adventure, piloting a gondola or riding a whale using intuitive VR controls. His work will focus on the symbiosis between technology and nature, marine conservation and post-colonial adventure. Project by PYARé & INVR.
Find out more about his artistic vision and creations on his website.
You have to have been resident in France for at least five years and speak English to be eligible.
Preparing for the 2025 calls for applications?
There are, in fact, three programmes: two in Vancouver, (1) the XR/AR/VR [extended reality/augmented reality/virtual reality] artist-in-residence and (2) the Arts & Sciences Quantum Studio artist-in-residence, and there’s another ‘quantum programme’ in Paris, also called the Arts & Sciences Quantum Studio artist-in-residence.
The 2025 calls haven’t been announced yet but I do have the 2024 calls for applications (originally published in French) and they should give you some idea of what questions you’ll need to answer and what materials you’ll need to prepare.
“XR Fall” Residency in Vancouver, October 29 to November 28, 2024
Initiated by the Embassy of France in Canada / Consulate General in Vancouver as part of their “Résidences Ouest-Ouest” programme, in partnership with the Forum des Images (Paris), Emily Carr University of Art + Design (Vancouver) and the Alliance française Vancouver, and with the support of the Institut français, the third edition of the “XR Fall” writing-and-research residency will take place from October 29 to November 28, 2024 in Vancouver, British Columbia, Canada.
Open to all immersive realities, this residency is intended to allow a French creator to immerse themselves in the local Vancouver ecosystem in order to enrich their writing-research project and expand their professional network. It will also be an opportunity to strengthen ties and create new synergies between France and Western Canada in the field of digital innovation. The residency will be held in Vancouver from Tuesday, October 29 to Thursday, November 28, 2024.
During the writing-research residency, the selected creator will devote themselves to developing their immersive project, for which they are invited to work in cooperation with Vancouver professionals as well as with local technical teams and production companies. The programme also aims to help the selected creator strengthen their network and skills.
1.2 – Residency schedule
From October 29 to November 28, 2024 in Vancouver, on the Emily Carr University of Art + Design campus.
1.3 – Objectives
Launch or consolidate a writing-research project.
Encourage discovery of the Western Canadian digital ecosystem, as well as structuring collaborations.
Special attention will be given to projects rooted in the local context.
At the end of the residency, the artist must submit a report on their experience, their work and the evolution of the project during this period.
1.4 – Benefits
This programme guarantees the laureate, among other things:
A residency grant of €2,000 (covering per diems and participation in three half-days of conferences/masterclasses during the residency)
Networking and connections with the local ecosystem
Participation in events in British Columbia
Other benefits (conditions to be defined together):
Presentation of the project at NewImages Festival 2025
Accreditation for the industry days (Journées pro) of NewImages Festival 2025
Presentation of the work produced during the residency (prototype, work-in-progress) at V-Unframed 2025 (Vancouver)
1.5 – Equipment and support
At Emily Carr University of Art + Design, this programme guarantees the laureate, among other things:
Access to the Basically Good Media Lab as a regular workspace. This is a collaborative space shared with undergraduate and graduate researchers and research assistants.
Access to a high-end computer: a Dell Precision 3660; 32 GB of RAM; i9-12900K (16 cores); NVIDIA GeForce RTX 3080.
Technical support: occasional technical assistance to help the artist carry out their project.
Mentorship from Emily Carr University of Art + Design to provide feedback on the artist’s project and approaches, help facilitate the use of resources, and provide potential networking opportunities with the community.
Access to other facilities on campus, subject to availability, including the Integrated Motion Studio for use as a workshop or black-box space. The Basically Good Media Lab has augmented and virtual reality headsets, along with consumer and prosumer 360° cameras.
The artist will also be supported during the residency by the teams of the Embassy of France in Canada present in Vancouver and by those of the Alliance française Vancouver.
2 – Eligibility requirements
2.1 – Candidate profile
This programme is open to any artist, creator or holder of an XR project in the writing-research stage who is:
At least 18 years old
A resident of France for at least 5 years
English-speaking
An established professional with prior experience in the field of immersive realities
2.2 – Accepted projects
This programme is open to immersive realities in all their diversity (360° or interactive virtual reality, augmented reality, mixed reality, installations incorporating immersive technologies, works tied to sound creation or 4D technology, etc.).
Projects must relate to at least one of the following major themes:
Environmental sustainability
Ecological justice and climate action
Social justice, health and community well-being
Research on land and place
3 – Application process
3.1 – About the call for applications
The project application:
Must be made in English
Must be submitted online at https://zhx2xeql.paperform.com by Sunday, June 30, 2024 (23:59 GMT)
Must be sent as a single PDF
Is free for all applicants
Also note:
Incomplete applications will not be considered
Your information is automatically saved locally; you can therefore close and/or return to the form later from the same device and browser (except in private browsing windows)
We strongly advise you not to wait until the final days of the call for applications to submit your project, in order to avoid any technical problems.
By submitting a project, you acknowledge that you hold the rights to it or are authorized by all other rights holders. The Forum des images, the Embassy of France in Canada / Consulate General in Vancouver, Emily Carr University of Art + Design, the Alliance française Vancouver and the Institut français cannot under any circumstances be held liable in the event of a claim, dispute or legal action related to the project application.
3.2 – Required information
Before applying, please review the information and documents required in the presentation file to be attached to your application (in the same order as below):
The planned work schedule for the residency (provisional)
Visuals of the project (if applicable)
A letter of recommendation and/or a letter from a French cultural institution supporting the project with a view to a future exhibition or production of the work (optional)
The Embassy of France in Canada, in partnership with the Quantum Information Center Sorbonne (Sorbonne Université), CENTQUATRE-PARIS (Paris) and the Ville de Paris international residency programme at the Récollets, is launching the French edition of the “Quantum Studio” arts-sciences residency. This artist residency will take place from September 9 to 30, 2024 in Paris, France. It is aimed at a Canadian artist residing in British Columbia who is exploring the intersections between arts and sciences.
Open to all artistic practices, the residency seeks to build exchanges between the arts and the quantum sciences (quantum physics, quantum computing, physics of the infinitely small, materials science, fundamental physics).
The Quantum Information Center Sorbonne (Sorbonne Université) and CENTQUATRE-PARIS will offer the selected artist a space for reflection in which artists and researchers can come together, exchange on their practices, learn from one another and think together about a creative project at the crossroads of arts and sciences. Ahead of the residency in Paris, several online meetings will be organized to establish and maintain initial contact between the laureate artist in Canada and the host team (institutions and scientists) in Paris.
1.2 – Residency schedule
From September 9 to 30, 2024 in Paris (accommodation at the Couvent des Récollets).
1.3 – Objectives
Launch or consolidate a creative project.
The laureate is required to deliver a research report or artistic output (written project, sketches and drawings, artwork, etc.) during their stay.
Share their work at arts-and-sciences seminars co-organized with the Quantum Information Center Sorbonne and CENTQUATRE-PARIS.
Encourage discovery of the Parisian scientific and artistic ecosystem, as well as structuring collaborations. Special attention will be given to projects rooted in the local context.
1.4 – Benefits
This programme guarantees the laureate, among other things:
3 weeks of residency in Paris.
Accommodation at the Couvent des Récollets (Ville de Paris), a workspace at the Quantum Information Center Sorbonne and a production office at CENTQUATRE-PARIS.
Full coverage of costs (Vancouver–Paris flights, accommodation).
A residency fee of €1,635 (covering per diems and participation in three half-days of conferences/masterclasses during the residency).
Networking and connections with the local ecosystem.
Participation, during the residency, in meetings with the 104factory teams, in residency openings at CENTQUATRE-PARIS and in events taking place at CENTQUATRE-PARIS.
Possible post-residency participation in events connected with Némo, the Île-de-France region’s international digital arts biennale, produced by CENTQUATRE-PARIS.
2 – Eligibility requirements
2.1 – Candidate profile
An artist with an artistic project in the writing or development stage,
At least 18 years old,
A Canadian citizen or holder of a Canadian permanent resident card
Residing in British Columbia,
Ideally with prior experience creating work combining arts and sciences (applications from artists who have worked or are working in connection with the physical sciences will be appreciated).
2.2 – Accepted projects
This programme is open to artistic practices in all their diversity (writing, visual and plastic arts, digital arts, design, dance, performance, immersive realities, sound creation, etc.).
I have more information about the Quantum Studio artist-in-residence in Vancouver programme in an October 7, 2024 posting; scroll down to the ‘Quantum Studio’ subhead.
Just when I thought I was almost caught up, I found this. The study I will be highlighting is from August 2023 but there are interesting developments all the way into October 2023 and beyond. First, the latest in AI (artificial intelligence) devices from an October 5, 2023 article by Lucas Arender for the Daily Hive, which describes the devices as AI wearables (you could also call them wearable technology). Note: Links have been removed,
Rewind.ai launched Pendant, a necklace that records your conversations and transfers them to your smartphone, creating an audio database (of sorts) for your life.
Meta unveiled a pair of Ray-Ban smart glasses that include an AI chatbot that users can communicate with (which might make you look like you’re talking to yourself).
Sam Altman-backed startup Humane teased its new AI pin at Paris Fashion Week— a screenless lapel device that projects a smartphone-like interface onto users’ hands.
Microsoft filed a patent for an AI backpack that features GPS, voice command, and cameras that could… help us walk in the right direction?
The second item in the list ‘Ray-Ban Meta Smart Glasses’ is further described in an October 17, 2023 article by Sarah Bartnicka for the Daily Hive, Note: A link has been removed,
It’s a glorious day for tech dads everywhere: Meta and Ray-Ban smart glasses are officially for sale in Canada.
Driving the news: Meta has become the latest billion-dollar company to officially enter the smart glasses market with the second iteration [emphasis mine] of its design with Ray-Bans, now including a built-in Meta AI assistant, hands-free live streaming features, and a personal audio system.
…
This time around, the technology is better, and both Meta and Snap are pitching their smart glasses as a tool for creators to stay connected with their audiences rather than just a sleek piece of hardware that can blend your digital and physical realities [augmented or extended reality?].
…
Yes, but: As smart glasses creep back into the limelight, people are wary about wearing cameras on their faces. Concerns about always-on cameras and microphones that allow users to record their surroundings without the consent of others will likely stick around. [emphasis mine]
So, are these AI or smart or augmented reality (AR) glasses? In my October 22, 2021 post, I explored a number of realities in the context of the metaverse. Yes, it gets confusing. At any rate, I found these definitions,
Happily, I have found a good summarized description of VR/AR/MR/XR in a March 20, 2018 essay by North of 41 on medium.com,
“Summary: VR is immersing people into a completely virtual environment; AR is creating an overlay of virtual content, but can’t interact with the environment; MR is a mixed of virtual reality and the reality, it creates virtual objects that can interact with the actual environment. XR brings all three Reality (AR, VR, MR) together under one term.”
If you have the interest and approximately five spare minutes, read the entire March 20, 2018 essay, which has embedded images illustrating the various realities.
…
This may change over time but, for now, the answer to the question, “AI or smart or augmented reality (AR) glasses?” is: any one of them, or all three.
Someone wearing augmented reality (AR) or “smart” glasses could be Googling your face, turning you into a cat or recording your conversation – and that creates a major power imbalance, said Cornell researchers.
Currently, most work on AR glasses focuses primarily on the experience of the wearer. Researchers from the Cornell Ann S. Bowers College of Computing and Information Science and Brown University teamed up to explore how this technology affects interactions between the wearer and another person. Their explorations showed that, while the device generally made the wearer less anxious, things weren’t so rosy on the other side of the glasses.
Jenny Fu, a doctoral student in the field of information science, presented the findings in a new study, “Negotiating Dyadic Interactions through the Lens of Augmented Reality Glasses,” at the 2023 ACM Designing Interactive Systems Conference in July.
AR glasses superimpose virtual objects and text over the field of view to create a mixed-reality world for the user. Some designs are big and bulky, but as AR technology advances, smart glasses are becoming indistinguishable from regular glasses, raising concerns that a wearer could be secretly recording someone or even generating deepfakes with their likeness.
For the new study, Fu and co-author Malte Jung, associate professor of information science and the Nancy H. ’62 and Philip M. ’62 Young Sesquicentennial Faculty Fellow, worked with Ji Won Chung, a doctoral student, and Jeff Huang, associate professor of computer science, both at Brown, and Zachary Deocadiz-Smith, an independent extended reality designer.
They observed five pairs of individuals – a wearer and a non-wearer – as each pair discussed a desert survival activity. The wearer received Spectacles, an AR glasses prototype on loan from Snap Inc., the company behind Snapchat. The Spectacles look like avant-garde sunglasses and, for the study, came equipped with a video camera and five custom filters that transformed the non-wearer into a deer, cat, bear, clown or pig-bunny.
Following the activity, the pairs engaged in a participatory design session where they discussed how AR glasses could be improved, both for the wearer and the non-wearer. The participants were also interviewed and asked to reflect on their experiences.
According to the wearers, the fun filters reduced their anxiety and put them at ease during the exercise. The non-wearers, however, reported feeling disempowered because they didn’t know what was happening on the other side of the lenses. They were also upset that the filters robbed them of control over their own appearance. The possibility that the wearer could be secretly recording them without consent – especially when they didn’t know what they looked like – also put the non-wearers at a disadvantage.
The non-wearers weren’t completely powerless, however. A few demanded to know what the wearer was seeing, and moved their faces or bodies to evade the filters – giving them some control in negotiating their presence in the invisible mixed-reality world. “I think that’s the biggest takeaway I have from this study: I’m more powerful than I thought I was,” Fu said.
Another issue is that, like many AR glasses, Spectacles have darkened lenses so the wearer can see the projected virtual images. This lack of transparency also degraded the quality of the social interaction, the researchers reported.
“There is no direct eye contact, which makes people very confused, because they don’t know where the person is looking,” Fu said. “That makes their experiences of this conversation less pleasant, because the glasses blocked out all these nonverbal interactions.”
To create more positive experiences for people on both sides of the lenses, the study participants proposed that smart glasses designers add a projection display and a recording indicator light, so people nearby will know what the wearer is seeing and recording.
Fu also suggests designers test out their glasses in a social environment and hold a participatory design process like the one in their study. Additionally, they should consider these video interactions as a data source, she said.
That way, non-wearers can have a voice in the creation of the impending mixed-reality world.
Rina Diane Caballar’s September 25, 2023 article for IEEE (Institute of Electrical and Electronics Engineers) Spectrum magazine provides a few more insights about the research, Note: Links have been removed,
…
“This AR filter interaction is likely to happen in the future with the commercial emergence of AR glasses,” says Jenny Fu, a doctoral student at Cornell University’s Bowers College of Computing and Information Science and one of the two lead authors of the study. “How will that look like, and what are the social and emotional consequences of interacting and communicating through AR glasses?”
…
“When we think about design in HCI [human-computer interface], there is often a tendency to focus on the primary user and design just for them,” Jung says. “Because these technologies are so deeply embedded in social interactions and are used with others and around others, we often forget these ‘onlookers’ and we’re not designing with them in mind.”
…
Moreover, involving nonusers is especially key in developing more equitable tech products and creating more inclusive experiences. “That’s one of the points why previous AR iterations may not have worked—they designed it for the individual and not for the people surrounding them,” says Chung. She adds that a mindset shift is needed to actively make tech that doesn’t exclude people, which could lead to social systems that promote engagement and foster a sense of belonging for everyone.
…
Caballar’s September 25, 2023 article also appears in the January 2024 print version of the IEEE Spectrum with the title “AR Glasses Upset the Social Dynamic.”
I received an April 5, 2023 announcement for the 2023 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence, and Neural Engineering (IEEE MetroXRAINE 2023) via email. Understandably given that it’s an Institute of Electrical and Electronics Engineers (IEEE) conference, they’re looking for submissions focused on developing the technology,
Last days to submit your contribution to our Special Session on “eXtended Reality as a gateway to the Metaverse: Practices, Theories, Technologies and Applications” – IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence, and Neural Engineering (IEEE MetroXRAINE 2023) – October 25-27, 2023 – Milan – https://metroxraine.org/special-session-17.
I want to remind you that the deadline of April 7 [2023] [extended to April 14, 2023 as per April 11, 2023 notice received via email] is for the submission of a 1-2 page Abstract or a Graphical Abstract to show the idea you are proposing. You will have time to finalise your work by the deadline of May 15 [2023].
Please see the CfP below for details and forward it to colleagues who might be interested in contributing to this special session.
I’m looking forward to meeting you, virtually or in your presence, at IEEE MetroXRAINE 2023.
Best regards, Giuseppe Caggianese
Research Scientist National Research Council (CNR) [Italy] Institute for High-Performance Computing and Networking (ICAR) Via Pietro Castellino 111, 80131, Naples, Italy
Here are the specifics for the Special Session’s Call for Papers (from the April 5, 2023 email announcement),
Call for Papers – Special Session on: “EXTENDED REALITY AS A GATEWAY TO THE METAVERSE: PRACTICES, THEORIES, TECHNOLOGIES AND APPLICATIONS” https://metroxraine.org/special-session-17
2023 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence, and Neural Engineering (IEEE MetroXRAINE 2023) https://metroxraine.org/
October 25-27, 2023 – Milan, Italy.
SPECIAL SESSION DESCRIPTION ————————-
The fast development of Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) solutions over the last few years is transforming how people interact, work, and communicate. The eXtended Reality (XR) term encloses all those immersive technologies that can shift the boundaries between digital and physical worlds to realize the metaverse. According to tech companies and venture capitalists, the metaverse will be a super-platform that convenes sub-platforms: social media, online video games, and ease-of-life apps, all accessible through the same digital space and sharing the same digital economy. Inside the metaverse, virtual worlds will allow avatars to carry out all human endeavours, including creation, display, entertainment, social interaction, and trading. Thus, the metaverse will evolve how users interact with brands, intellectual properties, health services, cultural heritage, and each other on the Internet. A user could join friends to play a multiplayer game, watch a movie via a streaming service and then attend a university course, precisely as in the real world.

The metaverse’s development will require new software architecture that will enable decentralized and collaborative virtual worlds. These self-organized virtual worlds will be permanent and will require maintenance operations. In addition, it will be necessary to design an efficient data management system and prevent privacy violations. Finally, the convergence of physical reality, virtually enhanced, and an always-on virtual space highlights the need to rethink the current paradigms for visualization, interaction, and sharing of digital information, moving toward more natural, intuitive, dynamically customizable, multimodal, and multi-user solutions.
This special session aims to focus on exploring how the realization of the metaverse can transform certain application domains such as: (i) healthcare, in which metaverse solutions can, for instance, improve the communication between patients and physicians; (ii) cultural heritage, with potentially more effective solutions for tourism guidance, site maintenance, and heritage object conservation; and (iii) industry, where it can enable data-driven decision making, smart maintenance, and overall asset optimisation.
The topics of interest include, but are not limited to, the following:
Hardware/Software Architectures for metaverse
Decentralized and Collaborative Architectures for metaverse
Interoperability for metaverse
Tools to help creators to build the metaverse
Operations and Maintenance in metaverse
Data security and privacy mechanisms for metaverse
Cryptocurrency, token, NFT Solutions for metaverse
Fraud-Detection in metaverse
Cyber Security for metaverse
Data Analytics to Identify Malicious Behaviors in metaverse
Blockchain/AI technologies in metaverse
Emerging Technologies and Applications for metaverse
New models to evaluate the impact of the metaverse
Interactive Data Exploration and Presentation in metaverse
Human-Computer Interaction for metaverse
Human factors issues related to metaverse
Proof-of-Concept in Metaverse: Experimental Prototyping and Testbeds
IMPORTANT DATES
Abstract Submission Deadline: April 7, 2023 (extended) NOTE: 1-2 pages abstract or a graphical abstract
Full Paper Submission Deadline: May 15, 2023 (extended)
Full Paper Acceptance Notification: June 15, 2023
Final Paper Submission Deadline: July 31, 2023
SUBMISSION AND DECISIONS ———————— Authors should prepare an Abstract (1 – 2 pages) that clearly indicates the originality of the contribution and the relevance of the work. The Abstract should include the title of the paper, names and affiliations of the authors, an abstract, keywords, an introduction describing the nature of the problem, a description of the contribution, the results achieved and their applicability.
When the first review process has been completed, authors receive a notification of either acceptance or rejection of the submission. If the abstract has been accepted, the authors can prepare a full paper. The format for the full paper is identical to the format for the abstract except for the number of pages: the full paper has a required minimum length of five (5) pages and a maximum of six (6) pages. Full Papers will be reviewed by the Technical Program Committee. Authors of accepted full papers must submit the final paper version according to the deadline, register for the workshop, and attend to present their papers. The maximum length for final papers is 6 pages. All contributions will be peer-reviewed and acceptance will be based on quality, originality and relevance. Accepted papers will be submitted for inclusion into IEEE Xplore Digital Library.
Submissions must be written in English and prepared according to the IEEE Conference Proceedings template. LaTeX and Word templates and an Overleaf sample project can be found at: https://metroxraine.org/initial-author-instructions.
The papers must be submitted in PDF format electronically via the EDAS online submission and review system: https://edas.info/newPaper.php?c=30746. To submit abstracts or draft papers to the special session, please follow the submission instructions for regular sessions, but remember to specify the special session to which the paper is directed.
The special session organizers and other external reviewers will review all submissions.
CONFERENCE PROCEEDINGS ———————————– All contributions will be peer-reviewed, and acceptance will be based on quality, originality, and relevance. Accepted papers will be submitted for inclusion into IEEE Xplore Digital Library.
Extended versions of presented papers are eligible for post-publication; more information will be provided soon.
IHEX has nothing to do with high tech witches (sigh … mildly disappointing); it is the abbreviation for “Intelligent interfaces and Human factors in EXtended environments.” I got a June 29, 2022 announcement, or call for papers, via email,
International Workshop on Intelligent interfaces and Human factors in EXtended environments (IHEX) – SITIS 2022 16th international conference on Signal Image Technology & Internet based Systems, Dijon, France, October 19-21, 2022
Dear Colleagues, It is with great pleasure that we would like to invite you to send a contribution to the International Workshop on Intelligent interfaces and Human factors in EXtended environments (IHEX) at SITIS 2022 16th international conference on Signal Image Technology & Internet based Systems (Conference website: https://www.sitis-conference.org).
The workshop is about new approaches for designing and implementing intelligent eXtended Reality systems. Please find the call for papers below and forward it to colleagues who might be interested in contributing to the workshop. For any questions and information, please do not hesitate to get in touch.
Best Regards, Giuseppe Caggianese
CFP [Call for papers] ———- eXtended Reality is becoming more and more widespread; going beyond entertainment and cultural heritage fruition purposes, these technologies offer new challenges and opportunities also in educational, industrial and healthcare domains. The research community in this field deals with technological and human factors issues, presenting theoretical and methodological proposals for perception, tracking, interaction and visualization. Increasing attention is observed towards the use of machine learning and AI methodologies to perform data analysis and reasoning, manage a multimodal interaction, and ensure an adaptation to users’ needs and preferences. The workshop is aimed at investigating new approaches for the design and implementation of intelligent eXtended Reality systems. It intends to provide a forum to share and discuss not only technological and design advances but also ethical concerns about the implications of these technologies on changing social interactions, information access and experiences.
Topics for the workshop include, but are not limited to:
– Intelligent User Interfaces in eXtended environments
– Computational Interaction for XR
– Quality and User Experience in XR
– Cognitive Models for XR
– Semantic Computing in environments
– XR-based serious games
– Virtual Agents in eXtended environments
– Adaptive Interfaces
– Visual Reasoning
– Content Modelling
– Responsible Design of eXtended Environments
– XR systems for Human Augmentation
– AI methodologies applied to XR
– ML approaches in XR
– Ethical concerns in XR
VENUE ———- University of Burgundy main campus, Dijon, France, October 19-21, 2022
WORKSHOP CO-CHAIRS ———————————– Agnese Augello, Institute for high performance computing and networking, National Research Council, Italy Giuseppe Caggianese, Institute for high performance computing and networking, National Research Council, Italy Boriana Koleva, University of Nottingham, United Kingdom
PROGRAM COMMITTEE ———————————- Agnese Augello, Institute for high performance computing and networking, National Research Council, Italy Giuseppe Caggianese, Institute for high performance computing and networking, National Research Council, Italy Giuseppe Chiazzese, Institute for Educational Technology, National Research Council, Italy Dimitri Darzentas, Edinburgh Napier University, Scotland Martin Flintham, University of Nottingham, United Kingdom Ignazio Infantino, Institute for high performance computing and networking, National Research Council, Italy Boriana Koleva, University of Nottingham, United Kingdom Emel Küpçü, Xtinge Technology Inc., Turkey Effie Lai-Chong Law, Durham University, United Kingdom Pietro Neroni, Institute for high performance computing and networking, National Research Council, Italy
SUBMISSION AND DECISIONS ——————————————- Each submission should be at most 8 pages in total including bibliography and well-marked appendices and must follow the IEEE [Institute of Electrical and Electronics Engineers] double columns publication format.
Submissions will be peer-reviewed by at least two peer reviewers. Papers will be evaluated based on relevance, significance, impact, originality, technical soundness, and quality of presentation. At least one author should attend the conference to present an accepted paper.
IMPORTANT DATES —————————-
Paper Submission: July 15, 2022
Acceptance/Reject Notification: September 9, 2022
Camera-ready: September 16, 2022
Author Registration: September 16, 2022
CONFERENCE PROCEEDINGS ——————————————– All papers accepted for presentation at the main tracks and workshops will be included in the conference proceedings, which will be published by IEEE Computer Society and referenced in IEEE Xplore Digital Library, Scopus, DBLP and major indexes.
REGISTRATION ———————– At least one author of each accepted paper must register for the conference and present the work. A single registration allows attending both track and workshop sessions.
CONTACTS —————- For any questions, please contact us via email.
As noted in the headline for this post, I have two items. For anyone unfamiliar with XR and the other (AR, MR, and VR) realities, I found a good description which I placed in my October 22, 2021 posting (scroll down to the “How many realities are there?” subhead about 70% of the way down).
eXtended Reality in Rome
I got an invitation (via a February 24, 2022 email) to participate in a special session at one of the 2022 IEEE (Institute of Electrical and Electronics Engineers) conferences (more about the conference later).
The fast development of Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) solutions over the last few years is transforming how people interact, work, and communicate. The eXtended Reality (XR) term encloses all those immersive technologies that can shift the boundaries between digital and physical worlds to realize the Metaverse. According to tech companies and venture capitalists, the Metaverse will be a super-platform that convenes sub-platforms: social media, online video games, and ease-of-life apps, all accessible through the same digital space and sharing the same digital economy. Inside the Metaverse, virtual worlds will allow avatars to carry out all human endeavours, including creation, display, entertainment, social interaction, and trading. Thus, the Metaverse will evolve how users interact with brands, intellectual properties, and each other on the Internet. A user could join friends to play a multiplayer game, watch a movie via a streaming service and then attend a university course, precisely as in the real world.
The Metaverse’s development will require new software architecture that will enable decentralized and collaborative virtual worlds. These self-organized virtual worlds will be permanent and will require maintenance operations. In addition, it will be necessary to design an efficient data management system and prevent privacy violations. Finally, the convergence of physical reality, virtually enhanced, and an always-on virtual space highlights the need to rethink the current paradigms for visualization, interaction, and sharing of digital information, moving toward more natural, intuitive, dynamically customizable, multimodal, and multi-user solutions.
TOPICS
The topics of interest include, but are not limited to, the following:
Hardware/Software Architectures for Metaverse
Decentralized and Collaborative Architectures for Metaverse
Interoperability for Metaverse
Tools to help creators to build the Metaverse
Operations and Maintenance in Metaverse
Data security and privacy mechanisms for Metaverse
Cryptocurrency, token, NFT Solutions for Metaverse
Fraud-Detection in Metaverse
Cyber Security for Metaverse
Data Analytics to Identify Malicious Behaviors in Metaverse
Blockchain/AI technologies in Metaverse
Emerging Technologies and Applications for Metaverse
New models to evaluate the impact of the Metaverse
Interactive Data Exploration and Presentation in Metaverse
Human factors issues related to Metaverse
Proof-of-Concept in Metaverse: Experimental Prototyping and Testbeds
ABOUT THE ORGANIZERS
Giuseppe Caggianese is a Research Scientist at the National Research Council of Italy. He received the Laurea degree in computer science magna cum laude in 2010 and the Ph.D. degree in Methods and Technologies for Environmental Monitoring in 2013 from the University of Basilicata, Italy.
His research activities are focused on the field of Human-Computer Interaction (HCI) and Artificial Intelligence (AI) to design and test advanced interfaces adaptive to specific uses and users in both augmented and virtual reality. He authored more than 30 scientific papers published in international journals, conference proceedings, and books. He also serves on program committees of several international conferences and workshops.
Ugo Erra is an Assistant Professor (qualified as Associate Professor) at the University of Basilicata (UNIBAS), Italy. He is the founder of the Computer Graphics Laboratory at the University of Basilicata. He received an MSc/diploma degree in Computer Science from the University of Salerno, Italy, in 2001 and a PhD in Computer Science in 2004.
His research focuses on Real-Time Computer Graphics, Information Visualization, Artificial Intelligence, and Parallel Computing. He has been involved in several research projects; among these, one project was funded by the European Commission, on which he was a research fellow, and four projects were funded by Area Science Park, a public national research organization that promotes the development of innovation processes, on which he was principal investigator. He has (co-)authored about 14 international journal articles, 45 international conference proceedings papers, and two book chapters. He has supervised four PhD students. He organized the Workshop on Parallel and Distributed Agent-Based Simulations, a satellite workshop of Euro-Par, from 2013 to 2015. He has served more than 20 international conferences as a program committee member and more than ten journals as a referee.
The 2022 IEEE International Conference on Metrology for eXtended Reality, Artificial Intelligence, and Neural Engineering (IEEE MetroXRAINE 2022) will be an international event mainly aimed at creating a synergy between experts in eXtended Reality, Brain-Computer Interface, and Artificial Intelligence, with special attention to measurement [i.e., metrology].
The conference will be a unique opportunity for discussion among scientists, technologists, and companies on very specific sectors in order to increase the visibility and the scientific impact for the participants. The organizing formula will be original owing to the emphasis on the interaction between the participants to exchange ideas and material useful for their research activities.
MetroXRAINE will be configured as a synergistic collection of sessions organized by the individual members of the Scientific Committee. Round tables will be held for different projects and hot research topics. Moreover, we will have demo sessions, student contests, interactive company expositions, awards, and so on.
The Conference will be a hybrid conference [emphasis mine], with the possibility of attendance remotely or in presence.
CALL FOR PAPERS
The Program Committee invites the submission of Abstracts (1 – 2 pages) for the IEEE MetroXRAINE 2022 Conference, 26-28 October, 2022.
All contributions will be peer-reviewed and acceptance will be based on quality, originality and relevance. Accepted papers will be submitted for inclusion into IEEE Xplore Digital Library.
Extended versions of presented papers are eligible for post-publication.
…
Abstract Submission Deadline:
March 28, 2022
Full Paper Submission Deadline:
May 10, 2022
Extended Abstract Acceptance Notification:
June 10, 2022
Final Paper Submission Deadline:
July 30, 2022
According to the email invitation, “IEEE MetroXRAINE 2022 … will be held on October 26-28, 2022 in Rome.” You can find more details on the conference website.
Council of Canadian Academies launches four projects
This too is from an email. From the Council of Canadian Academies (CCA) announcement received February 27, 2022 (you can find the original February 17, 2022 CCA news release here),
The Council of Canadian Academies (CCA) is pleased to announce it will undertake four new assessments beginning this spring:
Gene-edited Organisms for Pest Control Advances in gene editing tools and technologies have made the process of changing an organism’s genome more efficient, opening up a range of potential applications. One such application is in pest control. By editing the genomes of organisms and introducing them into wild populations, it’s now possible to control insect-borne disease and invasive species, or reverse insecticide resistance in pests. But the full implications of using these methods remain uncertain.
This assessment will examine the scientific, bioethical, and regulatory challenges associated with the use of gene-edited organisms and technologies for pest control.
Sponsor: Health Canada’s Pest Management Regulatory Agency
The Future of Arctic and Northern Research in Canada The Arctic is undergoing unprecedented changes, spurred in large part by climate change and globalization. Record levels of sea ice loss are expected to lead to increased trade through the Northwest Passage. Ocean warming and changes to the tundra will transform marine and terrestrial ecosystems, while permafrost thaw will have significant effects on infrastructure and the release of greenhouse gases. As a result of these trends, Northern communities, and Canada as an Arctic and maritime country, are facing profound economic, social, and ecosystem impacts.
This assessment will examine the key foundational elements to create an inclusive, collaborative, effective, and world-class Arctic and northern science system in Canada.
Sponsor: A consortium of Arctic and northern research and science organizations from across Canada led by ArcticNet
Quantum Technologies Quantum technologies will affect all sectors of the Canadian economy. Built on the principles of quantum physics, these emerging technologies present significant opportunities in the areas of sensing and metrology, computation and communication, and data science and artificial intelligence, among others. But there is also the potential they could be used to facilitate cyberattacks, putting financial systems, utility grids, infrastructure, personal privacy, and national security at risk. A comprehensive exploration of the capabilities and potential vulnerabilities of these technologies will help to inform their future deployment across society and the economy.
This assessment will examine the impacts, opportunities, and challenges quantum technologies present for industry, governments, and people in Canada.
Sponsor: National Research Council Canada and Innovation, Science and Economic Development Canada
International Science and Technology Partnership Opportunities International partnerships focused on science, technology, and innovation can provide Canada with an opportunity to advance the state of knowledge in areas of national importance, help address global challenges, and contribute to UN Sustainable Development Goals. Canadian companies could also benefit from global partnerships to access new and emerging markets.
While there are numerous opportunities for international collaborations, Canada has finite resources to support them. Potential partnerships need to be evaluated not just on strengths in areas such as science, technology, and innovation, but also political and economic factors.
This assessment will examine how public, private, and academic organizations can evaluate and prioritize science and technology partnership opportunities with other countries to achieve key national objectives.
Sponsor: Global Affairs Canada
Gene-edited Organisms for Pest Control and International Science and Technology Partnership Opportunities are funded by Innovation, Science and Economic Development Canada (ISED). Quantum Technologies is funded by the National Research Council of Canada (NRC) and ISED, and the Future of Arctic and Northern Research in Canada is funded by a consortium of Arctic and northern research and science organizations from across Canada led by ArcticNet. The reports will be released in 2023-24.
Multidisciplinary expert panels will be appointed in the coming months for all four assessments.
You can find in-progress and completed CCA reports here.
Fingers crossed that the CCA looks a little further afield for their international experts than the US, UK, Australia, New Zealand, and northern Europe.
Finally, I’m guessing that the gene-editing and pest management report will cover and, gingerly, recommend germline editing (which is currently not allowed in Canada) and gene drives too.
It will be interesting to see who’s on that committee. If you’re really interested in the report topic, you may want to check out my April 26, 2019 posting and scroll down to the “Criminal ban on human gene-editing of inheritable cells (in Canada)” subhead where I examined what seemed to be an informal attempt to persuade policy makers to allow germline editing or gene-editing of inheritable cells in Canada.
The ‘metaverse’ seems to be everywhere these days, especially since Facebook has made a number of announcements about theirs (more about that later in this posting).
At this point, the metaverse is very hyped up despite having been around for about 30 years. According to the Wikipedia timeline (see the Metaverse entry), the first one was a MOO in 1993 called ‘The Metaverse’. In any event, it seems like it might be a good time to see what’s changed since I dipped my toe into a metaverse (Second Life by Linden Labs) in 2007.
(For grammar buffs, I switched from definite article [the] to indefinite article [a] purposefully. In reading the various opinion pieces and announcements, it’s not always clear whether they’re talking about a single, overarching metaverse [the] replacing the single, overarching internet or whether there will be multiple metaverses, in which case [a].)
The hype/the buzz … call it what you will
This September 6, 2021 piece by Nick Pringle for Fast Company dates the beginning of the metaverse to a 1992 science fiction novel before launching into some typical marketing hype (for those who don’t know, hype is the short form for hyperbole; Note: Links have been removed),
The term metaverse was coined by American writer Neal Stephenson in his 1993 sci-fi hit Snow Crash. But what was far-flung fiction 30 years ago is now nearing reality. At Facebook’s most recent earnings call [June 2021], CEO Mark Zuckerberg announced the company’s vision to unify communities, creators, and commerce through virtual reality: “Our overarching goal across all of these initiatives is to help bring the metaverse to life.”
So what actually is the metaverse? It’s best explained as a collection of 3D worlds you explore as an avatar. Stephenson’s original vision depicted a digital 3D realm in which users interacted in a shared online environment. Set in the wake of a catastrophic global economic crash, the metaverse in Snow Crash emerged as the successor to the internet. Subcultures sprung up alongside new social hierarchies, with users expressing their status through the appearance of their digital avatars.
Today virtual worlds along these lines are formed, populated, and already generating serious money. Household names like Roblox and Fortnite are the most established spaces; however, there are many more emerging, such as Decentraland, Upland, Sandbox, and the soon to launch Victoria VR.
These metaverses [emphasis mine] are peaking at a time when reality itself feels dystopian, with a global pandemic, climate change, and economic uncertainty hanging over our daily lives. The pandemic in particular saw many of us escape reality into online worlds like Roblox and Fortnite. But these spaces have proven to be a place where human creativity can flourish amid crisis.
In fact, we are currently experiencing an explosion of platforms parallel to the dotcom boom. While many of these fledgling digital worlds will become what Ask Jeeves was to Google, I predict [emphasis mine] that a few will match the scale and reach of the tech giant—or even exceed it.
Because the metaverse brings a new dimension to the internet, brands and businesses will need to consider their current and future role within it. Some brands are already forging the way and establishing a new genre of marketing in the process: direct to avatar (D2A). Gucci sold a virtual bag for more than the real thing in Roblox; Nike dropped virtual Jordans in Fortnite; Coca-Cola launched avatar wearables in Decentraland, and Sotheby’s has an art gallery that your avatar can wander in your spare time.
D2A is being supercharged by blockchain technology and the advent of digital ownership via NFTs, or nonfungible tokens. NFTs are already making waves in art and gaming. More than $191 million was transacted on the “play to earn” blockchain game Axie Infinity in its first 30 days this year. This kind of growth makes NFTs hard for brands to ignore. In the process, blockchain and crypto are starting to feel less and less like “outsider tech.” There are still big barriers to be overcome—the UX of crypto being one, and the eye-watering environmental impact of mining being the other. I believe technology will find a way. History tends to agree.
…
Detractors see the metaverse as a pandemic fad, wrapping it up with the current NFT bubble or reducing it to Zuck’s [Mark Zuckerberg and Facebook] dystopian corporate landscape. This misses the bigger behavior change that is happening among Gen Alpha. When you watch how they play, it becomes clear that the metaverse is more than a buzzword.
For Gen Alpha [emphasis mine], gaming is social life. While millennials relentlessly scroll feeds, Alphas and Zoomers [emphasis mine] increasingly stroll virtual spaces with their friends. Why spend the evening staring at Instagram when you can wander around a virtual Harajuku with your mates? If this seems ridiculous to you, ask any 13-year-old what they think.
…
Who is Nick Pringle and how accurate are his predictions?
… [the company] evolved from a computer-assisted film-making studio to a digital design and consulting company, as part of a major advertising network.
By thinking “virtual first,” you can see how these spaces become highly experimental, creative, and valuable. The products you can design aren’t bound by physics or marketing convention—they can be anything, and are now directly “ownable” through blockchain. …
I believe that the metaverse is here to stay. That means brands and marketers now have the exciting opportunity to create products that exist in multiple realities. The winners will understand that the metaverse is not a copy of our world, and so we should not simply paste our products, experiences, and brands into it.
…
I emphasized “These metaverses …” in the previous section to highlight the fact that I find the use of ‘metaverses’ vs. ‘worlds’ confusing, as the words are sometimes used as synonyms and sometimes as distinct terms. We shift terms like this all the time in all sorts of conversations, but for someone who’s an outsider to a particular occupational group or subculture, the shifts can make for confusion.
As for Gen Alpha and Zoomer, I’m not a fan of ‘Gen anything’ as shorthand for describing a cohort based on birth years. For example, “For Gen Alpha [emphasis mine], gaming is social life” ignores social and economic classes, as well as the importance of location/geography, e.g., Afghanistan in contrast to the US.
To answer the question I asked: Pringle does not mention any track record for his predictions, but I was able to discover that he is a “multiple Cannes Lions award-winning creative” (more here).
In recent months you may have heard about something called the metaverse. Maybe you’ve read that the metaverse is going to replace the internet. Maybe we’re all supposed to live there. Maybe Facebook (or Epic, or Roblox, or dozens of smaller companies) is trying to take it over. And maybe it’s got something to do with NFTs [non-fungible tokens]?
Unlike a lot of things The Verge covers, the metaverse is tough to explain for one reason: it doesn’t necessarily exist. It’s partly a dream for the future of the internet and partly a neat way to encapsulate some current trends in online infrastructure, including the growth of real-time 3D worlds.
…
Then what is the real metaverse?
There’s no universally accepted definition of a real “metaverse,” except maybe that it’s a fancier successor to the internet. Silicon Valley metaverse proponents sometimes reference a description from venture capitalist Matthew Ball, author of the extensive Metaverse Primer:
“The Metaverse is an expansive network of persistent, real-time rendered 3D worlds and simulations that support continuity of identity, objects, history, payments, and entitlements, and can be experienced synchronously by an effectively unlimited number of users, each with an individual sense of presence.”
Facebook, arguably the tech company with the biggest stake in the metaverse, describes it more simply:
“The ‘metaverse’ is a set of virtual spaces where you can create and explore with other people who aren’t in the same physical space as you.”
There are also broader metaverse-related taxonomies like one from game designer Raph Koster, who draws a distinction between “online worlds,” “multiverses,” and “metaverses.” To Koster, online worlds are digital spaces — from rich 3D environments to text-based ones — focused on one main theme. Multiverses are “multiple different worlds connected in a network, which do not have a shared theme or ruleset,” including Ready Player One’s OASIS. And a metaverse is “a multiverse which interoperates more with the real world,” incorporating things like augmented reality overlays, VR dressing rooms for real stores, and even apps like Google Maps.
If you want something a little snarkier and more impressionistic, you can cite digital scholar Janet Murray — who has described the modern metaverse ideal as “a magical Zoom meeting that has all the playful release of Animal Crossing.”
But wait, now Ready Player One isn’t a metaverse and virtual worlds don’t have to be 3D? It sounds like some of these definitions conflict with each other.
An astute observation.
…
Why is the term “metaverse” even useful? “The internet” already covers mobile apps, websites, and all kinds of infrastructure services. Can’t we roll virtual worlds in there, too?
Matthew Ball favors the term “metaverse” because it creates a clean break with the present-day internet. [emphasis mine] “Using the metaverse as a distinctive descriptor allows us to understand the enormity of that change and in turn, the opportunity for disruption,” he said in a phone interview with The Verge. “It’s much harder to say ‘we’re late-cycle into the last thing and want to change it.’ But I think understanding this next wave of computing and the internet allows us to be more proactive than reactive and think about the future as we want it to be, rather than how to marginally affect the present.”
A more cynical spin is that “metaverse” lets companies dodge negative baggage associated with “the internet” in general and social media in particular. “As long as you can make technology seem fresh and new and cool, you can avoid regulation,” researcher Joan Donovan told The Washington Post in a recent article about Facebook and the metaverse. “You can run defense on that for several years before the government can catch up.”
There’s also one very simple reason: it sounds more futuristic than “internet” and gets investors and media people (like us!) excited.
…
People keep saying NFTs are part of the metaverse. Why?
NFTs are complicated in their own right, and you can read more about them here. Loosely, the thinking goes: NFTs are a way of recording who owns a specific virtual good, creating and transferring virtual goods is a big part of the metaverse, thus NFTs are a potentially useful financial architecture for the metaverse. Or in more practical terms: if you buy a virtual shirt in Metaverse Platform A, NFTs can create a permanent receipt and let you redeem the same shirt in Metaverse Platforms B to Z.
Lots of NFT designers are selling collectible avatars like CryptoPunks, Cool Cats, and Bored Apes, sometimes for astronomical sums. Right now these are mostly 2D art used as social media profile pictures. But we’re already seeing some crossover with “metaverse”-style services. The company Polygonal Mind, for instance, is building a system called CryptoAvatars that lets people buy 3D avatars as NFTs and then use them across multiple virtual worlds.
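The “permanent receipt” idea described above can be sketched in a few lines of code. This is purely illustrative Python, not a real blockchain (there’s no cryptography, consensus, or network here, and all the names are hypothetical): an append-only list of mint and transfer events that any cooperating platform could replay to agree on who currently owns a given virtual item.

```python
# A toy sketch of the NFT "permanent receipt" idea: an append-only ledger
# records who owns each token, and any platform that trusts the ledger
# can check ownership of the same virtual item. Illustrative only.

from dataclasses import dataclass, field


@dataclass
class ToyLedger:
    """Append-only record of token mints and transfers (not a real blockchain)."""
    events: list = field(default_factory=list)  # history is never rewritten

    def mint(self, token_id: str, owner: str, item: str) -> None:
        """Create a new token representing a virtual good."""
        self.events.append(("mint", token_id, owner, item))

    def transfer(self, token_id: str, new_owner: str) -> None:
        """Record a change of ownership for an existing token."""
        if self.owner_of(token_id) is None:
            raise ValueError(f"unknown token: {token_id}")
        self.events.append(("transfer", token_id, new_owner, None))

    def owner_of(self, token_id: str):
        """Replay the event history to find the current owner (or None)."""
        owner = None
        for _kind, tid, who, _item in self.events:
            if tid == token_id:
                owner = who
        return owner


# Any "platform" reading the same ledger agrees on who owns the shirt:
ledger = ToyLedger()
ledger.mint("shirt-001", "alice", "virtual shirt bought in Platform A")
ledger.transfer("shirt-001", "bob")
print(ledger.owner_of("shirt-001"))  # → bob
```

The point of the sketch is only the shape of the mechanism: ownership lives in a shared, append-only record rather than inside any one platform’s database, which is why, in principle, a shirt bought in Platform A could be redeemed in Platforms B to Z.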
Since starting this post sometime in September 2021, the situation regarding Facebook has changed a few times. I’ve decided to begin my version of the story from a summer 2021 announcement.
On Monday, July 26, 2021, Facebook announced a new Metaverse product group. From a July 27, 2021 article by Scott Rosenberg for Yahoo News (Note: A link has been removed),
Facebook announced Monday it was forming a new Metaverse product group to advance its efforts to build a 3D social space using virtual and augmented reality tech.
…
Facebook’s new Metaverse product group will report to Andrew Bosworth, Facebook’s vice president of virtual and augmented reality [emphasis mine], who announced the new organization in a Facebook post.
…
Facebook, integrity, and safety in the metaverse
On September 27, 2021 Facebook posted this webpage (Building the Metaverse Responsibly by Andrew Bosworth, VP, Facebook Reality Labs [emphasis mine] and Nick Clegg, VP, Global Affairs) on its site,
The metaverse won’t be built overnight by a single company. We’ll collaborate with policymakers, experts and industry partners to bring this to life.
We’re announcing a $50 million investment in global research and program partners to ensure these products are developed responsibly.
We develop technology rooted in human connection that brings people together. As we focus on helping to build the next computing platform, our work across augmented and virtual reality and consumer hardware will deepen that human connection regardless of physical distance and without being tied to devices.
…
Introducing the XR [extended reality] Programs and Research Fund
There’s a long road ahead. But as a starting point, we’re announcing the XR Programs and Research Fund, a two-year $50 million investment in programs and external research to help us in this effort. Through this fund, we’ll collaborate with industry partners, civil rights groups, governments, nonprofits and academic institutions to determine how to build these technologies responsibly.
Rebranding Facebook’s integrity and safety issues away?
It seems Facebook’s credibility issues are such that the company is about to rebrand itself according to an October 19, 2021 article by Alex Heath for The Verge (Note: Links have been removed),
Facebook is planning to change its company name next week to reflect its focus on building the metaverse, according to a source with direct knowledge of the matter.
The coming name change, which CEO Mark Zuckerberg plans to talk about at the company’s annual Connect conference on October 28th [2021], but could unveil sooner, is meant to signal the tech giant’s ambition to be known for more than social media and all the ills that entail. The rebrand would likely position the blue Facebook app as one of many products under a parent company overseeing groups like Instagram, WhatsApp, Oculus, and more. A spokesperson for Facebook declined to comment for this story.
Facebook already has more than 10,000 employees building consumer hardware like AR glasses that Zuckerberg believes will eventually be as ubiquitous as smartphones. In July, he told The Verge that, over the next several years, “we will effectively transition from people seeing us as primarily being a social media company to being a metaverse company.”
A rebrand could also serve to further separate the futuristic work Zuckerberg is focused on from the intense scrutiny Facebook is currently under for the way its social platform operates today. A former employee turned whistleblower, Frances Haugen, recently leaked a trove of damning internal documents to The Wall Street Journal and testified about them before Congress. Antitrust regulators in the US and elsewhere are trying to break the company up, and public trust in how Facebook does business is falling.
Facebook isn’t the first well-known tech company to change its company name as its ambitions expand. In 2015, Google reorganized entirely under a holding company called Alphabet, partly to signal that it was no longer just a search engine, but a sprawling conglomerate with companies making driverless cars and health tech. And Snapchat rebranded to Snap Inc. in 2016, the same year it started calling itself a “camera company” and debuted its first pair of Spectacles camera glasses.
…
If you have time, do read Heath’s article in its entirety.
An October 20, 2021 Thomson Reuters item on CBC (Canadian Broadcasting Corporation) news online includes quotes from some industry analysts about the rebrand,
…
“It reflects the broadening out of the Facebook business. And then, secondly, I do think that Facebook’s brand is probably not the greatest given all of the events of the last three years or so,” internet analyst James Cordwell at Atlantic Equities said.
…
“Having a different parent brand will guard against having this negative association transferred into a new brand, or other brands that are in the portfolio,” said Shankha Basu, associate professor of marketing at University of Leeds.
…
Tyler Jadah’s October 20, 2021 article for the Daily Hive includes an earlier announcement (not mentioned in the other two articles about the rebranding), Note: A link has been removed,
…
Earlier this week [October 17, 2021], Facebook announced it will start “a journey to help build the next computing platform” and will hire 10,000 new high-skilled jobs within the European Union (EU) over the next five years.
“Working with others, we’re developing what is often referred to as the ‘metaverse’ — a new phase of interconnected virtual experiences using technologies like virtual and augmented reality,” wrote Facebook’s Nick Clegg, the VP of Global Affairs. “At its heart is the idea that by creating a greater sense of “virtual presence,” interacting online can become much closer to the experience of interacting in person.”
Clegg says the metaverse has the potential to help unlock access to new creative, social, and economic opportunities across the globe and the virtual world.
In an email with Facebook’s Corporate Communications Canada, David Troya-Alvarez told Daily Hive, “We don’t comment on rumour or speculation,” in regards to The Verge‘s report.
I will update this posting when and if Facebook rebrands itself into a ‘metaverse’ company.
***See Oct. 28, 2021 update at the end of this posting and prepare yourself for ‘Meta’.***
Who (else) cares about integrity and safety in the metaverse?
In technology, first-mover advantage is often significant. This is why BigTech and other online platforms are beginning to acquire software businesses to position themselves for the arrival of the Metaverse. They hope to be at the forefront of profound changes that the Metaverse will bring in relation to digital interactions between people, between businesses, and between them both.
What is the Metaverse? The short answer is that it does not exist yet. At the moment it is a vision for what the future will be like, where personal and commercial life is conducted digitally in parallel with our lives in the physical world. Sounds too much like science fiction? For something that does not exist yet, the Metaverse is drawing a huge amount of attention and investment in the tech sector and beyond.
Here we look at what the Metaverse is, what its potential is for disruptive change, and some of the key legal and regulatory issues future stakeholders may need to consider.
What are the potential legal issues?
The revolutionary nature of the Metaverse is likely to give rise to a range of complex legal and regulatory issues. We consider some of the key ones below. As time goes by, naturally enough, new ones will emerge.
Data
Participation in the Metaverse will involve the collection of unprecedented amounts and types of personal data. Today, smartphone apps and websites allow organisations to understand how individuals move around the web or navigate an app. Tomorrow, in the Metaverse, organisations will be able to collect information about individuals’ physiological responses, their movements and potentially even brainwave patterns, thereby gauging a much deeper understanding of their customers’ thought processes and behaviours.
Users participating in the Metaverse will also be “logged in” for extended amounts of time. This will mean that patterns of behaviour will be continually monitored, enabling the Metaverse and the businesses (vendors of goods and services) participating in the Metaverse to understand how best to service the users in an incredibly targeted way.
The hungry Metaverse participant
How might actors in the Metaverse target persons participating in the Metaverse? Let us assume one such woman is hungry at the time of participating. The Metaverse may observe a woman frequently glancing at café and restaurant windows and stopping to look at cakes in a bakery window, and determine that she is hungry and serve her food adverts accordingly.
Contrast this with current technology, where a website or app can generally only ascertain this type of information if the woman actively searched for food outlets or similar on her device.
Therefore, in the Metaverse, a user will no longer need to proactively provide personal data by opening up their smartphone and accessing their webpage or app of choice. Instead, their data will be gathered in the background while they go about their virtual lives.
This type of opportunity comes with great data protection responsibilities. Businesses developing, or participating in, the Metaverse will need to comply with data protection legislation when processing personal data in this new environment. The nature of the Metaverse raises a number of issues around how that compliance will be achieved in practice.
Who is responsible for complying with applicable data protection law?
In many jurisdictions, data protection laws place different obligations on entities depending on whether an entity determines the purpose and means of processing personal data (referred to as a “controller” under the EU General Data Protection Regulation (GDPR)) or just processes personal data on behalf of others (referred to as a “processor” under the GDPR).
In the Metaverse, establishing which entity or entities have responsibility for determining how and why personal data will be processed, and who processes personal data on behalf of another, may not be easy. It will likely involve picking apart a tangled web of relationships, and there may be no obvious or clear answers – for example:
Will there be one main administrator of the Metaverse who collects all personal data provided within it and determines how that personal data will be processed and shared? Or will multiple entities collect personal data through the Metaverse and each determine their own purposes for doing so?
Either way, many questions arise, including:
How should the different entities each display their own privacy notice to users? Or should this be done jointly? How and when should users’ consent be collected? Who is responsible if users’ personal data is stolen or misused while they are in the Metaverse? What data sharing arrangements need to be put in place and how will these be implemented?
…
There’s a lot more to this page including a look at Social Media Regulation and Intellectual Property Rights.
I’m starting to think we should be talking about RR (real reality), as well as VR (virtual reality), AR (augmented reality), MR (mixed reality), and XR (extended reality). It seems that all of these (except RR, which is implied) will be part of the ‘metaverse’, assuming that it ever comes into existence. Happily, I have found a good summarized description of VR/AR/MR/XR in a March 20, 2018 essay by North of 41 on medium.com,
Summary: VR is immersing people into a completely virtual environment; AR is creating an overlay of virtual content, but can’t interact with the environment; MR is a mixed of virtual reality and the reality, it creates virtual objects that can interact with the actual environment. XR brings all three Reality (AR, VR, MR) together under one term.
If you have the interest and approximately five spare minutes, read the entire March 20, 2018 essay, which has embedded images illustrating the various realities.
Here’s a description from one of the researchers, Mohamed Kari, of the video, which you can see above, and the paper he and his colleagues presented at the 20th IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2021 (from the TransforMR page on YouTube),
We present TransforMR, a video see-through mixed reality system for mobile devices that performs 3D-pose-aware object substitution to create meaningful mixed reality scenes in previously unseen, uncontrolled, and open-ended real-world environments.
To get a sense of how recent this work is, ISMAR 2021 was held from October 4 – 8, 2021.
The team’s 2021 ISMAR paper, TransforMR: Pose-Aware Object Substitution for Composing Alternate Mixed Realities, by Mohamed Kari, Tobias Grosse-Puppendahl, Luis Falconeri Coelho, Andreas Rene Fender, David Bethge, Reinhard Schütte, and Christian Holz lists two educational institutions I’d expect to see (University of Duisburg-Essen and ETH Zürich); the surprise was this one: Porsche AG. Perhaps that explains the preponderance of vehicles in this demonstration.
Space walking in virtual reality
Ivan Semeniuk’s October 2, 2021 article for the Globe and Mail highlights a collaboration between Montreal’s Felix and Paul Studios with NASA (US National Aeronautics and Space Administration) and Time studios,
Communing with the infinite while floating high above the Earth is an experience that, so far, has been known to only a handful.
Now, a Montreal production company aims to share that experience with audiences around the world, following the first ever recording of a spacewalk in the medium of virtual reality.
…
The company, which specializes in creating virtual-reality experiences with cinematic flair, got its long-awaited chance in mid-September when astronauts Thomas Pesquet and Akihiko Hoshide ventured outside the International Space Station for about seven hours to install supports and other equipment in preparation for a new solar array.
The footage will be used in the fourth and final instalment of Space Explorers: The ISS Experience, a virtual-reality journey to space that has already garnered a Primetime Emmy Award for its first two episodes.
From the outset, the production was developed to reach audiences through a variety of platforms for 360-degree viewing, including 5G-enabled smart phones and tablets. A domed theatre version of the experience for group audiences opened this week at the Rio Tinto Alcan Montreal Planetarium. Those who desire a more immersive experience can now see the first two episodes in VR form by using a headset available through the gaming and entertainment company Oculus. Scenes from the VR series are also on offer as part of The Infinite, an interactive exhibition developed by Montreal’s Phi Studio, whose works focus on the intersection of art and technology. The exhibition, which runs until Nov. 7 [2021], has attracted 40,000 visitors since it opened in July [2021?].
…
At a time when billionaires are able to head off on private extraterrestrial sojourns that almost no one else could dream of, Lajeunesse [Félix Lajeunesse, co-founder and creative director of Felix and Paul studios] said his project was developed with a very different purpose in mind: making it easier for audiences to become eyewitnesses rather than distant spectators to humanity’s greatest adventure.
…
For the final instalments, the storyline takes viewers outside of the space station with cameras mounted on the Canadarm, and – for the climax of the series – by following astronauts during a spacewalk. These scenes required extensive planning, not only because of the limited time frame in which they could be gathered, but because of the lighting challenges presented by a constantly shifting sun as the space station circles the globe once every 90 minutes.
…
… Lajeunesse said that it was equally important to acquire shots that are not just technically spectacular but that serve the underlying themes of Space Explorers: The ISS Experience. These include an examination of human adaptation and advancement, and the unity that emerges within a group of individuals from many places and cultures and who must learn to co-exist in a high risk environment in order to achieve a common goal.
There always seems to be a lot of grappling with new and newish science/technology where people strive to coin terms and define them while everyone, including members of the corporate community, attempts to cash in.
The last time I looked (probably about two years ago), I wasn’t able to find any good definitions for alternate reality and mixed reality. (By good, I mean something which clearly explicated the difference between the two.) It was nice to find something this time.
As for Facebook and its attempts to join/create a/the metaverse, the company’s timing seems particularly fraught. As well, paradigm-shifting technology doesn’t usually start with large corporations. The company is ignoring its own history.
Multiverses
Writing this piece has reminded me of the upcoming movie, “Doctor Strange in the Multiverse of Madness” (Wikipedia entry). While this multiverse is based on a comic book, the idea of a Multiverse (Wikipedia entry) has been around for quite some time,
Early recorded examples of the idea of infinite worlds existed in the philosophy of Ancient Greek Atomism, which proposed that infinite parallel worlds arose from the collision of atoms. In the third century BCE, the philosopher Chrysippus suggested that the world eternally expired and regenerated, effectively suggesting the existence of multiple universes across time.[1] The concept of multiple universes became more defined in the Middle Ages.
…
Multiple universes have been hypothesized in cosmology, physics, astronomy, religion, philosophy, transpersonal psychology, music, and all kinds of literature, particularly in science fiction, comic books and fantasy. In these contexts, parallel universes are also called “alternate universes”, “quantum universes”, “interpenetrating dimensions”, “parallel universes”, “parallel dimensions”, “parallel worlds”, “parallel realities”, “quantum realities”, “alternate realities”, “alternate timelines”, “alternate dimensions” and “dimensional planes”.
The physics community has debated the various multiverse theories over time. Prominent physicists are divided about whether any other universes exist outside of our own.
…
Living in a computer simulation or base reality
The whole thing is getting a little confusing for me, so I think I’ll stick with RR (real reality) or, as it’s also known, base reality. For the notion of base reality, I want to thank astronomer David Kipping of Columbia University, quoted in Anil Ananthaswamy’s article, for this analysis of the idea that we might all be living in a computer simulation (from my December 8, 2020 posting; scroll down about 50% of the way to the “Are we living in a computer simulation?” subhead),
… there is a more obvious answer: Occam’s razor, which says that in the absence of other evidence, the simplest explanation is more likely to be correct. The simulation hypothesis is elaborate, presuming realities nested upon realities, as well as simulated entities that can never tell that they are inside a simulation. “Because it is such an overly complicated, elaborate model in the first place, by Occam’s razor, it really should be disfavored, compared to the simple natural explanation,” Kipping says.
Maybe we are living in base reality after all—The Matrix, Musk and weird quantum physics notwithstanding.
To sum it up (briefly)
I’m sticking with the base reality (or real reality) concept, which is where various people and companies are attempting to create either a multiplicity of metaverses or the metaverse that effectively replaces the internet. This metaverse can include any and all of these realities (AR/MR/VR/XR) along with base reality. As for Facebook’s attempt to build ‘the metaverse’, it seems a little grandiose.
The computer simulation theory is an interesting thought experiment (just like the multiverse is an interesting thought experiment). I’ll leave them there.
Wherever it is we are living, these are interesting times.
***Updated October 28, 2021: D. (Devindra) Hardawar’s October 28, 2021 article for engadget offers details about the rebranding along with a dash of cynicism (Note: A link has been removed),
Here’s what Facebook’s metaverse isn’t: It’s not an alternative world to help us escape from our dystopian reality, a la Snow Crash. It won’t require VR or AR glasses (at least, not at first). And, most importantly, it’s not something Facebook wants to keep to itself. Instead, as Mark Zuckerberg described to media ahead of today’s Facebook Connect conference, the company is betting it’ll be the next major computing platform after the rise of smartphones and the mobile web. Facebook is so confident, in fact, Zuckerberg announced that it’s renaming itself to “Meta.”
After spending the last decade becoming obsessed with our phones and tablets — learning to stare down and scroll practically as a reflex — the Facebook founder thinks we’ll be spending more time looking up at the 3D objects floating around us in the digital realm. Or maybe you’ll be following a friend’s avatar as they wander around your living room as a hologram. It’s basically a digital world layered right on top of the real world, or an “embodied internet” as Zuckerberg describes.
Before he got into the weeds for his grand new vision, though, Zuckerberg also preempted criticism about looking into the future now, as the Facebook Papers paint the company as a mismanaged behemoth that constantly prioritizes profit over safety. While acknowledging the seriousness of the issues the company is facing, noting that it’ll continue to focus on solving them with “industry-leading” investments, Zuckerberg said:
“The reality is is that there’s always going to be issues and for some people… they may have the view that there’s never really a great time to focus on the future… From my perspective, I think that we’re here to create things and we believe that we can do this and that technology can make things better. So we think it’s important to to push forward.”
Given the extent to which Facebook, and Zuckerberg in particular, have proven to be untrustworthy stewards of social technology, it’s almost laughable that the company wants us to buy into its future. But, like the rise of photo sharing and group chat apps, Zuckerberg at least has a good sense of what’s coming next. And for all of his talk of turning Facebook into a metaverse company, he’s adamant that he doesn’t want to build a metaverse that’s entirely owned by Facebook. He doesn’t think other companies will either. Like the mobile web, he thinks every major technology company will contribute something towards the metaverse. He’s just hoping to make Facebook a pioneer.
“Instead of looking at a screen, or today, how we look at the Internet, I think in the future you’re going to be in the experiences, and I think that’s just a qualitatively different experience,” Zuckerberg said. It’s not quite virtual reality as we think of it, and it’s not just augmented reality. But ultimately, he sees the metaverse as something that’ll help to deliver more presence for digital social experiences — the sense of being there, instead of just being trapped in a zoom window. And he expects there to be continuity across devices, so you’ll be able to start chatting with friends on your phone and seamlessly join them as a hologram when you slip on AR glasses.
…
D. (Devindra) Hardawar’s October 28, 2021 article provides a lot more details and I recommend reading it in its entirety.
The event description is quite exciting and the poster image is engaging, although …
Courtesy: Vancouver Biennale
Did they intend for the blocks to the left and right (gateway to the bridge?) to look like someone holding both hands giving you the finger on each side? Now that I’ve seen it, I can’t ‘unsee’ it.
Moving on, there’s more information about the expo from a Nov. 9, 2020 Vancouver Biennale announcement (received via email),
The Vancouver Biennale announces a global invitation to #ArtProject2020, a free virtual art and technology expo about how the latest technologies are influencing the art world. The expo will run from November 11th to 15th and feature over 80 international speakers and 40 events offering accessible information and educational resources for digital art. Everyone with a personal or professional interest in art and technology, including curators, galleries, museums, artists, collectors, innovators, experience designers, and futurists will find the expo fascinating and is invited to register. Trilingual programming in English, Spanish, and Chinese will be available.
To reserve a free ticket and see the complete speaker list and schedule, visit www.artproject.io.
Curated by New York-based Colombian artist Jessica Angel, the expo will accompany the Vancouver Biennale’s first exhibition of tokenized art with new works by Jessica Angel, Dina Goldstein, Diana Thorneycroft, and Kristin McIver. Tokenized art is powered by blockchain technology and has redefined digital artwork ownership, allowing artists and collectors the benefit of true digital scarcity. The exhibition will be launched via the blockchain marketplace, Ephimera.
About the Expo
Panel Discussions, Artist Talks, Keynote Speakers: Innovators, curators, legal experts, and artists working at the leading edge of digital art will cover topics including What Is Cryptoart?, Finding Opportunity in the Digital, Women Leading the Art and Tech Movement, The Art of Immersion, Decentralising Power and Resources in the Art World, and Tools for Artists and Collectors. Speakers include The Whitney Museum, Victoria & Albert Museum, Christie’s, Foundation for Art and Blockchain, SuperRare, and Art in America.
Learning: Barrier-free educational workshops will teach participants about using open-source and accessible innovative tools to create, monetize, and collect digital art. Workshops are integrated with various blockchain projects to drive adoption through experience. Featured presenters include Ephimera, Status, and MakerDAO. Indigenous Matriarchs 4 will present from the Immersive Knowledge Transfer series for XR media creators, artists, and storytellers from diverse cultural communities.
Activities: A Crypto-Art Puzzle will drop clues every day of the event, and the Digital Art Battle will challenge artists to draw live. This gamified experience will offer winners rewards in different tokens. Participants can also join the Rare AF team on a Virtual Gallery Tour through the Metaverse, where gallery owners will share the inspirations behind their virtual spaces.
Anchoring the virtual expo is a future physical installation by Jessica Angel. Cleverly titled Voxel Bridge, this public artwork will transform the area underneath Vancouver’s Cambie Street Bridge into a three-layered immersive experience to transport visitors between physical and digital worlds. Working with the vastness of the concrete bridge as first layer, Angel adds her site-specific installation as a second layer, and completes the experience with augmented reality enhancements over the real world as the third and final layer. The installation is slated for completion in Spring 2021 as part of the Vancouver Biennale Exhibition.
“I never want to see the Biennale stuck in the past, presenting only static sculpture in an ever-changing world. We work with what comes next, the yet unknown, and we want to go where the future is heading and where public art has, perhaps, always been going. I am excited for this expo and the next chapter of the Biennale.” – Barrie Mowatt, Founder & Artistic Director of Vancouver Biennale
“Art is a mobilizing force with the power to bridge seemingly dissimilar worlds, and Voxel Bridge exhibits this capacity. This expo transcends the enjoyment of art into a unifying and experimenting effort, that enables blockchain technology and established art institutions to examine ways of interaction. Join us in the virtual public space, to learn, and to cultivate new forms of participation.” – Jessica Angel, Artist
Do check the schedule: http://www.artproject.io/ (keep scrolling), and don’t forget it’s free in exchange for your registration information. Enjoy!