Metacreation Lab’s greatest hits of Summer 2023

I received a May 31, 2023 ‘newsletter’ (via email) from Simon Fraser University’s (SFU) Metacreation Lab for Creative Artificial Intelligence, and the first item celebrates some current and past work,

International Conference on New Interfaces for Musical Expression | NIME 2023
May 31 – June 2 | Mexico City, Mexico

We’re excited to be a part of NIME 2023, launching in Mexico City this week! 

As part of the NIME Paper Sessions, some of the Metacreation Lab’s members and affiliates will be presenting a paper based on case studies of musicians playing with virtual musical agents. Titled eTu{d,b}e, the paper was co-authored by Tommy Davis, Kasey LV Pocius, and Vincent Cusson, developers of the eTube instrument, along with music technology and interface researchers Marcelo Wanderley and Philippe Pasquier. Learn about the project and listen to sessions involving human and non-human musicians.

This research project involved experimenting with Spire Muse, a virtual performance agent co-developed by Metacreation Lab members. The paper introducing that system received the best paper award at the 2021 International Conference on New Interfaces for Musical Expression (NIME).

Learn more about the NIME 2023 conference and program at the link below; the conference will also present a series of online music concerts later this week.

Learn more about NIME 2023

Coming up later this summer and also from the May 31, 2023 newsletter,

Evaluating Human-AI Interaction for MMM-C: a Creative AI System for Music Composition | IJCAI [2023 International Joint Conference on Artificial Intelligence] Preview

For those following the impact of AI on music composition and production, we would like to share a sneak peek at a study of user experiences with an experimental AI-composition tool [Multi-Track Music Machine (MMM)] integrated into the Steinberg Cubase digital audio workstation. Conducted in partnership with Steinberg, the study will be presented at the 2023 International Joint Conference on Artificial Intelligence (IJCAI 2023), as part of the conference’s Arts and Creativity track. This year’s IJCAI conference takes place in Macao from August 19 to 25, 2023.

The conference is being held in Macao (or Macau), which is officially (according to its Wikipedia entry) the Macao Special Administrative Region of the People’s Republic of China (MSAR). It has a longstanding reputation as an international gambling and party mecca comparable to Las Vegas.

Cyborgian dance at McGill University (Canada)

As noted in the Council of Canadian Academies report (The State of Science and Technology in Canada, 2012), which was mentioned in my Dec. 28, 2012 posting, the visual and performing arts are an area of Canadian strength, and that strength is due largely to one province, Québec. Mark Wilson’s Aug. 13, 2013 article for Fast Company and Paul Ridden’s Aug. 7, 2013 article for gizmag.com about McGill University’s Instrumented Bodies: Digital Prostheses for Music and Dance Performance seem to confirm Québec’s leadership.

From Wilson’s Aug. 13, 2013 article (Note: A link has been removed),

One is a glowing exoskeleton spine, while another looks like a pair of cyborg butterfly wings. But these aren’t just costumes; they’re wearable, functional art.

In fact, the team of researchers from the IDMIL (Input Devices and Music Interaction Laboratory [at McGill University]) who are responsible for the designs go so far as to call their creations “prosthetic instruments.”

Ridden’s Aug. 7, 2013 article offers more about the project’s history and technology,

For the last three years, a small research team at McGill University has been working with a choreographer, a composer, dancers and musicians on a project named Instrumented Bodies. Three groups of sensor-packed, internally-lit digital music controllers that attach to a dancer’s costume have been developed, each capable of wirelessly triggering synthesized music as the performer moves around the stage. Sounds are produced by tapping or stroking transparent Ribs or Visors, or by twisting, turning or moving Spines. Though work on the project continues, the instruments have already been used in a performance piece called Les Gestes which toured Canada and Europe during March and April.

Both articles are interesting: Wilson’s is the faster read, while Ridden’s gives you information you can’t find on the Instrumented Bodies: Digital Prostheses for Music and Dance Performance project webpage,

These instruments are the culmination of a three-year long project in which the designers worked closely with dancers, musicians, composers and a choreographer. The goal of the project was to develop instruments that are visually striking, utilize advanced sensing technologies, and are rugged enough for extensive use in performance.

The complex, transparent shapes are lit from within, and include articulated spines, curved visors and ribcages. Unlike most computer music control interfaces, they function both as hand-held, manipulable controllers and as wearable, movement-tracking extensions to the body. Further, since the performers can smoothly attach and detach the objects, these new instruments deliberately blur the line between the performers’ bodies and the instrument being played.

The prosthetic instruments were designed and developed by Ph.D. researchers Joseph Malloch and Ian Hattwick [and Marlon Schumacher] under the supervision of IDMIL director Marcelo Wanderley. Starting with sketches and rough foam prototypes for exploring shape and movement, they progressed through many iterations of the design before arriving at the current versions. The researchers made heavy use of digital fabrication technologies such as laser-cutters and 3D printers, which they accessed through the McGill University School of Architecture and the Centre for Interdisciplinary Research in Music Media and Technology, also hosted by McGill.

Each of the nearly thirty working instruments produced for the project has embedded sensors, power supplies and wireless data transceivers, allowing a performer to control the parameters of music synthesis and processing in real time through touch, movement, and orientation. The signals produced by the instruments are routed through an open-source peer-to-peer software system the IDMIL team has developed for designing the connections between sensor signals and sound synthesis parameters.
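The webpage doesn’t name the mapping software, so purely as a loose illustration of the architecture described above, here is a minimal Python sketch: a simulated sensor reading is scaled and routed to a sound-synthesis parameter over OSC (Open Sound Control), using the python-osc library. The host, port, OSC address, sensor, and parameter ranges are all hypothetical stand-ins, not details of the IDMIL team’s actual system.

```python
# Hypothetical sketch: route a (simulated) sensor signal to a synthesis
# parameter over OSC. Not the IDMIL software; all names are illustrative.
import math
import time

from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

SYNTH_HOST, SYNTH_PORT = "127.0.0.1", 57120  # e.g., a local synthesis engine


def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map a sensor reading onto a synthesis-parameter range."""
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + max(0.0, min(1.0, t)) * (out_hi - out_lo)


client = SimpleUDPClient(SYNTH_HOST, SYNTH_PORT)

for step in range(200):
    angle = 90 * math.sin(step / 20)           # stand-in for a bend sensor on a "Spine"
    cutoff = scale(angle, -90, 90, 200, 4000)  # degrees -> filter cutoff in Hz
    client.send_message("/synth/filter/cutoff", cutoff)  # hypothetical OSC address
    time.sleep(0.02)                           # ~50 Hz control rate
```

In the actual instruments, many such signal-to-parameter connections would presumably be created and edited through the team’s mapping software rather than hard-coded, which is what lets the relationship between gesture and sound be redesigned between performances.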

For those who prefer to listen and watch, the researchers have created a video documentary,

I usually don’t include videos that run past five minutes, but I’ve made an exception for this almost 15-minute documentary.

I was trying to find mention of a dancer and/or choreographer associated with this project and, in Ridden’s article, found two early-stage participants: choreographer Isabelle Van Grimde and composer Sean Ferguson.