Tag Archives: Sean Ferguson

Interconnected performance analysis music hub shared by McGill University and Université de Montréal announced* June 2, 2016

The press releases promise the Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT) will shape the future of music. The CIRMMT June 2, 2016 (Future of Music) press release (received via email) describes the funding support,

A significant investment of public and private support that will redefine the future of music research in Canada by transforming the way musicians compose, listen to and perform music.

The Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT), the Schulich School of Music of McGill University and the Faculty of Music of l’Université de Montréal are creating a unique interconnected research hub that will quite literally link two exceptional spaces at two of Canada’s most renowned music schools.

Imagine a new space and community where musicians, scientists and engineers join forces to gain a better understanding of the influence that music has on individuals as well as on their physical, psychological and even neurological conditions; experience the acoustics of an 18th century Viennese concert hall created with the touch of a fingertip; or attend an orchestral performance in one concert hall while hearing and seeing musicians performing from a completely different venue across town… All this and more will soon become possible here in Montreal!

The combination of public and private gifts will broaden our musical horizons exponentially thanks to a significant investment in music research in Canada: over $14.5 million in grants from the Canada Foundation for Innovation (CFI), the Government of Quebec and the Fonds de Recherche du Québec (FRQ), plus a substantial additional gift of $2.5 million from private philanthropy.

“We are grateful for this exceptional investment in music research from both the federal and provincial governments and from our generous donors,” says McGill Principal Suzanne Fortier. “This will further the collaboration between these two outstanding music schools and support the training of the next generation of music researchers and artists. For anyone who loves music, this is very exciting news.”

There’s not much technical detail in this one but here it is,

Digital channels coupling McGill University’s Music Multimedia Room (MMR – a large, sound-isolated performance lab) and l’Université de Montréal’s Salle Claude Champagne ([SCC -] a superb concert hall) will transform these two exceptional spaces into the world’s leading research facility for the scientific study of live performance, movement of recorded sound in space, and distributed performance (where musicians in different locations perform together).

“The interaction between scientific/technological research and artistic practice is one of the most fruitful avenues for future developments in both fields. This remarkable investment in music research is a wonderful recognition of the important contributions of the arts to Canadian society,” says Sean Ferguson, Dean of the Schulich School of Music.

The other CIRMMT June 2, 2016 (Collaborative hub) press release (received via email) elaborates somewhat on the technology,

The MMR (McGill University’s Music Multimedia Room) will undergo complete renovations which include the addition of high quality variable acoustical treatment and a state-of-the-art rigging system. An active enhancement and sound spatialization system, together with stereoscopic projectors and displays, will provide virtual acoustic and immersive environments. At the SCC (l’Université de Montréal’s Salle Claude Champagne), the creation of a laboratory, a control room and a customizable rigging system will enable the installation and utilization of new research equipment in this acoustically rich environment. These improvements will drastically augment the research possibilities in the hall, making it a unique hub in Canada for researchers to validate their experiments in a real concert hall.

“This infrastructure will provide exceptional spaces for performance analysis of multiple performers and audience members simultaneously, with equipment such as markerless motion-capture equipment and eye trackers. It will also connect both spaces for experimentation on distributed performances and will make possible new kinds of multimedia artworks.”

The research and benefits

The research program includes looking at audio recording technologies, audio and video in immersive environments, and ultra-videoconferencing, leading to the development of new technologies for audio recording, film, television, distance education, and multi-media artworks; as well as a focus on cognition and perception in musical performance by large ensembles and on the rhythmical synchronization and sound blending of performers.

Social benefits include distance learning, videoconferencing, and improvements to the quality of both recorded music and live performance. Health benefits include improved hearing aids, noise reduction in airplanes and public spaces, and science-based music pedagogies and therapy. Economic benefits include innovations in sound recording, film and video games, and the training of highly qualified personnel across disciplines.

Amongst other activities they will be exploring data sonification as it relates to performance.
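For readers unfamiliar with the term, data sonification simply means turning streams of data into sound. Here is a minimal sketch of the idea in Python, mapping a series of data values onto pitches; it is purely illustrative and is not CIRMMT’s software.

# Minimal, illustrative data sonification sketch (not CIRMMT's software):
# map a stream of data values onto MIDI-style note numbers.

def sonify(values, low_note=48, high_note=84):
    """Linearly map each data value onto a pitch in [low_note, high_note]."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for constant data
    notes = []
    for v in values:
        note = low_note + round((v - lo) / span * (high_note - low_note))
        notes.append(note)
    return notes

# Example: a performer's (hypothetical) heart rate over time becomes a melody.
heart_rate = [62, 64, 70, 85, 90, 78, 66]
print(sonify(heart_rate))  # [48, 51, 58, 78, 84, 69, 53]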

Hopefully, I’ll have more after the livestreamed press conference being held this afternoon, June 2, 2016 (2:30 pm EST), at the CIRMMT.

*’opens’ changed to ‘announced’ on June 2, 2016 at 1335 hours PST.

ETA June 8, 2016: I did attend the press conference via livestream. There was some lovely violin played and the piece proved to be a demonstration of the work they’re hoping to expand on now that there will be a CIRMMT (pronounced kermit). There was a lot of excitement and I think that’s largely due to the number of years it’s taken to get to this point. One of the speakers reminisced about being a music student at McGill in the 1970s when they first started talking about getting a new music building.

They did get their building but were unable to complete it until these 2016 funds were awarded. Honestly, all the speakers seemed a bit giddy with delight. I wish them all congratulations!

Cyborgian dance at McGill University (Canada)

As noted in the Council of Canadian Academies report (State of Science and Technology in Canada, 2012), which was mentioned in my Dec. 28, 2012 posting, the field of visual and performing arts is an area of strength in Canada, and that is due to one province, Québec. Mark Wilson’s Aug. 13, 2013 article for Fast Company and Paul Ridden’s Aug. 7, 2013 article for gizmag.com about McGill University’s Instrumented Bodies: Digital Prostheses for Music and Dance Performance seem to confirm Québec’s leadership.

From Wilson’s Aug. 13, 2013 article (Note: A link has been removed),

One is a glowing exoskeleton spine, while another looks like a pair of cyborg butterfly wings. But these aren’t just costumes; they’re wearable, functional art.

In fact, the team of researchers from the IDMIL (Input Devices and Music Interaction Laboratory [at McGill University]) who are responsible for the designs go so far as to call their creations “prosthetic instruments.”

Ridden’s Aug. 7, 2013 article offers more about the project’s history and technology,

For the last three years, a small research team at McGill University has been working with a choreographer, a composer, dancers and musicians on a project named Instrumented Bodies. Three groups of sensor-packed, internally-lit digital music controllers that attach to a dancer’s costume have been developed, each capable of wirelessly triggering synthesized music as the performer moves around the stage. Sounds are produced by tapping or stroking transparent Ribs or Visors, or by twisting, turning or moving Spines. Though work on the project continues, the instruments have already been used in a performance piece called Les Gestes which toured Canada and Europe during March and April.

Both articles are interesting but Wilson’s is the fast read and Ridden’s gives you information you can’t find by looking up the Instrumented Bodies: Digital Prostheses for Music and Dance Performance project webpage,

These instruments are the culmination of a three-year long project in which the designers worked closely with dancers, musicians, composers and a choreographer. The goal of the project was to develop instruments that are visually striking, utilize advanced sensing technologies, and are rugged enough for extensive use in performance.

The complex, transparent shapes are lit from within, and include articulated spines, curved visors and ribcages. Unlike most computer music control interfaces, they function both as hand-held, manipulable controllers and as wearable, movement-tracking extensions to the body. Further, since the performers can smoothly attach and detach the objects, these new instruments deliberately blur the line between the performers’ bodies and the instrument being played.

The prosthetic instruments were designed and developed by Ph.D. researchers Joseph Malloch and Ian Hattwick [and Marlon Schumacher] under the supervision of IDMIL director Marcelo Wanderley. Starting with sketches and rough foam prototypes for exploring shape and movement, they progressed through many iterations of the design before arriving at the current versions. The researchers made heavy use of digital fabrication technologies such as laser-cutters and 3D printers, which they accessed through the McGill University School of Architecture and the Centre for Interdisciplinary Research in Music Media and Technology, also hosted by McGill.

Each of the nearly thirty working instruments produced for the project has embedded sensors, power supplies and wireless data transceivers, allowing a performer to control the parameters of music synthesis and processing in real time through touch, movement, and orientation. The signals produced by the instruments are routed through an open-source peer-to-peer software system the IDMIL team has developed for designing the connections between sensor signals and sound synthesis parameters.
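That last bit, routing sensor signals to sound synthesis parameters, is worth unpacking. Below is a minimal, hypothetical Python sketch of what such a mapping layer does (scaling a named sensor signal into the range a synthesis parameter expects). It is not the IDMIL team’s open-source software, and the sensor and parameter names are invented for illustration.

# Hypothetical sketch of a sensor-to-synthesis mapping layer
# (not the IDMIL software; signal and parameter names are invented).

class Mapping:
    def __init__(self, source, dest, in_range, out_range):
        self.source, self.dest = source, dest
        self.in_min, self.in_max = in_range
        self.out_min, self.out_max = out_range

    def apply(self, value):
        """Clamp the incoming sensor value, then scale it linearly."""
        value = max(self.in_min, min(self.in_max, value))
        t = (value - self.in_min) / (self.in_max - self.in_min)
        return self.out_min + t * (self.out_max - self.out_min)

# Example: a spine's bend sensor (0-1023) drives a filter cutoff (100-8000 Hz),
# while its orientation in degrees drives stereo pan (-1 to 1).
mappings = [
    Mapping("spine/bend", "synth/cutoff_hz", (0, 1023), (100.0, 8000.0)),
    Mapping("spine/yaw", "synth/pan", (-180, 180), (-1.0, 1.0)),
]

sensor_frame = {"spine/bend": 512, "spine/yaw": 45}
for m in mappings:
    print(m.dest, round(m.apply(sensor_frame[m.source]), 2))
# synth/cutoff_hz 4053.86
# synth/pan 0.25

The appeal of keeping the mapping in a separate layer, as the project description suggests, is that performers and composers can rework which gesture controls which sound without touching the instruments or the synthesizer itself.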

For those who prefer to listen and watch, the researchers have created a video documentary,

I usually don’t include videos that run past 5 mins. but I’ve made an exception for this almost 15-minute documentary.

I was trying to find mention of a dancer and/or choreographer associated with this project and, in Ridden’s article, found two early-stage participants: choreographer Isabelle Van Grimde and composer Sean Ferguson.