Tag Archives: Metacreation Lab for Creative AI

SFU’s Philippe Pasquier speaks at “The rise of Creative AI and its ethics” online event on Tuesday, January 11, 2022 at 6 am PST

Simon Fraser University’s (SFU) Metacreation Lab for Creative AI (artificial intelligence) in Vancouver, Canada, has just sent me (via email) a January 2022 newsletter, which you can find here. There are two items I found of special interest.

Max Planck Centre for Humans and Machines Seminars

From the January 2022 newsletter,

Max Planck Institute Seminar – The rise of Creative AI & its ethics
January 11, 2022 at 15:00 pm [sic] CET | 6:00 am PST

Next Monday [sic], Philippe Pasquier, director of the Metacreation Lab, will
be providing a seminar titled “The rise of Creative AI & its ethics”
[Tuesday, January 11, 2022] at the Max Planck Institute’s Centre for Humans and
Machine [sic].

The Centre for Humans and Machines invites interested attendees to
our public seminars, which feature scientists from our institute and
experts from all over the world. Their seminars usually take 1 hour and
provide an opportunity to meet the speaker afterwards.

The seminar is openly accessible to the public via Webex Access, and
will be a great opportunity to connect with colleagues and friends of
the Lab on European and East Coast time. For more information and the
link, head to the Centre for Humans and Machines’ Seminars page linked
below.

Max Planck Institute – Upcoming Events

The Centre’s seminar description offers an abstract for the talk and a profile of Philippe Pasquier,

Creative AI is the subfield of artificial intelligence concerned with the partial or complete automation of creative tasks. In turn, creative tasks are those for which the notion of optimality is ill-defined. Unlike car driving, chess moves, jeopardy answers or literal translations, creative tasks are more subjective in nature. Creative AI approaches have been proposed and evaluated in virtually every creative domain: design, visual art, music, poetry, cooking, … These algorithms most often perform at human-competitive or superhuman levels for their precise task. Two main uses of these algorithms have emerged that have implications for workflows reminiscent of the industrial revolution:

– Augmentation (a.k.a, computer-assisted creativity or co-creativity): a human operator interacts with the algorithm, often in the context of already existing creative software.

– Automation (computational creativity): the creative task is performed entirely by the algorithms without human intervention in the generation process.

Both usages will have deep implications for education and work in creative fields. Away from the fear of strong – sentient – AI, taking over the world: What are the implications of these ongoing developments for students, educators and professionals? How will Creative AI transform the way we create, as well as what we create?

Philippe Pasquier is a professor at Simon Fraser University’s School for Interactive Arts and Technology, where he has directed the Metacreation Lab for Creative AI since 2008. Philippe leads a research-creation program centred around generative systems for creative tasks. As such, he is a scientist specialized in artificial intelligence, a multidisciplinary media artist, an educator, and a community builder. His contributions span theoretical research on generative systems, computational creativity, multi-agent systems, machine learning, affective computing, and evaluation methodologies. This work is applied in the creative software industry as well as through artistic practice in computer music, interactive and generative art.

Interpreting soundscapes

Folks at the Metacreation Lab have made available an interactive search engine for sounds, from the January 2022 newsletter,

Audio Metaphor is an interactive search engine that transforms users’ queries into soundscapes interpreting them. Using state-of-the-art algorithms for sound retrieval, segmentation, and background and foreground classification, AuMe offers a way to explore the vast open-source library of sounds available on the freesound.org online community through natural language and its semantic, symbolic, and metaphorical expressions.

We’re excited to see Audio Metaphor included among many other innovative projects on Freesound Labs, a directory of projects, hacks, apps, research and other initiatives that use content from Freesound or use the Freesound API. Take a minute to check out the variety of projects applying creative coding, machine learning, and many other techniques towards the exploration of sound and music creation, generative music, and soundscape composition in diverse forms and interfaces.

Explore AuMe and other FreeSound Labs projects    

The Audio Metaphor (AuMe) webpage on the Metacreation Lab website has a few more details about the search engine,

Audio Metaphor (AuMe) is a research project aimed at designing new methodologies and tools for sound design and composition practices in film, games, and sound art. Through this project, we have identified the processes involved in working with audio recordings in creative environments, addressing these in our research by implementing computational systems that can assist human operations.

We have successfully developed Audio Metaphor for the retrieval of audio file recommendations from natural language texts, and even used phrases generated automatically from Twitter to sonify the current state of Web 2.0. Another significant achievement of the project has been in the segmentation and classification of environmental audio with composition-specific categories, which were then applied in a generative system approach. This allows users to generate sound design simply by entering textual prompts.

As we direct Audio Metaphor further toward perception and cognition, we will continue to contribute to the music information retrieval field through environmental audio classification and segmentation. The project will continue to be instrumental in the design and implementation of new tools for sound designers and artists.

See more information on the website audiometaphor.ca.
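To make the description above a little more concrete, here is a toy sketch of the kind of pipeline AuMe describes: take a natural-language query, pick out candidate sound labels, split them into background versus foreground layers, and emit a soundscape “plan.” This is purely my own illustration, not the Lab’s actual code; the tiny tag vocabulary and all names below are hypothetical.

```python
# Hypothetical mapping from query words to sound tags, with a flag
# marking whether a tag typically reads as background or foreground.
TAG_VOCABULARY = {
    "rain":  ("rain-loop", "background"),
    "storm": ("thunder",   "foreground"),
    "city":  ("traffic",   "background"),
    "birds": ("birdsong",  "foreground"),
}

def plan_soundscape(query: str) -> dict:
    """Turn a free-text query into a background/foreground layer plan."""
    background, foreground = [], []
    for word in query.lower().split():
        if word in TAG_VOCABULARY:
            tag, role = TAG_VOCABULARY[word]
            (background if role == "background" else foreground).append(tag)
    return {"background": background, "foreground": foreground}

print(plan_soundscape("rain over the city with birds"))
# → {'background': ['rain-loop', 'traffic'], 'foreground': ['birdsong']}
```

In a real system the lookup table would be replaced by retrieval and classification models querying the Freesound library, but the background/foreground split is the structural idea the project description emphasizes.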

As for Freesound Labs, you can find them here.

Art, sound, AI, & the Metacreation Lab’s Spring 2021 newsletter

The Metacreation Lab’s Spring 2021 newsletter (received via email) features a number of events either currently taking place or about to take place.

2021 AI Song Contest

2021 marks the second year for this international event, the AI Song Contest. The folks at Simon Fraser University’s (SFU) Metacreation Lab have an entry in the 2021 event, A song about the weekend (and you can do whatever you want). Should you click on the song entry, you will find an audio file, a survey/vote consisting of four questions and, if you keep scrolling down, more information about the creative team, the song and more,

Driven by collaborations involving scientists, experts in artificial intelligence, cognitive sciences, designers, and artists, the Metacreation Lab for Creative AI is at the forefront of the development of generative systems, whether these are embedded in interactive experiences or automating workflows integrated into cutting-edge creative software.

Team:

Cale Plut (Composer and musician) is a PhD Student in the Metacreation lab, researching AI music applications in video games.

Philippe Pasquier (Producer and supervisor) is an Associate Professor, and leads the Metacreation Lab. 

Jeff Ens (AI programmer) is a PhD Candidate in the Metacreation lab, researching AI models for music generation.

Renaud Tchemeube (Producer and interaction designer) is a PhD Student in the Metacreation Lab, researching interaction software design for creativity.

Tara Jadidi (Research Assistant) is an undergraduate student at FUM, Iran, working with the Metacreation lab.

Dimiter Zlatkov (Research Assistant) is an undergraduate student at UBC, working with the Metacreation lab.

ABOUT THE SONG

A song about the weekend (and you can do whatever you want) explores the relationships between AI, humans, labour, and creation in a lighthearted and fun song. It is co-created with the Multi-track Music Machine (MMM).

Through the history of automation and industrialization, the relationship between the labour magnification power of automation and the recipients of the benefits of that magnification have been in contention. While increasing levels of automation are often accompanied by promises of future leisure increases, this rarely materializes for the workers whose labour is multiplied. By primarily using automated methods to create a “fun” song about leisure, we highlight both the promise of AI-human cooperation as well as the disparities in its real-world deployment. 

As for the competition itself, here’s more from the FAQs (frequently asked questions),

What is the AI Song Contest?

AI Song Contest is an international creative AI contest. Teams from all over the world try to create a 4-minute pop song with the help of artificial intelligence.

When and where does it take place?

Between June 1, 2021 and July 1, 2021 voting is open for the international public. On July 6 there will be multiple online panel sessions, and the winner of the AI Song Contest 2021 will be announced in an online award ceremony. All sessions on July 6 are organised in collaboration with Wallifornia MusicTech.

How is the winner determined?

Each participating team will be awarded two sets of points: one from a public vote by the contest’s international audience, the other from an expert jury.

Anyone can evaluate as many songs as they like: from one, up to all thirty-eight. Every song can be evaluated only once. Even though it won’t count in the grand total, lyrics can be evaluated too; we do like to determine which team wrote the best lyrics according to the audience.

Can I vote multiple times for the same team?

No, votes are controlled by IP address. So only one of your votes will count.

Is this the first time the contest is organised?

This is the second time the AI Song Contest is organised. The contest was first initiated in 2020 by Dutch public broadcaster VPRO together with NPO Innovation and NPO 3FM. Teams from Europe and Australia tried to create a Eurovision kind of song with the help of AI. Team Uncanny Valley from Australia won the first edition with their song Beautiful the World. The 2021 edition is organised independently.

What is the definition of artificial intelligence in this contest?

Artificial intelligence is a very broad concept. For this contest it will mean that teams can use techniques such as (but not limited to) machine learning, including deep learning, natural language processing, algorithmic composition, or combining rule-based approaches with neural networks for the creation of their songs. Teams can create their own AI tools, or use existing models and algorithms.

What are possible challenges?

Read here about the challenges teams from last year’s contest faced.

As an AI researcher, can I collaborate with musicians?

Yes – this is strongly encouraged!

For the 2020 edition, all songs had to be Eurovision-style. Is that also the intention for 2021 entries?

Last year, the first year the contest was organized, it was indeed all about Eurovision. For this year’s competition, we are trying to expand geographically, culturally, and musically. Teams from all over the world can compete, and songs in all genres can be submitted.

If you’re not familiar with Eurovision-style, you can find a compilation video with brief excerpts from the 26 finalists for Eurovision 2021 here (Bill Young’s May 23, 2021 posting on tellyspotting.kera.org; the video runs under 10 mins.). There’s also the “Eurovision Song Contest: The Story of Fire Saga” 2020 movie starring Rachel McAdams, Will Ferrell, and Dan Stevens. It’s intended as a gentle parody but the style is all there.

ART MACHINES 2: International Symposium on Machine Learning and Art 2021

The symposium, Art Machines 2, started yesterday (June 10, 2021 and runs to June 14, 2021) in Hong Kong and SFU’s Metacreation Lab will be represented (from the Spring 2021 newsletter received via email),

On Sunday, June 13 [2021] at 21:45 Hong Kong Standard Time (UTC +8) as part of the Sound Art Paper Session chaired by Ryo Ikeshiro, the Metacreation Lab’s Mahsoo Salimi and Philippe Pasquier will present their paper, Exploiting Swarm Aesthetics in Sound Art. We’ve included a more detailed preview of the paper in this newsletter below.

Concurrent with ART MACHINES 2 is the launch of two exhibitions – Constructing Contexts and System Dreams. Constructing Contexts, curated by Tobias Klein and Rodrigo Guzman-Serrano, will bring together 27 works with unique approaches to the question of contexts as applied by generative adversarial networks. System Dreams highlights work from the latest MFA talent from the School of Creative Media. While the exhibitions take place in Hong Kong, the participating artists and artwork are well documented online.

Liminal Tones: Swarm Aesthetics in Sound Art

Applications of swarm aesthetics in music composition are not new and have already resulted in volumes of complex soundscapes and musical compositions. Using an experimental approach, Mahsoo Salimi and Philippe Pasquier create a series of sound textures known as Liminal Tones (B/ Rain Dream) based on swarming behaviours.

Findings of the Liminal Tones project will be presented in papers for the Art Machines 2: International Symposium on Machine Learning (June 10-14 [2021]) and the International Conference on Swarm Intelligence (July 17-21 [2021]).

Talk about Creative AI at the University of British Columbia

This is the last item I’m excerpting from the newsletter. (Should you be curious about what else is listed, you can go to the Metacreation Lab’s contact page and sign up for the newsletter there.) On June 22, 2021 at 2:00 PM PDT, there will be this event,

Creative AI: on the partial or complete automation of creative tasks @ CAIDA

Philippe Pasquier will be giving a talk on creative applications of AI at CAIDA: UBC ICICS Centre for Artificial Intelligence Decision-making and Action. Overviewing the state of the art of computer-assisted creativity and embedded systems and their various applications, the talk will survey the design, deployment, and evaluation of generative systems.

Free registration for the talk is available at the link below.

Register for Creative AI @ CAIDA

Remember, if you want to see the rest of the newsletter, you can sign up at the Metacreation Lab’s contact page.

Large Interactive Virtual Environment Laboratory (LIVELab) located in McMaster University’s Institute for Music & the Mind (MIMM) and the MetaCreation Lab at Simon Fraser University

Both of these items have a music focus, but they represent two entirely different science-based approaches to that art form: one is solely about music, while in the other, music is one of several art-making processes being investigated.

Large Interactive Virtual Environment Laboratory (LIVELab) at McMaster University

Laurel Trainor and Dan J. Bosnyak both of McMaster University (Ontario, Canada) have written an October 27, 2019 essay about the LiveLab and their work for The Conversation website (Note: Links have been removed),

The Large Interactive Virtual Environment Laboratory (LIVELab) at McMaster University is a research concert hall. It functions as both a high-tech laboratory and theatre, opening up tremendous opportunities for research and investigation.

As the only facility of its kind in the world, the LIVELab is a 106-seat concert hall equipped with dozens of microphones, speakers and sensors to measure brain responses, physiological responses such as heart rate, breathing rates, perspiration and movements in multiple musicians and audience members at the same time.

Engineers, psychologists and clinician-researchers from many disciplines work alongside musicians, media artists and industry to study performance, perception, neural processing and human interaction.

In the LIVELab, acoustics are digitally controlled so the experience can change instantly from extremely silent with almost no reverberation to a noisy restaurant to a subway platform or to the acoustics of Carnegie Hall.

Real-time physiological data such as heart rate can be synchronized with data from other systems such as motion capture, and monitored and recorded from both performers and audience members. The result is that the reams of data that can now be collected in a few hours in the LIVELab used to take weeks or months to collect in a traditional lab. And having measurements of multiple people simultaneously is pushing forward our understanding of real-time human interactions.
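The synchronization described above amounts to putting two streams with different sampling rates onto a common timeline. The following is a hypothetical illustration of that idea (not LIVELab’s actual software): heart-rate samples, recorded a few times per second, are linearly interpolated onto the much denser timestamp grid of a motion-capture system so the two streams can be analyzed sample-for-sample.

```python
def align_to_grid(times, values, grid):
    """Linearly interpolate (times, values) onto the timestamps in grid."""
    out = []
    for t in grid:
        if t <= times[0]:          # clamp before the recording starts
            out.append(values[0])
        elif t >= times[-1]:       # clamp after the recording ends
            out.append(values[-1])
        else:
            # find the surrounding pair of samples and interpolate
            for i in range(len(times) - 1):
                if times[i] <= t <= times[i + 1]:
                    frac = (t - times[i]) / (times[i + 1] - times[i])
                    out.append(values[i] + frac * (values[i + 1] - values[i]))
                    break
    return out

# Heart rate sampled once per second, motion capture at 4 Hz:
hr_times, hr_bpm = [0.0, 1.0, 2.0], [70.0, 72.0, 71.0]
mocap_grid = [0.0, 0.25, 0.5, 0.75, 1.0, 1.5, 2.0]
print(align_to_grid(hr_times, hr_bpm, mocap_grid))
# → [70.0, 70.5, 71.0, 71.5, 72.0, 71.5, 71.0]
```

Real physiological pipelines add clock-drift correction and filtering, but resampling onto a shared grid is the core step that lets heart rate, motion capture, and audience responses be compared moment by moment.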

Consider the implications of how music might help people with Parkinson’s disease to walk more smoothly or children with dyslexia to read better.

[…] area of ongoing research is the effectiveness of hearing aids. By the age of 60, nearly 49 per cent of people will suffer from some hearing loss. People who wear hearing aids are often frustrated when listening to music because the hearing aids distort the sound and cannot deal with the dynamic range of the music.

The LIVELab is working with the Hamilton Philharmonic Orchestra to solve this problem. During a recent concert, researchers evaluated new ways of delivering sound directly to participants’ hearing aids to enhance sounds.

Researchers hope new technologies can not only increase live musical enjoyment but alleviate the social isolation caused by hearing loss.

Imagine the possibilities for understanding music and sound: How it might help to improve cognitive decline, manage social performance anxiety, help children with developmental disorders, aid in treatment of depression or keep the mind focused. Every time we conceive and design a study, we think of new possibilities.

The essay also includes an embedded 12 min. video about LIVELab and details about studies conducted on musicians and live audiences. Apparently, audiences experience live performance differently than recorded performances and musicians use body sway to create cohesive performances. You can find the McMaster Institute for Music & the Mind here and McMaster’s LIVELab here.

Capturing the motions of a string quartet performance. Laurel Trainor, Author provided [McMaster University]

Metacreation Lab at Simon Fraser University (SFU)

I just recently discovered that there’s a Metacreation Lab at Simon Fraser University (Vancouver, Canada), which on its homepage offers this description: “Metacreation is the idea of endowing machines with creative behavior.” Here’s more from the homepage,

As the contemporary approach to generative art, Metacreation involves using tools and techniques from artificial intelligence, artificial life, and machine learning to develop software that partially or completely automates creative tasks. Through the collaboration between scientists, experts in artificial intelligence, cognitive sciences, designers and artists, the Metacreation Lab for Creative AI is at the forefront of the development of generative systems, be they embedded in interactive experiences or integrated into current creative software. Scientific research in the Metacreation Lab explores how various creative tasks can be automated and enriched. These tasks include music composition [emphasis mine], sound design, video editing, audio/visual effect generation, 3D animation, choreography, and video game design.

Besides scientific research, the team designs interactive and generative artworks that build upon the algorithms and research developed in the Lab. This work often challenges the social and cultural discourse on AI.

Much to my surprise I received the Metacreation Lab’s inaugural email newsletter (received via email on Friday, November 15, 2019),

Greetings,

We decided to start a mailing list for disseminating news, updates, and announcements regarding generative art, creative AI and New Media. In this newsletter: 

  1. ISEA 2020: The International Symposium on Electronic Art. ISEA returns to Montreal; check the CFP below and contribute!
  2. ISEA 2015: A transcription of Sara Diamond’s keynote address “Action Agenda: Vancouver’s Prescient Media Arts” is now available for download. 
  3. Brain Art, the book: we are happy to announce the release of the first comprehensive volume on Brain Art. Edited by Anton Nijholt, and published by Springer.

Here are more details from the newsletter,

ISEA2020 – 26th International Symposium on Electronic Arts

Montreal, September 24, 2019
Montreal Digital Spring (Printemps numérique) is launching a call for participation as part of ISEA2020 / MTL connect to be held from May 19 to 24, 2020 in Montreal, Canada. Founded in 1990, ISEA is one of the world’s most prominent international arts and technology events, bringing together scholarly, artistic, and scientific domains in an interdisciplinary discussion and showcase of creative productions applying new technologies in art, interactivity, and electronic and digital media. For 2020, ISEA Montreal turns towards the theme of sentience.

ISEA2020 will be fully dedicated to examining the resurgence of sentience—feeling-sensing-making sense—in recent art and design, media studies, science and technology studies, philosophy, anthropology, history of science and the natural scientific realm—notably biology, neuroscience and computing. We ask: why sentience? Why and how does sentience matter? Why have artists and scholars become interested in sensing and feeling beyond, with and around our strictly human bodies and selves? Why has this notion been brought to the fore in an array of disciplines in the 21st century?
CALL FOR PARTICIPATION: WHY SENTIENCE?

ISEA2020 invites artists, designers, scholars, researchers, innovators and creators to participate in the various activities deployed from May 19 to 24, 2020. To complete an application, please fill in the forms and follow the instructions.

The final submissions deadline is NOVEMBER 25, 2019. Submit your application in any of the following categories:

– Workshop and tutorial
– Artistic work
– Full / short paper
– Panel
– Poster
– Artist talk
– Institutional presentation

You can apply for several categories. All profiles are welcome. Notifications of acceptance will be sent around January 13, 2020.

Important: please note that the Call for Participation for MTL connect is not yet launched, but you can also apply to participate in the programming of the other Pavilions (4 other themes) when registrations open (coming soon): mtlconnecte.ca/en

TICKETS

Registration is now available to attend ISEA2020 / MTL connect, from May 19 to 24, 2020. Book your Full Pass today and get the early-bird rate!

More from the newsletter,

ISEA 2015 was in Vancouver, Canada, and the proceedings and art catalog are still online. The news is that Sara Diamond released her 2015 keynote address as a paper: Action Agenda: Vancouver’s Prescient Media Arts. It is never too late so we thought we would let you know about this great read. See The 2015 Proceedings Here

The last item from the inaugural newsletter,

The first book that surveys how brain activity can be monitored and manipulated for artistic purposes, with contributions by interactive media artists, brain-computer interface researchers, and neuroscientists. View the Book Here

As per the Leonardo review from Cristina Albu:

“Another seminal contribution of the volume is the presentation of multiple taxonomies of “brain art,” which can help art critics develop better criteria for assessing this genre. Mirjana Prpa and Philippe Pasquier’s meticulous classification shows how diverse such works have become as artists consider a whole range of variables of neurofeedback.” Read the Review

For anyone not familiar with the ‘Leonardo’ cited in the above, it’s Leonardo; the International Society for the Arts, Sciences and Technology.

Should this kind of information excite and motivate you to start metacreating, you can get in touch with the lab,

Our mailing address is:
Metacreation Lab for Creative AI
School of Interactive Arts & Technology
Simon Fraser University
250-13450 102 Ave.
Surrey, BC V3T 0A3
Web: http://metacreation.net/
Email: metacreation_admin (at) sfu (dot) ca