Category Archives: performing arts

Future of Being Human: a call for proposals

The Canadian Institute for Advanced Research (CIFAR) is investigating the ‘Future of Being Human’ and has issued a global call for proposals, but there is one catch: your team has to include at least one person (with or without Canadian citizenship) who is living and working in Canada. (Note: I am available.)

Here’s more about the call (from the CIFAR Global Call for Ideas: The Future of Being Human webpage),

New program proposals should explore the long term intersection of humans, science and technology, social and cultural systems, and our environment. Our understanding of the world around us, and new insights into individual and societal behaviour, have the potential to provide enormous benefits to humanity and the planet. 

We invite bold proposals from researchers at universities or research institutions that ask new questions about our complex emerging world. We are confronting challenging problems that require a diverse team incorporating multiple disciplines (potentially spanning the humanities, social sciences, arts, physical sciences, and life sciences [emphasis mine]) to engage in a sustained dialogue to develop new insights, and change the conversation on important questions facing science and humanity.

CIFAR is committed to creating a more diverse, equitable, and inclusive environment. We welcome proposals that include individuals from countries and institutions that are not yet represented in our research community.

Here’s a description, albeit a little repetitive, of what CIFAR is asking researchers to do (from the Program Guide [PDF]),

For CIFAR’s next Global Call for Ideas, we are soliciting proposals related to The Future of Being Human, exploring in the long term the intersection of humans, science and technology, social and cultural systems, and our environment. Our understanding of the natural world around us, and new insights into individual and societal behaviour, have the potential to provide enormous benefits to humanity and the planet. We invite bold proposals that ask new questions about our complex emerging world, where the issues under study are entangled and dynamic. We are confronting challenging problems that necessitate a diverse team incorporating multiple disciplines (potentially spanning the humanities, social sciences, arts, physical sciences, and life sciences) to engage in a sustained dialogue to develop new insights, and change the conversation on important questions facing science and humanity. [p. 2 print; p. 4 PDF]

There is an upcoming information webinar (from the CIFAR Global Call for Ideas: The Future of Being Human webpage),

Monday, June 28, 2021 – 1:00pm – 1:45pm EDT

Webinar Sign-Up

Also from the CIFAR Global Call for Ideas: The Future of Being Human webpage, here are the various deadlines and additional sources of information,

August 17, 2021

Registration deadline

January 26, 2022

LOI [Letter of Intent] deadline

Spring 2022

LOIs invited to Full Proposal

Fall 2022

Full proposals due

March 2023

New program announcement and celebration

Resources

Program Guide [PDF]

Frequently Asked Questions

Good luck!

The Internet of Bodies and Ghislaine Boddington

I stumbled across this event on my Twitter feed (h/t @katepullinger; Note: Kate Pullinger is a novelist, Professor of Creative Writing and Digital Media, and Director of the Centre for Cultural and Creative Industries [CCCI] at Bath Spa University in the UK).

Anyone who visits here with any frequency will have noticed I have a number of articles on technology and the body (you can find them in the ‘human enhancement’ category and/or search for the machine/flesh tag). Boddington’s view is more expansive than the one I’ve taken, and I welcome it. First, here’s the event information and, then, a link to her open access paper from February 2021.

From the CCCI’s Annual Public Lecture with Ghislaine Boddington eventbrite page,

This year’s CCCI Public Lecture will be given by Ghislaine Boddington. Ghislaine is Creative Director of body>data>space and Reader in Digital Immersion at University of Greenwich. Ghislaine has worked at the intersection of the body, the digital, and spatial research for many years. This will be her first in-person appearance since the start of the pandemic, and she will share with us the many insights she has gathered during this extraordinary pivot to online interfaces much of the world has been forced to undertake.

With a background in performing arts and body technologies, Ghislaine is recognised as a pioneer in the exploration of digital intimacy, telepresence and virtual physical blending since the early 90s. As a curator, keynote speaker and radio presenter she has shared her outlook on the future human into the cultural, academic, creative industries and corporate sectors worldwide, examining topical issues with regards to personal data usage, connected bodies and collective embodiment. Her research led practice, examining the evolution of the body as the interface, is presented under the heading ‘The Internet of Bodies’. Recent direction and curation outputs include “me and my shadow” (Royal National Theatre 2012), FutureFest 2015-18 and Collective Reality (Nesta’s FutureFest / SAT Montreal 2016/17). In 2017 Ghislaine was awarded the international IX Immersion Experience Visionary Pioneer Award. She recently co-founded University of Greenwich Strategic Research Group ‘CLEI – Co-creating Liveness in Embodied Immersion’ and is an Associate Editor for AI & Society (Springer). Ghislaine is a long term advocate for diversity and inclusion, working as a Trustee for Stemette Futures and Spokesperson for Deutsche Bank ‘We in Social Tech’ initiative. She is a team member and presenter with BBC World Service flagship radio show/podcast Digital Planet.

Date and time

Thu, 24 June 2021
08:00 – 09:00 [am] PDT

@GBoddington

@bodydataspace

@ConnectedBodies

Boddington’s paper is what ignited my interest; here’s a link to and a citation for it,

The Internet of Bodies—alive, connected and collective: the virtual physical future of our bodies and our senses by Ghislaine Boddington. AI Soc. 2021 Feb 8: 1–17. DOI: 10.1007/s00146-020-01137-1 PMCID: PMC7868903 PMID: 33584018

Some excerpts from this open access paper,

The Weave—virtual physical presence design—blending processes for the future

Coming from a performing arts background, dance led, in 1989, I became obsessed with the idea that there must be a way for us to be able to create and collaborate in our groups, across time and space, whenever we were not able to be together physically. The focus of my work, as a director, curator and presenter across the last 30 years, has been on our physical bodies and our data selves and how they have, through the extended use of our bodies into digitally created environments, started to merge and converge, shifting our relationship and understanding of our identity and our selfhood.

One of the key methodologies that I have been using since the mid-1990s is inter-authored group creation, a process we called The Weave (Boddington 2013a, b). It uses the simple and universal metaphor of braiding, plaiting or weaving three strands of action and intent, these three strands being:

1. The live body—whether that of the performer, the participant, or the public;

2. The technologies of today—our tools of virtually physical reflection;

3. The content—the theme in exploration.

As with a braid or a plait, the three strands must be weaved simultaneously. What is key to this weave is that in any co-creation between the body and technology, the technology cannot work without the body; hence, there will always be virtual/physical blending. [emphasis mine]

Cyborgs

Cyborg culture is also moving forward at a pace with most countries having four or five cyborgs who have reached out into media status. Manel Munoz is the weather man as such, fascinated and affected by cyclones and anticyclones, his back of the head implant sent vibrations to different sides of his head linked to weather changes around him.

Neil Harbisson from Northern Ireland calls himself a trans-species rather than a cyborg, because his implant is permanently fused into the crown of his head. He is the first trans-species/cyborg to have his passport photo accepted as he exists with his fixed antenna. Neil has, from birth, an eye condition called greyscale, which means he only sees the world in grey and white. He uses his antennae camera to detect colour, and it sends a vibration with a different frequency for each colour viewed. He is learning what colours are within his viewpoint at any given time through the vibrations in his head, a synaesthetic method of transference of one sense for another. Moon Ribas, a Spanish choreographer and a dancer, had two implants placed into the top of her feet, set to sense seismic activity as it occurs worldwide. When a small earthquake occurs somewhere, she received small vibrations; a bigger eruption gives her body a more intense vibration. She dances as she receives and reacts to these transferred data. She feels a need to be closer to our earth, a part of nature (Harbisson et al. 2018).

Medical, non medical and sub-dermal implants

Medical implants, embedded into the body or subdermally (nearer the surface), have rapidly advanced in the last 30 years with extensive use of cardiac pacemakers, hip implants, implantable drug pumps and cochlear implants helping partial deaf people to hear.

Deep body and subdermal implants can be personalised to your own needs. They can be set to transmit chosen aspects of your body data outwards, but they also can receive and control data in return. There are about 200 medical implants in use today. Some are complex, like deep brain stimulation for motor neurone disease, and others we are more familiar with, for example, pacemakers. Most medical implants are not digitally linked to the outside world at present, but this is in rapid evolution.

Kevin Warwick, a pioneer in this area, has interconnected himself and his partner with implants for joint use of their personal and home computer systems through their BrainGate (Warwick 2008) implant, an interface between the nervous system and the technology. They are connected bodies. He works onwards with his experiments to feel the shape of distant objects and heat through fingertip implants.

‘Smart’ implants into the brain for deep brain stimulation are in use and in rapid advancement. The ethics of these developments is under constant debate in 2020 and will be onwards, as is proved by the mass coverage of the Neuralink, Elon Musk’s innovation which connects to the brain via wires, with the initial aim to cure human diseases such as dementia, depression and insomnia and onwards plans for potential treatment of paraplegia (Musk 2016).

Given how many times I’ve featured art/sci (also known as art/science and/or sciart), cyborgs, and medical implants here, my excitement was a given.

For anyone who wants to pursue Boddington’s work further, her eponymous website is here, the body>data>space website is here, and her University of Greenwich profile page is here.

For anyone interested in the Centre for Cultural and Creative Industries (CCCI), their site is here.

Finally, here’s one of my earliest pieces about cyborgs titled ‘My mother is a cyborg‘ from April 20, 2012 and my September 17, 2020 posting titled, ‘Turning brain-controlled wireless electronic prostheses into reality plus some ethical points‘. If you scroll down to the ‘Brain-computer interfaces, symbiosis, and ethical issues’ subhead, you’ll find some article excerpts about a fascinating qualitative study on implants and ethics.

The Canada Council for the Arts, a digital strategy research report on blockchains and culture, and Vancouver (Canada)

Is the May 17, 2021 “Blockchains & Cultural Padlocks (BACP) Digital Strategy Research Report” discussing a hoped-for future transformative experience? Given the report’s subtitle, “Towards a Digitally Cooperative Culture: Recommoning Land, Data and Objects,” and the various essays included in the 200-page document, I say the answer is ‘yes’.

The report was launched by 221A, a Vancouver (Canada)-based arts and culture organization, and funded by the Canada Council for the Arts through their Digital Strategy Fund. Here’s more from the BACP report in the voice of its research leader, Jesse McKee,

… The blockchain is the openly readable and unalterable ledger technology, which is most broadly known for supporting such applications as bitcoin and other cryptocurrencies. This report documents the first research phase in a three-phased approach to establishing our digital strategy [emphasis mine], as we [emphasis mine] learn from the blockchain development communities. This initiative’s approach is an institutional one, not one that is interpreting the technology for individuals, artists and designers alone. The central concept of the blockchain is that exchanges of value need not rely on centralized authentication from institutions such as banks, credit cards or the state, and that this exchange of value is better programmed and tracked with metadata to support the virtues, goals and values of a particular network. This concept relies on a shared, decentralized and trustless ledger. “Trustless” in the blockchain community is an evolution of the term trust, shifting its signification as a contract usually held between individuals, managed and upheld by a centralized social institution, and redistributing it amongst the actors in a blockchain network who uphold the platform’s technical operational codes and can access ledgers of exchange. All parties involved in the system are then able to reach a consensus on what the canonical truth is regarding the holding and exchange of value within the system.

… [from page 6 of the report]
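If you would like to see the ‘openly readable and unalterable ledger’ idea in something more concrete than prose, here is a minimal sketch of a hash-chained ledger (my own illustration in Python; the block fields and the tiny verification routine are teaching assumptions, not any real blockchain’s format or consensus mechanism),

```python
import hashlib
import json
import time

def hash_block(block: dict) -> str:
    """Hash a block's contents; tampering with anything inside changes this value."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    """Append a block that commits to the hash of the previous block."""
    prev_hash = hash_block(chain[-1]) if chain else "0" * 64
    chain.append({
        "timestamp": time.time(),
        "transactions": transactions,  # the 'exchanges of value' plus their metadata
        "prev_hash": prev_hash,        # the link that makes the ledger a chain
    })

def verify(chain: list) -> bool:
    """Anyone holding a copy of the ledger can re-check every link themselves."""
    return all(chain[i]["prev_hash"] == hash_block(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
add_block(ledger, [{"from": "alice", "to": "bob", "value": 5}])
add_block(ledger, [{"from": "bob", "to": "carol", "value": 2}])
print(verify(ledger))  # True; altering any earlier block makes this False
```

The ‘trustless’ quality described above corresponds to that last line: anyone holding a copy of the ledger can run the verification themselves rather than asking a bank, credit card company or the state to vouch for it.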

McKee manages to keep the report from floating away in a sea of utopian bliss with some cautionary notes. Still, as a writer, I’m surprised he didn’t notice that a ‘blockchain’, which (in English) is supposed to unlock ‘padlocks’, poses a linguistic conundrum, if nothing else.

This looks like an interesting report, but it helps to be familiar with some ‘critical theory’ jargon. That said, the bulk of the report is relatively accessible reading, although some of the essays (at the end) from the artist-researchers are tough going.

One more thought: the report does present many exciting and transformative possibilities, and I would dearly love to see much of this come to pass. I am more hesitant than McKee and his colleagues, and that hesitation is beautifully described in an essay (The Vampire Problem: Illustrating the Paradox of Transformative Experience) published September 3, 2017 by Maria Popova on Brain Pickings,

To be human is to suffer from a peculiar congenital blindness: On the precipice of any great change, we can see with terrifying clarity the familiar firm footing we stand to lose, but we fill the abyss of the unfamiliar before us with dread at the potential loss rather than jubilation over the potential gain of gladnesses and gratifications we fail to envision because we haven’t yet experienced them. …

Arts and blockchain events in Vancouver

The 221A launch event for the report kicked off a series of related events; here’s more from a 221A May 17, 2021 news release (Note: the first and second events have already taken place),

Events Series

Please join us for a live stream events series bringing together key contributors of the Blockchains & Cultural Padlocks Research Report alongside a host of leading figures across academic, urbanism, media and blockchain development communities.

Blockchains & Cultural Padlocks Digital Strategy Launch

May 25, 10 am PDT / 1 pm EDT / 7 pm CEST

With Jesse McKee, BACP Lead Investigator and 221A Head of Strategy; Rosemary Heather, BACP Editorial Director and Principal Researcher; moderated by Svitlana Matviyenko, Assistant Professor and Associate Director of Simon Fraser University’s Digital Democracies Institute.

The Valuation of Necessity: A Cosmological View of our Technologies and Culture

June 4, 10 am PDT / 1 pm EDT / 7pm CEST

With BACP researcher, artist and theorist Patricia Reed; critical geographer Maral Sotoudehnia, and Wassim Alsindi of 0x Salon, Berlin, who conducts research on the legal and ecological externalities of blockchain networks.

Recommoning Territory: Diversifying Housing Tenure Through Platform Cooperatives

June 18, 10 am PDT / 1 pm EDT / 7pm CEST

With 221A Fellows Maksym Rokmaniko and Francis Tseng (DOMA [a nonprofit organization developing a distributed housing platform]); Andy Yan (Simon Fraser University); and BACP researcher and critical geographer Maral Sotoudehnia.

Roundtable: Decentralized Autonomous Organizations (DAOs) & Social Tokens

Released June 25, Pre-recorded

Roundtable co-organized with Daniel Keller of newmodels.io, with participation from development teams and researchers from @albiverse, trust.support, Circles UBI, folia.app, SayDAO, and Blockchain@UBC

Blockchains & Cultural Padlocks is supported by the Digital Strategy Fund of the Canada Council for the Arts.

For more, contact us hello@221a.ca

Coming up: Vancouver’s Voxel Bridge

The Vancouver Biennale folks first sent me information about Voxel Bridge in 2018, but this new material is the most substantive description yet, even without an opening date. From a June 6, 2021 article by Kevin Griffin for the Vancouver Sun (Note: Links have been removed),

The underside of the Cambie Bridge is about to be transformed into the unique digital world of Voxel Bridge. Part of the Vancouver Biennale, Voxel Bridge will exist both as a physical analogue art work and an online digital one.

The public art installation is by Jessica Angel. When it’s fully operational, Voxel Bridge will have several non-fungible tokens called NFTs that exist in an interactive 3-D world that uses blockchain technology. The intention is to create a fully immersive installation. Voxel Bridge is being described as the largest digital public art installation of its kind.

“To my knowledge, nothing has been done at this scale outdoors that’s fully interactive,” said Sammi Wei, the Vancouver Biennale‘s operations director. “Once the digital world is built in your phone, you’ll be able to walk around objects. When you touch one, it kind of vibrates.”

Just as a pixel refers to a point in a two-dimensional world, voxel refers to a similar unit in a 3-D world.

Voxel Bridge will be about itself: it will tell the story of what it means to use new decentralized technology called blockchain to create Voxel Bridge.
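Griffin’s pixel/voxel analogy can be made concrete in a few lines of code; this toy sketch (mine, and nothing to do with the installation’s actual software) treats a voxel as one cell in a 3-D grid, just as a pixel is one cell in a 2-D image,

```python
import numpy as np

# A 2-D image: each pixel is addressed by (row, column).
image = np.zeros((64, 64), dtype=np.uint8)
image[10, 20] = 255                # light up one pixel

# A 3-D volume: each voxel is addressed by (x, y, z).
volume = np.zeros((64, 64, 64), dtype=np.uint8)
volume[10, 20, 30] = 255           # light up one voxel

print(image.shape, volume.shape)   # (64, 64) (64, 64, 64)
```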

There are a few more Voxel Bridge details in a June 7, 2021 article by Vincent Plana for the Daily Hive,

Voxel Bridge draws parallels between blockchain technology and the structural integrity of the underpass itself. The installation will be created by using adhesive vinyl and augmented reality technology.

Griffin’s description in his June 6, 2021 article gives you a sense of what it will be like to become immersed in Voxel Bridge,

Starting Monday [June 14, 2021], a crew will begin installing a vinyl overlay directly on the architecture on the underside of the bridge deck, around the columns, and underfoot on the sidewalk from West 2nd to the parking-lot road. Enclosing a space of about 18,000 square feet, the vinyl layer will be visible without any digital enhancement. It will look like an off-kilter circuit board.

“It’ll be like you’re standing in the middle of a circuit board,” [emphasis mine] she said. “At the same time, the visual perception will be slightly off. It’s like an optical illusion. You feel the ground is not quite where it’s supposed to be.”

Griffin’s June 6, 2021 article offers good detail and a glossary.

So, Vancouver is offering more than one opportunity to learn about and/or experience blockchain.

New podcast—Mission: Interplanetary and Event Rap: a one-stop custom rap shop Kickstarter

I received two email notices recently, one from Dr. Andrew Maynard (Arizona State University; ASU) and one from Baba Brinkman (a Canadian rapper of science and other topics, now based in New York).

Mission: Interplanetary

I found a “Mission: Interplanetary— a podcast on the future of humans as a spacefaring species!” webpage (Link: https://collegeofglobalfutures.asu.edu/blog/2021/03/23/mission-interplanetary-redefining-how-we-talk-about-humans-in-space/) on the Arizona State University College of Global Futures website,

Back in January 2019 I got an email from my good friend and colleague Lance Gharavi with the title “Podcast brainstorming.” Two years on, we’ve just launched the Mission: Interplanetary podcast–and it’s amazing!

It’s been a long journey — especially with a global pandemic thrown in along the way — but on March 23 [2021], the Mission: Interplanetary podcast with Slate and ASU finally launched.

After two years of planning, many discussions, a bunch of dry runs, and lots (and by that I mean lots) of Zoom meetings, we are live!

As the team behind the podcast talked about and developed the ideas underpinning the Mission: Interplanetary, we were interested in exploring new ways of thinking and talking about the future of humanity as a space-faring species as part of Arizona State University’s Interplanetary Initiative. We also wanted to go big with these conversations — really big!

And that is exactly what we’ve done in this partnership with Slate.

The guests we’re hosting, the conversations we have lined up, the issues we grapple with, are all literally out of this world. But don’t just take my word for it — listen to the first episode above with the incredible Lindy Elkins-Tanton talking about NASA’s mission to the asteroid 16 Psyche.

And this is just a taste of what’s to come over the next few weeks as we talk to an amazing lineup of guests.

So if you’re looking for a space podcast with a difference, and one that grapples with big questions around our space-based future, please do subscribe on your favorite podcast platform. And join me and the fabulous former NASA astronaut Cady Coleman as we explore the future of humanity in space.

See you there!

Slate’s webpage (Mission: Interplanetary; Link: https://slate.com/podcasts/mission-interplanetary) offers more details about the co-hosts and the programmes along with embedded podcasts,

Cady Coleman is a former NASA astronaut and Air Force colonel. She flew aboard the International Space Station on a six-month expedition as the lead science and robotics officer. A frequent speaker on space and STEM topics, Coleman is also a musician who’s played from space with the Chieftains and Ian Anderson of Jethro Tull.

Andrew Maynard is a scientist, author, and expert in risk innovation. His books include Films From the Future: The Technology and Morality of Sci-Fi Movies and Future Rising.

Latest Episodes

April 27, 2021

Murder in Space

What laws govern us when we leave Earth?

Happy listening. And, I apologize for the awkward links.

Event Rap Kickstarter

Baba Brinkman’s April 27, 2021 email notice has this to say about his latest venture,

Join the Movement, Get Rewards

My new Kickstarter campaign for Event Rap is live as of right now! Anyone who backs the project is helping to launch an exciting new company, actually a new kind of company, the first creator marketplace for rappers. Please take a few minutes to read the campaign description, I put a lot of love into it.

The campaign goal is to raise $26K in 30 days, an average of $2K per artist participating. If we succeed, this platform becomes a new income stream for independent artists during the pandemic and beyond. That’s the vision, and I’m asking for your help to share it and support it.

But instead of why it matters, let’s talk about what you get if you support the campaign!

$10-$50 gets you an advance copy of my new science rap album, Bright Future. I’m extremely proud of this record, which you can preview here, and Bright Future is also a prototype for Event Rap, since all ten of the songs were commissioned by people like you.

$250 – $500 gets you a Custom Rap Video written and produced by one of our artists, and you have twelve artists and infinite topics to choose from. This is an insanely low starting price for an original rap video from a seasoned professional, and it applies only during the Kickstarter. What can the video be about? Anything at all. You choose!

In case it’s helpful, here’s a guide I wrote entitled “How to Brief a Rapper.”

$750 – $1,500 gets you a live rap performance at your virtual event. This is also an amazingly low price, especially since you can choose to have the artist freestyle interactively with your audience, write and perform a custom rap live, or best of all compose a “Rap Up” summary of the event, written during the event, that the artist will perform as the grand finale.

That’s about as fresh and fun as rap gets.

$3,000 – $5,000 the highest tiers bring the highest quality, a brand new custom-written, recorded, mixed and mastered studio track, or studio track plus full rap music video, with an exclusive beat and lyrics that amplify your message in the impactful, entertaining way that rap does best.

I know this higher price range isn’t for everyone, but check out some of the music videos our artists have made, and maybe you can think of a friend to send this to who has a budget and a worthy cause.

Okay, that’s it!

Those prices are in US dollars.

I gather at least one backer has given enough money to request a custom rap on cycling culture in the Netherlands.

The campaign runs for another 26 days. It has amassed over $8,400 CAD towards a goal of $32,008 CAD. (The site doesn’t show me the goal in USD although the pledges/rewards are listed in that currency.)

A 3D spider web, a VR (virtual reality) setup, and sonification (music)

Markus Buehler and his musical spider webs are making news again.

Caption: Cross-sectional images (shown in different colors) of a spider web were combined into this 3D image and translated into music. Credit: Isabelle Su and Markus Buehler

The image (so pretty) you see above comes from a Markus Buehler presentation made at the American Chemical Society (ACS) meeting, ACS Spring 2021, held online April 5-30, 2021. The image was also shown during a press conference, which the ACS has made available for public viewing. More about that later in this posting.

The ACS issued an April 12, 2021 news release (also on EurekAlert), which provides details about Buehler’s latest work on spider webs and music,

Spiders are master builders, expertly weaving strands of silk into intricate 3D webs that serve as the spider’s home and hunting ground. If humans could enter the spider’s world, they could learn about web construction, arachnid behavior and more. Today, scientists report that they have translated the structure of a web into music, which could have applications ranging from better 3D printers to cross-species communication and otherworldly musical compositions.

The researchers will present their results today at the spring meeting of the American Chemical Society (ACS). ACS Spring 2021 is being held online April 5-30 [2021]. Live sessions will be hosted April 5-16, and on-demand and networking content will continue through April 30 [2021]. The meeting features nearly 9,000 presentations on a wide range of science topics.

“The spider lives in an environment of vibrating strings,” says Markus Buehler, Ph.D., the project’s principal investigator, who is presenting the work. “They don’t see very well, so they sense their world through vibrations, which have different frequencies.” Such vibrations occur, for example, when the spider stretches a silk strand during construction, or when the wind or a trapped fly moves the web.

Buehler, who has long been interested in music, wondered if he could extract rhythms and melodies of non-human origin from natural materials, such as spider webs. “Webs could be a new source for musical inspiration that is very different from the usual human experience,” he says. In addition, by experiencing a web through hearing as well as vision, Buehler and colleagues at the Massachusetts Institute of Technology (MIT), together with collaborator Tomás Saraceno at Studio Tomás Saraceno, hoped to gain new insights into the 3D architecture and construction of webs.

With these goals in mind, the researchers scanned a natural spider web with a laser to capture 2D cross-sections and then used computer algorithms to reconstruct the web’s 3D network. The team assigned different frequencies of sound to strands of the web, creating “notes” that they combined in patterns based on the web’s 3D structure to generate melodies. The researchers then created a harp-like instrument and played the spider web music in several live performances around the world.

The team also made a virtual reality setup that allowed people to visually and audibly “enter” the web. “The virtual reality environment is really intriguing because your ears are going to pick up structural features that you might see but not immediately recognize,” Buehler says. “By hearing it and seeing it at the same time, you can really start to understand the environment the spider lives in.”

To gain insights into how spiders build webs, the researchers scanned a web during the construction process, transforming each stage into music with different sounds. “The sounds our harp-like instrument makes change during the process, reflecting the way the spider builds the web,” Buehler says. “So, we can explore the temporal sequence of how the web is being constructed in audible form.” This step-by-step knowledge of how a spider builds a web could help in devising “spider-mimicking” 3D printers that build complex microelectronics. “The spider’s way of ‘printing’ the web is remarkable because no support material is used, as is often needed in current 3D printing methods,” he says.

In other experiments, the researchers explored how the sound of a web changes as it’s exposed to different mechanical forces, such as stretching. “In the virtual reality environment, we can begin to pull the web apart, and when we do that, the tension of the strings and the sound they produce change. At some point, the strands break, and they make a snapping sound,” Buehler says.

The team is also interested in learning how to communicate with spiders in their own language. They recorded web vibrations produced when spiders performed different activities, such as building a web, communicating with other spiders or sending courtship signals. Although the frequencies sounded similar to the human ear, a machine learning algorithm correctly classified the sounds into the different activities. “Now we’re trying to generate synthetic signals to basically speak the language of the spider,” Buehler says. “If we expose them to certain patterns of rhythms or vibrations, can we affect what they do, and can we begin to communicate with them? Those are really exciting ideas.”

You can go here for the April 12, 2021 ‘Making music from spider webs’ ACS press conference; it runs about 30 mins. and you will hear some ‘spider music’ played.
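The news release describes the strand-to-note mapping only in words, so here is a rough sketch of how such a mapping might work (my own guess at the general approach, not the MIT team’s actual code); it treats each strand like a string whose pitch rises as its length shrinks,

```python
import numpy as np

def strand_lengths(nodes, edges):
    """Lengths of the silk strands in a reconstructed 3-D web network."""
    return np.array([np.linalg.norm(nodes[i] - nodes[j]) for i, j in edges])

def strands_to_frequencies(lengths, f_min=110.0, f_max=880.0):
    """Shorter strands get higher pitches, like shorter strings on a harp."""
    pitch = 1.0 / lengths                      # string-like: frequency ~ 1 / length
    norm = (pitch - pitch.min()) / (pitch.max() - pitch.min() + 1e-12)
    return f_min * (f_max / f_min) ** norm     # spread the notes over three octaves

# Toy web: four points in space joined by three strands.
nodes = np.array([[0, 0, 0], [1, 0, 0], [1, 2, 0], [0, 2, 3]], dtype=float)
edges = [(0, 1), (1, 2), (2, 3)]
print(strands_to_frequencies(strand_lengths(nodes, edges)))  # one 'note' per strand
```

Combining those per-strand notes into melodies according to the web’s 3-D structure, as the researchers describe, is where the real compositional work happens.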

Getting back to the image and spider webs in general, we are most familiar with orb webs (in the part of Canada where I’m from, if nowhere else), which look like spirals and are 2D. There are several other types of webs, some of which are 3D, such as tangle webs (also known as cobwebs), funnel webs and more. See this March 18, 2020 article “9 Types of Spider Webs: Identification + Pictures & Spiders” by Zach David on Beyond the Treat for more about spiders and their webs. If you have the time, I recommend reading it.

I’ve been following Buehler’s spider web/music work for close to ten years now; the most recent previous posting is from October 23, 2019, where you’ll find a link to an application that makes music from proteins (spider webs are made up of proteins; scroll down about 30% of the way; it’s in the 2nd to last line of the quoted text about the embedded video).

Here is a video (2 mins. 17 secs.) of a spider web music performance that Buehler placed on YouTube,

Feb 3, 2021

Markus J. Buehler

Spider’s Canvas/Arachnodrone show excerpt at Palais de Tokyo, Paris, in November 2018. Video by MIT CAST. More videos can be found on www.arachnodrone.com. The performance was commissioned by Studio Tomás Saraceno (STS), in the context of Saraceno’s carte blanche exhibition, ON AIR. Spider’s Canvas/Arachnodrone was performed by Isabelle Su and Ian Hattwick on the spider web instrument, Evan Ziporyn on the EWI (Electronic Wind Instrument), and Christine Southworth on the guitar and EBow (Electronic Bow)

You can find more about the spider web music and Buehler’s collaborators on http://www.arachnodrone.com/,

Spider’s Canvas / Arachnodrone is inspired by the multifaceted work of artist Tomas Saraceno, specifically his work using multiple species of spiders to make sculptural webs. Different species make very different types of webs, ranging not just in size but in design and functionality. Tomas’ own web sculptures are in essence collaborations with the spiders themselves, placing them sequentially over time in the same space, so that the complex, 3-dimensional sculptural web that results is in fact built by several spiders, working together.

Meanwhile, back among the humans at MIT, Isabelle Su, a Course 1 doctoral student in civil engineering, has been focusing on analyzing the structure of single-species spider webs, specifically the ‘tent webs’ of the cyrtophora citricola, a tropical spider of particular interest to her, Tomas, and Professor Markus Buehler. Tomas gave the department a cyrtophora spider, the department gave the spider a space (a small terrarium without glass), and she in turn built a beautiful and complex web. Isabelle then scanned it in 3D and made a virtual model. At the suggestion of Evan Ziporyn and Eran Egozy, she then ported the model into Unity, a VR/game making program, where a ‘player’ can move through it in numerous ways. Evan & Christine Southworth then worked with her on ‘sonifying’ the web and turning it into an interactive virtual instrument, effectively turning the web into a 1700-string resonating instrument, based on the proportional length of each individual piece of silk and their proximity to one another. As we move through the web (currently just with a computer trackpad, but eventually in a VR environment), we create a ‘sonic biome’: complex ‘just intonation’ chords that come in and out of earshot according to which of her strings we are closest to. That part was all done in MAX/MSP, a very flexible high level audio programming environment, which was connected with the virtual environment in Unity. Our new colleague Ian Hattwick joined the team focusing on sound design and spatialization, building an interface that allowed him the sonically ‘sculpt’ the sculpture in real time, changing amplitude, resonance, and other factors. During this performance at Palais de Tokyo, Isabelle toured the web – that’s what the viewer sees – while Ian adjusted sounds, so in essence they were together “playing the web.” Isabelle provides a space (the virtual web) and a specific location within it (by driving through), which is what the viewer sees, from multiple angles, on the 3 scrims. The location has certain acoustic potentialities, and Ian occupies them sonically, just as a real human performer does in a real acoustic space. A rough analogy might be something like wandering through a gothic cathedral or a resonant cave, using your voice or an instrument at different volumes and on different pitches to find sonorous resonances, echoes, etc. Meanwhile, Evan and Christine are improvising with the web instrument, building on Ian’s sound, with Evan on EWI (Electronic Wind Instrument) and Christine on electric guitar with EBow.

For the visuals, Southworth wanted to create the illusion that the performers were actually inside the web. We built a structure covered in sharkstooth scrim, with 3 projectors projecting in and through from 3 sides. Southworth created images using her photographs of local Lexington, MA spider webs mixed with slides of the scan of the web at MIT, and then mixed those images with the projection of the game, creating an interactive replica of Saraceno’s multi-species webs.
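The ‘sonic biome’ behaviour described above, chords fading in and out as the player moves through the 1700-string web, can also be sketched in a few lines. This is my own loose approximation of that idea (the real instrument was built in MAX/MSP and Unity); the just-intonation ratios and the distance falloff are illustrative assumptions,

```python
import numpy as np

# Just-intonation ratios over a base pitch (an assumption; the team's tuning may differ).
JI_RATIOS = np.array([1, 9/8, 5/4, 4/3, 3/2, 5/3, 15/8])
BASE_HZ = 220.0

def strand_pitch(length, ref_length=1.0):
    """Pick a just-intonation pitch from a strand's length relative to a reference length."""
    idx = int(round((ref_length / length) * len(JI_RATIOS))) % len(JI_RATIOS)
    return BASE_HZ * JI_RATIOS[idx]

def audible_chord(listener, midpoints, lengths, falloff=2.0):
    """Return (frequency, amplitude) pairs; strands near the listener dominate the chord."""
    dists = np.linalg.norm(midpoints - listener, axis=1)
    amps = 1.0 / (1.0 + dists) ** falloff          # strands fade out with distance
    amps /= amps.max()
    return [(float(strand_pitch(l)), float(a)) for l, a in zip(lengths, amps)]

# Toy data: three strands (midpoints and lengths) and a listener walking past them.
midpoints = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [5.0, 1.0, 0.0]])
lengths = np.array([0.8, 1.3, 2.1])
for x in (0.0, 2.0, 5.0):
    print(x, audible_chord(np.array([x, 0.0, 0.0]), midpoints, lengths))
```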

If you listen to the press conference, you will hear Buehler talk about practical applications for this work in materials science.

DEBBY FRIDAY’s LINK SICK, an audio play+, opens March 29, 2021 (online)

[downloaded from https://debbyfriday.com/link-sick]

This is an artistic work, part of the DEBBY FRIDAY enterprise, and an MFA (Master of Fine Arts) project. Here’s the description from the Simon Fraser University (SFU) Link Sick event page,

LINK SICK

DEBBY FRIDAY’S MFA Project
Launching Monday, March 29, 2021 | debbyfriday.com/link-sick

Set against the backdrop of an ambiguous dystopia and eternal rave, LINK SICK is a tale about the threads that bind us together.  

LINK SICK is DEBBY FRIDAY’S graduate thesis project – an audio-play written, directed and scored by the artist herself. The project is a science-fiction exploration of the connective tissue of human experience as well as an experiment in sound art; blurring the lines between theatre, radio, music, fiction, essay, and internet art. Over 42 minutes, listeners are invited to gather round, close their eyes, and open their ears; submerging straight into a strange future peppered with blink-streams, automated protests, disembodied DJs, dancefloor orgies, and only the trendiest S/S 221 G-E two-piece club skins.

Starring 

DEBBY FRIDAY as Izzi/Narrator
Chino Amobi as Philo
Sam Rolfes as Dj GODLESS
Hanna Sam as ABC Inc. Announcer
Storm Greenwood as Diana Deviance
Alex Zhang Hungtai as Weaver
Allie Stephen as Numee
Soukayna as Katz
AI Voice Generated Protesters via Replica Studios

Presented in partial fulfillment of the requirements of the Degree of Master of Fine Arts in the School for the Contemporary Arts at Simon Fraser University.

No time is listed but I’m assuming FRIDAY is operating on PDT, so you might want to take that into account when checking.

FRIDAY seems to favour full caps for her name, here and everywhere on her eponymous website (from her ABOUT page),

DEBBY FRIDAY is an experimentalist.

Born in Nigeria, raised in Montreal, and now based in Vancouver, DEBBY FRIDAY’s work spans the spectrum of the audio-visual, resisting categorizations of genre and artistic discipline. She is at once sound theorist and musician, performer and poet, filmmaker and PUNK GOD. …

Should you wish to support the artist financially, she offers merchandise.

Getting back to the play, I look forward to the auditory experience. Given how much we are expected to watch and the dominance of images, creating a piece that requires listening is an interesting choice.

COVID-19 infection as a dance of molecules

What a great bit of work, publicity-wise, from either or both of the Aga Khan Museum in Toronto (Canada) and artist/scientist Radha Chaddah. IAM (ee-yam): Dance of the Molecules, a virtual performance installation featuring COVID-19 and molecular dance, has been profiled in the Toronto Star, on the Canadian Broadcasting Corporation (CBC) website, and in the Globe and Mail within the last couple of weeks. From a Canadian perspective, that’s major coverage, and much of it national.

Bruce DeMara’s March 11, 2021 article for the Toronto Star introduces artist/scientist Radha Chaddah, her COVID-19 dance of molecules, and her team (Note: A link has been removed),

Visual artist Radha Chaddah has always had an abiding interest in science. She has a degree in biology and has done graduate studies in stem cell research.

[…] four-act dance performance; the first part “IAM: Dance of the Molecules” premiered as a digital exhibition on the Aga Khan Museum’s website March 5 [2021] and runs for eight weeks. Subsequent acts — human, planetary and universal, all using the COVID virus as an entry point — will be unveiled over the coming months until the final instalment in December 2022.

Among Chaddah’s team were Allie Blumas and the Open Fortress dance collective — who perform as microscopic components of the virus’s proliferation, including “spike” proteins, A2 receptors and ribosomes — costumiers Call and Response (who designed for the late Prince), director of photography Henry Sansom and composer Dan Bédard (who wrote the film’s music after observing the dance rehearsals remotely).

A March 5, 2021 article by Leah Collins for CBC online offers more details (Note: Links have been removed),

This month, the Aga Khan Museum in Toronto is debuting new work from local artist Radha Chaddah. Called IAM, this digital exhibition is actually the first act in a series of four short films that she aims to produce between now and the end of 2022. It’s a “COVID story,” says Chaddah, but one that offers a perspective beyond the anniversary of its impact on life and culture and toilet-paper consumption. “I wanted to present a piece that makes people think about the coronavirus in a different way,” she explains, “one that pulls them out of the realm of fear and puts our imaginations into the realm of curiosity.”

It’s scientific curiosity that Chaddah’s talking about, and her own extra-curricular inquiries first sparked the series. For several years, Chaddah has produced work that splices art and science, a practice she began while doing grad studies in molecular neurobiology. “If I had to describe it simply, I would say that I make art about invisible realities, often using the tools of research science,” she says, and in January of last year, she was gripped by news of the novel coronavirus’ discovery. 

“I started researching: reading research papers, looking into how it was that [the virus] actually affected the human body,” she says. “How does it get into the cells? What’s its replicative life cycle?” Chaddah wanted a closer look at the structure of the various molecules associated with the progression of COVID-19 in the body, and there is, it turns out, a trove of free material online. Using animated 3-D renderings (sourced from this digital database), Chaddah began reviewing the files: blowing them up with a video projector, and using the trees in her own backyard as “a kind of green, living stage.”

Part one of IAM (the film appearing on the Aga Khan’s website) is called “Dance of the Molecules.” Recorded on Chaddah’s property in September, it features two dancers: Allie Blumas (who choreographed the piece) and Lee Gelbloom. Their bodies, along with the leafy setting, serve as a screen for Chaddah’s projections: a swirl of firecracker colour and pattern, built from found digital models. Quite literally, the viewer is looking at an illustration of how the coronavirus infects the human body and then replicates. (The very first images, for example, are close-ups of the virus’ spiky surface, she explains.) And in tandem with this molecular drama, the dancers interpret the process. 

There is a brief preview,

To watch part 1 of IAM: Dance of the Molecules, go here to the Aga Khan Museum.

Enjoy!

Being a bit curious, I looked up Radha Chaddah’s website and found this on her Bio webpage (click on the About tab for the dropdown menu from the Home page),

Radha Chaddah is a Toronto based visual artist and scientist. Born in Owen Sound, Ontario she studied Film and Art History at Queen’s University (BAH), and Human Biology at the University of Toronto, where she received a Master of Science in Cell and Molecular Neurobiology. 

Chaddah makes art about invisible realities like the cellular world, electromagnetism and wave form energy, using light as her primary medium.  Her work examines the interconnected themes of knowledge, illusion, desire and the unseen world. In her studio she designs projected light installations for public exhibition. In the laboratory, she uses the tools of research science to grow and photograph cells using embedded fluorescent light-emitting molecules. Her cell photographs and light installations have been exhibited across Canada and her photographs have appeared in numerous publications.  She has lectured on basic cell and stem cell biology for artists, art students and the public at OCADU [Ontario College of Art & Design University], the Ontario Science Centre, the University of Toronto and the Textile Museum of Canada.

I also found Call and Response here, the Open Fortress dance collective on the Centre de Création O Vertigo website, Henry Sansom here, and Dan Bedard here. Both Bedard and Sansom can be found on the Internet Movie Database (IMDb.com), as well.

Dancing with a robot

Dancing with Baryshnibot. Alice Williamson, Courtesy Merritt Moore

Dancing robots usually perform to pop music but every once in a while, there’s a move toward classical music and ballet, e.g., my June 8, 2011 posting titled ‘Robot swan dances to Tchaikovsky’s Swan Lake’. Unlike the dancing robot in the picture above, that robot swan danced alone. (You can still see the robot’s Swan Lake performance in the video embedded in the 2011 posting.)

I don’t usually associate dance magazines with robots, but Chava Pearl Lansky’s Nov. 18, 2020 article about dancer/physicist Merritt Moore and her work with Baryshnibot appears in the ballet magazine Pointe (Note: Links have been removed),

When the world went into lockdown last March [2020], most dancers despaired. But not Merritt Moore. The Los Angeles native, who lives in London and has danced with Norwegian National Ballet, English National Ballet and Boston Ballet, holds a PhD in atomic and laser physics from the University of Oxford. A few weeks into the coronavirus pandemic, she came up with a solution for having to train and work alone: robots.

Moore had just come out of a six-week residency at Harvard ArtLab focused on the intersection between dance and robotics. “I knew I needed something to look forward to, and thought how bizarre I’d just been working with robots,” she says. “Who knew they’d be my only potential dance partners for a really long time?” She reached out to Universal Robotics and asked them to collaborate, and they agreed to send her a robot to experiment with.

Baryshnibot is an industrial robot normally used for automation and manufacturing. “It does not look impressive at all,” says Moore. “But there’s so much potential for different movement.” Creating dances for a robot, she says, is like an elaborate puzzle: “I have to figure out how to make this six-jointed rod emulate the dance moves of a head, two arms, a body and two legs.”

Moore started with the basics. She’d learn a simple TikTok dance, and then map the movements into a computer pad attached to the robot. “The 15-second-routine will take me five-hours-plus to program,” she says. Despite the arduous process, she’s built up to more advanced choreography, and is trying on different dance styles, from ballet to hip hop to salsa. For her newest pas de deux, titled Merritt + Robot, Moore worked with director Conor Gorman and cinematographer Howard Mills to beautifully capture her work with Baryshnibot on film. …
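For a sense of what ‘mapping the movements into a computer pad’ might amount to, here is a toy sketch (entirely hypothetical; it is not Moore’s setup or Universal Robots’ programming interface) of a 15-second routine stored as time-stamped joint-angle keyframes for a six-jointed arm and interpolated into a stream of joint targets,

```python
import numpy as np

# One keyframe per pose: (time in seconds, six joint angles in degrees). Values are invented.
KEYFRAMES = [
    (0.0,  [0, -90, 90, -90, -90, 0]),
    (5.0,  [45, -60, 70, -100, -45, 30]),
    (10.0, [-30, -80, 110, -70, -90, -20]),
    (15.0, [0, -90, 90, -90, -90, 0]),   # return to the starting pose
]

def joint_trajectory(keyframes, rate_hz=25.0):
    """Linearly interpolate keyframes into a steady stream of joint targets."""
    times = np.array([t for t, _ in keyframes])
    angles = np.array([a for _, a in keyframes], dtype=float)
    sample_times = np.arange(times[0], times[-1], 1.0 / rate_hz)
    return np.column_stack([np.interp(sample_times, times, angles[:, j])
                            for j in range(angles.shape[1])])

traj = joint_trajectory(KEYFRAMES)
print(traj.shape)  # (375, 6): 15 seconds of 25 Hz joint targets for a six-jointed arm
# A real controller would consume these targets (and enforce velocity/acceleration limits);
# the point here is simply how much hand-tuning even a short routine involves.
```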

You can find Moore’s and Baryshnibot’s performance video embedded in the Nov. 18, 2020 article.