Tag Archives: Matt Russo

Chandra Sonifications (extraplanetary music and data sonification)

I’m not sure why the astronomy community is so taken with creating music out of data, but it seems to be the most active of the science communities in this field. This October 15, 2023 article by Elizabeth Hlavinka for Salon.com provides a little context before describing some of the latest work (Note: Links have been removed),

Christine Malec, who has been blind since birth, has always been a big astronomy buff, fascinated by major questions about the universe like what happens when a limit reaches infinity and whether things like space travel could one day become a reality. However, throughout her childhood, most astronomical information was only accessible to her via space documentaries or science fiction books.

Nearly a decade ago, Malec discovered a completely new way to experience astronomy when she saw astronomer and musician Matt Russo, Ph.D., give a presentation at a local planetarium in Toronto. Using a process called astronomical sonification, Russo had translated information collected from the TRAPPIST-1 planetary system, which has seven planets locked in an orbital resonance, into something people who are blind or have low vision could experience: music.

Russo’s song sent a wave of goosebumps through Malec’s body. Something she had previously understood intellectually but never had turned into a sensory experience was suddenly, profoundly felt.

“It was unforgettable,” Malec told Salon in a phone interview. “I compare it to what it might be like for a sighted person to look up at the night sky and get a sensory intuition of the size and nature of the cosmos. As a blind person, that’s an experience I hadn’t had.”

Through astronomical sonification, scientists map complex astronomical structures like black holes or exploded stars through the similarly expansive and multidimensional world of sound. Translating data from outer space into music not only expands access to astronomy for people who are blind or have low vision, but it also has the potential to help all scientists better understand the universe by leading to novel discoveries. Like images from the James Webb telescope that contextualize our tiny place in the universe, astronomical sonification similarly holds the power to connect listeners to the cosmos.

“It really does bring a connection that you don’t necessarily get when you’re just looking at a cluster of galaxies that’s billions of light years away from you that stretches across many hundreds of millions of light years,” said Kimberly Kowal Arcand, Ph.D., a data visualizer for NASA’s Chandra X-ray Observatory. “Having sound as a way of experiencing that type of phenomenon, that type of object, whatever it is, is a very valid way of experiencing the world around you and of making meaning.”

Malec serves as a consultant for Chandra Sonifications, which translates complex data from astronomical objects into sound. One of their most popular productions, which has been listened to millions of times, sonified a black hole in the Perseus galaxy cluster about 240 million light-years away. When presenting this sonification at this year’s [2023] SXSW festival in March, Russo, who works with Chandra through an organization he founded called SYSTEM Sounds, said this eerie sound used to depict the black hole had been likened to “millions of damned souls being sucked into the pits of hell.”

Here’s some of what the audience at the 2023 SXSW festival heard,

If you have the time, do read Hlavinka’s October 15, 2023 article as she tells a good story with many interesting tidbits such as this (Note: Links have been removed),

William “Bill” Kurth, Ph.D., a space physicist at the University of Iowa, said the origins of astronomical sonification can be traced back to at least the 1970s when the Voyager-1 spacecraft recorded electromagnetic wave signals in space that were sent back down to his team on Earth, where they were processed as audio recordings.

Back in 1979, the team plotted the recordings on a frequency-time spectrogram similar to a voiceprint you see on apps that chart sounds like birds chirping, Kurth explained. The sounds emitted a “whistling” effect created by waves following the magnetic fields of the planet rather than going in straight lines. The data seemed to confirm what they had suspected: lightning was shocking through Jupiter’s atmosphere.

“At that time, the existence of lightning anywhere other than in Earth’s atmosphere was unknown,” Kurth told Salon in a phone interview. “This became the first time that we realized that lightning might exist on another planet.”

And this (Note: Links have been removed),

Beyond astronomy, sonification can be applied to any of the sciences, and health researchers are currently looking at sonifying DNA strands to better understand how proteins fold in multiple dimensions. Chandra is also working on constructing tactile 3-D models of astronomical phenomena, which also expands access for people who are blind or have low vision — those who have historically only been able to experience these sciences through words, Malec said.

Chandra and other sonification projects

I found a brief and somewhat puzzling description of the Chandra sonification project on one of the US National Aeronautics and Space Administration (NASA) websites. From a September 22, 2021 posting on the Marshall Science Research and Projects Division blog (Note: Links have been removed),

On 9/16/21, a Chandra sonification image entitled “Jingle, Pluck, and Hum: Sounds from Space” was released to the public.  Since 2020, Chandra’s “sonification” project has transformed astronomical data from some of the world’s most powerful telescopes into sound.  Three new objects — a star-forming region, a supernova remnant, and a black hole at the center of a galaxy — are being released.  Each sonification has its own technique to translate the astronomical data into sound.

For more information visit: Data Sonifications: Westerlund 2 (Multiwavelength), Tycho’s Supernova Remnant, and M87. https://www.nasa.gov/missions_pages/chandra/main/index.html.

A Chandra article entitled “Data Sonification: Sounds from the Milky Way” was also released in the NASA STEM Newsletter. This newsletter was sent to 54,951 subscribers and shared with the Office of STEM Engagement’s social media tools, which have approximately 1.7M followers. For more information visit: https://myemail.constantcontact.com/NASA-EXPRESS—-Your-STEM-Connection-for-Sept–9–2021.html?soid=1131598650811&aid=iXfzAJk6x_s

I’m a little puzzled by the reference to a Chandra sonification image, but I’m assuming that they also produce data visualizations. Anyway, as Hlavinka notes, Chandra is a NASA X-ray observatory, and it has a number of different projects/initiatives.

Getting back to data sonification, Chandra offers various audio files on its A Universe of Sound webpage,

Here’s a sampling of three data sonification posts (there are more),

Enjoy!

Space and sound (music from the Milky Way)

A May 17, 2021 posting on the Canadian Broadcasting Corporation (CBC) Radio Ideas programme blog describes and hosts embedded videos and audio clips of space data sonifications and visualizations,

After years of attempts and failures to get a microphone to Mars, NASA’s [US National Aeronautics and Space Administration] latest rover, Perseverance, succeeded. It landed in February carrying two microphones.

For Jason Achilles Mezilis, a musician and record producer who has also worked for NASA, listening to the haunting Martian wind was an emotional experience.

“I’m in this bar half drunk, and I go over to the corner and I listen to it on my cellphone and … I broke down.”

The atmosphere of Mars is much thinner than Earth’s, but it still has enough air to transmit sound.

Ben Burtt, an Oscar-winning sound designer, editor and director, made the sounds of cinematic space fantasy — from Star Wars to WALL-E to Star Trek. But he’s also deeply interested in the sound of actual space reality.

“All sound is a form of wind, really. It’s a puff of air molecules moving. And when I heard the sound, I thought: ‘Well, you know, I’ve heard this many times in my headphones on recording trips,’” Burtt said.

SYSTEM Sounds, founded by University of Toronto astrophysicist and musician Matt Russo, translates data from space into music. 

Planets or moons sometimes fall into what’s called “orbital resonance,” where two or more bodies pull each other into a regular rhythm. One example is Jupiter’s three inner Galilean moons: Io, Europa, and Ganymede.

“The rhythm is very similar to what a drummer might play. There’s a very simple regularity,” Russo said.

“And there’s something about our ears and our auditory system that finds that pleasing, finds repeating rhythms with simple ratios between them pleasing or natural sounding. It’s predictable. So it gives you something to kind of latch on to emotionally.”

Russo created this tool to illustrate the musical rhythm of the Galilean moons. 
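For a rough sense of how those repeating rhythms arise, here’s a minimal sketch (not Russo’s actual tool) in which each moon “plays a beat” once per orbit; because the orbital periods of Io, Europa, and Ganymede sit in a nearly 1:2:4 ratio, the resulting pattern repeats. The period values are approximate and the mapping is my own illustration.

```python
# A minimal sketch (not SYSTEM Sounds' actual tool): each moon "plays a beat"
# once per orbit, so the roughly 1:2:4 period ratios produce a repeating rhythm.
# Orbital periods (in days) are approximate and used for illustration only.
PERIODS = {"Io": 1.77, "Europa": 3.55, "Ganymede": 7.15}

def beat_times(total_days):
    """Return (time_in_days, moon) events, one per completed orbit."""
    events = []
    for moon, period in PERIODS.items():
        t = period
        while t <= total_days:
            events.append((t, moon))
            t += period
    return sorted(events)

for t, moon in beat_times(15):
    print(f"day {t:5.2f}: {moon}")
```

Every fourth Io beat lines up (almost) with a Ganymede beat and every second one with a Europa beat, which is the “drummer’s rhythm” Russo describes.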

During the pandemic, scientists at NASA, with the help of SYSTEM Sounds, tried to find new ways of connecting people with the beauty of space. The result was “sonic visualizations,” translating data captured by telescopes into sound instead of pictures.

Most images of space come from data translated into colours, such as Cassiopeia A, the remains of an exploded star. 

A given colour is usually assigned to the electromagnetic signature of each chemical in the dust cloud. But instead of assigning a colour, a musical note can be assigned, allowing us to hear Cassiopeia A instead of just seeing it.
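As a rough illustration of that “note instead of colour” idea, here’s a minimal sketch; the elements listed are ones genuinely found in supernova remnants like Cassiopeia A, but the note assignments are placeholders of my own, not the Chandra team’s actual mapping.

```python
# A minimal sketch of "a note instead of a colour": each element's X-ray
# signature gets a pitch rather than a hue. The element-to-note mapping
# below is a placeholder, not the Chandra team's actual assignment.
NOTE_FOR_ELEMENT = {
    "silicon": "C4",
    "sulfur": "E4",
    "calcium": "G4",
    "iron": "B4",
}

def sonify_region(elements_detected):
    """Return the chord (list of notes) for one region of the image."""
    return [NOTE_FOR_ELEMENT[e] for e in elements_detected if e in NOTE_FOR_ELEMENT]

# A region rich in silicon and iron becomes a two-note chord.
print(sonify_region(["silicon", "iron"]))  # ['C4', 'B4']
```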

There are several videos, as well as the Ideas radio interview, embedded in the May 17, 2021 posting. Should you be interested, you can find SYSTEM Sounds here.

You will find a number of previous postings (use the search term ‘data sonification’); the earliest concerning ‘space music’ is from February 7, 2014. You’ll also find Matt Russo, the TRAPPIST-1 planetary system, and music in a May 11, 2017 posting.

Sounding out the TRAPPIST-1 planetary system

It’s been a while since a data sonification story has come this way. Like my first posting on the topic (Feb. 7, 2014), this is another astrophysics ‘piece of music’. From the University of Toronto (Canada) and Thought Café (a Canadian animation studio),

For those who’d like a little text, here’s more from a May 10, 2017 University of Toronto news release (also on EurekAlert) by Don Campbell,

When NASA announced its discovery of the TRAPPIST-1 system back in February [2017] it caused quite a stir, and with good reason. Three of its seven Earth-sized planets lie in the star’s habitable zone, meaning they may harbour suitable conditions for life.

But one of the major puzzles from the original research describing the system was that it seemed to be unstable.

“If you simulate the system, the planets start crashing into one another in less than a million years,” says Dan Tamayo, a postdoc at U of T Scarborough’s Centre for Planetary Science.

“This may seem like a long time, but it’s really just an astronomical blink of an eye. It would be very lucky for us to discover TRAPPIST-1 right before it fell apart, so there must be a reason why it remains stable.”

Tamayo and his colleagues seem to have found a reason why. In research published in the journal Astrophysical Journal Letters, they describe the planets in the TRAPPIST-1 system as being in something called a “resonant chain” that can strongly stabilize the system.

In resonant configurations, planets’ orbital periods form ratios of whole numbers. It’s a very technical principle, but a good example is how Neptune orbits the Sun three times in the amount of time it takes Pluto to orbit twice. This is a good thing for Pluto because otherwise it wouldn’t exist. Since the two planets’ orbits intersect, if things were random they would collide, but because of resonance, the locations of the planets relative to one another keep repeating.
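As an aside (mine, not the news release’s), you can sanity-check that 3:2 claim with the approximate orbital periods of the two bodies:

```python
# Rough check of the Neptune-Pluto 3:2 resonance using approximate periods in years.
neptune_period = 164.8
pluto_period = 247.9

print(3 * neptune_period)              # ~494 years for three Neptune orbits
print(2 * pluto_period)                # ~496 years for two Pluto orbits: nearly equal
print(pluto_period / neptune_period)   # ~1.50, i.e. close to the 3:2 ratio
```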

“There’s a rhythmic repeating pattern that ensures the system remains stable over a long period of time,” says Matt Russo, a post-doc at the Canadian Institute for Theoretical Astrophysics (CITA) who has been working on creative ways to visualize the system.

TRAPPIST-1 takes this principle to a whole other level with all seven planets being in a chain of resonances. To illustrate this remarkable configuration, Tamayo, Russo and colleague Andrew Santaguida created an animation in which the planets play a piano note every time they pass in front of their host star, and a drum beat every time a planet overtakes its nearest neighbour.

Because the planets’ periods are simple ratios of each other, their motion creates a steady repeating pattern that is similar to how we play music. Simple frequency ratios are also what makes two notes sound pleasing when played together.

Speeding up the planets’ orbital frequencies into the human hearing range produces an astrophysical symphony of sorts, but one that’s playing out more than 40 light years away.
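To give a concrete sense of what “speeding up into the human hearing range” means, here’s a minimal sketch using approximate TRAPPIST-1 orbital periods; the speed-up factor is an arbitrary choice of mine for illustration, not the one used in the actual animation.

```python
# A minimal sketch of scaling orbital frequencies into audible pitches.
# Periods (in days) are approximate; the speed-up factor is an arbitrary
# choice that puts the innermost planet near a few hundred hertz.
PERIODS_DAYS = {"b": 1.51, "c": 2.42, "d": 4.05, "e": 6.10,
                "f": 9.21, "g": 12.35, "h": 18.77}

SPEED_UP = 50_000_000  # the same factor is applied to every planet

for planet, period_days in PERIODS_DAYS.items():
    orbital_hz = 1.0 / (period_days * 86_400)  # orbits per second
    audible_hz = orbital_hz * SPEED_UP
    print(f"TRAPPIST-1{planet}: {audible_hz:6.1f} Hz")
```

Because every planet is sped up by the same factor, the ratios between the resulting pitches are the same as the ratios between the orbital frequencies, which is why the resonances survive the translation into sound.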

“Most planetary systems are like bands of amateur musicians playing their parts at different speeds,” says Russo. “TRAPPIST-1 is different; it’s a super-group with all seven members synchronizing their parts in nearly perfect time.”

But even synchronized orbits don’t necessarily survive very long, notes Tamayo. For technical reasons, chaos theory also requires precise orbital alignments to ensure systems remain stable. This can explain why the simulations done in the original discovery paper quickly resulted in the planets colliding with one another.

“It’s not that the system is doomed, it’s that stable configurations are very exact,” he says. “We can’t measure all the orbital parameters well enough at the moment, so the simulated systems kept resulting in collisions because the setups weren’t precise.”

In order to overcome this, Tamayo and his team looked at the system not as it is today, but as it may have originally formed. When the system was being born out of a disk of gas, the planets should have migrated relative to one another, allowing the system to naturally settle into a stable resonant configuration.

“This means that early on, each planet’s orbit was tuned to make it harmonious with its neighbours, in the same way that instruments are tuned by a band before it begins to play,” says Russo. “That’s why the animation produces such beautiful music.”

The team tested the simulations using the supercomputing cluster at the Canadian Institute for Theoretical Astrophysics (CITA) and found that the majority of the systems they generated remained stable for as long as they could possibly run them. This was about 100 times longer than it took for the simulations in the original research paper describing TRAPPIST-1 to go berserk.

“It seems somehow poetic that this special configuration that can generate such remarkable music can also be responsible for the system surviving to the present day,” says Tamayo.

Here’s a link to and a citation for the paper,

Convergent Migration Renders TRAPPIST-1 Long-lived by Daniel Tamayo, Hanno Rein, Cristobal Petrovich, and Norman Murray. The Astrophysical Journal Letters, Volume 840, Number 2 https://doi.org/10.5281/zenodo.496153 Published 2017 May 10

© 2017. The American Astronomical Society. All rights reserved.

This paper is open access.