What can we learn by keeping our ear to the cosmos?
Anyone who remembers the iconic Ridley Scott film “Alien” (who could forget?) may recall the tagline “in space, no one can hear you scream.” And while sci-fi films are frequent fodder for scientific debate, this assertion is widely acknowledged as true. Because there is no air in space, there is nothing to conduct the sound waves, and therefore no vibrations that are perceptible to the human ear. But that’s not to say it’s impossible to hear the sounds of space. With a little creativity and a lot of scientific ingenuity, astrophysicists have developed fascinating ways for us to listen to the cosmos.
In most cases, the listening process only requires a little bit of translation. Mechanical sound waves might not be able to travel through space, but electromagnetic waves can. Scientists use instruments to collect radio waves, microwaves, infrared rays, optical rays, ultraviolet rays, X-rays, and gamma-rays, then convert them into audible sound waves through a process known as sonification.
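Curious what that translation can look like? Here is a minimal, hypothetical Python sketch of the idea (the frequency values are invented for illustration, not real spacecraft data): it takes a handful of measured wave frequencies, plays each one as a short audible tone, and saves the result as a sound file.

```python
# A minimal, hypothetical sketch of sonification: take a series of measured
# wave frequencies (for example, plasma oscillations) and play each one as a
# short audible tone. The input values below are made up for illustration.
import numpy as np
import wave

SAMPLE_RATE = 44100          # audio samples per second
TONE_SECONDS = 0.25          # length of each tone

# Pretend these are frequencies (in Hz) pulled from an instrument's data.
measured_frequencies = [2300.0, 2450.0, 2800.0, 3100.0, 2900.0]

def tone(frequency, seconds, rate):
    """Generate one sine-wave tone as 16-bit audio samples."""
    t = np.linspace(0.0, seconds, int(rate * seconds), endpoint=False)
    samples = 0.5 * np.sin(2.0 * np.pi * frequency * t)
    return (samples * 32767).astype(np.int16)

# String the tones together and save them as a mono WAV file.
audio = np.concatenate([tone(f, TONE_SECONDS, SAMPLE_RATE)
                        for f in measured_frequencies])

with wave.open("sonified.wav", "wb") as out:
    out.setnchannels(1)          # mono
    out.setsampwidth(2)          # 16-bit samples
    out.setframerate(SAMPLE_RATE)
    out.writeframes(audio.tobytes())
```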
NASA researchers were considering the possibilities of sound in space as early as 1977, when the Voyager probes were launched. In case the interstellar probes ever encounter intelligent extraterrestrial life, NASA placed a “Golden Record” aboard each spacecraft bearing the auditory marks of life on Earth: ocean waves, bird song, greetings in 55 languages, and even a playlist of multicultural music through the ages. But scientists were also thinking about what sounds the Voyagers could receive when they installed a Plasma Wave Subsystem aboard each probe.
In 2012, Voyager 1 crossed the heliopause, the boundary of the heliosphere. Not long after, it sent back an amazing piece of data: the vibrations of dense plasma, or ionized gas, rumbling in interstellar space. These eerie whistles are helping scientists measure the density of the interstellar medium beyond our solar system.
Plasma is frequently used as a medium for detecting space sounds. Just as sound waves can move grains of sand on a plate, similar waves cause the plasma in the Sun to rise and fall. This is how scientists learned that the Sun itself rings like a bell. Instruments like the Helioseismic and Magnetic Imager (HMI) and the Michelson Doppler Imager (MDI) observed these movements in solar plasma, and the team at Stanford’s Solar Center created the Sonification of Solar Harmonics (SoSH) Project to convert the observed solar vibrations into audible sounds.
Plasma wave instruments were also placed on NASA’s planetary explorers, like the Cassini probe to Saturn and the Juno probe to Jupiter. Cassini’s Radio and Plasma Wave Science instrument picked up a number of fascinating signals, including radio emissions from Saturn and its moons and an impressive lightning storm. Cassini also carried the Huygens probe, whose microphone recorded sound as it descended to Saturn’s moon Titan.
The Juno probe captured data as it crossed into Jupiter’s magnetosphere, the largest structure in our solar system. Its Waves instrument picked up a series of electromagnetic waves trapped in a cavity within Jupiter’s magnetic field. A couple of months later, the instrument received radio signals from the planet’s notoriously intense auroras.
Not all sounds captured in space are a result of electromagnetic waves. Direct impacts can cause mechanical vibrations that are audible to the human ear. For instance, when Stardust-NExT encountered the comet Tempel 1 in 2011, its Dust Flux Monitor recorded the vibrations of dust particles pelting the craft.
In late 2018, the InSight lander placed a highly sensitive seismometer on Mars, which has since collected the sounds of quakes and Martian winds. Inspired by the seismometer’s success, NASA opted to send a set of microphones aboard the Perseverance rover, which landed on the red planet in February of 2021. The Entry, Descent, and Landing mic captured the first audio from the Martian surface shortly after Percy’s successful touchdown, while the SuperCam mic sends back the mechanical sounds of the rover at work and the zap of its laser against the rocks and minerals it studies.
Other space sounds are sonifications of data about light. This has allowed NASA to create sounds from observations made by some of its most impressive telescopes, including the Chandra X-ray Observatory, the Hubble Space Telescope, and the Spitzer Space Telescope. These telescopes create images by capturing X-ray, infrared, and optical light. Sonification converts that data into audio in which the pitch and volume track the position and brightness of the light. As a result, we’re able to hear celestial objects like supernovas, nebulas, and even black holes. (In case you’re wondering, black holes sing in the key of B-flat.)
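To get a feel for that mapping, here’s a toy Python sketch (not NASA’s actual pipeline; the “image” below is just random numbers standing in for telescope data). It sweeps across the image column by column, letting the brightest row set the pitch of each note and the total light set its volume.

```python
# A toy illustration of sonifying an image: sweep across the columns of a
# small brightness array, letting the brightest row set the pitch and the
# average brightness set the volume of each note. The image is random data
# standing in for a real telescope observation.
import numpy as np
import wave

SAMPLE_RATE = 44100
NOTE_SECONDS = 0.2

# Stand-in for a telescope image: rows = vertical position, columns = sweep.
image = np.random.rand(64, 40)

low_hz, high_hz = 220.0, 880.0   # map vertical position onto two octaves

notes = []
for column in image.T:
    brightest_row = np.argmax(column)                     # where the light is
    pitch = low_hz + (high_hz - low_hz) * (1 - brightest_row / (len(column) - 1))
    volume = column.sum() / len(column)                   # how much light there is
    t = np.linspace(0.0, NOTE_SECONDS, int(SAMPLE_RATE * NOTE_SECONDS), endpoint=False)
    notes.append(volume * np.sin(2.0 * np.pi * pitch * t))

audio = (np.concatenate(notes) * 0.5 * 32767).astype(np.int16)
with wave.open("image_sonified.wav", "wb") as out:
    out.setnchannels(1)
    out.setsampwidth(2)
    out.setframerate(SAMPLE_RATE)
    out.writeframes(audio.tobytes())
```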
But why go through all this trouble to recreate the sounds of space? The answers are both complex and very simple. In the case of plasma wave instruments, scientists can learn a great deal about the interactions and dynamics between objects in our solar system; the “sounds” that come from these studies are just a fun after-effect. And science should be fun. Musical composers, video game designers, and other multimedia artists have latched onto these space sounds for all sorts of creative endeavors. Who’s to say complex scientific data shouldn’t be accessible to the masses?
Accessibility is another important piece of the puzzle. Tools like sonification broaden the field of astrophysics so it can be studied and enjoyed by people who are blind or low vision. But it also presents data in a multi-sensory form that makes learning more accessible for everyone.
Listening with Eclipse Soundscapes
For more fun and accessible space science, download the Eclipse Soundscapes Mobile Application, which allows you to hear (and feel) a total solar eclipse. You can also sign up to join our upcoming Eclipse Soundscapes: Citizen Science Project, where we’ll be studying how eclipses impact the soundscapes here on planet Earth. It’s just another way to keep our ears open and learn about our universe!