Contributing Authors: MaryKay Severino, Vayujeet Gokhale, Rosanne LaBaige, Erin Nichols, Michael Dawson, Matt Barton, Jean Nock, Don Ficken
Eclipse Soundscapes (ES) volunteers continue to make an impact long after the eclipses. Many people chose to donate their AudioMoth devices after submitting their ES data. Because of this generosity, these recorders now have a second life supporting scientific and educational work.
Nineteen donated devices are already in use through a growing collaboration with DarkSky Missouri, a project led by Don Ficken that is working to connect people with the sounds of the night. Scientists, educators, and community leaders are using the recorders to explore and understand nighttime soundscapes. We first highlighted DarkSky Missouri’s work in an August 2024 post and webinar.
An AudioMoth being tested by Dr. Vayujeet Gokhale at Truman State University. Photo taken by Dr. Vayujeet Gokhale, Truman State University.
At Truman State University, Dr. Vayujeet Gokhale is incorporating AudioMoths into his Freshman Seminar course “Dark Skies: A Natural Resource.” The course encourages students to appreciate both the beauty of the night sky and the different life forms active after dark.
Listening to Nature After Dark
As one part of the course, eight to ten AudioMoths will be installed in and around the main campus and the Truman farm, where the Truman observatory is based, to gather local nature sounds. Three students will monitor and care for each instrument, sharing collected data with the campus DarkSky student group, the Campus Environmental Committee and Student Club.
Connecting Light, Wildlife, and Conservation
This work also supports ongoing campus efforts to reduce light pollution and protect local wildlife. Truman State is installing fully shielded, dark-sky-friendly outdoor lights with color temperatures below 3000 K, which minimize glare and reduce disruption to nocturnal animals. Learning about which bird species are active on this Audubon-certified Tree Campus will help demonstrate to university stakeholders how these light improvements benefit local wildlife. Check out some Truman lighting pictures here.
Sharing Results with the Community
A short report summarizing the results of these observations will be shared with the Eclipse Soundscapes Community, DarkSky Missouri, and other campuses participating in the Campus SHINE initiative (https://www.campusshine.org/).
In the Mehlville School District, Library media specialist Erin Nichols is helping students tune into the world around them. Through two new projects that use AudioMoths, students will explore how sound connects to art, music, movement, and the natural environments across their school campuses.
Exploring Sound Through Art, Music, and Movement
At MOSAIC Elementary, Nichols is collaborating with the art, music, and PE teachers to build upon the first-grade sound unit by incorporating AudioMoths into a new lesson. Students will record sounds in nature and compare them to the tones of musical instruments, the rhythms of PE activities, and the textures of art materials. They will also hypothesize what each sound might look like and express their ideas through original artwork. This creative, cross-disciplinary project helps students explore how sound connects to both the natural world and human activity.
Exploring Biodiversity Across School Campuses
Another project now being planned involves MOSAIC, Bierbaum, and Wohlwend Elementaries to explore biodiversity on each school’s campus. Using AudioMoths to record nature sounds and comparing the amount and quality of green space at each site, students will investigate how natural areas affect local wildlife. This exploration may lead to future opportunities to create more native habitats on school grounds, giving students additional ways to connect with nature and learn about the importance of protecting native species.
Deploying an AudioMoth at Jefferson College in Hillsboro, MO. Photo taken by Rosanne LaBaige, President of Missouri Master Naturalist – Miramiguoa Chapter.
Missouri Master Naturalists will continue their efforts to use AudioMoth devices to monitor nocturnal flight calls and explore nighttime ecology. Volunteers across the St. Louis metro area, including members of DarkSky Missouri, Missouri Master Naturalists, the St. Louis Audubon Society, and the Audubon Center at Riverlands, are deploying AudioMoth devices to capture the faint nocturnal calls of migrating birds each May and September.
Listening to the Night Sky
In one example, the program recorded flight calls over Gateway Arch National Park, identifying species such as the American Redstart, Least Sandpiper, and Rose-breasted Grosbeak. Missouri Master Naturalists from the Miramiguoa chapter have joined the effort, placing AudioMoths in nature reserves, backyard habitats throughout the metro area, and at a college observatory in southeast Jefferson County.
Connecting Data to Conservation Action
By documenting when birds are overhead, the team aims to raise awareness about the impact of light pollution on migratory pathways. This data supports community collaboration with Lights Out Heartland to encourage lighting practices that protect birds and promote the health of all living things.
The Saint Louis Zoo is preparing a sound-at-night program and will use donated AudioMoths to replace non-functioning units in their collection.
Supporting the WildCare Spring Peeper Program
The Saint Louis Zoo’s WildCare Spring Peeper Program will use donated AudioMoths to replace non-functioning units in its collection, supporting an ongoing bioacoustics survey of frogs and toads in the Saint Louis metro area. The project is titled “Spring Peeper Program: STL-Metro Presence & Absence Survey of non-arboreal Hylidae Species.”
Listening for Frogs after Dark
This study will survey the Saint Louis metropolitan statistical area for the presence or absence of three species: Spring Peepers (Pseudacris crucifer), Boreal Chorus Frogs (Pseudacris maculata), and Cricket Frogs (Acris crepitans blanchardi).
Pay It Forward - How to Donate Your AudioMoth
If you still have an AudioMoth you’d like to donate for use in education and nighttime science, the DarkSky Missouri team welcomes additional contributions. Send your device to:
DarkSky Missouri, c/o Don Ficken, 13024 Barrett Crossing Ct., St. Louis, MO 63122-4900
Questions? Reach out to Don at dficken@darkskymissouri.org.
Thank you again to ES volunteer scientists for being a part of Eclipse Soundscapes and for helping scientific exploration continue long after the 2023 and 2024 eclipses.
From Collection to Discovery: Why Processing Takes Time
By MaryKay Severino
If you mailed us a little microSD card for Eclipse Soundscapes, you might be wondering: what happened after it left your hands? Why did processing take over a year? The short answer: scale and complexity.
Think about it.
Nearly 1,000 AudioMoth devices were registered across both eclipses (219 in 2023 and 770 in 2024).
Over 600 microSD cards were mailed back (126 in 2023 and 477 in 2024), each with hours of audio.
Two eclipses happened only about six months apart, which meant we were still receiving and logging 2023 data while also updating protocols, training new volunteers, and preparing free kits for 2024.
Some cards came with carefully written notes about time and location, while others had only online notes, only handwritten notes, both, or none at all. That mix made every envelope a surprise, sometimes a complete package and sometimes a puzzle to solve.
When plans met reality
In 2023, things were fairly straightforward on paper. We had not yet invited people to use their own devices, so most returns came in the standardized envelopes we provided, each clearly marked with an ES ID. Even so, sorting took longer than expected. We had originally planned for about 50 sites in 2023 and 200 in 2024, but so many people were excited to join that we expanded both years. That surge meant every microSD card had to be carefully logged by hand, checking whether we had the card itself, the online location info, and any written notes. We worked hard to be transparent by releasing several shared “data dashboard” spreadsheets on the website and posting social media reminders to check them. These updates let participants know what we had on file for them, but the process was still manual work, card by card.
By 2024, when volunteers were invited to purchase and use their own AudioMoths, participation grew even bigger and the returns became more varied. Instead of neat, uniform envelopes, we began receiving packages of all shapes and sizes, many without an ES ID on the outside. Matching each one to online or handwritten notes added another layer of complexity and time.
Device prep behind the scenes
For the 2023 annular eclipse, we prepped and mailed 219 kits before the event (64 for ES partners and team, 155 free kits). These devices were shipped with batteries uninstalled, which meant participants had to set the device’s internal clock themselves. That turned out to be more complicated than expected and also revealed that some AudioMoths might malfunction.
Between 2023 and 2024, we ran battery usage tests to see if we could set the clocks and install batteries before mailing and still have enough battery power left by eclipse day. The answer was yes. That change made things easier for volunteers in 2024, but it also added more work for the ES team. While we were still receiving and processing 2023 data, we were preparing and mailing 390 additional free kits for 2024. Each device had its batteries installed and its clock manually set before mailing.
One thing remained consistent in both years: every AudioMoth required a firmware update, which had to be performed one by one by connecting the device to a computer. We also logged each device’s serial number, manually assigned an ES ID, and labeled the device in both written and braille formats.
Beyond the devices themselves, we also provided everything a Data Collector might need, so it was as easy as possible to focus on the science of data collection. Each kit was assembled by hand with return labels, bags, and zip ties, and packaged one by one. This careful preparation was time-consuming but essential for keeping everything organized and supporting volunteers.
Two eclipses, back to back
It was incredibly exciting that the 2023 annular and 2024 total eclipses happened so close together. The 2023 eclipse gave us the chance to test our protocols for the first time, then immediately improve them for 2024. But it also meant the timelines overlapped. While we were still receiving and logging annular data, we were also reviewing what went well for Apprentices, Observers, and Data Collectors in 2023, updating trainings, preparing free kits, and making improvements for the total eclipse. The quick turnaround left us with some catching up to do once the 2024 data began arriving.
Training improvements took time
After the 2023 annular eclipse, we carefully reviewed what went right and what could be better across all three roles: Apprentice, Observer, and Data Collector. That review directly shaped some big changes for 2024. In addition to having complete instructions on the website, we added more live Q and A sessions, more live trainings, and quick tips that went out weekly in the days and weeks before the total eclipse. These changes helped Data Collectors feel supported and prepared (and also improved training for ES Observers), but the careful review and the work to build new materials also took time.
We began receiving cards in October 2023, with huge influxes in the two months after the 2023 annular eclipse and again after the 2024 total eclipse. The last wave arrived at the end of 2024, leaving us with a mountain of data ready to process.
What happens behind the scenes
Processing was not just opening envelopes. It took custom computer programs written by the ES team to check every recording for a timestamp. If a device malfunctioned or was never set to the right time, we reviewed the Data Collector’s handwritten notes to determine the time and time zone. All times then had to be converted to UTC.
We also had to calculate the exact eclipse times for each site, based on latitude and longitude. Some people entered this information online, others wrote it by hand, and some used formats that did not match the guidance we provided. That meant our team often converted locations by hand, corrected missing negatives in coordinates, and double checked any site that appeared in the middle of the ocean.
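To give a feel for the kind of cleanup this involved, here is a minimal sketch of two of the steps described above: converting a handwritten local time to UTC, and catching a coordinate whose longitude lost its negative sign (which drops a U.S. site into the middle of the ocean). The function names, date format, and latitude/longitude bounds are illustrative assumptions, not the ES team’s actual code.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def to_utc(local_time_str: str, tz_name: str) -> datetime:
    """Convert a local time from a field note (e.g. '2024-04-08 14:05')
    plus its time zone into a UTC datetime."""
    local = datetime.strptime(local_time_str, "%Y-%m-%d %H:%M")
    return local.replace(tzinfo=ZoneInfo(tz_name)).astimezone(ZoneInfo("UTC"))

def check_coordinates(lat: float, lon: float) -> tuple[float, float]:
    """Flag a common data-entry error: a U.S.-latitude site with a
    positive longitude almost certainly dropped its minus sign."""
    if 24 < lat < 50 and 0 < lon < 180:  # rough continental-US latitude band
        lon = -lon  # assume the minus sign was omitted
    return lat, lon
```

For example, a note reading “2:05 PM Central” on eclipse day resolves to 19:05 UTC, and a St. Louis site entered as longitude 90.2 is corrected to -90.2.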
To keep participants in the loop, we regularly updated a public “data dashboard” spreadsheet that showed what we had received for each site, including microSD cards, online notes, and written notes. Social media posts pointed people back to this dashboard so they could confirm their information. We also shared maps of sites and a feedback form where participants could flag errors or confirm details. Each case was resolved one by one, with as many fixes made as possible.
In total, five custom programs were developed to handle audio data, mapping, eclipse timing, and other tasks. All of this code, along with full documentation, will be released publicly on GitHub by the end of 2026.
It was a bit like receiving thousands of puzzle pieces from hundreds of different puzzle boxes. Each piece matters, but first we had to sort them into the right box before we could put the bigger picture together.
Why your effort mattered
Even if your recording did not end up in the final published analysis, your participation still mattered. Every card, every note, and every attempt helped us refine the process and build one of the most extensive eclipse sound archives ever created, which is on track to be publicly available by the end of 2026. You helped prove that a project of this scale is possible.
Explore the full journey
Want to see exactly how data moves from envelopes on our desks to public access on Zenodo? Check out the Data Processing Stages section of the Your Data in Action page. There you will find the full flowchart and a plain language explanation of how we move data from collection to discovery.
How Artificial Intelligence Might Shape the Future of Eclipse Soundscapes Data
By MaryKay Severino and Henry "Trae" Winter
It’s hard to avoid hearing about how Artificial Intelligence (AI) is changing the way we live and work. Today, the word AI is used to describe many different kinds of computer programs that can learn and help machines solve tough problems, sometimes with human help and sometimes completely on their own. Organizations are working to keep up with the rapid changes in AI tools, best practices, and questions about ethics. Both researchers and managers are taking these changes seriously and are figuring out how to best use AI in NASA’s mission to share the exploration of the universe around us.
Members of the Eclipse Soundscapes (ES) team recently attended a NASA open data repositories workshop that prompted us to consider how AI might impact the Eclipse Soundscapes Project, even as it comes to an end. AI is starting to influence many areas of research and data sharing. One way that AI might impact large datasets, like the 500+ ES audio datasets, is by helping future researchers find, process, and analyze large amounts of data more efficiently and effectively.
This raised an important question for us: How might these very near-future AI possibilities impact the way we share the audio data collected by ES Data Collectors during the 2023 Annular Eclipse and the 2024 Total Solar Eclipse? Here is what we learned and what we decided to do:
Preparing Data For AI Searches
One topic of discussion was how projects can prepare data and metadata so they are searchable by AI, since this may be the way of the future.
Right now, lots of metadata (information about the data) is language-based. That means additional information about the audio data, like site notes, habitat descriptions, or weather descriptions, might be recorded as words or phrases rather than numbers or standardized codes. While this works well for people reading the data, it makes it harder for AI to process consistently.
Language-based metadata examples from ES
Site Location notes might say “near cattle pasture.”
Habitat notes could say “forest,” “woods,” or “woodland.” These all mean the same thing to a person, but could be interpreted differently by AI.
Data Repositories and Preparing for AI
A data repository is a platform where projects store their data so that it can be preserved and reused by others. If data repositories want improved AI search functionality in the future, they may eventually require that data be submitted in new AI search-ready formats.
Zenodo, the platform ES uses to store and share its audio and observation data, is one example of a data repository.
GitHub, the platform where ES shares its software and code, is another example of a data repository.
Not all data repositories have decided on standards for AI search. GitHub has introduced AI coding tools built on models such as Anthropic’s Claude Sonnet, OpenAI’s ChatGPT, and Google’s Gemini 2.5 Pro, but has not yet included AI agents for finding already existing code. Zenodo has not yet incorporated AI tools into its repository, and adding such tools is not in its current development roadmap. With the AI search landscape changing so quickly, it is hard to predict how AI search tools will be implemented in data repositories and how data providers should format their data for AI.
Vector Databases: An AI Search-Friendly Format
One AI search-friendly option that was discussed is putting each project’s data into a vector database that could be shared with its chosen data repository. A vector database combines data with metadata and also describes that metadata numerically rather than through language and keywords.
These numerical metadata descriptions make it easier for AI to recognize similarities rather than relying only on exact search-term matches.
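A toy example of the idea: in a vector database, each metadata entry is stored as a list of numbers (an “embedding”), and similarity is measured numerically, for instance with cosine similarity. The three-number vectors below are made up for illustration; a real embedding model would produce hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length numeric vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings for habitat terms (illustrative values only).
vectors = {
    "forest": [0.9, 0.1, 0.0],
    "woodland": [0.85, 0.15, 0.05],
    "cattle pasture": [0.1, 0.9, 0.2],
}
```

Here “forest” and “woodland” score as near-identical even though the strings share no keywords, while “cattle pasture” scores far lower, which is exactly the kind of similarity matching that keyword search misses.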
Zenodo, the repository where ES data is being archived, does not currently have a plan to support vector databases. It is impossible to predict how Zenodo or other online data repositories might incorporate vector databases and what future standards they may require.
ES’s Decision
Creating a vector database is more than what Eclipse Soundscapes can take on right now. It would take more time and resources than the project has and would mean looking for new data repositories or doing extensive work to fit it into Zenodo’s framework.
Still, we’re glad we explored this possibility. Thinking about what AI might mean for scientific data is worthwhile, even if we can’t take it on ourselves. As projects wind down, it helps to keep looking ahead. Our team will carry this knowledge into future efforts, and by sharing it here, the ES community can carry it forward too.
If you want to learn more about vector databases, check out these articles:
Preliminary Results from the Eclipse Soundscapes Project
by Kelsey Perrett and MaryKay Severino
Do birds sing during a solar eclipse? One year after the Great American Eclipse of 2024, we’re learning how avian species responded to this inspiring celestial event.
As one facet of the Eclipse Soundscapes Project, 1,310 people began the process of becoming ES Data Collectors, and almost 500 Data Collectors used AudioMoth recording devices to capture soundscapes before, during, and after the April 8, 2024 solar eclipse. The aim was to establish a baseline for “normal” soundscape activity on non-eclipse days and determine whether those soundscapes were altered by the total solar eclipse, in which the Moon passes in front of the Sun, temporarily blocking its rays. 45,960 total hours of audio were captured (if a human were to listen to the recordings for 8 hours a day, it would take nearly 16 years to finish!). Over the past year, the Eclipse Soundscapes team has worked tirelessly to organize this gigantic collection of audio data. Now, with the help of wildlife biologist Dr. Brent Pease and the machine learning technology of BirdNET, Eclipse Soundscapes has uncovered some exciting patterns in the data that indicate how bird species reacted to the eclipse.
Eclipse Soundscapes was based on an early participatory science initiative from the 1930s. For the August 31, 1932 total solar eclipse, the Boston Society of Natural History invited the general public to submit written observations of any animal behaviors they noticed during the eclipse. Many of these anecdotal reports suggested that animals — including birds — changed their vocal activity as the sky darkened. “It was noticeable that as the eclipse progressed, there was a decrease in the chorus, with a silence during totality,” wrote one participant. Other participants noted an increase in bird activity — the society received “several reports of hooting from wilder sections in New Hampshire.”
Boston Society of Natural History (~1865-1890)
Establishing a baseline for bird vocalization
Interestingly, the preliminary findings from Dr. Pease and the team mirror the anecdotal reports presented by the Boston Society of Natural History. The Eclipse Soundscapes Project ran its 2024 audio data through BirdNET, a machine-learning tool that identifies bird species from sound recordings. Because Eclipse Soundscapes Data Collectors recorded for two days prior to the eclipse and for two days after the eclipse as baseline data, the researchers were able to estimate the probability of vocalization during a 4-minute window at the specific time of day when totality occurred. (Totality occurs when 100% of the Sun is blocked by the Moon). They then used that data to investigate the question of whether bird vocalization patterns change during a solar eclipse.
Do bird vocalization patterns change during an eclipse?
Tufted Titmouse
The initial data showed varied results. Some bird vocalizations increased during totality, while others decreased. However, when the researchers broke the vocalization down by species and by family, patterns began to emerge. Birds like the Tufted Titmouse showed a marked decline in vocalization during totality (a 5% decrease in the probability of vocalizing compared to the same 4-minute time period on non-eclipse days).
Barred Owl
On the other hand, Barred Owls increased vocalization during totality (a 4% increase in vocalization probability).
Although researchers have audio data from the eclipse day, they used probability of vocalizing to compare that behavior to a typical day at the same time and location. This approach helps account for natural daily rhythms (some species are more or less active at certain times) and provides a clearer picture of how eclipse conditions altered usual behavior. A negative change means the species was less likely to vocalize during the eclipse than expected, while a positive change indicates an increase beyond typical levels.
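The comparison described above can be sketched in a few lines: estimate the probability of vocalizing from baseline observations, then take the signed difference between the eclipse window and the baseline. The detection values below are hypothetical, and this is a simplified stand-in for the actual statistical analysis, not the researchers’ code.

```python
def vocalization_probability(detections):
    """Fraction of site-days on which the species vocalized at least
    once during the 4-minute window at totality's clock time."""
    return sum(detections) / len(detections)

# Hypothetical detections for one species at five sites.
baseline = [True, True, False, True, True]    # non-eclipse days, same clock time
totality = [True, False, False, True, False]  # the 4-minute totality window

# Signed change: negative = quieter than expected during totality.
change = vocalization_probability(totality) - vocalization_probability(baseline)
```

In this toy example the species vocalized on 80% of baseline site-days but only 40% during totality, a change of -0.4, i.e. less likely to vocalize than expected.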
The trends remain when looking at bird families. Icteridae and Paridae (including titmice) consistently showed decreased vocalizations, while Corvidae and Strigidae (including owls) consistently showed increased vocalizations. This variation may be explained by species’ sensitivity to light. Nocturnal species and those that typically vocalize at dusk, such as robins, appeared to respond to the eclipse with increased vocal activity, suggesting a response to the sudden dimming of light or heightened sensitivity to light cues.
This bar graph shows how the probability of vocalization changed on solar eclipse day, by bird species. It uses audio data collected during the week of the April 8, 2024 total solar eclipse. The graph lists bird species on the left side and shows whether each became more or less vocal during the eclipse. Yellow bars pointing to the left show birds that were quieter than usual, while black bars pointing to the right show birds that were more vocal. Birds that increased their vocalizations include the Barred Owl, American Robin, Blue Jay, White-throated Sparrow, Fish Crow, and American Crow. Birds that became quieter include the Tufted Titmouse, Red-bellied Woodpecker, Brown-headed Cowbird, Common Grackle, and Red-shouldered Hawk. The biggest decrease in vocal activity came from the Tufted Titmouse, while the Barred Owl showed the largest increase.
Timing of the behavioral response
With these species and family patterns in mind, the researchers could then explore the timing of the behavioral response: At what point before or during totality did vocalization changes start, and how long did they last? The team charted the average number of bird vocalizations per minute across time, both for a “random day” before/after the eclipse and for the day of the eclipse. The graphs showed that vocalizations began trending downward about 22 minutes prior to totality and returned to normal approximately 49 minutes after totality.
For all other periods on the day of the eclipse, vocalizations per minute roughly matched the trajectory of the random days. Recordings taken outside the path of totality also roughly matched the trajectory of the random days, and are not included in the graph below. This indicates that the changes in vocalizations per minute were unique to the path of totality on the day of the eclipse.
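The per-minute charting described above amounts to binning detected calls by their offset from totality. Here is a minimal sketch of that bookkeeping step; the function name and the example timestamps are illustrative, not taken from the project’s analysis code.

```python
import math
from collections import defaultdict

def calls_per_minute_bin(call_times_min, totality_min):
    """Bin detected calls into whole-minute offsets from totality
    (negative = before totality, positive = after) and count each bin."""
    counts = defaultdict(int)
    for t in call_times_min:
        counts[math.floor(t - totality_min)] += 1
    return dict(counts)
```

For instance, with totality at minute 864 of the day, calls at minutes 863.1 and 863.9 both land in bin -1 (the minute just before totality), while a call at 905.2 lands in bin 41. Averaging such counts across sites and days yields the curves compared in the graph.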
While these findings are currently more about gaining baseline knowledge, they raise interesting questions about how birds respond to sudden changes in light. Could similar patterns emerge during other abrupt natural events, like thunderstorms or wildfires? Understanding these responses might eventually help researchers distinguish between different types of environmental disruptions, offering new ways to monitor ecosystem health through sound.
What’s next?
“It’s still early stages,” Dr. Pease said about these preliminary results. “We’re starting to see some really interesting and fun things. But there are lots more questions and more analysis to do.” Dr. Brent Pease, a Soundscapes Subject Matter Expert, is leading research efforts to analyze eclipse-related audio data collected by ES volunteers. His work focuses on understanding how wildlife responds to eclipses through changes in natural soundscapes.
A few questions Dr. Pease would like to explore include:
Proximity to totality: How close to totality do birds need to be to produce these changes? Location data tied to the audio recordings will help Eclipse Soundscapes answer this question.
The “dawn chorus reprise”: Dr. Pease notes a slight bump in the graph after totality in which bird vocalizations increased. A closer look at which birds are making these vocalizations will help researchers determine if it does in fact mimic dawn-like activity.
Non-bird responses: Do other species (like insects) change their vocalizations during the eclipse?
2023 Annular Eclipse vs. 2024 Total Solar Eclipse - Soundscapes Audio Data: Does the bird response along the path of annularity during the 2023 annular eclipse (with approximately 90% coverage) match the response just outside the path of totality during the 2024 total solar eclipse, where coverage was also around 90%?
While researchers are only beginning to scratch the surface of these questions, the collected audio data for the project will be uploaded to the Eclipse Soundscapes Community on Zenodo, a free, open-access repository before the project ends. Present and future researchers will be able to access this audio data to learn more about how eclipses impact soundscapes on Earth and other questions. The Eclipse Soundscapes team looks forward to learning more about the data collected by our fantastic Eclipse Soundscapes volunteers!
A mother and daughter experience the 2024 total solar eclipse together in Monkton, VT. – Photo courtesy of John Mejia
Following the 2023 and 2024 solar eclipses, Eclipse Soundscapes participants answered a number of survey questions that asked how the experience made them feel. Did the eclipse give them an experience of “awe” or a feeling of connection to something greater than themselves? Did participating in the project improve their feelings of belonging in science?
Researchers at North Carolina State University are using those participant responses to learn more about the emotions that eclipses evoke.
Kelly Lynn Mulvey, an Associate Professor of Psychology at North Carolina State University (NCSU), has spent her career studying how to increase participation in STEM. Mulvey worked through data from approximately 3,200 Eclipse Soundscapes Apprentices, Observers, and Data Collectors, uncovering some fascinating results.
An increase in science belonging
Overall, participating in the Eclipse Soundscapes project increased feelings of belonging in science. “We saw an increase in belonging from their reports of how they felt before they participated in the role and how they felt after they had participated,” Mulvey said.
Eclipse Soundscapes participants in 2024. – Photo courtesy of Kathleen Hay
Eclipse percentage and awe
Mulvey also studied how the eclipse impacted feelings of awe in participants. Her work built upon research surrounding the 2017 eclipse and the language used in Twitter (now X) posts. The earlier study indicated that the amount of eclipse coverage was related to increased awe. “People who were posting about the eclipse who were closer to totality used more words related to awe than those who were off the path of totality,” Mulvey said. For the Eclipse Soundscapes project, participants were specifically asked whether the eclipse increased feelings of awe. Again, the percentage of coverage proved important. “Those who experienced the total eclipse reported greater awe than those who experienced the partial eclipse,” Mulvey said.
Changes in animal behavior
Other teams at NCSU, led by Professor of Biological Sciences Adam Hartstone-Rose, are studying observations of animal behavior collected by Eclipse Soundscapes Observers. The teams hope to identify which groups of animals (birds, insects, mammals, etc.) changed or did not change their behavior during the eclipse. If animal groups did change their behavior, the team is trying to identify when, and how long it took them to return to “normal” behavior. “It’s laborious work, there is a lot of coding involved,” Mulvey said, but she expects to have some basic animal behavior findings soon. To learn more about the preliminary work done by Hartstone-Rose using 2023 ES Observer data, check out “Extraordinary Darkness: A Participatory Approach to Assessing Animal Behavior During Eclipses.”
Implications for future science
Mulvey’s research demonstrates the long-lasting scientific impact of the data from Eclipse Soundscapes participants. Not only does participatory science provide researchers with reliable and authentic data, it benefits the participants as well. “I think the belonging results especially are really important, because it suggests that doing these participatory science projects can help you feel like you fit in more with science,” Mulvey said. “I think it has the potential to really harness folks’ nascent interest in science and launch them into continuing to select either informal science activities, or maybe more formal science work, like pursuing a degree or a job in science. We have a lot of deficits in terms of who is entering STEM fields right now, and a need for more people and more people from (various) backgrounds…to leverage their skills and help us to answer the big scientific questions that are out there.”
When Eclipse Soundscapes Data Collectors submitted the audio data they recorded during the week of the April 8, 2024 eclipse, they had the option to keep or donate their AudioMoth recorder device. Several participants donated their AudioMoths to outlets like their local Library of Things. Others sent their AudioMoths back to Eclipse Soundscapes for us to donate to other science focused projects and community organizations. Eighteen of those AudioMoths have been donated to Dark Sky Missouri, an initiative to protect our night skies and the creatures that depend on them. Eclipse Soundscapes caught up with chapter founder Don Ficken to learn more about how these AudioMoths will contribute to future science.
The AudioMoth is a low-cost, full-spectrum acoustic recording device that can listen at audible frequencies and well into ultrasonic frequencies. It records uncompressed audio to a microSD card at rates from 8,000 to 384,000 samples per second and can be converted into a full-spectrum USB microphone.
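To put those sample rates in perspective, here is a back-of-the-envelope calculation (a sketch, assuming 16-bit mono uncompressed WAV recording; the helper name is invented for this illustration) of how quickly audio at each rate fills a microSD card:

```python
def hourly_size_mib(sample_rate_hz, bytes_per_sample=2, channels=1):
    """Size of one hour of uncompressed audio, in MiB.

    Assumes 16-bit (2-byte) mono samples, the common WAV configuration
    for field recorders like the AudioMoth.
    """
    bytes_per_hour = sample_rate_hz * bytes_per_sample * channels * 3600
    return bytes_per_hour / 2**20  # convert bytes to MiB

# Compare the AudioMoth's lowest rate, a typical audible-range rate,
# and its top ultrasonic rate:
for rate in (8_000, 48_000, 384_000):
    print(f"{rate:>7} samples/s -> {hourly_size_mib(rate):8.1f} MiB per hour")
```

At the top ultrasonic rate, a single hour of recording runs to roughly 2.6 GiB, which is one reason long nighttime deployments are often configured to record in scheduled bursts rather than continuously.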
Don Ficken is a Missouri Master Naturalist and amateur astronomer who launched the Missouri chapter of Dark Sky after patrons in his community library telescope program expressed difficulty seeing the sky. He found the Eclipse Soundscapes Project through SciStarter, an online hub of citizen science projects, and participated in 2024 as a Data Collector. “It opened up a door for me because I never really thought about sound acoustics in this way,” Ficken said.
It occurred to Ficken that acoustics could help bolster Dark Sky Missouri’s efforts to study and conserve nighttime wildlife. One of these efforts, Lights Out Heartland, encourages homeowners and businesses to minimize artificial light usage in order to protect migrating birds from collisions due to disorienting bright lights. (A 2019 Cornell Lab study of 125 cities found that the St. Louis metro area ranks as the fifth-deadliest city for birds during the spring migration and the sixth-deadliest during the fall.)
“That’s kind of a bummer to talk about with people,” Ficken admits. To “bring a positive spin to the project,” Ficken hopes to use the AudioMoths to capture the birds’ nocturnal flight calls as they fly over locations like the Gateway Arch, Shaw Nature Reserve, and Missouri Botanical Gardens. “You would think that migrating birds would go up and just stay there,” Ficken said. “That’s not actually what happens. They drop altitudes, probably because of thermals, maybe because of magnetic shifts and things like that. They move up and down. And every time they shift formation, they have to make calls to get themselves reoriented. And this adds a whole new dimension that we can study through acoustics.”
The Lights Out Heartland initiative helps educate the public about the impact of artificial lights on migrating birds.
Dark Sky Missouri also hopes to take more general surveys of nature at night by placing AudioMoths in parks and natural areas. Even though parks are not typically open or staffed at night, the AudioMoths could help map the locations and movements of wildlife, creating talking points and learning opportunities for staff and visitors alike. “The trouble is that unless we reconnect people with the night, unless we get them excited about what’s out there at night, they won’t care about nature,” Ficken said. “If I can show them things like owls and bats and frogs and peepers and insects making noise, if I can show them how alive nature is at night, I think I might make them convert.”
Both initiatives will be piloted during the fall bird migration, with the goal of developing a framework for an expanded long term project. “The idea is we’re going to build this little network of people that know about AudioMoths and how they work,” Ficken said. “I’m drawing on people that already have expertise around birds. They’re really passionate. We’ll probably have a small team of maybe six to eight people doing this so we can just learn together, figure this out.”
While there are no opportunities for the general public to get involved in the projects just yet, Ficken said participatory science surrounding birds and light pollution is as abundant as it is important. He cites the Globe at Night project, in which participants helped determine that global light pollution is growing 10 percent each year.
Ficken says participatory scientists can benefit from the multisensory methods employed in the Eclipse Soundscapes Project. “I think that the thing that they should think about is really the door that acoustics would be opening for them,” he said. “In other words, you don’t have to just visually look at daytime. Think about sound. Think about night.”
Check out the recorded webinar in which special guest Don Ficken joined Eclipse Soundscapes to talk about how he is using AudioMoths for Nighttime Conservation. Watch the DarkSky webinar recording here.
Non-visual designer Lindsay Yazzolino shares her experience aboard the AstroAccess “AA2” Zero Gravity Flight
Listen to the audio interview or read the full transcript below.
Kelsey Perrett: Hi, I’m Kelsey Perrett with ARISA Lab. Today, I’m speaking with non-visual designer Lindsay Yazzolino. Lindsay is a consultant with ARISA Lab and the Eclipse Soundscapes: Citizen Science Project. She recently returned from a zero-gravity flight with AstroAccess, a project dedicated to promoting disability inclusion in space. Lindsay joined us to discuss what it’s like to experience zero gravity, how the crew’s experiments will impact the future of accessibility in space, and why it’s important to include people with disabilities in the design process from the start. And now, let’s hear from Lindsay.
Lindsay Yazzolino: I’m Lindsay Yazzolino, I am a non-visual designer, I have a background in cognitive neuroscience research, and I’ve always loved science. Growing up, I always knew I wanted to do something related to science, I was always really curious about how things worked. I’m also totally blind and I’ve been blind since birth. I have always, of course, been surrounded by this theme, I guess, of needing science to be non-visually accessible. And I’m just really passionate about, not only science accessibility, but also just making it not just accessible, but also really hands on and interesting and fun for people with disabilities, specifically for blind people. And making it accessible in a way where someone can just participate, not necessarily because they’re blind, but just to be able to participate in science in really cool and interesting ways that also happen to be hands on.
Kelsey Perrett: Lindsay says she first heard about the opportunity with AstroAccess when she was presenting at the SciAccess conference in 2019.
Lindsay Yazzolino: So I was at this conference, and I met a whole bunch of cool people, among them Anna Voelker and Sheri Wells-Jensen, who were both organizers of this conference. And turns out, they also were both organizers of what would become AstroAccess. AstroAccess, of course, is the organization that ran this really cool zero gravity flight that we’re talking about. So yeah, I found out about them through SciAccess and that’s how I knew to apply. I applied the first year that they were doing the first zero gravity flight, which was 2021. They were very clear that they wanted people who were really open about talking about their disabilities, and who are really passionate about creating accessibility in space travel, spaceflight, and all this stuff. So I applied the first year, and it was super competitive, and I didn’t get in. You know, it was really cool that there were so many qualified people that it would be this competitive. So then the next year, I applied again, and I got in. I was really excited, because I mean, zero gravity flight. I was just super curious, you know, what it would feel like to experience zero gravity. And of course, I believed in what the organization is doing. And, of course, I wanted to use my skills and my experience, and whatever I can bring to help increase accessibility in space, in spaceflight. So it was a combination of like, I want to help science and wow, I get to be in an airplane and fly around in zero gravity, because I love airplanes and I love thrill-seeking experiences. And so it was just like, a whole combination of coolness. And a lot of people have tried to explain, people get asked, what does it feel like to be in zero gravity? And everyone’s like, it’s really hard to explain. So one of my sort of goals when I was doing this was to try and be able to explain it as best as I could. And specifically, of course, me being totally blind, I was experiencing it totally non-visually.
And I remember that first initial feeling of the floor just kind of disappearing, and being like, Oh crap, there’s no floor. Because my instinct was to find the floor, because you know when you’re on Earth, and you’re tossed up in the air, you know that you’re going to end up having to fall back down onto the floor. And usually that’s not a very comfortable thing to happen. But of course, I knew it was zero gravity. But still I felt that need to sort of test what it feels like to be in this place where the laws of gravity are different. So in a way, I sort of felt like a baby. That’s the best I can explain, because it was this whole kind of relearning what happens when you move in this new environment. But after a few parabolas, I was just like, Okay, this is really fun. And at that point, well, I had to do my actual science experiments. But like, let’s face it, the really fun part was being in zero gravity. Actually, what I realized, the best way to describe it for me, is in some ways, it felt like instead of me moving within the plane, it felt like the plane was moving around me. And it would sort of feel like being on a hamster wheel where like, whatever you were on was sort of under you, like you’re actually propelling it under you, as opposed to feeling like I was, you know, upside down on the ceiling or sideways. So it was a really interesting sensation and nothing like anything I’ve experienced before.
Kelsey Perrett: While on board, the AstroAccess crew conducted a variety of scientific experiments which were intended to advance Universal Design in space travel.
Lindsay Yazzolino: You know, I mean, it’s really cool to go into zero gravity, but of course, we were also doing experiments. And the idea of these experiments was to increase our knowledge of how we can make spaceflight accessible, how we can actually design, on a practical level, how we can design spacecraft and space stations and all these things to be accessible. So there was a group of us who were blind or low vision on the flight. And we worked on a project where we developed a set of tactile graphics to help people orient non-visually while they’re in zero gravity. A few of us actually went to New York a few times to visit the New York Public Library, where there’s a whole accessibility program led by Chancey Fleet, and a whole bunch of tools for making tactile graphics. So we had tactile graphics and embossers, and we got the help of a couple people who are really expert designers. So they got to help us actually sketch out the graphics. We had all these tactile drawing boards. It was like a few blind people with tactile drawing tools, just collaborating, you know, brainstorming and coming up with ideas, just iterating on these designs, and coming up with the best designs that we could think of. So the most important thing we wanted to make sure people knew was which way is down, because if you think about it, you’re in zero gravity, you don’t know which way you’re facing, it’s really easy to get disoriented. And the idea is that you could reach out and touch any surface, whether it be walls, floor, or ceiling, and be able to feel which way is down. And then also, we had symbols to show which direction different emergency equipment was and also how far that equipment was, and whether it was on your same wall or across from you on the opposite wall. So this is a totally new system that we developed. And we wanted to do some initial testing to see how quickly and accurately people can read them while in zero gravity.
So we set up a few of us with a whole set of test graphics. During each parabola, you had to read them and call out what you thought they showed, and we recorded people’s observations.
Kelsey Perrett: I asked Lindsay what was it like to work on a team that was intentionally inclusive of people with disabilities, and whether this experience changed the way she thought about herself or her career as a scientist.
Lindsay Yazzolino: I mean, it’s always nice when you work with a group of people where you don’t have to deal with people being really paternalistic, or being way too hovering, you know, hovering over you, or thinking people — blind people, people with other disabilities — can’t, you know, handle themselves or do certain things or need to be led by people without disabilities. Of course, I think you and I both know that that’s a huge problem when that does happen. I mean, on a functional level, it just meant we could get more done. We could actually design some of the experiments that we wanted to do, that we could test the things we wanted to test, and that we could experience zero gravity in a way where we weren’t being restricted unnecessarily. Because that is always a thing, you know, to think about. I know, the word empowering is overused, we use that word a lot. But it was in the sense that, you know, we are the researchers, right, like, we’re the scientists, we’re the researchers. And so it felt like, honestly, it just feels the way it should, you know, to be in a group where inclusion and participation is expected. Yeah, it just feels like the way it should be. It’s definitely shifted how I feel about the proximity of myself to space travel. Before, it always felt like the idea of going into space was one of those things that, like you said, everyone kind of dreams about at some point. But it didn’t feel like a reality exactly. It kind of felt like, oh, this is a thing that other people do. And not even just because of being blind, but just because it’s just the thing that other people did. Very few people get to do it. So I feel like now I do feel closer to that reality. I feel like, especially now that there’s a lot more commercial space travel happening, you know, the idea of me thinking of myself going into space feels a lot more conceivable than it did before. And of course, making it accessible is a big part of that.
What we can do, our part in all this, is to prevent designers from, whether intentionally or not, just creating bad designs that don’t need to happen and unnecessarily excluding people from going to space. I do think that, you know, we have to see this as an ongoing thing. Like yes, we did this flight. Yes, it was awesome. But now there’s just a lot more work to be done. There’s a lot more science to be done. I feel like we’ve barely scratched the surface when it comes to starting to explore possible, you know, accessible features and inclusion in space travel, all this stuff. So this is like just the beginning. And I’m really looking forward to seeing how things progress and to being part of contributing to that. And I think that we just need everybody who can contribute their expertise and their knowledge, and their belief in us as people, you know, who can eventually go to space.
Solar Eclipse of August 21st, 2017 with lens flare and moon reflection.
A history of lessons learned and questions still to be answered
Solar eclipses are more than an exciting cosmic phenomenon; they have played a key role in helping humans understand the universe. By observing eclipses, scientists learned about the size and shape of the Sun, the Moon, and the Earth. Eclipses clued in early astronomers to the orbits of the celestial bodies and how they relate to one another. Copernicus’ theory of heliocentrism cemented the understanding that solar eclipses occur when the Moon passes in front of the Sun. And just like that, public perception of eclipses shifted from a frightful darkening of the skies to an opportunity to learn more about the cosmos.
Studying the Corona
One of the first great modern discoveries surrounding an eclipse occurred in 1868. French solar physicist Jules Janssen discovered a new element while observing the Sun’s chromosphere through a prism. Astronomers named the element helium, after Helios, the Greek god of the Sun. It would be more than 25 years before helium was discovered on Earth, but we now know it’s the second most common element in the universe.
Janssen was neither the first scientist nor the last to study the outer atmosphere of the Sun during an eclipse. A solar eclipse offers a unique chance for scientists to view the Sun’s corona. “Most of what we know about the corona is deeply rooted in the history of total solar eclipses,” Lina Tran wrote for NASA Goddard. “Before sophisticated instruments and spacecraft, the only way to study the corona from Earth was during a total eclipse, when the Moon blocks the Sun’s bright face, revealing the surrounding, dimmer corona.” An instrument called a coronagraph can mimic eclipse conditions on a telescope, but eclipses still remain the most authentic way to study the corona from Earth.
The Coronal Heating Problem
Scientists thought they discovered yet another new element in 1869 as they observed an eclipse through a spectrometer. A spectrometer helps scientists determine which elements compose a band of light, but the green line that appeared in 1869 didn’t correspond to any known element. Scientists briefly called the new “element” Coronium, but Swedish astronomer Bengt Edlén later determined that the element was superheated iron.
The extreme temperature of the iron indicates that the corona is 2 million degrees Fahrenheit — nearly 200 times hotter than the surface of the Sun. This phenomenon is known as the “Coronal Heating Problem.” The layers of the Sun typically become cooler and less dense as they move outward from the core, so scientists are not sure why the corona would be significantly hotter than the surface below it. Heliophysicists believe this may be caused by wave heating, or perhaps nanoflares, but further study of the corona is necessary before we know for certain.
Solar Winds
The corona is full of other fascinating features. Eclipses give astrophysicists a good look at the behavior of loops, streamers, and coronal mass ejections. They are also an opportunity to learn about solar winds: charged particles that emanate from the corona. Solar winds are important in that they define the boundary of our solar system and protect us from cosmic radiation. On the downside, they can disrupt our satellite and GPS-based communications. During an eclipse, researchers can take more accurate temperature readings of solar winds. Interestingly enough, solar wind temperatures do not seem to fluctuate in tandem with the solar cycle. It’s another mystery that may require more solar eclipses to solve.
The Earth’s Ionosphere
Eclipses don’t just tell us about the Sun. We can also learn more about our own atmosphere here on Earth. The ionosphere is the upper level of the Earth’s atmosphere. During the daytime, the ionosphere is “charged” because “energy from the sun and its corona feed extreme ultraviolet photons into this area, creating free electrons and ions,” physicist Phillip Erickson told Slate. The ionosphere is less active at night. An eclipse is like a light switch for the ionosphere, turning the charge off and back on again as the Moon passes in front of the Sun. This allows scientists to study changes in real-time, and can provide clues about how the ionosphere affects communications and space weather.
Studying Other Structures
Eclipses also offer insight into other structures in the solar system. Some scientists have used the occasion of an eclipse to take more accurate thermal readings of Mercury. Others have embraced eclipses as a model that makes our stratosphere more “Mars-like.” During an eclipse, UVA and UVB levels in our upper atmosphere more closely resemble those on Mars, allowing researchers to test microbial responses to Mars-like conditions.
An eclipse also led to one of the most important “proofs” in modern science — a test of Einstein’s Theory of Relativity. Einstein’s theory posited that light bends as it passes by a massive body (like the Sun). In 1919, researchers observed that the apparent positions of stars near the Sun shifted during a total solar eclipse, just as the theory predicted. It appeared that Einstein guy was right all along.
Studying Life on Earth
These days, instruments like the Parker Solar Probe are teaching us about the Sun (and other structures) in ways we never imagined. But that doesn’t mean the days of learning from eclipses are through. Eclipses present a unique opportunity to learn about changes in our solar system. And some of those changes occur right here on Earth.
During the Eclipse Soundscapes: Citizen Science Project, we’ll be studying how life on Earth responds to those changes. Anecdotal evidence suggests that we could experience altered animal behaviors and sounds (for example, nocturnal animals calling during the eclipse, or diurnal animals producing a “false dawn chorus” as the light re-emerges). We’ll be taking soundscape recordings before, during, and after the eclipse. Then, we’ll analyze the recordings for any patterns or anomalies. Interested in joining us? Sign up here to be a part of the project!
What can we learn by keeping our ear to the cosmos?
Anyone who remembers the iconic Ridley Scott film “Alien” (who could forget?) may recall the tagline “in space, no one can hear you scream.” And while sci-fi films are frequent fodder for scientific debate, this assertion is widely acknowledged as true. Because there is no air in space, there is nothing to conduct the sound waves, and therefore no vibrations that are perceptible to the human ear. But that’s not to say it’s impossible to hear the sounds of space. With a little creativity and a lot of scientific ingenuity, astrophysicists have developed fascinating ways for us to listen to the cosmos.
In most cases, the listening process only requires a little bit of translation. Mechanical sound waves might not be able to travel through space, but electromagnetic waves can. Scientists use instruments to collect radio waves, microwaves, infrared rays, optical rays, ultraviolet rays, X-rays, and gamma-rays, then convert them into audible sound waves through a process known as sonification.
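The core idea of sonification — mapping data values to audible parameters like pitch — can be illustrated with a toy sketch (this is not NASA’s actual pipeline; the `sonify` helper and its parameters are invented for this example). It maps each value in a series to a tone between two frequencies and writes the result as a WAV file:

```python
import math
import struct
import wave

def sonify(values, out_path="sonified.wav", rate=44100, note_sec=0.25,
           f_lo=220.0, f_hi=880.0):
    """Map each data value to a pitch between f_lo and f_hi (linear
    scaling) and write the sequence of tones as a mono 16-bit WAV."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for flat data
    frames = bytearray()
    for v in values:
        freq = f_lo + (v - lo) / span * (f_hi - f_lo)
        for i in range(int(rate * note_sec)):
            # Half-amplitude sine wave, packed as little-endian 16-bit
            sample = int(32767 * 0.5 * math.sin(2 * math.pi * freq * i / rate))
            frames += struct.pack("<h", sample)
    with wave.open(out_path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(bytes(frames))

# Sonify a toy brightness curve: higher values become higher pitches.
sonify([3, 1, 4, 1, 5, 9, 2, 6])
```

Real sonification pipelines are far richer — mapping brightness to volume, position to stereo pan, wavelength to timbre — but the principle is the same: a systematic translation from measurement to sound.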
NASA researchers were considering the possibilities of sound in space as early as 1977, when the Voyager probes were launched. In the event that the interstellar probes encounter intelligent extraterrestrial life, NASA placed a “Golden Record” aboard the spacecraft that bears the auditory marks of life on Earth: ocean waves, bird song, greetings in 55 languages, and even a playlist of multicultural music through the ages. But scientists were also thinking about what sounds the Voyagers could receive when they installed a Plasma Wave Subsystem onboard each probe.
In 2012, Voyager 1 crossed the boundary of the heliosphere. Not long after, it sent back an amazing piece of data: the vibrations of dense plasma, or ionized gas, rumbling in interstellar space. These eerie whistles are helping scientists learn about the density in this strange space beyond our solar system.
Plasma is frequently used as a medium for scientists to detect space sounds. Just as sound waves can move grains of sand on a plate, similar waves can cause the plasma in the Sun to rise and fall. This is how scientists learned that the Sun itself rings like a bell. Instruments like the HMI and MDI observed movements in solar plasma, and the team at Stanford’s Solar Center created the Sonification of Solar Harmonics, or SoSH, Project to convert these observed solar vibrations into audible sounds.
Plasma wave instruments were also placed on NASA’s planetary explorers, like the Cassini probe to Saturn and the Juno probe to Jupiter. Cassini’s Radio and Plasma Wave Science Instrument has picked up a number of fascinating signals, including radio emissions from Saturn and its moons and an impressive lightning storm. Cassini also carried a microphone aboard the Huygens probe which recorded sound as it descended to the Saturnian moon of Titan.
The Juno probe captured data as it descended into Jupiter’s magnetosphere, the largest structure in our solar system. It picked up a series of electromagnetic waves trapped in a cavity within Jupiter’s magnetic field. A couple of months later, the instrument received radio signals from the planet’s notoriously intense auroras.
Not all sounds captured in space are a result of electromagnetic waves. Direct impacts can cause mechanical vibrations that are audible to the human ear. For instance, when Stardust-NExT encountered the comet Tempel 1 in 2011, its Dust Flux Monitor recorded the vibrations of dust particles pelting the craft.
In 2019, the InSight lander placed a highly sensitive seismometer on Mars which has collected the sounds of quakes and Martian winds. Inspired by the seismometer’s success, NASA opted to send a set of microphones onboard the Perseverance rover, which landed on the red planet in February of 2021. The Entry, Descent, and Landing mic recorded Percy’s successful landing, while the SuperCam mic sends back the mechanical sounds from the rover and the rocks and minerals it studies.
Other space sounds are sonifications from data about light. This has allowed NASA to glean sounds with some of its most impressive telescopes, including the Chandra X-Ray Observatory, the Hubble Space Telescope, and the Spitzer Space Telescope. These telescopes create images by capturing x-ray, infrared, and optical light. Sonification converts that data into audio in which the pitch and volume reflect the concentration and intensity of the light. As a result, we’re able to hear celestial objects like supernovas, nebulas, and even black holes. (In case you’re wondering, black holes sing in the key of B-flat).
But why go through all this trouble to recreate the sounds of space? The answers are simultaneously complex and very simple. In the case of plasma wave instruments, scientists can learn a great deal about the interactions and dynamics between objects in our solar system. The “sounds” that come from these studies are just a fun after-effect. And science should be fun. Musical composers, video game designers, and other multi-media artists have latched onto these space sounds for all sorts of creative endeavors. Who’s to say complex scientific data shouldn’t be accessible to the masses?
Accessibility is another important piece of the puzzle. Tools like sonification broaden the field of astrophysics so it can be studied and enjoyed by people who are blind or low vision. But it also presents data in a multi-sensory form that makes learning more accessible for everyone.
Listening with Eclipse Soundscapes
For more fun and accessible space science, download the Eclipse Soundscapes Mobile Application, which allows you to hear (and feel) a total solar eclipse. You can also sign up to join our upcoming Eclipse Soundscapes: Citizen Science Project, where we’ll be studying how eclipses impact the soundscapes here on planet Earth. It’s just another way to keep our ears open and learn about our universe!
The first thing to understand about auditory learning is that it is wrapped in a cloak of myth.
In the early ‘90s, the idea of different “learning styles” was popularized by the VARK questionnaire. The movement suggested that all humans fall into one of four categories: visual learners, auditory learners, reading/writing learners, or kinesthetic learners. The trend took off. Not only did pedagogues add more categories to the list, but learners began identifying with certain styles, and educators began teaching to specific styles. It wasn’t until the last decade or so that the concept of “learning styles” was debunked.
While the research does not support different “learning styles,” it’s obvious that learners do have different preferences and abilities. And if a learner has a disability that makes it physically or cognitively impossible to learn through a particular method, it’s important to consider alternate modalities.
So, even if there’s no such thing as “auditory learners,” auditory learning remains an important tool in any educator’s toolbox.
Hearing is a powerful sense that can enrich our learning in myriad ways. Because ARISA Lab’s group of Eclipse Soundscapes projects will use auditory techniques to help participants learn about eclipses, in this blog we’ll explore how we hear, the fascinating link between hearing and memory, and the benefits of incorporating sound into education.
What happens when we hear a sound?
Receiving Sound
When sound waves enter our ears, they travel through the ear canal to vibrate our eardrums. The bones in our middle ear amplify these vibrations, causing fluid inside our cochlea to ripple. This in turn stimulates tiny hair cells in our inner ears. These hairs convert the sound waves into an electrical signal, and send those signals to the auditory nerve. The auditory nerve passes through the auditory cortex of our brain, located in the temporal lobe. Our brain then interprets those signals.
This video from the National Institute on Deafness and Other Communication Disorders gives a great overview.
Perceiving Sound
Most brains are adept at interpreting sounds, especially when it comes to human speech. They’re also great at filtering out background noise — sounds we do not need to process in the moment.
How the brain perceives and interprets heard sounds takes place at a number of levels.
Some perception is reflexive (like a loud sound that causes us to jump)
Some perception happens in the auditory cortex
Some perception happens in other areas of the brain
One part of the brain may recognize a memorized sound, like your mother’s laugh.
Another part of the brain may prepare a voluntary response to a question.
Yet another part of your brain might have an emotional reaction to the content of the sound.
This video from S. Blatrix and R. Puhol, shows the journey of sound through the auditory pathway.
All of this depends on our level of alertness. If we are asleep, our ears still work. Sound may cause reflexive movement, but the other parts of the brain involved in perceiving sound remain inactive.
How do we learn and remember through sound?
Hearing and Remembering
To understand learning through sound, it’s important to consider the unique link between hearing and memory. As Krause and White-Schwoch suggest in Unraveling the Biology of Auditory Learning: A Cognitive-Sensorimotor-Reward Framework, “the precision of automatic sound processing in the brain is linked to cognitive skills such as attention and working memory.”
Humans can hold a large amount of auditory information for about 3-4 seconds, during which we can “replay” the sound in our minds. This is known as “echoic memory.” Repetition, or repeated exposure to a set of sounds, can help us encode information in our long-term memory, where it can be retrieved later. Attention is also at play here: while our working memory processes sounds, our ears are still paying attention to all the new sounds coming in.
Some research suggests that listening is exceptionally good exercise for the brain. Older adults experience rapid cognitive decline when they are not able to hear well. Conversely, studies show that music training may improve memory and linguistic expression. Music reaches parts of our brain that are responsible for attention, emotion, and procedural memory. Patients with Alzheimer’s can often remember music from their past, and music therapy may activate their brains and improve communication. Such evidence suggests a strong music-memory connection. This connection is the reason why you still remember the words to your favorite high school song (or that terrible chewing gum jingle) years after you last heard it. If remembering is one key to successful learning, is it possible that auditory modalities like music help us learn better?
A Case for Multi-Sensory Learning
The science on the specific benefits of auditory learning has not yet broken free of the confines of “learning styles.” There’s no reason to believe that auditory learning, on its own, is more or less effective than any other sensory form. Instead of focusing solely on auditory learning, educators should consider a multi-sensory approach.
The human brain is uniquely evolved to thrive in a multi-sensory environment. We are designed to process our world through sight, sound, touch, smell, and taste. One popular paper on the “Benefits of Multi-Sensory Learning” stated that “multi-sensory training protocols can better approximate natural settings and are more effective for learning.” The hope is that by engaging different areas of the brain, multi-sensory learning could help improve neural connectivity.
As ARISA Lab Education Director MaryKay Severino put it, the shift from “learning styles” to multimodal learning won’t necessarily change how educators plan activities. Instead of using multi-sensory activities to benefit individual styles, the aim is to use multi-sensory activities to benefit every learner. The change, Severino said, “is how we explain learning to learners themselves. If a learner understands that learning with as many senses as possible will support their understanding, there could be more engagement.”
If learners are engaged and educators are encouraging understanding, the brain will take care of the rest.