Behavioural Biology

The art of hearing

6 April 2022, by Theresa Dirtl
We hear the sound of a moving object and automatically predict the direction in which it is travelling. What is the process behind this survival tool? The Dynamates project is conducting the first comparative study on dynamic auditory predictions with marmoset monkeys and humans.
In the Dynamates project, cognitive biologist Michelle Spierings is working with marmoset monkeys to find out how they predict sound occurrences. © Pixabay

Every time you cross a street you look to the left and to the right to see if a car is coming and if it is safe to cross. But in fact, you do not only use your eyes; you use your hearing as well. "If we listen to a sound source, we automatically try to predict where it is going to occur next", says Michelle Spierings of the Department of Behavioural and Cognitive Biology. "When, for example, a car is moving from left to right, we keep predicting that the sound is also moving further to the right because of its travel speed and travel direction." Whether it is busy traffic in a city or noises in the forest, this dynamic decision-making in an auditory context is based on the same survival principle.

Change of direction

Sounds might change direction, and this is what scientists call 'change points'. "When a sound changes direction, you would usually still assume that it will continue on its original path, based on all the information you have so far. But if you use only the most recent information after the change point, you would predict the next sound source quite differently", explains Michelle Spierings. "In our project Dynamates, we want to understand this process behind detecting and predicting sound occurrences. My colleague Ulrich Pomper and his team are conducting human studies, while my team and I are working with marmoset monkeys, asking a more evolutionary question: What are the differences between the sound localisation processes in humans and in non-human primates? Do they use the same information or do they predict sounds differently?"
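The two prediction strategies Spierings contrasts can be sketched in a few lines of code. This is purely an illustrative toy, not the project's actual model: one predictor extrapolates from the whole trajectory, the other only from the most recent steps, and they diverge after a change point. All positions and numbers are invented for the example.

```python
# Illustrative sketch (not the Dynamates model): two ways to predict the
# next position of a moving sound source. Positions are arbitrary azimuth
# units; the trajectory and window size are made up.

def predict_full_history(positions):
    """Extrapolate using the average step over the whole trajectory."""
    steps = [b - a for a, b in zip(positions, positions[1:])]
    avg_step = sum(steps) / len(steps)
    return positions[-1] + avg_step

def predict_recent_only(positions, window=2):
    """Extrapolate using only the most recent steps (i.e. after a change point)."""
    recent = positions[-(window + 1):]
    steps = [b - a for a, b in zip(recent, recent[1:])]
    avg_step = sum(steps) / len(steps)
    return positions[-1] + avg_step

# A source moves rightwards (+1 per step), then reverses at a change point.
trajectory = [0, 1, 2, 3, 2, 1]

print(predict_full_history(trajectory))  # still biased toward the old rightward motion
print(predict_recent_only(trajectory))   # follows the new leftward motion
```

The full-history predictor still expects the sound slightly to the right of its last position, while the recent-only predictor continues the new leftward motion — the divergence the project probes in both humans and monkeys.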

In the human studies, Ulrich Pomper from the Department of Cognition, Emotion and Methods in Psychology plays different sounds to the participants and asks them where they think the next sound is going to occur. He combines these experiments with EEG studies measuring the brainwaves of the participants to see when the brainwaves spike and what kind of information the participants are using. Another project partner is Robert Baumgartner from the Acoustics Research Institute of the Austrian Academy of Sciences, who specialises in psychoacoustics and experimental audiology. "Robert is a great computational modeller who, within the project, predicts what the humans should do in the sound experiments and what kind of information they are using when they make these predictions", says Spierings.

Interdisciplinary Dynamates project

Dynamates is conducting the first systematic comparative study on dynamic auditory predictions in space and time in both human and non-human primates. The project brings together an interdisciplinary research team of experts in computational neuroscience and psychoacoustics (Robert Baumgartner, psychoacoustics and experimental audiology, ÖAW; middle), human EEG and sensory processing (Ulrich Pomper, Department of Cognition, Emotion, and Methods in Psychology, University of Vienna; right), and comparative cognition between animal species (Michelle Spierings, Department of Behavioural and Cognitive Biology, University of Vienna; left). (© Dynamates)

Marmoset monkeys joining the experiments

"We think it is very likely that other species also share this ability to predict sound occurrences, because they also live in environments where they have to track sounds in order to survive", explains the behavioural biologist, who worked on language perception in birds before joining this project. "But so far, not much is known about the way animals use sound information and whether they use it as humans do. And that is exactly what my team and I intend to find out. We set up experiments in which we, so to speak, 'ask' the marmoset monkeys where they think the sound was and where they think the sound is going to be next. Then the modellers working with Robert Baumgartner take all these data and compare them with the human data to see which kind of information humans and monkeys use."

Differences in hearing

Although marmoset monkeys hear in a much higher frequency range than humans, the mechanism behind hearing is in principle alike. "We try to keep the experiments with the monkeys and the humans as similar as possible; we only change the sounds so that they are within the monkeys' optimal hearing range", says Spierings. "For us, it is really important that the monkeys are happy to join the experiments freely; we never force them to do anything. We ask them politely – with a banana, for example – to come to our experiment. If not, that is ok", smiles Spierings, whose PhD project at Leiden University focused on aspects of language perception shared by humans and other animals.

Sound experiment with special speaker set-up

Spierings and her team arranged a special array of loudspeakers. The monkeys sit on a perch when the sounds are played to them. "They can move around as they like, but we only start the experiment when they sit there, in order to estimate the position of their ears. As we cannot talk with the monkeys, we work with buttons." The principle at the beginning is quite simple: the monkeys should press the button closest to the speaker the sound comes from – if the answer is correct, they get a food reward. Then the biologists make it a bit more difficult: they play two or three sounds one after another, and the monkeys have to press the button closest to the location of the last sound only.
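The trial logic just described can be summarised in a short sketch. This is a hypothetical illustration only: the button positions and azimuth values are invented, not taken from the actual set-up.

```python
# Hypothetical sketch of the trial logic: the monkey is rewarded when it
# presses the button closest to the speaker that played the LAST sound.
# Button azimuths (in degrees) are assumed for illustration.

BUTTON_POSITIONS = [-60, -20, 20, 60]

def closest_button(sound_azimuth):
    """Return the index of the button nearest to the sound's azimuth."""
    return min(range(len(BUTTON_POSITIONS)),
               key=lambda i: abs(BUTTON_POSITIONS[i] - sound_azimuth))

def trial_rewarded(sound_sequence, pressed_button):
    """Only the location of the last sound in the sequence counts."""
    return pressed_button == closest_button(sound_sequence[-1])

print(trial_rewarded([-50, 10, 55], 3))  # True: 55 degrees is nearest the 60-degree button
print(trial_rewarded([-50, 10, 55], 0))  # False: the first sound's location does not count
```

The point of the multi-sound version is exactly this last-sound rule: it forces the animal to keep updating its estimate rather than respond to the first sound it hears.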

"Once they have learned that, we can make the task more complex and actually ask them to track a sound source that is travelling across the speakers, including change points. They should then press the button closest to the location where they last heard the sound", says Michelle Spierings. "We want to see how they are processing this information and how they make their decisions. There is no good or bad result; everything the monkeys do is fine."

Happy and comfortable monkeys

There are no study results to present yet, as data collection in the human studies is ongoing. Before the sound experiments can start, the monkeys have to settle in at their new home in the new University of Vienna Biology Building in St. Marx. "After moving, we have to wait until the monkeys are happy and comfortable; that is most important", says Michelle Spierings, who is looking forward to working with the monkeys in the new set-up. She concludes: "Most of the existing knowledge about perceptual decision-making comes from studying visual tasks. With our project, we hope to broaden the knowledge regarding this process in the auditory modality, which has an important function at least for survival and social behaviour." (td)

Rudolphina: Our current Semester question is "What shapes human behaviour?". As a behavioural biologist, what is your answer to the adapted question "What shapes animal behaviour?"

Michelle Spierings: The behaviour of animals is shaped by many different environmental and evolutionary pressures. Like many features, behaviour is strongly shaped by natural and sexual selection. But within and across social groups, learning also has a strong influence on behaviour. Animals can learn certain behavioural patterns by themselves, or learn them from one another and share important information about successful behavioural strategies.

Michelle Spierings is an assistant professor working on language perception and dynamic decision-making in different bird species and marmoset monkeys. From 2016 until 2020, she worked as a postdoctoral researcher in Tecumseh Fitch’s lab studying language perception in pigeons, marmosets and humans. Currently, Michelle Spierings is working in the Dynamates project funded by a Young Investigators Research Grant by the Austrian Science Fund (FWF).