Bats can anticipate their prey’s movements by building predictive models on the fly, says new study
Despite having better night vision than humans, some bats prefer to rely on their acute hearing when hunting for food. These nocturnal creatures produce sounds and rely on the echoes that return when those sounds bounce off objects to navigate in the dark and find their prey. This ability, known as echolocation, is of particular interest to scientists because visually impaired people can also be trained to use echolocation to improve their mobility. However, much is still unknown about the processes that support auditory object tracking.
In a newly published study, Johns Hopkins University (JHU) researchers explored the strategies bats employ to hunt at night using only their hearing. They hypothesized that echolocating bats build internal prediction models from dynamic acoustic stimuli to anticipate the future location of moving auditory targets. Using mathematical models, the researchers were able not only to quantify a bat’s sonar beam aim and echolocation call rate, but also to distinguish the animal’s predictive hunting behaviors from its non-predictive ones. Their findings appeared in the journal PNAS.
How bats track objects using sounds
Unlike predators that rely on vision, bats use an active audiomotor feedback system to locate and track their prey. They do this by emitting ultrasonic signals to probe their surroundings and listening to the echoes, or “acoustic snapshots,” that return from objects. According to earlier studies, bats can also modify the duration, direction, timing, intensity and spectral content of the signals they produce based on the information carried by the returning echoes. This flexibility is what allows them to react to changes in their environment and successfully track the trajectory of their prey.
A good example of a bat species that uses these abilities is the big brown bat (Eptesicus fuscus), a type of vesper bat widely distributed throughout North and South America and the Caribbean. Big brown bats modulate the frequency of their echolocation signals to hunt for flying insects. As these hunters close in on their prey, they increase the rate of their sonar calls and lock their sonar beam aim onto their target about 300 milliseconds (ms) before making contact. Researchers believe that these bats use the discrete echo snapshots returned by their ultrasonic signals to compute the three-dimensional position of moving targets.
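To make that last point concrete: each echo snapshot encodes the target’s range in the round-trip travel time of the call, and its bearing in the echo’s arrival direction. The sketch below is not taken from the study; it is a minimal illustration, assuming a speed of sound of about 343 m/s and hypothetical azimuth and elevation angles, of how a single snapshot could be turned into a 3D position.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C (assumed value)

def target_position(echo_delay_s, azimuth_rad, elevation_rad):
    """Convert one echo 'snapshot' into a 3D position relative to the bat.

    echo_delay_s  : time between emitting the call and hearing the echo
    azimuth_rad   : horizontal arrival angle of the echo (hypothetical input)
    elevation_rad : vertical arrival angle of the echo (hypothetical input)
    """
    # Sound travels to the target and back, so one-way range is half the trip.
    range_m = SPEED_OF_SOUND * echo_delay_s / 2.0

    # Spherical-to-Cartesian conversion: x ahead, y to the left, z up.
    x = range_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = range_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)

# Example: an echo returning after 17.5 ms from slightly above and to the right.
print(target_position(0.0175, math.radians(-10), math.radians(5)))
```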
The JHU researchers, however, were interested in the strategies bats use to track and intercept erratically moving prey when the sensory information they rely on is interrupted by stationary objects, such as trees. To find out, the researchers developed an ethologically inspired behavioral paradigm to test their hypothesis that echolocating bats depend on internal prediction models to anticipate where their prey is headed. And because these models are so robust, the researchers believe that bats keep tracking even when their targets temporarily disappear behind objects that block target-focused echoes.
Bats use predictive computation to anticipate the future location of auditory targets
Bats that hunt free-flying prey must contend with several challenges: brief interruptions in their echo snapshot feedback, acoustic and neural delays, abrupt changes in their target’s trajectory and temporary occlusions. When bats produce ultrasonic signals, it takes time for the sonar call to travel to an object and for the echo to travel back. Their brains also need time to process echo features and trigger an appropriate motor response. All in all, researchers estimate these delays add up to over 100 ms for each echo snapshot, which is enough time for the bat’s prey to gain distance or change course.
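The acoustic part of that lag follows directly from the physics of sound, and the rest comes from processing time. The short sketch below, using an assumed neural delay and an assumed insect flight speed rather than figures from the study, shows how the pieces add up to over 100 ms and how far the prey can travel in that window.

```python
SPEED_OF_SOUND = 343.0   # m/s (assumed)

def reaction_gap(target_range_m, neural_delay_s, prey_speed_mps):
    """Estimate how far prey moves before the bat can act on one echo.

    Everything except the round-trip physics is an illustrative assumption.
    """
    acoustic_delay = 2.0 * target_range_m / SPEED_OF_SOUND  # call out + echo back
    total_delay = acoustic_delay + neural_delay_s           # snapshot-to-response lag
    return prey_speed_mps * total_delay                     # distance prey covers

# A target 3 m away, ~90 ms of assumed neural/motor processing, prey at 1 m/s:
# acoustic delay ~17.5 ms, total ~107 ms, prey moves roughly 0.11 m.
print(reaction_gap(3.0, 0.09, 1.0))
```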
But the fact that bats are almost always successful in catching their prey led the researchers to think that these predators accumulate information about target motion to build internal prediction models of insect trajectories. Using these internal models, bats extrapolate the future position of their target and compensate for the acoustic and neural delays, as well as for temporary blockages. The JHU researchers confirmed this in their model species, the big brown bat, which uses this strategy to keep track of its prey even when the insect is hidden by an occluder during part of its trajectory.
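One simple way to picture such an internal model (as a rough stand-in, not the researchers’ actual mathematics) is a constant-velocity estimate fitted to the most recent echo snapshots and extrapolated forward. Because the estimate is built from accumulated history, it still yields a predicted position while an occluder hides the target and no new snapshots arrive.

```python
def predict_position(snapshots, lookahead_s):
    """Extrapolate a target's future position from timed echo snapshots.

    snapshots   : list of (time_s, (x, y, z)) observations; occluded intervals
                  simply contribute no entries.
    lookahead_s : how far beyond the last snapshot to extrapolate.

    A constant-velocity model, used here only as a stand-in for whatever
    internal model the bat actually builds.
    """
    (t0, p0), (t1, p1) = snapshots[-2], snapshots[-1]
    dt = t1 - t0
    velocity = tuple((b - a) / dt for a, b in zip(p0, p1))
    return tuple(p + v * lookahead_s for p, v in zip(p1, velocity))

# Snapshots roughly every 100 ms; the gap between 0.2 s and 0.5 s mimics an occluder.
history = [
    (0.0, (3.0, 0.50, 0.2)),
    (0.1, (2.8, 0.45, 0.2)),
    (0.2, (2.6, 0.40, 0.2)),
    (0.5, (2.0, 0.25, 0.2)),  # first echo after the target re-emerges
]
print(predict_position(history, 0.1))  # position expected 100 ms from now
```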
The researchers also found that when its prediction fails because of a sudden change in its prey’s velocity, the big brown bat rapidly adapts by increasing its sonar call rate. This adjustment in sonar behavior allows it to update its internal models of auditory target motion and track its evasive prey with ease. These findings collectively demonstrate that echolocating bats collate information from echo snapshots over time to predict the movement of their prey and ensure capture despite conditions of uncertainty.
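Again as an illustrative sketch rather than the study’s model, that adjustment can be pictured as a feedback rule: the farther the latest echo lands from the predicted position, the shorter the interval before the next call, down to some physiological floor. All numbers below are made up for illustration.

```python
def next_call_interval(prediction_error_m, base_interval_s=0.1,
                       min_interval_s=0.02, error_threshold_m=0.3):
    """Shorten the gap between sonar calls when prediction error grows.

    All thresholds and intervals are illustrative, not measured values.
    """
    if prediction_error_m <= error_threshold_m:
        return base_interval_s          # prediction is on track; keep cruising
    # The worse the miss, the faster the bat samples, down to a floor.
    scale = error_threshold_m / prediction_error_m
    return max(min_interval_s, base_interval_s * scale)

print(next_call_interval(0.1))   # small error -> 0.1 s between calls
print(next_call_interval(0.6))   # large error -> 0.05 s between calls
```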
“Just [as] a tennis player needs to find out when and where they will hit the ball, a bat needs to anticipate when and where it will make contact with the insect it’s hunting,” explained Cynthia Moss, the study’s senior author. “The insect is flying. The bat is also flying. In this very rapidly changing environment, if the bat were to just rely on the information it got from the most recent echo, it would miss the insect.”
Visually impaired individuals can also rely on a similar strategy and be trained to use echolocation. For instance, a recent study published in the Proceedings of the Royal Society B reported that blind people who have become expert echolocators are able to judge the location of objects in their immediate surroundings by clicking their tongues and listening to the echoes that bounce back.
That study not only highlights the importance of furthering echolocation research, but also reveals the adaptability of the human brain. Specifically, its researchers found substantial evidence that in the absence of a particular sense (e.g., sight), the brain regions responsible for processing stimuli for that missing sense adapt to process other types of input, such as sound and touch. This also supports an emerging notion that the human brain is organized by task rather than by sensory modality.