For Dolphins, Echolocation May Be More Like ‘Touching’ Than ‘Seeing’
NEWS | 11 October 2025
It’s midnight in a pitch-dark parking lot. Trying to unlock your car, you fumble and drop the keys. You squat down and run your hand across the invisible pavement. To the left you feel a firm, rubbery tire. Reversing course, you pass over jagged pebbles and papery leaves. Finally your fingers discover—and instantly close around—a notched piece of metal. This kind of tactile exploration may be the closest we can get to imagining the experience of dolphin echolocation, say the authors of a study on dolphin brains that was recently published in PLOS One.

People often imagine echolocation as “seeing” with sound—experiencing auditory signals as a world of images like the ones our brains typically create from light perceived by our eyes. Like sonar devices, which turn sonic waves into visual representations, echolocators emit sounds and then decode spatial and textural information in the echoes that bounce back. And when Russian scientists inserted electrodes into the heads of dolphins and porpoises in the 1970s and 1980s, they reported detecting brain activity in the visual cortex while the animals heard sounds.

“It made a neat little story because you have visual and auditory [brain regions] right next to each other,” says Lori Marino, a neuroscientist and president of the Whale Sanctuary Project, who was not an author of the new study but is mentioned in its acknowledgments section. She adds, however, that thanks to today’s more precise technology, “the whole [research] landscape is changing.” Although we still can’t translate echolocation perfectly into human terms, the new findings suggest a better metaphor: “touching” with sound.
Dolphin echolocation works differently in the brain than human echolocation, which, for those who learn the skill, is processed primarily in the visual cortex. To pinpoint the neural mechanisms behind the dolphin variety, the researchers compared preserved brains from three echolocating dolphin species with that of a sei whale, which is closely related but doesn’t echolocate. They measured the diffusion of water molecules along nerve fibers—like cars driving along a highway, as Marino puts it—to better understand which parts of the brain interact in living dolphins and in sei whales.

Contrary to the earlier Russian research, there seemed to be nothing exceptional occurring in the dolphins’ visual cortex. Instead an entirely different stretch of neural highway caught the researchers’ attention: the one linking the inferior colliculus to the cerebellum. In dolphins, as in humans, the inferior colliculus is a relay point for auditory input after it enters the ear, and the cerebellum is where information from the senses and bodily movements gets combined for rapid calculation of the body’s next best move.

“Anytime you need to move quickly, decisively and without consciously deliberating, your cerebellum comes alive,” says Peter Cook, a comparative neuroscientist at New College of Florida and senior author of the new study. He and his colleagues found a strong connection between these two brain structures in the dolphins but not in the sei whale. So just as touch does in humans, echolocation seems to rely heavily on the cerebellum’s precise motor control and the tight feedback loop it promotes between sensation and motion—and less on the visual cortex. “Every time you move, you get different feedback,” Cook says. “And every time the feedback changes, you change how you’re moving. It’s like this constant circle of sensory, motor, motor, sensory.” This process makes sense to lead author Sophie Flem, a master’s student at New College of Florida.
If you need to constantly fine-tune your movements to home in on prey, Flem says, “it does seem intuitive that something like a cerebellum would really help.”

And there’s another way in which echolocation seems more similar to touch than to vision: a dolphin’s sonar beam is far narrower than our visual field. Whereas we take in 180 degrees at a glance, dolphins move their beam around and build spatial understanding gradually—like a human groping for dropped keys in the dark.

Still, it would be hubris to presume we know for certain what an animal’s experience of echolocation is actually like. “There may be things other animals do for which there is no model in our sensory system,” Marino says. “We just have to realize that.”
Authors: Sarah Lewin Frasier and Cody Cottier