All the wellness news you need to know today, including personality traits that make you eat healthier, vitamin gummies, and human echolocation.
While the existence of human echolocation is well documented, the details of the underlying acoustic mechanisms have been unclear. A new study published in PLOS Computational Biology provides the first in-depth analysis of the mouth clicks used in human echolocation. The researchers also used the recordings to propose a mathematical model that could be used to synthesize mouth clicks made during human echolocation.
"Exploring the potential of human echolocation: Visually impaired people use the pitch, loudness and timbre of echoes to locate nearby objects."
Human echolocation shares some similarities with animal echolocation, though people use the skill to compensate for the loss of their sight, rather than as an additional sense.
A better understanding of echolocation may improve methods for teaching the technique to people who have lost their sight later in life, and yield additional insights into human hearing.
It makes sense: Humans have been perfecting sonar for more than a century, but evolution has been honing echolocation for much, much longer.
The detector picks up the echolocation calls emitted by bats and translates them to a frequency the human ear can hear.
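One common way such detectors do this translation is frequency division: the ultrasonic call is shifted into the audible range by dividing its frequency by a fixed factor. The sketch below illustrates the principle only; the division factor of 10 and the function name are illustrative assumptions, not details from the article.

```python
# Minimal sketch of the frequency-division principle used by some bat
# detectors: an ultrasonic call is made audible by dividing its frequency
# by a fixed factor. The factor 10 is a hypothetical, commonly cited value.

DIVISION_FACTOR = 10  # illustrative assumption; real detectors vary


def to_audible(call_hz: float, factor: int = DIVISION_FACTOR) -> float:
    """Translate an ultrasonic call frequency into the audible range."""
    return call_hz / factor


# A 45 kHz call (typical of some bat species) becomes a 4.5 kHz tone,
# comfortably inside the roughly 20 Hz - 20 kHz range of human hearing.
print(to_audible(45_000))  # 4500.0
```

Heterodyne detectors achieve a similar shift differently, by mixing the call with a tunable reference frequency; division has the advantage of covering all call frequencies at once.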
No one taught him the technique, which is now recognized as a human form of echolocation.
A gene implicated in the evolution of human speech also played a role in bat echolocation.
Researchers modified a human speech model developed in the 1970s to study dolphin echolocation.
These mammals make human speech look simple: In a behavior called echolocation, a bat must coordinate its nose, mouth, ears, and larynx to emit and receive calls, all the while executing flight maneuvers guided in part by these signals.
But some surfaces, especially human-made ones, could mess with echolocation.