We're used to seeing bats and whales use echolocation to find their way around. And for a while now, we've known that, with practice, humans can also visualise their surroundings by making clicking sounds.

A new study presents the first detailed description of human echolocation, including the acoustic characteristics and spatial range of mouth clicks. The researchers used the results to develop synthetic mouth clicks, which could be used to learn more about this extraordinary skill.

"Understanding the acoustic mechanisms will help us understand which information the human brain uses to echolocate," said Lore Thaler, one of the researchers from Durham University, speaking to ScienceAlert. "This in turn will help us understand the cognitive mechanisms."

For bats, dolphins and some whale species, echolocation is an innate ability used for navigating and foraging in the dark. When these animals produce a call, they listen for the echoes that bounce back from objects around them to build a picture of their surroundings.

While 'seeing' the world through sound isn't something humans do naturally, studies have shown that visually impaired people can develop bat-like senses with practice.

The most famous 'real-life bat-man' is Daniel Kish, who lost his sight at the age of one. Kish became an internet sensation by climbing mountains, riding bikes and living alone in the wilderness, using mouth-clicking skills to picture his surroundings with mind-blowing accuracy.

And it turns out you don't need to be visually impaired to develop this perspective on the world. Back in February, we reported on a study revealing that people with normal vision can learn to echolocate, detecting the size of virtual rooms by sound alone.

But even though we have known about human echolocation for decades, we still don't know much about the acoustic patterns of mouth clicks or what goes on in the brain when they are produced.

With this in mind, Thaler and her team set out to construct a detailed description of human mouth clicks with the help of three blind echolocation experts who use the technique in their daily lives.

Each participant was placed in an empty room and instructed to make clicks as they normally would. The researchers recorded the clicks and analysed their acoustic characteristics, such as the spatial distribution of the sound and the range of frequencies produced.

When the team looked at the length of the clicks, they found them to be remarkably brief at around three milliseconds – much shorter than previous studies had reported. Within those three milliseconds, each click had a sudden, loud onset before dropping off in a smooth downward slope.
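
For readers curious about what such a measurement involves, here is a minimal sketch of how the duration and dominant frequency of a recorded click might be estimated. It is an illustration only, not the study's actual analysis pipeline, and 'click.wav' is a hypothetical recording of a single mouth click.

```python
# Minimal sketch: estimate a click's duration and dominant frequency.
# This is NOT the study's analysis pipeline; 'click.wav' is a
# hypothetical mono recording of a single mouth click.
import numpy as np
from scipy.io import wavfile

rate, click = wavfile.read("click.wav")   # sample rate in Hz, raw samples
click = click.astype(np.float64)
click /= np.max(np.abs(click))            # normalise to peak amplitude 1

# Duration: time the envelope stays above 10% of the peak
# (the 10% threshold is an assumption, not the paper's definition).
envelope = np.abs(click)
above = np.where(envelope > 0.1)[0]
duration_ms = (above[-1] - above[0]) / rate * 1000
print(f"approximate duration: {duration_ms:.1f} ms")

# Dominant frequency from the magnitude spectrum.
spectrum = np.abs(np.fft.rfft(click))
freqs = np.fft.rfftfreq(len(click), d=1 / rate)
print(f"peak frequency: {freqs[np.argmax(spectrum)]:.0f} Hz")
```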

Surprisingly, the beam pattern of the clicks also proved to be more directional than that of speech, and could be reproduced easily from one click to the next.

"One way to think about the beam pattern of mouth clicks is to consider it analogous to the way the light distributes from a flashlight," Thaler told ScienceAlert. "The beam pattern of the click in this way is the 'shape of the acoustic flashlight' that echolocators use."

But to learn more about how human echolocation works, researchers need to examine much larger sample sizes. The problem is, skilled echolocators aren't exactly common or easy to track down.

To tackle this problem, the team used their detailed results to develop a mathematical model that could be used to synthesise human mouth clicks. This way, researchers can investigate echo-acoustics without the need for a skilled echolocator to be present.
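
As a rough illustration of the idea, the sketch below generates a three-millisecond synthetic click with a sudden onset and a smooth decay, modelled here as an exponentially decaying sinusoid. The centre frequency and decay rate are assumed values for demonstration; they are not the parameters published in the paper.

```python
# Sketch of a synthetic mouth click: a sudden onset followed by a smooth
# ~3 ms decay, modelled as an exponentially decaying sinusoid. The centre
# frequency and decay rate below are assumptions, not the paper's values.
import numpy as np
from scipy.io import wavfile

rate = 96000                        # high sample rate for a 3 ms transient
t = np.arange(0, 0.003, 1 / rate)   # 3 ms time axis
f0 = 3500                           # assumed centre frequency (Hz)
decay = 1500                        # assumed decay rate (1/s)

click = np.exp(-decay * t) * np.sin(2 * np.pi * f0 * t)

# Save as 16-bit WAV so the click can be played back or dropped into a
# virtual echo-acoustic simulation.
wavfile.write("synthetic_click.wav", rate, (click * 32767).astype(np.int16))
```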

"Using virtual simulations will allow us to explore acoustics relevant to human echolocation," said Thaler. "Knowing the relevant acoustic mechanisms might also be useful to inspire technology."

The research was published in PLOS Computational Biology.