How Echolocation Inspired Lidar and the Technologies Behind It

Most people know that dolphins use echolocation to find things in their environment. The clicking noises they emit bounce off objects under the water and travel back to the smart mammal, helping it judge both the size of far-away objects and its distance from them. But echolocation is not limited to dolphins. This brilliant use of sound as a directional tool is shared by many other species and has even inspired a light-based cousin of radar called “Lidar”.

What Is Echolocation?

The idea of echolocation is simple: you send a sound into the environment, then measure how long its echo takes to come back. Bats and other mammals use sound waves: a little click or squeak bounces off objects in the surrounding environment and returns quickly or slowly depending on how far away each object is. Humans adapted this process into sonar, which sends pings into the water and listens for the echoes. Because the speed of sound in water is known, the round-trip time of a ping tells you exactly how far away whatever it hit is.

The perfection of sonar then evolved into radar, which uses the same process but with radio waves through the air. Radar dishes spin around, sweeping a beam of radio waves across the sky. When those waves hit something solid, they bounce back and the dish reads them. Because we know the exact speed of light through air, we can calculate how distant objects are.
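
To make the arithmetic concrete, here is a minimal Python sketch of that round-trip calculation, which is identical for sonar, radar, and lidar; the example timings are invented, and the wave speeds are standard approximations:

```python
# Echo ranging: the shared idea behind sonar, radar, and lidar.
# distance = (wave speed x round-trip time) / 2

def echo_distance(round_trip_s: float, wave_speed_m_s: float) -> float:
    """One-way distance to whatever reflected the signal."""
    # The signal travels out AND back, so halve the total path.
    return wave_speed_m_s * round_trip_s / 2

SOUND_IN_SEAWATER = 1_500.0      # m/s, a rough average used for sonar
LIGHT_IN_AIR = 299_700_000.0     # m/s, used for radar and lidar

# Sonar: a ping that returns after 0.4 s hit something ~300 m away.
print(echo_distance(0.4, SOUND_IN_SEAWATER))   # 300.0

# Radar/lidar: a pulse returning after 2 microseconds -> ~300 m away.
print(echo_distance(2e-6, LIGHT_IN_AIR))       # ~299.7
```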

How Echolocation Is Used In the World

Dolphins, like bats, have a living sonar system that lets them echolocate. Their heads are filled with acoustic fat, a thick lipid that lets them focus sound. Ultrasonic clicks travel from the dolphin’s snout through the acoustic fat and emanate from the front of the head. The clicks bounce off things in the ocean and return to the dolphin, which uses the information for spatial awareness. Bats send out high-frequency “chirps” and evaluate the time delay of the returning echoes so they can fly through the night. Taking these examples from nature, scientists are attempting to apply echolocation to new devices that help disabled individuals.

Dr. Seth Horowitz, formerly of Brown University, conducted experiments for years to determine how humans can see the world through sound. Now, as CEO and chief neuroscientist for NeuroPop, Horowitz investigates the role of acoustic information in how we perceive our world. Those who could benefit most from new sound technology are the blind. The majority of blind people use sound as a guide, and many of them practice long and hard enough to achieve a form of echolocation. Horowitz has used this research to design devices that help the blind navigate the world better.

These newer devices are now being promoted to the blind community, but most blind people still use a cane or service dog to get around; Horowitz hopes his devices will supplant those old standards. Echolocation, though, doesn’t come naturally to every human. The BBC reports that Californian Daniel Kish, blind since birth, lives an incredibly active life that includes hiking and mountain biking. “He has perfected a form of human echolocation, using reflected sound waves to build a mental picture of his surroundings. When Daniel Kish clicks his tongue, the world answers back.”

From a young age, Kish developed a sonar technique allowing him to navigate his world using echoes from repeated tongue-clicks. “It is the same process bats use,” he says. “You send out a sound or a call and sound waves are physical waves – they bounce back from physical surfaces. So, if a person is clicking and they’re listening to surfaces around them they do get an instantaneous sense of the positioning of these surfaces.” Knowing how far echolocation can be pushed, scientists took the next leap in its evolution: swapping sound for lasers, a technology now known as Lidar.

How Does Lidar Work?

Lidar is a complex remote-sensing technology that uses laser light to densely sample the surface of an object. Currently, archeologists are using it to delve into the hidden layers of the earth. From a helicopter or plane, scientists beam millions of laser pulses at the ground every four seconds. The pulses are timed as they bounce back, not unlike a bat’s echolocation. The hyper-accurate measurements are then used to produce a detailed, three-dimensional image of the topography.

Lidar systems’ lasers use pulses of light outside the visible spectrum and measure how long each pulse takes to come back. As it does, the distance to and direction of whatever that pulse hit is recorded as a single point on a large 3D map, with the lidar system at its center. These systems generally sweep in a circle like a radar dish while also tilting the laser up and down. The resulting “point cloud”, the collection of coordinates from repeated laser pulses, can be remarkably detailed, showing the outlines of the environment and the objects within it. This has led scientists to practical applications of lidar in research.
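
As a rough illustration of that conversion (not any particular manufacturer’s format), here is a short Python sketch turning a sweep’s raw angle-and-range readings into the x, y, z points of a point cloud; the sample readings are invented:

```python
import math

def to_point(azimuth_deg: float, elevation_deg: float, range_m: float):
    """Turn one laser return (sweep angle, tilt angle, measured distance)
    into an (x, y, z) point, with the lidar unit at the origin."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# Each sweep yields thousands of (azimuth, elevation, range) readings;
# converting them all produces the point cloud.
readings = [(0.0, -5.0, 12.4), (0.5, -5.0, 12.6), (1.0, -4.5, 30.1)]
point_cloud = [to_point(az, el, r) for az, el, r in readings]
```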

How Do We Use Lidar Technology?

In recent years, archaeologists have started using Lidar to map the height of features on the ground. One of the largest, most recent surveys helped archeologists determine that the Mayan civilization may have been much larger and more densely populated than previously thought. Lidar helped the scientists cover roughly 2,000 square kilometers in a relatively short period of time; covering that much land by hand, using ground-penetrating radar, would have taken decades. But Lidar’s usefulness doesn’t stop there.

There are also multiple types of Lidar, including flash lidar and phased-array lidar.

As posted on digitaltrends.com, “Flash is basically where you have a light source and that light source floods the entire field of view one time using a pulse. A time-of-flight imager receives that light and is able to paint the image of what it sees.” Lidar can be thought of as a camera that sees distance rather than color. The downfall of flash lidar is the expense: it requires huge bursts of energy and expensive imagers made from gallium arsenide to capture the light, and these imagers can cost upwards of $200,000. Fortunately, a close cousin of flash lidar known as the phased array is a brilliant workaround. It can broadcast radio waves in any direction using a microscopic array of individual antennas synced together. By controlling the timing, or phase, with which each antenna broadcasts its signal, engineers can “steer” the combined signal in a specific direction. This is the first step in using lidar for autonomous vehicles.
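
To make the steering idea concrete, here is a small Python sketch of the textbook phase calculation; the element count, spacing, and wavelength are arbitrary illustrative values rather than specs from any real unit:

```python
import math

def element_phases(n_elements: int, spacing_m: float,
                   wavelength_m: float, steer_deg: float) -> list[float]:
    """Phase offset (in radians) for each array element so the combined
    wavefront points steer_deg away from straight ahead."""
    # Classic phased-array relation: delta-phi = 2*pi*d*sin(theta)/lambda
    step = (2 * math.pi * spacing_m
            * math.sin(math.radians(steer_deg)) / wavelength_m)
    return [i * step for i in range(n_elements)]

# Example: 8 elements spaced half a wavelength apart, steered 30 degrees.
wavelength = 0.005   # 5 mm, an arbitrary illustrative value
phases = element_phases(8, wavelength / 2, wavelength, 30.0)
```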

Electromechanical lidar, like what would be used in autonomous vehicles, can run for 1,000 to 2,000 hours before needing to be replaced. But since the average American spends 293 hours in the car each year, most people would end up replacing the lidar before their tires. Lidar also can’t do the job alone: it can’t read signs, since they’re flat, and the systems are easily disrupted by limited visibility (heavy snow, fog, or anything else obstructing the unit’s view). So, to work in a vehicle, lidar has to cooperate with other systems: its cousins radar and ultrasound (a form of sonar), along with ordinary visible-light cameras, all help Lidar out. A camera can read a “bump” sign on its own, but cameras are slow to process information and might not realize the sign is there until it’s too late.
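
As a loose sketch of that cooperation, here is how fused sensor readings might feed one decision; the field names, thresholds, and logic are invented for illustration, not drawn from any real vehicle system:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Perception:
    lidar_obstacle_m: Optional[float]  # nearest obstacle distance, None if clear
    camera_sign_text: Optional[str]    # text read from a road sign, if any
    radar_closing_m_s: float           # closing speed toward the nearest object

def should_slow_down(p: Perception) -> bool:
    # Lidar gives precise geometry but cannot read a flat sign.
    if p.lidar_obstacle_m is not None and p.lidar_obstacle_m < 10.0:
        return True
    # The camera reads the sign; radar confirms we are closing quickly,
    # and it keeps working when fog or snow blinds the optical sensors.
    if p.camera_sign_text == "BUMP" and p.radar_closing_m_s > 8.0:
        return True
    return False
```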

Dr. Seth Horowitz is using the way bats process sound as the key to unlocking a new class of assistive devices that could allow blind people to navigate the world with unparalleled freedom. He hopes to achieve more than the early devices, which were no more than indicators warning the user of nearby objects or obstructions. The first models designed in labs gave users too much detail, had bulky battery packs, and could not distinguish between an object’s size and its composition (was it a large body of water or a small curb?). While Lidar may be the next wave of the technological future for developing autonomous vehicles, helping the blind read their environment with sound, and expanding archeologists’ ability to map the past, there are still many concerns about its limitations and costs.

Conclusion

Mammals around the globe still use echolocation every day to navigate their environments. While primitive echolocation led to sonar, radar, and now Lidar, Lidar’s use in archeology is the most comprehensive example of its value. In autonomous cars, Lidar alone cannot do the job safely; it must still rely on cameras and radar, as well as an on-board computer that compiles the incoming data in microseconds. Scientists still have plenty of innovation, development, and testing to do before Lidar can safely drive autonomous vehicles. And there are still more problems that Lidar may yet solve.

Find this article by Anna Kučírková and many others at https://www.iqsdirectory.com/resources/how-echolocation-inspired-lidar-and-the-technologies-behind-it/
