On Tech & Vision Podcast
Batman Technology: Using Sonar for Human Navigation
On Tech and Vision Podcast with Dr. Cal Roberts
Today’s big idea is sonar, and a somewhat similar technology called LiDAR! Can we use the latest sonar technology for obstacle detection the way bats and other nocturnal creatures do? Recent advances in sonar sensors now make this possible for people who are blind. However, unlike bats, we won’t need to receive feedback signals through our ears: advances in haptic technologies and languages make communication through touch possible. Dr. Cal Roberts talks with Dr. Matina Kalcounis-Rueppell from the College of Natural and Applied Science at the University of Alberta, Ben Eynon and Diego Roel from STRAP Technologies, Marco Trujillo of Sunu, and Sam Seavey of The Blind Life YouTube channel to find out more.
Podcast Transcription
Kalcounis-Rueppell: Bats produce echolocation, and echolocation is the use of sound to navigate and find food. My name is Matina Kalcounis-Rueppell, and I am a professor of biological sciences. I study animal behavior and animal conservation.
Roberts: Dr. Kalcounis-Rueppell is interim Dean of the College of Natural and Applied Science at the University of Alberta where her research focuses on bats, and mice, and how they use sound to thrive in their nighttime worlds.
So, what is a bat doing when it’s echolocating?
Kalcounis-Rueppell: It’s sending out a signal and it’s waiting for the response in the form of an echo. And then processing the timing of the arrival of those echoes back to each ear to calculate where the object is in front of it that their original signal bounced off of. What is so amazing about all of this is that the world isn’t that simple. One signal goes out and its echo is coming off of a massive 3D space in front of that animal that is flying.
That flying animal is actually sending out multiple signals. It has a continuous set of signals that allows the bat then to, in an auditory way, construct a 3D view of the environment that it is barreling through as it’s sending out those signals. There’s wind and there’s rain and the leaves are fluttering in the forest. And there are owls flying by and flying squirrels flying by.
Even a human eye that is well-functioning during the day becomes sort of useless at night in true darkness because we need light to see. And here are these huge groups of animals that really represent a lot of the earth’s biodiversity, certainly mammalian biodiversity, that have evolved to be nocturnal. And in both of those big groups we see that sound is really one of the main modalities for navigation and communication.
So, we’re diurnal and very visual. We’re active during the day and we have this bias towards things that we need light for. Half of our lives is nighttime, and it’s a whole other world. It’s an unexplored frontier that is really exciting.
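To make the timing arithmetic concrete: an echo’s round-trip delay gives range, and the tiny difference in the echo’s arrival time between the two ears gives direction. Here is a minimal Python sketch of that geometry. It is an illustration under simplifying assumptions (a speed of sound of about 343 m/s, a distant object, an invented ear spacing), not a model of how a bat’s auditory system actually computes.

    import math

    SPEED_OF_SOUND = 343.0  # meters per second, in air

    def bearing_from_echo(delay_left_s, delay_right_s, ear_spacing_m=0.03):
        """Angle of the object off center, in degrees, estimated from
        the difference in echo arrival time at the two ears.

        Positive means the echo reached the right ear first, so the
        object is to the right. Assumes the object is far enough away
        that the returning echo is roughly a plane wave.
        """
        itd = delay_left_s - delay_right_s  # interaural time difference
        # The extra path length to the far ear is ear_spacing * sin(angle).
        sin_angle = SPEED_OF_SOUND * itd / ear_spacing_m
        sin_angle = max(-1.0, min(1.0, sin_angle))  # clamp numerical noise
        return math.degrees(math.asin(sin_angle))

    # An echo arriving 50 microseconds earlier at the right ear places
    # the object roughly 35 degrees to the right:
    print(bearing_from_echo(0.00605, 0.00600))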
Roberts: This is On Tech & Vision and I’m Dr. Cal Roberts. Today’s big idea involves using sonar technologies for obstacle detection. Advances in sonar sensors, pushing them to be smaller and cheaper to produce, are making it possible for new products to come to market that more and more elegantly provide obstacle detection for people who are blind.
And advances in haptics mean that, unlike bats, we humans won’t have to receive feedback signals through our ears.
Eynon: The sensors we use are sonar and lidar.
Roberts: This is Ben Eynon, Chief Technology Officer at STRAP Technologies. STRAP uses a series of readily available off-the-shelf sonar and lidar sensors built into straps across the chest to help people who are blind detect obstacles.
Eynon: They park spacecraft with these same sensors. Recent developments have really pushed miniaturization of the components, such that a human being can now wear them in a very compact form factor.
Roberts: Sonar sensors emit sound waves. And those sound waves bounce off obstacles and come back to the sensor.
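The conversion from echo delay to distance is simple arithmetic. As a rough sketch (not STRAP’s actual firmware; the alert threshold here is invented for illustration), a sensor that reports the round-trip travel time of each ping could flag obstacles like this:

    SPEED_OF_SOUND = 343.0   # meters per second, in air
    ALERT_DISTANCE_M = 1.5   # hypothetical alert threshold

    def distance_from_echo(round_trip_s):
        # The ping travels out to the obstacle and back, so halve the path.
        return SPEED_OF_SOUND * round_trip_s / 2.0

    def obstacle_ahead(round_trip_s):
        return distance_from_echo(round_trip_s) < ALERT_DISTANCE_M

    # An echo returning after 6 milliseconds means an object about
    # 1 meter away, close enough to warn about:
    print(distance_from_echo(0.006))  # ~1.03 meters
    print(obstacle_ahead(0.006))      # True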
Eynon: These sonar sensors are ones that simulate how bats echolocate.
Roberts: Dolphins navigate the same way. These animals have been honing their echolocation skills for years. And some people who are blind are able to click with their tongue or tap with their white canes to get this kind of sonic feedback about where obstacles are.
Eynon: And the lidar sensors are using actual light waves bouncing off of objects.
Roberts: Lidar is similar to sonar, except that instead of emitting sound it emits light. These sensors are more accurate and cheaper than ever, which has made it possible for Ben and STRAP Technologies CEO and Founder Diego Roel to design the STRAP. Here’s a bit of my conversation with Ben and Diego.
So, tell us about the technology and tell us how it works.
Eynon: The sensors are worn at the chest and they are always looking forward, up, and down. Down where your feet are walking, forward where you’re heading, and then up in case there’s any obstacle that you might run into. These sensors have a way, through our firmware, to communicate the type of obstacle and its location, such that when we send the signals back out to your chest through the backside of the device that’s now sitting on your chest, and also into the straps where we have haptic actuators placed, these signals can then vibrate or gently poke you. If you think of Morse code, how dots and dashes communicate the letters of the alphabet and numbers, we can communicate the types of obstacles that are ahead of you.
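As a rough illustration of that Morse-code idea, an obstacle type can be encoded as a sequence of short and long pulses sent to an actuator. The pattern names and timings below are invented for the example; they are not STRAP’s actual signal set.

    import time

    # Hypothetical vocabulary: 1 = long buzz, 0 = short tap.
    PATTERNS = {
        "head_level":   [1, 0, 0],
        "ground_level": [0, 0, 1],
        "step_up":      [0, 1],
        "step_down":    [1, 1],
    }

    def send_pattern(actuator, obstacle_type):
        """Play an obstacle's pulse pattern on one haptic actuator.

        `actuator` is a stand-in for real motor-driver code: any object
        with a vibrate(duration_s) method will do.
        """
        for pulse in PATTERNS[obstacle_type]:
            actuator.vibrate(0.30 if pulse else 0.08)
            time.sleep(0.12)  # a gap keeps pulses distinguishable

    class PrintActuator:
        # Stand-in actuator that just logs what it would do.
        def vibrate(self, duration_s):
            print(f"buzz {duration_s:.2f}s")

    send_pattern(PrintActuator(), "head_level")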
Roberts: And so, do you see this as a unique, standalone piece of technology for someone who is blind or visually impaired, or is this something that works best in combination? Would you wear this pack on your chest with a white cane? With another device that’s talking to you, like a GPS telling you where you are? How do you envision this technology being best used?
Roel: Really interesting question. The short answer is you can use both at the same time. Like the cane and our product – that’s possible. We recommend that our users use both at the same time until they get used to using our device. But we maintain that we actually developed the first total replacement for the white cane in the world.
The white cane, as you know, was invented in 1921. Now it’s 2021, and in almost a century nothing has changed. You could say, hey, the cane works so perfectly it doesn’t need innovation. But it also has some pain points. Like head-level obstacles. And if you don’t tap the cane on a part of the floor, you won’t have feedback about that part. If you tap the cane on the left side, you don’t have any feedback on the right side, and vice versa.
So, now imagine having a lot of canes looking out for you, taking care of you in all directions at the same time.
Roberts: We know some listeners rely on their white canes, and using both products together, like Diego says, is always an option. But Ben has seen firsthand how freeing it is for some STRAP users to navigate hands-free. Here he describes an early user testing out the technology.
Eynon: He quickly and easily put it on, turned on the power, and I remember him taking his cane and setting it on the chair. And so, he got oriented with it, started walking around. Big smile came to his face and he said, “I’m walking faster than I have in a long, long time,” because he started to trust that the haptic vibrations were telling him every obstacle in the way.
Roberts: The young man came to a staircase.
Eynon: He instantly realized that, hey, these stairs go up. What most sighted people don’t realize is that when you have the white cane, you have one hand occupied all the time. So, he had a bottled iced tea in his left hand, and he was able to reach up and grab the hand rail and walk more safely up the stairs while holding his iced tea. He stopped at the top of the stairs and just cried out, “I haven’t done this in six years where I’ve walked up a flight of stairs and was able to drink a sip of my iced tea.” Little things that we take for granted, as Diego said, are life-changing for the visually impaired community.
Roberts: At some point you had to decide where you were going to put the transmitter on the person’s body. We’ve talked about technology that somebody uses their fingers for. We’ve talked about BrainPort, which uses the tongue. Why the chest, of all areas, as the area where you’re going to transmit the signal?
Eynon: If you put the device on top of your head, you would have the most distance forward as the device is looking ahead. But we put it at the chest to access more of your body with the straps. So now we can put haptic actuators throughout these straps and have a palette, or a 2D space, through which we can send signals and easily make it more intuitive for the wearer to understand what we’re trying to communicate. So the chest was a nice high point on your body, with all of this surface area we could access with our haptic actuators.
Roberts: Now, you could have used the same sensor and converted it into a sound language. A series of beeps or noises. Why into vibrations?
Roel: The first prototype of the STRAP was with audio feedback. And blind users told us, no, no, no. I use hearing for a lot of stuff, even to know if there’s another person in the room, to be aware of my surroundings. They told us not to occupy their hearing, and to let them have their hearing totally available.
Roberts: Dr. Kalcounis-Rueppell says that noise pollution is a problem for nocturnal mice and bats as well.
Kalcounis-Rueppell: We’re changing the environment around us in ways that also change the acoustic environment. So think about a mouse in a tree, for example, that’s using echolocation to navigate, and then put traffic noise in the mix, or airplane noise in the mix. How adaptable will those animals that rely so heavily on sound be to a changing sound landscape in the environments that we all live in?
Roberts: Interpreting information haptically rather than sonically is an adaptation that humans, using technologies like STRAP, can make. Haptic information can guide navigation without sight, leaving our ears open to process the world. This is as true for people who are blind as it is for people interested in the nighttime worlds that Dr. Kalcounis-Rueppell loves.
For Ben and Diego, once the sensors were set up, designing STRAP became a matter of making the hardware comfortable. This raised the question of where the sensors should be placed, and what regions around the user should be scanned.
Eynon: Think of any kind of sensor. We have the ability to put that on a person and then give them feedback immediately through haptics. But we’re so happy and so excited to release this first version to the visually impaired for their safe navigation. GPS can get you from A to B, but STRAP gets you there safely.
Roberts: So, what does this mean for the future?
Eynon: We only have eyes on the front of our heads. There could be ways to enhance human performance and capability by putting sensors on our back. But there are other sensors we could add as well: an accelerometer, a gyro, and a compass. So we know whether you are just standing up and moving forward from a chair, for example. All this information is resident on the device and can be used.
So, if we have extremely fast response time and extremely fast sensing, then someday, and this is kind of a dream, a visually impaired wearer could perhaps even ride a bike or run a race. These are enhancements to the device, for sure. As we go forward, we’ll be collecting feedback from users, and when we go to manufacturing and the device moves into the hundreds of thousands or millions of users, we intend to use that experience and information to keep enhancing, to keep adding new sensors. It could be infrared; it could be radiation sensors for dangerous work.
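To give a flavor of what an accelerometer alone can add, here is a crude sketch of detecting whether the wearer is still or moving from acceleration magnitude. The threshold is invented, and a real device would fuse gyro and compass data for orientation and heading; this is an illustration, not STRAP’s planned design.

    import math

    GRAVITY = 9.81  # meters per second squared

    def accel_magnitude(ax, ay, az):
        return math.sqrt(ax * ax + ay * ay + az * az)

    def classify_motion(samples, still_band=0.3):
        """Label a short window of (ax, ay, az) accelerometer samples,
        in m/s^2, as "still" or "moving".

        When the wearer is motionless the magnitude stays near gravity;
        standing up or walking produces larger excursions.
        """
        deviations = [abs(accel_magnitude(*s) - GRAVITY) for s in samples]
        return "still" if max(deviations) < still_band else "moving"

    print(classify_motion([(0.1, 0.0, 9.80), (0.0, 0.1, 9.78)]))  # still
    print(classify_motion([(1.5, 0.2, 11.2), (0.3, 0.4, 8.10)]))  # moving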
Roberts: Like the STRAP chest band, the Sunu band, which is worn on the wrist, scans for obstacles using sonar and communicates potential impediments to users through haptic feedback.
Trujillo: I picked sonar because of energy consumption.
Roberts: Sunu Founder and CEO, Marco Trujillo.
Trujillo: It’s definitely lower consumption than laser or other types of technology.
Roberts: Marco appreciates that echolocation is already in the toolbox of many of his users who are blind. And the Sunu band, he says, can help them improve their skills.
Trujillo: It’s a very natural skill that the human body can develop. The secret is hearing your echo, touching, and making those patterns happen. But if you have the Sunu band, we believe it can help accelerate the process of learning echolocation. Because you don’t need to now go and grab; you can hear your echo and feel the Sunu band and know there is something hard right there. So the process of learning echolocation could be improved if you’re wearing a Sunu band. That’s our hypothesis.
Roberts: Haptic feedback can be great for older users who might need additional sensory feedback.
Trujillo: I remember this user. He used to be a great echolocator. He was great just using his cane and his echolocation. But because of age, he started losing his hearing. So the Sunu band actually did a great deal for him, replacing his echolocation skills.
Roberts: Marco was inspired to build the Sunu band while he was leading mobility lessons at a school for children who are blind in Guadalajara.
Trujillo: There are kids that run all the time, and they hurt themselves. And the tools that they have, even though they’re great, are not fully reliable. The cane. The guide dog. It’s common for them to have accidents. The problem is that as they grow up they become more passive, more dependent on others. I thought there’s a huge gap here that technology can easily solve. How is it that we have self-driving cars, we have rockets that land themselves, we have a better iPhone every year, but we don’t have something better than a stick? And not to speak poorly about the white cane. We know how important it is. But this happens so often: we still have people moving around and having issues every day. We gave out prototypes and the kids were running around all over the playground with them. We made obstacle courses and a presentation for their parents. There was this particular girl who started running with the device. Her parents were surprised. How did that happen?
Roberts: Let’s not take Marco’s word for it, though.
Seavey: Today we’re going to talk about how we can use echolocation to help us see the world around us. Hey guys, it’s Sam with “The Blind Life.” Before we get into the video I’d like to thank Sunu for sending me this band so I could try it out and make this video for you guys.
Seavey: I’m a one-person show. I do everything myself.
Roberts: Sam Seavey is the creator of The Blind Life YouTube channel where, since 2013, he has been reviewing assistive tech for people who are blind or visually impaired, like he is. He was diagnosed at age 11 with Stargardt’s disease. We’re using tape from his show with permission.
Seavey: So, it is a lot of work. But it’s the reaction from the community. I get so many emails about how helpful the channel has been for people. It’s definitely that positive feedback that keeps me going.
Roberts: Sam did an episode about the Sunu band on his channel.
Seavey: It uses haptic feedback to alert you of these objects and it works really, really well. I was pretty impressed with this, guys.
Roberts: Sam’s video is great. If you want to see the whole thing, just search “The Blind Life Sunu Band” on YouTube or see the link in our show notes. With over 35,000 subscribers and hundreds and hundreds of product reviews, Sam’s vote of confidence goes a long way. The Sunu band also benefits from recent advances in haptics.
Trujillo: Mobility aids and travel aids have existed for a long time, but they hadn’t seen this level of progress until today. This is because haptic actuators weren’t as good as they are today. Vibration motors were big and bulky, and the signal was very heavy, so it was hard on energy management, for one thing. But also, it wasn’t very precise. It’s like black and white versus high definition.
Roberts: Marco has created haptic gestures to communicate not just where an obstacle is, but whether that obstacle is large or small, hard or soft, and its general shape.
Trujillo: Now we have very high-definition haptics which allow us to communicate more information to the user.
Roberts: And the more users practice with the Sunu band, the better they can interpret each haptic pulse or vibration.
Trujillo: So, you need to design a language that is very natural for your brain to understand. That’s the key to haptics. You need to make the signals in such a way that the user progressively understands them. As users progress with it, they get a better and bigger understanding of the space through vibrations.
Roberts: Like the evolution of bats’ bodies and brains for echolocation, understanding this new haptic language will build on millions and millions of years of human intelligence but will also require new capacities for understanding.
Trujillo: The human brain is capable of amazing things. What we’re doing is just using the power of the brain instead of using the power of artificial intelligence. Which, by the way, is based on the brain’s structure. Our neural networks are trying to imitate what the brain does, but they can’t do it as well as the brain does.
Roberts: But Dr. Kalcounis-Rueppell has full confidence in our ability to learn.
Kalcounis-Rueppell: There are human examples and non-human animal examples where the capacity to learn is really astonishing. So we don’t even have to think about a particular sensory modality for that. We know that, at different life stages and ages, we have the capacity to learn new languages. We have the capacity to navigate different environments in different ways. And so, I would be really optimistic about the capacity of humans that are using sound as a sensory modality to detect even really small differences in that haptic system and to associate those differences with very different meanings.
Roberts: Sunu is continuing to develop its haptic language, and is building out partnerships with GPS providers and other app providers to make the Sunu band as useful as possible. We should add that it also tells time.
Today, developers stand on the precipice of big data and artificial intelligence, inexpensive sensors in hand, to figure out what good they can do and how they can use these newly accessible tools to solve real challenges for people who are blind. We’re living through a moment of technological synthesis, as obstacle detection products like STRAP and the Sunu band show. We don’t know what the next iterations of these technologies will be, what other sensors will translate into haptics for us, or how our bodies and brains will adapt to haptic languages.
We don’t know what the future holds, but we’ll be feeling it out together, moving through our worlds the way bats hunt prey, with more dynamic sonic and haptic abilities. That sounds, and feels, like progress to me.
I’m Dr. Cal Roberts and this is On Tech & Vision. Thanks for listening.
Did this episode spark ideas for you? Let us know at podcasts@lighthouseguild.org. And if you liked this episode please subscribe, rate and review us on Apple Podcasts or wherever you get your podcasts.
I’m Dr. Cal Roberts. On Tech & Vision is produced by Lighthouse Guild. For more information visit www.lighthouseguild.org. On Tech & Vision with Dr. Cal Roberts is produced at Lighthouse Guild by my colleagues Jaine Schmidt and Annemarie O’Hearn. My thanks to Podfly for their production support.
Join our Mission
Lighthouse Guild is dedicated to providing exceptional services that inspire people who are visually impaired to attain their goals.