Image: Donghyun Kim posed in front of green foliage

Donghyun Kim, assistant professor in the Manning College of Information and Computer Sciences, has received a National Institutes of Health (NIH) Trailblazer Award to push forward the development of robotic guide dogs for the visually impaired, work that he says is practical, useful and meaningful for humanity.

The NIH Trailblazer Award program supports new and early-stage investigators pursuing research that bridges engineering and the physical sciences with the life and/or biomedical sciences. A Trailblazer project may be exploratory, developmental, or high-risk/high-impact. Kim will receive a total of $624,064, distributed over three years.

Guide dogs provide safe, independent mobility for blind and low-vision individuals. However, training a guide dog takes approximately $50,000 and two years, and a dog's working life is relatively short (less than 10 years). Robotic alternatives are positioned to expand access to this empowering assistance while addressing these limitations.

Previously, Kim and his research team published an award-winning paper in which guide-dog users and trainers provided insight into how guide dogs are currently used and how that translates into the features a robotic aide would need. With this Trailblazer award, the team will start building the technology to meet those requirements.

One focal point is how these robots should handle navigation. “Originally, we thought that we were developing an autonomous driving car,” says Kim. The team envisioned that the user would tell the robot where they wanted to go, and the robot would autonomously navigate to that location with the user in tow. However, Kim’s previous paper revealed that this is not what users want. Instead, users want more control over the route, with the guide dog scanning for immediate obstacles.

To address this need, the researchers will build a route-recall system: a sighted person shows the robot the path the user wants to take, and the robot saves the path so it can be repeated in the future. The challenge is that AI struggles to generalize, especially when the environment changes, such as at night or in rainy or snowy conditions.
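The teach-and-repeat idea behind such a route-recall system can be sketched in a few lines: record waypoints during the guided demonstration, then replay them by steering toward the next stored point. This is a minimal illustration only — the class and method names are assumptions, not the team's actual system.

```python
import math

class RouteRecall:
    """Minimal teach-and-repeat route memory (illustrative sketch,
    not the research team's implementation)."""

    def __init__(self):
        self.waypoints = []  # (x, y) positions saved during the demonstration

    def record(self, x, y, min_spacing=0.5):
        # During the "teach" walk, store a waypoint only after the robot
        # has moved far enough from the last one (in meters).
        if not self.waypoints or math.dist(self.waypoints[-1], (x, y)) >= min_spacing:
            self.waypoints.append((x, y))

    def next_waypoint(self, x, y):
        # During "repeat", find the stored waypoint closest to the robot's
        # current position and head for the one after it.
        if not self.waypoints:
            return None
        i = min(range(len(self.waypoints)),
                key=lambda k: math.dist(self.waypoints[k], (x, y)))
        return self.waypoints[min(i + 1, len(self.waypoints) - 1)]
```

The generalization problem Kim describes enters exactly here: a real system must relocalize against the saved route from camera or lidar data, and that matching is what breaks when lighting or weather changes the scene's appearance.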

“Humans know which part is a stationary building and, even in the case that light changes, we’re not confused by a trash can that moved from here to there because we know that it is a movable object,” he says. “How can you teach the neural network to organize which part is navigation-critical and which part is movable?” 

With this award, they will also make hardware developments. This includes building features like legs that can climb stairs, a battery life long enough for a round-trip commute, and obstacle detection from all directions, including overhead and on the side blocked by the handler. 

“There is a handler standing right next to the robot, so normally this part is hidden,” says Kim. “We need to utilize something other than vision, because otherwise, there is always a certain part we cannot observe. That’s why we put in the microphone array to utilize the audio signals if any cars are coming.”
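A microphone array can locate a sound source from the tiny difference in when the sound reaches each microphone. The sketch below shows the core geometry for a two-microphone pair under a far-field assumption; real array pipelines use more microphones and cross-correlation to measure the delay, so treat the function name and interface as illustrative.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def bearing_from_tdoa(delay_s, mic_spacing_m):
    """Estimate a sound source's angle from the time difference of
    arrival (TDOA) between two microphones.

    Returns degrees from the broadside (perpendicular) direction of the
    two-microphone axis. Illustrative sketch only.
    """
    # The sound travels an extra (delay * c) meters to the farther mic.
    path_diff = delay_s * SPEED_OF_SOUND
    # Far-field approximation: sin(theta) = path_diff / spacing.
    # Clamp to [-1, 1] to guard against measurement noise.
    ratio = max(-1.0, min(1.0, path_diff / mic_spacing_m))
    return math.degrees(math.asin(ratio))
```

A zero delay means the source is straight ahead of the pair; a delay equal to the full inter-microphone travel time puts it directly along the axis (90 degrees), i.e., off to the side the handler blocks from the cameras.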

Another challenge has been reducing the noise that the quadruped robots make when they walk. Kim explains that the tapping of these four “paws” is 60 to 70 decibels—on par with a vacuum cleaner. For a population that relies heavily on auditory cues, this noise level isn’t just bothersome; it’s a safety issue. “But we developed the algorithm that can suppress even more than twice the noise level, so now the robot can walk really gently,” he says. 
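Because decibels are logarithmic, "more than twice the noise level" depends on which quantity is halved: halving sound pressure lowers the level by about 6 dB, halving acoustic power by about 3 dB, and halving perceived loudness is commonly taken as roughly 10 dB. A quick computation makes the scale concrete (this illustrates the dB arithmetic generally, not the team's measurements):

```python
import math

def db_change_for_ratio(ratio, power=False):
    """Decibel change for a given ratio: 20*log10 for amplitude/pressure
    ratios, 10*log10 for power ratios."""
    factor = 10.0 if power else 20.0
    return factor * math.log10(ratio)

# Halving sound pressure lowers the level by about 6 dB...
print(round(db_change_for_ratio(0.5), 1))              # -6.0
# ...while halving acoustic power lowers it by about 3 dB.
print(round(db_change_for_ratio(0.5, power=True), 1))  # -3.0
```

So dropping a 60-70 dB footfall by even 6-10 dB is a substantial reduction — enough to move the robot well below vacuum-cleaner loudness for the listener.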

While Kim notes that robot dogs are commercially available, they are not yet tailored to the needs of the visually impaired. He hopes his work will contribute something practical, useful and meaningful for humanity.

This story was originally published by the UMass Amherst Office of News & Media Relations. 
 
