To Optimize Guide-Dog Robots, First Listen to the Visually Impaired

May 16, 2024 • by Staff Writer

Guide-dog users and trainers can provide insight into features that make robotic helpers useful in the real world.

A dog-like robot ascends steps on the campus of a university.

What features does a robotic guide dog need? Ask the blind, say researchers behind an award-winning paper led by University of Massachusetts Amherst computer scientists and involving The University of Texas at Austin’s Joydeep Biswas. 

The study, which identified how to develop robot guide dogs with insights from guide dog users and trainers, won a Best Paper Award at the CHI 2024 Conference on Human Factors in Computing Systems, the leading venue for human-computer interaction research.

Guide dogs enable remarkable autonomy and mobility for their handlers. However, only a fraction of people with visual impairments have one of these companions. The barriers include the scarcity of trained dogs, cost ($40,000 for training alone), allergies, and physical limitations that preclude caring for a dog.

Robots have the potential to step in where canines can’t and address a truly gaping need—if designers can get the features right. 

“This paper really takes a user-first perspective to developing guide-dog robots, starting out with a thorough analysis of interviews and observation sessions with dog guide handlers and trainers,” said Biswas, an associate professor of computer science.

The research team worked with 23 visually impaired dog guide handlers and five trainers. Through thematic analysis, they distilled the current limitations of canine guide dogs, the traits handlers look for in an effective guide, and the considerations these raise for future robotic guide dogs.

“We’re not the first ones to develop guide-dog robots,” said Donghyun Kim, assistant professor in the UMass Amherst Manning College of Information and Computer Science (CICS) and one of the corresponding authors of the paper. “There are 40 years of study there, and none of these robots are actually used by end users. We tried to tackle that problem first so that, before we develop the technology, we understand how they use the animal guide dog and what technology they are waiting for.”

The interviews surfaced nuanced themes, such as the delicate balance between robot autonomy and human control, all of them important to understanding how to develop robots deployable in the real world. The researchers also learned how important extended battery life is to visually impaired commuters, and that handlers want guidance that follows the street (as a sidewalk does) rather than guidance that always heads in the same direction.

Biswas brought to the project his experience creating artificial intelligence algorithms that allow robots to navigate unstructured environments. He is also involved in a project studying how robots interact with people in close proximity in public spaces, including on the campus of UT Austin, which declared this year the Year of AI.

Other researchers who contributed to the paper were Hochul Hwang and Ivan Lee of UMass Amherst; Hee Tae Jung of Indiana University; and Nicholas Giudice of the University of Maine.

Adapted from a post by the University of Massachusetts Amherst.
