Binghamton computer scientists program robotic seeing-eye dog to guide the visually impaired
Computer science faculty and students train robot to respond to tugs on leash
Last year, the Computer Science Department at the Thomas J. Watson College of Engineering and Applied Science went trick-or-treating with a quadruped robotic dog. This year, they are using the robot for something that Assistant Professor Shiqi Zhang calls "much more important" than handing out candy, as fun as that can be.
Zhang and PhD student David DeFazio and junior Eisuke Hirota have been working on a robotic seeing-eye dog to increase accessibility for visually impaired people. They presented a demonstration in which the robot dog led a person around a lab hallway, confidently and carefully responding to directional input.
Zhang explained some of the reasoning behind starting the project.
"We were surprised that throughout the visually impaired and blind communities, so few of them are able to use a real seeing-eye dog for their whole life. We checked the statistics, and only 2% of them are able to do that," he said.
Part of the reason is that real seeing-eye dogs cost about $50,000 and take two to three years to train, and only about 50% of the dogs graduate from training to go on to serve visually impaired people. Seeing-eye robot dogs offer a potentially significant improvement in cost, efficiency and accessibility.
This is one of the earliest attempts at developing a seeing-eye robot, made possible by the recent development and falling cost of quadruped technology. After about a year of work, the team developed a unique leash-tugging interface, implemented through reinforcement learning.
"In about 10 hours of training, these robots are able to move around, navigating the indoor environment, guiding people, avoiding obstacles, and at the same time, being able to detect the tugs," Zhang said.
The tugging interface allows the user to pull the robot in a certain direction at an intersection in a hallway, prompting the robot to turn in response. While the robot shows promise, DeFazio said that further research and development are needed before the technology is ready for certain environments.
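The article does not describe the interface's internals, but the behavior described above can be sketched as a simple mapping from the leash's lateral pull to a turn command. Everything below (the `interpret_tug` function, the force threshold, and the sign convention) is a hypothetical illustration, not the team's actual implementation:

```python
from enum import Enum

class Turn(Enum):
    LEFT = "left"
    STRAIGHT = "straight"
    RIGHT = "right"

def interpret_tug(lateral_force: float, threshold: float = 2.0) -> Turn:
    """Map a leash's lateral force (assumed units: newtons, positive = rightward)
    to a turn command.

    A dead zone around zero keeps incidental leash sway from being
    misread as a deliberate tug.
    """
    if lateral_force > threshold:
        return Turn.RIGHT
    if lateral_force < -threshold:
        return Turn.LEFT
    return Turn.STRAIGHT
```

In practice, the learned policy would also have to reconcile such commands with obstacle avoidance and navigation, which is what the reinforcement-learning training described above handles.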
"Our next step is to add a natural language interface. So ideally, I could have a conversation with the robot based on the situation to get some help," he said. "Also, intelligent disobedience is an important capability. For example, if I'm visually impaired and I tell the robot dog to walk into traffic, we would want the robot to understand that. We should disregard what the human wants in that situation. Those are some future directions we're looking into."
The team has been in contact with the Syracuse chapter of the National Federation of the Blind in order to get direct and valuable feedback from members of the visually impaired community. DeFazio thinks that specific input will help guide their future research.
"The other day we were speaking to a blind person, and she was mentioning how it is really important that you don't want sudden drop-offs. For example, if there's an uneven drain in front of you, it would be great if you could be warned about that, right?" DeFazio said.
While the team is not limiting themselves in terms of what the technology could do, their feedback and intuition lead them to believe the robots might be more useful in specific environments. Since the robots can hold maps of places that are especially difficult to navigate, they can potentially be more effective than real seeing-eye dogs at leading visually impaired people to their desired destinations.
"If this is going well, then potentially in a few years we can set up this seeing-eye robot dog at shopping malls and airports. It's pretty much like how people use shared bicycles on campus," Zhang said.
While still in its early stages, the team believes this research is a promising step for increasing the accessibility of public spaces for the visually impaired community. Zhang lauded Binghamton and the effective and service-driven initiative of his students.
"We are the very best public school in the Northeast area," he said. "Our undergraduate and graduate students are fantastic. I really appreciate my students, Dave DeFazio and Eisuke Hirota. They are able to develop the software and intelligence for this robot to serve the community. They are very helpful and this kind of research is not possible at all without the support of the students."
The team will present a paper on their research at the Conference on Robot Learning (CoRL) in November.