Two new smartphone apps give visually impaired people voice instructions for navigating inside buildings, providing a safe way to find their way where GPS doesn’t work.
Roberto Manduchi, a professor of computer science and engineering at the University of California, Santa Cruz, has dedicated much of his research career to developing accessible technology for the visually impaired. Through years of working with these communities, he learned that there was a particular need for tools to assist with indoor navigation of new spaces.
“It is especially difficult to move around independently in unfamiliar places because there is no visual reference – it is very easy to get lost. The idea here is to try to make this a little easier and safer for people.”
Roberto Manduchi, Professor of Computer Science and Engineering, University of California, Santa Cruz
In a new paper published in the journal ACM Transactions on Accessible Computing, Manduchi’s research group has unveiled two smartphone apps: one for indoor wayfinding, which guides the user to a chosen destination, and one for safe return, which retraces a route the user has previously taken. The apps give audio cues and do not require users to hold the phone out in front of them, which is inconvenient and draws unwanted attention.
More secure and scalable technology
Smartphones provide an excellent platform for hosting accessible technology: they are less expensive than dedicated hardware systems, come with support from the manufacturers’ information technology teams, and have built-in sensors and accessibility features.
Other smartphone-based wayfinding systems require people to walk with their phone held out in front of them, which causes several problems. Blind people navigating new spaces often keep at least one hand on a guide dog or cane, and occupying the other hand with a phone is far from ideal. Holding out a phone also makes navigators more vulnerable to crime, and people with disabilities already experience crime at disproportionately high rates.
Companies like Apple and Google have developed indoor wayfinding for specific locations such as major airports and stadiums, but their methods rely on sensors installed inside these buildings. This makes the solution much less scalable due to the cost of adding and maintaining additional infrastructure.
Using built-in sensors
Manduchi’s wayfinding app provides directions in a similar way to GPS services such as Google Maps. However, GPS-based systems do not work indoors because the satellite signals are distorted by building walls. Instead, Manduchi’s system uses other sensors in the smartphone to provide voice instructions for navigating unfamiliar buildings.
The wayfinding app uses a map of the building to find a route to the destination and tracks the navigator’s location along that route using the phone’s built-in inertial sensors (accelerometer and gyroscope), which effectively act as a step counter to follow the user’s progress along the path.
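The paper’s source code is not reproduced here, but the core idea of step-based inertial dead reckoning can be illustrated with a short sketch. The step threshold, stride length, and function names below are assumptions chosen for illustration, not values from the study.

```python
import math

STEP_THRESHOLD = 11.0  # m/s^2: assumed accelerometer peak that counts as a step
STRIDE_LENGTH = 0.7    # meters advanced per detected step (assumed average)

def dead_reckon(accel_magnitudes, headings, start=(0.0, 0.0)):
    """Turn raw accelerometer magnitudes and per-sample headings (radians)
    into a 2-D path: each detected step advances the position one stride
    in the current heading direction."""
    x, y = start
    path = [(x, y)]
    above = False
    for accel, heading in zip(accel_magnitudes, headings):
        if accel > STEP_THRESHOLD and not above:  # rising edge = one new step
            x += STRIDE_LENGTH * math.cos(heading)
            y += STRIDE_LENGTH * math.sin(heading)
            path.append((x, y))
        above = accel > STEP_THRESHOLD
    return path
```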
The same sensors can also track the orientation of the phone, and therefore the direction the navigator is facing. However, the estimated position and heading are often somewhat inaccurate, so the researchers incorporated a method called particle filtering to enforce the building’s physical constraints, ensuring the system never interprets the navigator as walking through walls or other impassable features.
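To make the role of particle filtering concrete, here is a minimal sketch of a single update step. It assumes the floor plan is exposed through a `crosses_wall(p, q)` lookup, a hypothetical helper not described in the paper; particles whose motion would pass through a wall are discarded and the survivors are resampled.

```python
import math
import random

def particle_filter_step(particles, stride, heading, crosses_wall):
    """Move every particle by a noisy stride in a noisy heading, discard
    particles that would walk through a wall, and resample the survivors
    so the particle count stays constant."""
    survivors = []
    for (x, y) in particles:
        d = stride + random.gauss(0.0, 0.1)      # stride-length noise (assumed std dev)
        th = heading + random.gauss(0.0, 0.05)   # heading noise (assumed std dev)
        candidate = (x + d * math.cos(th), y + d * math.sin(th))
        if not crosses_wall((x, y), candidate):  # enforce the building's physical constraints
            survivors.append(candidate)
    if not survivors:                            # degenerate case: keep the previous set
        return particles
    return [random.choice(survivors) for _ in particles]

def position_estimate(particles):
    """The navigator's estimated position is the mean of the particle cloud."""
    xs, ys = zip(*particles)
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```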
The backtracking app is useful in situations where a visually impaired person has been guided to a room and wants to exit independently by simply retracing the route they previously took. In addition to the inertial sensors, it uses the phone’s magnetometer to identify characteristic magnetic field anomalies, typically produced by large appliances, which serve as landmarks inside a building.
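As an illustration of how magnetic anomalies might serve as landmarks (the matching procedure in the paper may differ), the sketch below records points on the outbound route where the field magnitude deviates from an assumed ambient baseline, then snaps a drifting position estimate back onto the nearest recorded landmark on the return trip. All thresholds and names are assumptions.

```python
AMBIENT_FIELD = 45.0   # microtesla: assumed typical ambient field magnitude indoors
ANOMALY_DELTA = 10.0   # microtesla: assumed deviation that qualifies as an anomaly
SNAP_RADIUS = 2.0      # meters: assumed distance within which a landmark match is trusted

def record_landmarks(outbound_samples):
    """outbound_samples: list of (x, y, field_magnitude) logged on the way in.
    Returns the positions where the magnetic field is anomalous."""
    return [(x, y) for x, y, m in outbound_samples
            if abs(m - AMBIENT_FIELD) > ANOMALY_DELTA]

def correct_with_landmark(estimate, field_magnitude, landmarks):
    """On the return trip, if the phone currently senses an anomaly and the
    dead-reckoned estimate is close to a recorded landmark, snap to that
    landmark to cancel accumulated drift."""
    if abs(field_magnitude - AMBIENT_FIELD) <= ANOMALY_DELTA or not landmarks:
        return estimate
    nearest = min(landmarks,
                  key=lambda lm: (lm[0] - estimate[0]) ** 2 + (lm[1] - estimate[1]) ** 2)
    dist = ((nearest[0] - estimate[0]) ** 2 + (nearest[1] - estimate[1]) ** 2) ** 0.5
    return nearest if dist <= SNAP_RADIUS else estimate
```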
Conveying instructions
Both systems deliver instructions through spoken audio, and can also be paired with a smartwatch to supplement the instructions with vibrations. Overall, the researchers tried to minimize the amount of information given to navigators so they can focus on moving safely.
The systems also rely on the navigator to decide exactly where to turn, which accounts for errors that accumulate during tracking. About five meters before an expected turn, the system announces it with an instruction such as “Turn left at the next intersection,” and the navigator can then locate the turn with the help of a cane or guide dog.
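The five-meter advance notice translates into very simple logic. The sketch below assumes a text-to-speech callback (`speak`) and a running estimate of the distance to the next turn; both names are hypothetical.

```python
ANNOUNCE_DISTANCE = 5.0  # meters before the expected turn, as described in the article

def maybe_announce_turn(distance_to_turn, direction, already_announced, speak):
    """Speak the instruction once when the navigator comes within five meters
    of the expected turn; the navigator then finds the exact turning point
    with a cane or guide dog."""
    if distance_to_turn <= ANNOUNCE_DISTANCE and not already_announced:
        speak(f"Turn {direction} at the next intersection.")
        return True
    return already_announced
```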
“In my opinion, sharing responsibility is the right approach,” Manduchi said. “As a philosophy, you can’t rely on technology alone. That also applies to driving a car: when you’re told to turn right, you don’t turn right immediately; you look for where the intersection is. You have to make some of the effort yourself rather than leaving everything to the system.”
The researchers tested the system at UC Santa Cruz’s Baskin Engineering building and found that it successfully guided users through many hallways and turns. For now, the team is keeping the two capabilities as separate apps that share the same interface, which eases development, and will continue to refine them.
Future work will focus on integrating AI capabilities that let navigators take a photo of their surroundings and receive a description of the scene when they are in particularly difficult areas, such as building alcoves or large open spaces. The researchers also want to improve the ability to access and download building maps, perhaps by leveraging the open-source software ecosystem.
“I am very grateful to the blind community in Santa Cruz for their great advice,” Manduchi said. “[As engineers creating technology for the blind community] we have to be very careful, we have to be very humble, and we have to start not with the technology itself, but with the people who use the technology.”
Source:
University of California, Santa Cruz
Journal reference:
Tsai, C. H., et al. (2024). Back and forth: An inertial-based indoor wayfinding and backtracking app for blind travelers with their phones in their pockets. ACM Transactions on Accessible Computing. https://doi.org/10.1145/3696005