Abstract:
Vision is a vital cue for human navigation; consequently, visually impaired people face many challenges in day-to-day travel, the most crucial being identifying and avoiding obstacles in the environment. To support blind navigation, numerous electronic travel aids have been created over the past few decades, using obstacle-sensing technologies such as sonar, infrared, and stereo vision. However, navigation based on optical flow estimation, a strategy heavily used by insects and widely experimented with in robotics, has not been applied in these aids. This project aimed to evaluate the potential of optical flow estimation techniques for guiding a visually impaired person around obstacles using auditory and tactile feedback. To demonstrate the researched core concepts, a prototype built around a virtual reality world was designed and developed. It employs an existing optical flow algorithm for motion estimation, other image processing techniques, speech synthesis for auditory feedback, and embedded programming for tactile feedback. The prototype's modular design allows it to run either in simulation mode or as a standalone application in a real-world environment. This work demonstrates the attractive possibilities of using optical flow estimation for visually impaired navigation.
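To make the idea concrete, the following is a minimal illustrative sketch (not the project's actual implementation) of one classic insect-inspired strategy built on optical flow: comparing the average flow magnitude in the left and right halves of the visual field and steering away from the side with larger flow, since nearby obstacles produce faster apparent motion. The function name and the synthetic flow field are assumptions introduced here for illustration only.

```python
import numpy as np

def steer_from_flow(flow):
    """Given a dense optical flow field of shape (H, W, 2), return a
    steering suggestion. Larger average flow magnitude on one side
    indicates nearer obstacles there, so we steer toward the side with
    smaller flow (the flow-balancing rule observed in insects)."""
    mag = np.linalg.norm(flow, axis=2)          # per-pixel flow speed
    half = flow.shape[1] // 2
    left = mag[:, :half].mean()                  # mean flow, left half
    right = mag[:, half:].mean()                 # mean flow, right half
    return "right" if left > right else "left"

# Synthetic field: strong horizontal flow on the left half (a nearby
# obstacle), weak flow on the right (distant background).
flow = np.zeros((4, 8, 2))
flow[:, :4, 0] = 3.0
flow[:, 4:, 0] = 0.5
print(steer_from_flow(flow))  # obstacle on the left, so steer right
```

In a full system such a decision would feed the auditory or tactile feedback channel; here it merely demonstrates how a flow field alone, with no range sensor, can drive an avoidance cue.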