Published on Aug 21, 2023
This paper describes the system architecture for a navigation tool for visually impaired persons. The major parts are: a multi-sensory system (comprising stereo vision, acoustic range finding and movement sensors), a mapper, a warning system and a tactile human-machine interface. The sensory parts are described in more detail, and the first experimental results are presented.
About 1% of the human population is visually impaired, and among them about 10% are fully blind. One consequence of visual impairment is limited mobility. For global navigation, many tools already exist; for instance, handheld GPS systems for the blind are now available for outdoor use. These tools are not helpful for local navigation: local path planning and collision avoidance. The traditional aids, i.e. the guide dog and the cane, are much appreciated, but they do not adequately solve the local navigation problem. Guide dogs cannot be deployed on a large scale (the training capacity in the Netherlands is about 100 guide dogs yearly, just enough to serve about 1000 users), and the cane is too restrictive.
The goal of this research is to develop a wearable tool that assists blind users in accomplishing their local navigation tasks. Fig. 1 shows the architecture of the proposed tool. It consists of a sensory system controlled by the user. The primary data needed for local navigation is range data (which is not necessarily obtained from visual data alone; at this point, the choice of sensors is still an open question). The mapper converts the range data into map data. The local map is the input to a warning system that transforms the map data into a form suitable for communication. To give the blind person freedom of movement, he or she must be able to control the focus of attention of the sensory system; for that purpose, the tool is provided with a human-machine interface.
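The data flow described above (sensors, mapper, warning system, interface) can be illustrated with a minimal sketch. The class and function names below are hypothetical and only show how the stages might be chained; the paper does not prescribe an implementation, and the danger threshold is an assumed value.

```python
from dataclasses import dataclass

@dataclass
class RangeReading:
    """A single range measurement: bearing (degrees) and distance (metres)."""
    bearing_deg: float
    distance_m: float

def build_local_map(readings):
    """Mapper stage: convert raw range data into a simple obstacle map
    (here just a dict of bearing -> nearest distance)."""
    local_map = {}
    for r in readings:
        nearest = local_map.get(r.bearing_deg, float("inf"))
        local_map[r.bearing_deg] = min(nearest, r.distance_m)
    return local_map

def generate_warnings(local_map, danger_distance_m=1.5):
    """Warning stage: flag directions in which an obstacle is closer
    than the (assumed) danger threshold."""
    return [b for b, d in local_map.items() if d < danger_distance_m]

def present_warnings(warnings):
    """Interface stage stand-in: the real tool would drive a tactile
    display; here we simply print the bearings to warn about."""
    for bearing in warnings:
        print(f"Obstacle at {bearing:.0f} degrees")

# Example run with fabricated sensor readings.
readings = [RangeReading(0, 0.8), RangeReading(30, 3.2), RangeReading(-30, 1.2)]
present_warnings(generate_warnings(build_local_map(readings)))
```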
The ultimate goal of this project is to provide an electronic tool for the local navigation task of the blind. The tool must provide information about the user's immediate surroundings so that he or she can move around without collisions. We assume that, although mostly unknown, the environment has some structure, as in an urban outdoor situation (e.g. a street) or an indoor one: smooth floors, the occasional doorstep, stairs, walls, door openings and all kinds of objects that may obstruct the passage.
We start with three sensor types: stereo vision, optical flow, and sonar. Preliminary research has shown that other types of sensors are also of interest, e.g. ladar, radar and infrared (for detecting people and traffic). The system should be expandable so that information from these sensor types can be integrated easily in a later stage of the project.
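One way to keep the system expandable is to hide each sensor behind a common range-sensor interface, so that ladar, radar or infrared modules can be added later without touching the mapper. The sketch below only illustrates that idea; the interface and sensor class names are assumptions, not part of the paper.

```python
from abc import ABC, abstractmethod

class RangeSensor(ABC):
    """Common interface every sensor module is assumed to implement, so the
    mapper can treat stereo vision, sonar, or a future ladar module alike."""

    @abstractmethod
    def read(self):
        """Return a list of (bearing_deg, distance_m) tuples."""

class SonarSensor(RangeSensor):
    def read(self):
        # Placeholder: a real driver would query the ultrasonic hardware.
        return [(0.0, 1.2)]

class StereoVisionSensor(RangeSensor):
    def read(self):
        # Placeholder: a real implementation would compute disparity-based depth.
        return [(-20.0, 2.5), (15.0, 3.0)]

def gather_readings(sensors):
    """Fuse readings from all registered sensors; adding a new sensor type
    only requires another RangeSensor implementation."""
    readings = []
    for sensor in sensors:
        readings.extend(sensor.read())
    return readings

print(gather_readings([SonarSensor(), StereoVisionSensor()]))
```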
1. Whenever the blind user wants to go to a particular place, he first sets the route on his mobile phone.
2. Wherever he goes, he carries this system with him.
3. When he goes out, his system communicates with the house through GSM.
4. His system communicates with the bus stop/shop system through RF communication.
5. After receiving data from the bus stop/shop system, the blind user's system announces it by voice through a headphone.
6. The same data is sent to the house by GSM (the overall flow is sketched after this list).
7. The family at home can monitor the blind user through a mobile phone and see which street and area he is in.
8. This system can support blind users and children alike.
9. The ultrasonic sensor gives the blind user the distance to each object.
10. RF communication provides the path name and signal identification.
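The message flow of steps 3 to 7 can be illustrated with a small simulation. All class names and the message format below are hypothetical stand-ins for the GSM (F-Bus) and 433 MHz RF links; this is a sketch of the flow, not the project's actual firmware.

```python
class HomeUnit:
    """Stand-in for the GSM-connected unit at the user's house."""
    def __init__(self):
        self.log = []

    def receive_sms(self, text):
        # In the real system this would arrive over GSM (F-Bus to a phone).
        self.log.append(text)
        print(f"[home] update received: {text}")

class RoadsideBeacon:
    """Stand-in for the RF transmitter at a bus stop or shop."""
    def __init__(self, name):
        self.name = name

    def broadcast(self):
        # A 433 MHz beacon would transmit its identifier periodically.
        return f"location:{self.name}"

class BlindUnit:
    """Stand-in for the wearable unit carried by the blind user."""
    def __init__(self, home):
        self.home = home

    def on_rf_message(self, message):
        # Step 5: announce the received location by voice (printed here).
        print(f"[headphone] You are near {message.split(':', 1)[1]}")
        # Step 6: forward the same information to the house over GSM.
        self.home.receive_sms(message)

home = HomeUnit()
unit = BlindUnit(home)
unit.on_rf_message(RoadsideBeacon("Main Street bus stop").broadcast())
```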
1. Ultrasonic object sensing and distance measuring, 50 kHz (see the timing sketch below)
2. Path planning algorithm
3. F-Bus protocol (GSM)
4. RF communication, 433 MHz
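An ultrasonic sensor measures distance from the echo round-trip time: distance = (speed of sound x echo time) / 2. The function below is a minimal sketch of that calculation; the timing interface is assumed, not taken from the project hardware.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 degrees C

def distance_from_echo(echo_time_s):
    """Convert an ultrasonic echo round-trip time (seconds) into a one-way
    distance (metres): the pulse travels to the object and back, so the
    round-trip distance is halved."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

# Example: an echo returning after 6 ms corresponds to roughly 1 metre.
print(f"{distance_from_echo(0.006):.2f} m")  # -> 1.03 m
```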