Technology improves our lives, shapes our daily routines in positive ways, and helps us overcome all kinds of problems. A great deal of work and research is being done to find ways to improve life for visually impaired people. Reading and recognition software could turn mobile devices like smartphones, tablets, and smart glasses into indispensable aids for the visually impaired.
Right now, there are already many wearable assistive devices and apps that can help partially sighted and blind people.
1. Assisted Vision Smart Glasses
These smart glasses are designed specifically for people who are not completely blind; for those with some remaining sight, the glasses can be tuned to make the most of it. They are built from transparent OLED displays, two small cameras, a gyroscope, a compass, a GPS unit, and headphones. “Our latest prototype (pictured) has an Epson Moverio BT 100 and an Asus Xtion depth camera and 3D printed frames,” explained Dr Hicks. “What we’re trying to do with the project is produce a pair of glasses that can enable someone who has got very little sight to walk around unfamiliar places, to recognize obstacles, and to get a greater independence.”
2. AI Glasses
Another glasses project, called AI Glasses, combines glasses fitted with stereo sound sensors and GPS technology with an attached tablet. The system can give spoken directions, recognize denominations of currency, read signs, identify colors, and more, and it employs machine learning to recognize different places and objects. Because it uses ultrasound, it can also detect translucent obstacles, like glass doors. “We currently have a light weight, ergonomically acceptable prototype since it almost looks like a normal pair of glasses and can work in real time with batteries that last approximately four hours in continuous use. We hope to have a commercial prototype by next August at the latest, and being able to market it in early 2015,” said project leader Eduardo José Bayro Corrochano.
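The ultrasound ranging that lets the AI Glasses detect even transparent obstacles rests on a simple principle: emit a pulse, time the echo, and compute distance as half the round trip multiplied by the speed of sound. The sketch below is purely illustrative (it is not the project's code); the function name and sample timing are assumptions.

```python
# Illustrative sketch of ultrasonic ranging, the principle that lets
# ultrasound-based aids detect even translucent obstacles such as
# glass doors. A pulse is emitted, the echo's round-trip time is
# measured, and distance = round trip * speed of sound / 2.

SPEED_OF_SOUND_M_S = 343.0  # in dry air at roughly 20 degrees C

def echo_to_distance(round_trip_s: float) -> float:
    """Convert an echo round-trip time (seconds) to distance (metres)."""
    return round_trip_s * SPEED_OF_SOUND_M_S / 2.0

# A 0.01 s round trip corresponds to an obstacle about 1.7 m away.
print(echo_to_distance(0.01))
```

Because the sound wave reflects off any acoustic boundary, a glass door returns an echo just as a wall does, which is exactly what camera-only systems miss.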
3. Braille ebook reader
The Anagraphs project set out to build a refreshable Braille e-book reader: a device that would use thermo-hydraulic micro-actuation to raise Braille dots, activated by infrared laser radiation delivered via a micro-mirror scanning system. It is easiest to imagine as a kind of wax material that goes from solid to liquid with heat and can easily be reshaped to create Braille dots. Unfortunately, the EU funding has run out and the project needs more money to be realized.
4. FingerReader
This MIT Media Lab project is a wearable device: a chunky ring that sits on the finger and can detect and interpret 12-point printed text as the user scans a finger across it, reading aloud in real time. Small vibrations alert the wearer to any deviation from the line.
All of these visual aids, however, require additional equipment, which can be uncomfortable for visually impaired people. There are also many existing smartphone apps that can identify an object or speak text to describe places, but even with these apps, the sensors embedded in the device are not fully exploited.
However, research on helping visually impaired people “see” has recently made significant progress. Specialists in computer vision and machine learning at the University of Lincoln, UK, funded by a Google Faculty Research Award, are aiming to embed a smart vision system in mobile devices to help people with sight problems navigate unfamiliar indoor environments.
To overcome the problems common visual assistance tools have, the team plans to use the color and depth sensing technology inside new smartphones and tablets, such as Google's recent Project Tango, to enable 3D mapping and localization, navigation, and object recognition. The team will then develop the best interface to relay that information to users, whether through vibrations, sounds, or the spoken word.
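As a hypothetical sketch of the pipeline described above (not the Lincoln team's actual code), the snippet below scans the central corridor of a depth image from a Tango-style RGB-D sensor for the nearest obstacle and maps its distance to a vibration strength between 0 and 1. All names, the corridor width, and the 4 m range are assumptions for illustration.

```python
# Hypothetical sketch: turn a depth image from an RGB-D sensor into a
# simple haptic cue. Scan the central columns for the nearest valid
# reading, then map distance to vibration strength (closer = stronger).

def nearest_obstacle_m(depth_rows, corridor=(2, 6)):
    """Return the smallest valid depth (metres) within the central columns."""
    lo, hi = corridor
    readings = [d for row in depth_rows for d in row[lo:hi] if d > 0]
    return min(readings) if readings else float("inf")

def vibration_strength(distance_m, max_range_m=4.0):
    """Obstacles beyond max range give no vibration; nearer ones ramp up to 1."""
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - distance_m / max_range_m

# Toy 4x8 depth map in metres; 0 means "no reading" (e.g. sensor dropout).
depth = [
    [3.0, 3.0, 2.5, 2.5, 2.6, 2.4, 3.0, 3.0],
    [3.0, 3.0, 1.8, 1.9, 2.0, 2.1, 3.0, 3.0],
    [3.0, 3.0, 1.2, 1.3, 0.0, 1.5, 3.0, 3.0],
    [3.0, 3.0, 1.1, 1.2, 1.3, 1.4, 3.0, 3.0],
]
d = nearest_obstacle_m(depth)          # nearest obstacle is about 1.1 m away
print(d, vibration_strength(d))
```

A real system would also need the localization half (tracking the user's pose against a 3D map), but the distance-to-feedback mapping illustrates the "relay to the user" step the researchers describe.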
Dr Nicola Bellotto, an expert on machine perception and human-centred robotics from Lincoln’s School of Computer Science, said: “This project will build on our previous research to create an interface that can be used to help people with visual impairments.
“There are many visual aids already available, from guide dogs to cameras and wearable sensors. Typical problems with the latter are usability and acceptability. If people were able to use technology embedded in devices such as smartphones, it would not require them to wear extra equipment which could make them feel self-conscious. There are also existing smartphone apps that are able to, for example, recognise an object or speak text to describe places. But the sensors embedded in the device are still not fully exploited. We aim to create a system with ‘human-in-the-loop’ that provides good localisation relevant to visually impaired users and, most importantly, that understands how people observe and recognise particular features of their environment.”
Benefits of Smartphone Vision Technology
3D mapping and localisation: The device builds a 3D map of the environment and locates the user within it, helping to identify the current location.
Easy navigation: Vision technology embedded in the smartphone detects data through the device camera and identifies the surrounding environment from visual clues. The space the user is moving through can be recognised, helping the user navigate with the smartphone alone.
Object recognition: Objects anywhere in a room can be identified, so visually impaired people can move around as confidently as if they could see them.
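One simple way an interface might relay a recognized object's position as speech is with clock-face directions, a convention familiar from orientation-and-mobility training. This is illustrative only: the detector itself is assumed, and the function names and thresholds are invented for the example; we just convert the object's horizontal position in the camera frame (0.0 = far left, 1.0 = far right) into a spoken cue.

```python
# Illustrative sketch: relay a recognized object's position as a spoken
# clock-face cue. The object detector is assumed; we only convert its
# normalized horizontal position into coarse "o'clock" directions.

def clock_direction(x_norm: float) -> str:
    """Map a normalized horizontal position to a coarse clock direction."""
    if x_norm < 0.2:
        return "10 o'clock"
    if x_norm < 0.4:
        return "11 o'clock"
    if x_norm < 0.6:
        return "12 o'clock"
    if x_norm < 0.8:
        return "1 o'clock"
    return "2 o'clock"

def spoken_cue(label: str, x_norm: float, distance_m: float) -> str:
    """Compose the sentence a text-to-speech engine would read aloud."""
    return f"{label} at {clock_direction(x_norm)}, about {distance_m:.0f} metres"

print(spoken_cue("door", 0.55, 3.2))  # door at 12 o'clock, about 3 metres
```

The same cue could equally be rendered as stereo audio panning or directional vibration; the point is that recognition output must be translated into a channel the user can act on.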
Smart vision technology enables visually impaired people to recognize visual clues in their environment. As they become accustomed to it, they can manage by themselves without depending on others, enhancing both their self-confidence and their independence.