With computers already helping people navigate the world, robotics and AI are now set to help the visually impaired get around without assistance. Scientists from IBM Research and Carnegie Mellon University (CMU) are working on solutions to turn this into reality.
At the moment, scientists are working on an app called “NavCog” which will help the visually impaired navigate through sounds or vibrations generated on their smartphones. Users will have the option to run the app in either voice or vibration mode, whichever suits their needs.
[must url=”https://www.hackread.com/google-chrome-color-enhancer-extension/”]Now Color-Blind people can View Web– Thanks to New Chrome Extension[/must]
Basically, algorithms build a 3D model of the surroundings from the images captured by the smartphone’s camera. The 3D model is then used to generate turn-by-turn guidance, delivered to the visually impaired user as voice prompts or vibrations.
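The guidance step described above can be illustrated with a toy sketch: given the user’s position and heading plus the next waypoint from the 3D model, compute the bearing difference and map it to a spoken cue. All function names, coordinates, and the turn threshold below are invented for illustration; this is not NavCog’s actual code.

```python
import math

def bearing_to(cur, target):
    """Compass-style bearing in degrees from the current point to the
    target waypoint (x = east, y = north, 0 deg = due north)."""
    dx, dy = target[0] - cur[0], target[1] - cur[1]
    return math.degrees(math.atan2(dx, dy)) % 360

def turn_instruction(heading, cur, target, tolerance=20):
    """Map the angle between the user's heading and the next waypoint
    to a simple spoken cue, as a turn-by-turn system might."""
    # Signed difference in (-180, 180]: positive means the target is to the right.
    diff = (bearing_to(cur, target) - heading + 180) % 360 - 180
    if abs(diff) <= tolerance:
        return "continue straight"
    return "turn right" if diff > 0 else "turn left"

# Facing north at the origin, with the next waypoint due east:
print(turn_instruction(0, (0, 0), (5, 0)))  # -> turn right
```

In a real system the cue would be fed to a text-to-speech engine or converted into a vibration pattern, but the geometry is the same.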
The project is a joint collaboration between IBM’s research center in Yorktown Heights, N.Y., and Carnegie Mellon. It is worth noting that one of the lead researchers on the project, Chieko Asakawa, is visually impaired herself. She believes the app will enable her to move around the research center with ease.
But this is just the start; there is more to be done. Scientists now plan to test the app outside the research center campus, where all experiments have taken place so far. To that end, they are making the project available via IBM’s Bluemix cloud so that other developers can work on features that are not yet finished and see how the app performs in other environments.
Researchers are also working on making the app work in environments that have no Bluetooth beacons (currently, the app only works in environments where beacons are installed). For this purpose, they are working on advances in computer vision.
According to the researchers, computer vision combined with localization technologies could help map locations. This would rule out the need for Bluetooth beacons, which are not readily available everywhere.
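To give a sense of what beacon-based localization involves (and why replacing it is non-trivial), here is a minimal sketch of one common approach: convert each beacon’s received signal strength (RSSI) to an estimated distance with a log-distance path-loss model, then take a weighted centroid of the beacon positions. The beacon names, positions, and RSSI values are made up for illustration; this is not the NavCog algorithm.

```python
# Hypothetical beacon data: (x, y) positions in meters and measured RSSI in dBm.
beacons = {
    "beacon_a": {"pos": (0.0, 0.0), "rssi": -60},
    "beacon_b": {"pos": (10.0, 0.0), "rssi": -70},
    "beacon_c": {"pos": (0.0, 10.0), "rssi": -75},
}

def rssi_to_distance(rssi, tx_power=-59, path_loss_exp=2.0):
    """Estimate distance in meters using the log-distance path-loss model.
    tx_power is the expected RSSI at 1 m; both values vary per beacon."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exp))

def estimate_position(beacons):
    """Weighted-centroid estimate: beacons that appear closer get more weight."""
    total_w, wx, wy = 0.0, 0.0, 0.0
    for b in beacons.values():
        d = rssi_to_distance(b["rssi"])
        w = 1.0 / max(d, 0.1)  # clamp to avoid division by zero at close range
        x, y = b["pos"]
        wx += w * x
        wy += w * y
        total_w += w
    return (wx / total_w, wy / total_w)

print(estimate_position(beacons))
```

Because RSSI fluctuates with walls, bodies, and interference, real deployments smooth readings over time and calibrate each beacon, which is part of why a vision-based alternative that needs no installed hardware is attractive.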
In a press release, one of the researchers said:
“From localization information to understanding of objects, we have been creating technologies to make the real-world environment more accessible for everyone,” said Martial Hebert, director of the Robotics Institute at Carnegie Mellon. “With our long history of developing technologies for humans and robots that will complement humans’ missing abilities to sense the surrounding world, this open platform will help expand the horizon for global collaboration to open up the new real-world accessibility era for the blind in the near future.”
[must url=”https://www.hackread.com/smartspecs-wearable-glasses-legally-blind/”]These Wearable Glasses Lets Legally Blind People See Again[/must]
But the project doesn’t end with mere navigation; further goals include facial recognition, which would allow the visually impaired to recognize people they know in real time.
With that, the blind might also be able to recognize a person’s facial expressions through the sensors researchers are planning to add.
Currently, IBM is on the verge of creating something that could change the world. The researchers believe that, in the long term, the project will also help professionals such as medical workers and weather forecasters. But there is still a lot to be done.
This is not the first time researchers have developed an app for the visually impaired. In March 2015, an app known as Wayfindr made news for helping people navigate London’s Tube via sound.
[src src=”Via” url=”http://www.prnewswire.com/news-releases/ibm-research-and-carnegie-mellon-create-open-platform-to-help-the-blind-navigate-surroundings-300160351.html”]News Wire[/src]
[src src=”Source” url=”http://www.post-gazette.com/news/education/2015/10/21/CMU-IBM-app-helps-people-with-visual-impairments-navigate-their-surroundings/stories/201510200182″]Post Gazette[/src]