In a breakthrough move, Google is partnering its Glass platform with smartphone-app maker Envision to help blind and visually impaired users ‘see’ the world around them through the spoken word.

Envision’s software is equipped with AI that detects information from what’s being ‘viewed’ through the glasses. It then announces what it’s detecting so the wearer can better understand their environment.

Many people, particularly those with vision impairments, have been waiting with bated breath to see how AI, IoT, and advancing assistive technologies would serve the blind community. Now, Google and Envision are stepping up to fill the need.

How Does the Technology Work?

The new glasses read text aloud to the wearer, assisting those with visual impairments to interpret the world around them with greater ease.

Envision’s software uses optical character recognition (OCR) to discern text within a digital image. That means the software can detect text on food packaging, billboards, posters, and digital displays, in more than 60 languages.

Google Glass Enterprise Edition 2 will then read the text aloud to the wearer, giving blind users more independence to interact with the world around them.
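The detect-then-speak flow described above can be sketched in a few lines. This is an illustrative sketch only: Envision’s actual API is not public, so every name below (`OcrResult`, `read_aloud`, `fake_detect_text`) is a hypothetical stand-in for the real components.

```python
# Hypothetical sketch of the OCR-then-announce pipeline the article
# describes. None of these names come from Envision's real software.
from dataclasses import dataclass
from typing import Callable

@dataclass
class OcrResult:
    text: str        # text detected in the camera frame
    language: str    # one of the 60+ supported languages

def read_aloud(frame: bytes,
               detect_text: Callable[[bytes], OcrResult],
               speak: Callable[[str], None]) -> OcrResult:
    """Run OCR on a camera frame and announce the result to the wearer."""
    result = detect_text(frame)
    if result.text:
        speak(result.text)       # hand the detected text to text-to-speech
    else:
        speak("No text detected.")
    return result

# Stub recognizer standing in for the real OCR engine.
def fake_detect_text(frame: bytes) -> OcrResult:
    return OcrResult(text="EXIT", language="en")

spoken = []
read_aloud(b"\x00" * 16, fake_detect_text, spoken.append)
print(spoken)  # -> ['EXIT']
```

In a real pipeline, `detect_text` would wrap an on-device OCR model and `speak` a text-to-speech engine; structuring both as injected callables keeps the announcement logic testable without camera hardware.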

What Does it Allow the Wearer to See?

Envision says its OCR is the fastest and most accurate available. Beyond reading text, the software can recognize faces, detect colors, describe scenes, and find objects, so someone using the glasses could find their way around town, read street signs, or recognize friends passing by.

Wearers could ‘read’ documents at work or find a misplaced object at home. They could pick out a bouquet of flowers in a friend’s favorite color, or check whether that snake they want to hold is venomous. How does the saying go? Red on yellow, kill a fellow; red on black, friend of Jack… something like that.

When Will it Be Available?

The next-gen glasses will be available for purchase starting in late August 2020.

The award-winning Envision app is available on both iOS and Android and will now be integrated into Google’s next generation of smart glasses.

Incorporating Envision into the glasses frees up both hands of the wearer, which is invaluable when one may already be holding a cane or the reins on a guide dog. 

This development could dramatically change how the nearly 30 percent of the global population living with a vision impairment interprets the world.

The glasses are set to hit shelves in August 2020, with pre-orders available now. They’re not cheap, but for many visually impaired wearers, it may be hard to put a price on greater independence.


The piece Smart Glasses Give ‘Sight’ to the Blind by Patricia Miller first appeared on Innovation & Tech Today.