Brandyn White and Andrew Miller are computer science Ph.D. students and the principals of Dapper Vision, which provides “computer vision consulting and development with a focus on web-scale, mobile, and cloud applications.” They are also spearheading, via Dapper Vision, the OpenGlass Project, which is using emerging Google Glass technology to develop applications that can help blind and visually impaired users identify objects and environments via crowd-sourcing technologies and feedback.
About Google Glass: the Basics
Google Glass is a wearable computer with an optical head-mounted display, developed by Google as part of the Project Glass research and development project. Google’s mission is to produce a mass-market pervasive computing device [i.e., computing that can appear everywhere and anywhere]. Google Glass displays information in a smartphone-like, hands-free format and can interact with the Internet via natural-language voice commands.
Here is more information about the Google Glass research and development project from Engadget’s comprehensive – and excellent – overview:
…it’s not a pair of “Google Glasses,” but a single Google Glass headset. Glass has a very simple, clean design that, in some regards, is beautiful and elegant; in others, crude and clumsy. We’ll start with the elegant bits, most compelling being the plastic-backed titanium band that sweeps around and forms the frame.
From here, two nose grippers (also titanium) arc down, each one terminating with a clear silicone pad. These pads are replaceable and tacky enough to keep the whole assembly from immediately sliding down your nose.
All the circuitry for the device lies in two plastic housings, one that rests behind your ear (containing the battery and bone conductive speaker) and a second that’s up front (with the processor, camera and display assembly). The side of the forward portion is also touch-sensitive, forming a … slender trackpad.
Glass can function with a WiFi or Bluetooth data connection – it is a fully independent device rather than a phone accessory. This means you can leave your phone behind and walk around anywhere with WiFi without losing your connection.
The display in Glass is an interesting one. When wearing the headset, you can look straight through the transparent part and barely even see it. It only minimally refracts the light that’s beaming toward your eye. But, if you look at it from above, you can clearly see the reflective surface embedded inside at a 45-degree angle, forming the display your eyes see.
The panel itself is off to the right, built into the headset and beaming light into the clear piece from the side, which then hits that sliver of material and reflects into your eye. It’s an interesting arrangement and the net result is, indeed, a glowing image that appears to be floating in space. Google says it’s “the equivalent of a 25-inch high definition screen from eight feet away.”
The full Engadget review (with photos) describes how to activate Glass; use touch controls and voice commands; and take photos and videos.
About the OpenGlass Project
The OpenGlass Project is using Google Glass technology to develop applications that can help blind and visually impaired users identify objects and environments via established crowd-sourcing technologies.
The following videos demonstrate user trials of two OpenGlass applications in development that can inform blind and visually impaired users about critical features and/or objects in their environments:
- The first application, called Question-Answer, allows blind and visually impaired users to use Google Glass to take a picture with a question attached, which is sent to “the cloud” for answers from sighted respondents via Twitter or Amazon’s Mechanical Turk platform. The answer is read aloud to the user through the bone conduction speaker that is part of the Google Glass headset.
- The second application, called Memento, automatically recites notes when the blind or visually impaired user faces, or looks at, a recognizable scene. To use Memento, sighted users must first record descriptions or commentary about environmental features or a room setup. When a blind or visually impaired person using Google Glass approaches the same spot, Google Glass will recognize the feature or scene and read back the pre-recorded commentary.
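The Question-Answer flow described above can be sketched in a few lines of Python. Everything here is hypothetical: the class and method names are invented for illustration, and a simple in-memory queue stands in for the real Twitter/Mechanical Turk crowd-sourcing backend.

```python
import queue

# Hypothetical sketch of the Question-Answer flow: a user attaches a
# question to a photo, a sighted respondent answers it, and the answer
# is returned as the text the headset would read aloud. An in-memory
# queue stands in for the Twitter / Mechanical Turk backend.

class QuestionAnswerDemo:
    def __init__(self):
        self.pending = queue.Queue()   # questions awaiting a sighted respondent

    def ask(self, photo_bytes, question):
        """Capture step: attach a question to a photo and submit it."""
        task = {"photo": photo_bytes, "question": question, "answer": None}
        self.pending.put(task)
        return task

    def crowd_answer(self, answer):
        """A sighted respondent answers the oldest pending question."""
        task = self.pending.get()
        task["answer"] = answer
        return task

    def speak(self, task):
        """Playback step: return the text the bone-conduction speaker
        would read aloud to the user."""
        return f'Answer: {task["answer"]}'

demo = QuestionAnswerDemo()
demo.ask(b"<jpeg data>", "What colour is this shirt?")
task = demo.crowd_answer("It is dark blue.")
print(demo.speak(task))  # -> Answer: It is dark blue.
```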
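Memento’s record-then-recognize loop can likewise be sketched. This is a toy illustration under stated assumptions, not the OpenGlass implementation: real scene recognition uses robust visual features, while here each “frame” is reduced to a tiny intensity histogram so the example stays self-contained.

```python
import math

# Hypothetical sketch of Memento: a sighted user records notes tied to
# scenes; later, the best-matching recorded scene's note is played back.
# A real system would use robust visual features; a coarse intensity
# histogram is used here purely for illustration.

def histogram(pixels, bins=4):
    """Normalized intensity histogram of a list of 0-255 pixel values."""
    counts = [0] * bins
    for p in pixels:
        counts[min(p * bins // 256, bins - 1)] += 1
    return [c / len(pixels) for c in counts]

def similarity(h1, h2):
    """Cosine similarity between two histograms (1.0 = identical)."""
    dot = sum(a * b for a, b in zip(h1, h2))
    norm = math.sqrt(sum(a * a for a in h1)) * math.sqrt(sum(b * b for b in h2))
    return dot / norm if norm else 0.0

class MementoDemo:
    def __init__(self, threshold=0.95):
        self.notes = []          # (histogram, note) pairs from a sighted user
        self.threshold = threshold

    def record(self, frame, note):
        """Recording step: associate a note with the current scene."""
        self.notes.append((histogram(frame), note))

    def recognize(self, frame):
        """Playback step: return the note for the best-matching
        recorded scene, or None if nothing matches well enough."""
        h = histogram(frame)
        best = max(self.notes, key=lambda item: similarity(item[0], h),
                   default=None)
        if best and similarity(best[0], h) >= self.threshold:
            return best[1]
        return None

demo = MementoDemo()
kitchen = [30, 40, 200, 210, 35, 220]    # toy "frames": lists of pixel values
hallway = [120, 130, 125, 118, 122, 128]
demo.record(kitchen, "The kettle is on the counter to your left.")
demo.record(hallway, "The hallway door opens inward.")
print(demo.recognize([32, 41, 198, 212, 33, 219]))
# -> The kettle is on the counter to your left.
```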
You can view “Testing OpenGlass with Visually Impaired Users” on YouTube.
You can view “Glass Applications for Visually Impaired Users” on YouTube.
The Future of the OpenGlass Project
According to Dapper Vision, the OpenGlass applications will remain in limited testing until Google releases Google Glass to the general public. Until then, Dapper Vision is developing a method to reward Question-Answer contributors with bitcoins [i.e., digital currency]. Dapper Vision is also releasing weekly videos to document its progress on the ongoing OpenGlass Project.