Engadget post time: April 22, 2013 at 07:09PM
While perusing the code for Google Glass’s companion Android app, Reddit user Fodawim chanced across several lines of code that could offer up some interesting navigation options for your Glass. Titled ‘eye gestures,’ the code suggests the wearable’s built-in sensors should be able to detect eye activity and integrate it into device input. Two lines of code mention enabling and disabling eye gestures, suggesting it’ll be an optional feature, while other lines hint that it would have to be calibrated to your wink before use. Get your well-timed slow-wink at the ready, however, as the final line spotted suggests that a wink gesture can command the 5-megapixel camera to capture whatever you’re looking at. Google was already granted a patent for unlocking a screen using eye-tracking information, although wink-based commands sound a shade easier to deal with — as long as it doesn’t think we’re blinking.
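Glass’s actual eye-gesture code isn’t public, but the strings spotted — an enable/disable toggle, per-user wink calibration, and a wink-to-shoot trigger — hint at roughly this kind of logic. The sketch below is purely illustrative: every class name, threshold, and duration value is an assumption, not anything from the app itself.

```python
# Hypothetical sketch of the behavior hinted at by the app strings:
# calibrate to the wearer's wink, then distinguish a deliberate wink
# from an involuntary blink before firing the camera. All names and
# numbers here are invented for illustration.

from statistics import mean

class WinkDetector:
    def __init__(self):
        self.enabled = False       # the strings suggest an on/off toggle
        self.threshold_ms = None   # set during per-user calibration

    def calibrate(self, sample_wink_durations_ms):
        """Learn the wearer's typical wink length from a few samples."""
        # A deliberate wink lasts longer than a reflexive blink, so
        # closures noticeably shorter than the calibrated wink are ignored.
        self.threshold_ms = 0.6 * mean(sample_wink_durations_ms)
        self.enabled = True

    def on_eye_closure(self, duration_ms, capture_photo):
        """Classify one eye closure; trigger the camera on a wink."""
        if not self.enabled or self.threshold_ms is None:
            return False
        if duration_ms >= self.threshold_ms:
            capture_photo()        # e.g. fire the 5-megapixel camera
            return True
        return False               # too short: treat as an ordinary blink

detector = WinkDetector()
detector.calibrate([400, 450, 380])   # three practice winks, in ms
shots = []
detector.on_eye_closure(120, lambda: shots.append("photo"))  # blink: ignored
detector.on_eye_closure(420, lambda: shots.append("photo"))  # wink: capture
```

The calibration step is why a blink (here, 120 ms) falls below the learned threshold while a deliberate wink (420 ms) clears it — which matches the article’s closing worry about the device mistaking blinks for winks.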