Wearable Tech World Feature Article
June 06, 2013

On the Heels of Apple's Gaze Patent Application Comes Google's Own Eye Tracking Patent

Just yesterday we wrote about Apple filing a patent related to "eye gazing at a mobile display" and using it to manipulate the screen - much as one might use gestures to manipulate a touch screen. We went so far as to suggest that such a capability would be quite useful on a smart watch or an iWatch – and indeed it would. Also worth noting: it has just come to light that Apple has filed a trademark application for the term "iWatch" - in Russia, of all places. It isn't worth speculating on this beyond taking note of it - but there it is.

As much as we could have related the Apple gazing IP to how such a thing might be useful with, say, Google Glass, we didn't. But as it happens, just as we were posting our Apple article, along came news of Google receiving its own patent - specifically, for using eye gestures to unlock the Google Glass screen and to issue commands to the device.

As we should all know by now, Google Glass is activated by touching the side of the device, where its components are housed, and is then operated by speaking commands that begin with "OK Glass…" There is a benefit to doing this, in that it alerts bystanders that a Glass wearer is "doing stuff." Glass also shows a light that is visible when the device is active, so people in the vicinity of a wearer generally know it is in use - but a sufficiently distracted bystander may not notice it. A spoken command, at least, has a chance of cutting through that distraction.

Ah, but what if a Glass wearer could control Glass through a silent wink, or by having the system track the wearer's eye as it follows a pre-defined motion that Glass recognizes as a command - say, to unlock the display, take a photo, or begin shooting video? The distracted individuals around the wearer might be entirely unaware of being photographed or filmed. Interesting indeed.

And now it turns out that Google has in fact been granted an eye tracking patent on just such a capability. The patent is interesting in that it dates back a few years, and it targets the described methods for use with a head-mounted display. What could that mean - Google Glass, perhaps? At least in today's world it does, since Glass didn't exist back in 2008.

Though it was only published in March 2013, the original ideas for the patent date back to 2008, and, in fact, there are references to technology dating back a decade earlier, to 1998. Here's the abstract:

An eye tracking system includes a transparent lens, at least one light source, and a plurality of light detectors. The transparent lens is adapted for disposal adjacent an eye. The at least one light source is disposed within the transparent lens and is configured to emit light toward the eye. The at least one light source is transparent to visible light. The plurality of light detectors is disposed within the transparent lens and is configured to receive light that is emitted from the at least one light source and is reflected off of the eye. Each of the light detectors is transparent to visible light and is configured, upon receipt of light that is reflected off of the eye, to supply an output signal.
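The abstract describes hardware only: transparent light sources and detectors embedded in the lens, each supplying an output signal when light reflected off the eye reaches it. Purely as an illustration - the patent does not spell out any particular algorithm - here is a minimal Python sketch of one way such detector readings could be combined into a rough gaze estimate. The detector layout and the intensity-weighted averaging are our own assumptions, not details from the filing.

```python
# Illustrative sketch only: estimate gaze direction from simulated detector
# readings. The detector positions, weighting scheme, and coordinates are
# assumptions for demonstration, not the patent's method.

def estimate_gaze(detectors):
    """Estimate gaze direction from (x, y, intensity) detector readings.

    Each detector sits at a known (x, y) position in the lens and reports
    the intensity of light reflected back off the eye. The brighter a
    detector's reading, the closer the reflection is assumed to be to that
    detector, so we take an intensity-weighted average of positions.
    """
    total = sum(intensity for _, _, intensity in detectors)
    if total == 0:
        return None  # no reflection detected
    x = sum(px * i for px, _, i in detectors) / total
    y = sum(py * i for _, py, i in detectors) / total
    return (x, y)

# Four detectors at the lens corners; the strongest reflection is upper-right.
readings = [(-1, -1, 0.1), (1, -1, 0.2), (-1, 1, 0.1), (1, 1, 0.6)]
print(estimate_gaze(readings))  # roughly (0.6, 0.4) - gaze toward upper-right
```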

Various methods for tracking eye movement are described, including projecting a moving object that traces a path of some sort for the eye to follow. There are other tracking examples as well, such as following the eye as it reads text - or, more precisely, tracking the eye to determine whether it is actually reading text at all. Below is an image that shows what a user might see on the Glass display.

Below is the same image, only this time provided from the side perspective. The images, we should note, are part of the patent and are in fact directly available on Google's website.

There is one additional aspect of the eye tracking that is important to take note of. In many cases, the eye's movement simply reflects the ordinary motions the eye goes through perhaps thousands of times a day. How can Glass distinguish between such random but common eye movements and movements that specifically signal that Glass should take notice and take some action? It's an interesting question, and it is why the system must ensure that an eye command cannot be confused with random, or otherwise normal, eye movement.

As shown above, Glass would do this by giving the eye something specific to follow. In this sample drawing, a bird is displayed flying across the Glass user's field of view. If the eye tracks the image exactly, it provides Glass with the information it needs to take any number of actions - some of which could conceivably be additional tracking images representing different choices, based on what the user wants to do.
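How might that matching work in practice? Here is a minimal sketch, assuming the system can sample both the displayed object's position and the wearer's gaze point over time: compare the two trajectories and treat a close match as an intentional command. The function name, coordinate convention, and threshold below are illustrative assumptions, not details from the patent.

```python
# Minimal sketch: decide whether a gaze trajectory deliberately followed a
# displayed path (such as the bird in the patent drawings). All names and
# thresholds are illustrative assumptions.

from math import hypot

def gaze_followed_path(displayed_path, gaze_samples, max_mean_error=0.05):
    """Return True if the gaze samples stay close to the displayed path.

    Both inputs are lists of (x, y) points in normalized display coordinates,
    sampled at the same instants. A low mean distance suggests the wearer
    deliberately tracked the moving object rather than glancing around at
    random.
    """
    if not displayed_path or len(displayed_path) != len(gaze_samples):
        return False
    total_error = sum(
        hypot(px - gx, py - gy)
        for (px, py), (gx, gy) in zip(displayed_path, gaze_samples)
    )
    return total_error / len(displayed_path) <= max_mean_error

# Example: a "bird" moving left to right, with gaze that tracks it closely.
path = [(i / 10, 0.5) for i in range(11)]
gaze = [(i / 10 + 0.01, 0.49) for i in range(11)]
print(gaze_followed_path(path, gaze))  # True -> treat as an intentional command
```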

Keep in mind that what the Glass user actually perceives is the equivalent of a 25-inch display at some distance from the user. The user is not seeing some tiny flick of a trail on some tiny screen. A bird traveling across a 25-inch space is far easier to perceive than the images above might suggest, and countless distinct paths are theoretically possible that would be easy for Glass to decode.
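To put a rough number on that: Google's published Glass specifications have described the display as the equivalent of a 25-inch high-definition screen viewed from about eight feet away. The article above only says "at some distance," so treat the eight-foot figure as an assumption, but a quick back-of-the-envelope calculation then gives the apparent size in degrees of visual angle:

```python
# Back-of-the-envelope check of how large the virtual display appears.
# Assumption: a 25-inch diagonal viewed from eight feet, per Google's
# commonly cited Glass spec; the text above only says "at some distance."
from math import atan, degrees

diagonal_in = 25.0          # virtual screen diagonal, in inches
distance_in = 8 * 12.0      # viewing distance, in inches (eight feet)

visual_angle = 2 * degrees(atan((diagonal_in / 2) / distance_in))
print(f"Apparent diagonal: about {visual_angle:.1f} degrees of visual angle")
# ~14.8 degrees - plenty of room for a path the eye can follow comfortably.
```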

Are we there yet? Hardly, but it certainly suggests the way Glass is likely to develop. Google could, of course, also put the patent to use on its smartphones, or very possibly on a smartwatch of its own. It will be interesting to see how the technology is ultimately implemented.

The patent itself is available via the Patent Office or directly from Google.




Edited by Blaise McNamee



