A user over on Reddit has shared some bits of code found in the MyGlass app, now available on the Play Store.

To be clear, nothing in these strings suggests that we will be seeing full eye tracking in the first generation of Glass; as far as we know, the device doesn't even have the hardware to support such a feature. But how could it have the hardware to detect even a "wink" gesture, much less calibrate one? Is there something I'm missing here? Google doesn't mention a sensor for this in Glass' specs, but if the device has an accelerometer, I could see it being able to pick up such a quick, abrupt gesture. If you wear glasses, you'll know that forcefully winking the right eye shifts the frames enough that it might well be detectable.

I have a feeling Abhishek will find this interesting.
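For the curious, here's a purely hypothetical sketch of the kind of detection I'm imagining. Nothing here is from the MyGlass code; it just shows how a simple threshold over a sensor stream (proximity, motion, whatever Glass actually uses) could flag a short, abrupt spike as a "wink" candidate while ignoring slow drift. The function name, threshold, and sample values are all made up for illustration.

```python
def detect_wink(samples, threshold=0.8, max_duration=3):
    """Return start indices of spikes above `threshold` lasting at most
    `max_duration` consecutive samples -- i.e., short, abrupt events."""
    events = []
    run_start = None
    for i, value in enumerate(samples):
        if value > threshold:
            if run_start is None:
                run_start = i  # a spike just began
        else:
            # spike ended; keep it only if it was brief enough
            if run_start is not None and (i - run_start) <= max_duration:
                events.append(run_start)
            run_start = None
    # close out a spike still running at the end of the stream
    if run_start is not None and (len(samples) - run_start) <= max_duration:
        events.append(run_start)
    return events

# A gradual change never fires; a short sharp spike does.
signal = [0.1, 0.2, 0.1, 0.95, 0.97, 0.1, 0.2]
print(detect_wink(signal))  # → [3]
```

A real implementation would obviously need calibration per user (which is exactly what the "calibrate" strings hint at), but the basic shape of the problem is just spike detection.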