Thursday, August 15, 2013

Google Glass

I saw Kyle Samani speak at AMPD last night on Google Glass. There are maybe ten thousand Google Glasses in circulation right now in advance of their official release likely before Christmas of this year. Google put these out in the wild to build excitement and of course in the name of testing. You can have one yourself via an eBay purchase for four to five thousand dollars. (Kyle guesses a Christmastime price would be around $266 as the devices cost $120 in materials and perhaps $160 when you add in shipping and other factors like that.) Kyle wore a pair during his presentation and in my photo of him (halfway down this blog posting) you can see the fatter of the two handles, the one that sits at your right as you wear the glasses. (One may swipe this with one's finger to scroll through a "Timeline" user interface which Kyle found to be a clumsy experience.)

Anyways, after Kyle finished speaking the kitschy cool (oxymoron?) toy was passed around the room. By dumb luck, I got it first. The individual sitting next to me, who must have been named Amber Lindholm given her name tag, asked that I take her photo within the glasses and it is above. Below, Amber takes the picture of "Gene" while he sports the specs and as you can see there is a line forming to play with the device. (The first individual in the line is a Tim Scott who I've met a few times before. He used to work in C# and is now in the Ruby space.) The man at the far left with his back to the camera is a David Vogel who was hosting AMPD (Austin Mobile Professional Developers).

This talk ended up being a series of bullet points on Google Glass. Yes, it comes with Google Now but, no, Google Goggles was not baked in for some reason, etc. Kyle Samani is the founder and CEO of a medical software company called Pristine and is developing software for medical professionals who will wear the glasses, but his talk was light on specifics. I eventually put my hand up and asked how a doctor would use the software and he told me he could not speak to the specifics. His investors have a gag on him, it seems. He mentioned that security professionals could use the glasses to look through the eyes of security guards from an administrative role, and I imagined that as much could be done in the medical field too. Then I thought of how recording via the glasses could drive down the costs of lawsuits too. It was fun to guess at what Kyle was up to. Kyle mentioned that Tesla cars emit wireless signals exposing an API for metrics and that someone had made a silly app to see these metrics from Google Glass. Maybe Kyle's software could get cues from machinery in an emergency room and relay alerts to a doctor who had both hands tied up. I'm just daydreaming at this point. The bullet points on Google Glass in a more generic sense were:

  • There is a proximity sensor which can tell if you are putting the glasses on or taking them off. It can tell if you tilt your head back thirty degrees, and this is one way to activate the glasses. It can't tell where you are moving your eyes the way a Galaxy's camera might, as the eyeball is nearly a perfect sphere and you would need a camera watching the iris for this. Such a feature is not rolled in, as it would be too energy-intensive. (There is one camera looking outward from the glasses, however, and one may take a photo by touching the glasses lightly.) The sensor can tell if you are winking or blinking. Kyle asked us to put our fingers to our temples and feel how little they moved when we blinked. He then had us wink, and the effect, the amount to which you twitch and the crinkling of the skin, is noticeably more extreme. The differences between winking and blinking can be picked up by the proximity sensor, and someone has written a tool called Winky for triggering wink event handlers.
  • There is no control over brightness, contrast, etc. with the camera.
  • The 640x360 (pixels) screen holds 40 to 50 words of text. Good UX entails white text on a black box. One cannot focus on the real world and the display at the same time; you instead look away from what you are doing to use the glasses. Google Glass is not augmented-reality eyewear in which the whole of one's vision is mediated and one sees, for example, text about the person one is looking at through the glasses. That is a different thing, and in such a setup one loses the ability to communicate with others by way of the facial gestures the eyes afford. There are some other companies working on that stuff, but not Google. Google Glass doesn't have any facial recognition software, and Google has blocked at least one attempt by a developer to offer such software for it.
  • A light is visible to others when the screen is on.
  • Three to five hours of battery life are available, and it takes under an hour for the device to fully recharge.
  • A speaker at the back of the big handle feeds you audio. Passersby will be able to hear beeps but will not be able to really discern words.
  • The operating system was forked off of Android 4.0.4 (Ice Cream Sandwich), and what is different is anyone's guess. Android 4.2.2 may eventually be loaded onto the device, and in the meantime the fork emulates some 4.2.2 behaviors.
  • You may do things with voice commands. Programming around one-word voice commands is ideal, as there is no way to correct a mistake mid-sentence; every spoken word is thus a potential point of failure. Google is not making it easy for developers to create custom voice commands. Kyle implied that he persevered in this arena after much heartache.
  • There is going to be some geolocation stuff. When you pull into a McDonald's, expect the glasses you are wearing to tell you of a special or a discount. Speaking of which, there has been some effort in West Virginia to ban Google Glass from being worn while driving. Kyle defended the glasses by saying they were considerably safer than smartphones in the same circumstances. I suppose just as a cocaine addict might improve his lot slightly by becoming addicted to methadone instead, someone who texts and drives might become a slightly better bad driver by way of Google Glass.
  • The initial setup is painful and involves scanning some QR codes.
  • For tethering (connecting one device to another) one may talk to an Android app called MyGlass on an Android smartphone. In the iPhone realm, all one can do is expose a hotspot via the iPhone and have Google Glass use it to get online.
  • Mirror is a RESTful API which allows you to push data to the glasses.
  • The screen functions as a HUD, of which Wikipedia says: "A head-up display or heads-up display—also known as a HUD—is any transparent display that presents data without requiring users to look away from their usual viewpoints."
  • Warby Parker is to do prescription lenses for the glasses. If you are farsighted you will currently have a hard time reading the display.
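The Mirror bullet above is terse, so here is a hedged sketch of what pushing a card to the glasses might look like. The endpoint, the "timeline item" shape, and the READ_ALOUD menu action come from my reading of Google's public Mirror API documentation, not from Kyle's talk; the access-token handling and the vitals message are placeholders I made up.

```python
import json

# The Mirror API (as documented publicly) lets a server push "timeline items"
# (cards) to Glass via an authenticated REST POST to this endpoint. A real
# app would attach an OAuth 2.0 bearer token; that part is omitted here.
MIRROR_TIMELINE_URL = "https://www.googleapis.com/mirror/v1/timeline"

def build_timeline_card(text, speak=False):
    """Build the JSON body for a simple text card.

    If speak is True, a READ_ALOUD menu item is attached so the wearer can
    have the card spoken through the speaker at the back of the big handle.
    """
    item = {"text": text}
    if speak:
        item["menuItems"] = [{"action": "READ_ALOUD"}]
    return json.dumps(item)

# Hypothetical example in the spirit of my emergency-room daydream above.
body = build_timeline_card("Patient vitals nominal", speak=True)
print(body)
```

One could then POST that body to the timeline URL with a `Content-Type: application/json` header; the point is simply that "push data in" means ordinary authenticated REST calls from your own server, no on-device code required.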

Kyle stressed that he did not see a huge commercial market for Google Glass, as getting Twitter feeds flashed to your eyeball is just an inferior experience to using a smartphone or tablet for the same thing. One has to be clever to make use of these accessories, which make you look goofy when you wear them. It would seem that Mr. Samani sees a business model with return on investment in the medical space, but again, he is being tight-lipped about the specifics. Another difference between Joe Public and Joe Professional is that you can tell Joe Professional that he has to wear a cord under his clothes to power the glasses if he is going to wear them eight hours a day, but that sort of thing doesn't play well with hobbyists at all. There apparently is a company called PowerGlass which makes a headband, worn in tandem with Google Glass, that holds a bunch of batteries and increases the battery life significantly. Individuals who would wear such a headband are extreme geeks or dedicated professionals, not Joe Public.

Addendum 8/22/2013: I am noticing that big white spot in the midst of the photo above for the first time right now. It is pretty interesting how iPhones will miss a spot sometimes.
