One of Google’s latest projects is titled “Project Glass.” Through this project, Google hopes to develop a prototype for an augmented reality head-mounted display. The purpose of these “glasses” is essentially to provide a hands-free smartphone, with information displayed directly through the glasses. They would also allow for interaction with the internet through a series of “natural language” voice commands, very similar to Siri, the voice command program introduced on Apple’s latest iPhone, the iPhone 4S. Unlike the iPhone, however, these glasses will use Google’s Android operating system.
As of now, the prototype for the first Project Glass looks similar to a normal pair of eyeglasses, but instead of the usual lenses found in such glasses, Google has inserted a heads-up display. Google hopes to further develop this technology so that it may be integrated into people’s normal eyewear, no changes necessary. Babak Parviz, the electrical engineer who announced this project on Google+, has also been working on adapting this technology so that it can be integrated with contact lenses. The other announcers of this project include Steve Lee, a project manager and “geolocation specialist,” as well as Sebastian Thrun, who has made a name for himself with his development of Udacity and the self-driving car project he has been working on.
In an interview with Charlie Rose, Sebastian Thrun actually operates the Project Glass prototype. He takes a picture with a button on the headset, and his gaze visibly tilts upward as he looks at a list of personal contacts and the picture, which appear automatically. He then moves his head from side to side to select a contact, with a simple nod to confirm.
The article discussing this interview, which contains a link to the interview itself, can be found at the following link: http://www.technolog.msnbc.msn.com/technology/technolog/google-glass-gets-demonstrated-camera-interview-737903
This product is not being introduced without skepticism, however. An article on TechCrunch.com points out some major flaws and limitations in the Project Glass prototype discussed above. The article notes that Thrun never actually gives the prototype any voice commands, a feature that has been expected since the beginning of April, when a video about the product was initially released.
The article goes on to discuss the picture Thrun took of Rose. The author notes that the picture does appear as advertised, but he says that the image quality is not just unacceptable but downright terrible. He states, and I agree, that this problem is quite serious in this day and age, given the vast leaps technology has made in digital photography. He also notes that Google does not seem to be taking this image quality very seriously, something I found quite surprising considering Google’s seeming need for perfection. The author claims that the majority of Google’s past products are “half-baked,” which is where I disagree with him.
Despite these limitations, I find this to be a very intriguing product. It is not the first of its kind, something I was unaware of before looking further into this topic. Based on some research on my part, I believe that these head-mounted displays began with military uses during the mid-seventies. The technology was incorporated into an aircraft for experimental purposes to see if it could aid in targeting heat-seeking missiles. Throughout the history of head-mounted displays, the SAAF (South African Air Force) seems to have been one of the primary drivers in implementing them in its aircraft. These displays are still being integrated into the cockpits of modern helicopters and fighter aircraft.