Tan Ming Sin
Google X Lab has finally unveiled its biggest project yet: Google Glass. The project, known as Project Glass, was undertaken by Sebastian Thrun and his team of Google engineers. Google Glass is a futuristic spectacle-like device designed to give users relevant information on whatever is in their view – just like what Arnold Schwarzenegger would see through his Terminator eyes. However, given the futuristic concept of the design, some have argued that the device is just another piece of vapourware meant to boost Google's profile. As if in reply, Sebastian Thrun appeared wearing a prototype Google Glass during an interview on The Charlie Rose Show.
During the interview with Charlie Rose, Thrun wore a prototype of Google's heads-up display (HUD) spectacles. The prototype is believed to show roughly what Google Glass's final form will look like, HUD technology included. However, since it is still only a prototype and Google has said that Google Glass will not be available by the end of the year, the design may change drastically before it reaches the market.
Before we move on to the explanations, take a look at the Google Glass demo video here:
As you can see from the video, as soon as Google Glass senses movement (the moment the user wakes up), it automatically pops up its available functions. The same functions appear every time the user looks upward, arranged in a specific order. Later, a pop-up appears when the user receives an email from a friend. The reply is dictated using voice recognition, similar to Siri on the iPhone 4S, which suggests the device will have a built-in microphone.
As the user walks to the subway to take a ride, another message pops up – this time an alert notifying him that the subway service is suspended. Underneath that message, two alternatives for reaching his destination are offered: the railway and walking. Google Glass then calculates the closest route and the total walking distance, after which the navigation system takes over: a navigation arrow appears at the top right of the screen, pointing the user in the right direction. From this, we can safely assume that Google Glass will rely on 3G or 4G data connectivity to provide such information.
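The "closest route" that Glass works out here is, under the hood, a shortest-path problem. Here is a minimal sketch using Dijkstra's algorithm over a made-up street graph with walking distances in metres – the graph, node names and distances are all illustrative assumptions, since Google has not said what algorithm or map data Glass would use:

```python
import heapq

# Hypothetical street graph: intersections as nodes, walking metres as weights.
STREETS = {
    "home":      [("subway", 200), ("5th_ave", 350)],
    "subway":    [("5th_ave", 150), ("bookstore", 600)],
    "5th_ave":   [("bookstore", 400)],
    "bookstore": [],
}

def shortest_walk(graph, start, goal):
    """Dijkstra's algorithm: total walking distance of the closest route."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, w in graph[node]:
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return None  # goal unreachable

print(shortest_walk(STREETS, "home", "bookstore"))  # 750
```

The direct route via the subway stop (200 m + 600 m = 800 m) loses to the 750 m detour via 5th Avenue, which is exactly the kind of comparison a walking navigator has to make.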
Later on, the user speaks a voice command to have Google Glass remind him to buy tickets – presumably a calendar or reminder app. He then walks into a bookstore, and the symbol at the top right automatically changes from navigation to a check-in symbol. This is a location-tracking service rather than the check-in app itself, because we see the user operating the check-in app later in the video. He then asks, "Where is the music section?" A directory of the bookstore pops up and shows him the music section. The directory looks similar to the navigation system we just saw, but GPS cannot resolve positions inside a bookstore; only a downloaded indoor directory, like a shopping-mall app, could manage this, so it appears to be a unique function Google wants to show off. Still in the bookstore, the user asks whether his friend is nearby. Google Glass tracks the friend's location (with the friend's permission to share it) and shows the friend's distance from the bookstore.
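That distance-to-friend readout could be computed from two shared GPS coordinates using the standard haversine formula. A minimal sketch, assuming hypothetical coordinates for the bookstore and the friend (nothing here comes from Google):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    r = 6371.0  # mean Earth radius, km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical coordinates: a bookstore and a friend a few blocks away
bookstore = (40.7326, -73.9935)
friend = (40.7292, -73.9954)
print(round(haversine_km(*bookstore, *friend), 2))  # roughly 0.4 km
```

With both devices reporting their positions over a data connection, the heavy lifting is just this one formula – which fits our assumption that Glass leans on 3G/4G connectivity.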
After the user meets his friend, they pass by a Mud Truck coffee stand, and the user nods slightly, which brings down the same function menu we saw earlier in the video. Holding that position, he selects the check-in function and checks in at the Mud Truck location via Google Plus.
The user then moves on, passes an artistic mural, brings up the camera app and takes a shot. This shows that Google Glass can accurately capture exactly what the user is looking at; we will briefly explain how the device tracks the user's view later on. He then shares the photo via Google Plus again.
Towards the end of the walk, we see that the user has been listening to music through Google Glass the whole time, which suggests the device can store data. He then video-calls his friend and can see her in a small window at the bottom of the screen. The audio side makes sense given the voice recognition we saw earlier, but it is not clear whether the friend on the other end can see him – we wonder where a front-facing camera on Google Glass would even be located.
Instead, the user shows his friend the view he is currently seeing, probably through the same camera he used to take the photo earlier. And that's the end of the video.
For a start, Google Glass does not actually have glass lenses. It has the frame of a pair of spectacles, including the pads that rest on the nose, but no lenses – just a translucent panel on the right side. Then again, according to a report by TechRadar, Google is currently experimenting with a design that fits Google Glass onto users' existing glasses.
The translucent panel is the control panel for Google Glass. It tracks the right eye's movement, giving the device the data it needs to calculate the user's exact line of sight. Google Glass then maps the user's eye and head movements (swiping up, down, left and right) onto a responsive user interface on the HUD. That is how the user in the video above selects the apps he needs and snaps photos that match his view so accurately.
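To give a feel for how head movements could drive such an interface, here is a minimal sketch mapping gesture events to a vertical menu. The gesture names and the menu items are illustrative assumptions, not anything Google has documented:

```python
# Hypothetical sketch: head-gesture events moving a highlight through a menu.
MENU = ["camera", "check-in", "music", "reminder", "video call"]

def navigate(selected_index, gesture):
    """Return the new menu index after a head gesture.

    An 'up' or 'down' tilt moves the highlight, clamped to the menu bounds;
    any other gesture leaves the selection unchanged.
    """
    if gesture == "up":
        return max(selected_index - 1, 0)
    if gesture == "down":
        return min(selected_index + 1, len(MENU) - 1)
    return selected_index

# Two downward nods from "camera" land the highlight on "music"
idx = 0
for g in ("down", "down"):
    idx = navigate(idx, g)
print(MENU[idx])  # music
```

The real device would obviously fuse noisy sensor data rather than receive clean "up"/"down" events, but the nod-to-select interaction seen at the Mud Truck stand reduces to exactly this kind of state machine.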
As for the HUD projected into your field of vision, it appears to be a form of augmented reality, as argued in a report at Wired. Google blogger Seth Weintraub explained that Google Glass might use a transparent LCD or AMOLED display to put information right in front of our eyeballs.
So, all in all, what can we expect Google Glass to come with? From what we have been shown, on the hardware side (besides the frame and the translucent panel) we can expect a high-megapixel camera, a built-in microphone, a 3G or 4G chipset, internal data storage (with a USB connection), a camera shutter (as shown by Thrun on the talk show) and probably the transparent LCD. On the software side, it will carry most of the usual apps, such as a clock, calendar, weather and music. The standout apps shown in the video are the camera, navigation (both on the street and inside a shop), email, notifications, video calling, voice recognition, reminders and check-in.