Google Glass is the unquestionable flag bearer for a new segment of personal computing dubbed wearables. It’s still hugely unclear how the market will engage with wearables, and whether they’ll be as successful as expected. For many of you this post may be your first exploration into wearable computing, and for some it may even be astonishing to hear that Google Glass is not the only wearable computer on the market. It’s not even the first! So while Google Glass Explorers roam your streets, it’s good to know that there are other contenders vying for your attention too.
Some argue that wearable computers date back hundreds of years: just think of the first abacus, or more recently the wrist watch. Each can be argued to be a form of wearable computing device. In this case, however, the case of augmenting reality, I’d like to narrow the conversation down to three players: Meta, Vuzix and Oculus Rift. You may notice that none of them is Google Glass, but all, especially Meta, carry huge credibility in this space.
HEAD MOUNTED DISPLAYS
Starting in reverse order, it’s the Oculus Rift that has everybody talking, having raised an enormous $2.4 million in crowdfunding via Kickstarter (closing September 1st, 2012). Designed initially as a Head Mounted Display (HMD) virtual reality headset for video games, the platform is now being adapted by developers for use with augmented reality too. Not initially conceived with that in mind, the quick evolution is a sign of how disjointed the wearable platform still is.
HOME-BREW PROTOTYPE WITH APPEAL
The Oculus Rift Kickstarter campaign, and high-profile launch at the 2012 Electronic Entertainment Expo, revolved around developer engagement. While gamers were being teased with what would be, it was developers who were being enticed with “what is now”. The reality is that what was showcased on stage that day was a home-brew prototype of the Oculus Rift featuring a 5.6-inch LCD display visible via dual lenses.
Where Oculus Rift and Meta begin to have synergy, and where they differ from the Vuzix M100 and Google Glass, is the way in which the user views the virtual or augmented experience. To understand this synergy we need to know Professor Steve Mann’s Point-of-Eye (PoE) theory: the point-of-view of the cameras (and what is seen on the screens) is the point-of-eye of the viewer, such that each screen shows what the wearer’s own eyes would see. Both the Oculus Rift and Meta’s Space Glasses share this user perspective, while Google Glass, the Vuzix M100 and others intend the user to peer upwards-and-outwards to gaze at the information being presented, in a manner similar to traditional military head-up displays (HUDs).
OCULUS AUGMENTING REALITY TOO
While at this stage Oculus Rift appears on track with its approach to augmenting the user’s reality, it is also more focused on gaming and virtual reality than Google and Meta are. That’s not to say one or the other is right; it simply suggests that no one really knows how the market will adopt the technology as it unfolds. There are certainly enough interested users that either approach may prove profitable enough to be viable.
What is in everyone’s favour is that the rate at which micro-computers are evolving means eyewear can include an array of sensory data-gathering chips (like accelerometers, gyroscopes and GPS), giving manufacturers a far larger mass of information to draw on when creating immersive experiences for users. I discussed truly immersive experiences for gaming earlier, but for mobile computing the wearable glass may be the answer AR is looking for!
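To give a concrete flavour of how that sensor data gets combined, here is a minimal sketch of a complementary filter, a common technique for fusing a gyroscope’s fast-but-drifting rotation rate with an accelerometer’s noisy-but-stable gravity reading. The function name and parameters are illustrative only, not any headset vendor’s actual API:

```python
import math

def complementary_filter(pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Estimate head pitch (degrees) by blending gyro and accelerometer data."""
    # Integrate the gyro rate (deg/s) over the timestep: fast, but drifts.
    gyro_pitch = pitch + gyro_rate * dt
    # Derive a reference angle from gravity: noisy, but doesn't drift.
    accel_pitch = math.degrees(math.atan2(accel_x, accel_z))
    # Weighted blend: trust the gyro short-term, the accelerometer long-term.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

Run at a few hundred hertz, a filter along these lines is what keeps a virtual scene locked to your head movements without the slow drift a gyroscope alone would introduce.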
VUZIX M100 AUGMENTED REALITY HEAD-MOUNTED DISPLAY
Vuzix has a slightly different approach, and history, but is vying for a similar position in bringing virtual reality to life. It was their VFX1 goggles (circa 1994) that first inspired many to consider the wearable as a potentially profitable consumer sector that had yet to be tapped: the first consumer virtual reality product to offer a truly immersive virtual experience. The VFX1 featured a flip-up visor, adjustable IPD, a dedicated video card and the Cyber-puck, a virtual reality hand controller.
Currently Vuzix has the M100 in development and promises a fully commercialized version via a “coming soon” moniker on their site. The M100 is dubbed “Smart Glass” and, in the words of Vuzix President Paul Travers, is “effectively a smart phone that’s wearable.” Whereas the Oculus Rift presents a fully immersive 3D image, the M100 presents a cellphone-sized screen that appears about 12” in front of the wearer. It runs the Android operating system and includes GPS, accelerometers, gyroscopes, Wi-Fi and effectively everything else you’ll find on a smartphone. With an HD camera, the wearer is promised a view of the real world, with the potential to modify that information as a function of where they’re looking. In contrast to the Oculus design, Vuzix touts the core competencies of the M100 as being ideal for warehousing situations and for increasing efficiency in more mundane daily logistical tasks. Ironically, as unsexy as this sounds, that calling speaks very loudly to the enterprise market as a productivity tool; it strips away the novelty and gives the M100 a truer business functionality that may see it win out over the Oculus Rift and perhaps Google Glass.
By way of design you’ll note very quickly that the Oculus product is very immersive and all-encompassing of the face, i.e. totally obtrusive and encumbering. This is not a mistake but a result of its target intention: to provide a fully immersive virtual experience. The Vuzix M100 takes a cue from Google Glass’s intentions and is designed to be much less obtrusive; it sits offset from one’s PoE, is utilized as and when needed, and when not in use is somewhat out of the way.
Meta somehow combines both, and is probably the wearable team I like most. Having met the Chief Scientist on the Meta team, Professor Steve Mann, in Silicon Valley, I knew the Meta product would be potentially less commercially viable, but likely to hold more intellectual integrity in its design. Why would I make such a bold statement, you might ask? I think if you’d met Steve Mann you’d know.
Steve is the indisputable “father of wearable computing”. With his history in this field chronicled by anyone that’s ever written about the sector, he is often seen as a wearable fundamentalist with purist methodologies that stem from a passion for cyborg communication and lifestyles. Meta also brings clout to the project via Augmented Reality UX/UI designer Professor Steven Feiner (yes, they’re both Steves). Steve Feiner is also an AR fanatic and boasts an array of technology ventures to his name.
What’s most exciting, however (and no offense to “the Steves”), is that Meta brings youth to its side. With COO Ben Sand and CEO Meron Gribetz both in their twenties, Meta seems to have both energy and experience on its side. Even Robert Scoble seems to have caught the Meta bug, with a recent Google+ post drawing similarities to the early days at Apple and Microsoft and calling them “the next ones!”
The beauty of Meta is that it talks straight to arguably the world’s most famous development platform for augmented reality: Unity 3D. Within Unity, developers are able to perform complex 3D surface tracking, hand tracking (including gestures) and custom computer vision work. Unity 3D is a very popular platform for developers and a great fit for Meta.
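To give a sense of what gesture recognition involves at its simplest, here is a hedged sketch (plain Python, not Meta’s or Unity’s actual API) of detecting a pinch gesture from two tracked fingertip positions; the function names, threshold and coordinate convention are all illustrative assumptions:

```python
import math

def is_pinch(thumb_tip, index_tip, threshold=0.03):
    """Treat thumb and index fingertips within `threshold` metres
    of each other as a pinch gesture.

    Each fingertip is an (x, y, z) position in metres, as a hand-tracking
    system might report it each frame.
    """
    # math.dist computes the straight-line distance between two points.
    return math.dist(thumb_tip, index_tip) < threshold
```

A real pipeline layers state on top of per-frame checks like this one (debouncing, pinch-start and pinch-release events), but the core of most gesture classifiers is exactly this kind of geometric test on tracked joint positions.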
While on Kickstarter the Meta Space Glasses didn’t come anywhere near the Oculus Rift’s $2.4 million, having reached $194,444, its developer base is potentially more powerful. With both vision-based technology and augmented technologies, it has the power to enhance and combine virtual worlds and real worlds together. It is dubbed “not just a popup in the corner of your eye” allowing limited access to computing applications (a stab at Google Glass), nor a shield that traps you in a world of virtual reality (a stab at Oculus Rift).