Last week, I posted about Occipital Bridge, a VR and AR headset for the iPhone. I got to try it at CES 2017. Here are my impressions.
Mobile VR headsets today are limited to tracking your head's rotation (rotational tracking). Desktop VR headsets can also track your forward, upward, and sideways movement (positional tracking), but they need external devices to track the headset's position.
Occipital Bridge is a headset for the iPhone that uses Occipital’s Structure depth-sensing camera as an inside-out tracking device, enabling positional tracking with 6 degrees of freedom, without any external devices.
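To make the rotational vs. positional distinction concrete, here is a minimal sketch (not Occipital's actual API; the function and variable names are hypothetical) of a 6-degrees-of-freedom head pose: three degrees for rotation and three for translation. A rotation-only headset effectively keeps the translation part fixed at zero, while an inside-out positional tracker also estimates how the head moves through space.

```python
import numpy as np

def pose_matrix(yaw, pitch, roll, tx, ty, tz):
    """Build a 4x4 homogeneous transform from Euler angles (radians)
    and a translation in meters. Illustrative only."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    # ZYX Euler convention: yaw, then pitch, then roll
    R = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]]) @ \
        np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]) @ \
        np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = R        # 3 rotational degrees of freedom
    T[:3, 3] = [tx, ty, tz]  # 3 positional degrees of freedom
    return T

# Rotation-only tracking: the viewpoint turns but never moves.
head_rotation_only = pose_matrix(0.3, 0.0, 0.0, 0, 0, 0)

# Full 6-DoF tracking: leaning 20 cm forward also shifts the viewpoint.
head_6dof = pose_matrix(0.3, 0.0, 0.0, 0.0, 0.0, -0.2)
```

The renderer applies this transform to the camera each frame; with positional tracking, leaning or stepping changes the translation column and the rendered view shifts accordingly.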
Moreover, Occipital Bridge uses the iPhone’s camera plus an ultrawide converter and data from the Structure sensor to provide a 3D view of the viewer’s surroundings. This makes it possible to use it as an AR device (strictly speaking, a mixed reality device).
The Occipital Bridge was available for demo at CES. Their appointment slots filled up quickly, but I was fortunately able to get a demo.
I wore the headset, which was reasonably comfortable. The headset gave me a partial view of what was in front of me. But instead of merely being stereoscopic 3D, as in a 3D 360 photo or video, it was like looking at a 3D model of the room, similar to the 3D view from a Matterport camera, although perhaps with less detail. In addition, it wasn’t the entire room, but about a 90-degree field of view. Outside of that portion, everything was black. Occipital’s representative (I’m sorry I forgot your name!) explained that the room had not yet been fully scanned.
If I turned my head too far to the side, I would get a warning that I was outside the scanned area, and the view would switch to a 2D view of the room (like a pass-through camera view).
One of the first things I tried was the positional tracking. Does it work? Yes. I could move in any direction and the view would change accordingly. The only problem was that there was substantial lag, much more than even Google Cardboard. They explained that they currently don’t have access to the necessary APIs and therefore the latency was around 50 ms (well above the 20 ms maximum required for immersion), but they are requesting access from Apple, after which they expect to drive the latency to 5 ms (!).
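Those latency figures matter because any motion-to-photon delay makes the world appear to drag behind a turning head. A rough back-of-the-envelope illustration (the head-turn rate is my assumption, not from Occipital):

```python
# Angular error from motion-to-photon latency during a head turn.
HEAD_TURN_DEG_PER_S = 100.0  # assumed: a brisk but ordinary head turn

def angular_error_deg(latency_ms, turn_rate=HEAD_TURN_DEG_PER_S):
    """Degrees the rendered world lags behind a turning head."""
    return turn_rate * (latency_ms / 1000.0)

print(angular_error_deg(50))  # the demo's latency -> 5.0 degrees of lag
print(angular_error_deg(20))  # the immersion threshold -> 2.0 degrees
print(angular_error_deg(5))   # Occipital's target -> 0.5 degrees
```

At 50 ms, the scene visibly swims during head turns, which matches what I felt in the demo; at 5 ms the lag would be well under a degree.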
I also got to try the Bridget demo, where their robot mascot appears in the scanned 3D space and you can interact with her, such as pointing to where you want her to go or throwing items for her to fetch. Aside from the lag, it worked well as a tech demo of what’s possible with the Bridge headset.
Overall, I was impressed with Occipital Bridge. I think this is close to what Tim Cook envisioned for AR. I really hope Occipital gets the resources and support it needs to fully realize the potential of its Bridge headset.