
Oculus Rift Impressions

By Rjay Haluko

I love it but it makes me want to lose my lunch.

We were all really excited to get our hands on the Oculus Rift at LW. The Oculus comes in a nice carrying case and is surprisingly polished for a development kit. After downloading the SDK we had it up and running, and it’s awesome at first. Then something odd happens: you begin to feel disconnected from your body. Some people in the office had a stronger tolerance, but everyone had the same disjointed feeling that often led to nausea.

I personally had a pretty good tolerance for it and was able to last close to 2 hours. After that I felt sweaty and nauseous, and I actually had to lie down for about an hour. Sounds horrific, right? Sort of.

It’s an amazing piece of hardware that I think has a ton of potential. Keep in mind the Oculus we have is an early build, and the consumer product will address many of the problems we felt at LW. The screen you view is rather low resolution and the latency is problematic, but I’m confident that with John Carmack as CTO of Oculus they will squash these problems.

“The old ways of control suck with the Oculus Rift”

Redefining interfaces and methods of control

Once latency and screen quality are improved, we are still left with the disjointed sensation of moving through a virtual space while sitting still and tapping on a keyboard. The keyboard and mouse totally suck to use with the Oculus; they’re an awful way to experience it. While wearing the Oculus we need to be able to look down and see ourselves, with some sense of location. Inputs and controls need to become far more transparent, and interfaces need to be reimagined. Some advances are already under way with things like iMotion and the Virtuix Omni. This year should be interesting for the VR scene in terms of new inputs.

Personally, I would love to see a Kinect body-scan me and place me in the environment. Looking down and seeing my avatar move one to one with my real motion would help ground the experience. Additionally, we need to lose the mouse and keyboard and move to something more transparent. Again, I think correlating my real body position with my actions and eye movement in the game environment could be a step in the right direction.
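To make that a little more concrete, here’s a rough sketch of the kind of one-to-one mapping I have in mind. None of this is real Kinect or Oculus SDK code; the joint names, the calibration offset, and the `setJoint` callback are all invented for illustration.

```typescript
// Rough sketch (hypothetical APIs): map tracked skeleton joints one-to-one
// into the headset's coordinate space, so looking down shows your own body
// where you expect it to be.

type Vec3 = { x: number; y: number; z: number };

interface SkeletonFrame {
  joints: Map<string, Vec3>; // e.g. "head", "leftHand", "rightKnee"
}

// Assumed calibration offset between the sensor's origin and the headset's
// tracking origin, measured once at startup.
const sensorToHeadset: Vec3 = { x: 0, y: -1.2, z: 0.5 };

function toHeadsetSpace(joint: Vec3): Vec3 {
  return {
    x: joint.x + sensorToHeadset.x,
    y: joint.y + sensorToHeadset.y,
    z: joint.z + sensorToHeadset.z,
  };
}

// Each frame: move the avatar's joints to the transformed positions, so body
// motion in the room maps directly onto the avatar in the scene.
function updateAvatar(
  frame: SkeletonFrame,
  setJoint: (name: string, position: Vec3) => void
) {
  for (const [name, position] of frame.joints) {
    setJoint(name, toHeadsetSpace(position));
  }
}
```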

I think haptic feedback is going to make a huge difference once we remove the controllers, mice and keyboards. Letting a user explore an environment with haptic feedback will be a game changer. Imagine reaching out to grab that in-game object and feeling contact in your hand. A haptic feedback glove can apply the sensation of contact, and letting go of an object wouldn’t be a key press; it would literally be opening your hand. Interfaces can be placed around the user in 3D space, and triggers could be based on body language or hand proximity. This will be a completely different UX problem to solve than what we are used to.
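As a thought experiment, here’s roughly what that grab-and-release logic might look like. The hand tracker, glove haptics call and grab radius are all made up; the point is how little “interface” is left once the trigger is simply opening or closing your own hand.

```typescript
// Rough sketch of proximity-and-gesture grabbing (all names invented):
// an object is grabbed when a closed hand is near it, and released the
// moment the hand opens again -- no key press involved.

type Vec3 = { x: number; y: number; z: number };

interface TrackedHand {
  position: Vec3;
  isOpen: boolean; // from a hypothetical glove or hand tracker
}

interface GrabbableObject {
  position: Vec3;
  heldBy: TrackedHand | null;
}

const GRAB_RADIUS = 0.1; // metres; assumed reach tolerance

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

function updateGrab(
  hand: TrackedHand,
  obj: GrabbableObject,
  pulseHaptics: () => void // hypothetical glove feedback
) {
  if (obj.heldBy === hand) {
    if (hand.isOpen) {
      obj.heldBy = null;            // opening your hand *is* the release
    } else {
      obj.position = hand.position; // object follows the closed hand
    }
  } else if (!hand.isOpen && distance(hand.position, obj.position) < GRAB_RADIUS) {
    obj.heldBy = hand;              // closed hand near the object: grab it
    pulseHaptics();                 // sensation of contact from the glove
  }
}
```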

Here at LW we have a ton of different ideas about how we can use these new methods of interaction and input. Overall, the Oculus Rift is the future, and it’s only in its infancy. I think 2014 should be an interesting year for VR.