Knowledge Base
28th January 2019
VR: Intuitive interaction
The whole point of virtual reality (VR) experiences is to suspend disbelief sufficiently to make you feel as if you’ve stepped into a whole new world. Being able to interact with that world makes it all the more convincing. However, interaction can make or break the illusion. Here, James Burrows discusses intuitive interaction and developments to look out for in the future.
Ways to interact
The ultimate ambition for VR experiences is a way of interacting that’s so seamless it feels completely natural – as if you really are there in the virtual world. ‘Natural Interfaces’ are the VR developer’s dream and need no external controllers. We’re not quite there yet, but the following methods are common ways to add interactivity to VR.
Hand controllers
The most prevalent interactivity systems for VR, these range from simple point-and-click mechanisms for headsets such as the Gear VR or Daydream, to more complex dual hand controllers such as those for the HTC VIVE or the Oculus Rift. These offer a range of inputs, from triggers to trackpads, which can be mapped to a variety of options in the experience or game.
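To illustrate how different controllers can drive the same experience, here’s a minimal TypeScript sketch of an input-to-action binding table. The button names and the set of actions are hypothetical, not tied to any particular SDK – the point is simply that the experience only ever deals in semantic actions, never raw buttons.

```typescript
// Hypothetical set of semantic actions an experience might expose.
type Action = "select" | "grab" | "teleport" | "openMenu";

// Raw inputs differ per controller; the names here are illustrative only.
type RawInput = "triggerPress" | "trackpadClick" | "gripSqueeze" | "menuButton";

// One binding table per controller family keeps the experience logic
// independent of the hardware it happens to be running on.
const simpleControllerBindings: Record<RawInput, Action | undefined> = {
  triggerPress: "select",   // point-and-click headsets: one button does most of the work
  trackpadClick: "teleport",
  gripSqueeze: undefined,   // no grip input on a simple remote
  menuButton: "openMenu",
};

const dualControllerBindings: Record<RawInput, Action | undefined> = {
  triggerPress: "select",
  trackpadClick: "teleport",
  gripSqueeze: "grab",      // richer controllers unlock extra actions
  menuButton: "openMenu",
};

// The experience only ever sees semantic actions.
function handleInput(
  bindings: Record<RawInput, Action | undefined>,
  input: RawInput,
): void {
  const action = bindings[input];
  if (action) {
    console.log(`Performing action: ${action}`);
  }
}

// Example: the same raw input routed through two different binding tables.
handleInput(simpleControllerBindings, "triggerPress"); // Performing action: select
handleInput(dualControllerBindings, "gripSqueeze");    // Performing action: grab
```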
Gaze detection
This was one of the main ways to activate elements of a VR experience when mobile headsets started gaining in popularity, and it still has its uses today – e.g. at busy trade shows or events where there’s every chance a hand controller will go missing! In fact, a lot of developers implement gaze detection as a fallback even if they expect the user to pick up a hand controller.
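Gaze activation usually works on a dwell timer: keep looking at a target for long enough and it “clicks”. Below is a minimal TypeScript sketch of that core logic; the frame loop, the dwell threshold and the way the gaze hit-test arrives are all assumptions for illustration, not any particular headset’s API. In practice a progress ring or fill animation over the target is a common way to show the timer running.

```typescript
// Minimal gaze-dwell activation: the user "clicks" by looking at a target
// for long enough. Hit-testing against the scene is assumed to happen elsewhere.
class GazeTarget {
  private dwellSeconds = 0;

  constructor(
    public readonly name: string,
    private readonly dwellThreshold: number, // seconds of sustained gaze required
    private readonly onActivate: () => void,
  ) {}

  // Called once per frame with the frame time and whether the gaze ray hits us.
  update(deltaSeconds: number, isGazedAt: boolean): void {
    if (!isGazedAt) {
      this.dwellSeconds = 0; // looking away resets the timer
      return;
    }
    this.dwellSeconds += deltaSeconds;
    if (this.dwellSeconds >= this.dwellThreshold) {
      this.onActivate();
      this.dwellSeconds = 0; // avoid re-triggering on every subsequent frame
    }
  }
}

// Example: a "Start" button that activates after 1.5 seconds of gaze.
const startButton = new GazeTarget("Start", 1.5, () => console.log("Start selected"));

// In a real frame loop these values would come from the headset's gaze ray;
// here we simply simulate ten frames of the user staring at the button.
for (let frame = 0; frame < 10; frame++) {
  startButton.update(0.2, true);
}
```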
Hand tracking
This method of interaction involves a third-party device, such as a Leap Motion sensor or glove controllers, and enables people to see and use their hands in the virtual world. The concept is sound, but so far the technology has yet to live up to the VR developer’s dream as it can be inaccurate. A lot of developers are waiting for this type of tech to be built directly into the headset, which is on the cards for later in 2019.
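One common way developers cope with that inaccuracy is to smooth the reported hand positions before using them. The snippet below is a generic exponential-smoothing sketch in TypeScript, not code from any particular hand-tracking SDK; the sample values are made up for illustration.

```typescript
// Simple exponential smoothing of a tracked 3D position to damp sensor jitter.
// The trade-off: more smoothing means steadier hands but more perceived lag.
interface Vec3 { x: number; y: number; z: number; }

class SmoothedPosition {
  private current: Vec3 | null = null;

  // alpha in (0, 1]: 1 = no smoothing, small values = heavy smoothing.
  constructor(private readonly alpha: number) {}

  update(raw: Vec3): Vec3 {
    if (this.current === null) {
      this.current = { ...raw }; // first sample passes through unchanged
      return this.current;
    }
    this.current = {
      x: this.current.x + this.alpha * (raw.x - this.current.x),
      y: this.current.y + this.alpha * (raw.y - this.current.y),
      z: this.current.z + this.alpha * (raw.z - this.current.z),
    };
    return this.current;
  }
}

// Example: noisy samples around a palm position settle toward a steady value.
const palm = new SmoothedPosition(0.3);
console.log(palm.update({ x: 0.52, y: 1.01, z: -0.29 }));
console.log(palm.update({ x: 0.48, y: 0.99, z: -0.31 }));
```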
Choosing your interactivity
There are a few key points you need to consider before you implement an interaction system in your experience.
Your hardware
This will immediately determine your options. If your experience is only going to be available on an Oculus Go, then you only need to worry about its point-and-click controller. The VIVE, Rift or PlayStation VR, on the other hand, offer more detailed functionality.
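If a build does need to run on more than one class of headset, one pattern is to pick the richest interaction scheme the hardware supports and fall back from there. The TypeScript sketch below uses hypothetical capability flags to make the idea concrete; real projects would read the equivalent information from their engine or SDK.

```typescript
// Hypothetical capability flags describing whatever headset the build targets.
interface HeadsetCapabilities {
  hasDualControllers: boolean;   // e.g. VIVE, Rift, PlayStation VR
  hasPointerController: boolean; // e.g. a simple point-and-click remote
}

type InteractionScheme = "dualControllers" | "pointAndClick" | "gazeOnly";

// Pick the richest scheme the hardware supports, falling back to gaze.
function chooseScheme(caps: HeadsetCapabilities): InteractionScheme {
  if (caps.hasDualControllers) return "dualControllers";
  if (caps.hasPointerController) return "pointAndClick";
  return "gazeOnly"; // every headset can at least report where the user is looking
}

// Example: an Oculus Go-style device with just a pointer remote.
console.log(chooseScheme({ hasDualControllers: false, hasPointerController: true }));
// -> "pointAndClick"
```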
The type of experience
Are you building a game, a simulation or a brand experience? Games and simulations tend to require more functionality, so you should consider more than just point-and-click or hand tracking. A brand experience is more likely to be used at a trade show or event, meaning something simple such as gaze detection is more suitable.
The competency of the end user
This is very important. If the end user has never tried VR before, then you need to implement a very simple interactivity system. Struggling with a controller you can’t see is never going to be a good user experience and may put people off VR. If, on the other hand, you’re aiming at gamers or a tech-savvy audience, you can reasonably assume they’ll find their way around a controller quickly.
Intuition is key
If you want your end user to have the most “real” experience possible, then you need to think intuitively. What is the most natural way to interact with the experience? Does it feel more natural to point and click, to use a trigger, or simply to look?
How you interact will always depend on hardware, but just because you have access to a VIVE doesn’t mean you need to find a function for every button on the controller. Instead, ask yourself how big a learning curve your user will have – are they a novice or are they a dedicated gamer?
By putting yourself in your user’s shoes, you’ll be able to choose an interaction system that makes the most sense and keeps the virtual illusion going. In the meantime, we’ll be waiting for skeletal tracking to get to the point where we don’t need to worry about controllers at all.
James has over two decades’ experience in highly technical roles – from the main IT agency of the British Government to digital marketing. He’s led development teams in three different agencies, plus worked as a freelance developer and consultant. His love of all things tech led James to co-found Infinite Form in 2015. When he’s not playing with computers, James is usually playing bass.