Planning a set of traditional user testing sessions this week got us thinking: how could we conduct a user test in virtual reality? User testing, in its current form, is a well-known entity. We have the tools, we have the techniques, we’re set on that front.
But what happens to the user test when the display attaches to the user’s face? VR as an experience adds a lot of complexity that we’ll need to overcome to be successful.
Anyone who has conducted user testing, whether as the client or as the UX professional, will know it demands both time and expense. This will be just as true in the VR space, if not more so, as it requires so much more setup.
“Setting up VR tests is costly and time consuming. We realised that to maximise our time and testing efficiency, it was necessary to seek outside help” – Philip Cohen, Schell Games
So what are the challenges to user testing in the VR space? And how might we counter them? The first must be the price and pain of adoption.
The likely use cases are low-cost VR headsets that can be distributed easily, paired with short experiences for applications such as onboarding or HR.
The cost of high-end systems such as the Oculus Rift currently prohibits large-scale take-up. In addition, the technology means attaching users to high-powered hardware in a fixed location.
We want to test VR that can be experienced anywhere, at any time, by anyone.
Next, we can no longer directly study the participant’s view.
With the display set within the headset itself, the days of looking over a participant’s shoulder are over. Content will need to be mirrored on the moderator’s and observer’s screens.
Mirroring technology is not the challenge in itself. Plenty of methods for mirroring a mobile screen to a larger stand-alone display already exist. However, VR involves side-by-side views – one for each of the dual viewing ports within a VR headset – which are then merged by the headset’s lenses. No solution currently exists for merging these dual output signals into a single image. We may need to live with this for now.
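While no off-the-shelf merge exists, one pragmatic workaround for observers is to mirror just one eye’s viewport from the side-by-side feed. A minimal sketch of that crop, assuming the frame arrives as a NumPy image array (the function name, resolution, and layout are our own illustrative assumptions, not any vendor’s API):

```python
import numpy as np

def extract_eye_view(frame: np.ndarray, eye: str = "left") -> np.ndarray:
    """Crop one eye's viewport from a side-by-side stereo frame.

    Assumes an H x W x 3 image where the left and right eye views
    each occupy half of the width.
    """
    height, width, _ = frame.shape
    half = width // 2
    if eye == "left":
        return frame[:, :half]
    return frame[:, half:]

# A dummy side-by-side frame: dark left-eye half, bright right-eye half.
frame = np.zeros((720, 2560, 3), dtype=np.uint8)
frame[:, 1280:] = 255

left = extract_eye_view(frame, "left")
print(left.shape)  # (720, 1280, 3)
```

Showing a single eye loses the stereo depth cue, but it gives moderators and observers a watchable, undistorted view of what the participant sees.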
The physiological response is just as important as the emotional
Virtual Reality can elicit powerful physical reactions in the user. Whether excitement or motion sickness, this too needs to be a consideration for the UX professional.
Whilst users are more than happy to express excitement over VR, they are less likely to admit to feeling sick. In fact, studies show users stick with the VR experience long past the first stages of motion sickness, and claim to feel fine.
“They would insist that they were fine, because they wanted to keep playing, even if they began sweating and you could observe their skin turning pale.” – Kevin Burke, Twenty Milliseconds
As such, we’ll need to monitor a VR participant’s physiological condition – something user tests have never had to consider before. This can be achieved by monitoring body language, heart rate, and even the words they use.
To capture these effects, we suggest including a wearable device, such as an Apple Watch or Samsung Gear Fit, to record heart rate and blood pressure. In conjunction, video coverage of the test should capture both body language and verbal cues.
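How might that wearable data feed back into the session? One simple approach is to flag moments when heart rate climbs well above the participant’s resting rate, prompting the moderator to check in. A hypothetical sketch (the sample structure, threshold, and names are our own assumptions, not any device’s API):

```python
from dataclasses import dataclass

@dataclass
class HeartRateSample:
    seconds_into_test: float
    bpm: int

def flag_stress(samples, resting_bpm, threshold=1.25):
    """Return the timestamps where heart rate exceeds the participant's
    resting rate by the given factor - a cue for the moderator to
    check how the participant is actually feeling."""
    return [s.seconds_into_test for s in samples
            if s.bpm > resting_bpm * threshold]

samples = [HeartRateSample(10, 72),
           HeartRateSample(60, 95),
           HeartRateSample(120, 110)]
print(flag_stress(samples, resting_bpm=70))
```

Because participants tend to under-report discomfort, an objective trigger like this gives the moderator permission to pause a session that the participant would otherwise push through.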
Everyone will have a different experience of VR
In general, users are given complete control of the VR experience and environment. They can look in any direction at any time, and are free to interact however they choose. This presents enormous challenges when outlining the steps required to complete a task, and makes completion far less predictable than in a desktop/mobile environment.
Tasks will have to be less linear than currently, to allow for more user freedom when completing them.
Multiple Streams of Data
Traditional analysis of a user test involves a single recording of the user’s on-screen activity, followed by discussion with a moderator as the recording is later examined. However, VR will demand multi-level analysis of multiple output streams to reach the desired results:
- Recording of the actual VR experience, mirrored onto a desktop environment.
- Video recording of the participant throughout the test, recording not only their conversation but also their body language.
- Physiological data about the participant, captured by a wearable device and/or enhanced by specialist observation.
So what are the advantages to user testing with VR?
Similar to mobile, VR applications offer true user mobility – they’re not restricted to static, desk-based use cases. VR can be tested while seated, standing, on a train, inside or outside a specific environment.
However, users tasked with increased physical activity will need to avoid confined or restrictive spaces and inadvertent collision with furniture or other people. This is a matter of more effective logistical pre-planning.
Immersion and Audio
Audio is a key part of the immersive VR experience. Sound is not only integral to the activity but can also be used to direct the user when required to observe, and even interact. This in fact presents a double challenge:
To validate audio cues successfully, user test experts will need to hear this signal alongside the visual feed.
Unlike current best practice, each task will need to form a complete experience/workflow. Otherwise, whenever users are prompted to remove the headset between tasks, any workflow immersion will be lost.
Task Completion is not the only measure of Success
Traditionally, whether the user succeeds in a task is judged not only on actual task completion, but also on the length of time it takes.
These conditions need to be re-considered for VR. Yes, the user needs to be able to complete the task in a reasonable amount of time, but other factors are now in play.
Pre-eminent is the physiological response experienced by each participant. Where physical stress occurs – from feeling sick to dangerously elevated heart rate or blood pressure – this should also be counted as a task failure.
User tests are concerned not with the user’s abilities but with the application’s effectiveness, and if the application causes sickness, it isn’t working.
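Put together, the VR pass/fail rule gains a third condition beyond completion and time. A hypothetical sketch of that scoring logic (the function and its inputs are our own assumptions about how a team might encode it):

```python
def task_result(completed: bool, seconds_taken: float,
                time_limit: float, stress_flagged: bool) -> str:
    """Score a VR task: it only passes if completed within the time
    limit AND without a flagged physiological stress response."""
    if not completed or seconds_taken > time_limit:
        return "fail"
    if stress_flagged:
        return "fail (physiological)"
    return "pass"

print(task_result(True, 90, 120, False))
print(task_result(True, 90, 120, True))
```

Recording the physiological failure as its own category, rather than a plain fail, also tells the design team whether to fix the workflow or the motion design.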
Try not to test the Device
Lastly, when planning user tests in VR, we should account for one final factor: we cannot assume the user will know how to operate the device. If this is not managed correctly, each user test will simply become a repetitive analysis of the device rather than of the application.
VR is growing in popularity and recognition, but we are a long way from the inherent familiarity with the technology that we now see with computers and mobile devices.
The obvious first step to countering this problem is at the recruitment level. Filtering out participants who have no experience of VR devices would remove the need for explanation. However, this may hamper subsequent tests in multiple ways.
Consider the additional recruitment costs of the extra screening filter, the additional time required to complete recruitment, and the loss of the most advantageous element of VR today – the initial ‘wow’ factor. That alone is something well worth capturing in a user test.
Our recommendation would be to add time at the start of each test to ensure individuals can get used to the device and the required interactions. That way, once they actually set about completing the task, they won’t be sidetracked by operational discussions.
Want to know more?
Interested in seeing what the VR experience can add to your business, and discovering how Omobono can realise your potential? Please get in touch.
Alternatively, if you have an existing VR application to test but don’t know where to start, we can help with that too.
We are the digital experience company for business brands.
In today’s connected world, experience is brand.
So we help you create better experiences for your customers, employees, partners and stakeholders. Ones that work in empathy with them to achieve their goals, engage and delight them, and build brand loyalty.