Final lesson- Designing & validating your 3D experiences

--

“The world of reality has its limits; the world of imagination is boundless.” - Jean-Jacques Rousseau

Designing

When designing for VR/AR/MR, start by asking yourself the basics: What are your business objectives? Why are you creating this experience? And how is it going to make someone’s life better? Once that is clear, put some checks in place for your 3D UX.

3D UX checklist

1. Scene approach: As soon as the user puts on the headset and begins the experience, what do they see? What are the affordances (like a glowing object that cues them to tap on it)? What will the first 30 seconds look like? How will you build curiosity that pulls the user in, and how will the story arc progress? (We’ve discussed these in our previous posts.)
Affordance example- A door that gives a clue about what to do next

2. Experiencing state: How do you want users to consume your content? Sitting, standing, or lying back? It is well noted that for experiences requiring higher concentration, people prefer to be seated, whereas for more physically active experiences (like boxing or table tennis), people prefer to be standing.

What’s their state like when consuming your experience?

3. Emotions: Designing VR experiences takes a lot of cues from movie-making. The experience has to produce relatable emotions and deliver them at the right time. Decide which episode should spark which emotion, what role each emotion plays (why are you sparking it at all?), and what you expect the user to feel after they’ve felt it.

Keep the users' emotions in mind before designing VR experiences

4. Intentional interactions: When a user is in 3D, they should be able to do whatever they believe they can do. If there is a door, they should be able to open it; this builds believability, and blocking natural interactions breaks immersion. On top of that, there are ‘intentional interactions’: of everything the user can do, which specific interactions do you want them to perform? You may, for instance, require a particular interaction before the experience can proceed. Make sure to specify these (if any) in your UX; a minimal sketch of such a gated interaction follows this checklist.

You may want the user to intentionally perform an interaction as a demo of how to interact with your experience
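Here is that sketch: a glowing affordance (point 1) plus an intentional-interaction gate (point 4), assuming a Three.js scene. The doorMesh object, the interaction id and the pulse values are illustrative placeholders, not from any particular toolkit.

```typescript
import * as THREE from 'three';

// Illustrative affordance: pulse a door's emissive glow so the user knows it is interactive.
// `doorMesh` and the intensity range are assumptions for this sketch.
function pulseAffordance(doorMesh: THREE.Mesh, elapsedSeconds: number): void {
  const material = doorMesh.material as THREE.MeshStandardMaterial;
  // Oscillate emissive intensity between 0.2 and 1.0 to draw the eye.
  material.emissiveIntensity = 0.6 + 0.4 * Math.sin(elapsedSeconds * 2.0);
}

// Illustrative "intentional interaction" gate: the experience only advances
// once the user has performed the interaction(s) we require.
class InteractionGate {
  private completed = new Set<string>();

  markDone(interactionId: string): void {
    this.completed.add(interactionId);
  }

  // The experience proceeds only when every required interaction has happened.
  canProceed(required: string[]): boolean {
    return required.every((id) => this.completed.has(id));
  }
}

// Usage sketch: when the user opens the door (e.g. via a controller "select" event):
const gate = new InteractionGate();
gate.markDone('open-entry-door');
if (gate.canProceed(['open-entry-door'])) {
  // Load the next scene / start the next story beat.
}
```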

Now that we have the basics settled, let’s look at interactions for a bit.

Interactions

Direct manipulation: Do not make the user think! Let the user manipulate the environment in a way that feels seamless. A good example is an AR experience that clearly hints at which direction to go, so the user does not have to think it through.

An example of direct manipulation

Indirect interaction: An indirect interaction is one where the user has to go through additional steps to reach their goal. If I click a door to go inside a room, I am interacting with it directly. But if I first have to click on a section, then get a choice of three doors, then click again to select a door before I can enter, the interaction is indirect: it increases the number of steps needed to complete the task. This hampers the experience, because more steps mean more effort for the user to figure out.

An example of going through several steps to get to the desired point

Physical considerations: If a user has to perform an action repeatedly, it can get very tedious. A regular complaint about HoloLens is that it requires too much air-tapping. Similarly, think of how the user’s body would feel if they had to repeat a particular task 20 times in the span of 5 minutes.

Too much air tapping can get tedious

Feedback: Just as in 2D UX, hover states are essential in MR; the user must know whether their action has registered or not. You can also give hints (affordances) to convey a point. Make sure your interactions provide the necessary feedback so the user never feels clueless.

An example of hover- It helps the user know that the system is doing what they intended
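If you are building on WebGL, hover feedback can be as simple as the rough Three.js sketch below; the highlight colour and the list of interactables are assumptions for illustration.

```typescript
import * as THREE from 'three';

// Minimal hover feedback: whatever the controller ray points at gets highlighted,
// so the user always knows their input is being registered.
const raycaster = new THREE.Raycaster();
const tempMatrix = new THREE.Matrix4();
let hovered: THREE.Mesh | null = null;

function updateHover(controller: THREE.Object3D, interactables: THREE.Mesh[]): void {
  // Aim the ray from the controller's position along its forward (-Z) axis.
  tempMatrix.identity().extractRotation(controller.matrixWorld);
  raycaster.ray.origin.setFromMatrixPosition(controller.matrixWorld);
  raycaster.ray.direction.set(0, 0, -1).applyMatrix4(tempMatrix);

  const hit = raycaster.intersectObjects(interactables, false)[0];
  const target = hit ? (hit.object as THREE.Mesh) : null;

  if (hovered && hovered !== target) {
    // Clear the old highlight.
    (hovered.material as THREE.MeshStandardMaterial).emissive.set(0x000000);
  }
  if (target) {
    // Highlight the new target so the user gets immediate visual feedback.
    (target.material as THREE.MeshStandardMaterial).emissive.set(0x2266ff);
  }
  hovered = target;
}
```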

Controller mechanics: “Just because you can, doesn’t mean you should.”
If your experience only requires three buttons, keep it that way. Just because your controller has 5–6 buttons the user can interact with does not mean you have to use them all. Keep the learning curve low; it helps the user and you alike.

Use fewer buttons where possible; it reduces the learning curve for the user
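One way to keep the button count honest is to map only the actions you need and deliberately ignore the rest. The sketch below assumes the WebXR ‘xr-standard’ gamepad mapping for the button indices; verify them against your target controller’s profile.

```typescript
// Map only the buttons the experience actually needs; leave the rest of the controller alone.
// Button indices below follow the WebXR "xr-standard" gamepad mapping as an assumption
// (0 = trigger, 1 = squeeze); check your target controller's profile.
type Action = 'select' | 'grab';

const buttonToAction: Partial<Record<number, Action>> = {
  0: 'select', // trigger
  1: 'grab',   // squeeze
  // Thumbsticks, touchpads and extra face buttons are deliberately left unmapped.
};

function pollActions(gamepad: Gamepad): Action[] {
  const active: Action[] = [];
  gamepad.buttons.forEach((button, index) => {
    const action = buttonToAction[index];
    if (action && button.pressed) {
      active.push(action);
    }
  });
  return active;
}
```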

Gaze vs Point vs Voice: When the user interacts with your options, how do they do it? Do they point at an option with their controller to select it, or is it selected just by gazing at it? Decide on this selection mechanism before designing your experience, and check that your hardware choice supports it. For example, gaze & commit is commonly used in MR experiences such as HoloLens, whereas point & commit is common in VR, as shown in the hover image above. You can also use voice, but as noted in our previous post, be careful with voice because of its limitations in recognizing different accents.

Types of interfaces- Microsoft
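If you support both mechanisms, the main thing that changes is where the selection ray comes from. A rough Three.js sketch (the function names are illustrative):

```typescript
import * as THREE from 'three';

const raycaster = new THREE.Raycaster();
const tempMatrix = new THREE.Matrix4();

// Gaze & commit: cast from the centre of the user's view (common in MR, e.g. HoloLens).
function gazeRay(camera: THREE.Camera): THREE.Raycaster {
  raycaster.setFromCamera(new THREE.Vector2(0, 0), camera);
  return raycaster;
}

// Point & commit: cast from the controller along its forward (-Z) axis (common in VR).
function pointerRay(controller: THREE.Object3D): THREE.Raycaster {
  tempMatrix.identity().extractRotation(controller.matrixWorld);
  raycaster.ray.origin.setFromMatrixPosition(controller.matrixWorld);
  raycaster.ray.direction.set(0, 0, -1).applyMatrix4(tempMatrix);
  return raycaster;
}
```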

Obstacles

Let’s look at some of the obstacles faced when designing for VR.

Judging closeness: A common obstacle is judging how close an object is in VR. What is the intended viewing distance? Since objects lose their sense of depth when they are more than about 20 meters away, how do we make sure we don’t run into such problems? We can refer to the VR design guidelines from Oculus, Google, Microsoft, etc., and build on their best practices for our experiences.
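A simple guard in code helps too. The sketch below clamps content to a comfortable distance band; the exact bounds are assumptions loosely based on vendor guidance, so check the guidelines for your platform.

```typescript
import * as THREE from 'three';

// Keep interactive content inside a distance band where stereo depth still reads well.
// These bounds are assumptions (check Oculus / Google / Microsoft guidelines);
// beyond roughly 20 m, stereo depth cues fade.
const NEAR_COMFORT_METERS = 0.75;
const FAR_COMFORT_METERS = 10;

function clampContentDistance(camera: THREE.Camera, content: THREE.Object3D): void {
  const toContent = new THREE.Vector3().subVectors(content.position, camera.position);
  const distance = toContent.length();
  const clamped = THREE.MathUtils.clamp(distance, NEAR_COMFORT_METERS, FAR_COMFORT_METERS);
  if (clamped !== distance) {
    // Re-place the content along the same line of sight at a comfortable depth.
    content.position.copy(camera.position).addScaledVector(toContent.normalize(), clamped);
  }
}
```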

Viewing: What is in view and what is out of view? This can be handled by referring to the FOV (field of view) and rotation angles covered in the previous post. Also, always test the experience in VR itself to get a real understanding; 2D mockups alone will not give a thorough picture.

What’s your Field of View going to include?

Reading and typing: Damn, just thinking about typing in VR gives me a headache. Can you imagine pointing at every letter to select it? Heavy reading in VR is also a no-no. Do not make the user read lots of text, as the orientation of the content can shake up other elements. If you want them to read a popup, expand the popup so its content is viewable while hiding the rest of the elements. Keep readable elements at a properly legible size and keep the amount of text to a minimum. One example of making typing fun in VR comes from Daydream Labs, shown below: instead of pointing at every letter and number, the user plays a drum to select them. Creative interfaces like these are emerging, and people are still figuring them out. You can refer to Google’s ‘text hit size’ guidance to size typeable elements accordingly.

Typing in VR
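A handy rule of thumb is to size text by the angle it subtends at the viewing distance rather than by absolute metres. The sketch below does exactly that; the 1.5-degree target is an assumed value for illustration, so check Google’s text/hit-size guidance for real numbers.

```typescript
// Worked example: pick a text height that subtends a comfortable angle at the viewing
// distance, instead of hard-coding a size in metres. The 1.5 degree default is an
// assumption for illustration only.
function textHeightForAngle(distanceMeters: number, angleDegrees = 1.5): number {
  const angleRadians = (angleDegrees * Math.PI) / 180;
  // height = 2 * d * tan(theta / 2)
  return 2 * distanceMeters * Math.tan(angleRadians / 2);
}

// At a 2 m reading distance this gives roughly 0.05 m (about 5 cm) tall text.
const heightAtTwoMeters = textHeightForAngle(2);
```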

Designing Immersive experiences

To design immersive experiences, you have to take the FOV into account in an immersive context. As we saw earlier, a normal FOV for exploring in VR is 94 degrees, while the FOV for immersive experiences is 120 degrees.

FOV within VR- typically 94 degrees
120 degrees for an immersive experience

Also, the main content should sit about 6 degrees below the horizon, since our natural neck position tilts roughly 6 degrees downwards. Within the experience, allow for 30–35 degrees of up/down neck movement.
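Here is a rough sketch of turning those numbers into placement checks; the helper names are illustrative, and the anchor is expressed in the viewer’s local frame (forward = -Z).

```typescript
import * as THREE from 'three';

// Place a content anchor 6 degrees below the horizon at a given distance, matching the
// natural resting neck position described above. Result is in the viewer's local frame.
function restingGazeAnchor(distanceMeters: number): THREE.Vector3 {
  const downwardTilt = THREE.MathUtils.degToRad(6);
  return new THREE.Vector3(
    0,
    -distanceMeters * Math.sin(downwardTilt), // drop below the horizon
    -distanceMeters * Math.cos(downwardTilt)  // forward from the viewer (-Z)
  );
}

// Check whether an object sits inside the horizontal FOV budget: 94 degrees for normal
// exploration, 120 degrees for an immersive experience (per the figures above).
function withinHorizontalFov(
  object: THREE.Object3D,
  camera: THREE.Camera,
  fovDegrees = 94
): boolean {
  const forward = camera.getWorldDirection(new THREE.Vector3());
  const toObject = object
    .getWorldPosition(new THREE.Vector3())
    .sub(camera.getWorldPosition(new THREE.Vector3()))
    .normalize();
  const angle = THREE.MathUtils.radToDeg(forward.angleTo(toObject));
  return angle <= fovDegrees / 2;
}
```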

Validating your experience

When testing your VR experience, perform what’s called ‘play testing’, not ‘hallway testing’. Hallway testing means testing your product with friends, family, colleagues, whoever happens to be down the hall. But when it comes to VR, you must have real users test your product. If it is a game, have real gamers test it, not people who are not into VR.

Testing a boxing game with real users

Top metric- ease of use

Evaluate ease of use on 3 attributes (a small tallying sketch follows the note below):
Success rate- Was the user able to do what was expected of them?
Failure rate- How many times did the user fail to understand the task?
Partial rate- Was the user only successful (or confused) some of the time?

Note: Test on your target headsets (Oculus, Valve, etc.).
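As mentioned above, a small tally like the sketch below is enough to turn raw play-testing notes into these three rates; the Outcome labels and data shape are assumptions for illustration.

```typescript
// Tally the three ease-of-use attributes from play-testing sessions.
type Outcome = 'success' | 'failure' | 'partial';

interface TaskAttempt {
  participant: string;
  task: string;
  outcome: Outcome;
}

function easeOfUseRates(attempts: TaskAttempt[]): Record<Outcome, number> {
  const counts: Record<Outcome, number> = { success: 0, failure: 0, partial: 0 };
  attempts.forEach((attempt) => { counts[attempt.outcome] += 1; });
  const total = attempts.length || 1; // avoid dividing by zero on an empty session
  return {
    success: counts.success / total,
    failure: counts.failure / total,
    partial: counts.partial / total,
  };
}

// Example: 10 attempts with 6 successes, 2 failures and 2 partials
// -> { success: 0.6, failure: 0.2, partial: 0.2 }
```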

Tips

Presence: Since you are in your own place/world in VR, the illusion of place is easier to recover if it breaks. But if the illusion of plausibility is broken, it is much harder to recover. Plausibility can be broken by unrealistic graphics, lag between action and response (you open a door and it only swings open two seconds later), poor sound, and so on. So take inspiration from the real world when designing for VR. The image below shows a game with not-so-realistic graphics, which breaks the plausibility illusion and is hard to recover from.

A game with unrealistic graphics

You can measure presence by asking users to describe what they felt in a questionnaire, by measuring their emotional/physiological state (sweat, heart rate, etc.), or by simply observing their behavior: do they go “Woooaaahhh, this is the coolest thing ever!”?

Time dilation: A good indication of presence is when a user has lost track of time and stays in the experience longer than they thought. It implies the user is fully immersed: you’ve done a good job!

Embodiment is key and realism is a must. Make everything look as real as possible (except perhaps hands, which get creepy, as we observed in our previous post).

The quick evaluation card below lets you gather key feedback on what the user experienced within VR, which is extremely valuable.

Lastly, take different people into consideration when designing your experience. Make sure it is gender-inclusive, can be enjoyed by people of different heights and sizes, and caters to people with disabilities.
