Lesson 2: Design plan & 3D UIs

--

In this post, we learn how to design UIs in 3D, along with reflecting on the UX of the experience. To begin with, let’s look at how to plan your immersive experience.

HEART framework:

The HEART framework by Google

The HEART framework was introduced by Google and is encouraged as a reference point when designing experiences. A few additional points to plan with below.

1. Goals- What are your goals for the user and for yourself?

2. Audience- Who are you making the experience for? Why will they use it?

3. Limits- Will your experience have limits? If so, how will you address them?

4. Experience- How do you want the user to feel during and after the experience? Also, how present are they? (Remember, we spoke about the importance of presence in the previous post.)


What research shows:

Multi-modal interactions-
Across various studies, it's safe to say that multi-modal interactions are necessary to make an experience more immersive. Multi-modal means involving multiple sensory channels: sound, visuals, haptic feedback (your controller vibrating), and so on. Imagine playing a game with only visuals and no sound; would you feel immersed in it?

A multi-modal experience deepens immersion. Image from https://bit.ly/2Z466BY

Motion-
Imagine walking in a park and all of a sudden, the park rotates itself and your path is suddenly changed; how would you react? The picture below by Escher always trips me out, haha!

Relativity by Escher

When you are immersed in a 3D world, you experience it from a 1st person perspective. Humans are sensitive to motion, especially when it is unexpected. Motion is most disorienting when the user does not initiate it, and particularly when it is vertical (imagine suddenly falling off a cliff; it would definitely frighten you).

State while experiencing-
One of the key points to keep in mind is the posture in which you'd want users to experience your offering. Sitting? Standing? Or lying down?

Image taken from Mike Alger’s talk at MCE 2018.

According to Mike's research, people want long-form content, something they've never seen before, while seated.

UIs in 3D:

Viewing content-
In VR, a few common ways to view your UIs are flat, faceted and curved.

Image taken from Mike Alger’s talk at MCE 2018.

I am not sure which I'd pick; it probably depends on the task. For e-mails, I definitely wouldn't use the faceted (center) option. According to Mike, curved seems more sleek and futuristic.

Placing content:
One of the best sets of guidelines I've seen for designing VR UIs comes, again, from Mike Alger's interview here. Start the video at 5:20 to get to the point.

VR Interface Design Pre-Visualisation Methods

Field of view- Horizontal
A person can view content within about a 30-degree range without turning their head, and can comfortably view up to about a 55-degree range. Beyond that, the user has to turn further, which might not be comfortable (assuming they are NOT sitting in a swivel chair).

Field of view- sideways

Field of view- Vertical
According to the presentation above, a user can comfortably view 20 degrees above and 12 degrees below the horizon, with maximums of 60 degrees above and 40 degrees below. Note: this was observed four years ago by Alex Chu (then head of Samsung Gear VR).

Field of view- Horizontal/vertical

More recently, Microsoft has offered its own gaze guidelines, from the perspective of the HoloLens.

Allowable field of view (FOV) as determined by neck range of motion- Microsoft

Quoting directly from their guidelines below:

  • “Avoid gaze angles more than 10 degrees above the horizon (vertical movement)
  • Avoid gaze angles more than 60 degrees below the horizon (vertical movement)
  • Avoid neck rotations more than 45 degrees off-center (horizontal movement)”
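As a minimal sketch, these three limits could be encoded as a single comfort check. The function name and sign convention (positive pitch = above the horizon) are my own assumptions, not part of Microsoft's guidelines:

```python
def gaze_is_comfortable(pitch_deg: float, yaw_deg: float) -> bool:
    """Check a gaze direction against the HoloLens guidelines quoted above.

    pitch_deg: degrees above (+) or below (-) the horizon.
    yaw_deg: degrees off-center; sign indicates left/right.
    """
    if pitch_deg > 10:     # no more than 10 degrees above the horizon
        return False
    if pitch_deg < -60:    # no more than 60 degrees below the horizon
        return False
    if abs(yaw_deg) > 45:  # no more than 45 degrees off-center
        return False
    return True
```

A gaze straight ahead passes; looking 15 degrees up, 70 degrees down, or 50 degrees sideways all fail.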

Content zone-
Anything too close to the eyes can be uncomfortable, so it is recommended not to place permanent content within 0.5m of the user. For an immersive experience, 1.25–2.5m is recommended.

The main content zone extends from 1 to 20m; the user has to use their peripheral vision for content placed between 77–102 degrees. It is recommended not to place important content in the peripheral zone, as it may be missed. Beyond that is the curiosity zone, where the user literally turns around (imagine a scene where the user hears their name called from behind). Content placed beyond 20m loses the perception of depth. According to Oculus best practices, menus and GUIs should be placed at least 0.5m away, though many have found 1m to be comfortable.
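These placement rules lend themselves to a small sketch: turn a viewing direction plus distance into a 3D offset from the user's head, and classify a distance into the zones above. The coordinate convention (+z forward, +x right, +y up) and the zone labels are my own assumptions:

```python
import math

def place_panel(yaw_deg: float, pitch_deg: float, distance_m: float):
    """Convert a viewing direction and distance into an (x, y, z)
    offset from the user's head. +z forward, +x right, +y up."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    x = distance_m * math.cos(pitch) * math.sin(yaw)
    y = distance_m * math.sin(pitch)
    z = distance_m * math.cos(pitch) * math.cos(yaw)
    return x, y, z

def content_zone(distance_m: float) -> str:
    """Classify a distance into the zones described above."""
    if distance_m < 0.5:
        return "too close"              # no permanent content here
    if distance_m <= 20:
        return "main content zone"      # comfortable sweet spot: 1.25-2.5m
    return "beyond depth perception"    # past 20m, depth cues flatten out
```

For example, a panel straight ahead at 2m sits at offset (0, 0, 2), squarely in the main content zone.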

Viewing patterns-
Studies have shown that websites have a typical 'gaze pattern' that starts at the top left and swiftly progresses down vertically while the horizontal gaze shrinks (the F or Z pattern, depending on the website). Likewise, VR has its own gaze pattern.

Website gaze pattern- https://bit.ly/2MHRimI
VR gaze pattern- Mike Alger

In VR, a user naturally focuses first on the center of the experience, then looks around to the sides to make sense of the 3D environment, then down (perhaps to see the surface below), then up (we rarely look up for answers, which may explain the delay), and lastly at the controller in their hand.

Text/object size-
In VR, the minimum text size should be 14px, while 20px is recommended for comfortable reading.

Image taken from Mike Alger’s talk at MCE 2018.

Similarly, according to Microsoft Hololens, a comfortable setting for viewing objects is 2m, and the object size should not be more than 10cm, as it will lose its believability.
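Those two numbers can be connected with basic trigonometry: the visual angle an object subtends follows from its size and viewing distance. A quick sketch (a 10cm object at 2m works out to roughly 2.9 degrees of visual angle):

```python
import math

def angular_size_deg(object_size_m: float, distance_m: float) -> float:
    """Visual angle subtended by an object of a given size
    viewed head-on at a given distance."""
    return math.degrees(2 * math.atan(object_size_m / (2 * distance_m)))
```

So per Microsoft's numbers, comfortable objects stay within a few degrees of visual angle; a larger object at the same distance would dominate the view and lose believability.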

Object size guidelines- Microsoft

Voice

Voice is something that feels natural to us. It's very intuitive and arguably one of our most powerful interaction techniques (vision comes first, of course, but voice, i.e. language, has helped us humans to unprecedented degrees). In VR, we may be tempted to say, or shout, whatever we want to happen within the experience; after all, we are totally immersed, right?
Naturally, voice systems have some pros and cons; let's look at them below.

Pros:
1. Reduces time and effort- Imagine just telling the computer what to do instead of doing tasks manually. Hella time-saving.
2. Socially acceptable- People around us talk on their smartphones all the time, so it would not be seen as weird (unless, of course, you do it loudly in a public setting).
3. Routine- Done well, this form of interaction can become routine quickly, as it is intuitive and feels natural.

Cons:
We just have to look at existing voice interaction systems to get a clue.

1. Accent- What if the system does not understand your accent? And even if it does, what about the context?
2. Quantify- Suppose you tell the system to reduce or increase the volume; by how much should it change it? Or think of checking your e-mail and saying "zoom in"; well, how much?

Tips for voice interactions:

  1. Keep your commands as concise as possible- “play video” > “play currently selected video”
  2. Keep it simple- “show note” > “show place card”
  3. Do not use similar-sounding commands
  4. Having an ‘undo’ command is one thing, but make sure no command can destroy what you’ve built (think alt+f4).
  5. Maintain consistency in commands- If “go back” takes you to the previous page, maintain that consistency.
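The tips above can be illustrated with a tiny command registry: concise, unique phrases, a history that powers "undo", and every action shipping with its own reversal so nothing is destructive. All names here are hypothetical, not from any real voice SDK:

```python
class VoiceCommands:
    """Minimal voice-command registry sketching the tips above."""

    def __init__(self):
        self._handlers = {}  # phrase -> (do, undo) callables
        self._history = []   # stack of undo callables

    def register(self, phrase, do, undo):
        # Every command ships with its own undo, so no command
        # can irreversibly destroy what the user has built.
        self._handlers[phrase.lower()] = (do, undo)

    def handle(self, phrase):
        phrase = phrase.lower().strip()
        if phrase == "undo":
            if self._history:
                self._history.pop()()  # run the most recent undo
            return
        do, undo = self._handlers[phrase]
        do()
        self._history.append(undo)
```

For example, registering a concise "play video" (rather than "play currently selected video") with a paired undo keeps every action reversible and the grammar consistent.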

The transition from 2D to 3D:

Image: Interaction Design Foundation
  1. Goals- In 2D, our goals consist of getting the user from one page of the app to another. This is done by giving them precise navigation points (icons). In VR, we do this by giving users ‘directions’, instead of precise icons like a mobile app.
  2. Affordances- In 2D, you can give affordances like highlighting an object so the user knows where to click (screens). In VR, we give affordances through scenes and tools. Since it's a 1st person experience, affordances are made to look like they are around the user.
  3. Flows- For a mobile app, we have things like ‘flow chart’ for the user journey, etc. In VR, we have these flows in the form of a narrative. It's more of a ‘story flow’ that describes the user journey.
  4. States- On a mobile app, we have hover states that indicate what we're about to select. In 3D, we have 'episodes' that act as a collection/memory. For example, you want to go inside a shopping mall in VR, but can only do so if you get on the bus waiting below your virtual building. This episode acts as a state while in VR.
  5. Emotions- On 2D mobile apps, several apps have done their best to entice emotions in us with fancy/unique push notifications, fun welcome animations, etc. How well they work is debatable. In 3D though, you cannot just make someone feel something, you have to create a mood for it. If you want someone to feel relaxed/stress-free in VR, you have to create an environment that makes them feel that way. You have to induce the mood! (think multi-modal experiences- visuals of them laying on the beach, sounds of waves, etc).
  6. Navigation- In 2D, navigation is done through the design of the website/app; UI elements like the '←' button take you to the previous page, and so on. In 3D though, where are you? In a new world where you can turn a full 180 degrees, how do you navigate, and to where? This is spatial navigation, which considers the entire space around you.
  7. Tools- In Photoshop, for example, your toolkit sits on the left of the GUI. In VR, a tool can be either 'body locked' or 'world locked'. Body locked means it stays with you at all times; think of a pigeon on your shoulder that can assist you whenever you need something. World locked means it lives somewhere out in the world/environment. For example, to exit an experience, you always look up at your surroundings, where the exit tool 'X' is placed.
  8. Diegetic- A diegetic UI exists inside the virtual world itself and is experienced in 1st person, as part of the story. A non-diegetic UI sits outside the world, like a narrator explaining the plot to you from a 3rd person vantage. Diegetic UIs are far more immersive, as you feel like you're a part of the world.
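The body-locked vs. world-locked distinction can be sketched as a per-frame placement step: body-locked objects re-apply their offset relative to the head every frame, while world-locked objects keep the world position they already have. The coordinate convention and function shape are my own assumptions:

```python
import math

def update_positions(head_pos, head_yaw_deg, body_locked, world_locked):
    """Per-frame placement sketch.

    body_locked: list of (name, (dx, dy, dz)) pairs; the local offset is
        re-applied relative to the head each frame, so the object follows you.
    world_locked: dict of name -> fixed world position.
    """
    yaw = math.radians(head_yaw_deg)
    placed = {}
    for name, (dx, dy, dz) in body_locked:
        # Rotate the local offset by the head's yaw, then translate.
        wx = head_pos[0] + dx * math.cos(yaw) + dz * math.sin(yaw)
        wy = head_pos[1] + dy
        wz = head_pos[2] - dx * math.sin(yaw) + dz * math.cos(yaw)
        placed[name] = (wx, wy, wz)
    for name, pos in world_locked.items():
        placed[name] = pos  # untouched: it lives in the world, not on you
    return placed
```

As the head moves, a body-locked hint panel follows it, while a world-locked exit 'X' stays put wherever it was placed in the scene.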

Notes:

- Remember that believability is the core principle in VR. This has brought 'skeuomorphism' back into trend, as you have to design for believability.

- Discovery should be super easy. Keep menu density low; a user shouldn't have to go through many sub-menus to get what she wants. One sub-menu looks decent if done right, but imagine more options popping out of the right-side menu. It would certainly hamper immersion and could break presence (a big NO-NO).

Elite Dangerous in Oculus Rift.

- Less movement, less multi-tasking, and less visual input are key to maintaining presence.

- Always think of the PCT: the Persona of your user, the Context in which they are using the experience, and the Task they are trying to achieve.

Lessons learned by Christophe Tauziet- Facebook head of Social VR:

Christophe was a member of the Facebook Social VR team and, more recently, is a design manager at Uber. Here are his tips for designing interactions in VR. He's given a few more tips specific to Social VR; I have omitted them to fit the context above.

  1. Avoid frequent high arm raises- The higher the arm is raised, the faster the interaction should be, or the user may suffer from fatigue. Raising your arm with the elbow away from the body is especially tiring.
  2. Moving interfaces are a challenge, so avoid putting major interactions/selections on them; they can still work for quick interactions.
  3. As discussed above, objects should be kept at an optimal distance (2m) from your virtual self. Objects that are too close are harder to pick up, making it challenging.
  4. This came as a surprise to me, but he stated that ‘realistic hands’ feel creepy to users, as it makes them feel like they’re in someone else’s body. One would assume that realistic avatars would deepen immersion, but somehow it wasn’t true in this case; maybe in the future this would change.
  5. Making the VR world behave differently from the real-world could break immersion. Example- your hand shouldn’t go through the wall if you’re trying to touch it, etc.

Common AR/VR emotions:

  • Surprise- people often get surprised at how real this feels, and most report losing a sense of time.
  • Excitement- As users discover their way in VR, they get intrigued and excited about what they can do.
  • Suspense- As most people do not know what might happen next, there is a level of suspense that is also present.
  • Delight- Once users have a good experience, they are left with a sense of pleasure, as they experienced it from a first-person perspective.

That’s all for this week. I will update again next week with my learnings.

Best,
BQ

Links:

https://www.youtube.com/watch?v=49sm52fG0dw

https://medium.com/@morganmfritz/ar-vr-resources-learning-and-inspiration-f96f60ffd18c

https://medium.com/the-language-of-vr/virtual-reality-s-fundamental-question-ee94763a917d

https://www.youtube.com/watch?v=id86HeV-Vb8&t=452s (start from 5:20)

https://www.youtube.com/watch?v=n3b8hZ5NV2E

https://medium.com/r/?url=https%3A%2F%2Fwww.interaction-design.org%2Fcourses%2Fhow-to-design-for-augmented-and-virtual-reality


