Is UX for AR just UX for Reality?

In a previous blog I covered developing the user experience for a large-scale VR application, in relation to our award-winning project at REA Group. That post detailed a number of thoughts around UX testing in VR. Through trial and error I learnt that VR and UX had a strong relationship, but at the start of the project I was unsure how to bring them together. By the end, it was a lot clearer.

Our journey of UX in AR started out in a similar fashion. In both AR and VR, it’s amazing how much you can achieve by implementing a minimal experience and letting users’ vestibular systems and perception fill in the gaps. To get there, though, we need to think about how we design with unpredictable environments, different surfaces, light, and external sounds in mind.

If the world becomes your UX canvas, how can you design for that?

Recently, Google announced the new ARCore library and features at the Google I/O conference. One of the talks, presented by Alesha Unpingco and Alex Faaborg, covered AR best practices and introduced their 5 Pillars of AR Design. This followed in the footsteps of an excellent blog piece by Alesha on AR design, and added further discussion around their best practices documentation on the web.

The 5 pillars are:

  • Understand the user’s environment
  • Plan for the user’s movement
  • Onboard users by initialising smoothly
  • Design natural object interactions
  • Balance on-screen and volumetric interface design

With my recent experience on AR projects, combined with various learnings from the DiUS UX team, I’ll share thoughts and scenarios on how we go about applying these patterns.

UNDERSTAND THE USER’S ENVIRONMENT

An indoor space is easier to work with because a common context is shared by everyone involved. You can expect to see walls, tables, chairs, and floor space. Even an empty warehouse has floor space and walls. Outdoor scenes are much harder. The only guarantee is the ground.

Light makes a noticeable difference in either environment. Take time with the lighting so your augmented objects appear as if they are part of the environment.
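To make that concrete, here’s a minimal sketch of reading ARCore’s per-frame ambient light estimate and passing it on. The `applyLighting` hook is a hypothetical stand-in for your own rendering code.

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.LightEstimate

// Hypothetical renderer hook: feed these values into your shader uniforms.
fun applyLighting(pixelIntensity: Float, colorCorrection: FloatArray) { /* ... */ }

// Read ARCore's light estimate once per frame and pass it to the renderer.
fun updateLighting(frame: Frame) {
    val estimate = frame.lightEstimate
    if (estimate.state != LightEstimate.State.VALID) return

    // Overall brightness of the camera image (0.0 = dark, 1.0 = bright).
    val intensity = estimate.pixelIntensity

    // Per-channel colour correction (r, g, b, intensity) for tinting
    // virtual materials to match the ambient light.
    val correction = FloatArray(4)
    estimate.getColorCorrection(correction, 0)

    applyLighting(intensity, correction)
}
```

Even this simple ambient estimate goes a long way: a bright white object rendered into a dim, warmly lit room is an immediate giveaway that it’s fake.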

The more you integrate your application with the environment, the better the experience will be for your users. When designing for this environment, detach yourself from the phone frame. Don’t think about mobile UI, or even the phone at all. Sketch the environment your users will be in and think about scale relative to the user. Think about the objects they will be interacting with.

If you were creating a treasure hunt app indoors, you could utilise the environment to great effect. You could hide objects behind furniture and use different surfaces to animate objects. What if you had to catch a rabbit? You could have it jump from the table to the floor. Add in some well-thought-out light estimation to provide a rich experience.

PLAN FOR THE USER’S MOVEMENT

Designing your experience beyond the bounds of the screen can make your application feel more alive. In one of our projects, we wanted to view a large building model in the place it would eventually occupy in the real world. Once the AR building appeared, some users wanted to walk up to it and move around it to appreciate the scale of the model. Other users wondered if they could move near it at all.

We’ve all grown accustomed to information being presented to us on something resembling a TV screen, and we generally don’t need to physically move to interact with that information. But what if an augmented object moves out of the device’s view? Use navigational aids to suggest where a user can physically point their device; this helps when objects can move, or have already moved, off-screen.
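As a rough sketch of one way to drive such an aid with ARCore: project the object’s world position into clip space each frame, and show a directional arrow whenever it falls outside the screen. The near and far plane values here are assumptions.

```kotlin
import android.opengl.Matrix
import com.google.ar.core.Camera
import com.google.ar.core.Pose

// Returns null when the object is on screen, or a rough 2D direction
// for an off-screen arrow otherwise.
fun offscreenDirection(camera: Camera, objectPose: Pose): Pair<Float, Float>? {
    val view = FloatArray(16)
    val proj = FloatArray(16)
    camera.getViewMatrix(view, 0)
    camera.getProjectionMatrix(proj, 0, 0.1f, 100f) // assumed near/far planes

    // Transform the object's position into clip space.
    val world = floatArrayOf(objectPose.tx(), objectPose.ty(), objectPose.tz(), 1f)
    val eye = FloatArray(4)
    val clip = FloatArray(4)
    Matrix.multiplyMV(eye, 0, view, 0, world, 0)
    Matrix.multiplyMV(clip, 0, proj, 0, eye, 0)

    // In normalised device coordinates, visible points fall within [-1, 1].
    val ndcX = clip[0] / clip[3]
    val ndcY = clip[1] / clip[3]
    val behind = clip[3] <= 0f
    val onScreen = !behind && ndcX in -1f..1f && ndcY in -1f..1f

    return if (onScreen) null else Pair(ndcX, ndcY)
}
```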

At the Google I/O conference, Alesha Unpingco mentioned that AR experiences generally fall into three scales: table scale, room scale, and world scale.

I find that room scale and world scale provide the most interesting AR user experience. You feel like there is something genuinely within your space.

Room scale gives a big boost in realism. Screencap source: Introducing ARCore

Table scale tends to feel like something I would do on a normal laptop or desktop screen. This is where thinking outside the bounds of the screen makes a huge difference. Could you imagine if the Pokémon experience involved chasing a Pokémon around your living room, with it jumping across surfaces as you chased it? Moving to room scale gives the app a lot more life.

Table scale is like god mode. Screencap source: Introducing ARCore

Set the right expectation early for users so they know how much space they will need to use your app.

ONBOARD USERS BY INITIALISING SMOOTHLY

In any AR application, understanding depth requires some movement. In our experiments, asking users to first scan for a surface always felt like an immersion-breaking experience. The trick is in making this step fun or easy. The simplest approach is a short animation highlighting exactly what the user should do.
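Just as important is knowing when to dismiss the prompt. A minimal sketch, assuming a hypothetical `ScanAnimation` wrapper around whatever animation view you use: keep it visible until ARCore reports at least one tracked plane.

```kotlin
import com.google.ar.core.Plane
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Hypothetical wrapper around your scan-prompt animation view.
interface ScanAnimation {
    fun show()
    fun hide()
}

// Call once per frame: hide the "move your phone" animation as soon as
// ARCore is tracking at least one usable surface.
fun updateScanPrompt(session: Session, scanPrompt: ScanAnimation) {
    val hasSurface = session
        .getAllTrackables(Plane::class.java)
        .any { it.trackingState == TrackingState.TRACKING }

    if (hasSurface) scanPrompt.hide() else scanPrompt.show()
}
```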

When placing objects into the scene, there are a few suggestions to try. Use navigational aids to show where objects can be placed; suggestive icons or animations under the objects you want to place into the scene help.

For example, as you drag an object around, the icon or animation under it can change depending on the situation. If you cannot place the object onto a surface in the scene, show a red X icon underneath it. If you can place it on a surface, project an anchor icon onto the surface to highlight exactly where the object will sit.
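Here’s a sketch of how that decision might look with ARCore’s hit testing, where (dragX, dragY) is the drag position in screen pixels and the indicator names are hypothetical assets:

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.Plane

enum class Indicator { ANCHOR, RED_X }

// Choose which placement indicator to draw under a dragged object.
fun placementIndicator(frame: Frame, dragX: Float, dragY: Float): Indicator {
    val hitOnPlane = frame.hitTest(dragX, dragY).firstOrNull { hit ->
        val trackable = hit.trackable
        // Only count hits that land inside a detected plane's boundary.
        trackable is Plane && trackable.isPoseInPolygon(hit.hitPose)
    }
    return if (hitOnPlane != null) Indicator.ANCHOR else Indicator.RED_X
}
```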

DESIGN NATURAL OBJECT INTERACTIONS

Think about how we interact with objects in the real world. Imagine you have a cup of coffee in your hands and you want to place it on the coffee table next to your couch. You don’t drag it along every surface you see until it magically ‘snaps’ into place on the coffee table. You hold it, and when it’s above the table, you naturally place it.

In an AR scene, think of the objects in the same way. If I were to place a 3D cup onto the coffee table, I would drag it from a 2D gallery popup into the scene. While dragging it around, it would hover above the surfaces at a constant height. I could use a shadow or other feedback on all the known surfaces as I move it around. When I am ready to place it, I would let it drop onto the desired surface. I could add some physics to make it bounce a little, or show some kind of ‘hit’ animation. This kind of visual feedback shows the user what ARCore knows about their environment.
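A minimal sketch of that hold-then-place interaction, assuming a drag position in screen pixels and an assumed constant hover height:

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Plane
import com.google.ar.core.Pose

const val HOVER_HEIGHT_M = 0.15f // assumed hover height, in metres

// While dragging: where should the object hover? Render the shadow at
// hit.hitPose itself for the surface feedback described above.
fun hoverPose(frame: Frame, dragX: Float, dragY: Float): Pose? {
    val hit = frame.hitTest(dragX, dragY).firstOrNull { it.trackable is Plane }
        ?: return null // no known surface under the finger; keep free-floating
    return hit.hitPose.compose(Pose.makeTranslation(0f, HOVER_HEIGHT_M, 0f))
}

// On release: pin the object to the surface. Anchors keep it locked in
// place as ARCore refines its understanding of the environment.
fun dropObject(frame: Frame, dragX: Float, dragY: Float): Anchor? =
    frame.hitTest(dragX, dragY).firstOrNull { it.trackable is Plane }?.createAnchor()
```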

Source: https://designguidelines.withgoogle.com/ar-design/initialization-adding-virtual-assets/optimal-placement-range.html

Visualising the space at your disposal is harder if you are dragging an object with your finger on the screen. An idea from the Google VR playbook is to use a 2D reticle in the centre of the screen with a select-object button in the corner. When the reticle is over an object, you can select it with the button, then move the phone to place the object elsewhere without your hand getting in the way.
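The reticle version is barely any code at all; a sketch:

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.HitResult

// Hit test from the centre of the screen instead of a finger position.
// viewWidth/viewHeight are the AR view's dimensions in pixels.
fun underReticle(frame: Frame, viewWidth: Int, viewHeight: Int): HitResult? =
    frame.hitTest(viewWidth / 2f, viewHeight / 2f).firstOrNull()
```

The select button in the corner then simply toggles whether the hit result is ‘held’, and the held object follows the reticle until the button is pressed again.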

There are more design ideas to explore here than I can list. Think about what interactions might take place between the user and your application, and take some time to experiment.


BALANCE ON-SCREEN AND VOLUMETRIC INTERFACE DESIGN

Consider that the mobile device is the user’s only view of your AR application. Avoid having too much 2D UI obscuring the application window; every control you introduce impedes the user’s ability to enjoy your app.

We’re used to designs with visual cues and call-to-action buttons. In augmented apps these annoy users, because users want to view your app using all the available screen space.

When testing the realestateVR application, we had a similar problem with UI. We started by placing the user in a fairly polished placeholder house, right in the lounge room. They were unable to explore it; its purpose was simply to set the house ‘theme’. We had users initially look at a large, wall-sized menu positioned in front of them. Almost all of the users we tested wanted to explore the placeholder house and ignored the menu.

In VR we solved this by cutting back the size and quality of the placeholder house so it felt more like a room. In AR we don’t have that luxury. In AR, reserve 2D UI for high-frequency controls or controls that need fast access. A good example is a camera shutter button or a fire button in a game.

Move contextual or object-specific UI out into the scene itself as volumetric UI. If an object in the scene needs contextual information, have that information appear above it or close by.

Contextual UI is much better
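One way to prototype this kind of contextual UI on Android is with the Sceneform library, which can render an ordinary Android layout as a card floating in the scene. A sketch, where `R.layout.info_card` is a hypothetical layout resource:

```kotlin
import android.content.Context
import com.google.ar.sceneform.Node
import com.google.ar.sceneform.math.Vector3
import com.google.ar.sceneform.rendering.ViewRenderable

// Attach a 2D info card half a metre above an object's node.
fun attachInfoCard(context: Context, objectNode: Node) {
    ViewRenderable.builder()
        .setView(context, R.layout.info_card) // hypothetical layout resource
        .build()
        .thenAccept { card ->
            Node().apply {
                setParent(objectNode)
                localPosition = Vector3(0f, 0.5f, 0f)
                renderable = card
            }
        }
}
```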

SO IN CONCLUSION…

We are seeing a new age of immersion emerge. We are moving away from seeking out information towards having information brought to us based on our physical environment and preferences. I look forward to spending more time exploring augmented reality in the next decade as smart glasses replace our smartphones.

At DiUS we are increasingly playing at the intersection of UX and AR, bringing together everything we are known for today to engineer physical and digital products (chat, voice, mobile/web, AR/VR, sensors, actuators, data sources, services) that intelligently learn, adapt and respond to user behaviour and needs.

References:

https://www.blog.google/products/google-ar-vr/best-practices-mobile-ar-design/

https://designguidelines.withgoogle.com/ar-design/
