Augmented reality (AR) is not yet the most common form of user interface (UI) – but it may be in the coming years. Apple has been developing its AR glasses for some time now, and many other big tech brands are building their own. The majority of UX/UI designers could soon be seriously looking into how to overlay the digital user experience on top of real-world settings.

1 Automatic Adjustments

Unlike a screen with a predictable backdrop, the real world offers a wide range of settings. Reading black text in a dark alleyway or spotting yellow highlights on a bright, sunny beach is clearly a terrible user experience. A display that adapts quickly to a wide range of backdrops helps a ton.
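As a minimal sketch of how such an adjustment could work: sample the backdrop color behind the text (the camera-sampling step is assumed here) and pick whichever text color gives the higher contrast ratio, using the standard WCAG relative-luminance formula.

```python
def srgb_to_linear(c: int) -> float:
    """Convert an sRGB channel (0-255) to linear light (WCAG 2.x formula)."""
    c = c / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(r: int, g: int, b: int) -> float:
    """WCAG relative luminance of an sRGB color, in [0, 1]."""
    return (0.2126 * srgb_to_linear(r)
            + 0.7152 * srgb_to_linear(g)
            + 0.0722 * srgb_to_linear(b))

def pick_text_color(bg_r: int, bg_g: int, bg_b: int) -> str:
    """Choose white or black text for best contrast against the sampled backdrop."""
    lum = relative_luminance(bg_r, bg_g, bg_b)
    # WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05)
    contrast_white = 1.05 / (lum + 0.05)        # white text on this backdrop
    contrast_black = (lum + 0.05) / 0.05        # black text on this backdrop
    return "white" if contrast_white >= contrast_black else "black"
```

Against a dark alleyway backdrop this picks white text; against a bright sandy beach it picks black. A real headset would smooth these switches over time to avoid flicker.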

Texts appearing during a phone call at night – AR concept

2 Side Notifications

Unless it’s an emergency (e.g., a serious amber alert), users will quickly grow infuriated at having their Instagram or Twitter notifications pop up right in the middle of their view. Even with the rectangular screen slabs in our pockets, the vibrations and chimes are already a distraction. To lessen the irritation, new texts, status updates, or news headlines could by default appear at the side or top of your field of view.
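The routing rule described above is simple enough to sketch. The anchor names (`"top-edge"`, `"center"`) and the `Notification` shape are hypothetical, but the logic is exactly the default proposed: only emergencies earn the center of the wearer's view.

```python
from dataclasses import dataclass

# Hypothetical anchor positions within the wearer's field of view.
PERIPHERY = "top-edge"   # unobtrusive default placement
CENTER = "center"        # reserved for emergencies

@dataclass
class Notification:
    source: str                 # e.g. "Instagram", "AmberAlert"
    is_emergency: bool = False

def placement(n: Notification) -> str:
    """Route emergency alerts to the center of view; everything else to the periphery."""
    return CENTER if n.is_emergency else PERIPHERY
```

A user's personalization settings (see the closing section) could later override `PERIPHERY` per app, but the emergency-to-center rule would stay fixed.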

3 Minimalism

Generally speaking, users will not want to read swathes of text in front of them all the time. Visual minimalism is key. Instead of writing out directions, simply show a small arrow and have a voice dictate which way to go; even users who are hard of hearing will likely only need the arrow as a guide. Smaller icons and symbols also prevent the user from stepping backwards (potentially dangerous) to get a better view of what is on screen. And by default, only show what is needed at a particular moment: the battery percentage does not need to hang continuously in the top right of your view like it would on a computer or phone screen.

4 Mixed Reality

Mixed reality is augmented reality on steroids. With mixed reality, digital overlays interact more closely with the real-world environment, mimicking the shadows and lighting they would have if they were physically present. It also mirrors real-world physics: when your digital object hits the physical ground, it may shatter, for example. These glasses could use mixed reality to better visualize depth. For example, if you’re walking down a road full of restaurants and have opened the Yelp app for reviews, the ratings feel less like digital clutter in your face if they float realistically beside the restaurants down the street.

5 MAYA Principle

MAYA = “Most Advanced, Yet Acceptable”. It means designing for the future while keeping the current user base in mind. For example, when the first iPhone came out, most users were still used to pushing physical buttons, so the iPhone UI used skeuomorphism: shadows were added to the digital buttons to make them look like physical-world buttons. AR still needs to be intuitive – pinch to shrink, swipe to remove, tap to select, etc. However, new techniques can be introduced as well. With your hands constantly inside the device’s potential interaction area, it may need a way to tell whether you’re truly selecting something. The AR game “EyeToy” came up with an answer to this back in the early 2000s: you hover and wave over a button for a few seconds, and the button’s interior fills up before it fully selects. For important “are you sure?” interactions, this could be implemented.

A minigame in EyeToy (2003)
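The EyeToy-style dwell selection boils down to a small piece of state: when the hover began, and how far the fill has progressed. A minimal sketch (the 2-second dwell time is an assumed default, not anything from EyeToy itself):

```python
class DwellButton:
    """Dwell-to-select: the button 'fills up' while the hand hovers over it,
    and only counts as selected once the fill completes."""

    def __init__(self, dwell_seconds: float = 2.0):
        self.dwell_seconds = dwell_seconds
        self.hover_started = None   # timestamp when the current hover began

    def update(self, hovering: bool, now: float) -> float:
        """Advance the state each frame; returns fill progress in [0, 1].
        A progress of 1.0 means the button has been selected."""
        if not hovering:
            self.hover_started = None   # leaving the button resets the fill
            return 0.0
        if self.hover_started is None:
            self.hover_started = now    # hover just began
        progress = (now - self.hover_started) / self.dwell_seconds
        return min(progress, 1.0)
```

Because an accidental pass of the hand resets the fill, this is a natural fit for destructive “are you sure?” actions: the visible fill gives the user both feedback and time to bail out.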

In the end, this list is what would be recommended as a set of defaults. The best practice in today’s digital world is still customizability. Some like text and symbols big, some like them small. Some want to constantly see the battery percentage, some don’t. Some might even want each text to be a blaring red notification right in the center of their vision. The key is personalization options.
