In this installment of Innovation Fridays, we're introducing a new series on anti-patterns – common but less-than-ideal approaches to solving recurring problems – in augmented reality (AR) app development. The inspiration for this series came from Mike, one of our crackerjack developers who can pretty much code in his sleep and knows JS backwards and forwards. In this first post, we take a closer look at why porting desktop and laptop applications and operating systems into augmented reality headsets can be more difficult than you might think.
Today's technology has made it relatively easy to port applications from video game consoles and various operating systems onto your computer (desktop or laptop), smartphone, or tablet. And with virtual reality (VR) and augmented reality (AR) set to become the next user interface paradigm, it would seem that porting over existing applications like Excel or Chrome would be the easiest way to get apps into an AR headset. However, it's not as simple as dumping existing code from the PC, Mac, or console versions of an app into an AR headset and expecting it to work.
From Flat 2D Screens to Immersive 3D Experiences
At a very high level, porting current computer and video game console applications into an AR headset is like converting a film shot in 2D into 3D: everything pops out at you, achieving some semblance of a 3D effect, but the end product won't be enjoyed by many people. The design considerations behind today's apps are constrained by flat screens, which can only display virtual content – an entirely different medium from the immersive 3D experience made possible by today's AR headsets.
And if you stop to think about it, AR headsets need to combine both virtual and real-world environments, which presents its own set of challenges, similar to those faced by VR app developers. Note that the challenges of porting applications from PCs and consoles onto virtual reality (VR) headsets have been well documented by VR legend Jesse Schell and debated amongst video game developers.
Optimizing Apps for AR Headsets
Applications on current-generation computing platforms are typically produced and optimized to take advantage of the overall computing system's capabilities: high frame rates, crystal-clear resolution, and fast computing and graphics processing power. Similarly, when working on AR apps, developers must consider optimizing for the following capabilities of modern AR headsets:
• Field of View (FOV)
A wide FOV, e.g. like the 90° FOV on the Meta 2, allows for more information to be displayed and reduces the amount of effort needed to divide your attention between the real and augmented virtual world – ultimately creating a more immersive AR experience.
• Resolution & Frame Rate
Goes hand in glove with FOV and is key to rendering virtual objects' motion convincingly in AR. Low resolution and frame rates mean holograms will look blurry; high resolution and frame rates keep them crisp and stable.
• Positional Tracking
A no-brainer given that virtual objects need to behave like their real-world counterparts. For example, if you leave a virtual model of a shoe on your desk and turn away from it, the virtual shoe should stay anchored in place on your desk like a real shoe would.
A virtual shoe as seen through the Meta 2.
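To make the anchoring idea concrete, here's a minimal sketch in plain TypeScript of how a world-anchored object stays put: the anchor is stored once in world coordinates, and each frame it is re-expressed relative to the current head pose reported by the tracking system. The simplified 2D pose (position plus yaw) and every name below are illustrative assumptions, not a real AR SDK's API – real headsets supply full 6-DoF poses from SLAM.

```typescript
// Hypothetical, simplified head pose: position on the floor plane plus yaw (radians).
interface Pose { x: number; z: number; yaw: number }

// A virtual object anchored at a fixed world position, like the shoe on the desk.
const shoeAnchor = { x: 1.0, z: 2.0 };

// Each frame, express the anchor in head-relative (view) coordinates.
// Because the anchor's world position never changes, the object appears
// fixed in the room no matter how the head moves.
function worldToView(anchor: { x: number; z: number }, head: Pose) {
  const dx = anchor.x - head.x;
  const dz = anchor.z - head.z;
  const cos = Math.cos(-head.yaw);
  const sin = Math.sin(-head.yaw);
  return { x: dx * cos - dz * sin, z: dx * sin + dz * cos };
}
```

The key design point is that the anchor lives in world space, not screen space: an app ported naively from a flat screen keeps its content in screen coordinates, which is exactly why unported UI appears to "follow" your head instead of staying on the desk.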
Optimizing apps for these factors will allow you to create truly immersive, holographic AR experiences. And as these different factors work together to produce 3D holograms, you can start to see how porting over an app like Excel into a headset is not nearly as straightforward as it would seem.
Using Excel again as our hypothetical application, you can picture that the graphs and charts we've all come to know will now be in 3D. But what if those data displays aren't anchored in the room and instead fly around every time we turn our heads? Or what if those graphs and charts look too small and blurry, and expanding them only leaves them blurrier? As you can imagine, things could quickly get complicated – especially when you factor in getting brightness and contrast right so that you can adjust graphs' and charts' transparency, or the need to ensure that your data visualizations are registered properly by the simultaneous localization and mapping (SLAM) system and don't drift whenever you move your head.
Designing for a Buttonless Interface
Another thing to consider as part of the porting process is the translation of flat gestures and interfaces, like pushing a button on a screen, onto a real-world environment. You can't exactly interact with things in 3D space in the same manner as you would with input devices like a mouse or keyboard. You have to match functionalities from the original app with a 3D buttonless interface.
Using Excel again as our application port example, let's examine a commonly used function: applying a filter to your data. On a PC or a Mac (on the Mac, only in Excel 2016), there are two ways of applying a filter: using a keyboard shortcut (Ctrl + Shift + L) or pressing the Filter button under the Home tab in the ribbon.
However, in a headset that relies only on gestures, how would users apply a filter to their 3D volumetric data – especially when a mouse and keyboard would detract from AR's immersiveness? Users could press a 3D button labeled "Filter" (much as they would press the Filter button in the Mac and PC versions of Excel), issue a voice command, or trigger the filter by gazing at the button. However, additional development work would be needed to ensure that the headset can track and register the pressing of the filter button in 3D space, or that the microphone can clearly understand users' commands.
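Routing those several input modalities onto one underlying action can be sketched in plain TypeScript. Everything below – the modality names, the "filter-button" target, and the keyword-spotting stand-in for a real speech recognizer – is a hypothetical illustration under our own naming, not any real SDK's API.

```typescript
// Hypothetical input modalities a buttonless AR interface might support.
type Modality = "button-press" | "voice" | "gaze-dwell";

interface InputEvent {
  modality: Modality;
  target?: string;      // e.g. the 3D button that was pressed or gazed at
  utterance?: string;   // recognized speech, if any
}

// Registry mapping action names (like "filter") to handlers.
const actions: Record<string, () => void> = {};
function registerAction(name: string, handler: () => void) {
  actions[name] = handler;
}

// Map any of the three modalities onto the same action, so the app behaves
// consistently however the user invokes the filter.
function dispatch(event: InputEvent): string | null {
  let action: string | null = null;
  if (event.modality === "voice" && event.utterance) {
    // Naive keyword spotting stands in for a real speech recognizer.
    if (event.utterance.toLowerCase().includes("filter")) action = "filter";
  } else if (event.target === "filter-button") {
    action = "filter"; // button press or gaze dwell on the filter button
  }
  if (action && actions[action]) actions[action]();
  return action;
}
```

The point of the indirection is that the ported app's original logic (the filter itself) stays untouched; only the thin dispatch layer knows that in AR a "click" might arrive as a hand press, a phrase, or a sustained gaze.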
Ideally, the translation of flat gestures into AR apps would also integrate haptic feedback in order to better ground the virtual object. But according to AR researchers Dieter Schmalstieg and Tobias Hollerer, applying haptic feedback to AR environments has been limited so far – especially since current haptic feedback systems are rather large and would detract from the sense of immersion.
A haptic feedback device, Geomagic Touch, used in conjunction with a visual AR system (Ulrich Eck and Christian Sandor)
As AR app development matures and AR design guidelines become standardized, porting apps from today's computers will undoubtedly become easier. And while many proverbial trails in AR app development have yet to be fully blazed (and could very well be difficult to explore), we're looking forward to making those discoveries with you. In the next installment of our AR Anti-patterns series, we'll take a closer look at how displays are visualized in AR. In the meantime, we invite you to weigh in by sharing your thoughts on Twitter or Facebook.
Want to learn more about AR app development? Check out past Innovation Friday posts:
• In this introductory Meta 2 tutorial, you'll learn how to create a cube that helps you become familiarized with the Unity Editor and the Meta 2 SDK.
• Whether you're curious about learning Unity or are looking to sharpen your skills & knowledge, these in-depth tutorials & resources will help you master Unity.
• Whether you're just starting to learn Unity or have been building apps in Unity for years, the following tips should help you pass the Certified Developer exam.