Explaining the Science Behind Augmented Reality

Part 1: The Natural Machine

07 JUL 2016
BY Stefano Baldassi


“The time has come for devices that keep us in the moment rather than obscuring it. Natural machines, built around the senses we already have, using neuroscience.”
Meron Gribetz, CEO of Meta, at TED 2016


Gribetz made some bold assertions in his TED 2016 talk as he painted his vision for the natural machine. Not only did he announce that within a year everyone at Meta will have tossed their desktop screens in favor of natural machines, but he also stated his belief that natural machines can begin to replace traditional computers in as little as three years. As a neuroscientist and Meta’s director of user research, I find this unbelievably exciting; it represents the crux of our work and mission at Meta.


So what is a “natural machine,” and why does it have the potential to reshape how we work and connect with each other and the world around us? A natural machine is essentially a computer that fits a person’s brain and body the way a glove fits a hand. It enables people to interact with content, and with each other, in ways that are totally natural: you see something, grab it, manipulate it, hand it to someone else. Sounds simple, right? But think about how we currently use technology. It’s anything but natural. We use a keyboard, mouse, or trackball to manipulate data and content on a screen, from our tiny phones to large desktop monitors. But no matter how large the screen, size can’t overcome the fundamental flaw: our brains just weren’t meant to engage that way. The screens create obstacles to connection rather than facilitating it.


Natural machines provide a fully transparent display that reveals the real world alongside digital content in a beautiful blend of the digital and physical worlds. The person sees both and can engage naturally with digital and physical objects together. Picture picking up a digital coffee cup and placing it on a physical table, for example. Meta delivers this natural computing experience via see-through headsets that visualize content in 3D – it’s see-through augmented reality. This is very distinct from virtual reality, in which the physical world is completely obscured and the only engagement is with the 3D content. The Meta headset is equipped with an array of sensors that provide information about the orientation of the headset in space and an accurate representation of the environment around the person wearing the headset.


In addition to the headset, the other critical feature of Meta’s natural computer is that it’s hand-controlled. That’s the “see it, grab it, manipulate it” part. No input device required, just your hands – the most natural tool for the brain to use. Natural computing represents a return to the most basic and productive pairing for humans – brain and hand.


When the kind of content we’d normally interact with through a screen is brought into the real world, it instantly becomes more personal, relatable, and actionable. We call it augmented reality, but it’s really a much bigger idea: making natural machines that truly feel like extensions of ourselves and that create deeper understanding, freer expression, and optimal productivity.


Want to learn more about The Science Behind AR? Read the second part in the series:

Part 2: How Neuroscience Informs AR Design

In this installment, Meta Chief Neuroscientist Stefano Baldassi explains how neuroscience informs AR design.
