The Future of Productivity is Spatial: Introducing Meta's AR Workspace

Introducing the Meta Concept Operating Environment: Workspace

By Team Meta

At this year's Augmented World Expo (AWE) USA, our CEO and founder Meron Gribetz unveiled the Meta 2 DK's concept operating environment, Workspace, which won Best-in-Show and drew long lines of people waiting to try it out. For those of you who weren't able to try Workspace, or didn't have a chance to catch Meron unveiling it, check out the video below (many thanks to Ori Inbar, Tom Emrich, and the rest of the amazing AWE team for producing and sharing the numerous talks and workshops from the expo!). We've also transcribed Meron's presentation below the video, in case you'd prefer to read it.

From the "Inspire" Track at AWE 2017:


Transcript of Meron's Presentation

How's it going AWE? Having fun so far?


Well, we're going to have fun today!

So I'm Meron, from a company called Meta, and of course we build augmented reality headsets. I've been looking forward for a very, very long time to showing off what we're going to show off today.

Today we're going to open up a brand new interface paradigm - a brand new operating environment for augmented reality that we believe and hope will open up the next era of computing. It's that serious. And I hope you guys enjoy it. We wanted to give a teaser here at AWE first.

Everyone in this crowd has probably been on their own journey of realizing that augmented reality is the future - that computing isn't locked inside one of these flat devices, and that it's a really cool thing.

And I'd like to share with you today the journey of how Meta and I really came to figure out that AR is the future, and it all started with a Kickstarter. With that Kickstarter, we launched the Meta 1, which, for those of you who remember, was the very first AR headset on the market. This was years before everyone else, and we learned a ton. We learned so much from our customers, and I want to share everything that we learned in a very open way, and then I want to share the culmination of all of those learnings, which became the Meta 2.

We're going to demo for you today how we've approached photorealism - that's something I want to show off. So, here's a video of footage from behind the lens. I took my iPhone and put it behind the lens, so you can see the exact quality of the display.

What do we think?

This eyeball is the size of a human being - it's that immersive. So we're going to have fun with that, and of course, it all starts three years ago - I'm sorry, five years ago - with the Meta 1 Kickstarter. In Harlem, in a one-bedroom apartment at Columbia University, we hacked together this craziness. This is the Epson Moverio. When I say "hacked together", I mean it literally: we took a knife and cut through the Moverio, placing a depth sensor on top so that you could have 3D input and 3D output. This is the first real prototype that gave birth to this next chapter of computing.

Then we decided, "We're excited about this! Let's find out if anyone else is excited about this." So we decided to put out a Kickstarter campaign. First of all, getting animators to work with you on a zero-dollar budget is really tough. So my friend Ben flew in from Australia, and my friend Mike from New York, and they convinced a group of animators that this was Woz in the garage - that this was the next chapter of computing, except more trippy and with holograms. And somehow, they helped us out. That's how we got through the Kickstarter. Then came the next part of any Kickstarter - I don't know if you've tried running a Kickstarter. Can you raise your hand if you've ever tried? Okay, there's a couple of people in the crowd.

What's the hardest part? The endorsements - at least it was for me. So I decided to go after a North Star in our industry: Steve Feiner. He's here in the corner. Steve was one of the pioneers of augmented reality, and he was my professor and mentor at Columbia. Getting an endorsement from Steve was like going through a Jewish conversion process [laughs]: you have to get three "No's" before you get a "Yes", and kind of persevere. Except every "No" I got from Steve meant another semester - a full semester - of his user interfaces class that I would have to take, and get an "A" in, just to get a "No" again. And this happened four times. By the fourth "No", he broke and joined our Kickstarter - so thank you, Steve!

We put this online and we measured. We wanted to see if people were going to connect with this kind of product. I want us to picture this together today: we're sitting in a dorm room, in Harlem, all of us, and we're looking at a monitor of Kickstarter orders coming in. The second order comes from Arup, the largest construction company in the world. The third order - and I still have the records to prove this - comes from Steve Wozniak. I have no idea how he got in so quickly. And then Toyota places a bunch of orders, and Ford, and it's just mind-blowing for a group of students hacking stuff together in a dorm room. So that was the beginning of the company, and a memory I reflect on fondly.

From that point on, it was all about what we learned from our customers. First and foremost, our customers taught us that they were coming from the realm of productivity. I thought maybe this would be a 50-50 split between entertainment and productivity. In fact, 80% of our customers were in productivity, only 5% were in entertainment, and only 2% were from the gaming sphere. That taught us very early on that this was about productivity.

The next thing we learned, pretty profoundly, was about the hardware - what people wanted from a real AR experience. The first thing they wanted was immersion: a field of view (FOV) three to four times larger than what the Meta 1 or HoloLens could provide, because it's critical for productivity. The next thing they wanted was photorealism: when you're looking at a single-pixel line of CAD, or visualizing a model building or a human anatomy unit, you have to get that resolution.

The next thing was direct manipulation: the ability to reach out and grab the hologram the way your brain wants to was critical for our customers - as opposed to gesturing at a distance or using a controller or a mouse, they wanted to grab the holograms. The next one was collaboration: two people grabbing holograms and passing them from one to the other. And finally and critically, everything from a meter and a half and closer had to be crystal clear, because that's where productivity happens. The Meta 1's focal plane was at infinity, so we learned quickly what people wanted.

We hired a group of 130 hardware and software experts - no longer hacking prototypes together with knives. Now these folks are building light field displays and entire advanced optical engines, and it's been an insane journey to be a part of Meta's growth. We took our time, and really wanted to get this right, and we built a product that we think satisfies these requirements. Some of you may know these specs, but I'm going to show you the new ones.

The 90-degree field of view (FOV) you may know, but on resolution we hit 2.5K (2560 x 1440) - that's 21 pixels per degree, which means you can read Times New Roman 12-point text at 50 centimeters away on a holographic panel, and you can start on your Iron Man and Minority Report workflows. It's all enabled by that, plus really bright displays, so we're super proud of it. Direct manipulation: check. Natural collaboration: check. And of course, everything within two meters and closer is super clear. Taking the time to get it right and being patient yielded results.
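The "Times New Roman 12 at 50 centimeters" claim can be sanity-checked with a back-of-the-envelope calculation. A minimal sketch, assuming a 12 pt em square of about 4.2 mm (1 pt = 1/72 inch) and the 21 pixels-per-degree figure from the talk; the viewing geometry here is our assumption, not Meta's published method:

```python
import math

def angular_size_deg(height_mm: float, distance_mm: float) -> float:
    """Angle subtended by an object of a given height at a given distance."""
    return math.degrees(2 * math.atan(height_mm / (2 * distance_mm)))

# 12 pt type: 1 pt = 25.4/72 mm, so the em square is ~4.23 mm tall.
em_height_mm = 12 * 25.4 / 72
angle = angular_size_deg(em_height_mm, 500)  # text held at 50 cm
pixels_tall = angle * 21                     # at 21 pixels per degree

print(f"{angle:.2f} deg -> {pixels_tall:.1f} px per em")
```

At roughly ten pixels per em (and fewer for the capital letters alone), 12 pt text sits right around the threshold of legibility, which is consistent with the talk's framing that this resolution is what first makes holographic reading panels practical.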

Now people are saying the following things about us, and I think that if we had shipped it prematurely, it wouldn't have been this experience. We also have a pretty fun announcement: if you jump to the next slide, this is our manufacturing facility, where we're ramping up mass manufacturing. It's all local, here in the USA. Just quality, quality, quality. We're getting ready to fulfill everyone's orders over the course of the summer. That's our little announcement.

The most profound thing, I think, is actually not the hardware. This is something that we learned from our customers, but we also kind of intuited it. You know, when the iPhone came along, there were half a dozen other smartphones before it, spanning all the way back a decade earlier. But those smartphones didn't really take the time to get the user interface right. They were using plastic keyboards from the past; they were using interfaces that were kind of glued together and really weren't as thought out as the operating system from Apple.

And so we said, "Hey, we're going to take the time and really focus on that." So how did we do this? First of all, what did we not do? We didn't put an operating system from the past into the headset. We decided to rethink it from the ground up, so we hired a professor of neuroscience from Stanford, Stefano Baldassi. We also hired PhDs from that department, as well as from Berkeley and UCSF, and we locked ourselves in a room for two years.

We asked, "What's the most natural and intuitive AR experience that we can possibly provide, across every interaction technique you'd expect in AR?" And I'm happy to say that after these years of research, you can now download the first-ever design guidelines for augmented reality. Wrapped around them is an operating environment we call the Meta Workspace, which is why I'm here today. Out of the Meta Workspace came a series of inventions that all start from how the brain wants to process things. The user interface paradigm was what we obsessed about for all of these years. And the problem with the UIs of the past - well, let's not look at the problems. Let's just look at what happened in the past: we had our command line interfaces, and even before that, punch cards. Then we got the mouse, and then pinch to zoom, which lets us scale about one plane and became much more natural and intuitive than anything that had come before it.

The user interface paradigm we're opening up today is called Air Grab. And Air Grab is just what it sounds like: you reach out, grab the hologram, and manipulate it in X, Y, and Z. You can do a kind of pinch to zoom, stretch it along any axis with two hands, rotate it, and so on. We think Air Grab is going to be the way people do AR for the next who knows how long.
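The two-handed case boils down to comparing where the hands are now with where they were when the grab started. A hypothetical sketch of that idea - the function names and the uniform-scale / yaw decomposition are our illustration, not Meta SDK code:

```python
import math

Vec3 = tuple[float, float, float]

def two_hand_scale(left0: Vec3, right0: Vec3, left: Vec3, right: Vec3) -> float:
    """Uniform scale factor: ratio of the current hand separation to the
    separation at the moment both hands closed around the hologram."""
    return math.dist(left, right) / math.dist(left0, right0)

def yaw_deg(left: Vec3, right: Vec3) -> float:
    """Heading of the left-to-right hand vector in the horizontal (X-Z) plane."""
    return math.degrees(math.atan2(right[2] - left[2], right[0] - left[0]))

# Hands start 20 cm apart, stretch to 30 cm: the hologram grows 1.5x.
scale = two_hand_scale((0, 0, 0), (0.2, 0, 0), (0, 0, 0), (0.3, 0, 0))

# Swing the right hand from straight ahead to the side: ~90 degrees of rotation.
rotation = yaw_deg((0, 0, 0), (0.0, 0, 0.3)) - yaw_deg((0, 0, 0), (0.3, 0, 0))

print(scale, rotation)  # 1.5 90.0
```

Applying the scale and yaw to the hologram's transform each frame gives the stretch-and-rotate behavior described above; a real implementation would also smooth the hand tracking and clamp extreme values.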

So we took these kinds of ideas, patented the heck out of them to protect them from the big guys, and opened them up for developers and for people in this room to experiment with. And without further ado, I'd like to just show it to you, because it's more fun that way.

Let's look at Air Grab. All right, we're going to switch to the feed. I'm super excited! Look at this mama. Let's start with an eye-popper: this was that eyeball we saw at the beginning, and there's nothing else to say except that it's so photorealistic. The quality of the hologram behind me is actually lower than what you'd experience in the headset, and I invite everyone to try this because it's just beautiful. So how about it? You like it?

So, Air Grab. It's real simple. Your hand can occlude the hologram, and the hologram can occlude your hand, to give you depth perception. When you touch the hologram, you make a fist. Those two concentric circles become one, and now it's a glowing dot indicating that you can move the hologram around - and that's pretty much it. That's how the brain wants to do it, and that's how we're giving it to the user.
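The touch-then-fist flow described here can be pictured as a tiny state machine. This is a hypothetical sketch of how we read the demo - the state names and visual cues in the comments are our interpretation, not Meta's implementation:

```python
from enum import Enum, auto

class GrabState(Enum):
    IDLE = auto()      # hand away from the hologram; two concentric circles
    TOUCHING = auto()  # open hand intersects the hologram
    GRABBED = auto()   # fist closed on it; circles merge into one glowing dot

def step(state: GrabState, touching: bool, fist: bool) -> GrabState:
    """Advance the grab state from the current per-frame hand pose."""
    if state is GrabState.GRABBED:
        # While the fist stays closed, the hologram follows the hand;
        # opening the hand releases it.
        return GrabState.GRABBED if fist else GrabState.IDLE
    if touching and fist:
        return GrabState.GRABBED
    return GrabState.TOUCHING if touching else GrabState.IDLE

# Reach in, close a fist, move away while holding, then let go.
s = GrabState.IDLE
s = step(s, touching=True, fist=False)   # TOUCHING
s = step(s, touching=True, fist=True)    # GRABBED
s = step(s, touching=False, fist=True)   # still GRABBED while the fist is held
s = step(s, touching=False, fist=False)  # back to IDLE
print(s)
```

The key design point the talk emphasizes survives even in this toy version: there is exactly one gesture (close a fist on the thing you can see and touch), so the whole interaction fits in three states.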

You can do the same thing with two hands by stretching and rotating. I get goosebumps from playing with this kind of technology, because it's been a dream for a long, long time. So that's Air Grab - give a round of applause to our team that worked really hard on this demo! But the bigger message I came to show off today is the Meta Workspace: the very first augmented reality operating environment built from the ground up for this medium.

And this is the hardest part - building light field optics is way easier - so I hope you enjoy this next part. I'm going to Air Grab the eyeball and place it in this thing. That's our Mac Launcher - basically our iOS home screen. It's real simple. And because it's photorealistic, there are a couple of things you might see here: 2D panels, so you can browse the Web spatially - there's a spatial Web browser that comes with this. There's a Windows desktop duplicator, so you can do anything you would do on your Windows machine. And you have 3D models for a myriad of purposes. Let's just get started.

Because of this crazy resolution, you can actually read. I don't use a monitor anymore - for the last month I've just been using the Meta Workspace. It's fully functional. You can go into an email and check it. Here's something I might do at my own workstation: I'll rotate a panel using Air Grab and then place it up and off to the side. You can now spatialize your thoughts, which is a fun little thing to try.

How about a little bit of distraction? Yeah! I get a clap for that. Zuckerberg would be proud. So you can go into the photos. It's like a physical thing - it reacts to your body viscerally and physically; sometimes you can just hit the side of the panel. Now let's look at my office. This is the way I actually do work these days - see that intense gaze? That's me looking at holograms. This is the future of the PC era, where I can take anything I do on the PC with me. So what do we think about that first stage?


But we're all here for the future, for the next step, right? The next step starts with 3D models, so let's go into that. All right. This model is particularly mind-blowing to me: this is the Glass Brain by Professor Adam Gazzaley. It's a real MRI scan, and when I spatialize my thoughts and my desktop, I like to put this guy up above, to remind me of what it's all about and where this is all going.

Here's a model from our partner BioDigital. By the way, that first stunning eyeball comes from Intervoke and SketchFab, who are our partners on the Workspace. Now I'd like to show you a model of the body by our friends at BioDigital. It reminds us that action comes through the nervous system and the vascular system. I might have a model like this up on a typical day when I'm trying to get inspired - stick your head inside it, why not?

Computing has become as easy as Legos. It's a lot of fun.

To be honest with you, it's not so much about the 3D models, as cool as they are, or about the panels. It's about something much bigger than that. The paradigm will rest on the tools that are built for this operating environment, so I want to share with you some remarkable tools built by our developers. Let's go in.

First tool: from Bryan of Coding Leap. He designed this mind mapping tool, and this is something I've wanted to do for a very, very long time - just mind map in AR. Here's the Meta 2, where this all culminates. For me as CEO, it's a trade-off between all of these hardware and software features, and being able to spatialize your thoughts is not something you can do in any other medium. It's just a beautiful, beautiful experience to try.

Here's another tool, designed by our very own Mike Stein and his team at Meta in a two-day hackathon: with this button on my headset, I can take a picture of anything in the Workspace. See that? It's a little recursive - it's a little Meta. This is going to make sure you're never anxious about erasing a whiteboard again, and that has profound implications on pretty much everything we do in our day-to-day work.

Here's another tool I really like: this one was designed by Tim's team in our neuroscience group, and it plots the success vs. failure rate of Air Grab, showing an accuracy - and error rate - of about one centimeter or less. When I spatialize this plot and look around inside it, I can notice that there's a little more error in Z than I'd want, even though the cluster looks really nice in X and Y. So we can design affordances that make it even better in depth.

Now all work and no play makes a cranky CEO, so let's do some fun stuff. Audience participant! Who wants to come and play?


Come! Come come come! Quick! What's your name?


Audience member: Valarie.

Meron: Valarie, put your hand on mine, like that. Good! Now we're going to go into this, and you're going to move your hand around. This is the very first holographic instrument with direct manipulation. You can move your hand around - and look at the screen! Guys, a round of applause for Valarie!


See that? I took a picture of you! I'll send it to you later. Thanks so much! You're a rock star. Guys, this is going to change the way we play instruments and I'm just so inspired.

The very last thing I'm going to show you is this: as I reflect back on the memories that came out of Meta, here's the first group, in our three-bedroom house - it's been an amazing experience. Imagine if I could take this photo and pull it out of my phone.

Maybe I want to take another photo out here: this is the cathedral - the first 50 employees. Let's see if that will work. Look at that! This is real, guys. This isn't hacked; this is just working. And this is a very important picture, gang, that I want to add to my Workspace. This is Sofi, my pup - always a crowd-pleaser. You can do more cool stuff with this. Here's a sticky note app: I can draw a sticky. I like to be productive with sticky notes - so what if I could do that?


Or this.

Or this.


Or, my friends, this is what got us all started: this is our Kickstarter and it makes me a little emotional. Here goes:


Wait for it! Steve Feiner endorsement! We made it!

Well guys, it's been a pretty cool journey.


Thanks for your time.
