
Scene by Scene Feasibility Breakdown of Facebook's Metaverse at Work Video

by Geek on record, November 16th, 2021

Too Long; Didn't Read

Mark Zuckerberg introduced the metaverse in Meta's presentation video. The video shows a man entering the office in his house, putting on a pair of natural-looking glasses and suddenly being immersed in a corporate environment. In that short movie, we can see a set of translucent windows, bookshelves, a desk and a chair, but there are no cameras or sensors in sight, just a room that is void of any technology. This is an important detail, because the lack of any tech that could aid in an AR environment is where the problems start.



Facebook is now Meta, and the Facebook app is now just one piece of the company's metaverse push. Mark Zuckerberg introduced a rebranding of the parent company that owns properties like Instagram, WhatsApp and Oculus, and at the same time introduced the world to its vision of the future: the metaverse.


Regardless of whether or not you saw Meta’s presentation video, you might be wondering what the metaverse actually is. Is it a mix of virtual reality (VR) and augmented reality (AR) devices and apps? Is it a development platform where content creators can build VR/AR experiences? Is it an actual virtual place where users can go online to meet other like-minded people?


It seems like the answer is deliberately vague, and perhaps it includes all of the above. Zuckerberg invited large and small companies to “participate” in the metaverse, as if it were the new internet, and as if Meta were the usher of a new AR/VR era based on open standards.


I have no doubt that AR is in fact the future of personal computing; a future where regular glasses will give us access to a desktop-like experience on the go. However, does that mean that every company working on those devices will do it on Meta’s platform? Absolutely not. Big players will continue competing with each other to become the de facto marketplace for AR/VR content. This means, for example, that content creators will have to decide if their AR game is released on Meta’s Oculus store and/or on a (hypothetical) Apple Glass App Store.


Of course, Zuckerberg was setting a new vision for his whole company, so it’s understandable that he shared a bold set of aspirations. He needs to inspire a whole generation and make it move on from the company’s recent privacy and democracy-bending issues. And what better way to inspire people than with cinematic, futuristic videos?


Among all of the videos used to introduce the concept of the metaverse in areas like entertainment, gaming, fitness, education and more, there was one specific area that struck me as overly unrealistic: using the metaverse to “work better”. In that short movie, we can see a man entering the office in his house, putting on a pair of natural-looking glasses and suddenly being immersed in a corporate environment.



Today, I want to dissect that vision, go deeper into what it’s trying to convey and offer a realistic perspective on how that future could be accomplished. Unfortunately, through this dissection, you will realize that Meta left out or glossed over several aspects just to make the video look better and simpler.


Let’s start by analyzing the physical space where the subject of the video (we will call him John from here on) experiences this “better” work environment.


John’s work space looks like a regular home office (albeit rather luxurious), with windows, bookshelves, a desk and a chair. Note that there are no cameras or sensors in sight, just a room that is very explicitly shown as void of any technology, other than the glasses at the center of the work desk. This is an important detail, because the lack of any tech that could aid in an AR environment is where the problems start.


But let’s continue with John’s work day. He sits at the desk while drinking a cup of coffee and puts on his glasses, adding to his field of vision a bunch of virtual elements that transport him to the corporate world. What’s striking is that it’s not only a set of translucent windows that appears in front of him, but also coworkers in digital avatar form, walking around and even sitting at their own desks.


Here’s where the first questions come up for me. What does it mean for John to see these digital coworkers in transit? I mean, what are the actual people behind those avatars doing in the real world? If their avatars are shown walking, are they actually walking in real life? If so, are they at the office or also at their homes? This seems trivial, but it’s a critical piece of the puzzle that creates the virtual reality that John sees.


None of the possible answers are satisfying in my opinion. If these coworkers are actually walking, it means that they are being tracked by cameras or sensors and that their destination must match the virtual work space around John. It’s complicated even to explain, and it has a bunch of privacy implications (e.g. can your boss, from his own home office, see that you are not at your desk?).


Another possibility is that these are not actual coworkers and just a “live background” option that John has selected for his AR office setup. However, the following few seconds of the video disprove that theory: we see one of these digital avatars walking by, smiling and waving at John.


And John smiles back at her, raising his coffee in greeting. This confirms that the digital avatar and John are seeing each other as she walks around the office or (the creepier option) her house. But what did she actually see? Did she see John sitting at his work desk? Did she see John in digital avatar form? Did she choose to be seen walking by John? So many questions, all with big privacy implications.


The least problematic explanation for this is that John is also a digital avatar from everyone else’s perspective, and that a coworker chose to “ping” John to say hi over a chatting app, which the app then translated into her digital avatar walking by. Meaning, John’s coworker was not being tracked walking anywhere. Nonetheless, the privacy risks are still present; even if the explanation above were indeed real, the video makes us believe that John’s reaction and facial expression are being captured and used in response to his coworker’s ping. One can only hope that all of this will be customizable, and that the metaverse won’t become a gateway into an Orwellian corporate future.
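
To make that explanation concrete, here is a minimal sketch, in Python, of how a chat “ping” could be translated into a purely synthetic walk-by animation. Everything here is hypothetical (the event shape, the gesture names, the client that would consume it); it is only meant to show that the interaction can work without tracking the sender’s real-world movement at all.

```python
from dataclasses import dataclass
from enum import Enum, auto


class AvatarGesture(Enum):
    WALK_BY_AND_WAVE = auto()
    SIT_AT_DESK = auto()


@dataclass
class Ping:
    """A lightweight 'say hi' event sent from a chat client; it carries no location data."""
    sender_id: str
    recipient_id: str


def ping_to_scene_event(ping: Ping) -> dict:
    """Translate a chat ping into a purely synthetic avatar animation.

    The sender is never tracked walking anywhere; the walk-by is just an
    animation the recipient's AR client plays on the sender's avatar.
    """
    return {
        "avatar_id": ping.sender_id,
        "gesture": AvatarGesture.WALK_BY_AND_WAVE,
        "target": ping.recipient_id,        # the avatar waves toward the recipient's viewpoint
        "uses_real_world_tracking": False,  # the key privacy property of this interpretation
    }


if __name__ == "__main__":
    event = ping_to_scene_event(Ping(sender_id="coworker-42", recipient_id="john"))
    print(event)
```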


As the video continues, John decides to sync with a coworker (let’s call her Stacy) who happens to be outdoors, in the middle of what looks like a park. I believe it’s fair to assume that there are no hidden cameras or sensors tracking Stacy, so the following scenario is even more baffling than the previous one.


As John and Stacy join their debrief meeting, we see them both standing. Stacy shows up as an ethereal or holographic presence that covers her whole body and includes an accurate representation of her current clothes. This implies that the glasses she is wearing are enough to scan her whole body, even her back! Notice how her ponytail is also present in her holographic form.


This is of course not only impossible today; it would require a practically miraculous use of technology. In other words, at this point the video is in full science-fiction mode and your suspension of disbelief is pretty much required.


Wait, so does that mean that we won’t ever have holograms like that? We can and we will, but a full body real-time representation like the one above will definitely require more than a pair of glasses. A more realistic version of this future tech would involve the user wearing a set of bracelets that track their arms and hands through a combination of relative location tracking and electrical impulse sensors. These bracelets will connect with the glasses (or the communication device in the user’s pocket) to transmit a hybrid hologram, where the facial expressions and the arms are read and tracked in real time, visually connected to a simulation of the rest of the body, providing realism to the whole avatar.
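
As a rough illustration of that hybrid approach, here is a small Python sketch that merges a handful of sensor-tracked joints (head and wrists, the parts the glasses and bracelets could plausibly see) with a simulated pose for the rest of the body. The joint names and data shapes are made up for illustration, not taken from any real SDK.

```python
from dataclasses import dataclass


@dataclass
class JointPose:
    """Position of one avatar joint, plus whether it came from a real sensor."""
    x: float
    y: float
    z: float
    tracked: bool


# Joints the glasses plus wrist bracelets could plausibly track in real time (hypothetical set).
TRACKED_JOINTS = {"head", "left_wrist", "right_wrist"}


def compose_hybrid_avatar(sensor_poses, simulated_poses):
    """Merge real sensor readings with a simulated body into one avatar frame.

    Start from the simulation for every joint, then overwrite with sensor data
    wherever it exists; the simulation fills in the legs, torso and everything
    else that a pair of glasses and two bracelets could never see.
    """
    frame = dict(simulated_poses)  # simulated baseline for the whole body
    frame.update(sensor_poses)     # real data wins where it is available
    return frame


if __name__ == "__main__":
    body_joints = ("head", "torso", "hips", "left_wrist", "right_wrist", "left_knee", "right_knee")
    simulated = {j: JointPose(0.0, 0.0, 0.0, tracked=False) for j in body_joints}
    sensed = {j: JointPose(0.1, 1.6, 0.0, tracked=True) for j in TRACKED_JOINTS}
    frame = compose_hybrid_avatar(sensed, simulated)
    print(sum(p.tracked for p in frame.values()), "of", len(frame), "joints come from sensors")  # 3 of 7
```

The design choice that matters here is that sensor data always overrides the simulation, so the parts viewers focus on (face and hands) stay truthful while the legs and torso are plausible filler.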


In fact, Meta showed in a different part of the metaverse presentation that they are working on wrist wearable technology that will be able to detect finger and hand movements. That’s the foundational block of the futuristic bracelets that I described above. Regardless, Stacy’s scene clearly shows her bare wrists in the first few seconds of her appearance; so again, that scene can be considered a nice sci-fi clip, nothing more.


Finishing John’s journey, the video shows him walking into a group meeting in the metaverse. This turns out to be one of the most revealing parts of the video, since it shows people joining the metaverse in three different forms: a hologram with a real-time full-body representation, full-body digital avatars, and traditional webcam users.


John’s hologram (highlighted in blue in the screenshot above) has already been dissected earlier as an impossible form unless he is wearing those futuristic bracelets, plus ankle trackers, since we can see his legs moving as he walks.


The digital avatars (highlighted in black above) are closer to today’s understanding of VR; since it’s difficult to imagine why anyone would wear a VR headset when they could wear fancy lightweight glasses like John’s, I’ll assume that they are actually wearing the exact same tech as him, but simply chose to fully conceal their face and body with the digital avatar. That’s the best way to explain how the avatars are waving and turning their heads when John enters the virtual space.


A simpler and more disappointing explanation for this second type of user doesn’t require any hardware. Instead, the app being used for this meeting could have a function of “greeting” a user, which gets translated as the avatar directing a smile and hand wave to the user that just joined. Similar to how today’s video conferencing apps can highlight whoever is speaking, the corresponding avatars could move their mouths when sound is detected from a user. This would enable a virtual presence using today’s technology, but I’d anticipate a lackluster response if this was ever launched, given how unrealistic everything would feel.
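
To show how little technology that second option actually needs, here is a toy sketch of a voice-activity check that maps microphone energy to a “mouth open” value an avatar renderer could animate. The threshold and function names are hypothetical; a real app would use a proper voice-activity detector.

```python
import numpy as np

# Hypothetical tuning value; real conferencing apps use far more robust speech detection.
SPEECH_RMS_THRESHOLD = 0.02


def is_speaking(audio_frame: np.ndarray, threshold: float = SPEECH_RMS_THRESHOLD) -> bool:
    """Crude voice-activity check on one frame of microphone samples (floats in [-1, 1])."""
    rms = float(np.sqrt(np.mean(np.square(audio_frame))))
    return rms > threshold


def mouth_open_amount(audio_frame: np.ndarray) -> float:
    """Map audio energy to a 0..1 'mouth open' value the avatar renderer can animate."""
    rms = float(np.sqrt(np.mean(np.square(audio_frame))))
    return min(1.0, rms / (SPEECH_RMS_THRESHOLD * 5))


if __name__ == "__main__":
    # Simulated 20 ms frames at 16 kHz: silence vs. a quiet sine tone.
    silence = np.zeros(320)
    tone = 0.1 * np.sin(np.linspace(0.0, 40.0 * np.pi, 320))
    print(is_speaking(silence), round(mouth_open_amount(silence), 2))  # False 0.0
    print(is_speaking(tone), round(mouth_open_amount(tone), 2))        # True ~0.71
```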


The third type of user in this virtual meeting (highlighted in yellow above) is nothing new for most people who had to endure virtual meetings during the coronavirus pandemic: a flat view of someone else brought to you via webcam. It’s telling that Meta chose to show people using this (by then outdated) tech to communicate, since it brings me to the metapoint of this article.


The metaverse will be truly helpful when it doesn’t require complex setups with tracking cameras and sensors. Technology will likely get there. But will we want to use it? A virtual representation of one’s body without that tracking will fall flat and feel unrealistic. Will we have to choose between an uncanny future and our own privacy, again?


The biggest problem with Meta’s vision for the metaverse is the user’s privacy and security, which ironically have been Facebook’s biggest challenges to date. In a world of digital avatars and holograms representing our persona, having strong security systems in place will be critical to avoid, for example, someone stealing our virtual identity and presence.


I expect convenience to continue being used as a bargaining chip, and privacy as currency for years to come. The metaverse is poised to exacerbate the privacy and security challenges that we are having today in the digital world, so the question is: will you buy into it?





Images via Meta/Facebook
