
Welcome to Lowpass! This week: Meta’s XR calling service, and an interview with Matthew Ball.

This week’s Lowpass newsletter is free for all subscribers; next week’s lead story will only go out to paying members. Upgrade now to not miss it.

Scoop: Meta is building an XR calling service

Meta is working on an XR calling service that incorporates the company’s photorealistic Codec Avatars, according to a series of recent job listings. Intriguingly, one of the roles that Meta is looking to fill for this service is that of an iOS developer.

Meta has been working on Codec Avatars for a few years now, with the goal of ultimately “making it as natural and effortless to interact with people in virtual reality as it is with someone right in front of you,” as a 2019 blog post put it. While that work had long been research-dominated, the newly-surfaced job listings suggest that the company is getting closer to testing an immersive telepresence service with Codec Avatars among its own employees.

A job listing for a design prototyper states that the hire will be working “closely with engineers, scientists, and research product managers to build an internal XR calling service.” “You will be responsible for the highest level of polish, creativity, and interaction that allows the team to activate users and collect feedback that informs and empowers researchers and cross-functional partners toward the next generation of Codec Avatars,” the job listing continues.

A Meta spokesperson declined to comment.

One of the issues that has held Codec Avatars back in the past was the complexity of scanning people to create their avatars. When Mark Zuckerberg recorded an interview with Lex Fridman using Codec Avatars last year, he admitted that the scanning process alone had taken hours. What’s more, Meta has built a massive 3D capture rig that ingests 180 gigabytes of data per second to create Codec Avatars.

More recently, Meta has been working on using mobile devices to scan what it calls “Instant Codec Avatars” – and that’s likely where the iOS developer the company is now looking to hire comes in. That developer will help “build and scale an internal XR calling service,” according to the job listing, which adds: “We are looking for a developer with experience in user interfaces, infrastructure, and/or tools supporting applications on the iPhone or iPad using the iOS SDK.”

While the job listing doesn’t specify how Meta intends to use iOS devices, it is likely that the company is looking to build a scanning app powered by the Lidar sensors present in high-end iPhones and iPads.

Meta executives have said in the past that it may still take years for the company to bring Codec Avatars to its VR headsets. However, VR scoop hound Luna discovered this month that a recent Quest headset update already includes code to support Codec Avatars-powered video calls. I’d expect that the company will update us on its work in this space during September’s Meta Connect conference.

SPONSORED

Volu.dev, your spatial development toolkit

Building for WebXR is exhausting: local server, SSL certificates, IP address, port forwarding… Plus, headsets don't even have debug tools 😑

Meet Volu.dev, your spatial development toolkit.

🌈 Easily connect to your headset. Local-only, peer-to-peer connection, your code never leaves your network.

🧰 Inspect, debug, and tweak code, directly from your headset.

⚙️ Support for three.js, AFrame, and MRjs.

🎈 Free and account-less.

Matthew Ball: The metaverse isn’t about just one device or service

Former Amazon Studios exec turned VC Matthew Ball has been a metaverse evangelist ever since first writing about the subject in 2019. Ball published a seminal book about the metaverse in 2022, and released an updated and revised version titled The Metaverse: Building the Spatial Internet this week. 

I recently caught up with Ball to ask him about the lessons he learned writing about the metaverse, how his thinking about VR headsets has changed, and how we’ll actually know that the metaverse has arrived.

Ever since you published your Fortnite essay in 2019, you’ve become known as “the guy” to explain the metaverse. Has that job gotten easier, or harder?

It has become both. It has become easier because I have a deeper and richer understanding of the term. I also have a better understanding of which analogies are most effective and how to explain the metaverse effectively. 

At the same time, the [metaverse] mania of late 2021 and 2022 means it’s now necessary to overcome preconceptions and misconceptions. When I started writing about the space, the typical response was: What is the metaverse? Now, it’s more common for people to [say]: Isn’t the metaverse just VR, or isn’t crypto the metaverse? That’s a different challenge.

How have your own thoughts on the metaverse evolved over the past few years?

Where my thinking has evolved most of all is around what it takes to actually build it. Technical, societal, from a regulatory perspective. It relates to the laws of physics, the impracticality of certain hardware problems today. It spans standards, networking protocols, as well as head-mounted displays, and so on.

Let’s talk about headsets. You seemed to be fairly skeptical of VR hardware in the past. In the new version of your book, you do go into a lot of technical challenges, but also acknowledge that they will be a major part of the metaverse. How has the Quest, or the Apple Vision Pro, impacted your thinking on that subject?

First and foremost, I maintain that head-mounted displays are not required for the metaverse. There are hundreds of millions of people who are fluent in navigating 3D spaces using existing interfaces – WASD on a keyboard, or a touch screen. But that’s not the majority of people, in much the same way that the iPhone massively expanded who used smartphones in a way that no advanced Blackberry or Nokia device was likely to.

That’s one point, just thinking about participation. The other is about what you can do in these spaces. [With devices like] the Quest Pro, or the Vision Pro, with a suite of external tracking cameras and sensors, you have essentially unlimited input. It’s not a question of hitting A or B, or doing a combo of buttons. You can precisely map your actions in the real world. The twisting of your wrist. Three fingers versus four. These movements all translate into virtual space with near exact precision. That’s not just changing who can engage in 3D spaces, it fundamentally changes what you can do in those 3D spaces.

One aspect I found fascinating about the new book is your writing about metaverse interoperability, and what it means beyond the notion of visiting multiple worlds with the same avatar.

One of the unfortunate things about metaverse dialogue, and partly as a result of Ready Player One, is this overwhelming focus on interoperability and asset portability around avatars. And then most people naturally say: How much value is there in taking your Peely outfit from Fortnite into Call of Duty?

Most people who are focused on building the metaverse consider that a relatively unimportant matter. They are a lot more focused on the portability of data overall. One of the interesting things about the internet is it's actually so interoperable, including at the file format level, because most of those file formats predate the emergence of large tech companies. Facebook, as an example, didn't really have a practical option to deny JPEGs and PNGs and GIFs for their social network, just like Apple had to embrace MP3s to launch the iPod.

In this instance, the problem is actually quite the reverse. Most of the 3D formats were privately produced, and they have been customized for proprietary software. Therefore, standardization is a very different problem.

You’ve long argued that a true metaverse doesn’t exist yet. In your new book, you forecast that it will arrive before the end of the decade. How will we know when it’s here?

The question of when something has arrived is inherently flawed. The smartphone era began in 1991 with the launch of the first wireless digital network. Many people think it actually began in 2007 or 2008 with the launch of the iPhone, Android, and the App Store.

For the average American, the mobile era didn't actually start until 2014. That's when half of America had a smartphone. It wasn't until the first year of the pandemic that half of the world had a smartphone. And of course, it took time until many companies became mobile first, or even had some semblance of revenue that reflected that.

When I say that [the metaverse] has begun by the end of the decade, it’s not that there’s some decisive product to watch. Maybe the Vision Pro 2 will be a catalyst moment, maybe not. It’s instead that you will have hundreds of millions of people, or a billion plus, who realize that an increasing share of their leisure and their work is taking place in 3D networks.

So it’s ultimately less about one set of technologies than the ubiquity of a social practice?

Yeah, I would say so. Mobile began because it was not just a collection of use cases. Not just [a handful of] applications. It was that most of our functions, our socializing, our time was spent on that device, and through that device. For some people, that’s already the case [with 3D spaces]. But for many more people, it’s not. For some, it will never be.

What else

Redbox employees file billion-dollar lawsuit. Employees of the DVD kiosk chain allege that Chicken Soup for the Soul Entertainment and its CEO engaged in a “ponzi scheme” to hide alleged wage theft.

Apple tries to rein in streaming budgets. After spending more than $20 billion on Apple TV+ content, Apple is looking to control costs.

Netflix hires Epic Games exec to lead its gaming efforts. Alain Tascan is joining Netflix as its new president of games.

HTC teases new VR headset. The company said in a teaser video that the new device will “change the game for good,” suggesting it is getting ready to introduce a new consumer device.

Is FAST in trouble? A new study suggests that the ad spend market share declined by 64% on average in Q2.

Amazon Prime Video now has a tab for … Prime Video. Amazon’s video service had been among the worst offenders when it came to mixing content from a variety of sources, including promotions for paid services. Now, it’s easier to find the stuff that’s actually available to subscribers.

Meta is bringing its AI Assistant to the Quest. The company’s Quest 3 VR headset will soon offer object recognition for its passthrough video feed.

Comcast lost 419,000 pay TV subscribers in Q2. Somehow, that’s an improvement.

Spotify now has 246 million paying subscribers. The music streamer has plans to introduce a high-end tier in the future.

Google’s next Chromecast is a streaming box. Well, I didn’t see this coming: Google is getting ready to release a new streaming device, and it’s not a dongle.

That’s it

Some enterprising modders have built a Nintendo Wii clone that’s the size of a key chain. It’s the smallest Wii ever! But it also doesn’t work with Wii controllers, won’t run Wii Sports, and doesn’t support the Wii sensor bar. Not to badmouth someone’s otherwise impressive modding project but … that’s not a Wii?

Thanks for reading, have a great weekend!

And many thanks to Volu.dev for sponsoring this week’s newsletter.

(Image courtesy of Meta.)
