Hi there! My name is Janko Roettgers, and this is Lowpass. This week, I interviewed James Cameron and Meta CTO Andrew Bosworth. Also: I talked to the CEO of Luma AI, and Snap is bringing WebXR to its Spectacles.
My interview with James Cameron and Meta CTO Andrew Bosworth
In December, Meta announced a multiyear partnership with James Cameron’s Lightstorm Vision to bring 3D entertainment to Meta’s Quest headsets. This week, Cameron joined Meta CTO Andrew Bosworth on stage during the company’s Meta Connect conference to share a first result of that partnership: Quest owners are able to watch an exclusive preview clip for Cameron’s upcoming Avatar 3 movie via the headset’s new Horizon TV app.
I had a chance to sit down with the duo ahead of their keynote appearance to talk about the potential that mixed reality headsets represent for 3D video, the complicated history of 3D TV, and Cameron’s attitude toward generative AI.
This interview has been edited for length and clarity.
Why did you partner with Meta on 3D entertainment?
James Cameron: It just seems like a natural convergence. I had been proselytizing about stereoscopic media and entertainment for 25 years. It kind of went dormant for a while, because cinema was the only place to really see it. It had a brief life on flat panel TV, but the devices never really worked that well.
But in mixed reality headsets, you're innately a stereoscopic viewer. [When I saw] mixed reality headsets, it occurred to me: It's time to bring back the capability that I had spent 15, 16 years developing. So I stood up a new company, Lightstorm Vision, around stereoscopic production. At the same time, the Meta content team was looking for a partner in stereoscopic production [to] break into entertainment, meet the studios and filmmakers.
Andrew Bosworth: [We] were looking for each other without realizing it. We kind of pitched each other. It continues to be a tremendous partnership for us. [James isn’t just a great storyteller,] but also an innovator who can tell us: Here are the ways in which the thing that you're building is not meeting the needs of storytellers. To have somebody who is both an expert and a critic is of extremely high value to us.
A lot of VR storytelling has focused on putting viewers into the middle of the movie, making things more interactive. It seems like you're more about 3D stereoscopic, framed lean-back entertainment?
Cameron: You’re right. What I've spent a career doing is telling stories in a linear narrative format. Sometimes, those are documentary stories. Sometimes, those are completely fictional stories. But it's in a rectangle.
Everybody was quick to discount the rectangle. But what that rectangle does is it directs the eye. Avatar movies are an example of what I like to do, which is give you a lot of things to enjoy within a frame. But the frame is the frame. The frame is telling the story.
We have this hundred-plus-year cinematic vocabulary [that] maps to the way the mind writes memory. That's why cinematic vocabulary is identical in China, India, Japan, the Americas, Europe. We all think through that rectangular window the same way, because it's how the brain works.
Bosworth: The timing of this is [key]. Why didn't we embrace this hundred-year vocabulary earlier? Partly, we just didn't have the displays for it. You had a TV. You had a phone. Why would you watch a movie in the headset when the resolution was not as good?
What’s different now is that we have the resolution. We have brightness actually in excess of what people are seeing in these other environments. We have a refresh rate that’s really high. We're at the point with the headsets that we have in the market today, not just ours, where we can do what the TV does, and also immersive media. I think there is probably room for both.
Cameron: I think episodic television is the big overlooked thing here. Stereoscopic production enhances your sense of engagement with the people you're seeing in the frame. But there hasn't been a way to distribute episodic television stereoscopically to date. That's changing, and I think it's going to be huge.
It’s interesting you mentioned 3D TVs. Those obviously flopped, and sometimes people say: VR is going to be the next 3D TV. Now you're telling me: Yes, VR is going to be the next 3D TV, but it's actually going to work.
Bosworth: A funny thing happens where people say: Oh, 3D is all the same. The first time I went to visit James in L.A., he showed us a piece of his upcoming film. He showed it to us how the average stereoscopic movie theater projects it, which is relatively low brightness. Then he put it on a laser projector, which is how he intended it to be seen in theaters. It’s a profound difference.
One of the problems that you do have is that audiences think they're all the same. You have to be relatively savvy to understand what Dolby Vision means, or what IMAX means. 3D TV was a relatively poor stereoscopic experience. Some of it was the glasses. Some of it was limited depth projection. Some of it was brightness …
Cameron: Sweet spots …
Bosworth: Totally. It wasn’t a great experience. The nice thing about the headset is you're guaranteed an outstanding experience every single time. It's a different product, and I think to some degree, we do ourselves a disservice when we in the consumer electronics space overly flatten things.
James, you’re on the board of Stability AI. What gets you as a filmmaker, and as a CGI pioneer, excited about generative AI?
Cameron: I look at the cost of VFX these days as becoming quite limiting for the types of movies and shows that get greenlit. The labor rates have gone up significantly, and the theatrical market has partially collapsed, at least 30 percent. And so we need a solution.
The solution will probably lie in creating specific custom gen AI models that can be injected into existing visual effects workflows. I'm less interested in a kind of magic wand text-to-video approach. If I were a young filmmaker with no resources, no money, and couldn't afford actors, I would be very interested in that type of production. But that's not my interest.
My interest is [in] mainstream and high-end production that involves a lot of effects. And I'm not anti-artist at all. I don't want to cut people. I don’t want people to lose their jobs. What I want them to do is be more productive, so that we can have more throughput through the existing companies.
Through Stability, I [have met] a lot of gen AI developers. Great people, but they're making stuff in a vacuum. They've never made a shot for a movie from end to end. All the production-focused tools that were built over the last 30-some years in CG and in VFX were created because productions needed them. They weren't created in a vacuum.
Do you think generative AI is going to democratize filmmaking for that young filmmaker though? People used to say the same thing about game engines, real-time and virtual production tools, and that’s not exactly what happened. Instead, it has led to Hollywood using gigantic, hugely expensive LED screens …
Cameron: I think it's going to create entry-level avenues for people who can come [to Hollywood] with a film that they've made using prompts. I think it's going to make it easier to get into [that] system, but I don't think the system will change fundamentally.
I personally hope we never replace actors. To me, the joy of the process is working with other artists, creating a moment, an authentic moment, an emotional moment, creating characters.
People say: Gen AI can’t be as creative as humans. I think that's dead wrong. I think it could be just as creative. What it can’t do is create that unique lived experience of an individual viewpoint, which is what we love the most in literature, in novels and film. It can’t do that, but it can be in service of that unique vision. And I intend to embrace it as much as I can, but always in service of the creative process.
Enjoyed reading this story? Then please consider upgrading to the $8 a month / $80 a year paid tier to support my reporting, and get access to the full Lowpass newsletter every week.
SPONSORED
Get $10k in free ad spend for TV ads
Marpipe is the leader in catalog ads. If you run catalog ads, your life is about to get a lot better…
In partnership with Universal Ads, we can transform your catalog into high-performing video ads for TV.
For a limited time only, test it risk free with $10k in free credits.
Want to get your company in front of an audience of over 15,000 tech and media insiders and decision makers? Then check out these sponsorship opportunities.

Image courtesy of Luma AI
Luma AI CEO: Hollywood is dying, only AI can save it
Are you tired of Hollywood churning out movie after movie from the same three franchises year after year? So is Amit Jain, the founder and CEO of Luma AI.
Luma has been working with a number of creatives eager to try out its AI video generator, and even opened up an AI lab in Los Angeles this summer to help filmmakers incorporate AI into their craft. But when I talked to Jain about the company’s new Ray 3 video model this week, he had some harsh words about the current state of the movie business.
“Hollywood is already dead if it continues on its current path,” Jain told me. “This has nothing to do with AI. The way it is consolidating, and keeps telling the same stories over and over again.”
Hollywood has become too risk-averse, and stopped trying new things, he said. “If all you can do is make 100-million, 200-million-dollar movies, then you are never going to touch novelty. Why are you making 5 [to] 10 blockbusters a year instead of, like, you know, trying 50 to 100 ideas?”
“This current decline in the visual arts, and Hollywood production, […] has to stop,” Jain said. “It stops by people trying out new ideas, and AI is the only way.” By giving filmmakers the tools to try out new ideas faster, and for a lot less money, generative AI can help them get back to their roots, and take risks. “AI enables Hollywood to touch novelty again,” he said.
Jain made these remarks a few days before Luma officially unveiled Ray 3, its newest video model. He told me Ray 3 is the first generative video model with reasoning. “Ray 3 is able to evaluate itself, and make sure it’s exactly what you're asking for,” Jain said.
Among other things, this enables Ray 3 to respond to visual annotations: Creatives can draw arrows on a still image to direct motion, and the model then makes a person, animal or object move in the direction of the arrow. The model is also able to generate HDR footage, and comes with a new draft mode that allows filmmakers to quickly try out a bunch of new ideas without having to burn through countless tokens. Once the model spits out something promising, the draft can be used as a starting point to generate high-quality footage.
Jain also told me that he has seen a massive shift in Hollywood over the past year or so. While Luma initially had to court studio executives, it is now seeing a ton of inbound requests – and plenty of studios trying to figure out how to incorporate AI into their workflows. “It's just not public yet,” Jain said. “And the reason it's not public yet [is that] nobody wants to show their hand before they have the thing fully working.”
As Hollywood is warming up to AI, it faces a new kind of challenge: The technology is evolving at an incredibly fast pace. A model introduced just a few months ago may already be outdated before a studio has had a chance to fully integrate it. However, Jain argued that this shouldn’t hold the industry back.
“Some of the best directors we work with actually think of generative [AI] as this whole different thing,” he said. “They make use of the artifacts that the models produce. They're trying to use the morphing to make horror movies.”
“For creatives, this is just a whole new easel,” Jain added.
SPONSORED
International HR and payroll in 185+ countries
Growing your team outside your HQ country? RemoFirst helps you employ anywhere, with EOR services available in 185+ countries.
International HR and payroll for global employees and contractors, without setting up a local entity.

Image courtesy of Snap
Snap’s Spectacles get native browser, WebXR support
Just ahead of Meta’s announcement of its first consumer AR device, Snap is back with an update for its own AR glasses: Spectacles, which were first launched as a dev kit a year ago, are getting an OS upgrade that includes, among other things, a new browser.
I know, sounds mundane, right? But there’s actually something notable: The browser comes with support for WebXR, which opens up Spectacles to web-based AR and VR apps.
That means two things: First, Spectacles users will soon have access to a lot more immersive apps optimized for headsets and wearables. Second, developers will be able to make apps available on Spectacles without having to use Snap’s Lens Studio development kit, and ship them without having to go through the device’s Lens Explorer app store.
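For developers wondering what WebXR support looks like in practice, here’s a minimal sketch of how a web page typically detects and requests an immersive AR session. This uses the standard WebXR Device API names and is purely illustrative — it isn’t Snap’s code, and Spectacles may support only a subset of WebXR features:

```javascript
// Hedged sketch: feature-detect WebXR and request an immersive AR session.
// Outside a browser (or on a device without WebXR), this resolves to null.
async function maybeStartARSession() {
  // navigator.xr is the standard WebXR entry point
  const xr = globalThis.navigator && globalThis.navigator.xr;
  if (!xr) return null; // no WebXR implementation available

  const supported = await xr.isSessionSupported("immersive-ar");
  if (!supported) return null; // browser/device can't do immersive AR

  // In real browsers this must be called from a user gesture (e.g. a click).
  // "hit-test" lets AR content anchor to real-world surfaces.
  return xr.requestSession("immersive-ar", { requiredFeatures: ["hit-test"] });
}
```

The practical upshot is the one described above: a page like this could run on any WebXR-capable device, Spectacles included, without going through Lens Studio or an app store review.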
What else
Meta’s first smart glasses with display are here. The Verge’s Victoria Song calls them “the closest we’ve ever gotten to what Google Glass promised over 10 years ago.”
The music streaming service that just won’t die. The second part of my story about the history of Pandora, which turns 20 this week, including never-before-reported details about the company’s secret Pandora X project.
Amazon is having a hardware event on September 30. The company may unveil new Echo speakers, among other things.
Mental health challenges are a normal part of life, and so is asking for help. With BetterHelp, you can match with a therapist easily, message anytime, and get the convenience of therapy that’s 100% online. Get Started with 25% off Your First Month (SPONSORED)
The Sonos price increase is here. Thanks, tariffs! Some Reddit users are reporting that the company doubled their upgrade credits, making it possible to buy products for 30% off.
The Internet Archive has settled a music industry lawsuit. Universal Music and others had sued the non-profit over old, rare sound recordings.
Inside Apple’s AirPod lab. I love me a good lab story. And for some reason, Apple put Govee lights in its anechoic chamber?
That’s it
I’m doing a subscriber drive as part of Back Indie Media. Right now, I’m 25% towards my goal of signing up 20 new paying subscribers by the end of September. Did you like today’s newsletter? Then why don’t you help me get one step closer to that goal?
Thanks for reading, have a great weekend!
Was this email forwarded to you? To get Lowpass for free every week, sign up here.
Got a news tip? Simply respond to this email.
Interested in partnering with Lowpass? Check out our sponsorship page.


