
Deadpool VR Is Out Now, Exclusive To Quest 3 & 3S

Marvel's Deadpool VR, the latest Quest 3 and Quest 3S exclusive title, is out now for $50.

Developed by Meta-owned Twisted Pixel Games in collaboration with Marvel Games, Deadpool VR has a cel-shaded graphics style, and unlike in the movies, Deadpool in VR is voiced by Neil Patrick Harris, not Ryan Reynolds.

The arcade-style action game sees you, as Deadpool, kidnapped by the supervillain Mojo, voiced by John Leguizamo, and forced to compete in, and hunt down talent for, his galaxy-wide reality TV show. The talent you'll kidnap are iconic villains from across the Marvel universe, including Mephisto, Lady Deathstrike, Omega Red, and Ultimo.

In UploadVR's review of Deadpool VR, Pete Austin described the visuals as paying off "beautifully", with the best implementation of cel-shading that he's seen yet in VR. He also found Neil Patrick Harris' performance to be "easily on par" with Ryan Reynolds in the films, and the "gloriously over-the-top" soundtrack to feel like it was straight out of one too.

However, his feelings on the combat system were more mixed. While it impressed in the early phases of the game, he criticized the fact that it's "disappointingly weightless", with weapons clipping through each other, hands clipping through the environment, and two-handed weapons feeling like they’re made of paper.

"Deadpool VR is a paradox. It captures the antihero's essence perfectly but wraps it around mechanics that just never feel like they completely deliver - great presentation carrying combat that never quite lives up to its potential."
Marvel’s Deadpool VR Review: Merc With A Meta Quest
Marvel’s Deadpool VR captures the antihero’s essence well, but doesn’t fully deliver on the potential of its combat.
UploadVR · Pete Austin

You should go read Pete's full review, and if it leaves you wanting to play, you can buy Deadpool VR on the Meta Horizon Store for Quest 3 and Quest 3S, priced at $50.

  •  

PlayStation VR2 Will Be $300 On Black Friday

PlayStation VR2 will be just $300 on Black Friday, $100 off, in a sale that will last for an unspecified "limited time".

Originally priced at $550, Sony officially cut the price of the headset to $400 earlier this year, just over two years after it launched.

Now, for Black Friday 2025, PlayStation VR2 will temporarily sell for just $300, its lowest price ever.

The closest we've seen was $350, a price the headset has gone on sale for three times: in summer 2024, the 2024 holiday period, and for the Days Of Play 2025 event. During that summer 2024 discount, sales reportedly skyrocketed, with one retailer selling more units in a single day than it had in the entire year up to that point.

Open-Source Tool Adds Eye-Tracked Foveated Rendering To Many SteamVR Games
An open-source tool for Windows PCs with modern Nvidia GPUs adds eye-tracked foveated rendering to a huge number of SteamVR games.
UploadVR · David Heaney

If you're a PC gamer, you'll be able to pick up the headset, Sony's PC adapter, and (if required) a DisplayPort cable and Bluetooth adapter for less than $400 all-in.

And with PSVR2Toolkit and PimaxMagic4All, if you have a GTX 16 series or RTX graphics card, you can even leverage eye-tracked foveated rendering in a wide range of SteamVR titles.

With its 2K OLED displays, PlayStation VR2 offers a more vibrant image with far greater contrast than any other affordable PC VR headset, though with the tradeoff that the image is softer and has some fixed-pattern noise over it.

Sony Increases PS5 Console Prices In The US
From tomorrow, the Digital Edition will be $500, the version with a disc drive $550, and the PS5 Pro $750.
UploadVR · David Heaney

The PS5 and PS5 Pro will also be on sale from Friday, with the same $100 discount. The Black Friday deals come three months after Sony increased the price of the consoles.

The digital edition PS5 will be on sale for $400, while the PS5 Pro will be offered at $650.

That means you'll be able to grab a PS5 and PlayStation VR2 together for $700, or a PS5 Pro and PS VR2 for $950, delivering a full consolized high-end VR experience for less than $1000.

Of course, next year Valve too will offer a consolized high-end VR experience with Steam Frame and Steam Machine. That combination will have the significant benefit of being wireless, but will also likely cost at least twice as much, making Sony's Black Friday proposition still a good deal.

  •  

Free Tool Adds Eye-Tracked Foveated Rendering To Many SteamVR Games

A free tool for Windows PCs with modern Nvidia GPUs adds eye-tracked foveated rendering to a huge number of SteamVR games.

Called PimaxMagic4All, the tool re-implements a feature Pimax ships in its Pimax Play software used to set up and adjust its headsets. As such, if you already own a Pimax headset, you don't need it.

PimaxMagic4All should work with any SteamVR-compatible headset that exposes a low-level public API to retrieve eye tracking data, or which has third-party software that does so.

What Is Foveated Rendering?

  • Fixed Foveated Rendering (FFR) means rendering the central area of the image at a higher resolution than the peripheral area.
  • Eye-Tracked Foveated Rendering (ETFR), occasionally also called Dynamic Foveated Rendering, means rendering the area you're currently looking at in higher resolution than everywhere else, as determined by eye tracking sensors.

Both techniques save performance in VR, and this can be used to either run demanding experiences at a smoother framerate or render experiences already hitting framerate at higher peak resolution.

FFR comes with noticeable pixelation at the edges but works on any headset, while with ETFR there shouldn't be any noticeable difference, depending on the settings and assuming the eye tracking system has low enough latency.
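
The distinction above can be sketched in a few lines of Python. This is a toy model, not any vendor's API: the `shading_rate` function and its radius thresholds are hypothetical, and the only difference between FFR and ETFR is whether the gaze point is fixed at screen center or fed from an eye tracker.

```python
import math

def shading_rate(px, py, gaze_x, gaze_y,
                 inner_radius=0.15, outer_radius=0.35):
    """Return the fraction of full resolution to render a point at,
    given normalized screen coordinates and the current gaze point."""
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist <= inner_radius:
        return 1.0    # foveal region: full resolution
    if dist <= outer_radius:
        return 0.5    # transition band: half resolution
    return 0.25       # periphery: quarter resolution

# FFR: the gaze point is assumed fixed at screen center (0.5, 0.5),
# so a corner pixel always renders at reduced resolution
ffr = shading_rate(0.9, 0.9, 0.5, 0.5)   # 0.25

# ETFR: same function, but the gaze point comes from the eye tracker,
# so looking at the corner restores full resolution there
etfr = shading_rate(0.9, 0.9, 0.9, 0.9)  # 1.0
```

This also shows why ETFR saves more performance for the same perceived quality: the full-resolution region can be kept small, because it follows your eyes rather than having to cover everywhere you might look.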

The developer says that it should "likely" work with Valve's Steam Frame too, when streaming from a Windows PC with an Nvidia GPU, and in theory could work with HTC Vive Pro Eye and Vive Focus Vision with additional development time.

The developer, by the way, is Matthieu Bucchianeri, a name you may recognize if you're a regular UploadVR reader.

Bucchianeri is a very experienced developer, having worked on the PS4 and original PlayStation VR at Sony, Falcon 9 and Dragon at SpaceX, and HoloLens and Windows MR at Microsoft, where he currently works on Xbox. At Microsoft he contributed to OpenXR, and in his spare time he developed OpenXR Toolkit, VDXR (Virtual Desktop's OpenXR runtime), and most recently Oasis, the native SteamVR driver that revived Windows MR headsets.

PimaxMagic4All used with Varjo Aero.

PimaxMagic4All has a simple graphical interface with three levels of foveated rendering: Maximum, Balanced, and Minimum. You can choose between prioritizing performance, achieving a result where you shouldn't notice the difference, or a balance of the two.

The tool can inject foveated rendering into any title that uses the DirectX 11 graphics API and OpenVR, Valve's deprecated API for SteamVR. The game also needs to not have an anti-cheat system, since those will prevent code injection. And remember, you need to have an Nvidia graphics card, specifically a GTX 16 series or RTX card.
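
The compatibility rules above can be condensed into a single predicate. The function name and parameters here are hypothetical, purely for illustration; the rules themselves are the ones just stated.

```python
def pimaxmagic4all_compatible(graphics_api: str, runtime_api: str,
                              has_anticheat: bool, gpu_family: str) -> bool:
    """Illustrative check of the stated requirements: DirectX 11,
    OpenVR (not OpenXR), no anti-cheat, and a modern Nvidia GPU."""
    return (graphics_api == "DirectX 11"
            and runtime_api == "OpenVR"           # Valve's deprecated SteamVR API
            and not has_anticheat                 # anti-cheat blocks code injection
            and gpu_family in ("GTX 16", "RTX"))  # modern Nvidia required

# A DX11 OpenVR title without anti-cheat on an RTX card qualifies
print(pimaxmagic4all_compatible("DirectX 11", "OpenVR", False, "RTX"))  # True
```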

You can find a small list of supported titles on the GitHub project's wiki page, and it includes Half-Life: Alyx, Skyrim VR, Fallout 4 VR, Elite Dangerous, Assetto Corsa, and Boneworks. But this is only a fraction of the total number of games that should be supported in theory.

Note that three titles you won't need this for are Microsoft Flight Simulator 2024, DCS, and iRacing, since all three now support OpenXR eye-tracked foveated rendering natively.

Microsoft Flight Simulator 2024 Now Has Foveated Rendering
Microsoft Flight Simulator 2024 now has both fixed and eye-tracked foveated rendering, alongside a range of other improvements to VR support.
UploadVR · David Heaney

PimaxMagic4All is available on GitHub, where you'll find both the source code added around Pimax's core and compiled releases.

  •  

Quest 3S Is $200 At Costco Today And Includes 12 Months Of Horizon+

Quest 3S is just $200 today at Costco for members or $215 for non-members, and includes 12 months of the Meta Horizon+ games subscription.

You can find the deal on Costco's website, and the $100 discount from the regular $300 price will apply at checkout, with a $15 surcharge added if you're not a Costco member.

This is the lowest outright price we've ever seen for Quest 3S, and a year of the Horizon+ subscription normally costs $60. New Meta Quest headsets otherwise come with 3 months of the subscription.

Horizon+ includes a Games Catalog with some of Quest's best VR games, including Asgard's Wrath 2, Cubism, Demeo, Dungeons of Eternity, Eleven Table Tennis, Ghosts of Tabor, Job Simulator, Maestro, Onward, Pistol Whip, Red Matter, Synth Riders, The Climb 2, and Walkabout Mini Golf. It also lets subscribers redeem 2 monthly games pre-selected by Meta.

Quest’s Horizon+ Monthly Games For November 2025 Revealed
From a VR adventure hit to a strategic submarine sim, these are the Horizon+ monthly games for November on Quest.
UploadVR · Henry Stockdale

While Quest 3S can run all the same content as Quest 3, and has the same fundamental capabilities (including the same XR2 Gen 2 chipset and 8GB RAM), if you have the funds we always recommend Quest 3 over Quest 3S. The proper Quest 3 features Meta's advanced pancake lenses which are clearer and sharper over a wider area, have a wider field of view, and have precise separation adjustment, making them suitable for essentially everyone's eyes. These pancake lenses also enable Quest 3 to be thinner, which makes the headset feel slightly less heavy.

Still, at just $200 or $215 and with a year of Horizon+ games, Costco's Quest 3S deal could be hard to say no to.

The deal ends today, so grab it quickly if you want to affordably bring a friend or loved one into VR and mixed reality this holiday season.

  •  

Steam Frame Isn't Valve Index 2, And That's A Good Thing

Of the many things Steam Frame is, what it isn't is a Valve Index 2. But that's a good thing.

When Valve Index launched in 2019, it was one of the most expensive VR headsets on the consumer market. Facebook had just launched the $400 Rift S and Oculus Quest headsets, and there was nothing like Apple Vision Pro or Samsung Galaxy XR.

At $1000 for the full kit, Index was a premium product for enthusiasts, meant to push the high-end, with (relatively) wide field of view lenses, off-ear speakers, and precise laser tracking. The thick, heavy tether and wall-mounted base stations were a feature, not a bug.

Based on some of the reactions to Steam Frame over the past few days, it's clear that many Index owners, and hardcore VR enthusiasts in general, were hoping that Valve would repeat its last-decade strategy, with another high-end tethered headset.

They wanted 4K micro-OLED panels (or at least, say, 3K LCD with local dimming) fed by yet another DisplayPort cable, with ultra wide field of view lenses, face tracking, and "Lighthouse" base station tracking, backwards-compatible with existing SteamVR peripherals.

But there are good reasons why Valve didn't do this, and why Steam Frame is the better strategy.

Oculus Quest 2 Is Now The Most-Used VR Headset On Steam
Nearly 5 months after launch, Facebook’s Oculus Quest 2 is the most used VR headset on Steam. That is according to the February 2021 results for the Steam Hardware Survey. Valve finally started counting Quest 2 headsets in its own category at the beginning of January. Quest 2 jumped…
UploadVR · Jamie Feltham

Index was relatively successful for what it was trying to be, by all accounts. More than six years later it still makes up around 15% of SteamVR usage. But what it did not do is meaningfully increase the total number of people playing VR games on Steam.

Instead, it was the $300 Quest 2 that achieved that feat. Less than six months after launch it became the most used headset on Steam, and today standalone headsets make up over two-thirds of SteamVR use.

Standalone headsets with computer vision tracking allow anyone to connect to SteamVR on their PC with a couple of clicks, completely wirelessly, with no base stations or other complex setup required. And that they are wireless matters.

Valve “Looking Into Several Methods” For Wireless Index
During the Index VR headset launch party Valve CEO Gabe Newell stated that the company is “looking into several methods” for making the headset untethered: “As I said, shipping a product is truly the beginning. There are some obvious next steps. It’s simple for us to broaden our distribution…
UploadVR · David Heaney

Among existing VR enthusiasts, there is a sentiment that wireless is a nice-to-have but far from essential feature, while some are even actively opposed to it, adamant that they'll never cut the tether.

But there is a selection bias at play here. People who consider the cable a dealbreaker didn't buy the Index, or any other tethered PC VR headset. And they are the majority.

Since the HTC Vive Wireless Adapter, seven years ago, it has been obvious that wireless is the ideal for VR. You don't have to stow a cable and avoid running over it with your chair wheels. You can rotate freely in VR without worrying about getting tangled. And you can truly lose yourself in the virtual world because you don't have a tether reminding you where your PC is.

In fact, in 2017 Valve CEO Gabe Newell called wireless VR a “solved problem”. “My expectation is that wireless will be an add-on in 2017, and then it will be an integrated feature in 2018”, Newell was quoted as saying during a press conference that year.

Of course, the Vive Wireless Adapter relied on a 60GHz signal, unable to penetrate solid objects at all, so the transmitter had to be wall mounted and the receiver positioned on the top of your head, plus it was expensive. It was the right goal, but with the wrong technology.

Within days of the release of Oculus Quest people started using their existing home Wi-Fi network, leveraging the same H.264 codec used for video streaming to turn a $400 headset into a wireless room-scale PC VR system for no additional cost.

From here, the death of tethered PC-only VR headsets, or at least their relegation to a tiny niche, was inevitable.

Steam Frame Hands-On: UploadVR’s Impressions Of Valve’s New Headset
UploadVR’s Ian Hamilton and David Heaney went hands-on with Steam Frame at Valve HQ, trying both standalone use and PC VR.
UploadVR · Ian Hamilton

There are two problems with this approach, however.

Firstly, the high compression ratio means that this kind of wireless VR doesn't look as good as a DisplayPort signal. And secondly, while some enthusiasts have ideal dedicated network setups with a high-end dedicated access point, most people rely on the cheap router their ISP supplied them a decade ago, which may not be near their VR playspace and also has to handle the traffic from the rest of the household.

With Steam Frame, Valve is using a combination of both hardware and software cleverness to refine the compressed wireless streaming experience. The headset has dual wireless radios, one of which is dedicated to the PC wireless adapter included in the box. And eye-tracked foveated streaming is used at all times, optimizing the video stream quality for where you're currently looking.

The Steam Frame box included the wireless adapter, front and center (photo by UploadVR at Valve HQ).

Essentially, Steam Frame is trying to package the high-quality wireless VR setups that only enthusiasts experience today into a relatively mainstream PC gaming product.

It's not about delivering yet another tethered PC VR headset with higher resolution – there are Bigscreen and Pimax headsets for that. Instead, Steam Frame is focused on delivering the best possible wireless PC VR experience that can be sold for less than $1000 (Valve's current plan).

And it's exactly this that PC VR needs. A product that out of the box, for every buyer, delivers an excellent wireless PC VR experience, without modifying their home network setup. Steam Frame isn't Index 2, but it's the better move for Valve. And instead of selling to the same few hundred thousand enthusiasts, I suspect it could sell millions of units through its lifetime, bringing far more customers for developers building PC VR games.

  •  

Quest 3S Is $250 At Best Buy And Comes With $110 Of Black Friday Perks

Quest 3S is on sale for $250 at Best Buy, and comes with a $50 Best Buy gift card, 1 month of Xbox Game Pass Ultimate, and The Walking Dead: Saints & Sinners VR game.

That's a $50 discount from the headset's regular $300 price, and the three perks together are worth $110. You can find the deal for the 128GB base model of Quest 3S here.

A similar offer is available for the 256GB storage model, with a $330 price ($70 off) and the same $110 of perks. In both cases, you still get 3 months of the Meta Horizon+ subscription, as with all purchases of new Meta Quest headsets.

You could use the $50 Best Buy gift card to get the Elite Strap to make the headset more comfortable for just $20, for example, while during the 1 month of Xbox Game Pass Ultimate (normally $30) you can play flatscreen games like Call of Duty and Fortnite on a giant virtual screen.

As for The Walking Dead: Saints & Sinners, it's also normally $30, and it's widely considered to be one of the best VR games of all time due to its physics-based combat system, earning an 'Essential' score in our review.

The Walking Dead: Saints & Sinners Review - The Best Zombie Apocalypse To Date
With two big updates under its belt, there’s never been a better time to get into The Walking Dead: Saints & Sinners. Read on for our 2021 The Walking Dead: Saints & Sinners review! Note: This is an updated review based on The Walking Dead: Saints & Sinners after its second free…
UploadVR · Jamie Feltham

While Quest 3S can run all the same content as Quest 3, and has the same fundamental capabilities (including the same XR2 Gen 2 chipset and 8GB RAM), if you have the funds we always recommend Quest 3 over Quest 3S. The proper Quest 3 features Meta's advanced pancake lenses which are clearer and sharper over a wider area, have a wider field of view, and are fully horizontally adjustable, suitable for essentially everyone's eyes. These pancake lenses also enable Quest 3 to be thinner, which makes the headset feel slightly less heavy.

Still, at $250 and with $110 worth of perks Quest 3S could be hard to say no to, and it could be an impulse gift for the holiday season to bring a friend or loved one into VR and mixed reality.

  •  

Lynx's New Headset Won't Run Android XR, But Will Have Widest Standalone FOV

Lynx says its next headset won't run Android XR, as Google "terminated" its agreement, but will have by far the widest FOV of any standalone.

If you're unfamiliar, Lynx is a French startup that in 2020 announced Lynx-R1, a standalone mixed reality headset with an open periphery design, and ran a Kickstarter for it in 2021. Had it shipped on time, in 2022, Lynx-R1 would have been the first consumer standalone headset with color passthrough. But after repeated delays it was beaten to market by Meta Quest Pro, and by the time backers started to receive their headsets, years later, Quest 3 and Apple Vision Pro had shipped too, with much more powerful chipsets.

Further, at the time of the Kickstarter Lynx-R1 was envisioned as a roughly $500 consumer product, directly competing with Meta Quest headsets. But the price for new orders rose to $850 and then $1300 as the company pivoted to primarily targeting businesses.

Lynx-R1 Production Has Been “An Absolute Mess”
Lynx-R1 production has been “an absolute mess”, says the founder & CEO in a new blog post explaining the reasons.
UploadVR · David Heaney

When Google revealed its Android XR operating system back in December, it announced that Lynx, Sony, and Xreal were building devices for it too, to follow Samsung.

Last month, Lynx teased its next headset with a darkened image, and because of Google's December announcement, we speculated that it could be the second opaque Android XR headset.

However, Lynx tells UploadVR that Google "terminated Lynx's agreement to use Android XR" in what the startup describes as a "surprising turn of events".

"We remain open to having Android XR running on the device when Google releases the OS for other headsets, as we worked closely with them for a year to make sure the compatibility would be guaranteed", Lynx says in a prepared statement.

Instead, the next Lynx headset will continue to run Lynx OS, the startup's open-source fork of Android with OpenXR support. And Lynx says it will release the source code for both hobbyists and businesses to use as an alternative to closed-source XR operating systems.

UploadVR reached out to Google to ask about the Lynx partnership and the status of Android XR for headsets other than Samsung Galaxy XR. While the company wouldn't comment on the status of any agreement with Lynx, it confirmed that it's still working with Xreal and Sony.

Lynx Releases Open-Source Android 6DoF Positional Tracking System
Lynx released an open-source 6DoF positional tracking system that should work on any Android headset with a Qualcomm chip.
UploadVR · David Heaney

Lynx will announce details and specifications of its new headset over the coming months, with a full reveal at SPIE in late January.

For now, it's only saying that it will be a "mid-range" headset, priced somewhere between Quest 3 and Galaxy XR, with the widest field of view of any known standalone due to the use of advanced aspheric pancake lenses built in collaboration with Israeli startup Hypervision.

The optical approach here should be somewhat similar to Meta's Boba 3 prototype, though given the practicalities of the standalone form factor, Lynx cautions that while its headset's field of view will be noticeably wider than anything else on the market today, it still won't be anywhere near as wide as Boba 3's.

When it comes to delivering this time, Lynx founder Stan Larroque tells UploadVR that his company has "learned so much with the R1" in regards to electronics supply chains, and will not do a Kickstarter or preorders for the new headset. When it's available to buy, it will be ready to ship immediately, Larroque claims.

  •  

Valve Officially Announces Steam Frame, A "Streaming-First" Standalone VR Headset

Valve just officially announced Steam Frame, a "streaming-first" standalone VR headset launching in "early 2026".

Steam Frame has a lightweight modular design and runs a VR version of Valve's SteamOS, the Linux-based operating system used in Steam Deck. With an evolved version of the Proton compatibility layer it can run almost any Linux, Windows, and Android game, including SteamVR games. Many titles won't perform well on the mobile chipset, though, so Steam Frame has a wireless dongle in the box to leverage the power of your gaming PC – hence Valve's "streaming-first" positioning.

The headset does not require or support base stations. It tracks itself and its included controllers using four onboard grayscale tracking cameras, two of which can be used for monochrome passthrough, and it also has eye tracking for foveated streaming.

Steam Frame will replace Valve Index, which the company confirmed to UploadVR is no longer in production, and joins Valve's "family" of hardware products, which will also soon include a Steam Machine consolized PC and a new Steam Controller.

Steam Frame Hands-On: UploadVR’s Impressions Of Valve’s New Headset
UploadVR’s Ian Hamilton and David Heaney went hands-on with Steam Frame at Valve HQ, trying both standalone use and PC VR.
UploadVR · Ian Hamilton

My colleague Ian Hamilton and I went hands-on with Steam Frame at Valve HQ, and you can read our impressions here. This article, on the other hand, provides a full rundown of the design, specifications, and features of Steam Frame, based on the information provided to us by Valve.

Lightweight Modular Design

Steam Frame will come with a replaceable battery strap, with built-in dual driver speakers and a 21.6 Wh rear battery.

The strap itself is fabric and the rear battery unit has soft padding, meaning it can "collapse" against the lenses for portability and naturally deform when your head is resting on a chair, sofa, or bed.

Steam Frame and the Steam Frame Controllers (image from Valve).

The core frontbox of Steam Frame weighs just 185 grams, Valve says, while the entire system with the default included facial interface, speakers, strap, and rear battery weighs 440 grams.

That makes Steam Frame the lightest fully-featured standalone VR headset to date.

The rear battery of Steam Frame's included default strap (image from Valve).

Steam Frame is a modular system, and Valve will make the CAD and electrical specifications available to third parties to build custom facial interfaces and headstraps. Someone could, for example, build a rigid strap with an open interface, or a fully soft strap with a tethered battery. Expect a range of accessories.

2K LCDs & Pancake Lenses

Steam Frame features dual 2160×2160 LCD panels, meaning it has twice as many pixels as the Valve Index and roughly the same as Meta Quest 3.

The panels have a configurable refresh rate between 72Hz and 120Hz, with an "experimental" 144Hz mode, just like the Index.

Steam Frame's lenses (photo by UploadVR at Valve HQ).

Valve says the multi-element pancake lenses in front of the panels offer "very good sharpness across the full field of view", which the company describes as "slightly less than Index", and "conservatively" 110 degrees horizontal and vertical.

Lens separation is manually adjusted via a wheel on the top of the headset, letting wearers match their interpupillary distance (IPD) for visual comfort.

Wireless PC Adapter With Foveated Streaming

Steam Frame does not support DisplayPort or HDMI in. It is not a tethered headset. Instead, Valve is going all-in on compressed wireless streaming, aiming to perfect it with a combination of clever hardware and software.

The headset has two separate wireless radios. One is used as a client, connecting to your home Wi-Fi network on the 2.4GHz or 5GHz band for the general internet connection of SteamOS. The other is for a 6GHz Wi-Fi 6E hotspot, created by the headset, that SteamVR on your PC automatically connects to via the USB adapter included in the box.

It's a truly dedicated point-to-point connection between Steam Frame and your PC.

The wireless adapter is included in the box (photo by UploadVR at Valve HQ).

This gives Valve precise firmware-level control over the entire network stack for wireless PC VR and eliminates the problems you might experience using other standalone headsets for this, such as being bottlenecked by a router that's either too far away, blocked by too many walls, congested by other traffic, or just supplied by your ISP because it was cheap, not because it's any good.

Of course, some enthusiasts already have a high-quality Wi-Fi setup for PC VR, with a premium router or access point in the room where they play. Valve tells us that such people can continue to use their setup instead of the adapter if they really want, but suspects they won't.

The other feature Valve has implemented to make the wireless PC VR experience as good as it can possibly be is foveated streaming. Steam Frame has built-in eye tracking, and when you're using PC VR it's always used to encode the video stream in higher resolution where you're currently looking.

While this feature has existed as part of Steam Link VR for Quest Pro since the app launched in late 2023, Valve says on Steam Frame the foveated streaming has lower latency and greater precision, thanks to the company controlling the entire software stack on the headset side.

Linux, Windows & Android Apps Standalone

Steam Frame can run Linux, Windows, and Android applications through a combination of compatibility layers and emulation.

As with other SteamOS devices such as Steam Deck, Steam Frame can run Linux titles natively as well as Windows applications via Proton, the compatibility layer Valve has been working on for almost a decade now in collaboration with CodeWeavers.

But while Steam Deck is an x86 device, the same CPU architecture as a gaming PC, Steam Frame uses the mobile-focused ARM architecture. That brings a huge advantage: Steam Frame can natively run Android APKs, including those you download in the web browser, as long as they don't require Google Play Services. And Valve will now be accepting Android APKs on Steam, so developers can easily port their Meta Quest games.

But the ARM architecture also means that Steam Frame can't natively run x86 applications, which the majority of Steam games are.

Photo by UploadVR at Valve HQ.

To solve this, Valve has been investing in FEX, an open-source tool for emulating x86 applications on ARM Linux devices that it has integrated into Proton on Steam Frame. The company tells UploadVR that the performance impact here is "shockingly small" – on the order of a few percent.

The ability to run x86 Windows applications means that Steam Frame can, in theory, run almost any VR title on Steam.

However, the key word here is "run". Steam Frame features a roughly 10-watt chipset originally designed for use in smartphones, and has only a fraction of the power of the gaming PC hardware that most SteamVR titles were designed for. Thus, while visually simplistic and well-optimized titles at relatively low graphics settings will run well, and there'll be a "Steam Frame Verified" tag for such titles on Steam, for high-fidelity VR gaming such as playing Half-Life: Alyx you'll want to leverage your PC.

Snapdragon 8 Gen 3 + 16GB RAM

Steam Frame is powered by Qualcomm's Snapdragon 8 Gen 3 chipset, paired with 16GB of LPDDR5X RAM.

Two models will be sold, one with 256GB UFS storage and the other with 1TB, and there's also a microSD card slot for expanded storage. In fact, you can even transfer the microSD card from your Steam Deck or Steam Machine, and your games will instantly be available to play.

Non-shipping transparent internal prototype (photo by UploadVR at Valve HQ).

So just how powerful is Steam Frame's chip? Well, the XR2 Gen 2 series used in pretty much every other non-Apple headset features the Adreno 740 GPU from the 8 Gen 2 smartphone chip, and the 8 Gen 3 is the successor from the year after with the newer Adreno 750.

On paper, Steam Frame's Adreno 750 GPU is 25% more powerful than the Adreno 740 in Meta Quest 3, and this difference increases to over 30% when you factor in the fact that Quest 3 slightly underclocks its GPU, while Valve confirmed that Steam Frame does not. Further, the effective performance difference will be even greater in titles that leverage eye-tracked foveated rendering.
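
As a back-of-the-envelope check on those figures: the 25% paper-spec gap comes from the article, while the Quest 3 underclock factor below is an illustrative assumption (Valve and Meta haven't published exact clocks), chosen only to show how any underclock pushes the effective gap past the paper spec.

```python
# Normalized GPU throughput, Quest 3's Adreno 740 as the baseline
adreno_740 = 1.00
adreno_750 = 1.25            # ~25% faster on paper, per the article

quest3_clock_factor = 0.96   # hypothetical slight underclock on Quest 3

effective_quest3 = adreno_740 * quest3_clock_factor
gain = adreno_750 / effective_quest3 - 1
print(f"Effective advantage: {gain:.1%}")  # just over 30% with these assumptions
```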

The CPU, on the other hand, is much more difficult to compare, as the XR2 Gen 2 uses a non-standard core configuration and 2D benchmarks run on headsets don't induce the maximum clock speed. But based on what we know about the chips, expect Steam Frame to have around 50% better single-threaded performance than Quest 3 and around 100% better multithreaded performance, as a rough estimate.

Essentially, from a standalone performance perspective Steam Frame is notably more powerful than other non-Apple standalone headsets, though still significantly less powerful than a gaming PC.

SLAM Tracking & Monochrome Passthrough

Steam Frame has four outward-facing grayscale fisheye cameras for inside-out headset and controller tracking via computer vision. You don't need base stations, and the headset doesn't support them anyway.

Two of the cameras are on the top corners, and the other two are on the front, near the bottom, widely spaced.

One of Steam Frame's greyscale fisheye tracking cameras (image from Valve).

To make headset tracking work in the dark, Steam Frame also features infrared illuminators, bathing your environment in IR light that the cameras can see.

You can choose to see the real world around you via the two front cameras at any time, though the view is monochrome and lower resolution than the passthrough on headsets with dedicated mixed reality cameras. The advantage, combined with the IR illuminators, is that it lets you see in the dark.

Front Expansion Port

While Steam Frame has only low-resolution monochrome passthrough by default, it has a user-accessible front expansion port that in theory enables color cameras, depth sensors, face tracking sensors, and more to be added.

Valve says the port offers a dual 2.5Gbps MIPI camera interface and also supports a one-lane Gen 4 PCIe data port for other peripherals.
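For context, the raw bandwidth those interfaces imply can be worked out from standard figures (PCIe Gen 4 runs at 16 GT/s per lane with 128b/130b encoding); a quick back-of-the-envelope sketch:

```python
# Approximate raw bandwidth of the expansion port's two interfaces.
# PCIe Gen 4: 16 GT/s per lane, 128b/130b encoding overhead.
pcie_gen4_x1_GBps = 16e9 * (128 / 130) / 8 / 1e9
print(f"PCIe Gen 4 x1: ~{pcie_gen4_x1_GBps:.2f} GB/s")    # ~1.97 GB/s

# Two 2.5 Gbps MIPI camera interfaces combined.
mipi_total_GBps = 2 * 2.5e9 / 8 / 1e9
print(f"Dual 2.5 Gbps MIPI: ~{mipi_total_GBps:.3f} GB/s")  # ~0.625 GB/s
```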

"There is certainly enough flexibility in this port to do anything people are interested in doing", Valve's Jeremy Selan told UploadVR.

Included Controllers With Gamepad Parity

The included Steam Frame Controllers have a relatively similar ringless design to Meta's Touch Plus controllers, and are also tracked by the headset via infrared LEDs under the plastic. However, while Touch Plus controllers have 8 IR LEDs each, 7 on the face and 1 on the handle, Steam Frame Controllers have 18 each, dispersed throughout the face, handle, and bottom, which should make them more resistant to occlusion.

The bigger difference between Touch Plus and Steam Frame Controllers is the inputs. Valve has put all four A/B/X/Y buttons on the right controller and a D-Pad on the left controller, while both have an index bumper in addition to the index trigger.

Steam Frame Controllers (image from Valve).

The idea here is that, together, the Steam Frame Controllers have all the same inputs as a regular gamepad, meaning they can be used for both VR and flatscreen gaming. You can switch between VR and flatscreen seamlessly, and you'll need less space in your bag when traveling.

Steam Frame Controllers also feature capacitive finger sensing on all inputs and the handle, as well as advanced tunneling magnetoresistance (TMR) thumbsticks. TMR technology means they should have improved precision and responsiveness compared to traditional potentiometer thumbsticks, and should be significantly more resistant to drift – an issue that plagued the Valve Index Controllers.

Unlike the Index controllers, Steam Frame Controllers don't have built-in hand grip straps. But Valve says it will sell them as an optional accessory for people who want them, a similar strategy to Meta.

Steam Frame and the Steam Frame Controllers (photo by UploadVR).

As with Touch Plus controllers, the Steam Frame Controllers are powered by a single AA battery. They should last roughly 40 hours, though this is highly dependent on how much the haptic actuator gets activated.

Steam Frame does not currently support controller-free hand tracking. It requires some form of input device.

Spec Sheet & Competitors Comparison

Here's a full list of Steam Frame's specs, directly compared to Meta Quest 3 and Samsung Galaxy XR for context:

| | Valve Steam Frame | Meta Quest 3 | Samsung Galaxy XR |
|---|---|---|---|
| Displays | 2160×2160 LCD | 2064×2208 LCD | 3552×3840 micro-OLED |
| Refresh Rates | 72-120Hz (144Hz experimental) | 60-120Hz (90Hz Home, 72Hz app default) | 60-90Hz (72Hz default) |
| Stated FOV | 110°H × 110°V | 110°H × 96°V | 109°H × 100°V |
| Platform | SteamOS (Valve) | Horizon OS (Meta) | Android XR (Google) |
| Chipset | Qualcomm Snapdragon 8 Gen 3 | Qualcomm Snapdragon XR2 Gen 2 | Qualcomm Snapdragon XR2+ Gen 2 |
| RAM | 16GB | 8GB | 16GB |
| Strap | Soft + battery (modular) | Soft (modular) | Rigid plastic (fixed) |
| Face Pad | Upper face (enclosed) | Upper face (enclosed) | Forehead (open default) |
| Weight | 185g visor, 440g total | 397g visor, 515g total | 545g total |
| Battery | Rear pad | Internal | Tethered external |
| IPD | Manual (dial) | Manual (dial) | Automatic (motorized) |
| Hand Tracking | ✗ | ✓ | ✓ |
| Eye Tracking | ✓ | ✗ | ✓ |
| Face Tracking | ✗ | ✗ | ✓ |
| Torso & Arm Tracking | ✗ | ✓ | ✗ |
| Passthrough | Black & white, low resolution | Color, 4MP | Color, 6.5MP |
| IR Illuminators | ✓ | ✗ | ✗ |
| Active Depth Sensor | ✗ | ✗ | dToF |
| Wi-Fi | 7 (dual radios) | 6E | 7 |
| PC Wireless Adapter | Included (6GHz Wi-Fi 6E) | Discontinued (5GHz Wi-Fi 6) | ✗ |
| Default Store | Steam | Horizon Store | Google Play |
| Unlock | PIN | PIN | Iris |
| Data Ports | 1× USB-C (USB 2) + 2× MIPI / Gen 4 PCIe | 1× USB-C (USB 3.0) | 1× USB-C |
| Storage | 256GB / 1TB | 512GB | 256GB |
| MicroSD Slot | ✓ | ✗ | ✗ |
| Controllers | Steam Frame Controllers | Touch Plus | +$250 (sold separately) |
| Price | TBD | $500 (512GB) | $1800 (256GB) |

Spec Sheet & Index Comparison

And here's it compared to Valve Index, the company's now-discontinued tethered PC VR headset from 2019:

| | Valve Steam Frame | Valve Index |
|---|---|---|
| Standalone | ✓ | ✗ |
| Wireless | ✓ | ✗ |
| Lenses | Pancake | Fresnel |
| Displays | 2160×2160 LCD | 1440×1600 LCD |
| Refresh Rates | 72-120Hz (144Hz experimental) | 72-120Hz (144Hz experimental) |
| Tracking | Inside-out computer vision | Laser base stations |
| Strap | Soft + battery (modular) | Rigid plastic (fixed) |
| Weight | 185g visor, 440g total | 809g total |
| Eye Tracking | ✓ | ✗ |
| Data Ports | 1× USB-C (USB 2) + 2× MIPI / Gen 4 PCIe | 1× USB-A (USB 3.0) |
| Controllers | Steam Frame Controllers | Valve Index Controllers |
| Price | TBD | $1000 |

Steam Machine

While Steam Frame (of course) supports any gaming PC that can run SteamVR titles, Valve is also releasing its own desktop PC running SteamOS, which, as well as being able to act as a living room console, could make getting into PC VR a more streamlined experience than ever.

Steam Machine is more than 6 times more powerful than Steam Deck, Valve tells UploadVR, with a discrete CPU and GPU, not a unified APU architecture.

Steam Machine (image from Valve).

Here are the full specs of Steam Machine:

  • CPU: Semi-custom AMD Zen 4 6C / 12T
    • up to 4.8 GHz
    • 30W TDP
  • GPU: Semi-Custom AMD RDNA3 with 28 CUs
    • 110W TDP
    • 2.45GHz max sustained clock
    • 8GB GDDR6 VRAM
    • Ray tracing supported
  • RAM: 16GB DDR5
  • Storage: 512 GB & 2 TB SSD models + microSD card slot
  • Internal power supply, AC power 110-240V
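From those GPU numbers you can estimate peak FP32 compute, assuming the standard RDNA3 figure of 64 stream processors per CU and 2 FLOPs per cycle (FMA); RDNA3's dual-issue can double this in ideal cases, so treat the result as a floor:

```python
# Peak FP32 throughput estimate for the Steam Machine GPU specs above.
# Assumes 64 stream processors per CU and 2 FLOPs/cycle (fused multiply-add).
cus, sp_per_cu, clock_ghz = 28, 64, 2.45
tflops = cus * sp_per_cu * 2 * clock_ghz / 1000
print(f"~{tflops:.1f} TFLOPS FP32")  # ~8.8 TFLOPS
```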

The RAM and storage are user-upgradable, Valve confirmed, while the CPU and GPU are soldered on.

You'll "eventually" be able to wake Steam Machine via a Steam Frame without needing a physical display or other peripherals attached, though Valve couldn't say whether this functionality will be available at launch. When this does arrive, it means you'll be able to just grab your Steam Frame and jump straight into high-performance PC VR at any time in seconds, no need to manually boot up a PC.

"Aiming" For Cheaper Than Index

Valve isn't yet giving a specific price for Steam Frame or Steam Machine, saying that it doesn't yet know and referencing the volatility of the current macroeconomic environment.

The company did however tell UploadVR that it's aiming to sell Steam Frame for less than the $1000 Index full-kit.

"As soon as we know pricing, we'll be sharing", Valve said.

The soon-to-be Steam hardware family (image from Valve).

Steam Frame is set to launch in "early 2026", alongside the new Steam Machine and Steam Controller. It will be available in all the same countries where Steam Deck is sold today, and fully replaces Index in Valve's lineup.

If you're a developer, you can apply for early access to a Steam Frame kit today, though there are limited units available.

Steam Frame Hands-On: UploadVR’s Impressions Of Valve’s New Headset
UploadVR’s Ian Hamilton and David Heaney went hands-on with Steam Frame at Valve HQ, trying both standalone use and PC VR.
UploadVR, Ian Hamilton


Apple Now Sells The PS VR2 Sense Controllers For Use With Vision Pro

The PlayStation VR2 Sense Controllers are now sold by Apple, priced at $250, and the charging stand is included.

Apple added support for Sony's tracked controllers to Vision Pro headsets with visionOS 26, which released in September, but Sony itself doesn't sell them separately from its $400 VR headset.

visionOS support for the PS VR2 Sense controllers includes 6DoF positional tracking, capacitive finger touch detection, and basic vibration. The controllers' precision haptics are not supported, however, and nor are their adaptive triggers.

‘XR Is Having A Moment’: Colocated Pickleball With Pickle Pro From Resolution Games
“XR is having a moment” with colocation experiences like Pickle Pro from Resolution Games.
UploadVR, Ian Hamilton

One of the first Vision Pro games to support the PS VR2 Sense controllers was the indie title Ping Pong Club, which we tested when visionOS 26 launched.

And three weeks ago, Resolution Games launched a title leveraging the controllers called Pickle Pro, a pickleball game with both local and remote SharePlay, so you can play against other Vision Pro owners in the same room or remotely over the internet as Personas.


The PlayStation VR2 Sense Controllers are available on the online Apple Store in the US, priced at $250, with Sony's official charging stand included.

There's no word yet on availability outside the US.

Technically, PS VR2 headset owners who lose or damage both controllers could also buy the package from Apple instead of a new headset, though it would probably be a better idea to get used replacements on a marketplace like eBay instead.


Meta Opens LA Store To Sell Smart Glasses & Quest Headsets

Meta just opened a store in LA to demo and sell its smart glasses and Quest headsets.

Called Meta Lab, this is the company's second permanent store, joining the Burlingame store opened in 2022 right beside one of its main campuses.

Meta Opens First Store To Sell Quests, Portals, And Glasses
Meta’s campus in Burlingame, California is home to its first physical retail space where you can check out Quest 2 and its accessories as well as Ray-Ban Stories sunglasses and Portal video-calling devices. The store is officially open as of May 9 with interactive demos for “Beat Saber, GOLF+
UploadVR, Ian Hamilton

The Los Angeles store is located on Melrose Avenue. Meta describes it as its "flagship" retail location and says it spans 20,000 square feet, with multiple levels "specifically designed to highlight the features and benefits of our hardware".

Meta says its full hardware lineup is featured at the store, including Ray-Ban Meta Gen 2, Oakley Meta HSTN, Oakley Meta Vanguard, Meta Ray-Ban Display, Quest 3, and Quest 3S.

Meta Lab

Meta is also opening temporary "pop-up spaces" in New York and Las Vegas to demo its smart glasses:

  • The Vegas pop-up is relatively small, a 560 square foot space inside the Wynn, and opened last month.
  • The New York 5th Avenue pop-up will be much larger, at 5000 square feet, and is set to open "soon".
Meta Lab

Meta says it also plans to open a series of smart glasses "micro-stores" that may be similar to the hardware vending machines it had at Connect 2025. Snap tried something similar just under a decade ago for its original Spectacles smart glasses, but like the product itself, the vending machines didn't catch on.

All of this is in addition to the thousands of stores where Meta's smart glasses are already demoed and sold, thanks to its partnership with EssilorLuxottica, the owner of Ray-Ban and Oakley. But Meta's stores have the potential to include more technical staff who are aware of the intricate details of the devices, and they can include its Quest headsets too.


New Apple Immersive Video Puts You On The USS Nimitz As Super Hornets Launch

The 14-minute 'Flight Ready' Apple Immersive Video puts you on the flight deck of the USS Nimitz aircraft carrier as Super Hornets launch and land.

The USS Nimitz is the lead ship of the 10 Nimitz-class aircraft carriers of the US Navy, one of its 11 total current supercarriers. The Nimitz has been involved in the Iran hostage crisis, the Gulf of Sidra incident, the Gulf War, the Iraqi no-fly zones enforcement, and the 21st-century wars in Iraq and Afghanistan. In 2004 two of its Super Hornets reported that they encountered the now-famous "tic tac" UFO, a rapidly maneuvering white oblong flying object with no obvious means of propulsion.

What Is Apple Immersive Video?

The Apple Immersive Video format is 180° stereoscopic 3D video with 4K×4K per-eye resolution, 90FPS, high dynamic range (HDR), and spatial audio. It's typically served at a higher bitrate than many other immersive video platforms use.
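Assuming "4K×4K" means 4096×4096 per eye (Apple doesn't publish exact pixel dimensions), the raw pixel throughput of the format works out as follows:

```python
# Raw pixel throughput of the format; 4096×4096 per eye is an assumption.
w = h = 4096
eyes, fps = 2, 90
gigapixels_per_sec = w * h * eyes * fps / 1e9
print(f"~{gigapixels_per_sec:.1f} gigapixels/s")  # ~3.0 gigapixels/s
```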

We highly praised Apple Immersive Video in our Vision Pro review. It's not possible to cast or record Apple Immersive Video though, so you'll have to take our word for it unless you have access to a Vision Pro.

The new Flight Ready immersive documentary walks you through the flight deck of the USS Nimitz as it prepares for deployment, including the role of the pilots and crew on the deck itself and in the tower.

"The flight deck of a naval aircraft carrier is the most chaotic place on Earth", the narrator declares as the film opens.

Submerged Review: First Scripted Apple Immersive Video Sends Chills From Vision Pro
Submerged is a must-see immersive short film, available now on Apple Vision Pro.
UploadVR, Ian Hamilton

The video features many close-up immersive shots of Super Hornets launching and landing, as well as sweeping aerial views of the Nimitz sailing through the ocean as the jets fly by at low level.

It's an impressive use of the immersive video format that will appeal to any fan of military documentaries, and may induce some nostalgia for those who served at sea.

You can find Flight Ready in the Apple TV app, for free, exclusively on Apple Vision Pro.

New Apple Immersive Video Coming From Red Bull, Audi, BBC Proms, CNN & More
New Apple Immersive Video content is coming from Red Bull, Audi, BBC Proms CNN, and more, filmed with Blackmagic’s new camera and edited in DaVinci Resolve.
UploadVR, David Heaney


Meta Ray-Ban Display Review: First Generation Heads-Up Mobile Computing

Meta Ray-Ban Display is an early glimpse of a future where mobile computing doesn't mean looking down and taking something out of your pocket.

Most people never leave home without their phone, and take it out of their pocket so often that it's even become a replacement for fidgeting. The smartphone is so far the ultimate mobile computing device, an omnitool for communication, photography, navigation, gaming, and entertainment. It's also your alarm clock, calendar, music player, wallet, and flashlight. Globally, more people own a smartphone than TVs and cars combined. To get philosophical for a moment, the smartphone has become humanity's second cognitive organ.

The problem is that taking out your phone harshly disconnects you from the world around you. You have to crane your neck down and disengage from what you were otherwise doing, your attention consumed by the digital world of the little black rectangle.

Photograph by Margaret Burin of ABC News.

In recent years, multiple startups have tried and failed to solve this problem. The smug founders of Humane came to liberate you from your phone with a $700 jacket pin, while Rabbit r1 promised the "large action model" on its $200 pocket device could handle your daily life instead.

The truth, and the reason why these companies failed, is that most people adore their phones, and are borderline addicted to the immense value they provide. And the screen of the smartphone is a feature, not a bug. People love being able to view content in high-resolution color anywhere they go, and despite the cries of a small minority of dissenters, the models with the biggest screens sell the best.

"If you come at the king, you best not miss", as the phrase goes.

The only form factor that seems to have any real chance of one day truly replacing the smartphone is AR glasses, which could eventually provide even larger screens that effectively float in midair, anywhere the wearer wants, any time they want. But while prototypes exist, no one yet knows how to affordably produce wide field of view true AR glasses in a form factor that you'd want to wear all day. In the meantime, we're getting HUD glasses instead.

HUD glasses can't place virtual 3D objects into the real world, nor even 2D virtual interfaces. Instead, they provide a small display fixed somewhere in your vision. And in the case of many of the first-generation products, like Meta Ray-Ban Display, that display is only visible to one of your eyes.

Meta Ray-Ban Display is also highly reliant on your nearby phone for connectivity, so it isn't intended to be a replacement for it as a device. It is, however, meant to replace some of the usage of your phone, preventing the need to take it out of your pocket and keeping your head pointed up with your hands mostly free. So does it succeed? And is it a valuable addition to your life? I've been wearing it daily for around a month now to find out.

(UploadVR purchased Meta Ray-Ban Display at retail with our own funds, while Meta provided us with the correctly sized Meta Neural Band for review.)

Comfort & Form Factor

Unlike a VR headset that you might use at home or on a plane for a few hours, the pitch for smart glasses is that you can wear them all day, throughout your daily life. Even when they run out of battery, they can still act as your sunglasses or even prescription eyewear (for an extra $200 and weeks of waiting).

Meta Ray-Ban Display Prescription Lenses: What You Need To Know
Looking to use Meta Ray-Ban Display as your everyday prescription glasses? Here’s a rundown of what prescriptions it supports, and how that works.
UploadVR, David Heaney

As such, it's crucial that they have a design you'd be okay with wearing in public, and that they're comfortable enough to not hate having them on your face.

Meta Ray-Ban Display weighs 69 grams, compared to the 52 grams of the regular Ray-Ban Meta glasses, and 45 grams of the non-smart Ray-Ban equivalent. It's also noticeably bulkier, with thicker rims and far thicker temples.

Ray-Ban Meta vs Meta Ray-Ban Display vs Xreal One Pro

In my month with Meta Ray-Ban Display I've worn it almost every day throughout my daily life, sometimes for more than 8 hours at a time, and I experienced no real discomfort. The additional weight seems to be mostly in the temples, not the rims, while the nose pads are large and made out of a soft material. If anything, because the larger temples distribute the weight over a greater area and are more flexible, I think I even find Meta Ray-Ban Display slightly more comfortable than the regular Ray-Ban Meta glasses.

So, for my head at least, physical comfort is not an issue with Meta Ray-Ban Display. But what has been an issue is the social acceptability of its thick design.

With the regular Ray-Ban Meta glasses, people unfamiliar with them almost never clocked that I was wearing smart glasses. The temples are slightly thicker than usual, but the rims are essentially the same. It's only the camera that gave them away. With Meta Ray-Ban Display, it's apparent that I'm not wearing regular glasses. It's chunky, and everyone notices.

In some circles, thick-framed glasses are a bold but valid fashion choice. For most people, they look comically out of place. I've asked friends, loved ones, and acquaintances for their brutally honest opinions. Some compared it to looking like the glasses drawn on an archetypal "nerd" in an old cartoon, while only a few said that the look works because it matches current fashion trends. And my unit is the smaller of the two available sizes.

Ray-Ban Meta vs Meta Ray-Ban Display vs Xreal One Pro

Meta Ray-Ban Display also comes in two colors, 'Black' and 'Sand', and a further issue is that the black is glossy, not matte. I'm told this decision was made because glossy was the most popular finish for the regular Ray-Ban Meta glasses. But combined with the size, the glossy finish on Meta Ray-Ban Display makes it look cheap in a way that an $800 product really shouldn't, like a prop for a throwaway Halloween costume.

So Meta Ray-Ban Display is physically very comfortable, but not socially. More on that soon.

The Monocular Display

The fixed HUD in Meta Ray-Ban Display covers around 14 degrees of your vision horizontally and vertically (20 degrees diagonal). To get a rough sense of how wide that is, extend your right arm fully and turn just your hand 90 degrees inward, keeping the rest of your arm straight. For how tall, do the same but turn your hand upwards or downwards.

What you see within that 20 degrees is a clear and high detail image, though ever so slightly soft rather than fully sharp, with higher angular resolution than even Apple Vision Pro. There's a slight glare that sees, for example, icons mildly bleed into the empty space around them, but this is very minor, and not distracting.

More notably, the significant difference between a waveguide like this and the interfaces you might see in a mixed reality VR headset is that it's very translucent, with a ghostly feel. You can see the real world through it at all times.

Display System Specs

  • Display Type: Full-Color LCOS + Geometric Reflective Waveguide (Monocular)
  • Resolution: 600×600
  • Angular Resolution: 42 pixels per degree
  • Field Of View: 14°H × 14°V (20° D)
  • Peak Brightness: 5000 nits (automatically adjusts)
  • Frontal Light Leak: 2%
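Those specs are internally consistent, as a quick check confirms:

```python
import math

# Consistency check on the display specs listed above.
h_fov = v_fov = 14.0                         # degrees
diag = math.hypot(h_fov, v_fov)              # diagonal, small-angle approximation
print(f"Diagonal FOV: ~{diag:.1f} degrees")  # ~19.8, quoted as 20 degrees

ppd = 600 / h_fov                            # pixels divided by degrees
print(f"Angular resolution: ~{ppd:.0f} pixels per degree")  # ~43, quoted as 42
```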

The display's perceived opacity and brightness are highly variable, though, because with waveguides both depend on the environmental light level, and the system also rapidly and automatically adjusts display brightness using the ambient light sensor. You can manually adjust the brightness if you want, but the system is very good at choosing the appropriate level at all times, so I never do.

With a 5000 nit maximum, it's even visible in daytime sunlight, though it has a very ghostly translucent feel. As the photochromic lenses transition to dark, the perceived opacity slightly increases.
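The mapping Meta actually uses isn't public, but a log-scale curve from ambient lux to display nits, capped at the 5,000-nit peak, is the kind of thing such a system might do; a purely hypothetical sketch:

```python
import math

# Hypothetical auto-brightness curve: ambient light (lux) -> display nits,
# interpolated on a log scale and capped at the 5,000-nit peak above.
# Meta's real mapping is not public; this is purely illustrative.
def auto_brightness(ambient_lux, max_nits=5000, min_nits=50):
    if ambient_lux <= 1:
        return min_nits
    # log interpolation between ~1 lux (dim room) and ~100k lux (sunlight)
    t = min(math.log10(ambient_lux) / 5, 1.0)
    return min_nits + t * (max_nits - min_nits)

for lux in (10, 1000, 100_000):
    print(lux, round(auto_brightness(lux)))  # 1040, 3020, 5000
```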

One notable quirk is that because an LCOS is essentially (to greatly simplify things) an LCD microdisplay, if you're in a very dark environment you'll see a faint glow throughout the display area when it's on, like an LCD monitor trying to show black. But in anything above almost pitch black, you won't see this.

So Meta Ray-Ban Display's display is surprisingly good, and lacks the distracting visual artifacts seen in many of the waveguide AR headsets of the 2010s. But there is a massive, glaring problem: it's only visible to your right eye.


No through-the-lens approach accurately depicts what the HUD looks like, so here's Meta's generic marketing clip instead.

Meta Ray-Ban Display is a monocular device. Your left eye sees nothing at all.

Other than your nose, which your visual system is hardwired to understand, there is no analog in nature for one eye seeing something the other doesn't. It just feels wrong, and induces a constant minor feeling of eyestrain when I look at the display for more than a few seconds.

I can put up with it for a few seconds at a time, and have gotten slightly more used to it over time, but I would never want to watch a video or conduct a video call like this. I've also put the glasses on more than a dozen people now, and while some of them could just about tolerate the monocular display, others found it hurt their eyes within seconds.

I suspect that this is a core reason why Meta Ray-Ban Display is only available to buy after a retail demo. This just isn't a visually comfortable product for many people, and Meta likely wants to avoid mass returns.

Bloomberg's Mark Gurman and supply-chain analyst Ming-Chi Kuo have both claimed that Meta plans to retire Meta Ray-Ban Display in 2027 upon launching a binocular successor, with significantly ramped up marketing, production, and availability. By closing my left eye, I can already get a pretty good feel for just how much more visually comfortable the next generation could be.

No Light Leak! But Is That A Good Thing?

Almost all of the brightness of Meta Ray-Ban Display stays on your side of the glasses – 98% according to Meta. The display is not visible to people looking at you. I've repeatedly asked friends whether they can even tell if I have the display on or off, and none have been able to so far. The clickbait YouTube thumbnails you may have seen are fake.

This is partially due to the low "light leak" of the geometric waveguide in Meta Ray-Ban Display, but it's also because of the automatic brightness adjustment. If you manually turn up the brightness to an uncomfortably high level, which I can't imagine anyone intentionally doing, you can make the display slightly visible externally, though not its content (it just looks like a scrambled pattern). But again, even this requires an adjustment that no regular user would reasonably make.

All this said, while I initially assumed that the low light leak was an important feature of Meta Ray-Ban Display, I've come to see nearby people's inability to tell whether you're looking at the HUD as somewhat of a bug.

When you're with another person and take out your phone, that's an unambiguous indicator that you're diverting attention from them. Similarly, while Apple Vision Pro shows a rendered view of your eyes, if virtual content is occluding the person you're looking at, Apple intentionally renders an occluding pattern. Why? To clearly signal that you're looking at virtual content, letting the person know when they do or don't have your full attention.

When someone wearing an Apple Vision Pro is looking at virtual content that partially occludes you, you'll see a pattern on the display in front of their rendered eyes (center image above). With Meta Ray-Ban Display, you don't know whether the wearer is looking at the HUD or you.

With Meta Ray-Ban Display, there is no such signal. People who spend a lot of time with you can eventually figure out that you're looking at the HUD when your eyes are looking slightly down and to the right, but it's far more ambiguous, and this is not conducive to social acceptability. Are you fully present with them or are you not? They can't clearly tell.

And the worst case scenario is, when looking at the HUD, to a person sitting in front of you it can, in some specific circumstances, appear as if you're just looking at their chest. Yikes.

I'm not saying that I want other people to be able to see the content of my display, as that would be a terrible privacy flaw. But I do wish there was an external glow on the lens when the display is on. I don't want the other person to have to guess whether I'm fully present or not, and whether I'm looking at the HUD or their body.

The Interface & Meta Neural Band

Like the regular Ray-Ban Meta glasses, you can control Meta Ray-Ban Display with Meta AI by using your voice, or use the button and touchpad on the side for basic controls like capturing images or videos and playing or pausing music. But unlike any other smart glasses to date, it also comes with an sEMG wristband in the box, Meta Neural Band.

In its current form, Meta Neural Band is set up to detect five gestures:

  • Thumb to middle finger pinch: double tap to toggle the display on/off, single tap to go back to the system menu, or hold for quick shortcuts to the 3 menu tabs.
  • Thumb to index finger pinch: how you "click".
  • Thumb to side of index finger double tap: invoke Meta AI.
  • Thumb swiping against the side of your index finger, like a virtual d-pad, which is how you scroll.
  • Thumb to index finger pinch & twist: to adjust volume or camera zoom, as you would a physical volume knob.

How Does Meta Neural Band Work?

Meta Neural Band works by sensing the activation of the muscles in your wrist which drive your finger movements, a technique called surface electromyography (sEMG).

sEMG enables precise finger tracking with very little power draw, and without the need to be in view of a camera.

The wristband has an IPX7 water resistance rating, and charges with an included proprietary magnetic contact pin charger.

While in the above clips I have my arms extended to illustrate the gestures, the beauty of sEMG is that you don't need to. Your hand can be at your side, resting on your leg, or even in your pocket. And it works even in complete darkness.
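For a sense of the signal processing underneath, here's a toy sketch of the classic sEMG detection chain (rectify, smooth into an envelope, threshold); Meta's actual decoding uses learned models over multiple electrode channels, so this is purely illustrative:

```python
import random

# Toy sEMG processing chain: full-wave rectify the raw signal, smooth it
# into an amplitude envelope, then threshold to detect muscle activation.
# Real wristband gesture decoding is far more sophisticated; this only
# illustrates the underlying signal-processing idea.
def emg_envelope(signal, window=50):
    rectified = [abs(x) for x in signal]  # full-wave rectification
    # causal moving average (early samples see a partial window)
    return [sum(rectified[max(0, i - window + 1):i + 1]) / window
            for i in range(len(rectified))]

rng = random.Random(0)
rest = [rng.gauss(0, 0.05) for _ in range(500)]   # baseline noise
burst = [rng.gauss(0, 1.0) for _ in range(500)]   # simulated activation
envelope = emg_envelope(rest + burst)

THRESHOLD = 0.2                                    # simple activation threshold
print("activation detected:", any(e > THRESHOLD for e in envelope))  # True
```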

The gesture recognition is almost flawless, with close to 100% accuracy. The one exception is that rarely, if I'm walking it will fail to pick up a sideways directional swipe. But in general Meta Neural Band works incredibly well. The volume adjustment gesture, for example, where you pinch and twist an imaginary knob, feels like magic.

And whether I'm washing my hands, eating food, or driving a car, I have never had the display wake by accident, as the middle-finger double-tap gesture only triggers when I intentionally make it. The only accidental activation I've encountered is that sometimes, if I'm tapping my phone with my thumb, Meta AI will trigger. It's hard to imagine solving this without the low-level access to the phone OS that only companies like Apple and Google have.


Wake (top left), Scroll (top right), Click (bottom left), and Volume/Zoom (bottom right)

The Meta Neural Band gestures control the HUD interface, which looks much like that of a smart watch. It has 3 tabs, which you horizontally scroll between:

  • The center home tab (the default) shows the date and time, your notifications, and a Meta AI button, with tiny shortcuts to active audio or navigation at the top.
  • The right tab is the two-column applet library: WhatsApp, Instagram, Messenger, Messages, Calls, Camera, Music, Photos, Captions, Maps, Tutorials, and a game called Hypertrail. Four rows are shown at a time, and you navigate vertically to see the rest.
  • The left tab features quick controls and settings like volume, brightness, Do Not Disturb, as well as shortcuts to Captions, Camera, and Music.

When I first used Meta Ray-Ban Display I found the interface to be "just too much", often requiring too many sequential gestures to do what you want. While I still broadly hold this view a month later, I have found myself becoming more used to quickly performing the correct sequence with experience, and I've discovered that if you pinch and hold your thumb to your middle finger, you get shortcuts to the three main menu tabs, which can speed things up.

I still think there's plenty of room for improvement in Meta Ray-Ban Display's interface from a simplicity perspective, though, and repeat my assertion that the menu should have two tabs, not three. The Meta AI finger gesture makes the Meta AI button on the center tab redundant, for example, and when you don't have any notifications, the home tab feels like a waste of space.


This clip from Meta shows the 3 tabs of the system interface, and how you swipe between them.

A lot of the friction here will eventually be solved with the integration of eye tracking. Instead of needing to swipe around menus, you'll be able to just look at what you want and pinch, akin to the advantages of a touchscreen over arrow keys on a phone. But for now, it can sometimes feel like using MP3 players before the iPod, or smartphones before the iPhone. sEMG is obviously going to be a huge part of the future of computing. But I strongly suspect it will only be one half of the interaction answer, with eye tracking making the whole.

A major improvement since the Meta Connect demo, though, is performance. While I still wouldn't describe Meta Ray-Ban Display as very snappy, the abject interface lag I encountered at Connect is gone in the shipping consumer model. The most noticeable delay is in waking the display, which often takes a second or two after the gesture.

Meta Neural Band size comparison with Fitbit Luxe.

Coming back to Meta Neural Band for a second, the only real problem with it is that it's something else you need to wear and charge (with a proprietary cable). I always wear a Fitbit on my left wrist, and now I wear the Meta Neural Band on my right too.

That's not to say that Meta Neural Band is a comfort burden. I find it no more or less comfortable than I did the Pixel Watch I used to own, and having tried a Whoop it feels similar to that too. And while it does leave a minor mark on my wrist, so do the straps of those other devices.

But it's another thing to remember to put on charge at night, another cable to remember to bring when traveling, and another USB-C port needed at my bedside.

Ideally, I should only have to wear and charge one wrist device. But today, Meta Neural Band is solely an sEMG input device, and nothing more.

Traveling means bringing the proprietary wristband charging cable with you.

The band already has an accelerometer inside, so in theory, a software update could let it track your daily step count. And if a future version could add a heart rate sensor for fitness, health, and sleep tracking, I wouldn't need my Fitbit at all anymore. But we're not there yet. And wearing a second wrist device just for input is a big ask for any product.

Features & Use Cases

There are six primary use cases of Meta Ray-Ban Display. It's a camera, a communications device, an on-foot GPS navigator, an assistive captions and translations tool, a personal audio player, and an AI assistant that can (if you want) see what you see.

So how well does it do each of these things?

Capturing Photos & Videos

When the regular Ray-Ban Meta glasses first launched, they were primarily pitched as camera glasses, like their predecessor the Ray-Ban Stories, and this remains one of the biggest use cases even as Meta now calls the product category "AI glasses". But without a display, you didn't get a preview of what you were capturing, nor could you check whether the result was any good until it synced to the phone app. Sometimes you got lucky; other times you failed to even frame your subjects, an issue not helped by the fact that the camera on almost all smart glasses sits on a temple rather than centered.

Meta depiction of photography.

Meta Ray-Ban Display also has 32GB of storage for media, and syncs everything to your phone via Wi-Fi 6 when you open the Meta AI app. But the experience of capturing media is fundamentally better, because you get a live visual preview of exactly what's in frame. And with the Meta Neural Band, you can capture without raising your arm, as well as adjust the zoom on the fly by pinching your index finger to your thumb and twisting, the same gesture used to adjust the volume.

The camera quality isn't as good as your phone, given that the sensor has to fit into the temple of glasses. It appears to be the same camera as the regular Ray-Ban Meta glasses, and while it can produce great results in daytime, in low-light environments or with heavy zoom you'll get a relatively grainy output. You can see some sample shots and clips in my colleague Ian Hamilton's launch-week impressions piece.

Hands-On With Meta Ray-Ban Display & Meta Neural Band Across New York City
UploadVR’s Ian Hamilton bought the Meta Ray-Ban Display glasses on launch day and tested them across Manhattan.
UploadVRIan Hamilton

Regardless, the ability to capture precisely framed shots without needing to hold up a smartphone is Meta Ray-Ban Display at its best. It lets you stay in the moment and capture memories at the same time, without wondering whether what you've shot actually includes what you wanted. And it works completely standalone, even if your phone is out of battery.

With a couple of swipes and taps you can also easily send your captured media to a friend, something that required extensive voice commands with the displayless glasses. And this brings us on to messaging.

Messaging

While Meta doesn't have an existing device ecosystem like Apple and Google, what it does have is the world's most popular messaging platform and two of its most popular social networks, all three of which are integrated on Meta Ray-Ban Display.

You can opt to have incoming WhatsApp, Messenger, Instagram, and text messages pop up on the display, and so without needing to look down at a smartwatch or take your phone out of your pocket you can "screen" them to decide which are important enough to respond to immediately. Tap your thumb to your middle finger to dismiss a message, or to your index finger to open it.

The notification appears at the very bottom of the display area, which is already slightly below your sightline, so it doesn't block your view of what's in front of you. And of course, unlike with a phone, no one around you can see it. There's also a setting to automatically detect when you're in a moving vehicle, so if you start driving a car you won't be interrupted.

Meta depiction of messaging.

If you want to respond to a message, there are four options in the interface: dictate, voice note, suggested emoji, or suggested reply.

I don't send voice notes, and I don't find the suggested emojis or replies useful, but I do love the dictation. It's powered by an on-device speech recognition model with relatively low latency and surprisingly great accuracy, in my experience. Even with my Northern Irish accent that some other systems (and people) can find difficult to understand, I'm able to dictate full responses without needing to type. And what's most impressive is that it works even when you speak quietly, thanks to the six-microphone array that includes a dedicated contact mic positioned just a few inches above your lips. That's yet another advantage of the glasses form factor.

Still, there are plenty of messages that I wouldn't want to dictate even softly in public, and situations when I want to use combinations of punctuation and letters that don't have a spoken form. Meta plans to release a software update in December that will let you enter text by finger-tracing letters on a physical surface, such as your leg, bringing some of its more advanced sEMG research out of the lab and into the product. It sounds straight out of science fiction, and we'll bring you impressions of it when the update rolls out. But it's not there today.

What About iMessage?

If you're an iPhone user, you're probably wondering whether all this works with iMessage.

I use an Android phone, from which Meta Ray-Ban Display can receive and send both 1-on-1 and group text messages, including both SMS and RCS.

For iPhone, I'm told, you can receive and send only 1-on-1 iMessages, with no support for group threads. And this support is limited to pop-up notifications only: you won't see a 'Messages' applet in the list.

These limitations, to be clear, are imposed by Apple, and Meta would gladly support the missing features if Apple let it.

As well as receiving new messages as pop-up notifications, you can also access your 10 most recent threads on each messaging platform at any time by swiping to it on the apps list. In the apps, you can scroll through past messages and send new ones, including your captured photos and videos stored on the glasses.

The problem with accessing past message threads, and with viewing photos and videos you've been sent, is that they're incredibly slow to load. Open WhatsApp on your phone and you'll typically see your messages update within a fraction of a second. On Meta Ray-Ban Display, for upwards of 10 seconds you'll see everything as it was the last time the applet was opened, while media can take over a minute, or sometimes never loads at all. And the interface is badly missing any kind of loading progress bar.

So while in theory you can use Meta Ray-Ban Display to catch up on the dozens of Reels that one unemployed friend sends you all day, assuming your eyes can put up with the monocular display for that long, in practice I often found it faster to take my phone out of my pocket and open the real app, just as I did while waiting for fast-moving group chats to finally load.

The cause of this slowness seems to be that Meta Ray-Ban Display is entirely reliant on your phone's Bluetooth for internet connectivity. This connection speed issue is something I ran into repeatedly on Meta Ray-Ban Display, and made me wish it had its own cellular connection. In fact, I'm increasingly convinced that cellular will be a hard requirement for successful HUD and AR glasses.

Audio & Video Calls

For over two years now, I've made and taken almost every phone call, both personal and professional, on smart glasses. I hate in-ear and over-ear audio devices for calls because it feels unnatural to not hear my own voice clearly, and the people on the other end love the output of the optimally-positioned microphone array.

With Meta Ray-Ban Display, the addition of the HUD and wristband lets you see how long a call has gone on for, and end it without having to raise your hand. You can also view and call recently called numbers in the Calls applet.

Meta depiction of video calling.

On the regular Ray-Ban Meta glasses you can also share your first-person view in a video call, and the big new calling feature of Meta Ray-Ban Display is that you can now see the other person too.

But again, this great-in-theory video calling feature is ruined by the fact that the internet connection is routed to Meta Ray-Ban Display via your phone's Bluetooth. Even when both my phone and the person I was calling had very strong internet connections, the view on both sides was pixelated and stuttered constantly. Bluetooth just isn't meant for this.

On-Foot Navigation

One of the features I've most wanted HUD glasses (and eventually AR glasses) for is pedestrian navigation. There are few things I hate more in this world than arriving in a new city and having to constantly look down at my phone or watch while walking, almost bumping into people and poles. Worse, in cities with dense skyscrapers, the GPS accuracy degrades to dozens of meters, and I hopelessly watch the little blue dot on my phone bounce around the neighborhood.

In theory, Meta Ray-Ban Display solves at least the first problem, but with a massive caveat you absolutely must be aware of if you're thinking of buying it for this use case.

Meta depiction of navigation.

You can open the Maps applet anywhere, and you'll see a minimap (powered by OpenStreetMap and Overture) with nearby venues and landmarks. You can zoom all the way in to the street level, or out to the city level, using the same pinch-and-twist gesture used for volume control. You can also search for places using speech recognition, and scroll through the results by swiping your fingers.

The problem is that outside the 28 cities Meta explicitly supports, you can't initiate navigation at all. Instead, you just get an option to send the route to your phone, where you can tap to open it in Google Maps, defeating the purpose. It's a rare product where core functionality is geofenced to a handful of cities.

I have been able to use Meta's navigation feature multiple times when visiting London, and found it genuinely very useful when in New York for the Samsung Galaxy XR launch event. I love the idea here, and where it works, the implementation isn't bad. Not having to look down to navigate is exactly what I wanted. But it works in so few places, relative to my life at least, that I just can't stop wishing it had Google Maps. And it's hard to imagine not switching to whatever HUD glasses have Google Maps first.

The Supported Cities

USA
• Atlanta, Georgia
• Austin, Texas
• Boston, Massachusetts
• Chicago, Illinois
• Dallas, Texas
• Fort Worth, Texas
• Houston, Texas
• Los Angeles, California
• Miami, Florida
• New York City, New York
• Orlando, Florida
• Philadelphia, Pennsylvania
• Phoenix, Arizona
• San Antonio, Texas
• San Diego, California
• San Francisco, California
• San Jose, California
• Seattle, Washington
• Washington D.C.

Canada
• Toronto, Canada
• Montreal, Canada
• Vancouver, Canada

UK
• London, UK
• Manchester, UK

France
• Paris, France

Italy
• Rome, Italy
• Milan, Italy
• Naples, Italy 

It's baffling that Meta decided to roll its own navigation system. It may be the right bet in the long term, but in the short term it compromises the product and leaves the goal wide open for Google to deliver a significantly better experience. Meta already has a wide-ranging partnership with Microsoft – why didn't it license Bing Maps for navigation? Or why not acquire a provider like TomTom?

It somewhat reminds me of the Apple Maps launch debacle, except in a parallel universe where it arrived alongside a first-generation iPhone that didn't have Google Maps.

As for the other problem with navigating in cities, the GPS issue, Meta Ray-Ban Display offers no solution. In theory, Meta could leverage the camera to refine position and orientation, a technique often called VPS (visual positioning system), but it doesn't today, and doing so would likely require the company to build up a huge imagery dataset similar to Google Street View.

Meta AI & Its Dedicated Gesture

Meta has been marketing its smart glasses as "AI glasses" for some time now, riding the wave of hype that has for better and for worse almost entirely taken over the tech industry.

I didn't use Meta AI often on the regular Ray-Ban Meta glasses because I hate saying "Hey Meta", just as I hate saying "Hey Google", especially in public. With Meta Ray-Ban Display, you can still invoke the AI that way if you want, but there's also a dedicated gesture: just slightly curl your fingers inward and double-tap the side of your index finger with your thumb.

There's something very satisfying about this gesture. It feels more natural than the pinches used for everything else. And it has me using Meta AI far more often than I would if I had to keep saying "Hey Meta".

With the display, you also get visual aids in responses. Ask about the weather in a place, for example, and you'll see a five-day forecast appear. For most other queries, you'll see the response appear as text. In some situations I wish I could disable the voice response and just get the text; temporarily reducing the volume to zero works, but it isn't a very elegant solution.

The real problem with Meta AI is that it's still Meta AI. It just isn't as advanced as OpenAI's GPT-5 or Google's Gemini 2.5, sometimes failing at queries where it needs to make a cognitive leap based on context, and fundamentally lacking the ability to think before it responds for complex requests.

Occasionally, I've run into situations where I wanted the convenience of asking advanced AI about something without taking out my phone, but ended up doing so after Meta AI just couldn't get the answer right. Ideally, I'd be able to say "Hey Meta, ask [ChatGPT/Gemini]". But that's not supported.

This is part of the reason Mark Zuckerberg is spending billions of dollars acquiring top AI talent for Meta Superintelligence Labs.

Audio Playback & Control

The primary way I use the regular Ray-Ban Meta glasses is for listening to podcasts and audiobooks. Smart glasses speakers don't do music justice, but they're great for spoken word content, and having your ears fully open to the real world is ideal.

What's different on Meta Ray-Ban Display is that you can far more easily and precisely adjust the volume. Rather than needing to raise your arm up to your head and awkwardly swipe your finger along the temple, you can just wake the display, pinch and hold your index finger to your thumb, and twist. It's a satisfying gesture that feels natural and precise.

The HUD also shows a thumbnail for the content you're playing, as well as how far into it you are and how long is left. It's a nice addition, and one less reason to take out my phone.

Live Captions & Translation

For accessibility, one of the biggest marketed features of Meta Ray-Ban Display is Live Captions.

The real-time speech transcription has decent accuracy, with the exception of niche proper nouns, and fairly low latency, always giving you at least the gist of what the person you're looking at is saying. And yes, I do just mean the person you're looking at. It's remarkable how well the system ignores any audio not in front of you, leveraging the microphone array to cancel it out. Look away from someone and the captions will stop. Look back and they'll continue. It really does work.

Meta depiction of live captions.

One swipe over from the virtual button that starts Live Captions is the one that starts Live Translation, and this feature blew me away. Having a friend speak Spanish and seeing an English translation of what they're saying feels like magic, and I could see it being immensely useful when traveling abroad. For the other person to understand you, by the way, you just hand them your phone with the Meta AI app open, and it shows them what you're saying, in their language. Brilliant.

Yes, you can do live translation with just a phone, or audio-only devices like AirPods and Pixel Buds.

Unfortunately, it only supports English, French, Spanish, and Italian. This is another example of where Google's services, namely Google Translate, would be immensely valuable.

The bigger problem with both Live Captions and Live Translation is that because the display sits slightly below and to the right of your sightline, you can't look directly at someone while using them. It's far better than having to look down at a phone, but ideally I'd want the text to appear centered and much higher, so I could appear to keep eye contact while seemingly magically understanding what they're saying. That would require different hardware, though.

The Big Missing Feature

While I'm glad that Meta Ray-Ban Display lets me decide whether it's worth taking my phone out of my pocket for inbound personal messages, what I want the most out of HUD glasses is the ability to screen my emails and Slack messages.

There is no applet store on Meta Ray-Ban Display, and Meta's upcoming SDK for phone apps to access its smart glasses won't support sending imagery to the HUD. For the foreseeable future, any new "apps" will have to come directly from Meta, and I call them "applets" because each really only does one thing and is essentially part of the OS.

(The company says it plans to add two new apps soon, a Teleprompter and a dedicated IG Reels experience.)

So a Slack applet won't be happening anytime soon. But there's a far easier way I could get what I want here.

Many smartwatches (yes, including third-party ones on iPhone) already let you view your phone notifications, so it's definitely technically possible. I asked Meta why it doesn't do this for Meta Ray-Ban Display, and the company told me that it came down to wanting to not overwhelm the user. I don't understand this answer though, since as with smartwatches, Meta could let you select exactly which apps you do and don't want notifications from, just as you can already for WhatsApp, Messenger, Instagram, and texts.

If I had this feature, letting me screen Slack notifications and emails, I would take my phone out of my pocket far less often. If any similar pair of glasses arrived with this feature, I'd switch over immediately.

The Case Folds Down & Has Huge Potential

Just like with AirPods, for smart glasses the included battery case is almost as important as the device itself. Meta Ray-Ban Display has an official "mixed use" battery life of 6 hours, which I've found to be fairly accurate, while the case provides 4 full charges, for a total of 30 hours of use before anything needs to be plugged in.

The problem with the regular Ray-Ban and Oakley Meta glasses cases is that they're far too bulky to fit in almost any jacket pocket. The Meta Ray-Ban Display case, however, has an elegant solution, likely inspired by what Snap did in 2019 for the Spectacles 3.

How Meta Ray-Ban Display's case folds when you're wearing the glasses.

When containing the glasses, the case is a triangular prism. But when empty, it folds down into something with very similar dimensions to a typical smartphone, just slightly taller and less wide. This means it not only fits in most jacket pockets, but even fits into my jeans pocket. So when I'm dressed relatively light and using Meta Ray-Ban Display as my sunglasses, for example, I can keep the case in my pocket while on the move and put the glasses back in it when in the shade. The glasses, case, and wristband together are the product, and the folding design makes the whole system significantly more portable.

In fact, the glasses and case make such a great pair that I'm eager for the case to do more than just act as a battery and container.

When I need to take the glasses off to wash my face, take a shower, or just let them charge, docking them in the case should turn the duo into a smart speaker, letting me continue to listen to audio, take calls, and prompt Meta AI as I would with Alexa on an Amazon Echo. This could be achieved with no extra hardware, but ideally Meta would add a speaker to the case.

When folded, the case (center) is flat enough to fit in a pocket.

It also seems like the battery case could be the perfect way to bring cellular connectivity to a future version of Meta Ray-Ban Display without nuking the battery life of the glasses. The case would send and receive cellular signals, and relay the data to the glasses via Bluetooth for low-bandwidth tasks and Wi-Fi for high-bandwidth.

Interestingly, Meta has a partnership with Verizon to soon sell Meta Ray-Ban Display in stores. Could this evolve into a cellular partnership for the next generation?

Conclusions: Is It Worth $800?

Meta Ray-Ban Display is very much a first-generation product, an early attempt at a category that I'm convinced, after a month of testing, could eventually become a key part of the lives of at least hundreds of millions of people, if not billions.

In its current form, it's an amazing way to capture photos and videos without leaving the moment, and a great way to decide whether WhatsApp messages are worth your time without taking out your phone. In the cities where it's supported, the on-foot navigation is genuinely useful, and for the languages supported, the live translation feature could change what it means to travel. Like displayless smart glasses, it's also a great way to listen to spoken audio and take calls.

But the monocular display just doesn't feel right, too much of what Meta is trying to do is hampered by the device's lack of cellular connectivity, and the lack of established services like Gmail, Google Maps and Google Translate makes Meta Ray-Ban Display so much less useful than HUD glasses theoretically could be.

Further, while the Meta Neural Band works incredibly well, future versions need to replicate the functionality of smartwatches, instead of asking buyers to justify wearing yet another device on their wrist.

If you're an early adopter who loves having the latest technology and you don't mind looking odd in public, can live with the flaws I've outlined, and are fine with wearing a dedicated input device on your wrist, there's nothing else quite like Meta Ray-Ban Display, and the novelty could make up for the issues.

For everyone else, I recommend waiting for future generations of HUD glasses, ideally with binocular displays and either cellular connectivity or a seamless automatic phone Wi-Fi sharing system that I suspect only Apple and Google, the makers of your phone's OS, can pull off.

Just like with its Quest headsets, Meta is set to see fierce competition from the mobile platform incumbents in this space, and it'll be fascinating to see how the company responds and evolves its products through the rest of this decade and beyond.

Appreciate our reporting? Consider becoming an UploadVR Member or Patron.


Meta Launches $1.5 Million Competition For New Quest Apps & Significant Updates

The Meta Horizon Start Developer Competition 2025 will award new Quest apps and "significant" updates a total of $1.5 million in 32 prizes of up to $100,000.

Meta Horizon Start, originally called Oculus Start years ago, is a program run by Meta that gives VR/MR developers direct access to Meta developer relations staff as well as a community Discord and "exclusive Meta events, advanced technical education, community mentorship, software credits, go-to-market guidance, and more".

Now, Meta is running a competition for Horizon Start Program developers to build or significantly update Quest apps across entertainment, "lifestyle", and gaming. Here's the list of the main awards and prizes:

  • Best Entertainment Experience: "An experience that makes consuming or watching content more immersive, interactive, and innovative than traditional formats."
    • $100,000 winner
    • $60,000 runner-up
    • $30,000 honorable mention
  • Best Lifestyle Experience: "An experience that enhances peoples’ daily lives, how they get things done, learn new skills, or connect with others around shared interests."
    • $100,000 winner
    • $60,000 runner-up
    • $30,000 honorable mention
  • Best Social Game: "A game that connects people in real time to play together online or in a colocated space."
    • $100,000 winner
    • $60,000 runner-up
    • $30,000 honorable mention
  • Best Casual Game: "A game that is accessible, single-player, and designed for quick, engaging fun."
    • $100,000 winner
    • $60,000 runner-up
    • $30,000 honorable mention
  • Judge’s Choice: "This award is given to experiences that push the boundaries of what’s possible and have created something truly unique and innovative."
    • $30,000 each for 6 winners

Additionally, there are eight "special awards" for implementing specific features or using specific SDKs and toolkits:

  • Best Implementation of Hand Interactions
    • $50,000 each for 3 winners
  • Best Use of Passthrough Camera Access with AI
    • $50,000 each for 3 winners
  • Best Immersive Experience Built with Spatial SDK
    • $50,000 each for 2 winners
  • Best Immersive Experience Built with Immersive Web SDK
    • $30,000 each for 2 winners
  • Best Android App Leveraging Features Unique to Meta Quest
    • $25,000 winner
  • Best Android Utility App
    • $25,000 winner
  • Best Android App for Travel Mode
    • $25,000 winner
  • Best Experience Built with React Native
    • $25,000 winner
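
As a quick sanity check, the advertised headline figures do match the list above; a short tally confirms the $1.5 million total and the 32-prize count:

```python
# Tally of the prize structure listed above.
main = [100_000, 60_000, 30_000] * 4   # four main categories: winner, runner-up, honorable mention
judges_choice = [30_000] * 6           # Judge's Choice, six winners
special = ([50_000] * 3                # hand interactions
           + [50_000] * 3              # passthrough camera access with AI
           + [50_000] * 2              # Spatial SDK
           + [30_000] * 2              # Immersive Web SDK
           + [25_000] * 4)             # the four single-winner awards

prizes = main + judges_choice + special
print(len(prizes), sum(prizes))  # 32 prizes, $1,500,000 total
```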

Entries can be entirely new projects or "significant" updates to an existing Quest app, with Meta citing adding hand tracking support, mixed reality, or multiplayer as examples of "significant". Essentially, to qualify, an update needs to bring a new modality.

The new competition comes late in a year in which Meta has awarded the creators of smartphone-focused Horizon Worlds a total of $3.5 million across three competitions, and it may allay some concerns that the company is focused solely on Horizon Worlds, with no further interest in Quest apps.

The deadline for submitting a project for consideration is December 9, and interested developers can enter the competition at this URL. If you're not already a Meta Horizon Start member, you'll need to apply first.


Cambridge & Meta Researchers Confirm "Retinal" Resolution Is Far Higher Than 60 PPD

Cambridge and Meta researchers conducted a study confirming that "retinal" resolution is far higher than the 60 pixels per degree figure often cited.

While you'll usually see only the panel resolution of a headset mentioned on its spec sheet, what really matters is its angular resolution, or how many pixels occupy each degree of the field of view: the pixels per degree (PPD). For an extreme example, if two headsets used the exact same panels but one had a field of view twice as wide, it would have half the angular resolution.
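
That arithmetic can be sketched in a few lines. The panel width and field-of-view numbers below are made up for illustration, not any specific headset's specs:

```python
# Average angular resolution: pixels per degree (PPD) across the
# horizontal field of view. Real lenses distort this (PPD is usually
# higher in the center), so treat these as rough averages.

def avg_ppd(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    return horizontal_pixels / horizontal_fov_deg

# Hypothetical example: the same 2000-pixel-wide panel behind two
# different optical designs.
narrow = avg_ppd(2000, 80)   # 25.0 PPD
wide = avg_ppd(2000, 160)    # 12.5 PPD: double the FOV, half the PPD
```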

Since Oculus widely demoed the DK1 over a decade ago, we've seen the angular resolution of affordable headsets advance from 6 PPD, an acuity that would classify a person as legally blind, to now 25 PPD, while higher-end headsets like Apple Vision Pro and Samsung Galaxy XR reach around 35 PPD, and Varjo XR-4 even achieves 51 PPD in the center. But what's the limit past which the human eye can no longer discern a difference?

Meta Tiramisu “Hyperrealistic VR” Hands-On: A Stunning Window Into Another World
We also went hands-on with Tiramisu, Meta’s prototype that combines beyond-retinal resolution, high brightness, and high contrast.
UploadVRDavid Heaney

In the XR industry, people often say that it's "generally accepted" that the limit is 60 PPD, since on paper that figure corresponds to 20/20 vision. Meta's Butterscotch prototype from a few years ago, with 56 PPD, was described as "near-retinal", for example. However, there has long been significant skepticism of the 60 PPD figure among AR and VR experts.

I tried Meta's 90 PPD "beyond retinal" Tiramisu prototype earlier this year, and while the demo wasn't set up to allow dynamically adjusting the resolution, the researchers behind it told me that they have done so in the lab and could clearly see a difference between 60 and 90. But this was only anecdotal.

Now, three researchers have conducted a study with 18 participants at the University of Cambridge, experimentally confirming the idea that 60 PPD is not the limit of human perception of detail.

Of the three authors of the paper, one is a Cambridge researcher, one is from Meta's Applied Perception Science team, and the third is at both.

The experiment setup.

Their experiment placed a 27-inch 4K monitor on a 1.6-meter motorized sliding rail in front of the participants, who had their heads fixed on a chin rest and were asked to discern specific visual features head-on as the conditions were varied.

The participants were presented with two different types of stimuli throughout the experiment: square-wave grating patterns (both with and without color) and text (both white-on-black and black-on-white).

Square-wave gratings, the researchers explain in the paper, are used in vision experiments because prior research suggests that "the foundational visual detectors of the human visual system are likely optimised for similar waveforms".

The resolution was varied both by moving the display closer to or further from the participant (between 1.1m and 2.7m) and by upsampling or downsampling the spatial frequency of the patterns. The researchers also varied the viewing angle between 0°, 1°, and 20°.
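
To get a feel for the numbers involved, here's a back-of-envelope estimate of the effective PPD at the two extremes of that rail, assuming a flat 27-inch 16:9 4K panel viewed head-on; the paper's exact geometry and stimulus processing will differ:

```python
import math

# Effective pixels per degree (PPD) of a fixed display as a function
# of viewing distance, assuming a 27-inch 16:9 4K panel viewed head-on.

def ppd_at_distance(width_px: int, width_m: float, distance_m: float) -> float:
    # Angle subtended by the full panel width, in degrees.
    fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return width_px / fov_deg

WIDTH_M = 27 * 0.0254 * 16 / math.hypot(16, 9)  # ~0.60 m panel width

near = ppd_at_distance(3840, WIDTH_M, 1.1)  # ≈126 PPD at the closest position
far = ppd_at_distance(3840, WIDTH_M, 2.7)   # ≈304 PPD at the farthest
```

Under these assumptions, the native panel already far exceeds the thresholds under test at 2.7m, which would explain why the stimuli were also resampled in spatial frequency rather than relying on viewing distance alone.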

For the full details of the experimental methods, you should read the paper in Nature Communications. It's an interesting read and you'll learn a lot about how this kind of perceptual science research is conducted. But it's the results that have fascinating implications for VR and AR.

The findings of the experiments.

The findings of the experiment, according to the researchers, are that participants could discern grayscale details up to 94 PPD on average, red-green patterns up to 89 PPD, and yellow-violet patterns up to only 53 PPD.

One participant in the study was even able to reach 120 PPD for grayscale, suggesting that for some people the threshold for "retinal" is double the generally accepted figure.

It will be a long, long time before shipping headsets reach anywhere near these resolutions. Meta's Tiramisu prototype hit 90 PPD only over a tiny 33° field of view, and Tiramisu 2 is aiming for 60 PPD over a 90° field of view instead as a better balance of specs. And while the study demonstrates that there is a difference, in my experience headsets with even "just" 56 PPD can feel incredibly real to the point where I suspect we won't want to trade off other aspects for further resolution any time soon.

Still, it's important that a formal study has been conducted to discover exactly where the limit to what the human eye can truly discern lies, and it reinforces the fact that while smartphones and tablets are plateauing, VR and AR hardware still has decades of runway for meaningful improvements to steadily arrive.

One point of skepticism here, however, is that a stationary display system like the one in the experiment doesn't benefit from the spatiotemporal supersampling effect you get for free in a positionally tracked VR headset from the natural micromovements of your head.


Quest 3S On Sale Certified Refurbished For Lowest Price Ever

Quest 3S and the 512GB Quest 3 certified refurbished are just $216 and $360 respectively this week on Meta's official US eBay page.

First spotted by IGN, the Quest 3S deal can be found here and the Quest 3 deal here. For both headsets, you need to enter eBay's discount code TECH4THEM to get the lowest price.

The code expires at 11:59pm Pacific Time on Sunday, the end of this week.

Meta claims that its certified refurbished headsets "are inspected and thoroughly tested, professionally cleaned, and restored to original factory settings so they function and look like new and include the same accessories and cables as new devices". And eBay is offering a two-year certified refurbished warranty, a year longer than you'd get even if buying a new headset from Meta.com.

Meta Now Sells Quest 3S Refurbished For $270
Meta now offers Quest 3S refurbished from $270, though you can get the headset new for that price for the next few days.
UploadVRDavid Heaney

Quest 3S certified refurbished normally costs $270, and the lowest price we've seen it sold at brand new is $250. That makes this $216 offer by far the lowest price we've ever seen for a fully standalone headset with included tracked controllers, hand tracking, and color mixed reality.

Meanwhile, Quest 3 certified refurbished at $360 is arguably an even better deal, as the 512GB model, the only one still available certified refurbished, is normally $450.

While Quest 3S can run all the same content as Quest 3, and has the same fundamental capabilities (including the same XR2 Gen 2 chipset and 8GB RAM), if you have the funds we always recommend Quest 3 over Quest 3S. The proper Quest 3 features Meta's advanced pancake lenses, which are clearer and sharper over a wider area, offer a wider field of view, and are fully horizontally adjustable, making them suitable for essentially everyone's eyes. These pancake lenses also enable Quest 3 to be thinner, which makes the headset feel slightly less heavy.

Still, at just $216, Quest 3S certified refurbished enters the realm of an impulse buy for many, or perhaps an impulse gift for the holiday season to bring a friend or loved one into VR.

  •  

Sharp Is Crowdfunding A Strange Lightweight Tethered PC VR Headset

Sharp is crowdfunding a strange lightweight tethered PC VR headset that can also connect to one of its smartphones.

Called Xrostella VR1, the headset features dual 2160×2160 LCD panels paired with "light-efficient" pancake lenses with a 90-degree field of view, two grayscale fisheye tracking cameras, and one color passthrough camera. The included controllers, meanwhile, seem to resemble Quest 2's but with more hefty tracking rings and included hand grip straps.

Sharp says the headset's "body" weighs just 198 grams, making it lighter than any shipping headset except Bigscreen Beyond 2 and Shiftall MeganeX.

IPD adjustment between 58mm and 71mm is supported, as well as diopter adjustment from 0D to -9.0D.

While Xrostella VR1 is primarily designed for PC VR, Sharp says it will also be compatible with its AQUOS sense10 smartphone, projecting the phone's display onto a fixed virtual screen. Support for more smartphone models will be "expanded sequentially", the company claims.

Sharp Is Making A Hybrid Haptic VR Glove & Controller
Sharp is making a hybrid VR glove and controller, combining tactile feedback with buttons and a thumbstick, though hasn’t yet decided whether to sell it.
UploadVRDavid Heaney

It's unclear exactly who Xrostella VR1 is supposed to be for.

We haven't seen a major VR headset use only two tracking cameras since the Windows MR headsets that came before HP Reverb G2, as this approach severely limits the tracking range of the controllers. It's also rare for a headset to use only one camera for passthrough, as this means the passthrough view lacks correct depth and scale.

Further, the lack of eye tracking and hand tracking means the headset probably won't appeal to many VRChat users, while the narrow field of view and mediocre resolution won't appeal to simulator fans.
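The resolution critique can be quantified with a back-of-the-envelope estimate from Sharp's stated specs, again using the simplifying assumption that pixels are spread evenly across the field of view (pancake lenses actually concentrate detail toward the center):

```python
# Back-of-the-envelope angular resolution for Xrostella VR1's stated
# specs: 2160 horizontal pixels per eye over a 90-degree field of view.
# Even pixel spread is assumed, which slightly understates center sharpness.
panel_width_px = 2160
fov_deg = 90
ppd = panel_width_px / fov_deg
print(ppd)  # 24.0
```

Roughly 24 PPD is in the same ballpark as current standalone headsets, well short of what simulator enthusiasts chase in high-end PC VR hardware.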

Of course, it's somewhat premature to assess the product proposition here without a price. If available at a low cost, Sharp could be aiming to offer a kind of "ultralight headset for the rest of us". But that seems unlikely.

Sharp says it will crowdfund Xrostella VR1 on the Japanese platform GREENFUNDING later this month. It's unclear why a company of Sharp's size is crowdfunding rather than just launching, but it may be a mechanism to gauge interest before taking the risk.

  •