UploadVR

Meta Ray-Ban Display Review: First Generation Heads-Up Mobile Computing

November 11, 2025 at 01:36

Meta Ray-Ban Display is an early glimpse of a future where mobile computing doesn't mean looking down and taking something out of your pocket.

Most people never leave home without their phone and take it out of their pocket so often that it's become a replacement for fidgeting. Today, the smartphone is the ultimate mobile computing device, an omnitool for communication, photography, navigation, gaming, and entertainment. It's your alarm clock, calendar, music player, wallet, and flashlight. Globally, more people own a smartphone than TVs and cars combined. To get philosophical for a moment, the smartphone has become humanity's second cognitive organ.

The problem is that taking out your phone harshly disconnects you from the world around you. You have to crane your neck down and disengage from what you were otherwise doing, with your attention consumed by the digital world of the little black rectangle.

Photograph by Margaret Burin of ABC News.

In recent years, multiple startups have tried and failed to solve this problem. The smug founders of Humane came to liberate you from your phone with a $700 jacket pin, while Rabbit r1 promised the "large action model" on its $200 pocket device could handle your daily life instead.

The truth, and the reason why these companies failed, is that most people adore their phones, and are borderline addicted to the immense value they provide. And the screen of the smartphone is a feature, not a bug. People love being able to view content in high-resolution color anywhere they go, and despite the cries of a small minority of dissenters, the models with the biggest screens sell the best.

"If you come at the king, you best not miss", as the phrase goes.

The only form factor that seems to have any real chance of one day truly replacing the smartphone is AR glasses, which could eventually provide even larger screens that effectively float in midair, anywhere the wearer wants, any time they want. But while prototypes exist, no one yet knows how to affordably produce true AR glasses in a form factor that you'd want to wear all day. In the meantime, we're getting HUD glasses instead.

HUD glasses can't place virtual 3D objects into the real world, nor even 2D virtual interfaces. Instead, they provide a small display fixed somewhere in your vision. And in the case of many of the first-generation products, like Meta Ray-Ban Display, that display is only visible to one of your eyes.

Meta Ray-Ban Display is also highly reliant on your nearby phone for connectivity, so it isn't intended to be a replacement for it as a device. It is, however, meant to replace some of the usage of your phone, preventing the need to take it out of your pocket and keeping your head pointed up with your hands mostly free. So does it succeed? And is it a valuable addition to your life? I've been wearing it daily for around a month now to find out.

(UploadVR purchased Meta Ray-Ban Display at retail with our own funds, while Meta provided us with the correctly sized Meta Neural Band for review.)

Comfort & Form Factor

Unlike a VR headset that you might use at home or on a plane for a few hours, the pitch for smart glasses is that you can wear them all day, throughout your daily life. Even when they run out of battery, they can still act as your sunglasses or even prescription eyewear.

Meta Ray-Ban Display Prescription Lenses: What You Need To Know
Looking to use Meta Ray-Ban Display as your everyday prescription glasses? Here’s a rundown of what prescriptions it supports, and how that works.
UploadVR · David Heaney

As such, it's crucial that they have a design you'd be okay with wearing in public, and that they're comfortable enough to not hate having them on your face.

Meta Ray-Ban Display weighs 69 grams, compared to the 52 grams of the regular Ray-Ban Meta glasses, and 45 grams of the non-smart Ray-Ban equivalent. It's also noticeably bulkier, with thicker rims and far thicker temples.

Ray-Ban Meta vs Meta Ray-Ban Display vs Xreal One Pro

In my month with Meta Ray-Ban Display I've worn it almost every day throughout my daily life, sometimes for more than 8 hours at a time, and I experienced no real discomfort. The additional weight seems to be mostly in the temples, not the rims, while the nose pads are large and made out of a soft material. If anything, because the larger temples distribute the weight over a larger area and are more flexible, I think I even find Meta Ray-Ban Display slightly more comfortable than the regular Ray-Ban Meta glasses.

So, for my face at least, physical comfort is not an issue with Meta Ray-Ban Display. But what has been an issue is the social acceptability of its thick design.

With the regular Ray-Ban Meta glasses, people unfamiliar with them almost never clocked that I was wearing smart glasses. The temples are slightly thicker than usual, but the rims are essentially the same. It's only the camera that gives them away. With Meta Ray-Ban Display, it's apparent that I'm not wearing regular glasses. It's chunky, and everyone notices.

In some circles, thick-framed glasses are a bold but valid fashion choice. For most people, they look comically out of place. I've asked friends, loved ones, and acquaintances for their brutally honest opinions. Some compared it to looking like the glasses drawn on an archetypal "nerd" in an old cartoon, while only a few said that the look works because it matches current fashion trends. And my unit is the smaller of the two available sizes.

Ray-Ban Meta vs Meta Ray-Ban Display vs Xreal One Pro

Meta Ray-Ban Display also comes in two colors, 'Black' and 'Sand', and a confounding factor here is that the black is glossy, not matte. I'm told this decision was made because glossy black was the most popular color for the regular Ray-Ban Meta glasses. But combined with the size, the glossy finish on Meta Ray-Ban Display makes it look cheap in a way that an $800 product really shouldn't, like a prop for a throwaway Halloween costume.

So Meta Ray-Ban Display is physically comfortable, but not socially. More on that soon.

The Monocular Display

The fixed HUD in Meta Ray-Ban Display covers around 14 degrees of your vision horizontally and vertically (20 degrees diagonal). To understand how wide that is, extend your right arm fully straight and then turn just your hand 90 degrees inward, keeping the rest of your arm straight. To understand how tall, do the same but turn your hand downwards.

When it comes to what you see within that field of view, it's a clear, high-detail image, though ever so slightly soft rather than fully sharp, with higher angular resolution than even Apple Vision Pro. There's minor but non-distracting glare that causes icons, for example, to bleed slightly into the empty space around them.

More notably, the significant difference between a waveguide display like this and the interfaces you might see in a mixed reality VR headset is that it's very translucent, with a ghostly feel. You can see the real world through it.

Display System Specs

  • Display Type: Full-Color LCOS + Geometric Reflective Waveguide (Monocular)
  • Resolution: 600×600
  • Angular Resolution: 42 pixels per degree
  • Field Of View: 14°H × 14°V (20° D)
  • Peak Brightness: 5000 nits (automatically adjusts)
  • Frontal Light Leak: 2%
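As a quick sanity check, the diagonal field of view and angular resolution figures follow from the others in the list. This is a back-of-the-envelope sketch that treats the field as a flat rectangle; Meta's listed 42 ppd presumably uses a slightly different effective FOV:

```python
import math

# Figures from the spec list above.
h_fov_deg = 14.0   # horizontal field of view
v_fov_deg = 14.0   # vertical field of view
res_px = 600       # pixels along each axis (600x600)

# Diagonal FOV: for a small rectangular field, roughly the Euclidean diagonal.
diag_fov_deg = math.hypot(h_fov_deg, v_fov_deg)
print(f"{diag_fov_deg:.1f}")  # 19.8, which Meta rounds to the listed 20° diagonal

# Angular resolution in pixels per degree along one axis.
ppd = res_px / h_fov_deg
print(f"{ppd:.1f}")  # 42.9, close to the listed 42 ppd
```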

The display's perceived opacity and brightness are highly variable, because with waveguides these depend on the ambient light level, and the system also rapidly and automatically adjusts the LCOS brightness using the ambient light sensor. You can manually adjust the brightness if you want, but the system is very good at deciding the appropriate level at all times, so I never do.
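The behavior described here amounts to a clamped mapping from ambient illuminance to panel luminance. As a toy sketch of the idea only — the linear shape and every constant below are invented, and Meta hasn't published its actual curve:

```python
def auto_brightness(ambient_lux, min_nits=30, max_nits=5000, gain=2.5):
    """Toy auto-brightness curve: scale display output with ambient light,
    clamped between an invented floor and the panel's 5000-nit peak."""
    return max(min_nits, min(max_nits, gain * ambient_lux))

print(auto_brightness(10))       # dim room: clamped to the 30-nit floor
print(auto_brightness(50_000))   # direct sunlight: capped at the 5000-nit peak
```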

With a 5000 nit maximum, it's even visible in daytime sunlight, though it has a very ghostly translucent feel. As the photochromic lenses transition to dark, the perceived opacity slightly increases.

One notable quirk is that because an LCOS is essentially an LCD microdisplay, if you're in a pitch black room you'll see a faint glow throughout the display area when it's on, like an LCD monitor trying to show black. But in anything above pitch black, you won't see this.

So Meta Ray-Ban Display's display is surprisingly good, and lacks the distracting visual artifacts seen in many of the waveguide headsets of the 2010s. But, of course, there is a massive, glaring problem: it's only visible to your right eye.


No through-the-lens approach accurately depicts what the HUD looks like, so here's Meta's generic depiction instead.

Meta Ray-Ban Display is a monocular device. Your left eye sees nothing at all.

There is no situation in nature where one of your eyes sees something while the other, also open, sees nothing. It just feels wrong, and induces a constant minor feeling of eyestrain when I look at the display for more than a few seconds, so I would never want to watch a video or conduct a video call on it (more on those use cases later). I've also put the glasses on more than a dozen people now, and while some of them could just about tolerate the monocular display, others found it hurt their eyes within seconds.

I suspect that this is a core reason why Meta Ray-Ban Display is only available after a retail demo. This just isn't a visually comfortable product for many people, and Meta likely wants to avoid returns.

Bloomberg's Mark Gurman and supply-chain analyst Ming-Chi Kuo have both claimed that Meta plans to launch a binocular successor to Meta Ray-Ban Display in 2027, at which point the company is expected to significantly ramp up marketing, production, and availability. By closing my left eye, I can already get a pretty good feel for just how much better the next generation will be.

No Light Leak! But Is That A Good Thing?

Almost all of the brightness of Meta Ray-Ban Display stays on your side of the glasses – 98% according to Meta. The display is not visible to people looking at you. I've repeatedly asked friends whether they can even tell if I have the display on or off, and none have been able to so far. The clickbait YouTube thumbnails you may have seen are fake.

This is partially due to the low "light leak" of the geometric waveguide in Meta Ray-Ban Display, but it's also because of the automatic brightness adjustment. If you manually turn up the brightness to an uncomfortably high level, which I can't imagine anyone intentionally doing, you can make the display slightly visible externally, though not its content. It just looks like a scrambled pattern. But again, even this requires an adjustment that no regular user would reasonably make.

All this said, while I initially assumed that the low light leak was an important feature of Meta Ray-Ban Display, I've come to see the inability of nearby people to tell whether you're looking at the HUD as something of a bug.

When you're with another person and take out your phone, that's an unambiguous indicator that you're diverting attention from them. Similarly, while Apple Vision Pro shows a rendered view of your eyes, when virtual content is occluding the person you're looking at, Apple intentionally renders a pattern on the front display. Why? To signal that you're looking at virtual content, making it clear when the person does and doesn't have your full attention.

When someone wearing an Apple Vision Pro is looking at virtual content that partially occludes you, you'll see a pattern on the display in front of their rendered eyes (the center). With Meta Ray-Ban Display, you have no idea whether the wearer is looking at the HUD or you.

With Meta Ray-Ban Display, there is no such signal. People who spend a lot of time with you can eventually figure out that you're looking at the HUD when your eyes are looking down and to the right, but it's far more ambiguous, and this is not conducive to social acceptability. Are you present with them or are you not? They can't clearly tell.

And the worst case scenario is, when looking at the HUD, to a person sitting in front of you it can, in some circumstances, appear as if you're just looking at their chest. Yikes.

I'm not saying that I want people to be able to see the content of the display, as that would be a terrible privacy flaw. But I do wish there was an external glow on the lens when the display is on. I don't want the other person to have to guess whether I'm fully present or not, and whether I'm looking at the HUD or their body.

The Interface & Meta Neural Band

Like the regular Ray-Ban Meta glasses, you can control Meta Ray-Ban Display with Meta AI by using your voice, or use the button and touchpad on the side for basic controls like capturing images or videos and playing or pausing music. But unlike any other smart glasses to date, it also comes with an sEMG wristband in the box, Meta Neural Band.

In its current form, Meta Neural Band is set up to detect five gestures:

  • Thumb to middle finger pinch: double tap to toggle the display on/off, single tap to go back to the system menu, or hold for quick shortcuts to the 3 menu tabs.
  • Thumb to index finger pinch: how you "click".
  • Thumb to side of index finger double tap: invoke Meta AI.
  • Thumb swipe against the side of your index finger: how you scroll, like a virtual d-pad.
  • Thumb to index finger pinch & twist: to adjust volume or camera zoom, as you would a physical volume knob.

How Does Meta Neural Band Work?

Meta Neural Band works by sensing the activation of the muscles in your wrist that drive your finger movements, a technique called surface electromyography (sEMG).

sEMG enables precise finger tracking with very little power draw, and without the need to be in view of a camera.
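To make the sensing idea concrete, here's a deliberately simplified sketch of how a gesture event can be pulled out of a raw sEMG stream: compute a moving RMS envelope (muscle activation raises signal power) and flag upward threshold crossings. This illustrates the principle only — every name and number is invented, and Meta's real pipeline uses learned models over multi-channel data:

```python
import math

def rms_envelope(signal, window=50):
    """Moving RMS of a raw sEMG sample stream (toy illustration)."""
    env = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - window + 1): i + 1]
        env.append(math.sqrt(sum(x * x for x in chunk) / len(chunk)))
    return env

def detect_activations(envelope, threshold=0.5):
    """Return indices where the envelope crosses the threshold upward."""
    return [i for i in range(1, len(envelope))
            if envelope[i - 1] < threshold <= envelope[i]]

# Synthetic stream: quiet baseline, a burst of muscle activity (a "pinch"),
# then quiet again. The detector fires once, shortly after the burst begins.
signal = [0.01] * 100 + [1.0, -1.0] * 50 + [0.01] * 100
events = detect_activations(rms_envelope(signal))
print(events)  # one crossing, a few samples after the burst starts at index 100
```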

The wristband has an IPX7 water resistance rating, and charges with an included proprietary magnetic contact pin charger.

While in the above clips I have my arms extended to illustrate the gestures, the beauty of sEMG is that you don't need to. Your hand can be at your side, resting on your leg, or even in your pocket. And it works even in complete darkness.

The gesture recognition is almost flawless, with close to 100% accuracy. The one exception: rarely, while I'm walking, it fails to pick up a sideways directional swipe. But in general Meta Neural Band works incredibly well. The volume adjustment gesture, for example, where you pinch and twist an imaginary knob, feels like magic.


Wake (top left), Scroll (top right), Click (bottom left), and Volume/Zoom (bottom right)

The Meta Neural Band gestures control the HUD interface, which looks much like that of a smartwatch. It has 3 tabs, which you scroll between horizontally:

  • The center home tab (the default) shows the date and time, your notifications, and a Meta AI button, with tiny shortcuts to active audio or navigation at the top.
  • The right tab is the two-column app library: WhatsApp, Instagram, Messenger, Messages, Calls, Camera, Music, Photos, Captions, Maps, Tutorials, and a game called Hypertrail. Four rows are shown at a time, and you navigate vertically to see the rest.
  • The left tab features quick controls and settings like volume, brightness, Do Not Disturb, as well as shortcuts to Captions, Camera, and Music.

When I first used Meta Ray-Ban Display I found the interface to be "just too much", often requiring too many sequential gestures to do what you want. While I still broadly hold this view a month later, I have found myself becoming more used to quickly performing the correct sequence with experience, and I've discovered that if you pinch and hold your thumb to your middle finger, you get shortcuts to the three main menu tabs, which can speed things up.

I still think there's plenty of room for improvement in Meta Ray-Ban Display's interface from a simplicity perspective, though, and repeat my assertion that the menu should have two tabs, not three. The Meta AI finger gesture makes the Meta AI button on the center tab redundant, for example, and when you don't have any notifications, the home tab feels like a waste of space.


This clip from Meta shows the 3 tabs of the system interface, and how you swipe between them.

A lot of the friction here will eventually be solved with the integration of eye tracking. Instead of needing to swipe around menus, you'll be able to just look at what you want and pinch, akin to the advantages of a touchscreen over arrow keys on a phone. But for now, it can sometimes feel like using MP3 players before the iPod, or smartphones before the iPhone. sEMG is obviously going to be a huge part of the future of computing. But I strongly suspect it will only be one half of the interaction answer, with eye tracking making the whole.

A major improvement since the Meta Connect demo though, is performance. While I still wouldn't describe Meta Ray-Ban Display as very snappy, the abject interface lag I encountered at Connect is gone in the shipping consumer model. The most noticeable delay is in waking the display, which often takes a few seconds after the gesture.

Meta Neural Band size comparison with Fitbit Luxe.

Coming back to Meta Neural Band for a second, the only real problem with it is that it's something else you need to wear and charge (with a proprietary cable). I always wear a Fitbit on my left wrist, and now I wear the Meta Neural Band on my right too.

That's not to say that Meta Neural Band is a comfort burden. I find it no more or less comfortable than I did the Pixel Watch I used to own, and having tried a Whoop it feels similar to that too. And while it does leave a minor mark on my wrist, so do those other devices.

But it's another thing to remember to put on charge at night, another proprietary cable to remember to bring when traveling, and another USB-C port needed at my bedside.

Ideally, I should only have to wear and charge one wrist device. But today, Meta Neural Band is solely an sEMG input device, and nothing more.

Traveling means bringing the proprietary charging cable with you.

The band already has an accelerometer inside, so in theory, a software update could let it track my daily step count. And if a future version could add a heart rate sensor for fitness, health, and sleep tracking, I wouldn't need my Fitbit at all. But we're not there yet. And wearing a second wrist device is a big ask for any product.

Features & Use Cases

There are six primary use cases of Meta Ray-Ban Display. It's a camera, a communications device, a GPS navigator, an assistive captions and translations tool, a personal audio player, and an AI assistant that can (if you want) see what you see.

So how well does it do each of these things?

Capturing Photos & Videos

When the regular Ray-Ban Meta glasses first launched, they were primarily pitched as camera glasses, like their predecessor the Ray-Ban Stories, and this remains one of the biggest use cases even as Meta now calls the product category "AI glasses". But without a display, you didn't get a preview of what you were capturing, nor could you check whether the result was any good until it synced to the phone app. Sometimes you got lucky; other times you failed to even frame your subjects, an issue not helped by the camera on almost all smart glasses sitting on a temple rather than being centered.


Meta depiction of photography.

Meta Ray-Ban Display also has 32GB of storage for media, and also syncs everything to your phone via Wi-Fi 6 when you open the Meta AI app. But the experience of capturing media is fundamentally better, because you get a live visual preview of exactly what's in frame. And with the Meta Neural Band, you can capture without raising your arm, as well as adjust the zoom on the fly by pinching your index finger to your thumb and twisting, the same gesture used to adjust the volume.

The camera quality isn't as good as your phone's, given that the sensor has to fit into the temple of a pair of glasses. It appears to be the same camera as the regular Ray-Ban Meta glasses, and while it can produce great results in daytime, in low-light environments or with heavy zoom you'll get relatively grainy output. You can see some sample shots and clips in my colleague Ian Hamilton's launch-week impressions piece.

Hands-On With Meta Ray-Ban Display & Meta Neural Band Across New York City
UploadVR’s Ian Hamilton bought the Meta Ray-Ban Display glasses on launch day and tested them across Manhattan.
UploadVR · Ian Hamilton

Regardless, the ability to capture precisely framed shots without needing to hold up a smartphone is Meta Ray-Ban Display at its best. It lets you stay in the moment and capture memories at the same time, without doubting whether what you shot includes what you wanted. And it works completely standalone, even if your phone is out of battery.

With a couple of swipes and taps you can also easily send your captured media to a friend, something that required extensive voice commands with the displayless glasses. And this brings us on to messaging.

Messaging

While Meta doesn't have an existing device ecosystem like Apple and Google, what it does have is the most popular messaging platform and the two most popular social networks in the world, all three of which have an integration on Meta Ray-Ban Display.

You can opt to have incoming WhatsApp, Messenger, Instagram, and text messages pop up on the display, and so without needing to look down at a smartwatch or take your phone out of your pocket you can "screen" them to decide which are important enough to respond to immediately. Tap your thumb to your middle finger to dismiss a message, or to your index finger to open it.

The notification appears at the very bottom of the display area, which is already slightly below your sightline, so it doesn't block your view of what's in front of you. And of course, unlike with a phone, no one around you can see it. There's also a setting to automatically detect when you're in a moving vehicle, so if you start driving a car you won't be interrupted.


Meta depiction of messaging.

If you want to respond to a message, there are 4 options in the interface: dictate, voice note, suggested emoji, or suggested reply.

I don't send voice notes, and I don't find the suggested emojis or replies useful, but I do love the dictation. It's powered by an on-device speech recognition model with relatively low latency and surprisingly great accuracy, in my experience. Even with my Northern Irish accent that some other systems (and people) can find difficult to understand, I'm able to dictate full responses without needing to type. And what's most impressive is that it works even when you speak quietly, thanks to the six-microphone array that includes a dedicated contact mic positioned just a few inches above your lips. That's yet another advantage of the glasses form factor.

Still, there are plenty of messages that I wouldn't want to dictate even softly in public, and situations where I want to use combinations of punctuation and letters that don't have a spoken form. Meta plans to release a software update in December that will let you enter text by finger-tracing letters on a physical surface, such as your leg, bringing some of its more advanced sEMG research out of the lab and into the product. It sounds straight out of science fiction, and we'll bring you impressions of it when the update rolls out. But it's not there today.

What About iMessage?

If you're an iPhone user, you're probably wondering whether all this works with iMessage.

I use an Android phone, from which Meta Ray-Ban Display can receive and send both 1-on-1 and group text messages, including both SMS and RCS.

For iPhone, I'm told, you can receive and send only 1-on-1 iMessages, with no support for group threads. And this support is limited to pop-up notifications – you won't see a 'Messages' app in the list.

These limitations, to be clear, are imposed by Apple, and Meta would gladly support the missing features if Apple let it.

As well as receiving new messages as pop-up notifications, you can also access your 10 most recent threads on each messaging platform at any time by swiping to it on the apps list. In the apps, you can scroll through past messages and send new ones, including your captured photos and videos stored on the glasses.

The problem with accessing your past message threads, and with viewing photos and videos you've been sent, is that it's incredibly slow to load. Open WhatsApp on your phone and you'll typically see your messages update within a fraction of a second. On Meta Ray-Ban Display you'll see everything as it was the last time the applet was opened for upwards of 10 seconds, while media can take over a minute at times, or even just seemingly never load. And it's badly missing a loading progress bar.

So while in theory you can use Meta Ray-Ban Display to catch up on the dozens of Reels that one unemployed friend sends you all day, assuming your eyes can put up with the monocular display for that long, in practice, as with waiting for fast-moving group chats to finally load, I often found it faster to take the phone out of my pocket and open the real app.

The cause of this slowness seems to be that Meta Ray-Ban Display is entirely reliant on your phone's Bluetooth for internet connectivity. This connection speed issue is something I ran into repeatedly on Meta Ray-Ban Display, and made me wish it had its own cellular connection. In fact, I'm increasingly convinced that cellular will be a hard requirement for successful HUD and AR glasses.

Audio & Video Calls

For over two years now, I've made and taken almost every phone call, both personal and professional, on smart glasses. I hate in-ear and over-ear audio devices for calls because it feels unnatural to not hear my own voice clearly, and the people on the other end love the output of the optimally-positioned microphone array.

With Meta Ray-Ban Display, the addition of the HUD and wristband lets you see how long a call has gone on for, and end it without having to raise your hand. You can also view and call recently called numbers in the Calls applet.

Meta depiction of video calling.

On the regular Ray-Ban Meta glasses you can also share your first-person view in a video call, and the big new calling feature of Meta Ray-Ban Display is that you can also see the other person.

But again, this great-in-theory video calling feature is ruined by the fact that the internet connection is routed to Meta Ray-Ban Display via your phone's Bluetooth. Even with my phone and the person I'm calling on very strong internet connections, the view on both sides was laggy and pixelated. Bluetooth just isn't meant for this.

On-Foot Navigation

One of the features I've most wanted HUD glasses (and eventually AR glasses) for is pedestrian navigation. There are few things I hate more in this world than arriving in a new city and having to constantly look down at my phone or watch while walking, almost bumping into people and poles. Worse, in cities with dense skyscrapers, the GPS accuracy degrades to dozens of meters, and I hopelessly watch the little blue dot on my phone bounce around the neighborhood.

In theory, Meta Ray-Ban Display solves at least the first problem, but with a massive caveat you absolutely must be aware of if you're thinking of buying it for this use case.


Meta depiction of navigation.

You can open the Maps applet anywhere, and you'll see a minimap (powered by OpenStreetMap and Overture) with nearby venues and landmarks. You can zoom all the way in to the street level, or out to the city level, using the same pinch-and-twist gesture used for volume control. You can also search for places using speech recognition, and scroll through the results by swiping your fingers.

The problem is that everywhere except inside the 28 cities Meta explicitly supports, you won't be able to initiate navigation. Instead, you just have an option to send it to your phone, where you can tap to open it in Google Maps, defeating the purpose. It's a rare product where core functionality is geofenced to a handful of cities.

I have been able to use Meta's navigation feature multiple times when visiting London, and found it genuinely very useful when in New York for the Samsung Galaxy XR launch event. I love the idea here, and where it works, the implementation isn't bad. Not having to look down to navigate is exactly what I wanted. But it works in so few places, relative to my life at least, that I just can't stop wishing it had Google Maps. And it's hard to imagine not switching to whatever HUD glasses have Google Maps first.

The Supported Cities

USA
• Atlanta, Georgia
• Austin, Texas
• Boston, Massachusetts
• Chicago, Illinois
• Dallas, Texas
• Fort Worth, Texas
• Houston, Texas
• Los Angeles, California
• Miami, Florida
• New York City, New York
• Orlando, Florida
• Philadelphia, Pennsylvania
• Phoenix, Arizona
• San Antonio, Texas
• San Diego, California
• San Francisco, California
• San Jose, California
• Seattle, Washington
• Washington D.C.

Canada
• Toronto, Canada
• Montreal, Canada
• Vancouver, Canada

UK
• London, UK
• Manchester, UK

France
• Paris, France

Italy
• Rome, Italy
• Milan, Italy
• Naples, Italy 

It's baffling that Meta decided to roll its own navigation system. It may be the right bet in the long term, but in the short term it compromises the product and leaves the goal wide open for Google to deliver a significantly better experience. Meta already has a wide-ranging partnership with Microsoft – why didn't it license Bing Maps for navigation? Or why not acquire a provider like TomTom?

It somewhat reminds me of the Apple Maps launch debacle, except in a parallel universe where it arrived alongside an iPhone that didn't have Google Maps.

Meta AI & Its Dedicated Gesture

Meta has been marketing its smart glasses as "AI glasses" for some time now, riding the wave of hype that has for better and for worse almost entirely taken over the tech industry in recent years.

I didn't use Meta AI often on the regular Ray-Ban Meta glasses because I hate saying "Hey Meta", just as I hate saying "Hey Google", especially in public. With Meta Ray-Ban Display, you can still invoke the AI that way if you want, but there's also a dedicated gesture: just slightly curl your fingers inward and double-tap the side of your index finger with your thumb.

There's something very satisfying about it. It feels more natural than the pinch gestures used for everything else. And it has me using Meta AI far more often than I would if I had to keep saying "Hey Meta".

With the display, you also get visual aids in your responses. Ask about the weather in a place, for example, and you'll see a five-day forecast appear. Or for most queries, you'll see the response appear as text. In some situations I wish I could disable the voice response and just get the text; I can do so by temporarily reducing the volume to zero, but that isn't a very elegant solution.

The real problem with Meta AI is that it's still Meta AI. It just isn't as advanced as OpenAI's GPT-5 or Google's Gemini 2.5, sometimes failing at queries where it needs to make a cognitive leap based on context, and fundamentally lacking the ability to think before it responds for complex requests.

Occasionally, I've run into situations where I wanted the convenience of asking advanced AI about something without taking out my phone, but ended up doing so after Meta AI just couldn't get the answer right. Ideally, I'd be able to say "Hey Meta, ask [ChatGPT/Gemini]". But that's not supported.

Audio Playback & Control

The primary way I use the regular Ray-Ban Meta glasses is for listening to podcasts and audiobooks. Smart glasses speakers don't do music justice, but they're great for spoken word content, and having your ears fully open to the real world is ideal.

What's different on Meta Ray-Ban Display is that you can far more easily and precisely adjust the volume. Rather than needing to raise your arm up to your head and awkwardly swipe your finger along the temple, you can just wake the display, pinch and hold your index finger to your thumb, and twist. It's a satisfying gesture that feels natural and precise.

The HUD also shows a thumbnail for the content you're playing, as well as how far into it you are and how long is left. It's a nice addition, and one less reason to take out my phone.

Live Captions & Translation

For accessibility, one of the biggest marketed features of Meta Ray-Ban Display is Live Captions.

The real-time speech transcription has decent accuracy, with the exception of niche proper nouns, and fairly low latency, always giving you at least the gist of what the person you're looking at is saying. And yes, I do just mean the person you're looking at. It's remarkable how well the system ignores any audio not in front of you, leveraging the microphone array to cancel it out. Look away from someone and the captions will stop. Look back and they'll continue. It really does work.


Meta depiction of live captions.

One swipe over from the virtual button that starts Live Captions is the one that starts Live Translation, and this feature blew me away. Having a friend speak Spanish and seeing an English translation of what they're saying feels like magic, and I could see it being immensely useful when traveling abroad. For the other person to understand you, by the way, you just hand them your phone with the Meta AI app open, and it shows them what you're saying, in their language. Brilliant.

Yes, you can do live translation with just a phone, or with audio-only devices like AirPods and Pixel Buds.

Unfortunately, it only supports English, French, Spanish, and Italian. This is another example of where Google's services, namely Google Translate, would be immensely valuable.

The bigger problem with both Live Captions and Live Translation is that because the display sits slightly below and to the right of your sightline, you can't look directly at someone while using these features. It's far better than having to look down at a phone, but ideally the text would appear centered and much higher, so that I could appear to keep eye contact while seemingly magically understanding what they're saying. That would require different hardware, though.

The Big Missing Feature

While I'm glad that Meta Ray-Ban Display lets me decide whether it's worth taking my phone out of my pocket for inbound personal messages, what I want the most out of HUD glasses is the ability to screen my emails and Slack messages.

There is no app store on Meta Ray-Ban Display, and Meta's upcoming SDK for phone apps to access its smart glasses won't support sending imagery to the HUD. For the foreseeable future, any new "apps" will have to come directly from Meta, and I call them "applets" because each really only does one thing and is essentially part of the OS.

(The company says it plans to add two new apps soon, a Teleprompter and a dedicated IG Reels app.)

So a Slack app won't be happening anytime soon. But there's a far easier way I could get what I want here.

Many smartwatches (yes, including third-party ones on iPhone) already let you view your phone notifications. I asked Meta why it doesn't do this for Meta Ray-Ban Display, and the company told me that it came down to not overwhelming the user. I don't understand this answer, though: as with smartwatches, you could select exactly which apps you do and don't want notifications from, just as you already can for WhatsApp, Messenger, Instagram, and texts.

Another interesting approach here could be to just show the icon for the app sending a notification, and require a pinch to open the full thing. If I had this, and could screen Slack notifications and emails, I would take my phone out of my pocket significantly less often.

The Case Folds Down & Has Huge Potential

Just like with AirPods, for smart glasses the included battery case is almost as important as the device itself. Meta Ray-Ban Display has an official battery life of 6 hours, which I've found to be accurate, while the case provides 4 full charges for a total of 30 hours of use between needing to charge it at the wall.

The problem with the regular Ray-Ban and Oakley Meta glasses cases is that they're far too bulky to fit in almost any jacket pocket. The Meta Ray-Ban Display case has an elegant solution, likely inspired by what Snap did for the 2019 Spectacles 3.


How Meta Ray-Ban Display's case folds when you're wearing the glasses.

When containing the glasses, it's a triangular prism. When empty, it folds down into something with dimensions very similar to a typical smartphone's (just slightly taller and less wide). This means it not only fits in most jacket pockets, but even fits into my jeans pockets. So when I'm dressed relatively light and using Meta Ray-Ban Display as my sunglasses, for example, I can keep the case in my pocket on the move and put the glasses back in it when in the shade. The glasses, case, and wristband together are the product, and the folding case makes the whole system significantly more portable.

In fact, the glasses and case make such a great pair that I'm eager for the case to do more than just act as a battery and container.

When I need to take the glasses off, for example to wash my face, take a shower, or just let them charge, docking them in the case should turn the duo into a smart speaker, letting me continue to listen to audio, take calls, and prompt Meta AI as I would with Alexa on an Amazon Echo. This could be achieved with no extra hardware, though ideally Meta would add a speaker to the case with quality similar to a smartphone's.

When folded, the case (center) is flat enough to fit in a pocket.

It also seems like the battery case could be the perfect way to bring cellular connectivity to a future version of Meta Ray-Ban Display without nuking the battery life. The case would send and receive 5G and LTE signals, and relay data to the glasses via Bluetooth for low-bandwidth tasks and Wi-Fi for high-bandwidth.

Interestingly, Meta has a partnership with Verizon to sell Meta Ray-Ban Display in stores. Could this evolve into a cellular partnership for the next generation?

Conclusions: Is It Worth $800?

Meta Ray-Ban Display is very much a first-generation product, an early attempt at a category that I'm convinced, after a month of testing, could eventually become a key part of the lives of at least hundreds of millions of people.

In its current form, it's an amazing way to capture photos and videos without leaving the moment, and a great way to decide whether WhatsApp messages are worth your time without taking out your phone. In the cities where it's supported, the on-foot navigation is genuinely useful, and for the languages supported, the live translation feature could change what it means to travel. Like displayless smart glasses, it's also a great way to listen to spoken audio and take calls.

But the monocular display just doesn't feel right, too much of what Meta is trying to do is hampered by the device's lack of cellular connectivity, and the absence of established services like Gmail, Google Maps, and Google Translate makes Meta Ray-Ban Display far less useful than HUD glasses theoretically could be.

Further, while the Meta Neural Band works incredibly well, future versions need to replicate the functionality of smartwatches, instead of asking buyers to justify wearing yet another device on their wrist.

If you're an early adopter who loves having the latest technology and you don't mind looking odd in public, can live with the flaws I've outlined, and are fine with wearing a dedicated input device on your wrist, there's nothing else quite like Meta Ray-Ban Display, and the novelty could make up for the issues.

For everyone else, I recommend waiting for future generations of HUD glasses, ideally with binocular displays and either cellular connectivity or a seamless automatic phone Wi-Fi sharing system that I suspect only Apple and Google, the makers of your phone's OS, can pull off.

Just like with its Quest headsets, Meta is set to see fierce competition from the mobile platform incumbents in this space, and it'll be fascinating to see how the company responds and evolves its products through the rest of this decade and beyond.

Appreciate our reporting? Consider becoming an UploadVR Member or Patron.

Reçu hier — 10 novembre 2025 UploadVR

Escape from Hadrian’s Wall Is A 5th Century Puzzler Out Now On Quest & PC VR

10 novembre 2025 à 22:00

Escape from Hadrian’s Wall is a 5th-century VR puzzle game where you navigate the titular location with magical abilities.

Developed by Jim Gray Productions, Escape from Hadrian’s Wall is a historical fantasy puzzler set in 402 A.D. Britannia during Roman occupation and dives into Celtic legends. As a nameless prisoner held captive inside one of its forts, you become a witch's apprentice and use magical artifacts to solve the puzzles within. That's out today on Quest and PC VR.

Exploring the dungeons beneath Hadrian's Wall, the campaign sees you using magical cards and tools, manipulating elements like earth, air, fire, and water to solve its puzzles. Jim Gray Productions states the full game features 38 puzzles in total, including 18 Elemental Golem fights carried out through card battles.

A free PC VR demo, recently featured in the last Steam Next Fest, remains available to download and has seen several updates since its initial launch, introducing additional accessibility features and a French localization. The same demo is also available on Quest, and further language support is promised at a later date.

Escape from Hadrian's Wall launches today on PC VR and Quest.

UploadVR's Winter Showcase 2025 Announcement

10 novembre 2025 à 16:30

Every June and December, UploadVR connects with dozens of developers and publishers within the XR industry to highlight the best that virtual reality has to offer. It’s showcase season once again, and we’re excited to announce:

The Showcase will premiere December 5th @10am PT on the IGN and UploadVR YouTube channels.

Kudos to everyone who participated in this past summer’s event - none of this would be possible without the support of the VR community and the trust of developers and publishers who give us their secret showcase announcements. A huge thank you also goes out to last season’s sponsors: Elsewhere Electric, Dixotomia, Fruit Golf, Nightclub Simulator VR, and Virtual Skate. You have helped UploadVR continue to bring you the latest and greatest in all things VR, XR, AR, KR... ZR... (what acronyms are we supposed to use nowadays?) and supported your fellow developers by giving us the means to make the showcase. Thank you!

As always, we’re searching for exclusive content, reveals, and announcements. If you have a new game or exclusive news and you want to submit a video for this season, fill out the application!

Here’s some additional information about the UploadVR Showcase - Winter 2025: 

How Do I Watch the Showcase?

Subscribe to our YouTube channel to receive notifications once the showcase goes live. You can also follow us on X, Bluesky, and Instagram for the latest updates.

How Do I Submit My Game to the Showcase?

To submit a game or sign up as a sponsor, please fill out this form. Applying to the showcase tells us what you intend to announce in your video; you do not need to have a video created when you apply. We will respond with our level of interest in the project and tell you the next steps for submitting your video.

How Does UploadVR Select What is in The Show?

We’re looking for originality, oddity, interest, and impact. The projects we highlight are an amalgamation of large and smaller-scale works. While some submissions may not jibe with this season’s showcase, we’re always open to future submissions.

Content must be kept under an embargo so that announcements are exclusive to the premiere. 

When Will I Know If My Application Was Accepted?

The UploadVR team reviews applications as they come in, and submitters can expect a reply stating our level of interest in what they've described.

The ideal deadline for videos is Thursday, November 20th, 2025. However, we will accept final video submissions until Thursday, November 27th, 2025.

Videos should be in 1080p or 4K and 30-60fps.

There is no deadline for submitting an application, although you probably shouldn't apply the morning of the show.

If your project won’t be ready by the end of November, we encourage you to submit for the next season, or chat with us about coverage and collaboration by emailing tips@uploadvr.com

When Are Selections Made?

If your project has been accepted, we’ll contact you as soon as we review your application. Please contact us about your application's status only if you haven’t heard from us within 7 days of submitting.

How Do I Sponsor the Event?

Want to be a sponsor? Please fill out the application or book a meeting with Beck.

PS - check out past UploadVR showcases here.

Inu Atsume VR Is A Puppy Collecting Sim Coming To Quest Next Week

10 novembre 2025 à 14:24

Inu Atsume VR is a virtual pet simulator by Hit-Point, creators of the popular cat-collecting game Neko Atsume Purrfect, and it's launching on Quest soon.

Similar in style to the studio's feline-filled experience, Neko Atsume Purrfect, Inu Atsume VR offers puppy-loving players the chance to complete a Dog Encyclopedia and compete with their newfound companions across three competitions. It's playable in both VR and MR, with the mixed reality mode allowing pups to roam freely around your living space without running the risk of a mess.


Official trailer

To find new pets, you can visit a park called the “Square,” where you can throw frisbees to earn the attention of your desired pet and play together. By continuously showering the pup with praise and attention, it will gradually inch closer and eventually become your new friend.

From here, the canine companions can be trained up and taught tricks like Shake, all while receiving gifts that fill out the virtual space. Those with a keen eye for interior design can also customize their in-game home by tweaking the colors of walls and doors.

While the store page lists a 'November 2025' release window, the official website confirms Inu Atsume VR will make its Quest debut on November 20 for $14.99.

Cave Crave Adds Competitive Arcade Mode, PC VR Launch Coming Soon

10 novembre 2025 à 13:45

Exploration sim Cave Crave added an arcade mode and new horror map in its latest update, and a PC VR release will follow soon.

Developed by 3R Games, Cave Crave sees you exploring tight tunnels and caves as you try to find an escape, marking walls with chalk and using various tools. While this update will arrive “soon” on PS VR2, Quest players can now jump into a new Arcade Mode that turns this into a competitive race against time, where you aim for the quickest run on the online leaderboards.

As for Cave Crave's optional Horror Mode, that's been updated with a brand new map called 'Abyss,' where your goal is to simply make it back alive. 3R Games states that it's been “inspired by cosmic dread and subterranean monstrosities straight out of a Lovecraftian nightmare,” warning of something “ancient and malevolent” hiding in the dark.

This follows the recent addition of Utah's Nutty Putty Cave as a free update on both platforms, a real-life cave closed in 2009 after the death of John Edward Jones. 3R Games says this was recreated using the official cave map and additional data without gamifying it, stating its aim to offer a “respectful, authentic way” to explore this permanently closed site.

Cave Crave is out now on PlayStation VR2 and Quest, while the Steam version is “scheduled to launch within the next few weeks.”

Cave Crave Review: All The Thrills Of Cave Exploration, Minus The Danger
Cave Crave delivers all the thrills of cave exploration that comes recommended on Quest, and it’s out next week on PS VR2.
UploadVRJames Galizio

Forefront Takes #6 In Quest Weekly Revenue Charts

10 novembre 2025 à 12:45

Less than a week since arriving in early access, VR FPS Forefront took #6 for top-earning games by weekly revenue on Quest.

Launched on November 6 in early access, Forefront is a 16v16 VR shooter from Triangle Factory that features semi-destructible maps where you split into four-person squads. Four days after that initial launch, it's reached #6 at the time of writing with a 4.6-star rating on the Meta Horizon Store after 490 user reviews, while Steam lists a “very positive” rating at 298 reviews.

Elsewhere in the charts, the top 10 earners this week remain a mostly familiar sight that's a mix of paid apps and free-to-play titles. UG is at #1 and now boasts the most user reviews on the Horizon Store at 172k. That now surpasses Gorilla Tag, which is currently at 164k user reviews.

Meta Horizon Store: Top-earning games this week by revenue as of November 10, 2025

Beat Saber holds #2, which we'd speculate was further boosted by the recent Spooky Scary Skeletons DLC for Halloween, and that's followed respectively by Animal Company, VRChat, and Gorilla Tag. Rounding out the top 10 in order after Forefront are Blade & Sorcery: Nomad, PokerStars - Vegas Infinite, and Bonelab. #10 keeps changing between FitXR and Golf+, so we cannot determine which one officially holds that position.

We'll continue monitoring these standings, and this list may evolve as the week goes on. You can find the full charts here, which cover the top 50 games and account for all forms of revenue. It's a different approach to the top 50 best-selling Quest games of all time charts, which only factor in paid app sales without including DLC, and that recently saw Assassin's Creed Nexus join the list.

Assassin’s Creed Nexus Joins Top 50 Best-Selling Quest Games of All Time
Assassin’s Creed Nexus has joined the top 50 best-selling paid Quest games of all time, with Bonelab now in the top 10.
UploadVRHenry Stockdale

Update Notice

This article was updated shortly after publication when the top 50 games became viewable instead of the top 49. #10 was briefly listed as FitXR but this was changed to Golf+ after the stats were refreshed.

Thrasher Gets Remastered Steam Release Today

7 novembre 2025 à 18:27

Thrasher receives its remastered edition with a visual update, flatscreen mode, and more today on Steam.

Released on Quest and Apple Vision Pro last year, Thrasher is a cosmic action racer that tasks you with controlling a space eel through obstacle-filled levels, and we previously named it our favorite Apple Vision Pro game of 2024. Following September's PC VR demo release, developer Puddle has launched it today on Steam, with a price drop to $9.99 on all platforms.


Release trailer

As detailed in September, Thrasher's remastered Steam release promises improved visuals compared to the standalone platforms. Puddle states the new PC VR controls are more responsive too, letting you pick between controllers and hand tracking. UX and UI changes are also included, there's an optional flatscreen mode on PC with gamepad and mouse controls, and Steam Deck compatibility runs at 90 FPS.

Other changes include a new Play+ mode that aims to provide a harder challenge for advanced players, while Time Trials test your speed at clearing levels with no combo bonuses. When asked by UploadVR if these modes will eventually come to Quest or Apple Vision Pro, Puddle advised it has no updates to share about other platforms at this time.

The Steam release also follows Puddle releasing Thrasher's remastered version as a launch title for Samsung Galaxy XR, joining the list of Android XR games currently available. Much like the Steam edition, this also runs at 90 fps on Samsung's headset with the new modes and support for both hand tracking and controllers.

Thrasher is out now on Quest, Galaxy XR, Apple Vision Pro, and Steam.

Update Notice

This article was updated shortly after publication with a response from Puddle and following the official launch of a Samsung Galaxy XR port. It was updated again when the Steam release launched.

Constellations Offers Connect The Dots Stargazing In Early Access This December

7 novembre 2025 à 18:00

Constellations: Touch the Stars lets you scan the night sky in a connect-the-dots experience.

It's the latest experience from developer Grant Hinkson via Parietal Lab, who previously released Connectome earlier this year using the same “connect the dots” engine. Constellations: Touch the Stars includes all 88 constellations recognized by the International Astronomical Union (IAU), letting you trace and connect each constellation until the pattern is complete.


Teaser trailer

It's been designed using a “hands-first” philosophy with hand tracking support, using a thumb tap motion to bring constellations forward. The sky is positioned based on the user’s location to determine which constellations you see, and Constellations: Touch the Stars comes with fully immersive environments in early access.

Further updates are planned following the initial launch, such as a mixed reality stargazing mode that sees stars overlaid against their real positions. This pulls up constellation names and data using the immersive view's overlay. Other promised features are a 'lie-back' mode for looking up at the stars while lying down, social stargazing with friends, and creating your own constellation patterns.


In-game footage

Constellations will launch in early access on Meta Quest 3/3S, arriving in the first half of December. Pre-early access builds are also available by joining the official Discord server.

Temporal Sci-Fi Puzzler UnLoop Reaches PC VR Next Week

7 novembre 2025 à 16:59

Retro-futuristic puzzler UnLoop reaches PC VR in early access next week.

Published by CM Games (Into the Radius) and developed by Superposition NULL, UnLoop is a sci-fi puzzle game built around self-cooperation and time manipulation that's reminiscent of We Are One. Set on a remote space station called the Temporal Research Hub, you create copies of yourself each loop and replay your past actions in real time as you retrieve data.

Following its full release on Quest and Pico, CM Games has chosen early access on Steam to gather feedback about “optimization, player experience, graphics, and to address possible PCVR-related feature requests.” It retains content parity with the standalone edition, and a Version 1.1 update planned for this December promises new puzzles and a story continuation.

On the hardware side, UnLoop on Steam will initially support using Quest, Pico, and Valve Index headsets. A Steam FAQ confirms the developer will explore compatibility with additional headsets and controllers depending on community feedback, and PC VR visual improvements are also planned.

We had positive impressions in our UnLoop hands-on back in September, considering it a “clever self-co-op experience” held back by a “few rough edges.”

UnLoop looks to be a promising head-scratcher for players who love time-looping puzzles and self-orchestrated hijinks. Its core concept is compelling and clever, but a few rough edges keep it from being a standout recommendation just yet. With a bit of polish and hopefully some patches, this could be one to loop back to.

UnLoop is out now on Quest and Pico, and the Steam Early Access launch will follow on November 13.

UnLoop Hands-On: Sci-Fi Puzzling On Repeat
UnLoop is a fresh, time-bending VR puzzle game from the creators of Into the Radius, launching you into a clever self-co-op experience with a dash of sci-fi espionage.
UploadVRPete Austin

Real-Time Strategy Game Homeworld: Vast Reaches Heads For PC VR

7 novembre 2025 à 16:05

Homeworld: Vast Reaches brings the real-time strategy game to SteamVR with upgraded visuals.

Developed by FarBridge, Homeworld: Vast Reaches takes place between the events of Homeworld 1 and Homeworld 2, setting adventurous astronauts on a fresh journey within the series’ universe. You play as Tyrra Soban, a new Fleet Commander, guided by Karan S’jet as they tackle an unknown evil. Originally launched on Quest last year, it's out today on Steam after initially targeting an October 23 launch.


Traditionally a flatscreen series, Homeworld: Vast Reaches adapts the controls for VR, allowing players to immerse themselves in their space war strategies up close and from 'any angle.' Using a virtual command module located on the wrist, clever tacticians can create ships and direct formations in their search for victory.

In addition to visual improvements, the SteamVR version introduces new Challenge Levels designed to test experienced players. Those levels are also available on Quest with a new free update.

“When we launched on Meta Quest initially, some core strategy players reported they had mastered the gameplay in Vast Reaches and wanted harder missions, so we built three new Challenge Levels for this new version with them in mind,” said FarBridge Creative Director Richard Rouse. “Get ready!”

In our previous impressions on Quest, we felt Homeworld: Vast Reaches maintained the strategic depth and storytelling chops of its predecessors.

“This new adventure successfully translates the complex, strategic gameplay of the Homeworld series, all while bringing the franchise into a new and immersive medium, making Vast Reaches a standout title in the VR RTS genre and one that we feel is a must-play for both fans of the long-running series and newcomers to VR and MR gaming.”

Homeworld: Vast Reaches is out now on Steam and Quest.

Homeworld: Vast Reaches Brings The Iconic Franchise to Virtual and Mixed Reality
Strategizing in real-time while resizing the vast reaches of space in Homeworld is an incredible feeling.
UploadVRDon Hopper

Update Notice

This article was initially published on October 3, 2025. It was updated on November 7, 2025, when Homeworld: Vast Reaches launched on Steam.

RUSH: Apex Edition Hands-On - Strong Remaster For An Aging Racer

6 novembre 2025 à 22:00

RUSH: Apex Edition brings the 2017 wingsuit racer back today on PlayStation VR2, read on for our full impressions.

The Binary Mill has been going all in on PlayStation VR2 this last year, delivering high-quality ports for Into Black and Resist while taking full advantage of PS5 Pro enhancements. More than eight years after RUSH first appeared on Gear VR, later followed by subsequent ports and updates, it's now returned with some welcome changes, like expanding online multiplayer to support 12 players.


RUSH: Apex Edition adds some appreciated visual upgrades like revamped lighting and textures, and it looks great in motion. Subtle touches like your mask showing frost in the corners as you glide through this icy mountain are rather nice, though I do wish the landings were smoother as you reach the end. Performance feels great at a native 120fps on PS5 Pro, while the base PS5 supports 90fps.

Four solo modes are included alongside online multiplayer. Standard 'Races' against the AI earn medals for a top-three finish, and those convert into points, unlocking more courses and wingsuit customization options. 'Time Attack' involves beating your own scores, while 'Score Challenge' adds an interesting twist where gliding through a specific part of a checkpoint ring earns a better score. Finally, 'Free Flight' mode lets you explore without any course restrictions.

Races are RUSH's biggest draw. Visually diverse environments set the scene well and each hosts dozens of courses that follow different paths, though said courses begin feeling very similar after a while. Even still, there's an initial rush (no pun intended) as you descend, gliding your way across these courses in hopes of finishing first. Failing to reach a checkpoint adds a five-second penalty, forcing you to follow a specific path to have any chance of winning.

The initial platform jump asks you to hold your gaze forward for three seconds before the race begins. That's tracked by a head-based pointer, and while you can swap to a less noticeable one, not using eye tracking for this feels like a missed opportunity. It's moments like this that show the game's aging foundations, something that also applies to the control scheme.

Screenshot captured by UploadVR on PlayStation VR2

Gliding and steering involve lifting your arms to different positions: raise both at once to ascend, lower both to descend, or alternate your hands to turn left and right. It's a functional but basic approach that leaves you holding your arms out, though it's a better choice than using analog sticks. For greater immersion, putting a fan on feels great as the cold “wind” hits you while racing.

You can build up speed boosts in two ways: either reaching checkpoints or gliding close to a wall and the ground, and I enjoy how RUSH: Apex Edition rewards risk takers with the latter. It's a critical balancing act as those boosts can be the difference between 1st and 2nd, but a single collision is all it takes to end your run. Boosting also benefits from adaptive trigger support on PlayStation VR2.

Descending through these courses remains satisfying, though that feeling becomes fleeting in longer stints. I'm having plenty of fun messing around in the lobbies where you can shoot some hoops, or shoot other players with dart guns; I'm just not compelled to stick it out much longer with the main game.

Given that PlayStation VR2 lacks backward compatibility with the original PlayStation VR, I'm pleased more games are getting a second life, though RUSH's aging gameplay makes it a harder recommendation in 2025. Still, Apex Edition is a great remaster effort from The Binary Mill that's the best way to play.

RUSH: Apex Edition is out now on PlayStation VR2, while previous versions remain available on PC VR, Pico, PSVR, and Quest.

32-Player VR FPS Forefront Is Out Now In Early Access

6 novembre 2025 à 21:05

Forefront, a 32-player VR FPS from the Breachers studio, is out now in early access on Quest, Steam, and Pico.

Developed by Triangle Factory, Forefront is a 16v16 VR shooter with expansive, semi-destructible maps where each team splits into four-person squads. Featuring four playable classes, four maps, a friends system, and weapon customization with attachments, it has now entered early access on all three platforms with cross-platform multiplayer support.


Launch trailer

Forefront takes place in a near-future setting of 2035, where an energy corporation called O.R.E. has gone to war with local governments over control of a rare mineral. Battles feature over 20 types of weapons and 10 vehicles covering land, air, and sea, while you can choose between four classes with their own unique weapons and gadgets: Assault, Engineer, Medic, or Sniper.

Detailing its release plans in a Steam FAQ, Triangle Factory states that Forefront will remain in Early Access for approximately "8-12 months." Planned additions for the full release include more maps, vehicles, and gadgets, joined by class perks and performance improvements. Updated PC VR graphics are also mentioned, and the studio plans to "gradually raise the price" as new content gets introduced.

Forefront's current roadmap

Forefront is out now in early access on Quest, Steam, and Pico. We'll be bringing you our full early access impressions as soon as we can.

PvP Brawler Elements Divided Leaves Early Access On Quest Today

6 novembre 2025 à 19:05

Elements Divided, the Avatar-inspired multiplayer action game, is leaving Early Access today on Quest.

Developed by Loco Motion Devs and published by Fast Travel Games (Mannequin), Elements Divided is a PvP brawler for up to eight people that lets battle-hungry players wield the powers of fire, water, earth, and air. After debuting in April this year, the multiplayer action game already saw its full release on Steam, and it's now leaving Early Access on Quest.


During our hands-on with the Early Access build, we called Elements Divided a great addition to the VR brawler genre, going on to say, “The matches are fast enough to just hop in for some quick rounds whenever else you have spare time, and frenetic enough that you'll work up a bit of a sweat doing so.”

Today's update also introduces sub-elements to combat. Loco Motion Devs is currently working on further free content updates that include more sub-elements, four new maps, and cosmetics, as well as a dedicated Winter Map update. While there aren't any exact dates specified for these additions, Fast Travel Games confirmed they will arrive between now and January 2026.

Elements Divided is available on Quest and Steam for $9.99, the former of which offers a free 30-minute trial.

Little Critters Is A Tower Defense Game That Hits Home

November 6, 2025 at 19:00

Little Critters does many things right. It minds the little details. The game is frenetic, accessible, and fun. And it just works, in the way Steve Jobs famously described the magic of early iPhone devices. But better than a casual iPhone time-killer for the subway… this “tower defense” game brings the tower home.

What if the tower in “tower defense” is your living room?

In 2025, tower defense games tend to follow an established formula rather than innovate broadly:

  1. Start: Few weapons, small horde.
  2. Progress: Add weapons, upgrades, and enemies.
  3. Endgame: Complex weapon combos, enormous hordes, chaos!

Little Critters’ core innovation is its point of view: you're not an omniscient invisible presence up above; you’re on the ground next to the tower. Enemies don’t attack from just 360 degrees - you’ll need to prepare fully spherical defense coverage as hordes crawl from below and swoop from up high.

It deftly leverages VR tech and industry best practices. The slingshot is a delight: aim via the relative position of your two controllers, pull and release via the grip button, then pile on the chaos of the horde. You’ve got a weapon that’s exactly as accurate as necessary - any more accurate (or less!) and mowing down critters simply wouldn’t be as fun. Spatial audio works perfectly to cue me to turn around to my blind spot and catch up on the spawn point I’ve been neglecting; no HUD needed.

Mostly it’s just a thrill to watch these monsters emerge from the wall and hop on my couch before jumping down to the floor. (Of course… moments before I splatter them!)

Perfect Little Details

Your robo-companion doesn’t just dance when you start a wave…


… they do the wave. 

Even staring straight at it, you might not make the connection, or you might shrug it off as a cheesy pun. To a critic, though, the signal here is quite loud: developer Purple Yonder could have put anything here, or nothing, and it chose to put something perfect. This inconsequential but briefly visible detail telegraphs the countless other truly invisible details where the team applied critical decision-making and taste.

When collecting slimeballs, the in-game currency, nearly every toss to your bank is a swish as long as there’s nothing physically obstructing your path.


This detail is likewise subtle, but there’s real gameplay impact: In later levels, as the difficulty ramps up, you’ll need to buff up your defense forces mid-wave. Few things would be as frustrating as losing a run because your slimeball missed the hoop. Or losing because you need to periodically abandon your tower to literally dunk the slimeball because soft tosses from any greater distance are too unreliable.

One final little detail that’s even more abstract: Little Critters doesn’t really have a main menu or “home” landing scene. Many games drop you at a menu where you select between “start” and “options,” or bog you down with preamble. Little Critters is remarkable in how hastily it gets you to your first wave of foes.

These details add up: this is the kind of game you could drop a Quest first-timer into, and one a more experienced gamer will likely relish too.

Scaling the Tower

Your weapons level up as you progress. While I wish I could choose where to apply my upgrades, Little Critters choosing for me encourages me to overcome waves using the whole weapons suite instead of winnowing down to (and more rapidly growing bored with) my early favorites.

I have mixed feelings about weapon effectiveness and combos: wall weapons seem best placed near the horde’s spawn points, but wherever you place them is rendered useless as soon as said spawn points relocate. And owing to the natural limitations of mixed reality gaming, wall weapons can be the most frustrating to reattach to a new area in a hurry.

For the floor weapons, it’s hard to tell whether, say, the bubble gun or the pie launcher provides more effective ground defense. Perhaps they work best in tandem, the bubble gun slowing critters down while the pie launcher knocks them out, or perhaps the difference is mostly aesthetic.

My only lament is that I’ve unlocked the second “realm” available, which brings with it a new cohort of baddies… but the “realm” itself is still my living room. That’s more than a word choice nit: Games gain depth by transporting you to new locales; Mythic Realms does it well enough, but this is difficult for any game operating on the MR pretense of being in your own room.

In hindsight, if I'm really motivated to seek a new “realm,” I could physically relocate from my living room to my kitchen, where I’d navigate the fridge and countertop instead of the couch and coffee table.

While a more sophisticated game might offer stat bars and DPS calculations, what makes Little Critters a rush is dialing that part of your brain down and pelting numerous ogres with pies or tomatoes. It's a worthy addition to your library following Purple Yonder's work on Little Cities, and it's out now on Quest 3/3S.

Meta Launches $1.5 Million Competition For New Quest Apps & Significant Updates

November 6, 2025 at 00:51

The Meta Horizon Start Developer Competition 2025 will award new Quest apps and "significant" updates a total of $1.5 million in 32 prizes of up to $100,000.

Meta Horizon Start, originally called Oculus Start years ago, is a program run by Meta that gives VR/MR developers direct access to Meta developer relations staff as well as a community Discord and "exclusive Meta events, advanced technical education, community mentorship, software credits, go-to-market guidance, and more".

Now, Meta is running a competition for Horizon Start Program developers to build or significantly update Quest apps across entertainment, "lifestyle", and gaming. Here's the list of the main awards and prizes:

  • Best Entertainment Experience: "An experience that makes consuming or watching content more immersive, interactive, and innovative than traditional formats."
    • $100,000 winner
    • $60,000 runner-up
    • $30,000 honorable mention
  • Best Lifestyle Experience: "An experience that enhances peoples’ daily lives, how they get things done, learn new skills, or connect with others around shared interests."
    • $100,000 winner
    • $60,000 runner-up
    • $30,000 honorable mention
  • Best Social Game: "A game that connects people in real time to play together online or in a colocated space."
    • $100,000 winner
    • $60,000 runner-up
    • $30,000 honorable mention
  • Best Casual Game: "A game that is accessible, single-player, and designed for quick, engaging fun."
    • $100,000 winner
    • $60,000 runner-up
    • $30,000 honorable mention
  • Judge’s Choice: "This award is given to experiences that push the boundaries of what’s possible and have created something truly unique and innovative."
    • $30,000 each for 6 winners

Additionally, there are eight "special awards" for implementing specific features or using specific SDKs and toolkits:

  • Best Implementation of Hand Interactions
    • $50,000 each for 3 winners
  • Best Use of Passthrough Camera Access with AI
    • $50,000 each for 3 winners
  • Best Immersive Experience Built with Spatial SDK
    • $50,000 each for 2 winners
  • Best Immersive Experience Built with Immersive Web SDK
    • $30,000 each for 2 winners
  • Best Android App Leveraging Features Unique to Meta Quest
    • $25,000 winner
  • Best Android Utility App
    • $25,000 winner
  • Best Android App for Travel Mode
    • $25,000 winner
  • Best Experience Built with React Native
    • $25,000 winner
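As a quick sanity check (ours, not Meta's), the amounts listed above do sum to the headline figures:

```python
# Tally of the prize structure listed above (amounts in USD).
main_categories = 4  # Entertainment, Lifestyle, Social Game, Casual Game
main_total = main_categories * (100_000 + 60_000 + 30_000)  # winner, runner-up, honorable mention
judges_choice_total = 6 * 30_000
special_total = (3 + 3 + 2) * 50_000 + 2 * 30_000 + 4 * 25_000
total = main_total + judges_choice_total + special_total
prize_count = main_categories * 3 + 6 + (3 + 3 + 2 + 2 + 4)
print(f"${total:,} across {prize_count} prizes")  # $1,500,000 across 32 prizes
```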

Entries can be entirely new projects or "significant" updates to an existing Quest app, with Meta citing the addition of hand tracking support, mixed reality, or multiplayer as examples of "significant". Essentially, for an update to qualify, it will need to bring a new modality.

The new competition comes late in a year in which Meta has awarded creators of worlds for the smartphone-focused Horizon Worlds a total of $3.5 million across three competitions, and it may allay some concerns that the company is focused solely on Horizon Worlds, with no further interest in Quest apps.

The deadline for submitting a project for consideration is December 9, and interested developers can enter the competition at this URL. If you're not already a Meta Horizon Start member, you'll need to apply first.

Cambridge & Meta Researchers Confirm "Retinal" Resolution Is Far Higher Than 60 PPD

November 5, 2025 at 22:34

Cambridge and Meta researchers conducted a study confirming that "retinal" resolution is far higher than the 60 pixels per degree figure often cited.

While you'll usually see only the panel resolution of a headset mentioned on its spec sheet, what really matters is its angular resolution, or how many pixels occupy each degree of the field of view: the pixels per degree (PPD). For an extreme example, if two headsets used the exact same panels but one had a field of view twice as wide, it would have half the angular resolution.
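To make the relationship concrete, angular resolution is simply the horizontal pixel count divided by the horizontal field of view those pixels span. The figures below are hypothetical round numbers for illustration, not the specs of any real headset:

```python
def ppd(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Angular resolution in pixels per degree: pixels divided by the field of view they span."""
    return horizontal_pixels / horizontal_fov_deg

# Same hypothetical 2000-pixel-wide panel; doubling the field of view halves the PPD.
print(ppd(2000, 50))   # 40.0
print(ppd(2000, 100))  # 20.0
```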

Since Oculus widely demoed the DK1 over a decade ago, we've seen the angular resolution of affordable headsets advance from 6 PPD, an acuity that would classify a person as legally blind, to now 25 PPD, while higher-end headsets like Apple Vision Pro and Samsung Galaxy XR reach around 35 PPD, and Varjo XR-4 even achieves 51 PPD in the center. But what's the limit past which the human eye can no longer discern a difference?

Meta Tiramisu “Hyperrealistic VR” Hands-On: A Stunning Window Into Another World
We also went hands-on with Tiramisu, Meta’s prototype that combines beyond-retinal resolution, high brightness, and high contrast.
UploadVR · David Heaney

In the XR industry, people often say that it's "generally accepted" that the limit is 60 PPD, since in theory, on paper, it offers 20/20 vision. Meta's Butterscotch prototype from a few years ago with 56 PPD was described as "near-retinal", for example. However, there has been significant skepticism of the 60 PPD figure among AR and VR experts for a long time now.

I tried Meta's 90 PPD "beyond retinal" Tiramisu prototype earlier this year, and while the demo wasn't set up to allow dynamically adjusting the resolution, the researchers behind it told me that they have done so in the lab and could clearly see a difference between 60 and 90. But this was only anecdotal.

Now, three researchers have conducted a study with 18 participants at the University of Cambridge, experimentally confirming the idea that 60 PPD is not the limit of human perception of detail.

Of the three authors of the paper, one is a Cambridge researcher, one is from Meta's Applied Perception Science team, and the third is at both.

The experiment setup.

Their experiment placed a 27-inch 4K monitor on a 1.6-meter motorized sliding rail in front of the participants, who had their heads fixed on a chin rest and were asked to discern specific visual features head-on as the conditions were varied.

The participants were presented with two different types of stimuli throughout the experiment: square-wave grating patterns (both with and without color) and text (both white-on-black and black-on-white).

Square-wave gratings, the researchers explain in the paper, are used in vision experiments because prior research suggests that "the foundational visual detectors of the human visual system are likely optimised for similar waveforms".

The resolution was varied both by moving the display closer or further away (between 1.1 m and 2.7 m) and by upsampling or downsampling the spatial frequency of the patterns. The researchers also adjusted the viewing angle between 0°, 1°, and 20°.
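For a flat display viewed head-on, the effective PPD at each rail distance follows from basic trigonometry. A rough sketch, assuming a 16:9 27-inch 4K panel (3840 pixels across roughly 0.598 m of width); the study's exact panel dimensions may differ:

```python
import math

def flat_panel_ppd(width_m: float, h_pixels: int, distance_m: float) -> float:
    """Effective pixels per degree of a flat display viewed head-on from a given distance."""
    fov_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
    return h_pixels / fov_deg

PANEL_WIDTH_M = 0.598  # assumed width of a 16:9 27-inch panel
for d in (1.1, 2.7):
    print(f"{d} m -> {flat_panel_ppd(PANEL_WIDTH_M, 3840, d):.0f} PPD")
# Roughly 126 PPD at the near end of the rail and 304 PPD at the far end.
```

Moving the display only varies PPD within that range, which is why the patterns were also up- and downsampled to probe thresholds below the panel's native angular resolution.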

For the full details of the experimental methods, you should read the paper in Nature Communications. It's an interesting read and you'll learn a lot about how this kind of perceptual science research is conducted. But it's the results that have fascinating implications for VR and AR.

The findings of the experiments.

The findings of the experiment, according to the researchers, are that participants could on average discern grayscale details up to 94 PPD, red-green patterns up to 89 PPD, and yellow-violet patterns up to 53 PPD.

One participant in the study was even able to reach 120 PPD for grayscale, suggesting that for some people the threshold for "retinal" is double the generally accepted figure.

It will be a long, long time before shipping headsets reach anywhere near these resolutions. Meta's Tiramisu prototype hit 90 PPD only over a tiny 33° field of view, and Tiramisu 2 is aiming for 60 PPD over a 90° field of view instead as a better balance of specs. And while the study demonstrates that there is a difference, in my experience headsets with even "just" 56 PPD can feel incredibly real to the point where I suspect we won't want to trade off other aspects for further resolution any time soon.

Still, it's important that a formal study has been conducted to discover exactly where the limit to what the human eye can truly discern lies, and it reinforces the fact that while smartphones and tablets are plateauing, VR and AR hardware still has decades of runway for meaningful improvements to steadily arrive.

One point of skepticism here, however, is that a stationary display system like the one in the experiment doesn't benefit from the spatiotemporal supersampling effect you get for free in a positionally tracked VR headset from the natural micromovements of your head.

Quest 3S On Sale Certified Refurbished For Lowest Price Ever

November 5, 2025 at 18:30

Quest 3S and the 512GB Quest 3 certified refurbished are just $216 and $360 respectively this week on Meta's official US eBay page.

First spotted by IGN, you can find the Quest 3S deal here and the Quest 3 deal here. For both headsets, you need to enter eBay's discount code TECH4THEM to get the lowest price.

The code expires after 11:59pm Pacific Time on Sunday, the end of this week.

Meta claims that its certified refurbished headsets "are inspected and thoroughly tested, professionally cleaned, and restored to original factory settings so they function and look like new and include the same accessories and cables as new devices". And eBay is offering a two-year certified refurbished warranty, a year longer than you'd get even if buying a new headset from Meta.com.

Meta Now Sells Quest 3S Refurbished For $270
Meta now offers Quest 3S refurbished from $270, though you can get the headset new for that price for the next few days.
UploadVR · David Heaney

Quest 3S certified refurbished is normally $270, and the lowest we've seen it sold at brand new is $250, so the $216 offer here represents by far the lowest price we've ever seen for a fully standalone headset with included tracked controllers, hand tracking, and color mixed reality.

Meanwhile, Quest 3 certified refurbished at $360 is arguably an even better deal, as the only remaining 512GB model certified refurbished is normally $450.

While Quest 3S can run all the same content as Quest 3, and has the same fundamental capabilities (including the same XR2 Gen 2 chipset and 8GB RAM), if you have the funds we always recommend Quest 3 over Quest 3S. The proper Quest 3 features Meta's advanced pancake lenses which are clearer and sharper over a wider area, have a wider field of view, and are fully horizontally adjustable, suitable for essentially everyone's eyes. These pancake lenses also enable Quest 3 to be thinner, which makes the headset feel slightly less heavy.

Still, at just $216, Quest 3S certified refurbished enters the realm of an impulse buy for many, or perhaps an impulse gift for the holiday season to bring a friend or loved one into VR.

Assassin's Creed Nexus Joins Top 50 Best-Selling Quest Games of All Time

November 5, 2025 at 17:25

Assassin's Creed Nexus has joined the top 50 best-selling paid Quest games of all time, with Bonelab now in the top 10.

You may recall that back in April, Meta revealed the 50 best-selling paid Quest games of all time via a then-new section of Quest's Horizon Store. This excludes free-to-play games unless they initially launched as a paid title, such as Population: One, and the lineup has seen some slight changes in the past six months.

We're not certain when this list was last updated, but compared to April's charts, Assassin's Creed Nexus is arguably the biggest new name to arrive at #50. NightClub Simulator is the only other new entry at #46. Exiting the list are the former #50, Please, Don’t Touch Anything, and former #48, Angry Birds VR: Isle of Pigs.

The top 10 games are mostly unchanged; Beat Saber retains #1, followed in order by Job Simulator, Superhot VR, Blade & Sorcery: Nomad, The Thrill of the Fight, Virtual Desktop, and Among Us 3D. The exceptions are Vader Immortal Episode I at #8, which pushed Onward down to #9, and Bonelab at #10, overtaking The Walking Dead: Saints & Sinners.

It's worth remembering this list is unlikely to include Asgard's Wrath 2 and Batman: Arkham Shadow. The former was initially bundled for free with every new Quest 3, and Arkham Shadow did the same for new Quest 3 and Quest 3S purchases. These activations wouldn't be considered sales. Titles in the Horizon+ games catalog are also less likely to appear, since subscribers can access them without a separate purchase.

For everything else, here's Meta's full list of the best-selling paid Quest titles of all time as of November 5, 2025:

  1. Beat Saber
  2. Job Simulator
  3. Superhot VR
  4. Blade & Sorcery: Nomad
  5. The Thrill of the Fight
  6. Virtual Desktop
  7. Among Us 3D
  8. Vader Immortal Episode I
  9. Onward
  10. Bonelab
  11. The Walking Dead: Saints & Sinners
  12. Creed: Rise to Glory
  13. Vader Immortal Episode III
  14. Five Nights at Freddy’s: Help Wanted
  15. Vader Immortal Episode II
  16. GOLF+
  17. Population: One
  18. Eleven Table Tennis
  19. Drunkn Bar Fight
  20. Walkabout Mini Golf
  21. I Am Cat
  22. Contractors
  23. GORN
  24. Resident Evil 4
  25. NFL Pro Era
  26. Pistol Whip
  27. The Thrill of the Fight 2
  28. Vacation Simulator
  29. Ghosts of Tabor
  30. Real VR Fishing
  31. Waltz of the Wizard
  32. Wander
  33. A Township Tale
  34. The Climb 2
  35. Star Wars: Tales from the Galaxy’s Edge
  36. Pavlov Shack
  37. Fruit Ninja
  38. Hand Physics Lab
  39. Arizona Sunshine
  40. I Am Security
  41. I Expect You To Die
  42. Gun Club VR
  43. Warplanes: WW1 Fighters
  44. Shave & Stuff
  45. The Room VR: A Dark Matter
  46. Nightclub Simulator
  47. Skybox VR Video Player
  48. The Climb
  49. Moss
  50. Assassin's Creed Nexus

Did you expect any wider changes or any other games to appear? Let us know in the comments below.

Laser Dance Early Access Review: The Mixed Reality Game Quest 3 Needs

November 5, 2025 at 11:00

A wizard arrives precisely when he means to this week with the early access release of Laser Dance in mixed reality.

The wizard in question is Cubism developer Thomas Van Bouwel, and his newest creation, Laser Dance, gives you a reason to scan your living room with a Meta Quest 3 or 3S headset for breakthrough mixed reality gameplay.


Laser Dance clip provided by Thomas Van Bouwel

Released over two years ago now, Quest 3 was pitched as a "next-gen mixed reality device" with high quality passthrough and the promise of a new class of experience using your physical environment as the backdrop to gameplay. We've seen some interesting work in this space, with last year's Starship Home being one of the first examples of what a game could do with your room as a backdrop.

The Facts

What is it? Figure out how to get past the lasers to touch a button on the opposite wall. This is a mixed reality game that requires an accurate room scan to work properly and can be played with hand tracking or tracked controllers.
Platforms: Quest 3/3S
Release Date: November 6, 2025 (Early Access)
Developer/Publisher: Vanbo BV
Price: $9.99

Laser Dance is more accessible than established hits like Beat Saber, given it works with or without controllers. If you cast the view from Quest in a party setting, watching your friend crawl across the living room to avoid a low laser is likely far more engaging than watching them slice boxes. Even after Laser Dance comes off your head, there's going to be joy in watching others dodge lasers.

Gameplay consists of getting from one end of your room to the other. The only rule is that your head, arms, and spine cannot cross paths with one of the lasers. The game uses Quest's upper body tracking to figure out where you are, and you'll learn through play alone that your legs aren't tracked. That means your legs can't collide with the lasers, nor end your run across the room. It's up to you whether you let that affect your strategies around the lasers. You can exploit it like people who cheat in laser tag by covering their body-worn sensors, or you can just continue to carefully step over lasers near the ground because it's fun to imagine the system could track that danger too.

Big red buttons on opposing walls mark the start and end points of each level and you set up each playspace yourself by selecting the spots on the walls where the buttons go. As you would expect, difficulty stacks over your successive trips back and forth across the room, with lasers that move or blink in patterns you need to think about for a little bit before making your move.


Laser Dance clip provided by Thomas Van Bouwel

Over a couple of hours of play, only a couple of times did I feel like the system unfairly matched my body movements to a laser, forcing me to walk back across my living room half a dozen times in a row to try again. The solution on one particular level, I found, was to crawl across my floor just a little further than I thought should have been required to get past a laser colliding with my back. By the end of my time playing with only hand tracking, I found myself holding my hands up in front of my face to ensure the headset saw them and didn't think my elbows were behind me.

There are also timed and no-fail challenges.

Comfort

Laser Dance adapts each level to both your room layout and body dimensions, the latter of which can be adjusted in the "accessibility" tab of the options menu. There are no artificial locomotion options; you must move physically across your environment.

You can register player height, shoulder width, and also set the lowest height you can go if mobility is an issue. Player height can be adjusted automatically, and you can also halve the speed of moving and blinking lasers.

Room-scale mixed reality was promised by Meta for Quest 3 when it released in 2023. In 2025, Laser Dance is the most accessible demonstration of why mixed reality is best in a VR headset, and why hand tracking is the future.

Laser Dance Around Your Furniture

I finished Laser Dance's included early access levels without controllers in hand, sweating under the headset, after moving through spots in my home where I'd never taken a Quest 3 before, creating solid memories as I went. I've never experienced anything like this in a headset and, even in early access, Laser Dance is one of the first experiences you should drop a friend into so they can understand what's possible in mixed reality with a Quest 3 or 3S.

Laser Dance is one of the easiest-to-play games ever made. It's not endlessly replayable, at least not yet, but it belongs in most libraries and should be a go-to party game. Thomas Van Bouwel is introducing us to the idea that dodging your furniture is just part of the fun as mixed reality lasers buzz when you get too close and cut into your carpeting with murderous energy.


UploadVR normally uses a 5-Star rating system for our game reviews – you can read a breakdown of each star rating in our review guidelines. As an early access release, this review is unscored.

Realize Music: Sing Relaunches The VR Self-Care Singing App Soon

November 5, 2025 at 10:00

Realize Music: Sing, a self-care singing app, relaunches next week on Quest.

Following an initial “soft launch” earlier this year, Realize Music: Sing by Realize Music - a studio co-founded by Devolver Digital co-founder Mike Wilson - is returning on November 13. Boasting a music catalog of over 1 million licensed tracks, it aims to amplify singing as a tool for joyful expression. While we had considerable criticisms back in February, this new release comes with a changed access model and expanded features.

Notably, you no longer need a subscription to jump in, since you can now preview a selection of tracks for free, though a subscription remains in place for unlimited catalog access. Songs and albums are now purchasable individually, while Realize Music also promises improvements to song discovery across the library and word-by-word lyrics.

Two new gameplay modes are also available: a 'Song Hero' mode that has you competing for high scores across leaderboards, and Singadelic Mode, a non-scoring option “that turns every track into a freeform, expressive wellness experience.” New tracks will also be added weekly.

Realize Music states it's aiming to create a “safe, judgment-free space to sing” with a reactive world that responds to your voice.

Realize Music: Sing will relaunch on November 13 on Quest in the United States, with plans to follow in additional regions as licensing expands. An introductory offer lets you optionally subscribe for $9.99 per month for the first three months, which then increases to $14.99 per month or $119.99 per year. 

Roboquest VR Gets PlayStation VR2 & Steam Release Date

November 4, 2025 at 22:00

Roboquest VR brings the roguelite action shooter to Steam and PlayStation VR2 later this month, followed by Quest next year.

Developed by RyseUp Studios, Roboquest originally launched two years ago as a flatscreen PC game, and we've been anticipating Roboquest VR ever since our preview in March. Playing as a Guardian in this FPS roguelite with a comic book-inspired art style, you take down mechanical foes across randomly generated environments while navigating bullet hell battles. Now, we've learned it's launching later this month.

As seen on PlayStation Blog, Flat2VR Studios confirmed this upcoming adaptation has been “fully rebuilt for VR” with new features including manual reloading and interactive weapon handling. Co-op support will arrive in a future update in early 2026, while other PS VR2-specific features include adaptive trigger support, controller haptics, headset rumble, and eye-tracked foveated rendering.

It's one of today's five major announcements from Flat2VR Studios, which has been hosting a PlayStation VR2-focused livestream via PSVR2 Without Parole. Other announcements include a surprise launch for VRacer Hoverbike on Sony's headset, PS VR2 release dates for Audio Trip and Shadowgate VR later this month, plus updates on Out of Sight VR and RAGER.

Roboquest VR is out on November 20 for Steam and PlayStation VR2, while the Quest version will launch in early 2026.

Roboquest VR Is My Most Anticipated Flat2VR Studios Game
Roboquest VR shows promise with compelling FPS roguelike mechanics, and we recently previewed the PC VR version.
UploadVR · Henry Stockdale

VRacer Hoverbike Flies Onto PlayStation VR2 Today

November 4, 2025 at 21:45

Futuristic racing game VRacer Hoverbike just launched on PlayStation VR2.

First released on Steam Early Access seven years ago, VRacer Hoverbike by VertexBreakers entered full release this June alongside a new Quest port. Offering a simcade hoverbike racer where you fly down one of 30 futuristic tracks, it's out today on PS VR2 with cross-platform multiplayer support, dynamic foveated rendering, headset rumble, and adaptive triggers.


Much like VRider SBK, VRacer Hoverbike uses a ‘chest-leaning control system’ instead of traditional analog stick-based controls. Seven gameplay modes are available, including a career mode, time trials, and weekly challenges. You can also select a combat mode, which introduces items like missiles, drones, and EMPs into the mix.

Today's launch joins several major announcements from Flat2VR Studios, which shared more on PlayStation Blog alongside a livestream via PSVR2 Without Parole. Other reveals include PS VR2 release dates for Audio Trip, Roboquest VR, and Shadowgate VR, all of which will launch later this month. We also learned further news on Out of Sight VR and RAGER.

We recommended VRacer Hoverbike in our August review across Quest 3 and Steam, considering it an enjoyable VR racing game that “feels fast, tactical, and physically engaging.”

It’s a worthy step forward for the futuristic-racer genre, with innovative leaning mechanics taking players deeper into the action and making them feel like they're in control. Add in customizable content and the smart design choices that make every race more thrilling than the last, and you have the makings of a solid racer that delivers a nice rush of adrenaline every time you play.

VRacer Hoverbike is out now on PC VR, PlayStation VR2, and Quest.

VRacer Hoverbike Review: Perfect Blend Of Speed, Motion & Haptics
VRacer Hoverbike recently drove onto Quest with futuristic hoverbikes, and we took it for a spin.
UploadVR · Don Hopper

Flat2VR Studios Is Releasing Four PlayStation VR2 Games This Month

November 4, 2025 at 21:35

Flat2VR Studios is releasing four games this month alone on PlayStation VR2, with at least two more to follow in 2026.

It's becoming an increasingly busy November for new VR games, and that's only expanding with today's news from Flat2VR Studios. Revealed through PSVR2 Without Parole and PlayStation Blog, the publisher is celebrating Sony's headset with a day of new PlayStation VR2 announcements. Some of these are multiplatform games, though the focus is primarily on Sony's headset.

We've listed today's announcements in release order where possible, and it's worth noting Flat2VR Studios also uses the Impact Inked label for games that aren't direct VR adaptations of flatscreen titles. Today's stream also confirmed that a quality-of-life update and seasonal DLC items are coming to Surviving Mars: Pioneer.

We've linked new trailers and videos where individually available, and here's what you can expect on PlayStation VR2 soon.


VRacer Hoverbike - Out Today

Developed by VertexBreakers, V-Racer Hoverbike is a simcade motorbike racer where you fly down one of 30 futuristic tracks. Following its recent relaunch on Steam and a Quest port, it's out now on PlayStation VR2 with cross-platform multiplayer support, dynamic foveated rendering, headset rumble, and adaptive triggers.

VRacer Hoverbike Review: Perfect Blend Of Speed, Motion & Haptics
VRacer Hoverbike recently drove onto Quest with futuristic hoverbikes, and we took it for a spin.
UploadVR · Don Hopper

Audio Trip - November 11

One of today's two new reveals for PlayStation VR2, rhythm game Audio Trip has been around since its early access launch in 2019. Featuring 120 levels across 32 tracks with playlists and a built-in choreography editor, this upcoming version supports dynamic foveated rendering and headset rumble.


Roboquest VR - November 20

We've been anticipating Roboquest VR ever since our GDC preview, and Flat2VR Studios has confirmed a release date for the roguelite action shooter. The VR adaptation features manual reloading and interactive weapon handling, with co-op support arriving in a future update in early 2026. The PS VR2 version supports adaptive triggers, controller haptics, headset rumble, and eye-tracked foveated rendering.

It's also launching on SteamVR on the same day, and the Quest version will follow in early 2026.

Roboquest VR Is My Most Anticipated Flat2VR Studios Game
Roboquest VR shows promise with compelling FPS roguelike mechanics, and we recently previewed the PC VR version.
UploadVR · Henry Stockdale

Shadowgate VR: The Mines of Mythrok - November 25

Shadowgate VR: The Mines of Mythrok has been around since 2021, delivering a first-person fantasy action-adventure set in the world of Kal Torlin. The dungeon crawler's new port runs at 120Hz without reprojection and uses foveated rendering. Headset and Sense controller haptics are supported on PS VR2 alongside adaptive triggers, and eye tracking is also integrated into the gameplay.


RAGER - Q1 2026

Developed by Insane Prey, music action brawler RAGER is heading to PlayStation VR2 next year following last month's early access launch. Featuring twelve levels, three boss fights, online leaderboards, and an electronic soundtrack ranging from darksynth to metalstep, the upcoming PS VR2 port will feature headset rumble and haptic feedback support.

RAGER Hands-On: Slice And Dice Is Pretty Nice
RAGER presents an enjoyable cyberpunk rhythm fighting game, and it’s out now in early access on Quest 3 and SteamVR.
UploadVR · K. Guillory

Out of Sight VR - 2026

Out of Sight VR received an early access launch alongside its flatscreen counterpart on Steam back in May, and we've known for some time that it's coming to PlayStation VR2 and Quest as well. While the ports initially targeted a late 2025 release on both headsets, they're now arriving in 2026.

Out of Sight VR Review-In-Progress: A Sight to Behold
Out of Sight’s “second-person” perspective proves it’s already a great fit for VR, even with more polish to come.
UploadVR · James Galizio

'XR Is Having A Moment': Colocated Pickleball With Pickle Pro From Resolution Games

4 November 2025 at 18:45

Alex Coulombe and I played pickleball together recently.

We each opened Pickle Pro from Resolution Games in our Apple headsets, and I clicked the share icon next to its window. Coulombe clicked accept, and we soon had a pickleball court overlaid on the ground between us. After centering the court and switching sides, we played a full match together.

That play session forms part of the backdrop for Coulombe's “XR is Having a Moment” commentary on his YouTube channel, in which he talks over video of our match, embedded below.

Earlier this year, I put together a puzzle in Jigsaw Night in much the same way, joined in Quest headsets by the app maker Steve Lukas and CNET writer Scott Stein aboard the Queen Mary in Long Beach.

“It’s largely overlooked how monumental it is that we finally have solid automatic co-location technology in affordable VR,” Lukas wrote to me over direct message this week. “If enough developers seize on it properly, this moment in time may actually be our real 'ground floor' for ubiquitous mixed reality.”

The invisible anchoring systems making these colocated experiences “just work” have been in development by Apple and Meta for years. We've gone from an impressive makeshift arena-scale multiplayer experience at Oculus Connect in 2018 to tools in 2025 that enable people like Coulombe and his partners to build experiences that sell real estate. Though it required manual alignment, I've even gotten this kind of colocated experience working in the app Figmin XR between Vision Pro and Quest.


We're a long way from people in any VR headset sharing the same digital universe easily, but what developers are starting to do already between the same headsets feels like magic.

“There's a lot of work we've put into using Antilatency and Optitrack and Meta Shared Spatial Anchors to get people to be in VR headsets and to feel like they are in the same place looking at the same content and having the real world align with that,” Coulombe says in the video. “And Apple went ahead and in VisionOS 26, they just made this work.”

Pickle Pro sells for $7.99 on the Apple App Store, while Home Sports on Meta Quest sells for $19.99. Both Resolution Games titles have colocation features, and the studio says it's making colocation a priority across its games.

“We've also invested heavily in colocation with Spatial Ops, where multiplayer is designed for colocation. We have tons of videos of players playing colocated with friends in the most amazing locations, which is so fun to see,” wrote Mathieu Castelli, Resolution Games Chief Creative Officer, over email.

“Sure colocation can be a rare thing, but when you have friends with their headsets, it really feels like a miss to not have colocation, so we've really been making that a priority with our games. Even Battlemarked, which is launching later this month, will have colocation.”
