Medivis helps doctors overlay 3-dimensional MRI scans on patients to visualize anatomy. This use of augmented reality helps doctors identify the exact location of tumors, blood vessels, and other structures before performing surgery.
The Quest 3 lets you scan a room and build up an internal 3D mesh that represents the world you are in. The scan can take anywhere from 20 seconds to several minutes, requires the user to walk around the area – and the mesh can't update dynamically as doors open and close.
The Depth API provides live depth frames out to 5 meters – but how do you use those to build up the environment in real time?
Julian Triveri's multiplayer mixed reality Quest 3 game Lasertag does just this. It takes the live depth frames and reconstructs the scene with an open-source Unity implementation of marching cubes. Apple Vision Pro and Pico 4 Ultra already use this method – but they have hardware-accelerated depth sensors to help. Quest 3 developers need to do this computation themselves.
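To make the pipeline concrete, here's a minimal sketch of the step that sits in front of marching cubes: unprojecting a depth frame into a point cloud and fusing it into a voxel grid. This is an illustrative simplification (real systems like Lasertag's accumulate a signed distance field and handle camera pose), and the function names, intrinsics, and occupancy approach here are my own assumptions, not Triveri's actual code.

```python
import numpy as np

def unproject_depth(depth, fx, fy, cx, cy):
    """Convert a depth image (meters) into a 3D point cloud in camera space
    using pinhole camera intrinsics (fx, fy, cx, cy)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def fuse_into_grid(points, grid, origin, voxel_size):
    """Mark voxels containing depth points as occupied. A crude occupancy
    fusion -- production systems accumulate a truncated signed distance
    field per voxel instead, which marching cubes then turns into a mesh."""
    points = points[points[:, 2] > 0]  # ignore invalid (zero-depth) readings
    idx = np.floor((points - origin) / voxel_size).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(grid.shape)), axis=1)
    idx = idx[inside]
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return grid
```

Each new depth frame updates only the voxels it touches, so marching cubes can be re-run over just the dirty regions of the grid rather than the whole volume.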
This new VR headset purports to fix a lot of common VR issues. It boasts 2560 × 2560 micro-OLED displays at 75 Hz native and a hugely wide 116º diagonal FOV that claims 100% edge-to-edge sharpness, all in a much smaller package than current VR headsets. It's also dramatically lightweight at 272 grams, compared to the 518 grams of a Quest 3.
Bigscreen Beyond 2 isn't cheap – it costs $1,019.
This is a pretty good review. He's most impressed with the groundbreaking lenses, the comfort that comes from the light weight, and some techniques that reduce the jitter that normally makes quick left-right head turns disorienting.
The CEO is clearly a technically knowledgeable fellow who likes talking about the factors that make this device good – a wonderful change from Apple’s ‘magical’ marketing and Meta’s shotgun approach. He talks about the challenges and promises of newer approaches like foveated rendering.
Voxon has been showing off its Voxon VX2 VLED technology, which creates interactive volumetric holograms. At $6,800, it's definitely not cheap.
It's likely using a high-RPM spinning panel to generate the image, which means that damping the sound of the spinning array and keeping the display carefully synced to avoid pixel drift are some of the primary engineering concerns. They do provide Unity and Blender SDKs, which is interesting.
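The sync problem above can be illustrated with a small sketch: if the panel spins through the display volume, then at each refresh it must show the voxel cross-section matching its current rotation angle (read from an encoder), or the image smears. This is purely my guess at how such a swept-volume display works, not Voxon's documented design, and the geometry (flat panel through a vertical axis, nearest-neighbor sampling) is a deliberate simplification.

```python
import numpy as np

def slice_for_angle(volume, theta):
    """Return the 2D cross-section a spinning flat panel should display at
    azimuth theta, sampled from a (n, n, height) voxel volume whose rotation
    axis passes through the grid center. Nearest-neighbor sampling keeps the
    sketch short; a real driver would precompute these lookup tables."""
    n, _, height = volume.shape
    c = (n - 1) / 2.0
    r = np.arange(n) - c  # radial position of each panel column
    x = np.clip(np.round(c + r * np.cos(theta)).astype(int), 0, n - 1)
    y = np.clip(np.round(c + r * np.sin(theta)).astype(int), 0, n - 1)
    return volume[x, y, :]  # shape: (panel_width, panel_height)
```

If the encoder angle drifts even slightly relative to the slice being shown, every voxel lands at the wrong azimuth – which is presumably why the sync is such a central engineering concern.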
Here’s a version of Doom playing on the volumetric display
Google Earth VR was the first mainstream real-world immersive map exploration app for modern PC VR headsets, but the app never made it to the standalone VR headset era. The new app FLY also uses Google Earth's 3D map tiles, and brings exploring Google Earth in VR to Quest 2, Quest Pro, Quest 3, and Apple Vision Pro. It even includes the 3D geometry for certain cities.
I wrote about Disney's VR floor before; now come VR shoes from FreeAim. They strap to your feet a little like roller skates. What's interesting is that the shoes sense where they are and apply small auto-corrections to steer you back toward the center of your play space.
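The auto-correction idea can be sketched as a simple proportional controller: each tick, nudge the wearer back toward the center, capped so the correction stays below what a person would notice. The gain and step values here are invented for illustration – FreeAim hasn't published their actual control scheme.

```python
import numpy as np

def recenter_step(pos, center, gain=0.1, max_step=0.02):
    """One correction tick: move the tracked position a small, capped amount
    toward the play-space center. gain is unitless; max_step is in meters
    per tick (both illustrative values)."""
    step = gain * (center - pos)
    norm = np.linalg.norm(step)
    if norm > max_step:
        step *= max_step / norm  # cap the correction so it stays imperceptible
    return pos + step
```

Applied repeatedly, the corrections converge on the center: large offsets shrink at the capped rate, small ones decay geometrically.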
It sure seems a bit more viable than some of the treadmill systems like the Kat Walk, in which you're strapped on top of a slick curved surface.
Ryan Trahan used VR goggles fed by a camera attached to a rig looking down on him from the classic third-person game camera position. He then tried to survive the next 50 hours doing everyday things.
It appears to have gone comically rough for him. Doors, curbs, playing basketball, and manipulating anything small (eating food with utensils, picking up coffees, brushing his teeth) were his worst enemies.
However, I think he might be onto something. If the camera setup were better – perhaps offset to the left or right so he could see his own hands – someone could likely make a living as a 'living robot' who livestreams their life.