Category: VR

Shibuya Halloween 2021 is Virtual

Shibuya is trying something interesting to deal with the annual gathering of Halloween revelers during COVID. They’re moving the event online for free, complete with concerts, fan groups, color pages, and lots of other activities.

You can join the events online for free and even attend the festivities in VR:

Japan never traditionally celebrated Halloween. During the 1990s, a bunch of Halloween-loving foreigners started wearing costumes and riding the loop trains around Tokyo. Unfortunately, drunken behavior caused the police to ask the riders to disembark (please don’t be one of these ugly tourists; I don’t care how cool you think you are, it’s rude to go to another country and act like that).

Revelers soon started gathering in the neighborhoods around the train stations, and the street parties grew and grew. Shibuya became one of the major stops for these festivities and has grown into the focal point. Unfortunately, large amounts of trash, drunken disturbances, and destruction have become all too common, which is a shame, since the night could be such a good chance for people to have fun and share some cross-cultural exchange.

DCS and Augmented Reality

DCS World is one of the most amazing flight simulators out there. The hyper-realism and complexity of just getting your plane off the ground are well known. Modders and modelers sell hyper-realistic models for the game, like the F/A-18C Hornet. In the high-fidelity cockpits, literally every button, switch, and knob is clickable. It’s not just for show, either: people claim they can start up the real planes after learning the procedures in DCS first.

One user has cleverly added some augmented reality to their setup.

If you like this, check out some of the amazing fun folks have with this game on the Grim Reapers’ YouTube channel. They run interesting, and often hilarious, skill competitions, like their AAA/SAM-evasion canyon run.

360 VR Spheres

Nova is an “untethered VR motion simulator” that makes virtual reality games and training programs feel more real by rotating in any direction. The 5.9-foot-diameter sphere, which has been compared to a “human-sized hamster ball,” weighs about 1,100 lbs and can simulate vehicles of all sorts by rotating a full 360 degrees on any axis.

These units are too expensive for the home gaming market; instead, the company leases each Nova unit, with ongoing maintenance and upgrades, for US$150,000 per year. Eight360 is working with defense forces and the mining and forestry industries, where vehicles cost millions of dollars, accidents are a very big deal, and training needs to encompass tilt angles.

Unreal and films – highlight reel

Epic Games and Unreal Engine helped to develop the amazing virtual environments used in The Mandalorian.

As I have covered before, this is not done with greenscreens; instead, projected backgrounds are rendered in real time based on the camera’s position. This solves almost all of the visual problems created when using greenscreens.

As this highlight reel shows, their technique of combining projected, camera-tracked CG environments with live actors and props can be used for all levels of production, not just blockbusters.
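
To make the idea concrete, here is a minimal sketch of what “rendered in real time based on the camera’s position” means: every frame, the tracked stage camera’s pose drives a fresh projection of the background geometry, so perspective and parallax stay correct from the camera’s point of view. This is a toy pinhole-projection model, not the actual pipeline (which runs inside Unreal Engine); the function name and camera parameters are illustrative assumptions.

```python
import numpy as np

def project_background(points_world, cam_pos, cam_rot, f=1400.0, w=1920, h=1080):
    """Toy pinhole projection: re-render background geometry from the
    *tracked* camera pose so the wall shows correct parallax this frame.

    points_world: (N, 3) background vertices in world space
    cam_pos:      (3,)   tracked camera position
    cam_rot:      (3, 3) world-to-camera rotation from the tracking system
    """
    cam_space = (points_world - cam_pos) @ cam_rot.T   # world -> camera space
    in_front = cam_space[:, 2] > 0.1                   # drop points behind the camera
    cam_space = cam_space[in_front]
    uv = f * cam_space[:, :2] / cam_space[:, 2:3]      # perspective divide
    uv += np.array([w / 2.0, h / 2.0])                 # principal point at image center
    return uv  # re-run every frame the camera moves
```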

I suspect we’re going to see some very interesting applications soon.

Immersive Light Field Video with a Layered Mesh Representation

I worked with a little bit of early light field photography back in the day. It looks like the technique has advanced and found an interesting VR application. These researchers present a system for capturing, reconstructing, compressing, and rendering high-quality immersive light field video.

Here’s the SIGGRAPH paper and some more examples:
https://augmentedperception.github.io/deepviewvideo/
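
As I read it, the layered mesh representation stores the scene as a small stack of semi-transparent textured layers that are composited back to front for whatever viewpoint the viewer requests. Here is a minimal sketch of just that “over” compositing step; it leaves out the capture rig, reconstruction, and compression stages, and the function name and array shapes are assumptions for illustration.

```python
import numpy as np

def composite_over(layers_far_to_near):
    """Composite RGBA layers back to front with the standard 'over' operator.

    layers_far_to_near: list of (H, W, 4) float arrays in [0, 1],
    ordered from the farthest layer to the nearest one.
    """
    out_rgb = np.zeros_like(layers_far_to_near[0][..., :3])
    for layer in layers_far_to_near:
        rgb, alpha = layer[..., :3], layer[..., 3:4]
        # Nearer layers occlude farther ones in proportion to their alpha.
        out_rgb = rgb * alpha + out_rgb * (1.0 - alpha)
    return out_rgb
```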

Giving up on greenscreen?

CG has always had problems with realism. Eye-lines and focus distances are never perfect. Colors between live/CG elements never quite match. Reflections can be directionally incorrect, missing, or mismatched in color/intensity. Lighting color/intensity/direction is often inconsistent between the live elements and CG elements. Mattes have problems at edges. Motion tracking is usually off by just enough to cause odd movement discontinuities. All of this makes CG look cheap.
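
To see why mattes misbehave at the edges, here is a deliberately naive chroma-key sketch: alpha is just a hard threshold on the distance from the key green, which is exactly what produces harsh, stair-stepped edges around hair and leaves green spill on the subject. The threshold value and helper name are my own illustrative choices, not any particular keyer’s algorithm.

```python
import numpy as np

def naive_green_matte(rgb, key=(0.0, 1.0, 0.0), tolerance=0.45):
    """Naive chroma key: alpha from color distance to the key green.

    rgb: (H, W, 3) float image in [0, 1].
    A hard threshold like this gives harsh matte edges and ignores green
    spill on the subject, two of the problems listed above.
    """
    dist = np.linalg.norm(rgb - np.asarray(key), axis=-1)
    return (dist > tolerance).astype(np.float32)  # 1 = keep pixel, 0 = background
```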

But there is a new approach using LED stages: large displays surrounding the shooting scene. It’s completely changing the game. Even more amazing, the camera-tracked environments are simulated and rendered using the Unreal game engine. Back in the mid-2000s, I worked on a project that attempted to use a game engine for movie pre-visualization; that’s how far things have come. The amazing visuals of The Mandalorian were created using this technique, and it’s blowing greenscreens away.

Google hand tracking now open source

Google has made its hand detection and tracking tech open source, giving developers the opportunity to poke around in the code and see what makes it tick.

“We hope that providing this hand perception functionality to the wider research and development community will result in an emergence of creative use cases, stimulating new applications and new research avenues,” reads a blog post from the team.

That post on the Google AI Blog dives into exactly how the tech works, and devs interested in a closer look can find the project on Google’s GitHub repository.
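
If you would rather try the tracking than read through the repo, the lowest-friction route nowadays is the mediapipe Python package, which (as far as I know) wraps the same hand-landmark pipeline. The original release was a C++/graph-based MediaPipe setup, so treat this webcam sketch as one convenient way in rather than the official path.

```python
# pip install mediapipe opencv-python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for landmarks in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, landmarks, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```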