I worked a bit with early light field photography back in the day. It looks like the technique has expanded and may have found an interesting VR application: these researchers present a system for capturing, reconstructing, compressing, and rendering high-quality immersive light field video.
CG has always had problems with realism. Eye-lines and focus distances are never perfect. Colors between live/CG elements never quite match. Reflections can be directionally incorrect, missing, or mismatched in color/intensity. Lighting color/intensity/direction is often inconsistent between the live elements and CG elements. Mattes have problems at edges. Motion tracking is usually off by just enough to cause odd movement discontinuities. All of this makes CG look cheap.
But there is a new approach using large LED displays surrounding the shooting stage – and it's changing the game completely. Even more amazing, camera movement and simulation are handled by the Unreal game engine. Back in the mid-2000s, I worked on a project that attempted to use a game engine for movie pre-visualization – that's how far things have come. The amazing visuals of The Mandalorian were created using this technique – and it's blowing green screens away.
Google has made its hand detection and tracking tech open-source, giving developers the opportunity to poke around in the tech’s code and see what makes it tick.
“We hope that providing this hand perception functionality to the wider research and development community will result in an emergence of creative use cases, stimulating new applications and new research avenues,” reads a blog post from the team.
Fologram combines computer-aided design with the holographic capabilities of Microsoft’s HoloLens headset to guide the assembly of even complex objects. The hologram overlays exactly where each piece of the build should go, along with an outline of the finished product.