{"id":14206,"date":"2025-06-11T12:27:56","date_gmt":"2025-06-11T19:27:56","guid":{"rendered":"https:\/\/mattfife.com\/?p=14206"},"modified":"2025-05-24T12:38:23","modified_gmt":"2025-05-24T19:38:23","slug":"continuous-scene-meshing-on-quest-3","status":"publish","type":"post","link":"https:\/\/mattfife.com\/?p=14206","title":{"rendered":"Continuous Scene Meshing On Quest 3"},"content":{"rendered":"\n<p>The Quest 3 lets you scan a room and build up an internal 3D mesh that represents the world you are in. The scan can take anywhere from 20 seconds to several minutes, requires the user to walk around the area, and produces a static mesh &#8211; it cannot adapt to dynamic changes such as doors opening or closing. <\/p>\n\n\n\n<p>The <a href=\"https:\/\/www.uploadvr.com\/quest-3-mixed-reality-occlusion-v67-sdk-upgrade\/\" data-type=\"link\" data-id=\"https:\/\/www.uploadvr.com\/quest-3-mixed-reality-occlusion-v67-sdk-upgrade\/\">Depth API<\/a> provides live depth frames out to 5 meters &#8211; but how can those frames be used to build up the environment in real time?<\/p>\n\n\n\n<p><a href=\"https:\/\/trev3d.com\/?ref=uploadvr.com\" target=\"_blank\" rel=\"noreferrer noopener\">Julian Triveri<\/a>&#8216;s multiplayer mixed reality Quest 3 game\u00a0<a href=\"https:\/\/www.meta.com\/experiences\/lasertag\/25121192657495694\/?ref=uploadvr.com\" target=\"_blank\" rel=\"noreferrer noopener\">Lasertag<\/a> does exactly that. It feeds the live depth frames into an open-source\u00a0<a href=\"https:\/\/github.com\/keijiro\/ComputeMarchingCubes?ref=uploadvr.com\" target=\"_blank\" rel=\"noreferrer noopener\">Unity implementation<\/a>\u00a0of <a href=\"https:\/\/en.wikipedia.org\/wiki\/Marching_cubes?ref=uploadvr.com\" target=\"_blank\" rel=\"noreferrer noopener\">marching cubes<\/a>. 
<a href=\"https:\/\/www.uploadvr.com\/apple-vision-pro-review\/\">Apple Vision Pro<\/a>\u00a0and\u00a0<a href=\"https:\/\/www.uploadvr.com\/pico-4-ultra-hands-on-impressions\/\">Pico 4 Ultra<\/a> already use this approach &#8211; but they have hardware-accelerated depth sensors to help. Quest 3 developers must do this computation themselves.<\/p>\n\n\n\n<p>See the code\u00a0<a href=\"https:\/\/github.com\/anaglyphs\/lasertag\/branches?ref=uploadvr.com\" target=\"_blank\" rel=\"noreferrer noopener\">on GitHub<\/a>.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-4-3 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<span class=\"embed-youtube\" style=\"text-align:center; display: block;\"><iframe loading=\"lazy\" class=\"youtube-player\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/DglIK62flXo?version=3&#038;rel=1&#038;showsearch=0&#038;showinfo=1&#038;iv_load_policy=1&#038;fs=1&#038;hl=en-US&#038;autohide=2&#038;wmode=transparent\" allowfullscreen=\"true\" style=\"border:0;\" sandbox=\"allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox\"><\/iframe><\/span>\n<\/div><\/figure>\n\n\n\n<p><a href=\"https:\/\/www.uploadvr.com\/developer-implemented-continuous-scene-meshing-quest-3-lasertag\">https:\/\/www.uploadvr.com\/developer-implemented-continuous-scene-meshing-quest-3-lasertag<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>The Quest 3 lets you scan a room and build up an internal 3D mesh that represents the world you are in. The scan can take anywhere from 20 seconds to several minutes, requires the user to walk around the area, and produces a static mesh &#8211; it cannot adapt to dynamic changes such as doors opening or closing. The Depth API provides live depth frames out to 5 meters &#8211; but how can those frames be used to build up the environment in real time? 
Julian Triveri&#8216;s multiplayer mixed&#8230;<\/p>\n<p class=\"read-more\"><a class=\"btn btn-default\" href=\"https:\/\/mattfife.com\/?p=14206\"> Read More<span class=\"screen-reader-text\">  Read More<\/span><\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[9,5,20],"tags":[],"class_list":["post-14206","post","type-post","status-publish","format-standard","hentry","category-cool","category-technical","category-vr"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p4WECr-3H8","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts\/14206","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=14206"}],"version-history":[{"count":2,"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts\/14206\/revisions"}],"predecessor-version":[{"id":14208,"href":"https:\/\/mattfife.com\/index.php?rest_route=\/wp\/v2\/posts\/142
06\/revisions\/14208"}],"wp:attachment":[{"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=14206"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=14206"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/mattfife.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=14206"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}